Monday, 21 December 2020

Moving to a ‘cloud first’ IT strategy.

Introduction.

Schools thinking of adopting a ‘cloud first’ strategy can often be overwhelmed by the number of things that need to be considered, many of which appear to be project blockers. 

However, if you tackle each issue in isolation the process becomes much simpler. Most of these issues can be addressed as independent mini projects, many running in parallel with each other, so significant progress can be achieved in a short amount of time. 

This post outlines an approach that a school might consider when moving to the cloud. The advice is specific to UK schools but many of the ideas can be transferred to other countries.

Basic Strategy and Project definition.

As a first step the school needs to decide if they are moving to Google only (no Windows devices or Microsoft Office) or a solution that uses both platforms. If a school is planning to go solely with Microsoft Azure most of this document still applies, just ignore the Google references.

If the school is moving to “Google only” this is a different pathway, one that focuses on migrating the existing application set to SaaS resources and retraining staff. Both pathways have the aim of reducing the dependency on local server infrastructure.

In reality many schools will retain an element of both. In the UK it’s common to use Google for teaching and learning and Microsoft for administration, finance and supporting the SLT team.

This post assumes the school will maintain some Microsoft applications and will be running a local copy of MS Office. It also assumes the school already operates an Office365 tenancy using A1 (free) licences. 

This document does not cover the requirements of the internal network in detail.

However the basic network requirements for a cloud solution are:

  • Robust wireless network linked by an appropriately sized wired backbone.
  • High speed internet connection.
  • Edge Security device  (Firewall)

Mini-project List.

Implementing G Suite

G Suite can be introduced without relying on, or affecting, the local system. If possible use the existing Office 365 domain for the Google organization and adopt the same naming schema for staff and student accounts. This approach makes Single Sign-On easy to implement.

School Action: Create a G Suite Tenancy.



Organise the Cloud Directories.

The Office365 domain will hold user accounts in Azure AD. These accounts are very likely synchronized from local Active Directory using the Microsoft toolset.

Using standard facilities provided by Microsoft, the Google tenancy can be made to defer to Azure AD for authentication. In this way a user logs on to Google using the password held in Azure.

Having two cloud directories requires a method to keep them in sync and a policy decision on which is the ‘master’ directory. Since most schools already have a well-developed Azure AD supporting Office 365, this normally retains the position of master, pushing accounts into Google and letting Google check passwords against the Microsoft cloud directory. 

It is possible to work the other way round but it’s uncommon. While it’s easy for a Chromebook to authenticate using an Azure service, the ability for Windows 10 laptops to authenticate against a Google directory service is still a development feature.


School Action: Set the Cloud Directory hierarchy and install a syncing mechanism.


Align applications to Cloud Directories

Legacy applications may require a local Active Directory to maintain a user list. Common examples include cashless catering, RADIUS based wireless authentication and web content filtering.
A ‘cloud first’ school does not incorporate local AD, so these functions need to be updated to support externally hosted user directories and the related authentication services.

There are SaaS alternatives to cashless catering and web content filtering that natively support cloud directories, so one solution might be to switch suppliers. However most vendors now have a road map that includes a SaaS offering supporting external directories, so enquire about timelines for release and migration tools.

A cloud first school also gives an opportunity to re-examine the requirements for wireless authentication. In a traditional solution the internal network has to be protected to reduce the attack surface of the local server infrastructure. In a serverless situation that’s no longer the case because systems are hosted externally. In general SaaS services are accessed over wireless networks that do not require user-level authentication (home broadband, mobile), so a simpler system based on a WPA2 PSK might be appropriate.

School Action: Approach incumbent vendors to gauge support for Cloud Directories.



Align licensing to Cloud Services.

Cloud licensing is based on a subscription model. Previous Microsoft licensing terms have been centred on per-user or per-device terms and concurrent access, and are out of step with modern cloud deployments. New services such as Azure Information Protection and Mobile Device Management are not covered by existing licensing. The version of Azure AD that comes with Office 365 (previously called Basic) does not have the feature set to support a full cloud deployment.

Schools need to move to the new CSP model and purchase a Microsoft 365 A3 licence for each staff member who will continue to use a Microsoft desktop.


School Action: Re-evaluate the Microsoft licensing model



Develop a data migration strategy for local data.

A cloud first solution does not hold centralised data on-premise. To be specific, Windows file shares are not supported.

The cloud model holds all files external to the site, synchronizing data to the local desktop as required. Both Microsoft (OneDrive) and Google (Drive) have facilities that allow you to work offline and experience the advantage of local access while maintaining the benefits of cloud storage. They can also be incorporated without upsetting existing workflows or the application set.
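
For the Microsoft side, that sync behaviour is normally driven by the OneDrive sync client's admin policies, delivered by GPO or Intune. A minimal sketch of the registry values involved, assuming the documented OneDrive policy keys and with the tenant ID left as a placeholder:

# Minimal sketch: OneDrive sync client policies set directly in the registry.
# In practice these would be pushed by GPO or an Intune administrative template.
$key = "HKLM:\SOFTWARE\Policies\Microsoft\OneDrive"
New-Item -Path $key -Force | Out-Null

# Keep files cloud-only until they are opened (Files On-Demand).
New-ItemProperty -Path $key -Name "FilesOnDemandEnabled" -PropertyType DWord -Value 1 -Force | Out-Null

# Sign the sync client in silently with the user's Azure AD account.
New-ItemProperty -Path $key -Name "SilentAccountConfig" -PropertyType DWord -Value 1 -Force | Out-Null

# Redirect Desktop/Documents/Pictures into OneDrive - replace the GUID with your own tenant ID.
New-ItemProperty -Path $key -Name "KFMSilentOptIn" -PropertyType String -Value "00000000-0000-0000-0000-000000000000" -Force | Out-Null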

By promoting cloud storage over an extended period the value of the local store will steadily degrade. It may even be possible to migrate without transferring large amounts of legacy data.


School Action: Re-evaluate the value of locally held data and promote cloud storage.



Develop a data migration strategy for email.

It’s highly likely that the school mailboxes are already cloud hosted, normally within Office365 or a district service using Office365.

Migrating email into the core service is not a requirement but it can provide efficiencies. If a migration is required sufficient time should be allowed for the data transfer. 


School Action: Re-evaluate the location of email.



Examine local devices.

Windows devices operating within a cloud first school must run a recent version of Windows 10. All devices not physically capable of running Windows 10 in a responsive manner should be upgraded or retired. For legacy devices the CloudReady option provides an alternative strategy to decommissioning.

Microsoft has a toolkit that can be run to evaluate upgrade readiness for a large estate.
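
If you just want a rough headcount before running the full toolkit, a quick PowerShell sweep of an inventory list will flag the obvious non-starters. A minimal sketch, assuming a hypothetical machines.txt list of computer names and remote management (WinRM) enabled on the estate:

# Rough first-pass readiness check; the real assessment should use Microsoft's toolkit.
$report = foreach ($pc in Get-Content "C:\scripts\machines.txt") {
    try {
        $os  = Get-CimInstance -ClassName Win32_OperatingSystem -ComputerName $pc -ErrorAction Stop
        $sys = Get-CimInstance -ClassName Win32_ComputerSystem  -ComputerName $pc -ErrorAction Stop
        [pscustomobject]@{
            Name  = $pc
            Build = $os.Version
            RamGB = [math]::Round($sys.TotalPhysicalMemory / 1GB, 1)
            # Crude test: already on Windows 10 and at least 4GB of RAM.
            Ready = ($os.Version -like "10.*") -and ($sys.TotalPhysicalMemory -ge 4GB)
        }
    }
    catch {
        [pscustomobject]@{ Name = $pc; Build = "unreachable"; RamGB = $null; Ready = $false }
    }
}
$report | Export-Csv "C:\scripts\readiness.csv" -NoTypeInformation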

School Action: Establish the upgrade readiness for the Windows estate.



Setup a PoC for Windows cloud device management

A proof of concept (PoC) should be established on a sub-set of Windows devices to identify the blockers to a full migration. The PoC should include Azure AD user authentication, enrolment in Microsoft Endpoint Manager (Intune), application deployment, compliance checking and data security.

The plan could also include early adoption for BYOD devices that are not catered for using local Active Directory.

School Action: Plan a PoC for cloud based device management.



Reevaluate the application set

Not all applications will be suitable for a cloud based solution. The application set should be standardised prior to migration and SaaS alternatives adopted where possible.

If an application is incompatible with Windows 10, its function needs to be examined and a replacement found.

If the school is planning a BYOD strategy the browser will be the common interface across all platforms so it makes sense to move as many services to a web delivery platform as possible.

School Action: Create a software catalogue. Start the migration to SaaS.



Reevaluate the use of Print

The importance of print is greatly reduced in a cloud first school. The adoption of collaborative working practices and the ease of document sharing has the capability to remove the requirement for students to print altogether.  From a management perspective arranging print services across a BYOD environment that allows home use is a challenge that is best avoided.

One point that is generally overlooked is that once data is transferred to paper it moves outside any security control mechanism and therefore presents a potential backdoor to any data control policy.


School Action: Promote sharing as an alternative to print for students.



Move MIS and other admin functions to SaaS.

Schools should look to move the MIS and related functions such as finance to a SaaS platform. 

School Action: Start a program to migrate on-premise admin apps to SaaS.



Examine the internet connection.

In a cloud first school the internet connection is the primary channel to services and data. For this reason it holds the same level of priority as the network backbone, with a focus on bandwidth, latency and resilience. A large secondary school should be looking to upgrade to a 1Gbps contract with some form of failover option. Try not to pay for bundled services that are not used.

School Action: Re-examine the ISP contract.



Adopt a security model that protects data not systems.

One advantage of a cloud first school is that it embraces mobility, which often includes personal devices that sit outside the security boundary of Active Directory. A security policy that controls user access to the network (802.1X) or user access to file data from the local network (Kerberos) is not suitable for a cloud-based model.

Both Azure AD and Google can create a secure boundary around data that protects sensitive information on both school and personal devices.  The challenge is to migrate the access control checks away from the network, hosting systems and containers and onto the data itself.


School Action: Create a data security policy.




Tuesday, 3 November 2020

GDPR and the Google model contract.

It's a universal truth that a parent on the board of governors of a British school will at some time ask the question:

             "Where does Google store the schools data and is it within the EU."

The first response to this question is that GDPR itself does not require data to remain within the EU. 

The second fact is - GDPR is whatever you decide it is.

To this end the Department for Education and Google have negotiated what is called a 'model contract', which defines what GDPR compliance means with respect to using Google Cloud as a Data Processor.

So long as Google sticks to the clauses of the model contract and the school agrees to the same clauses, both the school and Google are working within the GDPR framework. Although the model contract does not require Google to hold data exclusively within the EU, it's almost certain that the school's data is stored in the data centre in Dublin. However it's likely that recovery copies also exist in other data centres outside the EU.

The more important question is where does the school agree to the model clause?  

It can be found in the admin console under Account Settings - Legal and Compliance.

A school administrator needs to accept the model contract clause and also fill in the details of the local data officer. If these actions are not completed the school is technically non-compliant if, in the unlikely event, it ever comes to a data audit. This fact is probably more important than worrying about exactly where the data is stored.

Of course if the school has an independent GDPR policy which states that all data MUST remain in the EU then you'll have to migrate it all back to local servers.

Hold on... England's not in the EU either.   Hmmm - USB sticks.

Tuesday, 6 October 2020

Managing Digital Displays with InTune.

A common requirement for schools is the management of digital displays. While there are dozens of excellent SaaS applications that will do the job perfectly well, it’s also possible to put together a workable solution using the standard features provided by Intune and third party resources such as Google Slides without any additional cost.

One of the more useful features of Intune is the set of preconfigured device templates, and one of these is the Single Application Kiosk, which is exactly what you need for digital displays. It’s not worth going through the details of how this is set up as it’s covered in a number of other posts, including this excellent video walkthrough. Rather, this post lists some of the tweaks that take the general idea and make it work in practice.

One tip mentioned in the video and worth repeating is that you really need to set a maintenance window when you configure the kiosk policy. You really don’t want your digital player to be rebooting in the middle of the day to take a feature update.

You don’t need any special windows app to run a web session in kiosk mode - Explorer/Edge will do nicely. The standard kiosk policy takes care of all the auto-login and full screen requirements without any extra effort on your part. What you do need to do is create a separate policy to control the target URL for the display.

This is done by creating a new Device restrictions policy along the lines shown below, replacing the URL with your own value.
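
The original screenshot isn't reproduced here, but as a rough sketch of the sort of values involved (field names shift slightly between Intune console versions, and the Slides URL is only a placeholder) a single-app kiosk profile for a display typically ends up looking something like this:

Kiosk mode: Single app, full-screen kiosk
User logon type: Auto logon (Windows 10, version 1803 and later)
Application type: Add Microsoft Edge browser
Edge kiosk URL: https://docs.google.com/presentation/d/e/<published-id>/pub?start=true&loop=true&delayms=10000
Microsoft Edge kiosk mode type: Digital/Interactive signage
Maintenance window for app restarts: enabled, with an overnight start time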

If you are running a number of displays all presenting different slide shows, each will need its own policy, assigned through security groups containing the appropriate display device. Changing the display then becomes as simple as moving the device between groups.

You need to be a little careful working with Kiosk mode as some of the features, such as autologon, will conflict with standard security settings set in other policies, especially if these are assigned to the All Devices group. The best approach is to create an All Digital Displays security group and then explicitly exclude it from all policies set on All Devices unless it carries a policy you require.

Once your policy applies it should force an autologon and present a full screen display, and so you might well believe it’s a job well done - not quite. If you return in ten minutes you’re likely to find sleep mode has kicked in and your screen is now a blank page.

Intune has a number of configuration policies that control aspects of sleep mode but these are not always effective in kiosk mode. The solution is to create a new custom profile type with three OMA-URI entries using the information below.


Name: DisplayOffTimeoutPluggedIn

OMA-URI: ./Device/Vendor/MSFT/Policy/Config/Power/DisplayOffTimeoutPluggedIn

Data type: String

Value: <enabled/><data id="EnterVideoACPowerDownTimeOut" value="0"/>


Name: StandbyTimeoutPluggedIn

OMA-URI: ./Device/Vendor/MSFT/Policy/Config/Power/StandbyTimeoutPluggedIn

Data type: String

Value: <enabled/><data id="EnterACStandbyTimeOut" value="0"/>


Name: HibernateTimeoutPluggedIn

OMA-URI: ./Device/Vendor/MSFT/Policy/Config/Power/HibernateTimeoutPluggedIn

Data type: String

Value: <enabled/><data id="EnterACHibernateTimeOut" value="0"/>


Once these are set and applied to your security group the display will stay fixed.


It's also worth checking if your hosting device has a BIOS setting that allows reboot on power loss. This is a standard feature on the Intel NUC and allows the screen to recover the display after a power cut without any manual intervention. You don't really want to be searching for the on/off switch when the screen is 2m from the floor.

In this example I used a Google Slide that’s published to the web as the target. You can use any URL or third party resource. If you know how to replicate the functions of published Google Slides with Microsoft PowerPoint please drop me a line.
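
If you want to check that a published Slides URL behaves sensibly before building the profile, Edge can be launched in kiosk mode by hand on any Windows 10 machine. Something like the following does the job (the document id is a placeholder for your own published deck):

msedge.exe --kiosk "https://docs.google.com/presentation/d/e/<published-id>/pub?start=true&loop=true&delayms=10000" --edge-kiosk-type=fullscreen --no-first-run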

If you are using a third party platform it's likely that the display will be driven through a local application that controls the update cycle. Using a simple URL like the one provided by Google Slides allows you to control the advance rate but not its refresh. Once the slide deck is loaded it's cached locally, which has some advantages if the internet connection is dropped, but it also means that new information is only going to be visible once the URL is reloaded.

The easiest way to guarantee a URL reload is to schedule a reboot of the device. This can be achieved using another custom profile type.


Name: ReoccuringRebootSchedule

OMA-URI: ./Vendor/MSFT/Reboot/Schedule/DailyRecurrent

Data type: String

Value: 2019-10-01T02:00:00Z

The date and time value is in ISO 8601 format and both the date and the time are required. This will reboot the device each day at 02:00 to ensure the presentation is current.  

Other reset options can be found in this post.


Thursday, 10 September 2020

AI - the second wave of SaaS.

If nothing else the events of the last few months have highlighted the limitations of traditional IT solutions based on servers and local data.

Schools that embraced cloud storage and SaaS have found the adoption of remote learning an easier pathway than those with teaching resources locked up behind firewalls or maintaining a heavy reliance on server based applications.

For education it’s been a significant change. Numerous SaaS programs were fast tracked over the summer break and there’s no returning to the old way of working. In the future IT systems will be designed to allow the efficient consumption of SaaS services without the requirement for local stateful data. While people talk about a hybrid scenario it’s really only an interim solution or a ramp to move processes and data offsite. The future is firmly SaaS.

While remote learning is an immediate payback of this transition it’s only a small part of the SaaS advantage. Previous posts have discussed other elements such as cost management, scalability and the levelling of the ‘tech’ playing field but perhaps the biggest advantage has yet to be realised.

[Screenshot: ADS Classroom Dashboard - Visualisation Suite for Google Classroom]

Once data is centralised in the cloud, a canvas that was once just fragmented shards of black and white expands into a kaleidoscope of colour painted by Data Analytics and the emerging field of Artificial Intelligence (AI).

The resulting landscape is not just better than what we have currently but completely new. It’s the same transformation that drives the success of platforms such as Amazon, Facebook and Google, and it’s inevitable that both technologies will have an important role to play in education.

Most schools and businesses already make use of Data Analytics and AI. Microsoft’s Data Loss Prevention (DLP) features rest on top of these platforms, as do most of the processes that intercept email spam and control the threats to your internal network. AI based systems have the capability to draw relationships between seemingly unrelated points of data and then use this information to improve the response. The power of continuous improvement should be familiar to anybody who works in teaching, and now it can be put to work in a practical way, analysing the school's data resources in ways that were impossible only a few years ago.

The information stored in platforms such as Google Classroom and Microsoft Teams can be opened out in new and exciting directions. Not just the simple lists of students and classes (although this is useful enough) but insights into how it’s being used, identifying those students who are engaging, those who are being left behind. Not just raw numbers but the patterns of use within that data drawn out across year groups, subjects or any label type and then presented in a secure way using a web dashboard.

Every school using Google G Suite and Microsoft Office 365 already has access to an advanced analytics toolset through Google Cloud Platform (GCP) or Microsoft Azure but because they are not fully understood they are rarely used. This is almost certain to change because the benefits of adopting this toolset are almost limitless.

Established SaaS platforms such as Securly use an AI engine to scan messages for signs of depression and self-harm that's capable of understanding local nuances and working across language barriers. Senso.cloud offers a visual threat intelligence feature, also using AI, as a standard component in its safeguarding product.

Other companies such as Applied Data Science are working with trusts in the UK to help them build customised analytics platforms that open out the data they hold in platforms such as Google Classroom. The result goes far beyond the simple snapshot view that you get with a spreadsheet download, providing ongoing analysis that can expose trends and patterns over time and give real insights into how a school or Trust is operating and performing.

The real takeaway for education is the fact that none of this is particularly difficult or costly to implement. Once the school has adopted a SaaS platform the data is in the cloud and the delivery platform is in place (GCP/Azure). Both come with a generous free tier that can be used to trial a service. No local infrastructure is required (of course) and ongoing costs are mainly limited to data storage.  Data remains within the same security boundary controlled by the school or Trust -  it’s just moved from one database to another. 

The data is already there, it just needs to be put to work. 

Disclosure: The Serverless School provided consultancy services to Applied Data Science to help realise the Visualisation Suite for Google Classroom.

Thursday, 13 August 2020

Chrome Sign Builder gets a reprieve.

For those schools using Chrome Sign Builder to provide digital signage, Google's announcement that it plans to phase out support for Chrome Apps over the next year came as a bit of bad news.

However in a recent update those schools using the app on Chrome OS got a small reprieve and will now receive support through to June 2022. As a result enterprise administrators can continue submitting and updating private and unlisted apps for two more years, but by June 2022 the Chrome Apps platform will be entirely phased out.



Google says it's "committed to providing a useful extension platform for customising the browsing experience for all users", and that may extend to creating an alternative version based on the progressive web app platform, so hope is not entirely lost.

These changes only refer to the Chrome Apps platform. The many Chrome extensions that education relies on will be around well after Chrome Sign Builder disappears.

Wednesday, 29 July 2020

It’s a local file cache - just not as you know it.


When you design a serverless school there’s always the option to leave a little bit of local storage in the mix, just to be on the safe side, but this is always a mistake.

To operate a local file server within a role based security model you need local accounts. Cloud directories do not understand Kerberos unless you reintroduce a local domain controller and Active Directory on yet another server. 

Once you’ve put Active Directory back into the mix and installed the device to run it on the temptation will be to solve any problem using the old techniques and before you know it you’ll have a rack of servers or, more likely, be suffering 'virtualisation creep'. Nothing has changed and you're back to square one.

The common accusation against a cloud first school is that you can’t access cloud data without some form of local storage or caching of files. When a class of 30 students opens a 10GB media file stored in the cloud everything will freeze as 300GB of data is pulled down a 100Mbps connection - and two years ago that was probably true. Except now it doesn’t freeze, because there is a local cache, just not the one you might expect.


In a cloud first school the local cache is distributed across almost all the workstations and managed directly by the OneDrive or Google File Stream client. This creates a distributed, fault tolerant local cache with access to terabytes of local solid state storage and almost limitless CPU cycles, all talking to a back-end that is moving data to and from the site using predictive on-demand technologies. 

OneDrive supports delta level file updates across a wide range of file types including most graphics packages. A 90KB update to a 10GB file creates 90KB of traffic. The system has its own built in form of QoS, trickle feeding updates back to the cloud while making sure common files are served from the local cache.

Collaborative workflow is standard, as is file versioning and user on-demand recovery.  

If configured correctly the data never moves outside the school security boundary. DLP policies and intelligent labelling and classification control access based on content, so files are secured from any location and any platform. The school data protection strategy can be realised in an observable rule set applied to every device, personal or school owned.

Technically the distributed replication approach backed by DLP is so superior to a local file server it's like trying to compare a firework to a Falcon Heavy. This is the model both Google and Microsoft are betting the business on, and trying to retro-fit centralised file syncing to the cloud goes against the technological direction of both companies.

Distributed sync, cloud to device, no servers required is the way forward.


Friday, 1 May 2020

Win32 app lifecycle for Intune.


Microsoft's documentation on the format and deployment of Windows apps (Win32) within Intune is pretty comprehensive and is well supported by a number of technical blogs which take you through the packaging and the Intune Management Extension (IME) workflow.

What is less well explained is what happens next.

Your V1 app has been marked as Required and deployed successfully but now the vendor has released V2. How do you get V2 onto the desktop?

The new V2 app clearly requires repackaging to create an updated .intunewin payload and logic would suggest that if the V2 package replaces the old V1 version in the original InTune app definition the change will roll out to the desktops - but it doesn’t.

As far as InTune is concerned the V1 app is marked as installed for the device or the user. Simply uploading an updated .intunewin file doesn’t change that fact.  The only way to break the log jam is to convince InTune that the app isn’t installed anymore which forces a re-install and a subsequent upgrade.

The Win32 object has a number of ways to detect if an app is installed. Again these are well documented in other technical blogs but in summary it involves checking for files, folders or registry entries or a combination of all three. This works for the initial deployment because it’s a fair bet that if the startup executable can’t be found in the install path it’s probably not installed. However for an upgrade this approach cannot be relied on. Unless the process creates a new file / folder or updates a registry entry that you can check for, the logic will always return ‘installed’ and assume there is nothing to do.

Even if you can update the original app object and identify a feature to test for, you are not going to get much feedback on how the upgrade is progressing. The best that you can hope for is a report that tells you that 100 instances are installed and, at any time 100 instances are still installed. There’s no feedback on the roll-out process because the app only reports if it’s installed - which it is in all circumstances.

For this reason, best practice suggests creating a new Win32 object for each app version and retiring the old version by removing the assigned group or changing the status from Required to Available. This makes things nice and clean and gives you a good idea of how things are progressing but doesn’t solve the problem of triggering the install process in the first place.




Fortunately the Win32 object gives you the option of running a script instead of looking for files and folders which allows you to check the version of the application using the script below.


# Custom detection script for the updated package.
# Replace the placeholders with the real executable path and the new version string.
$ver = (Get-Command "<< Path to the app.exe >>").FileVersionInfo.FileVersion
if ($ver -eq "<< Version Number to Test For >>")
{
    # Writing to STDOUT (with a zero exit code) tells Intune the app is detected.
    Write-Host "Updated Version Installed"
}

The script must return zero in the exit code and write to STDOUT to signal that the application has been detected.

https://www.petervanderwoude.nl/post/working-with-custom-detection-rules-for-win32-apps/

This will force the update onto V1 machines and since the check is also run at the end of the process it’s a surefire way of ensuring the update has been a success.

Once you start scripting you can embed any logic you like, but it’s best to keep it simple because once the code has been uploaded to the Azure store there’s currently no method within the GUI to recover the script or even view its contents, so the process has to be manually documented.

Clearly this is not an ideal situation and it’s likely that Microsoft has a roadmap to make this process easier, possibly by involving a version label or something similar. In the meantime it's worth giving some thought to how you intend to maintain Win32 apps before the initial install goes out.


Monday, 20 April 2020

Take a train ride to Azure.

For a while now Microsoft has been signalling its intention to move towards role-based training in an attempt to test real world problem solving skills rather than the simple accumulation of facts around a specific platform or technology. This reorganization has resulted in the wholesale retirement of the old MCSx accreditation tracks which have formed the cornerstone of Microsoft training since Windows NT Server 3.5 launched in 1994.

The original announcement fixed the retirement date on June 30, 2020. In response to the current situation this has been extended to January 31, 2021 but this still places the cut-off within a nine month period. Any exam passed prior to the retirement date will stand for one year after the exam is retired but after that all current MCSx credentials will be stamped as inactive. From that point it’s over to the new role-based certification tracks.

Microsoft is well known for updating the training programs at regular intervals. Any network admin attempting to keep their CV up to date will know it’s pretty much a full time job so why is this change any different?




Well it’s down to the number of exams being retired and the wholesale shift to cloud technologies.

Consider this simple fact: there is no longer an exam that explicitly tests for proficiency in Windows Server 2019 administration.

The official line is that

 “Windows Server 2019 content will be included in role-based certifications on an as-needed basis for certain job roles in Azure”.

The Windows Server admin exams were the cornerstone of the old MCSE but now they don’t even exist. As far as Microsoft is concerned Windows Server knowledge is still important but only as it applies to Azure cloud services.

Looking for the update to the SQL Server admin exam?  Much the same I’m afraid because you really should be using the Azure SQL Database as a PaaS.

The new Microsoft accreditation tracks are wholly and unashamedly focused on Azure and the associated cloud services such as Modern Management and Desktop Analytics. On-premise is part of that but only as far as it supports Azure.

This change will feed into the partner channels who will need to rapidly re-skill before the cut-off date so it might be a good time to invest in training companies or get that training budget signed off.

For the traditional Microsoft IT administrator who expects to be cramming facts about Windows Server 2019 installation procedures, scaling limitations and hardware requirements it’s all going to look a little strange but the plan to sit tight and wait for the cloud to blow over is no longer an option.

There’s a general rule that if you want to get an insight into the future direction of any tech company - check out its training program.

Monday, 13 April 2020

Goodbye Office A1, hello Microsoft A3.

Why schools should expect to move away from free MS licencing.


There’s little doubt that one of the attractions of the Office 365 for Education A1 (O365 A1) licence is the price.

For no cost at all schools receive hosted mailboxes, a generous amount of cloud storage, office web apps and a user directory for an unlimited number of users that supports Single Sign On.

So with the ever increasing facilities offered by Office 365 it might seem like a plan to ditch the servers and the local licensing, operate entirely from the cloud and pay Microsoft nothing at all. Unfortunately as schools and businesses start to understand the requirements of Microsoft's Modern Management strategy that idea is a non-starter for a number of reasons.



Windows 10.
Without onsite servers schools will be relying on Microsoft Intune for device management and that requires a licence that’s not covered by O365 A1. It’s quite possible to register devices with Azure AD without incurring a licence, and this gives you a certain amount of control around device security, but it is best suited to BYOD deployments. It’s also possible to join Windows 10 devices to Azure AD in a similar way to adding a device to an on-premise Active Directory, but this is not a full management package. Without enrolment into Intune you have no control over the way users access and share information and, more importantly, you are unable to deploy and authenticate applications.

Therefore licensing in a serverless solution will need at least Office 365 A1 + InTune for each user.

Azure AD.
The cloud directory service that you get bundled with O365 A1 is the Office365 Apps version which was previously called Basic. As the name suggests a few key features are missing from this package and one of the most important is auto-enrolment. This allows users to use a school account to join devices to Azure Active Directory while automatically enrolling into InTune.

Combining auto-enrolment with Autopilot, it’s possible to ship devices directly to the user from the supplier and be assured that the device will exit the OOBE with a secure work profile and an approved application set installed.

Auto-enrolment is closely related to Dynamic Groups, which is another capability missing from the Office 365 Apps version. Dynamic Groups allow a user or device security group to be defined on the basis of a user or device property. Because groups are the primary method of controlling the allocation of policy and access rights (Azure AD does not use an OU structure like on-premise AD), dynamic groups are pretty much essential in an environment where users, not admins, are adding devices to the directory.
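
As an illustration, a dynamic device group that automatically gathers every Autopilot-registered device is normally built on the ZTDId tag, with a membership rule along these lines (check the current Azure AD documentation for the exact syntax in your tenant):

(device.devicePhysicalIds -any (_ -contains "[ZTDId]"))

A user-based rule works the same way, testing a property such as department or job title rather than a device tag.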

Going forward you are also going to need Conditional Access, the ability to manage access to data and systems based on user groups, locations, device platform and client application.   Another key requirement is Enterprise State Roaming which performs a similar function to roaming profiles providing users with a unified experience across their Windows devices.

Basically the Office 365 Apps version of Azure AD doesn't meet the requirements of a Windows 10 deployment, which means an upgrade to Azure AD Premium P1 as a minimum.

So you now need Office 365 A1 + InTune + Azure AD Premium P1.

Microsoft Office.
To activate and manage the Office desktop apps deployed through Microsoft Intune you need an Office 365 ProPlus licence allocated to each user.  So long as the user holds a licence the apps can be installed on multiple devices including Macs, iOS and Android platforms.

If you are keeping track the list now reads Office 365 A1 + InTune + Azure AD Premium P1 + Office Pro Plus.

Azure Information Protection.
For most schools and businesses Azure Information Protection (AIP) is probably seen as a nice-to-have or even more likely, a complete unknown.

AIP helps an organisation to classify and protect its documents and emails by applying labels. Labels can be applied automatically by administrators and are then used to drive rules and conditions that control how that data might be shared and used within an organization and, importantly, external to the workplace. Once you adopt a strategy based on mobility and collaboration, the security framework provided by share permissions tied to a fixed storage location only goes so far. Both businesses and schools need to adopt a new security model based on zero trust networking and move away from the historic perimeter method, which is no longer effective.

With AIP the access permissions rest with the document itself regardless of its location and this allows far tighter control and visibility over where the sensitive data is and who can see it. As ever tighter regulation is placed on schools to demonstrate a robust data management policy, AIP will become a necessity.

Although rarely implemented in schools the O365 A1 Education licence includes some of the data protection capabilities of the Azure Information Protection platform. This feature is referred to as Azure Information Protection for Office 365. The full package extends data protection across non-Microsoft Office file formats as well as providing manual, default and mandatory document classification and for that you require a minimum of Azure Information Protection P1. 

So now you need Office 365 A1 + InTune + Azure AD Premium P1 + Office Pro Plus + Azure Information Protection P1.


The list is starting to grow but you are unlikely to upgrade your Office 365 A1 licence by purchasing each additional element separately because Microsoft offers some licence bundles to make life easier.

The obvious one is Enterprise Mobility + Security E3. This includes Azure Active Directory Premium P1, Microsoft Intune and Azure Information Protection P1 in a single licence so it gets you most of the way but without Office Pro Plus.

Previously, the easiest way to get an Office Pro Plus licence was to simply upgrade to Office 365 A3, which was essentially an Office 365 A1 licence with larger storage allocations and the ability to install the local Office apps. Putting both Office 365 A3 and Enterprise Mobility + Security E3 together gets you what you need but there is an easier way.

In the future Microsoft expects you to purchase the Microsoft 365 A3 licence which is the union of Office 365 A3 plus Enterprise Mobility + Security E3.  In many respects this is the end game - a single user licence that delivers Microsoft as a Service as a yearly subscription.

The pressure to move to this new licensing model comes from a number of directions. First, Microsoft's strategy is now fully focused on cloud services such as Teams and Desktop Analytics. In fact any new feature on the server platform is normally prefaced by the word "hybrid" which is generally a hint that a move to the cloud is imminent. When you see this, pack your bags.

Second, education has always been dependent on the ‘student use benefit’ which grants students free use of licences if the teaching team is fully licenced. Few schools would be able to afford the licencing bill without this scheme. However only the larger licence bundles are covered in any practical way.  Purchasing an individual licence for InTune or Office Pro Plus allows you to licence 15 students while the larger Microsoft 365 A3 bundle gives you 40. Trying to save money by targeting specific groups with individual licence packs will cost more in the long run because you need to cover the shortfall for students.

So what would be the cost of licensing a ‘serverless’ Microsoft school in the UK?

Let's say the annual subscription to A3 for education is £5.00 per user/month, which equates to £60 per year. Therefore a school with 70 staff will be paying £4,200 a year (£60 x 70), with the rights to licence a further 2,800 student accounts (70 x 40) under a student use benefit agreement.
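
The sums are easy enough to sanity-check yourself; a throwaway PowerShell calculation along these lines does it (the £5.00 price is the assumption used above, not a quote - check your reseller's current rates):

# Back-of-envelope licensing cost check (prices are illustrative).
$pricePerStaffPerMonth = 5.00      # assumed Microsoft 365 A3 education price in GBP
$staff                 = 70
$studentsPerStaff      = 40        # student use benefit ratio for the A3 bundle

$annualCost   = $pricePerStaffPerMonth * 12 * $staff      # 4200
$coveredUsers = $staff + ($staff * $studentsPerStaff)     # 2870
"Annual cost: £{0}; per covered user: £{1:N2}" -f $annualCost, ($annualCost / $coveredUsers)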

That seems like a scandal! One minute you’re paying nothing for Microsoft cloud services and the next you’re being scalped for just over four grand a year - but that's not the whole picture.

Running Office 365 A1 with local infrastructure has a range of hidden costs once you take into account the obvious requirement for local server licences and user CALs. Servers and storage cost money to power up and cool down and represent a large initial investment that needs refreshing every five years. There’s an IT maintenance contract or internal staff costs to consider, backup hardware and software and a disaster recovery plan (maybe).

Also factor in the money used to provide end-point security such as anti-virus and drive encryption, especially once you consider support contracts and upgrades. Remember to include the annual renewal for the fashionable learning platform that promised to deliver collaborative workflow and remote learning but was never widely adopted.

In conclusion, schools pay indirectly for Office 365 A1 through the ancillary services but without having much idea of what the overall cost is.  Well now you do.

In our example and using Microsoft A3 it’s just under £1.50 a year for each of the 2,870 staff and student members, but only if you ditch those servers and embrace the new normal.

Thursday, 19 March 2020

Keep Calm and Get SaaS.

The recent announcement of school closures in the UK and across Europe has thrown up a raft of new challenges, one of which is “how can we teach without a school?”

For those establishments who have made the move to Software as a Service (SaaS) and reduced the role of local servers and infrastructure this may not be too much of a problem. If implemented correctly and protected by a cloud user directory such as Azure AD or Google it’s quite possible that learning could continue remotely so long as students and teachers have access to the internet and some form of mobile device.

But what can be done for those schools whose data and systems are locked behind the school firewall for the next few months?


First, it’s never too late to stand up an educational account in either Microsoft Office 365 or Google G Suite for Education and start moving services to the cloud. In response to the crisis Google are fast tracking school requests and it’s possible to be up and running with a tenancy within a few days. Both platforms have the ability to quickly import accounts, set up shared storage accounts and move data across. It may not be perfect but your school now has a fully functional collaborative workspace that can operate from any location.

If you are going Google, Classroom is going to be the answer to remote teaching for the next few months. This is now a fully featured, mature service and since it’s entirely web based and free from any licensing you can have a remote learning platform up and running in days.

Both Office 365 and G Suite have integrated video conferencing and messaging platforms that can be used for teaching. Google has made the premium features of Google Meet free to all G Suite for Education customers until July 1, 2020. This includes the ability to record meetings, livestream up to 100k people and add 250 people to a Hangout.

Consider getting hold of some Chromebooks for remote working. These devices are dead easy to set up and manage and work just as well with the Office 365 web apps as with G Suite. If you already have a remote access solution based on Citrix, VMware or MS Terminal services, Chromebooks are the dream client platform. If you can’t afford any hardware and only have a stock of underpowered laptops that aren't up to the mobile challenge, you can easily re-purpose them with Neverware and plug them back into your new cloud services.

Last of all,  if this all seems a bit overwhelming, you can make the transition as easy as possible by contacting a partner or supplier who can help you with the setup process and training.

Keep calm - contact a platform partner or just roll up your sleeves and get started with SaaS.



Friday, 14 February 2020

Provisioning OneDrive for new users.

Although you can create a student account in Office 365 and allocate OneDrive as a resource, behind the scenes the storage location is not actually assigned to the user account.

This normally occurs the first time the user tries to access or browse to their OneDrive, which sometimes causes a noticeable delay before the site opens. For one-off accounts this is not too much of an issue but for class groups on the first day of term it's not something you want to be dealing with.


In this situation it's a good idea to pre-provision OneDrive to improve the user experience and reduce the number of hands in the air.

First create a list of student accounts and save it as a file. For example a text file named users.txt that contains:

student1@myschool.net
student2@myschool.net
student3@myschool.net

Next, connect to SharePoint Online using the SharePoint Online Management Shell (Connect-SPOService) and run the PowerShell command Request-SPOPersonalSite, referencing the file you created.

$users = Get-Content -Path "C:\users.txt"
Request-SPOPersonalSite -UserEmails $users

That will kick off a background task to create the site for all accounts. If you are pre-provisioning OneDrive for a whole year group it might take up to 24 hours for the OneDrive locations to be created, so be patient or plan ahead.
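
As a quick sanity check once the job has had time to run, the personal site count can be pulled back with Get-SPOSite. A minimal sketch, with the tenant name used only as an example:

# Connect to the tenant admin endpoint first if you have not already done so.
Connect-SPOService -Url "https://myschool-admin.sharepoint.com"

# Count the OneDrive (personal) sites that now exist in the tenancy.
$personal = Get-SPOSite -IncludePersonalSite $true -Limit All -Filter "Url -like '-my.sharepoint.com/personal/'"
"OneDrive sites provisioned: {0}" -f $personal.Count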

Tuesday, 11 February 2020

Getting the Hardware ID for AutoPilot Enrolment.


If you have a new device it's very easy to get the hardware information you need to enrol it with Autopilot without going through the whole OOBE setup.




Start the device and wait a few seconds until the region selection page appears.

Press the following key combination SHIFT + F10

A CMD prompt will appear, type in PowerShell and hit Enter

The next step involves creating a local directory to store the information.

cd\
md scripts
cd scripts


You then need to set the execution level.

Set-ExecutionPolicy -ExecutionPolicy RemoteSigned

At the command prompt run the following commands.

Save-Script -Name Get-WindowsAutoPilotInfo -Path c:\scripts

The NuGet provider is required for this action. Press Y and Enter. NuGet will be downloaded and installed.

If you run a listing on the directory you'll see the script has been downloaded to the scripts folder.

Run the following command:

.\Get-WindowsAutoPilotInfo.ps1 -OutputFile c:\scripts\intunehwid.csv

This writes the required data into the intunehwid.csv file. This file then needs to be uploaded to Intune, using either the Windows Store for Business or the Device Management portal.

If you are enrolling a number of devices it's probably a good idea to script the whole process and run it directly from the USB key, appending the output data to a file on the drive rather than saving it locally.
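
A rough sketch of that approach, assuming the key is mounted as D: and a recent copy of the script that supports the -Append switch (adjust the drive letter to suit, and expect the NuGet provider prompt the first time the script is downloaded):

# harvest-hwid.ps1 - run from the USB key at the OOBE command prompt (SHIFT + F10).
Set-ExecutionPolicy -ExecutionPolicy Bypass -Scope Process -Force
Save-Script -Name Get-WindowsAutoPilotInfo -Path D:\             # one-off download; skip if the script is already on the key
D:\Get-WindowsAutoPilotInfo.ps1 -OutputFile "D:\hwids.csv" -Append   # -Append keeps adding devices to the same CSV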

After that run; shutdown /p

.. to turn off the device.

Thursday, 30 January 2020

Windows Delivery Optimisation for the G Suite admin.

Chromebook admins who manage large estates soon become familiar with the OS refresh cycle which pushes out a full update every six weeks and the minor update cycle which arrives every two to three weeks.

On average a full Chrome OS update normally checks in at about 400 MB with minor updates around 50 MB. This amount of data is manageable for a class set of Chromebooks but adds up to a sizable chunk of bandwidth if you are running a network with 2000 active devices.

For this reason Google allows some degree of control over how these actions occur, including the option to defer upgrades and stagger the update period. However one of the most useful features is a peering model which allows Chromebooks to pull updates from nearby devices of the same type. This dramatically reduces the load on the internet connection and removes the download bottleneck for any school considering going 1:1 with Chromebooks.

For schools looking to go serverless with Windows 10 clients it should come as no surprise that Microsoft have adopted a similar model for devices using Windows Update for Business rather than a local WSUS server - after all, a good idea is a good idea.


Windows 10 introduces a new feature called Delivery Optimisation which can be controlled using InTune or standard GPO policies, for those who have yet to make the move to the cloud.

The GPO settings for Delivery Optimisation can be found at 
Computer Configuration > Administrative Templates > Windows Components > Delivery Optimisation.

A simple switch in the config sets the download mode. There are six options, including one to turn it off altogether but the one that comes closest to the Chromebook model is:

       HTTP blended with peering behind the same NAT

In simple terms Windows 10 devices will attempt to get updates from other computers on the same network but will fall back to the internet if there is no response.

The process can be fine-tuned using both GPO and Intune policy settings, including the option to define minimum RAM and disk sizes before a device can take part in peer caching. A threshold can also be set on the minimum file size to be cached since, for smaller files, it’s actually more efficient to simply download them from the web.
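
On the Intune side the same knobs are exposed through the DeliveryOptimization policy CSP and can be pushed as a custom profile. A rough sketch only - the values are examples and the exact policy names supported depend on your Windows build, so check the CSP reference:

Name: DODownloadMode
OMA-URI: ./Device/Vendor/MSFT/Policy/Config/DeliveryOptimization/DODownloadMode
Data type: Integer
Value: 1   (HTTP blended with peering behind the same NAT)

Name: DOMinRAMAllowedToPeer
OMA-URI: ./Device/Vendor/MSFT/Policy/Config/DeliveryOptimization/DOMinRAMAllowedToPeer
Data type: Integer
Value: 4   (devices need at least 4 GB of RAM to take part in peer caching)

Name: DOMinFileSizeToCache
OMA-URI: ./Device/Vendor/MSFT/Policy/Config/DeliveryOptimization/DOMinFileSizeToCache
Data type: Integer
Value: 100   (files smaller than 100 MB are simply downloaded from the web)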

Windows 10 also has a feature similar to the Chrome software channels (Stable, Beta and Dev).

Devices can be allocated to various deployment rings, which include an Insider program that allows organisations to test and provide feedback on future feature update releases. Like Chrome OS, upgrades arrive on a set schedule, with Feature Updates released twice a year, normally in March and September. Quality updates contain security and critical fixes and normally occur at least once a month. Also like Chrome, updates can be deferred for a fixed period but not indefinitely.

For the Chrome admin this should all sound very familiar.

One problem facing Windows 10 that’s not such a high profile issue with Chromebooks is the maintenance of locally installed applications.

An Android application installed on a Chromebook will provide a notification when a new version is available, with updates transferred via the Google Play Store. Universal applications on a Windows 10 device update in a similar way but through the Windows Store for Business. Like Feature Updates, you have the option to defer any changes if you are concerned about application stability.

For most organisations the largest application likely to be installed onto Windows 10 is Microsoft Office, and the same Delivery Optimisation process can now be used for background updates of this package, so long as you are running a recent build of Windows 10 and a licensing and deployment model that employs an up-to-date Office 365 Pro Plus (version 1808 or later).

For once the setup is pretty simple as Delivery Optimization is enabled by default on devices running Windows 10 Enterprise or Windows 10 Education editions. Therefore, there isn’t anything additional you need to do to start taking advantage of Delivery Optimization for Office background updates.



Any G Suite administrator trying to get to grips with Windows 10 update management will find a lot in common between the new Microsoft model and that of Chrome.

In many respects the transition will be harder for the local Windows admin, who must now be starting to realise that in future Microsoft will be running the management plane for Windows clients as well as the user directory and, as a consequence, the reign of the local server is slowly drawing to a close.