
moving IT to the cloud with services, not servers

Tuesday, 18 September 2018

Managing G Suite & Office365 in a MAT

A cloud based approach to Single Sign On and user provisioning.


An IT setup that combines both Microsoft Office365 and Google G Suite is more common than you might expect, especially in UK schools. Managing both platforms is fairly straightforward for a single site, but things start to get interesting when you try to incorporate multiple schools, all feeding back into a central organisation - a situation commonly found within regional districts and trusts.

In a series of posts we’ll examine one approach to management based on a recent engagement with a Multi-Academy Trust (MAT) that planned to incorporate Google G Suite across a number of schools currently using Microsoft Office365. The goal was not to replace Microsoft Office365 but to provide each school with open access to both services. The trust was looking to manage both Office365 and G Suite as a single tenancy/organisation, with each school existing as an independent policy unit (sub-organisation).

As a prime objective the trust intended to employ Azure AD as the main directory and authentication source across all the sites. User management would remain in on-site AD, with those actions driving the automatic creation and updating of accounts in both Azure AD and G Suite. The plan also included the adoption of Chromebooks as the preferred student device for both Office365 and G Suite, using Azure AD as the authentication source.

An earlier proposal involved synchronising data from each remote site. In this model each school would run Azure Active Directory (AD) Connect to provision user accounts into Office 365 (Azure AD), while the complementary service, Google Cloud Directory Sync (GCDS), maintained the accounts in G Suite. Each site would also run the Active Directory Federation Service (ADFS) to authenticate users against the local AD database.

During the planning stage it was clear that this approach had a number of problems.

  • The rollout involved over 40 locations, some of which were small primary schools with limited space and resources. Some sites required additional investment in on-site hardware as well as software upgrades, which only added to the technical complexity and the ongoing support burden. The timescales for the deployment did not allow for an upgrade program.
  • A Google organisation can only direct SSO towards a single external source. Therefore the plan to have a unified Google organisation talking directly to multiple on-site ADFS sources couldn’t be supported.
  • Google Cloud Directory Sync (GCDS) was never designed to work in a multi-site setup. Without a complex set of exclusion rules there was a real risk that accounts from one school could be suspended by a second copy of GCDS running at another site. In a smaller deployment this might be manageable, but the trust required a solution that could be scaled up without hitting a configuration bottleneck. Although running multiple G Suite organisations was one option, this didn’t fit with the trust’s overall strategy.

After a review period it was decided to trial a second approach that provisioned and authenticated users directly from Azure AD using the techniques described in an earlier post. Although this had proved successful for a single school, there was no published documentation that described a more complex deployment involving multiple sites and domains. While this presented a significant risk, the general approach had a number of benefits for the trust.

In order to support Office365 each school synchronised to the central Office365 tenancy using Azure Active Directory (AD) Connect, so a centralised directory that Google could query was already in place - Azure AD. However, some of the larger schools maintained a local ADFS service while others simply synchronised local passwords into Azure AD. It was hoped that by pointing the G Suite organisation at Azure AD as the single target it would act as a broker, authenticating against cloud accounts and synchronised passwords, or deferring to on-site ADFS as required.

Since Azure AD acts as an integrated directory for the entire trust, it made sense to try and provision accounts into G Suite directly from this source rather than from each local AD. In this way configuration was centrally controlled and did not require on-site installations of GCDS. In fact the whole solution was zero-touch on the remote sites, which enabled a much faster rollout schedule.


So can this approach be made to work in a MAT? Yes it can.

The trust now has G Suite authentication running centrally from the Azure AD database while maintaining day-to-day user administration through local AD at each site. Chromebooks present an Azure AD logon challenge on startup, directed to the appropriate authentication source, either Azure or on-site ADFS. There’s no requirement for GCDS as user accounts are created and suspended in G Suite using the auto-provisioning features provided by Azure AD.

Essentially the solution proved to be a scaled up version of the technique outlined in the earlier post with a few tweaks. Certainly there were some important lessons learnt during the process and we’ll be outlining these in more detail in a future post.

If you’re interested in attempting a similar program in your organisation please drop me a line through the contact form and I’d be happy to link you up with the expertise.

Tuesday, 28 August 2018

The problem of local file shares in a SaaS school.

For schools moving to a SaaS based model, the requirement for local files is a difficult problem to solve.

This is particularly true of established sites that have curriculum requirements and admin workflows that depend heavily on traditional Windows file shares.  Copying the data to a cloud repository sounds like a good idea but when the Year 10 Media class try to open their video projects and 20 minutes later everybody is still looking at a spinning wheel you’d better expect some negative feedback.

Whether you work in the cloud or on premises the golden rule when working with large data sets is always the same: both the data and the application have to be in the same place.

If the curriculum demands that the application is the Adobe suite or a video editing package hosted on a high-end workstation, then the data has to be delivered from a local store to ensure an acceptable level of performance. However, after installing the file server, the failover host and the multi-tier tape backup system, you may find that your bold serverless initiative is now in tatters.

The answer is to have the file shares stored in the cloud but accessed from on-site which doesn’t sound possible, but is.

One of the many storage options offered by the Microsoft cloud service is Azure Files. It’s a simple concept that allows you to define an area of storage and then advertise it as a standard Windows file share. You don’t have to worry about servers or redundancy; all that is handled by the fine folks at Microsoft. Users can consume and interact with the share in exactly the same way as if it were hosted locally, which solves the first part of the puzzle but not the second.
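As a sketch of what that looks like from a Windows client, the command below maps an Azure Files share as a normal network drive. The storage account name, share name and key are all placeholders, not real values:

```shell
REM Sketch (Windows cmd): map an Azure Files share as drive Z:.
REM "mystorageaccount", "myshare" and the key are placeholders -
REM substitute your own storage account details.
REM Note: outbound port 445 must be open for SMB traffic to reach Azure.
net use Z: \\mystorageaccount.file.core.windows.net\myshare /user:Azure\mystorageaccount YourStorageAccountKeyHere
```

From the user’s point of view Z: then behaves like any other mapped network drive.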

In response to this Microsoft have announced the general availability of a new service called Azure File Sync (AFS), which is a caching engine for Azure Files. The technical details are listed here, but in simple terms an agent on a local server synchronizes the contents of an Azure Files store to local storage and then keeps the two stores in sync. Files now open with the speed of local storage but are hosted in the cloud.

That’s pretty clever but the advantages don’t stop there. Using another feature termed ‘cloud tiering’, it's possible to synchronize only the files that are currently active while keeping all the old stuff in the cloud. To any user browsing the share it would appear that all the files are stored locally. Users have access to all the file properties and permissions settings just as before; data is retrieved from Azure Files only when the file is opened. Access the same file a second time and the data is now considered ‘hot’ and will be maintained on the local drive, with changes written back to Azure as a background process.

The Year 10 media class issue is solved. You have the fast response of a local file store while still maintaining the advantages of cloud storage.

Let’s run through a few of the other advantages you get with Azure File Sync.

If implemented properly the local server is reduced to the status of a cache appliance. By only holding copies of active files it can host a multi-TB share while only having an 800 GB data drive, and none of the local data requires backing up. Azure File Sync comes with a useful facility that will enable fast recovery of the file system onto any suitable hardware. By turning aging file servers into cache appliances their useful life could be significantly extended.

The Azure file store can act as a central repository for a group. By consolidating local file shares into a central store and then synchronizing data out to edge sites you get the benefits of a local share but without the problems of maintaining an on-site data silo. In this model all backups are done using Azure Backup services (no hardware required) and users have the ability to recover data directly using the facility built into the Windows UI.

With all this sweetness there has to be a little sour.

There are a number of technical limitations that still need ironing out. The service has a 4TB limit on a single Azure Files share, but this is expected to be increased to 100TB in the near future.

Azure File Sync uses a simple conflict-resolution strategy as it doesn’t pretend to be a fully featured collaboration platform. If a file is updated from two server end-points at the same time, the most recent change will be recorded in the original file. The older update will result in a new file with the name of the source server and a conflict number appended to the original file name. It’s up to the user to manually incorporate the changes if required.

However, the biggest hurdle to general adoption within education might be the question: “How much does it cost?”

Subscription charges based on usage apply to the central store and prices vary between each Azure region. At the time of writing a 1TB store using Local Redundancy in Europe West will cost around £46/month (£553 pa). Standard data egress and transaction rates apply to the central store, as well as to any additional capacity and transactions used on the share snapshots. There’s also a cost to implement Azure File Sync, which amounts to £5.50/month (£66 pa) per server end-point, and a charge for any additional Azure backup services you may choose to employ.
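To make the arithmetic concrete, here’s a rough sketch that totals just the two headline charges quoted above (storage per TB plus the per-server end-point fee). It deliberately ignores egress, transactions and snapshot capacity, which can’t be known in advance:

```shell
#!/usr/bin/env bash
# Rough annual AFS cost from the list prices quoted in this post:
# ~GBP 46/month per 1TB of storage plus GBP 5.50/month per server
# end-point. Egress, transactions and snapshots are excluded.
afs_annual_cost() {
  local tb="$1" endpoints="$2"
  awk -v tb="$tb" -v ep="$endpoints" \
    'BEGIN { printf "%.2f\n", 12 * (46 * tb + 5.50 * ep) }'
}
```

So a 1TB store cached on a single school server (`afs_annual_cost 1 1`) starts at around £618 pa before any usage-based charges.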

So the answer to the question is: for a 1TB store AFS will cost upwards of £553 pa, but the actual cost will depend on a whole bunch of factors which you can’t estimate until you start using it.

Let’s assume the cost is £700 pa. I think that’s a pretty good deal for a fully managed distributed file system, but the problem is: £700 pa compared with what?

How many schools understand the cost of storing data locally? To the casual observer it might even appear free. After all, both solutions still require local hardware, and Microsoft’s generous EDU licencing agreements make standing up another file server a cheap option. So the argument could be raised: why put your data in the cloud and then commit to an open-ended charging scheme just to access it?

From a personal point of view I can think of a dozen technical reasons why it would be a good idea to implement AFS, but they could all be overridden by fears and concerns over pricing and lock-in. Microsoft’s response to all this is simple: reduce the subsidy that education receives for local licencing until Azure looks like an attractive option. It’s a long-term strategy that will win out in the end.

In the short term it would be useful if education could find a way to place a real cost on local data storage, particularly in the brave new world of GDPR, so that schools can properly evaluate SaaS offerings such as Azure File Sync. If not, they could be missing out on a good deal.

Sunday, 22 July 2018

Login as A but send email as B

Operating your G Suite inbox on a separate domain to your logon address.

In a simple world a school would create a Google organization using the internet domain employed by the external website and public email and that would be the end of it.

Unfortunately we don’t live in a simple world, and the direct relationship between the email address and the user logon is often affected by a change in circumstances.

For instance, the school might be planning to consolidate under a standardised Trust or District logon as part of a rebranding exercise. It’s also possible that the organisation was originally created using a domain that was less than ideal. After all, east-walthamstow-college-of-arts.co.uk is a great descriptor but a tedious logon address.

Whatever the case, most organizations are reluctant to give up an email address that is printed on stationery, external signage or embedded in software. So for a variety of reasons a school may require a different logon address to the one that routes mail.

In this example our fictional college wants to move to ewca.co.uk as the logon identifier for the long suffering students and staff but maintain east-walthamstow-college-of-arts.co.uk as the primary email address on all G Suite accounts. So how can this be achieved?


The first thing to understand is that in order to use ewca.co.uk for any purpose it must be registered within G Suite as a secondary domain. This process is fairly straightforward and is well documented by Google so there’s little point repeating it here.

Once the secondary domain is verified the G Suite administrator has the ability to upgrade each user’s primary address from student@east-walthamstow-college-of-arts.co.uk to student@ewca.co.uk. It’s a simple select-and-save action; Google takes care of all the backend housekeeping, with a few advisory notes that are listed later.

As soon as the account is updated the user can log on as student@ewca.co.uk while maintaining all the features and data of the original account. As a bonus the old student@east-walthamstow-college-of-arts.co.uk address is pinned to the account as an alias, which allows the user to continue to receive mail.

Job done. Well not quite.

By default the primary (logon) address is the one that GMail uses when it sends mail. Therefore although the user can receive mail on student@east-walthamstow-college-of-arts.co.uk, they are sending on student@ewca.co.uk, which is not quite what we want.

Fortunately GMail has a way of fixing that.

In the 'Settings' dialog of the user’s GMail, navigate to the 'Accounts' tab, where you’ll find the 'Send mail as' option.


Selecting 'Add another email address' allows the user to add the new alias as a send address, after which it can be upgraded to the new default.


Once that’s been done the account will send mail from the student@east-walthamstow-college-of-arts.co.uk address by default, and this address will be used when replying to mail regardless of which address received the original message.

This seems pretty straightforward but there’s an obvious problem: this is a user-driven process, and none of the actions can be controlled or managed from the admin console. Even allowing for a well-informed student and staff body, publishing a crib-sheet for each user is going to lead to issues.

Mistyping the alias, or failing to do anything at all, will result in mail moving in unexpected directions. A manual process might be manageable for 100 users but not 1,000+.

Faced with a problem like this the solution is always the same…  send for GAM.

GAM is an open-source project that exposes the Google APIs as a simple command line interface that you can use to build batch files. It’s an essential component of every G Suite admin’s toolkit.

Fortunately GAM has a command for this, as it has for most functions.

gam user student sendas student@east-walthamstow-college-of-arts.co.uk "Student" replyto student@east-walthamstow-college-of-arts.co.uk default treatasalias true

It should be pretty clear how each of the elements in the command relates to the options in the user dialog shown above. Of course GAM can also help out with the first part of the project, updating the user’s primary logon address, using a command similar to that shown below.

gam update user student username student@ewca.co.uk

Therefore the process is reduced to running two GAM commands on each user account.
  • Change the user’s primary logon address to the new secondary domain.
  • Update the default send address to the email alias created by the first command.

Note: GAM provides methods to draw data from CSV files that will update hundreds of users in a single command.
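As a sketch of that batch approach, the hypothetical script below reads a CSV with columns old_email,new_email,name (an assumed layout, not something GAM mandates) and prints both commands for each user, so the output can be reviewed before anything touches the live directory:

```shell
#!/usr/bin/env bash
# Sketch: generate the two GAM commands per user from a CSV of
# old_email,new_email,name (hypothetical column layout). Prints the
# commands for review rather than executing them; this simple parser
# assumes no commas inside the fields themselves.
generate_gam_commands() {
  local csv="$1"
  # Skip the header row, then emit both commands for each user:
  # first the logon rename, then the default send-as alias.
  tail -n +2 "$csv" | while IFS=, read -r old_email new_email name; do
    echo "gam update user ${old_email} username ${new_email}"
    echo "gam user ${new_email} sendas ${old_email} \"${name}\" replyto ${old_email} default treatasalias true"
  done
}
```

Running `generate_gam_commands rename.csv > rename.sh` produces a script that can be checked and then executed under proper change control.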

For a large school it’s not recommended to do this in the middle of the day. The process of renaming can take up to 10 minutes to propagate across all services, and you must allow at least 24 hours for the directory to catch up. Like all major updates it should be handled using proper change control mechanisms and tested on a subset of users.

A couple of points worth noting.

The GMail user still has the ability to update the send address by updating the configuration in the settings dialog. GAM will make the change but it’s not locked down in any way; the more curious user can and probably will mess with this at some point.

Lastly, larger organisations may be using utilities such as Google Cloud Directory Sync to maintain G Suite user accounts, which adds an additional degree of complexity. In this case the update to the primary user logon needs to be matched with a change in the synchronisation rules, otherwise you run the risk of creating duplicate accounts. Again, it’s best to test on a group of test users first.

Happy GAM’ing.