It used to be so simple. Where did it all go wrong?
In the past, managing student web activity usually meant installing an in-house proxy server, commonly Microsoft Forefront Threat Management Gateway (TMG) or, further back, Microsoft ISA Server. Client web traffic was forwarded to the proxy server, which fetched content from the web on the user's behalf, but only after passing each request through a series of filters that denied access to sites displaying inappropriate material. All of this was neatly integrated with Microsoft Active Directory (AD) security for ease of management.
Alternatively, a school could rely on a ‘clean’ feed provided by an educational authority, district or ISP, so long as it was happy with a ‘one-size-fits-all’ approach.
All was well with the world… and then it went horribly wrong.
It’s difficult to say exactly when this happened, but Microsoft’s formal announcement in 2012 of the end of life for TMG, without a direct replacement for outbound URL filtering, was probably the start, although things were not well in the ‘walled garden’ before then. The idea that you could effectively ‘whitelist’ the internet had long gone, and the expansion of platforms such as YouTube to include educational content meant that IT services were under pressure to unblock sites previously viewed as undesirable.
However, there are other suspects in this crime.
The growth of interest in BYOD and 1:1 tablet programs meant that proxy settings had to be configured on each device, adding a new layer of administration outside of Active Directory group policy.
In addition, students using iPads and Android tablets were no longer authenticating against Active Directory before browsing the web. A workaround required the student to provide a username and password through a captive web portal before proceeding. This worked well enough for simple web browsing, but proved a challenge for tablet apps, which were not expecting their “call home” request to return the captive portal page instead.
Things were getting difficult.
For a while the solution seemed to be to install the filtering service in-line with the firewall (transparent mode), silently ‘snooping’ on the packets as they passed between the user device and the internet. This certainly made device configuration easier (there wasn’t any), which fixed the BYOD and app issues, although user identification remained a challenge.
All was well for a while, until Google announced that, as of 23rd June 2015, its search services would move all search results behind SSL encryption.
This meant that the session from the device to the internet would be encrypted end to end, and although a transparent proxy could still see the data packets, it could no longer decipher their contents to make any sensible filtering decisions. So as Google and other content providers made the switch to secure communication, the transparent proxy, which had solved so many problems, effectively went ‘blind’.
As a quick fix, schools simply blocked sites that used a secure connection, or bypassed the proxy for them entirely, but this quickly became impractical as more external sites adopted the secure standard.
The transparent proxy approach has survived by inspecting the small amount of information that TLS still sends in the clear, notably the hostname in the Server Name Indication (SNI) field of the handshake. Using this, the gateway knows that a device has contacted YouTube, but has no further detail as to which resource on that site the student requested.
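To make the point concrete, here is a minimal sketch of the one thing a passive gateway can still read from an encrypted session: the cleartext hostname carried in the SNI extension of the TLS ClientHello (RFC 6066). This is illustrative code, not a production parser; the helper names are my own, and `build_client_hello` fabricates a synthetic handshake message purely so the example is self-contained.

```python
import struct

def extract_sni(record: bytes):
    """Pull the server_name (SNI) hostname out of a raw TLS
    ClientHello record, or return None if it isn't one."""
    if len(record) < 5 or record[0] != 0x16:   # 0x16 = handshake record
        return None
    pos = 5                                    # skip record header
    if record[pos] != 0x01:                    # 0x01 = ClientHello
        return None
    pos += 4                                   # handshake type + 3-byte length
    pos += 2 + 32                              # client version + random
    pos += 1 + record[pos]                     # session id
    (cs_len,) = struct.unpack_from("!H", record, pos)
    pos += 2 + cs_len                          # cipher suites
    pos += 1 + record[pos]                     # compression methods
    (ext_total,) = struct.unpack_from("!H", record, pos)
    pos += 2
    end = pos + ext_total
    while pos + 4 <= end:                      # walk the extensions
        ext_type, ext_len = struct.unpack_from("!HH", record, pos)
        pos += 4
        if ext_type == 0x0000:                 # server_name extension
            (name_len,) = struct.unpack_from("!H", record, pos + 3)
            return record[pos + 5 : pos + 5 + name_len].decode("ascii")
        pos += ext_len
    return None

def build_client_hello(hostname: str) -> bytes:
    """Fabricate a minimal ClientHello carrying only an SNI extension
    (synthetic test data, not a real handshake from a browser)."""
    name = hostname.encode("ascii")
    sni = (struct.pack("!HH", 0, len(name) + 5)            # type, ext length
           + struct.pack("!HBH", len(name) + 3, 0, len(name))
           + name)
    body = (b"\x03\x03" + b"\x00" * 32          # version + random
            + b"\x00"                           # empty session id
            + struct.pack("!H", 2) + b"\x13\x01"  # one cipher suite
            + b"\x01\x00"                       # null compression
            + struct.pack("!H", len(sni)) + sni)
    handshake = b"\x01" + len(body).to_bytes(3, "big") + body
    return b"\x16\x03\x01" + struct.pack("!H", len(handshake)) + handshake

print(extract_sni(build_client_hello("www.youtube.com")))  # www.youtube.com
```

Everything after that hostname, including the full URL, query string and page content, travels encrypted, which is exactly why hostname-level filtering is all a passive gateway can offer.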
A number of workarounds quickly emerged, but most fell back on a variation of the ‘relay’ model, whereby the client holds one secure connection to the gateway and the gateway maintains a second secure connection to the external service. Unfortunately this simply reintroduced all the historical issues of client configuration (with the addition of certificate deployment), as well as loading the proxy with a significant amount of extra processing as it decrypts and re-encrypts every packet.
Which is pretty much where we are today. Frankly it's a bit of a mess.
So what’s likely to happen in the future, and how can SaaS help?
There’s little doubt that sites will continue to adopt secure protocols until encryption is the de facto standard for web traffic. Traditional on-premise systems will just have to come to terms with the fact that the data stream from client to host will be encrypted.
Protocols like TLS expose some metadata about the stream, but not enough to support a comprehensive filtering service. That visibility is an incidental by-product of the protocol, not its purpose, so we can’t expect improvements in this area to deliver the solution.
Currently, on-premise proxy servers can intercept the secure channel by acting as an intermediary, decrypting and re-encrypting the packets as they pass through, but this activity is processor intensive. Achieving it at any scale without adding latency is not going to be easy, especially as the proportion of encrypted traffic rapidly increases.
So when the renewal of the proxy support contract comes around, do you invest in a tin box that can handle your school’s current data load, or one sized for the projection for the next three years? Are you going to be forced to purchase an expensive device that can handle the peak load prior to exam season, only to leave it idle for four months of the year?
Remember, you still have the problems of client proxy management, certificate deployment and user authentication on top of this.
Or maybe we should just throw away all the historical baggage and design a system for the world of mobility and SaaS.
And if we did this, what would it look like and what advantages do you gain?
Web Filtering for the SaaS School - Part 2.
Further Reading.
Adam’s blog gives an excellent technical summary of this issue.
http://adamwelch.com/2014/11/google-ssl-search/