
Appropriate Filtering

Guide for education settings and filtering providers about establishing 'appropriate levels of filtering'

Schools in England (and Wales) are required “to ensure children are safe from terrorist and extremist material when accessing the internet in school, including by establishing appropriate levels of filtering”. Furthermore, the Department for Education’s statutory guidance ‘Keeping Children Safe in Education’ obliges schools and colleges in England to “ensure appropriate filters and appropriate monitoring systems are in place. Children should not be able to access harmful or inappropriate material from the school or college’s IT system”. However, schools will need to “be careful that ‘over blocking’ does not lead to unreasonable restrictions as to what children can be taught with regards to online teaching and safeguarding”.

Ofsted concluded in 2010 that “Pupils in the schools that had ‘managed’ systems had better knowledge and understanding of how to stay safe than those in schools with ‘locked down’ systems. Pupils were more vulnerable overall when schools used locked down systems because they were not given enough opportunities to learn how to assess and manage risk for themselves.”

As set out in the Scottish Government’s national action plan on internet safety, schools in Scotland are expected to “have policies in place relating to the use of IT and to use filtering as a means of restricting access to harmful content.”

The aim of this document is to help education settings (including early years, schools and FE) and filtering providers understand what should be considered ‘appropriate filtering’.

It is important to recognise that no filtering system can be 100% effective; filtering needs to be supported by good teaching and learning practice and effective supervision.

Illegal Online Content

When considering filtering providers or systems, schools should ensure that access to illegal content is blocked; specifically, that the filtering providers:

  • Are IWF members and block access to illegal Child Sexual Abuse Material (CSAM)
  • Integrate the ‘police assessed list of unlawful terrorist content, produced on behalf of the Home Office’

Inappropriate Online Content

Recognising that no filter can be guaranteed to be 100% effective, schools should be satisfied that their filtering system manages the following content (and web search):

  • Discrimination: promotes the unjust or prejudicial treatment of people on the grounds of the protected characteristics listed in the Equality Act 2010
  • Drugs / Substance abuse: displays or promotes the illegal use of drugs or substances
  • Extremism: promotes terrorism and terrorist ideologies, violence or intolerance
  • Malware / Hacking: promotes the compromising of systems, including anonymous browsing and other filter-bypass tools, as well as sites hosting malicious content
  • Pornography: displays sexual acts or explicit images
  • Piracy and copyright theft: includes the illegal provision of copyrighted material
  • Self harm: promotes or displays deliberate self harm (including suicide and eating disorders)
  • Violence: displays or promotes the use of physical force intended to hurt or kill

This list should not be considered exhaustive, and providers should be able to demonstrate how their system manages this content and many other aspects.

Regarding the retention of logfiles (internet history), schools should be clear about their provider's data retention policy.

Providers should be clear about how their system avoids ‘over blocking’, so that access is not subject to unreasonable restrictions.

Filtering System Features

Additionally, schools should consider whether their filtering system meets the following principles:

  • Age appropriate, differentiated filtering – includes the ability to vary filtering strength appropriate to age and role
  • Circumvention – the extent and ability to identify and manage technologies and techniques used to circumvent the system, for example VPNs, proxy services and DNS over HTTPS
  • Control – the ability, and ease of use, that allows schools to control the filter themselves to permit or deny access to specific content
  • Filtering Policy – the filtering provider publishes a rationale that details their approach to filtering, including classification and categorisation as well as over blocking
  • Group/Multi-site Management – the ability to deploy central policy and provide central oversight or a dashboard
  • Identification – the filtering system should have the ability to identify users
  • Mobile and App content – mobile and app content is often delivered via entirely different mechanisms from content delivered through a traditional web browser. To what extent does the filtering system block inappropriate content delivered via mobile and app technologies (beyond typical web browser delivered content)?
  • Multiple language support – the ability of the system to manage relevant languages
  • Network level – filtering should be applied at ‘network level’, i.e. not reliant on any software on user devices (a minimal sketch of this kind of policy decision follows this list)
  • Reporting mechanism – the ability to report inappropriate content for access or blocking
  • Reports – the system offers clear historical information on the websites visited by your users
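
To make several of these principles concrete, the sketch below shows, in Python, the kind of allow/block decision a network-level, age-differentiated filter makes for a single web request. It is a minimal illustration only: the domains, categories, group names and policy structure are hypothetical and do not represent any particular provider's system.

    # Minimal illustrative sketch of a network-level, age-differentiated
    # filtering decision. All domains, categories and group names below are
    # hypothetical examples, not any provider's real data or API.

    # Provider-maintained categorisation of domains (illustrative entries).
    CATEGORISED_DOMAINS = {
        "adult-example.test": "pornography",
        "vpn-example.test": "circumvention",   # VPN / proxy / DoH endpoints
        "games-example.test": "games",
    }

    # Filtering strength varies by age and role ("age appropriate,
    # differentiated filtering"): each group lists categories it may not access.
    GROUP_POLICIES = {
        "primary": {"pornography", "circumvention", "games"},
        "secondary": {"pornography", "circumvention"},
        "staff": {"circumvention"},
    }

    # School-controlled overrides (the "Control" principle): locally permitted
    # or denied domains take precedence over the provider's categorisation.
    LOCAL_ALLOW = {"curriculum-resource.test"}
    LOCAL_DENY = {"known-bad-example.test"}


    def decide(domain: str, user_group: str) -> str:
        """Return 'allow' or 'block' for one request; logging each outcome per
        user is what supports the "Identification" and "Reports" principles."""
        if domain in LOCAL_ALLOW:
            return "allow"
        if domain in LOCAL_DENY:
            return "block"
        category = CATEGORISED_DOMAINS.get(domain)
        if category in GROUP_POLICIES.get(user_group, set()):
            return "block"
        return "allow"


    print(decide("vpn-example.test", "secondary"))        # block
    print(decide("games-example.test", "staff"))          # allow
    print(decide("curriculum-resource.test", "primary"))  # allow

In a real deployment this kind of logic sits in the provider's network-level infrastructure (for example as part of DNS resolution or a web proxy), is driven by continuously updated category lists, and records each decision so that clear historical reports can be produced.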

Schools and colleges should ensure that there is sufficient capability and capacity in those responsible for, and those managing, the filtering system. The UK Safer Internet Centre Helpline may be a source of support for schools looking for further advice in this regard.

Filtering systems are only ever a tool in helping to safeguard children when online and schools have an obligation to “consider how children may be taught about safeguarding, including online, through teaching and learning opportunities, as part of providing a broad and balanced curriculum”. To assist schools and colleges in shaping an effective curriculum, UKCCIS have published Education for a Connected World.

Acknowledgment

This guidance has been developed by the South West Grid for Learning, as coordinators of the UK Safer Internet Centre, in partnership and consultation with the 120 national 360 degree safe 'eSafety Mark' assessors and the NEN Safeguarding group.

The UK Safer Internet Centre recommends that those responsible for schools and colleges undertake (and document) an annual online safety risk assessment covering their online safety provision, including filtering (and monitoring). The risk assessment should consider the risks that both children and staff may encounter online, together with associated mitigating actions and activities. The UK Safer Internet Centre will be publishing further guidance in 2018 in support of this recommendation.
 
A risk assessment module has been integrated into 360 degree safe. Here, schools can consider, identify and record the risks posed by technology and the internet to their school, children, staff and parents.