Guide for education settings and filtering providers about establishing 'appropriate levels of filtering'
The aim of this guide is to help education settings (including early years, schools and FE) and filtering providers understand what should be considered 'appropriate filtering'.
It is important to recognise that no filtering system can be 100% effective; filtering needs to be supported by good teaching and learning practice and effective supervision.
Illegal Online Content
In considering filtering providers or systems, schools should ensure that access to illegal content is blocked; specifically, that the filtering providers:
- Are IWF members and block access to illegal Child Abuse Images and Content (CAIC)
- Integrate the ‘police assessed list of unlawful terrorist content, produced on behalf of the Home Office’
Inappropriate Online Content
Recognising that no filter can be guaranteed to be 100% effective, schools should be satisfied that their filtering system manages the following content (and web search):
- Discrimination: promotes the unjust or prejudicial treatment of people on the grounds of the protected characteristics listed in the Equality Act 2010
- Drugs / Substance abuse: displays or promotes the illegal use of drugs or substances
- Extremism: promotes terrorism and terrorist ideologies, violence or intolerance
- Malware / Hacking: promotes the compromising of systems, including anonymous browsing and other filter-bypass tools, as well as sites hosting malicious content
- Pornography: displays sexual acts or explicit images
- Piracy and copyright theft: includes the illegal provision of copyrighted material
- Self-harm: promotes or displays deliberate self-harm (including suicide and eating disorders)
- Violence: displays or promotes the use of physical force intended to hurt or kill
This list should not be considered exhaustive; providers should be able to demonstrate how their system manages this content and many other aspects.
Providers should also be clear about how their system avoids over-blocking, so that access is not unreasonably restricted.
Filtering System Features
Additionally, schools should consider whether their filtering system meets the following principles:
- Age appropriate, differentiated filtering – includes the ability to vary filtering strength appropriate to age and role
- Control – the system is easy to use and allows schools to control the filter themselves, permitting or denying access to specific content
- Filtering Policy – the filtering provider publishes a rationale that details their approach to filtering, including classification and categorisation as well as over-blocking
- Identification - the filtering system should have the ability to identify users
- Mobile and App content – mobile and app content is often delivered via entirely different mechanisms from content delivered through a traditional web browser. Schools should consider to what extent the filtering system blocks inappropriate content delivered via mobile and app technologies (beyond typical web browser delivered content)
- Multiple language support – the ability for the system to manage relevant languages
- Network level – filtering should be applied at ‘network level’, i.e. not reliant on any software on user devices
- Reporting mechanism – the ability to report content so that it can be reviewed for access or blocking
- Reports – the system offers clear historical information on the websites visited by your users
Filtering systems are only ever a tool in helping to safeguard children online, and schools have an obligation to “consider how children may be taught about safeguarding, including online, through teaching and learning opportunities, as part of providing a broad and balanced curriculum”.
This guidance has been developed by the South West Grid for Learning, as coordinators of the UK Safer Internet Centre, in partnership and consultation with the 120 national ‘360 degree safe E-Safety Mark’ assessors (www.360safe.org.uk) and the NEN Safeguarding group (www.nen.gov.uk).