Web filtering for business: Keep your secrets safe, and keep your employees happy
Mon, 15th Aug 2016

Web filtering. The phrase connotes keeping employees from spending too much time monitoring Beanie Baby auctions on eBay, and stopping schoolchildren from encountering (accidentally or deliberately) naughty images on the internet. If only it were that simple: nowadays, web filtering goes far beyond monitoring staff productivity and maintaining the innocence of childhood. For nearly every organisation today, web filtering should be considered an absolute necessity. Small business, K-12 school district, Fortune 500, non-profit or government… it doesn't matter. The unfiltered internet is not your friend, and legally it's a liability: a lawsuit waiting to happen.

Web filtering means blocking internet applications – including browsers – from contacting or retrieving content from websites that violate an Acceptable Use Policy (AUP). The policy might set rules blocking some specific websites (like a competitor's website). It might block some types of content (like pornography), or detected malware, or even access to external email systems via browser or dedicated clients.
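
To make that concrete, here is a minimal sketch in Python of such a check. The domain names and the category table are invented placeholders; a real deployment would query a filtering vendor's database rather than a hard-coded lookup.

    from urllib.parse import urlparse

    # Hypothetical category assignments; a real filter would query a
    # vendor-maintained database instead of a hard-coded table.
    DOMAIN_CATEGORIES = {
        "competitor.example": "competitor",
        "webmail.example": "external-email",
        "adult.example": "pornography",
    }

    # Categories the organisation's AUP says to block.
    BLOCKED_CATEGORIES = {"competitor", "external-email", "pornography", "malware"}

    def is_allowed(url: str) -> bool:
        """Return True if the URL passes the AUP, False if it should be blocked."""
        host = urlparse(url).hostname or ""
        category = DOMAIN_CATEGORIES.get(host, "uncategorised")
        return category not in BLOCKED_CATEGORIES

    print(is_allowed("https://adult.example/video"))  # False: blocked by the AUP
    print(is_allowed("https://news.example/story"))   # True: uncategorised, allowed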

In some cases, the AUP might include what we might call government-mandated restrictions (like certain websites in hostile countries, or specific news sources).

Unacceptable Use in the AUP

The specifics of the AUP might be up to the organisation to set entirely on its own; that would be the case for a small business, perhaps. Government-funded organisations, such as schools or military contractors, might have specific AUP requirements placed on them by funders or government regulators, making the AUP a compliance and governance issue as well. And of course, legal counsel should be sought when creating policies that balance an employee's ability to access content of his/her choice against the company's obligation to protect the employee (or the company) from unwanted content.

It sounds easy – the organisation sets an AUP, consulting legal, IT and the executive suite. The IT department implements the AUP through web filtering: perhaps with software installed and configured on devices; perhaps through firewall settings at the network level; and perhaps through filters managed by the internet service provider. It's not that simple, however. The internet is constantly changing; employees are adept at finding ways around web filters; and besides, it's tricky to translate policies written in English (as in the legal policy document) into technological actions. We'll get into that a bit more shortly. First, let's look more closely at why organisations need those Acceptable Use Policies.
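
To see why that translation is tricky, consider a toy sketch of a written AUP rendered as machine-readable rules. The clause texts, rule fields and time window are all invented for illustration; they show the shape of the problem, not any particular product's schema.

    # Each AUP clause pairs the English wording with a machine-readable rule.
    AUP_CLAUSES = [
        ("Employees may not access pornographic material",
         {"action": "block", "category": "pornography"}),
        ("Streaming video is prohibited during business hours",
         {"action": "block", "category": "streaming", "hours": (9, 17)}),
        ("Access to competitor websites is forbidden",
         {"action": "block", "domain": "competitor.example"}),
    ]

    def compile_rules(clauses):
        """Keep only the machine-readable half of each clause."""
        return [rule for _text, rule in clauses]

    def applies_now(rule, hour):
        """A rule with no 'hours' field applies around the clock."""
        start, end = rule.get("hours", (0, 24))
        return start <= hour < end

    rules = compile_rules(AUP_CLAUSES)
    print([r for r in rules if applies_now(r, hour=13)])  # all three apply at 1pm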

Improving employee productivity. This is the low-hanging fruit. You may not want employees spending too much time on Facebook on their company computers. (Of course, if they are permitted to bring mobile devices into the office, they can still access social media via cellular.) That's a policy consideration, though the jury is out on whether a blanket block is the best way to improve productivity.

Preserving bandwidth. For technical reasons, you may not want employees streaming Netflix movies or Hulu-hosted classic TV shows across the business network. Seinfeld is fun, but not on company bandwidth. As with social media, this is truly up to the organisation to decide.

Blocking email access. Many organisations do not want their employees accessing external email services from business computers. That's not only for productivity purposes; it also makes it harder to engage in unapproved communications, such as emailing confidential documents to yourself. Merely configuring your corporate email server to block the exfiltration of intellectual property is not enough if users can access personal gmail.com or hushmail.com accounts. Blocking external email requires filtering multiple protocols as well as specific email hosts, and may be required to protect not only your IP but also customers' data, in addition to complying with regulations from organisations like the U.S. Securities and Exchange Commission.
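
A rough sketch of what that multi-protocol filtering involves: dedicated clients are caught by the standard mail ports, while webmail travels over ordinary HTTPS and must be caught by hostname. The port numbers are the standard IANA assignments; the host list is illustrative, and a real deployment would also exempt the corporate mail server.

    # Standard ports used by dedicated email clients.
    BLOCKED_PORTS = {
        25: "SMTP", 465: "SMTPS", 587: "SMTP submission",
        110: "POP3", 995: "POP3S", 143: "IMAP", 993: "IMAPS",
    }

    # Webmail front-ends run over HTTPS (port 443), so they must be
    # matched by hostname rather than port.
    BLOCKED_MAIL_HOSTS = {"mail.google.com", "www.hushmail.com"}

    def verdict(host: str, port: int) -> str:
        if port in BLOCKED_PORTS:
            return f"block: external {BLOCKED_PORTS[port]} traffic"
        if port == 443 and host in BLOCKED_MAIL_HOSTS:
            return "block: webmail host on the AUP list"
        return "allow"

    print(verdict("smtp.example.com", 587))  # block: external SMTP submission traffic
    print(verdict("mail.google.com", 443))   # block: webmail host on the AUP list
    print(verdict("intranet.example", 443))  # allow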

Blocking access to pornography and NSFW content. It's not that you are being a stick-in-the-mud prude, or protecting children. The initialism NSFW (not safe for work) is often used as a joke, but in reality some content can be construed as contributing to a hostile work environment. Just as you must maintain a physically safe work environment – no blocked fire exits, for example – so too must you maintain a safe internet environment. If users can be unwillingly subjected to offensive content by other employees, there may be significant legal, financial and even public-relations consequences if it's seen as harassment.

Blocking access to malware. A senior manager receives a spear-phishing email that looks legit. He clicks the link and, wham: ransomware is on his computer. Or spyware, like a keylogger. Or perhaps a back door that allows further access by hackers. You can train employees over and over, and they will still click on unsafe links in email or on web pages. Anti-malware software on the computer can help, but web filtering is part of a layered approach to anti-malware protection. This applies to trackers as well: as part of the AUP, the web filters may be configured to block ad networks, behaviour trackers and other web services that attempt to glean information about your company and its workers.
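
Here is a simplified sketch of that layered idea: each filter is an independent check, and any one of them can veto a request. The blocklists are placeholders standing in for real threat-intelligence and ad-network feeds.

    # Placeholder blocklists; real deployments subscribe to curated feeds.
    MALWARE_DOMAINS = {"ransomware-drop.example"}
    AD_AND_TRACKER_DOMAINS = {"tracker.example", "ads.example"}

    def malware_filter(host):
        return "malware" if host in MALWARE_DOMAINS else None

    def tracker_filter(host):
        return "ad/tracker network" if host in AD_AND_TRACKER_DOMAINS else None

    FILTER_LAYERS = [malware_filter, tracker_filter]

    def check(host: str) -> str:
        for layer in FILTER_LAYERS:
            reason = layer(host)
            if reason:
                return f"block ({reason})"
        return "allow"  # endpoint anti-malware remains the next layer down

    print(check("ransomware-drop.example"))  # block (malware)
    print(check("tracker.example"))          # block (ad/tracker network)
    print(check("intranet.example"))         # allow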

Blocking access to specific internet applications. Whether you consider it Shadow IT or simply an individual's personal preference, it's up to the AUP to decide which online services should be accessible, whether through an installed application or via a web interface. Think about online storage repositories such as Microsoft OneDrive, Google Drive, Dropbox or Box: personal accounts can be high-bandwidth conduits for exfiltration of vast quantities of valuable IP. Web filtering can help manage the situation, as sketched below.
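
One common approach, sketched here with hypothetical hostnames, is to allow the organisation's sanctioned storage tenant while blocking the personal-account front doors of the same kinds of services:

    # Hypothetical sanctioned corporate tenant; the hostname is
    # illustrative, not a statement about any particular deployment.
    SANCTIONED_STORAGE_HOSTS = {"yourcompany-my.sharepoint.com"}
    PERSONAL_STORAGE_HOSTS = {"www.dropbox.com", "drive.google.com", "onedrive.live.com"}

    def storage_verdict(host: str) -> str:
        if host in SANCTIONED_STORAGE_HOSTS:
            return "allow: sanctioned corporate storage"
        if host in PERSONAL_STORAGE_HOSTS:
            return "block: personal storage is an exfiltration risk"
        return "allow"

    print(storage_verdict("www.dropbox.com"))                # blocked
    print(storage_verdict("yourcompany-my.sharepoint.com"))  # allowed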

Compliance with government regulations. Whether it's a military base commander making a ruling or a government restricting access to news sites out of favour with the current regime, those are rules that often must be followed without question. It's not my purpose here to discuss whether this is “censorship,” though in some cases it certainly is. However, the laws of the United States do not apply outside the United States, and blocking some internet sites or types of web content may be part of the requirements for doing business in some countries or with some governments. What's important here is to ensure that you have effective controls and technology in place to implement the AUP – but don't go broadly beyond it.

Compliance with industry requirements. Consider the requirement that schools and public libraries protect students (and the general public) from content deemed unacceptable in that environment. After all, just because a patron is an adult doesn't mean he/she is allowed to watch pornography on one of the library's publicly accessible computers, or even on his/her own computer on the library's Wi-Fi network.

Think About the Children

A key ingredient in creating an AUP for schools and libraries in the United States is the Children's Internet Protection Act (CIPA), managed by the U.S. Federal Communications Commission. In order to receive government subsidies or discounts, the schools and libraries must comply with these regulations. (Other countries may have an equivalent to these policies.)

To quote from the CIPA rules:

The Children's Internet Protection Act (CIPA) was enacted by Congress in 2000 to address concerns about children's access to obscene or harmful content over the Internet. CIPA imposes certain requirements on schools or libraries that receive discounts for Internet access or internal connections through the E-rate program – a program that makes certain communications services and products more affordable for eligible schools and libraries. In early 2001, the FCC issued rules implementing CIPA and provided updates to those rules in 2011.

Schools and libraries subject to CIPA may not receive the discounts offered by the E-rate program unless they certify that they have an Internet safety policy that includes technology protection measures. The protection measures must block or filter Internet access to pictures that are: (a) obscene; (b) child pornography; or (c) harmful to minors (for computers that are accessed by minors). Before adopting this Internet safety policy, schools and libraries must provide reasonable notice and hold at least one public hearing or meeting to address the proposal.

Schools subject to CIPA have two additional certification requirements: 1) their Internet safety policies must include monitoring the online activities of minors; and 2) as required by the Protecting Children in the 21st Century Act, they must provide for educating minors about appropriate online behavior, including interacting with other individuals on social networking websites and in chat rooms, and cyberbullying awareness and response.

Schools and libraries subject to CIPA are required to adopt and implement an Internet safety policy addressing:

  • Access by minors to inappropriate matter on the Internet;
  • The safety and security of minors when using electronic mail, chat rooms and other forms of direct electronic communications;
  • Unauthorised access, including so-called “hacking,” and other unlawful activities by minors online;
  • Unauthorised disclosure, use, and dissemination of personal information regarding minors; and
  • Measures restricting minors' access to materials harmful to them.

Schools and libraries must certify they are in compliance with CIPA before they can receive E-rate funding.

Those are complex requirements, covering broad types of content, multiple environments (such as chat rooms) and even subjective criteria (like determining which content would be considered cyberbullying). Yet increasingly, schools and libraries must make sure their AUP includes those conditions – and that their web filtering technology is up to the challenge of blocking disallowed content without blocking too much. Fortunately, there are practical resources available, such as the E-Rate Central CIPA Compliance Checklist and an excellent primer from the American Library Association, “Legal Issues: CIPA - Filtering.”

The CIPA is merely one example of the type of governmental or industry filtering practices that are well intended, and which reasonable people might agree are necessary, but that can be quite difficult to implement in practice.

The Best Place to Filter: At the ISP Level

Where can you filter? At the end-user device, on the organisational network, or at the ISP. Frankly, it's too easy for users to circumvent device filters – and they do nothing about rogue devices on the network. Network filters, often implemented by a firewall, are much better, but must be maintained rigorously by the organisation's IT staff, not only to ensure proper functioning, but also to make sure that blacklists, malware detectors, and other static and dynamic filtering criteria are up to date.

In my opinion, the best place to filter is at the ISP level. It's very difficult for an employee to bypass or subvert an ISP-based filter – especially if nobody in the organisation has access to its settings. Also, all customers of the ISP can benefit from some types of filtering: If one client tries to access a website and the ISP's filters determine that it contains malware, countermeasures can be instantly deployed for all of that ISP's customers.
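
That shared benefit might look something like the sketch below: one customer's detection immediately updates a blocklist consulted for every later request, from any customer. The analysis function is a stub standing in for whatever malware detection the ISP actually runs.

    # Blocklist shared across all of the ISP's customers.
    shared_blocklist = set()

    def looks_malicious(host: str) -> bool:
        """Stub for the ISP's malware analysis; a placeholder only."""
        return host.endswith(".malware.example")

    def handle_request(customer: str, host: str) -> str:
        if host in shared_blocklist:
            return f"{customer}: blocked (already on the shared list)"
        if looks_malicious(host):
            shared_blocklist.add(host)  # one detection protects everyone
            return f"{customer}: blocked (new detection, added to shared list)"
        return f"{customer}: allowed"

    print(handle_request("customer-a", "cdn.malware.example"))  # new detection
    print(handle_request("customer-b", "cdn.malware.example"))  # shared benefit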

One such ISP-based web filter is offered by a Canadian security firm, Wedge Networks. It offers a system that enables ISPs to offer Web Filtering as a Service (WFaaS). Wedge says that it has classified 280 million domains into 95 specific categories and, furthermore, that those domains are reviewed by humans – not just bots and algorithms.

Some of the categories Wedge can filter include: Alcohol, Anonymisers, Auction, Criminal Skills, Dating/Social, Drugs, Gambling, Gruesome Content, Hacking, Hate Speech, Malicious Sites, Nudity, P2P/File Sharing, Personal Network Storage, Phishing, Pornography, Profanity, School Cheating Information, Shareware/Freeware, Spam Email URLs, Spyware, Tobacco, Violence, Weapons, Web Mail and Web Phone.

That's quite a list – and presumably some of those categories should be in every organisation's AUP.
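
Put another way, the organisation's AUP becomes a subset of the vendor's taxonomy. Here is a small sketch using the category names above; which categories to block is purely illustrative, not a recommendation.

    # The vendor's taxonomy, from the list above.
    VENDOR_CATEGORIES = {
        "Alcohol", "Anonymisers", "Auction", "Criminal Skills", "Dating/Social",
        "Drugs", "Gambling", "Gruesome Content", "Hacking", "Hate Speech",
        "Malicious Sites", "Nudity", "P2P/File Sharing", "Personal Network Storage",
        "Phishing", "Pornography", "Profanity", "School Cheating Information",
        "Shareware/Freeware", "Spam Email URLs", "Spyware", "Tobacco", "Violence",
        "Weapons", "Web Mail", "Web Phone",
    }

    # Categories almost any AUP would block, plus organisation-specific choices.
    BLOCK_EVERYWHERE = {"Malicious Sites", "Phishing", "Spyware", "Spam Email URLs"}
    ORG_CHOICES = {"Pornography", "Gambling", "Web Mail", "Personal Network Storage"}

    aup_blocklist = (BLOCK_EVERYWHERE | ORG_CHOICES) & VENDOR_CATEGORIES
    print(sorted(aup_blocklist))  # the categories the filter can actually enforce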

Bottom line: The open internet sounds wonderful, but there is content that simply should be blocked by every organisation for its own protection, and there are applications that should be blocked in order to protect intellectual property. There is also content that must be blocked to help comply with government regulations like CIPA for schools and libraries, HIPAA for health-care organisations, and SEC rules regarding financial data for public companies. Yes, we want to block employees from watching porn and bidding on Beanie Babies while in the office. That's only the tip of the iceberg, however. Web filtering: you need it.