Posted by Lori Ayre on June 29, 2012

I haven't talked much about filters lately as it seems that they have just quietly found their way into some libraries or have been decidedly eliminated from consideration in others (San Jose Public being one of the more public examples). But every once in a while something comes through the newsfeed that really points out the importance of paying attention to what is happening with filters in some libraries and, in this case, some schools.

American Libraries magazine recently reported on a suit filed by the ACLU against a Missouri school district that was using a filter which consistently 'allowed' anti-LGBT Internet content but 'blocked' anything positive about LGBT people. The school district's objective was to block all "sexually explicit" content. I'm not sure whether the school district (or some people in that district) actually hoped to block pro-LGBT content or whether it just happened and they weren't paying attention. Either way, the case points to two issues with filtering that I'd like to discuss.

Filter Issue #1

There's always a person or persons behind the filter. And those people have opinions and points of view. Filters work by categorizing content, so whether someone wrote code to analyze that content and "automagically" dump it into a category, or a human is sitting there looking at websites and categorizing them by hand...it is a subjective decision. This is why filtering needs to be monitored.

Library values are different from business values, school values, and parent values. All of these groups use filters. If libraries use filters designed for parents, it is quite likely that they will be blocking way more content than we want to block, because library values hold that information should be freely available in all but very limited circumstances (e.g. child pornography and obscenity). Parents are focused on their child's safety from threats, and what they deem a threat is likely broader than child pornography and obscenity. Many home filters also include keyword blocking and some kind of monitoring of the kids' online activities. Businesses have different objectives than parents. They use filters to keep people from getting distracted at work (so they severely restrict what can be accessed online), and they can also use them for monitoring.

In the case of Missouri, the school district was using a filter, URLBlacklist, that was simply a URL blocker one guy populated with sites he, personally, identified as "sexually explicit." Since the district wanted to block sexually explicit content...bingo, they figured this was the one for them. They didn't bother to find out how the sites were being categorized; they just bought the category. This brings me to my second point: how people interpret "sexually explicit."
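
Before getting to that second point, it's worth spelling out just how simple a blocklist-style filter is under the hood. Here's a minimal sketch, my own illustration rather than URLBlacklist's actual code: the category file layout and names are assumptions. Mechanically, the filter just looks each requested domain up in a list that somebody compiled by hand, so whatever judgments went into that list are the judgments your library inherits.

```python
# Minimal sketch of a blocklist-style filter lookup (illustrative only).
# Assumption: a category is a plain-text file of domains, one per line,
# e.g. blacklists/sexually_explicit/domains -- a hypothetical layout,
# not URLBlacklist's actual distribution format.

from pathlib import Path
from urllib.parse import urlparse


def load_category(path: Path) -> set[str]:
    """Load one hand-compiled category: a flat file of domains."""
    return {line.strip().lower() for line in path.read_text().splitlines() if line.strip()}


def is_blocked(url: str, blocked_domains: set[str]) -> bool:
    """Block if the requested host, or any parent domain of it, appears in the list."""
    host = (urlparse(url).hostname or "").lower()
    parts = host.split(".")
    # Check www.example.org, example.org, org, etc. against the list.
    candidates = {".".join(parts[i:]) for i in range(len(parts))}
    return bool(candidates & blocked_domains)


if __name__ == "__main__":
    # In practice you would load a category file, e.g.:
    # blocked = load_category(Path("blacklists/sexually_explicit/domains"))
    blocked = {"example-blocked-site.org"}  # toy list for demonstration
    print(is_blocked("http://www.example-blocked-site.org/page", blocked))  # True
    print(is_blocked("http://www.example.org/resources", blocked))          # False
```

Notice that nothing in the lookup knows or cares why a domain landed on the list. That decision was made entirely by whoever built the category.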

Filter Issue #2

When it comes to things related to sexuality, anything other than traditionally defined gender roles and heterosexuality is often interpreted as "sexually explicit," or at least sexual. I have experienced this myself out in the world, and it's no different in the backroom of a filter company where workers are busily categorizing content. The classic example is the straight person who refers to her husband frequently in conversation, but if a gay person refers to his or her same-sex "partner," it is perceived as TMI (too much information!). Or if a gay or lesbian couple hold hands or kiss in public (the most recent example being the lesbian couple who were eventually removed from a Southwest flight), they are seen as putting it in people's faces or being "sexual." A kiss on a plane between a man and a woman may be seen as sweet, but between two lesbians it is seen (by some) as offensive, sexual, or maybe even obscene. Anything that bumps against these traditional values may be seen as sexual, whereas the same behavior between heterosexuals never is.

These are two of the reasons that I encourage libraries to:

a) choose a filter designed for businesses rather than homes or schools. These products, at least, have a lot more options for how to configure them;

b) choose the FEWEST possible categories you can when trying to address your objective (e.g. filtering out adult content from children's computers, creating a safe environment for libraries with one staffperson, or complying with CIPA);

c) create an Internet Use Policy that explicitly states what you are blocking; and

d) monitor what you are blocking.

From what I can tell, no one actually monitors what their filters block. I think all libraries should regularly look through their logs and see what sites they have blocked, so they know whether what they want to happen is actually happening. Is the filter consistent with your library's values, and is it implementing your Internet Use Policy? You should really know the answers to those questions.
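
If you want to do that review systematically rather than eyeballing raw logs, a short script can turn a block log into something a human can scan each month. This is only a sketch under assumptions I'm making up for illustration: the log file name and the tab-separated timestamp/URL/category format are hypothetical, so you would adapt the parsing to whatever your own filter or proxy actually writes.

```python
# Sketch of a periodic "what did we block?" review.
# Assumption: a hypothetical block log with tab-separated fields:
#   timestamp <TAB> blocked URL <TAB> category
# Adapt the parsing to your own filter or proxy's log format.

from collections import Counter
from urllib.parse import urlparse

LOG_FILE = "filter_block.log"  # hypothetical path


def summarize_blocks(log_path: str, top_n: int = 25) -> None:
    by_category = Counter()
    by_domain = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            fields = line.rstrip("\n").split("\t")
            if len(fields) < 3:
                continue  # skip malformed lines
            _timestamp, url, category = fields[:3]
            by_category[category] += 1
            by_domain[urlparse(url).hostname or url] += 1

    print("Blocks by category:")
    for category, count in by_category.most_common():
        print(f"  {category}: {count}")

    print(f"\nTop {top_n} blocked domains (review these against your Internet Use Policy):")
    for domain, count in by_domain.most_common(top_n):
        print(f"  {domain}: {count}")


if __name__ == "__main__":
    summarize_blocks(LOG_FILE)
```

Even a crude summary like this makes it much easier to spot a category that is quietly blocking things your policy never intended to block.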

Filters can be handy in certain situations but they are not to be trusted.