Guardian Man

The black screen flashes on in a dark room.  A child moves the mouse towards the button labeled Search.  He clicks it.  There is no turning back now.  This poor, innocent child has opened himself up to a land full of sexual photographs and immoral references.  A vast informational landscape… Surely this young child is incapable of staying away from all of the dangerous places that lurk in this tangled Web.  Of course his parents would watch him if they could, but they’re just so busy with their own lives.  In such a troubling situation, who will come to this poor boy’s rescue?  Dah dah dah dah!  It’s Internet Guardian Man to the rescue!  “Don’t worry, unsupervised adolescent, I will protect you from the images and information on the Web that I deem inappropriate,” exclaims Internet Guardian Man.

Although protective provisions on the Internet such as guardians, safe searches, and filters have positive impacts, there are negative repercussions that run deeper than simply sheltering unsupervised children.  Of course these programs can help prevent the wrong audiences from viewing explicit or unacceptable pictures and other material.  Parents can even take some of the control into their own hands and pick and choose which sites they don’t want their children to view.  Beyond this, however, these programs say something about our culture and the faith we put in the Internet.  Oftentimes these filters block information that is not obscene.  They block sites on which there might be unsuitable material, even when the content being searched for is nothing of that nature.  In doing so, we allow these programs to censor not only explicit material, but also whatever they don’t want us to see, regardless of its actual content.  At the opposite end of the spectrum, these filters sometimes fail to block material that actually is obscene.  In this same negative light, parents put too much faith in these filters.  Instead of directly keeping an eye on what their children are doing online, they leave it up to Google SafeSearch and other filtering systems.  It is important for media-literate people to be aware of the impacts these programs have and not to trust them as the end-all of safe Internet usage.

There are definite benefits to Internet Guardian programs.  Filters help children stay away from inappropriate content.  It’s as simple as that.  Andrew Robb, a member of the Australian Parliament, said, “Parents have been concerned about this [inappropriate content] and I believe that these new safeguards and free internet filters will help local families enjoy the benefits the internet can provide without worrying about what their children may accidentally click onto” (Goldstein Media Releases).  Obviously, this is the main goal of filters and safeguards: they aim to protect children from explicit material.  After all, children might accidentally click on a link that takes them to a site with this type of inappropriate material on it.  There is also the danger of kids misspelling certain site names or visiting sites that sound legitimate but actually turn out to be pornographic.  In these cases, the programs are very effective.  Safeguards can block the obscene material and even prevent children from visiting those innocent-sounding (but in reality not) sites.

Some programs, such as the one Microsoft Windows provides, allow parents to choose exactly what they want to filter.  The Windows web filter includes four different restriction levels that vary in the amount of protection they provide.  The highest restricts Internet use to “children’s sites” that have language appropriate for eight- to twelve-year-olds and content that is understandable, usable by children, and “accessible to younger minds.”  At a different level, parents can filter sites based upon certain content categories.  Some of these categories include pornography, sexual education, hate speech, bomb making, and drugs (“How does the Parental Control web filter Work?”).  With such programs, parents can take the reins on what their children view online.  By making some simple decisions, they choose what is appropriate for their children and what is not.  Conversely, when some of these sites are restricted, other dangers arise.  For instance, some free-speech advocates argue that when filters are used on public machines, such as in libraries, virtually any content can be blocked.  This can deny more mature users their freedoms and prevent them from viewing important information about topics such as birth control (Baran 280-281).  Clearly, in many instances, filters do their job of blocking inappropriate content.  In others, though, they block content that might be useful to slightly more mature audiences.
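To make the mechanics concrete, here is a minimal sketch of how such category-based filtering might work.  Microsoft has not published how its filter is actually implemented, so the site-to-category table and the names used here (SITE_CATEGORIES, is_blocked) are hypothetical, for illustration only; the category names come from the documentation cited above.

    # A toy sketch of category-based web filtering, in the spirit of the
    # Windows Parental Controls filter described above. The mapping of
    # sites to categories is hypothetical; a real product would draw it
    # from a vendor-maintained database.

    BLOCKED_CATEGORIES = {"pornography", "hate speech", "bomb making", "drugs"}

    SITE_CATEGORIES = {
        "example-adult-site.com": {"pornography"},
        "example-health-site.org": {"sexual education"},
        "example-news-site.com": {"news"},
    }

    def is_blocked(hostname: str) -> bool:
        """Block a site if any of its categories is on the parent's blocklist."""
        categories = SITE_CATEGORIES.get(hostname, set())
        return bool(categories & BLOCKED_CATEGORIES)

    print(is_blocked("example-adult-site.com"))   # True: category is blocked
    print(is_blocked("example-health-site.org"))  # False: category not selected
    print(is_blocked("unknown-site.net"))         # False: unlisted sites pass through

Note what the last line implies: a site the database has never seen sails straight through, which is one reason such filters cannot be treated as airtight.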

However, sometimes these programs don’t work the way they are intended to.  Sometimes they filter out material that is not actually adult content.  Research released by Harvard Law School’s Berkman Center for Internet & Society revealed that Google’s SafeSearch excludes many harmless Web pages from search-result listings.  Some of these sites include ones associated with the White House, IBM, and the clothing company Liz Claiborne.  SafeSearch uses a proprietary algorithm that automatically analyzes pages and makes an educated guess, without intervention by Google employees.  Although this way of doing things is cheaper for Google, this is where the problem occurs.  It shows just what can happen when we put this type of faith in computers: we are essentially leaving it up to mathematical calculations to decide what is inappropriate.  Google says that some of the sites (the Harvard study found only 11.3%) were being blocked because of a mechanism called the “robots.txt” file (McCullagh).  Robots.txt files implement the Robots Exclusion Protocol, which allows a website administrator to define what parts of the site are off-limits to specific robot user-agent names.  Administrators can thus disallow access to private and temporary directories that they do not want indexed (“About Robots.txt and Search Indexing Robots”).  Although Google claims to err on the cautious side, certain issues can be swept under the rug here.  We have to wonder whether this content is truly being blocked by accident or whether it is being blocked for some deeper, darker reason.  Many news sites have been filtered out as well.  In fact, papers written by Ben Edelman and other Harvard researchers that describe the errors in SafeSearch and other filtering programs are themselves blocked.  “It might be difficult for an AI (artificial intelligence-based) system to figure out that this is a site about regulating pornography on the Internet instead of actual pornography,” Edelman said (McCullagh).  Clearly, when filtering is left up to algorithms, material that is not obscene in any way is occasionally blocked.  This is contrary to what these filters are actually intended to do.
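For readers unfamiliar with the mechanism, here is a short, self-contained illustration of the Robots Exclusion Protocol, using the robotparser module from Python’s standard library.  The robots.txt rules and the site example.com are hypothetical placeholders; the protocol itself works as the citation above describes.

    from urllib import robotparser

    # Hypothetical robots.txt content: the administrator marks /private/
    # and /tmp/ as off-limits to all crawlers ("User-agent: *").
    RULES = """\
    User-agent: *
    Disallow: /private/
    Disallow: /tmp/
    """

    rp = robotparser.RobotFileParser()
    rp.parse(RULES.splitlines())

    # A well-behaved crawler checks each URL against the rules before fetching.
    print(rp.can_fetch("Googlebot", "http://example.com/private/report.html"))  # False
    print(rp.can_fetch("Googlebot", "http://example.com/index.html"))           # True

In other words, a page hidden behind a Disallow rule never reaches the search index at all, which is why Google can point to robots.txt as one innocent explanation for missing results.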

On the other hand, sometimes safe searches don’t do an adequate job of actually blocking inappropriate material.  Google SafeSearch is made for users who prefer not to have “adult sites” included in their search results.  SafeSearch screens for sites with explicit sexual content and removes them from the search results.  However, it doesn’t always succeed at blocking all of this content: “No filter is 100 percent accurate, but SafeSearch should eliminate most inappropriate material.  We do our best to keep SafeSearch as up-to-date and comprehensive as possible, but inappropriate sites will sometimes slip through the cracks” (“Google Web Search Help”).  Oftentimes when I searched for well-intentioned terms for research projects in high school, pornographic material would still pop up.  This was even with Google’s SafeSearch moderate filtering turned on and other security programs in use.  To me, it wasn’t a big deal.  But if a child had made the same search, their reaction might have been much different.  Although these safeguards try to do the best filtering they can, it is evident that they are not always completely successful.
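To see why a purely automatic filter can fail in both directions, consider a deliberately naive keyword-based classifier.  This is a toy assumption for the sake of argument, not a description of SafeSearch; Google has never published how its system actually classifies pages.

    # A deliberately naive keyword filter, illustrating both failure modes
    # discussed above. Everything here is hypothetical.

    ADULT_KEYWORDS = {"porn", "xxx", "explicit"}

    def looks_adult(page_text: str) -> bool:
        """Flag a page if any word exactly matches a blocklisted keyword."""
        words = page_text.lower().split()
        return any(keyword in words for keyword in ADULT_KEYWORDS)

    # False positive: a research paper *about* pornography filtering gets
    # flagged, much like the Edelman papers described above.
    print(looks_adult("a study of porn filtering errors in search engines"))  # True

    # False negative: an obfuscated spelling slips through the cracks.
    print(looks_adult("p0rn gallery free pics"))  # False

Real classifiers are far more sophisticated than this, but the underlying tension is the same: tighten the rules and harmless pages get caught; loosen them and obscene pages slip through.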

By the same token, this shows that parents should not place their full trust in these programs.  It is equally important for them to keep an eye on what their children are doing online.  Although most parents say they take some action to protect their children online, others admit that they have no idea how to go about doing so.  A poll conducted by Cable in the Classroom found that 55% of parents have installed Web content filters or blocking software on their computers.  However, a 2006 study by the research firm Harris Interactive discovered that one-third of all parents said they didn’t feel confident teaching their kids how to use the Internet in a safe and responsible way.  Even with this statistic, 88% of parents say they have talked to their children about being safe online (Olsen).  Obviously, an important problem arises from these statistics.  Even though most parents have a conversation with their children about Internet safety, one out of every three doesn’t know what type of information to discuss during this dialogue.  It is hard for parents to teach their children to be media literate when they themselves aren’t sure what exactly that means.

It is evident that it takes more than general talk and filtering programs to teach children Internet safety.  The Harris Interactive poll also found that 82 percent of parents monitor their kids’ online activity, 75 percent limit Internet use to a family room or open space, and 74 percent have set time limitations (Olsen).  Maybe some of these parents realize that it takes more than just a filter to keep their children safe online.  The aforementioned survey found that 90% of the 374 polled parents with kids ages eight to eighteen said that parents themselves should bear most of the responsibility when it comes to protecting children online.  The majority of these parents also believed that schools should help in this effort by educating and shielding students.  Despite this, 60% of teachers polled said that this type of information, along with media literacy skills, isn’t taught enough within the school setting.  Plus, the schools aren’t even educating their teachers: seventy-eight percent of teachers said they had to learn these media literacy skills on their own in order to educate their students effectively (Olsen).  With this, we have to wonder just who is going to step in and teach not only children, but also parents and teachers, how to keep the youth safe while they are on the Internet.

In conclusion, there are many issues surrounding Internet Guardians such as Google’s SafeSearch and other filtering and blocking systems.  While these programs serve to prevent children from viewing inappropriate material on the Internet, they don’t always succeed.  Sometimes they block content that isn’t actually explicit; at other times, they fail to block content that is.  We have to wonder just what kind of control and power we are handing over to these programs, and just what content they are deeming unworthy of viewing.

Works Cited

“About Robots.txt and Search Indexing Robots.” SearchTools. 19 September 2008. 1 November 2009 <http://www.searchtools.com/robots/robots-txt.html>.

Baran, Stanley J. “The Internet and the World Wide Web.” Introduction to Mass Communication: Media Literacy and Culture. New York: McGraw Hill. 2008. 280-281.

Goldstein Media Releases. Andrew Robb MP – Federal Member for Goldstein. 31 August 2007. 2 November 2009 <http://www.andrewrobb.com.au/Media/GoldsteinMediaReleases/tabid/72/articleType/ArticleView/articleId/724/Internet-safeguards-for-Bayside-families.aspx>.

“Google Web Search Help.” Google. 2009. 3 November 2009 <http://www.google.com/support/websearch/bin/answer.py?hl=en&answer=510>.

“How does the Parental Control web filter Work?” Microsoft Windows. 2009. 2 November 2009 <http://windows.microsoft.com/en-us/windows-vista/How-does-the-Parental-Controls-web-filter-work>.

McCullagh, Declan. CNET News. 10 April 2003. 2 November 2009 <http://news.cnet.com/2100-1032-996417.html>.

Olsen, Stephanie. CNET News. 10 August 2006. 2 November 2009 <http://news.cnet.com/Parents-shaky-about-kids-safety-online/2009-1025_3-6104028.html>.
