READERS

30 Apr 2015

David Cameron really, really has not the faintest idea what he is talking about when it comes to the internet.

Earlier this week, I asked a simple question of his latest pronouncements on online porn, child abuse, internet filtering and related stuff: is he an internet ignoramus – or a master manipulator?

Today, I think the answer is clear: like far too many of our legislators, his grasp of matters internet is fleeting at best, leading him over and over to an excess of soundbite over substance.

Let’s start with two simple propositions, both of which his press office readily agreed to:

1) David Cameron is in favour of companies such as Google adopting filters to filter out online porn, and will bring in regulation to ensure this happens if they fail to act voluntarily.

And 2) he is definitely not in favour of regulators intervening to block or ban page 3.

As he explained on Woman’s Hour very recently: “This is an area where we should leave it to consumers to decide, rather than to regulators … As politicians we have to decide where is the right place for regulation, where is the right place for legislation, where is the right place for consumers to decide.”

Unfortunately for Mr Cameron, there is a slight logical impediment to his position.

If Cameron had actually done what most people blithely advocating filters seem not to, and looked at what filters filter, he’d have realised that Google SafeSearch already filters page 3.  So his position is: we leave it to consumers to decide about page 3, we do not regulate, but if search engine operators do NOT introduce filters (which happen to block access to page 3 anyway), we do regulate.

Is that clear?

The real issue here is the blind faith placed by politicians in a solution with a sexy-sounding name. It’s a “porn filter”.  So presumably it blocks porn.

What they haven’t done — what most of the filter peddlers in this country haven’t done — is ask any really searching questions about how these filters work or what they actually block.

F’rinstance.  Doing research for another article on BDSM, while visiting my local Costa, I accessed the internet via wi-fi kindly provided by O2.  This, in case you were not already aware, automatically switches Google into SafeSearch mode, resulting in the following polite but unhelpful message: “The word ‘bdsm’ has been filtered from the search because Google SafeSearch is active”.

What about ‘bondage’? Ditto!  Which must be a bit hard on any English students researching Of Human Bondage. Or Lolita? Yep: Vladimir Nabokov’s most (in)famous novel erased completely from online existence.

‘Fetish’ also returns a blank — though paraphilia gets through. As does “spit roast” (which will be a relief both to chefs and those with certain sexual proclivities).

Go on, I dare you.  It’s a fun game.  Pick a word — any word! — and google it first with, and then without, SafeSearch turned on.

As for The Sun. Try looking under images with the following as your search terms: “site:www.thesun.co.uk page 3”.  Now you see ‘em (semi-naked women, that is): now you don’t.  Magic!
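
If you fancy playing the game in bulk, here is a minimal sketch in Python, assuming nothing beyond Google’s public safe=active and tbm=isch URL parameters: it builds the same query twice, once with SafeSearch forced on and once without, plus the site-restricted image search for The Sun. Which terms actually vanish is entirely Google’s doing, not the script’s.

    # A sketch of the compare-with-and-without-SafeSearch game.
    # Assumes only Google's public URL parameters: safe=active forces
    # SafeSearch on; tbm=isch switches to image results.
    from urllib.parse import urlencode

    BASE = "https://www.google.com/search"

    def google_url(query, safesearch=False, images=False):
        """Build a Google search URL for `query`."""
        params = {"q": query}
        if safesearch:
            params["safe"] = "active"   # force SafeSearch on
        if images:
            params["tbm"] = "isch"      # image search
        return f"{BASE}?{urlencode(params)}"

    # The word game: open each pair in a browser and compare what comes back.
    for word in ["bdsm", "bondage", "lolita", "fetish", "spit roast"]:
        print("unfiltered:", google_url(word))
        print("SafeSearch:", google_url(word, safesearch=True))

    # The Sun's page 3, via an image search restricted to its own site.
    sun = "site:www.thesun.co.uk page 3"
    print(google_url(sun, images=True))                   # now you see 'em
    print(google_url(sun, images=True, safesearch=True))  # now you don't

Nothing clever happens in the script; the interesting part is what Google silently drops between each pair of URLs.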

The basic issue is one I first spotted while looking, for wholly non-academic purposes, for shirtless pics of actor Richard Armitage. Then, too, I was out at a café, with a friend … and initially puzzled as to why I suddenly could not show her some perfectly innocuous images instantly accessible on my PC at home.

Then I realised.  My filters had automatically been turned on.  Non-consensually?  Probably not, in the sense that buried in the small print of the local wi-fi somewhere was likely to be a clause saying I agreed to this.  But what had I “agreed to”?

Some filters work by blocking some search terms.  Some block access to sites based on reports of unlawful activity on those sites (and in this respect, I have no problems at all with the Internet Watch Foundation’s block list). And some just calculate a proportion of skin tone in an image and block it. (Naked green martians would probably escape many filter systems!)
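
To give you an idea of quite how crude that last approach can be, here is an illustrative toy in Python (my own sketch, emphatically not any vendor’s actual code): it counts the pixels whose RGB values fall inside a rough ‘skin’ range and declares the image blockable once they pass an arbitrary threshold. The colour rule and the 30% cut-off are assumptions for illustration, and it needs the Pillow library installed.

    # A toy "skin tone proportion" filter: block an image if too many of its
    # pixels look vaguely skin-coloured. Illustrative only; the RGB rule and
    # the 30% threshold are arbitrary, and a green martian passes untouched.
    from PIL import Image

    def looks_like_skin(r, g, b):
        """A rough RGB heuristic for skin-coloured pixels."""
        return (r > 95 and g > 40 and b > 20 and
                max(r, g, b) - min(r, g, b) > 15 and
                abs(r - g) > 15 and r > g and r > b)

    def naive_porn_filter(path, threshold=0.30):
        """Return True ('block it') if more than `threshold` of the pixels look like skin."""
        img = Image.open(path).convert("RGB")
        pixels = list(img.getdata())
        skin = sum(1 for (r, g, b) in pixels if looks_like_skin(r, g, b))
        return skin / len(pixels) > threshold

    # print(naive_porn_filter("holiday_snap.jpg"))   # a hypothetical beach photo
    # print(naive_porn_filter("green_martian.png"))  # sails straight through

Point it at a beach album and a landscape and you will see both how cheap it is to build and how little it has to do with ‘porn’.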

Then they wrap it all up in one neat package behind a button labelled ‘porn filter’ and job done.  In the sense of done haphazardly, inaccurately and superficially. But those have never been reasons for politicians not to adopt a particular solution before, so why change the habits of a lifetime?

Besides, the public seem to fall for it. If ‘porn filter’ is what it says on the label, then ‘filtering porn’ is what it must do. And if later we discover it’s blocking Auntie Agnes’ holiday snaps too, that’s “a price worth paying”.

The problem, as so often, is that accurate filtering takes time, money and resource.  Much, much easier to out-source the issue to geeks who promise to rid the virtual world of bad stuff and not worry too much about the solution. The Pied Piper of Hamelin for the internet age.

The real issue is twofold.  First, by subscribing to filters developed ‘over there’ — mostly in the United States — we are subscribing to cultural values that are subtly different from our own. As the ongoing debate with Facebook reveals, these values are, on the whole, harsher on things like nudity and breastfeeding, and far softer on representations of violence against women, the latter frequently described as “just a joke” and “free speech”.

Second, the politicians and ISPs and maybe the search engines too are buying a solution in a box without actually bothering to look inside the box.  I’ve asked senior officials from many of these companies just what their porn filter filters.  The answer?  They don’t know. Not in detail, anyway.

They’ll happily talk generalities: but ask whether this or that site or image is likely to be blocked and they simply don’t know.

And in the end, if politicians are making decisions that are likely to have a significant impact on our lives, I’d much rather they actually knew what those decisions implied.  That our venerable PM, when he says he does not intend to regulate page 3, should be aware that his threatened regulation of search engines will actually do just that — and not simply subside into a froth of “principle” when called on the fact that he really doesn’t know much at all about the internet.



Note:  A spokesperson for the Prime Minister has responded:  “The Prime Minister has not suggested companies such as google should adopt filters to filter out online pornography and so your first point is wrong.” [See above: this statement was put to the No. 10 Press Office who initially agreed this was the case.]

They continue: “For search companies, the Prime Minister has suggested they should be doing more to stop illegal images from being returned in searches and for search terms to be blocked where the person at the keyboard is clearly looking for revolting child abuse images.

“The reference to possible legislation is about search companies doing more to block illegal content or malevolent search terms, not about filtering online pornography more generally.


“The other aspect here is about filters for children and that is a separate piece of work being taken forward by internet providers. We’re keen to ensure parents have the choice to switch on a filter to ensure their children do not have easy access to pornography”.

Thanks to: JANE FAE

WEB: www.sinfulandwicked.co.uk 

MOB: 07426 490 214 

TWITTER: @sinfulandwicked
