Posts Tagged ‘Internet Censorship’

Read more news.htm at MelonFarmers.co.uk

A man who suffered a miscarriage of justice after being convicted for a joke has been refused permission to appeal against a conviction for supposedly causing gross offence.

Mark Meechan, who blogs under the name Count Dankula, was fined £800 in April after being found guilty under the Communications Act over a video joke in which he trained his girlfriend’s dog to perform Nazi salutes.

A letter from the court claimed the appeal was not arguable and was, in each of its elements, wholly misconceived. It also dismissed arguments made by Meechan’s lawyers over the judge’s handling of witness evidence at Airdrie Sheriff Court in March and the meaning of grossly offensive. The letter said:

The appeal against conviction is without merit. Likewise the appeal against sentence is not arguable — this was a deeply unpleasant offence in which disgraceful and utterly offensive material was very widely distributed by the appellant. This was to the considerable distress of the community in question and — just as disturbingly — to the apparent approval of a large number of persons who appear to share the appellant’s racist views.

Indeed it must be observed that in the circumstance the appellant was fortunate that the learned sheriff was not considering custody as an option.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

Elspeth Howe, a member of the House of Lords, has written an article in the Telegraph outlining her case that the remit for the BBFC to censor internet porn sites should be widened to include a wider range of material that she does not like. This seems to tally with other recent news that the CPS is reconsidering its views on what pornographic content should be banned from publication in Britain.

Surely these debates are related to the detailed guidelines to be used by the BBFC when either banning porn sites, or else requiring them to implement strict age verification for users. This probably explains why the Telegraph recently reported that the publication of the final guidelines has been delayed until at least the autumn.


Categories of Porn

For clarity the categories of porn being discussed are as follows:

Category                          Current offline   Current online   Proposed offline   Proposed online
Softcore 18                       BBFC uncut        BBFC uncut       BBFC uncut         BBFC uncut
Hardcore R18                      BBFC uncut        BBFC uncut       BBFC uncut         BBFC uncut
Beyond R18 (proposal by CPS)      banned            BBFC uncut       BBFC uncut         BBFC uncut
Cartoon Porn (proposal by Howe)   banned            BBFC uncut       banned             banned
Extreme porn                      banned            banned           banned             banned
Child porn                        banned            banned           banned             banned
  • Softcore porn rated 18 under BBFC guidelines

    – Will be allowed subject to strict age verification

  • Vanilla hardcore porn rated R18 under current BBFC guidelines

    – Will be allowed subject to strict age verification

  • Beyond R18 hardcore porn that includes material historically banned by the CPS as obscene, i.e. fisting, golden showers, BDSM, female ejaculation, and, famously from a recent anti-censorship campaign, face sitting/breath play. Such material is currently cut from R18s.

    – Such content will be allowed under the current Digital Economy Act for online porn sites
    – This category is currently banned for offline sales in the UK, but the CPS has just opened a public consultation on its proposal to legalise such content, as long as it is consensual. Presumably this is related to the government’s overarching policy: What’s illegal offline, is illegal online.

  • Extreme Porn as banned from possession in the UK under the Dangerous Pictures Act. This content covers bestiality, necrophilia, realistic violence likely to result in serious injury, and realistic rape.

    – This content is illegal to possess in the UK and any websites with such content will be banned by the BBFC regardless of age verification implementation

  • Cartoon Porn depicting under 18s

    – This content is banned from possession in the UK but will be allowed online subject to age verification requirements

  • Photographic child porn

    – This is already totally illegal in the UK on all media. Any foreign websites featuring such content are probably already being blocked by ISPs using lists maintained by the IWF. The BBFC will ban anything it spots that may have slipped through the net.


‘What’s illegal offline, is illegal online’

Elspeth Howe writes:

I very much welcome part three of the Digital Economy Act 2017 which requires robust age verification checks to protect children from accessing pornography. The Government deserves congratulations for bringing forward this seminal provision, due to come into effect later this year.

The Government’s achievement, however, has been sadly undermined by amendments that it introduced in the House of Lords, about which there has been precious little public debate. I very much hope that polling that I am placing in the public domain today will facilitate a rethink.

When the Digital Economy Bill was introduced in the Lords, it proposed that legal pornography should be placed behind robust age verification checks. Not surprisingly, no accommodation for either adults or children was made for illegal pornography, which encompasses violent pornography and child sex abuse images.

As the Bill passed through the Lords, however, pressure was put on the Government to allow adults to access violent pornography, after going through age-verification checks, which in other contexts it would be illegal to supply. In the end the Government bowed to this pressure and introduced amendments so that only one category of illegal pornography will not be accessible by adults.

[When Howe mentions violent pornography she is talking about the Beyond R18 category, not the Extreme Porn category; Extreme Porn is the one category that will remain inaccessible to adults.]

The trouble with the idea of banning Beyond R18 pornography is that Britain is out of step with the rest of the world. This category includes content that is ubiquitous on most of the major porn websites in the world. Banning so much content would simply be impractical. So rather than banning all foreign porn, the government opted to remove the prohibition of Beyond R18 porn from the original bill.

Another category that has not hitherto attracted much attention is cartoon porn that depicts under 18s. The original law that bans possession of this content seemed most concerned about material that was near photographic, and indeed may have been processed from real photos. However the law is of most relevance in practical terms when it covers comedic Simpsons-style porn, or else Japanese anime often featuring youthful, but vaguely drawn, cartoon characters in sexual scenes.

Again there would be practical problems in banning foreign websites from carrying such content. All the major tube sites seem to have a section devoted to Hentai anime porn which edges into the category.

In July 2017, Howe introduced a bill that would put Beyond R18 and Cartoon Porn back into the list of prohibited material in the Digital Economy Act. The bill is titled the Digital Economy Act 2017 (Amendment) (Definition of Extreme Pornography) Bill and is still open, but further consideration in Parliament has stalled, presumably as the Government itself is currently addressing these issues.

The bill adds to the list of prohibitions any content that has been refused a BBFC certificate or would be refused a certificate if it were to be submitted. This would catch both the Beyond R18 and Cartoon Porn categories.

The government is very keen on its policy mantra: What’s illegal offline, is illegal online, and it seems to have addressed the issue of Beyond R18 material being illegal offline but legal online. The government is proposing to relax its own obscenity rules so that Beyond R18 material will be legalised (with the proviso that the porn is consensual). The CPS has published a public consultation with this proposal, and it should be ready for implementation after the consultation closes on 17th October 2018.

Interestingly Howe seems to have dropped the call to ban Beyond R18 material in her latest piece, so presumably she has accepted that Beyond R18 material will soon be classifiable by the BBFC, and so is no longer an issue for her bill.

Still to be Addressed

That still leaves the category of Cartoon Porn to be addressed. The current Digital Economy Act renders it illegal offline, but legal online. Perhaps the Government has given Howe the nod to rationalise the situation by banning the likes of Hentai. Hence Howe is initiating a bit of propaganda to support her bill. She writes:

The polling that I am putting in the public domain specifically addresses the non-photographic child sex abuse images and is particularly interesting because it gauges the views of MPs whose detailed consideration of the Bill came before the controversial Lords amendments were made.

According to the survey, which was conducted by ComRes on behalf of CARE, a massive 71% of MPs, rising to 76% of female MPs, stated that they did not believe it was right for the Digital Economy Act to make non-photographic child sex abuse images available online to adults after age verification checks. Only 5% of MPs disagreed.

There is an opportunity to address this as part of a review in the next 18 months, but things are too serious to wait. The Government should put matters right now by adopting my very short, but very important two-clause Digital Economy Act (Amendment) (Extreme Pornography) Bill which would restore the effect of the Government’s initial prohibition of this material.

I — along with 71 per cent of MPs — urge the Government to take action to ensure that the UK’s internet does not endorse the sexual exploitation of children.

I haven’t heard of this issue being discussed before and I can’t believe that anybody has much of an opinion on the matter. Presumably, therefore, the survey was presented out of the blue with the questions worded in such a way as to get the required response. Not unusual, but surely it shows that someone is making an effort to generate an issue where one didn’t exist before. Perhaps an indication that Howe’s solution is what the authorities have decreed will happen.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

MPs left behind unfinished business when they broke for summer recess, and we aren’t talking about Brexit negotiations. The rollout of mandatory age verification (AV) technology for adult websites is being held up once again while the Government mulls over final details. AV tech will create highly sensitive databases of the public’s porn watching habits, and the Open Rights Group submitted a report warning the proposed privacy protections are woefully inadequate. The Government’s hesitation could be a sign they are receptive to our concerns, but we expect their final guidance will still treat privacy as an afterthought. MPs need to understand what’s at stake before they are asked to approve AV guidelines after summer.

AV tools will be operated by private companies, but if the technology gets hacked and the personal data of millions of British citizens is breached, the Government will be squarely to blame. By issuing weak guidelines, the Government is begging for a Cambridge Analytica-style data scandal. If this technology fails to protect user privacy, everybody loses. Businesses will be damaged (just look at Facebook), the Government will be embarrassed, and the over 20 million UK residents who view porn could have their private sexual preferences exposed. It’s in everybody’s interest to fix this. The draft guidance lacks even the basic privacy protections required for other digital tools like credit card payments and email services. Meanwhile, major data breaches are rocking international headlines on a regular basis. AV tech needs a dose of common sense.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

The Telegraph reveals:

The government is braced for criticism next week over an anticipated delay in its prospective curbs on under 18s’ access to hardcore porn sites.

The current timetable, culminating in the implementation of UK porn censorship by the end of the year, required that the final censorship guidelines be presented to MPs before they go on holiday on Thursday. They would then be ready to approve them when they return to work in the autumn. It sounds like the guidelines won’t be ready for publication by this Thursday.

The BBFC noted that they were due to send the results of the public consultation, along with the BBFC censorship rules, to the government by late May of this year, so presumably the government is still pondering what to do.

‘Best practice’ just like Facebook and Cambridge Analytica

Back in April, when the BBFC published its rather naive draft rules for public consultation, its prose tried to suggest that we can trust age verifiers with our most sensitive porn browsing data because they will voluntarily follow ‘best practice’. But in light of a major industry player, in this case Facebook, allowing Cambridge Analytica to so dramatically abuse our personal data, the hope that these people will follow ‘best practice’ is surely forlorn.

GDPR

And then there was the implementation of GDPR. The BBFC seemed to think that this was all that was needed to keep our data safe. But when it comes down to it, all GDPR seems to have done is train us, like Pavlov’s dogs, to endlessly tick the consent box for all these companies to do what the hell they like with our data.

Ingenious kids

Then there was a nice little piece of research this week that revealed that network-level ISP filtering of porn has next to no impact on preventing young porn seekers from obtaining their kicks. The research seems to suggest that it is not enough to block porn for one lad, because he has 30 mates whose houses he can go round to and surf the web there, or else it only takes a few lads to be able to download porn and it will soon be circulated to the whole community on a memory stick or whatever.

Mass Buy in

I guess the government is finding it tough to find age verification ideas that are both convenient for adult users, whilst remaining robust about preventing access by under 18s. I think the government needs to find a solution that will achieve a mass buy-in by adult users. If adults don’t want to play ball with the age verification process, then the first fallback position is for them to use a VPN. I know from my own use of VPNs that they are very good, and once turned on they tend to get left on all day. I am sure millions of people using VPNs would not go down well with the security services on the trail of more serious crimes than underage porn viewing.

I think the most likely age verification method proposed to date that has a chance of a mass buy-in is the AVSecure system of anonymously buying a porn access card from a local shop, and using a PIN, perhaps typed in once a day. Users are then able to browse without further hassle on all participating websites. But I think it would require a certain pragmatism from government to accept this idea, as it would be so open to over 18s buying a card and then selling the PIN to under 18s, or perhaps sons nicking their dad’s PIN when they see the card lying around (or even installing a keyboard logger to nick it).
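
To make the card-and-PIN idea concrete, here is a minimal sketch of how a participating site might check such an anonymous age card. Everything in it is invented for illustration: the provider URL, the request format and the field names are hypothetical, and AVSecure’s real scheme will certainly differ. What it does show is the attraction of the approach: the site only ever sees a card number and a PIN, never an identity, and a successful check can be cached so the PIN need only be typed once a day.

    # Hypothetical sketch of a PIN-based age pass check, loosely modelled on the
    # AVSecure idea described above. The endpoint and field names are invented.
    import json
    import time
    import urllib.request

    AV_PROVIDER_URL = "https://example-av-provider.test/verify"  # hypothetical
    CACHE_TTL = 24 * 60 * 60  # re-ask for the PIN at most once a day

    _session_cache = {}  # card number -> expiry timestamp

    def is_age_verified(card_number, pin=None):
        """Return True if the anonymous age card checks out.

        The site sends only the card number and PIN; it never learns who
        bought the card, and the provider never learns which pages the
        card holder goes on to view.
        """
        now = time.time()

        # If this card was checked earlier today, don't ask for the PIN again.
        expiry = _session_cache.get(card_number)
        if expiry and expiry > now:
            return True
        if pin is None:
            return False  # no cached session and no PIN offered

        payload = json.dumps({"card": card_number, "pin": pin}).encode()
        request = urllib.request.Request(
            AV_PROVIDER_URL, data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(request) as response:
            result = json.load(response)

        if result.get("valid"):
            _session_cache[card_number] = now + CACHE_TTL
            return True
        return False

The weakness described above is visible in the sketch: nothing ties the card to a person, which is exactly what makes it both private and trivial to pass on to an under 18.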

The government would probably like something more robust where PINs have to be matched to people’s proven ID. But I think porn users would be stupid to hand over their ID to anyone on the internet who can monitor porn use. The risks are enormous: reputational damage, blackmail, fraud etc. And in this nasty PC world, the penalty for the most trivial of moral transgressions is to lose your job or even your career.

A path to failure

The government is also setting out on a path where it can do nothing but fail. The Telegraph piece mentioned above is already lambasting the government for not applying the rules to social media websites such as Twitter, which host a fair bit of porn. The Telegraph comments:

Children will be free to watch explicit X-rated sex videos on social media sites because of a loophole in a new porn crackdown, Britain’s chief censor has admitted.

David Austin, chief executive of the BBFC, has been charged by ministers with enforcing new laws that require people to prove they are over 18 to access porn sites. However, writing for telegraph.co.uk, Mr Austin admitted it would not be a silver bullet as online porn on sites such as Facebook and YouTube would escape the age restrictions. Social media companies will not be required to carry age-verification for pornographic content on their platforms. He said it was a matter for government to review this position.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

Nobody seems to have heard much about the progress of the BBFC consultation on the process to censor internet porn in the UK. The sketchy timetable laid out so far suggests that the result of the consultation should be published prior to the Parliamentary recess scheduled for 26th July. Presumably this would provide MPs with some light reading over their summer hols ready for them to approve as soon as the hols are over.

Maybe this publication will have to be hurried along though, as pesky MPs are messing up Theresa May’s plans for a non-Brexit, and she would like to send them packing a week early before they can cause trouble.

The BBFC published meeting minutes this week that mention the consultation:

The public consultation on the draft Guidance on Age Verification Arrangements and the draft Guidance on Ancillary Service Providers closed on 23 April. The BBFC received 620 responses, 40 from organisations and 580 from individuals. Many of the individual responses were encouraged by a campaign organised by the Open Rights Group.

Our proposed response to the consultation will be circulated to the Board before being sent to DCMS on 21 May.

So assuming that the response was sent to the government on the appointed day, someone has been sitting on the results for quite a long time now.

Meanwhile it’s good to see that people are still thinking about the monstrosity that is coming our way. Ethical porn producer Erika Lust has been speaking to New Internationalist. She comments on the way the new law will compound MindGeek’s monopolistic dominance of the online porn market:

The age verification laws are going to disproportionately affect smaller low-traffic sites and independent sex workers who cannot cover the costs of installing age verification tools.

It will also impact smaller sites by giving MindGeek even more dominance in the adult industry. This is because the BBFC draft guidance does not require sites to offer more than one age verification product. So, all of MindGeek’s sites (again, 90% of the mainstream porn sites) will only offer their own product: AgeID. The BBFC have also stated that users do not have to verify their age on each visit if access is restricted by password or a personal ID number. So users visiting a MindGeek site will only have to verify their age once using AgeID and then will be able to log in to any complying site without having to verify again. Therefore, viewers will be less likely to visit competitor sites not using the AgeID technology, and simultaneously competitor sites will feel pressured to use AgeID to protect themselves from losing viewers.

…Read the full article from newint.org

Read more eu.htm at MelonFarmers.co.uk

Politicians about to vote in favor of mandatory upload filtering in Europe get their channel deleted by YouTube’s upload filtering.

French politicians of the former Front National are furious: their entire YouTube channel was just taken down by automatic filters at YouTube for alleged copyright violations. Perhaps this will cause them to reconsider next week’s vote, which they have announced they will support: the bill that will make exactly this arbitrary, political, and unilateral upload filtering mandatory all across Europe.

The French party Front National, now renamed Rassemblement National (National Rally), which is one of the biggest parties in France, has had its YouTube channel disappeared on grounds of alleged copyright violations. In an interview with France’s Europe 1, their leader Marine Le Pen calls the takedown arbitrary, political, and unilateral.

Europe is about to vote on new copyright law next week, next Wednesday or Thursday. So let’s disregard here for a moment that this happened to a party normally described as far-right, and observe that if it can happen to one of France’s biggest parties regardless of their policies, then it can happen to anyone for political reasons, or any other reason.

The broadcast, named TVLibertés, is gone, with YouTube stating that it has blocked the broadcast of the newscast of Thursday, June 14 for copyright infringement.

Marine Le Pen was quoted as saying, This measure is completely false; we can easily assert a right of quotation [to illustrate why the material was well within the law to broadcast].

She’s right. Automated upload filters do not take into account when you have a legal right to broadcast copyrighted material for one of the myriad of valid reasons. They will just assume that such reasons never exist; if nothing else, to make sure that the hosting platform steers clear of any liability. Political messages will be disappeared on mere allegations by a political opponent, just as might have happened here.
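
A toy illustration of the point follows. The filter sketched below is a deliberately crude stand-in for real fingerprinting systems such as Content ID (it is emphatically not how YouTube actually works); what matters is the decision logic, which only ever asks whether an upload contains chunks matching a registered work. There is simply no input through which a right of quotation, parody or news reporting could be expressed, so the cautious default is to block.

    # Crude stand-in for an automated upload filter: hash fixed-size chunks of an
    # upload and block it if any chunk matches a work registered by a rightsholder.
    # Real systems use perceptual audio/video fingerprints, but the decision logic
    # is the point: a match means a block, with no notion of lawful quotation.
    import hashlib

    CHUNK_SIZE = 64 * 1024

    def fingerprints(data):
        """Hash each chunk of a file; any single hash identifies that chunk."""
        return {
            hashlib.sha256(data[i:i + CHUNK_SIZE]).hexdigest()
            for i in range(0, len(data), CHUNK_SIZE)
        }

    # Index of fingerprints claimed by rightsholders, built from their submissions.
    claimed_index = set()

    def register_claimed_work(data):
        claimed_index.update(fingerprints(data))

    def allow_upload(data):
        """Allow the upload only if it shares no chunk with any claimed work.

        Note what is missing: there is nowhere to say "this is a ten-second
        quotation in a news broadcast", so a lawful use is blocked exactly
        like an infringing one.
        """
        return claimed_index.isdisjoint(fingerprints(data))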

And yet, the Rassemblement National is going to vote in favor of exactly this mandatory upload filtering: the very horror they just described on national TV as arbitrary, political, and unilateral.

It’s hard to illustrate more clearly that Europe’s politicians have absolutely no idea about the monster they’re voting on next week.

The decisions to come will be unilateral, political, and arbitrary. Freedom of speech will be unilateral, political, and arbitrary. Just as Marine Le Pen says. Just as YouTube’s Content ID filtering is today, as has just been illustrated.

The article mandating this unilateral, political, and arbitrary censorship is called Article 13 of the upcoming European Copyright bill, and it must be removed entirely. There is no fixing of automated censorship machines.

Privacy remains your own responsibility. So do your freedoms of speech, information, and expression.

Read more eu.htm at MelonFarmers.co.uk

David Kaye, the UN’s Special Rapporteur on freedom of expression, has now chimed in with a very thorough report, highlighting how Article 13 of the Directive — the part about mandatory copyright filters — would be a disaster for free speech and would violate the UN’s Declaration of Human Rights, and in particular Article 19 which says:

Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers.

As Kaye’s report notes, the upload filters of Article 13 of the Copyright Directive would almost certainly violate this principle.

Article 13 of the proposed Directive appears likely to incentivize content-sharing providers to restrict at the point of upload user-generated content that is perfectly legitimate and lawful. Although the latest proposed versions of Article 13 do not explicitly refer to upload filters and other content recognition technologies, it couches the obligation to prevent the availability of copyright protected works in vague terms, such as demonstrating best efforts and taking effective and proportionate measures. Article 13(5) indicates that the assessment of effectiveness and proportionality will take into account factors such as the volume and type of works and the cost and availability of measures, but these still leave considerable leeway for interpretation.

The significant legal uncertainty such language creates does not only raise concern that it is inconsistent with the Article 19(3) requirement that restrictions on freedom of expression should be provided by law. Such uncertainty would also raise pressure on content sharing providers to err on the side of caution and implement intrusive content recognition technologies that monitor and filter user-generated content at the point of upload. I am concerned that the restriction of user-generated content before its publication subjects users to restrictions on freedom of expression without prior judicial review of the legality, necessity and proportionality of such restrictions. Exacerbating these concerns is the reality that content filtering technologies are not equipped to perform context-sensitive interpretations of the valid scope of limitations and exceptions to copyright, such as fair comment or reporting, teaching, criticism, satire and parody.

Kaye further notes that copyright is not the kind of thing that an algorithm can readily determine, and the fact-specific and context-specific nature of copyright requires much more than just throwing algorithms at the problem — especially when a website may face legal liability for getting it wrong.

The designation of such mechanisms as the main avenue to address users’ complaints effectively delegates content blocking decisions under copyright law to extrajudicial mechanisms, potentially in violation of minimum due process guarantees under international human rights law. The blocking of content — particularly in the context of fair use and other fact-sensitive exceptions to copyright — may raise complex legal questions that require adjudication by an independent and impartial judicial authority. Even in exceptional circumstances where expedited action is required, notice-and-notice regimes and expedited judicial process are available as less invasive means for protecting the aims of copyright law.

In the event that content blocking decisions are deemed invalid and reversed, the complaint and redress mechanism established by private entities effectively assumes the role of providing access to remedies for violations of human rights law. I am concerned that such delegation would violate the State’s obligation to provide access to an effective remedy for violations of rights specified under the Covenant. Given that most of the content sharing providers covered under Article 13 are profit-motivated and act primarily in the interests of their shareholders, they lack the qualities of independence and impartiality required to adjudicate and administer remedies for human rights violations. Since they also have no incentive to designate the blocking as being on the basis of the proposed Directive or other relevant law, they may opt for the legally safer route of claiming that the upload was a terms of service violation — this outcome may deprive users of even the remedy envisioned under Article 13(7). Finally, I wish to emphasize that unblocking, the most common remedy available for invalid content restrictions, may often fail to address financial and other harms associated with the blocking of time-sensitive content.

He goes on to point out that while large platforms may be able to deal with all of this, smaller ones are going to be in serious trouble:

I am concerned that the proposed Directive will impose undue restrictions on nonprofits and small private intermediaries. The definition of an online content sharing provider under Article 2(5) is based on ambiguous and highly subjective criteria such as the volume of copyright protected works it handles, and it does not provide a clear exemption for nonprofits. Since nonprofits and small content sharing providers may not have the financial resources to establish licensing agreements with media companies and other right holders, they may be subject to onerous and legally ambiguous obligations to monitor and restrict the availability of copyright protected works on their platforms. Although Article 13(5)’s criteria for effective and proportionate measures take into account the size of the provider concerned and the types of services it offers, it is unclear how these factors will be assessed, further compounding the legal uncertainty that nonprofits and small providers face. It would also prevent a diversity of nonprofit and small content-sharing providers from potentially reaching a larger size, and result in strengthening the monopoly of the currently established providers, which could be an impediment to the right to science and culture as framed in Article 15 of the ICESCR.