Read more EU Censorship News at

The Pirate Party in Iceland continues its shakeup of the local political arena. According to the latest polls the party now has a serious shot at taking part in the next Government coalition, with roughly 20% of all votes one week before the parliamentary elections. The Pirate Party movement was founded in 2006 by Rick Falkvinge, and has scored some significant victories over the years, including a continuing presence in the European Parliament.

Iceland’s Pirates have a great track record already, with three members in the national Parliament. However, more may join in the future as the party has added many new supporters in recent months. The Pirates have been leading the polls for most of the year and are currently neck-and-neck with the Social Democratic Alliance to become the largest party in the country.

This puts the Pirates in the unusual position of having to start thinking about possible partners to form a coalition Government, for the first time in their history.

TorrentFreak spoke with Ásta Helgadóttir, Member of Parliament for the Icelandic Pirate Party, who says that the party is ready to bring the change many citizens are longing for. Despite the Pirate name, copyright issues are not central to their plans. That said, they have spoken out against recent web-blocking efforts.

Iceland’s ISPs have been ordered to block access to ‘infringing’ sites such as The Pirate Bay, which the party sees as a step in the wrong direction. The party fears that these censorship efforts will lead to more stringent measures. Helgadóttir said:

These measures are not a solution and only exacerbate the problem. There needs to be a review of copyright law and how creators are compensated for their work.

Helgadóttir has also been speaking about the censorship of internet porn. She commented in an interview:

In 2013 the Pirate Party came along. The freedom of information aspect attracted me–I’m very much against censorship.

One idea being mooted at the time was the blocking of porn sites in Iceland, which set alarm bells ringing for Ásta: “According to Icelandic law, pornography is illegal. It’s a law from the 19th century, and it hasn’t been enforced for fifteen years now. Then the idea of building a pornography shield around Iceland came up. And I thought, ‘No, you can’t do that! It’s censorship!’ And they were like, ‘No, it’s not censorship, we’re thinking about the children!’”

“The Pirate Party is trying to infiltrate the system and change these ‘heritage’ laws, because when you read a law, you have to understand the root of that law – when it was written, what the context was, and the culture. And now we’re in the 21st century, with the internet, which changes everything.”

The parliamentary elections will take place next week, October 29.

Read more UK Parliament Watch at

The UK government has introduced an amendment to the Investigatory Powers Bill currently going through Parliament, to ensure that data retention orders cannot require ISPs to collect and retain third party data. The Home Office had previously said that it didn’t need powers to force ISPs to collect third party data, but until now had refused to provide guarantees in law. Third party data is defined as communications data (sender, receiver, date, time etc) for messages sent within a website, as opposed to messages sent by more direct methods such as email. It is obviously tricky for ISPs to try to decode what is going on within websites, as messaging data formats are generally proprietary and, in the general case, simply not decipherable by ISPs.
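The distinction can be illustrated with a minimal sketch. The message formats and field names below are hypothetical, not any real ISP system: the point is simply that email-style traffic exposes its metadata to the carrying ISP, while an in-site message arrives as an opaque blob.

```python
# Sketch: communications data an ISP can log vs. third party data it cannot.
# All record formats and field names here are hypothetical illustrations.

def isp_view_of_email(envelope):
    """Email-style traffic: sender, receiver and timestamp are visible
    to the carrying ISP as ordinary communications data."""
    return {k: envelope[k] for k in ("sender", "receiver", "timestamp")}

def isp_view_of_website_message(encrypted_payload):
    """In-site messaging (e.g. within a social network) reaches the ISP
    as an opaque, typically encrypted blob in a proprietary format --
    the ISP cannot recover who messaged whom inside the site."""
    return {"bytes_seen": len(encrypted_payload), "content": None}

email = {"sender": "alice@example.org", "receiver": "bob@example.org",
         "timestamp": "2016-10-22T10:00:00Z", "body": "hello"}
print(isp_view_of_email(email))                 # metadata only, no body
print(isp_view_of_website_message(b"\x8f\x02")) # only traffic volume visible
```

This is why the amendment matters: the only practical route to in-site communications data is the one described next, demanding it from the service operator.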

The Government will therefore snoop on messages sent, for example, via Facebook by demanding the communication details from Facebook itself.

Read more Internet News at

Facebook’s VPs Joel Kaplan and Justin Osofsky wrote in a blog post:

In recent weeks, we have gotten continued feedback from our community and partners about our Community Standards and the kinds of images and stories permitted on Facebook. We are grateful for the input, and want to share an update on our approach.

Observing global standards for our community is complex. Whether an image is newsworthy or historically significant is highly subjective. Images of nudity or violence that are acceptable in one part of the world may be offensive — or even illegal — in another. Respecting local norms and upholding global practices often come into conflict. And people often disagree about what standards should be in place to ensure a community that is both safe and open to expression.

In the weeks ahead, we’re going to begin allowing more items that people find newsworthy, significant, or important to the public interest — even if they might otherwise violate our standards. We will work with our community and partners to explore exactly how to do this, both through new tools and approaches to enforcement. Our intent is to allow more images and stories without posing safety risks or showing graphic images to minors and others who do not want to see them.

As always, our goal is to channel our community’s values, and to make sure our policies reflect our community’s interests. We’re looking forward to working closely with experts, publishers, journalists, photographers, law enforcement officials and safety advocates about how to do better when it comes to the kinds of items we allow. And we’re grateful for the counsel of so many people who are helping us try to get this right.

Read more UK Internet Censorship at

The Digital Economy Bill mandates that pornographic websites must verify the age of their customers. Are there any powers to protect user privacy? Yesterday we published a blog post detailing the lack of privacy safeguards for the Age Verification systems mandated in the Digital Economy Bill. Since then, we have been offered two explanations as to why the regulator designate, the BBFC, may think that privacy can be regulated.

The first and most important claim is that Clause 15 may allow the regulation of AV services, in an open-ended and non-specific way:

15 Internet pornography: requirement to prevent access by persons under the age of 18

  1. A person must not make pornographic material available on the internet on a commercial basis to persons in the United Kingdom except in a way that secures that, at any given time, the material is not normally accessible by persons under the age of 18
  2. [snip]
  3. The age-verification regulator (see section 17) must publish guidance about–(a) types of arrangements for making pornographic material available that the regulator will treat as complying with subsection (1);

However, this clause seems to regulate publishers who “make pornographic material available on the internet”, and what is regulated in 15 (3) (a) is the “arrangements for making pornographic material available”. It does not mention age verification systems, which are not really an “arrangement for making pornography available”, except inasmuch as they are used by the publisher to verify age correctly.

AV systems are not “making pornography available”.

The argument, however, runs that the BBFC could, under 15 (3) (a), tell websites what kinds of AV systems, with which privacy standards, they may use.

If the BBFC sought to regulate providers of age verification systems via this means, we could expect it to be subject to legal challenge for exceeding its powers. A court may well find it unfair for the BBFC to start imposing new privacy and security requirements on AV providers or website publishers when those requirements are not spelled out in the Bill and the parties are already subject to separate legal regimes, such as data protection and e-privacy.

This clause does not provide the BBFC with enough power to guarantee a high standard of privacy for end users, as any potential requirements are undefined. The bill should spell out what the standards are, in order to meet an ‘accordance with the law’ test for intrusions on the fundamental right to privacy.

The second fig leaf towards privacy is the draft standard for age verification technologies drafted by the Digital Policy Alliance. This is being edited by the British Standards Institution as PAS 1296. It has been touted as the means by which commercial outlets will produce a workable system.

The government may believe that PAS 1296 could, via Clause 15 (3) (a), be stipulated as a standard that Age Verification providers must abide by in order to supply publishers, thereby giving a higher standard of protection than data protection law alone.

PAS 1296 provides general guidance and has no means of strong enforcement towards companies that adopt it. It is a soft design guide that provides broad principles to adopt when producing these systems.

Contrast this, for instance, with the hard and fast contractual arrangements the government’s Verify system has in place with its providers, alongside firmly specified protocols. Or card payment processors, who must abide by strict terms and conditions set by the card companies, where bad actors rapidly get switched off.

The result is that PAS 1296 says little about security requirements, data protection standards, or anything else we are concerned about. It stipulates that age verification providers cannot be sued for losing your data; rather, you must sue the website owner, i.e. the porn site which contracted with the age verifier.

There are also several terminological gaffes, such as referring to PII (personally identifiable information), which is a US legal concept, rather than the EU and UK’s ‘personal data’. This suggests that PAS 1296 is very much a draft, and in fact appears to have been hastily cobbled together.

However you look at it, the proposed PAS 1296 standard is very generic, lacks meaningful enforcement and is designed to tackle situations where the user has some control and choice, and can provide meaningful consent. This is not the case with this duty for pornographic publishers. Users have no choice but to use age verification to access the content, and the publishers are forced to provide such tools.

Pornography companies meanwhile have every reason to do age verification as cheaply as possible, and possibly to harvest as much user data as they can, to track and profile users, especially where that data may in future, at the flick of a switch, be used for other purposes such as advertising tracking. This combination of poor incentives has plenty of potential for disastrous consequences.

What is needed is a set of clear, spelt-out, legally binding duties for the regulator to provide security, privacy and anonymity protections for end users. To be clear, the AV Regulator, or BBFC, does not need to be the organisation that enforces these standards; there are powers in the Bill for it to delegate the regulator’s responsibilities. But we have a very dangerous situation if these duties do not exist.

Read more UK Internet Censorship at

The Government wants people who view pornography to show that they are over 18, via Age Verification systems. This is aimed at reducing the likelihood of children accessing inappropriate content.

To this end the Digital Economy Bill creates a regulator that will seek to ensure that adult content websites verify the age of users or face monetary penalties; in the case of overseas sites, it can ask payment providers such as VISA to refuse to process UK payments for non-compliant providers.

There are obvious problems with this, which we detail elsewhere.

However, the worst risks are worth going into in some detail, not least from the perspective of the Bill Committee who want the Age Verification system to succeed.

As David Austen of the BBFC, which will likely become the Age Verification Regulator, said:

Privacy is one of the most important things to get right in relation to this regime. As a regulator, we are not interested in identity at all. The only thing that we are interested in is age, and the only thing that a porn website should be interested in is age. The simple question that should be returned to the pornographic website or app is, “Is this person 18 or over?” The answer should be either yes or no. No other personal details are necessary.
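The narrow yes/no exchange Austen describes could, in principle, be as small as the following sketch. The function names and record format are hypothetical illustrations, not any real AV provider’s API: the verifier consults its own records and returns a bare boolean, never the identity attributes it checked.

```python
# Sketch of the minimal attestation Austen describes: the AV provider
# answers "is this person 18 or over?" and nothing else.
# All names and record formats here are hypothetical.
from datetime import date

def is_over_18(date_of_birth, today=None):
    """Return True/False only -- no identity data leaves the verifier."""
    today = today or date.today()
    years = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day))
    return years >= 18

def attestation_for_site(user_record, today=None):
    """What the porn site receives: a bare boolean, not the record."""
    return {"over_18": is_over_18(user_record["dob"], today)}

record = {"name": "A. Person", "dob": date(1995, 6, 1)}
print(attestation_for_site(record, today=date(2016, 10, 22)))
# -> {'over_18': True}; the site never sees the name or date of birth
```

The point of the sketch is that nothing technical forces a provider to be this minimal; as the following paragraphs argue, the Bill imposes no duty to stop a provider returning, or retaining, far more.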

However, the Age Verification Regulator has no duties in relation to the Age Verification systems. It will make sites verify age, or issue penalties, but it is given no duty to protect people’s privacy or security, or to defend against the cyber security risks that may emerge from the Age Verification systems themselves.

David Austen’s expectations are unfortunately entirely out of his hands.

Instead, the government appears to assume that Data Protection law will be adequate to deal with the privacy and security risks. Meanwhile, the market will provide the tools.

The market has a plethora of possible means to solve this problem. Some involve vast data trawls through Facebook and social media. Others plan to link people’s identity across web services and will provide a way to profile people’s porn viewing habits. Still others attempt to piggyback upon payment providers and risk confusing their defences against fraud. Many appear to encourage people to submit sensitive information to services that the users, and the regulator, will have little or no understanding of.

And yet with all the risks that these solutions pose, all of these solutions may be entirely data protection compliant. This is because data protection allows people to share pretty much whatever they agree to share, on the basis that they are free to make agreements with whoever they wish, by providing ‘consent’.

In other words: Data protection law is simply not designed to govern situations where the user is forced to agree to the use of highly intrusive tools against themselves.

What makes this proposal more dangerous is that the incentives for the industry are poor and lead in the wrong direction. They have no desire for large costs, but would benefit vastly from acquiring user data.

If the government wants to have Age Verification in place, it must mandate a system that increases the privacy and safety of end users, since the users will be compelled to use Age Verification tools. Also, any and all Age Verification solutions must not make Britain’s cybersecurity worse overall, e.g. by building databases of the nation’s porn-surfing habits which might later appear on Wikileaks.

The Digital Economy Bill’s impact on the privacy of users should, under human rights law, be properly spelled out (“in accordance with the law”) and be designed to minimise the impacts on people (necessary and proportionate). Thus, failure to provide protections places the entire system under threat of potential legal challenges.

User data in these systems will be especially sensitive, being linked to private sexual preferences and potentially impacting particularly badly on sexual minorities if it goes wrong, through data breaches or simple chilling effects. This data is regarded as particularly sensitive in law.

The Government, in fact, has at its hands a system called Verify which could provide age verification in a privacy-friendly manner. The Government ought to explain why the high standards of its own Verify system are not being applied to Age Verification, or indeed why it is not prepared to use its own systems to minimise the impacts.

As with web filtering, there is no evidence that Age Verification will prevent an even slightly determined teenager from accessing pornography, nor reduce demand for it among young people. The Government appears to be looking for an easy fix to a complex social problem. The Internet has given young people unprecedented access to adult content but it’s education rather than tech solutions that are most likely to address problems arising from this. Serious questions about the efficacy and therefore proportionality of this measure remain.

However, legislating for the Age Verification problem to be “solved” without any specific regulation of the private sector operators who want to “help” is simply to throw the privacy of the UK’s adult population to the mercy of the porn industry. With this in mind, we have drafted an amendment to introduce the duties necessary to minimise the privacy impacts, which could also reduce, if not remove, the free expression harms to adults.

Read more US Sex Sells News at

Several porn websites are alerting their viewers living in California that they could be blocked from the state should Proposition 60 be accepted in a public ballot next month. Proposition 60 is a measure proposed by anti-porn campaigners that would require adult performers to use condoms in all videos made in the state. If they don’t, the law would allow any citizen of the state to sue producers and distributors of prophylactic-lacking porn.

In protest, popular sites Vivid, Evil Angel and Kink, among others, have pop-ups urging visitors with California IP addresses to vote no on the proposition come election day. If it passes, some are considering blocking those users entirely to protect themselves from litigation.
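The kind of check these sites describe could be sketched as follows. The address ranges and function names below are made up for illustration; a real deployment would look visitor addresses up in a commercial geolocation database rather than a hard-coded list.

```python
# Sketch: serving a warning (or an outright block) to visitors whose IP
# address geolocates to California. The CIDR ranges below are invented
# documentation ranges, purely for illustration.
import ipaddress

HYPOTHETICAL_CALIFORNIA_RANGES = [
    ipaddress.ip_network("198.51.100.0/24"),
    ipaddress.ip_network("203.0.113.0/24"),
]

def is_california_ip(addr):
    """True if the address falls in one of the (hypothetical) CA ranges."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in HYPOTHETICAL_CALIFORNIA_RANGES)

def response_for(addr):
    """Pick the page variant to serve for a given visitor address."""
    if is_california_ip(addr):
        return "popup: Vote No on Prop 60"  # or refuse to serve at all
    return "normal site"

print(response_for("198.51.100.7"))  # -> popup: Vote No on Prop 60
print(response_for("192.0.2.1"))     # -> normal site
```

IP geolocation is coarse and easily evaded with a VPN, which is partly why blocking an entire state is a blunt, last-resort response to litigation risk.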

Prop 60 is sponsored by the AIDS Healthcare Foundation and operates much like Los Angeles’ Measure B initiative, passed in 2012, but would apply to the entire state.

The adult industry is opposed to it, as there is considerable customer resistance to condom-protected porn. The existing adult trade policy of continuous testing of performers has kept AIDS infections extraordinarily low in recent years, so the proposed law is only of benefit to anti-porn activists. Performers would in fact be placed in more danger by such a law, as commercial pressures would surely drive sections of the industry underground and outside of the testing regime.

The proposed law also has a nasty requirement for performers to be identified by their real-world names, exposing studios and actors/actresses to harassment by stalkers, trolls and anti-porn activists.

Read more News: Latest Cuts at

Little Mix: Get Weird Tour is a filmed music performance by Paul Caslin, starring Perrie Edwards, Jesy Nelson and Leigh-Anne Pinnock.
UK: Passed PG for mild innuendo after 5:10s of BBFC category cuts for:

  • 2016 Sony Music Entertainment video

The BBFC commented:

  • Company chose to remove a routine featuring moderate sex references and innuendo in song lyrics, accompanied by suggestive dancing, in order to achieve a PG classification. An uncut 12 was available.