Read more uk_internet_censors.htm at MelonFarmers.co.uk

Sky News has learned that the government has delayed setting a date for when age verification rules will come into force due to concerns regarding the security and human rights issues posed by the rules. A DCMS representative said:

This is a world-leading step forward to protect our children from adult content which is currently far too easy to access online.

The government, and the BBFC as the regulator, have taken the time to get this right and we will announce a commencement date shortly.

Previously the government indicated that age verification would start around Easter, but the law requires three months' notice of the start date. As official notice has yet to be published, the earliest it could now begin is June 2019.

The basic issue is that the Digital Economy Act underpinning age verification does not require the identity data and browsing history of porn users to be protected by law. The lawmakers assumed that GDPR would be sufficient for data protection, but in practice GDPR only requires user consent for the use of that data. All it takes is for users to tick the consent box, probably without reading the deliberately verbose or vague terms and conditions provided. Once the box is ticked, the age verifier can do more or less what it wants with the data.

Realising that this voluntary system is hardly ideal, and that the world's largest internet porn company, Mindgeek, is likely to become the monopoly gatekeeper of the scheme, the government has moved on to considering some sort of voluntary kitemark scheme to try and convince porn users that an age verification company can be trusted with their data. The kitemark scheme would appoint an audit company to investigate age verification implementations and approve those that follow good practices.

I would guess that this scheme is difficult to set up, as it would be a major risk for audit companies to approve age verification systems based upon voluntary data protection rules. If an 'approved' company were later found to be selling or misusing data, or even got hacked, the auditor could be sued for negligent advice, whilst the age verification company could get off scot-free.

Read more eu.htm at MelonFarmers.co.uk

Members of the European Parliament are considering a proposal for the censorship of terrorist internet content issued by the European Commission last September. The IMCO Committee ("Internal Market and Consumer Protection") has just published its initial opinion on the proposal.

laquadrature.net reports

Judicial Review

The idea is that the government of any European Member State will be able to order any website to remove content considered "terrorist". No independent judicial authorisation will be needed, allowing governments to abuse the wide definition of "terrorism". The only thing IMCO agreed to add is that government orders should be subject to "judicial review", which can mean anything.

In France, government orders to remove "terrorist content" are already subject to "judicial review": an independent body is notified of all removal orders and may ask judges to assess them. This has not been of much help: only once has such censorship been submitted to a judge's review. It was found to be unlawful, but more than a year and a half after it was ordered. During this time, the French government was able to abusively censor content, in this case far-left publications by two French Indymedia outlets.

Far from simplifying matters, this Regulation will add confusion, as authorities in one member state will be able to order removals in another, without necessarily understanding the context.

Unrealistic removal delays

Regarding the one-hour deadline within which the police can order a hosting service provider to block any content reported as "terrorist", there was no real progress either. It has been replaced by a deadline of at least eight hours, with a small exception for "microenterprises" that have not previously been subject to a removal order (in this case, the "deadline shall be no sooner than the end of the next working day").

This narrow exception will not help the vast majority of Internet actors comply with such a strict deadline. Even though the IMCO Committee has removed any mention of proactive measures that can be imposed on Internet actors, and has stated that "automated content filters" shall not be used by hosting service providers, this very tight deadline and the threat of heavy fines will only push them to adopt the moderation tools developed by the Web's juggernauts (Facebook and Google) and to use the broadest possible definition of terrorism to avoid the risk of penalties. The impossible obligation to provide a point of contact reachable 24/7 has not been modified either. The IMCO opinion has even worsened the financial penalties that can be imposed: they are now "at least" 1% and up to 4% of the hosting service provider's turnover.

Next steps

The next step will be on 11 March, when the CULT Committee (Culture and Education) will adopt its opinion.

The last real opportunity to obtain the rejection of this dangerous text will be on 21 March 2019, in the LIBE Committee (Civil Liberties, Justice and Home Affairs). European citizens must contact their MEPs to demand this rejection. We have provided a dedicated page on our website with an analysis of this Regulation and a tool to directly contact the MEPs in charge.

Starting today, and for the weeks to come, call your MEPs and demand they reject this text.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

Pornhub and its sister websites will soon require ID from users before they can browse their porn. The government most recently suggested that this requirement would start around Easter this year, but that date has already slipped. The government will give three months' notice of the start date, and as this has not yet been announced, the earliest start is currently June.

Pornhub and YouPorn will use the AgeID system, which requires users to identify themselves with an email address and a credit card, passport, driving licence or an age verified mobile phone number.

Metro.co.uk spoke to a spokesperson from AgeID to find out how it will work (and what you’ll actually see when you try to log in). James Clark, AgeID spokesperson, said:

When a user first visits a site protected by AgeID, a landing page will appear with a prompt for the user to verify their age before they can access the site.

First, a user can register an AgeID account using an email address and password. The user verifies their email address and then chooses an age verification option from our list of 3rd party providers, using options such as Mobile SMS, Credit Card, Passport, or Driving Licence.

The second option is to purchase a PortesCard or voucher from a retail outlet. Using this method, a customer does not need to register an email address, and can simply access the site using the Portes app.

Thereafter, users will be able to use this username/password combination to log into all porn sites which use the AgeID system.

It is a one-time verification, with a simple single sign-on for future access. If a user verifies on one AgeID protected site, they will not need to perform this verification again on any other site carrying AgeID.

The PortesCard is available to purchase from selected high street retailers and any of the UK's 29,000 PayPoint outlets as a voucher. Once a card or voucher is purchased, its unique validation code must be activated via the Portes app within 24 hours, or it expires.

If a user changes device or uses a fresh browser, they will need to log in with the credentials they used to register. If using the same browser/device, the user has a choice as to whether they wish to log in every time, for instance if they are on a shared device (the default option), or instead allow AgeID to log them in automatically, perhaps on a mobile phone or other personal device.
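Taking Clark's description at face value, the flow amounts to a fairly conventional single sign-on arrangement: one age check up front, then a reusable signed token that any participating site can verify without seeing the underlying ID documents. The sketch below is purely illustrative; none of the names, secrets or token formats are AgeID's, and it simply shows, in Python, how a one-time verification could in principle be reused across sites.

    # Illustrative sketch only -- NOT AgeID's actual protocol or API.
    # Assumes a central provider that has already age-checked the user once
    # (via SMS, credit card, passport, etc.) and then issues a signed,
    # time-limited token that any participating site can verify.
    import base64
    import hashlib
    import hmac
    import json
    import time

    # In a real deployment each site would verify against the provider's key,
    # not a shared hard-coded secret.
    SECRET = b"demo-shared-secret"

    def issue_age_token(account_id: str, ttl_seconds: int = 3600) -> str:
        """Called once by the provider after the user passes an age check."""
        claims = {"sub": account_id, "age_verified": True,
                  "exp": int(time.time()) + ttl_seconds}
        payload = base64.urlsafe_b64encode(json.dumps(claims).encode())
        sig = base64.urlsafe_b64encode(
            hmac.new(SECRET, payload, hashlib.sha256).digest())
        return (payload + b"." + sig).decode()

    def site_accepts(token: str) -> bool:
        """Run by any participating site: checks the signature and expiry only.
        No ID documents or browsing history are needed to make the decision."""
        try:
            payload, sig = token.encode().split(b".")
        except ValueError:
            return False
        expected = base64.urlsafe_b64encode(
            hmac.new(SECRET, payload, hashlib.sha256).digest())
        if not hmac.compare_digest(sig, expected):
            return False
        claims = json.loads(base64.urlsafe_b64decode(payload))
        return bool(claims.get("age_verified")) and claims["exp"] > time.time()

    # One verification, then single sign-on across sites carrying the same scheme:
    token = issue_age_token("user-1234")
    print(site_accepts(token))  # True until the token expires

Whether the real system handles tokens, logins and device data this cleanly is exactly what the concerns below are about.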

Clark claimed that AgeID's system does not store details of people's ID, nor does it store their browsing history. This sounds a little unconvincing and must be taken on trust. The claim also seems to be contradicted by the earlier point that a user's email address will be verified, so at least that piece of identity information will need to be stored and read.

The Portes App solution seems a little doubtful too. It claims not to log device data and then goes on to explain that the PortesCard needs to be locked to a device, rather suggesting that it will in fact be using device data. It will be interesting to see what app permissions the app will require when installing. Hopefully it won’t ask to read your contact list.

This AgeID statement rather leaves the AVSecure card idea out in the cold. The AVSecure approach of proving your age anonymously at a shop and then obtaining a password for use on porn websites seems to be the most genuinely anonymous idea suggested so far, but it will be pretty useless if it can't be used on the main porn websites.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

The world's biggest internet companies, including Facebook, Google and Twitter, are represented by a trade group called the Internet Association. This organisation has written to UK government ministers to outline how it believes harmful online activity should be regulated. The letter has been sent to the culture, health and home secretaries and will be seen as a pre-emptive move in the coming negotiation over new rules to govern the internet. The government is due to publish a delayed White Paper on online harms in the coming weeks.

The letter outlines six principles:

  • “Be targeted at specific harms, using a risk-based approach
  • “Provide flexibility to adapt to changing technologies, different services and evolving societal expectations
  • “Maintain the intermediary liability protections that enable the internet to deliver significant benefits for consumers, society and the economy
  • “Be technically possible to implement in practice
  • “Provide clarity and certainty for consumers, citizens and internet companies
  • “Recognise the distinction between public and private communication”

Many leading figures in the UK technology sector fear a lack of expertise in government, and hardening public sentiment against the excesses of the internet, will push the Online Harms paper in a more radical direction.

Three of the key areas of debate are the definition of online harm, the lack of liability for third-party content, and the difference between public and private communication.

The companies insist that the government should recognise the distinction between clearly illegal content and content which is harmful but not illegal. If they find the government's definition of harm too broad, their insistence on this distinction may be superseded by another set of problems.

The companies also defend the principle that platforms such as YouTube permit users to post and share information without fear that those platforms will be held liable for third-party content. Another area which will be of particular interest to the Home Office is the insistence that care should be taken to avoid regulation encroaching into the surveillance of private communications.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

We met to discuss BBFC's voluntary age verification privacy scheme, but BBFC did not attend. Open Rights Group met a number of age verification providers to discuss the privacy standards that they will be meeting when the scheme launches, slated for April. Up to 20 million UK adults are expected to sign up to these products.

We invited all the AV providers we know about, and most importantly, the BBFC, at the start of February. BBFC are about to launch a voluntary privacy standard which some of the providers will sign up to. Unfortunately, BBFC have not committed to any public consultation about the scheme, relying instead on a commercial provider to draft the contents with providers, but without wider feedback from privacy experts and people who are concerned about users.

We held the meeting close to the BBFC's offices so that it would be convenient for them to send someone who might be able to discuss this with us. We have been asking for meetings with BBFC about the privacy issues in the new code since October 2018, but received no reply or acknowledgement of our requests until this morning, when BBFC said they would be unable to attend today's roundtable. This is very disappointing.

BBFC’s failure to consult the public about this standard, or even to meet us to discuss our concerns, is alarming. We can understand that BBFC is cautious and does not wish to overstep its relationship with its new masters at DCMS. BBFC may be worried about ORG’s attitude towards the scheme: and we certainly are critical. However, it is not responsible for a regulator to fail to talk to its potential critics.

We are very clear about our objectives. We are doing our best to ensure that the risks to adult users of age verification technologies are minimised. We do not pose a threat to the scheme as a whole: listening to us can only make the pornographic age verification scheme more likely to succeed and, for instance, help avoid catastrophic failures.

Privacy concerns appear to have been recognised by BBFC and DCMS as a result of consultation responses from ORG supporters and others, which resulted in the voluntary privacy standard. These concerns have also been highlighted by Parliament, whose regulatory committee expressed surprise that the Digital Economy Act 2017 had contained no provision to deal with the privacy implications of pornographic age verification.

Today’s meeting was held to discuss:

  • What the scheme is likely to cover; and what it ideally should cover;

  • Whether there is any prospect of making the scheme compulsory;

  • What should be done about non-compliant services;

  • What the governance of the scheme should be in the long term, for instance whether it might be suitable to become an ICO-backed code, or to complement such a code.

As we communicated to BBFC in December 2018, we have considerable worries about the lack of consultation over the standard they are writing, which appears to be truncated in order to meet the artificial deadline of April this year. This is what we explained to BBFC in our email:

  • Security requires as many perspectives to be considered as possible.

  • The best security standards, e.g. PCI DSS, are developed in the open and iterated.

  • The standards will be best if those with most to lose are involved in the design.

    • For PCI DSS, the banks and their customers have more to lose than the processors

    • For Age Verification, site users have more to lose than the processors; however, only the processors seem likely to be involved in setting the standard

We look forward to BBFC agreeing to meet us to discuss the outcome of the roundtable we held about their scheme, and to discuss our concerns about the new voluntary privacy standard. Meanwhile, we will produce a note from the meeting, which we believe was useful. It covered the concerns above, and issues around timing, as well as strategies for getting government to adjust their view of the absence of compulsory standards, which many of the providers want. In this, BBFC are a critical actor. ORG also intends as a result of the meeting to start to produce a note explaining what an effective privacy scheme would cover, in terms of scope, risks to mitigate, governance and enforcement for participants.

Read more inau.htm at MelonFarmers.co.uk

Chinese government censors are reading Australian publishers' books and, in some cases, refusing to allow them to be printed in China if they fail to comply with a long list of restrictions.

Publishing industry figures have confirmed that the censors from the State Administration of Press, Publication, Radio, Film and Television of the People’s Republic of China are vetting books sent by Australian publishers to Chinese printing presses, even though they are written by Australian authors and intended for Australian readers.

  • Any mention of a list of political dissidents, protests or political figures in China is entirely prohibited, according to a list circulated to publishers and obtained by The Age and Sydney Morning Herald.
  • The list of prohibitions mentions key political incidents, including the 1989 Tiananmen Square protests, the pro-democracy protests in 2011 and the 2014 umbrella revolution in Hong Kong. The Tibetan independence movement, Uighur nationalism and Falun Gong are also taboo subjects.
  • Mention of all major Chinese political figures, including Mao Zedong and the current president, Mr Xi, and all current members of the Politburo Standing Committee is ruled out, as is any mention of a list of 118 named dissidents.
  • Most major religions are also on the sensitive list, as well as a long list of Chinese, or former Chinese locations, most relating to current or former border disputes. The printer’s guidance says these things can be published after vetting by censors.
  • Pornography was ruled out entirely, but artistic nudity or sexual acts could be submitted for vetting by censors, a process taking 10 working days.

Printing books, particularly those with colour illustrations, is significantly cheaper in China, so some publishers have little choice but to put them through the government censorship process.

Sandy Grant, of publisher Hardie Grant, said he had scrapped a proposed children's atlas last year because the censors ruled out a map showing the wrong borders (probably to do with Chinese claims about Taiwan or Tibet). European alternatives were considered economically unviable.

A printing industry source who works with Chinese presses confirmed that the rules had, in theory, been in place for a long time, but that all of a sudden they've decided to up the ante. They're checking every book; they're very, very strict at the moment. I don't know how they're reading every book, but they definitely are, the printer said. The change had happened in the past few months.

Read more inau.htm at MelonFarmers.co.uk

About 100 journalists have been threatened with a charge of contempt of court, and could face possible jail terms, over reporting of the Cardinal George Pell trial.

Victoria's director of public prosecutions, Kerri Judd QC, has written to as many as 100 individual publishers, editors, broadcasters, reporters and subeditors at the media giants News Corp Australia, Nine Entertainment, the ABC, Crikey and several smaller publications, accusing them of breaching a nationwide suppression order imposed during the case.

Those who do not have a strong enough explanation could be prosecuted. The gag order was issued by the chief judge of Melbourne's county court on 25 June 2018 in the matter of DPP v George Pell. The prosecution had applied for the suppression order to prevent the risk of prejudice to a second trial of Pell on separate charges.

The Herald Sun published the most dramatic piece: a black front page with the word CENSORED in large white letters. The world is reading a very important story that is relevant to Victorians, the page one editorial said. The Herald Sun is prevented from publishing details of this very significant news. But trust us, it’s a story you deserve to read.

Judd’s letters targeted even oblique references because the gag order banned any information about the case, including that there was a suppression order.