Read more uk_internet_censors.htm at MelonFarmers.co.uk

Following the conclusion of their consultation period, the BBFC have issued new age verification guidance that has been laid before Parliament.

Summary

The new code has some important improvements, notably the introduction of a voluntary scheme for privacy, close to or based on a GDPR Code of Conduct. This is a good idea, but should not be put in place as a voluntary arrangement. Companies may not want the attention of a regulator, or may simply wish to apply lower or different standards, and ignore it. It is unclear why, if the government now recognises that privacy protections like this are needed, the government would also leave the requirements as voluntary.

We are also concerned that the voluntary scheme may not be up and running before the AV requirement is put in place. Given that 25 million UK adults are expected to sign up to these products within a few months of launch, this would be very unhelpful.

Parliament should now:

  • Ask the government why the privacy scheme is to be voluntary, if the risks of relying on general data protection law are now recognised;
  • Ask for assurance from BBFC that the voluntary scheme will cover all of the major operators; and
  • Ask for assurance from BBFC and DCMS that the voluntary privacy scheme will be up and running before obliging operators to put Age Verification measures in place.

The draft code can be found here.

Lack of Enforceability of Guidance

The Digital Economy Act does not allow the BBFC to judge age verification tools by any standard other than whether or not they sufficiently verify age. We asked that the BBFC persuade the DCMS that statutory privacy and security requirements for age verification tools were needed.

The BBFC have clearly acknowledged privacy and security concerns with age verification in their response, and indicate that they have been working with the ICO and DCMS to create a voluntary certification scheme for age verification providers:

“This voluntary certification scheme will mean that age-verification providers may choose to be independently audited by a third party and then certified by the Age-verification Regulator. The third party’s audit will include an assessment of an age-verification solution’s compliance with strict privacy and data security requirements.”

This voluntary approach is the result of the Digital Economy Act containing no requirement for additional, specific privacy regulation.

While the voluntary scheme described above is likely to be of some assistance in promoting better standards among age verification providers, the “strict privacy and data security requirements” it mentions are not a statutory requirement, leaving some consumers at greater risk than others.

Sensitive Personal Data

The data handled by age verification systems is sensitive personal data. Age verification services must directly identify users in order to accurately verify age. Users will be viewing pornographic content, and the data about what specific content a user views is highly personal and sensitive. This has potentially disastrous consequences for individuals and families if the data is lost, leaked, or stolen.

Following a hack affecting Ashley Madison — a dating website for extramarital affairs — a number of the site’s users were driven to suicide as a result of the public exposure of their sexual activities and interests.

For the purposes of GDPR, data handled by age verification systems falls under the criteria for sensitive personal data, as it amounts to “data concerning a natural person’s sex life or sexual orientation”.

Scheduling Concerns

It is of critical importance that any accreditation scheme for age verification providers, or GDPR code of conduct if one is established, is in place and functional before enforcement of the age verification provisions in the Digital Economy Act commences. All of the major providers expected to dominate the age verification market should undergo their audits under the scheme before consumers are expected to use their tools. This is especially true given that MindGeek have indicated they expect 20-25 million UK adults to sign up to their tool within the first few months of operation. A voluntary accreditation scheme that only begins auditing after all these people have already signed up would be unhelpful.

Consumers should be empowered to make informed decisions about the age verification tools that they choose from the very first day of enforcement. No delays are acceptable if users are expected to rely upon the scheme to inform themselves about the safety of their data. If this cannot be achieved prior to the start of expected enforcement of the DE Act’s provisions, then the planned date for enforcement should be moved back to allow for the accreditation to be completed.

Issues with Lack of Consumer Choice

It is of vital importance that consumers, if they must verify their age, are given a choice of age verification providers when visiting a site. This enables users to choose which provider they trust with their highly sensitive age verification data, and prevents one actor from dominating the market and thereby promoting detrimental practices with data. The BBFC also acknowledge the importance of this in their guidance, noting in paragraph 3.8:

“Although not a requirement under section 14(1) the BBFC recommends that online commercial pornography services offer a choice of age-verification methods for the end-user”.

This does not go far enough to acknowledge the potential issues that may arise in a fragmented market where pornographic sites are free to offer only a single tool if they desire.

Without a statutory requirement for sites to offer all appropriate and available tools for age verification and log-in purposes, it is likely that a market will be established in which one or two tools dominate. Smaller sites will then be forced to adopt these dominant tools as well, to avoid friction with consumers who would otherwise be required to sign up to a new provider.

This kind of market for age verification tools will provide little room for a smaller provider with a greater commitment to privacy or security to survive, and will rob users of the ability to choose who they trust with their data.

We have already called for it to be made a statutory requirement that pornographic sites must offer a choice of providers to consumers who must age verify; however, this suggestion has not been taken up.

We note that the BBFC has been working with the ICO and DCMS to produce a voluntary code of conduct. A potential alternative solution would be to ensure that a site is only considered compliant if it offers users a number of tools which have been accredited under the additional privacy and security requirements of the voluntary scheme.

GDPR Codes of Conduct

A GDPR “Code of Conduct” is a mechanism for providing guidelines to organisations that process data in particular ways, and allows them to demonstrate compliance with the requirements of the GDPR.

A code of conduct is voluntary, but compliance is continually monitored by an appropriate body accredited by a supervisory authority. In this case, the “accredited body” would likely be the BBFC, and the “supervisory authority” would be the ICO. The code of conduct allows for certifications, seals and marks which indicate clearly to consumers that a service or product complies with the code.

Codes of conduct are expected to provide more specific guidance on exactly how data may be processed or stored. In the case of age verification data, the code could contain stipulations on the following (see the sketch after this list):

  • Appropriate pseudonymisation of stored data;
  • Data and metadata retention periods;
  • Data minimisation recommendations;
  • Appropriate security measures for data storage;
  • Security breach notification procedures;
  • Re-use of data for other purposes.
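
To make the flavour of such stipulations concrete, here is a minimal sketch, under our own illustrative assumptions rather than anything in the BBFC's draft, of how a provider might combine pseudonymisation, data minimisation and a fixed retention period. The key name, record fields and 90-day period are hypothetical.

```python
import hashlib
import hmac
import os
from datetime import datetime, timedelta, timezone

# Keyed pseudonymisation: the key is held separately from the data store,
# so stored pseudonyms cannot be reversed without it (GDPR Article 4(5)).
PSEUDONYM_KEY = os.environ["AV_PSEUDONYM_KEY"].encode()

# Hypothetical retention stipulation: records self-expire after 90 days.
RETENTION_PERIOD = timedelta(days=90)

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed pseudonym."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()

def minimal_record(identifier: str, over_18: bool) -> dict:
    """Data minimisation: store only the pseudonym, the yes/no result and an
    expiry date -- no name, no document scan, nothing about content viewed."""
    expires = datetime.now(timezone.utc) + RETENTION_PERIOD
    return {
        "pseudonym": pseudonymise(identifier),
        "over_18": over_18,
        "expires": expires.isoformat(),
    }

def purge_expired(records: list[dict]) -> list[dict]:
    """Enforce the retention period by dropping records past their expiry."""
    now = datetime.now(timezone.utc).isoformat()
    return [r for r in records if r["expires"] > now]
```

A real code would also have to specify breach notification procedures and re-use rules, which are procedural rather than technical and so do not appear in the sketch.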

The BBFC’s proposed “voluntary standard” regime appears to be similar to a GDPR code of conduct, though it remains to be seen how specific the stipulations in the BBFC’s standard are. A code of conduct would also involve being entered into the ICO’s public register of UK approved codes of conduct, and the EDPB’s public register of all codes of conduct in the EU.

In addition, GDPR Recital 99 notes that “relevant stakeholders, including data subjects” should be consulted during the drafting period of a code of conduct – a requirement which is not in place for the BBFC’s voluntary scheme.

It is possible that the BBFC have opted to create this voluntary scheme for age verification providers, rather than use a code of conduct, because they felt they may not meet the GDPR requirements to be considered an appropriate body to monitor compliance. Compliance must be monitored by a body that has demonstrated:

  • Their expertise in relation to the subject-matter;
  • They have established procedures to assess the ability of data processors to apply the code of conduct;
  • They have the ability to deal with complaints about infringements; and
  • Their tasks do not amount to a conflict of interest.

Parties Involved in the Code of Conduct Process

As noted by GDPR Recital 99, a consultation should be a public process which involves stakeholders and data subjects, and their responses should be taken into account during the drafting period:

“When drawing up a code of conduct, or when amending or extending such a code, associations and other bodies representing categories of controllers or processors should consult relevant stakeholders, including data subjects where feasible, and have regard to submissions received and views expressed in response to such consultations.”

The code of conduct must be approved by a relevant supervisory authority (in this case the ICO).

An accredited body (BBFC) that establishes a code of conduct and monitors compliance is able to establish its own structures and procedures under GDPR Article 41 to handle complaints regarding infringements of the code, or regarding the way it has been implemented. BBFC would be liable for failures to regulate the code properly under Article 41(4); [1] however, DCMS appear to have accepted the principle that the government would need to protect BBFC from such liabilities. [2]

GDPR Codes of Conduct and Risk Management

Below is a table of risks created by age verification which we identified during the consultation process. For each risk, we have considered whether a GDPR code of conduct may help to mitigate it.

Risk | CoC appropriate? | Details
User identity may be correlated with viewed content. | Partially | This risk can never be entirely mitigated if AV is to go ahead, but a CoC could contain very strict restrictions on what identifying data may be stored after a successful age verification.
Identity may be associated with an IP address, location or device. | No | It would be very difficult for a CoC to mitigate this risk, as the only safe mitigation would be not to collect user identity information.
An age verification provider could track users across all the websites its tool is offered on. | Yes | Strict rules could be put in place about what data an age verification provider may store, and what data it is forbidden from storing.
Users may be incentivised to consent to further processing of their data in exchange for rewards (content, discounts, etc.). | Yes | Age verification tools could be expressly forbidden from offering anything in exchange for user consent.
Leaked data creates major risks for identified individuals and cannot be revoked or adequately compensated for. | Partially | A CoC can never fully mitigate this risk if any data is being collected, but it could contain strict prohibitions on storing certain information and specify retention periods after which data must be destroyed, which may mitigate the impact of a data breach.
Risks to the user of access via shared computers if viewing history is stored alongside age verification data. | Yes | A CoC could specify that any accounts for pornographic websites which may track viewed content must be kept strictly separate from, and not visibly linked to, a user’s age verification account or data that confirms their identity.
Age verification systems are likely to trade off security for convenience (no 2FA, auto-login, etc.). | Yes | A CoC could stipulate that login cookies which “remember” a returning user must only persist for a short time period, and should recommend or enforce two-factor authentication (see the sketch following this table).
The need to re-login to age verification services to access pornography in “private browsing” mode may lead people to avoid using this feature, generating much more data which is then stored. | No | A CoC cannot fix this issue. Private browsing by nature will not store any login cookies or other objects, and will require the user to re-authenticate with age verification providers every time they wish to view adult content.
Users may turn to alternative tools to avoid age verification, which carry their own security risks (especially “free” VPN services or peer-to-peer networks). | No | Many UK adults, although over 18, will be uncomfortable with the need to submit identity documents to verify their age and will seek alternative means to access content. It is unlikely that many of these individuals will be persuaded by an accreditation under a GDPR code.
Age verification login details may be traded and shared among teenagers or younger children, which could lead to bullying or “outing” if such details are linked to viewed content. | Yes | Strict rules could be put in place about what data an age verification provider may store, and what data it is forbidden from storing.
Child abusers could use their access to age verified content as an adult as leverage to create and exploit relationships with children and teenagers seeking access to such content (grooming). | No | This risk will exist as long as age verification provides an effective barrier to under-18s who wish to access such content.
The sensitivity of content dealt with by age verification services means that users who fall victim to phishing scams or fraud have a lower propensity to report it to the relevant authorities. | Partially | A CoC or education campaign may help consumers identify trustworthy services, but it cannot fix the core issue: users are being socialised into treating it as “normal” to input their identity details into websites in exchange for pornography. Phishing scams resulting from age verification will appear and will be common, and the sensitivity of the content involved is a disincentive to reporting them.
The use of credit cards as an age verification mechanism creates an opportunity for fraudulent sites to engage in credit card theft. | No | Phishing and fraud will be common. A code of conduct which lists compliant sites and tools externally on the ICO website may be useful, but a phishing site may simply pretend to be another (compliant) tool, or rely on the fact that users are unlikely to check with the ICO every time they wish to view pornographic content.
The rush to get age verification tools to market means they may take significant shortcuts when it comes to privacy and security. | Yes | A CoC could assist in solving this issue if tools are given time to be assessed for compliance before the age verification regime commences.
A single age verification provider may come to dominate the market, leaving users little choice but to accept whatever terms the provider offers. | Partially | Practically, a CoC could mitigate some of the effects of an age verification tool monopoly if the dominant tool is accredited under the Code. However, this relies on users being empowered to demand compliance with a CoC, and it is possible that users will instead be left with a “take it or leave it” situation where the dominant tool is not CoC accredited.
Allowing pornography “monopolies” such as MindGeek to operate age verification tools is a conflict of interest. | Partially | As the BBFC note in their consultation response, it would not be reasonable to prohibit a pornographic content provider from running an age verification service, as that would prevent any site from running its own tool. However, under a CoC it is possible that a degree of separation could be enforced, requiring age verification tools to adhere to strict rules about the use of data, which could mitigate the effects of a large pornographic content provider attempting to collect as much user data as possible for its own business purposes.
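
As an illustration of the cookie stipulation in the table above, here is a minimal sketch, under our own assumptions rather than any published requirement, of a short-lived “remember me” cookie for an age verification session. The cookie name and the 30-minute lifetime are hypothetical.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical stipulation: an AV login session persists for at most 30 minutes.
SESSION_LIFETIME = timedelta(minutes=30)

def av_session_cookie(session_token: str) -> str:
    """Build a Set-Cookie header value for a short-lived AV login session."""
    expires = (datetime.now(timezone.utc) + SESSION_LIFETIME).strftime(
        "%a, %d %b %Y %H:%M:%S GMT")
    return (
        f"av_session={session_token}; "
        f"Expires={expires}; "
        "Max-Age=1800; "  # 1800 seconds = 30 minutes, matching SESSION_LIFETIME
        "Secure; HttpOnly; SameSite=Strict"
    )
```

Two-factor authentication, the other mitigation mentioned in that row, is a policy choice for the provider rather than something a few lines of code can usefully show.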

[1] “Infringements of the following provisions shall, in accordance with paragraph 2, be subject to administrative fines up to 10 000 000 EUR, or in the case of an undertaking, up to 2 % of the total worldwide annual turnover of the preceding financial year, whichever is higher: the obligations of the monitoring body pursuant to Article 41(4).”

[2] “contingent liability will provide indemnity to the British Board of Film Classification (BBFC) against legal proceedings brought against the BBFC in its role as the age verification regulator for online pornography.”

Read more me_internet.htm at MelonFarmers.co.uk

Speaking at the Web Summit conference in Lisbon, Tim Berners-Lee, inventor of the World Wide Web, has launched a campaign to persuade governments, companies and individuals to sign a Contract for the Web, a set of principles intended to defend a free and open internet.

Contract for the Web: CORE PRINCIPLES

The web was designed to bring people together and make knowledge freely available. Everyone has a role to play to ensure the web serves humanity. By committing to the following principles, governments, companies and citizens around the world can help protect the open web as a public good and a basic right for everyone.

GOVERNMENTS WILL

  • Ensure everyone can connect to the internet so that anyone, no matter who they are or where they live, can participate actively online.
  • Keep all of the internet available, all of the time so that no one is denied their right to full internet access.
  • Respect people’s fundamental right to privacy so everyone can use the internet freely, safely and without fear.

COMPANIES WILL

  • Make the internet affordable and accessible to everyone so that no one is excluded from using and shaping the web.
  • Respect consumers’ privacy and personal data so people are in control of their lives online.
  • Develop technologies that support the best in humanity and challenge the worst so the web really is a public good that puts people first.

CITIZENS WILL

  • Be creators and collaborators on the web so the web has rich and relevant content for everyone.
  • Build strong communities that respect civil discourse and human dignity so that everyone feels safe and welcome online.
  • Fight for the web so the web remains open and a global public resource for people everywhere, now and in the future.

We commit to uphold these principles and to engage in a deliberative process to build a full “Contract for the Web”, which will set out the roles and responsibilities of governments, companies and citizens. The challenges facing the web today are daunting and affect us in all our lives, not just when we are online. But if we work together and each of us takes responsibility for our actions, we can protect a web that truly is for everyone.

See more from fortheweb.webfoundation.org

Read more me_asa.htm at MelonFarmers.co.uk

The advert censors of ASA have published a five-year strategy, with a focus on more censorship of online advertising, including exploring the use of machine learning in regulation. The strategy will be officially launched at an ASA conference in Manchester, entitled The Future of Ad Regulation.

ASA explains the highlights of its strategy:

  • We will prioritise the protection of vulnerable people and appropriately limit children and young people’s exposure to age-restricted ads in sectors like food, gambling and alcohol
  • We will listen in new ways, including research, data-driven intelligence gathering and machine learning – our own or that of others – to find out which other advertising-related issues are the most important to tackle
  • We will develop our thought-leadership in online ad regulation, including on advertising content and targeting issues relating to areas like voice, facial recognition, machine-generated personalised content and biometrics
  • We will explore lighter-touch ways for people to flag concerns
  • We will explore whether our decision-making processes and governance always allow us to act nimbly, in line with people’s expectations of regulating an increasingly online advertising world
  • We will explore new technological solutions, including machine learning, to improve our regulation

Online trends are reflected in the balance of our workload – 88% of the 7,099 ads amended or withdrawn in 2017 following our action were online ads, either in whole or in part. Meanwhile, two-thirds of the 19,000 cases we resolved last year were about online ads.

Our guiding principle is that people should benefit from the same level of protection against irresponsible online ads as they do offline. The ad rules apply just as strongly online as they do to ads in more traditional media.

Our recent rebalancing towards more proactive regulation has had a positive impact, evidenced by steep rises in the number of ads withdrawn or changed (7,099 last year, up 47% on 2016) and the number of pieces of advice and training delivered to businesses (on course to exceed 400,000 this year). This emphasis on proactive regulation — intervening before people need to complain about problematic ads — will continue under the new strategy.

The launch event – The Future of Ad Regulation conference – will take place at Manchester Central Convention Complex on 1 November. Speakers will include Professor Tanya Byron, Reg Bailey, BBC Breakfast’s Tina Daheley, Marketing Week’s Russell Parsons, ASA Chief Executive Guy Parker and ASA Chairman David Currie.

ASA Chief Executive Guy Parker said:

We’re a much more proactive regulator as a result of the work we’ve done in the last five years. In the next five, we want to have even more impact regulating online advertising. Online is already well over half of our regulation, but we’ve more work to do to take further steps towards our ambition of making every UK ad a responsible ad.

Lord Currie, Chairman of the ASA, said:

The new strategy will ensure that protecting consumers remains at the heart of what we do but that our system is also fit for purpose when regulating newer forms of advertising. This also means harnessing new technology to improve our ways of working in identifying problem ads.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

The Government has announced the organisations that will sit on the Executive Board of a new national body to tackle online harms in the UK. The UK Council for Internet Safety (UKCIS) is the successor to the UK Council for Child Internet Safety (UKCCIS), with an expanded scope to improve online safety for everyone in the UK.

The Executive Board brings together expertise from a range of organisations in the tech industry, civil society and public sector.

Margot James, Minister for Digital and the Creative Industries, said:

Only through collaborative action will the UK be the safest place to be online. By bringing together a wealth of expertise from a wide range of fields, UKCIS can be an example to the world on how we can work together to face the challenges of the digital revolution in an effective and responsible way.

UKCIS has been established to allow these organisations to collaborate and coordinate a UK-wide approach to online safety.

It will contribute to the Government’s commitment to make the UK the safest place in the world to be online, and will help to inform the development of the forthcoming Online Harms White Paper.

Priority areas of focus will include online harms experienced by children such as cyberbullying and sexual exploitation; radicalisation and extremism; violence against women and girls; hate crime and hate speech; and forms of discrimination against groups protected under the Equality Act, for example on the basis of disability or race.

Carolyn Bunting, CEO of Internet Matters, said:

We are delighted to sit on the Executive Board of UKCIS, where we are able to represent parents’ needs in keeping their children safe online.

Online safety demands a collaborative approach and by bringing industry together we hope we can bring about real change and help everyone benefit from the opportunities the digital world has to offer.

The UKCIS Executive Board consists of the following organisations:

  • Apple
  • BBC
  • Childnet
  • Children’s Commissioner
  • Commission for Countering Extremism
  • End Violence Against Women Coalition
  • Facebook
  • GCHQ
  • Google
  • ICO
  • Independent Advisory Group on Hate Crime
  • Internet Matters
  • Internet Watch Foundation
  • Internet Service Providers and Mobile Operators (rotating between BT, Sky, TalkTalk, Three, Virgin Media, Vodafone)
  • Microsoft
  • National Police Chiefs’ Council
  • National Crime Agency – CEOP Command
  • Northern Ireland Executive
  • NSPCC
  • Ofcom
  • Parentzone
  • Scottish Government
  • TechUK
  • Twitter
  • UKCIS Evidence Group Chair
  • UKIE
  • Welsh Assembly

The UKCIS Executive Board is jointly chaired by Margot James, Minister for Digital and the Creative Industries (Department for Digital, Culture, Media and Sport); Victoria Atkins, Minister for Crime, Safeguarding and Vulnerability (Home Office); and Nadhim Zahawi, Minister for Children and Families (Department for Education). It also includes representatives from the Devolved Administrations of Scotland, Wales and Northern Ireland. Board membership will be kept under periodic review, to ensure it represents the full range of online harms that the government seeks to tackle.

Read more megames.htm at MelonFarmers.co.uk

Senran Kagura Burst Re:Newal was recently delayed on the PlayStation 4 after Sony demanded that publisher XSEED remove a mode which effectively allows you to fondle its cast of indeterminately aged virtual characters against their will. There’s some speculation that Sony is clamping down on heavily sexualised content, especially after it refused release of bizarro dating game Super Seducer, but many assumed that this would be limited to Western territories. However, comparison screenshots of a new Japanese visual novel released this week in the East suggest it may be a company-wide policy. The pictures, compared against the Nintendo Switch and PC versions, show heavy censorship used to obscure sexual imagery on the PS4 only. The censorship was not present in the PS Vita version, which launched a year ago.

There’s also chatter that the PlayStation maker has requested jiggle physics be removed from the PS4 version of Warriors Orochi 4, as they’re present in the Nintendo Switch release but conspicuously absent from the Sony SKU. This adds evidence to the notion that Sony is shutting down this kind of content.

Meanwhile, nichegamer.com reports that the Japanese developer light recently held a live broadcast in which they confirmed that Sony’s new and aggressive policy against sexual themes, seemingly applied only to Japanese-made games, is preventing them from releasing their latest visual novel, Silverio Trinity, which contains sexual themes. Developer light noted that Sony is getting strict with its approval process, especially regarding sexual themes.

The developer noted they were hoping to release the game for PlayStation 4 soon after New Year’s, as development on the game is complete; however, Sony has been reluctant to approve it. Furthermore, Sony is confusingly requiring Japanese developers to submit their approval requests only in English, making the process even more difficult for developers whose staff only speak or write Japanese. The developer noted that if they were to release the game for Windows PC (via Steam), they could release it next week.

Read more eu.htm at MelonFarmers.co.uk

When the EU voted for mandatory copyright censorship of the internet in September, Italy had a different government; the ensuing Italian elections empowered a new government, which opposes the filters.

Once states totalling 35% of the EU’s population oppose the new Copyright Directive, they can form a “blocking minority” and kill it or cause it to be substantially refactored. With the Italians opposing the Directive because of its draconian new internet rules (rules introduced at the last moment, which have been hugely controversial), the reputed opponents of the Directive have now crossed the 35% threshold, thanks to Germany, Finland, the Netherlands, Slovenia, Belgium and Hungary.
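
To see why Italy’s switch matters, here is a rough back-of-the-envelope check, using approximate 2018 population figures (our own ballpark numbers, not from the article), of whether these states cross the 35% threshold.

```python
# Approximate 2018 populations in millions (Eurostat ballpark figures;
# treat these as illustrative assumptions, not authoritative data).
EU_POPULATION = 512.0

opponents = {
    "Italy": 60.5,
    "Germany": 82.8,
    "Finland": 5.5,
    "Netherlands": 17.2,
    "Slovenia": 2.1,
    "Belgium": 11.4,
    "Hungary": 9.8,
}

share = sum(opponents.values()) / EU_POPULATION
print(f"{share:.1%}")  # roughly 37%, just past the 35% blocking threshold
```

Note that a Council blocking minority also requires at least four member states, a condition these seven comfortably satisfy.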

Unfortunately, the opponents of Article 11 (the “link tax”) and Article 13 (the copyright filters) are not united in their opposition — they have different ideas about what they would like to see done with these provisions. If they pull together, that could be the end of these provisions.

If you’re a European, this form will let you contact your MEP quickly and painlessly and let them know how you feel about the proposals.

That’s where matters stand now: a growing set of countries who think copyright filters and link taxes go too far, but no agreement yet on rejecting or fixing them.

The trilogues are not a process designed to resolve such large rifts when both the EU states and the parliament are so deeply divided.

What happens now depends entirely on how the member states decide to go forward, and how hard they push for real reform of Articles 13 and 11. The balance in that discussion has changed, because Italy changed its position. Italy changed its position because Italians spoke up. If you reach out to your country’s ministry in charge of copyright and tell them that these Articles are a concern to you, they’ll start paying attention too. And we’ll have a chance to stop this terrible directive from becoming terrible law across Europe.

Read more eu.htm at MelonFarmers.co.uk

YouTube has warned its video creators about the likely effect of the EU’s upcoming censorship machines:

YouTube’s growing creative economy is at risk, as the EU Parliament voted on Article 13, copyright legislation that could drastically change the internet that you see today.

Article 13 as written threatens to shut down the ability of millions of people — from creators like you to everyday users — to upload content to platforms like YouTube. And it threatens to block users in the EU from viewing content that is already live on the channels of creators everywhere. This includes YouTube’s incredible video library of educational content, such as language classes, physics tutorials and other how-to’s.

This legislation poses a threat to both your livelihood and your ability to share your voice with the world. And, if implemented as proposed, Article 13 threatens hundreds of thousands of jobs, European creators, businesses, artists and everyone they employ. The proposal could force platforms, like YouTube, to allow only content from a small number of large companies. It would be too risky for platforms to host content from smaller original content creators, because the platforms would now be directly liable for that content. We realize the importance of all rights holders being fairly compensated, which is why we built Content ID and a platform to pay out all types of content owners. But the unintended consequences of Article 13 will put this ecosystem at risk. We are committed to working with the industry to find a better way. This language could be finalized by the end of the year, so it’s important to speak up now.

Please take a moment to learn more about how it could affect your channel, and take action immediately. Tell the world through social media (#SaveYourInternet) and your channel why the creator economy is important and how this legislation will impact you.