Posts Tagged ‘DCMS’

Read more uk_internet_censors.htm at MelonFarmers.co.uk

The following four motions are expected to be debated together in the House of Lords on 11th December 2018:

Online Pornography (Commercial Basis) Regulations 2018

Lord Ashton of Hyde to move that the draft Regulations laid before the House on 10 October be approved. Special attention drawn to the instrument by the Joint Committee on Statutory Instruments, 38th Report, 4th Report from the Secondary Legislation Scrutiny Committee (Sub-Committee B)

Guidance on Age-verification Arrangements

Lord Ashton of Hyde to move that the draft Guidance laid before the House on 25 October be approved. Special attention drawn to the instrument by the Joint Committee on Statutory Instruments, 39th Report, 4th Report from the Secondary Legislation Scrutiny Committee (Sub-Committee B)

Lord Stevenson of Balmacara to move that this House regrets that the draft Online Pornography (Commercial Basis) Regulations 2018 and the draft Guidance on Age-verification Arrangements do not bring into force section 19 of the Digital Economy Act 2017, which would have given the regulator powers to impose a financial penalty on persons who have not complied with their instructions to require that they have in place an age verification system which is fit for purpose and effectively managed so as to ensure that commercial pornographic material online will not normally be accessible by persons under the age of 18.

Guidance on Ancillary Service Providers

Lord Ashton of Hyde to move that the draft Guidance laid before the House on 25 October be approved. Special attention drawn to the instrument by the Joint Committee on Statutory Instruments, 39th Report, 4th Report from the Secondary Legislation Scrutiny Committee (Sub-Committee B)

The DCMS and BBFC age verification scheme has been widely panned because, fundamentally, the law provides no requirement to actually protect people’s identity data, which can be coupled with their sexual preferences and sexuality. The scheme only offers voluntary suggestions that age verification services and websites should protect their users’ privacy. But one only has to look to Google, Facebook and Cambridge Analytica to see how worthless mere advice is. GDPR is often quoted, but that only requires that user consent is obtained. One will simply have to tick the ‘improved user experience’ consent box to watch the porn, and thereafter the companies can do what the fuck they like with the data.

See criticism of the scheme:

Security expert provides a detailed breakdown of the privacy and security failures of the age verification scheme

Parliamentary scrutiny committee condemns BBFC Age Verification Guidelines

Parliamentary scrutiny committee condemns as ‘defective’ a DCMS Statutory Instrument excusing Twitter and Google images from age verification.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

MPs left behind unfinished business when they broke for summer recess, and we aren’t talking about Brexit negotiations. The rollout of mandatory age verification (AV) technology for adult websites is being held up once again while the Government mulls over final details. AV tech will create highly sensitive databases of the public’s porn watching habits, and Open Rights Group submitted a report warning the proposed privacy protections are woefully inadequate. The Government’s hesitation could be a sign they are receptive to our concerns, but we expect their final guidance will still treat privacy as an afterthought. MPs need to understand what’s at stake before they are asked to approve AV guidelines after summer.

AV tools will be operated by private companies, but if the technology gets hacked and the personal data of millions of British citizens is breached, the Government will be squarely to blame. By issuing weak guidelines, the Government is begging for a Cambridge Analytica-style data scandal. If this technology fails to protect user privacy, everybody loses. Businesses will be damaged (just look at Facebook), the Government will be embarrassed, and the over 20 million UK residents who view porn could have their private sexual preferences exposed. It’s in everybody’s interest to fix this. The draft guidance lacks even the basic privacy protections required for other digital tools like credit card payments and email services. Meanwhile, major data breaches are rocking international headlines on a regular basis. AV tech needs a dose of common sense.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

Parliament’s Digital, Culture, Media and Sport (DCMS) Committee has been investigating disinformation and fake news following the Cambridge Analytica data scandal and is claiming that the UK faces a democratic crisis due to the spread of pernicious views and the manipulation of personal data. In its first report it suggests social media companies should face tighter censorship. It also proposes measures to combat election interference.

The report claims that the relentless targeting of hyper-partisan views, which play to the fears and prejudices of people, in order to influence their voting plans is a threat to democracy.

The report was very critical of Facebook, which has been under increased scrutiny following the Cambridge Analytica data scandal.

“Facebook has hampered our efforts to get information about their company throughout this inquiry. It is as if it thinks that the problem will go away if it does not share information about the problem, and reacts only when it is pressed,” the report said. “It provided witnesses who have been unwilling or unable to give full answers to the committee’s questions.”

The committee suggests:

1. Social media sites should be held responsible for harmful content on their services

Social media companies cannot hide behind the claim of being merely a ‘platform’, claiming that they are tech companies and have no role themselves in regulating the content of their sites, the committee said.

They continually change what is and is not seen on their sites, based on algorithms and human intervention.

They reward what is most engaging, because engagement is part of their business model and their growth strategy. They have profited greatly by using this model.

The committee suggested a new category of tech company should be created, which was not necessarily a platform or a publisher but something in between.

This should establish clear legal liability for the tech companies to act against harmful and illegal content on their platforms, the report said.

2. The rules on political campaigns should be made fit for the digital age

The committee said electoral law needed to be updated to reflect changes in campaigning techniques.

It suggested:

  • creating a public register for political advertising so that anybody can see what messages are being distributed online;
  • online political advertisements should have a digital imprint stating who was responsible, as is required with printed leaflets and advertisements;
  • social media sites should be held responsible for interference in elections by malicious actors; and
  • electoral fraud fines should be increased from a maximum of £20,000 to a percentage of organisations’ annual turnover.

3. Technology companies should be taxed to fund education and regulation

Increased regulation of social media sites would result in more work for organisations such as the Electoral Commission and Information Commissioner’s Office (ICO).

The committee suggested a levy on tech companies should fund the expanded responsibilities of the regulators.

The money should also be spent on educational programmes and a public information campaign, to help people identify disinformation and fake news.

4. Social networks should be audited

The committee warned that fake accounts on sites such as Facebook and Twitter not only damage the user experience, but potentially defraud advertisers.

It suggested an independent authority such as the Competition and Markets Authority should audit the social networks.

It also said security mechanisms and algorithms used by social networks should be available for audit by a government regulator, to ensure they are operating responsibly.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

Culture Secretary Matt Hancock has issued the following press release from the Department for Digital, Culture, Media & Sport:

New laws to make social media safer

New laws will be created to make sure that the UK is the safest place in the world to be online, Digital Secretary Matt Hancock has announced.

The move is part of a series of measures included in the government’s response to the Internet Safety Strategy green paper, published today.

The Government has been clear that much more needs to be done to tackle the full range of online harm.

Our consultation revealed users feel powerless to address safety issues online and that technology companies operate without sufficient oversight or transparency. Six in ten people said they had witnessed inappropriate or harmful content online.

The Government is already working with social media companies to protect users and while several of the tech giants have taken important and positive steps, the performance of the industry overall has been mixed.

The UK Government will therefore take the lead, working collaboratively with tech companies, children’s charities and other stakeholders to develop the detail of the new legislation.

Matt Hancock, DCMS Secretary of State said:

Digital technology is overwhelmingly a force for good across the world and we must always champion innovation and change for the better. At the same time I have been clear that we have to address the Wild West elements of the Internet through legislation, in a way that supports innovation. We strongly support technology companies to start up and grow, and we want to work with them to keep our citizens safe.

People increasingly live their lives through online platforms so it’s more important than ever that people are safe and parents can have confidence they can keep their children from harm. The measures we’re taking forward today will help make sure children are protected online and balance the need for safety with the great freedoms the internet brings, just as we have to strike this balance offline.

DCMS and Home Office will jointly work on a White Paper with other government departments, to be published later this year. This will set out legislation to be brought forward that tackles a range of both legal and illegal harms, from cyberbullying to online child sexual exploitation. The Government will continue to collaborate closely with industry on this work, to ensure it builds on progress already made.

Home Secretary Sajid Javid said:

Criminals are using the internet to further their exploitation and abuse of children, while terrorists are abusing these platforms to recruit people and incite atrocities. We need to protect our communities from these heinous crimes and vile propaganda and that is why this Government has been taking the lead on this issue.

But more needs to be done and this is why we will continue to work with the companies and the public to do everything we can to stop the misuse of these platforms. Only by working together can we defeat those who seek to do us harm.

The Government will be considering where legislation will have the strongest impact, for example whether transparency or a code of practice should be underwritten by legislation, but also a range of other options to address both legal and illegal harms.

We will work closely with industry to provide clarity on the roles and responsibilities of companies that operate online in the UK to keep users safe.

The Government will also work with regulators, platforms and advertising companies to ensure that the principles that govern advertising in traditional media — such as preventing companies targeting unsuitable advertisements at children — also apply and are enforced online.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

In a press release the DCMS describes its digital strategy, including a delayed introduction of internet porn censorship. The press release states:

The Strategy also reflects the Government’s ambition to make the internet safer for children by requiring age verification for access to commercial pornographic websites in the UK. In February, the British Board of Film Classification (BBFC) was formally designated as the age verification regulator.

Our priority is to make the internet safer for children and we believe this is best achieved by taking time to get the implementation of the policy right. We will therefore allow time for the BBFC as regulator to undertake a public consultation on its draft guidance which will be launched later this month.

For the public and the industry to prepare for and comply with age verification, the Government will also ensure a period of up to three months after the BBFC guidance has been cleared by Parliament before the law comes into force. It is anticipated age verification will be enforceable by the end of the year.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

A few extracts from the document:

Introduction

  1. A person contravenes Part 3 of the Digital Economy Act 2017 if they make
    pornographic material available on the internet on a commercial basis to
    persons in the United Kingdom without ensuring that the material is not
    normally accessible to persons under the age of 18. Contravention could lead
    to a range of measures being taken by the age-verification regulator in
    relation to that person, including blocking by internet service providers (ISPs).
  2. Part 3 also gives the age-verification regulator powers to act where a person
    makes extreme pornographic material (as defined in section 22 of the Digital
    Economy Act 2017) available on the internet to persons in the United
    Kingdom.

Purpose

This guidance has been written to provide the framework for the operation of
the age-verification regulatory regime in the following areas:

● Regulator’s approach to the exercise of its powers;
● Age-verification arrangements;
● Appeals;
● Payment-services Providers and Ancillary Service Providers;
● Internet Service Provider blocking; and
● Reporting.

Enforcement principles

This guidance balances two overarching principles in the regulator’s application of its powers under sections 19, 21 and 23 – that it should apply its powers in the way which it thinks will be most effective in ensuring compliance on a case-by-case basis and that it should take a proportionate approach.

As set out in this guidance, it is expected that the regulator, in taking a proportionate approach, will first seek to engage with the non-compliant person to encourage them to comply, before considering issuing a notice under section 19, 21 or 23, unless there are reasons as to why the regulator does not think that is appropriate in a given case.

Regulator’s approach to the exercise of its powers

The age-verification consultation Child Safety Online: Age verification for pornography identified that an extremely large number of websites contain pornographic content – circa 5 million sites or parts of sites. All providers of online pornography, who are making available pornographic material to persons in the United Kingdom on a commercial basis, will be required to comply with the age-verification requirement.

In exercising its powers, the regulator should take a proportionate approach. Section 26(1) specifically provides that the regulator may, if it thinks fit, choose to exercise its powers principally in relation to persons who, in the age-verification regulator’s opinion:

  • (a) make pornographic material or extreme pornographic material available on the internet on a commercial basis to a large number of persons, or a large number of persons under the age of 18, in the United Kingdom; or
  • (b) generate a large amount of turnover by doing so.

In taking a proportionate approach, the regulator should have regard to the following:

a. As set out in section 19, before making a determination that a person is contravening section 14(1), the regulator must allow that person an opportunity to make representations about why the determination should not be made. To ensure clarity and discourage evasion, the regulator should specify a prompt timeframe for compliance and, if it considers it appropriate, set out the steps that it considers that the person needs to take to comply.

b. When considering whether to exercise its powers (whether under section 19, 21 or 23), including considering what type of notice to issue, the regulator should consider, in any given case, which intervention will be most effective in encouraging compliance, while balancing this against the need to act in a proportionate manner.

c. Before issuing a notice to require internet service providers to block access to material, the regulator must always first consider whether issuing civil proceedings or giving notice to ancillary service providers and payment-services providers might have a sufficient effect on the non-complying person’s behaviour.

To help ensure transparency, the regulator should publish on its website details of any notices under sections 19, 21 and 23.

Age-verification arrangements

Section 25(1) provides that the regulator must publish guidance about the types of arrangements for making pornographic material available that the regulator will treat as complying with section 14(1). This guidance is subject to a Parliamentary procedure.

A person making pornographic material available on a commercial basis to persons in the United Kingdom must have an effective process in place to verify a user is 18 or over. There are various methods for verifying whether someone is 18 or over (and it is expected that new age-verification technologies will develop over time). As such, the Secretary of State considers that rather than setting out a closed list of age-verification arrangements, the regulator’s guidance should specify the criteria by which it will assess, in any given case, that a person has met this requirement. The regulator’s guidance should also outline good practice in relation to age verification to encourage consumer choice and the use of mechanisms which confirm age, rather than identity.

The regulator is not required to approve individual age-verification solutions. There are various ways to age verify online and the industry is developing at pace. Providers are innovating and providing choice to consumers.

The process of verifying age for adults should be concerned only with the need to establish that the user is aged 18 or above. The privacy of adult users of pornographic sites should be maintained, and users should be safeguarded against the potential for fraud or misuse of personal data. The key focus of many age-verification providers is on privacy and specifically on providing verification, rather than identification, of the individual.
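
This principle of confirming age rather than identity can be illustrated with a small sketch. The following is not the BBFC’s or any provider’s actual mechanism; the function names (issue_age_token, check_age_token) and the shared-secret HMAC scheme are assumptions for illustration only. The idea is that the age-verification provider, having checked age by whatever means it supports, issues a short-lived signed token asserting only ‘over 18’; the website then verifies the signature and learns nothing about who the user is.

# Hypothetical sketch of a "verify age, not identity" token flow (Python).
# Names and the shared-key scheme are illustrative assumptions, not the
# mechanism specified by the DCMS guidance or used by any real AV provider.
import base64
import hashlib
import hmac
import json
import time

SHARED_KEY = b"demo-secret-shared-by-av-provider-and-site"  # illustration only

def issue_age_token(over_18: bool, ttl_seconds: int = 300) -> str:
    """AV provider side: mint a token carrying only an age assertion."""
    claims = {"over_18": over_18, "exp": int(time.time()) + ttl_seconds}
    payload = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()
    sig = hmac.new(SHARED_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig

def check_age_token(token: str) -> bool:
    """Website side: accept the age assertion without seeing any identity."""
    try:
        payload, sig = token.split(".")
    except ValueError:
        return False
    expected = hmac.new(SHARED_KEY, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # token was forged or tampered with
    claims = json.loads(base64.urlsafe_b64decode(payload))
    return bool(claims.get("over_18")) and claims.get("exp", 0) > time.time()

token = issue_age_token(over_18=True)
print(check_age_token(token))  # True: access granted, no identity data stored

A real scheme would use public-key signatures and audited key handling rather than a shared secret, but even this toy version shows that the website needs to hold nothing beyond a pass/fail result.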

Payment-services providers and ancillary service providers

There is no requirement in the Digital Economy Act for payment-services providers or ancillary service providers to take any action on receipt of such a notice. However, Government expects that responsible companies will wish to withdraw services from those who are in breach of UK legislation by making pornographic material accessible online to children or by making extreme pornographic material available.

The regulator should consider on a case-by-case basis the effectiveness of notifying different ancillary service providers (and payment-services providers).

There is a wide range of providers whose services may be used by pornography providers to enable or facilitate making pornography available online and who may therefore fall under the definition of ancillary service provider in section 21(5)(a). Such a service is not limited to cases where a direct financial relationship is in place between the service and the pornography provider. Section 21(5)(b) identifies those who advertise commercially on such sites as ancillary service providers. In addition, others include, but are not limited to:

  • a. Platforms which enable pornographic content or extreme pornographic material to be uploaded;
  • b. Search engines which facilitate access to pornographic content or extreme pornographic material;
  • c. Discussion fora and communities in which users post links;
  • d. ‘Cyberlockers’ and cloud storage services on which pornographic content or extreme pornographic material may be stored;
  • e. Services including websites and App marketplaces that enable users to download Apps;
  • f. Hosting services which enable access to websites, Apps or App marketplaces;
  • g. Domain name registrars; and
  • h. Set-top boxes, mobile applications and other devices that can connect directly to streaming servers.

Internet Service Provider blocking

The regulator should only issue a notice to an internet service provider having had regard to Chapter 2 of this guidance. The regulator should take a proportionate approach and consider all actions (Chapter 2.4) before issuing a notice to internet service providers.

In determining those ISPs that will be subject to notification, the regulator should take into consideration the number and the nature of customers, with a focus on suppliers of home and mobile broadband services. The regulator should consider any ISP that promotes its services on the basis of pornography being accessible without age verification, irrespective of other considerations.

The regulator should take into account the child safety impact that will be achieved by notifying a supplier with a small number of subscribers and ensure a proportionate approach. Additionally, it is not anticipated that ISPs will be expected to block services to business customers, unless a specific need is identified.

Reporting

In order to assist with the ongoing review of the effectiveness of the new regime and the regulator’s functions, the Secretary of State considers that it would be good practice for the regulator to submit to the Secretary of State an annual report on the exercise of its functions and their effectiveness.

Read more gcnews.htm at MelonFarmers.co.uk

Matt Hancock MP was appointed Secretary of State for Digital, Censorship, Media and Sport on 8 January 2018. He was previously Minister of State for Digital from July 2016 to January 2018, with responsibility for broadband, broadcasting, creative industries, cyber and the tech industry. He is the MP for West Suffolk, having been elected in the 2010 general election.

The Secretary of State has overall responsibility for strategy and policy across the Department for Culture, Media and Sport.

The department’s main policy areas are:

  • arts and culture
  • broadcasting
  • creative industries
  • cultural property, heritage and the historic environment
  • gambling and racing
  • libraries
  • media ownership and mergers
  • museums and galleries
  • the National Lottery
  • sport
  • telecommunications and online
  • tourism

Hancock has already been working on the new law to serve up porn viewers on a platter to scammers, fraudsters, blackmailers and identity thieves, so there is unlikely to be a change of direction there.