Archive for the ‘UK Parliament’ Category

Read more uk_internet_censors.htm at MelonFarmers.co.uk

An informal group of MPs, the All Party Parliamentary Group on Social Media and Young People’s Mental Health and Wellbeing, has published a report calling for the establishment of an internet censor. The report claims:

  • 80% of the UK public believe tighter regulation is needed to address the impact of social media on the health and wellbeing of young people.
  • 63% of young people reported social media to be a good source of health information.
  • However, children who spend more than three hours a day using social media are twice as likely to display symptoms of mental ill health.
  • Pressure to conform to beauty standards perpetuated and praised online can encourage harmful behaviours to achieve “results”, including body shame and disordered eating, with 46% of girls, compared to 38% of all young people, reporting that social media has had a negative impact on their self-esteem.

The report, titled #NewFilters to manage the impact of social media on young people’s mental health and wellbeing, puts forward a number of policy recommendations, including:

  • Establish a duty of care on all social media companies with registered UK users aged 24 and under in the form of a statutory code of conduct, with Ofcom to act as regulator.
  • Create a Social Media Health Alliance, funded by a 0.5% levy on the profits of social media companies, to fund research, educational initiatives and establish clearer guidance for the public.
  • Review whether the “addictive” nature of social media is sufficient for official disease classification.
  • Urgently commission robust longitudinal research into the extent to which the impact of social media on young people’s mental health and wellbeing is one of cause or correlation.

Chris Elmore MP, Chair of the APPG on Social Media and Young People’s Mental Health and Wellbeing, said:

“I truly think our report is the wakeup call needed to ensure – finally – that meaningful action is taken to lessen the negative impact social media is having on young people’s mental health.

For far too long social media companies have been allowed to operate in an online Wild West. And it is in this lawless landscape that our children currently work and play online. This cannot continue. As the report makes clear, now is the time for the government to take action.

The recommendations from our Inquiry are both sensible and reasonable; they would make a huge difference to the current mental health crisis among our young people.

I hope to work constructively with the UK Government in the coming weeks and months to ensure we see real changes to tackle the issues highlighted in the report at the earliest opportunity.”


The House of Lords Communications Committee has called for a new, overarching censorship framework so that services in the digital world are held accountable to an enforceable set of government rules. The Lords Communications Committee writes:

Background

In its report ‘Regulating in a digital world’ the committee notes that over a dozen UK regulators have a remit covering the digital world but there is no body which has complete oversight. As a result, regulation of the digital environment is fragmented, with gaps and overlaps. Big tech companies have failed to adequately tackle online harms.

Responses to growing public concern have been piecemeal and inadequate. The Committee recommends a new Digital Authority, guided by 10 principles to inform regulation of the digital world.

Chairman’s Comments

The chairman of the committee, Lord Gilbert of Panteg, said:

“The Government should not just be responding to news headlines but looking ahead so that the services that constitute the digital world can be held accountable to an agreed set of principles.

“Self-regulation by online platforms is clearly failing. The current regulatory framework is out of date. The evidence we heard made a compelling and urgent case for a new approach to regulation. Without intervention, the largest tech companies are likely to gain ever more control of technologies which extract personal data and make decisions affecting people’s lives. Our proposals will ensure that rights are protected online as they are offline while keeping the internet open to innovation and creativity, with a new culture of ethical behaviour embedded in the design of services.”

Recommendations for a new regulatory approach

Digital Authority

A new ‘Digital Authority’ should be established to co-ordinate regulators, continually assess regulation and make recommendations on which additional powers are necessary to fill gaps. The Digital Authority should play a key role in providing the public, the Government and Parliament with the latest information. It should report to a new joint committee of both Houses of Parliament, whose remit would be to consider all matters related to the digital world.

10 principles for regulation

The 10 principles identified in the committee’s report should guide all regulation of the internet. They include accountability, transparency, respect for privacy and freedom of expression. The principles will help the industry, regulators, the Government and users work towards a common goal of making the internet a better, more respectful environment which is beneficial to all. If rights are infringed, those responsible should be held accountable in a fair and transparent way.

Recommendations for specific action

Online harms and a duty of care

  • A duty of care should be imposed on online services which host and curate content which can openly be uploaded and accessed by the public. Given the urgent need to address online harms, Ofcom’s remit should expand to include responsibility for enforcing the duty of care.

  • Online platforms should make community standards clearer through a new classification framework akin to that of the British Board of Film Classification. Major platforms should invest in more effective moderation systems to uphold their community standards.

Ethical technology

  • Users should have greater control over the collection of personal data. Maximum privacy and safety settings should be the default.

  • Data controllers and data processors should be required to publish an annual data transparency statement detailing which forms of behavioural data they generate or purchase from third parties, how they are stored, for how long, and how they are used and transferred.

  • The Government should empower the Information Commissioner’s Office to conduct impact-based audits where risks associated with using algorithms are greatest. Businesses should be required to explain how they use personal data and what their algorithms do.

Market concentration

  • The modern internet is characterised by the concentration of market power in a small number of companies which operate online platforms. Greater use of data portability might help, but this will require more interoperability.

  • The Government should consider creating a public-interest test for data-driven mergers and acquisitions.

  • Regulation should recognise the inherent power of intermediaries.


The Digital, Culture, Media and Sport Committee has published its final report on Disinformation and ‘fake news’. The report calls for:

  • Compulsory Code of Ethics for tech companies overseen by independent regulator

  • Regulator given powers to launch legal action against companies breaching code

  • Government to reform current electoral communications laws and rules on overseas involvement in UK elections

  • Social media companies obliged to take down known sources of harmful content, including proven sources of disinformation

The report further finds that:

  • Electoral law ‘not fit for purpose’

  • Facebook intentionally and knowingly violated both data privacy and anti-competition laws

Chair’s comment

Damian Collins MP, Chair of the DCMS Committee, said:

“Our inquiry over the last year has identified three big threats to our society. The challenge for the year ahead is to start to fix them; we cannot delay any longer.

“Democracy is at risk from the malicious and relentless targeting of citizens with disinformation and personalised ‘dark adverts’ from unidentifiable sources, delivered through the major social media platforms we use every day. Much of this is directed from agencies working in foreign countries, including Russia.

“The big tech companies are failing in the duty of care they owe to their users to act against harmful content, and to respect their data privacy rights.

“Companies like Facebook exercise massive market power which enables them to make money by bullying the smaller technology companies and developers who rely on this platform to reach their customers.

“These are issues that the major tech companies are well aware of, yet continually fail to address. The guiding principle of the ‘move fast and break things’ culture often seems to be that it is better to apologise than ask permission.

“We need a radical shift in the balance of power between the platforms and the people. The age of inadequate self-regulation must come to an end. The rights of the citizen need to be established in statute, by requiring the tech companies to adhere to a code of conduct written into law by Parliament, and overseen by an independent regulator.

“We also have to accept that our electoral regulations are hopelessly out of date for the internet age. We need reform so that the same principles of transparency of political communications apply online, just as they do in the real world. More needs to be done to require major donors to clearly establish the source of their funds.

“Much of the evidence we have scrutinised during our inquiry has focused on the business practices of Facebook; before, during and after the Cambridge Analytica data breach scandal.

“We believe that in its evidence to the Committee Facebook has often deliberately sought to frustrate our work, by giving incomplete, disingenuous and at times misleading answers to our questions.

“Even if Mark Zuckerberg doesn’t believe he is accountable to the UK Parliament, he is to the billions of Facebook users across the world. Evidence uncovered by my Committee shows he still has questions to answer yet he’s continued to duck them, refusing to respond to our invitations directly or sending representatives who don’t have the right information. Mark Zuckerberg continually fails to show the levels of leadership and personal responsibility that should be expected from someone who sits at the top of one of the world’s biggest companies.

“We also repeat our call to the Government to make a statement about how many investigations are currently being carried out into Russian interference in UK politics. We want to find out what was the impact of disinformation and voter manipulation on past elections including the UK Referendum in 2016 and are calling on the Government to launch an independent investigation.”

Final Report

This Final Report on Disinformation and ‘Fake News’ repeats a number of recommendations from the interim report published last summer. The Committee calls for the Government to reconsider a number of recommendations to which it did not respond and to include concrete proposals for action in its forthcoming White Paper on online harms.
Independent regulation of social media companies

The Report repeats a recommendation from the Interim Report for clear legal liabilities to be established for tech companies to act against harmful or illegal content on their sites, and the report calls for a compulsory Code of Ethics defining what constitutes harmful content. An independent regulator should be responsible for monitoring tech companies, backed by statutory powers to launch legal action against companies in breach of the code.

Companies failing obligations on harmful or illegal content would face hefty fines. MPs conclude: “Social media companies cannot hide behind the claim of being merely a ‘platform’ and maintain that they have no responsibility themselves in regulating the content of their sites.”

The Report’s recommendation chimes with recent statements by Ministers indicating the Government is prepared to regulate social media companies following the death of teenager Molly Russell. The Committee hopes to see firm recommendations for legislation in the White Paper to create a regulatory system for online content that is as effective as that for offline content.

It repeats its recommendation for new independent regulation to be funded by a levy on tech companies operating in the UK.

Data use and data targeting

The Report highlights Facebook documents obtained by the Committee and published in December 2018 relating to a Californian court case brought by app developer Six4Three. Through scrutiny of internal Facebook emails between 2011 and 2015, the Report finds evidence to indicate that the company was willing to: override its users’ privacy settings in order to transfer data to some app developers; charge some developers high prices in advertising for the exchange of data; and starve some developers, such as Six4Three, of that data, contributing to them losing their business. MPs conclude: “It is evident that Facebook intentionally and knowingly violated both data privacy and anti-competition laws.”

It recommends that the ICO carries out a detailed investigation into the practices of the Facebook platform, its use of users’ and users’ friends’ data, and the use of ‘reciprocity’ of the sharing of data. The CMA (Competition and Markets Authority) should conduct a comprehensive audit of the advertising market on social media and investigate whether Facebook has been involved in anti-competitive practices.

MPs note that Facebook, in particular, is unwilling to be accountable to regulators around the world: “By choosing not to appear before the Committee and by choosing not to respond personally to any of our invitations, Mark Zuckerberg has shown contempt towards both our Committee and the ‘International Grand Committee’ involving members from nine legislators from around the world.”

Read more parl.htm at MelonFarmers.co.uk

People convicted of insulting people online should be named and shamed on a government register of offenders under new laws to censor social media, says an all-party committee of MPs. The Commons petitions committee claimed new laws were needed to combat online harms because current legislation was not fit for purpose and self-regulation by the social media firms had failed.

The committee was responding to a petition, backed by more than 220,000 people, from reality TV star and model Katie Price who demanded new online laws and a register of offenders after her disabled son, Harvey, was viciously trolled for his condition, colour and size.

The MPs believe a criminal law which covers online abuse and includes proper recognition of hate crimes against disabled people will achieve what the petition is looking for from a register, as criminal convictions will show up as part of a Disclosure and Barring Service check.

The committee said a high proportion of abusive content related to football, most shockingly with the name of Harvey Price used by fans as an insult for someone’s ability as a footballer.


Parliament’s Regulatory Policy Committee (RPC) has reported that the government’s approach to internet porn censorship and age verification is fit for purpose, but asks a few important questions about how safe it is for porn viewers. The RPC was originally set up a decade ago to help cut red tape by independently checking government estimates of how much complying with new laws and regulations would cost the private sector. Of course, all it has achieved is to watch the western world suffocate itself in accelerating red tape, to the point that the west seems to be on a permanent course towards diminishing wealth and popular unrest. One has to ask if the committee itself is fit for purpose?

Anyway, on the subject of endangering porn users by setting them up for identity thieves, blackmailers and scammers, the authors write:

Risks and wider impacts. The Impact Assessment (IA) makes only limited reference to risks and wider impacts of the measure. These include the risk that adults and children may be pushed towards the dark web or related systems to avoid AV, where they could be exposed to illegal activities and extreme material that they otherwise would never have come into contact with. The IA also recognises numerous other wider impacts, including privacy/fraud concerns linked to inputting ID data into sites and apps.

Given the potential severity of such risks and wider impacts, the RPC believes that a more thorough consideration of each, and of the potential means to mitigate them, would have been appropriate. The RPC therefore recommends that the Department ensures that it robustly monitors these risks and wider impacts, post-implementation.


On Tuesday the House of Lords approved the BBFC’s scheme to implement internet porn censorship in the UK. Approval will now be sought from the House of Commons.

The debate in the Lords mentioned a few issues in passing, but the Lords seemed to be avoiding talking about some of the horrors of the scheme.

The Digital Economy Act, which defines the law behind the scheme, offers no legal requirement for age verification providers to restrict how they can use porn viewers’ data. Lords mentioned that the data is protected under the GDPR rules, but these rules still let companies do whatever they like with data, just with the proviso that they ask for consent. But of course the consent is pretty much mandatory to sign up for age verification, and some of the biggest internet companies in the world have set the precedent that they can claim wide-ranging usage of the data, saying it will be used, say, to improve customer experience.

Even if the Lords didn’t push very hard, people at the DCMS or BBFC have been considering this deficiency, and have come up with the idea that data use should be voluntarily restricted according to a kitemark scheme. Age verification providers will have their privacy protections audited by some independent group, and if they pass they can display a gold star. Porn viewers are then expected to trust age verification schemes with a gold star. But unfortunately it sounds a little like the sort of process that decided that cladding was safe for high-rise blocks of flats.

The Lords were much more concerned about the age verification requirements for social media and search engines, notably Twitter and Google Images. Clearly, schemes for checking that users are 13 will be technically very different from an 18-only check. So the Government explained that these wider issues will be addressed in a new censorship white paper to be published in 2019.

The Lords were also a bit perturbed that the definition of banned material wasn’t wide enough for their own preferences. Under the current scheme the BBFC will be expected to totally ban any websites with child porn or extreme porn. The Lords wondered why this wasn’t extended to cartoon porn and beyond-R18 porn, presumably thinking of fisting, golden showers and the like. However, in reality, if the definition of bannable porn were extended, then every major porn website in the world would have to be banned by the BBFC. And anyway, the government is changing its censorship rules such that fisting and golden showers are, or will soon be, allowable at R18.

The debate revealed that the banks and payment providers have already agreed to block payments to websites banned by the BBFC. The government also confirmed its intention to get the scheme up and running by April. That said, it would seem a little unfair for the websites’ 3-month implementation period to be set running before their age verification options are accredited with their gold stars. Otherwise some websites would waste time and money implementing schemes that may later be declared unacceptable.

Next, a motion to approve draft legislation over the UK’s age-verification regulations will be debated in the House of Commons. Stephen Winyard, AVSecure’s chief marketing officer, told XBIZ:

We are particularly pleased that the prime minister is set to approve the draft guidance for the age-verification law on Monday. From this, the Department for Digital, Culture, Media and Sport will issue the effective start date and that will be around Easter.

But maybe the prime minister has a few more urgent issues on her mind at the moment.


Parliament’s fake news inquiry has published a cache of seized Facebook documents, including internal emails sent between Mark Zuckerberg and the social network’s staff. The emails were obtained from the chief of a software firm that is suing the tech giant. About 250 pages have been published, some of which are marked highly confidential.

Facebook had objected to their release.

Damian Collins MP, the chair of the parliamentary committee involved, highlighted several key issues in an introductory note. He wrote that:

  • Facebook allowed some companies to maintain “full access” to users’ friends data even after announcing changes to its platform in 2014/2015 to limit what developers could see. “It is not clear that there was any user consent for this, nor how Facebook decided which companies should be whitelisted,” Mr Collins wrote
  • Facebook had been aware that an update to its Android app that let it collect records of users’ calls and texts would be controversial. “To mitigate any bad PR, Facebook planned to make it as hard as possible for users to know that this was one of the underlying features,” Mr Collins wrote
  • Facebook used data provided by the Israeli analytics firm Onavo to determine which other mobile apps were being downloaded and used by the public. It then used this knowledge to decide which apps to acquire or otherwise treat as a threat
  • there was evidence that Facebook’s refusal to share data with some apps caused them to fail
  • there had been much discussion of the financial value of providing access to friends’ data

In response, Facebook has said that the documents were presented in a very misleading manner and required additional context.

See Mark Zuckerberg’s response on Facebook