Archive for the ‘UK Parliament’ Category

The ICO’s onerous internet censorship measure starts its parliamentary approval stage.

Read more me_ico.htm at MelonFarmers.co.uk

ICO statement in response to the Government laying the Age Appropriate Design Code, also known as the Children’s Code, before Parliament.

We welcome the news that the Government has laid the Age Appropriate Design Code before Parliament. It’s a huge step towards protecting children online, especially given the increased reliance on online services at home during COVID-19.

The code sets out 15 standards that relevant online services should meet to protect children’s privacy and is the result of wide-ranging consultation and engagement with stakeholders including the tech industry, campaigners, trade bodies and organisations.

We are now pulling together our existing work on the benefits and the costs of the code to assess its impact. This will inform the discussions we have with businesses to help us develop a package of support to help them implement the code during the transition year.

But the idea failed because politicians like her didn’t give a damn about the safety of adults…

Read more uk_internet_censors.htm at MelonFarmers.co.uk

Elspeth Howe announced her intentions in a House of Lords debate about the Queen’s Speech. She said:

My Lords, I welcome the Government’s commitment to introduce its online harms Bill to improve internet safety for all, but, equally, stress that I remain deeply concerned by their failure to implement Part 3 of the Digital Economy Act. The rationale for focusing on the new Bill instead seems to be a desire to put attempts to protect children from pornographic websites on the same footing as attempts to protect them on social media platforms. It is entirely right to seek to promote safety in both contexts, but a basic error to suggest that both challenges should be addressed in the same way. The internet is complicated and one-size-fits-all policies simply will not work.

The focus of what I have read about the Government’s plans for online harms revolves around social media companies and fining them if they do not do what they are supposed to do under a new legal duty of care. An article in the Times on 31 December suggested that Ofcom is going to draw up legally enforceable codes of practice that will include protecting children from accessing pornography. This approach may work for social media platforms if they have bases in the UK but it will be absolutely useless at engaging with the challenge of protecting children from pornographic websites.

Initially when the Digital Economy Bill was introduced in another place, the proposal was that statutory age-verification requirements should be enforced through fines, but a cross-party group of MPs pointed out that this would never work because the top 50 pornographic websites accessed in the UK are all based in other jurisdictions. One could certainly threaten fines but it would be quite impossible to enforce them in a way that would concentrate the minds of website providers because, based in other jurisdictions, they could simply ignore them.

Because of that, MPs amended the Bill to give the regulator the option of IP blocking. This would enable the regulator to tell a site based in say, Russia, that if it failed to introduce robust age-verification checks within a certain timeframe, the regulator would block it from accessing the UK market. Children would be protected either by the site being blocked after the specified timeframe or, more probably, by the site deciding that it would make more sense for it to introduce proper age-verification checks rather than risk disruption of its UK income stream. The Government readily accepted the amendment because the case for it was unanswerable.

I say again that I welcome the fact that the Government want to address online safety with respect to social media platforms through their new Bill. This must not, however, be used as an excuse not to proceed with implementing Part 3 of the Digital Economy Act, which provides the very best way of dealing with the different challenge of protecting children from pornographic websites.

The failure to implement this legislation is particularly concerning because, rather than being a distant aspiration, it is all there on the statute book. The only thing standing in the way of statutory age verification with respect to pornographic websites is the Government’s delay in laying the BBFC age-verification guidance before Parliament and setting an implementation date. Having the capacity to deal with this problem, thanks to Part 3 of the Digital Economy Act, yet not bothering to avail ourselves of it does not reflect at all well on either the Government or British society as a whole. The Government must stop procrastinating over child safety with respect to pornographic websites and get on with implementing Part 3.

Mindful of that, on 21 January I will introduce my Digital Economy Act 2017 (commencement of Part 3) Bill, the simple purpose of which will be to implement Part 3 of the Digital Economy Act.

I hope that that will not be necessary and that the Minister will today confirm that, notwithstanding the new online safety Bill, the Government will now press ahead with implementation themselves. I very much look forward to hearing what the Minister has to say.

Read more parl.htm at MelonFarmers.co.uk

A few unelected members of the House of Lords are introducing their own internet censorship law because they think it is unreasonable to wait a year for the government to work through the issues. Tom McNally, previously involved in TV censorship law, has challenged the Government to back his proposed new law, which is set to be introduced in the House of Lords on January 14.

The bill would impose a duty of care on internet companies, with Ofcom given censorship powers to enforce its provisions.

McNally told The Daily Telegraph:

We are in danger of losing a whole year on this. The Government’s commitment to develop safer internet legislation in the Queen’s Speech, though welcome, did not go far enough.

The Government has yet to reveal the findings from its consultation on its White Paper, which was published in the summer. The results had been expected before the end of this year but have been delayed by the general election. McNally is drafting the bill with the Carnegie Trust, who campaign for internet censorship in the name of thinking of the children. Lord Puttnam and Baroness Kidron, the film director and children’s internet rights campaigner respectively, are being canvassed as sponsors of the bill.

‘When we enter a building we expect it to be safe. We are not expected to examine and understand all the paperwork and then tick a box that lets the companies involved off the hook’

Read more parl.htm at MelonFarmers.co.uk

The UK Parliament’s Joint Committee on Human Rights has reported on serious grounds for concern about the nature of the “consent” people provide when handing over an extraordinary range of information about themselves, to be used for commercial gain by private companies:

  • Privacy policies are too complicated for the vast majority of people to understand: while individuals may understand they are consenting to data collection from a given site in exchange for “free” access to content, they may not understand that information is being compiled, without their knowledge, across sites to create a profile. The Committee heard alarming evidence about eye-tracking software being used to make assumptions about people’s sexual orientation, whether they have a mental illness, are drunk or have taken drugs: all then added to their profile.
  • Too often the use of a service or website is conditional on consent being given — raising questions about whether it is freely given.
  • People cannot find out what they have consented to: it is difficult, if not nearly impossible, for people – even tech experts – to find out who their data has been shared with, to stop it being shared or to delete inaccurate information about themselves.
  • The consent model relies on individuals knowing about the risks associated with using web-based services, when the system should provide adequate protection from the risks as a default.
  • It is completely inappropriate to use consent when processing children’s data: children aged 13 and older are, under the current legal framework, considered old enough to consent to their data being used, even though many adults struggle to understand what they are consenting to.

Key conclusions and recommendations

The Committee points out that there is a real risk of discrimination against some groups and individuals through the way this data is used: it heard deeply troubling evidence about some companies using personal data to ensure that only people of a certain age or race, for example, see a particular job opportunity or housing advertisement.

There are also long-established concerns about the use of such data to discriminate in provision of insurance or credit products.

Unlike traditional print advertising, where such blatant discrimination would be obvious and potentially illegal, personalisation of content means people have no way of knowing how what they see online compares to anyone else.

Short of whistleblowers or work by investigative journalists, there currently appears to be no mechanism for protecting against such privacy breaches or discrimination in the online “Wild West”.

The Committee calls on the Government to ensure there is robust regulation over how our data can be collected and used and it calls for better enforcement of that regulation.

The Committee says:

  • The “consent model is broken” and should not be used as a blanket basis for processing. It is impossible for people to know what they are consenting to when making a non-negotiable, take-it-or-leave-it “choice” about joining services like Facebook, Snapchat and YouTube based on lengthy, complex T&Cs, subject to future changes to terms.
  • This model puts too much onus on the individual, but the responsibility of knowing about the risks of using web-based services cannot rest on the individual. The Government should strengthen regulation to guarantee safe passage on the internet.
  • It is completely inadequate to use consent when it comes to processing children’s data. If adults struggle to understand complex consent agreements, how do we expect our children to give informed consent? The Committee says setting the digital age of consent at 13 years old should be revisited.
  • The Government should be regulating to keep us safe online in the same way as they do in the real world – not by expecting us to become technical experts who can judge whether our data is being used appropriately but by having strictly enforced standards that protect our right to privacy and freedom from discrimination.
  • It should be made much simpler for individuals to see what data has been shared about them, and with whom, and to prevent some or all of their data being shared.
  • The Government should look at creating a single online registry that would allow people to see, in real time, all the companies that hold personal data on them, and what data they hold.

The report is worth a read and contains many important points criticising the consent model as dictated by GDPR and enforced by the ICO. Here are a few passages from the report’s summary:

The evidence we heard during this inquiry, however, has convinced us that the consent model is broken. The information providing the details of what we are consenting to is too complicated for the vast majority of people to understand. Far too often, the use of a service or website is conditional on consent being given: the choice is between full consent or not being able to use the website or service. This raises questions over how meaningful this consent can ever really be.

Whilst most of us are probably unaware of who we have consented to share our information with and what we have agreed that they can do with it, this is undoubtedly doubly true for children. The law allows children aged 13 and over to give their own consent. If adults struggle to understand complex consent agreements, how do we expect our children to give informed consent? Parents have no say over, or knowledge of, the data their children are sharing and with whom. There is no effective mechanism for a company to determine the age of a person providing consent. In reality a child of any age can click a consent button.

The bogus reliance on consent is in clear conflict with our right to privacy. The consent model relies on us, as individuals, to understand, take decisions, and be responsible for how our data is used. But we heard that it is difficult, if not nearly impossible, for people to find out whom their data has been shared with, to stop it being shared or to delete inaccurate information about themselves. Even when consent is given, all too often the limit of that consent is not respected. We believe companies must make it much easier for us to understand how our data is used and shared. They must make it easier for us to opt out of some or all of our data being used. More fundamentally, however, the onus should not be on us to ensure our data is used appropriately – the system should be designed so that we are protected without requiring us to understand and to police whether our freedoms are being protected.

As one witness to our inquiry said, when we enter a building we expect it to be safe. We are not expected to examine and understand all the paperwork and then tick a box that lets the companies involved off the hook. It is the job of the law, the regulatory system and of regulators to ensure that the appropriate standards have been met to keep us from harm and ensure our safe passage. We do not believe the internet should be any different. The Government must ensure that there is robust regulation over how our data can be collected and used, and that regulation must be stringently enforced.

Internet companies argue that we benefit from our data being collected and shared. It means the content we see online – from recommended TV shows to product advertisements – is more likely to be relevant to us. But there is a darker side to personalisation. The ability to target advertisements and other content at specific groups of people makes it possible to ensure that only people of a certain age or race, for example, see a particular job opportunity or housing advertisement. Unlike traditional print advertising, where such blatant discrimination would be obvious, personalisation of content means people have no way of knowing how what they see online compares to anyone else. Short of a whistle-blower within the company or work by an investigative journalist, there does not currently seem to be a mechanism for uncovering these cases and protecting people from discrimination.

We also heard how the data being used (often by computer programmes rather than people) to make potentially life-changing decisions about the services and information available to us is not even necessarily accurate, but based on inferences made from the data they do hold. We were told of one case, for example, where eye-tracking software was being used to make assumptions about people’s sexual orientation, whether they have a mental illness, are drunk or have taken drugs. These inferences may be entirely untrue, but the individual has no way of finding out what judgements have been made about them.

We were left with the impression that the internet, at times, is like the Wild West, when it comes to the lack of effective regulation and enforcement.

That is why we are deeply frustrated that the Government’s recently published Online Harms White Paper explicitly excludes the protection of people’s personal data. The Government is intending to create a new statutory duty of care to make internet companies take more responsibility for the safety of their users, and an independent regulator to enforce it. This could be an ideal vehicle for requiring companies to take people’s right to privacy, and freedom from discrimination, more seriously and we would strongly urge the Government to reconsider its decision to exclude data protection from the scope of their new regulatory framework. In particular, we consider that the enforcement of data protection rules – including the risks of discrimination through the use of algorithms – should be within scope of this work.

Fact check this, Mr Collins: ‘What people voted for last year was for us to leave the European Union and we will leave the EU on 29 March 2019’

Read more parl.htm at MelonFarmers.co.uk

Damian Collins, the chair of the House of Commons Digital, Culture, Media and Sport Select Committee, has written to Nick Clegg, Facebook’s vice-president for global affairs and communications, querying Facebook’s decision to exempt political adverts from fact-checking. Collins, presumably speaking from planet Uranus where all politicians always tell the truth, demanded to know why Facebook has decided to exempt political statements from its fact-checking programme — removing all bars on political candidates lying in paid adverts.

Collins wrote to Clegg with five questions for Facebook to answer, three of which covered the rule change. Why was the decision taken to change Facebook’s policy, the MP asked, given the heavy constraint this will place on Facebook’s ability to combat online disinformation in the run-up to elections around the world, and a possible UK general election in particular?

Read more uk_internet_censors.htm at MelonFarmers.co.uk

Web browser risk to child safety

We are deeply concerned that a new form of encryption being introduced to our web browsers will have terrible consequences for child protection.

The new system — known as DNS over HTTPS — would have the effect of undermining the work of the Internet Watch Foundation (IWF); yet Mozilla, provider of the Firefox browser, has decided to introduce it, and others may follow.

The amount of abusive content online is huge and not declining. Last year, the IWF removed more than 105,000 web pages showing the sexual abuse of children. While the UK has an excellent record in eliminating the hosting of such illegal content, there is still a significant demand from UK internet users: the National Crime Agency estimates there are 144,000 internet users on some of the worst dark-web child sexual abuse sites.

To fight this, the IWF provides a URL block list that allows internet service providers to block internet users from accessing known child sexual abuse content until it is taken down by the host country. The deployment of the new encryption system in its proposed form could render this service obsolete, exposing millions of people to the worst imagery of children being sexually abused, and the victims of said abuse to countless sets of eyes.

Advances in protecting users’ data must not come at the expense of children. We urge the secretary of state for digital, culture, media and sport to address this issue in the government’s upcoming legislation on online harms.

  • Sarah Champion MP;
  • Tom Watson MP;
  • Carolyn Harris MP;
  • Tom Brake MP;
  • Stephen Timms MP;
  • Ian Lucas MP;
  • Tim Loughton MP;
  • Giles Watling MP;
  • Madeleine Moon MP;
  • Vicky Ford MP;
  • Rosie Cooper MP;
  • Baroness Howe;
  • Lord Knight;
  • Baroness Thornton;
  • Baroness Walmsley;
  • Lord Maginnis;
  • Baroness Benjamin;
  • Lord Harris of Haringey

The IWF service is continually wheeled out as an argument against DoH, but I am starting to wonder if it is still relevant. Given the universal revulsion against child sexual abuse, I’d suspect that little such material would now be located on the open internet. Surely it would be hiding away in hard-to-find places like the dark web that are unlikely to be stumbled on by normal people. And of course those using the dark web aren’t using ISP DNS servers anyway.

In reality the point of using DoH is to evade government attempts to block legal porn sites. If the authorities weren’t intending to block legal sites, then surely people would be happy to use the ISP DNS, including the IWF service.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

At the moment, when internet users want to view a page, their browser looks up the site’s domain name in the clear. ISPs can see the domain requested and can block it if the authorities don’t like it. A new internet protocol has been launched that encrypts that lookup so that ISPs can’t tell what site is being requested, and so can’t block it. This new DNS over HTTPS protocol is already available in Firefox, which also provides an uncensored and encrypted DNS server. Users simply have to change the settings in about:config (being careful of the dragons, of course).
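For the technically curious, here is a minimal sketch of what a DoH lookup involves, using Cloudflare’s public JSON resolver endpoint purely as an example (Firefox’s own deployment speaks the RFC 8484 binary wire format instead, but the privacy effect is the same):

```python
import json
import urllib.request

# A DNS over HTTPS (DoH) lookup via Cloudflare's public JSON API.
# Any DoH-capable resolver would do; this endpoint is just an example.
DOH_ENDPOINT = "https://cloudflare-dns.com/dns-query"

def resolve(hostname, record_type="A"):
    # The question travels inside an ordinary HTTPS request, so an ISP
    # sees only an encrypted connection to the resolver, not the name
    # being looked up.
    url = f"{DOH_ENDPOINT}?name={hostname}&type={record_type}"
    req = urllib.request.Request(url, headers={"accept": "application/dns-json"})
    with urllib.request.urlopen(req) as response:
        answer = json.load(response)
    return [record["data"] for record in answer.get("Answer", [])]

print(resolve("example.com"))
```

Because the lookup rides inside ordinary HTTPS traffic, an ISP applying a DNS block list sees nothing to filter, which is exactly why the protocol bypasses the blocking mechanisms discussed below.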

Questions have been raised in the House of Lords about the impact on the UK’s ability to censor the internet.

House of Lords, 14th May 2019, Internet Encryption Question

Baroness Thornton Shadow Spokesperson (Health) 2:53 pm, 14th May 2019

To ask Her Majesty’s Government what assessment they have made of the deployment of the Internet Engineering Task Force’s new “DNS over HTTPS” protocol and its implications for the blocking of content by internet service providers and the Internet Watch Foundation; and what steps they intend to take in response.

Lord Ashton of Hyde The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

My Lords, DCMS is working together with the National Cyber Security Centre to understand and resolve the implications of DNS over HTTPS, also referred to as DoH, for the blocking of content online. This involves liaising across government and engaging with industry at all levels (operators, internet service providers, browser providers and pan-industry organisations) to understand rollout options and influence the way ahead. The rollout of DoH is a complex commercial and technical issue revolving around the global nature of the internet.

Baroness Thornton Shadow Spokesperson (Health)

My Lords, I thank the Minister for that Answer, and I apologise to the House for this somewhat geeky Question. This Question concerns the danger posed to existing internet safety mechanisms by an encryption protocol that, if implemented, would render useless the family filters in millions of homes and the ability to track down illegal content by organisations such as the Internet Watch Foundation. Does the Minister agree that there is a fundamental and very concerning lack of accountability when obscure technical groups, peopled largely by the employees of the big internet companies, take decisions that have major public policy implications with enormous consequences for all of us and the safety of our children? What engagement have the British Government had with the internet companies that are represented on the Internet Engineering Task Force about this matter?

Lord Ashton of Hyde The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

My Lords, I thank the noble Baroness for discussing this with me beforehand, which was very welcome. I agree that there may be serious consequences from DoH. The DoH protocol has been defined by the Internet Engineering Task Force. Where I do not agree with the noble Baroness is that this is not an obscure organisation; it has been the dominant internet technical standards organisation for 30-plus years and has attendees from civil society, academia and the UK Government as well as the industry. The proceedings are available online and are not restricted. It is important to know that DoH has not been rolled out yet and the picture is complex: there are pros to DoH as well as cons. We will continue to be part of these discussions; indeed, there was a meeting last week, convened by the NCSC, with DCMS and industry stakeholders present.

Lord Clement-Jones Liberal Democrat Lords Spokesperson (Digital)

My Lords, the noble Baroness has raised a very important issue, and it sounds from the Minister’s Answer as though the Government are somewhat behind the curve on this. When did Ministers actually get to hear about the new encrypted DoH protocol? Does it not risk blowing a very large hole in the Government’s online safety strategy set out in the White Paper?

Lord Ashton of Hyde The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

As I said to the noble Baroness, the Government attend the IETF. The protocol was discussed from October 2017 to October 2018, so it was during that process. As far as the online harms White Paper is concerned, the technology will potentially cause changes in enforcement by online companies, but of course it does not change the duty of care in any way. We will have to look at the alternatives to some of the most dramatic forms of enforcement, such as DNS blocking.

Lord Stevenson of Balmacara Opposition Whip (Lords)

My Lords, if there is obscurity, it is probably in the use of the technology itself and the terminology that we have to use: DoH and the other protocols that have been referred to are complicated. At heart, there are two issues at stake, are there not? The first is that the intentions of DoH, as the Minister said, are quite helpful in terms of protecting identity, and we do not want to lose that. On the other hand, it makes it difficult, as has been said, to see how the Government can continue with their current plan. We support the Digital Economy Act approach to age-appropriate design, and we hope that that will not be affected. We also think that the duty of care on all companies to protect users of their services, which we hope will soon be legislated for, will help. I note that the Minister says in his recent letter that there is a requirement on the Secretary of State to carry out a review of the impact and effectiveness of the regulatory framework included in the DEA within the next 12 to 18 months. Can he confirm that the issue of DoH will be included?

Lord Ashton of Hyde The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

Clearly, DoH is on the agenda at DCMS and will be included everywhere it is relevant. On the consideration of enforcement, as I said before, it may require changes to potential enforcement mechanisms; we are aware that there are other enforcement mechanisms. It is not true to say that you cannot block sites; it makes it more difficult, and you have to do it in a different way.

The Countess of Mar Deputy Chairman of Committees, Deputy Speaker (Lords)

My Lords, for the uninitiated, can the noble Lord tell us what DoH means, very briefly, please?

Lord Ashton of Hyde The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

It is not possible to do so very briefly. It means that, when you send a request to a server and you have to work out which server you are going to by finding out the IP address, the message is encrypted so that the intervening servers are not able to look at what is in the message. It encrypts the message that is sent to the servers. What that means is that, whereas previously every server along the route could see what was in the message, now only the browser will have the ability to look at it, and that will put more power in the hands of the browsers.

Lord West of Spithead Labour

My Lords, I thought I understood this subject until the Minister explained it a minute ago. This is a very serious issue. I was unclear from his answer: is this going to be addressed in the White Paper? Will the new officer who is being appointed have the ability to look at this issue when the White Paper comes out?

Lord Ashton of Hyde The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

It is not something that the White Paper per se can look at, because it is not within the purview of the Government. The protocol is designed by the IETF, which is not a government body; it is a standards body, so to that extent it is not possible. Obviously, however, when it comes to regulating and the powers that the regulator can use, the White Paper is consulting precisely on those matters, which include DNS blocking, so it can be considered in the consultation.
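To make the Minister’s description concrete, here is a rough sketch (mine, not Hansard’s) of the old-style lookup he contrasts DoH with. A classic DNS query is a small UDP datagram in which the hostname sits in readable bytes, so every hop between the user and the resolver can inspect or block it; Google’s public resolver at 8.8.8.8 is used purely for illustration:

```python
import socket
import struct

def build_plain_dns_query(hostname):
    # 12-byte DNS header: ID, flags (recursion desired), 1 question,
    # no answer/authority/additional records.
    header = struct.pack(">HHHHHH", 0x1234, 0x0100, 1, 0, 0, 0)
    # QNAME: each dot-separated label is length-prefixed, ending in a
    # zero byte; QTYPE 1 = A record, QCLASS 1 = IN (internet).
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in hostname.split(".")
    )
    return header + qname + b"\x00" + struct.pack(">HH", 1, 1)

packet = build_plain_dns_query("example.com")
print(packet)  # the labels 'example' and 'com' are plainly readable

# Sent over bare UDP port 53, those same readable bytes cross every
# network hop between the user and the resolver.
with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
    sock.settimeout(2)
    sock.sendto(packet, ("8.8.8.8", 53))
    reply, _ = sock.recvfrom(512)
```

Under DoH the same question is wrapped inside an encrypted HTTPS request, so only the browser and its chosen resolver can read it, which is precisely the shift of power towards the browsers that the Minister describes.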

Read more uk_internet_censors.htm at MelonFarmers.co.uk

An informal group of MPs, the All Party Parliamentary Group on Social Media and Young People’s Mental Health and Wellbeing, has published a report calling for the establishment of an internet censor. The report claims:

  • 80% of the UK public believe tighter regulation is needed to address the impact of social media on the health and wellbeing of young people.
  • 63% of young people reported social media to be a good source of health information.
  • However, children who spend more than three hours a day using social media are twice as likely to display symptoms of mental ill health.
  • Pressure to conform to beauty standards perpetuated and praised online can encourage harmful behaviours to achieve “results”, including body shame and disordered eating, with 46% of girls, compared to 38% of all young people, reporting that social media has had a negative impact on their self-esteem.

The report, titled #NewFilters to manage the impact of social media on young people’s mental health and wellbeing, puts forward a number of policy recommendations, including:

  • Establish a duty of care on all social media companies with registered UK users aged 24 and under in the form of a statutory code of conduct, with Ofcom to act as regulator.
  • Create a Social Media Health Alliance, funded by a 0.5% levy on the profits of social media companies, to fund research, educational initiatives and establish clearer guidance for the public.
  • Review whether the “addictive” nature of social media is sufficient for official disease classification.
  • Urgently commission robust, longitudinal research, into understanding the extent to which the impact of social media on young people’s mental health and wellbeing is one of cause or correlation.

Chris Elmore MP, Chair of the APPG on Social Media and Young People’s Mental Health and Wellbeing, said:

“I truly think our report is the wake-up call needed to ensure – finally – that meaningful action is taken to lessen the negative impact social media is having on young people’s mental health.

For far too long social media companies have been allowed to operate in an online Wild West. And it is in this lawless landscape that our children currently work and play online. This cannot continue. As the report makes clear, now is the time for the government to take action.

The recommendations from our Inquiry are both sensible and reasonable; they would make a huge difference to the current mental health crisis among our young people.

I hope to work constructively with the UK Government in the coming weeks and months to ensure we see real changes to tackle the issues highlighted in the report at the earliest opportunity.”

Read more uk_internet_censors.htm at MelonFarmers.co.uk

The House of Lords Communications Committee has called for a new, overarching censorship framework so that services in the digital world are held accountable to an enforceable set of government rules. The Lords Communications Committee writes:

Background

In its report ‘Regulating in a digital world’, the committee notes that over a dozen UK regulators have a remit covering the digital world, but there is no body with complete oversight. As a result, regulation of the digital environment is fragmented, with gaps and overlaps. Big tech companies have failed to adequately tackle online harms.

Responses to growing public concern have been piecemeal and inadequate. The Committee recommends a new Digital Authority, guided by 10 principles to inform regulation of the digital world.

Chairman’s Comments

The chairman of the committee, Lord Gilbert of Panteg, said:

“The Government should not just be responding to news headlines but looking ahead so that the services that constitute the digital world can be held accountable to an agreed set of principles.

“Self-regulation by online platforms is clearly failing. The current regulatory framework is out of date. The evidence we heard made a compelling and urgent case for a new approach to regulation. Without intervention, the largest tech companies are likely to gain ever more control of technologies which extract personal data and make decisions affecting people’s lives. Our proposals will ensure that rights are protected online as they are offline while keeping the internet open to innovation and creativity, with a new culture of ethical behaviour embedded in the design of services.”

Recommendations for a new regulatory approach

Digital Authority

A new ‘Digital Authority’ should be established to co-ordinate regulators, continually assess regulation and make recommendations on which additional powers are necessary to fill gaps. The Digital Authority should play a key role in providing the public, the Government and Parliament with the latest information. It should report to a new joint committee of both Houses of Parliament, whose remit would be to consider all matters related to the digital world.

10 principles for regulation

The 10 principles identified in the committee’s report should guide all regulation of the internet. They include accountability, transparency, respect for privacy and freedom of expression. The principles will help the industry, regulators, the Government and users work towards a common goal of making the internet a better, more respectful environment which is beneficial to all. If rights are infringed, those responsible should be held accountable in a fair and transparent way.

Recommendations for specific action

Online harms and a duty of care

  • A duty of care should be imposed on online services which host and curate content which can openly be uploaded and accessed by the public. Given the urgent need to address online harms, Ofcom’s remit should expand to include responsibility for enforcing the duty of care.

  • Online platforms should make community standards clearer through a new classification framework akin to that of the British Board of Film Classification. Major platforms should invest in more effective moderation systems to uphold their community standards.

Ethical technology

  • Users should have greater control over the collection of personal data. Maximum privacy and safety settings should be the default.

  • Data controllers and data processors should be required to publish an annual data transparency statement detailing which forms of behavioural data they generate or purchase from third parties, how they are stored, for how long, and how they are used and transferred.

  • The Government should empower the Information Commissioner’s Office to conduct impact-based audits where risks associated with using algorithms are greatest. Businesses should be required to explain how they use personal data and what their algorithms do.

Market concentration

  • The modern internet is characterised by the concentration of market power in a small number of companies which operate online platforms. Greater use of data portability might help, but this will require more interoperability.

  • The Government should consider creating a public-interest test for data-driven mergers and acquisitions.

  • Regulation should recognise the inherent power of intermediaries.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

The Digital, Culture, Media and Sport Committee has published its final report on Disinformation and ‘fake news’. The report calls for:

  • Compulsory Code of Ethics for tech companies overseen by independent regulator

  • Regulator given powers to launch legal action against companies breaching code

  • Government to reform current electoral communications laws and rules on overseas involvement in UK elections

  • Social media companies obliged to take down known sources of harmful content, including proven sources of disinformation

The report further finds that:

  • Electoral law ‘not fit for purpose’

  • Facebook intentionally and knowingly violated both data privacy and anti-competition laws

Chair’s comment

Damian Collins MP, Chair of the DCMS Committee said:

“Our inquiry over the last year has identified three big threats to our society. The challenge for the year ahead is to start to fix them; we cannot delay any longer.

“Democracy is at risk from the malicious and relentless targeting of citizens with disinformation and personalised ‘dark adverts’ from unidentifiable sources, delivered through the major social media platforms we use every day. Much of this is directed from agencies working in foreign countries, including Russia.

“The big tech companies are failing in the duty of care they owe to their users to act against harmful content, and to respect their data privacy rights.

“Companies like Facebook exercise massive market power which enables them to make money by bullying the smaller technology companies and developers who rely on this platform to reach their customers.

“These are issues that the major tech companies are well aware of, yet continually fail to address. The guiding principle of the ‘move fast and break things’ culture often seems to be that it is better to apologise than ask permission.

“We need a radical shift in the balance of power between the platforms and the people. The age of inadequate self-regulation must come to an end. The rights of the citizen need to be established in statute, by requiring the tech companies to adhere to a code of conduct written into law by Parliament, and overseen by an independent regulator.

“We also have to accept that our electoral regulations are hopelessly out of date for the internet age. We need reform so that the same principles of transparency of political communications apply online, just as they do in the real world. More needs to be done to require major donors to clearly establish the source of their funds.

“Much of the evidence we have scrutinised during our inquiry has focused on the business practices of Facebook; before, during and after the Cambridge Analytica data breach scandal.

“We believe that in its evidence to the Committee Facebook has often deliberately sought to frustrate our work, by giving incomplete, disingenuous and at times misleading answers to our questions.

“Even if Mark Zuckerberg doesn’t believe he is accountable to the UK Parliament, he is to the billions of Facebook users across the world. Evidence uncovered by my Committee shows he still has questions to answer yet he’s continued to duck them, refusing to respond to our invitations directly or sending representatives who don’t have the right information. Mark Zuckerberg continually fails to show the levels of leadership and personal responsibility that should be expected from someone who sits at the top of one of the world’s biggest companies.

“We also repeat our call to the Government to make a statement about how many investigations are currently being carried out into Russian interference in UK politics. We want to find out what the impact of disinformation and voter manipulation was on past elections, including the UK Referendum in 2016, and are calling on the Government to launch an independent investigation.”

Final Report

This Final Report on Disinformation and ‘Fake News’ repeats a number of recommendations from the interim report published last summer. The Committee calls for the Government to reconsider a number of recommendations to which it did not respond and to include concrete proposals for action in its forthcoming White Paper on online harms.

Independent regulation of social media companies

The Report repeats a recommendation from the Interim Report for clear legal liabilities to be established for tech companies to act against harmful or illegal content on their sites, and the report calls for a compulsory Code of Ethics defining what constitutes harmful content. An independent regulator should be responsible for monitoring tech companies, backed by statutory powers to launch legal action against companies in breach of the code.

Companies failing obligations on harmful or illegal content would face hefty fines. MPs conclude: “Social media companies cannot hide behind the claim of being merely a ‘platform’ and maintain that they have no responsibility themselves in regulating the content of their sites.”

The Report’s recommendation chimes with recent statements by Ministers indicating the Government is prepared to regulate social media companies following the death of teenager Molly Russell. The Committee hopes to see firm recommendations for legislation in the White Paper to create a regulatory system for online content that is as effective as that for offline content.

It repeats its recommendation for new independent regulation to be funded by a levy on tech companies operating in the UK.

Data use and data targeting

The Report highlights Facebook documents obtained by the Committee and published in December 2018 relating to a Californian court case brought by app developer Six4Three. Through scrutiny of internal Facebook emails between 2011 and 2015, the Report finds evidence to indicate that the company was willing to override its users’ privacy settings in order to transfer data to some app developers; to charge high prices in advertising to some developers in exchange for data; and to starve some developers, such as Six4Three, of that data, contributing to them losing their business. MPs conclude: “It is evident that Facebook intentionally and knowingly violated both data privacy and anti-competition laws.”

It recommends that the ICO carries out a detailed investigation into the practices of the Facebook platform, its use of users’ and users’ friends’ data, and the use of ‘reciprocity’ of the sharing of data. The CMA (Competition and Markets Authority) should conduct a comprehensive audit of the advertising market on social media and investigate whether Facebook has been involved in anti-competitive practices.

MPs note that Facebook, in particular, is unwilling to be accountable to regulators around the world: “By choosing not to appear before the Committee and by choosing not to respond personally to any of our invitations, Mark Zuckerberg has shown contempt towards both our Committee and the ‘International Grand Committee’ involving members from nine legislatures around the world.”