Read more uk_internet_censors.htm at MelonFarmers.co.uk

The Digital, Culture, Media and Sport Committee has published its final report on Disinformation and ‘fake news’. The report calls for:

  • Compulsory Code of Ethics for tech companies overseen by independent regulator

  • Regulator given powers to launch legal action against companies breaching code

  • Government to reform current electoral communications laws and rules on overseas involvement in UK elections

  • Social media companies obliged to take down known sources of harmful content, including proven sources of disinformation

Further finds that:

  • Electoral law ‘not fit for purpose’

  • Facebook intentionally and knowingly violated both data privacy and anti-competition laws

Chair’s comment

Damian Collins MP, Chair of the DCMS Committee said:

“Our inquiry over the last year has identified three big threats to our society. The challenge for the year ahead is to start to fix them; we cannot delay any longer.

“Democracy is at risk from the malicious and relentless targeting of citizens with disinformation and personalised ‘dark adverts’ from unidentifiable sources, delivered through the major social media platforms we use every day. Much of this is directed from agencies working in foreign countries, including Russia.

“The big tech companies are failing in the duty of care they owe to their users to act against harmful content, and to respect their data privacy rights.

“Companies like Facebook exercise massive market power which enables them to make money by bullying the smaller technology companies and developers who rely on this platform to reach their customers.

“These are issues that the major tech companies are well aware of, yet continually fail to address. The guiding principle of the ‘move fast and break things’ culture often seems to be that it is better to apologise than ask permission.

“We need a radical shift in the balance of power between the platforms and the people. The age of inadequate self regulation must come to an end. The rights of the citizen need to be established in statute, by requiring the tech companies to adhere to a code of conduct written into law by Parliament, and overseen by an independent regulator.

“We also have to accept that our electoral regulations are hopelessly out of date for the internet age. We need reform so that the same principles of transparency of political communications apply online, just as they do in the real world. More needs to be done to require major donors to clearly establish the source of their funds.

“Much of the evidence we have scrutinised during our inquiry has focused on the business practices of Facebook; before, during and after the Cambridge Analytica data breach scandal.

“We believe that in its evidence to the Committee Facebook has often deliberately sought to frustrate our work, by giving incomplete, disingenuous and at times misleading answers to our questions.

“Even if Mark Zuckerberg doesn’t believe he is accountable to the UK Parliament, he is to the billions of Facebook users across the world. Evidence uncovered by my Committee shows he still has questions to answer yet he’s continued to duck them, refusing to respond to our invitations directly or sending representatives who don’t have the right information. Mark Zuckerberg continually fails to show the levels of leadership and personal responsibility that should be expected from someone who sits at the top of one of the world’s biggest companies.

“We also repeat our call to the Government to make a statement about how many investigations are currently being carried out into Russian interference in UK politics. We want to find out what was the impact of disinformation and voter manipulation on past elections including the UK Referendum in 2016 and are calling on the Government to launch an independent investigation.”

Final Report

This Final Report on Disinformation and ‘Fake News’ repeats a number of recommendations from the interim report published last summer. The Committee calls for the Government to reconsider a number of recommendations to which it did not respond and to include concrete proposals for action in its forthcoming White Paper on online harms.
Independent regulation of social media companies

The Report repeats a recommendation from the Interim Report for clear legal liabilities to be established for tech companies to act against harmful or illegal content on their sites, and the report calls for a compulsory Code of Ethics defining what constitutes harmful content. An independent regulator should be responsible for monitoring tech companies, backed by statutory powers to launch legal action against companies in breach of the code.

Companies failing obligations on harmful or illegal content would face hefty fines. MPs conclude: “Social media companies cannot hide behind the claim of being merely a ‘platform’ and maintain that they have no responsibility themselves in regulating the content of their sites.”

The Report’s recommendation chimes with recent statements by Ministers indicating the Government is prepared to regulate social media companies following the death of teenager Molly Russell. The Committee hopes to see firm recommendations for legislation in the White Paper to create a regulatory system for online content that is as effective as that for offline content.

It repeats its recommendation for new independent regulation to be funded by a levy on tech companies operating in the UK.

Data use and data targeting

The Report highlights Facebook documents obtained by the Committee and published in December 2018 relating to a Californian court case brought by app developer Six4Three. Through scrutiny of internal Facebook emails between 2011 and 2015, the Report finds evidence to indicate that the company was willing to override its users’ privacy settings in order to transfer data to some app developers; to charge some developers high prices in advertising in exchange for data; and to starve other developers, such as Six4Three, of that data, contributing to the loss of their businesses. MPs conclude: “It is evident that Facebook intentionally and knowingly violated both data privacy and anti-competition laws.”

It recommends that the ICO carries out a detailed investigation into the practices of the Facebook platform, its use of users’ and users’ friends’ data, and the use of ‘reciprocity’ of the sharing of data. The CMA (Competition and Markets Authority) should conduct a comprehensive audit of the advertising market on social media and investigate whether Facebook has been involved in anti-competitive practices.

MPs note that Facebook, in particular, is unwilling to be accountable to regulators around the world: “By choosing not to appear before the Committee and by choosing not to respond personally to any of our invitations, Mark Zuckerberg has shown contempt towards both our Committee and the ‘International Grand Committee’ involving members from nine legislatures from around the world.”

Read more eu.htm at MelonFarmers.co.uk

On the evening of February 13, negotiators from the European Parliament and the Council concluded the trilogue negotiations with a final text for the new EU Copyright Directive.

For two years we’ve debated different drafts and versions of the controversial Articles 11 and 13. Now, there is no more ambiguity: This law will fundamentally change the internet as we know it — if it is adopted in the upcoming final vote. But we can still prevent that!

Please click the links to take a look at the final wording of Article 11 and Article 13. Here’s my summary:

Article 13: Upload filters

Parliament negotiator Axel Voss accepted the deal between France and Germany I laid out in a recent blog post:

  • Commercial sites and apps where users can post material must make “best efforts” to preemptively buy licences for anything that users may possibly upload — that is: all copyrighted content in the world. An impossible feat.

  • In addition, all but very few sites (those both tiny and very new) will need to do everything in their power to prevent anything from ever going online that may be an unauthorised copy of a work that a rightsholder has registered with the platform. They will have no choice but to deploy upload filters, which are by their nature both expensive and error-prone.

  • Should a court ever find their licensing or filtering efforts not fierce enough, sites are directly liable for infringements as if they had committed them themselves. This massive threat will lead platforms to over-comply with these rules to stay on the safe side, further worsening the impact on our freedom of speech.
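The error-prone nature described above can be illustrated with a minimal sketch (all data and names below are hypothetical, invented for illustration). Real systems use audio and video fingerprinting rather than word overlap, but the failure mode is the same: matching is purely mechanical, with no notion of quotation, parody or other lawful uses, so a review quoting a registered work trips the filter just like an infringing copy.

```python
import string

# Naive upload filter sketch (hypothetical).  An upload is blocked if it
# shares too many words with any "registered" work -- a stand-in for the
# fingerprint matching a real platform would use.

REGISTERED_WORKS = [
    "we're no strangers to love you know the rules and so do i",
]

def tokens(text: str) -> set:
    """Lowercase a text and strip surrounding punctuation from each word."""
    return {w.strip(string.punctuation) for w in text.lower().split()}

def blocked(upload: str, threshold: float = 0.5) -> bool:
    """Block the upload if word overlap with a registered work exceeds the threshold."""
    words = tokens(upload)
    for work in REGISTERED_WORKS:
        work_words = tokens(work)
        if len(words & work_words) / len(work_words) >= threshold:
            return True
    return False

# A verbatim copy is blocked, as intended...
assert blocked("We're no strangers to love, you know the rules and so do I")
# ...but so is a review quoting one line, which is lawful in many countries.
assert blocked("My review: 'you know the rules and so do i' is a classic line")
```

Raising the threshold reduces such false positives but lets more real copies through; a filter facing the liability in the bullet above has every incentive to keep it low and over-block.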

Article 11: The “link tax”

The final version of this extra copyright for news sites closely resembles the version that already failed in Germany — only this time not limited to search engines and news aggregators, meaning it will do damage to a lot more websites.

  • Reproducing more than “single words or very short extracts” of news stories will require a licence. That will likely cover many of the snippets commonly shown alongside links today in order to give you an idea of what they lead to. We will have to wait and see how courts interpret what “very short” means in practice — until then, hyperlinking (with snippets) will be mired in legal uncertainty.

  • No exceptions are made even for services run by individuals, small companies or non-profits, which probably includes any monetised blogs or websites.

Other provisions

The project to allow Europeans to conduct Text and Data Mining, crucial for modern research and the development of artificial intelligence, has been obstructed with too many caveats and requirements. Rightholders can opt out of having their works datamined by anyone except research organisations.

Authors’ rights: The Parliament’s proposal that authors should have a right to proportionate remuneration has been severely watered down: Total buy-out contracts will continue to be the norm.

Minor improvements for access to cultural heritage: Libraries will be able to publish out-of-commerce works online and museums will no longer be able to claim copyright on photographs of centuries-old paintings.

How we got here

Former Digital Commissioner Oettinger proposed the law.

The history of this law is a shameful one. From the very beginning, the purpose of Articles 11 and 13 was never to solve clearly-defined issues in copyright law with well-assessed measures, but to serve powerful special interests, with hardly any concern for the collateral damage caused.

In the relentless pursuit of this goal, concerns by independent academics, fundamental rights defenders, independent publishers, startups and many others were ignored. At times, confusion was spread about crystal-clear contrary evidence. Parliament negotiator Axel Voss defamed the unprecedented protest of millions of internet users as “built on lies”.

In his conservative EPP group, the driving force behind this law, dissenters were marginalised. The work of their initially-appointed representative was thrown out after the conclusions she reached were too sensible. Mr Voss then voted so blindly in favour of any and all restrictive measures that he was caught by surprise by some of the nonsense he had gotten approved. His party, the German CDU/CSU, nonchalantly violated the coalition agreement they had signed (which rejected upload filters), paying no mind to their own minister for digital issues.

It took efforts equally herculean and sisyphean across party lines to prevent the text from turning out even worse than it now is.

In the end, a closed-door horse trade between France and Germany was enough to outweigh the objections… so far.

What’s important to note, though: it’s not “the EU” in general that is to blame, but those currently in considerable power who put special interests above fundamental rights. You can change that at the polls! The anti-EU far right is trying to seize this opportunity to promote their narrow-minded nationalist agenda, when in fact, without the persistent support of the far-right ENF Group (dominated by the Rassemblement/Front National), the law could have been stopped in the crucial Legal Affairs Committee, and in general would not be as extreme as it is today.

We can still stop this law

Our best chance to stop the EU copyright law: The upcoming Parliament vote.

The Parliament and Council negotiators who agreed on the final text now return to their institutions seeking approval of the result. If it passes both votes unchanged, it becomes EU law, which member states are forced to implement into national law.

In both bodies, there is resistance.

The Parliament’s process starts with the approval by the Legal Affairs Committee — which is likely to be given on Monday, February 18.

Next, at a date to be announced, the EU member state governments will vote in the Council. The law can be stopped here either by 13 member state governments or by any number of governments who together represent 35% of the EU population. Last time, 8 countries representing 27% of the population were opposed. Either a large country like Germany or several small ones would need to change their minds: this is the less likely way to stop it.
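For readers who want to check the arithmetic, the blocking conditions summarised above can be sketched in a few lines. The function name and the roughly 16% population figure for Germany are illustrative assumptions, and the full qualified-majority rules add further conditions omitted here:

```python
# Blocking arithmetic in the Council, as summarised above: opponents stop
# the law if they number at least 13 of the 28 member state governments,
# or if they together represent at least 35% of the EU population.

def blocking_minority(n_opposed: int, population_share: float) -> bool:
    """Return True if the opposing governments can block the proposal."""
    return n_opposed >= 13 or population_share >= 0.35

# Last time: 8 countries with 27% of the population -- not enough.
assert not blocking_minority(8, 0.27)

# Germany alone holds roughly 16% of the EU population, so the previous
# 8 opponents plus Germany (27% + 16% = 43%) would cross the threshold.
assert blocking_minority(9, 0.27 + 0.16)
```

This is why a single large country switching sides matters far more here than several small ones.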

Our best bet: the final vote in the plenary of the European Parliament, when all 751 MEPs, directly elected to represent the people, have a vote. This will take place either between March 25 and 28, on April 4, or between April 15 and 18. We’ve already demonstrated last July that a majority against a bad copyright proposal is achievable.

The plenary can vote to kill the bill — or to make changes, like removing Articles 11 and 13. In the latter case, it’s up to the Council to decide whether to accept these changes (the Directive then becomes law without these articles) or to shelve the project until after the EU elections in May, which will reshuffle all the cards.

This is where you come in

The final Parliament vote will happen mere weeks before the EU elections. Most MEPs — and certainly all parties — are going to be seeking reelection. Articles 11 and 13 will be defeated if enough voters make these issues relevant to the campaigns. (Here’s how to vote in the EU elections — change the language to one of your country’s official ones for specific information.)

It is up to you to make clear to your representatives: Their vote on whether to break the internet with Articles 11 and 13 will make or break your vote in the EU elections. Be insistent — but please always stay polite.

Together, we can still stop this law.

Read more gcnews.htm at MelonFarmers.co.uk

It will be an offence to view terrorist material online just once — and could incur a prison sentence of up to 15 years — under a new UK law. The Counter-Terrorism and Border Security Bill has just been granted Royal Assent, updating a previous Act and bringing new powers to law enforcement to tackle terrorism.

But a controversial inclusion was to update the offence of obtaining information likely to be useful to a person committing or preparing an act of terrorism so that it now covers viewing or streaming content online.

Originally, the proposal had been to make it an offence for someone to view material three or more times — but the three strikes idea has been dropped from the final Act.

The government said that the existing laws didn’t capture the nuance in changing methods for distribution and consumption of terrorist content — and so added a new clause into the 2019 Act making it an offence to view (or otherwise access) any terrorist material online. This means that, technically, anyone who clicked on a link to such material could be caught by the law.

Read more latest.htm at MelonFarmers.co.uk

Lords of Chaos is a 2018 UK/Sweden thriller by Jonas Åkerlund, starring Rory Culkin, Emory Cohen and Sky Ferreira.

A teenager’s quest to launch Norwegian Black Metal in Oslo in the 1980s: members of the Norwegian death metal band perform a series of increasingly shocking publicity stunts, leading to a very violent outcome.

It is based on the real-life band Mayhem, and includes scenes of murder, including the brutal killing of a homosexual man, and the burning of churches by satanists.

The latest ‘most controversial film ever’ has been passed 18 uncut by the BBFC for strong bloody violence, gore and suicide. According to the Telegraph, the BBFC are understood to have been so concerned about the film that it was reviewed at the highest levels, and suicide prevention experts were consulted before it was approved for an 18 certificate.

The Telegraph suggests the US film censors at the MPAA were similarly concerned before rating it R for strong brutal violence, disturbing behavior, grisly images, strong sexuality, nudity, and pervasive language.

The BBFC said the film did not glamorise self-harm and that there was no reason to think the film would have a damaging effect on adults who chose to view it – although some might find it distressing.

Church groups have, however, called for it to be banned. Speaking to The Telegraph, Simon Calvert, deputy director of The Christian Institute, said he was surprised the film had not been banned, given the recent discussion about self-harm. He said:

In the current climate of concern over self-harm and suicide, you would have thought there might have been more consideration of the risk that vulnerable people might imitate what they see. The distributors ought to be asking themselves if it is worth this risk.

The film is being distributed in the United Kingdom by Arrow Films and will be released in cinemas on 29th March.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

There is every reason to believe that the government and opposition are moving towards a consensus on introducing a duty of care for social media companies to reduce harm and risk to their users. This may be backed by an Internet regulator, which might decide what kind of mitigating actions are appropriate to address the risks to users on different platforms.

This idea originated from a series of papers by Will Perrin and Lorna Woods and has been mentioned most recently in a recent Science and Technology committee report and by NGOs including children’s charity 5Rights.

A duty of care has some obvious merits: it could be based on objective risks, based on evidence, and ensure that mitigations are proportionate to those risks. It could take some of the politicisation out of the current debate.

However, it also has obvious problems. For a start, it focuses on risk rather than process. It moves attention away from the fact that interventions are regulating social media users just as much as platforms. It does not by itself tell us that free expression impacts will be considered, tracked or mitigated.

Furthermore, the lack of focus that a duty of care model gives to process means that platform decisions which have nothing to do with risky content are not necessarily improved by better decision-making, independent appeals and so on. Rather, as has happened with German regulation, processes can remain unaffected when they fall outside a duty of care.

In practice, a lot of content which is disturbing or offensive is already banned on online platforms. Much of this would not be in scope under a duty of care, but it is precisely these kinds of material that users often complain about, when they are either not removed when users want them gone, or are removed incorrectly. Any model of social media regulation needs to improve on these issues, but a duty of care is unlikely to touch these problems.

There are very many questions about the kinds of risk, whether to individuals in general, vulnerable groups, or society at large, and about the evidence required to trigger action. The truth is that a duty of care, if cast sensibly and narrowly, will not satisfy many of the people who are demanding action; equally, if the threshold to act is low, then it will quickly be seen to be a mechanism for wide-scale Internet censorship.

It is also a simple fact that many decisions that platforms make about legal content which is not risky are not the business of government to regulate. This includes decisions about what legal content is promoted and why. For this reason, we believe that a better approach might be to require independent self-regulation of major platforms across all of their content decisions. This requirement could be a legislative one, but the regulator would need to be independent of government and platforms.

Independent self-regulation has not been truly tried. Instead, voluntary agreements have filled its place. We should be cautious about moving straight to government regulation of social media and social media users. The government refuses to regulate the press in this way because it doesn’t wish to be seen to be controlling print media. It is pretty curious that neither the media nor the government are spelling out the risks of state regulation of the speech of millions of British citizens.

That we are in this place is of course largely the fault of the social media platforms themselves, who have failed to understand the need and value of transparent and accountable systems to ensure they are acting properly. That, however, just demonstrates the problem: politically weak platforms who have created monopoly positions based on data silos are now being sliced and diced at the policy table for their wider errors. It’s imperative that as these government proposals progress we keep focus on the simple fact that it is end users whose speech will ultimately be regulated.

Read more eu.htm at MelonFarmers.co.uk

The Council of the EU has just adopted as its common position the deal struck by France and Germany on the controversial EU Copyright Directive that was leaked earlier this week.

While Italy, Poland, the Netherlands, Sweden, Finland and Luxembourg maintained their opposition to the text and were newly joined by Malta and Slovakia, Germany’s support of the “compromise” secretly negotiated with France over the last weeks has broken the previous deadlock.

This new Council position is actually more extreme than previous versions, requiring all platforms older than 3 years to automatically censor all their users’ uploads, and putting unreasonable burdens even on the newest companies.

The German Conservative–Social Democrat government is now in blatant violation of its own coalition agreement, which rejects upload filters against copyright infringement as disproportionate. This breach of coalition promises will not go down well with many young voters just ahead of the European elections in May. Meanwhile, prominent members of both German government parties have joined the protests against upload filters.

The deal in Council paves the way for a final round of negotiations with the Parliament over the course of next week, before the entire European Parliament and the Council vote on the final agreement. It is now up to you to contact your MEPs, call their offices in their constituencies and visit as many of their election campaign events as you can! Ask them to reject a copyright deal that will violate your rights to share legal creations like parodies and reviews online, and includes measures like the link tax that will limit your access to the news and drive small online newspapers out of business.

Right before the European elections, your voices cannot be ignored! Join the over 4.6 million signatories to the largest European petition ever and tell your representatives: If you break the Internet and accept Article 13, we won’t reelect you!

Read more eu.htm at MelonFarmers.co.uk

Governments around the world are grappling with the threat of terrorism, but their efforts aimed at curbing the dissemination of terrorist content online all too often result in censorship. Over the past five years, we’ve seen a number of governments, from the US Congress to the government of France and now the European Commission (EC), seek to implement measures that place an undue burden on technology companies to remove terrorist speech or face financial liability.

This is why EFF has joined forces with dozens of organizations to call on members of the European Parliament to oppose the EC’s proposed regulation, which would require companies to take down terrorist content within one hour. We’ve added our voice to two letters, one from Witness and another organized by the Center for Democracy and Technology, asking that MEPs consider the serious consequences that the passing of this regulation could have on human rights defenders and on freedom of expression.

We share the concerns of dozens of allies that requiring the use of proactive measures such as use of the terrorism hash database (already voluntarily in use by a number of companies) will restrict expression and have a disproportionate impact on marginalized groups. We know from years of experience that filters just don’t work.
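One reason such filters fail can be shown even in a minimal sketch (the bytes below are invented for illustration). The simplest form of a shared hash database matches exact cryptographic digests, so a one-byte change to the content defeats the match entirely; real deployments use perceptual hashes, which tolerate small changes but trade that robustness for false positives:

```python
import hashlib

# Exact-hash matching sketch: changing a single byte of the content
# yields a completely different SHA-256 digest, so a trivially altered
# copy sails past the filter unmatched.

def sha256(data: bytes) -> str:
    """Hex digest of the SHA-256 hash of some content."""
    return hashlib.sha256(data).hexdigest()

banned_hashes = {sha256(b"example banned video bytes")}

original = b"example banned video bytes"
altered = b"example banned video bytes."  # one byte appended

assert sha256(original) in banned_hashes      # the exact copy is caught
assert sha256(altered) not in banned_hashes   # the altered copy evades the filter
```

Either way the filter errs: exact matching misses trivially re-encoded uploads, while fuzzier matching sweeps up lawful documentation and counter-speech, which is the disproportionate impact the letters warn about.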

Furthermore, the proposed requirement that companies must respond to reports of terrorist speech within an hour is, to put it bluntly, absurd. As the letter organized by Witness states, this regulation essentially forces companies to bypass due process and make rapid and unaccountable decisions on expression through automated means, and moreover does not reflect the realities of how violent groups recruit and share information online.

We echo these and other calls from defenders of human rights and civil liberties for MEPs to reject proactive filtering obligations and to refrain from enacting laws that will have unintended consequences for freedom of expression.