Archive for the ‘Internet’ Category

Read more uk_internet_censors.htm at MelonFarmers.co.uk

An informal group of MPs, the All Party Parliamentary Group on Social Media and Young People’s Mental Health and Wellbeing, has published a report calling for the establishment of an internet censor. The report claims:

  • 80% of the UK public believe tighter regulation is needed to address the impact of social media on the health and wellbeing of young people.
  • 63% of young people reported social media to be a good source of health information.
  • However, children who spend more than three hours a day using social media are twice as likely to display symptoms of mental ill health.
  • Pressure to conform to beauty standards perpetuated and praised online can encourage harmful behaviours to achieve “results”, including body shame and disordered eating, with 46% of girls, compared to 38% of all young people, reporting that social media has had a negative impact on their self-esteem.

The report, titled #NewFilters to manage the impact of social media on young people’s mental health and wellbeing, puts forward a number of policy recommendations, including:

  • Establish a duty of care on all social media companies with registered UK users aged 24 and under in the form of a statutory code of conduct, with Ofcom to act as regulator.
  • Create a Social Media Health Alliance, funded by a 0.5% levy on the profits of social media companies, to fund research, educational initiatives and establish clearer guidance for the public.
  • Review whether the “addictive” nature of social media is sufficient for official disease classification.
  • Urgently commission robust, longitudinal research into understanding the extent to which the impact of social media on young people’s mental health and wellbeing is one of cause or correlation.

Chris Elmore MP, Chair of the APPG on Social Media and Young People’s Mental Health and Wellbeing, said:

“I truly think our report is the wake-up call needed to ensure – finally – that meaningful action is taken to lessen the negative impact social media is having on young people’s mental health.

For far too long social media companies have been allowed to operate in an online Wild West. And it is in this lawless landscape that our children currently work and play online. This cannot continue. As the report makes clear, now is the time for the government to take action.

The recommendations from our Inquiry are both sensible and reasonable; they would make a huge difference to the current mental health crisis among our young people.

I hope to work constructively with the UK Government in the coming weeks and months to ensure we see real changes to tackle the issues highlighted in the report at the earliest opportunity.”

Read more bw.htm at MelonFarmers.co.uk

The BBFC has launched an innovative new industry collaboration with Netflix to move towards classifying all content on the service using BBFC age ratings.

Netflix will produce BBFC age ratings for content using a manual tagging system along with an automated rating algorithm, with the BBFC taking up an auditing role. Netflix and the BBFC will work together to make sure Netflix’s classification process produces ratings which are consistent with the BBFC’s Classification Guidelines for the UK.

It comes as new research by the British Board of Film Classification (BBFC) and the Video Standards Council Rating Board (VSC) has revealed that almost 80% of parents are concerned about children seeing inappropriate content on video on demand or online games platforms.

The BBFC and the VSC have joined forces to respond to calls from parents and are publishing a joint set of Best Practice Guidelines to help online services deliver what UK consumers want.

The Best Practice Guidelines will help online platforms work towards greater and more consistent use of trusted age ratings online. The move is supported by the Department for Digital, Culture, Media and Sport as part of the Government’s strategy to make the UK the safest place to be online.

This includes recommending consistent and more comprehensive use of BBFC age labelling symbols across all Video On Demand (VOD) services, and PEGI symbols across online games services, along with additional ratings info and the mapping of parental controls to BBFC age ratings and PEGI ratings.

The voluntary Guidelines are aimed at VOD services offering video content to UK consumers via subscription, purchase and rental, but exclude pure catch-up TV services like iPlayer, ITV Hub, All4, My 5 and UKTV Player.

The research also shows that 90% of parents believe that it is important to display age ratings when downloading or streaming a film online, and 92% of parents think it’s important for video on demand platforms to show the same type of age ratings they would expect at the cinema or on DVD and Blu-ray. This is confirmed by 94% of parents saying it’s important to have consistent ratings across all video on demand platforms, rather than a variety of bespoke ratings systems.

With nine in 10 (94%) parents believing it is important to have consistent ratings across all online game platforms rather than a variety of bespoke systems, the VSC is encouraging services to join the likes of Microsoft, Sony PlayStation, Nintendo and Google in providing consumers with the nationally recognised PEGI ratings on games – bringing consistency between the offline and online worlds.

The Video Recordings Act requires that the majority of video works and video games released on physical media must be classified by the BBFC or the VSC prior to release. While there is no equivalent legal requirement that online releases must be classified, the BBFC has been working with VOD services since 2008, and the VSC has been working with online games platforms since 2003. The Best Practice Guidelines aim to build on the good work that is already happening, and both authorities are now calling for the online industry to work with them in 2019 and beyond to better protect children.

David Austin, Chief Executive of the BBFC, said:

Our research clearly shows a desire from the public to see the same trusted ratings they expect at the cinema, on DVD and on Blu-ray when they choose to watch material online. We know that it’s not just parents who want age ratings, teenagers want them too. We want to work with the industry to ensure that families are able to make the right decisions for them when watching content online.

Ian Rice, Director General of the VSC, said:

We have always believed that consumers wanted a clear, consistent and readily recognisable rating system for online video games and this research has certainly confirmed that view. While the vast majority of online game providers are compliant and apply PEGI ratings to their product, it is clear that more can be done to help consumers make an informed purchasing decision. To this end, the best practice recommendations will certainly make a valuable contribution in achieving this aim.

Digital Minister Margot James said:

Our ambition is for the UK to be the safest place to be online, which means having age ratings parents know and trust applied to all online films and video games. I welcome the innovative collaboration announced today by Netflix and the BBFC, but more needs to be done.

It is important that more of the industry takes this opportunity for voluntary action, and I encourage all video on demand and games platforms to adopt the new best practice standards set out by the BBFC and Video Standards Council.

The BBFC is looking at innovative ways to open up access to its classifications to ensure that more online video content goes live with a trusted age rating. Today the BBFC and Netflix announce a year-long self-ratings pilot which will see the online streaming service move towards in-house classification using BBFC age ratings, under licence.

Netflix will use an algorithm to apply BBFC Guideline standards to their own content, with the BBFC setting those standards and auditing ratings to ensure consistency. The goal is to work towards 100% coverage of BBFC age ratings across the platform.
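Neither Netflix nor the BBFC has published the details of the tagging system or the rating algorithm, so any implementation shown here is speculative. As a minimal sketch of the general idea (content tags mapped to the most restrictive BBFC category they imply), assuming invented tag names and invented tag-to-category rules:

    # Purely illustrative sketch: neither Netflix's tagging scheme nor the
    # BBFC's internal standards are public, so the tag names and the
    # tag-to-category rules below are invented for illustration.

    BBFC_ORDER = ["U", "PG", "12", "15", "18"]

    # Hypothetical mapping from manually applied content tags to the minimum
    # BBFC category each tag would imply.
    TAG_MINIMUM = {
        "mild_language": "PG",
        "moderate_violence": "12",
        "strong_language": "15",
        "strong_violence": "15",
        "explicit_content": "18",
    }

    def classify(tags):
        """Return the most restrictive BBFC category implied by the tags."""
        rating = "U"
        for tag in tags:
            candidate = TAG_MINIMUM.get(tag, "U")
            if BBFC_ORDER.index(candidate) > BBFC_ORDER.index(rating):
                rating = candidate
        return rating

    print(classify(["mild_language"]))                          # PG
    print(classify(["moderate_violence", "strong_language"]))   # 15

The auditing role described above would then amount to the BBFC spot-checking the output of such a system against its own Classification Guidelines and correcting the rules where the two diverge.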

Mike Hastings, Director of Editorial Creative at Netflix, said:

The BBFC is a trusted resource in the UK for providing classification information to parents and consumers and we are excited to expand our partnership with them. Our work with the BBFC allows us to ensure our members always press play on content that is right for them and their families.

David Austin added:

We are fully committed to helping families choose content that is right for them, and this partnership with Netflix will help us in our goal to do just that. By partnering with the biggest streaming service, we hope that others will follow Netflix’s lead and provide comprehensive, trusted, well understood age ratings and ratings info, consistent with film and DVD, on their UK platforms. The partnership shows how the industry is working with us to find new and innovative ways to deliver 100% age ratings for families.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

The House of Lords Communications Committee has called for a new, overarching censorship framework so that services in the digital world are held accountable to an enforceable set of government rules. The Lords Communications Committee writes:

Background

In its report ‘Regulating in a digital world’ the committee notes that over a dozen UK regulators have a remit covering the digital world but there is no body which has complete oversight. As a result, regulation of the digital environment is fragmented, with gaps and overlaps. Big tech companies have failed to adequately tackle online harms.

Responses to growing public concern have been piecemeal and inadequate. The Committee recommends a new Digital Authority, guided by 10 principles to inform regulation of the digital world.

Chairman’s Comments

The chairman of the committee, Lord Gilbert of Panteg, said:

“The Government should not just be responding to news headlines but looking ahead so that the services that constitute the digital world can be held accountable to an agreed set of principles.

Self-regulation by online platforms is clearly failing. The current regulatory framework is out of date. The evidence we heard made a compelling and urgent case for a new approach to regulation. Without intervention, the largest tech companies are likely to gain ever more control of technologies which extract personal data and make decisions affecting people’s lives. Our proposals will ensure that rights are protected online as they are offline while keeping the internet open to innovation and creativity, with a new culture of ethical behaviour embedded in the design of services.”

Recommendations for a new regulatory approach

Digital Authority

A new ‘Digital Authority’ should be established to co-ordinate regulators, continually assess regulation and make recommendations on which additional powers are necessary to fill gaps. The Digital Authority should play a key role in providing the public, the Government and Parliament with the latest information. It should report to a new joint committee of both Houses of Parliament, whose remit would be to consider all matters related to the digital world.

10 principles for regulation

The 10 principles identified in the committee’s report should guide all regulation of the internet. They include accountability, transparency, respect for privacy and freedom of expression. The principles will help the industry, regulators, the Government and users work towards a common goal of making the internet a better, more respectful environment which is beneficial to all. If rights are infringed, those responsible should be held accountable in a fair and transparent way.

Recommendations for specific action

Online harms and a duty of care

  • A duty of care should be imposed on online services which host and curate content which can openly be uploaded and accessed by the public. Given the urgent need to address online harms, Ofcom’s remit should expand to include responsibility for enforcing the duty of care.

  • Online platforms should make community standards clearer through a new classification framework akin to that of the British Board of Film Classification. Major platforms should invest in more effective moderation systems to uphold their community standards.

Ethical technology

  • Users should have greater control over the collection of personal data. Maximum privacy and safety settings should be the default.

  • Data controllers and data processors should be required to publish an annual data transparency statement detailing which forms of behavioural data they generate or purchase from third parties, how they are stored, for how long, and how they are used and transferred.

  • The Government should empower the Information Commissioner’s Office to conduct impact-based audits where risks associated with using algorithms are greatest. Businesses should be required to explain how they use personal data and what their algorithms do.

Market concentration

  • The modern internet is characterised by the concentration of market power in a small number of companies which operate online platforms. Greater use of data portability might help, but this will require more interoperability.

  • The Government should consider creating a public-interest test for data-driven mergers and acquisitions.

  • Regulation should recognise the inherent power of intermediaries.

Read more pc_news.htm at MelonFarmers.co.uk

A chef has criticised Instagram after it decided that a photograph she posted of two pigs’ trotters and a pair of ears needed to be protected from ‘sensitive’ readers. Olia Hercules, a writer and chef who regularly appears on Saturday Kitchen and Sunday Brunch, shared the photo alongside a caption in which she praised the quality and affordability of the ears and trotters before asking why the cuts had fallen out of favour with people in the UK.

However, Hercules later discovered that the image had been censored by the photo-sharing app with a warning that read: “Sensitive content. This photo contains sensitive content which some people may find offensive or disturbing.”

Hercules hit back at the decision on Twitter, condemning Instagram and the general public for becoming detached from reality.

Read more eu.htm at MelonFarmers.co.uk

Members of the European Parliament are considering a proposal for the censorship of terrorist internet content issued by the European Commission last September. The IMCO Committee (Internal Market and Consumer Protection) has just published its initial opinion on the proposal.

laquadrature.net reports:

Judicial Review

The idea is that the government of any European Member State will be able to order any website to remove content considered “terrorist”. No independent judicial authorisation will be needed to do so, letting governments abuse the wide definition of “terrorism”. The only thing the IMCO Committee agreed to add is that governments’ orders be subject to “judicial review”, which can mean anything.

In France, the government’s orders to remove “terrorist content” are already subject to “judicial review”: an independent body is notified of all removal orders and may ask judges to assess them. This has not been of much help: only once has this censorship been submitted to a judge’s review. It was found to be unlawful, but more than a year and a half after it was ordered. During this time, the French government was able to abusively censor content, in this case far-left publications by two French Indymedia outlets.

Far from simplifying matters, this Regulation will add confusion, as authorities from one member state will be able to order removals in another, without necessarily understanding the context.

Unrealistic removal delays

Regarding the one-hour deadline within which the police can order a hosting service provider to block any content reported as “terrorist”, there was no real progress either. It has been replaced by a deadline of at least eight hours, with a small exception for “microenterprises” that have not previously been subject to a removal order (in this case, the “deadline shall be no sooner than the end of the next working day”).

This narrow exception will not help the vast majority of Internet actors comply with such a strict deadline. Even though the IMCO Committee has removed any mention of proactive measures that can be imposed on Internet actors, and has stated that “automated content filters” shall not be used by hosting service providers, this very tight deadline and the threat of heavy fines will only push them to adopt the moderation tools developed by the Web’s juggernauts (Facebook and Google) and to use the broadest possible definition of terrorism to avoid the risk of penalties. The impossible obligation to provide a point of contact reachable 24/7 has not been modified either. The IMCO opinion has even worsened the financial penalties that can be imposed: they are now “at least” 1% and up to 4% of the hosting service provider’s turnover.
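To put those figures in concrete terms, here is a minimal sketch of the deadline and penalty ranges as described in the IMCO opinion; the €2 million turnover used in the example is hypothetical, and the “end of the next working day” is approximated as 24 hours.

    # Illustrative only: deadlines and fines as described in the IMCO opinion,
    # applied to a hypothetical hosting provider.

    def removal_deadline_hours(is_microenterprise, previously_ordered):
        """At least eight hours, except for microenterprises not previously
        subject to a removal order (end of the next working day, approximated
        here as 24 hours)."""
        if is_microenterprise and not previously_ordered:
            return 24
        return 8

    def fine_range(annual_turnover_eur):
        """'At least' 1% and up to 4% of turnover."""
        return 0.01 * annual_turnover_eur, 0.04 * annual_turnover_eur

    print(removal_deadline_hours(is_microenterprise=False, previously_ordered=False))  # 8
    low, high = fine_range(2_000_000)  # hypothetical EUR 2m annual turnover
    print(f"fine between EUR {low:,.0f} and EUR {high:,.0f}")  # EUR 20,000 and EUR 80,000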

Next steps

The next step will be on 11 March, when the CULT Committee (Culture and Education) will adopt its opinion.

The last real opportunity to obtain the rejection of this dangerous text will be on 21 March 2019, in the LIBE Committee (Civil Liberties, Justice and Home Affairs). European citizens must contact their MEPs to demand this rejection. We have provided a dedicated page on our website with an analysis of this Regulation and a tool to directly contact the MEPs in charge.

Starting today, and for the weeks to come, call your MEPs and demand that they reject this text.

Read more eu.htm at MelonFarmers.co.uk

Last week’s publication of the final draft of the new EU Copyright Directive baffled and infuriated almost everyone, including the massive entertainment companies that lobbied for it in the first place; the artists’ groups who endorsed it only to have their interests stripped out of the final document; and the millions and millions of Europeans who had publicly called on lawmakers to fix grave deficiencies in the earlier drafts, only to find these deficiencies made even worse.

Take Action: Stop Article 13

Thankfully, Europeans aren’t taking this lying down. With the final vote expected to come during the March 25-28 session, mere weeks before European elections, European activists are pouring the pressure onto their Members of the European Parliament (MEPs), letting them know that their vote on this dreadful mess will be on everyone’s mind during the election campaigns.

The epicenter of the uprising is Germany, which is only fitting, given that German MEP Axel Voss is almost singlehandedly responsible for poisoning the Directive with rules that will lead to mass surveillance and mass censorship, not to mention undermining much of Europe’s tech sector.

The German Consumer Association was swift to condemn the Directive, stating: “The reform of copyright law in this form does not benefit anyone, let alone consumers. MEPs are now obliged to act. Since the outcome of the trilogue falls short of the EU Parliament’s positions at key points, they should refuse to give their consent.”

A viral video of Axel Voss being confronted by activists has been picked up by politicians campaigning against Voss’s Christian Democratic Party in the upcoming elections, spreading to Germany’s top TV personalities, like Jan Böhmermann.

Things are just getting started. On Saturday, with just two days of organizing, hundreds of Europeans marched on the streets of Cologne against Article 13. A day of action (March 23, just before the first possible voting date for MEPs) is being planned, with EU-wide events.

In the meantime, the petition to save Europe from the Directive, already the largest in EU history, keeps racking up more signatures, and is on track to be the largest petition in the history of the world.

Take Action: Stop Article 13

Read more eu.htm at MelonFarmers.co.uk

On the evening of February 13, negotiators from the European Parliament and the Council concluded the trilogue negotiations with a final text for the new EU Copyright Directive.

For two years we’ve debated different drafts and versions of the controversial Articles 11 and 13. Now, there is no more ambiguity: This law will fundamentally change the internet as we know it — if it is adopted in the upcoming final vote. But we can still prevent that!

Please click the links to take a look at the final wording of Article 11 and Article 13. Here’s my summary:

Article 13: Upload filters

Parliament negotiator Axel Voss accepted the deal between France and Germany I laid out in a recent blog post:

  • Commercial sites and apps where users can post material must make “best efforts” to preemptively buy licences for anything that users may possibly upload — that is: all copyrighted content in the world. An impossible feat.

  • In addition, all but very few sites (those both tiny and very new) will need to do everything in their power to prevent anything from ever going online that may be an unauthorised copy of a work that a rightsholder has registered with the platform. They will have no choice but to deploy upload filters, which are by their nature both expensive and error-prone.

  • Should a court ever find their licensing or filtering efforts not fierce enough, sites are directly liable for infringements as if they had committed them themselves. This massive threat will lead platforms to over-comply with these rules to stay on the safe side, further worsening the impact on our freedom of speech.

Article 11: The “link tax”

The final version of this extra copyright for news sites closely resembles the version that already failed in Germany — only this time not limited to search engines and news aggregators, meaning it will do damage to a lot more websites.

  • Reproducing more than “single words or very short extracts” of news stories will require a licence. That will likely cover many of the snippets commonly shown alongside links today in order to give you an idea of what they lead to. We will have to wait and see how courts interpret what “very short” means in practice — until then, hyperlinking (with snippets) will be mired in legal uncertainty.

  • No exceptions are made even for services run by individuals, small companies or non-profits, which probably includes any monetised blogs or websites.

Other provisions

The project to allow Europeans to conduct Text and Data Mining, crucial for modern research and the development of artificial intelligence, has been obstructed with too many caveats and requirements. Rightholders can opt out of having their works datamined by anyone except research organisations.

Authors’ rights: The Parliament’s proposal that authors should have a right to proportionate remuneration has been severely watered down: Total buy-out contracts will continue to be the norm.

Minor improvements for access to cultural heritage: Libraries will be able to publish out-of-commerce works online, and museums will no longer be able to claim copyright on photographs of centuries-old paintings.

How we got here

Former digital Commissioner Oettinger proposed the law.

The history of this law is a shameful one. From the very beginning, the purpose of Articles 11 and 13 was never to solve clearly-defined issues in copyright law with well-assessed measures, but to serve powerful special interests, with hardly any concern for the collateral damage caused.

In the relentless pursuit of this goal, concerns by independent academics, fundamental rights defenders, independent publishers, startups and many others were ignored. At times, confusion was spread about crystal-clear contrary evidence. Parliament negotiator Axel Voss defamed the unprecedented protest of millions of internet users as “built on lies”.

In his conservative EPP group, the driving force behind this law, dissenters were marginalised. The work of their initially-appointed representative was thrown out after the conclusions she reached were too sensible. Mr Voss then voted so blindly in favour of any and all restrictive measures that he was caught by surprise by some of the nonsense he had gotten approved. His party, the German CDU/CSU, nonchalantly violated the coalition agreement they had signed (which rejected upload filters), paying no mind to their own minister for digital issues.

It took efforts equally herculean and sisyphean across party lines to prevent the text from turning out even worse than it now is.

In the end, a closed-door horse trade between France and Germany was enough to outweigh the objections… so far.

What’s important to note, though: it’s not “the EU” in general that is to blame, but those who put special interests above fundamental rights and who currently hold considerable power. You can change that at the polls! The anti-EU far right is trying to seize this opportunity to promote their narrow-minded nationalist agenda, when in fact, without the persistent support of the far-right ENF Group (dominated by the Rassemblement/Front National), the law could have been stopped in the crucial Legal Affairs Committee and in general would not be as extreme as it is today.

We can still stop this law

Our best chance to stop the EU copyright law: The upcoming Parliament vote.

The Parliament and Council negotiators who agreed on the final text now return to their institutions seeking approval of the result. If it passes both votes unchanged, it becomes EU law, which member states are forced to implement into national law.

In both bodies, there is resistance.

The Parliament’s process starts with the approval by the Legal Affairs Committee — which is likely to be given on Monday, February 18.

Next, at a date to be announced, the EU member state governments will vote in the Council. The law can be stopped here either by 13 member state governments or by any number of governments who together represent 35% of the EU population (calculator). Last time, 8 countries representing 27% of the population were opposed. Either a large country like Germany or several small ones would need to change their minds: this is the less likely way to stop it.
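As a rough illustration of the two blocking routes just described, here is a minimal sketch, assuming placeholder population shares rather than real figures:

    # Illustrative sketch of the blocking arithmetic described above: the law
    # is stopped in the Council either by 13 member state governments opposing
    # it, or by opposing governments that together represent 35% of the EU
    # population. Population shares below are placeholders, not real figures.

    def can_block(opposing_population_shares):
        """Each element is one opposing state's share of the EU population,
        expressed as a fraction (e.g. 0.18 for roughly 18%)."""
        enough_states = len(opposing_population_shares) >= 13
        enough_population = sum(opposing_population_shares) >= 0.35
        return enough_states or enough_population

    # Last time: 8 countries with ~27% of the population between them -- not enough.
    print(can_block([0.27 / 8] * 8))           # False
    # Add one large country (~18% of the population) and the 35% bar is cleared.
    print(can_block([0.27 / 8] * 8 + [0.18]))  # True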

Our best bet: the final vote in the plenary of the European Parliament, when all 751 MEPs, directly elected to represent the people, have a vote. This will take place either between March 25 and 28, on April 4, or between April 15 and 18. We already demonstrated last July that a majority against a bad copyright proposal is achievable.

The plenary can vote to kill the bill — or to make changes, like removing Articles 11 and 13. In the latter case, it’s up to the Council to decide whether to accept these changes (the Directive then becomes law without these articles) or to shelve the project until after the EU elections in May, which will reshuffle all the cards.

This is where you come in

The final Parliament vote will happen mere weeks before the EU elections. Most MEPs — and certainly all parties — are going to be seeking reelection. Articles 11 and 13 will be defeated if enough voters make these issues relevant to the campaigns. (Here’s how to vote in the EU elections — change the language to one of your country’s official ones for specific information.)

It is up to you to make clear to your representatives: Their vote on whether to break the internet with Articles 11 and 13 will make or break your vote in the EU elections. Be insistent — but please always stay polite.

Together, we can still stop this law.