Archive for the ‘Internet’ Category

Read more me_internet.htm at MelonFarmers.co.uk

webfoundation logo Speaking at the Web Summit conference in Lisbon, Tim Berners-Lee, inventor of the World Wide Web, has launched a campaign to persuade governments, companies and individuals to sign a Contract for the Web, a set of principles intended to defend a free and open internet.

Contract for the Web CORE PRINCIPLES

The web was designed to bring people together and make knowledge freely available. Everyone has a role to play to ensure the web serves humanity. By committing to the following principles, governments, companies and citizens around the world can help protect the open web as a public good and a basic right for everyone.

GOVERNMENTS WILL

  • Ensure everyone can connect to the internet so that anyone, no matter who they are or where they live, can participate actively online.
  • Keep all of the internet available, all of the time so that no one is denied their right to full internet access.
  • Respect people’s fundamental right to privacy so everyone can use the internet freely, safely and without fear.

COMPANIES WILL

  • Make the internet affordable and accessible to everyone so that no one is excluded from using and shaping the web.
  • Respect consumers’ privacy and personal data so people are in control of their lives online.
  • Develop technologies that support the best in humanity and challenge the worst so the web really is a public good that puts people first.

CITIZENS WILL

  • Be creators and collaborators on the web so the web has rich and relevant content for everyone.
  • Build strong communities that respect civil discourse and human dignity so that everyone feels safe and welcome online.
  • Fight for the web so the web remains open and a global public resource for people everywhere, now and in the future.

We commit to uphold these principles and to engage in a deliberative process to build a full “Contract for the Web”, which will set out the roles and responsibilities of governments, companies and citizens. The challenges facing the web today are daunting and affect us in all our lives, not just when we are online. But if we work together and each of us takes responsibility for our actions, we can protect a web that truly is for everyone.

See more from fortheweb.webfoundation.org

Read more uk_internet_censors.htm at MelonFarmers.co.uk

UKCCIS logo The Government has announced the organisations that will sit on the Executive Board of a new national body to tackle online harms in the UK. The UK Council for Internet Safety (UKCIS) is the successor to the UK Council for Child Internet Safety (UKCCIS), with an expanded scope to improve online safety for everyone in the UK.

The Executive Board brings together expertise from a range of organisations in the tech industry, civil society and public sector.

Margot James, Minister for Digital and the Creative Industries said:

Only through collaborative action will the UK be the safest place to be online. By bringing together a wealth of expertise from a wide range of fields, UKCIS can be an example to the world on how we can work together to face the challenges of the digital revolution in an effective and responsible way.

UKCIS has been established to allow these organisations to collaborate and coordinate a UK-wide approach to online safety.

It will contribute to the Government’s commitment to make the UK the safest place in the world to be online, and will help to inform the development of the forthcoming Online Harms White Paper.

Priority areas of focus will include online harms experienced by children such as cyberbullying and sexual exploitation; radicalisation and extremism; violence against women and girls; hate crime and hate speech; and forms of discrimination against groups protected under the Equality Act, for example on the basis of disability or race.

CEO of Internet Matters Carolyn Bunting said:

We are delighted to sit on the Executive Board of UKCIS, where we are able to represent parents' needs in keeping their children safe online.

Online safety demands a collaborative approach and by bringing industry together we hope we can bring about real change and help everyone benefit from the opportunities the digital world has to offer.

The UKCIS Executive Board consists of the following organisations:

  • Apple
  • BBC
  • Childnet
  • Children’s Commissioner
  • Commission for Countering Extremism
  • End Violence Against Women Coalition
  • Facebook
  • GCHQ
  • Google
  • ICO
  • Independent Advisory Group on Hate Crime
  • Internet Matters
  • Internet Watch Foundation
  • Internet Service Providers and Mobile Operators (rotating between BT, Sky, TalkTalk, Three, Virgin Media, Vodafone)
  • Microsoft
  • National Police Chiefs’ Council
  • National Crime Agency – CEOP Command
  • Northern Ireland Executive
  • NSPCC
  • Ofcom
  • Parentzone
  • Scottish Government
  • TechUK
  • Twitter
  • UKCIS Evidence Group Chair
  • UKIE
  • Welsh Assembly

The UKCIS Executive Board is jointly chaired by Margot James, Minister for Digital and the Creative Industries (Department for Digital, Culture, Media and Sport); Victoria Atkins, Minister for Crime, Safeguarding and Vulnerability (Home Office); and Nadhim Zahawi, Minister for Children and Families (Department for Education). It also includes representatives from the Devolved Administrations of Scotland, Wales and Northern Ireland. Board membership will be kept under periodic review, to ensure it represents the full range of online harms that the government seeks to tackle.

Read more eu.htm at MelonFarmers.co.uk

YouTube logo YouTube has warned its video creators about the likely effect of the EU’s upcoming censorship machines:

YouTube’s growing creative economy is at risk, as the EU Parliament voted on Article 13, copyright legislation that could drastically change the internet that you see today.

Article 13 as written threatens to shut down the ability of millions of people — from creators like you to everyday users — to upload content to platforms like YouTube. And it threatens to block users in the EU from viewing content that is already live on the channels of creators everywhere. This includes YouTube’s incredible video library of educational content, such as language classes, physics tutorials and other how-to’s.

This legislation poses a threat to both your livelihood and your ability to share your voice with the world. And, if implemented as proposed, Article 13 threatens hundreds of thousands of jobs, European creators, businesses, artists and everyone they employ. The proposal could force platforms, like YouTube, to allow only content from a small number of large companies. It would be too risky for platforms to host content from smaller original content creators, because the platforms would now be directly liable for that content.

We realize the importance of all rights holders being fairly compensated, which is why we built Content ID and a platform to pay out all types of content owners. But the unintended consequences of Article 13 will put this ecosystem at risk. We are committed to working with the industry to find a better way. This language could be finalized by the end of the year, so it’s important to speak up now.

Please take a moment to learn more about how it could affect your channel and take action immediately. Tell the world through social media (#SaveYourInternet) and your channel why the creator economy is important and how this legislation will impact you.

Read more eu.htm at MelonFarmers.co.uk

european parliament 2018 logo New rules on audiovisual media services will apply to broadcasters, and also to video-on-demand and video-sharing platforms

MEPs have voted on updated rules for audiovisual media services covering the protection of children, stricter rules on advertising, and a requirement for 30% European content in video-on-demand catalogues.

Following the final vote on this agreement, the revised legislation will apply not only to broadcasters but also to video-on-demand and video-sharing platforms, such as Netflix, YouTube or Facebook, as well as to live streaming on video-sharing platforms.

The updated rules will ensure:

  • Enhanced protection of minors from violence, hatred, terrorism and harmful advertising

Audiovisual media services providers should have appropriate measures to combat content inciting violence, hatred and terrorism, while gratuitous violence and pornography will be subject to the strictest rules. Video-sharing platforms will now be responsible for reacting quickly when content is reported or flagged by users as harmful.

The legislation does not include any automatic filtering of uploaded content, but, at the request of the Parliament, platforms need to create a transparent, easy-to-use and effective mechanism to allow users to report or flag content.

The new law includes strict rules on advertising, product placement in children’s TV programmes and content available on video-on-demand platforms. EP negotiators also secured a personal data protection mechanism for children, imposing measures to ensure that data collected by audiovisual media providers are not processed for commercial use, including for profiling and behaviourally targeted advertising.

  • Redefined limits of advertising

Under the new rules, advertising can take up a maximum of 20% of the daily broadcasting period between 6:00 and 18:00, giving the broadcaster the flexibility to adjust their advertising periods. A prime-time window between 18:00 and 0:00 was also set out, during which advertising will only be allowed to take up a maximum of 20% of broadcasting time.
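As a worked example of what the two 20% caps mean in minutes (a sketch only; the directive's scheduling rules contain further detail not modelled here):

```python
def max_ad_minutes(start_hour: int, end_hour: int, cap: float = 0.20) -> float:
    """Maximum advertising minutes allowed in a broadcast window,
    given the directive's 20% cap on that window."""
    window_minutes = (end_hour - start_hour) * 60
    return window_minutes * cap

# Daytime window, 06:00-18:00: 20% of 12 hours = 144 minutes
daytime = max_ad_minutes(6, 18)

# Prime-time window, 18:00-24:00: 20% of 6 hours = 72 minutes
prime_time = max_ad_minutes(18, 24)

print(daytime, prime_time)  # 144.0 72.0
```

So a broadcaster may place up to 144 minutes of advertising across the daytime window, but no more than 72 minutes within prime time.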

  • 30% of European content on the video-on-demand platforms’ catalogues

In order to support the cultural diversity of the European audiovisual sector, MEPs ensured that 30% of content in the video-on-demand platforms’ catalogues should be European.

Video-on-demand platforms are also asked to contribute to the development of European audiovisual productions, either by investing directly in content or by contributing to national funds. The level of contribution in each country should be proportional to their on-demand revenues in that country (member states where they are established or member states where they target the audience wholly or mostly).

The legislation also includes provisions regarding accessibility, integrity of a broadcaster’s signal, strengthening regulatory authorities and promoting media competences.

Next steps

The deal still needs to be formally approved by the Council of EU ministers before the revised law can enter into force. Member States have 21 months after its entry into force to transpose the new rules into national legislation.

The text was adopted by 452 votes to 132, with 65 abstentions.

Article 6a

A new section has been added to the AVMS rules concerning censorship:

  1. Member States shall take appropriate measures to ensure that audiovisual media services provided by media service providers under their jurisdiction which may impair the physical, mental or moral development of minors are only made available in such a way as to ensure that minors will not normally hear or see them. Such measures may include selecting the time of the broadcast, age verification tools or other technical measures. They shall be proportionate to the potential harm of the programme. The most harmful content, such as gratuitous violence and pornography, shall be subject to the strictest measures.

  2. Personal data of minors collected or otherwise generated by media service providers pursuant to paragraph 1 shall not be processed for commercial purposes, such as direct marketing, profiling and behaviourally targeted advertising.

  3. Member States shall ensure that media service providers provide sufficient information to viewers about content which may impair the physical, mental or moral development of minors. For this purpose, media service providers shall use a system describing the potentially harmful nature of the content of an audiovisual media service. For the implementation of this paragraph, Member States shall encourage the use of co-regulation as provided for in Article 4a(1).

  4. The Commission shall encourage media service providers to exchange best practices on co-regulatory codes of conduct. Member States and the Commission may foster self-regulation, for the purposes of this Article, through Union codes of conduct as referred to in Article 4a(2).

Article 4a suggests possible organisation of the censors assigned to the task, e.g. state censors, state-controlled organisations such as Ofcom, or nominally state-controlled co-regulators like the now defunct ATVOD.

Article 4a(3) notes that censorial countries like the UK are free to add further censorship rules of their own:

Member States shall remain free to require media service providers under their jurisdiction to comply with more detailed or stricter rules in compliance with this Directive and Union law, including where their national independent regulatory authorities or bodies conclude that any code of conduct or parts thereof have proven not to be sufficiently effective. Member States shall report such rules to the Commission without undue delay.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

arms of the british government logo The UK government is preparing to establish a new internet censor that would make tech firms liable for content published on their platforms and have the power to sanction companies that fail to take down illegal material and hate speech within hours.

Under legislation being drafted by the Home Office and the Department for Digital, Culture, Media and Sport (DCMS), due to be announced this winter, a new censorship framework for online social harms would be created.

BuzzFeed News has obtained details of the proposals, which would see the establishment of an internet censor similar to Ofcom.

Home secretary Sajid Javid and culture secretary Jeremy Wright are considering the introduction of a mandatory code of practice for social media platforms and strict new rules such as takedown times forcing websites to remove illegal hate speech within a set timeframe or face penalties. Ministers are also looking at implementing age verification for users of Facebook, Twitter, and Instagram.

The new proposals are still in the development stage and are due to be put out for consultation later this year. The new censor would also develop new regulations on controlling non-illegal content and online behaviour. The rules for what constitutes non-illegal content will be the subject of what is likely to be a hotly debated consultation.

BuzzFeed News has also been told ministers are looking at creating a second new censor for online advertising. Its powers would include a crackdown on online advertisements for food and soft drink products that are high in salt, fat, or sugar.

BuzzFeed News understands concerns have been raised in Whitehall that the regulation of non-illegal content will spark opposition from free speech campaigners and MPs. There are also fears internally that some of the measures being considered, including blocking websites that do not adhere to the new regulations, are so draconian that they will generate considerable opposition.

A government spokesperson confirmed to BuzzFeed News that the plans would be unveiled later this year.

Read more eu.htm at MelonFarmers.co.uk

european commission logo Tech companies that fail to remove terrorist content quickly could soon face massive fines. The European Commission proposed new rules on Wednesday that would require internet platforms to remove illegal terror content within an hour of it being flagged by national authorities. Firms could be fined up to 4% of global annual revenue if they repeatedly fail to comply.

Facebook (FB), Twitter (TWTR) and YouTube owner Google (GOOGL) had already agreed to work with the European Union on a voluntary basis to tackle the problem. But the Commission said that progress has not been sufficient.

A penalty of 4% of annual revenue for 2017 would translate to $4.4 billion for Google parent Alphabet and $1.6 billion for Facebook.
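Those figures are easy to verify; a quick check, assuming 2017 global revenues of roughly $110.9 billion for Alphabet and $40.7 billion for Facebook:

```python
def max_fine(annual_revenue_bn: float, rate: float = 0.04) -> float:
    """Maximum penalty under the proposal: up to 4% of global
    annual revenue (figures in billions of US dollars)."""
    return annual_revenue_bn * rate

# Approximate 2017 revenues, in billions of US dollars
alphabet_2017 = 110.9
facebook_2017 = 40.7

print(round(max_fine(alphabet_2017), 1))  # 4.4
print(round(max_fine(facebook_2017), 1))  # 1.6
```

The rounded results match the $4.4 billion and $1.6 billion quoted above.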

The proposal is the latest in a series of European efforts to control the activities of tech companies.

The terror content proposal needs to be approved by the European Parliament and EU member states before becoming law.

Read more eu.htm at MelonFarmers.co.uk

European Parliament logo The European Parliament has voted to approve new copyright powers enabling the big media industry to control how their content is used on the internet.

Article 11 introduces the link tax, which lets news companies control how their content is used. The target of the new law is to make Google pay newspapers for its aggregating Google News service. The collateral damage is that millions of websites can now be harangued for linking to and quoting articles, or even just sharing links to them.

Article 13 introduces the requirement for user content sites to create censorship machines that pre-scan all uploaded content and block anything copyrighted. The original proposal, voted on in June, directly specified that content hosts use censorship machines (or filters, as the EU prefers to call them). After a cosmetic rethink since June, the law no longer specifies automatic filters, but instead specifies that content hosts are responsible for any copyrighted material they publish. And of course the only feasible way that content hosts can ensure they are not publishing copyrighted material is to use censorship machines anyway. The law was introduced really with just the intention of making YouTube and Facebook pay more for content from the big media companies. The collateral damage to individuals and small businesses was clearly of no concern to the well-lobbied MEPs.

Both articles will introduce profound new levels of censorship to all users of the internet, and will also mean that there will be reduced opportunities for people to get their contributions published or noticed on the internet. This is simply because the large internet companies are commercial organisations and will always make decisions with costs and profitability in mind. They are not state censors with a budget to spend on nuanced decision making. So the net outcome will be to block vast swathes of content being uploaded just in case it may contain copyrighted material.

An example to demonstrate the point is the US censorship law, FOSTA. It requires content hosts to block content facilitating sex trafficking. Internet companies generally decided that it was easier to block all adult content rather than to try and distinguish sex trafficking from non-trafficking sex related content. So sections of websites for dating and small ads, personal services etc were shut down overnight.

The EU has, however, introduced a few amendments to the original law to slightly lessen the impact on individuals and small-scale content creators:

  • Article 13 will now only apply to platforms where the main purpose “…is to store and give access to the public or to stream significant amounts of copyright protected content uploaded/made available by its users and that optimise content and promotes for profit making purposes”.
  • When defining best practices for Article 13, special account must now be taken of fundamental rights and the use of exceptions and limitations. Special focus should also be given to ensuring that the burden on SMEs remains appropriate and that automated blocking of content is avoided (effectively an exception for micro/small businesses).
  • Article 11 shall not extend to “mere hyperlinks, which are accompanied by individual words” (so it seems links are safe, but quoted snippets of text must be very short), and the protection “shall also not extend to factual information which is reported in journalistic articles from a press publication and will therefore not prevent anyone from reporting such factual information”.
  • Article 11 shall not prevent legitimate private and non-commercial use of press publications by individual users.
  • Article 11 rights shall expire 5 years after the publication of the press publication. This term shall be calculated from the first day of January of the year following the date of publication. The right referred to in paragraph 1 shall not apply with retroactive effect.
  • Individual member states will now have to decide how Article 11 is implemented, which could create some confusion across borders.
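The expiry rule in the bullets above is concrete enough to sketch in code. A minimal sketch, assuming "expire 5 years after" means the right lapses exactly 5 years from the 1 January that starts the term (the directive text may be interpreted slightly differently in national implementations):

```python
from datetime import date

def article11_expiry(published: date) -> date:
    """Expiry date of Article 11 press-publisher rights: 5 years,
    counted from 1 January of the year after publication."""
    term_start = date(published.year + 1, 1, 1)
    return term_start.replace(year=term_start.year + 5)

# An article published 12 September 2018: the term runs from
# 1 January 2019, so the right lapses on 1 January 2024.
print(article11_expiry(date(2018, 9, 12)))  # 2024-01-01
```

In practice this means an article published in January enjoys nearly a year longer protection than one published the following December, since both terms start from the same 1 January.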

At the same time, the EU rejected the other modest proposals to help out individuals and small creators:

  • No freedom of panorama. When we take photos or videos in public spaces, we’re apt to incidentally capture copyrighted works: from stock art in ads on the sides of buses to t-shirts worn by protestors, to building facades claimed by architects as their copyright. The EU rejected a proposal that would make it legal Europe-wide to photograph street scenes without worrying about infringing the copyright of objects in the background.
  • No user-generated content exemption, which would have made EU states carve out an exception to copyright for using excerpts from works for criticism, review, illustration, caricature, parody or pastiche.

A final round of negotiation with the EU Council and European Commission is now due to take place before member states make a decision early next year. But this is historically more of a rubber stamping process and few, if any, significant changes are expected.

However, anybody who mistakenly thinks that Brexit will stop this from impacting the UK should be cautious. Regardless of what the EU approves, the UK might still have to implement it, and in any case the current UK Government supports many of the controversial new measures.