Archive for the ‘Internet’ Category

Read more ow.htm at MelonFarmers.co.uk

Sharon White, the CEO of Ofcom, has put her case to be the British internet news censor, disgracefully from behind the paywalled website of The Times. White says Ofcom has done research showing how little users trust what they read on social media: only 39% consider social media to be a trustworthy news source, compared with 63% for newspapers and 70% for TV.

But then again, many people don’t much trust the biased moralising from the politically correct mainstream media, including the likes of Ofcom.

White claims social media platforms need to be more accountable in how they curate and police content on their platforms, or face regulation.

In reality, Facebook’s algorithm seems pretty straightforward: it just gives readers more of what they have liked in the past. But of course the powers that be don’t like people choosing their own media sources; they would much prefer that the BBC, the Guardian or Ofcom do the choosing.
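As a toy illustration, a like-driven feed of this kind can be sketched in a few lines of Python. The topic labels and data structures below are assumptions made up for the example, not anything Facebook actually publishes:

```python
# A minimal sketch of a "more of what you liked" feed ranker.
# Topics, posts and scoring are illustrative assumptions only.
from collections import Counter

def rank_feed(liked_topics, candidate_posts):
    """Order posts so topics the user liked most often come first."""
    affinity = Counter(liked_topics)  # topic -> number of past likes
    return sorted(candidate_posts,
                  key=lambda post: affinity[post["topic"]],
                  reverse=True)

likes = ["football", "football", "politics"]
posts = [{"title": "Budget row rumbles on", "topic": "politics"},
         {"title": "Cup final report", "topic": "football"},
         {"title": "Gardening tips", "topic": "gardening"}]

for post in rank_feed(likes, posts):
    print(post["title"])  # football first, gardening last
```

The point is simply that the reader’s own past choices, not an editor, decide the ordering.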

Sharon White wrote in The Times:

The argument for independent regulatory oversight of [large online players] has never been stronger.

In practice, this would place much greater scrutiny on how effectively the online platforms respond to harmful content to protect consumers, with powers for a regulator to enforce standards, and act if these are not met.

She continued, disgracefully revealing her complete contempt for the British people:

Many people admit they simply don’t have the time or inclination to think critically when engaging with news, which has important implications for our democracy.

White joins a growing number of the establishment elite arguing that social media needs censorship. The government has frequently suggested as much, with Matt Hancock, then digital, culture, media and sport secretary, telling Facebook in April:

Social media companies are not above the law and will not be allowed to shirk their responsibilities to our citizens.

Read more ssusp4p.htm at MelonFarmers.co.uk

Supporters of the US internet censorship law FOSTA were supposedly attempting to target pimps and traffickers, but of course their real target was the wider sex work industry. Hence they weren’t really interested in the warning that the law would make it harder to target pimps and sex traffickers as their activity would be driven off the radar. Anyway, it seems that the police at least have started to realise that the warning is coming true, but I don’t suppose this will bother the politicians much.

Over in Indianapolis, the police have just arrested their first pimp of 2018, and it involved an undercover cop being approached by the pimp. The reporter asks why there have been so few such arrests, and the police point the finger right at the shutdown of Backpage:

The cases, according to Sgt. John Daggy, an undercover officer with IMPD’s vice unit, have just dried up. The reason for that is pretty simple: the feds closed the police’s best source of leads, the online personals site Backpage, earlier this year. Daggy explained:

We’ve been a little bit blinded lately because they shut Backpage down. I get the reasoning behind it, and the ethics behind it, however, it has blinded us. We used to look at Backpage as a trap for human traffickers and pimps.

With Backpage, we would subpoena the ads and it would tell a lot of the story. Also, with the ads we would catch our victim at a hotel room, which would give us a crime scene. There’s a ton of evidence at a crime scene. Now, since [Backpage] has gone down, we’re getting late reports of them and we don’t have much to go by.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

Sky, TalkTalk and Virgin Media would back the creation of an internet censor to set out a framework for internet companies in the UK, the House of Lords Communications Committee was told. The three major UK ISPs were reporting to the House of Lords’ ongoing inquiry into internet censorship. The companies’ policy heads pushed for a new censor, or the expansion of the responsibilities of a current censor, to set the rules for content censorship and to better equip children using the internet amid safety concerns.

At the moment the Information Commissioner’s Office has responsibility for data protection and privacy; Ofcom censors internet TV; the Advertising Standards Authority censors adverts; and the BBFC censors adult porn.

Citing a report by the consultancy Communications Chambers, Sky’s Adam Kinsley said that websites and internet providers are making decisions, but in an unstructured way. Speaking about the current state of internet regulation, Kinsley said:

Companies are already policing their own platforms. There is no accountability of what they are doing and how they are doing it. The only bit of transparency is when they decide to do it on a global basis and at a time of their choosing. Policy makers need to understand what is happening, and at the moment they don’t have that.

The 13-strong House of Lords committee, chaired by Lord Gilbert of Panteg, launched an inquiry earlier this year to explore how the censorship of the internet should be improved. The committee will consider whether there is a need for new laws to govern internet companies. This inquiry will consider whether websites are sufficiently accountable and transparent, and whether they have adequate governance and provide behavioural standards for users.

The committee is hearing evidence from April to September 2018 and will launch a report at the end of the year.

Read more parl.htm at MelonFarmers.co.uk

Google, Facebook, YouTube and other sites would be required by law to take down extremist material within 24 hours of receiving an official complaint, under an amendment put forward for inclusion in new counter-terror legislation. The Labour MP Stephen Doughty’s amendment echoes censorship laws that came into effect in Germany last year. However, the effect of the German law was to enable no-questions-asked censorship of anything the government doesn’t like. Social media companies have no interest in challenging unfair censorship and find that the easiest and cheapest way to comply is to err on the side of the government, and take down anything asked regardless of the merits of the case.

The counter-terrorism strategy unveiled by the home secretary, Sajid Javid, this month, said the Home Office would place a renewed emphasis on engagement with internet providers and work with the tech industry to seek more investment in technologies that automatically identify and remove terrorist content before it is accessible to all.

But Doughty, a member of the home affairs select committee, said his amendment was needed because the voluntary approach was failing. He said a wide variety of extremist content remained online despite repeated warnings.

If these companies can remove copyrighted video or music content from companies like Disney within a matter of hours, there is no excuse for them to be failing to do so for extremist material.

Doughty’s amendment would also require tech companies to proactively check content for extremist material and take it down within six hours of it being identified.

The proactive check of content alludes to the censorship machines being introduced by the EU to scan uploads for copyrighted material. Extending them to detect terrorist material, coupled with the err-on-the-side-of-caution approach, would inevitably lead to the automatic censorship of any content that merely uses the vocabulary of terrorism, regardless of whether it is news reporting, satire or criticism.
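To see why such machines cannot tell reporting or satire from propaganda, consider a deliberately crude keyword screen; the blocklist and sample texts are invented for the example:

```python
# A toy vocabulary filter of the kind the article warns about: it
# matches words associated with terrorism but knows nothing of context.
BLOCKLIST = {"bomb", "attack", "jihad"}

def flags_upload(text: str) -> bool:
    """Return True if any blocklisted term appears in the text."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return not BLOCKLIST.isdisjoint(words)

samples = [
    "Police say a bomb attack was foiled yesterday",         # news report
    "Satirist lampoons a hapless would-be jihad recruiter",  # satire
]
for s in samples:
    print(flags_upload(s), "-", s)  # both flagged: context is invisible
```

Real filters are statistically fancier, but the structural problem is the same: the match is on vocabulary, not intent.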

Read more eu.htm at MelonFarmers.co.uk
What is the mysterious hold that US Big Music has over Euro politicians?

Article 13, the proposed EU legislation that aims to restrict safe harbors for online platforms, was crafted to end the so-called “Value Gap” on YouTube.

Music piracy was traditionally viewed as an easy-to-identify problem, one that takes place on illegal sites or via largely uncontrollable peer-to-peer networks. In recent years, however, the lines have been blurred.

Sites like YouTube allow anyone to upload potentially infringing content which is then made available to the public. Under the safe harbor provisions of US and EU law, this remains legal — provided YouTube takes content down when told to do so. It complies constantly but there’s always more to do.
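The safe harbor mechanic itself is simple enough to sketch; the identifiers and in-memory library below are stand-ins for illustration, not YouTube’s actual systems:

```python
# A minimal sketch of notice-and-takedown: the platform keeps its safe
# harbor so long as it removes a video once a valid notice arrives.
library = {"v1": "concert clip", "v2": "cat video"}

def handle_takedown_notice(video_id: str) -> str:
    if video_id in library:
        del library[video_id]  # removal on notice preserves safe harbor
        return f"{video_id} removed"
    return f"{video_id} not found"

print(handle_takedown_notice("v1"))  # v1 removed
print(library)                       # only the cat video remains
```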

This means that in addition to being one of the greatest legal platforms ever created, YouTube is also a goldmine of unlicensed content, something unacceptable to the music industry.

They argue that the existence of this pirate material devalues the licensed content on the platform. As a result, YouTube maintains a favorable bargaining position with the labels and the best licensing deal in the industry.

The difference between YouTube’s rates and those the industry would actually like is now known as the “Value Gap” and it’s become one of the hottest topics in recent years.

In fact, it is so controversial that new copyright legislation, currently weaving its way through the corridors of power in the EU Parliament, is specifically designed to address it.

If passed, Article 13 will require platforms like YouTube to pre-filter uploads to detect potential infringement. Indeed, the legislation may as well have been named the YouTube Act, since it’s that platform that provoked this entire debate and the whole Value Gap dispute.

With that in mind, it’s of interest to consider the words of YouTube’s global head of music, Lyor Cohen, this week. In an interview with Music Week, Cohen pledges that his company’s new music service, YouTube Music, will not only match the rates the industry achieves from Apple Music and Spotify, but that the company’s ad-supported free tier viewers will soon be delivering more cash to the labels too. “Of course [rights holders are] going to get more money,” he told Music Week.

If YouTube lives up to its pledge, a level playing field will be welcomed not only by the music industry but also by YouTube competitors such as Spotify, which currently offers a free tier on less favorable terms.

While there’s still plenty of room for YouTube to maneuver, peace breaking out with the labels may be coming a little too late for those deeply concerned about the implications of Article 13.

YouTube’s business model and its reluctance to pay the full market rate for music is what started the whole Article 13 movement in the first place, and with the Legal Affairs Committee of the Parliament (JURI) adopting the proposals last week, time is running out to have them overturned.

Behind the scenes, however, the labels and their associates are going flat out to ensure that Article 13 passes, whether YouTube decides to “play fair” or not. Their language suggests that force is the best negotiating tactic with the distribution giant.

Yesterday, UK Music CEO Michael Dugher led a delegation to the EU Parliament in support of Article 13. He was joined by deputy Labour leader Tom Watson and representatives from the BPI, PRS, and Music Publishers Association, who urged MEPs to support the changes.

Read more eu.htm at MelonFarmers.co.uk

Politicians about to vote in favor of mandatory upload filtering in Europe get their channel deleted by YouTube’s upload filtering.

French politicians of the former Front National are furious: their entire YouTube channel was just taken down by automatic filters at YouTube for alleged copyright violations. Perhaps this will cause them to reconsider next week’s vote, which they have announced they will support: the bill that will make exactly this arbitrary, political, and unilateral upload filtering mandatory all across Europe.

The French party Front National, now renamed Rassemblement National (National Rally), which is one of the biggest parties in France, has had its YouTube channel disappeared on grounds of alleged copyright violations. In an interview with French broadcaster Europe 1, their leader Marine Le Pen calls the takedown arbitrary, political, and unilateral.

Europe is about to vote on the new copyright law next week, next Wednesday or Thursday. So let’s disregard for a moment that this happened to a party normally described as far-right, and observe that if it can happen to one of France’s biggest parties regardless of their policies, then it can happen to anyone, for political reasons or any other reason.

The broadcast, named TVLibertés, is gone, with YouTube stating: YouTube has blocked the broadcast of the newscast of Thursday, June 14 for copyright infringement.

Marine Le Pen was quoted as saying, This measure is completely false; we can easily assert a right of quotation [to illustrate why the material was well within the law to broadcast].

She’s right. Automated upload filters do not take into account when you have a legal right to broadcast copyrighted material for one of the myriad of valid reasons. They will just assume that such reasons never exist, if nothing else to make sure that the hosting platform steers clear of any liability. Political messages will be disappeared on mere allegations by a political opponent, just as might have happened here.
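The shape of the problem is easy to sketch. Real systems like Content ID use robust perceptual fingerprints; the crude chunk-hashing below is an assumption standing in for that, but the decision logic is the point: any match blocks the upload, and there is no input anywhere for a right of quotation.

```python
import hashlib

CHUNK = 16  # bytes per chunk; real fingerprints are far more robust

def fingerprints(data: bytes) -> set:
    """Hash fixed-size chunks as a stand-in for perceptual fingerprinting."""
    return {hashlib.sha256(data[i:i + CHUNK]).hexdigest()
            for i in range(0, len(data), CHUNK)}

claimed = fingerprints(b"copyrighted newscast footage, all rights reserved")

def upload_allowed(clip: bytes) -> bool:
    """Block on any fingerprint overlap; context is never consulted."""
    return claimed.isdisjoint(fingerprints(clip))

quotation = b"copyrighted newscast footage"  # short, arguably lawful quote
print(upload_allowed(quotation))  # False: blocked despite a quotation right
```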

And yet, the Rassemblement National is going to vote in favor of exactly this mandatory upload filtering: the horror they just described on national TV as arbitrary, political, and unilateral.

It’s hard to illustrate more clearly that Europe’s politicians have absolutely no idea about the monster they’re voting on next week.

The decisions to come will be unilateral, political, and arbitrary. Freedom of speech will be unilateral, political, and arbitrary. Just as Marine Le Pen says. Just as YouTube’s Content ID filtering is today, as has just been illustrated.

The article mandating this unilateral, political, and arbitrary censorship is called Article 13 of the upcoming European Copyright bill, and it must be removed entirely. There is no fixing of automated censorship machines.

Privacy remains your own responsibility. So do your freedoms of speech, information, and expression.

Read more eu.htm at MelonFarmers.co.uk

David Kaye, the UN’s Special Rapporteur on freedom of expression, has now chimed in with a very thorough report highlighting how Article 13 of the Directive, the part about mandatory copyright filters, would be a disaster for free speech and would violate the UN’s Universal Declaration of Human Rights, in particular Article 19, which says:

Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers.

As Kaye’s report notes, the upload filters of Article 13 of the Copyright Directive would almost certainly violate this principle.

Article 13 of the proposed Directive appears likely to incentivize content-sharing providers to restrict at the point of upload user-generated content that is perfectly legitimate and lawful. Although the latest proposed versions of Article 13 do not explicitly refer to upload filters and other content recognition technologies, it couches the obligation to prevent the availability of copyright protected works in vague terms, such as demonstrating best efforts and taking effective and proportionate measures. Article 13(5) indicates that the assessment of effectiveness and proportionality will take into account factors such as the volume and type of works and the cost and availability of measures, but these still leave considerable leeway for interpretation.

The significant legal uncertainty such language creates does not only raise concern that it is inconsistent with the Article 19(3) requirement that restrictions on freedom of expression should be provided by law. Such uncertainty would also raise pressure on content sharing providers to err on the side of caution and implement intrusive content recognition technologies that monitor and filter user-generated content at the point of upload. I am concerned that the restriction of user-generated content before its publication subjects users to restrictions on freedom of expression without prior judicial review of the legality, necessity and proportionality of such restrictions. Exacerbating these concerns is the reality that content filtering technologies are not equipped to perform context-sensitive interpretations of the valid scope of limitations and exceptions to copyright, such as fair comment or reporting, teaching, criticism, satire and parody.

Kaye further notes that copyright is not the kind of thing that an algorithm can readily determine, and the fact-specific and context-specific nature of copyright requires much more than just throwing algorithms at the problem — especially when a website may face legal liability for getting it wrong.

The designation of such mechanisms as the main avenue to address users’ complaints effectively delegates content blocking decisions under copyright law to extrajudicial mechanisms, potentially in violation of minimum due process guarantees under international human rights law. The blocking of content — particularly in the context of fair use and other fact-sensitive exceptions to copyright — may raise complex legal questions that require adjudication by an independent and impartial judicial authority. Even in exceptional circumstances where expedited action is required, notice-and-notice regimes and expedited judicial process are available as less invasive means for protecting the aims of copyright law.

In the event that content blocking decisions are deemed invalid and reversed, the complaint and redress mechanism established by private entities effectively assumes the role of providing access to remedies for violations of human rights law. I am concerned that such delegation would violate the State’s obligation to provide access to an effective remedy for violations of rights specified under the Covenant. Given that most of the content sharing providers covered under Article 13 are profit-motivated and act primarily in the interests of their shareholders, they lack the qualities of independence and impartiality required to adjudicate and administer remedies for human rights violations. Since they also have no incentive to designate the blocking as being on the basis of the proposed Directive or other relevant law, they may opt for the legally safer route of claiming that the upload was a terms of service violation — this outcome may deprive users of even the remedy envisioned under Article 13(7). Finally, I wish to emphasize that unblocking, the most common remedy available for invalid content restrictions, may often fail to address financial and other harms associated with the blocking of time-sensitive content.

He goes on to point out that while large platforms may be able to deal with all of this, smaller ones are going to be in serious trouble:

I am concerned that the proposed Directive will impose undue restrictions on nonprofits and small private intermediaries. The definition of an online content sharing provider under Article 2(5) is based on ambiguous and highly subjective criteria such as the volume of copyright protected works it handles, and it does not provide a clear exemption for nonprofits. Since nonprofits and small content sharing providers may not have the financial resources to establish licensing agreements with media companies and other right holders, they may be subject to onerous and legally ambiguous obligations to monitor and restrict the availability of copyright protected works on their platforms. Although Article 13(5)’s criteria for effective and proportionate measures take into account the size of the provider concerned and the types of services it offers, it is unclear how these factors will be assessed, further compounding the legal uncertainty that nonprofits and small providers face. It would also prevent a diversity of nonprofit and small content-sharing providers from potentially reaching a larger size, and result in strengthening the monopoly of the currently established providers, which could be an impediment to the right to science and culture as framed in Article 15 of the ICESCR.