Read more me_asa.htm at MelonFarmers.co.uk

A pre-roll ad seen on YouTube in June 2018 for Spotify featured a number of scenes in quick succession, with tense sound effects that imitated the style of a horror film. The ad opened with a shot of three characters having breakfast. One character said, “Can you play the wake-up playlist?” and they played a particular song from their phone. That was followed by a shot of another character rousing himself and saying, “Turn that up.” As the music was turned up, a shot showed a horror-film-style doll in a dilapidated old room raising its head, and tense music accompanied the song. Several shots followed of the doll ambushing the characters whenever they played the song and implicitly attacking them. The final shots showed one character attempting to convince the other not to play the song. The ad showed the character taking hold of the other character’s hand to stop him playing it, but then the doll’s hand reached out to press play. The final shots of the ad showed the doll’s face alongside text which stated, “Killer songs you can’t resist.”

The ad was seen during a video on the YouTube channel for DanTDM, a gaming channel.

The complainant, a parent, said their children saw the ad and found it distressing, and objected that the ad was:

  1. unduly distressing; and

  2. irresponsibly targeted, because it was seen during videos that were of appeal to children.

Spotify said that the ad was intended for an adult audience and was particularly targeted towards adults aged 18 to 34. They understood that the tools provided to them by YouTube to target ads towards a particular age group and demographic used a combination of self-identification by YouTube users and probabilistic data based on the user’s behaviour across the internet. Their agency had applied relevant content exclusions, including ensuring that the ad was not shown alongside shocking or graphic content. Additionally, they applied a function so that users could skip the ad after five seconds. They noted that the first encounter with the doll occurred after 12 seconds, and that between 7 and 12 seconds the ad introduced cues as to its tone, so they considered that viewers would have had the opportunity to skip the ad at any point if they found the content distressing.

Spotify provided information from YouTube which listed the demographic data of logged-in viewers of the YouTube channel on which the ad was seen by the complainant. They explained that the data showed that 89% of viewers of the channel were aged 18 or over and that most (73%) were aged between 18 and 44. Only 11% of viewers were aged between 13 and 17. Spotify said that the ad had appeared prior to a video about a video game that was marketed as a stealth and horror game.

ASA Assessment: Complaints 1 & 2 upheld in part

The ASA considered that although violence was not explicitly shown in the ad, it was implied. The ad contained several scenes that were suggestive of a horror film, including tense music and scenes of characters looking scared or in distress. In two scenes in particular, actors were shown playing the song in bed and in the shower when they were ambushed by the doll. We considered that those scenes would be seen by viewers as reminiscent of famous scenes from horror films.

We first considered whether the ad was likely to cause undue distress to adults who saw it. The ad featured shots reminiscent of a horror film. However, we considered a number of scenes, including the doll nodding its head to the rhythm of the song and the doll’s hand pressing the play button on a device that had the Spotify app open, would be seen by viewers as humorous. We considered that although some might find the ad mildly scary, most adult viewers would find the ad overall to be humorous rather than frightening and it was unlikely to cause distress to them.

However, we did consider that the nature of the ad meant it was not suitable to be seen by children because it was likely to be distressing to them. In particular, the ad contained scenes that had tense sound effects and imagery similar to a horror film including the implied threat of violence. The fact the ad was set inside the home, including a bedtime setting, and featured a doll, meant it was particularly likely to cause distress to children who saw it. We did not consider that the context of the ad justified the distress. In addition, the nature of the ad as emulating a horror trailer was deliberately not made clear from the start of the ad and children were likely to be exposed to some of the potentially frightening scenes before they, or parents viewing with them, realised that was the case. We considered the ad therefore should have been appropriately targeted to avoid the risk of children seeing it.

We considered that the ad may have been appropriate to show before content on YouTube that was unlikely to be of particular interest to children. However, when seen by the complainant the ad was juxtaposed against unrelated content for the video game Hello Neighbour. Although the video game was marketed as a stealth horror game, it included colourful, cartoonish images and was rated by the ESRB as suitable for players aged 10+ and by PEGI as suitable for players aged seven or older. We therefore considered it reasonable to expect that content about Hello Neighbour was more likely to appeal to children.

The figures provided by Spotify showed that 11% of viewers of the DanTDM channel were between the ages of 13 and 17, based on viewer demographics relating to logged-in users. However, the channel made use of cartoonish imagery and included videos of games and media popular with children, including Fortnite and The Incredibles. We noted that videos on the channel were presented in an enthusiastic manner by a youthful presenter who had won an award from a children’s television network. Taken altogether, we considered that the content of the videos and the presentational style gave the channel particular appeal to children. For those reasons we concluded that the ad had appeared before videos that were likely to be of appeal or interest to children.

We concluded that the ad was unlikely to cause distress to adults, but that it was likely to cause undue distress to children. Therefore, because the ad had appeared before videos of appeal to children, we concluded that it had been inappropriately targeted.

We told Spotify to ensure that future ads did not cause distress to children without justifiable reason, and to ensure ads that were unsuitable for viewing by children were appropriately targeted.

Read more inus.htm at MelonFarmers.co.uk

After the recent censorship purge of over 800 independent media outlets on Facebook, the US Supreme Court is now hearing a case that could have ramifications for any future attempts at similar purges. The Supreme Court has agreed to take a case that could change free speech on the Internet: Manhattan Community Access Corp. v. Halleck, No. 17-702, which will decide whether the private operator of a public access network is considered a state actor.

The case could affect how companies like Facebook, Twitter, Instagram, Google and YouTube are governed. If the Court were to issue a far-reaching ruling, it could subject such companies to First Amendment lawsuits and force them to allow a much broader scope of free speech from their users.

DeeDee Halleck and Jesus Melendez claimed that they were fired from Manhattan Neighborhood Network for speaking critically of the network. And, though the case does not involve the Internet giants, it could create a ruling that expands the First Amendment beyond the government.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

As far as I can see, if a porn website verifies your age with personal data, it will probably also require you to tick a consent box with a whole load of small print that nobody ever reads. Now if that small print lets it forward all personal data, coupled with porn viewing data, to the Kremlin’s dirty tricks and blackmail department, then that’s OK with the Government’s age verification law. So for sure some porn viewers are going to get burnt because of what the government has legislated and because of what the BBFC has implemented. So perhaps it is not surprising that the BBFC has asked the government to pick up the tab should the BBFC be sued by people harmed by its decisions. After all, it was the government who set up the unsafe environment, not the BBFC.

Margot James, the Minister of State at the Department for Culture, Media and Sport, announced in Parliament:

I am today laying a Departmental Minute to advise that the Department for Digital, Culture, Media and Sport (DCMS) has received approval from Her Majesty’s Treasury (HMT) to recognise a new Contingent Liability which will come into effect when age verification powers under Part 3 of the Digital Economy Act 2017 enter into force.

The contingent liability will provide indemnity to the British Board of Film Classification (BBFC) against legal proceedings brought against the BBFC in its role as the age verification regulator for online pornography.

As you know, the Digital Economy Act introduces the requirement for commercial providers of online pornography to have robust age verification controls to protect children and young people under 18 from exposure to online pornography. As the designated age verification regulator, the BBFC will have extensive powers to take enforcement action against non-compliant sites. The BBFC can issue civil proceedings, give notice to payment-service providers or ancillary service providers, or direct internet service providers to block access to websites where a provider of online pornography remains non-compliant.

The BBFC expects a high level of voluntary compliance by providers of online pornography. To encourage compliance, the BBFC has engaged with industry, charities and undertaken a public consultation on its regulatory approach. Furthermore, the BBFC will ensure that it takes a proportionate approach to enforcement and will maintain arrangements for an appeals process to be overseen by an independent appeals body. This will help reduce the risk of potential legal action against the BBFC.

However, despite the effective work with industry, charities and the public to promote and encourage compliance, this is a new law and there nevertheless remains a risk that the BBFC will be exposed to legal challenge on the basis of decisions taken as the age verification regulator or on grounds of principle from those opposed to the policy.

As this is a new policy, it is not possible to quantify accurately the value of such risks. The Government estimates a realistic risk range to be between £1m and £10m in the first year, based on the likely number and scale of legal challenges. The BBFC investigated options to procure commercial insurance but was unable to do so given difficulties in accurately determining the size of potential risks. The Government will therefore ensure that the BBFC is protected against any legal action brought against it as a result of carrying out its duties as the age verification regulator.

The Contingent Liability is required to be in place for the duration of the period the BBFC remain the age verification regulator. However, we expect the likelihood of the Contingent Liability being called upon to diminish over time as the regime settles in and relevant industries become accustomed to it. If the liability is called upon, provision for any payment will be sought through the normal Supply procedure.

It is usual to allow a period of 14 Sitting Days prior to accepting a Contingent Liability, to provide Members of Parliament an opportunity to raise any objections.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

There’s loads of new information today about the upcoming internet porn censorship regime to be coordinated by the BBFC. The BBFC has launched a new website, ageverificationregulator.com, perhaps to distance itself a bit from its film censorship work.

The BBFC has made a few changes to its approach since the rather ropey document published prior to the BBFC’s public consultation. In general the BBFC seems a little more pragmatic about trying to get adult porn users to buy into the age verification way of thinking. The BBFC seems supportive of the anonymously bought porn access card from the local store, and has taken a strong stance against age verification providers who reprehensibly want to record people’s porn browsing, claiming a need to provide an audit trail.

The BBFC has also decided to offer a service to certify age verification providers in the way that they protect people’s data. This is again probably targeted at making adult porn users a bit more confident in handing over ID.

The BBFC’s tone is a little more acknowledging of people’s privacy concerns, but it’s the government’s law being implemented by the BBFC that allows the recipients of the data to use it more or less how they like. Once you tick the ‘take it or leave it’ consent box allowing the AV provider ‘to make your user experience better’, they can do what they like with your data (although GDPR does kindly let you later withdraw that consent and see what they have got on you).

Another theme that runs through the site is a rather ironic acceptance that, for all the devastation that will befall the UK porn industry, for all the lives ruined by people having their porn viewing outed, and for all the lives ruined by fraud and identity theft, the regime is somehow only about stopping young children ‘stumbling on porn’… because the older, more determined children will still know how to find it anyway.

So the BBFC has laid out its stall, and it’s a little more conciliatory to porn users, but I for one will never hand over any ID data to anyone connected with servicing porn websites. I suspect that many others will feel the same. If you can’t trust the biggest companies in the business with your data, what hope is there for anyone else?

There’s no word yet on when all this will come into force, but the schedule seems to be 3 months after the BBFC scheme has been approved by Parliament. This approval seems scheduled to be debated in early November; e.g. on 5th November there will be a House of Lords session:

Implementation by the British Board of Film Classification of age-verification to prevent children accessing pornographic websites – Baroness Benjamin – Oral questions

So the earliest it could come into force is about mid February.

Read more food_censors.htm at MelonFarmers.co.uk

Index on Censorship is standing with our free speech friends at Flying Dog Brewery, who’ve just been told by the UK drinks censor that they should stop selling one of their beers because the artwork by award-winning artist Ralph Steadman might encourage immoderate drinking. Flying Dog was told that the Portman Group deemed the artwork for its Easy IPA Session India Pale Ale could spur people to drink irresponsibly.

Index on Censorship commented:

We think this is nonsense and are pleased Flying Dog plans to ignore this ruling.

The press release sent by Flying Dog Brewery is below:

Flying Dog Brewery Will Not Comply with Regulatory Group’s Ruling on Easy IPA

Flying Dog Brewery has been defending free speech and creative expression in the United States for more than 25 years. Now, it’s taking a stand in the United Kingdom.

In May 2018, the Portman Group, a third-party organization that evaluates alcohol-related marketing, allegedly received a single complaint from a person who thought that Flying Dog’s Easy IPA Session India Pale Ale could be mistaken for a soft drink.

After months of deliberation, the Portman Group issued a final ruling, claiming that the packaging artwork “directly or indirectly encourages illegal, irresponsible or immoderate consumption, such as binge drinking, drunkenness or drunk-driving”. It will be issuing a Retailer Alert Bulletin on 15 October, which will ask retailers not to place orders for the beer.

Notwithstanding the Portman Group’s ruling, Flying Dog has decided to continue to distribute Easy IPA in the United Kingdom.

Jim Caruso, Flying Dog CEO, said:

Not surprisingly, the alleged complaint — by a sole individual — that a product labeled ‘Easy IPA Session India Pale Ale’ might be mistaken for a soft drink was, we believe, correctly dismissed by the Portman Group. That should have been the end of it. However, the Portman Group then went on to ban the creative and carefree Easy IPA label art by the internationally renowned UK artist Ralph Steadman.

Steadman has illustrated all of Flying Dog’s labels since 1995. In the ruling, the Portman Group claims that the artwork of this low-ABV beer could be seen as encouraging drunkenness.

Without question, over-consumption, binge drinking and drunk-driving are serious health and public safety issues, and Flying Dog has always advocated for moderation and responsible social drinking, Caruso said. At the same time, there is no evidence to suggest that the whimsical Ralph Steadman art on the Easy IPA label causes any of those problems. We believe that British adults can think for themselves and Flying Dog, an independent U.S. craft brewer, will not honor the Portman Group’s request to discontinue shipping Easy IPA to the UK.

Read more awwb.htm at MelonFarmers.co.uk

People’s medical records will be combined with social and smartphone surveillance to predict who will pick up bad habits and stop them getting ill, under radical government proposals.

Matt Hancock, the health secretary, is planning a system of predictive prevention, in which algorithms will trawl data on individuals to send targeted health nags to those flagged as having propensities to health problems, such as taking up smoking or becoming obese.

The creepy plans have already attracted privacy concerns among doctors and campaigners, who say that the project risks backfiring by scaring people or being seen to be abusing public trust in NHS handling of sensitive information.

Read more me_asa.htm at MelonFarmers.co.uk

Online adverts placed by Scottish companies are to be trawled by automated bots to proactively seek out commercials which break censorship rules. The automated technology is part of a new strategy to be unveiled next month by the Advertising Standards Authority, which will use the software to identify adverts and social media posts which could potentially be in breach of official standards. They will then be assessed by humans, and a decision made as to whether action should be taken.

ASA chief executive Guy Parker told Scotland on Sunday that Scottish companies and organisations were likely to be specifically targeted under the new, UK-wide strategy. Parker regurgitated the old trope that the innocent have nothing to fear, saying:

I don’t think responsible Scottish companies have anything to fear — on the contrary, they will welcome better online regulation.

We want to make more adverts responsible online than we have at the moment. We are looking at how we can responsibly automate something that would flag up things that we would then want humans to review. We want to be in a position by 2023 where we are an organisation that is using this technology in a way that makes adverts more responsible.

It seems that Scotland was chosen as the guinea pig for the new system, as the ASA says that Scots historically don’t complain much about adverts, although there was an upturn last year. Parker notes that most complaints UK-wide come from “better off, middle class people in London and the southeast of England”.