Posts Tagged ‘DCMS’


The Telegraph is reporting on significant changes being considered by the government to its Online Censorship Bill. The government is considering backing off from the government-defined censorship of ‘legal but harmful’ content on most websites available in the UK. The government has rightfully been taking stick for these free-speech-curtailing measures, particularly as the censorship is expected to be implemented mostly by woke US internet giants who clearly don’t care about free speech, and will over-censor to ensure that they don’t get caught up in the expense of getting it wrong by under-censoring.

Culture Secretary Michelle Donelan is said to be considering an option for adults to self-censor ‘legal but harmful’ content by clicking a filter button that will order websites to block such content. Of course children will not be able to opt out of that choice. And of course this will mean that age and identity verification has to be in place to ensure that only adults can opt out. A Culture Department spokesman said:

The Secretary of State has committed to strengthen protections for free speech and children in the Online Safety Bill and bring the bill back to the Commons as soon as possible. It remains the Government’s intention to pass the bill this session.


A parliamentary committee looking into supposed ‘fake news’ is calling for more internet censorship. It writes:

The Digital, Culture, Media and Sport Committee calls for Government to appoint new Online Harms Regulator now.

Online misinformation about Covid-19 was allowed to spread virulently across social media without the protections offered by the legislation promised by the Government 15 months ago.

The Misinformation in the COVID-19 Infodemic Report details evidence on a range of harms from dangerous hoax treatments to conspiracy theories that led to attacks on 5G engineers.

The Online Harms White Paper, published in April 2019, proposed a duty of care on tech companies and an independent Online Harms Regulator, both key recommendations from the predecessor DCMS Committee.

MPs voice new concerns that the delayed legislation will not address the harms caused by misinformation and disinformation, a serious omission that would ignore the lessons of the Covid crisis.

The Report finds that tech companies use business models that disincentivise action against misinformation while affording opportunities to bad actors to monetise misleading content. As a result the public is reliant on the good will of tech companies or the bad press they attract to compel them to act.

The DCMS Committee calls for the Government to make a final decision on the appointment of the regulator now.

The report summary reads:

In February, the World Health Organisation warned that, alongside the outbreak of COVID-19, the world faced an infodemic, an unprecedented overabundance of information, both accurate and false, that prevented people from accessing authoritative, reliable guidance about the virus. The infodemic has allowed for harmful misinformation, disinformation, scams and cybercrime to spread. False narratives have resulted in people harming themselves by resorting to dangerous hoax cures or forgoing medical treatment altogether. There have been attacks on frontline workers and critical national infrastructure as a result of alarmist conspiracy theories.

The UK Government is currently developing proposals for online harms legislation that would impose a duty of care on tech companies. Whilst not a silver bullet in addressing harmful content, this legislation is expected to give a new online harms regulator the power to investigate and sanction tech companies. Even so, legislation has been delayed. As yet, the Government has not produced the final response to its consultation (which closed over a year ago), voluntary interim codes of practice, or a media literacy strategy. Moreover, there are concerns that the proposed legislation will not address the harms caused by misinformation and disinformation and will not contain necessary sanctions for tech companies who fail in their duty of care.

We have conducted an inquiry into the impact of misinformation about COVID-19, and the efforts of tech companies and relevant public sector bodies to tackle it. This has presented an opportunity to scrutinise how online harms proposals might work in practice. Whilst tech companies have introduced new ways of tackling misinformation through the introduction of warning labels and tools to correct the record, these innovations have been applied inconsistently, particularly in the case of high-profile accounts. Platform policies have also been too slow to adapt, while automated content moderation at the expense of human review and user reporting has had limited effectiveness. The business models of tech companies themselves disincentivise action against misinformation while affording opportunities to bad actors to monetise misleading content. At least until well-drafted, robust legislation is brought forward, the public is reliant on the goodwill of tech companies, or the bad press they attract, to compel them to act.

During the crisis the public have turned to public service broadcasting as the main and most trusted source of information. Beyond broadcasting, public service broadcasters (PSBs) have contributed through fact-checking and media literacy initiatives and through engagement with tech companies. The Government has also acted against misinformation by reforming its Counter Disinformation Unit to co-ordinate its response and tasked its Rapid Response Unit with refuting seventy pieces of misinformation a week. We have raised concerns, however, that the Government has been duplicating the efforts of other organisations in this field and could have taken a more active role in resourcing an offline, digital literacy-focused response. Finally, we have considered the work of Ofcom, as the Government’s current preferred candidate for online harms regulator, as part of our discussion of online harms proposals. We call on the Government to make a final decision now on the online harms regulator to begin laying the groundwork for legislation to come into effect.


The UK culture secretary is to order social media companies to be more aggressive in their response to conspiracy theories linking 5G networks to the coronavirus pandemic. Oliver Dowden plans to hold virtual meetings with representatives from several tech firms next week to discuss the matter. It follows a number of 5G masts apparently being set on fire.

A spokeswoman for the Department for Digital, Culture, Media and Sport told the BBC:

We have received several reports of criminal damage to phone masts and abuse of telecoms engineers, apparently inspired by crackpot conspiracy theories circulating online. Those responsible for criminal acts will face the full force of the law.

We must also see social media companies acting responsibly and taking much swifter action to stop nonsense spreading on their platforms which encourages such acts.

Several platforms have already taken steps to address the problem but have not banned discussion of the subject outright.

It is not really clear what the rumours are based upon, beyond a correlation between big cities becoming SARS-CoV-2 hotspots and big cities being selected for the initial roll-out of 5G. But surely the denser housing and larger households found in big cities provide a more compelling reason for big cities being the hotspots. One could also ask why western countries seem to be hit hardest, when the housing density argument would make mega-cities in the developing world the more logical centres for the largest contagions, which doesn’t seem to be happening so far.

Ofcom’s unevidenced refutation

5th April 2020. See article from ofcom.org.uk

Ofcom has imposed a sanction on Uckfield Community Radio Limited after a discussion about the causes and origins of Covid-19 on its community radio station Uckfield FM was found to have breached broadcasting rules. The broadcaster must broadcast a summary of Ofcom’s findings to its listeners.

On 28 February 2020, Uckfield FM broadcast a discussion which contained potentially harmful claims about the coronavirus, including unfounded claims that the virus outbreak in Wuhan, China was linked to the roll-out of 5G technology. Ofcom’s investigation concluded that the broadcaster failed to adequately protect listeners and had breached Rule 2.1 of the Ofcom Broadcasting Code.

Given the seriousness of this breach, Ofcom has directed the Licensee to broadcast a statement of Ofcom’s findings on a date and in a form to be determined by Ofcom.

Oliver Dowden takes over as the Culture Secretary, Julian Knight takes over the chair of the DCMS Select Committee and Ofcom is appointed as the AVMS internet censor.


Oliver Dowden was appointed Secretary of State for Digital, Culture, Media and Sport on 13 February 2020. He was previously Paymaster General and Minister for the Cabinet Office, and before that, Parliamentary Secretary at the Cabinet Office. He was elected Conservative MP for Hertsmere in May 2015.

The previous Culture Secretary Nicky Morgan will now be spending more time with her family.

There have been no suggestions that Dowden will diverge from the government path on setting out a new internet censorship regime as outlined in its Online Harms white paper.

Another parliamentary appointment that may be relevant is that Julian Knight has taken over as Chair of the DCMS Select Committee, the parliamentary scrutiny body overseeing the DCMS.

Knight seems quite keen on the internet censorship idea and will surely be spurring on the DCMS.

And finally, one more censorship appointment was announced: the Government has appointed Ofcom to regulate video-sharing platforms under the Audiovisual Media Services Directive, which aims to reduce harmful content on these sites. That will provide quicker protection for some harms and activities and will act as a stepping stone to the full online harms regulatory framework.

In fact this censorship process is set to start in September 2020, and Ofcom has already produced a solution that shadows the age verification requirements of the Digital Economy Act, though it may now need rethinking as some of the enforcement mechanisms, such as ISP blocking, are no longer on the table. The mechanism also only applies to British-based online adult companies providing online video, of which there are hardly any left, after previously being destroyed by the ATVOD regime.

DCMS group calls for new law in the Online Harms Bill to give the government oversight of algorithms used by social media companies.


The Centre for Data Ethics and Innovation is part of the Department for Digital, Culture, Media & Sport. It’s tasked by the Government to connect policymakers, industry, civil society, and the public to develop the ‘right’ governance regime for data-driven technologies. The group has just published its final report into the control of social media and their ‘algorithms’, in time for its suggestions to be incorporated into the government’s upcoming internet censorship bill.

Maybe the term ‘algorithm’ has been used to imply some sort of manipulative menace that secretly drives social media. In fact the algorithm isn’t likely to be far away from: Give them more of what they like, and maybe also try them with what their mates like. No doubt the government would prefer something more like: Give them more of what the government likes.
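
To make the point concrete, here is a minimal sketch of that sort of naive recommender in Python. Everything in it (the toy data, the names, the scoring weights) is invented purely for illustration; real platform systems are vastly more elaborate and entirely undisclosed:

    from collections import Counter

    # Toy data: which posts each user has liked, and who their friends are.
    likes = {
        "alice": {"p1", "p2"},
        "bob": {"p2", "p3"},
        "carol": {"p3", "p4"},
    }
    friends = {"alice": {"bob", "carol"}}
    all_posts = {"p1", "p2", "p3", "p4", "p5"}
    post_tags = {
        "p1": {"cats"}, "p2": {"cats", "politics"},
        "p3": {"politics"}, "p4": {"football"}, "p5": {"cats"},
    }

    def recommend(user, top_n=3):
        """Score unseen posts: more of what the user likes (weighted by
        their liked tags), plus a boost for what their mates like."""
        liked_tags = Counter(t for p in likes[user] for t in post_tags[p])
        scores = Counter()
        for post in all_posts - likes[user]:
            scores[post] += sum(2 * liked_tags[t] for t in post_tags[post])
            scores[post] += sum(post in likes[f] for f in friends.get(user, ()))
        return [p for p, _ in scores.most_common(top_n)]

    print(recommend("alice"))  # e.g. ['p5', 'p3', 'p4']

Crude as this is, it captures the commercial logic: the system optimises for engagement, not for whatever content the government would prefer people to see.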

Anyway the press release reads:

The CDEI publishes recommendations to make online platforms more accountable, increase transparency, and empower users to take control of how they are targeted. These include:

  • New systemic regulation of the online targeting systems that promote and recommend content like posts, videos and adverts.

  • Powers to require platforms to allow independent researchers secure access to their data to build an evidence base on issues of public concern – from the potential links between social media use and declining mental health, to its role in incentivising the spread of misinformation.

  • Platforms to host publicly accessible online archives for ‘high-risk’ adverts, including politics, ‘opportunities’ (e.g. jobs, housing, credit) and age-restricted products.

  • Steps to encourage long-term wholesale reform of online targeting to give individuals greater control over how their online experiences are personalised.

The CDEI recommendations come as the government develops proposals for online harms regulation.

The Centre for Data Ethics and Innovation (CDEI), the UK’s independent advisory body on the ethical use of AI and data-driven technology, has warned that people are being left in the dark about the way that major platforms target information at their users, in its first report to the government.

The CDEI’s year-long review of online targeting systems – which use personal information about users to decide which posts, videos and adverts to show them – has found that existing regulation is out of step with the public’s expectations.

A major new analysis of public attitudes towards online targeting, conducted with Ipsos MORI, finds that people welcome the convenience of targeting systems, but are concerned that platforms are unaccountable for the way their systems could cause harm to individuals and society, such as by increasing discrimination and harming the vulnerable. The research highlighted that most concern related to social media platforms.

The analysis found that only 28% of people trust platforms to target them in a responsible way, and when they try to change settings, only one-third (33%) of people trust these companies to do what they ask. 61% of people favoured greater regulatory oversight of online targeting, compared with 17% of people who support self-regulation.

The CDEI’s recommendations to the government would increase the accountability of platforms, improve transparency and give users more meaningful control of their online experience.

The recommendations strike a balance by protecting users from the potential harms of online targeting, without inhibiting the kind of personalisation of the online experience that the public find useful. Clear governance will support the development and take-up of socially beneficial applications of online targeting, including by the public sector.

The report calls for internet regulation to be developed in a way that promotes human rights-based international norms, and recommends that the online harms regulator should have a statutory duty to protect and respect freedom of expression and privacy.

And from the report:

Key recommendations

Accountability

The government’s new online harms regulator should be required to provide regulatory oversight of targeting:

  • The regulator should take a “systemic” approach, with a code of practice to set standards, and require online platforms to assess and explain the impacts of their systems.

  • To ensure compliance, the regulator needs information gathering powers. This should include the power to give independent experts secure access to platform data to undertake audits.

  • The regulator’s duties should explicitly include protecting rights to freedom of expression and privacy.

  • Regulation of online targeting should encompass all types of content, including advertising.

  • The regulatory landscape should be coherent and efficient. The online harms regulator, ICO, and CMA should develop formal coordination mechanisms.

The government should develop a code for public sector use of online targeting to promote safe, trustworthy innovation in the delivery of personalised advice and support.

Transparency

  • The regulator should have the power to require platforms to give independent researchers secure access to their data where this is needed for research of significant potential importance to public policy.

  • Platforms should be required to host publicly accessible archives for online political advertising, “opportunity” advertising (jobs, credit and housing), and adverts for age-restricted products.

  • The government should consider formal mechanisms for collaboration to tackle “coordinated inauthentic behaviour” on online platforms.

User empowerment

Regulation should encourage platforms to provide people with more information and control:

  • We support the CMA’s proposed “Fairness by Design” duty on online platforms.

  • The government’s plans for labels on online electoral adverts should make paid-for content easy to identify, and give users some basic information to show that the content they are seeing has been targeted at them.

  • Regulators should increase coordination of their digital literacy campaigns.

  • The emergence of “data intermediaries” could improve data governance and rebalance power towards users. Government and regulatory policy should support their development.

Four companies hoping to profit from cancelled porn age verification go to court seeking compensation from the government.


Four age verification companies have launched legal action to challenge the government’s decision to cancel the censorship scheme requiring age verification for access to internet porn. The companies lodged a judicial review at the High Court on Thursday. The Telegraph understands the companies are arguing the decision was an abuse of power as the move had been approved by parliament. They are also claiming damages, understood to be in the region of £3 million, for losses sustained developing age verification technology.

The four companies behind the judicial review – AgeChecked Ltd, VeriMe, AVYourself and AVSecure – are arguing the secretary of state only had power to choose when the scheme came into force, not scrap it in the form passed by Parliament.

The legal action has been backed by campaigners from the Children’s Charities’ Coalition for Internet Safety (CCCIS), which represents organisations such as the NSPCC and Barnardo’s. The CEO of AVSecure, Stuart Lawley, a British tech entrepreneur who made his fortune in the dotcom boom, said he had personally lost millions creating the technology. He said the company, which is behind other parental control apps such as Ageblock, had been preparing for up to 10 million people signing up for the service on day one.


More than £2m of taxpayers’ money was spent preparing for the porn age verification censorship regime before the policy was dropped in early October, the government has revealed.

The bulk of the spending, £2.2m, was paid to the BBFC to do the detailed work on the policy from 2016 onwards. Before then, additional costs were borne by the Department for Digital, Culture, Media and Sport, where civil servants were tasked with developing the proposals as part of their normal work.

Answering a written question from the shadow DCMS secretary, Tom Watson, Matt Warman for the government added: Building on that work, we are now establishing how the objectives of part three of the Digital Economy Act can be delivered through our online harms regime.

It is not just government funds that were wasted on the abortive scheme. Multiple private companies had developed systems with which they were hoping to provide age verification services.

The bizarre thing is that all this money was spent when the government knew that it wouldn’t even prevent determined viewers from getting access to porn. It was only ever considered effective at stopping kids from stumbling on porn.

So all that expense, and all that potential danger for adults stupidly submitting to age verification, and all for what?

Well, at least next time round the government may consider that they should put at least a modicum of thought into people’s privacy.

It’s not ALL about the kids. Surely the government has a duty of care for adults too. We need a Government Harms bill requiring a duty of care for ALL citizens. Now that would be a first!


House of Commons, 3rd October 2019.

The Secretary of State for Digital, Culture, Media and Sport (Nicky Morgan):

Hon. Members will be aware of our ongoing work to keep people safe online and our proposals around age verification for online pornography. I wish to notify the House that the standstill period under the EU’s technical standards and regulations directive expired at midnight last night. I understand the interest in this issue that exists in all parts of the House, and I will update the House on next steps in due course.

ICO’s Data Protection Training
The Pavlov Method
 ☑  Yes, I won’t read this message, and yes, you can do what the fuck you like with my porn browsing data
 ☑  Yes please do, I waive all my GDPR rights
 ☑  Yes, I won’t read this message, and yes, feel free to blackmail me
 ☑  Yes, you can do anything you like ‘to make my viewing experience better’
 ☑  Yes, no need to ask, I’ll tick anything

Digital Minister Margot James has apologised for the six-month delay on the so-called porn block, which had been due to take effect today (16th July). It is designed to force pornography websites to verify users are over 18.

But the law has been delayed twice – most recently because the UK government failed to properly notify European regulators. James told the BBC:

I’m extremely sorry that there has been a delay. I know it sounds incompetent. Mistakes do happen, and I’m terribly sorry that it happened in such an important area.

Of course the fundamental mistake is that the incompetent lawmakers cared only about ‘protecting the children’ and gave bugger all consideration to the resulting endangerment of the adults visiting porn sites.

It took the government months, but it finally started to dawn on them that perhaps they should do something to protect the identity data that they are forcing porn users to hand over, which can then be pinned to their porn browsing history. They probably still didn’t care about porn users, but perhaps realised that the scheme would not get off the ground if it proved so toxic that no one would ever sign up for age verification at all.

Well, as a belated afterthought, the government, BBFC and ICO went away to dream up a few standards that the age verifiers ought to stick to, to try to ensure that data is being kept safe.

So the whole law ended up as a can of worms. The authorities now realise that there should be a level of data protection, but unfortunately this is not actually backed up by the law that was passed. So the data protection standards suggested by the government/BBFC/ICO are only voluntary, and there remains nothing in law to require that the data actually be kept safe. And there is no recourse against anyone who ends up exploiting people’s data.

The Open Rights Group have just written an open letter to the government asking it to change the flawed law and actually require that porn users’ data is kept properly safe:

The Rt Hon Jeremy Wright QC MP
Secretary of State for Digital, Culture, Media and Sport

Re: BBFC Age Verification Privacy Certification Scheme

Dear Secretary of State,

We write to ask you to legislate without delay to place a statutory requirement on the British Board of Film Classification (BBFC) to make their privacy certification scheme for age verification providers mandatory. Legislation is also needed to grant the BBFC powers to require compliance reports and penalise non-compliant providers.

As presently constituted, the BBFC certification scheme will be a disaster. Our analysis report, attached, shows that rather than setting out objective privacy safeguards to which companies must adhere, the scheme allows companies to set their own rules and then demonstrate that these are being followed. There are no penalties for providers which sign up to the standard and then fail to meet its requirements.

The broadly-drafted, voluntary scheme encourages a race to the bottom on privacy protection. It provides no consistent guarantees for consumers about how their personal data will be safeguarded and puts millions of British citizens at serious risk of fraud, blackmail or devastating sexual exposure.

The BBFC standard was only published in April. Some age verification providers have admitted that they are not ready. Others have stated that for commercial reasons they will not engage with the scheme. This means that the bureaucratic delay to age verification’s roll-out can now be turned to advantage. The Government needs to use this delay to introduce legislation, or at the least issue guidance under section 27 of the Digital Economy Act 2017, that will ensure the privacy and security of online users is protected.

We welcome the opportunity to bring this issue to your attention and await your response.

Yours sincerely,

Jim Killock
Executive Director
Open Rights Group


Jeremy Wright, the Secretary of State for Digital, Culture, Media and Sport, addressed parliament to explain that the start date for the age verification scheme for porn has been delayed by about six months. The reason is that the Government failed to inform the EU about laws that affect free trade (e.g. those that allow EU websites to be blocked in the UK). Although the main Digital Economy Act was submitted to the EU, extra bolt-on laws added since have not been submitted. Wright explained:

In autumn last year, we laid three instruments before the House for approval. One of them, the guidance on age verification arrangements, sets out standards that companies need to comply with. That should have been notified to the European Commission, in line with the technical standards and regulations directive, and it was not. Upon learning of that administrative oversight, I instructed my Department to notify this guidance to the EU and re-lay the guidance in Parliament as soon as possible. However, I expect that that will result in a delay in the region of six months.

Perhaps it would help if I explained why I think that six months is roughly the appropriate time. Let me set out what has to happen now: we need to go back to the European Commission, and the rules under the relevant directive say that there must be a three-month standstill period after we have properly notified the regulations to the Commission. If it wishes to look into this in more detail, and I hope that it will not, there could be a further month of standstill before we can take matters further, so that is four months. We will then need to re-lay the regulations before the House. As she knows, under the negative procedure, which is what these will be subject to, there is a period during which they can be prayed against, which accounts for roughly another 40 days. If we add all that together, we come to roughly six months.

Wright apologised profusely to supporters of the scheme:

I recognise that many Members of the House and many people beyond it have campaigned passionately for age verification to come into force as soon as possible to ensure that children are protected from pornographic material they should not see. I apologise to them all for the fact that a mistake has been made that means these measures will not be brought into force as soon as they and I would like.

However, the law has not been received well by porn users. Parliament has generally shown no interest in the privacy and safety of porn users. In fact much of the delay has been down to belatedly realising that the scheme might not get off the ground at all unless they at least pay a little lip service to the safety of porn users.

Even now, Wright chose to dismiss people’s privacy fears and concerns, as if those raising them were all just deplorables bent on opposing child safety. He said:

However, there are also those who do not want these measures to be brought in at all, so let me make it clear that my statement is an apology for delay, not a change of policy or a lessening of this Government’s determination to bring these changes about. Age verification for online pornography needs to happen. I believe that it is the clear will of the House and those we represent that it should happen, and that it is in the clear interests of our children that it must.

Wright compounded his point by simply not acknowledging that, given a choice, people would prefer not to hand over their ID. Voluntarily complying websites would take a major hit from customers who would prefer to seek out the safety of non-complying sites. Wright said:

I see no reason why, in most cases, they [websites] cannot begin to comply voluntarily. They had expected to be compelled to do this from 15 July, so they should be in a position to comply. There seems to be no reason why they should not.

In passing, Wright also mentioned how the government is trying to counter encrypted DNS, which reduces the capability of ISPs to block websites. Instead the Government will try to press the browser companies into doing its censorship dirty work for it:

It is important to understand changes in technology and the additional challenges they throw up, and she is right to say that the so-called D over H changes will present additional challenges. We are working through those now and speaking to the browsers, which is where we must focus our attention. As the hon. Lady rightly says, the use of these protocols will make it more difficult, if not impossible, for ISPs to do what we ask, but it is possible for browsers to do that. We are therefore talking to browsers about how that might practically be done, and the Minister and I will continue those conversations to ensure that these provisions can continue to be effective.
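
For context on why this worries the censors: with DNS over HTTPS (DoH), the domain name lookup travels inside an ordinary encrypted HTTPS request to a resolver, so the ISP never sees which domain is being resolved and cannot substitute a blocked answer. A minimal sketch in Python, using Cloudflare’s publicly documented DoH JSON endpoint (the endpoint is real; the example itself is illustrative only):

    import json
    import urllib.request

    def doh_lookup(domain):
        """Resolve a domain via Cloudflare's DNS-over-HTTPS JSON API.
        The ISP sees only encrypted traffic to cloudflare-dns.com, not
        the domain being queried, so DNS-based blocking never triggers."""
        url = f"https://cloudflare-dns.com/dns-query?name={domain}&type=A"
        req = urllib.request.Request(url, headers={"Accept": "application/dns-json"})
        with urllib.request.urlopen(req) as resp:
            data = json.load(resp)
        return [answer["data"] for answer in data.get("Answer", [])]

    print(doh_lookup("example.com"))  # e.g. ['93.184.216.34']

Hence the government’s interest in the browsers: once lookups are encrypted end to end, the browser vendors become the only practical choke point left for imposing block lists.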