The Centre for Data Ethics and Innovation is part of the Department for Digital, Culture, Media & Sport. It’s tasked by the Government with connecting policymakers, industry, civil society, and the public to develop the ‘right’ governance regime for data-driven technologies. The group has just published its final report into the control of social media and their ‘algorithms’, in time for its suggestions to be incorporated into the government’s upcoming internet censorship bill.
Maybe the term ‘algorithm’ has been used to imply some sort of manipulative menace that secretly drives social media. In fact the algorithm is unlikely to amount to much more than: give them more of what they like, and maybe also try them with what their mates like. No doubt the government would prefer something more like: give them more of what the government likes.
Anyway the press release reads:
The CDEI publishes recommendations to make online platforms more accountable, increase transparency, and empower users to take control of how they are targeted. These include:
- New systemic regulation of the online targeting systems that promote and recommend content like posts, videos and adverts.
- Powers to require platforms to allow independent researchers secure access to their data to build an evidence base on issues of public concern – from the potential links between social media use and declining mental health, to its role in incentivising the spread of misinformation.
- Platforms to host publicly accessible online archives for ‘high-risk’ adverts, including politics, ‘opportunities’ (e.g. jobs, housing, credit) and age-restricted products.
- Steps to encourage long-term wholesale reform of online targeting to give individuals greater control over how their online experiences are personalised.
The CDEI recommendations come as the government develops proposals for online harms regulation.
The Centre for Data Ethics and Innovation (CDEI), the UK’s independent advisory body on the ethical use of AI and data-driven technology, has warned that people are being left in the dark about the way that major platforms target information at their users, in its first report to the government.
The CDEI’s year-long review of online targeting systems – which use personal information about users to decide which posts, videos and adverts to show them – has found that existing regulation is out of step with the public’s expectations.
A major new analysis of public attitudes towards online targeting, conducted with Ipsos MORI, finds that people welcome the convenience of targeting systems, but are concerned that platforms are unaccountable for the way their systems could cause harm to individuals and society, such as by increasing discrimination and harming the vulnerable. The research highlighted that most concern related to social media platforms.
The analysis found that only 28% of people trust platforms to target them in a responsible way, and when they try to change settings, only one-third (33%) of people trust these companies to do what they ask. 61% of people favoured greater regulatory oversight of online targeting, compared with 17% of people who support self-regulation.
The CDEI’s recommendations to the government would increase the accountability of platforms, improve transparency and give users more meaningful control of their online experience.
The recommendations strike a balance by protecting users from the potential harms of online targeting, without inhibiting the kind of personalisation of the online experience that the public find useful. Clear governance will support the development and take-up of socially beneficial applications of online targeting, including by the public sector.
The report calls for internet regulation to be developed in a way that promotes human rights-based international norms, and recommends that the online harms regulator should have a statutory duty to protect and respect freedom of expression and privacy.
And from the report:
Key recommendations
Accountability
The government’s new online harms regulator should be required to provide regulatory oversight of targeting:
- The regulator should take a “systemic” approach, with a code of practice to set standards, and require online platforms to assess and explain the impacts of their systems.
- To ensure compliance, the regulator needs information gathering powers. This should include the power to give independent experts secure access to platform data to undertake audits.
- The regulator’s duties should explicitly include protecting rights to freedom of expression and privacy.
- Regulation of online targeting should encompass all types of content, including advertising.
- The regulatory landscape should be coherent and efficient. The online harms regulator, ICO, and CMA should develop formal coordination mechanisms.
- The government should develop a code for public sector use of online targeting to promote safe, trustworthy innovation in the delivery of personalised advice and support.
Transparency
- The regulator should have the power to require platforms to give independent researchers secure access to their data where this is needed for research of significant potential importance to public policy.
- Platforms should be required to host publicly accessible archives for online political advertising, “opportunity” advertising (jobs, credit and housing), and adverts for age-restricted products.
- The government should consider formal mechanisms for collaboration to tackle “coordinated inauthentic behaviour” on online platforms.
User empowerment
Regulation should encourage platforms to provide people with more information and control:
- We support the CMA’s proposed “Fairness by Design” duty on online platforms.
- The government’s plans for labels on online electoral adverts should make paid-for content easy to identify, and give users some basic information to show that the content they are seeing has been targeted at them.
- Regulators should increase coordination of their digital literacy campaigns. The emergence of “data intermediaries” could improve data governance and rebalance power towards users. Government and regulatory policy should support their development.