Archive for the ‘ICO’ Category

Read more me_ico.htm at MelonFarmers.co.uk

UK Information Commissioner issues preliminary enforcement notice against Snap

  • Snap issued with preliminary enforcement notice over potential failure to properly assess the privacy risks posed by its generative AI chatbot ‘My AI’

  • Investigation provisionally finds Snap failed to adequately identify and assess the risks to several million ‘My AI’ users in the UK, including children aged 13 to 17.

The Information Commissioner’s Office (ICO) has issued Snap Inc with a preliminary enforcement notice over potential failure to properly assess the privacy risks posed by Snap’s generative AI chatbot ‘My AI’.

The preliminary notice sets out the steps which the Commissioner may require, subject to Snap’s representations on the preliminary notice. If a final enforcement notice were to be adopted, Snap may be required to stop processing data in connection with ‘My AI’. This means not offering the ‘My AI’ product to UK users pending Snap carrying out an adequate risk assessment.

Snap launched the ‘My AI’ feature for UK Snapchat+ subscribers in February 2023, with a rollout to its wider Snapchat user base in the UK in April 2023. The chatbot feature, powered by OpenAI’s GPT technology, marked the first example of generative AI embedded into a major messaging platform in the UK. As at May 2023, Snapchat had 21 million monthly active users in the UK.

The ICO’s investigation provisionally found the risk assessment Snap conducted before it launched ‘My AI’ did not adequately assess the data protection risks posed by the generative AI technology, particularly to children. The assessment of data protection risk is particularly important in this context which involves the use of innovative technology and the processing of personal data of 13 to 17 year old children.

The Commissioner’s findings in the notice are provisional. No conclusion should be drawn at this stage that there has, in fact, been any breach of data protection law or that an enforcement notice will ultimately be issued. The ICO will carefully consider any representations from Snap before taking a final decision.

John Edwards, Information Commissioner said:

The provisional findings of our investigation suggest a worrying failure by Snap to adequately identify and assess the privacy risks to children and other users before launching ‘My AI’.

We have been clear that organisations must consider the risks associated with AI, alongside the benefits. Today’s preliminary enforcement notice shows we will take action in order to protect UK consumers’ privacy rights.

The continuing and dangerous campaign to force ALL people to hand over sensitive ID details to porn sites, in the name of protecting children from handing over sensitive ID details.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

The UK’s data protection censors at the Information Commissioner’s Office (ICO) have generated a disgracefully onerous red-tape nightmare called the Age Appropriate Design Code, which requires any internet service that provides any sort of grown-up content to evaluate the age of all users so that under-18s can be protected from handing over sensitive ID data. Of course, the age checking usually requires all users to hand over lots of sensitive and dangerous ID data to any website that asks.

Now the ICO has decided to apply these requirements to porn sites, given that they are often accessed by under-18s. The ICO writes:

Next steps

We will continue to evolve our approach, listening to others to ensure the code is having the maximum impact.

For example, we have seen an increasing amount of research (from the NSPCC, 5Rights, Microsoft and the British Board of Film Classification) that children are likely to be accessing adult-only services and that these pose data protection harms, with children losing control of their data or being manipulated to give more data, in addition to content harms. We have therefore revised our position to clarify that adult-only services are in scope of the Children’s code if they are likely to be accessed by children.

As well as engaging with adult-only services directly to ensure they conform with the code, we will also be working closely with Ofcom and the Department for Digital, Culture, Media and Sport (DCMS) to establish how the code works in practice in relation to adult-only services and what they should expect. This work is continuing to drive the improvements necessary to provide a better internet for children.

Read more gambling.htm at MelonFarmers.co.uk

The internet censors at the Information Commissioner’s Office have been called on to launch a full-scale probe into how the online betting industry is exploiting new technology to profile and target gamblers. The move follows a complaint by the campaign group Clean Up Gambling, which alleges that Sky Bet and its partners are creating detailed behavioural profiles of customers and sharing thousands of data points with dozens of third parties.

Clean Up Gambling alleges that one advertising partner, Signal, owned by TransUnion, has a dossier of 186 attributes for an individual, including their propensity to gamble, their favourite games and their susceptibility to specific types of marketing.

TransUnion said it assists gambling companies in preventing fraud, confirming age and identity, checking affordability and protecting vulnerable customers, to support responsible gambling.
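As a purely hypothetical sketch, and not based on any actual Sky Bet, Signal or TransUnion system, a behavioural profile of the kind alleged above might look something like the record below. Every field name and value is invented for illustration; only the broad attribute types (propensity to gamble, favourite games, susceptibility to marketing) come from the complaint as reported.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical illustration only: attribute names are invented and are not
# taken from Signal, TransUnion or any real ad-tech platform.

@dataclass
class GamblerProfile:
    customer_id: str
    propensity_to_gamble: float                      # e.g. a 0-1 score
    favourite_games: List[str] = field(default_factory=list)
    marketing_susceptibility: Dict[str, float] = field(default_factory=dict)
    other_attributes: Dict[str, str] = field(default_factory=dict)

profile = GamblerProfile(
    customer_id="cust-0001",
    propensity_to_gamble=0.82,
    favourite_games=["online slots", "in-play football"],
    marketing_susceptibility={"free-bet offers": 0.9, "email": 0.4},
    # ...plus many more attributes, up to the 186 reported in the complaint
)
print(profile.customer_id, profile.propensity_to_gamble)
```

Sharing records of this kind with dozens of third parties is the sort of processing the complaint asks the ICO to investigate.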

Maybe more about data monetisation than data protection…

Read more me_ico.htm at MelonFarmers.co.uk

Culture Secretary Oliver Dowden has announced that John Edwards is the Government’s preferred candidate for Information Commissioner. John Edwards is currently New Zealand’s Privacy Commissioner. He will now appear before MPs on the Digital, Culture, Media and Sport Select Committee for pre-appointment scrutiny on 9th September 2021.

It seems that the Government has its eyes on market opportunities related to selling data rather than data protection. Dowden commented:

Data underpins innovation and the global digital economy, everyday apps and cloud computing systems. It allows businesses to trade, drives international investment, supports law enforcement agencies tackling crime, the delivery of critical public services and health and scientific research.

The government is naming the first territories with which it will prioritise striking data adequacy partnerships now that it has left the EU: the United States, Australia, the Republic of Korea, Singapore, the Dubai International Finance Centre and Colombia. It is also confirming that future partnerships with India, Brazil, Kenya and Indonesia are being prioritised.

Estimates suggest there is as much as £11 billion worth of trade that goes unrealised around the world due to barriers associated with data transfers.

The aim is to move quickly and creatively to develop global partnerships which will make it easier for UK organisations to exchange data with important markets and fast-growing economies.

The government also today names New Zealand Privacy Commissioner John Edwards as its preferred candidate to be the UK’s next Information Commissioner, following a global search.

As Information Commissioner and head of the UK regulator responsible for enforcing data protection law, he will be empowered to go beyond the regulator’s traditional role of focusing only on protecting data rights, with a clear mandate to take a balanced approach that promotes further innovation and economic growth.

It means reforming our own data laws so that they’re based on common sense, not box-ticking. And it means having the leadership in place at the Information Commissioner’s Office to pursue a new era of data-driven growth and innovation. John Edwards’s vast experience makes him the ideal candidate to ensure data is used responsibly to achieve those goals.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

The data protection censors at the Information Commissioner’s Office have got into the internet censorship game with a new regime that starts on 2nd September 2021. Its Age Appropriate Design Code very much requires an age-gated internet in the name of data protection for children. The code itself is not law, but the ICO claims that it is an interpretation of the EU’s GDPR (General Data Protection Regulation) and so carries legal weight.

The code requires users to hand over their personal data to any website that asks, so that the site can verify they are of sufficient age to hand over their personal data. All in the name of preventing children from handing over their personal data.

And the most immediate impact is that social media websites need to ensure that their users are over the age of 13 before the internet companies can make hay with their personal data.

And in preparation for the new rules, Facebook and Instagram have posted substantial blogs laying out new policies on age verification.

Facebook summarised:

Facebook and Instagram weren’t designed for people under the age of 13, so we’re creating new ways to stop those who are underage from signing up.

We’re developing AI to find and remove underage accounts, and new solutions to verify people’s ages.

We’re also building new experiences designed specifically for those under 13.

See full article from about.fb.com

Instagram added:

Creating an experience on Instagram that’s safe and private for young people, but also fun, comes with competing challenges. We want them to easily make new friends and keep up with their family, but we don’t want them to deal with unwanted DMs or comments from strangers. We think private accounts are the right choice for young people, but we recognize some young creators might want to have public accounts to build a following.

We want to strike the right balance of giving young people all the things they love about Instagram while also keeping them safe. That’s why we’re announcing changes we’ll make today, including:

  • Defaulting young people into private accounts.

  • Making it harder for potentially suspicious accounts to find young people.

  • Limiting the options advertisers have to reach young people with ads.

See full article from about.instagram.com

Read more nw.htm at MelonFarmers.co.uk

CEASE (Centre to End All Sexual Exploitation) is a new morality group campaigning against porn and sex work in the UK. The group was founded in 2019 and describes itself on its website:

We shine a light on what sexual exploitation is, where it occurs and how it contravenes our human rights. We campaign for new and better laws, advocate for policy change and hold the global sex industry to account.

We’re building a UK-wide movement of campaigners against sexual exploitation, and we’re amplifying the voices of the very best advocates for change: survivors.

Its latest cunning plan is to hold the Information Commissioner’s Office (the UK data protection censor) responsible for failing to prevent the world’s porn sites from obtaining usage data from under-18s. The group writes on its website:

We are threatening to take legal action against the Information Commissioner’s Office (ICO) for failing to protect children’s data from misuse by porn sites.

The excuses the ICO has given for its failure to fulfil its regulatory duties are legally and factually flawed. What’s more, it has left children exposed to a profit-hungry industry which is intent on drawing children back again and again to watch violent and abusive pornographic material for its own financial gain.

The group quotes long-time anti-porn campaigner John Carr:

I was shocked and dismayed by the Information Commissioner’s reply to me in which they refused to act against porn sites which were collecting and processing children’s data on a large scale. If the data protection laws weren’t designed to protect children … I am sure a lot of parents will wonder just what they were designed to do.

Read more me_ico.htm at MelonFarmers.co.uk

A survey by the Information Commissioner’s Office (ICO) shows that three-quarters of businesses surveyed are aware of the impending Children’s Code. The full findings will be published in May, but initial analysis shows businesses are still in the preparation stages.

And with just six months to go until the code comes into force, the ICO is urging organisations and businesses to make the necessary but onerous changes to their online services and products.

The Children’s Code sets out 15 standards organisations must meet to ensure that children’s data is protected online. The code will apply to all the major online services used by children in the UK and includes measures such as providing default settings which ensure that children have access to online services whilst minimising data collection and use.

Details of the code were first published in June 2018 and UK Parliament approved it last year. Since then, the ICO has been providing support and advice to help organisations adapt their online services and products in line with data protection law.

Read more me_ico.htm at MelonFarmers.co.uk

In a landmark decision that shines a light on widespread data protection failings by the entire data broker industry, the UK data protection censor, the ICO, has taken enforcement action against Experian, based in part on a complaint made by Privacy International in 2018.

Privacy International (PI) welcomes the report from the UK Information Commissioner’s Office (ICO) into three credit reference agencies (CRAs) which also operate as data brokers for direct marketing purposes. As a result, the ICO has ordered the credit reference agency Experian to make fundamental changes to how it handles people’s personal data within its offline direct marketing services.

Experian now has until July 2021 to inform people that it holds their personal data and how it intends to use it for marketing purposes. The ICO also requires Experian to stop using personal data derived from the credit referencing side of its business by January 2021.

The ICO investigation found widespread and systemic data protection failings across the sector, significant data protection failures at each company and that significant invisible processing took place, likely affecting millions of individuals in the UK. As the report underlines, between the CRAs, the data of almost every adult in the UK was, in some way, screened, traded, profiled, enriched, or enhanced to provide direct marketing services.

Moreover, the report notes that all three of the credit referencing agencies investigated were also using profiling to generate new or previously unknown information about people. This can be extremely invasive and can also have discriminatory effects for individuals.

Experian has said it intends to appeal the ICO’s decision, saying:

We believe the ICO’s view goes beyond the legal requirements. This interpretation (of General Data Protection Regulation) also risks damaging the services that help consumers, thousands of small businesses and charities, particularly as they try to recover from the COVID-19 crisis.

Read more me_ico.htm at MelonFarmers.co.uk

ICO consultation on the draft Statutory guidance

We are running a consultation about an updated version of the Statutory guidance on how the ICO will exercise its data protection regulatory functions of information notices, assessment notices, enforcement notices and penalty notices.

This guidance is a requirement of the Data Protection Act 2018 and only covers data protection law under that Act. Our other regulatory activity and the other laws we regulate are covered in our Regulatory action policy (which is currently under review).

We welcome written responses from all interested parties including members of the public and data controllers and those who represent them. Please answer the questions in the survey and also tell us whether you are responding on behalf of an organisation or in a personal capacity.

We will use your responses to this survey to help us understand the areas where organisations and members of the public are seeking further clarity about information notices, assessment notices, enforcement notices and penalty notices. We will only use this information to inform the final version of this guidance and not to consider any regulatory action.

We will publish this guidance after the UK has left the EU and we have therefore drafted it accordingly.

Won’t somebody think of the children!…

Read more me_ico.htm at MelonFarmers.co.uk

The ICO issued the Age Appropriate Design Code on 12 August 2020 and it will come into force on 2 September 2020, with a 12-month transition period. Information Commissioner Elizabeth Denham writes:

Data sits at the heart of the digital services children use every day. From the moment a young person opens an app, plays a game or loads a website, data begins to be gathered. Who’s using the service? How are they using it? How frequently? Where from? On what device?

That information may then inform techniques used to persuade young people to spend more time using services, to shape the content they are encouraged to engage with, and to tailor the advertisements they see.

For all the benefits the digital economy can offer children, we are not currently creating a safe space for them to learn, explore and play.

This statutory code of practice looks to change that, not by seeking to protect children from the digital world, but by protecting them within it.

This code is necessary.

This code will lead to changes that will help empower both adults and children.

One in five UK internet users are children, but they are using an internet that was not designed for them. In our own research conducted to inform the direction of the code, we heard children describing data practices as nosy, rude and a bit freaky.

Our recent national survey into people’s biggest data protection concerns ranked children’s privacy second only to cyber security. This mirrors similar sentiments in research by Ofcom and the London School of Economics.

This code will lead to changes in practices that other countries are considering too.

It is rooted in the United Nations Convention on the Rights of the Child (UNCRC) that recognises the special safeguards children need in all aspects of their life. Data protection law at the European level reflects this and provides its own additional safeguards for children.

The code is the first of its kind, but it reflects the global direction of travel with similar reform being considered in the USA, Europe and globally by the Organisation for Economic Co-operation and Development (OECD).

This code will lead to changes that UK Parliament wants.

Parliament and government ensured UK data protection laws will truly transform the way we look after children online by requiring my office to introduce this statutory code of practice.

The code delivers on that mandate and requires information society services to put the best interests of the child first when they are designing and developing apps, games, connected toys and websites that are likely to be accessed by them.

This code is achievable.

The code is not a new law but it sets standards and explains how the General Data Protection Regulation applies in the context of children using digital services. It follows a thorough consultation process that included speaking with parents, children, schools, children’s campaign groups, developers, tech and gaming companies and online service providers.

Such conversations helped shape our code into effective, proportionate and achievable provisions.

Organisations should conform to the code and demonstrate that their services use children’s data fairly and in compliance with data protection law.

The code is a set of 15 flexible standards (they do not ban or specifically prescribe) that provides built-in protection to allow children to explore, learn and play online by ensuring that the best interests of the child are the primary consideration when designing and developing online services.

Settings must be high privacy by default (unless there’s a compelling reason not to); only the minimum amount of personal data should be collected and retained; children’s data should not usually be shared; geolocation services should be switched off by default. Nudge techniques should not be used to encourage children to provide unnecessary personal data or to weaken or turn off their privacy settings. The code also addresses issues of parental control and profiling.
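As a rough, purely illustrative sketch, and not something taken from the ICO code or any real platform, the defaults described above might be encoded in a service roughly as follows; every name and setting here is hypothetical.

```python
from dataclasses import dataclass

# Hypothetical sketch only: field names and defaults are invented to
# illustrate the kind of child-protective defaults the code describes.
# This is not the ICO code itself or any real platform's configuration.

@dataclass
class DefaultPrivacySettings:
    profile_visibility: str = "private"          # high privacy by default
    geolocation_enabled: bool = False            # geolocation off by default
    share_data_with_third_parties: bool = False  # data not usually shared
    personalised_advertising: bool = False       # limit profiling-driven ads
    nudge_prompts_enabled: bool = False          # no nudges to weaken settings
    data_fields_collected: tuple = ("account_id",)  # collect and retain the minimum

def settings_for_new_account() -> DefaultPrivacySettings:
    """Every new account starts with the protective defaults; only an
    explicit, informed choice would later relax any of them."""
    return DefaultPrivacySettings()

print(settings_for_new_account())
```

The point of the sketch is simply that "high privacy by default" amounts to a set of concrete starting values rather than an opt-in.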

This code will make a difference.

Developers and those in the digital sector must act. We have allowed the maximum transition period of 12 months and will continue working with the industry.

We want coders, UX designers and system engineers to engage with these standards in their day-to-day work, and we’re setting up a package of support to help.

But the next step must be a period of action and preparation. I believe companies will want to conform with the standards because they will want to demonstrate their commitment to always acting in the best interests of the child. Those companies that do not make the required changes risk regulatory action.

What’s more, they risk being left behind by those organisations that are keen to conform.

A generation from now, I believe we will look back and find it peculiar that online services weren’t always designed with children in mind.

When my grandchildren are grown and have children of their own, the need to keep children safer online will be as second nature as the need to ensure they eat healthily, get a good education or buckle up in the back of a car.

And while our code will never replace parental control and guidance, it will help people have greater confidence that their children can safely learn, explore and play online.

There is no doubt that change is needed. The code is an important and significant part of that change.