Archive for the ‘UK Government Censorship’ Category

Ofcom goes full-on nightmare with age/ID verification for nearly all websites, coupled with a mountain of red tape and expense.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

With a theatrical flourish of pandering to the ‘won’t somebody think of the children’ mob, Ofcom has proposed a set of censorship rules that demand strict age/ID verification for practically every single website that allows users to post content. On top of that it is proposing the most onerous mountain of expensive red tape seen in the western world. There are a few clever sleights of hand that drag most of the internet into the realm of strict age/ID verification. Ofcom argues that nearly all websites will have child users because 16 and 17 year old ‘children’ have more or less the same interests as adults, and so there is no content that is not of interest to ‘children’.

And so all websites will have to offer content that is appropriate for children of all ages, or else put in place strict age/ID verification to ensure that content is appropriate to the user's age.

And at every stage of deciding website policy, Ofcom is demanding extensive justification of the decisions made and proof of the data used in making them. The amount of risk assessments, documents, research and evidence required makes the ‘health and safety’ regime look like child’s play.

On occasions in the consultation documents Ofcom acknowledges that this will impose a massive administrative burden, but swats away criticism by noting that this is the fault of the Online Safety Act itself, and not Ofcom’s fault.

Comment: Online Safety proposals could cause new harms

See article from openrightsgroup.org

Ofcom’s consultation on safeguarding children online exposes significant problems regarding the proposed implementation of age-gating measures. While aimed at protecting children from digital harms, the proposed measures introduce risks to cybersecurity, privacy and freedom of expression.

Ofcom’s proposals outline the implementation of age assurance systems, including photo-ID matching, facial age estimation, and reusable digital identity services, to restrict access to popular platforms like Twitter, Reddit, YouTube, and Google that might contain content deemed harmful to children.

Open Rights Group warns that these measures could inadvertently curtail individuals’ freedom of expression while simultaneously exposing them to heightened cybersecurity risks.

Jim Killock, Executive Director of Open Rights Group, said:

Adults will be faced with a choice: either limit their freedom of expression by not accessing content, or expose themselves to increased security risks that will arise from data breaches and phishing sites.

Some overseas providers may block access to their platforms from the UK rather than comply with these stringent measures.

We are also concerned that educational and help material, especially where it relates to sexuality, gender identity, drugs and other sensitive topics, may be denied to young people by moderation systems.

Risks to children will continue with these measures. Regulators need to shift their approach to one that empowers children to understand the risks they may face, especially where young people may look for content, whether it is meant to be available to them or not.

Open Rights Group underscores the necessity for privacy-friendly standards in the development and deployment of age-assurance systems mandated by the Online Safety Act. Killock notes, Current data protection laws lack the framework to pre-emptively address the specific and novel cybersecurity risks posed by these proposals.

Open Rights Group urges the government to prioritize comprehensive solutions that incorporate parental guidance and education rather than relying largely on technical measures.

Read more gcnews.htm at MelonFarmers.co.uk

Scotland’s disgraceful new hate crime law has come into force. It will undoubtedly restrict free speech and give power to those with scores to settle, regardless of the merits of their claims. The Hate Crime and Public Order (Scotland) Act 2021 creates a new crime of stirring up hatred relating to age, disability, religion, sexual orientation, transgender identity or being intersex. The maximum penalty is a prison sentence of seven years. A person commits an offence if they communicate material, or behave in a manner, that a reasonable person would consider to be threatening or abusive, with the intention of stirring up hatred based on the protected characteristics.

The bar for the long-standing offence of stirring up racial hatred, which the Act retains, is lower than for the other protected characteristics, as it also covers insulting behaviour, and the prosecution need only prove that stirring up hatred was likely rather than intended. As well as the offence of stirring up hatred, the Hate Crime Act also consolidates the existing law on crimes which are aggravated by prejudice. These are where an offender demonstrates malice or ill-will towards their victim based on a protected characteristic, which can be taken into account by a sheriff or judge, who may impose a longer sentence or a higher fine than would otherwise have been the case. This is the first time that age has been included in the list of protected characteristics for aggravated offences, a move welcomed by some campaign groups.

Adam Tomkins, professor of public law at Glasgow University and a former Conservative MSP, voted against the bill because it could see someone convicted of stirring up hatred for a comment made in private in their own home, not just in public: I just don’t think that’s where the criminal law belongs, he said. Susan Smith of For Women Scotland fears that those who are investigated under the new law will have their lives upended. She told BBC News:

The tests are quite woolly and we don’t know how people are going to interpret this. We do anticipate that there will be a lot of malicious complaints, a lot of rather trivial complaints and potentially people who are investigated will see their lives upended. I imagine there will be many complaints, for example, made against JK Rowling.

Ch Supt Rob Hay of the Association of Scottish Police Superintendents (ASPS), which represents senior officers, said there was the potential for a huge uplift in complaints about social media posts. And as is so often the case, the police have sided with complainers, pledging to investigate every hate crime complaint they receive. BBC News understands that these will be assessed by a dedicated team within Police Scotland, including a number of hate crime advisers to assist officers in determining what, if any, action to take.

Categorised as a mountain of suffocating censorial red tape…

Read more uk_internet_censors.htm at MelonFarmers.co.uk

Ofcom writes:

Ofcom is seeking evidence to inform our codes of practice and guidance on the additional duties that will apply to some of the most widely used online sites and apps — designated as categorised services – under the Online Safety Act.

Under the new laws, all in-scope tech firms must put in place appropriate safety measures to protect users from online harms. In addition, some online services will have to comply with extra requirements if they fall into one of three categories, known as Category 1, 2A or 2B.

These extra duties include giving users more tools to control what content they see, ensuring protections for news publisher and journalistic content, preventing fraudulent advertising and producing transparency reports. Different duties apply, depending on which category a service falls into.

The Act requires us to produce codes of practice and guidance outlining the steps that companies can take to comply with these additional duties. We are inviting evidence from industry, expert groups and other organisations to help inform and shape our approach. A formal consultation on the draft codes and guidance will follow in 2025, taking account of responses to today’s call for evidence.

Advice to Government on categorisation thresholds

Alongside this, we have also today published our advice to Government on the thresholds which would determine whether or not a service falls into Category 1, 2A or 2B. We advise that:

Category 1 (most onerous): should apply to services which meet either of the following conditions:

  • Condition 1 – uses a content recommender system; and has more than 34 million UK users on the user-to-user part of its service, representing around 50% of the UK population;

  • Condition 2 – allows users to forward or reshare user-generated content; and uses a content recommender system; and has more than 7 million UK users on the user-to-user part of its service, representing circa 10% of the UK population.

Category 2A: should apply to services which meet both of the following criteria:

  • is a search service, but not a vertical search service;

  • has more than 7 million UK users on the search engine part of its service, representing circa 10% of the UK population.

Category 2B: should apply to services which meet both of the following criteria:

  • allows users to send direct messages;

  • and has more than 3 million UK users on the user-to-user part of the service, representing circa 5% of the UK population.

Taking our advice into consideration, the Secretary of State must set the threshold conditions in secondary legislation. Once passed, we will then gather information, as needed, from regulated services and produce a published register of categorised services.
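
As a rough illustration of how the advised thresholds above combine, the sketch below classifies a hypothetical service against the advised conditions. This is a minimal sketch only: the function name, its inputs and the assumption that a Category 1 service is not also listed as Category 2B are illustrative, not part of Ofcom's advice or any official tool.

```python
# Illustrative sketch of Ofcom's advised categorisation thresholds.
# All names and the Category 1 / 2B precedence are assumptions, not official.

def advised_categories(
    uk_u2u_users: int = 0,           # UK users of the user-to-user part
    uk_search_users: int = 0,        # UK users of the search engine part
    has_recommender: bool = False,   # uses a content recommender system
    allows_resharing: bool = False,  # users can forward/reshare user content
    allows_direct_messages: bool = False,
    is_search_service: bool = False,
    is_vertical_search: bool = False,
) -> set[str]:
    """Return the categories a service would fall into under the advice."""
    cats = set()

    # Category 1: either condition is sufficient.
    condition_1 = has_recommender and uk_u2u_users > 34_000_000   # ~50% of UK
    condition_2 = (allows_resharing and has_recommender
                   and uk_u2u_users > 7_000_000)                  # ~10% of UK
    if condition_1 or condition_2:
        cats.add("Category 1")

    # Category 2A: general (non-vertical) search with >7m UK search users.
    if is_search_service and not is_vertical_search and uk_search_users > 7_000_000:
        cats.add("Category 2A")

    # Category 2B: direct messaging with >3m UK users on the user-to-user part.
    # Assumption: a service already in Category 1 is not also listed as 2B.
    if "Category 1" not in cats and allows_direct_messages and uk_u2u_users > 3_000_000:
        cats.add("Category 2B")

    return cats


# Example: a large social platform with recommendations, resharing and DMs.
print(advised_categories(uk_u2u_users=40_000_000, has_recommender=True,
                         allows_resharing=True, allows_direct_messages=True))
# -> {'Category 1'}
```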

The UK government calls for evidence for its biased review seeking to further censor and control internet pornography.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

The UK Government’s Department for Science, Innovation, Technology and Censorship has called for evidence to inform the final recommendations of its ‘Independent’ Pornography Review. The government writes:

The government wants to ensure that any legislation and regulation operates appropriately for all pornographic content, and that the criminal justice system has the tools it needs to respond to online illegal pornographic material, and to exploitation and abuse in the industry.

The Independent Pornography Review involves a comprehensive assessment of the legislation, regulation and enforcement of online and offline pornographic content, and is overseen by Independent Lead Reviewer Baroness Gabby Bertin.

The review will take an evidence-based approach to develop a range of recommendations on how best to achieve the review’s objectives:

  • understand the prevalence and harmful impact of illegal pornography online, and the impact on viewers of other forms of legal pornography, including emerging themes like AI-generated pornography, and the impact on viewers’ attitudes to violence against women and girls;

  • assess the public’s awareness and understanding of existing regulation and legislation of pornography;

  • consider the current rules in place to regulate the pornography industry, comparing online and offline laws;

  • determine if law enforcers and the justice system are responding to illegal pornography sufficiently, and if change is needed;

  • find out how prevalent human trafficking and exploitation is in the industry, before recommending how to identify and tackle this;

  • use this knowledge to set out what more can be done to provide those who need it with guidance on the potential harmful impact of pornography.

To ensure the review’s final recommendations are robust, it is important that a broad range of views and evidence are considered. This call for evidence invites:

  • members of the public

  • the government

  • subject matter experts

  • organisations

to contribute to the review.

The call for evidence closes on 7 March 2024.

The UK government dreams up a new wheeze to take censorship control of streaming TV channels under current law.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

The government writes:

Broadcast television in the UK is subject to a system of regulation overseen by the independent communications regulator Ofcom, which is key to ensuring protections for audiences. This regulation ensures that regulated television channels available in the UK abide by a common set of rules and standards in relation to the programmes they show.

Over the last century, the number of channels available in the UK has increased significantly, from a single channel in 1922 to several hundred today. This trend has been recently accelerated by the increasing availability of internet-delivered linear television, known as internet protocol (IP) delivered television. For example, Sky’s newest product Sky Stream delivers content via the internet, whereas Sky Q delivers its services via satellite.

Under the amended Communications Act 2003, in general only channels that appear on regulated electronic programme guides (EPGs) are subject to UK regulation. The EPGs that are regulated in the UK are set out in legislation; currently these are Freeview, Freesat, Sky, Virgin Media, and YouView. This list of regulated EPGs means that many of the newer EPGs and channels utilising IP technology are unregulated and can be easily accessed by audiences on their television sets. While millions of people still choose to watch television through the traditional regulated EPGs, there are increasingly significant numbers of UK viewers accessing linear television channels and content via television sets that can be connected to the internet. Data suggests that the UK has a high proportion of these kinds of televisions, with smart televisions already in as many as 74% of UK households.

This shift is transforming the way that audiences access television, with many new services now delivered via the internet. This evolution of distribution means that there is greater choice for consumers in how they access linear television content and that there is more competition within the market for delivering services, allowing for new and innovative services to emerge.

Many of the larger providers of unregulated EPGs have voluntarily put in place terms and procedures to protect audiences from harmful content, which may result in levels of protection comparable to the regulated EPGs while incurring lower administrative costs for the providers.

However, the introduction of these newer unregulated and self-regulated guides has resulted in a clear regulatory gap within the existing statutory regime, which could result in inconsistent protections for audiences and limited options for independent complaints handling. This also means that guides do not have to ensure other benefits for audiences like prominence for public service channels and accessibility for people with disabilities.

The government is therefore concerned that the combination of the defined set of regulated EPGs and the growth of new, IP delivered services means that there is increasingly a lack of regulation. UK audiences being able to access unregulated EPGs means there is an increasing number of linear television channels and services that are not regulated by Ofcom, nor held to the standards audiences in the UK expect. This has the potential to cause harm, especially for children and vulnerable audiences, with no statutory protections on these unregulated services.

The lack of protections in place for these unregulated services means that there is a range of potentially harmful content that could be shown on television with no independent recourse for action to be taken. This includes content, available during the day, that would be unsuitable for younger audiences and would need to be shown after the watershed if regulated, such as content that includes swearing, violence and sexual content.

Moreover, an inconsistent application of statutory regulation means that EPGs delivering similar — and often competing — services do not currently have to comply with the same statutory requirements. This means that there is not currently a fair competitive environment between providers.

Given the landscape of changing technology and the increasing risk to audiences of unregulated content appearing on television, the government believes that legislation is required to update the EPGs that are regulated in the UK. The government is therefore consulting on whether and how to use existing powers that allow it to update which EPGs are regulated in the UK.

This 8-week consultation seeks views on whether and how the Secretary of State should exercise this power, and seeks views on a proposed approach.

In summary, the government is consulting on:

  • The impact of regulating EPGs.

  • The proposed approach for defining which EPGs should be regulated.

Responses from all individuals or organisations on the specific consultation questions and content of the consultation document are welcome.

Read more gcnews.htm at MelonFarmers.co.uk

Apple says it will remove services such as FaceTime and iMessage from the UK rather than weaken security if new UK government proposals are made law and acted upon. The government is seeking to update the Investigatory Powers Act (IPA) 2016. It wants messaging services to clear security features with the Home Office before releasing them to customers. The act lets the Home Office demand that security features are disabled, without telling the public. Under the update, this would have to be immediate.

Currently, there has to be a review; there can also be an independent oversight process, and a technology company can appeal before taking any action.

WhatsApp and Signal are among the platforms to have opposed a clause in the Online Safety Bill allowing the communications regulator to require companies to install technology to scan for child-abuse material in encrypted messaging apps and other services.

The government has opened an eight-week consultation on the proposed amendments to the IPA, which already enables the storage of internet browsing records for 12 months and authorises the bulk collection of personal data.

Apple has made a 9-page submission to the current consultation opposing the snooping proposal:

  • It would not make changes to security features specifically for one country that would weaken a product for all users.

  • Some changes would require issuing a software update so could not be made secretly.

  • The proposals constitute a serious and direct threat to data security and information privacy that would affect people outside the UK.

Read more ow.htm at MelonFarmers.co.uk

ofcom logo Between 21 February and 21 April 2023, Ofcom consulted on proposals for implementing new statutory restrictions on advertising and sponsorship for less healthy food and drink products.

The Health and Care Act — which received Royal Assent on 28 April 2022 — amended the Communications Act 2003 to introduce new restrictions on advertising and sponsorship for certain food and drink products that are high in fat, salt or sugar (HFSS). These new restrictions apply to advertising on Ofcom-regulated TV and on-demand programme services (ODPS) and also online.

The restrictions:

  • prohibit TV services from including advertising and sponsorship for less healthy food and drink products between 5.30am and 9pm;

  • prohibit ODPS from including advertising and sponsorship for less healthy food and drink products between 5.30am and 9pm; and

  • prohibit paid-for advertisements for less healthy food and drink products that are aimed at UK users from being placed online at any time.

These restrictions take effect from 1 October 2025.
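
As a rough illustration of how the three restrictions above combine once they take effect, here is a minimal sketch; the function name, the medium labels and the date handling are illustrative assumptions, not anything taken from the Ofcom statement.

```python
from datetime import time, date

# Illustrative sketch of the HFSS advertising restrictions described above.
RESTRICTIONS_START = date(2025, 10, 1)   # restrictions take effect
WINDOW_START = time(5, 30)               # 5.30am
WINDOW_END = time(21, 0)                 # 9pm

def hfss_ad_permitted(medium: str, when: time, on: date) -> bool:
    """Return True if a less healthy food/drink ad may be shown."""
    if on < RESTRICTIONS_START:
        return True                       # rules not yet in force
    if medium in ("tv", "odps"):
        # Prohibited between 5.30am and 9pm on TV and on-demand services.
        return not (WINDOW_START <= when < WINDOW_END)
    if medium == "online_paid":
        return False                      # prohibited at any time
    raise ValueError(f"unknown medium: {medium}")

print(hfss_ad_permitted("tv", time(20, 0), date(2025, 11, 1)))   # False: before 9pm
print(hfss_ad_permitted("tv", time(21, 30), date(2025, 11, 1)))  # True: after the window
```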

Ofcom is the statutory regulator with responsibility for advertising on TV and ODPS. Our consultation proposed to:

  • designate the Advertising Standards Authority (ASA) as a co-regulator for the new prohibition on advertising for less healthy food and drink products in paid-for online space; and

  • amend the Broadcast Committee of Advertising Practice (BCAP) Code and the Broadcasting Code to reflect the new restrictions that apply to advertising and sponsorship on TV.

This statement summarises the consultation responses and sets out our conclusions.

See statement [pdf] from ofcom.org.uk

Read more uk_internet_censors.htm at MelonFarmers.co.uk

The UK government is reviewing porn censorship laws for adults, moving beyond the age verification requirements proposed in the current Online Censorship Bill. No doubt the ‘review’ will be a one-sided whinge-fest soliciting the views of moralists, censors and law enforcers, whilst totally ignoring the views of film makers and viewers.

The Government writes:

Regulation of online pornography in the UK will undergo a thorough review to make sure it is fit for purpose in tackling exploitation and abuse, the government has announced today (Monday 3 July).

As the way we consume media and access content rapidly changes, the Review will investigate any gaps in UK regulation which allow exploitation and abuse to take place online, as well as identifying barriers to enforcing criminal law. While the criminal law has been updated in recent years to tackle the presence of extreme and revenge pornography, there are currently different regimes that address the publication and distribution of commercial pornographic material offline, such as videos, and online. The government wants to ensure any pornography legislation and regulation operates consistently for all pornographic content.

The review will also look at how effective the criminal justice system and law enforcement agencies are in responding to illegal pornographic content, including considering if any changes need to be made to criminal law to address challenges law enforcement might have.

It will also consider what more can be done to provide children with information and resources about the harm caused by pornography. This will make sure that illegal and harmful content, such as that which features child sexual abuse and exploitation, or where adults are being exploited, is robustly dealt with.

The Pornography Review is a prompt response to calls for action from parliamentarians and campaign groups concerned with the prevalence and impact on both children and adults of illegal pornographic content and child sexual exploitation and abuse on pornography sites and social media.

This work is separate to, but builds on, the Online Safety Bill, which will hold social media companies and pornography services accountable for ensuring children cannot view pornography, with a new higher standard on the age verification or age estimation tools they must use.

Technology Minister, Paul Scully, said:

Keeping the public safe is the first priority of any government and with technology moving faster than ever, we cannot take our eye off the ball in exploring what more we can do.

Our Pornography Review will look closely at the laws and regulations relating to offline and online content, informing our next steps in tackling the heinous crimes of exploitation and abuse, wherever it occurs.

‘Justice’ Minister, Ed Argar, said:

It is vital we keep up with the pace of the online world and this review will help ensure our laws work to protect people online while punishing those who share illegal and harmful content.

The Review will seek expertise across government and significant engagement with the Crown Prosecution Service and police, industry, civil society stakeholders and regulators.

The review will also look at the role of the pornography industry in trafficking and exploiting adult performers, child sexual exploitation and abuse, and how extreme and non-consensual pornographic content online is dealt with.

There are currently several criminal offences, linked to legislation such as the Obscene Publications Act 1959 and the extreme porn offence at s63 of the Criminal Justice and Immigration Act 2008, which can be committed in relation to all pornographic material, whether offline or online. Some pornographic material is covered by communications offences and offences which deal with publicly displayed material in shops and other premises.

Separately, there is a very robust regime of offences tackling the possession, taking and making of indecent images of children, whether they are photographs / films, or non-photographic.

There are also different regulatory regimes, including that established by the Video Recordings Act 1984, which address the publication and distribution of commercial pornographic material offline, and the video-sharing platform regime that addresses some online pornography.

Notes to editors

The Review will involve a range of government departments, including the Department for Science, Innovation and Technology, Ministry of Justice, the Home Office and the Department for Culture, Media and Sport.

Further scope of the Review will be set out in due course.

The Review is aiming to be completed within a year.

Read more gcnews.htm at MelonFarmers.co.uk

To: Chloe Smith, Secretary of State, Department for Science, Innovation and Technology
cc: Tom Tugendhat, Minister of State for Security, Home Office; Paul Scully, Minister for Tech and the Digital Economy; Lord Parkinson of Whitley Bay

Dear Ms Smith,

We are over 80 national and international civil society organisations, academics and cyberexperts. We represent a wide range of perspectives including digital human rights and technology.

We are writing to you to raise our concerns about the serious threat to the security of private and encrypted messaging posed by the UK’s proposed Online Safety Bill (OSB).

The Online Safety Bill is a deeply troubling legislative proposal. If passed in its present form, the UK could become the first liberal democracy to require the routine scanning of people’s private chat messages, including chats that are secured by end-to-end encryption. As over 40 million UK citizens and 2 billion people worldwide rely on these services, this poses a significant risk to the security of digital communication services not only in the UK, but also internationally.

End-to-end encryption ensures the security of communications for everyone on a network. It is designed so that no-one, including the platform provider, can read or alter the messages. The confidentiality between sender and recipient is completely preserved. That’s why the United Nations, several human rights groups, and anti-human trafficking organisations alike have emphasised that encryption is a vital human rights tool.

In order to comply with the Online Safety Bill, platform providers would have to break that protection either by removing it or by developing work-arounds. Any form of work-around risks compromising the security of the messaging platform, creating back-doors, and other dangerous ways and means for malicious actors and hostile states to corrupt the system. This would put all users in danger.

The UK government has indicated its intention for providers to use a technology that would scan chats on people’s phones and devices — known as client-side scanning. The UK government’s assertion that client-side scanning will not compromise the privacy of messages contradicts the significant evidence of cyber-security experts around the world. This software intercepts chat messages before they are encrypted, as the user is uploading their images or text, and therefore the confidentiality of messages cannot be guaranteed. It would most likely breach human rights law in the UK and internationally.
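
To illustrate the distinction the letter is drawing, the sketch below, a minimal example using the PyNaCl library, shows why a platform that holds neither private key cannot read an end-to-end encrypted message, and how a client-side scanning hook examines the plaintext before encryption ever happens. The helper names and scanning logic are invented for illustration and do not describe any real product or the Bill’s actual mechanism.

```python
# Minimal illustrative sketch (pip install pynacl). All names are invented.
from nacl.public import PrivateKey, Box

alice_sk = PrivateKey.generate()
bob_sk = PrivateKey.generate()

# End-to-end encryption: Alice encrypts to Bob's public key.
sending_box = Box(alice_sk, bob_sk.public_key)
ciphertext = sending_box.encrypt(b"meet at 6pm")

# The platform relaying `ciphertext` holds neither private key,
# so it cannot read or alter the message; only Bob can decrypt it.
receiving_box = Box(bob_sk, alice_sk.public_key)
assert receiving_box.decrypt(ciphertext) == b"meet at 6pm"

# Client-side scanning, as described in the letter, hooks in *before*
# encryption, so the plaintext is examined regardless of how strong
# the encryption applied afterwards is.
def looks_suspicious(plaintext: bytes) -> bool:
    # Hypothetical stand-in for whatever database or classifier a scanner uses.
    return b"flagged" in plaintext

def send_with_client_side_scanning(plaintext: bytes) -> bytes:
    if looks_suspicious(plaintext):
        print("plaintext disclosed before encryption:", plaintext)  # confidentiality lost
    return sending_box.encrypt(plaintext)
```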

Serious concerns have also been raised about similar provisions in the EU’s proposed Child Sexual Abuse Regulation, which an independent expert study warns is in contradiction to human rights rules. French, Irish and Austrian parliamentarians have all also warned of severe threats to human rights and of undermining encryption.

Moreover, the scanning software would have to be pre-installed on people’s phones, without their permission or full awareness of the severe privacy and security implications. The underlying databases can be corrupted by hostile actors, meaning that individual phones would become vulnerable to attack. The breadth of the measures proposed in the Online Safety Bill — which would infringe the rights to privacy to the same extent for the internet’s majority of legitimate law-abiding users as it would for potential criminals — means that the measures cannot be considered either necessary or proportionate.

The inconvenient truth is that it is not possible to scan messages for bad things without infringing on the privacy of lawful messages. It is not possible to create a backdoor that only works for good people and that cannot be exploited by bad people.

Privacy and free expression rights are vital for all citizens everywhere, in every country, to do their jobs, raise their voices, and hold power to account without arbitrary intrusion, persecution or repression. End-to-end encryption provides vital security that allows them to do that without arbitrary interference. People in conflict zones rely on secure encrypted communications to be able to speak safely to friends and family, as well as for national security. Journalists around the world rely on the confidential channels of encrypted chat to communicate with sources and upload their stories in safety.

Children, too, need these rights, as emphasised by UNICEF based on the UN Convention of the Rights of the Child. Child safety and privacy are not mutually exclusive; they are mutually reinforcing. Indeed, children are less safe without encrypted communications, as they equally rely on secure digital experiences free from their data being harvested or conversations intercepted. Online content scanning alone cannot hope to fish out the serious cases of exploitation, which require a whole-of-society approach. The UK government must invest in education, judicial reform, social services, law enforcement and other critical resources to prevent abuse before it can reach the point of online dissemination, thereby prioritising harm prevention over retrospective scanning.

As an international community, we are deeply concerned that the UK will become the weak link in the global system. The security risk will not be confined within UK borders. It is difficult to envisage how such a destructive step for the security of billions of users could be justified.

The UK Prime Minister, Rishi Sunak, has said that the UK will maintain freedom, peace and security around the world. With that in mind, we urge you to ensure that end-to-end encrypted services will be removed from the scope of the Bill and that the privacy of people’s confidential communications will be upheld.

Signed,

Access Now, ARTICLE 19: Global Campaign for Free Expression, Asociatia pentru Tehnologie Ui Internet (ApTI), Associação Portuguesa para a Promoção da Segurança da Informação (AP2SI), Association for Progressive Communications (APC), Big Brother Watch, Centre for Democracy and Technology, Chaos Computer Club (CCC), Citizen D / Drzavljan D, Collaboration on International ICT Policy for East and Southern Africa (CIPESA), Community NeHUBs Africa, cyberstorm.mu, Defend Digital Me, CASM at Demos, Digitalcourage, Digitale Gesellschaft, DNS Africa Media and Communications, Electronic Frontier Finland, Electronic Frontier Foundation (EFF), Electronic Frontier Norway, Epicenter.works, European Center for Not-for-Profit Law, European Digital Rights (EDRi), European Sex Workers Rights Association (ESWA), Fair Vote, Fight for the Future, Foundation for Information Policy Research, Fundación Cibervoluntarios, Global Partners Digital, Granitt, Hermes Center for Transparency and Digital Human Rights, Homo Digitalis, Ikigai Innovation Initiative, Internet Society, Interpeer gUG, ISOC Brazil — Brazilian Chapter of the Internet Society, ISOC Ghana, ISOC India Hyderabad Chapter, ISOC Venezuela, IT-Pol, JCA-Net (Japan), Kijiji Yeetu, La Quadrature du Net, Liberty, McEvedys Solicitors and Attorneys Ltd, Open Rights Group, OpenMedia, OPTF, Privacy and Access Council of Canada, Privacy International, Ranking Digital Rights, Statewatch, SUPERRR Lab, Tech for Good Asia, UBUNTEAM, Wikimedia Foundation, Wikimedia UK

Professor Paul Bernal, Nicholas Bohm, Dr Duncan Campbell, Alan Cox, Ray Corrigan, Professor Angela Daly, Dr Erin Ferguson, Wendy M. Grossman, Dr Edina Harbinja, Dr Julian Huppert, Steve Karmeinsky, Dr Konstantinos Komaitis, Professor Douwe Korff, Petr Kucera, Mark A. Lane, Christian de Larrinaga, Mark Lizar, Dr Brenda McPhail, Alec Muffett, Riana Pferfferkorn, Simon Phipps, Dr Birgit Schippers, Peter Wells, Professor Alan Woodward

Read more inus.htm at MelonFarmers.co.uk

Montana lawmakers have passed a bill banning the social media app TikTok from operating in the state. The measure now goes to Republican Gov. Greg Gianforte for his consideration. The bill would prohibit downloads of TikTok in Montana and would fine any entity, whether an app store or TikTok itself, $10,000 per day each time someone is offered the ability to access the social media platform or download the app.

The state House voted 54-43 to pass the bill, which goes further than the bans in place in nearly half the US states and at federal level, which prohibit TikTok on government devices. Montana already bans the app on state-owned devices.

The bill’s supporters have admitted that they have no feasible plan for implementing the bill and that the bill’s constitutionality will be decided by the courts. TikTok, which is owned by the Chinese tech company ByteDance, has been under intense scrutiny over user data being sent to the Chinese government and its use to distribute pro-Beijing propaganda and misinformation.

The US Congress is considering legislation that gives the Commerce Department the ability to restrict foreign threats on tech platforms.