Archive for the ‘Internet’ Category

The UK government calls for evidence for its biased review seeking to further censor and control internet pornography.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

The UK Government’s Department for Science, Innovation, Technology and Censorship has called for evidence to inform the final recommendations of its ‘Independent’ Pornography Review. The government writes:

The government wants to ensure that any legislation and regulation operates appropriately for all pornographic content, and that the criminal justice system has the tools it needs to respond to online illegal pornographic material, and to exploitation and abuse in the industry.

The Independent Pornography Review involves a comprehensive assessment of the legislation, regulation and enforcement of online and offline pornographic content, and is overseen by Independent Lead Reviewer Baroness Gabby Bertin.

The review will take an evidence-based approach to develop a range of recommendations on how best to achieve the review’s objectives:

  • understand the prevalence and harmful impact of illegal pornography online, and the impact on viewers of other forms of legal pornography, including emerging themes like AI-generated pornography, and the impact on viewers’ attitudes to violence against women and girls;

  • assess the public’s awareness and understanding of existing regulation and legislation of pornography;

  • consider the current rules in place to regulate the pornography industry, comparing online and offline laws;

  • determine if law enforcers and the justice system are responding to illegal pornography sufficiently, and if change is needed;

  • find out how prevalent human trafficking and exploitation is in the industry, before recommending how to identify and tackle this;

  • use this knowledge to set out what more can be done to provide those who need it with guidance on the potential harmful impact of pornography.

To ensure the review’s final recommendations are robust, it is important that a broad range of views and evidence are considered. This call for evidence invites:

  • members of the public

  • the government

  • subject matter experts

  • organisations

to contribute to the review.

The call for evidence closes on 7 March 2024.

Launching Default End-to-End Encryption on Messenger

Read more awwb.htm at MelonFarmers.co.uk

I’m delighted to announce that we are rolling out default end-to-end encryption for personal messages and calls on Messenger and Facebook, as well as a suite of new features that let you further control your messaging experience. We take our responsibility to protect your messages seriously and we’re thrilled that after years of investment and testing, we’re able to launch a safer, more secure and private service.

Since 2016, Messenger has had the option for people to turn on end-to-end encryption, but we’re now changing private chats and calls across Messenger to be end-to-end encrypted by default. This has taken years to deliver because we’ve taken our time to get this right. Our engineers, cryptographers, designers, policy experts and product managers have worked tirelessly to rebuild Messenger features from the ground up. We’ve introduced new privacy, safety and control features along the way like delivery controls that let people choose who can message them, as well as app lock, alongside existing safety features like report, block and message requests. We worked closely with outside experts, academics, advocates and governments to identify risks and build mitigations to ensure that privacy and safety go hand-in-hand.

The extra layer of security provided by end-to-end encryption means that the content of your messages and calls with friends and family is protected from the moment it leaves your device to the moment it reaches the receiver’s device. This means that nobody, including Meta, can see what’s sent or said, unless you choose to report a message to us.

End-to-end encryption gives people more secure chats in Messenger. These chats will not only have all of the things people know and love, like themes and custom reactions, but also a host of new features we know are important for our community. These new features will be available for use immediately, though it may take some time for Messenger chats to be updated with default end-to-end encryption.
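As an editorial aside, the principle Meta describes above can be illustrated in a few lines of code. The sketch below is purely illustrative and is not Meta’s implementation; it uses the open-source PyNaCl library (Python bindings to libsodium) to show the basic idea: keys are generated on each user’s device, the service only ever relays ciphertext, and only the intended recipient’s private key can decrypt it.

```python
# Illustrative sketch only (not Meta's code): with end-to-end encryption the
# keys live on the users' devices and the service only relays ciphertext.
from nacl.public import PrivateKey, Box

# Each device generates its own key pair; private keys never leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts using her private key and Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"See you at 7?")

# The messaging service relays `ciphertext`, but holding no private key it
# cannot read the contents.

# Bob decrypts using his private key and Alice's public key.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"See you at 7?"
```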

Read more parl.htm at MelonFarmers.co.uk

Last week, The Times and Channel 4’s Dispatches covered serious allegations of assault against Russell Brand. While the comedian has yet to be convicted of any wrongdoing, and whether the anonymous accusers are victims is yet to be determined, several major platforms, including YouTube, Netflix and BBC iPlayer, took swift action, either demonetizing or removing Brand’s content.

A senior Tory politician has now taken it upon herself to adopt the lynch-mob position that the accuser is always right, and that, without any need for due process, police investigation or judicial trial, she can demand the standard PC punishment of loss of career.

Caroline Dinenage, the chair of Parliament’s Culture, Media and Sport Committee, has written to bully the free speech friendly social media website Rumble into banning or demonetising Brand’s video content; his Rumble channel has about 1.5 million followers. Dinenage wrote that she is concerned that Brand may be able to profit from his work online:

We would be grateful if you could confirm whether Mr Brand is able to monetise his content, including his videos relating to the serious accusations against him. If so, we would like to know whether Rumble intends to join YouTube in suspending Mr Brand’s ability to earn money on the platform.

We would also like to know what Rumble is doing to ensure that creators are not able to use the platform to undermine the welfare of victims of inappropriate and potentially illegal behaviour.

Rumble, however, has chosen a different route from the other platforms. In response to the inquiry by the UK’s Culture, Media and Sport Committee regarding Brand’s monetization on the platform, Rumble CEO Chris Pavlovski issued a statement emphasizing the company’s commitment to a free internet. Taking a clear stance against cancel culture and rushes to judgement, Pavlovski stressed that the allegations against Brand have no connection with his content on Rumble, and pointed out the importance of a free internet where no one arbitrarily dictates which ideas can or cannot be heard.

From Rumble CEO Chris Pavlovski:

Today we received an extremely disturbing letter from a committee chair in the UK Parliament. While Rumble obviously deplores sexual assault, rape, and all serious crimes, and believes that both alleged victims and the accused are entitled to a full and serious investigation, it is vital to note that recent allegations against Russell Brand have nothing to do with content on Rumble’s platform. Just yesterday, YouTube announced that, based solely on these media accusations, it was barring Mr. Brand from monetizing his video content. Rumble stands for very different values. We have devoted ourselves to the vital cause of defending a free internet — meaning an internet where no one arbitrarily dictates which ideas can or cannot be heard, or which citizens may or may not be entitled to a platform.

We regard it as deeply inappropriate and dangerous that the UK Parliament would attempt to control who is allowed to speak on our platform or to earn a living from doing so. Singling out an individual and demanding his ban is even more disturbing given the absence of any connection between the allegations and his content on Rumble. We don’t agree with the behavior of many Rumble creators, but we refuse to penalize them for actions that have nothing to do with our platform.

Although it may be politically and socially easier for Rumble to join a cancel culture mob, doing so would be a violation of our company’s values and mission. We emphatically reject the UK Parliament’s demands.

Offsite Comment: The casual authoritarianism of Caroline Dinenage

21st September 2023. See article from spiked-online.com by Laurie Wastell

Why is the head of parliament’s culture committee calling on tech firms to unperson Russell Brand?

The Online Censorship Bill passes its final parliamentary hurdle.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

The UK’s disgraceful Online Safety Bill has passed through Parliament and will soon become law. The wide-ranging legislation is likely to affect every internet user in the UK and any service they access, and to generate mountains of onerous red tape for any internet business stupid enough to be based in Britain. Potential impacts are still unclear, and some of the new regulations are technologically impossible to comply with.

A key sticking point is what the legislation means for end-to-end encryption, a security technique used by services like WhatsApp that mathematically guarantees that no one, not even the service provider, can read messages sent between two users. The new law gives regulator Ofcom the power to intercept and check this encrypted data for illegal or harmful content.

Using this power would require service providers to create a backdoor in their software, allowing Ofcom to bypass the mathematically secure encryption. But this same backdoor would be abused by hackers, thieves, scammers and malicious states to snoop, steal and hack.
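To see why any such backdoor is incompatible with the guarantee described above, consider what a client-side scanning hook would have to look like. The sketch below is purely illustrative, with hypothetical names and a hypothetical watch-list of hashes, and is not taken from the Bill or from any vendor. The point is simply that the check has to run on the plaintext before encryption, so confidentiality then depends on the scanner and on whoever controls its database, not on the mathematics.

```python
# Hypothetical illustration (not any vendor's code): a client-side scanning
# hook of the kind a scanning order could require. The scan runs on the
# plaintext before encryption, which is exactly where the end-to-end
# guarantee is broken.
import hashlib
from typing import Callable, Set

# Watch-list of hashes supplied by an external authority (hypothetical).
BLOCKLIST_HASHES: Set[str] = set()

def send_message(plaintext: bytes,
                 encrypt: Callable[[bytes], bytes],
                 report: Callable[[bytes], None],
                 transmit: Callable[[bytes], None]) -> None:
    # The device inspects the message in the clear...
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in BLOCKLIST_HASHES:
        # ...and can be made to send the plaintext (or a report about it)
        # to a third party before it is ever encrypted.
        report(plaintext)
    # Only after the check does the supposedly end-to-end encryption happen.
    transmit(encrypt(plaintext))
```

Real scanning proposals typically rely on perceptual rather than exact hashes, but the structural problem is the same: the message is inspected, and can be reported, before the ‘end-to-end’ encryption ever happens.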

Beyond encryption, the bill also brings in mandatory age checks on pornography websites and requires that websites have policies in place to protect people from harmful or illegal content. What counts as illegal, and exactly which websites will fall under the scope of the bill, is unclear, however.

Neil Brown at law firm decoded.legal says Ofcom still has a huge amount of work to do. The new law could plausibly affect any company that allows comments on its website, publishes user-generated content, transmits encrypted data or hosts anything that the government deems may be harmful to children, says Brown:

What I’m fearful of is that there are going to be an awful lot of people, small organisations – not these big tech giants — who are going to face pretty chunky legal bills trying to work out if they are in scope and, if so, what they need to do.

Read more ow.htm at MelonFarmers.co.uk

This week Ofcom hosted the first annual meeting of the Global Online ‘Safety Regulators’ Network (GOSRN), which brings together censors from Europe, Asia, Africa and the Pacific to discuss solutions to ‘global online safety challenges’.

GOSRN is a collaboration between the first movers in internet censorship, including the eSafety Commissioner (Australia); Coimisiún na Meán (Ireland); the Film and Publication Board (South Africa); the Korea Communications Standards Commission (Republic of Korea); the Online Safety Commission (Fiji); and Ofcom (UK).

Members reflected on progress made in the first year of the network’s existence and discussed ways in which internet censors can further enhance collaboration in the year to come.

Network members agreed to appoint Ofcom as Chair of the Network for 2024.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

The U.K.’s Online Safety Bill has passed a critical final stage in the House of Lords, and envisions a potentially vast scheme to surveil internet users.

The bill would empower the U.K. government, in certain situations, to demand that online platforms use government-approved software to search through all users’ photos, files, and messages, scanning for illegal content. Online services that don’t comply can be subject to extreme penalties, including criminal penalties.

Such a backdoor scanning system can and will be exploited by bad actors. It will also produce false positives, leading to false accusations of child abuse that will have to be resolved. That’s why the bill is incompatible with end-to-end encryption, and with human rights. EFF has strongly opposed this bill from the start.

Now, with the bill on the verge of becoming U.K. law, the U.K. government has sheepishly acknowledged that it may not be able to make use of some aspects of this law. During a final debate over the bill, a representative of the government said that orders to scan user files can be issued only where technically feasible, as determined by Ofcom, the U.K.’s telecom regulatory agency. He also said any such order must be compatible with U.K. and European human rights law.

That’s a notable step back, since previously the same representative, Lord Parkinson of Whitley Bay, said in a letter to the House of Lords that the technology that would magically make invasive scanning co-exist with end-to-end encryption already existed. We have seen companies develop such solutions for platforms with end-to-end encryption before, wrote Lord Parkinson in that letter.

Now, Parkinson has come quite close to admitting that such technology does not, in fact, exist. On Tuesday, he said:

There is no intention by the Government to weaken the encryption technology used by platforms, and we have built strong safeguards into the Bill to ensure that users’ privacy is protected.

If appropriate technology which meets these requirements does not exist, Ofcom cannot require its use. That is why the powers include the ability for Ofcom to require companies to make best endeavors to develop or source a new solution.

The same day that these public statements were made, news outlets reported that the U.K. government privately acknowledged that there is no technology that could examine end-to-end encrypted messages while respecting user privacy.

People Need Privacy, Not Weak Promises

Let’s be clear: weak statements by government ministers, such as the hedging from Lord Parkinson during this week’s debate, are no substitute for real privacy rights.

Nothing in the law’s text has changed. The bill gives the U.K. government the right to order message and photo-scanning, and that will harm the privacy and security of internet users worldwide. These powers, enshrined in Clause 122 of the bill, are now set to become law. After that, the regulator in charge of enforcing the law, Ofcom, will have to devise and publish a set of regulations regarding how the law will be enforced.

Several companies that provide end-to-end encrypted services have said they will withdraw from the U.K. if Ofcom actually takes the extreme step of requiring examination of currently encrypted messages. Those companies include Meta-owned WhatsApp, Signal, and U.K.-based Element, among others.

While it’s the last minute, Members of Parliament could still introduce an amendment with real protections for user privacy, including an explicit protection for real end-to-end encryption.

Failing that, Ofcom should publish regulations that make clear that there is no available technology that can allow for scanning of user data to co-exist with strong encryption and privacy.

Finally, lawmakers in other jurisdictions, including the United States, should take heed of the embarrassing result of passing a law that is not just deceptive, but unhinged from computational reality. The U.K. government has insisted that through software magic, a system in which they can examine or scan everything will also somehow be a privacy-protecting system. Faced with the reality of this contradiction, the government has turned to an 11th hour campaign to assure people that the powers it has demanded simply won’t be used.

Read more inus.htm at MelonFarmers.co.uk

Hours before controversial internet censorship laws were set to take effect in Texas and Arkansas, two federal judges granted preliminary injunctions temporarily blocking them.

The narrower Texas law sought to restrict minors from accessing content that is meant for adults. In particular, it required age/ID verification to access porn websites. It was opposed by free speech groups and adult performer industry groups.

The Arkansas law, known as the Social Media Safety Act, is broader and would prevent minors from creating accounts without parental permission on platforms earning more than $100 million a year. The tech industry trade group NetChoice, which represents Google, Meta and TikTok, among others, sued in June to block the law on the grounds that it is unconstitutional and would place an onerous burden on digital platforms.

In Arkansas, U.S. District Judge Timothy Brooks sided with NetChoice, saying that the law is not targeted to address the harms it has identified, and further research is necessary before the State may begin to construct a regulation that is narrowly tailored to address the harms that minors face due to prolonged use of certain social media. Brooks added that age-gating social media platforms does not seem to be an effective approach when, in reality, it is the content on particular platforms that is driving the State’s true concerns.

The narrower Texas law seeking to stop minors from accessing adult content online was temporarily blocked on Thursday by District Judge David Alan Ezra, in a move that the Free Speech Coalition said in a press release will protect citizens from facing a chilling effect on legally protected speech.

The temporary injunctions block the laws from taking effect until further adjudication. It is unclear whether both Arkansas and Texas intend to appeal.

Read more gcnews.htm at MelonFarmers.co.uk

Apple says it will remove services such as FaceTime and iMessage from the UK rather than weaken security if new UK government proposals are made law and acted upon.

The government is seeking to update the Investigatory Powers Act (IPA) 2016. It wants messaging services to clear security features with the Home Office before releasing them to customers. The act lets the Home Office demand security features are disabled, without telling the public. Under the update, this would have to be immediate.

Currently, there has to be a review; there can also be an independent oversight process, and a technology company can appeal before taking any action.

WhatsApp and Signal are among the platforms to have opposed a clause in the Online Safety Bill allowing the communications regulator to require companies to install technology to scan for child-abuse material in encrypted messaging apps and other services.

The government has opened an eight-week consultation on the proposed amendments to the IPA, which already enables the storage of internet browsing records for 12 months and authorises the bulk collection of personal data.

Apple has made a nine-page submission to the current consultation opposing the snooping proposal:

  • It would not make changes to security features specifically for one country that would weaken a product for all users.

  • Some changes would require issuing a software update, so could not be made secretly.

  • The proposals constitute a serious and direct threat to data security and information privacy that would affect people outside the UK.

Read more gcnews.htm at MelonFarmers.co.uk

To: Chloe Smith, Secretary of State, Department for Science, Innovation and Technology
cc: Tom Tugendhat, Minister of State for Security, Home Office; Paul Scully, Minister for Tech and the Digital Economy; Lord Parkinson of Whitley Bay

Dear Ms Smith,

We are over 80 national and international civil society organisations, academics and cyber experts. We represent a wide range of perspectives including digital human rights and technology.

We are writing to you to raise our concerns about the serious threat to the security of private and encrypted messaging posed by the UK’s proposed Online Safety Bill (OSB).

The Online Safety Bill is a deeply troubling legislative proposal. If passed in its present form, the UK could become the first liberal democracy to require the routine scanning of people’s private chat messages, including chats that are secured by end-to-end encryption. As over 40 million UK citizens and 2 billion people worldwide rely on these services, this poses a significant risk to the security of digital communication services not only in the UK, but also internationally.

End-to-end encryption ensures the security of communications for everyone on a network. It is designed so that no-one, including the platform provider, can read or alter the messages. The confidentiality between sender and recipient is completely preserved. That’s why the United Nations, several human rights groups, and anti-human trafficking organisations alike have emphasised that encryption is a vital human rights tool.

In order to comply with the Online Safety Bill, platform providers would have to break that protection either by removing it or by developing work-arounds. Any form of work-around risks compromising the security of the messaging platform, creating back-doors, and other dangerous ways and means for malicious actors and hostile states to corrupt the system. This would put all users in danger.

The UK government has indicated its intention for providers to use a technology that would scan chats on people’s phones and devices — known as client-side scanning. The UK government’s assertion that client-side scanning will not compromise the privacy of messages contradicts the significant evidence of cyber-security experts around the world. This software intercepts chat messages before they are encrypted, as the user is uploading their images or text, and therefore the confidentiality of messages cannot be guaranteed. It would most likely breach human rights law in the UK and internationally.

Serious concerns have also been raised about similar provisions in the EU’s proposed Child Sexual Abuse Regulation, which an independent expert study warns is in contradiction to human rights rules. French, Irish and Austrian parliamentarians have all also warned of severe threats to human rights and of undermining encryption.

Moreover, the scanning software would have to be pre-installed on people’s phones, without their permission or full awareness of the severe privacy and security implications. The underlying databases can be corrupted by hostile actors, meaning that individual phones would become vulnerable to attack. The breadth of the measures proposed in the Online Safety Bill — which would infringe the rights to privacy to the same extent for the internet’s majority of legitimate law-abiding users as it would for potential criminals — means that the measures cannot be considered either necessary or proportionate.

The inconvenient truth is that it is not possible to scan messages for bad things without infringing on the privacy of lawful messages. It is not possible to create a backdoor that only works for good people and that cannot be exploited by bad people.

Privacy and free expression rights are vital for all citizens everywhere, in every country, to do their jobs, raise their voices, and hold power to account without arbitrary intrusion, persecution or repression. End-to-end encryption provides vital security that allows them to do that without arbitrary interference. People in conflict zones rely on secure encrypted communications to be able to speak safely to friends and family, as well as for national security. Journalists around the world rely on the confidential channels of encrypted chat to communicate with sources and upload their stories in safety.

Children, too, need these rights, as emphasised by UNICEF based on the UN Convention on the Rights of the Child. Child safety and privacy are not mutually exclusive; they are mutually reinforcing. Indeed, children are less safe without encrypted communications, as they equally rely on secure digital experiences free from their data being harvested or conversations intercepted. Online content scanning alone cannot hope to fish out the serious cases of exploitation, which require a whole-of-society approach. The UK government must invest in education, judicial reform, social services, law enforcement and other critical resources to prevent abuse before it can reach the point of online dissemination, thereby prioritising harm prevention over retrospective scanning.

As an international community, we are deeply concerned that the UK will become the weak link in the global system. The security risk will not be confined within UK borders. It is difficult to envisage how such a destructive step for the security of billions of users could be justified.

The UK Prime Minister, Rishi Sunak, has said that the UK will maintain freedom, peace and security around the world. With that in mind, we urge you to ensure that end-to-end encrypted services will be removed from the scope of the Bill and that the privacy of people’s confidential communications will be upheld.

Signed,

Access Now, ARTICLE 19: Global Campaign for Free Expression, Asociatia pentru Tehnologie Ui Internet (ApTI), Associação Portuguesa para a Promoção da Segurança da Informação (AP2SI), Association for Progressive Communications (APC), Big Brother Watch, Centre for Democracy and Technology, Chaos Computer Club (CCC), Citizen D / Drzavljan D, Collaboration on International ICT Policy for East and Southern Africa (CIPESA), Community NeHUBs Africa, cyberstorm.mu, Defend Digital Me, CASM at Demos, Digitalcourage, Digitale Gesellschaft, DNS Africa Media and Communications, Electronic Frontier Finland, Electronic Frontier Foundation (EFF), Electronic Frontier Norway, Epicenter.works, European Center for Not-for-Profit Law, European Digital Rights (EDRi), European Sex Workers Rights Association (ESWA), Fair Vote, Fight for the Future, Foundation for Information Policy Research, Fundación Cibervoluntarios, Global Partners Digital, Granitt, Hermes Center for Transparency and Digital Human Rights, Homo Digitalis, Ikigai Innovation Initiative, Internet Society, Interpeer gUG, ISOC Brazil — Brazilian Chapter of the Internet Society, ISOC Ghana, ISOC India Hyderabad Chapter, ISOC Venezuela, IT-Pol, JCA-Net (Japan), Kijiji Yeetu, La Quadrature du Net, Liberty, McEvedys Solicitors and Attorneys Ltd, Open Rights Group, OpenMedia, OPTF, Privacy and Access Council of Canada, Privacy International, Ranking Digital Rights, Statewatch, SUPERRR Lab, Tech for Good Asia, UBUNTEAM, Wikimedia Foundation, Wikimedia UK

Professor Paul Bernal, Nicholas Bohm, Dr Duncan Campbell, Alan Cox, Ray Corrigan, Professor Angela Daly, Dr Erin Ferguson, Wendy M. Grossman, Dr Edina Harbinja, Dr Julian Huppert, Steve Karmeinsky, Dr Konstantinos Komaitis, Professor Douwe Korff, Petr Kucera, Mark A. Lane, Christian de Larrinaga, Mark Lizar, Dr Brenda McPhail, Alec Muffett, Riana Pfefferkorn, Simon Phipps, Dr Birgit Schippers, Peter Wells, Professor Alan Woodward

Read more inau.htm at MelonFarmers.co.uk

Australia’s eSafety Commissioner has made the decision not to register two of eight online censorship codes drafted by the online industry, as they fail to provide appropriate mechanisms to deal with illegal and harmful content online.

New mandatory codes will cover five sections of the online industry and operate under Australia’s Online Safety Act 2021. The codes require industry to take adequate steps to reduce the availability of seriously harmful online content, such as child sexual abuse and pro-terror material.

eSafety’s decision not to register the Designated Internet Services (DIS) code, covering apps, websites, and file and photo storage services like Apple iCloud and Microsoft OneDrive; and the Relevant Electronic Services (RES) code, covering dating sites, online games and instant messaging, is due to the failure of the codes to define appropriate snooping/surveillance mechanisms, which is a requirement for registration.

eSafety will now move to develop mandatory and enforceable industry standards for Relevant Electronic Services and Designated Internet Services.

The eSafety Commissioner has reserved her decision on a third code, the draft Search Engines code covering online search, over concerns it is no longer fit for purpose following recently announced developments in the field of generative AI and its integration into search engine functions. eSafety has requested that a revised Search Engines code be submitted within four weeks to address the specific concerns it has raised.

eSafety Commissioner Julie Inman Grant said:

While I commend industry for their significant amendments following our final feedback on these world-first codes in February, these two codes still don’t meet our minimum expectations.

For example, the Designated Internet Services code still doesn’t require file and photo storage services like iCloud, Google Drive, or OneDrive to detect and flag known child sexual abuse material.

We know that online storage services like these are used to store and share child sexual abuse material and pro-terror material between offenders.

And the Relevant Electronic Services code also doesn’t require email services and some partially encrypted messaging services to detect and flag this material either, even though we know there are proactive steps they can take to stem the already rampant sharing of illegal content.

Industry codes will come into effect six months from the date of registration while eSafety will begin the process of drafting industry standards for Designated Internet Services and Relevant Electronic Services.

Once a code or standard is in place, eSafety will be able to receive complaints and investigate potential breaches. An industry code or standard will be backed up by powers to ensure compliance including injunctions, enforceable undertakings, and maximum financial penalties of nearly $700,000 per day for continuing breaches.

The draft industry censorship codes submitted to eSafety on 31 March can be found at onlinesafety.org.au/codes .