The Government wants people who view pornography online to prove that they are over 18, via Age Verification systems. This is aimed at reducing the likelihood of children accessing inappropriate content.
To this end, the Digital Economy Bill creates a regulator that will seek to ensure that adult content websites verify the age of their users. Non-compliant sites will face monetary penalties; in the case of overseas sites, the regulator can ask payment providers such as VISA to refuse to process UK payments for them.
There are obvious problems with this, which we detail elsewhere.
However, the worst risks are worth going into in some detail, not least from the perspective of the Bill Committee who want the Age Verification system to succeed.
As David Austen of the BBFC, the body likely to become the Age Verification Regulator, said:
Privacy is one of the most important things to get right in relation to this regime. As a regulator, we are not interested in identity at all. The only thing that we are interested in is age, and the only thing that a porn website should be interested in is age. The simple question that should be returned to the pornographic website or app is, “Is this person 18 or over?” The answer should be either yes or no. No other personal details are necessary.
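To make Austen's point concrete, the exchange he describes can be sketched in a few lines. This is a minimal illustration, not any provider's actual API: the function and field names (`make_attestation`, `over_18`) are hypothetical, and real systems would add signing and verification. The point is what the payload contains: a single boolean, and nothing else.

```python
import json

def make_attestation(is_over_18: bool) -> str:
    """Hypothetical AV provider response: the only data sent to the site.

    No name, date of birth, address or payment details are included --
    just the answer to "Is this person 18 or over?"
    """
    return json.dumps({"over_18": is_over_18})

def site_accepts(attestation: str) -> bool:
    """The website checks the single boolean and needs to store nothing."""
    return json.loads(attestation).get("over_18", False) is True

print(site_accepts(make_attestation(True)))   # True
print(site_accepts(make_attestation(False)))  # False
```

Any solution that passes more than this boolean across the wire is, by Austen's own standard, collecting personal details it does not need.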
However, the Age Verification Regulator has no duties in relation to the Age Verification systems themselves. It will make sites verify age, or issue penalties, but it is given no duty to protect people’s privacy or security, or to defend against the cyber security risks that may emerge from the Age Verification systems themselves.
David Austen’s expectations are unfortunately entirely out of his hands.
Instead, the government appears to assume that Data Protection law will be adequate to deal with the privacy and security risks. Meanwhile, the market will provide the tools.
The market has a plethora of possible means to solve this problem. Some involve vast data trawls through Facebook and other social media. Others plan to link people’s identities across web services, providing a way to profile people’s porn viewing habits. Still others attempt to piggyback on payment providers, at the risk of confusing their defences against fraud. Many appear to encourage people to submit sensitive information to services that the users, and the regulator, will have little or no understanding of.
And yet, for all the risks they pose, these solutions may be entirely data protection compliant. This is because data protection allows people to share pretty much whatever they agree to share, on the basis that they are free to make agreements with whomever they wish, by providing ‘consent’.
In other words: Data protection law is simply not designed to govern situations where the user is forced to agree to the use of highly intrusive tools against themselves.
What makes this proposal more dangerous is that the industry’s incentives are poor and lead in the wrong direction: providers have no desire to bear large costs, but would benefit vastly from acquiring user data.
If the government wants to have Age Verification in place, it must mandate a system that increases the privacy and safety of end users, since those users will be compelled to use Age Verification tools. Equally, any Age Verification solution must not make Britain’s cybersecurity worse overall, for example by building databases of the nation’s porn-surfing habits which might later appear on Wikileaks.
The Digital Economy Bill’s impact on the privacy of users should, under human rights law, be properly spelled out (“in accordance with the law”) and be designed to minimise the impacts on people (necessary and proportionate). A failure to provide these protections therefore places the entire system under threat of legal challenge.
User data in these systems will be especially sensitive, being linked to private sexual preferences, and could impact particularly badly on sexual minorities if something goes wrong, whether through data breaches or simple chilling effects. Such data is regarded as particularly sensitive in law.
The Government, in fact, already has at hand a system called Verify which could provide age verification in a privacy-friendly manner. It ought to explain why the high standards of its own Verify system are not being applied to Age Verification, or indeed why it is not prepared to use its own systems to minimise the impacts.
As with web filtering, there is no evidence that Age Verification will prevent even a slightly determined teenager from accessing pornography, nor reduce demand for it among young people. The Government appears to be looking for an easy fix to a complex social problem. The Internet has given young people unprecedented access to adult content, but it is education, rather than technical solutions, that is most likely to address the problems arising from this. Serious questions about the efficacy, and therefore the proportionality, of this measure remain.
However, legislating for the Age Verification problem to be “solved” without any specific regulation of the private sector operators who want to “help” is simply to throw the privacy of the UK’s adult population to the mercy of the porn industry. With this in mind, we have drafted an amendment to introduce the duties necessary to minimise the privacy impacts, which could also reduce, if not remove, the free expression harms to adults.