Posts Tagged ‘Internet Censorship’

Read more eu.htm at MelonFarmers.co.uk

Social media giants Facebook, Google and Twitter will be forced to change their terms of service for EU users within a month, or face hefty fines from European authorities, an official said on Friday. The move was initiated after politicians decided to blame their unpopularity on ‘fake news’ rather than their own incompetence and their failure to listen to the will of the people.

The EU Commission sent letters to the three companies in December, stating that some terms of service were in breach of EU protection laws and urged them to do more to prevent fraud on their platforms. The EU has also urged social media companies to do more when it comes to assessing the suitability of user generated content.

The letters, seen by Reuters, explained that the EU Commission also wanted clearer signposting for sponsored content, and that mandatory rights, such as cancelling a contract, could not be interfered with.

Germany said this week it is working on a new law that would see social media sites face fines of up to $53 million if they fail to strengthen their efforts to remove material that the EU does not like. German censorship minister Heiko Maas said:

There must be as little space for criminal incitement and slander on social networks as on the streets. Too few criminal comments are deleted and they are not erased quickly enough. The biggest problem is that networks do not take the complaints of their own users seriously enough…it is now clear that we must increase the pressure on social networks.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

How could the power to block pornographic websites lead to massive censorship, when the BBFC thinks it wants to censor “just” a few hundred sites?

Officials wrote to the New Statesman yesterday to complain about Myles Jackman’s characterisation of the Digital Economy Bill as leading to an attempt to classify everything on the Internet. (They perhaps hadn’t understood the satire.)

However, the fact of the matter is that the DE Bill gives the BBFC (the regulator, TBC) the power to block any pornographic website that doesn’t use age verification tools. It can even block websites that publish pornography that doesn’t fit their guidelines of taste and acceptability – which are significantly narrower than what is legal, and certainly narrower than what is viewed as acceptable by US websites.

A single video of “watersports”, or of whipping that produces marks, for instance, would be enough for the BBFC to ban a website for every UK adult. The question is, how many sites does the regulator want to block, and how many can it block?

Parliament has been told that the regulator wants to block just a few, major websites, maybe 50 or 100, as an “incentive” to implement age checks. However, that’s not what Clause 23 says. The “Age-verification regulator’s power to direct internet service providers to block access to material” just says that any site that fits the criteria can be blocked by an administrative request.

What could possibly go wrong?

Imagine, not implausibly, that some time after the Act is in operation, one of the MPs who pushed for this power goes and sees how it is working. This MP tries a few searches, and finds to their surprise that it is still possible to find websites that are neither asking for age checks nor blocked.

While the first page or two of results under the new policy would find major porn sites that are checking, or else are blocked, the results on page three and four would lead to sites that have the same kinds of material available to anyone.

In short, what happens when MPs realise this policy is nearly useless?

They will, of course, ask for more to be done. You could write the Daily Mail headlines months in advance: “BBFC lets kids watch porn”.

MPs will ask why the BBFC isn’t blocking more websites. The answer will come back that it would be possible, with more funding, to classify and block more sites, with the powers the BBFC has been given already. While individual review of millions of sites would be very expensive, maybe it is worth paying for the first five or ten thousand sites to be checked. (And if that doesn’t work, why not use machines to produce the lists?)

And then, it is just a matter of putting more cash the way of the BBFC and they can block more and more sites, to “make the Internet safe”.

That’s the point we are making. The power in the Digital Economy Bill given to the BBFC will create a mechanism to block literally millions of websites; the only real restraint is the amount of cash that MPs are willing to pour into the organisation.

What could possibly go wrong?

Read more uk_internet_censors.htm at MelonFarmers.co.uk

An interesting article in Wired reports on a recent Westminster eForum meeting at which the British establishment got together to discuss porn, internet censorship and child protection. A large portion of the article considers the issue that porn is not generally restricted just to ‘porn websites’. It is widely available on more mainstream websites such as Google Images. Stephen Winyard, director and VP of ICM Registry and council member of the Digital Policy Alliance, argued that Twitter is in fact commercially benefiting from the proliferation of pornography on the network:

It’s on Twitter, Reddit, Tumblr, mobile apps – Skype is used hugely for adult content. But Twitter is the largest platform for promoting pornography in the world – and it takes money for it. They pay Twitter money to advertise adult content.

Another good point was that the Digital Censorship Bill going through parliament is targeting the prevention of children ‘stumbling across’ porn. Hence even a partial blockade of porn may somehow reduce this problem. However, Adam Kinsley of Sky pointed out that partial blocking may not be so effective at stopping kids actively looking for porn. He noted:

The Digital Economy Bill’s exact objectives are a little uncertain, but we are trying to stop children stumbling on pornography — but they are not ‘stumbling’, they are looking for it and Twitter is where they will [find] it. Whether what the government is proposing will deal with that threat is unclear. Initially, it did not propose ISPs blocking content. When it comes to extremist sites, the Home Office asks social media platforms to take down content. The government does not ask us to block material – it has never done that. So this is a big deal. It doesn’t happen with the IWF; it doesn’t happen with terrorist material, and it wasn’t in the government’s original proposal. Whether they got it right and how will we deal with these millions of sites, is unclear.

We’re not really achieving anything if only dealing with a few sites.

The Bill is incredibly complex as it stands. David Austin, from the BBFC, pointed out that for it to implement the bill correctly, it needs to be effective, proportionate, respectful of privacy and accountable – and that the

Tens of millions of adults that go online to see legal content must be able to continue to do so.

At the same time, he said:

There is no silver bullet, no one model, no one sector that can achieve all child protection goals.

…Read the full article from wired.co.uk

Read more eu.htm at MelonFarmers.co.uk

France is considering appointing an official internet ombudsman to investigate complaints about online material in order to prevent excessive censorship and preserve free speech. A bill establishing a content qualification assessment procedure has been tabled in the French senate.

Dan Shefets, a Danish lawyer, explained one of the issues targeted by the bill:

ISPs face both penal and civil liability as soon as they are made aware of allegedly illicit content. One consequence of such liability is that smaller companies take down such content for fear of later sanctions.

The aim is to provide a simple procedure that will support firms operating online who are uncertain of their legal liabilities and to prevent over-zealous removal or censorship of material merely because it is the subject of a complaint. It could be copied by other European jurisdictions.

The idea is that a rapid response from the internet ombudsman would either order the material to be taken down or allow it to remain. As long as ISPs complied with the rulings, they would not face any fine or punishment.

Read more uk_internet_censors.htm at MelonFarmers.co.uk

The Lords had their first debate on the Digital Economy Bill, which includes laws to require age verification as well as an extension of outdated police and BBFC censorship rules to the internet. Lords inevitably queued up to support the age verification requirements. However, a couple of the lords made cautionary remarks about the privacy issues of websites being able to build up dangerous databases of personal ID information of porn users.

A couple of lords also spoke out against the BBFC/police/government censorship prohibitions being included in the bill. It was noted that these rules are outdated and disproportionate, and perhaps require further debate in another bill.

As an example of these points, the Earl of Erroll (cross bencher) said:

My Lords, I welcome the Bill because it has some very useful stuff in it — but, like everything else, it might benefit from some tweaking. Many other speakers mentioned the tweaks that need to be made, and if that happens I think that we may end up with quite a good Bill.

I will concentrate on age verification because I have been working on this issue with a group for about a year and three-quarters. We spotted that its profile was going to be raised because so many people were worried about it. We were the first group to bring together the people who run adult content websites — porn websites — with those who want to protect children. The interesting thing to come out quite quickly from the meetings was that, believe it or not, the people who run porn sites are not interested in corrupting children because they want to make money. What they want are adult, middle-aged people, with credit cards from whom they can extract money, preferably on a subscription basis or whatever. The stuff that children are getting access to is what are called teaser adverts. They are designed to draw people in to the harder stuff inside, you might say. The providers would be delighted to offer age verification right up front so long as all the others have to comply as well — otherwise they will get all the traffic. Children use up bandwidth. It costs the providers money and wastes their time, so they are very happy to go along with it. They will even help police it, for the simple reason that it will block the opposition. It is one of the few times I approve of the larger companies getting a competitive advantage in helping to police the smaller sites that try not to comply.

One of the things that became apparent early on was that we will not be able to do anything about foreign sites. They will not answer mail or do anything, so blocking is probably the only thing that will work. We are delighted that the Government has gone for that at this stage. Things need to get blocked fast or sites will get around it. So it is a case of block first, appeal later, and we will need a simple appeals system. I am sure that the BBFC will do a fine job, but we need something just in case.

Another thing that came back from the ISPs is that they want more clarity about what should be blocked, how it will be done and what they will have to do. There also needs to be indemnity. When the ISPs block something for intellectual property and copyright reasons, they are indemnified. They would need to have it for this as well, or there will be a great deal of reluctance, which will cause problems.

The next thing that came up was censorship. The whole point of this is we want to enforce online what is already illegal offline. We are not trying to increase censorship or censor new material. If it is illegal offline, it should be illegal online and we should be able to do something about it. This is about children viewing adult material and pornography online. I am afraid this is where I slightly disagree with the noble Baroness, Lady Kidron. We should decide what should be blocked elsewhere; we should not use the Bill to block other content that adults probably should not be watching either. It is a separate issue. The Bill is about protecting children. The challenge is that the Obscene Publications Act has some definitions and there is ATVOD stuff as well. They are supposed to be involved with time. CPS guidelines are out of step with current case law as a result of one of the quite recent cases — so there is a bit of a mess that needs clearing up. This is not the Bill to do it. We probably need to address it quite soon and keep the pressure on; that is the next step. But this Bill is about keeping children away from such material.

The noble Baroness, Lady Benjamin, made a very good point about social platforms. They are commercial. There are loopholes that will get exploited. It is probably unrealistic to block the whole of Twitter — it would make us look like idiots. On the other hand, there are other things we can do. This brings me to the point that other noble Lords made about ancillary service providers. If we start to make the payment service providers comply and help, they will make it less easy for those sites to make money. They will not be able to do certain things. I do not know what enforcement is possible. All these sites have to sign up to terms and conditions. Big retail websites such as Amazon sell films that would certainly come under this category. They should put an age check in front of the webpage. It is not difficult to do; they could easily comply.

We will probably need an enforcer as well. The BBFC is happy to be a regulator, and I think it is also happy to inform ISPs which sites should be blocked, but other enforcement stuff might need to be done. There is provision for it in the Bill. The Government may need to start looking for an enforcer.

Another point that has come up is about anonymity and privacy, which is paramount. Imagine the fallout if some hacker found a list of senior politicians who had had to go through an age-verification process on one of these websites, which would mean they had accessed them. They could bring down the Government or the Opposition overnight. Noble Lords could all go to the MindGeek website and look at the statistics, where there is a breakdown of which age groups and genders are accessing these websites. I have not dared to do so because it will show I have been to that website, which I am sure would show up somewhere on one of these investigatory powers web searches and could be dangerous.

One of the things the Digital Policy Alliance, which I chair, has done is sponsor a publicly available specification, which the BSI is behind as well. There is a lot of privacy-enforcing stuff in that. It is not totally obvious; it is not finished yet, and it is being highlighted a bit more. One thing we came up with is that websites should not store the identity of the people whom they age-check. In fact, in most cases, they will bounce straight off the website and be sent to someone called an attribute provider, who will check the age. They will probably know who the person is, but they will send back to the website only an encrypted token which says, We’ve checked this person that you sent to us. Store this token. This person is over 18 — or under 18, or whatever age they have asked to be confirmed. On their side, they will just keep a record of the token but will not say to which website they have issued it — they will not store that, either. The link is the token, so if a regulator or social service had to track it down, they could physically take the token from the porn site to where it came from, the attribute provider, and say, Can you check this person’s really over 18, because we think someone breached the security? What went wrong with your procedures? They can then reverse it and find out who the person was — but they could still perhaps not be told by the regulator which site it was. So there should be a security cut-out in there. A lot of work went into this because we all knew the danger. [A rough sketch of this token arrangement follows the speech below.]

This is where I agree entirely with the Open Rights Group, which thinks that such a measure should be mandated. Although the publicly available specification, which is almost like a British standard, says that privacy should be mandated under general data protection regulation out of Europe, which we all subscribe to, I am not sure that that is enough. It is a guideline at the end of the day and it depends on how much emphasis the BBFC decides to put on it. I am not sure that we should not just put something in the Bill to mandate that a website cannot keep a person’s identity. If the person after they have proved that they are 18 then decides to subscribe to the website freely and to give it credit card details and stuff like that, that is a different problem — I am not worried about that. That is something else. That should be kept extremely securely and I personally would not give my ID to such a site — but at the age verification end, it must be private.

There are some other funny things behind the scenes that I have been briefed on, such as the EU VAT reporting requirements under the VAT Mini One Stop Shop, which requires sites to keep some information which might make a person identifiable. That could apply if someone was using one of the attribute providers that uses a credit card to provide that check or if the website itself was doing that. There may be some things that people will have to be careful of. There are some perfectly good age-checking providers out there who can do it without you having to give your details. So it is a good idea; I think that it will help. Let us then worry about the point that the noble Baroness, Lady Kidron, made so well about what goes where.

The universal service obligation should be territorial; it has to cover the country and not just everyone’s homes. With the internet of things coming along — which I am also involved in because I am chair of the Hypercat Alliance, which is about resource discovery over the internet of things — one of the big problems is that we are going to need it everywhere: to do traffic monitoring, people flows and all the useful things we need. We cannot have little not-spots, or the Government will not be able to get the information on which to run all sorts of helpful control systems. The noble Lord, Lord Gordon of Strathblane, referred to mast sharing. The problem with it is that they then do not put masts in the not-spots; they just keep the money and work off just one mast — you still get the not-spots. If someone shares a mast, they should be forced to put up a mast somewhere else, which they then share as well.

On broadband take-up, people say, Oh, well, people aren’t asking for it. It is chicken and egg: until it is there, you do not know what it is good for. Once it is there and suddenly it is all useful, the applications will flow. We have to look to the future; we have to have some vision. Let us get chicken or the egg out there and the chicken will follow — I cannot remember which way round it is.

I agree entirely with the noble Lord, Lord Mitchell, that the problem with Openreach is that it will always be controlled by its holding company, which takes the investment, redirects it and decides where the money goes. That is the challenge with having it overseeing.

I do not want to waste much time, because I know that it is getting late-ish. On jobs, a huge number of jobs will be created in the early days in installing and maintaining internet of things sensors all over the place — that will change. On the gigabit stuff, it will save travel, energy and all sorts of things — we might even do remote-control hip operations, so you send the device and the surgeon then does it remotely, once we get super-duper superfast broadband.

I want to say one thing about IP. The Open Rights Group raised having thresholds of seriousness. It is quite important that we do not start prosecuting people on charges with 10-year sentences for trivial things. But it is also sad how interesting documentaries can disappear terribly quickly. The catch-up services cover only a month or so and if you are interested, it is quite nice being able to find these things out there on the internet a year or two later. There should somehow be a publicly available archive for all the people who produce interesting documentaries. I do not know whether they should make a small charge for it, but it should be out there.

The Open Rights Group also highlighted the bulk sharing of data. Some of the stuff will be very useful — the briefing on free school meals is interesting — but if you are the only person who really knows what might be leaked, it is very dangerous. If someone were to beat you up, an ordinary register could leak your address across without realising that at that point you are about to go into witness protection. There can be lots of problems with bulk data sharing, so be careful; that is why the insurance database was killed off a few years ago. Apart from that, I thank your Lordships for listening and say that, in general, this is a good effort.
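
The token arrangement the Earl describes (the visitor bounced to an attribute provider, an opaque token handed back, and neither side holding the full picture) can be sketched roughly as follows. This is a minimal illustration only: the class names, the use of HMAC signing and the in-memory stores are assumptions made for the sketch, not anything taken from the BSI publicly available specification or from any real attribute provider. In a real deployment the token would be signed with the provider's private key so a site could check it without calling back; the verify() call below stands in for that.

import hashlib
import hmac
import secrets

class AttributeProvider:
    """Checks a person's age and hands back an opaque signed token.
    It records which person each token belongs to (for a regulator's
    audit), but NOT which website the token was issued for."""

    def __init__(self):
        self._key = secrets.token_bytes(32)   # signing key, never shared
        self._issued = {}                     # token_id -> person reference

    def issue_token(self, person_id: str, over_18: bool) -> str:
        token_id = secrets.token_hex(16)      # random, reveals nothing about the person
        self._issued[token_id] = person_id    # kept only for later audit
        claim = f"{token_id}:over18={over_18}"
        sig = hmac.new(self._key, claim.encode(), hashlib.sha256).hexdigest()
        return f"{claim}:{sig}"

    def verify(self, token: str) -> bool:
        # Confirms the token is genuine; nothing about the asking site is stored.
        claim, _, sig = token.rpartition(":")
        expected = hmac.new(self._key, claim.encode(), hashlib.sha256).hexdigest()
        return hmac.compare_digest(sig, expected)

    def audit(self, token_from_regulator: str) -> str:
        # The "security cut-out": only a token physically brought back from a
        # site can be reversed into a person, and even then the provider need
        # not be told which site it came from.
        token_id = token_from_regulator.split(":", 1)[0]
        return self._issued[token_id]

class AdultSite:
    """Stores only the opaque token, never the visitor's identity."""

    def __init__(self):
        self._tokens = set()

    def admit(self, token: str, provider: AttributeProvider) -> bool:
        if provider.verify(token) and ":over18=True:" in token:
            self._tokens.add(token)           # the site's only record
            return True
        return False

# Example flow with a hypothetical person reference: the site never learns who the visitor is.
provider = AttributeProvider()
site = AdultSite()
token = provider.issue_token("citizen-42", over_18=True)
print(site.admit(token, provider))            # True: access granted on the strength of the token alone

The cut-out shows up in the data each side keeps: the site holds only tokens, and the provider links tokens to people but not to sites, so reversing an age check requires physically carrying a token from one party to the other, exactly as described in the speech.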

Read more bw.htm at MelonFarmers.co.uk

The BBFC currently cuts about 15% of all R18 porn films on their way to totally ordinary mainstream porn shops. These are not niche or speciality films, they are totally middle of the road porn, which represents the sort of content on all the world’s major porn sites. Most of the cuts are ludicrous but Murray Perkins, a senior examiner of the BBFC, points out that they are all considered either to be harmful, or else are still prohibited by the police or the government for reasons that have long since passed their sell-by date. So about a sixth of all the world’s adult films are therefore considered prohibited by the British authorities, and so any website containing such films will have to be banned, as there is no practical way to cut out the bits that wind up censors, police or government. And this mainstream but prohibited content appears on just about all the world’s major porn sites, free or paid.

The main prohibitions that will cause a website to be blocked (even before considering whether they will set up strict age verification) are such mainstream content as female ejaculation, urine play, gagging during blow jobs, rough sex, incest story lines (which is a major genre of porn at the moment), use of the word ‘teen’ and verbal references to under-18s.

Murray Perkins has picked up the job of explaining this catch-all ban. He explains it well, but he tries to throw readers off track by citing examples of prohibitions being justifiable because they apply to violent porn, whilst not mentioning that they apply equally well to trivia such as female squirting.

Perkins writes in the Huffington Post:

Recent media reports highlighting what content will be defined as prohibited material under the terms of the Digital Economy Bill could have given an inaccurate impression of the serious nature of the harmful material that the BBFC generally refuses to classify. The BBFC works only to the BBFC Classification Guidelines and UK law, with guidance from the Crown Prosecution Service (CPS) and enforcement bodies, and not to any other lists.

The Digital Economy Bill aims to reduce the risk of children and young people accessing, or stumbling across, pornographic content online. It proposes that the BBFC check whether

(i) robust age verification is in place on websites containing pornographic content and

(ii) whether the website or app contains pornographic content that is prohibited.

An amendment to the Digital Economy Bill, passed in the House of Commons, would also permit the BBFC to ask Internet Service Providers (ISPs) to block pornographic websites that refuse to offer effective age verification or contain prohibited material such as sexually violent pornography.

In making any assessment of content, the BBFC will apply the standards used to classify pornography that is distributed offline. Under the Video Recordings Act 1984 the BBFC is obliged to consider harm when classifying any content including 18 and R18 rated sex works. Examples of material that the BBFC refuses to classify include pornographic works that: depict and encourage rape, including gang rape; depict non-consensual violent abuse against women; promote an interest in incestuous behaviour; and promote an interest in sex with children. [Perkins misleadingly neglects to include squirting, gagging, and urine play in his examples here]. The Digital Economy Bill defines this type of unclassifiable material as prohibited.

Under its letters of designation the BBFC may not classify anything that may breach criminal law, including the Obscene Publications Act (OPA) as currently interpreted by the Crown Prosecution Service (CPS). The CPS provides guidance on acts which are most commonly prosecuted under the OPA. The BBFC is required to follow this guidance when classifying content offline and will be required to do the same under the Digital Economy Bill. In 2015, 12% of all cuts made to pornographic works classified by the BBFC were compulsory cuts under the OPA. The majority of these cuts were to scenes involving urolagnia which is in breach of CPS guidance and could be subject to prosecution.

Read more me_internet.htm at MelonFarmers.co.uk

Facebook, Microsoft, Twitter and YouTube are coming together to help curb the spread of terrorist content online. There is no place for content that promotes terrorism on our hosted consumer services. When alerted, we take swift action against this kind of content in accordance with our respective policies.

We have committed to the creation of a shared industry database of hashes (unique digital fingerprints) for violent terrorist imagery or terrorist recruitment videos or images that we have removed from our services. By sharing this information with each other, we may use the shared hashes to help identify potential terrorist content on our respective hosted consumer platforms. We hope this collaboration will lead to greater efficiency as we continue to enforce our policies to help curb the pressing global issue of terrorist content online.

Our companies will begin sharing hashes of the most extreme and egregious terrorist images and videos we have removed from our services: content most likely to violate all of our respective companies’ content policies. Participating companies can add hashes of terrorist images or videos that are identified on one of our platforms to the database. Other participating companies can then use those hashes to identify such content on their services, review against their respective policies and definitions, and remove matching content as appropriate.

As we continue to collaborate and share best practices, each company will independently determine what image and video hashes to contribute to the shared database. No personally identifiable information will be shared, and matching content will not be automatically removed. Each company will continue to apply its own policies and definitions of terrorist content when deciding whether to remove content when a match to a shared hash is found. And each company will continue to apply its practice of transparency and review for any government requests, as well as retain its own appeal process for removal decisions and grievances. As part of this collaboration, we will all focus on how to involve additional companies in the future.

Throughout this collaboration, we are committed to protecting our users’ privacy and their ability to express themselves freely and safely on our platforms. We also seek to engage with the wider community of interested stakeholders in a transparent, thoughtful and responsible way as we further our shared objective to prevent the spread of terrorist content online while respecting human rights.
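
As a rough illustration of the workflow the companies describe above, the sketch below shows how a shared hash database might sit between firms that each keep their own rules. All names here are invented for the example; the announcement does not describe an implementation, and the plain SHA-256 fingerprint used below is a stand-in for whatever fingerprinting scheme the companies actually use.

import hashlib

class SharedHashDatabase:
    """Holds only content fingerprints: no user data, no media files."""

    def __init__(self):
        self._hashes = set()

    def contribute(self, fingerprint: str) -> None:
        self._hashes.add(fingerprint)

    def contains(self, fingerprint: str) -> bool:
        return fingerprint in self._hashes

def fingerprint(content: bytes) -> str:
    # Stand-in for a real fingerprinting function.
    return hashlib.sha256(content).hexdigest()

class ParticipatingCompany:
    def __init__(self, name: str, violates_policy):
        self.name = name
        self.violates_policy = violates_policy    # each company keeps its own definitions

    def remove_and_share(self, content: bytes, db: SharedHashDatabase) -> None:
        # Called after the company has already removed the content under its own policies.
        db.contribute(fingerprint(content))

    def review_upload(self, content: bytes, db: SharedHashDatabase) -> str:
        if db.contains(fingerprint(content)):
            # A match only flags the content for review; removal still depends
            # on this company's own policies, as the statement says.
            return "remove" if self.violates_policy(content) else "keep"
        return "keep"

# Example: company A contributes a hash of something it has removed;
# company B then reviews the same bytes under its own, different rules.
db = SharedHashDatabase()
company_a = ParticipatingCompany("A", violates_policy=lambda c: True)
company_b = ParticipatingCompany("B", violates_policy=lambda c: False)
clip = b"...video bytes..."
company_a.remove_and_share(clip, db)
print(company_b.review_upload(clip, db))   # "keep": a database match never forces removal

The key property mirrored here is the one in the statement itself: the database carries no personal data, and a hash match triggers a review under each company's own policies rather than an automatic removal.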