Posts Tagged ‘YouTube’

Read more Internet News at MelonFarmers.co.uk

Facebook, Microsoft, Twitter and YouTube are coming together to help curb the spread of terrorist content online. There is no place for content that promotes terrorism on our hosted consumer services. When alerted, we take swift action against this kind of content in accordance with our respective policies.

We have committed to the creation of a shared industry database of hashes (unique digital fingerprints) for violent terrorist imagery or terrorist recruitment videos or images that we have removed from our services. By sharing this information with each other, we may use the shared hashes to help identify potential terrorist content on our respective hosted consumer platforms. We hope this collaboration will lead to greater efficiency as we continue to enforce our policies to help curb the pressing global issue of terrorist content online.

Our companies will begin sharing hashes of the most extreme and egregious terrorist images and videos we have removed from our services: content most likely to violate all of our respective companies’ content policies. Participating companies can add hashes of terrorist images or videos that are identified on one of our platforms to the database. Other participating companies can then use those hashes to identify such content on their services, review against their respective policies and definitions, and remove matching content as appropriate.

As we continue to collaborate and share best practices, each company will independently determine what image and video hashes to contribute to the shared database. No personally identifiable information will be shared, and matching content will not be automatically removed. Each company will continue to apply its own policies and definitions of terrorist content when deciding whether to remove content when a match to a shared hash is found. And each company will continue to apply its practice of transparency and review for any government requests, as well as retain its own appeal process for removal decisions and grievances. As part of this collaboration, we will all focus on how to involve additional companies in the future.
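As a rough illustration of the workflow described above, the sketch below shows hashes of removed material being contributed to a shared set and new uploads being checked against it, with matches routed to human review rather than removed automatically. An ordinary SHA-256 digest stands in for whatever fingerprinting method the companies actually use, which the announcement does not name.

    import hashlib

    def fingerprint(media_bytes: bytes) -> str:
        # Assumption: a plain SHA-256 digest stands in for the unnamed
        # fingerprinting method; a real system would more likely use a
        # perceptual hash so that re-encoded copies still match.
        return hashlib.sha256(media_bytes).hexdigest()

    shared_hashes = set()  # the shared industry database, modelled as a simple set

    def contribute(media_bytes: bytes) -> None:
        # Each company independently decides what it contributes.
        shared_hashes.add(fingerprint(media_bytes))

    def check_upload(media_bytes: bytes) -> str:
        # A match is only a signal for review against the host company's own
        # policies and definitions; matching content is not removed automatically.
        if fingerprint(media_bytes) in shared_hashes:
            return "flag for policy review"
        return "no match"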

Throughout this collaboration, we are committed to protecting our users’ privacy and their ability to express themselves freely and safely on our platforms. We also seek to engage with the wider community of interested stakeholders in a transparent, thoughtful and responsible way as we further our shared objective to prevent the spread of terrorist content online while respecting human rights.

Read more Internet News at MelonFarmers.co.uk

YouTube is looking for what it calls ‘heroes’ to snitch on videos and inappropriate comments, but early feedback has been overwhelmingly negative, with users describing it as crowdsourced censorship. Users who join the Heroes program will earn points for adding captions and subtitles to videos, flagging inappropriate videos and answering questions on the site’s Help forum.

Accruing points will earn them rather underwhelming and cheapo ‘privileges’ like joining video chats with others in the Heroes program, exclusive previews of upcoming product launches and the ability to flag abusive videos en masse instead of one at a time.

However, YouTube employees ultimately make the final decision on what to do with content marked as inappropriate.
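A minimal sketch of the points-and-privileges mechanics described above might look like the following. The point values and level thresholds are invented for illustration; YouTube has not published the real ones.

    # Hypothetical point values; the announcement does not state how many
    # points each action is worth.
    POINTS = {"add_caption": 1, "flag_video": 1, "answer_help_question": 1}

    # Hypothetical thresholds for the 'privileges' mentioned above.
    LEVELS = [
        (0, "basic flagging"),
        (100, "mass flagging"),
        (400, "video chats and product previews"),
    ]

    def privileges(total_points):
        # Every privilege whose threshold the user has reached.
        return [name for threshold, name in LEVELS if total_points >= threshold]

    # Whatever a Hero flags, YouTube employees still make the final decision.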

Users on YouTube made their voices heard almost immediately, with an overwhelming number of Dislikes on the announcement video. It currently has over 200,000 Dislikes compared to 3,000 Likes, after nearly 600,000 views.

Read more Internet News at MelonFarmers.co.uk

A change in YouTube’s content moderation system has left many video creators uncertain of their place on the platform. Over the past day, users have been posting notices from Google saying that certain videos were being barred from accepting advertising via YouTube’s ad service. The videos were often arbitrarily flagged for reasons that seemed unfair, unclear, or outright censorious. YouTube have explained that the changes are due to a change of process rather than a change of rules, and have added that a new appeal process has been introduced for those who consider they have been unfairly treated.

The Google rules for videos suitable for advertising are as follows:

Content that is considered “not advertiser-friendly” includes, but is not limited to:

  • Sexually suggestive content, including partial nudity and sexual humor
  • Violence, including display of serious injury and events related to violent extremism
  • Inappropriate language, including harassment, profanity and vulgar language
  • Promotion of drugs and regulated substances, including selling, use and abuse of such items
  • Controversial or sensitive subjects and events, including subjects related to war, political conflicts, natural disasters and tragedies, even if graphic imagery is not shown
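As a rough sketch of how such a list might be applied as a demonetisation check, the snippet below matches a video's content flags against the categories above. The flag names and the set-based representation are purely illustrative; Google does not describe how it detects or stores these categories.

    # Illustrative flag names mirroring the categories listed above.
    NOT_ADVERTISER_FRIENDLY = {
        "sexually_suggestive",
        "violence",
        "inappropriate_language",
        "drugs_and_regulated_substances",
        "controversial_or_sensitive_subjects",
    }

    def ads_allowed(content_flags):
        # Any overlap with the blocked categories bars the video from advertising.
        return not (set(content_flags) & NOT_ADVERTISER_FRIENDLY)

    # e.g. ads_allowed({"violence"}) is False, even if no graphic imagery is shown.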
Read more BBFC News at MelonFarmers.co.uk

See article from newstatesman.com by Mark Owen

The web inevitably makes available some content which is unsuitable or inappropriate for children to access. Some of this will be illegal, but much more will not, or may be suitable only for, say, over-13s or over-16s. A traffic light system may therefore struggle to distinguish between these, and runs the risk of imposing the strictest warning on masses of content by default.

A greater concern, however, is how the new system will guard against becoming a tool to enable prejudices of one kind or another to be played out. The system can only operate if it is the crowd’s decision which counts – the reason this is even being considered is that there is too much content for a regulator or platform to consider. Relying on the crowd assumes that a collective consciousness emerges from the great mass of web users and their shared values, rather than a set of subjective reactions. This is a dangerous assumption. As a recent MIT study reported in Science suggests, the wisdom of the crowd may be a myth, its mentality more akin to that of a mob or herd.

…Read the full article

Read more BBFC News at MelonFarmers.co.uk

See article from guardian.co.uk

Film censors in three countries, including the UK, are to pilot a program in which amateur video-makers can self-classify their postings.

Under the traffic light system, green footage would be suitable for all, amber for 12 year-olds and up, and red for adults only.

The project, developed by the British Board of Film Classification in collaboration with partners in Italy and the Netherlands, could also allow powerful internet service providers and search engines a new path through the current controversy about their uncensored content.

Amateur film-makers will be able to rate the films they put online according to national ratings categories, and the whole process could then be further policed by users of the site. Participating websites would have the option of letting viewers comment on the way that each film has been rated, alerting both users and the relevant national authorities to any serious transgressions.

The idea of offering a do-it-yourself rating service for user-generated content came out of international discussions with the parallel bodies in charge of film censorship and classification.

David Austin, assistant director of policy and public affairs at the BBFC, said:

We already classify some 10,000 videos and films that are submitted to us for release every year and we will be using much the same classification model in the pilot for user-generated content.

The sheer amount of private video footage uploaded on popular sites such as YouTube means there is no way any board could tackle it. The volume is so great that it became clear the answer was to get those who are making and posting the films to rate them for users.

Consultation with the Dutch film regulator led to the idea that an online questionnaire comprising simple questions about the nature of the content could be made to apply across international boundaries. Austin explained the procedure:

We will not be asking people to make value judgments about their films. They just have to answer simple questions about the content, such as ‘Does this video contain X, Y or Z, and if so, how long is the scene?’

In Britain the usual six ratings categories for films will be reduced to three:

We felt that six would be too complicated, said Austin, so we have conflated U, which means suitable for all, with PG, parental guidance, and then the age category 12 with 15, and finally 18, suitable only for adults, with R18, which covers those adult works intended for licensed premises only.

We will represent these three categories with the traffic light symbols green, amber and red.

The scheme will be voluntary and service providers and search engines will be able to decide how their users want to see the ratings displayed.

At this stage a lot of it depends on how much the search engines buy into the scheme. We want to help them look after their sites, and if some of the big ones get involved, then they can make the age-rating option available for everything.

The crowdsource monitoring option would then allow users to judge the chosen rating and to spot abuses of the system. If there is a serious problem, such as an example of hate speech or of child abuse, it can be reported.
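Putting the pieces together, a minimal sketch of the self-rating flow might run as follows: the uploader answers simple content questions, the answers map to one of the three conflated categories, and the category is shown as a traffic light. The question names and the mapping are invented for illustration; the article only gives the example question and the U/PG, 12/15 and 18/R18 conflation.

    TRAFFIC_LIGHT = {"U/PG": "green", "12/15": "amber", "18/R18": "red"}

    def self_rate(answers):
        # 'answers' holds the uploader's responses to simple content questions,
        # e.g. {"strong_sex": False, "strong_language": True}. The questions and
        # thresholds below are illustrative, not the BBFC's actual questionnaire.
        if answers.get("strong_sex") or answers.get("strong_violence"):
            category = "18/R18"
        elif answers.get("strong_language") or answers.get("moderate_violence"):
            category = "12/15"
        else:
            category = "U/PG"
        return TRAFFIC_LIGHT[category]

    # e.g. self_rate({"strong_language": True}) returns "amber".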

The ‘conflation’ of 12 and 15 seems to be the devil in the detail. 12 is very much the new PG, and the current guidelines define it as more or less suitable for kids over 8, albeit with parental discretion. All modern family blockbusters fit into this category.

Surely you cannot have currently 15-rated strong language, horror films and sex scenes noted as suitable for 12-year-olds. So the lack of a separate 15 rating means that anything with more than a couple of swearwords, or that is a bit violent, or even a bit sexy, has nowhere to go except an 18 rating.

So it appears that the ratings scheme only offers three choices: suitable for kids under 8, suitable for kids over 8, and adults only. It sounds like the powers that be are working towards a cheap and easy-to-implement, kids-or-adults internet censorship scheme.

Read more ATVOD Watch at MelonFarmers.co.uk

See article from pocket-lint.com

The Financial Times is reporting that Google will launch paid subscription channels on YouTube sometime very soon. Channels will be priced from about £1.30 a month. The idea would allow traditional broadcasters to offer content to viewers.

YouTube has been interested in creating more high-quality channels for some time now. Recently it awarded grants of $1 million to several UK bidders who pitched channel ideas.

There is one interesting side issue here, because at some point YouTube will become, in the eyes of the UK government – and likely others – a broadcaster. When that happens, the firm is going to have to obey UK censorship laws and make sure that under-18s are protected from unsuitable content.

Pocket-lint understands that the money YouTube gave to its channel partners to start channels was paid in advance specifically to avoid the need to be censored by ATVOD and Ofcom.

ATVOD’s censorship fees are very expensive and the money is mostly spent dreaming up ways to suffocate the UK adult internet business.

YouTube is currently outside of the grasp of ATVOD, as user content is specifically excused from their censorship under European law. However, material from commercial channels that may qualify as TV programmes is not exempt from TV censorship once it is under editorial control and uploaded by the channels themselves.

Read more UK Parliament Watch at MelonFarmers.co.uk

See article from guardian.co.uk

MP Heidi Alexander has launched a private member’s bill that would allow police to censor social media videos that incite violence. She has been at the forefront of attacks against social media since the riots in August.

MPs have now backed a call for police to be given censorship powers to block or take down YouTube videos that could incite violence.  MPs voted in favour of allowing Alexander to bring forward her bill, which will receive a second reading in March. However, the proposals are unlikely to become law without government support.

Alexander told MPs:

I am introducing this bill because I am appalled by the proliferation of online videos which glorify gangs and serious youth violence.

Police, via the courts and internet service providers, need to be given explicit power to get these videos taken down or access to them blocked.

I recognise the policing of the internet is always going to be incredibly difficult but unless we start to grapple with the online manifestation of gangs, I question our ability to really tackle the problem.

We can talk about gang injunctions all we like, and yes, there may be a need to stop a certain individual or group coming into a certain area at a certain time, but don’t we too need to recognise that the same individual may be causing an equal amount of fear by his or her actions sat on a computer at home, or spreading these vile videos through social networking sites?

Similar powers already exist to take down or block access to websites that could incite racial hatred or feature extremist material.