
Reining in Facebook: the company CAN identify suspicious material

Editorial Staff

Facebook is a distribution channel for child pornography, but it is young Danes who are being prosecuted after Facebook identified the material and told US authorities, who in turn reported the distribution to the police in Denmark.

Facebook will be seeking plaudits for identifying a video of two fifteen-year-old children having sex which was being passed around using Facebook Messenger. More than 1,000 Danes, mostly teenagers, have been charged by the Danish police with the offence of distributing sexually explicit material. Moreover, because the participants in the video are minors, that is, under 18, the images constitute "indecent images of children," i.e. child pornography.

If they are convicted of distributing pornography, they will be listed on the child pornography offenders register for ten years. If they are convicted of the lesser offence, of distributing sexually explicit material, they can be jailed for up to 20 days but will not be listed on a sexual offenders register.

The platform has a history of being a medium used by child pornographers. A US Marine, Lance Corporal Billie Morton, is facing a General Court Martial. He is alleged to have used Facebook to obtain "incest pics" and "nudes," according to a report in the San Diego Reader on 1 December last year. The case is being handled by real-life Naval Criminal Investigative Service (NCIS) officers. Morton, it is alleged, used a government-owned computer for his activities and was discovered when another marine opened the PC and found Morton's log-in still active and his Facebook page open.

In 2016, a 14-year-old girl sued Facebook in Belfast, Northern Ireland. A user had published and distributed naked images of the girl using the Facebook platform. The case was described as "revenge porn," a name that may be seen as a bit too "cuddly" for the harm it causes. Facebook tried to solve the problem by banning all images of nudity and of children, but that idea ran aground pretty quickly after it repeatedly removed an iconic image of a young, naked victim of the war in Vietnam.

Facebook does make an effort, it has to be said. Using software called PhotoDNA, it scans all uploaded images and compares them against a database of known, previously reported material. But the software doesn't recognise pornography per se: unless an image has already been reported and added to that database, it is not flagged.
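The matching described above can be sketched in miniature. This is a simplified, hypothetical illustration only: PhotoDNA itself is proprietary and uses a robust perceptual hash that survives resizing and re-encoding, whereas the plain cryptographic hash below matches only byte-identical files. The hash set and function names are assumptions for the sketch.

```python
import hashlib

# Hypothetical database of fingerprints of previously reported images.
# (This entry is the SHA-256 digest of the bytes b"test", used as a stand-in.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest serving as the image's fingerprint.

    A real system would use a perceptual hash here, not SHA-256,
    so that near-duplicates (recompressed, resized copies) still match.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def is_known(image_bytes: bytes) -> bool:
    """True only if the upload matches an already-reported image.

    This is the key limitation the article notes: an image that has
    never been reported produces no match and passes through unflagged.
    """
    return fingerprint(image_bytes) in KNOWN_HASHES
```

The design explains the Danish case: the video was flagged because it had been reported and fingerprinted, not because the system understood its content.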

The law in many countries protects Facebook and similar sites by not treating them as publishers (which would make them liable for whatever appears on the site); instead, they are culpable only if they are notified of offending material and fail to take it down within a reasonable time.

The deeper problem is that once an image appears on social media it can be replicated, both within that platform and across the wider web, many times over within a short period - and there is no record of where the copies have gone.



