LONDON, April 13 Facebook failed to remove dozens of instances of extremist and child pornography even after the social network's moderators were directly informed of the potentially illegal content, an investigation by The Times showed on Thursday.
Using a fake profile set up last month, a Times journalist found images and videos glorifying Islamic State and recent deadly attacks in London and Egypt, along with graphic images of child abuse, and asked site moderators to remove them.
Facebook moderators removed some of the reported images but left untouched pro-jihadist posts praising recent attacks and calling for new ones. The company appeared to take action only after The Times identified itself as reporting a story on the matter.
Failure to remove content which is illegal under British law after company officials have been notified of its existence could expose Facebook to criminal prosecution for its role in encouraging the publication and distribution of such imagery.
The social media giant faces new laws in countries around the world to force it to move faster to combat illegal content but it has struggled to keep pace as illicit posts can reappear as fast as they are identified and taken down.
A Facebook spokesman said the company had now removed all the images identified by the Times as potentially illegal, acknowledging that they "violate our policies and have no place on Facebook".
"We are sorry that this occurred," Facebook Vice President of Operations Justin Osofsky said in a statement. "It is clear that we can do better, and we'll continue to work hard to live up to the high standards people rightly expect of Facebook.”
A spokesman for London's Metropolitan Police urged individuals to report extremist content to the force via an online form. The police declined to comment on whether they were investigating whether Facebook had failed to act after being notified of the illegal content.
"Where material breaches UK terrorism laws, the Counter Terrorism Internet Referral Unit (CTIRU) will, where possible, seek the removal of the content by working with the relevant internet hosting company," the spokesman said. (Reporting By Eric Auchard; editing by Stephen Addison)