When word broke that the massacre in New Zealand was livestreamed on Facebook, I immediately thought of Robert Godwin Sr. In 2017, Godwin was murdered in Cleveland, Ohio, and initial reports indicated that the attacker streamed it on Facebook Live, at the time a relatively new feature of the social network. Facebook later clarified that the graphic video was uploaded after the event, but the incident called public attention to the risks of livestreaming violence.

In the wake of Godwin’s murder, I recommended that Facebook Live broadcasts be time-delayed, at least for Facebook users who had told the company they were under 18. That way, adult users would have an opportunity to flag inappropriate content before children were exposed to it.

Facebook Live has broadcast killings, as well as other serious crimes such as sexual assault, torture and child abuse. Though the company has hired more than 3,000 additional human content moderators, Facebook is not any better at keeping horrifying violence from streaming live online without any filter or warning for users.

In the 24 hours after the New Zealand massacre, 1.5 million videos and images of the killings were uploaded to Facebook’s servers, the company announced. Facebook highlighted the fact that 1.2 million of them “were blocked at upload.” However, as a social media researcher and educator, I heard that as an admission that 300,000 videos and images of a mass murder passed through its automated systems and were visible on the platform.

The company recently issued some analytic details and noted that fewer than 200 people viewed the livestream of the massacre, and that, surprisingly, no users reported it to Facebook until after it ended. These details make painfully clear how dependent Facebook is on users to flag harmful content. They also suggest that people don’t know how to report inappropriate content – or don’t have confidence the company will act on the complaint.

[Video: Facebook founder and CEO Mark Zuckerberg discusses the murder of Robert Godwin Sr.]

In the television industry, short time-delays of a few seconds are typical during broadcasts of live events. That time allows a moderator to review the content and confirm that it’s appropriate for a broad audience. Facebook relies on users as moderators, and some livestreams may not have a large audience like TV, so its delay would need to be longer, perhaps a few minutes. Only then would enough adult users have screened it and had the chance to report its content. Major users, including publishers and corporations, could be permitted to livestream directly after completing a training course. Facebook could even let people request a company moderator for upcoming livestreams.

Facebook has not yet taken this relatively simple step – and the reason is clear. Time-delays took hold in TV only because broadcasting regulators penalized broadcasters for airing inappropriate content during live shows. There is effectively no regulation for social media companies; they change only in pursuit of profits or to minimize public outcry.

Whether and how to regulate social media is a political question, but many U.S. politicians have developed deep ties with platforms like Facebook. Some have relied on social media to collect donations, target supporters with advertising and help them get elected. Once in office, they continue to use social media to communicate with supporters in hopes of getting reelected. Federal agencies also use social media to communicate with the public and influence people’s opinions – even in violation of U.S. law. In my view, Facebook’s role as a tool to gain, keep and spread political power makes politicians far less likely to rein it in.

US regulation isn’t coming soon

Congress has not yet taken any meaningful action to regulate social media companies. Despite strong statements from politicians and even calls for hearings about social media in response to the New Zealand attack, U.S. regulators aren’t likely to lead the way. European Union officials are handling much of the work, especially around privacy. New Zealand’s government has stepped up, too, banning the livestream video of the mosque massacre, meaning anyone who shares it could face up to NZ$10,000 in fines and 14 years in prison. At least two people have already been arrested for sharing it online.

Much of the discussion about regulating social media has considered using anti-trust and monopoly laws to force enormous technology giants like Facebook to break up into smaller separate companies. But if it happens at all, that will be very difficult: breaking up AT&T lasted a decade, from the 1974 lawsuit to the 1984 launch of the “Baby Bell” companies. In the interim, there will be many more dangerous and violent incidents people will try to livestream. Facebook should evaluate its products’ potential for misuse and discontinue them if the effects are harmful to society.
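The time-delayed moderation idea discussed above – hold each live segment for a few minutes, and stop broadcast the moment any viewer flags it – can be sketched in a few lines of code. This is purely an illustration of the mechanism, not Facebook’s actual system; the class name, method names and the 180-second default are all invented for the example.

```python
import time
from collections import deque

class DelayedStream:
    """Illustrative buffer: hold each live segment for `delay` seconds
    so viewers or moderators can flag it before it is broadcast."""

    def __init__(self, delay=180):  # "perhaps a few minutes," as suggested above
        self.delay = delay
        self.buffer = deque()       # (arrival_time, segment) pairs, oldest first
        self.flagged = False        # set as soon as any user reports the stream

    def ingest(self, segment, now=None):
        """Accept a segment from the broadcaster; it is not yet visible."""
        now = time.monotonic() if now is None else now
        self.buffer.append((now, segment))

    def flag(self):
        """A single report halts further broadcast pending human review."""
        self.flagged = True

    def release(self, now=None):
        """Return segments whose hold period has elapsed, unless flagged."""
        now = time.monotonic() if now is None else now
        out = []
        while self.buffer and not self.flagged:
            arrived, segment = self.buffer[0]
            if now - arrived < self.delay:
                break  # the oldest segment is still inside its hold window
            self.buffer.popleft()
            out.append(segment)
        return out
```

The trusted-broadcaster tier described above (publishers who complete a training course) could map onto this sketch as simply constructing their stream with `delay=0`.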