Advocacy group urges social media platforms be held accountable for content they publish
One of the advantages of the internet is that it provides a wealth of knowledge to anyone with a device that can access it.
However, a downside is that, with so much information available, much of it is unverified, and some is inaccurate enough to be harmful.
Because of this, many believe social media companies, such as Facebook and YouTube, should be held accountable for the information shared on their websites.
A new research report released by watchdog group FRIENDS of Canadian Broadcasting argues these companies should be considered publishers, and thus held accountable for user-generated content published to their platforms.
“Our elected officials don’t need to create new laws to deal with this problem. They don’t need to define harmful content, police social media, or constrain free expression in any new way. All government needs to do is apply existing laws," Daniel Bernhard, Executive Director for FRIENDS, said in a news release.
"But if a judge decides that content circulated on social media breaks the law, the platform which publishes and recommends that illegal content must be held liable for it,” he continued.
In their defense, social media companies have argued that they function simply as bulletin boards displaying user-generated content without editorial control; they posit that it would be impossible to find illegal content among the 100 billion posts made daily.
Yet platforms such as Facebook tell advertisers that they have technology that recognizes content before it is published and pushed out to others.
Additionally, Facebook routinely exercises editorial control: it promotes content users never asked to see, including extreme content that would land other publishers in legal trouble, and it conceals content from users without consulting them.
“Facebook and other social media platforms have complaints processes where they are alerted to potentially illegal or otherwise objectionable content. Yet it is their own community standards, not the law, which dictates whether they will remove a post," George Carothers, director of research for FRIENDS, said in the same release.
"Even then Facebook employees say that the company does not apply its own standards when prominent right-wing groups are involved,” he continued.