But they literally cannot moderate their platform. The amount of data Facebook sees every minute would bankrupt any company that had to hire enough people to go through all that content and determine what's fine and what isn't. And that isn't even taking into account the mental and emotional damage a person suffers just from seeing all the vile and despicable shit that gets posted. AI moderation isn't advanced enough, and the human moderation cost is so great, that the giant social media companies will pretty much never be able to self-moderate. Reddit was only able to moderate itself (to an extent) because it had an endless supply of free mods. Facebook doesn't have that same luxury.
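To put rough numbers on that, here's a back-of-envelope sketch. Every figure in it (items uploaded per day, the fraction escalated to humans, seconds per review, the salary) is an illustrative assumption, not an actual Meta statistic:

```python
# Back-of-envelope estimate of human moderation cost at Facebook scale.
# All inputs below are illustrative assumptions, not real Meta figures.

items_per_day = 3_000_000_000   # assumed posts/comments/images uploaded daily
review_rate = 0.01              # assumed fraction needing human review after automated filtering
seconds_per_item = 30           # assumed average review time per item
work_seconds_per_year = 8 * 3600 * 250  # one moderator: 8h/day, 250 workdays/year
salary = 50_000                 # $/year, matching the figure in this thread

items_per_year = items_per_day * 365 * review_rate
items_per_moderator = work_seconds_per_year / seconds_per_item
moderators_needed = items_per_year / items_per_moderator
annual_cost = moderators_needed * salary

print(f"moderators needed: {moderators_needed:,.0f}")   # ~45,600
print(f"annual salary cost: ${annual_cost:,.0f}")       # ~$2.3 billion
```

Even with 99% of content assumed to be handled automatically, the residual human review lands in the billions of dollars per year under these made-up inputs, which lines up with the "Meta spends billions" point further down the thread.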
Sounds like they should be bankrupt then
Why pay $5 million a year for 100 mods at $50k/year when you can just pay a few hundred thousand in fines and let the government move the walls of your garden for you?
I guess. But they could do a better job with user-reported content, which they very much don't.
iirc Meta on its own spends billions on content moderation, much more than other companies generally do. the problem with content moderation is that you only see the stuff they miss, not the stuff they already filtered out.
on the topic of weeding out CSAM, a surprising example of a company that gave up on it is Nintendo. Flipnote (a 3DS application that let you send post-it-style notes to others) was used by predators in Japan to lure children. Nintendo deemed it unmoderatable and removed it, and no chat feature has functionally replaced it since.
moderation is super tough, and you can hear some really fucked up stories about what these people go through, even the ones who have to sift through even more content (e.g. people who filter content in China for government surveillance) and how it affected their lives.
I’ve reported probably a thousand pictures of swastika tattoos and shit that they don’t remove, and people calling other people homophobic slurs. I don’t think anyone reviews those reports.
because on the list of stuff they’re filtering out, that’s probably low priority compared to content like CSAM or actual murder, which gets them into legal trouble if that kind of content runs wild.
That’s what externalization looks like. In the fossil fuel industry, it’s creating polluting products without having to bear the costs. In chemical companies, it’s physically polluting the environment. Same with mining companies, etc.
In social media, it’s the refusal to manage content in a responsible manner, whether that’s CSAM or disinformation campaigns or hate speech. That externalization is what allows them to pay the salaries they do, invest in R&D, and pump their stock values to ridiculous levels. Meta is a trillion-dollar company, and it needs to rebalance its priorities.
They can, but doing so will affect profits. They used to outsource moderation to Kenyans, who got paid in pennies. Sama, the company doing said “moderation”, apparently stopped offering that kind of work.
Worth noting: FB and fuckzuck knew moderation would be a problem for a big platform with millions of daily users. They didn’t care back in 2012, and they don’t care now. “Not our problem”, for all intents and purposes, just like their nonexistent customer support.
In the corporate world, profits always matter more than safety, health, and other civilian nonsense. Last time I checked Instagram via the app, I saw three ads for obvious pyramid schemes, not much different from my previous check in 2023. Hey, scammers are paying for ad space, so why should zuck care?