[kwlug-disc] More reasons to end Facebook

CrankyOldBugger crankyoldbugger at gmail.com
Sat Sep 25 10:03:39 EDT 2021


While some of us in this group, including myself, swore off Facebook years
ago, we all know people who swim in it every day.  In this week's email
from The Markup (which I highly recommend), they talked about more issues
with FB that I thought you guys would like (like as in enjoy, not like as in
"remember to Like and Subscribe!"...)

The Markup can be found at https://themarkup.org/


Honestly, it was kind of hard to imagine that Facebook’s image could get
more tarnished. After all, the company’s been mired in negative press for
the past five years, ever since the Cambridge Analytica scandal
<https://www.nytimes.com/2018/04/04/us/politics/cambridge-analytica-scandal-fallout.html>
and the Russian disinformation campaign
<https://www.washingtonpost.com/technology/2021/05/26/facebook-disinformation-russia-report/>
fueled Trump’s election in 2016.
But, amazingly, public relations has gotten even worse for Facebook this
month. The precipitating event was the emergence of a Snowden-style trove
of documents—“The Facebook Files
<https://www.wsj.com/articles/the-facebook-files-11631713039>”—that appear
to have been leaked to The Wall Street Journal reporter Jeff Horwitz.
In a five-part series, The Wall Street Journal used those documents to
reveal that not only was Facebook fueling teenage self-harm and enabling
human trafficking, but that Facebook itself also knew that its platform
contributed to those problems and chose to ignore it.
The series revealed that:

   - Facebook’s secret program, XCheck
   <https://www.wsj.com/articles/facebook-files-xcheck-zuckerberg-elite-rules-11631541353?mod=article_inline>,
   allows millions of VIPs to be exempt from the company’s normal content
   moderation rules preventing hate speech, harassment, and incitements to
   violence. A 2019 internal review of the program declared that its double
   standard was a “breach of trust.”
   - Facebook’s own research found that Instagram makes one-third of
   teenage girls feel worse about their bodies
   <https://www.wsj.com/articles/facebook-knows-instagram-is-toxic-for-teen-girls-company-documents-show-11631620739?mod=hp_lead_pos7&mod=article_inline>.
   - Facebook tweaked its algorithms to “increase meaningful social
   interactions” but found that the shift actually boosted “misinformation,
   toxicity, and violent content.”
   <https://www.wsj.com/articles/facebook-algorithm-change-zuckerberg-11631654215?mod=article_inline>
   - Facebook has been lethargic about removing dangerous content
   <https://www.wsj.com/articles/facebook-drug-cartels-human-traffickers-response-is-weak-documents-11631812953?mod=article_inline>
   in developing countries including human trafficking and incitements to
   ethnic violence—even when the content was flagged internally.
   - Facebook was slow to address a content moderation loophole
   <https://www.wsj.com/articles/facebook-mark-zuckerberg-vaccinated-11631880296?mod=article_inline>
   that allowed anti-vaccine advocates to flood Facebook with anti-vax
   comments, despite internal cries for a fix.

In many of these arenas, it’s been documented in the past that Facebook was
enabling harm. What’s new is proof that Facebook has long understood its
ills but has repeatedly failed to address them adequately. As the Journal
put it, “Facebook knows, in acute detail, that its platforms are riddled
with flaws that cause harm, often in ways only the company fully
understands.”
Facebook responded that the Journal was “cherry-picking” anecdotes that
mischaracterized its actions. “The fact that not every idea that a
researcher raises is acted upon doesn’t mean Facebook teams are not
continually considering a range of different improvements,” Nick Clegg,
Facebook’s vice president of global affairs, wrote in a blog post
<https://about.fb.com/news/2021/09/what-the-wall-street-journal-got-wrong/>.
At the same time, new evidence of harm enabled by Facebook continued to
accumulate in other publications, including The Markup. The New York Times
reported that Facebook had rolled out a feature that was intended to promote
positive news about the social network
<https://www.nytimes.com/2021/09/21/technology/zuckerberg-facebook-project-amplify.html>
in users’ feeds in part to bury negative press about itself. And ProPublica
reported that Facebook Marketplace was riddled with fraudulent and scam
listings
<https://www.propublica.org/article/facebook-grew-marketplace-to-1-billion-users-now-scammers-are-using-it-to-target-people-around-the-world>.
And, this week, The Markup's Citizen Browser project manager Angie Waller
and reporter Colin Lecher revealed that Facebook has been disproportionately
amplifying Germany’s far-right political party
<https://themarkup.org/citizen-browser/2021/09/22/germanys-far-right-political-party-the-afd-is-dominating-facebook-this-election>
during the run-up to tomorrow’s parliamentary elections.
Although the political party Alternative for Germany (Alternative für
Deutschland), or the AfD, is relatively small (it won just 12 percent of
the vote in the 2017 parliamentary elections
<https://www.bundeswahlleiter.de/en/bundestagswahlen/2017/ergebnisse/bund-99.html>),
its supporters’ anti-immigrant and anti-pandemic-restrictions posts were
displayed more than three times as often as posts from much larger rival
political parties and their supporters in our German panelists’ news feeds.