In 2019, a researcher at Facebook conducted an experiment to see whether the platform really has a tendency to send users down a rabbit hole of extreme and conspiratorial content. The employee set up a pair of fake profiles—for Trump-supporting “Carol Smith” and Bernie-loving “Karen Jones”—and then led each one down the path of least resistance, liking whichever groups and pages Facebook’s recommendation system served up. Not a huge surprise: It took less than a week for Carol to be pushed toward online communities dedicated to QAnon, and for Karen to be swamped by lewd anti-Trump material.
The details of this experiment were found among the thousands of documents shared with reporters last month by the whistleblower and former Facebook employee Frances Haugen; “Carol’s Journey to QAnon,” in particular, has featured heavily in coverage. But the mere existence of the rabbit hole wasn’t shocking in itself. In 2017, the reporter Ryan Broderick published a bloggy version of the same idea at BuzzFeed News: “I Made a Facebook Profile, Started Liking Right-Wing Pages, and Radicalized My News Feed in Four Days.” When that piece came out, Facebook responded, “This isn’t an experiment; it’s a stunt.” Now we know that Broderick’s stunt produced, if nothing else, a replicable result.
Carol’s journey, like Karen’s and Broderick’s, addressed specific, urgent questions about how Facebook might polarize and confuse American voters. Facebook’s fake accounts started out by liking Fox News and Donald Trump, or else Elizabeth Warren and MoveOn; the one created for BuzzFeed went with the Republican National Committee and then–White House Chief of Staff Reince Priebus, as well as Hillary Clinton and Barack Obama. Taken all together, they show how Facebook’s mechanics, left unchecked, can grab ahold of even the slightest political leaning and bend it to grotesque extremes.
But none of these experiments has that much to say about what might happen to a Facebook user who doesn’t care about politics at all. Let’s say you never gave the platform any hint about your ideology, or how you’ve ever voted, or whether you even have. Let’s say you made yourself as bland and centrist as you possibly could be, and then let the system do its algorithmic work. Would your account get pulled into some other kind of rabbit hole? And if it did, what would be waiting there?
For two weeks, I’ve been conducting my own Facebook experiment. I decided to make a new account on the platform as an alternative, apolitical version of myself who enjoys only the most widely beloved things in life. Like the fake Ryan Broderick, and the imaginary Carol Smith and Karen Jones, I would not send or accept any friend requests. I uploaded a real picture of myself, and added my real hometown as my location. Then, my editor and I decided on a list of “likes” that might reflect the tastes of a thoroughly nonpartisan, general-interest American: the Rolling Stones, Grey’s Anatomy, Domino’s Pizza, Target, Oprah, wine. From there, I engaged only with pages and groups and posts that Facebook curated for me, in all of its data-hoovering and look-alike-audience-building wisdom.
When I liked the Target page, a little widget popped up immediately and prompted me to like 10 other pages, which I did. Some of these recommendations were what one might expect: “Target Careers,” “Amazon Toys and Games.” Some were not, but they didn’t surprise me: “Dr Pepper Snapple Group,” “Sweet’N Low.” And some were a total mystery: a financial adviser named Max who lives in Nevada, a home-health-care service in Massachusetts run by an Irish couple. When I liked the “Wine” page, I was recommended “Beer,” and also a page called “We Like the United States of America.” When I liked Domino’s Pizza, I was recommended “Arby’s Curly Fries,” as well as the page for a specific Domino’s location in Zimbabwe that had apparently burned down in September.
I liked all of it! And then I “liked” all of it. I also joined the first 30 groups that Facebook recommended, including three Rolling Stones–related groups, some generic-sounding stuff like “Funny sarcastic quotes” and “Aesthetics,” and some other things … such as “Nana Funny Society,” “Germany dating serious site,” and “Old Men With Trucks.” The next day, an updated (and presumably refined) list of suggested pages appeared in my feed, including a meme page called “Twisted Abyss,” a page for a Travelodge Inn & Suites in South Carolina, a page touting the health benefits of dandelions, and a page for a psychic based in Tucson. I liked all of those and waited a few more days. When I came back, my new suggestions included “Memes for inmates,” “Skulls,” and a page called “Darkness of evil” with an About section signed by “the jokerman.” I liked all of those too.
After a week, Facebook started suggesting that I send some friend requests. Though I had entered my hometown, in upstate New York, as my location, almost all of the profiles collected into the “People You May Know” widget in my feed were from either Wisconsin or Pennsylvania. In the following days—though I did not send a friend request to anybody—the concentration of Wisconsinites and Pennsylvanians in the widget grew even higher. (Both swing states, so perhaps appropriate for my middle-of-the-road journey?) Yet for some reason, many of my suggested friends from Pennsylvania were specifically from New Castle, a small city in a county northwest of Pittsburgh that voted for Trump by a 30-point margin.
After establishing my presence on the platform, I loosened up a bit. I checked in each day and liked a few of whatever pages were suggested to me, joined a few of whichever groups, and scrolled through the main feed briefly, liking whatever I saw. I would be told to join a niche-sounding dating group, and end up watching a 30-minute video of a British man livestreaming from his kitchen in a group called “Foreigner’s Looking For Filipina,” but clearly not with the goal of finding anyone to date; he was just talking about his breakfast and his life, and telling commenters, “Please, don’t call me ‘daddy’; I actually have two daughters.” Or I would notice a vague but ubiquitous hashtag, like #BOOMChallenge, attached to a post about trusting in God or manifesting money, and click to see if I could decipher its meaning, which I never could.
In the comments below memes about how men and women tend to behave (differently), I would find links to expensive self-help courses or terrifying diet powders. Startled by an extremely graphic photo of a vagina or a butthole, I would realize I was looking at an optical illusion being played for laughs and engagement. (Click at your own risk.) Moments of true novelty were few and far between, and not any more pleasant. (Again, be warned.) I ended up in one amazing group called “Goofy Huskies,” which was full of great content, but alas, the time I spent there seemed to skew my recommendations toward pages that random people had made for their pets.
A few days later, I came across an image of white text on a black background, reading, “GIRLS HAVE MAGIC POWERS. THEY GET WET WITHOUT WATER. BLEED WITHOUT INJURY. AND MAKE BONELESS THINGS HARD.” The first comment beneath this post started, “I was totally broken when the love of my life left me,” and ended by providing the WhatsApp number for some kind of love sorcerer named Dr. Moses. Reading these words filled me with despair, but also a sense of cosmic surety that I’d reached the end of my journey.
After just two weeks on the platform, consuming only content that Facebook’s recommendation systems selected for me, I found myself at the bottom of a rabbit hole not of extremism but of utter trash—bad advice, stolen memes, shady businesses, and sophomoric jokes repeated over and over. Facebook isn’t just dangerous, I learned. It doesn’t merely have the ability to shape offline reality for its billions of users. No, Facebook is also—and perhaps for most people—senseless and demoralizing.
The results of my experiment fascinated me mostly on account of their brutality. Each post felt like a blunt-force expression of loneliness, desperation, horniness, or all three. At the same time, they seemed entirely inhuman. Who exactly had created these images, with their colorful backgrounds and their text about wanting to be kissed on the forehead or “bent over on the balcony”? It could have been a regular person, or it could have been a violent criminal, or it could have been some demon deep within the machine. My feed was full of promises and emotional declarations: “You will have money TOMORROW,” or “May god heal everything That you’re suffering alone,” or “Real men make your panties wet not your eyes.” But they came from nowhere and went nowhere—and they only made me feel worse and worse.
Even photos of nature and videos of animals were stripped of their basic earthliness. A hen guarding a litter of kittens seemed real and not real, as the person posting didn’t claim to have filmed it, and I had no idea how the situation had been arranged—or where or why. Same for a video of a girl and a cat eating from the same piece of watermelon—though that one I could trace back to the Instagram account it had been stolen from, which belongs to a cat whose profile refers to it as a “public figure.” (How can we live like this … with cats who are public figures?) I became suspicious of anything that verged on being entertaining or useful—a “TikTok hair hack” or a method for deep-frying ham sandwiches made with Doritos instead of bread—because I could tell that it had been taken from somewhere else and assumed it had been posted only to boost views of something upsetting, like another $47 course on how to “Be Irresistible.”
Of course, I wasn’t using Facebook as it was intended—I was using it as a person who had no friends at all. When your family and friends are active on Facebook, you might at least get to see some photos of faces you recognize, doing things you can understand. But I’m not the first person to notice that Facebook has started to resemble something undead. “Earlier this month, the highest-performing link on U.S. Facebook was a five-year-old story about a shelter dog likely posted to the platform by a bot,” Ryan Broderick wrote in October. “That is 2010-Myspace levels of grim.” In recent issues of his newsletter, Platformer, Casey Newton picked through a couple of Facebook’s new “Widely Viewed Content” reports, noting how many of the site’s most popular posts had been ripped off from other sites and repurposed, and how many of its biggest pages were both selling something weird and operating like spam networks. If we can say that Facebook is a doomsday machine, we can also call it a chicken with its head cut off.
There were traces of chaos in “Carol’s Journey” too. They weren’t the flashiest part of the experiment, nor did they get mentioned in any Facebook Papers coverage that I read. (I saw them only during a second read of the report.) The fake Carol was an imaginary 41-year-old woman from North Carolina who was a Christian, a fan of Trump, and a mom. The leaked documents start by listing the very first set of recommendations she gets from Facebook in response to these stated characteristics. One is a Donald Trump fan group and one is a Melania Trump fan group—okay. One is a large group for home chefs to share photos of their cooking—sure. Then, for no discernible reason, there’s also a brain-injury support group, a small meme group called “Positively Insane,” a fan group devoted to the San Francisco–based sports announcers Mike Krukow and Duane Kuiper, and seven groups dedicated to various regions of California (“Tri-Valley Friends & Memories,” “You know you’re from San Leandro if...” etc.). These recommendations make no sense whatsoever, and yet they apparently weren’t even worth a side note from the report’s author.
After experiencing it for myself, I find it absurd that this fundamental strangeness of Facebook isn’t a regular topic of conversation. The company would not comment for this story, but elsewhere it has acknowledged a need to reduce the impact of what it calls “low quality content,” and says that it is now building out its “engagement bait identifiers.” Still, the content that I consumed was so bad, it came off as almost cruel. After Newton’s post directed me to Facebook’s “Widely Viewed Content” reports, I noticed that one of the pages appearing most often on the list was doing so by sharing silly questions paired with simple graphics, which then received millions of comments. There were setups like “800 seats in heaven, your last three digit of your phone number determines your seat” (7.8 million comments), and “Honor a pet who is no longer with you, who you miss dearly. What was their name?” (8.1 million comments). These sound a lot like questions you might ask if you were trying to hack into a stranger’s bank account.
Or maybe that’s just how Facebook makes me feel now: preyed upon. My experiment brought me to the understanding that there’s always some trick, angle, or motivation that I can’t quite see. If you don’t take any of your politics to Facebook, you may not get sucked into political extremism. But there are other ways to spiral down to the lowest common denominator, and then lower and lower, and there’s no relief, and there’s no bottom.
source: https://www.theatlantic.com/technology/archive/2021/11/facebook-experiment-toxic-centrist-content/620731/