The pivot to groups poses a major challenge for Facebook

Facebook groups (Image credit: TechSpot)

A viral ‘boomer’ group floods news feeds while moderators struggle to keep up

By Eric Ravenscraft

“If it’s lit AF, drop a yeet here fam,” reads the first post on my Facebook news feed.

It’s been taken over by a pair of dueling groups dedicated to mocking the yawning generation gap between baby boomers and millennials.

This is one glimpse at the future Facebook wants: A social experience built not around the largely public news feed, but around people organizing in semiprivate groups.

The idea is to give users more reasons to return to Facebook, beyond the unending flood of content from random publishers, random “friends,” and random friends of friends in their news feeds.

In April, the company announced that it was going to redesign its service around two of its most popular features, groups and events, in an effort to emphasize more “meaningful” communication.

(You may have noticed a new tab on the Facebook mobile app that contains a feed solely of posts from the groups you’re in.) Then, in May, the company changed its news feed algorithm to prioritize posts from your groups.

The message is clear: sharing junky videos with everyone you’re connected to is out. Hanging out in closed groups with like-minded people is in. Which brings us to the mega-viral “A group where we all pretend to be boomers,” which launched in May.

The group’s all about parody — Jim shares an article about a celebrity who died several years ago, people schedule events for “calling applecare at 6 am to figure out how to unlock your phone” — and more than 240,000 people have joined.

Then came the counterpunch. A few days after the boomer group was created, “A group where we all pretend to be millennials” sprouted up. It has fewer members — just over 5,000 — and most of them seem to be millennials themselves, with self-roasting updates about student loan debt and avocado toast.

Shortly after I joined both groups, they started to dominate my news feed. Of a random sample of 20 or so posts, more than half were from one of the two groups.

My notifications first let me know if any of my friends posted or commented on anything in the groups. Then Facebook started alerting me to random posts strangers made. The company had replaced one kind of noise in my feed with another. The group’s very popularity made it a beast to deal with.

And not just for me, it turns out.

If Facebook’s goal is to get users to like, comment, and post more, the new strategy seems to be working. According to statistics provided to OneZero by the founders of the boomer group, over a 28-day period ending on July 8, people posted new updates to the group — photos, links, and text posts — more than 190,000 times.

Those posts collectively received over 2.2 million comments and 2.8 million “reactions” — likes, for example — an average of nearly 10 comments per member.

One of the group’s co-founders, Justin Richard, explains that around a thousand new people were added to the group every week after its inception, even before a viral tweet led to a spike in membership.

Its size has turned into a liability of sorts. The more members a group has, the harder it becomes to police content within the group. Another of the boomer group’s co-founders, Robert Joseph Snider, says this has become the single biggest obstacle to keeping the group going.

“We have to be on guard,” Snider says. “We’re concerned about being unpublished arbitrarily at any moment.”

According to Facebook’s policies, groups can be unpublished for violating the company’s community standards, which prohibit nudity, hate speech, and violent content.

Facebook automatically flags content that it thinks might be in violation, including in groups. A moderator can manually approve the post, but if it gets reported to Facebook again by a member of the group, the company’s moderators might step in. Eventually, a group could get shut down for too many violations.

While that policy can seem aggressive to groups like the faux boomers, more problematic groups have shown the need for Facebook to move in the opposite direction.

Earlier this month, a report from ProPublica revealed a private group named “I’m 10–15” made up of current and former U.S. Border Patrol agents. Members made jokes about migrant deaths, about flinging burritos at Representative Alexandria Ocasio-Cortez during a visit, and about other heinous topics.

That group had over 9,500 members — a far cry from what the boomer group deals with, yet still a challenging number of people to moderate. Facebook’s automated tools operate inside private groups, where members are subject to the same community standards as everyone else, but the company still relies on group leaders to handle reports and keep things clean.

If members or admins don’t report (or even encourage) offensive content, it can be hard for Facebook to find out about it.

A Facebook spokesperson tells OneZero that its machine learning moderation is key to dealing with content that violates its policies. The company has over 30,000 people on its safety and security teams, many of whom work on developing better automated tools to flag content and send it to group admins to deal with.

Those tools can’t come soon enough.

Earlier this year, a report from The Verge detailed the grueling conditions of contractors charged with manually reviewing disturbing content reported to Facebook. Workers making less than $30,000 a year are required to view some of the worst the internet has to offer.

If Facebook could automate that task so each group admin deals with only a few offensive posts, rather than a single person reviewing hundreds, it might work out better for everyone in the long run.

But that’s a mighty big if. According to the founders of the boomer group, Facebook’s current auto-flagging system frequently gets things wrong.

Among the false positives they cite: “a picture of someone’s elbow” and “minions flagged as nudity.” For a group with a few hundred members, the occasional false alarm isn’t likely to be a problem. With a quarter-million, it’s a lot tougher.

“The group function is not well-suited to hundreds of thousands of people,” says Richard.

Facebook disagrees. A company spokesperson says its tools work for groups of all sizes and topics. Of course, Facebook relies on group moderators to use those tools, and the boomer group’s viral growth means its moderators are forced to handle a disproportionately large number of people.

Its founders complain that Facebook is unclear on just how much wiggle room there is before a group gets banned. The site provides a dashboard where admins can see how many posts have been auto-flagged — the boomer group had over 60 community standards violations at the time of writing — but little information about what those violations are or how close the group is to the point of no return.

“I wish they didn’t auto-flag things and gave us more transparency about our group being at risk of being unpublished,” says Snider.

In order to keep up with the flood of content — and avoid the lurking specter of a ban — the boomer group has already recruited 24 moderators of its own, focused solely on managing content. Snider says it’s exhausting.

“I already have a day job, and I especially feel very exhausted [training moderators]… because the subtle nuances of toeing the line in Facebook takes months to navigate,” he says.

To prepare for the possibility that the group might disappear, the admins have already created a backup version with the same name. Over 7,000 members have already signed up for this second group, just in case, though there’s currently no activity there.

This backup is a risk in and of itself. Earlier this year, Facebook announced that in the event it bans a group or page, it may also proactively ban related groups that it suspects will be used to host the banned content.

And so the boomer group exposes something of a contradiction in Facebook’s policies.

The social network is doing everything it can to promote groups, but the bigger they get, the more problems they present — for the moderators who control them, and for the users who face a never-ending flood of “yeets” in their news feeds.
