At first glance, a private “Justice for George Floyd” Facebook group looks like many of the dozens of other groups created to collectively mourn the killing of an unarmed black man by a white police officer. “We demand justice for the murder of George Floyd,” the group says in its description with the hashtag #BLM, for Black Lives Matter. The group has more than 10,000 members, and its profile picture is an illustration. Nine people, several of whom appear to be black, control the group.
But dig a little deeper and it quickly becomes apparent the group isn’t interested in racial justice. Instead, it cynically uses Floyd’s name to fan racial hatred.
The admins, none of whom appear to be who they say they are, post “announcements” with hateful memes and misinformation, while commenters share support for neo-Nazis. Many mock Floyd’s appearance and background. The posts are sprinkled with anti-Semitic conspiracy theories, sometimes involving billionaire investor George Soros. The image of Pepe the Frog, a comic book character co-opted by white supremacists, makes a few appearances.
One announcement, posted by an admin, touts “fact-checked” statistics that show more black people drown in swimming pools than are killed by police officers. “I think we need to talk to the leaders of BLM about teaching swimming lessons,” said the poster, who calls himself George Lincoln, likely a reference to American Nazi Party founder George Lincoln Rockwell.
The group includes other people who joined to show support for the community grieving Floyd. Those members question why posts, like a meme showing a black man pointing a gun at the belly of a screaming, pregnant white woman, don’t get removed.
“We out here NOT babysitting grown folks,” wrote one of the admins, a man who calls himself James Gressett and says he lives in Somalia though he “likes” several small businesses based in the Dallas-Fort Worth area. In his public Facebook profile, he notes the correct way to pronounce his name is “HIT-lerr-did NUTH-ing-rawng.”
“Y’all post the sh-t you wanna see,” he said. “We are too busy keeping over 1,000 spam posts and comments out of this group EVERY DAY to sweat basic bit-hes reactions to lively debate and mildly inflammatory remarks.”
On the surface, this group could be dismissed as a den of racists trolling others. And while it’s nearly impossible to know its true motives, experts believe using a major news event to entice members could represent something more insidious — it could be an effort to indoctrinate and radicalize people who have sympathetic views, but who wouldn’t necessarily seek out racist groups. And at least one expert who’s tracked these types of groups believes there are many more out there.
“It’s a way to penetrate and get your ideas and goals across in a fashion that you would not be able to if you said, ‘Hi, I’m a white supremacist, I’m here to screw up your society,'” said Paul Barrett, deputy director of the New York University Stern Center for Business and Human Rights. “Instead you say, ‘Hi, I’m someone just like you and think this about that.’ That’s a way you win people’s confidence.”
“Justice for George Floyd” represents yet another example of a Facebook group or account masquerading as something that it’s not. Russians used a similar scheme to influence the 2016 US election via the social media site. And in 2018, bogus groups posing as pop star fan accounts used Facebook to stoke ethnic violence in Myanmar. Earlier this month, the company said it removed a “handful” of fake accounts where white supremacists had posed as members of antifa, a far-left, anti-fascist movement, with plans to infiltrate the George Floyd protests.
Now, groups using Floyd’s name and positioning themselves as places to discuss justice, racism, and reform appear to be proliferating. A quick search brought up about 100 different public and private groups with “Justice” and “George Floyd” in the title. While most of these are legitimate, some are more questionable.
“It’s crazy how rampant it is,” said Zachary Elwood, who’s based in Portland, Oregon, and who has been tracking deceptive online activity on Facebook and Twitter for the past three years. He’s found a handful of suspicious Facebook groups that have appeared since the killing of Floyd. “I wouldn’t be surprised if this was just a drop in the bucket.”
Elwood has also found deceptive groups in Macedonia allegedly aiming to influence US politics through Facebook and pro-gun, anti-government groups that organized protests against coronavirus quarantines throughout the country in April. He also pinpointed the “Justice for George Floyd” group, which appears to be one of the most vitriolic groups misusing the Black Lives Matter movement to gain members.
Facebook, which has more than 2.6 billion monthly active users, has been pushing people to shift from public spaces into private groups over the last couple of years. It’s a strategy to stay competitive against social media rivals like Twitter, YouTube, and TikTok and to offer users something they can’t get from the more public-facing sites. But this can make moderating hate speech and harassment challenging.
The social media company says it’s addressing the problem. Facebook yanked 190 accounts linked to white supremacist groups earlier this month. The company declined to comment on the “Justice for George Floyd” group and its content before this story was published. After the story published, a Facebook spokeswoman told CNET the company was investigating the group and would remove content that violated its rules.
But many of these groups have a trove of tactics to cover their tracks.
Getting past Facebook’s censors
One of the moderators of the “Justice for George Floyd” group is someone who goes by the name “Shawn D Ildo.” His Facebook profile photo is a picture of a black man driving a go-kart with a small child. The cover photo is a chalk pastel drawing by the black artist LaMark Crosby titled “Adjacent to the King,” which shows King Tut morphing into a modern black man — Crosby himself.
But something is off. The cover photo has been altered so that the most current evolution of the drawing is a photo of Tom Brady, a white football player. And a reverse image search of the man in the go-kart brings up a rapper named Viper. (The person calling himself Ildo told CNET he used the image to commemorate Viper’s death in police custody. Viper, whose real name is Lee Arthur Carter, is alive.) And even though the owner of this Facebook account said, “funnily enough, my name is Shawn David Ildo,” internet searches for that name don’t yield any results.
Ildo said over a chat on Facebook Messenger that he lives in Dallas, does freelance IT work and is 24 years old. It’s unclear if any of that is true. He said he got involved with the “Justice for George Floyd” group through a mutual friend, and his role as a moderator includes approving members and posts.
“We keep a zero-tolerance on racism,” he said.
When asked about the goal of the group, he said, “The goal? Well, I guess it would be making sure the real facts and opinions are out there. Sharing ideas, bridging gaps, defeating racism one post at a time.”
Crosby, the artist behind Ildo’s doctored cover artwork, said he was shocked and upset by Ildo’s alterations to “Adjacent to the King.” The 48-year-old, who lives in Columbus, Ohio, and works as a graphic designer, created the artwork while in college to portray the passage of time and the history of black people. Other people have used Crosby’s self-portrait on their profiles and websites, he said in an interview with CNET, but it’s always been focused on black history and positive usage. He’s never before seen a doctored version of his artwork, which he considers offensive.
“I fight this every day as a black man,” said Crosby, who noted he’s been the victim of police brutality in the past. “To have someone use my image, not only the image I created but an image of me, is bothersome, to say the least.”
The “Justice For George Floyd” group is just two months old. Its original name was “Justice for Ahmaud Arbery,” for a black man who was shot to death in February while jogging in Georgia. Ildo said the organizers changed the name because they were having a hard time getting members, and “Facebook can give us a little better traction when people searched for George.”
None of the group’s other admins and moderators returned multiple requests for comment.
Elwood said when he’s searching for fake Facebook groups, he looks to see when the group was created (the newer, the more suspicious) and who the admins are, along with their history, friends, and first posts.
“It’s worth noting, though, you can’t judge too much from what’s visible or empty,” Elwood said. “Because theoretically, they could have just not put it in or have it hidden through Facebook data security settings.”
Facebook accounts and groups can be set to public or private. From there, the site allows for additional layers of privacy, including a setting that makes groups invisible from searches. Admins “own” the groups and can appoint moderators, who can also have the power to approve and remove posts and members.
As the social network has nudged users into private spaces, it’s created an opening for people who typically frequented more outwardly hateful sites and message boards, like 4chan and 8chan. And unlike those other sites, posts on Facebook can carry more weight because its users are supposed to be legitimate and non-anonymous. It’s against Facebook policy to use a fake name.
“Facebook gives an appearance of ‘these are actually other people,'” Elwood said.
The company says it doesn’t allow hate speech, racism, harassment, white nationalist or white separatist content on its site, and it will remove any posts or comments that violate those policies. It also bans “coordinated inauthentic behavior,” which are group efforts to mislead people. Private Facebook groups are supposed to obey the same community standards as other areas of the social media site. But a lot still gets through Facebook’s censors — as evidenced in the “Justice for George Floyd” group.
To police the platform, Facebook said it uses a mix of artificial intelligence, machine learning and computer vision. With these tools, the company says it can analyze specific examples of banned content to identify patterns of behavior. Those patterns then teach its software to find other similar problems and allow it to detect violating content, the company says.
Facebook also uses human moderators. According to NYU’s Barrett, who authored a report earlier this month called “Who Moderates the Social Media Giants? A Call to End Outsourcing,” 3 million Facebook posts are flagged for review by 15,000 moderators every day. But they have an error rate of at least 10%, which equals about 300,000 mistakes a day. For comparison, YouTube has about 10,000 moderators and Twitter has roughly 1,500, according to Barrett.
“The machines can’t pick up everything,” he said. “The bad actors are clever, and you need an awful lot of people to keep up with this.”
When it comes to removing content from its platform, Facebook has to balance between trying to stop the spread of hate and misinformation and protecting users’ freedom of speech and privacy.
“There is interest within the company to provide some policing mechanism there,” said Cody Buntain, a professor of informatics at the New Jersey Institute of Technology who studies how online behavior translates to the real world. “But I think that’d be a very, very uphill battle to actually be able to do that and still respect the privacy concerns that Facebook has.”
After Floyd’s death made headlines around the world, Jamie Elliott-Deming did the only thing she could think to do: She painted a watercolor portrait of Floyd.
“I wanted to honor him,” the 36-year-old, Tennessee-based artist told CNET over Facebook Messenger. “Portraiture, before photography, historically was the way to pay tribute to the existence of someone, immortalize them. His life mattered.”
Elliott-Deming searched for a way to contact Floyd’s family to send them the artwork. She turned to Facebook and joined a couple of George Floyd groups with the hope someone knew his relatives. Instead, Elliott-Deming found hateful memes and racist members. The more than 100 comments on her painting ranged from compliments on her artistic skill to rants about Floyd’s alleged criminal record.
“I was surprised by the posts the admin/moderators of the group were approving,” said Elliott-Deming, who was kicked out of the group by Ildo the same day she shared Floyd’s portrait. “Some of them seemed openly divisive.”
That appears to be the aim.
The idea is to lure in as many people as possible, troll the ones who may be offended, like Elliott-Deming, and get like-minded people to join the cause. It’s a textbook recruitment tactic for white supremacists, experts say. It’s how the neo-Nazi site The Daily Stormer gained a following when it launched in July 2013, according to the Anti-Defamation League. Early on, the website gave trolling instructions and techniques on how to harass people of other races and religions online.
“I’m never surprised at how online racists and extremists will adapt to the latest issue on the ground and find a way to exploit technology to aggravate the situation,” said Oren Segal, vice president of the Center on Extremism at the Anti-Defamation League. “These racially charged incidents are the lifeblood of the movement, these are the touchpoints they’re able to exploit and leverage and create propaganda to move their supporters or create new ones.”
Within the “Justice for George Floyd” group, the admins and moderators allow hateful comments and memes but are careful to ban anything inflammatory enough to get Facebook’s attention.
In one instance, an admin — who calls himself Kondo Miyamoto and uses a doctored photo of Nintendo game developer Takashi Tezuka as his profile photo — posted an announcement alleging Floyd worked in pornography. He included a photo that was edited to cover any explicit content and said sharing any links or screenshots about the topic would get group members banned.
“Posting anything of this caliber will get your profile automatically reported by Facebook’s algorithm, and may also compromise the group,” the person using the Miyamoto name wrote. “NO GROUP = NO JUSTICE FOR GEORGE. [But] VERBAL DISCUSSION of this is OK.”
The group’s admins and moderators also give members trolling tutorials.
For example, Gressett, one of the group’s admins, posted a 3D photo of Floyd’s face that slowly moves and manipulates Floyd’s features. In the caption, Gressett insinuated that Floyd’s death, as well as the coronavirus, was faked. Gressett blamed Democrats for being racist. “#BlackLivesMatter already to everybody except the #DeepState leftists who only care about #BlackVotes and tryna #FuckTrump,” he wrote.
Deep within the more than 100 comments on the post, a group member who goes by Christopher B. Brenneman responded with the mantra “all lives matter.” Gressett wrote back, “Don’t tell anybody yet, but that’s precisely the direction we’re taking this group. … Wake the others. Show them the truth.”
So Brenneman wrote several more comments along the same lines. But then Gressett chastised him. “You can rant on more than one post, just don’t be spammy,” Gressett said. “And try not to get muted, bro.”
Brenneman apologized. “I’m a little slow when it comes to FB, what exactly should I be doing?” he wrote. “I thought just post the hell out of All lives matter, like rally the troops to try to get people straight.”
Segal, from the Anti-Defamation League, said it’s unclear whether these people are hardcore extremists or part of a larger white supremacist organization. It’s likely a mix of online trolls, racists, and hardliners, he said. And, if the “Justice for George Floyd” group is like other white supremacist groups, the hardliners are probably trying to lure impressionable people to their side.
“That platform is probably bringing together some real hardcore white supremacists with your everyday racists. It’s not a good place,” Segal said. “If there are people who have nefarious purposes, making it seem like they’re sympathetic only to do the slow reveal seems more dangerous.”
The “Justice for George Floyd” group on Monday was filled with the same hateful rhetoric as before. The day’s posts include someone saying that “Black lives matter is a front for child sex trafficking” and another saying that “white lives matter too.” One person posted, “Who in the hell let all the trash ass racist fucks in this group because disrespectful is an understatement.” That collected more than 70 comments, the majority of which aren’t worth repeating.
When CNET brought the group to Facebook’s attention, the company opted to let it stay on the platform.
That didn’t surprise Elwood, the Facebook watcher. He used to report all of the fake accounts he found to the social media company, but he’s since stopped. Even in some of the most blatant of cases, like when someone had copied another person’s name and photo directly — violating company policies — Elwood said Facebook would often leave the accounts up.
“I’ve become kind of numb to Facebook doing anything,” he said. “When we look back at this period of time, I think people will realize how bad Facebook was for our social discourse.”
After chatting with Ildo last week, CNET’s Shara Tibken got kicked out of the “Justice for George Floyd” group. Ildo later told Tibken he wasn’t sure what happened and that he’d “try to get you back in asap.” He’d previously thanked her for the conversation and said, “I will always be here if you’d like to reach back out.”
Instead, his account vanished.
CNET’s Queenie Wong contributed to this report.