Facebook has ignored advocacy groups’ calls to curb group recommendations, and will instead increase the promotion of group content in the News Feed.

Facebook Inc. reportedly said Thursday that it will begin promoting content from Facebook’s public groups in users’ News Feeds and in search results. The move ignores calls from advocacy groups to limit algorithmic recommendations ahead of next month’s US election. Fidji Simo, head of the Facebook app, said in a blog post that the company would surface recommendations in the News Feed by displaying “relevant discussions” from public groups next to a link or post.


Facebook groups are communities built around shared interests, and any user can view posts from public groups.

Facebook will also start displaying conversations from public groups outside the app, such as in web search results, Ms. Simo said.

Facebook declined to say when specific policy changes would be rolled out, saying only that testing in the U.S. would begin “to a very limited extent” in the coming weeks.

Facebook made groups a strategic priority last year, and said it is adding new moderation tools for the more than 70 million users who manage those communities to help them reject posts, steer discussions and schedule paid sponsorships.

According to an internal Facebook memo, this year’s push on groups has been a success in the Facebook app: as of August, group content was up 31.9% year-on-year.

Facebook is also facing a series of scandals over extremist activity within groups, including thousands of Boogaloo, QAnon and militia groups, as well as communities that spread false health information.

Facebook has gradually removed some of these groups and restricted recommendations of others.

A few days ago, Accountable Tech and a dozen other advocacy groups called on Facebook to suspend the use of group recommendations until the November 3 election results are officially certified.

The coalition accused Facebook of prioritising groups despite warning signs from researchers, and warned that the tools had become “hidden, fertile ground for disinformation and extremism”.

Adam Conner, a former Facebook employee who now leads technology policy at the Center for American Progress, a left-leaning group, said Facebook’s decision could accelerate these dangers.

“Making content from these groups spread more widely and be more easily discovered, during the Covid-19 pandemic and ahead of a highly contentious election, is a dangerous choice,” Conner said.