Ian Haworth writes for the Washington Free Beacon about new insights into Facebook’s anti-conservative bias.
Just hours after a Washington Free Beacon report on a Biden administration plan to distribute crack pipes to drug addicts at taxpayer expense, the Facebook fact-checkers mobilized.
In a “fact check” titled “Biden Administration Is NOT Funding ‘Crack Pipes, Heroin’ For Drug Use,” Lead Stories—a prominent member of Facebook’s third-party fact-checking program—concluded the Free Beacon report was “not true.” Lead Stories based its determination on Health and Human Services Secretary Xavier Becerra’s declaration, made days after the report elicited considerable blowback, that, as Lead Stories phrased it, “none of the federal funds for harm reduction programs for drug addicts can be used to provide crack pipes.”
“While a description of the HHS grants stated that the grantees would be required to buy materials like safe smoking kits and supplies to ‘enhance harm reduction efforts,’ such kits and supplies are just a few of the many materials that grantees can utilize,” Lead Stories added.

The fact-checking system at Facebook, which I saw firsthand during my time as a software engineer on Facebook’s “Misinformation” fact-checking team between 2019 and 2021, hands monumental power to supposedly nonpartisan fact-checking organizations to quash legitimate news.
According to the original Free Beacon report, President Joe Biden’s Department of Health and Human Services planned to implement a $30 million grant program that included the distribution of “safe smoking kits” to drug addicts. A spokesman for the administration told the Free Beacon that these “safe smoking kits” would—like many similar kits already in use across the nation—include pipes for the use of “any illicit substance.” Another Facebook fact-checker, AFP Fact Check, also concluded the “U.S. grant program is not funding crack pipes for addicts.”
As a result of this wave of fact-checking activity, Facebook posts linking to the Free Beacon report were tagged as “Partly False,” thereby “significantly” reducing the “content’s distribution so that fewer people see it,” according to Facebook’s own fact-checking policy.