‘Algorithmic Freedom Day’ is chapter 24 of Insights from the Future, a book I’m writing about technology, innovation, and people … from the perspective of the future. THIS IS NOT NEWS; IT IS A PROJECTION OF FUTURE NEWS. Subscribe to my newsletter to keep in touch and get notified when the book publishes.
October 7, 2024
- Facebook must publicly disclose algorithms
- Twitter must share virality factors
- LinkedIn’s biases against off-platform links must be plainly stated
- Google must reveal reasons for search result ranking
- Social platforms’ fact-checking virality dampers must be outlined
- All disclosures must be human-readable
Want to know why your Facebook post got two likes from your 3,000 friends, and one of them was your mother? Or why your website gets approximately zero traffic from Google? Soon you’ll be able to find out, thanks to a new law that forces social platforms, news outlets, and other information and data handlers to publicly disclose the rules that govern the popularity and visibility of information.
At least in the state of California.
Today, the California governor signed the 2024 Algorithmic Freedom Day into law. While it takes full effect in a week, sharp-eyed hackers have already spotted major platforms like Facebook beta-testing their disclosures and tweaking them for public consumption.
“It looks like Facebook decreases visibility of any post that links off-platform by 83.7%,” hacktivist Freida Jones told me on the TechFirst podcast. “Google search results prioritize high-page-rank websites about 99.3% higher than an average person’s blog, and LinkedIn absolutely buries anything that doesn’t keep LinkedIn users right on LinkedIn.”
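To make Jones’ numbers concrete, here’s a toy sketch of how a flat dampener like that 83.7% figure would work if it were applied as a simple multiplier. The factor name and function below are my own invention for illustration, not anything Facebook has actually disclosed.

```python
# Purely illustrative: how a flat visibility damper, applied as a multiplier,
# would shrink a post's reach. The constant and names are hypothetical.

OFF_PLATFORM_LINK_DAMPER = 0.837  # assumed: fraction of reach removed for off-platform links

def effective_reach(base_reach: float, has_off_platform_link: bool) -> float:
    """Return the reach a post would get after the (hypothetical) link penalty."""
    if has_off_platform_link:
        return base_reach * (1.0 - OFF_PLATFORM_LINK_DAMPER)
    return base_reach

# A post that might otherwise reach 3,000 friends drops to roughly 489
# once it links off-platform, under this assumed rule.
print(effective_reach(3000, has_off_platform_link=True))   # ~489
print(effective_reach(3000, has_off_platform_link=False))  # 3000
```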
The key part of the legislation is that the algorithms that govern what is visible, what is shared, and what gets promoted need to be made public. Typically, they’re buried in the code of websites and apps. TikTok shows what’s popular, YouTube shows you more videos like the ones you’ve already seen, and Twitter highlights topics, people, and tweets it agrees with. That’s okay, according to the new legislation, as long as the people who use those services can find a simple, human-readable explanation of the automated influences driving content on each platform they use.
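What might one of those human-readable disclosures actually look like? Here’s one guess, sketched in Python. The bill, as described here, only demands a simple, plain-language explanation, so the schema below is purely hypothetical.

```python
# A hypothetical format for a single algorithm-disclosure entry, plus a helper
# that renders it as the plain-language sentence the law would require.
from dataclasses import dataclass

@dataclass
class AlgorithmDisclosure:
    platform: str        # who is disclosing
    signal: str          # what the algorithm looks at
    effect: str          # what it does to visibility
    plain_language: str  # the human-readable summary

    def render(self) -> str:
        return f"[{self.platform}] When {self.signal}, the post is {self.effect}. {self.plain_language}"

example = AlgorithmDisclosure(
    platform="ExampleSocial",
    signal="a post links to another website",
    effect="shown lower in followers' feeds",
    plain_language="Posts that send people off the platform reach fewer people.",
)

print(example.render())
```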
“We’re sweeping away the hidden hand of social media promotion and popularity,” the bill’s initial sponsor, state senator Justin Chi, said in a statement. “Also, any code that results in shadow-banning.”
Some of the early findings confirm things that were already well known: Facebook restricts the spread of conspiracy-theory content around 5G, vaccines, and politics. Others are more obscure and might simply be errors, like the fact that LinkedIn minimizes the spread of podiatry content.
But some say meeting the new law’s requirements is actually impossible.
“Honestly, we don’t always know why the engine returns a certain result,” Google chief scientist Franklin Mehhan told me. “It’s extremely complex, uses a lot of data, involves artificial intelligence, and can’t easily be distilled into human language. Certainly not in a concise form.”
So far Chi remains unmoved.
Platforms that don’t comply with the new order face a daily penalty of $1,000 or as much as 0.1% of annual revenue, which could quickly get costly. That’s resulted in a rather odd situation at Google, which has built an AI to translate the reasons behind its main search AI’s ranking decisions into plain English. Artificial intelligence, in other words, to translate the decisions of artificial intelligence.
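The general shape of that translation layer is easier to picture than it sounds. Here’s a deliberately simplified sketch of the idea: take per-signal contributions to a ranking score and template them into English. It assumes an additive scoring model with invented signal names, and it reflects nothing about Google’s actual system; real explanation techniques (surrogate models, feature attribution) are far more involved.

```python
# A toy "AI explains AI" pass: turn signed per-signal contributions to a
# ranking score into a short plain-English explanation. Signals and weights
# below are invented for illustration.

def explain_ranking(contributions: dict[str, float]) -> str:
    """Render signal contributions, biggest first, as a plain-English sentence."""
    ranked = sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)
    parts = []
    for signal, weight in ranked:
        direction = "raised" if weight > 0 else "lowered"
        parts.append(f"{signal} {direction} this result by {abs(weight):.2f}")
    return "; ".join(parts) + "."

# Hypothetical contributions for a query like "almond milk":
print(explain_ranking({
    "links from other trusted sites": +0.42,
    "how well the page matches the query": +0.31,
    "slow page load time": -0.08,
}))
```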
Some of the explanations for common search results, I’m told, verge on War and Peace-like lengths: 40, 50, or even 90 pages of single-spaced explanation as to why a Wikipedia article ranks first for “almond milk.”
Still, data activists hold out hope for good results.
“Look,” says Jones. “We need the ability to understand why platforms show us the content they do. And it’s always nice to know why our own posts, tweets, and links get or don’t get attention.”
According to rumors, the European Union is watching the rollout of this law closely and is considering similar legislation of its own.
Again, this is a chapter of Insights from the Future, a book I’m writing about technology, innovation, and people … from the perspective of the future. Subscribe to my newsletter to keep in touch and get notified when the book publishes.