Facebook can shift elections. That’s why, with rumors swirling that the social media CEO might run, transparency is needed now more than ever
Despite his protestations to the contrary, Facebook founder Mark Zuckerberg has been acting like someone planning to run for office. He hired a pollster, visited a Detroit auto plant and other swing-state locations, and gave a high-profile commencement speech.
Meanwhile, Facebook has been under intense criticism for its role as a vector of misinformation in recent elections. This week, Facebook admitted that Russian accounts purchased $100,000 in political ads in 2015 and 2016. That disclosure came only two months after the platform refused to reveal who pays for advertising on it and where those ads run.
This confluence of events demonstrates the urgent need for greater transparency about how Facebook is already being used for electoral influence, particularly its algorithms and advertising features. Facebook must be regulated like the broadcast medium that it has become. And if Zuckerberg wants to run for office, he should be leading the charge for meaningful transparency.
The Facebook platform already has the ability to shift the outcome of an election if it so chooses. Setting aside the issue of so-called ‘fake news’ and its spread on the platform during 2016, Facebook itself is a powerful tool of voter mobilization and information.
My own research demonstrates that seeing one’s Facebook friends praise others for voting increases turnout; other work finds that exposure to voting-related posts on Facebook increases turnout.
Meanwhile, Facebook’s internal research shows that exposing users to voting reminders that include pictures or names of friends also makes them more likely to vote. Algorithmic shifts that prioritize showing these kinds of messages to certain groups of voters and suppressing them from others could theoretically be used to help a particular candidate.
Currently, however, candidates can’t manipulate the algorithm – they only have access to Facebook’s public organizing and advertising tools. They can create fan pages, pay to promote posts, use Facebook ads to recruit fans and build their email lists, and deploy apps that supporters can use to engage their friends.
And while shifts in Facebook’s algorithms can hurt the organic reach of posts by candidate pages, theoretically these changes are deployed without the intent to harm or benefit specific campaigns.
The algorithm makes Facebook different from other media – we don’t see a customized TV feed or hear different radio songs based on what we’ve liked before. This algorithm has long been a black box for people outside of the company. Now, it’s time for the platform to explain how it influences what political content gets spread on the network.
Advertising is another area where Facebook needs to radically increase transparency. While candidates must disclose their spending on required campaign finance filings, they can avoid reporting their exact Facebook spending amounts by hiring a firm to run the ads and paying for them as part of their overall consulting fee.
Other entities – such as businesses, social welfare organizations, and so-called “dark money” groups – don’t have to disclose their Facebook spending at all. Furthermore, there’s nothing to stop the administrator of a Facebook page from paying to promote misleading or false content into millions of newsfeeds – and it’s not clear that they would have to disclose it to anyone.
That’s why Facebook needs to proactively disclose the political advertisements run on its platform: the amount spent, the spender, the content, and the targeting. There is precedent for this kind of disclosure: TV and radio stations have long been required to maintain so-called public inspection files of political ad purchases.
This reform would also result in the disclosure of spending by any autocratic states or regimes in weak democracies that want to promote propaganda on Facebook. Facebook should also block the use of their paid features to promote links to websites and pages that knowingly publish political misinformation.
That would limit the spread of fake news on the platform going forward. And while Facebook claims to have blocked ads from running on fake news sites, they have not yet taken the step to block ads on Facebook by fake news sites – or haven’t disclosed it, if they have.
Disclosure of the algorithm and ad spending are big steps – and would generate an equally large increase in transparency regarding Facebook’s role and use in spreading political content. While these disclosures would not necessarily stop outside interference or bad actors, at the very least voters would have the chance to know how the platform was being used to influence them.
Facebook’s report on “information operations” in the 2016 election was a good first step in identifying organized efforts to use Facebook to influence political opinion. However, that analysis focused primarily on fake accounts and coordinated efforts to boost the organic spread of misinformation. Facebook’s paid features and algorithm are arguably much more powerful.
With two billion global users, Facebook has been adopted faster than any technology or service in human history, and due to its widespread use, people deserve more information regarding its influence in elections. We need to know who is paying to put political content in our newsfeeds, and how the Facebook platform itself determines who sees what.
Source: theguardian.com