Who’s responsible for misinformation on social media? The case for rethinking accountability.

Social media companies are losing the battle against misinformation. We explore why decentralised accountability, not algorithms, may hold the key — and what a blockchain-inspired approach could look like.
The pressure on social media platforms is real – but are the responses working?

Over the past few years, social media companies have faced mounting pressure to regulate content accessible or promoted through their platforms while addressing the pervasive issue of misinformation. The prevailing sentiment is clear: “Social media companies should do more.”

In response, these companies have implemented various measures to mitigate these challenges. However, from an observer’s perspective, the effectiveness of these measures in policing information remains questionable.

Why misinformation is so hard to police at scale

The authenticity and accuracy of information shared on social media platforms is often dubious, a problem that extends to many media outlets, and such content is frequently labelled “fake news.” Alarmingly, social media has become the primary source of information for a significant portion of the population. As a result, many individuals are increasingly looking to governments and regulators for guidelines and legislation that impose specific standards on social media companies.

The limits of algorithmic detection

From these companies’ standpoint, effectively monitoring the content disseminated on their platforms is a complex and elusive challenge. The proliferation of advanced tools for generating content complicates this task. Although a variety of algorithms can be developed and deployed to identify and remove content that fails to meet certain criteria, relying solely on algorithmic techniques to discern fake content is a short-term solution at best.

The rapid advancements in artificial intelligence (AI) will likely continue to enable the creation of increasingly sophisticated content that evades detection by automated systems. Social media companies are acutely aware of the challenges associated with monitoring content on their platforms. Consequently, they appear to be gravitating towards a decentralised approach, placing the responsibility for verifying information authenticity in the hands of individual users.

The village analogy: why accountability requires skin in the game

To better understand the rationale behind this reliance on users, consider the dynamics of a village. In such a setting, individuals are often well-acquainted with one another, and stories spread rapidly. An essential aspect of storytelling within the village is that each person who shares a story risks their reputation and their standing in the community. This creates a tangible sense of accountability concerning the decision to share information.

What banking got right (and social media hasn’t)

In combating online fraud, many mainstream banks in the UK have adopted a similar philosophy, encouraging users to take some responsibility for verifying the identities of those with whom they transact.

Customers are frequently prompted to reconsider the recipient of their payments, leading many to take extra time to confirm the legitimacy (to the best of their knowledge) of the transaction. The stakes are high, as the risk of losing hard-earned money compels individuals to act judiciously. This dynamic fosters a strong implicit assumption that participants in the ecosystem will act responsibly.

Why individual accountability breaks down on social media

However, this sense of accountability is often weaker in the realm of social media. Information there is disseminated through posts, shares, reposts, and likes, actions akin to retelling a story within a community, since each makes content accessible to the user’s network. Given the overwhelming volume of information, individuals rarely scrutinise everything they encounter, yet they may still feel compelled to share, repost, or like it. Unlike in smaller communities, there is minimal individual risk attached to spreading incorrect information: while there is a potential collective cost, the average user is unlikely to factor it into their decision-making. The absence of individual rewards for sharing accurate information compounds the problem.

As such, it can be argued that the decentralised approach of shifting the responsibility of verifying content authenticity to users is impractical, if not unrealistic, within the current social media landscape.

Rethinking the share, like, and repost – a mechanism for change

Social media platforms have the potential to enhance accountability by emulating real-life ecosystems. A straightforward approach might involve penalising users for spreading fake news. By incentivising users to share, like, or repost only content they believe to be true (to the best of their knowledge), platforms can instil a sense of accountability and responsibility at the individual level, thereby curbing the spread of misinformation.

However, significant challenges arise in designing and implementing mechanisms to effectively deter users from sharing untrustworthy information. While it may be impossible to prevent the posting of misleading content entirely, rethinking the “share,” “like,” and “repost” mechanisms for each piece of content could provide a powerful tool for controlling content quality while still allowing users to determine what they wish to disseminate.
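To make the rethought “share” mechanism concrete, the Python sketch below treats every share as an explicit endorsement that stakes a slice of the user’s reputation. All names, the starting reputation, and the stake size are illustrative assumptions, not a description of any existing platform; the point is only that endorsers gain or lose standing once content is later judged accurate or inaccurate.

```python
from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    reputation: float = 1.0  # illustrative starting score; rises/falls with outcomes

@dataclass
class Post:
    content: str
    endorsers: list = field(default_factory=list)

def endorse_and_share(user: User, post: Post) -> None:
    """Sharing doubles as an endorsement: the user stakes reputation on the post."""
    post.endorsers.append(user.name)

def resolve(post: Post, users: dict, was_accurate: bool, stake: float = 0.1) -> None:
    """Once a post is judged accurate or not, adjust every endorser's reputation."""
    for name in post.endorsers:
        if was_accurate:
            users[name].reputation += stake
        else:
            users[name].reputation = max(0.0, users[name].reputation - stake)

users = {"alice": User("alice"), "bob": User("bob")}
post = Post("Claim: the bridge closes on Friday")
endorse_and_share(users["alice"], post)
endorse_and_share(users["bob"], post)
resolve(post, users, was_accurate=False)  # the claim turns out to be false
print(round(users["alice"].reputation, 1))  # 0.9
```

How accuracy is eventually judged, and who judges it, is deliberately left open here; the sketch only shows that attaching a personal cost to sharing recreates the village-style accountability described above.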

A blockchain-inspired model for decentralised content verification

The proposed approach to enhance accountability and information verification on social media can be conceptualised as a relaxed blockchain mechanism that adopts key principles of decentralisation and transparency, without the full complexity of traditional blockchain systems. In this framework, users collectively verify the authenticity of content while creating a transparent record of interactions.

Instead of relying solely on algorithmic detection or centralised authority, this approach empowers individuals to contribute to the validation process, akin to how nodes in a blockchain network validate transactions. By recording each user’s actions, such as sharing and liking content, a decentralised ledger emerges that tracks the provenance of information. This structure promotes a sense of accountability among users, as they recognise that their contributions impact the overall integrity of the information ecosystem.
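The ledger idea can be illustrated with a minimal hash-chained log in Python. The class name, the recorded fields, and the all-zero genesis hash are assumptions made for this sketch rather than a real platform API; what it demonstrates is that once interactions are chained by hash, any later tampering with the record is detectable.

```python
import hashlib
import json

class ProvenanceLedger:
    """Append-only, hash-chained record of user interactions with content."""

    def __init__(self):
        self.entries = []

    def record(self, user: str, action: str, content_id: str) -> dict:
        # Each entry commits to the previous entry's hash, forming a chain.
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {"user": user, "action": action,
                 "content_id": content_id, "prev": prev_hash}
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute every hash; any tampering breaks the chain."""
        prev = "0" * 64
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            recomputed = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if body["prev"] != prev or recomputed != entry["hash"]:
                return False
            prev = entry["hash"]
        return True

ledger = ProvenanceLedger()
ledger.record("alice", "post", "story-42")
ledger.record("bob", "share", "story-42")
print(ledger.verify())                 # True
ledger.entries[0]["user"] = "mallory"  # tamper with the provenance
print(ledger.verify())                 # False
```

Unlike a full blockchain, there is no mining or distributed replication here; the chain alone is enough to give each share and like a verifiable provenance trail.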

Community-driven consensus: flexible, scalable, practical

Moreover, while traditional blockchains often employ consensus mechanisms to validate transactions, the proposed solution can implement a more flexible form of consensus based on user interactions and reputations. This relaxed mechanism allows for a dynamic assessment of content credibility, where multiple users can weigh in on the authenticity of information without the stringent requirements of a fully decentralised blockchain.

By leveraging community-driven validation, the system can adapt to the rapidly changing landscape of social media, fostering a collaborative environment where users are incentivised to share accurate information. This hybrid approach retains the core benefits of blockchain—such as transparency and accountability—while simplifying the verification process, making it more accessible and practical for everyday users navigating the complexities of online information.
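Such a relaxed consensus could be as simple as a reputation-weighted vote. The function below is a hypothetical sketch: the 0.6 threshold and the individual reputation scores are illustrative values, not drawn from any existing system.

```python
def credibility(votes: dict, reputations: dict, threshold: float = 0.6):
    """
    Relaxed consensus: each vote (True = authentic, False = not) is weighted
    by the voter's reputation; content is deemed credible if the weighted
    share of "authentic" votes clears the threshold.
    """
    total = sum(reputations[user] for user in votes)
    if total == 0:
        return None  # no reputational weight behind any vote: undecided
    true_weight = sum(reputations[user] for user, vote in votes.items() if vote)
    return (true_weight / total) >= threshold

reputations = {"alice": 0.9, "bob": 0.4, "carol": 0.7}
votes = {"alice": True, "bob": False, "carol": True}
print(credibility(votes, reputations))  # True: weighted score 1.6/2.0 = 0.8
```

Because established, reliable users carry more weight than new or discredited accounts, this kind of scoring approximates consensus without the strict validation requirements of a fully decentralised blockchain.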


If this has got you thinking about how your organisation handles data integrity, content trust, or the technology underpinning either, we’d enjoy the conversation – talk to us.
