CORDIS - EU research results

Computing Veracity Across Media, Languages, and Social Networks


A step closer in tackling online rumours and fake news

With a recent explosion in fake news online – from propaganda to ‘alternative facts’ – online media desperately needs new systems to check news veracity and validity. A recent EU-funded project may have the answer.

Digital Economy
Society

Rumours have been around for millennia, as attested by the ancient (and modern) Greek word ‘pheme’, which means rumour or inaccurate information. In the age of social media, people get most of their news and information from the internet, often making decisions or forming opinions based on false information.

The EU-funded PHEME (Computing veracity across media, languages, and social networks) project developed software tools that can identify and verify the veracity of online rumours. Such efforts couldn’t have come at a better time, as the spread of alternative media has motivated journalists and media managers worldwide to intensify their fact-checking and verification efforts. ‘Recent high-profile examples are elections and referenda, where false information and online propaganda may have misled numerous citizens,’ says project coordinator Kalina Bontcheva from the University of Sheffield in the United Kingdom. ‘Social media platforms such as Facebook are also having to rise to the challenge of limiting the impact of misinformation online,’ she adds.

PHEME developed a sophisticated computational framework for rapid automatic discovery and verification of rumours on a large scale. ‘One key tool we created is an open-source journalism dashboard that helps journalists track emerging rumours and examine key aspects of relevant discussions on social media,’ explains Prof. Bontcheva. ‘Another is an automated fact-checking tool that assists journalists in checking the validity of claims made by politicians or in the news media.’ In addition, PHEME developed a dashboard aimed at identifying and analysing medical misinformation.

To ensure the tools’ efficiency, the project team used past rumours as training data for machine learning algorithms. ‘We trained models to spot the opinions or stances that people are taking about a claim, and based on that picked out how likely a rumour is to be true or false,’ explains Prof. Bontcheva.
‘Once this is completed, the team can assign a value that reflects the veracity of the rumour.’ The work has been challenging, as machines – like people – can also fall for half-truths and propaganda. Nonetheless, machine performance is improving continuously, and machines can cross-reference large amounts of information from different sources very quickly.

‘The PHEME tools can provide the evidence, but human input is needed for the decision making,’ says Prof. Bontcheva, underlining that ‘the PHEME rumour analysis tools are intended to assist, not replace, journalists in the decision-making and verification tasks.’ The current accuracy level of around 75-80 % has been quite satisfactory for the project’s purpose. Importantly, most of the tools are open source and freely available to scientists, journalists and medical professionals so they can experiment with them. Some news organisations are trialling the tools, and discussions are ongoing about commercialising the outcomes.

One global challenge in the field is how to eliminate fake news from people’s social feeds. ‘Although some key players have started developing solutions, they haven’t solved the problem, in the same way that we haven’t been fully able to eliminate email spam,’ argues Prof. Bontcheva. ‘Any tools must also be complemented with raising user awareness and educating social network users about how to identify fake news and unreliable content,’ she adds.

In the meantime, project partners are continuing to improve the algorithms in terms of reliability, scalability and efficiency beyond the project’s end date. ‘The technology is getting better and we’ve pushed the state of the art from what it was three years ago,’ says Prof. Bontcheva. Overall, PHEME has created and released several human-verified rumour datasets and software tools. These can be scrutinised and verified independently, which also helps address concerns about tool-based censorship.
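To make the stance-then-veracity pipeline concrete, here is a minimal illustrative sketch in Python. The stance labels follow the support/deny/query/comment scheme commonly used in rumour stance classification, but the simple counting rule below is an assumption for illustration only: PHEME's actual tools use trained machine-learning models, not this heuristic.

```python
from collections import Counter

# Illustrative stance labels (support/deny/query/comment scheme);
# in practice these would be predicted by a trained stance classifier.
SUPPORT, DENY, QUERY, COMMENT = "support", "deny", "query", "comment"

def veracity_score(stances):
    """Aggregate crowd stances toward a rumour into a rough veracity
    score in [0, 1]: 1.0 = strongly supported, 0.0 = strongly denied.
    Query and comment stances carry no evidence either way.
    This is a toy heuristic, not the PHEME algorithm."""
    counts = Counter(stances)
    support, deny = counts[SUPPORT], counts[DENY]
    if support + deny == 0:
        return 0.5  # no informative stances: undecided
    return support / (support + deny)

# Example: a rumour whose replies mostly deny it
stances = [DENY, DENY, SUPPORT, QUERY, COMMENT, DENY]
score = veracity_score(stances)  # 1 support vs 3 denies -> 0.25
```

The final score would then feed the kind of human-in-the-loop decision the article describes: the tool surfaces the evidence, and a journalist makes the call.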
The findings will be instrumental in making news more reliable, and thus our beliefs and decisions more sound.

Keywords

Online rumour, fake news, veracity, PHEME, social networks
