With the explosion in the number of websites offering political viewpoints, health advice and other information, researchers are working on tools that can quickly determine if a website is credible.

The need for this has increased due to the growth of user-generated content like blogs, said Andreas Juffinger, a researcher at the Know-Centre, a technology research centre in Austria.

He is part of a team developing a programme that analyses blogs and automatically ranks them as "highly credible", "average credible" or "little credible".

It does this by comparing statistical properties of a blog entry, such as how certain words are distributed over time, with those of news articles on the same topic from mainstream sources that have already been deemed credible.
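The article does not spell out the statistics involved, but the core idea, scoring a blog entry by how closely its word usage tracks trusted coverage of the same story, can be sketched briefly. In the Python sketch below, the cosine-similarity measure and the score thresholds are illustrative assumptions rather than the Know-Centre team's actual method.

```python
# A minimal sketch of the comparison step described above. The similarity
# measure (cosine similarity over word-frequency vectors) and the thresholds
# are assumptions for illustration, not the published method.
import math
from collections import Counter

def word_distribution(text: str) -> Counter:
    """Count how often each word appears in a text."""
    return Counter(text.lower().split())

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Compare two word-frequency distributions (1.0 means identical usage)."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def rank_credibility(blog_post: str, trusted_articles: list[str]) -> str:
    """Rank a blog post against mainstream coverage already deemed credible."""
    blog_dist = word_distribution(blog_post)
    best = max((cosine_similarity(blog_dist, word_distribution(article))
                for article in trusted_articles), default=0.0)
    if best > 0.7:                      # illustrative threshold
        return "highly credible"
    if best > 0.4:                      # illustrative threshold
        return "average credible"
    return "little credible"
```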

"It has shown promising results, we think we are on the right path," said Mr Juffinger.

"It has to be automatic because it is not possible for customers to label and read all these blogs," he added.

Japanese researchers are developing a programme that mines the web for a variety of viewpoints on an issue and then presents them to internet users, together with supporting evidence, as part of a "statement map" that makes clear how the different opinions are related.

"We really believe that 'statement maps' can help users come to conclusions about the reliability of a website," said Koji Murakami of the Nara Institute of Science and Technology who is working on the project.

The number of websites available worldwide has risen from just 500 as recently as 1994 to tens of millions today, according to Microsoft.

One of the most popular websites today is Wikipedia, the online encyclopaedia, which uses open-source software that lets anyone reading a subject entry edit it, a feature that has raised doubts about its credibility.

Those doubts flared again in January when its entry on US Senator Edward Kennedy was altered to falsely report that he had died after suffering a seizure at a luncheon following the inauguration of President Barack Obama.

Despite periodic incidents such as this one, Wikipedia has grown since it was founded in 2001 to have some 2.6 million articles in English alone.

To determine the credibility of Wikipedia entries, researchers at Italy's University of Udine are working on an algorithm that assigns quality scores to its articles and contributors.

"Preliminary results demonstrate that the proposed algorithm seems to appropriately identify high and low quality articles," the research team led by Alberto Cusinato wrote in a paper presented at the Madrid conference.

British journal Nature examined a range of scientific entries on both Wikipedia and Encyclopaedia Britannica in 2005 and concluded that, while Wikipedia entries were often poorly structured and confused, there were few differences in accuracy.
