The trustworthiness of a web page might help it rise up Google’s rankings if the search giant starts to measure quality by facts, not just links.
...
Instead of counting incoming links, [Google’s system for measuring the trustworthiness of a page] – which is not yet live – counts the number of incorrect facts within a page. “A source that has few false facts is considered to be trustworthy,” says the team (arxiv.org/abs/1502.03519v1). The score they compute for each page is its Knowledge-Based Trust score.
The software works by tapping into the Knowledge Vault, the vast store of facts that Google has pulled off the internet. Facts the web unanimously agrees on are considered a reasonable proxy for truth. Web pages that contain contradictory information are bumped down the rankings.
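For what it's worth, here is a toy sketch of the idea as I read it: extract the factual claims a page asserts as (subject, predicate, value) triples, check each against whatever store of "known facts" you trust (the Knowledge Vault, in Google's case), and treat the share that check out as the page's trust score. The names and data below are made up, and the actual paper describes a joint probabilistic model over extractor errors and source errors rather than naive counting, so take this only as an illustration of the counting idea in the quote.

```python
# Toy illustration (not Google's actual method): score a page by the fraction
# of its extracted (subject, predicate, value) triples that agree with a
# reference fact store standing in for the Knowledge Vault.

# Hypothetical reference store, keyed by (subject, predicate).
KNOWLEDGE_VAULT = {
    ("Barack Obama", "place_of_birth"): "Honolulu",
    ("Eiffel Tower", "located_in"): "Paris",
}

def knowledge_based_trust(page_triples):
    """Return the share of a page's triples that match the reference store.

    Triples whose (subject, predicate) pair is unknown are skipped, since
    they can be neither confirmed nor contradicted.
    """
    checked = correct = 0
    for subject, predicate, value in page_triples:
        expected = KNOWLEDGE_VAULT.get((subject, predicate))
        if expected is None:
            continue  # fact not covered by the reference store
        checked += 1
        if value == expected:
            correct += 1
    return correct / checked if checked else 0.5  # no evidence -> neutral score

# Example: one confirmed fact, one contradicted fact -> trust score 0.5
page = [
    ("Eiffel Tower", "located_in", "Paris"),
    ("Barack Obama", "place_of_birth", "Kenya"),
]
print(knowledge_based_trust(page))
```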
I’m guessing that in practice this means ranking websites by the popularity of their delusions. The problem is that you can’t distinguish fact from fiction without reference to the external world. Furthermore, given how bad Wikipedia is at getting its “facts” right about any vaguely controversial topic, I don’t have a lot of confidence in the ability of the internet to settle on the truth.
Edit: speaking of bad sources of “facts”, why are you treating New Scientist as a reasonable source?
...wait, what?
...I guess they don’t actually mean “unanimously”...