Search engines like Google make the web go round. Billions of visits are funneled each month through these Internet giants, and without their ongoing dominance it's hard to imagine what the structure of the net might look like. For a business, obtaining a top ranking for its core keyword term has the potential to generate a serious return over a span of months and years. The incentives for webmasters to engage with search engines and attempt to 'game' the results are therefore vast, creating a fundamental problem for search engines that will continue to rear its ugly head as human website reviews play a growing role in determining rankings.
The trouble with algorithmic search is that once the algorithm is known, it can be manipulated, and the incentives actively encourage manipulation. Google does its best to hide parts of the algorithm and shroud its methods in secrecy, but the nature of the web makes it possible for anyone to test their own theories of how the algorithm functions, while also providing access to the thoughts and theories of professional search engine optimizers.
The result? A situation where, even on a limited budget, any webmaster can influence their website's apparent relevance in the search engines to ensure more visitors end up on their page than on a competitor's.
The glaring problem this causes is a pollution of the search engine rankings: it is naturally at odds with the principles of a search engine to allow biased human intervention to determine how visitors use the web and the quality of site they might expect to visit. In effect, it enables webmasters to conduct their own website reviews.
Of course, algorithmic search does benefit from the automation and sophistication of technology, and in terms of scale it's simply impossible for any other form of search, however refined, to compete. How else could billions of website results be indexed and ranked, to the extent that even the most obscure long-tail search terms return a determined ranking of pages?
Human-derived search, while far from perfect, would appear to help solve at least the problem of relevancy pollution, providing independent, corroborating website reviews to determine the ranking of a particular website. When a collective of users determines which sites are to be commended and which given a lower ranking, the incentives are based purely on utility: ensuring the best websites get the most coverage to improve the experience of web users. As a result, commercial bias is stripped from the equation, leaving a much more refined series of results that are likely to provide the most value to the searcher.
The problem with search engine rankings is that they take no notice of independent website reviews, and as such effectively permit individuals to present their own content as the most relevant and of most interest to searchers. User-generated website reviews, on the other hand, provide a means by which web users can truly build up a picture of the relevance of their destination, without the growing concern of tainted, compromised results that fail to reflect the best sites on the web.