Why is there so little variety of unique sites in Google search results?

Are there simply fewer unique websites than a decade ago? Are there more big monoliths of information that make individual sources less common or necessary? Is there some deeper explanation for it all? Search results used to be extremely varied and diverse.

Google was good at launch because it harvested data from webrings and directories, which gave it high-quality link-ranking data. However, it never thanked, credited, or shared any of its revenue with the sites whose human curation made its results so impressive. Once Google search proved effective, most human curators stopped maintaining directories and webrings. The SEO industry picked up the slack and began curating “blogs” that are junk links to junk products. This pair of outcomes led to the gradual, ongoing decay of Google’s result quality.
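To see why curated links mattered so much, here’s a minimal sketch of PageRank-style link scoring in Python. The toy graph, page names, and parameters are all hypothetical; the point is just that a hand-maintained directory pointing at good sites concentrates ranking score on them.

```python
# Minimal sketch of PageRank-style link scoring over a hypothetical toy graph.
# Real crawls have billions of nodes; dangling-node mass is dropped for brevity.

def pagerank(links, damping=0.85, iterations=50):
    """Power-iteration PageRank over a dict of page -> list of outbound links."""
    pages = set(links) | {dst for dsts in links.values() for dst in dsts}
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / len(pages) for page in pages}
        for src, dsts in links.items():
            if dsts:
                # Each page splits its current score among the pages it links to.
                share = damping * rank[src] / len(dsts)
                for dst in dsts:
                    new_rank[dst] += share
        rank = new_rank
    return rank

# A curated directory linking out to three sites concentrates score on them.
toy_graph = {
    "directory": ["site_a", "site_b", "site_c"],
    "site_a": ["site_b"],
    "site_b": [],
    "site_c": ["directory"],
}
for page, score in sorted(pagerank(toy_graph).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

Strip the curated directory out of that graph and the remaining scores are driven by whoever builds the most links, which is exactly the opening SEO farms exploit.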

Google has not yet discovered how to automate “is this a quality link?” evaluation, since it can’t tell the difference between an amateur who has put in 20 years and just writes haphazardly and an SEO professional who uses Markov-generated text to juice links. It has started to promote selected “human-curated” sources of knowledge above the search results, which has backfired in cases like a political party’s search results showing a parody image. Google simply cannot evaluate trust without the curated data it initially harvested to make its billions, and without that curation its algorithm will continue to fail.
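For a sense of how cheap that junk text is to produce, here is a minimal sketch of a Markov-chain text generator; the corpus and chain order are illustrative assumptions, not anything a real SEO shop has published. The output is locally plausible word by word, which is exactly what makes it hard to filter statistically.

```python
# Minimal sketch of Markov-chain text generation with a tiny hypothetical corpus.
import random
from collections import defaultdict

def build_chain(words, order=2):
    """Map each tuple of `order` consecutive words to the words seen after it."""
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=30):
    """Random-walk the chain to emit plausible-looking but meaningless text."""
    key = random.choice(list(chain))
    out = list(key)
    for _ in range(length):
        successors = chain.get(tuple(out[-len(key):]))
        if not successors:
            break
        out.append(random.choice(successors))
    return " ".join(out)

corpus = ("the best budget widget reviews show that the best widget "
          "for the money is the widget that reviews say is best").split()
print(generate(build_chain(corpus)))
```

Every two-word window in the output is something a human actually wrote somewhere in the corpus, so any filter that only checks local word statistics will wave it through.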