The recent claims by HubPages, a site hit hard by the Panda update, that moving all their content to various subdomains has returned their traffic to near pre-Panda levels suggest that there is a clear problem with the Google algorithm and that it has been failing users.
HubPages lost over 50% of their search engine traffic in February, after Google clamped down on the site and a lot of other sites that it claimed hosted low-quality content.
After consulting with Matt Cutts, a Google fellow, HubPages decided that giving each author their own subdomain, much as WordPress and Blogger do, would be the best way to split up the content and help Google distinguish what is high quality and what is crap.
Following the change, HubPages have reported that their traffic levels are back near pre-Panda levels.
It is already widely known that Google treat subdomains as individual sites, so the site-wide blanket factors Google apply don’t impact subdomains as much as they would if the content were hosted on a single domain.
I’m not worried about whether this is the best way to combat the Panda update; I know a lot of people agree and disagree with it as a solution. What I am worried about is the obvious and gaping hole that clearly still exists in the Google algorithm.
Google Lost Focus
In my opinion, Google have lost focus on what makes a great algorithm and applied too much weight to algorithmic factors that shape Google’s opinion of a website as a whole.
Factors like the homepage’s PageRank, number of indexed pages, number of inbound links and social media influence on the site as a whole.
The HubPages example shows that Google are currently ranking pages based on site-wide factors rather than on a page-by-page basis.
While this may in some cases lead to better-quality search results (and Google extensively research the impact of algorithm changes on search quality before implementing them), in other cases it has led to spam and low-quality results ruling the web and to the proliferation of content farms.
As Google strove to move away from on-page factors, because some are easily manipulated by the webmaster, they left themselves open to large sites with a big brand showing up in search results where they shouldn’t, simply because they score well on the site-wide factors.
Right Idea – Poor Execution
I understand why Google tried to reduce the weighting on on-page factors like having the keyword in the title tag, first 90 words, URL, image alt tag, etc.
But as Google moved away from those on-page factors, they should have looked to move towards other on-page factors that were harder to manipulate.
Instead they focused on a lot of site-wide factors that were harder, but not impossible, to manipulate.
The Panda update essentially sought to regain that balance, but instead of regaining it by increasing the weighting of hard-to-manipulate on-page factors, like Latent Dirichlet Allocation (LDA) and other complicated signals that are hard, or not worth the effort, to game, it doubled down on site-wide judgements of whole domains.
The article I linked to above shows a large correlation between LDA and search engine rankings, so Google may already be using these factors, but clearly not enough; or maybe they don’t trust them enough to think it would be wise to increase their weighting.
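To make the idea concrete, here is a minimal, purely illustrative sketch of how a topic distribution inferred by something like LDA could serve as a hard-to-game relevance factor. This is not Google’s actual system; the topic vectors below are made up for the example, and a real system would infer them from the query and page text.

```python
# Hypothetical sketch: comparing LDA-style topic distributions as an
# on-page relevance signal. All topic vectors here are invented
# illustrations, not output from a real topic model.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two topic-probability vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Made-up distributions over four topics (probabilities sum to 1).
query_topics   = [0.70, 0.20, 0.05, 0.05]  # query is mostly topic 0
on_topic_page  = [0.65, 0.25, 0.05, 0.05]  # page genuinely covers topic 0
off_topic_page = [0.05, 0.10, 0.45, 0.40]  # thin page padded with keywords

print(cosine_similarity(query_topics, on_topic_page))   # high similarity
print(cosine_similarity(query_topics, off_topic_page))  # low similarity
```

The point is that a page’s whole topic distribution is much harder for a webmaster to fake than a keyword in a title tag, which is what makes this class of factor attractive.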
A possible solution would be for Google to implement conditional factors that are only applied when a large number of content-farm or big-brand sites show up for a single query.
So, for these big-brand-dominated queries, the weighting of LDA-type factors would be increased, most likely improving the quality of those search results, while queries that the current algorithm already returns as high quality would be unchanged.
It’s sort of like having an algorithm analyse the work of the Google algorithm and make decisions from the findings.
Of course there are many other possible solutions but that is a potential one.
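As a rough sketch of the conditional-factor idea, the toy reranker below boosts the weight of an on-page topic-relevance score only when the top results for a query are dominated by big-brand or content-farm domains. Every name, score, and threshold here is an illustrative assumption of mine, not one of Google’s actual signals.

```python
# Hypothetical sketch of conditional re-weighting: on queries dominated
# by big-brand/content-farm domains, the hard-to-game on-page relevance
# score gets the larger share of the blended weight. Scores, weights and
# the threshold are invented for illustration.

def rerank(results, brand_domains, domination_threshold=0.5):
    """Re-score results, boosting topic relevance on brand-heavy queries.

    results: list of dicts with 'domain', 'site_score' (site-wide
    signals) and 'topic_score' (on-page relevance), each in [0, 1].
    """
    brand_hits = sum(r["domain"] in brand_domains for r in results)
    dominated = brand_hits / len(results) >= domination_threshold

    # Default blend favours site-wide factors; on dominated queries the
    # on-page topic-relevance factor takes the larger share instead.
    w_site, w_topic = (0.3, 0.7) if dominated else (0.7, 0.3)

    for r in results:
        r["score"] = w_site * r["site_score"] + w_topic * r["topic_score"]
    return sorted(results, key=lambda r: r["score"], reverse=True)

results = [
    {"domain": "bigbrand.example",  "site_score": 0.90, "topic_score": 0.20},
    {"domain": "bigbrand2.example", "site_score": 0.80, "topic_score": 0.30},
    {"domain": "niche-blog.example", "site_score": 0.40, "topic_score": 0.90},
    {"domain": "bigbrand3.example", "site_score": 0.85, "topic_score": 0.25},
]
brands = {"bigbrand.example", "bigbrand2.example", "bigbrand3.example"}

# On this brand-dominated query, the on-topic niche page now outranks
# the big brands.
print([r["domain"] for r in rerank(results, brands)])
```

Queries that are not brand-dominated keep the original site-wide weighting, so well-served queries are left untouched, which is the whole point of making the factor conditional.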
The reason I decided to write this article is that we all (myself included) look up to Google as the search engine powerhouse, when they may in fact be living on their original reputation as revolutionaries and spam killers even though this is no longer the case.
They may still be the best search engine in terms of search quality, but with moves like the Panda update, and the state of the algorithm before it, I suspect that Google could be overtaken if a new search engine with new ideas starts up.
While I would love to see somebody like Blekko do well, and maybe even implement solutions like the one I have suggested, with their current attitude towards search and their current results I can’t see them overtaking Google any time soon.
Still, I would love to be proven wrong, and I would love for a small search engine with a point to prove to take on Google over these types of queries with a new and creative solution and put some heat on the search giant.