Feeding The Beast: Why Google's Quality Rater Guidelines Are So Important
Published 11th December 2015
It's official! Google have voluntarily released their latest Search Quality Rater Guidelines, arguably one of the most important SEO documents of all time. Perhaps they knew the guidelines would be leaked anyway, or maybe they’re sending an important message to the Webmaster community.
In this article, I’ll dig into:
- Why every site owner should read and act upon the Quality Rater Guidelines
- Their history
- What they are used for
- Why, in my opinion, Google has officially released them
Did I mention that the Quality Rater Guidelines are one of the most important SEO documents ever released?
Why should you care?
If someone came to you and said: “Here is the secret to success”, I think you would be interested. If the secret to success also came with a shortcut, then I bet you would be even more interested, because we all want the shortest route to our goal. In my opinion, the Quality Rater Guidelines are the secret to success, because in this document, Google has brilliantly defined a whole framework for planning and executing the perfect website. Of course, you might say that’s not a shortcut, because I still have to build the website, do stuff people want and work hard. But shortcuts are relative. Spending months building out the wrong kind of site and burning money on trying to promote it seems like a lot of wasted effort, when you could just take some time out, read a document which clearly tells you what works and take a straight line to your goal. Doesn’t that make more sense?
I recently had the honour of judging the iGB Affiliate Awards, so I have been reviewing a huge number of affiliate websites. Of course, I had the Google Quality Rater Guidelines in mind, and a funny thing happened. I evaluated the sites, ranking them in order of merit and checking them on SEMrush to assess their search visibility, and my evaluation correlated almost perfectly with search traffic volumes.
The best sites win
In other words, Google is getting to a point where the best sites win. And they know what the best sites are, because they’re using machine learning and user engagement to evaluate sites in a more sophisticated way than ever before.
If you keep track of SEO news, you will have heard of RankBrain, now the third most important ranking signal there is. That's right, the third most important signal. Out of nowhere! Being driven by artificial intelligence, it needs training, and the Quality Rater Guidelines appear to be the evaluation criteria for the people training this algorithm. If you're not technical, but you are a parent, then you will know about teaching children the basics; from there, they grow and learn for themselves. This is how artificial intelligence works. You have a body of training data which says: "This is what good looks like". This tells the algorithm what to look for, because, like you and me, it needs somewhere to start from. It then processes this data and comes up with an evaluation, leading to decisions. In this case, quality raters tell the algorithm what is good and bad, and from there, it processes vast amounts of data and decides which websites should rank. The difference between a human learning something and Google's vast infrastructure is that their learning algorithms can do far more than a human ever could. Yes, I know it feels like Star Trek and the Borg. It kind of is. And it's creepy. So, that document is important, no… it's massively important. For me, it is the blueprint for every web project I do.
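To make the training idea concrete, here is a toy sketch in Python. This is entirely my own illustration, not Google's actual system: hand-labelled "rater" judgements act as training data for a simple learning rule (a perceptron), which then scores pages it has never seen. The features and numbers are invented for the example.

```python
# Toy illustration of supervised learning from rater labels.
# NOT Google's system: features and data are invented for this sketch.

# Hypothetical features per page: (content depth score, trustworthiness score)
rated_pages = [
    ((0.9, 0.8), 1),   # rater judged this page high quality
    ((0.8, 0.9), 1),
    ((0.2, 0.3), 0),   # rater judged this page low quality
    ((0.1, 0.2), 0),
]

# Train a minimal perceptron on the rater labels.
weights, bias = [0.0, 0.0], 0.0
for _ in range(20):                  # a few passes over the training data
    for (x1, x2), label in rated_pages:
        predicted = 1 if weights[0] * x1 + weights[1] * x2 + bias > 0 else 0
        error = label - predicted    # zero when the model agrees with the rater
        weights[0] += error * x1
        weights[1] += error * x2
        bias += error

def score(page):
    """Classify an unseen page using what the raters taught the model."""
    x1, x2 = page
    return 1 if weights[0] * x1 + weights[1] * x2 + bias > 0 else 0

print(score((0.85, 0.90)))   # prints 1: resembles the high-quality examples
print(score((0.15, 0.20)))   # prints 0: resembles the low-quality examples
```

The point is the division of labour: humans supply a small set of judgements about what "good" looks like, and the algorithm generalises those judgements across pages no human will ever review.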
What about black hat?
But you will say to me, what about links? Of course, PageRank still matters, but less and less so. That's why with link acquisition I'm seeing either really good links working or really good black hat links working, and nothing in the middle. Links are what made Google great in the first place, because they have been a reasonably good indicator of the relative popularity of content, but they are so easily gamed.
A brief history
Artificial intelligence was born in 1956, so it makes sense that Google engineers would have been investigating AI almost since the company’s birth. The first Quality Rater Guidelines emerged in 2008, and followed a very similar structure to today’s. In other words, even back then it’s clear they were training their artificial intelligence algorithms. Other copies were leaked in 2011 and 2012, and in 2013 Google came out with an abridged version. In 2014, another version was leaked, and finally today, we have the full unabridged version that has been officially released.
Why haven’t I heard about this before?
One question is why, despite this document being around for so many years, SEOs have not taken it very seriously. I've thought about this a lot, because over the last year and a half I have made a huge amount of noise about these guidelines, and I feel no one is really listening. I've noticed how other peers of mine get far more attention from talking about SEO housekeeping or how to score a good link. Of course this is valuable, but it's very myopic, because you can mess around with housekeeping all you like; if your site doesn't fulfil the criteria outlined in this document, you're wasting your time. I think I'm a relatively kind person, but my sympathy runs very thin when people cry on my shoulder about their website not ranking, when plainly it's a junk website and doesn't deserve to.
I think site owners now have to build websites people actually want. That may sound strange and obvious at the same time, but we've had years of easy rankings for websites people don't really want. We've gamed Google to rank these sites, and along the way they have produced revenue. Those easy days are drawing to a close. That's why when the 'hand that feeds you' (Google) gives you clear, concise guidelines that will definitely help you, it may be a good idea to take some time out and listen.
That’s great...but what next?
There is one big idea that runs through the whole document: user satisfaction. For Google, this means that when a user asks a question (queries a search engine), the search results should be the most satisfying they can be for that user. If you build websites that satisfy users, Google will want to rank you for queries where you will satisfy their users the most. The question you might ask yourself is: how can I satisfy users and make money? If you're an iGaming affiliate, the answer generally goes one of two ways, or a mixture of both. Users want:
- Lots of free, trustworthy bonus offers
- Trustworthy opinions on where to gamble
Above all, users want you to help them save time and make better decisions. Of course, there are other parts to this, like presenting your information well. Users care more about the utility of your website than how pretty it is, because ultimately they are there to fulfil an objective. The rater guidelines go into this in great detail.
Some of you, like me, are white label operators. I made the jump into bitcoin casinos a couple of months ago. Using the Quality Rater Guidelines as my frame of reference, I decided to prioritise UX and utility ahead of SEO, because I want to be wanted by users, and in turn by Google. The current iteration of our site is just like any other casino site, and that is very painful for me. That's why we're completely rebuilding, so we have something people really want to interact with. If you own a white label site, the number one question should be: how can I make my website better at helping users do what they want? In this case, they want to gamble. So is your website making gambling as easy and pleasant as possible? If you nail this, users will want to be on your website, Google will recognise this from user engagement data and the general footprint your website is laying down, and they will want you in their search results, because you satisfy their users.
Why release the guidelines officially?
Google aren't making a huge noise about officially releasing this document, because I'm sure they did so reluctantly. They knew it would get out eventually, but they also understand it's a rare insight into how their business works, and Google, being notoriously secretive, hates that. But if the documents are going to get out anyway, it's better to release them officially, keep some oversight, and gauge the level of interest in the quality guidelines more accurately.
In many ways these guidelines are more sincere than their Webmaster guidelines, because there’s no PR spin or agenda. It’s simply a training manual feeding into where they want their algorithm to go. That’s why it’s so credible. It’s worth remembering: If users want you, Google needs you. And Google is training their algorithm to find sites users want. So wouldn’t it just be a lot easier to give users what they want? To find the guidelines, just search for “2015 quality rater guidelines” and there’s a link on the Search Engine Land post.