Google’s algorithm is the driving force behind Search. It is the machine that sorts the entire web, deciding whether and how highly your website ranks. For a long time, the ranking factors behind it have been a mystery. Search Engine Optimization, or SEO, has relied on a guess-and-check approach as marketers have tried to optimize their websites to rank as highly as possible on Google.
With the May 27th leak of a portion of Google’s algorithm, we now get a peek behind the curtain of the rules that dictate Search. Here, we break down important points to keep in mind as you work to improve your website’s visibility.
It is important to note that this is a developing story. The leaked documentation discussed below covers more than 14,000 attributes, with gaps where information is still missing. More insights and analyses are expected to emerge in the coming weeks. Sign up for Quell’s Virtual Marketing Minute to stay updated on all developments.
Table of Contents
Was Google’s Algorithm Leaked?
What Did the Google Algorithm Leak Show?
What Are Some of the Key Attributes Google Considers?
What Is the Architecture of Google’s Ranking Systems?
How a Professional SEO Agency Can Help Your Business Make The Most of This Leak
Was Google’s Algorithm Leaked?
On May 27th, Mike King, with the help of Rand Fishkin, published an article that revealed a huge leak from Google. The article provided a first look at a portion of the Google algorithm that determines how websites rank in search results. The information came from Erfan Azimi, an SEO professional who claims to have been given access to these files by an ex-Google employee.
It’s important to note that we will not link to any leaked documents here. We will only dissect what Mike King and Rand Fishkin have already published.
What Did the Google Algorithm Leak Show?
The May 27th leak covers internal documents from Google Search’s Content Warehouse API. It does not show every aspect of Google’s algorithm, but it does reveal some important details.
First, we should make sure we understand what King and Fishkin’s article is not.
- It is NOT simply a list of “ranking factors.” Many of the attributes in the leaked documents are most likely ranking factors, but applying that label to all of them would be imprecise. The documents show the types of website data Google collects, not necessarily how each point is used.
- It does NOT include any scoring functions. Those most likely live elsewhere in the algorithm, in parts that were not included in this leak. We still have no way of knowing how much weight each attribute carries in ranking.
- It is NOT a complete look at all information included in the leaked document. There were thousands of data points to consider included with this leak. If you want to know everything, be prepared to read a series of articles. If you’re interested in keeping up with digital marketing and the most important developments around this story, contact us to sign up for Quell’s Virtual Marketing Minute.
With that understood, here is a summary of what this leak does include:
- Information about Google’s algorithm, updated as recently as May 7th.
- 14,014 attributes sorted into 2,596 modules.
- The architecture of Google’s SEO ranking systems.
What Are Some of the Key Attributes Google Considers?
Below are a handful of the attributes noted in the review of leaked documents thus far. Note that many of the points included go against what Google has said in the past.
- Site Authority: The leak revealed a siteAuthority attribute that scores your site as a whole. This means pages are not judged purely on an individual basis, and not all backlinks are created equal: links from higher-authority sites are most likely worth more. As you work on your site, keep everything updated and prune old, unhelpful content that drags down your site authority.
- Click Data: There are a number of attributes named badClicks, goodClicks, lastLongestClicks, unsquashedClicks, unsquashedImpressions, and unsquashedLastLongestClicks, showing that Google does track how many clicks your website gets and how engaged users are once they arrive. These signals work on a rolling 13-month window.
- Author: Google tracks who wrote a page, using an explicit isAuthor attribute. Respond by establishing a consistent author who can serve as your company’s thought leader.
- Google Sandbox: When launching a new site or even putting up a new page, it seems that there is a throttle on how well your site can do. This is due to a “Sandbox” that Google puts you in, most likely to prevent spam in results.
- Anchor Mismatch: This is a potential penalty. Whether you are linking out or receiving a backlink, the anchor text needs to match the linked content, or Google may demote you to prevent spam. Check that anchors are accurate across your own site and in backlinks from other sites to yours.
- SERP Demotion: These are potential demotions based on how users interact with your result on the search results page. This underscores the importance of metadata: well-written title tags, meta descriptions, and the right schema help your result stand out and earn clicks.
- Nav Demotion: This demotion focuses on user experience, putting a stress on good navigation practices.
- Exact Match Domains Demotion: This is in reference to websites that purchase a URL that is the same as one of their target keywords, as opposed to a brand name. An example would be getting a site with the URL “redshoes.com” with the hope of appearing when someone searches “red shoes”. Google is throttling back websites that try to use this strategy.
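Because the leak contains no scoring functions, nobody knows how these attributes combine. As a purely hypothetical illustration, here is a toy sketch of how click attributes like the leaked goodClicks, badClicks, and lastLongestClicks *could* feed an engagement score; the formula and weights are invented, not Google’s.

```python
from dataclasses import dataclass

# Hypothetical sketch only: the leak names attributes such as goodClicks and
# badClicks but includes no scoring functions, so this formula is invented
# purely for illustration.

@dataclass
class ClickStats:
    good_clicks: int           # stand-in for the leaked "goodClicks"
    bad_clicks: int            # stand-in for "badClicks"
    last_longest_clicks: int   # stand-in for "lastLongestClicks"

def toy_engagement_score(stats: ClickStats) -> float:
    """One invented way click data *could* inform a ranking signal."""
    total = stats.good_clicks + stats.bad_clicks
    if total == 0:
        return 0.0
    # Reward results users stay on; a "last longest click" suggests the
    # searcher found what they needed and did not bounce back to the SERP.
    good_ratio = stats.good_clicks / total
    dwell_bonus = stats.last_longest_clicks / total
    return round(good_ratio + 0.5 * dwell_bonus, 3)

print(toy_engagement_score(ClickStats(80, 20, 40)))  # 1.0
```

The takeaway is not the math, which is guesswork, but the pattern: engagement quality, not raw click volume, is what these attributes appear to capture.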
Most likely, Google values some of these attributes more than others. However, because no scoring functions were included in the leak, it remains unclear how each attribute is weighted. What is important is that, despite prior statements from Google, we now know it is gathering this data and considering each of these attributes.
What Is the Architecture of Google’s Ranking Systems?
The one certainty this leak has provided is that Google Search is not a single algorithm with one series of equations and factors. Instead, it is a series of algorithmic systems that work together, crawling, indexing, rendering, processing, ranking, and serving your content.
- Trawler goes through sites, maintains crawl rates and the queue for which pages get crawled next, and estimates how often content tends to change.
- Alexandria is the core indexing system, while SegIndexer and TeraGoogle manage indexing for documents.
- HtmlrenderWebkitHeadless renders pages that use JavaScript.
- LinkExtractor does what the name implies, finding links in pages, while WebMirror sorts through and manages canonicalization and duplication of content.
- Mustang and Ascorer are the primary algorithms for ranking pages, while NavBoost, FreshnessTwiddler, and WebChooserScorer re-rank based on user click-logs, freshness of content, and feature names in snippets.
- Google Web Server (GWS) is Google’s frontend server. SuperRoot sends messages to GWS and manages the re-ranking systems, SnippetBrain generates the snippets shown on the results page, Glue unifies results based on user behavior, and Cookbook generates signals for GWS.
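The staged flow above can be sketched conceptually. The component names (Trawler, Alexandria, Mustang, NavBoost, SuperRoot, GWS) come from the leak, but the interfaces, ordering, and toy logic below are assumptions made purely to show the crawl → index → rank → re-rank → serve shape, not Google’s implementation.

```python
# Conceptual sketch of the staged pipeline described above. Every function
# body here is a placeholder; only the stage names map to leaked components.

def crawl(urls):            # Trawler: fetches pages and manages the crawl queue
    return [{"url": u, "html": f"<html>{u}</html>"} for u in urls]

def index(pages):           # Alexandria / SegIndexer / TeraGoogle: builds the index
    return {page["url"]: page for page in pages}

def rank(idx, query):       # Mustang / Ascorer: initial scoring (toy: URL match)
    return [url for url in idx if query in url]

def rerank(results):        # NavBoost / FreshnessTwiddler: reorder by signals
    return sorted(results)  # stand-in for click- and freshness-based reordering

def serve(results):         # SuperRoot -> GWS: assemble the results page
    return {"results": results}

pages = crawl(["example.com/shoes", "example.com/hats"])
print(serve(rerank(rank(index(pages), "shoes"))))
# {'results': ['example.com/shoes']}
```

The design point worth noticing is the separation of initial ranking from re-ranking: click-based systems like NavBoost adjust an already-scored list rather than scoring pages from scratch.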
There are a few key points to pull from this architecture:
- There is an element of personalization that the reranking systems allow for. This is important to keep in mind as you write targeted web content and as you evaluate the search engine results page.
- Fresh content is important. Google has created an entire system that prioritizes ranking new articles ahead of dated information. This is why it’s important to have an SEO agency on hand that is regularly creating new, fresh content.
- SEO is past the point of jamming keywords into as many places on your website as possible. Google has systems dedicated to user experience, which means it looks beyond just the words on the page. You need navigation that is easy for users.
- There is a clear priority on earning “snippets” from your website. These are the quick pieces of information that appear on Google’s search results page. You need content writers who plan and write with snippets in mind to get the most out of your Google results.
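One concrete way to write “with snippets in mind” is to publish schema.org FAQPage markup, a documented structured-data format that can surface question-and-answer pairs directly on the results page. The helper below is a small illustrative generator (the function name and the sample question are our own); whether a snippet actually appears is still Google’s call.

```python
import json

# Illustrative helper that emits schema.org FAQPage JSON-LD. The FAQPage
# type is real schema.org vocabulary; the helper itself and its sample
# content are examples we made up for this article.

def faq_jsonld(qa_pairs):
    """Build a JSON-LD FAQPage block from (question, answer) pairs."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in qa_pairs
        ],
    }, indent=2)

markup = faq_jsonld([
    ("What did the Google leak show?",
     "Internal documentation covering 14,014 ranking attributes."),
])
# Embed the result in the page's <head> or <body>:
print(f'<script type="application/ld+json">\n{markup}\n</script>')
```

Structuring answers as short, self-contained question-and-answer pairs helps whether or not the markup wins a rich result, because it mirrors how snippets are extracted.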
How a Professional SEO Agency Can Help Your Business Make The Most of This Leak
The key to SEO and ranking as highly as possible on Google remains producing helpful and informative content. The leak of Google’s algorithm confirms this, exposing specifics that are all working together for this goal. As you consider the best way to implement the information from this leak, keep this thought in mind to provide content and user experience that gives your potential customers the information they want.
A major takeaway from this leak is that Google’s algorithm is truly complex. With more than 14,000 attributes flowing through a number of different ranking systems, there is a great deal going on, and that’s just the portion of the algorithm this leak covers.
With how much Search has evolved, we have reached a point where SEO can’t realistically be done in-house. You need a team of experts doing constant research to keep up and give your site a boost. If you don’t partner with a professional SEO agency, your competitors will.