Google Search Algorithm Leak – What You Need to Know

Note: This blog is still being updated as I research this topic and more news comes in.

The SEO world has always been divided between experts and ‘experts’, each with their own convictions about what works and what doesn’t.

But one thing that brought the community together, shouting “I told you so!” in unison, is the recent Google algorithm leak. Over 2,500 documents were ‘accidentally’ uploaded to a public GitHub repository.

For SEO professionals and website owners alike, this data dump has ignited a firestorm of discussions, debates, and speculation about the true nature of Google’s ranking algorithm and its implications for search engine optimization strategies.

It took about a week, but Google responded, confirming that the documents were authentic while downplaying their significance.

They emphasized two main points:

  • Caution against misinterpretations: Google warns against drawing inaccurate conclusions from the information. They claim the data might be outdated, incomplete, or lack context.
  • Protecting search integrity: Google emphasizes its commitment to transparency while also safeguarding its search results from manipulation. This suggests they may not disclose everything about the algorithm.

An in-depth analysis of the leak was done by Mike King and is detailed in his blog: https://ipullrank.com/google-algo-leak

Well, let’s take a closer look at what is in the documents.

User Experience Signals: The Paramount Factor

One of the most significant revelations from the leaked documents is the extent to which Google tracks and utilizes user experience signals to determine search rankings. Contrary to previous denials, the leak suggests that Google heavily factors in metrics such as click data, dwell time (the amount of time users spend on a website), and data collected from Chrome browser users.
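
To make the idea concrete, here is a minimal Python sketch of how a dwell-time metric could be aggregated from session logs. The log format, field names, and simple averaging are all invented for illustration – the leaked documents do not reveal how Google actually computes or weights this signal.

```python
from collections import defaultdict

# Hypothetical session log: (url, seconds_on_page).
# Both the schema and the aggregation are assumptions for illustration only.
events = [
    ("/blog/seo-guide", 180),
    ("/blog/seo-guide", 240),
    ("/pricing", 15),
    ("/pricing", 25),
]

def average_dwell_time(events):
    """Average number of seconds visitors spend on each URL."""
    per_url = defaultdict(list)
    for url, seconds in events:
        per_url[url].append(seconds)
    return {url: sum(s) / len(s) for url, s in per_url.items()}

print(average_dwell_time(events))
```

In this toy data, a page averaging 210 seconds of engagement plausibly satisfies search intent better than one averaging 20 seconds – which is the intuition behind treating dwell time as a quality signal.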

The Enduring Importance of Backlinks

Amidst speculation that backlinks might be diminishing in importance, the leaked documents suggest otherwise. According to the files, Google still employs sophisticated systems to extract, categorize, and score links pointing to websites and individual pages.

On-Page Optimization: A New Wave of Formatting and Text Styling?

One of the more bizarre revelations from the leak is the indication that Google considers attributes like font sizing and bolding of text and links when assessing a page’s relevance and importance. If confirmed, this could necessitate a new wave of on-page optimization efforts, with website owners and SEO professionals paying closer attention to formatting and text styling to enhance their search visibility.

Freshness, Authority, and Author Signals

The leaked documents also shed light on the various ways Google attempts to assess the freshness and authority of a page and its author. Factors such as tracking dates mentioned within the content, monitoring author reputation, and evaluating the credibility of the information source appear to be highly valued by Google’s ranking algorithms.

Human Raters and Critical Content Fields

One of the most surprising revelations from the leak is the use of actual human raters employed by Google to evaluate and whitelist critical content areas, such as news and election-related information. This suggests that Google relies on human oversight and intervention to ensure the accuracy and quality of search results in sensitive or high-stakes domains.

Google Search Accused of 6 Lies

  • Google has repeatedly stated that it does not use Chrome user data for ranking websites, but the leaked documents mention Chrome data being used.
  • Google representatives have said that author bylines and experience, expertise, authoritativeness, and trustworthiness (E-E-A-T) are not ranking factors, but the documents show Google tracks author information and has metrics related to E-E-A-T.
  • Google has claimed that it does not use a “sandbox” to initially filter out new websites, but the documents reveal that Google does sandbox new domains if it believes there is spam.
  • Google has stated that word counts and character limits for titles and meta descriptions don’t matter for rankings, but the documents show Google tracks these metrics, which likely impact click-through rates.
  • Google has insisted that features like disavowed links are incorporated into their ranking systems, but the leaked documents make no mention of disavowed data being used.
  • Google has downplayed the importance of links as a ranking factor, but the documents go into extensive detail on how Google evaluates and scores different link signals.
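
On the title-length point above, here is a minimal, self-contained Python sketch that extracts a page’s `<title>` and checks it against the commonly cited ~60-character display guideline. That threshold is an industry rule of thumb about snippet truncation, not a documented ranking rule, and the helper names are invented for this example.

```python
from html.parser import HTMLParser

# Rule-of-thumb display limit (an assumption, not from the leaked documents):
# titles much longer than ~60 characters tend to be truncated in search results.
TITLE_CHAR_GUIDELINE = 60

class TitleExtractor(HTMLParser):
    """Collects the text inside the first <title> tag."""
    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title" and not self.title:
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def title_check(html: str):
    """Return the page title and whether it fits within the guideline."""
    parser = TitleExtractor()
    parser.feed(html)
    title = parser.title.strip()
    return title, len(title) <= TITLE_CHAR_GUIDELINE

page = "<html><head><title>Google Algorithm Leak - What It Means for SEO</title></head><body></body></html>"
title, fits = title_check(page)
print(title, fits)
```

The takeaway matches the bullet above: even if length itself isn’t a ranking input, a truncated title can depress click-through rates, which the leak suggests do feed back into rankings.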

So what we get from this is that Google is still focusing on user experience, while increasingly drawing on Chrome user data. The leak confirms that creating high-quality, engaging content and building strong brand authority through strategic link-building efforts is a must.

The fundamentals of SEO remain largely unchanged – continue to prioritize creating valuable content that benefits your audience. The leak, however, levels the playing field for the SEO community and provides a rare opportunity to reverse-engineer and decipher the intricacies of Google’s ranking algorithm.
