The Google Navboost Leak That Validated CTR Manipulation Techniques

In early May of 2024, Google Search API documents were leaked to online journalists. These detailed notes from Google’s internal Content API Warehouse were confirmed as valid, as Barry Schwartz reported, although the company wouldn’t comment on the contents of the leak itself.

Simply called the ‘Google Navboost Leak’, the 2,500 pages of API details cover everything from the parameters that RankBrain considers to how click-through rate (CTR) is categorized.

In this article we’ll cover how the Navboost leak has aided our research on CTR manipulation methods, and how these findings can be used to drastically improve SEO and overall Google rankings.

How the Google Navboost Leak Showcased CTR Manipulation

There are over 14,000 ranking features, both positive and negative, within the Navboost leak documents. These features put to bed some of the ‘official statements’ Google has made about not using metrics such as domain authority and click relevance in rankings. We now know beyond a shadow of a doubt that they use these metrics.

If we laser-focus in on how Google measures CTR metrics, we find a vast array of methods that their API recognizes:

Bad Clicks: The implication here is that Google tracks the quality of each click resulting from its engine’s search results. This metric most likely refers to user experience, for example, the case where users click on a result and then hit the back button to choose a different result.

badClicks (type: float(), default: nil) –

Clicks: This means at some level, the raw number of total clicks matters. This is important because the entire theory behind CTR manipulation is to increase the ratio of ‘Good Clicks’ to this total click figure.

clicks (type: float(), default: nil) –

Good Clicks: Again, we see that click quality matters. This is likely a combination of positive user experience, feedback, and dwell time: all of the things that CTR manipulation seeks to enhance, especially clicks where the user doesn’t hit the “Go back” button to see the SERP again.

goodClicks (type: float(), default: nil) –

Impressions: Impressions are an indirect way to rate the ‘attractiveness’ of a link. For example, if the average user sees a link and decides that it is relevant to their search, that link’s ratio of clicks to impressions will go up. The higher a link’s click-to-impression ratio, the more relevant it is to the average user, at least in their estimation.

impressions (type: float(), default: nil) –

Last Longest Click: The theory here is that users will go through a list of search results to get a lot of different takes on their query. The best scenario for a listed link is to receive the user’s last click of the session (meaning the answer was found), to hold the longest dwell time (being the most relevant and compelling), or both.

lastLongestClicks (type: float(), default: nil) –

Unicorn Clicks: This could mean a few different things, but according to Chromium’s documentation, a Unicorn user is a child under the age of consent in their jurisdiction. A different set of laws and rules applies to those users, and the way their interactions affect the search algorithms is entirely different as well.

unicornClicks (type: float(), default: nil) – The subset of clicks that are associated with an event from a Unicorn user.

There are some additional fields that are not being used yet, so we’ve removed them from this particular list. But the full text is available at the link above for those who want to poke around and speculate.

All of these metrics are being tracked for a reason. They’re an integral part of how search rank is determined. The only thing in question is the exact weight of each attribute. Is ‘Last Longest Click’ more important than the link’s ‘Good Clicks’ to ‘Clicks’ ratio? Does ‘Impressions’ simply give a link a kind of ‘batting average’ similar to a hitter in baseball, or is it an indication of how trendy your link is, and thus has an impact on things like decay rate? We can’t be sure.
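To make these relationships concrete, here is a minimal Python sketch of the leaked click fields and the derived ratios discussed above. The field names mirror the leaked attributes, but the ratios and the blended score are our own illustrative assumptions; nothing in the leak specifies how, or even whether, these signals are combined this way.

```python
from dataclasses import dataclass

@dataclass
class NavboostClickSignals:
    """Click fields named in the leaked Navboost module (grouping is ours)."""
    impressions: float
    clicks: float
    good_clicks: float
    bad_clicks: float
    last_longest_clicks: float

    def ctr(self) -> float:
        """Click-to-impression ratio: the link's 'batting average'."""
        return self.clicks / self.impressions if self.impressions else 0.0

    def good_click_ratio(self) -> float:
        """Share of clicks judged 'good': the ratio CTR manipulation tries to raise."""
        return self.good_clicks / self.clicks if self.clicks else 0.0

def hypothetical_score(s: NavboostClickSignals) -> float:
    """Purely illustrative blend of the signals. The real weights are unknown."""
    last_ratio = s.last_longest_clicks / s.clicks if s.clicks else 0.0
    return 0.5 * s.good_click_ratio() + 0.3 * s.ctr() + 0.2 * last_ratio
```

For a link with 1,000 impressions, 100 clicks, 60 of them good, and 20 of them last-longest clicks, the sketch yields a CTR of 0.1 and a good-click ratio of 0.6; the blended number is only meaningful relative to our assumed weights.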

But the SEO community knows that they get better results when keeping these factors in mind, despite what certain Google reps might have said in the past.

Google Representatives Misinformed the Public About the Importance of SEO and CTR Manipulation

The most infamous case of a Google rep slamming the SEO industry came when Gary Illyes dismissed Rand Fishkin by saying:

“Dwell time, CTR, whatever Fishkin’s new theory is, those are generally made up crap. Search is much more simple than people think.”

He also once said, “Using clicks directly in rankings would be a mistake.”

Given the content of the Google Navboost leak, we know that neither of those statements accurately reflected Google’s ongoing search ranking theory or practices. And there are dozens of other statements from Google employees that have been called into question by these latest revelations.

We now know that the ‘Site Authority’ attribute is real, despite being told otherwise for years. In fact, E-E-A-T (Experience, Expertise, Authoritativeness, and Trustworthiness) is probably more important than ever before.

And that’s the case for multiple attributes that the industry was told to ignore or downplay in relevance. To say that the leak was frustrating to some and vindicating to others would be an accurate summary of what’s being said all over the blogosphere.

This isn’t to say that these employees were intentionally misleading the public. Some might have been mistaken. Others were being intentionally vague. Still others might have been in a bad mood and said something mean-spirited that was interpreted as a technical opinion.

But no matter which of the above cases is true, the resulting misinformation hurt the SEO and CTR manipulation fields, whose practitioners trusted Google employees to give them accurate information if they were going to give them any information at all.

After all, there are antitrust implications if Google gave out false information to the public but more accurate information to a select few. The potential for market manipulation without a fair and level playing field would be astronomical. So it’s in Google’s best interest to be factual and even-handed about these things in the future, or to simply say ‘no comment’, which is well within its rights.

How Does RankBrain Fit Into This Picture?

Part of the Navboost leak talked about how search terms are weighted in terms of relevancy. This is likely a function of the RankBrain AI, which pulls its training data from many sources including human link ratings (coming from Google’s Quality Raters), Chrome browser data, and click records.

We call this process term weighting, and it’s one of the most important concepts for the future of SEO, particularly as AI becomes a more active component of the ranking process.

What we know from the leak is that RankBrain factors things such as ‘Good Clicks’ and click ratios into its calculations. But, as with the rest of the leak, we don’t know the exact weight of these considerations. It’s a factor, but to date we can’t tell you if it’s a major or minor one.
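The leak doesn’t reveal RankBrain’s actual formula, but the general idea behind term weighting can be illustrated with classic TF-IDF, which scores a term higher when it is frequent in one document but rare across the corpus. This is a textbook stand-in, not RankBrain’s method:

```python
import math
from collections import Counter

def tf_idf(term: str, doc: list[str], corpus: list[list[str]]) -> float:
    """Classic TF-IDF term weighting: an illustration only, NOT RankBrain's formula."""
    tf = Counter(doc)[term] / len(doc)                    # how often the term appears here
    docs_with_term = sum(1 for d in corpus if term in d)  # how many docs contain it
    idf = math.log(len(corpus) / (1 + docs_with_term))    # rarer across the corpus = heavier
    return tf * idf
```

A term that appears in only one document of the corpus gets a positive weight, while a term that appears everywhere is weighted toward zero. Whatever RankBrain does with Quality Rater data and click records is far more sophisticated, but the core intuition of weighting terms by informativeness is the same.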

The Importance of Content Decay in CTR Manipulation

One of the big takeaways from the Google Navboost leak was the existence of a ‘Last Longest Click’ attribute. This strongly implies that content decay is a factor when it comes to retaining rank. As the date of a link’s ‘Last Longest Click’ gets older, the system assumes that it is no longer relevant.

CTR manipulation combats this by getting fresh users to click on the link regularly and dwell on the resulting page as their last search action. That is reflected in the ‘Last Longest Click’ attribute, and any date-related relevancy penalty vanishes.

This is yet another vindication of the practical results that industry experts have seen over the years, despite being told that what they were doing was superstition and guesswork. We now know for a fact that decay has some impact on ranking, and that proper CTR manipulation eliminates any concern about it.
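One common way to picture this kind of decay is an exponential freshness weight, where a signal loses half its strength every fixed interval. The half-life value below is an assumption for illustration; the leak gives no decay curve or timescale:

```python
def freshness_weight(days_since_last_longest_click: float,
                     half_life_days: float = 90.0) -> float:
    """Exponential decay: the signal halves every `half_life_days`.

    The 90-day half-life is an arbitrary assumption for illustration;
    the Navboost leak does not specify any decay function.
    """
    return 0.5 ** (days_since_last_longest_click / half_life_days)
```

Under this toy model, a link whose last longest click was today carries full weight (1.0), one from 90 days ago carries half, and one from 180 days ago a quarter, which is why regularly refreshing that last longest click would keep the signal near full strength.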

The Importance of Good Click Ratios in CTR Manipulation

The Navboost leak also tells us that click ratios and click quality matter. This is the backbone of high-quality CTR manipulation: not just clicking on a link to add to a counter, but dwelling on the resulting page and interacting with it.

In that way, good clicks and last clicks can both be registered simultaneously, in a manner that looks natural to the search engine. All the while, total clicks also rise. While this might not be as important a statistic as the others, it’s definitely tracked.

Bad CTR manipulation, on the other hand, might actively hurt a site’s ranking. Simply clicking on a link and bailing immediately can indicate that the site isn’t relevant to the search, and imply that the click wasn’t ‘good’. This defeats the entire purpose of CTR manipulation. Last clicks and longest clicks are critical components to the process.
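The good-versus-bad distinction described above can be sketched as a simple dwell-time classifier. The thresholds and labels here are entirely our own assumptions; the leak names the buckets but not the rules that fill them:

```python
def classify_click(dwell_seconds: float, returned_to_serp: bool,
                   short_threshold: float = 10.0,
                   long_threshold: float = 60.0) -> str:
    """Label a click 'bad', 'good', or 'neutral'.

    The dwell-time thresholds are illustrative assumptions, not values
    from the Navboost leak.
    """
    if returned_to_serp and dwell_seconds < short_threshold:
        return "bad"      # pogo-stick: immediate bounce back to the results page
    if dwell_seconds >= long_threshold and not returned_to_serp:
        return "good"     # long dwell that ends the search session
    return "neutral"      # everything in between
```

A three-second visit followed by a bounce back to the SERP lands in the ‘bad’ bucket, while a two-minute stay that ends the session lands in ‘good’, which mirrors the pogo-sticking risk the paragraph above describes.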


E-E-A-T Should Now Be Considered an Authority Imperative

We know from the Google Navboost leaks that scoring expertise and authority is a core attribute of their overall ranking system. Any non-believers out there need to shed their cloaks of doubt and hop on board the trust train. The ‘Author’ and ‘isAuthor’ values are explicitly tracked as a Navboost component. Entity SEO is real.

There are arguments out there that say we have no idea how much E-E-A-T matters in terms of search results, and the leak doesn’t specify. But the same can be said about every aspect of the leak, given that no ranking formulas or weighting systems were explicitly included! As SEO experts, we either have to assume that all of these attributes have meaning or that none of them do. Guessing helps nobody. When more data is available, we can reevaluate.

Until then, populating every relevant E-E-A-T field should become standard practice. Can a site rank highly without it? Sure. But it’s quite possible that individual contributors have a ‘stickiness’ between sites that adds to relevancy and domain authority. The best way to find out is to put it into practice, and to see if prolific authorship becomes a common factor among top ranking sites, particularly in more factual and scientific fields.

Think of it this way: E-E-A-T represents transferable entities within Google Navboost. That’s an incredibly rare resource. It means that putting aside every aspect of the website itself, the input of an individual author can lend authority to a search result. That’s a power that transcends the system in a unique way. It’s worth some time and effort to see how deep this rabbit hole goes.

Getting back to CTR manipulation, this opens up a whole new experiment: Author clicking. Rather than focusing on a single site, a client might try to boost their author relevance by last clicking and lingering on all sites where they are listed as the primary author. It’s certainly one way to limit-test the relevance of E-E-A-T.

Some Final Thoughts on the Navboost Leak

First of all, we as a community owe both Mike King and Rand Fishkin a debt of gratitude for their amazing technical breakdowns of the Google Navboost leak. Their opinions differ in minor ways here and there, but both of them worked tirelessly to sift through this absolute mountain of data.

Another point: while we may be hyper-focused on topics that impact CTR manipulation, it’s always wise to look at the bigger picture. Google’s heavier reliance on RankBrain is a big deal, and how this leak fits into that picture is something that industry experts will be analyzing for months. Human quality raters and the EWOK platform are mentioned several times in the leak, sometimes in reference to machine learning training platforms, but other times in reference to direct search results. That’s also a big deal.

But for the moment, the fact that high quality CTR manipulation techniques have in-engine metrics attached to them is gratifying. It means that the activities that we undertake are impacting core Google Navboost functionality, and not just tickling some tangential function that could be accidentally wiped out in a minor update.

In short: CTR manipulation works, it matters, and thanks to the Navboost leak, we have the receipts to prove it.