
How Does Google Measure Authority and Rank Websites?

Google uses over 200 signals to determine which pages it serves up as search results in response to a user’s query. No one outside of Google knows for sure exactly how much influence each signal has on page rank. Google’s bucket of 200 signals is also not set in stone; it is dynamic, and it visibly shifts with every update that Google makes to its search engine algorithm. The SEO experts who track Google’s every move have well established that there is no one specific authority metric that can move the needle on page rank. It is instead a basket of signals that determines the Authority of every page on a website.

How the Google Ranking System Works

In preparing for a user’s query (classic search), Google states that, when it thinks about helping the user, it knows there are ten blue links to show on its first page (and more on the pages after it). The questions for Google, therefore, are:
  1. What should it show?
  2. What order should it show it in?
Google maps out what it calls the “life of a query,” in two phases:
  1. What does Google do before it has a query?
  2. What does it do after?

Before the query

Google builds indexes of billions of pages on the web:
  • Crawl the pages and analyze them
  • Extract links
  • Render content
  • Annotate semantics

What is this index?

  • A web index is like the index in a book
  • For each word, there is a list of the pages on which it appears
  • This list of pages is split into groups of millions of pages
  • Google calls these groups ‘index shards’
  • Plus per-document metadata
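
A quick way to picture this: below is a minimal Python sketch of an inverted index split across shards. It is purely illustrative – the shard count, the pick-a-shard-by-page-id scheme, and every name in it are assumptions made for the example, not how Google actually partitions its index.

```python
from collections import defaultdict

# Toy inverted index split into shards. Each shard maps
# word -> list of page ids on which that word appears.
NUM_SHARDS = 3
shards = [defaultdict(list) for _ in range(NUM_SHARDS)]

def add_page(page_id, text):
    shard = shards[page_id % NUM_SHARDS]     # assign pages to shards by id
    for word in set(text.lower().split()):   # one posting per distinct word
        shard[word].append(page_id)

add_page(0, "plumber in Toronto")
add_page(1, "Toronto pizza restaurant")
add_page(2, "emergency plumber services")

# Looking a word up means asking every shard for its list of pages.
print([shard["plumber"] for shard in shards if "plumber" in shard])
# -> [[0], [2]]
```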

Query processing

  • Query understanding
  • Retrieval and scoring
  • Post-retrieval adjustments

Query understanding

  • Does the query name any known named entities?
  • Are there useful synonyms?
  • Context matters – does ‘GM’ mean General Motors or Genetically Modified?
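
As a toy illustration of that last point, the sketch below picks an expansion for the ambiguous term ‘GM’ based on the other words in the query. The context-word sets are invented for the example; real query understanding draws on entity databases and far richer context.

```python
# Toy disambiguation: choose a sense of "GM" from context words.
# The context-word sets are invented purely for this example.
expansions = {
    "general motors": {"car", "cars", "truck", "vehicle", "dealer"},
    "genetically modified": {"food", "crops", "corn", "organism", "label"},
}

def expand_gm(query):
    terms = set(query.lower().split())
    # Pick the sense whose context words overlap the query the most.
    best = max(expansions, key=lambda sense: len(expansions[sense] & terms))
    return query.lower().replace("gm", best)

print(expand_gm("GM truck dealer"))   # general motors truck dealer
print(expand_gm("GM food labels"))    # genetically modified food labels
```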

Retrieval and scoring

  • Send the query to all the shards
  • Each shard finds matching pages, computes a score for each query/page pair, and sends back its top pages by score
  • Combine all the top pages
  • Sort by score
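
Here is a hedged sketch of that scatter-gather pattern. The per-page score used below (a raw term count) is a trivial stand-in for Google’s unpublished and far more complex query/page scoring.

```python
import heapq

def score(page_text, query):
    # Trivial stand-in score: how often the query terms occur.
    return sum(page_text.lower().count(term) for term in query.lower().split())

def shard_top_k(shard_pages, query, k=2):
    scored = [(score(text, query), page) for page, text in shard_pages.items()]
    return heapq.nlargest(k, scored)          # this shard's top pages by score

shards = [
    {"a.html": "toronto plumber toronto", "b.html": "pizza"},
    {"c.html": "trusted toronto plumber", "d.html": "plumber"},
]

query = "toronto plumber"
merged = []
for shard in shards:                          # send the query to all shards
    merged.extend(shard_top_k(shard, query))  # each sends back its top pages
merged.sort(reverse=True)                     # combine and sort by score
print(merged)  # [(3, 'a.html'), (2, 'c.html'), (1, 'd.html'), (0, 'b.html')]
```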

Post retrieval adjustments

  • Host clustering and sitelinks
  • Check whether there is too much duplication
  • Spam demotions
  • Manual actions
  • Generate rich snippets and other results

In straightforward terms: Google first gathers information about pages across the web through its search engine “spiders” and creates a ranking for each page it has crawled. It then documents this information in an index that works like a library’s filing system. When a searcher’s query comes in, Google’s servers read the index shards and pull the list of web pages to display on its search results pages. The position of each page on the Google SERP (Search Engine Results Page) reflects Google’s opinion of that page’s relevance and Authority. Being at the top of the SERP therefore signals that your page has the highest Authority and relevance and is likely the most reflective of the user’s search intent.

[Chart: organic click-through rate by ranking position, Advanced Web Ranking. Source: https://www.advancedwebranking.com/ctrstudy/]

This chart from Advanced Web Ranking (US rankings for February 2020 searches) shows that the lion’s share of clicks goes to the highest-ranked results. The first position gets around 33% of clicks on desktop search and about 25% on mobile. The second position gets roughly 15% on both desktop and mobile, and the third position about 10% on both. Note also the drop-off after position five. By most estimates, about 75% of clicks go to the first page of Google’s SERPs. The higher your rank on the SERP, the more clicks you get, and therefore the more traffic flows to your site. In the final analysis, the more traffic your site receives, the more opportunities your business has.
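
As a rough back-of-the-envelope illustration of why those drop-offs matter, the sketch below multiplies a hypothetical keyword’s monthly search volume by approximate desktop CTRs per position. The CTR figures are rounded from the study cited above; the search volume is invented.

```python
# Approximate desktop CTR by SERP position (rounded from the
# Advanced Web Ranking study cited above; treat as rough figures).
ctr_by_position = {1: 0.33, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

monthly_searches = 10_000  # hypothetical volume for one keyword

for position, ctr in ctr_by_position.items():
    print(f"Position {position}: ~{monthly_searches * ctr:,.0f} clicks/month")
# Moving from position 3 to position 1 roughly triples the traffic.
```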

The original Google metric for Authority: PageRank

PageRank was Google’s definitive algorithm when the company first started. It calculated a page’s rank by scoring all of the links pointing to that page: the algorithm counted every link pointing to a particular page to determine that page’s PageRank. It also weighted each link by the importance of its source, so a link from a website that itself had a large number of links pointing to it counted as a weightier link than one from a website with very few links pointing to it. The number of links was not the only factor determining rank in the PageRank algorithm; among other factors, the words on the page were the other major signal of importance.
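
The formula from the original PageRank paper can be computed with a simple power iteration, as in the minimal sketch below (using the conventional 0.85 damping factor). This reflects only the published academic algorithm; Google’s production ranking long ago moved beyond this simple form.

```python
# Minimal PageRank power iteration over a tiny link graph.
# links[page] = the pages that `page` links out to.
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}

damping = 0.85                       # conventional damping factor
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):                  # iterate until the scores settle
    new_rank = {}
    for p in pages:
        # Rank flows in from every page q that links to p, split
        # evenly across q's outbound links.
        incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new_rank[p] = (1 - damping) / len(pages) + damping * incoming
    rank = new_rank

print(rank)  # "c" scores highest: it is linked from both "a" and "b"
```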

How does Google calculate Authority in 2020?

Google’s relatively recent algorithm change, the RankBrain update, involves over 200 signals, all of which are categorized as significant. For a look at what these 200+ ranking signals are, read this blog post on Backlinko. The RankBrain update leverages artificial intelligence to determine how pages will rank and, therefore, what Google serves up as search results. Significantly, Google has categorically stated that none of these 200 signals is a single dominant “authority” factor that determines how a page will be ranked. A large number of factors, including the AI component of RankBrain, work together to determine the rank of a page; Google’s RankBrain algorithm works in what mathematicians would call an asymmetric polynomial process. So, what are these elements that impact page rank? In addition to the algorithms that help determine page rank, Google explained in 2017 that it had hired more than 10,000 quality-rating contractors, each using a 200-page guideline book to evaluate information for Google Search. Their principal objective: use the guidelines Google gave them to flag low-quality web pages. It is useful to recognize that Google assesses Authority on a per-page basis. The role of the quality raters is to gather specific details about websites, so that the engineers at Google can combine this with their ranking tools and refine Google’s ability to deliver better, more “authoritative” results. To understand how Google determines its ranking, see the video of Paul Haahr, a ranking engineer at Google and part of its leadership team, recorded during his talk at the 2016 Search Marketing Expo.
A tip of the hat to SMX West for the Paul Haahr video. Even though a bit dated (2016), Paul Haahr’s presentation lays out in lucid detail what goes into determining rank and what Google is doing to continually improve its process. A key takeaway (among many) was his confirmation of what SEOs had long suspected: the percentage of original versus duplicate content found on a web page is a definite ranking factor.

Can Page Authority be transferred to its Domain?

Google says that its decision on Authority is made on a per-page basis. A large number of authoritative pages could logically lead to an authoritative site. However, consider the opposite: on popular blog platforms like Tumblr, Medium, or Blogger, Authority for the Domain could lead to potentially false assumptions about the Authority of every individual blogger (or user) who has a page on these sites. Google explicitly avoids the idea of sitewide Domain Authority, because doing otherwise could lead to false assumptions about individual pages, especially on popular sites. So what, then, are the Domain Rating and Domain Authority scores that SEO tools provide? These are, in essence, educated guesses by third-party SEO companies. They serve more as guidelines than absolute numbers or ranking positions. A key point to note: these are not the scores that Google uses. (Refer back to Paul Haahr’s video posted above for greater clarity.)

Sitewide signals, not domain authority

There are sitewide signals that help Google determine page rank and decide whether to show a result higher or lower on its SERP pages. So, what are these sitewide factors that could affect ranking?
  • The site’s loading speed
  • Optimization for mobile
  • The presence of Malware on the site
  • The presence of a large number of high ranking pages on a domain
These are just some of the factors that can generate strong sitewide signals. When Google has to choose between two pages of, say, equal Authority, the page ranked higher will be the one with the more powerful sitewide signals. To quote Paul Haahr: “Our goal in all of this is that we are increasing the quality of the pages that we show to users. Some of our signals are correlated with these notions of quality.” Now, to the final section of this article: what are the Domain Authority and Domain Rating scores used by SEO companies?

1. Ahrefs – Domain Rating

[Image: Ahrefs – what is Domain Rating? Source: https://ahrefs.com/blog/domain-rating/]

The Domain Rating score from Ahrefs, perhaps the highest-rated SEO tool at the moment, is calculated in a straightforward fashion. According to ahrefs.com, Domain Rating is a derived metric that shows the “link popularity” of a website compared with all other sites in the world, on a scale from 0 to 100. The scale is not linear but logarithmic: the effort needed to go from Domain Rating 50 to 51 is far greater than the effort needed to go from 20 to 21. The metric also accounts for the quality of links: do the links come from a website that itself has many links pointing to it, or from a site with very few? Domain Rating further considers whether the links pointing to the site are “dofollow” or “nofollow.” Finally, as you may have noted, there is nothing in this score about traffic or any more complex ranking system. In a nutshell, the Ahrefs Domain Rating score for a website is shorthand for the popularity of that website.
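
To see why a logarithmic scale makes each extra point harder to earn, consider the sketch below. It assumes, purely for illustration, that the link popularity behind a DR score grows tenfold for every ten DR points; Ahrefs has not published its actual curve.

```python
# Illustrative assumption only: link popularity grows x10 per 10 DR
# points. Ahrefs' real curve is unpublished; this shows the shape.
def popularity_needed(dr):
    return 10 ** (dr / 10)

step_20_to_21 = popularity_needed(21) - popularity_needed(20)
step_50_to_51 = popularity_needed(51) - popularity_needed(50)

print(f"DR 20 -> 21: ~{step_20_to_21:,.0f} extra units of link popularity")
print(f"DR 50 -> 51: ~{step_50_to_51:,.0f} extra units of link popularity")
print(f"The DR 50 -> 51 step is ~{step_50_to_51 / step_20_to_21:,.0f}x harder")
```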

2. SEMrush – Authority Score

The second SEO company whose domain score we will look at is SEMrush.

[Image: SEMrush Authority Score. Source: https://www.semrush.com/news/backlinks-authority-score/]

The SEMrush Authority Score, as seen in the example above for eBay, is based on a more complex set of metrics. It represents a domain’s overall quality and is measured on a scale from 0 to 100. The elements that constitute this metric are backlink data (including Domain Score, Trust Score, referring domains, and follow and nofollow links), website traffic data, and organic search data, meaning the volume of organic search traffic and positions on SERPs. For a website receiving a link from another site, the SEMrush Authority Score is an excellent index of how beneficial a backlink from that source would be.

3. Moz – Domain Authority

The third and final metric is from moz.com. Domain Authority (DA) is a popular search engine ranking score developed by Moz that predicts how well a website will rank on search engine results pages (SERPs). A Domain Authority score ranges from 1 to 100, with higher scores corresponding to a greater ability to rank.

[Image: Moz Domain Authority. Source: https://moz.com/learn/seo/domain-authority]

Moz calculates Domain Authority by evaluating multiple factors, including linking root domains and the total number of links, into a single DA score. Moz itself states that the metric is best used comparatively and is not an absolute score. It is meant for researching a website’s search results and determining which sites may have more powerful or relevant link profiles than others. Because it is a comparative tool, there is not necessarily a “good” or “bad” Domain Authority score. A score of, say, DA 25 in a highly fragmented industry with thousands of small businesses competing could prove sufficient to be competitive, while in another industry dominated by fewer but larger players, a score of 50 may be entirely insufficient.

In conclusion

Domain Authority, Domain Rating, and Authority Score are all metrics that are essentially handy heuristics for the SEO industry. In other words, these scores indicate how a site is likely to rank for a specific keyword. The three metrics we have evaluated are not the only scores available in the SEO industry, but they are three of the most popular. Companies like Ahrefs and Moz maintain their own indexes of URLs and use bots to continuously crawl the web and update them. It is rather apparent that none of these indexes will be as comprehensive as Google’s. The commercial question: if these indexes are all relative, how do you choose which one to use? These indexes will generally deliver reasonably consistent relative ranks, but you should not expect the ranks to be identical, because the formulas used to calculate the scores differ. The questions you may want to ask before deciding which SEO tool and which index to use could include the following:
  1. What is the index size of the company? Meaning how many URLs exist in the company’s index?
  2. What is its crawl frequency? The more frequent the crawling, the fresher the data available to you.
  3. What is the incidence of ‘false positives’ with live links? That is, cases where inactive links are reported with 200 status codes.
  4. Do the rankings correlate with what you actually see on Google? This is a simple test that can show you, for example, whether a page with a high authority score really equates to better rankings (a quick way to check is sketched below).
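
One hedged way to run that last check yourself: collect a tool’s authority scores and your observed Google positions for a set of pages, then compute a rank correlation. Every number below is invented for the example; a result near 1.0 means the tool’s scores track Google’s ordering well.

```python
# Spearman rank correlation between a tool's scores and Google positions.
tool_scores =      [72, 55, 48, 31, 18]  # hypothetical DA/DR per page
google_positions = [1, 3, 2, 4, 5]       # observed rank for the same pages

def spearman(xs, ys):
    def ranks(vals):
        order = sorted(range(len(vals)), key=lambda i: vals[i])
        r = [0] * len(vals)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# A higher score should mean a better (lower) position, so negate positions.
print(spearman(tool_scores, [-p for p in google_positions]))  # 0.9
```
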
These are all points to consider when deciding which SEO metric to use as a heuristic, or shorthand, for your website’s ranking on Google SERPs. To read about what Google considers a good website, visually and content-wise, click here.

What is a good website anyway?

What is Google’s definition of a good Website?

Whenever Google managers are asked this question, they tend to point to a post on the Google Webmaster Central blog, written by Google Fellow Amit Singhal, about what constitutes a good website.

The blog post is titled “More Guidance On Building High Quality Sites”

Written in 2011 after the Panda update, the article is an attempt to explain what Google was looking for in website quality and trying to achieve algorithmically. While dated 2011, it continues to be relevant as policy even today. Here is what Singhal says qualifies as good content:

The questions below provide some guidance on how we’ve been looking at the issue:

  • Would you trust the information presented in this article?
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
  • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
  • Would you be comfortable giving your credit card information to this site?
  • Does this article have spelling, stylistic, or factual errors?
  • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
  • Does the article provide original content or information, original reporting, original research, or original analysis?
  • Does the page provide substantial value when compared to other pages in search results?
  • How much quality control is done on content?
  • Does the article describe both sides of a story?
  • Is the site a recognized authority on its topic?
  • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
  • Was the article edited well, or does it appear sloppy or hastily produced?
  • For a health-related query, would you trust information from this site?
  • Would you recognize this site as an authoritative source when mentioned by name?
  • Does this article provide a complete or comprehensive description of the topic?
  • Does this article contain insightful analysis or interesting information that is beyond obvious?
  • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
  • Does this article have an excessive amount of ads that distract from or interfere with the main content?
  • Would you expect to see this article in a printed magazine, encyclopedia or book?
  • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
  • Are the pages produced with great care and attention to detail vs. less attention to detail?
  • Would users complain when they see pages from this site?

He closes by saying that Google is continuing to work on additional algorithmic iterations that would help webmasters operating high-quality sites get more traffic from organic search.

So, there you have it: that is the canonical gold standard for what a good website should be!

Must Haves For Good Website Design In 2020

To complete the case for what makes a good website, here are a few trends in what makes a website good, visually and technically:

  1. Clear intent – Your website needs to know who its core audience is. Or, to put it better: you need to know who your audience is and who the customer for your products or services is. This is Marketing 101. Clarity about who the customer is leads to a laser-focused website.
  2. Technically sound – Is your website stable? Is it hosted on a platform with adequate bandwidth? Is it optimized for speed? Last but not least, is it crawlable by search engine bots?
  3. Trustworthy and secure – For starters, an SSL certificate is now a ranking factor with Google. Your website needs to be secure for your visitors. It must have up-to-date software and up-to-date plugins. Do you use a tool like SiteLock to guard your website from hacking, with the ability to get it back up quickly should an attack occur? Do you use Cloudflare, not only to speed up your website but to protect it from DDoS attacks? The list is long, but it is prudent to make your site as secure as possible. That trust will make your visitors feel secure browsing your website.
  4. Responsive design – With so much of search moving to mobile, a website that shows well on any device, from a desktop to a smartphone, is not just desirable, it is critical. Mobile-first is, in fact, increasingly the way to go.
  5. Minimalist design – This is generally a matter of personal preference, but uncluttered websites make content easier to find and make navigating the site a visually pleasant experience for visitors.
  6. UX and UI – The user experience and the user interfaces that deliver it are, in aggregate, what constitutes a positive visitor experience. Good UX delivers a superior experience both visually and technically.
  7. Superlative content – Content remains king, and quality will always come out on top. Google weighs content quality as a major ranking factor, and the volume of content can also make a substantial difference to how your page ranks. The average word count for a page served up on Google’s first page is around 1,200 words.
  8. Multimedia content – Google looks for content supported by images and videos. A picture can genuinely be worth a thousand words, and a video perhaps significantly more!

Resources:


Google’s John Mueller says Don’t Focus on how Google defines Content Quality!

Or why traditional knowledge of advertising and consumer behavior will be pivotal in how good website content is created!

Search engine optimizers have been trying to reverse engineer everything that Google does or puts in its updates to get their websites to rank on page one of Google, virtually from the time that Google became a successful search engine.

At the same time, Google has been moving deftly and with speed to keep making changes to its algorithm, both big and small, that impact how it serves up results to a searcher’s query. According to Search Engine Land, in 2018 alone, Google made 3200 changes to its algorithm. Contrast that with 2010, when the number of updates was between 350 and 400.

This search for the holy grail of page-1 rankings has seen SEOs use all kinds of optimizing techniques, Black Hat, White Hat, and everything in between. Google has kept the SEO world in effervescence by continually changing its algorithms. It has also moved inexorably towards a position where the results that it serves up are based more and more on the demonstration of domain Expertise and Authority by the website, and upon Trust demonstrated in the website by users. Google’s algorithm modifications have generally moved in the direction of rewarding quality.

In a conversation on March 6th, earlier this month, Google’s John Mueller, when asked “What is quality content for Google?”, responded:

“I wouldn’t worry too much about what Google thinks about quality content. But rather, you need to show that you really have something that is unique and compelling and of high quality.”

Elaborating on his response, Mueller goes on to say:

“So instead of trying to work back how Google’s algorithms might be working, I would recommend trying to figure out what your users are actually thinking and doing things like user studies, inviting a bunch of people to your office or virtually to show them something new that you’re providing on your website and ask them really hard questions where sometimes the answer might be we don’t like your website, or we were confused by your website, or we don’t like the color of your logo or something.”

Google’s suggestion is an approach to content quality that involves exhaustive customer insights and in-depth knowledge of consumer behavior based on actual customer interactions and not the manipulation of content on a page, adding links, and running various optimization loops.

While technology will still lead the SEO practice, this approach to content quality is more suggestive of Madison Avenue creative skills and copywriting. A strategy in which good advertising is approached with the customer genuinely understood and firmly in mind when communication is created.

David Ogilvy, one of the advertising greats of the 20th century, is supposed to have remarked, in what became advertising folklore, “The customer is not a moron. She’s your wife.” His remark came in response to traditional 1950s and ’60s advertising techniques that involved a manufacturer’s mindset and a hectoring tone of message delivery. His logic was that the customer was intelligent, capable, and aware, and should be treated as such by brands and their advertisers.

Google’s algorithm updates leading up to Hummingbird are directionally similar. Their ask seems to be simple – create content for human beings and not for machines.

A brief history of major Google algorithm updates of the past decade:

2011 – Google Panda

The role of the Panda update was to act as a filter beyond Google’s core ranking algorithm. It was intended to penalize websites loaded with thin content pages or with multiple pages of duplicate content.

It also penalized websites that relied on low-quality farmed content. These websites essentially paid a host of content creators to produce sub-par, thin content stuffed with keywords in order to rank for them.

The Panda update also penalized websites that used low-quality user-generated content, as well as websites, or pages on them, with a low content-to-ad ratio.

So, who was Panda supposed to reward? Websites that focused on quality content would now be rewarded in Google’s organic search results, while low-quality websites would be penalized and their search rankings diminished considerably.

In Google’s own words, this algorithm update would noticeably impact about 12% of all English-language websites at the time.

Panda’s impact was so significant that within a year of its launch, it was integrated by Google into its core search results algorithm.

As an interesting aside, the Panda update was not named after the Giant Panda as is sometimes assumed. The Panda update was named after the Google engineer Navneet Panda, who developed the technology behind what became the eponymous algorithm update.

2012 – Google Penguin

Penguin was the update that destroyed websites using manipulative link techniques and keyword stuffing to rank higher. This update, too, was designed to reward high-quality websites.

Why was Penguin needed?

Google uses backlinks pointing to a website as a broad indication that the website is a trusted source of information. Backlinks are the rough equivalent of votes signifying the degree of Trust the website enjoys.

Manipulative backlink creation involves simply buying backlinks from other domains or using spammy techniques to create backlinks from, say, internet forums. It can also include getting contextual backlinks from websites that have nothing to do with the category of the website obtaining the links.

For example, a home renovation website that buys links from medical supply websites.

Google uses the presence of keywords on a page as a sign that the website could provide a relevant answer to the searcher’s query. 

Keyword stuffing sometimes demonstrated (and sometimes still does) acme-level creativity in spinning content to the point that it is barely readable by humans. An example of keyword stuffing is the following two-sentence pitch on a plumber’s website:

“ABC Toronto Plumbers is a trusted Toronto Plumber. If Toronto homeowners need a Trusted Toronto Plumber, then ABC Toronto Plumbers is their trusted Toronto Plumber to go to”!  

These two sentences together mention the keyword “Toronto Plumber” five times.
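
For illustration, a trivial count like the one below is roughly how keyword density gets flagged. The density measure and any threshold you might apply to it are choices made for this sketch, not a published Google rule.

```python
import re

text = ("ABC Toronto Plumbers is a trusted Toronto Plumber. "
        "If Toronto homeowners need a Trusted Toronto Plumber, then ABC "
        "Toronto Plumbers is their trusted Toronto Plumber to go to!")
keyword = "toronto plumber"

hits = len(re.findall(re.escape(keyword), text.lower()))
total_words = len(text.split())
density = hits * len(keyword.split()) / total_words

print(f"'{keyword}' appears {hits} times")  # 5
print(f"Keyword density: {density:.0%}")    # ~36% of all words
```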

In summary, Google created its Penguin update to prevent websites from obtaining a large number of low-value links and stuffing their pages with keywords to leap to the top of keyword rankings.

In 2016 Google made Penguin a part of its core search engine ranking algorithm.  

2013 – Hummingbird

The Hummingbird update was more than an update; it was an overhaul of Google’s core search engine algorithm. For clarity, while Panda and Penguin were algorithm changes, this was a generational change in the core search engine algorithm.

The primary objective of the Hummingbird algorithm was to better understand a user’s search query. Bill Slawski, who writes about Google’s patents, explains in a 2013 post on seobythesea.com that Google’s Hummingbird patent is designed to help Google read and understand a natural-language query. As an example, he offers that if a searcher queried “What is the best place for Chicago style Pizza,” Google would understand that by ‘place’ the searcher likely meant ‘restaurant,’ and would accordingly serve up relevant restaurant results.

Content Quality


The root of this algorithm change is Google’s effort to understand the searcher’s intent even when the words in the query do not directly match the words on the page. In a nutshell, this change signals to website owners and webmasters that they should create content that answers their customers’ queries instead of focusing heavily on keywords and links to obtain ranking.

There is consistency across these three updates. Google’s clear message to website owners and webmasters: create quality content that provides information of value to the searcher and fulfills search intent.

And to quote David Ogilvy again, in words almost prescient about the challenges of digital communication:

“If you have a truly big idea, the wrong technique won’t kill it. And if you don’t have a big idea, the right technique won’t help you.”

Watch John Mueller in the Google Hangout; the interaction referenced here begins at 26:41.

 

#SEO #backlinks #onpage #digitalmarketing