Google uses over 200 signals to determine which pages it will serve up as search results in response to a user's query. It is also true that no one outside of Google knows for sure exactly what influence each signal has on page rank.
Google’s bucket of 200 signals is also not set in stone. It is, in fact, dynamic, and it changes with every update that Google makes to its search engine algorithm.
SEO experts who track Google’s every move have well established that there is no one specific authority metric that can move the needle on page rank. Instead, a basket of signals determines the Authority of every page on a website.
How the Google Ranking System Works
In preparing for a user’s query (classic search), Google knows that it has ten blue links to show on its first results page (and the pages after it). The questions for Google, therefore, are:
What should it show?
In what order should it show it?
Google builds what it calls the “Life of a query.”
What does Google do before they have a query?
What do they do after?
Before the query
Google builds indexes of billions of pages on the web.
– Crawl the pages and analyze them
– Extract links
– Render content
– Annotate semantics
– Extract addresses
What is this index
– A web index is like the index in a book
– For each word, a list of the pages on which it appears
– This list of pages is split into groups of millions of pages – Google calls these ‘index shards.’
– Plus per-document metadata
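The word-to-pages index described above can be sketched in a few lines of Python. This is a toy illustration of the concept, not Google's implementation; the three documents and their contents are invented for the example.

```python
from collections import defaultdict

def build_inverted_index(docs):
    """Map each word to the set of document IDs it appears in."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

# Hypothetical pages standing in for crawled web documents.
docs = {
    "page1": "plumber toronto trusted plumber",
    "page2": "toronto pizza restaurant",
    "page3": "trusted pizza recipes",
}

index = build_inverted_index(docs)
print(sorted(index["toronto"]))  # → ['page1', 'page2']
print(sorted(index["trusted"]))  # → ['page1', 'page3']
```

At Google's scale, each word's page list is far too large for one machine, which is exactly why it is split into the "index shards" mentioned above.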
After the query
– Query understanding
– Retrieval and scoring
– Post-retrieval adjustments

Query understanding
– Does the query contain any known named entities?
– Are there useful synonyms?
– Context matters: does GM mean General Motors or Genetically Modified?
Retrieval and scoring
– Send the query to all the shards
– Each shard finds matching pages, computes a score for each query/page pair, and sends back its top pages by score
– Combine all the top pages
– Sort by score
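The fan-out just described, sending the query to every shard, scoring locally, then merging and sorting the top results, can be sketched as follows. This is a toy model: the scoring function, shard contents, and URLs are invented for illustration, and Google's real scoring involves hundreds of signals rather than a word count.

```python
def score(query, page_text):
    """Toy relevance score: how many query-word occurrences the page has."""
    words = page_text.lower().split()
    return sum(words.count(q) for q in query.lower().split())

def search_shard(query, shard, top_n=2):
    """Each shard scores its own pages and returns its local top results."""
    scored = [(score(query, text), url) for url, text in shard.items()]
    scored = [(s, url) for s, url in scored if s > 0]
    return sorted(scored, reverse=True)[:top_n]

def search(query, shards, top_n=3):
    """Fan the query out to all shards, merge the results, sort by score."""
    merged = []
    for shard in shards:
        merged.extend(search_shard(query, shard, top_n))
    return sorted(merged, reverse=True)[:top_n]

# Two hypothetical shards, each holding a slice of the index.
shards = [
    {"a.com": "toronto plumber toronto", "b.com": "pizza recipes"},
    {"c.com": "trusted toronto plumber", "d.com": "plumber supplies"},
]
results = search("toronto plumber", shards)
print(results)  # → [(3, 'a.com'), (2, 'c.com'), (1, 'd.com')]
```

The key design point survives even in the toy: no shard ever sees the whole index, yet the merged list behaves as if one machine had scored every page.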
Post retrieval adjustments
– Host clustering and sitelinks
– Check if there is too much duplication
– Spam demotions
– Manual actions
– Generate rich snippets and other results
Or in straightforward terms, Google first gathers information about pages across the web through its search engine “spiders” and creates a ranking system for each page that it has crawled. It then documents this information in an index that is like a library’s filing system.
When the searcher’s query comes in, Google’s servers read the “index shards” and pull a list of webpages to display on its search result pages. The position of each page on the Google SERP (Search Engine Results Page) reflects Google’s opinion of each web page’s relevance and Authority.
Therefore, being at the top of SERP communicates that your page has the highest Authority and “relevance” and is likely most reflective of the user’s search intent.
This graph from Advanced Web Ranking (US rankings for Feb 2020 Searches) shows that the lion’s share of clicks is going to the highest-ranked search results.
The first position gets around 33% of the clicks in Desktop search and about 25% of the clicks in Mobile search.
The second position holds at about 15% for both Desktop and Mobile searches.
The third position gets about 10%, again for both Desktop and Mobile searches.
Note also the drop-offs after position number 5. By all estimates, about 75% of clicks go to the first page of Google SERPs.
The higher the rank on the SERP page, the larger the number of clicks that you get, therefore, the larger the volume of traffic to your site.
In the final analysis, the larger the volume of traffic to your site, the higher the opportunities for your business.
The original Google metric for Authority: PageRank
PageRank was Google’s definitive algorithm when the company first started. The algorithm counted and scored all the links pointing to a particular page to determine that page’s rank.
The PageRank algorithm also weighted each link by the importance of its source. So a link from a website that itself had a significant number of links pointing to it counted as a weightier link than a link from a website that had very few links pointing to it.
The number of links wasn’t itself the only factor determining rank in the PageRank algorithm. In addition to other factors, the words on the page were the other signal of importance in the PageRank algorithm.
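The link-weighting idea described above can be sketched with power iteration, the method behind the published 1998 PageRank formulation. The damping factor of 0.85 is the paper's value; the four-page link graph is invented for illustration.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: {page: [pages it links to]}. Returns a score per page."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal rank everywhere
    for _ in range(iterations):
        # Every page keeps a small base rank...
        new_rank = {p: (1 - damping) / n for p in pages}
        # ...and passes the rest of its rank along its outgoing links.
        for p, outgoing in links.items():
            if not outgoing:
                continue
            share = damping * rank[p] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

# A tiny link graph: every page links to A, so A ends up weightiest.
links = {
    "A": ["B"],
    "B": ["A"],
    "C": ["A"],
    "D": ["A", "B"],
}
ranks = pagerank(links)
print(max(ranks, key=ranks.get))  # → A
```

Note how the weighting is recursive: B outranks C and D not because it has more inbound links (each has one or two) but because its single link comes from A, the weightiest page.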
How does Google calculate Authority in 2020?
Google’s relatively recent algorithm change, the RankBrain update, involves over 200 signals, all of which are categorized as significant. For a look at what these 200+ ranking signals are, read this blog post on Backlinko.
The RankBrain update leverages artificial intelligence to determine how pages will rank and, therefore, what it will serve up as search results.
What is of significant importance is that Google has categorically stated that none of these 200 signals is a single dominant “authority” factor that determines how a page will be ranked. A large number of factors, including the AI component of RankBrain, work together to determine the rank of a page.
Google’s RankBrain algorithm combines these signals in complex, non-linear ways.
So, what then are these elements that impact page rank? In addition to its algorithms that help determine page rank, Google explained in 2017 that it had hired more than 10,000 quality-rating contractors, who use a 200-page guideline book to evaluate the information in Google Search.
Their principal objective: use the guidelines that Google gave them to flag low-quality web pages. It is useful to recognize that Google assesses Authority on a per-page basis.
The role of the quality checkers is to gather specific details about the websites, so the engineers at Google can mix this with their ranking tools and refine Google’s ability to deliver better and more “authoritative” results.
To understand how Google determines its ranking, see this video of Paul Haahr, a Ranking Engineer at Google and part of its leadership team. The video was recorded during Paul’s talk at the “2016 Search Marketing Expo.”
A tip of the hat to SMX West for the Paul Haahr video. Even though a bit dated (2016), Paul Haahr’s presentation does lay out in lucid detail what goes into determining rank, and what Google is doing to improve its process continually.
A key takeaway (amongst many) was Paul Haahr’s confirmation, and which SEOs long suspected, that the percentage of original/duplicate content found on a web page was a definite ranking factor.
Can Page Authority be transferred to its Domain?
Google says that its decision on Authority is made on a per-page basis. A large number of authoritative pages could logically lead to an authoritative site.
However, consider the opposite: on popular blog platforms like Tumblr, Medium, Blogger, etc., Authority for the Domain could lead to potentially false assumptions about the Authority of every individual blogger (or user) who has a page on these sites.
Google says Authority is arrived at on a per-page basis. It explicitly avoids the idea of sitewide Domain Authority. Google feels that doing otherwise can potentially lead to false assumptions about individual pages, especially those on popular sites.
So what, then, are the Domain Rating and Domain Authority scores that SEO tools provide? These are, in essence, educated guesses by third-party SEO companies. These scores serve more as guidelines than absolute numbers or ranking positions. A key point to note: these are not the scores that Google uses. (Refer back to Paul Haahr’s video posted above for greater clarity.)
Sitewide signals, not domain authority
There are sitewide signals that help Google determine page rank and to show the result higher or lower down its SERP pages.
So, what are these sitewide factors that could affect ranking?
The site’s loading speed
Optimization for mobile
The presence of Malware on the site
The presence of a large number of high ranking pages on a domain
These are just some of the factors that can generate strong sitewide signals so that when Google has to choose between two pages of, say, equal Authority, then the page ranked higher will be the page that has the more powerful sitewide signals.
To quote Paul Haahr, “Our goal in all of this is that we are increasing the quality of the pages that we show to users. Some of our signals are correlated with these notions of quality.”
Now, to the final section of this article: what are the Domain Authority and Domain Rating scores used by SEO companies?
1. AHREFS – Domain Rating
The Domain Rating score from Ahrefs, perhaps the highest-rated SEO tool at the moment, is calculated in a straightforward fashion:
According to ahrefs.com, their Domain Rating system is a derived metric that shows the “link popularity” of each website compared to all other sites in the world on a scale from 0 to 100.
This scale is not linear. It is a logarithmic scale. Let us take an example: the effort to go from domain rank 50 to 51 will be far greater than the effort to go from domain rank 20 to 21.
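Ahrefs does not publish its formula, but the effect of a logarithmic scale can be illustrated with an assumed model in which each extra rating point requires multiplying the underlying "link popularity" by a constant factor. The growth factor below is purely hypothetical.

```python
BASE = 1.3  # assumed growth factor per rating point; the real base is unpublished

def popularity_needed(rating):
    """Raw 'link popularity' needed to reach a given rating (toy model)."""
    return BASE ** rating

def extra_effort(rating):
    """Additional popularity needed to climb one point from `rating`."""
    return popularity_needed(rating + 1) - popularity_needed(rating)

print(round(extra_effort(20), 1))  # effort to go from 20 to 21
print(round(extra_effort(50), 1))  # effort to go from 50 to 51: vastly larger
```

Under this assumed model, moving from 50 to 51 takes thousands of times the effort of moving from 20 to 21, which is the practical meaning of "logarithmic scale."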
The metric additionally takes into account the quality of links. That is, do the links come from a website that itself has many links pointing to it, or from a site with very few links pointing to it?
The Domain Rating metric includes whether or not the links pointing to the site are “Do Follow” or “No-Follow.”
Finally, as you may have already noted, there is nothing in this score about traffic or some other complex ranking system.
In a nutshell, the Ahrefs domain rating score for a website is shorthand for the popularity of the website.
2. SEMrush – Authority Score
The second SEO company whose domain score we are going to look at is SEMrush.
The SEMrush Authority score, as seen in the example above for eBay, is based on a more complex set of metrics.
This metric represents a domain’s overall quality and is measured on a scale from 0 to 100.
The elements that constitute this metric are Backlink data, including Domain Score, Trust Score, Referring domains, Follow, and No Follow links, etc.
The score also includes Website traffic data and Organic search data. Organic search data is the volume of organic search traffic and positions on SERPs.
For a website receiving a link from another site, the SEMrush Authority score will be an excellent index of how beneficial a backlink from this source would be.
3. Moz – Domain Authority
The third and final metric is from moz.com: Domain Authority (DA) is a popular metric for search engine ranking score developed by Moz that predicts how well a website will rank on search engine result pages (SERPs).
A Domain Authority score ranges from 1 to 100, with higher scores corresponding to a greater ability to rank.
Moz calculates Domain Authority by evaluating multiple factors, including linking root domains, and the number of total links, into a single DA score.
Moz itself states that the best use of this metric is as a comparative metric and that it is not an absolute score.
This score is to be used in doing research for a website’s search results and determining which sites may have more powerful/relevant link profiles than others.
Because it’s a comparative tool, there isn’t necessarily a “good” or “bad” Domain Authority score. A score of say DA 25 in a highly fragmented industry with thousands of small businesses competing, could prove to be sufficient to be competitive.
While in another industry that is dominated by fewer but larger players, a score of 50 may be entirely insufficient to be competitive.
Domain Authority, Domain Rating, or Authority Scores are all metrics that are essentially a handy heuristic in the SEO industry.
In other words, these scores provide information about how a site is likely to rank for a specific keyword.
These three metrics/scores that we have evaluated are not the only scores that are available in the SEO industry, but these are three of the most popular tools in the industry.
Companies like Ahrefs and Moz maintain their own indexes of URLs and use bots to crawl continuously and keep those indexes up to date.
It is rather apparent that none of these indexes will be as comprehensive as Google’s.
Commercial question: If these indexes are all relative, how do you then choose which one you should use?
These indexes will generally deliver reasonably consistent relative ranks. You should not expect the ranks to be identical, because each company’s formula for calculating its score is different.
The questions that you may want to ask before deciding on which SEO tool and which index you will use could include the following:
What is the index size of the company? Meaning how many URLs exist in the company’s index?
What is its crawl frequency? The more frequent the crawling, the fresher the data available to you.
What is the incidence of ‘false positives’ with live links? This means cases where inactive links are reported with 200 status codes.
Do the rankings correlate with what you see on Google? This is a simple test that can show you, for example, whether a high domain score actually equates to better rankings.
These are all points to consider when you decide to choose which SEO metric you will use as a heuristic or shorthand for your website’s ranking on Google SERPs.
To read about what Google thinks is visually and content-wise a good website, click here.
Whenever this question is put to Google managers, they tend to point to a blog post on the Google Webmaster Central blog, written by Google Fellow Amit Singhal, about what constitutes good website design.
The article, written in 2011 after the Panda update, is an attempt to explain the website quality that Google was algorithmically trying to achieve. While dated 2011, this response as policy continues to be relevant even today. Here is what Singhal says qualifies as good content:
The questions below provide some guidance on how we’ve been looking at the issue:
Would you trust the information presented in this article?
Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
Would you be comfortable giving your credit card information to this site?
Does this article have spelling, stylistic, or factual errors?
Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
Does the article provide original content or information, original reporting, original research, or original analysis?
Does the page provide substantial value when compared to other pages in search results?
How much quality control is done on content?
Does the article describe both sides of a story?
Is the site a recognized authority on its topic?
Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
Was the article edited well, or does it appear sloppy or hastily produced?
For a health-related query, would you trust information from this site?
Would you recognize this site as an authoritative source when mentioned by name?
Does this article provide a complete or comprehensive description of the topic?
Does this article contain insightful analysis or interesting information that is beyond obvious?
Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
Does this article have an excessive amount of ads that distract from or interfere with the main content?
Would you expect to see this article in a printed magazine, encyclopedia or book?
Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
Are the pages produced with great care and attention to detail vs. less attention to detail?
Would users complain when they see pages from this site?
He closes by saying that Google is continuing to work on additional algorithmic iterations that would help webmasters operating high-quality sites get more traffic from organic search.
So, there you have it, that is the Canonical gold standard for what a good website should be!
Must Haves For Good Website Design In 2020
To complete the case for what makes a good website, here are a few trends in what makes a website visually and technically good:
Clear intent – Your website needs to know who its core audience is. Or, let me correct that: you need to know who your audience is and who the customer for your product or services is. This is really Marketing 101. Clarity about who the customer is leads to a laser-focused website!
Technically sound – Is your website stable? Is it hosted on a platform with adequate bandwidth? Is it optimized for speed? Last but not least, is it crawlable by search engine bots?
Trustworthy and Secure – For starters, an SSL certificate is now a ranking factor with Google! Your website needs to be secure for your visitors. It must have up-to-date software and up-to-date plugins. Do you use a tool like SiteLock to guard your website from hacking, with the ability to restore the site quickly should such an attack occur? Do you use Cloudflare not only to add speed to your website but also to protect it from DDoS attacks? The list is long, but it is prudent to make your site as secure as possible. That trust will make your visitors feel secure in browsing your website.
Responsive design – With so much of search moving to mobile, having a website that shows well on any device, from a desktop to a smartphone, is not just desirable, it’s critical. Mobile First, in fact, is increasingly the way to go!
Minimalist design – This is generally a matter of personal preference. But uncluttered websites make content easier to find and make navigating the site a visually pleasant experience for visitors.
UX and UI – User experience and the User Interfaces used to deliver that experience are the aggregate of what constitutes a positive visitor experience. Good UX is a delivery of superior experience visually as well as technically.
Superlative content – Content remains King! Quality will always come out on top. Google weighs content quality as a major ranking factor. The volume of content can also make a substantial difference to how your page ranks or does not rank. The average word count for a page served up by Google on its first page is around 1,200 words.
Multimedia content – Google looks for content supported by images and videos. A picture can genuinely be worth a thousand words. And a video perhaps significantly more!
Or why traditional knowledge of advertising and consumer behavior will be pivotal in how good website content is created!
Search engine optimizers have been trying to reverse engineer everything that Google does or puts in its updates to get their websites to rank on page one of Google, virtually from the time that Google became a successful search engine.
At the same time, Google has been moving deftly and with speed to keep making changes to its algorithm, both big and small, that impact how it serves up results to a searcher’s query. According to Search Engine Land, in 2018 alone, Google made 3200 changes to its algorithm. Contrast that with 2010, when the number of updates was between 350 and 400.
This search for the holy grail of page-1 rankings has seen SEOs use all kinds of optimizing techniques, Black Hat, White Hat, and everything in between. Google has kept the SEO world in effervescence by continually changing its algorithms. It has also moved inexorably towards a position where the results that it serves up are based more and more on the demonstration of domain Expertise and Authority by the website, and upon Trust demonstrated in the website by users. Google’s algorithm modifications have generally moved in the direction of rewarding quality.
In a conversation on March 6th, earlier this month, Google’s John Mueller, when asked “What is quality content for Google?”, responded:
“I wouldn’t worry too much about what Google thinks about quality content. But rather, you need to show that you really have something that is unique and compelling and of high quality.”
As Mueller elaborated on his response, he went on to say:
“So instead of trying to work back how Google’s algorithms might be working, I would recommend trying to figure out what your users are actually thinking and doing things like user studies, inviting a bunch of people to your office or virtually to show them something new that you’re providing on your website and ask them really hard questions where sometimes the answer might be we don’t like your website, or we were confused by your website, or we don’t like the color of your logo or something.”
Google’s suggestion is an approach to content quality that involves exhaustive customer insights and in-depth knowledge of consumer behavior based on actual customer interactions and not the manipulation of content on a page, adding links, and running various optimization loops.
While technology will still lead the SEO practice, this approach to content quality is more suggestive of Madison Avenue creative skills and copywriting. A strategy in which good advertising is approached with the customer genuinely understood and firmly in mind when communication is created.
David Ogilvy, one of the advertising greats of the 20th century, is supposed to have remarked in what became advertising folklore of the last century that “The customer is not a moron. She’s your wife”. His remark came in response to traditional 1950s and 60s advertising techniques that involved a manufacturer’s mindset, and a hectoring tone of delivery of the message. His logic was that the customer was intelligent, capable, and aware, and should be treated as such by the brands and their advertisers.
Google’s algorithm updates leading up to Hummingbird are directionally similar. Their ask seems to be simple – create content for human beings and not for machines.
A brief history of major Google algorithm updates of the past decade:
2011 – Google Panda
The role of the Panda update was to act as a filter on top of Google’s actual ranking algorithm. It was intended to penalize websites loaded with thin content pages or with multiple pages of duplicate content.
It also penalized websites that relied on low quality farmed content. These websites essentially paid a host of content creators to create sub-par, thin content that was stuffed with keywords to rank for those keywords.
The Panda update also penalized websites that used low-quality user-generated content, as well as websites, or pages on them, that had a low content-to-ad ratio.
So, who was Panda supposed to reward? Websites that focused on quality content would now be rewarded in Google’s organic search results. Simultaneously, low-quality websites would be penalized, and their search ranking diminished considerably.
In Google’s own words, this algorithm update was going to noticeably impact about 12% of all English language websites at the time.
Panda’s impact was so significant that within a year of its launch, it was integrated by Google into its core search results algorithm.
As an interesting aside, the Panda update was not named after the Giant Panda as is sometimes assumed. The Panda update was named after the Google engineer Navneet Panda, who developed the technology behind what became the eponymous algorithm update.
2012 – Google Penguin
Penguin was the update that destroyed websites that used manipulative link techniques and keyword stuffing to rank higher. This update, too, was designed to reward high-quality websites.
Why was Penguin needed?
Google uses backlinks pointing to a website as a broad indication that the website is a trusted source of information. Backlinks are the rough equivalent of votes that signify the degree of Trust the website enjoys.
Manipulative backlink creation involves simply buying backlinks from other domains or using spammy techniques to create backlinks from, say, internet forums. It can also include getting contextual backlinks from websites that have nothing to do with the category of the website obtaining the links.
For example, a home renovation website that buys links from medical supply websites.
Google uses the presence of keywords on a page as a sign that the website could provide a relevant answer to the searcher’s query.
Keyword stuffing sometimes demonstrated (and sometimes still does) remarkable creativity in spinning content to a level that is barely readable by humans. An example of keyword stuffing could be the following two-sentence pitch on a plumber’s website:
“ABC Toronto Plumbers is a trusted Toronto Plumber. If Toronto homeowners need a Trusted Toronto Plumber, then ABC Toronto Plumbers is their trusted Toronto Plumber to go to”!
These two sentences together mention the keyword “Toronto Plumber” five times.
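Keyword stuffing of this kind is easy to quantify. A minimal sketch, assuming a plain case-insensitive substring count; real SEO tools use far more sophisticated tokenization, but even this makes the problem visible.

```python
def phrase_count(text, phrase):
    """Case-insensitive count of a keyword phrase in a block of copy."""
    return text.lower().count(phrase.lower())

# The plumber pitch quoted above.
pitch = ("ABC Toronto Plumbers is a trusted Toronto Plumber. If Toronto "
         "homeowners need a Trusted Toronto Plumber, then ABC Toronto "
         "Plumbers is their trusted Toronto Plumber to go to!")

count = phrase_count(pitch, "Toronto Plumber")
words = len(pitch.split())
print(count, "mentions in", words, "words")  # → 5 mentions in 28 words
```

Five mentions of a two-word phrase in a 28-word passage means the keyword accounts for over a third of the copy, exactly the density pattern Penguin was built to penalize.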
In summary, Google created its Penguin update to prevent websites from obtaining a large number of low-value links and stuffing their pages with keywords to leap to the top of keyword rankings.
In 2016 Google made Penguin a part of its core search engine ranking algorithm.
2013 – Hummingbird
The Hummingbird update was more than an update; it was an overhaul of Google’s core search engine algorithm. For clarity, while Panda and Penguin were algorithm changes, this was a generational change in the core search engine algorithm.
The primary objective of the Hummingbird algorithm was to let Google better understand a user’s search query. Bill Slawski, who writes about Google’s patents, explains in a 2013 post on seobythesea.com that Google’s Hummingbird patent is designed to help Google read and understand a natural-language query. As an example, he offers that if a customer were to query “What is the best place for Chicago style Pizza,” Google would understand that by ‘place’ the searcher likely meant ‘restaurant,’ and would accordingly serve up relevant restaurant results for the query.
The root of this algorithm change is Google’s effort to understand the searcher’s intent even when the words in the query do not directly match the words on the page. Or in a nutshell, what this algorithm change means is that Google is signaling to website owners and webmasters that they should try and create content that answers their customers’ queries instead of just focusing heavily on keywords and links to obtain ranking.
There is consistency in these three updates. Google’s clear message to website owners and webmasters is that you should create quality content that provides information of value to the searcher and fulfills search intent.
And to quote David Ogilvy again; words almost prescient about the challenges of digital communication:
“If you have a truly big idea, the wrong technique won’t kill it. And if you don’t have a big idea, the right technique won’t help you.”
Watch John Mueller in the Google Hangout. The interaction referenced here begins at 26:41.
How to Drive Traffic to Your Online Store or Business Website
Retail e-commerce sales are expected to double between 2018 and 2023. This statistic shows not just how well e-commerce is doing at the moment, but also how omnipresent online stores are becoming. And the trend is only gathering speed.
But not every online store is thriving or will thrive. One reason is that many businesses don’t know how to drive traffic to their online stores. They don’t know how to attract customers with indexed pages or how to set up a Google My Business page.
If you have an online business, you have to have more than just a product. You have to have more than a fancy website.
You need a website that is easy for customers to find.
The best part is, you don’t need to be an SEO expert or a website programmer to know how to drive traffic to your online store. You just need simple effective tactics.
Use Organic Traffic Methods
These methods can be free or cost as much as you want them to. Many of them also use social media as a platform to speak to your audience. You can learn to use social media effectively, or leverage other tactics, to drive more traffic to your site.
Write Quality Content
Writing quality content that always answers a customer’s problem is one way to keep your website visitors coming back. The more your content helps solve a problem, the more likely that audience will recommend that reading to someone else and keep coming back for more information.
Create YouTube Videos
Having tutorial YouTube videos can help solve a problem for your audience. In essence, you are giving your audience the keys to the car to do it themselves.
You can have YouTube videos linked on your website and have multiple videos in order to keep your audience coming back. YouTube is also a very popular search engine, which helps people find your online store.
Google My Business Page
Google is the no. 1 search engine, which means it is wise to have your business listed on Google. You can claim and list your page for free, and it also helps with SEO.
When you do list your business page on Google, you can see your reviews. You can also keep track of posts. Ultimately, you can use Google insights to see how your page is doing to give you some insight into how you can improve your business.
Driving Traffic to Your Online Store Requires Quality Work and Time
Driving traffic to your site requires simple tactics. It requires solving your core audience’s problems. You can do this in a variety of ways without spending a fortune on marketing.
Driving traffic to your site also means using social media to speak to your core audience. Platforms such as YouTube can help you solve problems for your audience and allow you to speak to them directly.
You can find more information on our website and see the services we offer to our clients.
This is imperative in a world where the internet has billions of pages and users. And there are many logical things you can do to improve your SEO without having to spend money or time researching complex algorithms.
Here’s SEO for dummies and everything you need to know about what is SEO.
1. On-Page SEO Rewards Great Copy
The first thing to note about SEO is that it rewards great writing. Crafting great content with great links is part of on-page SEO.
This is not writing that is overly complex and full of big words. It is copy that is simple, concise and clear. You can improve your SEO by improving your writing skills.
Write in manageable paragraphs of no more than three sentences. Stick to one idea per paragraph and try to stick to the concept of making a point, backing it up with evidence and then explaining how the evidence relates to the point. Use lots of headings and subheadings.
Try to break meatier paragraphs up into smaller sections using numbered lists and bullet points.
If you want some tangible feedback on how to improve your writing for SEO, try putting your text through programs like Grammarly or Hemingway. These will provide you with a report with suggestions such as removing excess adverbs or reformatting some sentences to remove instances of passive voice.
These services can also give you an idea of the reading age of what you’ve written. You want the average reader to be able to understand what you’ve written: you don’t want users to click off because your blog post is too dense.
Google rewards well-written posts. Moreover, if a blog post makes its point in a clear, concise way, those who read it are more likely to share it, which will increase traffic.
2. Reward Your User By Using Lots of Links
Links are SEO 101 and are another part of on-page SEO. You need to include links to other posts on the site as well as external links. An external link should be to an authoritative news site rather than to other blogs or companies.
Different sites will be relevant to different topics. If you are writing about money or business then Forbes is always a good site as it includes lots of great advice from industry experts and news articles on changes to tax legislation.
If you are writing about design then sites like Good Housekeeping could be a great site to add in, or if you are writing about a recent event then news articles on CNBC or CNN could work well.
Remember not to spam your viewers. You don’t want to cram lots of links into irrelevant articles. The point of a link is to let the reader read more about a topic or area that you don’t have space to cover in detail in your text. Links are designed to help your users understand the topic and to make it easy for them to look up terms they don’t understand.
3. On-Site SEO: Don’t Make Mobile Users Work For Your Site
When we think about websites the first image in our minds is no doubt a website displayed on a desktop or a laptop. But it is becoming increasingly uncommon for a user to view a website for the first time on the big screen.
More people now scroll the web on their cell phone when they are bored or are waiting in line for something such as a hospital appointment or bus. Websites must, therefore, be optimized for mobiles.
And a website copied straight from the desktop and opened on a cell phone will often be fiddly to use. There will be too much text on display for a small screen and the user might have to zoom to click some of the menus. This kind of user experience will lead to many users clicking off your site.
You need to ensure you have two versions of your site: a mobile version and a desktop version. The mobile version should only include the essential details the user needs to know.
Big corporations have now started employing UX designers who specialize in creating a comfortable user experience, as a result of Google algorithms that reward good design and mobile-friendly sites.
This is a crucial part of on-site SEO, different from on-page SEO. It is about making sure your site looks and works well for your users.
What is SEO? A Great Tool To Get Noticed
In the internet age, where we are loaded with information on a daily basis, SEO is a great way of standing out from the crowd. By following a few simple guidelines for the way you write, you can get noticed more easily and drive traffic to your site.
Write in manageable paragraphs in a friendly and informative way, do some research into which topics and keywords are driving traffic, and include lots of great, useful links. Then build a site that is easy to navigate and mobile-friendly, and you are well on the way to increasing your traffic.
If you are interested in learning more about local SEO services then be sure to check out our services to see if we can help you or your business today.
3 Easy SEO Tips to Exponentially Improve Your Online Visibility
If you’re looking to boost your online visibility and increase traffic to your website then you need to get to grips with Search Engine Optimization. This frightens many people off as they feel that it is a complicated set of algorithms that they have to study and game. But this couldn’t be further from the truth.
Many simple SEO tips are relatively intuitive and can in many ways be regarded as common sense. So if you are looking to improve your online visibility, here are our three top SEO tips for increasing your traffic.
1. Include Lots Of Links
To ensure your website does well, you need to show Google that you are a reputable website. To do this, you need to include links to other reputable sites, ones that are considered authoritative. This is also helpful for the reader, as you are guiding them toward more information should they want it.
Sites that are considered authoritative are generally news websites. Many sites are good for one type of link. If you are writing about business or wealth then Forbes is always a good site to link to. If you are writing about a recent event or politics then a major news site such as CNBC, CNN or even the BBC might be good choices.
But remember not to overload your reader. Trying to cram as many links as you can into your blog post so that every other word is a hyperlink will backfire. Two to three links per 500-word post is generally a good number to include.
2. Optimize Your Site For Mobile
Did you know that we are approaching an era where mobile accounts for more traffic than desktops or laptops? This means your mobile site has to be accessible and top-rate. You might wonder why users don’t just view the desktop site on their cell phones or tablets.
But this is not good either for your SEO or your audience. A desktop site opened on a small screen can look cluttered and messy and the user might find they are having to pinch to zoom just to be able to read some of the icons or menus.
Instead, develop a separate site for a cell phone or tablet where some of the menus have been collapsed or moved and the site fits the screen properly.
3. Be Patient
Don’t expect your site to suddenly zoom up in Google’s rankings overnight or your traffic to double. SEO takes time, consistency and a lot of measuring analytics to see what works. It takes most sites one year, perhaps even two before they start to see solid results.
A steady increase in clicks is what you’re looking for, and what Google is looking for, rather than ‘clickbait’, where your site generates increased traffic for a few hours that shortly dies away.
Online Visibility Takes Time
Online visibility is all about generating good-quality content and making life easier for your users. And this takes time. Linking to good-quality sites so users can find more information, and ensuring your website is optimized for mobile users, is a good place to start.
If you want to learn more about strategic SEO planning and keywords be sure to click here to see the services we offer.