Google Ranking Factors
Getting to the top of relevant Google searches can transform the fortunes of any business. But what exactly is Google looking for? Although the search engine doesn’t publish details of exactly how its algorithm works, there are widely acknowledged to be over 200 factors which govern where a particular website will rank. Different factors are given different weightings, but they all count. The following list includes them all, as we understand it. For ease of use we’ve grouped them into domain, page-level, site-level, backlink, user interaction, special Google algorithm rules, brand signal, on-site webspam and off-site webspam factors.
As a London web design agency, we build sites that adhere to these guidelines, helping our clients get to the top of search rankings. If you don’t want to read the whole list, there are two main characteristics – trust and user experience. Most if not all of the factors fall into one or the other. Essentially Google is looking for trustworthy sites that have an excellent user experience, but for more detail here’s the list in full.
Domain
- Age of the domain – Generally Google favours older domains as they hold more authority. It’s not a major factor, but it does contribute to rankings.
- Keyword Appears in a domain name – It’s generally understood that including a keyword in a domain doesn’t work as well for SEO as it once did – although we know of quite a few examples where it continues to work well. It also helps as a relevancy signal.
- Keyword as the first word in a Domain name – Putting the main keyword as the first word in your domain gives it a slight advantage over competitors.
- Domain registration length – From Google “Valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year. Therefore, the date when a domain expires in the future can be used as a factor in predicting the legitimacy of a domain.”
- Keyword in Subdomain – Including a keyword in a subdomain can help with SEO – for example www.marketingdesigngroup.webdesign.co.uk
- Domain History – If your domain has a chequered history, i.e. it has experienced drops for implementing black hat techniques or it has been penalised in the past (even under a previous owner and not you), Google may reset the site’s history, ignoring all historical links.
- Exact Match Domain – It’s widely acknowledged that having the exact keyword as a domain doesn’t work as well as it did, especially after the Google EMD update. But it’s still an advantage. This is one area the Marketing and Design Group is exploiting – not only using an exact match but making the keyword the brand name and exact match for the Birmingham Marketing Agency and Bristol Marketing Agency. Essentially Google doesn’t know whether someone is searching for THE Birmingham Marketing Agency or A Birmingham Marketing Agency.
- Public vs Private Whois Data – Hiding who owns a domain could be an indicator of something untoward. As Google’s Matt Cutts has said: ”…When I checked the whois on them, they all had “whois privacy protection service” on them. That’s relatively unusual. …Having whois privacy turned on isn’t automatically bad, but once you get several of these factors all together, you’re often talking about a very different type of webmaster than the fellow who just has a single site or so.”
- Penalised WhoIs Owner – If a known spammer owns a particular domain, it’s likely that domain (and other domains owned by that person) will be downgraded.
- Country TLD extension – UK based businesses should have the Country Code Top Level Domain .co.uk
Page-Level Factors
- Keyword in the Title Tag. Its SEO power has diminished over the years, but the title tag is still important as it tells Google what you want to be ranked for.
- Title Tag Starts with Keyword. The general consensus is that title tags with the keyword at the beginning rank better than those with the keyword at the end.
- Keyword in Description tag. The meta description is irrelevant as a ranking element. But a clear, concise and persuasive description will generate better click through rates, which is an important ranking factor.
- Keyword in the H1 tag. Effectively the second title tag, taken together with the title tag, Google takes the H1 as a relevancy signal.
- Term Frequency – Inverse Document Frequency (also known as TF-IDF). The Google algorithm uses TF-IDF to understand what a page is about.
- Content Length. Pages (and to a certain extent sites) with a higher word count are favoured by the algorithm as they are generally more detailed and therefore are more likely to contain the information a searcher is looking for, than shorter articles.
- Table of Contents. Google likes a table of contents that includes links, as it’s an easy way for the algorithm to understand what the content is about.
- Keyword Density. Content must be natural and flow for the reader as well as for Google. Stuffing content with keywords can get a site penalised.
- Latent Semantic Indexing (LSI) Keywords in Content. These are conceptually related terms – the terms that add weight to an overall topic. For example you may have “men’s clothes” in the meta title, content etc, but the algorithm will also use related terms such as “trousers” and “shirts” to rank relevancy.
- LSI Keywords in Title and Description Tags. When included in the title and description tags, LSI keywords help Google understand relevance especially if a word has a number of meanings – eg “apple”
- Page Covers Topic In-Depth. Similar to the content length point above, Google likes to provide searchers with the information they are searching for. So an in-depth article covering all aspects of a given subject will rank higher than a more superficial one.
- Page loading speed via HTML. Load speed is a ranking factor for now (when 5G is widespread its importance will probably be downgraded). The algorithm looks at the HTML code to gauge a particular site’s speed.
- Page Loading Speed via Chrome. As well as HTML, Google also uses data it collects from Chrome users to ascertain the load speed of a page.
- Use of AMP (Accelerated Mobile Pages). AMP does help with the mobile version of the Google News Carousel, although it’s not a direct ranking factor.
- Entity Match. What’s the entity of a webpage and does it match what a user is searching for?
- Google Hummingbird. This algorithm update helped Google to understand “real-world entities and their relationships to one another” – ie it’s more efficient in ascertaining the topic of a webpage.
- Duplicate Content. A site needs original content on each page; any duplication of content on the site will reduce its ranking.
- Rel=Canonical. If your site does need to use duplicate content, a rel=canonical tag will stop Google penalising it.
- Image optimisation. Google looks at the image file name, alt text, title, description and caption for signs of relevancy.
- Content recency. Google’s Caffeine update prioritises recently published and updated content; this is especially relevant for searches which are time sensitive.
- Magnitude of content updates. Continuing the freshness theme, Google looks at edits, updates and amendments to content. Larger changes, e.g. adding a paragraph, will hold more weight than fixing a typo.
- Historical page updates. How often your site is updated is also an element which the algorithm takes into consideration.
- Keyword Prominence. Including a keyword you want to rank for in the first 100 words of the page helps it to rank for that term.
- Keyword in H2 and H3 tags. Not that important but still indicates relevancy to Google.
- Outbound Link Quality. Linking to high quality sites within your content also helps to establish trust and is therefore a positive ranking factor.
- Outbound Link theme. Another relevancy factor is the content of the external pages linked to. For example, the content mentions Apple in relation to mobile phones rather than fruit.
- Spelling and Grammar. The algorithm is looking for quality content, so it’s a good idea to make sure the grammar and spelling are correct – for the user experience as well as for Google.
- Syndicated Content. Make sure all your content is unique and not copied from another site, otherwise that content will not rank.
- Mobile friendly sites. With so many searches now being done on a mobile, Google ranks more highly sites optimised for mobiles.
- Mobile user experience. Google’s “Mobile-first Index” ranks more highly sites which are easier to use.
- “Hidden” content on mobile. Content which is hidden on mobiles probably won’t get indexed.
- Helpful “supplementary content”. In line with the direction Google has gone in recently, i.e. rewarding a better user experience, it’s advantageous to include supplementary content. Examples include a mortgage calculator or currency converter.
- Content Hidden Behind Tabs. If users have to click on a button to see content, it may not be indexed.
- Outbound Links – include too many external links from a page and you will likely get marked down.
- Multimedia. Photos, infographics and videos not only help break up text, making it easier to read, but they also help communicate the content on that page. Therefore they are seen by Google as a quality signal.
- Number of Internal Links Pointing to a Page: internal links are not only useful for the user experience but they also help tell Google which pages are more important on your site. The pages with the most links are deemed to be the most important.
- Quality of Internal Links. More weight is given to links from pages with a higher PageRank.
- Broken links. Not only are broken links bad for user experience, if there are too many Google may also categorise your site as neglected.
- Reading level. Google does grade sites on reading level, although there is some debate on what it does with that information. Google likes sites to appeal to as many people as possible, so one argument goes that basic rather than advanced English will see your site get a boost.
- Affiliate Links – whilst some are OK, don’t include too many affiliate links, as your site will be viewed as poor quality.
- HTML errors / poor code. As we mentioned earlier, poor quality code can slow a site down; Google also uses it as a low quality signal.
- Domain authority. A webpage on a website with a higher domain authority will rank higher than one on a lower domain authority site – all things being equal.
- Page’s PageRank. Pages with strong link authority tend to outrank those without much of it.
- URL length. Shorter URLs seem to do better than excessively long ones.
- URL Path. If a page is too far from the homepage in terms of the site’s architecture, it won’t rank as high as one that is close to the homepage.
- Human editors. Google has been tight lipped on this one, but they have filed a patent for a system which allows human editors to influence the rankings.
- Page Category. Another tool used by Google to judge relevancy is the category a page appears on.
- Keyword in URL – including the keyword in the URL acts as another relevancy indicator.
- URL String. The Google algorithm analyses the categories within a URL for relevancy indicators.
- References and Sources. Linking to sources like research papers could be seen by Google as a quality indicator.
- Bullets and Numbered Lists. As we have seen Google is looking for a good user experience. As bullet points and numbered text help to break up text and make it easier to read, Google ranks pages that include them more highly.
- Priority of Page in Sitemap. Giving a page priority in your sitemap may have a positive impact on its ranking.
- Too Many Outbound links. Google says “Some pages have way, way too many links, obscuring the page and distracting from the Main Content”
- User Experience from other keywords a Page Ranks for: Google states “We look for sites that many users seem to value for similar queries” so if a page ranks well for a particular keyword it will have a positive effect on the ranking for other keywords.
- Page Age. Google tends to prefer older pages which get frequent updates.
- User friendly layout – as Google states in their Quality Guidelines Document “The page layout on highest quality pages makes the Main Content immediately visible”
- Parked domains – for years now, parked domains have received decreased visibility.
- Useful Content – Google may distinguish between what it analyses as quality content as opposed to merely useful content.
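Several of the page-level points above – TF-IDF, keyword density and content analysis – can be made concrete in a few lines. This is only a rough sketch of the textbook formula, not Google’s actual implementation; the pages and terms are invented for illustration:

```python
import math

def keyword_density(doc, term):
    """Share of the words on a page that are the keyword (the TF part)."""
    return doc.count(term) / len(doc)

def tf_idf(term, doc, corpus):
    """TF-IDF: term frequency, weighted by how rare the term is across pages."""
    docs_with_term = sum(1 for d in corpus if term in d)
    idf = math.log(len(corpus) / docs_with_term)  # rarer terms score higher
    return keyword_density(doc, term) * idf

# Three toy "pages", each a list of lowercase words (made-up content)
corpus = [
    "mens clothes trousers shirts trousers".split(),
    "mens shoes trainers boots".split(),
    "garden furniture tables chairs".split(),
]

# "trousers" appears on only one page, so it is more distinctive there
# than "mens", which appears on two pages.
print(tf_idf("trousers", corpus[0], corpus))
print(tf_idf("mens", corpus[0], corpus))
```

The intuition matches the list above: a term that is frequent on one page but rare across the rest of the web is a strong hint about what that page is about, whereas a term that appears everywhere says little.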
Site-level Factors
- Content which Provides Value and Unique Insights. If your site doesn’t provide anything new or useful it will probably get downgraded
- Contact Us Page. Google’s Quality Guidelines document does state it likes an “appropriate amount of contact information”. Ideally your contact information will be the same as the Whois info.
- Domain Trust. TrustRank seems to be a very important factor in where a site will rank.
- Site Architecture. A well thought out and implemented site architecture will help Google organise your content and aids the Googlebot to index all your pages.
- Site updates. Regular updates to your site are widely believed to be a factor Google uses
- Presence of Sitemap. Sitemaps are widely thought to help the Googlebot crawl and index all the pages on your site.
- Site uptime. If your site is down regularly for maintenance or there are frequent problems with your server, then these issues will have a negative impact on your rankings.
- Server location. The physical location of your server will play a part in geographical searches.
- SSL Certificate. Secure sites that use HTTPS will rank higher than ones that don’t, all things being equal.
- Privacy Pages and Terms of Service. Including both these pages helps your site establish trust with Google.
- Duplicate Meta Info. If your site is using the same metadata on multiple pages it will get penalised.
- Breadcrumb Navigation. Breadcrumb-style navigation is user friendly and also helps search engines understand where a page sits within a site.
- Mobile Optimised. If your site isn’t mobile optimised it won’t rank well on mobile searches.
- YouTube. Google owns YouTube, so it’s no surprise YouTube videos are given preferential treatment.
- Site Usability. If users find your site difficult to use then it’s likely this will have a knock on effect on your ranking due to reduced view time and high bounce rates.
- Use of Google Analytics and Google Search Console. Although denied by Google, some SEO experts think that having Google Analytics and Search Console installed helps with rankings. They think those programs give Google more information on traffic and bounce rates which it uses for rankings.
- User Reviews and a website’s Reputation. It’s thought your online reputation on review sites will play an important role in your rankings. Getting customers to leave positive reviews on Google My Business is an important strategy.
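To illustrate the sitemap points above, here is a minimal sketch of generating a sitemap.xml with per-page priorities, using Python’s standard library. The URLs are invented examples, and real sitemaps often carry extra fields such as lastmod:

```python
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(pages):
    """Build a minimal sitemap.xml string from (url, priority) pairs."""
    urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url, priority in pages:
        entry = SubElement(urlset, "url")
        SubElement(entry, "loc").text = url
        # priority ranges from 0.0 (least important) to 1.0 (most important)
        SubElement(entry, "priority").text = f"{priority:.1f}"
    return tostring(urlset, encoding="unicode")

# Hypothetical pages: the homepage is given the highest priority
xml = build_sitemap([
    ("https://example.co.uk/", 1.0),
    ("https://example.co.uk/services/", 0.8),
    ("https://example.co.uk/blog/old-post/", 0.3),
])
print(xml)
```

The resulting file would normally be saved as /sitemap.xml and referenced from robots.txt so the Googlebot can find it.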
Backlink Factors
- Linking Domain Age. It’s no surprise links from older sites are more valuable than links from newer ones.
- Number of Linking Root Domains. One of the most important ranking factors for a website is the number of other sites which link to it.
- Number of Links from Separate C-Class IPs. The Google algorithm seems to like links from a wider scope of separate class-C IP addresses.
- Number of linking pages. Even if coming from the same domain, the total number of linking pages will have a positive impact on rankings.
- Backlink Anchor text. As the founders of Google are quoted as saying “anchors often provide more accurate descriptions of web pages than the pages themselves” they are still used as a strong relevancy signal when not overdone.
- Alt Tag for Image Links: the alt text for an image acts as its anchor text.
- Links from .edu or .gov domains – although denied by Google, a lot of SEOs still think that links from these domains carry more weight.
- Authority of Linking Page: links from pages with a higher PageRank carry more weight than those from pages with a lower one.
- Authority of Linking Domain. Links from websites with a higher domain authority are more valuable for SEO
- Links from Competitors. A good indicator of authority is a link from a competitor’s page to your site for that particular keyword.
- Links from “Expected” websites. Another factor that many in the SEO field believe, though it’s largely unproven, is that Google expects links from certain sites in any particular industry.
- Links from Bad Neighbourhoods. Links from “bad neighbourhoods” will be bad for your rankings.
- Guest Posts. Whilst links from editorial content are far better, links from guest posts still add some value.
- Links from Ads. According to Google, links from ads should be nofollowed. But it’s likely Google can identify and filter out followed links from ads.
- Homepage Authority. More weight is given to links from a referring page if it’s the homepage.
- Nofollow Links. There’s lots of debate in the industry with regards to nofollow links. Google states “in general, we don’t follow them”, which would suggest that on occasion they do. Having a mixture of followed and nofollow links seems natural.
- Diversity of link types. Quality sites will have a good mixture of link types. Therefore if too many are coming from forum profiles and blog comments it could be a sign of webspam and get penalised.
- Sponsored or UGC tags. Links tagged as “rel=sponsored” or “rel=ugc” are treated differently from normal “followed” or rel=nofollow links.
- Contextual Links: links need to occur naturally in the content, so links on empty pages will have no value.
- Excessive 301 Redirects to Page: too many backlinks from 301 redirects have a negative effect on PageRank.
- Internal Link Anchor Text. Although less weight is given to anchor text from internal links rather than external ones, they still act as a relevancy signal.
- Link Title Attribute. A weak factor, but the link title given when you hover over a link still acts as a relevancy factor.
- Country TLD of the Referring domain. Links from sites with the same country code as your site helps you to rank better in your country.
- Link location in content. Where a link appears on a page will affect the weight given to it. Links at the top of the content (in the first paragraph) will have more of a positive effect than ones towards the bottom.
- Link Location on Page. Similar to the point above, links in the content will be more powerful than links in the footer.
- Linking domain relevancy: Links from sites in a similar field or industry are a lot more powerful.
- Page-Level Relevancy: Similar to the point above, links from a relevant page (as well as the domain), hold more value.
- Keyword in Title: The algorithm gives extra weight to links from pages that contain your keywords in their title.
- Positive Link Velocity: Sites that have a positive link velocity benefit because it illustrates an upward trend of popularity.
- Negative Link Velocity: Conversely, sites decreasing in popularity will suffer falls in rankings.
- Links from “Hub” Pages: links from authoritative pages on a particular subject hold more weight.
- Links from Authority sites. Links from authoritative websites on a particular subject hold more weight.
- Linked to a Wikipedia Source. Despite the fact they are nofollow links, many in the SEO industry think a link from Wikipedia helps with rankings, as it’s an authority signal.
- Co-occurrences: Words which are close to a link help the algorithm to judge relevancy.
- Back Link Age: Older links carry more weight than newer ones.
- Links from Real sites vs. Splogs. More weight is given to links from what you could call real sites rather than blog networks.
- Natural Link Profile. Google doesn’t like what it considers to be artificial link building, so links to a site must look natural.
- Reciprocal links. Sites with too many link exchanges will be penalised.
- User Generated Content Links: more weight is given to links added by the owner of the site rather than by its users.
- Links from 301: links via a 301 redirect aren’t quite as good as a direct link.
- Schema.org usage: pages which support structured data markup will probably rank higher than those that don’t.
- TrustRank of Linking Site: the trustworthiness of a site linking to you will determine how much “TrustRank” transfers to you.
- Number of Outbound Links on Page: the quantity of links on a page can dilute the PageRank it passes on.
- Forum Links: it’s been a favoured SEO tactic for years, but the algorithm now devalues links from forums.
- Word Count on Linking Content. Links from in-depth articles (e.g. 1,000 words plus) are more valuable than ones from a short snippet.
- Quality of Linking Content. Links from well written content, free of spelling and grammar mistakes, will be more valuable.
- Sitewide links. Sitewide links are compressed and act as a single link.
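Several of the backlink points above hinge on a link’s rel attribute. As a rough sketch, here is how you might audit the links on a page and classify them as followed, nofollow, sponsored or UGC, using Python’s standard library; the page snippet and URLs are invented:

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Classify <a> links on a page by their rel attribute."""
    def __init__(self):
        super().__init__()
        self.links = {"followed": [], "nofollow": [], "sponsored": [], "ugc": []}

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").lower().split()
        href = attrs.get("href", "")
        # A link may carry several rel tokens; file it under the first match
        for kind in ("nofollow", "sponsored", "ugc"):
            if kind in rel:
                self.links[kind].append(href)
                break
        else:
            self.links["followed"].append(href)

# Hypothetical page snippet for illustration
html = '''
<a href="/about">About</a>
<a href="https://ads.example.com" rel="sponsored">Advert</a>
<a href="https://forum.example.com" rel="ugc nofollow">Comment link</a>
'''
audit = LinkAudit()
audit.feed(html)
print(audit.links)
```

Running an audit like this across a site shows whether the mix of followed and nofollow links looks natural, which is the point made above.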
User Interaction
- RankBrain. Google’s AI algorithm, RankBrain, is widely believed to gauge how users interact with search results.
- Organic Click Through Rate for a Keyword. The pages which get more click throughs on a search will get a boost for that keyword.
- Organic CTR for All Keywords: if a site is getting good click through rates across all keywords, its ranking will rise as a result.
- Bounce rate: a high bounce rate for a particular keyword suggests a lower quality page, which may be penalised as a result.
- Direct traffic. Using data collected via Chrome, Google ascertains the volume of traffic to any given site. The more traffic a site gets, the more likely it is to have quality content, and therefore the higher it’s ranked.
- Repeat Traffic. Lots of returning users is another factor used to gauge quality and therefore will result in a ranking boost.
- Pogosticking: similar to bounce rates, a user may jump from site to site to find the right answer to a particular query. The pages a user skips through will get a rankings drop.
- Blocked sites. It’s thought sites blocked in Chrome act as a poor quality signal and get downgraded.
- Chrome Bookmarks. More data collected from Chrome: if lots of people bookmark a particular page, it will be seen as a quality page.
- Number of comments: when a page gets a lot of interaction, it’s seen as a higher quality page, helping with rankings.
- Dwell Time: another quality indicator which the algorithm analyses. If users spend longer on a particular page, that page is deemed to be of a higher quality and therefore will get a rankings boost.
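Organic click-through rate, mentioned above, is straightforward to compute from the clicks and impressions you can export from tools such as Search Console. A minimal sketch with invented figures:

```python
def organic_ctr(clicks, impressions):
    """Organic click-through rate as a percentage of search impressions."""
    return 100 * clicks / impressions

# Hypothetical Search Console style data: keyword -> (clicks, impressions)
keywords = {
    "web design london": (120, 1500),
    "cheap websites": (30, 2000),
}
for kw, (clicks, impressions) in keywords.items():
    print(f"{kw}: {organic_ctr(clicks, impressions):.1f}% CTR")
```

A keyword with an unusually low CTR for its position is often a sign that the title tag and meta description need rewriting, which loops back to the page-level factors earlier in this list.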
Special Google Algorithm Rules
- Query Deserves Freshness. Some searches will be time sensitive and therefore newer pages will get a boost in that case. For example, what time does the England football match start?
- Query Deserves Diversity: some ambiguous queries will need a more diverse set of results.
- User Browsing History. As most people will have seen, the sites you’ve visited before will appear further up the rankings.
- User Search History: previous searches can have an effect when searching for a complementary query.
- Featured Snippets. Google displays Featured Snippet content based on a combination of content length, formatting, page authority and HTTPS usage.
- Geo targeting – the algorithm favours sites with a country specific domain extension and local server IP.
- Safe search. When a user has SafeSearch turned on, swear words and adult content won’t be shown.
- Google+ Circles: Although Google+ is being retired and will soon cease to exist, it still shows better results for sites which have been added to Google+ circles.
- “YMYL” Keywords: Your Money or Your Life (YMYL) content is the type of information that, if presented inaccurately, untruthfully, or deceptively, could directly impact the reader’s happiness, health, safety, or financial stability. Therefore it has higher content quality standards
- DMCA Complaints. Google penalises pages with valid DMCA complaints (Digital Millennium Copyright Act)
- Domain Diversity: it’s thought the “Bigfoot Update” added more domains to each SERP page.
- Transactional searches: sometimes the algorithm will display different results for shopping related keywords such as flights.
- Local searches: Google often includes Google My Business results above the organic results
- Top stories box. Some keywords can trigger results for stories in the media.
- Big brand preference. Google does give preference to big brands for some keywords.
- Shopping results: Google will sometimes show Google Shopping results in the organic SERPs.
- Image results: sometimes images will also appear in the organic searches.
- Easter egg results: Google has added Easter eggs and April Fools’ Day jokes and hoaxes into many of its products and services, such as Google Search, YouTube, and Android, since at least 2000.
- Single site results for brands: several results from the same website will appear if the search was for a specific brand.
- Payday loans update. Back in 2014 Google confirmed they released a new update to their Payday Loan algorithm, which specifically targets “very spammy queries”.
Brand Signals
- Brand Name Anchor Text: Using the brand name in the anchor text is a strong brand signal
- Branded Searches: if people are searching for your brand name, it proves to Google that your website is a real brand.
- Brand and Keyword Searches: if people are searching for your brand name as well as a keyword, you’ll get a boost in searches for those keywords without your brand name.
- Site Has Facebook Page and Likes: brands will have a Facebook page and are likely to have lots of likes.
- Site has Twitter Profile with Followers: brands will have a Twitter profile with lots of followers.
- Official LinkedIn Company Page. Businesses will also have a LinkedIn company page.
- Known Authorships: In 2013, Google’s Eric Schmidt said “Within search results, information tied to verified online profiles will be ranked higher than content without such verification, which will result in most users naturally clicking on the top (verified) results”.
- Legitimacy of Social Media Accounts: frequent posts with lots of interaction will be interpreted differently than accounts with lots of followers but hardly any posts and interactions.
- Brand mentions on Top Stories. The biggest brands will get mentioned in Top Stories frequently
- Unlinked Brand Mentions: another brand signal is when brands get mentioned but don’t get linked to.
- Brick and mortar Location: Proper businesses have real premises so Google will probably be looking at location data.
On-site webspam factors
- Panda Penalty: The Panda update hit low-quality content sites, especially content farms.
- Links to bad Neighbourhoods: linking out to spammy sites will hurt your rankings.
- Cloaking: Cloaking refers to the practice of presenting different content or URLs to human users and search engines. Cloaking is considered a violation of Google’s Webmaster Guidelines as it provides users with different results than they expected
- Popups or “Distracting Ads”. Popups and distracting ads are an indicator of a low quality site.
- Interstitial Popups: the algorithm will penalise websites which show full page “interstitial” popups to mobile users.
- Over-optimised sites. It may seem counter-intuitive, but Google does like “natural” looking sites and does appear to penalise sites which are over-optimised. So avoid keyword stuffing and header tag stuffing.
- Gibberish Content. The algorithm can identify what it calls gibberish content, for example spun or auto-generated content, which will be removed from its index.
- Doorway Pages: Google will penalise sites which try to fool it – for example by redirecting people via doorway pages.