To show the most pertinent, helpful results, all in a fraction of a second, Google employs automatic ranking methods that examine several criteria and signals about hundreds of billions of web pages and other content in our Search index.
We rigorously test and evaluate these systems on a regular basis to improve them, and we notify content creators and others when we make notable changes.
This page serves as a guide to understanding some of our more notable ranking systems. It covers some of the core systems that generate search results in response to queries, as well as a few systems with specialised ranking needs.
Visit our How Search Works page to learn more about how our ranking methods and other procedures come together to help Google Search fulfil its goal of making all information accessible and valuable to everyone.
BERT
Google’s Bidirectional Encoder Representations from Transformers (BERT) is an AI system that allows us to understand how combinations of words express different meanings and intents.
Crisis information systems
When a crisis arises, whether it is a personal crisis, a natural disaster, or another widespread crisis, Google has developed tools to give timely and helpful information:
- Personal crisis: Our systems are designed to recognise when people are searching for information about personal crisis situations, and to display hotlines and content from trusted organisations for certain queries related to suicide, sexual assault, poisoning, gender-based violence, or drug addiction. Find out more about how Google Search presents information regarding personal crises.
- SOS Alerts: During natural disasters or widespread crises, our SOS Alerts system works to display updates from local, national, or international authorities. These updates may include emergency phone numbers and websites, maps, donation opportunities, and more. Learn more about SOS Alerts and how they function as a component of Google’s crisis alerts, which are useful during earthquakes, hurricanes, wildfires, and other disasters.
Deduplication systems
A Google search may turn up thousands or even millions of matching web pages. Some of these may be strikingly similar to one another. In such cases, our systems display only the most relevant results to avoid unhelpful duplication. Find out more about deduplication, including how to see omitted results if you’d like to.
Featured snippets also go through deduplication. If a web page listing is elevated to become a featured snippet, we don’t repeat that listing later on the first page of results. This declutters the search results and makes it easier for users to find relevant information.
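As an illustrative sketch (not Google’s actual implementation), near-duplicate filtering can be pictured as comparing word-shingle overlap between candidate pages, keeping only the first (most relevant) member of each near-identical group:

```python
def shingles(text, n=3):
    """Split normalised text into overlapping word n-grams."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(max(len(words) - n + 1, 1))}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets (1.0 = identical)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def deduplicate(results, threshold=0.8):
    """Keep each result unless it is a near-duplicate of an earlier
    (i.e. more relevant) result that was already kept."""
    kept = []
    for doc in results:
        sig = shingles(doc)
        if all(jaccard(sig, shingles(k)) < threshold for k in kept):
            kept.append(doc)
    return kept

results = [
    "how to bake sourdough bread at home step by step",
    "how to bake sourdough bread at home step by step guide",  # near-duplicate
    "best sourdough starters reviewed",
]
print(deduplicate(results))  # the near-duplicate second entry is dropped
```

Real systems use far more scalable signatures (such as hashing schemes) and richer signals, but the tie-break principle is the same: when two results say essentially the same thing, show only the stronger one.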
Exact match domain system
The words in domain names are one of many factors our ranking systems consider when determining whether content is relevant to a search. However, our exact match domain system ensures that we don’t give too much credit to content hosted under domains designed to exactly match particular queries. For example, someone might register a domain name containing the words “best-places-to-eat-lunch” in the hope that this alone will help content rise in the search results. Our systems adjust for this.
Freshness systems
We have a number of “query deserves freshness” systems designed to show fresher content where it is warranted. For example, someone searching for information about a newly released film is likely looking for recent reviews rather than older articles from before filming started. As another example, a search for “earthquake” would normally return information on preparedness and resources. If an earthquake has just occurred, however, news articles and other fresh content may appear instead.
Helpful content system
Our helpful content system is designed to better ensure that people see original, helpful content written by people, for people, in search results, rather than content made primarily to attract search engine traffic.
Link analysis systems and PageRank
We have a variety of systems that understand how pages link to one another as a way of determining what pages are about and which might be most helpful in response to a query. Among these is PageRank, one of our core ranking systems used when Google first launched. Those interested in learning more can read the original PageRank research paper and patent. PageRank has evolved considerably since then, but it remains a foundational part of our ranking systems.
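The core idea from the original PageRank paper can be sketched as a power iteration over a toy link graph. The damping factor and dangling-page handling below are standard textbook choices, not Google’s production algorithm:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank over a link graph given as
    {page: [pages it links to]}. Returns {page: score}, summing to 1."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Each page starts with a small baseline, then receives a share
        # of the rank of every page that links to it.
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:
                # Dangling page with no outlinks: spread its rank evenly.
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
scores = pagerank(graph)
# Page "c" accumulates the most rank: both other pages link to it.
```

The intuition carries over even though modern link analysis is far more sophisticated: a link acts as a weighted endorsement, and endorsements from well-endorsed pages count for more.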
Local news systems
When relevant, our systems are designed to surface and highlight local news sources, such as in our “Top stories” and “Local news” features.
MUM
Multitask Unified Model (MUM) is an AI system capable of both understanding and generating language. It is not used for general ranking in Search, but rather for a few specific applications, such as improving searches for COVID-19 vaccine information and improving the featured snippet callouts we display.
Neural matching
Neural matching is an AI system Google utilises to understand representations of concepts in queries and pages and to match them to one another.
Original content systems
We have systems in place to help ensure that original content, including original reporting, is displayed prominently in search results ahead of content that merely cites it. This includes support for a special canonical markup that creators can use to indicate which page is the primary one when a page has been duplicated elsewhere.
Removal-based demotion systems
Google’s policies allow the removal of certain types of content. If we process a high volume of such removals involving a particular site, we use that as a signal to improve our results. More specifically:
- Legal removals: When we receive a high volume of valid copyright removal requests involving a given site, we are able to use that to demote other content from that site in our search results. This way, if there is other infringing content, people are less likely to encounter it than the original content. We apply similar demotion signals to complaints involving defamation, counterfeit goods, and court-ordered removals.
- Personal information removals: If we process a high volume of personal information removals involving a site with exploitative removal practices, we demote other content from that site in our results. We also look to see whether the same pattern of behaviour occurs on other sites and, if so, demote content on those sites as well. We may apply similar demotion practices to sites that receive a high volume of doxxing content removals. Additionally, we have automatic protections designed to prevent non-consensual explicit personal images from ranking highly in search results for names.
Page experience system
People prefer sites with a great page experience. This is why we have a page experience system that evaluates a variety of criteria, such as how quickly pages load, whether they are mobile-friendly, whether they contain intrusive interstitials, and how securely they are served. In cases where there are many possible matches with relatively equal relevance, the system helps give preference to content with a better page experience.
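One way to picture this tie-breaking role is a hypothetical sketch (the scores and thresholds below are made up for illustration, not Google’s actual signals) that buckets candidates by relevance and uses a page-experience score only to order roughly equal matches:

```python
def rank(candidates, relevance_tie=0.05):
    """Order candidates by relevance; within groups of roughly equal
    relevance, prefer the better page-experience score as a tie-breaker.
    Each candidate is (url, relevance 0-1, experience 0-1)."""
    def key(c):
        url, rel, exp = c
        # Bucket relevance so "roughly equal" results compare on experience.
        bucket = round(rel / relevance_tie)
        return (-bucket, -exp)
    return sorted(candidates, key=key)

pages = [
    ("slow-but-relevant.example", 0.90, 0.3),
    ("fast-and-relevant.example", 0.91, 0.9),
    ("less-relevant.example", 0.60, 1.0),
]
print([url for url, _, _ in rank(pages)])
```

Note how the least relevant page stays last despite its perfect experience score: page experience only breaks ties, it does not outrank relevance.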
Passage ranking system
To better determine how relevant a web page is to a search, our passage ranking system uses AI to identify individual “passages” or sections of a page, rather than considering the page only as a whole.
Product reviews system
The product reviews system aims to better reward content that provides insightful analysis and original research, and that is written by experts or enthusiasts who know the topic well.
RankBrain
RankBrain is an AI system that helps us understand how words relate to concepts. By understanding how content relates to other words and concepts, we can better return relevant content even if it doesn’t contain all the exact words used in a search.
Reliable information systems
We use a variety of systems to show the most reliable information possible, including systems that help surface more authoritative pages, demote low-quality content, and promote quality journalism. In situations where reliable information may be lacking, such as when a topic is evolving quickly or when our confidence in the overall quality of the results is low, our systems automatically display content advisories. These offer tips on searching in ways that may produce more helpful results. Find out more about how we provide reliable information in Search.
Site diversity system
Our site diversity system works to prevent any single site from dominating the top results by generally not showing more than two web page listings from the same site in our top results. However, we may still show more than two listings when our systems determine it is especially relevant to do so for a particular search. Site diversity generally treats subdomains as part of the root domain: listings from both the parent domain (example.com) and a subdomain (subdomain.example.com) are considered to come from the same single site. However, subdomains may occasionally be treated as separate sites for diversity purposes when deemed appropriate.
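The listing limit described above can be sketched as follows. This is purely illustrative, and it uses a naive two-label heuristic to find the root domain where a real implementation would consult the public suffix list:

```python
from urllib.parse import urlparse

def root_domain(url):
    """Collapse subdomains onto the registrable domain, e.g.
    'blog.example.com' -> 'example.com'. Naive two-label heuristic;
    a real system would use the public suffix list (e.g. 'co.uk')."""
    host = urlparse(url).hostname or ""
    return ".".join(host.split(".")[-2:])

def diversify(results, per_site=2):
    """Keep at most `per_site` listings per root domain,
    preserving the original (relevance) order."""
    counts = {}
    kept = []
    for url in results:
        site = root_domain(url)
        if counts.get(site, 0) < per_site:
            counts[site] = counts.get(site, 0) + 1
            kept.append(url)
    return kept

results = [
    "https://example.com/a",
    "https://blog.example.com/b",   # subdomain counts as the same site
    "https://example.com/c",        # third listing from this site: filtered
    "https://other.org/x",
]
print(diversify(results))
```

Because subdomains collapse onto the parent domain, the second example.com listing consumes the site’s quota even though it lives on blog.example.com.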
Spam detection systems
Nobody wants their email inbox clogged with spam, which is why spam filters are so useful. Search faces similar challenges: the internet contains massive volumes of spam that, if left unaddressed, would prevent us from showing the most helpful and relevant results. We use a variety of spam detection systems, including SpamBrain, to deal with content and behaviours that violate our spam policies. These systems are regularly updated to keep pace with the latest ways the spam threat evolves.
Retired systems
The following systems are noted for historical purposes. They have either been incorporated into our core ranking systems or folded into successor systems.
Hummingbird
This was a major improvement to our overall ranking systems, made in August 2013. Our ranking systems have continued to evolve since then, just as they did before.
Mobile-friendly ranking system
Because mobile-friendly content is more helpful to people searching on mobile devices, the mobile-friendly ranking system gave preference to content that rendered better on mobile devices in cases where there were many candidate matches with relatively equal relevance. It has since been incorporated into our page experience system.
Page speed system
Introduced in 2018, the “Speed Update” system meant that, all other things being equal, content that loaded faster for mobile users would rank higher in our mobile search results. It has since been incorporated into our page experience system.
Panda system
This system was designed to better ensure that original, high-quality content appeared in our search results. Introduced in 2011 and nicknamed “Panda”, it evolved over time and became part of our core ranking systems in 2015.
Penguin system
This system was designed to combat link spam. Announced as the “Penguin Update” in 2012, it was incorporated into our core ranking systems in 2016.
Secure sites system
Introduced in 2014, this system meant that sites secured with HTTPS would perform better in our rankings, all other things being equal. It encouraged the adoption of secure sites at a time when HTTPS usage was not yet widespread. It has since been incorporated into our page experience system.
FULL ARTICLE: A Guide to Google Search Ranking Systems | Google Search Central | Documentation | Google Developers