SEMrush Technical SEO Exam Questions

Below is a list of SEMrush Technical SEO Exam questions with verified answers. You can also get unlimited access to our continuously growing library of certification exam questions and answers.

Link to the official certification exam page: SEMrush Academy Certification Exams.

What elements should text links consist of to ensure the best possible SEO performance?


Nofollow attribute, anchor text

a-tag with href-attribute, noindex attribute

Anchor text, a-tag with href-attribute


A good approach is to avoid internal competition: the more links to different URLs share the same anchor text, the harder it is for Google to differentiate which of them is the one URL on your domain that should rank for the given keyword.





What are the two most commonly known best practices for increasing crawling effectiveness?


Multiple links to a single URL

Interlink relevant contents with each other

Using linkhubs

Internal, link-level rel-nofollow

Meta robots nofollow


Choose three statements referring to XML sitemaps that are true:


XML sitemaps must only contain URLs that return an HTTP 200 response

XML sitemaps should usually be used when a website is very extensive

It is recommended to use gzip compression and UTF-8 encoding

It is recommended to have URLs that return non-200 status codes within XML sitemaps

There can be only one XML sitemap per website
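To make these best practices concrete, here is a minimal XML sitemap sketch (the URL and date are placeholders): it is UTF-8 encoded, lists only URLs that return HTTP 200, and can additionally be served gzip-compressed as e.g. sitemap.xml.gz. Very large sites can split their URLs across several sitemap files referenced from a sitemap index file.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only include URLs that return HTTP 200 -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2023-01-01</lastmod>
  </url>
</urlset>
```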


Choose a factor that affects the crawling process negatively.


A well-defined hierarchy of the pages

Content freshness

Duplicate pages/content


Choose two statements that are false about the SEMrush Audit Tool.


It provides you with a list of issues with ways of fixing

It can’t audit desktop and mobile versions of a website separately

It can be downloaded to your local computer

It allows you to include or exclude certain parts of a website from audit


What is the proper instrument to simulate Googlebot activity in Chrome?


User Agent Switcher

Reverse DNS lookup

User Agent Overrider


How often does combining a robots.txt disallow with a robots.txt noindex statement make folders or URLs appear in SERPs?


Less often than ones without noindex
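For reference, a robots.txt sketch combining both rules (the folder name is a placeholder). Note that the noindex rule in robots.txt was always an unofficial feature, and Google officially dropped support for it in 2019:

```text
User-agent: *
Disallow: /internal-search/
Noindex: /internal-search/
```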




True or false? It is not possible to have multiple robots meta tags.
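For context: multiple robots meta tags in one document are in fact possible, for example one per crawler. A sketch with illustrative directives:

```html
<!-- Directive for all crawlers -->
<meta name="robots" content="noindex, follow">
<!-- An additional, bot-specific directive -->
<meta name="googlebot" content="nosnippet">
```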





Choose two correct statements about a canonical tag:


It is useful to create canonical tag chaining

Each URL can have several rel-canonical directives

Pages linked by a canonical tag should have identical or at least very similar content

It should point to URLs that serve HTTP 200 status codes
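As a syntax reminder, a canonical tag is a single `<link>` element in the page's `<head>` pointing at the preferred URL (example.com is a placeholder), and that target should return HTTP 200:

```html
<link rel="canonical" href="https://www.example.com/preferred-page/">
```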


Fill in the blank. It’s not wise to index search result pages because _____


Google prefers them over other pages because they are dynamically generated and thus very fresh.

those pages are dynamic and thus can create bad UX for the searcher

they do not pass any linkjuice to other pages


PRG (Post-Redirect-Get pattern) is a great way to make Google crawl all the multiple URLs created on pages with many categories and subcategories.





Choose the wrong statement.


Pagination is extremely important in e-commerce and editorial websites

It is important to have all sub-pages of a category being indexed

Proper pagination is required for the overall good performance of a domain in search results

rel=next and rel=prev attributes tell Google which page in the chain comes next or before the current one
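A sketch of this pagination markup as it would appear on page 2 of a category (URLs are placeholders). Note that Google announced in 2019 that it no longer uses rel=prev/next as an indexing signal, though the markup remains valid HTML:

```html
<link rel="prev" href="https://www.example.com/category?page=1">
<link rel="next" href="https://www.example.com/category?page=3">
```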


You have two versions of the same content in HTML (on the website and in PDF). What is the best solution to bring a user to the site with the full navigation instead of just downloading a PDF file?


Using the X-robots rel=canonical header

Introducing hreflang using X-Robots headers

Using the X-robots-tag and the noindex attribute
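Since a PDF has no HTML `<head>` to hold a meta robots tag, the noindex directive is sent as an HTTP response header. A server-side sketch in Apache syntax (assumes mod_headers is enabled):

```apache
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```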


What does the 4XX HTTP status code range refer to?


Server-side errors

Client-side errors



Check all three reasons for choosing a 301 redirect over a 302 redirect:


Link equity will be passed to the new URL

The rankings will be fully transferred to the new URL

The new URL won’t have any redirect chains

To not lose important positions without any replacement


When is it better to use the 410 error rather than the 404? Choose two answers:


If the page can be restored in the near future

When you want to delete the page from the index as quickly as possible and are sure it won’t ever be back

When the page existed and then was intentionally removed, and will never be back

When there is another page to replace the deleted URL


What is the best solution when you know the approximate time of maintenance work on your website?


Using the HTTP status code 200

Using the noindex directive in your robots.txt file

Using the 500 status code with the retry-after header

Using the 503 status code with the retry-after header
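During a planned maintenance window the response could look like this (the retry value, in seconds, is a placeholder):

```http
HTTP/1.1 503 Service Unavailable
Retry-After: 3600
```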


Choose three answers. What information can be found in an access-logfile?


The request URL

The method of the request (usually GET/POST)


The time spent on a URL

The server IP/hostname
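For reference, a single combined-format access-log line (the values are invented) contains the client IP, timestamp, request method and URL, status code, response size, and user agent. Note that time spent on a page is an analytics metric and never appears in a server log:

```text
66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] "GET /category/page-2/ HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
```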


True or false? It is recommended to work with log files constantly, making it a part of the SEO routine rather than doing one-off audits.





Which HTTP code ranges refer to crawl errors? Choose two answers.


5xx range

2xx range

4xx range

3xx range


Choose two statements that are right.


If you overlay your sitemap with your logfiles, you may see a lack of internal links that shows that the site architecture is not working properly

Combining data from logfiles and webcrawls helps compare simulated and real crawler behavior

It is not a good idea to combine different data sources for deep analysis. It’s much better to concentrate on just one data source, e.g. logfile


Choose two answers. Some disadvantages of ccTLDs are:


They need to be registered within the local market, which can make it expensive

They may be unavailable in different regions/markets

They have strong default geo-targeting features, e.g. .fr for French


Choose the two HTTP response status codes that are suitable for any kind of automated, geography-based redirect, i.e. for international requests coming from different geographical regions.


302 and 301

301 and 303

302 and 303


You have site versions for France and Italy and you set up two hreflangs for them. For the rest of your end-users you plan to use the English version of the site. Which directive will you use?


<link rel="alternate" href="" hreflang="uk"/>

<link rel="alternate" href="" hreflang="en-au"/>

<link rel="alternate" href="" hreflang="x-default"/>
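Put together, the full annotation for this scenario might look like this (URLs are placeholders):

```html
<link rel="alternate" href="https://example.com/fr/" hreflang="fr"/>
<link rel="alternate" href="https://example.com/it/" hreflang="it"/>
<!-- Fallback for all users not matched by another hreflang -->
<link rel="alternate" href="https://example.com/" hreflang="x-default"/>
```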


True or false? The SEMrush Site Audit tool allows you only to define issues that slow down your website and does not give any recommendations on how to fix them.





Choose two optimization approaches that are useful for performance optimization:


Avoid using new modern formats like WebP

Increase the number of CSS files per URL

Proper compression & meta data removal for images

Asynchronous requests
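For the asynchronous-request approach, a sketch of non-blocking script loading (file names are placeholders):

```html
<!-- Download and execute as soon as ready, without blocking rendering -->
<script src="analytics.js" async></script>
<!-- Download in parallel but execute in document order after parsing -->
<script src="widgets.js" defer></script>
```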


True or false? Pre-fetch and pre-render are especially useful when you do not depend on 3rd party requests or contents from a CDN or a subdomain.





Fill in the blank. According to the latest statistics, 60% or more of all results for high volume keyword queries in the TOP-3 have already been moved over to run on ______






What are the two valid statements with regard to the critical rendering path (CRP)?


CRP on mobile is bigger than on a desktop

There is an initial view (which is critical) and below-the-fold-content

The non-critical CSS is required when the site starts to render

The “Critical” tool on Github helps to build CSS for CRP optimisation
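A common CRP optimisation sketch: inline the critical above-the-fold CSS and load the full stylesheet without blocking the first render (the file name is a placeholder; the preload pattern follows the widely used loadCSS approach):

```html
<head>
  <style>
    /* critical, above-the-fold rules, e.g. extracted with the "Critical" tool */
  </style>
  <!-- Fetch the full stylesheet without blocking rendering -->
  <link rel="preload" href="styles.css" as="style" onload="this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="styles.css"></noscript>
</head>
```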


Choose the correct statement about mark-up.


Invalid mark-up still works, so there’s no need to control it

Changes in HTML can break the mark-up, so monitoring is needed

Even if GSC says that your mark-up is not valid, Google will still consider it


Choose a valid statement about AMP:


Using AMP is the only way to get into the Google News carousel/box

A regular website can never be as fast as an AMP version

CSS files do not need to be inlined as non-blocking compared to a regular version

AMP implementation is easy, there’s no need to rewrite HTML and build a new CSS


Fill in the blanks. When you want to use _____, make sure they are placed in plain HTML/X-robots tags. _____ injected by JavaScript are considered less reliable, and the chances are that Google will ignore them.


Canonical tags

rel=amp HTML tags

hreflang tags


Which type of mobile website version should you use to check if the “user agent HTTP header” variable is included to identify and provide the relevant web version to the right user agent?


Independent/standalone mobile site

Dynamic serving

Responsive web design