
27 SEO Mistakes That Are Most Commonly Affecting Website Performance in 2020

Webmasters face the daily challenge of keeping up with constant search engine updates while avoiding technical problems on their websites. At the end of this guide you will also find further studies from 2017 and 2019 for comparison.

However, if you know about the most common (and potentially damaging) issues that plague websites right now - from internal linking to page speed - you have a chance of keeping technical problems to a minimum and maximizing performance. Slow page speed, for example, is a technical issue that can be counteracted by acting on the recommendations from Google PageSpeed Insights.

This guide provides you with a comprehensive site audit and technical SEO checklist that will enable you, the webmaster, to do just that - no matter how big or small your website is.


How we collected the data

We used the SEMrush Site Audit to review 250,000 websites from various niches including health, travel, sports, and science to find the most common SEO mistakes that are affecting their performance.

Overall we have analyzed:

  • 310,161,067 web pages
  • 28,561,137,301 links
  • 6,910,489,415 images

This breadth of analysis gave us enough insight to create a comprehensive site audit template that webmasters can use to avoid the most common mistakes on their own websites.

There is no getting around the fact that a careful site audit is time-consuming.

Our study uncovered 27 common mistakes that you obviously cannot fix all at once. We have therefore divided the list into more easily digestible sections, which we have made available to you here as practical templates.


1 HTTP status and server problems

HTTP status is the most common SEO mistake

The most critical technical website problems are often related to HTTP status.

This includes status codes such as 404 (page not found) errors, which represent the server response to requests from a client such as a browser or search engine.

If the dialogue between client and server - or, more simply, between a user and your website - is interrupted or broken off, the user's trust in the website suffers as well.

Serious server problems can not only lead to a loss of traffic due to inaccessible content, but also damage your rankings in the long run if Google does not find suitable results for searchers on your website.
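As a first illustration of what such a check looks like in practice, here is a minimal Python sketch - the URLs are placeholders and the third-party requests library is assumed - that flags pages responding with 4xx or 5xx status codes. A dedicated crawler such as Site Audit is of course far more thorough.

```python
# Minimal sketch: flag pages that return 4xx/5xx status codes.
# Assumes the third-party "requests" library; the URLs are placeholders.
import requests

PAGES = [
    "https://example.com/",
    "https://example.com/old-landing-page",
    "https://example.com/blog/some-post",
]

for url in PAGES:
    try:
        response = requests.get(url, timeout=5, allow_redirects=True)
    except requests.RequestException as error:
        print(f"{url} -> request failed ({error})")
        continue
    if response.status_code >= 400:
        print(f"{url} -> HTTP {response.status_code} (client/server error)")
    else:
        print(f"{url} -> HTTP {response.status_code} OK")
```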

The most common errors related to HTTP status:

1. 4xx errors
4xx error codes mean that a requested page is broken and cannot be reached. They can also appear on working pages if something prevents search engine bots from crawling them.

2. Pages not crawled
This issue is reported when a page cannot be reached for one of two reasons: 1) your website takes more than five seconds to respond, or 2) your server has denied access to the page.

3. Broken internal links
These are links that point to unavailable pages on the same website, which affects user experience (UX) and SEO.

4. Broken external links
These are links that take users to unavailable pages on other websites, which also sends negative signals to search engines.

5. Broken internal images
This issue is reported when an image file no longer exists or its URL contains a typo. Broken images can have a negative impact on your image SEO.

Other common HTTP status errors are:

Permanent redirects
Temporary redirects


2 Under-optimized meta tags

Meta tags help search engines identify the topics of web pages in order to link them to the keywords and phrases that search engine users enter.

Optimal title tags incorporate your most relevant keywords into clear, click-worthy link text for the search engine results pages (SERPs).

Meta descriptions give you another way to include keywords and related expressions. They should be as unique and bespoke as possible.

If you do not create your own meta descriptions, Google will automatically generate a page excerpt as the description of the search result, based on the keywords in the search query. This can result in snippets that do not match the search intent.

Optimized title tags and meta descriptions must contain the most relevant keywords, have the right length and avoid duplication if possible.

Some businesses, such as online fashion retailers, cannot realistically write a unique description for every single product. They must therefore demonstrate clear value with the other text content on their pages.

Wherever writing individual metadata for each page is practically feasible, use this opportunity to create the greatest possible impact in the SERPs for your website.
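To make the length guideline concrete, here is a small Python sketch with made-up page data; the character limits (roughly 60 for titles, roughly 155 for descriptions) are commonly cited rules of thumb, not values fixed by Google.

```python
# Sketch: check title and meta description lengths against commonly cited limits.
# The page data is made up for illustration; real values would come from a crawl.
TITLE_MAX = 60         # titles longer than this are usually truncated in the SERPs
DESCRIPTION_MAX = 155  # rough guideline for meta descriptions

pages = {
    "/shoes/red-sneakers": {
        "title": "Red Sneakers - Free Shipping | Example Shop",
        "description": "",  # missing meta description
    },
    "/shoes/blue-sneakers": {
        "title": "Blue Sneakers with Extra Comfortable Soles for Every Occasion and Season",
        "description": "Shop blue sneakers with free shipping and 30-day returns.",
    },
}

for url, meta in pages.items():
    title, description = meta["title"], meta["description"]
    if not description:
        print(f"{url}: missing meta description")
    elif len(description) > DESCRIPTION_MAX:
        print(f"{url}: meta description too long ({len(description)} chars)")
    if len(title) > TITLE_MAX:
        print(f"{url}: title likely truncated in SERPs ({len(title)} chars)")
```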

The most common meta tag errors:

6. Duplicate title tags and meta descriptions
Two or more pages with the same titles and descriptions make it difficult for search engines to correctly determine their relevance and thus rankings.

7. Missing H1 tags
Search engines can use H1 tags to determine the subject of your content. If they are missing, there will be gaps in Google's understanding of your website.

8. Missing meta descriptions
Well-written meta descriptions help Google assess the relevance of the page and motivate users to click on your result. If they are missing, there is a risk of falling click rates. Relevance and user signals on the search results pages are among the most important ranking factors.

9. Missing ALT attributes
ALT attributes provide search engines and visually impaired users with descriptions of the images in your content. Without them, relevance is lost and user engagement can drop (a simple check is sketched at the end of this section).

10. Duplicate content in H1 and title tags
If the H1 tag and the title tag are identical on a page, it can look like over-optimization. In addition, you often miss out on opportunities to rank for other relevant keywords.

Other common mistakes with meta tags are:

Short / long title elements
Multiple H1 tags
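As announced under issue 9, here is a minimal sketch - standard library only, with a made-up HTML fragment - that lists images missing an ALT attribute:

```python
# Sketch: find <img> tags without an ALT attribute in a piece of HTML.
# Uses only the standard library; the HTML snippet is a made-up example.
from html.parser import HTMLParser

class MissingAltFinder(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attributes = dict(attrs)
            if not attributes.get("alt"):
                print(f"Missing or empty ALT: {attributes.get('src', '(no src)')}")

html = """
<img src="/images/team-photo.jpg" alt="Our support team at the Berlin office">
<img src="/images/banner.png">
"""

MissingAltFinder().feed(html)
```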


3 Duplicates

Duplicate content can damage your rankings - and it can take a while to recover.

Avoid duplicating content from any type of website, whether it is a direct competitor or not.

Check your website for duplicate descriptions, paragraphs or sections of text, identical H1 tags on multiple pages, and URL problems such as WWW and non-WWW versions of the same page.

Make sure that every detail is unique so that your pages not only deserve a ranking in the eyes of Google, but are also attractive to users.

The most common duplication problems:

11. Duplicate content
Site Audit flags duplicate content, for example when the same content is available under several URLs on your website or when pages share identical text. You can fix this by adding a rel="canonical" link to one of the duplicates or by setting up a 301 redirect (see the sketch after this list).

Other common duplication problems are:

Duplicate content in H1 and title tags
Duplicate meta descriptions
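To illustrate the 301 option mentioned above for one classic case, the WWW and non-WWW duplicate, here is a minimal sketch; it uses the Flask framework and a placeholder setup and is just one possible implementation, not the Site Audit's own fix:

```python
# Minimal Flask sketch (placeholder routes/domains): consolidate the "www"
# duplicate of every page onto the non-www host with a permanent 301 redirect.
# Alternatively, a rel="canonical" link in the <head> of the duplicate pages
# can point search engines to the preferred URL.
from flask import Flask, redirect, request

app = Flask(__name__)

@app.before_request
def force_canonical_host():
    # If the request arrived on the www host, send a 301 redirect to the same
    # path on the bare domain, so search engines only ever see one URL.
    if request.host.startswith("www."):
        canonical = request.url.replace("://www.", "://", 1)
        return redirect(canonical, code=301)

@app.route("/")
def home():
    return "Hello"
```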


4 Insufficiently optimized external and internal links

The links that lead your visitors through the customer journey largely determine how user-friendly your site is. Any deficiencies here also affect the performance of the pages concerned in web search: pages with a poor user experience tend to rank worse on Google.

Our study found that almost half of the websites we checked with the Site Audit had problems with internal and external links. This suggests that their link architectures are not optimized.

Some of the links themselves have underscores in the URLs, contain nofollow attributes, and point to HTTP rather than HTTPS pages - all of which can affect rankings.

With the Site Audit you can easily find broken links on your website. The next step is to identify which are having the greatest impact on user interaction and correct them in order of priority.
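The following minimal Python sketch illustrates the kind of check involved: it fetches a single placeholder URL with the requests library, extracts link targets with a simple regular expression (a real crawler would use a proper HTML parser), and flags links that still point to HTTP pages or contain underscores - two of the problems listed below.

```python
# Sketch: flag links on a page that point to insecure HTTP URLs or contain underscores.
# Placeholder URL; a simple regex stands in for a real HTML parser.
import re
import requests

PAGE = "https://example.com/"

html = requests.get(PAGE, timeout=5).text
links = re.findall(r'href="([^"]+)"', html)

for link in links:
    if link.startswith("http://"):
        print(f"Insecure link target: {link}")
    if "_" in link:
        print(f"URL with underscores: {link}")
```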

The most common linking problems:

12. Links on an HTTPS website that point to HTTP pages
Links to old HTTP pages can lead to an insecure data exchange between the user and the server. So make sure all of your links are up to date.

13. URLs with underscores
Search engines may misinterpret underscores as part of a word rather than as a separator, which can distort how your URLs are indexed. Use hyphens in your URL structure instead.

Other common mistakes when linking are:

Broken internal links
Broken external links
Nofollow attributes in external links
Pages with only one internal link
Page crawl depth of more than 3 clicks


5 Crawlability Problems

In addition to indexability, crawlability is one of the decisive indicators for the technical condition of a website.

The crawlability of your website directly affects your chances of getting good rankings in the SERPs.

If you ignore technical crawling issues and pages cannot be crawled, those pages will not be as visible to Google as they should be.

Fix these issues to help Google surface the right pages for the right users in the SERPs.

You can avoid technical crawling problems by examining your site for broken or blocked elements that limit crawlability.

Kevin Indig, VP of SEO & Content at G2.com, emphasizes the importance of the synergy between sitemaps and robots.txt:

I was surprised that many XML sitemaps are not referenced in the robots.txt. That seems like a standard to me. I am less surprised by the high proportion of websites that have pages with only one internal link, or even orphaned pages. This is a classic website structure problem that only SEO professionals are aware of.

If an XML sitemap isn't listed in your robots.txt, it can cause search engine crawlers to misinterpret your site architecture, according to Matt Jones, SEO and CRO manager at Rise at Seven:

Sitemap.xml files help crawlers identify and find the URLs present on your website. They are therefore a fantastic way to give search engines a full understanding of your website and thereby get higher rankings for more relevant terms.
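A minimal sketch of such a check, assuming the requests library and a placeholder domain: it looks for Sitemap: directives in robots.txt and, if one is found, verifies that the URLs listed in the sitemap return a 200 status without redirects (see issues 15 to 17 below).

```python
# Sketch: check that robots.txt references a sitemap and that the sitemap's URLs return 200.
# Placeholder domain; assumes the third-party "requests" library.
import re
import requests

SITE = "https://example.com"

robots = requests.get(f"{SITE}/robots.txt", timeout=5).text
sitemaps = [line.split(":", 1)[1].strip()
            for line in robots.splitlines()
            if line.lower().startswith("sitemap:")]

if not sitemaps:
    print("No Sitemap: directive found in robots.txt")

for sitemap_url in sitemaps:
    xml = requests.get(sitemap_url, timeout=5).text
    for url in re.findall(r"<loc>(.*?)</loc>", xml):
        response = requests.get(url, timeout=5, allow_redirects=False)
        if response.status_code != 200:
            print(f"{url} -> HTTP {response.status_code} (sitemap URLs should return 200)")
```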

The most common problems website crawlers encounter:

14. Nofollow attributes in outbound internal links
Internal links with nofollow attributes prevent link authority from flowing through your website.

15. Wrong pages in sitemap.xml
Your sitemap.xml should not contain any broken pages. Check it for redirect chains and non-canonical pages, and make sure the listed pages return a status code of 200.

16. Sitemap.xml not found
Missing sitemaps make it difficult for search engines to find, crawl, and index your website pages.

17. Sitemap.xml not specified in robots.txt
Without a link to your sitemap.xml in your robots.txt file, search engines cannot fully understand the structure of your website.

Other common crawlability errors are:

Pages not crawled
Broken internal images
Broken internal links
URLs with underscores
4xx errors
Resources formatted as page links
Blocked external resources in robots.txt
Nofollow attributes in outbound external links
Crawling blocked
Pages with only one internal link
Orphaned sitemap pages
Page crawl depth of more than 3 clicks
Temporary redirects


6 Problems with indexability

Good indexability is crucial for SEO. Put simply, if a page isn't indexed, it's invisible to the search engine in question - and to its users too.

Many factors can prevent your website from being indexed even if you don't experience crawlability issues.

For example, duplicate metadata and content can make it difficult for search engines to identify which pages to display for similar search terms.

As part of our research, almost half of the websites we examined showed indexing problems caused by duplicate title tags, descriptions, and textual content.

This can force Google to make its own decisions about which pages should appear in the search result. But webmasters can avoid such problems and clearly tell Google which page is dealing with which topic.

A number of different problems can affect the indexability of your website, from low word count to hreflang gaps or conflicts with multilingual websites.

The most common problems with non-indexable websites:

18. Short / long title tags
Title tags with more than about 60 characters are truncated in the SERPs, while titles that are significantly shorter may miss opportunities for further optimization.

19. Hreflang conflicts in the page source code
Multilingual websites can confuse search engines if the hreflang attributes in a page's source code conflict with each other.

20. Problems with incorrect hreflang links
Broken hreflang links can lead to indexing problems, for example when relative URLs are used instead of absolute ones, e.g. /blog/your-article instead of https://yourwebsite.com/blog/your-article (a check for this is sketched after this list).

21. Low word counts
The site audit can point out pages that offer too little content. It's worth checking them out and making them as informative as possible.

22. Missing hreflang and lang attributes
This problem is reported when a page on a multilingual website is missing the links or tags necessary to tell search engines what content should be displayed to the users in the individual regions.

23. AMP HTML problems
This issue affects mobile users of your website and is reported when the HTML code does not conform to AMP standards.
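As referenced under issue 20, here is a small sketch - standard library only, with a made-up HTML fragment - that flags hreflang annotations whose href is relative rather than absolute:

```python
# Sketch: flag hreflang link elements that use relative instead of absolute URLs.
# The HTML snippet is a made-up example; uses only the standard library.
from html.parser import HTMLParser

class HreflangChecker(HTMLParser):
    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "link" and attributes.get("rel") == "alternate" and "hreflang" in attributes:
            href = attributes.get("href", "")
            if not href.startswith(("http://", "https://")):
                print(f"hreflang '{attributes['hreflang']}' uses a relative URL: {href}")

head = """
<link rel="alternate" hreflang="en" href="https://yourwebsite.com/blog/your-article">
<link rel="alternate" hreflang="de" href="/blog/dein-artikel">
"""

HreflangChecker().feed(head)
```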

Other common problems with indexability include:

Duplicate H1 tags
Duplicate content
Duplicate title tags
Duplicate meta descriptions
Missing H1 tags
Multiple H1 tags
Hreflang conflicts in the page source code


7 Neglect of Accelerated Mobile Pages (AMP)

It is important that you also gear your on-page SEO towards mobile SEO and a mobile-friendly website.

We know that from September 2020, mobile-friendliness will factor into Google rankings on both mobile devices and desktops.

Until then, as a webmaster, you should make sure that your website's HTML follows Google's AMP guidelines to be mobile ready and avoid possible damage to your rankings.

Use Site Audit to check your website for invalid AMP pages to see what needs to be fixed. Problems can arise with the HTML code, styles, layout or page templates.

The most common problem with mobile friendliness:

AMP HTML problems can occur with styles or layout and, as mentioned above, affect the indexability of a website.

Read this study on the most common AMP mistakes.


8 Neglect of website performance

Page load time is an important issue for SEO and online marketing, and its importance is increasing. The slower your website is, the fewer people will have the patience to wait for it to load.

You can get suggestions for improving the speed of your website on mobile and desktop directly from Google: learn how to measure page load time, improve your Google PageSpeed Insights score, and identify ways to make your website faster.
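As a very rough starting point for measuring, the sketch below times only the delivery of the HTML document from a placeholder URL (using the requests library); it does not capture the full render that PageSpeed Insights evaluates.

```python
# Rough sketch: time how long the server takes to deliver the HTML document.
# This is only the HTML download, not the full page render measured by PageSpeed Insights.
import requests

response = requests.get("https://example.com/", timeout=10)
print(f"HTML delivered in {response.elapsed.total_seconds():.2f} s "
      f"({len(response.content) / 1024:.0f} KiB)")
```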

For example, Google's speed test, in conjunction with the SEMrush Site Audit, can reveal overcomplicated JavaScript or CSS files, as found on many websites in our study.

Gerry White, SEO Director at Rise at Seven, points out that code minification is a quick win for website performance and user experience:

One of the noticeable aspects of the data is the number of quick wins for the load time. It's not just about rankings, but also about the user and the conversions. So here I would focus on the simple, quick wins that can usually be made without undue development effort. Tasks like compressing JavaScript and CSS take minutes, but can be a huge improvement on many websites. At the same time, every webmaster should make sure that HTTPS is enabled with HTTP/2.
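As a trivial illustration of the minification idea, the sketch below strips comments and surplus whitespace from a CSS string; real projects should rely on a proper build-step minifier rather than this kind of regex approach.

```python
# Naive sketch: strip comments and surplus whitespace from a CSS string.
# Real builds should use a proper minifier; this only illustrates the idea.
import re

def minify_css(css: str) -> str:
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.DOTALL)  # drop comments
    css = re.sub(r"\s+", " ", css)                         # collapse whitespace
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)           # trim around punctuation
    return css.strip()

example = """
/* main navigation */
.nav {
    color: #333;
    margin: 0 auto;
}
"""

print(minify_css(example))
```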

The most common website performance issues:

24. Mixed content issues
This problem is similar to number 12 in that it occurs when parts of your website are not secured using the HTTPS protocol. Replace all HTTP links with the new HTTPS versions.

25. Long loading time (HTML)
The time it takes for a page to be fully rendered by a browser should be as short as possible, as the speed has a direct impact on your rankings.

26. JavaScript and CSS files without caching
This problem can affect the loading time of your website and occurs if no browser caching is specified in the response header.

27. JavaScript and CSS files without minification
The aim here is to make JavaScript and CSS elements smaller. Remove unnecessary lines, comments, and spaces to speed up the page.


9 Cross-Category Problems

In some cases, the errors, warnings, and notices reported by Site Audit cover multiple categories.

This means that they can cause a whole range of problems for your website at once, so it is advisable to treat them as priorities.


Regular site audits are important

If you make any of these SEO mistakes, your website may not be able to reach its full potential. It is therefore important that you as a webmaster keep an eye on this potential for error through regular site audits.

You can use this checklist to prevent rolling snowballs from turning into avalanches - whether it's crawlability issues preventing pages from being indexed or duplication issues that risk Google penalties.

Make it a habit to take care of the SEO and UX status of your website with tools like Site Audit. As a reward for the effort, you gain visibility and user interest on a scale that has a noticeably positive effect on your business results.

Download and print our PDF poster of the top SEO mistakes so you can always have the checklist close at hand.

Further studies for comparison

The 80 Most Common Ecommerce Website Mistakes - A 2019 Study

To support the entrepreneur community, we used the SEMrush Site Audit to research the errors and problems most commonly affecting e-commerce websites. We searched 1,300 online stores for 80 technical and SEO problems. These range from minor disruptive factors to serious errors.

The analysis takes into account all common OnPage and technical SEO problems, including problems with the HTTPS implementation, hreflang attributes, crawlability, page architecture and more.

80 SEO mistakes in online shops

40 SEO technical mistakes - a 2017 study

We carried out a study on the frequency of various SEO mistakes in 2017. To do this, we used the SEMrush Site Audit to collect anonymous data from 100,000 websites and 450 million individual pages in order to determine the most common technical on-site and SEO errors as well as optimization gaps.

Below you will find the current statistics for 40 test criteria, divided into three categories:

  • Crawlability
  • Technical SEO
  • Onpage SEO

At the same time, we have added a new dimension to the methodology: a rating of the urgency of every SEO problem. For this we use a scale from 1 to 5, with 5 being the highest urgency.

This indicator not only shows the influence of the error on the search engine ranking, but also its importance for website performance and user-friendliness. Such a rating seemed necessary to us because some problems rarely arise but are highly urgent. These should not be overlooked.

Together, the available data on frequencies and urgencies provide you with a complete picture of the errors investigated. In this way, you can assess the associated risks in the best possible way and set sensible priorities after your website audit.

40 SEO Mistakes 2017