How to quickly check the technical condition of a website?

It is not always possible to conduct a full technical audit of a website on your own. But you can quickly identify the main problems and what needs to be fixed to improve the resource's performance, without involving specialists.

Technical errors on the site are a serious obstacle on the way to the TOP of search results. They get in the way of users and search engines and negatively affect the ranking and indexing of pages.

When do I need to check the technical condition of the site?

Checking the technical condition of the site is not a one-time task. Over the life cycle of the resource, it needs to be done regularly.

A website technical audit is carried out when:

  • the launch of a new project is being prepared;
  • promotional activities are ineffective;
  • positions in search have dropped;
  • an audit of the work of the employees responsible for the site is needed.

First of all, you need to understand that almost any website has technical errors. Technical flaws tend to appear and gradually accumulate as the site is revised, functionality and services are added, and content changes. They affect rankings even when promotion activities are done correctly.

Google changes its site requirements from time to time and publishes recommendations and guidelines for webmasters. You can check how well the site meets these requirements, and which critical errors need fixing first, on your own using free public tools.

Checklist for checking the technical condition of the site

A full-fledged technical audit of a website is difficult to carry out without the appropriate knowledge and skills. However, with a basic check, you can quickly find weaknesses and outline a plan to fix them.

We recommend checking:

  • page loading speed;
  • optimization errors: duplicate pages and content, incorrect redirect settings, broken links;
  • URL correctness;
  • absence of "mirrors" and of pages available at different addresses;
  • presence of sitemap and robots.txt files;
  • errors and messages in Google Search Console;
  • use of the https protocol;
  • correctness of the mobile version.

Let’s consider the items of the checklist in more detail.

Checking the loading speed

A slow-loading site loses traffic. Seconds matter. In addition, speed is one of the key search engine ranking factors.

If a check in a service like PageSpeed Insights shows scores in the red zone, this is cause for concern.

The service checks both the desktop and mobile versions of the site.

You can also use the Netpeak Spider and Pingdom services to check the speed.
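For a rough first look before running the full services, you can time the raw HTML download yourself. The sketch below uses only the Python standard library; note that it measures only the HTML transfer, not the full render time that PageSpeed Insights scores.

```python
import time
import urllib.request

def measure_load_time(url: str, timeout: float = 10.0) -> float:
    """Return the time in seconds taken to download the page HTML.

    This is a crude proxy: it captures server response and transfer
    time only, not rendering, scripts, or images.
    """
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()
    return time.monotonic() - start
```

Run it against several key pages, not just the home page: a consistently high number here usually means a slow server response, which the online services will confirm and break down in more detail.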

Optimization errors

To find optimization errors, use crawlers such as Netpeak Spider, Screaming Frog, and WebSite Auditor.

A basic check with these tools can be carried out free of charge. They find:

  • "broken" links that return a 404 server response code;
  • duplicated meta tags;
  • missing meta tags and headings;
  • pages closed from indexing;
  • redirects;
  • pages with incorrect canonical tags.
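The broken-link part of what these crawlers do can be sketched in a few lines of standard-library Python: fetch a page, extract its links, and check the response code of each. This is a simplified illustration, not a replacement for the tools above.

```python
import urllib.error
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html: str, base_url: str) -> list:
    """Return absolute URLs of all links found in the page HTML."""
    parser = LinkExtractor()
    parser.feed(html)
    return [urljoin(base_url, href) for href in parser.links]

def check_status(url: str) -> int:
    """Return the HTTP status code; 404 marks a broken link."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code
```

A full crawler would also follow the extracted links recursively, respect robots.txt, and throttle requests; dedicated tools handle all of that for you.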

It is also very important to check the site in Google Search Console. Pay special attention to:

  • error pages;
  • ease of use on mobile devices;
  • indexation status.

URL validation

Page URLs should be human-readable (SEO-friendly) – this helps with ranking. A human-readable URL reflects the content of the page and is well received by users. And if it contains keywords and is composed without errors, it helps the page rank in search.

The presence of the following in the URL is considered an error:

  • underscores;
  • uppercase letters;
  • spaces;
  • Cyrillic characters;
  • query parameters in indexable links.
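These flaws are simple enough to detect automatically. A minimal sketch with regular expressions, covering the list above (the check names and patterns are my own heuristics, applied to the path-and-query part of a URL rather than the scheme or domain):

```python
import re

# One pattern per URL flaw listed above.
URL_CHECKS = {
    "underscore": re.compile(r"_"),
    "uppercase": re.compile(r"[A-Z]"),
    "space": re.compile(r" |%20"),
    "cyrillic": re.compile(r"[\u0400-\u04FF]"),
    "query parameters": re.compile(r"\?"),
}

def url_problems(path: str) -> list:
    """Return the names of the checks that the given URL path fails."""
    return [name for name, pattern in URL_CHECKS.items()
            if pattern.search(path)]
```

Running the checker over every URL exported from a crawler quickly shows which pages need their addresses cleaned up.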

Robots.txt and sitemap files

The site must have robots.txt and sitemap files. Robots.txt sits in the root directory on the server and, together with the sitemap, tells search robots which pages of the site should be indexed first. The presence of these files is important for ranking, yet not all sites have them.

Check if they are on your resource. Robots.txt can be found at https://site.com/robots.txt, where site.com is your site’s address.

The sitemap can be trickier. Its address should be specified in robots.txt, but this is not always done. You can try to find the sitemap by entering one of these addresses in your browser:

  • https://site.com/sitemap;
  • https://site.com/sitemap.xml;
  • https://site.com/sitemap_index.xml.
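The probing of candidate addresses can be automated. A small sketch using only the standard library; `site.com` is a placeholder for your domain, and the candidate list simply mirrors the common locations mentioned above:

```python
import urllib.error
import urllib.request

# Common sitemap locations, standard sitemap.xml first.
SITEMAP_PATHS = ["/sitemap.xml", "/sitemap_index.xml", "/sitemap"]

def sitemap_candidates(site: str) -> list:
    """Build full candidate sitemap URLs for a site root URL."""
    return [site.rstrip("/") + path for path in SITEMAP_PATHS]

def find_sitemap(site: str):
    """Return the first candidate that answers with HTTP 200, else None."""
    for url in sitemap_candidates(site):
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                if resp.status == 200:
                    return url
        except urllib.error.URLError:
            # Covers HTTPError (404 etc.) and connection failures.
            continue
    return None
```

If nothing is found this way, check robots.txt for a `Sitemap:` line, or look in Google Search Console.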

Full information about robots.txt and sitemap, as well as how well they are composed, is available in Google Search Console.

Lack of “mirrors” and availability of pages at different addresses

If a site or its individual pages are available at different addresses, the search engine perceives them as different resources.
You can check for site "mirrors" by entering these addresses:

  • http://site.com/;
  • http://www.site.com/;
  • https://site.com/;
  • https://www.site.com/.

Entering any of them should lead to the same page, with the same address shown in the browser's address bar – the site's main mirror.
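The same check can be scripted: generate the four variants of an address, then follow redirects for each and confirm they all land on one canonical URL. A standard-library sketch:

```python
import urllib.request
from urllib.parse import urlsplit, urlunsplit

def mirror_variants(url: str) -> list:
    """Generate the four http/https x www/non-www variants of an address."""
    parts = urlsplit(url)
    bare_host = parts.netloc.removeprefix("www.")
    return [
        urlunsplit((scheme, host, parts.path or "/", "", ""))
        for scheme in ("http", "https")
        for host in (bare_host, "www." + bare_host)
    ]

def final_url(url: str) -> str:
    """Follow redirects and return the address a browser would end on."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.geturl()
```

If `final_url` returns different addresses for different variants, the redirects to the main mirror are not configured correctly.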

Https protocol

Google considers all sites that work without the https protocol and a valid SSL certificate to be insecure. You can see whether a site uses the protocol in the browser's address bar.

To check the security of an SSL certificate, you can use the service https://www.ssllabs.com/ssltest/analyze.html
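A basic certificate check can also be done locally with Python's ssl module: open a TLS connection (which verifies the certificate chain along the way) and read the expiry date. This only covers validity and expiry; the SSL Labs test analyzes far more, such as protocol versions and cipher strength.

```python
import socket
import ssl
from datetime import datetime, timezone

def parse_not_after(not_after: str) -> datetime:
    """Parse getpeercert()'s 'notAfter' field, e.g. 'Jun  1 12:00:00 2026 GMT'."""
    return datetime.strptime(
        not_after, "%b %d %H:%M:%S %Y %Z"
    ).replace(tzinfo=timezone.utc)

def cert_expiry(hostname: str, port: int = 443) -> datetime:
    """Open a TLS connection and return the certificate expiry date (UTC).

    create_default_context() verifies the certificate chain, so an
    invalid certificate raises ssl.SSLCertVerificationError here.
    """
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    return parse_not_after(cert["notAfter"])
```

Comparing the returned date with today's date tells you how soon the certificate needs renewal.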

Correct mobile version

Many users browse pages only from mobile devices, so the site needs a correct mobile version. In addition, Google has switched to mobile-first indexing, meaning it primarily analyzes the mobile version of the site rather than the desktop one.

Google's Mobile-Friendly Test service will help you check how your project is perceived by mobile users.

Conclusions

These checks can be performed on your own to quickly assess the technical condition of the site. However, in case of serious problems, it is better to entrust a full technical audit and error fixing to specialists. It will take time, but it will provide more complete and accurate information and make it possible to find errors and identify their causes. The further action plan will depend on this.

By itself, technical optimization does not make a page more relevant to the user's query, but it removes the obstacles that prevent the page from reaching the TOP.

Services for checking the technical condition of the site

For each application: tools, format, and terms of use.

  • Netpeak Spider – loading speed, response codes, server checks; checks title, meta description, H1 headings, redirects. Desktop. Free version with basic functionality.
  • Screaming Frog – finds broken links, errors, redirects; checks URL correctness; analyzes images. Desktop. Free version with basic functionality.
  • PageSpeed Insights – tests website loading speed for desktop and mobile devices. Online. Free.
  • Google Search Console – checks the site for the main technical parameters. Online. Free.
  • Google Mobile-Friendly Test – checks the usability of site pages for mobile users. Online. Free.
  • W3C – code validation. Online. Free.
  • WebSite Auditor – checks broken links and images; finds duplicate content; evaluates adaptability to mobile devices; finds code errors. Desktop. Free.
  • Pingdom – site ping test. Online. Free 14-day trial period.
