At a time when most businesses, institutions, and organizations are online, each one is vying for a first-page rank on Search Engines, mainly Google. Very few website designers focus on SEO, and those who believe in conducting a technical SEO audit of a website before beginning SEO work are fewer still.

Unless an SEO audit is done, how does one assess what needs to be done to improve a website’s SERPs? We will not talk about paid advertisements and PPC here; only pure Search Engine Optimization, and specifically On-Page SEO. A technical SEO audit of a website can be very fruitful if done with the right tools. We will discuss here the basic steps to conduct such an audit. So, let’s begin.

Hostname

A website can be accessed using different URLs: http://domain.com, http://www.domain.com, https://domain.com, or https://www.domain.com. SEO best practices suggest choosing one canonical version and redirecting the others to it, so that they all point to the same site.
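In practice this consolidation is done with server-side 301 redirects. As a minimal sketch, the mapping from the four variants onto one canonical URL can be expressed in Python; the canonical host www.domain.com and the function name are assumptions for illustration only:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url, canonical_host="www.domain.com"):
    """Map any scheme/hostname variant of the site onto one canonical
    HTTPS URL. The actual redirect is configured on the web server;
    this function only illustrates the target of that mapping."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host in ("domain.com", "www.domain.com"):
        host = canonical_host
    return urlunsplit(("https", host, parts.path, parts.query, parts.fragment))
```

During an audit, requesting each variant and confirming it redirects to the canonical form achieves the same check against the live site.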

Title Tag

The title tag of a web page is the first tag added in the HEAD section of the page. It is visible to Search Engines as well as to visitors, and Search Engine listings show the title tag of the page first. It should be a short phrase reflecting the keywords and the content of the page. During a technical SEO audit, we check that the title tag is present (it should be; some website creators do not bother to write one), that it is not too long, that there are no multiple title tags on a page, and that multiple pages do not share the same title tag.
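These checks are easy to automate. Here is a sketch using Python's standard html.parser; the 60-character limit and the names TitleAudit and audit_title are my own choices for the example, not fixed rules:

```python
from html.parser import HTMLParser

class TitleAudit(HTMLParser):
    """Collect every <title> found in a page so that missing,
    duplicate, or over-long title tags can be flagged."""
    def __init__(self):
        super().__init__()
        self.titles = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
            self.titles.append("")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.titles[-1] += data

def audit_title(html, max_len=60):
    parser = TitleAudit()
    parser.feed(html)
    issues = []
    if not parser.titles:
        issues.append("missing title")
    if len(parser.titles) > 1:
        issues.append("multiple titles")
    if any(len(t) > max_len for t in parser.titles):
        issues.append("title too long")
    return issues
```

Running audit_title over every page, then comparing the collected titles across pages, also catches the "same title tag on multiple pages" problem.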

Access

The idea here is to check that no page is more than three clicks away from the home page. When a website is structured this way, its pages are more easily accessible to Search Engines and visitors alike.
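The three-click rule can be verified with a breadth-first search over the site's link graph. A sketch, assuming a crawl has already produced a dictionary mapping each page to its outgoing links (the structure shown is hypothetical):

```python
from collections import deque

def click_depths(links, start="/"):
    """links: dict mapping each page to the pages it links to.
    Returns the minimum number of clicks from the home page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

site = {"/": ["/about", "/blog"], "/blog": ["/blog/post-1"]}
# Flag pages more than 3 clicks from the home page.
deep = [p for p, d in click_depths(site).items() if d > 3]
```

Pages missing from the result entirely are unreachable from the home page, which is an even more serious structural problem.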

Anchor Text

Anchor text is the clickable text of a link on a web page. When doing an SEO audit, we check whether the anchor text is relevant (in terms of keywords) to the page it links to. SEO best practices suggest that linked text should not be a generic “click here” but something like “more about anchor text”.
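Relevance needs a human eye, but generic labels can be flagged automatically. A small sketch with Python's standard html.parser; the list of generic phrases is my own illustrative choice:

```python
from html.parser import HTMLParser

GENERIC = {"click here", "here", "read more", "more", "link"}

class AnchorAudit(HTMLParser):
    """Collect anchor texts so generic labels like 'click here'
    can be flagged during an audit."""
    def __init__(self):
        super().__init__()
        self.anchors = []
        self._in_a = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_a = True
            self.anchors.append("")

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_a = False

    def handle_data(self, data):
        if self._in_a:
            self.anchors[-1] += data

def generic_anchors(html):
    parser = AnchorAudit()
    parser.feed(html)
    return [a for a in parser.anchors if a.strip().lower() in GENERIC]
```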

Redirects

Some businesses register more than one domain name and have them all redirect to one particular website. This practice is viewed negatively in SEO best practices and should be avoided; in fact, the domains involved can lose their rating because of such redirections.

Broken Links

If the links on a website point to pages that do not exist, they are called broken links. Check for broken links when doing a technical SEO audit, and rectify them. The cause could be a minor spelling mistake or a simple .html replaced with .htm, but it still counts as a broken link and is penalised by Search Engines.
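A broken-link check has two halves: collecting the links from each page, then requesting each one and treating 4xx/5xx responses (or no response) as broken. A sketch using only the standard library; link_status performs real network requests, so it is illustrative rather than something to run blindly against a large site:

```python
from html.parser import HTMLParser
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

class LinkCollector(HTMLParser):
    """Gather the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def link_status(url):
    """Return the HTTP status for a URL, or None if unreachable.
    Statuses of 400 and above indicate a broken link."""
    try:
        with urlopen(Request(url, method="HEAD")) as resp:
            return resp.status
    except HTTPError as e:
        return e.code
    except URLError:
        return None
```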

Dead End Pages

To keep a website navigable, every web page should link to other web pages; if nothing else, it should link back to the home page or to the page just above it in the hierarchy. A web page that does not link to any other web page is called a Dead End Page.
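Detecting a dead end page amounts to counting outgoing links. A minimal sketch with the standard html.parser; the helper names are assumptions for the example:

```python
from html.parser import HTMLParser

class OutlinkCounter(HTMLParser):
    """Count outgoing <a href> links; zero means a dead end page."""
    def __init__(self):
        super().__init__()
        self.outlinks = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(n == "href" and v for n, v in attrs):
            self.outlinks += 1

def is_dead_end(html):
    parser = OutlinkCounter()
    parser.feed(html)
    return parser.outlinks == 0
```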

Long URLs

There are different schools of thought here. Some say shorter URLs tend to rank better; some say Google does not use URL length when ranking pages. There is no formal limit on the length of a page URL, but to render correctly in all browsers a URL must be shorter than 2,083 characters, and the ideal length is between 50 and 60 characters. I am talking about the length of page URLs here, not the length of the domain name.
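A URL-length check is trivial to script. The thresholds below are the ones quoted above; the function name is a hypothetical example:

```python
def url_length_issues(url, ideal=60, hard_limit=2083):
    """Flag URLs past the ~60-character ideal, or past the 2,083-character
    limit that older browsers imposed."""
    issues = []
    if len(url) > hard_limit:
        issues.append("exceeds browser limit")
    elif len(url) > ideal:
        issues.append("longer than ideal")
    return issues
```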

Duplicate content

Duplicate content is a very serious issue in terms of SEO best practices. If a piece of content is copied from one web page and used on another, both pages tend to lose their rankings. Whenever you host a web page, make sure the content is original. There are many tools available on the internet to check for plagiarism or duplicate content.
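For exact duplicates, a normalized content fingerprint is a quick first pass; it will not catch the partial overlap that dedicated plagiarism tools detect, so treat this only as a crude sketch:

```python
import hashlib
import re

def content_fingerprint(text):
    """Hash of whitespace- and case-normalized text. Identical
    fingerprints on two pages indicate exact duplicate content;
    near-duplicates require fuzzier comparison."""
    normalized = re.sub(r"\s+", " ", text).strip().lower()
    return hashlib.sha256(normalized.encode()).hexdigest()
```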

Robots.txt

Robots.txt is a file at the root of the web server that tells Search Engines which pages to crawl and index. The crawl instructions are specified with the terms Allow and Disallow in a plain text file. Be very careful about the syntax when writing a robots.txt file. As an example, “User-agent: * Disallow: /” tells all web crawlers not to crawl any page on the website, while “User-agent: * Disallow:” tells all web crawlers that they may crawl all the pages on the website. The difference is a single slash.
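Python's standard urllib.robotparser can verify what a given robots.txt actually permits, which makes the one-slash difference easy to demonstrate; the helper name and the domain are assumptions for the example:

```python
from urllib.robotparser import RobotFileParser

def allows(robots_txt, url, agent="*"):
    """Check whether a robots.txt body allows an agent to crawl a URL."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(agent, url)

block_all = "User-agent: *\nDisallow: /"   # the trailing slash blocks everything
allow_all = "User-agent: *\nDisallow:"     # an empty Disallow blocks nothing
```

Running a site's real robots.txt through a check like this before deploying it is a cheap way to avoid accidentally de-indexing the whole website.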

Low word count

Web pages with a word count lower than 300 tend not to be placed at the top of the rankings. Try to make your web pages rich in content, with a word count between 500 and 800.
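A thin-content check is a one-liner once the page's visible text has been extracted; the threshold below is the 300-word figure from above:

```python
def word_count_ok(text, minimum=300):
    """Flag thin pages: under 300 words tends to rank poorly,
    with 500-800 words as the suggested range."""
    return len(text.split()) >= minimum
```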

Image Descriptions

Search Engines and web crawlers cannot understand the images placed on a web page, and visually impaired users relying on screen readers cannot see them. So alt attributes (often called alt tags) are used in the HTML code to describe the appearance and function of an image on the page. The alt text is also displayed on the web page if the image loads slowly or cannot be loaded. It is an SEO best practice to include alt text for every image on a website.
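Images with missing or empty alt text can be listed automatically. A sketch using the standard html.parser; the class name is a hypothetical choice:

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Collect the src of every <img> that has no alt text."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attributes = dict(attrs)
            if not attributes.get("alt"):
                self.missing_alt.append(attributes.get("src", ""))
```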

Mobile friendly

The term used to describe web pages that perform well on all devices is “responsive”. Since the web is now accessed more from mobile devices than from desktops, checking whether a website is mobile-friendly is an essential part of a technical SEO audit.

Page Load Speeds

Users prefer web pages that load quickly, so page load speed is a factor in ranking pages on Search Engines. Faster page loads make for a better user experience, and that is what ranking is all about.

The above checklist for SEO analysis can seem intimidating, but it is always better to make corrections than to face penalties or to wait forever to see your website ranked by Google. If you enjoyed reading this article, make sure to subscribe to our newsletter. Keep working, don’t ignore your technical SEO, and I’ll see you in the next tutorial.