A Guide to Website Audits

To rank better on search engines, every aspect of your website should be audited and optimised. How can this be done?

E-commerce is one of the fastest-growing markets in the world. It has become a multi-billion-dollar revenue stream and has drawn in businesses all over the Internet. According to Forrester Research, Internet retail sales in the US will grow by 10% over the next five years. While the rise may be a threat to brick-and-mortar stores, it’s a positive step in an online direction for e-marketers.

A successful e-commerce venture relies mostly on good marketing strategies, online and offline, as well as the visibility of the website. Good website visibility equals good ranking in search engines, which may only come after a careful audit of your website.

Website and Rankings

Your website is a powerful tool in generating revenue for your business, which is one reason why the Internet is an amazing place for commerce. Although ranking isn’t an overnight achievement, you certainly can achieve it after some hard work on your SEO strategy.

While search engines are always changing their algorithms (see Google’s Panda and Penguin updates), the basics of website ranking have stayed more or less the same.

Being on the first pages of search engine results provides better visibility for your website, your product and your brand. Google is reported to handle 100 billion searches and 1.17 billion unique users every month. Getting this kind of exposure is an astounding opportunity for businesses. Don’t forget, 93% of all buying decisions start with an online search.

Getting ranked on the first page of Google may lead to a 91.5% increase in traffic, since most people don’t bother going beyond the first page. Websites ranking on the second page usually see only a 4.8% increase in traffic.

Of course, ranking on the first page of Google requires hard work and excellent website optimisation.

80% of users ignore paid advertisements and instead focus on organic search results. This leads to a rise in importance of search engine optimisation. Although a compelling product or service can change the game, good SEO and other marketing tactics can make all the difference in the world of digital marketing.

Before you can have a hope of ranking, you must audit your website. Website audits can be plain and simple if you know what you’re doing. However, without the right tools and the correct processes it can prove to be difficult.

Why You Need Site Health Audits

Website audits should be done at least once a year if you are adding and changing content every month. Since different pages have different goals, a site audit will account for these differences too. A complete and detailed audit will give you a much better understanding of your website, including why you aren’t generating the traffic you expected despite the work you have done.

Over the years, we have witnessed changes in search engines and how search results are ranked. The once-simple method of adding keywords to your title, content and header tags, then picking up a few links and guest posts to help you rank, is now obsolete.

The changes in search engine algorithms now include details about your entire website (such as speed, responsiveness, content and authority) together with your quality content.

To examine the potential of your website, understand that certain factors can only be unearthed by site audits and site audit tools. An audit will improve not only content performance but technical performance as well. Through audits, you can inspect your website’s technical framework to gain a greater understanding of how it works and how you can get it to rank.

HubSpot outlined three important benefits of site auditing from a marketing perspective:

  1. Improves website performance
  2. Enhances SEO
  3. Improves conversion

A thorough site audit can last six days for a large website, but on average it takes two to three days. Website admins often do the auditing themselves, but most companies will hire people for the job.

Types of Site Audits


There are many types of audits. Some websites perform audits based only on their current needs. These audits aim to spot key challenges and opportunities. The following, however, are the most common audit types related to search engine optimisation:

#1 Site Health Audits

A site health audit checks the general standing of your website. It analyses different areas, typically addressing the technical, onsite, link, social media and other miscellaneous aspects of the site. It offers a holistic analysis and gives insight into opportunities for growth. Access to your analytics and webmaster tools accounts is needed for a proper audit.

#2 Red Flag Audits

Although a red flag audit can be included in a general health audit, it can also be done as a standalone procedure. Red flag audits help assess a site for potential issues.

#3 Competitive Audits

A competitive audit helps you analyse gaps in your site. It compares your site against competitor sites to reveal opportunities for growth and to provide an overall benchmark. It can give you insight into what makes your competitors visible and what works for them. Analysing your competitors keeps you updated about your industry and helps you remain relevant to your customers.

Competitive audits often start with your website’s top five competitors. Put your fiercest competitor at the top: the website catering to your ideal customers.

#4 Conversion Optimisation Audits

This type of audit analyses your conversion issues. Delving deeper, it can pinpoint whether the challenge is an onsite problem or a technical one. Conversion optimisation audits draw on various tools, including Clicktale, Google Analytics and live user testing.

Conversion auditing addresses several issues by analysing the following:

  • Live user feedback
  • Where users click
  • How users view the site, with eye-tracking tools
  • Broken links and errors in your code
  • Problems with your conversion funnel
  • Trends that tell you where problems exist

Once these questions are answered, detailed and actionable recommendations can be proposed. Done right, this can benefit the conversion and optimisation of the website. For example, Jellyfish helped increase subscriber conversions for MoneyWeek magazine using Google Analytics, focusing on conversion-sensitive areas and building a thorough understanding of the user journey. They generated 50,000 subscribers in 2011.

#5 Negative SEO or Attacked Site Audits

In 2010 and 2011, NASA reported a total of 5,408 computer security incidents, including the installation of malicious software on its systems and attacks targeting specific computer networks and organisations. Smaller websites are also at risk of being attacked by malicious hackers and even negative SEO.

Your site can be attacked through a direct hit as well as attempts to lower your site’s rankings in search engines. Often, negative SEO is achieved through bad linking. Other threats include hacking your server, deleting your site or changing your robots.txt file so you won’t be crawled by search engines.

Google’s older statements dismissed the idea of negative SEO through bad links; however, software engineer Matt Cutts stated that while extremely difficult, it is not impossible. Thus, the Disavow Links tool was created.


Backlink auditing tools can help with this part of the audit. However, some of these tools may recommend disavowing natural links, which will hurt your website all the more. Then again, not every unnatural link should panic you into a new site audit when you’ve only recently done one; sometimes these weird links aren’t harmful. Be on your guard, though, for a sudden influx of odd links, which is a common sign of an attack.

If you are in a competitive field or industry, there is a big chance you’ll be facing competitive negative SEO attacks. And if this happens, audit your links and submit a disavow file.

#6 Penalty and Recovery Audits

The Google Panda update crippled lots of websites in a short time. Affected websites experienced a sharp decline in traffic, and main keywords stopped ranking. Before doing a recovery audit, evaluate your website to find out whether you’ve been hit.

But remember that every penalty is unique and every recovery is different. A typical penalty from a Google algorithm update may take one to six months to be lifted, depending on the site. Although your traffic won’t return to its old numbers as fast as you’d like, it can come back once you work through your audits and fix the problems.

Image Credit: sites.google.com

#7 Security Audits

Most websites today, even the most secure ones, are vulnerable to malicious hackers. These hackers may expose not only the website but the sensitive data within it. Take, for example, the Sony Pictures uproar in November 2014, where business and personal data, email correspondence and unreleased films were exposed to the world. The intrusion had been occurring for more than a year, so where did Sony go wrong?

Sony Pictures is a big company with a great team, but its security audits might not have been given the attention they needed. Bad programming is usually blamed for such problems, but a lack of security audits also leaves entry points, such as login pages, shopping carts and contact forms, open as routes into the backend databases.

Open source power tools for security audits include the following:

Nessus. Nessus is a free and open source package that offers comprehensive scanning capabilities, developed by Renaud Deraison.

Image Credit: Tenable

SARA. The Security Auditor’s Research Assistant came from COPS and is a vulnerability scanner known to be effective.

Hping. A network tool that supports various protocols, including TCP, UDP, ICMP and RAW-IP. Hping also has a traceroute mode and can transfer files over a covert channel.

Commercial scanners are also available for purchase. These scanners may have the ability to provide comprehensive vulnerability data, detection and correction of security problems.

Site Audit Checklist

This audit checklist should ideally be worked through before a website is launched, though some items can be checked even if the website is already running. The pointers consolidate guidelines from different search engines, including Google and Bing, and should be reviewed once or twice a year depending on what the website needs. Make sure you examine each of these points during an audit.

Canonicalisation, URL Structure

To have an effective website, only one URL should be used to reach the homepage. Whether or not you use www in your domain is up to you, as long as it isn’t followed by index.php or index.html.

For visitors, variations like www.xenlife.com.au and xenlife.com.au may not seem like a big deal; however, they are technically different URLs, and the web server can return completely different content for the two. Google usually picks the best representative from the set. The trick is to check, through the audit you are running, that you have been using the same URL consistently across your web pages, and to use a uniform format throughout your internal linking.

Image Credit: OnsiteReport
Image Credit: SEOChat

Matt Cutts said that choosing a default URL helps search engines, Google specifically, decide which URL you prefer as canonical. This can be done with a 301 redirect whenever users or crawlers request a variation.
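As a sketch, here is how such a 301 might look on an Apache server with mod_rewrite enabled (the domain is a placeholder; adapt it to your own setup and preferred variation):

```apache
# .htaccess - permanently redirect the non-www variation to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Because the redirect is a 301 (permanent), search engines consolidate the two variations onto the one you chose.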

As for webpages and posts, the URL should contain words rather than numbers or symbols, with the words separated by dashes. Keep it to around five words, including a keyword.
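As an illustration, a slug following these rules could be generated with a short script like the one below (a sketch of our own; the helper name and five-word limit are assumptions, not an official tool):

```python
import re

def slugify(title, max_words=5):
    """Build a short, dash-separated URL slug from a post title."""
    # Strip everything except letters, digits and spaces, then lowercase.
    words = re.sub(r"[^a-z0-9\s]", "", title.lower()).split()
    # Keep the slug to roughly five words, per the guideline above.
    return "-".join(words[:max_words])

print(slugify("A Complete Guide to Website Audits!"))  # a-complete-guide-to-website
```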

Content Optimisation

Without quality content, your ranking will suffer. Each of your webpages should contain high-quality content. Googlebot judges the relevance of a webpage by the quality of its content and the keyword targeted. Keywords should be used carefully throughout the body; don’t make it look like you’re trying to stuff in a keyword wherever possible.

Follow the right tagging: h1 for titles, h2 for subheadings and h3 subsequently. Never duplicate content, even across your own pages. Content duplication usually happens when spam websites copy content from legitimate websites and repost it to collect money. But if you must duplicate your content, use the rel=canonical attribute to tell crawlers which page should be prioritised.
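For instance, if a page must exist at two URLs, the duplicate can declare the preferred version in its head (the URL is a placeholder):

```html
<!-- Placed in the <head> of the duplicate page -->
<link rel="canonical" href="http://www.example.com/original-article/" />
```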

Advertisements are another concern in content optimisation. Check whether your webpages or posts are oversaturated with ads; if your website is filled with them, chances are you will be penalised. Ad-heavy layouts usually get warnings from Google once detected.

You should also take note that longer posts are more likely to be shared. Here’s an infographic from OkDork and Buzzsumo.

Image credit: OkDork

Webmaster Tools

Webmaster tools are important when starting a site audit. Google and Bing both offer webmaster tools that most admins consider helpful.

The most important thing to remember is to generate a sitemap for your website. It is usually uploaded to the root directory of the server or produced by a sitemap plugin, and then submitted through Google’s or Bing’s webmaster tools.

Sitemaps allow search engines to pick up new content whenever you update your website. Website admins also use webmaster tools to check for crawl errors and submit pages for indexing.

Google’s webmaster tools include Crawl Errors and Fetch as Googlebot; Bing offers equivalents for fetching and other tasks.

Website Architecture

Users usually reach pages on a website through the homepage, so the number of clicks needed to reach a page matters. If it takes four clicks or more to reach a particular page from the homepage, Google will assume the page is a low priority. This is where knowing the importance of each page comes in handy: decide which pages you want at the top of the hierarchy and which matter less.
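The click-depth idea can be checked programmatically. The sketch below (using a made-up site structure, not a real crawler) computes the minimum number of clicks from the homepage to every page with a breadth-first search; pages at depth four or more are candidates to move up the hierarchy:

```python
from collections import deque

def click_depth(links, home="/"):
    """Return the minimum number of clicks from the homepage to each page."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:          # first visit = shortest path
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# Hypothetical site structure: each page maps to the pages it links to.
site = {
    "/": ["/products", "/blog"],
    "/products": ["/products/widget"],
    "/blog": ["/blog/post-1"],
    "/blog/post-1": ["/blog/post-1/comments"],
}
print(click_depth(site)["/blog/post-1/comments"])  # 3
```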

Google has announced that it will index AJAX and JavaScript, so an effective content hierarchy can be built even for script-heavy sites. See the structure below:

Image Credit: Google

Geo Redirects

A good geo-redirect feature helps a website serve foreign visitors better, and it is common on international websites with a worldwide following. When visitors arrive, they are redirected to the correct language version of the site based on their region.

However, automatic redirects are often viewed as a big SEO mistake. Redirecting visitors to a subdomain moves them away from the root domain, which holds all the backlinks, diluting the value of the visit. Instead, create a language-selection feature on the homepage so redirects don’t waste a user’s potential.
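Alongside a language selector, each version can carry hreflang link tags so search engines know which alternate to serve (the URLs and language codes below are placeholders):

```html
<!-- In the <head> of every language version of the page -->
<link rel="alternate" hreflang="en-au" href="http://example.com/au/" />
<link rel="alternate" hreflang="fr" href="http://example.com/fr/" />
<link rel="alternate" hreflang="x-default" href="http://example.com/" />
```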

Internal Linking

Internal linking is important. A long-standing guideline is to keep the number of links on a page reasonable, traditionally under one hundred. Make sure all internal links are text links, and make sure there are no broken links on the site. Kissmetrics outlined the guiding principles for internal linking here: Seven Commandments of Internal Linking.

Noindex and Nofollow

Some pages on your website won’t need indexing or crawling: pages you don’t want ranked, such as PDF files, your terms of use and sitemap pages. Mark these noindex, and nofollow where appropriate, to avoid further duplicate content issues.
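For example, a terms-of-use page can be excluded with a robots meta tag in its head:

```html
<!-- Keep this page out of the index and stop crawlers following its links -->
<meta name="robots" content="noindex, nofollow">
```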

Title Meta Tags

Your title tags should be unique for every page, and each should contain primary keywords or phrases relevant to the content. The recommended title tag length is 50-65 characters; any longer and SERPs will truncate it.
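Auditing title lengths in bulk can be done with a small helper like this (our own sketch; the 50-65 window follows the guideline above and is adjustable):

```python
def check_title_length(title, lo=50, hi=65):
    """Classify a title tag against the recommended 50-65 character window."""
    n = len(title)
    if n < lo:
        return "too short"
    if n > hi:
        return "too long"  # SERPs will likely truncate it
    return "ok"

print(check_title_length("A Guide to Website Audits"))  # too short
```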

Meta keywords, on the other hand, have long been abused by spammers, making the tag a liability even for legitimate websites. Avoid meta keywords altogether, since Bing can treat their use as a spam signal.

Website Security

We mentioned the importance of site audits and website authority. Giving your users a great experience, along with the promise that their information is secure, is the best thing you can offer.

Using various tools, you may check your website’s health and vulnerability against malware. You can use paid tools or simply head to Google Webmaster Tools and use the free malware diagnostic tool. You can also use W3C Markup Validator to check for bugs that affect indexing.

Screenshot Credit: W3 Validator

Audit Steps and Analysis

Before you can diagnose site problems, you may need to crawl the website. For those who have the skills and the time, you can set aside a few days for crawling and analysis of code for your audits. However, for those who want to avoid the task, take a look at Screaming Frog’s site crawler. This site crawler is free for your first 500 URLs.


Before crawling, try disabling cookies, JavaScript and CSS. If you have problems with this process, try switching to another crawler. After crawling, and before your audit, you might want to get input from your website’s visitors.

After these initial steps, the audit process can be broken down into five main analysis points:

#1 Accessibility


The first concern here is robots.txt, which websites use to stop crawlers accessing parts of the site they don’t want crawled. Although robots.txt is a good way to keep crawlers away from chosen pages, part of the audit is making sure it does not lock all crawlers out of important sections of your site.

If you find an entry like this:

User-agent: *

Disallow: /

You have inadvertently disallowed crawlers from every part of your website. During development this can be deliberate, but if you are trying to optimise your website, remove the restriction.
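You can verify what a robots.txt actually blocks using Python’s standard library, both before and after the fix (the URLs and the /admin/ path are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# The broken file above: every crawler is locked out of everything.
broken = RobotFileParser()
broken.parse(["User-agent: *", "Disallow: /"])
print(broken.can_fetch("Googlebot", "http://example.com/products/"))  # False

# A sensible fix: only keep crawlers out of a private section.
fixed = RobotFileParser()
fixed.parse(["User-agent: *", "Disallow: /admin/"])
print(fixed.can_fetch("Googlebot", "http://example.com/products/"))   # True
```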

Robot meta tags

Robot meta tags play a great role in making sure your website is accessible. A robot meta tag tells crawlers whether to index a page and follow its links. Audit your “noindex” and “nofollow” tags, and minimise or remove any that block pages you want ranked.

HTTP Status Codes

Another part of the audit is checking whether your website’s URLs return errors, including soft 404 errors. To fix this, don’t simply turn users away from your site; redirect them to a replacement relevant to their search. And where a move is permanent, use 301 redirects instead of 302s.
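When sifting through crawler output, a simple triage of the returned codes helps. The buckets below are our own suggestion for audit notes, not an official classification:

```python
def triage_status(code):
    """Map an HTTP status code to a suggested audit action."""
    if code == 200:
        return "ok"
    if code == 301:
        return "permanent redirect (preferred for moved pages)"
    if code == 302:
        return "temporary redirect (use a 301 if the move is permanent)"
    if code == 404:
        return "not found (redirect users to relevant replacement content)"
    if 500 <= code < 600:
        return "server error (investigate)"
    return "review manually"

print(triage_status(302))  # temporary redirect (use a 301 if the move is permanent)
```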

XML sitemap

Sitemaps were part of the checklist above, and they belong in the first part of your audit. Make sure your sitemap is referenced on your website, follows the sitemap protocol and is up to date. Site architecture, as mentioned earlier, should take advantage of linking opportunities: know your priorities and evaluate how you are using linking to your advantage.
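A minimal sitemap following the sitemaps.org protocol looks like this (the URL, date and frequency are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2015-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```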

Site performance

Last but not least in this section is site performance. Although it may seem superficial, the way your site performs matters a lot for attracting and keeping users. Faster-loading sites may be crawled more thoroughly, and the faster a site loads, the more satisfied users will be.

Various tools on the Internet can offer you insights into your page speed, along with recommendations to help you improve it. The first to try is Google PageSpeed Insights; Pingdom can also provide details of speed and page size. Take a look at our test below:


#2 Indexability

After auditing which pages you want to be crawled, you now need to determine the indexability of your website in the eyes of search engines. How many of your pages are indexed by search engines?

First, let’s take a look at the site: command.

The site: command lets you search for content on a specific website, and it returns a rough estimate of the pages indexed by your chosen search engine. Take, for example, XEN Life’s indexed pages:


And compare it to a more established site like Quicksprout:


Although not very accurate, this rough estimate is already valuable for auditing and analysis. You cannot expect search engines to index all your pages, but if you find a huge discrepancy between the data and your estimate, it may be time to check for penalised pages. If, instead, the number of indexed pages is larger than expected, you may be serving duplicate content.

To check for duplicate content, you can append &start=990 to the end of the results URL and look for the search engine’s omitted-results notice.

Other things to check with the site: command include brand searches: make sure you rank well for searches on your company name.

#3 On-Page Ranking Factors

After analysing your website accessibility and indexability, you can now focus on possible factors and characteristics of your website that influence search engine rankings. For site-wide audits, you may do a domain level analysis.


The first stop in this stage of your audit analysis is your URL. We’ve mentioned before that your URL needs to be relevant, short, user-friendly and should use subfolders (avoid subdomains).


Secondly, focus on content. Search engines often favour content that is more substantial than its rivals. Moreover, it isn’t just search engines you should worry about; check whether your content addresses human needs, not machine needs alone. This can be measured through bounce rate and time spent on pages, via Google Analytics or other tools that let you track it over a period of time.

Another point to consider is the targeted keywords embedded in the content. Your content won’t work well if it appears to be spamming a keyword. Use keywords sparingly and place them strategically throughout the content.

Apart from quality, length and keywords, content should also be structurally correct. Make sure you’re using h1, h2 and h3 tags correctly, and check your grammar and spelling. If you think you have a problem here, do a simple spell check or hire an editor. If you are still unsure about the readability of your content, try tools like the Gunning Fog Index.

HTML markups and titles

Ensure you validate your HTML and evaluate its standards compliance. You can use the W3C Markup Validator, or its CSS counterpart, as needed.

As for the title, there are lots of things to consider when creating content titles. You should include your keyword or keywords and stay under 60 characters.


Image alt text and filenames should be relevant to the image they describe. They should also contain your keywords where natural, and images should be compressed to avoid long loading times.
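For example (the filename and alt text here are illustrative):

```html
<img src="website-audit-checklist.jpg"
     alt="Website audit checklist infographic"
     width="600" height="400">
```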


Always be careful with linking. During the audit, check your outlinks: they should be high quality and lead to trustworthy sites relevant to the page or content.

#4 Off-Page Ranking Factors

Off-page ranking factors may not be everything but they certainly contribute to your rankings.

The more popular the website, the better the exposure to an audience; and the larger the audience, the greater the chance of spreading information about your products or services. Make sure traffic is steady and performs well compared to your competitors. A good backlink from a well-known site can help a lot.

Another consideration is your authority, which can be checked through your Page Authority (PA) and Domain Authority (DA). Take, for example, the PA and DA of a famous news site.


To better understand these readings, check SEOblog’s differentiation of PR and DA: What is the Difference Between Domain Authority and Page Rank?

Social matters too. Quantify your social media engagement and evaluate which of your content has been shared by influential people.

#5 Competitive Analysis

You cannot fully complete your audit unless you have compared your site against competitors. An audit of your competitors’ websites will help you gear up for the competition: you can exploit their weaknesses to your advantage, or learn a lesson from their moves.

Questions to Ask When Auditing


Now that we’re done with the steps, we have a few more tricks here to make sure that you’ve exhausted every possible solution when auditing your website:

  1. Have you run ranking reports?
  2. Did you define the purpose of your page?
  3. Do you have evidence that certain pages deliver?
  4. Can you identify PPC in the page?
  5. Have you checked the relevance of the information on the page?
  6. Did you check what conversion events are on page?
  7. Did you check that the information is up to date?
  8. Did you check all links?
  9. Do you have any duplication concerns?
  10. Is the content optimised?
  11. Is your page layout effective?
  12. Does the page offer consistency?
  13. Did you check if you can use a schema?

These questions should be able to help you all throughout the process of auditing your site. Now that we’ve gone through the process, the checklist and these questions, you can prepare for the next step which is the reporting stage.

Audit Reporting

An audit report isn’t just an ordinary report; you need to make it actionable so that the audit process produces real results, not just words and findings.

You may need to write your report for multiple audiences; in other words, it may be read by people who aren’t familiar with tech-speak. To ensure everyone understands the findings, make one report for the IT team and a different one for the management team, highlighting the sections of highest importance to each. To make sure the most urgent problems receive immediate attention, put them at the top of the list of concerns. Finally, provide the best possible concrete, actionable suggestions for your client or your own business.


We hope you can follow this audit guide to get you through the process successfully. Good luck!


