An SEO audit is a complex process to assess the performance of a website with respect to its positions in search engine rankings. The technical infrastructure, various on-page and off-page factors, performance in social media, and the SERP positions of competitors are reviewed as part of it. The goal of an SEO audit is to optimize the visibility, user friendliness, and conversions of a website.
Typically, an SEO audit checks a website against checklists and formulates recommendations for improvement from them. SEO audits can be described as quality management measures: the actual state of the website is evaluated and compared with the standards of the search engines. The target state assumed in an SEO audit depends on the website (size and goal) and on the standards of the different search engines (Google, Bing, or Yahoo).
The duration of the analysis is also dependent on these two factors. Depending on the scope of the project, it can take up to six weeks and cost between 300 and 3,000 Euros. SEO audits are usually carried out by external agencies, consultants, or internal SEO employees.
Most audits begin with a discussion between the auditor and the customer. Questions get clarified such as:
- What is the goal or business model of the website?
- Which key areas of the website are particularly important to the customer?
- What SEO measures have already been implemented (or are still being implemented)?
- How can changes be made to the server, data management, CMS, and the source code?
- What are the access points for the Google Search Console, etc.?
- What results can be expected from experience?
Aspects of an SEO audit
The basis of an audit is initially data in the form of worksheets or Excel spreadsheets, as well as a document that serves the customer as a recommendation for action. Based on the data obtained, recommendations are made and marked with different symbols (for example, red symbols for changes that are absolutely necessary). These files also serve as documentation that makes the process steps transparent to the customer.
Server and infrastructure
To find out the status of the technical infrastructure, the site is either read with a crawler or indirectly checked with the search engine’s webmaster tools (for example, Google Search Console and Bing Webmaster Tools). For example, the crawler will detect error messages for specific URLs. Other important technical factors are:
- Robots.txt and robots meta tags: Is the crawler allowed to read the pages? Are individual pages blocked by robots meta tags?
- HTTP status codes: Do certain URLs return an error code (4xx, 5xx, or a soft 404) when requested? How are redirects handled?
- XML sitemaps: Is there a sitemap available to the search engine? Is it a valid XML document? Do the pages listed on the sitemap match the results of the crawler?
- Page speed and time to first byte: How fast does the website load when called? How much time passes before the first byte of data is loaded from the server?
- Dynamic, canonical, and paginated URLs: Are dynamically generated URLs rewritten with mod_rewrite? Is a canonical URL defined that serves as the main domain? Are paginated pages marked up correctly with rel="next" and rel="prev"?
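The robots.txt check from the list above can be sketched with Python's standard-library `urllib.robotparser`. The robots.txt contents and URLs below are hypothetical examples, not data from a real audit.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: the /internal/ section is blocked for all crawlers.
ROBOTS_TXT = """\
User-agent: *
Disallow: /internal/
Allow: /
"""

def build_parser(robots_text):
    """Parse robots.txt rules from a string instead of fetching them."""
    rp = RobotFileParser()
    rp.parse(robots_text.splitlines())
    return rp

rp = build_parser(ROBOTS_TXT)
print(rp.can_fetch("*", "https://www.example.com/products/"))       # allowed
print(rp.can_fetch("*", "https://www.example.com/internal/report")) # blocked
```

In a real audit the file would be fetched from the live domain (e.g. via `RobotFileParser.set_url` and `read`) and every crawled URL checked against it.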
Indexing and visibility
How many pages of a domain are listed in the index of the respective search engine? Are there, or were there, penalties or manual actions by Google against individual pages? These questions can be answered by comparing the index with the results of the crawl. Google's index can be queried with commands such as "site:www.example.com", for example. Ideally, the two data sets match.
If this is not the case, it indicates either crawling errors (pages are not read and therefore not indexed) or duplicate content (pages with duplicate content are in the index). Entering relevant keywords (brand, company name, services, products) into the search bar provides additional insight into visibility for certain keywords. Individual pages can also be checked for their presence in the index by entering their complete address.
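Comparing the crawl results with the index, as described above, comes down to set comparisons. The following sketch uses hypothetical URL sets to illustrate the two mismatch cases.

```python
# Hypothetical example: URLs found by the crawler vs. URLs reported as indexed.
crawled = {
    "https://www.example.com/",
    "https://www.example.com/products/",
    "https://www.example.com/about/",
}
indexed = {
    "https://www.example.com/",
    "https://www.example.com/products/",
    "https://www.example.com/products/?sort=price",  # parameter URL: duplicate content candidate
}

not_indexed = crawled - indexed  # crawlable but missing from the index (possible crawling/indexing problem)
unexpected = indexed - crawled   # indexed but not found by the crawler (possible duplicate content)

print(sorted(not_indexed))
print(sorted(unexpected))
```

In practice the two sets would come from a crawler export and from Search Console or an index query, not from hand-written literals.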
Information architecture
The information architecture is the vertical and horizontal structure of a website, which can be represented as an inverted tree. How many clicks does it take for a user to reach the information they searched for? How many levels deep is the website? How many horizontal site elements are there? The most important subpages should be reachable within about three clicks. A relatively flat overall structure is also important to support a smooth customer journey.
The website architecture also affects linking within the hierarchy. Link juice should be distributed evenly across the structure to avoid silos, i.e. sections in which link juice circulates only among themselves.
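The click depth discussed above can be computed with a breadth-first search over the internal link graph. The site structure below is a made-up example; a real audit would build the graph from crawler output.

```python
from collections import deque

def click_depths(links, start):
    """BFS over the internal link graph; depth = minimum clicks from the start page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical internal link graph: page -> list of linked pages.
site = {
    "/": ["/products/", "/about/"],
    "/products/": ["/products/widget/"],
    "/about/": [],
    "/products/widget/": [],
}

depths = click_depths(site, "/")
too_deep = [page for page, d in depths.items() if d > 3]  # flag pages beyond ~3 clicks
print(depths, too_deep)
```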
On-page factors
- URLs: URLs should be short, speaking, and thus user-friendly. They should also include the keywords relevant to the individual page and, if possible, no special characters. Dynamic URLs often use parameters; such URLs should be rewritten or registered in the Search Console.
- Duplicate content: Identical content that is reachable under several URLs must be strictly avoided.
- Content: The content can be inspected with a text-based browser such as SEO Browser or Browseo. Each page should provide substantial information to users, be easy to read, and include the most important keywords for that page. A clear structure using H1-H6 headings and additional text markup is recommended. Keyword stuffing and grammatical errors should be avoided.
- Meta Title: One of the most important ranking factors. The title tag typically contains relevant keywords and describes the page content.
- Meta Description: Good meta descriptions can lure users to the site and increase the click-through rate. Again, one or two keywords should describe the content of the site.
- Images: Descriptions of images, logos, and graphics should be concise. The alt attribute specifies an alternative text for an image, used for example by screen readers. Again, keywords can be used in moderation.
- Outgoing and internal links: Hyperlinks act as recommendations for the quality of a website. Each outgoing link should be checked for the trustworthiness of the target site, the relevance of its content, and the anchor text. Error messages and unnecessary redirects must be avoided. Ideally, most links are not marked "nofollow", so that link juice can flow.
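Several of the on-page checks above (title, meta description, alt attributes) can be automated with Python's built-in `html.parser`. This is a minimal sketch run on a made-up HTML snippet, not a full audit tool.

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Collects the title, meta description, and count of images without alt text."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = None
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content", "")
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Hypothetical page source; real input would come from the crawler.
HTML = """<html><head><title>Example Widgets | Shop</title>
<meta name="description" content="Buy example widgets online."></head>
<body><img src="a.png" alt="widget"><img src="b.png"></body></html>"""

audit = OnPageAudit()
audit.feed(HTML)
print(audit.title.strip(), len(audit.title.strip()), audit.meta_description, audit.images_missing_alt)
```

From here, rules such as "title between roughly 50 and 60 characters" or "no image without alt text" can be applied per page.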
Off-page factors
- Popularity: How much traffic does the website receive? How popular is it compared to the competition? Is the website linked from other popular websites?
- Trustworthiness: Does the site contain an excessively high keyword density? Is there hidden text that is invisible to users? Is cloaking used?
- Backlinks: An organic link profile is one of the most important criteria in an SEO audit. How many backlinks point to the website in question, and from how many different domains? Does the link profile contain nofollow links (a profile without any nofollow links would look unnatural)? Are the backlinks topically relevant and of high quality? Are there links from earned media?
- Authority: The authority of individual sites or sub-sites and whole domains may affect ranking.
- Public Relations: Is the company, the brand or the website mentioned in different media?
- Social media: There are different social signals for each social network. In most cases, interactions between users and a profile are the focus. Is content distributed? Is the website mentioned in the social media of users? Are there influencers that distribute content virally? Is the social profile SEO-optimized?
- Competition: Competitor information helps customers understand the strengths and weaknesses of their competitors and improve their portfolio. To get such data, an SEO audit can be done on competitors, of course, with some changes since certain information is not available. With TF*IDF analysis of texts, successful competitors can also be identified with regard to keywords.
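The TF*IDF analysis mentioned above can be sketched in a few lines. The token lists here are hypothetical; real analyses use proper tokenization and much larger corpora.

```python
import math
from collections import Counter

def tf_idf(docs):
    """Compute TF*IDF per document; docs is a list of token lists."""
    n = len(docs)
    df = Counter()                 # document frequency per term
    for doc in docs:
        df.update(set(doc))
    scores = []
    for doc in docs:
        tf = Counter(doc)
        total = len(doc)
        # TF = term count / document length; IDF = log(N / document frequency)
        scores.append({t: (tf[t] / total) * math.log(n / df[t]) for t in tf})
    return scores

# Two made-up competitor texts, already tokenized.
docs = [
    "seo audit checklist audit".split(),
    "seo tools overview".split(),
]
scores = tf_idf(docs)
print(scores[0])
```

Terms that appear in every document (here "seo") score zero, while terms concentrated in one document (here "audit") stand out, which is what makes the measure useful for comparing competitor texts.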
Relevance to search engine optimization
An SEO audit reveals the strengths and weaknesses of a website and shows its improvement potential. In addition to basic aspects such as crawlability, indexing, and on-page and off-page factors, the user-friendliness of a website is an important signal. Although user behavior changes only slightly over longer periods of time, the ranking factors and standards of search engines change constantly. This is one of the main reasons why SEO audits should be done regularly.
An SEO audit is aimed not only at optimizing websites for search engines, but also at providing users with high-quality content and a good customer journey. However, SEO audits are very extensive; some agencies therefore divide them into sections and offer content audits, on-page audits, off-page audits, and technical SEO audits.
Free tools and audit software are a good option for companies with limited budgets. Examples are MySiteAuditor, ScreamingFrog, ZadroWeb, Found, SEO ReportCard, WooRank, and Marketing Grabber. In addition, there are numerous tools that are useful for specific tasks, such as Xenu as a crawler, Google PageSpeed and Pingdom as load time testers, SEMRush for traffic analysis, and Open Site Explorer as a link validator. However, profound knowledge is often required to interpret the data from free tools correctly.
With the Ryte e-book "SEO Audit," we have created a checklist that is not only valuable support for agencies in their audits, but also helps beginners understand how SEO professionals work and which criteria are considered particularly important.