A website audit is simply the process of reviewing your site and assessing its performance. There are a number of ways to do it and a huge amount of data to examine, from both a technical and a marketing viewpoint. Without going into the deeply technical issues, we will touch on what I believe are six important areas that need particular attention.
Check Your Robots.txt File
Does your website have a robots.txt file? This little file tells the search engines which pages you want indexed and which you don't. It needs to be implemented correctly, as it is all too easy to block pages from being indexed by accident. Why would you want to block certain pages? The simplest example is your website login screen. You don't want or need this to be indexed, as it could be a security issue. So this little file is quite important, and if implemented incorrectly it can have a detrimental effect.
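As a simple illustration, a robots.txt file that keeps crawlers out of a login and admin area while leaving the rest of the site crawlable might look like this (the paths and domain are examples only, not a recommendation for any particular site):

```
User-agent: *
Disallow: /login/
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```

Note that `Disallow` rules apply to all crawlers here because of the `User-agent: *` line, and the optional `Sitemap` line tells search engines where to find the sitemap discussed next.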
Create an XML Sitemap
Does your site have an XML sitemap? A sitemap is basically a route map that the search engines look for. In its simplest form, it is an XML file that lists the URLs on your site along with additional metadata about each URL, allowing the search engines to correctly index each page. Search engines normally discover pages from links within your site and from other sites; a sitemap supplements this, letting search engines that support sitemaps pick up every listed URL and learn about it from the associated metadata. This data is then used to index your website.
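To make that concrete, here is a minimal sitemap in the standard sitemaps.org format, with two example URLs (the domain, dates and values are illustrative only):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
    <lastmod>2023-11-02</lastmod>
  </url>
</urlset>
```

Only `<loc>` is required for each URL; `<lastmod>`, `<changefreq>` and `<priority>` are the optional metadata mentioned above.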
Optimize Meta Data
The search engines will crawl each page on your site, so it is important that each page is focused on a particular keyword or keyphrase. To aid the search engines, each page should have a unique meta title and meta description: brief, to-the-point summaries of the page. These are what the search engines pull from your website and display in the search engine results pages (SERPs), so you can see instantly how important these tags are. They need to not only describe the page content but also entice users to click on them in the SERPs.
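In the page's HTML, these two tags sit in the `<head>`. A sketch for a hypothetical product page (the business and wording are invented for illustration):

```html
<head>
  <title>Handmade Oak Furniture | Example Workshop</title>
  <meta name="description"
        content="Bespoke oak tables and chairs, handmade to order in our Yorkshire workshop. Free UK delivery on orders over £200.">
</head>
```

The title stays short enough to display in full in the SERPs, and the description doubles as the "advert" that persuades a searcher to click.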
Fix Duplicate and Shallow Content
When more than one copy of the same content exists, it is difficult for search engines to distinguish the original source from the copy. If this is repeated a number of times across your site, it sends a confusing message to the search engines and will lower your rankings. Beyond that, you should also aim for at least 300 words on each page. Obviously, pages like the contact page cannot carry this amount, and that is fine. Do not fill the page with needless copy, but don't leave it blank either. Also remember that you are writing for the user as well, so the copy needs to be informative and useful.
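Where some duplication is unavoidable (for example, the same product page reachable at several URLs), one widely used remedy, offered here as general practice rather than something specific to this audit, is a canonical tag that tells search engines which version to treat as the original:

```html
<!-- Placed in the <head> of each duplicate page; the URL is illustrative -->
<link rel="canonical" href="https://www.example.com/products/oak-table/">
```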
Find Broken Internal and External Links
Another important area to consider is broken links, which make for a poor user experience. A poor user experience means your website is downgraded by both your human visitors and the search engines. If a search engine spider visits your site and keeps running into "dead ends", or broken links, it can cause problems, and broken links left unfixed will result in a decrease in rankings and thus visibility.
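There are plenty of tools that will crawl your site for broken links, but as a rough sketch of the idea, here is how you might start checking a page yourself in Python using only the standard library (the URLs involved are placeholders, and a real crawler would also need politeness delays and robots.txt handling):

```python
from html.parser import HTMLParser
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin


class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html, base_url=""):
    """Return absolute URLs for every link found in the HTML."""
    parser = LinkExtractor()
    parser.feed(html)
    return [urljoin(base_url, href) for href in parser.links]


def is_broken(url):
    """True if fetching the URL fails or returns an HTTP error (4xx/5xx)."""
    try:
        with urlopen(Request(url, method="HEAD")) as resp:
            return resp.status >= 400
    except (HTTPError, URLError):
        return True
```

You would fetch a page, run `extract_links` on its HTML, then call `is_broken` on each resulting URL and fix or remove anything that fails.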
Site Architecture and Content
The most important thing to remember here is that you should write your content for the user, not for the search engines. That said, there are a few signals you can send to help the search engines index your site better: correct use of H1 and title tags, alt attributes on images, meta keywords and good, readable content. Good site architecture will also give your users a better experience, which in turn improves rankings as your site's visibility increases. That gives you more control over the keywords you rank for, and thus an increase in your targeted traffic.
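Pulling those on-page signals together, a simple page skeleton might look like this (the business, file names and wording are invented for illustration):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <title>Oak Dining Tables | Example Workshop</title>
  <meta name="description" content="Handmade oak dining tables in a range of sizes, built to order.">
</head>
<body>
  <!-- One H1 per page, matched to the page's focus keyphrase -->
  <h1>Handmade Oak Dining Tables</h1>
  <!-- Descriptive alt text helps both search engines and screen readers -->
  <img src="/images/oak-table-six-seater.jpg"
       alt="Six-seater oak dining table with tapered legs">
</body>
</html>
```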