6 Tips for Improving Your Website’s Accessibility
Accessibility, the less-loved child of SEO
More and more people understand that good SEO relies on relevant, high-quality content. Customers who ask us to perform “SEO magic” for them (with no work on their part, usually), or to create 5,000 high-quality pages per day (you can’t), are a dying breed. So far, so good. What is still often overlooked is the other side of good SEO, which in professional SEO circles is called accessibility.
In my research for this post I came across a great SEO article by Vanessa Fox of Nine by Blue on one of Google’s (many) algorithm changes. One of the things that caught my attention was the succinct way she described what SEO means to her:
- Using search data to better understand your audience and solve their problems (by creating compelling, high-quality content about relevant topics to your business)
- Understanding how search engines crawl and index sites and ensuring that your site’s technical infrastructure can be comprehensively crawled and indexed
At Optify we’ve written a lot about Vanessa’s first bullet on how important it is to create great content. In this post I’d like to give you some pointers on accessibility—the other cornerstone of a great SEO strategy. While accessibility is not quite the ‘red-headed step child of SEO’, it’s an area that doesn’t always get the attention it deserves from B2B marketers. I suspect that’s because the work is more technical and less obvious (as in, you can’t see it on the page).
To remedy that I want to provide you with a few pointers on how to improve your overall accessibility, which ties directly to improving your success in search. Most of these pointers deserve a blog post of their own, and I will cite resources that I have used in the past, but let me know if you’d like to dive deeper into any of the pointers below and I’ll be happy to devote more time.
Six tips for improving your website’s accessibility
1. Understand how crawling & indexing work
Before you can fix accessibility problems you need to understand how crawling and indexing work. Think of search engines as librarians that oversee the huge amount of information on the Web and then point you in the right direction. They employ what is known as a web spider or crawler, basically a computer program that visits URLs and pages all over the Web. The spider creates copies of the pages it visits and feeds them back to the search engine, which stores and processes the data for quick access and retrieval. This stored data is known as the search engine index. Just like the index at the back of a book, it contains a list of all relevant content, categorized accordingly. When you type a query into a search engine, the engine scours its index, finds all pertinent material, and presents it quickly for your perusal.
To see how well your site is being crawled you can use Google’s Webmaster Tools. More on what you can do with the information from these tools later.
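To make the librarian analogy concrete, here is a toy sketch (in Python, using only the standard library) of the first step a crawler performs: parsing a fetched page and collecting the links to follow next. The `LinkExtractor` class and the sample HTML are illustrative, not how any real search engine is implemented.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Relative links are resolved against the page's own URL.
                    self.links.append(urljoin(self.base_url, value))

page = '<a href="/crm-software.html">CRM</a> <a href="https://example.org/">out</a>'
parser = LinkExtractor("https://www.example.com/")
parser.feed(page)
print(parser.links)
# A real spider would fetch each discovered URL, store a copy of the page for
# the index, and repeat, while respecting robots.txt and crawl-rate limits.
```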
2. Submit your sitemaps & monitor index rates
If you own a business, you definitely want all relevant pages of your site included in the index, especially Google’s index. Google takes the lion’s share of searches made worldwide — an estimated 90% for B2B. However, with the millions of websites on the World Wide Web today, it’s not practical for Google to index all of them; it needs to be selective.
The number of your site pages being indexed by the search engine is known as the index rate. This varies for every website out there. Some have a higher index rate, while others have lower rates. There are also those that are never indexed for some reason and get trapped in the “sandbox.” While we have no control over the algorithm that search engines use to index our websites, we can discover how many pages of our site are indexed and set our site up to improve our chances:
- Have great, relevant content — in the past, “black hat” tactics like keyword stuffing, hidden text, and doorway pages could get a completely useless site indexed and highly ranked. In 2012, Google became much wiser. A helpful site with good content, even with minimal SEO, is almost guaranteed to be accessible via search.
- Check for indexing issues — use Google Webmaster Tools to view the status of your website and see any problems that might interfere with indexing. The service is free, and with the ton of features and data that Google offers to improve your visibility, you’d be crazy not to use it.
- Submit a current sitemap — make sure you submit a detailed sitemap to all of the major search engines. A sitemap is basically a document that lists the pages on your website in hierarchical order. It is accessible to crawlers and helps them index pages properly and efficiently. The sitemap can also be fine-tuned using Google Webmaster Tools.
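For reference, a minimal sitemap following the sitemaps.org protocol looks like this (the URLs and dates are placeholders for your own pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yourdomainname.com/</loc>
    <lastmod>2012-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.yourdomainname.com/crm-software.html</loc>
    <lastmod>2012-05-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Save it as sitemap.xml at your site root and submit it through Google Webmaster Tools; you can also point crawlers at it with a `Sitemap:` line in your robots.txt.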
3. Monitor and fix crawl errors & problematic server response headers
With Google Webmaster Tools, you can also see a detailed list of crawl errors that have occurred on your site. Crawl errors are problems that prevent the spider from going over your pages and getting them indexed, and fixing them can immediately improve your site’s accessibility. Oftentimes the cause is a hitch on the server. The dreaded 404 error is a good example: the server cannot find the requested page, which means the spider has no way of indexing it. Fixing 404s and other breakdowns should be a top priority.
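If you want to spot-check response headers yourself between Webmaster Tools reports, a small script can fetch a URL and triage its HTTP status code. This is a sketch using Python’s standard library; `classify_status` and `check_url` are hypothetical helper names, and the triage messages are just suggestions.

```python
import urllib.request
import urllib.error

def classify_status(code):
    """Rough triage of an HTTP status code from a crawler's point of view."""
    if 200 <= code < 300:
        return "ok"
    if code in (301, 302):
        return "redirect - make sure it points to a live page"
    if code == 404:
        return "not found - fix the link or set up a 301 redirect"
    if code >= 500:
        return "server error - the crawler cannot index this page"
    return "check manually"

def check_url(url):
    """Fetch a URL and return its status code (network access required)."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

print(classify_status(404))  # not found - fix the link or set up a 301 redirect
```

Running `classify_status(check_url(u))` over a list of your own URLs gives you a quick health report to compare against what Webmaster Tools shows.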
4. Keep a simple URL structure
Complex URLs should be avoided like the plague. They do not help indexing and accessibility. Keep your URLs short and sweet, preferably by using only the primary keyword or a few related ones. For example, if your page is about CRM software, then www.yourdomainname.com/crm-software.html is much more appealing to crawlers (and people) than www.yourdomainname.com/index.php?id_sezione=42erwe=75hes48dfmm04662bdi4r.
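Many content management systems generate these slugs for you, but the idea is simple enough to sketch: lowercase the page title and reduce it to hyphen-separated keywords. The `slugify` helper below is an illustrative Python example, not any particular CMS’s implementation.

```python
import re

def slugify(title):
    """Turn a page title into a short, keyword-based URL slug."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # runs of non-alphanumerics become one hyphen
    return slug.strip("-")                   # drop leading/trailing hyphens

print(slugify("CRM Software"))  # crm-software
```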
5. Optimize or avoid content that’s hard to understand for search engines (including duplication)
Search engine spiders do well at crawling text-based, static content like HTML documents and the words inside title tags or the ALT attribute of images. However, they struggle with dynamic content driven by web applications, and with rich media files such as Flash, Silverlight, and video. Since 2012, Google has become smarter and can index dynamic content to some extent, but the process isn’t yet as reliable as we’d like it to be. Search engines also have a hard time with poorly coded pages and duplicated content. To keep them happy, fix broken code, remove duplicate content, and focus on optimizing content that search engines love.
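Finding exact-duplicate pages on your own site is easy to automate: hash each page’s normalized text and look for collisions. The snippet below is a minimal sketch with made-up page text; real duplicate detection would also catch near-duplicates, which simple hashing cannot.

```python
import hashlib

def content_fingerprint(text):
    """Hash normalized page text so exact duplicates share a fingerprint."""
    normalized = " ".join(text.split()).lower()  # collapse whitespace, ignore case
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Hypothetical pages: the first two differ only in whitespace.
pages = {
    "/crm-software.html": "CRM software helps you manage customers.",
    "/crm-software-copy.html": "CRM   software helps you manage customers.",
    "/pricing.html": "Our pricing starts at $50 per month.",
}

seen = {}
for url, text in pages.items():
    fp = content_fingerprint(text)
    if fp in seen:
        print(f"{url} duplicates {seen[fp]}")  # candidate for removal or a redirect
    else:
        seen[fp] = url
```

Once you have found a duplicate, remove it, redirect it, or mark the preferred version with a rel="canonical" link.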
6. Optimize page speed & crawl efficiency
Just as people are more likely to get irritated and exit a site if they have to wait too long for a page to load, search engines don’t appreciate slow load times either, and they include that parameter in their ranking algorithms. Luckily, Google provides an online tool that makes it easy to measure and diagnose your page speed: https://developers.google.com/speed.
Page speed is partly determined by your hosting service, so make sure your service level agreement is adequate to ensure fast load times and little to no downtime. Websites are also bogged down by applications and heavy content such as Flash, video, or hefty graphics. A blog that runs too many active plugins can also experience this problem. If you want to improve crawl efficiency, you’ll have to make sure your site loads quickly.
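Two of the easiest server-side wins are compressing text assets and letting browsers cache static files. Assuming an Apache server with mod_deflate and mod_expires enabled (check with your host), a starting point might look like this:

```apache
# Compress text assets before sending them (mod_deflate)
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Let browsers cache static assets so repeat visits load faster (mod_expires)
ExpiresActive On
ExpiresByType image/png "access plus 1 month"
ExpiresByType text/css "access plus 1 week"
```

The exact types and cache lifetimes depend on your site; Google’s page speed tool will tell you which assets are worth compressing or caching.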
I hope this post has given you some information on how to improve the accessibility of your site. We’ve only scratched the surface here, but I’d like to think that even with ever-evolving search algorithms, these tried-and-tested methods will help you in your SEO endeavors. Keep on optimizing!