There are so many things that can affect your SEO strategy: content, website design and backlinks, to name a few.
But crawlability is just as important.
Despite it flying under the radar somewhat, even small issues with your site’s crawlability can affect your SEO in big ways.
So what is crawlability?
Crawlability refers to how easy it is for search engines to find and read the content on your website. To learn about any new or updated page, search engines use what are known as web crawlers: bots that discover content by following links across the web.
Google explains how it crawls the web: “The web is like an ever-growing library with billions of books and no central filing system. We use software known as web crawlers to discover publicly available web pages. Crawlers look at web pages and follow links on those pages, much like you would if you were browsing content on the web. They go from link to link and bring data about those web pages back to Google’s servers.”
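To make that link-following idea concrete, here's a simplified sketch in Python (not how Googlebot actually works) of the first step a crawler performs: parsing a page's HTML and collecting the links it would queue up to visit next. The example page is made up for illustration.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag - the links a crawler would follow."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A hypothetical page with two internal links
page = """
<html><body>
  <a href="/about">About us</a>
  <a href="/services">Services</a>
  <p>No link here.</p>
</body></html>
"""

parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # → ['/about', '/services']
```

A real crawler would then fetch each of those URLs in turn, extract their links, and repeat, which is exactly why the linking issues below matter so much.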
When a website has no crawlability issues, web crawlers can access all its content easily by following links between pages. However, broken links or dead ends can cause crawlability issues, which can negatively impact your SEO strategy.
What can affect your website’s crawlability?
A variety of things can affect your website’s crawlability:
Site structure
The way you have organised your website plays a crucial role in its crawlability.
A common mistake is having pages that aren’t linked to from anywhere else on your site. Web crawlers may have difficulty reaching these pages, meaning large parts of your website could go unread – which could negatively affect your organic rankings.
These pages may be reachable through external links elsewhere on the web, but this isn’t a given, and you want your website to be as easy to navigate and read as possible.
Internal link structure
We know that Google uses web crawlers to follow links to pages across the web, so it’s vital that you have a strong internal link structure so that Google and other search engines can reach all your pages – even the deepest ones!
A poor internal link structure could send them to a dead end, resulting in a web crawler missing some of your content.
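One way to picture this is to treat your site as a link graph and check which pages a crawler starting from your homepage can actually reach. The sketch below uses a hypothetical site map (the page names are made up) and a breadth-first traversal, roughly mirroring how a crawler moves from link to link:

```python
from collections import deque

# A hypothetical site: each page mapped to the pages it links to.
site = {
    "/":          ["/about", "/services"],
    "/about":     ["/"],
    "/services":  ["/"],
    "/old-promo": [],  # no page links here - an orphaned page
}

def reachable_pages(site, start="/"):
    """Breadth-first walk of the internal link graph, as a crawler would."""
    seen = {start}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for link in site.get(page, []):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return seen

# Pages a crawler starting at the homepage can never reach
orphans = set(site) - reachable_pages(site)
print(orphans)  # → {'/old-promo'}
```

In this toy example, /old-promo is exactly the kind of dead end the paragraph above describes: the content exists, but a crawler following internal links will never find it.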
Broken redirects and server errors
Broken page redirects will stop a web crawler in its tracks, resulting in crawlability issues. Similarly, server errors and other server-related problems may prevent web crawlers from accessing all of your content.
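To see why, consider how a crawler reacts to the HTTP status code a server returns. The helper below is a hypothetical simplification (real crawlers handle many more cases), but it captures the basic outcomes:

```python
def classify_response(status_code):
    """Rough sketch of how a crawler might treat an HTTP status code."""
    if 200 <= status_code < 300:
        return "crawl content"    # page is readable
    if status_code in (301, 302, 307, 308):
        return "follow redirect"  # fine - unless the redirect target is broken
    if 400 <= status_code < 500:
        return "dead end"         # e.g. 404: the crawler stops here
    if 500 <= status_code < 600:
        return "server error"     # the crawler can't read the page
    return "unknown"

print(classify_response(200))  # → crawl content
print(classify_response(404))  # → dead end
print(classify_response(503))  # → server error
```

A redirect that points to a 404 page, or a server that keeps returning 5xx errors, leaves the crawler with nothing to read – and nothing to rank.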
Forms and gated content
Much like real-life users of the web, crawlers can’t access content restricted behind a form. So if content on your pages sits behind a login, or only displays after a form submission, web crawlers won’t be able to read it – which can cause crawlability issues.
Crawlability and SEO
So why does all of the above matter, and how does it link to SEO? Well, in organic search results, the search engine selects the most relevant pages, with the best ones appearing at the top of the results. How does it pick those pages? Through web crawling, of course. Crawling is the main way search engines learn what each page is about. So if your website has poor crawlability, the search engines don’t know what it’s about and cannot present relevant pages to your potential customers. It’s really that simple.
For more information about SEO, read our Back to basics: SEO post.
Forever is a digital marketing agency in Manchester specialising in SEO, PPC, content, branding, web design and development and social media.