Are You Making It Easy For Search Engines To Crawl Your Site?

For quality site paths that will attract the attention of search engines and ultimately drive user traffic, here are some suggestions to make your site more accessible for crawling.

After watching the progression of organic search marketing over the years, it is interesting to assess how much things have changed.

We went from keyword obsession to conversion optimization – literally from the first impression to the end of the conversion funnel. But there is a very important middle point: the entry into the site.

While the user is always the most important element of the conversation, as SEOs we have to consider how we pique search engine interest in order to earn more first impressions with users out in the search results.

From a search marketer’s perspective, before the user makes it to the site, we have to make entry easy for the search engines: a quick crawl of the site without hiccups, trips, or confusion.

If you are making strides to improve the quality of the site path for search engines, you are doing the same for users as well. This article compiles a quick list of considerations and site checkpoints you should employ to keep your site door wide open for search engine crawling.

Let’s take a look at what you need to be looking for and the tools that can help get you there.

The Connection

Tool: Pingdom Website Speed Test

[Screenshot: Pingdom Website Speed Test results]

Before we can even think about a search engine getting to your site, we need to think about how well your site is communicating with your server and the properties requesting page files.

The first check I want to make is to run Ping and Traceroute tests to understand if there are any issues with network connectivity.

While I am in this tool I also want to do a domain name server (DNS) check to find out if there are any domain authorization issues that are going to trip up the search engines before they even truly get in the door.
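
If you prefer to run these checks yourself rather than through Pingdom’s interface, the small Python sketch below covers the same ground with a DNS lookup, a ping, and a traceroute. The domain name is a hypothetical placeholder, and it assumes a Unix-like machine where the ping and traceroute commands are installed.

```python
# A minimal connectivity sketch: DNS lookup, ping, and traceroute.
# Assumes a Unix-like system; the domain is a hypothetical placeholder.
import socket
import subprocess

DOMAIN = "www.example.com"  # swap in your own domain

# DNS check: confirm the domain resolves before worrying about anything else.
try:
    ip_address = socket.gethostbyname(DOMAIN)
    print(f"{DOMAIN} resolves to {ip_address}")
except socket.gaierror as err:
    print(f"DNS lookup failed for {DOMAIN}: {err}")

# Ping: four echo requests to get a rough feel for latency and packet loss.
subprocess.run(["ping", "-c", "4", DOMAIN], check=False)

# Traceroute: see where along the network path any delays are introduced.
subprocess.run(["traceroute", DOMAIN], check=False)
```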

Another area you want to review is data straight from the source via Google Analytics. Looking within the Behavior>>Site Speed>>Page Timings section, you should review metrics such as Avg. Redirection Time, Avg. Domain Lookup Time, Avg. Server Connection Time, and Avg. Server Response Time.

[Screenshot: Google Analytics Site Speed page timings]

The Obvious

Tools: Google Search Console and Sitemap Writer Pro

I call this “the obvious” as it truly is the starting point for search engines on a site. It is the groundwork to SEO.

However, if you are unfamiliar with it, this area can stop you in your tracks rather quickly from an SEO standpoint.


You will want to take a look at Google Search Console>>Crawl>>robots.txt Tester to assess how Google understands your robots.txt file and what you want it not to see.

This is also a great opportunity for you to review your exclusions and understand if you are hiding search-critical content from crawler view.

Run your most important site pages through the tester and make sure you are not making one of SEO’s most fundamental mistakes.
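
As a quick supplement to the tester, here is a small Python sketch that uses the standard library’s robots.txt parser to confirm your most important URLs are not blocked for Googlebot. The domain and URLs are hypothetical placeholders.

```python
# Check a handful of critical URLs against the live robots.txt file.
# URLs below are hypothetical placeholders; substitute your own.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()

critical_urls = [
    "https://www.example.com/",
    "https://www.example.com/products/",
    "https://www.example.com/blog/",
]

for url in critical_urls:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'ALLOWED' if allowed else 'BLOCKED'}: {url}")
```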

Also make sure that you have a current XML sitemap for site pages, images, and video, submitted via Google Search Console and Bing Webmaster Tools, so search engines will always have a directory of all the site pages they need to visit regularly.
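
If you are not using a tool like Sitemap Writer Pro, a basic sitemap can also be generated with a few lines of Python using the standard library. This is a bare-bones sketch: the page URLs and lastmod dates are hypothetical examples, and a real sitemap would be built from your full page inventory.

```python
# Write a minimal XML sitemap for a handful of pages.
# The URLs and dates are hypothetical examples.
import xml.etree.ElementTree as ET

pages = [
    ("https://www.example.com/", "2016-01-15"),
    ("https://www.example.com/products/", "2016-01-10"),
    ("https://www.example.com/blog/", "2016-01-05"),
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, lastmod in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```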

Loading the Page

Tools: Pingdom Website Speed Test and Google PageSpeed Insights

Page-load time has been a ranking factor for years now. When it comes to loading pages, it is considered best practice to keep CSS and JavaScript in externally referenced files, and in as few files as possible.

I find that many people follow these best practices, but an area often forgotten is page files in the page-load sequence that are dead 404 files or files that redirect to another URL.

While a redirect isn’t the worst thing, it still takes time away from the search engine crawl. The dead files being requested are simply a foot stuck out to trip up the search engine.
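
One rough way to catch these is to check the status of each referenced file directly. The sketch below assumes the third-party requests library is installed, and the file list is a hypothetical example; in practice you would pull it from the page source or a waterfall report.

```python
# Flag dead (404) and redirected (301/302) files referenced by a page.
# The file list is a hypothetical example.
import requests

page_files = [
    "https://www.example.com/css/styles.css",
    "https://www.example.com/js/app.js",
    "https://www.example.com/images/logo.png",
]

for url in page_files:
    response = requests.head(url, allow_redirects=False, timeout=10)
    if response.status_code == 404:
        print(f"DEAD FILE: {url}")
    elif response.status_code in (301, 302):
        print(f"REDIRECT:  {url} -> {response.headers.get('Location')}")
    else:
        print(f"OK ({response.status_code}): {url}")
```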


We find ourselves using Pingdom’s speed test tool again, since it has a great waterfall view of the page-load sequence.

While certain page-load requests may simply be taking a long time, you can also look for the color-coded response codes that indicate dead or redirected files.

Also, since we are on the topic of redirected files, another way to simplify site crawling is to clean up redirecting internal links. Tools such as Xenu’s Link Sleuth and SEMrush Site Audit can give you insight into internal links that are broken or redirecting.
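
If you want a quick scripted supplement to those tools, the sketch below pulls the links from a single page and checks each internal target for a broken or redirecting response. It uses the third-party requests library plus the standard library, and the start URL is a hypothetical placeholder.

```python
# Collect the <a href> links on one page and flag internal links that are
# broken or redirecting. The start URL is a hypothetical placeholder.
from html.parser import HTMLParser
from urllib.parse import urljoin

import requests

START_URL = "https://www.example.com/"


class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(urljoin(START_URL, href))


collector = LinkCollector()
collector.feed(requests.get(START_URL, timeout=10).text)

for link in collector.links:
    if not link.startswith(START_URL):
        continue  # only audit internal links
    response = requests.head(link, allow_redirects=False, timeout=10)
    if response.status_code >= 400:
        print(f"BROKEN:   {link} ({response.status_code})")
    elif response.status_code in (301, 302):
        print(f"REDIRECT: {link} -> {response.headers.get('Location')}")
```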

Page Rendering

Tool: Google Search Console (Fetch as Googlebot and Blocked Resources)

You have made it to this point: your site responds well to requests and files load efficiently. But how does it look to the search engines? Just because a page loads your content for you doesn’t mean it doesn’t trouble a crawler.

From the Fetch as Googlebot section of Google Search Console, you will want to see if your pages render the same for the search engines as they do for users.

By reviewing this as well as the Blocked Resources section, you get a second chance to check whether page resource requests are timing out due to server inefficiencies or whether robots.txt exclusions are at play.
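
A simple script cannot replace Fetch as Googlebot, since it will not execute JavaScript the way Google’s renderer does, but a rough first check is to compare the raw HTML your server returns to a Googlebot user agent against what it returns to a browser. The URL and the size threshold below are hypothetical examples, and the sketch assumes the third-party requests library.

```python
# Compare the HTML served to a Googlebot user agent with the HTML served
# to a browser user agent. A large difference is worth investigating.
import requests

URL = "https://www.example.com/"  # hypothetical placeholder

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

as_googlebot = requests.get(URL, headers={"User-Agent": GOOGLEBOT_UA}, timeout=10)
as_browser = requests.get(URL, headers={"User-Agent": BROWSER_UA}, timeout=10)

print(f"Googlebot UA: status {as_googlebot.status_code}, {len(as_googlebot.text)} characters of HTML")
print(f"Browser UA:   status {as_browser.status_code}, {len(as_browser.text)} characters of HTML")

# An arbitrary threshold: flag anything that differs by more than ~1,000 characters.
if abs(len(as_googlebot.text) - len(as_browser.text)) > 1000:
    print("Large difference in HTML returned; investigate what Googlebot is being served.")
```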

[Screenshot: Fetch and render results in Google Search Console]

Content Duplication or Crawl-Waste

Tools: SEMrush Site Audit, SiteLiner, and manual review

The crawlers are in the door without roadblocks and they can crawl through the site.

I have already mentioned the considerations of redirected or broken internal links, and we could go deep into navigation linking, but one point I want to touch upon is crawl-waste and unknowingly hosting duplicate content.

Think about how crawlers spend their time on your site – do you want this time to be as quick as possible, or do you want to annoy them with pages that duplicate previously crawled content?

There are several tools out there that can identify duplicate content, from SiteLiner to SEMrush. However, I also routinely do “site:” searches, adding identifiers such as scraped copy sentences or title elements.

You can often find content that is duplicated on multiple pages of a domain or content duplicated across sub-domains of a domain property.
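
For a scripted first pass, you can also fetch a set of URLs, strip the markup, and hash the remaining text so exact duplicates surface immediately. This is a crude sketch: the URL list is a hypothetical example, it assumes the third-party requests library, and it will only catch pages whose copy matches exactly.

```python
# Hash the visible text of each page and report exact duplicates.
# The URL list is a hypothetical example; a sitemap is a natural source.
import hashlib
import re

import requests

urls = [
    "https://www.example.com/page-a",
    "https://www.example.com/page-b",
    "https://blog.example.com/page-a",
]

seen = {}
for url in urls:
    html = requests.get(url, timeout=10).text
    text = re.sub(r"<[^>]+>", " ", html)              # strip tags
    text = re.sub(r"\s+", " ", text).strip().lower()  # normalize whitespace
    digest = hashlib.md5(text.encode("utf-8")).hexdigest()
    if digest in seen:
        print(f"DUPLICATE: {url} matches {seen[digest]}")
    else:
        seen[digest] = url
```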

Conclusion

Of course we can’t run the entire gamut of what to consider in opening or closing the door to a search engine, but I have reviewed the common areas where I have seen sites making errors.

Hopefully you are now thinking about the search engine crawlers as they try to make it to your site: let them knock on the door, open it for them, and give them a fulfilling tour of your site.

Josh McCoy is Lead Strategist of SEO/PPC/Social Media at Vizion Interactive. You can connect with Josh on LinkedIn.
