You can’t truly have a successful search engine optimization (SEO) program without evaluating your performance over time and knowing how potential technical issues can affect your website from a crawling, indexing and user experience perspective.
The SEO Diagnostics for the Skilled Search Mechanic panel at SES New York last week was moderated by Kristjan Mar Hauksson (@optimizeyourweb), founder and director of Search & Online Communications at Nordic eMarketing. The panel brought together some very useful ideas on monitoring and getting the most out of your search results, being prepared to react and fix issues when needed, and knowing the ins and outs of technical SEO.
Real-Time SEO Diagnostics: Ways to Lead a Moving Target
To kick things off, Chris Boggs (@boggles), CMO of Internet Marketing Ninjas, Chairman of SEMPO, and a member of the SES Advisory Board, discussed why a proactive approach to SEO is important, and why being prepared to react at the right time is crucial to the success of your program. In fact, Boggs recently wrote a Search Engine Watch post on this same topic – SEO Diagnostics: Proactive and Reactive Diplomacy.
Boggs started off by saying there is no single needle in the haystack that will make your site's performance, or your digital marketing program as a whole, successful. Over time you need to establish the right steps and find the tools that make the most sense for your business.
When it comes to seeing whether your brand has authority in Google’s eyes, here are three things you can evaluate:
- Do sitelinks show up under your number-one listing in brand search results?
- Does your brand merit a “Brand Seven-Pack”?
- How strong is your brand's social presence?
Touching more on the social presence piece, Boggs went on to say that if your Google+ Business page already shows up on the right side of your brand search results, you have the ability to communicate directly in real time with your target audiences, which is great. That placement is itself a sign that your social presence is strong; you can also look at the rest of the seven-pack, when and if it appears, to see whether your social channels show up there as well.
So what other proactive tactics did Boggs recommend? Here are some additional tips:
- Type site:yourdomain.com into Google to see how your website is doing. This shows you which pages of your site come back, and in what priority. Also notice what types of universal results are showing up (e.g., video, news, images).
- Take data from Google and Bing Webmaster Tools (WMT) and try to validate what you’re seeing there in your web analytics tools (e.g., Google Analytics). For example, if you notice you have a slow page load time in WMT, then that might explain why you have a high bounce rate for the same page in Google Analytics. Take the time to really understand your WMT data.
Shifting focus to reactive SEO tactics, Boggs offered the following advice, summed up in his PREPARE acronym, for when things don't go exactly as planned:
- Prepare for eventual SEO problems (and don't blame Google, Bing, or your SEO).
- Readiness, recognition, and multidisciplinary response to disasters and emergencies.
- Educate all stakeholders, especially those with control over code or content.
- Promote an effective workflow to be able to treat issues in such emergencies.
- Alert all core personnel and stakeholders (don’t try to hide it).
- React swiftly and decisively.
- Evaluate remedial measures in real-time.
Summary of Technical SEO
Adam Audette (@audette), Chief Knowledge Officer at RKG, was up next with his information-packed presentation on technical SEO.
Audette started off by saying that we need to be strategic about what we're doing and why we're doing it. Technical SEO may have a lower impact on results than other work, but it's highly dependable because it's within your control.
Before making any changes to your site, justify your recommendations by putting together projections for your key stakeholders. For example, take the number of pages you expect the changes to affect, multiply by average conversion rate, average revenue per page, or average leads per visit, and articulate the likely increase in traffic, conversions, and ultimately revenue.
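As a toy version of that math (every figure below is invented for illustration, not from the talk):

```python
# Hypothetical projection; all numbers are illustrative
pages_impacted = 500          # pages the recommended change touches
avg_revenue_per_page = 12.0   # average monthly revenue per page, in dollars
expected_lift = 0.10          # projected 10% improvement from the fix

projected_monthly_gain = pages_impacted * avg_revenue_per_page * expected_lift
print(projected_monthly_gain)  # 600.0
```

Even a rough figure like this gives stakeholders something concrete to weigh against the cost of the work.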
Audette touched on several technical areas.
Canonical Signals
Audette pointed out two main issues with canonical signals:
- Canonical tags that are not part of the internal link profile, meaning the URL in the canonical tag is never used in any of the site's internal links.
- Self-referencing canonical tags that point to non-canonicals.
What you want is consistent signals with your URLs across your site. So when search engines start crawling the pages you want everything to line up, from the links in your navigation, to internal links, external backlinks, canonical tags, XML files, etc. If you’re able to accomplish this it’s incredibly powerful as a signal to search engines and works very well.
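As a quick illustration of lining those signals up (URLs here are hypothetical), the canonical tag on a page should echo the exact URL used everywhere else:

```html
<!-- On https://www.example.com/widgets/blue-widget -->
<!-- The canonical URL is the same URL used in site navigation, internal links,
     and the XML sitemap, not a variant with parameters or a different host -->
<link rel="canonical" href="https://www.example.com/widgets/blue-widget">
```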
Pagination
Audette touched on two generally accepted best practices for pagination:
- View All: if you have a fast-loading View All page that contains all of the products or items from the paginated pages, you can simply rel=canonical all of your pages back to the View All.
- If you don't want to use a View All, you can use rel="prev" and rel="next", along with self-referencing canonicals, to handle any duplicate content.
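For the second approach, page 2 of a hypothetical paginated category could carry markup like this (URLs are placeholders):

```html
<!-- On https://www.example.com/shoes?page=2 -->
<link rel="canonical" href="https://www.example.com/shoes?page=2">
<link rel="prev" href="https://www.example.com/shoes?page=1">
<link rel="next" href="https://www.example.com/shoes?page=3">
```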
Handling Duplicate Content
Audette discussed two common problem areas when it comes to handling duplicate content:
- Using robots.txt to handle duplicate content. This is usually a bad idea: search engines can't crawl what's excluded in robots.txt, so those pages pass no equity.
- Having an m. subdomain for mobile content that competes against desktop content. The desktop version of your content needs a rel="alternate" tag, which can be placed in sitemaps or on the page itself, and the mobile version should use rel="canonical" pointing back to the desktop URL.
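A minimal sketch of that pairing, with placeholder URLs:

```html
<!-- On the desktop page, https://www.example.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the mobile page, https://m.example.com/page -->
<link rel="canonical" href="https://www.example.com/page">
```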
Audette then went on to discuss six tools for managing duplicate content:
- Rel=canonical: Your best option, but pages have to be highly equivalent for this to work right
- Meta noindex (follow, nofollow): Nice way to exclude content from searches
- Robots.txt: As mentioned above, not the best idea
- Nofollow (link attribute): Works best as a link-level tool
- Webmaster Tools parameters: You can have a lot of success using this, but your URLs need true key=value pairs
- Rel="prev"/rel="next": For paginated series, as discussed above
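To make the noindex option concrete: unlike a robots.txt block, a meta noindex still lets engines crawl the page and follow its links, so equity continues to flow while the page stays out of search results:

```html
<!-- Crawled and followed, but excluded from the index -->
<meta name="robots" content="noindex, follow">
```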
Audette had three recommendations for dealing with faceted navigation, which is a powerful and easy way for users to narrow down what they want on your site by selecting options. They were:
- Identify search facets vs. overhead facets, and append the overhead stuff as parameters at the end of URLs if possible.
- Always force the canonical path regardless of selection order.
- Build URLs intuitively based on how people search.
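Forcing the canonical path means the order in which a user happens to click facets never produces a second URL. A sketch with made-up URLs:

```
Selection order: size, then color  ->  /shoes?color=red&size=9
Selection order: color, then size  ->  /shoes?color=red&size=9

Either way the site emits one consistent URL (and canonical tag),
with facet parameters in a fixed order.
```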
Product Variations & Inventory
Audette then went on to identify three basic options for handling product variations:
- Use unique URLs for each one.
- Create unique attribute-specific versions of the URL and then rel=canonical back to the attribute-agnostic version, so that’s the one that ranks.
- Put all variations in the interface so the URL does not change no matter what options are selected (the best way in Audette’s opinion).
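The second option might look like this (URLs are invented):

```html
<!-- On the attribute-specific URL, https://www.example.com/widget?color=blue -->
<!-- The canonical points at the attribute-agnostic URL so that version ranks -->
<link rel="canonical" href="https://www.example.com/widget">
```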
So what happens when products expire and don't come back? If you have review data (i.e., user-generated content) for a product you no longer carry and want to keep leveraging it, display a message on the page that the product is no longer carried, and link to a similar product.
Crawling & Response Codes
There are two main issues that Audette identified here:
- 500 server errors still present significant problems on sites. Every website has finite crawl resources, and once you clean up the overhead junk (duplicate pages, crawl errors, etc.) you help the crawl immensely.
- Another thing you can do to help your crawl efficiency is to monitor your indexation accurately using segmented XML sitemaps: split your sitemaps into categories, then use the site:, inurl:, and intitle: search operators in Google. Log the data every week or every month; over time this will help you pinpoint crawl issues and fix them more easily.
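A segmented setup could use a sitemap index that splits URLs by category (filenames here are hypothetical), so indexation can be measured per segment:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.example.com/sitemap-products.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-categories.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-blog.xml</loc></sitemap>
</sitemapindex>
```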
You can also help your crawl efficiency by returning a 304 Not Modified response for pages that haven't changed in a while.
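The 304 flow depends on conditional requests: the crawler sends back the validator it saw last time, and the server answers with a body-less 304 if nothing has changed. Roughly:

```http
GET /about HTTP/1.1
Host: www.example.com
If-Modified-Since: Tue, 12 Mar 2013 08:00:00 GMT

HTTP/1.1 304 Not Modified
```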
Site Speed
Audette highlighted seven main tips for fast sites:
- Gzip HTTP compression, which decreases file transfer size.
- Set a far-future Expires header, which allows browsers to cache content and assets.
- Use the asynchronous GA code (if you use Google Analytics), which is much quicker.
- Load CSS via link elements and avoid @import, to allow parallel downloading.
- Specify a character set, so browsers can begin parsing/executing faster.
- Avoid img tags with empty src attributes, since they still generate HTTP requests.
- Set image dimensions and compress your images, which reduces page load time.
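Several of these are one-line server settings. A sketch for Apache, assuming mod_deflate and mod_expires are enabled:

```apacheconf
# Gzip text responses (mod_deflate)
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Far-future Expires headers so browsers cache static assets (mod_expires)
ExpiresActive On
ExpiresByType image/png "access plus 1 year"
ExpiresByType text/css  "access plus 1 month"
```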
Mobile
Companies are seeing more of their overall traffic coming from users on mobile devices, so what do you do to provide a good user experience? Responsive design is in fashion and is a great idea, but it's not for every business. What other options are there? You can:
- Dynamically serve mobile content using the Vary header.
- Use a subdomain – but make sure you put rel alternate and rel canonical in place on the desktop and mobile pages, respectively.
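With dynamic serving, one URL returns different HTML depending on the requesting device; the Vary header tells caches and crawlers that the response varies by user agent:

```http
HTTP/1.1 200 OK
Content-Type: text/html
Vary: User-Agent
```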
Finally, Audette covered some of the advantages of responsive design, which include:
- Having a single URL for every page.
- No redirection is necessary.
- You can have more efficient crawls.