Web 2.0 Technologies and Search Visibility

CSS, Ajax and Web 2.0—more than mere buzz words, these are advancing technologies that bring vast improvement in the design and usability of web sites. What about search engines? Do these new technologies help search visibility or hurt it?

A special report from the Search Engine Strategies conference, December 4-7, 2006, Chicago, IL.

There was no shortage of experts to explore this topic on a panel entitled “CSS, Ajax, Web2.0 and Search Engines” at the recent Search Engine Strategies conference held in Chicago. On hand were Shari Thurow of Grantastic Designs, Jim McFadyen of Critical Mass, Scott Orth of Selytics, Amit Kumar from Yahoo and Dan Crow from Google.

CSS, Ajax and Web 2.0 Defined

First, let’s define what CSS, Ajax and Web 2.0 are and how they apply to web sites. For this I’ll turn to the Wikipedia definitions of each term.

CSS, short for Cascading Style Sheets, is a style sheet language used to describe the presentation of a document written in a markup language. Its most common application is to style web pages written in HTML and XHTML, but the language can be applied to any kind of XML document, including SVG and XUL.

Ajax (the technology, not the cleanser), short for Asynchronous JavaScript and XML, is a web development technique for creating interactive web applications. The intent is to make web pages feel more responsive by exchanging small amounts of data with the server behind the scenes, so that the entire web page does not have to be reloaded each time the user makes a change.

Web 2.0 is a phrase coined by O’Reilly Media in 2004 which refers to a supposed second generation of Internet-based services – such as social networking sites, wikis, communication tools, and folksonomies – that emphasize online collaboration and sharing among users.

What are some of the common uses of these technologies in web development?

Shari Thurow pointed out that CSS is most commonly used to control font styles and attributes as well as page layout, such as determining the exact position of elements without bulky tables. Because CSS is typically contained in a single file, one can change the look and feel of a site very quickly. Pages also tend to load faster: redundant presentation code is removed from the HTML, and once the style sheet is cached by the web browser, it does not need to be downloaded again and again.
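To illustrate the point (a minimal sketch; the filename and selectors below are invented for the example), a single external stylesheet, referenced from every page with `<link rel="stylesheet" type="text/css" href="/site.css" />`, can control fonts and exact positioning without layout tables:

```css
/* site.css: edit this one cached file and the whole site's look changes. */
body     { font-family: Verdana, sans-serif; font-size: 12px; }
#menu    { position: absolute; top: 0; left: 0; width: 180px; }
#content { margin-left: 200px; }
```

Because the browser downloads the file once and then serves it from cache, every subsequent page carries only its content markup.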

Jim McFadyen explained that Ajax allows the browser to communicate with a web server without having to reload the page. Sliders that change content and forms that expand or auto-populate are a couple of examples. Ajax uses XHTML and CSS for presentation; it is not a programming language, nor does it need to be downloaded or installed.
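The exchange Jim describes can be sketched as follows. This is a hedged illustration, not code from the panel: `fakeServer` stands in for a real HTTP endpoint (in a browser the request would go through the `XMLHttpRequest` object), and the “element” is a plain object so the sketch runs outside a browser.

```javascript
// Simulated server round-trip: in a real page this would be an
// asynchronous XMLHttpRequest to the web server.
function fakeServer(query, callback) {
  setTimeout(function () {
    callback('Results for "' + query + '"');
  }, 0);
}

// Patch one small region of the page with the server's answer;
// nothing else on the page reloads.
function updateRegion(region, query, done) {
  fakeServer(query, function (responseText) {
    region.innerHTML = responseText;
    if (done) done(region);
  });
}

// Usage: a stand-in for a DOM element such as a results <div>.
var resultsDiv = { innerHTML: '' };
updateRegion(resultsDiv, 'web 2.0', function (r) {
  console.log(r.innerHTML); // prints: Results for "web 2.0"
});
```

The key idea is that only `resultsDiv` changes; the rest of the page, and the user’s place on it, stays put.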

And what about Web 2.0? Is it just a popular buzz word or does it hold additional value? Scott Orth presented a case study to demonstrate that Web 2.0 is all about the user experience. The case study showed how a static site was improved by adding dynamic tools and demos. The site also featured an inquiry form that delivered instant results rather than the typical “thank you, someone will be in touch shortly” message. How ingenious!

This all sounds wonderful but what about the search engines? Are these three technologies good, bad or just plain ugly to the search engines?

CSS falls under both the “good” and “bad” categories. Its advantages are numerous: it allows one to manage the style of web pages with ease, makes for faster downloads, and so on.

What about the disadvantages, or even abuses, of CSS? One disadvantage is that CSS-formatted hyperlinks can come to dominate the content of a web page, making the content appear unfocused. Other issues occur with text wrapping when users override a site’s CSS. CSS can also be used to hide text or abuse header tags, which falls under the category of web spam. Some designers will use CSS to position content first in the HTML, thinking it helps with rankings (it doesn’t). Finally, using coordinates to hide text, or stacking chunks of content on top of each other, can be disadvantageous because engines can detect this as a spam technique.
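For illustration, the hidden-text abuses mentioned above often look something like this (a sketch of what not to do; class names are invented):

```css
/* Both rules hide keyword-stuffed text from visitors while leaving it
   in the HTML for crawlers; engines treat such tricks as web spam. */
.hidden-keywords    { display: none; }
.offscreen-keywords { position: absolute; left: -9999px; top: -9999px; }
```

The same properties have perfectly legitimate uses, which is why engines look at intent and patterns rather than banning the techniques outright.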

Ajax would simply fall under the category of “ugly,” not because it makes for ugly web pages, but because it is invisible to search engines. Search engine crawlers do not execute JavaScript, and Ajax relies on JavaScript to function, so engines will not see Ajax-delivered content. For example, if your navigation is delivered with Ajax and is the only way to move through the site, engines will not be able to crawl beyond the first page. The same is true of content: if it is delivered by Ajax, search engines will not see it.

Jim provided an example of a popular web site that breaks every SEO rule—Gucci.com. Pages look great but with JavaScript turned off, all you get is a blank page. All the content and links are served through Ajax which works great for end users but is completely invisible to search engines.

So how can one enjoy the benefits of Ajax while pleasing the search engines at the same time? The simple answer is to make sure navigation and content are also available in plain HTML. This will not only help with search visibility but will also serve end users who browse with JavaScript turned off. It is really about accommodating everybody: using Ajax features to enhance the web site while ensuring HTML content remains accessible to those who cannot execute Ajax commands, whether search engines or users.
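In practice (a hypothetical sketch, not code shown at the session), that means navigation exists as ordinary HTML links first, and Ajax only layers behavior on top:

```html
<!-- Crawlable navigation: plain HTML links work with JavaScript off,
     so search engines (and users without JS) can still reach every page. -->
<ul id="nav">
  <li><a href="/products.html">Products</a></li>
  <li><a href="/contact.html">Contact</a></li>
</ul>
<!-- A script can then intercept these clicks and load the same pages via
     Ajax as an enhancement; the href remains the fallback either way. -->
```

Because every destination has a real URL, the site degrades gracefully instead of going blank the way Gucci.com does.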

Web 2.0 would fall under the “good” category. Because it is more about a particular user experience, sites that qualify as Web 2.0 properties are typically not at a disadvantage when it comes to search visibility, provided the aforementioned pitfalls of CSS and Ajax are kept in mind. In Scott Orth’s case study, his client saw a 97% improvement in rankings, a 53% jump in traffic and a 59% increase in targeted conversions.

The search engine representatives offered some additional tips that provided a good conclusion to the topic. Yahoo’s Amit Kumar added that while technology is awesome, simplicity is also crucial so engines can understand the content of a page. Google’s Dan Crow explained that using CSS, Ajax and Web 2.0 technologies with workarounds will accommodate search engines in their current state, but one should also be prepared for a future in which search engines can better comprehend these technologies.

David Wallace is CEO and founder of SearchRank, an original search engine optimization and marketing firm.
