Reader Q&A: June 2004

Answers to questions from Search Engine Watch readers.

Q. I recently read an article about PageRank that said that outgoing links are a drain on a site’s total PageRank. They leak PageRank. Is this true?

A. I don’t believe this to be the case, myself. First, it’s important to understand that what Google has published in terms of how PageRank is calculated is old and almost certainly not the current formula they use.

Having said this, you’ll still hear Google today talk about the fact that if a page has a particular PageRank score, then each outbound link on that page gets a percentage of that score. Is your page a PR8 with 10 outbound links? Then, simplistically, each link gets 1/10th of the score. Have 100 links? Then each link gets 1/100th of the score.

Given this, it’s possible that by linking out too much, you’re reducing the share of PageRank you pass on to your own internal pages. That really isn’t a “drain” on the page carrying the links itself; rather, it simply helps each page being linked to a little less.
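The simplistic split described above can be sketched in a few lines of code. To be clear, Google’s real formula is more complex and unpublished, so treat this purely as an illustration of the even-division idea:

```python
# A toy illustration of the simplistic split described above: a page's
# PageRank score is divided evenly among its outbound links. Google's
# actual, current formula is not public, so this is illustration only.

def link_share(page_rank, outbound_links):
    """Each outbound link's share of the page's score, split evenly."""
    if outbound_links == 0:
        return 0.0
    return page_rank / outbound_links

print(link_share(8, 10))   # a PR8 page with 10 links: 0.8 per link
print(link_share(8, 100))  # the same page with 100 links: 0.08 per link
```

Doubling the number of links halves each link’s share, which is why heavy linking dilutes what any single internal page receives.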

You could worry about this and try to do various things to hide or minimize the links. I’d say, however, that you should instead link in whatever way you think makes sense for your users. Pages normally do have outbound links, and that normality is part of what Google and other search engines try to factor into their ranking algorithms.

The Link Analysis and Link Building article goes into more depth about link building, the issue of internal links, outbound links and more. Be sure to give it a read for more advice.


Q. I am looking to find the source for statistics and quotes like the following: “As of 2002, it was the most popular search engine, handling upwards of 80 percent of all internet searches through its web site and clients like Yahoo and AOL.”

A. I keep archives of old stats like this in the Search Engine Watch Archives area and current stats from various providers can be found in the Search Engine Ratings & Reviews section.


Q. The top result from Google for a query shows a page of text for less than a second, then a new page comes up. Is this cloaking?

A. It’s not cloaking. In this particular case, the page is using JavaScript to quickly load a new page. Here’s a trick to catch this. Right-click on the link in Google and save the page to your hard drive. Then open the saved page in Notepad or another text editor. Look at the top, and you’ll see the JavaScript section. Delete that part and save the file. That should stop the fast reload, and you’ll see the original page.

In this case, it looks to be a doorway page aimed at doing well for this term. Overall, it’s what Google warns against as a “sneaky redirect.” If you want, you can report the page as spam using this form at Google.
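If you’d rather automate the manual check described above, a short script can scan a saved page’s source for common JavaScript and meta-refresh redirect patterns. This is a rough sketch, not a definitive cloaking detector; the pattern list is just a sample of the usual suspects:

```python
import re

# A rough sketch: scan saved HTML source for common redirect patterns,
# automating the "look at the top of the page" check described above.
# The pattern list is illustrative, not exhaustive.
REDIRECT_PATTERNS = [
    r'window\.location\s*=',
    r'location\.replace\s*\(',
    r'location\.href\s*=',
    r'<meta[^>]+http-equiv=["\']refresh["\']',
]

def find_redirects(html_source):
    """Return any redirect-looking snippets found in the page source."""
    hits = []
    for pattern in REDIRECT_PATTERNS:
        hits.extend(re.findall(pattern, html_source, re.IGNORECASE))
    return hits

page = '<html><head><script>window.location = "http://example.com/real";</script>'
print(find_redirects(page))  # one hit: the window.location assignment
```

A page that triggers one of these patterns within a second of loading is a good candidate for a closer manual look.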


Q. Do you know where to buy a database of recent actual queries made to a top search engine?

A. WordTracker has a contract to receive regular updates of data from InfoSpace, which operates a number of major search engines. That’s the only major database you can purchase. Of course, both Overture and Google offer free tools that let you do some limited research. A guide with links to these tools for various countries can be found here.


Q. I notice that your guide to meta tags puts the meta name portion first, followed by the content portion. My web building tool does the opposite, putting the content portion before the name. Does it make a difference?

A. I looked into this several years ago and found that it seemed to make no difference. My assumption is that this is still the case today. The simple test is to see if your pages are showing up with the descriptions you want on a search engine that largely supports the meta description tag. Unfortunately, none of them do it exactly the same, as this article details.
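One quick way to see why attribute order shouldn’t matter is to run both forms through an HTML parser; a compliant parser records the same name/content pair either way. This sketch uses Python’s standard-library parser, with a made-up sample description:

```python
from html.parser import HTMLParser

# A quick check, using Python's standard HTML parser, that attribute
# order inside a meta tag doesn't change what a parser sees. The
# description text here is a made-up sample.

class MetaReader(HTMLParser):
    def __init__(self):
        super().__init__()
        self.metas = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            self.metas.append(dict(attrs))

name_first = MetaReader()
name_first.feed('<meta name="description" content="A sample page.">')

content_first = MetaReader()
content_first.feed('<meta content="A sample page." name="description">')

print(name_first.metas[0] == content_first.metas[0])  # True
```

Of course, this only tells you how parsers treat the markup; whether a given search engine displays your description is a separate question, which is why checking your actual listings remains the real test.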


Q. Do you still think that it is better to have static pages instead of dynamic pages? I know that search engines sometimes choke (as you say) on them, but what about large engines like Google? Do you believe that they punish sites for having them?

A. It’s not that dynamic pages are bad simply because they are dynamic but rather that search engines may avoid them to stay out of crawling trouble. More and more, search engines have grown braver about indexing dynamic pages of all types. Overall, if you find your pages aren’t getting in, a number of workarounds described in this article often can solve the problem.


Q. Say Company X sets up an ad and limits their budget so that it allows only 20 clickthroughs a day. Now what is to stop Company Y, a competitor of Company X, from running a search and clicking on Company X’s listing, burning through its budget each day? In fact, wouldn’t it be possible to set up a script/program/spider to do this automatically? Have you ever heard of this happening? Surely there must be competitors out there ruthless enough to do this?

A. This does indeed happen. Google and Overture both have systems in place to watch for many types of click fraud. However, it’s important that marketers also monitor for oddities, just as you might check your credit card statement. If you see something that’s not right, such as a sudden spike in costs for no apparent reason, contact the search engine immediately for help, an investigation and a possible refund. Search Engine Watch has also covered this issue in past articles.
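The kind of monitoring suggested above can be as simple as flagging days whose click counts spike far beyond the recent norm, much like scanning a credit card statement for odd charges. Here is a minimal sketch; the threshold and the sample click counts are made-up illustrations, not recommended values:

```python
# A minimal monitoring sketch: flag days whose click counts far exceed
# the average of the other days. The threshold of 3x and the sample
# data are made-up illustrations, not recommended settings.

def flag_spikes(daily_clicks, threshold=3.0):
    """Return indexes of days whose clicks exceed threshold times the
    average of all the other days."""
    flagged = []
    for i, clicks in enumerate(daily_clicks):
        others = daily_clicks[:i] + daily_clicks[i + 1:]
        if not others:
            continue
        baseline = sum(others) / len(others)
        if baseline > 0 and clicks > threshold * baseline:
            flagged.append(i)
    return flagged

clicks = [20, 22, 19, 21, 95, 20, 18]
print(flag_spikes(clicks))  # [4]: the 95-click day stands out
```

A real campaign dashboard would segment by keyword and time of day, but even a crude check like this catches the sudden-spike pattern described above early enough to raise it with the search engine.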

