Yahoo Slurp Adds Wildcard Support For Robots.txt

The Yahoo Search Blog announced that Yahoo’s web crawler, aka Yahoo Slurp, now supports wildcards in the robots.txt file. The two characters Yahoo now supports are “*” and “$”. The * tells Yahoo Slurp to wildcard-match a sequence of characters in your URL, while the $ anchors the match to the end of the URL string. Many more details are available at the Yahoo Search Blog.
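
To see how this might look in practice, here is a hypothetical robots.txt sketch (the paths are invented for illustration and are not from Yahoo's announcement) that uses both characters in the way the announcement describes:

User-agent: Slurp
# * matches any sequence of characters, so /private-files/ and /private2/ would both be blocked
Disallow: /private*/
# $ anchors the match to the end of the URL, so only URLs ending in .pdf are blocked
Disallow: /*.pdf$

Keep in mind that other crawlers may interpret or ignore these wildcard rules differently, so directives like these only apply to bots that support the extended syntax.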
