8 Reasons Your Reconsideration Request Will Fail
Google has made the process of applying for reconsideration of a site very difficult. If you're one of the unlucky ones struggling to get a penalty removed, here are eight common mistakes to avoid so that your reconsideration request can succeed.
Removing a manual unnatural links penalty can be extremely difficult. If you’ve gone through the process then you’re likely familiar with the mixture of excitement and fear that comes with seeing “All Messages (1)” in Google Webmaster Tools after filing for reconsideration.
Will you see “Manual Spam Action Revoked”? Or will it be the dreaded, “Links to your site violate Google’s quality guidelines”? Getting the latter message means that your reconsideration request has not been successful and that more work needs to be done in order to lift your penalty.
Unfortunately, Google doesn’t give much direction when it comes to understanding why a reconsideration request was denied. They may give some examples of unnatural links, but not always. Sometimes absolutely no explanation is given as the reason for failure.
One thing I do on a regular basis is review the reconsideration requests for businesses that haven’t been able to lift their penalty. Here are eight of the most common reasons why reconsideration requests have failed.
1. Not Enough Unnatural Links Addressed
This is, by far, the most common reason for a reconsideration to fail. It isn’t enough to go after just the worst links, or even most of the unnatural links. In most cases, Google wants to see that you have identified close to 100 percent of your unnatural links.
Site owners in forums often proclaim things like, “I removed 3,000 links and Google still failed me!” In one such case, the site owner had indeed removed 3,000 links, all from really low-quality directories, bookmarks, and comment spam. However, hundreds of articles containing unnatural links to their site were still live. The site owner felt those links were natural because the sites that published the articles had made an editorial choice to publish them. In reality, it was a link scheme, and that site won’t get its penalty removed until those links are addressed.
(As an aside, “addressed” doesn’t necessarily mean, “removed”. What Google wants to see is that you have identified which links are unnatural and that you have made a thorough attempt to get the links removed. And if you couldn’t remove the links, then you have disavowed the links.)
If your reconsideration requests are repeatedly failing, it’s worthwhile to have someone take an objective look at your link profile. There may be links that you feel are natural, but Google does not.
Sometimes, posting a question in the Google Webmaster Forums can help. The volunteers there will give their opinion on your backlink profile. However, be warned that the responses you get in this forum can sometimes be a little snarky, and there is no guarantee that the answers will be completely correct. That said, you may uncover links that you felt were natural but really aren’t natural in Google’s eyes.
2. Not Enough Effort Was Made to Remove Unnatural Links
If you have a manual unnatural links penalty, it isn’t enough to just disavow your bad links. Google needs to see that you have thoroughly tried to remove as many unnatural links as possible.
If you have the login credentials to directories and articles where you have submitted links, then log in and remove them. Tell Google in your reconsideration request that you have done this. It may even help to create a Google Docs spreadsheet that lists each domain and the date on which you removed your links or deleted your account, and that flags the ones where you were unable to log in or had other difficulties.
If you don’t control the pages where your links reside, then you need to do all you can to contact the site owner. Show Google in your spreadsheet that you have collected any email addresses you could find, the whois email address and also the URL of any contact form, Twitter page or Facebook page. Then, document the date on which you tried this form of contact.
I usually like to add additional documentation about my communications with site owners as well by entering things into my spreadsheet like, “Site owner said they will remove links but still has not been done – re-emailed,” or “Site owner wanted $100 to remove a link – will disavow”. This shows Google that you truly did spend time contacting webmasters.
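The outreach log described above can be kept consistent with a small script. Here is a minimal Python sketch of one possible approach; the function name, the CSV column layout, and the status strings are all my own assumptions, not a Google requirement, so adapt them to whatever evidence you actually collect. The resulting CSV can be imported into a Google Spreadsheet.

```python
import csv
import os
from datetime import date

# Hypothetical column layout -- adapt to the evidence you collect.
FIELDS = ["domain", "contact_email", "whois_email", "contact_form_url",
          "date_contacted", "status"]

def log_contact_attempt(path, domain, status, contact_email="",
                        whois_email="", contact_form_url=""):
    """Append one outreach attempt to the removal-documentation spreadsheet."""
    new_file = not os.path.exists(path)
    with open(path, "a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()  # write the header row once
        writer.writerow({
            "domain": domain,
            "contact_email": contact_email,
            "whois_email": whois_email,
            "contact_form_url": contact_form_url,
            "date_contacted": date.today().isoformat(),
            "status": status,
        })
```

For example, `log_contact_attempt("outreach.csv", "spammy-articles.net", "Site owner wanted $100 to remove a link – will disavow")` adds one dated row, so every attempt is timestamped automatically.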
I used to advocate showing Google a copy of every email that we sent. However, John Mueller recently said in a hangout that this was probably not necessary.
3. Not Enough Explanation
When writing a reconsideration request, you don’t need to write a novel, but you do need to write enough to convince the webspam team that you have been working hard to remove as many unnatural links as possible. Here is a Webmaster Forum thread where John Mueller says the following to a site owner who has failed at reconsideration:
One of the things we noticed was that your last reconsideration request was a bit short & basic, which when taken on its own, gave the team a wrong impression about the steps you’ve taken to resolve this issue on the web….My general recommendation for reconsideration requests would be to make sure that you’re really submitting the right & relevant information there, so that it’s clear to those processing the request what steps you’ve taken to resolve this issue: linking to the doc you mentioned is great, linking to a forum discussion is great, providing more context in the message directly is also very useful.
4. Problems With Your Disavow File
In many cases, Google will tell you, when you submit your disavow file, whether there is an error in your file.
In the past, if you had submitted a .rtf file instead of a .txt, or if you had forgotten to use the domain: operator when you wanted to disavow an entire domain, you would receive no indication from Google that your disavow file would not work.
Now, after you upload the file, if there is a problem with its syntax, you will receive an error message.
Even with these warnings, it is still possible for a reconsideration request to fail because of a disavow file problem. In one situation, a reconsideration failed and Google gave example links that the site owner had attempted to remove, and when unsuccessful, had put in their disavow file. Or so they thought.
It turned out that the disavow file was missing every domain that started with the letters K to Z. This was most likely because of a glitch in Google Docs that sometimes causes only 200 cells of a spreadsheet to be pasted when using the copy and paste function. Only the domains starting with 0-J were transferred, which meant that a huge number of domains were missing from the disavow file.
Always double check that your disavow file contains all of the domains and URLs you want to disavow!
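A copy-paste glitch like the one above is exactly the kind of error a quick script can catch. Below is a minimal Python sketch that cross-checks your outreach spreadsheet (exported as CSV with a `domain` column, an assumed layout on my part) against your disavow file and reports any domains that fell through the cracks:

```python
import csv
import re

def load_disavow_domains(disavow_path):
    """Collect every domain the disavow file actually covers."""
    domains = set()
    with open(disavow_path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # skip blanks and comments
            if line.lower().startswith("domain:"):
                domains.add(line[len("domain:"):].strip().lower().removeprefix("www."))
            else:
                # URL-level entry: pull out the host name
                m = re.match(r"https?://([^/]+)", line)
                if m:
                    domains.add(m.group(1).lower().removeprefix("www."))
    return domains

def find_missing(spreadsheet_csv, disavow_path):
    """Return domains listed in the spreadsheet but absent from the disavow file."""
    covered = load_disavow_domains(disavow_path)
    missing = []
    with open(spreadsheet_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            d = row["domain"].strip().lower().removeprefix("www.")
            if d not in covered:
                missing.append(d)
    return missing
```

If `find_missing("unnatural-links.csv", "disavow.txt")` returns anything at all, fix the disavow file before you resubmit.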
5. Disavowing on the URL Level Instead of the Domain Level
I almost always disavow links on the domain level. The rare exception would be for sites where we had an unnatural link from one page but I felt we could get natural links from other pages on this site. This doesn’t happen very often.
Here’s an example that explains why it’s important to disavow on the domain level. Let’s say I have a paid link embedded in a blog post of a site that is based on the WordPress CMS, and that link exists on a URL like this (a hypothetical address for illustration):
http://example.com/2014/03/great-blog-post/
The best way to disavow this link would be to include the following line in my disavow file:
domain:example.com
But, let’s say that I included only the URL itself in my disavow instead:
http://example.com/2014/03/great-blog-post/
Because this site is WordPress based, there is a good chance that the same link to my site also exists on archive, category, tag, and author pages such as:
http://example.com/2014/03/
http://example.com/category/news/
http://example.com/author/admin/
…and many more.
Similarly, some sites may serve the same page on a www version, a non-www version, and an https:// version.
Disavowing on the domain level will take care of all of these links at once.
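If you already have a list of unnatural link URLs, you can collapse them to domain-level entries programmatically rather than by hand. This is a minimal sketch of one way to do it (the function name is my own; the `www.` stripping reflects the point above about www and non-www variants resolving to the same site):

```python
from urllib.parse import urlparse

def to_disavow_lines(urls):
    """Collapse a list of unnatural link URLs into domain-level disavow entries."""
    domains = set()
    for url in urls:
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):
            host = host[len("www."):]  # www and non-www collapse to one entry
        if host:
            domains.add(host)
    return sorted(f"domain:{d}" for d in domains)
```

For example, feeding it the www, non-www, and https variants of pages on the same site yields a single `domain:` line covering them all.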
6. Google Docs Wasn’t Used for Communicating Evidence
In this video, Google’s Matt Cutts explains that the webspam team is reluctant to open up external files for fear of getting malware. He recommends that you keep everything in Google Docs.
It isn’t a good idea to use an Excel spreadsheet to document your evidence. You can upload an Excel spreadsheet to Google Docs, but I find these files are often difficult to work with when viewed there. If possible, it really is best to present your information in a native Google Spreadsheet.
I have also seen failed reconsiderations where site owners didn’t document their attempts at contacting site owners well. Instead, they gave the password to their Gmail account and invited the webspam team to view their Sent folder. It’s very unlikely that a Google employee is going to do this, mostly for privacy reasons.
Instead, document everything really well in a Google Doc Spreadsheet. Include the email address, whois address, contact form URL, etc. that you used and the date on which you tried that form of contact. Make it as simple as possible for a webspam team employee to see your efforts.
7. Incorrect Google Docs Share Settings
If you’re sharing a Google Doc with the webspam team, be sure that the share settings are not set to “Private”, which is the default setting. I usually change it to “Anyone with the link can view”.
In this Webmaster Forum post, a Google employee is describing possible reasons for this site owner’s failure to get reconsidered. One of the things he says is:
Documentation is really good and can help when reviewing your site for reconsideration. Make sure that when you send in documentation that it’s accessible, though. It looks like https://docs.google.com/file/….. hasn’t been shared properly.
How tragic! Months of work can go down the drain if the webspam team can’t see your documentation.
You would think that Google would simply let you know that they can’t see your documentation. Unfortunately, it appears that they only have a limited number of canned responses they can give. It would be great if Google could add an additional canned response that told a site owner that there was a problem viewing their documentation and perhaps gave a checklist of possible problem areas.
8. Spreadsheet is Filtered
This is very similar to the previous point. If you have your Google Spreadsheet share settings set so that anyone with the link can view the document, this setting does not allow the viewer to use the filter function.
I have reviewed some failed requests where the site owner had accidentally left the spreadsheet filtered so that it showed only the links they were successful in removing, or in another case, only their directory links. As a result, the webspam team member who reviewed the request couldn’t see the rest of the spreadsheet.
Unless you have been really successful at removing links, a webspam team member who can’t see the work you have done to contact site owners will likely send you a canned “Links to your site violate Google’s quality guidelines” message, which means you have failed at reconsideration. This may even explain why some site owners receive, as examples of unnatural links, the very links they have tried to remove.
If the webspam team can’t see your efforts, then they may assume you have just disavowed this link rather than tried to remove it.
Did Google make the process of applying for reconsideration of a site so difficult so that site owners learn their lesson and never want to build unnatural links again? Perhaps.
If you’re one of the unlucky ones struggling to get a penalty removed, don’t give up! Some sites have had to go through several requests before finally succeeding, but I’ve yet to see one that was impossible! Hopefully these tips will help.
If you know of other reasons why a reconsideration request may fail, please leave a comment.