Nitro-Net.com – Internet Marketing Services – A Global Marketing Group Company
Google’s John Mueller posted one of the more detailed responses I’ve seen from him on the topic of overall search spam, penalties, quality algorithms and more. I am not sure if the comment is showing up anymore on Reddit, so here is a screenshot of it.
Here is a summary of what John said followed by a copy and paste.
(1) Websites do not get permanently removed from Google Search.
(2) Google doesn’t have a list of permanently blocked sites.
(3) Manual action complete removals are for “pure spam with nothing useful of its own on it.”
(4) Google prefers to ignore the bad parts of a site rather than penalize it.
(5) He explained how reconsideration requests work.
(6) Google has algorithms that flag issues (spam-catching algorithms?).
(7) Google also has algorithms for relevancy (like core updates?).
(8) There is also the question of whether your site is still as useful today as it was years ago.
Here is what John wrote on Reddit:
Sites don’t get permanently removed from Google – there’s always a way to get the site indexed again. Sometimes it takes a long time & a lot of work, but Google doesn’t have a list of permanently blocked sites. For manual actions (aka penalties) usually the things that lead to complete removal are quite severe, like when a site is just pure spam with nothing useful of its own on it.
For everything else, there’s a variety of “ignoring the bad parts & focusing on the good”, “skipping for certain search features (eg, it’s not in News)”, and “broad drop in search ranking” (when the bad parts can’t be separated that well). I like the cases where Google can just ignore the bad and focus on the good — imo more and more of that is possible. People accidentally do “black-hat SEO” by following bad advice (like “use white on white text with your keywords”), and if Google’s systems can ignore that but still show the site for its useful parts, then users will be happier, and I’m sure, site owners will learn at some point. However, it can lead to the awkward situations where a competitor is ranking above you but doing something “obviously black hat”, tempting you to do that too (those sites tend to rank despite the black-hat stuff, not because of it).
When it comes to manual actions / penalties, you can request reconsideration after fixing the issue. In practice, that means Google has to re-review the site, depending on the type of manual action that can happen fairly quickly (eg, “site is no longer hacked”) or take a bit of time (“I removed some bad links, idk if it’s enough” can sometimes take months).
If “the algorithms” flag the issues (basically just some software that runs at Google), or just don’t find the site as relevant in search anymore (basically changes in how Google does ranking in search), then it’s more a matter of improving the site, usually on a broader level. It’s like when a radio station no longer plays your music because it’s no longer considered great: you don’t just change the drum samples, you kinda need to rethink what you’re creating overall.
It can also happen that a site’s time / usefulness has just passed. If you have an old-school-phone games website and nobody runs those devices anymore, no amount of tweaking of text on the pages will fix that. Spotting the change in wind before it’s too late is a bit of an art, as is moving on in a way that lets you re-use some of your work.
Forum discussion at Reddit.