Matt Cutts is the Alan Greenspan/Ben Bernanke of the search world: whenever he speaks, people listen intently and search results move. His official title is “distinguished engineer,” but he has essentially been the man behind the curtain of Google’s search results since 2000.
At an SMX Advanced session hosted by Search Engine Land’s Danny Sullivan, there was a “You&A” in which audience members asked Matt questions, and like a true politician he did his best to say as much as possible without revealing too much–though there were certainly some tidbits we found interesting.
Below we recap some of the main points that Mr. Cutts made.
If you’re a skimmer who just wants the main points, keep your eyes on what we bolded.
[Photo: our nosebleed seats]
Penguin/Panda Updates Vs. Site-Imposed Penalties
-Penguin and Panda updates did not penalize sites, but rather were algorithmic changes.
The main difference between an algorithmic change and a penalty is that algorithmic changes are on the macro level, affecting all search results, while penalties are imposed on the micro level affecting individual sites.
–There are essentially three types of sites on the web: high-quality sites that provide information users find useful; at the other end of the spectrum, “webspam” sites that provide no useful information; and everything in between.
The purpose of Panda was to further break down that “everything in between,” eliminating from Google’s diet not just spam but also borderline sites of “low nutritional value.”
-Back to the distinction between penalties and algorithmic changes: a penalty is initiated through a manual review by someone (or a group) at Google who determines that an individual site is bad and breaking “Google’s best practices,” while an algorithm change is applied across the web to all sites.
What Happens During a Penalty Phase
-A penalty is a manual action initiated by someone on the Google team, and a notification then appears in the site’s Webmaster Tools account with details on what the issue is and how it can be fixed.
Once the issue is fixed, if the webmaster responds to Google and clear action was taken to resolve it, the site will be officially cleared of all wrongdoing.
How to Combat Negative SEO
–Google is aware of the existence of blog networks and negative SEO techniques, and despite black hatters’ methods getting more and more sophisticated, Google is well on top of everything.
Despite this, Google can’t catch everything, and techniques designed to hurt competitors can occasionally slip through.
There are talks that down the line, in “1, 2, 3 months,” webmasters will be able to tell Google that they didn’t authorize the links pointing to their sites, so that Google won’t penalize them for those links.
Algorithm changes can affect millions of sites at a time, e.g. one that affected domain squatters affected 20 million sites. Again, this is contrasted to Google penalizing a site, which affects just one site.
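The link-disowning mechanism Cutts hints at could plausibly be as simple as a plain-text file a webmaster submits listing the links they disown. The file name and format below are purely our illustration, not anything Google has announced:

```text
# unauthorized-links.txt (hypothetical format)
# Ask Google to ignore every link from an entire domain:
domain:spammy-link-network.example
# Or disown a single unauthorized link:
http://bad-neighborhood.example/page-linking-to-us.html
```

However it ends up looking, the point Cutts makes is the same: the webmaster, not Google, would declare which inbound links they never authorized.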
Alerting the Black Hatters
–More recently Google has alerted black hatters to what they’re doing wrong through Webmaster Tools, something that Cutts admitted was unthinkable a few years ago.
Their ultimate goal is to improve transparency and give people a chance to correct their issues, no matter how “unethical” their methods may be in Google’s eyes.
No one is immune to being penalized by Google; Cutts cited examples of users thinking they can’t be caught dealing with another party, when the party on the other side may not be as careful.
One example was an agency that proclaimed that “no one can spot what we’re doing,” only to announce just four days later that they were shutting down.
Questions from Audience/General SEO Comments
-Most SEOs will say SEO is harder than it was five years ago, with Google coming down harder on “them” with its new algorithms.
Cutts declares that there is no war on SEOs, despite what anyone says; at most it’s a war on spam, with the intent of getting the most compelling content into search results.
Cutts cites examples of users on black hat SEO forums asking “how do i fake sincerity?” or “how do i look real?”, to which he replies, “Um, just be real.”
His bottom line is that he wants everyone to compete on a level playing field, and that you cannot pay to rank higher.
-Despite what some in the community believe, Google doesn’t consider bounce rates or similar signals at all, and it doesn’t use data from Google Analytics.
When it comes to bounce rates, many users may just want a quick answer, e.g. what the weather is like or when the sun sets in their area, and good sites provide that information immediately, so a high bounce rate in those cases can actually be a good sign.
–Hiding keyword data from secure searches (users who are logged in) may be bad for webmasters, but it helps protect users, and Google’s main focus is, and always has been, the user experience.
Even so, webmasters still see keyword data for 98% of all searches coming to their sites.
-After Google’s Panda update, the quality and spam teams began to collaborate with each other to a greater extent, whereas before they were more independent; this should help with how algorithm changes affect everyone.
-Cutts agrees that rich snippet spamming and abuse is common, which is why Google rolled the feature out slowly.
If Google catches you abusing rich snippets, you will be disallowed from ever using them again.
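For context, rich snippets are generated from structured markup such as schema.org microdata on a page. A minimal illustration of the kind of markup involved (the product name and ratings here are made up):

```html
<!-- Hypothetical product page marked up with schema.org microdata -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Acme Widget</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    rated <span itemprop="ratingValue">4.5</span>/5
    by <span itemprop="reviewCount">120</span> reviewers
  </div>
</div>
```

Abuse here would mean marking up ratings or reviews that don’t actually appear on the page, which is the kind of spam Cutts says Google checks for.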
-Q: If a site can’t get rid of links because the agency that implemented them is no longer in business, what can it do?
A: In the end Google wants to see a good-faith effort to improve, and while they realize a site can’t remove 100% of its links, what they want to see is progress.
For instance, if a site has 500 bad links pointing to it, Google will take a subset as a sample, and if none of those have been removed, the penalty will stay in place.
-Q: If a site can’t get rid of a bad link to a page, should they just get rid of that page?
A: “Yeah, maybe, if it’s not important.” In the end it comes down to how you document it and, again, to putting forth the effort.
-Q: Are Google+ and author rich snippets an effort to use SEOs as pawns to take down Facebook? (received with laughter)
A: “Not in my opinion…it’s still in early days but…” (more laughter)
-Q: Why was the latest update labelled “Penguin”?
-A: Even though the Panda update was named after an engineer with the last name Panda, this one was not named after anyone. Google wanted anonymity for the person who came up with it, so they just picked a similar cute black-and-white animal whose name came next in the alphabet.
-Q: If I’m hit by both Panda and Penguin, should I just give up?
-A: Sometimes; if you can change your site significantly and clean everything up, go ahead and do that, but if you started a site with the sole intention of spamming, don’t count on your luck.
–Cutts believes that Google’s organic search results are as pure today as they were 10 years ago.