Google plans to penalize 'overly optimized' sites

Google might be trying to be more relevant in search by removing spammier sites, but so long as they offer silly things like what someone said on Google+ or what some Google service has to say about the query instead of what the user is looking for, it’ll fail.

Google seems to be moving in the opposite direction: instead of answers from sites, Google wants to offer answers from itself.

Teaming up with Wolfram Alpha was one of the best things DDG did. I’ve been using it all day today to do calculations. I could do that on WA directly, but since I can type a query into my address bar and get it on DDG, that’s what I do.

And DDG doesn’t try to offer me “Duck” results, or what some social twit said on Duck+, or answers it thinks are personally relevant to me because of who I am online. This is why DDG, despite still not having all the search results it needs to overcome someone as big as Google, has a leg up on search and is growing.

Re overoptimised sites: I know exactly what this is when I see it. My bosses at my old job did it all the time.

Hint: when the main keyword of your site’s topic appears in every sentence on the site, so that reading the content out loud sounds strange, as if the person writing it were a robot or a foreigner learning your language, you’ve over-optimised.
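To make that hint concrete, here’s a rough sketch of the kind of keyword-density check you could run on your own copy. The example text and the idea of comparing densities are my own illustration, not any threshold Google has published:

```python
import re

def keyword_density(text, keyword):
    """Fraction of words in `text` that match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A stuffed paragraph: the keyword shows up in every sentence.
stuffed = ("Widgets are great. Buy widgets here. Our widgets beat other "
           "widgets. Widgets widgets widgets.")
# Natural copy mentions the keyword once and moves on.
natural = ("Widgets are great. We stock a wide range, and our prices beat "
           "most competitors. Free shipping on every order.")

print(keyword_density(stuffed, "widgets"))   # 0.5 - half the words!
print(keyword_density(natural, "widgets"))   # about 0.05
```

If reading the copy aloud sounds robotic, the number will usually confirm it.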

You might be right on that point, but it’s the opposite of what some people in the industry seem to be saying.

The point about semantic analysis (which is what Google is moving closer towards) is that Google will try to understand the meaning of the words on the sites that it indexes, and deliver results based on that meaning.

According to an example quoted in an article in the Wall Street Journal, if someone searches for “Lake Tahoe”, instead of seeing a list of sites that contain those two words, the search engine will know that Tahoe is a large lake in California, and will show key facts about it, gleaned from the sites in question. It might also directly answer such questions as “What are the 10 largest lakes in California?” This will be done, according to the article, by “examining a Web page and identifying information about specific entities referenced on it, rather than only look for keywords”.
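The contrast can be sketched in a few lines. Everything below — the page texts, the facts table, the area figures — is toy data I’ve made up to show the shape of the two approaches, not anything from Google’s actual index:

```python
# Keyword matching vs. entity lookup, in miniature.

pages = {
    "tahoe-vacations.example": "Lake Tahoe cabin rentals and Lake Tahoe ski deals",
    "geography.example": "Lake Tahoe is a large freshwater lake in the Sierra Nevada",
}

# Keyword approach: return any page containing all the query terms.
def keyword_search(query):
    terms = query.lower().split()
    return [url for url, text in pages.items()
            if all(t in text.lower() for t in terms)]

# Entity approach: the engine has already extracted structured facts
# and can answer directly, without sending the user to a page at all.
lake_facts = {  # entity -> (state(s), surface area in km^2); illustrative
    "Lake Tahoe": ("California/Nevada", 490),
    "Salton Sea": ("California", 890),
    "Clear Lake": ("California", 180),
}

def largest_lakes(state, n=2):
    in_state = [(name, area) for name, (st, area) in lake_facts.items()
                if state in st]
    return [name for name, _ in sorted(in_state, key=lambda x: -x[1])][:n]

print(keyword_search("lake tahoe"))   # pages that merely mention the words
print(largest_lakes("California"))    # a direct answer from extracted facts
```

The point Mikl makes follows directly: the facts table is built from the pages, but once it exists, the user may never click through to them.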

So the information will still come from web pages, but the owners of those pages will now get less benefit from the search engine; they’ll simply provide the information for Google to use (whether they want to or not).

I’m not saying that the above scenario is necessarily correct. But it would be very much in line with other moves in the industry, for example, the use of semantic analysis for machine translation rather than the traditional approach of looking up words in a dictionary.

Mike

I think the proposed changes really boil down to “overly optimized” sites. It’s strange, though: how will Google determine what really is overly optimized? I don’t think I agree at all with Cutts when he says sites that “exchange way too many links or go well beyond what you normally expect” will be targeted. Normally expected by whom? And how many links are way too many? No sir, I don’t think this is going to be good for anyone.

SEO, and applying it to your site for the likes of Google, will always keep you walking the line, trying to strike a balance that benefits your site and satisfies Google’s requirements. SEO will never become an industry where doing A results in B and adds up to a consistent answer every time. You simply have to keep up with the changes as they come, so that your site reaps the best benefit from them.

Mikl: that sounds awesome except when you get stuff like phrases with multiple meanings.

Anyway, I never had a Google login and never had G+, so the few times I extend my searches to Google I don’t get the nasty kinds of results my husband gets (he does have Google, Gmail, YouTube and G+ account(s), and so gets results like “somebody you have some weird convoluted relation to said something inane about this search term!” — poor guy).

Just seems like adding more importance to content rather than things like efficiency, back links, etc. That seems fine to me. I couldn’t care less if a site was built with tables, as long as it has the content I’m looking for. Poor mark-up really shouldn’t affect whether or not it rises to the top. The same goes for back links: just because a site has more back links than another doesn’t mean it should rank higher; the one with the richer content should take precedence. I completely agree with this move, though I don’t see it having any real drastic effects. At least not for those sites that do have useful things to offer…

You know I’m a stickler for good markup… Ok, who are we kidding, I’m a total {expletive omitted} on the subject.

But you’re RIGHT!!! None of the search engine’s business; good markup might allow the engine to find the content easier, but what’s important is NOT the markup, it’s not the backlink nonsense, it’s the CONTENT. That’s the number one thing I’m about on websites… it’s the number one thing that matters to people visiting websites…

So I really don’t have a problem with such changes to the search… and it’s not like this is the first time Google’s made moves to slap down SEO scam artists and the nonsensical abuse. God forbid anyone write original content people want and present it in an accessible manner. If they’re ranking based on anything else, they’re no longer all that useful as a search engine.

NOT that I Google a whole lot these days thanks to their idiotic bloated slow scripted train wreck, inaccessible color choices and host of other things that are slowly proving they’ve forgotten what it was that made them king of search in the first place – clean simple fast search. It’s like they are trying to follow Bing’s suit, and replicate everything that flushed “Ask Jeeves” down the toilet a decade ago.

NOT that DuckDuckGo is much better with the crappy fixed width column, but it’s enough for now.

In any case, they want to slap down the people who’ve taken SEO from a small but important part of content generation and site coding and turned it into a cottage industry, all I can say is GOOD!!! World could use a few less scam artists and snake oil doctors.

Agreed. All of those penalties and updates were good, as has been said.

“all I can say is GOOD!!! World could use a few less scam artists and snake oil doctors.”

Thank you, but in my case I would probably stick with my usual process, although I am considering tweaking the system in order to keep up with Google.

As far as I’m concerned, the surprising thing about this announcement is that there’s reason to believe penalties like this already exist, even though there’s no official word from Google on existing over-optimization penalties. Websites that use a lot of the same keywords as incoming links, or that overuse particular keywords in their on-page SEO, seem to see their ranking limited to a SERP position no higher than page 4, or in worse cases even lower, since the penalty appears to take a site’s current ranking for a given keyword and add either 30 or 50 to it.
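If that’s right — and to be clear, it’s a guess from observed rankings, not anything Google has confirmed — the arithmetic is easy to sketch:

```python
# Sketch of the speculated penalty: add a fixed offset (30 or 50) to a
# site's current position for a keyword. At 10 results per page, that
# pushes a site several pages down. The offsets are the commenter's
# observation, not a documented Google rule.

RESULTS_PER_PAGE = 10

def penalized_position(current_position, offset=30):
    return current_position + offset

def page_of(position):
    return (position - 1) // RESULTS_PER_PAGE + 1

pos = penalized_position(3, offset=30)   # a site ranking #3 gets hit
print(pos, page_of(pos))                 # position 33 -> page 4
```

Which is consistent with the “no higher than page 4” pattern: even a #1 ranking plus 30 lands at position 31, the top of page 4.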

I have to agree with deathshadow60 here for bringing up a good point about the penalties Google has been handing out to spam sites all along. A lot of the people over at Google are trying to make the Internet a better place, but there are still plenty of hard-headed, self-styled SEOs who think they’ll be able to game Google’s system forever. Why can’t they realize that content is what they need to get their rankings up?

Well DuckDuckGo had gone and intentionally removed whole domains and groups who seemed to only be spammers. Like sites with fake Gucci bags and things.

Google I can understand being more reluctant to do such a thing. They have more indexed than DDG and most domains are mixed between real spammers and people right on the edge who also do AdWords etc (which earn Google money). Still, I’m glad a smaller search engine can go ahead and leave out certain sites and groups because the community has labeled them as spammy. Especially a lot of content mill sites were blocked.

Thing is, reluctance to prune is exactly the type of thing that could cause their decline as king of search – killing off the weeds means more useful results. Having “more useful results in a clean, fast-loading format” is what put Google at the top in the first place, burying Ask Jeeves, Altavista, Lycos, and a whole host of others now long defunct or empty shells of their former selves… something they seem to have completely forgotten the past year, as more and more they tack on the same crap that killed “Ask Jeeves” off a decade ago (the exact same type of garbage “Bing” launched with).

Though it’s why whatever search engine comes up with the ability to say “I never want to see this site listed in my results” first, probably has me hooked for life; there are some spam-ish sites or worse, sites that expect you to pay for information that’s available freely elsewhere (experts exchange comes to mind) – sites that I’d love to blacklist from EVERY search I do. Google lets you hide one result, but won’t blacklist an entire domain… and that would be the type of thing that might actually make me log into a user account on a search engine.
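Client-side, the feature I’m describing is trivial; here’s a hypothetical sketch of filtering search results against a personal domain blocklist (the result data and blocklist entries are made up for illustration):

```python
from urllib.parse import urlparse

# Domains the user never wants to see again in any search.
BLOCKLIST = {"experts-exchange.com"}

def visible_results(results):
    """Drop any result whose hostname is on the personal blocklist."""
    return [r for r in results
            if urlparse(r["url"]).hostname not in BLOCKLIST]

results = [
    {"title": "Free answer", "url": "https://stackoverflow.com/q/1"},
    {"title": "Pay to see", "url": "https://experts-exchange.com/q/2"},
]
print(visible_results(results))  # only the free result survives
```

The hard part for a search engine isn’t the filtering, it’s persisting the blocklist per user — which is exactly why it would take a login to offer it.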

Deep down, IMHO, most peddlers of snake oil know this… though I do wonder sometimes, as very often you’ll find folk signature-tagging themselves as consultants, then begging for help with meta tags in many a coding-related forum across the web. Clients coming to me wanting a website made, or some polish added to an existing one, also struggle bitterly with this concept in my experience.

Most people expect, not unreasonably, machines to do the job they were built for, and therefore computers to perform as instructed, so the idea that a website can say “hey, I’m better than everybody else” with the addition of some magic code, as opposed to content, seems perfectly rational to many non-techies.

I recall asking one in particular, a family-owned antiques shop that had been going thirty-plus years, for a hundred words or so about the early startup days, or any prominent episodes or famous customers in the history of the business. The old guy was still alive and had all his marbles, so between oral history and the family photo album there would have been some great background content to fluff out the ‘about us’ section. But no… faced with the prospect of an hour or so of actual creative thought, the grandson running the business instead bought several thousand backlinks online, despite being warned against this. Then he had the nerve to try chewing me out when Google buried the site.

Something that has repeatedly led to friction between myself and clients – and why for the time being I’m not taking on new work; I’m sick to death of clients demanding things I tell them not to do, then blaming me when it fails EXACTLY how I said it would. I don’t deal well with micromanagement to begin with, but when they then have the gall to blame you for things you told them not to do… I actually cut ties with my last “client” by literally screaming at one of the ‘owners’ over the phone: “Look, you think you can do a better job, you do it – if you’re not going to listen to my input on a subject you know NOTHING about, what the {expletive omitted} did you even hire me for?!” CLICK

They’ve tried crawling back three times in as many months – I’ve told them to stuff it.

Crusty: you should’ve recorded that one and put it on YouTube for the lawlz.

I just wonder how Google are going to handle it when the “grandson” starts buying links to his competitors’ sites to give them a helping hand. :x

If in fact what Mat says is true, I just can’t see how they can police this mess they’ve got themselves into. Warning sirens are ringing – time to get out and work harder to get alternative traffic elsewhere.