One of the problems with reductive, prescriptive SEO
approaches - i.e. step one: research keywords; step two: put the keyword in the
title; and so on - can be seen in the recent "Content Farm" update.
When Google decides a class of sites is hurting its search quality,
it looks for a definable, repeated footprint shared by the sites it deems
undesirable. It then designs algorithms that flag and punish the sites that
leave such a footprint.
This is why a lot of legitimate sites get taken out in
updates. A collection of sites may not look, to a human, like problem sites, but
the algo sees them as being the same thing, because their technical footprint
is the same. For instance, a website with a high number of 250-word pages is an
example of a footprint. Not necessarily an undesirable one, but a footprint
nevertheless. Similar footprints exist amongst ecommerce sites heavy in
sitewide templating but light on content unique to the page.
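The "footprint" idea can be made concrete with a crude heuristic. This is a hypothetical illustration only: the page data, the 250-word cutoff, and the 80% share threshold are assumptions for the sketch, not anything Google has disclosed.

```python
# Hypothetical sketch of a "thin content" footprint detector.
# pages maps URL path -> body text; the cutoff and threshold are
# illustrative assumptions, not known ranking signals.

def thin_page_share(pages, word_cutoff=250):
    """Fraction of pages whose body text is at or under the word cutoff."""
    if not pages:
        return 0.0
    thin = sum(1 for text in pages.values() if len(text.split()) <= word_cutoff)
    return thin / len(pages)

def has_thin_footprint(pages, word_cutoff=250, share_threshold=0.8):
    """True if most of a site's pages are short enough to look templated."""
    return thin_page_share(pages, word_cutoff) >= share_threshold
```

The point of the sketch is that such a rule cannot tell a deliberately concise site from a content farm: both trip the same threshold, which is exactly how legitimate sites get caught in an update.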
Copying successful sites is a great way to learn, but can
also be a trap. If you share a similar footprint, having followed the same SEO
prescription, you may go down with them if Google decides their approach is no
longer flavor of the month.
SEO Technology