
Website Content Meets the Penguin

Recovering from Google Panda penalties requires optimal website content. Rewarding high-quality content is supposedly a primary goal at Google HQ, and is stated as being at the core of both Panda and Penguin and of sundry interim algorithm updates. In light of that, some clarification of the “content quality” concept seems to be in order.

What Is Content?
Content is made up of multiple elements, primarily:
* On-page visible text
* Images and image Alt text
* Anchor text in hyperlinks to internal or external pages
* Hyperlink titles in links and menus
* The descriptive Title and Description meta-data
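Those elements map onto a page’s HTML roughly as follows – a minimal sketch, with invented file names and text:

```html
<head>
  <!-- Descriptive Title and Description meta-data -->
  <title>Flexible Insect Screens | Example Co.</title>
  <meta name="description" content="Custom-made flexible insect screens, delivered worldwide.">
</head>
<body>
  <!-- On-page visible text -->
  <h1>Flexible Insect Screens</h1>
  <p>Our screens are made to measure from premium-grade materials.</p>

  <!-- Image with descriptive Alt text -->
  <img src="flexible-insect-screen.jpg" alt="Flexible insect screen fitted to a sash window">

  <!-- Anchor text and hyperlink title -->
  <a href="/measuring-guide" title="How to measure your windows">Window measuring guide</a>
</body>
```

Every one of those slots is plain, crawlable text that a search engine can read and index.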
In the context of Google, a picture is NOT worth a thousand words! Moreover, words must be accessible, not embedded in images, Flash movies, JavaScript, slide shows, etc.
In 15 years as an SEO consultant, the one common denominator I’ve seen on websites is a profound reluctance to expend time, money, and creative energy on unique text content. Brevity is the watchword – economical use of words is encouraged by design, branding and marketing advisers!
* The branding gurus want you to use the textual equivalent of sound bites – bullet points and short sentences!
* The website designers want the entire content of the page to be above the fold – no scrolling!
* The marketing and advertising team want the warm fuzzies – in preference to explicit descriptions of what you sell or do!
* The owner wants to constrain the budget…
* The SEO guy who knows how potential visitors actually search online won’t get employed until traffic volumes fail to meet expectations
That is generally how it goes for the majority of commercial websites… the inevitable result is:
* A very low word count on all pages
* Image domination at the expense of text
* Minimal originality in textual elements
* Vagueness in each page’s descriptive elements
The net result is nothing much for Google et al. to work with in terms of assessing an individual page’s relevance to a specific keyword search phrase!
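One way to quantify how little there is “to work with” is to count the visible words on a page. A minimal sketch using only the Python standard library – the sample page and names are illustrative, not taken from any real site:

```python
from html.parser import HTMLParser

class VisibleTextCounter(HTMLParser):
    """Counts words in visible text, skipping <script> and <style> content."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.words = 0
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        # Only count text that is not inside a skipped element
        if self._skip_depth == 0:
            self.words += len(data.split())

def count_visible_words(html: str) -> int:
    parser = VisibleTextCounter()
    parser.feed(html)
    return parser.words

page = ("<html><head><script>var x = 1;</script></head>"
        "<body><h1>Fly Screens</h1>"
        "<p>Premium flexible insect screens.</p></body></html>")
print(count_visible_words(page))  # → 6: far below even a 250-word minimum
```

Run something like this across every page of a site, and anything scoring only a handful of words is a candidate for the thin-content problems described above.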

What Is Bad Content?
This is generally a subjective assessment, but some or all of the following elements will be evident within the website:
* Duplicated from your other website/s, and competing in the same web space.
* Copied from a similar site and passed off as your own.
* Homogenized content – finding a series of top-ranking pages from authoritative sources, and blending them together into a new page for your own site.
* More images than words, or words embedded in images.
* Content generated by JavaScript, Flash or Frames.
* Dominated by vaguely worded, non-explicit, warm and fuzzy drivel from branding and marketing pitches.
* Word-for-word product descriptions copied verbatim from the manufacturer.
* Content that is not precisely targeted and fails to fill a defined search need.
* Doorway pages – specifically produced to target a series of keyword phrases but bereft of intrinsic merit.
* Page and image file names are cryptic, devoid of explanatory keywords.
* Riddled with misspelt words, grammatical errors and incorrect punctuation.
Basically, if you can’t present your subject (products or services) in an intelligent and articulate manner, and accurately define precisely what the point of the page is and what it contains, it’s inevitably going to be seen as substandard and confusing content.
If you “borrow” extensively from other people’s creative content, at best you are immediately second best. At worst, you are clearly guilty of:
* Copyright violation
* Theft of intellectual property
* Plagiarism
WHY would anyone expect Google to reward them for that???

What Defines Good Content?
* Written by an authority on the topic, in his/her own words, and conveying valuable information to site visitors.
* Defines a problem, proffers a solution and explains why you need it.
* Research has determined the audience’s needs, and the page is written for the specific viewer demographic.
* Contains relevant exact-match keyword search phrases in current use.
* Accurately and thoroughly described in key areas (headings, titles, metadata).
* Keyword-rich, search-engine-friendly page and image URLs.
* Correct spelling, punctuation and grammar.
Take a leaf from the pages of the Public Speaking 101 book…
* Tell them what you are going to tell them
* Tell them
* Tell them what you told them
Don’t be vague, don’t waffle on about trivial stuff. Focus more on what your visitors need from you, and on solving their problems. Focus less on why you are so smart, handsome and witty.

Content Volume
There was a time when Google thought you could summarize a topic in fewer than 250 words and look like an expert at the same time. Obviously, that’s a superficial perspective, and they’ve addressed it by upping the ante on quality by considering volume.
Therefore, a page with 750 words that is original and gives better coverage of the topic in question can expect more exposure to the golden light from Google… That makes perfectly coherent sense to me, but apparently not to the pin-heads in Branding or Marketing…

Sure, There Are Shortcuts
Basically, creating good content takes a lot of time and creative ability. Actually, it takes damned near as much time and effort as goes into designing ways to cheat the system! However, some people instinctively prefer short-term gains derived from:
* engaging in black hat SEO tactics such as content cloaking.
* giving Google one thing and visitors another.
* engaging in dodgy schemes to make their site appear better than it really is.
When it all goes belly up, they shouldn’t blame Google.

Google Panda & Penguin – Marks out of 10

Actually, I heartily agree with the goal of rewarding good content. When I’m searching for something, I want the most relevant and helpful page to appear at the top of the list.

Is There a Fatal Flaw in Google’s Logic?

The algorithms make automated and, arguably, objective judgments of content on the basis of defined and extensive criteria.
In many situations, some subjective judgment is essential and it makes sense that peer comments from external sources are incorporated into the relevancy ranking algorithm. The latest algorithms place far greater emphasis on external public online commentary from Social Media websites. That’s great, kind of…
But what about the hundreds of thousands of excellent online businesses that have been providing exemplary services to customers for decades, but:
* They are not on the Social Media radar – they don’t understand it, can’t be bothered with it, etc.
* They are not computer literate, nor do they comprehend Google’s strategy and the need to evolve.
* Their websites meet the needs of their current and potential clients.
* They’ve always had decent rankings in their particular niche.
* Their website routinely delivers a significant portion of their new business clients.
Suddenly, Google arbitrarily changes the rules on established websites…
* Their website drops from page 1 to page 6.
* Their traffic drops by 75%.
* Their enquiries dry up overnight.
* Their income is slashed.
To add insult to injury, competitors who’ve been copying their content and stealing their ideas are suddenly ranked higher than them! That’s the consequence for many good businesses across the globe in the aftermath of the past year’s Search Engine Results Pages (SERPs) ranking formula adjustments at Google.
What is not factored in is that some website classes/niches/industries simply don’t fit the Social Media referrals mold well. In a social context:
* you might readily choose to “like” a page on a travel destination/hotel/tour operator/fishing guide website…
* you are far less likely to “like” a plumber, electrician, lawyer, gynecologist, proctologist, hypnotist, etc.
Any external referencing mechanism that assigns a social media weighting to niches that are not well represented tends to tip the scales in strange directions.

Have Search Results Improved?
Frankly, I remain unconvinced that the results of Panda + Penguin have been a resounding success:
* More harm than good has been done to thousands of innocent businesses.
* The garbage content in many built-for-AdSense / scraper sites still dominates niches.
* Authentic local businesses are still dominated in local search by parasite sites – particularly in niches such as bed & breakfast, hotels, rental cars. The leech sites that suck out commissions from the businesses that provide the services continue to prosper.
* In location-specific searches, doorway pages from major sites, albeit containing minuscule content, still dominate rankings.
From what I see on a daily basis, there is a hell of a lot of collateral damage, but many primary targets are still intact!

Do Heavyweight Sites get too much Clout?

Why should a skinny page on Travelfish, Tripadvisor, Infohub, Agoda, HotelsCombined, etc. out-rank a location-specific town / provincial website? That would seem to run counter to the content-quality revolution.
If you operate an accommodation business in a town, is it fair that the top “accommodation” rankings are dominated by people who don’t operate accommodation facilities? Instead, the tech-savvy, big-dollar accommodation promotion sites get top billing, and you have to pay them to appear on their site, give them a booking commission, or both.
For example, try a search for: “bed & breakfast New Orleans”
To me, the results reveal a clear and unequivocal inconsistency (if not a double standard) in Google’s avowed dislike of selling/purchasing links.
* Are those Bed & Breakfast book & website promoters that sell advertising placement to bed and breakfast operators immune from paid link penalties?
* Are those bed and breakfast advertising sites truly more valuable in search results than a list of actual providers of bed and breakfast services?
The same logic and questions apply across a wide range of niche markets, where the service provider’s website is arbitrarily devalued in favor of the parasitic sites that live off the efforts of those providers… Or did I miss something profound in there somewhere?

Geographical Speed Wobbles
I’m perplexed about the impact of Penguin on sites’ positioning in Google’s geographic-specific databases. I’ve noticed dot.com sites that seem to have been relocated with neither rhyme nor reason. It used to be that a dot.com site:
* was assigned to the Google dataset of the country in which it was hosted – which makes sense…
* was assigned to the Geographic Target set in Webmaster Tools
In the past couple of months I’ve seen sites lose their geographic identity without the owner having made any changes. A New Zealand client’s dot.com, hosted in the USA, is included in “Search: pages from New Zealand.” That’s plain wrong, and neither the designer nor the owner has it tagged as geographically targeting New Zealand.

A Real-World Example
Flexiscreens.com is an Australian business based in Tasmania, producing flexible insect screens from premium-grade materials and delivering them worldwide. They’ve been targeted by copycats, but have always enjoyed good rankings and resoundingly good client testimonials due to exemplary products and services. The past year has been a roller-coaster ride: first they were rewarded by Google, and then they were punished…
* The Monthly “Visits” column shows growth from 12,800 visits in Aug 2011 to 36,000 in Nov 2011.
* From Dec 2011, traffic declined, and is now averaging 17,000 visitors per month (May 2012, as at 28 May).
The content is basically unchanged over that period, and it’s inexplicable that such wild variations in traffic should occur.
There are some peculiar aspects:
* Nov 2011 – at the peak of visitor traffic, 54% of visits were from Australia!
* May 2012 – that dropped to 17% of visits from Australia!
The website is hosted in Melbourne Australia. The site appears in “Search: pages from Australia” so is clearly associated with the Australian dataset. The variation in Australian visitors is inexplicable – one would expect that particular statistic to be independent of traffic volumes.

Low Quality Content:
In looking at the site from a critical perspective, it appeared that the 89 glowing client references might well be the Achilles’ heel. These were separate individual posts in the Testimonials category, with a widget that selected one at random and displayed it in the page sidebar.
However, bereft of unique Titles and Descriptions, lacking headings, and with only a couple of sentences and no images, they would certainly appear to be of low quality and minimal value!
By way of remedial action, these were inserted into a Testimonials Manager plugin, retaining the random testimonial display in the sidebar but consolidating the 89 individual posts into a single page. 301 redirection was put in place from those posts to the new Testimonials page.

Doorway Pages:
In addition, there were a dozen or so Authorized Distributor pages, half of which contained nothing other than a State or Country variation, making them appear suspiciously like Doorway Pages!
These were all combined into a single distributors page, with an index hyperlinked to State & Country sections, and 301 Redirection applied.
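Consolidations like these rely on 301 (permanent) redirects, so that links and ranking signals pointing at the retired posts carry over to their replacement pages. On an Apache server that might be done in .htaccess along these lines – a sketch with hypothetical URL paths, not Flexiscreens’ actual structure:

```apache
# .htaccess – permanent (301) redirects from retired posts
# to their consolidated replacement pages (paths are hypothetical)
Redirect 301 /testimonials/john-s-2011  /testimonials/
Redirect 301 /testimonials/mary-b-2012  /testimonials/
Redirect 301 /distributors/tasmania     /distributors/
Redirect 301 /distributors/new-zealand  /distributors/
```

The 301 status tells search engines the move is permanent; a temporary (302) redirect would not pass the old pages’ accumulated value in the same way.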

Impact of Remedial Action on SERPs
There was a prompt response on the rankings for multiple keyword search phrases:
* flyscreen distributors – jumped up +19 places within a week to #5 on Google.com.au
* flyscreen distributors – jumped up +55 places within a week to #10 on Google.com
* fly screen distributors – jumped up +3 places within a week to #9
The consolidated page, with its larger and unambiguous content, definitely gained rapid traction in the SERPs! Unfortunately, since then those gains have been offset by further losses across the board, but traffic volumes have remained steady over the last 3 months.

Website Content – The Conclusion
To succeed in generating new business via online marketing, Google’s goal of giving the best and most relevant content to searchers must become your goal!
To rank higher than your competitors, your website content must be demonstrably better than theirs.
* Take a long hard look at every page on your website and ask yourself what its purpose is, and how well it meets that purpose.
* Take a long hard look at your competitors’ websites, and their competing pages and decide if they are doing a better job.
Try to figure out what your visitors are looking for by conducting some online research – Google’s keyword tool gives some great information in that regard. This beats the hell out of keyword brainstorming sessions around the office coffee table!
Assess what the most important searches are to your business, and which pages relate to those searches. Rewrite and expand each page to ensure it incorporates the targeted keyword phrase, and provides an answer/solution that is specific / relevant to the topic in question. Make sure the keyword phrase is prominent in a heading, the title and description.
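That final check can be automated. A sketch, again using only the Python standard library, that tests whether a target phrase appears in the title, meta description, and at least one heading – the sample page and phrase are invented for illustration:

```python
from html.parser import HTMLParser

class KeywordAudit(HTMLParser):
    """Collects the title, meta description, and heading text of a page."""
    HEADINGS = {"h1", "h2", "h3"}

    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self.headings = []
        self._current = None  # tag whose text we are currently collecting

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")
        elif tag == "title" or tag in self.HEADINGS:
            self._current = tag
            if tag in self.HEADINGS:
                self.headings.append("")

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current in self.HEADINGS:
            self.headings[-1] += data

def phrase_prominence(html: str, phrase: str) -> dict:
    """Reports where (title / description / heading) the phrase appears."""
    audit = KeywordAudit()
    audit.feed(html)
    found = lambda text: phrase.lower() in text.lower()
    return {
        "title": found(audit.title),
        "description": found(audit.description),
        "heading": any(found(h) for h in audit.headings),
    }

page = """<html><head><title>Flexible Insect Screens | Acme</title>
<meta name="description" content="Custom flexible insect screens, delivered worldwide."></head>
<body><h1>Flexible Insect Screens</h1><p>Made to measure.</p></body></html>"""
print(phrase_prominence(page, "flexible insect screens"))
# → {'title': True, 'description': True, 'heading': True}
```

Any `False` in that report flags a page where the targeted phrase is missing from one of the key descriptive areas.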
There is an old Chinese proverb to the effect that one should “Buy expensive, and cry once only!” Translated into a website context:
* If you make the upfront investment in high quality website content, it’s a one-off cost.
* If you pay for quick and dirty results, expensive repairs will be necessary on a regular basis.

Ben Kemp has more than 25 years of experience in the IT industry, including 15 years as a professional SEO consultant, with clients throughout New Zealand, Australia, UK & USA. The SEO Guy
Email: bjk@ComAuth.co.nz
Web: www.comauth.co.nz