The Myth of W3C Compliance?

The past few years have seen a huge increase in the number of search engine optimisers preaching about the vital importance of W3C Compliance as part of any effective web promotion effort. But is compliant code really the ‘Magic SEO Potion’ so many promoters make it out to be?

For those of you not familiar with the term, a W3C-compliant web site is one which adheres to the coding standards laid down by the World Wide Web Consortium, an organisation comprising over 400 members, including all the major search engines and global corporations such as AT&T, HP and Toshiba amongst many others. Headed by Sir Timothy Berners-Lee, the inventor of the World Wide Web, the W3C has been working to provide a set of standards designed to keep the web’s continuing evolution on a single, coherent track since the Consortium’s inception in 1994.

Whilst the W3C has been a fact of life on the web since this time, general industry awareness of the benchmarks set down by the Consortium has taken some time to filter through to all quarters. Indeed, it is only within the past 24 to 36 months that the term W3C Compliance has emerged from general obscurity to become a major buzzword in the web design and SEO industries.

Although I have personally been a staunch supporter of the Consortium’s standards for a long time, I cannot help but feel that their importance has been somewhat overplayed by a certain faction within the SEO sector, which praises code compliance as a ‘cure-all’ for poor search engine performance.

Is standards compliance really the universal panacea it is commonly claimed to be these days?

Let’s take a quick look at some of the arguments most commonly used by SEOs and web designers:

1. Browsers such as Firefox, Opera and Lynx will not display your pages properly.

Browser compatibility is possibly one of the most frequently cited reasons for standards compliance, with Firefox being the usual target for these claims. Speaking from personal experience, Firefox will usually display all but the most broken code with reasonable success. In fact, this browser’s main issue seems to lie more with its occasional failure to correctly interpret the exact onscreen position of layers (Div tags – this often causes text overlap) even when expressed correctly, than its inability to deal with broken code.

What about Lynx? Interestingly enough, whilst it is somewhat more fragile than Firefox, most of the problems encountered by this text-only browser seem to stem from improper content semantics (paragraphs out of sequence) rather than from poor code structure.

2. Search engines will have problems indexing your site.

Some SEOs actively claim that search engine spiders have trouble indexing non-compliant web pages. Whilst, again speaking from personal experience, there is an element of truth to these claims, it is not the sheer number of errors which causes a search engine spider to have a ‘nervous breakdown’, but the type of error encountered. So long as the W3C Code Validator is able to parse (*) a page’s source code from top to bottom, a search engine will likely be able to index it and classify its content. On the whole, indexing problems arise from code errors which prevent a page from being parsed altogether, rather than from non-critical errors which allow the process to continue.

* To parse is to process a file in order to extract the desired information. Linguistic parsing may recognise words and phrases or even speech patterns in textual content.
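To illustrate the distinction, the short sketch below uses Python’s lenient built-in HTML parser as a stand-in for a spider’s text extractor (the W3C Validator itself works differently, so treat this purely as an illustration). The markup contains several non-critical errors, yet the parse completes and all of the visible text is recovered:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, much as a search engine spider or
    text-only browser might when reading a page's source."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        text = data.strip()
        if text:
            self.chunks.append(text)

# Deliberately non-compliant markup: an unclosed <b>, a stray </div>
# and an unquoted attribute -- none of which stop the parse.
broken_page = ('<html><body><p align=center><b>Welcome to our site</p>'
               '</div>Contact us today.</body></html>')

extractor = TextExtractor()
extractor.feed(broken_page)
print(' '.join(extractor.chunks))
```

A truly critical error would be one that stops such a parse before it reaches the end of the document; everything above that threshold merely generates validator warnings.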

3. Disabled internet users will not be able to use your site.

The inevitable, but somewhat weak, counter-argument to this point is that only a vanishingly small percentage of internet users are visually or aurally impaired. However, it is a fact that text-only browsers such as Lynx and screen readers such as JAWS (no, not the shark) will read a web page’s code in much the same way as a search engine spider does. From this perspective, we once again return to the difference between critical and non-critical W3C compliance errors. As long as whatever tool, browser or spider is used to extract text content from a page’s code is able to continue its allotted task, the user is likely to be able to view the page in a satisfactory manner.

Interestingly, one of my fellow designer/SEOs over in Japan has just run an experiment entitled “W3C Validation; Who cares?” testing the overall importance of W3C compliance to long-term web promotion efforts. Whilst the results of this, the world’s most non-compliant web page, do initially indicate that compliance does not make much of a difference to a search engine’s ability to index and classify a web page, I do rather suspect that further research may be needed in order to establish the long-term effects of this experiment.

At the time of writing however, the page ranks well with Google for the following two non-specific search terms: “Does Google care about validation” and “Google care validation” – not bad for a page which is supposed to be utterly and completely un-indexable. What then is the answer to the W3C compliance conundrum?

In conclusion, I would say that ignoring the World Wide Web Consortium’s standards at this stage may well have negative consequences in the long term, as the internet’s continuing evolution is likely to place greater emphasis on good coding practices in the future. Having said this, I would also say that the current value of W3C compliance has been overplayed by some professionals in the web design and SEO industries.

Further studies into the effects of non-compliance are certainly needed.

About The Author
Sasch Mayer, a writer with well over a decade’s experience in the technology and internet sectors, is currently living in Larnaca on the Cypriot south coast. He writes under contract to IceGiant, a web studio specialising in W3C compliant web design in Cyprus, the UK and the rest of the world.

Scam Alert II: Domain Hijacking

There’s a frightening new batch of scams going around now that can damage your reputation as domain “squatters” steal your domain name.

There are a number of ways the “game” is played. The first is entirely legal, if more than a little questionable. In this version, the name of a city or geographic area is grabbed by a domain squatter and pointed to… “sites that you wouldn’t want your children visiting.”

(We chose that term to avoid getting caught in a lot of spam filters for the use of the word “p-o-rn.”)

A prominent notice is placed on the sites, offering them for sale at prices that range from $2500 to as much as $500,000!

The idea here is that city officials will feel that enough damage is being done to the reputations of their towns that they’ll pay to keep them from being associated with that type of material.

It’s obviously safe to say that it’s not appropriate to pop those kinds of images into people’s faces while they’re looking for info on a completely different topic.

That’s where the pressure on the cities comes from, and why this is such a disgusting scheme.

In essence, the domain squatter says: “Pay us, or continue to watch as your city’s reputation suffers.”

Many would call this blackmail…

The second variation on the theme is not always legal: someone takes a trademarked name (or a variation of its spelling) or a famous person’s name, and does the same thing.

For trademarks or close variations, there’s a specific procedure for addressing the problem. (See the resource section at the end of this issue.)

For the names of famous people, there MAY be a remedy. But, it can be tricky — and expensive.

For example, if someone named John Jones registered the famous name as a domain and pointed it to one of “those” sites, Walter Cronkite could probably force the domain away from him.

However, if someone named Steve Cronkite registered the same name and did the same thing, Walter Cronkite would have no recourse. It would be very hard to demonstrate that Steve registered the domain in bad faith. And if Steve’s son’s name is Walter, the same is true.

If you feel that your name is likely to be typed into a browser when people are looking for information on you, you should consider getting both the .com and .net versions of the domain if they’re available.

It will cost you a few bucks to prevent the problem. Fixing it, assuming you win, will cost you hundreds — if not thousands — of dollars.

And there’s no guarantee you’ll win.

A third version is a bit more benign. It’s common among members of affiliate programs. In this version, names very close to, or even including, the trademark are registered. The sites are created to drive traffic to the affiliates’ URL at the main site.

This may or may not be acceptable to the affiliate program owner. If it is, it’s a good technique for getting traffic. If not, it could get you into hot water. Check with the owner of the trademark before doing this.

Less benign is an alternative version of this technique where someone grabs domain names that are close to the trademark of a competitor and uses them to grab competitor type-in traffic. This is often done by finding out the most common misspellings of the real domain name or trademark. Watch for people doing this with your domain.

Here’s the worst version of this — and it can hit anyone if they have enough traffic and don’t pay close attention to when their domain registrations expire.

In this situation, someone grabs expired domain names and points them to “those” kinds of sites. This is a “no lose” for the hijacker, as they will profit from the traffic even if the previous owner doesn’t pay the requested ransom for the domain.

The more traffic the URL gets, the greater the clickthrough value to the hijacker. This means more potential damage to the original owner — and a higher ransom to get it back.

In effect, your own popularity is your worst enemy in this case.

The solution to this one is simple — and very important: Don’t let your domain names expire!

Useful Resources:

If you find yourself a victim of domain hijacking, there is hope for correcting the problem.

For a more formal explanation of the legal aspects of this problem, visit:

For specific information on the UDRP (Uniform Domain Name Dispute Resolution Policy), the procedure for taking domain names that are being used in violation of a trademark, see .

For information on taking action under the Anti-Cybersquatting Act (A US law that provides for damages in addition to the less severe penalties of the UDRP) see:

If you have a famous name or trademark, the best defense is to make sure that you register the main variations in both the .com and .net form. The .org is probably only necessary if you are heavily involved with charitable activities. Protect yourself. Scammers come up with new schemes all the time…

So, keep your eyes open.

About The Author
Douglas Miller is a retired fire service captain, now making a living working from home. His company Hundred-Fold-Life is not just a name but also a belief. To learn how to find the best home based business ideas and opportunities so you can work at home visit:

Google Fear Hits AT&T Square In The Jaw

As predictable as daylight, AT&T isn’t happy about Google’s plan to bid on the 700MHz wireless spectrum. The telecommunications giant is poised to claw any competition out of the equation, and is hoping its traditional ally, the FCC, will have its back again.

But the nitty gritty of it is, the telecommunications industry is scared to death of Google.

A quick review:

AT&T, Verizon, and others are champing at the bit to get hold of the 700 MHz band, soon to be returned to the federal government by broadcast television once regulations requiring broadcasters to go digital take effect. This swath of spectrum is ideal for wireless broadband and mobile phone networks.

But to get the most profit from it, incumbent telecom providers must pressure the FCC not to impose requirements on how the spectrum is used. Rather, incumbents would prefer a setup similar to what they have now, with little incentive to give consumers choice in wireless services.

They do this by limiting which devices can be used on their networks and which third-party applications can be installed, by exclusive contracting (as with the iPhone), and by punitive contract termination fees.

And they want it to stay that way.

Google, though, and consumers, and pretty much everybody that’s not an incumbent, want a section of the spectrum reserved with requirements that are more consumer friendly. Though incumbents have argued that doing so would devalue the spectrum and limit competition, the intent is just the opposite: to foster new players in the arena and, in turn, put pressure on incumbents to think more about customers and less about the bottom line.

Enter Google, the white knight (yes, I’m editorializing, it’s what I do best), which last Friday sent a letter to the FCC promising to bid at least the minimum reserve the agency had in mind for that slice of spectrum, $4.6 billion, but only if the FCC enforces four principles of open access.

This does three things: it ensures new, consumer-friendly competition; takes away the incumbents’ arguments against open access; and really ticks AT&T off.

Okay, that wasn’t as quick as I thought it was going to be.

What AT&T has to say about it:

Om Malik gets credit for chasing down this statement from AT&T Senior VP Jim Cicconi:
…Google has now delivered an all or nothing ultimatum to the U.S. Government, insisting that every single one of their conditions “must” be met or they will not participate in the spectrum auction. Google is demanding the Government stack the deck in its favor, limit competing bids, and effectively force wireless carriers to alter their business models to Google’s liking…
He also said something to the effect that Google should “put up or shut up,” which comes across as belligerent, whiny, immature, and ultimately, threatened. He is right that Google is making demands. He is also right that Google couldn’t win the auction in a fair fight with the telecoms (nor could anyone else, save Microsoft).

But that’s why supporters of open access are concerned. With about four major providers pooling their resources, they could hoard that valuable spectrum and keep America behind other countries in wireless services indefinitely.

The irony of Cicconi’s statement is breathtaking, even painful, as one might not be able to decide which is the pot and which is the kettle. AT&T has always had the deck stacked in its favor…remember Ma Bell? … and Google’s potential entry into the market has them scared they won’t be able to manipulate the market like they are used to doing.

Cicconi’s words are nothing but saber-rattling, a tantrum, a scared kid crying foul when he knows it was fair.

About the Author:
Jason Lee Miller is a WebProNews editor and writer covering business and technology.

4 Great Reasons to use Google Analytics

Having used a large number of web site visitor trackers over the years, I first approached Google Analytics some time ago, with the somewhat jaded attitude of someone who’s ‘seen it all’ or at least ‘seen most of it’. What could possibly make this particular utility stand out in such a large crowd of competitors?

But first… What is Google Analytics?

Analytics is Google’s very own visitor tracking utility, allowing webmasters to keep tabs on traffic to their site, including visitor numbers, traffic sources, visitor behaviour & trends, times spent on the site and a host of other information gathered via two pieces of JavaScript embedded in the source-code.
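For reference, the two pieces of JavaScript in question look something like the snippet below. This is the classic urchin.js form of the tracking code with a placeholder account ID; Google’s own setup instructions should be treated as the authoritative source:

```html
<!-- Google Analytics tracking code (urchin.js era); the UA number
     below is a placeholder for the account ID Google assigns you. -->
<script src="http://www.google-analytics.com/urchin.js" type="text/javascript">
</script>
<script type="text/javascript">
_uacct = "UA-XXXXXX-X";
urchinTracker();
</script>
```

Both pieces are normally pasted just before the closing </body> tag of every page to be tracked.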

Unlike other free visitor trackers, which insist on displaying annoying and often amateurish badges or buttons when they are being used, Google Analytics simply runs quietly in the background, gathering the necessary information without any visible signs of its presence.

Which brings me quite neatly to Analytics’ first major plus-point; the price.

What webmasters are effectively getting is a fully fledged visitor tracking utility without all the irritations and limitations normally associated with free products of this type.

OK, so it’s free; but is it any good?

In a word; yes.

The sheer depth of information gathered really leaves very little to be desired. From search engine analysis to page views, bounce rates and more, the available data is presented so as to give users an easy overview of the most essential elements, with the ability to ‘drill down’ to less commonly accessed or more in-depth statistics and figures.

Additionally, on the 18th of July 2007, Google Analytics’ old user interface was discontinued, making way for a newer, more ergonomic look which makes reports more accessible and the interface itself more intuitive for the user.

The new Dashboard provides ‘at a glance’ visitor statistics for the previous month, as well as a graphical breakdown of your visitors’ geographical locations in the form of a world map. A pie chart clearly shows what proportion of visitors reached the site through search engines, by referral or through direct access, whereas the ‘Content Overview’ provides a list of the most commonly accessed pages.

What makes Google Analytics special though?

Although Analytics boasts all the features and statistical data to be expected from a top-class keyword analysis and statistics tracker, it also features a number of additional tools which put it ahead of most of the pack where ease of use and depth of information are concerned.

1. The Map Overlay

Essentially, this feature brings up a map of the world, highlighting the countries a site’s visitors stem from. Clicking on a country produces a close-up view, along with a geographical breakdown according to the region and/or city from which visitors accessed the site. This tool in itself is invaluable for all those webmasters with geo-specific sites, concentrating on a particular catchment area.

2. The Site Overlay

This is conceivably Google Analytics’ single most important feature from a webmaster’s or online business owner’s perspective, as it provides a hands-on view of visitor behaviour. When clicked, ‘Site Overlay’ opens the tracked web site in a new window and, after a moment’s loading time, overlays each link on the screen with a bar, containing information about clicks to the target page and goal values reached [more about goal values in a moment]. Since it allows the webmaster or site owner to navigate his or her site and see exactly how visitors flow through it, it is difficult to imagine a more effective tool than this as far as raising a site’s conversion rates is concerned.

3. Goals and Funnels

Unless the site being tracked is an information site which does not rely on generating sales or enquiries, conversion rates are as important as sheer visitor numbers. The ‘Goals & Funnels’ feature allows users to set up specific goals for their site, such as tracking a visitor to the ‘Thank you for your enquiry’ page for instance. It also allows the user to set up specific monetary values for each goal, and thus track the site’s financial performance and profitability during any given period of time.

The term ‘Funnels’ refers to the specific path a visitor takes to reach the goal’s target page. Since most web sites sell a number of different product ranges or feature a number of ways to enquire, all of which lead to a single ‘Thank You’ page, the funnel allows for the tracking of each individual path with a minimum of fuss.

4. Graphical Representations

A great many visitor trackers out there will present the collected information in a certain way, be it a list, graph, pie chart, flow-chart or whatever. Whilst all these methods of presentation are of course valid, different users have different preferences, and a pie chart is not necessarily ideal for those who prefer to work with graphs, or vice versa. Google Analytics, however, allows users to choose between views on many of its reports. Although this may seem like a relatively minor point, it nevertheless makes things easier, as it allows the user to work with the view he or she is most comfortable with.

In Conclusion:

Google Analytics provides webmasters and site owners with a highly effective means of tracking visitors and analysing statistical data, easily the equal of most subscription based services in the industry.

Although some of the more paranoid internet users have voiced concerns that Google puts everyone’s collective data to its own evil demographic uses, there really are precious few reasons not to recommend this fantastic tool as one of the best means to boost any web promotion and marketing campaign.

About The Author
As a technical writer with over a decade’s experience, Sasch Mayer has been living and working in the Republic of Cyprus since 2005.
Currently under contract to IceGiant Web Design and Promotion Services, he mainly covers topics such as SEM and Site Promotion.

Fluff vs. Quality Content

There are basically three types of content you can use for your site: fluff, leased, or custom. All three have their pros and cons, but which would work best for ranking well with Google and the other major engines? Let’s first explore the definitions of these varied content types:

Fluff: Written cheaply by non-native-speaking writers, and used to fill up a web site with inexpensive content.

Leased: Identical articles that are well-researched and written, but sold to numerous web sites.

Custom: Well-researched, authoritative content that is tailored specifically to meet the needs of you and your business.

Fluff content is fine for businesses just starting out. It helps you to at least get a place in the race to the top of the search engines, but for long-lasting results, fluff just won’t cut it. The wording is often choppy and incoherent, and doesn’t achieve your primary goal, which is customer conversion. Also, if the content of your site is sloppy, it will not instill confidence in a potential customer.

Leased content works well because it is professionally written, topical, and easy to find.

Search engine algorithms favor content that has keywords and phrases that are strategically placed, but those words and phrases must also be embedded in text that is lean and carefully crafted for consistent results.

The drawback of leased content is that it can be found in a wide variety of other websites and cannot meet the unique needs of your business or specifically target the audience that you want to attract.

Custom Content

Custom content is content that has been professionally crafted to feature the keywords and phrases that you and an SEO expert have chosen to rank well with search engines and attract your target audience. The strengths of custom content are:


You can consult a copywriting firm to construct your content exactly the way that you want it to convey the unique products and services that your business offers and organically build the rank of your site which leads to lasting results.


Custom content will engage the reader and invite them to read further which entices them to linger at your site and explore the other content.


Custom content immediately lends legitimacy and lasting brand recognition to your site because discerning readers can see that you have taken the extra steps to tailor your message specifically to them.

Lasting Results

Web statistics consistently suggest that the best way to earn placement on that key first page of search results and retain your ranking is customized content.

As search engine bots become more and more sophisticated, keyword stuffing and other gimmicks get sniffed out and dismissed because they do not offër the reader any rewards for investing their time.

So, how do I hire a quality content writer?

Sure, anyone can write and practice keyword stuffing. You see it on hundreds of sites everyday, full of fluff-content that was written cheaply, and reads cheaply. Even the most basic conventions of writing are abandoned, simply to reach a high word count. Because of this, readers are having a hard time finding good, quality content. They want information, not gobbledygook.

Ask for Samples

The first rule in hiring a good content writer is reviewing their work. Ask for writing samples, as well as references. They should know the basic conventions of writing, and excel in creating informative, easy-to-read content that people will understand.

Work Ethic

Good content writers, like good workers in any job, should also have a good work ethic. That means they respond quickly to emails, meet deadlines, and keep in constant communication with the customer. People who are conscientious and prompt in their correspondence are likely to be quick and efficient in their work. This reduces the chances of procrastination as well. A good content writer will use the entire time to work on an assignment and produce good, thorough copy, while a sloppy or lazy writer will wait until the last moment and squeak in an unpolished product right before the deadline.

You Get What You Pay For

As the old saying goes, “You get what you pay for.” There are plenty of desperate writers out there that will work for peanuts, but it is an investment to hire a more proficient writer at a higher rate, and you will have much better results on your investment. If you need quick, cheap content, then there are plenty of people willing to produce it. But again, if it is written cheaply, it will read cheaply.

Knowing the Audience

A good content writer should also have a feel for their audience. Any good writer can complete an assignment, but someone that is in tune with their audience can connect with readers much better by tailoring their copy specifically to them. A sympathetic writer should be able to imagine a piece of writing from the audience’s perspective and detect what that reader wants or needs from it. This comes from in-depth interviews with the client, and really learning what message they want to convey to their readers.


Lastly, a good content writer should be trustworthy. While representing a company or employer, a writer must be privy to certain information in order to write effectively. Make sure the writer you are hiring has a good business ethic and won’t turn his back on you or exploit your ideas once he is gone. Although it is possible to work with someone and still withhold sensitive business tactics or information, it is much easier to work with someone that can be trusted in an open correspondence. And even if you do trust the writer, it is always smart to get a signed contract.

About The Author
Devin Hansen is the owner of SEO Copywriters, a web-content development company based in Illinois. With a staff of American writers and editors, they produce high-quality, unique content for any business in any industry.

Web Design and SEO: The Eternal Debate

Web design focuses on appearance and aesthetics. SEO focuses on text quality and quantity. Web designers don’t really like to clutter their designs with text. They prefer to see the images stand out on their own. SEOs on the other hand don’t like images that much. Sure, an image can be optimized for the search engines by adding relevant alt attributes and titles, but this is not enough for a site to be properly optimized. Page copy still plays the most important role in website optimization for SEO.

As a business owner you are caught in the middle of this conflict. For your website to convert you need both design and optimization. There is no middle way. You cannot have a little bit of this and a little bit of that and still be competitive. You cannot have just one of the two either. Without optimization your site is invisible to the search engines, hence to your potential customers. On the other hand, without a good design your site, although not invisible, will get nothing but hits. Web users are picky and if they find nothing of interest on your site they will just surf to the next site.

Having a beautiful website no one can find is like having a store and keeping the doors locked. You know it is there, you’ve done a great job decorating it, the products are waiting for the customers, yet no one comes in.

When you pay for web design don’t automatically assume that by spending thousands of dollars on a layout you’ll be a hit on the Web. The Web is a highly competitive place. There are already thousands of entrepreneurs who, just like you, invest in design and hope to become the new “it.” Without online marketing (SEO being an important part of the discipline) all these entrepreneurs will remain in the shadow, with their beautiful websites closed to the world.

SEO is the key to that virtual door you need to open for your customers. It is important that you consider this tool when you first conceive your site. Web design and SEO don’t need to be enemies. There are enough professional agencies that employ both web designers and SEOs who work together to develop a good business website, a site that is SEO ready, accessible and readable with any browser. You just need to take your time, research and send a few inquiries. Then choose the company that answers your questions in a timely manner; basically, choose the company that demonstrates a clear ability to design to W3C standards and a clear understanding of online trends and realities.

Balancing content with visual appearance shouldn’t then be such a difficult task. Aside from graphics and artwork you have to choose proper font types, in a readable size, with colors that harmonize with the layout of the site and so on. If your site is not SEO ready from the first stage of the project you’ll face additional costs after you launch. SEO ready means a site that is properly coded (errors in the HTML code might stop some search bots from crawling and indexing your site correctly), with good navigability and a good internal linking structure.

On the other hand, SEO and appearance are not the only traits of a good site. Brand conscious companies should look at the broader picture: instead of debating what is better online entrepreneurs should ask themselves what works best to convert visitors into clients.

Studies show that an over-optimized page might hurt the user experience of people with disabilities. For example, many SEOs stuff the image alt attributes and their alternative titles with keywords. Blind and other visually impaired people who use screen readers to access the Web cannot see the images and, instead of listening to a relevant image description, they’ll hear… nonsense.
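As a quick illustration of the difference (the file name and keywords here are invented for the example), compare a keyword-stuffed alt attribute with a genuinely descriptive one:

```html
<!-- Stuffed: a screen reader will read this keyword list out verbatim -->
<img src="widget.jpg" alt="widgets cheap widgets buy widgets best widgets">

<!-- Descriptive: useful to screen reader users and search engines alike -->
<img src="widget.jpg" alt="Blue aluminium widget shown from the front">
```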

Usability and accessibility are just as important as design and optimization. Strangely enough, images are better for usability. They give focus to the design and, when properly optimized, they provide for less cluttered website content. Problems appear when images slow down loading times, but with the use of CSS loading times should not be a big concern.

As search engines prefer fast loading sites, it is easy to understand why good coding and optimization are so important. Poor coding raises many other problems aside from loading times, and might increase costs when you need website updates, especially when your website administrator is not the one who created your site.


About the Author: Mihaela Lica used to be a military journalist, worked six years as a freelance reporter for the Romanian National Radio Station (ROR) and four years in the Public Relations Directorate of the Romanian Ministry of Defense. Since 2002 she has been a PR consultant in Germany. For more SEO articles visit ewritings.

WNW Design Launches LED Lighting Projects Website

WNW Design is proud to announce the launch of LED Lighting Projects’ new ecommerce website, selling LED lamps and lighting systems online.

The LED system of lighting offers low-cost and long-term solutions for lighting in the home or office. LED Lighting Projects’ new website offers secure online purchasing and a wide range of lights from manufacturers such as Astrum and Elektor. From individual lamps to mood lighting, office lighting and more, LED Lighting Projects can advise on the lamps themselves and also provide some of the most sophisticated control devices available.

For more information and to browse the website, go here: LED Lighting Projects

Maximizing the Triangle of Relevancy With Google

The “Triangle of Relevancy” is used to describe the relationship between the text in a landing page, a sponsored advertisement and the keyword or phrase that’s entered into a search engine. Google places a premium on relevancy as it endeavors to ensure visitors have a positive experience by getting search results relevant to their search terms. I will outline specific steps an advertiser can take to maximize the effectiveness of their landing pages and sponsored advertisements in their search engine marketing endeavors.

Relevancy with Landing Pages

The product, if you will, of any search engine is the resulting landing pages. The page’s relevance to the search terms determines whether the page will show up in a search and at what position. Google’s algorithm scores each page and/or sponsored ad’s relationship to the keywords or phrase and uses this information to assist in determining the order in which the landing pages and AdWords ads are placed. The algorithm also monitors the amount of time a visitor spends on a page and includes this in the score.

Search engine optimization (SEO) techniques such as placing keywords in the page’s title and throughout the body of the page can sometimes affect the position of a page in the search results. But of greater importance to Google’s algorithm is whether the keywords are actually located on the landing page and whether they have been randomly included simply to increase the density of the keyword on the page.

A common scenario is for Web developers to design a number of landing pages for the same product specific to certain keywords. Using this method you can end up with 10 or more landing pages for each of your products. This can be expensive, time consuming and difficult to maintain as regular updates are required on each page.

This can be accomplished much more efficiently with a product called Search Chameleon. This product uses scripting on a landing page and a related sponsored ad to adjust the text in the landing page in real time, according to the keywords entered in the search bar. The scripting can be used in the page’s title or anywhere in the body. This not only saves development time but makes page updates much simpler, since you’re only working with one page.

It also assures your page will be relevant to the search regardless of the search term entered. This can be a compelling factor in a visitor’s decision to spend more time on a landing page. An advertiser is then able to maximize the relevancy of their landing pages by automating previously manual processes.
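The technique described here is often known as dynamic keyword insertion: the ad’s destination URL carries the search phrase along, and a script substitutes it into the page template at request time. A minimal server-side sketch under assumed names (the `kw` parameter, template and fallback text are inventions for illustration, not Search Chameleon’s actual implementation):

```python
from string import Template
from urllib.parse import parse_qs, urlparse

# Hypothetical landing-page template; $keyword is filled per request.
TEMPLATE = Template(
    "<title>$keyword | Acme Widgets</title>\n"
    "<h1>Looking for $keyword?</h1>"
)

def render_landing_page(url, fallback="quality widgets"):
    """Fill the landing-page template with the search phrase passed
    along in the ad's destination URL (e.g. ?kw=blue+widgets)."""
    params = parse_qs(urlparse(url).query)
    keyword = params.get("kw", [fallback])[0]
    return TEMPLATE.substitute(keyword=keyword)

print(render_landing_page("https://example.com/landing?kw=blue+widgets"))
```

The fallback keeps the page readable for visitors who arrive without a search phrase, e.g. from a bookmark.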

Relevancy with AdWords and Sponsored Advertisement

The “Triangle of Relevancy” would not be complete without the search terms being included in the title and/or body of your sponsored ad. Google and most search engines will highlight the search terms wherever they show up in the sponsored ad. This allows your ad to stand out and signals to the visitor that your ad is relevant to their search.

So, instead of loading your AdWords campaigns with numerous non-relevant keywords, your best bet is to use a single keyword or phrase that’s relevant to your ad, allowing it to show up in both the title and the body of the ad. This means you should write several ads, each specific to a keyword or phrase, for your AdWords campaigns. This not only makes your ad more relevant but pre-qualifies your prospect, as the ad contains the specific key terms they’re searching for.

Another way to really boost your sponsored ad’s visibility is to have the keyword or phrase in the destination URL at the bottom of the ad. If you’re using an affiliate link, you may not get as good a click-through rate as with a non-affiliate domain, because people will respond more favorably to your ad if they think you’re the product owner.

The best way to show you’re a professional is to use your own domain name as a redirect to your affiliate site. You can use the keyword or phrase from a successful ad as the domain name, and your keyword will be highlighted in the title, the body of the ad AND the destination URL!

The second best way to show you’re a professional is to use a keyword as a sub-domain for a domain you already own. Notice the keyword comes first, catching the prospect’s eye. An alternative would be to add the keyword as a landing page name. These two methods work best when you have a generic domain name that will work with any product and does not conflict with the keywords.

In Summary

The “Triangle of Relevancy” is the most important aspect of a successful search engine marketing strategy. Google is very careful to ensure its visitors have a positive experience with its search engine, so it rewards the more relevant advertisers with a higher position in the search results and in their AdWords ad placements. Both the landing pages and the AdWords ads should focus on specific keywords or phrases for maximum relevancy.

As previously highlighted, Search Chameleon will allow you to customize a single landing page, updating the page title and body text with the specific keywords or phrase a visitor enters into the search engine. Search Chameleon is a proprietary application included in a suite of B2B productivity software called PromoBlackBox. Also included are Google AdWords training CDs developed by a top Internet marketing company. There are a number of additional proprietary applications and software included that will assist advertisers in maximizing the triangle of relevancy.

The search engine marketing landscape is continually evolving as new technology is introduced. Search engines are continually updating their processes as developers learn how to counteract them. One thing that probably won’t change is the triangle of relevancy with the search term, the sponsored ad and the landing page. People will always want specific answers to specific questions.

About The Author
Sydney Nelson is a Microsoft Certified Professional and has a Bachelor’s in Information Technology. Please go to for more information on how to dramatically increase your Website’s conversions and presence on the Internet. More articles on Internet marketing can be found at

The Secret to Creating Ads That Sell

Whether you are starting a new business or looking to attract new sales, there are a few things you can’t afford: losing potential clients to your competitors and wasting money on ineffective advertising.

Unfortunately, these things happen more often than not. So, why do some businesses do so well while others fail? It’s not due to more marketing dollars spent. Instead, a strategic plan was executed to produce an effective advertisement.

There are many important elements that go into producing an effective ad. First, let me start by saying that coming up with a great ad isn’t rocket science. There are no complicated formulas to follow in order to create an ad that grabs the readers’ attention.

Second, relying on creativity alone can kill your ad. Let me explain. Creative ideas are just that, creative. Before you slap down an idea and call it “brilliant”, take a few minutes and ask yourself the following questions:

* Who is my targeted audience?

* Does the ad clearly communicate my message?

* What is “unique” about my message?

* How does my ad compare to my competitors’ ads?

* What will motivate my targeted audience to respond?

Your ad has to be more than just creative. It must exude value in its message. Think of the reasons why you buy a product or service. Almost every reason for a purchase has some sort of value tied to it. Whether it saves money, tastes good or satisfies an emotional need, it serves a valued purpose.

Make Headlines Work for You

The headline is by far one of the most important elements in creating an effective ad. There are thousands of pages both in print and online that cover the subject of headlines. Why? Well, quite frankly, it makes or breaks an ad. The headline is in essence the voice of your ad. It shouts out: “Hey This Product Will Make You Rich, Here’s How!”

Rather than using: “Jane’s Homemade Cookies” use “Instant Smiles with Easy to Bake Homemade Cookies”.

Always use appealing keywords in your headline that attract attention or stir up curiosity. The goal is to get your targeted audience to read the rest of your copy. Consider using the following keywords when writing your headline:

New, How, Why, Free, Save, Fast, Now, Announcing, Introducing, Wanted, Make, Grow, Sale, Limited, Guaranteed.

Next time you notice a headline that grabs your attention, use it and test it on your product or service. But always steer clear of exaggerating your offer. You will quickly lose credibility if you are not honest with your targeted audience.

Writing Simple Yet Effective Body Copy

Once you accomplish transitioning the reader from the headline into the body copy, build momentum by relating to the viewers’ needs and satisfying their desires with each written word.

When writing the body copy, keep words simple and to the point. Use sub-headlines whenever possible and keep paragraphs short.

It’s important to make it easy for the reader to scan through the copy. If your offer consists of many benefits, use a bulleted list.

Avoid cluttering up your ad by trying to cram too much in the space provided. Eliminate unnecessary words that can drown your message.

Using Visual Elements

There are no set rules of where the graphics must be placed on your ad. However, when selecting illustrations or photographs, display your product or choose ones that are relevant to your offer.

Using graphics can enhance your message as well as grab a viewer’s attention when they glance around on a page. Remember, graphics can communicate a message before a single word is read.

Ask for the Sale While Creating a Sense of Urgency

Limited-time offers can create a sense of urgency, but giving your reader a valid reason why they should act now generates more sales. Again, use benefits to attract the reader to take action.

Is there a free gift with the purchase? Will it enhance their lives immediately? Is it a special one-time low price offer?

People love bargains. Use discounts to attract those who want to take advantage of your offer by providing a coupon with an expiration date.

Lastly, specify how your product or service can be obtained and ask for the sale. If it requires a phone call, ask them to pick up the telephone and call. If you require payment, tell them what forms of payment you accept and how to make them.


About the Author: Tom Killian is a partner at Media D’Vine, an Orlando, Florida-based agency specializing in marketing and advertising. Tom has been successfully building online businesses for over 9 years.

5 Tips to Effective SEO Keyword Research Analysis

Keyword research and analysis can be a daunting task when done correctly, and expert keyword research is the foundation of a successful SEO campaign. Many new website owners think the keyword research analysis process is easy. They think free tools, such as the Overture Search Term Suggestion Tool, are the profit pill that will bring them instant results.

Unfortunately, the free tools will only give you a rough guide and a quick indication of whether a hunch is worth further research. These free keyword research tools are limited to basic information. When performed correctly, expert keyword research exposes so much more – all the gems that are tucked away deep.

Real keyword research requires research AND analysis. There are many aspects to the process that cannot be left to chance. Attempting to do the keyword research on your own is like going to a veterinarian to fix your car. My advice to all clients I do SEO consulting for is simply to leave this task to the experts, who have the correct keyword research tools and expertise.

Following are 5 tips for effective keyword research analysis:

1. Latent Semantic Indexing (LSI) – Use multi-word phrases

Latent Semantic Indexing (LSI) is a vital element in Search Engine Optimization (SEO) for better keyword rankings in search results. LSI is based on the relationship, the “clustering” or positioning, the variations of terms and the iterations of your keyword phrases.

Knowing LSI well, understanding how it can be most useful for your SEO, and appreciating the importance it has gained with the algorithm updates at search engines like Google, MSN and Yahoo will all benefit your keyword research for best-practice SEO.

LSI is NOT new. Those doing keyword research over the years have always known to use synonyms and “long tail” keyword terms, which is a simpler way of explaining LSI. More often than not, these long-tail, less generic terms bring more traffic to your site than the main keyword phrases. The real bottom line is that Latent Semantic Indexing is currently a MUST in keyword research and SEO.
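The mathematics behind LSI can be seen in miniature: a term-document matrix is reduced with singular value decomposition, so that pages using related vocabulary land close together even when they share few exact terms. A toy sketch with NumPy (the corpus, terms and dimensions are invented for illustration):

```python
import numpy as np

# Toy term-document matrix: rows = terms, columns = documents.
# Docs 0 and 1 use different product terms ("cookie" vs "biscuit")
# but share the context term "recipe"; doc 2 is off-topic.
terms = ["cookie", "biscuit", "recipe", "engine", "piston"]
docs = np.array([
    [2, 0, 0],   # "cookie"  appears in doc 0
    [0, 2, 0],   # "biscuit" appears in doc 1
    [1, 1, 0],   # "recipe"  appears in docs 0 and 1
    [0, 0, 2],   # "engine"  appears in doc 2
    [0, 0, 1],   # "piston"  appears in doc 2
], dtype=float)

# Truncated SVD: keep only the top k singular values.
U, s, Vt = np.linalg.svd(docs, full_matrices=False)
k = 2
doc_vectors = (np.diag(s[:k]) @ Vt[:k]).T  # row i: document i in LSI space

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Docs 0 and 1 use different product keywords, yet LSI places them
# close together because they co-occur with "recipe"; doc 2 stays apart.
print(cosine(doc_vectors[0], doc_vectors[1]))  # ~1.0
print(cosine(doc_vectors[0], doc_vectors[2]))  # ~0.0
```

This is why synonym-rich, long-tail copy can rank for related queries: in the reduced space, relatedness is carried by shared context, not exact string matches.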

2. Page Specific Keyword Research – Target your niche keyword phrases for each site page

Probably the most common mistake in keyword research is using a plethora of keywords and pasting the same meta keyword tag on every web site page. This is SO not effective! Your keyword research needs to be page specific, focusing on only 2 to 5 keywords per page. It’s more work, but, combined with best-practice SEO, it gives each site page a chance to rank higher on its own.

3. Country Specific Keyword Research and Search Engine Reference

Keep in mind that keyword search terms can be country specific. Even though a country is English speaking, there are different keyword terms you must research – and then reference that country’s search engine when doing your initial keyword research. For instance, the UK and Australia may have different expressions, terminology and spellings (i.e. colour, personalised). Referencing the terms in the corresponding search engine is an important element of keyword research that is often forgotten. So, for example, be sure to check the search terms on or And, of course, if you have 3 to 4 really comprehensive research tools in your arsenal, you will be able to search for historical, global and country-specific search terms easily and effectively.

4. Keyword Analysis – Cross referencing in the search engines

Once the majority of the keyword research has been done for a site page, it’s time to plug those terms into the search engines to determine:

* Whether it is really the desired niche keyword for that page

* The competitiveness of your keywords, along with the strength of the competition behind them

* Whether the other sites listed for your keywords are truly your competitors

* Whether the sites listed for your keyword are even related to your industry, products or services

These critical analyses of keyword phrases are often forgotten. Since keyword research and analysis is the foundation of a successful SEO campaign, you certainly don’t want to build your on-page optimization on the wrong niche keywords!
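One common way to quantify the competitiveness check described above is a keyword effectiveness index (KEI), which weighs search volume against the number of competing pages. A minimal sketch with invented figures and a simple variant of the formula (neither comes from this article):

```python
# Hypothetical monthly searches and competing-page counts per keyword.
keywords = {
    #  keyword:                   (searches, competing_pages)
    "homemade cookies":           (12000, 4_500_000),
    "easy bake homemade cookies": (1800,     95_000),
    "gluten free cookie recipe":  (2400,    210_000),
}

def kei(searches, competing_pages):
    """Keyword effectiveness index: volume squared over competition.
    Higher means more demand relative to supply."""
    return searches ** 2 / competing_pages

# Rank candidate keywords: long-tail phrases often score best.
ranked = sorted(keywords, key=lambda k: kei(*keywords[k]), reverse=True)
for k in ranked:
    print(f"{k}: KEI = {kei(*keywords[k]):.1f}")
```

With these sample numbers the long-tail phrase edges out the generic head term despite far lower volume, which is exactly the kind of niche finding the analysis step is meant to surface.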

5. Ongoing Keyword Research – Repeat your keyword research on a consistent basis

While you may think that you have completed your keyword research analysis and laid a solid foundation for your SEO, you need to keep monitoring your keywords and tweak as necessary. Keywords can change from month to month as search terms shift, genres change, or your niche moves within social networking portals, to name just a few reasons. Maintaining ongoing keyword research is essential for best-practice SEO.

Most Successful Strategy to Streamline Your Keyword Research Efforts:

Yes, many website owners will opt to do the keyword research and analysis themselves, with only a marginal effect on their SEO campaign. It’s not the most successful strategy for the most effective results.

To be certain of your keyword data, accurate keyword analysis should be performed – and cross referenced – across multiple expert keyword tools.

Effective keyword research lays the groundwork for effective SEO results and can help you kick-start the ranking process – perhaps even giving you a step up on your competitors.

The most successful strategy to streamline your keyword research efforts is to hire an expert. Focus your business efforts on your strengths and expertise and allow the SEO experts to effectively perform the keyword research analysis correctly.

About The Author
Keyword Research Analysis expert Valerie DiCarlo helps companies large and small – worldwide – enjoy a long term improvement in website visibility, increased brand awareness, a continuous flow of new sales leads and higher revenues. To discover how you can improve rankings across multiple keyword phrases and search engines, go to: