Archive for May, 2008

Growing importance of Online Press Releases

Tuesday, May 20th, 2008

In a significant report titled the Search Marketing Benchmark Guide, MarketingSherpa found that online press releases, combined with organic search engine optimization, make up one of the most effective Internet marketing strategies. As the best-selling authors Jack Trout and Al Ries wrote: “PR plants the seed. Search marketing harvests the crop.”

Advertising clutter and information saturation have made traditional media less appealing to consumers, who prefer to consume news when they want it and where they want it. Not surprisingly, more and more savvy users are turning to the Internet for their daily quota of news. About 75% of searchers and researchers get their news online today.

Remember, it’s not just your prospects; journalists, too, are increasingly using news search engines.

98% of journalists go online daily:
· 92% for article research
· 81% to do searching
· 76% to find new sources, experts
· 73% to find press releases

On an average day, 68 million American adults go online:
· 30% use a search engine to find information
· 27% get news

(Sources: Middleberg / Ross Survey and Pew Internet and American Life Project)

Google News and Yahoo News have the largest online news audiences, and an optimized press release in their news sections can give a quick, ethical boost to your search marketing campaign. A well-optimized release can put you on page one of their news search results for your keywords almost overnight.

There are many online news wire services, such as PR Newswire (www.prnewswire.com), BusinessWire (www.businesswire.com) and PRWeb (www.prweb.com), that will distribute your press releases to the trade media, online portals, news websites and mainstream media houses. All major news search engines, such as Google News and Yahoo News, incorporate press releases in their search result pages. Submitting your press release to these news wires gives it a good chance of making it onto the news search engines.

Getting into the news search engines is just one part of the story. Making sure your release ranks highly when someone searches for it requires good writing, good structure and well-chosen, well-placed keywords.

The elements of a good online press release:

The inverted pyramid: This is the classic way to tell a news story. Structure the information in decreasing order of importance. Begin with the most important information – and, therefore, the most important keywords and key phrases – and end with the least important.

Deploy your keywords effectively: First identify a set of at least three keywords or key phrases that you want to target. On news search engines, proper research and keyword selection are even more important than on traditional web search engines. Make sure you deploy these keywords in important areas such as the headline (this is the top priority), summary, first paragraph, sub-heads etc. Of course, you should not go overboard with your keyword deployment or search engine algorithms might penalize you for ‘stuffing’. Keep your keywords to a density of around 2 percent of the total number of words on the page.
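
To get a rough sense of whether you are within that limit, you can count how often a phrase appears relative to the total word count. Here is a minimal Python sketch of that calculation; the sample text and the phrase “digital camera” are hypothetical placeholders, and a real release should be checked against its full, HTML-stripped text.

```python
# A rough sketch of checking keyword density in a press release.
# The sample text and the phrase "digital camera" are hypothetical placeholders.
import re

def keyword_density(text, phrase):
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    if not words or not phrase_words:
        return 0.0
    # Count how many times the whole phrase occurs as a run of words
    hits = sum(
        1 for i in range(len(words) - len(phrase_words) + 1)
        if words[i:i + len(phrase_words)] == phrase_words
    )
    # Density = words contributed by the phrase / total words, as a percentage
    return 100.0 * hits * len(phrase_words) / len(words)

release = ("Acme launches a new digital camera. The digital camera "
           "ships in June with a 10-megapixel sensor.")
print("%.1f%%" % keyword_density(release, "digital camera"))
```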

Structure it well: Start with an interesting headline and a short, crisp summary of your press release. Make the release readable with sub-heads, bulleted lists and so on. Add a note about your company at the end of the release, with its website address and contact details. Remember, you must first write for the reader and then optimize for the search engines. Do not write lengthy releases; keep them to no more than 300-400 words.

Use good linking practices: Link to important areas of your site from within the press release. For instance, if you are talking about a particular product or service, use its name as the anchor text linking to the relevant page on your website. Also host the press release on your own site, create a good landing page for it, and link to that landing page from the release.

The bottom line

Good online press releases not only get you instant visibility across the Web, they can also translate into long-term rankings on traditional web search engines. This happens because your press release gets linked to by many topical and niche industry sites. And since your press release links to pages on your website, it can bring lasting visibility for your site.

Short-cuts don’t pay off in the long run in SEO

Tuesday, May 20th, 2008

Some use the straight, narrow and often difficult path to achieve success. Many use short-cuts. As in life, there are no short-cuts to long-term success in search engine optimization (SEO).

Search engine optimization is the process of ensuring that a website gets visibility and ranking on search engines through a series of strategies based on globally-accepted best practices. Or rather it should be.

However, a number of SEO firms around the world use tactics that are not only unethical but specifically banned by search engines. If your website requires SEO services, it pays to keep yourself informed about what is ethical and what is not. If your SEO partner follows black hat tactics, it can have serious consequences for your site in the long run. Such firms may show dramatic results and ROI initially, but the moment the search engines catch on (and they will), your site will be penalized and possibly blacklisted.

The penalties are quite stiff: search engines can lower your site’s ranking, strip it of the power to pass on PageRank, ban parts of the site, or impose a total site ban (also called being ‘graybarred’, because the site’s PageRank is grayed out on the Google toolbar).

How does one distinguish ethical SEO practices from unethical ones?

The fundamental trait that marks an unethical SEO firm is that it focuses on getting the attention of the search engine robot rather than the human visitor. The aim of such a firm is to get quick results, rather than to ensure the long-term success of its clients.

Search engines are constantly on the watch for such unethical practices and are quick to crack down hard on sites relying on such methods. They use a variety of means to detect unethical practices. These include sophisticated automated spam checks, paid spam spotters and, most important, your own competitors. So if you think your unethical practices will go undetected, think again.

There’s a simple reason for their alacrity. For unethical SEO firms, getting a good ranking for their clients in the shortest period of time is all-important, not ensuring that the user finds what he or she wants. So the user gets a lot of search results that he or she finds perfectly useless, damaging the credibility of the search engine.

Make sure that your SEO partner is not using any of these tactics:

Unethical redirects (doorway or gateway pages): The search results throw up a URL that seems to match the word you were looking for, but when you click on it, you are redirected to another URL that may have no connection with what you wanted. Webmasters create a number of fake pages (doorway or gateway pages) stuffed with keyword-rich content specifically for search engine robots; these are not intended for users. When a user clicks on one of those URLs, they are redirected to the single page the webmaster wants them to see. Search engines are continuously improving their technology to detect such techniques, and if your site is found using them, it could be banned.

You’ve been framed: In this case the URL remains the same, but the entire viewing area of the web page is filled by a frame (search engine robots have trouble reading content inside frames) that may or may not contain the content you want. Right-click on the body of the frame and choose ‘Properties’, and you’ll discover that it has a different URL.

Hidden content: This is one of the oldest spamming techniques and perhaps the easiest for search engines to detect: making content such as keywords and links invisible by using the same colour for the text as for the background, for example white text on a white background. Search engine robots can read the content, but users cannot. Another way of hiding content is to stuff keywords into comment tags.

Link farms: Many sites offer reciprocal links and exist solely for that purpose. Avoid such link networks. Another black hat tactic employed by SEOs is to submit competitors’ sites to such networks in the hope that they will be penalized.

Keyword spam: Excessive use of keywords in tags and across body copy. Use your keywords judiciously and don’t overdo it. Another form of keyword spamming is using popular keywords that are irrelevant to your business.

Cloaking: Sites can deploy software on their web servers that can tell whether a request for a page is coming from a search engine robot or a human user (by identifying the IP address or user agent). As a result, such sites can deliver a keyword-rich page to the search engine robot and a completely different page to human users. This tactic is called cloaking, and it is a popular technique with black hat SEO companies.
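
To make the tactic easier to recognise (not to use), here is a bare-bones Python sketch of what cloaking looks like in practice, written as a simple WSGI application; the user-agent check and the page contents are purely illustrative assumptions.

```python
# For illustration only -- a bare-bones sketch of what cloaking looks like,
# so it is easier to recognise; this is a tactic to avoid, not to deploy.
# Written as a minimal WSGI application; the page contents are made up.
def application(environ, start_response):
    user_agent = environ.get("HTTP_USER_AGENT", "")
    if "Googlebot" in user_agent:
        # Keyword-stuffed page served only to the search engine robot
        body = b"<html><body>cheap shoes cheap shoes cheap shoes</body></html>"
    else:
        # A completely different page served to human visitors
        body = b"<html><body>Welcome to our store!</body></html>"
    start_response("200 OK", [("Content-Type", "text/html")])
    return [body]
```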

Interlinking: Interlinking is a way to improve the link popularity of one or more sites. It involves creating multiple websites and linking them to each other purely to increase the overall link popularity of the sites involved.

Trademark infringement: Using the name of competitors as a keyword on your site.

All of these practices are frowned upon by search engines, and you will be discovered sooner or later. Taking short-cuts is clearly not the answer. An SEO firm that takes its job seriously goes about it in an ethical fashion, which pays off in the long term. It treats the search engine not as an adversary but as a friend, and it focuses on delivering relevant results to the relevant audience, not on conning them.

Google Sitemaps: Great SEO Tool

Monday, May 19th, 2008

Until a few years ago, the only way you could get your site indexed on Google was by waiting for Googlebot (the software program, or spider, that Google uses to crawl the Web) to visit your site.

While Googlebot does a fairly comprehensive job of covering the Web, webmasters needed more control over how their site was indexed by the search engine.

In 2005, Google introduced Sitemaps.

Here’s what Google has to say about its Sitemaps: “Search engines such as Google discover information about your site by employing software known as “spiders” to crawl the web. Once the spiders find a site, they follow links within the site to gather information about all the pages. The spiders periodically revisit sites to find new or changed content.

“Google Sitemaps is an experiment in web crawling. By using Sitemaps to inform and direct our crawlers, we hope to expand our coverage of the web and speed up the discovery and addition of pages to our index.”

So what exactly are Sitemaps?
Simply put, sitemaps are files (created in XML) that inform Google about the content or pages on your site. You can create sitemaps by using the Google Sitemap generator or a variety of third-party tools.

What makes sitemaps particularly useful for webmasters, as well as for Google, is that they allow you to tell the search engine a lot more than just the URLs of the pages on your site.

This includes:
· How often you update the page
· How important a page is in relation to other pages on your site
· When it was last updated
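
To make the format concrete, here is a minimal Python sketch that writes a small sitemap file using only the standard library. The URLs, dates, change frequencies and priorities are hypothetical examples; in practice the Google Sitemap generator or a third-party tool can produce the same file for you.

```python
# A minimal sketch of writing a sitemap file with Python's standard library.
# The URLs, dates, change frequencies and priorities below are hypothetical.
import xml.etree.ElementTree as ET

pages = [
    # (URL, last modified, change frequency, priority relative to other pages)
    ("http://www.example.com/",             "2008-05-19", "daily",  "1.0"),
    ("http://www.example.com/products/",    "2008-05-15", "weekly", "0.8"),
    ("http://www.example.com/contact.html", "2008-01-10", "yearly", "0.3"),
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, lastmod, changefreq, priority in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc                # the page's address
    ET.SubElement(url, "lastmod").text = lastmod        # when it was last updated
    ET.SubElement(url, "changefreq").text = changefreq  # how often it changes
    ET.SubElement(url, "priority").text = priority      # importance within the site

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```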

Such information helps both your site and Google.

For instance, you may want the Googlebot to revisit your frequently updated pages more often than a page that is never updated.

It also gives websites more control over what they feel is important on their site, rather than submitting to the whim of a software program. For instance, you can tell Google through your sitemap that your product pages are more important than your Contact page.

A particularly useful feature of Google Sitemaps is the ability to tell Google about pages that would never have been reached by the Googlebot. Most search engine spiders have trouble indexing dynamically generated pages or pages that sit behind a decision point — for instance, pages that can be reached only through a search.

This is a great boon for e-commerce sites because it allows them to submit dynamically generated pages that were till now beyond the spider’s ken.

Sitemaps also gives webmasters a lot of flexibility. If, for instance, your site structure has changed, all you need to do is make those changes in your Sitemaps file, and Google will know about it.

There’s more to Sitemaps

In 2006, Google made some great additions to the Sitemaps program. It now offers webmasters a host of valuable information about how Google views your site.

Getting started is as easy as pie. Just submit your URL and verify your site ownership, and Google starts to display statistics that can help you tweak your site and make it more Google-friendly.

These include:
Crawl errors: This lists areas of your site that Google had trouble crawling.
Search query information: Google provides information on the top queries that returned results from your site and the top queries that directed traffic to your site. You can also get this information for specific regions; for instance, you can see only the search queries that came from google.co.in (India).
Page analysis: Google provides information on how it views your site and the words other sites use to link to yours.
Indexing information: Google tells you whether your site is indexed and whether it had trouble crawling your home page. It will also show you pages from your site in its index, and when the Googlebot last visited your site.
Violations: Google will also inform you if it found any violations of its Webmaster guidelines on your site.

Webmasters can use this information to create much more search-engine friendly sites.

For instance, if one of your key phrases figures in the top search queries but not in the top query clicks, then perhaps you need to look at some on-page factors (such as the title tag) to attract more clicks, or focus on improving the ranking for those phrases.

Page analysis, on the other hand, provides a list of words that Google associates with your site and another list of words that other sites use to link to you. This can be a useful tool for determining whether Google views your site through the same keywords that you do. If it does not, maybe you need to spend more time optimizing your site for them.

In conclusion, Google Sitemaps is a very useful tool for Webmasters to tell Google how to view their site and to understand how Google actually views it.

SEO dynamics for a Shopping Site

Monday, May 19th, 2008

The Internet is the retail industry’s biggest new frontier. Online retail sales are expected to touch $329 billion by 2010, according to Forrester Research. That is a huge market by any yardstick. But any online retailer hoping to get a slice of that big pie will face one major hurdle on the path to customer acquisition: making their dynamic, database-driven sites search engine friendly.

It’s a no-brainer that search engines such as Google drive a majority of the traffic to web sites. And if online retailers want to attract huge volumes of prospects to their sites, they must have a strong SEO/SEM strategy.

That’s easier said than done. Shopping sites are by nature dynamic and database-driven. That’s the only way they can remain user-friendly. A shopper visits a site, chooses a few parameters and variables and is shown a page of products that match his or her criteria.

But search engines use software spiders, not humans, to crawl and index web sites. And though search engines have, over the years, become better at indexing dynamic pages, they are still known to stumble at this hurdle.

What exactly is a dynamic page? A simple definition is that a dynamic page is one that does not exist until a user or an agent submits certain parameters or variables to a database. For instance, consider a page on an online shopping site that has three drop-down boxes. One contains a list of product categories such as shoes, clothes and toys; the second has a list of locations such as Kolkata, Mumbai and Chennai; and the third has a price range (under Rs.1000, between Rs.1000-2000, over Rs.2000). If you choose ‘Shoes, Mumbai and Under Rs.1000’, you will get a page that lists products matching your selection criteria.

The problem is that a search engine crawler will never be able to input those variables. Therefore, as far as the search engine is concerned, the dynamic page simply does not exist.

So how can sites with dynamic content get listed and ranked highly on search engines? Here are a few ways to tackle the issue:

Create static URLs: To begin with, convert your dynamic URLs into static ones. Search engine crawlers cannot read (or choose to ignore) query and database characters such as ‘$’ and ‘?’, aptly described as ‘spider traps’. There are many ways to convert dynamic URLs into static ones; it’s best to consult your SEO partner on the best approach for your site.
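
As a rough illustration of the idea (not a drop-in solution, since the right approach depends on your platform), here is a small Python sketch that turns the kind of parameters a shopper might select into a keyword-rich, static-looking URL path; the category, city and price-band values are hypothetical.

```python
# A rough illustration of turning the parameters a shopper selects into a
# keyword-rich, static-looking URL path. All values here are hypothetical.
import re

def static_url(category, location, price_band):
    parts = (category, location, price_band)
    slug = "/".join(
        re.sub(r"[^a-z0-9]+", "-", part.lower()).strip("-") for part in parts
    )
    return "/" + slug + "/"

# Dynamic URL:  /products?cat=Shoes&city=Mumbai&price=Under+Rs.1000
# Static URL:   /shoes/mumbai/under-rs-1000/
print(static_url("Shoes", "Mumbai", "Under Rs.1000"))
```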

Create information-rich static pages: Create a browsable directory of product- or category-specific pages and link them to your dynamic pages. Optimise these static pages for your key terms and then submit them to search engines. But handle this carefully: many webmasters create static shadow pages and are penalised by search engines for spamming.

Create content-rich catalogs of your products that include reviews, ratings, user feedback etc. Such content-rich pages are not only search engine friendly, but they are also very user friendly.

Consider paid submission: Online shopping sites can also opt for paid inclusion in search engines like Yahoo! with ‘Trusted XML feeds’. This way, they can submit dynamically generated pages that crawlers would otherwise never visit. It’s important to remember that while this ensures your pages are indeed indexed, it does not guarantee where they appear. So there’s no getting around it: websites have to be well optimised to figure at the top of the search results.

Get your products into comparison and shopping engines: More and more online shoppers are using comparison and specialized shopping search engines such as Froogle. Get your products listed on them.

While search engine optimisation is indeed critical, it’s by no means the end of the story. Online merchants must build sites that are customer-friendly and not just search-engine-friendly. If a shopping site has good information, great deals, and a reputation for speedy delivery and quality products, it’s bound to attract a lot of custom. If it doesn’t, it won’t be able to sell much, SEO-friendly or not.

Building a brand through SEO

Monday, May 19th, 2008

In the dense undergrowth of the Web, search engine marketing and optimization are far more effective and economical at drawing attention to your website and products than any other online marketing tool. Traditionally, most marketers have used SEO as a direct response mechanism to generate leads.

That may fast be changing. SEO and SEM, if used effectively, can be powerful brand-building tools as well.

Building brand recall through SEO

It’s clear that a vast majority of potential buyers of goods and services search for relevant websites through search engines, and that searching is one of the most popular activities online. It’s also well known that most searchers don’t look beyond the first few result pages. Now let’s say your company is the leader in the space it operates in. If you don’t have a good SEO strategy, regardless of how good your products or services are, you are likely to lose out on several brand-building opportunities, to the exclusive benefit of your competitors.

Being prominently present on search engine result pages can significantly increase brand recall and your brand’s association with your products and services.

The online Indian demographic is young, savvy and flush with spending power. No brand builder can afford to ignore this market, irrespective of whether the goods are sold online or offline.

Besides, consumers are increasingly going online to research and compare products before they buy, even when they eventually buy offline. And the tool they use to do this is a search engine. A user-friendly, information-packed, well-optimized site is the only way to get traction for your products among these users.

The brand building blocks of SEO

Many marketers, in their desire to rank highly on search engines, often compromise the brand’s message. While getting a high rank is important, it is equally important how your site is displayed on Search Engine Result Pages (SERPs).

There are three components to a listing on an SERP: (1) the page title, (2) the website description and (3) the URL. Often, companies stuff their titles with keywords that communicate no brand message. Use each page title intelligently, because if your site appears on the first few result pages, the title is what will make a user click through to your site and experience your brand more deeply.

Many sites do not fully use the potential of the description meta tag, under the impression that it does not have great SEO value. Always write a crisp and engaging description of your company or your website; it can draw a user into your site.

An often-ignored element is the URL displayed on SERPs. Sites that use automated publishing tools such as a CMS have URLs that are dynamically generated. These URLs are unintelligible to users because they contain strings of letters, numbers and special characters. Your URLs, as far as possible, should be self-explanatory. For example, if you are a maker of digital cameras, it’s better to have something like www.yoursite.com/digital-cameras/slrs.html rather than www.yoursite.com/?l=php-i22u&r=443385294014131&w=5

SEO can help you create a positive first impression on prospective customers, but you must also ensure that your website delivers on the promise. A mistake many marketers make is to direct users to the homepage of the site (especially with pay-per-click campaigns). A better practice is to direct users to relevant section pages that are easy to navigate, uncluttered and packed with information relevant to the user’s query.

Tackling the flip side

While a good SEO strategy can help you attract Internet users to your website to experience your brand, it is also particularly useful in times of crisis. The online medium is all about buzz and is compulsively viral. During a crisis, adverse or false information can spread rapidly and then be picked up by search engines. If you don’t have a good SEO strategy in place, it is likely that when users search for your brand, they may find sites or pages that dilute it. Search engine optimisation can help you get your message across to prospective customers and make your voice heard above the din.