Tuesday, May 22, 2007

How Web 2.0 Affects SEO Strategy

"Web 2.0" was originally coined by O'Reilly Media in 2004. Web 2.0 properties are perceived as harbingers of second-generation Web usage, such as interactive communities and hosted services that facilitate collaboration and sharing between users.

"Web 2.0" is also one of the most overused and abused terms on Wall Street, sublimely crafted to reinvigorate investing in online entities that remain rooted in Web 1.0 technologies. Even though much of the machinery behind the Web remains relatively unchanged -- just upgraded, versioned, and rebundled -- people surfing the Web have changed. Web netizens have progressed beyond solely seeking information to embracing greater levels of interaction, even if it's virtual.


It's not enough anymore to deliver goods as promised from an e-commerce site. Merely informing your online audience of breaking news is passé, and amusing visitors with quirky applets is seriously behind the times.


To succeed on the Web today, you must engage your visitors so they return repeatedly. Toward this end, some Web 2.0 platforms could be your site's savior; others could be its online demise. Either way, much of the discovery depends on your search channel. This is where things get very interesting for those who seek greater visibility.


Some Web 2.0 content management systems, such as blogs and wikis, are primed and relatively optimal for search engine visibility straight out of the box. Google, in particular, seems to adore blogs. Blogs and wikis have essentially replaced outdated forums, third-party product reviews, comments in guest books, and user groups because specific elements inherent to blogs and wikis are naturally search engine optimized. They're textually rich, extensively interlinked, frequently updated, and rooted in semantic markup.


If you haven't yet embraced corporate blogging or transformed your glossary of terms into a wiki, your Web site is behind the times. Blogs and wikis can help you expand your search channel to embrace and engage new users, as long as you're willing to let go of the wheel a bit to allow your prospects, clients, and customers to help drive your online business.


Unfortunately, many so-called Web 2.0 embellishments can choke the life out of your Web site's search channel. Interactive elements such as AJAX, widgets, Flash, podcasts, and video are inherently inhospitable to search engine spiders, rendering them dazed and confused. If you intend to embrace these Web 2.0 elements for increased conversions, improved usability, and greater customer interaction, be prepared to leverage XML, RSS, mirror sites, and syndication services to keep the search-referred traffic flowing.


To be certain, some search engines have rudimentary means of extracting content and links from a Shockwave Flash file (.swf). Nonetheless, any content or navigation embedded within a Flash file will at best rank poorly compared to a static, HTML-based counterpart and at worst won't make it into the search engine's index.
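
One common workaround, sketched below, is to nest ordinary HTML inside the <object> element that embeds the movie: plugin-equipped browsers render the Flash, while spiders and plugin-less visitors fall back to the crawlable markup. The file name and copy here are invented placeholders, not a definitive recipe.

    <!-- Sketch: HTML fallback nested inside the Flash <object>.
         "promo.swf" and the copy below are placeholder values. -->
    <object type="application/x-shockwave-flash" data="promo.swf"
            width="600" height="400">
      <param name="movie" value="promo.swf" />
      <!-- Spiders and plugin-less browsers get this instead of the movie -->
      <h2>Spring Product Showcase</h2>
      <p>Browse the <a href="/products/">full product catalog</a> in
         plain, crawlable HTML.</p>
    </object>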


AJAX poses similar problems to spiders as Flash because it too relies heavily on JavaScript. Search engine spiders can't click, so they can't execute JavaScript commands. The beauty of AJAX is it can be used to pull data seamlessly into a loaded Web page's background, sparing users from dreaded levels of click-and-wait ennui. The processing offers a great timesaver for users, but the additional content that's pulled in via AJAX is virtually invisible to spiders -- unless the content was preloaded into the page's HTML and simply hidden from the user via CSS.
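
Here's a minimal sketch of that preload-and-hide pattern; the element ID and copy are invented for illustration. The full content ships in the initial HTML where spiders can read it, and script merely unhides (or later refreshes) it for interactive users.

    <!-- Sketch: content is preloaded in the HTML (crawlable), hidden with
         CSS, and revealed by script; an XMLHttpRequest could refresh it
         later without a full page load. ID and copy are placeholders. -->
    <div id="product-details" style="display: none;">
      <h2>Product Details</h2>
      <p>The full, crawlable copy lives here in the initial page source.</p>
    </div>
    <script type="text/javascript">
      // Unhide the preloaded block once scripting is confirmed available.
      document.getElementById('product-details').style.display = 'block';
    </script>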


Unlike Flash and AJAX, XML and RSS are inherently search engine friendly. That's because an RSS feed is an XML file, and XML is text rich with semantic markup. The problem is that RSS isn't yet well supported within traditional Web search, though it is within certain vertical engines, like Google Blog Search and Technorati. Perhaps Google Universal Search will change the scheme of things a bit.
We actually have RSS to thank for the evolution of pod- and videocasting. It's the RSS feed with audio or video enclosures that makes a podcast visible to search engine spiders, not the fact that you have audio or video files available for download. If you already produce podcasts, make certain to utilize your MP3 files' ID3 tags to incorporate show notes, images, and links to your podcast feeds. Then syndicate your audio and video feeds via multiple venues for optimal Web exposure.
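
For illustration, here is a minimal sketch of an RSS 2.0 item carrying an audio enclosure; the URLs, episode title, and byte length are placeholders. It's this feed entry, not the MP3 file sitting on your server, that feed-aware engines index.

    <!-- Sketch: an RSS 2.0 <item> with an audio enclosure.
         All URLs, the title, and the length value are placeholders. -->
    <item>
      <title>Episode 12: SEO for Podcasts</title>
      <link>http://www.example.com/podcast/episode-12</link>
      <description>Show notes: how feeds make audio findable.</description>
      <enclosure url="http://www.example.com/audio/episode-12.mp3"
                 length="10485760" type="audio/mpeg" />
      <pubDate>Tue, 22 May 2007 09:00:00 GMT</pubDate>
    </item>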


The good news is SEO is evolving to better meet Web 2.0 challenges. Specific SEO tactics exist to expose content trapped in Flash and AJAX, as well as tap into contextually rich audio and video transcripts.


The bad news is the major search engines still can't cope with these elements without some assistance. So the burden is on us to link disparate SEO tactics into an overarching SEO strategy specific to our industries and in line with our business goals. As the Web produces new ways to present revitalized content, so too must your SEO strategy evolve.

Monday, April 30, 2007

Which Blog Platform is the Best for SEO?


A month ago, I opened accounts at the three major hosted blog platforms (WordPress, Typepad & Blogger) and wrote two equivalent posts (similar titles and keyword densities) on each platform using made-up keywords. I also hosted a WordPress-powered blog on a newly registered domain and created similar posts there (I recognize that this isn't an entirely fair comparison because I didn't do the same for all three platforms). I used all of the default options of the blog platforms.

I then monitored the search results for these fictitious keywords to see how quickly the posts would be indexed and how the posts would rank relative to each other.

The two terms I targeted were: patwrxa & qtyuist, and the four blogs I used were: patwrxa.com, patwrxa.wordpress.com, patwrxa.typepad.com, patwrxa.blogspot.com

1. Time to Indexing

The table below shows the average time, in days, that it took for the posts to be first seen in each search engine. I didn't include Live.com, as it had only fully indexed one of the four sites (Typepad). I also tested and excluded Technorati, which never picked up the Typepad blog.

The rank is my qualitative assessment based on the data, my perception of the importance of the various engines, and how I saw the data change over the week. Yeah, I wish it was more scientific too, but you’ll have to deal.

Rank | Platform     | Google | Yahoo | Google Blog Search
1    | Blogger      | 2      | 10    | 1.5
2    | WordPress    | 5.5    | 8     | 3
3    | WP on Domain | 6      | 8     | 1.5
4    | Typepad      | 3.5    | 21    | Never Indexed



I had several observations:


  • Blogger gets content into Google’s index the quickest (not surprising)
  • WordPress.com and Hosted WordPress were indexed similarly quickly
  • Some posts jumped in and out of the index.
  • I was surprised to find Typepad indexed slowly on Yahoo, and not at all on Google Blog Search.

Also notable: Yahoo is already struggling with spam on the search terms.


2. Relative Ranking

Average position was calculated by taking the average of the two posts' positional ranks.

A note on Yahoo's numbers: the blogs indexed in Yahoo fluctuated a lot, so I have two numbers listed for Yahoo. The first is the present-day rank; the second, in parentheses, is the average rank when all pages were in the index (about a week ago). N/I means Not Indexed (currently).

Rank | Platform     | Google | Yahoo     | Google Blog Search
1    | WP on Domain | 1.5    | 1 (1.5)   | 3
2    | WordPress    | 2.5    | 2 (4)     | 1.5
3    | Typepad      | 2      | N/I (2)   | N/I
4    | Blogger      | 4      | N/I (2.5) | 1.5

OK, these results were a lot less consistent between platforms and between search engines. I did notice several things:

  • The blog on the domain patwrxa.com did the best for the keyword ‘patwrxa’.

  • I didn’t see any major sandboxing issues with the new domain. Or, the blogs all faced similar sandbox issues.

  • There seems to be an inverse relationship between rankings on Google Search and Google Blog Search. This may be coincidence.

  • The rankings did bounce around a bit. The numbers I posted above are the current rankings and generally represented the steady state.



Summary - the Best Search Optimized Blog Platform?

WordPress, although the assessment is more qualitative than quantitative. It is indexed quickly in all engines and ranks well in all engines. If you host your own version of WordPress, you'll find the search performance to be similar to (maybe even better than) a WordPress blog hosted on wordpress.com.

Disclaimer: Although I think it is informative, this is far from a scientific test. To properly conduct a test, you would need to create tens of blogs on each of the platforms and test each with real keywords in addition to the manufactured ones - time/skills that I definitely don’t have.


Originally Posted By: http://www.naffziger.net/blog/2007/04/29/which-blog-platform-is-the-best-for-seo/

SEO Degrees?

We recently posted the video of an interview I did with Greg Jarboe at the 2007 SES New York conference. In the video, Greg and I were discussing the general lack of qualified search engine professionals that seemed to be one of the main buzz-themes of the conference this year. We both agreed that it was past time for mainstream universities to step up to the plate a little bit and start including some of this stuff in their coursework/disciplinary offerings.

Some of the early comments on the video are common enough; you hear them thrown around as a response to the concept fairly often: universities can't teach SEO because it changes too quickly, the technology is just too new, the landscape changes too fast, the rules and guidelines are too fluid. Actually, I really couldn't disagree more.

As a matter of fact, I kinda think that's a cop-out argument. I've been following the industry for almost 8 years now, and while things are definitely fast-paced and rapidly evolving, I certainly wouldn't go as far as saying SEO couldn't or shouldn't be taught at the university level.
Take medicine, for example: another industry that changes at an extremely rapid pace. There are lots of other examples, such as tax law for accountants and building codes for architects. Every industry/profession has changes.

Sure, the search industry might have more changes than some, and the resulting changes might completely invalidate or render moot things that were taught 6 months ago. But I don't think that is unique to SEO, and I don't believe it should preclude SEO as a course of study in universities.

Look at some of the things that change in medicine. Hormone replacement therapy, mammograms... Vioxx, anyone? Things change just as fast in medical science as in any other field; that certainly isn't any kind of argument against medicine being taught as a curriculum. When things change in medicine, there is typically a lot more on the line than a first-page result for ringtones. For that reason, doctors are required to complete a set number of CME (continuing medical education) hours every year, depending on their area of practice and what state they are in.

The fact that an industry or discipline ‘changes too much’ is a horrible reason/justification for it not to be taught. The fact of the matter is, there is a huge deficiency of qualified search people right now. There are some organizations like SEMPO and the DMA working to develop/create training and certification programs - and that’ll surely help. At the end of the day however, the question isn’t (nor should it be) SHOULD universities be teaching this stuff - the question is WHEN are they going to start and WHY is it taking them so long.

SEOs constantly complain about their industry being looked down on as shady business. They get all huffy when somebody like Calacanis questions the philosophical merits of their profession, yet many of them bristle at the thought of SEO being taught in colleges. I submit to you that if SEO were part of the marketing curriculum in mainstream universities across the country, with its own area of specialization or maybe even its own discipline, this staffing crisis would be gone in short order. Beyond that, the industry itself would be legitimized and accepted on a far broader basis than it is today.

Monday, April 16, 2007

Let Google's Algorithm Show You The Traffic

Recently Rand Fishkin of Seomoz.org brought together 37 of the world's top SEO experts to tackle Google's algorithm, the complex formula and methods Google uses to rank web pages. This ranking formula is extremely important to webmasters because finding which factors Google uses to rank its index is often considered the Holy Grail of site optimization.

Google's ranking factors affect how and where you are listed in their search engine results pages, or SERPs. Since obtaining top positions for your targeted keywords often spells success for your site, knowing Google's ranking factors can be very beneficial. Every experienced webmaster knows Google is the main supplier of search engine traffic on the web; getting listed on the first page, or anywhere in the top 10 positions, for popular keywords will result in plenty of free, quality, targeted traffic.

Briefly listed below are some of the main ranking factors you should be optimizing your web pages for in your marketing. The majority of these ranking factors will be very familiar to most webmasters who take full advantage of any and every SEO tactic which will give their site an edge over their competition. Here are some of the main ranking factors to consider:

1. Keywords In Your Title And On Your Page

Place your keyword or keyword phrase in the title of your page and also in your copy. Many webmasters use variations of their keywords on the page and also include them in the H1 headline. (See the combined markup sketch after point 2.)

2. Keywords In Your URL

Keep your page on topic and place your keyword in the URL. Use your keyword in the H2, H3, and subsequent headlines. Place it in the description and meta tags, and place it in bold/strong tags, but keep your content readable and useful. Be aware of the text surrounding your keywords; search engines will become more semantic in the coming years, so context is important.
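
To make points 1 and 2 concrete, here is a minimal sketch of on-page keyword placement for a page targeting the invented phrase "industrial widgets", hosted at a keyword-bearing URL such as example.com/industrial-widgets/. The phrase, copy, and URL are placeholders, not a tested template.

    <!-- Sketch: keyword placement in title, meta description, headings,
         and body copy. "industrial widgets" is an invented example. -->
    <html>
    <head>
      <title>Industrial Widgets - Buying Guide and Price Comparison</title>
      <meta name="description"
            content="How to choose industrial widgets: sizes, materials, and pricing." />
    </head>
    <body>
      <h1>Industrial Widgets Buying Guide</h1>
      <h2>Choosing the Right Widget Material</h2>
      <p>Our <strong>industrial widgets</strong> are machined to spec...</p>
    </body>
    </html>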

3. Create High Quality Relevant Content

Have high quality relevant content on your pages. Your content should be related to the topic of your site and updated regularly depending on the nature of your site.

4. Internal Onsite Linking

Internal linking is important to your overall ranking. Make sure your linking structure is easy for the spiders to crawl. Most suggest a simple hierarchy with links no more than three clicks away from your home/index page. Creating traffic modules or clusters of related links within a section on your site has proven very effective for many webmasters, including this one. For example, creating a simple online guide on a subject related to your site's topic can prove very beneficial. Keep all the links connected and closely related in subject matter, and don't forget to have occasional external, anchor-keyworded links pointing to these internal pages on your site instead of to your homepage. Build your links deep.
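
As a sketch of such a cluster, the related-links module below interlinks the pages of a hypothetical guide section; the paths and titles are invented placeholders.

    <!-- Sketch: a related-links module repeated across a guide section,
         keeping every page within a click of its siblings.
         Paths and titles are invented. -->
    <div class="guide-nav">
      <h3>Widget Maintenance Guide</h3>
      <ul>
        <li><a href="/guide/cleaning-widgets.html">Cleaning Widgets</a></li>
        <li><a href="/guide/widget-lubrication.html">Widget Lubrication</a></li>
        <li><a href="/guide/widget-storage.html">Widget Storage</a></li>
      </ul>
    </div>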

5. Only Linking To High Quality Related Sites

Don't forget to link out to high-quality, related sites with good PageRank. Linking to high-quality sites shows the search engines your site is very useful to your visitors. Build relationships within communities on the topic of your site. Be extremely careful not to link to bad neighborhoods, link farms, and spam sites... when in doubt, don't link out! Unless your site has been around for years and is well established and trusted by Google, linking to such sites will have an adverse effect on your site's overall ranking. Linking only to high-quality content sites will give your site an edge over your competition.

6. Global Linking Popularity

One of the major ranking factors is the Global Linking Popularity of your site. You should try to build plenty of inbound links from quality sites. One simple and effective way to do this is through writing articles and submitting them to the online article directories. Only related sites will pick up and display your articles with your anchor text links back to your site. These are often ONE-WAY-LINKS. But don't just write articles to get links, write quality content that will help the reader first and the links will come naturally. Also remember an article is an extremely good way of pre-selling your products and gaining trust with your potential customers.

7. Anchor Text Is Very Important

Anchor text is an important factor you must not forget to use. Perhaps more importantly, these inbound links should come from pages related or relevant to your site's topic, which will play an important role in your rankings. Don't ignore the text surrounding your links, and use different anchor text across links to avoid keyword spamming. Keep in mind that as search engines become more semantic, the whole text of your article will probably be considered your anchor text, thus making articles even more important to your rankings.
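
A quick sketch of the difference; the URLs and phrases are placeholders:

    <!-- Weak: tells the engine nothing about the target page -->
    <a href="http://www.example.com/">click here</a>

    <!-- Better: descriptive, and varied from one inbound link to the next -->
    <a href="http://www.example.com/">industrial widget guide</a>
    <a href="http://www.example.com/">choosing industrial widgets</a>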

8. Number And Quality Of Your Inbound Links

Your inbound links should also come from related sites with high global link popularity. The more links you have from these popular, related sites, the higher rankings you will get. Many SEO experts suggest you should have a steady stream of new sites (inbound links) added each month to keep your rankings growing. These links will age and increase your rankings after 4 or 5 months. Both quality and quantity are important.

9. Reliable Server And Service

Like any business, Google is only serving up a product (SERPs) to its customers, and this service must be continuous and available at all times. Make sure you have a good, reliable server, because any extended downtime when your site is inaccessible to the bots may be detrimental to your rankings. If it is down for over 48 hours, you could be dropped from the index. Ouch!

10. Duplicate Content Is A NO NO!

Make certain you don't place duplicate content on your site. This may affect your rankings and get your pages thrown into the supplemental index. Be careful not to use duplicate title or meta tags on your pages, as this will lower and dilute your internal page rankings, resulting in poor optimization. Your overall SEO strategy should be to provide valuable, relevant content and links for your visitors and for the search engines. Furthermore, as mentioned earlier, be extremely careful who you link out to from your site. Avoid spam sites, link farms, or selling links. Although it is a bit outdated, using the Google Toolbar will still give you a general overview of a site's PR, or PageRank.

These are some of the most common and important ranking factors Google uses to rank and display their search engine results. Optimizing your site or keywords for these factors can prove very beneficial and rewarding. There are many more factors so you should use the Seomoz.org link in the resource box below to get all the gory details. For any novice or experienced webmaster it makes for a fascinating read and is extremely helpful in tackling Google's complex ranking system or algorithm. Conquer it and an endless supply of free organic traffic is yours for the taking.

Copyright © 2007 Titus Hoskins.

This article may be freely distributed if this resource box stays attached.

Originally Posted By: http://www.promotionworld.com/se/articles/article/070416letgooglesalgorithm.html


Easy SEO

Here is a very basic and easy SEO step to improve your search engine ranking for a particular phrase, term, or word in a particular search engine. First, create a text file on your computer and name it "analysis". Then follow these steps:

Step 1: Go to the search engine on which you want high rankings.

Step 2: Search for the term you are targeting. For example, if you want to rank high for "SEO", then search for it.

Step 3: Look at the title of the number-one site that the search engine shows you. Count the number of times your search words appear in it. Add this number to your "analysis" file.

Step 4: Count the number of times they appear in the description provided by the search engine. Add this number also to your "analysis" file.

Originally Posted By : http://www.theecommercesolution.com/blog/?page_id=6


Friday, April 13, 2007

SEO Case Study

Recently I took on a new client to assist in some search engine optimization. The site was receiving ZERO search engine traffic. For a specialized business, such as an industrial and insurance job recruiting company, search engine traffic can be very beneficial because it is so highly-targeted.
After taking a look at the web site, although it looked nice and was easy to navigate, I noticed it had some fundamental flaws that were preventing it from ranking well in the SERPs (search engine results pages). Please note, I did not design this web site - I am only recoding parts of it so it will rank better for particular search queries.
In this case study, I will outline the SEO flaws of the web site and present the solutions to these problems. On a side note, it makes no sense to me why a web developer would create a web site only to have it be completely ignored by the search engines.
If you have any basic understanding of the benefits of SEO, it is relatively easy to design a well-optimized site from the start by using proper tags and integrating the keywords into the page copy. However, quite often lately I come across existing corporate web sites which are so terribly optimized for search engines that they not only contain major design and usability flaws, but are listed in the supplemental index and do not even rank on the first page for a search query of the company name!
Current Status of Search Engine Rankings
As of a few days ago, the company, which specializes in job recruitment for the industrial and insurance industries, was not even present in Google’s main search listings - it was being listed in the supplemental index.
For those of you who don’t know what the supplemental index is, it is basically Google’s trash can index where it deems pages as not search-worthy. An SEO’s worst fear is to have his or her site listed in the supplemental index. Luckily, if you understand the basic principles of SEO, you never have to worry about this.
What are the implications of being listed in the supplemental index? Well, a few days ago, if you typed in the company name into Google, the web site was not even listed on the first two pages. My goal is to get the site in the top 3 for that query and several other queries for various other keyword phrases.
It is no surprise, however, that the site was listed in the supplemental index, because based on its source code, Google probably had no clue what the web site was about. Here are the problems and solutions to better optimizing that web site, or any web site, for that matter, for search engine relevance:

Problems with the Original Page Design


  • No text-based copy on the home page. All of the page copy was image-based. Unless you write ALT attributes for the images, search engines have no way of reading text embedded within an image. Besides, even ALT text does not carry as much weight as actual page copy.

  • Image-based navigation needed to be converted to text-based. Again, search engines have no way of reading image-based text. Not only that, but from a usability perspective, image-based navigation is not scalable when someone enlarges the font size from within a web browser.

  • Poor use of title and header tags. These are key HTML elements that search engines use to figure out the relevance of a web site. On the web site, there were no header tags used and the titles contained no keywords. The home page even contained the word "welcome" - a redundant word for SERPs. On a side note, I recently wrote an article about why the page title is so important.

  • Non-existent META keyword and META description tags. This is another no-no if you want a page to rank well in the SERPs. Although the META keywords tag is not used by Google, it is still used by some search engines. Google does, however, take the META description tag into account when it shows SERPs for a query containing a particular keyword. You want to make sure you incorporate your keywords into your META description tag.

  • No backlinks. A web site needs a good number of backlinks for Google or any other search engine to trust the site and believe it is a credible source of information. This can be fixed by adding the web site to free and trusted directories such as DMOZ or the Yahoo Directory.

  • Keywords not present in any page copy. Of course, the first thing to do is figure out what keywords you want the site to rank for, but after that, if the keywords are not present in the page copy, you will have a tough if not impossible time ranking for those keywords.

  • No sitemap. A sitemap is what Google and other search engines use to figure out what pages to crawl on a site. Every site needs one; a minimal example follows this list. I wrote a post a while back about why a sitemap is your blog's best friend.
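
Here is a minimal sitemap in the sitemaps.org XML format; the domain and paths are invented stand-ins for the client's pages. Save it as sitemap.xml at the site root and submit it to the engines.

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Sketch: a two-URL sitemap; domain and paths are placeholders. -->
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2007-04-13</lastmod>
      </url>
      <url>
        <loc>http://www.example.com/jobs/industrial-recruiting.html</loc>
        <lastmod>2007-04-10</lastmod>
      </url>
    </urlset>
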
These were just a few of the issues that were causing the site not to rank well in the SERPs. I have already fixed most of those major problems, with the exception of the header tags.
I expect that within the next couple of weeks as I add the site to more targeted directories and further refine the page copy to contain the desired keywords, the site will receive even more search engine traffic and start to rank very well for the desired keywords.
I will keep you updated over the next few weeks as the rankings for the company continue to improve.
If you are interested in learning more about SEO, the best place to start is by reading Aaron Wall’s SEO Book. Aaron Wall is basically known as the Michael Jordan of SEO and his eBook is truly one of the only ebooks worth reading. SEO really is not that difficult and every web developer should take it into consideration when developing new web sites.

Monday, April 9, 2007

SEO Ranking Factors

As an online marketing firm with a history of expertise in search engine optimization, we've done many SEO projects. At last count, over 70 are done or in progress. We would do more, except that so much confusion exists around them that they take a long time to both sell and complete. There are myriad reasons for this, mostly to do with misconceptions about what works and what doesn't, held by a company that thinks it has some in-house expertise or, worse yet, 'a friend' who knows how to do SEO. We get more than a few of these clients later, after they have failed or have even been banned by the search engines.

With appropriate attribution to SEOmoz.org, I will list the top ten positive factors, as they 'found' them by gathering feedback from some of the top SEO gurus. We've reviewed them and for the most part agree with them, although your mileage may vary based on your OWN website. They are:

  1. Keyword Use in Title Tag
  2. Global Link Popularity of Site
  3. Anchor Text of Inbound Links
  4. Link Popularity within the Site
  5. Age of Site
  6. Topical Relevance of Inbound Links
  7. Link Popularity of Site in Topical Community
  8. Keyword Use in Body Text
  9. Global Link Popularity of Linking Site
  10. Rate of New Inbound Links to Site

This list wouldn't be complete without the top 10 controversial factors:

  1. Manual Authority/Weight Given to Site by Google
  2. Relevance of Site's Primary Subject
  3. Participation in Link Schemes or Actively Selling Links
  4. Duplicate Title/Meta Tags on Many Pages
  5. Global Link Popularity of Linking Site
  6. Quality of the Document Content
  7. Domain Extension of Linking Site
  8. Server is Often Inaccessible to Bots
  9. External Links to Low Quality/Spam Sites
  10. TLD Extension of Site (edu, gov, org, etc)

It's worth noting that although these factors are weighted and appear to have been researched carefully, Google's algorithm in particular has hundreds if not thousands of factors that it takes into consideration, both on-page and off-page. As you can surmise, it's an ongoing process with a lot of variables.

So the next time somebody says they can do SEO, bring some of this up and see if they know what they're doing.... The entire article can be found at: http://www.seomoz.org/article/search-ranking-factors#f53

Originally Posted By : http://www.foundpages.com/blog/2007/04/seo-ranking-factors.html

Monday, April 2, 2007

Hints for growing your search engine campaign

Practitioners have analyzed the new personalized search and determined that, based on a user's search history and profile, results on the Google search engine could vary by as much as 90 percent. Adding a feed, a Google Gadget, or an 'Add to Google' button to your pages so users can subscribe to your content will become a necessity as more and more personalized searches are performed.
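
One piece of this that is easy to add today is feed autodiscovery: a link element in the page head that lets browsers and personalized homepages find your feed. A minimal sketch, with a placeholder feed URL:

    <!-- Sketch: feed autodiscovery in the page <head>.
         The title and feed URL are placeholders. -->
    <link rel="alternate" type="application/rss+xml"
          title="Site Updates (RSS)"
          href="http://www.example.com/feed.xml" />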


Originally Posted By : http://www.elementfusion.com/personalization-and-search-engine-optimization-seo

Search Engine Personalization

Google has implemented a new addition to their algorithm that will deliver better results for people using their search engine. This means Google (the monster of all search engines) tailors your search results based on your previous searches and the sites you visit regularly. This feature is only used for people who have an account with Google, so don't worry: Google didn't install a tracking device on your machine.