Jay Harris is Cpt. LoadTest

a .NET developer's blog on improving the user experience of humans and coders
 
Filed under: Blogging | JavaScript | SEO

Search Engine Optimization is high on the radar right now. Whether it be the quest for the first coupon site on Bing, the highest-ranked cosmetics site on Google, or the top-ranked "Jay Harris" on every search engine, the war is waged daily throughout the internet. For companies, it's the next sale. For people, it's the next job. Dollars are on the line in a never-ending battle for supremacy.

One of the contributing factors in your search engine ranking is content. Fresh, new content brings more search engine crawls, and more frequent crawls contribute to higher rankings. Search engines like sites that constantly provide new content; it tells the engine that the site is not dead or abandoned. And though this new-content idea works out well for the New York Times and CNN, not everyone has a team of staff writers paid to constantly produce new material. So we shortcut: we don't need to have new content so much as we need to make Google think we have new content. There are hundreds, if not thousands, of JavaScript plugins out there to provide fresh content to our readers, ranging from Picasa photos to Twitter updates to AdWords to Microsoft Gamercard tags. But I have to let you in on a little secret:

JavaScript Plugins do nothing for SEO.
Nothing.
Search engine spiders don't do JavaScript.

"This must be a lie. When I look at my site, I see my new photos, or my new tweets, or my new Achievement Points; why don't the spiders see it, too?" Well, it's true. Google Spiders, and most other Search Engine Spiders, don't do JavaScript, which is why JS provides no SEO contribution; spiders do not index what they do not see. A look through your traffic monitor, like Google Analytics, will often show a disparity between logged traffic and what is actually accounted for in Web Server logs. Analytics, a JavaScript-based traffic monitor, only logs about 40% of the total traffic to this site (excluding traffic to the RSS feed), which means that the other 60% of my visitors have JavaScript disabled. A JavaScript Disabled on 60% of all browsers seems like a ridiculously high percentage unless you consider that Spiders and Bots do not execute JavaScript.

Just like Google doesn't see the pretty layout from your stylesheet, Google also doesn't see the dynamic content from your JavaScript. Pulling down HTML (it's all just text, anyway) is easy; there's not even much overhead in parsing it. But add in some JavaScript, and suddenly there's a lot more effort involved in crawling your page, especially since there is a lot of bad JavaScript out there. So search engines just read what has been written into your HTML: the URL, the keywords, the META description, and the content, but only the content as rendered by the server. JavaScript is never executed, and JavaScript-based content is never indexed.

So how do you get around this? How do you get this SEO boost, since JavaScript isn't an available option?

Use plugins and utilities that pull your dynamic data server-side, rather than client-side. Create a custom WebControl that downloads and parses your latest Twitter updates. Create a dasBlog macro that renders your Microsoft Gamertag. By putting this responsibility on the server, not only do you make life easier for your end user (one less JavaScript library to download), but you also make this new content available to indexing engines, which can only help your Google Juice. A rough sketch of the first idea follows.
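As an illustration, here is a minimal sketch of what such a server-side control might look like. This is a hedged example, not the macro mentioned in the update below: the control name, the ScreenName property, and the use of Twitter's old unauthenticated user_timeline XML endpoint (as it existed in 2009) are my own assumptions for the sake of the sample.

using System;
using System.Net;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Xml;

// Illustrative server-side control: fetches the latest tweet while the page
// renders, so the tweet text is part of the HTML that spiders actually see.
public class TwitterStatusControl : WebControl
{
    // Twitter screen name to display; set it declaratively or in code-behind.
    public string ScreenName { get; set; }

    protected override void RenderContents(HtmlTextWriter writer)
    {
        try
        {
            // Assumption: the 2009-era public XML endpoint for a user's timeline.
            string url = "http://twitter.com/statuses/user_timeline/"
                         + ScreenName + ".xml?count=1";

            var doc = new XmlDocument();
            using (var client = new WebClient())
            {
                doc.LoadXml(client.DownloadString(url));
            }

            // Pull the text of the most recent status and write it into the page.
            XmlNode status = doc.SelectSingleNode("//status/text");
            if (status != null)
            {
                writer.RenderBeginTag(HtmlTextWriterTag.P);
                writer.WriteEncodedText(status.InnerText);
                writer.RenderEndTag();
            }
        }
        catch (Exception)
        {
            // If Twitter is unreachable or returns something unexpected,
            // render nothing rather than breaking the page.
        }
    }
}

Drop it on a page with something like <myControls:TwitterStatusControl runat="server" ScreenName="yourname" /> after registering the assembly, and the tweet arrives as plain server-rendered markup. In practice you would also cache the result, with ASP.NET's Cache object or output caching, so every page view doesn't cost a round-trip to Twitter.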

Update:

I've been working on a set of macros for dasBlog to start moving my dynamic content retrieval to the server. Keep an eye out over the next couple of days for the release of my first: a Twitter Status macro for dasBlog that will replace the need for the Twitter JS libraries on your site.

Monday, August 31, 2009 8:47:29 AM (Eastern Daylight Time, UTC-04:00)