2006 SEO Forecasting

Just got back from reading Stuntdubl’s blog and his latest post, titled “40 SEM 2006 Predictions”, where he gives a list of SEO-related ideas he has brainstormed for his forecast. It makes for interesting reading, and whilst a fair chunk of it is quite obvious, there are also ideas he has touched on that need some serious consideration. I’ll mention a few of them here that really caught my eye:

9. Community reviewed content
12. Article submission
16. Information architecture and strategic deep linking
21. Parasite SEO
22. Sandbox existence debates
30. Lots of MSN acquisitions and products
34. Usability and standards compliance
36. Mobile content optimization

Community reviewed content

This fits in with the whole Web 2.0 democracy that is gaining strength at the moment. By creating content that the web community can get involved with, you are creating customer loyalty, links and conversions. This is something we are building into one of our projects, where news will be fed into the main portal and members can sign up and get involved by reviewing the national news. It’s not something new; Threadwatch and others already do this. We would like eventually to take it to a national level by using headline news as the content, backed with second-tier news from trusted blogs.

Article Submission

Absolutely! Content is now more important than it has ever been. I’m not talking about going to free article sites to get your free content; I am saying that unique, well-written content will rule the day. IMO, good article and content writers will really make their mark in 2006. The quest for unique content has been picking up for a while. By placing content that comes from free article sites on your website, all you are really doing is duplicating content, and this can have negative effects on a site. Search engine algorithms are now programmed to spot content duplication and will give the original piece preference over later duplicated pieces.
By having good, useful content that is targeted with well-researched key-phrases, you can attract visitors and hopefully turn that traffic into conversions for your product.
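To make the duplicate-content point a little more concrete, here is a very rough sketch of one well-known family of techniques: comparing overlapping word “shingles” between two pieces of text. It is only an illustration of the general idea, not how any particular engine actually detects duplicates, and the sample texts are made up.

```python
# Rough illustration of shingle-based near-duplicate detection: break each
# text into overlapping word n-grams ("shingles") and compare the overlap.
# This is only a sketch of the general idea, not any engine's actual method.
def shingles(text, n=4):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=4):
    """Jaccard similarity of the two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "unique well written content will rule the day in two thousand and six"
copied   = "unique well written content will rule the day in two thousand and six"
rewrite  = "original, well researched articles will matter far more than syndicated filler"

print(similarity(original, copied))   # 1.0 -- an exact duplicate
print(similarity(original, rewrite))  # 0.0 -- genuinely different content
```

The higher the overlap between two pages, the more likely only one of them gets the credit, which is exactly why lifting articles from free article sites is a poor substitute for writing your own.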

Information architecture and strategic deep linking

Haven’t I been banging on about this for the last year in various forum threads! I won’t touch on deep linking right now, as I would rather see Stuntdubl’s explanation of this or his ideas on it first. I will briefly cover site architecture though; it’s not within the scope of pure information architecture, but it should be mentioned. I seriously think that the way a website’s architecture is structured will play an important role in any good SEO campaign, especially as far as portals are concerned. Over the last year we have been experimenting with the best way forward for SEOs and developers to disseminate information from large portal structures. Some of that experimentation we have implemented into the MyJournal social network while it is being built and still in its beta stage. We have been looking at the creation of deep-linked folders that are islands of relevant content. So far we have had positive results in exploring this folder architecture; there have been a few setbacks with some of it, but nothing too serious.
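As a rough illustration of what I mean by folder “islands”, here is a small sketch. The folder names and pages are hypothetical examples, not MyJournal’s actual structure; the point is simply that internal links stay within their own silo.

```python
# Hypothetical silo structure: each top-level folder is an "island" of
# related content, and internal links stay inside that island.
silos = {
    "/news/uk/": ["election-results.html", "budget-2006.html"],
    "/news/world/": ["trade-talks.html", "climate-summit.html"],
    "/reviews/": ["community-review-guidelines.html"],
}

def internal_links(page_path):
    """Return the pages a given page should link to: only its silo siblings."""
    for folder, pages in silos.items():
        if page_path.startswith(folder):
            return [folder + p for p in pages if folder + p != page_path]
    return []

print(internal_links("/news/uk/election-results.html"))
# ['/news/uk/budget-2006.html'] -- links stay inside the /news/uk/ island
```

Keeping each folder tightly themed like this is what gives the deep-linked pages their relevance.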

Parasite SEO

A cause for concern… there are already a lot of tricks and techniques being used by the SEO community that one would cast a dubious eye over. Thankfully, a lot of these tricks are becoming obsolete. The worry here is that the web is a hotbed for some very smart and dark denizens of the deep. They will surely be looking at new ways of spamming, ‘black hat’ SEO techniques and other tricks utilising some of the newer technologies available at the moment. You simply can’t catch every trick in the book and stop it; though search engine programmers are constantly updating algorithms to prevent spam behaviour, they won’t see everything. I think we will see a lot of ‘flash and burn’ sites make a quick mark and a killing, then disappear. This already happens, but we will also see them using new methods to achieve their rankings. This may not be what Stuntdubl meant, so watch for his explanation.

Sandbox existence debates

It will never go away as long as this filter is in place, and people will continue to debate it. I for one think it’s a load of hocus-pocus. There is no reason why a new website should sink into the ‘sandbox’ unless it is over-optimised out of the door. This is what I have seen from first-hand experience. Release a new site that is well coded with good content, and in its first two weeks out of the starting gate you suddenly throw a ton of links at it. This in turn flags to a search engine that a brand-new website is suddenly being linked to very heavily, and it comes up for manual, or maybe automated, review. If the site deals with a well-known subject that hits the news, then that’s fine: the links are probably natural and deserved. If it’s suddenly full of links that are obviously bought in, then it will hit the ‘sandbox’ effect. IMO the correct way is to start it off with just a few good inbound authority links, which are worth more than a thousand lame horses any day.
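Purely as a toy illustration of that reasoning, and not a claim about how any search engine actually works, here is the link-velocity idea in a few lines; the thresholds and numbers are entirely made up.

```python
# Toy illustration of the link-velocity idea: a brand-new site that suddenly
# gains a large number of inbound links gets flagged for review.
# Thresholds and data are entirely made up.
def flag_for_review(site_age_weeks, new_inbound_links, max_links_per_week=50):
    """Flag a young site whose link growth looks unnatural."""
    if site_age_weeks == 0:
        return new_inbound_links > 0  # any links before launch look odd
    links_per_week = new_inbound_links / site_age_weeks
    return site_age_weeks < 8 and links_per_week > max_links_per_week

print(flag_for_review(site_age_weeks=2, new_inbound_links=1000))  # True -- a ton of bought-in links
print(flag_for_review(site_age_weeks=2, new_inbound_links=20))    # False -- a few authority links
```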

Lots of MSN acquisitions and products

I think 2006 will be partly the year of the ‘come-back kid’ (I’m talking about Bill Gates here). We have seen Google and, in part, Yahoo purchasing or creating good services, a lot of them still in their beta stages, priming them for the future. Microsoft has not been doing much of this lately and has been fairly quiet on the sidelines. Given their reputation, I think they have been biding their time. We will see Microsoft purchasing their fair share of businesses and applications in 2006. Also, what MS introduces may well be better than what their competitors offer; considering where the web is headed, it would be in their best interest to go one step better.

Usability and standards compliance

Well, this is already becoming fundamental to any good SEO’s skill-set. Good coding and standards compliance should always be one of the first considerations when developing a website. The website architecture is so important that overlooking this aspect can, and probably will, have a detrimental effect on a site. Building to standards compliance, i.e. avoiding table-based layout and creating the site with DIVs and strict doctypes, increases future-proofing and saves time and money later by not having to re-code. Good coding will leave a site scalable for more development at a later date.
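As a rough sketch of the kind of housekeeping check I mean, something like the following can flag pages still relying on table-based layout and presentational tags; the tag list is just my own shorthand for old-school markup, and it is no substitute for running pages through a proper validator.

```python
# Rough sketch: flag pages that still rely on table-based layout and
# presentational tags instead of DIVs and CSS. Not a substitute for a
# real validator; the tag list is my own shorthand.
from html.parser import HTMLParser

LEGACY_TAGS = {"table", "font", "center", "b", "i"}  # layout / presentational markup

class LegacyMarkupCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.counts = {}

    def handle_starttag(self, tag, attrs):
        if tag in LEGACY_TAGS:
            self.counts[tag] = self.counts.get(tag, 0) + 1

page = "<html><body><table><tr><td><font size='2'>Old-school layout</font></td></tr></table></body></html>"
parser = LegacyMarkupCounter()
parser.feed(page)
print(parser.counts)  # {'table': 1, 'font': 1} -- candidates for re-coding with DIVs and CSS
```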

Mobile content optimization

Yes, mobile search optimization is already becoming, and will remain, part of a good SEO’s skill-set. The modern SEO will need to have this trade soundly buckled down. Mobile SEO is not just about optimising a website for the mobile web; an understanding of content, branding and coding will be key to making it work.
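As one small, purely illustrative example of the coding side, here is a sketch of routing obviously mobile user agents to a lightweight version of a page; the user-agent keywords and the /mobile/ path are hypothetical examples, not a recommendation of any particular setup.

```python
# Purely illustrative sketch: route visibly mobile user agents to a
# lightweight version of a page. The user-agent keywords and the /mobile/
# path are hypothetical examples only.
MOBILE_UA_KEYWORDS = ("mobile", "symbian", "windows ce", "blackberry", "palm")

def choose_template(user_agent: str) -> str:
    """Return the template path to render for this visitor."""
    ua = user_agent.lower()
    if any(keyword in ua for keyword in MOBILE_UA_KEYWORDS):
        return "/mobile/index.html"  # trimmed-down content, small images, no heavy scripts
    return "/index.html"

print(choose_template("Mozilla/5.0 (SymbianOS/9.1) ..."))   # /mobile/index.html
print(choose_template("Mozilla/5.0 (Windows NT 5.1) ..."))  # /index.html
```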

I look forward to reading Stuntdubl’s explanations of his list, and what others have to say on his thread. It’s quite an extensive list, and while some of it appears to have already been said and done elsewhere, he will probably add further emphasis to each item. Some real food for thought.