Basics of search engine optimisation

At my day job, we’re contacted every now and then by clients asking about search engine positioning and optimisation. Most of the time the client has been approached by an SEO (Search Engine Optimisation) consultant trying to talk them into paying lots of money for search engine optimisation. The SEO firms promise “guaranteed top results” and “submission to 500 000 search engines and directories”.

Many site owners are regularly contacted by scam companies of this kind, and it’s understandable that many take the bait and start paying for “top results and submission to everything”. After all, who doesn’t want their site to be highly ranked by search engines?

In an attempt to help a few people avoid paying for unnecessary search engine optimisation, I’d like to share what I have done to achieve very good search engine rankings, for this site as well as for clients’ sites.

First, just let me say that not all SEO consultants out there are scam artists – there are many reputable firms in the business. However, anyone promising “guaranteed top results”, “submission to 500 000 search engines and directories”, “instant results”, or “permanent top positions” is most likely a scammer. There is no way anyone can guarantee that your site will be the number one result unless they actually control the search engine results – or unless the top ranking is for a word or phrase that only exists on your site, of course.

What strikes me as I think through the steps I have taken to get good search engine rankings is how much SEO has in common with accessibility, usability, and high quality markup – the principles of web standards. That actually makes it even better: by making your site more accessible and usable for humans, and by using valid, semantic markup, you also make it more attractive to search engine robots.

Consider the guidelines I describe here a basic level of SEO – try this first, and if you’re still not getting the results you want, you may want to look into getting help from a reputable search engine optimisation firm. If you contact someone regarding optimising your site, ask them what they will do for your site. If they suggest any kinds of shady methods, be very careful. They might get you penalised or even banned from search engine indexes. On the other hand, if their advice includes what I’m suggesting here, they will probably do a good job.

There are no shortcuts

I’ll start with the bad news if you’re looking for a quick and easy way to get great results. There isn’t one. Instead, expect to do some hard work, especially when it comes to the content of your site.

You will also need patience. Results do not come overnight. If you’re working on improving the search engine positioning of a client’s site, you should probably explain this to them early on.

Write good content

This is probably the single most important thing you need to do if you want to be found on the web. Even if your site is technically perfect for search engine robots, it won’t do you any good unless you also fill it with good content. Yes, really!

Good content to me is text that is factually and grammatically correct, though that is not necessarily a must for all kinds of sites. Whatever your site is about, the content needs to be unique and/or specific enough to appeal to people. More specifically, it needs to be useful to the people you want to find your site.

Good content brings return visitors. Return visitors who like your content will eventually link to your site, and having lots of inbound links is great for search engine rankings, especially if those links are from highly ranked sites.

Closely related to good content is fresh content. By adding new content regularly, you give visitors a reason to come back. Search engine robots will also visit your site more often once they notice that you update regularly, which means that any new content you add will be indexed quicker.

When doing work for clients, creating quality content is rarely the responsibility of the web designer. Often, the client wants to write their own copy, which is fine if they’re good at it and keep adding new content. In my experience, that is rarely the case. If at all possible, try to make the client realise that they should hire someone to help them write, or at least get someone to help them edit what they have written. In either case, make it clear to them that they can’t expect consistent high rankings without good content.

Think about spelling

If you write in English, you are probably aware of the differences in spelling between American and British English. Colour vs. color, optimisation vs. optimization, etc. There are also many words that are commonly misspelled (this goes for all languages).

I don’t like the idea of intentionally misspelling words, since it goes against my definition of “good content”. If words with multiple spellings or commonly misspelled words are an important part of your content, i.e. keywords, consider adding a glossary or similar to include the most common spelling variations on the page.

Write descriptive page titles

By making your page titles simple, yet descriptive and relevant, you make it easier for search engines to know what each page is about, and people scanning through search results can quickly determine whether your document contains what they are looking for. The page title is also what is used to link to your site from search result listings.

Because of this, the title element is one of the most important elements on a page. Some argue that it is the most important element.

When it comes to the order of the text in the title element, I’ve found that the following works well:

Document title | Section name | Site or company name

Based on a discussion here a while ago, that is probably one of the best formats for accessible title texts. Again, accessibility and SEO work together.

Whatever you do, don’t use the same title text for all documents. Doing so will make it much harder for search engines, people browsing through search results, and site visitors to quickly find out what the document is about.
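As a hypothetical sketch of that format (the page, section, and company names are made up for illustration):

```html
<!-- One unique title per document: page first, site name last -->
<title>Contact us | Customer service | Acme Widgets</title>
```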

Use real headings

Use the h1 - h6 elements for headings. Using graphics for headings may let you use any typeface you want, but search engines aren’t going to pay much attention. Even if you (as is required) use the alt attribute to specify alternate text for heading images, that text will not be anywhere near as important as real text in a heading element. In my experience, this is true even if the images are inside heading elements. If you know otherwise, please tell.

If you cannot use real text, look at the various image or Flash replacement techniques that are available. Be aware that there may be a tiny risk involved in doing so. Since image replacement techniques involve hiding text, it is theoretically possible for search engines to penalise you. Currently that risk seems very slim, but don’t say I didn’t warn you if it does happen.
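As a sketch of one common replacement approach (the class name and image file are made up for illustration), off-screen text positioning keeps the real text in the document for search engines while showing a background image to sighted visitors:

```html
<h1 class="site-logo">Acme Widgets</h1>

<style>
/* Hypothetical image replacement: keep the text in the markup,
   move it out of view, and show the logo image instead. */
h1.site-logo {
  background: url(logo.png) no-repeat;
  width: 200px;
  height: 50px;
  text-indent: -9999px; /* text moves off screen, not display: none */
  overflow: hidden;
}
</style>
```

As warned above, this still hides text, so the theoretical penalty risk applies to it too.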

Use search engine friendly URLs

Avoid dynamically generated URLs that use a query string to let the server know which data to fetch from a database. Search engine robots may have difficulties with this kind of URL – they may stop at the question mark and not even look at the query string.

Use search engine friendly, human readable URLs instead. This will help both your ranking and your users. I’ve seen incredible improvements in search engine results from just changing the URL scheme of a site.

Modifying and rewriting a site’s URLs can be a little tricky, and some content management systems make it more difficult to implement than others. It is worth the effort though.
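As one sketch of how this can be done on an Apache server (the file name and query parameter are assumptions for illustration), mod_rewrite can map a friendly URL onto the old query string internally:

```apache
# Hypothetical .htaccess rule: /products/chairs is rewritten internally
# to /products.php?category=chairs, so visitors and search engine
# spiders only ever see the friendly URL.
RewriteEngine On
RewriteRule ^products/([a-z-]+)/?$ /products.php?category=$1 [L]
```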

Get linked to

There is no easy and sustainable way to solve this one except for – you guessed it – providing good content. Incoming links are very, very important for SEO. They are also possibly the hardest part of SEO to implement.

However, in my experience incoming links are less important the more specific and unique your content is. As an example, a couple of our clients are in what you may call niche businesses. They don’t have lots of sites linking to them, yet they started ranking very well in search engines after I had applied the knowledge I’m sharing here to their sites.

Use valid, semantic, lean, and accessible markup

Most web browsers have advanced functionality for deciphering the tag soup mess that is used instead of HTML on most current sites. You can’t rely on search engine robots to do that to the same extent. Validate your HTML and avoid presentational markup – use markup that is as lean and clean as possible. By increasing your content-to-markup ratio, you make your site faster and more attractive to search engines.

High quality markup will help boost your rankings.

Accessibility is also very important. Making your site more accessible to vision impaired humans will also help search engine robots find their way around it. Remember, Google is blind, so even if you don’t care about blind people using your site (which you should), you’ll still want it to be accessible. This means that you should use real headings, paragraphs, and lists, and avoid using anything that may interfere with search engine spiders.

Flash and JavaScript are fine, as long as they aren’t required to navigate your site and to access vital information. Don’t hide your content inside Flash files or behind funky JavaScript navigation. Browse your site in Lynx, and with graphics, CSS, JavaScript, and Flash off. If that gives you problems, it is likely to cause problems for search engine spiders.

Submit carefully

Often slightly overrated, submitting a site to directories and search engines can be useful, especially if the site is new and hasn’t already been picked up by Google and others. Go ahead and submit it to Google. It won’t hurt, but most likely Google will find you anyway.

Two directories that may be worth submitting to are Yahoo! Directory and the Open Directory Project. Be patient – it will probably take several weeks for your submissions to be processed, unless you pay for them to list you.

Don’t try to fool the search engines

Don’t use cloaking, link farms, keyword stuffing, alt text spamming or other dubious methods. They may work for a short while if you’re lucky, but you risk being penalised or even banned from search engines, which you do not want.

Search engines want their results to be accurate, and they don’t like it when people try to trick them. Just don’t do it.

Avoid using frames

While it is possible to provide workarounds that allow search engine robots to crawl frame based sites, frames will still cause problems for the people who find your site through search engines.

When somebody follows the link from a search result listing to a frame based site, they will land on an orphaned document, outside of its parent frameset. This is very likely to cause confusion, since in many cases vital parts of the site, like navigational links, will be absent.

Some sites use JavaScript or server side scripting to redirect anyone trying to load a document outside of its parent frameset to the site’s home page. This is a very user hostile thing to do, and it definitely does not help the people visiting your site. Just lose the frames. They are bad for usability anyway.

Be careful with browser detection

If you need to use some kind of browser detection, make sure that it doesn’t break when a search engine spider (or any unknown user agent) comes along. If the spiders can’t get in, you won’t be found. I’ve seen this happen on the sites of fairly large companies.
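A minimal sketch of the principle in JavaScript (the function name and user agent patterns are invented for illustration): whatever you detect, make the full site the default for anything you don’t recognise, spiders included.

```javascript
// Hypothetical sketch: pick a stylesheet from the user agent string,
// but never lock out agents you don't recognise.
function pickStylesheet(userAgent) {
  if (/MSIE [1-4]\./.test(userAgent)) {
    return "legacy.css"; // a known old browser gets a degraded style
  }
  // Modern browsers, search engine spiders, and unknown agents
  // all get the real site instead of being turned away.
  return "standard.css";
}
```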

Don’t waste your time on meta tags

Most search engines don’t place any great deal of value on the contents of meta tags anymore. They have been used way too much by spammers. I’d suggest using the meta description element, but that’s all. Keywords won’t hurt, but they will rarely help either, so they are generally not worth the effort.

Some search engines use the contents of the meta description element to describe your site in their search result listings, so if possible, make its contents unique and descriptive for every document.
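For instance, a unique description for each document might look like this (the description text is invented):

```html
<!-- One per document, each with its own wording -->
<meta name="description" content="Basic, ethical search engine optimisation: good content, descriptive titles, real headings, and clean markup.">
```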

Uuh. That was too much for me to read.

Ok, then. The ultra-short guide to SEO: add quality content regularly and make sure your site is well-built.

What next?

Like I stated at the beginning of this article, these are basic guidelines for SEO. There is much more than this that can be done to increase your site’s visibility in search engines, but following the advice I’ve given here is a good start that, in my experience, will get you very far.

If you’re looking for more info, here are a couple of books on the subject:



Posted on February 3, 2005 in Accessibility, Search Engine Optimisation, Usability, Web Standards


  1. Thanks for the article. A few things that I have noted make sites go up in Google.

    Content first

    Have the content first in your structure. With css this is possible, then related information, navigational elements, footers and top (branding area)

    Hide content

    Do not hide content with display: none. I am not really sure about this, but I have a site where I used an h1 directly below the body element. I first had display: none on this one. The thing was that Google refused to index the page. Other pages on the site worked fine. After changing the CSS to position: relative; left: -100000px Google started to index the page.

  2. Have the content first in your structure. With css this is possible, then related information, navigational elements, footers and top (branding area)

    Also note that this is also an absolutely vital point if you want to produce content for handheld devices, which may have little or no CSS support. There is nothing worse than visiting a site on a cell phone, only to discover you have to scroll through 13 screens of pointless navigation.

  3. Regarding the use of the keywords meta tag: I would suggest using it if you have your own site search engine that does take these into account. For external search engines I agree completely.

  4. Like many things, not all wishes are granted as you expect.

    My own personal site follows these rules; and every so often I go through the referrer logs to see what search terms have brought people to my site. At times, I am gratified to find that a title tag followed by an h1 has been closely matched, and bubbled to the top. But often the search strings are quite bizarre (as I expect many people have found), and not germane to the content. One particular engine tends to be the usual thing to bring people via apparently irrelevant queries, picking up words or word pairings, often from well below the fold, to rank pages in the top half dozen, if not the actual top - so how one might optimize for that particular engine is anyone’s guess.

  5. Excellent article, and I can only agree that it works. Content, titles, clear headings and clean code are the key.

    BTW: Google visited Stanford University (in 2003) where they talked about how PageRank and their search engine work. (Video)

  6. February 3, 2005 by Erwin Heiser

    Amen to that. The last 4 sites I did always end up in the top spot at Google and for the most obvious keywords. Search engines just love cleanly marked-up sites with solid content. It’s one of the best selling points to clients too. I remember doing a site a while back for a balletschool which took some convincing to let me use a full XHTML/CSS approach (their previous site was done in Frontpage - nuff said). After their site started showing up in Google in the no.1 spot for several keywords I could do no wrong anymore ;-)

  7. Unfortunately the SEO scammers are trying to address the content issue via thesaurus-supported plagiarism — easy fresh content without any of the effort. They’ll stoop to any level.

  8. February 3, 2005 by Roger Johansson (Author comment)

    Paulo: I edited your comment to remove the link to that content generator.

    Anyone else posting a comment on shady methods and characters: please do not post links. Just a URL is fine since it won’t be auto linked.

  9. Roger, that was hardly too much to read. I was just thinking about this stuff earlier this week for a project I’m working on, and while I’ve found several of the items across different sites here and there, I’m glad someone took the time to sum it up nicely. I think there are still a lot of people who are really ignorant of just what search engine results can mean to a site, believing that just submitting a URL or paying for “indexing” will be the solution. What you’ve got here are some real-world, proven methods that don’t depend on the search engines but on the sites themselves (as it should be). I could even pass this on to others who’d have no idea what I was babbling on about and they’d find it useful. Thanks.

    Jens, I’ve been using display:none for a while now and haven’t had any problems with Google yet. It’s a simple thing that I’m sure can be done in a different, better way, but the pages still get indexed without any hitches…

  10. Excellent article Roger. I too have recently been exposed to clients requesting SEO at my new place of work. And you’re quite correct, there is absolutely no way to guarantee results, even though some clients expect their site to jump to #1 on Google as soon as you’ve made some adjustments to the site to make it more SE friendly.

    I find that one of the hardest parts of SEO is getting the client to produce quality content that is relevant to the audience they are trying to attract.

  11. This is absolutely brilliant! Another thing that I find works well is to not make the main content of your site accessible only to registered members.

    I built a site once that was completely standards compliant, well structured and even validated (though it doesn’t now that I’ve left). But it turned out the company wanted the whole content locked off to registered members only (even though membership is free).

    Needless to say: Google can’t register and can’t find anything but the preview pages, and the search engine rankings are poor for even the most obvious keywords. It’s also interesting to point out that my site comes close to beating them for a search of the company’s name, thanks to a blog post I wrote about them - I’m only behind by one spot!

  12. I think it’s also very important to mention that it takes time for your site to be indexed and to see movement in your search rankings. I’ve had clients expect to see their site at #1 the next day only to have to explain that it can take a couple months for that to happen.

  13. As a professional SEO, I appreciate your comments. I would like to add that despite the reputation of the industry, the job of most true SEOs is to help get the elusive links to your website that are required to rank well for a particular search phrase.

    It’s this linking strategy that makes for the real professional, and if you are seeking out an SEO company, one of the best places to start is the professional boards where SEOs meet to talk about their industry.

  14. Once again, a great article!!

  15. February 3, 2005 by Roger Johansson (Author comment)

    Thanks for your comments, people.

    Content first is good, though it can make some designs a bit difficult to achieve. Making sure clients understand that things won’t happen overnight is also a very good point.

    I probably should add something about both of the above.

    randfish: yes, getting good links can be very hard, so I can see why some are tempted to use link farms and such.

  16. Check out this Googling the Bottom Line article on Wired. (via /.) They mention something about those ‘link farms’.

  17. I wrote a largely ignored piece on Search Engine Optimization some time ago, and while your guide is more comprehensive (and indeed better), there are a few points from my writing that I’d like to repeat here:

    • If key parts of your content have different spellings in British and American English, include both. For instance, you write “Optimisation”, and I write “Optimization”.
    • If misspellings are common, include some of these. In the aftermath of the Tsunami disaster, one of my commenters misspelled “Thailand” as “Thialand” in the post, and I have had tons of searches using that particular misspelling.
    • If you want to ensure that you avoid the shadier parts of the SEO business, you might be much better off looking for accessibility and usability experts. They will be doing much of the same as an SEO, but with clear focus on ethical methods.
  18. div#head-1, div#tail-1 { display: none; }

    These selectors are named after the Unix utilities that display the first and last chunks of a file respectively (the -1 means one line). The first comes after the opening body tag and holds the page description, the second comes just before closing the body element and holds my copyright information. Google has no problem with either one. They’re also very useful for text only browsers such as Lynx.

    You’re picking my brain again Roger. Nice article. I really have to stop giving away all my secrets. ;-)

  19. Good job, Roger. Nicely summarised with all the important bits.

    I have the content first both on my blog and on the office web site. Ironically, a Swedish company which did an accessibility audit of our site recommended that we should put navigation before content, because that’s what ‘disabled users’ expected. They retracted that statement when one of their own testers (blind) tried our site and exclaimed, “This is great! I get straight to the content!” :)

  20. February 4, 2005 by Roger Johansson (Author comment)

    Arve: The spelling dilemma is something I’ve been thinking about, and it is worth mentioning. Including both American and British spellings is a good tip. You just need to find a good way of doing it.

    However, I really don’t like the idea of misspelling words on purpose. I realise (see, British spelling again ;-)) that in some cases it may be useful, but misspellings can also give a bad impression, so you need to be careful with that.

    One way of getting away with it would be to mention common misspellings, e.g. “When the tsunami struck Thailand (sometimes misspelled as Thialand) …”. It’s a difficult choice — correct spelling or more potential hits from search engines.

    Douglas: Sorry about that. You need to turn down the volume of your telepathic broadcasts ;-)

    Tommy: I think the issue of content vs. navigation first is similar to those of font sizing and fixed vs. fluid width — there is no single way to please everybody. In general though, the earlier content appears in the markup, the better.

  21. The spelling issue (and for that matter the British/American issue) could also be solved by providing a mini-dictionary somewhere on the page.

    I just wish that there was some way to do this cleanly in the markup itself.

  22. Great article, and very relevant to recent discussions within our business.

    I have a question that’s been bugging me for a number of weeks now.

    Since Google can now spider URIs containing query strings, is there a difference in ranking between static URIs and those that contain query strings?

    We have noticed at work that some of our pages that are linked via query string URIs are not ranking as well as expected, and I’ve been asked to create static pages for pages containing important search terms etc.

    Any ideas people?

    Again, another great article and one I’ll be passing on to the management of the firm.

    Thanks again

  23. Jay-Dee: Roger provided you with two links for search-engine-friendly URLs.

    As a side-matter, couldn’t we just refer to such URLs as “friendly”. is also much easier for a human to cope with, compared to

  24. Arve: Thank you for pointing out the above links.

    My question however is regarding how search engines rank URIs containing query strings, and not about how to rewrite URIs.

    If a search engine can spider the URI, does the fact that it contains a query string make a difference to its ranking in comparison to a static or “friendly” URI?

  25. Spelling is definitely a tricky question. I usually don’t bother with spelling for minor words that aren’t really relevant to the topic. Ex: “do you realize the …” I wouldn’t bother adjusting that. But, when I put together my colour contrast tool, I specifically included the alternate spelling in the title.

    The spelling, I suspect, is less of an issue if you have a number of inbound links that make use of the alternate spelling. An example of this is the Juicy Studio Colour Contrast Analyzer who never uses the American spelling (color) but sites linking to him do.

  26. Erm, to clarify, I put the alternate spelling on the article and not on the tool. There’s consistency for yah! :)

  27. February 4, 2005 by John B

    Excellent and very timely. I just launched a site last week and the client asked me earlier this week about SEO “services.” Your article refreshed my memory for the perfect response, and also reminded me to take another look at the title tags which I fixed.


  28. February 4, 2005 by Roger Johansson (Author comment)

    Jay-Dee: I have no hard evidence in the form of before and after statistics of real (friendly) URLs vs. query string URLs, but my impression after converting several clients’ sites to use friendly URLs is that it makes them rank much better.

  29. February 4, 2005 by zcorpan

    First, thanks for a great article.

    Using graphics for headings may let you use any typeface you want, but search engines aren’t going to pay much attention.

    Search engines use the alt text for images, so what’s the difference between an image with proper alt text and only text? I agree, however, that we should use the heading elements properly. But that doesn’t mean that we can’t use images inside headings… :)

  30. February 4, 2005 by Roger Johansson (Author comment)

    zcorpan: It should work like that, but because of keyword spamming in alt attributes, search engines may value such text less. Again, I have no hard evidence, only personal experience. At the moment, I’d go with some kind of image replacement technique if I were to use graphical headings.

  31. Valid points, above all else.

    But what exactly is “good” content? I don’t like the word “good.” For that matter, what is “quality” content?

    And “good” page titles? If someone is completely new to the points you make, using words like “good” are not helpful.

    Maybe… “descriptive” page titles.

    And “clear, concise, detailed” content. Or “grammatically correct” content. Or “well written” content. Writing is the key, here, anyway. If the page author can’t write well, they won’t have “good” content.

    Or… is “good” content measured more by “frequency?” Is it better to have poor writing with frequent updates, or solid writing with few updates?

    Just some thoughts. Thanks for this entry… It’s information that will help many.

  32. February 6, 2005 by Roger Johansson (Author comment)

    Matthom: You’re right, the quality of page titles and content should be described in other words than just a vague “good”.

  33. It is odd that your advice regarding meta tags and linking are somewhat opposite to those of Andrew King’s. He highly recommends using the meta-keywords and meta-description tags although he does recommend spending much time and effort on crafting the content of those tags properly.

    Great article Roger, something I was thinking about recently.

  34. Interesting Article.

    A point I would like to make is that placing the content higher in the markup would cause accessibility and usability issues. The WAI Guideline 6.1 states: When content is organized logically, it will be rendered in a meaningful order when style sheets are turned off or not supported.

    Tommy - With respect, I don’t think you can base this decision from one user.

    Providing a “Skip to main content” link, before the global navigation elements, makes it easy for users to access the main content quickly. If you were to place the navigation at the bottom of the page it could become very annoying. A bit like placing the map of a large department store at the back of the building (i.e. not at the entrance where it would be useful)

    I think the small improvement you MAY gain in search engine listings is not worth the trade-off in accessibility.

  35. Thx for an excellent article.

    Being a newly started web designer, dedicated to writing valid code and learning the basics right from the beginning, I find it very useful.

    Regarding good and useful content helping to get a better rating, how is it with writing in a little-used language (like Danish)? Do the spiders then compare to other sites in this language?

    I have added an English version of my site, hoping that could have just a little effect on my rating…

    Am I wasting time here - besides the fact that English-speaking people in Denmark can read it ;-)

  36. How do you back up your facts for the comment about engines ignoring META tags?

    I completely agree with it—of course. And it definitely seems true. The search engines wouldn’t be caught dead trusting what a particular end user claims about a page, especially the author—but I constantly get people on my publishing platform asking for META keyword tag support or some variant of that, or some other wacky odd-ball magical element that their brother’s friend told them would give them “TOP GOOGLE RANK”.

    It’d be wonderful if Google posted some information that I could point to saying “look: they don’t care about magic methods A, B, and C”.

    SEO is such a mystical practice. I want the major engines to provide more transparency on what these dated practices really are.

  37. February 8, 2005 by C. Grocki

    2¢ toward the [mis]spelling debate:

    It’s a user mistake. Including the misspelling in your content adds clutter and effectively propagates the mistake.

    Point is, you don’t need to try to guess mistakes, because Google already does.

    Google “Thialand.” They’ll give you their fantastic “Did you mean: Thailand” at the top of the results page. Good search engines intuit the mistakes of their users. As well they should. Now you can worry about bigger things.

  38. February 8, 2005 by Ole Hansen

    Lindkold: In most cases, the language of your page does not matter much - as long as the query-words are on your page. The exception is search engines with language filters.

    Writing in a “fringe” language (like ours) is likely to give you better results - at least when people search for words in that language. You will certainly be ranked higher on a search for “registrering af domæne” than for “domain name registration”.

    I have read somewhere, that the meta keywords should be picked carefully. Only pick the 3 or 4 most important words, and especially avoid keywords that are not part of your actual content, or search engines might penalize you for attempting keyword spamming.

  39. February 8, 2005 by Roger Johansson (Author comment)

    C. Grocki: Agreed, you shouldn’t include every spelling mistake you can think of. Google does do a good job of correcting spelling in searches, but if your audience uses a completely different word or a misspelling that Google doesn’t catch, you may want to somehow include it.

    Regarding the meta keywords tag: just about every recent article and book on SEO mentions it as being next to useless, though it won’t do any harm (unless you use it for keyword spamming). Some search engines are said to place some value on words in a meta keywords tag, but in general, it’s better to spend the effort on the main content of the site.

  40. I’m trying to improve the SEO for a site at the moment. Its homepage is basically a list of headlines linking to news stories in subsections of the site, with not much in the way of actual content on the homepage.

    I’ve suggested adding the story excerpt as well as the headline - I’m guessing that should help as there will be more for the bots to index.

    Any thoughts?

  41. Thanks for a great article !

    We’re getting more and more questions about SEO from our clients, too. Your article gives me some ready answers…

    It even inspired me to write up my own opinions and advice, albeit in Dutch (Zoekmachine optimalisatie kan je zelf).

  42. Bravo! Bravo! Great article and comments alike!

  43. How do you test for your ranking on search engines? If I searched for 456bereastreet, of course, you know the answer. How does one conduct a ranking test without simply picking words or phrases that you know would be found by those looking for your site? If you don’t use keywords, as I have not and it appears that others here don’t either, then how do you “search” for your site without using obviously prejudiced words or phrases?

  44. February 9, 2005 by Roger Johansson (Author comment)

    Tim: I think adding the story excerpt would be great.

    Jules: One way is to check a document’s PageRank to get a general idea.

    That doesn’t check specific words or phrases though, so I usually search for likely words and phrases, including phrases that aren’t just copied straight from the page.

    To check how well this article is doing, for instance, you could search for “search engine basics”, which is currently at #3 on Google, or “search engine optimization basics” (yes, with a z instead of s — shows that Google takes care of that spelling difference), which is at #1 on Google.

  45. Great article covering the basics. As the author of an SEO book I see you’ve covered a lot of ground. I’ve recently put the whole text of the ABC of SEO online which gives some more detail on the points you have raised (I hope you don’t mind the plug but it is free info).

    One of the big problems I see with people who ask me for help is basic site structure. They should remember that HTML is a markup language, not a formatting language. Web authoring tools are the worst offenders, producing reams of bad HTML code with little content - OK, we are talking about on-page factors, which are less important with Google and MSN Search, but every little helps.

    SEO and Web site creation should really go hand in hand from the start.

  46. Roger, Google doesn’t actually “take care of” the spelling difference. Google indexes this page under ‘optimization’ because other sites have linked to you using the American spelling.

  47. February 15, 2005 by Roger Johansson (Author comment)

    Arve: You’re right. And now it also picks up “optimization” from these comments.

  48. My hope is, however, that one day, Google may consolidate search results for American and British spellings, or at least give you a preference to merge them.

    I learnt British English, but I tend to use both, which sometimes leaves me scratching my head.

  49. Google “Thialand.” They’ll give you their fantastic “Did you mean: Thailand” at the top of the results page. Good search engines intuit the mistakes of their users.

    This point is rather moot: People still visit the results, even if Google has suggested that their spelling is a bit off.

    Note that I’m not talking about including the misspellings inside the content - rather, I’m suggesting putting them in a separate section at the end of a document, and only for what should be considered the “most important keywords”.

  50. congratulations on repeating what everyone already knew 100 times over.

  51. February 18, 2005 by Roger Johansson (Author comment)

    eep: Thanks. I’m happy you enjoyed reading it.

  52. Maybe I missed it, but since SEO has become almost an industry within web design for some (both companies and individuals), there must be a lot of foul play around. If a high ranking depends not only on content but also on a high number of incoming links, how does e.g. Googlebot detect whether these links are real or just “spam links”, if that is even possible?

    Because I know one thing: if there’s prestige or value in obtaining a high ranking, then there’s bound to be fraud involved at some point.

    Anybody have info on this…?

  53. February 22, 2005 by Roger Johansson (Author comment)

    Lindkold: Yep, competition is fierce, and a lot of people try to cheat. Some get away with it, too. I’m not sure exactly how search engines decide if incoming links are “real” or not, i.e. how they detect link farms — if they have algorithms to do it automatically or if a human has to decide if a link network is legit or not.

  54. For those who use .NET and dynamic links, this article on MSDN could be of use: URL rewriting in ASP.NET

  55. Would you please consider publishing a specific article on Flash and SEO? Is the latest Flash MX SEO-safe? I don’t believe so myself, but would love to see a very professional discussion of this topic at your place. Thanks much!

  56. Marvellous article, thanks!

    However, I was wondering whether it also matters to search engines which top-level domain my site has.

    Is it really important?

    Thanks in advance for your information!

    With warm greetings from Saarbrücken, Germany, Vitaly Friedman

  57. Thanks for an interesting article!

    Some useful recommendations from my side:

    • It seems to be useful to use keywords in your domain and URL - as you actually wrote.

    • Besides, it is important to seek free and reciprocal links, preferably from sites in your genre.

    One can also find some interesting information on search-engine-marketing-kit, top-10-google-myths-revealed and search-engine-friendly-urls.

  58. I have read the article, am convinced of your convictions, and want one more little push to completely redevelop my wife’s site, which until recently was at the top of Google for the obvious keywords. I’ve read so much about CSS, and as a musician by trade I am attracted to it by its logic and discipline; will it clean up those HTML pages (FrontPage - I know, you great designers hate it, but beginners know no better!), and should I let CSS take the strain? Thanks for the thoughts.

  59. April 9, 2005 by Roger Johansson (Author comment)

    z-design: I don’t use Flash, so I can’t really say much about the specifics of Flash and SEO, other than you should be very careful how you use it.

    Vitaly: Looks like you answered your own question ;-)

    gwilym: Yes, CSS will definitely help you clean up the HTML. I’m not sure how much of a problem FrontPage will cause, though. It may make things more difficult, but I can’t say for sure since I don’t use it.

  60. How do you make site maps?

  61. Thank you, excellent article! I was getting scared when I saw how long it was, but luckily the university website that I partly take care of complies with pretty much everything you’ve said :-) We still have a company doing SEO for us, but that is rather down to me failing to convince the marketing personnel to cancel the contract.

  62. Thanks for your article on SEO. Now I feel comfortable with my SEO plans, as I am already implementing most of the areas covered in your article on my sites. Again, thanks.

  63. Of course, a question arises as to how a search engine can detect whether or not a site uses page cloaking. There are three ways by which it can do so:

    i) If the site uses User-Agent cloaking, the search engines can simply send a spider to a site which does not report the name of the search engine in the User-Agent variable. If the search engine sees that the page delivered to this spider is different from the page which is delivered to a spider which reports the name of the search engine in the User-Agent variable, it knows that the site has used page cloaking.

    ii) If the site uses I.P. based cloaking, the search engines can send a spider from a different I.P. address than any I.P. address which it has used previously. Since this is a new I.P. address, the I.P. database that is used for cloaking will not contain this address. If the search engine detects that the page delivered to the spider with the new I.P. address is different from the page that is delivered to a spider with a known I.P. address, it knows that the site has used page cloaking.

    iii) A human representative from a search engine may visit a site to see whether it uses cloaking. If she sees that the page which is delivered to her is different from the one being delivered to the search engine spider, she knows that the site uses cloaking.
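    The User-Agent check described in (i) can be sketched in a few lines of Python. This is a modern illustration of the idea, not something from the original comment; the example URL and User-Agent strings are placeholders. The principle: fetch the same page twice, once identifying as a spider and once as a browser, and compare the responses.

```python
# Sketch of User-Agent based cloaking detection: request the same URL
# with two different User-Agent headers and compare the responses.
# The User-Agent strings below are illustrative placeholders.
from urllib.request import Request, urlopen

SPIDER_UA = "Googlebot/2.1 (+http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch(url, user_agent):
    """Return the response body for url, requested with the given User-Agent."""
    req = Request(url, headers={"User-Agent": user_agent})
    with urlopen(req) as resp:
        return resp.read()

def looks_cloaked(url, fetcher=fetch):
    """True if the page served to a "spider" differs from the page served
    to a "browser" - a hint of (not proof of) User-Agent cloaking."""
    return fetcher(url, SPIDER_UA) != fetcher(url, BROWSER_UA)
```

    In practice a byte-for-byte comparison is too strict (pages contain timestamps, ads and session IDs), which is one reason search engines also need the unknown-IP-address approach described in (ii).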

  64. IBLs (inbound links) are the key to Google, AltaVista, MSN and Yahoo; everything else is irrelevant.

    Get links on other people’s sites whose anchor text is your KEY PHRASE.

    That’s all there is to it - there’s too much mysticism about SEO - it’s all about brute force.

  65. Hi Roger,

    I have a question about the misspelling thing. What do search engines do with websites that use words like 1337 (leet), or other forum-talk words?

  66. May 18, 2005 by Roger Johansson (Author comment)

    Mark: Inbound links are very important, but in my experience far from the only thing that matters.

    Zeerus: I don’t think they “translate” leet speak, though I haven’t paid any attention to it so I could be wrong.

  67. May 19, 2005 by Mårten Blomberg

    Thanks for a great article. Really came in handy!

  68. Now that you know some of the secrets of quality SEO, it’s a good idea to measure your search engine visibility.

    I’ve written an article on the topic that is similarly thorough. Enjoy.

  69. May 21, 2005 by Roger Johansson (Author comment)

    Eric: That’s good stuff, and you have several other interesting SEO articles.

  70. Thanks Roger, you’ve got some very nice content here, as well. I’m always impressed when I visit.

  71. Good tips, thanks Roger. On adding content, I would add the following:

    Make sure to use the appropriate tags for your keywords.

  72. Thanks a lot. A nice, clear and concise article, and I instantly downloaded the Lynx browser, which cleared up a lot of my puzzlement over bad search engine listings for my site. I’m going to have to do a lot of redesigning, but it’s worth it. I’ve only just gotten into the whole world of CSS etc.

    Thanks. Your site is a fantastic resource.


  73. Good tips. You can also find lists of search engines with good Google PageRank to submit your site to yourself.

  74. June 24, 2005 by Troy

    I just want to chime in with a key technique that the search engines are using. You will see a steady increase in the “weight” that search engines give this element in their algorithms. It’s called Latent Semantic Indexing (LSI).

    This comes from the godfather of SEO, Bruce Clay. You can hear it from his own mouth in a one-hour interview on World Talk Radio.

    Check out the Wikipedia definition

    Essentially, the search engines are trying to bring you the most relevant content based on the keyword phrases you used in your search. LSI has been claimed to be 30% more “human-like” in identifying what we would deem relevant.

    Don’t believe me? Visit Google and look at the paid advertisement. It’s from Google! They are looking for scientists who understand and can improve this complex idea.

    ~ Cheers

  75. Most search engines don’t place any great deal of value on the contents of meta tags anymore

    Whilst Google and MSN are ignoring meta keywords, from personal experience it seems that the current Yahoo algorithm is definitely taking them into consideration.

  76. June 28, 2005 by Gogo Ganin

    Roger, the article sounds fine for sites with text-based content. But how about sites with rich multimedia content and only a small amount of text (e.g. terms of use)? Even more difficult: sites with peer-to-peer (non-public) multimedia content. How do you achieve the best ranking?

    Should I include long descriptions on a separate page? None of the multimedia customers will really be interested in such a description. It is even useless for advertising (because of “less is more”).

    BR, Gogo

  77. Roger, I agree with everything you’ve written above except for the part about keywords use. My supplemental considerations.

    1. Google accounts for only 33%-37% of all users; you cannot ignore all of the remaining search engines, or else you’ll lose the remaining 60% or so.

    2. Page optimization should include meeting accessibility guidelines, standards compliance and usability conventions. A page which can do this will meet criteria for its two users: search engines and site visitors.

    3. Page title and meta:description should reflect page content.

    4. All pages should be considered equally important.

    5. Site optimization is possible on existing sites but not practical; site optimization should be done during new site construction.

    6. All sites should have a site map. The link to the site map should be at or near the top of each page. Spiders will find this page before they time out and, since it is nothing but links to all pages on the site, spiders will crawl your site faster, which makes indexing easier. [If, however, you cannot place your site map link at the top of the page due to design issues, hand-submission of the site map to search engines will work.]

  78. June 29, 2005 by Roger Johansson (Author comment)

    Dan, Sean: I have not seen any improvement in ranking by using meta keywords, so in general I don’t think they are worth the effort. If you have the time to enter keywords that reflect each page’s content, by all means do so.

    Gogo: Difficult one. If there’s no text to index, rankings will not be good. The ethical way would be to include descriptions on the same page, without hiding it or otherwise trying to trick the spiders. But if the content is non-public, what is the use of public search engines indexing it?

    Sean: Those are all good tips. Thanks.

  79. Very amazing and helpful. Thanks a lot!

  80. Excellent article. I couldn’t read all the comments, but I’ll have to add something based on my rather limited knowledge and my own experience:

    The meta description seems to be very important, but keep it short - a single sentence, no summaries.

    Meta keywords seem very useful when there are only a few of them (5-10), but they must be to the point. I imagine how I would try to find my page in Google, and I add those words to the meta keywords. I choose keywords that are already present in the page.

    Original content, like you say, is the most important of all. Having original content elsewhere on your site seems to boost all of your pages.

    I use PHP to serve pages without URL rewriting, but with rather simple URLs with understandable values, and I am doing really well on a lot of generic searches. But my site is a really small one indeed, I have to admit :)

  81. This is also what I advise my clients who want to rank higher in the search engines. I have spent many hours reviewing the documents written by former university members now working at Google, and based on my experience I can only say that the info is correct, although there is more to it (more technical issues like hosting, choice of programming languages, cookies, etc.).

    Often my reply is simple: do not chase rankings; instead build your page and follow web standards: heading elements, a few keywords, a good title and description, and most importantly: CONTENT! If you make a web page that fills a niche you are interested in, and you steadily build a network of inbound and outbound links, nothing can go wrong.

    In my country we have a saying: patience pays off. And with Google it really does.

  82. Hi, there

    Does anyone know if Google penalizes pages with misspelled content or titles? I’ve been running a numerology site for a while, and it has never ranked well on Google. I wonder if it is because I wrote “plam” instead of “palm”. Anyway, after reading your article I removed the misspelled words from the title.

    Thanks for the help

  83. Here are some more SEO tips at

  84. October 6, 2005 by Philip

    So here’s a content question. A lot of times I hear about how content is king and key and all those buzzwords. But so much of the help for SEO and design seems to come from the blogs of the top designers. Yeah, you get good fresh content all the time if you are a blog. But what about if you are a small business selling a few items, you don’t have that much content and there isn’t much you can do to make it fresh. What is everyone supposed to do? Put up a blog and do weekly updates on what kind of new candles their business has created?

    If your site isn’t the type that naturally has new content often (news site, blog), but rather has products that change, how is the content supposed to be kept fresh and of high quality?

  85. October 13, 2005 by Roger Johansson (Author comment)

    Philip: Adding new, real content is best. If you can’t do that, updating the content of existing pages is better than doing nothing. If a product has been updated, change the page that describes it. If it hasn’t been updated, maybe you can rephrase part of the description. You could also regularly add reviews or usage tips. It takes a bit of work, but will keep your site alive.

  86. October 22, 2005 by Search Engine Marketing

    Very nice article. If I had a dollar for each time we have to reiterate the same principles to all our clients, I’d be rich!

    One thing though.. the bad SEOs go and spoil the game for the good ones and yet they seem to get away with it more often than not. Are they just really smart or is there just a never-ending supply of gullible clients?

  87. Probably the best tip is to keep it simple.

    Also, if your site has great original content, you will become popular in no time with little work.

    If your site isn’t popular, look at your competitor sites and use tools like the Wayback Machine to see what changes they made to become popular.

  88. Descriptive titles are important, but it is also important to keep page titles short and to the point.

  89. November 5, 2005 by Search Engine Optimization

    Title tags are really important for search engines, so make sure you include up to three of your keywords in the title tag. Take care of your description tag too: include your keywords in the description as well.

  90. November 5, 2005 by Roger Johansson (Author comment)

    Search Engine Optimization: The keywords go in the title element, not in the title tag ;-). And only include those keywords if they actually describe the content of the page, or you will mislead people visiting the site.

  91. I know two pieces of advice: good content, and HTML tags used well (h1, a, li).

  92. @ SEO

    Do not use “title tag spamming”. There was a great era in which Google let us abuse this, but that is long gone, and it is now considered bad practice.

    You can use one keyword in a title attribute to describe the nature of a link or image. It is primarily there for spiders (who are blind).

  93. I referred to this article:

  94. Thanks for posting this article. Great help on my seo learning.

  95. Thank you for the information; this is exactly the kind of information I was looking for.

  96. Cool, another good SEO article.

    Just one question though: are you sure about the meta keywords tag? Because I heard Yahoo still uses it.

    “After changing the CSS to position: relative; left: -100000px Google started to index the page.”

    Umm, can this trick be used to fool search engines as well? It’s not cloaking, right?

  97. February 6, 2006 by smith

    Thanks a lot. This article is a great help to me.

  98. February 18, 2006 by David O. Faik

    Thanks a bunch. I am web and IT literate but have never designed a page or had to deal with SEO. I am working for a hotel business (GDS marketing) that needs to greatly improve its SEO. They are asking me to look at creating a career section for their member hotels (located all over the world). This page has helped me formulate clear arguments about how that career site will need to be set out in order for it to be found by candidates in the key markets.

    Just a quick question though: if our member hotels (500 of them) all link their job pages to our career site, that would be a very good thing for SEO (from your article). Am I reading that right?

  99. This article is going to help me to a great extent. I also agree with your view that content is king, and that getting links to your site matters a bit as well.

  100. Congrats… This article just made page 3 on most Google datacenters for the term “search engine optimization” (spelled the American way)

    and page 4 for “search engine optimisation” (spelled the English way).

    If you also had “optimization” in the title, it would probably climb to page 2.

  101. Yes, these are the basics of SEO, things that most webmasters know, but here everything is in one place. To be honest, I learned about writing title tags from this article, so it was time well spent.

  102. To: David O. Faik

    If the links come from IPs in different C-blocks, then yes.

  103. Whoever made the comment about positioning text 100,000 pixels off the left of the screen is joking, right?

    For the record, it doesn’t matter if google can detect these spammy techniques or not. Most of the time they can’t - there are thousands of cloaked sites that rank very well. What matters is whether your competitors can detect your spam.

    Anyone who is serious about getting good rankings should have a good look at the other sites that are beating you in the SERPs. This means looking at who links to them, what phrases they are optimised for, and what their source code looks like. Spammy tactics are usually easy enough to see in the source code.

    When you find a competitor is cheating - file a spam report. If you are wrong, no harm done. If you are right, you get a free upgrade in your ranking, and it’s easier than link begging :)

  104. There are funny situations where some SEO company tells you “You will get the highest PR, or at least 9 …”, and when you check, their own PR is 0 or 2 … :-)

  105. Search engine optimization is a highly specialized field, and requires a deep understanding of its various aspects.

    Most of the companies that end up paying thousands of dollars for competitive keywords have little knowledge of how their money should be spent wisely. For certain websites, even 500 back links can beat the 20,000 back links of a powerful and established website on various search engines. So there is absolutely no use in submitting to 50,000 unrelated websites or directories, but it is a sad story that companies keep falling into these booby traps.

Comments are disabled for this post (read why), but if you have spotted an error or have additional info that you think should be in this post, feel free to contact me.