Validity does not equal best practices

I had a laugh when I read Dustin Diaz’ post Totally Compliant Markup. He’s created an example of a completely valid HTML document that goes against most best practices. Hilarious, but unfortunately a reality on some sites out there. And I’ve seen worse. Much, much worse.

I debated whether to post this or not, but in the end I decided that it’s important to show an example of what can go wrong when people use validation as the only best practice.

Take a look at the website of Swedish municipality Sandviken. Then view source. Oh, the horror. I dig that 37.5 kB of inline JavaScript to create a drop-down menu. The tons of inline CSS. The spacer GIFs. The HTML 4.01 Transitional DOCTYPE combined with HTML-compatible XHTML syntax. The javascript: pseudo-URLs. Phew. The HTML actually validates. The CSS, however, does not.

A few examples, slightly edited for readability:

  1. <body style="margin-left: 0px; margin-top: 0px;margin-right: 0px; margin-bottom: 0px; background-image:url(/images/bg11b.gif);">
  2. <a href="/195dd5bf9174c73697fff388.html" accesskey="0" title="Om webbplatsen"><img src="/images/tompix.gif" style="border-style:none; width:1px; height:1px;" alt="Om webbplatsen" /></a>
  3. <div style="float:left; position: absolute; left: 656px; top: 4px;;" id="svid12.1756456108c8ba612280003655">
  4. <div style="float:left; margin-top:1px; margin-right:4px; " id="svid12.7b7b111015cc8f43f80001713"><a name="Sok"></a><a title="Sök" href="/sok.4.1ac5f13f90d6619d47fff124.html"><img alt="Sökfunktion" style="border-style:none; " src="/images/forstoringsglas1.png" onMouseOver="this.src='/images/forstoringsglas2.png';" onMouseOut="this.src='/images/forstoringsglas1.png';" /></a></div>
  5. <select class="smaxmenyer" id="droppen_vem" name="droppen_vem" onChange="window.location=document.getElementById('some_id').value;" style="width: 150px;">
  6. <div style="font-size: 0px; clear: all;"></div>
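
As a point of comparison, here is a rough sketch, not taken from the actual site, of how examples 1 and 5 could be written with the presentation and behaviour moved out of the markup. The file names are hypothetical; the id, class and image values are simply carried over from the examples above.

    <!-- Hypothetical cleaner equivalent of example 1: no inline styles on body -->
    <body>

    <!-- Hypothetical cleaner equivalent of example 5: no inline styles, no inline onChange -->
    <select class="smaxmenyer" id="droppen_vem" name="droppen_vem">
      ...
    </select>

    /* In an external stylesheet (base.css is a made-up file name) */
    body {
      margin: 0;
      background-image: url(/images/bg11b.gif);
    }
    .smaxmenyer {
      width: 150px;
    }

    // In an external script (menu.js is a made-up file name), run after the
    // document has loaded, e.g. from window.onload
    document.getElementById('droppen_vem').onchange = function () {
      window.location = this.options[this.selectedIndex].value;
    };

The page would render the same, but the markup stays readable and the styles and script live in external files that can be cached and maintained in one place.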

That’s just scratching the surface. For lots of examples of what not to do when building a website in 2006, study the source code of this municipality site carefully. Of course the municipality is not to blame for this mess. Whoever built the site or sold them the CMS they are using is. And they should be asked for a full refund.

But hey, it validates.

Posted on March 20, 2006 in Web Standards

Comments

  1. Get a load of those URLs. Holy cow.

  2. I also suggest Sebastien Guillon’s XHTML-valid-but-incorrect page. It is XHTML 1.1 valid and CSS valid, but full of worst practices.

  3. Web standards and best practices are not there just for their own sake; they have a purpose. Very good example.

  4. Validation is not the goal of web standards, but many developers who come to web standards seem to think it is. ㅜ.ㅜ

  5. A typical problem: for some people, validity equals standards compliance (and thus “best practice”), which is just not the case. But hey, that’s progress; here we’re dealing with a new challenge!

  6. Forget the bad practices! Look at that link in #4 “sok.4.1ac5f13f90d6619d47fff124.html” That’s painful just looking at it.

  7. You should send this URL to the people who built the CMS ;) I have -heard- that they will have this fixed in future versions of the CMS.

  8. The difference between validity and semantics is that validity can be assessed on a tag-by-tag basis whereas semantics must be evaluated in much larger chunks. This may be equated with a spell checker vs. a grammar checker as this poem demonstrates.

  9. Good article, and it is a SAD state that many think validation=usable/accessible/standards. What I am seeing, for the most part, is that web developers are LAZY. They want to make a quick buck, and if they can make others believe they know what they are doing - then it works.

    I was talking with another ‘web developer’ this weekend and he was telling me how nice it is that he has found open source pieces that he can jam into his clients’ sites. It gets REAL ugly once this happens. None of his sites validate, his CSS only validates in some spots (all generated from Adobe GoLive), and none of his sites are even close to being accessible to all (or even degrade well). Yet, he can make money just by telling people he knows what he is doing. He knows ZERO HTML and relies completely on a WYSIWYG. As a perfect example, you can check out a site he recently built: http://www.npnaz.org. This is for a church. He decided to use Calendar Express for the calendar system (which is vulnerable to an array of XSS hacks), he used another program to generate static photo galleries, and is now using another program to edit some content. It is getting out of control - and it only gets worse markup-wise, with the use of Adobe GoLive code and frames all over the place. Hmm…where was that article about a new professionalism?

    Anyway, now that I’ve wandered off track - this is a great article, and it’s a sad state that most ‘web developers’ miss the big picture.

  10. March 20, 2006 by DaveMo

    Think that example is bad? Check this out!

    http://www.cbs.state.or.us/external/osha/

    I wish I had been in on the meeting when the consultants sold THIS one to the bureaucrats!

    Your tax dollars at work.

  11. March 20, 2006 by Lowell Wood

    About the whole CSS validation thing, my CSS doesn’t validate solely because it says I don’t have the background-color property, when it is actually set to background-color: transparent. Does anybody have any ideas what’s going wrong?

  12. March 20, 2006 by Lowell Wood

    Never mind, those are just warnings; the CSS is valid, even if the validator doesn’t quite know what it’s talking about with those warnings.

  13. March 20, 2006 by fens

    I don’t think the church one is THAT bad considering it’s probably not going to be seen by that many. But he’s hardly a web developer if he doesn’t try to get these things right. We all like a little pride in our work :D Techdirt (a large site like Slashdot) recently did up its site to be more modern. Very good site for news, so it is :) but it’s just not validating on any page. Now that’s BAD: it’s a large site, seen by many, about technology, and it can’t even get that right. I commented on it and even put it in the feedback.

    As for the W3C CSS validator, yeah, I get the same. I just put in the correct colour to make it look right and get the errors down. Only one warning now on my site, I think.

  14. I completely agreed with you, like I wrote a couple of months back.

  15. When giving advice or admonition, I now use the phrase “valid and semantic HTML.” Kind of solves the problem of improper or insufficient expectations.

  16. Lowell, the warning is with regards to possible low contrast and in some cases inherit will stop that. Like you say in most cases it doesn’t matter, it is just highlighting a possible issue.
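
For anyone else running into this: the W3C CSS validator warns when a rule sets color without also setting background-color (or the other way around), because it cannot tell whether the inherited or transparent background will give enough contrast. A minimal sketch of a rule it accepts without complaint; the selector and colours are made up:

    .content {
      color: #333;
      background-color: #fff; /* declared together with color, so no contrast warning */
    }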

    Roger, how in the world did you ever find that article buried in my archives? I’ve yet to even put up monthlies let alone topical archives…

    Anyway, I don’t know what any of you guys are talking about. There’s nothing wrong with my markup ;)

    We’re still doing sarcasm, right?

  18. RE: fens Yeah, I agree - it won’t be seen by many, but it is still frustrating nonetheless. The ‘developer’s’ actual site is http://www.celuch.com (have fun with that one) and there is a proofs site at http://www.celuch.net where you can see the other sites he has done. This takes it beyond the ‘church’ site and into the professional realm, where he is selling this junk to people.

    RE: Dustin I am glad he found the article, it was very amusing :)

  19. The nice thing about valid HTML is that it should be both backward and forward compatible. However, it’s so far beyond what’s realistically needed to be compatible that it’s just ridiculous. One issue I’ve dealt with lately is the importance of valid HTML for SEO purposes. If it works in a browser, it will be fine for SEO purposes. Do people really think that Google and other search engines will rank valid HTML sites higher? That would basically penalize great content published by non-anal web publishers.

  20. RE: Ed I would disagree with you on the valid HTML issue. Well, it’s not so much the VALID part as it is semantics. Using proper semantics in the site will help your site, as certain tags are weighted differently (h1-h6, em, strong, etc.). In some of the tests I have seen, the ‘validity’ doesn’t seem to play as big a role - but semantics definitely do (a small before/after example follows after this comment).

    I wouldn’t consider those who CARE about their development as ANAL. It’s important. I hope next time you go to the doctor that you are treated by someone who is ‘anal’ about their profession, versus someone who is just trying to get by. It is important to hold high standards, no matter what the profession.

    I guess I am anal….oh well.
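
To make the validity-versus-semantics point concrete, here is a tiny hypothetical before/after. Both fragments validate, but only the second one tells a search engine or a screen reader that the first line is a heading and that the phrase is emphasised; the class names are invented for the example:

    <!-- Valid, but carries no meaning -->
    <div class="bigredtext">Opening hours</div>
    <p><span class="shouty">Open every day</span></p>

    <!-- Valid and semantic -->
    <h2>Opening hours</h2>
    <p><strong>Open every day</strong></p>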

  21. March 20, 2006 by fens

    RE: Nate K Oh. My. God. I like nice markup. Clean code. Makes everyone’s job easier. But… I can’t even browse those sites. It’s bad code and bad design >.<

  22. March 20, 2006 by Henrik

    The really interesting part with the Sandviken site is that it’s the only municipal site in Sweden that meets the basic accessibility criteria, at least according to Verva’s test (mostly in Swedish). Verva is a government initiative that among other things helps the public sector with guidelines, much like the W3C WAI.

    In a way, you could say everything is more accessible if you have both an onclick and an href on every link.
    If this is how it’s supposed to look to be accessible, I’m not sure I want to be a part of it…

  23. March 21, 2006 by Roger Johansson (Author comment)

    Henrik: Verva’s test (Mätning av grundläggande tillgänglighet) is only meant as a very, very basic check. It doesn’t really say anything at all about the site’s accessibility. It only checks whether the site uses valid markup, has at least one heading element, does not seem to use tables for layout, uses Dublin Core for metadata, and if it uses frames.

    If a site meets the criteria, the reasoning is that it is more likely to also be accessible and well-built. The Sandviken site clearly shows that this is not always the case. In fact, I’ve heard rumours that the home pages of some Swedish public sector sites are specifically tweaked just to pass the test.

    I believe that the test will be adjusted to be more likely to reveal those who only pay lip service to the public sector guidelines.

  24. March 21, 2006 by Roger Johansson (Author comment)

    Jens: They’re going to fix all the problems? That would be great. I’m not holding my breath though ;-).

    Dustin: Found the link in the massive folder of things to write about that I’ve collected.

    Joe: Yes, that is a much better phrase than just valid HTML.

  25. March 21, 2006 by Henrik

    Roger: I sure hope you are right about the adjustments to the Verva testing. The way the tests are performed now, they guide developers in the wrong direction and make them believe they have done a good job. Both Sandviken and the people behind the CMS proudly announce being top-ranked in the test. Of course they do; I would too. But I just hope the test result doesn’t make them think they are done with accessibility issues and, god forbid, that others don’t follow this coding practice to reach the same result in the test.

  26. March 21, 2006 by Gary Turner

    Like Mr. Clark, I don’t think valid HTML is enough in my world. My ordering is usually well structured, semantic and valid. Structure and meaning are the elements that require effort. That the markup is valid should be automatic. Further, if it doesn’t make sense in Lynx, it doesn’t make sense. :D

    cheers,

    gary

  27. Henrik, Roger: The Verva test is not meant to be used to check individual web sites (the test disclaimer mentions that). It is the aggregate information that is interesting (what percentage of government web sites use web standards compared to the previous measurement). In that way Verva can get an indication of whether the guidelines have had any impact at all.

    That said, for an individual web site it is still valuable to say that they have “passed” the test. If you only change your start page to be able to pass the test, at least you will have learnt some basic things such as:

    • using the validator to test your markup,
    • what headings are,
    • and how to use Dublin Core metadata (sketched below).

    For more information on the forthcoming testing software see PAAKT.
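
As a quick illustration of the Dublin Core item above, the metadata typically goes into the document head along these lines; the values are placeholders, not taken from any real site:

    <link rel="schema.DC" href="http://purl.org/dc/elements/1.1/" />
    <meta name="DC.title" content="Example page title" />
    <meta name="DC.creator" content="Example Municipality" />
    <meta name="DC.date" content="2006-03-20" />
    <meta name="DC.language" content="sv" />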

  28. I would have to agree with everything stated here. It’s great that so many people are embracing standards, but having your site validate is essentially only the first step. From there, a human has to look at the overall picture and evaluate it. The semantics of your markup cannot be gauged by any automated process, no matter what the situation. Hopefully, as the use of valid markup expands, the idea that the markup must make sense will come along with it. Thanks for the good read.

  29. April 2, 2006 by James

    Great work!

    Might have to add this to my toolbox for my next rewrite of the Target website. ;)

    James

  30. That’s why I include the notation

    © Copyright 2003, Volkan Özçelik. This page can be viewed in any browser, has valid xhtml, valid css and is at least wai-a level accessible with intentional exceptions

    at the bottom of my pages.

    imho, common sense is much more important than strict standards compliance.

    btw, my site is celebrating its one-day naked-ness :)

    Cheers,

Comments are disabled for this post (read why), but if you have spotted an error or have additional info that you think should be in this post, feel free to contact me.