Categories
XHTML/HTML

Adventures in XHTML

Recovered from the Wayback Machine.

During the recent light-hearted discussions revolving around IE8 and its faithful companion, Wonder Tag, a second topic thread broke out about XHTML. As is typical whenever XHTML is brought up, the talk circles around to the draconian error handling, or yellow screen of death, triggered by even a small, harmless-seeming discrepancy in a page’s markup.

However, the yellow screen of death is an artifact of how Firefox deals with problems, not handling that’s inherent in serving XHTML as application/xhtml+xml. Safari’s error handling is much less extreme, attempting to render all of the ‘good’ markup up to the point where the ‘bad’ markup occurs.

Opera’s error handling is friendlier still. It provides the context of the error, which makes it the best tool for debugging a faulty XHTML page. You might say Opera is to XHTML as Firebug is to JavaScript. The browser also provides an option to reprocess the page as more forgiving HTML.

To return to the discussion I linked earlier, in response to the mention of the draconian error handling, I wrote:

I can agree that the extreme error handling of the page can be intimidating, but it’s no different than a PHP page that’s broken, or a Java application that’s cracked, or any other product that hasn’t been put together right.

To which one of the commenters responded:

I don’t want to get off-topic either but I hear this nonsense a lot. You can’t simply compare a markup language with a programming language. They have very different intended authors (normal people versus programmers) and very different purposes.

I disagree. I believe you can compare a markup language with a programming language. Both are based on technical specifications, and both require an agent to process the text in a specific way to get a usable response. As with PHP or Java, you have to know how to arrange XHTML in order to get something useful. That HTML has a more forgiving processor than XHTML or PHP doesn’t make it less technical–just inherently more ‘loose’, for lack of a better term.

In my opinion, the commenter, Tino Zijdel, was in error on a second point as well: markup isn’t specific to programmers. In fact, programmers are no better at markup than ‘normal’ people. Case in point: the error pages I’ve shown in this post.

As most of you are aware, I serve my pages up with the application/xhtml+xml MIME type. For those of you who have tried to access this site using IE, you’re also aware that I don’t use content negotiation, which tests whether the browser can process XHTML and returns text/html if it can’t.
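For those who do want to negotiate, the test itself is simple. Something along these lines would do it (a simplified sketch, not code from this site, and a thorough version would also weigh the q-values in the Accept header):

<?php
// Simplified content negotiation: serve application/xhtml+xml only to
// browsers that say they accept it, and text/html to everyone else.
$accept = isset($_SERVER['HTTP_ACCEPT']) ? $_SERVER['HTTP_ACCEPT'] : '';

if (stripos($accept, 'application/xhtml+xml') !== false) {
    header('Content-Type: application/xhtml+xml; charset=utf-8');
} else {
    header('Content-Type: text/html; charset=utf-8');
}
?>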

Before yesterday, I still served the WordPress administration pages as text/html, rather than application/xhtml+xml. Yesterday I threw the XHTML switch on the administration pages as well, and ended up with some interesting results. For instance, both plug-ins I use that have an options page had bad markup. In fact, one very popular plug-in that publishes del.icio.us links into a post had the following errors:

  • The ‘wrap’ class name wasn’t in quotes.
  • Five input fields were not properly terminated.
  • The script element didn’t have a CDATA wrapper.
  • Attributes such as ‘disabled’ and ‘readonly’ were given as standalone values rather than name/value pairs.
  • Two extraneous opening TR tags.
  • One non-terminated TR element.
  • Two closing label tags without matching opening tags.

For all of that, though, it didn’t take me more than about 15 minutes to fix the page, with a little help from Opera.
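To give a sense of what those fixes look like, here are before-and-after versions of a couple of the simpler ones (reconstructed for illustration, not the plug-in’s actual markup):

<!-- Before: unquoted class name, unterminated input, standalone attribute -->
<div class=wrap>
  <input type="checkbox" name="daily" disabled>
</div>

<!-- After: quoted, terminated, and the attribute given an explicit value -->
<div class="wrap">
  <input type="checkbox" name="daily" disabled="disabled" />
</div>

<!-- Script content needs a CDATA wrapper so the XML parser leaves it alone -->
<script type="text/javascript">
//<![CDATA[
var options = { wrap: true };
//]]>
</script>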

The WordPress administration pages work except for the Dashboard, where the version of jQuery that comes with WordPress didn’t seem to handle the Ajax calls that fill the page. I updated jQuery to the latest version, and the feed from the WordPress weblog now shows, but not the other two items. At least, not with Firefox 3 or Safari; all the content does show with Opera.

The Text Control plug-in had one minor XHTML error in its options page, but even with that fixed, selecting a new text formatting option for a post doesn’t work–the selection goes back to the default. That one will end up being more challenging to fix, because I haven’t a clue what’s stopping the update.

WordPress does a decent job of generating proper XHTML content when using the default formatting. In fact the only problem I’ve had, other than when I embed SVG inline, was my own inaccurate use of markup. I used <code> elements, by themselves, when displaying block code. What I should have used is <code> wrapped in a <pre> element. When I do, the WordPress default formatting works without problems.

// Swap out WordPress's default comment-text filters for a different formatter
remove_filter('comment_text', 'wpautop', 30);
remove_filter('comment_text', 'wptexturize');
add_filter('comment_text', 'tc_comment');

My error, and the errors of the plug-in creators, all demonstrate that though programmers might be more familiar with the consequences of making a mistake with technical text, we don’t make fewer mistakes than anyone else when it comes to using web page markup. Our only advantage is that we’re not as intimidated by pages with errors. Regardless of how they’re displayed, or of our relative technical expertise, these error messages aren’t necessarily a bad thing.

One of the advantages of serving the pages with application/xhtml+xml is that we catch mistakes before we serve the pages up to our readers. We definitely catch the mistakes before we release code that generates badly formed markup, or provide broken options pages to accompany our plug-ins. I can’t for the life of me understand why any programmer, web developer, or designer would want less than 100% accuracy from their web pages. That’s tantamount to saying, “Hire me. I write sloppy shit.”

Of course, being able to program can have advantages when working with XHTML, especially with many of today’s applications. WordPress does a good job of working in an XHTML environment, but not a great one. One example of where the application fails, badly, is in the Atom feed.

In the Atom feed, WordPress outputs the HTML type as an attribute on many of the fields:

<summary type="<?php html_type_rss(); ?>">
<![CDATA[<?php the_excerpt_rss(); ?>]]></summary>
<?php if ( !get_option('rss_use_excerpt') ) : ?>

This is all well and good except for one thing: when the type is returned as ‘xhtml’, Atom feeds are supposed to use the following syntax for the content:

<summary type="xhtml"><div xmlns="http://www.w3.org/1999/xhtml">
...</div></summary>

This is an outright error in how the Atom feed is coded in WordPress. I’ve had to correct this in my own feed, and then remember not to overwrite my copy of the code whenever there’s an update. What the code should be doing is testing the type, and then providing the wrapper accordingly.
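The fix amounts to something like the following (a sketch only, not the exact code I’m running, and it assumes the excerpt itself is well-formed when the type comes back as ‘xhtml’):

<?php
// Sketch: test the type, and wrap the excerpt the way Atom expects.
$is_xhtml = ( strpos( get_bloginfo('html_type'), 'xhtml' ) !== false );
if ( $is_xhtml ) : ?>
<summary type="xhtml"><div xmlns="http://www.w3.org/1999/xhtml">
<?php the_excerpt_rss(); ?></div></summary>
<?php else : ?>
<summary type="html"><![CDATA[<?php the_excerpt_rss(); ?>]]></summary>
<?php endif; ?>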

A second issue with WordPress is more subtle, and has to do with a part of XML I don’t consider myself overly familiar with: character sets and encoding. As soon as I switched on XHTML at my old weblog, I started to have problems with certain characters in my comments, and had to adjust the WordPress comment processing to allow for UTF-8 encoding. As it is, I’m not sure that I’ve covered all the bases, though I haven’t had any recurrence of the initial problems.

However, during the XHTML discussion, Philip Taylor demonstrated another problem in the WP code, in this case sending through a couple of characters that the WP search function did not like.

I checked with one of my two XHTML experts, Jacques Distler (the other being Sam Ruby), and the characters were Unicode, specifically:

utf-8 0xEFBFBE = U+FFFE
utf-8 0xEFBFBF = U+FFFF 

From Jacques I found that Philip likes the U+FFFE and U+FFFF Unicode characters because they’re not part of the W3C’s recommended regular expression for filtering illegal characters.

Unfortunately, protecting against these characters in search as well as comments required code in more than one place, and, in fact, hacking into the back end of WordPress. This is not an option available to someone who isn’t a programmer. However, this example doesn’t demonstrate that you have to be a coder to serve pages as XHTML–it demonstrates that applications such as WordPress have a ways to go before being technically, rather than just cosmetically, compliant with XHTML.
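The patch itself is nothing exotic. Boiled down to its essentials, it is a filter along these lines (a simplified sketch, not the exact code I dropped into WordPress), applied to search terms and comment text before WordPress does anything else with them:

<?php
// Sketch: strip the control characters XML 1.0 disallows, plus the
// U+FFFE and U+FFFF non-characters (EF BF BE and EF BF BF in UTF-8).
function strip_xml_illegal_chars( $text ) {
    $text = preg_replace( '/[\x00-\x08\x0B\x0C\x0E-\x1F]/', '', $text );
    return preg_replace( '/\xEF\xBF[\xBE\xBF]/', '', $text );
}
?>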

Having said that, I can almost hear the voices now: Why bother, they say. After all, no one uses XHTML, do they?

Why bother? Well, for one thing, XHTML served as XML provides a way to integrate other XML-based specifications into the page content, including in-line SVG, as well as MathML, and even RDF/XML if we’re so inclined. The point is, serving XHTML as XML provides an open platform on which to build. Otherwise, we’re dependent on committees to hash through what will or will not be allowed into a specification, based on one company or another’s agenda.

We can include SVG in a page using an object element, but we can’t integrate something like SVG and MathML together without the ability to include both inline. We certainly can’t incorporate SVG into the overall structure of the page–at least not easily using separate files. There is no room in an HTML implementation for all the other XML-based vocabularies, and we can only cram so much into class attributes before the entire infrastructure collapses.
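For anyone who hasn’t tried it, inline SVG in an XHTML page is nothing more than declaring the namespace on the element and writing the markup directly into the body. A trimmed-down example:

<div>
  <p>The circle below is part of the document, not a plugged-in object:</p>
  <svg xmlns="http://www.w3.org/2000/svg" width="120" height="120">
    <circle cx="60" cy="60" r="50" fill="#c00" />
  </svg>
</div>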

No, we need both: an HTML implementation for those not ready to commit to an XML-based implementation, and XHTML for the rest of us.

During the recent discussions on IE8, several people asked Chris Wilson from Microsoft whether IE8 will support the application/xhtml+xml MIME type. So far, we’ve not had an answer. Whatever the company decides, though, XHTML is not going away. The HTML5 working draft, which was just released, is about a vocabulary, not a specific implementation of that vocabulary. Both HTML and XHTML implementations are covered in the document, though XHTML isn’t covered as fully because most of the aspects of processing XHTML are covered in other documents. At least, that’s what we’re being told.

What’s critical for the HTML5 effort is that browsers support both implementations. Even the smallest mobile device is not going to be so overburdened by the requirements that it can’t consume pages delivered up as proper XHTML. It’s a sure thing that handling clean markup demands less of a device than handling a mess.

I’d also hate to think we’re willing to trade well-designed and well-constructed web sites for pages filled with missing TR end tags, poorly nested elements, and unquoted class names, just because Microsoft can’t commit to the spec, and Firefox took the “bailing out now!” approach to error handling.

Categories
Browsers

And they’re off

The ACID3 race has begun. Coming around the first lap…

Firefox 3 is in first place, with a commendable lead. Way to burn up the track, foxy!

[image gone]

Coming up from behind, we find the ACID crowd favorite, *Opera!

[image gone]

Winded, but still giving it all she’s got…Safari! (Is that a picture of a cat?)

[image gone]

And in the tail position, dragging, but not dead yet…IE!

[image gone]

The next lap is in six months. Get your bets in now.

Update

*Testing with Opera’s 9.5 beta, we have a new winner, going into the first lap…

[image gone]

Categories
Technology

Macports, Unix, and Graphics

Recovered from the Wayback Machine.

My upcoming book, Painting the Web, includes considerable coverage of technology-enabled graphics. Of course, all graphics are technology enabled, but when I say ‘technology-enabled’ I mean graphics via command line tools or accessed through a programming language such as PHP.

What to cover wasn’t an easy choice. For instance, how much programming experience should we assume the reader has? Little? Lots? In the end, I aimed the writing at a reader who has had exposure to JavaScript and/or PHP, but who doesn’t have to be either a pro or an expert.

Then there was the issue of the Unix command line, and installation applications for the Mac, such as MacPorts. Even experienced PHP/JavaScript developers may have no exposure to the Unix command line. Yet there is a wealth of resources available–in Linux and on the Mac–for people interested in graphics who are willing to forgo the desktop interface and get their Unix on, as the saying goes.

In the end, I covered these tools but promised the reader that I would provide web pages with up-to-date links to helpful tutorials and resources that could get them up to speed, either on Unix or in the programming languages used. This includes one of my most-used applications, MacPorts, the installation software for putting Unix-based applications on our computers.

Why would you be interested in MacPorts, especially if you’re into graphics?

When I was getting ready for Painting the Web, I spent an entire day downloading and installing software I planned to cover in the book on one of my Macs. An entire day, literally dozens of applications, and yet, all combined, it took up less than a gigabyte on my hard drive. That’s one of the real advantages of using an application like MacPorts and the free and open source applications that can be installed with this tool. In the graphics port area alone you have applications such as GIMP, UFRaw (a RAW editor), Inkscape for vector graphics, the GD graphics library that I use so extensively at this site, libexif for parsing the EXIF section of a photo, and hundreds of other applications, including my favorite, ImageMagick.

Ah, ImageMagick. I can never say enough about ImageMagick. It has got to be one of the most entertaining sets of graphics tools in the living world. Best of all (well, other than it being free), most hosting companies have some version of ImageMagick installed, so you can access the command line tools without having to install them on your own Mac (or Windows–there is a Win version of ImageMagick). Still, if you can get a local copy on your Mac, installing this application pays for the MacPorts installation all by itself. When you do install the tool set, make sure to spend time with the online examples, as the documentation for ImageMagick is a bit light.
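To give you a taste of what the command line buys you, here is the kind of one-liner I mean (the file names are made up for the example):

# Resize a photo to 800 pixels wide, sharpen it slightly, save it as a new file
convert original.jpg -resize 800x -unsharp 0x1 resized.jpg

# Stamp a caption in the lower right corner
convert resized.jpg -gravity southeast -pointsize 18 -fill white \
  -annotate +10+10 "Painting the Web" annotated.jpg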

It’s a little ironic that one of the first things I wrote in a book on web graphics was to encourage people interested in graphics to become familiar with the Unix command line. The Unix command line is one of the most non-graphical technologies that exist today. Graphics, though, does not begin and end solely in Photoshop–limiting your tools to those that have a GUI and that are installed with one click of the mouse limits the amount of fun you can have with graphics. And if we’re not having fun, why bother?

  • You will need to install the Apple X11 system first, using the Mac OS X Install Disc. The MacPorts instructions cover this.
  • Next is MacPorts, of course. You may have also heard this application called “DarwinPorts”. The site has a list of ported applications, as well as excellent documentation.
  • Another MacPorts tutorial, providing more of an overview. You can also find an overview of MacPorts at Lockergnome.
  • I don’t use a GUI to MacPorts, but some of you might like one. There are several, including PortAuthority and Porticus. The benefit of a GUI tool is that it can be easier to see, at a glance, what’s installed.
  • One of the advantages of using MacPorts is installing applications that work together, such as the LAMP trifecta: Apache+MySQL+PHP. I found a couple of different tutorials on using MacPorts for installing these three applications: a fairly detailed and involved approach, which might be a little intimidating to new command line users; steps for a Leopard installation. I’m not running Leopard, so I’m not sure how accurate the steps covered are. Frankly, if you don’t need the trifecta, and you’re just playing around with the graphics, I’d get more comfortable with MacPorts and the command line before installing these three. If you want to try some of the PHP-based graphics applications, though, you’ll have to install at least Apache and PHP.
  • One thing about MacPorts is that if there is an application dependency for the application you’re installing, the tool automatically downloads and installs that dependency. I have found that with GIMP, if I use MacPorts to install UFRaw first, it downloads and installs the latest GIMP, and then integrates the two. With this integration, UFRaw pre-processes a RAW photo before passing it on to GIMP. Regardless of how you install the tools, you’ll definitely want to be consistent: if you use MacPorts to install UFRaw, don’t use the standalone click installer for GIMP–use MacPorts. Otherwise the GIMP application is installed in the wrong place, and UFRaw can’t find it.
  • ImageMagick is also an available port on MacPorts. There are a significant number of dependencies for ImageMagick, so it may take a considerable amount of time to install this application. May I say, though, that the results are worth the effort? (There’s a quick command sketch after this list.) Unfortunately, most of the programming language interfaces to ImageMagick are not in ports. For instance, I use iMagick (source), a PHP-based ImageMagick wrapper, which is accessible via PECL, a PHP extension system, but not MacPorts. No worries, though, as these language-based wrappers are typically quite easy to install. If you’re a Ruby user, you’re in luck: RMagick is a MacPorts port.
  • Throughout all of this, even if you use a GUI MacPorts interface tool, at some point you’re going to be messing with the Terminal application for the Mac. The Terminal provides an interface into the underlying Unix system and its command line. There are tutorials on using the Terminal, including a TidBits tutorial (part 2 and part 3) and several older articles from O’Reilly.
  • There are a ton of Unix command line how-tos, helps, and tutorials. The nice thing about the Unix command line is that the tools you use most rarely change. Benjamin Han has provided several Mac Unix how-tos, this Mac forum thread provides some nice jumping-off points, and there are a couple of books for Mac users covering the command line, though I haven’t read any and so can’t provide a recommendation. You might also want to spend some time with shell scripting, especially if you want to package your ImageMagick commands.
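For all the words above, day-to-day use of MacPorts boils down to a handful of commands. A sketch (port names change over time, so run a search first):

# See what's available
port search imagemagick

# Install ImageMagick and everything it depends on
sudo port install ImageMagick

# Install UFRaw, which pulls in GIMP as a dependency
sudo port install ufraw

# Later, update MacPorts itself and any outdated ports
sudo port selfupdate
sudo port upgrade outdated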

This is a start, and I’ll be adding to this list before I formalize it into a separate reference page. If you know of any other resource that should be included, please drop me a note or leave a comment.

Of course, it goes without saying that even the best laid plans go awry, and you’ll want to back up your hard drive before installing MacPorts and any of the applications. I also recommend searching on “MacPorts” and the application name in Google or Yahoo first. You can sometimes find better ways of installing sets of applications, such as Apache2+PHP5+MySQL. If you’re using Leopard, or running on an Intel-based Mac, you’ll also want to double-check that the application works in your environment.

Happy MacPorting.

Categories
Standards SVG XHTML/HTML

Microsoft: Fish, or cut bait

Recovered from the Wayback Machine.

Sam Ruby quotes a comment Microsoft’s Chris Wilson made in another weblog post:

I want to jam standards support into (this and future versions of) Internet Explorer. If a shiv is the only pragmatic tool I can use to do so, shouldn’t I be using it?

Sam responded with an SVG workaround, created using Silverlight–an interesting idea, though imperfect. Emulating one technology using another only works when the two are comparable, and Silverlight and SVG are not. When one specification is proprietary and the other open, there can be no comparison.

There was one sentence of Sam’s that really stood out for me:

You see, I believe that Microsoft’s strategy is sound. Stallstallstall, and generate demanddemanddemand.

Stall, stall, stall, and generate demand, demand, demand. Stalling on standards, creating more demand for proprietary specifications, like Silverlight. Seeing this, how can we be asked to accept, once more, a Microsoft solution and promises that the company will, eventually, deliver standards compliance? An ACID2 picture is not enough. We want the real thing.

Jeffrey Zeldman joins with others in support of the new IE8 meta tag, based on the belief that if Microsoft delivers a standards-based browser with IE8, and companies adopt this browser for internal use, intranets that have been developed specifically to compensate for IE shortcomings will break, and Microsoft will be held liable. According to statements he’s made in comments, heads will roll at Microsoft and standards will be abandoned forever:

…the many developers who don’t understand or care about web standards, and who only test their CSS and scripts in the latest version of IE, won’t opt in, so their stuff will render in IE8 the same way it rendered in IE7.

That sounds bad, but it’s actually good, because it means that their “IE7-tested” sites won’t “break” in IE8. Therefore their clients won’t scream. Therefore Microsoft won’t be inundated with complaints which, in the hands of the wrong director of marketing, could lead to the firing of standards-oriented browser engineers on the IE team. The wholesale firing of standards-oriented developers would jerk IE off the web standards path just when it has achieved sure footing. And if IE were to abandon standards, accessible, standards-compliant design would no longer have a chance. Standards only work when all browsers support them. That IE has the largest market share simply heightens the stakes.

From this we can infer that rather than Pauline, the evil villain (marketing) has standards tied to the railroad tracks, and the locomotive is looming on the horizon. If we ride to the rescue of this damsel in distress, though, what happens in the next version of IE? Or, moving beyond the browser, the next version of any new product that Microsoft puts out that is supposedly ‘open’ or ‘standards-based’? Will we again be faced with the specter that if we rock the boat, those who support standards at Microsoft will face the axe, as standards themselves face the tracks? There’s an ugly word for this type of situation. I don’t think it’s in Microsoft’s best interest if we start using this word, but we will if given no other choice.

If Microsoft really wants to make IE8 work–both for its corporate clients and for the rest of us–in my opinion it needs to do two things.

The first is to accept the HTML5 DOCTYPE as a declaration of intent for full standards compliance. Not just support the DOCTYPE, though: Microsoft has to return to the HTML5/XHTML5 working group and participate in the development of the new standard.

The next step is, to me, the most critical one Microsoft can take: support application/xhtml+xml. In other words, XHTML. XHTML 1.1 has been a released standard for seven years. It has been implemented by Firefox, Safari, and Opera, and a host of other user agents. There is no good reason for Microsoft not to support this specification. More importantly, support for XHTML can also serve as a declaration of intent, in place of the IE8 meta tag.

This is Microsoft meeting us half-way. It gives a little, we give a little. Microsoft can still protect its corporate clients’ intranets, while we continue to protect the future of standards. Not only protect, but begin to advance, because the next specification Microsoft must meet will be support for SVG. Perhaps it can use Silverlight as the engine implementing SVG, as Sam has demonstrated. However, if the company does, it must make this support part of the browser–I’m done with the days of plug-ins just to get a browser to support a five-year-old standard.

Microsoft is asking us to declare our intentions; it’s only fair we ask the same of it. If Microsoft won’t meet us half-way–if the company releases IE8 without support for the HTML5 DOCTYPE or XHTML, and without at least some guarantee as to when we’ll see SVG in IE–then we’ll have our answer. It may not be the answer we want, but it will be the answer we need.

I would rather find out now than at some future time that Microsoft’s support for standards is in name only. At the least, we’ll know, and there will be an end to the stalling.

Categories
Standards

Tyranny of Microsoft

Recovered from the Wayback Machine.

On July 20th, 2000, the Web Standards Project issued an ultimatum to Netscape/Mozilla, saying, in part:

Why are you taking forever to deliver a usable browser? And why, if you are a company that believes in web standards, do you keep Navigator 4 on the market?

If you genuinely realized it would take two years to replace Netscape 4, we wish you would have told us. No market, let alone the Internet, can stand still that long. We would have told you as much.

Continuing to periodically “upgrade” your old browser while failing to address its basic flaws has made it appear that you still consider Navigator 4 viable. It is not. You obviously know that, or you would not be rebuilding from scratch. But keeping your 4.0 browser on the market has forced developers to continue writing bad code in order to support it. Thus, while you tantalize us with the promise of Mozilla and standards, you compel us to ignore standards and write junk code in order keep our sites accessible to the dwindling Netscape 4.0 user base. It’s a lose-lose proposition, on our end and yours.

For the good of the web, it is time to withdraw Navigator 4 from the market, whether Netscape 6 is ready or not. Beyond that, if you hope to remain a player, and if you expect standards advocates to keep cheering you on, you must ship Netscape 6 before its market evaporates – along with the dream of a web based on open standards.

If you succeed now, you will regain some of the trust and market share you have lost. And instead of arguing with your competitors, standards advocates will be able to sit back and watch them try to catch up with your support for XML and the DOM.

If you fail now, the web will essentially belong to a single company. And for once, nobody will be able to blame them for “competing unfairly.” So please, for your own good, and the good of the web, deliver on your promises while Netscape 6 still has the chance to make a difference.

Much of the criticism was based on the fact that Netscape, soon to become Mozilla, was undergoing a massive infrastructure change–a change that eventually led to the Mozilla project we know today, and to products like Firefox and extensions such as Firebug, the Web Developer Toolkit, and so on. The WaSP believed at the time that Netscape should focus on delivering a standards-compliant browser, putting away the foolishness of XUL until some later time.

In response to a posting at Mozillazine, I wrote a comment about ‘tyranny of the standards’, which eventually led to a full article out at O’Reilly under the same title.

My oh my wasn’t I ripped a new one by members of the WaSP and others. Among those who disagreed with me was Jeffrey Zeldman, who wrote in comments:

The author misses two crucial points, I think:

1. The WaSP has never criticized companies for innovating. If Netscape had not innovated JavaScript, the web would be far poorer – and we would not have the ECMAScript standard today. All the WaSP has asked, repeatedly and clearly, is that browser makers not innovate *at the expense of existing standards.* In other words, that they devote resources toward improving their support for baseline technologies such as CSS-1, HTML 4, XML, ECMAScript and the DOM, *BEFORE* creating new, possibly competing technologies.

For example, we have no problem with IE’s table data “bordercolor” attribute, because IE also provides a standard means of accomplishing the same thing via the standard CSS border property, which they’ve supported well since IE4. Designers and developers can choose to design only for IE if they wish (using IE’s proprietary HTML extension), but most will choose to use the standards IE supports. As long as IE supports those common standards, let them innovate all they like. Similarly, we have not criticized XUL because, as Christian Riege points out, XUL does not stand in the way of Mozilla or Netscape 6’s support for DOM1, CSS, and HTML.

As Bill Pena wrote, ” Before adding a blink tag or ActiveX, CSS-positioning should have been implemented. That’s the real problem.” Historically speaking, blink was unleashed on the world before the CSS-1 recommendation was finished, but Bill’s overall point is exactly what we’re talking about.

Browser makers seem to understand this distinction, which we’ve been raising for nearly three years. It is in our mission statement, and we’ve said it time and again in press statements and interviews. Somehow the author of the article missed it. Most web developers and designers have *not* missed this point, and it is the power of their numbers as much as anything else that has enabled WaSP to influence browser makers in the direction of compliance with these baseline standards.

2. The author paints a portrait of browser companies being “forced” to adapt W3C recommendations by an angry lynch mob. This picture, while it adds a certain dramatic weight to the author’s arguments, ignores the reality of the situation.

*Browser makers themselves are largely responsible for creating these technologies.* When Netscape and Microsoft sat down with the W3C and, along with invited experts, came up with recommendations like CSS-1 … and when they then agreed to support these baseline technologies they’d just helped to create … it seemed logical to us that these companies would work to implement the things they’ve mutually invented and agreed to support.

Today, they mainly do exactly that, and it surely has not impeded their ability to innovate. But in 1998, browser makers were driven by market forces to focus on their points of difference, even as these applied to common and much-needed technologies like CSS, HTML and JavaScript. No organized group was around to remind these companies to fulfill the promises they’d made, giving developers and web users a reliable baseline of sophisticated technologies that would enable the web to move forward. In the absence of any unified voice calling out for these obviously-needed technologies, WaSP was born.

We are not a lynch mob; we’re a small, non-profit, volunteer group using the only tool at our disposal — the power of public opinion — to persuade browser makers to fulfill promises they made as long ago as 1996 (in the case of CSS-1). By and large, browser makers have been working to fulfill those promises since they were made aware that their customer base actually cared about and needed these baseline technologies. The WaSP is not the Politburo or the U.S. Congress. Our goal is not to enhance our own power (of which we have none). Our goal is to wither away like the Communist State was supposed to, as soon as browser makers have finished the job of supporting baseline standards, and web developers are actually using these standards in the sites they build.

Cut forward seven years, and Zeldman writes, in response to the planned rollout of the IE8 meta tag:

We knew when we published this issue of A List Apart that it would light a match to the gaseous underbelly of standards-based web design, but we thought more than a handful of readers would respect the parties involved enough to consider the proposal on its merits. Alas, the ingrained dislike of Microsoft is too strong, and the desire to see every site built with web standards is too ardently felt, for the proposal to get a fair viewing.

Today too many sites aren’t semantic, don’t validate, and aren’t designed to specs of the W3C. Idealists think we can change this by “forcing” ignorant developers to get wisdom about web standards. Idealists hope, if sites suddenly display poorly in IE, the developers will want to know why, and will embark on a magical journey of web standards learning

I commend Aaron Gustafson for his courage and intelligence and thank him and his small band of colleagues, and the engineers they worked with at Microsoft, for offering a way forward that keeps web standards front and center in all future versions of IE.

People change over seven years’ time. I know I’ve changed, and have become somewhat fanatical about standards. What changed for me between then and now was a thing called IE6, which lasted forever, and has still not properly been retired by Microsoft.

I’m not the only person to change in that time. Where is the man, where is the Zeldman who argued so passionately for standards long ago? Who used to encourage people to contact web designers and tell them to update their sites to meet standards? Who joined with others in condemning Netscape/Mozilla for working on a new infrastructure, rather than pushing a browser out the door that met standards?

Engulfed by the Blue Monster, evidently.

Today, Molly Holzschlag wrote a post, Me, IE8, and Microsoft Versioning, in which she bemoans the lack of transparency forced onto her, the WaSP team members, and others working with Microsoft.

Open standards must emerge from public, open, bare discussion. Microsoft clearly does not agree with this. It goes against its capitalist cover-up mentality, even when Bill Gates himself has quite adamantly stated that there should be no secrecy around IE8. In fact, he was the one who let the name slip. The fucking name, people! This shows you how ludicrous the lack of communication had become: Gates himself didn’t even know we weren’t allowed to say “IE8.”

This covert behavior is a profound conflict for me as I’m sure readers will at least agree that I’m pretty darned overt by default. But I knew it going in, I just kept and am still keeping my hopes high because that is also my default.

Sometimes the solution is to step back and re-evaluate. Sometimes the solution is to walk away. I haven’t firmed up my personal decisions on that just yet. Maybe it’s time to go back to Old School WaSP-style stinging of MS, but that definitely is not my default.

Can’t we all just get along? No, really. During my time at WaSP, the door was open to a kinder, gentler way. More fool me? So be it. I’m not giving up the greater goal, which is keeping the Web open, free, naked, bare-assed to the world.

To Molly’s post, I wrote a still-moderated comment:

There was another option for you and Aaron and the other people who found Microsoft’s silence so disturbing: you could have quit.

You could have pulled out of the discussions in no uncertain terms and let them know they were making mistakes. You could have used the reasons for your leaving to demonstrate to Microsoft the strength of your convictions.

Bill Gates is first and foremost a poker player. This one significant aspect of his personality has influenced Microsoft from the very beginning. How does the song go? “You’ve got to know when to hold them, know when to fold them, know when to walk away, and know when to run.”

Members of WaSP should never have allowed themselves to be pulled into such a NDA’d discussion.

There are two things wrong with all of this.

First, the fact that we, who strive to create standards-compliant pages, are the ones who have to change our pages in order to make them work with IE8 is, frankly, ludicrous. Leaving aside all of the issues brought up by other people, the idea that the way forward is to have the sites created by people who do things right be the ones to break, rather than the sites created by people who do things wrong, because we’re supposedly the better informed, is ridiculous. It sets a precedent for mediocrity. It signals to agents such as browser makers that they no longer have to worry about those little side trips into proprietary technologies while standards support languishes because, you know, the web will be waiting here for them whenever they decide to remember we exist.

More importantly, I’m seeing too many people supporting this tag because they believe that if Microsoft receives complaints from people whose sites are breaking, the company will fire its standards staff, go its own way, and all standards development will be lost forever.

I don’t know what they call this in Zeldmanland, but where I come from it’s called extortion and blackmail. It is equivalent to saying Microsoft owns the web. Well, we all know that’s not true–Google owns the web.

Secondly, this new tag came about because of closed-door meetings under NDA with Microsoft, involving members of the WaSP and others whom we have come to respect in this industry, such as Molly, PPK, Zeldman, and Eric Meyer. People who have made their names, and their careers, based on support for standards. People who are now finding out that respect in the past does not translate into blind obedience in the future.