Categories
Government Standards

Corporate food production interests yank the chains of Congress

Second update

The House just can’t wait to pass this bill. It goes to the Floor on Thursday. Note: there is no comparable bill in the Senate.

stirring up a batter of trouble

Update

In the ultimate of ironies, the Senate passed an amendment to its appropriations bill that would require genetically modified salmon to be given a GMO label. How to explain the inconsistency?

Sen. Lisa Murkowski (R., Alaska)…downplayed concerns that salmon labeling would set a precedent for labeling biotech crops saying, “Corn doesn’t swim from one field to another and propagate with corn in another state. Fish move. Fish escape,” she said.

No, no. No one has ever heard of pollen floating on the breeze and contaminating organic crops.

earlier
How can you tell if Vermont will prevail in the lawsuit filed against its new GMO labeling law, Act 120? Easy: Congress decides to create a national anti-GMO labeling law. More on this in a moment. First, though, a recap on the court challenge.

In April, Judge Christina Reiss issued a decision denying in part and granting in part Vermont’s motion to dismiss, and denying, outright, the plaintiffs’ motion for a preliminary injunction.* The latter means that, when you consider how speedily civil cases of this nature proceed through the court system, Vermont’s GMO labeling law will be able to go into effect in 2016.

The Judge quickly dismissed the dormant Commerce Clause challenge to the GMO labeling requirement. After all, the basis for such a challenge is that a state law discriminates against out-of-state interests, and Vermont’s law applies to in-state as well as out-of-state interests. The decision also reflects a growing push-back against the application of the dormant Commerce Clause, possibly mirroring the Supreme Court’s own ambivalence about its application. I particularly liked the Judge noting that Vermont’s GMO labeling law won’t lead to a “patchwork of state laws”, because no other state has implemented a GMO labeling law, and hence no inconsistency is introduced by Vermont’s law.

The Judge did feel that the plaintiffs’ claim about the law’s restriction on the use of “natural” on labels was strong enough to warrant denying Vermont’s request to dismiss the Commerce Clause challenge related to it. Yeah, that was one provision Vermont would have done best to just leave out of the GMO law.

In my original writing on the law, and in the legal analysis from Lauren Handel, we felt the strongest challenge to the Vermont law was the Supremacy Clause claim: whether the law is expressly preempted by the labeling requirements in the FMIA (Federal Meat Inspection Act) and the PPIA (Poultry Products Inspection Act). The FDA’s FDCA and NLEA are both quite amenable to state labeling requirements, so they aren’t really a challenge. The FMIA and PPIA, however, do have strict labeling requirements, and do assert federal authority over said labels.

Vermont was aware of this, and built into Act 120 exemptions related to meat and meat products, which should encompass those products covered under the FMIA and PPIA. Where we felt there was the possibility of conflict was with a product like soup. Soup is a manufactured product and, we assume, would be covered by Vermont’s Act 120. Soup can either contain meat products or not. If the meat content exceeds 3% raw or 2% cooked meat, it’s regulated by the USDA; otherwise, it’s regulated by the FDA. This soup conundrum reflects the truly mish-mash nature of food safety oversight in the US.

Since Campbell’s is part of the group suing Vermont, I fully expected soup to raise its head at some point. If it did, though, it quickly ducked. According to Judge Reiss’ decision:

In opposing dismissal and seeking preliminary injunctive relief, Plaintiffs narrow their FMIA and PPIA preemption claims to argue that some GE food products that contain meat, poultry, and eggs which do not fall within Act 120’s exemption for products “consisting entirely of or derived entirely from an animal,” 9 V.S.A. § 3044(1), are regulated for labeling purposes by the FMIA or the PPIA. They identify canned meat and poultry products and pre-made frozen meals containing meat or poultry as examples of products that fall within both statutory frameworks. In their Amended Complaint and declarations, however, Plaintiffs fail to identify even one of their members who produces a non-exempt GE food product that is covered by the FMIA or PPIA.

In other words, something like chicken noodle soup is either exempt under the Vermont law, or isn’t a food product covered by the FMIA or PPIA. According to the FSIS guidelines:

Although FSIS has jurisdictional authority over food labeling for products containing meat and poultry, the FMIA and the PPIA explicitly authorize USDA (through FSIS) to exempt from its regulatory coverage food products which contain meat or poultry “only in relatively small portion or historically have not been considered by consumers as products of the meat food industry …

Soup is, typically, not considered a product of the meat industry, no matter how much meat it contains. And let’s face it: most canned soups really aren’t brimming with meat.

If there is no product that is both non-exempt under Vermont’s Act 120 and governed by the FMIA or PPIA, the plaintiffs can’t establish standing for this particular challenge. The only reason the Judge did not dismiss the preemption challenge outright is that the plaintiffs argued there may be small food producers making such a product who haven’t been identified yet.

We can only imagine food producers all over the country working late into the night, trying to create and market some product that falls into the infinitely tiny crack that may exist between the Act 120 exemptions and FMIA and PPIA governance.

Judge Reiss then took on the First Amendment challenge to Act 120. The plaintiffs claimed Act 120 violates corporate freedom of speech because it is “a politically motivated speech regulation”—it compels political speech. Well, this is just plain rubbish. The Judge agreed, though more tactfully:

A manufacturer who is required to disclose whether its products contain certain ingredients is not compelled to make a political statement even if such a statement “links a product to a current public debate” because “many, if not most, products may be tied to public concerns with the environment, energy, economic policy, or individual health and safety.”

The more compelling challenge related to freedom of speech was whether Act 120’s disclosure requirement is nothing more than a satisfaction of consumer curiosity. This is what torpedoed Vermont’s earlier statute on labeling milk containing recombinant Bovine Somatotropin (“rBST”), also known as recombinant Bovine Growth Hormone (“rBGH”). Unlike that statute, however, Act 120 did address the debate about the safety of GMO products, in addition to other factors:

Act 120’s “Findings” and “Purpose” extend beyond the mere appeasement of consumer curiosity, and the State emphasizes that it is not making the concessions it made in IDFA. It cites to what it characterizes as an ample legislative record documenting the scientific debate about the safety of GE ingredients and the studies that have produced positive, negative, and neutral results. This record includes studies about the safety of consuming GE plant-based foods, as well as studies about the environmental impacts of GE and GE crops. The State also points to its interest in accommodating religious beliefs about GE, as well as its interest in providing factual information for purposes of informed consumer decision-making.

The Judge did feel that whether Act 120 survives intermediate scrutiny under the First Amendment is a question of law that should be argued at the court hearing on the case. Therefore, Vermont’s motion to dismiss this challenge was denied. However, the Judge also felt that the plaintiffs were unlikely to prevail on this challenge in court, and their request for a preliminary injunction was denied.

Judge Reiss wrote a long, thoughtful, and careful decision. Though the plaintiffs’ case was not dismissed outright, many of its challenges were dismissed, or had doubt cast on their viability. And that leads us to HR 1599, the so-called Safe and Accurate Food Labeling Act, which just advanced from committee to the House floor. How can you tell if Vermont will prevail in the lawsuit filed against its new GMO labeling law, Act 120? Easy: Congress decides to create a national anti-GMO labeling law.

This bill seeks to preemptively undercut Vermont’s Act 120 before it has a chance to take effect. Many of its proponents are people who consider themselves tried-and-true “states’ rights” advocates…well, up until a northern state like Vermont passes a bill that goes counter to select interests in their states. Can’t have them uppity Northerners telling nice southern and midwestern corporate boys what to do, no sirree.

Regardless of your stance on GMOs and labeling, the bill should give you pause, because it seeks to use Congress to bypass a state statute that reflects the interests of the people of the state and that has, so far, withstood a constitutional challenge.

The latter point is important. Vermont’s Act 120 isn’t seeking to prevent gays from marrying or women from having access to abortion. It’s a statute impacting commerce that ensures additional information is provided to consumers. More importantly, it’s a statute that has not failed in the courts—has not been proven unconstitutional.

It has long been the right of states to impose stricter restrictions on commerce, particularly commerce related to food production, as long as such restrictions don’t unfairly impact out-of-state interests. Revoking this right because corporate agricultural interests aren’t happy about disclosing certain information is the proverbial slippery slope to undermining other state laws related to food production and safety.

Want to drink raw milk? You can in states that allow it, but not in states that don’t. This could easily change, though, if the raw milk dairies had enough influence in Congress. Want to allow cottage industries to sell meat products or other long-restricted food items? Again, no problem…if the industries have enough influence.

Of course, that’s the real key, isn’t it? These other industries don’t have the power to bring about change at the Congressional level, and that’s not a bad thing. But the GMO labeling law impacts the very powerful, very wealthy, and very influential chemical, biotech, and food manufacturing interests, and therefore this particular state law triggers Congressional action. And it does so not in the interests of the consumer—it is a deliberate attempt to withhold information from the consumer. Only the powerful benefit from this bill.

Regardless of your views on GMO labeling, you must deplore such an obvious act of buying Congress.

The biotech, chemical, food manufacturing, et al, interests have their chance in the courts. Our Constitution gives them that chance. They have the ability to bring their best arguments to the table and defeat Act 120…in the courts. With this House bill, they chose not to do so. Instead, they’re putting pressure on Congress, and Congress is allowing them to. It’s a dirty move, and it is no less dirty because you may not agree with GMO labeling.

* The plaintiffs have filed an appeal related to the denial of a preliminary injunction, and asked for expedited handling of the appeal. This request has been granted, with back and forth filings due by September 8th.

Categories
RDF Standards XHTML/HTML

A Loose Set of Notes on RDFa, XHTML, and HTML5

There’s been a great deal of discussion about RDFa, HTML5, and microdata the last few days, on email lists and elsewhere. I wanted to write down notes on the discussions here, for future reference. Those working on RDFa issues in Drupal 7 should pay particular attention, but the material is relevant to anyone incorporating RDFa.

Shane McCarron released a proposal for RDFa in HTML4, which is based on creating a DTD that extends HTML4 to support RDFa. He does address some issues related to the differences in how certain data is handled in HTML4 and XHTML, but for the most part his document refers processing issues to the original RDFa Syntax document.

Philip Taylor responded with some questions, specifically about how xml:lang is handled by HTML5 parsers, as compared to XML parsers. His second concern was how to handle XMLLiteral in HTML5, because the assumption is that RDFa extractors in JavaScript would be getting their data from the DOM, not processing the characters in the page.

“If the object of a triple would be an XMLLiteral, and the input to the processor is not well-formed [XML]” – I don’t understand what that means in an HTML context. Is it meant to mean something like “the bytes in the HTML file that correspond to the contents of the relevant element could be parsed as well-formed XML (modulo various namespace declaration issues)”? If so, that seems impossible to implement. The input to the RDFa processor will most likely be a DOM, possibly manipulated by the DOM APIs rather than coming straight from an HTML parser, so it may never have had a byte representation at all.

There’s a lively little sub-thread related to this one issue, but the one response I’ll focus on is Shane’s, who replied that RDFa does not presuppose a processing model in which there is a DOM. The issue of xml:lang is also still under discussion, but I want to move on to new issues.

While the discussion related to Shane’s document was ongoing, Philip released his own first look at RDFa in HTML5. Concern was immediately expressed about Philip’s copying of some of Shane’s material in order to create a new processing rules section. The concern wasn’t because of any issue to do with copyright, but because of the problems that can occur when you have two sets of processing rules for the same data and the same underlying data model. No matter how careful you are, at some point the two are likely to diverge, and the underlying data model becomes corrupted.

Rather than spend time on Philip’s specification directly at this time, I want to focus, instead, on a note he attached to the email entry providing the link to the spec proposal. In it he wrote:

There are several unresolved design issues (e.g. handling of case-sensitivity, use of xmlns:* vs other mechanisms that cause fewer problems, etc) – I haven’t intended to make any decisions on such issues, I’ve just attempted to define the behaviour with sufficient detail that it should make those issues visible.

More on case sensitivity in a moment.

Discussion started a little more slowly for Philip’s document, but is ongoing. In addition, both Philip and Manu Sporny released test suites. Philip’s is focused on highlighting problems when parsing RDFa in HTML as compared to XHTML; the one that Manu posted, created by Shane, focused on a basic set of test cases for RDFa generally, but migrated into the RDFa in HTML4 document space.

Returning to Philip’s issue with case sensitivity, I took one of Shane’s RDFa in HTML test cases and the rdfquery JavaScript from Philip’s test suite, and created pages demonstrating the case sensitivity issue. One such page is the following:

<!DOCTYPE HTML PUBLIC "-//ApTest//DTD HTML4+RDFa 1.0//EN" "http://www3.aptest.com/standards/DTD/html4-rdfa-1.dtd">
<html
xmlns:t="http://test1.org/something/"
xmlns:T="http://test2.org/something/"
xmlns:dc="http://purl.org/dc/elements/1.1/">
<head>
<title>Test 0011</title>
</head>
<body>
<div about="">
Author: <span property="dc:creator t:apple T:banana">Albert Einstein</span>
<h2 property="dc:title">E = mc<sup>2</sup>: The Most Urgent Problem of Our Time</h2>
</div>
</body>
</html>

Notice the two namespace declarations, one for “t” and one for “T”. Both are used to provide properties for the object being described in the document: t:apple and T:banana. An RDFa application that applies XML rules treats the prefixes “t” and “T” as two different namespaces, and has no problem with the RDFa annotation.

However, using the rdfquery JavaScript library, which treats “t” and “T” the same because of HTML case insensitivity, an exception results: Malformed CURIE: No namespace binding for T in CURIE T:banana. Stripping away the RDFa aspects and focusing on the namespaces, you can see how browsers handle namespace case in an HTML document and in a document served up as XHTML. To make matters more interesting, check out the two pages using Opera 10, Firefox 3.5, and the latest Safari. Opera preserves the prefix case, while both Safari and Firefox lowercase the prefix. Even within the HTML world, the browsers handle namespace case in HTML differently. However, all handle the prefixes the same, and correctly, in XHTML. So does the rdfquery JavaScript library, as this test page demonstrates.
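If you want to see what a parser actually did with those declarations, here’s a minimal script of my own (not part of Shane’s or Philip’s test suites) that can be dropped into the body of the test page above. It simply lists the xmlns attributes that survived parsing, assuming the browser exposes the declarations through the root element’s attributes collection:

<script type="text/javascript">
// List the xmlns:* declarations the parser kept on the root element.
// Parsed as text/html, most parsers fold attribute names to lowercase,
// so xmlns:t and xmlns:T collide and only one binding survives; parsed
// as XML (application/xhtml+xml), both prefixes survive as distinct
// attributes.
var out = [];
var attrs = document.documentElement.attributes;
for (var i = 0; i < attrs.length; i++) {
  if (attrs[i].name.indexOf("xmlns:") === 0) {
    out.push(attrs[i].name + " = " + attrs[i].value);
  }
}
alert(out.length ? out.join("; ") : "no xmlns:* attributes found");
</script>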

Returning to the discussion, there is some back and forth on how to handle case sensitivity issues related to HTML, with suggestions varying as widely as: tossing the RDFa in XHTML spec out and creating a new one; tossing RDFa out in favor of microdata; creating a best practices document that details the problem and provides appropriate warnings; or creating a new RDFa in HTML document (or modifying the existing profile document) specifying that all conforming applications must treat prefix names as case insensitive in HTML (possibly cross-referencing the RDFa in XHTML document, which allows case sensitive prefixes). I am not in favor of the first two options. I do favor the latter two options, though I think the best practices document should strongly recommend using lowercase prefix names, and definitely not using two prefixes that differ only by case. A sketch of what the last option might look like follows. During the discussion, a new conforming RDFa test case was proposed that tests based on case. This has now started its own discussion.
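To make that last option concrete, here’s a rough sketch of my own of what case-insensitive prefix handling might look like inside an HTML RDFa processor. The function and the bindings structure are hypothetical, not from any of the proposals:

<script type="text/javascript">
// Hypothetical sketch: resolve a CURIE against the in-scope prefix
// bindings, folding prefix case only when processing text/html.
function resolveCurie(curie, bindings, isHTML) {
  var colon = curie.indexOf(":");
  var prefix = curie.substring(0, colon);
  var local = curie.substring(colon + 1);
  if (isHTML) {
    // HTML mode: prefix names are compared case-insensitively
    prefix = prefix.toLowerCase();
  }
  if (!bindings[prefix]) {
    throw new Error("Malformed CURIE: No namespace binding for " + prefix);
  }
  return bindings[prefix] + local;
}

// In HTML mode, t:apple and T:banana resolve against the same binding:
var bindings = { t: "http://test1.org/something/" };
alert(resolveCurie("T:banana", bindings, true));
// -> http://test1.org/something/banana
</script>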

I think the problem of case and namespace prefixes (not to mention xmlns as compared to XMLNS) is very much an edge issue, not a show stopper. However, until a solution is formalized, be aware that xmlns prefix case is handled differently in XHTML and HTML. Since all other things are equal, consider using only lowercase prefixes when embedding RDFa (or any other namespace-based functionality). In addition, do not use XMLNS. Ever. If not for yourself, do it for the kittens.

Speaking of RDFa in HTML issues, there is now a new RDFa in HTML issues wiki page. Knock yourselves out.

Update

A new version of the RDFa in HTML4 profile has been released. It addresses some of the concerns expressed earlier, including the issue of case and XMLLiteral. Though HTML5 doesn’t support DTDs, as HTML4 does, the conformance rules should still be good for HTML5.

Categories
Standards SVG XHTML/HTML

Microsoft: Fish, or cut bait

Recovered from the Wayback Machine.

Sam Ruby quotes a comment Microsoft’s Chris Wilson made in another weblog post:

I want to jam standards support into (this and future versions of) Internet Explorer. If a shiv is the only pragmatic tool I can use to do so, shouldn’t I be using it?

Sam responded with an SVG workaround, created using Silverlight–an interesting idea, though imperfect. Emulating one technology/specification using another only works when the two are comparable, and Silverlight and SVG are not comparable: when one specification is proprietary and the other open, there can be no comparison.

There was one sentence of Sam’s that really stood out for me:

You see, I believe that Microsoft’s strategy is sound. Stallstallstall, and generate demanddemanddemand.

Stall, stall, stall, and generate demand, demand, demand. Stalling on standards, creating more demand for proprietary specifications like Silverlight. Seeing this, how can we be asked to accept, once more, a Microsoft solution and promises that the company will, eventually, deliver standards compliance? An Acid2 picture is not enough. We want the real thing.

Jeffrey Zeldman joins with others in support of the new IE8 meta tag, based on the belief that if Microsoft delivers a standards-based browser with IE8, and companies adopt this browser for internal use, intranets that have been developed specifically to compensate for IE shortcomings will break, and Microsoft will be held liable. According to statements he’s made in comments, heads will roll at Microsoft and standards will be abandoned forever:

…the many developers who don’t understand or care about web standards, and who only test their CSS and scripts in the latest version of IE, won’t opt in, so their stuff will render in IE8 the same way it rendered in IE7.

That sounds bad, but it’s actually good, because it means that their “IE7-tested” sites won’t “break” in IE8. Therefore their clients won’t scream. Therefore Microsoft won’t be inundated with complaints which, in the hands of the wrong director of marketing, could lead to the firing of standards-oriented browser engineers on the IE team. The wholesale firing of standards-oriented developers would jerk IE off the web standards path just when it has achieved sure footing. And if IE were to abandon standards, accessible, standards-compliant design would no longer have a chance. Standards only work when all browsers support them. That IE has the largest market share simply heightens the stakes.

From this we can infer that, rather than Pauline, it is standards that the evil villain (marketing) has tied to the railroad tracks, with the locomotive looming on the horizon. If we ride to the rescue of this damsel in distress, though, what happens with the next version of IE? Or, moving beyond the browser, the next version of any new product Microsoft puts out that is supposedly ‘open’ or ‘standards-based’? Will we again be faced with the specter that, if we rock the boat, those who support standards at Microsoft will face the axe, as standards themselves face the tracks? There’s an ugly word for this type of situation. I don’t think it’s in Microsoft’s best interest if we start using this word, but we will if given no other choice.

If Microsoft really wants to make IE8 work–both for its corporate clients and for the rest of us–in my opinion it needs to do two things.

The first is to accept the HTML5 DOCTYPE as a declaration of intention for full standards compliance. Not just support the DOCTYPE, though: Microsoft has to return to the HTML5/XHTML5 working group and participate in the development of the new standard.
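For reference, the HTML5 DOCTYPE is the short, versionless declaration:

<!DOCTYPE html>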

The next step is, to me, the most critical Microsoft can take: support application/xhtml+xml. In other words, XHTML. XHTML 1.1 has been a released standard for seven years. It’s been implemented by Firefox, Safari, Opera, and a host of other user agents. There is no good reason for Microsoft not to support this specification. More importantly, support for XHTML can also be used as a declaration of intentions, in place of the IE8 meta tag.
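For those who haven’t worked with it, a minimal XHTML 1.1 document looks like the following. The catch is that it must be served with the application/xhtml+xml MIME type, which is exactly what IE currently refuses to handle:

<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
    "http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en">
<head>
<title>A minimal XHTML 1.1 document</title>
</head>
<body>
<p>Served as application/xhtml+xml, not text/html.</p>
</body>
</html>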

This is Microsoft meeting us half-way. It gives a little, we give a little. Microsoft can still protect its corporate clients’ intranets, while we continue to protect the future of standards. Not only protect, but begin to advance, because the next specification Microsoft must meet is support for SVG. Perhaps it can use Silverlight as the engine implementing SVG, as Sam has demonstrated. However, if the company does, it must make this support part of the browser–I’m done with the days of plug-ins just to get a browser to support a five-year-old standard.

Microsoft is asking us to declare our intentions, it’s only fair we ask the same of it. If Microsoft won’t meet us half-way–if the company releases IE8 without support for the HTML5 DOCTYPE or XHTML, and without at least some guarantee as to when we’ll see SVG in IE–then we’ll have our answer. It may not be the answer we want, but it will be the answer we need.

I would rather find out now than at some future time that Microsoft’s support for standards is in name only. At the least, we’ll know, and there will be an end to the stalling.

Categories
Standards

Tyranny of Microsoft

Recovered from the Wayback Machine.

On July 20th, 2000, the Web Standards Project issued an ultimatum to Netscape/Mozilla, saying, in part:

Why are you taking forever to deliver a usable browser? And why, if you are a company that believes in web standards, do you keep Navigator 4 on the market?

If you genuinely realized it would take two years to replace Netscape 4, we wish you would have told us. No market, let alone the Internet, can stand still that long. We would have told you as much.

Continuing to periodically “upgrade” your old browser while failing to address its basic flaws has made it appear that you still consider Navigator 4 viable. It is not. You obviously know that, or you would not be rebuilding from scratch. But keeping your 4.0 browser on the market has forced developers to continue writing bad code in order to support it. Thus, while you tantalize us with the promise of Mozilla and standards, you compel us to ignore standards and write junk code in order keep our sites accessible to the dwindling Netscape 4.0 user base. It’s a lose-lose proposition, on our end and yours.

For the good of the web, it is time to withdraw Navigator 4 from the market, whether Netscape 6 is ready or not. Beyond that, if you hope to remain a player, and if you expect standards advocates to keep cheering you on, you must ship Netscape 6 before its market evaporates – along with the dream of a web based on open standards.

If you succeed now, you will regain some of the trust and market share you have lost. And instead of arguing with your competitors, standards advocates will be able to sit back and watch them try to catch up with your support for XML and the DOM.

If you fail now, the web will essentially belong to a single company. And for once, nobody will be able to blame them for “competing unfairly.” So please, for your own good, and the good of the web, deliver on your promises while Netscape 6 still has the chance to make a difference.

Much of the criticism was based on the fact that Netscape, soon to become Mozilla, was undergoing a massive infrastructure change–a change that eventually led to the Mozilla project we know today, to products like Firefox, and to extensions such as Firebug, the Web Developer Toolkit, and so on. The WaSP believed at the time that Netscape should focus on delivering a standards-compliant browser, putting away the foolishness of XUL until some later time.

In response to a posting at Mozillazine, I wrote a comment about the ‘tyranny of the standards’, which eventually led to a full article at O’Reilly under the same title.

My oh my wasn’t I ripped a new one by members of the WaSP and others. Among those who disagreed with me was Jeffrey Zeldman, who wrote in comments:

The author misses two crucial points, I think:

1. The WaSP has never criticized companies for innovating. If Netscape had not innovated JavaScript, the web would be far poorer – and we would not have the ECMAScript standard today. All the WaSP has asked, repeatedly and clearly, is that browser makers not innovate *at the expense of existing standards.* In other words, that they devote resources toward improving their support for baseline technologies such as CSS-1, HTML 4, XML, ECMAScript and the DOM, *BEFORE* creating new, possibly competing technologies.

For example, we have no problem with IE’s table data “bordercolor” attribute, because IE also provides a standard means of accomplishing the same thing via the standard CSS border property, which they’ve supported well since IE4. Designers and developers can choose to design only for IE if they wish (using IE’s proprietary HTML extension), but most will choose to use the standards IE supports. As long as IE supports those common standards, let them innovate all they like. Similarly, we have not criticized XUL because, as Christian Riege points out, XUL does not stand in the way of Mozilla or Netscape 6’s support for DOM1, CSS, and HTML.

As Bill Pena wrote, ” Before adding a blink tag or ActiveX, CSS-positioning should have been implemented. That’s the real problem.” Historically speaking, blink was unleashed on the world before the CSS-1 recommendation was finished, but Bill’s overall point is exactly what we’re talking about.

Browser makers seem to understand this distinction, which we’ve been raising for nearly three years. It is in our mission statement, and we’ve said it time and again in press statements and interviews. Somehow the author of the article missed it. Most web developers and designers have *not* missed this point, and it is the power of their numbers as much as anything else that has enabled WaSP to influence browser makers in the direction of compliance with these baseline standards.

2. The author paints a portrait of browser companies being “forced” to adapt W3C recommendations by an angry lynch mob. This picture, while it adds a certain dramatic weight to the author’s arguments, ignores the reality of the situation.

*Browser makers themselves are largely responsible for creating these technologies.* When Netscape and Microsoft sat down with the W3C and, along with invited experts, came up with recommendations like CSS-1 … and when they then agreed to support these baseline technologies they’d just helped to create … it seemed logical to us that these companies would work to implement the things they’ve mutually invented and agreed to support.

Today, they mainly do exactly that, and it surely has not impeded their ability to innovate. But in 1998, browser makers were driven by market forces to focus on their points of difference, even as these applied to common and much-needed technologies like CSS, HTML and JavaScript. No organized group was around to remind these companies to fulfill the promises they’d made, giving developers and web users a reliable baseline of sophisticated technologies that would enable the web to move forward. In the absence of any unified voice calling out for these obviously-needed technologies, WaSP was born.

We are not a lynch mob; we’re a small, non-profit, volunteer group using the only tool at our disposal — the power of public opinion — to persuade browser makers to fulfill promises they made as long ago as 1996 (in the case of CSS-1). By and large, browser makers have been working to fulfill those promises since they were made aware that their customer base actually cared about and needed these baseline technologies. The WaSP is not the Politburo or the U.S. Congress. Our goal is not to enhance our own power (of which we have none). Our goal is to wither away like the Communist State was supposed to, as soon as browser makers have finished the job of supporting baseline standards, and web developers are actually using these standards in the sites they build.

Cut forward seven years, and Zeldman writes, in response to the planned rollout of the IE8 meta tag:

We knew when we published this issue of A List Apart that it would light a match to the gaseous underbelly of standards-based web design, but we thought more than a handful of readers would respect the parties involved enough to consider the proposal on its merits. Alas, the ingrained dislike of Microsoft is too strong, and the desire to see every site built with web standards is too ardently felt, for the proposal to get a fair viewing.

Today too many sites aren’t semantic, don’t validate, and aren’t designed to specs of the W3C. Idealists think we can change this by “forcing” ignorant developers to get wisdom about web standards. Idealists hope, if sites suddenly display poorly in IE, the developers will want to know why, and will embark on a magical journey of web standards learning

I commend Aaron Gustafson for his courage and intelligence and thank him and his small band of colleagues, and the engineers they worked with at Microsoft, for offering a way forward that keeps web standards front and center in all future versions of IE.

People change over seven years’ time. I know I’ve changed, and have become somewhat fanatical about standards. What changed for me between then and now was a thing called IE6, which lasted forever, and has still not properly been retired by Microsoft.

I’m not the only person to change in that time. Where is the man, where is the Zeldman who argued so passionately for standards long ago? Who used to encourage people to contact web designers and tell them to update their sites to meet standards? Who joined with others in condemning Netscape/Mozilla for working on a new infrastructure, rather than pushing a browser out the door that met standards?

Engulfed by the Blue Monster, evidently.

Today, Molly Holzschlag wrote a post, Me, IE8, and Microsoft Versioning, in which she bemoans the lack of transparency forced on her, the WaSP team members, and others working with Microsoft.

Open standards must emerge from public, open, bare discussion. Microsoft clearly does not agree with this. It goes against its capitalist cover-up mentality, even when Bill Gates himself has quite adamantly stated that there should be no secrecy around IE8. In fact, he was the one who let the name slip. The fucking name, people! This shows you how ludicrous the lack of communication had become: Gates himself didn’t even know we weren’t allowed to say “IE8.”

This covert behavior is a profound conflict for me as I’m sure readers will at least agree that I’m pretty darned overt by default. But I knew it going in, I just kept and am still keeping my hopes high because that is also my default.

Sometimes the solution is to step back and re-evaluate. Sometimes the solution is to walk away. I haven’t firmed up my personal decisions on that just yet. Maybe it’s time to go back to Old School WaSP-style stinging of MS, but that definitely is not my default.

Can’t we all just get along? No, really. During my time at WaSP, the door was open to a kinder, gentler way. More fool me? So be it. I’m not giving up the greater goal, which is keeping the Web open, free, naked, bare-assed to the world.

To Molly’s post, I wrote a still-moderated comment:

There was another option for you and Aaron and the other people who found Microsoft’s silence so disturbing: you could have quit.

You could have pulled out of the discussions in no uncertain terms and let them know they were making mistakes. You could have used the reasons for your leaving to demonstrate to Microsoft the strength of your convictions.

Bill Gates is first and foremost a poker player. This one significant aspect of his personality has influenced Microsoft from the very beginning. How does the song go? “You’ve got to know when to hold them, know when to fold them, know when to walk away, and know when to run.”

Members of WaSP should never have allowed themselves to be pulled into such a NDA’d discussion.

Two things wrong about all of this.

First, the fact that we, who strive to create standards-compliant pages, are the ones who have to change our pages in order to make them work with IE8 is, frankly, ludicrous. Leaving aside all of the issues brought up by other people, the idea that the way forward is for the sites created by people who do things right to break, rather than the sites created by people who do things wrong, because we’re supposedly the better informed, is ridiculous. It sets a precedent for mediocrity. It signals to agents such as browser makers that they no longer have to worry about those little side trips into proprietary technologies while standards support languishes because, you know, the web will be waiting here for them whenever they decide to remember we exist.

More importantly, I’m seeing too many people who are supporting this tag, doing so because they believe if Microsoft receives complaints from people that their sites are breaking, the company will fire their standards staff and go its own way and all of standards development will be lost, forever.

I don’t know what they call this in Zeldmanland, but where I come from it’s called extortion and blackmail. It is equivalent to saying Microsoft owns the web. Well, we all know that’s not true–Google owns the web.

Secondly, this new tag came about because of closed-door meetings under NDA with Microsoft, involving members of the WaSP and others whom we have come to respect in this industry, such as Molly, PPK, Zeldman, and Eric Meyer. People who have made their names, and their careers, based on support for standards. People who are now finding out that respect in the past does not translate into blind obedience in the future.

Categories
Standards

Bobbing heads and the IE8 meta tag

Recovered from the Wayback Machine.

I was astonished to read the A List Apart article Beyond DOCTYPE: Web Standards, Forward Compatibility, and IE8, and even more astonished to read the compliant responses from Eric Meyer, Molly Holzschlag, and the WaSP organization.

How the mighty have fallen is so very cliché but, oh, how appropriate.

According to Aaron Gustafson, who wrote the ALA article, the plan is that, rather than depend on the DOCTYPE to trigger quirks and standards mode for page rendering–a necessity first generated by Microsoft’s IE5/Mac, by the way–we all add a meta tag to our pages that locks the page into a specific browser rendering. For instance, the following would lock a page into IE8 rendering:

<meta http-equiv="X-UA-Compatible" content="IE=8" />

IE will then render the page in some form of IE8-compliant mode. Needless to say, the old wish for progressive enhancement, where we design our pages to work with released standards, ensuring that they’ll be future-proof, must be abandoned along the road:

As much as it pains me to lose this particular aspect of progressive enhancement, this behavior is honestly the best thing that could happen, especially when the site concerned is public-facing. After all, we shouldn’t make assumptions about how browsers will behave in the future. If a change in IE9 would break the layout of our site or the functionality of one of our scripts, that could be disastrous for our users, sending our team into a mad scramble to “fix” the website that was working fine before the new browser launched (which is pretty much the boat we’re in now). Version targeting gives our team the ability to decide when to offer support for a new browser and, more importantly, gives us the much-needed time to make any adjustments necessary to introduce support for that new browser version.

I would say that if a change in IE9 would break our standards-based pages, the problem lies with IE, not the pages. The whole point of standards is that by using them we ensure a consistency of access for our pages, now and in the future. When a browser states it supports CSS 2.1 or XHTML 1.1, we know what to expect. Obviously support for standards is not important or part of any plan for Microsoft. Indeed, it would seem that Microsoft has, by supporting (encouraging, funding) this concept, decided to maintain its own path from now into the future, smug in the assurance that it will always manage to lock people into using IE. Frankly, I’m not surprised at Microsoft, but I have to wonder at WaSP, ALA, et al.
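In fairness, the ALA article does describe an escape hatch: an edge keyword that tells IE to always use its latest rendering engine, which is the closest thing to the old future-proof behavior. The article itself discourages its use, which tells you everything about where the defaults lie:

<meta http-equiv="X-UA-Compatible" content="IE=edge" />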

This new meta tag is not a browser switch, according to PPK, who writes:

A browser detect is a piece of JavaScript or server side code written to parse the user agent string of a browser and take action based on the results of that parsing—typically by denying users of the “wrong” browser access to a page.

The new versioning switch does something completely different. In IE, it starts up a certain mode, such as Quirks Mode, Strict Mode, or the upcoming IE8 mode. In all other browsers it does nothing, since these browsers are not programmed to recognise the meta tag.

Therefore, if a non-IE browser encounters the switch, nothing happens. The browser ignores the meta tag, reads the HTML, CSS, and JavaScript as always, and allows its own rendering engine to interpret it.

In other words, the versioning switch does not have any of the negative effects of a browser detect.

There’s a second difference: the versioning switch is a contract. The IE team tells people what will happen if they insert the meta tag in their pages, and it’s up to individual web developers to decide whether they want to use this contract or not.

Bully for Microsoft. I used to think commitment to standards was a contract. Evidently, my interpretation was incorrect. How gauche of me.
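Contract or not, at least the mode IE picks is supposed to be visible to script: IE8 reportedly exposes it via document.documentMode. A small sketch, assuming that property behaves as described:

<script type="text/javascript">
// document.documentMode reportedly returns the engine IE selected:
// 8 for IE8 standards mode, 7 for IE7 emulation, 5 for quirks mode.
// Browsers without the property either aren't IE, or are IE7 or
// earlier; both ignore the versioning switch entirely.
if (typeof document.documentMode != "undefined") {
  alert("IE rendering mode: " + document.documentMode);
} else {
  alert("No documentMode property: the X-UA-Compatible switch " +
        "selects nothing here.");
}
</script>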

In comments at the IE blog, James Schwarzmeier wrote:

Unlike the majority of other posters here, I have to say that I agree with this approach. I currently wok on a team that maintains a suite 20 large web-based applications. If I had to guess, I would say there’s serveral (if not 10s) of millions of lines of code. If the layout engine radically changed, it would literally take years to fully test everything and update everything to be compatible. It’s not that we’re lazy or “behind the times” — it’s just that the sheer volume of code makes it impossible to simply turn on a dime.

What an absurd statement to make. What Schwarzmeier is saying is that each page in these 20 major applications is hand-coded, not using standards, not using a template, and that individual changes need to be made one page at a time. Frankly, any large site or application in this shape should seriously consider firing its existing team and starting over.

The days when each web page is hand crafted are over. They’ve been over for years. I can’t believe that there are any major web sites that don’t use some form of templated system now. Templates and CSS.

In fact, I would say that most hand-crafted pages now probably wouldn’t work with IE8, or even IE7 or IE6. I find it likely they still have the silly little icon saying they require IE 4.x. They definitely wouldn’t be adding in the meta tag. Their creators probably won’t even hear of these discussions.

The argument for this tag is actually the number one argument against this tag: those people with hand-crafted pages are not going to be willing to hand-edit each page to make it standards compliant–why on earth would they hand-edit each of these pages to add this tag? As for being able to test a site against a version of a browser–this site looks good in IE7, but not IE8, or some such nonsense–when are we finally going to actually commit to standards? Not just as browser vendors, but as web page designers and developers? More importantly, as people who use browsers to surf the web?

I am not writing this because I work for Opera or Mozilla. I am not writing this because I’m unaware of the challenges facing web page designers. In fact, in my books I warn people about being aware of their audiences and the browsers that they use. It would be irresponsible for me not to cover these topics.

However, I no longer buy into the stories of millions of charities, schools, or libraries with old computers that can’t run anything but Win95 and IE 5.x.

I no longer buy into the stories of web sites with millions of lines of code, each of which has to be tweaked any time a new standard or browser is released.

I no longer buy into a web where we continue having to add foolishness into our pages in order to satisfy a company who can’t even be trusted to provide an upgrade path for IE 6 on older, but still popular, operating systems like Windows 2000.

Nothing will stop Microsoft from adding its little IE-specific tags here and there. If the company were truly concerned about breaking the web, though, such tags should be opt-in. In other words, Microsoft should render a page in standards mode if this silly tag is not added to the page–not force all of us to redefine the web just because Microsoft has seemingly brainwashed the WaSP, ALA, et al.

I will not add this tag to my web pages–no, not even using some twisty tech method–and the company had better determine how it is going to render sites like this one, which serves up good, honest, standard XHTML 1.1.

Update

Over at Anne’s the following comment from another member of WaSP:

Just to be clear Anne, the members of the Web Standards Project in general were not informed about this article and Microsoft’s proposal/plans until it was announced on A List Apart. Any Web Standards Project members who consulted with Microsoft did so as individuals and not as representatives of WaSP.

I am sure that I am not the only WaSP member (and web designer/developer) who is unhappy with these proposals on first reading.

I think it’s time for the WaSP to get its ducks in a row. I think it’s also important for members of WaSP, and perhaps of ALA as well, to publicly declare their financial dealings with any of the impacted browser companies, including Microsoft.

Second Update

Jeremy Keith writes of meta-blackmail:

But—and this is a huge “but”—if you don’t include a X-UA-Compatible instruction, you are also condemning your site to be locked into the current version: IE7. This is a huge, huge mistake.

Let’s say you’re building a website right now that uses a CSS feature such as generated content. Any browsers that currently support generated content will correctly parse your CSS declarations. Future browsers that will support generated content should also parse those CSS declarations. This expected behaviour will not occur in Internet Explorer. IE8 will include support for generated content. But unless you explicitly declare that you want IE8 to behave as IE8, it will behave as IE7.
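For the record, the sort of rule Keith is describing is ordinary CSS 2.1 generated content, nothing exotic. A minimal example of my own:

<style type="text/css">
/* CSS 2.1 generated content: print each link's target after the link.
   Browsers that support generated content honor this today; per
   Keith's point, IE8 would only honor it if the page explicitly opts
   in to IE8 mode. */
a[href]:after { content: " (" attr(href) ")"; }
</style>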

There’s another option: continue as we are.

Last update

Who is this man and what has he done with the real Zeldman? I will say one thing: such prissy, looking-down-one’s-nose arrogance will not win adherents to this approach.

I feel like I’ve walked into an episode of the Twilight Zone.