Categories
Specs Web

Joel Spolsky: Crap is good

Recovered from the Wayback Machine.

Joel Spolsky just spent several thousand words and accompanying diagrams saying one thing: we did things crappy in the past, and we should continue doing things crappy in the future, because crap is easy.

Where do I start?

This upcoming battle will be presided over by Dean Hachamovitch, the Microsoft veteran currently running the team that’s going to bring you the next version of Internet Explorer, 8.0.

At a minimum Microsoft can go off and do its own thing in total isolation, and in the long run, Microsoft will end up being the loser. The more I work with SVG and the new CSS, the more I find that I can develop using the new technologies, and the page still works for IE but I don’t have to make it look the same for IE. As long as the page is clean, legible, and accessible via IE, it doesn’t have to look the same for IE as it does for the Big Three (Firefox, Safari, and Opera).

So I’d say that Hachamovitch is a player, but only to the extent that Microsoft wants to be a part of a larger community.

In practice, with the web, there’s a bit of a problem: no way to test a web page against the standard, because there’s no reference implementation that guarantees that if it works, all the browsers work. This just doesn’t exist.

Question: can you see this page?

There is no practical way to check if the web page you just coded conforms to the spec.

Question: can you see this page?

There are validators, but they won’t tell you what the page is supposed to look like, and having a “valid” page where all the text is overlapping and nothing lines up and you can’t see anything is not very useful. What people do is check their pages against one browser, maybe two, until it looks right. And if they’ve made a mistake that just happens to look OK in IE and Firefox, they’re not even going to know about it.

I’m trying to untangle this one mentally and failing. What Spolsky seems to be saying is that standards don’t matter, because people don’t test in all browsers, and standards somehow make lines not line up. Or something.

He can’t possibly be saying that standards break the web. Can he?

Actually, he can.

Standards are a great goal, of course, but before you become a standards fanatic you have to understand that due to the failings of human beings, standards are sometimes misinterpreted, sometimes confusing and even ambiguous.

The precise problem here is that you’re pretending that there’s one standard, but since nobody has a way to test against the standard, it’s not a real standard: it’s a platonic ideal and a set of misinterpretations, and therefore the standard is not serving the desired goal of reducing the test matrix in a MANY-MANY market.

DOCTYPE is a myth.

A mortal web designer who attaches a DOCTYPE tag to their web page saying, “this is standard HTML,” is committing an act of hubris. There is no way they know that. All they are really saying is that the page was meant to be standard HTML. All they really know is that they tested it with IE, Firefox, maybe Opera and Safari, and it seems to work. Or, they copied the DOCTYPE tag out of a book and don’t know what it means.

There are at least four separate thoughts in these few seemingly related paragraphs. First: there really are no standards, because standards are a thing of the mind. Second: because standards are a thing of the mind, one can’t test pages against a standard. One such standards thing is DOCTYPE, which really doesn’t exist, because no one knows what it does and people just copy it anyway. Therefore…

I must admit to getting lost at this point. Who’s on first?

And so if you’re a developer on the IE 8 team, your first inclination is going to be to do exactly what has always worked in these kinds of SEQUENCE-MANY markets. You’re going to do a little protocol negotiation, and continue to emulate the old behavior for every site that doesn’t explicitly tell you that they expect the new behavior, so that all existing web pages continue to work, and you’re only going to have the nice new behavior for sites that put a little flag on the page saying, “Yo! I grok IE 8! Give me all the new IE 8 Goodness Please!”

And indeed that was the first decision announced by the IE team on January 21st. The web browser would accommodate existing pages silently so that nobody had to change their web site by acting like the old, buggy IE7 that web developers hated.

A pragmatic engineer would have to come to the conclusion that the IE team’s first decision was right. But the young idealist “standards” people went nuclear.

It’s been a long time since I’ve been called a “young idealist”. I wonder how Sam Ruby likes being called a young idealist? I’m surprised Spolsky didn’t pat us all on the heads, offer us a cookie. But wait, it gets better…

Almost every web site I visited with IE8 is broken in some way. Websites that use a lot of JavaScript are generally completely dead. A lot of pages simply have visual problems: things in the wrong place, popup menus that pop under, mysterious scrollbars in the middle. Some sites have more subtle problems: they look ok but as you go further you find that critical form won’t submit or leads to a blank page.

Fancy that…this young idealist’s web sites both worked with IE8, right out of the box. In fact, the only problem I’ve had with IE8 is with Netflix and that’s because of the ActiveX controls and nothing to do with standards.

I think we’ll find that most web sites don’t break with IE8, or if they do, they’re just as likely to break with Firefox 3b, Opera 9.5b, and the latest WebKit. There’s a reason you have a long beta period for a browser: to give people time to make any necessary fixes so their pages work with the browser once it’s released out of beta.

True, there are sites that will continue to break with IE8 once it’s released. If you want to find them, go to the geocities.com web sites and search on muscle cars. Better yet: “Unicorn rainbow pony”. Heck, even then, most of them will *probably work.

Some of those pages can’t be changed. They might be burned onto CD-ROMs. Some of them were created by people who are now dead. Most of them created by people who have no frigging idea what’s going on and why their web page, which they paid a designer to create 4 years ago, is now not working properly.

So the web has to stop because a web site has been burned on a CD, or the person who created the site is dead? Isn’t that equivalent to saying, “No, you can’t have blu-ray, because I still have VHS tapes”? Or maybe more in line with, “No, you can’t have that vaccine because there are people in the world who think the plague is caused by evil spirits, and we have to halt our practice of medicine until they catch up.”

You know, it is OK to let old pages break. There is nothing so valuable online today that we have to halt all further progress of the web because of the off chance a page won’t be viewable in a modern browser. If it were truly that valuable, it wouldn’t be that vulnerable.

Leaving aside vapid, sexist twaddle such as, “Mmhmm. All you smug idealists are laughing at this newbie/idjit. The consumer is not an idiot. She’s your wife. So stop laughing” (speaking of which, it doesn’t matter where the quote arises, Joel, only your use of it to prove a point), Spolsky’s whole pitch is basically a race to the bottom. Crap has happened in the past, and therefore we should continue supporting crap in the future. Not only support old crap, but encourage new crap because, frankly, people are too stupid to learn how to do things right. She’s your wife, indeed.

In response to Spolsky’s writing, Sam Ruby wrote, “If people want web browsers that work with actual web sites, they still have three choices.” Three good, solid choices, created by three organizations populated by people who don’t believe we have to be stuck with muscle cars, unicorns, rainbows, and ponies forever.

*Do scroll down the page and look at the comment annotating the page view counter.

Categories
Burningbird

Having one’s cake

Recovered from the Wayback Machine.

I’ve now mapped out a plan for moving forward on the organization of my site, including which tools to use and where, and even some preliminary designs. I’ve also played around more with incorporating SVG into a site design, as well as trying out some of the newer CSS3 design attributes. I’m finding out that one can have one’s cake and eat it, too.

For instance, you can use SVG for a site design, and the site doesn’t have to look either plain or ugly with IE–just different. If you’re comfortable with different, this isn’t a bad way to move forward with the more advanced browsers, such as Firefox/Gecko, Opera, and Safari/Webkit (the Big Three), while still accounting for a more primitive browser like IE.

Right now, today, at Realtech I have an experimental design up called “World War”, featuring a photo from an air show as well as three different SVG images. Only the photo shows with IE, but rather than have a completely white page, I added a background color and repeating background pattern, both of which are overlaid by the SVG ‘background’ image that the Big Three can see.

This is where it gets a little tricky. The SVG element supports both a width and a height attribute. If you specify the width and height in the element as SVG attributes, not in the CSS style attribute, Internet Explorer ignores both, which means the SVG element takes up no page space in IE.

However, the Big Three understand that width and height are supported attributes for SVG container elements, like the SVG element, itself. All three support the width and height setting directly in the SVG element. Not only that, but both Safari and Opera get a bit snitty if you don’t use these attributes and instead set the width and height using CSS, only.

The end result of these machinations is that the Big Three see the SVG images and override the background image and background color. True, they still load the background image, but since it’s so tiny, it’s not a significant load on the server or client. Best of all: no conditional references have to be used, either in HTML, CSS, or JavaScript. If IE were ever to support SVG someday, the browser would then process the SVG just like the Big Three.
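Here’s a rough sketch of the approach, not the exact Realtech markup: the CSS supplies a plain background color and tiled pattern that IE will show, while an inline SVG element, with its width and height set as SVG attributes, draws the richer ‘background’ for the Big Three. The file name, dimensions, and colors below are placeholders.

body
{
  background-color: #494936;        /* fallback color IE displays */
  background-image: url(tile.png);  /* tiny repeating pattern, also for IE */
}

<svg xmlns="http://www.w3.org/2000/svg" width="1200" height="300">
  <!-- decorative 'background' rendered by Firefox, Opera, and Safari;
       IE doesn't render the SVG, so the element takes up no page space there -->
  <rect width="1200" height="300" fill="#777755" />
</svg>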

I continued this concept into using some CSS3 attributes. CSS 2.1 provides the meat of web page design, but CSS3 is the dessert, and what’s a good meal without dessert?

I use the rgba color function when setting the background color for both my sidebar and my article title bars. The rgba function takes four parameters: the three decimal values, in a range from 0 to 255, for the red, green, and blue channels, respectively, and a fourth representing the alpha channel. The alpha channel is what controls the transparency. Using the rgba function allows us to create semi-transparent backgrounds.

I could use a variation of the opacity setting, including the CSS3 opacity attribute, as well as the older moz-opacity and filter settings. However, the opacity settings affect the opacity of the element on which they’re set and any child elements. Using the rgba function for the background-color creates a semi-transparent background for the element on which it is set, but has no impact on the child elements. (For more on opacity and rgba, see A brief introduction to Opacity and RGBA.)
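As a quick illustration of the difference (the class names here are just for the example):

.panel-opacity
{
  opacity: 0.8;  /* fades the element and everything inside it, text included */
}

.panel-rgba
{
  background-color: rgba(255,255,255,0.8);  /* only the background is semi-transparent; child elements stay fully opaque */
}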

What about a gracefully degrading design? For user agents that don’t support rgba, what I’ve found is that we can specify a background color using non-rgba functionality:

.sidebar
{
  background-color: #fff;
  background-color: rgba(255,255,255,0.8);
}

Either the agent will pick up the non-rgba background color, or it won’t pick up any background color at all. In the latter case, the behavior that the browser demonstrates is that it recognizes a supported CSS attribute (background-color), but not the value (rgba). Therefore it flushes the previously set background color, but doesn’t apply the new background color.

(I believe the former behavior is correct, while the latter behavior is incorrect. If you have any input on this, please leave a note in comments.)

Combined, these two CSS background-color attribute settings result in the following: the sidebar and the inner panel background are both semi-transparent with Safari and Firefox, which support rgba; Opera doesn’t currently support rgba, but will pick up the earlier, solid white background-color; IE doesn’t pick up any background color, and both items are transparent.

Another CSS3 attribute I use that gracefully degrades is the new text-shadow attribute. With text-shadow, I can add shadow to text, such as the title in the page header. If the browser supports the text-shadow attribute, the shadow displays; otherwise, no shadow.

The text-shadow attribute takes four parameters: the color of the shadow; the x offset of the shadow relative to the original element; the y offset; and the radius of the applied blur. I currently have the following text-shadow attribute setting on my main title:

text-shadow: #333 2px 2px 4px;

This CSS setting creates a dark gray shadow, offset 2 pixels to the right and bottom of my current text, with a blur radius of 4 pixels–a relatively soft shadow. The shadow shows with Opera and with Safari, though not with Firefox or IE. As long as no dependency is placed on the shadow (i.e. text the same color of the background, depending on the shadow to make the text show), the look degrades gracefully for browsers that don’t, currently, support text-shadow.

Best of all, when the text-shadow attribute is eventually supported by a browser, the shadow is displayed without any further intervention or modification of the page design. All you have to do is accept that a page will look different in different browsers. Not “bad”, different. If you’re willing to live with “different”, you can have a lot of fun now with new design elements.

Categories
Specs

XHTMLate WordPress comments

Recovered from the Wayback Machine.

I’ve pulled the plug-in. It cleaned out the comment text, but not the name, URL, and email of the person. The email isn’t an issue, as WP ensures the email is clean; the URL and the name, however, are still an issue. A new comment isn’t the problem; edited comments are.

Frankly, if you’re going to serve your pages up as XHTML, your best bet is to moderate comments so you can catch every variation of something that can go wrong. Either that, or get rid of comments, which is also an option.

I’ll post a new version, once I’ve checked those fields, and completed a few other odds and ends.

Categories
Semantics

And Nerds become queens: Yahoo and Smart searches

Recovered from the Wayback Machine.

Great idea on the part of Yahoo to begin incorporating semantic web information into its open search platform. How deep the semantics will go, and in how many directions, is still TBA, but I’m pleased to see interest in microformats and more structured semantic data via RDF. I’ll be even more pleased when we start to see working examples.

Marshall Kirkpatrick believes that Google will follow suit. I just don’t see it. Google might embrace microformats, but the company has long pitted its algorithms against human annotation of data, and the semantic web is based on some human annotation, even if the annotation is based, indirectly, on checking an option in a page.

My biggest concern about all of this is that we might limit semantics to microformats. It’s with relief that I see Yahoo going beyond just microformats into the broader scope of structured semantics based on RDF and its various serializations. Paul Miller also brings up other needed caveats:

The tools to create and embed that structure need to follow, of course. And issues that efforts like Dublin Core struggled with over a decade ago need to be thrashed out in some more detail, as the malicious, the malevolent, the careless and the mischievous rush to ‘game’ the rich structured data with which their web pages will soon be filled.

Putting pressure on the tool makers is essential, though probably not as essential as it once was, because most tools provide a plug-in infrastructure that enables expansion. Still, there’s a lot more that tools can do, which is one reason why I’ve been so interested in Drupal: this tool is definitely ahead of the curve.

What’s key to all of this is showing people what they can get if they go that little extra step. I read people who write book reviews. If we start showing more intelligent search results when they add a little additional information to their writing, information reflecting that the work is a review of a certain book by a certain author, and so on, they will most likely be willing to spend a little time adding it.
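For example, an existing review could be marked up with something as lightweight as the hReview microformat. This is only a sketch, with a made-up reviewer and a made-up book:

<div class="hreview">
  <span class="reviewer vcard"><span class="fn">Jane Reader</span></span> reviews
  <span class="item"><span class="fn">An Example Book</span></span>:
  <span class="rating">4</span> out of 5.
  <p class="description">A short note on why the book is worth reading.</p>
</div>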

Someday when I’m looking for a new book to download from the web, I’ll be able to pull up a browser in my Kindle ebook reader and see all the reviews written about that book, online. Everywhere. We are so close to making this work, and I’m not normally the type to tap dance every time someone comes along, breathing the words “semantic web”, through lips moist with anticipation.

Yahoo should have received a hostile takeover bid a long time ago. Lately, the company has been galvanized.

Categories
Technology Weblogging

Upgrading to WordPress 2.5: First, install Drupal

Recovered from the Wayback Machine.

Anil Dash had a clever and humorous, as well as telling, guide titled, A WordPress 2.5 Upgrade Guide. His advice?

As you might know, WordPress 2.5 is about to be released, and we wanted to encourage WordPress users to upgrade. To Movable Type.

I wasn’t even aware that a 2.5 upgrade was on the horizon until I read Anil’s posting. Why on earth do the WordPress people embed a link to the WordPress weblog in the Dashboard if they don’t use it to give people a heads-up? Especially since I gather this upgrade is making some major modifications. Modifications that will probably trash some of the changes I’ve made to XHTMLate WordPress. I am now faced with a decision: do I upgrade to 2.5, and continue to XHTMLate? Or move to Drupal? Or increase my pain, and use both?

Moving to another tool sounds about as much fun as having dental implants. However, now is the time to make this move if you’re considering it. Though it uses minor version numbering, from what I can glean, WP 2.5 is a major upgrade.

For me, the logical move is to Drupal. The tool has just come out with a major new version, which means I don’t have to go through major upgrade blues for a long time. I’ve written in the past about the tool’s support for both SVG and RDF, as noted in the keynote at DrupalCon (thanks, James!). And now Laura Scott writes on the number of women involved with Drupal development, which I did not know about, probably because of the generally poor visibility of women associated with open source. According to Laura:

Part of the problem lies not in macho coding culture, but rather in the woeful state of computer and software education in our schools. Most of the people involved in open source are there in spite of their formal educations (or lack thereof). Computer work is pretty much taught only in Computer Science departments, which usually are subsets of Mathematics departments. Despite the fact that nearly every student will be working with computers in whatever field they enter, they likely will never have even one class where they study any sort of computer science or algorithm theory.

Is it any wonder that women especially are not likely to end up in an open source software community? As I noted before, the leading women involved with Drupal came to it from other vocations and educational backgrounds.

I’m not surprised about women coming in from other vocations. I’ve long thought the problem with the Computer Science degree programs in college is that there are Computer Science degree programs in college. I was pleasantly surprised, though, about the significant women’s involvement in Drupal. This involvement becomes yet another reason to make a move to Drupal.

All appreciation to Laura for her kind words about yours truly, but I doubt I’ll have any visible impact on the growth of Drupal, and Matt at WordPress will attest to the fact that I can be a real pain-in-the-butt to have as a user. To be honest, I think Drupal itself, with its forward moves into semantics and SVG and related technologies, and the community around Drupal, are what will have a positive impact on the growth of this tool. Enough to be a threat to WordPress? That’s a silly way of looking at it, because there’s plenty of business for WordPress AND Drupal, and yes, even Anil’s Movable Type. Everybody has different needs.

But, oh, I hate having to go through yet another tool switch.

Manilla->Radio->Blogger->MT->Wordpress->Drupal.

In the meantime, if you are a WordPress user, heads up, as change is coming at you. And if you see strange happenings around here…well, come to think of it, you always see strange things happening around my web sites, so, never mind.