Fire the W3C

Recovered from the Wayback Machine.

I have to disagree with Dare on his recent post about the troubles at the W3C.

I had to work, quite extensively at times, with the W3C working group related to RDF when I was writing Practical RDF. There were times when I thought I had walked into a lab and was chief rat. In particular, I was concerned about the R & D aspect of the work: where were the ‘practical’ people?

It was only later, as I saw RDF hold up under the challenges, that I realized the model had to be mathematically vetted before practical use could be made of it. For better or worse, the only people willing to take on this kind of effort, and having the background, are the R & D, academic types of folks. They’re not easy to live with at times, but they have more background for this work than the average person.

I know that the W3C has had problems. I do think it needs to connect more with the user base. I agree with Molly that it desperately needs to be diversified. But what are the alternatives?

Dare mentions relying on de facto standards. Would that be like HTML? We’re only now starting to pull ourselves out of the nightmare of inconsistent HTML markup and elements such as BLINK, or worse, FONT.

Depending on proprietary standards such as RSS? But certain aspects of this syndication format are imprecise, and that imprecision leads to confusion. All you need do is attach two enclosures to a single item to see this for a fact, and that’s only one of the more obvious problems. Look also at the fact that RSS has political overtones that will always cloud its use. Heck, the one organization ‘picked’ to help document it was fired by the person who picked them! Excuse me, but exactly how are the W3C’s efforts worse?
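To make the enclosure problem concrete: the RSS 2.0 specification describes enclosure as a sub-element of item and only ever speaks of it in the singular, so a feed item like the following sketch (URLs and sizes hypothetical) is perfectly well-formed XML, but its meaning is undefined:

```xml
<item>
  <title>One item, two enclosures</title>
  <link>http://example.com/post</link>
  <!-- Which enclosure is "the" enclosure? The spec doesn't say. -->
  <enclosure url="http://example.com/audio.mp3"
             length="12216320" type="audio/mpeg" />
  <enclosure url="http://example.com/video.mov"
             length="48216320" type="video/quicktime" />
</item>
```

Faced with markup like this, one aggregator will take the first enclosure, another the last, and a third may refuse the item entirely.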

As for the microformats community, are we forgetting nofollow? Well, if not that, then let’s ask ourselves something: what problem does hAtom solve? Considering that the page is most likely generated dynamically from a data set, how is hAtom any better than just generating Atom from the same data?
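For those who haven’t looked at it, a weblog entry marked up as hAtom looks roughly like this (the class names come from the hAtom draft; the entry itself is a hypothetical sketch):

```xml
<div class="hentry">
  <h3 class="entry-title">A Hypothetical Post</h3>
  <!-- ISO date in the title attribute, human-readable text inside -->
  <abbr class="published" title="2006-12-01T10:30:00-05:00">December 1, 2006</abbr>
  <address class="author vcard"><span class="fn">Some Author</span></address>
  <div class="entry-content">
    <p>The body of the post goes here.</p>
  </div>
</div>
```

The same database query that fills in those class-laden elements could fill in an Atom entry directly, which is exactly the point: the page generator already has the structured data.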

Lately I’ve been really looking at microformats and I can understand the utility of some–such as calendar and reviews–focusing on using specific markup to define business data. Others, though, look to me like an exercise in pushing data around just to do so. Have you ever played with dominoes? Where you line them up just right and then push them down? It’s cool a couple of times, but most people get bored and move on. Some (not all) of the microformat effort reminds me of dominoes.

More importantly, there is no real organization, unassociated with a specific company, driving microformats.

The W3C has work to do. But I’d rather have the W3C, than not.


W3C Web Services Group


I’m extremely pleased to see the formation of a new working group at the W3C. It will be responsible for making some sense of the chaos that is Web Services. I may not be an adherent of the “Standards at all costs” club, but we do need to have standardization in three specific areas:

  • Protocols
  • Syntax
  • Semantics

Syntax is XML, Semantics is RDF, and a public domain Protocol will, hopefully, come quickly and efficiently out of this new group.

XML as a syntax is safely within the public domain. However, there is too much implied ownership of the protocols (openly available or not), and there’s too little understanding of the importance of the semantics.
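A small illustration of the syntax/semantics split, using the standard RDF and Dublin Core namespaces (the resource URL and values are hypothetical): the angle brackets and attributes below are plain XML, which any parser can read, but it is the RDF layered on top that actually says “this resource has this creator and this title”:

```xml
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:dc="http://purl.org/dc/elements/1.1/">
  <!-- The XML is the syntax; the subject-predicate-object
       statements it encodes are the semantics. -->
  <rdf:Description rdf:about="http://example.com/article">
    <dc:creator>Jane Smith</dc:creator>
    <dc:title>An Article</dc:title>
  </rdf:Description>
</rdf:RDF>
```

Strip away the RDF vocabulary and you still have valid XML; what you lose is any shared understanding of what the data means.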

We don’t need centralized enforcement agencies such as UDDI, or centralized security paradigms such as Passport, if we have a standardized syntax, semantics, and protocol. Give me these, and I can build you an interface to anything, from anything.

My strongest hope is that the W3C moves swiftly — they’re late in the protocol game.


Future of the Web


When people say something I want to respond to, I respond to it. And other people are, hopefully, responding to me if I say something interesting. When I respond to what others write, it is a compliment. It means that what was said definitely got my interest, regardless of whether I agree with what was said or not. When people respond to me, I take it as a compliment, even when they call me nasty things. (Go ahead! Call me a bitch! I live for this!)

Having carefully said all this, I find I do want to respond to something Dave said on Scripting News. I have to respond — to hold it in will cause me an injury.

I was a developer before the Web was even a twinkle in Berners-Lee’s eyes. I love to program, and have worked — worked mind you — with 18 different programming languages, including C, C++, Java, Perl, Snobol (any of you recognize this one?), Smalltalk, Ada, Pascal, Modula II, FORTRAN, LISP, and so on. And I still love to program, though I spend most of my time designing technology architectures and writing now.

When the web came along, it was love at first byte. I thought that this was great stuff — a universal front end to any application. I was so sold that I focused as much of my professional life on the web as I could, and still pay the bills.

I wrote books and articles on CGI and DHTML and JavaScript and XML and CSS and ASP and a host of other web technologies. Even today I find I am as fascinated by the web as I was waaaaaaaaaay back in the beginning. I’ve never seen that the web is low-tech. If anything, I find myself being stretched more by the web than by traditional programming.

In all this time, I just don’t remember there ever being a battle between C developers (I’m assuming by this Dave meant people who don’t want to use the web as an environment for their applications) and web developers. Not all applications fit the web, and not all companies have chosen the web for their environment — but that’s not developers, that’s just business. Most companies today use applications from both environments, something that will probably continue to be the norm into the future. (We don’t want to use Word over the Internet as a service, no matter what Microsoft says. Same for Photoshop.)

There are discussions — constantly — between server-side folks and the designers. I know that I’ve had a lively chat or two with the WSP people who are, primarily, web designers. But most developers I know of, such as myself, are thrilled to play with the new technologies the web has provided. There might be a few who don’t want to play web, but most of us are as happy (or more) working with web development as we are with traditional development.

The whole thing is really about services, isn’t it? Providing services to people who need them. Most computer-based functionality is nothing more than services wrapped in a front end — it doesn’t matter if the front end is a VB application or a web page. All that matters is that the services are prompt, efficient, secure, accurate, and effective. If some people prefer to create the front end in VB and put both service and front end on one machine, that’s cool. If they prefer a web page, that’s cool. Where’s the battle? Apples and oranges.

As for Netscape and Microsoft and the W3C not having a vision for the future of the web, oh they most certainly do and did. Microsoft’s whole vision is .NET and owning the internet. In fact, the company’s vision scares me most of the time. Netscape also had strong designs on the web before they became the underdog. As for the W3C, we wouldn’t have the web without this organization’s efforts. I may preach chaos, but I practice chaos on top of a specific development platform, and I have that platform thanks to the W3C.

The key is that there are a lot of groups and people who have their own visions for what is the future of the web. If we continue to work towards a common interface, then we can each practice our own vision and our own chaos behind that interface. But we must have this interface, and I’d rather it be provided by an organization that doesn’t profit, than one that does. The interface cannot be owned by any one company, any one organization, or any one person.