
Web Services Working Group

I’m extremely pleased to see the formation of a new working group at the W3C. It will be responsible for making some sense of the chaos that is Web Services. I may not be an adherent of the “Standards at all costs” club, but we do need standardization in three specific areas:

  • Protocols
  • Syntax
  • Semantics

Syntax is XML, semantics is RDF, and a public domain protocol will, hopefully, come quickly and efficiently out of this new group.

XML as a syntax is safely within the public domain. However, there is too much implied ownership of the protocols (openly available or not), and there’s too little understanding of the importance of the semantics.

We don’t need centralized enforcement agencies such as UDDI, or centralized security paradigms such as Passport, as long as we have a standardized syntax, semantics, and protocol. Give me these, and I can build you an interface to anything, from anything.

My strongest hope is that the W3C moves swiftly — they’re late in the protocol game.


Zeldman CNet interview

Recovered from the Wayback Machine.

Never noticed this CNet article with Zeldman before. Hmmm. Seems as if the questions the interviewer is asking echo things — such as the WaSP not being much of an influence on Microsoft, but having a negative influence on the Netscape 6.0 release — that I’ve been saying lately. Guess my interpretation of the WaSP’s efforts in the last year or two isn’t all that unique or unusual.

One thing I have noticed recently — my background is in computer systems and developing software, and I was working in this field long before the Web came along; Zeldman, along with many (not all) members of the WaSP, has a background in print or art or graphics, and drifted into technology through an interest in the Web. Perhaps that’s why there’s a lack of common communication: I can’t understand why these people can’t adapt to the differences in Web user agents, and they can’t understand why I’m not as uptight about standards adherence as they are.

As any experienced software or device driver developer can tell you, standards go only so far.


More WaSP Sucks

Recovered from the Wayback Machine.

After the initial fairly unpleasant comments attached to a weblog posting I wrote earlier, basically blasting the WaSP (the Web Standards Project, at http://www.webstandards.org), I’ve had several thoughtful responses from readers, leading me to want to respond in kind.

To be honest, I never was that interested in the WaSP or its initial efforts. I support the concept of web standards, and I’m all for a stable baseline of technology from which to build web content. However, I felt that the WaSP was late in its formation, as well as late in its effort. In the last three years, the WaSP generated a lot of noise, but I doubt that the group had the positive impact it believed it had. I still believe, strongly, that the moves toward adopting W3C standards support were based on business practices rather than a set of petitions signed by a group of web designers and web page developers and several press releases. No offense.

As for negative impact…

Thanks to wonderful web services such as the Web Archive’s Wayback Machine and Google’s publication of older Usenet postings, I found a thread about a specific WaSP open letter to Netscape. You all remember this one? I remembered it, but found it resurfacing when I was poking around the Usenet archives. I want to very, very carefully comment on this open letter and what led to it.

Netscape made a gutsy move to take its browser to open source through the Mozilla organization. Then, Netscape and Mozilla made a second extremely gutsy move — they decided that to support modern standards and web technologies, they would need to scrap Netscape Navigator’s old layout engine and design one from the ground up. And the Mozilla group designed not only a new browser and layout engine, but the finest technology architecture for a web user agent I have ever seen. You don’t have to believe me; read the documentation at the Mozilla web site.

Unfortunately, a side effect of these two drastic decisions was that Netscape’s and Mozilla’s release of a new browser would be seriously delayed.

Rather than being patient and supportive of both Netscape and Mozilla during these difficult decisions, we, the web development public, slammed them, issued statements such as the above open letter, and abandoned them for Internet Explorer (while continuing to blast Microsoft for having a monopoly).

Think about it — there was no rush on this. Unless I use the more esoteric elements of CSS1, I’ve rarely had problems getting my content to work with Navigator 4.x (though I haven’t checked it lately, bad on me). As Zeldman (one of the founders of the WaSP) proved in his posting this week, the older Netscape 4.x browser could support the XHTML and CSS used at the New York Public Library. So, why the rush?

Still, based on public pressure such as the above, and aided by Netscape’s merger with AOL (tragic that it was), Netscape released a version of its browser without waiting for Mozilla to issue a first public release of the underlying technology. The company was then promptly blasted because Netscape 6.0 was released without total support for ALL standards, and was buggy to boot (though it has improved greatly since the initial release).

Could the Netscape or Mozilla browsers have been released earlier? I doubt it, not with the necessity of creating the infrastructure to support them. Could both have been released without the infrastructure — as just a plain-Jane browser? Sure, but it wouldn’t have advanced our understanding of what web user agents can do and achieve (a concept that, unfortunately, seems to have been lost in all the standards hubbub). In this case, innovation had to take precedence over standards adherence, because innovation was at the core of the rendering engine responsible for the implementation of the standards. It’s true, this wasn’t the shortcut route to delivering a browser; it was the best route, from a technology standpoint.

The WaSP open letter didn’t help. It really didn’t help. Because of the rush to put Netscape out more quickly, because of the rush to support standards at the cost of innovation (and I think I proved this point with this posting), we lost an opportunity to truly explore what Netscape/Mozilla represent — a new way of doing things on the web. Not necessarily technology as in “must release a new version every six months”, but a technology built for the future — exploring new concepts, following new challenges.

Couldn’t we have waited a year or two for that? Couldn’t we have used HTML tables for layout just one more year? What would be the harm in waiting? The Mozilla folks (and the Netscape people on the Mozilla project) are working on standards support, but it takes time. And open source projects, unfortunately, usually take more time than commercial ventures. That’s a business fact of life. Couldn’t we have been patient?

The end to a story that’s too long for a weblog posting: that’s why my indifference to the WaSP became active dislike. Not because I’m against web standards. Not because I’m against XHTML or CSS. It’s because, in a brief moment in web technology history, we supported conformity over innovation; we supported the comfort and safety of living with the lowest common denominator rather than taking a chance on something bright and edgy and new. And I got pissed.

And I’m still pissed. It’s as simple as that.


WaSP Sucks

Recovered from the Wayback Machine.

Updated: 12/19/01

The really great thing about weblogging is that you can set your own rules. When I wrote an article for O’Reilly about The Tyranny of Standards, I had to accept the comments of the readers without being able to comment in return — O’Reilly would really prefer its authors NOT get into a web-based slamfest with its readers. However, this is my weblog and I set the rules — and this time I’m responding to the comments on the weblog posting.

I never once said I was against standards. Read my posting and read my article: I never once said that standards weren’t important. However, I do not agree with an organization (the WaSP) whose efforts almost destroyed another organization I hold in great respect — the Mozilla organization. I wrote my original article for O’Reilly because the WaSP’s little tirades about the importance of standardization, and about how Netscape and Mozilla were just too, too behind, forced a premature release of Netscape 6.0 and seriously weakened Mozilla’s effort and credibility.

I remember with fondness the WaSP’s chastisement of Mozilla for “wasting” time with things like XUL, when (according to the WaSP) adherence to standards was more important. I have a clue for you folks — XUL and the underlying Mozilla architecture are the most innovative concept and use of XML ever to hit browsers. If XUL were in widespread use, a company like Fog Creek wouldn’t have had to build a Windows GUI-based application (CityDesk) to provide the functionality it needed for its product.

As it is, primarily because of the WaSP’s efforts at the time, the interest that could have been generated for XUL was wasted on complaints about Mozilla’s “lack of standards adherence”. Innovation was drowned by the demand for standards compliance. Pure and simple.

Speaking of standards, exactly which ones are you all screaming for? XHTML? That’s a standard? How is it more so than, say, HTML 3.2 or 4.0? Oh, you mean those are “older” standards, and shouldn’t be supported any more. Same with “old” browsers like Netscape 4.x?

No, no. Don’t support them. Instead of writing pages that might be viewable by older browsers as well as the new ones, you want people to be redirected to a page like this one. Tell me, now — whose really idiotic idea was it to tell web designers and builders to send customers away, chastised because they aren’t using a newer browser? I have a hint for you — we didn’t like it when Microsoft played this game with MSN, so why would we like it when someone else does the same?

Tell me something else — did any of you do this on your jobs, and do you still have a job?

That last one’s a key one for me. I live in the San Francisco area, in SOMA (South of Market) as a matter of fact. This area has been absolutely devastated by the dot com implosion. Empty stores, empty restaurants, empty offices, half vacant apartments and condos, with once bustling streets now slowly being taken over by San Francisco’s homeless. This is California’s newest ghost town.

And I bet you there are any number of web page designers and builders who would take a job building web pages that support Netscape 4.x, or that require the use of HTML 4.0 instead of XHTML, or Dreamweaver, or any tool, specification, or technology, as long as they’re getting paid. In my weblog posting, that’s the point I wanted to make, and none of you got it. Too subtle, I guess. I must learn to beat one about the head more soundly in my postings from now on. So:

When you give a starving man a dinner, don’t tell him how to hold the fucking fork and knife — all he wants is the food.

Before I sign off on this very obvious rant, one other thing:

Don’t tell me I don’t support standards, just because I don’t support the WaSP — the two are NOT synonymous.

End of Rant –

(Continued at newer weblog posting)

Shelley Powers
aka YASD


Original Posting:

Why do I dislike WaSP so much? I dislike any organization that pushes conformity over innovation. I dislike any organization that pushes conformity over content. I dislike any organization that, collectively, isn’t smart enough to realize that the web progresses whether we conform to the “standards” or not. Ultimately I dislike any organization that says something along the lines of:

 

“YET IN SPITE of the efforts of the W3C, the browser makers, and a leading-edge minority of designers and developers, most of the web remains a Balkanized mess of non-valid markup, unstructured documents yoked to outdated presentational hacks, and incompatible code fragments that leave many millions of web users frustrated and disenfranchised.” (from the WaSP semi-farewell, or whatever it was)

Get a clue. Get a clue. Please, please, please get a clue. The only thing frustrating the majority of us is the increasing number of “404 page not found” errors we’re getting because so many web sites (and companies) have closed down. That’s the real issue — not whether we use CSS, or include a DOCTYPE in our web pages.

WaSP, don’t go away angry — just go away.


Opinion: Australian Censorship Bill Could Impact P2P

Recovered from the Wayback Machine.

Australia’s been in the news before over Net censorship legislation, but the South Australian Parliament may have gone a little to the extreme, even for this Net-conservative country.

A bill introduced in November would make it illegal for content providers to post material that is considered “objectionable viewing material” for children. What’s objectionable viewing material? Anything that the police — the police, mind you — would consider as falling within the R, NC, or X ratings categories of the film industry. Ostensibly this would cover material such as child pornography or content advocating breaking the law. However, the bill is general enough that it could also cover material on topics such as abortion, suicide, drug use, sexual behavior and other sensitive topics that could be termed “adult topics” and therefore R-rated.

Even more alarmingly, under this bill, posting this material is illegal even if access to the material is restricted or password-protected. Compounding the problem, content providers would have no way of knowing whether their material falls under one of the prohibited classifications before posting it; if the material is judged by the police to be within the parameters of this bill, they’d be charged. No warning and no second chance. And the fines aren’t cheap: as much as 10,000 (Australian) dollars per offense.

According to an alert issued by Electronic Frontiers Australia, this bill would actually make material that’s legal offline, illegal once posted online.


The impact of this bill on Web-based businesses is obvious — the level of censorship implied would give even the most conservative businesses pause when it comes to posting content on their Australian-based Web sites. What may not be so noticeable, though, is the impact of this bill on peer-to-peer applications and services. You see, the wording of the bill doesn’t focus on Web-based content; it concerns content distributed via the Internet.

Consider the following scenario: You’re a subscriber to a file-sharing P2P service such as Napster. You make a request for material that could be considered “objectionable” because of the language used — for instance, one of the more explicit songs from Alanis Morissette’s album “Jagged Little Pill,” or practically anything from Guns N’ Roses or Eminem. Once you’ve downloaded an “objectionable” song, it’s on your machine for your personal use. However, in the process, you’ve also “posted” this content for access by other clients through the Internet: P2P is based on the fact that any node within the network can be both a client and a server. According to this bill, you would be in violation of the law.

If you’re a subscriber to a decentralized service such as Freenet or Gnutella, the potential problems with this type of bill are even more extreme. With these types of P2P networks, if a file request is made from node A to node B, and then from node B to node C, the file is returned first to node B as the intermediary, and finally to node A. Now, not only is the peer located at C in violation of the law, but so are A, who originally requested the file, and B, who did nothing more than agree to the conditions of the P2P service, which state that files may be stored on a client’s machine as a method of disseminating popular files throughout the network.
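For the technically inclined, here’s a rough sketch in Python of why the intermediary ends up holding, and therefore “posting,” the file. This is not Freenet’s or Gnutella’s actual code; the node names and caching behavior are simply my illustration of the scenario described above.

```python
# Hypothetical illustration of a Freenet/Gnutella-style relay: each node that
# passes a file back keeps a copy, and can then serve it to later requests.

class Node:
    def __init__(self, name, files=None):
        self.name = name
        self.files = dict(files or {})   # files this node currently stores
        self.neighbor = None             # next hop in the request chain

    def request(self, filename):
        if filename in self.files:
            return self.files[filename]          # serve from the local store
        data = self.neighbor.request(filename)   # forward the request onward
        self.files[filename] = data              # cache the file as it passes back:
        return data                              # this node now "posts" it too

# C holds the "objectionable" file; A requests it through B.
a, b, c = Node("A"), Node("B"), Node("C", {"song.mp3": b"..."})
a.neighbor, b.neighbor = b, c

a.request("song.mp3")
print(sorted(n.name for n in (a, b, c) if "song.mp3" in n.files))
# -> ['A', 'B', 'C']: all three nodes now store, and can serve, the file
```

Under this bill, that cached copy on B would be enough to put B on the wrong side of the law, even though B never asked for the file.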

By its very nature, Freenet hides the identity of nodes supplying or requesting files, making it difficult to ascertain who originated the material or the request. Because of this, it becomes difficult to determine who is legally responsible for “posting” the file if it is deemed to fall within the parameters of this censorship bill. So, what could happen is that the intermediary node containing the file is the one charged with violating the law, rather than the originator, regardless of the technical and legal semantics that form the basis of anonymity within a Freenet network.

At the very least, applying this censorship law to the Freenet or Gnutella networks would be a legal nightmare for the South Australian court system. All it would take to demonstrate the infeasibility of the law is to introduce one highly popular but objectionable file to Freenet, potentially turning all or most South Australian Freenet users into criminals. This issue goes beyond considerations of copyright law.

According to the UK-based Register, South Australia’s politicians must have gone “barking mad” — in other words, the bill’s sponsors may want to reconsider the bill on its own merits.

Read the pertinent sections of the censorship bill at Electronic Frontiers Australia, and then join the discussions at Slashdot and South Australia’s Talking Point.