Categories
Web

W3C Web Services Group

Recovered from the Wayback Machine.

I’m extremely pleased to see the formation of a new working group at the W3C. It will be responsible for making some sense of the chaos that is Web Services. I may not be an adherent of the “Standards at all costs” club, but we do need standardization in three specific areas:

  • Protocols
  • Syntax
  • Semantics

Syntax is XML, Semantics is RDF, and a public-domain protocol will, hopefully, come quickly and efficiently out of this new group.

XML as a syntax is safely within the public domain. However, there is too much implied ownership of the protocols (openly available or not), and there’s too little understanding of the importance of the semantics.
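To make the syntax/semantics split concrete, here’s a minimal RDF/XML sketch (the resource URI and the statement itself are made up for illustration). The angle brackets are plain XML syntax; it’s the RDF and Dublin Core vocabularies that let a machine read the markup as a statement about a resource, rather than as an arbitrary tree of elements:

```xml
<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:dc="http://purl.org/dc/elements/1.1/">
  <!-- Syntax: well-formed XML, nothing more.
       Semantics: an RDF statement that the resource at
       example.org/service has a particular title and creator. -->
  <rdf:Description rdf:about="http://example.org/service">
    <dc:title>An Example Web Service</dc:title>
    <dc:creator>Anyone At All</dc:creator>
  </rdf:Description>
</rdf:RDF>
```

Strip the RDF vocabulary away and the document is still perfectly valid XML — which is exactly the point: the syntax alone tells a machine nothing about what the data means.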

We don’t need centralized enforcement agencies such as UDDI, or centralized security paradigms such as Passport, as long as we have a standardized syntax, semantics, and protocol. Give me these, and I can build you an interface to anything, from anything.

My strongest hope is that the W3C moves swiftly — they’re late in the protocol game.

Categories
Web

Future of the Web

Recovered from the Wayback Machine.

When people say something I want to respond to, I respond to it. And other people are, hopefully, responding to me if I say something interesting. When I respond to what others write, it is a compliment. It means that what was said definitely got my interest, regardless of whether I agree with it. When people respond to me, I take it as a compliment, even when they call me nasty things. (Go ahead! Call me a bitch! I live for this!)

Having carefully said all this, I find I do want to respond to something Dave said on Scripting News. I have to respond — to hold it in will cause me an injury.

I was a developer before the Web was even a twinkle in Berners-Lee’s eye. I love to program, and have worked — worked, mind you — with 18 different programming languages, including C, C++, Java, Perl, SNOBOL (any of you recognize this one?), Smalltalk, Ada, Pascal, Modula-2, FORTRAN, LISP, and so on. And I still love to program, though I spend most of my time designing technology architectures and writing now.

When the web came along, it was love at first byte. I thought that this was great stuff — a universal front end to any application. I was so sold that I focused as much of my professional life on the web as I could and still pay the bills.

I wrote books and articles on CGI and DHTML and JavaScript and XML and CSS and ASP and a host of other web technologies. Even today I find I am as fascinated by the web as I was waaaaaaaaaay back in the beginning. I’ve never seen the web as low-tech. If anything, I find myself being stretched more by the web than by traditional programming.

In all this time, I just don’t remember there ever being a battle between C developers (I’m assuming by this Dave meant people who don’t want to use the web as an environment for their applications) and web developers. Not all applications fit the web, and not all companies have chosen the web for their environment — but that’s not developers, that’s just business. Most companies today use applications from both environments, something that will probably continue to be the norm into the future. (We don’t want to use Word over the Internet as a service, no matter what Microsoft says. Same for Photoshop.)

There are discussions — constantly — between server-side folks and the designers. I know that I’ve had a lively chat or two with the WSP people, who are, primarily, web designers. But most developers I know, myself included, are thrilled to play with the new technologies the web has provided. There might be a few who don’t want to play web, but most of us are as happy (or more) working with web development as we are with traditional development.

The whole thing is really about services, isn’t it? Providing services to people who need them. Most computer-based functionality is nothing more than services wrapped in a front end — it doesn’t matter if the front end is a VB application or a web page. All that matters is that the services are prompt, efficient, secure, accurate, and effective. If some people prefer to create the front end in VB and put both service and front end on one machine, that’s cool. If they prefer a web page, that’s cool. Where’s the battle? Apples and oranges.

As for Netscape and Microsoft and the W3C not having a vision for the future of the web, oh they most certainly do and did. Microsoft’s whole vision is .NET and owning the internet. In fact, the company’s vision scares me most of the time. Netscape also had strong designs on the web before they became the underdog. As for the W3C, we wouldn’t have the web without this organization’s efforts. I may preach chaos, but I practice chaos on top of a specific development platform, and I have that platform thanks to the W3C.

The key is that there are a lot of groups and people who have their own visions for the future of the web. If we continue to work towards a common interface, then we can each practice our own vision and our own chaos behind that interface. But we must have this interface, and I’d rather it be provided by an organization that doesn’t profit than one that does. The interface cannot be owned by any one company, any one organization, or any one person.

Categories
Web

More on Zeldman rant

There’s been confusion about why I reacted so strongly to Zeldman’s posting earlier in the day. Copied from an email I just sent to a friend:

Zeldman sees only wrong within the Internet. The industry is stupid. Content created without web writers is bad. He seems incapable of seeing the wonder that surrounds him, and that touches him in his everyday interaction.

Case in point – metadata has nothing to do with websites being visited or not. Not really. It’s all about how to generate buzz. We know this from weblogging; it’s all a game. None of us uses metadata to connect via weblogging. In fact, we’re seeing the beginning instances of a truly semantic web through the human element contained in weblogging — and all Zeldman “sees” is that too many sites aren’t accessible.

I agree with him that web writers aren’t valued as highly as they should be; I disagree with the assumption that only web writers can create good content.

It’s elitism and I’ve been fighting this on the internet since day 1. Zeldman is an elitist. If he’s talking about hiring firms, I’m not seeing this from his post. Perhaps the viewpoint is based on our different perspectives of Zeldman.

Categories
Web

Zeldman rant

Zeldman’s on a rant…again

You know according to Zeldman’s estimation of the web, you aren’t here. You’re not reading this. This isn’t usable. None of the web except the infamous Zeldman orange is usable. The web is going to hell – quickly. Better cancel your DSL or cable modem right now. Tell Google it’s failing.

(Found reference at Scripting News)

Categories
Web

National Park System break in

Recovered from the Wayback Machine.

No rain for at least five days. Five Whole Days! It must be Macworld — Apple brought us a break in the rain. (Well, they sure didn’t bring us anything exciting technologically.)

Tomorrow I hit one of my favorite walks — Golden Gate to Presidio to Crissy to Embarcadero.

Speaking of walks and the great outdoors, this one sure was missed. The National Park Service web site is offline because of a court order. Why? Because a hacker was able to break into the system.

Now, breaking into the National Park System isn’t that big a deal; after all, the FBI, the CIA, the White House, and several other sites have been broken into. Many times. However, the same agency that controls the NPS — the US Department of the Interior — also controls a multi-billion dollar Native American Trust Fund. And the US Department of the Interior is currently involved in a lawsuit brought by several Native American tribes over its handling of said trust fund. And the court reviewing this case was the one that sanctioned the hacker to break into the system (and into the trust fund itself) to show that it’s vulnerable, and therefore could be a security risk to said trust fund. So because the NPS web site was cracked, it was ordered pulled indefinitely until security can be assured for the site. And we all know how we can guarantee web server security, don’t we?

End result: yours truly — in an effort to achieve peace, calm, and enlightenment through a lovely walk — goes out to my favorite NPS web site to get a map of the paths of the Presidio and finds that the site is down, thereby leading to a search to find the reason, thereby finding the only reference to this event at SF Gate, thereby beginning to burn yet again.

If this continues, your favorite bird that burns is going to be extra crispy.

More on this:

FOXNews

Updated: 1/12/02 You can find several articles on this story by doing a Google with Alan+Balaran+hacker as the search term.