Categories
Technology

Stop re-inventing this wheel

Recovered from the Wayback Machine.

Perhaps at SxSW, PPK can convince the new horde of Ajax developers to stop acting as if they’ve invented the technology, all squeaky new. Via a link from Ajaxian, the concept of ‘transparent’ messages from Humanized.

Issues of accessibility and usability aside–been there, done that, wrote the book (page 240 to be exact).

Categories
Technology

Fear no tech

Recovered from the Wayback Machine.

From the new Pew Survey on future of the internet:

A low-cost global network will be thriving and creating new opportunities in a “flattening” world.

Humans will remain in charge of technology, even as more activity is automated and “smart agents” proliferate. However, a significant 42% of survey respondents were pessimistic about humans’ ability to control the technology in the future. This significant majority agreed that dangers and dependencies will grow beyond our ability to stay in charge of technology. This was one of the major surprises in the survey.

Virtual reality will be compelling enough to enhance worker productivity and also spawn new addiction problems.

Tech “refuseniks” will emerge as a cultural group characterized by their choice to live off the network. Some will do this as a benign way to limit information overload, while others will commit acts of violence and terror against technology-inspired change.

People will wittingly and unwittingly disclose more about themselves, gaining some benefits in the process even as they lose some privacy.

This is a bizarre hodgepodge of scenarios.

The last item already exists, but the idea of terrorists going around bombing out routers to prevent change is too Unabomber for words. As for the key item, maintaining control of technology that’s advancing in such leaps and bounds: we must remember that the hot new thing now, Ajax, is based on ten-year-old technologies.

Reading the report, I found most of the ad-hoc responses were the same flowery, empty phrases we’ve heard for years now. The so-called ‘experts’ were ones the report says …have been online since 1993. Wow, that long? Half were online before 1993, half after.

This survey is based on packaged scenarios built on pre-defined assumptions and validated by hand-picked respondents. It’s flawed, almost beyond belief.

Categories
Technology

Ajax Myth Busting

Recovered from the Wayback Machine.

Frontforge has a set of Ajax evaluation criteria that look quite good at first glance, until you drill down into each item. Since the page is created in such a way that I can’t use the right mouse button to copy text in order to make quotes (and it breaks in Safari to boot), you’ll have to read each section first before reading my notes.

Taking each in turn:

#1: Back button, history, and bookmarks

If I create a slideshow using Ajax, according to this list I must make sure that each slide is reflected in the history and accessible via the browser back button, as well as being bookmarkable. However, one of the advantages of slideshows is that I can go through a set of photos and then hit the back button once to leave the page and return to where I started.

There is no contract with users that history or the back button or any such thing is ‘required’ in order for a web application to be successful. That’s all made up. What is required is that one understands how people might use the application and then responds accordingly. So, for the slideshow, rather than trying to force the browser to manage an arbitrarily constructed ‘history’, what one needs to do is understand what the users might want.

In the case of the show, being able to access a specific show ‘page’ using a URL that can either be bookmarked or, more likely, linked could be the critical item. In that case, creating a ‘permalink’ for each page and assigning parameters to the URL, such as the following, would enable a person to return to one specific page.

http://somecompany.com?page=three

Then the JavaScript framework or whatever can parse the URL and load the proper page. As for having to maintain history, I would actually prefer to see history locked behind signed and sealed scripts and not easily accessible by JavaScript developers. This, then, would also include screwing around with the back button.
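A minimal sketch of that parsing step, assuming the hypothetical ?page=three URL above (getPageFromQuery and loadSlide are invented names, not any particular framework’s API):

```javascript
// A sketch only: parse the hypothetical ?page=three parameter out of the
// query string. getPageFromQuery and loadSlide are invented names.
function getPageFromQuery(search) {
    var match = /[?&]page=([^&#]*)/.exec(search);
    return match ? decodeURIComponent(match[1]) : null;
}

// In the browser, on page load:
//   var page = getPageFromQuery(window.location.search);
//   if (page) { loadSlide(page); }   // loadSlide: whatever displays a slide
```

The point is that the ‘bookmark’ problem is solved by an ordinary URL, without touching the browser’s history at all.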

#2: standard and custom controls

In this one, the author states that an Ajax framework should have a good set of built-in controls to start a project and then a means to easily extend these for custom behavior. This one I do agree with: a JavaScript library (any class library, really) should include one layer that defines a basic set of behaviors, which is then used by other libraries to build more sophisticated components.

Except that this isn’t what the author is saying. What he or she is saying is that a ‘basic’ behavior is something such as a complete packaged piece of functionality such as a drag and drop or a visual fade, while the more complex behaviors would be something such as a complete mouseover and slideshow or some such thing. Of course, when you see a complex behavior listed as something ‘basic’, it makes sense that customization becomes increasingly difficult. That’s more a matter of incorrect abstraction than making use of something such as JavaScript’s prototype property.

In my opinion, simpler components and good documentation are worth any number of increasingly sophisticated objects.
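To illustrate what I mean by layering, here’s a sketch using nothing but JavaScript’s own prototype chain; Fader and Slideshow are hypothetical names, not taken from any of the libraries discussed:

```javascript
// Hypothetical example of layering: a basic behavior (Fader) that a more
// sophisticated component (Slideshow) builds on through the prototype
// chain, rather than re-implementing the fade itself.
function Fader(element) {
    this.element = element;
    this.opacity = 1.0;
}
Fader.prototype.fadeStep = function () {
    // one step of a fade-out; a real version would also set style.opacity
    this.opacity = Math.max(0, this.opacity - 0.1);
    return this.opacity;
};

function Slideshow(element) {
    Fader.call(this, element);   // reuse the basic behavior's setup
    this.slide = 0;
}
Slideshow.prototype = new Fader(null);
Slideshow.prototype.next = function () {
    this.slide += 1;
    this.opacity = 1.0;          // new slide starts fully opaque
    return this.slide;
};
```

When the abstraction is drawn at this level, customizing a component means overriding one small method, not unpacking a monolithic drag-and-drop-and-fade bundle.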

#3: single page interface

Being able to edit a page’s contents in-page, such as that implemented at Flickr, is what the author is talking about here. I gather there’s now a formal name for this: single-page interface. That’s like Jesse James Garrett’s use of Ajax to describe a set of already existing technologies–if you speak with enough authority, anyone can name anything.

I hereby call this a weblog. Oh, wait a sec…

But the author’s premise goes far beyond this. According to the writing, a page should be split into components, each loaded at runtime and formed into a cohesive whole through JavaScript. So a five-step form starts by loading the first step, and the additional steps are loaded as the person works through the first.

My response is: why? Why do I need to use JavaScript to load components that we already know are going to be needed? I can see using something such as the accordion effect (collapsed sections that hide specific pieces of the form until expanded, such as that being used with WordPress), but it makes no sense to load these individually, starting with the first part and then using Ajax calls to the server to load the rest.

If what’s loaded is altered by what the person has entered, such as providing a list of cities when a state is picked, then it makes sense. But if it’s static content, known ahead of time, we should let the web server do its job, uncomplicated by client-side overriding of basic behavior.
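A sketch of the one case that does justify a round trip, the state-to-cities lookup; the /cities endpoint and both function names are hypothetical:

```javascript
// Sketch of the case where a server call is justified: the city list
// depends on the state just picked. The /cities endpoint and both
// function names are hypothetical.
function cityListUrl(state) {
    return '/cities?state=' + encodeURIComponent(state);
}

function populateCityOptions(cities) {
    // a real version would rebuild a <select>'s options; this just
    // returns the non-empty values that would become options
    return cities.filter(function (c) { return c.length > 0; });
}

// Browser side, on the state select's onchange:
//   var xhr = new XMLHttpRequest();
//   xhr.open('GET', cityListUrl(select.value), true);
//   xhr.onreadystatechange = function () {
//       if (xhr.readyState === 4 && xhr.status === 200) {
//           populateCityOptions(xhr.responseText.split('\n'));
//       }
//   };
//   xhr.send(null);
```

Nothing here requires loading the rest of a static form piecemeal; the server call exists only because the data genuinely isn’t known until the person acts.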

If time is the issue (the author doesn’t really state a good argument for this, but let’s assume time is the issue), then perhaps if we didn’t build such over-engineered Ajax libraries, page loading times wouldn’t be such a problem.

Hiding and displaying components of the form and allowing in-page editing does make sense. But this loading by piecemeal using JavaScript? No.

#4: productivity and maintainability

This is the one that drove me to write this post. This was the ‘money’ item for me.

According to the author, I gather that it’s difficult to find qualified Ajax developers, and it’s an unwieldy burden to get existing developers (page or server-side) up to speed to be able to use all of this Ajaxian goodness. There’s even an OpenAjax movement afoot …to promote Ajax interoperability, standards and education in order to accelerate the global adoption of Ajax. (And it doesn’t hurt one’s business to put membership in this effort into their About page, either. Point of fact to the effort: Ajax does not ‘stand’ for anything–it’s not an acronym. It’s just a simple term that Garrett picked.)

Instead of forcing JavaScript on folks, a better approach (according to the author) is to use existing CSS and HTML elements. How? A couple of different ways.

One way is for the script to take all elements of a certain type and add functionality to each, such as to add validation techniques for each form element (triggered by some behavior such as loss of focus when the person keys away from the field). This isn’t a bad approach at all, though there is usually some JavaScript involved as the person has to attach some information about how a field is to be validated or some such thing. Still, it’s nice not to have to worry about attaching event handlers and capturing the events and processing the data and so on.
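A sketch of that script-side approach, with the rules table and field checks as hypothetical examples; the browser wiring is shown in comments:

```javascript
// Sketch of the script-side approach: the validation rules live in a
// small configuration object rather than in invented HTML attributes.
// The rule names and patterns here are hypothetical examples.
var checks = {
    email: function (value) { return /^[^@\s]+@[^@\s]+$/.test(value); },
    required: function (value) { return value.length > 0; }
};

function validateField(ruleName, value) {
    var check = checks[ruleName];
    return check ? check(value) : true;   // fields with no rule always pass
}

// Browser side: wire every input's blur event to the same routine.
//   var fieldRules = { email: 'email', username: 'required' };
//   var inputs = document.getElementsByTagName('input');
//   for (var i = 0; i < inputs.length; i++) {
//       inputs[i].onblur = function () {
//           if (!validateField(fieldRules[this.name], this.value)) {
//               flagError(this);   // hypothetical error display
//           }
//       };
//   }
```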

However, according to ‘declarative’ Ajax, this additional information isn’t provided in JavaScript. Instead of using script to provide information to the program (no matter how simply this could be done), the Ajax framework developers have ‘extended’ HTML elements to include new attributes. With this, the poor Ajax-deprived page designer and developer need never get their fingers dirty by actually touching JavaScript.

Sounds good; one problem: these methods ‘break’ the page, leaving invalid HTML in their wake.

You don’t have to take my word for it. This page discusses the “right way to do Ajax is Declaratively”. The page validates, but then it’s not using any of the Ajax frameworks. One of the frameworks mentioned, Hijax, also validates, because it’s using standard HTML elements (list items) and the class attribute (standard for all HTML elements) to maintain its effects. The other libraries, such as hInclude and FormFaces, do not validate. Why not? Because they’re using non-standard attributes on the HTML elements.
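For comparison, the Hijax-style marker can be carried in the standard class attribute, which keeps the markup valid; a sketch, with the marker names as hypothetical examples:

```javascript
// Sketch of the valid-markup route: the marker rides in the standard
// class attribute, e.g. <input type="text" class="validate required">,
// so the page still validates. Marker names are hypothetical.
function hasMarker(className, marker) {
    var tokens = className.split(/\s+/);
    for (var i = 0; i < tokens.length; i++) {
        if (tokens[i] === marker) { return true; }
    }
    return false;
}

// Browser side: behavior is attached only to elements that opt in.
//   if (hasMarker(input.className, 'required')) { attachRequiredCheck(input); }
```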

We’ve spent over a decade finally getting the cruft out of pages. I’m not going to be sanguine about blithely putting more cruft back into the pages because we want a ‘cool’ effect. (For more on this, I recommend Roger Johansson’s Why Standards Still Matter.)

Leaving aside my concerns about how much we’re overloading the ‘class’ attribute, deliberately introducing invalid HTML is not a mark of progress. Leaving aside, too, how this could impact user agents such as text-to-speech browsers, this approach seems to assume that everyone who wants to incorporate Ajax, outside of a select few intelligent enough to understand its complexities, must be protected from the ‘working’ bits. That’s arrogant in the extreme, because there’s nothing overly complex about Ajax at all. It’s actually based on some of the simplest development technologies there are. Well, other than when such are obfuscated in order to puff up the concept.

Moving on to the rest of the Ajax must haves…

#5: client-server

Well, this one is like being agin sin. Client-server is good! OK, we got that one.

There’s little to quibble with on this item: the front end separated from the back; cross-browser differences hidden; communication with the server using XML (though I think this one is negotiable). I am concerned with a growing interest in making all aspects of a web application into Ajax-initiated function calls rather than using server-side templates and such; other than that, I can agree with this point.

#6: XPath targeting

My god, how have we managed to get by until now, having to use JavaScript functions such as document.getElementById? If I’m reading this one correctly, it would seem that Ajax frameworks have to take the DOM and convert it into XPath notation before we can even begin to work with a web page.

There is something to be said for being able to access a page element using XPath notation rather than having to use the DOM’s getChildNodes, but the examples given don’t demonstrate the power of such an operation, and I’m not sure this is as essential as the author makes it out to be.
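For what it’s worth, here’s the comparison in miniature; document.evaluate is the W3C DOM XPath interface (absent from IE at the time), and the ‘sidebar’ id is a made-up example:

```javascript
// The comparison in miniature. In the browser, the XPath route uses
// document.evaluate (the W3C DOM XPath interface, absent from IE6):
//
//   var result = document.evaluate(menuItemsXPath('sidebar'), document,
//       null, XPathResult.ORDERED_NODE_SNAPSHOT_TYPE, null);
//
// versus the plain DOM route:
//
//   var items = document.getElementById('sidebar').getElementsByTagName('li');
//
// The 'sidebar' id is a made-up example; only the expression itself is
// worth factoring out.
function menuItemsXPath(id) {
    return "//div[@id='" + id + "']//li";
}
```

Both lines do the same job; it’s hard to see converting the whole DOM to XPath as a framework must-have.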

#7: comprehensive event model

Luckily, I found I can copy; I just have to use the Edit menu copy, rather than right mouse button. The author writes:

Next-generation Ajax frameworks finally make it possible to put the presentation layer into the browser client where it belongs. This opens up a huge business opportunity to migrate strategic desktop applications to the web where they can deliver value to partners, customers, and remote employees. To do this, it must be possible to define mouse actions and key combinations that trigger events.

Next generation Ajax frameworks may allow this, but I don’t see a mad dash to put desktop applications on the web. This isn’t Oz, and clicking our Ruby slippers together, saying, “There’s no place like the web…there’s no place like the web” isn’t going to make this happen.

I do agree with a standard set of event handlers, and with making sure that each library chains its functionality onto an event so that it doesn’t bump another library’s (or the web page developer’s), but I don’t think this means we can move Adobe Photoshop onto the web. I don’t think this is something users are demanding.

As for making a set of ‘custom’ events, such as NextMonth for calendars, this undermines much of the concept of abstraction by taking a general event (such as a mouse click) and blurring the lines between it and a business event (NextMonth). This not only adds to the complexity of the frameworks, it adds to the size of the code, as well as the points of breakage any time a new browser is released (or a new library included in the application).

What is really needed is a basic understanding of how event handlers are attached to objects, ensuring that no one library adversely impacts another library or the web developer’s efforts. That, and a simplified way of attaching events to objects, is nice. More than that is an example of a developer with too much time on their hands.
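The basic chaining idea is old and small; a sketch of a DOM0-style handler chainer (the function name is mine, not from any library):

```javascript
// Sketch of handler chaining: add a handler to a DOM0 slot (onclick,
// onload, ...) without clobbering one that a library or the page author
// already attached. The function name is mine, not from any library.
function chainHandler(obj, name, fn) {
    var existing = obj[name];
    obj[name] = existing
        ? function () { existing.call(this); fn.call(this); }
        : fn;
}
```

With this, two libraries can both hook window.onload and neither loses its handler; no ‘comprehensive event model’ required.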

#8: state and observer pattern

It always amazes me to see even the simplest concepts obscured whenever the term ‘Ajax’ is involved.

This last item has some good points, but they’re heavily obscured by design pattern terminology such as “Observer pattern” and so on. Yes, I know that the use of such terms is to make communication easier, so that when a person writes “Observer pattern” we understand what they mean without them going into details. But then this means that the reader has to stop reading, go to the pattern page, read the pattern, come to an understanding, return, and then try to decipher whether what the person is writing is what was meant by the pattern, and so on.

I understand what is meant by ‘observer pattern’, and I can agree it would be nice to attach a behavior (register) to a form button that allows it to enable itself when certain form fields are completed. The whole thing is based on storing information about the ‘state’ of the object, which, as the author of this list of Ajax framework must-haves notes, isn’t part of basic JavaScript behavior.

Except that it is. Within the same page, information about the state of an element is stored all the time, and I fail to see why there needs to be a formalized mechanism in place to facilitate this. As for storing state between pages, again, mechanisms are already in place through cookies, server-side storage, and so on.

As for observer/registration, is it that we want all blur events triggered in the page to be funneled through an event handler for the form submit button, which checks to see if it can now enable itself? Or blur events for the target fields? That’s doable, but seems overly process-intensive. I would think a better approach would be to leave the form submit button enabled and then, when the form is submitted, use Ajax (JavaScript) to test whether required fields are filled in and direct the person’s attention to what’s missing. Then event handling is simplified, and the process is invoked only when the form is submitted, not each time a field is filled in.
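The submit-time check amounts to very little code; a sketch, with the field names and the collectValues/highlight helpers as hypothetical examples:

```javascript
// Sketch of the submit-time check: leave the button enabled and test the
// required fields once, when the form is submitted. Field names and the
// collectValues/highlight helpers are hypothetical.
function missingFields(values, required) {
    var missing = [];
    for (var i = 0; i < required.length; i++) {
        var name = required[i];
        if (!values[name] || values[name].length === 0) {
            missing.push(name);
        }
    }
    return missing;
}

// Browser side, on the form's onsubmit:
//   var gaps = missingFields(collectValues(form), ['name', 'email']);
//   if (gaps.length > 0) { highlight(gaps); return false; }  // block submit
```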

In other words, what works well in an application built with VB.NET may not work well, or at all, within a browser, and we should not coerce functionality from one development environment onto another.

That’s the key problem with many of these ‘essential’ items for an Ajax framework: they add unnecessarily to the complexity of the libraries, as well as to their size, the number of points where a break can occur, and the likelihood of failure when more than one library is used or a new browser version is released.

In web development, there’s one important rule: less is more.

Thanks to Ajaxian for the pointer to this list. In the interests of disclosure, note that I am writing a book, Adding Ajax, in which I demonstrate that any web application developer familiar with basic JavaScript can add Ajax benefits and effects to their existing web applications. As such, I’ve violated the premise behind #4 by assuming that just anyone can become an ‘Ajax’ developer.

Categories
Web

Web 1.0 must die

Recovered from the Wayback Machine.

I think Web 2.0 is killing Web 1.0. I think there’s a ‘young eating their parent’ thing going on.

Amazon has been using its resources to put out S3 and the new video service at the same time that the company’s bread & butter online store seems to be taking a hit. Maybe it’s just me and my machines and my internet access, but the site is slow, the redirecting is broken, and the pages are excessively cluttered now.

Google’s another. I really like Google maps, and gmail was OK, but much of the new functionality the company is putting out seems to only appeal to a small group of geeks. The office style products are, to put it delicately, uninspiring. In the meantime, I’ve noticed more and more that searches return links to stores or businesses, rather than useful information. The SEOs are winning, while Google is focused on profitable good works.

Yahoo, on the other hand, bought its way into Web 2.0–perhaps as a strategy to keep from being eaten by the young ‘uns.

eBay bought Skype and PayPal, and we know why it bought PayPal, but I think we’re all collectively scratching our heads over Skype. Meanwhile, the company flounders around trying to find new revenue streams, while its core functionality is being phished to death.

Even the venerable conferences of yore are giving way to SillyValley “Meet Mike” or “Stick it to Tim” shmooze and booze sessions where any pretense of actually discussing technology has given way to breathless panting about startups: hot or not. Isn’t it nice to know that the long tail is being wagged by a puppy?

Categories
Web

And the young eats itself

Recovered from the Wayback Machine.

Liz Gannes at GigaOM writes a story on Evan Williams and Odeo, and Williams’ confession at the recent Web Apps of the Future yak fest. Williams talks about how he royally screwed up with his startup, Odeo, burning through, it sounds like, millions, hiring a staff of 14, all to build a product for an audience the company hadn’t even defined yet. So now that Ev has recognized his mistake, is he going to do better?

So what’s he doing to fix these mistakes? Not refunding the VCs their investment, that’s for sure. And not even trying to earn revenue; Williams freely admitted Odeo hasn’t yet settled on a business model.

I expected Williams to get at least a verbal slap for such, but oh no.

All in all, we can’t say we came out of the presentations convinced Odeo is set to conquer the universe, but Williams’ honesty and humility are admirable. The best part is, his advice has a chance of making an impact while it’s still relevant to today’s startups.

Speechless. I’m speechless.