Categories
RDF SVG Web

Tweaking makes perfect

Not long ago, Tim O’Reilly posted a discussion thread about the importance of practice, and one of the participants in the thread, my long-time editor, Simon St. Laurent, reiterated his interest in practicing this year—both on the trumpet, and in his coding.

I never left programming the way I left trumpet. I simply stopped playing trumpet after eighth grade. I’ve gone back and forth with programming since sixth grade, getting totally into it for a year or two at a time and then departing out of frustration, distraction, or the need to do something else. At O’Reilly, I’m exposed to programming constantly – I edit and write computer books, after all! – but editorial is a long way from actually programming. Even writing books about programming is a seriously meta activity, one that requires more attention to the communication than to the code. (The code has to be right, but – though this may depend on the audience – the explanations have to do a lot more work than the code.) My work isn’t programming practice.

One place I practice is with this site. I still have hopes that I can transform my work with this site into some paying work. At a minimum, I enjoy the tweaking and it keeps me occupied.

I also frequently redesign this site. Doing so allows me to explore new uses of technology, such as SVG for site design, and JavaScript and RDFa in support of semantics. The practice also helps me improve my use of XHTML and CSS, including how to deal with IE without necessarily having to incorporate massive amounts of workaround code. Luckily, the “in” design concepts today are minimalist, so if my site is legible and clean in IE, it doesn’t matter if it’s plain.

I’m not practicing with every hot technology; I’ve made choices about how I spend my time. Yes to PHP, Python, JavaScript, CSS, SVG, RDFa, various web services, and XHTML. No to .NET, Ruby, Java, and cloud computing. A maybe on HTML5 and C++. Not necessarily the best decisions, as Java and .NET are where the money is made, and the folks in Silicon Valley drool when you mention “cloud”, but I really don’t like the technologies or the environments.

Practice is essential for keeping our skills sharp, but that’s not the only reason it’s important. It’s also a way to deal constructively with the constant barrage of unhappy news we’re subjected to. We may not have any control over warring nations, global warming, or the state of the economy, but we do have some control over how we live our lives. And that includes finding pieces of ourselves that can be improved with practice.

Categories
Web

Shock, Awe, Economics, and the Web

Battered into a fetal ball by waves of bad economic news, only surfacing to watch an occasional crash-and-bash flick, such as Iron Man, I discovered my own personal bailout via Naomi Klein’s book, “The Shock Doctrine: The Rise of Disaster Capitalism”. Oddly enough, it wasn’t something that Klein wrote (though she has many interesting points, and I hope to write more on her book at a later time). No, it was a quote from the master of the Chicago School, Milton Friedman himself, that loosed my death grip on self. In the introduction to his book Capitalism and Freedom, Friedman wrote:

only a crisis—actual or perceived—produces real change. When that crisis occurs, the actions that are taken depend on the ideas that are lying around. That, I believe, is our basic function: to develop alternatives to existing policies, to keep them alive and available until the politically impossible becomes politically inevitable.

The irony that the free market system Friedman loved so well is now experiencing its own “shock and awe”, as corporations grasp at despised government intervention like a baby at its bottle, has not escaped me. But for me, the operative phrase in this quote is “the actions that are taken depend on the ideas that are lying around”. What follows is totally irreverent, given the problems we’re facing, and I apologize in advance for seeming to trivialize these very difficult times, but when I read this phrase I thought to myself, “Internet Explorer, your days are numbered.”

Consider this: movement forward on the web has been stymied in recent years because, we’re told, thousands of corporate intranets, and the millions of corporate employees using those intranets, are dependent on tricks and hacks put into place to support Internet Explorer 6. Add to this the, in my opinion, anal fixation that web pages must look the same in every browser, and most of our page design has been stuck like a bug in pitch.

Now that the corporations are downsizing in order to preserve what they can of executive compensation, the machines on which these applications run are being sold for scrap, tossed out along with the other chaff (i.e., employees). And those still employed, frankly, have other concerns than whether IE supports opacity or not.

I don’t believe I’m alone in seeing the Friedmanesque possibilities of our current economic disaster. What better explanation for the recent production release of Google’s Chrome browser? Google released Chrome from beta with a speed for which the company is not known. After all, isn’t Gmail still in beta? Come to that, isn’t the Google search engine still in beta?

Then there’s the fact that Chrome is currently supported only on Windows, just like IE. Only like IE, as a matter of fact. No, I am sure that Google sensed corporate shock and moved quickly to displace IE in the hearts and minds of upper management—not to mention the hearts and minds of millions of newly unemployed workers who are no longer subject to the intransigence of corporate IT. If, by doing so, Chrome also kicks Firefox, Opera, and Safari in the face in its haste, eh, casualties of war.

I am not displeased by Google’s move. After all, Chrome supports XHTML and some SVG, both of which Microsoft seems incapable of implementing. However, there is some confusion about what Chrome is, or is not, capable of supporting. True, Chrome is built on the excellent WebKit, which also serves as the soul of Safari. However, as others have discovered and my new experimentation in web design demonstrates, Chrome uses a different graphics engine (Skia) than Safari/WebKit. In the interest of “stripping” down the browser to make it lean and mean for web applications, the developers also made it rather, um, unattractive. At least for now. If you view this web page using Chrome, you will see that Chrome currently does not support the CSS3 text-shadow property, though it does support box-shadow. It also supports border-radius, though badly—the anti-aliasing is less than optimal, as is the support for alpha transparency.

While it is true that text-shadow, box-shadow, and border-radius are CSS3 properties, and thus not part of a released specification, they are supported in Safari 3.1 (and in Firefox 3.1, and partially in Opera 9.x). Because of the WebKit tie-in between Safari 3.1 and Chrome, people may be confused when what works in Safari does not work in Chrome. Well, those people who don’t have other, more pressing, worries.
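For anyone who hasn’t played with these properties yet, here’s a minimal sketch of the sort of rules I’m talking about; the selector and values are purely illustrative, not this site’s actual stylesheet:

    /* Illustrative only; not this site's actual stylesheet. */
    .entry {
      /* Soft shadow behind the text; Chrome currently ignores this. */
      text-shadow: 2px 2px 3px rgba(0, 0, 0, 0.4);

      /* Rounded corners, with prefixed forms for the WebKit and Gecko engines. */
      -webkit-border-radius: 10px;
      -moz-border-radius: 10px;
      border-radius: 10px;

      /* Shadow around the box, using alpha transparency in the color. */
      -webkit-box-shadow: 0 0 8px rgba(0, 0, 0, 0.5);
      -moz-box-shadow: 0 0 8px rgba(0, 0, 0, 0.5);
      box-shadow: 0 0 8px rgba(0, 0, 0, 0.5);
    }

The same stylesheet, viewed in Chrome today, loses the text shadow entirely and renders the rounded, shadowed box noticeably more roughly than Safari does.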

Screenshot of Chrome in action

Categories
Technology Web

Progressive Enhancement and Graceful Degradation

A List Apart has a timely article titled Understanding Progressive Enhancement, discussing the perceptual differences between graceful degradation and progressive enhancement. I enjoyed seeing Steve Champeon’s idea given new light. Additionally, now is as good a time as any to have a go at these topics, with the many new enhancements being added to today’s browsers while antiques still clutter cyberspace. I could have done without the cloyingly cute M&M analogy in the article, but that’s probably my inner Cranky Woman having a go this AM.

I’ve written about graceful degradation previously. Graceful degradation means applying modern technology while ensuring the application doesn’t negatively affect those viewing a web site with an Antique (which shall remain nameless). However, contrary to the ALA author’s statement that “Under this paradigm, older browsers are expected to have a poor, but passable experience”, graceful degradation is just that: gracefully degrading, meaning that though the person using the Antique doesn’t get all the bells and whistles, their experience at the site is more than “poor but passable”.

Progressive enhancement, on the other hand, begins with the content rather than the technology, ensuring that the markup used to organize the content is semantically correct and valid. Then, and only then, does the web site developer progress to the use of CSS and JavaScript, both to annotate and to enhance the content. That’s been the primary difference between the two approaches: graceful degradation tends to focus on the technology first, while progressive enhancement focuses on the content first.
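A rough sketch of that layering, with made-up markup rather than anything from an actual site: the content stands on its own as plain, semantic XHTML, and the script only adds its extras when the browser can handle them:

    <!-- Illustrative markup only: the content works with no CSS and no script at all. -->
    <div class="essay">
      <h2>Practice makes perfect</h2>
      <p>Readable in any browser, including the Antiques.</p>
    </div>

    <script type="text/javascript">
    // Enhancement layer: test for the capability before relying on it.
    if (document.getElementById && document.addEventListener) {
      // Flag the page so richer CSS rules and behaviors can hook onto it;
      // browsers that fail the test simply get the unadorned content.
      document.documentElement.className += " enhanced";
    }
    </script>

The point is the order of operations: the markup carries the meaning, and the stylesheet and script are additions that can fail without taking the content with them.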

Of course, the two are not exclusive: one can use progressive enhancement techniques, beginning with the content and working outward, paying particular attention to the semantics of the markup, and then apply graceful degradation when adding the CSS and JavaScript. In particular, when using content management systems such as Drupal and WordPress, it’s important not to neglect the semantics by focusing overmuch on the themes, widgets, and other, frequently annoying, gewgaws.

Categories
Burningbird Web

q=topic&subject=Google&opinion=sucky

This site, like most others built using a content management system, rewrites its dynamic URLs into a static format, primarily to make them more readable. More portable, too, as we move our writings from CMS to CMS.
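For anyone who hasn’t poked at the underpinnings, this is roughly what such a rewrite looks like; a hypothetical Apache mod_rewrite sketch, not the rules actually used on this site:

    # Hypothetical .htaccess sketch, not this site's actual rules.
    # The readable, "static" URL   /archives/2008/static-versus-dynamic
    # is quietly mapped back to    /index.php?year=2008&slug=static-versus-dynamic
    RewriteEngine On
    RewriteRule ^archives/([0-9]{4})/([a-z0-9-]+)/?$ index.php?year=$1&slug=$2 [L,QSA]

Get the pattern wrong, or leave stray parameters dangling off the “static” form, and the crawler (or a reader) ends up somewhere you didn’t intend, which is exactly the failure Google is worried about.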

Google has come out with an odd post about static versus dynamic URLs, arguing that it’s better for the Googlebot if you leave your URLs dynamic, because people screw up the rewrite rules. If you leave the URL dynamic, the Googlebot can figure out what it needs from the URL. However, if you rewrite it as a static URL but leave dynamic pieces in, such as a page number or the like, the Googlebot may interpret the URL incorrectly.

At least, this is my interpretation of the post, and, judging from the comments, other people’s interpretation as well.

The focus of Google’s suggestion is search engine optimization, and so it’s probably only of interest to the SEO types. However, when Google writes posts like this, they ripple out like waves on a pond after a big stone is dropped in. Within a week or two, I’m sure we’ll be hearing that the “best practice” for URLs is now to use dynamic, not static, URLs, regardless of the reasoning behind the original advice.

No more permalinks for you, WordPress folks. Or smart URLs for the Drupal users. Be brave, and show your parameters.

Or not.

Categories
Political Web

The web, attention, and truth

Tim Berners-Lee introduced the World Wide Web Foundation a couple of days ago. The focus of the organization, according to the site, is to help make the web more open, robust, and accessible, all of which are commendable goals. But then Berners-Lee mentioned ensuring the quality of the web through some kind of labeling system.

Short Sharp Science responded with:

Web licences to ensure that people only read sites they can handle are the next logical step. Fortunately it’s much more likely that the whole idea will quietly be forgotten, which will at least prevent Berners-Lee receiving one of the first “potentially misleading” badges for thinking it up in the first place. Let’s hope the World Wide Web Foundation and its laudable goals have a rosier future.

Karl Martino lists other responses, but also brings up another effect associated with the “truthy label”.

Take the current campaign for President. How could a labeling scheme help or hurt?…I guarantee you a labeling scheme, in the political sphere, would favor those who could utilize attention influence the most effectively, and have little to do with actual ‘truth’.

However, I don’t think we have to worry about the truthy battle any time soon, as I’m not seeing much interest in this announcement. Oh, mention of it has appeared here and there, such as in Karl’s post, in Short Sharp Science, and in this post by yours truly. But most of the web community is focused on some new advance in one or another of the browsers, the implementation of a new CSS3 or HTML5 feature, or the invention of yet another server-side language that will kill all the others. Well, with an occasional picture of a cat, vacation, or cute little cherub (because we do not live by tech alone).

Either the seeming indifference is due to the fact that the web has grown far beyond the reach of even its original inventor, and few believe that this effort will have much of an impact. Or we’ve been hit with so many new “initiatives” that all we care about now is what’s working, what’s broke, and trying to ensure pieces of the former don’t become part of the latter.