Categories
Web

Shock, Awe, Economics, and the Web

Battered into a fetal ball by waves of bad economic news, only surfacing to watch an occasional crash and bash flick, such as Iron Man, I discovered my own personal bailout via Naomi Klein’s book, “The Shock Doctrine: The Rise of Disaster Capitalism”. Oddly enough, it wasn’t something that Klein wrote (though she has many interesting points and I hope to write more on her book at a later time). No, it was a quote by the master of the Chicago School, Milton Friedman himself, that loosed my death grip on self. As an introduction to his book, Capitalism and Freedom, Friedman wrote:

only a crisis—actual or perceived—produces real change. When that crisis occurs, the actions that are taken depend on the ideas that are lying around. That, I believe, is our basic function: to develop alternatives to existing policies, to keep them alive and available until the politically impossible becomes politically inevitable.

The irony that the free market system Friedman loved so well is now experiencing its own “shock and awe”, as corporations grasp at despised government intervention, like a baby its bottle, has not escaped me. But for me, the operative phrase in this quote is “the actions that are taken depend on the ideas lying around”. What follows is totally irreverent, given the problems we’re facing, and I apologize in advance for seeming to trivialize very difficult times, but when I read this phrase I thought to myself, “Internet Explorer, your days are numbered.”

Consider this: movement forward on the web has been stymied in recent years because, we’re told, thousands of corporate intranets, and the millions of corporate employees using them, are dependent on tricks and hacks put into place to support Internet Explorer 6. Add to this the, in my opinion, anal fixation that web pages must look the same in every browser, and most of our page design has been stuck like a bug in pitch.

Now that the corporations are downsizing in order to preserve what they can of executive compensation, the machines on which these applications run are being sold for scrap, tossed out along with the other chaff (i.e. employees). And those still employed, frankly, have other concerns than whether IE supports opacity or not.

I don’t believe I’m alone in seeing the Friedmanesque possibilities of our current economic disaster. What better explanation for the recent production release of Google’s Chrome browser? Google released Chrome from beta with a speed for which the company is not known. After all, isn’t Gmail still in beta? Come to that, isn’t the Google search engine still in beta?

Then there’s the fact that Chrome is currently supported only on Windows, just like IE. Only like IE, as a matter of fact. No, I am sure that Google sensed corporate shock, and moved quickly to displace IE in the hearts and minds of upper management, not to mention the hearts and minds of millions of newly unemployed workers who are no longer subject to the intransigence of corporate IT. If by doing so, Chrome also kicks Firefox, Opera, and Safari in the face in its haste, eh, casualties of war.

I am not displeased by Google’s move. After all, Chrome supports XHTML and some SVG, both of which Microsoft seems incapable of implementing. However, there is some confusion about what Chrome is, or is not, capable of supporting. True, Chrome utilizes the excellent WebKit, which also serves as the soul of Safari. However, as others have discovered and my own recent experimentation in web design demonstrates, Chrome uses a different graphics engine (Skia) than Safari/WebKit. In the interests of “stripping” down the browser to make it lean and mean for web applications, the developers also made it rather, um, unattractive. At least for now. If you view this web page using Chrome, you will see that Chrome currently does not support the CSS3 text-shadow property, though it does support box-shadow. It also supports border-radius, though badly: the anti-aliasing is less than optimal, as is the support for alpha transparency.

While it is true that text-shadow, box-shadow, and border-radius are CSS3 properties, and thus not part of a released specification, they are supported in Safari 3.1 (and in Firefox 3.1, and partially in Opera 9.x). Because of the WebKit tie-in between Safari 3.1 and Chrome, people may be confused when what works in Safari does not work in Chrome. Well, those people who don’t have other, more pressing, worries.
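For reference, here is a rough sketch of the kind of rules in question. The selector and values are hypothetical, but the properties, along with the vendor-prefixed forms the browsers currently read, are the ones under discussion:

/* hypothetical rule, for illustration only */
#sidebar h2
{
  text-shadow: 2px 2px 3px #333;          /* the property Chrome currently ignores */
  -moz-box-shadow: 2px 2px 5px #999;      /* Firefox's prefixed form */
  -webkit-box-shadow: 2px 2px 5px #999;   /* WebKit's prefixed form */
  -moz-border-radius: 10px;
  -webkit-border-radius: 10px;
}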

Screenshot of Chrome in action

Categories
Technology Web

Progressive Enhancement and Graceful Degradation

A List Apart has a timely article titled Understanding Progressive Enhancement discussing the perceptual differences between graceful degradation and progressive enhancement. I enjoyed seeing Steve Champeon’s idea given new light. Additionally, now is as good a time as any to have a go at these topics, with the many new enhancements being added to today’s browsers while antiques still clutter cyberspace. I could have done without the cloyingly cute M&M analogy in the article, but that’s probably my inner Cranky Woman having a go this AM.

I’ve written about graceful degradation previously. Graceful degradation means applying modern technology while ensuring the application doesn’t negatively affect those viewing a web site with an Antique (remaining nameless). However, contrary to the ALA author’s statement that “Under this paradigm, older browsers are expected to have a poor, but passable experience”, graceful degradation is just that: gracefully degrading, meaning that though the person using the Antique doesn’t get all the bells and whistles, their experience at the site is more than “poor but passable”.

Progressive enhancement, on the other hand, begins with the content rather than the technology, ensuring that the markup used to organize the content is semantically correct and valid. Then, and only then, does the web site developer progress to the use of CSS and JavaScript, both to annotate and to enhance the content. That’s been the primary difference between the two approaches: graceful degradation tends to focus on the technology first, while progressive enhancement focuses on the content first.
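To make the distinction concrete, here is a small, hypothetical sketch of the enhancement step. The element id and class name are invented, but the pattern is the usual one: test for support before layering anything on top of the semantic markup.

// hypothetical sketch: enhance only when the browser can handle it,
// leaving the plain, semantic markup for the Antiques
window.onload = function () {
  if (!document.getElementById) return;          // no DOM scripting support, stop here
  var nav = document.getElementById("sitenav");  // invented id for a navigation list
  if (nav) {
    nav.className += " enhanced";                // CSS hook that layers on the extras
  }
};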

Of course, the two are not exclusive: one can use progressive enhancement techniques, beginning with the content outward and paying particular attention to the semantics of the markup, and then apply the technique of graceful degradation when applying CSS and JavaScript. In particular, when using Content Management Systems such as Drupal and WordPress, it’s important not to neglect the semantics by focusing overmuch on the themes, widgets, and other, frequently annoying, gewgaws.

Categories
Burningbird Web

q=topic&subject=Google&opinion=sucky

This site, like most others built using a content management system, rewrites dynamic URLs into a static format, primarily to make them more readable. More portable, too, as we move our writings from CMS to CMS.
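For those who have never had to set one up, the rewriting usually amounts to little more than a rule in the web server configuration. The following is a hypothetical Apache mod_rewrite example, not this site’s actual rule, mapping a readable URL back to the CMS’s dynamic query string:

# hypothetical rule: serve /archives/some-post-name from the CMS's
# dynamic URL, index.php?q=some-post-name, while keeping the readable
# form in the address bar
RewriteEngine On
RewriteRule ^archives/([a-z0-9-]+)/?$ index.php?q=$1 [L,QSA]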

Google has come out with an odd post about static versus dynamic URLs, suggesting it’s better for the Google bot if you leave your URLs dynamic, because people screw up the rewrite rules. If you leave the URL dynamic, the Google bot can figure out what it needs from the URL. However, if you rewrite it as a static URL but leave dynamic pieces in, such as a page number or the like, the Google bot may interpret the URL incorrectly.

At least, this is my interpretation of the post and, judging from the comments, other people’s interpretation as well.

The focus of Google’s suggestion is search engine optimization, and so probably only of interest to the SEO types. However, when Google writes posts like this, they ripple out like waves on a pond after a big stone is dropped in. Within a week or two, I’m sure we’ll be hearing that the “best practice” for URLs is now to use dynamic, not static, URLs, regardless of the reason behind it.

No more permalinks for you WordPress folks. Or smart URLs for the Drupal users. Be brave, and show your parameters.

Or not.

Categories
Political Web

The web, attention, and truth

Tim Berners-Lee introduced the World Wide Web Foundation a couple of days ago. The focus of the organization, according to the site, is to help make the web more open, robust, and accessible, all of which is commendable. But then Berners-Lee mentioned ensuring the quality of the web through some kind of labeling system.

Short Sharp Science responded with:

Web licences to ensure that people only read sites they can handle are the next logical step. Fortunately it’s much more likely that the whole idea will quietly be forgotten, which will at least prevent Berners-Lee receiving one of the first “potentially misleading” badges for thinking it up in the first place. Let’s hope the World Wide Web Foundation and its laudable goals have a rosier future.

Karl Martino lists other responses, but also brings up another effect associated with the “truthy label”.

Take the current campaign for President. How could a labeling scheme help or hurt?…I guarantee you a labeling scheme, in the political sphere, would favor those who could utilize attention influence the most effectively, and have little to do with actual ‘truth’.

However, I don’t think we have to worry about the truthy battle any time soon, as I’m not seeing much interest in this announcement. Oh, mention of it has appeared here and there, such as in Karl’s post and in Short Sharp Science, and including this post by yours truly. But most of the web community is focused on some new advance in one or another of the browsers, implementation of a new CSS3 or HTML5 feature, or the invention of yet another server-side language that will kill all others. Well, with an occasional picture of a cat, vacation, or cute little cherub (because we do not live by tech alone).

Either the seeming indifference is due to the fact that the web has grown far beyond the reaches of even its original inventor, and few believe that this effort will have much of an impact. Or we’ve been hit with so many new “initiatives” that all we care about now is what’s working, what’s broke, and trying to ensure pieces of the former do not become part of the latter.

Categories
Technology Web

Opacity returns to IE8

Recovered from the Wayback Machine.

When IE8 beta 1 was released, there was a minor uproar over the fact that Microsoft had dropped support for its proprietary version of opacity, while not providing support for the newer CSS-based opacity.

Gone were the days when the following CSS setting would change the opacity of an element in all of the major browsers:

#somediv
{
  opacity: 0.0;              /* CSS opacity, for the standards-based browsers */
  filter: alpha(opacity=0);  /* Microsoft's older proprietary filter */
}

If you wanted to get opacity to work with IE8, either you’d have to have your users turn on the IE7 compatibility mode, or you’d have to add a meta tag to your web page:

<meta http-equiv="X-UA-Compatible" content="IE=EmulateIE7" />

Yesterday, Microsoft released a post on the IEBlog that had both good and bad news. The good news was that opacity was back. The bad news was that setting opacity in such a way that IE8 would process it as IE8 is now more complicated than ever.

It seems the reason Microsoft pulled the old filter syntax is that the format was not CSS 2.1 compatible. However, according to comments in the post, Microsoft couldn’t just transfer the opacity functionality over to the CSS approach, because the behavior of the two, CSS opacity and the Microsoft filter, differs.

Due to the gently incessant requests (!?) of web developers, Microsoft has added the opacity filter back in. However, the company is now using the naming convention standardized for browser-based CSS extensions, which means the setting still meets the CSS 2.1 requirements. Where the old, formalized, filter setting looked like the following:

filter: progid:DXImageTransform.Microsoft.Alpha(Opacity=0)

That setting contained illegal characters, including the equal sign. Microsoft now uses the following:

-ms-filter: "progid:DXImageTransform.Microsoft.Alpha(Opacity=0)";

Notice the use of the “-ms-” prefix on the filter, and the quotes that enclose the setting and hide the illegal characters.

Of course, for the opacity setting to work with both IE7 and IE8, we have to use both formats. According to Microsoft, we have to list the new extension format first, then the older setting, in addition to the CSS opacity setting:

-ms-filter: "progid:DXImageTransform.Microsoft.Alpha(Opacity=0)";
filter: progid:DXImageTransform.Microsoft.Alpha(Opacity=0);
opacity: 0;

I created a simple example to demonstrate how opacity would work, honoring both formats. It includes setting opacity with JavaScript, and works with Firefox, Safari, Opera, and IE8, as well as with IE7 compatibility mode. Click the image several times to hide one image and expose another.

// the Alpha filter expects an opacity value between 0 and 100
theobj.style.filter='progid:DXImageTransform.Microsoft.Alpha(Opacity=' + opacity + ')';
// the CSS opacity property expects a value between 0 and 1
theobj.style.opacity=value;

The challenge isn’t over yet, though. The images in the first test are JPEGs, but I also tried the example with PNGs that have alpha transparency. Unfortunately, while IE8 beta 2 supports opacity, if opacity settings are applied to an image, or to an element containing an image, and that image is a PNG with alpha transparency, the transparency effect is lost when the opacity is changed.

In older versions of IE, PNG alpha transparency was set with the AlphaImageLoader, like the following:

#div1 img {
  filter: progid:DXImageTransform.Microsoft.AlphaImageLoader(src=fig0902.png,
          sizingMethod='scale');
}

In IE8, which normally supports PNG transparency on its own, the alpha transparency is lost when the opacity filter is changed. To compensate, when the containing element’s opacity is changed, the image’s alpha transparency also has to be set in script, as demonstrated in a second example:

document.getElementById("img1").style.filter = "progid:DXImageTransform.Microsoft.AlphaImageLoader(src=fig0902.png,sizingMethod='scale')";

Since alpha transparency in PNGs is supported in IE8 (and IE7), the CSS setting doesn’t seem to need the new Microsoft extension naming. However, unless the image’s alpha transparency setting is “reset” after changing the opacity, the transparency is lost when the application is run as IE8, as you can see from a third example. Oddly enough, the problem with the alpha transparency on the PNGs doesn’t happen in IE7 compatibility mode. The only thing I can think is that more than the name has changed with IE8’s opacity implementation.
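Pulling the pieces together, the workaround looks roughly like the following. This is a sketch along the lines of my examples rather than the exact code; the function name is mine, and the element ids and image file match the earlier snippets.

// rough sketch: change the element's opacity, then re-apply the
// AlphaImageLoader filter so IE8 doesn't drop the PNG's alpha transparency
function setOpacity(value)   // value is between 0 and 1
{
  var obj = document.getElementById("div1");
  var img = document.getElementById("img1");

  // IE: the Alpha filter expects a value between 0 and 100
  obj.style.filter = "progid:DXImageTransform.Microsoft.Alpha(Opacity=" + (value * 100) + ")";

  // everyone else: the CSS opacity property
  obj.style.opacity = value;

  // reset the PNG's alpha transparency, which IE8 loses when the opacity changes
  img.style.filter = "progid:DXImageTransform.Microsoft.AlphaImageLoader(src=fig0902.png,sizingMethod='scale')";
}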

Of course, if you use the EmulateIE7 meta tag, you don’t have to muck around with the new opacity extension or with resetting the PNG filter, but you also don’t get the other CSS 2.1 standard support that comes with IE8.