Categories
SVG

SVG Planet

I noticed this thread on the SVG Interest Group email list about an official location for the SVG community page. Sounds like the group will be using the planetsvg.com site, though planet sites are typically aggregation sites. However, a “planet” can be many things, and the name is comprehensive.

I still have planetsvg.org created to provide an SVG aggregator site, but I’ve not had luck finding feeds for people writing about SVG, nor have I received many requests (any) to be included. I’m not an “official” member of the SVG inner group, so I’m not sure about continuing the site. Perhaps if I can find more feeds that actually work and are related, at least in some way, to SVG, I may continue the effort. Otherwise I might as well close the site down.

Categories
Graphics/CSS Photography

Gimp 2.6 alive and well on the Mac

GIMP 2.6 is now available for Mac OS X, in addition to Windows and Linux. On the Mac, you can install it via MacPorts, or you can use a pre-built version of the application, available for both Tiger and Leopard. I have the MacPorts version on my Leopard machine, and the pre-built version on my Tiger laptop.

My first impression of the newest version is that I like the improvements to the user interface. The original application (Toolbox) toolbar has now been merged into the image window, simplifying the interface. The application is still an MDI, or Multiple Document Interface, but it’s simple to keep all of the tool’s components visible.

The necessary photo enhancement tools are all present and accounted for, including Layers, Curves, and the all-important Gaussian blur, as well as several other handy enhancement tools. The application still interfaces with UFRaw, the separately accessible open source tool that provides RAW image pre-processing.

One of the new features in Adobe’s Photoshop CS4, I gather, is seam carving, whereby the tool determines where pixels can be compressed while still maintaining most of the image’s overall look. GIMP 2.6 also incorporates a plug-in, known as Liquid Rescale, that is based on the same algorithm. I didn’t have a photo with a long, unending beach, but I did have a photo of a bright red mum (actually, a dahlia). Following are two versions of the photo: before and after scaling with Liquid Rescale.

Red Mum before

Red Mum after

The red flower is distorted, which isn’t surprising. However, the bud, leaves, and even some of the background are relatively untouched. Interesting effect. The plug-in’s web site has examples that show how to use Liquid Rescaling to enhance photos without obvious distortions.
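For the curious, the core seam carving idea is simple enough to sketch. The plug-in itself (built on the liblqr library, as I understand it) is considerably more sophisticated, but roughly, the algorithm assigns each pixel an “energy”, finds the connected top-to-bottom path of lowest total energy, and removes it; repeat until the image is as narrow as you want. The following is a rough Python sketch of the idea, not the plug-in’s actual code:

import numpy as np

def energy_map(gray):
    # Simple gradient-magnitude energy: high values mark pixels that
    # matter visually, low values mark "compressible" regions.
    dy = np.abs(np.diff(gray, axis=0, append=gray[-1:, :]))
    dx = np.abs(np.diff(gray, axis=1, append=gray[:, -1:]))
    return dx + dy

def find_vertical_seam(energy):
    # Dynamic programming: cheapest connected top-to-bottom path.
    h, w = energy.shape
    cost = energy.astype(np.float64).copy()
    for y in range(1, h):
        for x in range(w):
            left = cost[y - 1, max(x - 1, 0)]
            up = cost[y - 1, x]
            right = cost[y - 1, min(x + 1, w - 1)]
            cost[y, x] += min(left, up, right)
    # Trace the seam back up from the cheapest bottom pixel.
    seam = [int(np.argmin(cost[-1]))]
    for y in range(h - 2, -1, -1):
        x = seam[-1]
        lo, hi = max(x - 1, 0), min(x + 1, w - 1)
        seam.append(lo + int(np.argmin(cost[y, lo:hi + 1])))
    return seam[::-1]

def remove_vertical_seam(img, seam):
    # Drop one pixel per row, narrowing the image by one column.
    h, w = img.shape[:2]
    mask = np.ones((h, w), dtype=bool)
    for y, x in enumerate(seam):
        mask[y, x] = False
    if img.ndim == 3:
        return img[mask].reshape(h, w - 1, -1)
    return img[mask].reshape(h, w - 1)

Low-detail areas (sky, water, that unending beach) carve away almost invisibly, while high-contrast areas resist; once there’s nothing low-energy left to cut, the subject itself starts to distort, which is exactly what happened to my flower.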

Another major change in GIMP 2.6 is the integration of GEGL (the Generic Graphics Library). From the GIMP 2.6 release notes:

Important progress towards high bit-depth and non-destructive editing in GIMP has been made. Most color operations in GIMP are now ported to the powerful graph based image processing framework GEGL, meaning that the internal processing is being done in 32bit floating point linear light RGBA. By default the legacy 8bit code paths are still used, but a curious user can turn on the use of GEGL for the color operations with Colors / Use GEGL.
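In practical terms, “linear light” means the usual 8-bit, gamma-encoded sRGB values get converted to floating point numbers proportional to actual light intensity before an operation runs, then converted back afterwards. A rough sketch of the conversion (my own illustration, not GEGL’s code):

import numpy as np

def srgb8_to_linear(img8):
    # 8-bit sRGB (0-255) to 32-bit floating point linear light.
    s = img8.astype(np.float32) / 255.0
    # Invert the standard sRGB transfer curve.
    return np.where(s <= 0.04045, s / 12.92, ((s + 0.055) / 1.055) ** 2.4)

def linear_to_srgb8(lin):
    # Back to 8-bit sRGB for display or saving to legacy formats.
    lin = np.clip(lin, 0.0, 1.0)
    s = np.where(lin <= 0.0031308, lin * 12.92, 1.055 * lin ** (1.0 / 2.4) - 0.055)
    return np.round(s * 255.0).astype(np.uint8)

The reason it matters: operations that average pixels, such as blurs and scaling, give physically correct results in linear light, whereas averaging the gamma-encoded values is what produces those slightly dark, muddy blends.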

There’s also a GEGL tool, which provides access to several operations, though I’d use caution when applying any of them to a large RAW image. Among the more familiar of the operations is unsharp-mask; among the more interesting is whitebalance, demonstrated in the following snapshot.

The new modifications for GIMP 2.6 go beyond making our photos prettier. The new Brush Dynamics feature is a kick to play with, and one can see how it would be useful when creating specialized effects. With the Dynamics, I can create a wonderfully fun fairy sparkle effect, just by setting the pressure, velocity, and random settings for the brush opacity, hardness, size, and color.

Some of the more popular plug-ins, such as the Layers plug-in, which emulates Photoshop’s layer effects capability, still have not been ported to 2.6. Most of the effects, however, can be created from scratch until the plug-ins are updated. Plus, there’s enough in the basic tool, including the new GEGL operations and the Brush Dynamics, to keep one occupied for hours.

GIMP isn’t the tool for everyone. If you’re proficient with Adobe Photoshop, work in an operating system for which Photoshop is released, and can afford the rather expensive upgrades, you should stay with Photoshop. However, in today’s troubled economic times, with an increased interest in being frugal, you can’t beat GIMP 2.6’s price: donate what you can to the project. In addition, the new Photoshop CS4 won’t run on many older Mac architectures, including both my Leopard and Tiger laptops.

Paired with UFRaw, GIMP gives you what you need for sophisticated photo processing. And with all of the graphics plug-ins, filters, scripts, and so on, you can do most other graphics work with the tool, as I hope to demonstrate more fully in the future.

Categories
Technology

New Apple notebooks

Interesting reading about the new Apple notebooks. They do sound very attractive, but I think people were expecting a little more when it came to the “under $1,000” market. One dollar under is more marketing than real commitment to the changing times. You can’t even buy a gallon of gas, or milk, for under a buck nowadays.

I’m sure all the machines will do well, and many will sell. They do sound innovative, and rather powerful. I won’t be buying a new machine, but good for those that can.

Categories
XHTML/HTML

On the Myths and Realities of XHTML

Recovered from the Wayback Machine.

Tina Holmboe from the XHTML WG has written a concise overview of XHTML titled XHTML—Myths and Realities. It’s a nice overview of the markup, covering the purpose behind the development of XHTML and the state of XHTML today. The only somewhat jarring note is that Tina seems to have gone a bit out of her way not to sell XHTML. Perhaps this seeming “you should really need it before using it” push is the reality part of the topic.

I use content negotiation for my sites, serving up XHTML for those browsers and agents that can process XHTML, and HTML for the rest. I’m looking into embedding RDFa into my text in a new iteration of yet another site design, but my main reason for using XHTML is that I like to keep open the possibility of using inline SVG. I also think that support for XHTML is broader than Tina implies, but again, that could be her trying to downplay any hyperbole about XHTML. (There’s hyperbole about XHTML?)
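As for the content negotiation, the gist is just checking the Accept header the browser sends and picking the MIME type accordingly. Sketched in Python purely for illustration (not the code I actually run; a more robust version would also honor the header’s q-values):

def choose_content_type(accept_header):
    # Serve XHTML to agents that claim to handle it, HTML to the rest.
    # Deliberately naive: no q-value parsing, and IE (which omits
    # application/xhtml+xml from its Accept header) falls through to HTML.
    if accept_header and "application/xhtml+xml" in accept_header:
        return "application/xhtml+xml"
    return "text/html"

# Hypothetical use inside a request handler (the request/response names
# are placeholders, not any particular framework's API):
# content_type = choose_content_type(request.headers.get("Accept", ""))
# response.headers["Content-Type"] = content_type + "; charset=utf-8"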

Though I know this is outside the scope of Tina’s overview, I would have liked more focus on the differences between the HTML5/WHATWG effort and XHTML 2.0. It’s confusing that we have one group supposedly working on an “XHTML 5.0”, and another on XHTML 2.0. Especially when one of the main issues with XHTML 2.0 was XForms, while a milestone recently reached with HTML5 was the incorporation of Web Forms 2.0. But don’t let the “forms” that appears in both names fool you into thinking we have any form of consensus or agreement.

I’m beginning to think that the HTML5 working group should completely and thoroughly remove all support for, and even mention of, XHTML from the HTML5 specification. The group finds extensibility to be anathema, but extensibility through namespaces is the heart and soul of XHTML. Seems to me that any form of XHTML, or nod to XHTML, coming out of the group would be a bastard cousin, at best.

Instead of XHTML coming out of the HTML5 group, perhaps we could look at ways to incorporate the new HTML5 objects into XHTML via namespaces, but through the W3C XHTML path. In other words, honor the extensibility of XHTML, accept the necessity of a closed world for HTML5, and have one path for HTML and a separate path for XHTML, with the twain meeting via the DOM. After all, it’s only serialization differences between XForms and Web Forms 2.0, right?

Or, conversely, we abandon the separate XHTML 2.0 path, and incorporate and embrace extensibility into HTML5. But I’m not one to bank on pigs flying.

I’m not a markup expert, nor am I involved in developing browsers, so perhaps my view is both simplistic and naive. But I can’t help thinking that the HTML5 working group does not have the mindset or interest in extensibility, and at most, will toss bits of seeming extensibility in to placate the noisy. However, this group’s continuing reference to an “XHTML 5” is confusing when you consider there’s a separate, formal upgrade path for XHTML 2.0. The W3C says there’s nothing to worry about because it’s all just serialization under the skin—but it goes beyond just basic serialization techniques, doesn’t it? If it were just serialization technique differences, would the same topics keep arising in the HTML5 WG threads? I mean, if working with RDF has taught me one thing, it’s that converting between two different forms of serialization is trivial—it’s the underlying model that matters.

Really, the W3C is leaving all of this in a bit of a mess.

However, I both digress and am going off on a tangent. This post was about Tina Holmboe’s XHTML overview, which is excellent and worth a read.

(via Simon Willison)

Categories
Technology Web

Progressive Enhancement and Graceful Degradation

A List Apart has a timely article titled Understanding Progressive Enhancement, discussing the perceptual differences between graceful degradation and progressive enhancement. I enjoyed seeing Steve Champeon’s idea given new light. Additionally, now is as good a time as any to have a go at these topics, with the many new enhancements being added to today’s browsers while antiques still clutter cyberspace. I could have done without the cloyingly cute M&M analogy in the article, but that’s probably my inner Cranky Woman having a go this AM.

I’ve written about graceful degradation previously. Graceful degradation means applying modern technology while ensuring the application doesn’t negatively affect those viewing a web site with an Antique (remaining nameless). However, contrary to the ALA author’s statement that “under this paradigm, older browsers are expected to have a poor, but passable experience”, graceful degradation is just that: gracefully degrading, meaning that though the person using the Antique doesn’t get all the bells and whistles, their experience at the site is more than “poor but passable”.

Progressive enhancement, on the other hand, begins with the content rather than the technology, ensuring that the markup used to organize the content is semantically correct and valid. Then, and only then, does the web site developer progress to the use of CSS and JavaScript, both to annotate and to enhance the content. That’s the primary difference between the two approaches: graceful degradation tends to focus on technology first, while progressive enhancement focuses on content first.

Of course, the two are not exclusive: one can use progressive enhancement techniques, beginning with the content and working outward, paying particular attention to the semantics of the markup, and then apply graceful degradation when adding the CSS and JavaScript. In particular, when using content management systems such as Drupal and WordPress, it’s important not to neglect the semantics by focusing overmuch on themes, widgets, and other, frequently annoying, gewgaws.