Categories
Books JavaScript

Douglas Crockford’s Good Parts of JavaScript

Recovered from the Wayback Machine.

My editor at O’Reilly sent me a copy of Douglas Crockford’s JavaScript: The Good Parts, which I found to be both interesting and useful. The volume is slim, 153 pages in all, but packed full of information about JavaScript—the good parts and the bad.

I found myself nodding more than once, raising my eyebrow a couple of times, and chuckling a time or two, especially when reading about the equality operators, those “evil twins” of JavaScript, according to Crockford.
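For anyone who hasn't yet met the "evil twins": the trouble is that == and != coerce their operands before comparing, with results that are neither intuitive nor transitive. The following are the kinds of cases the book catalogs; try them in any browser console:

```javascript
// The "evil twins" == and != coerce their operands before comparing:
console.log('' == '0');          // false
console.log(0 == '');            // true
console.log(0 == '0');           // true
console.log(false == 'false');   // false
console.log(false == '0');       // true
console.log(null == undefined);  // true

// The strict operators === and !== never coerce:
console.log(0 === '');           // false: different types
console.log(null === undefined); // false: different types
```

Crockford's advice, unsurprisingly, is to always use === and !==.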

The book is not intended for neophyte developers, or even those new to JavaScript. It does, however, give you insight into the mind of a very experienced JavaScript developer. Such insight can provide the information necessary to take you from being familiar with JavaScript to being experienced with JavaScript. In particular, the book’s fourth chapter on Functions is, by itself, worth the price of the book.

Crockford writes clearly and without pretension, which I found refreshing. His aim is to clarify, but without pandering. He doesn’t hold you by the hand, and don’t expect him to explain every last bit of the subjects introduced. However, reading through his material is a nice confirmation as to whether your understanding of JavaScript is comprehensive, or if you’re missing out on some of the bits. I particularly liked his chapter on regular expressions, because I suck at regular expressions.

You’ll also be served a hefty dose of Crockford’s opinions on the JavaScript language, which is no bad thing. I didn’t necessarily agree with all of his opinions, such as avoiding the use of new, but I liked reading the opinions because they help me question my own use of the JavaScript language: is this necessary? Could this be improved? Why am I doing this?

I don’t usually have opinions, good or bad, about components of a language. I either like the language, and learn to adapt to the awkward pieces; or I don’t like the language at all. I like JavaScript, so I tend to like all of JavaScript, even the grungy parts. If there’s one thing I consider to be a “bad” part of JavaScript, it is the experts in JavaScript who tell us not to do this or that, but either give no reason for their opinion, or the reason they give borders on the obtuse and cryptic—assuming that we, of course, know exactly what they’re talking about (if we’re worth anything as programmers, goes the implication).

Reading Crockford laying out his opinion as to what he considers “bad” in JavaScript, and why, in clear, unambiguous terms—with examples—is like a breath of fresh air. His doing so is also worth the price of the book (leaving me to wonder whether I should, in all fairness, pay twice). I can only hope other experts, in whatever language, follow his lead.

My only quibble with the book is one paragraph in the chapter on Objects, and that’s because I’m still puzzled by what I read. Crockford writes:

The simple types of JavaScript are numbers, strings, booleans (true and false), null, and undefined. All other values are objects. Numbers, strings, and booleans are object-like in that they have methods, but they are immutable. Objects in JavaScript are mutable keyed collections.

My understanding of immutable objects is that they are still objects, not some form of pseudo-object or second-class object. If I had been a tech reviewer for this book, I would have asked for additional clarification of this paragraph in the text.

Other than this relatively minor quibble, though, I really enjoyed this book. It is a nice read, and invaluable for any JavaScript developer.

Categories
Just Shelley

The long way home

Recovered from the Wayback Machine.

Before weblogging and RSS—long before Facebook, Twitter, or the next poor bastard service, doomed to be worshiped and then sacrificed on some given Friday—I used to write long essays I’d publish online by hand editing the HTML and posting the static files. Having to manually create the HTML template and design, incorporate navigation, and craft the links and images, took a considerable amount of time.

To justify the time, I wanted to make sure that what I published was worth the effort. I would research a story and edit and re-edit it, and look for additional resources, and then re-edit the story again. My one essay on the giant squid actually took two months to research, and days, not minutes, to edit. Even after publication, I would tweak the pages as old links died, or to refine a section of the writing.

Now, we have wonderful tools to make it easy to put writing or other content online. We can think of a topic, create a writing about it, and publish it—all in five or less minutes. We’ve also come to expect that whatever is published is read as quickly. We’ve moved from multi-page writings, to a single page, to a few paragraphs, to 140 characters or less. Though there is something to be said for brevity, and it takes a true master to create a mental image that can stand alone in 140 characters or less, there still is a place for longer writings. We don’t have to be in a continuous state of noise; a race to create and to consume.

Other than a few posts, such as this one, all writings at Just Shelley will be spread across pages, not paragraphs or characters. Such length will, naturally, require a commitment of your time in addition to your interest. However, I can’t guarantee that your time will be well spent, or even that your interest will be held (though the former will, of course, depend on the latter). All I can guarantee is that I probably took longer to create the writing than you will in reading it.

I am using a tool to publish, true, and even providing an Atom feed. There are no categories, tags, or taxonomies, though, because everything here fits under one bucket: it is something that interests me. Taxonomies would just clutter the site’s zen-like structure, as well as set expectations I’m almost certainly not going to fulfill.

To further add to my state of web regression, I’ve not enabled comments, though I’d love to hear from you through some other means. As anachronistic as it may seem nowadays, this is not a site that’s community built. It’s not that I don’t care about you or community, or that I’m asking you to be a passive observer. My hope is that if I don’t inspire you—to talk, to write, to howl at the moon—I make you think; if I don’t make you think, I provide comfort; if I don’t comfort, I entertain; if I don’t entertain, at a minimum, I hope I’ve kept you in the house long enough not to be hit on one of those rare occasions when a meteorite falls from space and lands in front of your home just as you were leaving.

Just Shelley is my place to be still, and my invitation for you to be still with me.

Categories
Just Shelley

The stories this week: Levee fails, Anheuser-Busch says no, Burke is gone

Recovered from the Wayback Machine.

Second update: Unfortunately, the Hesco barrier erected by the National Guard failed. Though I admire the tenacity of the Guard, I’m not surprised the barriers gave way.

Winfield is now, more or less, cut off, and many homes will, unfortunately, be damaged. How many, no one knows for sure at this time. The town gave it their all, but the Mississippi is one big river.


Update: This NECN video from Boston shows how fast the water flows through a levee break, and how widespread the flooding now is. The National Guard is disheartened by the break, as they had worked on the levee for nine days. This levee was also the destination for the sandbags I helped fill.

The flood crest has been raised at St. Louis, and there’s a possibility of flooding south of Lemay Ferry Road from River Des Peres. This is the drainage river that runs through St. Louis, and is also the waterway that puts us most at risk during a flood. However, the Mississippi would have to crest about 13 feet higher to put us at risk.

We will have to rethink how we manage our waterways in the future. We can’t keep putting our fingers in the dike, and hoping for the best.


The last levee in Lincoln County still holding back the water breached this morning. I don’t think anyone was surprised when a sand boil, a mix of water and sand, appeared in the side, signaling that water was undercutting the foundation of the levee. The folks in Winfield and surrounding areas made a mighty effort to save the levee, but it was not enough.

The waters should be cresting this weekend, though we have more rain in both the Mississippi’s upper river basin, and along the Missouri river basin. Whether this means the flooding will continue hasn’t yet been determined.

Another major event impacting St. Louis is the InBev offer for our local, beloved Anheuser-Busch. A-B, the largest beer company in the US, has remained under family control to this day, and has been an important St. Louis and Missouri business. A-B is very generous to the community; many members of the family are very active environmentalists; from all accounts the company is a good employer. Needless to say, no one wants InBev to buy the company but greedy, rapacious stockholders.

We’ve already had warnings from employees in other countries where InBev has made acquisitions, and left a swath of destruction in its path. By all accounts, InBev is only interested in profits and power, not a legacy.

This week, A-B turned down the offer, and InBev has already made the opening move of a hostile takeover. Hopefully the A-B people can hold their own, but this war will leave this community scarred. The only way that A-B might be able to hold off the bid is by decreasing costs and increasing profits, both of which could mean the end of our gentle neighbor, regardless of who owns the company. I would wish InBev in Missouri…right in the middle of the Mississippi river.

Lastly, St. Louis’ Archbishop Burke is leaving for a position in Rome. Burke’s four-year tenure here has been marked by disruption and antagonism, as Burke denounced Catholic presidential candidate Kerry for supporting choice with abortion, excommunicated the members of St. Stanislaus Kostka parish just because the church members wanted to control their own property, and condemned one of the local rabbis for opening her arms to women wanting to be ordained as Catholic priests. Burke has also skirted perilously close to the line of allowable political activity, going just far enough to try to influence local and national elections without endangering the church’s tax-exempt status.

I am not surprised at Burke’s appointment to Rome, and wrote not long ago that he had ambitions beyond St. Louis. I am, also, not disappointed at Burke’s leaving, though whether the man appointed in his place will be any better for the community is hard to say.

Categories
Semantics Writing

RDF too

Congratulations to the RDFa folks for rolling out a release candidate of RDFa for XHTML. Now that I’ve finished tweaking site designs, my next step is to see about incorporating smarts into my pages, including the use of RDFa. In addition, I also want to incorporate the RDF Drupal modules more closely into the overall functionality. The SPARQL module still seems broken, but the underlying RDF modules seem to be working now.

The RDFa release candidate is timely, as I gather the BBC has decided to forgo microformats in favor of RDFa. This has re-awakened the “microformats vs. RDFa” beast, which we thought we had lulled to sleep. I guess we need to switch lullabies.

Speaking of lullabies, I had hoped to start work on the second edition of Practical RDF later this year, but it is not to be. The powers-that-be at O’Reilly aren’t comfortable with a second edition, and have instead accepted another, livelier book proposal that covers some of what I would have covered. There just isn’t room for both.

I am disappointed. The first version of “Practical RDF” was difficult because the specification was undergoing change, the semantic web wasn’t generating a lot of interest, and there weren’t that many RDF-based applications available. Now, the specs are mature, we have new additions such as RDFa, increased interest in semantics, and too many applications to fit into one book. I feel as if I started a job, and now won’t be able to finish it.

One issue in the book decision is the “cool” factor. RDF and associated specifications and technologies aren’t “cool”, in that people don’t sit around at one camp or another getting hot and bothered talking about how “RDF is going to change the world!” However, the topic doesn’t necessarily have to be “cool” if the author is “cool”, and I’m not. I don’t Twit-Face-Space-Friend-Camp-Chat-Speak-Shmooze. What I do is sit here quietly in my little corner of waterlogged Missouri, try out new things, and write about them. That’s not really cool, and two not-cools do not make a hot book.

I don’t regret my choice of lifestyle, and not being “cool”. I do regret, though, leaving the “Practical RDF” job undone. Perhaps I’ll do something online with PDFs or Kindle or some such thing.

Categories
Writing

Timing

Recovered from the Wayback Machine.

Now, we have wonderful tools to make it easy to put writing or other content online. We can think of a topic, create a writing about it, and publish it—all in five or less minutes. We’ve also come to expect that whatever is published is read as quickly. We’ve moved from multi-page writings, to a single page, to a few paragraphs, to 140 characters or less. Though there is something to be said for brevity, and it takes a true master to create a mental image that can stand alone in 140 characters or less, there still is a place for longer writings. We don’t have to be in a continuous state of noise; a race to create and to consume.

That’s a quote from my first writing at Just Shelley. It seems serendipitous, then, that Nick Carr’s article in The Atlantic, “Is Google Making Us Stupid?”, is published the same day, because in this multi-page article, Nick questions the internet’s impact on our ability to read just such longer works.

For me, as for others, the Net is becoming a universal medium, the conduit for most of the information that flows through my eyes and ears and into my mind. The advantages of having immediate access to such an incredibly rich store of information are many, and they’ve been widely described and duly applauded. “The perfect recall of silicon memory,” Wired’s Clive Thompson has written, “can be an enormous boon to thinking.” But that boon comes at a price. As the media theorist Marshall McLuhan pointed out in the 1960s, media are not just passive channels of information. They supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.

I’m not the only one. When I mention my troubles with reading to friends and acquaintances—literary types, most of them—many say they’re having similar experiences. The more they use the Web, the more they have to fight to stay focused on long pieces of writing.

Nick wonders if our brains are being subtly altered by the internet. That perhaps we are truly losing the ability to focus, to stay in one place, to even sit and read a book. I find the idea that the internet is actually altering how our brains work unlikely, or at least no more likely than any other activity. He uses Nietzsche’s adoption of the typewriter as anecdotal evidence of the medium’s impact on the message, writing: Under the sway of the machine, writes the German media scholar Friedrich A. Kittler, Nietzsche’s prose “changed from arguments to aphorisms, from thoughts to puns, from rhetoric to telegram style.” Yet Kittler, and now Nick, both ignore the elephant in the corner: Nietzsche was a very ill man, becoming increasingly more ill as he got older, and more mad as time progressed. I find it more likely that his illness affected his writing than the fact that he now used a typewriter rather than pen and paper.

What I think is happening—without any basis in research other than my own intuition and observation—to Nick and others, especially those who weblog, Twitter, IM and so on, is that we’ve adapted to a set of stimuli that rewards both brevity of focus and speed of response over long-term study and thoughtful response. There is “pleasure” associated with receiving both acclaim and attention, and those who receive both excel at the 10-minute read, the five-minute response. We’re not adapting so much as we’re mimicking what we see to be “successful” behavior in others so that we may, also, partake of the same “pleasure”. This is compounded by an artificial sense of urgency, generated in this environment, that we have to read more, and then more, in order to “be informed”—informed in this context being breadth of knowledge, rather than depth.

In other words, we become less like the computers we use, as Nick presumes, and more like the rats in the lab box, pushing the lever that gives a sexual stimulus over the lever that gives food, because the short term gratification outweighs the longer term need.

Regardless of cause, physical or behavioral, the end result is the same: we run the risk of losing an essential part of ourselves. As Nick eloquently puts it:

Perhaps those who dismiss critics of the Internet as Luddites or nostalgists will be proved correct, and from our hyperactive, data-stoked minds will spring a golden age of intellectual discovery and universal wisdom. Then again, the Net isn’t the alphabet, and although it may replace the printing press, it produces something altogether different. The kind of deep reading that a sequence of printed pages promotes is valuable not just for the knowledge we acquire from the author’s words but for the intellectual vibrations those words set off within our own minds. In the quiet spaces opened up by the sustained, undistracted reading of a book, or by any other act of contemplation, for that matter, we make our own associations, draw our own inferences and analogies, foster our own ideas. Deep reading, as Maryanne Wolf argues, is indistinguishable from deep thinking.

If we lose those quiet spaces, or fill them up with “content,” we will sacrifice something important not only in our selves but in our culture.

There is much to think of from Nick’s writing. Much to absorb and more to write about later. In the meantime, do take the time to read all of the article.

(Also discussed, briefly, at CNet. I’d be curious how many people who wrote comments have actually read the entire work, because as far as I can see, CNet isn’t linking directly to Nick’s article.)