Categories
Web

Dot Com Bust Redux

I’m assuming the only reason the RealNames failure is getting air time is that the former CEO has published details of its business dealings with Microsoft.

I glanced through Keith Teare’s papers at his personal web site, and I just can’t see what the fuss is about.

Microsoft chose to terminate the relationship with RealNames. Given the nebulous nature of the product, the prevailing sentiment against such centralized technology in today’s market, and the terms of the business proposal, I don’t see how anyone could be surprised by this decision.

RealNames owed Microsoft $25.5m on May 2nd. They didn’t have it. They issued a counter-proposal. Microsoft wasn’t interested. RealNames bites the dust.

Teare believes that Microsoft isn’t demonstrating vision in its current direction, and is seeking solutions that it can control. Maybe so, but consider the proposed future direction for RealNames: centralized, proprietary, flat-architecture Keyword technology in partnership with a company such as Verisign.

I have a hard time identifying with one proprietary, centralized, patent-holding company fighting back at another proprietary, centralized, patent-holding company.

However, I do have sympathy for the 75 people in Redwood City that lost their jobs.

Categories
Burningbird Technology

Space? What space?

I was playing around with my server earlier, trying out some fun and interesting sounding new techie toys. Unfortunately, the new techie toys required ImageMagick.

Those of you with a Unix background are probably going “Oh, No!” about now. I knew I was pushing the bubble with this one, but you only live once.

Damn the server! Full install ahead!

— — — —

Anyway, we’re almost back to normal. I’ve managed to save the server, and was able to repair the Apache installation. It was also nice hearing from the system kernel, all those “panic!” emails.

If you tried to post comments earlier during some interesting moments of turmoil and they aren’t showing up — Sorry! If you have a minute and wouldn’t mind reposting, I would be grateful!

The great thing about Unix servers is that you can do anything. The bad thing about Unix servers is that you can do anything.

Categories
Technology

Making peace with Google

I can’t wait until I get up in the morning and pop on to my machine so I can download 50+ spam emails. One of the funnest games of the day is to try and find “real” email among all of the junk. When I find one, I holler out “email whack!”

As you can tell, I am being facetious. I don’t know of anyone who likes spam, or wants to spend time on it, or wants to waste email bandwidth on it.

So why do we all like the crazy hits we get from Google?

Dave Winer pointed out a posting from Jon Udell discussing a posting from Dave Sims at O’Reilly. In it, Dave Sims wrote:

Google’s being weakened by its reliance on webloggers and their crosslinks

If Google wants to evolve into a functional resource for all users, it will have to work itself off this current path, or it will open up an opportunity for The Next Great Search Engine.

Jon responds with:

In the long run, the problem is not with Google, but with a world that hasn’t yet caught up with the web. I’m certain that in 10 years, US Senators and Inspectors General will leave web footprints commensurate with their power and influence. I hope that future web will, however, continue to even the odds and level the playing field.

Sorry, Jon. I’m with Dave Sims on this one. Weblogs are weakening Google.

When I ported the Burningbird to Movable Type and moved to the new location, I also created a robots.txt file that disallowed any web bot other than the blogdex or Daypop bots. And the Googlebot, being a well behaved critter, has honored this (as have several other bots, my referrer log is getting sparkly clean).
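For anyone curious, a robots.txt along those lines looks something like this (the user-agent strings are illustrative; check each crawler’s documentation for the exact name it sends):

```
# Let the blogdex and Daypop crawlers in; an empty
# Disallow means nothing is off limits for that bot.
User-agent: blogdex
Disallow:

User-agent: Daypop
Disallow:

# Shut every other bot out of the whole site.
User-agent: *
Disallow: /
```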

In the meantime, I’ve left my old site as is, bot-beaten poor little thing that it is. As a result, in the referrer log I’ve found the following searches:

rufus wainright shrek
devonshire tea graphics
missouri point system drivers license
bill gates popular science
entrenched in hatred
richard ashcroft money to burn
shelley bird
pictures of terrorists burning american flags
south carolina state patrol fishing
pictures of women in afghanistan
we start fire billy joel
fairy tale blue bird
beautiful outlook pictures
fighting fishies
high blood pressure burning
hacking statistics in Australia
lord of the rings pictures and drawings sting sword
add morpheus node

…and on and on

And all of these Google searches happened in three days’ time. Three days.

Comparing usage estimates, Google was effectively chewing up over 30% of my web site’s CPU and bandwidth on searches that were, on average, accurate only 3% of the time.

My regular web sites (Dynamic Earth, YASD, P2P Smoke, and Burningbird Network) have on average seven times the traffic of my weblog, with half the Google traffic and an accuracy of over 98%. This figure means that Google searches resulting in hits to the regular web sites are finding resources matching their searches. People may still continue looking at other sites, but the topic of the search is being met by the topic covered in the page.

Weblogs — might as well call us Google Viruses.

This isn’t to say that Google and weblogs can’t work together, but it isn’t up to Google to make this happen. Google is a web bot and an algorithm; we’re supposed to be the ones with the brains.

Weblogs that focus on one specific topic are ideal candidates for Google scanning. For instance, zem is a weblog focusing on topics of cryptography, security, and copyrights. Because he consistently stays on topic, he’s increasing his accuracy ratio — people are going to find data on the page that meets their search.

Victor, who’s as interested in Google as I am, is trying to work with Google by creating a new weblog that focuses purely on web development resources, Macromedia products, and browser development. It’s early days yet, but as time goes by and more people discover Victor’s weblog, he should increase his Google page rank, resulting in an increase of the number and accuracy of his Google hits.

So what’s a weblogger who just wants to have fun to do? Well, if you don’t mind the crazy searches and the waste of your bandwidth and CPU, don’t do anything. Let all those little bots just crawl all over your weblog’s butt. Google’s bandwidth and accuracy are Google’s problem (time for smarter algorithms, perhaps).

However:

- if you’re saving up to add some nice graphics or MP3 files to your weblog and your bandwidth is restricted, as it is on most servers, or

- if you’re getting tired of crawling through the bizarre Google searches, or

- if you’re getting tired of not being able to put “xxx” on your weblog page,

then you might want to consider providing a few helpful aids to Google.

Google Helpful Aids

1. Create a robots.txt file and restrict Googlebot’s search to specific areas of your weblog web site — not to include your weblog page or archives.

2. If possible, create individual archive pages for each post. Otherwise, for all posts that deserve to stand alone, copy the generated HTML into a separate file.

3. For any weblog post that you think will make a great resource, and that stays on topic and doesn’t meander all over the place, copy or hard link it (if you’re using Unix) to a directory that bots are allowed to crawl.

4. Avoid the use of ‘xxx’ in any shape or form in any of your Googlized pages.
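Aid 3 can be sketched in a couple of shell commands. The file and directory names here are hypothetical; substitute your own site layout:

```shell
# A stand-in for a generated archive page worth indexing.
printf '<html><body>A post worth indexing</body></html>\n' > archives_post.html

# A directory that robots.txt leaves open to crawlers.
mkdir -p resources

# Hard link the page into the crawlable directory (Unix only;
# both names point at the same inode, so no extra disk space is used).
ln archives_post.html resources/archives_post.html

# On systems without hard links, a plain copy does the job:
# cp archives_post.html resources/archives_post.html
```

Because a hard link shares storage with the original, later edits to the archived post show up in the crawlable copy automatically; with a plain copy you would have to re-copy after each edit.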

Over time, we’ll add to these aids.

Now, if only I can figure out what to do with all these XML and RDF aggregators that are now crawling all over my server….

Categories
Technology Web

Netscape 4.x not supported here

I have a confession to make: I’ve not always been a strong voice for standardization.

As much as I believe in the necessity of standards, I was so concerned when the Mozilla organization was strongly chastised for spending time on new innovations rather than implementation of standards that I wrote an article, The Tyranny of Standards, about this for O’Reilly.

However, there is a difference between pushing back at standards groups to protect what I still consider one of the most innovative technology applications of our time, and pushing back because an organization or a person refuses to acknowledge that it’s time to let go of a technology that has outworn its usefulness.

With the upcoming release of Mozilla 1.0, it’s time to say good-bye to Netscape 4.x. It’s time to close this chapter in our lives. It’s time to abandon LAYER and ILAYER and BLINK and move on with our browser-based lives.
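For what it’s worth, the proprietary markup usually maps straight onto standard CSS positioning. A minimal, hypothetical before-and-after:

```html
<!-- Netscape 4.x only; no other browser renders this: -->
<layer left="100" top="50">Positioned content</layer>

<!-- The standards-based equivalent, understood by every modern browser: -->
<div style="position: absolute; left: 100px; top: 50px;">Positioned content</div>
```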

After my posting yesterday, both Allan and Jonathon wrote their own views about supporting Netscape 4.x.

Allan, who has a web development company, wrote:

Our small company, which definitely can’t afford the time, let alone anything else, to cater to the whims of an outdated browser, has explained the situation to our new clients.

And, we must have been persuasive, as they’ve all agreed to let us support web standards as far as we can for their sites.

The lavish days of the dot-com boom are gone, and most development work on the web is lean, mean, and pared down to the essentials. As Allan says, companies can no longer afford to expend time and resources on a browser that has been replaced by not just one but several alternatives: Internet Explorer, Netscape 6.x, Opera, and now Mozilla.

And Jonathon wrote:

So why is it that Netscape 4.x users—who could easily upgrade to a standards-compliant browser—put their desire to use an obsolete browser above the needs of all other Web users? Not just above those with disabilities who benefit most from accessible sites, but above everyone who uses a modern browser. And why are they so frequently arrogant about it? As if using a tenth-rate browser is a mark of distinction.

Arrogance. Is that why Netscape 4.x users refuse to upgrade? Or are there other reasons?

I had an email from a reader who mentioned that her company can’t upgrade their browser because of security. I can see that there might be concerns about upgrading to IE, but what about Netscape 6.x or Mozilla? Or Opera?

I created an online tutorial demonstrating how to use Mozilla’s XUL, which I had to remove as the browser continued through its many pre-release betas. With version 1.0 soon to be released, I would like to spend time with this tutorial: to update it for 1.0, to try out any new technical goodies released with 1.0, and generally to have a bit of fun with Mozilla.
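To give a flavor of what the tutorial covers, a minimal XUL window looks something like this (syntax as I understand the 1.0-era builds; details shifted between betas, so treat it as a sketch):

```xml
<?xml version="1.0"?>
<!-- The there.is.only.xul namespace identifies this as a XUL document. -->
<window xmlns="http://www.mozilla.org/keymaster/gatekeeper/there.is.only.xul"
        title="Hello XUL">
  <button label="Click me" oncommand="alert('Hello from XUL!');"/>
</window>
```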

I can either spend time trying to make sure that this weblog page shows equally for people using Netscape 4.x, or I can use the same time to update my Mozilla tutorial. There is no choice here — I choose to look forward, not back.

Netscape 4.x. You were a good friend in your time, and you helped show us that we can do more on the web than click a hypertext link. But it’s time to say good-bye. And it’s time for me to post to my weblog:

Netscape 4.x NOT supported here.

Categories
Technology Weblogging

Weblogging Centralization/Decentralization summary

Recovered from the Wayback Machine.

Earlier in the week I made a statement about Radio being centralized that caused some interest and reaction from the Userland folks and others. A lot of back and forth and intense discussion in the comments associated with the postings here and here and continued at Backup Brain (here and here) as well as at Doc Searls and, of course, Userland — both John Robb and Dave.

A lot of cross-posting and cross-discussion. Some confusion. More discussion.

Other than pointing out the links, I don’t want to go back and rehash the old material. As a point of clarification, I did want to say that Radio doesn’t have a dependency on Userland’s or any other RCS (Radio Cloud Server) if you choose the FTP option to upload your files and don’t use the Radio comments or upstreaming. That’s not to say there isn’t connectivity between the Radio application and the server (Userland’s by default): a handshake occurs when your Radio application starts and again when you shut it down, and there is no way to disable this as far as I have been able to determine by going through all the associated script. If there is a way, Userland will have to point it out.

Dave also wrote his views of the more popular weblogging tools and how they compare from a centralization point of view. And this essay is something I do want to talk about. However, I’m going to try and talk about it in such a way that I question the views not the person. I guess my comments will tell me if I’m successful in this or not.

In his essay, Dave writes that Blogger is centralized for editing and decentralized for reading. I agree with this assessment. If you host your Blogger weblog on Blogspot, then the tool is centralized for editing and reading; but you don’t have to host your weblog on this server, you can easily use your own.

I had a Manila site from Userland before I switched to Blogger and, again, I agree with Dave’s assessment that Manila is centralized from both an editing and reading perspective.

Where I disagree with Dave’s conclusion is his interpretation of Movable Type being “centralized” because the tool and the posted content rest on your own server.

If Blogger’s posts are decentralized because they can reside on your server, then the same logic must, must apply to Movable Type. And if Movable Type’s posts are decentralized then the tool, which resides in the same location, must also be decentralized.

Finally, I agree with Dave’s assessment of Radio in that the posts can be decentralized (hosted on your own server), and the tool itself for the most part is decentralized but there are some aspects of the tool that aren’t autonomous (I grabbed that from Doc, it is a better fitting word). It does communicate with the RCS — Userland by default, though this can be replaced by your own RCS if you wish to host it.

One other aspect of Dave’s essay that I thought was interesting, and that perhaps explains where we have such different viewpoints, is this concept of community services. In my own opinion, a weblogging tool is just that: a tool to create a weblog. Associated with this is the ability to archive postings, add other content, and facilitate comments.

To me, community enters the picture through the people rather than the technology. People link to a weblog posting, or add comments or both. Eventually, you can get a chained sequence of communication going, as was demonstrated with the postings earlier this week related to this topic.

I think, though, that Dave sees a more important role for technology in this process, through community servers providing services such as chat, technologies such as news aggregators and OPML outlines, and so on.

Neither of our viewpoints is wrong; they’re just different. But they do color our perspectives on other aspects of “weblogging”, and that difference can add interest to the whole discussion.

After all, if we all thought alike, then we wouldn’t need weblogging, now, would we?

Update 5/5/02: Thread continued here.