Categories
Political Religion

Joy. Oh joy oh joy oh joy

Recovered from the Wayback Machine.

It’s not bad enough that St. Louis in August is characterized by hot, muggy days, with lousy air quality.

It’s not bad enough that we’ve just had our first human case of West Nile Virus in the county, and that the dangerous tick alert is still ongoing.

It’s not terrible enough that the dog days of summer in St. Louis make you want to embrace the cat and kill the pooch.

No, no, it becomes worse.

The National Federation of Republican Assemblies is being hosted here, this upcoming weekend. The event’s tag line?

“Show me your Values”

I can just hear the opening statement now: “This here meetin’ of the white trailer park trash of the south is now come together. Anyone around you not waving a cute, little American flag is a godless, commie, liberal, no good spy. Shoot ’em.”

But wait…it gets even more worse…worser…whatever.

What are the ‘beliefs’ behind this organization?

That all political power and influence should flow from the grass roots upward.

That all human rights are granted by God, not government and that government exists primarily to protect the God-given rights of its citizens.

That the Constitution was written by wise men under the inspiration of God and that the original intent of the Founders is as valid and binding today as it was in their day.

That the Constitution was written to govern a moral and religious people and it is being destroyed by those who are neither.

That the unborn child has a fundamental individual right to life which cannot be infringed. That sacred right extends to all persons regardless of age or infirmity and also would not allow for euthanasia, assisted suicide, or public funding for any of these practices.

That the traditional family is the foundation and cornerstone of our society and we will oppose any attempt to undermine or redefine the family unit.

That the founders never intended to separate God from government but did intend to prevent government from establishing a single state religion or inhibiting the citizen’s right to the free exercise of religion in any setting, public or private.

That free market capitalism is the only economic system that creates the opportunities and incentives that will allow maximum productivity and prosperity for its citizens. It is the necessary partner of political freedom.

In the necessity of national sovereignty, we also consider it crucial to return to appropriate state sovereignty under the 10th amendment.

Yes, let’s forget separation of church and state. Tedious thing being tolerant, idna it?

Let’s forget the fact that the ‘traditional’ family in this country typically consists of a single or divorced parent, trying to raise kids with or without help from the spouse no longer living at home.

Let’s forget that capitalism and the ‘free market system’ have brought us Enron, big tobacco and drug companies, and health insurance that costs too much and covers too little.

Let’s also forget that most serial murders in this country are committed by Christians, as are most lynchings and beatings, and that no war has ever been caused by an atheist. In fact, I can’t think of one single negative act ever committed in the name of atheism in this country. So as far as the whole ‘moral’ thing goes, the religious suck at it.

But it’s in the principles that you see the real purpose behind such a group: it’s all about taxes and support for capitalism, and a Darwinian survival of the economic fittest that would bring down the house. Oh, and claiming our ‘god given right’ to beat the crap out of other countries. Well, other countries that have something we want, that is.

Such noble spirits. Such statements of openness and generosity. Why I feel like I’ve just walked into a cramped, dusty, and dark closet when I read sentiments such as these.

Makes me wonder about the Presidential candidates, though. They’ll allow themselves to be associated with racist, ignorant, self-serving po’dunks, like the people in the NFRA, but they won’t answer questions from YouTube. I mean, no matter how many potential “Romney girls” or men in white hoods get thrown at the GOPers, it has to be better than lunch with Phyllis Schlafly.

Yes, that’s the topping on this little overbaked cake: Phyllis Schlafly is the keynote speaker. Why, I feel like donning my apron and running right on down, if My Man will let me. After all, I just love Phyllis, I really do; almost as much as Tom DeLay, who is also attending.

Oh, rapture! And did you dig the cute little RINO hunter thing? I love it, I really do. The more groups like this shoot down moderate Republicans, the more Democrats win office. Hallelujah and pass the ammo!

You’d think that people in the Lou would have enough problems, what with the heat, the humidity, bugs, and smog — but Phyllis Schlafly and Tom DeLay, tossed together with generous servings of self-interest, greed, bigotry, and the smallest minds found anywhere outside of the Shuar in Ecuador and Peru–well, it’s more than a person should be expected to bear.

The only redeeming thing about all of this? You all lost the Republican Party the Congressional vote in 2006, cupcakes. And you’re going to help the Party lose the Presidential race in 2008, too.

Categories
Photography

Creepy digital animation

Pink Tentacle points to the Japanese Motion Portrait web site, featuring software that can take a digital photograph and convert it into an animated, interactive 3D representation.

Among the examples linked is one of a dog, which, I agree with PT, is somewhat creepy. It’s the human examples, though, including the interactive one on the main page, that lead me to wonder how far we can take this particular art.

Perhaps news organizations will hire a ‘face’, and then just program it to talk.

And just think: every Barcamp can have its very own SillyValley A-Lister. No one could tell the difference.

Categories
Web

Controlling your data

Popular opinion is that once you publish any information online, it’s online forever. Yet the web was never intended to be a permanent snapshot, embedding past, present, and future in unbreakable amber, preserved for all time. We can control what happens to our data once it’s online, though it’s not always easy.

The first step is, of course, not to publish anything online that we really want to keep personal. However, times change, and we may decide that we don’t want an old story showing up in search engines, or MySpace pages hotlinking our images.

I thought I would cover some of the steps and technologies you can use to control your online data and media files. If I’ve missed any, please add in the comments.

Robots.txt

The granddaddy of online data control is robots.txt. With this you can control which search engine web bots can access which directories. You can even remove your site entirely from all search engines. Drastic? Unwelcome? As time goes on, you may find that pulling your site out of the mainstream is one way of keeping what you write both timely and intimate.

I discussed the use of robots.txt years ago, before the marketers discovered weblogging, and most people were reluctant to cut themselves off from the visitors arriving from the major search engines. We used to joke about the odd search phrases that brought unsuspecting souls to our pages.

Now, weblogging is much more widely known, and people arrive at our pages through every form of media and contact. In addition, search engines no longer send unsuspecting souls to our pages as frequently as they once did. They are beginning to understand and manage the ‘blogging phenomenon’, helped along by webloggers and our use of ‘nofollow’ (author’s note: don’t use nofollow; it’s bad for the web). Even now, do we delight in the accidental tourists as much as we once did? Or is that part of a bygone, more innocent era?

A robots.txt file is a text file with entries like the following:

User-agent: *
Disallow: /ajax/
Disallow: /alter/

This tells all webbots not to traverse the ajax or alter subdirectories. All well-behaved bots follow these rules, and that includes the main search engines: Yahoo, Google, MSN, Ask, and that other guy, the one I can never remember.
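And if you decide to take the drastic route mentioned earlier and pull your site out of all of the engines, a couple of lines is all it takes:

User-agent: *
Disallow: /

The asterisk matches every bot, and the bare slash covers every directory on the site.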

The place to learn more about robots.txt is, naturally enough, the robots.txt web site.

If you don’t host your own site, you can achieve the same effect using a META element in the head section of your web page. If you’re not sure where this section is, use your browser’s View Source capability: anything between opening and closing “head” tags is the head section. Open mine and you can see the use of a META element. Another example is:

<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">

This tells web bots not to index the page and not to harvest links from it. Add it to the head section of every page and it effectively covers the whole site.

Another resource you might want to protect is your images. You can tell search engines to bypass your images subdirectory if you don’t want them picked up in image search. This technique doesn’t stop people from copying your images, which you really can’t prevent without using Flash or some other strange, web-defying move. You can, however, stop people from embedding your images directly in their web pages, a practice known as hotlinking.

There are good tutorials on how to prevent hotlinking, so I won’t cover it here. Search on “preventing hotlinking” and you’ll see examples, both in PHP code and in .htaccess.
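Still, to give a flavor of the .htaccess approach, here’s a minimal sketch using Apache’s mod_rewrite. It assumes your host runs Apache with mod_rewrite enabled, and example.com is a stand-in for your own domain:

RewriteEngine On
# only apply when a referrer header is present (direct visits and most feed readers pass through)
RewriteCond %{HTTP_REFERER} !^$
# and when that referrer is not your own site (example.com is a placeholder)
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# in that case, requests for images get a 403 Forbidden
RewriteRule \.(gif|jpe?g|png)$ - [F]

A popular variation swaps the 403 for a substitute image telling the hotlinker to knock it off.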

Let’s say you want to have the search engines index your site, but you decide to pull a post. How can you pull a post and tell the search engines you really mean it?

410 is not an error

There is no such thing as a permanent, fixed web. It’s as fluid as the seas, as changeable as the weather. That’s what makes this all fun.

A few years back, editing or deleting a post was considered ‘bad form’. Of course, we now realize that we all change over time and a post that seemed like a good idea at one time may seem a terrible idea a year or so later. Additionally, we may change the focus of our sites: go from general to specific, or specific back to general. We may not want to maintain old archives.

When we delete a post, most content management tools return a “404” when the URL for the page is accessed. This is unfortunate, because a 404 tells a web agent only that the page “was not found”. The agent could assume the page is temporarily gone, that the server is having a problem, or that a redirect is not working right. Regardless, a 404 carries the assumption that the condition will be cured at some point.

Another 4xx HTTP status is 410, which means that whatever you tried to access is gone. Really gone. Not just on vacation. Not just a bad redirect, or a problem with the domain–this resource at this spot is gone, g-o-n-e. Google considers these an error, but don’t let that big bully fool you: this is a perfectly legitimate status and state of a resource. In fact, when you delete a post in your weblog, you should consider adding an entry to your .htaccess file to note that this resource is now 410.

I pulled a complete subdirectory and marked it as gone with the following entry in .htaccess:

Redirect gone /category/justonesong/
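The same directive works for a single deleted post. A hypothetical entry for one pulled essay (the path being whatever permalink the post had) would look like:

Redirect gone /2004/05/some-deleted-post/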

I tried this on an older post and sure enough, all of the search engines pulled their reference to the item. It is, to all intents and purposes, gone from the internet. Except…

Except there can be a period where the item is gone but cache still remains. That’s the next part of the puzzle.

Search Engine Cache and the Google Webmaster Toolset

Search on a term and most results have a couple of links in addition to the link to the main web page. One such link is to the cached copy of the page: a snapshot from the last time the webbot stopped by.

Caching is a handy thing if you want to ensure people can access your site. However, caching can also perpetuate information that you’ve pulled or modified. Depending on how often the search engine refreshes the snapshot, it could reflect a badly out of date page. It could also reflect data you’ve pulled, and for a specific reason.

Handily enough, as I was writing this I received an email from a person who had written a comment on my weblog in 2003 and who had typed out his URL of the time and an email address. When he searched on his name, his comment in my space showed up on the second page of results. He asked if I could remove his email address from the comment, which was simple enough.

If this item had still been cached, though, his comment would have remained in the cache with his email address until the cached copy was refreshed. As it was, the address was gone instantly, as soon as I made the change.

How frequently older pages such as these are accessed by the bots really does depend, but when I tested with some older posts of other weblogs, most of the cached entries were a week old. Not that big a deal, but if you want to really have control over your space, you’re going to want to consider eliminating caching.

To prevent caching, add the NOARCHIVE meta tag to your header:
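<META NAME="ROBOTS" CONTENT="NOARCHIVE">

Unlike NOINDEX, this doesn’t keep the page out of the search results; it only tells the engines not to offer a cached copy of it.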

To have better control of caching with Google, you need to become familiar with the Google Webmaster tools. I feel like I’ve been really picking on Google lately. I’m sure such criticism will adversely impact the share price, and bring down searching as we know it today–bad me. However, I was pleased to see that Google has added a cache management tool to the Webmaster tool set. It’s a useful tool, and since there are a lot of people who have their own sites and domains but aren’t ‘techs’, in that they don’t do tech for a living or necessarily follow sites that discuss such tools, I thought I’d walk through the steps for controlling search engine caching of your data.

So….

To take full advantage of the caching tool, you’ll need a Google account and access to the Webmaster tools. You can create an account from the main Google page by clicking the sign-in link in the upper right corner.

Once you have created the account and signed in, from the Google main page you’ll see a link that says, “My Account”. Click on this. In the page that loads, you can edit your personal information, as well as access GMail, Google groups, and for the purposes of this writing, the Webmaster toolset.

In the Webmaster page, you can access domains already added, or add new domains. For instance, I have added burningbird.net, shelleypowers.com, and missourigreen.com.

Once added, you’ll need to verify that you own the domain. There are a couple of approaches: add a META tag to your main web page, or create a file with the same name as a key Google generates for you. The first approach is the one to use if you don’t provide your own hosting, such as if you’re hosted on Blogger, Typepad, or WordPress.com. Edit the header template and add the tag, as Google instructs. To see the use of a META tag, you can view source for my site and you’ll see several in use.
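For illustration only, the verification tag Google generates looks something like the following; the exact name and the long key value are whatever Google’s verification page hands you, so copy theirs rather than this:

<meta name="verify-v1" content="...key generated by Google for your site..." />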

If you do host your site and would prefer the other approach, create a text file with the same name as the key that Google generates for you when you select this option. That’s all there is to the file: it just has to be named the name Google provides–it can be completely empty. Once created, use FTP or whatever technique you prefer to upload it to the site.

After you make either of these changes, click the verify link in the Webmaster tools to complete the verification. Now you have established with Google that you are, indeed, the owner of the domain. Once you’ve verified the site, clicking on each domain URL opens up the toolset. The page that opens has tabs: Diagnostic, Statistics, Links, and Sitemaps. The first three tabs most likely will have useful information for you right from the start.

Play around with all of the tabs later; for now, access Diagnostic and then click the “URL Removal” link on the left side of the page. In the page that opens, you’re given a chance to remove links to your files, subdirectories, or your entire site at Google, including removing the associated cache. You can also use the same resource to add items back.

You’ve now prevented webbots from accessing a subdirectory, told the webbots a file is gone, and cleaned out your cache. Whatever you wrote and wish you hadn’t is now gone. Except…

Removing a post from aggregation cache

Of course, just because a post is removed from the search engines doesn’t mean that it’s gone from public view. If you supply a syndication feed, aggregators will persist feed content for some period of time (or some number of posts). Bloglines persists the last 100 entries of a feed, and I believe that Google Reader persists even more.

If you want to pull a post and ensure the item is also removed from aggregator caches, what you really need to do is delete the content for the item and then re-publish it. This ‘edit’ then overwrites the existing entry in the aggregator’s cache.

You’ll need to make sure the item has the same URL as the original posting. If you want, you can write something like, “Removed by author” or some such thing — but you don’t have to put out an explanation if you don’t want to. Remember: your space, your call. You could, as easily, replace the contents with a pretty picture, poem, or fun piece of code.

Once the item is ‘erased’ from aggregation, you can then delete it entirely and create a 410 entry for the item. This will ensure the item is gone from aggregators AND from the search engines. Except…

That pesky except again.

This is probably one of the most critical issues in controlling your data, and no one is going to be happy with it. If you publish a fullcontent feed, your post may be picked up by public aggregators or third-party sites that replicate it in its entirety. Some sites duplicate and archive your entries, and allow both traversal and indexing of their pages. If you delete a post that is no longer in your syndication feed (it’s too old), there’s no way to effectively ‘delete’ the entry from these sites. From my personal experience, you might as well forget asking them not to duplicate your feeds — with many, the only way to prevent it is to either complain to their hosting company or ISP, or to bring in a lawyer.

The only way to truly have control over your data is not to provide fullcontent feeds. I know, this isn’t a happy choice, but as more and more ‘blog pirates’ enter the scene, it becomes a more viable option.

Instead of fullcontent, provide an excerpt, as much of an excerpt as you wish to persist permanently. Of course, people can manually copy the post in its entirety, but most people don’t. Most people follow the ‘fair use’ aspect of copyright and quote part of what you’ve written.

There you go: some approaches to controlling your data. You may not have control over what’s quoted on other web sites based on fair use, but that’s life in the internet lane, which returns us back to item number one in controlling your data–don’t publish it online.

Categories
Social Media

The ugly face of Facebook

Another weekend, and another carefully calculated self-love link fest where some A-lister makes a bold and basically useless announcement, and others rush to support it. If you want to increase your link count, writing self-centered, arrogant, and useless posts with bald titles filled with hyperbole works rather well.

What was particularly sad about this weekend’s lovefest, though, is that the subject was Facebook, yet none of it reflected the real story that was going around: the bias and bigotry on Facebook against older people.

I didn’t last long enough on Facebook to see its ugly face. I found out about such through Ronni Bennett, odd time signature, and Freydblog. What they found was an undercurrent of hatred against older people, manifested in groups like the following:

F*CK** OLD PEOPLE: 107 members
Asking old people for a quarter then throwing it in there face…..hahaha!: 143 members
I Beat up old people: 53 members
I like to beat the living crap out of old people. (sic): 15 members
Pill pushing nurses to the possessed elderly….: 32 members
Eradicating the elderly: 12 members
If this group reaches 2′000 people, i will push a old lady down the stiars: (sic) 164 members
OLD PEOPLE SHOULD JUST DIE: 19 members

Among the messages posted to the groups:

Let us unite and join for a common cause, abolish social security and legalize euthanasia.

Who is with me on this, who thinks old people in school should be taken into the quad and be tarred and feathered for their annoyance , stupidity, and outright wasting of time.

Maria also writes on this topic:

I must admit, when I first signed on to Facebook, I felt a bit like a teenager sneaking into the house late at night, hoping not to wake up the parents — or, in this case, catch the attention of the kids. Reading the quotes Ronni gathered from Facebook makes the blood run cold in my veins, as does the realization that you can’t delete your account on Facebook, only deactivate it. (In some strange way, this maybe a blessing for the old-hating young whose words may well come back to bite them in their eventually sagging asses…)

(Maria also links to other good posts and comments, including one by Yule Heibel, who wrote this weekend that “Climates of trust are built on response and responsiveness.” Not related to the issue, but compelling, nonetheless.)

Of course, youth has always rejected the older, and resented our positions of both authority and influence. Pushing back at old farts is a social phenomenon that many of us remember from the days of Vietnam war protests (anyone remember the “Don’t trust anyone over 30!” placards?). It’s not surprising to see such groups or even such messages. I think what is disquieting is the fact that Facebook, which promises to abolish ‘hate’ groups, does not see these as such.

This isn’t surprising really, nor is it surprising that the 23-year-old founder of the application, Mark Zuckerberg, wouldn’t be overly concerned. In our rush to a new social network, we have idolized youth; made them the pampered pets of social networking. More importantly, we have both taught and celebrated the right of free expression without promoting an awareness that the best expression is accompanied by both empathy and respect.

The younger the person, the more self-absorbed, and that’s natural; after all, it takes experience to become empathetic. Over time, society and our interactions within it help most (not all) of us to see beyond just our own needs, our own wants. We become friends with people outside our age group, race, class, or country. We learn that being aware of others, their needs and feelings, isn’t the same as ‘selling out’; nor is it destructive of ‘self’.

However, what I’m seeing with some of the social networking sites (just some, not all) is that rather than expose people to different viewpoints, they can reinforce barriers against the natural processes that abrade self-absorbed behavior. When challenged in our day-to-day lives to give o’er our preconceptions or biases, rather than learn to adapt and grow socially, we can rush home and twitter, blog, and Facebook with others who have exactly our same point of view. We can safely ensconce ourselves behind a buffer of like-minded folks, postponing, perhaps indefinitely, the need to challenge our “world is me me me” view.

An example: another reason I lost interest in Facebook, other than my disinterest in the distraction, had to do with the recent story about Facebook and Zuckerberg being sued because another company says he stole its code and concept. The suit is still ongoing, and who is to say whether it has merit or not. But one thing I noticed among the Facebook fans is that they were less interested in the merits behind the suit–the possibility that the code and idea may have been stolen–and more concerned about losing their special place and that harm could come to their ‘hero’. They were completely apathetic about whether Zuckerberg stole the code or not. If the courts ruled he did, as long as they still had their ‘special place’, they would be indifferent to the finding and Zuckerberg would still be their ‘hero’.

The word ‘bankrupt’ was flipped around this weekend, and used incorrectly and badly at that. The real ‘bankruptcy’ I’m seeing with a site like Facebook, and perhaps even some forms of social networking in general, is an empathetic bankruptcy–perhaps even a moral bankruptcy, if that term hasn’t been permanently corrupted by its overuse and abuse by the religious conservatives–as sites like these become the sugar tit of upcoming generations.

But then, I am over 30, and therefore my opinion and this writing are not to be trusted.

Categories
Legal, Laws, and Regs

More on the Arbitration Fairness Act of 2007

The Consumerist has more on the Arbitration Fairness Act of 2007.

People Over Profits has an email campaign, but it also helps to contact your Congressional rep directly. A letter or phone call also works wonders.

How important is this bill? There is no bill pending in Congress that scares Corporate America more than this one. There is no bill pending in Congress that could more help the American people than this one.

Due to rulings in the Supreme Court, mandatory arbitration agreements now trump the Equal Employment Opportunity Commission when it comes to employment discrimination lawsuits. This means that an arbitrator can make decisions about civil rights, can do so without following the law, can do so without following the arbitration rules themselves, and can do so without any transparency into the decision process.

…after Sherri Warner lost her discrimination and wrongful firing suit in mandatory arbitration, a San Francisco arbitrator not only charged her nearly $16,000 for his time, he ordered her to pay her opponent’s legal fees of more than $207,000.

The fee award would probably not have been allowed in court, and it forced Warner into bankruptcy. But after her lawyer, Stephen Gorski, asked the arbitrator to explain his decision, the arbitrator refused when reminded no rules required him to do so.

Arbitrators rarely issue written opinions, making requests for review virtually impossible.

What’s scarier is that this case was ten years ago, and since then the Supreme Court has given even more power to arbitration, including power over ruling on employment discrimination that now supersedes that of the EEOC. The Supremes have even given it power over the law itself. In a recent case, one of my favorites, Buckeye Check Cashing v. Cardegna, a man sued a check cashing company claiming that the conditions of the loan were illegal. The company, which had a mandatory arbitration clause, demanded that the claim be taken to arbitration. The state of Florida disagreed, saying that an arbitration clause in a contract deemed to be illegal is not enforceable.

However, our Scalia-controlled Supreme Court doesn’t let a little thing like an illegal contract deter it. It decided that it wasn’t up to the courts to throw out an arbitration clause just because it happened to be in an allegedly illegal contract — the only thing the courts could determine is whether the arbitration clause is, in and of itself, legal. The rest of the contract was then up to the arbitrator.

Question

Under the Federal Arbitration Act, may a party avoid arbitration by arguing that the contract in which the arbitration clause is contained is illegal?

Conclusion

No. The 7-1 majority (Justice Samuel Alito not participating) ruled that challenges to the legality of a contract as a whole must be argued before the arbitrator rather than a court. The opinion by Justice Antonin Scalia explained that “unless the challenge is to the arbitration clause itself, the issue of the contract’s validity is considered by the arbitrator in the first instance.” The Court held that the Florida Supreme Court had been wrong to rely on a distinction between void and merely voidable contracts, because the word “contract” in the Federal Arbitration Act includes contracts later found to be void. Justice Clarence Thomas dissented due to his long-held view that the FAA does not apply in state courts.

This is a frustrating topic for me, because I’ve watched over the years now as arbitration has eroded all of our judicial rights, as granted by the Seventh Amendment to the Constitution. It’s frustrating because I can’t seem to convey, in this weblog, how serious this can get.

A legal expert in Texas once said that he felt that in ten years there would no longer be a civil court system, because of how much it is being eroded by an act that was basically put into law in 1925 as a way for businesses to come to ‘gentlemanly agreements’ in regard to a dispute. It was never intended to be used by corporations against the common citizen.

This is also a case of the breakdown of the system of checks and balances built into our government. The Supreme Court has empowered arbitration and supported mandatory arbitration to the point that it is now undermining the very nature of civil rights in our country, and it was allowed to do so, unchecked, by the Republican-controlled Congress.

Now we have a Democratic-controlled Congress. More than that, we have a Congress where even many Republicans are beginning to look askance at the miscarriage of justice that occurs under the auspices of ‘arbitration’.

American Corporations do not want this Bill. American Corporations, who have delivered shoddy equipment, surly service, and bad faith consumerism.

Who supports this bill?

The Feingold-Johnson bill is supported by a host of consumer advocate organizations including Consumers Union, Public Citizen, American Association for Justice, Center for Responsible Lending, Consumer Federation of America, Homeowners Against Deficient Dwellings, Home Owners for Better Building, National Association of Consumer Advocates, National Consumer Law Center (on behalf of its low income clients), National Consumer Coalition for Nursing Home Reform, the National Employment Lawyers Association and Public Justice.

The list is only growing, as word of this Bill slowly trickles out.

Support the Arbitration Fairness Act of 2007. Please.