10/06/2008

You Can Take That Panic to the BANK!

As I try to digest the most recent government hoodwink, in the form of the Emergency Economic Stabilization Act, several news articles have caught my eye. I could just ruminate about them to myself while sitting in the closet sipping whiskey from a jar, but I promised myself only to do that once a month. Instead, I'll share them with you!

I'll get right to the point: we done been Patriot-Act-ed. Yes, I know; my posts on this topic of late have been turning up the vitriol even higher than my normal 50% by volume cut off level. You may think that this post is shaping up to be just another rant. Let me assure you: while it does fill me with rage, I do have some actual facts in tow to back such assertions.

Check it out: terrorists (supported by the government?) kill people -> nation quakes in fear while legislation consolidating the nation's autocratic power sails through Congress. Happy Department of Homeland Security to you, the Department of Peace vulture born from the ashes of the Immigration and Naturalization Service (not a coincidence, but that's another story).

Now re-check: bankers (supported by government) screw over people trying to buy houses -> markets quake in fear while legislation consolidating the nation's capitalistic power sails through Congress. Our current door prize is the Office of Financial Stability, attached to the Treasury. You have to love that name. It doesn't quite say "propping up our failing economic system" in the same way that "The Strong-Arm of Capitalism League" would, or the way a dilapidated FEMA trailer does, but OFS is good enough. It also lends itself to the great anti-globalization/capitalism slogan, "Fuck OFS!" My copyright certificate is in the mail.

Doubt that it was fear that motivated them? Well, seeing "777" written on the front page of my local paper makes it pretty clear what the average person was supposed to take away. But let's not trust an average person; let's ask a congress-person!

From the New York Times: "Fears about the economy also motivated support. 'Nobody in East Tennessee hates the fact more than me that I am going to vote yes today after voting no on Monday,' Representative Zach Wamp, a Republican, said. 'Monday I cast a blue-collar vote for the American people,' he continued. 'Today I am going to cast a red, white and blue-collar vote with my hand over my heart for this country, because things are really bad and we don’t have any choice.'" Well, Representative Wamp, you've made it pretty clear. Instead of voting in the name of your constituents, you decided to vote with your jingoistic shirt on. Good on you.

One day, the bill is opposed because the constituents demand it. The next, it passes. Apart from a few tacked-on incentives, the only thing that changed is that the Dow dove by a number whose every digit is just one off from 666. What changed? They added the FEAR.

But after all, FEAR is the market. Capital is the universal unit of the irrational mob. Don't get me wrong, the mob is sometimes good. All understood signifiers derive their meaning from the linked desire of multiple individuals; linked desire is the mob. But this also means that it can get out of control. And once you link these mob-signifiers to things with actual material significance, like, say, mortgage-backed securities, a simple scream can mean that the government has a new bureaucratic entity run by good-old boys from Goldman Sachs. Or, so it seems.

Hey, and check this out too. You want to see mob-signifiers move a market? I found this story on Slashdot: it appears that Google's web-crawling engine accidentally reprinted an old article from 2002, when United Airlines was just about bankrupt. Investors, thinking it was a current article, dumped the stock, which declined to $3 a share from $14, evaporating $1.14 billion in market cap. BECAUSE OF GOOGLE! Ahhhhhhhhh! That's okay, people pass laws in such ways these days. Soon enough, Google will be fact, so that won't be a mistake, it will be reality.

It happened again! Some jerk on CNN's citizen-news thing, iReport, posted that Steve Jobs had a heart attack, which he totally didn't. Too late! Apple's stock dove 9%. Now they blame this on the stupid citizen journalists who try to be real journalists, posting lies on their stupid blogs. Stupid blogs!

But wait, real journalists are just as stupid! Take the case of Gary Weiss, the BusinessWeek journalist who, it has been pretty conclusively proven, rigged Wikipedia to spread disinformation about "naked short" selling, a strange market practice that has been blamed for the current crisis almost as heavily as mortgage-backed securities. This disinformation was not accidental, but positively malicious, though the motive still remains unclear. "Real" journalists trusted Wikipedia, and blackballed CEO Patrick Byrne, who was trying to warn marketeers about the dangers of the "naked short" phenomenon. Total cost: unknown. $700 billion, anyone?

So, what's the deal? Well, it appears that it is incredibly easy to manipulate large numbers of Americans and their representative leaders via the equity markets and the information markets. Effect: so-called "value" evaporates, leaving someone holding the tab. And even worse: government, as is typical in a panic, responds with authoritarianism, bureaucracy, and worse, which history proves is always bad for you and me.

'Adam, are you arguing that panic leads to Fascism?' Well, I would hardly be the first. But it's funny that you should ask. Apparently someone suggested to US Representative Brad Sherman that if the House did not vote for authoritarian bureaucracy as a delightful middle ground, the next step would be the end of Posse Comitatus. That is: you would have soldiers enforcing local "law and order".



Just a little something to think about. Sounds kind of familiar, doesn't it? Say, about seven years ago? You, me, the president, Congress, the end of civil liberties, the legalization of torture, and an ongoing, useless war?

Oh, the Dow dropped another 370 points today. Maybe it's time to write your congressperson.

10/01/2008

This Museum Belongs in a Museum

I was in Seattle this past weekend--not for any reason in particular, just a little weekend trip. Megan and I visited the Science Fiction Museum and Hall of Fame, which, until we drove past it, I did not know existed.

We had to go, even though it looked a bit touristy--it is actually part of the Experience Music Project, which is sort of like the Basketball Hall of Fame for music--if you have never had the pseudo-pleasure, it is sort of an Epcot Center. Lots of video-mentaries on enclosed monitors, a bit of memorabilia, and some high-tech diversions like virtual reality basketball, and such things. So, the tickets were $15 apiece for admission to both the SFM and the EMP. Pretty expensive, yes, but I had to see what an SF museum was like.

The answer: not so bad, actually. There are good points, bad points, and missing points that could eventually become good points.

First, the bad:

It is heavily weighted toward movies (read: Star Trek/Wars), in an obvious attempt to get people in the door. The lightsaber duel theme from Episode One played on a loop outside the building. Most of the memorabilia and artwork in the exhibits come from SF movies, or from the movie adaptations of SF books. But a museum is a largely visual experience, so what do you expect?

It is also small. There are two floors, with a winding trail through each. The overall size equals about one wing in a "real" museum, like the Museum of Natural History or MoMA.

But, the good:

Among the artifacts that were not bought in studio lot/prop warehouse auctions are some actual things of historical value. Almost every exhibit that features a particular book or author contains a good-condition, first-edition copy of the work in question. See the photo to the right, which I snapped before I was informed that no photographs were allowed (sorry, there was no sign, and I was taking pictures of artifacts, not works, with no flash). Behold! The entire 17,000-page manuscript of Neal Stephenson's The Baroque Cycle! I have to be honest; I got a bit choked up while standing there in front of it. Not out of any sort of fanboy awe. (I also have to admit that the trilogy is still on my exponentially expanding "to read" list.) I felt a sense of overwhelming admiration for the amount of work that had to go into such an expansive, creative project, and for the sheer magnitude of what 17,000 written pages looks like. I can't judge the literary content standing on the other side of the glass, but still, the amount of effort that went into creating these pages is obvious. I know how I feel about my own petty manuscript notebooks, and how I would feel if any one of them ever escaped my grasp. So, to think that Neal Stephenson let this massive, handwritten accomplishment out of his grasp is the essence of creative generosity indeed. Oh, and if you were curious, the little larvae-looking things splayed out like armament for a stealth bomber are all the ink cartridges that he used writing the pages.

These are the things that the museum gets right. The annals of SF that are not still possessed by their creators are, I would imagine, largely in the hands of private estates or collectors, and in the words of Dr. Jones, they "belong in a museum"--for the sake of preservation and public display, if nothing else. This manuscript is a rare artifact of the writing process, and without automatically concluding that Mr. Stephenson will or will not join the same canon as Shakespeare, Homer, and Isaac Asimov (chuckle), these are things that could all too easily become lost, come the all-too-near apocalypse. I, for one, pledge to trek via zombie-sled to Seattle, to form a militia to guard these treasures of humanity.

Anyway, back to the museum. Also on the plus side is the layout. Rather than show the stuff in plot lines (the Star Wars section, the Lost in Space section, etc.), they arranged the museum thematically. You are presented, at each exhibit, with a curated view of a particular aspect of SF. For example, "dystopic stories", or "travel to Mars", or "confronting social issues". It made for a much more informative and holistic experience than simply seeing A REAL STAR WARS BLASTER RIFLE!!!!!! Highlights of this aspect include the section on the evolution of fictional spacecraft design, and the one on the behavior and motivation of aliens. This museum is laying the groundwork for college majors in Speculative Fiction Theory, so nerdy teenage males, listen close.

But there is certainly room for growth. There was a lot of interesting entertainment-tech, like a hands-on computer that would present various famous spaceships through a widescreen, spacestation-esque view screen. Certainly a lot more worthwhile than many other museums' hands-on offerings, but it could be better. SF has, beyond doubt, changed the way that our technology has grown in development and use, by predicting and speculating on humans' relationships with their cutting-edge tools. Why not do the same in the museum? I'm not suggesting laser tag, but maybe a little internet? WiFi units to interact with exhibits, like the audio tours in art museums, perhaps? Other Web 2.0 user participation could not only enhance the visiting experience, but also build support for this new-concept museum. At the Hall of Fame display, one can email oneself web links for further reading about the inductees. This is a good step, but as fast as the web is expanding, the SF museum has a lot of catching up to do. Just look at the inadequacies of their web site!

One last thing that I would think critical to a study of SF is not only the holistic viewpoint across the "genre", but also a view of its place within the rest of culture. What does it mean that SF largely consists of marketable media such as toys, books, movies, and other associated paraphernalia? How does being a commodity shape the speculative aspects of SF's "vision of the future"? And what is the relationship between speculative fiction and fiction in general? Do we expect different things from them?

Truth told, I had a great time at the museum, though it only took about an hour and a half. It's a great museum, though expensive, because it is a for-profit venture rather than a subsidized entity. I suggest you check it out if you are in Seattle with an extra hour and an extra $15. Frankly, I would love to be a curator for the museum. I can put that on my list of possible careers, right between "Pocket Battleship Captain" and "Columnist for the New Orleans Times-Picayune".

Hooray SF!

9/26/2008

1-2-3-4-5-6-7-8-9-10-700 billion

This post brought to you by the number:

12




What's going on today? Let's look through Adam's shared RSS feed!


"Normally, this is a process that would take months — years."

Instead, the law is being worked out, live on television, over the course of a few days.

NPR, quoting the chief lobbyist for the Financial Services Roundtable, Scott Talbott


And here are some other delicious quotes about the wisdom of this bullshit bailout. I can't believe this shit. And once again the Democrats are lining up. These truly are incredible times; I know, because I find myself agreeing with the Republicans on the matter. Wall Street? Can it! They wanted the free market; now they've got it. Of course, it hasn't been truly free for a long time now.

Seriously though, any elected official who votes for this bailout, any official who in any way sends any further money to these jokers, is on my list. As I paraphrase from an economist (whose name and literal words I am unable to find at the moment): 'this is not a plan to help the financial organizations, this is a plan to help the poorly run financial organizations.' Credit is our new great commodity, and it is the tool by which Americans are oppressed daily. If those who invented such commodity derivatives were unable to see when they pushed their vectors too far, then that is their problem.

I promise that any elected official who votes to support this bailout will never receive my vote again. Not for school board, not for dog catcher. This is the Iraq War all over again. People who knew warned that it was a bad idea; and then it was voted into existence anyway. Then later, when it fails, it's, "oh, we were misled! We had false information! The people who would benefit by what we did lied to us!" Bullshit. You are responsible. I guess I can take heart in the fact that this $700 billion isn't going to be used to explicitly murder people, but the money is gone. Meanwhile, the nation's highways and dams are falling apart. But for god's sake, let's preserve our financial (i.e. fake, derivative, oppressive) infrastructure! And don't give me that crap that without credit, nothing would get built. You give me $700 billion, I'll build you a road. You give it to failing banks, we'll still have potholes.

Anyway, on the lighter side of things, here is a very nice MA thesis about what sort of cosmic singularities you can expect on December 21, 2012, in the views of people who have experimented with heavy doses of psychedelic drugs.

I've found the subject fascinating for a while, from many perspectives: the history of astrology and calendar systems, the millennial and apocalyptic theories, and the multitude of different, yet very similar, "prophetic" visions that people experience while under heavy doses of plant matter that occurs naturally on the earth. The article sums up the theories in one, mostly unbiased piece that seeks to inform rather than proselytize. A bit long for someone who doesn't have any interest in the subject, perhaps, but hey, at least it's not about credit markets!

And lastly, the number twelve. As in the number of months in the year, the number of eggs in a dozen, and as in the high score of the most educational pinball game around. In the course of my job I often end up counting the same number over and over again. Counting into stacks of tens, or twenty-fives, etc. I've gotten pretty good at it--so good, in fact, that I can sing the 1-2-3-4-5-6-7-8-9-10-11-12 song in my head while still counting correctly to another number entirely. I love that Sesame Street bit; what a truly brilliant show. I consider that bit no small part of how I was able to adapt so easily to twelves-based math in Europe (where the times-table goes up to 12 x 12, rather than 10 x 10 like in the States). And the best part is, twelve is a number that is not 700 billion.

Scroll back up to the top and watch it again, in all its seizure- and psychedelic-apocalypse-inducing glory.

9/18/2008

The Publishers are Dead; Long Live Literature

- More about technology and the future of literature -

Two different articles caught my eye today, in the increasingly verbose re-hashing of the paranoia about the End of the Book. The first was a detailed analysis of the current status of the publishing industry in New York Magazine, and the second was yet another article in the form of a question (so, so gratuitously annoying a form) about whether literature will survive "the digital age", in the UK's The Independent. For background, both of these articles continue a thread which was perhaps not started, but rather summed up from the collective Luddite leanings of modern society, by Nicholas Carr in the Atlantic Monthly article "Is Google Making Us Stupid?" Note again: an article in the form of a question.

My answer: no. I already voiced my opinion that literature will live on in the digital age; an opinion voiced, perhaps, in a not-very-literary fashion, yet totally digitally. (At least my title question was more of a weak pun than an actual rhetorical act.)

Which brings me back to the issue, in light of these two readings I happened across, so I can make an important point:


Literature is written words.
Written words collected together in series are books. (At least until recently.)
Therefore, all literature (until the bright, blinding dawn of the digital age) has been in the form of books.


By the same token, all books are literature.


Yes, strike that false syllogism out! Unfortunately, those whose interests are taken with literature are often given to the inductive logic that "books" represent literature itself. Not true. Books represent many different kinds of word collections, some of which are quite awful indeed. To tell the truth, I would stand and watch most major publishing houses' flaming hulks disappear in a sizzling downdraft beneath the waves of our current cultural crisis. Can you imagine? Celebrity tell-all stories, political hack collections, and pet-themed cookbooks are not successful enough to keep paying their authors millions of dollars in advances, no matter how many of them have a giant "O" of an anus stenciled on the cover! Hosanna! The free market has finally done something right, and the snake is finally eating itself.

For anyone actually looking, literature is doing just fine. There are hundreds of working literary journals in the country, and where one falls over, three spring up. If perhaps you wished that you could walk into any bookstore in the country and find the same ten authors that made up some "list", authors you knew as well as your favorite Starbucks beverage, then you may be out of luck. But there are still many people writing, editing, publishing, and reading literature in print form. Though they might not be making much money at it. And regardless, I'm sure someone will still be publishing Stephen King twice a year, no matter what happens.

Literature is written words, and written words have never been more in style.

The idea that literature has to "evolve" in this crazy electronic world is pretty stupid, I think. Literature is not a corporation that has to cater to its stockholders. We should let literature evolve itself. The woman who claims, in the Independent article, that internet forms like Second Life, Twitter, and whatever else are going to give literature a new, digital life is an idiot.

First, as to the technology: ink on paper will still be around no matter what. Sure, its use will decline. But it will always be there for a simple reason: a sheet of paper doesn't do anything but lie on the table. It doesn't run out of batteries, it doesn't get erased by magnetic fields, it can't interfere with the navigational equipment on an airplane, and it won't become any more obsolete than it already is. It is the simplest denominator of the written word, and as such will always have a place in our culture, just as words will have a place upon it.

Second, as to the literary quality: the words that are used in cyberspace are most often decidedly un-literary. Literature, as my little syllogism was meant to show, is not simply given via the ability to hold content, regardless of how novel the container may be. Literature is an art that evolves within its own semiotic structure: part of, but not reducible to, its technological vial.

Can I say it more plainly? Yes: THERE WILL NEVER BE A TWITTER NOVEL.

Of course, I invite efforts to prove me wrong. I read a poem in McSweeney's that was written in the form of either text messages, emails, or blog posts, I forget which. Needless to say: abysmal. Stick to stanzas, not SMS. The former was developed to push literary content, the latter to push communication. The two are not the same. Making a book into a movie or video game makes the book no longer a book, plain and simple. Literature is still only the written word, whether in ink, in binary, or in LCD pixels. A book's character in Second Life is an advertisement or a simulation, not literature in any way.

Eventually, no doubt, there will evolve literature that finds its rightful place in the womb of our new digital culture. However, this will not change the fact that the last 1000 years of literature found its placenta made from good old ink and paper. (And before that, speech was the hip technology, and speech is just as likely to fade from common use as paper, in my opinion. True, the oratory has seen better days, but there are still artisans and audiences of the form.)

And furthermore, this striking-through of the concept "words + sphincter = literature" shows why it is idiotic to look forward to an "iPod moment" for literature, when some messianic technological sex-toy descends from the sky to "get everyone reading again". There is no such thing as an iPod moment; we are getting dangerously close to the "great man of history" theory here, a decidedly reactionary conception of anything. (Then again, most literary critics, even the so-called "materialist" ones, seem to conspicuously avoid seizing the means of their production.) The only thing the iPod did (even though, actually, it was the mp3 that made it all possible) was to give music its "indoor plumbing moment". How amazing a breakthrough is it, really, that now we don't need to rely on record companies and ticket agencies to hear and share good-quality music? Raw sewage is no longer flowing in the streets? How delightfully modern!

The iPod for literature is the book. Anyone can write the text, and anyone with an hour, some glue and some paper can bind one. Then you can give it to a friend, sell it to a shop, or burn it if you wish. You can carry it anywhere and it doesn't need electricity. It will even work in zero gravity.

To sum it up: literature, as a field of artistic creation, will probably stay about the same regardless of what surface or substance it is read from. The big book corporations and the music corporations will both, hopefully, go their appointed ways. I'm not worried about literature in the slightest. In fact, I bet literature will only get better. As I've said before, it's only recently that the literacy rate has gotten so high; it should not be surprising that the literature rate has stayed about the same. Oh, and beware those who try to sell you on the quantity-quality conversion (AMAZON). It was just those sorts of quantitativists that caused the failure of the publishing houses to begin with (on the stockholder side AND on the rich-author side).

9/16/2008

Interdome Notes, Vol. 1

Although most of my posts are long-winded, pseudo-philosophical, personal exhibitionism exercises, I'm going to try to insert more short segments and brief thoughts, with possibly even no theoretical relevance whatsoever. (We'll see how I do with that.)

Though I've provided ample exegesis on my fondness and support for Google's many projects (even while I feel a bit like Heidegger's 'Rectorial Address' every time I declare my metaphysical love for a corporation with a market cap of $140 billion; that's right, not like Heidegger, but like the address itself), I think Google Reader is my favorite. This is actually an affinity for the concept of RSS more than for Google; RSS is a little cousin of HTML and XML that lets you create your own synchronized newspaper with an appropriate client.
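For the curious, the plumbing really is that simple: a feed is just a small XML document listing entries, and a "client" like Reader fetches it and pulls the entries out for your personal newspaper. Here is a minimal sketch in Python of what that looks like (the feed content below is invented for illustration, not pulled from any real site):

```python
# A toy RSS "client": parse a feed document and list its entries.
# SAMPLE_FEED is a hypothetical RSS 2.0 document, hard-coded so the
# sketch works without a network connection.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Welcome to the Interdome</title>
    <item>
      <title>You Can Take That Panic to the BANK!</title>
      <link>http://example.com/panic</link>
    </item>
    <item>
      <title>This Museum Belongs in a Museum</title>
      <link>http://example.com/museum</link>
    </item>
  </channel>
</rss>"""

def read_feed(xml_text):
    """Return (feed_title, [(item_title, item_link), ...])."""
    root = ET.fromstring(xml_text)          # <rss> element
    channel = root.find("channel")          # one <channel> per feed
    items = [(item.findtext("title"), item.findtext("link"))
             for item in channel.findall("item")]
    return channel.findtext("title"), items

title, items = read_feed(SAMPLE_FEED)
print(title)
for item_title, link in items:
    print("-", item_title, "->", link)
```

A real client does this on a timer for every feed you subscribe to, merges the results, and remembers what you've read; but the core of the whole RSS idea is no more than the dozen lines above.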

In Reader there is a delightful "share" function, which creates your own RSS feed, similar to a mini-blog, out of the entries one finds compelling, along with one's commentary. So, since I have no Reader buddies, I will be sharing these little internet tidbits via my blog; hopefully this will increase my post frequency and also spread the RSS love a bit, since I know that I have at least a few regular readers here.


So, here is
Interdome Notes Vol. 1 (general link to my shared material enclosed)

Boing Boing brings us this little tidbit, about McSweeney's apparent plagiarism of the heavy metal website Encyclopaedia Metallum. The website, and the book, are nothing more than a list of all known metal bands (also the title of the McSweeney's publication). If you didn't already have a reason to dislike McSweeney's, here is another (heh heh). I'm all for open use and circulation of materials on the internet, but republishing for profit, when it's not in the public domain? That's a no-no. Especially if it is still currently in "print", which the website most certainly is. What's next? Are they going to print Wikipedia too?


Anyway, more of these little items as I find them. I would also mention the current stock market turmoil, which I am watching with interest--but there is no need to enclose a link to that, because you can find it as soon as you open the internet.


Cheers,
Adam

9/11/2008

Grappling with Google

I am an avid Google user, I'll say that right away. My main personal email account is a Gmail account, and I clearly use Blogger for "Welcome to the Interdome" and my other, less-regular blog projects. But additionally, I use many other Google 'products', as they have nicely named them. Because Google's growing hegemony is always news in this Interdome world, particularly so of late with the release of Chrome and the search engine's tenth anniversary, I thought I might share some of my theories and reflections about the modern phenomenon known as Google, both as a user, and as one with a penchant for waxing philosophical about semiotics, the Internet, the future, and any and all correlations between them.

Firstly, let me say that this will not be a review of Chrome, and furthermore, I have not yet had the (by almost all accounts) pleasure of trying Google's new web browser. However, I am a user of many other web applications by the big G, on levels ranging from newbie experimenter to heavy user. I use:

Gmail - email
Reader - RSS compiler
iGoogle - widget customized home page
Blogger - blog publisher
Finance - stock portfolio and other market tools
Docs - web based document editing
Sites - wiki-like sites
Notebook - web and hyperlink notes
Bookmarks - bookmarks
Page Creator - a very canned web page creator
Apps - Group Intranet
Checkout - I think that is what it is called; it stores billing info for use with approved online vendors
Search - in various guises
Talk/Chat - IM
iPhone App - a native app portal for various searching services on the iPhone
Maps/Earth - different incarnations of map searching
GoogleSMS - search via SMS message

I think that's it. A lot, eh? And while you have no doubt heard of many, I'm guessing that there is almost no one (who isn't a Google developer) who has used them all. I didn't even know I was using them all at first. That's right: I was Googling beyond my wildest dreams.

This is the characteristic of Google that I think is most relevant, and the most likely to cause the Google campus to be stormed by hordes of torch-throwing Luddites. No, not that you are forced to use the word "Google" more times than the mind can take in a simple essay, but... oh wait, yeah, that's it.

So far, I have used "Google" as a noun, a verb, and an adjective. I can see adverb and preposition as possible too, though I'm not sure about article. The point is: Google has invaded grammar from every angle. At first, when Google was a verb synonymous with "search", it was just a brilliant piece of marketing and a kick-ass testament to the kind of market domination most tech firm officers probably have naughty dreams about. Now it has spread even further in meaning, if not in common usage. When I talk about Googling myself now, I can't help but think of the way that my internet-self has become incarnate within Google's server farms.

For instance: my blog is hosted on Google's computers; my financial interests in the market (though only "interests" in the most casual sense, the holdings being part of a stock-market game) are analyzed through their Flash charts; much of my data is off somewhere in their server cloud; even my credit card and billing information are held in trust since I purchased a URL through their services. Somebody call Sandra Bullock; the net (I mean Google) has me.

This is the essence of the cloud. Cloud, for those of you who don't geek out on semiotic/tech/future stuff, is the new term for computing resources that are not on your computer. They are hosted "in the cloud", so that you can access them from anywhere with a connection to the internet, and so that you don't have to have them filling your own personal terminal. Webmail was the first mainstream cloud application I suppose, though conceptually, the point of the entire internet is that it is in the cloud. Equally accessible, always on, as long as you have the hardware to "log on" (and as long as net neutrality wins the day).

This futuristic concept of interconnectedness really troubles some people. They are paranoid, not only about one's microwave talking to one's car radio via invisible wires, but also about having fundamental aspects of their lives and personalities not found anywhere in space. Ropes are safe, because you can see them. Magnets are black magic, because electromagnetic energy is invisible. Thinking or speaking is natural, but writing is the devil's work, because it perpetuates beyond the moment and can carry ideas outside of the soul, etc. This is the history of time, and the human technological legacy. The minute something with "substance" comes to exist in a new dimension, especially one that is non-visible or non-spatial, get some wood and some rope, because we're going to burn these damn witches out of our town.

I actually like that I have a non-corporeal existence. (Okay, I wasn't too thrilled with having my credit card info tied to my address and email, so I took care of that.) But the entire reason that I have spread myself through the Google-verse is that I like having an identity in the cloud. It's actually very convenient to have messages, addresses, and other personal, oft-referenced information stored in a dimension that is ever-present and, more and more, accessible from nearly anywhere. Google likes it too, because this is one of the goals of their business. If they drive computer usage into the cloud, they will win, because the cloud is where their applications are often the best. And more importantly, there they can tax the usage through advertising and other, hardly noticeable means.

You may not believe that I, who often rail against the dominance and hegemony of any particular system (especially the for-profit ones), am glad that all of my cloud essence is via Google. The key here is interoperability. Back when I was first getting into cloud living, I had a blog, a private-community message board, three email accounts, a thousand bookmarks in my browser, a MySpace, a Facebook, and some other crap too. Trying to sync all of these listings was just too time-consuming. In addition, having it all unified under the Google system means that there is a certain guarantee of quality--I am not going to be inundated with pop-ups or emails stemming from my Google account. In fact, their spam blockers in email are some of the best I've encountered. Of course, it could be argued that their motivation is to let you focus on their own advertisements. But even here, Google's corporate face maintains some stability. They clearly understand that when you blast a person with advertisements, the advertising becomes ineffective. Notice that when you enter the Google-verse through a mobile system, you see a lot fewer ads. If one-fifth of my screen were an ad bar, using my Gmail on a mobile phone would be useless. And if I can't use Gmail mobile, why would I use it at all? Hence, by presenting ad-free content, they maintain my usership, and can present me with minimally invasive, targeted ads when the time is right. This is worlds better than the adporn on MySpace, or the bouncing, distracting Flash on many sites, or even better than the radio ads that tempt me into buying a car by shouting at me. If only all the ads in the world were Google Ads!

And this is why Google released Chrome, a browser built for cloud apps, and why they are working on a mobile OS, Android (which we should probably see sometime this year on an actual phone). They want to ensure interoperability and compatibility, so their cloud interfaces can win customers who will keep it all "in house". They are going for brand loyalty here, which I think is the smartest way to run a business. If Chrome maximizes the cloud experience, and if all of their apps work on all mobile phones, then they win. They don't have to own the software or the hardware, and hence have been developing both Chrome and Android open-source. They can benefit from others' ingenuity, and keep it all in the family. The future is Google, and the future looks good.

Now, don't think I'm buying Google stock just yet. There are serious problems stemming from this domination, and I'm not referring to the Google mutant army being trained underground. Domination leads to hegemony, and hegemony leads to a lack of change. And change is just what has allowed Google to evolve and stay ahead. They can't forget this, otherwise it won't matter how many new apps are launched, or how long the Google product list becomes. After the "inter" that describes how the apps keep the user within the Google universe, "operability" is the other half of the word. It is one thing to create a branded app for every cloud function you can think of, and buy the ones that you can't create (e.g. DoubleClick and YouTube). But they have to work. Here Google has had a number of successes, but I also see some failures.

Email and search are the biggest successes to me, and naturally so, since I would bet these account for 70% of the functionality of most people's web experience. Across every app there are very good search functions built in, and these are getting better all the time. It doesn't matter if I'm searching my email, the web, web images, my portfolio, or even my desktop; I can find what I want almost instantaneously. Their search capacities put the Google in Google, you might say. Gmail is less-heralded brilliance, but groundbreaking nonetheless. Infinite archiving, search, and, as I already said, some of the best spam blockers out there. What you want, when you want it, and none of the crap. The "conversation" format is also very good because it feels very natural, and many other text-communication services have come to style themselves on this model, though Google was probably not the first to use it.

The other easily generalizable feature of Google is also the beginning of its weakness. One word: function. Google seems to have a very good handle on the range of people that use the internet. There are people who have no interest in the internet other than checking email, viewing a few photos, and perhaps searching for an address or an article from time to time. There are those, usually teens, who want video and flashy graphics. And then there are those with specialized interests, ranging from investors, to techies, to anything else you could imagine. Google spreads itself across all these areas, encompassing both minimalism and detail.

But the problem is that sometimes Google seems spread too thin. It seems that they try to cover areas in a temporary, "yeah, we've got that" sort of way, and don't invest the time necessary to really integrate the app into the overall system. For example, it was only recently that they integrated my contacts from Gmail/Chat into my share list for Reader. This would seem like a no-brainer from the very start. In addition, I can insert RSS feeds into my Gmail, my homepage, and my blog. But I already have a mammoth list of RSS subscriptions in Reader. How come the only place I can view my Reader list is the homepage? Why isn't it carried over to my blog if I want it, or at least my Gmail? It continues. I can absorb URLs and a text note into my Reader "shared" folder. I can do the same thing in one of my Notebooks. But I can't interchange the two. I can share Reader entries via my contact list, but can only share Notebooks and Sites via email addresses. I can "Follow" blogs via Blogger, but this has no connection to my RSS feeds already in Reader. I feel like I'm back in the old days, trying to update my "favorite books" list across three different social networking sites, or remembering my different screen names for different IM clients. I haven't even tried Orkut, Google's own social networking client. I'm afraid of what I might have to re-input there.

And here is my worst compatibility experience of all. I was trying to set up a web page for my wife's artwork. Searching the web to see all the different hosting options, I saw that I could register a domain through Google Apps. How easy! I wouldn't have to create a new password and user name; I could just extend the functionality through Google. In less than three minutes I had a domain, and I was ready to go. I had launched Google Apps, a quasi-intranet app that makes a sort of Google homepage for your users. Okay, cool. Megan can check her site email via Gmail, and I can administer through mine. This is good compatibility. But wait a minute: I need a new user name for this site, because Apps logs in by the domain name, not by Gmail. New user name? And so it begins...

Then I tried to design the page. Up pops Google Page Creator, a very, very basic site designer. I'm stuck with like three different styles, and a choice of one, two, or three columns. Okay, at least let me edit the colors. Nope. I go to the help page; I figure, if I can alter the HTML of my Google blog, I should be able to change the hex values of the colors somewhere. The help page consists of four entries, and a notice saying that they have ceased support and signups for Page Creator, and that they now direct us to Google Sites, with "new features". So over to Google Sites I go. By this time I have six different tabs open in my browser.

At Google Sites, I find I need to register again. I use my standard Gmail login this time, and now find that I'm not the proud owner of a website, but that I have started a group wiki that is very similar to the Google Apps look, what with Google functionalities like Calendar and Video dropped into it, but with even less customization in terms of looks. Furthermore, there seems to be no way to link this Sites functionality to Apps or Page Creator. And still, no HTML editing.

For a different project I thought that perhaps this wiki setup could be useful. I could have a multiple-user space for editing, with all the features included. Wrong again. Labels (like you see at the bottom of my blog posts) are used throughout Google and the rest of the internet to group disparate articles and entries, like blogs or wikis. But although there are tags in Blogger, there are none in Sites. And furthermore, there is no way to export a Site! Once it's there, it has to stay there. In the Support Group I found a thread in which others ask about this very critical functionality. Mike, a "Google Sites Guide", directs us to a tab in the "owner" mode, but then apologetically recants, saying that this was an experimental function that hasn't been released yet, and that he wasn't sure when they would release it. That was back in March. I eventually went back to Notebook, where my shared users can add label tags and export the Notebook either as HTML or as a Google Doc (both very useful!). But unfortunately, there is only one level of hierarchy available in Notebook, and no linking between additions, like there could be in a wiki.

So in the end, I have all this functionality, but it is useless for what I want. It is designed piecemeal, so that certain users get precisely what they want, but if anyone wants to dig deeper we're stuck, even if this sort of functionality exists elsewhere in the Google universe. I feel like I have the most awesome set of Legos ever sitting in front of me, but I don't have any flat pieces. I can build a really long wall, or a giant tower with no roof, but the only sort of enclosed building I can make is a chunky-looking, completely solid pyramid.

So this is where Google has to really advance. This functionality has to be completely integrated. It might even have to be rebuilt, starting from the bottom. What we need is a linguistic syntax; a way of connecting functionality (verbs) with data (nouns) that doesn't require ten languages, each with only a couple of tenses. I want to write, have written, wrote, and be writing all in the same language. If I can, then I will use that language. If not, it will become a dead language, and that's all there is to it. It's a philosophical mission: to unite functionality through the interface in which it is used. Isn't this the basis of all good-thinking philosophy, whether in politics, economics, morality, or language? How can we describe the world, both qualitatively and functionally?
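Here's a minimal sketch of the verbs-and-nouns idea in code. To be clear, every name below is invented for illustration; this is nothing like any actual Google API, just the shape of the grammar I'm asking for: every "noun" in the cloud, whatever app created it, answers to the same small set of "verbs".

```python
class CloudItem:
    """Any 'noun' in the cloud: a feed entry, a notebook note, a blog post."""

    def __init__(self, title, body=""):
        self.title = title
        self.body = body
        self.labels = set()
        self.shared_with = set()

    # The 'verbs' -- identical in every app, chainable like tenses.
    def label(self, tag):
        self.labels.add(tag)
        return self

    def share(self, contact):
        self.shared_with.add(contact)
        return self

    def export_html(self):
        return "<h1>%s</h1><p>%s</p>" % (self.title, self.body)


# A Reader entry and a Notebook note then speak the same language:
entry = CloudItem("Archive Fever").label("music").share("megan")
note = CloudItem("Site ideas", "colors, columns, hex values").label("website")
```

If labeling, sharing, and exporting worked like this everywhere, my Reader shares, Notebook clippings, and Blogger "Follows" would be one vocabulary instead of five.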

I know re-creating Google from the ground up is highly unlikely. But if they can really unify all these tools, then they will really succeed in their goal. Microsoft first dominated because they allowed for a full span of different levels of functionality all within the same operating system. If Google can do the same thing in the cloud, well, then we'll hardly need operating systems anymore. Everyone will use a simple computer loaded with Linux, Firefox (maybe Chrome eventually?) and a shit-load of RAM.

8/27/2008

What if I Told You that the Hegelianism Never Ended?

I think my last post responded to a column on Erik Davis' website Techgnosis (though originally it was published elsewhere). As I think I mentioned, I'm a big fan of his writing and his work because I have such a confluence of interest with his topics.

For instance, this recent post, which deals with Blade Runner, Slavoj Zizek, and SF literature. As potent a melange as a beet, chèvre, and dandelion-green salad!

But let me add the vinaigrette, if I might be so bold: Davis draws attention to the uncanny realization of the replicant in Blade Runner. He quotes Zizek, discoursing on the film:

"Let us recall how, in Blade Runner, Rachel silently starts to cry when Deckard proves to her that she is a replicant. The silent grief over the loss of her “humanity,” the infinite longing to be or to become human again, although she knows it will never happen; or, conversely, the eternal gnawing doubt over whether I am truly human or just an android—it is these very undecided, intermediate states which make me human." [Davis' emphasis]

Yes, this is indeed the post-modern aesthetic that makes such movies and stories so popular in this modern age. "I thought I/it was... but in reality, it was..." The horror, the tragedy! Our ideas explode into flame on the jagged, gothic spikes of reality. The matrix has us all, and we constantly struggle (uselessly) against its modern power that even gets into our minds, (bodies) man!

And self-described Hegelians like Zizek try to argue that it is this liminal zone, this uncanniness, that is the very reality of consciousness, or humanity. And non-self-described Hegelians do as well. (Whether they are Hegelians or not...) Most of the critical thinkers of the last century draw our attention towards some sort of liminality, some borderline, or sublime transference between two states that is used as a point of pivot for their theory. And they are not wrong to do so; this is a very new concept for theory. Many philosophies rigorously distinguish between the this/that, and totally ignore the and/or point. It is a new point on the dialectic: the point of transfer that unites the entire equation...

Or is it? I'll can the rhetoric: no, it is more of the same. It doesn't really matter how uncanny it is, or that it is "shaky" as opposed to the ur-ground of older philosophers. What it is really looking for is a new point of authenticity.

Authenticity. I said it again. This is the word that I use for it, harvested from the translations of Heidegger that I have read. Heidegger is a great example of this search for authenticity for several reasons. One: despite the very historically important dualities that he disregards and deconstructs, the authentic vs. the inauthentic is a constant parallel to which he returns. Two: although he all but explicitly states that there is no qualitative preference for authenticity over inauthenticity, you can read, very loudly, that he is striving for the authentic. Authentic being, authentic metaphysics, and authentic society. This last item brings us to point three: he accepted Nazism. Now, I would not put his work in the "Fascist" section of the library, nor would I say that it should be read as a causal vector towards Fascism. However, it can't be ignored that, brilliant and groundbreaking as he was, he was swayed to desert his colleagues, fellow humans, and even his lover in favor of the volk, with horrific general consequences of which we are all very aware.

Oh, and lastly, "authentic" is a word that is pretty easily accessible to those who haven't read Heidegger. Which, for some reason, seems to be a category that includes almost everyone. Its meaning is not so nuanced: there are different ways of doing the same thing. You can win by striving hard according to the rules, or you can win by cheating. One is authentic. Why? Because it wouldn't be a race if it didn't have a start point and an end point, and a prescribed course between the two.

Human race, Aryan race, 100-meter dash... all the same? No, of course not. But all have "authentic ways of being" what they are. This is what Humanism is. A relatively recent development, it takes a collection of moral axioms and cultural standards culled from the last 5,000 years or so, and binds them all together in one agnostic, ill-defined package called "humanity". For example, if there is a genocide or other tragedy somewhere, you must declare outrage at the fact of suffering, but not necessarily do anything. Competition is fine, as long as there is a general belief in the "fairness" of this competition. Monogamy, for some reason, is really sweet. Love is also great, as long as you submit sacrifices of lust on the altar from time to time, because courtly love is, like, way creepy. This is what it means to be human, authentically, circa 2008.

Of course, this is not a monoculture. We have plenty of counter-authenticities. Being ironic in the face of others' suffering is cool. Opting out of competition is also really chill. And generally thwarting monogamy to be "in love with lust" is really current. Especially if you post about it on your blog. You can also post about Zizek on your blog: case in point, myself.

And this is what Zizek is doing; he is defining a post-modern, self-described Hegelian, sub-Lacanian, authentic humanity. Which is: a totally un-humanity, where we are constantly disturbed that in this crazy, year-of-our-internet we don't know where the human begins and the iPhone ends. This isn't a humanity of commandments, or of morals, or that you can write about in a book. Oh, wait, Zizek has written it in a book! How many books has he sold lately?

And it is a very attractive idea to the humans who are constantly stunned by what "they" thought up next, or what's on the internet now, or what's new and different. What if we did wake up and discover we were all robots? Oh well, that's just these times that we live in. My entire reality is rewritten via Wikipedia every night, and broadcast on YouTube every morning.

No! This isn't humanity, or authentic anything! It's channel surfing; it's hyperlinking; it's trivia night at the local hipster bar! The problem is, the authenticity itself is false. It is a wild goose chase, a constant search for what is real, or even, ironically unreal, in and among these shifting sands of modern... what? Irreality? Unreality? Falsehood? Machinery?

This is what PKD is getting at in his books: you can't bet on anything to be what you think it is, not ever--whether it be robots disguised as humans, humans disguised as robots, or humans who think it is cool to be a robot. All his main characters are stumbling through worlds of bizarre, uncanny occurrences. But these are not worlds of quicksand reality, where the bait-and-slip is so, so post-modern. SF is always already the questioning of reality. What if we could fly to the moon? What if the Germans won WWII? These are concepts that fuck with reality. PKD's characters do not have reality because their events occur in worlds that are already twice unreal--they are fiction, and speculative fiction that is meant to be different than reality. The characters trip through worlds of sensation and harshly cut-together events, and the only reality is the sense that the phenomena create at the moment. Any search for meaning in a more meta sense is doomed to break itself off in a fiendish circle of double, triple, and quadruple ironies that expose the fascination with such metaphorical curiosities as the fashion circus that it is. Sometimes the characters find an ending, and sometimes they don't. There is no reality, and yet, it is as real as anything ever is, anyway.

Take The Man in the High Castle. This speculative concept of an alternate history: what is the conclusion? The conclusion is, this world is made up, invented by an author. Germans win, Germans lose; there was a war, people die after living their lives, and those who survive now live in some sort of post-war world. Of course, everything is contingent on what does happen. But what is the end result of reality? Nothing. Except itself. No history is more authentic than another, and in fact, no reality may be more authentic. If one day we wake up to find it was all a sham, guess what? We still have to go to bed that night and wake up again the next morning.

What if this town is the battleground between chaos and order? What if I'll die tomorrow? What if I'm already dead? What if none of these statements are true? It's not nihilism. Nihilism is the constant, fleeting search for meaning that ends with us dying, having found nothing more than when we started, only flitting from eternal truth to eternal truth, to the death of eternal truths, to the birth of tangential truths only found in gritty (and cyberpunk, if possible) liminal spaces.

The movie of Blade Runner can be interpreted either way, depending on what is important to the viewer. Is authentic humanity important to you? Then waking up as a non-human would be a huge problem, and one where you could philosophically find your real, true humanity (and sell books about it). Or is simply living important, as it seems to be to Deckard? If you can't tell the difference between your authentic humanity and your replicated humanity, why worry?

"Too bad she won't live... but then again, who does?"

The Nexus-6 who dies after rambling on about C-beams glittering through space (an excellent scene, by the way) is dealing with problems of existence, not of humanity. What is it like to die? Well, he's actually lucky, because he has discovered, from the boiler-plate on his spine, that he has a maker, and can go to his maker's address. And when the maker fails to put off the issue of life/death into the future for you, you can kill him, and maybe feel a little better. Most of us don't have that option, and instead we can only muse about such problems of existence in SF novels or on blogs.

But what if I woke up tomorrow and discovered that I wasn't going to die? Well, I suppose it would be a little weird at first. I would probably have to find something else to blog about. But is my toast going to taste different without the sword of Damocles hanging over my head? Maybe, if you are a humanist and thought that this relationship between you and your end colored your entire life with meaning. Or if you are a self-described Hegelian, and your conception of reality was based upon a flip-flopping from one plateau of the dialectic to the next, given gravity by that huge authentic signifier forever floating out above your head somewhere. But if you were a realist, and understood all meaning to be as uniquely real as it is transient, then perhaps you'd merely grin and see if you could get a date with that hot replicant receptionist. It's not Buddhist. It's simply real. I always thought it was funny how, despite the existential, speculatively mind-warping time-space crises that PKD's characters live through, they never seem to miss an opportunity to sleep with a woman or imbibe a mind-altering substance.

So, when that pharmacist shows up at your door with an energy beam that reminds you that the Empire Never Ended, just smile, take the bag of painkillers, and see if you can pimp it into an SF trilogy. Thanks, and goodnight.


ps. One technical note that I thought of after finishing the post. If the replicant is truly undergoing a universally human, existential experience upon the realization that one is actually a synthetic human, rather than a wetware version, then it follows that the ability to have an existential experience such as this is actually a synthesized experience. Then it is not a "truly human" problem, and the Nexus corp. only engineered it as a truly synthetic experience. There is no way to say how a true human would respond to the discovery that they are synthetic, because a true human could never, truthfully, discover such a thing. However, the fictional tale of someone very-much-not-quite-unlike a human finding out this truth could evoke a lingering doubt in true humans, who would then philosophize on the idea despite its purely speculative nature. And perhaps this is the actual meaning of existentialism--the pondering of realities that are like our current reality yet outside of the reality of experience (e.g. "hell is other people", yet who has written the book about what the opposing heaven is like?). After all, part of the interest in what happens after we die is that there is no one around to give us a review. What are the limits of experience if there is no way to experience them?

But anyway, I think this conundrum brings up a much more interesting question re: Blade Runner. Why the hell did the Nexus corp. invent a machine that would have an existential experience? Was it a philosophical experiment, under which they created a machine in their own image, one that had the same questions about life, the universe, and everything that they themselves had? Or was it a programming bug that developed while they were trying to create a machine with conscious-like artificial memories (the key being, the memories linger but fade, and can be recalled in a not-quite-conscious way, giving rise to an understanding with a limited scope of consciousness, which would invariably question what lay outside such a consciousness)? Or was it more purposeful? Frankly, it's a pretty good fail-safe. Your memory machine breaks down, and realizes that it's a fake (in other words, that limited consciousness now sees more than it was supposed to). So, just like Windows, it phones home the fatal error! The machine travels across the solar system with no desire greater than wishing to meet its maker. Master, I have fatal exception E845FG1 (error code: theorization of the limited totality of consciousness), please fix me!

8/07/2008

The Electro-Fascism Genre and anarchiTunes

Erik Davis is a writer on a large field of topics, including dark metal, information theory, mystic energies, iPods, and the paranoid delusions of popular science fiction authors. His website, Techgnosis, is a compendium of sorts for columns and musings that he publishes in various places. It's also in my Google Reader list. Recently, he wrote a column for Arthur Magazine, "Archive Fever", in which he discusses the way that certain folks archive their digital music collections. Digital music and its access is a topic he has discussed elsewhere, and I've found it an interesting focus. Everybody knows the common story of how "Napster changed everything," but these sorts of story lines run in a very generalized sense, as in, how the commercial market is changed. Davis' articles provoked thoughts of my own on how the individual's interaction with the media is changing, both through a connection with digitized media and also through mobile technology. Since I happen to be on a bent of discussing mobile, digital technology and its effects on individual consumption, and I also stumbled into a huge increase in my collection of digital music, I collected some notes; they are related to Davis' ideas, yet reach a slightly different conclusion about the urge to fiercely manipulate the way one catalogs one's digital media.

In the past year, through situations I don't really understand, I've become a "DJ" of the sort that does not have any art to it. Rather than co-mingling the latest tracks on the dance floor for the new digital underground, I am hired to set up stereo equipment and a playlist for weddings, alumni events, and other horrible things that seem to occur every day behind closed banquet hall doors. The point is that, through this, I was able to get my hands on roughly 200 gigs of new music (or about 50,000 songs) to add to the 50 gigs (12,000 songs) that I already had. Most of it is complete crap, not even having the interest of cultural artifact that pop music has. However, there is a considerable number of oldies, motown, doo wop, and other forgotten mini-genres that would be lovely to have, and that have now fallen into my lap. I'm guessing, however, that I will probably end up deleting about half of the volume before I wrangle it into any sort of usable library, by deleting misnamed, duplicate, and racist country songs (they listen to weird stuff at weddings in Oregon).

This is a major databasing task, merely to get the data into the right order to begin to pick and choose what I want. I can tell you that I am also an anal-retentive freak when it comes to my music collection, not unlike Cory Doctorow, as discussed in Davis' column. My track info in iTunes (my music player of choice, perhaps unfortunately) is immaculately maintained--I like to have artist, title, and album info all perfectly correct and relevant to other tracks by the same artist. In addition, I also try to make sure the year of recording is correct, and I have thoughtfully genre-fied my music into a specifically small set of 17 genres. Being presented with this horde of poorly cataloged music is a great source of anxiety for me; I have gone so far as to only work with it under terms of quarantine--I have set up a separate iTunes library for working with it, so that its files cannot be released into my general population. I believe that the ability to form separate, disparate libraries was introduced in iTunes 7; I only mention it because it seems many users aren't familiar with the ability (hold down shift, or possibly the option key on a Mac, while you click to open the iTunes program; a prompt will appear allowing you to create or select a library to work from).
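For the quarantine stage, the first pass is pure plumbing: flagging the exact duplicates before any hand-curation starts. A rough sketch of the kind of script I mean (assuming a folder of loose music files; hashing contents only catches byte-identical copies, not a re-ripped version of the same song):

```python
import os
import hashlib
from collections import defaultdict

def file_digest(path, chunk_size=1 << 16):
    """Hash a file's contents, so exact duplicates can be grouped
    regardless of how badly they were (mis)named."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(root):
    """Walk a music folder and return groups of byte-identical files."""
    by_digest = defaultdict(list)
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith((".mp3", ".m4a", ".flac")):
                path = os.path.join(dirpath, name)
                by_digest[file_digest(path)].append(path)
    return [paths for paths in by_digest.values() if len(paths) > 1]
```

After the byte-identical copies are gone, the remaining dupes (same song, different rip) still have to be caught by eye, or by the tag data itself, which is exactly the part that is a mess to begin with.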

I can tell what you're thinking (can I?): I am suffering just the sort of massive commodity hysteria that Davis referenced. The ability to reduce so vital an art form as music to bits of data has created a new neurosis; now we "own" rather than "listen". I see it a different way. This is an unbridled opportunity to actually interact with what we are doing, and seize control of our own relations of desiring-production. This is nothing less than anti-fascism, in Deleuze and Guattari's sense.

Here's how I see it: information, regardless of media, is a sort of "literary" substance to our culture. Just take that as is for a minute. Now, if we also consider that "those in power" throughout the centuries have tried to control other people through their use and control of information, we can see that this information substance is molded, shaped, and distributed by those who use it to maintain certain power structures and relationships. Almost like they produce it. Post-structuralism, post-modernism, etc. We've heard it before. At any rate, nobody will really argue that Gutenberg's press wasn't a giant victory for the little guys. And radio as well; the ability to transmit information to a large population near-instantly had a profound effect on the way governments and others wielded their information substance. As a result, powerful entities tried to regulate these new technologies. The internet, and net neutrality, is the modern incarnation/incantation of this dark necromancy called "freedom of information". But this is all basic history, and hardly anyone would disagree.

What is different, misunderstood, and related to Deleuze & Guattari's philosophy (ha!) is that we are dealing with the aspects of what they discuss in the plateau of A Thousand Plateaus entitled "The Geology of Morals". Beyond all the distinctions of form, substance, content, etc., there is a crystallization of information that is its strata, its very existence by way of its molecular structure. Even beyond molecules, because they certainly were not after desire as a new atom; you can always push the zoom one lens deeper (if a visual metaphor even applies). What is being released now, very quietly, is the ability to change the data for ourselves. How do we change it? By altering the structure in which it is produced and used (consumption-production, if you will).

With Napster, they tried to threaten us. "If you keep downloading music, then artists won't make any more money on it, and then they won't make music anymore." A logic that anyone with a job (or a boss) can understand. As if the millennia-old drive of homo sapiens to create art in the auditory realm will up and stop because the money does. What were they really saying? "If you behead the boss, then he can't pay you anymore." Or, "if you like Metallica, you'd better stop, because Metallica likes being rich, and they won't make music if you don't make them rich." The idea that music could be easily stolen en masse was one thing: the RIAA was going to lose their lock on the pattern of distribution; their control over production, their power, was being threatened. In 200 years the idea of the RIAA having the right to control music production and sue to protect it will be as strange as the idea of the Catholic church having the right to censor all books. Well, maybe 400 years.

But the "stealing" thing was only the face of the battle, depicted for (again) media consumption in order to win control in the moral structure of the courts and public opinion. I think the real battle was about the form, the digital music file. Not only is it easily copyable, it's also mutable, in a way that no other art form has been. Remixes and mashups aside, with programs like Winamp and iTunes the control of music listening itself has been taken away from the distribution networks. Radio is no longer necessary; just look at the glut of "HD Radio" ads to see how the radio industry has recognized its own twilight. Music stores, and the harsh distribution control networks behind them (ask any indie record store owner how much money they get from the sale of a cd) are no longer necessary, even for legally sold music. And, relevant to this discussion, more abstract methods of control like "critics", "acclaim", and "genre" are becoming defunct.

This is all about distribution. There are, literally, millions of songs out there. But how do you know which to listen to? No money will flow if the track isn't played. Hype, airplay, and word-of-mouth are all very important, and until recently, these could be controlled, i.e. bought and sold, by those powers that made music "popular". Our Band Could Be Your Life is a well-known saga of the compiling of these sorts of distributive indicators, these cultural markers, what in a more Marxist tongue could be called the means of production and distribution. Casey Kasem is out, and even your little sub-culture is not so important outside of its figureheads; now the blogs are in, and the individuals who publish them.

Personally, I think mp3 blogs are great. It isn't Napster; I don't get what I want when I want. Instead, I have to search until I find some individuals who are interested in similar things to myself. Then, via their posted downloads and my comments, we communicate and share. Overall, it's a highly lucrative (music-wise) and non-pretentiously rewarding system. It's also a great resource for archiving music that is out of print, nearly unheard of, or both (if that is what you're into).

This, along with the rise of RSS feeds, is what I think the next generation (in the low-gestation period that is the web's reproduction) will be. First it was info, plain and simple. Then, it was user-generated content, out of the control (almost) of "content providers". Now, it is the customization, the cataloging, and the seamless access of such content. Using an RSS reader like Google Reader, or any number of other free web apps, I can customize my flows of information directly to my desires, to conform to my day-to-day whim. In five minutes I can create my own custom newspaper, with sections for news, arts, entertainment, economy, conspiracy theory, agitprop, personal ads, or whatever else I deem interesting.
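
The "custom newspaper" idea is mechanical enough to sketch in code. Below is a toy illustration in Python (not how Google Reader actually works): it parses a sample RSS 2.0 feed with the standard library and files each item under a reader-chosen section. The feed contents and section names are invented.

```python
# Toy "custom newspaper": group RSS items by <category> into the
# sections a reader has chosen. Sample feed data is invented.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<rss version="2.0"><channel>
  <title>Example Site</title>
  <item><title>Bailout Passes</title><category>news</category></item>
  <item><title>New Mashup Drops</title><category>arts</category></item>
</channel></rss>"""

def build_sections(feed_xml, wanted_sections):
    """Group item titles by <category>, keeping only the sections we chose."""
    root = ET.fromstring(feed_xml)
    sections = {name: [] for name in wanted_sections}
    for item in root.iter("item"):
        category = item.findtext("category", default="")
        if category in sections:
            sections[category].append(item.findtext("title"))
    return sections

paper = build_sections(SAMPLE_FEED, ["news", "arts", "conspiracy theory"])
print(paper)
```

A real reader would fetch many feeds over HTTP and merge them, but the sorting-into-sections step is essentially this.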

And this (at long last I come to my point!) is what the categorization and genrefication of mp3s is all about. Doctorow, myself, and other "geeks" are not just clamping our bureaucratic sphincters closed on our informative flows, we are wielding them. With my smart playlists, and my immaculate data keeping, I can get the computer to play me what I want to hear, when I want to hear it. Of course, when I write or do other work there are certain albums I want to hear, or certain genres, or time periods. But often I can relinquish that direct choice to a semi-intelligent algorithm that will bring me "songs that I've listened to once per month but not more than five times in the past week." I hear something I didn't expect, but I didn't hear my mp3s of Ginsberg reading "A Supermarket in California". Randomness is good, randomness is bad, but algorithms are me. It's not typing, it's not banging my head against the keyboard; it's cut-and-paste, drag-and-drop. It is decidedly anti-fascist (in terms of the lesser, micro-fascisms) because all control is given to the user, who can then define his/her own level of control. Doctorow rates his tracks with the star system; I don't, because conforming my various subjective rating systems to a five-star system is too hard for me. I use the "times played" counter as my most significant metric, because I find it to be the most true. If I like a song, I've listened to it. If I don't like it, I haven't. Easy enough.
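
For the curious, the playlist rule quoted above is easy to express as a filter over play histories. This is only a sketch of the logic, with hypothetical track records carrying a list of play timestamps; it is not iTunes' actual smart-playlist engine.

```python
# Sketch of the smart-playlist rule: keep tracks played at least once
# in the past month but no more than five times in the past week.
# Track data and field names are hypothetical, not iTunes'.
from datetime import datetime, timedelta

NOW = datetime(2008, 10, 6)

def smart_playlist(tracks):
    month_ago = NOW - timedelta(days=30)
    week_ago = NOW - timedelta(days=7)
    picks = []
    for track in tracks:
        plays = track["play_times"]
        monthly = sum(1 for t in plays if t >= month_ago)
        weekly = sum(1 for t in plays if t >= week_ago)
        if monthly >= 1 and weekly <= 5:
            picks.append(track["title"])
    return picks

library = [
    {"title": "Supermarket in California",
     "play_times": [NOW - timedelta(days=10)]},
    {"title": "Overplayed Single",
     "play_times": [NOW - timedelta(days=i) for i in range(6)]},  # six plays this week
]
print(smart_playlist(library))  # the overplayed track is filtered out
```

The point of the exercise: the user writes the rule, and the rule, not a DJ or a label, decides what plays.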

However, there are problems. I first started heavy digital music cataloging after I worked as music director for my college radio station. I was interested in statistics and databasing, and this seemed like a fun little project. I was, however, distraught at various elements of iTunes. The "times played" count is not tallied until the song has played completely through, leaving half-plays uncounted, which is especially disheartening in the case of "secret tracks", where the last track of an album contains some 10-15 minutes of silence so there can be a final song right before the CD runs out of capacity. There are also limited ways to program the relationships between data types in the smart playlists, no way to view statistics as a whole except through the song lists, and no easy way to edit or transfer information outside of the linked relationship between the iTunes library and the literal song folder. I sent Apple an email detailing all this a while back. Some things have been made better (there is now a "grouping" data type in addition to "genre") but others haven't (the proprietary iTunes library language).
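
Incidentally, that "proprietary iTunes library language" is an XML property list, which Python's plistlib can read directly. Here is a minimal sketch against a tiny hand-made plist that mimics the real file's Tracks dictionary (the "Play Count" key does appear in iTunes' format, but the sample data below is invented):

```python
# The iTunes library file is an XML property list (plist); the standard
# library's plistlib parses it. This hand-made sample mimics the real
# file's structure; the track data itself is invented.
import plistlib

LIBRARY_XML = b"""<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
 "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0"><dict>
  <key>Tracks</key>
  <dict>
    <key>1001</key>
    <dict>
      <key>Name</key><string>Hidden Track</string>
      <key>Play Count</key><integer>3</integer>
    </dict>
  </dict>
</dict></plist>"""

library = plistlib.loads(LIBRARY_XML)
# Pull each track's play count into a plain dict we fully control.
counts = {t["Name"]: t.get("Play Count", 0)
          for t in library["Tracks"].values()}
print(counts)
```

Which is exactly the kind of reverse-engineering the rest of this post celebrates: once the format is readable, the statistics stop belonging to Apple.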

And this is a big problem. iTunes, through its clean, white browser, easy compatibility with Apple devices, and its iTunes store, is quickly becoming the standard. The users, even the power users, are still left with a lot of the "power" not in their hands. Winamp has similar but different problems; a dearth of choices and compatible content providers makes the hands-on control also quite hands-off, and despite improvements, I still think the software has running issues. I have faith though, because thanks to "ubergeeks", there is no technology that can be created that cannot be reverse-engineered, and no proprietary software that cannot be replaced with open-source. This is what the modern "geek" is: s/he is the cyberpunk who can unlock doors that other people merely see. Of course, this power can be used for good or for bad (like all power), but it is heartening the way that those very knowledgeable about information systems and programming languages seem as drawn toward anarchic distributions of that power as corporations and commodifiers have been drawn to control. It makes me think that perhaps all of this post-structuralist liberation stuff about overcoming the hegemony of signifiers, the commodification of desire, etc. might actually not be an idealistic dream. Perhaps Marx's dreams of contradictions leading to capitalism's downfall are not as outdated as they are outmoded. iTunes may not be the Hitler Youth yet; I think of it more like liberal democracy. There are elements of freedom, but still a very certain, very deliberate control (supposedly, the algorithm behind the generic "shuffle" is one of Apple's most closely guarded secrets). Music is a very personal experience, and while music is not exactly the same as syndicalized production, if you can't have anarchic information relations, it seems unlikely that you could have an anarchically liberalized production environment.
In many dystopic tales of authoritarian regimes, they certainly don't forget to include the wide proliferation of public-address systems that cannot be turned off. Collection of information (surveillance and reporting) is certainly kin to the distribution of information (publication and broadcasting).

Hopefully I won't freak out while trying to add 200 gigs of music to my collection. Slow addition is the key, I think, rather than a general merge. Much the same could be said for bureaucracy; a general goal to catalog a certain area of life with data can easily fail or cause more problems than it solves. But if it is a rotating view (a databasing term meaning that the axis of reference can be switched out easily for another) with good data organization, one can really own one's data. And if control of oneself rather than submission to ulterior organization isn't revolutionary, I don't know what is!
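
The "rotating view" idea can be shown in a few lines: the same records get re-indexed along whichever field you choose, without touching the underlying data. The records and field names below are made up for illustration.

```python
# A toy "rotating view": re-index the same rows by any field.
# The data never changes; only the axis of reference does.
from collections import defaultdict

records = [
    {"artist": "Minutemen", "genre": "punk", "year": 1984},
    {"artist": "Metallica", "genre": "metal", "year": 1984},
    {"artist": "Fugazi",    "genre": "punk", "year": 1990},
]

def pivot(rows, axis):
    """Group artists under whichever field is chosen as the view's axis."""
    view = defaultdict(list)
    for row in rows:
        view[row[axis]].append(row["artist"])
    return dict(view)

print(pivot(records, "genre"))
print(pivot(records, "year"))
```

Swap the axis and the catalog reorganizes itself instantly, which is the whole case for owning your data rather than renting someone else's view of it.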

7/31/2008

Welcome to the Hegeldrome?

It's been about 3 weeks since I got my iPhone, and as I forecasted in my last post, my life is already completely changed! I have designed a custom ear-piercing so that I can implant a Bluetooth headphone DIRECTLY INTO MY EAR! No, just kidding! Megan talked me out of that one. But really:

We have entered a brand new age of digital literacy!

Or, at least I have. I have discovered that since I last spent a lot of time online, there are many great sites that provide an RSS feed, eliminating all the flashy business and most of the ads, so that I can have direct access to the text. With the help of Google Reader, an RSS-managing web app, and the Google App, I can now see the updates to interesting sites in small, bite-size pieces, from which it is easy to "hyperlink" to the full article!

Interesting!

This is nothing less than a Darwinian leap forward for web content. All the dodos (YouTube, hopefully... what a horrible GUI! Not even its own iPhone App can save it!) will be hunted down and killed by Portuguese sailors. Metaphorically, of course. Those who adapt their content to the proper form will move forward, and I think that this can only be a good thing. Why? Because those who properly form their content sites are those with the better content.

I know this sounds like a strange, digital age argument for classical forms, but think of it the other way around; rather than rely upon the time-honored forms of literature, we are now going to be pushing forward to more innovative forms, and, more importantly, not just innovation, but innovation that is found to be useful and more conducive to its content.

Take YouTube, for instance, that I just bashed. It has been heralded as a synonym for Google, not only in ownership, but in the epoch-defining titles that pundits love to trumpet, hoping that they will be known as the soul who coined the term. But no one will ever be the YouTube generation. Here's why.

The site is: a very convenient means for sharing videos, that was made even more convenient when they made it possible to embed video in your web page. Your blog/social site/web page doesn't support video, or you don't know how to make it do so? Bam. Now it/you do.

The site isn't: a video Wikipedia. Enough said. The search tags are awful, as is 90% of the content. If you're looking for bullshit, you found it. If you're looking for the "Glittering C-Beams" speech from Blade Runner, then okay, maybe you found that too. But say, you want to find specific Congressional Testimony, then perhaps not. Why? Because nobody uploaded that video. The bottom line is, if you don't want to weed through the crap, you shouldn't even bother.

Now, I'm not saying that there is a web site out there that will give me something good about any subject, all the time. The internet is all about the search and link. Even a great search engine like Google only gets you about 30% of the way to a nebulous concept, the rest is up to you.

That's why I really like the RSS business. Now I can program 10, 100, or 1,000 different content providers into my little hierarchy, and then search those trusted sources for what I need, or what I might need but of which I'm not yet aware. For example, in my "Theory" folder, I get the most recent updates from The New York Review of Books, The Anomalist (a review of parascience sites), Erik Davis' Techgnosis, and the journal N+1. In the "Comics" section, I read Dinosaur Comics and The Comics Curmudgeon, and in my "Associates" category I have updates from my friends' and colleagues' blogs (see the sidebar for their links). Now I have my own little newspaper, made of the sections and contributors I choose, all updated 24-7 and prepped for my mobile device or computer viewing. Amazing!

And you can make your own, and share articles like how Dad clips out columns for you. Cool!

One of the articles I just happened across was from the New York Times Book Review, called Online, R U really reading? See how they made the "R U" just like you were writing online? Isn't that clever? The article asks the question that Luddite babyboomers would love the answer to: "Is this crazy internet fad going to make my children into porn-addicted, illiterate nerds?"

The answer, clearly, is no. The fact that people like the author of the piece forget is that there are many more people than Luddite babyboomers in the world. What I mean by this is that not only are there people in the world who don't have a healthy respect for To Kill a Mockingbird and other falsely-attributed "classics" (ME, for one), there are also people who, although they have the ability, do not read at all, or only read tripe. This was occurring long before the internet, and will occur long after. 90%+ literacy is a relatively new thing, and the fact is that most people, even in our culture, have no desire to pick up any book at all. So a teenager wants nothing more than to write fan-fiction and read online celebrity gossip... fine. S/he probably wasn't a Jr. Proust anyhow. Not that they shouldn't be prodded towards having a bigger view of the world and culture than is on YouTube, but I think we should look somewhere other than your modem for the problem.

The "classical" element of culture has always been the minority, and probably always will. And I think most who consider themselves part of that culture would like it that way. I remember being able to flash my Philosophy Department credentials in order to get in a standing-room-only lecture by Zizek in New York; imagine if I had to go up to Yankee Stadium (by then, renamed the Hegeldrome) and buy $150 scalped tickets just to watch that lecture on the Jumbotron!

But, what about the internet? Is there any goodness in it for us high and mighty, classical-culturally inclined? Of course! Did you not just hear how I made my own custom newspaper with a few cut-and-pastes in five minutes FOR FREE? The nation's newspapers are going out of business? Fuck 'em! As long as there is online advertising to be sold, there will be more than enough knowledgeable opinions online to sort through. And after all, newspapers aren't going to disappear; they are only going to lose some weight, long overdue. For those who read them, you will never beat a printed sheet. You can quote me on that. But if the NYT stopped printing its society pages, I know I wouldn't mind.

The internet is not a dreamland, however. There are censorship issues aplenty, and I'm not (only) talking about the government. But that is a theme for a further post.

7/14/2008

The Age of iQuarius

I have a terrible confession to make:

I bought an iPhone.

And not only did I get one, but I got one the second day they were out. I waited in line for two hours to get it. And the entire week prior, I was scanning the internet for rumors as to what it would be like, because I was so damn excited to get one.

"What?" you say. "Adam, who so typically eschews anything hyped, anything deemed 'the next big thing', has fallen into lockstep, and not only that, but signed a two-year contract?"

Perhaps. But what is this blog if not a long-winded apologia for everything I have ever thought, said, or done? So, in this spirit, let me tell you exactly how world-shatteringly important it was for me to get a new iPhone.

This, really, is nothing less than the future. Ten years ago, if someone told you they were getting a laptop, you would say to yourself, "Why does s/he need a laptop? S/he isn't a business person, so what use could s/he possibly get from a portable computer?"

Now, times have changed. To have a desktop is to be anchored, permanently, to a desk. That isn't a "computer" as we have come to know it; rather, it is an appliance. You don't bring your microwave out to the coffee shop any more than you bring your car to bed; an appliance or a piece of machinery fills a task in a particular place or time, and to have it outside of that context is nonsensical.

So what in the context changed? Well, you could say the internet has proliferated. Web 2.0, mp3s, MMUDs, blah blah blah. Also, WiFi has grown, so that having a "plug" for a computer to be useful isn't as necessary.

But this is the key: WiFi hasn't grown just to reduce our use of cords. It has grown to web almost all major cities on the planet because what one does with a computer has changed. Computing has become a lifestyle. Actually, it has joined, grown, and insinuated itself into many different lifestyles. Computing isn't something that you do anymore--it's something that helps you do other things.

This is more than being "jacked in" to some kind of net. That happened with the Walkman. All of a sudden, you could be linked to another "dimension" via a portable electronic device. Mp3 players just upped the ante on a technological development that was 20 years old. So you can plug in, and have your attention drawn to something other than the world around you. I see people who read books while walking, hell, while driving! Some people don't even need that, wandering out into traffic while plain lost in thought. Being distracted from the "real world" is as old as the "real world" itself.

What is different is that a dimension is unfolding that is not just alternative to the "real world", but that is intimately bound to it. In fact every day, this dimension gets a little bit more hyperlinked, "digg"ed, and posted into where the old, "real world" used to begin and end. Of course, as the good, semi-Marxist semiotician that I am, I could argue that literature does nothing different. And I do. Literature is an extension of consciousness, a material building-outwards of our psychic material into the world, like rickety scaffolding from a dock out into the ocean.

And this is where this amazing new technology fits in: it is aiding our ability to read and write. "What?" the aging pedagogue retorts. "The internet age has caused a decline in reading! The young people are too busy sucking Facebook to pick up something as important as literature!"

Well, think about how many people used to read the classics up until this century. Almost none. Except for the academics, of course. So then we entered the age of literacy, and penny-novels sold like penny-candy. Now people read crime dramas and celebrity tell-all novels, and... read internet sites about celebrities, and watch online videos of crimes!

Don't get me wrong, I will rant against bullshit attempts at cultural enrichment until the day I die. But get this: there is an iPhone app planned that will let you read any book in Project Gutenberg in eBook form, for free! Amazing! Now when I'm out and trying to think of that first line from The Confidence-Man, I don't have to run home to my meticulously kept library, I have it in my hand. Literacy, 1: ignorance, 0.

I don't mean to imply that the downloads of this app (I believe it is called Stanza, but it is not available yet) won't be hugely dwarfed by downloads of Super Monkey Ball. But this is always the way that things are. What technology is doing is changing the way that we can interact with and manipulate literature in our daily lives. Literature came out of the monasteries with Gutenberg, it was delivered from the corporations by digital printing. It was finally released from its chains of material, destructible (need I mention, burnable?) hardcopy by the internet. And now, it has been rendered universally accessible throughout the "real world" by the mobile web.

So this is the importance that I see in the iPhone. This is the beginning of a new epoch; one approaching the era of cyborgs: when our technology will seamlessly link our consciousnesses to the real world through interface. The mind-body debate will fall away, because our minds will be as fluid and disjointed from reality as our bodies are. When you can feel as much information about how your ligaments are functioning as what your current mood is, then you'll believe me. It is important to get in on the ground floor with this; we all need to be understanding the transformation as it occurs. Matrix-like transformations are only uncanny if they suddenly replace your established reality. If they grow on you, it is no different than growing older, or learning. Also, this way we can make sure we, the users, control the technology. Love your local tech-nerd, because one day he may be the only one who can unlock, or "hack", your proprietary bowel software so you can take a shit when you want.

Just one thing that I've already noticed about how mobile web will change the way we think about the world and the internet:

A big gripe about the iPhone is that it doesn't have Flash support. People want this, considering how many slick websites are all Flashed out these days. But, on the other hand, think about the last Flash site that you visited. It looked all snazzy, but wasn't it as hard as fuck to navigate to what you wanted? Especially if a particular function wasn't written into the site? You had to let all the cool clothing fly around the page just to look at the next shirt in the catalog without returning back to the orbiting "men's section". How annoying!

On the other hand, if you are accessing the web through a mobile platform, you are automatically at a disadvantage because of screen-size and data transfer speeds. Hence, no Flash support. What you do have though, are "mobile oriented sites." If you don't have mobile web, go here for an annoying video that lets you see what it looks like. It is basically bare-bones: limited pictures, relevant text only, small page span, large links. And... OMG! no ads on a Google site! Shh... don't let them know I noticed.

But this is an evolutionary trend: the useless (Flash-heavy sites that are the equivalent of 400-page glossy magazine/catalogs) will be left by the wayside for efficient, well-designed means of data transmission. Of course, data transmission will only get better, and I bet we see Flash on the iPhone in less than a few years. But still, this evolution will continue. As the web-surfing public becomes more mobile, there will still be a need for efficient, well-designed sites and apps. (And from my brief survey thus far, Google looks primed to take the lead. Their Search, News, RSS Reader, and Chat web apps are currently available in mobile form, and all are amazingly useful and well-suited for the iPhone's data and interface capacities. Still waiting on Blogger!) And of course, there will always be a place for Super Monkey Ball, just like there will be fashion catalogs until the last printing press rusts away. But those of us who are interested in information, in literature, and in improving our interface with the world will be the true winners. The fashionistas and salon goers will still be chasing their tail through the epochs, no doubt.