
3/09/2010

Interdome Content-Object Shakeup

Been thinking a lot about Tim's (of Quiet Babylon internet-fame) project, "Unlink Your Feeds".

The problem of multiple, interlinked feeds has long been a burgeoning neurosis of mine.

Let me share my problems with you!!!

I love interconnectivity. I use my Google account with great zest, trying out the new features as they add them. A lot of them are useful, but a lot of them I try to use, simply because having everything linked together makes it easy to experiment. I don't have to invent a new password and username to try X service, I just click the link. If I don't like it, I just stop using it, and never pay it another thought.

Additionally, Google's interconnectedness is a huge plus. There are plenty of portals and application uses between Gmail, Docs, Calendar, and so on, which I don't have to spell out for you. I use my iGoogle page as my widget desktop, and then access my cloud holdings from there. The interconnectedness makes it superior in function even when it lacks particular features, and all of this has made me continue using Blogger rather than going solely to Wordpress, caused me to shun Facebook, and even avoid using Twitter for a long time.

But Twitter... oh, oh Twitter.

It's just too damn easy. All those one-line witticisms I come up with during the day, with no one to share them. The ease of retweeting, rather than saying something original. The ability to take a picture of a strange car and share it with the world, all from my cell phone, while still driving with my knees.

Twitter isn't owned by Google.

So, I have this other feed going. In addition to my RSS feed, and my Reader feed. Now that I know what a wonderful world Twitter is, I get curious as to what the hold-outs who refuse to get Twitter are saying on Facebook. And I think maybe I should start cross-posting to Facebook.

Then they created Buzz, which is the Google-Twitter I always wanted, except that it still sucks, so everyone I read on Twitter is still only on Twitter. So I linked my Twitter feed to Buzz, as well as my Reader account.

And now I live in this hellish world I have helped create.

Tim is right--we cannot live like this. And while the idea of cutting the feeds free, and using each as each is best suited has a nice, Marxist "to each according to his need" sort of feel, I still have this need for interconnectedness. I just can't blog, feed Reader, Tweet, Buzz, and Facebuke (the verb?) separately and simultaneously. Hell, I'm supposed to be writing books! I still want to maintain my Internet presence, and read what I want to read, so there has to be some overlap.

How, oh, how, can I make all this damn technology work for me?

The secret to using a Google Account, in my opinion, is to be flexible, and also to be patient. Many things work, and over time, Google makes them work better, and work better together. Also, as I experiment with different tools, I change the way I use them as they come to work, and to work differently, together. Only recently did Google Docs get to a point where I was comfortable doing serious writing with it. And, since I spend less time online in "open surf mode" than I used to, I now only bookmark certain feed posts, rather than keep detailed notes about various web sites.

The same is possible with my feeds. I think I've hit upon a way to link them so that people who want to know what the deal is can still find out, without redundancy, and with most of the input automated.

This is the way it is working now:

I am looking at these feeds in terms of objects of communication. Any feed-ready information posted to the web is an object, treated as an individual, accessible, feedable piece. Different classes of objects hold different amounts and kinds of content, which may or may not overlap other objects' capacities for content.

Additionally, certain objects may have a certain shape, which allows them to fit into different data flows or lines of assembly (feeds). For example, I can set up Buzz to attach Twitter-objects to Buzz-objects in my Buzz feed. However, this shape is a shape of intensive flows (in the sense that it allows a certain change-of-state transition of the object, rather than a transitivity, like A = B = C). I'm getting all complicated with my terms, but basically, think of it like how yeast can bake into a loaf of bread, but bread cannot be dissected back into yeast. A Buzz-object cannot be translated into a Twitter-object (at least not yet).
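
To make that one-way "shape" idea concrete, here is a minimal sketch in Python. The class names and fields are hypothetical, just a toy model of the asymmetry: a Tweet always fits inside a Buzz post, but there is no general way back.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TweetObject:
    text: str  # capped at 140 characters

@dataclass
class BuzzObject:
    body: str                      # arbitrary length
    source: Optional[str] = None   # e.g. "twitter"

def buzz_from_tweet(tweet: TweetObject) -> BuzzObject:
    """A tweet always fits inside a Buzz post, so this direction is lossless."""
    return BuzzObject(body=tweet.text, source="twitter")

# The reverse function can't exist in general: a BuzzObject's body may run to
# thousands of characters, so there is no faithful way to "dissect" it back
# into a 140-character TweetObject -- the bread-into-yeast problem above.
```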

RSS is the smallest common denominator in terms of objects, more or less. Any feed, be it Twitter or Facebook or a blog, almost always has an RSS feed generated with it. There are various tools to mash RSS feeds together, Yahoo Pipes being the best-known. However, I don't want to create a new feed; I want to merge and ally the feeds of services I already use. Many of these services also have APIs, and perhaps I could work out some sort of program for managing my posts among the feeds. However, I'm not so skillful in the programming department. Also, if I use the recognized features of these different feed services already in existence, I'll probably be in better shape down the line to adjust to new features, and I'll have the services' dependability to fall back on.
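
For the curious, mashing feeds together by hand isn't much code either. Here is a rough sketch using the feedparser library; the feed URLs are placeholders for whatever feeds your own services generate, not real addresses.

```python
# A rough sketch of merging several existing RSS feeds into one chronological
# stream with the feedparser library (pip install feedparser). The URLs are
# placeholders for whatever feeds your services already generate.
import time
import feedparser

FEEDS = [
    "http://example.com/blog/rss",       # the blog
    "http://example.com/twitter/rss",    # the Twitter feed
    "http://example.com/reader/shared",  # Reader shared items
]

def merged_entries(urls):
    entries = []
    for url in urls:
        for entry in feedparser.parse(url).entries:
            entries.append(entry)
    # Newest first, falling back to the epoch when a feed omits dates.
    entries.sort(key=lambda e: e.get("published_parsed") or time.gmtime(0),
                 reverse=True)
    return entries

for entry in merged_entries(FEEDS)[:20]:
    print(entry.get("published", "?"), "-", entry.get("title", "(untitled)"))
```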

Content-wise, Twitter is the smallest common denominator. You can't find anything smaller than 140 characters. I also use Twitter most frequently, because of its small content size.

Additionally, many objects translate into Twitter-objects. Via Feedburner, I can make my blog posts echo as Tweets. Using Tweetdeck, my mobile Twitter client, I can post to Facebook and Twitter simultaneously.

Granted, it is annoying to see someone's high score in a video game echoed as a Twitter post. However, I've been posting less and less on the blog, and when I do, it is often something that started as a Tweet, but then grew too long. In this way, the Twitter-object derived from my blog's RSS feed is just a link to a memo, a "please see my memo re: X". The duplication is, in a sense, to force people to choose. Either they want my full feed, and tune in to the Twitter feed for the details, or, if they wish to be spared my constant witticisms, they can just go to the blog RSS.
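
Feedburner does this translation for me, but the underlying move is simple enough to sketch. The truncation rule below is my own guess at the logic, not Feedburner's, and the title and URL are made up.

```python
def post_to_tweet(title, url, limit=140):
    """Collapse a blog post into a 'please see my memo re: X' tweet:
    the title (truncated if necessary) plus a link back to the full post."""
    suffix = " " + url
    room = limit - len(suffix)
    if len(title) > room:
        title = title[: room - 3].rstrip() + "..."
    return title + suffix

print(post_to_tweet(
    "Interdome Content-Object Shakeup: feeds, objects, and the hell I helped create",
    "http://example.com/2010/03/content-object-shakeup.html",
))
```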

It is not yet possible to auto-transmute my Reader shared RSS to Twitter, but that's okay, because I can always post a single Reader item to Twitter via a sidebar tool. This is probably for the best, because I can share anything interesting to those who like my reading tastes, but reserve only the "mainstream" articles for my main, common denominator feed.

Buzz can, and in my case does, post all of these things together in one lump feed. It is a taste of the chaos. But, because Buzz has set out to aggregate all of these things, it is less a network in itself, with its own flavor, and more simply an aggregator. The cool part is, it shows up in my Google profile, so if anyone happened upon me via Google, they could get a good taste of what I'm into on the Internet.

Blogger, with all the new Gadgets you can add into it, is looking more and more like MySpace every day. But this flexibility is good for me. I now have two columns here. One to store these text essays, and the other to provide an easy way to examine the difference between my feeds. The Twitter feed, on the top, is the general feed. Next is the blog only, then my Brute Press stuff only, then other-source RSS only. All of it retains its original, unique feed-object character and content, while one feed, the Twitter feed, acts as the main doorway, strictly under my control.

Control, after all, is what this is all about. It's a personal micro-manager's Internet dream. The solution I just described for myself will undoubtedly not work for anyone else. This probably won't even be my own solution in six months, as I discover new tools, and start using the tools I have differently.

This is the key philosophical lesson of Internet content, content-objects, and object feeds. (You knew there was going to be a philosophical lesson, didn't you?) If you told people in the blog boom that they would be abandoning their blogs for a service where you tapped in 140 characters with your thumbs from your cell phone, they would have thought you were crazy. Same thing if you tried to sell Gutenberg a Kindle, or some other such foolish spatial-temporal technology metaphor-mashup. I'm sure whatever's next will also be just as weird as the present, and the past. Time is aggregation, after all.

The tools we have to communicate change the way we communicate. Big surprise, no? I'm sure we'll eventually figure out our feeds, so we don't have to read cross-posts anymore. Otherwise we'll drown in echoes, or suffocate in Internet Balkanization catatonia. People will sort out how to mean what they want to mean to the people they want to mean it to. Right now, to me, I'm directing my meaning to the people who follow me on Twitter. If you don't like that feed, please consider the other options to the left. If you don't like that, there is plenty of other Internet out there.

This way, both my communications and the people to whom I communicate can grow towards each other and into each other, like roots into the earth.

2/03/2010

Mailing List

Had this vision a couple days ago, and thought I'd share.

I was imagining a time when all relevant advertising is done online, either through sponsored site ads, or generated through your free services, email, etc. As they get better and better at targeting web ads, either to your particular online identifier characteristics or simply to location and demographic, the idea of targeted direct mail seems pointless.

There's nothing more transitory than a mailing address these days. Mail can be lost or damaged. People move. Mailing lists accumulate over such a long period of time that often it's the people themselves who are changing.

But people have their cell phones everywhere. Even if they don't pick up your phone call, they'll probably listen to your message right away.

And email is even better. Machines can read email. Machines tell you what is important and what you should look at, in the days of the wide, wide, atemporal web. If you can appeal to a machine, your message will definitely be read. In some time period, it might be more important to have a machine read your message than any human.

So, perhaps the mail will be abandoned. People mail letters less, but measured by the weight moving through the mail industry, the real customers poised to abandon it are the mass mailers. Postcards and newsprint are going through the mail by the ton per minute, all in the hopes that maybe you'll see it. Fruitless! It's only a matter of time before all advertising is online.

While this may sound like a death dial tone for the mail industry, it could also be its rejuvenation. As people get bills online, ads online, and news online, they will no longer guard their mailing address as the gateway to their ad-free souls. Already, I readily give out my phone number and guard my email address, just because an email address is so easily passed around, and ads are sent by script. I know someone is unlikely to call me with crap when they can email much more cheaply. They'll email me every damn week. But I can easily screen my calls, almost more easily than I can sort through spam. Maybe there will be a time when the email address is the portal to the individual, and the mailing address is as casual as a Twitter username.

This could be the rebirth of the postcard--the original technologically truncated global communication. Replacing the @ with the Airmail stamp. Hell, it only costs half an iPhone app to send a PHYSICAL PIECE OF PAPER with writing or whatever on it clear across the continent, if not the world. People could opt-in to mailing lists, where they get weird, semi-promotional musings at irregular intervals. Why? For entertainment? For social networking? For world-wide democracy? Who knows why. Maybe just to bitch about what we're watching on TV.

Sure, no one writes letters anymore. But never has anyone written quite enough letters. How many emails have you received lately that totaled over a hundred coherent, properly spelled words? But wait a minute, people write philosophical essays on blogs! (At least some of us do.) Why do they do it? Because we're crazy. Because for some reason, the human race loves to communicate with people not in the immediate area, but will not make eye contact with strangers. Because people used to make pamphlets and hand them out even though it was against the law. Because people read stuff that ends up on their doorstep.

Because paper is a really freakin' weird device. Okay, get this--no phone, no 3G, no CAMERA, yet a remarkable resolution, fully-interactive surface over the entire object, and depending on what sort of input device you use, you get completely different results. It has been said that no one writes on this thing in the same way! Although it is easily recyclable, it can be made nearly indestructible, and even if it is totally damaged, it often still works like new. You can fold it, bend it, glue it, tear it, EAT IT, and repurpose it for any number of uses, from building bicycles to spitting it at your little brother. It's really cheap, too.

So maybe, in some hypothetical time period, when everyone is communicating instantly via Device X and Service Y over Network Q, all the really hype kids are mailing each other printed picture postcards of their sex organs, sharing the new slow-net meme, or even sending the track they just recorded with their band, "crimped" to paper via their DIY groove printer.

Maybe.

Here's some stuff the Post Office could be doing to make this time period not just a time period, but a SOON--some of which I can't believe they don't do now.

- Create unique postal addresses (UPA) for each person in the country. Make it a twenty-four digit number, or some hex code. Nobody has to remember all their friends' numbers, or even their own. They can still mail to a so-called "street address", or other such mnemonic. But the mail service looks up the actual client via a reference database, not unlike a DNS database (see the sketch after this list). You can change your mnemonic via the database anytime you want (or "move", as they once called it), but as long as the mnemonic and the physical delivery address are still linked via your UPA, the mail is delivered. I may not be the only Adam Rothstein in town, or the only person ever to live at 4835 SE Sherman Street, but as long as my mnemonic handle, "Adam Master of the Interdome Rothstein" is on the envelope, I still get the mail. Sure, I chose the name when I was sixteen and it's silly, so it's only for personal mail now. Business mail goes to my business handle, "Adam Corporate Jerk Rothstein", which is connected to the same UPA, so both kinds of mail come to my home address, even though none of the senders know where I actually live. Or maybe my UPA listing filters mail from certain senders to certain physical addresses. The database handles all of this, and all I have to do is update my record. Did you know that when you fill out a change of address form at the Post Office, you get a postcard to both addresses to confirm the change? It might as well say, "Please click on this link to confirm, and do not reply to this address as this email was auto-generated." They just need to take it a step further, and give you an IP address.

- It gets even easier now that you can print a custom barcode on any piece of mail with your home printer, using the online UPA database, very similar to a DNS whois. In a barcode there's nothing to misspell. The barcode is, naturally, instantly readable to anyone with a camera phone.

- Stop home delivery. Or, charge for it on the receiving end. Businesses pay for mail delivery, of course. Everyone else can do their correspondence by email... via the free Webmail client the USPS now provides, if they like. People with disabilities and the elderly get free mail delivery. Mail can always be picked up at the post office, with a private 24-hour box also costing money, but not as much as home delivery. Picking up mail at the window of your local branch is free. Post Office boxes and branch storage have an expiration, of course. After a certain physical amount of mail taking up space, you either have to pay for an upgrade, or it gets "deleted" (recycled into USPS mailer material). Just like your free email account in the distant, limited digital storage past. All the more reason to do important business by email now, because our email inboxes hold just about a terabyte, keeping personal records of bills and other annoying number series for our entire lives, without ever having a potentially compromising personal mailing to shred.

- The frequency of delivery is increased. Once you are paying for home delivery, you can avail yourself of all the different service plan options. You can pay per delivery, perhaps once a week, either prepaid, or with a credit card on net terms. Or pay for unlimited service, up to three times a day. As the quantity of bullshit mail decreases, the speed of service should increase. Especially if you pay for it.
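
As promised in the first bullet, here is a toy sketch of the DNS-like lookup. Everything in it (the UPA format, the record fields, the routing rule) is hypothetical, just to show how thin that database layer could be.

```python
# A toy sketch of the DNS-like lookup described in the first bullet above.
# The UPA format, record fields, and routing rule are all hypothetical.

UPA_REGISTRY = {
    # mnemonic handle                        -> unique postal address (UPA)
    "Adam Master of the Interdome Rothstein": "3f9c-0a12-77de-41b0-9c55-e2a1",
    "Adam Corporate Jerk Rothstein":          "3f9c-0a12-77de-41b0-9c55-e2a1",
}

DELIVERY_POINTS = {
    # UPA -> current physical delivery address; updated whenever you "move"
    "3f9c-0a12-77de-41b0-9c55-e2a1": "4835 SE Sherman St., Portland, OR 97215",
}

def route(mnemonic_on_envelope: str) -> str:
    """Resolve whatever name the sender wrote into a deliverable address."""
    upa = UPA_REGISTRY[mnemonic_on_envelope]
    return DELIVERY_POINTS[upa]

print(route("Adam Corporate Jerk Rothstein"))
```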

Each of those bullet points contains numerous changes, but all have the same general inclination: the USPS should start re-envisioning itself as a Mail Service Provider, in overlapping silhouette of Internet Service Providers. There are clients with certain but varying hosting needs, physical networks of transmission, and of course, the content to be provided. Landline ISPs provide data packets over a network that has changed a certain amount, but also stayed roughly the same for a while. There is datahosting, packet transmission, and the sale of services. Mobile ISPs are new to the game, and are starting to pick up the product end of things. Comcast will skin you to rent you a cable modem, but AT&T partnered with Apple on the iPhone, which was probably their smartest and most customer-friendly move ever. They still have a way to go, obviously, and many more milestones to pass before they are less of a "telecommunications company" and more of a "network access company". But they're starting to get the idea. The USPS has much further to go. The idea that they are delivering "mail to addresses" should go the way of the AOL portal and national news magazines. MSPs are delivering content to customers, and should totally redesign their service and distribution network around this. If my location-aware cell phone can tell I'm at a cafe in New York this week, why the hell is my mail going to my house? They need to make some network choices here. Maybe redesign a standard uni-mailer, into which all correspondence must fit, and which is addressed and sold only at USPS kiosks, auto-printed with unreadable barcodes that will never be touched by human hands. If it improves service, people might complain, but they'll buy in. Apple knows it. Hell, people still fly on airlines, so they'll put up with whatever is necessary to get from here to there.

But I'm also going to do my part to further this transition to a redesigned "Post-Net" (great name, no?). I'm starting a mailing list, after the old style, when it was the only way to swap pornography, or read the latest conspiracy theories, or to get the good music, books, and comics. Not for any of those things necessarily, but to send and receive. All you have to do is send me your old fashioned, obsolete format mailing address. If it's your friend's address or your work address, that's fine. If it's a PO Box, that's even better. Just somewhere where you want to receive unsolicited mailings on a basis as irregular as the mail. Email me your address, and start checking your mail box again. Really. Do it.

Or mail me!

4835 SE Sherman St.
Portland, OR 97215

Through codec and bitrate, and gloom of social media, nothing will stop mail from pointedly plodding from one place to somewhere else. Except for no delivery on Sundays.

1/16/2010

Talking with machines

On the Twitter microsyntax front, Project EPIC is working out an emergency syntax for Haiti rescue communication efforts.

It's interesting to me because the syntax expression itself is not new, as it just uses hashtags. However, it utilizes hashtags for pre-set message components, as befitting important communication relay elements.

From the link above:

Our team and collaborators are proposing a Tweet-friendly hashtag-based syntax to help direct Twitter communications for more efficient data extraction for those communicating about the Haiti earthquake disaster. Use only requires modifications of Tweet messages to make information pieces that refer to #location, #status, #needs, #damage and several other elements of emergency communications more machine parsable.

EXAMPLE1: #haiti #imok #name John Doe #loc Mirebalais Shelter #status minor injuries

EXAMPLE2: #haiti #need #transport #loc Jacmel #num 10 #info medical volunteers looking for big boat to transport to PAP

EXAMPLE3: #haiti #need #translator #contact @pierrecote

EXAMPLE4: #haiti #ruok #name Camelia Siquineau #loc Hotel Montana

EXAMPLE5: #haiti #ruok #name Raymonde Lafrotune #loc Delmas 3, Rue Menelas #1

EXAMPLE6: #haiti #offering #volunteers #translators #loc Florida #contact @FranceGlobal


PRIMARY TAG
#need
#offering
#imok
#ruok
#damage
#injured
#road
.....

SECONDARY TAG
Need/Offering Descriptor Tags
#food
#water
#fuel
#medical or #med
#shelter
#transport
#volunteers... can shorten to #vols
#translator
#status
#financial or #money
#information or #info
#supplies [list specific supplies needed]
.....

Data tags
#name [name]
#loc [location]
#num [amount or capacity]
#contact [email, phone, link, other]
#photo [link to photo]
#source [source of info]
#status [status]
.....

End Tag
#info [other information]

Overall order is not as important as tag-descriptor connection.


In a time of crisis, it makes sense to not quibble about whether slashes, backslashes, other symbols, or certain pre-set abbreviations make the most sense. And so, they've actually put together something quite sensible--they've basically converted a Twitter post into a DB data record, with hashtag delimitation. Any firehose-sniffing program should be able to pick out and synthesize the relevant information from this list of tags.

They don't have any parsing programs showcased on the site, but they are live, tweeting with this syntax (@epiccolorado), and it shouldn't be too hard (for someone other than me) to build one pretty quickly.
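
To make the point concrete, here is a sketch of what such a parser might look like. This is my own guess at the logic, not Project EPIC's code; the tag names come straight from the list quoted above.

```python
import re

# Tags that take a free-text value running up to the next tag.
DATA_TAGS = {"name", "loc", "num", "contact", "photo", "source", "status", "info"}

def parse_epic_tweet(text):
    """Split a tweet into a flat record keyed by hashtag.
    Bare tags (#haiti, #need, #transport...) become True flags;
    data tags (#name, #loc, #num...) capture the text that follows them."""
    record = {}
    # Each match is a tag plus everything up to the next '#' or end of string.
    for tag, value in re.findall(r"#(\w+)\s*([^#]*)", text):
        tag, value = tag.lower(), value.strip()
        if tag in DATA_TAGS and value:
            record[tag] = value
        else:
            record[tag] = True
    return record

print(parse_epic_tweet(
    "#haiti #need #transport #loc Jacmel #num 10 "
    "#info medical volunteers looking for big boat to transport to PAP"
))
# Prints a dict with True flags for haiti/need/transport,
# plus loc='Jacmel', num='10', and the free text under 'info'.
```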

There are some things I really like about this.

- It's simple. It takes a convention people already know, and re-uses it.

- It's basically making a simple little code book. One could print out the list of commonly-used tags on an index card, and in only a few seconds put together a message readable to the network of people looking for this format.

- It is identifying a basic sentence structure, on a level up from "twitter syntax". This is new for Twitter semiotics. If you look at a commonly used syntax, such as the re-tweet, you will see a variety of different amalgamations of the syntax. Some put the "RT" first, or last, or some are now using "via" rather than "RT". Some RT only the last person in the RT chain, some put the first, or some put all. None of this matters, of course, because the message is still getting across. But with this EPIC format, the order of the tags matters, and yet is still a bit flexible. It leads with the identifier, "#haiti", and then continues in a line of primary, secondary, data, and then additional information tags to shape the message in an understandable way. The simplest way of forming a regular sentence is with [Subject] -> [Verb]. Then, you can expand that to [Subject] -> [Verb] -> [Object]. And then, [Subject] -> [Verb] -> [Object] -> [Adjective]. And then, [Subject] -> [Verb] -> [Adverb] -> [Object] -> [Adjective]. You get the idea. The position changes depending on what language you use, but our system of language is basically a database, assigning values to these different data types in a particular record, and then parsing the record in conjunction with other records. This EPIC format is doing that with the basic information types for crucial rescue information.

- It's readable by humans as well as machines. Anyone looking at a tweet in this format could tell what it means. In this way, it fits into the main trending flow of #haiti tweets, but can also be pulled out from the noise. It is a very ingenious, although simple, middle ground between an incomprehensible DB record and a common-language sentence. This is where I see the microsyntax on Twitter heading... some common, comprehensible ground between XML script and common language punctuation. It is an understandable written language, but syntaxed to be capable of being metadata.

It will be interesting to see how well this works in Haiti, but thinking ahead to the next disaster, they should print up laminated index cards with these tags on them, and syntax examples on the other side. They can air-drop them, or distribute them with Twitterized cell phones. The beauty is that anyone can contribute to the information collection, using whatever means happens to work: cell phone, SMS, Internet, Twitter app, or even potentially voice. Add geotagging to the metadata, and you are getting near-instant, localized, specifically formatted information from the ground. It should be pretty easy to go back and rank the tweets coming in, as DB reports are verified, bumping up users who provide good information. Any responder on the ground could easily be linked into the overall real-time awareness DB, without having to relay it over phone and radio, or wait for confirmed contact. The report is made, and then the responder can go about his or her work.

Just wait until this sort of thing goes audible. Ten codes, the codes police and dispatch use over the radio, are currently being phased out all across the country because they are not unified, and sometimes cause confusion in hectic situations. But these are merely translations. One ten code stands for something else. What if they were syntactical codes, to let a computer or human listening know what sort of information was being read over the air? What if we started using a vocal "click" to denote a hashtag, so the next spoken word would be known as an indexable primary or secondary tag, giving additional meaning to the data spoken next? It would be "plain speech", but plain speech imbued with metadata for easy compilation into DB-style records. With voice-to-text capture on the radio feed, there could be one open channel, with everyone speaking at once. The computer would capture the speech, complete with hashtags, and publish it to a readable timeline on the screen. The radio metadata (the unit's number is already included silently in the broadcast in current technology) would allow dispatch or particular units to follow the timeline of only those units involved in a particular response. You could listen to the open feed for instant vocal communication, or you could filter the feed to particular data tags.
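
This is pure speculation, but the filtering step at the end is easy to picture. A small sketch, assuming each transmission has already been voice-captured and parsed into a tagged record like the EPIC ones above, with the unit number carried along as metadata (all the records below are invented):

```python
# Speculative sketch: a dispatcher's filtered view of an open radio timeline,
# assuming transmissions are already parsed into tagged records.
timeline = [
    {"unit": "412", "need": True, "loc": "5th and Main", "info": "requesting medical"},
    {"unit": "218", "road": True, "status": "blocked", "loc": "Burnside Bridge"},
    {"unit": "412", "status": "on scene"},
]

def filter_timeline(records, units=None, tag=None):
    """Show only the transmissions from certain units, or carrying a certain tag."""
    for rec in records:
        if units and rec.get("unit") not in units:
            continue
        if tag and tag not in rec:
            continue
        yield rec

for rec in filter_timeline(timeline, units={"412"}):
    print(rec)
```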

Language has a great potential for cyborgization. Cybernetics is an extension of our logical thought processes, so there is no reason why our thought processes can't extend our computerized tools in turn, by interfacing our age-old communication techniques with our new technology. Speak the future.

3/18/2009

Of 140-character-matology

So, yeah: there was this thing in Texas, and well:

TWITTER

That pretty much covers it. Maybe it's alive, maybe it's dead; but you can be effin' sure that it's Twitter.

I've been conducting my own Twitter experiment for a little over four months now and I've learned a lot of things (and yes, SOMEWHERE on this page is the hidden link that will let you share in the joy of my micro-blogged arteries).

The foremost point: like just about everyone else who has tried it, I like it. After getting a good group of people to subscribe to, the feed is easy and addictive to read. I'm not sure if folks enjoy my posting or not (I have at least 15 or so non-spam followers) but I imagine the readership is somewhat like this blog--a bit random, perhaps sparse, and yet strangely, half-dedicated at the same time. Posting is fun, easy, and provides opportunity for a bit of silly, 140-character word play.

I already expounded upon some of the semiotic ramifications of a 140-character communication medium, so I won't repeat that here. Also, just about anyone who knows anything about the Internet, and who likes to tell you what they know about the Internet, will tell you how important Twitter is to the Internet. So, I think I'll leave that one well enough alone too.

But here's something you might not have heard: there are many, many people who have no idea what Twitter is.

These are people who blog. These are people who can find the best gay club in a new city with Google and a quick twitch of the wrist (well, at least one of them can). And these are people in the publishing industry. At a dinner party a few weeks ago, I off-handedly mentioned Twitter, and this group of six mid-twenty-year-olds did not know what I was talking about. Half said, "what's a Twitter?" and the other half said, "so what is that, anyway?"

I have no reason to think poorly of these people; after all, at the time I was only three months ahead of them in the Twitter-verse. But this is something I believe many 'Net-Theorists are not fully recognizing: the revolution in communication going on here is happening only among a vanguard.

So, one might argue, has every new chapter in this new internet saga. Throughout the last twenty years, the Internet has met the different members of the human race one by one, and in different times and places. Some have yet to make its acquaintance. And every single one came into the fold at a different stage in the game.

I'm not a technology guru. I understand the technology behind the Internet and its means of transmission and its interfaces only marginally better than the average person, and certainly not well enough to operate any of them as anything more than a user (though I am currently enrolled in the Computer Programming course at Wikiuniversity!)

But what I do understand, and actually have some professional and theoretical training in regards to, is methods of communication, language, and human psychology. I am seeing something occur here--and although you may have heard this about 100 times in the past year (and 20 of them since SXSW), I believe it is true: this is something that has never happened before.

One can talk about the digital revolution through any number of metaphors relating to other advances in information technology in the past several millennia. I'm not going to. I'd rather talk about what we are doing now.

And this is it: we are conflating our language with our method of transmission. In other words (duck your head while the theory comes past), we have seized our means of communication by creating a productive unity between the product (meaning/symbol) and the production (expression/transmission).

Or rather, we are seizing it. Not all have done so, and those who have begun are not nearly finished, even if there was such a final state.

Let me say it over again, but simpler, because I really think this is an important point.

The method of digital production (the networked computers and communication lines we commonly refer to as the Internet) has dropped the cost of producing, distributing, and consuming information to practically nothing. Furthermore, along with the spread of end-user information such as news and media, comes the spread of the knowledge of how to manipulate, and further seize and shape the technology of digital production. I can learn programming on the Internet for free (or at least in theory--as I have yet to actually do so). A real example: a child of nine in Southeast Asia can become a certified Help Desk technician. Another example: a person using the Internet can invent, create, and promote their own communication client, with little-to-no actual investment of materials used, other than the computer they already had.

Open-source, APIs, and so forth. But here is the other interesting part: through word of mouth, networking, and good old trial and error, one Internet user's programming project becomes an Internet start-up. This start-up becomes a business. This business project changes the way the entire world communicates. One does not change the relations of production by oneself, but does so in concert with a dedicated and involved group: a network. Crazy, no?

None of this is news, really. But it's still pretty awe inspiring when you step back and look at it. Back in the old days, inventors used to die broke, sick, and alone. They were persecuted by the Church, and the children in the village would hurl fruit at them when they would step outside to rake the gravel outside their hut. These days all you have to do is read a lot, practice, and then you sell your craft project to VC, move to California, and write your own biography. Crazy!

Here's what's really crazy: the pace has sped up for the users as well. They are part of the network; they are both the means of production as well as party to the relations of production, and they work uncontrolled by any boss. Therefore, mass adoption is only possible when the masses themselves undergo a revolution in production. But even though a critical mass may be joining a particular revolution in production, they are leaving people behind. And this is totally okay, but strange, in the face of the pace of technology. Most people don't even know how Twitter works, let alone why it's good. But now, simply knowing what Twitter is, and updating your status isn't enough. You have to use a good client, with search capacities, TwitPic, and a URL-shrinker all combined. You have to Trend. And you can't even just Trend! This past weekend one had to construct crazy Boolean search constraints just to find out where the tacos were at. Just wait until you have to write your own App on the fly at an event to handle the ever-evolving and expanding data stream pouring out of the API.

Try explaining #SXSW to someone who doesn't use Twitter, but only uses Facebook. It barely makes sense to someone who hasn't shifted their means of production along with the revolution in the network. It's not enough to know how to use the Internet--you have to own the interface. Communication is not a matter of being able to read, or even of operating a card catalog. Ever try to explain how to Google something to a person unfamiliar with a search engine? You have to be a skilled operator of the means of communication to be able to communicate. Pretty soon you might have to have a Library Science degree just to figure out what you're missing.

This is not simply literacy. This is an ability to critically think on the fly--to creatively craft information and symbols, and interpret, in a constant productive and consumptive process. The old Rhetoric and Speech classes of yesteryear will be replaced with Reg. Ex. and Javascript.

I'm looking at my browser window right now. Firefox. I have five tabs open. One is my iGoogle page, with news, RSS Reader, Twitter, and Email widgets all included. I'm glad I can fit them all into one page. I have my blog edit screen open, two half-read articles, and another blog post open for reference. I have a full bookmark bar at the top, but I hardly use these anymore. I have Javascripts for Google Notebook to handle my evolving collection of links and notes. I have a Javascript button to subscribe to an RSS feed, and a button to post a page to my Shared Item page (part of Google Reader). I also have a link to my off-line TiddlyWiki, where I am compiling notes for a writing project. I play mp3s through FoxyTunes, and I have just installed Ubiquity, so I can jump to certain tasks like Twitter or Email with a hotkey and a typed command.

And with this personalized Interface, I am barely keeping up! Several of my RSS feeds are devoted to sources to keep me abreast of the new formats and apps reaching the market. I have to know what's going on, just to maintain the struggle to know what's going on!

This concept, the ongoing technological revolution in means and relations of production, is whispering to me. It is whispering to me about a future for the Internet (cough and head for the exits, it's prophet-brain-dump time!):

[starry-eyed, swirly, white-out fantasy bells...]

Twitter will meet Ubiquity or another semantic web program half-way. Using some bastardization of Javascript, mutated by downloaded user-customized commands and uniquely-hacked libraries, Internet users will message, email, search, read, and archive data using these complex 140-character commands. Most of the text sent back and forth between users will not resemble written speech--it will be a hybrid of scripts and links, slangs and references. There will be pockets of written text at fixed URLs--the remnants of today's blogs and wikis. Everything will be accessible with XML or some derivative (though probably not fully "semanticized"), so it can be searched, compiled, subscribed, shared, and archived via the TwitScripts. "Privacy", in terms of personal data, will break down as a concept. You will not have a password--you will have a registered device. You will log on to a single user-name (perhaps a separate one for business) and launch your TwitScripts from an open and readable timeline. The data must flow...

[...trauma-inducing crash back to reality.]

But this is only one possible future, for certain users who are adept and interested enough to learn TwitScript. For those who are not interested in this particular technological revolution, other interfaces will become popular. For instance, there is also the MySpace future.

MySpace is AOL. There--I said it. It is a portal to media content. To be fair, the user-generated elements of MySpace, YouTube, and Facebook are much more interesting (due to their variously-shifting standards of user-control) than AOL's portal ever was. But the goal is the same--a controlled (and heavily advertised) environment in which users can log on and roam about, never having to learn anything new or create an interface from scratch. Profiles, skins, centralized app access--it's the trading card game of the web.

But this appeals to certain people--especially young people. It is easy, and the community is ready-made. Sign on, and join your school first. Then, form other groups from that. This will be the media-tized Internet of the future. Note: not the future of Internet media, but the future of the Internet as a media channel. Imagine--Microsoft buys MySpace or Facebook, makes a few connections and software upgrades, and all of a sudden the Xbox is the WebTV kids actually want. For people who are not interested in learning the means of distribution, these media portals are the perfect product.

And then there is the regular old Internet as well. I think of a guy I know at work--he loves Craigslist. He struggles with Mapquest, but is on Craigslist almost every day, looking for deals. He has that one interface down, but doesn't have a need to learn anything else. There are people who are the same with eBay, or their favorite news/discussion sites. Or even just Google--find my movie times, and I'm out. The capacity of the user to learn the means of communication dictates how far he or she will choose to go, and more importantly, how far the technology will go with them.

My TwitScript conception of the future (for the record, this term is, along with the rest of this blog, under CC license as of now [is he joking? or serious?]) is the direction in which I imagine those pushing the limits of the technology will take it. To deal with the increasing amount of information available in protean distribution formats, we will need to become literate in the mechanics of information distribution--this includes text mark-up languages, browser mechanics, and the new consumer info "packet": the Tweet (more about the power of the 140-character set in that earlier post I alluded to). The line between the cutting-edge users and the programmers will diminish, just because of the rapidity of the pace. The designers will be the beta testers, the early adopters, and the constituency. They will be the only ones that matter, from the Twitter-verse's perspective. Think about it: when you are Twittering, what else really matters? You are communicating with other users, for other users.

Here is a post by Tim O'Reilly (@timoreilly) from a day or so ago:

RT @elisabethrobson: Interesting stats from the iPhone 3.0 preview yesterday: (via @iphoneschool) http://cli.gs/nL1yJ5 #iphone

Look how far we are already! Only 85 characters of this are actually readable text! The rest are hyperlinks, short-hand, tags, and citations. And in fact, because it is a Re-Tweet, none of it is his original words. This is an index; it points in a direction through the network, distributing information even though, substantially, it itself says almost nothing. And it doesn't need to, because by utilizing the method of internet indication (the hyperlink) Tim is giving us more information than 140 characters ever could. He is linking us into the network, and insinuating us into a pattern of unlimited knowledge in a still reasonable and understandable gesture.

I promised that I actually had some theory for you regarding semiotics and psychology, or other such nonsense. If you are not interested in such things, feel free to skip out now, taking the conclusion: Twitter is the beginning of a revolution in the means of communication, to a conflation of content and expression. But, for those who read my Marx-between-the-lines, and desire more, here is a deliciously (or perhaps annoyingly, depending on your preferences) difficult description:

The signifier, as the point of expression for meaning (the signified), has been receiving an altogether privileged place in our understanding of language. Whether it be the Holy Word, the unattainable signifier of Lacan, or even the juridico-discursive power of the "I" point in modern testimony, the moment and form of expression (I think therefore...) is seen to be the cutting-edge of the language tool.

While this signifier is hardly diminishing in its psychological position (consciousness demands a position for the "I"), our evolving technology of expression is reducing its sacred position over the signifier/signified duality. The psyche, as a technological realm of semiotic expression, is not in itself shifting; but in our current relations of production, in which our minds are interfacing with digital networks, we are ironically becoming "unwired" from our binary (the basic two digits) understanding of our own communication. We are not just signifying now, we are manipulating the way we signify as part of the signification.

The role of the author is shifting. The power of attribution to a fixed, historical "I" is less important than the information to be understood. Understanding, and hence, expression, is less reliant on the signifier as a perfect concept of content production. Misspellings are common, and ignored. If anyone is asked, of course the signifier still plays a role, but as the signifier grows in scope to encompass not only the privileged identity between word and speaker, but also between a choice in language, distribution network, semantics, time, and distribution, the signifier is becoming more meaningful as a material object. We are bringing the signifier back down to earth, muddying it with the effluviance of the signified phenomena, and enacting a phenomenological semiotic, rather than a formal (Platonic, Hegelian, etc) semiotic. To appropriate Merleau-Ponty, our words are again made flesh. To appropriate Marx, our commodities are returned to the realm of production and use-value. To appropriate Freud, our fetishes are no longer abstracted neuroses of our unconscious investments, but properly sublimated transferences: well-oiled psychic machinery.

When we type hypertext, we are not only indicating, we are expressing the act of indication. This is not only "something to see", but "something I want you to see". Please click on this. The signifier now has supplementary value as a signified. The signified and signifier meet again, not through a reduction of the difference, but by a meeting of the two aspects in a properly material plane--abstraction is conquered (aufhebung alert!) by a reevaluation and redeployment of the means of this semiotic production: scripting is a proletarian consciousness of digital writing.

Now: the sense in which the two terms "signifier" and "signified" are used shifts between every philosopher's iteration, and even within each author's use (somebody should be able to say something significant about that). To draw out this complicated dynamic and really treat the two terms fairly, I could read you the entirety of Of Grammatology, but I think we would all be relieved if I did not. (And certainly Derrida's own confusing play with the shifting meaning of these two words is indicative of his own philosophy. I imagine one could agree with that statement whether one appreciates him or not!)

However, I will simply quote Derrida a bit, to close my thoughts for now. With the caveat, of course, that my use of the two terms here is not exactly the same as his--but I believe the point holds true for both of us.

If we wish to push our ability to write and express meaning beyond our current means, we must seek to unravel, and perhaps "de-construct", the nature of our current system. I cringe while saying so, but we must "hack" our language. Perhaps "script" is a better verb, not sounding quite as cliche, and closer to my idea of what we should actually be doing: using and adapting pieces of our language as a material code for better interfacing with our language. A book is an excellent material technology, but we cannot use a book as our model of communication after considering our new, and vastly more "scripted", material technologies of signification:

"The good writing has therefore always been comprehended. [...] Comprehended, therefore, within a totality, and enveloped in a volume or a book. The idea of the book is the idea of a totality, finite or infinite, or the signifier; this totality of the signifier cannot be a totality unless a totality constituted by the [material] signified preexists it, supervises its inscriptions and its signs, and is independent of it in its ideality. The idea of the book, which always refers to a natural totality, is profoundly alien to the sense of writing. It is the encyclopedic protection of theology and of logocentrism against the disruption of writing, agains its aphoristic energy, and, as I shall specify later, against difference in general. If I distinguish this [un-totalistic, scriptable, material] text from the book, I shall say that the destruction of the book, as it is now under way in all domains, denudes the surface of the text. That necessary violence responds to a violence that was no less necessary."

-Derrida, Of Grammatology, "The Signifier and Truth"