Saturday, June 9, 2012

Lo!


            I’m not exactly a Fortean, but I am fascinated by Charles Fort’s work. Amid the unfolding of world events, he had a knack for finding the most curious phenomena. Fort embraced a philosophy of “damned” evidence: anomalous events that no hard science could properly explain, or explain away. Frogs falling from the sky, rains of blood and black ink, mysterious indentations and glyphs left in cliff faces, weird sounds heard over enormous geographical spaces. The list goes on and on. His writing is a kind of manic fever dream in which shadowy things become marginally lucid and coherent.
            I couldn’t help but think of Fort when I came across an article last week in which scientists reported strange radiation signatures recorded in tree rings from about 1,200 years ago. Somewhere between AD 774 and AD 775, an enormous burst of radiation seeded the earth’s atmosphere with a surge of carbon-14 (14C). So far, the findings seem to rule out an ordinary solar flare or a supernova. What on gods’ green earth could drive atmospheric 14C upward at roughly twenty times its normal rate of variation?
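For a rough sense of the scale involved, here is a back-of-the-envelope sketch in Python. The two input figures (a single-year jump of roughly 12 per mil, against ordinary year-to-year variation well under 1 per mil) are my recollection of how the tree-ring finding was reported, not numbers from this post, so treat them as assumptions.

```python
# Back-of-the-envelope scale check for the AD 774-775 carbon-14 spike.
# Assumed figures (my recollection of the reporting, not from this post):
reported_jump_permil = 12.0      # single-year rise in atmospheric 14C (about 1.2%)
typical_variation_permil = 0.6   # ordinary year-to-year wiggle from solar modulation

ratio = reported_jump_permil / typical_variation_permil
print(f"Spike vs. ordinary annual variation: ~{ratio:.0f}x")                          # ~20x
print(f"Spike as a share of total atmospheric 14C: {reported_jump_permil / 10:.1f}%")  # ~1.2%
```

The point of the arithmetic is simply that the jump is tiny as a fraction of all the 14C in the air, but enormous compared with how much that figure normally moves in a year.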
            Modern radio and x-ray telescopes give us the ability to look through time and into space for remnants of massive astronomic activity. However, so far, nothing is visible from that epoch to indicate a supernova or an enormous influx of γ-rays or protons. Again, we are stuck with the questions: What? How? Where? Science has no answer and would most likely dismiss any supra-normal hypothesis out of hand. Therein lies a problem that makes me ambivalent about science and the pompous poise of our age.

Science is made weak by the same conceit that makes religion feeble. The self-assuredness to explain something away theoretically, without bringing factual evidence to the table, is the self-same arrogance religious institutions use to control and manipulate. “Hard” science often contributes its own heresies to the collective reality by making presumptions from past experiments and data that may once have held true but don’t adequately account for future models of understanding. Yet it seems that every other day new information and data are presented that have us re-calculating, re-hypothesizing, and re-thinking our reality tunnel.
No one is bringing answers or possibilities to the table. This kind of discovery makes me wish for a man like Charles Fort to step forward with a wild talent for paranormal pronouncement. What fun lies in this reality if we take all the mystery out of it?

Sunday, June 3, 2012

Cortexting Neurotifications


            Remember that rad (yes, I still use this word) house party you went to last Friday? Remember the totally suave status update you posted while the cute girl you were chatting up went to the bathroom? And those pictures! Who knew you could embarrass yourself so much that a complete stranger would catch it on film and tag your drunken face on Facebook.
Now imagine these feelings, physical states, and emotions presented in ways text characters and words could never express. What if capturing the psycho-emotive-physiological state of an experience were as easy as uploading a packet of information for another person to receive and re-experience?
            I find myself discussing brain-computer interfaces (BCIs), EEGs, eye tracking, and similar technologies quite frequently. These technologies will unequivocally reshape our understanding and techniques of computing and data interaction over the course of the next decade. Much like in the film Strange Days, I expect a day in the not-too-distant future when a “status” update will be a complex data packet capable of recreating emotion, physical sensation, and psychological state through some readily accessible and affordable hardware medium. Perhaps we should call these experiential neurobiological updates neurotifications or cortexting.
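To make the idea a little more concrete, here is a purely hypothetical sketch of what such a packet might look like as a data structure. Every field name is invented for illustration; no real BCI, EEG, or social-platform format is implied.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

# Purely hypothetical sketch of a "neurotification" payload.
# Every field name is invented for illustration; no real BCI/EEG format is implied.
@dataclass
class Neurotification:
    timestamp: float                                   # when the experience was captured (Unix time)
    eeg_bands: Dict[str, float] = field(default_factory=dict)  # e.g. relative alpha/beta/theta power
    heart_rate_bpm: int = 0                            # physiological state
    skin_conductance: float = 0.0                      # rough proxy for emotional arousal
    gaze_trace: List[Tuple[float, float]] = field(default_factory=list)  # eye-tracking samples
    caption: str = ""                                  # the old-fashioned 140-character fallback

# Example "status update": mostly sensor data, with text demoted to a caption.
update = Neurotification(
    timestamp=1339200000.0,
    eeg_bands={"alpha": 0.42, "beta": 0.71, "theta": 0.18},
    heart_rate_bpm=96,
    skin_conductance=2.3,
    gaze_trace=[(0.31, 0.55), (0.33, 0.54)],
    caption="that rad house party",
)
print(update.caption, update.heart_rate_bpm)
```

The particular fields don’t matter; what matters is the shift in proportion, with the text becoming a caption on the sensor data rather than the other way around.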
            Of course, this brings a whole host of tricky and problematic questions to the table. Privacy, security, and accessibility all linger like dark shadows in the corner booth at a smoky bar when the adytum of the soul is exposed for the world to see and dissect. The discussions regarding privacy and security on social network platforms like Facebook (e.g., persona/profile hijacking for affiliate advertising, facial-recognition and tagging automation scripts, “phantom” accounts for agitation, etc.) present a whole generation with ideas more foreign to the human mind than any preceding generation could have imagined. These concepts may not dominate our thoughts on a daily basis, but it is hard to deny they’ve sown deep seeds and may bring forth a new dynamic in the eons-old vacillation between knowledge of self and knowledge of other.
            I can recall many times sitting there, staring at the blank update box, wanting to say something that might stir the same overwhelming emotion or feeling that left me at a loss for words in the first place. I am driven by some sick hope that if I could select just the right words, just the right rhythm, the right moment, that little box could bare my essence to others so they might understand. More often than not, the words and ideas I choose to communicate those candid thoughts come out sounding solipsistic. Quick as I am to want to share the feeling, I usually wipe the slate clean and go back to some other distraction or sublimation of emotion.
            It strikes me that the possibility of tapping into the neural network of the human body to advance the trend begun by social networking will have many folks up in arms. Oddly enough, I find myself wanting to embrace and apply such a technology. However off-kilter it may seem to the reader, a part of me sees such a wild technology as finally achieving what social networking ought to have been designed for in the first place: to put us intimately in touch with one another; to increase cross-cultural and pan-global understanding and empathy; and to dissolve the geo-political borders which parochial, xenophobic governments have put in place to barricade us from embracing the next phase of human evolution.

Friday, June 1, 2012

Truncating Communication via Space Travel


            “Splashdown successful!! Sending fast boat to lat/long provided by P3 tracking planes #Dragon.” – @elonmusk
            This Tweet changes many things. It changes the way we look at the future of space travel and exploration. It changes our perspective on what is possible once the individual endeavors to excel where the governing institution resigns itself to irrelevance and extinction in matters of exploration and discovery. Most importantly: it changes everything about the way we communicate.
            If you pay attention to technology and science trends (and how can you not in this day and age?) you probably know Space Exploration Technologies (SpaceX) put their Dragon capsule into orbit atop their Falcon 9 rocket last week. It delivered its payload to the International Space Station, then completed re-entry yesterday, splashing down in the Pacific Ocean not too far off the coast of California.
            You might find yourself asking: “How does the future of private space exploration change the way we communicate?” Well, let me explain. It isn’t really the space exploration; it’s the Tweet. This transformative platform of communication isn’t exactly new. Twitter, SMS messaging, and Facebook have contributed significantly to this new paradigm. What I see as the impetus for future dissemination of information is that we’ve surrendered to this hyper-truncated method of communication. Complex thoughts, data, and emotion need to fit in 140 characters. This means conveyance is shifting towards rapid-fire exchanges of roughly 140 bytes (0.14 kB) of data or less. Given the volumes of data we now handle on a daily basis, this seems, perhaps, minuscule.
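The arithmetic behind that figure is simple but worth making explicit. Here is a quick sketch using the Tweet quoted above; the encoding call just shows that plain ASCII text comes out at one byte per character, which is why 140 characters works out to about 0.14 kB.

```python
# How the 140-character limit maps to bytes of data.
tweet = "Splashdown successful!! Sending fast boat to lat/long provided by P3 tracking planes #Dragon"

chars = len(tweet)                          # character count (under Twitter's 140 limit)
ascii_bytes = len(tweet.encode("ascii"))    # plain ASCII text: exactly 1 byte per character
print(chars, ascii_bytes)                   # the two counts match for ASCII text

# A maxed-out 140-character ASCII tweet is therefore 140 bytes:
print(140, "bytes =", 140 / 1000, "kB")     # 0.14 kB
```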
            Take it one step deeper with me. To understand the Tweet at the beginning of this post, one needs an encompassing awareness of many trends and events for the context of the Tweet to make any sense. For most of us, this comes through our array of RSS feeds, hardcopy or digital newspapers, or word of mouth. Hold that thought for just a second: word of mouth? Word of mouth predominantly shaped our means and method of communicating news for centuries (post, social gossip, town crier); the advent of newspapers shifted that transference of information into high gear. The parturition of the Internet took it from high gear into interdimensional overdrive. I have to pause here for a moment, because I can only think of Spaceballs, when Rick Moranis takes “Spaceball One” from ridiculous to ludicrous speed and says, “My brains are going into my feet!” That scene succinctly captures the feeling of what we face: word of mouth becoming near-instantaneous, pan-global awareness in 140 bytes or less.
Twitter’s enmeshed use of the hash mark (#) and the at sign (@) as identifying classes for objects, ideas, and persons is not groundbreaking. Anyone who has spent time developing with or learning an object-oriented programming language has experiential familiarity with these codifications. The uniqueness lies in the crossover from the cold, digital cyberworld into the neurobiological wetware of the human brain’s left hemisphere. We find ourselves trying to adapt to the staggering volume of information coming in; and in so doing, we’ve created shortcuts to navigate the cybersphere that cross-correlate to classifying identifiers in our own minds. This psychological matrix, to survive in the digital future, finds it necessary to produce routines and algorithms unique to each individual to keep pace with the data, events, and ideas coming in on all fronts.
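To ground the analogy, here is a small sketch of how those two markers function as machine-readable identifiers inside otherwise free-form text. The regular expressions are simplified approximations of my own, not Twitter’s actual tokenization rules.

```python
import re

# Simplified illustration of # and @ acting as identifying markers in free-form text.
# These patterns are rough approximations, not Twitter's actual tokenization rules.
HASHTAG = re.compile(r"#(\w+)")
MENTION = re.compile(r"@(\w+)")

tweet = "Splashdown successful!! Sending fast boat to lat/long provided by P3 tracking planes #Dragon"

print("ideas/objects:", HASHTAG.findall(tweet))       # ['Dragon']
print("persons:", MENTION.findall("via @elonmusk"))   # ['elonmusk']
```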
Two years ago, I spent a semester teaching expository writing to college sophomores. I didn’t suspect it then the way I do now, but I sensed we stood on the precipice of a huge shift in the linguistic structure of our verbal and written communication systems. When I discovered the first “u” as a replacement for “you” in an academic paper, I was furious. It undermined the whole system of grammar and attention to language that I had wielded as truth, the way a priest wields a Bible like a cudgel. However, I now see and accept this shift in communication as inevitable. Just a few days ago, I was reading through an old record of the history of Witchcraft in the British Empire during the 16th and 17th centuries. The English of those texts already reads like a foreign language next to modern American or British English. Inevitably, future variations and linguistic aberrations will manifest, and the English of as little as five to ten years from now may be scarcely recognizable. How else will we keep pace with complex communiqués in 140 bytes or less?