Tuesday, January 31, 2012

Redolence, cerebral capacity and a protocol for cyborg information transfer

Sherlock Holmes once gave the opinion that the human brain has a fixed capacity and that when new information is encoded, old unused information is automatically lost. Even though this was Conan Doyle's fictional take on the matter, I took it to be a fairly serious assertion. It seems obvious to me that the human brain has a limited capacity, and that once it is full, new data can only be stored at the expense of old.

There are others who say that humans only use a small fraction of the brain's capacity. This is absolute nonsense. Only when the way the brain works is understood can that claim be tested one way or the other. The human brain is an almost unfathomably complicated structure, and it's quite possible that we are simply not capable of understanding how it works, let alone deriving its workings through science.

Is it possible to say when the brain is full? I don't think so. Capacity is not a simple function of the number of neurons or connections, but of the integration effects and higher-order interactions that arise from those neurons and connections, and those mechanisms are still unknown to scientists. In my view it would be possible for two individuals to have the same mental capacity with significantly different numbers of brain cells.

I'm mentioning this because until recently I was going along with Holmes's idea: not only lamenting the gaining of knowledge because of the commensurate loss of data, but actively trying to filter new knowledge in order to retain what I already had. I was struggling to remember things occasionally and put this down to a combination of tiredness, age and an almost full brain! This seems silly now; I guess I hadn't really thought about it too much. I no longer think I have reached my mental capacity. I still have some serious confusions, but I put these down to causes other than there not being enough "space left"!

The best way to remember things is to strengthen the memories through methods such as repetition, referencing and stress. These are partly described by Cognitive Rhetoric, a field in which I have a particular interest, but that's a topic for another post.

On the same tack, and to use the rhetorical device of reductio ad absurdum, I could say that my brain would be more receptive to new information if I were to remove some old data. If I could simply clear out the memory pathways of things I no longer care about, then surely they could be used anew for the things I now want to remember.

Unfortunately this absurd reduction isn't practical: I can't unremember things easily. The best way to forget something is not to recall it for some length of time, but recalling a memory strengthens it, so I can't check what has been forgotten without reinforcing it. I just need to keep focussed on the things I care about and hope that those old memories simply fade away.

I believe that this happens all the time. Our minds don't keep a very good track of the memories we have so when some drift off it's almost as if they were never there in the first place.

There's a feeling that people sometimes experience that we call redolence. It's not the dictionary definition - which is more olfactory - but I would describe it this way: an other-worldly experience of remembrance or reminiscence, similar in its indescribability to déjà vu, and often triggered by smells and music.

I experience this quite often. Perhaps I smell a particular odour and instantly a part of my brain is set into action and interacts pleasurably with my mind to bring back a memory or a moment in time. It's a lovely feeling. There are also some songs that do exactly the same thing, although the effect is slightly more contrived, e.g.
Emma's House by The Field Mice,
If I could Shine by The Sweetest Ache
Most of Dire Straits' first album or Making Movies
Certain David Bowie songs: Space Oddity, Fame, Young Americans
Many songs from Big Country's first two albums

These are songs that I listened to in my early teens and I guess they stuck. When I play Down to the Waterline or Espresso Love by Dire Straits I'm instantly transported back to myself at the age of 13 playing ZX Spectrum computer games with my brothers.

I was recently experimenting with a BBC B BASIC emulator. I was typing in a very simple BASIC program copied from the INPUT magazines. I completed the program, typed RUN and pressed ENTER. The program worked nicely. I was then faced with the dilemma of how to go back and view the program. Without thinking, it was as if my brain told my mind exactly what to do. A strange and uncanny feeling came over me, like something had been dredged up from a dark corner of my memory - and literally it had: I hadn't typed the LIST command into a computer for about 20 years. That particular data had lain dormant all that time, taking up space in my brain, and suddenly popped back up for air and shocked me into the line of thinking I'm on with this post. What other dark and dusty corners of my mind will be resurrected next? It's a little scary.

What I take from all this is that the organisation of my brain is what matters, rather than the particular content at various fixed points. I can't do much about which memories I keep, but I can do something conscious about the way they're organised, linked and cross-pollinated.

As human brains evolve over the coming centuries, it would be surprising if they didn't become more aware of the influence of memories, and thereby of the data stored therein. I imagine the brain of the future being much more self-aware and more logically interconnected. Perhaps a new lobe will appear that deals with the organisation of memory. A mechanism that could recall specific memories more precisely than we can now would, you might think, only benefit the lucky individual who had it. I would argue against that, however, and say that such an individual would perhaps lose out on the experience of redolence I describe above. I value logic and precision, but perhaps humans aren't supposed to be too mentally robotic. It would be a gradual change of course, and my prejudices about how humans are "supposed" to be would be invalid in a society based on high technology and logic.

A better solution (perhaps I mentioned this in a previous post) would be to understand the integration and processing of information in the brain well enough to define a translation mechanism to that other great medium for information manipulation - Silicon! I imagine a future where the precise action potentials, synaptic neurotransmitter mechanisms and other communication methods are understood, at least in the sensory areas of the brain, so that a silicon-based transmitter and receiver can be placed in the loop. Such a module could receive signals from the brain at various points in order to create a direct output. Similarly, it could transmit data directly into the brain to produce experiences not possible before.
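To make the shape of that idea a little more concrete, here is a toy sketch in Python of what a "transmitter and receiver in the loop" might look like as software. Everything in it - the NeuralEvent and BrainTap names, the crude firing-rate encoding - is my own invention for illustration, not real neuroscience or a description of any real device.

```python
# A purely illustrative sketch of the "silicon in the loop" idea. NeuralEvent,
# BrainTap and the rate-code encoding are hypothetical stand-ins for whatever
# signals and codecs a real brain interface would actually need.

from dataclasses import dataclass
from typing import Iterable, Iterator
import random

@dataclass
class NeuralEvent:
    neuron_id: int       # which (simulated) neuron fired
    timestamp_ms: float  # when it fired, within the current window

@dataclass
class Packet:
    channel: str
    firing_rates: dict[int, float]  # neuron_id -> spikes per second in this window

class BrainTap:
    """Sits 'in the loop': reads events out of a channel and writes packets back in."""

    def __init__(self, channel: str, window_ms: float = 100.0):
        self.channel = channel
        self.window_ms = window_ms

    def read(self, events: Iterable[NeuralEvent]) -> Packet:
        # Brain -> silicon: encode a window of spikes as per-neuron firing rates.
        counts: dict[int, int] = {}
        for ev in events:
            counts[ev.neuron_id] = counts.get(ev.neuron_id, 0) + 1
        rates = {nid: n * 1000.0 / self.window_ms for nid, n in counts.items()}
        return Packet(self.channel, rates)

    def write(self, packet: Packet) -> Iterator[NeuralEvent]:
        # Silicon -> brain: decode a packet back into a rough spike train for injection.
        for nid, rate in packet.firing_rates.items():
            n_spikes = max(0, round(rate * self.window_ms / 1000.0))
            for _ in range(n_spikes):
                yield NeuralEvent(nid, random.uniform(0.0, self.window_ms))

if __name__ == "__main__":
    tap = BrainTap("optic")
    spikes = [NeuralEvent(1, 3.0), NeuralEvent(1, 40.0), NeuralEvent(2, 77.0)]
    pkt = tap.read(spikes)            # brain -> silicon
    replayed = list(tap.write(pkt))   # silicon -> brain
    print(pkt)
    print(f"replayed {len(replayed)} spikes")
```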

The technicalities of the interface are relatively trivial compared with the deciphering of the data. How is information actually passed around the brain? How can this information be typed - given a formal data type - so that it can be stored in Silicon? I believe that it's possible to do this.

Someone wants you to watch their home movies. You press a button that wires their signal directly into the optic nerve pathway to your brain (or, more likely, pre-integrated and fed directly to the occipital lobe) and hey presto, you're experiencing their home movie effectively first hand.

Obviously this is not my idea. Writers and dramatists have been conjuring this trick for a long time (the recent Channel 4 series Black Mirror dealt with it nicely, if trivially, in "The Entire History of You" by Jesse Armstrong). I am, however, very interested in the mechanism by which it might work. How can we translate a feeling into a dataset? The method that I think will work is an ontological one. By defining an ontology, the domain can be modelled in a dynamic and flexible way. The key to making it work is to define the Entities and Relationships cleverly. The entities and relationships that I can imagine right now are probably not the ones that will eventually be employed in these devices of the future. I cannot yet see them, because neuroscience, cognitive science and psychology have not yet brought forth entities and relationships universal enough to capture both the deep processing of our brains and the universal truths of our perceptible universe.
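To show the shape of the ontological approach, here is a toy sketch: entities and relationships stored as plain subject-predicate-object triples, with a simple wildcard query. The entity names (smells, songs, memories) are placeholders of mine - emphatically not the universal ones these future devices would need.

```python
# A toy ontology: the domain is modelled as Entities and Relationships, here as
# plain subject-predicate-object triples. The specific names are illustrative only.

from dataclasses import dataclass

@dataclass(frozen=True)
class Triple:
    subject: str
    predicate: str
    obj: str

class Ontology:
    def __init__(self):
        self.triples: set[Triple] = set()

    def add(self, subject: str, predicate: str, obj: str) -> None:
        self.triples.add(Triple(subject, predicate, obj))

    def query(self, subject=None, predicate=None, obj=None):
        # None acts as a wildcard, e.g. query(predicate="triggers") finds all triggers.
        return [t for t in self.triples
                if (subject is None or t.subject == subject)
                and (predicate is None or t.predicate == predicate)
                and (obj is None or t.obj == obj)]

if __name__ == "__main__":
    memory = Ontology()
    memory.add("smell:espresso", "triggers", "memory:age13_spectrum_games")
    memory.add("song:Down_to_the_Waterline", "triggers", "memory:age13_spectrum_games")
    memory.add("memory:age13_spectrum_games", "felt_as", "feeling:redolence")

    for t in memory.query(obj="memory:age13_spectrum_games"):
        print(f"{t.subject} --{t.predicate}--> {t.obj}")
```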

In the meantime I will endeavour to delve into the realm of universal truth, if only to skim lightly across the surface tension and perhaps, if I'm lucky, occasionally break it enough to wet my big toe!

Tuesday, January 03, 2012

Facebook on Mars

About 12 years ago there was a considerable panic about the so-called Year 2000 bug. This was a potential problem in computer systems caused by the programs running on them being fit only for dates up to the year 2000. The root cause lay in the way computers handled dates - many programs stored the year as just two digits, so the year 2000 would be indistinguishable from 1900 - but in effect people believed that as soon as the last second of December 31st 1999 passed, planes would fall out of the sky, nuclear arsenals would spontaneously fire at their targets and fridges the world over would cease to work. This turned out not to be true in the end, and I don't think it's widely known why - perhaps a combination of media frenzy (it's a good story) and IT professionals making a bit of extra wonga.
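For the curious, here is a minimal sketch of that classic failure mode. The function names are mine, but the two-digit-year assumption is the well-documented culprit.

```python
# Minimal illustration of the Year 2000 problem: storing the year as two digits
# and hard-coding the century.

def parse_date_y2k_style(ddmmyy: str) -> tuple[int, int, int]:
    """Old-style parser: 'DDMMYY' with the century assumed to be 19xx."""
    day, month, yy = int(ddmmyy[0:2]), int(ddmmyy[2:4]), int(ddmmyy[4:6])
    return day, month, 1900 + yy   # the bug: 1 Jan 2000 becomes 1 Jan 1900

def age_in_years(birth_year: int, current_date: str) -> int:
    _, _, current_year = parse_date_y2k_style(current_date)
    return current_year - birth_year

print(parse_date_y2k_style("311299"))   # (31, 12, 1999) -- fine
print(parse_date_y2k_style("010100"))   # (1, 1, 1900)   -- a century out
print(age_in_years(1970, "010100"))     # -70: the kind of result that breaks systems
```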

The phenomenon seems to have gone from the public consciousness. Not mine, though. The issue of computer programs being fit for purpose has never gone away, and perhaps it's only good fortune that we can't think 1000 years ahead to the next millennium bug. We can, however, predict within our children's lifetimes a proliferation of people and machines across our solar system - Moon bases, Mars bases, Titan outposts, some populated by humans and some not.

Are computer designers today thinking about the implications of where their code may end up in 50 years? I doubt it very much. It's probably less of a problem these days than it was when the causes of the Y2K problem were laid down in the 1960s, because there is more awareness; however, some things spring to mind.

Of course organisations such as NASA already have to deal with this sort of problem, however this has yet to trickle down to the rest of the computing industry except for a few forward-thinking groups.

I worry for the CEO or CTO of the future. When viewing statistics across a solar-system-wide group, how will they correctly interpret the time/date-based attribute hierarchies of their MOLAP systems?! At the moment, Earth-based data considers a year to be 365 Earth days (or 365.25 to be more precise). This will not be the same on Mars: the Martian year is about 687 Earth days long. Fortunately the Martian day is only just over 24 Earth hours, so that at least should convert easily.
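Here is a back-of-the-envelope sketch of what that conversion problem looks like, using the commonly quoted approximations (an Earth year of 365.25 days, a Martian sol of about 24 hours 40 minutes, a Martian year of about 687 Earth days). The rollup helper is mine, not any standard library.

```python
# The same elapsed time rolls up into very different "years" on Earth and Mars.

EARTH_DAY_S = 86_400.0
MARS_SOL_S = 88_775.244          # a Martian day ("sol") is just over 24 Earth hours
EARTH_YEAR_DAYS = 365.25
MARS_YEAR_EARTH_DAYS = 686.98    # roughly 668.6 sols

def rollup(elapsed_earth_days: float) -> dict:
    """Express one elapsed duration in both planets' day and year units."""
    elapsed_s = elapsed_earth_days * EARTH_DAY_S
    return {
        "earth_days": elapsed_earth_days,
        "earth_years": elapsed_earth_days / EARTH_YEAR_DAYS,
        "sols": elapsed_s / MARS_SOL_S,
        "mars_years": elapsed_earth_days / MARS_YEAR_EARTH_DAYS,
    }

if __name__ == "__main__":
    # Ten Earth years of sales history is only a little over five Martian years.
    for unit, value in rollup(10 * EARTH_YEAR_DAYS).items():
        print(f"{unit:>12}: {value:10.2f}")
```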

If you're living on Mars and updating your Facebook status, how will the update time be displayed to someone on Earth? There are already mechanisms that mimic UTC on a solar-system scale, but are people ready for that?
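One plausible approach, sketched below, is to store every update once as seconds since a single shared epoch and only convert to a local calendar at display time. The sol-based rendering here is a made-up mission-style count for illustration - not Coordinated Mars Time or any other real standard - and it ignores light-travel delay entirely.

```python
# Store timestamps once, in one shared epoch; render locally per planet at display time.

from datetime import datetime, timezone

MARS_SOL_S = 88_775.244   # one Martian day in SI seconds

def render_for_earth(epoch_s: float) -> str:
    # Earth users see an ordinary UTC date and time.
    return datetime.fromtimestamp(epoch_s, tz=timezone.utc).strftime("%Y-%m-%d %H:%M UTC")

def render_for_mars(epoch_s: float, base_epoch_s: float) -> str:
    # Mars users see a sol count since some agreed base moment, plus a fraction of
    # the sol expressed as 24 "stretched" Mars hours.
    sols_elapsed = (epoch_s - base_epoch_s) / MARS_SOL_S
    sol = int(sols_elapsed)
    mars_hours = (sols_elapsed - sol) * 24
    return f"Sol {sol}, {mars_hours:.1f} Mars-hours"

if __name__ == "__main__":
    # Hypothetical base moment for the colony's clock and a later status update.
    founding = datetime(2040, 1, 1, tzinfo=timezone.utc).timestamp()
    update = datetime(2040, 3, 15, 12, 30, tzinfo=timezone.utc).timestamp()
    print("stored once as:", update)   # one canonical number in the database
    print("Earth view:    ", render_for_earth(update))
    print("Mars view:     ", render_for_mars(update, founding))
```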

Is it about time that computer practitioners were made more aware? This is, of course, slightly tongue-in-cheek, but a more serious problem for the West may be that conversion of Western natural languages and calendars (English, French, Gregorian, Hebrew etc.) into Chinese equivalents (where they exist) is lacking too. If the current trend of space exploration being led by China continues, then software that can't convert in this way will not survive.

Monday, January 02, 2012

Homosexuals, women and automated natural language representation.

I have the utmost respect for individual people who achieve something creative and worthy. Gender and sexuality have very little to do with this, since intellect is not generally influenced negatively by either. Most societies, however, view gender and sexuality as more important than they are. This certainly applies to homosexuals and women, who I believe have had a rough time at the hands of bigots and idiots for millennia.

This imbalance is being addressed most in progressive societies, and I feel lucky that I live in one of the most advanced cultures in this respect. That's not to say that there are no problems left, and it's not fair to deny that there are some differences between people based on gender and sexuality that are sometimes ignored - these can cause problems too, and ought to be addressed as openly as the fact that a person's intelligence and personality matter more than what they look like, what their chromosomes are or what they find attractive in other humans.

Ada Lovelace was a Victorian woman who worked closely with Charles Babbage in the middle of the 19th century in England. Babbage brilliantly developed the theory for a number of different computing engines. One of these was the Analytical Engine, which is widely regarded as a very early precursor to the modern general-purpose computer (a PC, for example). Babbage was no doubt a genius at mathematical problems and theorising about computation; however, the leap from maths to a general computer is often attributed to Ada rather than Charles. Lovelace's notes on the Analytical Engine contain an explanation of how alphabetical letters could be represented within the mathematical computation, and thereby how computation could be performed on language rather than only on numbers.
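A trivial modern illustration of that idea: once letters are mapped to numbers, a purely numerical engine can compute on language. The Caesar-shift example below is mine and is not a scheme from Lovelace's notes, but it shows the principle.

```python
# Letters as numbers: the "engine" only ever does arithmetic; the letters are our
# interpretation of the results.

def to_numbers(text: str) -> list[int]:
    return [ord(ch) - ord("A") for ch in text.upper() if ch.isalpha()]

def from_numbers(numbers: list[int]) -> str:
    return "".join(chr(n % 26 + ord("A")) for n in numbers)

def shift(numbers: list[int], k: int) -> list[int]:
    # Pure arithmetic: add k to each number, modulo the alphabet size.
    return [(n + k) % 26 for n in numbers]

encoded = to_numbers("Analytical Engine")
print(encoded)                          # language as a list of integers
print(from_numbers(shift(encoded, 3)))  # arithmetic on the integers, read back as text
```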

These notes were largely forgotten for the best part of a century, but they contain the seeds of an idea later developed by Alan Turing in the 1930s. Turing was a brilliant mathematician who, between the wars, developed a number of ideas that laid the foundation for modern computing. Working in mathematical theory alone, he developed the idea of the Turing Machine: a model of a computing system that applies simple algorithmic rules to symbols in order to produce the required outputs. It was used by subsequent computer scientists to develop actual computing devices, and it is still used today for theoretical work in computer science.
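For anyone who hasn't met one, here is a minimal Turing machine simulator in the spirit of that model: a tape of symbols, a head, and a table of rules mapping (state, symbol) to (write, move, next state). The example program just inverts a binary string, but the same machinery underlies Turing's theoretical results.

```python
# A tiny Turing machine simulator. The rule table drives everything.

def run(tape: list[str], rules: dict, state: str = "scan", blank: str = "_") -> str:
    head = 0
    while state != "halt":
        symbol = tape[head] if head < len(tape) else blank
        write, move, state = rules[(state, symbol)]
        if head < len(tape):
            tape[head] = write
        else:
            tape.append(write)
        head += 1 if move == "R" else -1
    return "".join(tape).strip("_")

# (state, read) -> (write, move, next_state): invert every bit, halt at the blank.
invert_rules = {
    ("scan", "0"): ("1", "R", "scan"),
    ("scan", "1"): ("0", "R", "scan"),
    ("scan", "_"): ("_", "R", "halt"),
}

print(run(list("010011"), invert_rules))   # -> 101100
```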

The modern computer is almost ubiquitous in the West, but very few people understand how computers work, and perhaps fewer still what they actually are! In essence a modern PC is a translation machine. It takes various binary inputs that are representations of things that matter to humans and translates them, via calculations (simple additions, subtractions, Boolean logic and so on), into different representations that matter to humans. The computer never has any care for, or understanding of, what the representations mean; it's only the fact that a human can attribute importance to the alphabetical (or graphical) representations that gives a computer any worth.
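A small demonstration of that "translation without understanding" point: the adder below uses nothing but Boolean operations on bits, and has no idea whether those bits are meant as a number or a letter - we supply that interpretation.

```python
# A ripple-carry style adder built from Boolean logic alone: XOR gives the sum bits,
# AND plus a shift gives the carries. The bits mean nothing to the machine.

def add_with_boolean_logic(a: int, b: int) -> int:
    while b:
        carry = (a & b) << 1
        a = a ^ b
        b = carry
    return a

print(add_with_boolean_logic(19, 23))             # 42 -- if we say the bits are a number
print(chr(add_with_boolean_logic(ord("A"), 1)))   # 'B' -- if we say they are a letter
```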

This is the key idea that Lovelace documented over 150 years ago and that Turing mathematised in the 1930s and put into practice at Bletchley Park over 70 years ago. We owe the modern world and its computers to a particular woman and a particular homosexual man. This post is a celebration of the positive effect these figures had on the world, despite the fact that some people, even today, would consider them second-class. In my opinion it's time their contributions were better known and appreciated.

I'm quite sure that there were many, many individuals who contributed to this development who were neither homosexual nor female, and I don't wish to forget their efforts either, but that can wait for another post.

As long as computers are representing things we care about in electrical circuits, they will always need attention to keep them up to date with what we find informative. The Turing Test is another legacy of Alan Turing, and one which is close to my heart since I have wanted to crack it for most of my life. It's extremely difficult to fool a human being when it comes to natural language, since we are almost without exception experts in it and can innately recognise minute variations from "correctness" almost instantly. I'm not sure that I agree with the idea that the nut is close to being cracked (news item link). Apple's Siri is at the forefront of popular attempts at the Turing Test, and even when it can recognise the language itself (not often, it seems) it very often fails at "understanding". It's the understanding part that is difficult, and it's predicated on many things that come naturally to us as speakers and listeners: our world knowledge, the context of the situation, body language, how we "feel" at that moment, and various complex language-recognition mechanisms that we don't even realise we're using, such as anaphora resolution.
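To show how deep even the "small" problems go, here is a deliberately naive anaphora resolver of my own: it simply picks the most recently mentioned known noun before "it". The first example happens to work; the second (a classic Winograd-style sentence) shows why shallow rules fail without world knowledge.

```python
# A toy anaphora resolver: resolve 'it' to the most recent preceding known noun.
# This heuristic is mine and is intentionally shallow.

NOUNS = {"trophy", "suitcase", "dog", "bone"}

def resolve_it(sentence: str) -> str:
    words = [w.strip(".,").lower() for w in sentence.split()]
    candidates = []
    for w in words:
        if w == "it":
            break
        if w in NOUNS:
            candidates.append(w)
    return candidates[-1] if candidates else "?"

# Shallow recency works here...
print(resolve_it("The dog found a bone and buried it."))   # 'bone' (right)
# ...but not here: the right answer depends on knowing what "fit" implies.
print(resolve_it("The trophy would not fit in the suitcase because it was too big."))
# prints 'suitcase', though a human knows 'it' is the trophy
```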

It's my opinion that one avenue of research that might lead to solving the AI language problem involves representing ideas from neurolinguistics and cognitive science. By combining the results of research into how human brains use natural language with logic systems from philosophy and maths, we can create representations that underlie the vagaries of context, grammar and language recognition. These could then be combined with other research to develop a computing device that "thinks" like a human rather than "translates" like a machine.

P.S. Britain's treatment of Alan Turing after WW2 was a disgrace in my view, and an apology extremely long overdue. This year of celebration of Alan Turing - his centenary - might also be the start of a movement to further quell prejudice and harm towards people based on gender or sexuality.