
Thursday, September 24, 2009

Skin Me A Surrogate

This article describes the search for a perfect skin for a robot.  And it's true, many times over we humans have proven ourselves to be biased against all sorts of things, and odd skin is one of them.  When's the last time you shook someone's cold, damp, pudgy-feeling hand and decided there and then that you didn't quite like or trust them?

So yes, finding a skin for a robot that we're prepared to accept is important.  But I'd like to point out something about (as the article notes) the social aspects of touch.  In an agrarian society, rough, dry, slightly damaged skin and a firm handshake are "normal" and acceptable.  In a city environment, you'd shrink away ever so slightly from anyone proffering such a hand to shake.

Touch is treated differently, socially, in different countries.  And while there are some people who can't abide furred animals and therefore don't have pets, most people enjoy stroking the cat or the dog, the feel of fur under the hand.  Some people in fact consider this a social prerequisite to partnership and sex, so you have a segment of the population we label "furries," to whom fur is the socially normal skin feel.

What I'm saying is that this kind of research is good, but don't go overboard on "social acceptance" because with a few years to get used to it, people can accept pretty much anything.  As long as the covering has the mechanical properties the robot needs, and isn't razor-sharp or adamantium-hard, I'm pretty sure people would accept it.  Maybe even fur for where it's appropriate... %)

Then at the end, they mention the Surrogates movie, and suddenly none of it matters again - after all, if everyone is in a surrogate, it would be easier to program the surrogates to "feel" whatever covering the other surrogates have as "perfect skin"...

You Want To Stick A Chip Where?

So - remember when I said augmented people will be here sooner than expected?  Here's another step closer: scientists have developed a chip that implants in the eye and feeds signals to the optic nerve at the retina.  They are not claiming they will be able to restore full sight to a person, and with good reason - our eye's instantaneous resolution is actually piss-poor, and the region of sharp detail it captures at any one moment amounts to only a few thousand pixels' worth...

This is because the eye doesn't work like a camera, where each pixel of a scene corresponds to a "pixel" or nerve ending on the optic nerve.  Our eyes perform "saccades," rapid movements, and the brain is fast enough (and the saccades are performed accurately enough) to stitch the results into a composite map.  The eye scans the scene and builds it up a few thousand pixels at a time.

Our brains also fill in a lot of the tedious detail that isn't in the center of our vision with generic extrapolations.  The reason you don't notice is that it's not in your field of attention - and when you look at that part of the scene, it becomes the center of your attention, and the eye and brain start to record that part in more detail.

But it's not hard to track saccades, or even just the nerve impulses that tell the eye to saccade.  So the first team to build a chip that maps a scene onto that extended canvas our brains use - simulating what happens naturally for sighted people - will make full-field vision possible.  It's not too hard to emulate.
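To make the compositing idea concrete, here's a toy sketch (Python, with invented numbers: a numpy array stands in for the scene, a 64x64 patch stands in for the implant's instantaneous field, and random gaze points stand in for tracked saccades):

```python
import numpy as np

def saccade_composite(scene, gaze_points, patch=64):
    """Toy model: assemble a full-field image from small snapshots
    taken at a sequence of gaze points (simulated saccades)."""
    h, w = scene.shape
    canvas = np.zeros_like(scene)        # the "extended canvas" the brain keeps
    seen = np.zeros(scene.shape, dtype=bool)
    half = patch // 2
    for y, x in gaze_points:
        # Clamp the patch so it stays inside the scene.
        y0, y1 = max(0, y - half), min(h, y + half)
        x0, x1 = max(0, x - half), min(w, x + half)
        # Each "fixation" contributes only a few thousand pixels...
        canvas[y0:y1, x0:x1] = scene[y0:y1, x0:x1]
        seen[y0:y1, x0:x1] = True
    # ...but after enough saccades, most of the field is covered.
    return canvas, seen.mean()

# Example: a random 480x640 "scene" and 200 random fixation points.
rng = np.random.default_rng(0)
scene = rng.random((480, 640))
gaze = [(rng.integers(480), rng.integers(640)) for _ in range(200)]
composite, coverage = saccade_composite(scene, gaze)
print(f"fraction of the field covered: {coverage:.2f}")
```

The point isn't the numbers; it's that a small window plus an accurate record of where it was pointing is enough to cover the whole field over time - which is exactly what tracking the saccades buys you.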

That's why, for the moment, those external devices that project back onto the retina are effective, and better than any implanted chip can be: the eye still saccades, and because the picture source isn't fixed with respect to the eye, the image gets "swept" across the retina just as a real scene would be.

And of course, defence forces are also going to be interested in this kind of technology, because again, it provides a much needed situational awareness tool for their troops on the spot, the kind that can't be dropped on the floor and thus lost.

Me, I'm imagining myself at over 50 years of age being fitted with something like that for browsing the web, and I can see one somewhat less obvious problem: as we age, our eyes' lenses lose flexibility and our eyesight deteriorates.  Because the implant sits behind the cornea and lens that cause that blurriness, any image I see from it will be crystal clear and sharp, "focused" at whatever apparent distance, without me needing glasses or correction.

If I'm overlaying even just a web browser on my vision, the experience of seeing one object apparently 2 m away that is sharp and clear while a physical object beside it is blurred would probably induce headaches, nausea, and trouble seeing properly.

So for the moment, external retinal projectors are still the best bet for general augmented vision.

Friday, September 18, 2009

Who'll Think Of The Space Junk?

When the amount of crap in orbit becomes enough of a problem that DARPA wants to find a solution, you begin to realise what pigs we really are.  You also begin to realise that we're probably not going to make it through the weather crisis, either.  Unless we have some creative solutions.

Solution to global warming?  Put even more space junk up there to deflect some sun.  Worry about cleaning it up in 20-100 years, when we have the temperature and weather back in some semblance of balance.  But it interferes with our satellites?  Well, I'd say that with the kind of austerity and cutting back on wasted resources that global warming is going to bring, the least of your problems will be that satellite radio isn't working...

Solution to the space junk, for whenever we want to clean it up?  Some years ago (in 1993/1994, I seem to recall) I was talking to a few NASA types via BBS echomail, and we figured there were ways to get reusable craft up there by converting existing types of craft, using water ablation shields, etc. etc., yada yada...  The purpose?  To clean up space junk...

Part of the solution was to attract private investment, set up something along the lines of converted Blackbird aircraft and small remote controlled (which these days would be autonomous) scoop craft to collect the junk and bring it to your shuttle craft.  In order to save bringing all that junk back down, the plan was to aggregate it at an L5 point, and at some stage begin to salvage some of the junk and cobble together a habitat.

See, the other thing is that space is not owned by anyone.  You make a space station, you're sovereign.  So in addition to having collected a fee for removing space junk, you've established a stockpile of salvaged material, a habitat, and in effect, a new world...

So will someone please think of the space junk?

Wednesday, September 16, 2009

Syberious Stuffe

It's just occurred to me, so it must have occurred to other people around the world already - are we ready for a real cyberwar?  This article on RISKS Digest (Networks and Nationalization With Respect to Cyberwar) crystallised an already-forming idea of mine.  We've already liberated a lot of the proletariat, over the centuries.  From serfdom to clerkdom to CEOdom, baby! But of course I mean more than that, I'm a devious bastard that thinks sideways. %)

What I mean is: speech empowered the caveman types, because it allowed the transmission of more complex ideas and procedures - in short, it allowed culture to blossom.  Culture in its turn created aggregations of people that were more than just an extended family; it turned us from a widespread population of creatures into much larger, distinct social units of humans.  Along the way it stratified those social units, so that we ended up with villages with headmen, hunters, farmers, providers, and so forth.

Our next communications leap was writing.  This let us cement those units over generations.  Village traditions and historic tales, once committed to a more permanent record than oral tradition, kept the identity of those units from one generation to the next, and allowed other villages to be informed, to join villages into larger communities.

Writing was initially for the elite.  Despite the limited reach of the written word, it achieved an agglomeration of villages, the dissemination of stories and legends, and of course it provided a way to record tithes and taxes, births and deaths, and laws.  Despite not being able to read, the population allowed themselves to be ruled by the power of those recorded words.  Writing gave people control over one another.

Then, writing and reading became more mainstream, and ordinary people were suddenly able to read and contribute.  That empowered a great many people to begin to advance the sciences, the arts, the politics.

Finally, the printing press made all that knowledge available to pretty much anyone.  Not by coincidence, the technological revolution quietly snuck in with this phase, and liked it here.  Because suddenly anyone could learn the science, the oratory, the mathematics of the time, and many minds make mincemeat of stagnation.

As a side effect, more and more people could also get their works into print.  Our "world library" turned from something that may have had five thousand well-considered, well-written, accurate works into something that had millions of works.  Dilution occurred; you had to pick the books you would read in a lifetime.  But somehow, people managed it.  They sorted the gems from the overburden, and many went on to further their fields with more well-written works.

Printed books cemented countries and locked in politics, defined and entrenched fields of science, and created larger units of social structure, some of which (to the horror of the aristocracy) seeped across old political borders and formed international bodies.  Also, of course, smaller societies were formed within social units: clubs and arcane guilds, secret societies, and insurrectionist organisations.

Which brings us to the Internet. Nothing has shrunk the world, or aggregated those larger social units, quite like the Internet.  Suddenly too, sifting all the books of the world and pulling threads from the dross looks like an easy task compared to absorbing the overload of material that's here online now.  How did you come across this article? 

But again, the medium has broken down barriers and made us a larger body.  And again, smaller units are given much power to change things.  It really is at the stage where I can put up a flag in my loungeroom, call it New Cyberia, and begin to take down the infrastructure of any country I want to, one power grid control at a time, where I can take over one missile or fire control network after another, ground aircraft and put ground traffic into disarray - all in the name of New Cyberia.

And if I'm clever about it, I could end up being given the keys to half the developed nations of the world in return for their ability to function properly again, without anyone ever seeing me or my flag, nor even knowing which country it's in...

That makes the remark about the teenager in their basement seem a lot more sinister and imminent than it did before.  Because if I can think of it, someone is already working on doing it...

Tuesday, September 1, 2009

Killer Technology Developed

There have been studies that prove texting while driving is extremely unsafe.  And it has - luckily - been quite easy to spot a texting driver.  Yet they still do it: I've watched dozens in the last month or two, swerving ever so slightly, eyes downcast, some even hoisting the phone up to dash level to text, not really realising or caring that they can't focus on the close-up screen and the distant road conditions at the same time.  It's a law of optics.

Then too, it's been shown that even just having a conversation while driving can impair driving skill by as much as a good session at the local pub.  It's a bit harder to spot a driver with a Bluetooth headset in their ear, but they still do that, too.  And again, if you look for the telltale lapses of attention, you can usually tell which drivers are doing it.

Even pedestrians are not safe, with news suggesting that maybe having your music player's earpieces in and drowning out ambient noises makes you more prone to having accidents.  While walking.  For chrissakes.  This isn't obvious to the people wearing their music?

Let's face it - people who multitask are actually very bad at it.  We're not yet built to multitask.  But that of course doesn't stop people doing it in droves, does it?

It's also becoming more and more difficult to spot the once-obvious signs - drivers who used to hold the cellphone at eye level or up to their ear can now use speech-to-text, inbuilt hands-free systems, and any number of other technological advances to become a little bit more unsafe.  More power to them, as long as they take each other out and not me.

Or they can take out:

The pedestrians, who once upon a time you could tell were attention-impaired because they were hauling a boom box on their shoulders.  Then they had reasonably visible headphones, and now they are a great deal harder to spot except for the white wires and their tendency to be found under buses, trains, and other vehicles with texting drivers...

Now there's a game changer: a technology that will make spotting the distracted all but pointless.  Because the people you can spot will be the "old school" of distracted - several orders of magnitude safer than the people wearing this technology, whom you won't be able to spot at all...

Look - I know I'm a bit of an evangelist for Augmentation, and I will always be.  The more we develop, the more we are finding that augmentation is convenient and in some cases (artificial hearts, kidneys, etc) even vital.

I think people have an amazing capacity to assimilate this kind of augmentation into their daily lives, as witnessed by - well, as witnessed by the number of people who safely manage to integrate hands-free calls, car music systems, iPods, and making calls while walking, into their daily lives.  

Yes - for every dickhead out there putting everyone's life and limb at risk, there are a few dozen that either choose an appropriate time to perform such attention-sapping behaviours, or else have managed to train themselves to perform multiple functions without significant deterioration of their survival skillsets.

It's easy enough to train people to integrate multiple information streams at once, and manage to coordinate their actions at the same time.  It's a matter of teaching them how to assign attention to every stream, assign priorities, and then act on the highest priority inputs first.
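That cycle is simple enough to write down.  Here's a toy sketch in Python, where the stream names and priority weights are invented for illustration rather than taken from any real training syllabus:

```python
import heapq

# Hypothetical input streams and priorities (lower number = more urgent).
# These names and weights are made up for the sake of the example.
PRIORITY = {
    "hazard_ahead": 0,
    "navigation_prompt": 1,
    "incoming_call": 2,
    "text_message": 3,
    "music_change": 4,
}

def attention_cycle(events):
    """Notice every stream, rank each event, then act on the most
    urgent items first - the attention/priority/action loop in miniature."""
    queue = []
    for event in events:                                 # attention: notice it
        heapq.heappush(queue, (PRIORITY[event], event))  # priority: rank it
    while queue:                                         # action: worst first
        _, event = heapq.heappop(queue)
        print("handling:", event)

# One "moment" of simultaneous inputs:
attention_cycle(["text_message", "hazard_ahead", "incoming_call"])
# -> handles hazard_ahead first, text_message last
```

The hard part, of course, isn't the loop; it's training a human to keep running it under pressure without dropping the hazard at the top of the queue.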

The defence forces are able to train soldiers to do that, generally by brute-force repetition, but more and more with training designed to reinforce the attention/priority/action cycle.  And they've been doing a smashingly good job at it, too - they can keep a soldier alive on the battlefield while listening to orders, watching the situation, doing quite complex tasks with their weapons and equipment, and even managing to breathe...

So I'm suggesting that perhaps rather than making it illegal to use a phone while driving or walking, let's make it a licensed activity.  As part of the driver training, teach new drivers to deal with distractions, assign priorities, and carry out the relevant actions.  Teach pedestrians how to maintain watchfulness while using the technology.  Then make it one of the things they must be tested on if they want their license to include that activity.

You can't eliminate the technology, and pretending that legislation can make it go away is just fairy dust.  This kind of stuff should be taught at school level, and reinforced at every level-up people go through.  Learning to drive a train?  Then dealing with cellphones and text messages should be an integral part of that training.

Because no matter what, you already know that in two or three years you'll have the first people walking around with these contacts in, playing augmented real-life Pac-Man among the traffic...