Showing posts with label augmented reality. Show all posts

Wednesday, 11 August 2010

We Are Mobile Phones, But Our Mobile Phones Are S**t


There are currently 4.023 billion mobile phones connected globally, against a world population of 6.75 billion people. That's 0.596 phones per capita. Roughly speaking, it means that more than half of the world's population owns a mobile phone. Let's dig in.

More people in India have access to mobile phones (543 million) than to adequate sanitation (366 million) (UN, 2010).

China is the single country with the highest number of mobile phones connected: 547.3 million (with India second). But let's put that in context. The population of China is 1.31 billion people, so the estimate is 0.416 phones per person.

Europe has 712.8 million mobiles connected. With a population of around 593 million, that's 1.2 phones per person. Even at the level of fine detail, countries like Germany, Italy, France, the UK, Poland, Spain or Ukraine have roughly between 1.1 and 1.5 phones per capita. So proportionally, we are leading, side by side with Russia (which also has around 1.3 phones per capita).
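All of these per-capita figures are just phones divided by population; a quick sketch in Python, plugging in the numbers quoted above (the small gap between the computed China value and the quoted 0.416 comes down to which population estimate you plug in):

```python
# Phones per capita = phones connected / population.
# All counts in millions, taken from the figures quoted in this post.
regions = {
    "World":  (4023.0, 6750.0),
    "China":  (547.3, 1310.0),
    "Europe": (712.8, 593.0),
}

for name, (phones, population) in regions.items():
    print(f"{name}: {phones / population:.3f} phones per capita")
```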


It's a big number. It means that statistically every person in Europe has a mobile phone. Obviously that's not the case, but the number is still extremely high.

The mobile phone is becoming a body organ. You can now browse the net, watch TV, listen to the radio (OK, that's old school), geotag the reality around you, navigate any place (with GPS coverage), play complex games, shoot photos and videos, and book tickets or a restaurant with just a few clicks. We are constantly connected. You go to the pub and see eight people sitting silently around the table, clicking away at their phones. They eventually come back to the conversation, but every social pause becomes an excuse to send a text or check/update Facebook/Twitter.

But with all this attachment and expansion of mobiles, there is still not that much innovation coming in the field, just redesign and recycling of old solutions.

Sure, the iPhone redefined the touch interface and globalised everyday app use, and that trend is now evolving very quickly within mobile application development. But hardware development just froze in time, and mobile phones are becoming increasingly clunky. Phones from HTC, Apple, RIM and Nokia are losing their key features: battery life, durability, simplicity of interface, affordability, signal strength and toughness. And services are going downhill too, with expensive contracts, confusing insurance policies, and steep roaming and data transfer rates. My iPhone lets me send an e-mail, but it costs me £3/MB from outside the UK, and the battery dies on me within a day. Ten years ago I had a Nokia 3310 whose battery easily lasted 5 days, with frequent use.

My point is that someone has to rethink the current concept and services of the mobile phone. The software is there, but the hardware and mobile networks lack vision. It's all iPhone-touch-screen-HTC-another-version-clones now, with competition over who gets the bigger OLED screen and whether O2 or Orange screws more users on data plans with a "free" phone.

So...

I want good battery life (5-7 days, or 2-3 weeks like e-book readers!). 
I want a very simple interface (well, Apple kind of succeeded here). 
I want just the few apps I use on a daily basis. Honestly, how many apps that you download from the App Store do you actually use regularly? 5? 8? That's my personal estimate, anyway. And I have about 100 of them kicking around in my iTunes library, most of them just useless crap (for some of which I paid, loser...). 
I want quality calls (with faces, yes, they already have it in Japan, Apple) and good network coverage. I want cheap roaming rates and cheap data. Not for downloading YouTube movies, but to turn on simple GPS to find some cool bar when I'm chilling somewhere abroad. 
I want a simple and clear contract that I can easily personalise and adjust to my calling/travelling/data needs, with free access to wi-fi hotspots around Europe (especially airports).

Is it really THAT much? ;-)

[All data in this post has been calculated and sourced using Wolfram Alpha]
[Image: Geek On Acid ©]

Thursday, 8 April 2010

Skinput, Text 2.0 and Atari Pink Floyd


You sit on the sofa, relax and flick your finger; immediately the projector in front of you turns on, and tiny remote control icons are displayed on the skin of your arm. You tap the icons to switch channels; another tap on the skin and you run some applications in the background to get e-mails and news. Then you just flick your fingers and display a nice table across your entire shoulder, showing the tunes in your music library or a list of movies. You 'click' your skin where the title is displayed...

This is Skinput, a new technology that will turn your skin into a control surface. Researchers at Carnegie Mellon, in collaboration with Microsoft Research, designed a sensor that registers vibrations and sound conducted by your skin. The sensor will also contain a simple LED-based projector that displays icons, titles, whatever images you want on your skin, allowing you to control any device around you. In the simplest format, you will be able to give commands to your devices using only simple finger taps, without displaying anything on your skin at all. And mind you, this is Chris Harrison's PhD project. Neat.
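Under the hood, this kind of input boils down to a classification problem: extract features from the skin-conducted vibration of a tap and decide which location it came from. A toy sketch of the idea (the locations and feature values below are invented for illustration; the real system trains machine-learning models over many acoustic features):

```python
import math

# Hypothetical nearest-centroid classifier for tap locations.
# Each centroid is a (dominant frequency, amplitude) pair that a
# given tap spot supposedly produces -- values invented for the sketch.
centroids = {
    "forearm": (120.0, 0.8),
    "wrist":   (240.0, 0.5),
    "palm":    (410.0, 0.3),
}

def classify_tap(features):
    """Return the tap location whose feature centroid is closest."""
    return min(centroids, key=lambda loc: math.dist(features, centroids[loc]))

print(classify_tap((250.0, 0.45)))  # nearest to the "wrist" centroid
```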

Next: tablets! The media's favourite topic for the last couple of months; my technology search feeds are 90% iPad reviews. But one thing came through as potentially very interesting: an eye-tracking tablet with Text 2.0. The idea is that a tiny camera on the front of your tablet registers your eye movements. You will be able to focus on a word and see its definition, but Text 2.0 is something more: it will know when you skim the text and fade out irrelevant words for you. It will learn your reading habits, with the possibility of giving you feedback. Most importantly, it is a completely new way of controlling applications just by LOOKING at particular points on the screen, and it opens vast possibilities for visual perception research. Above: an Apple patent for eye-tracking. It's coming ;-)
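Interfaces like this hinge on telling fixations (the eye resting on a word) apart from saccades (jumps between words). A classic way to do it is a dispersion-threshold test over consecutive gaze samples; here is a minimal sketch of that idea (thresholds invented for illustration, not Text 2.0's actual algorithm):

```python
def find_fixations(samples, max_dispersion=15.0, min_samples=5):
    """Group consecutive gaze samples (x, y) into fixations using a
    simple dispersion-threshold test: while the points in the current
    window stay close together, they belong to one fixation."""
    fixations, window = [], []
    for pt in samples:
        window.append(pt)
        xs = [p[0] for p in window]
        ys = [p[1] for p in window]
        dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
        if dispersion > max_dispersion:
            # The new point broke the cluster; close out the old window.
            done = window[:-1]
            if len(done) >= min_samples:
                fixations.append((sum(p[0] for p in done) / len(done),
                                  sum(p[1] for p in done) / len(done)))
            window = [pt]
    if len(window) >= min_samples:
        fixations.append((sum(p[0] for p in window) / len(window),
                          sum(p[1] for p in window) / len(window)))
    return fixations

# A steady cluster on one word, then a jump to a second word:
gaze = [(100, 200)] * 6 + [(400, 210)] * 6
print(find_fixations(gaze))
```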

Finally, some music. Check out this album by Brad Smith: he remade the entire Pink Floyd album 'Dark Side of the Moon' in a chiptune, Atari-sound-like version. Especially listen to side two. Awesomeness in 8-bit clothing; it made my day ;-)

Monday, 18 January 2010

Noticin.gs

There is a new game I started playing recently: Noticin.gs. The rules are simple. When you NOTICE something interesting in your environment, you take a photo and upload it to Flickr, with a geotag for the location where you captured it. The other rules are:
  • People aren't noticings,
  • You can only submit one photo of each 'thing',
  • Each player is limited to three noticings per day,
  • It must be clear to other players what the noticing in the photo is, using the title of the photo if it's unclear in the picture.
At the end of each day your photos get scored and you earn points. That's it.

Why is it interesting?

For the same reason as the reactive music I described here: it is a trick that makes you connect to and attend to reality, rather than isolate and disconnect from it. When I am searching for something to capture, I am more aware of everything around me. Simply put, noticin.gs makes you NOTICE things more.

So get your mobile phone set up with Flickr and start playing, and you will notice a lot of cool things around you...

Sunday, 10 January 2010

Nexus One, Transparent Display, Project Natal and shit loads of touch tablets.

It's the beginning of 2010 and there is a lot happening in the technology world. In particular, I have been following the Consumer Electronics Show (CES) to see if there is anything worthy of cyberpunk attention.

First: everyone has been hooked on the release of the Google Nexus One phone (pictured below), which to me looks like just another iPhone copy. Some specs are better (larger screen, 5-megapixel camera, voice recognition, multitasking), but to be honest, I would expect more from Google. All the internet gossip about the Nexus One came true, which was... very disappointing, because it was all so predictable! I really like the fact that you can dictate your SMSs to the Nexus, but I know from experience that, as a non-native English speaker, those technologies don't work so well for me. Nothing really new there; it is as if Google just recycled the iPhone. Not impressed at all. What has to be admitted, however, is that the real war in mobile phones at the moment is not in hardware but in software. As Mike Harvey reflects in his Times review from yesterday, Google's Android OS is the only real candidate to overtake Apple in the mobile market. We shall see how it develops, and what phone we will have in our pockets in a year's time...


Second, the cool stuff: the prototype of Samsung's transparent OLED display and 0.05 mm OLED panels (pictured below in the form of a window, as shown at CES). Yes! I was waiting for this! I can already see my mobile phone with a transparent display scanning the reality around me with augmented labels, or my flat's windows doubling as displays. The prospect of reading an e-newspaper on a 0.05 mm display is even more exciting. OK, maybe I haven't outlined the most practical aspects of this technology (like, hmmm... a transparent medical body scanner, you know, the one from Aliens), but it still has a very high cyberpunk-geek factor for me. It's the kind of stuff from sci-fi movies that comes true. Ah, shame it's still in research and development, no release date yet...


Third: Microsoft's Project Natal for the 'new' Xbox 360. Finally. 'You Are The Controller', as the advert says: a full-body gaming interface. I think it was inevitable after the Nintendo Wii, but it will finally be here, this year, in your living room. Project Natal is a tiny camera (pictured below) that combines a depth sensor, a multi-array microphone, and software that gives the camera 3D full-body motion capture, face capture, voice recognition, acoustic source localisation and ambient noise suppression. So you slash, shoot, speak, shout, kick or jump, and your on-screen gaming character does the same. However, at CES they didn't actually demo the device. The Microsoft folks just showed cheesy clips of Natal and jumped around, excited about how amazing it's going to be. Demo please, Microsoft; we want a proper user demo. Anyway, it's coming soon, this summer. I think I will actually buy an Xbox just to experience it...


Fourth, fifth, sixth... all the other releases are taken up by so-called 'slates', or touch tablets. The Lenovo Tablet, Dell Tablet, Freescale Tablet, T-Mobile Vega Android Tablet... This year at CES almost every possible company decided to release its own tablet, probably in competitive anticipation of the widely gossiped-about iSlate (supposedly coming soon from Apple). And I totally agree with my pal from the lab, David (who already criticised it here), that tablets don't fill any market gap and are therefore useless. E-book readers I understand, because I want to read and store journal papers, newspapers and articles on a thin, light device with an E-ink display (and they are cheap). Netbooks I acknowledge, because I would like a light laptop working in the cloud (and they are cheap). But I don't need a 10" copy of my iPhone with a touch LCD or LED screen. Not useful for writing, not useful for reading; shit, it's a pointless technology and it's expensive. Enough for today.

[photo credits: Google, Engadget, Wikipedia]

Wednesday, 16 December 2009

Geolocated Augmented Reality

Is it possible to have Terminator-like augmented vision on our mobiles?

I mean, yeah, we have some basic augmented reality apps, and I already spoke about them in one of my previous posts. However, they all suffer from an obvious problem: mobile GPS location is only accurate down to about 8-12 metres, often worse. The consequence is that the augmented reality data floats all over your screen and is never quite in the right place.
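One cheap (and admittedly crude) way to tame the float, short of actually better positioning, is to low-pass filter the GPS stream before anchoring the overlay. A sketch of an exponential moving average over readings follows; this is just the general idea, not how any particular AR app does it:

```python
def smooth_positions(readings, alpha=0.2):
    """Exponentially smooth a stream of (lat, lon) GPS readings.
    Lower alpha = steadier labels, but slower response to real movement."""
    smoothed = []
    lat = lon = None
    for r_lat, r_lon in readings:
        if lat is None:
            lat, lon = r_lat, r_lon          # first fix: take as-is
        else:
            lat += alpha * (r_lat - lat)     # move a fraction toward the
            lon += alpha * (r_lon - lon)     # new reading each update
        smoothed.append((lat, lon))
    return smoothed

# Jittery readings scattered around a fixed point:
raw = [(51.5074, -0.1278), (51.5076, -0.1275),
       (51.5072, -0.1280), (51.5075, -0.1277)]
print(smooth_positions(raw)[-1])
```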

I want to look at my phone and have nice icons, descriptions, clips and images overlaid on the surrounding reality in the right places, without the irritating floating effect. I want to be able to pick an object in city space and label it in a way that will display properly for other users. I want it to be like in the science-fiction movies.

But how to achieve that?

One company has thought of an interesting way to solve the floating and accuracy problem. Earthmine has adapted a 3D space-mapping technology used in the Mars rovers to capture city streets and create a detailed 3D representation of space. At first it sounds just like a sophisticated Google Street View. However, Earthmine's vision is to merge the 3D maps they capture with the location capabilities of mobile phones and create a fully interactive, stable geolocation engine. The inaccurate GPS signal from the phone will be combined with recognition of the surroundings you are standing in, which will give you a precise augmented reality display.

I just hope that Google has a plan like this and can apply such technology to Street View, because that would really be... the quickest solution, considering how much of the world they have already mapped...

[photo credit: Earthmine]

Thursday, 10 December 2009

Reactive Music and Reconnected Reality

I have different soundtracks for the everyday world; my iTunes library is filled with playlists for different moods, actions and places.

Those playlists also allow me to escape from the surrounding noise, and therefore from the actual experience of the auditory reality I exist in.

However, I've recently discovered reactive music: music that is shaped by the sounds of your everyday world.


I have two apps on the iPhone, Kids on DSP and RjDj, both based on the same engine: they play pre-recorded samples, but morph and mix them with sounds recorded in real time from the microphone built into the headphones. So the software records the sounds that surround you and samples them into a continuously evolving soundtrack. As a result, you actually listen to your environment, which makes you more connected to reality. Your medium returns you to the world that surrounds you.
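The basic trick, as I understand it, can be sketched in a few lines: measure how loud the ambient buffer is, and use that level to decide how much of the pre-recorded sample gets mixed in. Plain Python lists stand in for audio buffers here; the real apps do far more sophisticated granular processing than this:

```python
def reactive_mix(mic, sample, sensitivity=2.0):
    """Mix a pre-recorded sample with live mic input, letting the
    ambient loudness of the mic buffer set the sample's level."""
    # RMS loudness of the mic buffer drives the wet/dry balance.
    rms = (sum(x * x for x in mic) / len(mic)) ** 0.5
    wet = min(1.0, rms * sensitivity)
    return [wet * s + (1.0 - wet) * m for s, m in zip(sample, mic)]

quiet_room = [0.01, -0.02, 0.015, -0.01]   # near-silence from the mic
loop       = [0.5, 0.5, -0.5, -0.5]        # the pre-recorded sample
print(reactive_mix(quiet_room, loop))       # mostly silence comes through
```

In a quiet room the output stays near-silent; on a noisy street the sample plays at full level, so the soundtrack literally follows your surroundings.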

It is very crude and simple, and it needs to be improved, but I see here a type of augmented reality that works like a boomerang, coming back to you with an upgrade of your actual experience of the moment and the sounds of your surroundings. You don't cut yourself off from the noise; you use those sounds to make your music.

So I recorded my trip back from work today using different 'scenes'* from RjDj and Kids on DSP, and I managed to make a small album out of it ;-)

*'Scenes' are small programs inside RjDj that control how sound is recorded and sampled along with the default sets. These 'scenes' can be programmed and published for other users by anyone who makes one.

Geek On Acid: Back From Work
[recorded with RjDj and Kids On DSP]


Tuesday, 17 November 2009

Virtual Dresses and Skin Displays

Today something for geek chicks out there. Some hot tech-news from the world of fashion.

First: having trouble deciding whether to buy that dress online? Not sure about the size, or how it will go with your jewellery? Well, the folks at Tobi Shopping created a virtual dressing room, Fashionista. It works as augmented reality: you print a special marker, stand in front of your computer's camera, and just change the dresses or other clothing overlaid on top of your body image, with everything shown in your browser in real time. With the printed marker you adjust the position of the dress in the virtual camera space. It is a bit crude, but a very interesting idea in terms of applying augmented reality to the user interface. I tested it with Ola, and she got hooked on it quite quickly. Below are her photos with the virtual dresses :) But watch the tutorial before you try the dresses.

Second: remember those gang members from Gibson's Virtual Light? Or the drummers from Stephenson's Diamond Age? Yes, the ones with interactive tattoos displaying different images under their skin depending on their emotions and behaviour. Now a research team from the University of Pennsylvania has finally made it happen, with LED tattoo technology that can turn your skin into a full-colour screen. The LEDs are a combination of a silicon microchip with a silk substrate, allowing the chip to smoothly dissolve into your body. Unnoticeable microelectronics implanted in the surface of your skin will let you hook it up to any electronic device, displaying anything you want. They will also be useful diagnostically, for monitoring vital metabolic signals, blood pressure, etc. Initial displays will be quite basic, but the potential applications for the future are limitless, from full-body displays to microchips implanted in your retina to regulate the amount of light coming into your eye, or to display augmented reality. Philips is already exploring commercial uses of electronic tattoos. It is a bit creepy, but... I love it ;-)

Monday, 9 November 2009

Neurointerface

William Gibson, in his 1984 novel Neuromancer, predicted that we might connect computers to our brains. His dark story was fiction 25 years ago, but now it is becoming reality.

We are closer than ever to interfacing our brains with computers.

One of the most promising developments is BrainGate (currently in clinical trials conducted by Cyberkinetics). BrainGate consists of a sensor, in the form of sophisticated micro-electrodes implanted directly into the motor cortex, and a decoder: dedicated software translating brain activity into useful commands for external devices. At this stage the majority of tests are conducted in patients with severe forms of paralysis, like quadriplegia or locked-in syndrome.
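For flavour, the 'decoder' half can be as simple as a linear map from per-electrode firing rates to a cursor velocity. A toy sketch of that idea follows; the weights and baselines are invented, and real decoders are fitted from recorded neural data while the patient imagines movements:

```python
# Hypothetical linear decoder: each electrode channel's firing rate
# (spikes/sec) contributes a weighted amount to cursor velocity.
WEIGHTS = [
    # (vx weight, vy weight) per channel -- invented for illustration
    (0.04, -0.01),
    (-0.02, 0.05),
    (0.01, 0.03),
]
BASELINE = [20.0, 15.0, 25.0]  # resting firing rate per channel

def decode_velocity(firing_rates):
    """Map a vector of per-channel firing rates to a (vx, vy) command,
    weighting each channel's deviation from its resting rate."""
    vx = sum(w[0] * (r - b) for w, r, b in zip(WEIGHTS, firing_rates, BASELINE))
    vy = sum(w[1] * (r - b) for w, r, b in zip(WEIGHTS, firing_rates, BASELINE))
    return vx, vy

print(decode_velocity([30.0, 15.0, 25.0]))  # only channel 1 above baseline
```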

Applications? Prosthetic limb control, complex computer operation with augmented reality, and therapy for neuronal disorders (an extension of deep-brain stimulation).


Certainly, we don't know enough about the brain at this stage to create complex interfaces, but even hooking up the motor cortex to a functional microchip will be a milestone in neurocybernetics.

Problems?

What problems? ;-)

But seriously, one problem, outlined in this month's issue of Wired, is neurosecurity. Folks at the Medical Device Security Center (MDSC) showed that they could reprogram an implantable heart regulator with simple radio equipment. Now, imagine your neuroimplant being hacked, and your prosthetic limb, anti-depressive brain stimulator or remote control function (;-) being taken over. So the MDSC is now developing encryption-based security methods for neuroimplants.

A Firewall for your brain.

[photo credits: Cyberkinetics]

Tuesday, 27 October 2009

Printing Image on Your Eye

I will be talking a lot about augmented reality. It is an exciting feature that has recently invaded most mobile phones running Android or the iPhone, with a number of geolocative applications like Layar, Nearest Tube or the Acrossair browser. It is very simple at this stage: you use the camera built into your phone, together with 3G and GPS location information, to display various data over the objects surrounding you. Sounds neat, but in practice I found it bizarre. You stand in the middle of the street waving your phone around; then again, you do get access to information quite quickly, visually attached to the reality surrounding you. I used Layar and Nearest Places on the iPhone, but I found it a short-term fascination. Too small, too distracting as an interface, too complex, not exciting enough.
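Under the hood, these geolocative browsers are doing little more than bearing maths: compute the compass bearing from you to each point of interest, subtract the phone's heading, and place the label horizontally if it falls inside the camera's field of view. A rough sketch of that idea (flat-earth approximation, fine over city distances; field of view and screen width are made-up values):

```python
import math

def screen_x(user, poi, heading_deg, fov_deg=60.0, screen_w=320):
    """Return the horizontal pixel for a POI label, or None when the
    POI is outside the camera's field of view.
    user/poi are (lat, lon); heading_deg is where the phone points."""
    dlat = poi[0] - user[0]
    # Scale longitude by cos(latitude) so east-west distances are right.
    dlon = (poi[1] - user[1]) * math.cos(math.radians(user[0]))
    bearing = math.degrees(math.atan2(dlon, dlat)) % 360.0
    # Offset of the POI from the centre of the view, folded into (-180, 180]:
    offset = (bearing - heading_deg + 180.0) % 360.0 - 180.0
    if abs(offset) > fov_deg / 2:
        return None
    return round((offset / fov_deg + 0.5) * screen_w)

# POI due north of the user, phone pointing north -> centre of screen:
print(screen_x((51.50, -0.12), (51.51, -0.12), heading_deg=0.0))  # 160
```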

But the potential is there, a massive one.

Augmented reality projected directly onto your retina is a very close possibility.

First, you've got Brother Industries, who designed a 'retinal image display' (upper image), where a device attached to a glasses frame draws the image on your retina using a laser.

Second, you've got the contact lenses from Prof. Parviz at the University of Washington (lower image), who wants to print circuits and LED-based displays onto contact lenses.

Both devices are certainly at the trial stage, but it's a question of 2-3 years before they are fully developed and tested by the US army, and then make it to the market.

Of course, that would take augmented reality to a new stage, where the visual display is fully incorporated into your daily real world.

[photo credits: Parviz Research Group, University of Washington; Brother]