The Future of Gestures and a Niche For BB10 or QNX
I'm hoping to start a discussion on the future of gestures in mobile computing, and the potential for BB10/QNX to develop a strong niche.
The potential for simple gestures to interact with a hub workflow is going to be a strong point for BB10 devices. However, much of the focus with the Z10 and X10 is on the relationship between gestures and a touch screen. Has anyone else been following the development of Leap Motion? https://leapmotion.com/
The Leap Motion device is pretty simple -- it is similar to Microsoft's Kinect. It's designed to let users control a screen without physically touching a device. It is much more precise than Kinect, and runs off of infra-red -- so it can track what each finger is doing with incredible accuracy. The real innovation seems to be in the algorithms for interpreting gestures. It is incredible, really -- and there is a lot of potential. Imagine sitting in front of your screen or a projection and just being able to control it with your fingers -- no keyboard, mouse, or even touch screen.

Specialists in all fields will be able to interact with programs without 'switching' to the action of computing. Surgeons will be able to manipulate a program without taking their gloves off -- oil workers while doing messy mechanical work -- policemen arresting a violent person -- bratty teenagers in classes who have been told to turn off their devices. Leap Motion costs a mere $70, and you can plug it into your computer over USB until the next wave operates wirelessly. The developers are ensuring it will work with all operating systems before it is released -- but it feels like there is a lot of potential in an augmented-reality future to wed a device like this to QNX.
Could something like this be integrated into a BB10 device? It would require adding a small infra-red scanner. Imagine if your phone could read not only the gestures you make on the screen, but also those you make with your hands in the air. A simple voice command could tell your Z10 to switch on IR gestures -- and you just start interacting with your device. With the phone in your pocket and your hands by your side, you could be 'typing' a message while standing in an elevator, or holding the wheel of a car.
Imagine if, when you get to the office, you could just put your Z10 down in front of a screen or a projection and start 'computing' using gestures in the air -- surfing through pages or drafting images.
BB10 is already great -- I'm banking on it being a success. But I think that if this sort of capability could exist with BB10, it would genuinely give it the edge it needs to outpace Apple and Android. Very soon, everything is going to be 'smart'. It won't just be our phones or Porsche cars. It will be our homes, toasters, elevators -- and gestures are going to be our way of navigating. RIM should be anticipating this. Remember the comment that 'nobody will watch a video on a phone'? RIM has clearly caught up -- and has created a better OS... but this could be the next step in creating a niche.
Originally Posted by mw12341:
Thanks!
Yeah, I agree the screen on a phone itself would be a little small for some of the more exciting things you can do with a device like Leap. But I was thinking that, because of the mobility of the phone, it could act as a controller for all of those devices (TV, computer, etc.). It would just communicate with them like a wireless/Bluetooth input device. I'm sure there are already Android apps that let you control those larger devices through the touch screen.
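Just to make the phone-as-controller idea concrete: nothing like this ships on BB10 today, but the core of it could be as simple as the phone streaming small gesture events to whatever device is listening. Here's a minimal sketch in Python (the event format, the UDP transport, and all the names are my own assumptions, standing in for a real Bluetooth HID link):

```python
import json
import socket

def encode_gesture(kind, dx=0.0, dy=0.0):
    # Serialize one gesture event (e.g. a tap or swipe with a
    # direction vector) as JSON bytes for transmission.
    return json.dumps({"type": kind, "dx": dx, "dy": dy}).encode("utf-8")

def send_gesture(sock, addr, kind, dx=0.0, dy=0.0):
    # Fire-and-forget: push one gesture event over UDP to the
    # target device (TV, computer, projector box).
    sock.sendto(encode_gesture(kind, dx, dy), addr)

# Example: a swipe-left event; a real phone would target the TV's
# address, here we just aim at localhost for illustration.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_gesture(sock, ("127.0.0.1", 9999), "swipe", dx=-1.0)
```

The receiver on the TV/computer side would just listen on that port and translate each event into a cursor move or key press, the same way a Bluetooth keyboard driver does.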
Ok, I'm understanding more now.
That would be great, if a phone could be used as a wireless controller!
Sent from my BlackBerry 10 device, in the near future!