Rumors abound that Apple is set to acquire PrimeSense, the company behind the Xbox Kinect's motion-tracking technology. But what would Apple do with PrimeSense's motion sensor tech if the acquisition goes ahead? It could certainly come in handy if Apple goes ahead with a game console, but the Kinect was never that popular as a game controller. So what else could it be used for?
PrimeSense developed the architecture for the original Kinect that appeared on Microsoft's Xbox 360. While Microsoft opted to develop their own version of the 3D sensor for the Xbox One, which should be out later this year, PrimeSense have been working on a new, cheaper, less energy-hungry and much more compact version called Capri.
In a deal reported to be worth a measly $280 million, the motion-sensor technology could give Apple an angle on their rumored game console and the iPhone, bringing them more in line with the smart- and air-gesture technology Samsung are already using.
The most obvious, and perhaps least exciting, use for the Capri would be in the game console that Apple are reported to be developing, as both they and Google try to get further into our living rooms. But the Kinect was not a particularly popular interface for hard-core gaming, due primarily to the delay between action and response, which just couldn't compete with the immediacy of hand-held controllers.
Sure, it was fun, and it sold well, as did the Wiimote motion-sensing controller, but it wasn't enough to become the standard interface for gaming: developers never produced a genre-defining game that set the standard for how motion-sensing controllers should be used. So the Kinect remained largely the preserve of less intense titles like Dance Central.
Of course, Microsoft will undoubtedly have improved on the problems of the first Kinect when they release the new version with the Xbox One, and I look forward to seeing the games that come with it. Apple, too, are no slouches when it comes to ironing out problems and presenting the best possible version (usually) of a new technology, so an Apple console with Capri could be very cool indeed.
So what could Apple do with motion-sensor tech in their computers, smartphones and tablets? The opportunities are almost endless, especially considering Apple's penchant for shedding the old in favor of the new. We could be looking at a whole new range of contextually aware, motion-tracking interactions with our handheld devices and desktops.
Head and eye tracking
The most obvious use for motion sensors on a handheld device would be to track the motion of your head or eyes. Web navigation, music controls, emails and texting are just a few features currently being operated with this technology. Considering the amount of time we spend looking at the screens of these devices, a lot more could be achieved with better sensors to reduce our reliance on swiping and on-screen keyboards. Just imagine: SwiftKey, eyeball version!
A lot of smart features could also be achieved with eyeball tracking: if the sensor sees you have closed your eyes or looked away from your device for a while, it could pause your game, enter sleep or power-saving mode, or even send you a notification to get your attention back. You might never doze off while watching a film on your tablet again. There are also, of course, obvious advertising applications for eyeball tracking.
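The look-away logic above is simple enough to sketch. Here's a toy version in Python, assuming a hypothetical eye-tracking sensor that reports one "eyes on screen" boolean per camera frame (the function name and thresholds are made up for illustration):

```python
# Toy attention-aware playback: decide what to do based on how long the
# viewer has looked away, given per-frame eye-detection booleans.

def playback_action(eyes_visible_frames, fps=30, pause_after_s=2, sleep_after_s=60):
    """eyes_visible_frames: list of booleans, one per camera frame
    (True = eyes detected on screen). Returns "play", "pause" or "sleep"."""
    # Count consecutive frames at the end of the stream with no eyes detected
    away = 0
    for frame in reversed(eyes_visible_frames):
        if frame:
            break
        away += 1
    away_seconds = away / fps

    if away_seconds >= sleep_after_s:
        return "sleep"   # viewer gone for a minute: enter power-saving mode
    if away_seconds >= pause_after_s:
        return "pause"   # eyes closed or averted: pause the film
    return "play"
```

So one second of watching followed by three seconds of looking away (`[True] * 30 + [False] * 90` at 30 fps) would pause playback, while a full minute away would send the device to sleep.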
Better motion sensors that let you be further away from your device than current sensors allow might help your smartphone make the leap from a primarily handheld device to an all-seeing life companion, capable of sensing a range of different gestures from across the room and making hands-free really zing. Your phone could recognize when you wake up and start narrating the day's activities to you, or sense when you've lain down on the couch and launch a "what would you like to do" menu: take a nap, listen to music, watch TV, call a friend?
Motion and gestures
In combination with accelerometer, gyroscope and compass components, the motion sensor could be used to answer calls simply by moving the handset to your ear, and to end them the same way. How your device is moved or positioned (for example, whether it is lying on a table or in motion) could also trigger different modes (some of these features already exist on the Galaxy S4).
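A raise-to-ear answer could be as simple as fusing two signals: the proximity sensor being covered and the handset tilted roughly upright. This is only a guessed-at heuristic, not how any shipping phone actually does it; the sensor inputs here are hypothetical:

```python
# Hypothetical raise-to-ear heuristic: answer an incoming call when the
# proximity sensor is covered and the handset is held roughly vertically.

def should_answer(incoming_call, proximity_near, pitch_degrees):
    """incoming_call: a call is currently ringing.
    proximity_near: proximity sensor covered (something close to the earpiece).
    pitch_degrees: device tilt from horizontal (~90 = vertical, as when held
    to the ear). Returns True if the call should be auto-answered."""
    held_to_ear = proximity_near and 45 <= pitch_degrees <= 135
    return incoming_call and held_to_ear
```

Requiring both signals avoids answering a call just because the phone is face-down on a table (proximity covered, but flat) or being carried upright in a pocket with nothing ringing.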
If the phone senses it has been dropped, it could automatically run diagnostic tests to check for damage, for example. Air gestures might become the primary means for handling your device. Why even touch your phone when you can simply perform a gesture that answers a call and initiates the speakerphone? Samsung are playing with lots of these features already, and while some work better than others, combined with voice recognition, there is plenty of room to move forward.
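Drop detection is usually done by spotting free fall: while the phone is falling, its accelerometer reads close to 0 g instead of the usual 1 g of gravity. A minimal sketch, with made-up sample rate and thresholds:

```python
import math

# Sketch of free-fall detection: flag a drop when the total acceleration
# magnitude stays near zero for long enough to be a genuine fall.

def detect_drop(samples, sample_rate_hz=100, freefall_g=0.3, min_fall_s=0.1):
    """samples: chronological (ax, ay, az) accelerometer readings in g.
    Returns True if a sustained free-fall window is found."""
    needed = int(min_fall_s * sample_rate_hz)  # consecutive samples required
    run = 0
    for ax, ay, az in samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude < freefall_g:
            run += 1
            if run >= needed:
                return True    # long enough in free fall: trigger diagnostics
        else:
            run = 0            # back under gravity; reset the counter
    return False
```

A phone resting on a table reads a steady ~1 g and is never flagged, while a run of near-zero readings longer than a tenth of a second trips the detector.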
Improved motion sensors could also be used to combat stability issues when taking digital photos. If your device can better sense what's going on around it, it should be able to help you take better selfies: perhaps whenever the camera is turned to face you at arm's length, your device will know what you want it to do and launch the camera, with image stabilization if necessary. Gestures or voice activation could also make "timed" shots easier when you're not actually holding the device. There are also great applications for music videos, as you can see in the video below, which translates Kinect data into 3D visualizations.
Exercise and bio feedback
We already have smartwatches and other wearables that track bio-data. Well, what if your phone could do all this and more without the need for a separate device? Recall the recent promo video we shared from Samsung. The sensors available for smartphones are already sophisticated enough to keep track of your movements all day long, and there are apps available now to record and manage this data to help keep you in shape. What else can they do?
Sensors could also detect accidents and automatically call for help, while other sensors could provide alerts if you are starting to get ill by monitoring your temperature and baseline biometric data. If all this sounds a bit weird, remember that Sony was looking at including a "gamer sweat" sensor on the PS4 controller to monitor player emotions and stress levels by measuring skin conductivity (Nintendo also announced a Vitality Sensor for the Wii). Wouldn't it be nice if your smartphone could detect rising stress during a phone call and do something about it?
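One guess at how a "gamer sweat" style stress flag might work: establish a personal baseline from earlier skin-conductance readings, then flag any sample that spikes well above it. The function, units and thresholds below are all invented for illustration:

```python
# Hypothetical stress detector: compare the latest skin-conductance reading
# against a baseline averaged from the start of the session.

def stress_level(readings, baseline_window=10, threshold=1.5):
    """readings: chronological skin-conductance samples (arbitrary units).
    Returns "stressed" if the latest reading exceeds the baseline by the
    threshold factor, "calm" otherwise, "unknown" without enough history."""
    if len(readings) <= baseline_window:
        return "unknown"   # not enough samples to establish a baseline
    baseline = sum(readings[:baseline_window]) / baseline_window
    current = readings[-1]
    return "stressed" if current > baseline * threshold else "calm"
```

Keying the flag to a per-user baseline rather than an absolute value matters, because resting skin conductance varies a lot from person to person.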
Facial recognition and personalization
Your phone can already recognize your face and unlock whenever it sees you looking at it, and it can also learn your voice. Better motion sensors could use that contextual and user awareness to start certain features automatically. For example, if you go running, your phone could launch your favorite tunes, start recording biometric data, provide prompts to keep you moving a tiny bit faster than your last session, or all three at once.
There's a lot your phone could learn to do for you based on time and location data and facial recognition (when you're out with a partner on a Friday night, or with friends who like the same music as you, etc.), and that could become standard for smartphones in the future. Sensors could also monitor your response to content on your phone, recommending other internet cat videos or switching to a different television channel for you when 90210 comes on.
What I'm trying to say with all this is that there are so many cool things Apple, or Samsung for that matter, could do with even better motion sensors in their devices that it makes perfect sense for Apple to get a jump-start (or play catch-up with the Galaxy S4, depending on your perspective) with a proven motion-capture technology. This is especially true if it's coming as cheaply as PrimeSense reportedly is. The future of not holding onto your phone so often looks bright. We'll just be waving our arms and heads around a lot more.
What applications would you like to see for motion sensors on your smartphone?