Nintendo NX May Contain Kinect-Like Features

Rumours galore surround the Nintendo NX, the successor to the Wii U. Now another one has surfaced, thanks to a patent showing off what appear to be Kinect-like features.

Nintendo took the console market by storm when it released the Wii back in 2006. Featuring motion control through the Wii Remote, the console saw entire families playing bowling together in their sitting room and racing around tracks with Luigi and Mario against the grandparents. Since then, several consoles have used similar methods to achieve motion-capture gameplay; from light-tracking orbs to accelerometers built into controllers, consoles have experienced an influx of movement-related features. The Kinect took it to a whole new level, and it looks like a similar method could be used to take the NX that little bit further.

The Kinect used cameras to recognise not only who was playing the game but also their movement and distance from the sensor (something that proved difficult for other consoles). The methods listed in the patent for depth tracking include everything from a distance-measuring laser to using a person's thermal signature to determine how close to or far away from the camera they are.
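
The patent does not spell out the maths, but laser-based distance measurement generally works on a time-of-flight principle: fire a light pulse, time how long it takes to bounce back, and halve the distance light covers in that time. Below is a minimal Python sketch of that general principle only, with a made-up example timing; it is not taken from Nintendo's patent.

```python
# Illustrative time-of-flight ranging: derive distance from the round-trip
# time of a light pulse. General principle only, not Nintendo's method.

SPEED_OF_LIGHT_M_PER_S = 299_792_458

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Return the one-way distance in metres for a measured round-trip time."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2

# A pulse that returns after roughly 20 nanoseconds corresponds to about
# 3 metres, i.e. a player standing across a typical sitting room.
print(distance_from_round_trip(20e-9))  # ~3.0
```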

Combined with the depth-measuring technology, there also appears to be gesture recognition, something which could see you swirling your arms around like Iron Man as you power up and play your favourite games.

Are you excited for the NX? We are excited to see how it turns out and which of these rumours make it into Nintendo's latest console!

Intel Reveals New Tiny Long-Range RealSense Camera for Smartphones

Intel’s RealSense camera has found its way onto PCs, laptops, tablets and even drones. The company’s technology uses the power of gestures and 3D scanning to improve user interaction.

Smartphones have been a bit tricky to fit with Intel’s tech, but the company has finally managed it. Intel’s CEO, Brian Krzanich, revealed the latest addition at IDF in Shenzhen, emphasising that the new module is significantly smaller and slimmer than the previous version, has a lower thermal output, and is claimed to have a longer detection range as well.

Intel also took the opportunity to announce a partnership with Chinese online retail giant JD in an attempt to help improve its warehouse management. The company demonstrated how a tablet with an integrated RealSense depth camera can quickly measure the required box size for products of all shapes, and consequently sum up the space needed for shipment or storage.
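
The warehouse demo boils down to simple arithmetic once the depth camera has produced each item's dimensions: compute the volume of every measured box and add them up. A rough Python sketch of that bookkeeping is below; the class and field names are ours for illustration, not Intel's or JD's.

```python
# Rough sketch: total up the shipping/storage volume for a batch of items
# whose dimensions a depth camera has already measured. Names are illustrative.

from dataclasses import dataclass

@dataclass
class MeasuredBox:
    width_cm: float
    height_cm: float
    depth_cm: float

    @property
    def volume_litres(self) -> float:
        # 1 litre = 1000 cubic centimetres
        return self.width_cm * self.height_cm * self.depth_cm / 1000

def total_volume_litres(boxes: list) -> float:
    """Sum the volume of every measured box in the shipment."""
    return sum(box.volume_litres for box in boxes)

shipment = [MeasuredBox(30, 20, 15), MeasuredBox(50, 40, 40)]
print(total_volume_litres(shipment))  # 9.0 + 80.0 = 89.0 litres
```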

The new RealSense integration has not been given any detailed specs or an availability date just yet, but Intel is bound to release some information soon.

Thank you Engadget for providing us with this information

Ford Working With Intel on Bringing Face and Gesture Recognition Technology to Cars

Automaker Ford and chip maker Intel have apparently started looking into how people can interact with their gadgets even more. With smartwatches and virtual reality already on the way, the two manufacturers are looking into making their own innovative technology, but for automobiles.

“Project Mobii” is the name of Intel and Ford’s little project, standing for Mobile Interior Imaging. The project itself is not that “small” either, with a team of ethnographers, anthropologists and engineers working to achieve a “more personalized and seamless interaction between driver and vehicle.”

What all of this means is that Intel and Ford are experimenting with new ways people might interact with cars in the future. Cameras could be connected to allow owners to check the car remotely via an app, gestures might control a car's features such as the sunroof or windows, or the car could even be fitted with facial recognition software to identify the driver and grant certain permissions to family members.
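
Neither company has detailed how such a system would work, but as a rough illustration of the facial-recognition idea, a camera frame could be run through an off-the-shelf face detector and the result matched against a table of enrolled drivers and their permissions. The Python sketch below is entirely hypothetical, using OpenCV only as a stand-in detector; the permission table and helper names are ours, not Ford's or Intel's.

```python
# Hypothetical sketch of driver identification and feature permissions.
# OpenCV is used only as a stand-in face detector; nothing here reflects
# Ford's or Intel's actual implementation.

import cv2

# Illustrative permission table: which features each enrolled person may use.
PERMISSIONS = {
    "parent": {"ignition", "sunroof", "windows", "navigation"},
    "teenager": {"ignition", "windows"},  # e.g. a restricted profile
    "unknown": set(),
}

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def identify_driver(frame) -> str:
    """Return an enrolled driver label for the face in the frame, if any.

    A real system would compare face features against enrolled profiles;
    here we only detect whether a face is present and return a placeholder.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return "unknown"
    return "parent"  # placeholder for an actual recognition step

def may_use(driver: str, feature: str) -> bool:
    """Check whether the identified driver is allowed to use a feature."""
    return feature in PERMISSIONS.get(driver, set())
```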

Though the project is being worked on, this does not mean we will see sci-fi cars on the streets anytime soon. Intel and Ford have apparently made it clear that, for now, the project is purely for research and exploration purposes.

Thank you Engadget for providing us with this information
Image courtesy of Engadget

London-based Startup Invents ‘Wave your wallet at a computer’ Payment Method

Wondering how to make online payments more interactive and easier to use? A London-based startup has the answer for you. They have allegedly created a technology which can recognise objects, poses and gestures. It is said that you can merely wave your wallet at a computer and it will open up an online payment.

The company behind the technology is called Seeper, which has developed the Seemove application for many purposes, not just online payment. Other described uses include striking Iron Man-style poses to pretend you are Tony Stark. But let's set aside the childish things which can be done with Seeper and go into the more 'mature' and interesting things it can do.

[vimeo]https://vimeo.com/80474997[/vimeo]

The Seemove technology is said to be able to memorise and interpret any object in order to identify it. For example, Seemove can identify your smartphone and sync with it in order to share photos and videos between the computer and handset, which is quite useful if you are too lazy to sync it yourself via wireless or cable.

However, Evan Grant, the founder of Seeper, says the video is designed as a demonstration of what the technology can do and may or may not reflect how consumers ultimately use Seemove. Grant says that Seemove will be released as middleware for developers to use in creating applications based on gestures, poses and objects. He also adds that, besides waving your wallet to trigger an online payment, Seemove could also be used for interactive kids' toys, controlling home entertainment systems, gaming and even sign language.
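
Seeper has not published its developer API, so purely as a hypothetical illustration of what gesture-and-object middleware of this kind might look like to an application developer, here is a small Python sketch in which an app binds actions to recognised objects and gestures. None of the names below come from Seemove itself.

```python
# Purely hypothetical illustration of the kind of middleware described above:
# an application registers actions against recognised objects, poses or
# gestures. These names are invented for illustration, not Seeper's SDK.

from typing import Callable, Dict, Tuple

class GestureMiddleware:
    def __init__(self) -> None:
        self._bindings: Dict[Tuple[str, str], Callable[[], None]] = {}

    def bind(self, kind: str, name: str, action: Callable[[], None]) -> None:
        """Register an action for a recognised (kind, name) pair,
        where kind is 'object', 'pose' or 'gesture'."""
        self._bindings[(kind, name)] = action

    def on_recognised(self, kind: str, name: str) -> None:
        """Called by the recognition engine whenever something is identified."""
        action = self._bindings.get((kind, name))
        if action:
            action()

# Example: wave a wallet to open a payment page, or wave a hand to pause a film.
mw = GestureMiddleware()
mw.bind("object", "wallet", lambda: print("Opening online payment..."))
mw.bind("gesture", "wave", lambda: print("Pausing home entertainment system"))
mw.on_recognised("object", "wallet")
```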

Thank you Mashable for providing us with this information
Image and video courtesy of Mashable

Leap Motion-Enabled Technology Unveiled in Upcoming HP Laptops

HP has announced its next line of laptops and is trying to innovate by implementing Leap Motion technology, allowing users to control applications with hand gestures.

Leap Motion has long been envisioned as an integrated sensor embedded into computing systems rather than a peripheral gadget. It looks like HP has taken the initiative and will take the first step in embedding this technology. The HP Envy 17 Leap Motion Special Edition will be one of the laptops that can be controlled from afar with the help of hand gestures.

The embedding is possible thanks to the San Francisco startup's crew, who have worked tirelessly over the past few months to shrink the sensor so that it fits into a laptop chassis. The sensor is around 3.5mm high and is placed below the keyboard, to the right of an off-centre trackpad.

The HP Envy 17 Leap Motion Special Edition will include five bundled games (Boom Ball, Jungle Jumper, Dropchord, Disney Sugar Rush and the HP-exclusive Jack Lumber), and HP will also offer a quick link to the Airspace Store, the store that hosts Leap Motion-compatible applications. There are currently over 100 applications available in the store and the number is growing.
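
The applications hosted in the Airspace Store are built against Leap Motion's developer SDK. For a rough sense of what that looks like, here is a minimal Python sketch that polls the sensor for hand positions using the SDK's `Leap` module; the threshold and loop are our own illustration rather than anything from HP or Leap Motion.

```python
# Minimal sketch of reading hand data with the Leap Motion Python SDK.
# Leap.Controller, frame.hands and hand.palm_position come from the SDK;
# the "palm raised" threshold below is just an illustrative example.

import time
import Leap  # ships with the Leap Motion SDK

controller = Leap.Controller()

while True:
    frame = controller.frame()      # most recent tracking frame
    for hand in frame.hands:
        pos = hand.palm_position    # millimetres relative to the sensor
        # Example "gesture": raising the palm well above the sensor.
        if pos.y > 200:
            print("Palm raised at x=%.0f, y=%.0f, z=%.0f" % (pos.x, pos.y, pos.z))
    time.sleep(0.05)                # poll roughly 20 times per second
```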

Here is a preview of how the Leap Motion technology works:

[youtube]http://www.youtube.com/watch?v=2YCKMQigMDI[/youtube]

HP has priced the Envy 17 Leap Motion Special Edition laptop at $1050 (around £655), and it is set to hit the market on the 16th of October.

Thanks to Bit-Tech and Adevarul for providing us with the information.

Images and video courtesy of Adevarul.