Kinect in Spaaaaace!

The incredible space geniuses at the University of Surrey and Surrey Satellite Technology Limited (SSTL) are using Kinect tech to develop a twin-satellite in-orbit docking system called ‘STRaND-2’.

SSTL Project Lead Shaun Kenyon explained: “We were really impressed by what MIT had done flying an autonomous model helicopter that used Kinect and asked ourselves: Why has no-one used this in space? Once you can launch low cost nanosatellites that dock together, the possibilities are endless – like space building blocks.”

You can keep track of their progress via the STRaND Facebook page at www.facebook.com/nanosats, and even follow them on Twitter at @SurreyNanosats.


Kinect Cameras Watch for Autism

From the latest New Scientist (issue 2863) comes a pretty incredible article about a study at the University of Minnesota’s Institute of Child Development that uses a Kinect camera to track kids at play and watch carefully for signs of autism. It doesn’t act as a replacement for diagnosis, but rather as a sort of seismograph, indicating to teachers which children might benefit from attention from a specialist.

Getting Dirty Virtually and Literally with Kinect

Video

From the space brains at the UC Davis W.M. Keck Center for Active Visualization in the Earth Sciences:

“This video shows a sandbox equipped with a Kinect 3D camera and a projector that projects a real-time colored topographic map with contour lines onto the sand surface. The sandbox lets virtual water flow over the surface using a GPU-based simulation of the Saint-Venant set of shallow water equations.

We built this for an NSF-funded project on informal science education. These AR sandboxes will be set up as hands-on exhibits in science museums, such as the UC Davis Tahoe Environmental Research Center (TERC) or Lawrence Hall of Science.

Project home page: http://idav.ucdavis.edu/~okreylos/ResDev/SARndbox

The sandbox is based on the original idea shown in this video: http://www.youtube.com/watch?v=8p7YVqyudiE

The water flow simulation is based on the work of Kurganov and Petrova, ‘A second-order well-balanced positivity preserving central-upwind scheme for the Saint-Venant system.’”
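
If you want a feel for the core trick, here's a minimal sketch of the depth-to-topography step, written against a synthetic depth frame rather than a live Kinect feed (the real UC Davis software is a C++/Vrui application, and the GPU water simulation mentioned above is well beyond this snippet). It simply treats sand closer to the camera as higher elevation, paints it with a color ramp, and darkens pixels near contour levels; the resulting image is what you'd hand to the projector.

import numpy as np

def topo_colors(depth_mm, contour_step_mm=10.0):
    """Map a Kinect-style depth image (millimeters from the camera) to an
    RGB image: a blue-green-white elevation ramp with dark contour lines
    drawn every `contour_step_mm` of sand height."""
    elevation = depth_mm.max() - depth_mm        # closer to the camera = taller sand
    t = elevation / max(float(elevation.max()), 1e-6)

    # Simple three-stop color ramp: blue (low) -> green (middle) -> white (peaks).
    low  = np.array([0.1, 0.3, 0.8])
    mid  = np.array([0.2, 0.6, 0.2])
    high = np.array([1.0, 1.0, 1.0])
    t3 = t[..., None]                            # add a color axis for broadcasting
    rgb = np.where(t3 < 0.5,
                   low + (mid - low) * (t3 * 2.0),
                   mid + (high - mid) * ((t3 - 0.5) * 2.0))

    # Darken every pixel whose elevation sits near a contour level.
    rgb[(elevation % contour_step_mm) < 1.5] *= 0.3
    return rgb

# Synthetic "sand surface": a single smooth mound in a 480x640 depth frame.
y, x = np.mgrid[0:480, 0:640]
depth = 1000.0 - 80.0 * np.exp(-(((x - 320) / 150.0) ** 2 + ((y - 240) / 150.0) ** 2))
frame = topo_colors(depth)                       # this image would go to the projector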

Beam Me Up, Kinect!

Video

From the YouTube video description:

“The Human Media Lab at Queen’s University in Canada has developed a Star Trek-like human-size 3D videoconferencing pod that allows people in different locations to video conference as if they are standing in front of each other.

“Why Skype when you can talk to a life-size 3D holographic image of another person?” says Professor Roel Vertegaal, director of the Human Media Lab.

The technology is called TeleHuman and looks like something from the Star Trek Holodeck. Two people simply stand in front of their own life-size cylindrical pods and talk to 3D hologram-like images of each other. Cameras capture and track 3D video and convert it into the life-size surround image. Since the 3D video image is visible 360 degrees around the pod, the person can walk around it to see the other person’s side or back.

The researchers also created a 3D holographic anatomical browser called BodiPod, which allows students to examine medical imaging data using the cylindrical display.”
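
The release doesn't say how the pod decides which angle of the remote person to display, so treat the toy snippet below purely as an illustration of the walk-around idea, not the Human Media Lab's actual pipeline: assume a ring of capture cameras around the remote pod (the count here is invented), track where the local viewer is standing, and show the camera view that matches their angle.

import math

NUM_CAPTURE_CAMERAS = 6          # hypothetical ring of cameras around the remote pod

def camera_for_viewer(viewer_x, viewer_y):
    """Pick which ring camera's viewpoint to display for a viewer standing
    at (viewer_x, viewer_y), with the cylindrical pod at the origin."""
    azimuth = math.atan2(viewer_y, viewer_x) % (2 * math.pi)   # viewer's angle, 0..2*pi
    sector = 2 * math.pi / NUM_CAPTURE_CAMERAS                 # angle covered per camera
    return int(azimuth // sector)

# Standing in front of the pod shows the remote person's front; walking a quarter
# of the way around switches to a side view -- which is why you can see their
# side or back.
print(camera_for_viewer(2.0, 0.0))    # directly in front  -> camera 0
print(camera_for_viewer(0.0, 2.0))    # 90 degrees around  -> camera 1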

For more info and high-resolution images, visit the Human Media Lab’s website!

The Biggest Fruit Ninja You’ve Ever Seen

Sometimes, Microsoft and Apple work together more beautifully than most people would guess. Orlando’s Crunchy Logistics is getting some attention for creating “the largest interactive reproduction of an iPad in the world spanning 24 feet wide and 12 feet tall”.

This massive display was created as a demonstration of their Padzilla Two Interactive Case, which uses Kinect to offer gesture control of iOS running on an iPad. According to Crunchy Logistics’ webpage:

“The giant iPad display is constructed through the newest 3.5mm LED wall matrix display technology. 3.5mm is the thinnest LED separation currently possible for this type of LED matrix display which increases pixel density to possible “Retina” display quality at typical viewing distances. Padzilla now enables the user to interact on a large scale from a multitouch reproduction platform and is now multi-input gesture capable.”
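
For fun, here's a rough back-of-the-envelope check on those numbers (my own arithmetic, not Crunchy Logistics', and it ignores bezels and panel gaps): a 24-foot by 12-foot wall at a 3.5mm pitch comes out to roughly 2090 x 1045 pixels at about 7 pixels per inch, which stops looking pixelated once viewers stand a dozen or so meters back.

import math

PITCH_MM  = 3.5                                  # LED pixel pitch quoted above
width_mm  = 24 * 304.8                           # 24 feet in millimeters
height_mm = 12 * 304.8                           # 12 feet in millimeters

cols = round(width_mm / PITCH_MM)                # ~2090 pixels across
rows = round(height_mm / PITCH_MM)               # ~1045 pixels down
ppi  = 25.4 / PITCH_MM                           # ~7.3 pixels per inch

# Distance at which a 3.5 mm pixel subtends about one arcminute -- a common
# rule of thumb for when individual pixels stop being resolvable.
retina_distance_m = (PITCH_MM / 1000.0) / math.tan(math.radians(1.0 / 60.0))

print(f"{cols} x {rows} pixels, {ppi:.1f} ppi")
print(f"Pixels blur together beyond roughly {retina_distance_m:.0f} m "
      f"({retina_distance_m * 3.281:.0f} ft)")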

These magic engineers chose Fruit Ninja as the game to display on such a massive scale. Of course, this now means I want to see if I can defeat an elephant in Fruit Ninja.