Kinect for Windows SDK 1.5 is Now Available!

This just in from the incredible whiz kids over on the Kinect for Windows team: they've just released version 1.5 of the Kinect for Windows runtime and SDK.

They’ve added more capabilities to help you brilliant developers make cool things!

  • Face Tracking with a real-time 3D mesh of facial features, tracking head position, the location of eyebrows, etc.
  • Seated skeletal tracking!
  • Improved performance and data quality enhancements to permit RGB and depth data to be mapped together (e.g. “green screen”)
  • All new samples!
  • Many more (for full details read the Team Blog post here)

They’ve also made Kinect for Windows hardware available in Hong Kong, South Korea, Singapore, and Taiwan, with 15 more countries coming next month! For the full list of all 31 countries where Kinect for Windows will be available, check out the Kinect for Windows blog!

Kinect Tech Helps Microsoft Research See with Sound

We’ve got a ton of space geniuses here at Microsoft. Our team isn’t the only one working to make Kinect smarter, faster, and stronger. Microsoft Research has some pretty incredible whiz kids who have recently used Kinect to help innovate a new gesture-based motion “controller” that doesn’t actually need any sort of controller, not even a Kinect, to work!

Project SoundWave utilizes the existing speakers and microphone built into a PC or notebook to detect your movement. The speakers emit inaudible tones in the 18–22 kHz range (your cats and dogs are safe). SoundWave then uses the same system’s microphone to pick up these tones as they are bounced back by your flailing hands and arms. The tones are passed through a detection algorithm, and any frequency shifts detected are processed to figure out what in the world you’re trying to do!
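To get a feel for the physics behind this, here's a quick back-of-the-envelope sketch (my own illustration, not the SoundWave implementation): a hand moving toward the microphone shifts the reflected tone's frequency by roughly twice the hand's speed divided by the speed of sound, times the tone frequency.

```python
# Illustrative sketch (not SoundWave's actual code): estimating the Doppler
# shift a moving hand produces on a reflected ultrasonic tone.
# For a reflector moving at speed v toward the source/microphone, the echo's
# frequency shift is approximately delta_f = 2 * v * f / c.

SPEED_OF_SOUND = 343.0  # m/s, in air at room temperature

def doppler_shift(tone_hz: float, hand_speed_ms: float) -> float:
    """Approximate frequency shift (Hz) of a tone reflected off a hand
    moving toward the microphone at hand_speed_ms (m/s)."""
    return 2.0 * hand_speed_ms * tone_hz / SPEED_OF_SOUND

# A 20 kHz pilot tone and a hand gesture at roughly 0.5 m/s:
shift = doppler_shift(20_000, 0.5)
print(f"{shift:.1f} Hz")  # about 58 Hz -- a shift an FFT can easily resolve
```

Even a slow gesture produces a shift of tens of hertz, which is why a plain laptop microphone and some frequency analysis are enough to detect motion.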

Want to see it in action? Watch the video below!

Video

Kinect Cameras Watch for Autism

From the latest New Scientist (issue 2863) comes a pretty incredible article about the University of Minnesota Institute of Child Development’s study, which uses a Kinect camera to track kids at play and carefully watch for signs of autism. It doesn’t act as a replacement for diagnosis, but as a sort of seismograph, indicating to teachers which children might benefit from attention from a specialist.

Getting Dirty Virtually and Literally with Kinect

Video

From the space brains at the UC Davis W.M. Keck Center for Active Visualization in the Earth Sciences:

“This video shows a sandbox equipped with a Kinect 3D camera and a projector that projects a real-time colored topographic map with contour lines onto the sand surface. The sandbox lets virtual water flow over the surface using a GPU-based simulation of the Saint-Venant set of shallow water equations.

We built this for an NSF-funded project on informal science education. These AR sandboxes will be set up as hands-on exhibits in science museums, such as the UC Davis Tahoe Environmental Research Center (TERC) or Lawrence Hall of Science.

Project home page: http://idav.ucdavis.edu/~okreylos/ResDev/SARndbox

The sandbox is based on the original idea shown in this video: http://www.youtube.com/watch?v=8p7YVqyudiE

The water flow simulation is based on the work of Kurganov and Petrova, ‘A second-order well-balanced positivity preserving central-upwind scheme for the Saint-Venant system.’”
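For the curious, the one-dimensional Saint-Venant (shallow water) system referenced above can be written in the following standard form (a sketch for orientation only, not the project's actual formulation; here h is water depth, u is flow velocity, g is gravitational acceleration, and B is the terrain elevation):

```latex
\partial_t h + \partial_x (h u) = 0
\qquad
\partial_t (h u) + \partial_x\!\left(h u^2 + \tfrac{1}{2} g h^2\right)
  = -g\, h\, \partial_x B
```

The Kurganov–Petrova scheme is called "well-balanced" because it exactly preserves steady states such as water at rest over uneven terrain, and "positivity preserving" because the computed depth h never goes negative, both of which matter when water flows over a constantly reshaped sand surface.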

Beam Me Up Kinect!

Video

From the YouTube video description:

“The Human Media Lab at Queen’s University in Canada has developed a Star Trek-like human-size 3D videoconferencing pod that allows people in different locations to video conference as if they are standing in front of each other.

“Why Skype when you can talk to a life-size 3D holographic image of another person?” says Professor Roel Vertegaal, director of the Human Media Lab.

The technology is called TeleHuman and looks like something from the Star Trek Holodeck. Two people simply stand in front of their own life-size cylindrical pods and talk to 3D hologram-like images of each other. Cameras capture and track 3D video and convert it into the life-size surround image. Since the 3D video image is visible 360 degrees around the pod, the person can walk around it to see the other person’s side or back.

The researchers also created a 3D holographic anatomical browser called BodiPod, which allows students to examine medical imaging data using the cylindrical display.”

For more info and high-resolution images, visit the Human Media Lab’s website!