Kinect in Spaaaaace!

The incredible space geniuses at the University of Surrey and Surrey Satellite Technology Limited (SSTL) are using Kinect tech to develop a twin-satellite in-orbit docking system called ‘STRaND-2’.

SSTL Project Lead Shaun Kenyon explained: “We were really impressed by what MIT had done flying an autonomous model helicopter that used Kinect and asked ourselves: Why has no-one used this in space? Once you can launch low cost nanosatellites that dock together, the possibilities are endless – like space building blocks.”

You can keep track of their progress via the STRaND Facebook Page at www.facebook.com/nanosats and even follow them on Twitter at @SurreyNanosats.

Kudo and Dean Talk Kinect

Today on VentureBeat, game-industry super news guy Dean Takahashi and superstar Kinect “Cheerleader” Kudo Tsunoda chatted all about Kinect and content.

They discussed game developers’ gradual adoption of Kinect features, as everyone learned which parts of the Kinect could work for existing games and how it could enhance the experience for players.

“In lots of ways you see Kinect, especially in the core areas, in some ways being put into franchises that already exist. I think about a game like Mass Effect 3, and I don’t think necessarily the motion technology of Kinect in any way really dictates what goes into the game and what doesn’t as much as that game is perfectly suited …” Kudo explained.

“Fable: The Journey, coming out this holiday, will be using the motion technology, and it has really good fidelity in the motion technology. It is improving over time to drive a more core gaming experience. I think we’re proving out things in Kinect all the time — building new things. Allowing creative people to use Kinect to bring a different type of functionality in a way that makes sense for their franchises.”

New technologies can be tricky to master, but here we are a year and a half after the launch of Kinect and many developers are starting to get the hang of it!

“People understand better how to develop on any kind of technology. You’re going to get better performance, better experiences, new inventions over time,” Kudo observes. “I think that’s why you see Kinect branching off into all different genres of games now. It’s because developers are able to do more with the technology as they’ve become more experienced in working with it.”

Want to read more from this conversation? Head on over to VentureBeat to read the whole thing!

Which game franchises do you think should try to add Kinect features? Offer your opinions in the comments!

Kinect for Windows SDK 1.5 is Now Available!

This just in from the incredible whiz-kids over on the Kinect for Windows team — They’ve just released version 1.5 of the Kinect for Windows runtime and SDK.

They’ve added more capabilities to help you brilliant developers make cool things!

  • Face tracking with a real-time 3D mesh of facial features, tracking head position, the location of eyebrows, and more
  • Seated skeletal tracking!
  • Improved performance and data quality enhancements to permit RGB and depth data to be mapped together (e.g. “green screen”)
  • All new samples!
  • Many more (for full details read the Team Blog post here)
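The “green screen” item above is just depth-based compositing: once the SDK has mapped the depth frame into color space, you keep the color pixels whose depth falls in a chosen band and drop the rest. Here’s a minimal sketch of that idea using NumPy rather than the actual Kinect for Windows API — the frame shapes, depth band, and alignment are all assumptions for illustration:

```python
import numpy as np

def green_screen(color, depth, near_mm=500, far_mm=1500, background=None):
    """Composite foreground pixels (within a depth band) over a background.

    color: (H, W, 3) uint8 RGB frame.
    depth: (H, W) uint16 depth in millimeters, assumed already registered
           to color space (the SDK's depth-to-color mapping does this step).
    """
    if background is None:
        background = np.zeros_like(color)  # plain black backdrop
    # Foreground mask: pixels whose depth falls inside the chosen band.
    mask = (depth >= near_mm) & (depth <= far_mm)
    out = background.copy()
    out[mask] = color[mask]
    return out
```

With a real sensor you would feed in live frames and swap `background` for any image; anything nearer than half a meter or farther than a meter and a half simply disappears.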

They’ve also made Kinect for Windows hardware available in Hong Kong, South Korea, Singapore, and Taiwan — with 15 more countries starting next month! To see all 31 countries where Kinect for Windows will be available, check out the Kinect for Windows blog!


Kinect Cameras Watch for Autism

From the latest New Scientist (issue 2863) comes a pretty incredible article about a study at the University of Minnesota’s Institute of Child Development that uses a Kinect camera to track kids at play and watch carefully for signs of autism. It doesn’t act as a replacement for diagnosis, but as a sort of seismograph, indicating to teachers which children might benefit from attention from a specialist.

Getting Dirty Virtually and Literally with Kinect

Video

From the big brains at the UC Davis W.M. Keck Center for Active Visualization in the Earth Sciences:

“This video shows a sandbox equipped with a Kinect 3D camera and a projector that projects a real-time colored topographic map with contour lines onto the sand surface. The sandbox lets virtual water flow over the surface using a GPU-based simulation of the Saint-Venant set of shallow water equations.

We built this for an NSF-funded project on informal science education. These AR sandboxes will be set up as hands-on exhibits in science museums, such as the UC Davis Tahoe Environmental Research Center (TERC) or Lawrence Hall of Science.

Project home page: http://idav.ucdavis.edu/~okreylos/ResDev/SARndbox

The sandbox is based on the original idea shown in this video: http://www.youtube.com/watch?v=8p7YVqyudiE

The water flow simulation is based on the work of Kurganov and Petrova, “A second-order well-balanced positivity preserving central-upwind scheme for the Saint-Venant system.”
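For the curious, the Saint-Venant (shallow water) system the sandbox simulates can be written as follows, with h the water depth, (u, v) the depth-averaged velocity, g gravity, and B the bed elevation — here, the sand surface itself:

```latex
\begin{aligned}
h_t + (hu)_x + (hv)_y &= 0,\\
(hu)_t + \bigl(hu^2 + \tfrac{1}{2}gh^2\bigr)_x + (huv)_y &= -g\,h\,B_x,\\
(hv)_t + (huv)_x + \bigl(hv^2 + \tfrac{1}{2}gh^2\bigr)_y &= -g\,h\,B_y.
\end{aligned}
```

The Kinect supplies B in real time as you sculpt the sand, and the GPU solver updates h, u, and v to make the projected water respond.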