I’m at the 2009 Printed Electronics conference in San Jose today and tomorrow. Below are highlights from a talk about wearable sensors by Professor Joe Paradiso of MIT.
(Note: Paradiso gave the disclaimer that his projects are less about printed electronics and more about the cool things that can be done once electronics become more practical to print. The examples below are all built on rigid circuit boards.)
Sensor Network as Skin: It’s a bit clunky in this incarnation, but the idea is that a collection of multisensor nodes senses the environment and the nodes communicate with one another. If this can be shrunk down, it might serve as a kind of artificial skin for robots.
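To make the idea concrete, here is a minimal sketch of what one such "skin" node might do: sample a few sensors and pass the readings to its neighbors. The node structure, sensor names, and message format are my own assumptions for illustration, not details from the talk.

```python
import random


class SkinNode:
    """Hypothetical multisensor node that shares readings with neighbors."""

    def __init__(self, node_id, neighbors=None):
        self.node_id = node_id
        self.neighbors = neighbors or []  # adjacent nodes in the mesh

    def sample(self):
        # Stand-ins for real sensor reads (light, temperature, vibration).
        return {
            "light": random.random(),
            "temperature": 20 + random.random() * 5,
            "vibration": random.random(),
        }

    def broadcast(self):
        reading = self.sample()
        for neighbor in self.neighbors:
            neighbor.receive(self.node_id, reading)

    def receive(self, sender_id, reading):
        # A real node might aggregate or relay; here we just log it.
        print(f"node {self.node_id} got {reading} from node {sender_id}")


a, b = SkinNode(0), SkinNode(1)
a.neighbors, b.neighbors = [b], [a]
a.broadcast()
```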
SportSemble: This project puts sensors on Boston Red Sox players to track subtle movements as well as fast, dramatic ones. The technical challenge is to monitor such a wide range of movement well; most sensors don’t offer both the range and the precision needed at the two ends of the movement spectrum. So the researchers kludged together two types of sensors. They hope the sensor pack can offer some insight into the way that athletes’ movements and body positions change over the course of a game.
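A rough sketch of the dual-sensor idea: read a sensitive, low-range accelerometer for subtle motion and fall back to a coarse, high-range one when the fine sensor saturates. The thresholds and function names below are illustrative assumptions, not the team’s actual design.

```python
LOW_RANGE_LIMIT_G = 2.0  # assume the fine sensor saturates beyond +/-2 g


def fused_acceleration(fine_g, coarse_g):
    """Pick whichever reading is trustworthy at this instant."""
    if abs(fine_g) < LOW_RANGE_LIMIT_G * 0.95:
        return fine_g    # subtle movement: use the precise, low-range sensor
    return coarse_g      # fast swing or throw: use the wide-range sensor


print(fused_acceleration(0.3, 0.4))    # gentle motion -> 0.3
print(fused_acceleration(1.99, 35.0))  # fine sensor saturated -> 35.0
```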
Spinner: What if software could automatically edit video to fit a narrative structure? It would be useful for lifecasting, for sure. The project, called Spinner, uses video from cameras installed at the Media Lab along with data from people who wear smart badges that keep track of their activity and location. Software picks out certain characteristics in the video and strings the clips together. For instance, you could instruct it to assemble clips that show the Media Lab with bright light and low activity, then clips of Sue at the Media Lab during low activity, then during high activity, and finally clips of the Media Lab in darkness with low activity. As a bonus, the software matches a soundtrack to the video.
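Here is a toy version of that clip-selection step: each clip carries annotations (location, light, activity, person), and the "narrative" is just an ordered list of constraints. The data model is an assumption for illustration; the talk did not describe Spinner’s internals.

```python
clips = [
    {"id": 1, "location": "Media Lab", "light": "bright", "activity": "low", "person": None},
    {"id": 2, "location": "Media Lab", "light": "bright", "activity": "low", "person": "Sue"},
    {"id": 3, "location": "Media Lab", "light": "bright", "activity": "high", "person": "Sue"},
    {"id": 4, "location": "Media Lab", "light": "dark", "activity": "low", "person": None},
]

# The desired narrative, expressed as constraints applied in order.
narrative = [
    {"location": "Media Lab", "light": "bright", "activity": "low"},
    {"person": "Sue", "activity": "low"},
    {"person": "Sue", "activity": "high"},
    {"location": "Media Lab", "light": "dark", "activity": "low"},
]


def matches(clip, constraints):
    return all(clip.get(key) == value for key, value in constraints.items())


# String together the first clip that satisfies each beat of the narrative.
sequence = [next(c for c in clips if matches(c, beat)) for beat in narrative]
print([c["id"] for c in sequence])  # -> [1, 2, 3, 4]
```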