Chris Harrison at Carnegie Mellon University and Dan Morris and Desney Tan at Microsoft's research lab in Redmond, Washington, recently came up with their latest invention, Skinput, a skin-based interface that makes it possible for a person to use his or her palm as a touchscreen. Skinput can be used to play games, control various devices, make phone calls and surf the Internet.

The invention features a keyboard, menu and a number of other graphics that appear on the user's palm and forearm. The graphics are generated by a pico projector incorporated in an armband. When the user touches a certain point on his or her palm, an acoustic detector in the armband identifies the part that was activated and performs the respective action. The researchers explain that differences in bone density, size and mass, along with filtering effects from a person's soft tissues and joints, mean that different locations on the user's skin have different acoustic features. It is worth mentioning that the acoustic detector used in this invention can identify five skin locations with an accuracy of about 95.5 percent. Using wireless technology, the invention can convey the signals to a cell phone, iPod or computer.

The system was tested by 20 volunteers, who responded positively to the device and its ability to provide fast navigation. The researchers look forward to presenting their invention in April at the Computer-Human Interaction conference.
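To make the acoustic idea concrete, here is a minimal sketch of how tap locations could be told apart by their sound signatures. This is not the researchers' actual pipeline (which uses a custom bio-acoustic sensor array and a trained classifier); it is an invented illustration using coarse spectral-band energies and a nearest-centroid rule, with all names hypothetical.

```python
import numpy as np

def band_energies(signal, n_bands=8):
    """Summarize a tap recording as the energy in a few coarse
    frequency bands -- a crude stand-in for real acoustic features."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    bands = np.array_split(spectrum, n_bands)
    return np.array([b.sum() for b in bands])

class TapLocationClassifier:
    """Toy nearest-centroid classifier over tap feature vectors."""

    def __init__(self):
        self.centroids = {}  # location name -> mean feature vector

    def train(self, labeled_taps):
        """labeled_taps: iterable of (location_name, raw_signal) pairs."""
        grouped = {}
        for location, signal in labeled_taps:
            grouped.setdefault(location, []).append(band_energies(signal))
        for location, feats in grouped.items():
            self.centroids[location] = np.mean(feats, axis=0)

    def classify(self, signal):
        """Return the trained location whose centroid is closest."""
        feats = band_energies(signal)
        return min(self.centroids,
                   key=lambda loc: np.linalg.norm(feats - self.centroids[loc]))
```

Because bone and tissue filter each tap differently, taps at different spots concentrate energy in different bands, which is all a classifier like this needs to separate a handful of locations.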
Microsoft (NSDQ: MSFT) is working on a new flesh-control input technology called “skinput.” But it’s not what you’re (probably) thinking. While it might be possible to one day adapt this tech to more, shall we say, “erotic” applications, the first iteration of the skinput technology focuses on using the flesh as an input control for mobile devices. The implication here is that everything from smartphones to music players to computers could be controlled with a simple double-click on your forearm.

Skinput technology works by “listening” for the sounds made by your finger tapping on a particular part of your body. Since skin, joint and bone density are highly variable on any normal human being, those taps are associated with different acoustic profiles – tapping close to your wrist results in a slightly different “sound” than tapping closer to the elbow. The demo you see in the video below projects a control interface onto a forearm, giving the user a visual guide as to where to tap.

So far, Microsoft and researcher Chris Harrison, from Carnegie Mellon University, have been able to use their flesh-control technology to play a game of Tetris and to control an iPod. In the future, though, skinput tech might completely change the way you think about double-clicking your lady’s mouse. Sorry, couldn’t resist. Microsoft will unveil its skinput tech in April.
Think touchscreens are cool? What about touchskins? Skinscreens? However you decide to coin the term, Chris Harrison – former intern with Microsoft Research and “Skinput” developer – wants the process of navigating personal technology *literally* in the palm of your hand.

If you’re confused (or scared), here’s how “Skinput” works: with the help of an arm-mounted pico-projector and a Bluetooth connection, the palm and forearm of the user’s body become the navigation center of their phone, MP3 player, or other personal technology item. Keyboards/keypads are projected onto the user’s skin and, in turn, respond to touch/tap motions. While touch is a key component of “Skinput,” much of the accuracy lies in its ability to distinguish between specific, inaudible sounds generated by particular motions in the skin and bone of one’s arm.

In addition to the keyboard projection, “Skinput” also responds to various hand gestures, all of which can be programmed per desired function – tap...
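The “programmed per desired function” part can be pictured as a simple lookup from a recognized gesture to a user-chosen action. The sketch below is a hypothetical illustration, not the actual Skinput software; every gesture and action name is invented.

```python
# Hypothetical dispatch table: once the system recognizes a gesture,
# it looks up and runs whatever action the user assigned to it.
actions = {
    "palm_tap": lambda: "answer call",
    "wrist_tap": lambda: "next track",
    "forearm_double_tap": lambda: "open menu",
}

def handle_gesture(gesture):
    """Run the action bound to a recognized gesture, if any."""
    action = actions.get(gesture)
    return action() if action else "unrecognized gesture"
```

Because the table is plain data, rebinding a gesture to a different function is a one-line change, which is the flexibility the article describes.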