Using iSight as a Hand Gesture Input Device
Wednesday, July 12, 2006

Apple has included simple hardware features in its laptops that have found new and unexpected applications in the hands of users. Here are two enabling technologies that made news recently, along with an idea I'd like to see, inspired by the movie Minority Report and the Sony EyeToy.
 
The Sudden Motion Sensor
Apple began including a Sudden Motion Sensor in the 2005 PowerBooks as a feature to park the hard drive heads during a sudden shock, such as a fall. Apple wasn't the first company to ship an accelerometer-based head-parking feature, but it was one of the first to market it widely.
 
Last fall, Amit Singh of kernelthread.com wrote an introduction to the PowerBook's Sudden Motion Sensor and described some interesting and novel applications of the motion sensor as a human interface device.
 
One application of Singh's interface idea was SmackBook Pro, a software utility that watches the SMS for a sudden bump from the user and responds by switching virtual desktops (a rough sketch of the idea follows the list below). Other examples:
 
  1. MacSaber plays sound effects to match the laptop's speed and force as it is swung around in battle
  2. Bubblegym is a tilt-controlled game
  3. SeisMac offers a real-time graph of the SMS' three-axis acceleration data
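
For the curious, here is a rough idea of what a SmackBook-style utility has to do under the hood. The sketch below is not Apple-sanctioned code: it assumes the sensor shows up as the undocumented "SMCMotionSensor" IOKit service, that method index 5 returns a 40-byte record whose first three 16-bit values are X/Y/Z acceleration (constants reverse-engineered by the community, and different on older PowerBooks), and it uses an arbitrary threshold for what counts as a smack.

/*
 * smackwatch.c -- a minimal sketch of a SmackBook-style bump detector.
 *
 * Assumptions (not Apple-documented): the sensor is exposed as the
 * "SMCMotionSensor" IOKit service, and method index 5 returns a 40-byte
 * record whose first three 16-bit values are X/Y/Z acceleration. These
 * constants come from community reverse engineering and vary by model.
 *
 * Build: gcc smackwatch.c -framework IOKit -o smackwatch
 */
#include <IOKit/IOKitLib.h>
#include <mach/mach.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

#define SMS_SERVICE    "SMCMotionSensor"  /* assumed service name          */
#define SMS_SELECTOR   5                  /* assumed method index          */
#define SMS_RECORD     40                 /* assumed record size in bytes  */
#define BUMP_THRESHOLD 40                 /* arbitrary; tune to taste      */

typedef struct { int16_t x, y, z; char pad[SMS_RECORD - 6]; } sms_record;

/* Ask the sensor's user client for one acceleration sample. */
static int read_sms(io_connect_t port, sms_record *out)
{
    sms_record in;
    size_t out_size = sizeof *out;

    memset(&in, 0, sizeof in);
    memset(out, 0, sizeof *out);
    return IOConnectCallStructMethod(port, SMS_SELECTOR,
                                     &in, sizeof in,
                                     out, &out_size) == KERN_SUCCESS;
}

int main(void)
{
    io_service_t service = IOServiceGetMatchingService(
        kIOMasterPortDefault, IOServiceMatching(SMS_SERVICE));
    if (!service) {
        fprintf(stderr, "no %s service found\n", SMS_SERVICE);
        return 1;
    }

    io_connect_t port;
    if (IOServiceOpen(service, mach_task_self(), 0, &port) != KERN_SUCCESS) {
        fprintf(stderr, "could not open the motion sensor\n");
        return 1;
    }
    IOObjectRelease(service);

    sms_record prev, cur;
    read_sms(port, &prev);

    for (;;) {                       /* poll at roughly 50 Hz */
        usleep(20000);
        if (!read_sms(port, &cur))
            continue;
        /* a "smack" is a large jump between consecutive samples */
        int dx = cur.x - prev.x, dy = cur.y - prev.y;
        if (abs(dx) > BUMP_THRESHOLD || abs(dy) > BUMP_THRESHOLD)
            printf("smack! switch desktops here\n");
        prev = cur;
    }

    IOServiceClose(port);            /* not reached */
    return 0;
}

Polling and comparing consecutive samples like this should be enough to tell a deliberate whack apart from ordinary typing vibration, though a real utility would also debounce repeated triggers.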
 
The Illuminated Keyboard’s Ambient Light Sensor
Another hardware feature extended in an interesting direction is the illuminated keyboard on Apple's 15" and 17" laptops. It includes an ambient light sensor designed to adjust the keyboard backlight to match the brightness of the room.
 
SmackBook Pro was adapted to use these light sensors to control desktop switching, as an alternative to the SMS. The user would simply shade the light sensor area with their hand to trigger the action. Shortly afterward, a German student released another application for the keyboard light: iSpazz, a visualizer plugin for iTunes that pulsed the backlight to the beat of the current song.
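
Reading those sensors from software is itself an undocumented trick. Below is a minimal sketch of how the shading approach might work, assuming (as community utilities do) that the sensors are exposed through the "AppleLMUController" IOKit service and that method index 0 returns two scalar readings, one per sensor; the service name, the selector, and the "drop to a quarter of the baseline" rule are all assumptions rather than published API.

/*
 * shadewatch.c -- sketch: detect a hand shading the ambient light sensors.
 *
 * Assumption (undocumented): the sensors are exposed by the
 * "AppleLMUController" IOKit service, and method index 0 returns two
 * scalar values, one per sensor. Constants come from community code.
 *
 * Build: gcc shadewatch.c -framework IOKit -o shadewatch
 */
#include <IOKit/IOKitLib.h>
#include <mach/mach.h>
#include <stdint.h>
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    io_service_t service = IOServiceGetMatchingService(
        kIOMasterPortDefault, IOServiceMatching("AppleLMUController"));
    if (!service) {
        fprintf(stderr, "no ambient light sensor found\n");
        return 1;
    }

    io_connect_t port;
    if (IOServiceOpen(service, mach_task_self(), 0, &port) != KERN_SUCCESS)
        return 1;
    IOObjectRelease(service);

    uint64_t baseline = 0;

    for (;;) {
        uint64_t values[2] = {0, 0};
        uint32_t count = 2;

        if (IOConnectCallMethod(port, 0,            /* selector 0: read sensors */
                                NULL, 0, NULL, 0,   /* no input                 */
                                values, &count,     /* two scalar outputs       */
                                NULL, NULL) != KERN_SUCCESS)
            break;

        uint64_t level = (values[0] + values[1]) / 2;
        if (baseline == 0)
            baseline = level;

        /* a sudden drop well below the running average = a hand over the sensor */
        if (level < baseline / 4)
            printf("sensor shaded -- switch desktops here\n");
        else
            baseline = (baseline * 9 + level) / 10;  /* slow-moving average */

        usleep(100000);              /* poll about ten times per second */
    }

    IOServiceClose(port);
    return 0;
}

A real utility would debounce the trigger and cope with dark rooms, where the baseline is already near zero, but the basic heuristic is that simple.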
 
The iSight as a Human Interface Control
The latest Apple laptops and iMacs have another hardware feature that could be used as a human interface device: their built-in iSight camera. Rather than simply monitoring light levels, the iSight produces high-resolution video that software can analyze for movement, such as hand gestures. Imagine invoking Exposé with a quick wave of the hand.
 
The idea isn't new. In 2003, Sony released a camera unit for the PlayStation 2 called EyeToy that detects color and movement to involve players in a game. Players stand in the active area in front of the camera, and jump, kick, and punch to trigger actions in the game.
 
Games range from Groove, a dancing game that helps players burn off calories, to Operation Spy and other interactive titles that simulate moves from karate, boxing, or volleyball.
 
A Mac take on the EyeToy, called ToySight, was featured on Apple's website back in 2003. Why not move this from being a gimmick for games into a real user interface control?
 
A reader pointed out that Apple released a concept video called Future Shock in the late '80s, which demonstrated users interacting with computers using hand gestures; another reminded me of the Facetop project at the University of North Carolina at Chapel Hill. Apple has also patented ideas involving a hand scanner on notebooks and a video camera integrated into the pixels of a display.
 
Think of the hand gestures Tom Cruise used to move documents around the futuristic computer system depicted in the movie Minority Report. What's missing from today's Macs? Nothing: Apple now builds a video camera into the majority of the new Macs it sells, and Macs certainly have the horsepower to analyze motion, unlike the earliest video game consoles.
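
In principle, the core analysis is cheap: compare successive frames and look for a burst of change. Below is a minimal sketch of that step with the camera capture itself left out (on Mac OS X it would go through QuickTime's Sequence Grabber or a similar capture path); instead, the program reads a stream of raw 160x120 8-bit grayscale frames on standard input, as one might pipe from a separate capture tool, and the frame size and thresholds are arbitrary illustrative choices.

/*
 * wavewatch.c -- a sketch of the frame-differencing step behind a
 * "wave to invoke Expose" trigger. Camera capture is left out; the
 * program reads a stream of raw 160x120 8-bit grayscale frames on
 * stdin and flags any frame where a large fraction of pixels changed
 * since the previous one.
 *
 * Build: gcc wavewatch.c -o wavewatch
 */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define WIDTH  160                   /* a downsampled frame is plenty        */
#define HEIGHT 120
#define PIXEL_DELTA     24           /* per-pixel change that counts (0-255) */
#define MOVED_FRACTION  0.05         /* 5% of pixels changed = big motion    */

/* Fraction of pixels that changed noticeably between two frames. */
static double frame_difference(const unsigned char *prev, const unsigned char *cur)
{
    long changed = 0;
    for (long i = 0; i < (long)WIDTH * HEIGHT; i++)
        if (abs((int)cur[i] - (int)prev[i]) > PIXEL_DELTA)
            changed++;
    return (double)changed / ((double)WIDTH * HEIGHT);
}

int main(void)
{
    static unsigned char prev[WIDTH * HEIGHT], cur[WIDTH * HEIGHT];

    if (fread(prev, 1, sizeof prev, stdin) != sizeof prev)
        return 1;                    /* need at least one full frame */

    while (fread(cur, 1, sizeof cur, stdin) == sizeof cur) {
        if (frame_difference(prev, cur) > MOVED_FRACTION)
            printf("large motion detected -- trigger Expose here\n");
        memcpy(prev, cur, sizeof prev);
    }
    return 0;
}

Frame differencing alone only says that something moved; recognizing a specific gesture, like a left-to-right wave, would mean tracking where the changed pixels are over several frames.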
 
A common problem with the Sony EyeToy involves inadequate lighting in the play area. MacBook and iMac users are illuminated by their display, and will generally be sitting closer to and centered in front of the camera, making it easier to develop a standard set of gestures that can be reliably recognized.
 
It would be very cool to have iSight motion detection built in and easy to turn on, just as speech recognition and keyboard navigation are. Users could get even finer control by using a pointer, perhaps the Apple Remote, to more accurately point at and select windows or trigger behaviors.
 
Other iSight Applications
Beyond using the iSight as an alternative pointing device, there are already third-party tools that use it for barcode scanning, as a motion detector to trigger security features, and to capture time-lapse movies:
 
  1. Gawker (free) captures time-lapse movies
  2. Evological's EvoBarcode and EvoCam provide various camera utilities
  3. Delicious Monster's Delicious Library catalogs media titles by scanning their barcodes
  4. Daydreamer is a web cam viewer that also does time lapse
  5. SecuritySpy offers time-lapse capture and various surveillance features
 
Apple has already delivered Photo Booth and QuickTime Broadcaster in Mac OS X; it could easily build in support for other basic video functions, such as video capture with Quartz Composer filters, motion detection, and video gesture input, and really add value to Macs with a built-in camera.
 
Perhaps Apple could even resurrect the dotcom-era CueCat non-craze... without having to mail out thousands of crazy cat-shaped devices!
 
I'd really like to hear from readers. What do you think? Leave a comment or email me with your ideas.
 
 