on 20-Aug-2015 07:32
Aristotle (384 - 322 BC) is credited as the first person to classify our five senses: sight, smell, taste, touch, and hearing. Immanuel Kant, the famous 18th-century philosopher, said that our knowledge of the outside world depends on our modes of perception. Our highly developed organs - the eyes, ears, nose, tongue and skin - provide the sensing equipment necessary to send that information to the brain. When one of those sensors fails, as with the blind or deaf, the other four are often heightened beyond normal operation to make up for the missing information. Daniel Kish, for example, uses echolocation like a bat to see the imprint of sound waves as they bounce back. Pretty cool, eh?
Today, we're building gadgets that work in conjunction with - or completely take over - the tasks of the eyes, ears, nose, tongue and hands. Things that were always part of our body are being replaced with micro-chipped things that act like, attach to - or better yet - integrate with our body.
Sight: Of course there are security cameras to help us see our homes when we are away, and most of us have heard of Google Glass, but there are now eyeglasses being prototyped by BMW's Mini division that combine the wearable with the connected car. These glasses communicate with the car via WiFi and offer a heads-up display like no other. While you can still see the real world, the glasses overlay speed, navigation, backup cameras and more. You can see just how close you are to the curb from the wheel's point of view. You can also look at a street sign and have it come to life with overlays or additional info. While most of the data is just telemetry for now, engineers are looking to incorporate driving features within the view. This is where IoT gets interesting - where one device is used to complement another. Swiss engineers have also developed a camera based on the human retina; by understanding the biology of the real thing, they've built a more efficient camera.
Smell: Although there were earlier attempts, in the 1940s and '50s Hans Laube created a system called Smell-O-Vision, which emitted odors during a film so the audience could smell what was happening on screen. It was only used once. GE also developed a system in 1953 that it called Smell-O-Rama. Now you can get a smell app on your phone: ChatPerf is a thumb-drive-sized atomizer that plugs into your mobile device so it can be triggered to release specific odors on command. But those are scents out. Machines that can whiff stuff in have been around a while - think of your smoke, carbon-monoxide or radon detectors. Today we have wearable vapor sensors that can smell diabetes. Scientists have figured out how to use a sensor to detect melanoma by the odor those skin cells give off, which doctors can pick up to identify this form of skin cancer. And scientists in Israel, who have already developed a nanotechnology breath analyzer for kidney failure, are working on one that can distinguish the breath of a lung cancer patient versus a healthy exhale. Crazy!
Hearing: According to U.K. firm Wifore Consulting, hearable technology alone will be a $5 billion market by 2018 - roughly the size of the entire wearable market today. Ears can reveal things like oxygen levels, electrocardiograms, and body temperature. While sound drives the bulk of the technology in this space, ear buds could soon not only deliver sound but also capture some of your body's information. And they're small and discreet enough to wear everywhere rather than carrying a mobile device. Initial uses trend toward fitness: ear buds that play music but also give you feedback on your workout. There are also smart earrings that monitor heart rate and activity. I've always said that there will come a time when we all have IPv6 chips in our ear and we'll just tug the lobe to answer a call. Carol Burnett would be proud.
Touch: Want to give a robot the ability to feel? Done. Researchers have developed a flexible sensor able to detect temperature, pressure and humidity simultaneously - a big leap toward imitating the sensing features of human skin. While still in the early stages, future sensors could be embedded into the "electronic skin" of prosthetics, allowing amputees to sense environmental changes. Another is BioTac, a fingertip that can sense force, temperature, and vibration - in some cases better than a human finger. With laser 3D printing, some orthotics can be delivered in hours rather than months.
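To make the "electronic skin" idea concrete, here's a minimal sketch of what reading such a multimodal sensor might look like in software. Everything here is invented for illustration - the sensor class, the threshold values and the labels are assumptions, not the researchers' actual design; a real driver would read the hardware over something like I2C or SPI.

```python
# Hypothetical sketch: turning raw multimodal "electronic skin" readings
# (temperature, pressure, humidity) into a simple sensation label.
# All class names and threshold values are invented for illustration.

from dataclasses import dataclass


@dataclass
class SkinReading:
    temperature_c: float   # surface temperature in Celsius
    pressure_kpa: float    # contact pressure in kilopascals
    humidity_pct: float    # relative humidity at the surface


def classify_contact(r: SkinReading) -> str:
    """Very rough rule-of-thumb mapping from readings to a 'sensation'."""
    if r.pressure_kpa > 50:
        return "firm touch"
    if r.temperature_c > 40:
        return "heat warning"
    if r.humidity_pct > 80:
        return "wet surface"
    return "light contact"


print(classify_contact(SkinReading(36.5, 60.0, 40.0)))  # firm touch
print(classify_contact(SkinReading(45.0, 10.0, 40.0)))  # heat warning
```

The interesting part of the real research is the fusion - one flexible sensor delivering all three channels at once - but once the readings arrive, the software side can be this simple.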
Taste: Sweet, sour, salty and bitter used to be the domain of the tongue. Soon, electronic 'tongues' could be used to monitor the quality of bottled water. Using chemical sensors, researchers in Texas have demonstrated that an electronic tongue can 'taste' different solutions: the sensors respond to different combinations of the four artificial taste elements with unique combinations of red, green and blue, enabling the device to analyze several chemical components simultaneously. I've written about smart chopsticks that can detect oils containing unsanitary levels of contamination, a fork that monitors how many bites you take and a smart cup that counts the amount and calories you drink. This is the Internet of Food.
Wearables make technology personal, and our five senses are what help us navigate life and give us perspective. Who would have thought that an individual's perspective would someday be embedded within coded software?