Munching While Motoring – the Haptic Experience

Have you ever eaten trail mix while driving? No need to look into the bag to make selections: M&Ms, cashews, or raisins are easy pickings thanks to the sense of touch, or haptics, which is how fingers “see”. I extrapolated the trail mix experience to haptic stickers, which I put on different car controls. I wanted to see whether my driving experience improved when stickers were placed on A) the steering wheel’s cancel cruise control button, B) the drive button, and C) the heads-up display.

From the cabin of a 2017 Acura TLX

Here’s what I found:

  1. For the steering wheel buttons that adjust cruise control (left photo), my right thumb moved back and forth between the toggle that adjusts speed and the cancel button, which has a haptic sticker on its surface. If the haptic sticker were not there, I would either have to look down to find the button or use the brake pedal. Either of these options would require more effort or be more distracting. Because the steering wheel application was helpful, I have applied haptic stickers to other buttons and continue to rely on them.
  2. The orange haptic sticker on the drive button (middle photo) was helpful to a lesser degree. I still need to reach for the button to start driving and watch the movement of my finger. Even so, a distinct tactile target may help by providing feedback when the button has been touched and possibly by shortening the time that I spend visually guiding my reach.
  3. The haptic stickers on the heads-up display (right photo) took more getting used to but may also be advantageous, as with the drive button. Having a tactile target seems more natural, satisfying, and confirmatory than reaching for a flat screen without distinct tactile features. Flat screens take one’s eyes off the road. It is distracting to need to watch one’s finger while reaching, study the display, and make sure that the desired option has been selected. (Consumer Reports provides a nice review on this topic.)

Conclusion. I found it helpful to add tactile features to the controls in my car, especially on the steering wheel, and to a lesser degree on controls that require reaching. Doing this helped keep my eyes on the road and hands on the wheel. In other words, keep haptics close at hand. You would not want to eat your trail mix any other way!




Umwelt and My X-Ray Glasses

There was a time long ago when there were no smartphones or internet and virtual reality was imagination. In the 60s, when I was 13, reading Superman and Batman comic books was a favorite activity. The best part was the page at the back with ads for things like whoopee cushions, stink bombs, and x-ray glasses that could see through clothes! The makers of these glasses were onto something big!! I appreciate this much more now because of a great TED talk by neuroscientist David Eagleman that discusses the umwelt, or one’s experience of reality, which is determined by what we can sense and think. For example, on my morning walk, my dog constantly needs to stop to smell street posts. I used to find this annoying, but now I understand her umwelt, which allows her to read the daily dog news coded in scent.

Another thing I learned from the talk was that our brains are like computer processors that discern patterns from the inputs of our senses, which are like interchangeable computer peripherals. When one sense fails, our brains can adapt to discern patterns from other senses. This allows the blind to use hearing to “see” with sound. Technology can also compensate for blindness by integrating with the optic or auditory nerves so that electronic peripherals can send vision or sound to our brains. Technology augments our senses. One may wear glasses to see infrared, like snakes do, or ultraviolet light, like bees, or wear hearing aids to hear sound waves beyond our natural range. I would like to find a nasal device to enjoy the daily dog news with my pooch – what a bonding experience that will be!

Augmentation, miniaturization, and biologic integration – this seems to be where we are going. As a physician, I see new frontiers, which allowed me to understand why a certain tumor makes a “plop” sound as it moves between the chambers of the heart. As an inventor focused on automotive innovation, I see  ways our senses can connect drivers with their vehicle. I proposed putting buttons on the steering wheel that are recognized by the sense of touch so that drivers can control their vehicles without the need to look away from the road. The toolbox of all our senses will plug’n play our brains and devices together in the realm of robotic avatars. New superpowers are enabling us to do tasks distant from our physical location – like perform surgery or explore Mars. While this all seems fantastic, I need some new comic books to find out what happens next.

Additional comments and hyperlink reference notes:

  • X-ray glasses in the 1960s seems pretty lame  compared to the internet of thought that pervades today’s adolescent mind. How unfortunate!
  • The Argus® II Retinal Prosthesis System wirelessly transmits data from a video camera in a blind patient’s glasses to an eye implant that electrically stimulates remaining retinal cells. This enables perception of visual patterns.  A brief presentation can be found here.
  • Cochlear implants consist of a microphone, processor, transmitter and electrode array that transforms sound into electrical stimulation of the auditory nerve. This technology can help a deaf person perceive speech and other sounds.
  • I published a paper called “The Etiology of Atrial Myxoma Tumor Plop” in the Journal of the American College of Cardiology, which was based on frame by frame analysis of ultrasound images. I showed that when an atrial myxoma moves from the left atrium to the left ventricle, it obstructs the mitral valve and causes a high velocity blood jet that may correspond to a tumor plop sound.
  • Our hands represent the ultimate approach for machine control because of their fine motor and sensory abilities.  Because touch is how hands  “see”, adding textures, shapes, and temperature differences to interfaces will enhance our ability to manually control devices.
  • James Cameron reached the deepest depths of almost 7 miles with his Deepsea Challenger submersible. There are many advantages to exploring biologically perilous environments on Earth (and other planets) using robots that remotely integrate with our body and mind while we sit safely eating popcorn. Robotic avatars can vastly extend human senses and abilities in all fields of endeavor, especially medicine.
  • Photograph of digital eye is from iStock Getty Images by Brian A Jackson.  Dog photos are copyrighted by GeriCoh LLC.

Baby Medicine for the Senior Driver

Reading the tiny print on a baby medicine bottle is a challenge at 3 in the morning. With bleary eyes, I struggled to make out the dosage while my little girl cried. The task was difficult at the age of 30, and impossible at 60. Manufacturers do not always adapt their products for seniors. The subject is so topical that a recent Consumer Reports article rated different car models according to accessibility and safety for older drivers. This means making vehicles with more features like lane change warning and dashboard buttons or controls that are easy to see, feel, and operate.  One would expect great interest in selling to aging baby boomers because they represent a large segment of the population with money to spend.

There are many things that occur with getting older. These include wisdom, experience, and patience, which translate into safer driving. There are also challenges, such as seeing and hearing less well, slowing of reflexes, and reduction of coordination and mental processing.  These changes can be addressed by applying the concept of “universal design” which advocates simplicity, intuitiveness, ease of sensing, and low physical effort.

When these goals are applied to a car’s cabin, controls are easy to see, reach, feel, and manipulate. Using more than one sense to operate a control enhances its accessibility, such as recognizing a button by its distinctive appearance and feel. Consider a large round knob for adjusting the radio volume. Controls that are readily recognized by their appearance or feel reduce distraction, especially at night or when driving conditions cause sensory or information overload.


Alternatively, consider some challenging interfaces for drivers of any age and possible solutions (some relating to SofTrek innovations):

  1. Smooth flat screens that require precise finger positioning guided by sight for item selection.
    1. Add tactile features to the smart screen.
    2. Simplify display with larger but fewer items.
    3. Associate display items with buttons at the side of the screen.
    4. Associate display items with the tactile features of buttons at the side of the screen.
  2. Horizontal or vertical grouping of buttons that are typically smooth surfaced and difficult to distinguish from each other. A vertical line of buttons may be at the side of smart screens. A horizontal line of buttons may be under a radio display.
    1. Increase button sizing, spacing, and diversity.
    2. Add tactile features so that one button can be distinguished readily from the other by the way it feels when it is touched.
    3. Associate the appearance, positioning, and feel of a button with the format and appearance of menu items on a heads-up display.
  3. Small controls with small print or icons on steering wheels, the dashboard, and elsewhere in the cabin.
    1. Add tactile features to the buttons.
    2. Make buttons larger with larger print or images, if possible.
    3. Incorporate auditory or visual feedback when the driver touches or selects a control.
  4. Crowding many controls on or around the steering wheel and in the cabin.
    1. Reduce the number of controls in a cabin by increasing the number of functions a single control can perform.
    2. Use software to diversify the functionality of a control.
    3. Display on the windshield or heads-up display the various functions assigned to a control.
    4. Use a database to determine the functionality and appearance of menu options associated with a control.
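The software-defined approach in the list above (one physical control, many functions, with its on-screen menu drawn from a lookup) can be sketched in a few lines of Python. Everything here, from the mode names to the `CONTROL_DB` table, is a hypothetical illustration, not any vehicle’s actual API:

```python
# Hypothetical sketch: one physical knob whose function and on-screen menu
# label are determined by a software lookup table (a stand-in for a database).
# All mode names and labels below are invented for illustration.

CONTROL_DB = {
    "audio":   {"label": "Volume",      "action": lambda v: f"volume set to {v}"},
    "climate": {"label": "Temperature", "action": lambda v: f"temperature set to {v}"},
    "phone":   {"label": "Call volume", "action": lambda v: f"call volume set to {v}"},
}

class MultiFunctionKnob:
    def __init__(self, mode="audio"):
        self.mode = mode

    def menu_label(self):
        # What a heads-up display would show for the knob's current role.
        return CONTROL_DB[self.mode]["label"]

    def turn(self, value):
        # One physical control, many functions, chosen by software.
        return CONTROL_DB[self.mode]["action"](value)

knob = MultiFunctionKnob()
print(knob.menu_label())   # Volume
print(knob.turn(7))        # volume set to 7
knob.mode = "climate"
print(knob.turn(21))       # temperature set to 21
```

The point of the lookup table is that adding a new function never adds a new physical control; the driver learns one knob, and software supplies the rest.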

So the next time you find yourself bleary-eyed because of small print, consider the principles of  “universal design”. They are helping me shop for that special car with my needs in mind.



Reference: Center for Universal Design at North Carolina State University.

Braille at the Edge of Sight – The Driver’s Experience

Peripheral vision does not get much fanfare, though it is essential for survival. I learned this from a reality TV show on what to do while walking through the African bush country at night. It’s easy to take for granted what we see at the edge of sight because peripheral awareness is often semi-conscious; that is, until a snake falls from a tree or a car veers into one’s lane. Furthermore, peripheral vision continually interacts with other senses, such as touch, in all our navigations: when we walk and the ground meets our feet, or when we reach out to hold something, like a crack in a cliff. Understanding how senses work together can also be helpful in the design of a vehicle’s human machine interface (HMI) and can make it easier to find controls and reduce distracted driving.

Peripheral vision has characteristics that deserve special consideration. Although we are naturally more aware of our central vision, the different parts of our retina work in tandem and have complementary differences. Peripheral vision often seems to run in the background, less consciously, but draws in your central vision when it recognizes something worth seeing more fully. It depends on the outer parts of the retina and senses the world differently than the central retina, or fovea, which seems to have most of our attention. The table below summarizes the features of the two parts of the retina.

Peripheral Vision      | Central Vision
Less detail            | Detailed sight
Reduced color vision   | Optimal color vision
Lacks depth perception | Sees in 3D with two eyes


Dashboard controls, like the volume knob, may first be recognized peripherally before central vision watches one’s hand reach for and touch the control. This takes one’s eyes off the road and is distracting and possibly dangerous. An alternative is to keep one’s gaze forward and use peripheral vision to guide one’s reach and contact with a target. Certain qualities of the target make this easier:

1) peripheral visibility, owing to its size and illumination;

2) accessibility, because it is easy to reach or close to one’s hands and fingers;

3) tactile features, distinguished by its size, shape, protrusions, depressions, and temperature.

To illustrate these concepts, consider the following scenario. I was driving at night on a dark and busy street in LA and wanted to change the radio volume and station. I needed to keep my gaze forward (at label A, with the yellow outline of my sunglasses). Some controls (at lower right, labeled B) were easy to see in my peripheral vision because they were large, round, and illuminated. This also made it easy to reach and feel them without looking away from the road. In contrast, the radio station selection buttons (yellow dotted rectangle between the top B’s) lacked distinguishing tactile features, were small, and were hard to see. In this case, prolonged central gaze was needed to distinguish one button from another. As an analogy, my finger launched like a rocket from the steering wheel, relied on peripheral vision to navigate my hand through space, and made an accurate touchdown because of the distinctive tactile structure of the target landing pad.

The sense of touch is very important. Akin to using Braille because of blindness, even the sighted need to use their fingers to “see” the world because of low ambient light or because machine controls are not where one is looking. Consider the motto, “eyes on the road, hands on the wheel”. Controls with distinctive tactile or haptic features help guide selection of menu items on a computerized heads-up or windshield display so the eyes can stay forward as much as possible. This video demo shows how to accomplish this ideal and also employs menu integration with haptic controls.

Braille at the edge of sight is not such an abstract idea after all. It helps keep us safe while driving. With this in mind, you may want to look at the controls in the cabin of your vehicle and see whether their design and layout make sense to all your senses.


Source of the image of the eye: Hans-Werner Hunziker. Hans-Werner34.

Auto Vision of the Peripheral Penny

What do you do if you run out of gas while driving at night in African bush country? Do you curse because your cell phone has no signal? Do you wait until the AM? If you choose to leave your vehicle to find help (not recommended), your peripheral vision may be really useful because it can see better in the dark than your central vision and is good at detecting motion, like a snake lunging at you from an overhead branch. Awareness of vision at the edge of sight is a practiced skill of jugglers. In urban jungles, the car that changes into your lane unpredictably and threatens to crash into you triggers your peripheral vision first.

Most of us focus (pun intended) on the color, 3D, and detailed vision that is generated from the central part of our retina, called the fovea. Peripheral events often trigger us to bring the image into our central awareness. In many ways, central and peripheral vision complement each other. Vision can complement other senses too, like the sense of touch. This is needed for hand-eye coordination and can be evaluated during a neurological exam with finger-to-nose tests. This examines the integration of the different parts of the brain involved in coordination, vision, spatial awareness, and sensory and motor abilities.

The lack of coordination of movement is called dysmetria and can be tested with the nose-finger-nose exam. Here is how it is done. You put a finger on your nose and then, as quickly as you can, touch the doctor’s finger in front of you before returning your finger to your nose. This is done quickly and repeatedly. It depends on one’s central vision and on having a target your finger can feel, so you know that you have reached it. Think of your finger as a rocket blasting off from your nose, traveling through space, and landing on the lunar surface of the doctor’s finger before returning to your terrestrial nose.

Now consider what you would experience if the exam were modified. What if you did not have the sense of touch (alternatively, what if the doctor’s finger were a hologram)? In this case, you would have to compensate with vision to be sure that your finger reached its target and did not overshoot or undershoot. This would take a lot of concentration. The accuracy and speed of this exercise would reasonably be expected to be worse.

In another scenario, what if you did this exercise using your peripheral vision instead of your central vision? Peripheral vision is not as detailed as central vision and sacrifices depth perception. The finger-to-nose test would be harder still, and even harder if the sense of touch was also absent.

This information is useful for understanding how vision and touch enhance each other when you are driving and selecting controls on the dashboard or elsewhere in the cabin. To prove this, consider a game called “Touch the Penny”. Take a blank sheet of white paper and put a penny in its center. Trace the periphery of the penny so that its outline remains, and fill the circle with brown color. Place the sheet on a table to the side of your dominant hand, hold a pen between your thumb and second finger, and place your third finger on your nose. Now, without looking down and keeping your eyes straight ahead, use your peripheral vision to guide your hand to the drawing of the penny and mark the center of the penny with an “x”. After that, bring your finger back to your nose, and repeat the cycle ten times, moving back and forth as fast as you can. Now repeat the entire exercise, but this time stick an actual penny on the sheet. Again, peripheral vision guides the landing into the middle of the penny, but this time it is a target you can actually feel. Like the finger-to-nose test, speed and accuracy are better with a recognizable tactile target.

When it comes to driving, the ideal is “eyes on the road all the time”. It would be helpful if it were never necessary to divert one’s gaze from the road to operate cabin controls. This could happen with heads-up displays that project menus onto the windshield without significantly obstructing the view. Another way this can be achieved is by adding distinct tactile qualities to the controls in the cabin so that the driver need not look away from the road to recognize a control. If the control is viewed using peripheral vision, the sense of touch can improve selection accuracy and confidence.

There are times when buttons and controls in a car’s cabin are difficult to see. Perhaps it is night, or the controls are hidden because of novel positioning underneath the rim of the steering wheel. When vision is challenged, the benefits of Braille for the blind should not be unseen by the sighted. In sum: 1) many senses can be combined to improve our interactions with our machines, and 2) make sure your vehicle is well equipped when driving into the wilderness, urban or otherwise.

Source of the image of the eye: Hans-Werner Hunziker. Hans-Werner34.

A Street Car You Desire

There is something called “desire paths” that I learned about while watching an excellent TED talk by Tom Hulme. Desire paths are the short-cuts or “paths of least resistance” that one recognizes while interacting with a structured model. Sidewalk landscaping around buildings is a good example. Have you ever found that the walkways to buildings are too circuitous, taking you on unnecessary journeys through gardens and parking lots? They ignore the rule that a straight line is the shortest distance between two points. Impatient people like me may cut through the lawn and wear down the grass until a more direct dirt path emerges. If the architected field of dreams does not correctly anticipate what we need, users may not come.

Recognition of desire paths may improve the implementation of any technology. When the elevator was first developed, there was nervousness about the cable breaking and the risk of a free fall for all. Innovators like Elisha Otis, who introduced a safety brake in the 1850s, pioneered solutions. Now the pleasantries and culture of attendant-operated elevators are forgotten, and automation is taken for granted. The designers of the autonomous car believe that we will learn to accept and safely use their technology too.

So what are desire paths that may enable quicker implementation of the autonomous car? One way is to put smart cars on smart roads. This means switching on autonomy when the enabled vehicle drives on a road that can interact with it because of technology integrated in the roadway, signs, and lighting. Highway lanes could be designated for autonomous vehicles just as there are lanes for vehicles with more than one passenger.   When the vehicle leaves this special lane, it would revert to manual control. 

Another model for autonomous driving is vehicle platooning. Here, cars or trucks could be switched into autonomous mode when they join a string of similar vehicles. This sequence of vehicles resembles the attached cars of a train. The first vehicle in the platoon is manually driven. In turn, it chauffeurs the vehicles behind it. This model is being investigated by the Safe Road Trains for the Environment (SARTRE) project in Europe. Advantages include fuel efficiency, decreased wind drag, and autonomy for the chauffeured vehicles.

Perhaps someday, I will own a car with a single red brake button and no steering wheel or floor pedals. And I may find special desire paths for driving such a car. Until that day comes, there are other ways autonomous vehicles will become mainstream soon – at least that’s what I desire.

Formula One race car with light effect. The unbranded race car was designed and modeled by me.

References: The concept of “autopilot” lanes was described in a December 2013 article by Melba Kurman (Triple Helix Innovation) and Hod Lipson (Cornell University): Where Are the Autopilot Lanes for Driverless Cars? (Op-Ed)

Future Car from iStock photo. Formula One race car with light effect.

Horse-drawn street car: “Rapid transit in 1877”, the first horsecar run in Manchester, New Hampshire. Published 1908 by the Hugh C. Leighton Company, Portland, Maine. Image downloaded from Wikimedia Commons.

Braille for the Sighted


Many people have black and white thinking when it comes to vision. Think a blink; sight – eyes open, blind – eyes shut. But here’s the paradox: the brain, encased in the dark confines of the skull, uses many senses to “see” the world. With blindness, the occipital lobe, normally devoted to vision, can interpret other stimuli such as touch, sound, and smell, using these abilities to navigate the world, which includes reading Braille.

The sighted can also benefit from using touch to “see”. Here are some ways this can be incorporated in technology: 

  1. You are plugging a cord into the back of a computer that cannot be turned around and are challenged to find the desired empty port. Solution: put matching textures, ridges, or shapes on the end of the cord and beside or around the port to guide their connection.
  2. You are operating the TV cable remote control but need to look where you put your fingers to change the channel and volume. Solution: make it easier to identify buttons by their shapes or textures so you don’t need to look away from the TV. When the remote is picked up, its keypad configuration and tactile features are mirrored onto the TV display to guide button selection.
  3. Your smartphone starts to ring but is difficult to pull out of your pocket. Fortunately, you can recognize different buttons by the way they feel and choose whether to answer the call.
  4. You’re driving your car and want to operate the cruise control, change radio channels and volume, and make phone calls using the many buttons embedded in the steering wheel. These buttons are hard to distinguish, and you don’t want to look away from the road to select them. Solution: put buttons on the steering wheel that can be recognized by the sense of touch to reduce distraction. The menu items on a heads-up display can be associated with icons that match the positioning, shape, and feel of the buttons on the steering wheel to guide the selection process.

So here’s my takeaway: tech should stay in touch and put on a haptic face.

References and Comments: 

Michael Proulx has written an excellent paper called Blindness: remapping the brain and the restoration of vision. Sensory substitution technology enables sight without visual input. He describes the neuroplasticity of the brain and references papers that have found that the visual cortex can process other senses (sound, touch, and smell) to “see” without eyesight.

Massachusetts Institute of Technology published an article titled: Parts of brain can switch functions: In people born blind, brain regions that usually process vision can tackle language. The visual cortex is integral to reading in the sighted. Interestingly, it continues this role in the blind by processing Braille.

The paradox of the brain in its dark confines seeing the world is derived from All the Light We Cannot See by Anthony Doerr, which won the Pulitzer Prize for fiction and exquisitely describes how a girl uses touch and other senses to adapt to becoming blind.

“Haptic” relates to the sense of touch.






Blood Flow As a Model for Autonomous Transportation

Nature has a way of recreating biologic events in surprising ways, such as the coin-like stacking of red cells and the linear linking of autonomous cars moving through arterial highways. Red cells are round and similarly shaped and can connect to one another by protein links in a stack of discs, called “rouleaux”.

Autonomous vehicles might similarly connect in platoons to increase roadway capacity and transportation efficiency. Good-bye to the two-second safety spacing rule between non-autonomous cars! Instead, autonomous cars, linked bumper to bumper, will move as one, like stacked blood cells. When the light turns green, the line of connected cars will move through the intersection together. Just as red cells may collect in rouleaux or disconnect to move individually, an autonomous car could join or leave a platoon according to its destination.
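The join-and-leave behavior described above can be sketched in a few lines of Python. The class and method names here are hypothetical, invented for illustration; real platooning systems (such as SARTRE’s) are far more involved:

```python
# Minimal sketch of platoon membership: followers copy the leader's speed
# (moving as one, like stacked red cells), and a car can join or leave
# according to its destination. All names are hypothetical.

class Platoon:
    def __init__(self, leader_speed):
        self.leader_speed = leader_speed
        self.members = []

    def join(self, car):
        # Joining the platoon hands speed control to the leader.
        self.members.append(car)
        car["speed"] = self.leader_speed

    def leave(self, car):
        # Leaving returns the car to individual (manual) control.
        self.members.remove(car)

platoon = Platoon(leader_speed=65)
car = {"id": "A", "speed": 40}
platoon.join(car)
print(car["speed"])          # 65
platoon.leave(car)
print(len(platoon.members))  # 0
```

The analogy to rouleaux holds in the code: membership is reversible, and the stack’s behavior is set by the lead cell.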

The study of blood can lend other lessons to future transportation planning, such as laminar and turbulent flow, cellular diversity, rheology, and pathology. Understanding these factors may spur novel approaches to vehicle and roadway design and anticipate imperfection and disease. Driving through highway construction traffic, I learned of Ford’s plan to mass produce autonomous cars in five years. Sounds like a bloody good idea to me!
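For readers curious how “laminar vs. turbulent” is actually quantified, the standard tool is the Reynolds number. Here is a minimal sketch; the blood values used are rough illustrative figures, not clinical data:

```python
# Laminar vs. turbulent flow is conventionally judged by the Reynolds number,
# Re = rho * v * D / mu (inertial forces over viscous forces). For pipe-like
# flow, Re below roughly 2300 is typically laminar; well above that, turbulent.

def reynolds_number(density, velocity, diameter, viscosity):
    """Dimensionless ratio of inertial to viscous forces."""
    return density * velocity * diameter / viscosity

# Rough figures for blood in a medium artery (illustrative assumptions):
rho = 1060.0    # kg/m^3, blood density
v = 0.3         # m/s, mean flow velocity
d = 0.004       # m, vessel diameter (~4 mm)
mu = 0.0035     # Pa*s, blood viscosity

re = reynolds_number(rho, v, d, mu)
print(f"Re = {re:.0f}")   # well under 2300, so laminar flow is expected
```

The same ratio is why uniform vehicle size, speed, and direction would favor orderly, laminar-like traffic over turbulent stop-and-go.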

Reference: The above image is from Henry Gray’s Anatomy of the Human Body (1918) via Wikimedia Commons. Panel “a” shows red blood cells en face. The cell has a discoid or bi-concave form that maximizes its surface area, which may be more important for laminar flow than for diffusion of oxygen. Laminar flow is orderly flow in parallel layers, rather than the disorganized motion of particles moving in different directions and at different velocities seen in turbulent flow. If most cars on the road adopted the same size, shape, speed, and direction, more efficient laminar-like flow would be expected. Autonomous vehicles in rouleaux formation might take the turbulent steam out of road rage and other erratic driving styles. Panel “b” shows red cells stacked in rouleaux formation, which can be visualized on ultrasound or echo images of the heart and blood vessels as “smoke”.


Tech and the Common Touch

I recently learned about some techno-magic that seems irresistible. Radar micro-motion sensors now enable control of electronics with the snap, tap, or rub of a finger. In a car, instead of reaching for the volume knob, you can slide your thumb over the top of your first finger and turn up the radio. The science of waves extends what we can do and what we can see: from radar to echocardiography, from seeing the heart to gesture recognition.

This ability has many advantages but it is one more way that technology separates our bodies from the physical world. Humans have been around for 200,000 years and it is relatively recent that we have reached through space to control our world. From flint spears to missiles, our trajectory evolves with benefits and costs, which may include losing the acuity of our senses and changing the way we interact with our world for better or worse.

Industry may presume that we prefer a cabin full of sterile flat screens and consoles rather than the knobs, buttons, and levers of years past. At first glance, this pact with devilish convenience seems like a good idea. Yet dream cruises that display old cars highlight the wonderful multi-sensory experience that our parents’ parents enjoyed. This includes manual connections and the joy of interacting with controls of different shapes, the wooden paneling, the leather, and the fabrics. The tactile experience of interacting with our vehicles and electronics deserves restoration. This will keep us connected, not only with these tools, but with what keeps us human.

iStock photo



An Engineer at Heart

I told my teenagers, while driving home from school, that I had joined the Society of Automotive Engineers. In other words, Dad was not just a cardiologist but a CAR-diologist. After the predictable eye rolls and plugging in of ear buds, I expounded that hearts and carts share a few things in common, and by carts, I refer to the internal combustion engines of cars.


Both:

- require an electrical discharge, oxygen, and carbon-based fuel so that a chamber can pressurize.
- emit carbon dioxide as a byproduct of this mechanical activity.
- have valves and associated inflow and outflow lines.
- have pressure cycles that alternate between filling and emptying.
- have pressure priming before a much greater force is triggered.
- transmit energy to connected and remote components.
- come to a standstill when repetitive mechanical cycling stops.

True, there are differences. The internal combustion engine depends on explosions while a heart chamber depends on biochemical processes that contract its muscular wall.

Then I became concerned that my kids, immersed in their devices, might not appreciate all this as much as their Dad, the science nerd. Who would think that biological beings would create machines that, in some ways, would emulate their biology?

Cardiology and cars are not so unrelated. Both are in the midst of a technology revolution (stay tuned). The American College of Cardiology, the American Society of Echocardiography, and the Society of Automotive Engineers: I am glad to be part of it all!
