Haptic Assist Technology is a HAT That’s Hot!

Thinking about a new hat? How about Haptic Assist Technology! It’s the new go-to tech in town, though no stranger to the driving experience. Older cars had a diverse tactile landscape of levers, knobs, and buttons. This landscape consisted of “anticipatory haptics”, recognized with the sense of touch, that reduce the need to look away from the road while driving. One study referenced in a recent New York Times article found that driving a manual transmission improved attention and safety in teenagers with ADHD, indicating a benefit that comes from more engagement with driving, not less.

Fast forward to today, and many vehicles lack traditional haptic controls. Instead, untextured buttons and smart screens dominate. The problem is that smart screens draw the driver’s gaze from the road and require studying the display and making precise finger selections. In all their forms and sizes, these screens are seductive and addictive eye candy that distract and endanger the driver.

Driver assistance technologies, such as lane-departure warning and blind-spot detection systems, can improve safety, as can driver education. Haptic feedback enhances these systems, for example by applying resistance in the steering wheel during a dangerous turn. A University of South Florida study found that one way to enhance blind-spot detection was to send a vibration to the driver’s hands during an unsafe lane change. Many vehicle brands depend solely on an audio or visual alert for this purpose, which may go unnoticed because of traffic complexity, road noise, and other distractions. Multi-sensory feedback that includes haptics may be more effective.

SofTrek has previously presented another type of Haptic Assist Technology that, in conjunction with other systems, may mitigate the danger of online distraction by keeping the driver’s eyes on the road and hands on a steering wheel equipped with haptic controls.

Consider this scenario:


Example of how menu items can correspond to the feel and position of buttons on a steering wheel.

You are driving to work in the middle highway lane on a sunny day. Your car senses the vehicles in every direction by sound waves, pattern recognition, and vehicle-to-vehicle and vehicle-to-road communication, and keeps you at a safe distance from other objects on the road. You are looking at the windshield and a superimposed projection of a weather website. The format of the website includes menu items that match the position and feel of the steering wheel buttons. You can read a news story projected onto the windshield and make selections on the web page because the steering wheel buttons have tactile features that match the on-screen icons. When you want to select an item, such as the weekend forecast, you press the steering wheel button that corresponds to the position and feel of the menu item. This keeps your hands on the steering wheel and your eyes forward on the superimposed view of the road. Like a roadside billboard, the projection may sit off to the side or be partly transparent, so that you can still see the road.

Other driver assistance technologies kick in and abruptly hide the projected website if driving conditions become complex or an unexpected traffic event occurs. For example, a car or an object cutting into your lane would trigger automated collision warning and autonomous emergency braking. When safety prevails, the driver can multitask, using haptic steering wheel buttons and projected menus to make phone calls, manage infotainment, navigate the internet, and operate the vehicle controls.
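To make the scenario concrete, here is a minimal sketch in TypeScript of how such a pairing might be modeled. All names (HapticButton, ProjectedMenu, and so on) are hypothetical illustrations, not SofTrek’s actual implementation; the sketch only shows the one-to-one mapping between tactile buttons and projected menu items, and the safety interrupt that hides the projection.

```typescript
// Hypothetical sketch: pairing tactile steering wheel buttons with a
// projected menu. Each physical button has a position and a distinct
// tactile shape; each projected menu item carries a matching icon, so the
// driver can select by feel without looking down.

type TactileShape = "ridge" | "dome" | "ring" | "bar";

interface HapticButton {
  id: string;
  position: "upper-left" | "upper-right" | "lower-left" | "lower-right";
  shape: TactileShape;
}

interface MenuItem {
  label: string;
  buttonId: string; // the projected icon mirrors this button's position and shape
  action: () => void;
}

class ProjectedMenu {
  private visible = true;

  constructor(private buttons: HapticButton[], private items: MenuItem[]) {}

  // Pressing a button selects the menu item whose icon matches it.
  press(buttonId: string): void {
    if (!this.visible) return; // projection hidden: ignore menu selection
    if (!this.buttons.some((b) => b.id === buttonId)) return; // unknown button
    this.items.find((i) => i.buttonId === buttonId)?.action();
  }

  // Driver-assistance systems call this when traffic becomes complex.
  safetyInterrupt(): void {
    this.visible = false; // abruptly hide the projection; eyes back on the road
  }
}

// Example: the weekend forecast is bound to the ridged upper-right button.
const menu = new ProjectedMenu(
  [{ id: "B2", position: "upper-right", shape: "ridge" }],
  [{ label: "Weekend forecast", buttonId: "B2", action: () => console.log("Show forecast") }],
);

menu.press("B2"); // selected by feel, eyes stay forward
```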

Sound far-fetched? Consider air flight and the evolution from flying kites to blasting rockets to Mars and beyond. When we consider the past and let our imagination soar, autonomous vehicles become a reasonable part of the future. Assistance technologies are part of the evolution in that direction and include a HAT worth wearing!

Imagine if autonomous vehicles looked like this!

Footnotes: Online search terms like “Haptic Assistance Technology” or “Haptic Assisted Technology” generate a list for “Haptic Assistive Technology.” These refer to sensory substitution aids for individuals with visual impairment. Braille, designed to help the blind, is a prime example but may have related applications for the sighted, such as situations where the sense of touch is needed because of darkness or the out-of-line-of-sight position of one’s hands.

Images from iStock photo

IS A DRIVERLESS CAR LIKE THE HEADLESS HORSEMAN OF SLEEPY HOLLOW?

John Quidor: The Headless Horseman Pursuing Ichabod Crane.

Sometimes when I think about what the future will be like when autonomous vehicles take over the roads, the tale of Sleepy Hollow comes to mind. It is the story of a man making his way home on a dark country road who hears distant galloping and is frozen with fear by the approach of a horse ridden by a headless man. Though dead and lacking the usual above-shoulder attachment, the figure races forward with purpose and precision. Though there are differences, the autonomous car can also be scary. A recent article in the New York Times describes open hostility in places where this technology is being tested because of safety and job-loss concerns. Deadly accidents are one factor that has put the kibosh on consumer acceptance, which remains in flux. A AAA survey reveals that most drivers are not comfortable sharing the road with driverless cars.

Like the evolution of living beings, technology is a work in progress that depends on endless experimentation and adaptation to challenges and change. Who knows what the driverless car will be like in a hundred years? Past experience may guide our expectations. Pedestrians had to get used to walking where horses, and more recently driven cars, tread. Now the challenge is getting used to driverless vehicles. As with Sleepy Hollow, dealing with fear is key. Trust needs to be built. Errors need to be understood. The public needs to be educated on how to use and interact with this technology safely, just as we needed to learn how to operate stoves. What do you do when the headless horseman races toward you? How do you interact and make sure that you are being recognized?

SofTrek has a solution that keeps a head on the horseman by keeping the driver’s “eyes on the road, hands on the wheel”. To do this, SofTrek adds tactile features to computer buttons so that they can be located without the need to look away from a display. If these buttons are on a steering wheel and matched with the format of projected menus, the driver does not need to look down to locate a desired button or away from the road to reach for a dashboard or cell phone screen. Safety is maintained by enabling a sudden switch of attention back to the road, triggered by what the driver and the vehicle’s computer sense.

By combining different solutions, we can find happy trails to Sleepy Hollow. Men and women, their horses, and little children too, depend on them.

A Room With A Virtual View

When I had my office in the old building, I really enjoyed having a window to the outside. The flow of the day shifted with the sunlight and clouds, the rustle of the trees, and the movement of people and animals outside while I was tied to a monitor reading echocardiograms.

That all changed with the construction of a shiny new office tower and my relocation to a new, spacious office, which was very nice except for the absence of windows. I missed the “brain breaks” that glancing outside gave me. Reading echos depends on constant data processing, actively forgetting and making room for new input, so managing mental fatigue is very important.

I looked to my hobbies to put breathers into my workflow. These included an aquarium with live plants, sandstone, and peaceful fish, like guppies and mollies. The chaotic activity of my fish helps release the knots in my mind. A hydroponic system called AeroGarden® was placed beside the aquarium. The system generates mood-lifting natural light and grows stunning flowers that prompt happy comments from people walking by my open door. I teach that the beauty of life can be found in the structure of plants and hearts.



The view from the desk where I interpret echocardiograms. Nature Relaxation™ image was reproduced with permission.

Fatigue management also includes a stunning 4K monitor on the wall in front of my desk, generously provided by my employer. The large monitor mirrors my desktop so that cardiology fellows can follow and learn from my interpretations as I point to difficult structures with my mouse. Sometimes there will be a group of fellows, residents, and medical students learning from me. At times, I raise my sit-stand desk so we can stretch and re-frame.

I can play nature videos on the 4K monitor, like the ones from Nature Relaxation™ or YouTube, and revel in vistas of Norwegian fjords, Patagonian landscapes, and the underwater dances of whales. The monitor is my virtual window. It helps me focus and lightens my mood. When I use the 4K monitor for teaching, I position my laptop beside my desktop monitor and glance at nature images on the smaller screen. My students enjoy this very much because it inspires pleasant banter and intermingles thoughts of nature with the learning of cardiology concepts. The restorative images give me the stamina to analyze massive amounts of medical data.

My experience is supported by what has been published. Research pioneered by Rachel and Stephen Kaplan shows that seeing nature manages mental fatigue and stress and improves memory and attention. Renowned writer and neurologist Oliver Sacks MD notes in his essay collection “Everything In Its Place” that nature “exerts its calming and organizing effects on our brains” and describes how exposure to gardens benefited his patients. He asserts that this is also “critical for people working long days in windowless offices”. These are powerful observations that concur with my experience. Furthermore, projected images also do the trick for me; they help me stay focused and in flow. They address nature-deficit disorder, which Richard Louv described in children but which can affect adults too. His book, “The Nature Principle”, describes how connecting with nature promotes health, creativity, and mental acuity. For education, video projections allow a simulation of the outdoor classroom inside.

Wherever I go, I see people glued to their screens, together and separated at the same time.  But these screens can also be beneficial. Many people work in rooms and cubicles without windows and have similar needs. The good news is that we can benefit from screens to access nature and work in a room with a virtual view.

Disclaimers and credits: No endorsement of this blog or website is implied by references to individuals, institutions, companies or their products.

A Room with a View is a book by E.M. Forster that was published in 1908.
Oliver Sacks MD quote from New York Times article and his essay collection “Everything in Its Place”


Munching While Motoring – the Haptic Experience

Have you ever eaten trail mix while driving? No need to look into the bag to make selections: M&Ms, cashews, or raisins are easy pickings thanks to the sense of touch, or haptics, which is how fingers “see”. I extrapolated the trail mix experience to haptic stickers, which I put on different car controls. I wanted to see whether they improved my driving experience when placed on A) the steering wheel’s cancel cruise control button, B) the drive button, and C) the heads-up display.

From the cabin of a 2017 Acura TLX

Here’s what I found:

  1. For steering wheel buttons that adjust cruise control (left photo), my right thumb moved back and forth between the toggle that adjusts speed and the cancel button, which has a haptic sticker on its surface. If the haptic sticker were not there, I would either have to look down to find the button or use the brake pedal. Either of these options would require more effort or be more distracting. Because the steering wheel application was helpful, I have applied more haptic stickers to other buttons and continue to rely on them.
  2. The orange haptic sticker on the drive button (middle photo) was helpful to a lesser degree. I need to reach for the button to start driving and watch the movement of my finger.  Having a distinct tactile target may still be helpful by providing tactile feedback when the button has been touched and possibly by shortening the time that I am visually guiding my reach.
  3. The haptic stickers on the heads-up display (right photo) took more getting used to but may also be advantageous, as with the drive button. Having a tactile target seems more natural, satisfying, and confirmatory than reaching for a flat screen without distinct tactile features. Flat screens take one’s eyes off the road. It is distracting to need to watch one’s finger while reaching, study the display, and make sure that the desired option has been selected. (Consumer Reports provides a nice review on this topic.)

Conclusion: I found it helpful to add tactile features to the controls in my car, especially on the steering wheel and, to a lesser degree, on controls I reach for. Doing this helped keep my eyes on the road and hands on the wheel. In other words, keep haptics close at hand. You would not want to eat your trail mix any other way!

 

WARNING: THE PURPOSE OF THIS POST IS TO ILLUSTRATE HOW ADDING TACTILE FEATURES TO BUTTONS MAY BE USEFUL. BECAUSE SAFETY STUDIES HAVE NOT BEEN DONE ON PERSONALIZED TACTILE MODIFICATION OF CONTROLS, SOFTREK DOES NOT RECOMMEND THAT READERS FOLLOW THESE EXAMPLES. SOFTREK INC ASSUMES NO RESPONSIBILITY IN THIS REGARD FOR ANY INJURY, DEATH, OR LOSS OF ANY KIND. THE PERSONALIZED MODIFICATIONS DESCRIBED ABOVE HAVE NOT BEEN ENDORSED BY ANY MANUFACTURER OR REGULATORY INSTITUTION.

PHOTOGRAPHS ARE DERIVED FROM MAXI-AIDS, INC “BUMP DOTS”, AN ACURA VEHICLE, AND iSTOCK PHOTO. NO ENDORSEMENT OF THIS POST BY THESE COMPANIES IS IMPLIED.

Umwelt and My X-Ray Glasses

There was a time long ago when there were no smartphones or internet and virtual reality was imagination. In the 60s, when I was 13, reading Superman and Batman comic books was a favorite activity. The best part was the page on the back with ads for things like whoopee cushions, stink bombs, and X-ray glasses that could see through clothes! The makers of these glasses were onto something big!! I appreciate this much more now because of a great TED talk by neuroscientist David Eagleman that discusses umwelt, one’s experience of reality, which is determined by what we can sense and think. For example, on my morning walk, my dog constantly needs to stop to smell street posts. I used to find this annoying, but now I understand her umwelt, which allows her to read the daily dog news coded in scent.

Another thing I learned from the talk was that our brains are like computer processors that discern patterns from the inputs of our senses, which are like interchangeable computer peripherals. When one sense fails, our brains can adapt to discern patterns from other senses. This allows the blind to use hearing to “see” sound. Technology can also compensate for blindness by integrating with the optic or auditory nerves so that electronic peripherals can send vision or sound to our brains. Technology augments our senses. One may wear glasses to see infrared light, like snakes do, or ultraviolet light, like bees, or wear hearing aids to hear sound waves beyond our natural range. I would like to find a nasal device to enjoy the daily dog news with my pooch – what a bonding experience that will be!

Augmentation, miniaturization, and biologic integration – this seems to be where we are going. As a physician, I see new frontiers, which allowed me to understand why a certain tumor makes a “plop” sound as it moves between the chambers of the heart. As an inventor focused on automotive innovation, I see ways our senses can connect drivers with their vehicles. I proposed putting buttons on the steering wheel that are recognized by the sense of touch so that drivers can control their vehicles without the need to look away from the road. The toolbox of all our senses will plug’n’play our brains and devices together in the realm of robotic avatars. New superpowers are enabling us to do tasks distant from our physical location – like performing surgery or exploring Mars. While this all seems fantastic, I need some new comic books to find out what happens next.

Additional comments and hyperlink reference notes:

  • X-ray glasses in the 1960s seem pretty lame compared to the internet of thought that pervades today’s adolescent mind. How unfortunate!
  • The Argus® II Retinal Prosthesis System wirelessly transmits data from a video camera in a blind patient’s glasses to an eye implant that electrically stimulates remaining retinal cells. This enables perception of visual patterns.  A brief presentation can be found here.
  • Cochlear implants consist of a microphone, processor, transmitter and electrode array that transforms sound into electrical stimulation of the auditory nerve. This technology can help a deaf person perceive speech and other sounds.
  • I published a paper called “The Etiology of Atrial Myxoma Tumor Plop” in the Journal of the American College of Cardiology, which was based on frame-by-frame analysis of ultrasound images. I showed that when an atrial myxoma moves from the left atrium to the left ventricle, it obstructs the mitral valve and causes a high-velocity blood jet that may correspond to a tumor plop sound.
  • Our hands represent the ultimate approach for machine control because of their fine motor and sensory abilities.  Because touch is how hands  “see”, adding textures, shapes, and temperature differences to interfaces will enhance our ability to manually control devices.
  • James Cameron reached the deepest depths of almost 7 miles with his Deepsea Challenger submersible. There are many advantages to exploring biologically perilous environments on Earth (and other planets) using robots that remotely integrate with our body and mind, while we remain safely situated, eating popcorn. Robotic avatars can vastly extend human senses and abilities in all fields of endeavor, especially medicine.
  • Photograph of digital eye is from iStock Getty Images by Brian A Jackson.  Dog photos are copyrighted by GeriCoh LLC.

Baby Medicine for the Senior Driver

Reading the tiny print on a baby medicine bottle is a challenge at 3 in the morning. With bleary eyes, I struggled to make out the dosage while my little girl cried. The task was difficult at the age of 30, and impossible at 60. Manufacturers do not always adapt their products for seniors. The subject is so topical that a recent Consumer Reports article rated different car models according to accessibility and safety for older drivers. This means making vehicles with more features like lane-change warning and dashboard buttons or controls that are easy to see, feel, and operate. One would expect great interest in selling to aging baby boomers because they represent a large segment of the population with money to spend.

Many things come with getting older. These include wisdom, experience, and patience, which translate into safer driving. There are also challenges, such as seeing and hearing less well, slowing reflexes, and reduced coordination and mental processing. These changes can be addressed by applying the concept of “universal design”, which advocates simplicity, intuitiveness, ease of sensing, and low physical effort.

When these goals are applied to a car’s cabin, controls are easy to see, reach, feel, and manipulate. Using more than one sense to operate a control enhances its accessibility, such as recognizing a button by its distinctive appearance and feel. Consider a large round knob for adjusting the radio volume. Controls that are readily recognized by their appearance or feel reduce distraction, especially at night or when driving conditions cause sensory or information overload.


Alternatively, consider some challenging interfaces for drivers of any age and possible solutions (some relating to SofTrek innovations):

  1. Smooth flat screens that require precise finger positioning guided by sight for item selection.
    1. Add tactile features to the smart screen.
    2. Simplify display with larger but fewer items.
    3. Associate display items with buttons at the side of the screen.
    4. Associate display items with the tactile features of buttons at the side of the screen.
  2. Horizontal or vertical grouping of buttons that are typically smooth surfaced and difficult to distinguish from each other. A vertical line of buttons may be at the side of smart screens. A horizontal line of buttons may be under a radio display.
    1. Increase button sizing, spacing, and diversity.
    2. Add tactile features so that one button can be distinguished readily from the other by the way it feels when it is touched.
    3. Associate the appearance, positioning, and feel of a button with the format and appearance of menu items on a heads-up display.
  3. Small controls with small print or icons on steering wheels, the dashboard, and elsewhere in the cabin.
    1. Add tactile features to the buttons
    2. Make buttons larger with larger print or images, if possible.
    3. Incorporate auditory or visual feedback when the driver touches or selects a control.
  4. Crowding many controls on or around the steering wheel and in the cabin.
    1. Reducing the number of controls in a cabin by increasing the number of functions that can occur when a control is manipulated.
    2. Using software coding to diversify the functionality of a control (see the sketch after this list).
    3. Displaying on the windshield or heads-up display the various functions determined by a control.
    4. Using a database to determine the functionality and appearance of menu options associated with a control.
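As a rough illustration of points 4.2 through 4.4, here is a minimal TypeScript sketch, using hypothetical names, of a single multi-function knob whose action and heads-up display label are looked up from a table according to the current mode:

```typescript
// Hypothetical sketch: one physical knob serves several functions.
// A lookup table (standing in for the "database" above) maps the current
// mode to the knob's function and to the label shown on the heads-up display.

type Mode = "audio" | "climate" | "navigation";

interface ControlBinding {
  hudLabel: string;                 // what the heads-up display shows
  onTurn: (delta: number) => void;  // what turning the knob does
}

const knobBindings: Record<Mode, ControlBinding> = {
  audio:      { hudLabel: "Volume",      onTurn: (d) => console.log(`volume ${d > 0 ? "+" : "-"}`) },
  climate:    { hudLabel: "Temperature", onTurn: (d) => console.log(`temp ${d > 0 ? "+" : "-"}`) },
  navigation: { hudLabel: "Map zoom",    onTurn: (d) => console.log(`zoom ${d > 0 ? "in" : "out"}`) },
};

let mode: Mode = "audio";

function turnKnob(delta: number): void {
  const binding = knobBindings[mode];
  console.log(`HUD: ${binding.hudLabel}`); // the display follows the control
  binding.onTurn(delta);
}

turnKnob(+1);     // HUD: Volume, volume +
mode = "climate";
turnKnob(-1);     // HUD: Temperature, temp -
```

Adding a mode adds functionality without adding a physical control, which is the point of item 4 above: fewer, richer controls rather than a more crowded cabin.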

So the next time you find yourself bleary-eyed because of small print, consider the principles of  “universal design”. They are helping me shop for that special car with my needs in mind.

………………………………………………………………………………………………………………………………………

References:

“Center for Universal Design at North Carolina State University”. Design.ncsu.edu. 

Braille at the Edge of Sight – The Driver’s Experience

Peripheral vision does not get much fanfare, though it is essential for survival. I learned this from a reality TV show on what to do while walking through African bush country at night. It’s easy to take for granted what we see at the edge of sight because peripheral awareness is often semi-conscious; that is, until a snake falls from a tree or a car veers into one’s lane. Furthermore, peripheral vision continually interacts with other senses, such as touch, in all our navigations, like when we walk and the ground meets our feet or when we reach out to hold something, like a crack in a cliff. Understanding how senses work together can also be helpful in the design of a vehicle’s human-machine interface (HMI) and can make it easier to find controls and reduce distracted driving.

Peripheral vision has characteristics that deserve special consideration. Although we are naturally more aware of our central vision, the different parts of our retina work in tandem and have complementary differences. Peripheral vision often runs in the background, less consciously, but draws in your central vision when it recognizes something worth seeing more fully. Peripheral vision depends on the outer parts of the retina and senses the world differently than the central retina, or fovea, which commands most of our attention. The table below summarizes the features of the two parts of the retina.

Peripheral Vision         Central Vision
Less detail               Detailed sight
Reduced color vision      Optimal color vision
Lacks depth perception    Sees in 3D with two eyes

 

Dashboard controls, like the volume knob, may be first recognized peripherally before central vision watches one’s hand reach for and touch the control. This takes one’s eyes off the road and is distracting and possibly dangerous. An alternative is to keep one’s gaze forward and use peripheral vision to guide one’s reach and contact with a target. Certain qualities of the target make this exercise easier:

1)    peripheral visibility, because of its size and illumination;

2)    accessibility, because it is easy to reach or close to one’s hands and fingers;

3)    tactile features, distinguished by its size, shape, protrusions, depressions, and temperature.

To illustrate these concepts, consider the following scenario. I was driving at night on a dark and busy street in LA and wanted to change the radio volume and station. I needed to keep my gaze forward (at label A, with the yellow outline of my sunglasses). Some controls (at lower right, labelled B) were easy to see in my peripheral vision because they were large, round, and illuminated. This also made them easy to reach and feel without looking away from the road. In contrast, the radio station selection buttons (yellow, dotted rectangle between the top B’s) lacked distinguishing tactile features, were small, and were hard to see. In this case, prolonged central gaze was needed to distinguish one button from the other. As an analogy, my finger launched like a rocket from the steering wheel, relied on peripheral vision to navigate my hand through space, and made an accurate touchdown because of the distinctive tactile structure of the target landing pad.

The sense of touch is very important. Akin to the blind using Braille, even the sighted need to use their fingers to “see” the world when ambient light is low or machine controls are not where one is looking. Consider the motto, “eyes on the road, hands on the wheel”. Controls with distinctive tactile or haptic features help guide selection of menu items on a computerized heads-up or windshield display so the eyes can stay forward as much as possible. This video demo shows how to accomplish this ideal and also employs menu integration with haptic controls.

Braille at the edge of sight is not such an abstract idea after all. It helps keep us safe while driving. With this in mind, you may want to look at the controls in the cabin of your vehicle and see whether their design and layout makes sense to all your senses.

————————————————————————————————————

Source of the image of the eye: Hans-Werner Hunziker. Hans-Werner34. https://upload.wikimedia.org/wikipedia/commons/3/39/Double_system_e.jpg

Auto Vision of the Peripheral Penny

What do you do if you run out of gas while driving at night in African bush country? Do you curse because your cell phone has no signal? Do you wait until the AM? If you choose to leave your vehicle to find help (not recommended), your peripheral vision may be really useful because it sees better in the dark than your central vision and is good at detecting motion, like a snake lunging at you from an overhead branch. Awareness of vision at the edge of sight is a practiced skill of jugglers. In urban jungles, the car that changes into your lane unpredictably and threatens to crash into you triggers your peripheral vision first.

Most of us focus (pun intended) on the color, 3D, and detailed vision generated by the central part of our retina, called the fovea. Peripheral events often trigger us to bring an image into our central awareness. In many ways, central and peripheral vision complement each other. Vision can complement other senses too, like the sense of touch. This is needed for hand-eye coordination and can be evaluated during a neurological exam with finger-to-nose tests. These examine the integration of the different parts of the brain involved in coordination, vision, spatial awareness, and sensory and motor abilities.

The lack of coordination of movement is called dysmetria and can be tested with the nose-finger-nose exam. Here is how it is done. You put a finger on your nose and then, as quickly as you can, touch the doctor’s finger in front of you before returning your finger to your nose. This is done quickly and repeatedly. It depends on your central vision and on a target your finger can feel, so that you know you have reached it. Think of your finger as a rocket blasting off from your nose, going through space, and landing on the lunar surface of the doctor’s finger before returning to your terrestrial nose.

Now consider what you would experience if the exam were modified. What if you did not have the sense of touch (or, alternatively, what if the doctor’s finger were a hologram)? In this case, you would have to compensate with vision to be sure that your finger reached its target and did not overshoot or undershoot. This would take a lot of concentration. The accuracy and speed of this exercise would reasonably be expected to be worse.

In another scenario, what if you did this exercise using your peripheral vision instead of your central vision? Peripheral vision is not as detailed as central vision and sacrifices depth perception. The finger-to-nose test would be harder still, and even harder if the sense of touch were also absent.

This information is useful for understanding how vision and touch enhance each other when you are driving and selecting controls on the dashboard or elsewhere in the cabin. To test this, consider a game called “Touch the Penny”. Take a blank sheet of white paper and put a penny in its center. Trace the periphery of the penny so that its outline remains, and fill the circle with brown color. Place the sheet on a table to the side of your dominant hand, hold a pen between your thumb and second finger, and place your third finger on your nose. Now, without looking down and keeping your eyes straight ahead, use your peripheral vision to guide your third finger to the drawing of the penny and mark the center of the penny with an “x”. After that, bring your finger back to your nose, and repeat the cycle ten times, moving back and forth as fast as you can. Now repeat the entire exercise, but this time stick the actual penny on the sheet. Again, your third finger guides the landing into the middle of the penny, this time one you can actually feel. Like the finger-to-nose test, speed and accuracy are better with a recognizable tactile target.

When it comes to driving, the ideal is “eyes on the road all the time”. It would be helpful if it were never necessary to divert one’s gaze from the road to operate cabin controls. This could happen with heads-up displays that project menus onto the windshield without significantly obstructing the view. Another way this can be achieved is by adding distinct tactile qualities to the controls in the cabin so that the driver need not look away from the road to recognize a control. If the control is viewed using peripheral vision, the sense of touch can improve selection accuracy and confidence.

There are times when buttons and controls in a car’s cabin are difficult to see. Perhaps it is night, or the controls are hidden, perhaps because of novel positioning underneath the rim of the steering wheel. When vision is challenged, the benefits of Braille for the blind should not be unseen by the sighted. In sum: 1) many senses can be combined to improve our interactions with our machines, and 2) make sure your vehicle is well equipped when driving into the wilderness, urban or otherwise.

Source of the image of the eye: Hans-Werner Hunziker. Hans-Werner34. https://upload.wikimedia.org/wikipedia/commons/3/39/Double_system_e.jpg

A Street Car You Desire

There is something called “desire paths”, which I learned about while watching an excellent TED talk by Mr. Tom Hulme. Desire paths are the short-cuts or “paths of least resistance” that one recognizes while interacting with a structured model. Sidewalk landscaping around buildings is a good example. Have you ever found that the walkways to buildings are too circuitous, taking you on unnecessary journeys through gardens and parking lots? They ignore the rule that a straight line is the shortest distance between two points. Impatient people like me may cut through the lawn and wear down the grass until a more direct dirt path emerges. If the architected field of dreams does not correctly anticipate what we need, users may not come.

Recognition of desire paths may improve implementation of any technology. When the elevator was first developed, there was nervousness about the cable breaking and the risk of a free fall for all. Innovators, like Elisha Otis in 1852, pioneered solutions. Now the pleasantries and culture of attendant-operated elevators are forgotten and automation is taken for granted. The designers of the autonomous car believe that we will learn to accept and safely use their technology too.

So what are the desire paths that may enable quicker implementation of the autonomous car? One is to put smart cars on smart roads. This means switching on autonomy when an enabled vehicle drives on a road that can interact with it through technology integrated into the roadway, signs, and lighting. Highway lanes could be designated for autonomous vehicles, just as there are lanes for vehicles with more than one passenger. When the vehicle leaves this special lane, it would revert to manual control.

Another model for autonomous driving is vehicle platooning. Here, cars or trucks could be switched into autonomous mode when they join a string of similar vehicles. This sequence of vehicles resembles the attached cars of a train. The first vehicle in the platoon is manually driven. In turn, it chauffeurs the vehicles behind it. This model is being investigated by the Safe Road Trains for the Environment (SARTRE) project in Europe. Advantages include fuel efficiency, decreased wind drag, and autonomy for the chauffeured vehicles.

Perhaps someday, I will own a car with a single red brake button and no steering wheel or floor pedals. And I may find special desire paths for driving such a car. Until that day comes, there are other ways autonomous vehicles will become mainstream soon – at least that’s what I desire.

Formula One race car with light effect. Race car with no brand name is designed and modelled by myself

References: The concept of “autopilot” lanes was described in a December 2013 article by Melba Kurman, Triple Helix Innovation, and Hod Lipson, Cornell University, called “Where Are the Autopilot Lanes for Driverless Cars? (Op-Ed)”.

Future Car from iStock photo. Formula One race car with light effect.

Horse-drawn street car: “Rapid transit in 1877” – first horsecar run in Manchester, New Hampshire. Published 1908 by the Hugh C. Leighton Company, Portland, Maine. Image was downloaded from Wikimedia Commons.

Braille for the Sighted


Many people have black and white thinking when it comes to vision. Think a blink; sight – eyes open, blind – eyes shut. But here’s the paradox: the brain, encased in the dark confines of the skull, uses many senses to “see” the world. With blindness, the occipital lobe, normally devoted to vision, can interpret other stimuli such as touch, sound, and smell, using these abilities to navigate the world, which includes reading Braille.

The sighted can also benefit from using touch to “see”. Here are some ways this can be incorporated in technology: 

  1. You are plugging a cord into the back of a computer that cannot be turned around and are challenged to find the desired empty port. Solution: put matching textures, ridges, or shapes on the end of the cord and beside or around the port to guide their connection.
  2. You are operating the TV cable remote control but need to look at where you put your fingers to change the channel and volume. Solution: make buttons easier to identify by their shapes or textures so you don’t need to look away from the TV. When the remote is picked up, its keypad configuration and tactile features are mirrored onto the TV display to guide button selection (see the sketch after this list).
  3. Your smartphone starts to ring but is difficult to pull out of your pocket. Fortunately, you can recognize different buttons by the way they feel and choose whether to answer the call.
  4. You’re driving your car and want to operate the cruise control, change radio channels and volume, and make phone calls using the many buttons embedded in the steering wheel. These buttons are hard to distinguish, and you don’t want to look away from the road to select them. Solution: put buttons on the steering wheel that can be recognized with the sense of touch to reduce distraction. The menu items on a heads-up display can be associated with icons that match the positioning, shape, and feel of the buttons on the steering wheel, to guide the selection process.
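As a rough sketch of the remote control idea in item 2 (all names here are hypothetical, not a real TV API), the keypad layout, including each button’s tactile feature, could be mirrored onto the TV screen the moment the remote is picked up:

```typescript
// Hypothetical sketch: mirroring a remote's keypad onto the TV display.
// When a pickup sensor fires, the TV renders the keypad layout, including
// each button's tactile feature, so the viewer can select by feel while
// watching the on-screen guide instead of the remote.

interface RemoteButton {
  label: string;        // e.g. "Vol+"
  row: number;          // position in the keypad grid
  col: number;
  texture: "smooth" | "ridged" | "dotted" | "concave";
}

const keypad: RemoteButton[] = [
  { label: "Vol+", row: 0, col: 0, texture: "ridged" },
  { label: "Vol-", row: 1, col: 0, texture: "concave" },
  { label: "Ch+",  row: 0, col: 1, texture: "dotted" },
  { label: "Ch-",  row: 1, col: 1, texture: "smooth" },
];

// Called when the remote's motion sensor detects it has been picked up.
function onRemotePickedUp(buttons: RemoteButton[]): void {
  for (const b of buttons) {
    // A real TV would draw icons; this sketch just logs the mirrored layout.
    console.log(`(${b.row},${b.col}) ${b.label} [${b.texture}]`);
  }
}

onRemotePickedUp(keypad);
```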

So here’s my takeaway: tech should stay in touch and put on a haptic face.

References and Comments: 

Michael Proulx has written an excellent paper called “Blindness: remapping the brain and the restoration of vision.” Sensory substitution technology enables sight without visual input. He describes the neuroplasticity of the brain and references papers finding that the visual cortex can process other senses (sound, touch, and smell) to “see” without eyesight.

The Massachusetts Institute of Technology published at ScienceDaily.com an article titled “Parts of brain can switch functions: In people born blind, brain regions that usually process vision can tackle language.” The visual cortex is integral to reading in the sighted. Interestingly, it continues this role in the blind by processing Braille.

The paradox of the brain in its dark confines seeing the world is derived from All the Light We Cannot See, by Anthony Doerr, which won the Pulitzer Prize for Fiction and exquisitely describes how a girl uses touch and other senses to adapt to becoming blind.

“Haptic” relates to the sense of touch.

 

 
