
Archive for the ‘News from product design’ Category

Self-driving cars

It’s been a while since I last posted an article on this blog, so here we go. Recently I came across an article about self-driving cars – a topic that is hard to avoid lately. It appears that every automobile manufacturer is investing in research on self-driving cars. Most of this research concerns the communication between “driver” and car, and ethical considerations about how to handle safety-critical situations. Toyota, for example, recently proposed a kind of personal assistant in the car that communicates with the driver (source). The communication between a self-driving car and pedestrians or other traffic participants is, at the moment, a side track of research.

However, this communication is an important part of the driving task in city environments. A driver communicates with other traffic participants to clarify each other’s intentions and negotiate a safe passage for everybody. For example, a pedestrian might wish to cross a road without a traffic light when there is a traffic jam or heavy but slow-flowing traffic. To ensure he/she can cross safely, the pedestrian communicates with the driver of the car in front of which he/she wants to cross. The pedestrian observes the driver’s face and ensures that the driver is looking at him/her. The driver, in response, indicates that he/she has seen the pedestrian with a smile, a nod or a hand signal, and slows down the car or increases the distance to the car in front. Then the pedestrian crosses the road safely.

With a self-driving car this communication is lost. There is no driver to communicate with. What does a self-driving car need to deal with such a situation? One solution would be to leave the decision about the next action up to the pedestrian and let the car handle the situation with its pedestrian recognition and emergency brake system, but that might be unpleasant for the self-driving car’s passengers, and it would leave the pedestrian uncertain whether the car will stop. Communication would resolve that uncertainty and reduce the risk in the situation. To establish communication between a self-driving car and a pedestrian, the car would need the ability to recognise that someone wants to communicate with it, e.g. the pedestrian who wants to cross the road. Ideally, the car would then have a set of signals that helps the pedestrian see its intentions. The car can then interpret the pedestrian’s answer and adjust its behaviour, e.g. slow down to allow the pedestrian a safe passage.
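To make the loop concrete, here is a minimal sketch of that negotiation: recognise the crossing request, acknowledge it, create a safe gap, invite the pedestrian to cross. The class and the signal names are entirely hypothetical – a real system would sit on top of perception and prediction modules far beyond this illustration.

```python
from dataclasses import dataclass

# Hypothetical stand-in for what a perception system would provide.
@dataclass
class Pedestrian:
    looking_at_car: bool   # eye contact detected
    wants_to_cross: bool   # e.g. inferred from posture near the kerb

def negotiate(ped: Pedestrian) -> list:
    """Return the sequence of signals/actions the car would take."""
    actions = []
    if not (ped.looking_at_car and ped.wants_to_cross):
        return actions + ["drive_on"]       # no crossing request detected
    actions.append("signal_seen")           # acknowledge: 'look' at the pedestrian
    actions.append("slow_down")             # create a safe gap
    actions.append("signal_please_cross")   # e.g. the concept car's 'nod'
    return actions
```

The point of the sketch is the ordering: acknowledgement must come before yielding, because without it the pedestrian is left with exactly the uncertainty described above.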

The self-driving concept car from Volkswagen (VW) has an interesting design that pushes towards natural communication. The designers have taken Don Norman’s remark that the turn signals of cars are their facial expressions literally. The concept car’s face could be used to communicate with pedestrians in a natural way: signalling that it has recognised the pedestrian (looking at him/her), nodding in response to the pedestrian’s request to cross the road (moving its eyes up and down), indicating where the car is going (e.g. through the movement of its eyes) and whether it is stopping (eyes looking forward and a red brake light).

A difficulty could be that the car’s face appears to be visible from the front only. In a traditional car, it is possible to see through the windows where the driver is looking, at least from both sides. Other signals that help to predict a driver’s intentions are the indicators, the brake lights, and the driver’s reflection in the mirrors. It is easy to implement indicators and brake lights in a self-driving car; it is more challenging to make its behaviour clearly visible from the sides. Perhaps the car’s eyes could be implemented on the front and the sides, like the indicators of a traditional car. Another challenge lies in the communication itself. Pedestrians can choose from a range of signals to communicate – e.g. a smile, head movement, eye contact, and hand signals. Part of communication is understanding each other’s signals. For natural communication, a car would need to learn a range of signals and how to respond to them. Or do we need to learn a sign language to communicate with a self-driving car?

An interesting aspect is how the communication evolves over time. Once pedestrians learn that self-driving cars always brake for them, does that change their behaviour? Would you, as a pedestrian, be more persistent in your attempts to cross a road? Certainly, self-driving cars are an interesting area of research, with manifold aspects to consider from both a technical and a human-centred side.

Sources:

https://www.theverge.com/2017/3/6/14832354/volkswagen-sedric-self-driving-van-concept-mobility

https://www.theverge.com/2017/1/4/14169960/toyota-concept-i-artificial-intelligence-yui

http://www.jnd.org/dn.mss/the_personality_of_a.html

http://www.jnd.org/dn.mss/chapter_11_turn_sig.html


Shower head shows water temperature

October 22, 2016

A pen that can draw conductivity

September 8, 2016

The Japanese start-up company Kandenko has invented a pen that can draw conductive traces. Watch the impressive video of what the pen can do. The pen uses a silver ink that seems to harden when it is drawn on a surface. The drawing is therefore actually 3D, and lines drawn with the pen can be picked up as a thin solid layer. When the drawing is picked up, it looks a bit like a children’s pop-up book.

The pen is already available as a product on the Japanese Amazon website.

3D Modelling Made Easy – sketch and get a 3D model

In a TED talk I just learned about Gravity Sketch. The company develops software for 3D modelling and was founded in 2013 by students at the Royal College of Art in London. The development focus lies on a usable interface that makes 3D modelling a fast and easy process, open to everybody (with a tablet). The interaction concept is based on sketching forms on a tablet. Depending on the applied function, the form is then transferred into a 3D model. At first impression, the interface looks much less crowded than that of other 3D software, and it was possible to design the 3D model of a glass in a few seconds (demonstration).

However, the software uses different gestures, so it remains for the user to learn the functions that the tool offers and which gesture applies to each. Don Norman talks about the challenges in gesture design. The first questions are: What can I do, and where and how? The user needs signifiers in the interface, and for a memorable interaction they need to make sense for the 3D modelling action. The interaction should fit the conceptual model that the user has of creating the 3D object. Ideally, the signifiers are ones that are used in other tools as well; there still seems to be no standard for that. Gestures can be adapted from typical touchscreen interactions, e.g. swipe to move objects or the two-finger gesture to zoom. Perhaps the tool could offer a guide that talks the user through the interface while showing a video of the specific interaction. Experienced users could have the option to turn this teaching mode off.

In the video tutorials below you can see a bit of the functionality. Keen explorers can watch the company’s YouTube channel and learn in the tutorials how to design different 3D objects. If you want to have a go, get a free version of the program from iTunes. The tutorial below explains how to design a 3D giraffe (a child’s version):

One major aim of the software is to make 3D printing, specifically 3D printing of self-designed objects, easier. As 3D printers are still expensive, it is unlikely that customers have them at home yet. The software bridges that gap through collaboration with other companies: a link to a 3D printing company is already integrated in the software. Users can upload their 3D model to that company’s website and order a 3D print.

Driving Style of Autonomous Vehicles

February 28, 2016

Jaguar Land Rover investigates how natural driving styles (meaning the driving styles of everyday drivers on the roads) could be adopted by an autonomous vehicle. The data comes from instrumented vehicles, which record everyday driving so that it can be understood and then applied to an autonomous vehicle. The research strategy contributes to finding a way to make highly automated or autonomous cars more trustworthy. Trust is a challenge for new technology, strongly bound to its acceptance. Certainly, as humans, we tend to trust things more if their behaviour is similar to our own (see also my previous blog post on automation and trust). However, driving style varies depending on the driver’s personality, experience, driving environment, and purpose of travel. Occasionally a usually calm driver changes his/her driving style if the purpose of travel is urgent, e.g. a family member is sick and awaits a visit. On another occasion one might just want to enjoy the beautiful landscape without any other specific purpose of travel. That again influences the driving style. How much the driving style changes is a question still to be answered.

The vehicle could ask for the purpose of travel and try to associate a driving style with it. Asking a user generally helps to understand the intentions behind the task the user wants to do, and so to provide better service. However, it involves a trade-off between gathering the knowledge to deliver a better service and asking too much, which makes the user impatient.
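That trade-off could be handled by asking only as a last resort: infer the purpose where possible and fall back to a question otherwise. The purpose-to-style mapping below is purely illustrative, not from the article, and every name in it is an assumption.

```python
# Illustrative mapping from purpose of travel to driving style
# (hypothetical categories, not from Jaguar Land Rover's research).
STYLE_BY_PURPOSE = {
    "commute": "efficient",
    "urgent": "assertive",
    "leisure": "relaxed",
}

def choose_style(inferred_purpose=None, ask=None):
    """Pick a driving style; 'ask' is a question callback used only as a last resort."""
    if inferred_purpose in STYLE_BY_PURPOSE:
        return STYLE_BY_PURPOSE[inferred_purpose]    # no question needed
    if ask is not None:
        purpose = ask("What is the purpose of your trip?")
        return STYLE_BY_PURPOSE.get(purpose, "relaxed")
    return "relaxed"                                 # safe default, no question asked
```

The design choice here is simply that the question callback is optional: the car only interrupts the passenger when inference fails, which is one way to keep the user from becoming impatient.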

Implementing human-like behaviour in autonomous cars appears to be a trend. I found a report from last year showing that Google is doing research in that area as well. The report reveals another important reason to adopt human-like behaviour in a car. Whereas autonomous cars are great at following rules, such behaviour can literally result in a roadblock in the unexpected environment of everyday travel. One rule, for example, is never to cross the double yellow lines marking the edge of the road. Now, if a car parks in such a way that another car cannot pass without going over the yellow lines, the natural driver behaviour would be to just go over them. An autonomous car wouldn’t do that; it would recalculate the route, for no reason understandable to the driver. Another issue is the faster reaction time of autonomous vehicles, which lets them come to an abrupt stop when they sense a pedestrian. That is a challenge for the human drivers behind, who cannot react as quickly. Implementing adaptive, human-like strategies in autonomous cars helps them deal with an environment where not everything works by the rules, makes their behaviour more understandable for humans, and brings their skills to a level where they can interact safely with the human traffic participants around them.
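The double-yellow-line example boils down to a rule with a checked exception. A hypothetical sketch, assuming the planner exposes a flag for the human-like mode and can tell whether the oncoming lane is clear:

```python
# Hypothetical sketch of the rule-exception logic discussed above: a strictly
# rule-following planner never crosses double yellow lines, while a more
# human-like planner allows a brief, checked exception to pass a blockage.

def may_cross_double_yellow(lane_blocked: bool, oncoming_clear: bool,
                            human_like: bool) -> bool:
    """Decide whether crossing the double yellow lines is permitted."""
    if not human_like:
        return False                        # strict rule: never cross
    # human-like exception: pass a parked car only if the oncoming lane is clear
    return lane_blocked and oncoming_clear
```

In the strict mode the function always refuses, which is exactly the behaviour that forces the pointless route recalculation described above.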

See here for details directly at the Jaguar Land Rover website.

3D Printed Maps in Braille

February 28, 2016

Another nice idea that takes advantage of 3D printers – maps in Braille. For blind people it can be challenging to find their way in a new environment, and the existing maps installed in buildings tend to have only limited Braille labels (if any at all). A 3D printed map specifically designed for the needs of visually impaired or blind people, e.g. of a university building, could enable them to find their way more confidently. The maps are portable, about the size of a tablet, and are being developed at the Rutgers University School of Engineering.

See here for more details.

Auto completion for hand-drawings

January 15, 2016

An auto-complete for hand-drawings has been presented; watch the video below. This software is very useful for digital animation – repetitions can be auto-completed.