Everyday Usability 38: coffee break with Don Norman

October 22, 2016

A discussion about conceptual models with Don Norman and Bruce Tognazzini. The video is from 2013, but I just found it on YouTube. You can also find a link to the video on Don’s website: jnd.org

Categories: Everyday usability

A pen that can draw conductivity

September 8, 2016

The Japanese start-up company Kandenko invented a pen that can draw conductivity. Watch the impressive video of what the pen can do. The pen uses a silver ink that seems to harden when drawn on a surface. The drawing is actually 3D, and lines drawn with the pen can be picked up as a thin solid layer. When the drawing is picked up, it looks a bit like a children’s pop-up book.

The pen is already available as a product on the Japanese Amazon website.

3D Modelling Made Easy – sketch and get a 3D model

In a TED talk I just learned about Gravity Sketch. The company, founded in 2013 by students at the Royal College of Art London, develops software for 3D modelling. The development focus lies on a usable interface that makes 3D modelling a fast and easy process open to everyone (with a tablet). The interaction concept is based on sketching forms on a tablet. Depending on the applied function, the form is then transferred into a 3D model. From a first impression the interface looks much less crowded than in other 3D software, and in the demonstration it was possible to design the 3D model of a glass in a few seconds. However, the tool uses different gestures, so it remains for the user to learn the functions the tool offers and how to apply those gestures.

Don Norman talks about challenges in gesture design. The first questions are: What can I do, and where, and how? The user needs signifiers in the interface, and for a memorable interaction they would need to make sense for the 3D modelling action. The interaction should fit the conceptual model the user has of creating the 3D object. Ideally the signifiers are ones used in other tools as well, but there still seems to be no standard for that. Gestures can be adapted from typical touchscreen interaction, e.g. swipe to move objects or the two-finger pinch to zoom. Perhaps the tool could offer a guide that talks the user through the interface while showing a video of the specific interaction. Experienced users could have the option to turn the teaching mode off.

In the video tutorials below you can see a bit of the functionality. Keen explorers can watch the company’s YouTube channel and learn in the tutorials how to design different 3D objects. If you want to have a go, get a free version of the program from iTunes. The tutorial below explains how to design a 3D giraffe (a child’s version):

One major aim of the software is to make 3D printing, specifically 3D printing of self-designed objects, easier. As 3D printers are still expensive, it is unlikely that customers have them at home yet. The software bridges this gap through collaboration with other companies: a link to a 3D printing company is already integrated in the software, so users can upload their 3D model to that company’s website and order a 3D print.

100% usable Product by Scott Adams

Categories: Everyday usability

Interface design: The gulf of execution explained in a funny way

Everyday Usability 38 – Usable doors, an up to date topic since 25 years

Don Norman recently shared a video, featuring him, about a very classic object of usability discussion: the door (see Don’s website or the creator’s podcast). Don used doors as a practical explanation of usability, and of why it is important to apply human-centered design, in his book “The Design of Everyday Things” 25 years ago (revised in 2013). Still, some door designs confuse users (not to say cause much frustration), even though applying a good design could be so easy. See the funny video:


Driving Style of Autonomous Vehicles

February 28, 2016

Jaguar Land Rover investigates how natural driving styles (meaning the driving styles of everyday drivers on the roads) could be adopted by an autonomous vehicle. The data comes from instrumented vehicles that record everyday driving styles, so that these can then be applied to an autonomous vehicle. The research strategy contributes to finding a way to make highly automated or autonomous cars more trustworthy. Trust is a challenge for new technology, strongly bound to its acceptance. Certainly, as humans, we tend to trust things more if their behaviour is similar to our own (see also my previous blog post on automation and trust). However, driving style varies depending on the driver’s personality, experience, driving environment, and purpose of travel. Occasionally a usually calm driver changes his or her driving style if the purpose of travel is urgent, e.g., a family member is sick and awaits a visit. On another occasion one might just want to enjoy the beautiful landscape without any other specific purpose of travel. That again influences the driving style. How much the driving style changes is a question yet to be answered.

The vehicle could ask for the purpose of travel and try to associate a driving style with it. Asking the user generally helps to understand the intentions behind the task the user wants to accomplish, and so to provide a better service. However, it involves a trade-off between gathering the knowledge needed to deliver that better service and asking too much, which makes the user impatient.
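To make the idea concrete, here is a minimal sketch of how a stated purpose of travel might be mapped to driving-style parameters, with a calm fallback when the purpose is unknown. The purpose names, parameter names, and values are my own illustrative assumptions, not Jaguar Land Rover’s actual model:

```python
# Hypothetical mapping from a stated trip purpose to driving-style
# parameters. All names and values here are illustrative assumptions.

DRIVING_STYLES = {
    "commute": {"max_accel_ms2": 2.0, "following_gap_s": 2.0},
    "urgent":  {"max_accel_ms2": 3.0, "following_gap_s": 1.5},
    "leisure": {"max_accel_ms2": 1.2, "following_gap_s": 2.5},
}

# Calm default used when the driver gives no (or an unknown) purpose.
DEFAULT_STYLE = {"max_accel_ms2": 1.8, "following_gap_s": 2.2}

def style_for_purpose(purpose: str) -> dict:
    """Return driving-style parameters for a stated trip purpose,
    falling back to the calm default for unknown purposes."""
    return DRIVING_STYLES.get(purpose.lower(), DEFAULT_STYLE)

print(style_for_purpose("Urgent"))       # more assertive parameters
print(style_for_purpose("sightseeing"))  # unknown purpose: calm default
```

In a real system the style would of course be learned from the instrumented-vehicle data rather than hard-coded, but the fallback illustrates the trade-off: the car can work with a single answered question and still behave sensibly when the driver declines to answer.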

Implementing human-like behaviour in autonomous cars appears to be a trend. I found a report from last year that Google is doing research in this area as well. The report reveals another important reason to adopt human-like behaviour in a car: whereas autonomous cars are great at following rules, such behaviour can literally result in a roadblock in the unexpected environment of everyday travel. A rule is, for example, to never go over the double yellow lines marking the edge of the road. Now, if a car parks in a way that another car cannot pass on the street without going over the yellow lines, the natural driver behaviour would be to just cross them. An autonomous car wouldn’t do that. For no reason apparent to the driver, it would recalculate the route. Another issue is the faster reaction time of autonomous vehicles, which lets them come to an abrupt stop when they sense a pedestrian. This is a challenge for the human drivers behind, who are not so fast to react. Implementing adaptive human-like strategies in autonomous cars helps them deal with an environment where not everything works according to the rules, makes their behaviour more understandable for humans, and brings their skills to a level where they can interact safely with the human traffic participants around them.
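The double-yellow-line example above boils down to a strict rule relaxed by a human-like exception. A tiny sketch, with hypothetical predicate names of my own invention, of how such an exception might be expressed:

```python
# Hypothetical sketch of a strict rule ("never cross the double
# yellow line") relaxed by a human-like exception. The predicate
# names are illustrative assumptions, not any vendor's actual API.

def may_cross_double_yellow(lane_blocked: bool,
                            oncoming_lane_clear: bool) -> bool:
    """A strictly rule-following car would always return False here;
    a human driver briefly crosses when the lane is blocked and it
    is safe to do so."""
    return lane_blocked and oncoming_lane_clear

# A parked car blocks the lane and the oncoming lane is clear:
print(may_cross_double_yellow(True, True))   # cross briefly, like a human
print(may_cross_double_yellow(True, False))  # wait, as the strict rule says
```

The point is not the Boolean logic itself but where the exception lives: encoding it explicitly makes the car’s behaviour predictable to the human drivers around it, instead of surprising them with a silent route recalculation.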

See the Jaguar Land Rover website directly for details.