City Touch

An interactive, tactile, and voice-enabled map for people who are blind
Project Details

Context: University Project: Design Research

Tasks: User research, UX concept, Interface Design, Coding, Usability testing, Prototyping

Awards: Universal Design Expert Award 2018, Universal Design Consumer Award 2018

City Touch is a prototype of an interactive, tactile, and voice-enabled map for people who are blind. With its dynamically changeable surface and its integrated voice assistant, it makes it possible to experience an environment with senses other than sight. The project aims to provide a new way of orientation and navigation and to give people with visual impairment more independence from limited, printed 3D maps.
What makes City Touch special?
For me, there are two major reasons why City Touch is one of my favorite projects and was definitely one of my biggest challenges. The first was having to leave out all visual design and rely solely on tactile and auditory communication. This forced me to use a whole new way of prototyping and showcasing my ideas, but it also pushed me out of my comfort zone and taught me to design for new technologies and interfaces. Second, designing for people who are blind meant designing for needs I could never experience myself; their way of managing their lives was a completely unknown field for me. Extensive user research, and especially a lot of interviews, made it possible to step into their shoes, understand their needs, and address their problems.

The Product

Tactile interaction

The user can feel and differentiate the streets and points of interest by their different shapes and heights. When the user presses one of the elements, City Touch provides additional information audibly, for example the name of the street or the type of the POI (e.g. "Doctor Messner, dentist").

Speech interaction

For more complex input, e.g. searching for a specific location or asking for directions from one place to another, the user can interact with our digital assistant 'Louisa' and tell her what to display. She can also help find POIs and give additional information, such as opening hours.
City Touch display with its functions and elements: wide streets, narrow streets, user's location, points of interest, search results, zooming function, button to center current location, auditive feedback
City Touch display
"Louisa, please show me all doctors within two kilometers."
Since we weren't able to prototype all functions of the real product, we focused on one user journey: displaying all doctors in one area of Munich.
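As an illustration of how such a query could be resolved, here is a minimal, hypothetical Python sketch that filters points of interest by type and distance. The POI data, coordinates, and function names are invented for this example and are not part of the actual prototype.

```python
# Hypothetical sketch: answering "show me all doctors within two kilometers".
# POI names, positions (in km on a flat plane), and the function signature
# are illustrative only, not the real assistant's backend.
import math

POIS = [
    {"name": "Doctor Messner, dentist", "type": "doctor", "pos": (0.5, 1.2)},
    {"name": "Cafe Glockenspiel", "type": "cafe", "pos": (0.2, 0.3)},
    {"name": "Doctor Huber, GP", "type": "doctor", "pos": (2.5, 1.0)},
]

def pois_within(user_pos, poi_type, radius_km, pois=POIS):
    """Return the names of POIs of a given type within radius_km of the user."""
    ux, uy = user_pos
    return [
        p["name"] for p in pois
        if p["type"] == poi_type
        and math.hypot(p["pos"][0] - ux, p["pos"][1] - uy) <= radius_km
    ]
```

The matching POIs would then be raised on the display and announced via audio.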

The user starts with a flat display. After asking to see their current location, the streets and their position rise on the map. They can now press a street, and its name is played via audio. Afterwards, they can ask for doctors nearby; the points of interest appear and are displayed higher than the streets. By pressing and holding a POI, the user elevates it even further, thereby highlighting and saving it.
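The journey above can be sketched as a small state model. All class and method names here are invented for illustration and do not reflect the prototype's actual implementation.

```python
# Hypothetical sketch of the prototyped user journey.
# Relative pin heights: streets sit low, POIs higher, saved POIs highest.
STREET_HEIGHT = 1
POI_HEIGHT = 2
SAVED_HEIGHT = 3

class CityTouchJourney:
    def __init__(self):
        self.elements = {}   # element name -> current pin height
        self.saved = set()

    def show_location(self, streets):
        """The flat display rises to show the surrounding streets."""
        for street in streets:
            self.elements[street] = STREET_HEIGHT

    def press(self, name):
        """A short press plays the element's name via audio."""
        if name in self.elements:
            return f"audio: {name}"
        return "audio: nothing here"

    def show_pois(self, pois):
        """Requested POIs appear, displayed higher than the streets."""
        for poi in pois:
            self.elements[poi] = POI_HEIGHT

    def hold(self, name):
        """Pressing and holding a POI elevates it further and saves it."""
        if self.elements.get(name) == POI_HEIGHT:
            self.elements[name] = SAVED_HEIGHT
            self.saved.add(name)
```

In the real prototype, the audio feedback would be spoken aloud and the heights would correspond to physical pin positions.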

The Topic: Soft Machines

1. Challenge

The topic of the class was 'Soft Machines'. Our task was to explore what becomes possible when we don't stick to the 'hard' hardware that we normally use when interacting with computers.

2. Focus

After exploring the characteristics, possibilities and technologies of Soft Machines, we chose to focus on one of their biggest advantages: The flexibility of their surface.

3. Insight

Being able to dynamically change the shape of an interface makes it possible to communicate through its form. This insight led us to concentrate on the tactile interaction with our machine.

4. Target Group

And who would benefit most from tactile communication? Those who already rely on it and can't use the majority of our products because they require visual interaction: people who are blind.


design process: user research, technical research, low-fidelity prototype, user tests, high-fidelity prototype

User research



During the user research, we conducted several semi-formal interviews with people who are blind, people working in the field, and a designer who had already developed an app for them. The goal was to better understand how people manage their lives without being able to see, and to identify their biggest pain points.


User Needs

Limited Perception
Sighted people are able to perceive faraway objects simply by seeing them. People who can't see, however, can only experience what is within their reach or recognizable by sound. Estimating distances and spatial relationships in unknown environments is therefore one of their biggest challenges.
Limited Orientation Resources
To get from A to B on their own, people who are blind mainly have three options. The first is smartphone navigation, realized with voice commands that describe the way. The problem here is that the user only perceives a sequence of voice commands and has no way to perceive the route as a whole or to locate the destination in its spatial context. Another option is a mobility coach, a person who shows the way and helps to study the unique turning points so that they can be managed independently later on. All interviewees really appreciated mobility coaches; the downside is that hiring one is very expensive and only partly covered by insurance. The third option is printed 3D maps, which can also be explored in a tactile way. People with visual impairment considered these maps a good way to get a feeling for a place, but criticized that they are not available for many places and are often outdated. Also, due to the limited space on a printed map, the information has to be restricted to a few points of interest and can't be adapted to other requirements.
People with visual impairment are often lost in unknown environments, and it is almost impossible for them to create a mental map of a place or make connections between locations. This leads to an inability to locate themselves in their surroundings and a feeling of insecurity.



Prepare for the way
To take away some of the insecurity in new surroundings, we want to make it possible to study the route in advance, review the turning points, and get a feeling for the distances.
Show what's around
Blind people are very limited in perceiving what's around them and what kinds of shops, buildings, or other amenities are on their way. We want to enrich their experience and give them the possibility to explore what is along the way or at a specific location.
Create a mental map
Making connections between places and locating them in their context is very hard for blind people, since they usually navigate from waypoint to waypoint. We want to give them a way to create a mental map of an environment.
Provide the full flexibility of information
We want to develop a product that is able to show any place and display various kinds of information, depending on the use case.
Support Independence
People who are blind depend a lot on the help of others. Our product should give them some of their independence back, take away the unknown, and enable a greater feeling of security.

The Design Challenge

As a blind person, I want to be able to perceive my environment in a tactile way, in order to experience information outside of my reach and my hearing distance.

Technical research

After the user research and drafting the first concepts, we had in mind a map that could somehow change its shape to present any location in the world. From a technical point of view, we faced two challenges: first, finding a technology that would make our vision of a dynamic surface possible, and second, finding a way to simulate that technology in our prototype. Since we had limited time and (no) resources, and most technologies for dynamic surfaces are still in the research phase, we quickly realized that we had to use a different technology in our prototype than the one we envisioned for the final product.

After researching a couple of technologies, we focused on two main approaches for the vision of our final product: Ultrahaptics and shape displays. Ultrahaptics is a haptic technology that uses ultrasound to create the feeling of three-dimensional shapes and textures. It felt like the perfect technology for our use case; the problem was that none of us had ever experienced it, and we didn't know how to prototype the feeling. Shape displays, by contrast, were more tangible. The concept is that the display consists of a large number of rectangular pins that can be moved up and down, making it possible to create a three-dimensional structure that can be experienced haptically. Since shape displays seemed more feasible to prototype, we decided to simulate this technology.
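As a rough illustration of the concept, a shape display can be modeled as a grid of pins with discrete heights. This is a minimal simulation sketch under assumed parameters (grid size, number of height levels), not the control code of any real shape display.

```python
# Minimal model of a shape display: a grid of pins, each with an integer height.
# Grid dimensions and the number of height levels are illustrative assumptions.

class ShapeDisplay:
    def __init__(self, rows, cols, levels=4):
        self.levels = levels                       # discrete pin heights: 0..levels-1
        self.pins = [[0] * cols for _ in range(rows)]

    def set_pin(self, row, col, height):
        """Move one pin to a given height, clamped to the display's range."""
        self.pins[row][col] = max(0, min(height, self.levels - 1))

    def render_line(self, row, col_start, col_end, height):
        """Raise a horizontal run of pins, e.g. to form a street segment."""
        for col in range(col_start, col_end + 1):
            self.set_pin(row, col, height)

    def flatten(self):
        """Return the display to its flat initial state."""
        for row in self.pins:
            for col in range(len(row)):
                row[col] = 0
```

A map would then be rendered by translating streets and POIs into such runs of raised pins at different heights.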

Testing the low-fidelity prototype

Low-fidelity prototyping
To get early feedback on our idea, we built four low-fidelity prototypes. Each prototype had different characteristics, e.g. the width and height of the streets or the shape of the points of interest.
User tests
We then tested our prototypes with a group of blind people. Our main results were:
  • No information overload on the map
  • Speech as the way to navigate within the map and to filter information
  • Differentiation between wide and narrow streets is important
  • Users were feeling the map with two hands: One serving as a reference frame, the other one identifying the actual information
Low-fidelity prototypes

High-fidelity prototype

structure of the high-fidelity prototype: Perforated metal plate, Nails, Street with buttons to detect input, engine
structure of the high-fidelity prototype
process of entering the nails
back of plate with cables
structure of POIs
front of plate with detectable streets
hand touching the prototype
