Eynat Pikman, UX Expert

How we created a product vision movie without having a product

Background

The HP Safe City product vision movie tells the story of Big Data and how UX is an essential part of making it usable. The Safe City film imagines the command center of a “smart city” and demonstrates how a city can be managed more effectively by leveraging HP’s Big Data technologies. Our goal was to envision a real product that shows how these technologies can be used. But before we even started thinking about the product, we had to come up with a story: a good story that ties all the pieces together into one holistic tale that delivers the message. Once we had that story, we started thinking about the final deliverable. What would it be? A presentation, an interactive prototype, a movie? We had many dilemmas, but in the end we chose to make a movie because, in our opinion, it was the best way to quickly and easily help the audience grasp the message.

 

Safe City product vision movie

There were a few challenges, though:

a. No such HP product existed. We had to invent it, create use cases, screens, and UI all from scratch.

b. The movie had to be ready for the big HP Discover bi-annual event, which meant that we had only two and a half months to work on it.

c. None of us in our studio had ever created a movie before.

d. Last but not least, we had no budget.

So how did we do it?

Inventing the Safe City product

Since our story was about helping city leaders gain insights into various aspects of city management, both under routine conditions and in times of crisis, the natural key players of our futuristic product were the city’s underlying agencies, such as emergency services, the police, the power supply company, and more.

City’s underlying agencies

Most of these service agencies share the same basic organizational hierarchy, and they also share the same critical need to be aware of and in control of what is happening in the city. This led us to a design strategy of not creating a unique solution for each agency, but instead unifying their UIs, with adjustments for each agency according to its specific needs.

Next, we needed to figure out the personas who were going to use the Safe City product. Since time was short, we collected information any way we could: we visited a police station control room, we read web interviews with command center operators and policemen working in the field, and we even “discovered” an HP employee who was a police volunteer; he helped shed some light on a day in the life of a traffic policeman. All of this was done in hopes of clarifying how things really work. That said, we naturally made many assumptions and used a lot of common sense (which is not always correct) to fill in the gaps. Eventually, we focused on three personas: the control room operator, the control room shift leader, and the onsite worker.

The UI was adapted to the device each persona would most likely use in their use case: several large screens for the control room, and a variety of smart mobile devices for the people in the field.

Two monitors for the operator

Trying to envision how new smart devices might come into play led us to find new solutions, such as a smartwatch alerting the policeman of events nearby, or an augmented reality device that helps the power engineer to get extra information on the street.

We aimed to craft a novel approach that would allow a more natural flow of information and insights between all the personas.

UX strategy

Several guidelines helped us focus on our message.

The first guideline was that the UI should embody the idea of “getting the correct information to the correct person in the correct context.” This guideline is of utmost importance when your application is used in times of crisis and pressure. For example, the Safe City app determines that, due to damaged power cables, children from a nearby school who are about to finish their school day and leave the school premises might be in imminent danger. The app sends an alert to the operator about this additional threat and gives a recommendation on how to act. It automatically takes the operator to the street view of the area and shows the phone number of the responsible contact person at the school.

Threat identification over street view

So, with one tap, the operator can warn the school not to release the children to go home, all without having to identify the problematic situation on the map himself or search for the contact person in his collaboration lists.
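To make the guideline a little more concrete, here is a minimal sketch of what such context-aware enrichment could look like. Safe City was never implemented, so the event types, the school-dismissal window, and the contact lookup below are hypothetical illustrations, not HP code:

```typescript
// Hypothetical sketch of the "correct information to the correct person in the
// correct context" guideline. Safe City was never built, so the event types,
// thresholds, and lookups below are illustrative assumptions, not a real HP API.

interface CityEvent {
  type: "power_cable_damage" | "traffic_accident" | "flooding";
  location: { lat: number; lng: number };
  timestamp: Date;
}

interface PointOfInterest {
  name: string;
  category: "school" | "hospital" | "stadium";
  contact: { name: string; phone: string };
}

interface EnrichedAlert {
  event: CityEvent;
  threat: string;                             // derived secondary threat
  recommendation: string;                     // suggested next action for the operator
  contact?: { name: string; phone: string };  // who to call, surfaced with the alert
  mapFocus: { lat: number; lng: number; view: "street" };
}

// Enrich a raw event with context before it reaches the operator, so the UI can
// present the threat, the recommendation, and the contact person in a single view.
function enrichForOperator(event: CityEvent, nearby: PointOfInterest[]): EnrichedAlert {
  const school = nearby.find((p) => p.category === "school");
  if (event.type === "power_cable_damage" && school && isDismissalTime(event.timestamp)) {
    return {
      event,
      threat: `Children leaving ${school.name} may pass the damaged power cables`,
      recommendation: "Ask the school to hold students until the area is secured",
      contact: school.contact,
      mapFocus: { ...event.location, view: "street" },
    };
  }
  return {
    event,
    threat: "No secondary threat identified",
    recommendation: "Monitor the situation",
    mapFocus: { ...event.location, view: "street" },
  };
}

// Illustrative school-dismissal window; a real system would use the school's schedule.
function isDismissalTime(t: Date): boolean {
  const hour = t.getHours();
  return hour >= 13 && hour <= 15;
}
```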

Timeline component

The second guideline was to create a unified UI that could support all service agencies (for example, the power supply company and the police would use the same UI). One of the UI elements that supports this is the timeline, which synchronizes all agencies’ occurrences in real time. Any update or action taken by one command center is logged and shown in all the agencies’ command centers. For example, the police operator can see that the operator of the power supply company has already handled the school threat, so he knows he can move on to the next action item.
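The paragraph above essentially describes one shared, synchronized log rather than per-agency silos. A tiny sketch of that idea follows; the agency names, entry fields, and in-memory broadcast are assumptions made for illustration only:

```typescript
// Illustrative-only sketch of the shared timeline: every agency writes to and reads
// from one synchronized log, so an action taken in one command center is immediately
// visible in all the others. Names are assumptions, not a real system.

type Agency = "police" | "power" | "emergency";

interface TimelineEntry {
  id: number;
  agency: Agency;        // who performed the action
  description: string;   // e.g. "School contacted, students held inside"
  timestamp: Date;
}

class SharedTimeline {
  private entries: TimelineEntry[] = [];
  private listeners: Array<(entry: TimelineEntry) => void> = [];
  private nextId = 1;

  // Called by any command center when its operator takes an action.
  log(agency: Agency, description: string): void {
    const entry: TimelineEntry = {
      id: this.nextId++,
      agency,
      description,
      timestamp: new Date(),
    };
    this.entries.push(entry);
    // Broadcast to every subscribed command center.
    this.listeners.forEach((notify) => notify(entry));
  }

  // Each agency's timeline component subscribes to stay in sync.
  subscribe(notify: (entry: TimelineEntry) => void): void {
    this.listeners.push(notify);
  }
}

// Usage: the police operator sees the power company's action appear on his timeline.
const timeline = new SharedTimeline();
timeline.subscribe((e) => console.log(`[police view] ${e.agency}: ${e.description}`));
timeline.log("power", "School threat handled: damaged power cables isolated");
```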

The third guideline was to demonstrate that it is all one story, but with many contexts of use. We achieved this by showing the application on various devices in different situations: a tablet in a police car, the policeman’s smartwatch, the power engineer’s smartphone, and the power engineer’s augmented glasses. It was important to ground each device’s UI in a strong and genuine use case. For example, the use case for the policeman’s smartwatch: on some occasions the policeman will be busy doing something else (perhaps writing someone a traffic ticket, and therefore unable to look at his smartphone), but he can take a quick peek at his watch when it alerts him about events that need his attention.

Policeman’s smartwatch
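To illustrate the third guideline, here is an equally hypothetical sketch of how a single alert could be reduced per device, so the smartwatch gets only a glanceable summary while larger screens get the full context. The device names and rendering rules are assumptions, not part of any real Safe City implementation:

```typescript
// Hypothetical per-device reduction of one alert; all names are illustrative.
type Device = "smartwatch" | "smartphone" | "tablet" | "augmented_glasses";

interface Alert {
  title: string;    // e.g. "Damaged power cables near the school"
  details: string;  // full description and recommendation
  mapUrl: string;   // link to the relevant street view
}

function renderForDevice(alert: Alert, device: Device): string {
  switch (device) {
    case "smartwatch":
      // Quick peek while the policeman's hands are busy: title only.
      return alert.title;
    case "augmented_glasses":
      // Overlay a short label on what the power engineer already sees in the street.
      return `${alert.title} (overlaid on live view)`;
    default:
      // Smartphone and tablet have room for the full context.
      return `${alert.title}\n${alert.details}\nMap: ${alert.mapUrl}`;
  }
}
```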

UI design

The common practice is UX first and UI after, but with the time constraints of this project, both UX and UI were started at the same time. The UI designer was involved from the beginning as an important participant in the creation of the story and in the UX discussions. This created an unusual challenge—starting a UI concept without any details and before the UX had matured enough; and an unusual opportunity—giving the UI designer the chance to influence both the story and the content.

So how do you design the look and feel for a product that does not yet exist?

First, the UI designer’s efforts were focused on the operator’s screen. At that point, it wasn’t even clear how many screens the operator needed. The only certain thing was that there would be a full-size map to support the user’s situational awareness and to function as the core of the Safe City application. We figured there might be different types of map modes (geographic, street, and satellite), which would probably be overlaid with colored strips carrying content.

Preliminary map design
Final map design

As the work evolved and there were enough UX mock-ups for the UI designer, the UX and UI were split apart.

The data sources, action items, and timeline components also evolved over time; they stretched and became borderless in order to utilize maximum screen space and to show more content with less visual clutter.

Final design

So by now, we had a concept of content layers on blurred glass.

Final design

The look and feel of the blurred layers also served as the augmented glasses’ mask.

Power engineer’s augmented glasses design

The final challenge was to give all the devices a similar look and feel, one that spoke the same language as the operator’s screen. Bonding them together this way enhanced the sense of continuity and made each device feel like an extension of the same Safe City application.

Power engineer’s smartphone design
Policeman’s tablet design

From a story in slides to a movie


To build the story, we used the storyboarding feature of Indigo Studio (a UX prototyping tool). It helped us keep the story coherent and stay focused on the Big Data values we wanted to convey. When all the materials were ready, it was time to pack everything into a movie. We wanted to give the movie a high-end feel, but we didn’t have the time to produce a fully polished video. And since our studio does not specialize in video editing, we wondered what the best way to tackle it in such a short time would be. We weren’t sure how we were going to take a bunch of slides and pictures and create a movie out of them.

We chose to keep it simple. We had a live prototype, created in Indigo Studio, that included all the important animations, so we recorded the live prototype and used the recording as the base for the video (editing it in Adobe Premiere).

On top of that, we used Adobe After Effects to add the elements that explained the story and set the right atmosphere.

Metropolis (1927)

One example of this was using intertitles, like those used in silent movies, to narrate the story points and help the audience make sense of the chain of events.

We used filters in the transitions between slides, and we also added footage to help the viewer get into the mood of the stormy weather.

Using intertitles in the Safe City movie


In addition, we used basic methods such as animation to manipulate the slide content: text animations, blur effects, and zoom effects.

Closing notes

Storm footage

The end result of this project was a product vision movie, but in essence we tried to design the Safe City application as if we were going to start developing it as soon as the video was completed, building an end-to-end UI that takes all the important considerations into account. And we did all of that in two and a half months.

After we finished the video, the project received overwhelmingly positive feedback from across the company, including from all major stakeholders and customers.

Telling a good story, always keeping the end user in mind, and remembering the value you want to show are the three key points I took away from this project.

The hard-working Studio members who worked on the project :-)

UX: Eynat Pikman, David Ismailov & Oded Klimer

UI: Shiri Gottlieb & Mor Goldstein