How to innovate better products

Automated Marketing Redesign Case Study

Our team was tasked with redesigning the automated marketing product to improve user confidence and efficiency. We applied our five-step UX process to reveal insights and move forward with a better product design. Automated marketing is a complicated product to design, since it can require complex sequences to filter and target recipient segments.

1. Discover

The Users

In order to improve the product, we needed to know where users were getting confused and discouraged. We engaged in a number of activities to understand the current product and how it could be improved.

Contextual Inquiry
Watching a user work in a product is one of the most enlightening ways to understand their workflow and frustrations. We documented the experience of new users and power users to understand how the different users were working. Since remembering the entire journey of a user through complex software is nearly impossible, we used a screen capture tool to record the inquiry sessions and started building a library of videos of user interactions.

User Interviews
We interviewed users to find out what they liked best about the current product and what frustrated them most.

Group of experts
We formed a team of power users to get feedback on the product and brainstorm better solutions. Our team consisted of digital facilitators, account managers, a product manager, and the UX team.

The Competition

There are bigwigs in the market like Adobe, Salesforce, MailChimp, and HubSpot who provide marketing tools to a wide audience. We researched what they were doing and how they were thinking about automated marketing. One method we used to quickly understand the landscape was watching tutorial videos on how to use the different software. With the abundance of how-to videos on the web, it is a time-effective way to sift through the offerings and find some good explanations.

What we found

There are two prevalent patterns with which to visualize an automated marketing sequence of events. One is an event tree or flow chart diagram, and the second is a vertical list following the timeline of the events. The event tree style diagram is represented well by products such as Drip and Salesforce Pardot. The vertical timeline is more the approach of MailChimp and HubSpot.

Armed with industry insights, we brainstormed solutions based on the two patterns giving ourselves permission to diverge completely from how the existing product looked and functioned. Experimentation led us to create hybrid patterns and look at other navigation patterns that could be refactored to work well with the objectives of the redesign.

2. Define

From the discovery phase, we began to distill the insights from market and user research. Prominent themes emerged from the user research, and key problem areas were targeted. These areas were divided into features that could be ranked by development effort. Once the ranking process was complete, a clearer picture of each feature's cost and ROI emerged.

With that knowledge, features were prioritized. At that point, we also worked on the larger road map, knowing that a UI redesign was imminent and we could carry components over to the new design. Therefore, features that would persist in the road map were prioritized over temporary UX improvements.
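As a rough illustration of that prioritization logic (the feature names, scores, and discount factor below are hypothetical, not the team's actual data), ranking by estimated value-per-cost while discounting temporary UX fixes might be sketched as:

```python
# Hypothetical feature-prioritization sketch: rank by estimated value/cost,
# favoring features that survive the upcoming UI redesign.
features = [
    # (name, estimated_value, estimated_cost, persists_in_roadmap)
    ("inline step hints",     8, 3, True),
    ("drag-and-drop reorder", 9, 8, True),
    ("legacy toolbar polish", 4, 2, False),
    ("segment preview",       7, 5, True),
]

def priority(feature):
    name, value, cost, persists = feature
    roi = value / cost
    # Temporary UX fixes are discounted since they are thrown away
    # when the redesigned UI ships. The 0.5 factor is an assumption.
    return roi if persists else roi * 0.5

ranked = sorted(features, key=priority, reverse=True)
for name, *_ in ranked:
    print(name)
```

The discount on non-persistent features mirrors the road-map reasoning above; in practice the weighting would come from the team's own cost and ROI estimates.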

3. Ideate

We started with sketching and whiteboarding sessions. By expressing ideas quickly, we can iterate quickly to the best ones. I encourage teams to consider ideas no matter how crazy. Creating a safe environment to be creative produces better results, and the team has more fun in the process.

We created drawings first, then created wireframes in Sketch. At this point, we were running sessions with the product managers to review the concepts that gained traction. We began getting feedback on the prototypes but realized that we wanted a way to measure how quickly users could learn the new flows.

4. Test

We wanted to understand how easily users could learn the new pattern of interaction, but building a fully functioning prototype was cost prohibitive based on the complexity of the product. After some brainstorming, we settled on a paper prototype style user study involving six different challenges.

The challenges stated an objective at the top and provided slots for each step. The steps were provided as paper cutouts, and the participant arranged the steps in the way they felt would meet the objective. We wanted to control for a number of variables in the study so we could test them against our hypotheses. The more complex challenges had hints to help the participant, which mimics the way the interface would assist the user in building automated marketing campaigns. The final challenge involved the user explaining in their own words what was happening, so it was analyzed differently than the first five.
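Scoring the first five challenges reduces to comparing a participant's arrangement of cutouts against an answer key. A minimal sketch in Python, with made-up step labels standing in for the real challenge content:

```python
# Hypothetical comprehension scoring: count the slots where the participant's
# arrangement of paper cutouts matches the expected sequence.
def comprehension(arranged, answer_key):
    """Number of steps placed in the correct slot."""
    return sum(a == b for a, b in zip(arranged, answer_key))

# Illustrative automated-marketing sequence, not an actual study challenge.
answer_key = ["trigger: signup", "wait 2 days", "filter: opened email", "send offer"]
arranged   = ["trigger: signup", "filter: opened email", "wait 2 days", "send offer"]

score = comprehension(arranged, answer_key)  # 2 of 4 steps correct
```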

A standard script was written to ensure all participants received the same instructions, and participants were recruited from within the office. Many of the participants had never seen the product, which made new-user testing easier.

Hypotheses

  • Comprehension will increase throughout the challenges
  • Time per step will decrease as participants understand the patterns in the challenges

Information recorded during study

  1. Time per challenge
  2. Prior experience with product
  3. User rated difficulty per challenge

Analysis

We plotted three measurements on a normalized scatter plot across the six challenges using Excel.

  1. Time spent on each challenge
  2. Hardness rating as recorded by the participant
  3. Comprehension: the number of steps the participants answered/arranged correctly

Participants started with good general comprehension, and it increased as they completed more challenges. They directly correlated difficulty with how long a task took, but that did not necessarily translate to how correctly they answered.
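Putting the three measures on one chart requires scaling them to a common range; a min-max normalization across the six challenges can be sketched as follows (the values are illustrative, not the study's recorded data):

```python
# Hypothetical normalization of the three measures across six challenges,
# so time, hardness (1-7), and comprehension share a common 0-1 scale.
def normalize(values):
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

time_per_challenge = [180, 150, 160, 120, 110, 95]  # seconds (illustrative)
hardness           = [5, 4, 4, 3, 2, 2]             # participant rating, 1-7
comprehension      = [3, 4, 4, 5, 6, 6]             # steps arranged correctly

normalized = {
    "time": normalize(time_per_challenge),
    "hardness": normalize(hardness),
    "comprehension": normalize(comprehension),
}
# Each series now runs from 0.0 to 1.0 and can be plotted together on a
# single scatter chart, mirroring what the team did in Excel.
```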

[Charts: normalized scatter plots for Challenges 1–6, showing time per step and hardness ratings (1–7)]
5. Iterate & Implement

As we analyzed the results of the challenges, patterns emerged in how participants would order the pieces. At the beginning of the test we had established three personas based on experience with the product, and we viewed the results through the lens of those personas to see if there were any patterns. Problem areas came into focus and we started to iterate on the design.

Prototypes were created in InVision with two different build paths for the user. Since the product is complex and has many microinteractions, the prototypes were a simplified version of the experience while still representing it well.