ipushpull

Redesign of the web app dashboard and other features to improve conversions and engagement rates

A redesign of the web app dashboard and other aspects of the app for a financial technology software-as-a-service platform that securely connects desktop data to third-party applications and is used mainly by medium-sized to large financial services companies.

The Proposal

Challenge

How can we show users that the platform is easy to use and help them incrementally engage in different platform functionalities?

Key objective

Restructure areas of the web app to increase conversion and engagement rates.

The Solution

Working with the development team to check the viability of proposed solutions, I redesigned the subscription page and the dashboard and introduced a tutorial on login. Based on data analysis and research, we concluded that onboarding users to new features needed to happen on the platforms they were already using, and I created a strategy to reflect that.

The Role

Product Designer

Tools

Figma, Miro, Adobe Suite, Hotjar, Trello, Google Analytics and Jira

Impact

Increased dashboard engagement

• Involving stakeholders in analysis and decision making at key stages of the project resulted in enthusiastic buy-in throughout, because they understood the improvements required, and the subsequent quantitative results proved the research correct.

Decreased dashboard bounce rates

For this project there was a significant design impact which was quantified with long term measurements. Following heuristic analysis and primary research, the following changes had a positive effect on the business:

• The dashboard redesign and the addition of a tutorial increased engagement by 13% and decreased the bounce rate by 16%.

• A redesign of the subscription page led to a 7% increase in conversions for the platform packages.

• The addition of information to the chat application workflow led to a 5% increase in conversions via that workflow.

Increased package conversions

Ideation

The team and I spent time discussing the web application and the habits of its users based on data already available from the usage metrics. The metrics covered the usage and downloads of specific APIs and plugins, as well as visits to the web app and conversions from within the app. The dashboard was the most visited page within the web app, so the team wanted to enhance it and improve the user experience.

Considering the personas created during the website redesign phase, it was evident that users did not visit the web app very often because they lacked time, preferring to use the platform from the plugin/add-on that lived within Excel and their chat applications such as Teams and Symphony. We decided a heuristic evaluation of the web application was needed to highlight areas requiring urgent attention. There was also not enough information to determine which features within the web app were most used and therefore worth adding to the dashboard; this was addressed during the research stage.

Research

Heuristic evaluation

Goal: analyse the platform to gain insight into problematic areas and allow prioritisation of platform rework.

To form the framework of the evaluation I used ‘Jakob Nielsen’s 10 usability heuristics’ as well as the ‘Heuristic Test Strategy Model’, and asked 8 lovely people from our team who spent the most time with customers (sales, marketing and C-suite) to conduct the evaluation in their own time. In order to speed up the process, I had to compensate some participants with chocolate.

At the beginning of the heuristic analysis I looked at the Google Analytics data and mapped the most frequent user journeys across the app so that user interaction could be simulated systematically. The heuristic evaluation team also met to analyse which parts of the app were the most important. I facilitated the meeting and encouraged discussion by presenting the user journeys I had aggregated, and we went on to create scenarios that would have the most impact for users once completed.

Types of testing:

  • Scenario Testing

  • User Testing

For the analysis of the platform, I created worksheets for the participants outlining 5 scenarios for the testing, plus some general user goals when on the platform to catch scenarios we may not have thought about. Each team member completed the 5 set scenarios, then moved on to freestyle scenarios. They scored each scenario using a score card with the following criteria on a Likert scale of 1-7:

Capability: Can it perform the required functions?

Reliability: Will it work well and resist failure?

Usability: How easy is it to use the product?

Charisma: How appealing is the product?

Security: How well is the product protected against unauthorized use or intrusion?

Compatibility: How well does it work with external apps and APIs?

Performance: How speedy and responsive is it?

Each criterion had a fuller definition and aspects to look out for, and participants were encouraged to jot down specific comments for each scenario. As a team we met for a retro, which I led along with the discussion that followed. I then analysed all the submitted score cards and comments and compiled the findings into a larger Miro document.
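For illustration, the sketch below shows how score cards like these could be rolled up into the overall feature scores reported in the findings; the data, feature names and field names are hypothetical, not the actual submissions.

```python
from statistics import mean

# Hypothetical score cards: one dict per participant per scenario, each with a
# 1-7 Likert rating for the criteria listed on the worksheet.
score_cards = [
    {"feature": "Dashboard", "capability": 3, "reliability": 4, "usability": 2,
     "charisma": 1, "security": 3, "compatibility": 2, "performance": 3},
    {"feature": "Subscription page", "capability": 2, "reliability": 3, "usability": 1,
     "charisma": 1, "security": 2, "compatibility": 2, "performance": 2},
    # ... one entry per participant per scenario
]

def overall_scores(cards):
    """Average every criterion rating across all submitted cards for each feature."""
    ratings_by_feature = {}
    for card in cards:
        ratings = [value for key, value in card.items() if key != "feature"]
        ratings_by_feature.setdefault(card["feature"], []).extend(ratings)
    return {feature: round(mean(values), 1) for feature, values in ratings_by_feature.items()}

print(overall_scores(score_cards))
# {'Dashboard': 2.6, 'Subscription page': 1.9}
```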

Findings

Because we received a lot of feedback, we decided to focus first on the features with the lowest scores and the areas within them that required the most attention.

Dashboard (Overall score: 2.5)

  • Colours do not align with the brand

  • Distracting workflow: The dashboard immediately presents a way to set up the app plugins and input data. This leads the user away from everyday use of the app and can be distracting, as it presents a complex set-up straight away

  • On logging in to the app, no usage data was shown, even though package pricing depends on it

  • On first login, there was no explanation of how to use the platform, which can seem complex at first; a tutorial was needed

  • The set-up workflow gave no estimate of how long it would take. Although there were only 4 steps, some steps had multiple parts and were quite time consuming

Subscription page (Overall score: 1.8)

  • The formatting of the page had too many rows. The fields were set too close together and not very user friendly

  • It was not possible to select which package you need; a package was assigned automatically

  • Impossible for users to upgrade

  • Asking for too much information in this section created a purchase barrier

  • The cancel button and purchase button were too far away from each other

Chat API integration workflow (Overall score: 3.4)

  • There was no information about the platform within the app description inside the chat applications

  • No live data disconnection message within the chat app

  • It was difficult to figure out what features ipushpull has or what the platform does

  • Users didn’t need to go into the web app at all; it was easier to stay in the chat app

Primary research

Feature User Survey

In order to discover which dashboard features would be most useful, I conducted satisfaction surveys embedded in the web app dashboard as an alert banner. The participants were users in the 35-65 age group with the job title Head of Department (the primary platform users) or in other financial services roles that made up the secondary platform users, such as trader or financial analyst. This research was conducted to obtain quantifiable statistics that could inform the new dashboard.

The survey listed features that participants were asked to arrange in rank order, and they could also add their own features and rank them. The second part of the survey consisted of open-ended questions asking for feedback on the dashboard and on any features they had feelings towards, as feelings are easier for users to remember than usage frequency.

Quantitative analysis

21 participants replied to the survey. Quantitative analysis was performed on the ordinal data from the rank-ordering of the features, and I was pleased that the results were statistically significant at the 0.05 alpha level.

Weighted rank aggregation gave an overall score for each item, which I turned into a percentage, allowing me to compare which features were most preferred on average across participants. The features were also ranked from top to bottom in order of score.
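As an illustration of the approach, the sketch below applies a Borda-style weighted rank aggregation to hypothetical rankings and checks significance with a Friedman test; the responses shown are invented, and the Friedman test is my assumption of how significance on ordinal rank data could be checked, not necessarily the exact method used.

```python
import numpy as np
from scipy import stats

# Hypothetical rank-order responses: one row per participant, ranking the seven
# features from 1 (most useful) to 7 (least useful). Real survey data not shown.
features = ["Org users active", "Clients active", "Data usage", "Create page",
            "Recent pages", "Set up workflows", "Page history"]
rankings = np.array([
    [1, 2, 3, 4, 5, 6, 7],
    [1, 3, 2, 5, 4, 7, 6],
    [2, 1, 3, 4, 6, 5, 7],
    # ... 21 rows in the actual survey
])

# Borda-style weighted aggregation: rank 1 earns n-1 points, the last rank earns 0.
n_items = rankings.shape[1]
points = (n_items - rankings).sum(axis=0)
percentages = 100 * points / points.sum()
for name, pct in sorted(zip(features, percentages), key=lambda pair: -pair[1]):
    print(f"{name}: {pct:.0f}%")

# Friedman test (an assumed choice) on the same ordinal data: a p-value below 0.05
# indicates the differences between feature rankings are statistically significant.
statistic, p_value = stats.friedmanchisquare(*rankings.T)
print(f"Friedman chi-square = {statistic:.2f}, p = {p_value:.3f}")
```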

Qualitative analysis

At first it was difficult to determine which type of analysis to perform on the open-ended questions in the survey, as there weren’t enough responses for thematic or content analysis, which in my opinion are the best types of analysis. I ended up using sentiment analysis instead, which was a point of learning for me.

I assessed the tone/sentiment of the responses by categorising them as positive, negative, or neutral. This analysis helps gauge overall attitudes about features based on the open-ended feedback.
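As a small hypothetical sketch, the example below tallies manually coded sentiment labels per feature to summarise attitudes; the features and labels are illustrative, not the actual survey responses.

```python
from collections import Counter

# Hypothetical manually coded responses: (feature mentioned, sentiment label).
coded_responses = [
    ("Data usage", "positive"), ("Data usage", "positive"),
    ("Clients active", "positive"), ("Set up workflows", "negative"),
    ("Set up workflows", "neutral"), ("Page history", "negative"),
]

# Tally sentiment per feature to gauge overall attitude from the open-ended feedback.
tally = {}
for feature, sentiment in coded_responses:
    tally.setdefault(feature, Counter())[sentiment] += 1

for feature, counts in tally.items():
    total = sum(counts.values())
    summary = {label: f"{100 * count / total:.0f}%" for label, count in counts.items()}
    print(feature, summary)
```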

Analysis cross referencing

I compared the rankings of certain features with the themes and sentiments expressed in the open-ended responses. For example, if a feature consistently ranked high in the rank-order questions, I checked the open-ended responses to understand why participants valued that feature, allowing for deeper insight into their answers.

Research outcomes

I collated the findings from the heuristic evaluation and the primary research survey into a new document on Miro, grouping the findings by section or workflow for the platform features and separating the different types of analysis by colour, while giving each feature an overall heuristic score.

Feature Rankings

1 - Tracking which org users are active: 46%

2 - Tracking if clients are active on the platform: 28%

3 - Keeping track of data usage: 23%

4 - Creating a new page: 2%

5 - Accessing most recently used pages: 1%

6 - Setting up workflows: 0%

7 - Accessing page history: 0%

“When I use the dashboard it’s mostly to check how much data the clients are using”

“I like to see which platforms are being used by the team and if we need a bigger package”

Recommendations

  • Data analysis shows that users mostly don’t set up the main workflows through the web app, and instead do it through the API. Therefore the current dashboard workflow is redundant

  • The features that were most popular with users do not currently reside on the dashboard; the dashboard can be redesigned with these in mind and aligned to the new branding

  • For users, the features which would be the most useful on the platform would be: keeping track of data usage and tracking which clients or organisation users are active on the platform in order to improve onboarding

  • On first login, a tutorial is required to walk users through the platform

  • The subscription page needs to be redesigned to make it easier for users to start a plan or upgrade

  • The company information on the subscription page needs to be relocated

  • Within the chat APIs workflow, platform features information needs to be added

  • A warning message needs to be added when the live data connection isn’t active for the chat APIs

Planning

Features

From the research and analysis, it transpired that a few features were in dire need of a redesign. The following features needed to be redesigned or added:

  • The dashboard: redesign to match the feature rankings

  • Platform tutorial: to pop up on login

  • Subscription page: redesign and streamlining

  • Features information: to be added within the chat platforms

  • Live data connection warning: to pop up when the connection isn’t active within chat APIs

The team and I met to discuss the scope of the changes and to make sure all stakeholders understood their business value. We prioritised the features in line with business value, urgency and any dependencies, adding them into sprints or creating new sprints based on how long the changes would take. The dashboard required a technical review meeting, after which the work was green-lit and the design phase commenced.

Prototyping

To translate the user insights into meaningful design solutions that would work well on both desktop and mobile, I created wireframes to plan out the interaction design, visualise the layout and clarify the functionality that would align with the research.

WIREFRAMES

Mid-Fidelity prototypes

To show the functionality of the redesigns to the stakeholders and fine tune the experience before committing to the full design, I created mid-fidelity prototypes.

The dashboard

Subscription page

Platform tutorial

Connection alert

High-Fidelity prototypes

The dashboard

Subscription page

Platform tutorial

Connection alert

A/B Testing

Goal:

To improve the overall dashboard user engagement

Metrics:

  • Time on page

  • Bounce rate

To ensure accurate results, only incremental changes were made to the dashboard; the changes were introduced in stages and tested. When more features were added, A/B testing was carried out once more.

Audience segment:

Existing users on desktop and mobile, split 50/50 between the 2 versions.

Sample:

The A/B test required a large sample size (based on Optimizely’s sample size calculator) of 2,900+ users per variation to reliably detect the expected effect. Therefore, 4 stages were carried out over a period of 6 months in order to get enough traffic to analyse the results.
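For context, the sketch below approximates a per-variation sample size with a standard two-proportion calculation; the baseline rate and minimum detectable effect are illustrative assumptions, and the project relied on Optimizely’s calculator rather than this formula, so the numbers will differ.

```python
from scipy.stats import norm

def users_per_variation(p_baseline, mde, alpha=0.05, power=0.8):
    """Approximate sample size per variation to detect an absolute change of `mde`
    in a proportion metric (two-proportion z-test); the inputs are assumptions."""
    p_variant = p_baseline + mde
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    variance = p_baseline * (1 - p_baseline) + p_variant * (1 - p_variant)
    return int((z_alpha + z_beta) ** 2 * variance / mde ** 2) + 1

# e.g. a 50% baseline bounce rate and a 5-point minimum detectable effect
print(users_per_variation(0.50, 0.05))  # ~1,560 users per variation with these inputs
```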

Results:

Each test reached statistical significance at the 95% confidence level.

For each stage there were 2 versions, one being the original and the other the changed dashboard. The results reported are the differences between the previous version and the new version in bounce rate and time spent on page. These versions were compared so that only the new changes were tested, not the cumulative changes.
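As a sketch of how per-stage significance could be verified, the example below runs a two-proportion z-test on bounce rate using illustrative counts in the region of the stage 1 traffic; the real per-version counts are not published here, and the project’s tooling performed the equivalent calculation.

```python
import math
from scipy.stats import norm

def bounce_rate_z_test(bounces_a, users_a, bounces_b, users_b):
    """Two-proportion z-test comparing bounce rates of version A and version B."""
    p_a, p_b = bounces_a / users_a, bounces_b / users_b
    p_pool = (bounces_a + bounces_b) / (users_a + users_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / users_a + 1 / users_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - norm.cdf(abs(z)))
    return p_a, p_b, z, p_value

# Illustrative counts only: ~5,970 stage 1 users split evenly, with version B
# bouncing roughly 5 points less often than version A.
p_a, p_b, z, p = bounce_rate_z_test(1343, 2985, 1194, 2985)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}  p = {p:.4f}")
```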

Insight:

One idea from the C-suite was to replace the A/B testing with a Beta version containing all the changes in one go. However, this would not have allowed testing in increments, and the specific type of persona that chose the Beta version might have biased the results. I therefore persuaded the C-suite to carry out A/B testing, which gave me results per feature introduced.


Stage 1

In this stage, new functionality was introduced at the top of the page that allowed users to see their usage summary. The functionality that was already there was moved down the page. The new version was marked as version B and the original page as version A.

Combined users: 5,970

Difference - Bounce rate: ↓5%

Difference - Time on page: ↑4%

Stage 2

For version B, another new piece of functionality was introduced underneath the feature from stage 1 and above the original features. Version B from stage 1 was used as version A, as the stage 1 changes had produced a positive result.

Combined users: 5,912

Difference - Bounce rate: ↓8%

Difference - Time on page: ↑7%

Stage 3

In this stage, the features already added remained the same, but we removed some of the older “legacy” features from the original dashboard. This became version B for testing, while version A retained everything from stage 2 for comparison.

Combined users: 5,898

Difference - Bounce rate: ↓1%

Difference - Time on page: No change

Stage 4

The last stage of the testing incorporated a new tutorial, which popped up automatically when users navigated to the dashboard for the first time. Although this feature would be most useful for users who had not seen the new features, I could not test the tutorial without testing the new features first.

Combined users: 5,985

Difference - Bounce rate: ↓2%

Difference - Time on page: ↑2%

18% of users clicked on the tutorial button located at the top of the dashboard.


Test conclusions

Across the four stages, the bounce rate decreased by 16% overall and time on page increased by 13%. The changes made to the dashboard were therefore positive overall and were kept. There was also evidence that the new tutorial had gained traction and that users were engaging with it.

Heat maps

The one questionable aspect of the testing was stage 3, where removing the old features did not produce much change in user behaviour according to the test metrics. To get more insight, at the end of that stage I looked at Hotjar heat maps of the two versions to find out where users were clicking during their visits. These showed that users were not very engaged with the old features, which received only a few clicks, so I qualified them as “cold”. This insight confirmed that removing those features during this stage was the correct decision.

After stage 4, all the changes made to the dashboard were kept.

Insight: If I had to do this again, I would run another satisfaction survey after the A/B testing. Quantitative analysis is usually quite dry, and gathering insight into users’ affective responses would show how strongly the changes resonated emotionally, which makes for stronger connections with users.

Feature metrics

Subscription page

One reason the subscription page was redesigned was to increase its conversion rate. It scored 1.8 in the heuristic analysis, the lowest score of all. Along with the redesign, the developers and I made it possible for users to change their package.

Conversion rate

Before the redesign: 62%

After the redesign: 69%

Change: ↑7%

The conversion rate increased by 7%, meaning the redesign of the page made it easier for users to sign up for a package or upgrade.

Features information

Another planned improvement was to add features information within the chat platforms to improve conversions.

Conversion rate

Before the redesign: 46%

After the redesign: 51%

Change: ↑5%

There was a 5% increase in the conversion rate, showing that this addition made it marginally easier for users to subscribe. Since the work only required UI design on my part and no development time, it had a good return on time investment.


Other Work

IPUSHPULL WEB & MOBILE DESIGN

IPUSHPULL DESIGN SYSTEM & BRANDING

FUND FAST WEB & MOBILE DESIGN