Multitask is a desktop productivity application for virtual meeting hosts and participants of various ages and backgrounds.
I was in charge of creating mockups and prototypes, synthesizing user feedback into functional designs, developing user research plans, and acting as an advisor to developers implementing the designs.
The product has a beta available for download now and is estimated to reach the Mac App Store by the end of 2022.
Employees sit through meetings. A lot of meetings. Since COVID hit in 2020, virtual meeting software has kept businesses and other organizations communicating, and has turned many formerly in-office positions remote.
At the same time, meetings that aren't run well by their hosts get in the way of employees' work.
The problem: How can I control my meetings better?
Any solution that massively distracts employees from their workflow isn't good enough, but a solution that lacks vital functions doesn't work either: we need a solution that is both minimally invasive and effective.
The solution: Create a dashboard to access functions outside of the meeting.
We collected data on our MVP from both unmoderated video testing and prototype testing with a survey of employees, and the results were better than expected.
With the rise of remote work in 2020 and its presence growing in 2022 and beyond, many people are allowed to work from home when previously they were not.
Employees are often invited into meetings where they’re just listening in rather than speaking for a portion or the entire meeting. Should a user need to hide the meeting window to check emails or other business, it becomes difficult to get back to the meeting.
How might we allow access to the features of virtual meeting applications outside of the application for participants so that they can quickly switch between tasks during meetings?
Before diving into competitive analysis, we interviewed four participants, ranging from 18 to 65+ years old, about how they use virtual meeting software in order to:
Participant interviews were quick and easy, allowing short, one-on-one sessions that informed our user stories and feature ideas.
Our interview goal was to verify who and what we were solving for, and to determine how employees use meeting software.
After initial secondary research and validation, the team identified and synthesized the customer’s context and needs into user stories.
I chose user stories over user personas because the people we interviewed held multiple, yet similar, goals for what they wanted.
User scenarios provided the groundwork for further research into our MVP design. Each bullet point in the user scenarios was also looked into further in three areas.
I designed two user flows to show how a user can interact with the MVP–opening the dashboard, mute/unmute, and enable/disable camera–so we had an up-to-date reference of the path a user takes when using our product.
Based on the interviews, user stories, and user flows, we revised the "How might we" problem statement into a more concise question that better encompasses the problem space:
How do we provide more control over a user's meeting environment?
Once armed with an understanding of how we wanted our product to work and which types of employees to serve, the challenge was to target audience needs and look for business opportunities in other products. Following Jakob's Law, we thought it best to examine current solutions, keep our functionality similar to those products while adding our own unique solution, and see what we should do to solve the problem.
The team met and settled on a single product direction, since every other company offered something similar and the approach proved feasible.
We would create a dashboard that syncs with and provides access to meetings.
Providing a way to access meeting settings regardless of what application employees were in was key to reducing frustration and pain points.
The product manager, with my input and advice, defined how we would measure the solution's success and reception at launch. After another meeting, we settled on the following KPIs as relevant for tracking product performance.
Based on the interviews and analyses of surveyed participants, and given the product's eight-week deadline to ship the MVP, we generated a list of desirable features and ranked them using the RICE method.
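RICE scores a feature as Reach × Impact × Confidence ÷ Effort, so higher-reach, higher-impact work surfaces first. A minimal sketch of the calculation follows; the feature names and numbers are hypothetical, not the team's actual backlog:

```python
# RICE prioritization: score = (Reach * Impact * Confidence) / Effort.
# Features and estimates below are illustrative only.

def rice_score(reach, impact, confidence, effort):
    """Reach: users/quarter, Impact: 0.25-3, Confidence: 0-1, Effort: person-weeks."""
    return (reach * impact * confidence) / effort

features = [
    ("Mute/unmute from dashboard", 800, 2.0, 0.8, 2),
    ("Camera toggle",              800, 1.5, 0.8, 2),
    ("In-dashboard chat",          400, 1.0, 0.5, 5),
]

ranked = sorted(features, key=lambda f: rice_score(*f[1:]), reverse=True)
for name, *params in ranked:
    print(f"{name}: {rice_score(*params):.0f}")
```

With these numbers, mute/unmute (score 640) outranks the camera toggle (480) and chat (40), which mirrors how an eight-week deadline forces the cheapest high-impact features to the top.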
The team settled on a dashboard as a solution, but there were multiple ways to go about it. Rather than spend excessive time deliberating, I drafted some ideas to power through and get the team to a starting point.
We went with Wireframe 2: a dashboard you can show or hide with your mouse, containing the features we offer. Developers confirmed it was feasible within the deadline.
I outlined the wireframes of the MVP and showed the product's states with meetings active and inactive on the user's screen. The wireframes are set up to emulate using the dashboard on a regular macOS desktop computer.
A problem mentioned in the secondary interviews is that a user can "have trouble sometimes viewing a chart or a graph" on their screen. From the wireframe stage onward, we addressed this visibility concern by increasing the button icon sizes so they are easier to see.
We don't know the employees we're testing that well, so I assumed:
We ran unmoderated user studies with four people, saving the video recordings and transcripts.
We used unmoderated user studies because, while we couldn't walk through the product with employees, we wanted to see how they actually interacted with it, with proof of that interaction, rather than relying heavily on their words and submitted answers.
Each participant was sourced through userbrain.com and paid 35 USD to go through a high-fidelity prototype of MultiTask. They completed tasks, such as muting themselves in a meeting and accessing the meeting, while I checked indicators such as time on task and error rate.
For each test run on Userbrain, I received a video transcript of how the participant went through the prototype, as well as a CSV file of notes for each test.
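Aggregating indicators like time on task and error rate from those note files is straightforward. The sketch below assumes a simplified CSV layout (participant, task, seconds, errors); Userbrain's actual export format may differ:

```python
import csv
import io
from statistics import mean

# Hypothetical sample of per-task test notes; real exports will look different.
sample = io.StringIO(
    "participant,task,seconds,errors\n"
    "p1,mute,12,0\n"
    "p2,mute,9,1\n"
    "p1,open_dashboard,6,0\n"
    "p2,open_dashboard,8,0\n"
)

rows = list(csv.DictReader(sample))
for task in sorted({r["task"] for r in rows}):
    subset = [r for r in rows if r["task"] == task]
    avg_time = mean(int(r["seconds"]) for r in subset)          # time on task
    error_rate = sum(int(r["errors"]) for r in subset) / len(subset)  # errors per attempt
    print(f"{task}: avg {avg_time:.1f}s, error rate {error_rate:.0%}")
```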
I originally assumed text plus icon was fine, but participants were comfortable with, if not preferred, only the icons visible.
The testing results supported this change, so the icons were edited accordingly from wireframes to final design; icons-only on the dashboard was a developer constraint as well.
I worked with two developers who coded the designs I made and adhered to any constraints and capabilities they had.
Handing off infeasible designs to developers would cause far more frustration than submitting feasible designs from the start.
There were a few main constraints in development:
I created a style guide and component guide to consolidate all design-relevant items in one place for developers and the manager to access. This helped with developer handoff because every design-to-code reference was in a single location.
In the initial eight-week period we had, only three features made their way into the dashboard before launch.
These are the most important features, so the MVP is considered a success: we met our initial design and development goals. The MVP design is what I show in this case study.
This is the dashboard employees interact with. It has the ability to show and hide itself by hovering over it with a mouse, and it has toggle states for its main features.
Hover-to-show was our method for keeping the dashboard "minimally invasive," as our competitive research suggested: it appears in full only when the employee makes a conscious decision to interact with the product.
Employees first create a new meeting or join an existing one to sync the dashboard with their meeting. With the product installed and set up correctly, users don't have to worry about re-syncing every time.
If onboarding fails and the product doesn't sync, nothing else matters; we needed to get this step right before any other.
With the meeting open, all an employee has to do is hover their mouse to the left hand side of the screen over the small, white bar.
A user can mute their microphone by clicking a button on the dashboard, and can disable their camera the same way, with a single click.
The dashboard maintains all functional statuses even when the meeting is minimized.
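The behavior above amounts to a small state model: toggle states live on the dashboard itself, so they persist whether or not the meeting window (or the dashboard) is visible. A minimal sketch, hypothetical and not the shipped implementation:

```python
from dataclasses import dataclass

# Hypothetical sketch of the dashboard's state model: mute/camera are
# toggle states, and visibility follows mouse hover independently of them.

@dataclass
class Dashboard:
    muted: bool = False
    camera_on: bool = True
    visible: bool = False  # shown only while the user hovers the edge bar

    def toggle_mute(self) -> None:
        self.muted = not self.muted

    def toggle_camera(self) -> None:
        self.camera_on = not self.camera_on

    def on_hover(self, hovering: bool) -> None:
        self.visible = hovering  # show on hover, hide on leave

d = Dashboard()
d.on_hover(True)     # reveal the dashboard
d.toggle_mute()      # mute from the dashboard
d.on_hover(False)    # dashboard hides; meeting may be minimized too
print(d.muted)       # mute status persists while everything is hidden
```

Keeping status separate from visibility is what lets the dashboard report correct mute/camera states even when the meeting window is minimized.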
Engaging with a team of product managers and developers means handling constraints outside of the designer’s control.
The team wanted a multi-colored set of assets and buttons to make them look pretty, but I held my ground on keeping the icons and color palette simple and high contrast: both the hover target and the dashboard itself need to be visible against a variety of screen backgrounds.
Instead of looking more like our competitors, the product took on an appearance similar to a browser bar in Firefox, Chrome, etc.: a simple background with one- or two-color buttons and no accompanying text on the buttons.
Each week during the sprint period, I presented my designs and conversed with the product manager and developers on feedback and more insights.
As the product designer, I held around 50% control over the overall design of the product. The PM and developers freely offered suggestions and expressed technical concerns, which I accommodated in iterations and confirmed in user testing. I sometimes had to defend my position on what should go into the product and how it was presented, and did not always win.
While the team did want to make the dashboard similar to competitors like Elgato's Stream Deck, participants accepted the high-fidelity designs presented to them during testing. There were no comments, complaints, or critical reviews from any participant. The simple design kept contrast high, reduced distractions from work, and gave us no reason at the time to edit the overall design further.
While the term allotted was an eight-week sprint, the team behind the product believes there's business value in it and will continue development with the originally planned features.
This isn't the first time I've worked with a cross-functional team, but it is the first time I've worked with a typical Agile product team. I spent a good deal of time discussing and revising designs as requirements changed, research proved designs correct or incorrect, and development of the product necessitated design changes.
If there's any lesson to be learned from this experience, it is that product ownership does not belong solely to the product designer. Ownership is shared among the managers, developers, designers, and multiple other roles within the company. As stakeholders in the product, their perspectives and ideas broadened what could be done and improved the design overall.
Being the only designer means being the subject matter expert on everything design. I needed to consider edge cases, figure out how to evaluate employees and turn the research into designs, and answer any "what if?" questions that popped up throughout the process. While I didn't have another designer on the team to work with, the project manager, developers, and other non-designer roles could still offer design advice.