Usability Testing for a Tobacco Cessation App
Objective: Evaluate the usability and perceived usefulness of the application among diverse user groups with varying tobacco use preferences.

Project Overview
Team: Research team of 3 | Design handled by in-house studio
Timeline: 8 weeks (2024) | Sample: 100 users across 4 Indian cities
My Role: Lead UX Researcher (Usability Testing Lead)
Methods: Mixed-methods usability study (quantitative surveys, qualitative interviews, observational testing)
The Problem: What Were We Trying to Solve?
The application was designed as a digital companion to help individuals quit tobacco consumption through personalized quit plans, NRT awareness, and expert consultations. However, before launching at scale, the product team needed to validate whether the designed experience actually worked in users' hands.
Our research mandate:
Identify usability friction across the entire user journey (onboarding → quit plan → expert booking).
Evaluate whether users understood and could act on NRT (Nicotine Replacement Therapy) guidance.
Measure whether users could navigate the app independently without external support.
Understand cultural and behavioral nuances across diverse Indian tobacco users (urban/rural, SLT vs. smokers, age, gender).
This was a post-design, pre-launch validation study; our findings would directly inform launch readiness and immediate design iterations.

My Role: Lead UX Researcher
My responsibilities:
Research Strategy & Planning
Defined the study scope and success metrics in collaboration with the product team.
Selected a stratified sample of 100 users across 4 major Indian cities (Pune, Lucknow, Delhi, Nashik) to capture urban/rural diversity, gender balance (54M/46F), and tobacco type (SLT vs. smoking).
Ensured all participants had self-driven motivation to quit (critical for realistic behavior).

Methodology Design
Designed a mixed-methods protocol combining:
Quantitative: 0-9 rating scales for ease-of-use across 6 core tasks.
Qualitative: In-depth interviews exploring motivations, fears, and emotional responses.
Observational: Screen recordings to catch "silent failures" (e.g., users who struggled but didn't verbalize it).
Created task scenarios mirroring real-world usage: sign-in → create quit plan → explore resources → book expert consultation.
Execution & Moderation
Co-moderated 100 remote and in-person sessions, adapting questioning techniques for users with varying digital literacy.
Managed logistics across 4 cities, coordinating with local research partners.
Analysis & Synthesis
Analyzed quantitative ease-of-use scores segmented by demographics (age, gender, tobacco type).
Coded qualitative transcripts to identify recurring behavioral patterns and mental models.
Created a prioritized findings report categorizing issues by severity (blockers vs. enhancements).
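To illustrate the kind of demographic segmentation we ran on the ease-of-use scores, here is a minimal sketch in plain Python. All records, field names, and score values below are hypothetical stand-ins, not the study data:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical session records: one row per participant x task,
# with a 0-9 ease-of-use rating and demographic tags.
sessions = [
    {"age_band": "18-30", "tobacco": "SLT",     "task": "create_quit_plan", "ease": 7},
    {"age_band": "18-30", "tobacco": "smoking", "task": "create_quit_plan", "ease": 8},
    {"age_band": "41-55", "tobacco": "SLT",     "task": "create_quit_plan", "ease": 4},
    {"age_band": "41-55", "tobacco": "SLT",     "task": "book_expert",      "ease": 5},
]

def mean_ease_by(records, segment_key):
    """Average ease-of-use score per task, split by one demographic field."""
    buckets = defaultdict(list)
    for r in records:
        buckets[(r[segment_key], r["task"])].append(r["ease"])
    return {key: round(mean(scores), 2) for key, scores in buckets.items()}

print(mean_ease_by(sessions, "age_band"))
```

The same helper can be rerun with "tobacco" or "gender" as the segment key, which is how score gaps between SLT users and smokers surface without any extra code.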
Strategic Recommendations
Translated user pain points into actionable design and technical requirements for the studio team.
Presented findings to stakeholders with video clips and direct user quotes to build empathy and urgency.

The Process: How We Tested
Recruitment & Sampling
We recruited 100 daily tobacco users with genuine intent to quit, ensuring diversity across:
Geography: Urban metros (Pune, Delhi) vs. Tier-2 cities (Lucknow, Nashik).
Gender: 54 male, 46 female (we oversampled women, who are often underrepresented in tobacco research).
Tobacco type: Mix of smokeless tobacco (SLT) users and smokers, as behavior and motivations differ.
Age range: 18-55, to capture different life stages and digital comfort levels.
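Tracking a stratified quota like this across four cities is mostly bookkeeping. A minimal sketch of the kind of quota check we used (targets and panel data below are illustrative, not the actual recruitment tracker):

```python
from collections import Counter

# Illustrative target quotas from the sampling plan.
targets = {"gender": {"M": 54, "F": 46}}

def quota_gaps(panel, targets):
    """Compare recruited counts against target quotas.

    Returns the remaining open slots per cell; a negative value
    would flag over-recruitment in that cell.
    """
    gaps = {}
    for field, quota in targets.items():
        counts = Counter(p[field] for p in panel)
        gaps[field] = {level: want - counts.get(level, 0)
                       for level, want in quota.items()}
    return gaps

# Hypothetical mid-recruitment panel: 50 men and 46 women signed up so far.
panel = [{"gender": "M"}] * 50 + [{"gender": "F"}] * 46
print(quota_gaps(panel, targets))  # → {'gender': {'M': 4, 'F': 0}}
```

In practice the targets dictionary would also carry city, tobacco-type, and age-band quotas, and the same function reports open slots for each.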

Key Insights
1 · The Quit Plan Was Invisible After Creation
58% of users liked the idea of a quit plan. But when asked to find it 10 minutes later, most couldn't.
Why: Users see chatbots as ephemeral conversations, not places that store things. Once the chat scrolled, the plan was mentally "gone."
Fix: Move the quit plan out of the chatbot into a persistent "My Dashboard" that users can always find, edit, and own.

2 · Stories Were Too Fast to Be Useful
Users loved the brief, Instagram-style health tips. But screen recordings revealed they were retaining almost nothing.
Why: Slides auto-advanced too fast, and the "Tap & Hold to Pause" gesture was invisible to most users, especially older or lower digital-literacy participants who had never used Instagram Stories.
Fix: Mandatory 8-second minimum per slide + a first-use tooltip explaining the pause gesture + a Stories archive for revisiting content.
3 · Human Connection Was the Real Feature
Users were scared to book an expert consultation. "It feels like a doctor's appointment. I'll be judged."
After one session? 97% satisfaction. They called the expert a "friend," not a doctor. This single feature drove the strongest motivation to quit we observed in the entire study.
Fix: Surface "Meet Your Expert" previews and peer testimonials in onboarding, before users reach the booking screen, to remove the fear barrier early.

Results
75% overall task success rate, but with critical recall failures (see Insight 1)
97% satisfaction after expert consultation sessions
58% positive first reaction to the quit plan concept
34% of users requested Hindi or Marathi language support
What We Recommended
Fix before launch:
Decouple the quit plan → persistent dashboard
Fix Stories speed + add pause gesture tooltip
First sprint post-launch:
Multi-lingual support (Hindi + Marathi): 34% of users asked for it
Add peer "Success Stories": the #1 content request
"Meet Your Expert" preview in onboarding
Reflection
A 75% task success rate can hide a product-killing problem. The quit plan "passed" technically, but users couldn't find it 10 minutes later. I now always include recall testing as a standard step in usability studies.
The bigger lesson: in emotionally loaded domains (health, identity, safety), human connection is the UX. The expert consultation outperformed every interface feature we tested. Features enable trust. People build it.