Telecommunications B2B User Research
Overview
One portal, two logins, thousands of visitors, and a surprising source of confusion…
The BT Business My Account portal serves as the primary login point for all BT Business customers, from small to global enterprises.
Traffic: ~19,000 views per week (Source: Contentsquare).
Structure: Two separate login options:
SMB – businesses with fewer than 250 employees. These businesses log in to My Account using their BT Business login.
Large/Global – businesses with 250+ employees. These businesses log in to My Account using their BT Global login.
The Problem
When two doors look the same, users will try the wrong one
Previously, the My Account portal presented two login options: one for SMBs and one for large/global businesses, but neither description made it clear who should use which. Customers often chose incorrectly, since the page didn’t mention that:
SMB customers use a different set of tools and systems than larger organisations.
Large/global customers access their accounts through BT Global Services.
This confusion created friction, wasted time, and led to failed login attempts.
Process
Starting with questions before answers
1. Stakeholder Alignment
I first met with the Senior Content Designer to clarify goals: to determine whether users could identify the correct login, to assess the impact of the new headers, and to observe how users recovered from mistakes via an unhappy path.
2. Hypothesis
The content team updated the page to explicitly state system access:
SMB login: “Select this option if you access your account through BT Business.”
Large/Global login: “Select this option if you access your account through Global Services.”
We hypothesised that this clearer distinction would help customers understand which login applied to them, reducing incorrect clicks.
3. Test Setup
I created two parallel tests, one for SMB customers and one for large/global customers, and ran them on UserTesting.com, focusing on the real-world task of logging in to view a bill.
4. Execution
I validated the test with the content designer, launched it, and captured both behavioural and verbal feedback.
Test Questions
Designing clear, non-leading tasks to uncover user behaviour
To ensure the usability test captured both behavioural actions and perceptions, I created a five-part task flow. Each question was designed to avoid leading participants, while capturing both what users did and how they felt.
Task 1: Behavioural Task (Observed users’ natural navigation and first-click choices.)
“Show us how you would use this website to log in and view your recent bill. Please remember to think out loud.”
Task 2: Verbal Reflection (Captured immediate reflections on the process and self-corrections.)
“How did that go? What would you do next to log in and find your recent bill?”
Task 3: Rating Scale (Quantified perceived ease of use alongside observed performance.)
“On a scale of 1–5, how easy or difficult was this task? Please explain your rating.”
Task 4: Open-Ended Feedback (Gathered nuanced insights on friction points beyond task success.)
“Did you encounter any difficulties or frustrations whilst navigating through this task? If so, please explain them.”
Task 5: Image Review (Tested comprehension of page structure and content clarity, using a static image of the prototype.)
“Spend a minute reviewing this page. How well does this page help you to log in? What might make it easier for you?”

Quantitative Findings
Clarity wins. Mostly.
The usability test provided both quantitative evidence and qualitative feedback, showing how the updated content influenced user behaviour and perceptions of the login process.
| User Group | Participants | Correct Login (First Attempt) | Average Time on Task |
|---|---|---|---|
| SMB (<250) | 6 | 6/6 | 1 minute 44 seconds |
| Large/Global (≥250) | 6 | 4/6 | 2 minutes 05 seconds |
| Total | 12 | 10/12 (83%) | – |
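The headline figures above are simple ratios. As a quick sanity check, the calculation can be sketched as follows (the numbers come from the results table; the field names are my own):

```python
# Illustrative recalculation of the first-click success rates reported above.
# Times are the reported group averages, converted to seconds.
results = {
    "SMB (<250)": {"participants": 6, "correct_first": 6, "avg_seconds": 104},       # 1m 44s
    "Large/Global (>=250)": {"participants": 6, "correct_first": 4, "avg_seconds": 125},  # 2m 05s
}

total_participants = sum(g["participants"] for g in results.values())
total_correct = sum(g["correct_first"] for g in results.values())

for name, g in results.items():
    rate = g["correct_first"] / g["participants"]
    print(f"{name}: {g['correct_first']}/{g['participants']} ({rate:.0%}) correct on first attempt")

print(f"Overall: {total_correct}/{total_participants} ({total_correct / total_participants:.0%})")
# Overall: 10/12 (83%)
```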
Qualitative Insights
User voices confirm the shift
Quotes from the tests include:
“Really straightforward to log in… it's pretty clear how you would log in.” - SMB Participant 2
Although two of the large/global business participants clicked the SMB login first, one of them immediately realised the error and corrected themselves quickly.
Impact
A solid step forward, with room to refine
The new content improved success rates, especially for SMB users.
Recommendations included emphasising employee counts and exploring stronger visual separation between options.
The content changes can be viewed on the live site: https://business.bt.com/login-select/
User Flow Diagram
A straight path, and what happens when it’s not.
Happy path: The user logs into their correct portal (e.g., an SMB owner logs into BT Business My Account to view a bill). They land on the My Account login page and choose the My Account for small and medium businesses portal. Once they enter their login credentials, they are redirected to their dashboard and navigate to view their bill.
Unhappy path: The user attempts to log in via the incorrect portal, e.g., a large/global business director wants to log into BT Global Services to view a bill, but clicks on BT Business My Account instead. When they enter their login credentials, they receive an error message stating that their login details were incorrect, because they are using the wrong portal.
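The two paths above can be sketched as a small routing function. This is a minimal illustration of the flow as observed in testing, not BT's actual implementation; the account records, email addresses, and message strings are all hypothetical.

```python
# Minimal sketch of the portal-selection flow described above.
# Hypothetical records of which portal each customer's credentials belong to.
ACCOUNTS = {
    "smb_owner@example.com": "BT Business",
    "global_director@example.com": "BT Global Services",
}

def attempt_login(email: str, chosen_portal: str) -> str:
    """Return the outcome of a login attempt via the chosen portal."""
    correct_portal = ACCOUNTS.get(email)
    if correct_portal is None:
        return "Error: account not found."
    if chosen_portal != correct_portal:
        # Unhappy path: the credentials are valid, but for the other portal,
        # so the user only sees a generic 'incorrect details' error.
        return "Error: your login details were incorrect."
    # Happy path: redirect to the dashboard, where the bill is available.
    return "Redirected to dashboard: view your recent bill."

print(attempt_login("smb_owner@example.com", "BT Business"))          # happy path
print(attempt_login("global_director@example.com", "BT Business"))    # unhappy path
```

The key observation the sketch captures is that the error message reports "incorrect details" rather than "wrong portal", which is exactly why the original page copy had to carry the burden of disambiguation.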

Conclusion
Clearer content led to measurable success.
The usability test demonstrated that small but targeted content changes significantly improved login accuracy. With an 83% overall first-click success rate and 100% success among SMB participants, the updated design proved far more effective at guiding users to the correct login.
While further refinements to visual design could raise accuracy even higher, the test validated the content change as a successful step toward reducing customer frustration and misdirected logins.
Reflection
Why this project mattered, and what I learned from it
This project reinforced the value of clear, unambiguous content in user interfaces and how small copy changes can lead to measurable behavioural shifts.
From a personal perspective:
I strengthened my skills in test design, ensuring questions were unbiased and tasks were realistic.
I saw the importance of including unhappy paths to understand how users recover from errors.
I learned how collaborative alignment with content designers early on can make the test both relevant and actionable.
If I were to repeat the study, I’d aim to pair content changes with visual design enhancements in the same test, allowing us to measure combined impact and potentially achieve a near 100% first-click success rate.
I would like to thank all test participants who contributed their thoughts to this study.
Thank you for reading.