System-FirstDesigner

Senior Mode for Android

Make critical phone state obvious + enable one-tap recovery when accidental setting changes happen.

Role

Product Designer — end-to-end

Scope

Research + interaction/UI + prototyping + usability testing + iteration

Platform

Android (concept) + Figma interactive prototype

Primary users

Seniors (60–80) + caregivers (adult children / helpers)

Timeline

~1.5 months

Testing

Remote moderated usability testing

Participants

10 seniors + 8 caregivers (India)

Outcome Summary (0–20 seconds)

Senior Mode simplifies the smartphone experience to reduce anxiety and support load, validated through iterative testing with seniors and caregivers.

Prevent accidental settings changes from turning into “my phone is broken” moments.

Senior Mode makes critical device state obvious (ringer / silent / connectivity basics) and reduces the chance of accidental changes causing stress or missed calls.

Less stress for seniors. Less panic + support load for caregivers.

Seniors regain confidence because the phone behaves predictably. Caregivers get a faster way to restore the intended setup when something goes off-track.

Validated with seniors + caregivers, then iterated and re-tested.

Remote moderated usability testing with 10 seniors + 8 caregivers (India), followed by a V2 iteration and a micro-test to check whether the fixes resolved the earlier issues.

Problem & context

I kept seeing the same pattern (including in my own family): a senior accidentally changes a setting (often Silent, sometimes Wi-Fi, brightness, or rotate) and then everything spirals.

Senior experience

“Something is wrong. I don’t want to touch it.” Confusion turns into paralysis.

Caregiver experience

Missed calls, anxiety, repeated troubleshooting, and time lost doing remote "phone support."

Root issue

Phones hide critical device state behind gestures and subtle UI cues. Seniors don’t fail because they’re “not smart.” They fail because the system doesn’t make state and recovery obvious.

Design goal: Reduce communication blackouts by making phone state (especially sound) instantly legible for seniors and enabling fast, transparent caregiver recovery when something breaks.

Design Principles

Make the phone’s status obvious

Seniors shouldn’t have to interpret toggles—critical state like “Will my phone ring?” must be visible in plain language.


Offer one clear next step

When something goes wrong, the UI should present a single, safe action (fix it now / request help), not multiple paths.


Build trust with transparency

Caregiver help should never feel “mysterious”—show what changed, who changed it, and give the user control (log / disable help).

Approach

1. Scoped the opportunity

Reviewed comparable solutions and sketched early flows to define what the prototype must prove.

2. Built the prototype

Created a fully wired high-fidelity flow for both seniors and caregivers.

3. Designed the study

Defined 6 tasks, success criteria, and metrics (pass/fail, time, errors, confidence/trust/effort).

4. Ran remote sessions

Tested with 10 seniors and 8 caregivers, rotating task order to reduce learning bias.

5. Iterated and re-validated

Synthesized results, shipped a targeted V2, and micro-tested the riskiest flow to confirm improvement.

Key Decisions

01. Make sound state explicit on “Home” and “Controls”

Surface “Phone will ring / Phone is silent” as readable language, not subtle UI state.
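As a rough sketch of how that plain-language mapping could work (the names `RingerMode` and `statusLabel` are illustrative, not from the actual project; a real Android build would read the mode from `AudioManager.getRingerMode()`):

```kotlin
// Illustrative only: enum mirrors Android's three ringer modes.
enum class RingerMode { NORMAL, VIBRATE, SILENT }

// Map device state to the plain-language status the Home tile shows,
// instead of a subtle icon or toggle the user has to interpret.
fun statusLabel(mode: RingerMode): String = when (mode) {
    RingerMode.NORMAL  -> "Phone will ring"
    RingerMode.VIBRATE -> "Phone will vibrate"
    RingerMode.SILENT  -> "Phone is silent"
}
```

The point of the sketch is that the UI string is derived directly from one piece of device state, so the tile can never disagree with what the phone will actually do.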

02. Fix discoverability: add an explicit “Controls” entry (not swipe-only)

Provide a visible Controls entry on Home so seniors aren’t required to discover a gesture.

03. Caregiver fixes apply instantly (for “safe” fixes), with audit + undo

Allow caregivers to apply fixes immediately for low-risk changes (ringer/brightness), and log it on both devices.
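A minimal sketch of the “apply instantly + audit + undo” idea; every name here (`AuditEntry`, `DeviceSettings`, `applyFix`) is an assumption for illustration, not the project’s real data model:

```kotlin
// One logged change; the case study says the log is visible on both
// the senior's and the caregiver's device.
data class AuditEntry(
    val setting: String,   // e.g. "ringer"
    val oldValue: String,
    val newValue: String,
    val changedBy: String  // "caregiver", "senior", or "undo"
)

class DeviceSettings(private val values: MutableMap<String, String>) {
    val log = mutableListOf<AuditEntry>()

    // Safe fixes apply immediately, but every change is recorded.
    fun applyFix(setting: String, newValue: String, changedBy: String) {
        val old = values.getValue(setting)
        values[setting] = newValue
        log.add(AuditEntry(setting, old, newValue, changedBy))
    }

    // Undo reverts the most recent change and records the reversal too,
    // so the audit trail stays complete rather than being rewritten.
    fun undoLast() {
        val last = log.lastOrNull() ?: return
        applyFix(last.setting, last.oldValue, changedBy = "undo")
    }

    fun current(setting: String) = values.getValue(setting)
}
```

Recording the undo as a new entry (instead of deleting the old one) keeps the transparency promise: both parties can always see what changed and who changed it.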

What the testing proved (and where it broke)

Round 1 (full usability)

Seniors: 86.7%

Overall pass rate across senior tasks, but one blocker dominated: fixing Silent via Quick Controls (S2).

Caregivers: 100%

Pass rate across caregiver tasks; confidence ~5.6/7.

Segmentation: 25%

Low device-familiarity seniors had a 25% pass rate on S2, and 100% required help; this is the core risk segment the design targets.

I iterated on the existing designs based on the Round 1 test outcomes.

Iteration

Made “Will my phone ring?” impossible to miss.

What we saw:

Seniors noticed the small “Ringing” chip, but it didn’t translate into confidence about the phone’s actual state—or what to do next.

What I changed:

Replaced subtle chips with a plain-language status tile (“Phone will ring”) and added an explicit Controls entry so recovery isn’t hidden.


Turned Quick Controls into a reliable recovery hub.

What we saw:

In V1, Quick Controls didn’t provide enough reassurance—Sound state felt easy to misread and Recovery didn’t feel like an immediate fallback.

What I changed:

Elevated Sound into a top card (“Phone will ring” + Test ring) and surfaced Restore Senior defaults as a clear, one-swipe-away safety net.


Made the Sound action unmistakable (and confirmable).

What we saw:

Even after finding Sound, users could still hesitate on “which option is active?” and “did it actually change?”

What I changed:

Made Sound label-first (“Phone will ring”), strengthened the Ringing/Silent control hierarchy, and added a prominent Test ring + visible reassurance (silent auto-reverts / restore defaults).
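The “silent auto-reverts” safety net mentioned above could be sketched as a simple time check; the two-hour duration and all names here are assumptions for illustration:

```kotlin
// Assumed duration: how long Silent may stay on before the phone
// flips back to ringing on its own.
const val SILENT_AUTO_REVERT_MS = 2 * 60 * 60 * 1000L  // 2 hours

// Given when Silent was enabled and the current time, decide whether
// the phone should revert to ringing, limiting how long an accidental
// Silent can cause missed calls.
fun shouldRevertToRinging(silencedAtMs: Long, nowMs: Long): Boolean =
    nowMs - silencedAtMs >= SILENT_AUTO_REVERT_MS
```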

Constraints

Recruiting & scheduling:

Seniors + caregivers had to be available together, remotely.

Testing environment:

Remote moderated sessions require careful neutrality (avoid leading, but prevent drop-offs).

Device realism:

Prototype testing via Figma (Mirror/prototype) instead of a real Android build.

Audience limits:

Participants were India-based (strong relevance, but still a sample limitation).

Ethics:

Verbal consent; no personally identifiable info captured (participant IDs only).

Trust, safety, and ethics

Because fixes can apply instantly (without senior consent in-the-moment), the design must communicate control and accountability:

Kill switch / disable remote help

Authenticated, so seniors can opt out at any time.

“What will change” preview

Shown before higher-impact actions (recommended), while keeping instant apply for safe fixes.
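The split between instant and previewed fixes could be as simple as an allowlist; the setting names below are assumptions, not the project’s real identifiers:

```kotlin
// Assumed allowlist of low-risk settings a caregiver may change
// instantly, per the case study's "safe fixes" (ringer/brightness).
val SAFE_FIXES = setOf("ringer", "brightness")

// Anything outside the allowlist gets a "what will change" preview
// before it is applied to the senior's device.
fun requiresPreview(setting: String): Boolean = setting !in SAFE_FIXES
```

An allowlist (rather than a blocklist) is the conservative default here: any setting not explicitly marked safe falls back to the preview path.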

Audit trail on both devices

Plus a visible “what the senior will see” after applying a fix.

What I built

Senior experience

Caregiver experience

Trust + safety layer

Edge-case handling

Onboarding Senior Mode

  • Plain-language Ringing / Silent status
  • Visible Controls entry (no hidden gestures)
  • Guardrailed settings + Restore defaults
  • Request help → Sent confirmation

Reflection

What I learned

Older adults don’t struggle with “complexity” as much as invisible state and hidden interactions—designing for them means making status and next steps explicit.

What I’d do differently

I’d plan a baseline comparison on stock Android from day one and capture more structured observational notes per task (not just scores) to strengthen the story.

What’s next

Turn the concept into a lightweight spec + prototype package for an Android team (permissions model, audit/log requirements, edge cases), and validate with a slightly larger sample across device familiarity levels.


vibe coded by

Manohar Achar