Converting Transport Screen Text into Clear, Audible Speech
RideOnTime is a mobile app that helps visually impaired commuters by turning unreadable LED transit signs into clear text and audio. Developed through inclusive co-design, it is supported by the UCL Venture Builder under our startup, Solora Limited, and was awarded Best Pitch at the UCL SustainTech Competition for its focus on accessible and equitable mobility.
iOS mobile app
Co-founder, Head of UX and Product
Led a cross-functional team of four across UX, development, marketing, and outreach. Owned the end-to-end design process, from user journeys and specifications to testing and iteration.
UX research, wireframing, prototyping, Figma, interviews, usability testing, accessibility design, analytics (Amplitude), stakeholder collaboration, co-design workshops
THE PROBLEM
LED signboards often appear distorted through a standard phone camera. For partially sighted commuters, zooming in rarely helps: the text remains hard to interpret. This makes independent travel stressful and disempowering.
‘A’ lives with central vision loss and struggles to access real-time transit information independently. Despite existing tools, she often relies on others for help.
Refined the design for clarity and flow, testing with visually impaired users to ensure accessible onboarding. Worked closely with the developer to build the app.
After rounds of QA, close collaboration with our developer, and legal review of our terms and conditions, we successfully launched the app as a pilot.
We reached out to non-profit organisations, which responded positively to our vision and to the accessibility gap the app addresses. Several have agreed to partner with us.
With over 100 users so far, we use Amplitude and UXCam to monitor behaviour, uncover pain points, and guide ongoing improvements based on real usage data.