Stephen Hicks

Prototype

Our latest prototype uses an infra-red depth camera to collect information about the distance to nearby objects. We present this on a pair of OLED displays.
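To illustrate the kind of processing this involves, here is a minimal sketch (hypothetical Python/NumPy code, not the prototype's actual software) that converts a metric depth frame into a coarse brightness image in which nearer objects appear brighter. The 12 × 8 output size and 3 m working range are assumptions, chosen only to match the 96-pixel-per-eye display used in the pilot study described below.

```python
import numpy as np

def depth_to_display(depth_m, out_shape=(12, 8), max_range_m=3.0):
    """Convert a metric depth frame (2-D array, metres) into a coarse
    brightness image for a low-resolution display.

    Nearer objects are rendered brighter; pixels beyond max_range_m,
    or with no depth reading (encoded as 0), are turned off.
    """
    depth = np.asarray(depth_m, dtype=float)

    # Invalid readings (0) and anything beyond the working range go dark.
    valid = (depth > 0) & (depth <= max_range_m)

    # Linear mapping: 0 m -> full brightness, max_range_m -> off.
    brightness = np.where(valid, 1.0 - depth / max_range_m, 0.0)

    # Block-average down to the display resolution (e.g. 12 x 8 = 96 pixels).
    h, w = brightness.shape
    oh, ow = out_shape
    blocks = brightness[: h - h % oh, : w - w % ow]
    blocks = blocks.reshape(oh, h // oh, ow, w // ow).mean(axis=(1, 3))

    # Scale to 8-bit intensities for the display driver.
    return (blocks * 255).astype(np.uint8)

# Example: a fake 240 x 320 depth frame with an obstacle roughly 1 m away.
frame = np.full((240, 320), 2.5)
frame[80:160, 120:200] = 1.0
print(depth_to_display(frame))
```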

A pilot study using an LED version of a depth-based visual aid.

In this short study we tested whether blind and partially sighted people could use a wide-angle (120 degrees) but very low-resolution display (96 pixels per eye) to locate and orient towards targets in space, as a precursor to walking around independently. We also studied how well control participants could learn to use a live depth map on the same low-resolution display to navigate a small obstacle course.

Direct link to paper.

Stephen L. Hicks

Research Fellow in Neuroscience and Visual Prosthetics

Biography

My research aim is to improve functional vision for people with severely impaired sight.

Over 300,000 people in the UK are registered as blind. We are developing a pair of smart-glasses that might be able to help people use their remaining vision to see and avoid obstacles and enjoy increased independence.

Using computer vision and electronic components usually found in mobile phones, we are building and testing concepts that we hope to develop into an affordable pair of glasses.

Collaborators include:
Professor Phil Torr at Oxford Brookes
The Royal National Institute of Blind People (RNIB)

Funding:
My work is primarily funded by the NIHR i4i scheme.

More information about the glasses can be found on the Assisted Vision page.

My Google Scholar publication list.

My G+ profile, with work updates and items of general interest.

Brian Mercer Innovation Award

Research Summary

Transparent smart-glasses prototype

My background and PhD are in neuroscience, visual perception and computer vision. In previous post-docs in Australia and at Imperial College I built a wearable visual-prosthetic simulator to study how to optimise the visual information presented by a retinal implant. The resulting paper won the 2011 Ruskell Medal from the Worshipful Company of Spectacle Makers. I have since developed computer-vision algorithms and designs for a wearable display that could provide intuitive visual information in a non-invasive manner for legally blind individuals. I have been awarded two substantial grants from the NIHR to develop this work into a validated commercial assistive device. At the 2011 Royal Society Summer Science Exhibition my exhibit “Interactive Bionic Vision” was highly successful, and I have received national and international press attention for my work on “Smart-Glasses”.

For more information on the smart glasses project see our website Assisted-Vision.com.