Thursday, 16 February 2017

#DigitalHealth: Using VR headset to self-monitor vision


First of all, a bit about myself: my name is Nicolas Dubuisson. I completed my medical training in Belgium and I am now 3 years through my Neurology training, so well on the way to becoming a Neurologist. I joined BartsMS as an ECTRIMS Clinical Training Fellow to ProfG four months ago and plan to stay with the Barts team until October 2017.


Currently, I’m working with Alison and the team to develop new self-monitoring tools for people with MS. One of these projects involves working with a team of designers to develop two smartphone applications that will evaluate visual field and colour vision. The ultimate aim is to embed these into the web-EDSS calculator to improve it as an outcome measure. At the moment people with MS have to rely on their neurologist to tell them whether or not their eyes are affected by MS.

The first function of the APP will be to turn the Humphrey test, currently conducted by an optometrist, into an engaging resource for people to use. Its main objective is to evaluate deficits in the visual fields, which can occur following a lesion anywhere along the visual pathway.
This function of the APP is not specifically for people with MS, but rather we hope the resource can be used to help diagnose and monitor other ophthalmological problems as well.
In the existing Humphrey test, the optometrist asks the patient to look at a white dot with one eye at a time. Other bright dots appear on the screen in different places and the patient must press a button each time they see one.
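As a rough illustration only (this is a hypothetical sketch, not our actual implementation), the core loop of such a perimetry test can be written in a few lines. Here a simulated patient with a right superior quadrantanopia (blind upper-right quadrant) stands in for the real button press:

```python
import random

# Hypothetical sketch of a Humphrey-style test loop: present a dot at
# each grid location (in random order) and record whether the patient
# reports seeing it. Coordinates are degrees from the fixation point.

GRID = [(x, y) for x in range(-3, 4) for y in range(-3, 4)]

def simulated_patient_sees(x, y):
    # Simulated deficit: blind in the upper-right quadrant only.
    return not (x > 0 and y > 0)

def run_field_test(sees):
    results = {}
    locations = GRID[:]
    random.shuffle(locations)          # present dots in random order
    for (x, y) in locations:
        results[(x, y)] = sees(x, y)   # True = dot seen, False = missed
    return results

field = run_field_test(simulated_patient_sees)
missed = [point for point, seen in field.items() if not seen]
```

The `missed` locations would then be plotted as the dark zones on the kind of result chart shown below.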


Existing Humphrey test in an ophthalmology clinic.

The final result is displayed as two circles (one for each eye) shaded in different greys. Light grey represents perfect vision and black an absence of vision.


Humphrey result showing a right superior quadrantanopia (black zone).


The application we are developing uses exactly the same principle, but the machine is replaced by a VR headset and a smartphone.



We are developing software to conduct the same test from the phone and a headset.

The second function of the APP will be to test colour vision. In clinical practice, the current test is the Ishihara test. We are working to develop this part of the APP in the form of a 3D game. In the first prototype, a wall of 16 coloured squares appears in front of you. You then have to point out (by moving your head) which square is a different colour. Once the correct square has been identified, the wall rises and the player moves on towards the next wall.
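For illustration only (a hypothetical sketch; the real game's colours and logic may differ), a single round of such an odd-one-out colour game could look like this. Shrinking the colour difference (`delta`) each round would make later walls harder, probing finer colour discrimination:

```python
import random

# Hypothetical sketch of one round: 16 squares share a base RGB colour,
# one square differs slightly, and the player must pick the odd one out.

def make_wall(base, delta, size=16):
    odd_index = random.randrange(size)
    squares = []
    for i in range(size):
        r, g, b = base
        if i == odd_index:
            g = min(255, g + delta)    # odd square: slightly greener
        squares.append((r, g, b))
    return squares, odd_index

def play_round(odd_index, chosen):
    # True -> the wall rises and the player moves to the next wall.
    return chosen == odd_index

squares, odd = make_wall(base=(120, 120, 120), delta=40)
```

In the headset, `chosen` would come from the square the player's head is pointing at, rather than from a keypress.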




The first step in our project (which will start next week) will be to invite patients from the neuro-ophthalmology clinic to compare our tests with those used in routine practice (Humphrey and Ishihara). Once validated (i.e. once we know that our tests perform the same as the current tests), we will look into making these tests more engaging and ultimately enable them to be made available for others to use at home.

We plan to share updates on this project on the blog. We’d be interested to hear about your experiences of eye tests in relation to your MS. Have you completed any of these tests or have you tried any eye testing APPs?

NB: This will not replace your appointments with the ophthalmologist; rather, it will complement them and hopefully empower you and other people with MS to monitor your own disease.

6 comments:

  1. Sounds like a great idea. I have been struggling to assess changes in my left eye that are due to optic neuritis. Will the tests have the ability to test each eye individually without having to remove the headset?

  2. I am part of a 5 year study with the University of Sydney studying the correlation between disease progress and retinal layer thickness. It was a long series of tests and was very interesting. There is meant to be an at-home weekly test using an iPad but they haven't started that part yet. I had optic neuritis in my left eye in 2014

  3. Hey Nicolas - good to "meet" you

    I take it that someone has had the "cross platform app development" chat with you. If not, try me.

    In earlier days, when C++ was the thing, one of my Honours students wrote a simulation of macular degeneration for an ophthalmologist he worked with.

    Good luck on the path to numerology (recent diary miss type!)

    Father of another mouse doctor

  4. I am having problems with my eyes and appreciate the opportunity to self-monitor, but I don't have a smartphone. Will this app work with a laptop?



  5. "Have you completed any of these tests or have you tried any eye testing APPs?"

    I had acute optic neuritis (AON) in my left eye in 2010 with loss of vision in that eye down to light perception (I could tell where the window was in an otherwise dim room on a bright day but my vision would be overloaded by a bright light).

    At the ophthalmologist's I had "field of vision" tests similar to your description, but with both white and pale red dots, until I was discharged. After that, at my optometrist's, a Humphrey test was conducted using an automated machine. On my last visit to the optometrist he suggested skipping that, as the field of vision had been fine for a few years now (presumably when they changed my recall period back to two years).

    I had the Ishihara test using cards at ophthalmology and at the optometrist's at the first visit after AON. Not since. There was a Facebook test based on shades of colour advertising a premium game 3 or so years ago. It told me that I was "blind as a bat". My general contrast perception is poor. In particular I have noticed it with green, because some software I use highlights in green and I have to make it brighter.

    There was an Android eye test app including an Ishihara test but I now have a Windows phone.

    My neurologist has not mentioned vision but that my MS is probably changing to "progressive".


  6. This is very interesting, could you link this with a camera in the visor?

    I was thinking that something could replace the VEP. Flash an image in the periphery, then use the camera to determine the lock-on time to the new image. You could then time the delta between the image appearing and the eye focusing on the new image. This would give you a VEP-like metric, without the hassle of attaching probes etc.


Please note that all comments are moderated and any personal or marketing-related submissions will not be shown.