The voice banking software I’m using is from the Speech Research Lab at the University of Delaware. They say they are in the process of commercializing it; hopefully it will still be free for the disabled. They’re probably not looking for donations, though.
Another interesting communications-assistance project is Dasher. They have a Java applet demo as well as programs for PC and smartphones. It does predictive input designed to maximize effective bandwidth. It’s a little confusing at first, but supposedly after some practice you can type quickly with only minimal use of the controls. I say supposedly because I haven’t used it much; it’s not yet clear what I might end up controlling it with. I should practice with it some more, since it sounds likely to be part of an overall solution. It would be cool to control it with a BCI: sit back and just think to type your messages.
Everybody with ALS talks about how terrible it is, all the things you can’t do any more. But nobody seems to notice that there are all these things you get to do that you’ve never done before. I’ve never used a power wheelchair. I’ve never controlled a computer with my eyes. I’ve never had a voice synthesizer trained to mimic my natural voice. If I told people on the ALS forums that I was looking forward to some of this, they’d think I was crazy. Maybe people here will understand.
Hi Hal. I’m sorry to hear of your diagnosis.
I spent two years as the maintainer of Dasher, and would be happy to answer questions on it. It’s able to use any single analog muscle for control, as a worst case (and a two-axis precise device like a mouse as a best case). There’s a video of using Dasher with one axis here—breath control, as measured by diaphragm circumference:
http://www.inference.phy.cam.ac.uk/dasher/movies/BreathDasher.mpg
and there are videos using other muscles (head tracking, eye tracking) here:
http://www.inference.phy.cam.ac.uk/dasher/Demonstrations.html
Head-mice (you put an infra-red dot on some glasses or your forehead and then just move your head to move a pointer) are a common and cheap input method; they cost less than $100, and Dasher’s very accepting of noisy input; if you oversteer in one direction you can just compensate later.
You’re not the first person to consider Dasher with BCI—here’s a slightly outdated summary:
http://www.inference.phy.cam.ac.uk/saw27/dasher/bci/
All the best,
Chris.
How confident are you of this? I’d be surprised if there weren’t some there who understood.
I was listening to an interesting podcast recently with the fellows who founded patientslikeme.com. I have no idea what the community is like, but it may be an interesting resource you haven’t encountered yet.
http://itc.conversationsnetwork.org/shows/detail4118.html
I understand—it reminds me of the Max Barry story “Machine Man,” where the protagonist, a robotics researcher, loses a leg, so he designs an artificial one to replace it. Of course, it’s a lot better than his old leg… so he “loses” the other one. Of course, two out of four artificial limbs is just a good start (and so forth). I wouldn’t wish your condition on anyone, but you might just have been lucky enough to live in a time when the meat we were born with isn’t relevant to a happy life. Best wishes regardless.
I’ve played around briefly with Dasher, and like many of these alternate text inputs, it is not designed with coding in mind. I can’t remember which forms of punctuation it uses, but the frequencies will be all wrong to start with.
What you really want is a cross of Dasher and Visual Studio-style auto-complete, so the words/letters it puts largest are the variables in scope, the symbols from included libraries, or the member functions of the object you are accessing. You’ll probably need to specialize your tools to a single language to start with, which is a shame. Pick wisely!
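To make the idea concrete, here’s a toy sketch of that Dasher/autocomplete cross (hypothetical code, not anything Dasher actually does): candidates are ranked by corpus frequency, with identifiers currently in scope given a multiplicative boost so they’d get the biggest boxes. The function names and the boost factor are my own invention for illustration.

```python
# Toy sketch: rank completion candidates by corpus frequency,
# boosting identifiers that are in scope -- the "Dasher meets
# IDE auto-complete" idea. All names here are hypothetical.
from collections import Counter

def rank_candidates(corpus_tokens, in_scope, boost=10.0):
    freq = Counter(corpus_tokens)

    def score(tok):
        s = freq[tok]
        if tok in in_scope:
            s *= boost  # in-scope identifiers get much bigger boxes
        return s

    candidates = set(corpus_tokens) | set(in_scope)
    # Sort by descending score, breaking ties alphabetically.
    return sorted(candidates, key=lambda t: (-score(t), t))

corpus = ["print", "len", "for", "in", "range", "print", "for"]
scope = {"my_counter", "range"}
print(rank_candidates(corpus, scope)[:3])  # ['range', 'for', 'print']
```

A real version would of course need a parser to know what’s actually in scope, which is why specializing to one language first seems unavoidable.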
I’d love to play around with controlling a computer with my eyes.
Dasher, at least the Mac version and presumably the other desktop versions, can be given a custom character set (including how they’re ordered and grouped), and you can feed it an arbitrary text file to learn frequencies from. If you feed it plenty of program text it should learn the common phrases just fine, though without context-specific completion as in an IDE.
(Update: the current Mac version seems to have an entirely nonfunctional preferences dialog, and thus has lost the character-set functionality (there is a blank list box for it). It feels like the app got released in the middle of development work; hopefully it’ll be fixed sometime. The basic functionality still works; I typed about a third of this paragraph using it before I got tired of the lack of uppercase and punctuation.)
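The frequency-learning step is simpler than it sounds. Dasher actually uses a PPM model, but a stripped-down sketch of the same idea (my own simplification, not Dasher’s code) is just counting character n-grams in the training file and predicting the next character from the preceding context:

```python
# Simplified sketch of learning frequencies from a training text,
# in the spirit of (but much cruder than) Dasher's PPM model:
# count character trigrams, then predict the next character
# from the preceding two.
from collections import Counter, defaultdict

def train(text, order=2):
    model = defaultdict(Counter)
    for i in range(len(text) - order):
        context = text[i:i + order]
        model[context][text[i + order]] += 1
    return model

def predict(model, context):
    counts = model.get(context)
    return counts.most_common(1)[0][0] if counts else None

code = "for i in range(10):\n    print(i)\nfor j in range(5):\n    print(j)\n"
model = train(code)
print(predict(model, "ra"))  # prints 'n', since "ran" dominates after "ra"
```

Feed it a pile of program text instead of English prose and the common language keywords and idioms naturally dominate the predictions, which is exactly why retraining on source code helps.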