I just remembered that at WWDC Apple mentioned efficient virtualization and the ability to run Linux on the new ARM-based systems. I wonder if their Voice Control would be able to integrate with content inside the virtual machine.
For example, with iOS / macOS Voice Control, all interactive items show a number next to them, and you can just say the number to click the button. Could it possibly integrate tightly enough with a VM to find those interactive items and assign them numbers?
@freedcreative Similarly, I was wondering how far Windows accessibility features would extend to WSL2.
For shells and command-line software on Linux running in Windows, most likely. Graphical apps? That would be most interesting.
Another approach might be ChromeOS and its Linux support. (ChromeVox was discontinued for the Chrome browser in favor of ChromeOS.) I wonder if ChromiumOS would support this, or if one needs a Chromebook to test.
@garrett oh yeah, given the WSL stuff isn't containerized in a VM, theoretically it should be more accessible, right?
I took a run at getting voice control going on Windows, but I didn't get very far. It seems like it might be abandoned?
I looked up accessibility options with ChromeOS, but I couldn't see anything about help if your hands are messed up. Have you found something like that for the system?