According to some of the comments, they may be working on a Mountain-Lion-compatible version, though with no firm release date. I'll believe it when more evidence comes out.
I totally get where he got that idea. Every time I've seen someone new to computers try to use a mouse, the whole move-something-on-the-table-while-your-eyes-follow-a-different-amount-of-movement-on-a-separate-screen dance seems like the most unintuitive thing ever. Even touch screens, with their total lack of tactile feedback, are more intuitive because they have a direct relationship with what you're seeing.
Then again, joysticks managed to make inroads in arcade gaming and they've got similar problems.
Watching someone like my mom try to do a "double-click" (which she never mastered before her death... if only she'd had Linux back then!) was enlightening, because I'd forgotten what it was like when I was learning to do that myself.
I seriously think it would be the coolest thing ever to set up some kind of program with a big sack of $$$ behind it: people like Chris, and people with various disabilities, get paid from the fund to test the websites of web developers and companies who sign up (they pay a fee, which goes back into the fund). With video and screen capture, paying developers could watch real people use both their own sites and apps and the ones everyone tends to use, like YouTube.
Also, the various Linux desktop environments need more help getting on board with these technologies. I do know of one dedicated Linux distribution for the visually impaired, Vinux, but everything else out there is just some developer's dream. The folks at Igalia are working on an open-source library for the Kinect, though. Imagine cameras turning someone's gross motor movements into a way to use computers, compared to how most computer input relies on tiny devices and fine motor control.