Sebastian Anthony at ExtremeTech:
Using functional near-infrared spectroscopy (fNIRS), which is basically a portable, poor man’s version of fMRI, Brainput measures the activity of your brain. This data is analyzed, and if Brainput detects that you’re multitasking, the software kicks in and helps you out. In the case of the Brainput research paper, [MIT’s Erin Treacy] Solovey and her team set up a maze with two remotely controlled robots. The operator, equipped with fNIRS headgear, has to navigate both robots through the maze simultaneously, constantly switching back and forth between them. When Brainput detects that the driver is multitasking, it tells the robots to use their own sensors to help with navigation. Overall, with Brainput turned on, operator performance improved — and yet they didn’t generally notice that the robots were partially autonomous.
Now, it’s easy to see how this could be extrapolated out into the real world. We already have steering wheels that detect when we’re falling asleep — with Brainput, your car could automatically drive itself during that split second where you turn around to shout at your kids, or twiddle with various dashboard knobs. The same goes for airplane pilots, or indeed anyone seated behind the controls of a large, dangerous vehicle…. Imagine a computer that increases the size of buttons and text when you’re tired, or a video game that slows down when you’re stressed. Your Xbox might detect that you’re in the mood for fighting games, and change its splash screen accordingly. Likewise, Firefox could detect that you’re feeling amorous, and automatically load up Private Browsing mode. Menu buttons could move around and change in size — or disappear entirely. Eventually, computer interfaces might completely remold themselves to your mental state.
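The control idea underneath all of this is easy to picture: estimate the operator's cognitive load from the fNIRS signal, and when it looks like they're multitasking, quietly hand some control back to the machine. Here's a rough, purely illustrative sketch of that loop; the load estimate, the threshold, and the robot methods are all made up for the example (the real system classifies fNIRS data rather than comparing a number to a cutoff):

```python
"""Toy sketch of a Brainput-style adaptive-autonomy loop. Illustrative only;
every name here (read_fnirs_load, Robot, MULTITASKING_THRESHOLD) is hypothetical."""
import random

MULTITASKING_THRESHOLD = 0.7  # hypothetical cutoff on a normalized load estimate


class Robot:
    def __init__(self, name: str):
        self.name = name

    def follow_operator_commands(self):
        print(f"{self.name}: manual control")

    def navigate_with_own_sensors(self):
        print(f"{self.name}: autonomous assist")


def read_fnirs_load() -> float:
    """Stand-in for the fNIRS pipeline: return a cognitive-load estimate in [0, 1]."""
    return random.random()


def control_step(robots):
    # When the estimated load suggests the operator is multitasking,
    # let the robots lean on their own sensors; otherwise stay fully manual.
    if read_fnirs_load() > MULTITASKING_THRESHOLD:
        for robot in robots:
            robot.navigate_with_own_sensors()
    else:
        for robot in robots:
            robot.follow_operator_commands()


if __name__ == "__main__":
    control_step([Robot("robot-a"), Robot("robot-b")])
```

The interesting part of the study isn't the loop itself, it's that the hand-off happened without the operators noticing.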