Not quite mind reading, not mind control the way people usually think of it, but significant nonetheless.

At the Massachusetts Institute of Technology, biotech researchers have made progress on an aspect of prosthetics so fundamental that most people never think about it: the neural interface. Specifically, they've worked out an algorithm that converts the patterns of chemoelectrical activity in the brain which signify intent to move into commands for an external device. Current prosthetics aren't hooked directly into the central nervous system but into the "network edge" of the peripheral nervous system, via interface jacks connected to nerve endings. And those jacks accept only broad sorts of input: tense the left pectoral muscle to close an end effector, relax it to open the gripper. This interface algorithm represents a means of control with a far finer resolution than anything available today.

To be fair, research teams across the world are using different techniques to monitor brain activity (EEG, MRI, and optical, just to name a few), and they're also all using very different methods of interpreting the results. What makes this algorithm notable is that it's sufficiently generic to accept data from any of these methods and process it in a useful manner.
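To make the idea concrete, here's a toy sketch of what a modality-agnostic intent decoder might look like. Everything here is an illustrative assumption on my part (the names, the normalization step, the threshold decoder); it is not MIT's actual algorithm. The point it demonstrates is the architecture: raw readings from any acquisition method get normalized onto a common scale, and a single decoder downstream turns them into a graded command rather than a binary open/close signal.

```python
# Hypothetical sketch, not the actual MIT algorithm: a decoder that accepts
# brain-activity data from any acquisition method (EEG, MRI, optical) by
# normalizing it into a common representation first.
from dataclasses import dataclass

@dataclass
class IntentCommand:
    effector: str     # e.g. "gripper"
    action: str       # e.g. "close" or "open"
    magnitude: float  # 0.0-1.0; finer-grained than a tense/relax muscle signal

def normalize(samples, lo, hi):
    """Map raw readings (microvolts, BOLD signal, reflectance...) onto a
    common 0-1 scale so the decoder never sees modality-specific units."""
    span = (hi - lo) or 1.0
    return [(s - lo) / span for s in samples]

def decode(features, threshold=0.5):
    """Toy decoder: mean activation above threshold becomes a graded
    'close' command; anything below it means 'open'."""
    level = sum(features) / len(features)
    if level >= threshold:
        strength = min((level - threshold) / (1.0 - threshold), 1.0)
        return IntentCommand("gripper", "close", strength)
    return IntentCommand("gripper", "open", 0.0)

# Same decoder, two very different front ends:
eeg = normalize([12.0, 40.0, 55.0], lo=0.0, hi=60.0)   # microvolts
optical = normalize([0.2, 0.9, 0.8], lo=0.0, hi=1.0)   # reflectance
print(decode(eeg).action, decode(optical).action)
```

The design choice being illustrated is the separation of concerns: the acquisition method only has to supply a `normalize` step, and the decoding logic stays identical across all of them.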

Is it ready for production? No. Can we finally get our cyberlimbs à la $cyberpunk_novel here? No. Can we see the Singularity on the horizon?

...don't ask me that. I don't know!

All kidding aside, this is a major step in the right direction, and a field of research that's worth keeping an eye on.