Fabbing components, parallel processing with rats, and synthetic neurons.

Jul 27, 2015

Life being what it is these days, I haven't had much time to write any real posts here. If I'm not working I'm at home studying because I'm back on the "get letters after my name" trail, and if I'm not studying or in class I'm helping get family moved out and set up on the west coast. Or I'm at the gym because I'm fighting alongside my essential vanity by trying to lose weight; people tell me that I look good these days but there's a fine line between looking healthy and needing new clothes. So there you have it, from the depths of my psyche just above some of the interfaces.

I do have one or two interesting things in the pipeline that I need to write about - in fact, they're going to be submitted to a couple of conferences if all goes according to plan. But I think I'd better get the conference papers written first because you never can tell if the organizers will pitch a fit (or threaten legal action - they're being held in the US, after all) if you blog about something you're going to present. But enough about that.

Some years ago, after the field of 3D printing really took off, a number of hackers began working on the problem of fabricating circuit boards with 3D printers instead of laying them out and etching them with chemical processes that are often nasty and messy. But then the question of acquiring components comes up - Radio Shack is as dead as Walt Disney, so it's not as if you can jander down to the strip mall and pick up the parts you need anymore (mostly - some Micro Center outlets have entire sections dedicated to this sort of thing, as do Fry's outlets) when you really need something for a project and can't wait to order it online.

A couple of days ago, research teams at the University of California, Berkeley and National Chiao Tung University in Taiwan published a paper in Nature's open access journal Microsystems & Nanoengineering detailing how they used a 3D printer to fabricate reasonably standard electronic components. Their printer was a dual extruder model which laid down successive layers of structural plastic and sacrificial wax to form hollow spaces inside the printed structure; the wax was later cleaned out to make room for multiple injections of silver paste, which formed the conducting portions of the components. The hollow spaces were engineered to have particular electrical properties so that different kinds of components could be constructed, among them inductors and resistors. From these basic components electrical circuits were built; as a proof of concept the research team made a "smart milk cap" containing what amounts to a simple lab-on-a-chip that keeps tabs on whether the milk in the carton has gone bad by analyzing changes in the milk's electrical properties. The cap reported its readings via a passive RF transmitter that blipped out data whenever an external probe energized it.
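
The way the milk sensing presumably works is that the milk itself becomes part of the sensor: as it spoils, its electrical properties drift, which shifts the capacitance and therefore the resonant frequency the external RF probe reads back. Here's a minimal back-of-the-envelope sketch of that idea in Python; the inductance and capacitance values are invented for illustration and aren't from the paper.

```python
import math

def resonant_frequency_hz(inductance_h, capacitance_f):
    """Resonant frequency of an ideal LC tank: f = 1 / (2 * pi * sqrt(L * C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

# Hypothetical printed-sensor values - not taken from the paper.
L_PRINTED = 5e-6      # 5 uH printed inductor
C_FRESH   = 12e-12    # 12 pF with fresh milk as the dielectric
C_SPOILED = 15e-12    # capacitance drifts as the milk's permittivity changes

f_fresh   = resonant_frequency_hz(L_PRINTED, C_FRESH)
f_spoiled = resonant_frequency_hz(L_PRINTED, C_SPOILED)

print(f"fresh:   {f_fresh / 1e6:.2f} MHz")
print(f"spoiled: {f_spoiled / 1e6:.2f} MHz")
print(f"shift:   {(f_fresh - f_spoiled) / 1e3:.0f} kHz")
```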

The size of their components? Before cleaning them up, they fit comfortably on top of a penny with room to spare. The resolution of their printer (a 3D Systems ProJet HD 3000) is 30 μm, or 30 millionths of a meter. This ain't your dad's breadboard.

A long-awaited technology for many is the direct neural interface: connecting directly to the central nervous system to create a symbiotic link between the organic and the electronic. It's not an easy thing, because interfacing with individual neurons is hard, to say the least. Sticking even microscopic electrodes into neurons causes irritation, rejection, and scarring, which degrades the connection until it's no longer usable. It probably kills some neurons, too. There are newer techniques for connecting to neurons that seem like they might not have that problem, but it's too early to tell. Regardless, something incredible happened in a laboratory recently. A research team at Duke University in North Carolina made a breakthrough in the field of direct neural interface - they networked the brains of laboratory rats and primates, and the networked animals were able to solve problems much more efficiently than a single test animal could alone. Each test animal was implanted for electrocorticography: an electrode mesh was surgically implanted atop the motor and somatosensory cortices, which allowed high resolution monitoring of the brain's chemo-electrical activity. It also allowed those parts of the brain to be electrically stimulated from outside - in this case, with electrical activity from the brains of the other test animals as well as with synchronization signals for training. The rats were wired up as a fully connected network, meaning that every rat was plugged into every other rat.
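
To make the "fully connected" part concrete, here's a toy Python sketch - entirely my own construction, nothing from the actual papers - in which every simulated node is driven by the averaged activity of all of its peers. With even modest coupling the nodes' activities converge, which is the general flavor of the synchronization the training was aiming for.

```python
import random

# Toy model of a fully connected "brainet": every node receives the activity of
# every other node as part of its input. This only illustrates the network
# topology described above, not real neural dynamics.

N_RATS = 4        # hypothetical network size
COUPLING = 0.3    # how strongly each node is driven by its peers
NOISE = 0.05      # per-step jitter so the nodes never agree perfectly

def step(activities):
    """Advance every node one tick, feeding it the mean activity of the others."""
    updated = []
    for i, current in enumerate(activities):
        peers = [a for j, a in enumerate(activities) if j != i]
        peer_drive = sum(peers) / len(peers)
        # Each node relaxes toward what its peers are doing, plus a little noise.
        updated.append((1 - COUPLING) * current + COUPLING * peer_drive
                       + random.gauss(0, NOISE))
    return updated

activities = [random.random() for _ in range(N_RATS)]
for _ in range(50):
    activities = step(activities)

print("final activities:", [round(a, 3) for a in activities])
print("spread:", round(max(activities) - min(activities), 3))
```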

As a side note, several experiments with primates were also done, but it's the rat experiments that most interest me, because the primates were largely limited to cooperating with one another while the rats were used to process information, so I'm writing about the latter. Please read the articles I linked to for details on the primate trials. Moving along...

The rats were trained to synchronize with one another by feeding a very weak clock signal into their brains while depriving them of water. When the rats had figured out how to use the clock signal to operate in concert, the next phase of the experiments began. One of the experiments involved feeding the rats weather information (temperature and barometric pressure data) represented somehow (I'd love to hear how they pulled that off), and the trained rats were repeatedly able to process the data and predict short-term weather conditions more accurately than chance. Experiments with storing and retrieving sensory information (tactile information, because that's one of the parts of the brain that was interfaced with) were also successfully conducted.
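
I have no idea how the weather data was encoded either, but the better-than-chance group result is at least plausible on purely statistical grounds: if each animal's output is even slightly better than a coin flip and their errors aren't perfectly correlated, combining their outputs helps. Here's the textbook majority-vote version of that argument in Python, with numbers I made up (this is not the paper's analysis):

```python
from math import comb

def majority_vote_accuracy(n_voters, p_individual):
    """Probability that a simple majority of independent voters, each correct with
    probability p_individual, gets a binary prediction right (odd n, no ties)."""
    return sum(
        comb(n_voters, k) * p_individual**k * (1 - p_individual)**(n_voters - k)
        for k in range((n_voters // 2) + 1, n_voters + 1)
    )

# Made-up numbers for illustration only.
p_single_rat = 0.55   # a single animal barely better than the 0.5 chance level
for n in (1, 3, 5, 7):
    print(f"{n} voter(s): {majority_vote_accuracy(n, p_single_rat):.3f}")
```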

Which brings me right along to some pretty hardcore science out of the Karolinska Institutet in Sweden: a first stab at a fully synthetic neuron worked in the lab. The synthetic neurons contain nothing biological - they're built out of conductive polymers - but they mimic one of the fundamental communication channels of organic neurons, namely the detection and emission of the neurotransmitter acetylcholine (one of many neurotransmitters, but they had to start somewhere), rather than relying on direct electrical stimulation of discrete neurons (which works but has its problems). Looking a little closer: while neurons operate using electrical action potentials internally (I highly recommend enabling Flash to watch the animations, they're very instructive), chemical signalling is used to bridge the synaptic gaps between neurons. The synthetic neurons were used to cross-connect cultured neurons sitting in petri dishes across a distance of several inches. Not covered in any of the open access articles I've found is how they charged the far ends of the synthetic neurons with acetylcholine, or how the polymer structures were fabricated. The research team published a peer-reviewed paper entitled An organic electronic biomimetic neuron enables auto-regulated neuromodulation, but unfortunately it's behind a paywall. #icanhazpdf?
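
The way I read it, each synthetic neuron is essentially a chemical relay: sense acetylcholine above some threshold at one end, release acetylcholine at the other. Here's a deliberately crude Python sketch of that relay behavior; the thresholds and amounts are invented, and the real device is a piece of organic electronics rather than a threshold check like this.

```python
# Toy model of the chemical relay described above: a unit that senses a
# neurotransmitter concentration at one end and, past a threshold, releases
# neurotransmitter at the other end. Purely illustrative; all numbers invented.

class ChemicalRelayNeuron:
    def __init__(self, threshold=0.5, release_amount=1.0):
        self.threshold = threshold            # ACh level needed to trigger
        self.release_amount = release_amount  # ACh released downstream when triggered

    def step(self, sensed_ach):
        """Return the ACh released downstream for a given sensed concentration."""
        if sensed_ach >= self.threshold:
            return self.release_amount        # "fire": dump transmitter at the output
        return 0.0                            # below threshold: stay quiet

# Chain two relays: upstream culture -> relay A -> relay B -> downstream culture.
relay_a = ChemicalRelayNeuron(threshold=0.5)
relay_b = ChemicalRelayNeuron(threshold=0.5)

for upstream_ach in (0.1, 0.4, 0.7, 0.9):
    downstream = relay_b.step(relay_a.step(upstream_ach))
    print(f"upstream ACh {upstream_ach:.1f} -> downstream release {downstream:.1f}")
```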

Could cyberbrains be on the horizon? Beats me. If this ever pans out (and if one commenter's concern about excitotoxicity either turns out to be inaccurate or can be addressed - it's a good observation either way), they could be. From doing some reading about synaptic signalling, I'm wondering if the commenter might be incorrect in one of their fundamental assertions. Concentrations of sodium and potassium ions are manipulated by a neuron's own ion channels in response to neurotransmitter signals from upstream and, so far as I can discern, are not supplied by the neurotransmitters themselves. Additionally, the ion channels cover the cell bodies of neurons and not necessarily the surfaces of dendrites, which are studded with postsynaptic terminals that receive neurotransmitter molecules. The commenter also seems to miss the point that the synthetic neurons in question actively release acetylcholine instead of only electrically stimulating neurons. That said, exocortices are rapidly coming into their own with the proliferation of semi-autonomous software agents, intelligent personal assistant software, and widely available wireless data networks. Right now their user interfaces are largely textual, with some speech recognition and speech synthesis and the odd pretty visual effect thrown in. Some time in the future, if synthetic neurons wind up being feasible (and get cheap enough for people to afford!)... I don't know. I hope so. I'll certainly be wheeling and dealing to become a beta tester (probably... remember, you want the first .X release after the first DefCon following the technology hitting the market!)