"Neural Dust" Could Enable a Fitbit for the Nervous System
A technology with the potential to blur the boundaries between biology and electronics has just cleared a major hurdle in the race to demonstrate its feasibility.
A team at the University of California, Berkeley, led by neuroscientist Jose Carmena and electrical and computer engineer Michel Maharbiz, has provided the first demonstration of what the researchers call “ultrasonic neural dust” to monitor neural activity in a live animal. They recorded activity in the sciatic nerve and a leg muscle of an anesthetized rat in response to electrical stimulation applied to its foot. “My lab has always worked on the boundary between biology and man-made things,” Maharbiz says. “We build tiny gadgets to interface synthetic stuff with biological stuff.” The work was published this week in the journal Neuron.
The system uses ultrasound both for wireless communication and as the device’s power source, eliminating wires and batteries alike. It consists of an external transceiver and what the team calls a “dust mote,” measuring about 0.8 x 1 x 3 millimeters, which is implanted inside the body. The transceiver sends ultrasonic pulses to a piezoelectric crystal in the implant, which converts them into electricity to provide power. The implant records electrical signals in the rat via electrodes and uses them to alter the vibration of the crystal. These vibrations are reflected back to the transceiver, allowing the signal to be read out, a technique known as backscatter. “This is the first time someone has used ultrasound as a method of powering and communicating with extremely small implantable systems,” says one of the paper’s authors, Dongjin Seo. “This opens up a host of applications in terms of embodied telemetry: being able to put something super-tiny, super-deep in the body, which you can park next to a nerve, organ, muscle or gastrointestinal tract, and read data out wirelessly.”
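The backscatter idea can be illustrated with a toy simulation. This is not the authors' code; the carrier frequency, the linear reflectance-versus-voltage transfer curve, and the envelope detector are all illustrative assumptions, chosen only to show how a reflected carrier can carry a measured voltage back to the transceiver.

```python
import math

# Hypothetical parameters for illustration only.
CARRIER_HZ = 2.0e6   # assumed ultrasound carrier frequency
SAMPLE_HZ = 40.0e6   # simulation sample rate: 20 samples per carrier cycle

def mote_backscatter(neural_mv, t):
    """Echo from the mote: the carrier scaled by a reflectance that
    varies linearly with the measured neural voltage (an assumption)."""
    reflectance = 0.5 + 0.01 * neural_mv   # hypothetical transfer curve
    return reflectance * math.sin(2 * math.pi * CARRIER_HZ * t)

def demodulate(echo_samples):
    """Transceiver side: envelope detection (peak of the rectified
    echo), then invert the assumed transfer curve."""
    peak = max(abs(s) for s in echo_samples)
    return (peak - 0.5) / 0.01

# Interrogate the mote for ten carrier cycles while the nerve sits at +3 mV.
n = int(SAMPLE_HZ / CARRIER_HZ) * 10
echo = [mote_backscatter(3.0, i / SAMPLE_HZ) for i in range(n)]
recovered_mv = demodulate(echo)
print(round(recovered_mv, 2))   # recovers the 3.0 mV input
```

The key point the sketch captures is that the mote itself transmits nothing: it only changes how strongly it reflects the interrogating pulse, and the external transceiver reads the neural signal off the echo's amplitude.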
The team also plans to develop implants that can stimulate as well as monitor nerves, allowing closed-loop control of nervous system activity. This has potential applications in the emerging field of bioelectronic medicine, which promises to deliver a whole new class of therapies called electroceuticals. For example, on Monday GlaxoSmithKline announced it is teaming up with Google sister company Verily Life Sciences to create Galvani Bioelectronics, which will develop implants that can modify nerve signals in an effort to treat chronic illnesses. Animal studies already suggest the technology could be used to treat type 2 diabetes, and there are many other possibilities. Seo says the team is looking into treating bladder control problems and bowel disease.
“While the technology is still in its infancy, initial results clearly demonstrate the potential of the ultrasonic backscatter approach,” says Victor Pikov, head of research platforms at GSK Bioelectronics, who was not involved in the Berkeley team’s research. “The Berkeley group is on the exciting path towards developing a completely novel type of sensors that would have widespread uses in bioelectronic medicines.”
The next steps are to test whether the dust motes remain viable for long periods after implantation, and to conduct experiments in awake and freely moving animals. The team then plans to make various improvements. “As we validate these platforms are stable for chronic use, we'll be making them smaller, adding additional functionality like stimulation, and other types of sensors,” Maharbiz says. “The idea that you could use these to take data about pH, oxygen, chemicals, tumors, all sorts of things, deep in your body, and communicate robustly, is extremely exciting.”
The team also plans to use multiple transceivers to keep better track of motes if they move. This would also allow steering the ultrasonic beam to communicate with multiple implants. “The vision is to implant a bunch of these motes anywhere in the body and have a patch that sends ultrasonic waves to wake up the sensors and receive information for any desired therapy,” Seo says. “Everything would be sealed in, with one patch over the site that can talk to the implants individually or simultaneously.”
The original aim of the project was to develop the next generation of brain-machine interfaces, Seo says. The group published a theoretical analysis in 2013 showing the technique could work with implants as small as 50 microns, a scale comparable to neurons. One of the biggest challenges facing neuroscientists is how to put things inside the brain without damaging or disrupting tissue, and part of the problem is that anything bigger than a couple of cells tends to provoke biological responses, including inflammation.
Standard implants not only damage tissue, which slowly degrades performance; the wires they require also carry a risk of infection. Previous wireless implants have suffered critical drawbacks of their own, such as limited depth and lifespans ranging from months to a couple of years. These drawbacks stem mainly from the devices’ use of electromagnetic (EM) waves. EM waves travel much faster than sound, which means their wavelengths are longer at a given frequency. This limits how small the implants can be, because wireless communication requires receivers with dimensions similar to the wavelengths used. EM wavelengths short enough to communicate with tiny implants would correspond to extremely high frequencies, which would cause tissue damage and not penetrate as far. Ultrasound solves these problems. “Walking in a parking lot one day, it occurred to me that ultrasound might be the answer, because soft tissue is relatively transparent to ultrasound, and the wavelengths are perfect,” Maharbiz says. “I got very excited; a sort of eureka moment.”
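The wavelength argument comes down to one relation: wavelength equals wave speed divided by frequency. A back-of-the-envelope calculation (not from the paper; the 2 MHz frequency and the rough EM speed in tissue are illustrative assumptions) shows the gap:

```python
# Wavelength = wave speed / frequency.
SOUND_IN_TISSUE_M_S = 1540.0    # typical speed of sound in soft tissue
EM_IN_TISSUE_M_S = 3.0e8 / 9.0  # rough EM speed in tissue, assuming a
                                # relative permittivity near 80

def wavelength_m(speed_m_s, freq_hz):
    return speed_m_s / freq_hz

freq = 2.0e6  # an illustrative 2 MHz interrogation frequency
print(f"ultrasound: {wavelength_m(SOUND_IN_TISSUE_M_S, freq) * 1e3:.2f} mm")
print(f"EM wave:    {wavelength_m(EM_IN_TISSUE_M_S, freq):.1f} m")
```

At the same 2 MHz, the ultrasonic wavelength is under a millimeter, comfortably matched to a millimeter-scale mote, while the EM wavelength is on the order of meters, hence the push toward damaging, poorly penetrating high frequencies when EM is used instead.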
If the team can develop sufficiently small implants, it may pave the way for treating neurological disorders including epilepsy. It will also fall within the remit of President Barack Obama's Brain Research through Advancing Innovative Neurotechnologies (BRAIN) initiative, which encourages the development of new tools for communicating with the brain. “What's exciting about this is it uses an old technique we already know much about, that's used in clinical settings every day,” says Miyoung Chun, executive vice president of science programs at The Kavli Foundation, a major player in the BRAIN initiative. “A lot of new tools developed in animals, like optogenetics, are not ready for human application, but there are different opportunities here; this looks really exciting.”
Simon J Makin is an auditory perception researcher turned science writer and journalist. Originally from Liverpool in the north of England, he has a bachelor's in engineering, a master's in Speech and Hearing Sciences, and a PhD in computational auditory modelling from The University of Sheffield. He spent several years working as a research fellow in the psychology department at The University of Reading before branching out and retraining in journalism.