The First Mind-Controlled VR Game Will Hit Arcades in 2018

Neurable’s brain-scanning headband brings hands-free control to virtual reality entertainment

Photo of person wearing VR headset.

“Wake up, this is not a test,” intones a voice as the virtual reality game Awakening begins. Your game character is a child trapped in a nefarious government lab, and as you scan the room you see a variety of objects lying on the floor, each flashing with light. You focus your mental attention on a block, and it rises up and rotates in the air before you. Then you focus on a mirror on the wall, and the block hurtles toward it and smashes the glass, revealing a scrawled sequence of numbers beneath. You notice a keypad by the door with numbers that are also subtly flashing. Using only your Jedi powers, you focus on certain digits in the correct sequence to open the door.

The technology that makes this game possible is a brain-scanning headband that attaches to a VR headset. That headband, paired with software that interprets the neural signals, enables wearers to play games without using any sort of hand controller. The creators of this brain-computer interface system, at the Boston-based startup Neurable, believe this intuitive controller will be the next big thing in VR. “We’ve essentially created a brain mouse,” says Ramses Alcaide, Neurable’s co-founder and CEO.

Awakening is the world’s first brain-controlled VR game, and curious gamers will get a chance to play it later in 2018, when it arrives in VR arcades around the world. The headband incorporates seven bulky electrodes that record EEG (electroencephalography) signals, a standard method of monitoring the electrical activity of broad swaths of brain cells. To detect the user’s intention, Neurable’s system makes clever use of a type of brain signal called an event-related potential. As you focus on a toy block that’s pulsing with light, for example, your brain subconsciously registers its particular pattern of flashes, and certain neurons “fire” in response. Neurable’s software processes the noisy EEG data, finds the signal buried in the noise, and translates it into a game command: Use the block.
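The selection mechanism can be sketched as a toy simulation: give each on-screen object its own flash schedule, look for a brain response time-locked to those flashes, and pick the object whose flash-locked average is strongest. Everything here (the sample rate, the size and timing of the evoked response, the object names) is invented for illustration; a real system would classify multichannel EEG with a trained model, not a simple average.

```python
import random

random.seed(0)
FS = 250                     # sample rate, Hz
N = FS * 20                  # 20 s of simulated single-channel EEG
DELAY = int(0.30 * FS)       # evoked response ~300 ms after a flash
WIN = 10                     # samples to average in the response window

# Three on-screen objects, each flashing at its own random moments.
names = ["block", "mirror", "keypad"]
flashes = {n: sorted(random.sample(range(N - FS), 15)) for n in names}

attended = "block"           # the object the player is focusing on
eeg = [random.gauss(0.0, 1.0) for _ in range(N)]
for t in flashes[attended]:  # each attended flash evokes a small bump
    for k in range(WIN):
        eeg[t + DELAY + k] += 0.8

def flash_locked_score(times):
    """Mean EEG amplitude ~300 ms after each of an object's flashes."""
    total = 0.0
    for t in times:
        total += sum(eeg[t + DELAY : t + DELAY + WIN]) / WIN
    return total / len(times)

scores = {n: flash_locked_score(ts) for n, ts in flashes.items()}
chosen = max(scores, key=scores.get)
print(chosen)                # → block
```

Averaging over many flashes is what rescues the tiny evoked response from EEG noise: the noise cancels, the time-locked signal does not.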

Neurable chose to use flashing objects and the associated neural signals because its EEG system’s scalp electrodes can reliably pick up those brain patterns. Neuroscientists haven’t yet figured out how to detect signals that would allow for more direct control (such as a signal that means “move the block to the left”) without resorting to surgically implanted electrodes.

Alcaide says that Awakening isn’t very sophisticated in its story line. He explains that Neurable hired a VR graphics company to create the game merely as a demonstration of the technology. But the company is now offering a developer’s kit that game designers can use to devise all manner of entertainments and experiences, and he is looking forward to seeing what third-party developers will dream up. “We’re not game designers; we don’t know how to lead players through these environments,” Alcaide says. “The narrative is the hard part, not the technology.” While the current EEG headband fits best with the HTC Vive headset, it’s also compatible with other VR systems.

Rolling out the game in VR arcades is a smart strategy, says Jitesh Ubrani, an analyst at the research firm IDC, headquartered in Framingham, Mass., who co-authored a recent VR market report. Ubrani says the high price of VR headsets has prevented widespread adoption, particularly because consumers don’t have many opportunities to try before they buy. “I think VR arcades will play a very important role,” Ubrani says. “They make it easy for people to try it out and learn about the VR experience.” While only a handful of such arcades have opened in the United States and Europe, Ubrani says they’re already “hugely popular” in China and elsewhere in East Asia.

Neurable isn’t the only company aiming to make a more intuitive interface for VR. Ubrani notes that companies like Leap Motion, based in San Francisco, are working on hand-tracking systems that allow for gesture-based interfaces. Such systems, expected to debut in the next few years, also aim to replace handheld controllers and might seem more natural to gamers than Neurable’s brain-control system.

Neurable’s Alcaide says he isn’t worried, because he sees VR games as just the first application of his company’s technology. To make the system more versatile, Alcaide says its hardware will evolve to become less obtrusive: He envisions first a headband with only one or two small EEG electrodes, and eventually an EEG sensor that fits snugly into an earbud. Those discreet sensors could then be used with augmented reality (AR) glasses, which layer virtual imagery over a view of the real world. If such glasses catch on for commercial or consumer use, Neurable’s technology would enable interaction without using a smartphone, gesturing, or issuing voice instructions. Instead, users would just focus their attention on a menu command, a “record” button, or whatever else they wanted to click on. “EEG offers a screen-free solution that’s private,” Alcaide says. “You won’t have to wave your arms around or talk out loud on the bus.”

Bats, Blimps, and Giant Camera Chips

    Moonward Ho!

    In December 1968, Apollo 8 became the first manned mission to orbit the moon. A half-century on, SpaceX, Elon Musk’s spaceflight company, is vying to do the same thing, offering to send two private customers on a lunar flyby aboard its Dragon 2 capsule. Meanwhile, German startup Part-Time Scientists aims to land the first 4G LTE base station on the moon this year. The base station will relay signals between the company’s yet-to-be-launched rovers and mission control back on Earth, but it could also be used by future lunar explorers. Further-out moon ventures include an inflatable orbiting habitat being developed by Bigelow Aerospace. If all goes according to plan—admittedly, a big “if”—2018 could mark the beginning of the return of humans to the moon. And this time it’ll be for a good long stay.

    EU Doubles Down on Data Privacy

    On 25 May, the European Union’s General Data Protection Regulation (GDPR) will take effect, with tough rules aimed at protecting the privacy of people living in the EU. Europeans already have many more privacy protections than, say, U.S. citizens, including the “right to be forgotten.” But the GDPR goes much further: It protects virtually every kind of data pertaining to individuals, including medical records, online transactions, and social media posts. It also gives EU residents the right to opt out of automated decision making—via a machine-learning algorithm, for example—and to demand an explanation when an automated decision involves them in some significant way. The GDPR applies to companies doing business in Europe as well as companies that handle the data of Europeans. Unsurprisingly, firms far and wide are scrambling to comply.

    Blimp Cell Towers Head Skyward

    This year, Altaeros Energies plans to launch the first of its tethered-blimp cell towers, called SuperTowers. Each aerostat, floating up to 600 meters above the ground, will provide coverage equal to that of 30 traditional cell towers. The blimps are intended for remote locations where broadband service is too difficult or costly to supply by conventional means. Several other companies aim to do similar things, including Google, with its Project Loon balloons, and Facebook, with its solar-powered Internet drone, Aquila. Altaeros’s other big push is in high-altitude wind turbines. Who knew you could build a diversified business around lofting tech-laden tethered balloons?

    A Home That Floats

    Worldwide, hundreds of millions of people live on floodplains, where they’re at risk of losing their homes, if not their lives, to rising water. Such risks could be reduced if their homes could float. That’s the idea behind LifeArk, a prefabricated modular dwelling that is cheap to make, easily transported in shipping containers, and then quickly assembled on-site using standard tools. A project of the architectural firm GDS, the 6-square-meter units can be bolted together into larger structures and connected to the main power grid and sewer system, if available. For off-grid locations, the units come with solar panels, rainwater harvesting and filtration, and waste management systems. The first prototypes will be floated, er, installed on a lake in Lindale, Texas, about 140 kilometers east of Dallas, later this year.

    Every Shark Counted

    Sharks and rays are threatened worldwide, but even scientists who study them haven’t been able to quantify the extent of the problem. Vulcan Inc., in Seattle, a philanthropic entity of Microsoft co-founder Paul Allen, aims to fill in the missing data. Its three-year Global FinPrint project is counting sharks, rays, and other marine life around coral reefs, using remote underwater video stations as well as a video-processing AI that helps identify animals caught on camera. The survey of 400 reefs is scheduled to wrap up this year. Already, the data has been used by Belize to create a ray sanctuary, and it’s informing the Dominican Republic’s efforts to protect sharks. The project has also generated intriguing clips of eels, sea turtles, and sea snakes—which admittedly don’t have quite the viral pull of cat videos.

    150 Megapixels in Your Camera

    Sony continues its domination of digital camera sensors with the release this year of the IMX411, a CMOS sensor chip capable of an “absurd” (as one blogger put it) 150 megapixels. The chip will also shoot ultrahigh-definition 8K video at 30 frames per second. Two other sensors, the IMX461 and IMX211, will offer 100-megapixel resolution. All three chips are intended for medium-format digital cameras—Sony’s as well as other companies’—and for applications like large-area surveillance, digital archiving, and industrial inspection. If you’re thinking you really need such a camera, better stock up on storage, too: Each 150-megapixel image will translate into a 300-megabyte file.
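Assuming an uncompressed raw format at 16 bits (2 bytes) per pixel, which isn't stated above but is typical for high-end sensors, the arithmetic behind the 300-megabyte figure checks out:

```python
pixels = 150_000_000        # one 150-megapixel frame
bytes_per_pixel = 2         # assumed 16-bit raw, uncompressed
size_mb = pixels * bytes_per_pixel / 1_000_000
print(size_mb)              # → 300.0
```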

    Linking Up Chile’s Long, Skinny Grid

    From north to south, Chile extends 4,300 kilometers, but at its widest point, it’s just 350 km. This elongated profile poses a challenge for the country’s grid manager, Coordinador Eléctrico Nacional (CEN). Until recently the Chilean grid consisted of four separate electricity networks, so there was no way to move, say, solar energy generated in the northern desert to the country’s populated middle. Last year, though, construction wrapped up on the 580-km Mejillones-Cardones interconnection, finally linking up the northern and central grids. Later this year, a new 750-km transmission line will better connect points within the central network, and CEN plans to fund another US $600 million in transmission projects, including a 500-kilovolt line for the south. A robust transmission network could allow Chile to tap into ocean and tidal energy—with 4,300 km of coastline, it’d be a shame not to.

    A Subway Fit for a Queen

    Late this year, the first major section of London’s £14.8 billion Crossrail train network is set to open. When the new rail service fully opens in December 2019, it will add 42 kilometers of tunnels to the capital’s transit system, along with 10 new stations and upgrades to an additional 30 stations. The 10-year effort—the biggest construction project in Europe—promises to relieve congestion and shorten travel times for up to 200 million passengers a year. Although the new service, the Elizabeth line, is named for England’s longest-reigning monarch, the queen strikes us as an unlikely commuter.

    Error-Detecting Voting Tech

    The two big concerns about electronic voting are that a system error will cause votes to be inadvertently miscounted or that a hacker will cause votes to be intentionally miscounted. Starting this year, the state of Colorado will roll out a technique that proponents say will guarantee the correct outcome: risk-limiting audits. This statistical approach, which the state successfully piloted in the November 2017 election, relies on comparing a random sample of paper ballots with the corresponding digital votes. The closer the election result, the more ballots get audited. If the audit finds an error in the reported outcome, a full hand count will be done. But if the audit finds the reported outcome to have a high likelihood of being correct, no hand recount is needed. The company developing the software for Colorado, Free & Fair, is open-sourcing it so that other states can adopt it.
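The audit logic can be sketched as follows. The sample-size rule and the escalation step are drastically simplified stand-ins for the real statistics (actual risk-limiting calculations, such as the Kaplan–Markov method, are more involved), but they show the key behavior: tighter margins demand bigger samples, and any discrepancy pushes toward a hand count.

```python
import math
import random

def initial_sample_size(margin, risk_limit=0.05):
    # Rule of thumb for a ballot-comparison audit expecting no
    # discrepancies: roughly -2 * ln(risk_limit) / margin ballots.
    return math.ceil(-2 * math.log(risk_limit) / margin)

def audit(paper, digital, margin, risk_limit=0.05, seed=1):
    """Compare a random sample of paper ballots with the digital record."""
    rng = random.Random(seed)
    n = min(initial_sample_size(margin, risk_limit), len(paper))
    sample = rng.sample(range(len(paper)), n)
    mismatches = sum(paper[i] != digital[i] for i in sample)
    if mismatches == 0:
        return "outcome confirmed"
    return "escalate toward full hand count"

ballots = ["A"] * 600 + ["B"] * 400          # a 20-point margin
print(initial_sample_size(0.20))             # → 30
print(audit(ballots, ballots, margin=0.20))  # → outcome confirmed
```

Note how the sample size blows up as the margin shrinks: a 2-point race requires roughly ten times as many ballots as a 20-point race.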

    Frankenstein Turns 200

    This year marks the 200th anniversary of the publication of Frankenstein. Although Mary Shelley began writing her gothic novel on a dare to devise a good ghost story, she also wove in elements of the latest scientific theories of the day, including Galvani’s studies of “animal electricity” and contemporary debates over human consciousness. Thus did Shelley spark a pop culture meme that today is as popular in Hollywood as it is revered in academia. Universities around the world will host Frankenfests throughout the year to celebrate the book, its creator, and her ideas.

    Waiting for Stratolaunch

    Announced in late 2011, Paul Allen’s humongous rocket-launching aircraft was supposed to take its first test flight in 2016, which got pushed to 2017 and then 2018. Most recently, Allen’s company said it would conduct engine tests at NASA Stennis Space Center in the second half of this year. Next year could finally see the Stratolaunch’s maiden voyage. The idea of the 117-meter-wide, six-engine plane is still appealing: Taking off from a commercial runway, it will ascend to about 9,100 meters carrying one or more rockets, for a total payload weight of 230,000 kilograms. (That’s an order of magnitude greater than the payload of Orbital ATK’s Stargazer.) From that altitude, a rocket is clear of more than half of the planet’s atmosphere and thus far easier to propel to low Earth orbit. The project’s long timeline only goes to show that reducing the cost and complexity of rocket launch is still about as hard as rocket science itself.

    Good News for Bats

    When bats meet wind turbines, it’s invariably the bats that lose. According to one study, U.S. wind power killed more than 600,000 bats in 2012. Since then, the world’s wind-generating capacity has doubled. Curtailing wind turbines during periods of peak bat activity does reduce fatalities, but it also cuts into an operator’s revenues. This year, NRG Systems, based in Hinesburg, Vt., will release a commercial version of its ultrasonic bat-deterrent system, which requires no curtailment. The equipment sits on the turbine’s nacelle and emits ultrasonic sound between 20 and 50 kilohertz—the same frequencies North American bats use for echolocation. A bat nearing the turbine will immediately change direction, thereby avoiding its date with destiny.

Your Next T-Shirt Will Be Made by a Robot

Georgia Tech spin-off SoftWear Automation is developing ultrafast sewing robots that could upend the clothing industry

illustration

Sometime later this year, dozens of robots will spring into action at a new factory in Little Rock, Ark. The plant will not make cars or electronics, nor anything else that robots are already producing these days. Instead, it will make T-shirts—lots of T-shirts. When fully operational, these sewing robots will churn them out at a dizzying rate of one every 22 seconds.

For decades, automating the sewing of garments has vexed roboticists. Conventional robots excel at manipulating rigid objects but are rather inept at handling soft, flexible materials like fabric. Early attempts to automate sewing included treating pieces of cloth with starch to temporarily stiffen them, allowing a robot to manipulate them as if they were steel sheets. This and other approaches never became commercially viable, however, mainly because the clothing industry could instead rely on cheap labor in developing countries.

Now a Georgia Tech spin-off, SoftWear Automation, in Atlanta, claims to have built a practical sewing robot. And it doesn’t need starch. Rather, it’s based on a much higher-tech approach, one that combines machine vision and advanced manipulators. At the Arkansas factory, owned by Tianyuan Garments Co., one of China’s largest apparel manufacturers, SoftWear’s robots, called Sewbots, will equip 21 production lines, designed to make 23 million T-shirts per year for Adidas.

“Around the world, even the cheapest labor market can’t compete with us,” Tang Xinhong, chairman of Tianyuan, told China Daily last year, referring to the cost of producing each T-shirt, which he expected to be only 33 U.S. cents. The fact that a Chinese company will use robots to make T-shirts in the United States appears to be a watershed moment for the clothing industry. Satyandra K. Gupta, director of the Center for Advanced Manufacturing at the University of Southern California, in Los Angeles, says sewing robots will ultimately allow factories to produce clothing not only faster and cheaper but with greater customization. “You’ll get clothes made based on your body size and fashion tastes,” he says. “This has potential to significantly change the industry.”

Today, if you walk into a garment factory, you’ll find workers performing almost every task required to make a piece of apparel. What happens when robots take over their labours? While some observers warn that millions risk losing their jobs, others argue that in the long term automation will decentralize manufacturing, creating new, better jobs in many more places.

“Our vision is that we should be able to manufacture clothing anywhere in the world and not rely on cheap labor and outsourcing,” says Palaniswamy “Raj” Rajan, the chairman and CEO of SoftWear, which has raised US $7.5 million from venture capital firm CTW Venture Partners. When manufacturers are located nearer their customers, he says, they can design and deliver new products faster and also reduce transportation and inventory costs.

But the changes won’t happen overnight. Automated sewing, despite the progress demonstrated by SoftWear and others, remains extremely challenging. Fabric comes in many different weights and textures, and handling such a wide variety is still tricky for robots. “Wherever there’s a need to manipulate fabric—for example, to load the sewing machine—then the human is still very much in play,” says David Bourne, a principal scientist at Carnegie Mellon University’s Robotics Institute who focuses on building intelligent systems for automated manufacturing. “The material-handling part of this whole thing is missing.”

The approach SoftWear came up with to solve this problem is rather ingenious. (The company has three issued patents and several more patent applications.) Instead of trying to manipulate a piece of fabric by keeping track of its overall dimensions—which is tricky because textiles stretch and deform—the company decided to track individual threads in the fabric. To do that, it developed a specialized camera capable of capturing more than 1,000 frames per second, and a set of image-processing algorithms to detect, on each frame, where the threads are. At the same time, the company built robotic manipulators to mimic the way sewing-machine operators use their fingers to handle fabric. These micromanipulators, powered by precise linear actuators, can guide a piece of cloth through a sewing machine with submillimeter precision, correcting for distortions of the material.
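SoftWear’s actual algorithms are proprietary, but the general idea of locating a feature to subpixel precision can be illustrated on a single synthetic scanline: a thread shows up as a dark dip in brightness, and fitting a parabola through the dip and its neighbors yields a position finer than one pixel. This is a generic computer-vision sketch, not the company’s method.

```python
def thread_positions(scanline):
    """Locate dark threads in a 1D brightness profile.

    Each thread appears as a local intensity minimum; the vertex of a
    parabola through the minimum and its two neighbors gives a
    subpixel position estimate.
    """
    positions = []
    for i in range(1, len(scanline) - 1):
        a, b, c = scanline[i - 1], scanline[i], scanline[i + 1]
        if b < a and b < c:                      # strict local minimum
            # Parabola through (-1, a), (0, b), (1, c); vertex offset.
            offset = 0.5 * (a - c) / (a - 2 * b + c)
            positions.append(i + offset)
    return positions

# Synthetic scanline: bright fabric (200) with dark dips at two threads.
line = [200] * 30
for center in (8, 19):
    line[center - 1], line[center], line[center + 1] = 120, 60, 120
print(thread_positions(line))  # → [8.0, 19.0]
```

At 1,000-plus frames per second, estimates like these would be recomputed continuously, which is what lets a controller correct for fabric stretch on the fly.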

In addition, SoftWear came up with two other systems to move fabric panels around: One is a four-axis robotic arm with a vacuum gripper that can pick up and place fabric items on the sewing table; the other is a 360-degree conveyor system that uses spherical rollers embedded on the table to slide and rotate the panels at high speed. The company’s current Sewbots can make bath rugs, pillowcases, towels, and other products that are flat and mostly round or square in shape. Rajan, the CEO, says 2 million such products are already for sale at Target, Walmart, and other major retailers, and that before year-end “our robots will be making 30 million pieces a year.”

SoftWear is now improving its sewing robots for operation at the Tianyuan factory. Making a T-shirt is much more complicated than making a rug because a T-shirt requires multiple seams and hems that are not flat. If all goes as planned, the Sewbot will be able to carry out the same tasks that 10 workers currently perform, like sewing a sleeve or attaching a label, in a conventional production line—except the robot will be able to make the same T-shirt in about half the time. After T-shirts, SoftWear wants to focus on jeans, dress shirts, and uniforms, which are even harder to make. Will robots eventually sew every piece of clothing we wear? No, Rajan says: “High fashion, bridal dresses, things like that—those are still going to be done by humans.” So, for now, it appears that “robot couture” will have to wait.

This AI Hunts Poachers

The elephant’s new protector is PAWS, a machine-learning and game-theory system that predicts where poachers are likely to strike

illustration

Every year, poachers kill about 27,000 African elephants—an astounding 8 percent of the population. If current trends continue, these magnificent animals could be gone within a decade.
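The quoted figures imply a rough population estimate. Treating "about 8 percent" as exact (it is only approximate), a one-line check puts the total at around 340,000 African elephants:

```python
killed_per_year = 27_000
share_of_population = 0.08              # "about 8 percent"
population = killed_per_year / share_of_population
print(round(population))                # → 337500
```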

The solution, of course, is to stop poachers before they strike, but how to do that has long confounded authorities. In protected areas like wildlife preserves, elephants and other endangered animals may roam far and wide, while rangers can patrol only a small area at any time. “It’s a two-part problem,” explains Milind Tambe, a computer scientist at the University of Southern California, in Los Angeles. “Can you predict where poaching will happen? And can you [target] your patrols so that they’re unpredictable so that the poachers don’t know the rangers are coming?”

To solve both parts of the problem, Tambe and his team created an artificial intelligence system called PAWS, which stands for Protection Assistant for Wildlife Security. A machine-learning algorithm uses data from past patrols to predict where poaching is likely to occur in the future. And a game-theory model helps generate randomized, unpredictable patrol routes. The system has been field-tested in Uganda and Malaysia with good results, and in 2018 its use will expand to China and Cambodia. In addition, Tambe says, the PAWS system could soon be integrated into an existing tracking tool called SMART, which wildlife conservation agencies have deployed at most sites worldwide to collect and manage patrol data.
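The two-part recipe can be caricatured in a few lines: take the per-cell poaching risk predicted by the learning model and turn it into a randomized patrol schedule, so that high-risk areas are visited often but never on a fixed pattern. The cell names and risk numbers below are invented, and the weighted sampling is a simple stand-in for PAWS’s real game-theoretic solver.

```python
import random

def patrol_plan(risk, n_patrols, seed=None):
    """Draw patrol destinations at random, weighted by predicted risk.

    High-risk cells get patrolled more often, but no individual visit
    is predictable, so poachers cannot simply avoid a fixed schedule.
    """
    rng = random.Random(seed)
    cells = list(risk)
    weights = [risk[c] for c in cells]
    return [rng.choices(cells, weights=weights)[0] for _ in range(n_patrols)]

# Hypothetical per-cell risk scores from the machine-learning model.
risk = {"north_ridge": 0.6, "river_bend": 0.3, "grassland": 0.1}
plan = patrol_plan(risk, n_patrols=1000, seed=42)
print(plan.count("north_ridge"), plan.count("grassland"))
```

Over many patrols the visit frequencies track the risk scores (roughly 6:3:1 here), while any single day's destination remains a coin flip.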

In a one-month trial with the Wildlife Conservation Society in Uganda’s Queen Elizabeth National Park, rangers patrolled two areas that they rarely visited but that PAWS indicated had a high probability of poaching. Much to the rangers’ surprise, they found numerous snares and other signs of illegal activity. A later eight-month trial looked at the entire park. Again, the patrols verified the model’s predictions: In the high-probability areas, they found about 10 times as much poaching as in the low-probability areas. A new trial in Uganda’s Murchison Falls National Park is checking whether PAWS will work equally well in a different location.

Andrew Plumptre, director of science for the Wildlife Conservation Society’s Africa program, is collaborating with Tambe’s group on the Uganda field trials. He says that on normal patrols, rangers enter data about what they’re seeing, using a smartphone app called Cybertracker. About once a month, that data gets uploaded to SMART. “You’re able to map where patrols have searched, where they found snares and carcasses of elephants and whatever,” says Plumptre. “But there’s nothing proactive about it. Ranger patrols alone aren’t sufficient to stop poaching.” He’s hoping that PAWS’ predictive abilities will make those patrols as efficient and effective as possible.

The PAWS system grew out of work Tambe and his students started doing more than a decade ago for port, airport, and airline security. The U.S. Coast Guard, the Transportation Security Administration, and the Los Angeles Sheriff’s Department have all deployed AI systems developed by Tambe’s group. And he cofounded Avatar Intelligence, in Venice, Calif., to commercialize this research.

About six or seven years ago, Tambe was at a World Bank meeting and saw a talk on the dire plight of tigers, fewer than 4,000 of which survive in the wild. “I guess I’d heard about such things, but I never appreciated the scope of the problem. I suddenly realized the potential of AI to help,” Tambe says. He quickly got in touch with conservation groups.

Fei Fang, a former student of Tambe’s who is now an assistant professor at Carnegie Mellon, had worked on a Coast Guard system to protect the Staten Island Ferry, in New York City, before turning to PAWS. The two scenarios are similar, she notes. “There is a defender, which is the wildlife ranger or the Coast Guard, and there is an attacker, which is a poacher or a terrorist, and they’re interacting with each other in a way that you’re trying to predict.”


For the PAWS team, the field trials drove home an important reality of wilderness policing: The world is not flat. When the team began working in Malaysia, Fang says, they didn’t factor in the densely forested, mountainous terrain. “In our first model, we took a map, divided the whole area into grid cells, drew a line on the grid, and said, ‘Patrollers, please follow this line,’ ” she recalls. “We’d have Skype calls with them, and they’d tell us: ‘No, no, no, this is not going to work.’ We didn’t understand.”

Only when the PAWS team visited the Malaysian reserve did they get it. “We walked the route with the rangers, and it took us about 8 hours to go a couple of miles,” Fang says. A subsequent refinement of PAWS takes into account geographical features that are easy to walk on, like ridge lines, streambeds, and old logging trails. “We built a virtual street map for the conservation area and then plotted routes based on the map.” Patrollers following the new routes found “all kinds of signs of animal and human activity,” Fang says.
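The fix Fang describes amounts to route planning on a graph whose edges are walkable terrain features weighted by travel time. A standard shortest-path search over an invented toy map shows the effect: the "direct" line through dense forest loses to the longer but faster ridge-and-trail route. The place names and hour weights are made up for illustration; this is not the PAWS planner itself.

```python
import heapq

def cheapest_route(graph, start, goal):
    """Dijkstra over a terrain graph; edge weights are hours of walking."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue                      # stale queue entry
        for nbr, w in graph[node].items():
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(pq, (nd, nbr))
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return list(reversed(path)), dist[goal]

# Toy terrain: cutting straight through forest looks shorter on a flat
# map but costs far more hours than following ridge line and old trail.
graph = {
    "camp":    {"forest": 8.0, "ridge": 1.0},
    "forest":  {"camp": 8.0, "lookout": 8.0},
    "ridge":   {"camp": 1.0, "trail": 1.5},
    "trail":   {"ridge": 1.5, "lookout": 1.0},
    "lookout": {"forest": 8.0, "trail": 1.0},
}
path, hours = cheapest_route(graph, "camp", "lookout")
print(path, hours)  # → ['camp', 'ridge', 'trail', 'lookout'] 3.5
```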

At press time, Fang was in the midst of a three-month field trial of PAWS in northeast China with the World Wildlife Fund, where the animal of greatest concern is the Siberian tiger. Fang says one enhancement they’re working on is to help rangers make decisions while on patrol. “They may see footprints and tree marks, which indicate the direction the poachers are heading,” she says. “And they need to decide, Should I chase the poachers? What is the best strategy for changing plans if they see the new information?”

Tambe and Fang are also collaborating with a wildlife conservation service called Air Shepherd, which uses drones equipped with infrared cameras to search for poachers at night. Their AI-based video-analysis system is automating what is otherwise a tedious and difficult task for humans: reviewing hours and hours of grainy black-and-white footage and then alerting rangers when illegal activity is detected.

The next step for PAWS is to make it available to other NGOs, ideally by integrating the algorithm into existing tools, like Cybertracker and the SMART system. “We’re probably never going to completely stop poaching,” says Plumptre. “But we can get it down to a lower level so that populations don’t decline.” AI is usually applied to problems of modern technology, Tambe notes, but this work is different. “We’re using AI to save the natural world—these stunning landscapes and animals that we hope won’t disappear,” he says. “These are important treasures.”

Tiny Robots in Disguise Combat Bacteria in the Blood

Miniature robots cloaked in platelets and red blood cells can clear bacterial infections in the blood

MRSA bacteria (spheres) attached to the biohybrid nanorobots.

Esteban-Fernández de Ávila/Science Robotics

Researchers have come up with all sorts of ways to propel tiny robots deep into the human body to perform tasks, such as delivering drugs and taking biopsies. Now, there’s a nanorobot that can clean up infections in the blood. Directed by ultrasound, the tiny robots, made of gold nanowires with a biological coating, dart around the bloodstream, attach to bacteria, and neutralize the toxins the bacteria produce. It’s like injecting millions of miniature decoys into the blood to distract an infection from attacking real human cells.

The invention, developed in the labs of Joseph Wang and Liangfang Zhang at the University of California San Diego (UCSD), was described today in Science Robotics. The researchers hope the robotic detoxification system could provide an alternative to the multiple, broad-spectrum antibiotics currently used to treat life-threatening infections—one that can work in minutes. So far, they have demonstrated the proof-of-concept system in the lab using test tubes of blood. Next, they hope to scale up and refine the process enough to test it in mice, says Berta Esteban-Fernández de Ávila, a postdoc in Wang’s lab who led the study.

The bacteria busters add to the growing list of tiny robots that can deep dive into the human body. Many of these were built with drug delivery in mind, and they move in imaginative ways. There’s the squishy clockwork robot, the jackhammer, the stomach acid-driven nanorocket, the magnetic field-guided nanoparticles, and even robots propelled by sperm cells from bulls and sea creatures. UCSD’s latest bot could have applications in drug delivery, too. But for now, the team is focused on fighting gram-positive bacterial infections. In that type of infection, there are two elements to battle: the bacteria themselves, and the toxins produced by the bacteria. Toxins poke holes in red blood cells, and bacteria attach to platelets in the blood. Both actions eventually destroy their targets, resulting in serious infections in humans.

To address both of these forces, the UCSD team cloaked the nanowires in exactly what the pathogens are looking for: platelets and red blood cells. The fused coating, derived from the cellular membranes of these two components of blood, disguises the nanorobots, making them look like the real thing and giving them biological functions. When the researchers unleash the cloaked biobots in the blood, the bacteria attach to what seems like a platelet, only to find themselves held captive on a nanowire. The toxins interact with what appears to be a red blood cell and get neutralized in the process. Bonus: The disguise is good enough that the body’s own defense systems might not notice the nanowire invaders, either.

Esteban-Fernández de Ávila and her colleagues used ultrasound waves to control the movement of the nanowires, which convert the acoustic energy into motion. With this system, the team was able to increase the collisions between the nanowires and their pathogenic targets, accelerating the detox process. Guiding the collisions with ultrasound is “not the same as putting static nanowires in solution and just waiting,” says Esteban-Fernández de Ávila. “When we apply the acoustic field, the movement of the nanowires produces a very rapid interaction with the pathogen.”

The team tested the nanorobots on methicillin-resistant Staphylococcus aureus, or MRSA, one of the toughest infections around. Samples treated with acoustically propelled robots showed a 2.4-fold lower rupture of red blood cells and a 3.5-fold increase in bacteria binding, compared with samples containing static nanorobots. The nanorobots aren’t going to replace antibiotics anytime soon, but the UCSD team is working on it. Next steps include scaling up the acoustic design, improving the propulsion in complex biological fluids, evaluating other types of propulsion mechanisms, and testing the treatment in mice. After that, they’d like to load up the membranes with high amounts of drugs to see how well the system can perform targeted drug delivery.

– Emily Waltz