For more than a decade, Thomas Oxley has argued that the human brain doesn’t need to be cut open to connect with machines. The Australian-born neurologist has long held that the safest path to a brain-computer interface lies not through the skull but through the bloodstream.
In 2012, while finishing his medical training in Melbourne, Oxley and biomedical engineer Nick Opie built a device that could be delivered like a stent but would record brain activity rather than prop open an artery. They called it the Stentrode™. Thirteen years later, that stent-like scaffold has become one of the most closely watched technologies in neuroscience.
Unlike traditional implants that require drilling into the skull, the Stentrode is inserted through a catheter in the jugular vein and positioned inside a blood vessel above the motor cortex. It records neural signals and sends them wirelessly to an external receiver, giving people with paralysis the ability to control digital devices with their thoughts.
Synchron, which moved its headquarters to Brooklyn in 2016, has run clinical trials in both the U.S. and Australia. Ten patients so far, many living with ALS or spinal injuries, have used the implant to send texts, browse the web, and manage connected devices.
“We’ve built the first non-surgical brain-computer interface designed for everyday life for people with paralysis,” Oxley said.
When the Mind Becomes the Interface
The company is now blurring the line between neuroscience and artificial intelligence. Over the past two years, Synchron has layered machine learning onto its neural-signal pipeline, training algorithms to translate brain activity into intent. Inside its Cognitive AI division in New York, engineers and neuroscientists feed hours of brain data into GPU clusters that learn how to decode thought in real time.
That work took center stage at Nvidia’s GTC conference in San Jose this year. In a live demo, Rodney Gorham, a patient in Melbourne who has ALS, used Synchron’s implant paired with Apple’s Vision Pro and Nvidia’s Holoscan platform to control his home. He played music, adjusted lights, ran a robotic vacuum, and even triggered his pet feeder, all using brain signals interpreted by AI.
“Faster and more precise decoding shortens the delay between intent and response,” said David Niewolny, Nvidia’s senior director of health care and medtech. “With Cognitive AI, the mind itself becomes the user interface.”
The system’s neural data runs through GPU-based models trained to recognize specific signal patterns. Each user’s brain builds its own model, tuned over time. The result: a BCI that learns the person, not the other way around.
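A decoder that "learns the person" can be pictured as a small model updated online from each user's own signals. The following is a minimal, hypothetical sketch, not Synchron's actual pipeline, whose models and features are proprietary: a toy logistic decoder whose weights gradually tune to one user's "select" signature.

```python
import numpy as np

class PerUserDecoder:
    """Toy per-user intent decoder: a logistic model whose weights are
    updated online as one user's neural feature vectors arrive.
    Purely illustrative; real BCI decoders are far more elaborate."""

    def __init__(self, n_features, lr=0.1):
        self.w = np.zeros(n_features)
        self.b = 0.0
        self.lr = lr

    def predict_proba(self, x):
        # Probability that this signal expresses a "select" intent.
        z = x @ self.w + self.b
        return 1.0 / (1.0 + np.exp(-z))

    def update(self, x, intended):
        # intended: 1 = user meant "select", 0 = resting signal.
        err = self.predict_proba(x) - intended
        self.w -= self.lr * err * x   # gradient step on log-loss
        self.b -= self.lr * err

# Simulated calibration: the decoder tunes to one user's signal pattern.
rng = np.random.default_rng(0)
decoder = PerUserDecoder(n_features=8)
pattern = rng.normal(size=8)              # this user's "select" signature
for _ in range(500):
    intended = int(rng.integers(0, 2))
    x = intended * pattern + rng.normal(scale=0.5, size=8)
    decoder.update(x, intended)

print(decoder.predict_proba(pattern))     # high confidence after tuning
```

The key design point the article describes is that the model's parameters belong to the user: calibration data from one brain shapes one decoder, rather than forcing every user onto a single fixed mapping.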
The Apple Connection
Synchron was also the first brain-computer-interface company to integrate directly with Apple’s ecosystem. Working with Apple’s accessibility team, it built a Bluetooth-based protocol that lets brain signals control iPhones, iPads, and Vision Pro headsets via Switch Control, without touch, voice, or eye-tracking.
“The goal is digital agency,” Oxley said. “People who lose physical control should still have the ability to communicate and express themselves.”
Synchron recently raised $200 million in Series D funding, led by Double Point Ventures with participation from ARCH Ventures, Khosla Ventures, Bezos Expeditions, T.Rx Capital, the Australian National Reconstruction Fund, Qatar Investment Authority, and IQT. The round brings its total financing to $345 million and values the company near $1 billion.
“Synchron is building the first scalable, minimally invasive brain-computer interface that fits within existing health-care systems,” said Campbell Murray, Double Point’s managing partner, who has joined Synchron’s board. “Its combination of neurovascular access, device engineering, and adaptive AI represents a major step in restoring digital control to people with paralysis.”
The funding will support a large-scale pivotal clinical trial in the U.S. and development of a next-generation interface that expands beyond the motor cortex to multiple brain regions using the same vascular route. Roughly one-fifth of the capital is earmarked for that new platform.
Engineering the Next Generation
The next-gen system is being built at Synchron’s San Diego engineering hub, led by Andy Rasdal, former CEO of DexCom, and Mark Brister, a Medtronic and DexCom R&D veteran. Their brief was to develop a high-channel, transcatheter whole-brain interface capable of collecting a far broader range of neural signals.
“It’s still delivered through a catheter, but it can reach multiple regions and gather many more channels,” Oxley said. “We want a delivery method that scales and a data stream that’s comprehensive.”
Synchron’s model keeps complexity out of the operating room. By using the same endovascular tools common in cardiac and stroke procedures, the company aims to make brain-computer interfaces available in ordinary hospitals, not just research centers.
Synchron is often mentioned alongside Elon Musk’s Neuralink, but the two companies embody different philosophies. Neuralink’s implant requires open-brain surgery to place electrodes directly into tissue, promising high bandwidth but limiting accessibility. Synchron’s catheter-based route avoids surgery altogether and can be performed by interventional neurologists.
That difference allowed Synchron to reach human trials first. Its 2019 study demonstrated computer control five years before Neuralink’s first patient implant. Where Neuralink focuses on hardware breakthroughs, Synchron’s advantage lies in clinical pragmatism and data infrastructure.
The AI Feedback Loop
Every new patient adds brain-signal data that strengthens Synchron’s decoding algorithms. Combined with Nvidia’s computing power, the company’s models can detect intent faster and with greater precision. The result is a feedback loop: as AI improves decoding, patients gain smoother control; their cleaner signals, in turn, refine the AI.
For now, the mission is medical: helping people who have lost movement or speech regain a measure of control. Yet the implications reach deeper into the future of computing. By translating the language of the brain into structured data, Synchron is pushing AI systems to learn from the most complex dataset of all: human thought.
“The immediate focus is restoring communication and independence,” Oxley said. “But the underlying idea is simple—your brain should be able to talk to technology directly.”
Synchron’s regulatory path follows the playbook for implantable medical devices: a multi-year pivotal trial, FDA review, and eventual commercial rollout. The company’s endgame isn’t consumer gadgets but a medical platform that can scale safely within existing health-care systems.
Still, its work is redefining what an interface can be. The same pipelines that help a paralyzed patient send a message could one day teach AI to interpret human intention at unprecedented depth.
In Oxley’s words, “AI has learned to speak. Now it’s learning to listen—to the brain itself.”