
ARTIFICIAL INTELLIGENCE IS PUTTING ULTRASOUND ON YOUR PHONE

Source – wired.com

If Jonathan Rothberg has a superpower, it’s cramming million-dollar, mainframe-sized machines onto single semiconductor circuit boards. The entrepreneurial engineer got famous (and rich) inventing the world’s first DNA sequencer on a chip. And he’s spent the last eight years sinking that expertise (and sizeable startup capital) into a new venture: making your smartphone screen a window into the human body.

Last month, Rothberg’s startup Butterfly Network unveiled the iQ, a cheap, handheld ultrasound tool that plugs right into an iPhone’s Lightning port. You don’t have to be a technician to use one—its machine learning algorithms guide the user toward what they’re looking for. With FDA clearance for 13 clinical applications, including obstetric exams, musculoskeletal checks, and cardiac scans, Rothberg says the new device is poised to disrupt and democratize the medical imaging industry in the same way the Ion Torrent, his DNA sequencer, once made inroads against genomics giant Illumina.

So how did a guy who floats around the Connecticut coast on a $40 million yacht named Gene Machine miniaturize ultrasound devices to the size of a postage stamp? It starts with a search for the beginnings of the universe.

BUTTERFLY NETWORK

In the summer of 2010, Rothberg went to hear a physicist named Max Tegmark speak at MIT about an exciting new way to image the cosmos. To do it, he had to tether together tens of thousands of telescopes (not literally, but with algorithms) to measure energy coming from far-off stars. But getting a whole bunch of antennas to talk to a whole bunch of computers turned out to be a massive computational bottleneck. So Tegmark and a grad student named Nevada Sanchez had come up with a method to split up the work in an efficient way. They called it the Butterfly Network.

As Rothberg sat in the audience, he realized he could use their algorithms to solve a totally different kind of problem: networking thousands of ultrasonic speakers on a silicon chip to make crisp, three-dimensional images of the human body’s insides. It’s something he’d wanted to do ever since sitting through his oldest daughter’s endless doctor’s visits for tuberous sclerosis, a disease that causes dangerous cysts to grow in her kidneys.

At the end of the lecture, Rothberg introduced himself to Tegmark, gave him his pitch, and asked for permission to steal the physicist’s best student. A year later, Rothberg co-founded Butterfly Network with Sanchez, planning to reinvent the clinic’s most popular imaging test.

Despite ultrasound’s lo-fi reputation—the grainy images have got nothing on the clarity of an MRI or CT scan—the tech is surprisingly complicated. It works on the same principle bats use to find prey and avoid obstacles: send out a sound, receive the echo, calculate the distance. For the last 40 years, almost all ultrasound machines have used piezoelectric crystals or ceramics to do that. When electric current is applied to the crystals, they rapidly change shape, which sends out vibrations. When those sound waves bounce back, the crystals emit electrical currents. But each one has to be individually wired, then attached via cables to a separate machine for processing. Plus, the crystals have to be tuned to produce the right type of ultrasonic wave for imaging at a particular depth: one probe for the heart, another for the stomach, another for the uterus. A typical system with multiple wands and a display screen costs upwards of $100,000.
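The send-a-sound, receive-the-echo principle comes down to one line of arithmetic. A minimal sketch, assuming the textbook figure of roughly 1540 m/s for the speed of sound in soft tissue (the function name and numbers here are illustrative, not from Butterfly's software):

```python
# Pulse-echo ranging: the basic arithmetic behind every ultrasound image.
# Assumed value: sound travels through soft tissue at roughly 1540 m/s.

SPEED_OF_SOUND_TISSUE_M_S = 1540.0

def echo_depth_m(round_trip_time_s: float) -> float:
    """Depth of a reflector, given the time for a pulse to travel out and back.

    The pulse covers the distance twice (out and back), so halve the product.
    """
    return SPEED_OF_SOUND_TISSUE_M_S * round_trip_time_s / 2.0

# An echo returning after 65 microseconds puts the reflector about 5 cm deep.
depth = echo_depth_m(65e-6)
print(f"{depth * 100:.1f} cm")
```

The same arithmetic also explains the tuning problem the article describes: deeper targets need longer listening windows and lower frequencies, which is why a conventional system ends up with a different probe per organ.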

BUTTERFLY NETWORK

Getting around these limitations was a two-step process. First, Butterfly’s engineers replaced the crystals with rows and rows of teeny tiny drums, also known as capacitive micromachined ultrasound transducers, or CMUTs. You can think of each one as a metal plate suspended between two electrodes. Run some current through it and the plate vibrates; you tune the frequency by changing the strength of the electric field. They had help from Stanford professor Pierre Khuri-Yakub, who made the first CMUT back in 1994. His research got companies like General Electric, Philips, and Samsung interested in a new way to do ultrasound. But they never figured out how to make them function reliably at scale. “This is a high-powered, electric field-based device,” says Khuri-Yakub. “It’s like a Mustang. If you can control it, it will be good to you. If you can’t, it will be your downfall.”

His engineering and Sanchez’s math completed the redesign. They did away with the wiring and bonded the CMUTs directly to a semiconductor layer containing all the necessary amplifiers and signal processors to turn sounds into pictures. It’s a ton of data; the chip is processing about 20 copies of Wonder Woman on DVD every second. “If the old version of ultrasound is a straw, this is a fire hose,” says Rothberg. But the company can take advantage of the progress in mass-market fabrication techniques perfected for computer chips to bring the cost down. Rothberg says they plan to start shipping units next year, for $2,000 a pop. They hope to outfit each device with deep learning software that’s been trained on hundreds of thousands of ultrasound images, so it knows the difference between a high-quality and poor-quality shot of different body parts, pending a separate FDA approval. And while the device isn’t on the market yet, it has already saved a life.
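The "20 DVDs per second" figure is easy to sanity-check. A back-of-envelope sketch, assuming a standard single-layer DVD capacity of about 4.7 GB (the article does not specify which disc format it means):

```python
# Back-of-envelope check of the "20 copies of Wonder Woman on DVD every
# second" data-rate claim. Assumption: a single-layer DVD holds ~4.7 GB.

DVD_GB = 4.7
COPIES_PER_SECOND = 20

rate_gb_s = DVD_GB * COPIES_PER_SECOND  # raw signal throughput in GB/s
print(f"~{rate_gb_s:.0f} GB/s")
```

That works out to roughly 94 GB of raw acoustic data per second, which is why the signal processing has to live on the chip itself rather than travel over a cable to a separate console.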

In July, Butterfly’s chief medical officer, John Martin, was at a hospital in Denver, doing a set of validation studies on the device. He’d had a cold and some stiffness in his neck, but he figured it was just an overactive lymph node. Since they were testing the device anyway, he smeared some gel on his throat and ran the probe over it. A dark, 3.4 cm mass loomed into view. “That’s not a lymph node,” he thought. It wasn’t. Under his tongue, squamous cell cancer had been growing for months.

He got in to see a specialist, and after a five-and-a-half-hour surgery was tumor-free. He’s now undergoing radiation therapy, which makes his voice prone to breaking as he speaks. He says he probably wouldn’t have sought out a doctor for something as small as a lymph node, and hopes the technology will help other people who might make the same mistake, or who lack access to care in the first place. “Thermometers once lived only inside hospitals. And blood pressure cuffs, and defibrillators,” Martin says. “The sooner we can put smart technologies in the hands of people at home, the sooner the right diagnosis can be made. I’ve yet to find a disease state where earlier detection didn’t lead to better outcomes. And I’m living proof of that.”
