
Science

Facebook and NYU use artificial intelligence to make MRI scans four times faster

If you’ve ever had an MRI scan before, you’ll know how unsettling the experience can be. You’re placed in a claustrophobia-inducing tube and asked to stay completely still for up to an hour while unseen hardware whirs, creaks, and thumps around you like a medical poltergeist. New research, though, suggests AI can help with this predicament by making MRI scans four times faster, getting patients in and out of the tube quicker.

The work is a collaborative project called fastMRI between Facebook’s AI research team (FAIR) and radiologists at NYU Langone Health. Together, the scientists trained a machine learning model on pairs of low-resolution and high-resolution MRI scans, using this model to “predict” what final MRI scans look like from just a quarter of the usual input data. Scans can therefore be completed faster, with less hassle for patients and quicker diagnoses.

“It’s a major stepping stone to incorporating AI into medical imaging,” Nafissa Yakubova, a visiting biomedical AI researcher at FAIR who worked on the project, tells The Verge.

The reason artificial intelligence can be used to produce the same scans from less data is that the neural network has essentially learned an abstract idea of what a medical scan looks like by examining the training data. It then uses this to make a prediction about the final output. Think of it like an architect who’s designed lots of banks over the years. They have an abstract idea of what a bank looks like, and so they can create a final blueprint faster.
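The core idea of acquiring less data and reconstructing from it can be sketched in a few lines of numpy. This is a toy illustration only, not the fastMRI model or its actual sampling pattern: the image, mask, and sizes here are all invented for demonstration. MRI scanners acquire data in the frequency domain (k-space), so a "4x faster scan" amounts to sampling a quarter of the k-space lines:

```python
import numpy as np

# Toy stand-in for an MRI slice: a smooth 2-D Gaussian "anatomy".
x, y = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))
image = np.exp(-4 * (x**2 + y**2))

# MRI scanners acquire data in the frequency domain ("k-space").
kspace = np.fft.fft2(image)

# A 4x faster "scan": keep only every 4th line of k-space.
mask = np.zeros((64, 64), dtype=bool)
mask[::4, :] = True
undersampled = np.where(mask, kspace, 0)

# Zero-filled reconstruction: blurry and aliased. This degraded image
# is the kind of input a trained network would refine into full quality.
recon = np.abs(np.fft.ifft2(undersampled))

print(mask.mean())  # 0.25 -- only a quarter of the data was "acquired"
```

In the article's terms, the network trained on pairs of such degraded inputs and full scans supplies the learned "abstract idea" of what anatomy looks like, restoring the detail the missing k-space lines would have carried.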

“The neural net knows about the overall structure of the medical image,” Dan Sodickson, professor of radiology at NYU Langone Health, tells The Verge. “In some ways what we’re doing is filling in what is unique about this particular patient’s [scan] based on the data.”

The AI software can be incorporated into existing MRI scanners with minimal hassle, say researchers.
Image: FAIR / NYU

The fastMRI team has been working on this problem for years, but today, they are publishing a clinical study in the American Journal of Roentgenology, which they say proves the trustworthiness of their method. The study asked radiologists to make diagnoses based on both traditional MRI scans and AI-enhanced scans of patients’ knees. The study reports that when faced with both traditional and AI scans, doctors made the exact same assessments.

“The key word here on which trust can be based is interchangeability,” says Sodickson. “We’re not looking at some quantitative metric based on image quality. We’re saying that radiologists make the same diagnoses. They find the same problems. They miss nothing.”

This concept is extremely important. Although machine learning models are frequently used to create high-resolution data from low-resolution input, this process can often introduce errors. For example, AI can be used to upscale low-resolution imagery from old video games, but humans have to check the output to make sure it matches the input. And the idea of AI “imagining” an incorrect MRI scan is obviously worrying.

The fastMRI team, though, says this isn’t an issue with their method. For a start, the input data used to create the AI scans completely covers the target area of the body. The machine learning model isn’t guessing what a final scan looks like from just a few puzzle pieces. It has all the pieces it needs, just at a lower resolution. Secondly, the scientists created a check system for the neural network based on the physics of MRI scans. That means at regular intervals during the creation of a scan, the AI system checks that its output data matches what is physically possible for an MRI machine to produce.

A traditional MRI scan created from normal input data, known as k-space data.
GIF: FAIR / NYU

An AI-enhanced MRI scan created from a quarter of normal input data.
GIF: FAIR / NYU

“We don’t just allow the network to create any arbitrary image,” says Sodickson. “We require that any image generated through the process must have been physically realizable as an MRI image. We’re limiting the search space, in a way, making sure that everything is consistent with MRI physics.”
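A data-consistency step of this flavor can be sketched in numpy. This is an invented illustration under a simplified 2-D Fourier model, not the fastMRI team's implementation: wherever k-space was actually measured by the scanner, the network's output is forced to agree with the measurement, so the model can only "imagine" the frequencies that were never sampled:

```python
import numpy as np

def data_consistency(pred_image, measured_kspace, mask):
    """Project a predicted image back onto the measured data: wherever
    k-space was actually sampled (mask is True), overwrite the
    prediction's frequencies with the scanner's measurements."""
    pred_k = np.fft.fft2(pred_image)
    corrected_k = np.where(mask, measured_kspace, pred_k)
    return np.fft.ifft2(corrected_k)

# Example: measured k-space samples survive the step exactly.
rng = np.random.default_rng(0)
truth = rng.standard_normal((32, 32))
mask = rng.random((32, 32)) < 0.25
measured = np.where(mask, np.fft.fft2(truth), 0)

pred = rng.standard_normal((32, 32))  # stand-in for a network's output
out = data_consistency(pred, measured, mask)
assert np.allclose(np.fft.fft2(out)[mask], measured[mask])
```

Applied at regular intervals during reconstruction, a check like this constrains the search space in the way Sodickson describes: every intermediate image stays consistent with what the MRI machine physically recorded.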

Yakubova says it was this particular insight, which only came about after long discussions between the radiologists and the AI engineers, that enabled the project’s success. “Complementary expertise is key to creating solutions like this,” she says.

The next step, though, is getting the technology into hospitals where it can actually help patients. The fastMRI team is confident this can happen fairly quickly, perhaps in just a matter of years. The training data and model they’ve created are completely open access and can be incorporated into existing MRI scanners without new hardware. And Sodickson says the researchers are already in talks with the companies that produce these scanners.

Karin Shmueli, who heads the MRI research team at University College London and was not involved with this research, told The Verge this would be a key step in moving the technology forward.

“The bottleneck in taking something from research into the clinic is often adoption and implementation by manufacturers,” says Shmueli. She added that work like fastMRI was part of a wider, extremely promising trend toward incorporating artificial intelligence into medical imaging. “AI is definitely going to be more in use in the future,” she says.

Source: The Verge Science


Science

Too bright to breed

Night light from coastal cities overpowers natural signals for coral spawning from neighboring reefs.

PHOTO: NOKURO/ALAMY STOCK PHOTO

Most coral species reproduce through broadcast spawning. For such a strategy to be successful, coordination has had to evolve such that gametes across clones are released simultaneously. Over millennia, lunar cycles have facilitated this coordination, but the recent development of bright artificial light has led to an overpowering of these natural signals. Ayalon et al. tested for the direct impact of different kinds of artificial light on different species of corals. The authors found that multiple lighting types, including cold and warm light-emitting diode (LED) lamps, led to loss of synchrony and spawning failure. Further, coastal maps of artificial lighting globally suggest that it threatens to interfere with coral reproduction worldwide and that the deployment of LED lights, the blue light of which penetrates deeper into the water column, is likely to make the situation even worse.

Curr. Biol. 10.1016/j.cub.2020.10.039 (2020).


Science

SpaceX launches Starlink app and provides pricing and service info to early beta testers

SpaceX has debuted an official app for its Starlink satellite broadband internet service, for both iOS and Android devices. The Starlink app lets users manage their connection, but to use it you’ll need to be part of the official beta program, and the initial public rollout of that is only just about to begin, according to emails SpaceX sent to potential beta testers this week.

The Starlink app provides guidance on how to install the Starlink receiver dish, as well as connection status (including signal quality), a device overview for seeing what’s connected to your network, and a speed test tool. It’s similar to other mobile apps for managing home wifi connections and routers. Meanwhile, the emails to potential testers that CNBC obtained detail what users can expect in terms of pricing, speeds and latency.

The initial Starlink public beta test is called the “Better than Nothing Beta Program,” SpaceX confirms in its app description, and will be rolled out across the U.S. and Canada before the end of the year – which matches up with earlier stated timelines. As per the name, SpaceX is hoping to set expectations for early customers: according to the customer emails, users can expect speeds ranging from 50Mbps to 150Mbps and latency of 20ms to 40ms, with some periods of no connectivity at all. Even with expectations set low, if those values prove accurate, it should be a big improvement for users in some hard-to-reach areas where service is currently costly, unreliable, and operating at roughly dial-up speeds.

Image Credits: SpaceX

In terms of pricing, SpaceX says in the emails that the cost for participants in this beta program will be $99 per month, plus a one-time cost of $499 for the hardware, which includes the mounting kit and receiver dish, as well as a router with wifi networking capabilities.

The eventual goal is to offer reliable, low-latency broadband with a consistent connection by handing off connectivity between a large constellation of small satellites circling the globe in low Earth orbit. SpaceX has already launched nearly 1,000 of those satellites, but it hopes to launch many thousands more before it reaches global coverage and offers general availability of its service.

SpaceX has already announced some initial commercial partnerships and pilot programs for Starlink, too, including a team-up with Microsoft to connect that company’s mobile Azure data centers, and a project with an East Texas school board to connect the local community.


Science

Erratum for the Report “Meta-analysis reveals declines in terrestrial but increases in freshwater insect abundances” by R. Van Klink, D. E. Bowler, K. B. Gongalsky, A. B. Swengel, A. Gentile, J. M. Chase

S. Rennie, J. Adamson, R. Anderson, C. Andrews, J. Bater, N. Bayfield, K. Beaton, D. Beaumont, S. Benham, V. Bowmaker, C. Britt, R. Brooker, D. Brooks, J. Brunt, G. Common, R. Cooper, S. Corbett, N. Critchley, P. Dennis, J. Dick, B. Dodd, N. Dodd, N. Donovan, J. Easter, M. Flexen, A. Gardiner, D. Hamilton, P. Hargreaves, M. Hatton-Ellis, M. Howe, J. Kahl, M. Lane, S. Langan, D. Lloyd, B. McCarney, Y. McElarney, C. McKenna, S. McMillan, F. Milne, L. Milne, M. Morecroft, M. Murphy, A. Nelson, H. Nicholson, D. Pallett, D. Parry, I. Pearce, G. Pozsgai, A. Riley, R. Rose, S. Schafer, T. Scott, L. Sherrin, C. Shortall, R. Smith, P. Smith, R. Tait, C. Taylor, M. Taylor, M. Thurlow, A. Turner, K. Tyson, H. Watson, M. Whittaker, I. Woiwod, C. Wood, UK Environmental Change Network (ECN) Moth Data: 1992-2015, NERC Environmental Information Data Centre (2018).

