{"id":6319,"date":"2025-05-12T10:30:00","date_gmt":"2025-05-12T10:30:00","guid":{"rendered":"https:\/\/www.despatch.com\/blog\/?p=6319"},"modified":"2025-10-26T18:31:44","modified_gmt":"2025-10-26T18:31:44","slug":"philips-and-nvidia-team-up-to-revolutionize-mri-with-ai","status":"publish","type":"post","link":"https:\/\/www.despatch.com\/blog\/philips-and-nvidia-team-up-to-revolutionize-mri-with-ai\/","title":{"rendered":"Philips and NVIDIA Team Up to Revolutionize MRI with AI"},"content":{"rendered":"\n<p>MRI (Magnetic Resonance Imaging) has been a real game changer in the medical field, providing doctors with a clear picture of the soft tissues inside our bodies. Still, AI has the potential to significantly improve the functionality of MRI machines, and now, Philips and NVIDIA are taking that transformation to the next level.<\/p>\n\n\n\n<p>At this year\u2019s ISMRM conference, Philips announced a groundbreaking collaboration with NVIDIA to develop a foundational AI model specifically for MRI. The goal? Faster scans, clearer images, and a smoother workflow from start to finish.<\/p>\n\n\n\n<p>And it\u2019s not just theory. Philips already uses AI to cut scan times and enhance diagnostics. But this new partnership aims to go deeper, building a large-scale, deep-learning model trained on vast amounts of MRI data. The model will serve as the basis for a new generation of intelligent applications that could redefine how radiologists work and how patients experience imaging.<\/p>\n\n\n\n<p>\u201cBy partnering with NVIDIA to build an MR Foundational Model, we\u2019re pioneering a new frontier for medical imaging, one that has the potential to transform the role of MR in the diagnosis and treatment of a wide range of diseases,\u201d said Dr. Ioannis Panagiotelis, Business Leader of MRI at Philips.<\/p>\n\n\n\n<p>But what\u2019s unique about Philips and NVIDIA\u2019s approach? For one, the foundational model will support zero-click scan planning. 
No need to manually set parameters for different body parts \u2014 the system will handle it, speeding up the entire process. It will also offer interactive enhancements like denoising, sharpening, and super-resolution \u2014 tools that can help radiologists spot subtle changes with greater confidence.<\/p>\n\n\n\n<p>Radiologists will even be able to preview and adjust image quality and scan speed before the scan starts. In other words, less guesswork, more control, and potentially more accurate outcomes.<\/p>\n\n\n\n<p>But it doesn\u2019t stop there. The model will support automated interpretation of scan findings, making diagnostics faster and possibly more consistent. The long-term goal is clear: better care for more people, without overburdening healthcare providers.<\/p>\n\n\n\n<p>Under the hood, Philips will build on NVIDIA\u2019s VISTA-3D \u2014 a foundation model built for 3D imaging \u2014 and MAISI, a next-gen tool for generating synthetic images with or without anatomical labels. Combined, these platforms will help deliver a highly specialized AI solution tailored to the complexities of MR imaging.<\/p>\n\n\n\n<p>If successful, the Philips\u2013NVIDIA collaboration could change not just how MRIs are read, but how they\u2019re done. A bold step forward in patient-centered, AI-driven healthcare.<\/p>\n\n\n\n<p>Article &amp; image source by <a href=\"https:\/\/www.philips.com\/a-w\/about\/news\/media-library\/20250226-dual-ai-engines-in-smartspeed-precise.html?src=search\" target=\"_blank\" rel=\"noreferrer noopener\">Philips<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>MRI (Magnetic Resonance Imaging) has been a real game changer in the medical field, providing doctors with a clear picture of the soft tissues inside our bodies. Still, AI has the potential to significantly improve the functionality of MRI machines, and now, Philips and NVIDIA are taking that transformation to the next level. 
At this [&hellip;]<\/p>\n","protected":false},"author":13,"featured_media":6326,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[58,56],"tags":[1459,2079,2080],"acf":[],"_links":{"self":[{"href":"https:\/\/www.despatch.com\/blog\/wp-json\/wp\/v2\/posts\/6319"}],"collection":[{"href":"https:\/\/www.despatch.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.despatch.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.despatch.com\/blog\/wp-json\/wp\/v2\/users\/13"}],"replies":[{"embeddable":true,"href":"https:\/\/www.despatch.com\/blog\/wp-json\/wp\/v2\/comments?post=6319"}],"version-history":[{"count":1,"href":"https:\/\/www.despatch.com\/blog\/wp-json\/wp\/v2\/posts\/6319\/revisions"}],"predecessor-version":[{"id":6322,"href":"https:\/\/www.despatch.com\/blog\/wp-json\/wp\/v2\/posts\/6319\/revisions\/6322"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.despatch.com\/blog\/wp-json\/wp\/v2\/media\/6326"}],"wp:attachment":[{"href":"https:\/\/www.despatch.com\/blog\/wp-json\/wp\/v2\/media?parent=6319"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.despatch.com\/blog\/wp-json\/wp\/v2\/categories?post=6319"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.despatch.com\/blog\/wp-json\/wp\/v2\/tags?post=6319"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}