Subscribe to our AI & ML newsletter to stay updated on the latest research and market news within the space. We summarize two stories every week and compile additional research articles, company data releases, as well as M&A and funding activity for a quick digest. See our archive of past issues and connect with our AI & ML team below:
Seth is an Associate at DeciBio with experience in identifying novel opportunities within ML & AI, molecular diagnostics, immuno-oncology, and research tools. He is passionate about supporting disruptive technologies that can be used in the clinic and is the curator of the Big Data & AI weekly newsletter. Connect with him on LinkedIn or email him at [email protected].
Microsoft announced a new medical imaging server solution that can accelerate the development of machine learning applications. Gregory Moore, Corporate VP of Microsoft Health Next, claims that imaging makes up 74% of all healthcare data and is often used to detect diseases and guide treatment strategies. Microsoft’s new solution includes the company’s cloud-based DICOM server, which can be used alongside Microsoft’s Azure programming interface for healthcare interoperability. This enables providers to merge clinical data with images and perform tasks that are difficult and expensive to complete with current “on-premises” systems.
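Cloud DICOM servers like this one are typically accessed over the standard DICOMweb interface. As a rough illustration (not Microsoft's documented API), the sketch below builds a QIDO-RS study search for one patient; the base URL and token are placeholders.

```python
# Hedged sketch: querying a cloud DICOM server over DICOMweb (QIDO-RS).
# The base URL and token are placeholders, not real Microsoft endpoints.
import urllib.request

BASE_URL = "https://example.dicom.azurehealthcareapis.com/v1"  # placeholder

def build_study_search(patient_id, token):
    """Build a QIDO-RS request for all studies belonging to a patient."""
    return urllib.request.Request(
        f"{BASE_URL}/studies?PatientID={patient_id}",
        headers={
            "Accept": "application/dicom+json",   # DICOM JSON per PS3.18
            "Authorization": f"Bearer {token}",
        },
    )

req = build_study_search("12345", token="<access-token>")
print(req.full_url)
# Sending it would be: urllib.request.urlopen(req)
```

The response would be a JSON array of study-level DICOM attributes, which a provider system could then join against clinical records.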
Google and Mayo Clinic have partnered for an AI-based radiation therapy initiative. Current radiation therapy planning requires physicians to distinguish tumor tissue from healthy tissue through a process called contouring, which is often done by hand on patient images, and can take up to 7 hours per patient. The proposed algorithm would help accelerate the time-consuming contouring procedure, enabling specialists to plan more treatments in less time. The AI algorithm is currently undergoing training on de-identified imaging data. The technology will start with head and neck cancer data, where contouring is a challenge due to the number of delicate structures in that region. Upon successful development, the algorithm will be used on cancers in other areas of the body as well.
Market Analysis
Research / Publication News
Researchers at Dartmouth College have developed a novel curriculum, called AI-RADS, to bolster physicians’ know-how in the ML / AI space. The course has shown early promise: residents rated it 9.8 out of 10 and showed significant gains in comprehension when reading AI articles after lectures, the team reported in Academic Radiology. The course was built around a sequence of foundational algorithms, presented as logical extensions of each other and grounded in familiar examples, such as movie recommendations. The team also incorporated secondary lessons on topics such as pixel mathematics, since most residents have little to no computational background. They further built out the program with a journal club exploring the algorithm discussed in the most recent lecture, along with study guides that helped cut through “intimidating technical descriptions.” Questionnaires administered before and after each lecture gauged its effectiveness, along with additional surveys at each journal club. Trainees “overwhelmingly” felt that the content depth was just right, with the examples proving useful in understanding key AI concepts. Given the success thus far, the team is now working to establish an online infrastructure to house the program, publish all materials, and share the educational series with those interested.
Scientists have developed a novel smartphone application that uses artificial intelligence to diagnose a stroke in less than four minutes. The tool does so by analyzing a patient’s speech patterns and facial movements and can make the determination with the accuracy of an ER doctor, researchers claim. Testing the model on patients, they found it measured up well against ER physicians, with a 93.12% sensitivity rate and 79.27% accuracy. Experts hope this intervention will help providers strike the right balance between over-using CT scans and underdiagnosing this concern. “If we can improve diagnostics at the front end, then we can better expose the right patients to the right risks and not miss patients who would potentially benefit,” John Volpi, MD, a vascular neurologist and co-director of the Eddy Scurlock Stroke Center at Houston Methodist, said in a statement. “We have great therapeutics, medicines and procedures for strokes, but we have very primitive and, frankly, inaccurate diagnostics.” Penn State and Houston Methodist are also jointly pursuing a patent for the app.
After a baby is born, doctors sometimes examine the placenta, the organ that links the mother to the baby, for features that indicate health risks in any future pregnancies. Unfortunately, this is a time-consuming process that must be performed by a specialist, so most placentas go unexamined after the birth. A team of researchers from Carnegie Mellon University (CMU) and the University of Pittsburgh Medical Center (UPMC) reported the development of a machine learning approach to examining placenta slides so that more women can be informed of health risks such as preeclampsia. Normally, classifying large images such as a slice of a placenta sample is difficult for computer vision tools. These researchers took an alternative approach: first having the computer detect individual blood vessels within the image, then analyzing each one to determine whether it should be classified as diseased or healthy. The algorithm also considered pregnancy-related features including gestational age, birth weight, and any conditions the mother might have. While this type of tool may not reach the clinic for some time, the researchers have shown how algorithms may be used in the future to speed up workflows, reduce physician burdens, and identify clinical features that may be missed in the current clinical standard of care.
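The two-stage strategy described above (detect individual vessels first, then classify each one and fold in clinical context) can be sketched in a few lines. Everything here is illustrative stub code, not the CMU/UPMC team's implementation; the function names, thresholds, and features are assumptions.

```python
# Hypothetical sketch of a two-stage slide-analysis pipeline:
# Stage 1 detects individual blood vessels, Stage 2 classifies each,
# and a final step combines vessel-level calls with clinical features.
# All names and numbers below are illustrative, not from the paper.

def detect_vessels(slide_image):
    """Stage 1: return a list of vessel patches (stub detector)."""
    # A real implementation would run a trained object detector here.
    return [patch for patch in slide_image]

def classify_vessel(patch):
    """Stage 2: probability that one vessel is diseased (stub classifier)."""
    return 0.9 if patch.get("irregular_wall") else 0.1

def assess_placenta(slide_image, clinical_features):
    scores = [classify_vessel(p) for p in detect_vessels(slide_image)]
    diseased_fraction = sum(s > 0.5 for s in scores) / max(len(scores), 1)
    # Clinical context (gestational age, birth weight, ...) adjusts the call.
    risk = diseased_fraction + 0.1 * clinical_features.get("preeclampsia_history", 0)
    return {"diseased_fraction": diseased_fraction, "risk": min(risk, 1.0)}

slide = [{"irregular_wall": True}, {"irregular_wall": False}, {"irregular_wall": True}]
print(assess_placenta(slide, {"preeclampsia_history": 1}))
```

Working at the vessel level keeps each classification input small, which is what sidesteps the difficulty computer vision tools have with very large whole-slide images.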
Google Cloud announced it was selected by the Defense Innovation Unit (DIU) to prototype an AI-enabled digital pathology solution at select DoD facilities, with the goal of improving the accuracy of diagnoses. The VA Center for Innovations in Quality, Effectiveness, and Safety estimates that 5% of outpatient diagnoses in the U.S. are made in error, and this project aims to improve the accuracy of physicians’ diagnoses and lower healthcare costs. Working with DIU, Google Cloud will prototype the delivery of an augmented reality microscope that overlays AI-based information for doctors, providing pathology-based cancer detection tools at the point of care. Google’s approach will leverage TensorFlow, an open-source machine-learning framework, as well as the Google Cloud Healthcare API for data ingestion and de-identification to maximize patient privacy. The initial rollout will take place at select Defense Health Agency treatment facilities and Veterans Affairs hospitals in the United States, with future plans to expand across the broader U.S. Military Health System.
The Department of Health and Social Care has announced £50m funding for three digital pathology and imaging artificial intelligence (AI) centers in Coventry, Leeds and London. The three centers to share the latest tranche of funding will deliver digital upgrades to pathology and imaging services across an additional 38 NHS trusts, said the department. Darren Treanor, national pathology imaging co-operative director and consultant pathologist at Leeds Teaching Hospitals NHS Trust, said of the £50m investment: “This will allow us to use digital pathology to diagnose cancer at 21 NHS trusts in the north, serving a population of six million people. We will also build a national network spanning another 25 hospitals in England, allowing doctors to get expert second opinions in rare cancers, such as childhood tumors, more rapidly.”
In a study published in JAMA Oncology, investigators from Karolinska Institutet in Sweden tested and compared the accuracy of three AI algorithms designed to identify breast cancer based on previously captured mammograms. Based on their results, the most effective algorithm correctly diagnosed the same percentage of women as the average practicing radiologist, according to study author Fredrik Strand. According to the research team’s analysis, the algorithm had 81.9% sensitivity, 96.6% specificity, and an area under the curve of 0.956 for the detection of cancers at screening or within 12 months. Strand’s team also concluded that combining the interpretation of one radiologist with the highest-performing AI algorithm produced better results than the combination of two radiologists’ image evaluations. The outcome of this study underscores the findings of another article Strand’s group published recently in The Lancet Digital Health, which showed an AI algorithm could successfully sort mammography images into groups that require further radiologist attention and those that AI can accurately assess without overlooking cancers. The team now plans to investigate how AI can improve on the current breast screening system of having two radiologists interpret a mammogram and discuss any disagreements.
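For readers less familiar with the metrics quoted in studies like this one, the toy example below shows how sensitivity, specificity, and area under the curve are computed from a set of predicted scores. The data is made up purely for illustration.

```python
# Illustrative computation of sensitivity, specificity, and AUC
# from toy mammogram scores (1 = cancer, 0 = no cancer). The numbers
# here are invented; they only show how the metrics are defined.

def sensitivity_specificity(labels, scores, threshold):
    tp = sum(1 for y, s in zip(labels, scores) if y == 1 and s >= threshold)
    fn = sum(1 for y, s in zip(labels, scores) if y == 1 and s < threshold)
    tn = sum(1 for y, s in zip(labels, scores) if y == 0 and s < threshold)
    fp = sum(1 for y, s in zip(labels, scores) if y == 0 and s >= threshold)
    return tp / (tp + fn), tn / (tn + fp)

def auc(labels, scores):
    # AUC = probability a random positive case outscores a random negative.
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.3, 0.2, 0.6, 0.1]
sens, spec = sensitivity_specificity(labels, scores, threshold=0.5)
print(sens, spec, auc(labels, scores))
```

Sensitivity and specificity depend on the chosen threshold, while AUC summarizes performance across all thresholds, which is why studies report both.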
fastMRI, a collaborative project between Facebook’s AI research team (FAIR) and radiologists at NYU Langone Health, has trained a machine learning model on pairs of low-resolution and high-resolution MRI scans. The model “predicts” what final MRI scans look like from just a quarter of the usual input data, enabling scans to be completed faster, which means less hassle for patients and quicker diagnoses. The fastMRI team has been working on this problem for years and is publishing a clinical study in the American Journal of Roentgenology, which they say proves the trustworthiness of their method. The study asked radiologists to make diagnoses based on both traditional MRI scans and AI-enhanced scans of patients’ knees, and reports that when faced with both, doctors made the exact same assessments. “The key word here on which trust can be based is interchangeability,” says Dan Sodickson, professor of radiology at NYU Langone Health. “We’re not looking at some quantitative metric based on image quality. We’re saying that radiologists make the same diagnoses. They find the same problems. They miss nothing.” The next step is getting the technology into hospitals where it can actually help patients. The fastMRI team is confident this can happen fairly quickly, perhaps in just a matter of years. The training data and model they’ve created are completely open access and can be incorporated into existing MRI scanners without new hardware.
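The core idea, reconstructing a scan from roughly a quarter of the usual k-space data, can be illustrated with the zero-filled baseline a learned model improves on: mask out most k-space columns and invert the FFT. This is a minimal numpy sketch, not the project's actual sampling pattern or network.

```python
# Minimal numpy sketch of MRI undersampling: keep a fraction of
# k-space columns and reconstruct with a zero-filled inverse FFT.
# A trained fastMRI-style model would predict the full image from
# this degraded input; the mask pattern here is illustrative only.
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((64, 64))            # stand-in for a fully sampled MR slice

kspace = np.fft.fft2(image)             # fully sampled k-space
mask = np.zeros(64, dtype=bool)
mask[:4] = mask[-4:] = True             # keep low-frequency columns
                                        # (at the edges for an unshifted FFT)
mask |= rng.random(64) < 0.15           # plus a random ~15% of columns
undersampled = kspace * mask[None, :]   # zero out the unsampled columns

zero_filled = np.abs(np.fft.ifft2(undersampled))  # aliased baseline recon
print(mask.mean())                      # fraction of k-space columns kept
```

The zero-filled reconstruction shows aliasing artifacts; the supervised model learns the mapping from such inputs back to the fully sampled image, which is what the paired low/high-resolution training data provides.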
RapidAI announced that Rapid LVO has received Food and Drug Administration (FDA) clearance for detecting suspected large vessel occlusions (LVOs). Rapid LVO helps physicians speed up triage or transfer decision-making. Working in as few as 3 minutes, it uses a vessel tracker in conjunction with an assessment of brain regions with reduced blood vessel density to identify suspected LVOs with a sensitivity of 97% and a specificity of 96%. Stroke team members are also immediately notified when a suspected LVO is detected. “LVOs are the most disabling and deadly ischemic strokes,” said Dr. Greg Albers, Professor of Neurology at Stanford University, Director of the Stanford Stroke Center and cofounder of RapidAI. “The ability to identify LVOs rapidly facilitates more effective treatment. This is why we are very excited about the FDA clearance of Rapid LVO, a significant step forward in stroke diagnostics and care.”
SOPHiA GENETICS launched a first-of-its-kind multimodal solution to predict COVID-19 disease evolution, opening new dimensions of insight into the fight against the worldwide pandemic. SOPHiA’s approach aims to support containment efforts globally by demonstrating immediate benefits for community contact tracing and essential viral monitoring research. This important analysis can support paths to new protective measures and outbreak protocols around the world. As part of the approach, SOPHiA GENETICS has built an AI-powered solution to conduct full-genome analysis of SARS-CoV-2 which can compare insights from the viral genomic data with human “host” genetic information. In addition, the new SOPHiA Radiomics for COVID-19 offers a CT-based automated workflow for whole-lung segmentation and disease quantification. Additionally, their machine learning approaches can be used to discover abnormalities predictive of disease evolution and leverage multimodal research data sets. “Controlling this virus means understanding it at new levels that go beyond simple testing. The evolution of the disease must be predicted in order to create containment measures. We can do this by building a world map of longitudinal tracking, beginning with highly accurate and reliable virus data, further powered by radiomic data,” said Jurgi Camblong, SOPHiA GENETICS’ Founder and CEO.
Sight Diagnostics, the Israel-based health-tech company behind the FDA-cleared OLO blood analyzer, announced that it has raised a $71 million Series D round. With this, the company has now raised a total of $124 million. “Historically, blood tests were done by humans observing blood under a microscope. That was the case for maybe 200 years… we just replaced the human eye behind the microscope with machine vision” says Sight CEO and co-founder Yossi Pollak. While they originally started with analyzing complete blood counts, one of the diseases they are looking to diagnose is COVID-19. “We just kind of scratched the surface of the ability of AI to help with blood diagnostics,” said Pollak. “There’s so much value around COVID in decentralizing diagnostics and blood tests. Think keeping people — COVID-negative or -positive — outside of hospitals to reduce the busyness of hospitals and reduce the risk for contamination for cancer patients and a lot of other populations that require constant complete blood counts. I think there’s a lot of potential and a lot of value that we can bring to different markets.”
Market Highlights
Research / Publication Highlights
Researchers from MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) have developed a machine learning system that can either make a prediction about a task, or defer the decision to an expert. Most importantly, it can adapt when and how often it defers to its human collaborator, based on factors such as its teammate’s availability and level of experience. The team trained the system on multiple tasks, including looking at chest X-rays to diagnose specific conditions such as atelectasis (lung collapse) and cardiomegaly (an enlarged heart). In the case of cardiomegaly, they found that their human-AI hybrid model performed 8 percent better than either could on their own (based on AU-ROC scores). Researchers have not yet tested the system with human experts, but instead developed a series of “synthetic experts” so that they could tweak parameters such as experience and availability. In future work, the team plans to test their approach with real human experts, such as radiologists for X-ray diagnosis. They will also explore how to develop systems that can learn from biased expert data, as well as systems that can work with — and defer to — several experts at once.
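The predict-or-defer behavior described above can be sketched as a simple confidence gate. This is a hypothetical illustration, not CSAIL's method: the threshold, the expert-accuracy discount, and the availability flag are all assumptions standing in for the learned deferral policy.

```python
# Hedged sketch of a predict-or-defer rule: the system answers on its
# own when confident, and hands the case to a human expert when the
# expert is available and likely to do better. Illustrative only.

def hybrid_decision(model_prob, expert_accuracy, expert_available, threshold=0.85):
    """Return ('model', prediction) or ('expert', None)."""
    confidence = max(model_prob, 1 - model_prob)
    # Defer only when the expert is available and likely more reliable.
    if expert_available and confidence < min(threshold, expert_accuracy):
        return ("expert", None)
    return ("model", int(model_prob >= 0.5))

print(hybrid_decision(0.95, expert_accuracy=0.9, expert_available=True))   # confident: model answers
print(hybrid_decision(0.55, expert_accuracy=0.9, expert_available=True))   # uncertain: defer
print(hybrid_decision(0.55, expert_accuracy=0.9, expert_available=False))  # no expert: model answers
```

In the actual research the deferral rule is learned jointly with the classifier rather than hand-set, which is what lets the system adapt to a teammate's experience and availability.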
Infervision received U.S. Food and Drug Administration (FDA) 510(k) clearance of the InferRead Lung CTAI product, which uses artificial intelligence and deep learning technology to automatically perform lung segmentation and to accurately identify and label nodules of different types. InferRead Lung CTAI supports concurrent reading and is designed to aid radiologists in pulmonary nodule detection during the review of chest computed tomography (CT) scans, increasing accuracy and efficiency. It is currently in use at over 380 hospitals and imaging centers globally; more than 55,000 cases are processed by the system daily, and over 19 million patients have already benefited from the technology. “Fast, workflow friendly, and accurate are the three key areas we have emphasized during product development. We’re excited to be able to make our InferRead Lung CTAI solution available to the North American market. Our clients tell us it has great potential to help provide improved outcomes for providers and patients alike,” said Matt Deng, Ph.D., director of Infervision North America.
Researchers at EMBL’s European Bioinformatics Institute (EMBL-EBI), the Wellcome Sanger Institute, Addenbrooke’s Hospital in Cambridge, UK, and collaborators have developed an algorithm that uses computer vision to identify genomic alterations within tissue samples from cancer patients. Normally, patient biopsies may be examined under a microscope by histopathologists and also undergo molecular testing by cancer geneticists or molecular pathologists. These researchers have shown that their algorithm can distinguish between healthy and cancerous tissues, and can also identify patterns of more than 160 DNA and thousands of RNA changes in tumors from just histology slides. A key differentiator of this study is that the researchers generalized their approach on an unprecedented scale: they trained the algorithm with more than 17,000 images from 28 cancer types collected for The Cancer Genome Atlas, and studied all known genomic alterations. “What is quite remarkable is that our algorithm can automatically link the histological appearance of almost any tumor with a very broad set of molecular characteristics, and with patient survival,” explains Moritz Gerstung, Group Leader at EMBL-EBI. Overall, their algorithm was capable of detecting patterns of 167 different mutations and thousands of gene activity changes.
Zebra Medical Vision, developer of artificial-intelligence-based software for scanning X-ray images and automatically spotting critical health issues, has received its first FDA clearance in cancer screening. The company’s AI is designed to identify mammograms suspected of breast cancer. The program is the company’s sixth FDA green light, following digital solutions using CT scans and X-rays to detect brain bleeds, pneumothorax, spinal fractures and more. Zebra Medical’s HealthMammo offering, which previously received a CE mark, aims to scan a radiologist’s entire workflow and flag the 2D mammograms with higher cancer risk for priority review. By moving the few scans with suspicious lesions to the front of the line, the company hopes to shorten the time a patient waits for their diagnosis. “With this fully commercial and regulated product, we aim to provide even more value and help patients and providers navigate the new COVID-affected reality we are all facing,” CEO Ohad Arazi said. “We’re proud of the achievements we’ve made in the past few months, providing U.S. healthcare with a growing portfolio of automatic solutions to enhance patient care, especially during these times.”
Medipath, which provides pathology services to more than 170 hospitals and clinics across France, has completed deployment of Ibex’s Galen Prostate as part of its routine clinical practice. With Ibex’s CE-marked solution, a highly accurate AI algorithm analyzes prostate biopsies and raises alerts when discrepancies with the pathologists’ initial diagnosis are identified. Alerts can include a potential missed cancer and provide a safety net that helps minimize diagnostic errors in the lab by enhancing quality control. The Galen Prostate has demonstrated outstanding results in clinical studies, including an unmatched AUC (Area Under Curve) of 0.997 for cancer detection, and has already been successful in detecting missed cancers in real time. “Medipath’s implementation of Galen Prostate represents a significant step in our global expansion and extends the scope of cooperation between our companies to routine cancer diagnosis – the core of Medipath’s clinical practice,” said Joseph Mossel, CEO and Co-Founder of Ibex. “We are happy to share the vision that AI-powered pathologists will take the center stage in the future, with AI applications becoming standard in the cancer pathway and patient care.”
Paige announced it received FDA 510(k) clearance for FullFocus, a digital pathology image viewer for primary diagnosis. The clearance allows in vitro diagnostic (IVD) use of FullFocus with the FDA-authorized Philips Ultra Fast Scanner and paves the way for IVD use of FullFocus with additional IVD Whole Slide Imaging (WSI) scanners in the future. The FullFocus viewer operates within the Paige Platform, which also offers fully managed, cost-effective storage capabilities. The combined offering can be quickly deployed in any clinical setting, requires minimal upfront costs, and allows for collaboration between geographically distributed pathologists, streamlining the path to digital pathology. The Paige Platform is also designed to deliver best-in-class computational pathology products. “The COVID-19 pandemic has made it painfully clear that pathologists need better solutions to work safely and remotely,” said Leo Grady, PhD, CEO of Paige. “Pathology labs, hospitals and biopharma companies need to serve patients and conduct research with little disruption, without having to be physically present at their labs. Receiving FDA clearance for the FullFocus viewer allows Paige to further its commitment to modernizing workflows for pathologists so that they can manage their ever-increasing workloads in an efficient, organized, collaborative and secure way and ultimately help patients get the right care at the right time.”
Paige, the startup that spun out of the Memorial Sloan Kettering Cancer Center and launched in 2018 to help advance cancer research and care by applying AI to better understand cancer pathology, announced a milestone in its growth story: it has raised a further $20 million from Goldman Sachs and Healthcare Venture Partners, closing out its Series B at $70 million. Leo Grady, Paige’s CEO, says the funding will go toward several areas: hiring; continuing to expand its partnerships with biopharmaceutical companies (deals that have not yet been made public); continuing to invest in clinical work, based on algorithms it has built and trained using more than 25 million pathology slides in MSK’s archive, plus IP related to the AI-based computational pathology that underpins Paige’s work; and expansion to the UK and Europe. “We initially invested in Paige recognizing the potential of their products to add significant value to the industry and impact the future of cancer care,” added Jeffrey C. Lightcap, senior managing director of Healthcare Venture Partners. “After seeing Paige make tremendous progress in such a short period, we added to our investment to further accelerate their growth.”
MammoScreen, an explainable and actionable artificial intelligence (AI) based software that assists radiologists in reading screening mammograms, received 510(k) clearance from the U.S. Food and Drug Administration (FDA). The clearance followed submission of results from a multi-reader, multi-case study conducted last year, whose findings revealed improvement in readers’ cancer-detection performance when paired with MammoScreen compared to radiologists alone. MammoScreen automatically detects and characterizes suspicious soft tissue lesions and calcifications in mammogram images while assessing their likelihood of malignancy. The results are presented in a summary report that characterizes the suspiciousness of each lesion on a scale of 1-10, with 1 being least likely to reveal malignancy and 10 most likely. “Receiving FDA clearance for MammoScreen is a major milestone for Therapixel,” said Pierre Fillard, Founder and Chief Scientific Officer of Paris-based Therapixel. “This is the result of our collaboration with radiologists over the last three years to turn the algorithm that won the DREAM challenge in 2017 into a powerful product that is truly meaningful to their day-to-day work.”
The two most prominent radiological societies in the U.S. are urging the federal government to proceed cautiously in its pursuit of artificial intelligence models that operate autonomously. RSNA and the American College of Radiology spelled out their concerns in a letter sent to the Food and Drug Administration on Tuesday. In it, top officials from the two groups said they believe it’s unlikely the FDA can provide assurances of such technology’s safety in imaging care absent further testing, surveillance and other methods of oversight. They recommend that the FDA wait until such AI algorithms have achieved broader penetration in the healthcare marketplace before granting future approvals. “If the goal of autonomous AI is to remove the physician from the image interpretation, then the public must be assured that the algorithm will be as safe and effective as the physicians it replaces, which includes the ability to incorporate available context and identify secondary findings that would typically be identified during physician interpretation,” wrote ACR Chair Howard Fleishon, MD, and RSNA Chair Bruce Haffty, MD. “We believe this level of safety is a long way off, and while AI is poised to assist physicians in the care of their patients, autonomously functioning AI algorithms should not be implemented at this time.”
General Electric Healthcare has partnered with University Hospitals (UH) Cleveland Medical Center to evaluate the world’s first on-device AI tool designed to identify collapsed lungs, a crucial need as intensive care units have seen an increase in patients during the ongoing COVID-19 pandemic. Katelyn Nye, general manager of Global Mobile Radiography & Artificial Intelligence at GE Healthcare, said, “Today there are a multitude of AI algorithms being developed, but very few solutions seamlessly integrate into actual clinical workflow. GEHC selected UH to be the first USA pilot site because of the extensive research relationship, progressive IT and radiology teams looking to integrate AI to improve workflow, and the center of excellence for cardiothoracic care.” The imaging system, enabled with Critical Care Suite software, puts AI on board the Optima XR240amx mobile x-ray unit. Following a chest x-ray, the system will recognize if a patient’s lung has collapsed and flag the image for immediate reading. The technology is now in daily practice and flags up to 15 collapsed lungs per day within the hospital.
Proscia, a provider of AI-enabled digital pathology solutions, is collaborating with Royal Philips to advance an open ecosystem that helps laboratories accelerate and scale digital pathology adoption. Through the collaboration, Proscia will integrate the Philips Pathology SDK to natively utilize Philips’ iSyntax image format with its Concentriq digital pathology platform and suite of AI modules, providing users of Philips’ digital pathology solutions with expanded options to meet their business and workflow needs. By incorporating the Philips medical-grade iSyntax image format into Concentriq, Philips and Proscia are providing laboratories with the freedom to choose the best components of an integrated digital pathology ecosystem and realize added value from their images. “We’re pleased to collaborate with Philips to deliver on our shared vision of an open digital pathology ecosystem,” said Coleman Stavish, Proscia’s Chief Technology Officer. “Our integrated solution is empowering a best-in-class approach to scanning, image management, and computational pathology that builds upon the industry-leading iSyntax format and provides laboratories with a future-proof approach for scaling their digital implementations.”
Imaging Artificial Intelligence (AI) provider Qure.ai announced its first US FDA 510(k) clearance for its head CT scan product ‘qER’. The US Food and Drug Administration’s decision covers four critical abnormalities identified by Qure.ai’s emergency room product. Now, the AI tool can be used to triage radiology scans with intracranial bleeds, mass effect, midline shift, and cranial fractures. Two of these capabilities – cranial fractures and midline shift – are exclusive to Qure.ai’s product. This means that the newly cleared qER suite will be able to triage nearly all critical abnormalities visible on routine head CT scans. The qER suite plugs directly into the radiology workflow and prioritizes critical cases on the worklist. This triage drastically reduces the time taken to open critical scans, so those with time-sensitive abnormalities are read and reported faster, leading to better patient outcomes. The qER product has undergone extensive validation, including a 2018 peer-reviewed publication in The Lancet, and has been actively deployed at many hospitals and teleradiology providers globally. Qure.ai’s other products include the CE-marked chest X-ray AI tool qXR and COVID-19 progression monitoring solutions for chest X-rays, both in clinical use in over 20 countries.
GE Healthcare launched an artificial intelligence-powered chest X-ray analysis suite, designed to spot and highlight eight common conditions, using algorithms built by the South Korean startup Lunit. Lunit’s Insight CXR program is designed to scan thoracic X-rays and label the probable signs of diseases such as tuberculosis and pneumonia, including that linked to COVID-19, as well as fibrosis, pneumothorax and the existence of potentially cancerous lung nodules. “The launch of our Thoracic Care Suite is a part of GE Healthcare’s larger effort to help ensure clinicians and partners on the front lines have the equipment they need to quickly diagnose and effectively treat COVID-19 patients,” GE Healthcare President and CEO Kieran Murphy said in a statement. “The pandemic has proven that data, analytics, AI and connectivity will only become more central to delivering care.” The AI overlays its results on top of the X-ray image, outlining the location of an abnormality along with a score that estimates the probability of the finding. The software also generates case reports summarizing each evaluation. Lunit’s algorithms previously received a CE Mark in November 2019, and have been used clinically in Korea, China, Thailand, Mexico and the United Arab Emirates.
RSIP Vision announced a new set of AI-based medical ultrasound modules. These advanced modules will serve as AI-based building blocks for a variety of applications, ranging from automated diagnosis, measurement, and volume estimation, to advanced procedure planning and in-op guided tracking. When integrated into third-party ultrasound carts, PACS systems, and proprietary cloud platforms, these modules will enable a new set of innovative capabilities in a wide range of domains, such as urology, OBGYN, general imaging, cardiac procedures, and more. These new building blocks will enable new applications, including biopsy guidance, automated measurements and function evaluation in cardiac ultrasound, 3D reconstruction, and patient screening. “Ultrasound is extremely user dependent and can be challenging to interpret. These innovative AI modules help medical teams make quick and accurate clinical decisions and lower the dependence on teams’ experience,” says Dr. Rabeeh Fares, radiologist, Department of Diagnostic Radiology, Sourasky Medical Center, Tel Aviv, Israel.
Market Highlights
Research / Publication Highlights
Apprentice.io, a startup developing a conversational AI and augmented reality platform for pharmaceutical, biotech, and chemical companies, announced it has raised $7.5 million. CEO and cofounder Angelo Stracquatanio says the capital will enable Apprentice to scale to accommodate customer growth attributable to the pandemic. Apprentice’s suite supports batch execution, with computer vision systems tailored to life sciences that understand how operators are interacting with equipment to provide real-time feedback. The platform allows managers to plan out batches and schedule campaigns for an entire year and to set parameters for batch runs so that every batch remains the same. Using AI and machine learning, Apprentice facilitates dynamic batch flows, data reporting, and data monitoring, and it integrates with existing enterprise systems, which the company says makes batch record processes faster. The organization, which claims Fortune 100 clients in life sciences based in the U.S., Asia, South America, and Europe, says site deployments have increased sixfold since March as a result of the pandemic. To meet this demand, the company has been shipping lab technicians rapid deployment kits preconfigured with its augmented reality and intelligent software solutions. Insight Partners led this latest investment, which brings the startup’s total raised to nearly $20 million, following an $8 million round in September 2018.
Ultromics, an Oxford, England-based artificial intelligence firm, has raised $10 million in new capital from investors including the Mayo Clinic and is eyeing further expansion into the U.S. Other existing shareholders, including Barcelona health-tech investment firm Nina Capital and Oxford Sciences Innovation, also pitched in. The firm scored its first clearance from the U.S. Food and Drug Administration last year for its EchoGo Core, which automates the analysis of heart ultrasound scans. A more advanced version of that product, the EchoGo Pro, can predict coronary artery disease from heart imaging, Ultromics said. The AI tool earned approval for use in Europe, with U.S. FDA clearance currently pending. “Mayo Clinic and Nina Capital bring a strong track record and deep expertise in the health-tech sector. They will help our push into the U.S. market and accelerate the development of new innovations and services,” Ultromics CEO Ross Upton, PhD, said in a statement. The company is also currently working with “several” providers in the U.S., including Mayo, using its AI system to analyze how COVID-19 affects the heart.
DNAnexus, which provides a data analysis and management platform for DNA sequence data, announced it has secured $100 million in a round co-led by Perceptive Advisors and Northpond Ventures. DNAnexus says the funds will advance its international growth, enabling the company to serve health care and life science organizations around the world as it grows its cloud-based clinical testing offerings. A number of DNAnexus customers — which include eight of the top 10 clinical diagnostics companies, seven of the top 10 pharmaceutical companies, and over 100 enterprises — tap the platform for machine learning applications. These perform AI-aided variant calls, which entail identifying substitutions of single nucleotides (the building blocks of DNA) and small insertions and deletions from next-generation sequencing data. Customers also rely on deep learning for pathology processing, medical record interpretation, and extracting insights from large-scale proteomics (the study of proteins) and metabolomics (the study of the products of metabolism) data. Investors in DNAnexus’ latest funding round include GV (formerly Google Ventures), Foresite Capital, TPG Capital, First Round Capital, and first-time equity investor Regeneron Pharmaceuticals. This series G round brings the company’s total raised to over $270 million, following a $68 million series F in February 2019.
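The variant calling described above — spotting single-nucleotide substitutions in sequencing reads — can be illustrated with a deliberately naive sketch. Real pipelines (and the deep-learning approaches DNAnexus customers use) are far more sophisticated; the function name, thresholds, and toy data below are hypothetical.

```python
# Naive single-nucleotide variant (SNV) calling from aligned reads:
# at each position, call a variant when a non-reference base dominates
# the pileup of observed bases with sufficient read depth.
from collections import Counter

def call_snvs(reference, pileups, min_depth=4, min_fraction=0.8):
    """pileups: dict mapping position -> list of bases observed in reads."""
    variants = []
    for pos, bases in pileups.items():
        if len(bases) < min_depth:
            continue  # too little coverage to call confidently
        allele, count = Counter(bases).most_common(1)[0]
        # report a substitution when the dominant base differs from reference
        if allele != reference[pos] and count / len(bases) >= min_fraction:
            variants.append((pos, reference[pos], allele))
    return variants

ref = "ACGTACGT"
reads_at = {2: ["G", "T", "T", "T", "T"], 5: ["C", "C", "C", "C"]}
print(call_snvs(ref, reads_at))  # → [(2, 'G', 'T')]
```

Position 2 is called because four of five reads disagree with the reference base, while position 5 matches the reference and is skipped; deep-learning callers replace these hand-set thresholds with learned models of sequencing error.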
Geisinger Health System has inked a 10-year technology agreement with Siemens Healthineers to access diagnostic imaging equipment and artificial intelligence applications. The Danville, Pennsylvania-based health system said the partnership will advance and support elements of its strategic priorities related to continually improving care for its patients, communities and the region. The medical technology company will provide Geisinger access to its latest digital health innovations, diagnostic imaging equipment and on-site staff to support improvements. Education and workflow resources will also be available, which will provide Geisinger staff with the ability to efficiently make decisions and continually optimize workflows, the companies said. “Making better health easier by bringing world-class care close to home is central to everything we do at Geisinger,” said Matthew Walsh, chief operating officer at Geisinger. “This partnership will allow us to continue to equip our facilities with the most advanced diagnostic imaging technology in the market to care for our patients.”
Lunit announced that it received CE Mark approval in Europe for its Lunit Insight MMG artificial intelligence-based software. Insight MMG uses AI to analyze mammography images and provide the location of lesions that are suspicious for breast cancer, along with an abnormality score that reflects the probability that a detected lesion is present. According to a news release, the software analyzes the images with 97% accuracy. Lunit touted a recently published study in which AI alone showed 88.8% sensitivity in breast cancer detection, while radiologists alone showed 75.3%. When radiologists were aided by AI, their sensitivity increased to 84.8%. This data came from a set of over 36,000 biopsy-proven, independent cancer-positive cases. “Among the patients suspicious of breast cancer upon screening mammography, only 29% is actually diagnosed with cancer after a biopsy,” Lunit CEO Brandon Suh said in the release. “I am delighted to introduce Lunit Insight MMG, now CE certified, to healthcare professionals and institutions across the continent of Europe.”
PathAI announced the development of an AI-based assay for quantification of the tumor microenvironment (TME) and the application of the method using data from Genentech’s IMpower150 trial. The findings of the analysis support the importance of the TME and vasculature in determining response to PD-L1 and VEGF-targeting therapies and shed light on potential mechanisms of action of combination therapy. In the study, PathAI developed a deep learning-based TME assay and applied it to 1,027 digitized H&E slide images from IMpower150 to generate a high-dimensional set of human-interpretable features (HIFs) characterizing the TME, including quantitative measurements of cancer cells, immune cells, stromal cells, and the cancer vasculature. The findings demonstrate the utility of broadly examining the TME, including measurement of both vascular and immunological components, to dissect the biological basis of drug response in clinical trials of innovative combination therapies in immuno-oncology. “We are excited about this first milestone as part of our multi-year Strategic Partnership with Genentech. The insights generated with the PathAI research platform demonstrates the potential power of digital pathology and AI technologies to advance cancer research and drug development,” said PathAI co-founder and Chief Executive Officer Andy Beck, MD, Ph.D.
Indica Labs, a provider of computational pathology software, and Octo, an information technology systems provider to the U.S. Federal Government, announced the online COVID Digital Pathology Repository (COVID-DPR), a virtual collection of high-resolution microscopic COVID-related human tissue images hosted at the National Institutes of Health. This repository will enable international collaboration by providing a centralized, cloud-based repository for sharing and annotating digital whole slide images of lung, liver, kidney and heart tissues from patients infected with COVID-19, as well as the closely related coronaviruses associated with SARS and MERS. The whole slide images, annotations and metadata in the repository will be used as a reference data set for education, research and future clinical trials aimed at limiting further infection, disease, and death. COVID-DPR is underpinned by Indica Labs’ HALO Link™ software, a collaborative image management platform designed specifically for secure sharing of digital whole slide images and data. The current initiative involves multiple institutes within NIH, and COVID-DPR will be available immediately as a shared resource for researchers at institutes around the world, with initial data sets being provided by infectious disease labs across North America, Europe, and Australia.
Mount Sinai researchers are the first in the country to use artificial intelligence (AI) combined with imaging and clinical data to analyze patients with coronavirus disease (COVID-19). They have developed a unique algorithm that can rapidly detect COVID-19 based on how lung disease looks in computed tomography (CT) scans of the chest, in combination with patient information including symptoms, age, bloodwork, and possible contact with someone infected with the virus. Their study was published in the May 19 issue of Nature Medicine, and could help hospitals across the world quickly detect the virus, isolate patients, and prevent it from spreading during this pandemic. The algorithm was shown to have statistically significantly higher sensitivity (84 percent) compared to 75 percent for radiologists evaluating the images and clinical data. The AI system also improved the detection of COVID-19-positive patients who had negative CT scans. Specifically, it recognized 68 percent of COVID-19-positive cases, whereas radiologists interpreted all of these cases as negative due to the negative CT appearance. Mount Sinai researchers are now focused on further developing the model to find clues about how well patients will do based on subtleties in their CT data and clinical information. They say this could be important to optimize treatment and improve outcomes.
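The sensitivity figures quoted above (84 percent for the model versus 75 percent for radiologists) come from a standard confusion-matrix calculation, sketched below. The counts used here are illustrative, not the study’s actual data.

```python
def sensitivity(true_positives, false_negatives):
    """Sensitivity (recall): the fraction of actually positive cases
    that the classifier correctly detects."""
    return true_positives / (true_positives + false_negatives)

# illustrative counts: out of 100 COVID-19-positive patients,
# a model flagging 84 and missing 16 has 84% sensitivity
print(round(sensitivity(84, 16), 2))  # → 0.84
print(round(sensitivity(75, 25), 2))  # → 0.75
```

Sensitivity is the right metric for a screening tool like this one, since missing an infected patient (a false negative) is the costly error the study set out to reduce.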
To aid COVID-19 research efforts, NVIDIA has developed new artificial intelligence models, genomic sequencing software, and speech recognition technologies for the medical community. The new capabilities are a major expansion of the NVIDIA Clara healthcare platform, and will help healthcare researchers, technology solutions providers, and hospitals combat the pandemic faster. NVIDIA released a set of AI models that can help researchers detect and study infected patients through chest CT scan data. NVIDIA is also leading a multinational COVID-19 federated learning initiative in partnership with Mass General Brigham. The initiative aims to extend COVID-19 AI models to X-ray imaging, enabling each site to adapt the models locally without sharing patient data and thereby protecting patient privacy. In addition to the AI models, NVIDIA is also making computational genomics software freely available to COVID-19 researchers. The software, called NVIDIA Clara Parabricks, achieved a new speed record by analyzing the whole human genome DNA sequence in under 20 minutes. NVIDIA has also introduced GPU-accelerated RNA-sequencing pipelines that return results in less than two hours, providing researchers with critical insights about patients’ susceptibility to disease, disease progression, and response to treatment. NVIDIA has also launched NVIDIA Clara Guardian, a platform that uses intelligent video analytics and automatic speech recognition technologies to help a new generation of smart hospitals perform vital sign monitoring while limiting staff exposure. With these new capabilities, researchers, developers, and hospitals across the healthcare continuum can further combat, track, and treat COVID-19.
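The federated learning idea behind the initiative above — hospitals train locally and only model weights, never patient data, leave each site — can be sketched with a toy version of federated averaging (FedAvg). This is a minimal illustration, not NVIDIA Clara’s actual implementation; the function and data are hypothetical.

```python
# Toy federated averaging: combine per-site model weights into a global
# model, weighting each site's contribution by its local dataset size.
# Only the weight vectors cross the network; raw patient records stay local.

def federated_average(site_updates):
    """site_updates: list of (weights, n_examples) pairs, one per hospital."""
    total = sum(n for _, n in site_updates)
    dim = len(site_updates[0][0])
    global_weights = [0.0] * dim
    for weights, n in site_updates:
        for i, w in enumerate(weights):
            global_weights[i] += w * n / total  # data-size-weighted mean
    return global_weights

# two hypothetical sites with different amounts of local training data
print(federated_average([([1.0, 2.0], 100), ([3.0, 4.0], 300)]))  # → [2.5, 3.5]
```

In a full system this averaging step repeats over many rounds, with the aggregated global model pushed back to each site for further local training, which is what allows the "local adaptation" the initiative describes.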
Zebra Medical Vision, the deep-learning medical imaging analytics company, announced its fifth FDA 510(k) clearance for its Vertebral Compression Fractures (VCF) product. The solution automatically identifies findings suggestive of compression fractures, enabling clinicians to place patients who are at risk of osteoporosis in treatment pathways to prevent potentially life-changing fractures. The VCF product expands the company’s growing AI1™ (all-in-one) bundle of FDA-cleared AI solutions, which has now received a fourth US patent in its bone health series. Nearly half of all women and a quarter of men over the age of 50 will suffer an osteoporotic fracture in their lifetime. According to the National Osteoporosis Foundation (NOF), the cost of osteoporosis-related fragility fractures to the U.S. is estimated to be $52 billion annually. Osteoporosis, also referred to as “The Silent Killer,” is the most common preventable cause of fractures, causing more than 2 million cases of broken bones in the U.S. alone every year. “Zebra-Med’s latest AI solution will be advantageous to value-based healthcare systems. With appropriate intervention, it has potential to reduce secondary fractures in high risk patients and reduce additional related costs, while enabling providers to benefit from higher Medicare star ratings and increased revenue from RAF score adjustments,” says Dr. Keith White, Medical Director of Imaging Services, Intermountain Health.