Taking Spatial Toward the Clinic: DeciBio’s Q&A with Sandy Au of NeoGenomics

October 25, 2022

I recently had the opportunity to sit down with Sandy Au, Director of Multiplexing Operations at NeoGenomics. NeoGenomics is a household name in contract research and consistently sits at the top of the service provider field in terms of research output at major oncology conferences. NeoGenomics offers a multitude of cutting-edge spatial biology technologies, including its proprietary MultiOmyx platform for high-plex mIF. Special thanks to Richard Hughes, Director, Strategic Marketing, Pharma Services, for connecting us and for offering some insights as well. In this interview, we cover the spatial biology market landscape, including top pharma interests, as well as what NeoGenomics brings to the table to support these technologies. Spatial biology is a market that we cover extensively via custom research projects, data products like BioTrack, and market reports. This discussion is in no way sponsored by NeoGenomics; we are always open to speaking with KOLs to understand the innovation going on in our industry, so if that sounds like you, please feel free to reach out.

Sandy, thanks so much for joining us today. Let’s start by giving our readers a brief background on who you are and your role at NeoGenomics.

Glad to be here, thank you for the opportunity. I'm Sandy, or Qingyan Au, and I'm currently Director of Multiplexing Operations at NeoGenomics. My role is to provide scientific and operational oversight for our multiplexing modality here. That includes the MultiOmyx platform, which is proprietary to NeoGenomics, the PhenoImager platform, and the RareCyte circulating tumor cell platform, as well as others. I also oversee the image analysis for our mIF assays. As for my background, I started with high-content screening in drug discovery, then moved on to gain expertise in FFPE tissue multiplex immunofluorescence technology. I continued my growth in that area by joining NeoGenomics, and so far it has been almost 10 years at NeoGenomics.

NeoGenomics has a few spatial products in its tool belt. There's MultiOmyx, there's the PhenoImager platform, and then RareCyte as well. We'll get into those in more depth later on, but to start, could you give me a quick overview of how each of those platforms is used?

The technology we offer here for spatial biology is very comprehensive. For tissue-based multiplex immunofluorescence, we have higher-order technology, proprietary to NeoGenomics and known as MultiOmyx. This platform was one of the first high-plex assays, as shown in a 2013 PNAS paper, which demonstrated the ability to characterize 61 protein markers from a single FFPE slide. We regularly perform 12 to 20-plex assays on this platform. We also recently onboarded a lower-order multiplexing technology, the PhenoImager platform from Akoya Biosciences, and use it for up to six biomarkers from a single FFPE slide. Along with our existing duplex IHC capabilities, we have all these different options for tissue-based spatial biology. One additional thing worth pointing out is that, for all the technology here at Neo, we provide an end-to-end solution. Not only do we perform assay development and wet lab staining, but we also handle image analysis. We have a proprietary deep learning-based image analysis pipeline called NeoLYTX that is optimized for the MultiOmyx platform, and we have also qualified enterprise image analysis platforms such as Halo from Indica Labs and Visiopharm. We also have a very long-standing partnership with NanoString, and we have a NanoString DSP platform.

I’d like to jump into our market-level questions now. My first question for you is, what plex preferences do you currently see across the research and development stages? How has that changed from two years ago?

MultiOmyx has been proven for 61 biomarkers, but most of the custom assays that we offer are around 12 to 25-plex. In the past couple of years there has been a trend toward higher-plex panels. Before, most assays were less than 20-plex, oftentimes 12 to 16. Now we see larger panels more often. With a larger panel, different immune populations can be included in the same assay, enabling characterization of all the different populations from a single slide. Researchers are getting used to higher-plex proteomics with the new higher-plex instruments that are now available, such as the PhenoCycler-Fusion system or the Lunaphore COMET system; these allow users to explore and get hands-on experience performing high-plex assays.

In general, the past five years have seen a transition from using multiplex immunofluorescence to support exploratory translational research to supporting clinical trial studies through retrospective testing. To support this shift, we have implemented a lot of changes. For example, we implemented a robust end-to-end QC matrix. Before we start to perform the multiplexing, all the samples are QC-checked by the Neo medical team. The pathologists also annotate every single sample going through multiplexing. We perform 100% QC during the post-imaging process as well. All these efforts ensure high data quality. Over the past few years, we have also completed multiple fit-for-purpose validation studies for our multiplexing panels to demonstrate reliable and robust performance of the assays to support clinical use. Just earlier this year, we successfully completed a validation study of a 30-plex assay. All of these practices have helped us build a robust process to perform clinical testing. We’ve also deployed a deep learning-based algorithm for MultiOmyx image analysis, including nucleus segmentation and biomarker classification.

As you mentioned, there’s a trend we’ve been following from exploratory analyses toward clinical trial support. Which therapeutic area is driving this? Of course, we’re aware that immuno-oncology and immune cell differentiation is something that drives a lot of multiplex utilization. Is that still the case in your experience?

Our main focus is still to support the immuno-oncology space, and oncology more broadly. Most of our studies are in immuno-oncology. Like you mentioned, for immunophenotyping and co-expression in the tumor microenvironment, multiplexing has a real advantage, and is sometimes a must for certain populations. You can also study certain morphological features, as in a paper published last year which looked at tertiary lymphoid structures. I think that’s also the sweet spot for demonstrating the capability of multiplexing.

I'd love to hear your perspective on how the research and technology needs vary between your academic customers and your biopharma customers. What are the key differences that you see between research contexts in terms of how these technologies are being used?

Based on what we have seen, when we work with academic collaborators, it’s typically more discovery-based, meaning we’re trying to understand the mechanism, so we tend to have a larger panel. Pharma clients, on the other hand, often come with a very specific interest. Plus, turnaround time is also critical for clinical testing. This means we generally use smaller panels for pharma clients than for academic clients.

When pharma comes to you, do they usually have a specific panel in mind, or is part of the process helping them discover what panel they need in order to answer their questions?

The interesting thing about immuno-oncology is that it's very hard to use a standard panel for everything. Each pharma client has their own targets of interest. At Neo, we support a custom approach, and most of our work uses custom panels. We already have comprehensive panel offerings, especially in the immuno-oncology space, and these panels can serve as a starting point for sponsors. In addition, we can leverage our internal technical experience and work very collaboratively with our clients to create a panel for their target population.

We're very interested in the path to a multiplex CDx. Perhaps in the next five years we see one with routine clinical utility, marking a transition from the current clinical research stage to routine clinical utilization of spatial biology. Do you believe that one spatial analysis has more clinical potential than others? Do you think it will be colocalization in relation to a bispecific drug, the presence of tertiary lymphoid structures, the colocalization of a drug that may be implicated in the mechanism of a particular therapeutic, or something else?

First, I want to say that I think multiplexing absolutely has clinical potential. As we’ve already touched on, multiplexing can provide very rich information. We’ve also started to see a growing volume of retrospective testing data as more and more researchers understand the applications of multiplexing. We’ve also seen interest in using multiplexing to support late-stage development work, but there are a few things that need to unfold for us to get there. One is exactly what you point out: we have to know exactly which markers to use, and the best use case for multiplexing technology to support prospective testing. What are the predictive markers? That's really where many researchers want to get to. Another obstacle is making sense of the rich data that multiplex analyses produce. What exactly is the output? Is it a yes or no, a density, or an intensity? There are a lot of things to figure out. I’m not exactly sure which analysis will prevail, but these are some of the questions we need to answer to advance prospective clinical use of spatial.
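To make the question of readouts concrete, here is a minimal, hypothetical sketch of how a per-cell multiplex result table could be reduced to the three kinds of output Sandy mentions: an intensity, a density, or a binary call. The column names, thresholds, and cutoffs are illustrative assumptions for the example, not NeoGenomics' actual data schema or decision rules.

```python
# Hypothetical illustration: turning per-cell multiplex data into candidate readouts.
# Column names and thresholds are assumptions, not an actual reporting schema.
import pandas as pd

# Example per-cell table: one row per segmented cell, with mean marker intensities.
cells = pd.DataFrame({
    "cell_id": range(6),
    "cd8_intensity": [12.0, 85.0, 4.0, 90.0, 3.0, 70.0],
    "pd1_intensity": [5.0, 60.0, 2.0, 8.0, 1.0, 55.0],
})
tissue_area_mm2 = 0.5  # annotated region area (assumed)

# 1) Intensity readout: mean marker intensity across all cells.
mean_cd8 = cells["cd8_intensity"].mean()

# 2) Density readout: marker-positive cells per mm^2, using an intensity cutoff.
cd8_pos = cells["cd8_intensity"] > 50  # illustrative positivity threshold
cd8_density = cd8_pos.sum() / tissue_area_mm2

# 3) Binary readout: co-expression above a prevalence cutoff becomes a yes/no call.
cd8_pd1_pos = cd8_pos & (cells["pd1_intensity"] > 40)
fraction_double_pos = cd8_pd1_pos.mean()
call = "positive" if fraction_double_pos >= 0.2 else "negative"

print(f"mean CD8 intensity: {mean_cd8:.1f}")
print(f"CD8+ density: {cd8_density:.1f} cells/mm^2")
print(f"CD8+PD-1+ fraction: {fraction_double_pos:.2f} -> call: {call}")
```

Each of these reductions throws away different information, which is exactly why the field has not yet settled on a standard clinical output.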

That is a nice transition for us to talk about image analysis and your perspective on AI, ML, and deep learning. Do you believe that a clinical multiplex diagnostic will require some sort of AI-based analysis to boil the output down to a clinically actionable output?

From our own experience, I do think deep learning and AI-based analysis bring improved accuracy of quantification. This is why we decided to implement the deep learning-based NeoLYTX analytics workflow in 2017. In a sample, there are all these different cell populations present, and it’s very challenging just to segment the nuclei accurately as the first step. We see an improvement in nucleus segmentation using AI and deep learning-based methods. Multiplexing also often uses fluorescence-based methods, and this creates some special challenges because of the autofluorescence inherent to FFPE tissues. With a deep learning algorithm, we can classify biomarkers based on more than just intensity. We can consider morphology and staining patterns, and this yields improved classification results. As I mentioned, we perform 100% QC on our results, and we see improvements from using AI.
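As a point of contrast with the deep learning approach Sandy describes, below is a minimal, classical sketch of intensity-only per-cell quantification, assuming a DAPI channel for nuclei and a single marker channel. It is purely illustrative and is not the NeoLYTX pipeline; it shows the kind of intensity-threshold baseline that morphology-aware, deep learning-based classification improves on.

```python
# Minimal, classical (non-deep-learning) sketch of per-cell marker quantification.
# Illustration only; threshold values and function names are assumptions.
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import threshold_otsu
from skimage.feature import peak_local_max
from skimage.segmentation import watershed
from skimage.measure import regionprops

def segment_nuclei(dapi: np.ndarray) -> np.ndarray:
    """Label nuclei in a DAPI channel via Otsu thresholding plus watershed splitting."""
    mask = dapi > threshold_otsu(dapi)
    distance = ndi.distance_transform_edt(mask)
    peaks = peak_local_max(distance, min_distance=5, labels=mask)
    markers = np.zeros(dapi.shape, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    return watershed(-distance, markers, mask=mask)

def per_cell_intensity(labels: np.ndarray, marker: np.ndarray) -> dict:
    """Mean marker intensity inside each labeled nucleus."""
    return {r.label: float(marker[labels == r.label].mean()) for r in regionprops(labels)}

# Toy usage with smoothed random data standing in for DAPI and a marker channel.
rng = np.random.default_rng(0)
dapi = ndi.gaussian_filter(rng.random((128, 128)), sigma=3)
marker = ndi.gaussian_filter(rng.random((128, 128)), sigma=3)
labels = segment_nuclei(dapi)
intensities = per_cell_intensity(labels, marker)
positive = {c for c, v in intensities.items() if v > 0.5}  # intensity-only positivity call
print(f"{labels.max()} nuclei segmented, {len(positive)} flagged positive by intensity alone")
```

An intensity-only rule like the last line is exactly where FFPE autofluorescence causes trouble; incorporating morphology and staining pattern, as a trained classifier can, is what reduces those false calls.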

Before we shift topics, I'm curious what the top pain points are that you experience when you're working with a new multiplex assay.

Many custom panels start with qualifying a new antibody. Some of the newer targets, especially, are simply restricted by the availability of antibodies that work in FFPE tissue. Like I said, we work very collaboratively with the sponsor, and sometimes we might direct them toward a different approach based on the targets and antibody availability. We also have to consider the availability of tissues for development use, especially for rare indications. We need to consider all of these at the beginning of the development work, before we start a project.

As we think about clinically scaling, what are the barriers that exist today to these technologies being used in routine clinical care?

For multiplexing as prospective testing, the first step is knowing exactly which predictive or prognostic biomarkers to use. Right now, researchers are still going through retrospective data trying to identify them. The use case and the biomarkers need to be identified. The second piece is standardization. Multiplexing is a multi-step process: there is assay development, there’s staining, there’s image analysis. Standardization, and how to make sense of the data, is critical. We are starting to have guidelines on this, but the nature of the assay is complex. There is a need to think about best practice at each step, and to standardize those practices. These things are critical to push forward for late-stage development. It's moving in the right direction, but we’re not there yet.

When you think about both clinical research and routine clinical testing, there's a lot of biomarker testing that's going on, and in many cases, the samples are also being considered for genomic testing, whole exome sequencing for example. Are you having any issues getting the samples you need due to increased competition from all the other biomarker testing programs taking place?

The key question really is, what do you want to get from this assay? Oftentimes we ask this question before we start a project. This helps us guide our clients in deciding which assay to go with. At NeoGenomics, we are a single stop for all testing, multiplex IF included. There are IHC, flow cytometry, and a lot of molecular assay offerings as well. Oftentimes we make our recommendation based on the client’s desired output, their main interest, the biologic question they want to address through the assay, and what the intended use of the assay is. This information helps design the best path from day one. I don't think it’s really conflicting or competing, because the best assay is the assay that gives you the output you are looking for.

So, if your client wants to do multiplex tissue analysis and genomic testing, you're able to help coordinate the best use of that tissue so that they're able to get all the answers that they want?

Yes, absolutely.

[Richard Hughes] I’d like to add a quick perspective as an adjunct to the previous point. We've seen this trajectory with NGS, as a sort of neighboring modality. NGS has been held up as having high growth potential, and we've seen the advent of liquid biopsy on top of the traditional approach with tissue. But NGS has had to face the same headwind of standardization. So, I’m going beyond the technical aspects and considerations laid out earlier, downstream to the more commercial end of where this needs to get to. We're not there yet with mIF being industry-wide, but it's on that trajectory. An important consideration will be having a robust reimbursement infrastructure for mIF. That's going to help drive adoption, and it will also help overcome some of the objections that mIF might face downstream with clinical adoption. And clinical adoption, as we know, can take time. As multiple stakeholders, we are all trying to understand the underlying disease biology, and in our case the cancer biology, at a greater level of detail. When we support the sponsor with this endeavor, they can advance their drug. And what I'm seeing is that drugs, particularly in oncology, are becoming more niche. What I mean by that is that they may be targeting individual mutations or individual steps of a mechanism of action (MOA). The spatial context provides that extra level of information, at single-cell or sub-cellular resolution, generating more insights from less sample if you're dealing with a scarcity of tissue.

The commercial element is one that we certainly think a lot about, and I'm glad that you drew those parallels with the commercial expansion of next-generation sequencing. We look at where this testing could take place, and there are a lot of indicators that specialty reference labs such as NeoGenomics might play a very important role by offering a standardized setting for these tests, much as NGS requires a specialty set of handling skills and bioinformatics. It seems that there's a very similar path set forth for the scaling of spatial biology in the US market, and potentially overseas as well.

I’m interested to learn about the history of the platforms that you're offering, and how you feel the offerings are differentiated. The one that we all hear about in abstracts and in the press is MultiOmyx. How has it changed in NeoGenomics’ hands since the acquisition, and how do you feel it's differentiated from other mIF technologies, like the ones offered by Akoya and Ultivue?

The first MultiOmyx LDT was launched in 2013. Over time, we established a high-throughput testing workflow compliant with CAP/CLIA guidance. We’ve already touched on a lot of what we have implemented. It is an evolving process as we transition to support clinical trials and higher-throughput testing, and as we improve our algorithms. We’ve also launched a tool called NeoVUE, which allows the sponsor to explore the data package from MultiOmyx. This tool lets clients see the high-resolution images along with the data in one click. These are the improvements we have made. In terms of how we operate, we provide end-to-end solutions: from sample receipt to data package, we have very strong medical and scientific support teams for every step along the way.

Lastly, we’re curious if there's anything you can share about what to expect for this year.  Are there any technical advances that you're working on currently, or any other fun announcements that you can share?

Absolutely. We have a very comprehensive I/O menu, and each year we work hard to keep up with the literature and expand our panel offerings. For instance, we noticed that natural killer cells are becoming a major interest, and we plan to build a panel around NK cells. Additionally, we see growing interest in what we call ‘integrated’ assays. The integrated assays allow detection of mRNA targets in addition to protein detection by mIF. This is something we have worked on, with R&D dating back to 2016, as demonstrated in an AACR poster, and we are working on adding more integrated panels to our menu. This is a great tool to support research such as CAR T or CAR macrophage studies where no commercial antibodies are available; mRNA detection by RNAscope to identify target cells can be a very good surrogate in this case. In addition, what we’ve worked hard on over the past two years is ensuring we have a path for prospective testing. We recently onboarded a PhenoImager system, and we are one of the preferred CROs for Akoya. We have also completed GxP validation of the Indica Labs Halo image analysis platform for PhenoImager image analysis. That's a big milestone we achieved last year.

So, the onboarding of the PhenoImager platform was motivated by client interest, and it's something you feel you need in order to address the needs of those clients that are particularly interested in going toward the clinic. Is that correct?

Yes, because when we go toward clinical diagnosis and commercialization, we need to consider a few other factors, for instance turnaround time. We want to make sure that we have the offering to support those needs.

Great, thank you so much, Sandy and Richard, for being here with us today, and for sharing all of your wonderful insights. This was extremely informative and interesting.
