Jeff Elton (00:02):
Welcome to the ConcertAI Podcast. This is Jeff Elton, CEO of ConcertAI. Today, I have the great pleasure of welcoming Dan McSweeney, President of TeraRecon, ConcertAI’s digital imaging solution.
TeraRecon is the leading solution for three-dimensional radiological interpretation, and actually has taken an exceptionally novel approach of integrating AI directly into the care delivery models.
So Dan McSweeney, welcome to the podcast, and thanks for spending some time this morning.
Dan McSweeney (00:38):
Thanks for having me.
Jeff Elton (00:39):
First, I’d love to hear a little bit more about yourself. You have a really interesting background. You’ve been at a couple of different health technology and health informatics companies, and you’ve played in a few different parts of the technology spectrum in the area. You’ve had a life sciences-centric orientation at times, but a very strong provider-centric one. What brought you into the field? What keeps you close to healthcare and health tech?
Dan McSweeney (01:04):
Yeah, I love talking about that actually, so thanks for the question. My family has been in healthcare for a long time. So my father’s a retired radiologist, my grandfather is a physician. I have a lot of physicians in the family, so I was exposed to it pretty early on. At the barbecues and at the gatherings, there was always a lot of medical talk.
Little fun fact, actually, my dad was a med student in New York, and I don’t remember, but they have pictures of me playing on the tractors when they were building Memorial Sloan Kettering Hospital back in the late ’60s. So, I go back a long way with my exposure to healthcare.
Jeff Elton (01:35):
Wow, you even have that oncology overlap too.
Dan McSweeney (01:37):
Exactly.
Jeff Elton (01:39):
If I kind of think about it, obviously you’re surrounded by healthcare, you’re surrounded by people that had a passion around healthcare. What about for you personally, what else kind of compels you, motivates you, brings you to the field?
Dan McSweeney (01:51):
Yeah, it’s really so impactful what we do, it’s so energizing. We’re at sort of this intersection of super smart people doing things that are super relevant and super impactful. So I just think for a lot of us that are in healthcare, it’s sort of inside us, it’s what puts a little bit of spring in our step. Just when you think about not just what we’re doing today, but where this is all going 10 years from now, 30 years from now, it’s so exciting to be a part of that. So, it just keeps us energized.
Jeff Elton (02:22):
Super. I mean, share the same thing. So you’re at the helm now of a company, TeraRecon, and you’re in a very unique area. You’re in an area where there’s lots of data, radiologically acquired images, particularly three-dimensional ones, application of AI, machine learning, and other quantification approaches. In fact, arguably there may be more accepted utility of AI in radiology-related fields than in almost any other part of healthcare.
I’d like to kind of spend a little bit more time on this topic, because I think it’s one that actually comes with great promise. There’s arguments about what’s the appropriate integration points of AI in that decision, workflows, et cetera, so I think there’s some real meaty areas we could kind of go into.
Given TeraRecon’s position, I think it’s got a unique role in that. But maybe can you just give us a little view of what is TeraRecon, and where is it in the whole flow of the workflow of healthcare and healthcare diagnosis, treatment selection, et cetera?
Dan McSweeney (03:29):
Yeah, we think that we are very, very uniquely positioned at TeraRecon for a couple of reasons that we’ll talk through. One is just the experience that we have in the marketplace: we’re in over 1,000 facilities globally, and we’ve been doing this for 20 years.
Jeff Elton (03:42):
1,000, that’s a lot of facilities.
Dan McSweeney (03:44):
It is, and growing, by the way. We’ve been doing this a long time, we’ve got great brand equity, significant access, which is really one of the most important things. Significant access to the workflow to understand truly where the challenges are, where the variation is.
Ultimately, what we’re all trying to do in this space is really a couple of things. One is to reduce that variation in clinical outcomes, which is a big problem. The other is to increase productivity. The only way, at least that I know of, that that’s going to get done is if we’re deeply embedded into the workflow.
So at TeraRecon, we also feel that a big differentiator for us is our neutrality. When you think about a lot of the other players in the space, especially on the AI front, they’re tied to some other system. They’re tethered either to a modality or a PACS or some other device. We, as a truly standalone, vendor-neutral player in the workflow, can integrate with all of those.
You don’t really have homogeneous environments in provider spaces, so people have different modalities, different IT systems and different imaging solutions. So, the ability to integrate into those deep into the workflow regardless of vendor profile we think is a big differentiator for us.
Jeff Elton (04:54):
When you talk about a PACS system, oftentimes it’s an archival system for storing the images that come off an instrument. When you talk about modalities, those are the different instrument systems for acquiring the image, whether it’s MRI or CT, and different manufacturers offer those systems.
When you talk about that, that seems to be quite unique, because you’ve got some pretty big companies that are some of the largest companies in the world that are offering part of that infrastructure. Say a little bit more about the neutrality. If you think about that variability and consistency, how does that kind of come together? Because that’s a pretty unique positioning.
Dan McSweeney (05:37):
It is, and I’ve worked for some of these companies, so I have a little bit of insight into how they operate. When you think about one of the missions of some of these bigger companies, and it’s a very fine mission, it’s that they want to make their stuff work with their stuff, right? But the other side of that equation is that their stuff doesn’t work with other people’s stuff.
As I said, if you’re in a homogeneous environment in the provider space, where the device capturing the imaging study, the archival imaging system it’s sent to, and the interpretation system are all from the same manufacturer, then there’s no problem, but that is absolutely not the case out there.
So, we have access and the desire to actually integrate with other manufacturers’ modalities, other manufacturers’ separate disparate IT systems, whether it’s driven by AI, machine learning or not, and other manufacturers’ imaging systems. So, it’s pretty unique out there in the marketplace.
Whereas larger companies are more focused on larger pieces of equipment, we’re focused on delivering the right data to the right person in the right format in the workflow. We have no other motivation, such as selling a larger system or having some other type of relationship with the customer. So, we feel that that is a unique position for us to be in in the market that [inaudible 00:06:52]
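To make the vendor-neutrality point concrete, below is a minimal sketch, assuming the open-source pydicom library and a hypothetical directory of files. Because the DICOM standard carries fields like Modality and Manufacturer in every study, a vendor-neutral tool can read the same metadata regardless of which manufacturer’s scanner produced it; this is purely an illustration, not TeraRecon’s implementation.

```python
# Minimal sketch of vendor-neutral DICOM handling (illustrative only; not TeraRecon code).
from pathlib import Path

import pydicom  # open-source DICOM library


def summarize_study(dicom_dir: str) -> None:
    """Print modality, manufacturer, and series description for each DICOM file,
    regardless of which vendor's scanner or archive produced it."""
    for path in sorted(Path(dicom_dir).glob("*.dcm")):
        ds = pydicom.dcmread(path, stop_before_pixels=True)  # read header only
        print(
            f"{path.name}: "
            f"modality={ds.get('Modality', 'UNKNOWN')}, "
            f"manufacturer={ds.get('Manufacturer', 'UNKNOWN')}, "
            f"series={ds.get('SeriesDescription', 'n/a')}"
        )


# Hypothetical usage:
# summarize_study("/data/incoming/ct_study_001")
```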
Jeff Elton (06:52):
Yeah, really interesting. If I go back to your language, you talked about variability. At least from what I know about healthcare quality and trying to achieve outcomes, reducing variability and moving to a consistency of process and implementation of evidence, whether that evidence is literature-based or institutionally-based, is all part of quality and all part of contributing to outcomes. Is that how you feel you guys play into that kind of approach?
Dan McSweeney (07:18):
That’s exactly how we feel. The Advisory Board came out and said that one of the top three challenges for the healthcare provider system is variability in care. That’s what makes this so difficult for the industry to get our arms around. As an example, if everybody were doing something wrong the same way all the time, that would be a much easier problem to fix, but they’re not.
The variation in care from rural to urban, based on socioeconomic factors, social determinants of health, and geography, is staggering. That’s what makes this so difficult for everyone to get their arms around, but it’s also what makes trying to fix it so challenging and so energizing. It’s a moving target that we’re spending a lot of time on.
I think that as we get into artificial intelligence and machine learning around clinical decision support, the goal is to augment the information that the radiologist, cardiologist, or clinician is seeing. That’s going to be a big factor, we think, in reducing the variability of care.
Jeff Elton (08:14):
We’re going to head into that theme next. So now you’ve got consistency, ubiquity given the large footprint, and my understanding is you’ve got quantification of features. Now, I’m going to differentiate quantification of features from, say, AI and machine learning-based models that are deployed under a 510(k) or as software as a medical device, what some call SaMD, [inaudible 00:08:39] come through.
Can you walk through the distinctions in this hierarchy, from features, features that can be refined, and quantification, all the way over to AI and machine learning, and help break that down a little bit?
Dan McSweeney (08:51):
Yeah, so we think about it in a few different buckets. One is just old school productivity, right? Radiologist burnout and clinician burnout are a real thing, and part of the challenge around imaging is that we’re getting more and more imaging studies done, they’re more and more complex, so they take more and more time, and there are more of them. But we’re not producing any more radiologists than we were; the system is only designed to put out so many, so that math doesn’t work.
Jeff Elton (09:18):
Fewer from what I understand, actually.
Dan McSweeney (09:19):
Exactly. So, a big part of where we focus is just old school productivity. AI and machine learning can play a large role there, around things like automation, auto measurements, and vessel segmentation. Those are areas where AI has already demonstrated, I would say, that it can affect productivity.
The other is around, as we talked about, the variability in care. So, things like determining is the central line placed correctly? Is there a potential neurovascular event in that CT scan? Is there a pneumothorax that’s picked up on the X-ray? Are there lung lesions? Those are all areas where they can provide a little bit of an assist to the clinician to potentially see something that the clinician might not have seen or see it in a different way. Again, none of this is replacing the clinician, this is all artificial intelligence-guided assist to provide that right information to the clinician in that care setting.
Jeff Elton (10:13):
So I’m hearing it as almost a continuum, but during the last part there, really an augmentation, not a replacement. It’s almost an assurance that key features that are going to be material to that diagnosis get highlighted and are actually seen with appropriate fidelity and prioritization in that field of view. Is that kind of an accurate way of trying to understand that?
Dan McSweeney (10:36):
Absolutely. There was a lot of churn maybe 10 years ago, when artificial intelligence and clinical decision support started to come out, that it was going to replace the radiologist. Nothing could be further from the truth. This is all designed to augment the radiologist, to provide them more information, and to automate certain tasks that are taking up their very valuable time now.
Ultimately, whether we’re focusing on the radiologist, the cardiologist, the neurovascular specialist, or another clinician, we’re providing them with AI-assisted information. The ultimate decision point is up to them.
Jeff Elton (11:08):
So Dan, as I’m listening to you, I’m kind of coming away with this impression and maybe in the not too distant future that radiologists wouldn’t even imagine working without an array of augmented tools as part of their own basis of practice. Is that an accurate view? Are radiologists really at the forefront of bringing these technologies forward and getting them into clinical practice?
Dan McSweeney (11:34):
So we absolutely believe that, and there’s plenty of evidence, both in the medical field and outside of it, that AI assist is pervasive in our daily lives. Radiology as a clinical care area has always been a leader on the tech side; when you rewind 30 years, they were the first to really demand digital modalities and digitization of their workflow from the industry.
None of this will work unless it’s pulled by the clinicians. So when you think about what we as vendors are developing, we’re developing it all in concert with clinicians. In order for this to take hold and really impact the workflow, it’s got to be driven by radiologists in the radiology space and, obviously, by other clinicians in their own care areas.
So the first examples that have been coming out, whether it’s lung lesion detection, neurovascular emergencies, central line placements, or calcification scoring, are all being pulled by the market, which really lets us know that these are some of the most impactful, relevant areas to start this journey.
Jeff Elton (12:37):
So I want to pick up on a theme that you brought up a couple of minutes ago, and it’s clinical decision support. Very complicated field. It’s a term that means different things to different people. It’s almost in that category of I’ll know it when I see it, but it’s very difficult to find. Years and years ago, probably 2009 I think, I worked on clinical decision support. In fact, I was the lead author on some IP in that area that never came to practice.
So, you’re describing something, though, that sounds like the foundations of a real, practical approach to clinical decision support. If I added digital pathology and pathomics, you start to see the pieces. How should we think about this? Because this is super complicated, and healthcare is usually made up of very narrow, highly specialized decisions. How do we go from where we are now to laying in the layers and foundations so that high-utility clinical decision support really becomes part of the way we operate clinically?
Dan McSweeney (13:46):
Yeah, that’s part of the challenge that we talked about earlier around the variability in care. There are tens of thousands of new pieces of clinical information published weekly. With the workload on today’s physicians and clinicians all through the care continuum, they just can’t keep up. Things are changing-
Jeff Elton (14:01):
Back to a prior life you led, in fact, actually, in that field.
Dan McSweeney (14:04):
Exactly, exactly. What we see is they just can’t keep up with both treating patients, which is the most important thing, and all of the literature, the latest treatment protocols, and study results. So, the intent of clinical decision support is really that final word, which is support.
It’s intended to be able to aggregate all of this other information and bump it up against the specific, let’s say, clinical case that that physician is working with, and really show them, hey, here’s something you might think of. Here’s the way that you may think about the differential diagnosis, here’s some new treatment protocols that you may not be aware of that might be available, as well as just the vast breadth of the information.
When you think about the genomics, and you think about the labs, and you think about the imaging, and then you think about the physical and acute symptoms as well, it’s very challenging for physicians in these environments, in these short interactions with the patient, to compile all of that. So, why not use the power of technology to provide what we’ll refer to as a digital cockpit, which gives them the information to make better decisions?
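As a rough sketch of that digital cockpit idea, aggregating labs, genomics, imaging findings, and symptoms into one view that supports rather than makes the decision, the snippet below shows one possible shape for such a bundle. Every class, field, and function name here is a hypothetical illustration, not an actual ConcertAI or TeraRecon interface.

```python
# Hypothetical sketch of aggregating patient context for decision *support*.
# All names are illustrative assumptions, not a real product API.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ImagingFinding:
    description: str  # e.g., "lung lesion, right upper lobe"
    source: str       # e.g., "CT chest, AI-assisted"


@dataclass
class PatientContext:
    labs: Dict[str, float] = field(default_factory=dict)   # lab name -> value
    genomics: List[str] = field(default_factory=list)      # e.g., variant labels
    imaging: List[ImagingFinding] = field(default_factory=list)
    symptoms: List[str] = field(default_factory=list)


def items_for_review(ctx: PatientContext) -> List[str]:
    """Surface items for the clinician to consider; the decision stays with them."""
    suggestions = [
        f"Review imaging finding: {f.description} ({f.source})" for f in ctx.imaging
    ]
    if ctx.genomics:
        suggestions.append("Genomic results available; consider the latest treatment protocols.")
    return suggestions
```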
Jeff Elton (15:11):
So, I’m hearing a few very important themes in what you said. One is support, not make: I’m supporting a decision process, and I may be supporting a care team. Another is contextualization: I’m taking the data, the information, and the decision that has to be made, and I’m contextualizing the vast amount of data available on the patient, but also the vast amount of evidence around that. Then there’s this notion of temporality, of when it gets presented and at what point it maybe has the greatest utility.
Is this something we should think about as almost operating on a continuum? It’s not a, here’s your platform, it’s ready to go, but we’re going to see it layering in increasingly across the different activities in healthcare? Or how should we think about this?
Dan McSweeney (15:57):
I think it’s absolutely going to be a journey. First, it has to be in the workflow. We’ve all seen swivel chair options or things that are clunky for the user and they just don’t get adopted, so it absolutely has to be in the workflow, which is why it’s so critical that the clinicians are involved.
Second, it has to be fluid. Healthcare is so personal. Part of the reason we have all this variability is not just physician interpretation, which is based on education, experience, age, and risk tolerance, but also that each of us is different on the other side of the equation. So it absolutely has to be not only in the workflow, but a real-time type of system, where one piece of data might be relevant on Tuesday but less relevant on Thursday based on how the patient develops.
Jeff Elton (16:45):
Wow. Let me take a slightly different turn. TeraRecon today is part of a broader collection of entities and capabilities inside ConcertAI. My understanding is that oftentimes 50% of the interpretive work in TeraRecon is cancer-related, solid tumors and others, and you’re obviously very active in acute cardiovascular and neuro-related indications. But as you think now about this broader set of AI and data technology domains, what’s possible, and what are you most excited about that could come together now?
Dan McSweeney (17:26):
So, this is a whole other aspect of the power of ConcertAI and TeraRecon together that really excites us. We’ve just spent a good amount of time talking more about the acute care setting, post-imaging study. The power of adding imaging data, whether it’s prospective or retrospective, upstream into the clinical trial environment, and then even further upstream potentially into the drug discovery and drug development environment, is truly revolutionary.
I know that ConcertAI and TeraRecon are uniquely positioned to do that. I don’t know anybody else in the market who has the expertise on the clinical trial management side, on evidence and real-world data, and on the advanced imaging side. Almost all clinical trials for oncology involve some type of advanced imaging.
Just imagine if we can take that genomics data, that patient data, and that imaging data, move it way upstream, and really have a tremendous impact, much more than just on that one patient that one clinician is supporting at that time: design better studies that yield more statistically significant results, and then design better drugs. I mean, when you think about the opportunities there, it’s truly energizing.
Jeff Elton (18:40):
Absolutely. Actually, as you’re talking, I can already tell you we’re going to have a dedicated podcast just on this particular topic as well. I know from some of the biopharma sponsors’ directions that they’re excited about the TeraRecon capabilities, even for doing patient identification for study eligibility using that quantification and AI layer. Images are obviously an important part of registrational trials, so it’s a super interesting, significant area for sure.
Let me take you to my last question. I’m going to come all the way back to your personal origins and the nature of the patient. If you think out over the next three years, no more than five, about the things you know your teams are working on and where the field is going, what are you most excited about that you think will have the most material benefit for patients going forward?
Dan McSweeney (19:32):
So, TeraRecon supports patients by enabling the care teams that support them, impact them, and provide their care. We are in such an exciting place in the timing of this market, with the explosion of artificial intelligence. Just to go back, what I feel fantastic about is that we are focused on two of the biggest challenges that the market has told us exist. One is reducing that variability in care, and the other is addressing radiologist burnout.
When you think about our roadmap and how we’re thinking about it, whether it’s vessel segmentation, plaque and calcium scoring, or auto detection, and then on the other side auto TAVR and EVAR, it’s about taking things that take clinical teams hours and hours to do today and having them done in seconds, under the supervision of clinicians, of course. The ability to free up their highly valuable time to spend more time with patients, to be able to think through a differential diagnosis differently and maybe more thoroughly, really excites us. It’s just the ability to positively impact the people who are positively impacting the patients.
Jeff Elton (20:39):
So Dan McSweeney, I have to admit, I’m excited too. It is absolutely wonderful, and I think what’s been accomplished so far is tremendous. The promise you laid out is at the forefront of everything we’ve been working toward for a year, so it’s great to see it come to fruition.
So I want to thank you for coming to the podcast room here in Cambridge, Massachusetts, and I look forward to the next session, for which we’ve already highlighted the agenda. So, thanks again very much.
Dan McSweeney (21:05):
Thank you for having me.
Jeff Elton (21:07):
So, I want to thank Dan McSweeney again for his participation in this podcast. Dan outlined the TeraRecon solution, which offers an open AI architecture that allows TeraRecon’s 510(k) software as a medical device to be joined by first-party and third-party solutions.
He also outlined why radiologists are in the vanguard of implementing AI for medical diagnosis, which in fact lays the foundation for intelligent clinical decision support that will continue to evolve over the course of the next two to five years.
Thanks for tuning in to this month’s podcast and we hope you’ll join us again next month. To all those who have listened, good morning, good afternoon and evening.