Podcast #63

AI and Mental Health: The Future of Care and Treatment featuring Chris Hemphill

AI technology is revolutionizing the way we approach mental health, increasing access to care and improving patient outcomes. Join Cardinal’s CEO, Alex Membrillo, and Chris Hemphill, Sr. Director of Commercial Intelligence at Woebot Health, as they take an in-depth look at the current and future state of AI in mental healthcare and the impact it has on the healthcare industry as a whole.

Episode Highlights:

Chris Hemphill: “The basic concept of Woebot is that– There’s a concept called Cognitive Behavioral Therapy… If I am having a spike in anxiety or depression, if I am to pick up the phone today to be seen, in many cases, we can be looking at 2 weeks out, 6 weeks, 20 weeks, or not even available at all. People are having major challenges finding these services. Our obsession is around access to care. Woebot is a relational agent that you speak with back and forth. It’s analyzing what you’re saying to assess your needs and then sending you down the right Cognitive Behavioral Therapy pathway that fits your needs.”


Announcer: Welcome to the Ignite Podcast, the only healthcare marketing podcast that digs into the digital strategies and tactics that help you accelerate growth. Each week, Cardinal’s experts explore innovative ways to build your digital presence and attract more patients. Buckle up for another episode of Ignite.

[music]

Alex Membrillo: What’s going on, everybody? This is a special, special episode, special, special, special. I’ve got Chris Hemphill here, Senior Director of Commercial Intelligence at Woebot Health. Usually, we interview marketing directors at multi-site healthcare groups. Today is going to be different. This is different. This is AI tech in the behavioral space.

Guys, this is the future, and it’s really timely because we just had ChatGPT hit us in the last few weeks. Everybody’s wondering where AI is going. Today, we’re going to be talking about it, at least in the mental health space, so this is fun. Chris, welcome to Ignite.

Chris Hemphill: Thank you, Alex. It is an absolute pleasure to be here. A little bit of background for the healthcare marketing space. Right now, I know everybody’s like, “What does somebody from the behavioral digital health side have to say?” I do come from a previous role in the healthcare marketing analytics space, in an applied AI position there.

It’s just interesting to see the crossover and be on this digital health side, but still, I can’t take away the healthcare marketing aspect of my brain. It’s just going to be fun to synthesize what’s going on in this language-based AI world and relate it to healthcare marketing.

Alex: Yes, absolutely. At the end of the day, as healthcare marketers, we’re trying to connect patients with the care they need. Woebot Health, we’ll find out more about that, helps more patients get care. Then we win, right? This is fun, and you get to do it all. Chris, tell us quickly, what is Woebot, and where is it trying to be in the market? Replace therapists? What does it do?

Chris: The basic concept of Woebot is that– There’s a concept called Cognitive Behavioral Therapy. There are others: DBT, IPT. There’s a whole bunch of acronyms that I can throw at you, but a lot of these therapeutic methods that were developed within the past 50, 60 years were developed with the idea of being delivered at home, or at the moment of need, in mind.

Think about if I’m having an incident. I’m having a spike in anxiety or depression and things like that. If I am to pick up the phone today to be seen, in many cases, we can be looking at 2 weeks out, 6 weeks, 20 weeks, or not even available at all. People are having major challenges finding these services. Our obsession is around access to care. Woebot is a relational agent that you speak with back and forth. It’s analyzing what you’re saying to assess your needs and then sending you down the right Cognitive Behavioral Therapy pathway that fits your needs. For example, you’ve been playing with Woebot?

Alex: I have been playing with it. It’s crazy. It’s crazy. I love it.

Chris: I encourage everybody to access it, play with it. We won’t charge you any money for going on and using Woebot. Please, download it. The first thing that it asks you is how you’re doing or what issue it can help you with. It’s using your response there, even free text, which is always going to be a big subject that we’re talking about today [unintelligible 00:03:43] all the generative technologies and things like that out there.

Our company was founded by a clinical psychologist, by someone who’s obsessed with the science behind it. It’s using that free text to understand: are you having an issue with relationships? Are you having an issue with your family, or grief, anxiety, things like that? Then it routes you down a CBT, Cognitive Behavioral Therapy, pathway, ultimately designed to alleviate your stress. [crosstalk]

Alex: I noticed it was trying– I said, “I’m feeling grumpy and anxious. Oh, that pissed me off.” Something here at the office. [unintelligible 00:04:23] It said, “Do you notice how you’re using extreme words like always?” Then when I said, “Yes, I have noticed that,” it said, “Do you feel like he really feels that way about you?” I said, “You know what, I bet he doesn’t.” Then I felt better.

Chris: Yes. Honestly, in my early use of it, because I’m a very cynical person, I was like, “What?” When I was talking back and forth– sometimes you just want to blame other people for the crap that’s going on, but like you’re talking about, Woebot got you to look at your own thoughts and then challenge them.

There’s this other thing. Here’s a soapbox that I want to stand on for a second: this term self-care. When we hear about self-care on Instagram, it’s always like, “Oh, buy these tickets to fly out to Europe to this expensive vacation getaway, or buy this makeup.” Fine, that’s referred to as self-care, but I think that we should be reclaiming that term, self-care, to mean having the skills and understanding to be able to manage your own care at those moments. Why should self-care be associated with how much money you’re able to spend on things rather than developing the skills to be able to help yourself in the moment?

Alex: Sure. That makes sense. Woebot, was it invented to take the place of therapists, like, I just need this, I’ll do my check-ins? Is it to replace therapy because we have limited access, or is it an addendum?

Chris: Alison Darcy, our founder, and Athena Robinson, our chief clinical officer, are practicing therapists and clinical psychologists. That’s where a lot of our leadership comes from. We don’t really even believe it’s possible for a technology like this to replace the therapeutic relationship. A significant number of our users at Woebot actually see in-person therapy, but they use this as something that’s in between, or the fancy word is adjunctive.

Either way, it plays out in usage because, imagine, earlier, I was talking about the ridiculous length of time that it can take to even be seen. I didn’t even mention the cost and things like that that prevent access. Imagine a world in which the access problem is solved. You can be seen within three or four days. Still, there would be a place for this technology, based on the fact that about 74% of our usage happens outside of the hours that a clinic would be open. The longest conversations that people have with Woebot are around 2:00 in the morning. Even with human therapy resources available, there are people that are a good fit for the tool and other people that probably wouldn’t be as good a fit. For that group of people, there’s still that need because of the way that it can be delivered at any time.

Alex: It is so crazy. AI can synthesize voices. Chris was mentioning that with just three seconds of audio, there’s some tool out there that can come up with your voice.

Chris: Just last week, Microsoft released– I think it’s still internal. I don’t think that I can go access VALL-E right now, unfortunately. It’s a tech called VALL-E that, with just three seconds of your recorded voice, not even at a pristine quality, can synthesize a fairly convincing replica. I was listening to the voices, and you could tell there were audio quality issues and stuff like that, but imagine as they iterate on that technology.

Alex: Yes. My therapist, Veronica, the best obviously, you can tell, kept it all together here for the last 14 years. She’s awesome. She can record her voice, and then Woebot can eventually call you, I would assume, and say, “Alex, how are you feeling?” “Well, you know, I’m not feeling too well.” Is that where you all want to take it?

Chris: No.

Alex: No? [laughs]

Chris: All right. I haven’t talked to anybody about this, but it just gives me a no in the chest right here, because we were just outside talking about this company called Koko. Sorry, Koko. You’re on blast right now. They’re a chat technology that allows you to– Say that you’re having a problem with grief or anxiety, and you talk to this chat program. It’s peer-sourced help. They’ll send it out to a network of people on Discord. Spooky. They’ll send it out to that network of people, then those people will start replying back, and you rate the responses, like, did this help you. Like, “Hey, here’s your response to them. Did it help you feel better? Great.”

What Koko started to do was they took the messages that were coming in and, rather than sending them [unintelligible 00:09:27] to humans, they introduced ChatGPT into the process. What they didn’t tell the users was that the people responding were potentially using ChatGPT to generate the response. What happened was that as users learned that their response was actually generated or inspired by the AI, they significantly dropped their ratings. They didn’t know that their data was going to be sent in that way. Knowing that the response came from the AI made it feel less authentic, like they had been tricked.

Using Woebot– I know you haven’t been using it for a super long time or anything like that, but one thing that you’ll notice is Woebot is always saying things like, “Oh man, my gears are grinding on this.” It is always jokingly reminding you who it is. You’re talking to an AI. There’s only so much that an AI can do. There are limitations there.

As humans, we are rightfully distrustful if somebody is trying to deceive us, if an AI is trying to deceive us into thinking it’s human. To sidestep that, the ethical approach to AI-delivered therapy is to remind the user of the limitations of this thing, like, “We can’t do everything that a human can do, but hey, right now, maybe I can help you with your thoughts.”

Alex: I’ll say this. In Koko’s defense, or in the defense of AI, the users thought the responses were valuable before they knew they were robots or ChatGPT. That means it can work-

Chris: Yes. Honestly, I sat back–

Alex: -and be clean. We should tell users, but they thought it was good.

Chris: Yes. Imagine if they had been upfront with those users. Imagine if they said, “Hey, we’re doing an AI-assisted approach to help our peer-support responders, who don’t necessarily know the quality that you’re getting.” Imagine that they were upfront with their users and said, “Would you like to go forward with this?” It creates a whole different perspective when you know what you’re walking into, when the trust is already established there. Again, with the right messaging, people responded positively to it. Gosh, if only they had done it with the proper ethical framework in play, we’d be talking about growing trust rather than diminishing it.

Alex: Yes. It’s those kinds of plays that screw up the AI game for everybody.

Chris: Make it hard for the rest of us.

Alex: [laughs]

Chris: [unintelligible 00:12:09] Woebot, they did– Yes.

Alex: Healthcare marketers, we help marketers at behavioral groups. How do we need to look at this? Are you going after the same demographic? Are we going after it with different messaging? I’m wondering, where do you see healthcare marketing going? What are we, as healthcare marketers, going to need to diverge on to get people exposed to– I’m trying to think, if Woebot was my client, what I would do differently versus actual clinics and therapists. Where do you see this going?

Chris: Help me dig a little bit deeper into– because I have some thoughts coming from that healthcare marketing background. I clearly have some thoughts on what that relationship looks like. Could you just go a little bit deeper into what you’re asking about the healthcare marketers?

Alex: Yes. I’m wondering how Woebot markets itself because right now, we take out Google ads and SEO. When someone types in, “Therapists near me,” we want to show up. What does Woebot do? How do they get clients?

Chris: Gosh. That’s a really good question. From the consumer perspective, something that was really surprising to me when I came into the company was that there were about 1.4 million people who had found Woebot, mainly through things like the App Store.

Alex: Really?

Chris: Yes. We don’t invest a huge amount of money into consumer marketing, because, as some designers put it, if the product is so effective and works so well that people talk about it and recommend it to others, then your investment in product is an investment in marketing. There’s a big amount of investment that goes into the product side. On the enterprise and commercial side, we know who the major health systems and insurers and things like that are.

Alex: Is that who the main client is? You’re trying to get after the insurance company, the payers so that they offer incentives for people? Who’s the client?

Chris: The client tends to be health systems and payers, or even the combination, payviders, that are focused on having access to a tool that can deliver care and help you out in a time of need. It’s different in a fee-for-service model, where you’re trying to provide as many services as possible. It’s not necessarily for that; right now, it’s more so for those organizations, especially in value-based care.

Alex: Is that value-based care, where you want to reduce readmissions and stuff like that? The health systems are giving this out to anybody that’s been in for any behavioral or mental health challenge in hopes that this–

Chris: We can even go a little bit deeper than that, because we know there’s that shortage of behavioral therapists and how long it takes to even be seen for behavioral care. You were asking about the biggest trends a little bit earlier. The acronym running throughout today is BHI, Behavioral Health Integration. It’s recognizing that behavioral health needs going unmet are associated with much higher costs in other areas. I’ll just use the example of depression. Let’s say you’re depressed and you’re also diabetic. Having some of those symptoms makes you less likely to take the treatments you need to manage the diabetes. That cost curve bends in very much the wrong way. The earlier that we could help address those mental health needs, the–

Alex: That’s wild. They’re giving it to patients of all kinds?

Chris: Yes. Like, “I have a problem. My primary care provider is not a trained therapist, so they don’t necessarily know what to do about some of these problems.” Rather than these needs going ignored, or a referral being given that never gets filled, it gives primary care physicians something to be able to help address the issue.

Alex: Does the PCP say, “This is an in-the-meantime thing. Go get help from the therapist. This is an in-the-meantime app that can help you with some of these things right now, help you get care right now”? Is that the positioning?

Chris: I like that question because I’m thinking like, “Whoa, we should come up with some kind of scripting to help with how they deliver that recommendation.”

Alex: Yes. They need some kind of delivery and then it’s training the provider on when to bring it up, how to bring it up.

Chris: Yes. That’s where we see a lot of success. God, I personally just love talking to physicians. My partner is actually an OB-GYN. The reason I like having those conversations with physicians is they don’t want to recommend something that’s going to do their patient harm. They have that much closer connection with that patient, and they have the most serious questions about the way the science behind it works, so when you can break through–

Now, I’m saying this for all the people out there that have challenges with their physicians adopting digital health platforms. Once you work on that science and explain it in a way that breaks through, then you drive that adoption. Sorry, it’s just a little tangent.

Alex: No, I’m thinking because we have a GI client. Won’t say who it is. Initially, they had terrible reviews when we got in bed with them. I said, “Why are the reviews for this practice so bad? It seems like the practice–” They said, “Strangely enough, your gut is tied to the emotional part of your brain.” Often the GI practices have tons of bad reviews because the patients are– there’s something going on for them mentally. The patients are having stomach issues. This would’ve been a perfect tool for them. If you have a GI practice, I can see that these things go hand in hand. I see somebody use–

Where do you see digital health platforms evolving from here? Where does Woebot want to go next? What do you see as the next big evolution? Can a derm group use this? I can take a picture of this weird shit I have around my eyes and send it to some derm digital health platform. Where’s this going?

Chris: There are a lot of people that play in this space of being able to get whatever analytics you can off your phone with these high-quality cameras that we have that can pick up all kinds of nuance. When I was presented with these technologies back in 2015, my eyes couldn’t have rolled far enough out of my head because the results weren’t there. This stuff didn’t work.

I was just at HLTH this past November, the health conference in Nevada. Gosh, again, I said I’m a very cynical person. I walked into the thing cynical, but then when I saw– I met up with a physician who was also going around, making the rounds and looking at some of these biological inspection tools, and his excitement about it was palpable. It just made me think maybe–

Alex: There’s something to it if the providers are excited about it?

Chris: Yes. If providers are excited about it, without me going through all that science or having a med degree myself, sometimes I think that that’s a good thing to look at as a look ahead at what might actually be working. For digital health as a whole right now, I’m going to speak more from a hope perspective: we know that funds are harder to acquire right now. The well is drying up for all sorts of companies. What I’m hoping for is that as a result of these funds drying up, rather than investments and things like that being based on revenue–

Alex: Revenue

Chris: Yes. Rather than it being based on a very light understanding of what someone’s doing, I really hope that VCs get really responsible with their dollars, get deeper into the science behind how some of these things work, and we get more checks and balances, more quality investment in organizations that are obsessed with the science.

Alex: I’m all about it. This showed me how much MedTech can do. It can detect atrial fibrillation, all that stuff. I think that’s the best way we’re going to be able to prevent cancers and stuff– Something came out at a recent conference. I don’t know if it’s that one, but it’s something that sits in your toilet, and it does your analysis every time you go pee.

Chris: I can’t believe how excited I was. I was like, “Yes.”

Alex: Yes. Think about it. It can detect whether all these different cells are fighting, if you’re showing early signs of cancer or something like that. I think MedTech is going to be huge. I see my wife scrolling every night, and I’m like, “What are you doing, baby?” She’s on TikTok, just going through different things. The algorithm, I guess, is so good that it serves her up ads, but I don’t really get it. It seems like stuff is showing up at our door every day that I didn’t go on Amazon and purchase; she’s getting influenced by TikTok and everything like that. Is there any place for social ad platforms in the mental health space? Is there any place for advice from TikTok? It sounds crazy. I don’t know.

Chris: Ooh. The emphasis on that word, advice. Whoa. I’ll say this. What did Ariel say? She said, “I’m going to go where the people are.” When it comes to these platforms, we have to understand how to reach people in a way that is safe and effective, because you said the a-word. Advice is the a-word. If people are going to these platforms and credible sources aren’t there, then who does that leave the door open to speak to them? Overall, I really think that we have to be there, that credible voices have to be there to drown out the wave of disinformation that people would otherwise get. We can see that there’s a major potential for that disinformation to do harm.

Alex: I could see a lot of mental health providers being above it, and you’re saying, “No, you should dive in because everyone else–” You know who’s going to be there? Alex, with no therapy degree and no training. He’s going to be there.

Chris: Because why would that content get millions of views and likes other than it’s something that people want to hear? Think about if you’re in your mid 30s or late 20s or what have you, and you’ve been having issues all your life or there’s just been challenges that you’ve had and there was nothing to put a name to it. It’s a very powerful effect if someone is putting a name to it, and that same thing where we were talking about how ChatGPT was delivering these messages in a way that was convincing to people. It proves that the power is there. Who wields that power?

Alex: Yes. Scary, scary, scary. As someone in his almost 40s, I just think of Terminator all of the time, and we’re inching closer to them becoming self-aware. This makes me very nervous. You’re in the business, so I assume you’re excited about where AI’s taking healthcare. Anything you’re really excited about in the near future that we haven’t talked about yet?

Chris: Yes. We’ve talked about it. You don’t see me saying very much about ChatGPT on social media, a lot less than you would expect. The reason for that is, I appreciate how it’s opened up the imagination, how it has shown people what’s possible. This goes for not just the ability to generate this speech, but also the image generation technologies, Midjourney, DALL-E, and now we talk about VALL-E, voice synthesis, and stuff like that.

People are thinking on a much bigger scale about what’s possible. What I’m hoping for over the short term is the ChatGPT alternatives, the people that are thinking from different perspectives than just language models, the people that are thinking, “Well, how can we use these technologies in a way that is safe and communicates effectively?” Now the doors are open. I got hit up on LinkedIn for a conversation with an investor about this stuff. There’s a lot of interest happening. My biggest hope is that there are some really stronghearted ethical players that get into this and start thinking about things that are useful and helpful for populations in all kinds of ways. [crosstalk]

Alex: You think it’s just the beginning?

Chris: Yes. We’re at an early phase, and when I think about the usefulness of ChatGPT and all that, there’s some– Look, I’m really excited by its ability to summarize text and things like that, the generation capabilities. Again, it looks very cool and all that, but there’s a lot of work that would need to happen. Ultimately, there needs to be some ground truth. It’s predicting a next word, but it doesn’t have any concept of what a ground truth is. When we’re thinking about the types of approaches that would work on the fly in healthcare, I would say that we’re pretty far off. The door’s open to at least start thinking about what the path to get there looks like.

Alex: It’s also just regurgitating what’s out there. I think the place for humans is still in original and unique content. You’re not going to be able to get by with regurgitated, normal blog posts that are just a different spin on someone else’s. You’re going to have to come up with something original that is provider- and clinically-informed. ChatGPT can’t do that.

Chris: Let’s take it to cooking because my first request with ChatGPT, this is a real hardcore test, is how do you make a French omelet? When it responded, it gave me a recipe, and I was like-

Alex: “It’s not right.”?

Chris: -Not enough butter.

Alex: What did it do? Did it just go to Google? What does it do? Does it go to Google and just search for–

Chris: It doesn’t go to Google. If you want a really freaky way to start thinking about this technology, think about when you’re using your phone, and you’re typing, and it’s predicting the next word that you’re likely to say. Think about it from that context. Then whenever you ask it a question, think about how ChatGPT would have been able to answer that question only by looking at a whole bunch of documents and then making a prediction about what the next word is most likely to be.

Such as, if you ask ChatGPT what 4 x 4 is and it responds with 16, the initial feeling is like, “Oh, wow. This thing is doing math,” but it’s not. It is not taking four and adding it four times. What it’s actually doing is looking at, within this body of text that it’s trained on, based on the position of 4, x, and 4, what came next most frequently. This is a language approach to approximating math. It didn’t know the answer. It didn’t calculate it. There’s no calculator there, nothing happening other than predicting the most likely next word based on the previous ones.
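The "predict the next word" idea Chris describes can be illustrated with a toy sketch. This is nothing like a real large language model, which uses a learned neural network rather than raw counts, and the tiny corpus and function names here are invented for illustration; it only shows how pure frequency statistics over text can look like arithmetic.

```python
from collections import Counter, defaultdict

# Toy next-token prediction: count which token most often follows a given
# two-token context in a tiny "training corpus", then predict by frequency.
corpus = "4 x 4 = 16 . 4 x 4 = 16 . 2 x 2 = 4 .".split()

# Build a table: (previous two tokens) -> counts of the token that came next.
follow = defaultdict(Counter)
for a, b, c in zip(corpus, corpus[1:], corpus[2:]):
    follow[(a, b)][c] += 1

def predict(a, b):
    """Return the token seen most frequently after the context (a, b)."""
    return follow[(a, b)].most_common(1)[0][0]

print(predict("4", "="))  # "16" -- looks like math, but it's only counting
```

Ask it what follows "4 =" and it answers "16" purely because that pairing dominates its training text; there is no arithmetic anywhere in the model.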

Alex: Where does it get its historical information? Where was it fed from?

Chris: See, here’s another hope of mine within this. I couldn’t tell you the exact corpus of data that OpenAI used, but it was basically trained on data they found on the internet and various places [crosstalk] up to 2021.

Alex: Really? That’s what it is. It did use the internet, every website out there.

Chris: Yes, it used it up to 2021. I don’t know which websites out there, but it used a good chunk. The real point I was just making was that it’s not going and searching like Google for those answers. It’s predicting the next word [crosstalk]

Alex: Most commonly used next word.

Chris: Yes. That. There’s, of course–

Alex: There are lots of French omelet recipes out there. How did it know which one to give you?

Chris: It knew which one not to give me. It gave me the wrong one.

Alex: If it’s not enough butter–

Chris: Yes. If only the AI was trained on recipes from the South and Chef John, then it would’ve been much more accurate to my use case.

Alex: It has limitations. As a marketer, I’m nervous about where this is going. The algorithms are getting really smart. They can write, they can design, and they can speak now. I don’t know what the hell’s left for us. I just think we’ll be the strategists. I think we’re going to fly the ship, and I don’t think we’re going to do the work, because I think all these AI algorithms are going to be able to do a lot of the work.

Chris: If you start thinking from that context of how it knew this by predicting the next word, then it’s a little bit of a Wizard of Oz effect. That’s the magic behind it. Then on the other side, think about this. Think about if you got a communication from your health provider about needing to come in for [crosstalk]. I don’t know how those messages are formed and all that. On the other side of this text generation paradigm is the idea that now people want to know what text was procedurally generated versus what came from a human. At the same time that you have these generative algorithms, you also have these other algorithms that give you the probability that a text was generated by ChatGPT versus a human.

Imagine your email platform adjusts for that, and it tells you the likelihood that a human developed that, and you get a communication from your healthcare provider, and your email system tells you that it’s not human. It strikes me that there are ways to do this without copy-pasting an output from ChatGPT. Effective ways, like instead of asking it to write an article on the need for dialysis, asking what steps you would take to write an article on dialysis. Then that would be creativity rather than copy-pasting an output.

Alex: Got it. You need to heavily revise ChatGPT responses.

Chris: There we go.

Alex: Man, this is crazy. This is just a crazy time to be recording this. For Woebot, what is the next big step they’re looking to take? I don’t know if it’s public knowledge. Anything that’s public knowledge, you’re okay to share. Where do they want to take it next? It looked like it had tools, relations, everything else in there. It helps you go through CBT and the other one.

Chris: IPT.

Alex: IPT, whatever. Where do they want to take it next?

Chris: It’s really just a matter of understanding where the technologies are, and maybe you catch a hint of it from the fact that I said I was hoping to see different approaches beyond what we have with ChatGPT. Really, I’m looking to see who is going to do something that is clinical grade. From that perspective, with Microsoft making that large investment in OpenAI, that could be something to take seriously.

Overall, when we’re talking about healthcare, and we’re talking about something like you were talking about, delivering right at the point of care, there need to be a whole lot more standards and a whole lot more [crosstalk] than what’s happening today. It’s really about assessing, keeping an eye on things out there, looking beyond any use case that is something– We don’t want that ChatGPT-style immediate response where it’s, again, just predicting the most likely next word. That’s not effective for someone’s health. It’s just looking at the different–

Alex: Something that’s clinical grade and well-thought-out and pragmatic, yes.

Chris: In terms of privacy and security as well, because if you’re developing an app using OpenAI right now, that requires sending data to them. I’m supposing that’s their philosophy.

Alex: This guy, Alex, he’s [crosstalk] [laughter] [crosstalk]

Chris: “What’s going on there?”

Alex: I don’t know. Somebody call Atlanta. Chris, any other big things where you see the near future heading in digital AI, health platforms, digital health platforms, anything else you want listeners to know [crosstalk]?

Chris: Didn’t we talk about enough? Wow. There aren’t enough big things happening? I’m kidding. Here’s the main thing that I’m going to say, and I think it applies to everybody: nothing is inevitable. There’s nothing that is happening right now that– We say AI, but part of how even this latest algorithm was trained is that they took a big chunk of the internet and developed some predictive models based on it. There’s also an approach called Reinforcement Learning from Human Feedback that occurred.

They got a bunch of people to evaluate the responses that were coming out of the approach, and they tailored the system to match what would be more likely to get that positive response from the human. There are people involved in every step of this process, and even if you tell ChatGPT to generate some volume of text, that’s a human-synthesized thing. I like to remind people that the right things don’t happen, and we don’t move in the right direction, without the right people focusing on the right things, looking for ethical approaches to doing this. I’m not with Koko. That’s the main thing I [unintelligible 00:35:02]

Alex: Nothing is inevitable. Nothing is inevitable. That’s good. Chris, this has been a ton of fun. We got to talk about something I don’t usually get to talk about, which is digital health, digital AI. I got to play around on the app and see its use and stuff. This is really cool. Obviously, there’s some kind of bridge that’s going to happen between human therapy and AI-assisted therapy, AI-assisted healthcare in general. I think if you’re a provider, this is an exciting time to be in it.

If you’re a provider that was looking to get out of providing, this is a good time to get out, because a whole ton of shit is about to change in the next 10 years: how you provide care, how quickly patients can get care. This is going to be a really fun time to be in healthcare, a really fun time to be in healthcare marketing, and only the most innovative, I think, will survive in both instances. Chris, thanks for dropping in on Ignite.

Chris: Appreciate it. Appreciate the message and the energy that you’re delivering too. Hopefully, it’s falling on good ears. I’d love to see this all push through.

Alex: Thanks, man. Lots of Diet Coke.

[laughter] [music]

Announcer: Thanks for listening to this episode of Ignite. Interested in keeping up with the latest trends in healthcare marketing? Subscribe to our podcast and leave a rating and review. For more healthcare marketing tips, visit our blog at cardinaldigitalmarketing.com.

Healthcare Marketing Insights At Your Fingertips

Listen and subscribe to Ignite wherever you get your podcasts.


"*" indicates required fields

Hidden
Hidden
Hidden
Hidden
Hidden
Hidden
This field is for validation purposes and should be left unchanged.