Algorithmically enslaved healthcare consumerism

Is AI the new digital enslavement? Explore the risks of over-reliance on generative AI algorithms in healthcare, especially in mental health.
By admin
May 29, 2024, 2:16 PM

Regardless of future hype curves, the great debate will continue over the balance between technology and intuition in the delivery of care. Even the most ardent supporters have an "Oh, wait!" moment about the advice being dispensed by various flavors of AI, whether generative or image-based. Is AI the new digital addiction? Will algorithmic enslavement hurt patient care?


From Netscape to Dr. Google: The evolution of algorithmic enslavement

Personal experience tells me there is an evolution that occurs in our relationship with advanced algorithms. It starts with a suspicious curiosity, much the same way we all engaged with the first browser. I spent hours feverishly searching on Netscape, and my result history took me from fly-tying techniques to world music artists in Mongolia to recipes for veal osso buco. In that era, it was a migration from hundreds of Usenet alt.* groups to basic browser enslavement.

Fast forward 40 years, and we are now entering a world of algorithmic enslavement, with healthcare arguably at the epicenter. This is not only happening at the clinical and research level, but especially from a consumerism perspective.

Dr. Google is now on generative steroids!

As with the earliest browsers, this new enslavement takes hold very slowly. At the beginning, generative AI addiction required registering for ChatGPT. And as with any hyped product, waiting lists soon formed before one could eventually get access.

But the digital enslavement turning point came when the algorithms recently became embedded in mobile devices. Typing prompts into a generative platform on one's laptop became the gateway drug for the consumerized version on the phone. Considering that the mobile device has become an extension of one's body, the relationship becomes much more personal.


Healthcare’s algorithmic addiction: Are we swiping right on our own demise?

Mobile device usability experts soon made it difficult to do a simple, "old-fashioned" search, making generative search the first choice for us, as with the positioning of Gemini on the Google browser.

Were it not for the age-old adage that "healthcare is different," this evolution might not be so concerning. However, as we know, the algorithm's relationship with human prompts can produce some questionable results. This can surely occur with front-line professionals in clinical support, which is why many are reluctant to go too far with situational AI. This is especially true in the realm of mental health, where responses to prompts can be ambiguous and require a human empathy that a non-sentient algorithm simply cannot provide.

My point is not to litigate the efficacy of generative AI results for clinicians or everyday patients. It is more about a new dependency on the technology, regardless of its output. One can easily see that algorithmic co-dependencies are forming, and will continue to form, in much the same way that the original browsers created a time drain never seen before. Lest we forget, the smartphone has done the same, and likes to remind us of our comparative screen times every week.

So what could possibly go wrong when a patient is enslaved both by their mobile device and by an algorithm developing a "personal" relationship to counsel them through a mental health or interpersonal crisis? Getting a second opinion will take on a whole new meaning. Will it be second-guessing the algorithm and opting for human contact? Or, frighteningly, will it be second-guessing the mental health professional and asking for a consult with the infamous and infatuated generative bot, Dr. Sydney?
