Historically, Americans have been distrustful of the healthcare industry, viewing "big pharma" and "corporate hospitals" negatively (for non-American readers: see our opioid epidemic). The healthcare industry, with its drive for profit, has been accused of being at the root of many problems in American society, but through the COVID-19 pandemic it has been cast as the hero. I'm not one for conspiracy theories, and I'm fully vaccinated, but I think it's notable that this is the first time (at least since 2001) that more Americans say they view the healthcare industry positively (51%, up from 38% in 2019) than negatively (31%, down from 48% in 2019). Please see the attached link for the Gallup poll.
I look forward to reading your thoughts on this.