Insurance companies don't want us healthy; they get their money one way or another. They are a middleman between people and their doctors, and they take the majority of the profits.
If people were healthy, they might opt to go without insurance or carry only catastrophic coverage.
Thank you for your perspective. Again, my post was strictly my own bias, which I feel I may be able to re-evaluate to get more certainty. I have heard for a long time about doctors having inappropriate relationships with insurance companies, in the form of kickbacks that let both sides capitalize. I know some of it may be biased journalism with a touch of conspiracy theory. Nonetheless, I appreciate all feedback :)