Doctors Are Just Drug Dealers That Went to School Longer Than Most People

“Oh, you don’t feel good? Here’s the narcotics I suggest for numbing the pain. Go hit up my supplier down at your local pharmacy, he’ll hook you up. Come back in a couple weeks when you need another fix.”

Doctors just treat your problems instead of curing them, because they can't profit off of curing you.

If doctors actually cared about your well-being, they would prescribe things like a healthier diet, more sleep, exercise, and more water, because those things would actually cure your problems.

But they can't profit off of that. Instead, they prescribe you drugs that just numb the pain without fixing the problem, and they make you pay for all of those drugs so they can profit off of you.

People treat doctors like gods and will listen to anything they say. 

People are so against drugs but once some dude in scrubs gives them a prescription it’s suddenly okay. 

They're the exact same drugs that people are going to prison for, but just because a doctor sold them, it's suddenly okay.

These drugs are just like Advil or ibuprofen: all they do is numb the pain for a bit, but the problem still persists.

The leading killers in the world (cancer, diabetes, heart disease, obesity, etc.) are all caused by unhealthy diets and poor lifestyles.

People think they just randomly get sick. They think these diseases strike at random and that they're simply unlucky if they get one.

These illnesses are not random. They are caused directly by your diet and lifestyle. The food you eat is what causes all of these problems. If you had a healthier lifestyle, you would never have gotten sick in the first place.

If you want to be healthier and happier, all it takes is a better diet. All you need to do is eat more natural food and eat less junk food. 

But, instead, people choose the unhealthy life path and then pay drug dealers thousands of dollars just to give them some pills that numb the pain. 

I am not saying doctors are evil and do these things on purpose. It's simply what they are taught. Most doctors aren't taught about the benefits of a healthy diet, and they're usually not encouraged to recommend those things.

I’m not saying they are intentionally bad people. A lot of doctors pursue that career to help people. 

But the sad truth is they don't realize they are doing more harm than good. They are just drug-dealing salesmen who went to school for a long time.

They would make no money if they just told you to eat healthier and change your diet, because they would never see you again. 

All the money comes from visits to the doctor's office, where they run all these random tests and then hand you prescriptions for narcotic drugs that only numb the pain. Once the drug wears off, you come running back to the dealer for your next fix.

If you truly want to fix your problems and be healthier and happier, change your diet. Eat less processed food and eat more natural food. It truly is that easy. 
