Lately I've been fighting a pretty intense depression, as I'm sure some of you are aware. I know depression is a chemical change in the brain, a condition that can easily be treated with medication. Yet for some reason, as bad as it is and as much as I want it to go away, I can't seem to make myself go to a doctor for it. Even though my conscious mind knows it's just a medical condition like having the flu, the rest of me seems to think that if I go in and get help, I'm admitting some sort of weakness. I feel like I SHOULD be more in control of my emotions, so I keep prolonging this and just making it worse for myself.
Why do we, as a society, think that mental disorders are somehow something to be ashamed of? Why do we treat head problems differently from body problems? It doesn't really make sense to me, but it seems deeply ingrained in all of us to look down on mental disorders as some embarrassing situation that we can't talk about. It's dumb, and I'm as much to blame as anyone, since I can't make myself go in and get the help I need.