When did healthcare become a right?

I am dirt poor and therefore should be all for universal health care, but I don’t think it’s a good idea. I don’t even want to pay for my own, and I sure as hell don’t want to pay for anyone else’s – mostly because I think it will ruin the American health care system and cause a lot more pain than help. Besides, when on earth did health care become a right?

On the surface it seems like a novel idea: people who cannot afford to pay for their health care have it provided by the state, or by their company (which I definitely don’t agree with). But what if they could afford to pay for it, yet wanted to use that money for fattening foods and a big-screen TV instead? Why should the taxpayers foot the bill?

My father and almost every frum Jew I know say it’s socialism. Looking at countries like Norway and Sweden kind of gives me the idea that socialism did something right, so I don’t really think it’s such a bad word. We already have public education, public parks, and public retirement – I don’t see how we aren’t already socialist. The left-wing people I talk with say the free health care systems of the world rock; the right-wingers say those people come to America to get stuff done. I don’t believe either of them.

What I am really thinking is that if you believe everyone should have health care, then everyone should be mandated to have it – sort of like car insurance. I am speaking from complete ignorance, by the way, but these are some random thoughts I have been having as Obama keeps pushing this health care thing through.