Perhaps I'm way out of the loop here, but I got the impression that this forces Americans to buy health insurance or else risk a fine. While that seems to be the popular consensus, other people defend it as something else.
"They aren't forcing anybody to buy anything, my good soldier, just incentivizing it. They can't force anybody; there is no way to. Plus, who in their right mind goes without health insurance these days anyway? The only reason I don't have it is because I can't afford it for myself, but hopefully this bill will change that."
This was a comment on a Facebook post by a friend of mine. Is there some major misconception that Obamacare gives insurance to those who cannot afford it, or does it go deeper than that? I certainly don't ever want to be forced to buy a product or face a fine, but I'm not quite sure whether this is just Fox News spewing bullshit everywhere.