Saturday, January 7, 2012

Do you think corporations should be regulated?

What is this society coming to? I know this is America: you can be born to a janitor and become the next billionaire, and we have the American dream. But at what point will corporations that exist to turn a profit, not to better society, be held accountable? Isn't that almost wrong?

Take drug companies, which are out to make a profit off of society rather than do what's actually best for it. Instead of affordable drugs, we get millions spent advertising to us and targeting us, lining the pockets of doctors, and millions more spent on lobbying. Or take the food companies that start with some shred of real food, pile on chemicals to give it a long shelf life, pay millions to advertise it and sponsor things, and then have you eat this unhealthy, nice-tasting piece of crap. Our society is so unhealthy, and we have companies constantly targeting us, over-marketing to us, selling us stuff we don't need.

In the old days, weren't people like this called scammers, or whatever the old-fashioned word was? The ones who went door to door a hundred years ago selling you things that were a scam and a rip-off. Isn't this the same thing? I think we're just so used to it that it doesn't even faze us anymore. Will this ever be a nation whose focus is on the people instead of on American business?
