You Deserve to Read This: The Culture of "Deserving" in America
You know what always bothers me about contestants on reality shows? How they continually announce that they "deserve" to win because they've worked hard, as though their hard work is somehow different from and more honorable than that of their fellow bakers/designers/survivors. This mentality extends to interviews with family and friends as well, which often focus on the contestant's morality: He deserves to win because he's such a good person. She deserves this because she's got such a big heart.
How has the media, such as L'Oreal's "Because You're Worth It" campaign, played into this? Have evolving parenting methods affected our sense of self-worth? And what is the effect of this phenomenon on our ability to sympathize with others, especially those who are very "other"?