Patriarchy in the U.S.
The U.S. is a patriarchal culture, which means that men, more than women, are regarded as the leaders and heads of family, state, government, and so on. This is not to say there are no women in these positions; it is to say that leadership in our country has historically been dominated by men, a norm that persists to the present day. For example, our country has never had a female president. The symbol of a "strong" white male figure in a suit and tie represented the country's leadership until Obama was elected. Men associated with positions of power continue to be the image that comes to mind for Americans when thinking about leadership in general.
When we think about men, we often think of stereotypical associations of masculinity. In our culture, we define masculinity as being tough, strong, stoic, independent, and so on. We learn at a very…