Answers

2016-06-21T14:58:07+08:00
"Sex roles" redirects here. For the scientific journal, see Sex Roles (journal).

A gender role is a set of societal norms dictating what types of behaviors are generally considered acceptable, appropriate, or desirable for a person based on their actual or perceived sex or sexuality. These are usually centered on opposing conceptions of femininity and masculinity, although there are exceptions and variations. The specifics regarding these gendered expectations may vary substantially among cultures, while other characteristics may be common throughout a range of cultures. There is ongoing debate as to what extent gender roles and their variations are biologically determined, and to what extent they are socially constructed.

Various groups have led efforts to change aspects of prevailing gender roles that they believe are oppressive or inaccurate, most notably the feminist movement.

The term 'gender role' was coined by John Money in 1955, during his study of intersex individuals, to describe the ways in which such individuals express their status as male or female when no clear biological assignment exists.
