Answer:

The shift in other-oriented cultural values reflected by evolving gender roles in American society is option d, femininity.

Feminism in the United States refers to the collection of movements and ideologies aimed at defining, establishing, and defending equal economic, political, and social rights for women in the United States.

Femininity (also called womanliness) is a set of attributes, behaviors, and roles generally associated with women and girls. Femininity can be understood as socially constructed.

A gender role, also known as a sex role, is a social role encompassing a range of behaviors and attitudes that are generally considered acceptable, appropriate, or desirable for a person based on that person's sex.

To read more about feminism, click here: https://brainly.com/question/9254131

#SPJ4