Cinema explains American society. It's like a Western, with good guys and bad guys, where the weak don't have a place.
May I say, if you were suddenly put into a woman's body, wouldn't you be slightly interested in your breasts, and in why people look at certain parts of you, and why certain parts move the way they do?