I think for a long time it seemed like working in an art form and being a feminist meant portraying women in a perfect, angelic light. And there's nothing feminist about that.
What right does Congress have to go around making laws just because it deems them necessary?