Answer:
American culture often treats women with disrespect: pop culture focuses mainly on women's beauty, makes women feel bad about themselves by saying harsh things, and portrays women as weaker than men. Hope this helps :)