Art and Gender
Men throughout history have dominated the art world.
For much of that history, women's sole contribution to Western art was as subjects: bodies portrayed by male artists. But change came, and ‘postmodernism has remade the world in ways that can never be retracted...