Feminist Art History
Body art is a form of visual art in which the human body itself serves as the canvas or medium. It encompasses practices such as tattooing, piercing, and performance art. In the context of feminist art, body art serves as a powerful means of self-expression, challenging societal norms around femininity, identity, and the female experience.