Eye Candy
Our body is the conduit through which we experience the physical world, and it holds our most primal instincts. Since David and Venus, we've elevated the human form to the highest pedestal, establishing and perpetuating an unreachable ideal of beauty. Through this warped lens, the natural body has been sexualized, vilified, and often shamed into oblivion. Why is nakedness offensive? Is it inherently political? Isn't there more to us than meets the eye?