People Reveal Their Scars And How They Got Them In A Powerful Photo Project

Scars get a bad rap. They are often seen as ugly, dangerous, even criminal: something to hide and be ashamed of. In popular culture, it’s the bad guys who have the scars.