When skin or other tissue in the body is injured, the body begins to repair the damage. Scars can form as the skin heals.