Identifying the Warning Signs of Self-Harm
Self-harm is the act of intentionally injuring your own body as a way of coping with emotional pain, anger, or frustration. The practice is not new, but it is of particular concern today because of its marked rise among teens over the past decade. Self-harm most often takes the form of impulsive cutting, hitting, or burning, but…