Skin cancer is the most common type of cancer in the United States. If you could visibly see signs of skin cancer on your body, would you be more likely to visit the doctor?