
Why Self-Care Is More Important Than Ever
Written by Stephanie Sutton, M.D. – Self-care is a broad term for the intentional actions you take to care for yourself, inside and out. It encompasses both your physical and mental wellness. Practicing self-care can include everything from washing your face at night to eating a healthy diet. Self-care is more important than ever because it …