This is a random post for me, but this question has stuck with me over the years and is important for me to ponder.
I always see posts on Instagram telling girls to be “body positive”. While I think that’s important, I don’t think it’s a replacement for taking care of your body.
For example, “thick” and “curvy” girls are surely beautiful, but if their weight is causing health problems like diabetes, should they really be “body positive”?
Likewise, if you’re too thin because you aren’t eating right (e.g., due to anorexia), can you really be body positive either?
I think that loving yourself is important. But there comes a point where you’ve got to be honest with yourself and say, “Sure, people say I look good. But do I feel good?”
Feel free to leave your thoughts below.