I don't feel great when some people say that, because people who say such things are obviously watching my skin and my "outside" appearance all the time, and it makes me hate myself because I hang out with that kind of people. I'm not saying all of them are like that; if someone who cares about you (like your girlfriend or boyfriend, parents, very good friends, etc.) says something like that... that's another story, of course.
I was asking because, for me, it depends on the situation. If it's just you and them alone, and they bring up the topic and tell you that it's looking a lot better, then that makes me feel really good about myself. But if someone brings it up in a group, then I feel as though the attention has just been switched to me and my skin, and I try to change the topic of conversation ASAP.