I’ve been back from vacation for a month now, and ever since I arrived back in the States I just don’t feel like myself anymore. It’s almost like I left my whole self back in the UK. I’m not interested in going out, and I don’t plan lunches, brunches, and dinners like I used to.
I’ve traveled out of the country before, and I’ve visited some beautiful cities in the USA (LA, San Francisco, NY, and Boulder, CO, which I love more than NY, and that’s saying a lot), but I’ve always been able to come back home and feel like my normal self. Have you ever been somewhere and, when you returned home, felt like you left yourself there?