Ever since the leaves started changing way back at the beginning of October, all I’ve done is rave about how much I love autumn. And how could I not? Not only are the leaves beautiful to look at, but they’re especially fun to step on and crunch once they’ve fallen off the trees. Do y’all do this too, or am I the only childish one around here?
But now that we’re almost halfway through November, I’ve started to recognize the early signs of winter. The temperatures are creeping lower and lower, the sun is setting before I even leave work, and, worst of all, most of the trees are completely bare without a red or gold leaf in sight. But instead of accepting that the first snowfall will probably happen within the next few weeks, I’ve decided that I’m going to live in a total state of denial and pretend that autumn is still in full swing… Not entirely healthy, but trying to get excited for winter is asking a lot of a girl who grew up in Florida!
Are you guys looking forward to winter, or are you like me and still hung up on autumn?