Last week I told my therapist that spring makes me depressed. (Is revealing you have a therapist even personal anymore? Doesn't everyone have a therapist?) Eventually I come around to the whole beautiful weather and blossoming flowers thing and start to enjoy it as much as anyone else, but the transition from winter always makes me feel down. Apparently this isn't that weird. If you're not feeling that awesome to begin with, the weather getting warmer and the sun staying up later can just make your inside world feel more at odds with the outside one.
I like fall. Maybe because I was born in October, or because that sense of back-to-school anticipation never wore off for me. More than spring, it feels like a time of new beginnings, when you have the chance to completely reinvent yourself, starting with the chunky Steve Madden loafers your dad got you at the mall. Spring is just life starting over for everything else: trees, animals having babies. I can't see how any of that relates to me.