I read a news story a few days ago about them making an official statement that winter had started. Thought it was weird. Now seeing an official announcement that the start of winter has been canceled feels even weirder.
Why announce an official start of winter in the first place, even if you knew when winter starts, but especially if you don't? Are they trying to make even the weather political, trying to shape reality with imaginary statements?
TIL, I guess.