Americans Care About Climate Change More Than Ever But They Still Don't Want to Pay For It
After years of being battered by hurricanes, wildfires, and heat waves, coupled with a climate denier-in-chief, it seems Americans might finally be coming around to the stance that climate change is real and we should do something about it. Welcome, friends.
