Is global warming really a bad thing?
Keep in mind that I live (by choice, good or bad) in a climate that is more or less 9 months winter and 3 months summer.
If over the history of the earth, the earth has been significantly colder (by average global temperature) and also significantly warmer (again by average global temperature), why are we fussing over the effects of global warming now? If Mother Nature has, in the past, adjusted the global temperature, what difference does it make that now our industrialized society is rapidly changing the global temperature?
If scientists believe that Antarctica and the Arctic Circle were once warm, with no ice, why is it a problem if we go back to that?
Earth's history reveals that tropical animals once lived in Canada, and that arctic organisms once lived in the lower U.S. So maybe the Gaia theory is at work here, and the Earth is adjusting itself to a preferable climate by way of industrial damage to the environment. Maybe, in the past, Nature had to rely on natural disasters to make the changes it deemed necessary.
As a child, I learned that when the population of rabbits rose too high, some disease would appear and wipe out half the population. (I have often wondered if modern diseases, such as AIDS and hepatitis, follow this same logic.) Nature has a way of taking care of itself, and in our technological society, have we forgotten that maybe "nature" has its own agenda? If industry wasn't causing all these problems, maybe something else (a natural disaster?) would.
Please feel free to debate me on this issue. I don't necessarily agree with the statements I have made; I may only be encouraging a little "class" participation in our online experience.