Those in denial over man-made global warming sometimes argue that an increasing concentration of CO2 in the atmosphere will not strengthen the greenhouse effect, because as far as infra-red (IR) absorption is concerned, the atmosphere is already saturated with CO2, so more CO2 cannot absorb more IR.
This argument goes back to 1900. As soon as Arrhenius came up with his greenhouse gas theory, Koch ran an experiment with CO2 in a tube: he demonstrated the absorption of IR, then reduced the CO2, measured the absorption again, and found it did not change very much. Everyone said, "That's OK then. No need to worry about CO2 in the atmosphere", and that remained the prevailing view for the next 50 years or so. It still is, as this debate demonstrates.
The thing is, the atmosphere is more complex than Koch's tube. Near the ground, CO2's IR absorption is indeed saturated, but higher up, where the air is thin and absorption is far from saturated, more CO2 captures more of the heat that would otherwise escape. After all, heat can only escape to space from the very highest layers of the atmosphere, and more CO2 in those layers is very effective at slowing heat loss from our planet.
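To see why adding absorption on top matters even when the layers below are already absorbing flat out, here is a minimal sketch. It is my own illustration using the standard textbook "N opaque layers" toy model, not anything from the original post: each layer absorbs all the IR reaching it and re-emits equally upward and downward, while the flux escaping to space is pinned by absorbed sunlight.

```python
# Toy "N opaque layers" greenhouse model (a standard textbook sketch,
# not taken from this post). Each layer absorbs all IR reaching it and
# re-emits sigma*T^4 both up and down. In equilibrium the flux escaping
# to space is fixed by absorbed sunlight, which forces the surface to
# emit (N + 1) times that flux.

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
OLR = 240.0       # outgoing flux fixed by absorbed sunlight, W m^-2

def surface_temperature(n_layers: int) -> float:
    """Equilibrium surface temperature with n fully absorbing layers."""
    surface_flux = (n_layers + 1) * OLR  # energy balance of the layer stack
    return (surface_flux / SIGMA) ** 0.25

for n in range(4):
    print(f"{n} opaque layers -> surface at {surface_temperature(n):.0f} K")
```

Every layer in this toy model is completely saturated, yet each extra layer still warms the surface, because it raises the level from which the planet finally sheds its heat.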
I must admit it took me a while to get my head around this, but an analogy helped:
Imagine a production line carrying Smarties (M&Ms) past greedy but picky children. The red Smarties are infra-red energy and the children are CO2 molecules. The children only eat the red Smarties, and each can only eat so many per minute (this is their saturation point). So one child will let a lot of red Smarties through, because he is saturated. Add another child (more CO2 at a higher level) and more Smarties get eaten.
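The analogy is simple enough to run as code. Here is a quick sketch (the Smarties-per-minute numbers are made up purely for illustration):

```python
# Simulate the Smarties conveyor (illustrative numbers only).
# Red Smarties per minute stand in for IR energy; each child is a
# layer of CO2 that can only absorb so much before saturating.

RED_PER_MINUTE = 100   # red Smarties arriving each minute
CAPACITY = 60          # most one child can eat per minute (saturation)

def smarties_escaping(n_children: int) -> int:
    """Red Smarties per minute that get past every child."""
    remaining = RED_PER_MINUTE
    for _ in range(n_children):
        remaining -= min(remaining, CAPACITY)  # each child eats until full
    return remaining

for n in range(4):
    print(f"{n} children -> {smarties_escaping(n)} red Smarties escape per minute")
```

With one child, 40 red Smarties a minute sail past untouched; add a second child and those get eaten too, even though the first child's appetite never changed. That second child is the extra CO2 higher up.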