Disclaimer:
I am not vouching for this article - I find that most articles on most sites, on either side of this debate, have problems of their own, some large, some small. I only offer this as a thought starter.
Background:
My personal stance, as mentioned many times in this forum, is:
1. The Earth has been warming since 1850 (the end of the Little Ice Age), with an increase in the rate of warming since 1900 or so (even though CO2 didn't really spike until 1940 or so, and I question some of the data adjustments being made).
2. A doubling of CO2 concentrations (which takes about 100 years) will directly cause about 1.0 deg C of warming (the greenhouse effect, essentially).
3. Humans are responsible for a large portion of the increase in CO2 concentrations since 1940.
4. The "feedback" effect, which is captured in the "climate sensitivity" assumption of these models, is much less clear. The feedback, in theory, is due to the higher temperatures from #2 above causing an increase in water vapor in the atmosphere, which creates a multiplier effect on the initial 1.0 deg C of warming. This multiplier was initially thought to be 3.0 to 5.0 times, turning 1.0 deg C of warming into as much as 5 deg C = ARMAGEDDON! However, this number has been decreasing as analysis has continued, and the correct value likely lies between 1.0 and 3.0. At minimum the SCIENCE ISN'T SETTLED here, and it is quite likely the value is around 1.5. (I walk through this arithmetic in a quick sketch after this list.)
5. A climate sensitivity of 1.5 means that we will not see catastrophic anthropogenic global warming (CAGW). Thus, I believe the Earth will warm in the future, but that the warming will not be catastrophic. This is neither an alarmist position nor a denier position, but a "lukewarmer" position that I believe will be validated as we learn more. The "pause" has shown that natural variability clearly has more influence than the models accounted for.
6. Adapting to this warming is cheaper than trying to mitigate it, since getting all countries to mitigate together and in sufficient measure is highly unlikely (and mitigation punishes the poor by raising the cost of energy). Adaptation will never be proposed by the strongest alarmist groups because there are motives besides warming driving some of these mitigation recommendations.
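To make the arithmetic behind points 2, 4 and 5 concrete, here's a quick sketch. The 1.0 deg C direct warming and the multiplier ranges are simply the figures I listed above; the logarithmic forcing check at the end uses the standard 5.35*ln(C/C0) approximation and a ~3.2 W/m^2 per deg C Planck response, which are textbook reference values rather than anything from the linked article.

```python
# Warming arithmetic from points 2, 4 and 5 above. The inputs are the figures
# assumed in this post, not outputs of any climate model.
import math

direct_warming_per_doubling = 1.0   # deg C per CO2 doubling (point 2)

# Feedback multipliers discussed in point 4 (low, high).
multipliers = {
    "early high-end estimates": (3.0, 5.0),
    "range I find plausible":   (1.0, 3.0),
    "my 'lukewarmer' guess":    (1.5, 1.5),
}

for label, (low, high) in multipliers.items():
    lo_t = direct_warming_per_doubling * low
    hi_t = direct_warming_per_doubling * high
    if low == high:
        print(f"{label}: about {lo_t:.1f} deg C per doubling")
    else:
        print(f"{label}: {lo_t:.1f} to {hi_t:.1f} deg C per doubling")

# Sanity check on the "about 1 deg C direct" figure: the common logarithmic
# forcing approximation F = 5.35 * ln(C/C0) gives ~3.7 W/m^2 for a doubling;
# dividing by the ~3.2 W/m^2 per deg C Planck response gives roughly 1.2 deg C.
forcing_2x = 5.35 * math.log(2.0)
planck_response = 3.2
print(f"no-feedback response: {forcing_2x / planck_response:.1f} deg C per doubling")
```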
Here is the article in question: http://wattsupwiththat.com/2015/08/09/the-trouble-with-global-climate-models/
One thing I like in particular about some of these sites (including but not limited to WUWT) is reading through the comments and trying to see how people attack the article in question. One can learn as much from that as from the article itself.
One of the more interesting comments in this case was this one, which hits upon several best practices that I am somewhat familiar with from doing engineering and predictive modeling projects:
Dr. Daniel Sweger
August 9, 2015 at 9:03 am
There are several problems with GCM parameterized models that are typically not encountered in engineering applications. The reason parameterized models are necessary is that the process involved is nonlinear and does not lend itself to closed-form solutions. The models are trained using existing data over a well-defined range of input values. The model is then applied to a particular problem.
However, the range for each variable needs to cover the entire range of the anticipated applied values. If the input value of one or more of the variables exceeds the range used during model training, then the model output is considered to be unreliable. The reason for that is relatively simple: the effect of that variable on the output is not known with any degree of certainty, and the further that variable gets from the training values, the more unreliable the output of the model is. Model variables that behave “properly” over the training range can diverge quickly, even exponentially, outside of that range.
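To illustrate that last point with a toy example of my own (synthetic data, nothing to do with an actual GCM), here are two fits that are nearly indistinguishable over a narrow training range and then go their separate ways well outside it:

```python
# Toy demonstration of extrapolation risk: a linear and a cubic fit that agree
# closely on the training interval diverge from the truth (and from each
# other) once evaluated far outside it. Entirely synthetic data.
import numpy as np

rng = np.random.default_rng(0)

def truth(x):
    # The "real" process, which the fitted models only see over a narrow window.
    return np.log(x)

x_train = np.linspace(1.0, 2.0, 50)                   # narrow training range
y_train = truth(x_train) + rng.normal(0, 0.005, 50)   # small measurement noise

linear = np.polynomial.Polynomial.fit(x_train, y_train, 1)
cubic = np.polynomial.Polynomial.fit(x_train, y_train, 3)

for x in (1.5, 2.0, 4.0, 8.0):   # the last two are far outside the training range
    print(f"x={x:4.1f}  truth={truth(x):7.3f}  "
          f"linear={linear(x):7.3f}  cubic={cubic(x):7.3f}")
```

Inside the training window the three columns are nearly identical; well outside it they separate dramatically, which is exactly the commenter's point.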
With engineering applications, the ranges are set by experimental conditions. In wind tunnel experiments, for example, the experimental wind speed must exceed the anticipated real-life case. If and when the tested vehicle encounters wind speeds greater than the range of the model inputs, the results can be disastrous, as has been encountered with ultra-supersonic test flights or attempts to set land speed records.
It is also possible for singularities to occur with combinations of input variable values that were not tested for. Such was the case with the 1940 Tacoma Narrows Bridge collapse in Washington State. A resonance condition was established by a combination of wind speeds and direction that was never tested.
In the case of GCMs, the most important of the input values is assumed to be the CO2 concentration. But these models are only trained over a relatively narrow range of values. They are trained on hindcast temperature values for the thirty years from 1975. During that thirty-year time span, the atmospheric CO2 concentration ranged from 330 ppm to 385 ppm. The models are then run for values of CO2 that exceed the training values by as much as 100%.
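Just to put numbers on that last claim (the 330-385 ppm training range and the "as much as 100%" figure are the commenter's; 280 ppm is the commonly cited pre-industrial concentration):

```python
# How far do projected CO2 values sit outside the training range the commenter
# describes? The 330-385 ppm range is his; 280 ppm pre-industrial is the
# commonly cited baseline.
train_min_ppm, train_max_ppm = 330.0, 385.0
print(f"training range: {train_min_ppm:.0f}-{train_max_ppm:.0f} ppm")

scenarios_ppm = {
    "2x pre-industrial (2 * 280 ppm)": 560.0,
    "100% above the training maximum": 2.0 * train_max_ppm,   # 770 ppm
}

for label, c in scenarios_ppm.items():
    beyond = (c - train_max_ppm) / train_max_ppm
    print(f"{label}: {c:.0f} ppm, {beyond:+.0%} relative to the training maximum")
```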
Another potential problem with GCMs is that a model should never be evaluated on its training data. There must be a data set, independent of the training data, that is used for that purpose. This is typically done by randomly dividing the total data set into halves, one of which is used for training and the other for evaluation. This requires a sizeable data set, which does not exist for global climate data.
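For anyone who hasn't done this kind of work, the holdout evaluation he describes looks roughly like the sketch below (toy data and a deliberately simple linear fit; the procedure, not the model, is the point):

```python
# Sketch of holdout evaluation: randomly split the data in half, fit on one
# half, and score only on the untouched half. Synthetic data.
import numpy as np

rng = np.random.default_rng(42)

x = rng.uniform(0.0, 10.0, 200)
y = 2.0 * x + 1.0 + rng.normal(0.0, 1.0, 200)   # made-up linear process

# Randomly split the indices into two halves.
idx = rng.permutation(len(x))
train_idx, test_idx = idx[:100], idx[100:]

# Fit only on the training half.
coeffs = np.polyfit(x[train_idx], y[train_idx], deg=1)

# Evaluate only on the held-out half.
pred = np.polyval(coeffs, x[test_idx])
rmse = np.sqrt(np.mean((pred - y[test_idx]) ** 2))
print(f"held-out RMSE: {rmse:.2f}")
```

The key discipline is that the held-out half never touches the fitting step; its only job is to score the result.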
These are just some of the practical problems with the modeling. Other, theoretical problems include how to define a “global” temperature. Temperature is an intensive property, but only extensive properties are additive and thus subject to simple averages. For example, if you mix a mass of dry air at a temperature Tdry with an equal mass of air at 100% relative humidity, i.e. saturated air, at a different temperature Twet, the resultant temperature is not (Tdry + Twet)/2.
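His mixing example can be checked with a back-of-the-envelope calculation. The sketch below uses my own simplifying assumptions (Magnus approximation for saturation vapor pressure, constant heat capacities, equal dry-air masses, no condensation after mixing), so treat it as illustrative only: the mixed temperature comes out close to, but not equal to, the simple average, and the gap grows in cases where mixing does trigger condensation and releases latent heat.

```python
# Mixing a dry parcel with a saturated parcel: the equilibrium temperature is a
# heat-capacity-weighted average, not (Tdry + Twet)/2, because the moist parcel
# carries extra heat capacity in its water vapor. Simplifications: Magnus
# formula for saturation vapor pressure, constant heat capacities, sea-level
# pressure, equal dry-air masses, no condensation after mixing.
import math

P = 101325.0        # Pa, total pressure
CP_DRY = 1005.0     # J/(kg K), dry air
CP_VAP = 1860.0     # J/(kg K), water vapor

def saturation_vapor_pressure(t_c):
    """Magnus approximation, in Pa, for temperature in deg C."""
    return 610.94 * math.exp(17.625 * t_c / (t_c + 243.04))

def humidity_ratio(t_c, rh):
    """kg of water vapor per kg of dry air at relative humidity rh (0..1)."""
    e = rh * saturation_vapor_pressure(t_c)
    return 0.622 * e / (P - e)

t_dry, t_wet = 30.0, 10.0            # deg C: dry parcel, saturated parcel
w_dry = humidity_ratio(t_dry, 0.0)   # completely dry
w_wet = humidity_ratio(t_wet, 1.0)   # saturated

# Heat capacity of each parcel, per kg of dry air.
cp_dry_parcel = CP_DRY + w_dry * CP_VAP
cp_wet_parcel = CP_DRY + w_wet * CP_VAP

# Energy balance with no phase change: heat-capacity-weighted temperature.
t_mix = (cp_dry_parcel * t_dry + cp_wet_parcel * t_wet) / (cp_dry_parcel + cp_wet_parcel)

print(f"simple average:          {(t_dry + t_wet) / 2:.2f} deg C")
print(f"heat-capacity weighted:  {t_mix:.2f} deg C")
```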
It is about time that the true scientific community stands up and disputes this raw attempt at power grabbing and wasteful spending of hard-earned tax dollars. There is a great deal of value in exploring new and novel methods of generating electricity, but not at the expense of destroying emerging economies that are dependent on currently inexpensive technologies. The current climate “solutions” are nothing more than modernized versions of eugenics.