One Engineer’s Perspective on Global Warming
David Simpson posted on August 12, 2014 | 39557 views
Many scientists and non-scientists are discussing "Global Warming" (or, as it is increasingly being called, "Anthropogenic Climate Change" or ACC).  ACC would simply be an interesting topic for discussion if it were not for the politicization, polarization, and sensationalism that have accompanied the science. 

Most scientific discussions start with a hypothesis followed by experimentation, data collection, analysis, theory modification to fit the data, and then further testing of the revised hypothesis.  This is healthy scientific inquiry.  When a headline says "Snowfalls are now just a thing of the past" and the UN forms an Intergovernmental Panel on Climate Change (IPCC) that regularly issues dire predictions of imminent catastrophe, we have moved past limiting the discussion to its scientific merits.

The site www.eng-tips.com is a technical forum for practicing engineers.  ACC is a frequent topic on this site.  A search of www.eng-tips.com for "Global Warming" yielded 431 discussions. 

A recent discussion ("Kicking the Climate Change cat further down the road") has 444 posts.  The previous long discussion ("A Lid for the Can of Worms, Good Heavens, We'll Freeze to Death!") had 244 posts.  Before that, "(can of worms alert) Globe hasn't warmed in the last 16 years" had 457 posts.  The last thread started by someone who could not be classed as a "denier" or "skeptic" (""Educated" opinions on climate change") was started in 2007, had activity for 12 months, and collected 315 posts. 

 

The Global Warming hypothesis
[Note:  Many warmists would strongly question my ability to describe ACC objectively.  They may be right, but the following is an honest attempt at objectivity.  Supporters of the ACC hypothesis have claimed that the hypothesis does not require a positive feedback loop, but I have been unable to identify a mechanism that would break the cycle once started.]

The fundamental hypothesis is that certain "greenhouse gases" accumulate in the atmosphere and prevent heat from radiating into space.  The bulk of the mass of greenhouse gas in the atmosphere is water vapor.  The next largest contributors by mass are carbon dioxide (CO2) and methane (CH4).  The theory expects that these gases at today's atmospheric levels (e.g., CO2 approaching 400 ppm, CH4 approaching 2 ppm) will create a positive feedback loop (illustrated by the famous "Mann Hockey Stick" graph of global temperature variation vs. time, Figure 2) and make life on Earth untenable due to extreme temperatures. 

 

The issue with the Hockey Stick Forecast for Future Temperatures

Figure 2--Mann Hockey stick graph from
3AR Chapter 2

Nature is replete with examples of negative feedback.  For example, when local ocean surface temperatures increase, water evaporates and the latent heat of vaporization leads to local cooling. 

It is difficult to find an example of sustained positive feedback in nature other than the one hypothesized by ACC.  Even something like an avalanche, where the force of the falling material breaks other material free (the snowball effect), can only persist until the moving mass reaches an insurmountable barrier or runs out of incline, a matter of minutes. 

A visual extrapolation of the Mann Hockey Stick graph has led many people to the conclusion that greenhouse gases will trap heat in the lower atmosphere, temperature will increase, evaporation will put more greenhouse gas into the atmosphere, which will trap more heat, and so on.  Without a damping effect, the graph supports the interpretation that within a few years (definitely less than a decade from when the graph was created in 1999) the temperature of the earth would become universally untenable.

 

The engineering discussion on climate change
While there are over a million engineers who are members of www.eng-tips.com, fewer than 100 have participated in the active discussions of ACC.  We cannot draw broad conclusions from such a small sample, but it can be illuminating to look at one of the threads in some detail.

The latest long thread on www.eng-tips.com (Kicking the Climate Change cat further down the road) has had 444 posts consisting of 86,000 words, with several hundred graphs, figures, and links from 30 different contributors, seven of whom clearly accept the ACC hypothesis, 21 of whom clearly do not, and two who did not reveal a position on the issue. 

This discussion has tended to be civil, with a minimum of the "how could you possibly be this stupid" arguments that are so prevalent in general, non-technical ACC discussions.  It is fascinating to me that every single time someone linked to the Huffington Post or The Telegraph, their arguments were immediately torn to shreds by people with expertise in engineering fields.  (Once, a supporter of the ACC hypothesis put the U.S. EPA in the same category as the Huffington Post, much to the amusement of the people who accept neither ACC nor the EPA as an authority.)  Contributors were quite diligent in trying to bring reputable arguments into both sides of the discussion.

In this thread the argument in favor of ACC was led very competently by three contributors (with a combined 106 posts).  These individuals passionately presented the strongest arguments in favor of ACC that are available, and they defended against an onslaught of pointed attacks on those positions.  When it became clear early in the discussion that an appeal to authority was not going to work (i.e., the argument that "the best minds in the field all accept that …" was always met with, "here are the reasons that their beliefs do not sway me"), the champions of ACC fell back to the very technical and obscure essence of the scientific literature, where the discussion should properly be focused. 

The argument against ACC was led by nine contributors (282 total posts).  The recurring themes of the argument against were "models cannot prove anything," "adulterated data is invalid," and "raising taxes on carbon emissions is simply income redistribution on a huge scale with no net impact on the environment."

One post 19 days into the four-month (so far) conversation carries the essence of the discussion for me:

TGS4 (Mechanical)                                                                     31 Mar 14 9:43

Rconnor - you have presented a textbook case of argumentum ad ignorantiam. Since "it" can't be a few things that we know, it obviously must be our pet theory. Natural cycles? Internal variability? Well, we don't understand that, so obviously it can't be that. Puleeze!

Now, to your asinine suggestion that, although we may know models and even computational models, because we don't know climate models we are singularly unqualified to proffer a learned opinion on said models. What a load of codswallop! I have been using computational/numerical models in the style of finite element / finite volume / and finite difference for 20 years, and have almost 20 papers in those topic areas to my credit. Damn right I know a thing or two about "models", and it matters not what is going on in the element/volume, there are certainly some universal truths:

1) Boundary conditions: all models are sensitive to their boundary conditions. For climate models, that means what's happening at the edges of the model. Any textbook description of the atmosphere shows a huge variation in temperature as a function of height (with the height itself a function of latitude). Albedo is a boundary condition that is a slight function of the near-surface temperature (and geography and geology).

2) Initial conditions: our current climatological data is so spatially and temporally heterogeneous that setting proper initial conditions sufficiently far in the past so as to train or tune the model to match recent history is a fool's errand. Ergo, any training or tuning of the model to match historical conditions is not and cannot be physics-based.

3) Discretization and discretization error: the volume size in the current models are woefully inadequate to resolve spatial and temporally-significant weather and climate (climate being merely the time and spatial-integral of weather) phenomenon.

I have lived and travelled in some pretty diverse places, and I can say categorically that the spatial grid-size is poor. I've also done numerical simulations (CFD, in this case) where we are trying to simulate phenomena such as shock waves. The grid size is everything. Have you seen such presentations of upper-level winds such as http://earth.nullschool.net/#current/wind/isobaric... the features shown (and this is actual data, not a simulation) are very important climatically-speaking, and yet the grid-size necessary to resolve such details is at least an order of magnitude greater than what the current generation of models have.

4) Volume or element formulation. I have done simulations where there are more than 20 variables per grid (including some that had three distinct temperature metrics - plasmas are a blast to model, BTW). Within a single grid, you can model only the most simple physics. Since the resolution of the current climatological models is so coarse, they try to cram all sorts of extras into each grid. Been there - done that - and it's a fool's errand.

5) Validation: this is something that has been hammered home to me so many times by professors and mentors. Does your model match an experiment or reality? Well, the divergence of the atmospheric temperatures during this long "pause" between the real world and the model world shows that validation is not yet achieved. And this failure of validation is likely due to the above-noted issues.

Now - I agree that the CO2-temperature hypothesis does not need these sorts of models. However, claims of forthcoming catastrophe most certainly do. In fact, everything in this topic that is forward-looking relies on "the models". Without catastrophe, there is no need to "act". I am certainly willing to admit that my philosophical and political leanings bias me against the proposed "actions" required to "save the planet", but being sufficiently self-aware, I also know that my technical understanding of this topic is not clouded by my pre-existing biases. Can you say the same?

 

Political ramifications of Climate Change
The focus of the political portion of the climate-change discussion is the reduction of the anthropogenic (i.e., "man-made") portion of total atmospheric CO2 and CH4.  The IPCC has published several Assessment Reports, all concluding that immediate efforts are required to save the world from man's activities.

Since the 1AR, governments have been progressively increasing their presence in this scientific discussion.  The EU implemented a "cap and trade" program that allowed companies to increase (or even maintain) their carbon emissions only by finding another company that was emitting less than its allotted share of carbon and purchasing the excess.  Several countries and subdivisions of countries have implemented similar programs. 

The U.S. Environmental Protection Agency (EPA) was thwarted by the courts in 2013 in its attempt to regulate CO2 emissions, but it has implemented an extensive "inventory" system and has put severe de facto (if not de jure) limits on CO2 and CH4 emissions.  The Executive Branch of the U.S. government has continued extensive efforts to implement greenhouse-gas controls on many fronts, including the Department of the Interior, the Department of Energy, and the Department of Commerce in addition to the EPA.  Thus far these efforts have been largely unsuccessful in classifying carbon as a "pollutant," but they are continuing.

 

Scientific support for anthropogenic climate change
The "climate" is big.  Very big.  Full-system experiments are very difficult to perform.  In the absence of a full-system experiment, scientists are limited to physical models and mathematical models. 


Figure 3--Sample of the equations that control the
behavior of the atmosphere (University of Arizona)

Physical models rely on "similitude" to ensure that the model is representative of the full system.  Similitude is a technique for scaling a natural phenomenon of inconvenient size down to a convenient size.  In fluids problems this scaling is based on matching several dimensionless parameters, such as the Reynolds, Weber, or Nusselt numbers. 

The theory is that if you can build a physical model in which two or more dimensionless parameters match those of the full system, then the data from the model has a high potential to reflect the full system.  Reproducing the horribly complex interactions of the global climate in a manageable physical model has not been successful.
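As an illustrative sketch of why similitude matching is hard (the geometry and wind speed below are hypothetical; the fluid properties are standard sea-level air), consider matching just the Reynolds number for a 1:100 scale model tested in the same fluid:

```python
# Illustrative sketch: Reynolds-number matching for a 1:100 scale model
# tested in the same fluid as the full-scale system.

def reynolds(density, velocity, length, viscosity):
    """Re = rho * v * L / mu: the ratio of inertial to viscous forces."""
    return density * velocity * length / viscosity

rho, mu = 1.225, 1.81e-5        # air at sea level: density (kg/m^3), viscosity (Pa*s)
full_L, full_v = 100.0, 10.0    # hypothetical full-scale length (m) and wind speed (m/s)
model_L = full_L / 100.0        # 1:100 scale model

# Matching Re in the same fluid forces the model velocity up by the
# inverse of the length scale: v_model = v_full * (L_full / L_model)
model_v = full_v * (full_L / model_L)

print(f"model velocity needed: {model_v:.0f} m/s")  # 1000 m/s: supersonic
```

Matching even this one group already demands an impractical test velocity; matching the Reynolds, Weber, and Nusselt numbers simultaneously is generally impossible, which is the article's point.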

Mathematical models are the other tool (Figure 3).  If an interaction involving mass and/or energy transport can be described by an equation or a system of equations, then insights can be gained into the functioning of that physical event.  Once the system of equations moves beyond rudimentary relationships, the variables and equations, along with boundary conditions and environmental inputs, must be assembled into a computer model before you can begin to assess what you know, let alone how things might change with time.

The primary tool of climate science is the computer model.  Computational fluid dynamics (CFD) and finite element analysis (FEA) tools are used to try to reflect the climate of the earth and determine the drivers of climate change. 


Figure 4--Computer model grid layout
(University of Arizona)

The setup for the computer modeling software requires selecting a grid size (Figure 4).  In most modeling software, the most important aspect of a grid cell is the information that is transferred from one cell to the next. 

In other words, the model considers all energy, forces, and reactions within each grid cell to be homogeneous.  The model enters a grid cell with a very large number of parameters (e.g., temperature, atmospheric pressure, surface wind velocity, atmospheric gas composition, humidity, cloud cover, solar irradiance, albedo, jet stream velocity, and hundreds more) and uses mathematical functions to predict the state of each variable on exit to the next cell.  Some of the parameters, such as cloud cover and greenhouse gas concentrations, are inputs to the model instead of outputs.  This series of calculations is then adjusted for hysteresis (i.e., the amount that a previous state impacts a future state) and run for the next time step.
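A toy sketch of what cell-to-cell transfer looks like (hypothetical, and vastly simpler than a real climate model): a one-dimensional row of cells, each holding a single homogeneous temperature, exchanging heat with its neighbor every time step:

```python
# Toy sketch (hypothetical): a 1-D row of grid cells, each treated as
# homogeneous, exchanging a fixed fraction of any temperature difference
# with its downwind neighbor each time step.

def step(temps, transfer=0.2):
    """Advance one time step; `transfer` is the exchanged fraction."""
    new = temps[:]
    for i in range(len(temps) - 1):
        flux = transfer * (temps[i] - temps[i + 1])
        new[i] -= flux        # warmer cell loses heat...
        new[i + 1] += flux    # ...cooler neighbor gains it
    return new

temps = [30.0, 20.0, 10.0, 0.0]   # initial cell temperatures, deg C
for _ in range(100):
    temps = step(temps)

# Transported heat is conserved; the profile relaxes toward the mean.
print(temps)
```

Real models carry hundreds of variables per cell and three-dimensional fluxes, but the bookkeeping principle, that what leaves one cell must enter the next, is the same.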

Once the model is built, it must be "calibrated" or "trained" to verify that it can reproduce past time periods before it is allowed to attempt to predict future states.  Calibration starts from data at a known past point in time and tries to match other, more recent time periods.  The modeler has a number of "levers" he can "pull" to adjust the influence of the various parameters in the model and to change the magnitudes of the fixed variables.
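A minimal sketch of "pulling a lever" (the record, the forcing series, and the one-parameter stand-in model below are all invented for illustration): scan a single sensitivity parameter until the model best matches a short historical series:

```python
# Hypothetical sketch of tuning one "lever": pick the sensitivity that
# minimizes squared error against a made-up historical record.

history = [0.00, 0.05, 0.12, 0.17, 0.25, 0.30]   # invented "observed" anomaly, deg C
forcing = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]         # invented forcing index

def model(sensitivity):
    """A trivial stand-in model: anomaly = sensitivity * forcing."""
    return [sensitivity * f for f in forcing]

def sse(sensitivity):
    """Sum of squared errors between model output and the record."""
    return sum((m - h) ** 2 for m, h in zip(model(sensitivity), history))

# Brute-force scan over the lever's range; real calibrations adjust
# dozens of such parameters at once, which is the article's concern.
best = min((s / 100.0 for s in range(0, 101)), key=sse)
print(f"best-fit sensitivity: {best:.2f}")
```

With many levers and a short record, many different lever settings can match history equally well, which is why a good hindcast by itself proves little.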

Major cyclical events like El Niño, La Niña, sunspots, the Pacific Decadal Oscillation, and the Atlantic Multi-Decadal Oscillation are put into a supervisory file that drops their impact into the model at the frequencies with which these events have occurred in the past.
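A hypothetical sketch of such a supervisory file (the cycle periods and amplitudes are illustrative placeholders, not published values): each named cycle contributes a periodic nudge at its historical frequency:

```python
# Hypothetical "supervisory file" of cycles: each entry injects a
# periodic contribution at its historical frequency. All numbers are
# illustrative placeholders.
import math

cycles = {
    "ENSO":  {"period_yr": 5.0,  "amplitude": 0.15},
    "PDO":   {"period_yr": 25.0, "amplitude": 0.10},
    "solar": {"period_yr": 11.0, "amplitude": 0.05},
}

def cyclical_forcing(year):
    """Sum the sinusoidal contributions for a given simulation year."""
    return sum(c["amplitude"] * math.sin(2 * math.pi * year / c["period_yr"])
               for c in cycles.values())

for year in range(0, 30, 10):
    print(year, round(cyclical_forcing(year), 3))
```

Because these cycles are scheduled rather than emergent, a mismatch between the schedule and reality (e.g., an El Niño arriving "early") shows up directly as forecast error.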

When the model is built and calibrated, it is turned loose to predict future outcomes with a temporal granularity ranging from a few hours to 10,000 years (i.e., every climate or weather event within a time step is averaged into one set of numbers).  The model output parameters reported most frequently in the press are global average temperature and sea level.  The scientific community tries very hard to present this data as factually as possible, with little reference to social impacts.

Extrapolations from the scientific literature are made in the press and in political discussions about what this increasing global average temperature will do (e.g., increased tropical hurricanes and typhoons, increased frequency and duration of droughts, more tornadoes, etc.) and the dislocation effects of rising sea level to assess the risks of inaction.

 

Scientific consensus on climate change
Much has been made in the press about the report that "97% of climate scientists agree" that "Global warming is a real threat, and that mankind's activities are the cause—the science is settled".  

One of the sources frequently cited for the consensus is Naomi Oreskes, Affiliated Professor of Earth and Planetary Sciences at Harvard.  Professor Oreskes examined the abstracts of 928 articles (with no disclosure of how the 928 were selected from a body of peer-reviewed literature numbering in the thousands) and found that 75% supported the view that human activities are "responsible for most of the observed warming over the previous 50 years."  Inferring an author's opinion on a broad topic from a paper's abstract seems to be a stretch.

An article published by Doran and Zimmerman in Eos: Transactions of the American Geophysical Union in 2009 reported the results of a two-question online survey of selected scientists and claimed that "97% of climate scientists agree."  The questions were:

  1. Have global mean temperatures risen since the pre-1800s?
  2. Do humans significantly influence the global temperature?

It is difficult, even for a skeptic, to answer those questions in the negative.  It is a core skeptic position that the climate changes: it has always changed and it will always change.  It is also difficult to argue that the biomass of 7 billion people would have no impact at all on a heat sink as large as the earth. 

The Doran and Zimmerman survey was sent to 10,257 earth scientists, of whom 3,146 responded.  The headline "97%" figure came from a subset of just 79 respondents who listed climate science as their specialty and had published on the subject; 77 of them answered the second question, and 75 answered it positively.  So this evidence of a consensus actually rests on 75 people responding positively to inane and general questions.

In 2013, a paper by Cook et al. published in Environmental Research Letters claimed that a review of the abstracts of peer-reviewed papers from 1991 to 2011 found that, of those stating a position, 97 percent explicitly or implicitly suggested that human activity is responsible for some warming.  The Cook paper was reviewed by Legates et al. in Science & Education, who found that "only 41 papers – 0.3 percent of all 11,944 abstracts or 1.0 percent of the 4,014 abstracts expressing an opinion … had been found to endorse the quantitative hypothesis" that most warming since 1950 is anthropogenic.

Many of the authors of abstracts included in the Cook et al. analysis have since come forward to dispute how their positions were categorized. 

A thorough review of the topic of consensus on Anthropogenic Climate Change (ACC) can be found at The Heartland Institute.  A discussion of the social psychology of the consensus is at José Duarte's blog.  In his June 3, 2014 post entitled "Ignore climate consensus studies based on random people rating journal article abstracts" he says:

Ignore them completely – that's your safest bet right now. Most of these studies use political activists as the raters, activists who desired a specific outcome for the studies (to report the highest consensus figure possible), and who sometimes collaborated with each other in their rating decisions. All of this makes these studies completely invalid and untrustworthy (and by customary scientific standards, completely unpublishable.) I had no idea this was happening. This is a scam and a crisis. It needs to stop, and those papers need to be retracted immediately, especially Cook, et al (2013), given that we now have evidence of explicit bias and corruption on the part of the raters. (It's crazy that people think the consensus needs to be artificially inflated to absurd heights – do they think 84% or 90% isn't good enough?)

Most of the engineers participating in the various threads on www.eng-tips.com find the very concept that a scientific consensus could constrain alternative research objectionable.  At one time the "scientific consensus" was that the sun revolved around the earth, and Galileo Galilei suffered mightily at the hands of the Roman Inquisition for putting forth an alternative hypothesis.  A "scientific consensus" is a powerful way to entrench a particular concept and stifle contrary opinions.

 

The other side of the climate change discussion
People who do not accept this science as "settled" are frequently called "deniers" and "skeptics."  Skeptics call the people who consider the science settled "warmists" and claim that the warmist position is much closer to a religion (i.e., you must take certain things on faith, and you must "believe") than to free scientific inquiry.  The discussion is very polarized. 

Everyone who is skeptical about ACC has their own reasons for this skepticism, but mostly the basis fits into one or more of the following categories:

The climate has always changed; the climate will always change; live with it.  Since mankind began walking the Earth we've had ice ages, droughts that extended over decades, brief periods of clement weather, and everything in between. 

It has been warmer than today by a considerable margin.  It has been colder than today by an equally large margin.  Life has adapted.  Regardless of the cause, magnitude, or direction of the next set of changes, we will adapt if allowed to.  No action by the governments of the world will prevent changes in climate.  Even if the current trend is actually one of increasing temperatures, and even if that trend is due to human activity, successfully changing that human activity will only remove a single factor in an impossibly complex group of factors and some other factor will cause warming or cooling that we will have to deal with. 

Mankind has survived five "ice ages" and the "global warming" that followed each; there is a good chance that, if the politicians don't muck it up, we'll survive the next one too.

In engineering activities, the fact of a changing climate would be treated as an "environmental variable" that can be measured, assessed, and factored into activity, but that cannot be successfully modified.  In other words, "the ant really should not try to move the rubber tree plant," even with "high hopes."

 

Data
The more information that is released about historical climate data, the less valid it seems.

    • Heat island effects.  It seems to make sense to most people that, for a given solar flux, cloud cover, and wind condition, an urban location will be warmer than a rural one.  Over time, cities have encroached on monitoring sites that had been rural.  The warmists claim that the data can be mathematically adjusted to account for this, allowing a station to show a consistent set of conditions over time.  The magnitude and basis of the adjustment for a given station differ between data sets.

      In engineering activities this kind of systemic modification of data would be done based on explicitly divulged algorithms and would be reversible.

 

    • Station location.  Souleymane Fall et al. published a peer-reviewed study in the Journal of Geophysical Research (Vol. 116, D14120, 2011) which found that only 7.9 percent of U.S. climate monitoring stations provided data accurate to within ±1°C.  They also found that 70.6 percent of the stations were worse than ±2°C.  When you realize that the worst-case ACC projections were on the order of a 0.5°C/decade temperature increase, it is hard to have much faith in data incapable of resolving that number.  Results this poor from the richest country on earth do not bode well for the overall integrity of the global data set.

 

    • Original data.  The climate dataset is very large.  Many stations' data is appropriately edited (e.g., a site with a temperature instrument stuck at 999°C for several months needs to be edited); other stations have edits that are more subtle (e.g., edits for the heat island effect mentioned above).  Regardless of whether the edits are done to correct errors or to adjust reality, the original data is not retained.  There is no way for future researchers to evaluate different heat-island adjustments, for example, because the owners of the data perform destructive edits, claiming that the datasets are simply too big to allow non-destructive editing.

      In engineering activities, destroying part of a data set or replacing measured data with "judgmental data" is done all the time, but always with the ability to roll the changes back in order to demonstrate the magnitude, reason, and technique behind the opinion that you have a "better number."  Without the ability to reassess the raw data set, there is no way to prove that the edits were unbiased toward any specific conclusion.
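A sketch of the non-destructive editing described above (an illustrative design, not any agency's actual system): raw readings stay immutable, and each adjustment is a named entry in a log that can be replayed, swapped, or removed:

```python
# Illustrative sketch of reversible data editing: raw readings are never
# modified; adjustments live in a separate, auditable log.

raw = {"2001-06": 21.4, "2001-07": 23.1, "2001-08": 22.8}  # deg C, as recorded

adjustments = [
    # (month, delta, reason): each edit is explicit and removable
    ("2001-07", -0.6, "urban heat island correction, hypothetical method A"),
]

def adjusted_view(raw_data, log):
    """Apply the adjustment log to a copy; the raw data is untouched."""
    view = dict(raw_data)
    for key, delta, _reason in log:
        view[key] += delta
    return view

view = adjusted_view(raw, adjustments)
assert raw["2001-07"] == 23.1               # the original is still recoverable
assert abs(view["2001-07"] - 22.5) < 1e-9   # the adjusted value used for analysis
```

Because edits are a log rather than overwrites, a later researcher can swap in a different heat-island correction and re-run the analysis, which is exactly what destructive edits prevent.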

 

    • Pre-industrial data.  The 20th Century data before the 1990s was all taken from analog instruments that rarely had graduations finer than 5°C, so the person making the record had to interpolate between marks that were physically very close together.  Even less direct is the tree-ring, sea-floor, and ice-core data used for pre-20th Century estimates.  Tree rings are thicker when the tree sees adequate moisture and considerable sunshine; they are thinner when either moisture or sunshine is lacking.  Scientists can make some reasonable guesses about temperature from an analysis of tree rings.

      In engineering activities, it is important to honor the uncertainty of the data.  If an instrument provides data with an uncertainty of ±2.5°C, then it is irresponsible to report a calculation done with that data to a precision finer than that uncertainty.  The data from before the 20th Century has a temporal granularity of seasons, years, decades, and even centuries.  Ice core data does not contain a direct read of temperature; it allows the creation of a temperature proxy from isotopes of hydrogen and oxygen.  A computer model is used to estimate whether the isotope mix came from the Pacific, Atlantic, or Indian Ocean, and the model then uses the relative abundance of those isotopes to estimate the temperature required to evaporate that much water.  Many papers have been written about this.  An article in Astrobiology Magazine in 2012 said:

"We ran an oxygen isotope-enabled atmosphere model, so we could simulate what these ice cores are actually recording, and it can match the actual oxygen isotopes in the ice core even though the temperature doesn't cool as much," Carlson says. "That, to us, means the source of precipitation has changed in Greenland across the last deglaciation. And therefore that the strict interpretation of this iconic record as purely temperature of snowfall above this ice sheet is wrong."

      The divergence problem has brought any use of tree-ring data into question.  Furthermore, a computer model is used to convert the limited data available from a tree ring into a temperature, a step that some claim is fraught with potential for bias.

      Even with all of this temporal and magnitude uncertainty, the data from these proxies is regularly plotted on a -1 to +1°C scale, with conclusions drawn in the ±0.1°C range.  In engineering this is referred to as "making stuff up."
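A minimal sketch of the significant-digits discipline described above (the numbers are illustrative): report a value no finer than the first significant digit of its uncertainty:

```python
# Illustrative sketch: round a reported value to the first significant
# digit of its measurement uncertainty, rather than to machine precision.
import math

def report(value, uncertainty):
    """Round `value` to the decimal place set by `uncertainty`."""
    digits = -int(math.floor(math.log10(uncertainty)))
    return round(value, digits)

# A mean computed to machine precision from +/-2.5 deg C readings...
print(report(14.23716, 2.5))   # -> 14.0: the trailing digits were noise
```

This simple rule is what makes ±0.1°C conclusions from multi-degree proxy uncertainty look like overreach to the engineers in the thread; the counter-argument, that averaging many independent readings shrinks random error, depends on the errors actually being independent and unbiased.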

 

    • "Granularity."  There are parts of the world where monitoring stations are within a few miles of each other.  Other parts of the world might have one station every few hundred miles.  Some stations have been off line for years while wars were waged around them—in some cases the last data point recorded is simply reported forward, in other cases the date data is honored (i.e., data from 21 Nov 1999, is copied to that date in 2000, 2001, 2002, etc.), and in other cases the date data is honored but "adjusted" for global warming.

 

"Climategate"
In November 2009, the email accounts and work files of a number of highly regarded climate scientists were posted on the Internet (some say by hackers, some say by a leaker).  Extracts from the 3,000+ documents were widely published with the intention of portraying the field of ACC research as rife with fabrication, cronyism, selectively excluded data, and data modified to fit a narrative.

A number of investigations of the leaked documents all found that the fraud alleged by the skeptics was not in evidence.  Yet specific documents still available on the Internet (mostly without context) seem to indicate that there actually was a conspiracy, in spite of assurances by the Union of Concerned Scientists, several universities, and several governments that it was all a hoax and/or that, in context, the documents are justifiable.

Skeptics claim that all of the assessments were done by organizations with a strong vested interest in there not being a problem.

 

Computer models
Computer modeling is a cornerstone of modern engineering, so many individuals with considerable expertise in computer modeling have participated in this discussion on www.eng-tips.com.  This topic is one of the very few where everyone with real expertise in modeling agrees: computer models cannot prove anything.  Ever. 

Computer models are outstanding at pointing out areas that warrant further analysis or that have weaknesses.  At best they represent the biases of the author.  At worst they can easily be manipulated to tell any story the author wants to tell.  It is nearly impossible for an outsider to conduct a competent audit of someone else's model.  If there is intentional bias or even fraud in a model it is highly unlikely that it will ever be discovered.  Every single assertion of the community supporting ACC is predicated on the output of a computer model.

  • Grid size.  The surface area of the earth is 196.9 million square miles [510.1 million km2].  The "atmosphere" is generally considered to end at 62 miles [100 km] above sea level, so the approximate volume of the atmosphere is 1.2E10 mi3 [5.1E19 m3]. 

    The current generation of computer models divides the earth into 2-20 vertical layers, surface blocks ranging from 120-600 miles [200-1,000 km] on a side, and time scales ranging from hours to 10,000 years.  This leads to trillions of grid-cell computations for a single model run.  The IPCC First Assessment Report (1AR) says it well:

3.7 Summary
Many aspects of the global climate system can now be simulated by numerical models.  The feedback processes associated with these aspects are usually well represented, but there appear to be considerable differences in the strength of the interaction of these processes in simulations using different models. 

Unfortunately, even though this is crucial for climate change prediction, only a few models linking all the main components of the climate system in a comprehensive way have been developed.  This is mainly due to a lack of computer resources, since a coupled system has to take the different timescales of the sub-systems into account, but also the task requires interdisciplinary cooperation.

    In other words, even the IPCC lacks much faith in the models' ability to faithfully represent reality.  5AR tries to spin these deficiencies with statements about how far the "science" of computer modeling has come since 1AR, but the essence of the above quote still applies 24 years later.
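As a back-of-envelope check on the grid sizes quoted above (assuming square surface blocks, which real latitude-longitude meshes are not):

```python
# Back-of-envelope check on the grid counts implied by the resolutions
# quoted above. Square surface blocks are an assumption for simplicity.

EARTH_AREA_KM2 = 510.1e6

def cell_count(block_km, layers):
    """Approximate number of 3-D grid cells at a given resolution."""
    return int(EARTH_AREA_KM2 / block_km**2) * layers

coarse = cell_count(1000, 2)    # coarsest resolution quoted in the article
fine = cell_count(200, 20)      # finest resolution quoted in the article

steps_per_century = 100 * 365 * 24   # one-hour time steps over 100 years
print(coarse, fine, fine * steps_per_century)
```

Even at the finer resolution, each surface cell is still 200 km on a side, which is the resolution complaint raised in the forum post quoted earlier; and the cell-step count multiplies by every variable and sub-process computed per cell.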

 

  • Cell to cell math.  The climate is strongly influenced by the movement, accumulation, and storage of fluids.  This fluid activity is described by the engineering field called "fluid mechanics."  Fluid-mechanics relationships are so complex that the only way to solve problems is to adopt a long list of simplifying assumptions (e.g., to develop the well-known Bernoulli equation, Daniel Bernoulli had to assume that there is no fluid friction, that the fluid is incompressible, that there is no heat transfer into or out of the system, that there is no rotation, and that the fluid does no work). 

    None of the standard simplifying assumptions applies to the atmosphere as a whole.  Not a single one.  This results in the models being forced to rely on empirical equations that try to give "good enough" answers in limited cases. 

    The physics of the energy transfer in the atmosphere is even more complex than the fluid mechanics.  The arithmetic underlying these models is barely competent to describe the fluid reactions related to dropping a rock into a still pond, yet it is being used to drive whole economies.
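As a reminder of just how restrictive those assumptions are, here is the Bernoulli relation applied in the one regime where it is valid: steady, frictionless, incompressible flow along a streamline with no heat transfer and no shaft work.  The pipe pressures and velocity are made-up illustration values; nothing like this simplicity is available to an atmospheric model:

```python
# Bernoulli's equation under its textbook assumptions:
#   p + 0.5*rho*v**2 + rho*g*h = constant along a streamline
# Illustrative only: find the downstream velocity in a pipe given
# upstream conditions (made-up numbers, water at standard density).
RHO = 1000.0   # density of water, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def downstream_velocity(p1, v1, h1, p2, h2):
    """Velocity at point 2 from Bernoulli, given the state at point 1."""
    total_1 = p1 + 0.5 * RHO * v1**2 + RHO * G * h1
    v2_squared = 2.0 * (total_1 - p2 - RHO * G * h2) / RHO
    return v2_squared ** 0.5

# 200 kPa and 2 m/s upstream, 150 kPa downstream, level pipe.
print(downstream_velocity(200_000.0, 2.0, 0.0, 150_000.0, 0.0))
```

Even this toy problem becomes unsolvable in closed form the moment friction, compressibility, or heat transfer is allowed back in, which is the atmosphere's permanent condition.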

 

  • Number of iterations.  It is not unusual for a computer model to iterate upwards of a million times per time step.  Performing a marginal operation on questionable data through a million iterations can produce essentially random numbers.  You can demonstrate this by entering 1.000000000001 in one cell of an Excel spreadsheet and 0.999999999999 in the next cell.  Then in the next row square the two numbers, and duplicate that operation down the sheet.  Within about 50 repetitions the first column overflows toward infinity while the second collapses to zero.  That is a long way from a million steps, and the confidence in those starting values far exceeds anything you can get from a weather station. 

    This is an example of an unconstrained experiment.  All of the models have very strong artificial constraints to rein in out-of-control data, forcing a "wrong" answer back into the realm of "right" with no external indication that this has happened.  The concepts of "right" and "wrong" can only exist within human biases.
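The spreadsheet demonstration can be reproduced in a few lines of Python; the divergence is a property of IEEE-754 double-precision arithmetic, not of Excel, and it in fact takes only about 50 squarings:

```python
# Repeatedly square values a hair above and below 1.0, as in the
# spreadsheet demonstration described above.  The tiny starting
# difference is amplified by every squaring until one value
# overflows to infinity and the other underflows to zero.
x_hi = 1.000000000001
x_lo = 0.999999999999
steps = 0
while x_lo > 0.0 and x_hi != float("inf"):
    x_hi, x_lo = x_hi * x_hi, x_lo * x_lo
    steps += 1

print(f"diverged after {steps} squarings: {x_hi}, {x_lo}")
```

Each squaring doubles the exponent of the deviation from 1.0, so a one-part-in-a-trillion disagreement crosses the entire representable range of a double in a few dozen operations.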

 

  • Is warming bad?  As we come out of the Little Ice Age and move toward temperatures consistent with the Renaissance (the first time in man's history that the general population had enough wealth to support the arts and sciences), you have to wonder what is bad about "warmer"?  The counter-argument that warming will melt the ice in Antarctica and Greenland and flood low-lying regions doesn't carry much weight with skeptics, since both Amsterdam and Venice thrived during the last warm period.

 

  • Is atmospheric CO2 bad?  The current CO2 concentration at Mauna Loa in Hawaii is around 400 ppm.  Ice core data indicate that this level has been reached and passed before.  Proxy reconstructions from earlier geologic eras suggest that concentrations were much higher during the time of the dinosaurs.  CO2 is a fundamental building block of all life on earth; if plants don't have it, everything dies.  Many commercial greenhouse growers enrich the atmosphere to 1,500 ppm CO2 to accelerate plant growth.  Current levels do not seem to be the pending catastrophe that we've been led to believe.

 

  • Leading or lagging?  Several times in the ice core data, increases in CO2 can be correlated to increases in temperature.  The problem is that the temporal granularity of the data can be as much as ±100 years (it is never better than seasonal), meaning that all of the information gleaned from a data point was laid down somewhere within a two-century window.  So in one scenario, temperatures rose, some of the permafrost in Siberia, Alaska, and Canada melted, millions of tons of biological material that had been frozen for centuries began to decay, and atmospheric CO2 increased. 

    The data support this "lagging" theory exactly as well as they support a "leading" theory in which CO2 is a cause of warming rather than an effect of it.  The inherent uncertainty of the timeline precludes neither scenario, and a lagging CO2 level does not require a positive feedback mechanism.
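A toy calculation shows why the dating uncertainty matters.  Suppose CO2 genuinely lags temperature by 80 years, but the core chronology can be mis-dated by up to ±100 years (both numbers chosen purely for illustration).  The apparent lag in the aligned records is the true lag minus the dating offset, so a genuinely lagging signal can present as a leading one:

```python
# If CO2 truly lags temperature by TRUE_LAG years but the core's
# chronology is offset by dating_offset years, the apparent lag in
# the aligned records is (TRUE_LAG - dating_offset).  A +100-year
# dating error makes a genuinely lagging record appear to lead.
TRUE_LAG = 80          # illustrative true lag, years

for dating_offset in (-100, -50, 0, 50, 100):
    apparent = TRUE_LAG - dating_offset
    relation = "lags" if apparent > 0 else "leads"
    print(f"dating offset {dating_offset:+4d} yr -> CO2 {relation} "
          f"temperature by {abs(apparent)} yr")
```

With an 80-year true lag and a +100-year dating error, CO2 appears to lead temperature by 20 years, which is exactly the ambiguity described above.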

 

  • The earth hasn't warmed since the 20th century.  Much has been made of the fact that all of the models from the last century predicted temperatures by 2014 that were markedly warmer than what has been observed.  They predicted increased severe weather events when in fact we've seen decreased severe weather (2013 had the lowest number of deaths from hurricanes, typhoons, and tornados that has been recorded since the mid-20th century). 

    Warmists claim that this is perfectly well explained by the deep oceans warming, even though historically we have had reliable ocean temperature data only down to about 160 ft [50 m] and almost none from below 2,300 ft [700 m].  The only data set that begins to explore this theory comes from the Argo float program, which has only been fully deployed since around 2007. 

 

Science vs. politics
If this were a pure scientific debate, then every engineer "denier" I've ever talked to would be cheering for the scientists to nail it down.  We'd be helping.  The problem is that climate change has become a political debate in the guise of science.  A climate scientist who doesn't support the idea of ACC bringing global catastrophe will have a hard time getting published, getting tenure, or even getting a job.  Few learned papers suggesting that ACC is neither real nor a pending catastrophe are written, and fewer still survive peer review. 

The politics are particularly insidious.  Governments are doing real harm to their economies by mandating that "40 percent of the national power supply will come from renewable sources," or "CO2 emissions from power plants must be reduced by 30 percent" or "Cap and Trade" or "Carbon Taxes."  The tone of the majority of engineers in the www.eng-tips.com discussions has been "Show me how raising my taxes, utility costs, and fuel costs will impact the climate that my grandchildren will live in."  The only response is to trot out yet another computer model running on adulterated data with a potentially biased calibration.

The politicians and press may have convinced some portion of the general public that this proposition is supported by the science, but they are quite a ways from convincing the preponderance of the engineering community.  While I can't find any "skeptics" who have become "warmists" or "warmists" who have become "skeptics," a large number have gone from "it's not my field, and I don't have time to think about it" to very skeptical.  Far fewer of the previously indifferent have moved into the warmist camp.

 

This engineer's perspective on climate change
It should be very easy for the reader to tell from this document that I have a very strong bias against the actions governments are taking in response to the ACC Hypothesis.  I'm frequently asked, "But what if you are wrong; isn't doing something better than doing nothing?"  This is a fair question.  I've been wrong before.  I'll be wrong again. 

According to the Union of Concerned Scientists, I am wrong this time; it is already too late to correct the damage done, we have passed the tipping point, and the climate is falling out of any possibility of control toward catastrophe.  As of September 2013, this group claims we are experiencing: (1) accelerating sea level rise and increased coastal flooding; (2) longer and more damaging wildfire seasons; (3) more frequent and intense heat waves; (4) costly and growing health impacts; (5) an increase in extreme weather events; (6) heavier precipitation and flooding; (7) more severe droughts; (8) growing risks to our electricity supply; (9) changing seasons (spring arrives earlier, fall arrives later); (10) melting ice; (11) disruptions to food supplies; (12) destruction of coral reefs; and (13) plant and animal range shifts.  Some of these claims are verifiable; others are not.

 

Sea Level
There is an edited data set (as best I could find, no raw data sets are available) at The Permanent Service for Mean Sea Level that has the data points.  Extracting, reformatting, and plotting that data shows that from 1807 to 1860 the sea level dropped fairly rapidly (about 150 mm over 53 years).  It then rose from 1860 until it returned to the 1807 level by 1919 (150 mm over 59 years), and it continued at about the same trend to the end of the data in 2010 (170 mm over 91 years). 

A change in sea level of 6.7 inches over 91 years seems like a rate of change we can adapt to.  I was unable to find any actual data after 2010 (but plenty of model output posing as data).  The United Nations Development Programme (UNDP) predicted in 2007 that a 3°C temperature increase would make 330 million people homeless due to sea level rise.  It is now 7 years later, and while there have been reports of the "first climate change refugee" looking for a new home every year since 2007, they haven't left their houses yet.
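Converting the trend segments into annual rates makes the comparison easier; the durations below are computed from the segment endpoints quoted above:

```python
# Annual rates implied by the PSMSL trend segments quoted above:
# (total change in mm, duration in years from the stated endpoints).
segments = {
    "1807-1860 fall": (150, 1860 - 1807),
    "1860-1919 rise": (150, 1919 - 1860),
    "1919-2010 rise": (170, 2010 - 1919),
}
for name, (mm, years) in segments.items():
    print(f"{name}: {mm / years:.2f} mm/yr")

# 170 mm is the total rise from 1919 to 2010, in inches.
print(f"total since 1919: {170 / 25.4:.1f} inches")
```

The modern segment works out to just under 2 mm per year, which is the "adaptable" rate discussed above.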

 

Longer and more damaging fire seasons
Fire statistics are really tough to parse.  In 1994, there was a movement in the Northern Hemisphere to stop counting every ignition event as a separate fire and instead to count fires after the fact based on contiguous burned acreage.  Some jurisdictions and organizations followed this; others did not.  Sifting real counts out of the noise seems to be beyond most researchers; it was certainly beyond me.  Most of the peer-reviewed papers I looked at fell back on model predictions after 1994.  A paper by Marlon et al. that examined historical charcoal records found the biomass burned in wildfires in the U.S. to be largely unchanged over the last 3,000 years (and slightly down over the last 200 years). 

Wildfire is a very interesting discussion.  Mankind's ham-handed attempts to suppress and manage wildfires have led to inventories of unburned fuel in the forests so great that when fires start, they burn hot enough to sterilize the forest floor (increasing the time required to replenish vegetation) and become much more difficult to extinguish.  So one might say that man's activities have worsened the risk of wildfire, but it was absolutely through "doing something" instead of "doing nothing".

The next few consequences described by the Union of Concerned Scientists are all pretty much manifestations of the same thing; it is hard to distinguish among heat waves, severe weather, flooding, and droughts.  "Heat wave" doesn't have a generally agreed-upon definition, so statistics on heat waves are difficult to acquire, and claims that they are increasing are based on something other than data.  If we use weather-related deaths as a surrogate, the CDC Mortality Database can be queried to show that in the period 1997-2002, 16,313 people in the U.S. died from extreme cold, 8,589 died from extreme heat, 2,395 died from floods, 1,512 died from lightning, 1,321 died from tornados, and 460 died from hurricanes.  Weather-related deaths in that period averaged about 5,100 per year, roughly 0.24 percent of the 2.1 million people who die in the U.S. each year.  As a percent of population this relationship is consistent with the pre-1950 data.
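A quick check of the quoted CDC figures:

```python
# Summing the CDC mortality figures quoted above for 1997-2002
# (a 6-year period) and comparing to ~2.1 million U.S. deaths/year.
deaths = {
    "extreme cold": 16_313,
    "extreme heat": 8_589,
    "flood": 2_395,
    "lightning": 1_512,
    "tornado": 1_321,
    "hurricane": 460,
}
total = sum(deaths.values())
per_year = total / 6
share = per_year / 2_100_000

print(f"{total} weather deaths over 6 years, {per_year:.0f} per year, "
      f"{share:.2%} of annual U.S. deaths")
```

Note also that cold killed nearly twice as many people as all the other weather categories combined, which sits awkwardly with warming as the dominant mortality risk.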

If we look at tornados as a surrogate for extreme weather, we are limited to data starting around 1954 (tornados leave no proxy in tree rings or ice cores).  Looking at that data from NOAA, 1972 and 2011 had the highest counts of F1-or-larger tornados, but 2012 had the third-lowest count ever recorded.  The large storms (F3 and above) are a bit different: four years since 1954 have had more F3+ storms than 2011 (and 2012 was in the bottom 10).

Hurricanes and typhoons are another useful surrogate for extreme weather.  There is data that seems to indicate increased hurricane activity since 1990, but research has suggested that the apparent increase was due to an improved ability to see and count storms that formed in the deep ocean and never threatened landfall.  According to the World Climate Report in 2008, the frequency of hurricanes making landfall has been on a downward trend that started in 1920.  More recently, 2013 had the lowest number of landfalling hurricanes ever recorded (14 named storms in the Atlantic, 2 hurricanes that made landfall), and 2014 is off to a slow start with one named storm so far.

The rest of the list is far too amorphous and subjective to refute.  In short, the risk of "doing nothing" seems to be far less damaging to the world than the proposed actions.  Reducing U.S. coal use by 30 percent will require significant capital investment; those costs will be passed on as rate increases, which will cascade into job losses.  Much of Europe is "doing something" about global warming, and the price of electricity is so high in some countries that people must choose between buying food and using the power to cook it.  Forests are being depleted at alarming rates to provide heating fuel, and economies are being damaged.

Man's track record at managing nature has been horrible.  We want to stabilize a river bank, so we bring in a foreign species of plant; the new species pushes out the native species and becomes invasive.  We bring in a beetle to attack the invasive plant, and it spreads out of control to the native plants.  We eradicate the large predators from Yellowstone, and life becomes so easy for the deer, elk, and moose that they congregate near the rivers, destroying the vegetation that stabilizes the river banks and clogging the pristine mountain streams with mud.  We fight fires and create an abundance of fuel that turns "just a fire" into a "fire catastrophe".

If we fail to respond to ACC, and ACC is a real threat, then the result will be environmental change, and engineers will be at the forefront of the effort to adapt to it.  If ACC is not actually changing the climate, then sunspots, the Yellowstone Caldera, falling stones from space, or space aliens will create change that engineers will have to rally to combat.  ACC is simply not the place to get proactive.

 

About the Author

David Simpson, P.E. is the owner and principal engineer at Muleshoe Engineering.  David is an MVP in the professional forums at www.eng-tips.com and a member of the Engineering Writers Guild.

Follow David (zdas04) at http://eng-tips.com/userinfo.cfm?member=zdas04
