Channel: Comments for Climate Etc.

Comment on Assessing climate model software quality by Latimer Alder


‘modellers have ‘learned to live with a lower standard’ of code and development processes simply because they are good enough to produce legitimate scientific results’

Would I be even more than normally cynical if I surmised that ‘legitimate scientific results’ in this context should be interpreted as:

‘good enough to get me a paper pal-reviewed and published and so secure my next grant for another few years’

rather than anything to do with accurate predictions of the climate?

Because once the intimate connection between prediction and observation is broken (and in climate modelling this was severed a very long time ago), then there is nothing at all to anchor the once proud and trusty ship ‘Legitimate Result’ as it floats away across the Sea of Fantasy to the Land of Wishful Thinking.

sorry – I am a


Comment on Assessing climate model software quality by Latimer Alder


:-)

And in the UK we have only one ‘Monopolies Commission’.

Comment on Assessing climate model software quality by Jeff Corwith


I’ll toss out a couple of semi-random observations about modeling. These arise from my perspective as one who builds and uses models of a different sort.

-(Probably overly obvious, but nevertheless): it’s important that the model honors the basic physics of the system – ‘material balance’ and the laws of thermodynamics come to mind first. The model developers must test against known problems to ensure that the delivered code does this.

-There needs to be a distinction between the basic physics and the assumed and/or empirical relationships/behaviors in the model. Ideally the latter won’t be hard-coded into the model, so that alternatives (however unlikely) can be tested – at the least so the modeler can understand how much those assumptions drive the end results. Those using the model(s) need to understand the assumptions behind their model and its limitations.

-Model runs with alternative assumptions and parameters (uncertainties) are useful so that the modeler can develop an understanding of how much they drive the model results. Some uncertainties will be shown to have relatively minor impacts; those that produce the largest changes in outcomes are the ones on which future studies should focus. (A toy sketch of these first three points follows this list.)

-Finally – my own soapbox – even though the models may in all likelihood be far from good at prediction, that doesn’t mean they aren’t useful. Just developing an understanding of the drivers of prediction performance (while fully understanding the context of the model itself, and being able to isolate physical processes from model and coding artifacts) will go a long way toward leveraging future research efforts.
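A toy sketch of the first three points above, assuming nothing about any real climate model – a two-box ‘physical core’ with a conservation test, a swappable empirical closure, and a one-at-a-time parameter sweep. Every name and number below is illustrative:

```python
# Illustrative only: a toy energy-balance "model" standing in for real code.
import numpy as np
from typing import Callable

# 1. The physical core must honor conservation, and a test should prove it.
def exchange(boxes, dt, k=0.3):
    """Closed two-box system: material moves between boxes, total invariant."""
    flux = k * (boxes[0] - boxes[1]) * dt
    return np.array([boxes[0] - flux, boxes[1] + flux])

def test_material_balance():
    boxes = np.array([1.0e6, 2.5e5])
    total0 = boxes.sum()
    for _ in range(100_000):
        boxes = exchange(boxes, dt=0.01)
    assert abs(boxes.sum() - total0) / total0 < 1e-9  # round-off only

# 2. Keep empirical closures swappable instead of hard-coding one of them.
def linear_feedback(t):         # one assumed closure (W/m^2 per K)
    return 1.8 * t

def saturating_feedback(t):     # an alternative, run against the same core
    return 2.5 * t / (1.0 + 0.1 * abs(t))

def integrate(forcing: float, feedback: Callable[[float], float],
              heat_capacity: float = 8.0, years: int = 200) -> float:
    """Physical core: dT/dt = (forcing - feedback(T)) / C, stepped yearly."""
    t = 0.0
    for _ in range(years):
        t += (forcing - feedback(t)) / heat_capacity
    return t

# 3. One-at-a-time sweep: which uncertain inputs actually drive the answer?
def sweep():
    base = dict(forcing=3.7, heat_capacity=8.0)
    ranges = dict(forcing=(3.0, 4.4), heat_capacity=(4.0, 16.0))
    for name, (lo, hi) in ranges.items():
        outs = []
        for val in (lo, hi):
            params = dict(base, **{name: val})
            outs.append(integrate(params["forcing"], linear_feedback,
                                  heat_capacity=params["heat_capacity"]))
        print(f"{name}: spread {abs(outs[1] - outs[0]):.2f} K over its range")

test_material_balance()
for fb in (linear_feedback, saturating_feedback):
    print(fb.__name__, f"{integrate(3.7, fb):.2f} K")
sweep()
```

In this toy, the forcing range moves the long-run answer by nearly 0.8 K while the heat capacity barely moves it at all after 200 years – exactly the kind of ranking of uncertainties the sweep is meant to reveal.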

Given the political ‘climate’ around this sort of modeling, I do not envy these modelers in the least. There are going to be a lot of folks (from all sides of the issue) lying in wait to play “gotcha” with selected portions of their results.

Best of luck!

Comment on Week in review 4/13/12 by climatereason


Ah! Mann of La Manncha

tonyb

Comment on Assessing climate model software quality by Girma


The models must represent the observed data accurately.

The current models don’t. They have no cyclic component, no turning points, and no points of inflection. Until they do, they will remain incorrect.
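What such a claim implies in practice is something like the following minimal sketch: a least-squares fit of a linear trend plus one multidecadal cycle. The series here is synthetic – a stand-in, not a real temperature record – and whether such a decomposition is physically meaningful is exactly what other commenters dispute:

```python
# Sketch: least-squares fit of trend + one cycle to an anomaly series.
# The data are synthetic; substitute a real record to run this for real.
import numpy as np
from scipy.optimize import curve_fit

years = np.arange(1880, 2012)
t = years - 1880.0
rng = np.random.default_rng(0)
anoms = (0.006 * t
         + 0.1 * np.sin(2 * np.pi * t / 60.0)
         + rng.normal(0.0, 0.1, t.size))        # stand-in "observations"

def trend_plus_cycle(t, a, b, amp, period, phase):
    return a + b * t + amp * np.sin(2 * np.pi * t / period + phase)

p0 = [0.0, 0.005, 0.1, 60.0, 0.0]               # rough initial guesses
popt, _ = curve_fit(trend_plus_cycle, t, anoms, p0=p0)
print(f"fitted trend {popt[1]:.4f} K/yr, cycle period {popt[3]:.0f} yr")
```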

Comment on Assessing climate model software quality by Latimer Alder


‘the models may in all likelihood be far from good at prediction, that doesn’t mean they aren’t useful. Just developing an understanding of the drivers of prediction performance (while fully understanding the context of the model itself, and being able to isolate physical processes from model and coding artifacts) will go a long way toward leveraging future research efforts.’

Maybe so.

But they shouldn’t be claimed as predictive tools unless they have demonstrated some ability in that area. And yet climatologists and AGW advocates are far too ready to seize upon model outputs and use them as dire warnings of catastrophes to come. It’s just a reworking of the old 1970s idea: ‘it must be true, it comes from a computer’.

And after 30 years of throwing good money after bad to develop climate models that clearly can’t do any useful prediction now, and have no hope of ever doing so, I detect a groundswell of opinion that there is little point in spending much more effort in this area. Time to go and do something more useful instead.

Comment on Assessing climate model software quality by Bart R


Mr. Orssengo

You do realize that comparing like-to-like would mean you’d be obliged to fit the climate model projections with exactly the same broad (+/- 0.2C) variance bands you allow for your own, yes?

And that, on that basis, so far, the actual raw (not smoothed) data fit every model projection (not just the lower ones) at least as well as they fit your own empirical formula?

For all the flat, absolute statements you make about your graphs, you clearly do not understand how one interprets graphical data.

Go ahead. Draw the variance bands on the 0.2C/decade-from-1990 line on your own graph. Stop comparing your graph to the smoothed data. Compute the r^2 value from 1990 onward for each function against the raw data.

And set some valid test of whether your proposed function or the (rather silly) 0.2C/decade function is the better fit, to establish under what conditions either hypothesis can be rejected, and at what confidence.

You do know how to do those things, yes?
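For what it’s worth, a minimal sketch of that comparison – with a synthetic series standing in for the raw annual anomalies, so every number below is illustrative:

```python
# Sketch: fit two candidate trend lines to raw (unsmoothed) annual
# anomalies from 1990 on, compare r^2, and count how much of the raw
# data falls inside +/-0.2 C variance bands around each. The series is
# synthetic; swap in an actual record to run the real test.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(1990, 2012) - 1990
raw = 0.3 + 0.018 * t + rng.normal(0.0, 0.12, t.size)  # stand-in anomalies

def r_squared(y, yhat):
    ss_res = np.sum((y - yhat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

proj = raw[0] + 0.02 * t           # the 0.2 C/decade line, anchored at 1990
slope, intercept = np.polyfit(t, raw, 1)
ols = intercept + slope * t        # ordinary least-squares fit to the raw data

for name, yhat in (("0.2C/decade", proj), ("OLS fit", ols)):
    inside = np.mean(np.abs(raw - yhat) <= 0.2)
    print(f"{name}: r^2 = {r_squared(raw, yhat):.3f}, "
          f"{100 * inside:.0f}% of raw points within +/-0.2 C")
```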


Comment on Assessing climate model software quality by David L. Hagen

Wayne2

Buggy: All models are buggy – we appear to have little idea by how much. Hence the need for independent verification and validation.

Chaotic: Weather and climate have chaotic uncertainty. Most models are run with few replications, so we don’t know how much of the difference between models and data is chaotic, how much is poor understanding of weather/climate, and how much is climatic trend – or how much of the trend is natural vs anthropogenic, despite the IPCC’s claimed >90% confidence. See, e.g., Fred Singer, NIPCC vs. IPCC: Addressing the Disparity between Climate Models and Observations: Testing the Hypothesis of Anthropogenic Global Warming (AGW):
http://www.sepp.org/science_papers/ICCC_Booklet_2011_FINAL.pdf

“(2) Climate models are known to be chaotic. None of the current models have a sufficient number of runs to overcome chaotic uncertainty and therefore cannot be validated against observations. . . . Attribution of observed warming trends to GH-gas increases is based largely on claimed agreement between observed (tropical) tropospheric trends and modeled ones [Santer et al., IJC 2008, Fig 6]. We show that the claimed consistency is spurious.”

Uncertain data: The data have major uncertainties – and there are major issues in trying to evaluate by how much. To validate models we need to compare them against climate data. See, e.g., Dr Nigel Fox, head of Earth Observation and Climate at the National Physical Laboratory:

“Nowhere are we measuring with uncertainties anywhere close to what we need to understand climate change and allow us to constrain and test the models. Our current best measurement capabilities would require >30 yrs before we have any possibility of identifying which model matches observations and is most likely to be correct in its forecast of consequential potentially devastating impacts. The uncertainties needed to reduce this are more challenging than anything else we have to deal with in any other industrial application, by close to an order of magnitude.”

Uncertain climate models impair long-term climate strategies:
http://www.eurekalert.org/pub_releases/2011-09/npl-ucm091911.php

See also Fox’s presentation, Resolving uncertainty in climate change data:
http://www.npl.co.uk/science-lectures/resolving-uncertainty-in-climate-change-data

Climate science is still wandering in a wasteland of uncertainty. We are trying to make policy decisions that would spend billions on the basis of buggy models, while the models vary widely for unknown reasons. Better to get the bugs out first, reduce the uncertainty in the data enough to test the models, understand the physics of climate (especially the clouds), and then compare – to see whether there is a serious issue, so we can evaluate the pros and cons of what to do about it.
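On the chaotic-uncertainty point, a standard toy demonstration (Lorenz-63, not a climate model) shows why a handful of runs cannot be validated against a single observed trajectory – initial conditions perturbed by one part in a million diverge completely:

```python
# Sketch: an ensemble of Lorenz-63 runs. Tiny initial perturbations grow
# until the ensemble spans the attractor, so any single run is just one
# sample of the chaotic spread. (A toy illustration only.)
import numpy as np

def lorenz_step(s, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return s + dt * np.array([sigma * (y - x),
                              x * (rho - z) - y,
                              x * y - beta * z])

rng = np.random.default_rng(7)
ensemble = [np.array([1.0, 1.0, 1.0]) + 1e-6 * rng.normal(size=3)
            for _ in range(20)]
for _ in range(5000):                   # integrate ~50 model time units
    ensemble = [lorenz_step(s) for s in ensemble]

finals = np.array(ensemble)
print(f"x ranges from {finals[:, 0].min():.1f} to {finals[:, 0].max():.1f} "
      "across an ensemble that began essentially identical")
```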

Comment on Assessing climate model software quality by bob droege


“This is all pathetically and tragically amateur.”

And tragically and pathetically not related to reality.

Comment on Assessing climate model software quality by Bart R


Girma | April 16, 2012 at 9:57 am |

One imagines that if the actual global temperature had real turning points and points of inflection – if it were actually cyclic, instead of reflecting the sum of many recurring and random influences – your statement might bear some similarity to truth.

However, as the raw data is just data, and not the product of a mysterious linear function hidden in messages sent from the universe to those cunning enough to decipher them, what you say here is simple fantasy.
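A quick sketch of the point: a series built from nothing but random increments – no cyclic mechanism anywhere – still presents plenty of ‘turning points’ to the eye:

```python
# Sketch: a random walk has no cycles by construction, yet its local
# slope changes sign constantly, producing apparent "turning points".
import numpy as np

rng = np.random.default_rng(42)
series = np.cumsum(rng.normal(0.0, 0.1, 130))   # 130 steps of pure noise

slopes = np.sign(np.diff(series))
turning_points = np.sum(slopes[1:] != slopes[:-1])
print(f"{turning_points} apparent turning points in pure noise")
```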

Comment on Assessing climate model software quality by Jim Cripwell


Norm, I understand what you are saying. I have a slightly different way of looking at the problem. Someone has to use the model to try to do something, and that someone is responsible for the whole ball of wax: model, input data, etc. I don’t differentiate. That is what I was trying to say: it is the misuse of models that is the problem. Users putting in wrong data and claiming the results are valid is just another way models are misused.

Comment on Assessing climate model software quality by Bart R


You know, nothing’s stopping individual skeptics from producing their own models to challenge the ones they don’t like; with distributed computing power over the Internet, the only real obstacle is will.

Look at Mr. Orssengo’s example. He’s spent years on his own models. Imagine what someone with even a little mathematical and programming ability could do with so much as a tenth of his stick-to-itiveness.

Comment on Assessing climate model software quality by Jim2


Roy Spencer and Willis E. have taken a stab at the “quantifying uncertainty” bit as defined in the summary. They were generally met with catcalls, heckling, and guffaws. But they have the right idea, and this paper backs up their efforts – at least to the extent that they were going down the right road.


Comment on Week in review 4/13/12 by Jim2


This may be an example of developments that pessimistic people, as some here tend to be, wouldn’t anticipate.

“Coskata, together with its strategic collaborator Total Petrochemicals, is also developing “microorganism-based technology” to produce propanol, a three-carbon building block for making propylene, from the same feedstocks the company would use to produce cellulosic ethanol. Over time, according to Coskata, it expects to eventually expand its platform to produce valuable four, five and six carbon chain chemicals, adding that it has already demonstrated at lab-scale the production of propanol, butanol, butanediol, hexanol, organic acids and certain fatty acids.

As a result of its low-cost syngas fermentation process, Coskata sees market growth for its biochemicals, stating, “We are able to immediately address the 23 billion gallon global fuel-grade ethanol market,” the company stated in its S-1 filing. “Our cellulosic ethanol will also address the 1.7 billion gallon global industrial ethanol market and can be converted into ethylene, a $140 billion market. In the future, our propanol production technology, combined with alcohol dehydration technology to produce propylene from propanol, under development by Total Petrochemicals together with IFP Energies Nouvelles & Axens, is expected to target the $100 billion global propylene market.””

http://ethanolproducer.com/articles/8433/coskata-files-registration-statement-for-proposed-ipo

Comment on Psychological(?) effects of global warming by sunshinehours1


A message for Web, the Oil Drum shill:

Every old, declining oil field can potentially be rejuvenated by fracking.

“Oil companies have breathed new life into the Cardium in recent years, using newer horizontal drilling and multi-stage hydraulic fracturing technologies to recover previously uneconomic oil in the geological structure that cuts through much of the province and has long been key to conventional oil production. Cardium oil output has totalled some 1.5 billion barrels since the 1950s, when the huge Pembina field was discovered.

ExxonMobil has a “significant acreage position” in the Cardium, according to the joint statement, which noted RN Cardium Oil’s project may promote technology development that could be applied on unconventional reservoirs in Russia.

As of the end of last year, Exxon had “encouraging early results in the Cardium,” said Exxon vice-president of investor relations David Rosenthal on a Jan. 31 conference call with analysts, according to a transcript.

Exxon had two wells on production at the end of 2011 and had planned for additional drilling in 2012, Rosenthal said, noting that of the company’s tight oil and liquids-rich gas prospects, it is particularly optimistic about the Cardium, in addition to the Bakken in North Dakota and the Woodford Ardmore in Oklahoma.

“I will say that if you look, for example, up in the Cardium, where some of the tight oil well rates that you’ve heard from others, ours are certainly doing well and at the upper end of that range,” Rosenthal said.”

http://www.calgaryherald.com/business/Update+Rosneft+gains+Alberta+Cardium+play+stake+ExxonMobil+trades+part+interest+strategic/6467144/story.html

Comment on Week in review 4/13/12 by WebHubTelescope


That is incorrect. The oceans act like a high-capacity heat sink when stimulated by a sustained thermal forcing function.

If no thermal forcing function exists, they will maintain whatever steady state they are in. Otherwise, the oceans will take up a portion of the excess heat, with extremely long response times.

I have quite an elegant way of applying the heat equation to this system.
http://theoilconundrum.blogspot.com – see the post on the missing heat.
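Not knowing the details of the linked post, here is a generic sketch of the idea – the 1-D heat equation for a diffusive ocean column under a step in surface temperature, showing the long response times. The grid, diffusivity, and forcing are illustrative, not the post’s actual parameters:

```python
# Sketch: explicit finite-difference heat equation in a 1 km ocean column.
# How long until half the surface warming signal reaches 500 m depth?
import numpy as np

nz, dz = 100, 10.0                  # a 1 km column in 10 m layers
kappa = 1e-4                        # m^2/s effective eddy diffusivity (assumed)
dt = 0.4 * dz * dz / kappa          # stable explicit step (~4.6 days)
T = np.zeros(nz)
T[0] = 1.0                          # step forcing: surface held 1 K warmer

years = 0.0
while T[nz // 2] < 0.5:             # wait for half the signal at 500 m depth
    T[1:-1] += (kappa * dt / dz ** 2) * (T[2:] - 2 * T[1:-1] + T[:-2])
    T[-1] = T[-2]                   # insulated bottom boundary
    years += dt / 3.154e7

print(f"500 m depth reaches half the surface step only after ~{years:.0f} years")
```

The sustained-forcing case behaves the same way: the mixed layer responds in years, the deep column in decades to centuries.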

Comment on The Internet: World War 3.0 (?) by Arcs_n_Sparks


This is actually why the smart grid is a bad idea. The Chinese cannot easily attack the dumb grid we have today.

Comment on The Internet: World War 3.0 (?) by Arcs_n_Sparks


This topic actually goes back to why the U.S. has a Constitution: the balance between anarchy and tyranny. If the internet geeks work it out correctly, the outcome will be the same – rule of law, limited interference from government, and individual liberty with associated responsibility, all with appropriate checks and balances.
