Monday, May 28, 2018



Spurious Correlations in Climate Science

Naive statistics underlie many causal claims in climate "science"

You know who Charles Darwin is, of course, but you may not have heard of his mad cousin Francis Galton, who did the math for Darwin’s theory of evolution. Two of the many procedures Galton came up with to help him make sense of the data are still used today and are possibly the two most widely used tools in all of statistics. They are ordinary least squares (OLS) linear regression and OLS correlation. [Soon after these amazing mathematical innovations, Galton retired from the evolution business and devoted the rest of his life to making the perfect cup of tea.]

Both of these statistics are measures of a linear relationship between two variables X and Y. The linear regression coefficient B of Y against X is a measure of how much Y changes, on average, for a unit change in X, and the linear correlation R is a measure of how close the observed changes are to that average. The regression and correlation metrics are demonstrated below with data generated by Monte Carlo simulation, used to control the degree of correlation.
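
As a minimal sketch of such a Monte Carlo experiment (not the author's original simulation; the slope B=5 and the noise levels are arbitrary assumptions chosen for illustration), the idea can be reproduced in a few lines of Python:

```python
# Minimal Monte Carlo sketch: simulate Y = 5*X + noise and recover the
# OLS slope B and correlation R. The noise level controls the degree of
# correlation; B=5 and the noise settings are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
n = 100
x = np.arange(n, dtype=float)

for noise_sd in (5.0, 50.0, 500.0):          # low noise -> high R; high noise -> low R
    y = 5.0 * x + rng.normal(0.0, noise_sd, size=n)
    b, a = np.polyfit(x, y, 1)               # OLS slope and intercept
    r = np.corrcoef(x, y)[0, 1]              # Pearson correlation
    print(f"noise_sd={noise_sd:6.1f}  B={b:6.2f}  R={r:5.2f}")
```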

In the HIGH (R=0.94) and VERY HIGH (R=0.98) correlation charts, linear regression tells us that on average a unit change in X causes Y to change by about B=5, and this assessment is very consistent. The consistency derives from the low variance of the regression coefficient implied by high correlation. The strong correlation also implies that the observed change in Y for a unit increase in X stays close to the average value of B=5 over the full span of the data and for any selected sub-span of the time series.

In the LOW (R=0.36) and MID (R=0.7) correlation charts, the regression coefficients are correspondingly less precise, varying from B=1.8 to B=7.1 for LOW-R and from B=3.5 to B=5.6 for MID-R in the five random estimates presented. The point here is that without a sufficient degree of correlation between the time series at the time scale of interest, regression coefficients can still be computed, but the computed coefficients may have no interpretation. The weak correlations in these cases also imply that the observed change in Y for a unit increase in X would differ across sub-spans of the time series. The so-called “split-half” test, which compares the first half of the time series with the second half, may be used to examine the instability of the regression coefficient imposed by low correlation, as in the sketch below.
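
A hedged sketch of that split-half check, again with made-up data (numpy assumed):

```python
# Split-half test sketch: fit OLS separately to each half of a weakly
# correlated series and compare the two slope estimates. The data are
# synthetic; the noise level is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(0)
n = 100
x = np.arange(n, dtype=float)
y = 5.0 * x + rng.normal(0.0, 400.0, size=n)   # low-correlation case

half = n // 2
b1, _ = np.polyfit(x[:half], y[:half], 1)      # slope, first half
b2, _ = np.polyfit(x[half:], y[half:], 1)      # slope, second half
b, _ = np.polyfit(x, y, 1)                     # slope, full span
print(f"full B={b:.2f}, first-half B={b1:.2f}, second-half B={b2:.2f}")
# With weak correlation the two halves can give very different slopes,
# flagging an uninterpretable full-span coefficient.
```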

Correlation is necessary but not sufficient evidence of causation. Although correlation may imply causation in controlled experiments, field data do not support that interpretation. If Y is correlated with X in field data, it may mean that X causes Y, that Y causes X, that a third variable Z causes both X and Y, or that the correlation is a fluke of the data with no causal interpretation. However, because correlation is a necessary condition for causation, the absence of correlation serves as evidence against a theory of causation.

An issue specific to the analysis of time series data is that the portion of the observed correlation that derives from shared long-term trends (and has no interpretation at the time scale of interest) must be separated from the responsiveness of Y to changes in X at the time scale of interest. If this separation is not made, the correlation used in the evaluation may be, and often is, spurious. An example of such a spurious correlation is shown in the graphic below, taken from the Tyler Vigen collection of spurious correlations.

As is evident, the spurious correlation derives from a shared trend. The fluctuations around the trend at an appropriate time scale (whether annual or decadal) are clearly not correlated. The separation of these effects may be carried out using detrended correlation analysis. Briefly, the trend component is removed from both time series and the residuals are tested for the responsiveness of Y to changes in X at the appropriate time scale. The procedure and its motivation are described quite well in Alex Tolley’s lecture, available on YouTube.
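
A minimal sketch of detrended correlation analysis, assuming numpy (the two trending series here are fabricated for illustration):

```python
# Detrended correlation sketch: remove each series' OLS trend and
# correlate the residuals. Two unrelated trending series show a high
# source correlation but a near-zero detrended correlation.
import numpy as np

def detrended_corr(a, b):
    t = np.arange(len(a), dtype=float)
    a_res = a - np.polyval(np.polyfit(t, a, 1), t)   # residuals around a's trend
    b_res = b - np.polyval(np.polyfit(t, b, 1), t)   # residuals around b's trend
    return np.corrcoef(a_res, b_res)[0, 1]

rng = np.random.default_rng(1)
n = 40
x = 2.0 * np.arange(n) + rng.normal(0.0, 5.0, size=n)   # trending series
y = 3.0 * np.arange(n) + rng.normal(0.0, 5.0, size=n)   # unrelated trending series
print("source correlation:   ", round(np.corrcoef(x, y)[0, 1], 2))   # high, spurious
print("detrended correlation:", round(detrended_corr(x, y), 2))      # near zero
```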

The motivation and procedure for detecting and removing such spurious correlations in time series data are described in a short paper available for download at this link: Spurious Correlations in Time Series Data. The abstract of this paper follows: Unrelated time series data can show spurious correlations by virtue of a shared drift in the long term trend. The spuriousness of such correlations is demonstrated with examples. The SP500 stock market index, GDP at current prices for the USA, and the number of homicides in England and Wales in the sample period 1968 to 2002 are used for this demonstration. Detrended analysis shows the expected result that at an annual time scale the GDP and SP500 series are related and that neither of these time series is related to the homicide series. Correlations between the source data and those between cumulative values show spurious correlations of the two financial time series with the homicide series.

It is for these reasons that the argument “the theory that X causes Y is supported by the data because X shows a rising trend and at the same time Y has also been going up” is specious. For the data to be declared consistent with the causation theory, it must be shown that Y is responsive to X at the appropriate time scale once the spurious effect of the shared trend is removed. Some examples from climate science are presented in the papers below, along with the URLs of their download sites.

Are fossil fuel emissions since the Industrial Revolution causing atmospheric CO2 levels to rise? Responsiveness of Atmospheric CO2 to Fossil Fuel Emissions

Can sea level rise be attenuated by reducing or eliminating fossil fuel emissions? A Test of the Anthropogenic Sea Level Rise Hypothesis

Can ocean acidification be attenuated by reducing or eliminating fossil fuel emissions? An Empirical Study of Fossil Fuel Emissions and Ocean Acidification

Is surface temperature responsive to atmospheric CO2 levels? #1 Validity and Reliability of the Charney Climate Sensitivity Function

Is surface temperature responsive to atmospheric CO2 levels? #2 Uncertainty in Empirical Climate Sensitivity Estimates 1850-2017

Is surface temperature responsive to atmospheric CO2 levels? #3 The Charney Sensitivity of Homicides to Atmospheric CO2: A Parody

A further caution needed in regression and correlation analysis of time series data arises when the source data are preprocessed prior to analysis. In most cases, the effective sample size of the preprocessed data is less than that of the source data because preprocessing uses data values more than once. For example, taking moving averages involves multiplicity in the use of the data that reduces the effective sample size (EFFN), and the effect of that on the degrees of freedom (DF) must be taken into account when carrying out hypothesis tests. The procedures and their rationale are described in this freely downloadable paper: Illusory Statistical Power in Time Series Analysis.
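
The linked paper derives its own correction; purely as a generic illustration, here is a common Bartlett-style adjustment for the autocorrelation that a moving average introduces. The formula EFFN = N(1 - r1)/(1 + r1) is a standard rule of thumb for AR(1)-like series and is an assumption here, not necessarily the paper's procedure:

```python
# Effective-sample-size sketch: a moving average makes adjacent values
# share data, inflating autocorrelation. A standard Bartlett-style rule,
# EFFN = N * (1 - r1) / (1 + r1), shrinks the sample size accordingly.
# (Illustrative only; the paper's own correction may differ.)
import numpy as np

rng = np.random.default_rng(2)
n, window = 200, 10
raw = rng.normal(size=n)
smooth = np.convolve(raw, np.ones(window) / window, mode="valid")  # moving average

r1 = np.corrcoef(smooth[:-1], smooth[1:])[0, 1]   # lag-1 autocorrelation
effn = len(smooth) * (1.0 - r1) / (1.0 + r1)      # effective sample size
print(f"nominal N={len(smooth)}, lag-1 r={r1:.2f}, EFFN~{effn:.0f}")
# Hypothesis tests should use degrees of freedom based on EFFN, not on
# the nominal sample size, or they will overstate statistical power.
```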

Failure to correct for this effect on DF may result in a false sense of statistical power and faux rejection of the null in hypothesis tests, as shown in this analysis of Kerry Emanuel’s famous paper on what he called the “increasing destructiveness” of North Atlantic hurricanes: Circular Reasoning in Climate Change Research.

An extreme case of the effect of preprocessing on degrees of freedom occurs when a time series of cumulative values is derived from the source data as in the famous Matthews paper on the proportionality of warming to cumulative emissions [Matthews, H. Damon, et al. “The proportionality of global warming to cumulative carbon emissions.” Nature 459.7248 (2009): 829].

It has been shown in these downloadable papers that a time series of cumulative values has an effective sample size of EFFN=2, and that there are therefore no degrees of freedom and no statistical power. A quick numerical demonstration follows the list.

Degrees of freedom lost in moving window preprocessing Effective Sample Size of the Cumulative Values of a Time Series

Degrees of freedom lost in a time series of the cumulative values of another time series #1 Limitations of the TCRE: Transient Climate Response to Cumulative Emissions

Degrees of freedom lost in a time series of the cumulative values of another time series #2 From Equilibrium Climate Sensitivity to Carbon Climate Response

Degrees of freedom lost in a time series of the cumulative values of another time series #3 The Spuriousness of Correlations between Cumulative Values
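
As a quick numerical illustration of the cumulative-values trap (synthetic data, numpy assumed): two independent white-noise series are essentially uncorrelated, but their cumulative sums routinely show large correlations in either direction.

```python
# Cumulative-values sketch: compare the correlation of independent white
# noise with the correlation of the corresponding cumulative sums
# (random walks). The latter is typically large in magnitude despite
# there being no relationship at all.
import numpy as np

rng = np.random.default_rng(3)

def abs_corr_of_cumsums(n=500):
    x, y = rng.normal(size=n), rng.normal(size=n)
    return abs(np.corrcoef(np.cumsum(x), np.cumsum(y))[0, 1])

trials = [abs_corr_of_cumsums() for _ in range(200)]
print("median |R| between unrelated cumulative series:",
      round(float(np.median(trials)), 2))   # typically far from zero
```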

SOURCE





Extinction of Puffins premature



The Telegraph sent its science reporter up to the Farne Islands in Northumberland to write up a story it dramatically headlined ‘UK puffins may go the way of the dodo with fears of extinction in 50 years.’ It claimed:

    So far the news has been bleak. The puffins arrived four weeks later than usual and initial estimates suggest the number of breeding pairs has fallen by 12 per cent.  A combination of climate change, overfishing, plastic pollution and extreme weather has left the little seabirds struggling for survival.

This was followed up by BBC Radio 4’s Today programme and also by the Daily Mail.

But the story is #fakenews.

First, as one reader pointed out, puffin colony numbers go up and down all the time.

Second, as Paul Homewood notes, all that stuff about “climate change, overfishing, plastic pollution and extreme weather” is just pabulum. It’s like a catechism that true believers are required to chant – a ritual demonstration of faith, not the thorough examination of facts that you might hope for from a responsible specialist reporter.

Yes, there seems to be evidence that industrial fishing of sand eels has had a deleterious impact on puffin colonies. But definitely not climate change.

    If numbers rose from 3000 in 1939, to 55000 in 2003, when supposedly we have had global warming, how can it now be responsible for a decline?

Well indeed.

You might argue there’s no harm when the legacy media runs stories like this: all they’re trying to do is sex up a nice nature pic with a bit of attention-grabbing doom and gloom scribble underneath, and it all helps to raise awareness of environmental issues.

But I’d disagree.

#Fakenews environmental crisis stories like this, repeated day in day out, have a cumulative impact in generating precisely the kind of mass hysteria which has led to the great climate change scare.

People feel in their bones that something needs to be done – and urgently – because why would newspapers and the BBC be running this stuff if it weren’t true and a real problem?

Politicians, in turn, feel compelled to respond to this apparent crisis. The resultant policies are invariably disastrous.

SOURCE





Climate change should HELP Midwest corn production

Climate change and global warming put some forms of life at risk, but researchers found one instance that might not feel the heat – corn.

Contrary to previous analyses, research published by Michigan State University shows that projected changes in temperature and humidity will not lead to greater water use in corn. This means that if temperature and humidity continue to trend as they have for the past 50 years, crop yields can not only survive, but thrive.

“There is a lot of optimism looking at the future for farmers, especially in the Midwest,” said Bruno Basso, lead author of the study and University Distinguished Professor.

Basso and his colleague Joe Ritchie, co-author on the study, calculated how much energy crops receive from the sun and how it is converted to evaporative loss from the crop, known as evapotranspiration.

“Think of the energy balance like a bank account. There are additions and subtractions,” Basso said. “The energy coming from the sun is a known, measured quantity that adds to the bank account. The primary subtraction is liquid water from the crop, and soil using the solar energy to convert the water to vapor.”

The researchers used the energy balance to calculate the evaporative water loss for 2017, which set a world record yield of 542 bushels per acre. They found that the water loss was the same as it was for lower yielding crops because the energy balance was about the same.

The trend over the past 50 years toward a slightly more humid environment decreases the energy available for the crops’ water use.

“Our analysis, and that of other climate researchers, shows that the amount of water vapor in the air is gradually increasing in the summers because the daily low temperatures are getting gradually warmer, but the daily high temperatures are cooling – or staying the same – in many areas of the Midwest,” Basso said. “This causes more humidity and slightly decreases how much energy is used for evaporation.”

Basso also tested a water balance calculation on the crop models that, similar to the energy balance, has additions from rainfall and irrigation and subtractions from evaporation from the crop.

“A water balance is just like the bank account of energy for crops,” Basso said. “There must be a balance to make crops ‘happy’ so that all the energy reaching the crop surface is evaporated.”

In the United States, as a result of improved hybrids and agronomic practices, corn production has steadily increased by an average of two bushels per acre every year for the past 40 years.

Basso explained that data from the National Corn Growers Association competition for high yields shows the potential for continued higher yields in the future. His findings suggest that climate change won’t hinder corn production if the trend of the past 50 years continues into the next 50 years.

“The energy for evaporation is changing little, so if the number of days the crop grows and uses water is the same now and in the future, the evaporation loss will be the same or slightly less,” Basso said. “In fact, the warmer temperatures allow the use of longer season hybrids that will make for even greater yield possibilities.”

SOURCE





The German wind energy market is threatened by a sharp downturn after years of continuous growth. Ten thousand jobs were already cut last year

Düsseldorf: When Volker Malmen, head of Orsted Germany, sat on stage in a hotel in Bremerhaven just over a week ago, he could not help laughing when the moderator asked him which countries were important growth markets for the wind industry. “Well, probably not Germany,” replied the managing director. And Malmen’s statement carries weight in the industry: the Danish company is one of the world’s leading wind farm operators. The Orsted boss is not alone in his opinion.

According to a survey by the market research institute windresearch, conducted together with WindEnergy Hamburg, the largest wind fair in the world, the mood in the industry is basically positive. But the German project planners, operators and manufacturers surveyed are far less optimistic about their own situation, in some cases even calling it “very negative”. The survey is available exclusively to Handelsblatt.

Germany is the most important sales market for wind power in Europe; last year alone, more wind turbines were installed here than ever before. But while the global wind industry is booming, the German market is threatened by a sharp downturn after years of continuous growth.

In 2017, around 1,800 new wind turbines with a combined output of 5,330 megawatts were added in Germany, but in the worst case it could be only 1,100 megawatts in 2019. The German Wind Energy Association warns of the loss of thousands of jobs; ten thousand were already cut last year.

On the one hand, the industry is facing enormous price erosion due to the worldwide reduction in subsidies and the switch to tenders. On the other hand, Germany is considered a particularly difficult market: demand for wind turbines all but collapsed under the auction system introduced in 2017.

Over 1,200 companies from all over the world were surveyed for the WindEnergy Trend Index 2018, covering both the onshore and offshore wind sectors. In both cases, the respondents assess the situation in Germany as significantly worse than in the rest of Europe, Asia or North America. The onshore industry seems particularly concerned, with 38 percent rating the current situation as negative to very negative.

Dirk Briese, Managing Director of windresearch, attributes this to the tenders. “In other countries such auction systems have existed for some time. In addition, the lowest results to date were achieved in Germany. And in the shortest time,” Briese explains.

The operators Orsted, EnBW, Vattenfall and the Spanish energy supplier Iberdrola won some of the projects with zero-cent bids in two tender rounds. They intend to market their electricity entirely without EEG compensation, at the price traded on the exchange; to date a novelty in the wind industry. According to Briese, this is a very rapid and radical change.

Nevertheless, the majority believe they are equal to this upheaval. In the offshore sector, respondents already expect a more optimistic mood this year, and hope the situation could brighten further from 2020. Only about twenty percent currently consider the German market to be in a negative starting position. The prospects for onshore wind energy are not quite as bright: here, 19 percent do not expect an improvement even in two years’ time. However, more than forty percent assess the situation in two years’ time as positive to very positive.

One reason for this optimism could be further cost reductions in the construction of wind turbines; there is still potential here. Those surveyed agree that the biggest leap is imminent in offshore wind energy: almost seventy percent rate the chances of further savings as high to very high.

Siemens offshore CEO Andreas Nauen recently called, in an interview with Handelsblatt, for politicians to improve the general conditions in order to prevent Germany from falling behind. “If Germany wants to stay at the top and not fall behind other countries, something has to change. We hope that the expansion corridor for offshore wind energy will widen considerably,” said Nauen. According to the government’s targets, the corridor is 15,000 megawatts (MW) by 2030, which Nauen says is too little.

In the second tender round for onshore wind energy last week, the tendered volume was not reached for the first time: of 670 MW on offer, only 111 bids with a volume of 604 MW were received.

Companies see markets such as China, India or Taiwan as more promising. The respondents see the best opportunities for the onshore sector from 2020 onwards in Asia, while in the offshore sector they see better conditions for Europe.

In principle, however, growth is increasingly shifting from Europe to Asia, and is slowing overall. This also became clear at the Windforce trade fair in Bremerhaven. The industry experts’ assessment: new projects are being implemented in Asia, while the German market is sluggish.

SOURCE




How 19 Wealthy Foundations Control The Anti-Fossil Fuel Agenda And Escape Scrutiny

Nearly 20 wealthy foundations funneled hundreds of millions of dollars between 2011 and 2015 into a network of environmental organizations to attack the fossil fuel industry, according to a new study published this week by Matthew Nisbet, Ph.D., a Professor of Communication Studies at Northeastern University.

The study, which only analyzed a subset of the organizations active on climate issues, provides a glimpse at the massive funding apparatus behind the anti-fossil fuel echo chamber – and the lack of scrutiny that this big money campaign has faced.

Energy In Depth has previously exposed how wealthy anti-fossil fuel foundations like the Rockefeller Brothers Fund and Rockefeller Family Fund are financing a wide range of activist groups.

These foundations have admitted to funding studies that attack the oil and gas industry, media outlets that provide favorable coverage of those studies, and activist groups to trumpet their anti-fossil fuel agenda online and in the press.

Nisbet’s paper notes that there has been relatively little scrutiny of the presumed independence of these voices in the media:

“When left‐of‐center and progressive foundations are covered in the U.S. press, coverage tends to be predominantly positive and uncritical, deepening a lack of public scrutiny relative to their philanthropic activities, successes, and failures.

“These grantmakers are also among the major patrons for academics and their work and are the main supporters of the rapidly growing nonprofit journalism sector. Many scholars and journalists, therefore, have reason to be cautious in their assessment (Reckhow, 2013).” (emphasis added)

Nisbet emphasizes how the 19 foundations examined in his study, including the Rockefeller Brothers Fund, Bloomberg Foundation, and Hewlett Foundation, have shaped the conversation on climate change, due to the immense concentration of grant money:

“Far from being passive supporters of actions to address climate change, major U.S. foundations for several decades have played an active role in defining a common roadmap for their grantees and partners.

“By framing the challenges, defining the priorities, and promoting specific ideas, philanthropists have actively shaped common ways of thinking that have bound together otherwise disconnected organizations and leaders into shared approaches and strategies.” (emphasis added)

Nisbet cited other literature that described the impact of the foundation approach as an “outsize megaphone, both actively shaping how people view social problems and championing specific methods through which these problems can be addressed.”

The largest of these foundations is the Energy Foundation, and the study describes how its size allows philanthropists to exert major control over the environmental community to focus work on its preferred policies:

“Launched in 1991, the Energy Foundation has been the main instrument that a network of influential U.S. philanthropists has used to define a portfolio of policy options, political strategies, and energy technologies to address climate change.

“Set up by way of large block grants from the Rockefeller Foundation, Pew Charitable Trusts, and MacArthur Foundation, and supported in later years by the Hewlett Foundation, Packard Foundation, and other funders, the principal function of the Energy Foundation has been to leverage money in a highly concentrated pattern on behalf of policies that shift markets, industry, and consumers in the direction of renewable energy technologies and energy efficiency practices.” (emphasis added)

The outsized influence exerted by a small number of wealthy foundations has led to group-think in the climate change conversation and has increased political polarization around the issue by focusing on divisive, anti-fossil fuel policies.

Nisbet cited a few specific strategies supported by the foundations that have had this impact:

“Yet related to these strategies, campaigns opposing the Keystone XL oil pipeline and natural gas fracking along with new causes related to racial, gender, and identity‐based justice have also likely contributed to deepening political polarization, serving as potent symbols for Republican donors and activists to rally voters around.

“These issues also divide liberal and centrist Democrats, and were a major point of contention during the Democratic primaries.” (emphasis added)

The efforts to support the agendas of these foundations are all-encompassing and include communications, promotion of renewable technologies, and efforts to limit fossil fuel development.

In fact, the “Park Foundation and Rockefeller Brothers Fund…are notable for supporting strategies that directly target the fossil-fuel industry by way of communication, media, and mobilization campaigns,” Nisbet writes.

Meanwhile, the analysis of foundation spending shows alternative ways of reducing greenhouse gas emissions that contribute to climate change remain largely ignored or even attacked.

For example, the study found that $6,834,000 was spent specifically to “oppose, limit natural gas development,” even though increased use of natural gas has allowed for a significant reduction in greenhouse gas emissions in the United States.

Emissions of carbon dioxide in the United States decreased 13 percent from 2005 to 2016, while natural gas consumption increased 25 percent over the same period.

The study breaks down the attack on natural gas:

“Specific to natural gas fracking, $6.8 million was provided to restrict or ban drilling, $2.1 million to protect drinking water supplies; and $3.9 million for research on health and environmental impacts.

“To support efforts to ban/restrict fracking, Schmidt ($3.3 million), Hewlett ($1.5 million), Park ($1.1 million), and Heinz ($1 million) were the leading funders. Schmidt gave to a mix of national‐ and state‐based groups. Hewlett gave primarily to the Colorado Conservation Fund ($1.3 million).

“Park gave primarily to groups working in New York state, and Heinz to groups in Pennsylvania. Relative to protecting drinking water supplies, major funders included Heinz ($1 million) for efforts in Pennsylvania; and Park ($760,000) for work in North Carolina and New York.

“Major funders of research on fracking’s health and environmental impacts included Heinz ($2.7 million), Park ($780,000), and Schmidt ($390,000). These funds were given to a mix of universities and environmental groups.”

The foundations only granted $1.3 million to support work on carbon capture and sequestration and just $55,000 to promote “fossil fuel industry innovation to limit emissions.”

Nisbet concludes his paper by predicting the next steps in the anti-fossil fuel campaign and the public’s ability to hold these wealthy foundations and their political allies accountable:

“In coming years, as the endowments of major foundations continue to grow, providing philanthropists with ever greater resources, they are likely to play an even more active and strategic role in funding actions to address climate change in the United States and elsewhere.

“In 2017, the Hewlett Foundation, for example, announced it would spend $600 million over the next decade to combat the problem (Gunther, 2018). By framing the challenges and defining the solutions to climate change, as they did in the years following the defeat of the cap and trade bill, Hewlett and other major philanthropies are likely to deepen their ability to bind together organizations and leaders into shared approaches and strategies.

“In an era of political dysfunction and diminished public spending, many will look to philanthropy and their resources for answers. Yet in contrast to elected officials and government agendas, there are few channels to hold funders accountable for their decisions or to shine a light on their actions…

“Financial support for efforts restricting fossil fuel development and for turning public opinion against the industry is also likely to expand. Examples include municipal lawsuits filed against fossil fuel companies to recover damages for climate change impacts; and decisions by states and cities to divest their pension plans of energy-related stocks.

“To aid these efforts, some funders will also deepen their support for journalistic investigations of the fossil fuel industry. Such strategies, however, are likely to intensify controversy over the ties between funders, advocacy groups, and journalists.” (emphasis added)

SOURCE

***************************************

For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here.  Email me (John Ray) here.  

Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here

*****************************************


Sunday, May 27, 2018


Doing the numbers on renewable energy

Wind and solar are still small in global terms, which is why advocates never mention absolute or even relative size, but focus on growth rates. They also never talk about the wildlife impacts.

In Australia, there is little research on such matters, but some figures are coming in from the US. The Gibson paper cites estimates that wind farms are killing 600,000 to 880,000 bats a year, which now makes them the second biggest risk to bats behind White Nose Syndrome. Birds are also getting killed in large numbers, but not large enough to rate next to motor vehicles and transmission lines; unless you are a bird.

But intermittent renewables like wind and solar need a much bigger transmission network than traditional grids, so they will also increase the avian transmission line death and injury toll. How much bigger does the transmission network need to be for wind and solar? 5-10 times. And those 600,000+ bats killed annually in the US are being killed for a power source that generates just 6.3 percent of US electricity.

The Jacobson plan (see Part I or the critique here) calls for expanding the 82 GW of wind turbine capacity in the US to 2449 GW; so we can expect this to also cost 18 to 26 million dead bats a year. We can also expect the current wind farm toll of half a million birds annually, including 83,000 raptors, to rise by perhaps a factor of 32.

But all these animal and environmental problems wouldn’t be so bad if the technology could both provide a reliable grid while also solving our climate problem… but it can’t.

In Germany, solar power is still only about 6 percent of electricity, but is already stuck.

The following figure shows that solar power growth is levelling off in all the key European countries who spent big on subsidising solar growth. The German data for solar output in 2017 is available and is much the same as for 2016.

Some of this is due to simply running out of money. But the much bigger problem is structural. It doesn’t matter how cheap it is if you can’t sell it. Solar power output in Germany will certainly rise a little more, but it’s unlikely to pass its predicted maximum of about 11 percent of German electricity.

Prediction? What prediction? I don’t know who spotted it first, but this article contains a description of why intermittent renewables will tend to level off at around what’s called the capacity factor: 11 percent for solar power in Germany, and 16 percent for solar power in sunny Australia.

Why? Put briefly, and using wind power as an example: when you have enough wind turbines to meet 100 percent of the electricity demand on windy days, the incentive to build more turbines starts to decline. Why? Think about what happens on windy days after you double the amount of wind power: you’ll simply have to throw half of your electricity out; you can’t sell it.

How much electricity will you get from wind over a year if you satisfy 100 percent of the demand on windy days? This number is called the capacity factor. It’s just the annual average output divided by the theoretical maximum if every day were maximally windy at all turbine locations. It’s about 33 percent, give or take a bit. The toy calculation below makes the ceiling concrete.
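
A toy worked example of that ceiling (the numbers are illustrative, and constant demand is assumed for simplicity):

```python
# Capacity-factor ceiling sketch: once nameplate wind capacity equals
# peak demand, annual energy delivered tops out near CF * demand, and
# further build-out mostly produces unsaleable surplus.
demand_gw = 1.0            # assume constant demand, for simplicity
capacity_factor = 0.33     # annual average output / nameplate, as above

nameplate_gw = demand_gw   # enough to meet 100% of demand on windy days
served_share = nameplate_gw * capacity_factor / demand_gw
print(f"share of annual demand served: {served_share:.0%}")   # ~33%
# Doubling nameplate to 2 GW doubles windy-day output, but everything
# above 1 GW must be curtailed, so saleable energy grows much more
# slowly than the build-out.
```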

So without large amounts of storage, profitability ceases and growth gradually stops, rather like what you can see in the graph.

The largest battery in the world was recently installed with great fanfare in South Australia, but can it store large amounts of energy? No. That was never the intention; as an energy storage device, it’s tiny.

SA typically uses 1,500 megawatt-hours of energy each hour, and the battery could store about 4 minutes worth of this. The battery was never intended to store energy; that’s just a side effect. Its purpose is to reduce frequency fluctuations during generator outages. Not that it will do that particularly well either. ACOLA reckoned it would need to be 6 times bigger to have prevented the September 2016 blackout.

So it won’t store much energy and won’t be much use to stop blackouts; so what’s it for? As a means of securing votes from renewable energy junkies, it’s priceless.

The only available technology which can store significant amounts of electricity to allow renewables to expand beyond their capacity factor is… can you guess? … flooded valleys; otherwise known as pumped-hydro.

So while renewable advocates cheered early exponential growth of solar and wind power, the rates were always destined to be logistic… meaning that they grow exponentially until hit by limiting factors which cause an equally fast levelling off.

If I had included China in the graph, you’d see a massive solar increase during the past few years, because she’s still on the exponential growth segment of the curve. But the limiting factors will eventually kick in, exactly as they have done in the EU countries. In fact, at a local level throwing out excess wind power in China is already a problem.

A few years back AEMO did a study on how to meet Australia’s electricity demand with 100 percent renewable sources. They put forward two plans, both involved putting a baseload sub-system underneath wind and solar; one plan was based on burning forests and the other on geonuclear.

Geonuclear is where you drill a hole in the earth’s crust deep enough to tap into the heat generated by radioactive decay in the earth’s mantle and crust. You might know it as geothermal, but it’s a power source based on radioactive decay, so why not call a spade a spade? And did I mention the radioactive material being brought to the surface and spread over the landscape by this industry?

Is it a problem? Absolutely not. Meaning that it is a well understood micro-problem which people solve in many similar industries. But could I construct a true but totally misleading scare story about it?

For some people, I probably just did. Not everybody appreciates the irony of opposing digging big holes to drop radioactive material down (nuclear waste repositories) while supporting digging big holes down to where extraordinary quantities of radioactive material are generating heat.

And what if you don’t want burning forests or geonuclear? A recent study of the US showed what happens when you try to power the country with just wind, solar and storage. It quantifies the lack of end game with these technologies. It’s like trying to build a 10-story building with inadequate materials and design. Things may go brilliantly until level 9, and then you suddenly realise you are screwed.

The US electricity grid is currently about 99.97 percent reliable; ours is generally even better. The study found that you can get an 80 per cent reliable grid with wind and solar without too much trouble. And then it starts getting hard, really quickly. By “without too much trouble”, I mean lots of overbuilding and extra transmission lines.

Look at the bottom graph, which assumes 75 per cent wind and 25 per cent solar. The black line shows how big an overbuild you need if you want a grid of specified reliability. The reliability is given along the X axis and the overbuild factor on the right.

Draw a horizontal line with your eyes from the overbuild factor of 10 and see where it hits the black line. Somewhere about 99.8 percent reliability. So if you want a 99.8 percent reliable supply of 1 gigawatt, then you need to build 7.5 gigawatts of wind and 2.5 gigawatts of solar.

This is very much an optimistic estimate. There are plenty of unrealistic assumptions here, like a perfect transmission system and all your turbines in the best spots. It’s the best you can do; it’s just that the best isn’t really very good.

Now draw a horizontal line with your eyes from the overbuild factor of 5 to the 12-hour storage line. This shows that you can get a 96 per cent reliable supply of 1 gigawatt by building 3.75 GW of wind and 1.25 GW of solar, if you have 12 gigawatt-hours of storage.

You’d have to repeat the study with Australian data to see what happens here, but it’s worth thinking about what 12 hours of storage looks like. In Australia, our average power use is about 28 gigawatts, so to store 12 hours worth of energy would require about 3,100 of those ‘biggest battery in the world’ devices in South Australia. There are plenty of other tiny storage systems that it’s fun to pretend might one day scale to the sizes required, but only flooded valleys have a proven track record.
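
A back-of-envelope check of those figures (the SA battery's unit size, roughly 0.108 GWh, is inferred here from the post's own "about 3,100 devices" claim, not quoted from a spec sheet):

```python
# Sanity-check sketch for the storage and overbuild arithmetic above.
# The SA battery size (~0.108 GWh) is an assumption inferred from the
# post's numbers, not a quoted rating.
avg_demand_gw = 28.0
storage_hours = 12.0
storage_needed_gwh = avg_demand_gw * storage_hours           # 336 GWh
sa_battery_gwh = 0.108                                       # assumed unit size
units = storage_needed_gwh / sa_battery_gwh
print(f"storage needed: {storage_needed_gwh:.0f} GWh (~{units:,.0f} SA-sized batteries)")

# Overbuild example from the US study: factor 10 at ~99.8% reliability,
# split 75% wind / 25% solar, for a 1 GW reliable supply.
overbuild, reliable_gw = 10.0, 1.0
print(f"wind: {0.75 * overbuild * reliable_gw:.2f} GW, "
      f"solar: {0.25 * overbuild * reliable_gw:.2f} GW")
```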

As it happens, someone has done a very similar study using Australian data. The recently released ACF report A Plan to Repower Australia lists the study (by Manfred Lenzen of UNSW and others) among its evidence base. It finds pretty much what the US study found; namely, that you could power Australia (meaning supply our 28 gigawatts worth of demand) with wind, sun and storage, and all you’d need to do is build 160 gigawatts worth of wind and solar farms, including 19 gigawatts worth of biomass-burning backup.

A one gigawatt power plant is a large structure, whether it’s burning wood, coal or gas. The 19 biomass burners would be doing nothing 90 percent of the time, but we’d need them just to plug the holes during low wind and sunshine periods. Oh, and they also postulate 15 hours of storage for the 61 gigawatts of solar farms.

How would this be provided? The main paper didn’t say, and I didn’t buy the supplementary material. But you could do it with about 8,000 “biggest battery in the world” Li-ion batteries. Alternatively you could use fertiliser, otherwise known as molten salt: a mix of sodium and potassium nitrate. All you’d need would be about 26 million tonnes, which is over 8 years’ worth of the entire planet’s annual production (see here and here); all of which is currently earmarked to grow food.

In South Australia, wind supplies a little more than the capacity-factor percentage of our electricity, which means we are starting to throw away electricity when it’s windy while relying on gas, or coal power from Victoria, when it isn’t.

Which is why the new Liberal Government wants to build another inter-connector. That’s fine as a short-term fix, but eventually the whole NEM will saturate with wind and solar. And then where do you build an inter-connector to?

The statewide blackout of 2016 was also a wakeup call that the automatic frequency control delivered by synchronous energy sources but not by wind and solar actually mattered; big time. Without it you are in trouble when events of any kind take out some of your generation capacity.

But ignoring the problems and assuming the US results apply, we could surely plough on, build another 6.5 times more wind power plus considerably more solar, buy another 180 of those Elon Musk special batteries, and have a working, but sub-standard, grid.

This assumes we added all the rest of the required transmission infrastructure to connect all those wind and solar farms. That’s the thing with solar and wind: it may seem attractive when you kick the problems down the road and rave about the short-term successes. But the devil is in the detail, and in the total lack of end-game.

SOURCE  (See the original for links, graphics etc.)






Global warming could produce MORE farming land, scientists admit

CLIMATE change could increase the overall amount of boreal land ready for farming by up to 44 percent, a study has claimed.

Global warming and rising sea levels are increasingly likely threats, scientists have warned.

But a team of international scientists may have now found a potential upside to rising global temperatures. A study, published in the journal Scientific Reports, claims the impacts of global warming could unlock boreal regions for farming by 2099.

Currently, only 32 percent of the world’s boreal areas in the northern hemisphere are arable.

Study co-author Professor Joseph Holden, University of Leeds, explained: “Climate change will have a profound impact on our agricultural regions.

“A projected consequence is the loss of farmland and crops from areas that are currently productive, which is cause for concern regarding long-term global food security.

“Therefore we need to know whether in northern high latitudes new areas will become suitable for crops."

The world’s boreal regions are found in great swathes of the United States, Russia, Canada, Norway, Sweden and Finland.

But the study also warned increasing the world’s arable land could have a negative impact on agriculture in other parts of the globe by upsetting climatic water balance. [Rubbish!  A warmer world would evaporate more moisture off the oceans, meaning that overall rain and snowfall would INCREASE]

Study lead author Dr Adrian Unc, from Memorial University Canada, said: "We must not forget that any changes in land use has extensive impacts on the entire natural ecosystem, impacts that must be understood and included in any planning effort."

SOURCE 




EPA’s Pruitt is far cleaner than critics claim

His security, DC bedroom and policies are legitimate and defensible, under any fair standard

Deroy Murdock

EPA Administrator Scott Pruitt has been hounded lately by allegations of rich spending and poor judgment. While he could have detonated himself during recent congressional-oversight hearings, the former Oklahoma prosecutor seems to have survived those tests. Nonetheless, EPA’s inspector general, the Government Accountability Office, and various congressional panels continue to probe Pruitt’s official conduct. While Pruitt has plenty for which to answer, on at least three key counts, he seems to be cleaner than his critics claim.

Pruitt’s foes have attacked him for spending too much on bodyguards. Senator Tom Udall (D-NM) slammed Pruitt for “taking 30 EPA enforcement officers away from investigating polluters to serve as his round-the-clock personal security detail.” The Associated Press counts 20, not 30, in Pruitt’s full-time protective detail. Its cost, AP reports, “approached $3 million when pay is added to travel expenses.”

But an August 16 EPA report suggests that Pruitt’s personal-defense outlays are fueled by genuine safety concerns rather than self-aggrandizement. This document cites 14 threats against Pruitt and his family in Fiscal Year 2017. Among them:

* Pruitt’s daughter has been menaced via Facebook. e.g., “I hope your father dies soon, suffering as your mother watches in horror for hours on end.”

* An e-mail sent to the Washington, D.C. office of Senator James Lankford (R - Oklahoma) threatened to assassinate Pruitt, President Trump, and Vice President Pence.

* One message to EPA said, “I hope your head administrator (Scott Pruitt), dies a very painful and horrible death through poisoning. Please explain the scientific method to this freaking neanderthal [sic].”

* Another spooky character stated via Twitter, “Pruitt, I am gonna find you and put a bullet between your eyes. Don’t even think I’m joking. I’m planning this.”

* A postcard sent to Pruitt read, “Get out while you still can, Scott, you are evil incarnate you ignorant fuck.”

* “Dear Mr. Pruitt,” another postcard began. “CLIMATE CHANGE IS REAL!! We are watching you. For the sake of our planet, our children & our grandchildren, will you be a reasonable man? I repeat, we are watching you! Myrna, Michele, Chris, Signe, Lucy, Olivia and Isabel.”

* A trespasser entered EPA headquarters on March 6, CBS News reported. He claimed to be a student attending a “Microsoft event,” said EPA Assistant Inspector General Patrick Sullivan. “The person asked about Scott Pruitt and wanted to know where Pruitt’s office was and if Pruitt ever walked in the hallway outside the room.” Although the intruder was escorted off the premises, he later phoned an employee’s office number and left voicemails in which he said, as Sullivan explained, “he can gain entry into EPA space anytime he wants.”

* Not content simply to write, one critic showed up in person. An EPA sentry stopped him. “During the confrontation, the subject was able to acquire the security officer’s duty weapon and discharge a round into a nearby chair.” The guard disarmed the visitor, who later was indicted for assaulting a federal officer/employee.

These and other concrete provocations justify Pruitt’s focus on security. The Left’s hatred of President Donald J. Trump and his supporters, including Pruitt, is incandescent. One cannot fault Pruitt’s caution, especially after James T. Hodgkinson, a Bernie Sanders campaign volunteer, shot and nearly killed Rep. Steve Scalise (R - Louisiana) and four others at the GOP congressional baseball team’s practice last June in Alexandria, Virginia.

Just a few days ago, Miami-Dade Police arrested Jonathan Oddi. Officers say they nabbed Oddi after he fired gunshots in the lobby of the Trump National Doral, one of the chief executive’s golf resorts. Miami-Dade Police Director Juan Perez said Oddi shouted “anti-Trump, President Trump rhetoric.”

A similar attack that maimed or killed Pruitt – and perhaps EPA personnel and innocent bystanders – is hardly fanciful. Such a scenario is worth devoting resources to prevent.

Also under review: Pruitt’s 2017 housing arrangements in Washington, DC. Pruitt’s accusers claim he got a special, below-cost deal in some sort of bed-for-bribe swap. Had Pruitt been billeted for pennies in a Georgetown townhouse or an Embassy Row mansion, these worries would be legitimate. However, Pruitt rented a room in a Capitol Hill condominium and paid only for the evenings when he actually slept there. He shelled out $50 per night, equal to $1,500 per month. According to Pruitt’s lease, “Enjoyment is limited to one bedroom that cannot be locked. All other space is controlled by the landlord.”

In an April 4 EPA memorandum, Designated Agency Ethics Official Kevin S. Minoli stated that “within a six-block radius” of Pruitt’s crash pad, there were “seven (7) private bedrooms that could be rented for $55 or less/day.” Minoli, who also is EPA’s principal deputy general counsel, also found 38 such rooms “across a broader section of Capitol Hill.” As a result of its research, Minoli explained, “the ethics office estimated $50/day to be a reasonable market value of the use authorized by the terms of the lease. As such, the use of the property according to the terms of the lease would not constitute a gift under the Federal ethics regulations.”

No gift, no graft.

Some also have fretted about the fact that this property is owned by energy lobbyist Steven Hart and his wife Vicki. Pruitt told the Washington Examiner that they were old friends from Oklahoma. “I’ve known him for years,” Pruitt said. “He’s the outside counsel for the National Rifle Association, has no clients that are before this agency, nor does his wife have any clients that have appeared before this agency.”

Pruitt reportedly requested and was given multiple extensions on his lease until last summer. When he had overstayed his welcome, the Harts asked him to make way for an incoming renter and changed the locks behind him. If this couple wanted to curry special favor with the EPA chief, that seems like a rather fruitless strategy.

It’s no surprise that these and other actions by Pruitt are under a microscope. For many on the Left, battling so-called “global warming” borders on religion. As they see it, the science is “settled,” the creed is beyond debate, and the heretics who question the faith should be jailed, as Bill Nye the Science Guy has suggested, or executed, as Professor Richard Parncutt of the University of Graz, Austria, has proposed.

Someone like Pruitt, who rejects manmade-global-warmist alarmism and is powerful enough to implement his ideas (e.g. persuading President Trump to junk Obama’s Clean Power Plan and withdraw America from the Paris Climate Treaty) embodies the Left’s worst nightmares. To the warmists, Pruitt is a torch-bearing arsonist, scurrying maliciously through their Vatican. And he must be stopped.

Even if Pruitt winds up scot-free, his situation should serve as a cautionary tale for every member of Team Trump – from the president on down: Their margin of error is thinner than Saran Wrap. President Trump and all who work for him should act as if their every action and utterance were being broadcast live on MSNBC, with Rachel Maddow, Chris Matthews and Joe Scarborough offering scathing, bitter and unforgiving commentary. No one on Team Trump ever will get the benefit of the doubt. When the First Lady gets slammed, even for unveiling an anti-cyberbullying initiative, it is safe to assume that everyone in this administration will be scrutinized with the deepest suspicions.

As much as these actions by Scott Pruitt can be defended, these days require an even higher level of purity. It may be as physically unobtainable as 250-proof alcohol. Regardless, and unfair as it may be, this must be the ideal to which every member of the Trump Administration, the Republican Congress, and pretty much each American conservative must aspire.

SOURCE 





Liberals Upset by superhero movie's Message: Environmentalism = Mass Murder

Progressives are worried about Marvel’s Avengers: Infinity War. They think its villain Thanos, whose solution to the overpopulation problem is to wipe out half the planet, gives the wrong impression that environmentalists are evil.

At Yale Climate Connections, Michael Svoboda complains: By ascribing selfless motives to Thanos, AIW tacitly delivers this toxic message: environmentalism = mass murder.

Solitaire Townsend, co-founder of the environmental PR agency Futerra, also finds the movie’s message too close for comfort. At Forbes, she writes: The Mad Titan sounds worryingly like some environmentalists. Over the years the need for ‘population control and reduction’ has been widely called for as the necessary solution to our resource and sustainability crisis. Thanos is the ultimate Malthusian. After he fulfills his purpose, crumbling half of life in the universe into dust, he retires to an idyll many environmentalists would enjoy – a simple rural hut set in sunlit dappled fields. He had promised “not suffering, but salvation,” and in the final shot a tiny smile is playing on his face after a job well done. Ouch.

For Svoboda, this is a horrible distortion of the essential goodness with which environmentalists are imbued. Not only is it “repugnant” but it fails to take into account all the many wonderful possibilities that greens are now considering as part of their plan to save the world.

In AIW, no one ever points to a country or world that has learned to live sustainably, even though at least two places in the Marvel Cinematic Universe, Asgard and Wakanda, appear to have been moving in that direction.

Neither does anyone point out other ways to solve the problem of environmental degradation, or even other ways to pursue Thanos’s preferred solution. Empowering women, for example, is a peaceful way to constrain and then reverse population growth.

Except, as Maddie Stone argues here, Thanos’s attitude is actually a pretty accurate representation of how many environmentalists think.

To me at least, it came across as a clear denouncement of a certain breed of solutions-oriented environmentalism that centers planetary “balance” over people.

The early history of environmentalism is festooned with warnings of a population apocalypse, beginning with 18th century scholar Thomas Malthus’ An Essay on the Principle of Population, which concluded that rising human numbers would inevitably lead to widespread poverty and famine. As Malthus’ pessimistic predictions failed to materialize, he was declared a false prophet.

But his ideas stuck around, re-emerging with force in the mid-20th century following the viral popularity of works like Paul Ehrlich’s The Population Bomb (1968), which predicted “hundreds of millions of people are going to starve to death” in the 1970s, and The Limits to Growth (1972), an MIT research report that concluded “The basic behavior of the world system is exponential growth of population and capital, followed by collapse.”

So yeah, Thanos’ concern about galactic population control? Definitely something we’ve thought about here on Earth. And while the most dire doomsday predictions haven’t come true—thanks largely to industrialization and the green revolution in agriculture—this school of thinking has had real-world consequences, including racist campaigns to sterilize millions of women in the developing world, and China’s fraught one-child policy.

Avengers: Infinity War isn’t the first work of popular fiction to buck the “environmentalism = fluffy and good” trend. One of the first to do so was Michael Crichton’s novel State of Fear (2004), in which eco-terrorists plot mass murder to raise awareness of global warming.

The evil mastermind in Kingsman: The Secret Service was also an environmentalist. Richmond Valentine (Samuel L Jackson) is a billionaire philanthropist who believes the only way to save humanity from overpopulation is to wipe out everybody except his favorite celebrities and politicians.

SOURCE 





Democrats Blame Trump for High Gas Prices

Meanwhile, Dems regularly call for more tax hikes on gasoline

Gas prices across the nation have been steadily rising and have now reached levels not seen since November 2014. The average price of gas this week is $2.92 per gallon, 55 cents higher than this time last year. And while summer gas price hikes are nothing unusual, the fact of the matter is it’s an election year and Democrats are looking for any political narrative to spin in their favor.

Lobbing baseless accusations from the political peanut gallery, Senate Minority Leader Chuck Schumer (D-NY) bloviated, “President Trump’s reckless decision to pull out of the Iran deal has led to higher oil prices.” He continued, “These higher oil prices are translating directly to soaring gas prices, something we know disproportionately hurts middle and lower income people.” Ah, so “controlling” the inherently uncontrollable and regularly fluctuating price of gas is more important than pulling out of a terrible deal that did nothing but give cover to a villainous, terror-funding regime on its march toward nuclear weapons? Schumer’s simply trying to kill two birds with one stone. That’s because he’s worried that the widely anticipated “blue wave” may prove to be nothing more than a ripple.

So why do gas prices fluctuate so regularly, and why in particular do they rise during the summer months? First, the leading factor in the price of gas is OPEC. In November 2016, OPEC decided to limit oil production with the express purpose of increasing the price. Second, the EPA requires refiners to change their gas formulas in the spring to accommodate air-quality regulations, driving up production costs, which are then passed on to the consumer. And finally, economics 101 — America’s growing economy has increased the demand for gas, which in turn raises the price. None of these factors are directly controlled by this or any president.

In fact, the only real direct control politicians have over the price of gas is via taxation. Ironically, while Schumer and his fellow Democrats are running around blaming Trump and Republicans for higher gas prices, the fact is Democrats have for years advocated raising gas taxes. As Investor’s Business Daily notes, “As recently as 2015, Democrats were pushing to nearly double the federal gasoline tax. At the time, House Minority Leader Nancy Pelosi said that it was the perfect time to do so because ‘if there’s ever going to be an opportunity to raise the gas tax, the time when gas prices are so low — oil prices are so low — is the time to do it.’” Democrat-controlled states like California already have raised gas taxes.

So the next time you fill up and wince at the price, remember just how much of the price is already due to Democrat taxes, and how much higher they’d prefer those prices to be.

SOURCE 

***************************************

For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here.  Email me (John Ray) here.  

Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here

*****************************************


Friday, May 25, 2018


Air pollution scare debunked

Greenies love to condemn urban air pollution and say how bad for us it is. Faulty science on fine particulate pollution (PM2.5) was the bedrock of the Obama EPA’s war on coal. Particulates don’t just make you sick; they are directly related “to dying sooner than you should,” EPA Administrator Lisa Jackson falsely told Congress. There is no level “at which premature mortality effects do not occur,” Mr. Obama’s next Administrator Gina McCarthy dishonestly testified. See also some of my previous comments here.

The latest research findings below are very powerful evidence on the question. The study included the entire Medicare population from January 1, 2000, to December 31, 2012. And its finding of only about one extra death per million people at risk per day is pretty decisive. If you bother about a risk that tiny, you should never get out of bed.

The authors pretend that their findings support the Greenies, but they would have been reviled if they had told the truth: that their findings show air pollution is not dangerous.

Air pollution from smoky cooking-fires has probably been part of the human experience for something like a million years and we have adapted to it.  We just cough it up.


Association of Short-term Exposure to Air Pollution With Mortality in Older Adults

Key Points

Question:  What is the association between short-term exposure to air pollution below current air quality standards and all-cause mortality?

Finding:  In a case-crossover study of more than 22 million deaths, each 10-μg/m3 daily increase in fine particulate matter and 10–parts-per-billion daily increase in warm-season ozone exposures were associated with a statistically significant increase of 1.42 and 0.66 deaths per 1 million persons at risk per day, respectively.

Meaning:  Day-to-day changes in fine particulate matter and ozone exposures were significantly associated with higher risk of all-cause mortality at levels below current air quality standards, suggesting that those standards may need to be reevaluated.

Abstract

Importance:  The US Environmental Protection Agency is required to reexamine its National Ambient Air Quality Standards (NAAQS) every 5 years, but evidence of mortality risk is lacking at air pollution levels below the current daily NAAQS in unmonitored areas and for sensitive subgroups.

Objective:  To estimate the association between short-term exposures to ambient fine particulate matter (PM2.5) and ozone, and at levels below the current daily NAAQS, and mortality in the continental United States.

Design, Setting, and Participants:  Case-crossover design and conditional logistic regression to estimate the association between short-term exposures to PM2.5 and ozone (mean of daily exposure on the same day of death and 1 day prior) and mortality in 2-pollutant models. The study included the entire Medicare population from January 1, 2000, to December 31, 2012, residing in 39 182 zip codes.

Exposures:  Daily PM2.5 and ozone levels in a 1-km × 1-km grid were estimated using published and validated air pollution prediction models based on land use, chemical transport modeling, and satellite remote sensing data. From these gridded exposures, daily exposures were calculated for every zip code in the United States. Warm-season ozone was defined as ozone levels for the months April to September of each year.

Main Outcomes and Measures:  All-cause mortality in the entire Medicare population from 2000 to 2012.

Results:  During the study period, there were 22 433 862 case days and 76 143 209 control days. Of all case and control days, 93.6% had PM2.5 levels below 25 μg/m3, during which 95.2% of deaths occurred (21 353 817 of 22 433 862), and 91.1% of days had ozone levels below 60 parts per billion, during which 93.4% of deaths occurred (20 955 387 of 22 433 862). The baseline daily mortality rates were 137.33 and 129.44 (per 1 million persons at risk per day) for the entire year and for the warm season, respectively. Each short-term increase of 10 μg/m3 in PM2.5 (adjusted by ozone) and 10 parts per billion in warm-season ozone (adjusted by PM2.5) was statistically significantly associated with a relative increase of 1.05% (95% CI, 0.95%-1.15%) and 0.51% (95% CI, 0.41%-0.61%) in daily mortality rate, respectively. Absolute risk differences in daily mortality rate were 1.42 (95% CI, 1.29-1.56) and 0.66 (95% CI, 0.53-0.78) per 1 million persons at risk per day. There was no evidence of a threshold in the exposure-response relationship.

Conclusions and Relevance:  In the US Medicare population from 2000 to 2012, short-term exposures to PM2.5 and warm-season ozone were significantly associated with increased risk of mortality. This risk occurred at levels below current national air quality standards, suggesting that these standards may need to be reevaluated.

SOURCE





Putting U.S. Energy Production in Perspective

"The increase in oil and gas production is equal to seven times the energy output of all domestic solar and wind." 

As we previously reported, oil production in the U.S. is truly something to behold. It doesn’t get much attention, but in February the Energy Information Administration calculated that more than 10 million barrels of oil are produced every single day in the U.S. — a five-decade high. However, you might be wondering how oil and natural gas production together fare in comparison to some of the Left’s coveted renewable energy projects. In a recent op-ed, the Manhattan Institute’s Robert Bryce provides the fascinating answer:

Over the past decade, merely the increase — I repeat, just the increase — in US oil and gas production is equal to seven times the total energy production of every wind turbine and solar project in the United States. … In 2008, US oil production was about 5.2 million barrels per day. Today, it’s about 10.2 million barrels per day. In 2008, domestic gas production averaged about 55.1 billion cubic feet per day. Today, it’s about 87.6 billion cubic feet per day. That’s an increase of about 32.5 billion cubic feet per day, which is equivalent to about 5.5 million barrels of oil per day. Thus, over the past decade, US oil and gas output has jumped by about 10.5 million barrels of oil equivalent per day.

Let’s compare that to domestic solar and wind production, which since 2008 have increased by 4,800 percent and 450 percent, respectively. While those percentage increases are impressive, the total energy produced from those sources remains small when compared to oil and gas. In 2017, according to the Energy Information Administration, US solar production totaled about 77 terawatt-hours and wind production totaled about 254 terawatt-hours, for a combined total of 331 terawatt-hours. That’s the equivalent of about 1.5 million barrels of oil per day. Simple division (10.5 divided by 1.5) shows that since 2008, the increase in energy production from oil and gas is equal to seven times the energy output of all domestic solar and wind.

That’s an incredible statistic. Consider just how many billions of taxpayer dollars have been thrown at renewable energy projects, and then compare the relatively lackluster results with what the free market has accomplished on its own. As we opined in February, America’s robust energy production is emblematic of the positive developments that occur when onerous regulations are repealed and innovation takes hold. Of course, that doesn’t necessarily mean that prices at the pump will reflect U.S. production. In New York, for example, some drivers are facing $5/gallon gas.

To help explain some of this discrepancy, our own Michael Swartz recently wrote, “While it isn’t as much of a factor on the supply side, OPEC can still be a price driver. In this case, both Saudi Arabia and non-OPEC Russia have put aside their foreign policy differences and enforced an 18-month-long production cut between themselves — a slowdown that has eliminated the supply glut (and low prices) we enjoyed over the last few years. And since those two nations are the second- and third-largest producers of crude oil (trailing only the U.S.), their coalition significantly influences the market.”

But if anything, this should actually encourage the U.S. to pursue oil and gas extraction to an even greater degree. What energy companies can accomplish here in America has always been underappreciated, so providing a good environment for them to flourish further should be a high priority if our goal is genuine energy independence. The less we have to worry about what OPEC is doing behind the scenes, the better off consumers — and our national security — will be. Based on the numbers alone, wind and solar energy production aren’t going to get us there.

SOURCE





Out Of Sight, Out Of Mines

As a generalization, it’s safe to say that there are few things in this world more odious to an environmentalist than the mining of metals and minerals, unless those activities are conducted in an obscure, faraway place and the fruits of those activities bear the cool, sleek moniker of “clean.”

There’s a modern-day “Heart of Darkness” being perpetrated in the Democratic Republic of Congo, where tens of thousands of children as young as four are forced to haul rocks to the surface from mines dug by hand as part of a cobalt-mining operation, under conditions that would make Upton Sinclair, or, for that matter, Joseph Conrad, blush.

Last August, the Daily Mail printed an article describing these conditions, in which it also reported that each electric car requires an average of 15 kg (33 lbs) of cobalt in its batteries.

To give credit where due, according to Benchmark Minerals, Panasonic has enabled Tesla to reduce its cobalt consumption by 60% over the last six years by utilizing nickel-cobalt-aluminum (NCA) technology versus nickel-cobalt-manganese (NCM), which remains the standard for the electric vehicle (E.V.) industry.

Nevertheless, replacement technology for cobalt is still at least ten years out, and the projected “EV surge is far more significant than the reduction of cobalt intensity which is close to its limit[.] … [M]ore cobalt will be needed and the reliance on the Democratic Republic of Congo as the primary supplier [60% of worldwide production] will increase.”

On May 17, 2018, the Wall Street Journal reported that “prices of lithium and cobalt more than doubled from 2016 through last year, but the rally has cooled off recently amid worries about oversupply.”

The market responded in typical fashion by ramping up worldwide production (i.e., mining) of lithium and, to a lesser extent, cobalt.  Consumption levels of nickel, manganese, and aluminum are no doubt on the rise as well.

E.V.s and plug-in hybrids are eligible for federal tax credits up to $7,500, depending upon the battery capacity, and most E.V.s are eligible for the maximum amount.  Some states offer additional subsidies.  Colorado is the most generous.  This from Complete Colorado:

Currently those with EV or AFV [Alternative Fuel] vehicles receive up to $20,000 in Colorado income tax credits over and above the $7,500 the federal government already grants.  The credit is based on size and weight of vehicle.  Light passenger vehicles get $5,000, which, unlike most states and the federal credit, can be used as a rebate, and trucks get $7,000-$20,000.

As of 4/18/2018, a bill to repeal this electric vehicle subsidy (S.B. 18-047) was postponed indefinitely by the Colorado House Committee on Transportation and Energy.

All such subsidies should be eliminated.  If we stopped subsidizing electric trucks and buses, for example, we would likely see more conversions of truck and bus fleets to compressed natural gas (CNG), which is cheaper, more efficient, and, I argue, more environmentally desirable than the electric alternative.

All are imperfect, but the market is not the insidious spawn of Darth Vader.  We’re better off if complex, dynamic solutions have to prove their worth by competing on many levels in the real world, as opposed to having a few masterminds (at the prodding, or shall we say incentivizing, of parties with vested interests) distort the field with edicts from above.

SOURCE




NY Dems’ Anti-Energy Policies Forced New Yorkers To Pay 46 Times More For Power

Natural gas prices in the New York City region skyrocketed in January, reaching roughly 46 times the 2017 average for the area, according to a Tuesday report from the Consumer Energy Alliance (CEA).

Although New York neighbors natural gas-rich Pennsylvania, its residents pay 44 percent more for energy than the national average. A lack of transportation infrastructure between the two states has effectively cut off New Yorkers from a large supply of fuel.

“Spot market prices in the New York City region jumped to a record high of $140.25 for natural gas, compared with an average natural gas spot market price for New York in 2017 of $3.08,” CEA found. “New Yorkers were subjected to prices that were $137 higher due to self-inflicted capacity constraints created by their own elected officials.”

Due largely to a lack of oil and gas infrastructure, much of New England was forced to rely on imported natural gas from Russia to keep neighborhoods heated over the winter.

The region sits near one of the largest natural gas deposits in the U.S., the Marcellus shale formation, which underlies parts of New York, Ohio, Pennsylvania and West Virginia.

Natural gas makes up a significant part of the energy mix in New York, despite the limits to infrastructure imposed by state officials. More than half of all New York residents heat their homes with natural gas, and the sector supports nearly 200,000 jobs in the state.

“This report highlights the often-overlooked benefits New York’s communities are receiving because of the U.S. energy revolution, enhanced infrastructure, and pipelines,” CEA Mid-Atlantic Director Mike Butler said in a statement.

“However, New York families, businesses, and households will not be able to realize the full potential of these benefits until natural gas plays a larger role in the state to offset intermittency issues and the physical realities of the state’s electric grid.”

SOURCE




Scott Pruitt’s Mission to Make EPA Operate More Efficiently

The Environmental Protection Agency recently announced the creation of an Office of Continuous Improvement to implement a lean management system. It’s part of Administrator Scott Pruitt’s effort to make the EPA—a government agency known for its expansive reach—work more efficiently on behalf of American taxpayers.

EPA Chief of Operations Henry Darwin spoke exclusively to The Daily Signal about the new office and the work that its director, Serena McIlwain, would be doing. A lightly edited transcript of the interview is below.

Rob Bluey: Administrator Scott Pruitt recently announced a new Office of Continuous Improvement at the EPA. Can you tell us what it’s going to do and why it matters?

Henry Darwin: The Office of Continuous Improvement is a group of EPA staff that will be helping me, as the chief of operations, deploy a new management system based upon lean principles. Initially, the vast majority of their time will be spent on deploying the new system, but over time more of it will be spent on problem solving and process improvements as we identify opportunities under the new management system.

Bluey: Let’s take a step back. What is lean management and how exactly are you applying it at EPA?

Darwin: Lean management is a system that is specifically designed to help identify opportunities for improvement and then to monitor performance to see whether or not there are additional opportunities for improvement. And also to make sure that, as we make improvements, they’re sustained over time.

Typically what happens with most lean organizations, or organizations that say they’re lean, is that they do a series of projects that result in theoretical process improvements. Without a system specifically designed to make sure those processes do in fact improve, and measurement in place to confirm it, those projects are often not as successful as they would otherwise have been with a system in place to support them.

Bluey: So how would you say that this is making the EPA operate more efficiently?

Darwin: EPA has a long history of using lean to improve processes. What it was lacking, and what we’re trying to implement for the first time, is a system that helps us identify strategic opportunities for us to use lean to improve our processes.

So whereas the previous administration merely asked or required the programs to perform lean events, we’re actually setting very strategic goals and objectives with high targets and we’re asking the programs and regional offices to meet those targets using lean. And then through the management system, we’re monitoring whether or not those improvements are actually occurring.

If they’re not, then we now have a group of people, the Office of Continuous Improvement, who can come in and analyze why those improvements aren’t happening, or whether additional lean process improvements are needed to get them to where the administrator’s goals and objectives for the agency say they should be.

Bluey: Under the Trump administration, you’ve made it a priority to track permitting, meeting legal deadlines, and correcting environmental violations. What did you find when you first took the job and how have things changed since then?

Darwin: The EPA has a history of measuring very long-term outcomes—outcomes that aren’t measurable on a regular basis. What it had failed to do, and what we’re starting to do, is measure the things we can measure on a more frequent basis: the things that are important to our customers.

Now, there are a lot of people out there that suggest we shouldn’t be calling those who we regulate our customers, but I’m not one of them. I believe that we do and should recognize our regulated community as our customers so we can apply business-related principles to our work.

With that said, we always have to remember that we have investors. Just like businesses have investors, we have investors. And our investors expect a return on their investment, which is clean air, clean land, clean water, and safe chemicals.

We always have to be mindful of the fact that even though we want to be paying attention to our customers’ needs as they get permits or licenses, or as we’re working with them to achieve compliance, we also have to remember that our taxpayer investors expect a return on their investment. So we also have to be measuring those mission-related outcomes: clean air, clean land, clean water, and safe chemicals.

Bluey: Let’s take permitting, for example. I know it’s something that Administrator Pruitt has talked about. He says that he wants to get permitting down to a certain period of time because in past administrations there was an indefinite period where people just didn’t get an answer, a yes or a no. He wants you to be able to say yes or no. What are some of the goals you’re trying to achieve with regard to permitting specifically?

Darwin: When we arrived here in this administration, what we found was that we had heard anecdotally, from our customers, that the permitting process was simply taking too long.

What we also found was that the EPA did not have a system for tracking the amount of time it took to issue permits. So we simply went to the programs that issue permits and asked them: for the last six months, how long was it taking us to issue permits? And what we found was fairly surprising: in some areas it was taking as long as three years to issue permits, which is simply unacceptable.

In conversations with the administrator about what a reasonable initial target or goal would be, we agreed, and he set the goal: issuing permits within six months. So that’s our goal.

For every permit that’s directly issued by EPA, our goal is to reduce the amount of time it takes from whatever it is right now, which could be upwards of three years, down to six months.

Henry Darwin, the EPA’s chief of operations, discusses the agency’s lean management system at the announcement of the Office of Continuous Improvement on May 14. (Photo courtesy of EPA)
Bluey: In an interview with The Daily Signal, Administrator Pruitt spoke about what you’re doing as the Darwin Effect, named after you. How did you come to embrace these management principles in your line of work?

Darwin: I’m a lifelong environmental professional. I have 18 years of experience working for a state environmental agency. I became the director of that state environmental agency in Arizona about seven years ago and was the director there for five years. Over the course of my experience there, I found an appreciation for lean and a system that could support lean efforts.

In my agency, we were able to reduce permitting timeframes on the order of 70 percent, 80 percent, and in some instances 90 percent using lean principles, supported by a lean management system. After that experience, I was asked by the governor of Arizona, Gov. Doug Ducey, to do the same lean management system deployment for the entire state.

Over the course of two years, I was in the process of deploying a lean management system in 35 state agencies with 35,000 employees and we were seeing the same types of results. They continue to see those same types of results in Arizona using the same business processes and principles.

Bluey: Like Administrator Pruitt, who was prior to his appointment the attorney general of Oklahoma, you come from state government as well. How would you say that experience, both working in the environmental field and then working as Arizona’s chief of operations, prepared you for the job that you’re doing today?

Darwin: I hope that it did prepare me. But there are some significant differences between state government and the federal government. The federal government, rightly or wrongly, is a much bigger bureaucracy. So the efforts we had been undertaking at the state level, although not impossible, are more difficult now that we’re doing this work at the federal level. But with that said, it’s more rewarding.

The zone of influence, or the impact that we are making here, benefits not just a single state but the entire country. So even though it may be more difficult, it’s more rewarding. And I can’t think of a better place to be right now.

Bluey: Can you talk about the reaction to the Office of Continuous Improvement within the agency? And also the lean management system.

Darwin: I’m very fortunate in the fact that before I arrived there was a pretty strong appreciation for what lean could be. With that said, EPA had not found a way of making lean all it could be.

I received a lot of support internally for this idea of bringing a system to EPA that could be used to realize, and bring to life, a lot of those improvement ideas that had been identified under previous administrations.

This is as much about carrying forward the work that had been done previously and bringing discipline to actually executing on the plans and the improvements that had been identified but not necessarily followed through on by previous administrations. It has received a lot of positive feedback, and a lot of positive energy has built around the work that we’re here to do.

Bluey: Who are you going to have directing the new office?

Darwin: The director of the Office of Continuous Improvement is a woman named Serena McIlwain. She comes most recently from Region 9, based in San Francisco. She has a lot of experience, not only at EPA but also elsewhere in the federal government. So she can help me navigate some of those bureaucracies.

She was the person at EPA who was probably the biggest proponent of lean. She was actually teaching lean tools and principles from Region 9 to the entire agency. She’s been a fantastic fit so far and I know that she’s going to do a great job.

Serena McIlwain, the EPA’s director of the Office of Continuous Improvement, explains her new role. (Photo courtesy of EPA)
Bluey: As a conservative, I have to ask this because any time government is creating a new office, you might have Americans out there who are skeptical and believe in smaller government. What’s your message to those who say, “How is this going to improve performance and not create more bureaucracy?”

Darwin: As a conservative myself, I would share those concerns or sentiments. What I will say is that even though this is a new office, this is not new employees.

We have not grown the size of the EPA. Those who are performing this work were already EPA employees. We have pulled these staff members and managers from within the agency, so we’re just redirecting them to what I believe to be higher value or more value-added work.

Instead of focusing their efforts on performing lean projects that had questionable or limited results, we’re focusing them on areas where we actually will see results. So they’re actually providing higher value, not only to the EPA but our taxpayer investors.

Bluey: And finally—I’ve posed this question to Administrator Pruitt as well—how do you ensure that the changes you’re making at EPA today last many, many years into the future?

Darwin: A lot of it is institutionalizing the work that I’m doing. And not to get too technical, but there are methods and means by which we can institutionalize the work.

It’s really connected back to your question about pulling people from within the agency. We’re not bringing in a bunch of new people, and we’re not bringing in a bunch of consultants to do this work. We’re trying to learn from within EPA. We’re trying to use career staff who have a lot of experience and a lot of influence at EPA to manage the office, to lead the office, and also to staff the office.

Because we want them to believe in the new system, we want them to carry this forward beyond our existence here.

Bluey: Henry, thanks so much for taking the time to speak to The Daily Signal.

Darwin: Thank you.

SOURCE

***************************************


*****************************************

Thursday, May 24, 2018



Groundbreaking assessment of all life on Earth reveals humanity’s surprisingly tiny part in it as well as our disproportionate impact

This is a very silly article, replete with implicit but unargued assumptions  -- such as the claim that "we" are in some way responsible for making good -- or at least apologizing for -- all the damage that all humans throughout history have ever done.

From trilobites to the dinosaurs, extinctions are what nature does.  Of all species that have existed on Earth, 99.9 percent are now extinct. And of all the extinctions that ever happened, most by far happened long before human beings were on the scene. Humans were NOT responsible for the extinction of the dinosaurs, for instance.

And even in the human era, modern sensitivities were virtually unknown.  The megafauna of Australia were extinguished by Australian Aborigines, for instance.  I feel no guilt over that. Primitive people are often hard on the environment (pace the fictional speech attributed to Chief Seattle) but how am I responsible for that? It's a basic principle of natural justice that I am not to blame for the deeds of others.

Nonetheless, I am enough of a modern man to feel some regret about some recent extinctions (passenger pigeons, anyone?).  But should I?  That leads us into very rarefied areas of moral philosophy, not all of it congenial.  Peter Singer, for instance, is an eminence in that field and his cogitations lead him to some very objectionable conclusions, like the permissibility of infanticide.

So feeling that recent extinctions are bad is just that: feelings. A more intellectual justification for concern awaits.  Excerpts only below:


Humankind is revealed as simultaneously insignificant and utterly dominant in the grand scheme of life on Earth by a groundbreaking new assessment of all life on the planet.

The world’s 7.6 billion people represent just 0.01% of all living things, according to the study. Yet since the dawn of civilisation, humanity has caused the loss of 83% of all wild mammals and half of plants, while livestock kept by humans abounds.

The transformation of the planet by human activity has led scientists to the brink of declaring a new geological era – the Anthropocene. One suggested marker for this change is the bones of the domestic chicken, now ubiquitous across the globe.

The new work reveals that farmed poultry today makes up 70% of all birds on the planet, with just 30% being wild. The picture is even more stark for mammals – 60% of all mammals on Earth are livestock, mostly cattle and pigs, 36% are human and just 4% are wild animals.

The destruction of wild habitat for farming, logging and development has resulted in the start of what many scientists consider the sixth mass extinction of life to occur in the Earth’s four-billion-year history. About half the Earth’s animals are thought to have been lost in the last 50 years.

But comparison of the new estimates with those for the time before humans became farmers and before the industrial revolution began reveals the full extent of the huge decline. Just one-sixth of wild mammals, from mice to elephants, remain, surprising even the scientists. Three centuries of whaling has left just a fifth of marine mammals in the oceans.

The researchers calculated the biomass estimates using data from hundreds of studies, which often used modern techniques, such as satellite remote sensing that can scan great areas, and gene sequencing that can unravel the myriad organisms in the microscopic world.

The researchers acknowledge that substantial uncertainties remain in particular estimates, especially for bacteria deep underground, but say the work presents a useful overview.

SOURCE




Ordinary British motorists face being priced out of driving if the Government goes ahead with proposals demanding that by 2040 every car can cover 50 miles on electric power.

The warning came from Toyota to the business select committee as it heard from car chiefs about the future of electric vehicles.

A leaked government consultation called “Road to Zero” proposes the 50-mile zero emission requirement for cars in 22 years’ time.

However, Toyota Motor Europe managing director Tony Walker warned such a measure could put driving beyond the budgets of most people, saying that batteries capable of hitting the 50-mile requirement are too expensive.

“The point is that every car, from the biggest to smallest, whether it costs £10,000 or £250,000, for every car to be able to do 50 miles [on electricity] is not wise, it is reckless,” said Mr Walker. “It will price the ordinary customer out of the market.”

Toyota introduced hybrid cars to the mass market with its Prius. However, Mr Walker said the company’s current hybrids are not capable of doing 50 miles on the batteries currently used, and could be wiped out by the proposal.

To meet the target he said a more expensive battery used in plug-in hybrid cars would be required - and the economics do not stack up.

“It would make the hybrid vehicles we make in the UK currently unsaleable in the UK,” he said, adding that it would be “very difficult” to keep building cars and engines in Britain if government policy had made them impossible to sell here.

Mr Walker also questioned the government arbitrarily picking dates for targets the industry must achieve.

“Are you saying somehow you know battery costs will come down?” he asked the committee. “How come you know that and we don’t? It’s too academic and not so practical on battery cost.”

Professor David Bailey, a car industry expert at Aston University, warned that the 50-mile requirement could kill off hybrids. “Electric cars are still expensive and will remain so for at least the next 10 years, when they will start to compete with petrol and diesel,” he said. “If Government is serious about improving air quality it needs to get drivers into hybrid cars as an interim measure rather than kill them off early.”

MPs on the committee were told a “technology neutral” approach should be implemented, where Government sets targets to improve air quality for the car industry to achieve and lets carmakers find ways to achieve them, without prescribing how they should do it.

The approach was backed by BMW, with Ian Robertson, the company’s UK representative, warning consumers are “sitting on the sidelines”, with many of them afraid to buy an electric car because of uncertainty about it.

He pointed to research saying that 90pc of drivers do short trips which are easily manageable on current electric technology.  Mr Robertson called the 50-mile zero emission target for 2040 “probably achievable” for a majority of drivers.

However, he added: “But it’s against a backdrop of some customers who need to do a 500-mile drive.

“Rather than take a certain technology step that says we will have no combustion engines, let’s go for the target which we all agree on, which is having a very low emission target and, for the majority of customers, a zero emission target.”

Nissan, which builds the electric Leaf car at its giant Sunderland plant, said that the biggest challenge to battery vehicles was infrastructure. Gareth Dunsmore, the company’s electric vehicles director, said there was a “chicken and egg situation” with electric cars at the moment with people worried about whether they will be able to charge them.

However, with a comprehensive recharging infrastructure he said the 2040 target would be achievable. “If you get to wherever you park and there’s charging points the discussion about whether it is a challenge to move to electrification might be a moot point,” Mr Dunsmore said. “Customer demand could take over.”

The Government said it was “categorically untrue” it plans to ban vehicles incapable of meeting the 50-mile zero-emission target, saying the Road to Zero strategy was “yet to be finalised” and that it “would not comment on leaked draft documents”.

SOURCE




Yes, a Drop in Global Temperatures, But...

A recent editorial from a typically reputable conservative publication touted a sharp global temperature drop between February 2016 and February 2018. There's no disputing the fact that records indicate a temperature drop on a global scale during that time. However, the editorial — the purpose of which was to lambast the nefarious mainstream media for purposefully ignoring this inconvenient fact — did not explain fully what is actually happening. Here's why:

Global temperature drops are typical after El Niños, which naturally warm the globe. La Niñas do the opposite. Moreover, the last El Niño (2015-16) was exceptionally strong — a.k.a. a Super Niño. Therefore, it only makes sense that a significant global temperature drop would follow, as we've just registered back-to-back La Niña years. Even so, global temperatures remain above normal two years after the last El Niño, and above where they were two years after other El Niños. A higher baseline is seemingly being established following each Super Niño. Before 2015-16, the last one occurred in 1997-98. A pause ensued, but that baseline was higher than the average temperature over the previous 20 years. This doesn't prove ecofascists' rationale (not to mention justify their "solutions") for global warming, but it also doesn't do conservatives any favors to borrow from the Left's playbook by cherry-picking statistics simply for the benefit of aiding a narrative.

The cause may have to do with the immense amounts of water vapor that are released during Super Niños. These events result in a longer-lasting effect in Arctic regions years later, as tiny amounts of water vapor most affect temperatures where they are climatologically lowest and the air is driest. A glance at temperatures where most life occurs reveals that temperatures have returned to near normal, but the warmer polar regions are leading to the above-normal global temperatures, primarily during the coldest times of the year. It's still very cold, but even slight warming there can skew the entire global temperature a few tenths of a degree higher. Regardless, global temperatures are still higher than they were after previous El Niños.

To its credit, the editorial in question does note that the latest temperature drop could be entirely meaningless. However, a little basic research would have precluded the editorial altogether. As meteorologist Joe Bastardi said in an email to The Patriot Post, "This is like saying you scored a couple of touchdowns and instead of being down 56-0 you are now losing only 56-14." He added, "The point is that until temperatures return to or below normal, what is all the hoopla about? The other side can rightly point to a two-year post-El Niño record high." The mainstream media should be criticized for how it handles reporting global warming. Conservative media should avoid falling into the same trap.

SOURCE





Now's the Time to Restore Integrity to EPA Regulatory Science

For decades the federal Environmental Protection Agency (EPA) has gotten away with creating regulations that lack sound scientific basis, costing Americans hundreds of billions of dollars without solid evidence that those costs were justified.

It’s done this in two ways.

Sometimes it has simply thrown out scientific results and regulated to satisfy a political pressure group. That was largely the case when in 1972, contrary to its own scientific findings but under heavy pressure from environmentalists, it banned the use of DDT, the most effective, least expensive safe pesticide for controlling or eradicating disease-carrying insects like mosquitoes and lice.

The U.S. had already largely eliminated malaria by widespread spraying of DDT from the 1940s into the 1960s, so the ban didn’t have immediate, large-scale negative consequences here. But it has made it more difficult to combat the recent spread of other insect-borne diseases like West Nile Virus, Zika, Lyme, and spotted fever, and even malaria is making a comeback.

The greater impact of the DDT ban has been in developing countries. The EPA persuaded other federal agencies to withhold foreign aid from countries that used DDT. Most developing countries complied. The result has been hundreds of millions of cases of malaria every year and tens of millions of malaria-caused deaths over the last 45 years.

At other times the EPA has built new regulations on “secret science” — studies whose authors refuse to grant other scientists access to the data, computer code, and methodology behind them. Such studies are not subject to replication by other scientists. Yet replication is the acid test of scientific research.

“Secret science” has been especially common as the basis for pollution regulation dependent on dose/response relationships and for regulation related to anthropogenic global warming (AGW).

Last month EPA Administrator Scott Pruitt requested public comment on a new rule, “Strengthening Transparency in Regulatory Science” (STRS), designed to solve that problem.

STRS provides that “When promulgating significant regulatory actions, the Agency shall ensure that dose response data and models underlying pivotal regulatory science are publicly available in a manner sufficient for independent validation.” It codifies what was intended in the Secret Science Reform Act of 2015 and the Honest and Open New EPA Science Treatment Act of 2017 (HONEST Act), both of which passed the House but never came up for a vote in the Senate.

The Cornwall Alliance for the Stewardship of Creation — a network of scientists, economists, and religious leaders dedicated to environmental stewardship and economic development for the poor — has issued and is gathering signatures to an open letter supporting the STRS that calls the proposed rule “badly needed to assure American taxpayers that the EPA is truly acting in their best interests.”

Opponents of STRS raise three common, and at first sight credible, objections.

The first is that peer review ensures the quality of studies published in refereed journals. But there is actually no empirical evidence that peer review works well. Drummond Rennie, deputy editor of the Journal of the American Medical Association and intellectual father of the international congresses of peer review held quadrennially starting in 1989, has said, “If peer review was a drug it would never be allowed on the market.” In fact, as John P.A. Ioannidis argued in a celebrated article in PLoS Medicine, most published research findings are false.

The second common objection is that the rule would prevent the EPA from using studies that involved confidential information, such as personal health data or corporate proprietary information. In an open letter to Pruitt, the leftist, political-activist Union of Concerned Scientists (UCS) argued, “There are multiple valid reasons why requiring the release of all data does not improve scientific integrity and could actually compromise research, including intellectual property, proprietary, and privacy concerns.”

Yet Section 30.5 of the rule expressly states: “Where the Agency is making data or models publicly available, it shall do so in a fashion that is consistent with law, protects privacy, confidentiality, confidential business information, and is sensitive to national and homeland security.” Section 30.9 allows the administrator to make exceptions when compliance isn’t feasible.

A third common objection, also expressed in the UCS letter, is that “many public health studies cannot be replicated, as doing so would require intentionally and unethically exposing people and the environment to harmful contaminants or recreating one-time events (such as the Deepwater Horizon oil spill).” But what need to be replicable in studies of such events are not the events themselves but the procedures used to collect and analyze data and make inferences from them.

Consider, for example, a study that used tree rings as proxy temperature measurements and purported to find that neither the Medieval Warm Period nor the Little Ice Age had occurred but that a rapid and historically unprecedented warming had begun in the late 19th century. The study became iconic for claims of dangerous AGW driven by human emissions of carbon dioxide.

No one needed to use a time machine to return to the 11th through 20th centuries and regrow trees to recognize that the authors had committed a confirmation fallacy by excluding certain data and had misused a statistical procedure, producing false results. All anyone needed was access to the raw data and the computer code used to analyze them.

Yet the lead author’s long refusal to allow access to raw data and computer code delayed discovery of these errors for years, during which the Intergovernmental Panel on Climate Change, the public, and governments all over the world were led to believe its claims and formulate expensive policies based partly on them.

The UCS letter asserted that concerns about transparency and certainty raised by supporters of the rule “are phony issues that weaponize ‘transparency’ to facilitate political interference in science-based decision making, rather than genuinely address either.” But the irreproducibility crisis is real, not phony. Furthermore, enhanced transparency works against politicization, not for it. This objection is so patently invalid as to suggest that those who offer it are themselves weaponizing confidentiality to facilitate their own political interference in science-based decision making.

STRS will improve, not harm, the EPA’s mission to protect Americans from real environmental risks. It will also reduce the risks caused by unjustified but costly regulations. It should be adopted.

Via email from E. Calvin Beisner




Seventh Largest River Still Covered In Thick Ice. Blame Your SUV

News from Siberia

By this time of year, boats are usually plying the ice-free Ob, but in 2018 the river's winter covering, though it has begun to move like a giant monster, has not cleared.

Far from it. Here the thick ice is slowly drifting downstream in a northerly direction towards the Arctic, yet with temperatures still at an unseasonable -5C, this could go on a while.

As our remarkable videos show, this is an awesome and eerie sight, magnetic to those lucky enough to be in the vicinity.

In Surgut, people come here before work just to glimpse the natural wonder and listen to the gentle creaking and cracking of the shifting ice. Then they come back again after work.

‘Sometimes it sounds like a rustle,’ said Anya, an enthusiastic Ob-watcher. ‘Then you hear a rumble as the ice breaks. Often it is a calm silence.’

At one moment this week, ice from the Ob literally broke out of the river from the sheer force of this natural annual pilgrimage in the direction of the end of the world (the Ob flows up the eastern coast of Yamal, a gas-rich peninsula whose name literally means ‘end of the world’).

It smashed the railings in Surgut with its phenomenal power. It was as if a column of ice was making an escape bid from the mighty Ob, intent on invading this famous oil city. Or as a local newspaper put it: ‘If you fail to go and watch the Ob River, the Ob will come to you!’

Not all cities on Siberia’s major rivers are so lucky with such sights. Upstream on the Ob, Novosibirsk  – the largest metropolis in Russia between Moscow and the Pacific – does not get such spectacular scenes because of a dam which tempers the water so that, although it freezes, the ice is not as thick as elsewhere.

Similarly, in Krasnoyarsk, the impact of a hydro-electric plant and huge dam on the Yenisei River – the world’s fifth longest including its tributaries – acts to take the chill out of the water. So much so it doesn’t even freeze.

The ice is closely monitored by the authorities because if it gets trapped and clogs the rivers as the melt starts and floes move downstream, a frozen dam is formed. Then flooding hits riverside settlements.

Explosives are regularly used to avoid such circumstances, but as we have seen this week on the Lena River – the planet’s 11th longest – even 17 tons of TNT is not enough.

For now, though, the folks of Surgut astride the Ob are the lucky ones.

As Eldar Zagirov, a business coach, marveled: ‘We’re in the age of new technologies – super-fast internet, Instagram, and much more. And yet the inhabitants of Surgut every day after work are rushing to the embankment, families with kids, couples, just not to miss the great ice drift. It’s like the first inhabitants of these places many centuries ago.

‘And there is something here. Something primordial, real, strong and sincere …’

SOURCE

***************************************


*****************************************