DU Home » Latest Threads » NNadir » Journal


Profile Information

Gender: Male
Current location: New Jersey
Member since: 2002
Number of posts: 22,547

Journal Archives

Statistical methods for the eternal monitoring of carbon dioxide waste dumps.

Right now the world's largest carbon dioxide dump, pretty much to the exclusion of all others, is the planetary atmosphere. Humanity's failure to address climate change is made obvious by the increase in carbon dioxide concentrations in this dump: according to preliminary figures from the Mauna Loa carbon dioxide observatory, the increase for the first time exceeded 3 ppm in a single year, setting an all-time record.

Obviously, all strategies to address the issue have failed miserably. We are now drilling more gas, drilling more oil, and mining more coal than ever before; the politically popular strategies for addressing the issue have accomplished essentially nothing.

One often discussed approach to dealing with the dangerous fossil fuel waste carbon dioxide is to "sequester" it in abandoned oil and gas fields after all of the dangerous fossil fuels in them have been mined and burned. Each year the amount of carbon dioxide dumped into the atmosphere is more than 30 billion tons; in 2012, according to the table on page 93 of the 2014 World Energy Outlook report, the world emissions were 31.6 billion tons. Undoubtedly the figures for 2013, 2014, and 2015 were significantly worse.

Twelve years ago, two Princeton University faculty members published an overly optimistic and much discussed paper about "stabilization wedges," the famous Pacala and Socolow paper. (Science 13 Aug 2004: Vol. 305, Issue 5686, pp. 968-972)

One may note that many of the "suggestions" in this paper describing technologies that were allegedly "already available" in 2004 are extremely dubious, two obvious examples being the substitution of wind energy and solar energy for coal: coal plants have capacity utilization factors of approximately 70 to 80 percent, whereas wind plants are lucky to approach 40% and solar facilities 20%, and I'm probably being overly generous with both of those figures. If one shuts a coal plant down for the four hours when lots of solar energy is available near the summer solstice, for example, one will be required to waste huge amounts of energy, since coal boilers are not perfectly thermally isolated, and extra energy will be required to return the boilers to operational levels, much as a kettle on a stove requires significant heat before the water boils.
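For readers who like to check the arithmetic, here is a minimal sketch of what those capacity factors imply; the figures are the approximate ones quoted above, used here as assumptions for illustration, not measured plant data:

```python
# Rough annual energy delivered per nameplate GW, using the approximate
# capacity factors quoted in the text (assumptions, not measured data).
HOURS_PER_YEAR = 8766  # average year length in hours, including leap years

capacity_factors = {"coal": 0.75, "wind": 0.40, "solar": 0.20}

for source, cf in capacity_factors.items():
    gwh = 1.0 * cf * HOURS_PER_YEAR  # GWh per nameplate GW per year
    print(f"{source}: ~{gwh:,.0f} GWh/yr per nameplate GW")

# Nameplate capacity needed to match one 1-GW coal plant's annual output:
for source in ("wind", "solar"):
    ratio = capacity_factors["coal"] / capacity_factors[source]
    print(f"{source}: need ~{ratio:.1f} GW nameplate")
```

On these assumptions one would need nearly four nameplate gigawatts of solar to match the annual output of one gigawatt of coal capacity, before even considering the intermittency and boiler-cycling losses described above.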

(Solar and wind energy are therefore useless as alternatives to coal; and in fact, they are completely dependent on dangerous natural gas to exist at all, with all the fracking and other risks gas dependency requires.)

One of the "stabilization wedges" discussed was carbon dioxide capture and sequestration sites designed to collect and store dangerous fossil fuel waste when, um, the wind wasn't blowing and the sun wasn't shining. In Pacala and Socolow's paper, in table 1 on page 970 this is described as "building 3500 Sleipners."

A "Sleipner," in case one doesn't know, refers to a program proposed by the Norwegian dangerous fossil fuel company Statoil to put lipstick on its offshore oil and gas drilling pig by injecting carbon dioxide into the Sleipner oil field for "sequestration," which Statoil liked to imply was "eternal sequestration." After much hullabaloo, the Sleipner program was abandoned on the grounds that it was, um, "too expensive" compared to dumping carbon dioxide waste directly into the existing and "economic" dump, the planetary atmosphere.

The number of "Sleipners" built since 2004 is uncomfortably close to zero; nearly one hundred percent of all carbon dioxide injected into oil and gas fields today is designed for "EOR," the euphemistically named "enhanced oil recovery" scheme, where the plan is to drive even more dangerous fossil fuels out of the ground so the waste can be dumped in the atmosphere.

But one may ask: Suppose that there really were significant carbon dioxide waste dumps built on the scale that Socolow and Pacala suggested were "already available" in 2004, what then?

A recent paper in the scientific journal Environmental Science and Technology discusses some of the issues that are grotesquely ignored in what I regard as this "sweep it under the rug and let future generations worry about it" scheme: The possibility that these dumps for containing a dangerous gas might, um, leak.

The paper is here: Environ. Sci. Technol., 2015, 49 (2), pp 1215–1224

The title is: Quantifying the Benefit of Wellbore Leakage Potential Estimates for Prioritizing Long-Term MVA Well Sampling at a CO[sub]2[/sub] Storage Site.

Here's some of the introductory text from the paper:

In an effort to mitigate concentrations of carbon dioxide (CO2) in the atmosphere that are caused by stationary anthropogenic inputs, the United States Department of Energy (DOE) is pursuing carbon capture and sequestration (CCS) as one approach in a portfolio of greenhouse gas (GHG) reduction strategies. CCS involves (1) separating CO2 from an industrial process, (2) transporting the CO2 to a storage location, and (3) injecting and sequestering the CO2 in a geologic reservoir for long-term isolation from the atmosphere.1 Through the Carbon Sequestration Program, the DOE is working with seven Regional Carbon Sequestration Partnerships (RCSPs) to identify feasible sites within the U.S. and portions of Canada for large-scale (i.e., one million tons of CO2 or greater) CO2 geologic sequestration.2 The DOE is pursuing three primary types of geologic systems for long-term CO2 storage: (1) depleted oil and gas fields; (2) unconventional formations such as gas shales, coal seams, and basalts; and (3) saline formations.3

One of the potential risks associated with the injection and long-term storage of CO2 into geologic reservoirs is leakage of stored CO2 from geologic containment and into the near surface or surface environment. A potential leakage pathway in depleted oil and gas fields is associated with legacy exploration and production wells.4−6 These legacy wells provide a potential conduit through low-permeability cap rock formations that would otherwise act as a seal to retain CO2 in the storage reservoir. Extensive work has been conducted in Alberta, Canada over the past decade to assess the potential CO2 leakage risk of legacy wells by drawing inferences from well completion and abandonment information. This work has, in part, been performed as part of the DOE Regional Partnership Plains CO2 Reduction (PCOR) Partnership...

The paper then explores the "statistical power" of sampling a subset of drilled wells to determine the probability that more are leaking.

...A well leakage potential scoring approach like the one developed by Watson and Bachu8 provides a quantitative means for ranking the increased probability of CO2 leakage at specific wells because of SCVF and/or GM. Applying this scoring methodology to the legacy wells that are located within a particular region provides a screening-level risk assessment approach for identifying potential geologic CO2 storage sites: areas with a high incidence of high-ranking wells would represent locations that are not favorable to long-term geologic storage of CO2, while areas with a low incidence of high-ranking wells may be suitable for future CO2 injection and storage. In addition, once a geologic CO2 storage site has been identified, then such a well ranking approach also informs the monitoring, verification, and accounting (MVA) sampling plan for the site, as higher-ranking wells would take priority over lower-ranking wells...
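The flavor of such a "statistical power" argument can be sketched with a simple hypergeometric calculation. The well counts below are hypothetical numbers of my own choosing, and the paper's risk-ranked MVA approach is considerably more refined than this uniform random sampling, but the basic question is the same: if only a subset of legacy wells is sampled, how likely is a leaker to be caught?

```python
from math import comb

def p_detect(N: int, K: int, n: int) -> float:
    """Probability that a random sample of n wells out of N contains at
    least one of K leaking wells (hypergeometric: 1 minus the chance the
    sample misses all K leakers)."""
    return 1.0 - comb(N - K, n) / comb(N, n)

# Hypothetical field: 500 legacy wells, 5 of which actually leak.
for n in (10, 50, 100):
    print(f"sample {n} wells: P(find >=1 leaker) = {p_detect(500, 5, n):.2f}")
```

Even sampling a fifth of the wells leaves a substantial chance of missing every leaker, which is precisely why the paper argues for prioritizing the highest-ranked wells rather than sampling at random.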

There's no mention at all of what it might cost future generations to monitor these dumps for...um, um, um...eternity, but if that bothers you, don't worry, be happy: You can be reasonably assured that these dumps will not be built, not twenty "Sleipners," never mind 3500 of them. It's far more convenient and, um, "economic" to use the "traditional" dump, the planetary atmosphere.

Enjoy the remainder of your Sunday.

Nature Editorial Comment: India needs home-grown GM food to stop starvation.

The following text is excerpted from a "World View" comment in Nature, one of the world's highest impact scientific journals:

At the beginning of this month, Prime Minister Narendra Modi announced a road map to guide India’s science and technology over the next two decades. Launched during the Indian Science Congress at the University of Mysore, the plan signalled a cautious approach to techniques such as genetically modified (GM) crops, noting that “some aspects of biotechnology have posed serious legal and ethical problems in recent years”. That is true, but a different and much larger problem looms for India. According to the 2015 United Nations World Population Prospects report, India will surpass China by early next decade as the most populous country on Earth, with the most mouths to feed. India is already classed as having a ‘serious’ hunger problem, according to the 2015 Global Hunger Index of the International Food Policy Research Institute. There is a danger that many of these new Indians will not have sufficient food.

Where can additional food come from? Grain production is stagnant, and rapid urbanization is reducing available land. To increase food production, India needs to invest in modern agricultural methods, including GM crops.

Indian researchers have shown that they have the expertise to generate GM plants, most obviously the pest-resistant cotton that is now widely grown in India. But almost all of this work has relied on molecular-biology research done elsewhere...

...India should stop trying to build the Taj Mahal with borrowed bricks. We need a concerted effort at home to discover and manipulate relevant genes in indigenous organisms and crops (such as chickpea and rice). Indian microbial institutes should take up projects in this direction, because most of the currently used genes for transgenic generation are of microbial origin. That requires a change in direction from an Indian GM-food strategy that has traditionally aimed at quick product development instead of careful assessment of the underlying science.

Such home-grown GM crops would also reduce reliance on transgenic technology produced by multinational companies, which is expensive and rarely optimized for the conditions of specific regions. Some GM crops designed abroad need more water than is usually available in some parts of India, for example, putting great stress on farmers....

Full text (which may or may not be behind a firewall) is here: Nature 529, 439 (28 January 2016)

Enjoy the weekend!

The "Extreme Learning Machine."

I'm most definitely snowed in today, and am leafing through some issues of one of my favorite journals, Industrial & Engineering Chemistry Research, and I came across a cool paper about one of my favorite topics, ionic liquids, that discusses the "Extreme Learning Machine."

Ionic liquids are generally salts of cationic and anionic organic molecules which are liquids at or near room temperature. Because they are generally not volatile, they can eliminate some of the problems associated with other process solvents, specifically air pollution. Although the term "green solvent" is probably overutilized with respect to ionic liquids, their very interesting potential uses have led to a vast explosion of papers in the scientific literature concerning them. There are, to be sure, an almost infinite number of possible ionic liquids (and related liquids called "deep eutectics").

My own interest in these compounds is connected with my interest in the separation of fission products and actinides in the reprocessing of used nuclear fuels. I am also interested in their potential for the treatment of certain biological products, including lignins, a constituent of biomass quite different from cellulose that represents a sustainable route of access to aromatic molecules, and in their possible use as (in some cases) radiation resistant high temperature heat transfer fluids.

Anyway, about the "Extreme Learning Machine": The paper in question, written by scientists at the Beijing Key Laboratory of Ionic Liquids Clean Process, State Key Laboratory of Multiphase Complex Systems, Institute of Process Engineering, Chinese Academy of Sciences, Beijing 100190, China, that I've been reading is this one: Ind. Eng. Chem. Res., 2015, 54 (51), pp 12987–12992

The S[sub]σ‑profile[/sub] is a quantum mechanical factor describing the charge distribution of the surfaces of molecules and organic ions.

Here's the fascinating text:

As compared to the ANN algorithm, the extreme learning machine (ELM) is a relatively new algorithm which was first developed by Huang et al.[sup]23,24[/sup] It can effectively tend to reach a global optimum and only needs to learn a few parameters between the hidden layer and the output layer as compared with the traditional ANN and thus can be used to predict properties because of its excellent efficiency and generalization performance.[sup]25[/sup] However, to the best of our knowledge, the ELM has not yet been used for predicting the properties of ILs until now. Thus, we employed this relatively new ELM algorithm to predict the heat capacity of ILs in this work.

Reference 24 is: Huang, G.-B.; Zhu, Q.-Y.; Siew, C.-K. Extreme learning machine: Theory and applications. Neurocomputing 2006, 70, 489−501.

Hmm...the program needs to "learn" only a few parameters...
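For the curious, the "few parameters" remark can be made concrete: in an ELM the input-to-hidden weights are random and fixed forever, and the only training step is a single linear least-squares solve for the hidden-to-output weights. Here is a minimal sketch on a toy regression problem; this is not the authors' code, and the target function, layer size, and random seed are arbitrary choices of mine:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression target: y = sin(x), a stand-in for the kind of
# property-prediction task (e.g., heat capacity) an ELM is applied to.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel()

n_hidden = 50
# Step 1: input->hidden weights and biases are RANDOM and never trained.
W = rng.normal(size=(1, n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)  # hidden-layer activations

# Step 2: the only "learning" in an ELM: one least-squares solve for the
# hidden->output weights.
beta, *_ = np.linalg.lstsq(H, y, rcond=None)

y_hat = H @ beta
rmse = np.sqrt(np.mean((y - y_hat) ** 2))
print(f"training RMSE: {rmse:.4f}")
```

Compared with backpropagation training of a conventional ANN, there is no iterative optimization at all, which is the source of the efficiency claims in the quoted text.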

I always keep in the back of my mind Penrose's criticism of the concept of "artificial intelligence" (maybe because being a human being, I still want my species to be relevant) but I'm intrigued. Neurocomputing is a journal I've never accessed before, but when I can get out of here after this blizzard, I'm going to take a look at that paper which is apparently available at Princeton University's library.

I guess I'm a dork, but I find it all kind of cool...

2015 comes in as the worst year ever observed at the Mauna Loa CO2 observatory.

Data trumps theory, 100% of the time, always.

For decades, we have heard all kinds of stuff about how we would address climate change. We are not addressing it.

The preliminary data for 2015 is now in at Mauna Loa, and it's telling.

The preliminary data shows the increase in 2015 to be the first to exceed 3.00 ppm in a single year: 3.17 ppm

Before 2015, the worst year ever observed was 1998, at 2.93 ppm, a year marked by an unusual event: much of the Southeast Asian forest burned when fires set to clear land for "renewable energy" palm oil plantations (for German biodiesel) went out of control.
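For what it's worth, the year-over-year growth figure is essentially the difference of consecutive annual means (NOAA's official growth rate is computed somewhat differently, from de-seasonalized data). A sketch of the computation with hypothetical annual means, chosen only to illustrate the arithmetic, not taken from the observatory's records:

```python
# Hypothetical annual-mean CO2 concentrations in ppm (illustrative only;
# a real analysis would use NOAA's published Mauna Loa annual means).
annual_mean = {2012: 394.0, 2013: 396.5, 2014: 398.6, 2015: 401.8}

years = sorted(annual_mean)
growth = {y: round(annual_mean[y] - annual_mean[y - 1], 2)
          for y in years if y - 1 in annual_mean}
record_year = max(growth, key=growth.get)

print("year-over-year growth (ppm):", growth)
print("record year:", record_year)
```

The point of the exercise: a single year's concentration is less telling than the increments, and it is the increment that set the record in 2015.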

As for so called "renewable energy," which has been hyped to a point nearing insanity for roughly half a century, nothing, absolutely nothing, exposes its grotesque failure more clearly than this data. I repeat my long-standing statement that it is not actually renewable, inasmuch as it requires, owing to its low energy-to-mass ratio, the massive mining and refining of metals and other materials, many of which are highly toxic.

The last, best hope for humanity was one that has traditionally been the subject of much malign fear and ignorance from some of us on the left: nuclear energy. (It remains the only source of primary energy to have avoided 60 billion tons of the dumping of dangerous fossil fuel waste into the planetary atmosphere, equivalent to about two years' worth of said dumping.) It remains the world's largest source, by far, of climate change gas free energy, but it is expanding at only a trivial rate, with eight reactors having been shut in the worst CO[sub]2[/sub] year ever observed, and only 10 new reactors having come on line in that same year.
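The "two years' worth" equivalence is simple arithmetic against the 31.6 billion ton annual emissions figure for 2012 cited earlier in this journal:

```python
# Avoided cumulative CO2 emissions attributed to nuclear energy, expressed
# in years of current dumping, using the 2012 figure cited above.
avoided_gt = 60.0   # billion tons, cumulative avoided
annual_gt = 31.6    # billion tons per year (2012, World Energy Outlook 2014)

print(f"{avoided_gt / annual_gt:.1f} years of emissions")  # ~1.9 years
```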

World Starts Up 10, Shuts Down 8 nuclear reactors in 2015

We deserve what we are getting. Fear and ignorance, so dire in human history, have triumphed again. I would like to congratulate all of the anti-nukes here and elsewhere on their grand victory, even as I am prone to weep at what their "victory" means for the future of humanity and the world.

Enjoy the rest of the weekend.


1874, Dante Gabriel Rossetti (1828-1882) English.

At the Tate Museum, London.

Observing the Environment Degrades It: Antarctic Research Stations and Persistent Organic...


I spent the day off leafing electronically through some back issues of one of my favorite journals, Environmental Science and Technology, and came across a paper that caught my eye concerning the leaching of certain halogenated persistent organic pollutants from the McMurdo and Scott Research Stations in Antarctica.

A link to the article is here:

An Antarctic Research Station as a Source of Brominated and Perfluorinated Persistent Organic Pollutants to the Local Environment (Environ. Sci. Technol., 2015, 49 (1), pp 103–112)

A great deal has been written in the scientific literature in recent years about these classes of compounds, and no blog post could do the subject any justice, but the paper gives a nice brief overview of the issues for anyone unfamiliar with the risks these now ubiquitously distributed compounds entail. Quoting from the text:

Persistent organic pollutants (POPs) are typically anthropogenic chemicals and ubiquitous global contaminants. They share properties of persistence, toxicity, bioaccumulation potential and propensity for long-range environmental transport (LRET).1−3 As such, POPs are recognized as posing a threat to environmental and human health and are subject to the Stockholm Convention on POPs that aims to reduce, and ultimately eliminate, these compounds from the environment...

...Human activity in Polar regions, particularly the Antarctic, is undergoing rapid changes and is dramatically increasing.6,7 Easier access to both North and South Polar regions has resulted in enhanced research activity, as well as increasing tourism and marine resource exploration and extraction. Most Antarctic research bases are located in ice-free areas close to the coastline.8 These areas are also of great ecological significance. Because of this, and the fact that background POPs levels are generally relatively low, any consequent local contamination can have a disproportionately large effect on biota. Research bases have already been shown to be sources of PAHs and heavy metals along with Legacy POPs, such as PCBs.9−12...

...Alongside the increasing scale of human activity in Polar regions, the list of industrial and consumer chemicals that satisfy the classification criteria of a POP continues to grow. These factors result in an increased potential for POPs to be directly introduced to the local environment as fresh emissions from consumer products, including electronic equipment, textiles and furnishings, many of which contain POPs recently annexed under the Stockholm Convention. For example perfluorooctanesulfonic acid and its salts together with perfluorooctane sulfonyl fluoride have been added to Annex B (Restriction) and the penta- and octa-commercial mixtures of polybrominated diphenyl ethers (PBDEs) to Annex A (Elimination).

These fluorinated and brominated compounds have different physicochemical properties and hence different industrial and commercial uses. Perfluoroalkyl acids for example are generally manufactured as their salts15 such as perfluorooctanoate (PFOA) and perfluorooctanesulfonate (PFOS) that are amphiphilic with low volatility. Such compounds have been extensively used as waterproofing or wetting agents, in many nonstick or polytetrafluoroethylene (Teflon) containing products as well as in fire-fighting foam.15 They may also be formed in situ from degradation of volatile precursors such as fluorotelomer alcohols (FTOHs).16 FTOHs are used extensively as intermediates in the manufacture of poly- and perfluoroalkylated substances (PFASs), have been identified as residual compounds in consumer products such as stain repellents and other surfactants, and are known to have their own detrimental environmental and health effects.17,18 PBDEs on the other hand are hydrophobic and commonly found in fire retardant mixtures as well as building materials, electronics and textiles.19

The potential for PBDEs to be released from remote polar research stations, due to the relatively high density of electronic equipment and increased fire prevention concerns at these locations, has been recognized.20 PBDEs21−24 and PFASs such as PFOS25,26 may be released from consumer products as these products wear and degrade. Their contrasting physicochemical properties will influence subsequent distribution. As they are persistent they can accumulate in organisms with detrimental effects, including hepato-, immune-, and ontogenetic toxicity.

Dusts within the research station, as well as surrounding soil, some organisms such as lichen, and wastewater discharged from the stations, were analyzed via LC/MS/MS for representative species in these classes of compounds, and the results were quite disturbing. The lower limits of quantitation (LLOQs) for these compounds were relatively modest given the capabilities of modern mass spectrometers (the low nanogram per gram range; modern instrumentation can detect picograms per gram of many compounds of physiological import in various matrices), but in almost every case not much sensitivity was required. At McMurdo, indoor dust was found to contain 9,560 ng/g as a sum of the various polybrominated diphenyl ether flame retardants.

This concentration is almost 4 orders of magnitude higher than that found in the hair of Chinese electronic waste recycling workers (see Science of The Total Environment, Volume 397, Issues 1–3, 1 July 2008, Pages 46–57), where the distribution of these compounds is of high concern, the compounds being thought to be carcinogenic as well as neurotoxic.
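The "almost 4 orders of magnitude" language is easy to check, though the hair concentrations from the cited study are not reproduced here, so a reference value of 1 ng/g is assumed purely for illustration:

```python
import math

dust_ng_per_g = 9560.0  # sum of PBDEs in McMurdo indoor dust (from the paper)
hair_ng_per_g = 1.0     # hypothetical reference value, for illustration only

ratio = dust_ng_per_g / hair_ng_per_g
print(f"~{math.log10(ratio):.1f} orders of magnitude")  # prints "~4.0 orders of magnitude"
```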

This is, um, disturbing.

(For a description of possible mechanisms by which PBDEs act as carcinogens, toxins, and mutagens, see New Evidence for Toxicity of Polybrominated Diphenyl Ethers: DNA Adduct Formation from Quinone Metabolites (Environ. Sci. Technol. 2011, 45, 10720–10727))

Graphs in the original paper cited here show the gradients in soil sample concentrations of these compounds with increasing distance from the research stations.

The main sink (a very slow sink) for these compound classes is, in fact, radiation, typically UV radiation or higher energy radiation such as X-rays and gamma rays. For the deliberate destruction of these molecules with the lowest energy of these radiations, UV, catalytic amounts of titanium dioxide, either pure or doped, are often employed, but this catalyst is certainly not distributed over the Antarctic surface.
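Since these sinks operate by slow photolysis that is roughly first order in the pollutant, the arithmetic of persistence is worth a glance. The rate constant below is a hypothetical value for illustration only; real PBDE photolysis rates vary enormously with matrix and wavelength:

```python
import math

# First-order photodegradation: C(t) = C0 * exp(-k * t).
k = 0.01  # 1/day, a hypothetical rate constant for illustration
half_life = math.log(2) / k

print(f"half-life: {half_life:.0f} days")  # prints "half-life: 69 days"
```

Even at this generous hypothetical rate, a deposit takes months to halve; at environmentally realistic rates the persistence stretches to years or decades, which is why these compounds are classed as "persistent" in the first place.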

Because of the still prevalent ozone hole in Antarctica - 2011 was an unprecedented year (See Nature 478, 469–475 (27 October 2011)) - we may expect a slightly higher rate of degradation, although the most prominent PBDE is PBDE-209, which during its degradation can form any of the other, potentially even more toxic, PBDEs as degradants. This is small comfort.

It is notable that the stations in Antarctica are sources of many other questionable organic compounds, notably PAHs, polycyclic aromatic hydrocarbons, from leaks of dangerous fossil fuels transported to the stations.

One of the most famous laws in science, the Heisenberg uncertainty principle, is, broadly speaking, a statement that the attempt to observe the state of something (in the case of the principle, subatomic particles like electrons) changes it. Surprisingly, this law has a macroscopic correlate in environmental science.