Member since: 2002
Number of posts: 19,999
...over the same week of 2015 at the Mauna Loa observatory.
Up-to-date weekly average CO2 at Mauna Loa (Accessed September 18, 2016)
The reading for the week ending September 11, 2016 of 401.33 ppm is 4.03 ppm higher than the reading in the same week last year.
Before 2016, there were only 8 such readings among the 2,020 weekly readings in the entire history of the observatory's record going back to 1975.
They were the weeks ending September 6, 1998 (4.67 ppm), February 3, 2013 (4.54 ppm), September 27, 1998 (4.49 ppm), April 18, 2010 (4.38 ppm), August 16, 2010 (4.17 ppm), and May 6, 2012 and April 13, 2014, both at 4.01 ppm.
There are now 18 such readings, including one over 5.00 ppm, 5.04 ppm to be exact, on July 31 of this year. There was one other reading, 4.78 ppm, on June 6, 2016, that exceeded all previously recorded readings.
Readings in the 20th century of this type averaged 1.54 ppm. In the 21st century, that same average is 2.09 ppm. For 2016 it is 3.52 ppm.
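The comparison behind these figures is simple arithmetic on matched weeks of the NOAA series. A minimal sketch in Python; the prior-year value below is inferred from the quoted 4.03 ppm difference, not taken from the record:

```python
# Year-over-year comparison of weekly mean CO2 at matched weeks.
# The 2015 value below is inferred (401.33 - 4.03), not quoted from NOAA.
weekly_ppm = {
    "2015-09-13": 397.30,  # hypothetical prior-year weekly mean, ppm
    "2016-09-11": 401.33,  # weekly mean cited in the post, ppm
}

def yoy_increase(current_ppm, prior_ppm):
    """Increase in ppm over the same week one year earlier."""
    return round(current_ppm - prior_ppm, 2)

delta = yoy_increase(weekly_ppm["2016-09-11"], weekly_ppm["2015-09-13"])
print(delta)  # 4.03
```

Applied across the full weekly record, counting deltas above 4.00 ppm would reproduce the tallies discussed above.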
Since the destruction of the nuclear reactors, and thousands of buildings, at Fukushima, after which Japan shut its nuclear reactors to see if they were "safe" and replaced them with power generated by burning dangerous fossil fuels in plants that kill people 100% of the time they operate, and not just after huge tsunamis, the average for these figures is 2.37 ppm.
In the last ten years, the world has squandered, other people use the word "invested", approximately two trillion dollars on so called "renewable energy" mostly on wind and solar infrastructure that will all be landfill within the next 30 years.
It didn't work; it isn't working; and it won't work.
If any of this troubles you, don't worry, be happy.
The City of Los Angeles has announced that it will be 100% powered by so called "renewable energy" "by 2030."
We can add this to the tens of thousands of announced programs and forecasts of "renewable energy by such and such a date" going back over half a century, beginning with Amory Lovins's 1976 announcement that the United States would be producing 16 quads of solar energy "by 2000" (the entire planet as of 2016 doesn't produce 2 such quads), and extending right up to the present day.
Most such announcements choose a "by such and such a date" time point that will take place after the soothsayer offering it will be dead, but no matter.
All such predictions have been delusional. All are cases of people lying to themselves, something we are all happy to do, since it places the onus of action on our children and grandchildren to do what we have not done, and in fact, can't do, since so called "renewable energy" is, 1) not renewable, and 2) not sustainable, and 3) not realistic.
The world was previously powered by so called "renewable energy" up until the early 19th century, when so called "renewable energy" was abandoned because the bulk of the world's population, even more so than today, lived short, miserable lives of dire poverty.
But again, don't worry. Be happy.
It's the thought that counts, not results.
Have a wonderful week.
Posted by NNadir | Sun Sep 18, 2016, 05:57 PM (2 replies)
...in our lifetimes. As anyone familiar with my record here will know, I have been reporting on the disastrous climate events that have been taking place in 2016 where the carbon dioxide concentrations have been climbing at a record pace.
My last entry on this topic was here: July 31, 2016: Mauna Loa carbon dioxide levels 5.04 ppm higher than one year ago.
For the first time in recorded history, the weekly year to year comparisons at the Mauna Loa carbon dioxide observatory have exceeded a 5.00 ppm increase over levels a year ago.
On July 31, 2016, the concentration of carbon dioxide at Mauna Loa was 403.47 ppm; one year ago it was 398.43 ppm.
Mauna Loa Weekly Trends, accessed Aug 7, 2016
During summers in the Northern Hemisphere, carbon dioxide levels fall slightly from the peaks usually observed in April or May; the minimums usually occur in September. The 2016 maximum, observed during the week ending on April 10, 2016, was 408.31 ppm.
All of humanity's efforts to address climate change have failed. This includes all the rhetoric, charts and graphs about the "triumph" of so called "renewable energy," on which we bet, foolishly as it turns out, our planet's atmosphere.
Some folks in the primary scientific literature have commented on it as well.
The paper attached comes from the journal Nature Climate Change:
El Niño and a record CO2 rise (Richard A. Betts, Chris D. Jones, Jeff R. Knight, Ralph F. Keeling & John J. Kennedy, Nature Climate Change 6, 806–810 (2016))
The long-term rise in atmospheric CO2 concentration, approximately 2.1 ppm yr⁻¹ over the past decade, is caused by anthropogenic emissions arising from fossil fuel burning, deforestation and cement production [1,2]. The annual growth rate, however, varies considerably as a result of climate variability affecting the relative strength of land and ocean carbon sources and sinks. The annual growth rate measured at Mauna Loa, Hawaii [3,4] is correlated with the El Niño–Southern Oscillation (ENSO), with more rapid growth associated with El Niño events [5–9] through drying of tropical land regions and forest fires. To test the predictive value of this relationship, we present a forecast, made in October 2015, of the CO2 concentrations throughout 2016 based on the relationship, and verify against observations available so far. We predict the monthly mean CO2 concentration at Mauna Loa to remain above 400 ppm even in its annual minimum in September, which would not have been expected without the 2015–2016 El Niño...
The relationship between 1998 and 2016:
Historically there have been markedly large annual CO2 growth rates in El Niño years (Fig. 1), probably due to warming and drying of tropical land areas resulting in reduced carbon uptake by vegetation growth, increased carbon release by fire and drought-induced tree mortality. In 1997, dry conditions in Indonesia and Malaysia allowed human-ignited fires to escape control and ignite carbon-rich peatlands, which continued to burn for some months. An estimated 0.81 to 2.57 GtC were emitted to the atmosphere as a result [11], equivalent to 13–40% of global annual mean carbon emissions from fossil fuels at that time and hence a substantial contribution to the anomalously large CO2 growth rate that year [12]. During La Niña events, when the equatorial Pacific is colder than average, the annual CO2 growth rate is slower.
The recent El Niño, now in its declining phase, was comparable with the 1997–1998 event in some respects. Although maximum SSTs were cooler in the eastern tropical Pacific, the Niño 3.4 index was 2.6 ± 0.30 °C over November 2015 to January 2016 (larger than November 1997 to January 1998) and most tropical land regions were again anomalously dry. Once again, drought conditions allowed human-caused fires in Indonesia to burn large areas. Estimates for 2015 suggest that the total greenhouse gas emissions from these fires is equivalent to 0.4 GtC, with large uncertainty — less than those in 1997 [13], but still larger than for non-El Niño years.
Some comment on the "meaning" of 400 ppm:
...A point of interest is the passing of 400 ppm in the Mauna Loa record. Although there is nothing physically significant about this concentration, it has recently become an iconic milestone in popular discourse regarding the ongoing rise in atmospheric CO2 (for example, ref. 15). In the last two years, CO2 has fluctuated around 400 ppm through the annual cycle, which has an amplitude of approximately 6–7 ppm at Mauna Loa. 2014 was the first year that monthly CO2 concentrations rose above 400 ppm, and in 2015 the annual mean concentration has passed 400 ppm for the first time, but the monthly mean concentration fell back below 400 ppm for three months at the end of the boreal summer, reaching a monthly mean of 397.50 ppm in September. Adding the recent mean growth rate of 2.1 ppm yr⁻¹ to this value would suggest a 2016 September concentration of 399.60 ppm. However, on the basis of the observed and forecast Niño 3.4 SSTs as of November 2015, we predict a Mauna Loa CO2 concentration in September 2016 of 401.48 ± 0.53 ppm (Fig. 3)...
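The back-of-envelope extrapolation in that passage can be checked directly; this sketch just reproduces the arithmetic from the quoted text:

```python
# Naive extrapolation: last September minimum plus the decadal mean growth
# rate, compared against the paper's ENSO-adjusted forecast.
sept_2015_min_ppm = 397.50   # September 2015 monthly mean, from the quote
mean_growth_ppm_yr = 2.1     # decadal mean growth rate, from the quote

naive_sept_2016 = round(sept_2015_min_ppm + mean_growth_ppm_yr, 2)
print(naive_sept_2016)  # 399.6

enso_forecast = 401.48       # the paper's forecast (central value)
print(round(enso_forecast - naive_sept_2016, 2))  # 1.88 ppm attributed to El Niño
```

The gap of nearly 2 ppm between the naive number and the forecast is the paper's estimate of the El Niño contribution.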
And now the depressing part, wherein the bold is mine.
With the growth rate expected to reduce again after the El Niño, could the annual minimum CO2 concentration fall back below 400 ppm again next year or further in the future? This is exceptionally unlikely. In the instrumental record that covers the last half-century, annual growth rate has always been positive as a result of ongoing anthropogenic emissions, and the amplitude of the seasonal cycle has not varied substantially. Both the annual mean and September minimum CO2 concentrations have therefore increased year on year. This was the case even in years with large La Niña events or major volcanic eruptions that temporarily caused cooling and greater net uptake of CO2 by the biosphere, resulting in smaller, but still positive growth (Fig. 1a). For example, in the large La Niña in 1999–2000, the growth rate remained above 1 ppm yr⁻¹. Unless a very large volcanic eruption injects substantial quantities of aerosol into the stratosphere, we would expect concentrations to continue to rise further above 400 ppm in the next few years.
In the longer term, a reduction in CO2 concentration would require substantial and sustained cuts in anthropogenic emissions to near zero. Even the lowest emissions/concentrations scenario assessed in the IPCC Fifth Assessment Report projects CO2 concentrations to remain above 400 ppm until 2150. This scenario, RCP2.6 [21], is considered amongst the lowest credible emissions scenarios, and relies on assumed development of 'negative emissions' methods whose potential is considered limited [22]. Indeed some argue that RCP2.6 is now beyond reach without radical changes in global society [23]. Hence our forecast supports the suggestion [24] that the Mauna Loa record will never again show CO2 concentrations below the symbolic 400 ppm within our lifetimes.
If any of this bothers you, don't worry, be happy.
Those assholes at Greenpeace are continuously and constantly predicting that sometime after they, and everyone else who had to sit through their tiresome, repetitive, and illiterate bullshit, will be dead, the world will be a 100% renewable nirvana. The fact that they have been predicting this endlessly and continuously, and that things are getting much, much, much worse, and not better, has no meaning.
As you may know, Greenpeace is not an organization where people do science, science being a practice in which results trump theory. You can't get into Greenpeace if you're aware of the contents of an engineering or biology or chemistry or physics or math book - but if you join Greenpeace you sure can talk, and talk, and talk - especially on subjects you know nothing about - even as the rest of us, and all future generations choke, and choke, and choke and choke.
So called "renewable energy" didn't work; it isn't working; and it won't work, but it's not results, but the thought that counts.
Have a lovely weekend.
Posted by NNadir | Fri Sep 16, 2016, 05:22 PM (2 replies)
Last night I was reading several papers in the literature concerned with what the badly screwed future generations might do with all the solar cells we've been manufacturing in recent years, despite the fact that they have failed to do a damned thing to address the accelerating rate of climate change.
The solar industry is tiny and clearly useless despite soaking up trillions of dollars, and yet many rote assumptions hold that solar cells are "green" (as in environmentally benign) and "sustainable" (usable indefinitely).
As is the case with many rote assumptions, they are wrong.
The paper to which I will refer here is this one: Separating and Recycling Plastic, Glass, and Gallium from Waste Solar Cell Modules by Nitrogen Pyrolysis and Vacuum Decomposition (Lingen Zhang and Zhenming Xu, Environ. Sci. Technol., 2016, 50 (17), pp 9242–9250)
Some text from the paper:
The first solar cell was invented at Bell Laboratories in 1954. After the energy crisis of the 1970s, the products of solar cells started to be used in civilian fields. Converting the energy of sunlight into an easily usable form is one of the most attractive solutions to the shortage of fossil energy.(1) Photovoltaic energy has been known as the cleanest energy in the 21st century. With the rapidly expanding of photovoltaic market, the output of solar cells is growing by about 28.14% a year in China.(2) The solar cells have been widely used in transportation, communications, space, and other fields. Crystalline silicon has become an important and dominant semiconductor material in most of solar cells.(3) Apart from Si wafer-based module, gallium arsenide (GaAs) which is a compound semiconductor has been used for decades to make ultrahigh-efficiency solar cells because of its advantages, including their high photoelectric conversion efficiency and excellent antiradiation performance.(4, 5)...
...Gallium, as an important strategic resource, has been categorized as one of 14 mineral resources by the European Commission in extreme shortage.(11) The world reserve of gallium has been estimated to be 18 000 tones, which is merely one tenth of gold.(12) In nature, gallium has no ores of its own at all; rather it occurs in trace and minor amounts in various associated minerals types, such as bauxite, zinc, tin, and tungsten ores.(13, 14) Hence, it has led to strong interest for recovery of gallium from wastes. At present, various researches have been developed to recycle gallium. Technologies include acid leaching,(15) organic solvent,(16, 17) chemical precipitation, electrochemistry,(18, 19) and supercritical extraction(20) etc. I.M. Ahmed(21) proposed extracting method by Cyanex 923 (a mixture of four trialkylphosphine oxides) and Cyanex 925 (bis(2,4,4-trimethylpentyl) octylphosphine oxide) in kerosene from hydrochloric acid medium to recycle Ga(III). Although these studies have focused on recycling gallium resource, environmental improvement are still challenging due to limitations on using large volume of acid/alkali/organic reagent with high concentration.
The bold is mine. That bolded remark doesn't sound all that "renewable" to me.
The authors propose nitrogen pyrolysis and vacuum decomposition which is (they say) cleaner. Here's some of their investigation of the "clean" process.
The effect of temperature on the organic conversion rate was investigated in the range from 573 to 1073 K at 0.5 L/min N2 flow rate lasting 30 min. It could be seen from Figure 6(a) that the organic conversion rate was low when temperature was 573 and 623 K, which was 19.61% and 36.02%, respectively. But when temperature reached 673 K, the organic conversion rate sharply increased and exceeded 98%. With the temperature over 773 K, the organic conversion rate reached approximately 100%. It indicated that the pyrolysis temperature of plastic components was 773 K. We analyzed the organic components of oil products from 773 to 1073 K. The results were summarized graphically in Figure 7(a). We found that the organic components of oil and gas products had obvious changes with the increasing of temperature. When the temperature reached 773 K, alkanes and olefins were main organic components in the oil products. But, some naphthenes, acetophenone, and methyl naphthalene, etc. began to be detected in pyrolysis oil products with the temperature rising to 873 K. When the temperature arrived 973 and 1073 K, anthracene, phenanthrene, and homologues of benzene were main components of pyrolysis oil products. For pyrolysis gas products from panel materials (as shown in Figure 7(b)), benzene rapidly increased with increasing of temperature. But, components of gas products were not change. Therefore, the temperature of pyrolysis should be controlled under the condition of 773 K.
Um...benzene. I'm sure they'll be absolutely safe, since all recycling facilities for electronic waste are absolutely safe.
The arsenic is recovered as diatomic arsenic gas which distills away.
The process is in no way quantitative.
The effect of temperature on the recovery efficiency of gallium was investigated in the range from 973 to 1273 K, maintaining system pressure of 1 Pa and reaction time of 40 min. As shown in Figure 9(a), the recovery efficiency of gallium increased sharply with an increase of temperature from 973 to 1023 K. The recovery efficiency reached 60.9% when the temperature was 1023 K, and then its rise began to slow. The recovery efficiency of gallium increased to 76.4% with the temperatures reached 1273 K. Figure 9(b) showed the theoretical and experimental evaporation rate of Ga particles. In theory, the evaporation rate of gallium should be increased with the increase of temperature, according to Langmuir-Knudsen eq (eq 3). However, the evaporation rate has not changed in our experiment. The experimental evaporation rate of Ga presented nearly linear relationship with temperature, which was range from 5.64 × 10–5 to 2.12 × 10–4. Its explanation may be that on the one hand, first, the decomposition reaction of GaAs is happened, which needed a high temperature and then metallic gallium can volatilize.
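The Langmuir-Knudsen relation the authors invoke gives, in its standard form, a maximum evaporation flux of J = αP√(M/2πRT). A sketch of that formula in Python; the saturation pressures below are placeholder values for illustration, not measured data for gallium:

```python
import math

R = 8.314        # gas constant, J/(mol*K)
M_GA = 0.06972   # molar mass of gallium, kg/mol

def langmuir_flux(p_sat_pa, temp_k, alpha=1.0):
    """Maximum evaporation flux in kg/(m^2*s): alpha * P * sqrt(M / (2*pi*R*T))."""
    return alpha * p_sat_pa * math.sqrt(M_GA / (2 * math.pi * R * temp_k))

# Hypothetical (temperature, saturation pressure) pairs; real gallium vapor
# pressures must come from tabulated data.
for temp_k, p_sat_pa in [(973, 1e-4), (1123, 1e-3), (1273, 1e-2)]:
    print(temp_k, langmuir_flux(p_sat_pa, temp_k))
```

Because the saturation pressure grows roughly exponentially with temperature while the √T term in the denominator changes slowly, the predicted flux should climb steeply, which is why the nearly flat experimental rate the authors report requires the GaAs decomposition step they propose as an explanation.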
Well, whatever gallium and arsenic remains, we can always take it to a "green landfill."
It's amazing how much handwaving and how many ill thought out beliefs, dogmatic beliefs, get attached to the solar industry, since for many decades it was all theory and no practice.
The practice is quite different. Trillion-dollar quantities of resources have been thrown at this industry in the last ten years, with the result that the annual increases in the dangerous fossil fuel waste carbon dioxide are the highest ever observed.
It is expected that in about twenty years, about two million tons of used and dysfunctional solar cells will need disposal on this planet; cf. Sustainable System for Raw-Metal Recovery from Crystalline Silicon Solar Panels: From Noble-Metal Extraction to Lead Removal (Byungjo Jung, Jongsung Park, Donghwan Seo, and Nochang Park, ACS Sustainable Chem. Eng., 2016, 4 (8), pp 4079–4083).
Enjoy the coming weekend.
Posted by NNadir | Fri Sep 16, 2016, 01:29 PM (1 replies)
Recently I remarked in this space that one possible route, maybe the only route, to fixing carbon dioxide from the air - something that will prove necessary for future generations since our generation did nothing whatsoever to address climate change - will involve making products from biomass that effectively sequester carbon.
The difficulty of taking a lab route for usefully fixing CO2 to an industrial level.
In recent years there have been many discussions of this approach in the scientific literature, and I came across an interesting paper that puts a fun - and interesting - spin on the topic.
The paper is here: Synthesis and Characterization of Bio-based Epoxy Resins Derived from Vanillyl Alcohol (Joseph F. Stanzione III, ACS Sustainable Chem. Eng., 2016, 4 (8), pp 4328–4339)
Many people are aware of the growing concern that many plastics, in particular polycarbonates, are co-polymerized with bisphenol A, which can leach out of the plastics and is a known endocrine disrupting chemical, owing to structural features that make it similar to some steroidal compounds in the estrogenic pathway. Polycarbonate plastics are the tough, hard plastics commonly used for water storage bottles, baby bottles, etc.
An excerpt from the text of the paper is here:
Thermosetting polymers, such as epoxy, vinyl ester (VE), and unsaturated polyester (UPE) resins, have found utility in a wide range of industrial and commercial applications including adhesives, coatings, and composites.(1, 2) Epoxy resins dominate the thermosetting polymers market making up roughly 70% of all thermosetting polymers,(3) due to their outstanding thermomechanical properties comprising high glass transition temperatures (Tg’s) and high glassy moduli (E′’s) at 25 °C as well as good chemical resistance, when polymerized with an appropriate curing agent.(1, 2, 4) Unfortunately, the majority of commercial thermosetting polymers currently being produced are synthesized from nonrenewable, petrol-based chemicals. Since the inception of the first commercial diglycidyl ethers in the 1940s, the epoxy resin industry has been dominated by the petrochemical-based diglycidyl ether of bisphenol A (DGEBA).(1, 2, 5) This bisphenol A (BPA)-based epoxy resin is found in over 90% of thermosetting epoxy resins worldwide, in a market with a global production currently exceeding 2 million tons per year.(3)
DGEBA is a product of two main reactants, BPA and epichlorohydrin, with epichlorohydrin, historically synthesized via a multistep pathway starting with propylene.(1, 2, 6, 7) There is currently no renewable source for BPA; however, Dow Glycerin to Epichlorohydrin (GTE) Technologies and Solvay Epicerol have recently reported processes for the synthesis of epichlorohydrin from bio-based glycerol, a byproduct of biodiesel production.(8, 9) BPA (4,4′-isopropylidenediphenol) is typically synthesized via an acid catalyzed electrophilic aromatic condensation of phenol and acetone with a stoichiometric ratio of 2:1, yet the process uses large excesses of phenol to reduce the formation of higher molecular weight oligomers.(10, 11) BPA is used as the base molecule in thermosetting epoxy resins, as the bisphenolic structure provides molecular rigidity to the polymer network; thus, promoting their outstanding thermomechanical properties.(11) However, the use of BPA in epoxy resins has received a great deal of scrutiny and debate, while concerns of human exposure to BPA, a known human endocrine disruptor, via leaching from resins and food and beverage can coatings are driving the search for a suitable alternative that is both renewable and nontoxic.(3, 12)
The Dow process for producing epichlorohydrin from glycerol, a generally worthless side product of the production of biodiesel and soap, is described here:
Glycerin as a Renewable Feedstock for Epichlorohydrin Production. The GTE Process (Briggs et al Clean 2008, 36 (8), 657 – 661)
The idea is to take a "generally worthless" product and make it worth something. Polymers derived from glycerol are fixed carbon that is removed from the atmosphere.
Now for the fun part: the chemical whose flavor I love, vanillin, or rather the alcohol made by reducing vanillin, vanillyl alcohol.
The authors write:
In the present work, we report the electrophilic aromatic condensation of vanillyl alcohol (1) with guaiacol (2) to produce bisguaiacol (BG) isomers (3). The reaction is given in Scheme 2, in which the major structural bisguaiacol isomer formed was determined to be para–para. The synthesis of bisguaiacol avoids the use of carcinogenic and highly volatile molecules like formaldehyde and acetone as the hydromethyl group present on vanillyl alcohol already provides the necessary handle and reactivity for facile and desired phenolic coupling and methylene bridge formation.
To produce a bio-based epoxy, BG (3) was then reacted with epichlorohydrin (4) to produce a diglycidyl ether of bisguaiacol (DGEBG; 5) as shown in Scheme 3. To study the influence of the methoxy moiety attached to the aromatic ring on cured polymer properties, diglycidyl ether of vanillyl alcohol (DGEVA) and diglycidyl ether of gastrodigenin (DGEGD) were also synthesized in the same manner as DGEBG (Figure 1). Furthermore, as DGEVA and DGEGD can be considered useful bio-based epoxies, diglycidyl ether of hydroquinone (DGEHQ; Figure 1) was also synthesized in order to study the influence of the methylene spacer between the aromatic ring and the glycidyl ether. The epoxy resins were cured, either by themselves or with a commercial BPA-based epoxy resin, with stoichiometric equivalents of Amicure PACM (4,4′-methylenebiscyclohexanamine; Figure 1). The thermomechanical properties of the cured polymers were tested via dynamic mechanical analysis (DMA) to determine if these resins are suitable alternatives to current commercially available petroleum-based resins.
For those who are interested in organic chemistry, here is scheme 2:
Here is scheme 3, regrettably not high quality as a graphic, but if you know some organic chemistry, you can fill in the blanks:
Recall that the authors are proposing to use epichlorohydrin derived from glycerol, so the molecules in scheme 3 are entirely derived from biomass.
Vanillin is of course available from vanilla beans, but it is mostly available on an industrial scale from the hydrolysis or thermolysis of lignin, the structural component of wood and straw that is not cellulose. (Many similar compounds are also obtained from the digestion of lignin, many of which will prove to have other uses.)
The authors report that the resulting polymers have excellent properties, comparable to the properties of the petroleum based thermosetting polymers.
There's a long way between bench top chemistry and commercial applications, but, nonetheless, it's interesting I think.
I wonder if the leachates will taste good. I personally love the vanilla flavor; can't get enough of it. I'm a vanilla kind of guy.
Have a nice evening.
Posted by NNadir | Sun Sep 11, 2016, 05:52 PM (3 replies)
This one came as a surprise to me, but I stumbled across it while going through the scientific literature this afternoon. Apparently in recent years there have been nine fatalities associated with the storage of wood pellets for wood pellet stoves.
Influence of Oxygen Availability on off-Gassing Rates of Emissions from Stored Wood Pellets (Irene Sedlmayer et al, Energy Fuels, 2016, 30 (2), pp 1006–1012)
From the text:
In times of increasing energy demand, concerns about climate change, and decrease of fossil resources, the attractiveness of alternative fuels such as wood is growing. Wood pellets are high in energy density, easy to handle, and homogeneous in quality.(1) Thus, they are most competitive with fossil fuels for heating purposes among wooden fuel types. Accordingly, the worldwide pellet market has been growing. In 2008 the worldwide pellet consumption was about 10,000,000 tons increasing to 13,500,000 tons in 2010.(2) A statistical report by the European Pellets Council published in 2014 reports the worlds wood pellets production reached 24,500,000 tons in 2013.(3) According to the same report, about 2/3 of the global pellet consumption is attributed to heating. So far, Europe remains the largest pellet consumer market responsible for around 80% of the world’s wood pellet consumption. Estimates of the European Pellet Council further indicate that at least 55% of the European pellet consumption is utilized on the residential heating sector, <50 kW.(3)
Wood pellets are known to emit various gaseous emissions during production, transportation, and storage.(4-9) Nine fatal accidents occurred since 2002 during storage or transportation of mostly big bulks of wood pellets but also in small scale stores.(10) Subsequent investigation indicated increased concentrations of CO, CO2, VOC, CH4,(4, 5, 11, 12) and H2(13) and simultaneous depletion of O2(14, 15) in pellet stores leading to a toxic breathing atmosphere. According to Svedberg et al.(12) O2 declined to levels from 16.9 to 0.8% in closed storages like ocean vessels. However, the concentration of oxygen in ventilated pellet storages strongly depends on the efficiency of ventilation. It can be assumed that in an appropriately ventilated pellet storage room, the oxygen concentration is approximately equal to the oxygen concentration of ambient air.
Reference 10 is here: Lethal Carbon Monoxide Poisoning in Wood Pellet Storerooms—Two Cases and a Review of the Literature (Saskia Gauthier et al, Ann. Occup. Hyg., Vol. 56, No. 7, pp. 755–763 (2012)).
A description of the fatal accidents is found in table 2 of the text. Of the nine deaths, only one was associated with storage in a private home.
My recommendation is that if you're using a wood pellet stove, you should store the pellets in a well ventilated space, such as an aerated shed or garage. They should probably be cycled, with the oldest being burned first. A carbon monoxide detector in the storage area is advisable, particularly if the storage space is confined. It might be wise to store them in a sealed container.
I'm not sure why this is not a problem with the storage of wood for fireplaces. Probably it is related to surface area, since the wood pellets I've seen are small, um, well, pellets.
I have a wood-burning fireplace in my home with a circulating air fan to heat my home. In recent years I've been less and less inspired to use it - even though I use downed wood from trees on my property that would otherwise rot, releasing carbon dioxide - since I am aware that about half of the seven million air pollution deaths that take place each year are from the combustion of biomass, particularly indoors. My home is regrettably heated by dangerous natural gas, which is less of an air pollution source than wood, at least in a purely chemotoxic sense, but noxious all the same in a climate sense. About 50% of the electricity supply in my state is, for the time being, provided by nuclear energy, so when possible I use electric space heaters.
(I'm not sure we're going to have all that many winters in New Jersey in the future, since we bet the planetary atmosphere on so called "renewable energy" with the result that the rate of climate change driving carbon dioxide releases is rising, not falling. So called "renewable energy" didn't work, isn't working and won't work.)
But in any case, if you have a wood pellet stove, be safe.
Enjoy the rest of your Sunday afternoon.
Posted by NNadir | Sun Sep 11, 2016, 03:02 PM (4 replies)
Cadmium is a highly toxic element that occurs naturally in some soils, but it has been increasingly contaminating agricultural fields since the element has been mined extensively to serve the electronics industry.
It is known that the element is taken up by plants since it is mimetic for the essential nutrient zinc, zinc being present in the active sites of many important metalloenzymes essential to life. When coordinated with cadmium rather than zinc, the enzymes no longer function, and this is central to the mechanism by which their toxicity is observed. (The third congener of zinc is mercury, and the mechanism of its toxicity is generally the same: both cadmium and mercury are powerful neurotoxins, which may account for the unfathomable popularity of Donald Trump.)
An interesting paper I came across on the subject in the current (as of this writing, 9/10/16) issue of Environmental Science and Technology is here: Cadmium Isotope Fractionation in Soil–Wheat Systems (Matthias Wiggenhauser et al, Environ. Sci. Technol., 2016, 50 (17), pp 9223–9231)
Here is a graphic from the paper:
It is important to note that the percentages here do not refer to enrichment of total cadmium in the various plant parts, but rather in the ratio between two stable isotopes of the element, 114Cd and 110Cd, with the former being enriched.
Isotope effects such as these are related to the kinetics of reactions, which vary (slightly) with the mass of the reactants. These effects are well known for the hydrogen isotopes protium, deuterium and tritium, but, as the paper points out in the text, have only in recent times become accessible to study owing to developments in inductively coupled plasma mass spectrometry (ICP-MS). For the analytical chemist, details may be found in the supplementary information of the paper, although the particular make and model of the ICP-MS is not noted.
The lighter the isotope, in general, the faster the reaction. In the present case, it was found that the cadmium that makes its way to the wheat seeds, the grain we eat, is enriched in heavier isotopes, meaning that the reactions in the stems and roots remove some cadmium before it gets to our mouths and, ultimately, our brains, where it causes diseases like Trumpism and Greenpeacism and other neurological deficiencies. This is actually good news, since it means that some of the cadmium is sequestered in these inedible parts of the plant, although in the case of straw, it may show up in animals that some people eat, in particular cows.
Despite this marginally good news, it is clear that many food supplies are in fact contaminated with cadmium, especially in China, where electronic recycling, as well as the mining and manufacture of semiconductor solar cells, are practiced under largely uncontrolled conditions. It is reported that in Southern China, up to 70% of the grains purchased in local markets had cadmium levels exceeding government-set limits: Food supply and food safety issues in China (Sun et al., Lancet 2013; 381: 2044–53). About 1/6 of the world supply of cadmium is mined in China; substantial quantities are also mined in the United States and isolated from various zinc ores.
We hear a lot of hand waving about how solar prices are dropping dramatically, and Chinese manufacture of solar cells is a big part of the reason, because in China, despite the country having declared itself a "people's state," environmental regulations either do not exist or, if they do, are poorly enforced. In China it is cheaper to dump cadmium-containing wastes and avoid the purchase of expensive devices to prevent the escape of these contaminants, as well as other contaminants like silicon tetrachloride, and yes, solar prices are dropping. Whoopeee.
(The big lie about "dropping solar prices" is that any system that requires redundancy is not cheaper; it is in fact, more expensive than a system that operates continuously. Also from both an economic and environmental standpoint, the lifetime of a device matters.)
The solar industry has almost no effect whatsoever on the most exigent environmental crisis of our times, climate change. After the world's population dropped a trillion bucks on this adventure in wishful thinking and poor critical thinking, the result is that the rate of accumulation of the dangerous fossil fuel waste carbon dioxide in the planetary atmosphere has pretty much tripled since the big mouthed advocates of this scheme - I'm talking specifically about corporate window dresser Amory Lovins, although one could point to similar assholes like his one time acolyte Joe Romm - began telling us all that solar and wind would save us.
They haven't, they aren't, and they won't.
The import of cadmium laced solar cells into the United States from China will probably have little immediate toxicological effect - although I'm not sure the issue has been studied - but in twenty or thirty years all the solar cells on this planet, as well as the inverters and other electronic junk associated with them, will become another component of electronic waste, already an intractable problem. The toxicological effects thus will fall on future generations. By doing nothing but failing to check our 1970's assumptions, the members of this generation, my generation, have shown their contempt for all future generations in thousands of ways, and piling up this solar junk is only the tip of the overheated iceberg.
By the way, cadmium is one of the "critical elements" that are likely to be depleted in the lifetime of today's infants. Here is a periodic table from one source on this increasingly discussed issue in Engineering, Materials, and Environmental Sciences: The importance of elemental sustainability and critical element recovery (Hunt et al., Green Chem., 2015, 17, 1949–1950).
It is somewhat amusing to note that, after twenty or thirty years of useless cheering, five of the elements that have been touted as "solar breakthroughs" are on the list of immediately endangered elements, those expected to run out in the next fifty years: the aforementioned cadmium, as well as gallium, germanium, indium and arsenic. (Indium may well run out first, not because of useless solar cells, but because of touch screen phones and other screens.) Two others, selenium and tellurium, both toxic elements, will run out in the next century.
So much for the hope that the solar industry, where "PRICES ARE DROPPING!!!!!!!!!", will end up producing much more than the less than 2 exajoules it produces right now out of the 570 exajoules of energy that humanity consumes each year.
Enjoy the rest of the weekend.
Posted by NNadir | Sat Sep 10, 2016, 03:48 PM (8 replies)
It was almost in two pieces - the laptop fell on it when I accidentally knocked it over - but with patience and coaxing I managed to get the last week of work off of it since the last backup.
It's amazing what one can get on one of those little things; years of work; and it's so damned fragile.
Over the years, I've left a few in libraries, and thus, fortunately, got in the habit of regular backups.
Sad though, and a little frightening.
Posted by NNadir | Wed Sep 7, 2016, 10:18 PM (1 replies)
Posted by NNadir | Wed Sep 7, 2016, 09:36 PM (3 replies)
This weekend, while catching up on my reading, I came across a carbon dioxide splitting/hydrogen thermochemical cycle of interest.
I have a long term interest in thermochemical hydrogen cycles, and have probably, over the years, read a few hundred papers about them.
The particular paper I was reading was this one: Applicability of an Equilibrium Model To Predict the Conversion of CO2 to CO via the Reduction and Oxidation of a Fixed Bed of Cerium Dioxide (Luke J. Venstrom et al Energy and Fuels, 2015, 29 (12), pp 8168–8177)
The chemistry of this system works like this: Small cylindrical porous particles of cerium dioxide (aka cerium(IV) oxide, CeO2), roughly 3-5 mm in length and 5 mm in diameter, are heated to 1200°C, whereupon a fraction of the CeO2 is reduced to cerium(III) oxide, Ce2O3, dicerium trioxide. In this process oxygen gas is released.
After the oxygen is removed - and this paper is on the subject of how one might do that - one of two things can be done.
The first is that carbon dioxide, the dangerous fossil fuel waste that is now - as of 2016 - accumulating in the planetary atmosphere at a truly astounding rate, can be passed over the Ce2O3, at which time the carbon dioxide is reduced to carbon monoxide and the Ce2O3 is reoxidized to CeO2. The CeO2 in this case is thus a catalyst; it is returned to its original state. Thus the net reaction is this:
2CO2 <-> 2CO + O2
The second is that water in the gas phase, steam, can be passed over the Ce2O3, whereupon it is reduced to hydrogen gas and the Ce2O3 is reoxidized to CeO2; again, the CeO2 is thus a catalyst.
The net reaction in this case is:
2H2O <-> 2H2 + O2
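The bookkeeping of the two-step cycle described above can be checked mechanically. Here is a minimal sketch (the formula parser and the step stoichiometries are written out by me from the description in the text, not taken from the paper) verifying that the ceria cancels out of both cycles, leaving the net splitting reactions:

```python
import re
from collections import Counter

def count_atoms(formula):
    """Element counts for simple formulas like 'CeO2', 'Ce2O3', 'CO2'."""
    counts = Counter()
    for elem, num in re.findall(r'([A-Z][a-z]?)(\d*)', formula):
        counts[elem] += int(num) if num else 1
    return counts

def balanced(reactants, products):
    """True if element counts match; each side is a list of (coeff, formula)."""
    def total(side):
        tot = Counter()
        for coeff, sp in side:
            for el, n in count_atoms(sp).items():
                tot[el] += coeff * n
        return tot
    return total(reactants) == total(products)

# Step 1 (thermal reduction at ~1200 C):  4 CeO2 -> 2 Ce2O3 + O2
assert balanced([(4, 'CeO2')], [(2, 'Ce2O3'), (1, 'O2')])
# Step 2a (CO2 splitting):  Ce2O3 + CO2 -> 2 CeO2 + CO
assert balanced([(1, 'Ce2O3'), (1, 'CO2')], [(2, 'CeO2'), (1, 'CO')])
# Step 2b (water splitting):  Ce2O3 + H2O -> 2 CeO2 + H2
assert balanced([(1, 'Ce2O3'), (1, 'H2O')], [(2, 'CeO2'), (1, 'H2')])
# Net reactions, with the ceria cancelled out as a catalyst:
assert balanced([(2, 'CO2')], [(2, 'CO'), (1, 'O2')])
assert balanced([(2, 'H2O')], [(2, 'H2'), (1, 'O2')])
print("all cycle steps balance")
```

Adding step 1 to twice step 2a (or twice step 2b) and striking the CeO2 and Ce2O3 from both sides reproduces the net reactions shown above.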
Note that one of the world's most practiced industrial reactions - the reaction by which the bulk of the world's hydrogen is currently produced - is the water-gas shift reaction:
H2O + CO <-> CO2 + H2
In this sense, the two paths each represent a path to hydrogen gas, which is useless as a consumer fuel, despite much bull to the contrary thrown around insipidly for the last three or four decades, but which is very useful as a captive intermediate for the production of ammonia and, in some places, liquid fuels ranging from gasoline to diesel to dimethyl ether, and other related chemical products normally produced from petroleum.
Mixtures of carbon monoxide and hydrogen have a special name, "syn gas." Using "syn gas" in the golden age of chemistry in which we live, we can make pretty much any industrial scale organic chemical we want.
Now for the fun part. The authors of the paper cited in the opening text write the following:
The cerium dioxide (ceria, CeO2) thermochemical metal redox cycle is a promising approach to split water and carbon dioxide using concentrated solar radiation because of the favorable thermochemical properties of ceria.
I mean no criticism of the authors of this fine paper to state that I expect - I hope - they are being disingenuous when they write this line of bull. Science is poorly funded these days, and let's face it, in this cockamamie world if one wants to get a grant, one is better positioned if one puts the word "solar" in the grant proposal.
These papers about solar hydrogen have been flying around for decades and the number of concentrated solar plants on this planet producing industrially meaningful quantities of hydrogen is zero.
Here, for instance, is a link to a paper I randomly pulled up from my files that was published 51 years ago: Solar Energy Volume 9, Issue 1, January–March 1965, Pages 61-67.
Fifty years later, the world's largest solar thermal plant is in California's Mojave Desert: the Ivanpah solar thermal plant.
Here are some excerpts from the Wikipedia page about this plant, which pulls few punches - despite the insipid worship of all things solar - on what a grotesque failure this huge piece of garbage has been:
The Ivanpah Solar Electric Generating System is a concentrated solar thermal plant in the California Mojave Desert, 64 km (40 miles) southwest of Las Vegas, with a gross capacity of 392 megawatts (MW). It deploys 173,500 heliostats, each with two mirrors, focusing solar energy on boilers located on three centralized solar power towers. Unit 1 of the project was connected to the grid in September 2013 in an initial sync testing. The facility formally opened on February 13, 2014, and it is currently the world's largest solar thermal power station.
"A gross capacity of 392 megawatts..."
Below we'll take a look at the actual power this plant produces, and compare how many Ivanpah solar plants would be required to match the power output of a 1000 MWe nuclear plant.
First let's look at the cost of the plant:
The project was developed by BrightSource Energy and Bechtel. It cost $2.2 billion; the largest investor in the project is NRG Energy, a power generating company based in Princeton, New Jersey, that has contributed $300 million. Google has contributed $168 million; the U.S. government provided a $1.6 billion loan guarantee, and the plant is built on public land. In 2010, the project was scaled back from the original 440 MW design, to avoid building on the habitat of the desert tortoise.
And the land use:
The Ivanpah Solar Electric Generating System consists of three solar thermal power plants on a 4,000 acres (1,600 ha) tract of public land near the Mojave Desert and the California—Nevada border in the Southwestern United States near Interstate 15 and north of Ivanpah, California. The site is visible from adjacent Mojave National Preserve, Mesquite Wilderness, and Stateline Wilderness.
The plant has never produced enough electricity to meet its contractual delivery requirements, which afford it the right to sell electricity to PSEG and SCE for $200/MWh, $50/MWh higher than the retail delivered residential cost of electricity in California.
The plant, by the way, is required to burn dangerous natural gas every morning to start up. The waste from the dangerous natural gas is dumped indiscriminately into the world's favorite waste dump, the planetary atmosphere. The plant has burned approximately 1.8 billion cubic feet of natural gas in its three years of operation.
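A back-of-envelope estimate of what that start-up gas burn means in dumped waste: taking the 1.8 billion cubic feet cited above and assuming typical values for natural gas heat content and CO2 emitted per unit of energy (the two factors below are my assumptions of typical published values, not figures from this post):

```python
# Rough CO2 from Ivanpah's start-up natural gas burn.
GAS_BURNED_CF = 1.8e9      # cubic feet over ~3 years, figure cited above
MMBTU_PER_MCF = 1.04       # assumed typical heat content per thousand cubic feet
KG_CO2_PER_MMBTU = 53.0    # assumed typical emission factor for natural gas

mcf = GAS_BURNED_CF / 1000.0
tonnes_co2 = mcf * MMBTU_PER_MCF * KG_CO2_PER_MMBTU / 1000.0
print(f"{tonnes_co2:,.0f} metric tons of CO2")
```

Under those assumptions the start-up burns alone come to roughly a hundred thousand metric tons of carbon dioxide over three years, from a plant marketed as "renewable."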
The power output for the entire facility can be calculated from the data in the table at the bottom of the Wikipedia page, which includes data up to June of this year - the mirrors at the plant went out of alignment in July causing one of the three towers in the plant to catch fire, whereupon the two billion dollar piece of crap was shut for a few weeks.
I have taken the liberty of converting these totals, given in MWh of electricity (solar), into units of average continuous power for the periods listed in the table, in order to give a sense of scale.
In the first month of operation, January 2014, the plant produced an average continuous power of 14 MW, and it did not approach 100 MW until the month of June 2014, when it produced 89 MW of average continuous power. For the entire year of 2014, it was the equivalent of a 47 MW power plant.
Since coming on line, the plant has produced more than 100 MW of average continuous power in only three months: In April of 2015, it produced 104.59 MW of average continuous power; in June of 2015 it produced 107.68 MW of average continuous power; and in February of this year it produced 100.08 MW of average continuous power.
Overall, during its entire history, it has been the equivalent of a 61.11 MW power plant.
Thus, at 2.2 billion dollars in cost, with 1.6 billion dollars represented by loan guarantees by the US government, in order to produce as much power as a 1000 MWe nuclear plant, we would need 16.4 of these disasters, and the cost would be $36 billion. The land area required would be 265 square miles of desert.
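The conversion and the scaling above can be sketched in a few lines. Note that, following the comparison as stated, the 1000 MWe nuclear plant is treated as running continuously at nameplate (real capacity factors are somewhat lower, around 90%):

```python
# Delivered energy -> average continuous power, and scaling Ivanpah
# to the output of a 1000 MWe nuclear plant, using figures quoted above.
HOURS_PER_YEAR = 8760

def average_power_mw(energy_mwh, hours):
    """Average continuous power over a period, in MW."""
    return energy_mwh / hours

# Example: a 47 MW-average year corresponds to ~412,000 MWh delivered.
year_2014_mwh = 47 * HOURS_PER_YEAR

ivanpah_avg_mw = 61.11          # lifetime average quoted above
nuclear_mw = 1000               # treated as continuous nameplate output
cost_per_plant_billion = 2.2

plants_needed = nuclear_mw / ivanpah_avg_mw
cost_billion = plants_needed * cost_per_plant_billion
print(round(plants_needed, 1), round(cost_billion, 1))  # 16.4, 36.0
```

Equivalently, the lifetime capacity factor is 61.11 / 392, or about 15.6% of the plant's gross nameplate rating.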
The big difference between a nuclear plant and this piece of expensive and useless crap is that the nuclear plant would 1) actually work, 2) operate for about 60 - 80 years, 3) not require burning huge amounts of dangerous natural gas to start up, and 4) not require redundant plants, fueled by dangerous fossil fuels, to support it whenever the sun went down. The nuclear plant could produce all of the power of the 265 square miles of solar plant in a moderate sized industrial building.
It is interesting to note that the cost of educating an (out of state) nuclear engineer at the University of California at Berkeley is roughly a quarter of a million dollars - a fact that sticks in my mind as my two sons are of college age, one in college, and one about to enter college. The $1.6 billion loan guarantee, which may need to be paid since the plant is technically in default on its contract, having never met its contractual obligations to deliver electricity, is enough to pay for the full educations of some 6,400 engineers, not that we give a shit about paying for engineering educations in this country.
By the way, the cerium splitting cycle would be better served by using nuclear heat, which is certainly accessible given recent advances in materials science, and which is more reliable.
It is interesting to note that 144Ce is a fission product - unimaginative people with very small minds call this isotope "nuclear waste" - with a 284 day half-life, decaying through 144Pr to give stable 144Nd. Properly isolated, it could put out significant heat.
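How significant is that heat? A rough estimate from the 284 day half-life cited above, assuming around 1.3 MeV of average recoverable energy per 144Ce decay chain (an assumed round figure on my part, dominated by the energetic 144Pr beta; the decay constant and specific activity follow directly from the half-life):

```python
import math

# Specific activity and rough decay heat of freshly isolated Ce-144.
AVOGADRO = 6.022e23
HALF_LIFE_S = 284 * 86400     # 284 days, as cited above
MEV_TO_J = 1.602e-13

lam = math.log(2) / HALF_LIFE_S             # decay constant, 1/s
atoms_per_gram = AVOGADRO / 144.0
activity_bq_per_g = lam * atoms_per_gram    # decays per second per gram

energy_per_decay_j = 1.3 * MEV_TO_J         # assumed chain-average energy
watts_per_gram = activity_bq_per_g * energy_per_decay_j
print(f"{activity_bq_per_g:.2e} Bq/g, {watts_per_gram:.1f} W/g")
```

Under those assumptions a gram of pure 144Ce puts out on the order of 25 watts of heat, which is why this "waste" was seriously studied as a radioisotope heat source.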
It is certainly conceivable to isolate Ce isotopes from continuously fueled fluid phase reactors, not just the famous molten salt reactors that many people are hyping, but from some of the aqueous solution phase reactors of the type originally built and designed by Enrico Fermi. As I recently learned, somewhat to my surprise, 17 examples of these reactors operated at Los Alamos for roughly 20 years, from the 1950s into the early 1970s. None of them required 4,000 acres of land; all of them, in fact, operated in small rooms. They were cheap to build, easy to operate, and apparently very reliable. It is said that Fermi would take breaks from his theoretical studies during the Manhattan Project years to go play, every afternoon, with one that he built. He'd never let the technicians operate it; he was fascinated with it, and insisted on running the thing himself so as to be absolutely certain of everything the reactor did. (It was used to develop an understanding of some of the basic physics of fission, including cross sections of important nuclei.)
So how much hydrogen could the $2.2 billion solar thermal plant at Ivanpah produce? Not enough to count, that's for sure. We sank a trillion dollars into the solar energy industry in the last ten years, with the result that we have now tripled the rate at which new carbon dioxide is added to the atmosphere as compared to the rate in the 1970s. The solar industry - at least if you believe that the ends justify the means, as opposed to believing that the ends are irrelevant and only the means count - is a grotesque and expensive failure.
Enjoy the Labor Day holiday.
Posted by NNadir | Mon Sep 5, 2016, 01:00 AM (8 replies)