DU Home » Latest Threads » NNadir » Journal

NNadir

Profile Information

Gender: Male
Current location: New Jersey
Member since: 2002
Number of posts: 22,539

Journal Archives


Industrial drawbacks to the use of neptunium in existing nuclear reactors.

I personally believe that nuclear energy is the last best hope for humanity to address the environmental issues now before us; indeed, nuclear power represents the last best hope to prevent nuclear war - and, for that matter, all wars - inasmuch as most wars are resource related.

In the 20th century, beginning notably with the Second World War, the resource that drove most, if not all, wars was oil; in the coming century it is more likely to be water. The Second World War killed people at a rate of about 10 million people per year on average, making the practice of that war even worse than modern air pollution, which kills people at a rate of 7 million per year and about which we do little other than issue platitudes.

This is noted in an interesting review I read that was mostly concerned with the physical chemistry of water, which although it was a scientific paper contained a quasi-political note:

Despite water’s abundance, its distribution is increasingly problematic for the world’s growing populations; see Figure 1 and Appendix A.3. The availability of drinking water is limited, and it is shrinking worldwide. By the year 2030, the world’s 8.5 billion people9 will consume 6 trillion cubic meters (6000 km3 ) of water per year.10 While today 11% of the global population lives with poor access to clean drinking water,11 it is estimated that in 2030 half the world’s population will be living under severe water stress.12 It is increasingly challenging to get clean water to where it is needed. Early civilizations settled near rivers. But now, clean water is increasingly provided through water purification, desalination, 4,13,14 and transport. Therefore, clean water increasingly requires access to energy. Also, water distribution increasingly poses technical challenges, requiring advances in separating water from salts and oils at low energy costs, for example.

Water conflict is a term used to describe a clash between countries, states, or groups over access to water resources. While traditional wars have rarely been waged over water alone,15 water conflicts date back at least to 3000 B.C.16 The U.S. Dust Bowl drought of the 1930s, which covered nearly 80% of the United States at its peak, drove mass migration. More recent droughts occurred in the southwestern United States in the 1950s, and in California and the southern United States in just the past few years. Water has been regarded as a component of conflicts in the Middle East,17 in Rwanda, and in the Sudanese war in Darfur. Eleven percent of the world’s population, or 783 million people, are still without access to good sources of drinking water.11 Increased water scarcity can compound food insecurity, and put pressure on human survival.


How Water’s Properties Are Encoded in Its Molecular Structure and Energies (Dill et al., Chem. Rev. 2017, 117, 12385−12414)

The current nuclear infrastructure is actually overall a huge consumer of fresh water for cooling purposes, but wise use of waste heat could transform nuclear energy into a water source, a point I made elsewhere in this space: Two Interesting Papers On the Utilization of Low Grade Heat.

Similarly, many fission products are generators of gamma radiation, most notably Cs-137, especially in the form of certain insoluble titanates (although other fission-product-based substances might also fill the bill). This represents a real opportunity to deal with some of the most intractable (and, from my perspective, frightening) issues in water pollution: halogenated organic molecules, as well as intractable pharmaceutical metabolites and personal care products. Gamma radiation blows otherwise stable molecules to pieces, something we need to happen for airborne and waterborne, otherwise highly stable, pollutants like PFOS and other polyfluorinated, polychlorinated, and polybrominated pollutants.

The point of all of the above, is that nothing that is useful can or should be considered "waste."

From my perspective, another key to realizing the full potential of nuclear energy to save us from ourselves is very much connected with the minor actinides neptunium and americium, elements that have long been considered - foolishly - to be so-called "nuclear waste." In fact, if these two elements were discarded rather than utilized, they would be considered highly problematic, because the most stable form of neptunium (into which americium-241 decays) is the neptunate(V) ion, an oxyanion with a negative charge and three oxygens, which forms water-soluble salts. I have no doubt that they could be stored for millions of years in a way that would have no major environmental impact, but the question remains, "why do so?"

I elaborated on how I think neptunium should be used to denature plutonium - to make it unsuitable for use in nuclear weapons - elsewhere: On Plutonium, Nuclear War, and Nuclear Peace

These ideas - which I generally refer to as the "Kessler solution," because one of the most prominent scientists to advance the argument in detail is the German nuclear scientist Günther Kessler (A new scientific solution for preventing the misuse of reactor-grade plutonium as nuclear explosive, Kessler et al., Nuclear Engineering and Design 238 (2008) 3429–3444) - are actually not new, and were not new in 2008, when Kessler wrote the first paper I read on the subject, which was hardly the first paper ever written on it. Here, for example, is a discussion of the same topic dating to 1980: P. Wydler, W. Heer, P. Stiller & H. U. Wenger, A Uranium-Plutonium-Neptunium Fuel Cycle to Produce Isotopically Denatured Plutonium, Nuclear Technology, 49:1, 115-120 (1980). Kessler's paper is still very worthwhile, since it discusses in considerable detail the design of nuclear weapons and shows why nuclear weapons with a considerable heat load would be impractical.

However, the utility of these ideas does not mean that they are immediately practical, as a paper I will now cite discusses, with particular attention to nuclear reactor engineering, for the case in which the neptunium is used in the reactors that by far dominate all operating reactors on earth, that is, thermal reactors.

A paper written the same year as the Kessler paper just referenced addresses the problems associated with utilizing the uranium/neptunium/plutonium cycle. It was written by French scientists who, unlike German scientists, have direct practical industrial experience both with recycling nuclear fuel and, regrettably, with building nuclear weapons.

That paper is here: Neptunium in the Fuel Cycle: Nonproliferation Benefits Versus Industrial Drawbacks (Selena Ng, Dominique Grenèche, Bernard Guesdon, Richard Vinoche, Marc Delpech, Florence Dolci, Hervé Golfier & Christine Poinot-Salanon, Nuclear Technology, 164:1, 13-19 (2008))

Some excerpts from that paper:

AREVA, as a major industrial actor in the nuclear energy sector, is firmly committed to proliferation resistance efforts in the civilian fuel cycle. This commitment is even more important in the present context of worldwide development of nuclear power, coupled with recent geopolitical events placing proliferation at the forefront of concerns with the continued use of civil nuclear energy. The realization that a worldwide resurgence in nuclear power will require more sustainable management of its fuel resource and optimization of final repository use has recently turned the spotlight on civilian used fuel treatment-recycling plants. We believe that proliferation resistance should be approached “holistically,” that is, using a combination of technical (or intrinsic) and institutional (or extrinsic) measures that take into account the context in which the system is placed, an aspect that is reflected in the integrated safeguards concept. Industry can boast an excellent track record in that no plutonium has been diverted from commercial treatment-recycling activities to date...

This paper will examine yet another suggestion, which is to add actinides—such as neptunium—at various points in the fuel cycle in order to reduce the attractiveness of the material containing the plutonium for proliferation purposes, or to increase its detectability should the material be diverted. Plutonium is undeniably a material of potential interest to state or non-state actors for nuclear explosive devices. But not all plutonium is equal. A high concentration of the isotopes 238Pu or 240Pu is particularly undesirable to potential proliferators because of their very high rates of radiation and decay heat, which complicate handling and manufacturing, and of spontaneous neutron emissions, which can affect the reliability and overall yield of the ultimate device. Table I, based on calculations using the depletion code CESAR, presents the plutonium isotopic composition 1 yr after reactor discharge of used uranium oxide (UOX) fuel and used mixed oxide (MOX) fuel according to burnup. Table I clearly shows that with increasing burnup, fissile plutonium content (239Pu and 241Pu) decreases while the concentration of the undesirable plutonium isotopes (238Pu + 240Pu) increases. Moreover, it is worth noting that discharged MOX fuel even at 45 GWd/t contains plutonium with an isotopic composition even more degraded than that of UOX fuel at the higher burnup of 60 GWd/t...

...One might then ask how the plutonium contained in discharged UOX fuel could be further degraded. One efficient route could be to add neptunium to fresh fuel, because 237Np—effectively the only neptunium isotope present in used fuel—produces 238Pu by neutron capture in the reactor core via 238Np (with a half-life of 51 h). This proposal, sometimes referred to as the “238Pu heat spike concept,” has been suggested several times in the past3-6 and is still considered today as a possible option to enhance proliferation resistance in the nuclear fuel cycle. The remainder of this paper will examine the industrial feasibility and effectiveness of this proposal...
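The 51 h half-life quoted in the excerpt sets the timescale on which captured neptunium actually becomes 238Pu. A minimal sketch in Python (simple exponential decay only, ignoring any further neutron capture on 238Np while the fuel remains in the core - an assumption of this toy model):

```python
import math

def fraction_decayed(t_hours, half_life_hours=51.0):
    """Fraction of 238Np that has decayed to 238Pu after t_hours,
    assuming plain exponential decay with the 51 h half-life."""
    lam = math.log(2) / half_life_hours   # decay constant, per hour
    return 1.0 - math.exp(-lam * t_hours)

# After one half-life (51 h), half the 238Np has become 238Pu.
print(fraction_decayed(51.0))        # ≈ 0.5
# After two weeks, essentially all of it has decayed.
print(fraction_decayed(14 * 24))
```

In other words, on the scale of fuel-cycle operations the conversion of captured 237Np into the 238Pu "heat spike" is effectively complete within days of the capture.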


After a discussion that is highly technical (and also economic, owing to the costs of changing enrichments), involving reactor physics and safety margins in thermal reactors - the hardening of the neutron spectrum reduces the worth of control rods and boron poisons, which places practical limits on the amount of neptunium that can be added to the fuel without compromising safety margins - they raise an important technical point that comes up in many other places where incorporation of neptunium is discussed in these terms: neptunium itself, generally monoisotopic as the 237 isotope, is a weapons-grade material of unusually high purity:

A brief examination of the physical properties of 237Np shows that it presents a non-negligible proliferation risk in comparison to that presented by highly enriched uranium or 239Pu. Its critical mass (bare sphere) is similar to that of pure 235U, neptunium in metal form is easier to compress than highly enriched uranium, and it presents insignificant heat generation and spontaneous neutron emission barriers for use in an explosive device. The only drawback it presents compared with 235U or 239Pu is its gamma-ray emission (~1 mSv/h per kg at 1 cm), but this can be overcome. It is in fact largely recognized today14 that neptunium could be used to fabricate a nuclear explosive device and that some states may already have tested a nuclear explosive based on neptunium.


Well then...

Um...um...um...

Does this mean that the incorporation of neptunium into the fuel cycle is a bad idea?

I don't think so.

Neptunium has been isolated for a number of years, and is still being isolated, and the number of nuclear wars that have resulted is zero. Much of the success of the American space program has relied precisely on this technology: the Apollo missions, the Mars rovers, the Pioneer and Voyager missions, and the recently completed Cassini mission, among others, all relied on the isolation of neptunium and its conversion into plutonium-238.

It is also worth noting that the French discussion relies on thermal reactors - France has many of them - but fast reactors represent quite a different story. In the fast nuclear cycle it is possible, at least in theory, to utilize pure neptunium as a nuclear fuel - more likely diluted with, say, depleted uranium, of which we have a great deal. Most importantly, in the fast nuclear cycle there is a nuclear reaction, 237Np(n,2n)236Np, that is not appreciable in thermal reactors other than in the fast fission fraction. The cross section of this reaction for fast (1-2 MeV) neutrons is just shy of 1 barn.
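For a rough feel for how fast such a reaction proceeds, the per-nucleus rate is just cross section times flux, R = σφ. The flux below is an assumed, illustrative figure for a fast reactor, not a design number, and treating the full ~1 barn as effective across the whole spectrum makes this an upper-end sketch, since only the high-energy part of the spectrum actually contributes:

```python
# Back-of-envelope rate for 237Np(n,2n)236Np per nucleus: R = sigma * phi.
# Both numbers are assumptions for illustration, not reactor data.
SIGMA_CM2 = 1.0e-24      # ~1 barn, per the text, in cm^2
PHI = 1.0e15             # assumed fast-neutron flux, n/(cm^2 s)

rate_per_nucleus = SIGMA_CM2 * PHI        # reactions per second per atom
seconds_per_year = 3.156e7
fraction_per_year = rate_per_nucleus * seconds_per_year

print(f"{fraction_per_year:.1%} of 237Np atoms converted per year")
```

On these assumptions, a few percent of a neptunium inventory per year undergoes the (n,2n) reaction - slow, but significant over a multi-year fuel residence.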

Neptunium-236 is a fairly stable isotope; its half-life is about 154,000 years. What makes it useful is that it decays, about 12% of the time, into plutonium-236, which in turn decays, with a half-life of roughly 2.9 years, into uranium-232. Uranium-232's decay series is the reason that thorium-based nuclear fuels are considered proliferation resistant, because of the intense gamma radiation associated with its decay product thallium-208. (In addition, 236Pu itself generates an even larger amount of heat than does 238Pu.)
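Using only the numbers quoted above (154,000 y half-life, 12% branch, 2.9 y half-life), the buildup of 236Pu from a stock of 236Np can be sketched with the two-step Bateman equation. This is a toy calculation: it ignores the decay of 232U itself and any neutron reactions on the inventory.

```python
import math

# Decay constants (per year), from the half-lives quoted in the text.
LAM_NP236 = math.log(2) / 154_000   # 236Np, ~154,000 y
LAM_PU236 = math.log(2) / 2.9       # 236Pu, ~2.9 y
BRANCH = 0.12                        # fraction of 236Np decays going to 236Pu

def pu236_atoms(t_years, n0=1.0):
    """236Pu atoms at time t from n0 initial 236Np atoms (2-step Bateman)."""
    l1, l2 = LAM_NP236, LAM_PU236
    return (BRANCH * n0 * l1 / (l2 - l1)
            * (math.exp(-l1 * t_years) - math.exp(-l2 * t_years)))

# Because 236Pu decays much faster than 236Np, it settles at a small
# quasi-equilibrium level (roughly BRANCH * l1/l2 of the inventory)
# within a few of its own half-lives:
print(pu236_atoms(15.0))
```

The equilibrium inventory of 236Pu is tiny, but each of those atoms feeds the 232U chain whose thallium-208 gamma emission makes the material so unattractive to a weapons builder.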

Thus the use of neptunium in fast reactors is extremely proliferation resistant, since it is possible to denature neptunium itself. I have often noted in various places around the internet that the fast neutron cycle is superior to all other fuel cycles, since it represents the only opportunity - given the large amounts of depleted uranium, along with the thorium dumped to provide materials for the useless and ineffective wind industry - to put an end to all energy mining for several centuries: no coal, no oil, no gas, and less lanthanide mining (many important lanthanides are side products of nuclear fuel recycling, notably praseodymium and neodymium).

We need to wake up and smell the air, which is increasingly polluted and degraded precisely because of the fear and ignorance directed against our last best hope.

As I noted in the link I produced above to a post on another website, we can never make nuclear war impossible, since the nuclear cat is out of the bag and as long as uranium exists - and it will always exist - it will always be possible to make nuclear weapons. But the key to making nuclear war less likely is not to ban nuclear power, but rather to embrace it, in particular in the fast cycle, since this cycle makes it possible to denature all potential nuclear materials, including those which occur naturally, specifically, uranium.

Have a nice weekend.

Powerful and strange documentary.

I'm in Germany this evening, and in my hotel, looking for something in English, I came across a documentary by Sue Perkins.

It's about the Indian city where the funeral pyres are burned.


http://www.bbc.co.uk/mediacentre/latestnews/2016/sue-perkins-ganges

I'm not sure whether it's available online, but it's a remarkable film: fascinating on the spiritual importance of the river versus its environmental stress.

The biggest concern in one village is that women are required not to urinate or defecate until dark, and then must do so in the fields, where they can be assaulted, or bitten by snakes, or stung by scorpions.

Told gently with great respect.

We don't have television like this in the US.

I have never been so ashamed to be an American.

I'm at an industry event in Germany and there was a person checking attendees in at one of the halls.

There was no line behind me so she struck up a conversation.

She told me that her son is an excellent student, and that she'd worked all her life to be able to send him to the United States for an American university education, but now she is afraid to do so. (She spoke excellent English, and excellent German.)

I tried to tell her that her son would be safe, but she refused to believe me.

She pinched her skin and told me she was born in Africa. "My son will not be safe in your country," she insisted.

I understand her point. It's very possible that an African young man with a German accent might well be killed in our country.

I'm so ashamed.

We are not just losing the fine citizens who saw our country as a promised land, the best of the best.

We are losing the intellectuals, the thinkers, the people of broad spirit and broad culture who have always enriched our country for more than 200 years.

I'm so ashamed.

Front Lines in the Battle Against Antibiotic Resistance and a Posthumous Synthesis.

My favorite section heading in the scientific review article I will discuss in this post, Natural Products as Platforms To Overcome Antibiotic Resistance (Wuest, Fletcher and Rossiter, Chem. Rev. 2017, 117, 12415−12474), is this one:

"Woodward’s Posthumous Synthesis of Erythromycin A."

Robert Burns Woodward is the only person to have synthesized erythromycin A, which has this structure:



"Only person" is too strong a phrase; it would be better to say that Woodward's group is the only group to have synthesized erythromycin A, the synthesis - almost certainly largely designed by him - being completed by his students and post-docs after he died.

A nice internet riff on Woodward's synthesis of erythromycin A is here: Woodward Wednesdays

He did not, however, synthesize the drug after he died, although he was such a remarkable chemist, probably the greatest organic chemist ever, that one is inclined to think of him in quasi-mystical terms.

The completion of the work was conducted under Harvard Professor Yoshito Kishi, who had been one of Woodward's students.

Here is the final paper on this synthesis, published after Woodward died: Asymmetric total synthesis of erythromycin. 3. Total synthesis of erythromycin

Here are the triumphant remarks on the completion of the synthesis:

Completion of the synthesis of erythromycin was carried out in the following manner. Simultaneous deprotection of both the C-4″ hydroxyl group of the cladinose moiety and the C-9 amino group in 7d by Na-Hg/MeOH13 furnished (9S)-erythromycylamine (10a) [mp 126-129 °C, [α]25D -48.1° (c 0.59, CHCl3); 75% yield], which was found to be identical with an authentic sample prepared from natural erythromycin by a known method.20 Treatment of 10a with N-chlorosuccinimide (1 equiv) in pyridine at 25 °C gave 10b (mp 166-170 °C with partial melting at 130-134 °C), which was dehydrochlorinated by AgF in HMPA at 70 °C to yield erythromycinimine (10c). Hydrolysis of 10c in water at 50 °C afforded the corresponding ketone (40% overall yield from 10a), which was found to be identical with erythromycin (2) in all respects (1H NMR, mp, mmp, [α]D, mass, IR, and chromatographic mobility).22


Here, from the review cited at the opening of the post is a scheme of Woodward's synthesis of erythromycin A:




Woodward, by the way, was an interesting fellow. When he was 11 years old, in 1928, he wrote to the German consulate in Boston to request copies of scientific papers relating to a class of chemical reactions now known as cycloadditions (specifically the Diels-Alder reaction). He entered MIT at 16 in 1933, was kicked out in 1934 for neglecting his studies, was readmitted in 1935, and was granted a bachelor's degree in 1936 and a Ph.D. in 1937, when he was 20 years old. It is said he never actually had much interaction with his Ph.D. advisers, and barely knew them, their "adviser" status being a mere formality. He joined the faculty of Harvard University shortly after, remaining there until his death in 1979. Besides designing the only successful total synthesis of erythromycin A, Woodward was the only scientist to supervise a successful synthesis of vitamin B12. He was awarded the Nobel Prize in 1965.

Later in his life he worked with Roald Hoffmann - who was born a Jew in Poland and was hidden in an attic from 1943 to 1944 (Anne Frank style) between the ages of 7 and 9, where his mother read him geography texts - to formulate the Woodward-Hoffmann rules, for which Hoffmann, a chemistry professor at Cornell University, was awarded the Nobel Prize in 1981, after Woodward died in 1979.

I have their book, one of my happiest possessions, in the original Verlag Chemie edition in my personal library, with an inserted "errata" page in it, and the orbitals drawn in green and blue:




These rules govern the Diels-Alder reaction (and many other reactions) about which Woodward was inquiring when he was 11 years old.

As I remarked earlier in this space, one reason to do the synthesis of complex molecules is that such syntheses are high art, expressions of the beauty of the human mind:

A Beautiful Review Article on the Total Synthesis of Vancomycin.

Commercially, erythromycin is not obtained by organic synthesis; it is isolated from cell cultures, a process which makes it about as cheap as aspirin, and it is still a widely used drug. (Woodward's synthesis involved 48 steps; a synthesis of this type would be commercially - and environmentally - prohibitive.) However, the species it kills are rapidly evolving, and many resistant strains are known.
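A quick way to see why a 48-step route is commercially prohibitive: even at an assumed, and generous, average yield of 90% per step (my illustrative figure, not one from the paper), the overall yield collapses.

```python
per_step_yield = 0.90   # assumed, generous, average yield per step
steps = 48              # length of Woodward's route, per the text

overall = per_step_yield ** steps
print(f"overall yield ≈ {overall:.2%}")   # well under 1%
```

Under one percent of the starting material survives the whole route, before even counting the cost of reagents, solvents, and labor at each step - which is why fermentation wins commercially.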

In Woodward's time, the goal of the organic synthesis of complex natural products was to prove their structure. Modern instrumentation coupled with modern software, high field NMR, high resolution mass spec, and x-ray and neutron diffraction systems have made organic synthesis less important in structural assignments.

Today, the goal of organic synthesis is less about structure and pure science and more about improving the "SAR" (structure-activity relationships) of molecules that are modified versions of the natural products. Organic synthesis can also, in some cases, resolve drug shortages caused by the rarity of the species producing the natural product, an example being the anti-neoplastic (anti-cancer) drug Taxol, originally isolated from the bark of a slow-growing yew tree in the Pacific Northwest, but ultimately provided commercially by partial synthesis from a precursor found in far more plentiful yew needles. Another reason for modifying natural products is that they may not be optimized for use as drugs; they may exhibit unacceptable toxicity which needs moderation, poor bioavailability, short half-lives in vivo, or poor stability.

So this is the goal of modern-day synthesis of natural products: to improve upon nature, optimizing natural products to serve humanity or improving their availability.

This long winded intro about R.B. Woodward brings me to the paper.

One of the many crises now before humanity is the fact that the effectiveness of antibiotics is being defeated by evolution, irrespective of whether the nut cases in the Republican party "believe" in evolution or not. (Evolution is not about "belief"; it is a fact, and morons who cannot accept facts are, um, well, morons.) This evolution is being driven by overuse and by misuse, often on the part of patients who stop taking their antibiotics when they start feeling better, even though some of the infectious organisms still survive in their systems - indeed, precisely those organisms that have the strongest resistance to the antibiotic deployed to defeat them. The consequences should be self-evident.

Antibiotic development is not a big money maker in the pharmaceutical industry by the way. The industry makes more money on drugs that are palliative than drugs which cure diseases. A blood pressure drug that a patient is required to take for the rest of his or her life is going to make more money than a drug that cures a bacterial infection and only needs to be taken for a week or two.

A nice cartoon from the paper cited at the beginning of this post demonstrates the modes of action of almost all antibiotics now on the market:



Another graphic shows the number of people who contract, and die each year from, infectious diseases that are resistant to antibiotics.



Figure 2. Total infections (gray) and deaths (black) in the US associated with various pathogenic bacteria.11 CRE = carbapenem-resistant Enterobacteriaceae; VRE = vancomycin-resistant enterococci; MDR = multidrug resistant.


Remarks from the introduction of the review article:

Among the greatest achievements of humankind in recent history stands the discovery and production of penicillin as a life-saving antibiotic. However, nearly a century of unchecked usage has rendered the world’s supply of antibiotics severely weakened; Sir Alexander Fleming noted in his Nobel lecture that underdosage can apply the selective pressure that induces bacteria to evolve resistance to these drugs. In this review, we contrast the traditional method of semisynthetic modifications to natural products with modern synthetic approaches to develop new antibiotics around the privileged scaffolds that informed drug discovery for decades in order to overcome contemporary antibiotic resistance. In the 90 years since the discovery of penicillin (1), natural products have provided a major foundation for the development of antibiotic drugs. The reliance on natural products to provide new molecular entities for virtually every disease is also well established.1 Of the nine antibiotic classes in Figure 1, six represent naturally occurring compounds, with only three (the sulfonamides, fluoroquinolones, and oxazolidinones) conceived entirely through synthetic chemistry. We note the impressive structural diversity and complexity within the natural product antibiotics especially when compared to the synthetic classes.

Scientists have warned for decades that bacteria are rapidly evolving resistance to antibiotics.2−4 Resistance has proliferated due to a confluence of two key factors: the frequent prescription against infections of a nonbacterial nature, such as viral infections, and unregulated usage, which can lead to sublethal doses, permitting resistance to spread rapidly.5 We also observe that prescribing habits vary drastically from country to country; the United States is particularly likely to use recently developed antibiotics rapidly, possibly shortening their lifetime of efficacy.6 Analysis of the IMS Health Midas database indicated that between 2010 and 2014 consumption of antibiotics worldwide increased by 36%;7 the carbapenems and polymyxins, two "last resort" drugs, have increased in usage by 45% and 13%, respectively. This resistance is extensively observed in hospitals, where immunocompromised patients are particularly vulnerable.8 Hospital-acquired resistant infections have spread rapidly since the initial discovery of sulfonamide- and penicillin-resistant strains shortly after the introduction of these drugs in the 1930s and 1940s.9,10 In the U.S. and U.K. this problem has not abated, as nearly 40−60% of hospital-acquired S. aureus strains are methicillin-resistant.11 These public health threats will continue to rise without new antibiotics and meaningful changes in treating infections. Beyond prescription in humans, antibiotics find extensive use as prophylactic agricultural supplements to promote livestock growth and prevent diseases. It is estimated that the US livestock industry consumes a staggering 80% of antibiotics produced.5 Antibiotic-resistant strains of Salmonella have been identified in ground meat,12 and antibiotic use in livestock has been strongly linked to fluoroquinolone-resistant Salmonella.1


The paper is 51 pages long, and regrettably I cannot reproduce it all in a post like this. The point of the article is to review techniques that may address the utilization of, and semi-synthesis from, natural products that show (out of evolutionary necessity) bactericidal effects.

It's chock full of beautiful synthetic schemes; if you can find your way to a good scientific library, and have a love and understanding of organic chemistry, an afternoon reading it might be really well spent.

One of the funnier parts of the article, well it would be funny were it not so awful, comes after the part just quoted above:

The need for new antibiotics is increasingly widely appreciated as a pressing concern by governments, scientists, and the general public.14,15 These factors, in tandem with the reduced research and development toward discovering new antibiotics, have worsened the recent eruption of antibiotic resistant bacterial populations across the globe. As the golden age of antibiotics has clearly ended, the most pessimistic view of the current state of affairs is that a postantibiotic era may be approaching.


The bold is mine.

Um, not your government. Your government is controlled by ignorant clowns and stupid people who hate science because they are not bright enough or educated enough to know a damned thing about it, other than that they hate it. And while the neo-nazis in the White House and their racist pals in Congress may be slightly worse than the general public, the general public - and we need to include some people on the left as well: anti-vaxxers, anti-nukes and their ilk, anti-this, anti-that - has put this country well on the path to a post-scientific age.

The conclusion of the review begins with a restatement of what I said above:

In most talks given by natural product chemists, it is commonly noted that while natural products can have outstanding biological activities, they are often poor choices for drugs, exemplified by erythromycin. While semisynthetic modifications have historically redirected this potent activity into a clinically useful drug, we have demonstrated that the need for new antibiotics to overcome resistant pathogens is so great as to require new generations of drugs occupying previously unexplored regions of chemical space. We have shown herein that the most direct, efficient, and fruitful method of generating drugs that can evade bacterial resistance mechanisms is through the power of total synthesis. We have outlined synthetic achievements toward many antibiotic scaffolds, both traditional and unexplored, and have discussed how these compounds fare against pathogenic bacteria. Traditional SAR studies can be undertaken to identify the key bioactive moieties, which can then be modified to generate more potent compounds. Additionally, synthetic approaches such as DOS, FOS, and CtD can be used to construct unprecedented scaffolds bearing the complexity of natural products, despite that these molecules may be foreign to Nature to the best of our knowledge. We hope that the examples discussed herein will spark further inspiration in the synthetic community to continue exploring innovative targets and methods to ensure a sustainable antibiotic supply.


And it ends with this plaintive scientific call:

Despite the potential for new antibiotic isolates and scaffolds, we must take care to preserve the efficacy of drugs currently prescribed (and overprescribed!). This requires improving upon our antibiotic stewardship by encouraging reduction in both the over prescription and misuse of these medicines. We must also continue to educate the public about the causes and persistence of antibiotic resistance, in part to drive public favor for a greater allocation of resources to address this crisis. Although the recognition of the term “antibiotic resistance” has increased, the understanding of how to avoid it and how it is caused has not been translated as effectively.15 Only by actively combining scientific innovation and communication can we avoid a postantibiotic era.


I've bolded that line to wish the fine scientists who wrote this review good luck with that. We, in this country, have just established ourselves as a nation of morons, an international laughingstock, with an educational system being directed by an Amway scammer who absolutely hates education.

I don't mean to be too depressing.

Have a nice Sunday in spite of me.

Moisture Swing Absorption/Desorption of CO2 Using Polymeric Ionic Liquids.

The paper I will discuss in this thread is this one: Designing Moisture-Swing CO2 Sorbents through Anion Screening of Polymeric Ionic Liquids (Wang et al., Energy Fuels 2017, 31 (10), 11127–11133)

All of humanity's efforts - or lack of effort mixed with an unhealthy dollop of denial - to address climate change have failed.

No one now living will ever again see a reading of under 400 ppm of the dangerous fossil fuel waste carbon dioxide in the atmosphere, an irreversible milestone that was passed last year.

On the right, the open hatred of science has put an uneducated and unintelligent orange human nightmare in the White House who is actually trying to revive the worst dangerous fossil fuel, coal, while on the left, there has been a delusional embrace of so called "renewable energy" that has led to the very dangerous, and frankly criminal, surge in the dangerous natural gas industry, for which so called "renewable energy" is merely lipstick on the pig.

So called "renewable energy" did not work, is not working, and will not work.

Thus it will fall to future generations - however many survivors there are, if any - to remove carbon dioxide from the atmosphere, an engineering challenge that is enormous, to the extent it is even feasible. Our practiced contempt for them will leave them with few resources with which to address it.

In recent years I've been thinking and reading a great deal about this challenge. It seems to me that there are only two reasonable pathways that might work: one being the extraction of carbon dioxide from seawater (where, on a volume basis, it is far more concentrated than in air), and the other being the pyrolytic processing of biomass, which, because life is self-replicating, can produce huge surface areas capable of absorbing CO2.
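The seawater-versus-air comparison can be made quantitative with a little arithmetic. A minimal sketch, assuming a typical surface-seawater dissolved inorganic carbon content of about 2.2 mmol per liter (a commonly cited literature figure, not a number from any paper discussed here):

```python
# Back-of-the-envelope: CO2 per cubic meter of air vs. seawater.
# Assumed figures (typical literature values, not from the paper):
#   - air: 400 ppm CO2 by volume, molar volume ~24.0 L/mol near 20 C
#   - seawater: dissolved inorganic carbon ~2.2 mmol per liter

M_CO2 = 44.0          # g/mol
ppm_air = 400e-6      # volume fraction of CO2 in air
molar_volume = 24.0   # L/mol, ideal gas near room temperature

co2_in_air = ppm_air * (M_CO2 / molar_volume) * 1000   # g per m^3 of air
dic_seawater = 2.2e-3                                  # mol per L
co2_in_sea = dic_seawater * M_CO2 * 1000               # g CO2-equivalent per m^3

print(f"air:      {co2_in_air:.2f} g CO2 / m^3")
print(f"seawater: {co2_in_sea:.1f} g CO2-equivalent / m^3")
print(f"ratio:    ~{co2_in_sea / co2_in_air:.0f}x")
```

Under these assumptions a cubic meter of seawater holds on the order of a hundred times more carbon than a cubic meter of air, which is why seawater extraction is worth considering at all.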

Right now, of course, the primitive way that biomass is utilized - combustion - is responsible for roughly half of the 7 million people killed each year by air pollution, while mindless dullards do things like whine about the collapse of a tunnel at the Hanford site near Richland, where radioactive materials are stored (one actually sees this kind of thing).

However, in pyrolytic treatment of biomass, the biomass is heated in a closed system that is not vented to the atmosphere. If the heat that drives the pyrolytic reactions is nuclear heat - using the only real resources we are leaving to future generations: used nuclear fuel, depleted uranium, and the thorium waste from the stupid and wasteful wind turbine/electric car industry - pyrolytic treatment of biomass can almost certainly be carried out in a way that is carbon negative, to the extent that carbon is captured in products like graphite.

What is necessary in this case is the ability to separate carbon dioxide from other gases, notably hydrogen.

It is thus with some interest that I read the paper linked at the beginning of this post, published by scientists at the State Key Laboratory of Clean Energy Utilization, Zhejiang University, Hangzhou, Zhejiang Province 310018, P.R. China. (China loses close to 2 million people per year to air pollution, and unlike Americans, they are actually interested in solving the problem.)

An ionic liquid is a salt, almost always at least partially organic, that is liquid at or near room temperature. A great deal of research has been conducted on ionic liquids, and they are very promising reagents for the capture of carbon dioxide.

I discussed them earlier in this space, providing a link to a lecture by the American scientist Joan Brennecke, an expert in ionic liquids:

On the Solubility of Carbon Dioxide in Ionic Liquids.

The work by the Chinese scientists expands greatly on Dr. Brennecke's talk and offers some very novel approaches.

As the title suggests, this is a moisture swing approach to the capture and release of carbon dioxide. Several types of "swings" are utilized in the separation of gases. "Pressure swings" rely on a permeable material that preferentially absorbs or releases one gas faster than another when subjected to changes in pressure; commercial nitrogen and oxygen generators, which consist of little more than a compressor and a valve, work this way to separate either oxygen or nitrogen from air. "Temperature swings" rely on heating and cooling a material that absorbs a gas; monoethanolamine is one such agent, often proposed (but seldom used on a meaningful scale) for carbon capture.
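The idea behind any such swing is a difference in equilibrium loading between two operating conditions; the useful "working capacity" per cycle is the difference in uptake between them. A minimal sketch in Python using a Langmuir isotherm, with entirely hypothetical parameters (no real sorbent is being modeled):

```python
# Sketch of pressure-swing working capacity using a Langmuir isotherm,
# q(P) = q_max * b*P / (1 + b*P).  All parameter values below are
# hypothetical illustrations, not data for any real sorbent.

def langmuir(p_bar, q_max=3.0, b=1.2):
    """Equilibrium loading (mol/kg) at partial pressure p_bar (bar)."""
    return q_max * b * p_bar / (1.0 + b * p_bar)

p_adsorb, p_desorb = 6.0, 1.0   # bar: compressor on, then vented
working_capacity = langmuir(p_adsorb) - langmuir(p_desorb)

print(f"loading at {p_adsorb} bar: {langmuir(p_adsorb):.2f} mol/kg")
print(f"loading at {p_desorb} bar: {langmuir(p_desorb):.2f} mol/kg")
print(f"working capacity per cycle: {working_capacity:.2f} mol/kg")
```

The same bookkeeping applies to a temperature swing or the moisture swing discussed below; only the variable being cycled (pressure, temperature, or humidity) changes.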

This "moisture swing" is unique inasmuch as it relies on wetting and drying a polymeric ionic liquid.

Here's some text from the paper:

Carbon dioxide capture and storage (CCS) is regarded as one of the most effective approaches to alleviating global warming.(1) Among the developed CO2 capture technologies, adsorption based on a solid material is advantageous, since sorbents have unique interfacial properties, such as porous structures, modifiable functional groups,(2, 3) and low emissions to the environment.(4) Recently, polymeric ionic liquids (PILs) have been developed as a family of promising CO2 sorbents, as they possess the unique characteristics of ionic liquids (ILs) and the feasibility of a macromolecular framework.(5-7) Both the CO2 adsorption capacity and rate of PILs can be substantially higher than those of their corresponding ionic liquid monomers.(8) Similar to the fine-tunability of ILs,(9) these properties of PILs can further be tuned by the choice of cations and anions.

Among PILs, the quaternary-ammonium-based PILs have been highlighted in the literature due to their higher stability and CO2 sorption capacity than those of imidazolium-based PILs.(10) More interestingly, when coordinated with relatively basic anions such as OH– and CO32–, these PILs show the unique property of moisture-swing adsorption (MSA).(11) During the moisture-swing cycle, the sorbents adsorb CO2 in a dry atmosphere and release CO2 in a humid atmosphere. For poly[4-vinylbenzyltrimethylammonium carbonate] (P[VBTEA][CO32–]), the equilibrium partial pressure of CO2 under wet conditions is 2 orders of magnitude higher than that under dry conditions.(11) This moisture-induced cycle utilizes the free energy released by water evaporation, and thus, it can avoid the use of high-grade heat for sorbent regeneration and is also environmentally benign...


Humanity will never invent a form of energy that is as safe, as sustainable, or as reliable as nuclear energy. However, nuclear energy is not perfect, of course (a point raised frequently by stupid people with selective attention, who love to burn gas and coal while ranting about, say, Fukushima). From my perspective, the biggest single problem with nuclear energy is what to do with the waste heat.

This system, however, offers a wonderful way to utilize waste heat in, say, hot climates, where such heat may be of less value than in climates with cold spells - not that this planet will have cold spells as frequently as it once did.

The authors continue:

...Of particular interest in this work is the screening of anions for PIL materials with MSA ability for CO2. Previous studies have shown that the type of anion plays a key role in defining the PIL features.(12) After introducing CO32– or OH– anions into P[VBTEA], the sorbents exhibited such a strong CO2 affinity that they were proposed for directly capturing CO2 from the ambient air (400 ppm).(11, 13-16) For quaternary-ammonium-based salt hydrates with F– or acetate (Ac–), the CO2 absorption capacity was observed to be affected by the hydration state,(17) which is similar to the MSA. The CO2 adsorption isotherms indicate that they are suitable for gas separation from concentrated sources (15–100 kPa).(17) To gain insight into the MSA, several theoretical studies have been conducted on ionic interactions, reaction pathways, and hydration energy. Density functional theory (DFT) calculations by Wang et al.(18) demonstrated the reaction pathways of proton transfer for the PIL with the anion CO32–. The results showed that the hydrated water acts as both a reactant and a catalyst during CO2 adsorption. By building a molecular dynamic model for ion pairs of the quaternary-ammonium cation (N(CH3)4+) and CO32– in a confined space, Shi et al.(19) found that the free energy of CO32– hydrolysis in nanopores decreases with the decrease in water availability. However, how the chemical structures, especially the types of anion, of quaternary-ammonium-based PILs would affect the moisture-swing performance for CO2 capture remains unknown. Therefore, extensive work is required to reveal the relationship between the structural and physicochemical properties of PILs, especially through theoretical approaches.


In this paper, the ionic liquid is bound to polystyrene, as the following picture from the paper shows:




The structure of the ionic liquid is optimized so that it can react freely with water, and the authors screen various anions to form ion pairs with the positively charged polymeric cation, so that the structure can accommodate three water molecules.



The thermodynamic reaction diagram is shown:



The structures are optimized, reflecting the finding that the fluoride ion is among the best-performing counterions.



The authors conclude thusly:

Quaternary-ammonium-based PILs are promising sorbents for CO2 capture. In this work, theoretical studies were performed to systematically investigate the effect of varying the counteranion on CO2 adsorption. Our results showed that the PIL with carbonate as the counteranion has the lowest activation energy, strongest CO2 affinity, and largest swing size, and it functions through a two-step mechanism, which indicates that it is a better candidate as a sorbent to capture CO2 from ultralow concentration mixtures, such as that of air. The PIL with fluoride as the counteranion has a low activation energy, strong CO2 affinity, and medium-to-large swing size, and it adsorbs CO2 through a two-step mechanism, owing to the unique ability of fluoride to strongly attract protons. The PIL with acetate as the counteranion has a high activation energy, weak CO2 affinity, and small swing size and functions with a one-step mechanism, which indicates that it is suitable for capturing CO2 at a high partial pressure because of its large capacity and its feasibility to be regenerated through conventional approaches. Further investigations revealed that the repulsion between the two quaternary ammonium cations, which interact with the carbonate anion or two fluoride anions, could promote the dissociation of hydrated water and lower the activation energy of the CO2 adsorption. The two-step reaction pathways also exhibit low activation energy owing to the relatively small structural changes of each step. Our findings could provide a fundamental understanding of CO2 capture by quaternary-ammonium-based PILs and pave the way toward determining the optimal structure of a PIL to be used for CO2 capture in specific circumstances.
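To see why the activation-energy differences the authors emphasize matter so much, recall the Arrhenius relation: the rate scales as exp(-Ea/RT), so even modest barrier differences change rates by orders of magnitude. A small sketch with hypothetical barrier heights (the paper's actual values are not quoted above, so these numbers are purely illustrative):

```python
import math

# Arrhenius illustration: how a difference in activation energy
# translates into a rate ratio at room temperature.
# The Ea values are hypothetical placeholders, not taken from the paper.

R = 8.314      # J/(mol K)
T = 298.15     # K

def relative_rate(ea_kj):
    """Rate factor exp(-Ea/RT), relative to a barrierless reaction."""
    return math.exp(-ea_kj * 1000 / (R * T))

ea_low, ea_high = 40.0, 60.0   # kJ/mol, illustrative only
ratio = relative_rate(ea_low) / relative_rate(ea_high)
print(f"a 20 kJ/mol lower barrier speeds adsorption ~{ratio:.0f}-fold at 25 C")
```

This is why the carbonate and fluoride PILs, with their lower barriers and two-step pathways, are the interesting candidates for capture from dilute streams like air.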


This is interesting. Note that if the ionic liquids themselves are synthesized using carbon dioxide as a starting material (this is definitely possible, particularly using biologically derived lignin as a source for the aromatic rings), the carbon is sequestered by the capture agent itself.

I thought this paper was very cool and thought I'd share it.

Have a nice Friday tomorrow.

Long-Persistent Organic Luminescent Systems.

Here's another cool paper: Organic long persistent luminescence

Long persistent luminescence (LPL) materials—widely commercialized as ‘glow-in-the-dark’ paints—store excitation energy in excited states that slowly release this energy as light1. At present, most LPL materials are based on an inorganic system of strontium aluminium oxide (SrAl2O4) doped with europium and dysprosium, and exhibit emission for more than ten hours2. However, this system requires rare elements and temperatures higher than 1,000 degrees Celsius during fabrication, and light scattering by SrAl2O4 powders limits the transparency of LPL paints1. Here we show that an organic LPL (OLPL) system of two simple organic molecules that is free from rare elements and easy to fabricate can generate emission that lasts for more than one hour at room temperature. Previous organic systems, which were based on two-photon ionization, required high excitation intensities and low temperatures3. By contrast, our OLPL system—which is based on emission from excited complexes (exciplexes) upon the recombination of long-lived charge-separated states—can be excited by a standard white LED light source and generate long emission even at temperatures above 100 degrees Celsius. This OLPL system is transparent, soluble, and potentially flexible and colour-tunable, opening new applications for LPL in large-area and flexible paints, biomarkers, fabrics, and windows. Moreover, the study of long-lived charge separation in this system should advance understanding of a wide variety of organic semiconductor devices4.




...Upon exposure to ultraviolet or visible light, some substances absorb the excitation energy and release it as a differently coloured light either quickly, as fluorescence, or slowly, in the form of either phosphorescence or long persistent luminescence. Exploiting the slowest emission process, LPL materials have been widely commercialized as glow-in-the-dark paints for watches and emergency signs1, and are being explored for application in in vivo biological imaging because their long-lived emission makes it possible to take time-resolved images long after excitation5. In the mid-1990s, a highly efficient LPL system was developed that uses SrAl2O4 doped with europium and dysprosium, and this inorganic system forms the basis of most commercial glow-in-the-dark paints because of its long emission (more than ten hours) and high durability2. However, this system requires not only rare elements for long-lived emission but also very high fabrication temperatures of more than 1,000 °C. Moreover, the manufacturing of paints from the insoluble SrAl2O4 requires many steps, including grinding of the compounds into micrometre-scale powders for dispersion into solvents or matrices, and light scattering by the powders prevents the formulation of a transparent paint1. The realization of LPL from organic molecules would solve many of these problems.
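A signature worth noting: phosphorescence from a fixed excited-state lifetime dies off exponentially, while emission fed by the slow recombination of charge-separated states typically shows a long power-law tail, which is how a system like this can keep glowing for an hour. The lifetime and exponent below are hypothetical, chosen purely to illustrate the contrast between the two decay laws:

```python
import math

# Exponential decay (fixed excited-state lifetime) vs. power-law decay
# (a broad distribution of charge-recombination times).  tau and the
# exponent m are hypothetical, for illustration only.

tau = 1.0  # s, assumed phosphorescence lifetime

def phosphorescence(t):
    """Normalized intensity of a simple exponential decay."""
    return math.exp(-t / tau)

def recombination(t, m=1.1):
    """Normalized intensity with a power-law tail (t in seconds)."""
    return (1.0 + t) ** (-m)

for t in (1, 10, 100, 1000):
    print(f"t={t:5d} s  exponential: {phosphorescence(t):.2e}  "
          f"power law: {recombination(t):.2e}")
```

After a few lifetimes the exponential emitter is effectively dark, while the power-law emitter is still putting out a measurable fraction of its initial intensity.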


Their glow-in-the-dark organic molecules can stay lit for about an hour.

Ion Sieves from Graphene Oxide.

Cool paper, this one: Ion sieving in graphene oxide membranes via cationic control of interlayer spacing (Wu, Jin, Fang, Li et al., Nature 550, 380–383 (19 October 2017))

We're in a hell of a place with water and metals on this planet, whether we know it or not, and our foolish "investment" in so called "renewable energy" - yet another distributed (and thus difficult to control) "thing" in our portfolio of "things" - will make it worse.

This kind of filter could potentially be utilized to manage distributed waste, which is what distributed things become after a short interlude.

As always, the caveat is the requirement for energy.

Anyway, some text from the very interesting materials science paper:

Graphene oxide membranes—partially oxidized, stacked sheets of graphene1—can provide ultrathin, high-flux and energy-efficient membranes for precise ionic and molecular sieving in aqueous solution2, 3, 4, 5, 6. These materials have shown potential in a variety of applications, including water desalination and purification7, 8, 9, gas and ion separation10, 11, 12, 13, biosensors14, proton conductors15, lithium-based batteries16 and super-capacitors17. Unlike the pores of carbon nanotube membranes, which have fixed sizes18, 19, 20, the pores of graphene oxide membranes—that is, the interlayer spacing between graphene oxide sheets (a sheet is a single flake inside the membrane)—are of variable size. Furthermore, it is difficult to reduce the interlayer spacing sufficiently to exclude small ions and to maintain this spacing against the tendency of graphene oxide membranes to swell when immersed in aqueous solution21, 22, 23, 24, 25. These challenges hinder the potential ion filtration applications of graphene oxide membranes. Here we demonstrate cationic control of the interlayer spacing of graphene oxide membranes with ångström precision using K+, Na+, Ca2+, Li+ or Mg2+ ions. Moreover, membrane spacings controlled by one type of cation can efficiently and selectively exclude other cations that have larger hydrated volumes.





There have been previous efforts to tune the interlayer spacing. For example, it can be widened, to increase the permeability of the graphene oxide membrane (GOM), by intercalating large nanomaterials21, 22 as well as by cross-linking large and rigid molecules23. Reducing GOMs can lead to a sharp decrease in the interlayer spacing, but renders them highly impermeable to all gases, liquids and aggressive chemicals24, 25. Recent work reported a way of sieving ions through GOMs by encapsulating the graphene oxide sheets in epoxy films and varying the relative humidity27 to tune the interlayer spacing. It remains difficult to reduce the interlayer spacing sufficiently (to less than a nanometre) to exclude small ions while still permitting water flow and enabling scalable production25. This limits the potential of GOMs for separating ions from bulk solution or for sieving ions of a specific size range from a mixed salt solution—such as the most common ions in sea water and those in the electrolytes of lithium-based batteries and super-capacitors (Na+, Mg2+, Ca2+, K+ and Li+)2, 25. Here we combine experimental observations and theoretical calculation to show that cations (K+, Na+, Ca2+, Li+ and Mg2+) themselves can determine and fix the interlayer spacing of GOMs at sizes as small as a nanometre
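The sieving logic described above lends itself to a very simple sketch: fix the interlayer spacing with one kind of cation, and only ions with hydrated radii no larger than that cation's pass through. The radii below are commonly tabulated hydrated radii in angstroms; treat them as illustrative background values rather than the paper's own data:

```python
# Sketch of the paper's sieving logic: a membrane whose interlayer
# spacing is fixed by one cation excludes cations with larger hydrated
# radii.  Radii are commonly cited hydrated radii (angstroms) from the
# general literature, used here for illustration only.

hydrated_radius = {"K+": 3.31, "Na+": 3.58, "Li+": 3.82,
                   "Ca2+": 4.12, "Mg2+": 4.28}

def permeating_ions(controlling_cation):
    """Ions with hydrated radius <= that of the controlling cation."""
    cutoff = hydrated_radius[controlling_cation]
    return [ion for ion, r in hydrated_radius.items() if r <= cutoff]

# A K+-controlled membrane has the smallest spacing and blocks the rest:
print(permeating_ions("K+"))    # only K+ itself passes
print(permeating_ions("Li+"))   # K+, Na+ and Li+ pass
```

This is exactly the trend the authors exploit: a membrane equilibrated with the smallest hydrated cation rejects everything larger.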


From the conclusion:

In summary, we have experimentally achieved facile and precise control of the interlayer spacing in GOMs, with a precision of down to 1 Å, and corresponding ion rejection, through the addition of one kind of cation. This method is based on our understanding of the strong noncovalent hydrated cation–π interactions between hydrated cations and the aromatic ring, and its production is scalable. We note that our previous density functional theory computations show that other cations (Fe2+, Co2+, Cu2+, Cd2+, Cr2+ and Pb2+) have a much stronger cation–π interaction with the graphene sheet26, suggesting that other ions could be used to produce a wider range of interlayer spacings. Overall, our findings represent a step towards graphene-oxide-based applications, such as water desalination and gas purification, solvent dehydration, lithium-based batteries and supercapacitors and molecular sieving.


Have a nice day tomorrow.

On the future of work...

There are some interesting, if worrisome, reports in the current issue of Nature, one entitled "The Future of Work"


From a news item in the current issue of Nature:

Last year, entrepreneur Sebastian Thrun set out to augment his sales force with artificial intelligence. Thrun is the founder and president of Udacity, an education company that provides online courses and employs an armada of salespeople who answer questions from potential students through online chats. Thrun, who also runs a computer-science lab at Stanford University in California, worked with one of his students to collect the transcripts of these chats, noting which resulted in students signing up for a course. The pair fed the chats into a machine-learning system, which was able to glean the most effective responses to a variety of common questions.

Next, they put this digital sales assistant to work alongside human colleagues. When a query came in, the program would suggest an appropriate response, which a salesperson could tailor if necessary. It was an instantaneously reactive sales script with reams of data supporting every part of the pitch. And it worked; the team was able to handle twice as many prospects at once and convert a higher percentage of them into sales. The system, Thrun says, essentially packaged the skills of the company's best salespeople and bequeathed them to the entire team — a process that he views as potentially revolutionary. “Just as much as the steam engine and the car have amplified our muscle power, this could amplify our brainpower and turn us into superhumans intellectually,” he says.

The past decade has seen remarkable advances in digital technologies, including artificial intelligence (AI), robotics, cloud computing, data analytics and mobile communications. Over the coming decades, these technologies will transform nearly every industry — from agriculture, medicine and manufacturing to sales, finance and transportation — and reshape the nature of work. “Millions of jobs will be eliminated, millions of new jobs will be created and needed, and far more jobs will be transformed,” says Erik Brynjolfsson, who directs the Initiative on the Digital Economy at the Massachusetts Institute of Technology in Cambridge.


http://www.nature.com/polopoly_fs/7.47100.1508235986!/image/Future-of-work_graphic1.jpg_gen/derivatives/landscape_630/Future-of-work_graphic1.jpg

This is actually slightly encouraging. Robots do better in extreme environments than human beings, and since we are well on the way to making this entire planet an extreme environment, it's possible someone, um, something, will be here to um, "enjoy?!?" it.

In another news item in the same issue, there's a report of a computer that learned to play the game "Go" without human intervention and taught itself, in days, to beat the world's best players.

Self-taught AI is best yet at strategy game Go


Artificial-intelligence program AlphaGo Zero trained in just days, without any human input.


An artificial intelligence (AI) program from Google-owned company DeepMind has reached superhuman level at the strategy game Go — without learning from any human moves.

This ability to self-train without human input is a crucial step towards the dream of creating a general AI that can tackle any task. In the nearer-term, though, it could enable programs to take on scientific challenges such as protein folding or materials research, said DeepMind chief executive Demis Hassabis at a press briefing. “We’re quite excited because we think this is now good enough to make some real progress on some real problems.”

Previous Go-playing computers developed by DeepMind, which is based in London, began by training on more than 100,000 human games played by experts. The latest program, known as AlphaGo Zero, instead starts from scratch using random moves, and learns by playing against itself. After 40 days of training and 30 million games, the AI was able to beat the world's previous best 'player' — another DeepMind AI known as AlphaGo Master. The results are published today in Nature1, with an accompanying commentary2.

An FDA panel has recommended approval of the first gene therapy drug.

FDA advisers back gene therapy for rare form of blindness. (News Item, Nature, Vol. 550 Issue 7676)



Advisers to the US Food and Drug Administration (FDA) have paved the way for the agency’s first approval of a gene therapy to treat a disease caused by a genetic mutation.

On 12 October, a panel of external experts unanimously voted that the benefits of the therapy, which treats a form of hereditary blindness, outweigh its risks. The FDA is not required to follow the guidance of its advisers, but it often does. A final decision on the treatment, called voretigene neparvovec (Luxturna), is expected by 12 January.

An approval in the lucrative US drug market would be a validation that gene-therapy researchers have awaited for decades. “It’s the first of its kind,” says geneticist Mark Kay of Stanford University in California, of the treatment. “Things are beginning to look more promising for gene therapy.”

Luxturna is made by Spark Therapeutics of Philadelphia, Pennsylvania, and is designed to treat individuals who have two mutated copies of a gene called RPE65. The mutations impair the eye’s ability to respond to light, and ultimately lead to the destruction of photoreceptors in the retina.

The treatment consists of a virus loaded with a normal copy of the RPE65 gene. The virus is injected into the eye, where the gene is expressed and supplies a normal copy of the RPE65 protein.

In a randomized controlled trial that enrolled 31 people, Spark showed that, on average, patients who received the treatment improved their ability to navigate a special obstacle course1. This improvement was sustained for the full year during which the company gathered data. The control group, however, showed no improvement overall. This was enough to convince the FDA advisory committee that the benefits of the therapy outweigh the risks.

Long road

That endorsement is an important vote of confidence for a field that has struggled over the past 20 years. In the early 1990s, gene therapy was red hot, says David Williams, chief scientific officer at Boston Children’s Hospital in Massachusetts. “You couldn’t keep young people out of the field,” he says. “Everyone wanted in.” Then came the death of a young patient enrolled in a gene-therapy clinical trial, and the realization that a gene therapy used to treat children with an immune disorder could cause leukaemia.

Investors backed away from gene therapy, and some academics grew scornful of it. Although European regulators approved one such therapy in 2012, for a condition that causes severe pancreatitis, many doubted that it worked. (The company that makes it has announced that it will not renew its licence to market the drug when it expires on 25 October.) “You’re too smart to work in this field,” a colleague told Kay. “It’s a pseudoscience.”

But some researchers kept plugging away at the problem, improving the vectors that shuttle genes into human cells. Over time, new clinical trials began to show promise, and pharmaceutical companies became more interested in developing treatments for rare genetic diseases. Gradually, investors returned...


These people will be, of course, GMOs, and we can look for the bourgeois assholes at Greenpeace to protest their existence.

Blindness has never worried Greenpeace types, as we can see from their awful and frankly criminal campaign against golden rice, which might have addressed vitamin A deficiencies in, um, poor people.

It is perfectly acceptable, of course, to make people suffer if people who neither understand nor like science, nor are competent to judge it, object loudly.

This is good news in any case.

I am involved (peripherally) professionally in a project involving gene therapy for a disease that kills people. Thus I find this approval encouraging.
