By Steve Hargreaves
5 December 2011
DOHA, Qatar (CNNMoney) -- Representatives from a half-dozen OPEC nations acknowledged Monday what many U.S. politicians won't -- that global warming is indeed a problem.
The representatives attending the World Petroleum Congress -- a week-long gathering of oil industry executives and government officials held every three years -- outlined steps their countries are taking to move toward cleaner, renewable energy.
"Increasing climate effects are an unquestionable reality," said Sheikh Hamad bin Khalifa Al Thani, the Emir of Qatar. "Developing clean and renewable resources is a goal fully supported by oil and gas exporters."
The opening session of the conference focused on ways the Middle East can help solve the world's energy challenge: dealing with the dependency on a dirty form of fuel that's becoming ever more expensive and will someday run out.
Of course, increasing investment in oil production is a top priority.
The minister from Bahrain detailed several new projects his country is undertaking, and the Kuwaiti minister said his country plans on investing $180 billion over the next two decades in oil field development.
With that investment, Kuwait hopes to increase its oil output to 4 million barrels a day from the current 3 million barrels a day as early as 2020.
But oil ministers from Bahrain and the United Arab Emirates also talked about solar projects their nations are building. Those projects are still modest in size compared to projects in the United States, Spain or other places, but include plans for big expansion going forward. […]
Monday, December 5, 2011
Friday, December 2, 2011
Judge orders Washington state and regional air agencies to regulate climate change pollution from Big Oil
Posted by Elisabeth Keating on December 2, 2011 - 2:58pm
FOR IMMEDIATE RELEASE : December 2, 2011
Challenge to reduce dangerous greenhouse gas emissions from WA oil refineries advances
Seattle, WA —A federal judge today ruled that the Washington Department of Ecology, Northwest Clean Air Agency, and Puget Sound Clean Air Agency have unlawfully failed to regulate climate change pollution from the five oil refineries operating in Washington State. Washington Environmental Council and Sierra Club initiated the lawsuit in March of this year. The lawsuit claimed that state agencies have the duty to regulate climate change pollution from oil refineries because this pollution fits within the definition of “air contaminants” in Washington’s State Implementation Plan, which was approved by the Environmental Protection Agency and is enforceable under the federal Clean Air Act.
All five oil refineries in Washington are owned by big oil companies—BP, ConocoPhillips, Shell Oil, Tesoro and U.S. Oil. Collectively, these oil refineries are responsible for six to eight percent of total state-wide greenhouse gas emissions, primarily in the form of carbon dioxide, methane, and nitrous oxide. The oil refineries were represented in the lawsuit by the Western States Petroleum Association, which intervened in the litigation.
The conservation groups praised the decision by U.S. District Chief Judge Marsha J. Pechman, who ordered the state agencies to begin the regulatory process for controlling climate change pollution from the refineries. “We are heartened by this major step to address the serious air pollution and climate challenges our state faces now and in the near future. Oil refineries are the second-largest stationary source of dangerous climate change pollutants, and it is critical that they do everything they can to preserve the health and well-being of Washington communities,” said Becky Kelley of Washington Environmental Council. “We view this decision as a win for both the environment and the economy,” said Aaron Robins of the Sierra Club. “There are numerous options for reducing climate change pollution from oil refineries that can help protect our environment while making refining operations more efficient and creating new jobs.”
The lawsuit claimed that the state agencies had violated their obligation under Washington’s State Implementation Plan to determine and impose “reasonably available control technologies” on refineries to control climate change pollution. The Court agreed, holding that “Washington’s [State Implementation Plan] requires the Agencies to regulate GHGs.” “The Court affirmed that Washington has the authority and the obligation to address impacts from climate change pollution,” said Janette Brimmer, an attorney with Earthjustice. “Our state can no longer afford to have our regulators sit on their hands and wait for the federal government to deal with the issue—it is time for our state regulators to follow the law and implement long-overdue measures to protect our climate.”
Earthjustice and the law firm of Ziontz, Chestnut, Varnell, Berley & Slonim represented the Sierra Club and Washington Environmental Council in the lawsuit. The decision from Judge Pechman is available at: http://wecprotects.org/issues-campaigns/climate-change/judges-order-in-oil-refineries-litigation/at_download/file
Janette Brimmer, Earthjustice, (206) 343-7340 ext. 1029
Joshua Osborne-Klein, Ziontz, Chestnut, Varnell, Berley & Slonim, (206) 448-1230
Aaron Robins, Sierra Club Washington State Chapter, (425) 442-6726
Becky Kelley, Washington Environmental Council, (206) 631-2602
Sunday, October 30, 2011
This design is close to the full realization of an idea that occurred to me around a decade ago, as I pondered how to house 10 billion humans and still have a biosphere. I built a genetic algorithm framework for modeling these kinds of structures, which was used in Gennaro Senatore: Morphogenesis of Spatial Configurations. It’s amazing to see these kinds of structures actually being built; it’s as though the 21st century has finally arrived.
By Diane Pham
16 October 2011
We've reported extensively on green vertical towers that integrate plant life into their facade, but unlike many of those designs, here's one that goes beyond being a mere concept. Designed by Stefano Boeri, an architect, academic and former editor of the design and architecture magazine Domus, the Bosco Verticale is a towering 27-story structure currently under construction in Milan, Italy. Once complete, the tower will be home to the world's first vertical forest.
The Bosco Verticale is a system that optimizes, recuperates, and produces energy. Covered in plant life, the building aids in balancing the microclimate and in filtering the dust particles contained in the urban environment (Milan is one of the most polluted cities in Europe). The diversity of the plants and their characteristics produce humidity, absorb CO2 and dust particles, produce oxygen, and protect the building from radiation and acoustic pollution. This not only improves the quality of living spaces, but gives way to dramatic energy savings year round.
Each apartment in the building will have a balcony planted with trees that are able to respond to the city’s weather — shade will be provided in the summer, while also filtering city pollution; and in the winter the bare trees will allow sunlight to permeate through the spaces. Plant irrigation will be supported through the filtering and reuse of the greywater produced by the building. Additionally, Aeolian and photovoltaic energy systems will further promote the tower’s self-sufficiency.
The design of the Bosco Verticale is a response to both urban sprawl and the disappearance of nature from our lives and from the landscape. The architect notes that if the units were to be constructed unstacked as stand-alone units across a single surface, the project would require 50,000 square meters of land, and 10,000 square meters of woodland. Bosco Verticale is the first offering in his proposed BioMilano, which envisions a green belt created around the city to incorporate 60 abandoned farms on the outskirts of the city to be revitalized for community use.
Saturday, October 22, 2011
The results of the Berkeley Earth Surface Temperature study (BEST) are in, and to (almost) nobody’s surprise, Earth is warming. Even more compelling is how closely the BEST team’s surface temperature reconstruction matches that of NASA, NOAA, and the Hadley Centre.
The team’s lead is Richard Muller, who’s a well-known climate science skeptic, and for this reason denialists fully expected him to turn climate science on its head and find no global warming. Instead, his results provide strong confirmation that we do, in fact, know how to measure surface temperature correctly. Predictably, denialists have turned on Muller, accusing him of joining “The Team”.
Here’s his editorial in The Wall Street Journal, “The Case Against Global-Warming Skepticism”. The most amusing denialist defense now is the claim that they never disputed the upward trend in the temperature record, only its cause. I’ve gone quite a few rounds with denialists over the years, and I can attest that “there is no warming trend” has always been one of the first arrows out of the quiver.
By RICHARD A. MULLER
21 OCTOBER 2011
[…] let me explain why you should not be a skeptic, at least not any longer.
Over the last two years, the Berkeley Earth Surface Temperature Project has looked deeply at all the issues raised above. I chaired our group, which just submitted four detailed papers on our results to peer-reviewed journals. We have now posted these papers online at www.BerkeleyEarth.org to solicit even more scrutiny.
Our work covers only land temperature—not the oceans—but that's where warming appears to be the greatest. Robert Rohde, our chief scientist, obtained more than 1.6 billion measurements from more than 39,000 temperature stations around the world. Many of the records were short in duration, and to use them Mr. Rohde and a team of esteemed scientists and statisticians developed a new analytical approach that let us incorporate fragments of records. By using data from virtually all the available stations, we avoided data-selection bias. Rather than try to correct for the discontinuities in the records, we simply sliced the records where the data cut off, thereby creating two records from one.
We discovered that about one-third of the world's temperature stations have recorded cooling temperatures, and about two-thirds have recorded warming. The two-to-one ratio reflects global warming. The changes at the locations that showed warming were typically between 1°C and 2°C, much greater than the IPCC's average of 0.64°C. […]
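The procedure Muller describes — cutting records at discontinuities and then looking at the sign of each fragment's trend — can be illustrated with a short sketch. This is a toy example, not the BEST team's actual code: the station data, the gap threshold, and all function names here are invented for illustration.

```python
# Toy sketch of the idea described above: split a station record at gaps,
# then classify each fragment by the sign of its least-squares trend.
# (Illustrative only; not the Berkeley Earth implementation.)

def split_at_gaps(years, temps, max_gap=1):
    """Split a record into continuous fragments wherever the series breaks."""
    fragments, start = [], 0
    for i in range(1, len(years)):
        if years[i] - years[i - 1] > max_gap:   # discontinuity: slice here
            fragments.append((years[start:i], temps[start:i]))
            start = i
    fragments.append((years[start:], temps[start:]))
    return fragments

def trend(years, temps):
    """Ordinary least-squares slope, in degrees C per year."""
    n = len(years)
    mx, my = sum(years) / n, sum(temps) / n
    num = sum((x - mx) * (y - my) for x, y in zip(years, temps))
    den = sum((x - mx) ** 2 for x in years)
    return num / den

# A made-up record with a break between 1953 and 1980
years = [1950, 1951, 1952, 1953, 1980, 1981, 1982, 1983]
temps = [14.0, 14.1, 14.0, 14.2, 14.6, 14.5, 14.7, 14.8]

fragments = split_at_gaps(years, temps)
warming = sum(1 for ys, ts in fragments if trend(ys, ts) > 0)
print(len(fragments), warming)  # 2 fragments, both with a warming trend
```

Counting warming versus cooling fragments across tens of thousands of such records is what yields the two-to-one ratio the article mentions.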
When we began our study, we felt that skeptics had raised legitimate issues, and we didn't know what we'd find. Our results turned out to be close to those published by prior groups. We think that means that those groups had truly been very careful in their work, despite their inability to convince some skeptics of that. They managed to avoid bias in their data selection, homogenization and other corrections.
Global warming is real. Perhaps our results will help cool this portion of the climate debate. How much of the warming is due to humans and what will be the likely effects? We made no independent assessment of that.
A new analysis of the temperature record leaves little room for the doubters. The world is warming
Oct 22nd 2011
FOR those who question whether global warming is really happening, it is necessary to believe that the instrumental temperature record is wrong. That is a bit easier than you might think.
There are three compilations of mean global temperatures, each one based on readings from thousands of thermometers, kept in weather stations and aboard ships, going back over 150 years. Two are American, provided by NASA and the National Oceanic and Atmospheric Administration (NOAA), one is a collaboration between Britain’s Met Office and the University of East Anglia’s Climatic Research Unit (known as Hadley CRU). And all suggest a similar pattern of warming: amounting to about 0.9°C over land in the past half century.
To most scientists, that is consistent with the manifold other indicators of warming—rising sea-levels, melting glaciers, warmer ocean depths and so forth—and convincing. Yet the consistency among the three compilations masks large uncertainties in the raw data on which they are based. Hence the doubts, husbanded by many eager sceptics, about their accuracy. A new study, however, provides further evidence that the numbers are probably about right. […]
Hot Dog Bites Skeptical Man: Koch-Funded Berkeley Temperature Study Does “Confirm the Reality of Global Warming”
By Joe Romm
20 October 2011
Four new papers confirm that “the world is warming fast,” as the Economist summed it up. One paper finds that “the effect of urban heating on the global trends is nearly negligible.” Another finds that the work of the scientist-smearing denier Anthony Watts is pure BS.
Okay, that’s all “dog bites man” stuff, which is to say, not news in the least. The news is that this work was funded in part by Charles Koch, a leading funder of deniers, and two of the key authors are well-known smearers of climate scientists, Judith Curry and Richard Muller. Hot dog!
Climate Progress actually broke this story back in March — see Exclusive: Berkeley temperature study results “confirm the reality of global warming and support in all essential respects the historical temperature analyses of the NOAA, NASA, and HadCRU.” That was based on an email Climatologist Ken Caldeira sent me after seeing their preliminary results and a public talk by Muller confirming:
- “We are seeing substantial global warming”
- “None of the effects raised by the [skeptics] is going to have anything more than a marginal effect on the amount of global warming.”
But now the Berkeley Earth Surface Temperature Study has completed its “independent” analysis of all of the temperature stations and found a rate of warming since the 1950s as high as NOAA's and NASA's, and faster than the (much maligned) UK Hadley/CRU data.
If there is any news here it is that Watts has been demonstrated once and for all to be an “anti-scientist” — not just someone who routinely smears scientists, but someone who represents the negation of the scientific method. No facts can change his conclusions. He is a science rejectionist — and an uber-hypocritical one, as we’ll see.
Watts had famously promised “I’m prepared to accept whatever result they produce, even if it proves my premise wrong.” He and other deniers even started working with BEST to influence the outcome, as I first reported here: “Bombshell: Climate Science deniers claim to have full access to Berkeley temperature study work-product — and are now working with the Berkeley team!”
But BEST just released a whole paper devoted to debunking Watts’ life work – his effort to smear climate scientists by accusing them of knowingly using bad temperature stations to rig their results. NOAA had debunked Watts 2 years ago (see here), of course. But now it’s friendly fire trained on Watts. […]
By ANDREW REVKIN
20 October 2011
Anthony Watts and others who have energized climate skeptics by claiming to poke holes in research showing substantial recent warming have their work cut out for them.
Richard Muller, a noted Berkeley physicist who’s been a strident critic of climate campaigners, has released a much-anticipated new package of studies, along with all of his team’s data and methods, that powerfully challenges one of the prime talking points of pundits and politicians trying to avoid a shift away from fossil fuels.
The assertion has been that the world hasn’t really warmed — just the thermometers — due to expanding asphalt and concrete around cities and other locations housing weather stations.
You can find Muller’s materials at Berkeleyearth.org. [4:52 p.m. | Update | Anthony Watts has posted a long piece stressing the important point that the Muller work has not yet been peer reviewed. (A Dot Earth reader below notes some irony in this complaint.)] […]
By Peter Gleick
20 October 2011
Oh, we already knew that.
That’s what crossed my mind today when I read the news release and then the actual scientific papers and then the Wall Street Journal opinion piece about the new conclusions of the study of the Earth’s surface temperature records from the Berkeley Earth Surface Temperature (BEST) group.
The scientific community has known — and been saying for decades — that the earth is warming up. Except for a small cadre of highly vocal, ideologically stuck, but increasingly marginalized people, there is no dispute about this among scientists. The data are extensive – covering the globe – and they have been vetted, reanalyzed, corrected for error, compared with satellite data, and subjected to every known criticism. And independent group after independent group has found the same thing: the earth is warming. The fact that this is actually old news can be seen in the latest poll (from Stanford University with Ipsos and Reuters) showing that, despite the inability of all the leading Republican presidential candidates to publicly acknowledge this, even 83% of the American people believe the earth is warming. And there probably isn’t much that 83% of the American people will agree on these days.
Indeed, even most remaining climate change skeptics and deniers have moved away from saying there is no warming. Now, their major talking points are that it isn’t caused by humans, or only a little bit, or it won’t be bad, or we can’t afford to fix it, or… Denial is a moving target.
Nevertheless, among a small group of skeptics there has been a lot of noise denying warming, ostensibly on the grounds that there are problems with the temperature measurements, thermometers, long-term records, methods of analysis, and more. The leading proponent of this view is Anthony Watts, a meteorologist who runs a popular blog site for climate skeptics. Watts has argued for a long time that our temperature records or analyses stink and that we cannot, therefore, believe the scientists who have shown over and over that it is warming. It has always been hard to take Watts seriously, given the massive amounts of evidence for warming, even beyond the clear temperature records themselves: the disappearing glaciers, the disappearing Arctic ice, the changes in migratory patterns for birds, the faster blooming of plants, the more extreme heat waves, the high ratio of record high temperatures to record low temperatures, the movement of plant and pest species toward the poles, the disappearing permafrost, the rising sea levels… I could go on and on. None of this convinces the diehards, though. […]
By Paul Krugman
21 October 2011
If you follow this blog regularly, you’ll know that whenever I present data — and I do present a lot of data — right-wingers will complain of “cherry-picking”. They never have a clear example of how I should do things differently — or if they do, it’s always obviously wrong. But what they really mean is that they won’t accept data that doesn’t tell them what they want to hear.
This stuff is a minor version of what goes on, on a far bigger and more important scale, with regard to climate change. No matter how much evidence scientists accumulate, they’re accused of somehow manipulating the data.
Now, as Andy Revkin and Joe Romm tell us, one prominent skeptic who actually believed that the data was being manipulated has reported in detail on his efforts to produce clean climate data. And guess what: his data overwhelmingly confirm what climate scientists have been saying.
Richard Muller, the skeptic we’re talking about, seems to have had different motivations from many of the professional climate skeptics. He basically appears to have suffered from nothing more than characteristic physicist arrogance, the belief that people in lesser sciences just don’t know what they’re doing. (Economists experience this all the time, but we make up for it by being equally condescending to sociologists.) To his credit, he went and tried to do better — and is now being honest in revealing that what he got was pretty much the same as the results of previous research.
Of course, you know how the professional skeptics have responded; Joe Romm has the ugly but predictable details.
Oh, one more thing, relevant to both this story and today’s column: landing in my inbox this morning was
POLITICO Playbook, presented by the American Petroleum Institute
Thursday, September 8, 2011
ScienceDaily (Sep. 5, 2011) — The smallest electrical motor on the planet, at least according to Guinness World Records, is 200 nanometers. Granted, that's a pretty small motor -- after all, a single strand of human hair is 60,000 nanometers wide -- but that tiny mark is about to be shattered in a big way.
Chemists at Tufts University's School of Arts and Sciences have developed the world's first single molecule electric motor, a development that may potentially create a new class of devices that could be used in applications ranging from medicine to engineering.
In research published online Sept. 4 in Nature Nanotechnology, the Tufts team reports an electric motor that measures a mere 1 nanometer across, groundbreaking work considering that the current world record is a 200 nanometer motor. A single strand of human hair is about 60,000 nanometers wide.
According to E. Charles H. Sykes, Ph.D., associate professor of chemistry at Tufts and senior author on the paper, the team plans to submit the Tufts-built electric motor to Guinness World Records.
"There has been significant progress in the construction of molecular motors powered by light and by chemical reactions, but this is the first time that electrically-driven molecular motors have been demonstrated, despite a few theoretical proposals," says Sykes. "We have been able to show that you can provide electricity to a single molecule and get it to do something that is not just random."
Sykes and his colleagues were able to control a molecular motor with electricity by using a state-of-the-art, low-temperature scanning tunneling microscope (LT-STM), one of only about 100 in the United States. The LT-STM uses electrons instead of light to "see" molecules.
The team used the metal tip on the microscope to provide an electrical charge to a butyl methyl sulfide molecule that had been placed on a conductive copper surface. This sulfur-containing molecule had carbon and hydrogen atoms radiating off to form what looked like two arms, with four carbons on one side and one on the other. These carbon chains were free to rotate around the sulfur-copper bond.
The team determined that by controlling the temperature of the molecule they could directly impact the rotation of the molecule. Temperatures around 5 Kelvin (K), or about minus 450 degrees Fahrenheit (°F), proved ideal for tracking the motor's motion. At this temperature, the Tufts researchers were able to track all of the rotations of the motor and analyze the data. […]
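As a quick sanity check on the unit conversion above, the Kelvin-to-Fahrenheit formula is F = K × 9/5 − 459.67, so 5 K works out to −450.67 °F, i.e. the "about minus 450" quoted in the article:

```python
# Kelvin to Fahrenheit: F = K * 9/5 - 459.67
def kelvin_to_fahrenheit(k):
    return k * 9.0 / 5.0 - 459.67

print(kelvin_to_fahrenheit(5))  # about -450.67 F
```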
Sunday, July 31, 2011
I predict that not a single peer-reviewed scientific paper will come out of this from the deniers’ side. This will be used by them as further evidence of the climate-scientist/Al-Gore/World-Socialist conspiracy.
By Andy Coghlan
28 July 2011
Temperature records going back 150 years from 5113 weather stations around the world were yesterday released to the public by the Climatic Research Unit (CRU) at the University of East Anglia in Norwich, UK. The only records missing are from 19 stations in Poland, which refused to allow them to be made public.
"We released [the dataset] to dispel the myths that the data have been inappropriately manipulated, and that we are being secretive," says Trevor Davies, the university's pro-vice-chancellor for research. "Some sceptics argue we must have something to hide, and we've released the data to pull the rug out from those who say there isn't evidence that the global temperature is increasing."
The university was ordered to release the data by the UK Information Commissioner's Office, following a freedom-of-information request for the raw data from researchers Jonathan Jones of the University of Oxford and Don Keiller of Anglia Ruskin University in Cambridge, UK.
Davies says that the university initially refused on the grounds that the data is not owned by the CRU but by the national meteorological organisations that collect the data and share it with the CRU.
When the CRU's refusal was overruled by the information commissioner, the UK Met Office was recruited to act as a go-between and obtain permission to release all the data.
Poland refused, and the information commissioner overruled Trinidad and Tobago's wish for the data it supplied on latitudes between 30 degrees north and 40 degrees south to be withheld, as it had been specifically requested by Jones and Keiller in their FOI request and previously shared with other academics. […]
Other mainstream researchers and defenders of the consensus are not so confident that the release will silence the sceptics. "One can hope this might put an end to the interminable discussion of the CRU temperatures, but the experience of GISTEMP – another database that's been available for years – is that the criticisms will continue because there are some people who are never going to be satisfied," says Gavin Schmidt of Columbia University in New York.
"Sadly, I think this will just lead to a new round of attacks on CRU and the Met Office," says Bob Ward, communications director of the Grantham Research Institute on Climate Change and the Environment at the London School of Economics. "Sceptics will pore through the data looking for ways to criticise the processing methodology in an attempt to persuade the public that there's doubt the world has warmed significantly." […]
Wednesday, July 20, 2011
By Randy Dotinga
18 July 2011
San Diego – Postcard after postcard came addressed to Naomi Oreskes after she wrote her first book on how scientists study the movement of continents.
A groundswell of attention, perhaps? Not exactly. Her mother wrote them all, dashing off each postcard after finishing a chapter. Outside the worlds of science and academia, the book didn't attract much attention.
But 12 years later, the Manhattan-raised historian is traveling a much more public path, drawing both praise and condemnation. She's a fierce defender of scientists and a leader in the vanguard of those who strongly advocate that the world must acknowledge and deal with global warming.
"Professor Oreskes has turned vilified scientists into the heroes they deserve to be," says John Abraham, an associate professor at the University of St. Thomas in St. Paul, Minn. She's performing a service regarding global warming by showing "how a few organized and influential people were able to confuse the country long after the science was settled," he says.
Oreskes, a professor of history and science studies at the University of California, San Diego, acknowledges that she's trying to save the world. Earlier, though, her goal was simpler. She wanted to understand scientists by studying their past, in terms of both their findings and their funding.
"What difference does it make who pays for scientific research?" she says. "I'm interested in how scientists decide they have enough evidence to say they know something, and what difference it makes who pays for the work. We want science to be objective and neutral, but someone has to pay for it, and there's that old cliché about whoever pays the piper chooses the tune."
After writing about continental drift and plate tectonics, Oreskes began focusing on the efforts of oceanographers.
They were working to better understand the relation between the ocean and the atmosphere. In the process, they uncovered signs of global warming.
"I thought, 'Wow, this is unbelievable, there's this whole history that no one talks about,' " she says. "People have no idea how old the science [of global warming] is."
In 2004, Oreskes wrote a brief paper in the influential journal Science debunking claims that scientists disagreed about global warming. Instantly, she found herself at the center of an emotional dispute. News media cited her work, as did the Al Gore movie, An Inconvenient Truth.
Then, as now, Oreskes offers a simple message backed by extensive documentation: There is no scientific debate over climate change. None, zero, zip.
"The science is stable, the science is real, and there's no reason to doubt climate change," she says. […]
Sunday, July 17, 2011
By Audie Cornish
17 July 2011
NASA's shuttle program ends when Atlantis comes back to Earth this week. It's not the end of American space exploration, however; it's the beginning of a new phase in commercial space travel.
For now, American astronauts will be hitching rides to the International Space Station on Russian Soyuz capsules. NASA and President Obama hope that won't be for long; they're counting on America's private sector to come up with a new way to get people, equipment and supplies into space.
That means there's a lot of money to be made in shuttling back and forth to the space station, and several companies are competing in a new race to space. Defense contractors like Boeing are in the game, as is Virgin Galactic — the private space tourism company owned by British business tycoon Richard Branson.
Whatever the new space vehicle is, it'll need a place to park. Enter Spaceport America, a company building a kind of airport for spaceships.
According to the people behind Spaceport America, the future of commercial space travel begins near the tiny New Mexican town of Truth or Consequences, where America's first commercial spaceport is under construction. Just outside of town, highway signs bear little yellow stickers in the shape of a rocket.
"It's kind of a mystery. We don't know who's putting those there," says David Wilson, spokesman for the state of New Mexico's Spaceport Authority. Really, he insists, it's not the spaceport.
On a 45-minute drive deep into the desert, miles of spiky grasses outline the horizon — interrupted by the occasional bison. The sky above is powder blue and clear of clouds. For decades, it's been the perfect place for pioneering rocket scientists.
"Robert Goddard brought his experiments and rockets to the New Mexican desert in the '30s for the same reasons," Wilson says. "There's this incredible weather window; there's no population out here, and then you're a mile up from sea level. We have a saying around here, 'The first mile of space is free.' It takes less energy to get to space from a place out here like this." […]
Thursday, June 30, 2011
By GEIR MOULSON, Associated Press
30 June 2011
BERLIN — German lawmakers overwhelmingly approved on Thursday plans to shut the country's nuclear plants by 2022, putting Europe's biggest economy on the road to an ambitious build-up of renewable energy.
The lower house of parliament voted 513-79 for the shutdown plan drawn up by Chancellor Angela Merkel's government after Japan's post-earthquake nuclear disaster. Most of the opposition voted in favor; eight lawmakers abstained.
Lawmakers sealed for good the shutdown of eight of the older reactors, which have been off the grid since March. Germany's remaining nine reactors will be shut down in stages by the end of 2022.
By 2020, Germany wants to double the share of energy stemming from water, wind, sun or biogas to at least 35 percent. Until this year, nuclear energy accounted for a bit less than a quarter of Germany's power supply.
"Some people abroad ask: will Germany manage this? Can it be done? It is the first time that a major industrial country has declared itself ready to carry through this technological and economic revolution," Environment Minister Norbert Roettgen told lawmakers.
"The message from today is this: the Germans are getting to work," he said. "This will be good for our country, because we all stand together. So let's get to work." […]
Monday, June 27, 2011
Washington DC (SPX) Jun 17, 2011 – Even as Germany, Japan, Switzerland and other nations move to abandon existing and planned nuclear reactors, the United States is on a path to see at best only a small handful of already planned, government-backed reactor projects proceed, a group of experts said.
While reversals for the nuclear power industry overseas have attracted substantial media attention, relatively little focus has been paid to such developments in the U.S. as the mothballing of the South Texas Project in Texas (once a prime candidate for a federal loan guarantee), the Calvert Cliffs-3 reactor expansion in Maryland (another federal loan guarantee candidate despite major complications presented by foreign ownership issues), and the decision this week by the French industry leader Areva to halt production at a Virginia reactor component plant - a direct result of the downturn in the industry's prospects.
The industry's situation is now such that even the controversial Obama Administration proposal for $36 billion in Treasury-backed loan guarantees for new reactors likely would be a case of throwing good money after bad, according to the experts.
Peter Bradford, former member of the United States Nuclear Regulatory Commission, former chair of the New York and Maine utility regulatory commissions, and currently adjunct professor at Vermont Law School on "Nuclear Power and Public Policy," said: "Even before Fukushima, events over the last two years had amply demonstrated that new nuclear power was a bad investment in the U.S. Cost estimates had continued to rise while those of alternatives fell. Wall Street rating agencies were uniformly skeptical.
Constellation pulled out of Calvert Cliffs last October. Exelon did the same for its proposed Texas reactors, and did so in the context of a review of its low carbon options that showed new nuclear to be far more expensive than most of its other choices.
Bradford added: "Since Fukushima, NRG has pulled the plug on South Texas and the County of Fresno in California has reconsidered its support for new nuclear units there. If the past is any guide, there will soon appear stories about how the U.S. nuclear renaissance was well underway before being stalled by the one-of-a-kind nuclear accident at Fukushima.
Just as we are often wrongly told that the first nuclear construction wave in the U.S. ended because of the accident at Three Mile Island, industry spokespeople will use Fukushima to obscure the fact that new nuclear has been priced out of the market in the U.S. for many years.
Under these circumstances, adding additional exposure to American taxpayers in the form of nuclear loan guarantees can't be justified."
Paul Fremont, managing director of equity research, Jefferies and Company, Inc., said: "The estimated cost of building a new nuclear plant varies widely from $4,500 per kW estimated by NRG for its cancelled project in Texas to $6,350 per kW estimated by Southern Company for its project in Georgia.
Today, nuclear represents the highest-cost option to construct compared to traditional technologies, including coal at an estimated cost of $2,000-$3,000 per kW and gas combined cycle units at $950 per kW. According to Jefferies analysis, the best economic alternative for new build today is gas, based on forward prices ranging from $4.40 per MCF expected in 2011 to $6.00 in 2015."
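Fremont's per-kilowatt figures are easy to put side by side. A quick sketch (the dollar figures are the article's estimates; the 1,000 MW plant size is an assumed illustration, not from the article):

```python
# Illustrative comparison of the per-kW capital costs quoted above.
# All cost figures are the article's estimates, not current market data.

CAPITAL_COST_PER_KW = {
    "nuclear (NRG, low)": 4500,
    "nuclear (Southern Co., high)": 6350,
    "coal (low)": 2000,
    "coal (high)": 3000,
    "gas combined cycle": 950,
}

def overnight_cost(tech: str, capacity_mw: float) -> float:
    """Capital cost in dollars for a plant of the given size (MW)."""
    return CAPITAL_COST_PER_KW[tech] * capacity_mw * 1000  # 1 MW = 1000 kW

# An assumed 1,000 MW plant at the low nuclear bound versus gas:
nuclear_low = overnight_cost("nuclear (NRG, low)", 1000)
gas = overnight_cost("gas combined cycle", 1000)
print(f"nuclear: ${nuclear_low / 1e9:.1f}bn, gas: ${gas / 1e9:.2f}bn, "
      f"ratio: {nuclear_low / gas:.1f}x")
```

Even at NRG's low estimate, the nuclear plant's capital cost comes out nearly five times the gas plant's, which is the gap driving Fremont's conclusion.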
Fremont added: "In March 2010, Jefferies published a report on nuclear new build titled 'Sympathy for the Devil' arguing that absent U.S. government subsidies, gas prices would need to be $8.50 per MCF or higher to earn reasonable (10 percent) returns on new nuclear investment. […]
Friday, June 17, 2011
By Brian Merchant
12 May 2011
Until a few months ago, you'd be hard-pressed to find a more classic climate skeptic than D.R. Tucker. A conservative author and radio talk show host, he didn't buy the notion that greenhouse-gas emissions were causing temperatures to rise. He was pretty sure global warming was a hoax perpetrated by Al Gore and a cadre of liberal, grant-hungry scientists. Then Tucker did what partisan pundits and climate skeptics rarely do: He changed his mind.
"I was defeated by facts," Tucker announced on FrumForum, the popular conservative blog. In an April 18 post, "Confessions of a Climate Convert," Tucker told readers how he came to question the ideologies of the climate debate, examine the science, and conclude that global warming was, in fact, very real. Tucker's post sent a giddy ripple through green circles and stoked the ire of his libertarian colleagues.
This sort of thing doesn't happen often. Or at least, it doesn't seem to. Only 48 percent of Americans believe that global warming is at least in part "a result of human activities," according to a 2010 Gallup poll, down from 60 percent in 2007 and 2008.
Anthony Leiserowitz, director of the Yale Project on Climate Change Communication, attributes this decline to five factors: The economic collapse, a severe decrease in media coverage, weather events like "Snowmaggedon," the efforts of the "denial industry" (the network of industry-funded think tanks and political advocacy groups that push skeptic views), and the "ClimateGate" debacle.
This shift toward climate-change skepticism makes Tucker's "conversion" all the more remarkable. So how did it happen?
Leiserowitz has been documenting trends in American climate belief for the past decade. He divides American attitudes toward climate change into six categories: "alarmed," "concerned," "cautious," "disengaged," "doubtful," and "dismissive." …
Tucker was a naysayer. "I bought into Rush Limbaugh's view that the environmentalist movement was 'the new refuge of socialist thinking,' " he tells me. Tucker figured Al Gore and Van Jones (Obama's onetime green jobs adviser) were leading liberals in a plot that used the specter of climate change to snare more power. Leiserowitz would call this "dismissive" thinking.
Tucker's conversion began when he read Morris Fiorina's Disconnect, which outlines the way partisan divisions take shape between Democrats and Republicans, and points out that environmentalism used to be one of conservatives' chief concerns. Tucker's curiosity was piqued.
"Why was it that environmentalism was only associated with the Democratic party now? And it was from those political questions that I became open to the scientific questions," Tucker says. "It went from politics to the science." …
By Jean Chemnick, E&E reporter
14 June 2011
A former Republican congressman who is an advocate for action to address climate change plans to launch a new conservative coalition this fall made up of Republicans who, like him, believe that human emissions are triggering global warming and that steps should be taken to stop it.
Former six-term Rep. Bob Inglis (R-SC) said he hopes his coalition will become a factor in the 2012 presidential and congressional elections -- and beyond. He said the view embraced by many Republicans that human emissions are not a major contributor to global warming is out of step with what it means to be a conservative, given that most scientists say the reverse is true.
"Conservatives typically are people who try to be cognizant of risk and move to minimize risk. To be told of risk and to consciously decide to disregard it seems to be the opposite of conservative," Inglis said in a telephone interview.
He said his coalition would seek to change that, even if the message takes a while to stick.
"What I hope to do is be a part of an effort that calls conservatives to return to conservatism and to turn away from the populist rejection of science," Inglis said. He conceded that he expects this message to take at least two election cycles to take root, given today's political climate. ...
Tuesday, May 24, 2011
By Alan Boyle
10 May 2011
A Navy-funded effort to harness nuclear fusion power reports that its unconventional plasma device is operating as designed and generating "positive results" more than halfway through the project.
The latest quarterly update from EMC2 Fusion Development Corp. comes amid other signs that seemingly oddball approaches to fusion research may not be all that oddball after all. Just last week, General Fusion announced that Amazon.com's billionaire founder, Jeff Bezos, was part of a $19.5 million investment round to further the company's plan to take advantage of a technology called magnetized target fusion. Another billionaire, Paul Allen, is an investor in Tri Alpha Energy, which is working on its own hush-hush fusion project (and occasionally publishing its research).
EMC2 Fusion doesn't have tens of millions in venture capital to play with — but it does have a $7.9 million Navy contract to test a plasma technology known as inertial electrostatic confinement fusion, also known as Polywell fusion. The idea is to accelerate positively charged ions in an electrical cage to such an extent that they occasionally spark a fusion reaction, releasing energy and neutrons. The concept was pioneered by the late physicist Robert Bussard, and carried forward by the EMC2 Fusion team in Santa Fe, N.M.
Some of the leading team members went on leave from Los Alamos National Laboratory to work on EMC2. Rick Nebel, the Los Alamos engineer who had led the company since Bussard's death in 2007, retired from the company last November. Taking his place as acting chief executive officer is Jaeyoung Park. The 41-year-old physicist says he's given up his position at Los Alamos to focus fully on EMC2.
"We had a lot of milestones to meet in the last six months or so," Park told me today. "It's been pretty hectic." …
Sunday, May 22, 2011
TOKYO, May 22 (AFP) — Japan is considering a plan that would make it compulsory for all new buildings and houses to come fitted with solar panels by 2030, a business daily said Sunday.
The plan, expected to be unveiled at the upcoming G8 Summit in France, aims to show Japan's resolve to encourage technological innovation and promote the wider use of renewable energy, the Nikkei daily said.
Japan has reeled from the March 11 earthquake and tsunami and the nuclear crisis they triggered as it battles to stabilise the crippled Fukushima Daiichi atomic power plant.
On Thursday, the first day of the two-day summit in Deauville, France, Prime Minister Naoto Kan is expected to announce Japan's intention to continue operating nuclear plants after confirming their safety, the Nikkei said without citing sources.
But he is also expected to unveil a plan to step up efforts to push renewable energy and energy conservation.
Kan believes that the installation of solar panels would help Japan realise such goals, the Nikkei said.
He hopes that technological innovation will drastically bring down costs of solar power generation and thereby make the use of renewable energy more widespread, it said.
Friday, May 20, 2011
This design looks dead brilliant. From the company site:
The ridgeblade is a cross flow turbine fitted on the ridge line at the top of a building and uses the existing roof area to collect and focus the wind. As the wind is forced to travel over the roof surface, the airflow accelerates through the turbine. And whilst the unit is fixed to the roof and doesn’t turn to face the wind, the advanced blade design means that it works in 70% of wind directions.
The ridgeblade is an innovative, affordable and effective way of harnessing the wind's power to produce renewable electricity.
The ridgeblade addresses the issues associated with traditional micro-wind generation technologies.
The unique design means it can reliably produce electricity in low or variable wind conditions whilst creating very little visual impact.
Tuesday, May 17, 2011
UN's climate change science body says renewables supply, particularly solar power, can meet global demand
By Fiona Harvey, www.guardian.co.uk
9 May 2011
Renewable energy could account for almost 80% of the world's energy supply within four decades - but only if governments pursue the policies needed to promote green power, according to a landmark report published on Monday.
The Intergovernmental Panel on Climate Change, the body of the world's leading climate scientists convened by the United Nations, said that if the full range of renewable technologies were deployed, the world could keep greenhouse gas concentrations to less than 450 parts per million, the level scientists have predicted will be the limit of safety beyond which climate change becomes catastrophic and irreversible.
Investing in renewables to the extent needed would cost only about 1% of global GDP annually, said Rajendra Pachauri, chairman of the IPCC.
Renewable energy is already growing fast – of the 300 gigawatts of new electricity generation capacity added globally between 2008 and 2009, about 140GW came from renewable sources, such as wind and solar power, according to the report.
The investment that will be needed to meet the greenhouse gas emissions targets demanded by scientists is likely to amount to about $5trn in the next decade, rising to $7trn from 2021 to 2030. …
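The report's headline numbers hang together arithmetically. A quick check (the world-GDP figure is my assumption for illustration, not from the article):

```python
# Renewables' share of new generating capacity, 2008-2009 (article figures).
new_capacity_gw = 300
renewable_gw = 140
share = renewable_gw / new_capacity_gw
print(f"renewable share of new capacity: {share:.0%}")

# Cross-check "about 1% of global GDP" against the $5trn-per-decade figure.
# Assumed world GDP of roughly $60trn/year (a circa-2010 ballpark; this
# number is not from the article).
world_gdp_trn = 60.0
annual_investment_trn = 5.0 / 10  # $5trn spread over the coming decade
print(f"investment as share of GDP: {annual_investment_trn / world_gdp_trn:.1%}")
```

Under that assumption, $500bn a year works out to a bit under 1% of world output, consistent with Pachauri's "about 1%" figure.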
Monday, May 16, 2011
15 May 2011
It’s been a long time coming, but there has now been an official finding in at least one of the complaints concerning the dubious scholarship of GMU professors Edward Wegman and Yasmin Said. According to Dan Vergano of USA Today, the journal Computational Statistics and Data Analysis (CSDA) has officially confirmed that Said, Wegman et al 2008, a follow up to the infamous Wegman et al report to Congress, will finally be retracted following complaints of plagiarism and inadequate peer review.
The CSDA paper, Social Networks of Author–Coauthor Relationships, was a follow up to the 2006 report to Congress by Wegman, Said and Rice University professor David Scott. Both the Wegman report and Said et al claimed that the “entrepreneurial” style of co-authorship in paleoclimatology demonstrated lax peer review in the field, while the “mentor” style of an established professor collaborating with former students would be less problematic. All three of Wegman’s 2008 co-authors – Said, Walid Sharabati and John Rigsby – were former or current students.
I first discovered apparent plagiarism in the Wegman report in December 2009. I later documented massive cut-and-paste in the social network analysis background sections of both the report and the CSDA paper in April 2010. At the time, I pointed out that both Wegman and Said had acknowledged federal funding from research offices associated with the Department of Defense and the National Institutes of Health.
And I also noted that the paper had sailed through from submission to acceptance in a mere six days, suggesting that it had not been properly peer reviewed at all. That astonishing fact and the deeply flawed analysis belied the paper’s central premise; indeed, as John Mashey has noted, this is a prime example of a self-refuting paper. …
By Dan Vergano, USA TODAY
15 May 2011
Evidence of plagiarism and complaints about the peer-review process have led a statistics journal to retract a federally funded study that condemned scientific support for global warming.
The study, which appeared in 2008 in the journal Computational Statistics and Data Analysis, was headed by statistician Edward Wegman of George Mason University in Fairfax, Va. Its analysis was an outgrowth of a controversial congressional report that Wegman headed in 2006. The "Wegman Report" suggested climate scientists colluded in their studies and questioned whether global warming was real. The report has since become a touchstone among climate change naysayers.
The journal publisher's legal team "has decided to retract the study," said CSDA journal editor Stanley Azen of the University of Southern California, following complaints of plagiarism. A November review by three plagiarism experts of the 2006 congressional report for USA TODAY also concluded that portions contained text from Wikipedia and textbooks. The journal study, co-authored by Wegman student Yasmin Said, detailed part of the congressional report's analysis. …
The congressional report, requested by global warming skeptic Rep. Joe Barton, R-Texas, and the study concluded that climate scientists favorably publish one another's work because of too-close collaboration. They suggested this led to the consensus that the Earth is warming. …
Computer scientist Ted Kirkpatrick of Canada's Simon Fraser University filed a complaint with the journal after reading the climate science website Deep Climate, which first noted plagiarism in the Wegman Report in 2009. "There is something beyond ironic about a study of the conduct of science having ethics problems," Kirkpatrick says. …
Friday, March 25, 2011
By Matilda Battersby
25 March 2011
Lights will switch off around the globe tomorrow for the fifth annual Earth Hour.
New York’s Empire State Building, Abu Dhabi’s Emirates Palace, Paris’ Eiffel Tower, Hong Kong’s Government House, the Christ the Redeemer statue in Brazil and other global landmarks will go dark at 8.30pm local time on Saturday 26 March.
131 countries are participating and organisers are estimating that hundreds of millions of people will come together to switch off power in support of saving the planet from climate change.
“It is only through the collective action of business, organisations, individuals, communities and governments that we will be able to effect change on the scale required to address the environmental challenges we face,” said Andy Ridley, Co-Founder and Executive Director of Earth Hour.
“We are calling on businesses and organisations to use the annual lights-out event as the time to show their commitment to lasting action for the planet, beyond the hour.”
This year is the first time organisers are calling on participants to go “beyond the hour” and they have set up a dedicated website in 11 languages to allow businesses, governments and individuals to collaborate on worldwide projects.
“We have developed this platform for people, organisations and companies around the world to show what can be done, by showcasing and sharing their actions throughout the year,” Ridley said. …
Wednesday, March 16, 2011
Contact: Nicole Casal Moore, firstname.lastname@example.org, University of Michigan
ANN ARBOR, Mich.---A prototype implantable eye pressure monitor for glaucoma patients is believed to contain the first complete millimeter-scale computing system.
And a compact radio that needs no tuning to find the right frequency could be a key enabler to organizing millimeter-scale systems into wireless sensor networks. These networks could one day track pollution, monitor structural integrity, perform surveillance, or make virtually any object smart and trackable.
Both developments at the University of Michigan are significant milestones in the march toward millimeter-scale computing, believed to be the next electronics frontier.
Researchers present papers on each today at the International Solid-State Circuits Conference (ISSCC) in San Francisco. The work is being led by three faculty members in the U-M Department of Electrical Engineering and Computer Science: professors Dennis Sylvester and David Blaauw, and assistant professor David Wentzloff.
Nearly invisible millimeter-scale systems could enable ubiquitous computing, and the researchers say that's the future of the industry. They point to Bell's Law, a corollary to Moore's Law. (Moore's says that the number of transistors on an integrated circuit doubles every two years, roughly doubling processing power.)
Bell's Law says there's a new class of smaller, cheaper computers about every decade. With each new class, the volume shrinks by two orders of magnitude and the number of systems per person increases. The law has held from 1960s' mainframes through the '80s' personal computers, the '90s' notebooks and the new millennium's smart phones.
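The Bell's Law progression described above can be sketched numerically. The starting cabinet volume is an assumed ballpark for illustration, not a figure from the article:

```python
# Toy projection of Bell's Law as described above: roughly every decade a
# new computer class appears whose volume is ~100x smaller (two orders of
# magnitude). The 1960s starting volume is an assumption, not a measurement.

def projected_volume_cm3(start_volume_cm3: float, decades: int) -> float:
    """Volume of the dominant new class after the given number of decades."""
    return start_volume_cm3 / (100 ** decades)

# Assume a 1960s mainframe cabinet of ~10 cubic meters = 1e7 cm^3.
mainframe_cm3 = 1e7
classes = ["mainframe (1960s)", "minicomputer (1970s)", "PC (1980s)",
           "notebook (1990s)", "smartphone (2000s)", "mm-scale (2010s)"]
for decade, label in enumerate(classes):
    print(f"{label}: ~{projected_volume_cm3(mainframe_cm3, decade):g} cm^3")
```

Five decades of 100x shrinkage lands at about 0.001 cm^3, which is 1 cubic millimeter — the size of the Michigan team's eye-pressure monitor described below.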
"When you get smaller than hand-held devices, you turn to these monitoring devices," Blaauw said. "The next big challenge is to achieve millimeter-scale systems, which have a host of new applications for monitoring our bodies, our environment and our buildings. Because they're so small, you could manufacture hundreds of thousands on one wafer. There could be 10s to 100s of them per person and it's this per capita increase that fuels the semiconductor industry's growth."
Blaauw and Sylvester's new system is targeted toward medical applications. The work they present at ISSCC focuses on a pressure monitor designed to be implanted in the eye to conveniently and continuously track the progress of glaucoma, a potentially blinding disease. (The device is expected to be commercially available several years from now.)
In a package that's just over 1 cubic millimeter, the system fits an ultra low-power microprocessor, a pressure sensor, memory, a thin-film battery, a solar cell and a wireless radio with an antenna that can transmit data to an external reader device that would be held near the eye.
"This is the first true millimeter-scale complete computing system," Sylvester said.
"Our work is unique in the sense that we're thinking about complete systems in which all the components are low-power and fit on the chip. We can collect data, store it and transmit it. The applications for systems of this size are endless."
The processor in the eye pressure monitor is the third generation of the researchers' Phoenix chip, which uses a unique power gating architecture and an extreme sleep mode to achieve ultra-low power consumption. The newest system wakes every 15 minutes to take measurements and consumes an average of 5.3 nanowatts. To keep the battery charged, it requires exposure to 10 hours of indoor light each day or 1.5 hours of sunlight. It can store up to a week's worth of information. …
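The numbers above imply a very small daily energy budget. A back-of-envelope check using only the article's figures (5.3 nW average draw, 10 hours of indoor light per day):

```python
# Back-of-envelope energy budget for the implant described above.

SECONDS_PER_DAY = 86_400
avg_power_w = 5.3e-9  # 5.3 nanowatts average consumption (article figure)

energy_per_day_j = avg_power_w * SECONDS_PER_DAY
print(f"energy used per day: {energy_per_day_j * 1e6:.0f} microjoules")

# To break even, the solar cell must gather that energy during 10 hours
# of indoor light, so it must harvest at a higher instantaneous power:
harvest_seconds = 10 * 3600
required_harvest_w = energy_per_day_j / harvest_seconds
print(f"required harvest power: {required_harvest_w * 1e9:.1f} nW")
```

The device burns only about half a millijoule per day, so a solar cell delivering on the order of 13 nW during lit hours keeps the battery topped up — which is why a sub-millimeter cell suffices.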
Sunday, March 6, 2011
By Laurel Whitney
4 March 2011
“Under a government which imprisons any unjustly, the true place for a just man is also a prison.” Henry David Thoreau on Civil Disobedience
A collective gasp was heard late afternoon yesterday as Tim DeChristopher was found guilty after only 5 hours of jury deliberation. Officially charged with one count of False Statement and one count of violating the Oil and Gas Leasing Reform Act, suddenly everyone was left thinking: did they convict the real criminal?
Much of the last two days of trial had focused on DeChristopher's intent when bidding for BLM land leases. Prosecutor John Hubert argued that DeChristopher intentionally "disrupted, derailed, and sabotaged" the auction. However, defense attorney Ron Yengich painted a different picture:
"He wanted to raise a red flag," he said. "He wanted to make a statement. That’s what he wanted to do. His desire was not to thwart the auction. ... He wanted people to think about the consequences that the auction was bringing to bear on other people. But it was never his intention to harm anyone."
Maybe if Tim had run into the auction using his paddle to feverishly whack participants to prevent them from bidding, then that could be seen as harmful.
But let’s put this into context:
Did Tim cause the deaths of 29 people in a mining accident fueled by poor practices and improper equipment maintenance?
Did Tim cause the deaths of 11 people when an offshore oil rig exploded because of numerous safety violations and regulatory oversights, causing millions of barrels of oil to spew into the Gulf for months and effectively decimate local economies and ecosystems?
Did he contaminate acres of Amazon rainforest and years later refuse to clean it up and pay the fines?
Or did he cause a toxic gas leak at a pesticide plant that not only killed 20,000 people but continues to contaminate the water and cripplingly sicken citizens over 25 years after the original event?
No, in fact Tim only picked up a paddle. And now he’s the one facing prison. The worst any of the above companies suffered was a blow to their images. …
Monday, February 7, 2011
By George Webster for CNN
December 27, 2010 2:14 a.m. EST
London (CNN) -- With its long hull, towering masts and expansive sails, it resembles a schooner from the 19th century. But fitted with a series of high-tech features, this so-called "sail ship" is designed to cut carbon emissions on the high seas today.
Part of a fleet of carbon-neutral, wind-powered sail ships planned by Britain's B9 Energy, it's just one example of how companies are looking to the past for greener alternatives to the gas-guzzling vessels that transport the world's cargo.
When it comes to wind power replacing fuel in shipping vessels, "it's not a question of if, but when," according to David Surplus, the chairman of B9 Energy, Britain's largest windfarm operator.
"By most people's estimates, we have reached peak oil -- sooner or later the fuel will run out and there will simply be no alternative," said Surplus.
Roughly 87% of international trade is carried out by the shipping industry, figures from the International Maritime Organization show.
With the majority of world trade traveling by sea, the shipping industry is responsible for around 4% of global carbon emissions, according to the latest figures available from the United Nations.
B9 claims its vessel will be the first commercially produced merchant ship to harness alternative energy, but it certainly isn't alone in using old-fashioned sail boats to move goods.
"At the moment it's happening on a fairly small, fairly local scale," said Jan Lundberg, founder of Sail Transport Network, a group that promotes sailing as a means of eco-friendly, cost-efficient trade.
But the trend is growing, he said, pointing to examples like El Lago Coffee Company, which uses traditional sail boats to ship Guatemalan coffee beans to the United States, and the Sail Transport Company, a Seattle-based group that uses sailboats to deliver "petroleum-free organic produce."
B9's new eco-friendly ships, planned to be in production by 2012, signify a return to a much more traditional form of merchant shipping. Before diesel-powered steel tankers came to dominate the seas, international trade was conducted on vast, wooden sail ships.
The 100% carbon-neutral freighter will feature automated, self-adjusting sails that respond to minute changes in the wind to maximize propulsion. The boat will also take advantage of "skysail" technology -- a kite-styled towing system currently used on some cargo ships to improve fuel efficiency. …
Friday, February 4, 2011
Is our Milky Way galaxy home to other planets the size of Earth? Are Earth-sized planets common or rare? NASA scientists seeking answers to those questions recently revealed their discovery.
"We went from zero to 68 Earth-sized planet candidates and zero to 54 candidates in the habitable zone - a region where liquid water could exist on a planet’s surface. Some candidates could even have moons with liquid water," said William Borucki of NASA’s Ames Research Center, Moffett Field, Calif., and the Kepler Mission’s science principal investigator. "Five of the planetary candidates are both near Earth-size and orbit in the habitable zone of their parent stars."
Planet candidates require follow-up observations to verify they are actual planets.
"We have found over twelve hundred candidate planets - that’s more than all the people have found so far in history," said Borucki. "Now, these are candidates, but most of them, I’m convinced, will be confirmed as planets in the coming months and years."
The findings increase the number of planet candidates identified by Kepler to-date to 1,235. Of these, 68 are approximately Earth-size; 288 are super-Earth-size; 662 are Neptune-size; 165 are the size of Jupiter and 19 are larger than Jupiter. Of the 54 new planet candidates found in the habitable zone, five are near Earth-sized. The remaining 49 habitable zone candidates range from super-Earth size -- up to twice the size of Earth -- to larger than Jupiter. The findings are based on the results of observations conducted May 12 to Sept. 17, 2009 of more than 156,000 stars in Kepler’s field of view, which covers approximately 1/400 of the sky.
"The fact that we’ve found so many planet candidates in such a tiny fraction of the sky suggests there are countless planets orbiting stars like our sun in our galaxy," said Borucki. "Kepler can find only a small fraction of the planets around the stars it looks at because the orbits aren’t aligned properly. If you account for those two factors, our results indicate there must be millions of planets orbiting the stars that surround our sun."
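Borucki's scaling argument can be sketched in a few lines. The alignment probability below is my assumed illustrative value for Earth-like orbits, not a figure from the article:

```python
# Rough version of the extrapolation described above: scale the candidate
# count by the fraction of sky surveyed, then by the fraction of systems
# whose orbits happen to be aligned so that a transit is visible from
# Earth. The 0.5% alignment probability is an assumed illustrative value,
# not a number from the article.

candidates = 1235
sky_fraction = 1 / 400          # Kepler's field of view (article figure)
alignment_probability = 0.005   # assumed geometric transit probability

whole_sky = candidates / sky_fraction
print(f"scaled to the whole sky: ~{whole_sky:,.0f} candidates")

alignment_corrected = whole_sky / alignment_probability
print(f"correcting for orbital alignment: ~{alignment_corrected:.1e} planets")
```

Even this crude sketch lands at roughly half a million candidates across the sky before the alignment correction, and far more after it — consistent with Borucki's "millions of planets."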
“We’re about half-way through Kepler’s scheduled mission," said Roger Hunter, the Kepler project manager. "Today’s announcement is very exciting and portends many discoveries to come. It’s looking like the galaxy may be littered with many planets.”
Among the stars with planetary candidates, 170 show evidence of multiple planetary candidates, including one, Kepler-11, which scientists have confirmed has no fewer than six planets.
"Another exciting discovery has been the tremendous variations in the structure of the confirmed planets – some have the density of Styrofoam and others are denser than iron. The Earth's density is in between."
"The historic milestones Kepler makes with each new discovery will determine the course of every exoplanet mission to follow," said Douglas Hudgins, Kepler program scientist at NASA Headquarters in Washington. …