Saturday, April 30, 2011

A full, multidirectional undersea grid

"They did not know it was impossible, so they did it" - Mark Twain

Hundreds of Miles of Wind Farms, Networked Under the Sea


Atlantic Wind Connection would link wind farms over hundreds of miles

By David Roberts
Popular Science
April 28, 2011

On the Grid: The Atlantic Wind Connection would link wind farms over hundreds of miles using undersea cables and voltage conversion stations. Photo: Kevin Hand

During the last ice age, glaciers a mile high pushed several dozen cubic miles of rock, sand and debris into the ocean off North America’s mid-Atlantic coast, creating a broad shelf that extends up to 40 miles offshore. This long, flat stretch of seabed and the shallow, windy waters that cover it make the ideal spot for dozens of offshore wind farms—and if all goes well, the network that would link those turbines together and back to the coast will soon be in place.

Offshore wind power has significant advantages over the onshore variety. Uninterrupted by changes in terrain, the wind at sea blows steadier and stronger. Installing turbines far enough from shore that they’re invisible except on the very clearest days lessens the possibility of not-in-my-backyard resistance. The challenge is getting the electricity back to land, to the people who will use it.

The Maryland-based transmission-line company Trans-Elect proposes to do just that with a $5-billion undersea power grid that would stretch some 350 miles from northern New Jersey to southern Virginia. The Atlantic Wind Connection (AWC) could provide an entirely new model for connecting seaborne energy with land users: multiple transmission hubs for future wind farms, making the waters off the mid-Atlantic coast an attractive and economical place for developers to set up turbines. The AWC’s lines could transmit as much as six gigawatts of low-carbon power from turbines back to the coast—the equivalent capacity of 10 average coal-fired power plants.

So far, the project has attracted backing from Google, the clean-energy investment firm Good Energies, and the Japanese trading company Marubeni. Trans-Elect says it plans to begin construction on phase I—a $1.8 billion, 150-mile span from Delaware Bay to Atlantic City—in 2013, and that section could be operational by 2016.

To appreciate the novelty and potential of the Atlantic Wind Connection, it helps to understand the building blocks of an offshore wind system. At its simplest, offshore wind transmission involves connecting a group of wind turbines to an AC transmission cable, which will carry the electricity they generate back to land. (Nearly every wind turbine on the market today generates AC power, the standard for the terrestrial grid.) But AC cables are generally efficient only over short distances, particularly when they’re used underground or underwater.

Wind farms connected to land by AC cables would therefore be forced to cluster around a limited number of spots where the cables could access the terrestrial grid. Those spots, however, might not necessarily be the best places to park a wind farm. Placing turbines in the windiest possible places might involve transmitting power greater distances than AC lines are capable of—and that requires high-voltage DC cables, along with a series of voltage converters to transform the AC power coming out of the windmills to DC and then, once ashore, back again.

What makes the AWC both so ambitious and so innovative is that it is designed to link multiple DC systems together into a full, multidirectional undersea grid. The system will be capable of channeling power as needed among multiple wind farms and consumers on shore, a project of unprecedented scale and complexity.

The nodes of this network will be voltage converters that sit on offshore platforms in huge, weatherproof boxes. Each one will be connected to other wind farms and converter stations. Power will flow among them, as well as to the land-based stations that feed into the terrestrial grid. Each voltage converter will be able to respond instantly to changes in power flow in the terrestrial grid—to ratchet power up and down based on supply and demand back on land. As a result, even before the AWC is connected to a single wind farm, it will help stabilize the mid-Atlantic’s aging, overtaxed grid, preventing brownouts and blackouts.
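The article does not say how the AWC's converters would actually coordinate, but one widely studied scheme for sharing power adjustments across a multi-terminal HVDC grid is DC-voltage droop control. The short Python sketch below is purely illustrative; the station names, setpoints and droop constants are hypothetical assumptions, not AWC design data.

# A minimal sketch of DC-voltage droop control, one widely studied way for several
# converter stations on a shared DC backbone to split power changes automatically.
# Names and numbers are illustrative assumptions, not AWC specifications.

def droop_order_mw(p_ref_mw, v_dc_kv, v_ref_kv, droop_mw_per_kv):
    """Converter power order; negative values mean power delivered ashore."""
    return p_ref_mw - droop_mw_per_kv * (v_dc_kv - v_ref_kv)

shore_stations = [
    {"name": "shore station A", "p_ref": -400.0, "droop": 50.0},
    {"name": "shore station B", "p_ref": -300.0, "droop": 25.0},
]

v_ref_kv = 320.0   # nominal voltage of the DC backbone (assumed)
v_dc_kv = 321.5    # extra wind power has pushed the DC voltage up slightly

for s in shore_stations:
    p = droop_order_mw(s["p_ref"], v_dc_kv, v_ref_kv, s["droop"])
    print(f"{s['name']}: delivering {-p:.0f} MW ashore (scheduled {-s['p_ref']:.0f} MW)")

# Because every station watches the same DC voltage, a surplus or shortfall is
# absorbed in proportion to each station's droop setting, with no central
# dispatcher needed for the first, instantaneous response.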

AC vs. DC: Undersea power cables require metal shielding to protect them from fishing nets and marine life. But metal shielding doesn’t work so well for the alternating-current (AC) lines used almost universally in the terrestrial grid, because the electric field surrounding an AC line changes direction 120 times a second, enough to heat up a metal shield like a toaster coil. Over long distances, this interaction saps enough energy to make buried AC lines impractical—hence the use of direct current (DC), in which the electrons move in one direction, at a constant speed, producing an electric field that interacts only minimally with the shielding.
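The sidebar describes the shield-heating problem; a complementary, textbook way to see the distance limit of buried AC lines is the cable's charging current, which grows with route length and crowds out the current available for carrying useful power. A rough back-of-envelope sketch in Python, using assumed typical cable values rather than figures from the article:

# Back-of-envelope: capacitive charging current of a buried or submarine AC cable.
# Assumed, typical values (not from the article): a 220 kV XLPE cable with roughly
# 0.2 microfarads of capacitance per kilometre, on a 60 Hz grid.
import math

freq_hz = 60.0
v_line_kv = 220.0                          # line-to-line voltage (assumed)
c_per_km = 0.2e-6                          # farads per km (typical order of magnitude)
v_phase = v_line_kv * 1e3 / math.sqrt(3)   # line-to-ground voltage, volts

omega = 2 * math.pi * freq_hz
charging_amps_per_km = omega * c_per_km * v_phase

for route_km in (20, 50, 100, 200):
    i_charge = charging_amps_per_km * route_km
    print(f"{route_km:>3} km: ~{i_charge:,.0f} A of charging current per phase")

# Around 100 km, the charging current alone approaches the thermal rating of many
# cables (roughly 1,000 A), leaving little room for useful power; this is one reason
# long undersea links use DC, which draws charging current only when first energized.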

Relieving strain on the existing grid would just be the beginning. The AWC would draw wind-farm developers to the region by allowing them to hook new turbines into an existing transmission backbone, cutting the cost of laying their own cables and building the infrastructure to connect those cables to the terrestrial grid. Developers will also be free to place turbines where the wind is. “It’s not just about transmission access,” says Markian Melnyk, who developed the initial AWC concept. “It’s also about providing access to the best sites for wind.”

But the AWC’s greatest advantage would be its sheer scale. It would tie together wind farms distributed over hundreds of miles of coastline, and because the wind will usually be blowing somewhere, its breadth would compensate for the unpredictability of wind at any individual site. Together, the steadier flow of power, combined with the money that wind-farm developers will save by feeding electricity into an existing network, could make wind power cheaper than electricity generated by natural gas or, in some cases, even coal.

Obviously, there is a long way to go. Multiple regulatory authorities will have to sign off before the AWC becomes reality. But the large-scale, holistic approach to offshore wind power that it represents is already echoed by at least two undersea power-transmission proposals being considered in Europe. And if the AWC succeeds in delivering predictable, low-cost, low-carbon power to the mid-Atlantic, expect the number of imitative projects to grow.

Friday, April 29, 2011

Wind farm is a welcome addition to Japan's energy portfolio

There apparently were not very many "NIMBYs" attempting to prevent a new 30 megawatt wind farm from going online in Japan. Wind turbines survived the recent earthquake, aftershocks and tsunami. They will clearly be part of the mix of new generation that Japan must build in the aftermath of the natural disasters. (GW)

New wind offers timely boost to Japan's crisis-hit power grid

ReCharge
April 21, 2011


Eurus Energy, which is majority owned by crisis-stricken Tokyo Electric Power Company (Tepco), has completed a 30MW wind farm in southern Japan as the country races to avoid looming summer power shortages.

The Kunimi-yama wind farm is a welcome addition to both Tepco’s and Japan’s generation portfolio, which was thrown into chaos by the 11 March earthquake and tsunami that caused a radioactive crisis at Tepco’s Fukushima Daiichi nuclear plant.

The project is located on the southern tip of Kyushu, Japan’s third-largest island, and was built 880 metres above sea level along the eastern ridge of Mount Kunimi-yama.

While the site offers strong wind, it also posed enormous challenges for transportation, construction and grid connection, says Eurus.

The Kunimi-yama project uses 15 turbines made by Japan Steel Works, which has delivered about 100 machines to the modest-sized Japanese wind market since it entered the sector in 2006. Eurus has used Vestas turbines in the past.

Eurus is Japan’s largest wind developer, with 2.02GW of capacity spanning Europe (760MW), the US (587MW), Japan (527MW), and Korea (142MW).

The Tokyo-based company is 60% owned by Tepco and 40% by a unit of Toyota.

Eurus is also pushing its way into the solar sector. The company commissioned its first 994kW PV array in South Korea in 2008 and last month acquired the 2.97MW Jindosun array from JA Solar, also in Korea.

It is developing a 45MW PV park in California alongside US partner NRG Energy, which is expected to be finished by June. The $220m project uses Sharp modules, with the electricity to be sold for 20 years to Pacific Gas and Electric.

Having been forced to implement rolling blackouts for two weeks in March, Tepco is scrambling to add generation capacity before the peak-usage summer months.

Over the medium term, much of the slack is likely to be picked up by new gas-fired power plants, though renewables are also expected to benefit.

With 2.3GW of installed wind capacity, Japan’s wind sector is dwarfed by Asian giants China (42.3GW) and India (13.1GW). Japan added just 221MW of wind last year, compared to about 1GW of PV.

Thursday, April 28, 2011

Chernobyl

Chernobyl disaster 25th anniversary

Boston Globe
April 2011

On April 26, 1986, reactor number four at the Chernobyl nuclear power facility in what is now Ukraine exploded. The largest civil nuclear disaster in history led to mass evacuations, and long-term health, agricultural, and economic distress. The nearby city of Pripyat has been abandoned, and a 19-mile radius "exclusion zone" established where radiation contamination makes continued habitation dangerous. Collected here are archival pictures of the catastrophe, as well as more recent images of the area. In addition, two photographers who've made extensive studies of the aftermath have been gracious enough to share their work with us here. Diana Markosian documented the lives of pensioners Lida and Mikhail Masanovitz, who continue to live in the abandoned ghost town of Redkovka, Ukraine. Her work is found here in photographs 13 through 16. Michael Forster Rothbart has produced one of the most extensive records available of life near Chernobyl. His work is found here in photographs 23 through 29. Links to the websites of both photographers can be found below. -- Lane Turner (34 photos total)

Wednesday, April 27, 2011

"What is our wealth?"

Bucky Fuller often reminded us that real wealth is "know-how". As we discover more of Nature's inventory of general principles and incorporate them into technologies that enable us to do more with less, the wealthier we all become. (GW)

Sustainable development 'the only answer to spreading wealth'

By Anna Zacharias
The National
April 27, 2011

RAS AL KHAIMAH // Sustainable development is the only answer to spreading the wealth among nations, the Ruler of Ras al Khaimah told a science forum yesterday.

Sheikh Saud bin Saqr was speaking yesterday at the opening of the 17th Annual Micromachine Summit, a gathering of leading research scientists from 22 countries.

"I can't overemphasise the role of scientists and research," he said. "What is our wealth? It's really nothing. With the six or seven billion people on Earth, we have great challenges ahead of us. Only through knowledge and research can we afford to make a better future for our children."

The Swiss organisers of the summit chose to hold it in Ras al Khaimah to showcase the work of two research organisations from Switzerland that are exploring renewable energy programmes in the emirate.

"For us to spread the wealth - not just for a few, but for all - it means we have to find sustainable development," said Sheikh Saud. "Sustainable in terms of raw materials and availability, but more so in terms of climate."

The UAE delegation, the first to attend the summit from the Middle East, presented their research from the UAE Centre Suisse d'Electronique et de Microtechnique (CSEM-UAE) on solar islands that rotate to catch the sun's rays and its use of solar polygeneration to desalinate water, cool buildings and provide electricity.

"I see the marvellous solar islands they have pioneered here, and we will look at if they can somehow tap the power of the sun, if we can make more efficient use of the cooling that we have, if we can make a more efficient use of desalination," said Sheikh Saud. "There are small steps and big steps, but nevertheless those steps will allow humanity to have a better chance."

Researchers in Switzerland have formed strong bonds with the RAK community in recent years - part of the reason they chose to host this week's summit in the emirate.

"Facilities, topics, the environment, the real life …" said Dr Nico De Rooij, an inventor and world leader in microtechnology who first proposed hosting the summit in RAK. "Research here is not just in a lab, it's the real exposure. It's very, very instructive for researchers to have this."

Government support in RAK was its greatest asset, said Dr Philippe Fischer, the director of the Swiss Foundation for Research in Microtechnology.

"You can do things very fast," said Dr Fischer. "There are no decision-making procedures like we know in Switzerland if you want to build new facilities. This ability to go ahead and do things is very exciting for scientists."

That was the appeal for the RAK campus of Ecole Polytechnique Federale de Lausanne (EPFL), which has had 70 researchers visit RAK since November. The first batch of 25 to 30 RAK-based students will begin their master's in Lausanne in September, and arrive here next year.

"Switzerland has needed to diversify its economy from the onset, and there is a strong connection between RAK and Switzerland precisely because of that," said Dr Franco Vigliotti, dean of the RAK campus.

RAK students aim to research wind engineering, energy systems, energy management, sustainable urban design, water resources and sustainable mobile transportation.

"There are several things that connect us. First of all, a shared vision of what the future challenges are," he said.

"RAK and the UAE - this is a place where you can think of the future of renewable energies.

"We also think it's a place where technological advances for water desalination are going to be important."

This local research would have direct application to industry, said Dr Khater Massaad, an EPFL alumnus. The chief executive of RAK Investment Authority already has his eye on using CSEM-UAE's research to provide energy for projects like the Al Hamra Village, where he is the managing director.

"The idea is to make a lot of start-ups with all this [research], whether it is CSEM or EPFL, to support the industry," said Dr Massaad. "Every industry needs the research and development, and when you have the institutes likes this they can help match industry needs."

azacharias@thenational.ae

Tuesday, April 26, 2011

Big Brother is not only watching... he's mapping

Yesterday's post focused on advanced 3D modeling software used by architects to design individual buildings. Why stop there? How about super sharp 3D cityscapes? They're here today courtesy of advanced missile technology.

Tomorrow?? (GW)

Ultrasharp 3-D Maps

A missile-targeting technology is adapted to process aerial photos into 3-D city maps sharper than Google Earth's.

By Tom Simonite
Technology Review
April 26, 2011

Technology originally developed to help missiles home in on targets has been adapted to create 3-D color models of cityscapes that capture the shapes of buildings to a resolution of 15 centimeters or less. Image-processing software distills the models from aerial photos captured by custom packages of multiple cameras.

The developer is C3 Technologies, a spinoff from Swedish aerospace company Saab. C3 is building a store of eye-popping 3-D models of major cities to license to others for mapping and other applications. The first customer to go public with an application is Nokia, which used the models for 20 U.S. and European cities for an upgraded version of its Ovi online and mobile mapping service released last week. "It's the start of the flying season in North America, and we're going to be very active this year," says Paul Smith, C3's chief strategy officer.

Although Google Earth shows photorealistic buildings in 3-D for many cities, many of those models are assembled by hand, often by volunteers, using a combination of photos and other data in Google's SketchUp 3-D drawing program.

C3's models are generated with little human intervention. First, a plane equipped with a custom-designed package of professional-grade digital single-lens reflex cameras takes aerial photos. Four cameras look out along the main compass points, at oblique angles to the ground, to image buildings from the side as well as above. Additional cameras (the exact number is secret) capture overlapping images from their own carefully determined angles, producing a final set that contains all the information needed for a full 3-D rendering of a city's buildings. Machine-vision software developed by C3 compares pairs of overlapping images to gauge depth, just as our brains use stereo vision, to produce a richly detailed 3-D model.
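C3 has not published its algorithms, but the geometry the paragraph describes (recovering depth by comparing where the same point appears in two overlapping photos) reduces to the standard stereo relation: depth equals focal length times baseline divided by disparity. A minimal Python sketch with illustrative numbers, not C3's actual parameters:

# Minimal sketch of depth-from-disparity, the geometric idea behind recovering
# 3-D structure from overlapping aerial photos. Real pipelines add camera
# calibration, image matching and bundle adjustment; all values here are assumed.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Distance to a point (metres) given its pixel shift between two views."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

focal_px = 8000.0      # focal length expressed in pixels (assumed)
baseline_m = 120.0     # distance the aircraft moved between exposures (assumed)

# A rooftop point shifts more between frames than a point at street level:
for label, disparity_px in (("rooftop", 1600.0), ("street level", 1200.0)):
    d = depth_from_disparity(focal_px, baseline_m, disparity_px)
    print(f"{label}: ~{d:.0f} m from the camera")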

"Unlike Google or Bing, all of our maps are 360° explorable," says Smith, "and everything, every building, every tree, every landmark, from the city center to the suburbs, is captured in 3-D—not just a few select buildings."

C3's approach has benefits relative to more established methods of modeling cityscapes in 3-D, says Avideh Zakhor, a UC Berkeley professor whose research group developed technology licensed by Google for its Google Earth and Street View projects. Conventionally, a city's 3-D geometry is captured first with an aerial laser scanner—a technique called LIDAR—and then software adds detail.

"The advantage of C3's image-only scheme is that aerial LIDAR is significantly more expensive than photography, because you need powerful laser scanners," says Zakhor. "In theory, you can cover more area for the same cost." However, the LIDAR approach still dominates because it is more accurate, she says. "Using photos alone, you always need to manually correct errors that it makes," says Zakhor. "The 64-million-dollar question is how much manual correction C3 needs to do."

Smith says that C3's technique is about "98 percent" automated, in terms of the time it takes to produce a model from a set of photos. "Our computer vision software is good enough that there is only some minor cleanup," he says. "When your goal is to map the entire world, automation is essential to getting this done quickly and with less cost." He claims that C3 can generate richer models than its competitors, faster.

Images of cities captured by C3 do appear richer than those in Google Earth, and Smith says the models will make mapping apps more functional as well as better-looking. "Behind every pixel is a depth map, so this is not just a dumb image of the city," says Smith. On a C3 map, it is possible to mark an object's exact location in space, whether it's a restaurant entrance or 45th-story window.

C3 has also developed a version of its camera package to gather ground-level 3-D imagery and data from a car, boat, or Segway. This could enable the models to compete with Google's Street View, which captures only images. C3 is working on taking the technology indoors to map buildings' interiors and connect them with its outdoor models.

Smith says that augmented-reality apps allowing a phone or tablet to blend the virtual and real worlds are another potential use. "We can help pin down real-world imagery very accurately to solve the positioning problem," he says. However, the accuracy of cell phones' positioning systems will first have to catch up with that of C3's maps. Cell phones using GPS can typically locate themselves to within tens of meters, not tens of centimeters.

Monday, April 25, 2011

Lowering construction costs with 3D modeling

I remember when 3D meant putting on cardboard glasses with one red and one blue plastic lens. We've come a long way since then. 3D computer models allow architects and engineers to examine their designs from the bottom up, top down and every conceivable perspective. There are even 3D printers that can create an actual model from the computer renderings. This is not just meant to dazzle. There are some very tangible benefits to all this. (GW)

Beyond blueprints: 3-D saves time, money


Sophisticated software is letting builders bring new precision to major construction projects

By Casey Ross
Boston Globe
April 25, 2011

A drainage pipe for a new research center at the University of Massachusetts Medical School in Worcester was in the wrong place. Left there, it could require that a large hole be drilled through the $400 million building’s foundation.

On any other project, it would have meant a complicated and costly fix. But in this case, the problem was discovered months before construction even began, while university officials were scanning a laser-precise, 3-D virtual model of the building created by the contractor, Suffolk Construction Co. of Boston.

“We just asked them to move the drain line, and it ended up saving us a lot of time and money,’’ said John Baker, a UMass employee overseeing construction of the building, the Albert Sherman Research Center.

The center is among a new generation of buildings being constructed with computer design technology that is not only changing the construction process, but how the new structures are ultimately used.

In addition to flagging design problems, the modeling software allows builders to accurately predict factors such as how much sun exposure a building will get on specific days of the year. That information can then be used by future occupants to set interior lighting controls and cut energy costs.

“About 85 percent of a building’s costs happen after construction,’’ said Jay Bhatt, an executive with Autodesk Inc., which manufactures the software used to design the Sherman Center. “What we’re doing is connecting the 3-D model to [mechanical] systems in the building, so owners can monitor performance and costs.’’
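The sun-exposure predictions mentioned above rest on ordinary solar geometry. As a simplified illustration (a textbook approximation, not Autodesk's actual method), the sun's elevation for a given latitude, day of year and hour can be estimated like this:

# Simplified solar-elevation estimate from the standard declination and hour-angle
# formulas. Daylighting and energy tools refine this considerably; it is
# illustrative only, not any vendor's algorithm.
import math

def solar_elevation_deg(latitude_deg, day_of_year, solar_hour):
    """Approximate sun elevation above the horizon, in degrees."""
    decl = 23.45 * math.sin(math.radians(360.0 / 365.0 * (284 + day_of_year)))
    hour_angle = 15.0 * (solar_hour - 12.0)        # degrees from solar noon
    lat, dec, ha = map(math.radians, (latitude_deg, decl, hour_angle))
    elev = math.asin(math.sin(lat) * math.sin(dec) +
                     math.cos(lat) * math.cos(dec) * math.cos(ha))
    return math.degrees(elev)

# Worcester, Mass. (about 42.3 N) at solar noon near the two solstices:
for label, day in (("June 21", 172), ("December 21", 355)):
    elev = solar_elevation_deg(42.3, day, 12)
    print(f"{label}: sun ~{elev:.0f} degrees above the horizon at noon")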

Modeling software has been available for years, but Massachusetts contractors have only recently begun to use it on major projects.

Among the primary users is Suffolk Construction, which also employed the technology to help build the David H. Koch Institute for Integrative Cancer Research at the Massachusetts Institute of Technology in Cambridge, as well as in construction of an expanded hospital and emergency wing at Baystate Medical Center in Springfield.

“You can actually design these buildings virtually and get them perfect,’’ said Peter Campot, an executive at Suffolk Construction. “I was at a future technology conference the other day where they showed a 15-story building in Japan that was built in six days.’’

The Sherman Center, scheduled for completion by 2013, will take a few months longer than that. But the 515,000-square-foot facility, which will contain space for classrooms and medical laboratories, is precisely on schedule so far, according to plans created using the modeling software.

An image from the plans predicted the steel skeleton of the building would be completed by March 28; a picture taken by Suffolk on that day is a virtual match, reflecting a rare level of precision on such a large project.

Campot, a veteran builder who acknowledges he can get evangelical about the software, said projects built without 3-D blueprints typically run into design issues that lead to costly delays and work changes.

“But if you find the change before the construction documents come out,’’ he said, “it can all be done electronically. It costs you nothing.’’

At Baystate Medical Center, where construction is underway on a 640,000-square-foot facility, project supervisors said a 3-D model of the building allowed them to get better input from the doctors, nurses, and administrators who will use the hospital.

Stanley Hunter, an executive overseeing the project, said contractors used the software to show virtual versions of completed patient rooms and treatment facilities.

“For the clients, it really helps them visualize what they’re going to get,’’ he said. “The coordination of the work becomes a lot easier. I think we’re really just learning the power of this.’’

MIT’s Koch Center was completed last month. The 360,000-square-foot building includes public galleries, student training facilities, and laboratories where biologists and genome scientists are working to develop treatments for cancer.

Suffolk used the modeling software to fix an array of issues when plans for mechanical systems clashed with architectural drawings. Overall, Campot said, the software helped detect about 1,000 potential problems that could have required the builders to change ceiling heights or make other alterations.

After looking at the list of problems, the builders and designers agreed that most of the issues would have been caught during the usual low-tech process of reconciling paper blueprints. But there were at least a dozen that probably would have resulted in change-work orders, delays, and added costs.

“We ended up with zero constructability change orders, which is almost unheard of,’’ Campot said.

Casey Ross can be reached at cross@globe.com.

Sunday, April 24, 2011

"Extraordinarily settled hors d'ouevre of summer"

The boiling frog story is a widespread anecdote describing a frog slowly being boiled alive. The premise is that if a frog is placed in boiling water, it will jump out, but if it is placed in cold water that is slowly heated, it will not perceive the danger and will be cooked to death. The story is often used as a metaphor for the inability of people to react to significant changes that occur gradually.

Hotter than LA, drier than Madrid - Britain hogs the super soaraway April sun!

It's great, but it's not spring as we know it. David Randall on the upsides and downsides of an extraordinary season

By David Randall
The Independent
April 24, 2011

We are now in the fourth week of what may well be the warmest April on record. The days have been almost continuously sunny since 6 April; temperatures have, in southern parts, reached 26.5 C; and our gardens and countryside look more like mid-May than late April. Around London, south-facing wisteria has already peaked and the cherry blossom fallen, but bluebells are blooming weeks early, and the first white flower buds of may trees have begun to show.

Plants are not the only things perking up. In the parks, sunning lunch-timers loll on grass that should, according to the date, be too damp. On the high street, the clothes shop rails rattle to the sound of young women searching for a cooling top just about decent enough for the office. Men of a certain age have dug out the Baden-Powell shorts and been silly enough to wear them. Thirsty plants in dry gardens are relieved by water from as yet unbanned hosepipes. And a summer sun shines down on us all, the righteous and unrighteous, the AV-ers and No-to-AV-ers, and probably even the superinjunction seekers, too. Blessed be the British in the spring of 2011, for we are warmer than Los Angeles, drier than Madrid, and cosier than Corfu.

This will, for many of us, be nothing less than the defining spring of our lives. And its memory will linger long after the mobile phone pictures of that day on the beach or picnic by the river have been erased or digitally superseded; for spring always seems the most British of seasons.

I have a theory about why this should be. It isn't because of the vivid, varied greenness that grows daily in treetops and hedgerows or the gradual advent of the long, light north European evenings. Nor is it, like some natural annual replay of our wartime legend, the seizing of sap-rising victory after the months of backs-against-the-wall winter. It is the mildness, the very British lack of extreme. In our corner of the Atlantic system, handily placed to avoid hurricanes, typhoons or tropical storms, spring is the mildest of times in the mildest of climes.

Search the history of our weather, and March, April and May are the benign, friendly months, conspicuous only for their reluctance to provide record floods, heat, cold, wind, snow – or, indeed, record anything. On the Met Office website, there are 39 extreme weather records for Britain. None is for March or April, and only two for May – one for the highest two-hour rainfall (West Yorkshire, 1989), the other for Scotland's sunniest month (Tiree, 1975). Nothing more damaging, you might say, than an extended refreshing shower, and a month of balmy days in the Inner Hebrides.

In the history of our recorded weather, springs do not, like other seasons, jostle and elbow each other out of the way to provide our wettest, hottest, stormiest incidents. They stand out only in relation to each other. The memorable ones are remarkable not so much for what occurs, but for doing so at this time of year at all – springs into which bits of other, more aggressive seasons have briefly intruded: the blizzards of April 1917, Devon snowdrifts of May 1935, and gales of April 1973. These, and others like them (see panel), are like a mild-mannered friend who shocks with an uncharacteristic outburst.

In the annals of kindly springs, 2011 is very special. History will remember it for statistics – by 13 April (the latest available official data), our average maximum temperature was 3.7 C above the norm, and the South-east's rainfall a mere 4 per cent of the long-term average. But we who have lived it, especially those of us who spend much of our time out of doors, will know it for things better than data: evenings sitting in the garden, weekend after weekend of reliable warmth, paddling pools filled with happy splashers – a summer come early, a windfall, like the winnings from a lucky flutter, something we had no right to expect.

Everyone will have their own slideshow they can play in their heads. Mine has infant lambs tottering around after their mothers across warm Sussex fields; an early April sunset over Pagham Harbour, its last rays still warm on the face; the plump green undergrowth more like mid-May than April; a round of golf remarkable for seeing, in as many holes, six species of butterfly (large white, peacock, orange-tip, brimstone, holly blue, and clouded yellow); and, on the shimmering waters of a south coast nature reserve, behaviour that was anything but reserved.

The two swans looked as if they were working up to something, and I barely had time to raise my camera before he swam behind her, applied his full weight, held her head under water with his beak, fully mounted her, and mated in less than a minute. She resurfaced, their chests met, both necks stretched up, and then, bending in symmetrical shapes, they formed a momentary heart shape, and swam apart. It was the essence of sex between lifelong partners – an exercise in tenderness and practised selfishness.

The recurring theme of those who care for our wildlife has, in the last decade, not been the sensuous joys of spring; rather the propensity for doom contained within its globally warmed and much earlier arrival. It comes more than two weeks sooner than 30-odd years ago, say repeated studies of flowerings, spawnings, and egg-laying; and possibly three weeks earlier than the 1950s. But never mind the longer growing season, say the pressure groups, that is a fools' premature paradise. Think instead, they insist, of earlier springs throwing our ecologies out of kilter: summer resident birds arriving as per their routine schedule, only to find the food their young depend on has already bred/pupated/hatched/or gone to ground; and plants flowering before their pollinators are on the wing. The whole finely balanced evolved timetable thrown into disarray.

It is a disturbing prospect, and journalists, including myself, have duly peddled it. But the evidence for it happening, several decades after the warming process began, has proved scant. Last year, a study led by Dr Stephen Thackeray and Professor Sarah Wanless of the Centre for Ecology & Hydrology was published. It looked at 726 species of plants and animals, and found that 80 per cent of them experienced earlier events, the pace of change was accelerating, and predators were often slower to respond. But it found no direct evidence of species suffering as a result, and added: "The seasonal timing of reproduction is often matched to the time of year when food supply increases, so that offspring receive enough food to survive." Thus, although the threat remains, wildlife seems smarter and more adaptable than fretting campaigners would have us believe.

We, it seems, are less so. Repeated warnings about the environmental damage caused by cars and excessive energy consumption have fallen on ears made deaf by convenience and habit, and the result is all too palpable, especially if you are asthmatic. Polluting particles love long periods of warm, sunny weather as much as we do, and the result, on Tuesday afternoon, was all too apparent. A friend and I stood on the North Downs looking towards London and saw, hanging over the capital, a vast smoking room fug. Sure enough, two days later, the Government issued a smog alert, and urged the vulnerable, who are many, to avoid afternoons out of doors and car rides.

The other matter, just to complete the tour of possible downsides to this most glorious of springs, is water, or the lack of it. March was the driest in England and Wales for 50 years – and April has been drier still. Up to the 13th of the month, England has had only 16 per cent of its long-term average rainfall, and it is probable that river levels will be very low by the end of this month. Already, the likes of the Daily Express and broadcasters wearing concerned faces have started to warn of possible shortages, and speculate on the chances of desiccated woodlands and heaths spontaneously combusting. All, of course, in the public interest.

To that, there is only one response, and that is to remind ourselves what happened after the warmest April ever, that of 2007. The average maximum for the UK was 15.2C (16.3C for England), and temperature records were widely set as the pitiless April sun beat down. Newspapers reached for the D-word: drought, they warned, was the inevitable price we would have to pay for our month of pleasure. And then came the rains of May, followed by those of June, which were, in due and damp course, succeeded by the wretched floods of July. They were the wettest such months in the record, and some areas had three times as much rain as normal. It is a comfort to know that, whatever else is up in the sky above Britain, caprice is always there with it.

That is what makes this extraordinarily settled hors d'oeuvre of summer so delicious. I think this weekend we should simply savour it. The main course may be rather different.

But, occasionally, it can be outrageous, too...

1893 A prolonged drought across the South-east of England (some parts had no rain for 60 days) was most keenly felt in east London. At Mile End, not a single drop fell between 4 March and 15 May.

1917 Blizzards over much of the British Isles in early April. In parts of Ireland, snow fell for two days non-stop, and there were reports of drifts three metres deep.

1935 One of the worst May snowfalls ever. Mid-month, it was lying three inches deep at Cambridge, and there were piles of the stuff as far south as Tiverton in Devon.

1949 In the middle of April, on the 16th, Britain recorded its hottest ever Easter Saturday temperature when the mercury reached 29.4C in Camden Square. It is still an all-time record for the month.

1955 On 17 May snow fell for nearly three hours in the London area, with the added frisson of severe gales. Elsewhere, roads were blocked in Derbyshire and Wales.

1973 Winds reaching Force 10 and in some parts 11 swept much of Britain, causing extensive damage, especially to trees, roofs and caravans. And, where winds did not blow, there was snow.

1989 The sunniest May on record. Some places in the South had at least 340 hours of sun in the month – an average of more than 10 hours a day. The average maximum temperature in London was the highest for any May in 150 years, and probably helped tempers fray at the FA Cup final.

2000 A soaking April (the wettest on record), followed by a truly soggy May. There were widespread floods, and some memorable thunderstorms.

Saturday, April 23, 2011

How to jollify a wall, or bridge, or subway train

It's a rainy Saturday morning here on Cape Cod. Think I'll grab a cup of coffee, sit back and check out this fascinating gallery of street art. (GW)

The world's biggest gallery: How street art became big business

Street art was once seen just by passing pedestrians. Now, the internet has revolutionised the form, and it's not only Banksy who is transforming our cities

By Guy Adams
The Independent
April 23, 2011

Street art used to be very much a minority taste. Enthusiasts would venture out, at the dead of night, armed with spray cans. They'd use them to jollify a wall, or bridge, or subway train. Sometimes, their work might be very good indeed, but its audience would be limited to whoever happened to wander past. An artist's only brush with notoriety came when newspapers reported their latest court case.

Then the internet came along and changed all that. In a little over a decade, a genre that was once dismissed as vandalism has used cyberspace to gain credibility as one of the most vibrant and creative scenes in urban culture. Inspired by Banksy, who has completed the journey from viral sensation to millionaire creator of $60,000 canvasses, Oscar-nominated movies, and bestselling books, a generation of artists is now starting to fashion lucrative careers out of their former hobby.

"Today, somebody does a tag in Russia, China, Japan, or Africa, a friend photographs it and within a few hours it'll be viewed on websites all over the world," says Jeffrey Deitch, director of the Museum of Contemporary Art in Los Angeles, which recently opened a major show on the subject. "I think you can make a good case that street art is now the most influential art movement of the past 30 years. The penetration of urban culture is huge, and it's influencing everything from skateboard design to high fashion. Some of these guys have even been hired to design Louis Vuitton handbags."

Central to this rise has been Sebastian Buck. An English enthusiast, based in LA, he founded the website Unurth in 2008. It has since become one of the most influential street art forums, helping discover and champion such rising stars as Roa and Escif, as well as established "names" like Blu, JR and OS Gemeos. On the coming pages, Buck profiles some of the most exciting talents working on what is now the art world's largest canvas.

Friday, April 22, 2011

Chernobyl’s sarcophagus

Stewart Brand, editor of the original Whole Earth Catalog and a current defender of nuclear energy, spoke at an energy symposium at MIT earlier this week. He said that Japan's nuclear meltdown has not dulled his enthusiasm for nukes one bit. In his recent book, Whole Earth Discipline, he points to Chernobyl as an example of how a region bounces back from a nuclear accident. (GW)

After 25 Years, Sealing Off Chernobyl

Dealing with the world’s worst nuclear disaster still requires incredible amounts of work.

By Katherine Bourzac
Technology Review
April 2011

This coming Tuesday marks the 25th anniversary of the fire and core meltdown at Chernobyl’s nuclear reactor four. Even now, the site requires tremendous care so that the remaining nuclear material does not escape.

Within a few months of the accident, officials put a concrete enclosure called the “sarcophagus” over the reactor, but the structure has been showing signs of wear ever since. It’s cracked; it lets the elements in and some radioactivity out. The yellow metal support structure visible in this photo was added to stabilize the sarcophagus, a process that was completed three years ago to prevent it from collapsing. Now, engineers financed by the European Bank for Reconstruction and Development, with contributions from the European Commission and countries including the United States, are working on two major remaining tasks (about $1.8 billion has been raised so far). Led by the consortium Novarka, they’re building new storage facilities for the spent nuclear fuel from other reactors on the site. And Novarka is undertaking one of the most complex construction projects ever, to create a seal that will go over the cracked sarcophagus. On an adjacent site, workers are building the foundations for a 100-meter-high steel structure that will be slid into place to seal it off until the reactor itself can be dismantled, in about 100 years.

The reinforced-concrete foundation for this structure, called the New Safe Confinement, is visible in the foreground of this photo. The gravel trench on the right is part of the track that will be used to slide the tremendous structure over the sarcophagus.

Thursday, April 21, 2011

Too smart for its own good?

There's always a catch, isn't there? Bucky Fuller was once asked: "As humans become better and better problem solvers, will we finally reach Utopia?" Bucky responded that we will succeed in creating more complex problems that need to be addressed.

Case in point: Well-intentioned efforts to modernize the grid (something that desperately needs to happen) will inevitably make it more vulnerable to cyber-attacks. (GW)

A 'smart' grid will expose utilities to smart computer hackers


By Peter Behr
Climate Wire
April 19, 2011

A year ago, an unidentified computer intruder tried to penetrate the Lower Colorado River Authority's power generation network with 4,800 high-speed log-in attempts that originated at an Internet address in China, according to a grid official's confidential memo that was leaked to the media.
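Intrusion attempts of this kind typically show up in authentication logs as a burst of failed log-ins from a single address. A minimal detection sketch in Python; the log format, window and threshold are hypothetical illustrations, not the river authority's actual monitoring:

# Minimal sketch: flag source addresses with an unusually high rate of failed
# log-ins. The event format, threshold and window are hypothetical illustrations.
from collections import defaultdict
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=10)
THRESHOLD = 100          # failed attempts from one source within the window

def flag_bruteforce(events):
    """events: iterable of (timestamp, source_ip, success) tuples, time-ordered."""
    recent = defaultdict(list)           # source_ip -> timestamps of recent failures
    flagged = set()
    for ts, ip, success in events:
        if success:
            continue
        recent[ip] = [t for t in recent[ip] if ts - t <= WINDOW] + [ts]
        if len(recent[ip]) >= THRESHOLD:
            flagged.add(ip)
    return flagged

# Hypothetical usage: thousands of rapid failures from one address trip the alarm.
start = datetime(2010, 4, 1, 2, 0, 0)
events = [(start + timedelta(seconds=i), "203.0.113.5", False) for i in range(4800)]
print(flag_bruteforce(events))   # -> {'203.0.113.5'}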

And that was probably just an amateur's work, says David Bonvillain, vice president of Accuvant LABS, the security and research division of Accuvant, a cybersecurity consulting firm based in Denver.

Far greater challenges lie ahead as smart grid technologies proliferate in the nation's transmission network and utility control centers and eventually reach business and residential electricity customers, he says.

"There are known vulnerabilities, and there are vulnerabilities that haven't been discovered yet," he said. The risk that a hacker could disrupt a closely managed grid control system is considerably lower than for an intrusion into a financial or industrial network, but the consequences could be far graver, Bonvillain and other experts agree.

And the scope of the threat is expanding faster than the utility sector's response, says Michael Assante, the former chief security officer of the North American Electric Reliability Corp., the federally designated grid monitor. Assante left NERC last year to form a new nonprofit, the National Board of Information Security Examiners, which provides technical certification for utility cyber defenders. The certification is intended to identify elite cybersecurity professionals.

"The smart grid increases the complexity of the system," Assante said in an interview. "There is more technology, and more networks highly interconnected to share information. You've increased the overall attack surface. You're deploying technology that is no longer in a building you control, and you are deploying it over the air, right up to the home.

"And you are deploying it at such a scale, it's a real challenge to manage and maintain security," Assante said. "We should deploy the technology" because of the range of benefits it promises, he said. "But we must learn where the weaknesses are."

The smart grid's rollout is raising awareness of the threat even as it increases vulnerability, some experts say. "The smart grid is one of the best things to ever happen to security in the utility space. People are really starting to see that threats are present there," said Jon Miller, director of Accuvant LABS.

"The smart grid will make technology management a core part of what any utility is," he said. But this transition is happening faster at some energy companies than at others, he said.

Security 'floor' needed for utility control rooms

The threshold challenge has been the slow development of security standards that establish a floor for safeguarding generator and transmission control rooms, according to the Government Accountability Office. A GAO report on March 11 called on the National Institute of Standards and Technology to complete its updating of cybersecurity guidelines, and concluded that the Federal Energy Regulatory Commission needed a stronger process for monitoring industry compliance with cyber standards.

The GAO report also cited a dramatic increase in cyber attacks on federal agencies, as reported to the U.S. Computer Emergency Readiness Team (US-Cert). Cyber incidents totaled 41,776 in fiscal 2010, a 650 percent increase in five years.

The standards-setting process has been burdened by jurisdictional issues and the need to seek a time-consuming utility industry consensus on a response to a rapidly evolving threat, experts say.

Responding to GAO's criticism, FERC chairman Jon Wellinghoff has pointedly noted that when Congress set up the process for creating cybersecurity standards for the electric power industry in the 2005 Energy Policy Act, it put the agency into a reactive stance: FERC can approve or reject cyber standards developed through NERC's industry consensus process, but it cannot do more.

Because FERC's regulatory authority is limited to the interstate high-voltage transmission network, it has no direct influence over cybersecurity on utility distribution grids that deliver power to customers in cities. State utility commissions oversee that part of the grid. FERC and the National Association of Regulatory Utility Commissioners, the American Public Power Association and the National Rural Electric Cooperative Association are trying to harmonize a common approach, Wellinghoff said in a response to GAO last month.

After years of disjointed efforts since the 2005 act passed, the cyber issue has begun to move on some fronts, officials said, although some difficult regulatory policy negotiations still lie ahead.

Jurisdictional disputes remain

NERC's board of directors approved in December a new detailed checklist that power and transmission companies are to follow in identifying critical parts of their systems that will be subject to cyber protection regulation. The checklist responds to criticism from some members of Congress and FERC's staff that some utilities had kept critical facilities off the "critical assets" list to limit the future reach of cyber legislation. That new policy awaits FERC action.

NERC's trustees also approved in December a new regulatory approval process that is designed to prevent new cyber and reliability standards sought by FERC from being shelved because they failed to win approval by a supermajority of NERC's power company members. The federal regulators had directed NERC in March 2010 to come up with a solution to the impasse issue, and a year later, a resolution is about to occur, with a final approval from FERC expected soon, officials said.

Another jurisdictional issue involving nuclear plants has been overcome. The Nuclear Regulatory Commission has agreed to take oversight responsibility for cybersecurity of all systems at nuclear power plants, not just the reactors, officials said. A memorandum of understanding between the NRC and FERC resolves this question.

But a new Senate initiative is likely to reignite the federal-state jurisdictional quarrel over cyber standards.

Wellinghoff, in his March 10 letter to the GAO, said that the Federal Power Act, which applies to high-voltage interstate power transmission, "excludes virtually all of the grid facilities in certain large cities such as New York, thus precluding Commission action to mitigate cyber security or other national security threats to reliability that involve such facilities and major population areas. It is also important to note that much of the smart grid equipment will be installed on distribution facilities and will not fall under the Commission's Federal Power Act jurisdiction."

Last week, Chairman Jeff Bingaman (D-N.M.) of the Senate Energy and Natural Resources Committee and ranking Republican Lisa Murkowski (R-Alaska) circulated a draft bill on cyber protection policy that would give FERC the authority over critical distribution networks that it has been seeking. The proposed language says the bill would cover the "generation, transmission, or distribution of electric energy affecting interstate commerce" that federal authorities consider to be vital to U.S. security or national public health and safety.

Wide gap between least and most protected

A hearing on the legislation will be held in May, the committee said. Majority Leader Harry Reid (D-Nev.) has begun meetings with leaders of several Senate committees interested in the cybersecurity issue, seeking a coordinated path toward action this year, Senate aides said.

But even the successful completion of standards and rules for cyber protection for the power sector won't be enough if the technical competency of the industry's cyber managers is not upgraded, Assante insists.

The case study Assante cites is the Stuxnet computer worm, which industry experts believe penetrated a part of Iran's nuclear power infrastructure in mid-2009, damaging some of its critical uranium enrichment centrifuges.

The code for the Stuxnet cyber weapon, whose authors remain unidentified publicly and are the subject of intense speculation, was identified by a Russian security firm that found it on a USB flash drive, Assante says. The USB stick was turned over to the Russian firm by a security specialist at another firm who had plugged the stick into a computer and noticed a split-second response that was out of the ordinary.

The specialist didn't shrug off the anomaly, he says. "The reaction wasn't, 'Well, that was odd, and just move on,' which is a typical unaware reaction. ... It's easy to say, 'Well, that didn't work right. Let's just restart the computer.'"

Grid reliability is based on planning to keep the power flowing if a plant suddenly goes offline, a power line is knocked out, or a transformer fails, Assante said. The cyber challenge is different. "Planning engineers are used to saying, 'If this goes away, can the system still operate safely?'

"My point to them was, what happens if it doesn't go away, but this part of your system is being misused" to threaten the system?

Assante said there is still too wide a gap dividing power companies that are serious about raising cyber threat barriers and training people to use them, and other companies whose awareness and preparations are not adequate.

"Some utilities are certainly more progressive. They have more skilled folks on staff, and they've been able to do more to protect their systems. Others have suffered from the challenge of getting technical skills." The Tennessee Valley Authority is an example of a power provider that is setting high standards, he said. "Awareness differs. It's not a simple task," he said. "There's still work to be done."

Correction: An earlier version referred to Accuvant LABS as a company based in Hanover, Md.; however, Accuvant LABS is a division of Accuvant, which is based in Denver.

Wednesday, April 20, 2011

Floatovoltaics

We're familiar with floating oil and gas platforms. Engineers have been working on designs for floating wind turbines for over a decade. The Russians have allegedly launched a floating nuclear power plant...so why not floating solar? (GW)

Solar on the Water

By Todd Woody
New York Times
April 19, 2011

PETALUMA, Calif. — Solar panels have sprouted on countless rooftops, carports and fields in Northern California. Now, several start-up companies see potential for solar panels that float on water.

Already, 144 solar panels sit atop pontoons moored on a three-acre irrigation pond surrounded by vineyards in Petaluma in Sonoma County. Some 35 miles to the north, in the heart of the Napa Valley, another array of 994 solar panels covers the surface of a pond at the Far Niente Winery.

“Vineyard land in this part of the Napa Valley runs somewhere between $200,000 and $300,000 an acre,” said Larry Maguire, Far Niente’s chief executive. “We wanted to go solar but we didn’t want to pull out vines.”

The company that installed the two arrays, SPG Solar of Novato, Calif., along with Sunengy of Australia and Solaris Synergy of Israel, is among the companies trying to develop a market for solar panels on agricultural and mining ponds, hydroelectric reservoirs and canals. While it is a niche market, it is potentially a large one globally. The solar panel aqua farms have drawn interest from municipal water agencies, farmers and mining companies enticed by the prospect of finding a new use for — and new revenue from — their liquid assets, solar executives said.

Sunengy, for example, is courting markets in developing countries that are plagued by electricity shortages but have abundant water resources and intense sunshine, according to Philip Connor, the company’s co-founder and chief technology officer.

Chris Robine, SPG Solar’s chief executive, said he had heard from potential customers as far away as India, Australia and the Middle East. When your land is precious, he said, “There’s a great benefit in that you have clean power coming from solar, and it doesn’t take up resources for farming or mining.”

Sunengy, based in Sydney, said it had signed a deal with Tata Power, India’s largest private utility, to build a small pilot project on a hydroelectric reservoir near Mumbai. Solaris Synergy, meanwhile, said it planned to float a solar array on a reservoir in the south of France in a trial with the French utility EDF.

MDU Resources Group, a $4.3 billion mining and energy infrastructure conglomerate based in Bismarck, N.D., has been in talks with SPG Solar about installing floating photovoltaic arrays on settling ponds at one of its California gravel mines, according to Bill Connors, MDU’s vice president of renewable resources.

“We don’t want to put a renewable resource project in the middle of our operations that would disrupt mining,” Mr. Connors said. “The settling ponds are land we’re not utilizing right now except for discharge and if we can put that unproductive land into productive use while reducing our electric costs and our carbon footprint, that’s something we’re interested in.”

Mr. Connors declined to discuss the cost of an SPG floating solar array. But he noted, “We wouldn’t be looking at systems that are not competitive.”

SPG Solar’s main business is installing conventional solar systems for homes and commercial operations. It built Far Niente’s 400-kilowatt floating array on a 1.3-acre pond in 2007 as a special project and has spent the last four years developing a commercial version called Floatovoltaics that executives say is competitive in cost with a conventional ground-mounted system.

The Floatovoltaics model now being brought to market by SPG Solar is the array that bobs on the surface of the Petaluma irrigation pond.

“We have been able to utilize a seemingly very simple system, minimizing the amount of steel,” said Phil Alwitt, project development manager for SPG Solar, standing on a walkway built into the 38-kilowatt array.

“With steel being so expensive, that’s our main cost,” Mr. Alwitt said.

Long rows of standard photovoltaic panels made by Suntech, the Chinese solar manufacturer, sit tilted at an eight-degree angle on a metal lattice fitted to pontoons and anchored by tie lines to buoys to withstand wind and waves.

The array, which is not yet operational, will be hooked up to a transmission line through a cable laid under the pond bed. Mr. Alwitt said that when the array is completed, 2,016 panels would cover most of the pond’s surface and generate one megawatt of electricity at peak output.

He noted that the cooling effect of the water increased electricity production at the Far Niente winery by 1 percent over a typical ground-mounted system.

SPG Solar executives said an environmental engineering firm that evaluated its technology concluded that water evaporation under the floating arrays decreased by 70 percent. The companies also say that their systems inhibit destructive algae growth by blocking the sunlight the algae need to grow.

David L. Sedlak, a professor of civil and environmental engineering at the University of California, Berkeley, agreed that the floating solar power plants could prove useful in controlling algae.

“Irrigation ponds have the potential to become algal sources and algae can cause all sorts of issues,” said Dr. Sedlak, co-director of the university’s Berkeley Water Center. But he said he doubted that stemming evaporation would be a big selling point for floating solar panels since irrigation ponds did not lose that much water to evaporation.

Solar entrepreneurs had hoped to persuade the California State Water Project to cover the 400-mile California Aqueduct with photovoltaic panels. The panels could then generate electricity over the canal that irrigates the agricultural empire of the Central Valley and helps supply water to 25 million Californians.

Solaris Synergy, the Israeli firm, claims that installing its floating solar arrays on the aqueduct could produce up to two megawatts of electricity per mile. And SPG Solar executives said they held preliminary discussions with state officials about putting solar panels on the aqueduct.
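
As a rough back-of-the-envelope check of that claim (my own arithmetic, using only the two figures quoted above), full coverage of the aqueduct would imply a peak capacity on the order of 800 megawatts:

```python
# Back-of-the-envelope arithmetic using only the figures quoted above.
# Actual deployable capacity would depend on how much of the aqueduct
# could practically be covered.
mw_per_mile = 2        # Solaris Synergy's claimed output per mile of aqueduct
aqueduct_miles = 400   # length of the California Aqueduct cited above

peak_capacity_mw = mw_per_mile * aqueduct_miles
print(f"Implied peak capacity if fully covered: {peak_capacity_mw} MW")  # -> 800 MW
```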

“We think there’s a huge potential,” Mr. Robine said.

Ralph Torres, deputy director of the state water project, said he had recently spoken with Solaris, the latest of many companies that he said had approached his agency over the years about installing solar panels.

“You would really have to anchor these solar arrays so they wouldn’t float away,” Mr. Torres said. “If you do spring a leak and have to go in quickly these panels would be in the way and you might damage or destroy them when responding to an emergency.”

“A better application would be on a reservoir,” he added.

That is Sunengy’s strategy. Mr. Connor said the company was looking to developing countries to turn hydroelectric dams and village reservoirs into giant batteries.

“Any solar power you generate on the dam allows you to feed the transmission line and save water in the dam for use on rainy days or at night,” he said.
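
To make the “giant battery” idea concrete, here is a minimal sketch of the underlying arithmetic; the head and efficiency figures are illustrative assumptions of mine, not Sunengy numbers:

```python
# Rough illustration of the "dam as battery" idea quoted above: each
# megawatt-hour of solar generation lets the operator hold back the water
# that would otherwise have been run through the turbines for that energy.
# The head and efficiency values below are illustrative assumptions.

RHO = 1000.0      # density of water, kg/m^3
G = 9.81          # gravitational acceleration, m/s^2

head_m = 100.0    # assumed height drop through the turbines, in meters
efficiency = 0.9  # assumed turbine/generator efficiency

solar_energy_j = 1.0 * 3.6e9  # 1 MWh of solar output, expressed in joules

# Volume of water whose usable potential energy equals that output
water_saved_m3 = solar_energy_j / (RHO * G * head_m * efficiency)
print(f"Water held back per MWh of solar: {water_saved_m3:,.0f} m^3")
# -> roughly 4,000 cubic meters under these assumptions
```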

Sunengy’s plan would deploy rafts of solar units that use a plastic lens to track the sun and concentrate sunlight onto small photovoltaic cells, which require less of the expensive silicon used in conventional panels. In high winds, the lens stows underwater.

“If you have a drought on a hydroelectric dam, your asset is dead,” Mr. Connor said. “If you have solar power on that dam, you can continue to generate electricity.”

Tuesday, April 19, 2011

"...more important than anything we've seen for a long time"

The quote above was made privately by a BP executive about Iraq's oil reserves, a year before Britain decided to join the U.S. in its war there.

But we know the war had nothing to do with oil. (GW)

Secret memos expose link between oil firms and invasion of Iraq


By Paul Bignell
The Independent
April 19, 2011

Plans to exploit Iraq's oil reserves were discussed by government ministers and the world's largest oil companies the year before Britain took a leading role in invading Iraq, government documents show.

The papers, revealed here for the first time, raise new questions over Britain's involvement in the war, which had divided Tony Blair's cabinet and was voted through only after his claims that Saddam Hussein had weapons of mass destruction.

The minutes of a series of meetings between ministers and senior oil executives are at odds with the public denials of self-interest from oil companies and Western governments at the time.

The documents were not offered as evidence in the ongoing Chilcot Inquiry into the UK's involvement in the Iraq war. In March 2003, just before Britain went to war, Shell denounced reports that it had held talks with Downing Street about Iraqi oil as "highly inaccurate". BP denied that it had any "strategic interest" in Iraq, while Tony Blair described "the oil conspiracy theory" as "the most absurd".

But documents from October and November the previous year paint a very different picture.

Five months before the March 2003 invasion, Baroness Symons, then the Trade Minister, told BP that the Government believed British energy firms should be given a share of Iraq's enormous oil and gas reserves as a reward for Tony Blair's military commitment to US plans for regime change.

The papers show that Lady Symons agreed to lobby the Bush administration on BP's behalf because the oil giant feared it was being "locked out" of deals that Washington was quietly striking with US, French and Russian governments and their energy firms.

Minutes of a meeting with BP, Shell and BG (formerly British Gas) on 31 October 2002 read: "Baroness Symons agreed that it would be difficult to justify British companies losing out in Iraq in that way if the UK had itself been a conspicuous supporter of the US government throughout the crisis."

The minister then promised to "report back to the companies before Christmas" on her lobbying efforts.

The Foreign Office invited BP in on 6 November 2002 to talk about opportunities in Iraq "post regime change". Its minutes state: "Iraq is the big oil prospect. BP is desperate to get in there and anxious that political deals should not deny them the opportunity."

After another meeting, this one in October 2002, the Foreign Office's Middle East director at the time, Edward Chaplin, noted: "Shell and BP could not afford not to have a stake in [Iraq] for the sake of their long-term future... We were determined to get a fair slice of the action for UK companies in a post-Saddam Iraq."

Whereas BP was insisting in public that it had "no strategic interest" in Iraq, in private it told the Foreign Office that Iraq was "more important than anything we've seen for a long time".

BP was concerned that if Washington allowed TotalFinaElf's existing contract with Saddam Hussein to stand after the invasion it would make the French conglomerate the world's leading oil company. BP told the Government it was willing to take "big risks" to get a share of the Iraqi reserves, the second largest in the world.

Over 1,000 documents were obtained under Freedom of Information over five years by the oil campaigner Greg Muttitt. They reveal that at least five meetings were held between civil servants, ministers and BP and Shell in late 2002.

The 20-year contracts signed in the wake of the invasion were the largest in the history of the oil industry. They covered half of Iraq's reserves – 60 billion barrels of oil, bought up by companies such as BP and CNPC (China National Petroleum Company), whose joint consortium alone stands to make £403m ($658m) profit per year from the Rumaila field in southern Iraq.

Last week, Iraq raised its oil output to the highest level for almost a decade, 2.7 million barrels a day – seen as especially important at the moment given the regional volatility and loss of Libyan output. Many opponents of the war suspected that one of Washington's main ambitions in invading Iraq was to secure a cheap and plentiful source of oil.

Mr Muttitt, whose book Fuel on Fire is published next week, said: "Before the war, the Government went to great lengths to insist it had no interest in Iraq's oil. These documents provide the evidence that gives the lie to those claims.

"We see that oil was in fact one of the Government's most important strategic considerations, and it secretly colluded with oil companies to give them access to that huge prize."

Lady Symons, 59, later took up an advisory post with a UK merchant bank that cashed in on post-war Iraq reconstruction contracts. Last month she severed links as an unpaid adviser to Libya's National Economic Development Board after Colonel Gaddafi started firing on protesters. Last night, BP and Shell declined to comment.


Not about oil? What they said before the invasion

* Foreign Office memorandum, 13 November 2002, following meeting with BP: "Iraq is the big oil prospect. BP are desperate to get in there and anxious that political deals should not deny them the opportunity to compete. The long-term potential is enormous..."

* Tony Blair, 6 February 2003: "Let me just deal with the oil thing because... the oil conspiracy theory is honestly one of the most absurd when you analyse it. The fact is that, if the oil that Iraq has were our concern, I mean we could probably cut a deal with Saddam tomorrow in relation to the oil. It's not the oil that is the issue, it is the weapons..."

* BP, 12 March 2003: "We have no strategic interest in Iraq. If whoever comes to power wants Western involvement post the war, if there is a war, all we have ever said is that it should be on a level playing field. We are certainly not pushing for involvement."

* Lord Browne, the then-BP chief executive, 12 March 2003: "It is not, in my or BP's opinion, a war about oil. Iraq is an important producer, but it must decide what to do with its patrimony and oil."

* Shell, 12 March 2003, said reports that it had discussed oil opportunities with Downing Street were 'highly inaccurate', adding: "We have neither sought nor attended meetings with officials in the UK Government on the subject of Iraq. The subject has only come up during conversations during normal meetings we attend from time to time with officials... We have never asked for 'contracts'."

www.fuelonthefire.com

Monday, April 18, 2011

EU considers replacing short flights with high-speed rail

European Union leaders are showing the kind of thinking and planning that will be necessary to keep a worst-case climate change scenario from unfolding. They've shown that a huge dose of political courage will inevitably have to accompany every proposed design solution. (GW)

EU could ground short-haul flights in favour of high-speed rail

By Dan Milmo
Guardian
April 18, 2011

Short-haul flights across Europe could be replaced by high-speed rail under ambitious European Union proposals to reduce carbon dioxide emissions from transport by 60% over the next 40 years.

According to the EU, Heathrow's congestion problems could be eased by cutting domestic and European flights, while demand for new runways can be suppressed by building new rail networks. The EU transport commissioner, Siim Kallas, has announced a series of green transport goals, including shifting the majority of flights longer than 300km to rail and phasing out the use of petrol cars in city centres by 2050.

"At Heathrow there are no new runways, but we desperately need to increase capacity and you can do this if you reduce short-haul flight connections," said Kallas. The commissioner added in an interview with the Guardian that the UK should look at the example of Spain, where high-speed rail has hit demand on a previously popular flight corridor.

"This has happened in Madrid and Barcelona, where 50% of the market has moved to high-speed rail. It is comfortable for everybody. Airlines can put emphasis on long-haul flights, which is better for their business."

Noting the ongoing debate over expanding London's squeezed airports, he added: "If we are successful in creating new railways they can take over short-haul airline connections. It makes it easier for the runway issue."

Kallas hit the headlines this month when he declared a target of phasing out petrol and diesel cars from city centres by 2050. The commissioner said he was unfazed by criticism of the benchmark. "If you don't like the idea of reducing the use of conventional cars in city centres, what are your proposals?"

Kallas said EU countries need to reduce the "mass need" to use petrol and diesel cars for short journeys. "It is a desirable goal to phase out conventional cars," he said. However, Kallas added that mass adoption of electric cars would also pose problems, because major city roads would continue to be clogged by traffic.

Speaking after a meeting with officials at Transport for London, Kallas said the capital's congestion charge will be copied by other conurbations around Europe. "The congestion charge is a step that many cities will follow," he said. Kallas's 2050 targets include connecting all hub airports to high-speed rail lines and connecting major ports to rail networks in order to reduce dependency on road freight.

Sunday, April 17, 2011

Energy flights of fancy

I met the MIT grad students who came up with this innovative approach to harvesting wind energy last year at a technology competition in Cambridge. Their concept attracted a lot of attention and raised a lot of questions. It's very encouraging to see young minds working on these issues. (GW)

A generator that’s lighter than air — and relatively light on the wallet

By Scott Kirsner
Boston Globe
April 17, 2011

Graduates of MIT’s Department of Aeronautics and Astronautics learn lots about lift, aerodynamics, and rocket engines.

But they don’t learn how to launch and fly a blimp. So when the MIT-educated founders of Altaeros Energies Inc. unpacked a white plastic shroud from the back of a Volkswagen sedan last November and filled it with helium, they didn’t exactly know what they were doing. They’d attached a video camera to their custom-made cylindrical airship, and also a few sensors, to gather information about its motion in the wind. As one of the team members worked a pulley system to allow the blimp to lift off, the team members grinned, scribbled a few notes, and shot video from the ground.

Altaeros’s blimp, technically called an “aerostat’’ since it is tethered to the ground rather than free-flying, is designed to hold a wind turbine in its hollow center and fly at nearly 2,000 feet, where the winds are more consistent and powerful. (The wind turbines proposed for the Cape Wind project, by comparison, will reach 440 feet.) A thick mooring cable will carry the electricity generated by the turbine to a ground station.

The company’s vision is that airborne wind power will prove an important complement to traditional pole-mounted turbines. Altaeros has already raised several hundred thousand dollars to test its technology and has been meeting with individual investors and venture capital firms to raise more. Based in East Cambridge, Altaeros is one of a half-dozen or so companies around the world working on designs that would send wind turbines high into the sky.

While we’ve captured power from the wind for hundreds of years, there are drawbacks to putting a spinning rotor atop a tall pole. “Thirty to 50 percent of the cost of putting up a wind turbine is building the tower,’’ says Altaeros chief executive Adam Rein. And if you try to build a taller tower to harness stronger winds at higher altitudes, construction costs increase further and objections from neighbors get louder.

The Federal Aviation Administration says tethered balloons are kosher, but they can’t be near airports or fly above 2,000 feet. They must be illuminated at night and include safety precautions in case the tether rips. (In that scenario, the balloon is designed to automatically let helium escape, letting the balloon drift to the ground.)

Altaeros believes its aerostat design could produce two to five times as much power as a pole-mounted, 350-foot-tall turbine in the same location. Initially, the company is developing what you might call “wind power in a box,’’ packing the balloon, helium tanks, rotor, generator, and related gear into a shipping container. The container would then be sent to a remote oil field, island, or military post that would typically rely on a diesel generator for power. Altaeros’s aim is that the aerostat will be flying after just a day of set-up and will only need to descend every three or four months for a helium top-off and a maintenance check.
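
A rough way to see why flying at 2,000 feet might plausibly yield a multiple of a 350-foot turbine’s output is to combine the standard power-law model of wind shear with the cubic dependence of wind power on wind speed. The sketch below is my own plausibility check, not Altaeros data, and the shear exponents are assumed values:

```python
# Plausibility check of the "two to five times as much power" claim, using
# the power-law wind-shear model v(h) = v_ref * (h / h_ref)**alpha and the
# fact that available wind power scales with the cube of wind speed.
# The shear exponents alpha are assumptions, not Altaeros figures.

h_tower_ft = 350.0      # hub height of the pole-mounted turbine cited above
h_aerostat_ft = 2000.0  # flying altitude of the aerostat cited above

for alpha in (0.14, 0.25):  # smooth open terrain vs. rougher terrain (assumed)
    speed_ratio = (h_aerostat_ft / h_tower_ft) ** alpha
    power_ratio = speed_ratio ** 3
    print(f"alpha={alpha:.2f}: wind speed x{speed_ratio:.2f}, power x{power_ratio:.1f}")
# -> about 2x for alpha=0.14 and about 3.7x for alpha=0.25, in the same
#    ballpark as the company's two-to-five-times claim
```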

Altaeros is still a few years from commercial production, but it expects the cost of electricity produced by its airborne turbines to compare favorably to a diesel generator, especially if the diesel fuel is being shipped long distances.

Several companies are further along than Altaeros in designing flying wind turbines, and have raised substantially more money. California-based Makani Power Inc., cofounded by MIT alum Saul Griffith, has raised at least $15 million from Google and last year received a $3 million grant from the Department of Energy. Makani is building an unmanned aircraft, tethered to the ground, that will fly in large circles. Three rotors on the craft’s wing help lift it into the sky, acting as propellers. Once aloft, the rotors will generate power as the wing swoops through the air like a kite.

A Canadian company, Magenn Power Inc., says it expects to begin selling its first-generation, helium-filled system this year. Unlike Altaeros’s design, Magenn’s entire balloon is intended to generate electricity by spinning horizontally in the wind, similar to a waterwheel. A Magenn system capable of cranking out 100 kilowatts — enough to power 30 to 50 homes — will cost about $500,000.
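
For scale, the Magenn figures quoted above work out to about $5 per watt of capacity, and the 30-to-50-home range implies an average draw of roughly 2 to 3 kilowatts per home; the quick arithmetic below simply restates those numbers:

```python
# Simple arithmetic on the Magenn figures quoted above.
system_cost_usd = 500_000
capacity_kw = 100
homes_low, homes_high = 30, 50

cost_per_watt = system_cost_usd / (capacity_kw * 1000)
print(f"Installed cost: ${cost_per_watt:.2f} per watt of capacity")  # -> $5.00/W

avg_kw_per_home_low = capacity_kw / homes_high   # 2.0 kW per home
avg_kw_per_home_high = capacity_kw / homes_low   # about 3.3 kW per home
print(f"Implied average load: {avg_kw_per_home_low:.1f} to {avg_kw_per_home_high:.1f} kW per home")
```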

But investors, whether big energy producers or venture capital firms, haven’t exactly begun singing “off we go into the wild blue yonder.’’

“Like tidal power, it still has a lot of question marks,’’ says Rob Day, partner at the Boston office of Black Coral Capital. At Boston’s Flybridge Capital Partners, partner Jon Karlen says, “First you have the technical risk of getting the system to work for a day or a week in the air, and then you need to make sure it will work for decades.’’

Altaeros plans to have a new prototype of its aerostat ready for testing this summer. Unlike the hollow-centered blimp lofted last November, this one will have a small wind turbine inside, capable of producing 2.5 kilowatts of electricity — about the size of a “backyard’’ turbine that would supply power to a single-family residence.

Using what they learned from last fall’s flights, the company’s engineers have been tweaking the design to make it more stable. The less it is buffeted about, the more efficient it will be at wringing power from the wind.

As with all futuristic technologies — especially those that seem to have popped off the pages of a Jules Verne novel — Altaeros is prepared to answer all sorts of skeptical questions. One investor asked, “What happens if one of the aerostats falls out of the sky and hits a whale?’’

That would be bad news, of course, but what if we had decided to ban cars because of the risk that they’d crash into things? We need technology, even if it seems clunky or dangerous at first, to take us onward and upward.

Scott Kirsner can be reached at kirsner@pobox.com. Follow him on Twitter @ScottKirsner.

Saturday, April 16, 2011

What replaces coal?

Let's take it as good news that the federal government is "leading by example" by closing 18 aging coal-fired boilers in an effort to reduce greenhouse gas emissions.

That's half of the solution.

The other half involves what the coal boilers are being replaced with. Therein lies the rub. If half of the glass is emptied of coal but refilled with nukes (or even natural gas, for that matter), what have we ultimately accomplished?

Make no mistake, eliminating coal burning is the top priority, but why only go halfway when we have the ability to do more? (GW)


TVA agrees to shut down 18 coal-fired boilers and curb emissions

By Dina Fine Maron
ClimateWire
April 15, 2011

One of the nation's largest coal-burning utilities said yesterday it will shutter 18 of its coal-fired boilers and pay billions to rein in pollutants at many of its remaining units, underscoring the evolving energy landscape in the United States.

The move by the Tennessee Valley Authority will result in nearly 1 percent of the nation's coal-fired power capacity going offline by the end of 2018, including 1,000 megawatts of coal-fired power TVA said it planned to retire last year. TVA's landmark deal with a suite of states and environmental groups and U.S. EPA resolves a number of lingering violation complaints EPA brought against the company for allegedly failing to comply with Clean Air Act pollution control requirements at 11 of its plants.

Environmentalists yesterday hailed the agreement as a success for public health that will result in major reductions of greenhouse gases on top of targeted benefits in reductions of sulfur dioxide (SO2) and nitrogen oxides (NOx).

EPA estimated that the agreement will cut TVA's NOx by 69 percent and SO2 by 67 percent, resulting in about $27 billion in annual health care benefits by averting thousands of early deaths, asthma attacks and heart attacks. EPA did not calculate specific greenhouse gas reduction figures.

The federally owned Tennessee Valley Authority will be closing 18 units at three of its plants in Tennessee and Alabama as part of the agreement, affecting about 16 percent of its coal-fired electricity generating system. TVA will also need to invest in pollution control retrofits for most of its remaining 41 coal-fired units, which the company said could cost between $3 billion and $5 billion.

Another provision of the agreement requires TVA to inject $350 million into energy projects to slash pollution and save energy, with $240 million of that pot funding energy efficiency initiatives. A $40 million chunk of TVA's funds will also go toward reducing greenhouse gases and other pollutants through waste heat recovery, hybrid electric charging stations, solar installations and waste treatment methane gas capture projects.

"Today's announcement locks in the retirements ahead, so now we'll see what the next steps are for reductions in greenhouse gases and what will replace the coal-fired power plants," said Bruce Nilles, deputy conservation director for the Sierra Club, a group involved in the settlement. "Putting an end to burning millions of tons of coal means huge reductions in greenhouse gases," he said.

15 million tons of CO2 to be eliminated

The 18 units slated for closure emitted about 15 million tons of carbon dioxide in 2008, according to TVA.

To replace the electric capacity, TVA will look to "low-emission or zero-emission electricity sources, including renewable energy, natural gas, nuclear power and energy efficiency," the utility said in a statement.

Stephen Smith, executive director of the Southern Alliance for Clean Energy and an unpaid adviser for a group that crafted a long-term strategy for TVA's future resource use, estimates that the closures will shrink TVA's carbon footprint by about 10 percent.

“These are not the workhorse plants. These are older, lower-utilized plants,” he said, noting that they would not typically be operating at full capacity.

Still, he called these coal reductions "very important," since TVA is one of the largest coal plant operators in the country and continues to be a major player in the southeastern United States. Other companies will see this choice and follow suit, since it will be expensive to install environmental controls on some of these older, inefficient plants, he said. With this announcement, he said, "you are seeing a major company in the southeastern United States announcing commitments to retire significant amounts of coal."

The central plank of the settlement agreement forged by TVA and EPA requires the retirement of two units at the John Sevier Fossil Plant in eastern Tennessee, six units at the Widows Creek Fossil Plant in northern Alabama, and all 10 of the units at TVA's Johnsonville Plant in central Tennessee. Almost all of those units date back to the 1950s and had no modern pollution controls installed.

"These units are among the first built by TVA and have served us well over the years. But as times change, TVA must adapt to meet future challenges," TVA President and CEO Tom Kilgore told his board yesterday in Chattanooga, Tenn., where the majority of the board signed off on the plan, according to a statement. Installing needed pollution control equipment at these facilities would not be cost-effective, he said.

EPA Administrator Lisa Jackson said TVA's transition sends an important signal to other companies: "This not only can be done, this is being done," she told reporters yesterday.

"The message here," she said, "is that we don't have anything against coal, but we have to reduce the pollution that comes from coal to our air, to our water and on our land."

Other companies are already taking steps to move away from coal.

Other utilities shrink CO2 footprint

North Carolina-based companies Progress Energy and Duke Energy, which announced a $13 billion merger earlier this year, were among the first out of the box to lay out plans to replace a significant number of older coal-fired units with natural gas. Three combined-cycle gas-burning plants are now under construction or in the planning stages in that state, and Duke and Progress are also waiting for federal regulatory approvals for three new nuclear power stations.

For its part, Duke is among the largest coal users in the country, but if state regulators buy into its plans, company chairman Jim Rogers has said its portfolio will shift to more gas, nuclear and high-efficiency coal plants.

Atlanta-based Southern Co. also expects to get the green light to start construction on two nuclear power plants in Georgia this year. In Kemper County, Miss., Southern is building a coal gasification plant. Meanwhile, just outside of Atlanta, subsidiary Georgia Power is replacing coal-fired units with three 840-megawatt gas units.

The TVA deal put a variety of options on the table to move toward a cleaner energy future. It set up a framework for the construction of up to 4,000 megawatts of natural gas power plants to be part of the mix, according to EPA.

Another option under the settlement agreement would be to move toward burning woody biomass and other waste instead of coal -- though such biomass facilities have been a hot-button issue on Capitol Hill, as some groups question the carbon footprint of burning woody waste and say those plants could drive deforestation.

"There are certainly many ways that we can use biomass to replace fossil fuels including coal and get real reductions in life-threatening carbon dioxide pollution and still protect our natural resources," said Franz Matzner, legislative director for the Natural Resources Defense Council's Climate Center. "What remains to be seen is what is on the table here. ... We need to be very cautious with how we proceed with the catch-all 'biomass,'" he said.

Reporter Joel Kirkland contributed.