Thursday, March 31, 2011

You've got to admit they're getting bigger...a little bigger all the time

In this age of miniaturization (with iPods getting smaller and laptops ever thinner), wind energy technology is headed in the opposite direction. In today's world, the most expeditious way for wind energy to compete economically with its fossil fuel competitors is for individual turbines and turbine arrays (windfarms) to get larger and more efficient. Cape Wind has recently completed a 10-year permitting process for the development of a windfarm consisting of up to 130 3.6-megawatt turbines that were considered almost unbelievably huge a decade ago. (GW)

Vestas surprises with 7 MW offshore wind turbine

Reuters
March 30, 2011

By Shida Chayesteh

LONDON (Reuters) - Danish wind turbine maker Vestas unveiled a new giant 7-megawatt offshore turbine on Wednesday and said it expected it to be in serial production in early 2015.

Vestas, the world's biggest wind turbine manufacturer by market share, had only been expected to unveil a 6 MW turbine, not the towering 135-meter (443 feet) 7 MW unit, with its rotor diameter of 164 meters.

The new turbine, whose rotor will trace a circle bigger than the London Eye ferris wheel and whose 80-meter blade is longer than nine London double-decker buses, is part of a push toward larger and larger units further offshore, where the wind power potential is high and public resistance can be avoided.
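For a rough sense of scale, here is a back-of-the-envelope sketch of the swept area implied by that 164-meter rotor and how the rotor circle compares with the London Eye; the 120-meter London Eye diameter is an outside approximation, not a figure from the article:

    import math

    rotor_diameter_m = 164.0        # from the article
    london_eye_diameter_m = 120.0   # approximate public figure (assumption)

    swept_area_m2 = math.pi * (rotor_diameter_m / 2) ** 2
    print(f"Swept area: {swept_area_m2:,.0f} m^2")                # ~21,000 m^2
    print(f"Rotor vs. London Eye diameter: "
          f"{rotor_diameter_m / london_eye_diameter_m:.2f}x")     # ~1.37x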

Shares in Vestas, already up before the announcement, rose as much as 3.5 percent and ended up 2.6 percent at 225.20 Danish crowns, outperforming a 1.7 percent rise in the Copenhagen bourse's bluechip index.

Chief Executive Ditlev Engel told a news conference in London that the new turbine, designed for North Sea conditions, was the first that Vestas has developed specifically for offshore wind parks and its single largest research and development investment ever.

Wind currently accounts for just 2 percent of the global energy mix, but Vestas expects it to provide 10 percent in 2020, Engel said in a presentation.

Most of the growth in offshore wind power capacity to 2025 will be in Europe, he said.

President of Vestas Offshore, Anders Soe-Jensen, told the news conference the company expected the offshore wind market to grow by 8.4 percent annually from 2015 to 2025.

The UK is the world leader in offshore wind power capacity, having overtaken Denmark, and the new turbine is expected to come in time for the big new British projects planned under Round 3 of the UK's licensing system, Vestas said.

"Just here in Britain, there are plans for 32,000 MW out at sea -- that is 10 times the total installed wind capacity of Denmark, so we are talking about a lot of money," Engel told Reuters on the sidelines of the news conference.

The first prototypes are expected to be built in the fourth quarter of 2012, Vestas Wind Systems A/S said in a statement.

"Serial production is set to begin in the first quarter of 2015, provided a firm order backlog is in place to justify the substantial investment needed to develop the new production and assembly facilities required for a turbine of this size," the company said.

CAPEX DEPENDS ON ORDERS

Engel said the company has not decided where to locate the new plant that would be needed to manufacture the turbine, and he added that the capital expenditure and location would depend on order commitments coming through.

Engel said Vestas had no orders for the new turbine in hand, though customers have been involved in the design process. He declined to say how big he expected sales of the new unit to be.

Alm. Brand analyst Michael Friis Jorgensen said: "The main thing we could have hoped for was more specifics on the investment needed to build up capacity for (this turbine)."

"They are saying they will look at orders and then decide on the investment," Jorgensen said.

Engel told Reuters that selling wind turbines has become similar to the practice in the airline industry where one sells on the basis of design and then builds the product once one has the backlog of orders to justify the big investment.

He said the plant would preferably be located near a port to facilitate transport.

Vestas's R&D chief Finn Strom Madsen said the new turbine would be equipped with a gearbox, unlike the gearless models being developed by some manufacturers.

"We are able to design direct drive solutions, but they are less attractive because they provide a poorer business case for our customers," Madsen said.

The new Vestas turbine has more than twice the capacity of the company's current biggest 3 MW unit, but is smaller than the world's largest 10 MW wind turbines being built by British, American and Norwegian engineers.

(Reporting via Copenhagen newsroom; writing by John Acher; editing by Elaine Hardcastle and Will Waterman)

Wednesday, March 30, 2011

Cloudburst

"Slow down, you move too fast." So observed Simon and Garfunkel in the "59th Street Bridge Song". That's sound advice I'd like to pass along to a lot of information technology developers these days. And to think, when I was in high school I was considered a nerd/geek. Not just considered...I was complete with slide rule in a suede carrying case that hooked to my belt and an assorted array of pocket protectors (including one from NASA!!!)

Somehow this race for digital supremacy feels shaky. Maybe that's because I was once the not-so-proud owner of an eight-track tape player. Our private lives are increasingly being deposited in the public domain. Sometimes by choice and sometimes to our surprise. I think I'll keep my music library close and personal. At least for now. (GW)

Amazon vs Apple: The race for the future of music

A new 'virtual jukebox' lets you play your own songs wherever you are. Jerome Taylor on why it makes iPods look so last century

The Independent
March 30, 2011

Ever since the dawn of the internet, music fans have dreamt of a mythical celestial jukebox where every song ever produced would be available at the click of a button.

Now Amazon has sneaked ahead of arch rivals Apple and Google by becoming the first major internet company to unveil a music-streaming service – allowing people to store their music online and listen to the tracks on any computer or smartphone.

So-called "cloud music", where music libraries are stored in cyberspace rather than on computer hard drives, is the new Holy Grail of the digital music industry, as technology companies race to entice consumers in a world where the CD has been all but abandoned.

Yesterday, Amazon quietly released Cloud Player to its American customers in a move that has taken much of the music blogosphere by surprise. Numerous smaller companies have already released their own start-up cloud players in what is still a niche, yet rapidly growing, market. But Amazon's offering is the first time that one of the major tech goliaths has jumped into the business. Google and Apple are thought to be developing their own cloud players.

Cloud music players are often described as "digital lockers" where music listeners can place the music they own on a remote server. Users can then access their music library anywhere in the world as long as they have a fast internet connection.

It means that gap-year students travelling abroad or employees in offices can listen over the internet to the same CDs that are sitting on their bedroom shelves at home. Amazon's service, which is not yet available to British customers, starts by giving subscribers 5GB of free storage space, enough to hold 1,000 songs. Those who purchase an album through the company's digital music store will be given a further 20GB free for the first year and will then be expected to pay $20 (£12.50) a year to continue using the service.
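Taken at face value, Amazon's storage math assumes an average track size of roughly 5 MB, typical of a few-minute MP3 at mid-range bitrates; a minimal sketch of that arithmetic (the per-track size is an inference from the article's figures, not an Amazon specification):

    free_tier_gb = 5           # free storage (from the article)
    songs_in_free_tier = 1000  # stated capacity of the free tier (from the article)
    paid_tier_gb = 20          # storage bundled with an album purchase (from the article)

    mb_per_song = free_tier_gb * 1024 / songs_in_free_tier
    print(f"Implied average track size: {mb_per_song:.1f} MB")       # ~5.1 MB
    print(f"Songs that fit in {paid_tier_gb} GB at that size: "
          f"{paid_tier_gb * 1024 / mb_per_song:,.0f}")               # ~4,000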

You can then access your music on any computer or Android mobile phone. Given the ongoing hostility between Amazon and Apple – the two companies are currently engaged in a legal spat over who owns the phrase "App Store" – iPhones will not be able to sign up to the Amazon cloud player.

Music fans might find it somewhat galling to pay a company to store their music for them, but according to Mike Butcher, European editor of the TechCrunch technology blog, it won't be long before everyone is listening to music this way. He said: "Most of us have growing libraries of digital music.

"But if our hard drives crash or our computers are stolen we risk losing everything. Storing your music on a cloud means you'll always be able to access it." As internet speeds continue to get faster, many companies are trying to persuade people to store their data with them to free up space on their hard drives. The practice also gives the companies a lucrative insight into their customers' spending habits.

But the emergence of cloud players is very likely to lead to yet another clash between software companies and record labels over royalties and the thorny question of who owns the rights to digital music files. Figures released this week show that global recorded music sales fell by $1.3bn last year as digital piracy continues to take its toll on the music industry. Record label executives are determined to rake back profits through music providers like Apple, Amazon and Google. But cloud players could soon become a new sticking point.

"There is going to be a monumental ding-dong over all this in the coming years," Steve Mayall, editor of the consultancy Music Ally, said. "Amazon can probably see which way the wind is blowing and have decided to roll out their service, but I'm sure there will be some legal battles on the horizon."

Uploading music files to a digital locker is still viewed as legally ambiguous. Although someone who has bought a track has the right to listen to it, do they then have the right to share it on another server? And would a cloud provider like Amazon be liable for pirated music that is played through its servers? Although many record labels have turned a blind eye to these ambiguities, a handful of companies are being sued over such issues.

Amazon has defended its cloud system as being no different to using a web-based service to back up files. "We don't need a licence to store music," Craig Pape, the company's director of music, said. "The functionality is the same as an external hard drive."

Tuesday, March 29, 2011

Unholy grails of grain

If you would like to see what climate change adaptation looks like from one perspective -- a purely capitalist one -- read how Monsanto and other biotechnology companies are racing to develop a variety of drought-and-heat-resistant wheat. Although Monsanto gave up its search for a variety of wheat that was resistant to its herbicide Roundup back in 2004, don't be surprised if they try to incorporate that trait into their latest genetically modified quest. (GW)

Companies begin a difficult search for a climate-hardy variety

By Tiffany Stecker
ClimateWire
March 28, 2011


In 2009, Monsanto, the biggest agricultural company in the world, did something that had been unthinkable just five years before.

It made a major investment in wheat.

This wasn't the company's first foray into developing more advanced wheat cultivars. In the 1990s, it had begun research on developing Roundup Ready wheat to add to its suite of herbicide-resistant crops. But economics reared its ugly head. Acreage in spring wheat had declined dramatically, and Monsanto ended the research in 2004.

Just five years later, wheat markets were on the upswing. Growers began pushing heavily for private investment in research. Monsanto bought WestBred, a small grain biotechnology research firm out of Bozeman, Mont., in 2009 for $45 million. With the merger, Monsanto acquired WestBred's "germ plasm" -- a kind of toolbox of genetic resources that might improve wheat.

Monsanto gave itself five to seven years to develop genes that promoted drought tolerance and high yield. The company is now developing high-yielding, locally applicable varieties, and the next step will be to apply biotechnology know-how acquired in corn research to begin playing with wheat genetics, said Claire CaJacob, wheat technology lead for the company.

"The biggest focus is on drought and intrinsic yield," said CaJacob. Disease and pest control -- a factor also linked to climate change -- "is also pretty important."

In the short term, Monsanto will continue to develop its germ plasm through advanced breeding techniques, using molecular markers -- pieces of DNA that 'mark' a certain trait on a plant's genetic blueprint -- to formulate climate-hardy seeds.

Eventually, the company hopes to develop its germ plasm to a point where genes from corn and soy to improve yield, drought tolerance and nitrogen-use efficiency can be implanted into the wheat genome.

Looming challenge to seed developers

Monsanto could start field testing genetically modified wheat by 2012 and deliver the variety to the market in the next decade, said a spokesperson for the company.

With not a single genetically engineered wheat variety on the market, and a pressing need to feed an estimated 9 billion people by 2050, seed developers are beginning to grasp the challenge looming just ahead.

While traditional wheat breeding has been accelerated by modern science, it is still no match for the potential of genetic engineering. Isolating a drought-tolerant or nitrogen-efficient gene through crossbreeding different species -- even with today's molecular marker technology -- can take up to 20 years.

In theory, genetic engineering can cut that time in half, even less. But there are other factors that are keeping genetic engineering research stalled.

According to Mark Sorrells, a researcher in plant breeding and genetics at Cornell University, the major factor holding back the technology is regulation, which can cost companies millions. More precisely, it costs between $100 million and $150 million to develop genetically engineered crops before they hit the market, according to a spokesperson for the Biotechnology Industry Organization.

In addition, unpredictable results might require multiple attempts at placing the transplanted gene into a wheat genome. And although official reports claim that bioengineered crops are no less safe than traditional ones, many people -- especially in Europe -- still refuse to accept them as a food crop.

But despite the costs, scientific uncertainty and public skepticism, even traditional plant breeders agree that genetic manipulation, along with age-old methods, is needed to keep the world fed through the next 40 years. The two approaches are complementary; both have their place.

And whether for transgenic or traditional breeding research, recent private investment in wheat has elicited a sigh of relief from many in the industry.

"There's only so much public investment that will go through research," said Jane DeMarchi, director of government affairs for research and technology at the National Association of Wheat Growers (NAWG). "If we limit ourselves to just public investment, we might not see the innovation we need for our crop to reach its potential."

Hybrid wheat, approaching the 'holy grail'

Like Monsanto, Syngenta, the third-largest seed company in the world, is looking to corn, wheat's competitor on the global market, for clues on making hardier varieties. The company is developing a hybrid variety of wheat -- a cross between two species of the same genus -- that is impossible to create in nature.

Hybrids are often much more vigorous than their inbred versions. In the case of corn, the hybrid variety far outyields its non-hybrid version, and responds better to fertilizer, as well.

"Hybrid wheat has been holy grail for people in wheat genetics," said John Bloomer, head of cereals for Syngenta. That's because hybrids are a moneymaker for seed companies, who can resell the choice variety every year. Otherwise, farmers would simply save the seeds themselves.

Since wheat naturally pollinates itself, cross-pollinating two parents from different varieties to create a hybrid is impossible without some genetic prodding. The expensive biotechnology needed to correct this has inhibited hybrid research. Syngenta is using its experience from the development of hybrid barley -- a close relative of wheat -- to develop a hybrid variety of wheat.

To cope with climate change, research has centered on strengthening the rooting structure of wheat, enhancing the intake of water, increasing the plant's biomass and facilitating CO2 absorption.

Last year, Syngenta partnered with CIMMYT, a Mexico-based nonprofit corn and wheat research and training center, to do some more advanced wheat research. Using advanced genetic marker technology and traditional seed banks, the partnership seeks to develop both native and genetically modified traits for wheat. The company did not disclose its financial investment in the partnership.

"We're looking at genetics, native traits, GM approaches, seed care products," said Bloomer. "We're looking at all the tools possible to make plants utilize water better."

Monday, March 28, 2011

"Industry will have to change working patterns... people will just have to get used to the heat"

Japan is a country that has already adopted many energy efficiency and conservation measures over the decades. Its citizens and businesses will be hard-pressed to conserve even more in the face of power shortages that look as if they will be the norm for the foreseeable future. (GW)

Saving Power Is A New Priority

After years of debate, Japanese policy makers have finally begun to seriously consider instituting daylight-saving time this summer for the first time in six decades, which would reduce energy demand in the early mornings and evenings. Japanese companies also are weighing reducing hours worked—and wages paid—at offices and factories.

As a result, Tokyo households could face higher utility bills and blackouts during the hottest summer months, when air conditioners usually are cranked up to maximum power.

The sudden push to save electricity comes as several power plants that served the Tokyo area were temporarily or permanently knocked out of service by the March 11 quake, including the damaged Fukushima Daiichi nuclear plant that has been leaking radiation.

Prime Minister Naoto Kan's government has set up a special task force of cabinet-level officials charged with coming up with a set of policy prescriptions and recommendations. As part of that initiative, government ministries are reaching out to industry groups to coordinate their efforts.

"Everything is being examined from a zero basis, without any favor or prejudice," said Noriyuki Shikata, a spokesman for the cabinet. "As one of the world's most energy-efficient countries, our margin for additional conservation is rather limited, but we don't have any choice."

In response to calls for conserving electricity, many businesses in Tokyo already are operating with dimmed lights, prompting some to post "open for business" signs on front entrances. The Tokyo metropolitan government turned off about half the city's street lights and many elevators in public facilities in the aftermath of the quake. Train service has been suspended on several routes because of the power shortage. Even the wattage of ubiquitous vending machines on nearly every street corner has been turned down.

But those efforts may not be nearly enough as the heat and humidity of summer approach, threatening to paralyze Japan's capital city.

Tokyo Electric Power Co., or Tepco, hopes to end its rolling blackouts by late April as it races to bring mothballed or underutilized generators—mostly powered by fossil fuels—back on line. Even so, it estimates that the gap between supply and demand during the peak summer months will balloon to 10 million kilowatts, or about 20% of total usage.

Despite relatively high prices for electricity by global standards, Japan has grown accustomed to ample supplies over the past two decades. Even as the population remained virtually unchanged between 1990 and 2009, electricity usage surged almost 35% over the period, according to data from the Federation of Electric Power Companies of Japan.

The sudden shortage of electricity in and around Tokyo could trigger a supply-side shock to Japan's economy the likes of which it hasn't experienced since oil prices spiked in the 1970s. BNP Paribas cites the electricity "bottleneck" and supply-chain disruptions in its forecast for negative growth for the next two quarters, which it expects to act as a drag on overall growth and to induce a 0.9% contraction in gross domestic product in fiscal 2011.

Businesses in areas covered by the two hardest-hit utilities, Tepco and Tohoku Electric Power Co., account for half Japan's total economic output and 45% of its manufacturing. Nomura Securities says the shortfall in electricity mid-summer and mid-winter could wipe out about 1.4 trillion yen ($17.19 billion) in pretax profits, or an average of about 5% at Japan's 400 biggest companies.

In an editorial Sunday, the Nihon Keizai, Japan's largest business daily, called on Japan's two main industry associations to take the lead in organizing "rotating holidays" to reduce demand everywhere from gritty factory floors to fancy department stores.

Due to a 100-year-old historical quirk, the power shortages Tokyo faces can't be remedied by sending more electricity from western Japan, which was largely unaffected by the quake. While Tokyo uses 50-hertz electricity, its western rival Osaka uses 60-hertz power—a discrepancy stemming from Japan's crash industrialization program in the 1890s, when Tokyo chose German-made generators and Osaka adopted U.S. generators. The shortfall in eastern Japan is 10 times what frequency-converting substations are capable of sending from west to east, according to BNP Paribas.

That has spurred Japanese leaders to consider taking drastic measures. Few executives have broached the topic of shorter working hours openly, but a top official at a major Japanese manufacturer who declined to be named said reduced work shifts are likely in order to cope with the lack of electricity. "Industry will have to change working patterns, and people will just have to get used to the heat," the official said.

Neither step would be completely without precedent: before World War II, government officials in Japan worked only until noon during the peak summer months, and during the U.S. occupation after World War II, Japan had daylight-saving time until 1952. But the issue of daylight saving has been surprisingly contentious in Japan, where some have viewed it as a way to extend the work day or rehash postwar memories.

Sunday, March 27, 2011

Funding support for billionaire from program he's trying to kill

So who invented irony, anyway? (GW)

An energy program too efficient for its own good

By Jon Coifman
Boston Globe
March 26, 2011

BILLIONAIRE CONSERVATIVE financier David Koch doesn’t know it, but the advanced energy-saving technologies used in the new $211 million cancer research lab that bears his name at the Massachusetts Institute of Technology were funded in part through a government program to reduce global warming pollution. It is the same program under heavy attack by one of Koch’s biggest political beneficiaries, the group Americans for Prosperity.

The David H. Koch Institute, which was dedicated this month, will use almost a third less energy than comparable facilities. Everything in the building is designed to maximize efficiency, from lighting and climate controls to the laboratory systems — even the floor plan.

Money for all those extras came through MIT’s $14 million campuswide partnership with its utility, NStar. In just 36 months, they plan to cut the university’s energy use 15 percent — enough to power 4,500 Massachusetts homes for a year. The total lifetime payback is expected to exceed $50 million.

But NStar didn’t decide to do this on its own. Under a law signed by Governor Deval Patrick in 2008, Massachusetts utilities are required to pay for efficiency upgrades whenever the energy savings cost less than building the equivalent amount of new generating capacity.

Last year, almost a fifth of that money came by way of the Regional Greenhouse Gas Initiative, an agreement by 10 Northeastern states that limits the amount of carbon dioxide utilities can put in the air and collects a small fee — set by auction — for every ton they emit.

Koch would not be alone in missing the connection to the Regional Greenhouse Gas Initiative. In fact, top supporters of the embattled program didn’t know either. And that is precisely why the program is in serious trouble.

Thousands of businesses, families, and local communities are reaping dividends from the initiative without knowing it, because those benefits flow through a tangled web of rebate and incentive programs administered by utilities, state governments, and nonprofits — robbing an effective program of the natural constituency it deserves and making it easier for groups like Americans for Prosperity to attack.

And attack they have.

Thanks to an aggressive yearlong campaign by Americans for Prosperity, the New Hampshire House of Representatives voted last month to quit the initiative. Senate agreement is expected. The new Republican governor of Maine wants to follow, and there is mounting pressure on Governor Chris Christie of New Jersey to do likewise.

That would be a giant leap in the wrong direction, not just — or even mainly — for the environment, but also for those state economies.

The Regional Greenhouse Gas Initiative costs the average household about 75 cents a month. It pays for upgrades from home weatherization to energy-efficient industrial boilers, big commercial lighting projects, and rooftop photovoltaic installations on schools and warehouses (which is how New Jersey became the number two state for solar). Studies consistently show these efforts return $3 to $4 for every dollar spent.

It saves money for everyone by avoiding expensive new power plants, and by lowering peaks in demand that drive up electricity prices across the board. That’s good for ratepayers, and good for business.

Programs funded by the initiative provide access to scarce capital and help reduce operating costs. They create opportunities for everyone from architects, engineers, and programmers to the people in tool belts who bend metal and wire up buildings. And it’s opportunity that can’t be outsourced to China.

Of the $789 million raised by the Regional Greenhouse Gas Initiative through December, more than half went to efficiency projects. Eleven percent went for renewables, and 14 percent to offset bills for low-income families. Less than 5 percent went for administrative overhead, a figure that should warm Tea Party hearts. (Those numbers would be even better if New York and New Jersey hadn’t used some proceeds for deficit reduction.)

At a time when energy prices are threatening a shaky recovery and many parts of the Northeast are pressed to meet electricity demand, lawmakers and business leaders should take an honest accounting of the program’s benefits before they surrender this important and highly cost-effective economic tool.

Jon Coifman is an environment and clean tech strategist at PRCG Communications, and blogs at PositioningGreen.com.

Saturday, March 26, 2011

Why ideas and history matter

Thanks to Alexandra Hinrichs for today's post. Alex, the daughter of my best buddy, has been reporting from the front lines of the collective bargaining battles in Madison, Wisconsin, where she attended school and now lives.

I can't remember how many people I've recommended "Nature's Metropolis" to over the years. It's one of those books that helps you understand how the world really works. (GW)

Wisconsin's most dangerous professor


Why are Republicans desperate to see Bill Cronon's emails? Because ideas and history matter

Friday, March 25, 2011

"I guess this is the right time to evaluate the options"

According to overnight news accounts, the water three men were exposed to while working at the Fukushima Daiichi nuclear power plant had 10,000 times the amount of radiation typical for that locale. Speculation is that a crack in the containment wall of the No. 3 reactor is allowing radiation from the core to escape.
Today's two posts compare the reactions from Japan and the U.S. to the tragedy unfolding at the Fukushima Daiichi plant. (GW)

Japan and energy: What's the alternative?

By Peter Shadbolt
CNN
March 24, 2011

(CNN) -- As Japan's earthquake and tsunami ripped through the Fukushima Daiichi nuclear power plant, the wind turbines at nearby Takine Ojiroi Wind Farm did what they were designed to do: They swayed, they stopped, they electronically checked themselves and automatically restarted.

"Except for one turbine that was very close to the nuclear power plant, all the turbines were up and running after the quake," said Sean Sutton of Vestas, the world's largest manufacturer of electricity generating wind turbines.

"And the damaged turbine we were able to monitor remotely," he said. Even now, the turbines are generating power for the grid despite being isolated within the nuclear exclusion zone.

As a source of power, wind energy is about as clean, safe and earthquake-proof as it gets -- the problem is it generates a fraction of Japan's energy needs.

Compared with the massive 4,696 MW output of the six reactors at Fukushima Daiichi nuclear plant, the 23 turbines at Takine Ojiroi can produce just 46 MW -- enough for 30,000 households.

With just 1% of the capacity of the nuclear-powered leviathan a few kilometers away, the wind farm is a microcosm of the contribution of wind power to Japan's energy mix.
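A minimal sketch of the comparison using the article's own figures; note that this compares nameplate capacity only, not energy actually delivered (wind turbines typically run at a lower capacity factor than nuclear reactors):

    wind_farm_mw = 46.0        # 23 turbines at Takine Ojiroi (from the article)
    nuclear_mw = 4696.0        # six reactors at Fukushima Daiichi (from the article)
    households_served = 30000  # households supplied by the wind farm (from the article)

    print(f"Wind farm share of nuclear capacity: {wind_farm_mw / nuclear_mw:.1%}")       # ~1.0%
    print(f"Households served per MW of wind: {households_served / wind_farm_mw:.0f}")   # ~650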

Currently, Japan gets 27% of its power from coal, 26% from gas, 24% from nuclear, 13% from oil, and 8% from hydro. The remaining 2% is occupied by renewables such as geothermal power stations, solar and wind.

While renewable energy companies are loath to admit it, as Japan still counts the human cost of the quake, the nuclear crisis at Fukushima may be the best thing that ever happened to the sector.

Last week, the share prices of renewable energy companies soared as much as 10%. Nuclear companies, meanwhile, tanked as China -- which has the world's largest commitment to nuclear power with 13 plants in operation and more than 27 in the pipeline -- announced it would suspend approval for new nuclear power projects.

Germany, too, has suspended its nuclear program and ordered seven older plants closed during the moratorium. At the same time, Berlin has boosted the size of the government's renewable energy fund from €300 million ($425 million) to €1 billion.

Renewable energy companies admit their fortunes are closely tied to the vagaries of the oil market. Every sustained price spike sees renewed interest and funding in alternative energy. This interest ebbs when oil prices fall to more affordable levels.

However, the nuclear crisis in Fukushima -- coupled with events in Libya that have pushed oil to 30-month highs -- has added a new and, many in the industry will be hoping, permanent dimension to the funding of the heavily subsidy-dependent alternatives sector.

"I guess this is the right time to evaluate the options," said Sutton, whose Danish company has its Asia headquarters in Singapore. "Wind power compared with other sources is safe, fast, predictable and clean - it can also be deployed quickly."

He said the Japanese government was now reviewing long-stalled wind power feed-in tariffs -- a form of government subsidy -- under which alternative power producers, including solar power generating households, are paid a premium for feeding power into the grid.

With the Japanese government considering massive offshore wind farm projects that could generate 1,000MW of power, wind companies are betting on strong growth in Japan.

"Previously the government was quite fractured on this issue, but I think if there's any good to come from this nuclear crisis then it will be to help kick start feed-in tariffs for wind," Sutton said.

The Holy Grail for the alternative sector is what is known in the industry as grid parity, where it becomes as cheap to buy wind and solar power as it is to buy other power on the grid. Without feed-in tariffs the dream of grid parity is a long way off.
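Grid parity can be made concrete with a simple levelized-cost comparison: annualize the up-front capital cost of a wind project, add operating costs, divide by the energy produced, and compare the result with the going price of grid power; a feed-in tariff is what bridges any remaining gap. The sketch below uses entirely hypothetical numbers, not figures from the article:

    def crf(rate, years):
        """Capital recovery factor: converts an up-front cost into an equivalent annual payment."""
        return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

    def lcoe(capex_per_kw, capacity_factor, om_per_kw_yr=40.0, rate=0.07, years=20):
        """Simplified levelized cost of energy per kWh (ignores fuel, degradation, taxes)."""
        annual_kwh_per_kw = capacity_factor * 8760
        return (capex_per_kw * crf(rate, years) + om_per_kw_yr) / annual_kwh_per_kw

    wind_cost = lcoe(capex_per_kw=2500, capacity_factor=0.30)  # hypothetical wind project
    grid_price = 0.08                                          # hypothetical grid power price per kWh
    feed_in_tariff = 0.05                                      # hypothetical premium per kWh

    gap = wind_cost - grid_price
    print(f"Wind LCOE: {wind_cost:.3f}/kWh vs. grid: {grid_price:.2f}/kWh")
    print("At grid parity" if gap <= 0 else
          f"Needs {gap:.3f}/kWh of support; tariff provides {feed_in_tariff:.2f}/kWh")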

Japan already has feed-in tariffs for solar power and is the third largest producer of solar power in the world, behind Germany and Spain. While it has ambitious plans for this fast-growing energy sector, even the most optimistic projections put it at only 10% of the energy mix by 2050.

As exciting as alternatives may seem, the answer to Japan's future energy needs is likely to come from more traditional sources, according to Ivo Bozon, a leading energy analyst at McKinsey & Company.

"It takes a long-term commitment to get the scale necessary in renewables to produce meaningful amounts of the power," said Bozon. "There are physical limits on renewable energy -- with wind power it's reliant on onshore winds and with solar it's space and sunlight."

As you go down the list of clean fuels, he explained, discounting those energy sources that take up too much space, rely on intermittent power sources or are simply too expensive to produce, what you end up with is natural gas.

"The biggest lift from this is likely to be in gas, especially in Japan and China, where they have the scale to get plants up and running," he said.

Biomass, fuels produced from biological sources such as palm oil, is also likely to see renewed interest as the pendulum swings away from nuclear power.

"I think all countries will be rethinking their commitment to nuclear in the light of these events. At the very minimum there could be delays," said Bozon.

“We surely should avoid a rush to judgment”

Disappointingly, President Obama's and Congress' strong support for nuclear energy does not seem to be at all affected by the developments in Japan. (GW)

Lobbyists’ Long Effort to Revive Nuclear Industry Faces New Test

By Eric Lichtblau
New York Times
March 24, 2011

WASHINGTON — One flash point in the remarkable revival of interest in nuclear energy here — a revival now threatened by the calamity in Japan — came almost by accident at a late-night brainstorming session in a senator’s office in 1997.

Pete V. Domenici, then a Republican senator from New Mexico, was looking for an issue to claim as his own. One staff member, a former scientist at the Los Alamos nuclear lab, tossed out an idea that seemed dead on arrival: a renewed commitment to nuclear energy.

“Are you serious?” Mr. Domenici remembers asking the aide incredulously. After Three Mile Island and Chernobyl, nuclear energy had fallen into disfavor, development had stalled, and many politicians ran from the issue like it was a toxic cloud.

But with industry backing, Mr. Domenici overcame his skepticism and became one of the driving forces in a decade-long renaissance of nuclear energy — a resurgence that began in earnest under President George W. Bush and has led President Obama to seek a $36 billion expansion in loan guarantees to finance reactors at a time when other programs are being slashed.

Now, however, the future of nuclear energy in the United States is in doubt, with advocates on all sides bracing for a fierce debate over whether the disaster in Japan should slow or even derail the planned expansion of America’s 104 nuclear reactors.

Mr. Obama has shown no sign of backing away — a testament to the success of an expensive multiyear campaign by the nuclear energy industry, advocates in Congress and the executive branch.

Nuclear executives, girding for a fight, have already held 20 briefings for Washington lawmakers and others about the events in Japan and the potential lessons learned at home. They have been putting out guidance on increased safeguards for reactors, and giving reporters tours of nuclear plants.

The message: Despite the events in Japan, nuclear is a safe, affordable and “clean” energy source that does not spew harmful carbons into the environment or rely on foreign producers.

“We surely should avoid a rush to judgment,” Jeff Merrifield, a former member of the Nuclear Regulatory Commission, said in one of a series of videos that the Nuclear Energy Institute, the leading trade group, has put out on its Web site since this month’s tsunami crippled Japan’s reactors. The United States, he said, should “continue to move forward with building those plants because it’s the right thing for our nation.”

But with polls in the last two weeks showing dimmed support for nuclear power, opponents are hoping to use the events in Japan to slow the industry’s political momentum and challenge what the industry maintains is a long record of safety.

“The risk is just so great if there’s a screw-up,” said David Hamilton, director of energy programs for the Sierra Club, which opposes the expansion of nuclear energy. “The nuclear renaissance was already hanging by a thread, and the Japanese disaster may have cut the thread.”

But even the critics acknowledge that the industry’s backers have managed to jump-start nuclear energy in a way that few thought possible a decade ago.

One turning point, people on both sides of the issue agree, was that proponents took advantage of the public concern over climate change and carbon-producing fuels beginning in the early 2000s and were able to recast themselves — first to fence-sitting lawmakers, then to the public as a whole — as a “clean” alternative that would not harm the environment.

“It was a brilliant campaign,” said Tyson Slocum, an energy expert at Public Citizen, which opposes nuclear energy because of concerns about its safety, security and cost.

“While everyone was focused on shutting down coal plants, they had a couple of years to themselves to just talk to the American public in very sophisticated ad campaigns and to reintroduce a generation of Americans to nuclear power,” he said. “That was very powerful.”

Nuclear industry firms and their employees also contributed more than $4.6 million in the last decade to members of Congress — both Republicans and Democrats, including Mr. Obama, then a senator, and his presidential campaign — as the industry’s political fortunes were rising, according to an analysis by MAPLight.org, a Washington research group that tracks money and politics.

And the industry has spent tens of millions more lately on lobbying. Last year, electric utilities, trade groups and other backers spent $54 million hiring lobbyists, including former members of Congress, to make their case, according to a separate analysis by the Sunlight Foundation, which also tracks money and politics.

As a senator, Mr. Domenici was a big beneficiary of the industry’s largess, collecting more than $1.25 million over his 20-year career from political donors affiliated with the energy sector.

Months after he committed himself to promoting nuclear energy, he gave a talk on the topic in 1997 at Harvard University called “A New Nuclear Paradigm.” Nuclear energy proponents called it a seminal moment in the shift of public opinion.

“I wanted to put nuclear power in its proper perspective,” said Mr. Domenici, who left the Senate in 2009 and serves as a senior fellow at the Bipartisan Policy Center in Washington.

“You have this resource just sitting there saying, ‘Are you going to use me or not?’ ” Mr. Domenici said in an interview last week. “People were stirring up fears of another Three Mile Island, but I believe the reality of nuclear power has now become much better known.”

Mr. Domenici’s position as a senior member of both the Senate Energy and Appropriations Committees gave him a particularly influential role in helping the industry. He was at Mr. Bush’s side in 2005, when the president signed a major bill that encouraged the building of new nuclear plants. Work has now begun on four new plants.

Mr. Domenici’s former aides have gone on to play critical roles in the debate as well.

Pete Lyons, a nuclear scientist and the former Domenici aide who first suggested the nuclear energy idea to the senator at his 1997 brainstorming session, went on to serve on the Nuclear Regulatory Commission and has been nominated by Mr. Obama to run the Energy Department’s civilian nuclear program. Alex Flint, another Domenici aide at the meeting, now is the chief federal lobbyist for the Nuclear Energy Institute. And a third aide at the meeting, Steve Bell, assists Mr. Domenici’s work on a presidential panel on nuclear waste.

Mr. Flint said the senator’s staff did not expect to succeed when Mr. Domenici began proposing modest appropriations for nuclear research and programming in the late 1990s.

“We were going against the conventional wisdom,” Mr. Flint said. “We expected a pushback, but we didn’t get it. And it just grew from there.”

Within the Energy Department, meanwhile, a 2003 study by the Massachusetts Institute of Technology on the future of nuclear energy helped forge a consensus within the government, even among skeptical policy makers, officials said. The study concluded that while nuclear power was facing “stagnation and decline,” it should remain an important way to provide carbon-free energy at relatively low cost.

“That really moved my thinking, and that kind of analysis was very influential,” said Daniel B. Poneman, deputy secretary at the Energy Department.

Today, there is no doubt about where the Energy Department stands.

Its Web site extols the value of nuclear energy as providing “low-cost, carbon-free electricity to help drive the American economy and preserve the environment,” and it even includes a special page for children called “the Power Pack,” featuring a sci-fi journey through nuclear energy.

For critics urging a go-slow approach to building reactors, the enthusiasm is all a bit much.

“The industry has really embedded itself in the political establishment,” said Mr. Slocum at Public Citizen. “They’ve had reliable friends from George Bush to Barack Obama, and the government has really just become cheerleaders for the industry.”

Whether events in Japan change the political calculus in Washington “is what everyone is waiting to see,” he said. “We don’t want to be seen as exploiting a tragedy, but it’s prudent to talk about the implications here. The best and the brightest can’t see around every corner.”

Thursday, March 24, 2011

“Then a strange blight crept over the area..."

Our Silent Spring

By James Carroll
Boston Globe
March 21, 2011

WHEN RACHEL Carson entitled her prescient 1962 book “Silent Spring,’’ she was imagining the dawning of the season without the sweet sounds of wildlife. She noted that, even then, in many parts of the United States, spring “comes unheralded by the return of birds, and the early mornings are strangely silent where once they were filled with the beauty of birdsong.’’ Carson’s book was heard as a resounding alarm, jumpstarting the contemporary environmental movement. In important ways, her warning was heeded (restrictions on DDT), but the human assault on the natural world only escalated in the decades since, with last week’s catastrophe in Japan a latest signal of the danger.

“There was once a town in the heart of America where all life seemed to live in harmony with its surroundings.’’ The book begins with what Carson calls a fable for tomorrow. “Then a strange blight crept over the area and everything began to change. Some evil spell had settled on the community . . . No witchcraft, no enemy action had silenced the rebirth of new life in the stricken world. The people had done it to themselves.’’

As Carson wrote, America’s first commercial nuclear power plant had just come on line (in 1958), and she could hardly have imagined the escalation of risk that took off then. The contaminations of chemical poisons that so worried Carson can seem benign compared to the ruins of radiation, if the worst happens. The Fukushima experience suggests what expert reassurances are worth. More than 500 nuclear power plants are in operation or under construction around the world today, with every one of them being viewed with new skepticism. One chance in a million — such predictions of disaster suddenly seem less of a long shot. What have we done to ourselves?

That more than 10,000 Japanese are likely to have been killed by the natural phenomena of the earthquake and the tsunami relativizes the prospect of far fewer being killed, injured, or sickened by released radiation from the damaged reactors — the expected outcome as of now. But the global anxiety attached to this multivalent catastrophe rises to another level of concern. Alarm is drowning out all the other sounds of spring this week.

The combined destructiveness of the shaken earth, the furious sea, and the nuclear product of industrial technology in Japan is a perfect expression of the perennial tension between nature and human inventiveness. The story of homo sapiens has been a tale of two impulses, at least since the invention of agriculture more than 10,000 years ago.

There is the embrace of nature, even unto cultivation and celebration — the sustenance we take from grown food, the lift of our hearts at birdsong. And there is the crushing of nature for greed, land into property, carbon monoxide into the air, habitual laying waste and moving on.

Ironically, whether out of love for nature or exploitation of it, the broad result of the double-barreled human impulse, as is now apparent, has been the obliteration of climate stability — a problem that transcends all political, economic, and cultural preoccupations. Because it is transcendent, the climate crisis is hard to contemplate, and that is where the news from Japan comes in.

The issue suddenly is not just the radiation danger of commercial nuclear power, but the true and total cost of industrial technology — not only to nature, but to the human future. Whether the reactors at Fukushima go into meltdown or not, the incipient Japanese environmental trauma underscores the way in which the fragile atmosphere of Earth has already begun its meltdown. Could Rachel Carson have imagined Montana’s Glacier National Park without glaciers (as soon as 2020, according to the US Geological Survey), or, for that matter, the polar icecap without ice (NASA predicts ice-free Arctic summers by 2100)? Even if we are urgently mobilized, will humans have more long-term success in restoring climate stability than the valiant Japanese technicians and firefighters are having in short-term cooling of the overheating reactors?

That humankind is by nature conscious of itself has led us to imagine that we are above nature — our tragic flaw. Faced this week not only with nature’s capriciousness, but with the deadly consequences of that human flaw, we fall silent. Our silence, for once, echoes the silent spring.

James Carroll’s column appears regularly in the Globe.

Wednesday, March 23, 2011

Music of the high frequency geodesics (aka spheres)

These guys are really good. Go to their website, give a listen and consider purchasing their first CD - Sympathetic Vibrations. (GW)

Dymaxion Quartet: Bucky-ing The Trend


by Josh Jackson
National Public Radio
March 8, 2011

Nature may contain the best answer to every problem, but it was innovator Buckminster Fuller who created a portmanteau to explain it: DYnamic MAXimum tensION, or Dymaxion. The Dymaxion Quartet operates on this relatively simple concept: getting the maximum output from every unit of input.

"I think [Fuller] looked at the world through science, architecture and design — as though he was trying to unlock the code of a most efficient or perfect way of doing something," says drummer and leader Gabriel Gloege. "As it relates to the band and my compositions, I think of not having notes that don't need to be there. That's what I hope is very Dymaxion about this band."

Sympathetic Vibrations is the Dymaxion Quartet's debut recording. The band plays all original music, and each song is based on a photograph from Asca S.R. Aull's "Heaven on Earth" series. Aull spent three years in three different cities: Hong Kong, Paris and New York. He photographed aspects of each city's premodern culture, then spent another year finding old doors and window parts to frame them.

"There was a simplicity and a focus on a single idea or emotion," Gloege says. "He was just finding simple beauty that we pass by every day. His photos looked the way I want my music to sound. So I started writing some music based on it, and here we are, nine songs later."

In this studio performance and interview for WBGO's The Checkout, we get to hear a third of them: "At One," "The Boat" and "Summer's End." The latter composition draws from a Parisian café at the end of summer.

"The sun is setting, and there's this crown of light coming over the crowded street," Gloege says. "There's a bittersweet feeling — the joy of reflecting on the fun you had over the summer, and the realization that life's going to get harder as the weather changes. So I took a melodic idea, then tried to write two extremely different pieces and slap them together. You can hear the motif in the chorale and the boogaloo. Both are from the same kernel."

There are plenty of big ideas in the Dymaxion Quartet, but the focal point comes from the music.

"If you accept the notion that music is a language, then you have a completely new set of tools with which to understand the world around you," Gloege says. "Living with music is living a richer, more profound life."

Tuesday, March 22, 2011

“Nobody expected magnitude 9”

There are still many acts of Nature that defy scientific prediction. In the case of weather, inaccurate forecasting can occasionally lead to disastrous results, but usually we just end up cursing the weather person for not reminding us to pack an umbrella.

When it comes to earthquakes, however, the stakes, more often than not, are considerably higher. (GW)


Blindsided by Ferocity Unleashed by a Fault

By Kenneth Chang
New York Times
March 21, 2011

On a map of Japan that shows seismic hazards, the area around the prefecture of Fukushima is colored in green, signifying a fairly low risk, and yellow, denoting a fairly high one.

But since Japan sits on the collision of several tectonic plates, almost all of the country lies in an earthquake-risk zone. Most scientists expected the next whopper to strike the higher-risk areas southwest of Fukushima, which are marked in orange and red.

“Compared to the rest of Japan, it looks pretty safe,” said Christopher H. Scholz, a seismologist at the Lamont-Doherty Earth Observatory at Columbia University, referring to the area hit worst by the quake on March 11. “If you were going to site a nuclear reactor, you would base it on a map like this.”

Records kept for the past 300 years indicated that every few decades, part of the Japan trench, an offshore fault to the east of Fukushima, would break, generating an earthquake around magnitude 7.5, perhaps up to magnitude 8.0. While earthquakes that large would be devastating in many parts of the world, the Japanese have diligently prepared for them with stringent building codes and sea walls that are meant to hold back quake-generated tsunamis.

Shinji Toda, a professor of geology at Kyoto University in Japan, said a government committee recently concluded that there was a 99 percent chance of a magnitude-7.5 earthquake in the next 30 years, and warned there was a possibility for an even larger magnitude-8.0 quake.

So much for planning. Although Japan’s foresight probably saved tens of thousands of lives, it could not prevent the vast destruction of a magnitude-9.0 temblor, which releases about 30 times as much energy as a magnitude-8.0 quake. It was the largest ever recorded in Japan, and tied for fourth largest in the world since 1900. Thirty-foot tsunamis washed over the sea walls and swept inland for miles. The death toll is expected to be more than 20,000, and nearly 500,000 are now in shelters.
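The "about 30 times" figure follows from the standard Gutenberg-Richter energy-magnitude relation, under which each full step in magnitude corresponds to roughly a 32-fold jump in radiated energy; a minimal sketch (the constants are textbook values, not from the article):

    def radiated_energy_joules(magnitude):
        """Gutenberg-Richter energy-magnitude relation: log10(E) = 1.5*M + 4.8 (E in joules)."""
        return 10 ** (1.5 * magnitude + 4.8)

    ratio_9_vs_8 = radiated_energy_joules(9.0) / radiated_energy_joules(8.0)
    ratio_9_vs_75 = radiated_energy_joules(9.0) / radiated_energy_joules(7.5)
    print(f"M9.0 releases {ratio_9_vs_8:.0f}x the energy of M8.0")                  # ~32x, the article's 'about 30 times'
    print(f"and roughly {ratio_9_vs_75:.0f}x the energy of the expected M7.5")      # ~178x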

“I was surprised,” Dr. Toda said. “Nobody expected magnitude 9.”

This was not the first time scientists have underestimated the ferocity of an earthquake fault. Many were also caught by surprise by the magnitude-9.1 quake in 2004 off Sumatra, which set off tsunamis radiating across the Indian Ocean, killing more than 200,000 people.

Sometimes, scientists are blindsided by earthquakes because they occur along undiscovered faults. The deadly earthquakes in New Zealand this year; in Haiti last year; in Northridge, Calif., in 1994; and in Santa Cruz, Calif., in 1989 all happened along faults that scientists were unaware of until the ground shook.

“It’s shameful, but we’ve barely scratched the surface,” said Ross Stein, a geophysicist with the United States Geological Survey. In California, for instance, scientists have cataloged 1,400 faults, yet for smaller earthquakes — magnitude 6.7 or less — about one in three still occur on previously unknown faults.

“Humbling,” Dr. Stein said.

That raises a worrisome question: How many major quakes are lurking in underestimated or unknown faults?

The basic dynamics of earthquakes have been understood for decades. Earth’s crust is broken into pieces — tectonic plates — which slide and collide. But the sliding is not always smooth. When the plates stick together, they begin to buckle. Stress builds until the ground breaks and jumps, releasing energy in the form of vibrations: an earthquake. Not surprisingly, places close to plate boundaries are beset by earthquakes, while those far from the boundaries are not earthquake-prone.

The largest earthquakes occur in subduction zones, places where an ocean plate collides with and slides under a continental plate, particularly around the edge of the Pacific Ocean.

But some subduction zones seemed to produce more large earthquakes than others. One explanation was offered in 1980, when Hiroo Kanamori of the California Institute of Technology and Larry J. Ruff, now at the University of Michigan, published a paper that said giant earthquakes occurred more often along ocean faults where the subducting ocean plates were geologically young. The younger plates, like those off Alaska and Chile, were warmer, less dense and harder to push down into the Earth’s mantle, their thinking went. Meanwhile, the older, colder and denser ocean plates like those off Java and the Marianas trench in the Pacific would sink more easily and not produce the giant catastrophic quakes.

And yet the Pacific plate off Japan is 130 million years old, one of the oldest, and it generated a magnitude-9.0 counterexample. “It is not nearly as straightforward as I thought in the beginning,” Dr. Kanamori said.

Dr. Scholz of Columbia said the recent quake in Japan fit with a theory that he and Jaime Campos of the University of Chile developed in 1995. By their theory, the colliding tectonic plates off Fukushima were stuck, and should have been producing earthquakes. But the absence of spectacular earthquakes in the near historic record disagreed with their theory, and led Dr. Scholz to believe that something unknown was relieving the stress.

“Now we know we were wrong about that” and right in the first place, he said. “It does agree with the theory.”

Dr. Scholz said that patches of the Pacific plate off Fukushima become stuck as the plate moves under Japan. In the more modest earthquakes of the past 300 years, just one patch would break free. This time, he said, the patches ruptured together, producing a more cataclysmic quake. “The past 300 years, that hasn’t happened,” Dr. Scholz said. “So if you’re going to use the past history to extrapolate the future, the last 300 years wouldn’t have predicted the recent earthquake.”

Most regions of the world have less historical data than Japan, making it even harder to judge the earthquake patterns. Haiti is a prime example.

Even the notion of an earthquake fault — a long crack in the earth — is not quite as certain as it once was. Near Landers, Calif., seismologists had identified three faults, each capable of a magnitude-6.5 quake. Then, in 1992, an earthquake shook along all three faults at once, at a magnitude of 7.3.

“This is a controversy through the field right now,” said Peter Bird, a professor of geology and geophysics at the University of California, Los Angeles, “whether we can say we know the names and lengths of the faults.”

In Japan’s history, there does seem to have been a precedent for the recent quake, but it took place more than a thousand years ago. A text known as “Nihon Sandai Jitsuoku,” or “The True History of Three Reigns of Japan,” described an earthquake in July 869 and a tsunami that flooded the plains of northeast Japan: “The sea soon rushed into the villages and towns, overwhelming a few hundred miles of land along the coast. There was scarcely any time for escape, though there were boats and the high ground just before them. In this way about 1,000 people were killed.”

These were the same plains that were submerged this month. Analysis of sediments left by the 869 tsunami led to an estimate that the earthquake had a magnitude of 8.3.

Brian F. Atwater, a geologist at the United States Geological Survey, said that a similar situation exists in the Pacific Northwest. Only in the past couple of decades have scientists realized that the seismic conditions of the Cascadia trench off Oregon had the potential to produce a huge earthquake. Warning systems have been built. Evacuation plans have been drawn up.

Another worrisome subduction zone is the 2,000-mile Java trench in the Indian Ocean. Few earthquakes occur there. The ocean plate there is old, so Dr. Kanamori’s 1980 observations would suggest little likelihood of a great quake.

But Robert McCaffrey, a research professor of geology at Portland State University, said he no longer believes that geophysicists can distinguish dangerous subduction zones from the not-so-dangerous ones. “We just don’t have a long enough earthquake history to make models of subduction,” he said.

The only relevant characteristic, he said, is the length of the fault, and he sees the potential for a magnitude-9.6 earthquake in the Java trench. Indonesia, which has not built extensive sea walls and warning systems, would likely be very hard hit.

“That’s my biggest fear,” Dr. McCaffrey said.
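
To get a feel for how a fault's length alone can translate into a number like magnitude 9.6, one can plug illustrative values into the seismic-moment formula M0 = rigidity × rupture area × average slip. The sketch below is not from the article: the rigidity (30 GPa), down-dip rupture width (150 km) and average slip (20 m) are assumed round numbers, chosen only to show the arithmetic for a rupture running the full roughly 3,200-kilometer length of the Java trench.

```python
import math

MU = 3.0e10      # assumed crustal rigidity, in pascals (~30 GPa)
LENGTH = 3.2e6   # the full ~2,000-mile (3,200 km) Java trench, in meters
WIDTH = 1.5e5    # assumed down-dip rupture width, in meters (150 km)
SLIP = 20.0      # assumed average slip, in meters

# Seismic moment M0 = rigidity * rupture area * average slip, in N*m.
m0 = MU * LENGTH * WIDTH * SLIP

# Moment magnitude: Mw = (2/3) * (log10(M0) - 9.1).
mw = (2.0 / 3.0) * (math.log10(m0) - 9.1)

print(round(mw, 1))  # ~9.6 for these illustrative values
```

Because magnitude grows with the logarithm of the moment, even a tenfold change in these assumed values shifts the result by only about two-thirds of a magnitude unit; it is the fault's enormous length that pushes the estimate so high.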

Over the weekend, Dr. Scholz reread his 1995 paper and found that Java’s recent quiet did not fit with what his theory predicted. “It must be missing a very big one,” he said.

Monday, March 21, 2011

Will the EU close the books on novel foods?

It looks like the "Brave New World" is coming whether we want it to or not. Enter Novel Foods - defined as a type of food that does not have a significant history of consumption or is produced by a method that has not previously been used for food.

Apparently not only are genetically modified (GM) foods trying to make their way onto supermarket shelves, but molecularly engineered (via nanotechnology) ones as well. How will these new foods be regulated?

With extreme difficulty and controversy. (GW)


EU novel food regulation review at risk


EurActiv
18 March 2011

As talks between the European Parliament and the European Commission to update an EU regulation on novel foods stumble on cloning, the institutions now have two more weeks to find a compromise agreement before the whole review fails on 30 March.

Background

The EU's current Novel Foods Regulation dates back to May 1997. It does not cover foods developed since then that use nanotechnology, nor does it cover foods traditionally consumed outside the EU that are new to the European market.

The European Commission adopted a legislative proposal to amend the current Novel Foods Regulation in January 2008.

The aim, according to the EU executive, is to allow "for safe and innovative foods to reach the EU market faster" and to encourage the development of "new types of foods and food production techniques".

The regulation would create a centralised authorisation system to simplify and speed up the process of authorisation for novel foods. The European Food Safety Authority (EFSA) would be responsible for carrying out the risk assessment for a novel food application and, if the food were judged safe, the Commission would then propose its authorisation.

Only novel foods that are included on the Community list after assessment by the EFSA may be placed on the market.

Discussions on amending the bloc’s novel foods regulation failed at four o'clock on Thursday morning (17 March), after nine-hour marathon talks, amid disagreement over whether to ban food from clones and their offspring.

The Parliament is calling for an explicit ban on meat produced from cloned animals and their descendants, whereas EU ministers and the Commission back a ban on cloning for food production but reject a ban on food from the offspring of clones.

According to EU conciliation rules, if the two institutions fail to reach an agreement by a 30 March deadline, the whole proposal - on which work began in 2008 - will have to be scrapped.

So far, only one conciliation process has failed: on the Working Time Directive.

Risk of conciliation failure

The failure of the novel foods review would leave the use and development of new foods, not to mention cloning and the application of new technologies such as nanotechnology, in legal limbo for years to come.

The European Environmental Bureau voiced concern in particular over nanofoods, which would escape political attention if conciliation fails.

The Parliament is seeking a moratorium on foods containing nanomaterials until specific risk assessments have proven that they are safe.

MEPs also want food containing nano-ingredients to be labelled. The Council supports a more lenient approach, backing a case-by-case authorisation process after the safety of foods containing or consisting of manufactured nanomaterials has been evaluated.

Little hope of agreement

A last-chance conciliation meeting is provisionally scheduled for the evening of 28 March.

But the Parliament's rapporteur on the dossier, Dutch MEP Kartika Liotard (European United Left/Nordic Green Left), said that while the House "will do its utmost" to achieve an agreement before the deadline and is prepared to negotiate, "we will not cave in at any price".

She said that the negotiations can only have a positive outcome "if the Council moves towards consumers' expectations on the issue of cloning," referring to Eurobarometer survey results.

The latest survey shows that only 15% of EU citizens support animal cloning for food, whereas 70% think the practice should not be encouraged in food production.

"If the position of the Council and the Commission remains exclusively tied to commercial trade interests, Parliament won't accept any deal," Liotard said.

But the Council said in a statement that it and the Commission had reasons to reject a ban on food obtained from the naturally conceived offspring of clones.

It stressed that neither the EU nor third countries have a system for tracing the natural offspring of clones in the food chain.

Therefore, if restrictions on cloned cattle were introduced, Europe would have to reject imports of beef and dairy products from countries such as the United States, as producers could not fully trace the ancestors of their products.

"Such a ban would be impossible to defend under WTO rules and would lead to direct retaliatory measures by third countries," reads the Council statement.

According to the Council, the ban cannot be justified from an animal welfare perspective either, as the offspring of cloned animals are bred using traditional methods.

Finally, it underlined that "such a ban would mislead consumers as there are already non-traceable offspring in the EU and foodstuff derived from them has already entered the food market".

These animals mainly end up in the EU food supply chain via imports of breeding material from the US.

Pointing finger at Commission

Italian MEP Gianni Pittella (Socialists and Democrats), chair of the Parliament's delegation to the conciliation process, lamented that while the European Commission is supposed to act in a conciliatory manner to help the other two institutions reach an agreement, the House "could not count on its help".

"The Commission's position is even more rigid than that of the Council, which is not very helpful in reaching an agreement," Pittella said.

MEP Liotard even suggested that the failure of the negotiations had more to do with the very firm line taken by the EU executive than with the Council's stance.

Sources told EurActiv that instead of sticking to its role of being an impartial mediator in the conciliation negotiations, the Commission, in particular its trade department, has been acting as a "true lobbyist", actively seeking to influence the Parliament's position on the dossier in particular.

In a letter sent to Commission President José Manuel Barroso earlier this month, European Parliament President Jerzy Buzek said that the House was not satisfied with the manner in which the Commission had acted in this conciliation procedure and asked the EU executive to play its role as a mediator between the Council and the Parliament.

The letter was sent after EU Trade Commissioner Karel De Gucht warned the Parliament that a ban on the offspring of cloned cattle could spark a trade war between the EU and the rest of the world.

But MEP Liotard noted that if the regulatory review fails and there is no regulation to address new developments in the field of novel foods, world trade would be affected as well.

Positions

Finnish Green MEP Satu Hassi said the European Commission had played "an inglorious role in these negotiations" by "proactively pushing EU member states to resist any ban on clone food, even though its role under the treaties should only be to 'facilitate' negotiations. It is highly regrettable that the Commission is more concerned with the interests of its trading partners in third countries and their niche industry than the will of the majority of EU citizens".

Monique Goyens, director-general of the European consumers' organisation BEUC, reacting to Commissioner De Gucht's comments on the threat of a trade war, stressed that "legislators should not use commercial arguments to endanger the choice of EU consumers. They did not give in to big business on hormone-treated beef EU imports and we need to see the same determination repeated".

"An overwhelming majority of EU consumers do not want cloning to be used for food production purposes: 84% are concerned about the long-term health and safety effects, and yet the Commission persists in ignoring the very people they are supposed to represent," BEUC added.

Belgian Green MEP Bart Staes stressed that "it is crucial that the descendants of clones are also covered, as it is ultimately these descendants, and not the original clones, that will be used for food production. The Council's claims that there are no animal welfare concerns resulting to the offspring of clones is ludicrous and flies in the face of the evidence".

UK MEP Linda McAvan (Socialists & Democrats) said that "we need to have a public debate on cloning. There are many ethical, animal welfare and consumer issues which need a public airing".

On behalf of the Hungarian Presidency, Sándor Fazekas, Hungarian minister for rural development, said that "we arrived with a clear mandate from the Council providing the maximum possible protection of consumers with a system that is practically and legally feasible".

"The position of the European Parliament would require drawing a family tree for each slice of cheese or salami, which is practically impossible, would mislead consumers and create horrendous extra costs for farmers," the Council stressed in a statement.

Next Steps

28 March 2011: Last-chance conciliation meeting scheduled.
30 March 2011: Deadline for reaching agreement in conciliation before review fails.

Sunday, March 20, 2011

Food for every child

In a growing number of American cities, food markets are few and far between. Stores and boutiques filled with things we don't really need proliferate. This is yet another example of what happens when greed-inspired opportunities to turn-a-quick-buck-and-run overrule our sense of community. (GW)

Few markets, poor nutrition

Boston Globe Editorial
March 20, 2011

SUPERMARKET CHAINS are quick to locate on busy roadways that pass through well-to-do suburbs. But in many urban areas — especially the state’s gateway cities — political leaders need to clamor harder for big food stores.

As a recent study underscored, many Massachusetts residents have little access to supermarkets with fresh, healthy food — a finding at odds with the Commonwealth’s image as one of the more health-conscious states. According to the report, prepared for the Massachusetts Public Health Association, the situation is particularly dire in the gateway cities, the mid-sized former mill towns that tend to lag behind the rest of the state economically.

Lawrence is a telling example: a city of 7 square miles and 70,000 residents, it has just two grocery stores. Lowell and Fitchburg, according to the report, have half as many supermarkets per capita as the national average. Over the past 10 years, the head of the Massachusetts Food Association said, existing supermarkets have closed down in Springfield, New Bedford, and Fall River.

In Boston, the story is somewhat different. While parts of Roxbury, Mattapan, Dorchester, and East Boston have fewer food stores than their population would support, on the whole things are improving in the city: more than a dozen new supermarkets have opened in the last decade. This didn’t just happen; Mayor Tom Menino and his predecessor, Ray Flynn, both campaigned hard to convince retail executives that urban neighborhoods can be good locations for supermarkets.

The gateway cities deserve special attention from state policymakers, because their residents don’t enjoy the same political pull and access to policy makers as the inhabitants of many Boston neighborhoods. But civic leaders need to step up, too, and make supermarkets as high a priority as Menino and Flynn did.