Wednesday, June 27, 2007

Substance In Tree Bark Could Lead To New Lung Cancer Treatment

Researchers at UT Southwestern Medical Center have determined how a substance derived from the bark of the South American lapacho tree kills certain kinds of cancer cells, findings that also suggest a novel treatment for the most common type of lung cancer.

The compound, called beta-lapachone, has shown promising anti-cancer properties and is currently being used in a clinical trial to examine its effectiveness against pancreatic cancer in humans. Until now, however, researchers didn’t know the mechanism by which the compound kills cancer cells.

Dr. David Boothman, a professor in the Harold C. Simmons Comprehensive Cancer Center and senior author of a study appearing online this week in the Proceedings of the National Academy of Sciences, has been researching the compound and how it causes cell death in cancerous cells for 15 years.

In the new study, Dr. Boothman and his colleagues in the Simmons Cancer Center found that beta-lapachone interacts with an enzyme called NQO1, which is present at high levels in non-small cell lung cancer and other solid tumors. In tumors, the compound is metabolized by NQO1 and produces cell death without damaging noncancerous tissues that do not express this enzyme.

“Basically, we have worked out the mechanism of action of beta-lapachone and devised a way of using that drug for individualized therapy,” said Dr. Boothman, who is also a professor of pharmacology and radiation oncology.

In healthy cells, NQO1 is either not present or is expressed at low levels. In contrast, certain cancer cells - like non-small cell lung cancer - overexpress the enzyme. Dr. Boothman and his colleagues have determined that when beta-lapachone interacts with NQO1, the cell kills itself. Non-small cell lung cancer is the most common type of lung cancer.

Beta-lapachone also disrupts the cancer cell’s ability to repair its DNA, ultimately leading to the cell’s demise. Applying radiation to tumor cells causes DNA damage, which results in a further boost in the amount of NQO1 in the cells.

“When you irradiate a tumor, the levels of NQO1 go up,” Dr. Boothman said. “When you then treat these cells with beta-lapachone, you get synergy between the enzyme and this agent and you get a whopping kill.”

In the current study, Dr. Boothman tested dosing methods on human tumor cells using a synthesized version of beta-lapachone and found that a high dose of the compound given for only two to four hours caused all the NQO1-containing cancer cells to die.

Understanding how beta-lapachone works to selectively kill chemotherapy-resistant tumor cells creates a new paradigm for the care of patients with non-small cell lung cancer, the researchers said. They are hoping that by using a drug like beta-lapachone, they can selectively target cancer tumors and kill them more efficiently. The current therapy for non-small cell lung cancer calls for the use of platinum-based drugs in combination with radiation.

“Future therapies based on beta-lapachone and NQO1 interaction have the potential to play a major role in treating devastating drug-resistant cancers such as non-small cell lung cancer,” said Dr. Erik Bey, lead author of the study and a postdoctoral researcher in the Simmons Cancer Center. “This is the first step in developing chemotherapeutic agents that exploit the proteins needed for a number of cellular processes, such as DNA repair and programmed cell death.”

About 85 percent of patients with non-small cell lung cancer have cancer cells containing elevated levels of the NQO1 enzyme, which is encoded by a single gene. Patients who carry a different version of that gene would likely not benefit from treatment targeting NQO1, Dr. Boothman said.

Dr. Boothman cautioned that clinical trials of beta-lapachone in lung cancer patients will be needed to determine its effectiveness as a treatment. He and his team have created a simple blood test that would screen patients for the NQO1 enzyme.

Working with Dr. Jinming Gao’s laboratory in the Simmons Cancer Center and in collaboration with the bioengineering program at UT Dallas, researchers in the new “Cell Stress and Cancer Nanomedicine” initiative within the Simmons Cancer Center have developed novel nanoparticle methods for tumor-targeted delivery of the compound. These delivery methods promise to further improve the drug’s usefulness against non-small cell lung cancer.

Other Simmons Cancer Center researchers involved in the study were Dr. Ying Dong, postdoctoral researcher; Dr. Chin-Rang Yang, assistant professor; and Dr. Gao, associate professor. UT Southwestern’s Dr. John Minna, director of the Nancy B. and Jake L. Hamon Center for Therapeutic Oncology Research and the W.A. “Tex” and Deborah Moncrief Jr. Center for Cancer Genetics, and Dr. Luc Girard, assistant professor of pharmacology, also participated along with researchers from Case Western Reserve University and UT M.D. Anderson Cancer Center.

Great shopping deals on DealCritic

By Brandon Watts

All of us love to get a good deal, and this is why so many of us constantly scan sales circulars from stores and cut out coupons from the newspaper. After all, why pay more when you don’t have to? With so many stores offering to match and discount the prices offered by their competitors, it only makes sense to be on the lookout for ways to save money. DealCritic is a site that will not only help you to find deals, but it’ll let you vote on them, too.

The deals are compiled from sources throughout the Web and listed for your enjoyment. In addition, all registered users can post any deals that they find while browsing the Internet. The voting functionality is what really helps the site stand out, because the good stuff rises to the top of the list as it accumulates supportive votes. Comments can also be left on individual deals so that you can see what experiences people have had with the product or the deal itself, and the related deals listed alongside give you more options to explore. DealCritic feels kind of vacant at this point in time, but hopefully that will change, because the site could eventually be built into something great.

Millennium Development Goals: Are We On Track?

In April 2007, the United Nations Economic and Social Council (ECOSOC) convened to discuss progress made towards the Millennium Development Goals (MDGs). Patrick Webb, PhD, dean for academic affairs at the Friedman School of Nutrition Science and Policy at Tufts University, presented on the status of Millennium Development Goal One (MDG1): radically reducing extreme poverty and hunger.

The MDGs were developed at the United Nations Millennium Summit in 2000 to set measurable goals and targets for a range of pressing global problems, including poverty, hunger, disease, illiteracy, environmental degradation and discrimination against women. MDG1 has a twofold objective: to halve the proportion of people living on less than a dollar a day, and to halve the proportion of people suffering from hunger. To show how these goals can be met, Webb highlighted recent successes that can guide future efforts.

There are three priority issues to be tackled in addressing the second objective (on hunger): removing the “invisibility” of hunger by better measuring and highlighting areas making limited progress, promoting innovations in programming, and mainstreaming lessons from successful action in humanitarian settings (saving lives and reducing acute malnutrition) into development interventions. Development efforts can address these issues by improving protocols and products for treating malnutrition, and by preventing malnutrition through behaviors and choices that lead to improved nutrition throughout the lifecycle.

Poverty and hunger are multidimensional problems requiring multidimensional solutions. To address the multidimensional nature of these problems, the first MDG addresses five distinct goals: reducing poverty, narrowing the poverty gap (between richest and poorest), increasing the share of income enjoyed by the poorest families, reducing the share of preschool children who are underweight, and lowering the share of each country’s population that has too little to eat. “Progress in one target area does not guarantee progress in the other areas,” says Webb. “Focusing on just one aspect of the problem, or just one of the targets, misses the point — and risks compromising the success of the entire agenda.”

Speaking on behalf of the three Rome-based agencies of the United Nations (the Food and Agriculture Organization (FAO), the World Food Programme (WFP), and the International Fund for Agricultural Development (IFAD)), Webb argued that governments should engage more seriously in measuring their own progress towards agreed goals. Solutions to hunger are known, and novel approaches to the treatment and prevention of malnutrition in emergencies have shown that quick gains are possible. Making such gains sustainable means scaling up innovations across developing country settings, and targeting resources towards such key priorities.

This meeting of Member States, UN System Organizations, other major institutional stakeholders, as well as non-governmental organizations, academics and foundations concentrated on the theme “Eradicating poverty and hunger… joining forces to make it happen.” The conference explored the issues and set the groundwork for the ECOSOC’s first Annual Ministerial Review (AMR) this summer in Switzerland. Webb concluded by stating that “…in order to successfully impact malnutrition it takes complementary inputs delivered through complementary partnerships. It is not only about food, and it is not only about cash in the pocket. Many different elements are needed to achieve the five goals, and many institutions have to bring their resources to bear. In order to tackle the problem it is not possible to focus on just agriculture or just health; in order to be successful the many facets of the problem must all be addressed.”

Webb will continue to work with the many stakeholders involved in assessing progress and further advancing the important human goals that are the MDGs.

How Do Americans Want To Reduce Greenhouse Gas Emissions?

Most Americans now believe that global warming is happening, and they want the federal government to take action to limit its effects. But what form should that action take, and does support for action hold firm if people understand how much it may cost them financially?

To find out, New Scientist, Stanford University and Resources for the Future, an independent think tank, commissioned the survey research firm Knowledge Networks to query a representative sample of American adults.

We investigated three main ways of reducing greenhouse pollution.

  1. Standards or Mandates: The government tells companies exactly how they must generate electricity or manufacture vehicle fuel to achieve a cut in emissions.
  2. Emissions Tax: The government taxes companies for their greenhouse gas emissions.
  3. Cap-and-Trade: The government imposes a cap on companies’ greenhouse gas emissions, but allows companies to trade permits - which represent the right to emit a certain amount of pollution.

The aim of our poll was to test the relative attractiveness of these three options. We told 1,491 adults how each option could work in each of two sectors: vehicle fuel and electricity. We chose these sectors because they are each responsible for a substantial proportion of U.S. greenhouse emissions, and because any costs of making cuts will likely be passed on to consumers. That gave a total of six possible policies, each of which we told respondents would reduce total projected U.S. greenhouse emissions in 2020 by five percent.

Highlights from the results:

  • Specific policies to combat global warming can command majority public support in the U.S., as long as they don’t hit people’s pockets too hard.
  • Given the probable costs, the U.S. public has a clear preference for action in the electricity sector rather than vehicle fuel. At the lower end of the costs we quoted, all the electricity policies won majority support. In contrast, none of the vehicle fuel policies gained majority backing, even at the lowest costs quoted.
  • Americans prefer standards, in which companies are told exactly what to do to curb emissions, over the other policies we investigated. A low-carbon standard for electricity generation was backed by 73 percent of respondents who were told it would cause a typical monthly bill to rise by $10. By contrast, a cap-and-trade scheme for power companies was backed by only 47 percent at this price point. This gives pause for thought, as cap-and-trade schemes feature in some bills currently being considered by the U.S. Congress.
  • Residents of the western U.S. are more likely to favour policies to limit global warming than those in other regions of the country. Parents and people with higher incomes are also more likely to support action.

“For politicians who want to find voter-friendly ways to fight global warming, our poll provides some comfort,” says Peter Aldhous, New Scientist’s San Francisco Bureau Chief. But there are also significant challenges ahead, not least that the ‘standards’ preferred by the public are predicted by environmental economists to be more expensive than other policies that command less public support.

“Our findings suggest that Americans are open to policies they think will work and that are affordable. Policy-makers who want to avoid public resistance to their proposals will find useful guidance in our numbers,” says Stanford Professor Jon Krosnick, who jointly designed the survey.

The poll results provide a springboard for the debate about how best to tackle global warming. Policies that hit the U.S. public’s wallets hard will be a tough sell, and those that may prove cheapest seem inherently unpopular. “If we are to turn from the path to climate chaos, it seems that environmentalists, economists and politicians have some explaining to do,” says Aldhous.

Saturday, June 23, 2007

Is local the new organic?

Last week, The New York Times ran a feature by Marian Burros on New Seasons Markets, a grocery store chain in Portland that's banking on consumer interest in local, sustainable food -- as opposed to simply organic.

The chain recently completed an inventory of the origins of its stock and has labeled everything grown in Oregon, Washington, and Northern California "Homegrown." They've already got six stores and three more on the way, but remain adamantly opposed to expanding beyond the Portland suburbs -- a testament to their commitment to being grounded in the local food economy.

People concerned about health, taste, and the environment have long sought out organic products. Once a cutting-edge concept for gourmets and health-food junkies, organic is now mainstream, with many familiar major food brands launching organic product lines. I bought organic milk at a Seattle Safeway the other day that was packaged under Safeway's own new "O" label. Organics are the fastest growing segment of the food industry, with sales increasing by some 20 percent per year.

But, as the NYT piece notes, organic alone is not the answer to the question of the fundamental role food plays in our local economy, environment, food security, community vitality, or even health and enjoyment. I don't know where that organic milk I bought from Safeway came from. I like the idea of sticking with my delivery from Smith Brothers dairy each week. Even though it's not organic, there's no growth hormone used and I am supporting the last of the independent dairy farms in my state, Washington.

We won't be seeing New Seasons outside the Portland area any time soon -- but other areas are making progress on the local food front.

In the Seattle area, for example, cutting-edge projects are exploring food as a driver in the local economy and as a focal point for public policies ranging from health and nutrition to urban planning and even transportation.

Sustainable Seattle is launching a first-of-its-kind research project looking at how dollars spent on locally produced food affect the local economy, as a counterpoint to the dollar spent on the average grocery item that has traveled 1500 miles to reach the consumer.

Washington State University's King County Extension office is leading an effort to establish a food policy council for Seattle and King County that would bring together a broad spectrum of food system participants -- from farmers to hunger activists to grocery executives to land use experts -- to work jointly on solutions to current challenges like childhood obesity, disappearing farmland, and alarmingly high levels of hunger in our community. Leaders of that effort talked about how a food-policy council could be a source of innovative, community-based solutions in an oped in the Seattle P-I in December.

The local-food movement may even help us close the divide between rural and urban, red and blue. From the NYT piece:

Doc and Connie Hatfield, who founded the Country Natural Beef cooperative in 1986, said the co-op now has 70 ranchers, who raise beef on a vegetarian diet free of hormones, antibiotics and genetically modified feed. "Most of the ranchers are rural, religious, conservative Republicans," Mr. Hatfield said. "And most of the customers are urban, secular, liberal Democrats. When it comes to healthy land, healthy food, healthy people and healthy diets, those tags mean nothing. Urbanites are just as concerned about open spaces and healthy rural communities as people who live there. When ranchers get to the city, they realize rural areas don't have a corner on values. I think that's what we are most excited about."

I have always believed in the power of coming to the table together to hash out issues, find common ground, and be reminded of one another's humanity, but I have most often thought about it in the very personal context of family and friends. In these times of bitter division, can coming to the table in celebration of delicious local-grown bounty help remind us of our many shared values and experiences?


Local is the “New” Organic

The theme behind this interesting "think" piece isn't the organic movement.

Rather, it encourages folks to find local sources for their foods, which some say is the safest and most affordable way to eat "happier" meals; the practice has led some to call themselves localtarians and locavores.

When major retailers like Wal-Mart sell organic food, it still relies on an industrial model of farming, and the long-distance shipping remains the same.

Growing, chilling, washing, packaging and transporting a box of organic salad from California to the East Coast takes 57 calories of fossil fuel for every calorie of food.
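
That ratio invites a quick back-of-the-envelope check. In the sketch below, only the 57:1 ratio comes from the article; the serving size is an illustrative assumption.

```python
# Rough check of the 57:1 fossil-fuel-to-food-calorie ratio quoted above.
# The 50 kcal serving size is an assumed round number, not from the article.
FOSSIL_KCAL_PER_FOOD_KCAL = 57

salad_kcal = 50  # assumed energy content of one boxed salad serving
fossil_kcal = salad_kcal * FOSSIL_KCAL_PER_FOOD_KCAL
print(f"A {salad_kcal} kcal salad implies roughly {fossil_kcal:,} kcal of fossil fuel")
# -> A 50 kcal salad implies roughly 2,850 kcal of fossil fuel
```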

If eating locally captures national attention, the movement could reinvent the model of industrial farming in a way that organic food never could. It could eventually lead to more money for local economies, more fresh produce in the diet, and a greater appreciation for the natural cycles of the Earth.

COA News April 10, 2007

Check this for the rest of the article

Reduce Global Warming? Let's Start With Cows

Dennis Avery

Virtually all of the Kyoto Protocol's member countries have increased their CO2 emissions since signing the treaty. The political and economic costs of reducing CO2 from cars and factories have proven very high. So they just haven't happened.

Why don't eco-activists support a major cut in methane emissions from cows instead? Ton for ton, methane has 21 times as much global warming potential as CO2, and we already have cost-effective ways that farmers can cut livestock-emitted methane.

The cows and pigs won't care.

A new study in Canada found about 20 different ways to reduce methane and nitrous oxide emissions from livestock—each of them capable of cutting these emissions by one-third. North America has more than 100 million cattle, hundreds of millions of hogs and feeder pigs, and more than 2 billion chickens, together emitting billions of tons of CO2-equivalent greenhouse gases every year.
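
The conversion arithmetic behind such totals is simple, as the sketch below shows. The warming-potential factor of 21 and the cattle count come from this article; the per-animal methane figure is an assumed round number for illustration.

```python
# Illustrative CO2-equivalent arithmetic for cattle methane.
GWP_METHANE = 21        # warming potential relative to CO2, ton for ton (from the article)
CATTLE = 100_000_000    # North American cattle (from the article)
CH4_PER_COW_TONS = 0.1  # ~100 kg methane per cow per year (assumed round number)

ch4_tons = CATTLE * CH4_PER_COW_TONS
co2e_tons = ch4_tons * GWP_METHANE
print(f"~{co2e_tons / 1e6:.0f} million tons CO2e per year from cattle methane alone")
print(f"A one-third cut would save ~{co2e_tons / 3 / 1e6:.0f} million tons CO2e per year")
# -> ~210 million tons CO2e; a one-third cut saves ~70 million tons
```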

The Canadian authors, Karin Wittenberg and Dinah Boadi of the University of Manitoba, say that such methane reduction strategies should be a top priority in any greenhouse gas reduction effort.

Simply grinding and pelleting the feeds for confinement animals reduced methane by 20 to 40 percent, because it makes the feed more fully digestible.

Steers grazing on high-quality alfalfa-grass pastures emit 50 percent less methane than steers grazing on mature grass-only pastures. Rotational grazing—changing where the cows graze every few days—also cuts methane emissions. It would cost only a few dollars per acre to encourage farmers to graze rotationally, replant their pastures more often, and use higher-quality forages, because the better pastures also produce more meat and milk for the farmer's profit.

Methane emissions in feedlot cattle were reduced by one-third when 4 percent canola oil was added to cattle feedlot rations. The canola oil costs only slightly more than comparable grain calories.

Genetically engineered bovine growth hormone reduces methane emissions by 10 percent in dairy cattle. The growth hormone hasn't even been legalized in Canada, thanks largely to opposition from activist groups such as Greenpeace!

Keeping young pigs and poultry separated by age groups, and phasing their feeds by growth stages can cut greenhouse emissions by 50 percent and sharply reduce bad smells too. Again, farmers would need only modest encouragement to use the system because it also increases feed efficiency.

The activists who warn of an overheated planet say we should focus on "no-regrets" energy strategies, and make the easiest, most cost-effective changes first. They ignore, however, such major sources of cost-effective greenhouse gas reductions as nuclear power—and encouraging livestock farmers to cut their birds' and animals' greenhouse emissions.

Instead, the Greens demanded ridiculous battery-powered electric cars. They demand we spend billions to clutter the landscape with tens of thousands of huge, unsightly wind turbines that kill birds and bats. They say we can't approve nuclear plants until we demonstrate safe storage of spent fuel—for 10,000 years! They're even suing to stop geothermal power plants that have no emissions, no radioactive waste and virtually no footprint on the land.

Let's make a deal with the global warming crowd. We'll reconsider CO2 emissions from cars and factories—after they shepherd through Congress a reasonable and effective cost-sharing plan to reduce a billion CO2 tons worth of methane and nitrous oxide from livestock. That will show they're serious about global warming.

Until now, they've seemed more interested in reducing American lifestyles to poverty levels than in actually reducing greenhouse gas emissions.

Tuesday, June 19, 2007

Online Shoppers Will Pay Extra To Protect Privacy

People are willing to pay extra to buy items from online retailers when they can easily ascertain how retailers’ policies will protect their privacy, a new Carnegie Mellon University study shows. Participants in the laboratory study used a Carnegie Mellon shopping search engine called Privacy Finder, which can automatically evaluate a Web site’s privacy policies and display the results on the search results page. The study, led by Lorrie Cranor, director of the Carnegie Mellon Usable Privacy and Security (CUPS) Lab, found that people were more likely to buy from online merchants with good privacy policies, which were identified by Privacy Finder. They were also willing to pay about 60 cents extra on a $15 purchase when buying from a site with a privacy policy they liked.

Findings from the study, the first to suggest that people will pay a premium to protect their privacy when shopping online, will be presented Friday, June 8, at the Workshop on the Economics of Information Security, an international meeting hosted by Carnegie Mellon that begins Thursday, June 7. In addition to Cranor, the research team included Alessandro Acquisti, assistant professor of information technology and public policy, and graduate students Janice Tsai and Serge Egelman.

Many people express concerns that unscrupulous online merchants might misuse credit information, target spam to their email addresses or otherwise violate their privacy. But a number of previous studies have found that many people still fail to act to protect their privacy online. Some have shown that people willingly give up private information in return for lower prices or even the mere chance of a monetary reward.

“Our suspicion was that people care about their privacy, but that it’s often difficult for them to get information about a Web site’s privacy policies,” said Cranor, an associate research professor of computer science and of engineering and public policy. A Web site’s policies may not be readily accessible, can be hard to interpret and sometimes are nonexistent, Cranor said. “People can’t act on information that they don’t have or can’t understand,” she added.

Privacy Finder is a search engine developed by Cranor and her students to address this issue. The engine makes use of the Platform for Privacy Preferences (P3P), a technical standard for creating machine-readable privacy policies. About 10 percent of Web sites overall and more than 20 percent of e-commerce sites now employ P3P, Cranor said, and of the top 100 most-visited Web sites, about a third use P3P. The search engine can automatically read and evaluate the policies of Web sites that employ P3P, and it displays this information as a series of colored squares that indicate to the user whether the site’s privacy policy complies with his or her privacy preferences.
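
Conceptually, the matching step resembles comparing two checklists: the site's declared practices against the user's stated preferences. The sketch below illustrates that idea in miniature; the field names and scoring rule are invented for illustration and are not the actual P3P vocabulary or Privacy Finder's code.

```python
# Toy illustration of preference matching, loosely in the spirit of Privacy Finder.
# Practice names and the rating rule are invented, not the real P3P vocabulary.

USER_PREFS = {
    "shares_data_with_third_parties": False,  # user does not allow this practice
    "uses_email_for_marketing": False,
}

def rate_policy(policy: dict) -> str:
    """Return a traffic-light rating for a site's declared practices."""
    conflicts = sum(
        1
        for practice, allowed in USER_PREFS.items()
        # An undeclared practice defaults to True: assume the worst if unstated.
        if policy.get(practice, True) and not allowed
    )
    return ["green", "yellow", "red"][min(conflicts, 2)]

site_policy = {"shares_data_with_third_parties": False, "uses_email_for_marketing": True}
print(rate_policy(site_policy))  # -> yellow (one practice conflicts with the preferences)
```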

In the new study, Cranor and her colleagues recruited 72 people to make online purchases. Some used Privacy Finder while others did not. They were given $45 and asked to buy two items — a package of batteries and a vibrating sex toy — each of which cost about $15. Participants were allowed to keep the items and any surplus money, so they had a financial incentive to buy from the cheapest online retailers. Those who used Privacy Finder made purchases from sites with “high privacy” ratings for 50 percent of the battery purchases and 33 percent of the sex toy purchases.

Cranor said they had expected people to be more sensitive about privacy in purchases of the sex toy, but the findings proved inconclusive on that point. Additional research is necessary to resolve that issue and to better determine how much of a premium purchasers are willing to pay. This study focused on whether people would pay a premium, not on how much they would pay.

Calorie Density Key To Losing Weight

Eating smart, not eating less, may be the key to losing weight. A year-long clinical trial by Penn State researchers shows that diets focusing on foods that are low in calorie density can promote healthy weight loss while helping people to control hunger.

Foods that are high in water and low in fat - such as fruits, vegetables, soup, lean meat, and low-fat dairy products - are low in calorie density and provide few calories per bite.
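
Calorie density is simply energy per unit of weight, usually expressed as calories per gram. The sketch below makes the comparison concrete; the nutrition figures and the 1.5 kcal/g cutoff are rough illustrative values, not numbers from the Penn State study.

```python
# Calorie density = calories / weight. All figures below are rough
# illustrative values, not data from the study described here.
foods_kcal_per_100g = {
    "broccoli": 34,
    "vegetable soup": 40,
    "low-fat yogurt": 60,
    "lean chicken breast": 165,
    "crackers": 500,
}

for food, kcal in sorted(foods_kcal_per_100g.items(), key=lambda kv: kv[1]):
    density = kcal / 100  # kcal per gram
    label = "low" if density < 1.5 else "high"  # illustrative cutoff
    print(f"{food:20s} {density:.2f} kcal/g  ({label} calorie density)")
```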

“Eating a diet that is low in calorie density allows people to eat satisfying portions of food, and this may decrease feelings of hunger and deprivation while reducing calories,” said Dr. Julia A. Ello-Martin, who conducted the study as part of her doctoral dissertation in the College of Health and Human Development at Penn State. Previously, little was known about the influence of diets low in calorie density on body weight.

“Such diets are known to reduce the intake of calories in the short term, but their role in promoting weight loss over the long term was not clear,” said Dr. Barbara J. Rolls, who directed the study and who holds the Helen A. Guthrie Chair of Nutritional Sciences at Penn State.

“We have now shown that choosing foods that are low in calorie density helps in losing weight, without the restrictive messages of other weight loss diets,” explained Ello-Martin, whose findings appear in the June 2007 issue of the American Journal of Clinical Nutrition.

The researchers compared the effects of two diets - one reduced in fat, the other high in water-rich foods as well as reduced in fat - in 71 obese women aged 22 to 60. The participants were taught by dietitians to make appropriate food choices for a diet low in calorie density, but unlike most diets, they were not assigned daily limits for calories.

At the end of one year, women in both groups showed significant weight loss as well as a decrease in the calorie density of their diets. However, women who added water-rich foods to their diets lost more weight during the first six months of the study than those who only reduced fat in their diets - 19.6 pounds compared to 14.7 pounds. Weight loss was well maintained by subjects in both groups during the second six months of the study.

Records kept by the women showed that those who included more water-rich foods ate 25 percent more food by weight and felt less hungry than those who followed the reduced-fat diet. “By eating more fruits and vegetables they were able to eat more food, and this probably helped them to stick to their diet and lose more weight,” said Ello-Martin.

Renewable Electricity Standards Toolkit

In a growing number of states, renewable electricity standards have emerged as an effective and popular tool for developing a cleaner, more sustainable power supply. UCS created this toolkit to provide renewable energy advocates, policy makers, researchers, and concerned citizens with both summary-level and in-depth information on the design and implementation of each existing state standard.

Friday, June 8, 2007

Big increase in hurricanes is not caused by global heating, say scientists

Lewis Smith, Environment Reporter

Hurricanes in the Atlantic are increasing because of natural weather patterns rather than global warming, a study has concluded.

Growing numbers of hurricanes battering the United States and the Caribbean have made their presence felt in the past decade and are forecast to worsen. Global warming has been cited as a possible cause but researchers looking at sediment and coral deposits have now identified natural variations in their frequency.

Hurricane Katrina, which devastated New Orleans in 2005, was “unexceptional” when historic patterns of such stormy weather are analysed, they suggested.

Global warming may even have been responsible for unusually low levels of hurricanes in the 25 years before 1995 when the number began rising, according to the scientists, led by the Geological Survey of Sweden.

Using deposits trapped in sediment to indicate when hurricanes had taken place, the researchers built up a record detailing their number and frequency going back 270 years.

They found that the decline in hurricanes during the 1970s and 1980s was matched by similar declines in the past, indicating natural variations in the weather patterns.

“The record indicates that the average frequency of major hurricanes decreased gradually from the 1760s until the early 1960s, reaching anomalously low values during the 1970s and 1980s,” they reported in the journal Nature. “Furthermore, the phase of enhanced hurricane activity since 1995 is not unusual compared to other periods of high hurricane activity and appears to represent a recovery to normal hurricane activity.”

The findings are at variance with the conclusions in February of the Intergovernmental Panel on Climate Change, the United Nations organisation addressing global warming. The UN panel stopped short of blaming increased frequency of hurricanes on man-made temperature rises, but said it was “more likely than not” that greenhouse gas emissions had contributed to the greater intensity of cyclonic storms.

The Swedish-led research team suggested that hurricane levels were normal, though they accepted “a future possibility” of higher sea temperatures contributing to more intense hurricanes. The researchers were unable to identify any direct link between increased hurricanes and rising sea-surface temperatures, beyond the requirement for a minimum temperature of 27C (81F) to be reached before a hurricane can develop.

From 1730 to 2005 there were on average 3 to 3.5 major hurricanes each year.

The researchers from Sweden, the US and Puerto Rico said that being able to calculate vertical wind shear – the differences in wind speeds at different heights – was crucial in determining the frequency of hurricanes.

Higher wind-shear levels disrupt developing hurricanes; low wind-shear levels fail to batter the storms sufficiently to prevent them developing.

The researchers suggested that higher air temperatures caused by global warming may have led to stronger vertical wind shear, which destroyed developing hurricanes in the Atlantic before 1995, explaining the dearth.

Sediment deposits, accumulated from increased rainfall run-off, and plankton remains, associated with increased nutrient levels, provided the scientists with clues to historic vertical wind shear and hurricane activity.

They were able to check their readings of the data by comparing their findings with documentation of hurricanes.

In the wind

–– From 1995 to 2005 there were an average of 4.1 major hurricanes (categories 3-5) in the Atlantic compared with an average of 1.5 from 1971-94

–– Five periods in the past 270 years were found to have had the same lack of hurricanes, combined with high wind shear, as 1971-94: 1730-36, 1793-99, 1827-30, 1852-66 and 1915-26

–– Six periods were identified as having the same high levels witnessed since 1995: 1756-74, 1780-85, 1801-12, 1840-50, 1873-90 and 1928-33

Saturday, June 2, 2007

Research Finds That Earth’s Climate Is Approaching ‘Dangerous’ Point

NASA and Columbia University Earth Institute research finds that human-made greenhouse gases have brought the Earth’s climate close to critical tipping points, with potentially dangerous consequences for the planet.

From a combination of climate models, satellite data, and paleoclimate records the scientists conclude that the West Antarctic ice sheet, Arctic ice cover, and regions providing fresh water sources and species habitat are under threat from continued global warming. The research appears in the current issue of Atmospheric Chemistry and Physics.

Tipping points can occur during climate change when the climate reaches a state such that strong amplifying feedbacks are activated by only moderate additional warming. This study finds that global warming of 0.6°C in the past 30 years has been driven mainly by increasing greenhouse gases, and only moderate additional climate forcing is likely to set in motion disintegration of the West Antarctic ice sheet and Arctic sea ice. Amplifying feedbacks include increased absorption of sunlight as melting exposes darker surfaces and speedup of iceberg discharge as the warming ocean melts ice shelves that otherwise inhibit ice flow.

The researchers used data on earlier warm periods in Earth’s history to estimate climate impacts as a function of global temperature, climate models to simulate global warming, and satellite data to verify ongoing changes. Lead author James Hansen, NASA Goddard Institute for Space Studies, New York, concludes: “If global emissions of carbon dioxide continue to rise at the rate of the past decade, this research shows that there will be disastrous effects, including increasingly rapid sea level rise, increased frequency of droughts and floods, and increased stress on wildlife and plants due to rapidly shifting climate zones.”

The researchers also investigate what would be needed to avert large climate change, thus helping define practical implications of the United Nations Framework Convention on Climate Change. That treaty, signed in 1992 by the United States and almost all nations of the world, has the goal to stabilize atmospheric greenhouse gases “at a level that prevents dangerous human-made interference with the climate system.”

Based on climate model studies and the history of the Earth, the authors conclude that additional global warming of about 1°C (1.8°F) or more, above global temperature in 2000, is likely to be dangerous. In turn, the temperature limit has implications for atmospheric carbon dioxide (CO2), which has already increased from the pre-industrial level of 280 parts per million (ppm) to 383 ppm today and is rising by about 2 ppm per year. According to study co-author Makiko Sato of Columbia’s Earth Institute, “the temperature limit implies that CO2 exceeding 450 ppm is almost surely dangerous, and the ceiling may be even lower.”
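
Those numbers imply a rough timetable, sketched below under the straight-line assumption that the rise stays at 2 ppm per year; in reality the growth rate itself could change.

```python
# Straight-line extrapolation from the figures quoted above.
current_ppm = 383      # atmospheric CO2 today (from the article)
ceiling_ppm = 450      # level called "almost surely dangerous" above
rate_ppm_per_year = 2  # current growth rate (from the article)

years_to_ceiling = (ceiling_ppm - current_ppm) / rate_ppm_per_year
print(f"~{years_to_ceiling:.0f} years until 450 ppm at a constant 2 ppm/year")
# -> ~34 years, i.e. around 2040 on this simple extrapolation
```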

The study also shows that the reduction of non-carbon dioxide forcings such as methane and black soot can offset some CO2 increase, but only to a limited extent. Hansen notes that “we probably need a full court press on both CO2 emission rates and non-CO2 forcings, to avoid tipping points and save Arctic sea ice and the West Antarctic ice sheet.”

A computer model developed by the Goddard Institute was used to simulate climate from 1880 through today. The model included a more comprehensive set of natural and human-made climate forcings than previous studies, including changes in solar radiation, volcanic particles, human-made greenhouse gases, fine particles such as soot, the effect of the particles on clouds, and land use. Extensive evaluation of the model’s ability to simulate climate change is contained in a companion paper to be published in Climate Dynamics.

The authors use the model for climate simulations of the 21st century using both ‘business-as-usual’ growth of greenhouse gas emissions and an ‘alternative scenario’ in which emissions decrease slowly in the next few decades and then rapidly, to achieve stabilization of atmospheric CO2 amount by the end of the century. Climate changes are so large under ‘business-as-usual’, with additional global warming of 2-3°C (3.6-5.4°F), that Hansen concludes “‘business-as-usual’ would be a guarantee of global and regional disasters.”

However, the study finds much less severe climate change - one-quarter to one-third that of the “business-as-usual” scenario - when greenhouse gas emissions follow the alternative scenario. “Climate effects may still be substantial in the ‘alternative scenario’, but there is a better chance to adapt to the changes and find other ways to further reduce the climate change,” said Sato.

While the researchers say it is still possible to achieve the “alternative scenario,” they note that significant actions will be required to do so. Emissions must begin to slow soon. “With another decade of ‘business-as-usual’ it becomes impractical to achieve the ‘alternative scenario’ because of the energy infrastructure that would be in place,” says Hansen.