Tuesday, September 13, 2011

10 Incredible Wind Power Facts



In the rush to find fossil fuel's replacement as the next cheap and plentiful energy source for powering the human machine, wind gets a lot of attention. After all, it's certainly in no small supply (except when you need to get that kite airborne), and the idea of continuous, zero-pollution energy is too enticing to ignore. Oh, and there's the fact that mankind burns through about 400 quadrillion British Thermal Units (BTUs) annually, according to the U.S. Department of Energy. Four hundred quadrillion doesn't even sound like a real number, but consider that a single BTU is about as much energy as a lit match gives off, and that may help put it into perspective.


It's not like wind hasn't been earning its keep. For centuries, we've used it to mill grains, power ships and even to generate electricity, starting in the 1930s. But as energy demand climbs, so have efforts to turn wind into a viable option for producing electricity on a large scale. Wind turbines in particular are what people think of when discussing wind power. These turbines can measure more than 400 feet (122 meters) tall and weigh in at close to 400 tons.


We know the basics, but in this article we'll explore some of the unsung, and surprising, facts about wind power.

10: Wind Power Accounted for 1.9 Percent of U.S. Electricity Production in 2009

Wind has been outpacing other renewables in new electrical power generation for a few years, increasing more than 31 percent between 2008 and 2009 alone [source: U.S. Energy Information Administration].

Besides its enormous promise, a couple of factors have contributed to these gains. First, in 2009, wind generators were eligible for government incentives in the United States, encouraging developers to take the plunge. Second, Title IX of the 2008 Farm Bill made it easier and more attractive for farmers and ranchers to undertake wind projects.


Less than 2 percent doesn't sound like much, but when you consider the rate at which it's gaining popularity, wind has the inside track to becoming a much more viable alternative for large-scale energy production.

9: Wind Is One of the Oldest Forms of Energy

Wind power dates back to at least 5000 B.C., with the earliest known use for powering sails [source: U.S. Department of Energy]. This is perhaps a no-brainer, but early sailors were not just the first to figure out an easier way to get from Point A to Point B. They laid the groundwork for humankind's understanding of important concepts such as thermodynamics and lift [source: TelosNet]. These principles would be key for other innovations, beginning with the very first windmills, which were powered by sails. These devices were used as mills and water pumps, and paved the way for an agricultural revolution by automating otherwise time-consuming activities.


This technology was carried to the New World, where it played an important role in settling the wilderness and plains of early America. As new technologies emerged, the windmill lost ground to steam engines and to cheap electric power after the Rural Electrification Administration brought electricity to the rural U.S. in the 1930s [source: National Archives].


But wind is coming full circle, making a comeback as the rising price and dwindling accessibility of fossil fuels make them an increasingly prohibitive means of energy production.

8: One Megawatt of Wind Energy = 2,600 Fewer Tons of Carbon Dioxide

So, with all the noise about clean energy, what kind of improvement are we really talking about with wind? Consider that every year 1MW of wind energy can offset approximately 2,600 tons of carbon dioxide (CO2) [source: NREL], and the interest comes into focus. The simple math is less fossil fuel consumption equals less CO2. And measuring carbon reduction has become a key benchmark for monitoring the progress of alternative energy adoption.


In Massachusetts, for example, the average resident produced 4.5 tons of CO2 as a result of using electricity in 2004. Just 1MW of wind energy could power up to 400 homes without emitting any CO2. And besides reducing CO2 levels, wind power is dramatically easier on water supplies, with the same 1MW of wind energy saving about 1,293 million gallons of water.
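A quick back-of-envelope calculation ties these figures together. The sketch below uses only the article's numbers; the per-home result is derived here, not stated in the source:

```python
# Back-of-envelope check of the figures above (all inputs are the
# article's numbers, not independent measurements).
CO2_OFFSET_TONS_PER_MW_YEAR = 2600   # tons of CO2 offset per MW per year
HOMES_PER_MW = 400                   # homes powered per MW
WATER_SAVED_GALLONS = 1_293_000_000  # gallons of water saved per MW per year

co2_per_home = CO2_OFFSET_TONS_PER_MW_YEAR / HOMES_PER_MW
print(f"CO2 offset per home: {co2_per_home:.1f} tons/year")  # 6.5 tons/year
```

That works out to roughly 6.5 tons of CO2 per home per year, in line with the Massachusetts per-resident figure cited above.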

7: In 2007, the NAS Determined Wind Farms Pose No Threat to Birds

One of the chief concerns among wind opponents is the danger the installations pose to native wildlife. After all, these massive turbines spin at lethal speeds and the colossal structures take up large swaths of space that would otherwise be wilderness, or open flight paths for birds.


One highly publicized wind farm, Altamont Pass in California, has been a lightning rod of controversy because of the impact poor planning has had on the bird population. According to the Center for Biological Diversity, as many as 1,300 eagles, falcons, hawks and other predatory species are killed each year because the wind turbines were constructed along a critical migration route.


Research conducted at other wind farms, however, has shown that bird populations have not been significantly impacted, and the National Academy of Sciences has stated that bird fatalities from wind farms represent a fraction of the total number of bird deaths caused by humans.

6: Wind Power is Actually Solar Power

And what's behind this magical, unending supply of free and clean energy? The sun. The sun warms our planet, but because of surface irregularities and the Earth's rotation, it doesn't heat uniformly. These variances in temperature cause corresponding irregularities in air pressure, and air molecules migrate from areas of high pressure to areas of low pressure.
This results in wind, the intensity, duration and direction of which are influenced by a number of factors including weather, vegetation, surface water and topography [source: EIA].

All of these variables add to wind's unpredictability and contribute to the concern that it could never be consistent enough to meet all of our energy needs. Some of the most predictable winds occur offshore, which, of course, adds to construction costs.

5: World Wind Power Production Quadrupled from 2000 to 2006

With so much potential, companies are positioning themselves to take advantage. In fact, production surged between 2000 and 2006. And even later, in 2009, while world economies plunged, the wind industry thrived.

That year alone, the installed wind power capacity, or the amount of energy capable of being produced by existing equipment, increased to 158,000 megawatts (that 31 percent jump we discussed in Fact No. 10) [source: Roney]. World production is currently capable of serving the needs of 250 million people, and more than 70 countries have installations.

The United Nations recently issued a report that said making the jump from fossil fuels to renewable energy (not wind exclusively) would require more than $12 trillion over the next two decades [source: Morales].

This level of commitment will not come easily, especially while traditional resources remain relatively inexpensive. So, in order to continue the growth curve established between 2000 and 2006, it's going to take serious government incentives to encourage development.

4: Texas Has the Most Installed Wind Capacity of any State

Everything is bigger in Texas, including the wind. And the Lone Star State is leading the way in wind power with more than 40 different projects [source: Weber]. In 2008, the total capacity was 7,907MW, a significant margin over the next closest state, Iowa, which came in at 2,883MW. In fact, Texas wind installations account for one-third of the entire installed wind capacity for the United States [source: Roney].


Part of Texas' success is geography. The wide-open Texas Panhandle holds spectacular potential for harvesting wind energy; its featureless terrain and high elevation mean that wind can blow unencumbered across the plains. This, coupled with state legislation offering financial incentives to companies involved with wind projects, has positioned Texas at the forefront of the wind boom.

3: In 2008, U.S. Wind Turbines Generated Enough Energy to Power Colorado

The U.S. generated 52 billion kilowatt-hours (kWh) of wind energy in 2008, about 1 percent of total nationwide electricity production at the time. This may sound insignificant, but it was enough to power nearly 5 million homes -- or the entire state of Colorado [source: EIA].
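As a quick sanity check, dividing the generation figure by the number of homes should land near a typical U.S. household's annual electricity use (roughly 10,000 to 11,000 kWh). A sketch using only the article's numbers:

```python
# Sanity check of the article's figures (inputs are the article's
# numbers, not independent measurements).
wind_generation_kwh = 52e9   # U.S. wind generation in 2008, kWh
homes_powered = 5e6          # homes the article says this could power

kwh_per_home = wind_generation_kwh / homes_powered
print(f"{kwh_per_home:,.0f} kWh per home per year")  # 10,400 kWh per home per year
```

At about 10,400 kWh per home per year, the claim holds up against typical household consumption.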


As new technologies help drive down the costs associated with wind farming, the practice will, no doubt, become more and more accessible. These developments, along with government subsidies, tax breaks and other incentives, will contribute to furthering wind power production. One such initiative is green pricing programs, or options provided to customers that give them the choice to pay a premium for electricity that comes from renewable sources.


2: 38 U.S. States Have Wind Farms

The U.S. is well represented in the race for wind power, with 38 out of 50 states currently operating utility-grade wind installations. Fourteen have more than 1,000 MW of installed wind power -- which, if you recall, is the amount of electricity that can be produced by existing equipment -- and the top five wind-producing states came in with a cumulative capacity of more than 20,000 MW [source: GWEC].


In 2008, the U.S. Department of Energy published a study that examined what would be necessary in order for wind power to provide 20 percent of U.S. electricity by 2030 [source: US Department of Energy]. As the feasibility, both technologically and financially, of these types of activities is determined, and the energy industry discovers ways to make wind as profitable as current energy markets, the number of states eager to capitalize on this natural resource will likely only increase.

1: U.S. Wind Resources Could Power the Nation 10 Times Over

Although the industrial application of wind power for producing electricity has been in development for decades, it is still a relatively young technology with much to prove in terms of viability. The motivation to move forward isn't based on what wind offers today, but rather the staggering potential it holds. Yes, it is currently an expensive endeavor requiring loads of cash and the enthusiasm of a Labrador. But when you consider the simple abundance and regularity of the wind, nothing else really comes close to matching what may be possible.

The National Renewable Energy Laboratory states that the potential of land-based resources (wind farms installed on land as opposed to the open ocean) alone could provide America with its electricity needs 10 times over [source: AWEA].


A 2009 Harvard study found that a network of turbines operating at even a modest 20 percent of capacity could supply more than 40 times the worldwide demand for electricity. If this study, and others like it, are even in the ballpark, then continuing the exploration of wind as an alternative to fossil fuels is a no-brainer.




 by "environment clean generations"

Unclaimed Environmental Prizes



When the X Prize Foundation announced in 1996 it would pay $10 million to the team that could launch a privately funded spaceship on a suborbital flight twice within two weeks, it got the world's attention.

Prizes are a common way of addressing the greatest technological hurdles facing a society. They're a long-standing tradition, going back at least to the 1700s when the British government offered 20,000 pounds for a maritime device that would measure longitude, resulting in clockmaker John Harrison's chronometer.

The nature of prize-driven competitions makes them effective tools for innovation. When a problem becomes a public prize, especially an international one, the field of potential problem-solvers expands exponentially, and experts who would normally be interested in the challenge are further spurred to action by the money, the recognition and the thrill of the race.

Financially speaking, the competition is a windfall for society, not just for the winner. Typically, all competitors combined spend far more money solving the problem than the prize is worth. In pursuit of the $10 million X Prize, contestants spent about $100 million on research and development.

It makes sense, then, that one of the greatest problems facing the world today would warrant some big prize money. In the last 10 years, environmental prizes have hit the radar in a big way. They may not be quite as sexy as suborbital spaceflight, but the problem they address is bigger.
And in some cases, so is the money.

Someday soon, meat could be one of the most animal-friendly products out there.
Most of us associate PETA with animals, as opposed to straight environmentalism. But of course the two are related. PETA is offering $1 million to the first group to develop and successfully market synthetic meat. In this case, they're looking for chicken.

We're not talking chicken-flavored tofu here. We're talking about chicken-flavored chicken -- meat that looks, feels and tastes just like the stuff at the meat counter right now, only grown in the lab using chicken stem cells. And PETA's only going to pay up if it tastes just like the chicken meat from an actual chicken, because it has to be marketable on a large scale.

The primary motivation, of course, is humane treatment of animals. PETA is opposed to the methods of raising and slaughtering animals used in the livestock-farming industry. But there's another benefit to the prize: curbing greenhouse-gas emissions. Livestock farming accounts for 9 percent of all global CO2 emissions from human activity, and 65 percent of the nitrous oxide [source: UN]. All told, the meat industry accounts for more greenhouse gases than the transportation sector.

To win the $1 million, contestants have to sell the synthetic meat commercially, and at a price comparable to naturally grown meat, in at least 10 U.S. states by June 30, 2012.

The Freedom Prizes are more about innovation than invention. Sponsored by the Freedom Foundation, a nonprofit, and funded by the U.S. Department of Energy, the prizes reward groups who make the best use of current technologies to reduce consumption of fossil fuels.

The motivation behind the prize is not so much saving the environment as saving the United States from the hazards of dependence on foreign oil, and also from the health effects of pollution. But the outcome is the same: fewer greenhouse gases emitted into the atmosphere. The idea is to encourage people to use the energy-saving tools already at their disposal.

The prize totals more than $4 million, broken up into increments of $500,000 to $1 million each, awarded to the best ideas in five different areas: community, industry, government, military and schools. In each of the categories, the winners are those who devise and put into effect plans that cut back on fossil fuels in a way that can be implemented widely.

These programs might be things like using alternative energy to heat schools, greening automotive fleets or offering rebates on homeowners' association fees to people who use 10 percent less electricity per month. The competition began officially in 2008, and the foundation plans to begin distributing prize money sometime in 2009.


The X Prize is not a one-time thing. It's a whole foundation dedicated to giving millions of dollars to the best and the brightest of technological innovators in numerous fields, and it has turned its attention from space to Earth. The most recent X Prize aims to develop a car that can make a real dent in automotive greenhouse-gas emissions.

The Progressive Automotive X Prize is a joint effort between the X Prize Foundation and Progressive Insurance. They're offering $10 million to the people behind the best 100-mpg car.

It's not just about miles per gallon, though. The prize is about pragmatism as much as it's about environmentalism: This unbelievably fuel-efficient car has to be safe enough, smooth enough and cheap enough for mass consumption.

The X Prize knows how to draw attention to itself, which is part of its success. To win the $10 million, one car in each of two categories, mainstream and alternative, has to have the lowest overall time in two long-distance, urban road races planned for 2009 and 2010. The cars have to be as fast as they are Earth-friendly (or at least fast enough for typical highway driving -- we're probably not looking at Ferrari speeds here).

The mainstream winner must have a 200-mile (321-kilometer) range, while the alternative car has to last for 100 miles (160 kilometers) without refueling -- whatever "refueling" means in this context. Sixty teams from around the world have already signed up to compete.

The government of Scotland has proposed a juicy challenge to solve the world's energy problems: 10 million euros (about $15 million) to the team that develops the best ocean-power system.

Ocean power comes primarily in two forms: wave power and tidal power. Wave power generators float on top of the ocean, generating power as they're tossed about by waves. Tidal generators are under the sea, generating power from the force of tidal movements. They're kind of like wind turbines, but instead of wind they harness ocean currents.

Contestants can develop either one of these types of systems, and they have to test it in Scottish water. The goal is to come up with a viable, efficient and highly productive power system that relies on Scotland's hefty supply of water energy (25 percent of all potential ocean power in Europe is in Scottish waters) instead of on fossil fuels. The winning design will be the one that supplies thousands of Scottish homes with all their electricity needs for two years, in the most efficient manner and requiring the least amount of maintenance.

The prize aims to further Scotland's ambitious goal of meeting half of its energy needs through renewable energy by 2020.

In 2007, former U.S. Vice President Al Gore and business mogul Richard Branson of Virgin Group announced a joint venture: the Virgin Earth Challenge, a competition to remove carbon dioxide from the air. It's a call to scientists and engineers to design a CO2 removal system that makes a real difference in long-term global warming predictions -- specifically, it must remove, at a minimum, a billion tons of CO2 per year during 10 years of operation. But that's not all.

The system has to be commercially viable, and it can't do any harm to the environment in the process of pulling carbon dioxide out of the air.

In exchange for this tall order, the Virgin Earth Challenge offers $25 million. It's the largest science and technology prize ever offered, and it'll be on the table for an initial five-year period.
If engineers are clamoring for a $25-million chance to save the planet, imagine the rush for a $300-million endeavor. During his 2008 presidential campaign, John McCain suggested a $300-million, presumably government-sponsored prize for the inventor of a super high-efficiency car battery.
No word yet on whether the government is signing on.


Pole Shifting In 2012?


Some say the world will end in fire; some say ice. Lately, screenwriters and apocalypse enthusiasts have preferred natural cataclysms as their world-killers. As for when the end will arrive, those folks who claim to be in the know have an affinity for stamping 2012 as the Earth's sell-by date.

Why 2012? The answer traces back to true believers' interpretations (and reinterpretations) of Nostradamus, Edgar Cayce and various other ambiguous and nonscientific sources. Some armchair eschatologists have narrowed the expiration date further, to Dec. 21, 2012 -- when, they argue, the Mayan Long Count calendar ends its 5,125-year cycle. However, experts agree that the Mayans themselves did not believe that the world would end on this date, so feel free to buy green bananas on Dec. 19, 2012.
The lack of scientific evidence for the coming apocalypse hasn't deterred believers from trotting out scientific theories to serve as evidence of imminent mass destruction. One of the most remarkable ideas they've chosen to flog is the pole shift hypothesis, in which the Earth's crust and mantle (or outermost layers) move as one piece. A pole shift might send the poles sliding toward the equator, swing North America poleward or produce any arrangement that might result from turning a globe in your hands.

People have been batting around some version of the pole shift hypothesis since at least the mid-19th century and, although many of the scientific questions it attempted to answer have since been addressed by plate tectonics, it's rooted solidly in physics. Plate tectonics and pole shifts interact and are governed by the same forces, but pole shifts, in which the outer shell of the world moves as one piece, produce very different results than plate tectonics, in which pieces of the Earth's crust bump, grind and slide -- opening seas, building mountain ranges and rearranging continents.


If a large pole shift could happen suddenly, the redistribution of land and water it caused would be nothing short of cataclysmic. In the short term, it would mean earthquakes, strange weather patterns, massive tsunamis capable of drowning parts of continents, and possibly gaps in the planet's magnetic field -- our shield against harmful cosmic rays. In the long term, the redistribution of land and water in the tropics, subtropics and poles would fundamentally alter ocean currents and the heat balance of the Earth, resulting in widespread climatological shifts. Ice caps might melt and reform elsewhere, or remain melted, driving sea levels down or up.

All of which returns us to the question: Could such a catastrophic shift occur, and if so, will it happen in 2012?

Pole shift refers to a geological phenomenon in which the Earth's outermost layers move together as one piece.

To understand polar shift (known to geologists as the true polar wander, or TPW, hypothesis), it helps to have a clear picture of how the Earth is put together.

The Earth isn't a solid ball of rock; it consists of concentric layers, each with its own heat and density characteristics. The outermost layer, the crust, is made up of rocky, interlocking pieces. These aluminum-silicate plates float like rafts atop a molten outer mantle, which surrounds a more fluid inner mantle. Farther in, a liquid nickel-iron outer core encompasses the Earth's solid, iron inner core. Put another way, the Earth consists of a solid shell surrounding a liquid interior, which encircles a solid center.


Most of the Earth's internal heat is stored in the mantle. There, temperature differentials cause convection -- the same process observed in a pot of boiling water, except it takes place over hundreds of kilometers and involves what Dr. Evil would call "liquid hot magma." Hotter magma rises toward the crust while cooler, denser materials -- such as subducted oceanic plates -- sink toward the core [source: Sager]. Convection drives tectonic processes and also redistributes the internal mass of the Earth.


Above the mantle, the crust tilts, rocks, sinks and rebounds in response to changes in pressure and load, such as occur after an ice age, when glacial ice returns to the sea as meltwater. The motion is like how a boat reacts to a person exiting or climbing aboard, only much slower.


When internal and/or surface mass distributions become uneven, TPW might occur, because the centrifugal force of the Earth's spin drives mass anomalies -- whether on the crust or in the mantle -- toward the equator. Some geologists argue that this has happened in the past. One possible episode occurred about 800 million years ago; another, 610 to 510 million years ago, might have caused climate shifts that helped bring about the Cambrian explosion -- the relatively rapid appearance of most of the major groups of animals in the fossil record.

Some scientists believe that a polar shift is happening right now, at a rate of nearly 4 inches (10 centimeters) per year [source: Tarduno]. This gradual "righting of the boat" is a physical response to the retreat of the Laurentide ice sheet at the end of the Pleistocene Epoch, 20,000-plus years ago [source: Maloof]. But don't pack the kids into John Cusack's limo just yet; this rate, although fast by plate tectonic standards, is still very, very slow. In fact, TPW takes 1 million to 100 million years to complete an adjustment -- the geological equivalent of watching fingernails grow -- and the current one will stop before making much progress.
However you look at it, rapid polar shifts, like the kind portrayed in the movie "2012," simply don't happen.


Magnetic Shift

Polar shift is easily confused with pole reversal, in which the Earth's magnetic poles change places. Evidence for such reversals is abundant, locked in the iron oxides of ancient rocks, which aligned along the direction of magnetic north when they cooled. This alignment still occurs in some igneous rocks, such as when lava cools and crystallizes. Magnetic pole reversals occur irregularly (around every 300,000 years) and require thousands of years to complete. Some call the South Atlantic Anomaly -- a trough in the Earth's magnetic field near the coast of Brazil that enables cosmic rays and charged particles to delve lower than usual into the atmosphere -- a harbinger of an upcoming pole reversal, due perhaps as soon as 2012. Scientists think it unlikely; even if it happened, they say, the results would not be catastrophic.




16 Super Earths Discovered



The European Southern Observatory (ESO) has announced that its exoplanet-hunting HARPS (High Accuracy Radial velocity Planet Searcher) has discovered 50 new exoplanets, the largest number of exoplanets ever announced at one time. The haul, which brings the total number of planets discovered outside our solar system to 645, includes 16 super-Earths (planets with a mass between one and ten times that of Earth), one of which orbits at the edge of the habitable zone of its star.

 Wide-field view of the sky around the star HD 85512 (Image: ESO and Digitized Sky Survey 2 / Davide De Martin)

Whereas NASA's Kepler spacecraft looks for fluctuations in the brightness of stars to detect planets passing in front of them, HARPS is a high-precision echelle spectrograph that observes Doppler shifts in the spectrum of the star around which a planet orbits. In contrast to the majority of planets discovered by the transit method employed by Kepler, which are very distant from us, the planets found by HARPS orbit stars that are much closer, making them better targets for many kinds of follow-up observations.
HARPS discovered its first super-Earth in the habitable zone (Gliese 581 d) in 2007. More recently, it was also used to demonstrate that the other candidate super-Earth in the habitable zone around the star Gliese 581 (Gliese 581 g) doesn't exist.


In the eight years since HARPS achieved first light, its observations have allowed astronomers to improve the estimate of how likely it is that a star like the Sun is host to low-mass planets as opposed to gaseous giants. By studying the properties of all the HARPS planets discovered so far, the team has found that about 40 percent of stars similar to the Sun have at least one planet lighter than Saturn. Additionally, the majority of exoplanets of Neptune mass or less appear to be in systems with multiple planets.


With upgrades to both hardware and software, the team is increasing the sensitivity of HARPS, to search for rocky planets that could support life. One potential candidate is the newly discovered HD 85512 b, which is estimated to be just 3.6 times the mass of the Earth and is located at the edge of the habitable zone where water may be present in liquid form if conditions are right.


"This is the lowest-mass confirmed planet discovered by the radial velocity method that potentially lies in the habitable zone of its star, and the second low-mass planet discovered by HARPS inside the habitable zone," says Lisa Kaltenegger of the Max Planck Institute for Astronomy and Harvard Smithsonian Center for Astrophysics.


The team says that HARPS is now so sensitive that it can detect radial velocity amplitudes of significantly less than 4 km/h (2.5 mph), allowing it to detect planets under two Earth masses. Earth induces a 0.32 km/h (0.2 mph) radial velocity on the Sun.
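For perspective, that 0.32 km/h figure can be recovered from a simple momentum-balance estimate: for a circular, edge-on orbit, the star's reflex speed is the planet's orbital speed scaled by the planet-to-star mass ratio. The sketch below uses standard physical constants rather than anything from the source:

```python
import math

# Momentum-balance sketch of the Sun's reflex velocity due to Earth.
# Assumes a circular, edge-on orbit; constants are standard SI values.
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30    # solar mass, kg
M_EARTH = 5.972e24  # Earth mass, kg
AU = 1.496e11       # Earth-Sun distance, m

v_earth = math.sqrt(G * M_SUN / AU)   # Earth's orbital speed, ~29.8 km/s
v_sun = v_earth * M_EARTH / M_SUN     # Sun's reflex speed, m/s
print(f"{v_sun * 3.6:.2f} km/h")      # 0.32 km/h, matching the article
```

The result, about 9 centimeters per second, shows why detecting true Earth analogs demands such extraordinary spectrograph stability.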


HARPS is currently installed on ESO's 3.6 m Telescope at La Silla Observatory in Chile but a copy of HARPS is to be installed on the Telescopio Nazionale Galileo in the Canary Islands, to survey stars in the northern sky. 

Additionally, a new and more powerful planet-finder, called ESPRESSO, (Echelle SPectrograph for Rocky Exoplanet and Stable Spectroscopic Observations), will be installed on ESO's Very Large Telescope in Chile in 2016. It will boast radial velocity precision of 0.35 km/h (0.22 mph) or less, giving it the ability to discover Earth-mass planets in the habitable zone of low-mass stars.


"In the coming ten to twenty years we should have the first list of potentially habitable planets in the Sun's neighborhood. Making such a list is essential before future experiments can search for possible spectroscopic signatures of life in the exoplanet atmospheres," concludes Michel Mayor, who leads the ESO's HARPS team. 

     The habitable zone around some stars with planets (Image: ESO)

 Artist's impression of the rocky super-Earth HD 85512 b - one of more than 50 new exoplanets found by HARPS (Image: ESO/M. Kornmesser)


A New Type Of Alloy Converts Heat Directly Into Electricity


The heat given off by electronics, automobile engines, factories and other sources is a potentially huge source of energy, and various technologies are being developed to capture that heat and then convert it into electricity. Thanks to an alloy recently developed at the University of Minnesota, however, a step in that process could be skipped: the new material converts heat directly into electricity.

The multiferroic alloy, with the catchy name Ni45Co5Mn40Sn10, was created by combining its various elements at the atomic level. Multiferroic materials are known for having unique elastic, magnetic and electric properties, and in the case of this alloy, those take the form of an unusual phase change: when heated, the non-magnetic solid suddenly becomes a strongly magnetic one.

In a lab test, upon becoming magnetic, the material absorbed heat from its environment and produced electricity in an attached coil. Although some of the heat energy is lost in a process known as hysteresis, the University of Minnesota researchers have developed a method of minimizing that energy loss.

"This research is very promising because it presents an entirely new method for energy conversion that's never been done before," said aerospace engineering and mechanics professor Richard James, who led the research team. "It's also the ultimate 'green' way to create electricity because it uses waste heat to create electricity with no carbon dioxide."


Skies Teeming With UFOs


Earlier this year, Annie Jacobsen's book Area 51: An Uncensored History of America's Top Secret Military Base drew groans from skeptics and believers alike, who derided her claim that the infamous 1947 Roswell crash was really a spy plane sent by Josef Stalin, and piloted by “alien-like children” created by Nazi doctor Josef Mengele, intended to create a mass panic about an alien invasion.


The story was based entirely on one anonymous source without a shred of supporting evidence, which is not unheard of among UFO reports. UFO enthusiasts who like some documentation with their speculation might prefer journalist Leslie Kean's recent book, UFOs: Generals, Pilots and Government Officials Go on the Record.

Kean's book topped The New York Times best-seller list -- an unusual achievement for a nonfiction book about extraterrestrials. Part of the reason the book has done so well, Kean told Discovery News, is that “I'm trying to be very straightforward as a journalist, laying out what we know based on the official records. Also, many of the chapters were written by other people [including generals and former Arizona governor Fife Symington], and [former White House chief of staff] John Podesta wrote the foreword. ... They're not just taking my word for it; the reader gets to actually read what these authorities have to say in their own words.”
There are many cases in the book -- from a UFO sighted over Chicago's O’Hare Airport in 2006 to reports from Brazil and Iran -- but one famous UFO incident was solved shortly after the book came out.

The Belgian UFO photo is a famous image taken on April 4, 1990, by a man known only as “Patrick” in the Belgian town of Petit-Rechain. Patrick and a female friend noticed a strange aircraft with four lights hovering in the sky above her home. He took a photo that has been called “one of the most convincing” pieces of evidence for the existence of UFOs.

According to one of Kean's contributors, Maj. Gen. Wilfried de Brouwer of the Belgian air force, a distinguished team of experts analyzed the photograph: “A team under the direction of Professor Marc Acheroy discovered that a triangular shape became visible when overexposing the slide. After that, the original color slide was further analyzed by Francois Louange, specialist in satellite imagery with the French national space research center, CNES; Dr. Richard Haines, former senior scientist with NASA; and finally Professor Andre Marion, doctor in nuclear physics and professor at the University of Paris-Sud and also with the CNES.”


The team came to various conclusions, including that there was no indication of tampering with the slide, and that the lights were positioned symmetrically on the craft. A 2002 reanalysis “using more sophisticated technology confirmed the earlier findings and concluded that ‘the picture was not faked. The experts noted especially that the unique characteristics of the lights are very specific and said such an effect would not occur if the picture was a hoax.’”


In fact, the photographer confessed on July 26, 2011, that he had indeed hoaxed the photograph. The image, which was (twice) deemed authentic by the panel of distinguished scientists and experts, was really of a small piece of triangular Styrofoam spray-painted black with lights attached. The skeptics had been right all along.

Kean acknowledged that the hoaxing posed a serious problem: “If the guy says it was a hoax, we pretty much have to assume it was. We know that he’s a liar. He either lied the first time, or he's lying now. I'm going to have to assume that he’s telling the truth now, even though there’s some questions about it.” Belgian UFO expert Patrick Ferryn, who appeared in the History Channel show "Secret Access: UFOs on the Record," which was based on Kean’s book, has also concluded that the photo was faked.
The fact that a UFO photo turned out to be a hoax is nothing new; many have been proven fake. But it raises serious questions about the scientific analysis involved. How could these distinguished Ph.D. experts with decades of experience have been convinced by a piece of painted Styrofoam? And what does that say about other famous UFO photos that have also been “authenticated” by these and other experts?


Kean agrees: “It's a disturbing development, and it shows how hard it is to authenticate a photograph. At the time the book was put together, everyone was relying on what we knew from the labs. As a reporter I'm going to take that information seriously, and de Brouwer certainly took it very seriously, and now the guy comes out [confessing the hoax], so we’re stuck with a serious problem that's still being investigated.” Kean noted, however, that the faked photo is only part of a larger so-called Belgian Wave of UFO sightings that occurred around the same time, and the hoax “doesn’t discount all the sightings that took place.”

The Mysterious 5 Percent


Kean first got interested in UFOs back in 1999, when she received a copy of a French report summarizing two years of UFO evidence analysis. The report, which was not an official government document but included data from more than a dozen retired generals, scientists and space experts, concluded that about 95 percent of UFO reports likely have mundane, prosaic explanations. Yet that remaining elusive 5 percent “cannot be easily attributed to earthly sources” and might be extraterrestrial in origin.


Despite the reluctance of many UFO eyewitnesses and officials to come forward, Kean felt no apprehension about researching the book. “A lot of people talk about being threatened, and the CIA tapping their phones and all that. I think a lot of that stuff is exaggerated among people in the UFO community,” Kean says. Besides, she notes, she's hardly revealing classified information: “I'm really only reporting on information that’s out there; I'm reporting on official information. Anybody can look at the documents and the data. ... None of this is top secret information that would be any threat to anybody.”






by "environment clean generations"

Monday, September 12, 2011

Better Wind Turbines From Carbon Nanotube-Reinforced Polyurethane

Carbon nanotube-reinforced polyurethane could make for lighter and more durable wind turbine blades.

In the effort to capture more energy from the wind, the blades of wind turbines have become bigger and bigger to the point where the diameter of the rotors can be over 100 m (328 ft). Although larger blades cover a larger area, they are also heavier, which means more wind is needed to turn the rotor.
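The trade-off described above comes down to simple geometry: the power available to a turbine scales with the rotor's swept area, which grows with the square of the diameter (and with the cube of wind speed). A quick sketch using the standard wind-power formula, with illustrative values not taken from the article:

```python
import math

def swept_area(diameter_m):
    """Circular area swept by the rotor, in square meters."""
    return math.pi * (diameter_m / 2) ** 2

def available_wind_power(diameter_m, wind_speed_ms, air_density=1.225):
    """Kinetic power in the wind crossing the rotor disc, in watts.

    P = 0.5 * rho * A * v^3 -- the standard wind-power formula.
    A real turbine captures at most ~59% of this (the Betz limit).
    """
    return 0.5 * air_density * swept_area(diameter_m) * wind_speed_ms ** 3

# Doubling the rotor diameter quadruples the swept area,
# and hence the power available at a given wind speed.
small = available_wind_power(50, 10)   # 50 m rotor, 10 m/s wind
large = available_wind_power(100, 10)  # 100 m rotor, same wind
print(round(large / small))  # 4
```

That quadratic payoff is why manufacturers keep pushing blade length despite the weight penalty, and why lighter, stiffer blade materials matter so much.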

The ideal combination would be blades that are not only bigger, but also lighter and more durable. A researcher at Case Western Reserve University has built a prototype blade from materials that could provide just such a winning combination.

The new blade developed by Marcio Loos, a post-doctoral researcher in the Department of Macromolecular Science and Engineering, is the world's first polyurethane blade reinforced with carbon nanotubes. Using a small commercial blade as a template, Loos manufactured a 29-inch (73.6 cm) blade that is substantially lighter, more rigid and tougher than conventional blades. Rigidity is important because as a blade flexes in the wind it loses the optimal shape for catching air, so less energy is captured.

Working with colleagues at Case Western Reserve and investigators from Bayer Material Science in Pittsburgh and Molded Fiber Glass Co. in Ashtabula, Ohio, Loos compared the properties of the new material with those of conventional blades manufactured using fiberglass resin.

"Results of mechanical testing for the carbon nanotube reinforced polyurethane show that this material outperforms the currently used resins for wind blades applications," said Ica Manas-Zloczower, professor of macromolecular science and engineering and associate dean in the Case School of Engineering.


Comparing reinforcing materials, the researchers found that carbon nanotubes are lighter per unit of volume than carbon fiber and aluminum, have five times the tensile strength of carbon fiber and more than 60 times that of aluminum.

Meanwhile, fatigue testing showed the reinforced polyurethane composite lasts about eight times longer than epoxy reinforced with fiberglass, while delamination fracture tests showed it was also about eight times tougher.

The performance of the material was even better when compared against vinyl ester reinforced with fiberglass, another material used to make wind turbine blades. Fracture growth rates were also a fraction of that found for traditional epoxy and vinyl ester composites.

Loos and his team are now working to determine the optimal conditions for dispersing the nanotubes, the ideal distribution within the polyurethane, and ways to achieve both.

by "environment clean generations"