Sunday, March 22, 2009

Super Solar Flares: Armageddon from the Sun?


Most people know what a solar flare is, but few know about a very rare yet very powerful type of solar flare: the "super solar flare." A super solar flare is roughly as powerful as the nuclear electromagnetic pulse (NEMP) that occurs when a nuclear device is detonated. Either one could seriously set the world back many, many years as far as technology is concerned. Earth's atmosphere protects us and our equipment from regular solar flares, but a super solar flare could penetrate it easily and damage everything from our modern electronics to the power grid.

The destructive potential of an NEMP was first documented during early nuclear weapons testing, when nuclear bombs were new. The two phenomena, an NEMP and a super solar flare, are similar in that both emit high concentrations of electromagnetic energy that can cause considerable damage to our environment. But there are two important differences. A super solar flare usually doesn't harm biological life directly, unless the electromagnetic energy is highly concentrated or that life is very close to conducting surfaces such as wires and metal structures. A nuclear detonation, by contrast, also produces intense radiation, which is meant to destroy biological life.

NASA recently released a report detailing a study by the National Academy of Sciences entitled "Severe Space Weather Events - Understanding Societal and Economic Impacts." NASA's website contains a good synopsis of the findings of this study, which the agency funded. Our military and our government are very concerned with understanding these events, because the government needs to be able to keep operating should such a scenario occur. The truth is, very little is known about the effects that a super solar flare or an NEMP attack might have on modern electronics and electrical systems. Speculation abounds, particularly in Internet forums, but the actual effects have never been experienced.

A super solar flare is many thousands of times more powerful than a regular solar flare, just as modern nuclear warheads are many thousands of times more powerful than the early devices tested during World War II. Also keep in mind that if an NEMP attack happens there will be almost no advance warning: maybe a couple of seconds to a minute or two, and that's it. By contrast, with a super solar flare, there are dozens of satellites in orbit around the earth that monitor solar output and many other details, including solar flare activity. Because it takes light about 8 minutes and 20 seconds (roughly 500 seconds) to reach the earth from the sun, it is possible that we could get a little advance warning before a super solar flare hits. Just a little.
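The sun-to-earth travel time is easy to check. Here is a minimal sketch using the mean sun-earth distance (one astronomical unit, about 149.6 million km) and the speed of light; the constants are standard physical values, not figures from this article:

```python
# Time for light (and a flare's electromagnetic burst) to travel
# from the sun to the earth, from standard physical constants.
AU_KM = 149_600_000             # mean sun-earth distance in km (~1 AU)
LIGHT_SPEED_KM_S = 299_792.458  # speed of light in km/s

travel_seconds = AU_KM / LIGHT_SPEED_KM_S
travel_minutes = travel_seconds / 60

print(round(travel_seconds), "seconds, about", round(travel_minutes, 1), "minutes")
# -> 499 seconds, about 8.3 minutes
```

Note that this is only the warning time for the light-speed portion of a flare; slower-moving charged particles arrive later, which is part of why satellites watching the sun can buy a little extra time.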

There is plenty of information on the Internet right now about super solar flares and NEMPs, some good and some not so good. Understand that most of the available information about how to mitigate damage or protect equipment is derived from mathematics, and is still only theory. Even though the math has been proven in other circumstances and situations, no one has actually tested it against a super solar flare or NEMP. And you can only do so much to "harden" your environment before your preparations become too much of a hassle and interfere with everyday living.

However, the popular belief is that you can construct a "Faraday cage" and put whatever you want protected inside it, such as radios, a small television, and communication equipment. A Faraday cage is nothing more than a metal box or mesh wire screen in which all edges and seams meet to form a continuous conductive enclosure. Some theorize that even chicken wire, with its very large open weave, would work nearly as well as a solid metal box. Nothing you put inside should come in contact with the box or mesh itself, and no metal wires should run in or out of it. A quick and easy approach is to put what you want protected in a cardboard or wooden box and then wrap it with aluminum foil. Some people have lined rooms with chicken wire and are very careful about what goes into the room. An airplane is often cited as a well-constructed Faraday cage: its conductive aluminum "skin" carries electrical current around the outside of the fuselage rather than through the interior. Aircraft are actually struck by lightning fairly regularly, and nothing inside gets damaged.

There was a minor solar flare in the fall of 2005 that resulted in all GPS signals on one side of the Earth being noticeably degraded. When those results are scaled up to the power of the super solar flares that many people are expecting to happen sometime in the next few years, researchers say there could be massive outages experienced in GPS receivers located on the daylight side of the planet.

This is just a quick down and dirty primer to provide a little information and spark your interest in learning more. Now it is up to you to do a little research on your own, and devise a plan to protect yourself and your equipment should these events take place.

What Causes Thunder


Thunder is the sound caused by lightning. The flash of lightning and the accompanying thunder occur at nearly the same time; however, the lightning is seen first, followed by the sound of thunder a few seconds later. This is because light waves travel much faster than sound waves.

In order to understand thunder, one first needs to understand lightning. Lightning is electricity discharged during a thunderstorm. It is caused by the build-up and release of electrical energy in thunderstorm clouds, which sit about 15,000 to 25,000 feet above sea level. Lightning usually occurs within a cloud, between a cloud and the air, or between a cloud and the ground. Based on its nature, lightning comes in various types, such as in-cloud lightning, cloud-to-ground lightning, cloud-to-cloud lightning, sheet lightning, bead lightning, ribbon lightning, ball lightning, and the "bolt from the blue."

What Causes Thunder

The cause of thunder was debated for a long time. In the third century BC, it was believed that thunder was caused by the collision of clouds. The now-accepted explanation was developed in the 20th century. According to this theory, a lightning bolt is extremely hot, much hotter than the surface of the sun: an estimated 30,000 to 50,000 degrees F (roughly 17,000 to 28,000 degrees C). When this high-temperature bolt heats the surrounding air, the air expands almost instantly, sending out a shock wave that we hear as the sound of an explosion. In short, thunder is caused by the rapid heating and expansion of the air near the stroke of lightning.

The sound intensity of thunder varies with the nature of the lightning and the distance of the listener from the origin of the sound. You can estimate the distance in miles to the strike: count the seconds between the flash of lightning and the sound of thunder, then divide by five. If you are near the flash, you will hear thunder as a sharp crack; if you are far from the stroke, the sound will seem a low rumble. The rumbling is an echo caused by the reflection of sound waves off buildings, trees, and hillsides.
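The flash-to-bang rule can be written as a tiny helper. This is just a sketch (the function name is mine); it rests on the fact that sound covers roughly one mile every five seconds, about 343 m/s in typical air:

```python
# Flash-to-bang estimate: sound covers roughly 1 mile every 5 seconds
# (about 343 m/s in air), while the light of the flash arrives
# effectively instantly.
def strike_distance(seconds_between_flash_and_thunder):
    """Return the approximate distance to a lightning strike in miles."""
    return seconds_between_flash_and_thunder / 5

print(strike_distance(10))  # thunder 10 s after the flash -> 2.0 miles
```

Dividing the count by three instead of five gives the distance in kilometers, since sound covers about a kilometer every three seconds.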

Thunder can be unpleasant, depending on how close we are to the stroke of lightning; if the sound is very loud, it can hurt our ears. It is the lightning itself, however, that is truly dangerous. On average, about one hundred people die every year in the United States, and many more suffer lifelong disability, due to lightning. Most lightning casualties occur when people are caught outdoors in stormy weather. If you are outside during a thunderstorm, it is advisable to avoid open fields, beaches, and lakes, and to stay away from tall trees. The safest course is to get inside your home or stay in a car; in the latter case, you should not touch metal, as it is a good conductor of electricity.

Is Yellowstone Going to Erupt?


Some researchers and residents around Yellowstone are beginning to wonder if the hundreds of small earthquakes that have been happening in the area are a grim forecast of a disaster in the making. Simmering beneath the glorious majesty of the snow-capped mountains and lush green meadows of Yellowstone is one of the largest volcanoes in the world. Scientists say that millions of years ago, the volcano erupted with a blast a thousand times more powerful than the eruption at Mount St. Helens, with ash being catapulted as far away as Louisiana. However, geologists say that there is no risk of any eruption like that happening again, and even a minor eruption is not likely any time soon.

Still, some researchers and residents around Yellowstone are beginning to wonder if the increasing frequency of small earthquakes in the area is Mother Nature's warning that major fireworks are about to begin. But the Yellowstone Volcano Observatory, which monitors Yellowstone for seismic activity, has not changed the volcano's alert level, which remains at "normal."

Online disaster forums such as Armageddononline.com have been abuzz keeping Internet visitors apprised of daily events at Yellowstone. One site even went so far as to post a warning, accompanied by the logo of the U.S. Geological Survey, advising everyone living within 100 miles of Yellowstone to leave because of dangerous poisonous gases escaping due to the recent earthquakes. The site has since been taken down after complaints that visitors might believe it was an official source. USGS attorneys are investigating whether federal charges should be filed against the site's owner for violating the USGS trademark.

Yellowstone and the surrounding areas experience hundreds of earthquakes each year, often in "swarms" of quakes in rapid succession. For a week beginning on December 26, 2008, there were approximately 900 earthquakes, evidently the most energetic earthquake swarm in more than 20 years. Almost all of the quakes were too weak for most people to even notice; the most powerful was a 3.9-magnitude temblor, only strong enough to cause slight damage. There has been no public speculation about what caused the December swarm, but scientists are analyzing the data they gathered, and it could take months before they are able to issue any conclusions.

"I could come up with 100 different theories without any evidence for them and they would all be equally likely," said Jake Lowenstern, the scientist in charge of Yellowstone Volcano Observatory. "Unless you have some reason to say that's what's going on, then you're not going to get a whole lot of people convinced by your speculation." Hank Heasler, a geologist at Yellowstone, furthered Lowenstern’s statements by saying that the odds of a huge eruption are about the same as the odds of a meteorite hitting Earth. According to a paper co-authored by the two, large hydrothermal eruptions occur only about every 200 years, leaving behind football field-sized craters.

New equipment has been installed in the park in recent years, and geologists hope the data gathered from those sensors will help provide a definitive picture of what activity is causing the swarm of earthquakes. In the meantime, Lowenstern and others are content to bide their time, at least for this generation. "Statistically, it would be surprising to see an eruption in the next hundred years," Lowenstern said.

Top Astronomy Discoveries of 2008


Planetary science made leaps and bounds in 2008 that many people weren't even aware of. Many of the findings were right in our own backyard, on Mercury and Mars, and others were far out in space beyond our solar system. Astronomers discovered at least 50 new planets, called "exoplanets," this year. "It's been a very exciting year for exoplanet discoveries," said Michael Liu, an astronomer at the University of Hawaii. "The big picture is that a wide variety of new technologies, both instruments on existing telescopes and new dedicated telescopes, are really allowing astronomers to do much more sensitive measurements, and thus leading to a real bonanza of discoveries," Liu told SPACE.com.

So far there have been more than 300 exoplanets discovered. Many astronomers seem convinced that it's only a matter of time before they spot another planet Earth. In November, two teams of astronomers reported that they had taken photographs of exoplanets. Geoffrey Marcy of the University of California, Berkeley, calls the images "the most spectacular thing in 2008." Speaking about the Hubble Space Telescope's image of the planet called Fomalhaut b, Marcy added, "In my own professional opinion this is by far the most definitive picture of a planet ever taken."

With the regular reports on NASA's Phoenix Mars Lander, which touched down on the red planet in May, our planetary neighbor has gotten a lot of attention. Reports also continued to come from NASA's Mars Reconnaissance Orbiter and the Mars Exploration Rover twins, Spirit and Opportunity. The Orbiter has now imaged nearly 40% of the planet, and it continues to capture and send back fascinating pictures that help astronomers learn more about Mars.

One of the primary goals of these missions is to find signs of past or present water, the main ingredient for supporting life. When the Phoenix Lander uncovered water ice near the north pole of Mars this year, the astronomy world was thrilled. Earlier in the year, Spirit had found deposits of silica in Gusev Crater, which scientists believe suggests that hot water once flowed through the soil in hydrothermal vents, which may have harbored life. And if life did exist there at one time, the silica could have preserved fossils.

One of the most sci-fi-sounding research projects of 2008 was the exploration of a mysterious "force" scientists call "dark energy." Discovered about 10 years ago, this force appears to be driving the universe to expand at an ever-increasing pace. Scientists admit that their research is still in very preliminary stages, but a new method used this year confirmed the existence of dark energy and suggested that it is stifling the growth of galaxies. The basic concept is that in an expanding universe dominated by dark energy, rather than mixing and mingling, galaxies fly away from each other.

Now that Pluto is no longer a planet, Mercury has taken its place as the smallest planet in our solar system. Mercury remained cloaked in mystery until early 2008, when NASA's Messenger probe made its first pass by Mercury, beginning a mission to image the entire planet. The first images showed clear evidence of volcanoes, with lava flows in the Caloris basin and a volcano larger than the state of Delaware. The thousands of other images sent by Messenger could shed light on other mysteries of Mercury, including the planet's core, which makes up about two-thirds of the planet's mass. Some astronomers suggest that a huge impact hundreds of millions of years ago may have stripped Mercury of its original surface.

Astronomers predict that the upcoming year will bring us even closer to discovering a planet that could be Earth’s twin. NASA’s Kepler mission, scheduled to launch in March, will search for rocky planets about the size of Earth that orbit within the habitable zone of their host stars where liquid water and life may exist. While they continue their research, imaging, and postulating, the rest of the world can only watch and wait, gazing upward and dreaming of what might be out there.

Plastic Continent: Plastic Recycling Overview

Plastic manufacturing took off around World War II, an outgrowth of experiments conducted by the petrochemical industries. The term plastic is derived from the Greek plastikos, meaning "fit for molding."

One defining feature of plastic is that it does not decompose easily, and this is its chief drawback. Plastic materials such as bottles, wrappers, and containers do not decompose when dumped. Instead, plastic garbage pollutes the environment and undermines conservation efforts.

Since plastic does not biodegrade, it breaks down instead through a process called photodegradation: sunlight makes the plastic brittle until it fragments into small bits. Plastics are considered a serious environmental pollutant because garbage containing plastic ends up in waterways that eventually flow into the oceans. The accumulation of plastic in the ocean endangers marine life and pollutes the water.

Does Plastic Continent exist?

Yes, the plastic continent does exist; it was discovered by Captain Charles Moore a decade ago. Roughly twice the size of Britain, it lies between the Hawaiian Islands and California in the central Pacific Ocean.

One cause of the lack of marine life in this region is pollution, and the pollutant is none other than plastic. According to Captain Moore's reports, instead of clear ocean waters he could see only mounds of floating plastic waste: bottles, wrappers, and other debris. He believes the plastic waste started accumulating in the 1950s.

The plastic continent is a man-made expanse of floating plastic waste. Sailors and fishermen have avoided this region for years, for two reasons:

* There are few fish here because of the lack of nutrients.
* The zone also lacks the wind that is essential for sailing.

To research ocean pollution, Captain Moore founded the Algalita Marine Research Foundation, and other organizations such as Greenpeace supported his cause. Reports from the United Nations Environment Programme reveal that millions of seabirds and marine mammals are dying due to plastic in the oceans.

Some of the waste plastic that is believed to be recycled is in fact dumped into the oceans. Marine animals mistake these fragments for food and swallow them; this kills them and disrupts the marine food chain.

To save the oceans from the plastic continent you need to - reduce the use of plastics, re-use the plastic bags and containers and recycle the plastics.

Plastic Recycling Overview

Households are one of the greatest sources of plastic waste, and recycling household plastics presents various challenges. One challenge is collecting waste plastics that are dumped along with other garbage. There are two types of plastic recycling:

Mechanical recycling of plastics - The plastic waste must be sorted before mechanical recycling. Mechanical recycling involves processes such as washing, shredding, melting, and granulation.

Chemical recycling of plastics - Chemical recycling is also known as feedstock recycling. In this process, the waste polymers are broken down into their chemical building blocks using technologies such as pyrolysis, hydrogenation, thermal cracking, and gasification.

A wide array of plastic products is made from recycled plastics. Some of the products include plastic bags and garbage bin liners, window frames, CD cases, garden furniture and various office accessories such as plastic trays, stands and so on. The recycled plastics are also made into flowerpots, plumbing pipes, toys, carpets and insulation.

A Few Plastic Recycling Facts

Did you know that, on average, Americans throw away two million plastic bottles every hour?

PET, or polyethylene terephthalate, carries recycling code #1. Soda bottles made from PET are the most recycled plastic containers.

Another interesting fact about PET: recycling about 35 PET soda bottles yields enough fiberfill to fill a sleeping bag.

One-third of the carpets manufactured in the US contain recycled PET bottles.

You can make a six-foot plastic bench from around 1,100 recycled milk jugs.

The high-density polyethylene (HDPE) plastic used to make juice bottles and milk jugs can be recycled into goods such as flowerpots, trash cans, and curbside garbage bins.

Statistics show that a ton of HDPE plastic sells for around $400, and an estimated 75% of the HDPE manufactured goes into single-use, throwaway containers.

Around 55% of recycled PET is used to manufacture carpet and clothing, and 21% goes into food and non-food containers. Similarly, 45% of recycled HDPE is used to make new bottles for fruit juices and milk; HDPE is also made into garden products such as edging, picnic tables, and benches.

It is estimated that about 6-8% of the solid waste in the United States is plastic.

Did you know it takes an estimated 700 years for a plastic bottle to decompose in a landfill?

What is Water Pollution


Water pollution is an undesirable change in the state of water caused by contamination with harmful substances; it is often ranked as the most pressing environmental issue after air pollution. Any change in the physical, chemical, or biological properties of water that has a harmful effect on living things is water pollution. It affects all the major water bodies of the world: lakes, rivers, oceans, and groundwater. Polluted water is unfit for drinking and other consumption, and it is also unsuitable for agricultural and industrial use. Its effects harm human beings, plants, animals, fish, and birds. Polluted water can also carry viruses, bacteria, intestinal parasites, and other harmful microorganisms, which cause waterborne diseases such as diarrhea, dysentery, and typhoid. Water pollution disturbs the entire ecosystem.

Sources of water pollution

The important sources of water pollution are domestic wastes, industrial effluents and agricultural wastes. Other sources include oil spills, atmospheric deposition, marine dumping, radioactive waste, global warming and eutrophication. Among these, domestic waste (domestic sewage) and industrial waste are the most important sources contributing to water pollution.

Domestic Sewage: Domestic sewage is wastewater generated from household activities. It contains organic and inorganic materials such as phosphates and nitrates: organic material from food and vegetable waste, inorganic material from soaps and detergents. People often dump household waste into a nearby water source, which leads to water pollution. The burden of organic waste on a water body is measured as Biological Oxygen Demand (BOD), the amount of oxygen microorganisms need to decompose the organic waste present in the sewage. The higher the BOD, the more polluted the water is with organic waste. Many people are unaware that soaps and detergents enrich water bodies with phosphates. These phosphates often lead to algal blooms and eutrophication, which is most common in stagnant water bodies such as ponds and lakes. Algal blooms and eutrophication suffocate fish and other organisms in a water body.

Industrial Effluents: Wastewater from manufacturing and processing industries causes water pollution. Industrial effluents contain organic pollutants and other toxic chemicals, including lead, mercury, asbestos, nitrates, phosphates, and oils. Wastewater from food and chemical processing industries contributes more to water pollution than other industries such as distilleries, leather processing plants, and thermal power plants. Dye industries, in particular, generate wastewater that changes the water's color; the altered color reduces light penetration and thereby disturbs aquatic plants and animals. Many big industries have built wastewater treatment plants, but that is rarely the case with small-scale industries, and treating industrial wastewater remains very difficult.

Consider the example of Minamata disease, in which 1,784 people died and many more suffered after eating fish contaminated with bioaccumulated methylmercury. It was caused by the release of methylmercury from the Chisso Corporation's chemical factory, and the disease continued to affect animals and humans for over 30 years, from 1932 to 1968.

Agricultural Waste: Agricultural waste includes manure, slurries, and runoff. Most farms use chemical fertilizers and pesticides, and runoff from these fields pollutes nearby water sources such as rivers, streams, and lakes. The seepage of fertilizers and pesticides into groundwater, commonly known as leaching, pollutes it as well. Although the quantity of agricultural waste is low, its effects are highly significant: it adds nutrient and organic pollution to both water and soil. Nutrient pollution raises nitrate and phosphate levels in water bodies, which leads to eutrophication.

Depending on their origin, sources of water pollution are classified as point sources or non-point sources. A point source discharges harmful waste directly into a water body, for example through a wastewater treatment plant's outfall. A non-point source delivers pollution indirectly, for example through acid rain or diffuse runoff.

Prevention of water pollution

Although 71% of the earth's surface is covered with water, we do not have enough water to drink. Much research has been done on water purification systems to provide safe drinking water, yet about 1 billion people still lack proper access to it. Water therefore needs to be conserved and protected from pollution so it remains safe for drinking and other uses. Reducing water use helps conserve water and saves money. Preventing water pollution includes using eco-friendly household products such as non-phosphate or low-phosphate detergents and toiletries, improving housekeeping, turning off the tap when it is not needed, and disposing of household waste at proper sites far from water sources. Planting more trees also helps by reducing soil erosion and water runoff. Educating people about water pollution is another important way of preventing it.

The Pickens Plan: Tossing Oil Dependency into the Wind


One of the world's most outspoken and productive billionaires, T. Boone Pickens, has never been shy about predicting the price fluctuations of oil and gas. And now he's not being shy about how he thinks the United States can break its dependency on foreign oil. Pickens certainly knows what he's talking about when it comes to fossil fuels. He is the main person responsible for forming the energy futures investment strategy of BP Capital, which manages more than $4 billion in one of the nation's most successful energy-oriented investment funds.

As most Americans know, our country is addicted to foreign oil suppliers, and this dependency just gets worse each year. Every day about 85 million barrels are produced around the world, and 21 million of those are used in the U.S. So 25% of the world’s oil demand is used by just 4% of the world’s population. There’s obviously a dangerous disparity there. World oil production peaked in 2005, and oil production has fallen steadily over the past three years. Oil drilling and production is getting increasingly expensive, and there isn’t enough oil to keep up with the demand.

In 1970 only 24% of our oil was imported; today that number is nearly 70% and growing. At current oil prices, we will send $700 billion out of the country this year to support our foreign oil habit. To put that into perspective, that's four times the annual cost of the war in Iraq. If this level of consumption continues, then over the next 10 years we will have paid out $10 TRILLION, the greatest transfer of wealth in the history of the world.

Now let’s get back to T. Boone Pickens, whose insight into oil production and America’s dangerous dependency has led him to begin making a concerted effort to do something about it. Pickens is spearheading an economic revival for rural America while also trying to break the cycle of dependence on fossil fuel. In the small town of Pampa, Texas, just north of Sweetwater, T. Boone Pickens’ Mesa Power is currently building the largest wind farm in the world.

The Department of Energy says that 20% of America’s electricity can come from wind, if the government will just take steps to begin pursuing it. Studies from around the world show that the Great Plains states have the greatest wind energy potential in the world. North Dakota alone has the potential to provide power for more than 25 years. A 2005 study by Stanford University shows that there is enough wind power in the world to satisfy global electricity demands 7 times over, even if only 20% of that wind power could be harnessed.

Today’s wind turbines currently account for the power that serves more than 4.5 million households, but that’s still only about 1% of current demand. In one year alone, a 3-megawatt wind turbine can produce as much energy as 12,000 barrels of imported oil. So what’s the government waiting for? It would take about $1 trillion to build wind facilities in the corridor that stretches from North Dakota to the Texas panhandle. It would take another $200 billion to build the infrastructure to carry that energy into towns and cities throughout the country. And that sounds like a daunting figure of money. But compared to the $700 billion we spend on foreign oil each year, it’s a bargain.
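As a rough sanity check on the turbine-versus-oil comparison, a turbine's annual output can be converted into "barrels of oil equivalent." The sketch below is illustrative only: the roughly 1.7 MWh-per-barrel conversion and the capacity factors (the fraction of the year a turbine effectively runs at rated power) are common rule-of-thumb assumptions, not figures from this article.

```python
# Back-of-the-envelope: how many barrels of oil a wind turbine's annual
# electricity output is "equivalent" to, under stated assumptions.
HOURS_PER_YEAR = 8760
MWH_PER_BARREL = 1.7  # approximate thermal energy in a barrel of oil

def annual_barrels_equivalent(rated_mw, capacity_factor):
    """Annual output of a turbine, expressed in barrels of oil equivalent."""
    annual_mwh = rated_mw * HOURS_PER_YEAR * capacity_factor
    return annual_mwh / MWH_PER_BARREL

for cf in (0.25, 0.35, 0.45):
    print(f"capacity factor {cf}: ~{annual_barrels_equivalent(3, cf):.0f} barrels/year")
```

On these assumptions a 3 MW turbine displaces several thousand barrels a year; the exact equivalence depends heavily on the capacity factor assumed and on whether oil's raw thermal energy or delivered electricity is being counted, so figures like the one above are best read as order-of-magnitude comparisons.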

Developing wind power would revitalize rural America by providing jobs and new economic opportunities. Sweetwater is a good example of such revitalization; the population of under 10,000 has grown to 12,000 since Mesa Power began building the wind farm. In addition to creating new construction and maintenance jobs, workers are needed to manufacture the hardware and blades for the turbines. These are well-paid high-skill jobs comparable to jobs in the aerospace industry. And wind turbines don’t interfere with farming and grazing, so they won’t affect food production. Wind power would be a cheap new replacement for the expensive foreign oil that is draining our economy dry in more ways than one.

T. Boone Pickens is one of the greatest philanthropists in the US, and has contributed millions of dollars to a wide range of worthwhile causes and charities. His foundation is improving lives through grants that support medical research, education, corporate wellness, wildlife, and at-risk youth. And now he has designed a blueprint for solving a looming crisis for an at-risk America. Building new wind generation facilities will be expensive and time-consuming, but the gains far outweigh the costs, and it would take only about 10 years to replace more than a third of our foreign oil imports. But the solution has to start at the top, with the government.

The Pickens Plan website lays out the challenge clearly: "On January 20th, 2009, a new President will take office. We're organizing behind the Pickens Plan now to ensure our voices will be heard by the next administration. Together we can raise a call for change and set a new course for America's energy future in the first hundred days of the new presidency — breaking the hammerlock of foreign oil and building a new domestic energy future for America with a focus on sustainability."

Go to the Pickens Plan website and click on the link to send a message to President-elect Barack Obama to tell him why you believe it’s important to reduce our dependence on foreign oil.

Melting Ice May Slow Global Warming

Collapsing Antarctic ice sheets, which have become potent symbols of global warming, may actually turn out to help in the battle against climate change and soaring carbon emissions.

Professor Rob Raiswell, a geologist at the University of Leeds, says that as the sheets break off the ice covering the continent, floating icebergs are produced that gouge minerals from the bedrock as they make their way to the sea. Raiswell believes that the accumulated frozen mud could breathe life into the icy waters around Antarctica, triggering a large, natural removal of carbon dioxide from the atmosphere.

And as rising temperatures cause the ice sheets to break up faster, creating more icebergs, the amount of carbon dioxide removed will also rise. Raiswell says: 'It won't solve the problem, but it might buy us some time.'

As the icebergs drift northwards, they sprinkle the minerals through the ocean. Among these minerals, Raiswell's research shows, are iron compounds that can fertilise large-scale growth of photosynthetic plankton, which take in carbon dioxide from the air as they flourish.

According to his calculations, melting Antarctic icebergs already deposit up to 120,000 tonnes of this 'bioavailable' iron into the Southern Ocean each year, enough to grow sufficient plankton to remove some 2.6 billion tonnes of carbon dioxide, equivalent to the annual carbon pollution of India and Japan. A 1 per cent increase in the number of icebergs in the Southern Ocean could remove an extra 26 million tonnes of CO2, equivalent to the annual emissions of Croatia.

Raiswell, a Leverhulme Emeritus Fellow, said: 'We see the rapid ice loss in Antarctica as one obvious sign of climate warming, but could it be the Earth's attempt to save us from global warming?' He added that the effect had not been discovered before because scientists assumed that the iron in the iceberg sediment was inert and could not be used by plankton.

In a paper published in the journal Geochemical Transactions, Raiswell and colleagues at the University of Bristol and the University of California describe how they chipped samples off four Antarctic icebergs blown ashore on Seymour island by a storm in the Weddell Sea.

They found that they contained grains of ferrihydrite and schwertmannite, two iron minerals that could boost plankton growth. 'These are the first measurements of potentially bioavailable iron on Antarctic ice-hosted sediments,' they write. 'Identifying icebergs as a significant source of bioavailable iron may shed new light on how the oceans respond to atmospheric warming.'

No rivers flow into the Southern Ocean and the only previously identified major source of iron for its anaemic waters is dust blown from South America. The team says that icebergs could deliver at least as much iron as the dust.

A key question is how much of the carbon soaked up by the growing plankton is returned to the atmosphere. 'We simply don't know the answer to that,' Raiswell said. Seeding the oceans with iron will only benefit the climate if the plankton sink to the bottom when they die, taking the carbon with them.

David Vaughan, a glaciologist with the British Antarctic Survey, said: 'It's a very interesting new line of research and one that should be looked at in more detail.'

He said the number of icebergs in the Antarctic was expected to rise by about 20 per cent by the end of the century, which could remove an extra 500 million tonnes of carbon dioxide each year, if they all seeded plankton growth.
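The figures quoted in this article all follow from one assumption: CO2 drawdown scales in direct proportion to the iceberg-supplied iron. A minimal sketch of that scaling, using only the article's 2.6-billion-tonne baseline (the function name is ours):

```python
# Sanity check of the scaling behind the figures quoted above. Assumes the
# plankton response is strictly proportional to the iceberg-supplied iron,
# which is how the article's numbers are derived.
BASE_CO2_REMOVED = 2.6e9  # tonnes of CO2/year attributed to current iceberg iron

def extra_co2_removed(iceberg_increase: float) -> float:
    """Extra CO2 drawdown (tonnes/year) for a fractional rise in iceberg numbers."""
    return BASE_CO2_REMOVED * iceberg_increase

print(f"{extra_co2_removed(0.01):,.0f}")  # 1% more icebergs -> 26,000,000 t (Croatia)
print(f"{extra_co2_removed(0.20):,.0f}")  # 20% more -> 520,000,000 t (~the 500 million cited)
```

The 20-per-cent case lands at 520 million tonnes, consistent with the "about 500 million" figure Vaughan cites, which suggests the published numbers are indeed simple linear extrapolations.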

UN is Told That Earth Needs an Asteroid Shield

A group of the world's leading scientists has urged the United Nations to establish an international network to search the skies for asteroids on a collision course with Earth. The spaceguard system would also be responsible for deploying spacecraft that could destroy or deflect incoming objects.

The group - which includes the Royal Society president Lord Rees and environmentalist Crispin Tickell - said that the UN needed to act as a matter of urgency. Although an asteroid collision with the planet is a relatively remote risk, the consequences of a strike would be devastating.

An asteroid that struck the Earth 65 million years ago wiped out the dinosaurs and 70 per cent of the species then living on the planet. The destruction of the Tunguska region of Siberia in 1908 is known to have been caused by the impact of a large extraterrestrial object.

'The international community must begin work now on forging three impact prevention elements - warning, deflection technology and a decision-making process - into an effective defence against a future collision,' said the International Panel on Asteroid Threat Mitigation, which is chaired by former American astronaut Russell Schweickart. The panel made its presentation at the UN's building in Vienna.

The risk of a significantly sized asteroid - defined by the panel as being more than 45 metres in diameter - striking the Earth has been calculated at two or three such events every 1,000 years, a rare occurrence, though such a collision would dwarf all other natural disasters in recent history.
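Read as an average rate, the panel's estimate translates into rough annual odds. A back-of-envelope sketch; the 2.5 midpoint is our assumption, since the panel gave only the two-to-three range:

```python
# Back-of-envelope reading of the panel's estimate of "two or three such
# events every 1,000 years" for asteroids over 45 m across.
events_per_millennium = 2.5  # midpoint of the panel's 2-3 range (our assumption)
annual_rate = events_per_millennium / 1000.0  # expected impacts per year

print(f"Expected impacts per year: {annual_rate}")
print(f"Roughly 1-in-{1 / annual_rate:.0f} odds in any given year")  # 1-in-400
```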

The panel added that developments in telescope design mean that, by 2020, it should be possible to pinpoint about 500,000 asteroids in orbit round the Sun and study their movements. Of these, several dozen will be revealed to pose threats to Earth, the panel added.

However, the group warned it would be impossible to predict exactly which of these 'at-risk' asteroids would actually strike until it was very close to our planet. By then, it would be too late to take action.

As a result, the panel said it would be necessary to launch missions to deflect or destroy asteroids that have only a one in 10, or even a one in 100, risk of hitting our planet. 'Over the next 10 to 15 years, the process of discovering asteroids will likely identify dozens of new objects threatening enough that they will require proactive decisions by the United Nations,' the report added. In addition, such missions will have to be launched well ahead of a predicted impact, so that slight deflections by spacecraft can induce major changes in an asteroid's path years later. The world will not be able to rely on Bruce Willis saving it from an asteroid at the last minute as he does in Armageddon, in other words. Considerable planning and forethought will be needed.

Funding such missions will therefore require far greater investment than is currently being made by international authorities. At present, about $4m (£2.7m) a year is spent by Nasa on asteroid detection, while the European Space Agency's planned mission to study the asteroid Apophis - which astronomers calculate has a 1 in 45,000 chance of striking the Earth this century - is likely to be a modest project costing only a few tens of millions of dollars.

By contrast, any effective protection system will require funding of about $100m (£68m) a year to provide a full survey of the skies, combined with investment in spacecraft that can reach an asteroid and then deflect it. This would be achieved either by crashing the spacecraft on to the asteroid or by triggering a nuclear explosion in space.

However, the cost of such missions should not be used as an excuse for failing to act, added the panel. 'We are no longer passive victims of the impact process,' it concluded. 'We cannot shirk the responsibility.'

Saturday, March 21, 2009

Mechatronics and Artificial Hearts

Engineers who unknowingly practice the methodologies typical of mechatronics may well ask what the term actually refers to. In any case, a number of organizations and companies have adopted it as a name for motion-control design. The reason, according to John Mazurkiewicz, president of the Motion Control Assn., is "To combine all the mechanical, the electronics and all the products that come into motion control from the amplifiers to the motors to the gearboxes to the bus network systems."

Mazurkiewicz also says the word "mechatronics" seems to sum up and include everything that goes into motion control, and notes that the term is used far more today than it was five years ago.

The mechatronics approach also helps users avoid a familiar problem. Mazurkiewicz describes it this way: "They buy a motor from one company, an encoder from another company and a control from a third company and they make up their own cabling." When problems inevitably appear, "Each organization was pointing their fingers to the other organization and the customer never got the problem solved."

Mazurkiewicz adds, "Today, what you find is that a lot of companies are offering the entire package." Baldor's BSM C-series is one such package. These brushless servo motor systems offer higher torque and horsepower capability, and although the motor, control, and cabling can be sold separately for special configurations, the pieces are engineered to work together. "We run the motor and control and we come out with what we call a matched performance package," Mazurkiewicz says. "We have the speed torque curves that tell you if you get this motor and this drive, this is the performance that you are going to get." The evaluation even covers the cabling system.

Baldor's packaged approach to system design is just one of many. Danaher Motion offers another: its MechaWare 3.0 mechatronic tool kit integrates control-software design with models of the mechanical system to speed up design cycles and improve motion-system performance.

Artificial hearts may prove to be one of mechatronics' great innovations, but current designs have flaws that limit their use and make them ineffective for many patients. Today's electromechanical hearts are based on positive-displacement blood pumps, which tend to be bulky and therefore fail to fit in smaller chest cavities. Experts at the Texas Heart Institute are trying to develop a new type of mechatronic heart, based on a simpler pump system and more sophisticated control algorithms, to solve these problems.

Experts mention that "The overarching goal is to create a robust continuous-flow ventricular assist device that is smaller and more reliable than the current pulsating pumps that mimic the natural heart. The mathematical models of the cardiovascular system also will be evaluated as a possible means to health prognostics and diagnostics. In addition, information from the controllers will be used to assess current conditions of the blood, including viscosity, which is critical to maintaining patient health."

The UH professors involved are optimistic. "We are very much looking forward to a long-term collaboration with this excellent biomedical engineering team and to the potential development of an effective, reliable mechanical replacement for the failing human heart," says Metcalfe. "With heart disease being the leading cause of death in the United States, this is crucial research that constantly needs fresh approaches and interaction across disciplines." Echoing his colleague, Franchek adds, "What we have here is a good partnership between engineers and physicians. We are harvesting knowledge from a fertile ground where many new discoveries lie, and at the end of the day our goal is to improve many people's quality of life."

Whether this proves feasible, only time will tell.

Science that Takes Your Breath Away


The breath you "see" on a cold winter's day may become a useful diagnostic tool for identifying lung changes or damage from air pollutants. EPA scientists are at the forefront of new biomarker research to determine if exhaled breath can be used to detect changes in the body at the molecular or cellular level.

Over the last 20 years, researchers have used a medical procedure called bronchoscopy that involves taking lung cells with a scope to study the effects of air pollutants on the lungs. The procedure must be done in a clinic under medical supervision. In contrast, EPA scientists expect the breath-borne biomarkers to be usable in more situations outside the clinic, cost less to administer, and provide much more comfort to the volunteer.

Is Biodiversity Good for Our Health?


Lyme disease. West Nile virus. Malaria. Over the past several decades, scientists and public health officials have documented the outbreak of many troubling infectious diseases: illnesses once thought well under control are making a comeback or appearing in locations where they had never been seen before; brand new diseases are suddenly appearing on the scene.

Over the same time period that the frequency of emerging infectious diseases appears to be increasing, natural habitats and biological diversity (the abundance, composition and distribution of species) have been declining, primarily due to deforestation, development projects, and other human activities. Is there a link? While the current scientific literature suggests the answer is yes, there have been few integrative, interdisciplinary studies exploring the scientific connections between human health and biodiversity. A new research initiative at EPA aims to fill that void.

Geologic temperature record




This article is devoted to temperature changes in Earth's environment as determined from geologic evidence, on multi-million to billion (10⁹) year time scales.
The last 3 million years have been characterized by cycles of glacials and interglacials within a gradually deepening ice age.

For more information about the topic Geologic temperature record, read the full article at Wikipedia.org.

Climate model

Climate models use quantitative methods to simulate the interactions of the atmosphere, oceans, land surface, and ice.
They are used for a variety of purposes from study of the dynamics of the weather and climate system to projections of future climate. The most talked-about models of recent years have been those relating air temperature to emissions of carbon dioxide.

These models predict an upward trend in the surface temperature record, as well as a more rapid increase in temperature at higher altitudes.

For more information about the topic Climate model, read the full article at Wikipedia.org.

Instrumental temperature record


The instrumental temperature record shows the fluctuations of the temperature of the atmosphere and the oceans since the invention of thermometers.
For more information about the topic Instrumental temperature record, read the full article at Wikipedia.org.

Ice shelf


An ice shelf is a thick, floating platform of ice that forms where a glacier or ice sheet flows down to a coastline and onto the ocean surface, typically in Antarctica or Greenland.
The boundary between floating ice shelf and the grounded (resting on bedrock) ice that feeds it is called the grounding line.

When the grounding line retreats inland, water is added to the ocean and sea level rises.

For more information about the topic Ice shelf, read the full article at Wikipedia.org.

Larsen Ice Shelf


The Larsen Ice Shelf is a long, fringing ice shelf in the northwest part of the Weddell Sea, extending along the east coast of Antarctic Peninsula from Cape Longing to the area just southward of Hearst Island.
The Larsen Ice Shelf is a series of three shelves that occupy (or occupied) distinct embayments along the coast.

From north to south, the three segments are called Larsen A (the smallest), Larsen B, and Larsen C (the largest) by researchers who work in the area.

The Larsen A ice shelf disintegrated in January of 1995.

The Larsen B ice shelf disintegrated in February of 2002.

The Larsen C ice shelf appears to be stable. The Larsen disintegration events were unusual.

Typically, ice shelves lose mass by iceberg calving and by melting at their upper and lower surfaces.

The disintegration events are linked to the ongoing climate warming in the Antarctic Peninsula, about 0.5 °C per decade since the late 1940s (possibly a result of global warming).

For more information about the topic Larsen Ice Shelf, read the full article at Wikipedia.org.

Greenhouse effect

The greenhouse effect is the process in which the emission of infrared radiation by the atmosphere warms a planet's surface.
The name comes from an analogy with the warming of air inside a greenhouse compared to the air outside the greenhouse.

The Earth's average surface temperature is about 33°C warmer than it would be without the greenhouse effect.

In addition to the Earth, Mars and especially Venus have greenhouse effects. The Earth receives energy from the Sun in the form of radiation.

The Earth reflects about 30% of the incoming solar radiation.

The remaining 70% is absorbed, warming the land, atmosphere and oceans.

For the Earth's temperature to be in steady state so that the Earth does not rapidly heat or cool, this absorbed solar radiation must be very nearly balanced by energy radiated back to space in the infrared wavelengths.

Since the intensity of infrared radiation increases with increasing temperature, one can think of the Earth's temperature as being determined by the infrared flux needed to balance the absorbed solar flux.

The visible solar radiation mostly heats the surface, not the atmosphere, whereas most of the infrared radiation escaping to space is emitted from the upper atmosphere, not the surface.

The infrared photons emitted by the surface are mostly absorbed in the atmosphere by greenhouse gases and clouds and do not escape directly to space. The reason this warms the surface is most easily understood by starting with a simplified model of a purely radiative greenhouse effect that ignores energy transfer in the atmosphere by convection (sensible heat transport) and by the evaporation and condensation of water vapor (latent heat transport).

In this purely radiative case, one can think of the atmosphere as emitting infrared radiation both upwards and downwards.

The upward infrared flux emitted by the surface must balance not only the absorbed solar flux but also this downward infrared flux emitted by the atmosphere.

The surface temperature will rise until it generates thermal radiation equivalent to the sum of the incoming solar and infrared radiation.
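The energy balance described in this passage can be checked with a few lines of arithmetic. A minimal sketch, assuming textbook values for the solar constant and albedo (the 30% albedo matches the article; the solar constant is not given in it), and using the purely radiative one-layer atmosphere the text describes:

```python
# Toy version of the radiative balance described above. The "one-layer"
# atmosphere is the purely radiative simplification in the text, ignoring
# convection and latent-heat transport.
SOLAR_CONSTANT = 1361.0  # W/m^2 reaching Earth (textbook value)
ALBEDO = 0.30            # fraction of sunlight reflected back to space
SIGMA = 5.670e-8         # Stefan-Boltzmann constant, W/m^2/K^4

# Averaged over the whole sphere, absorbed solar flux = (1 - albedo) * S / 4
absorbed = (1 - ALBEDO) * SOLAR_CONSTANT / 4

# With no atmosphere the surface radiates straight to space: sigma*T^4 = absorbed
t_eff = (absorbed / SIGMA) ** 0.25  # about 255 K, i.e. -18 C

# One-layer greenhouse: the atmosphere absorbs all surface infrared and
# re-emits half upward, half downward, so the surface must radiate twice
# the absorbed solar flux: T_surface = 2^(1/4) * T_eff, about 303 K.
t_surface = 2 ** 0.25 * t_eff

print(f"No-greenhouse temperature: {t_eff:.0f} K")
print(f"One-layer greenhouse:      {t_surface:.0f} K")
# The observed mean surface temperature, 288 K, sits between the two: real
# convection and evaporation carry some heat past the radiative bottleneck.
```

The gap between the no-greenhouse 255 K and the observed 288 K is the roughly 33°C of warming quoted earlier in this entry; the one-layer model overshoots precisely because it omits the convective and latent-heat transport the text mentions.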

For more information about the topic Greenhouse effect, read the full article at Wikipedia.org.

Animal Families With The Most Diversity Also Have Widest Range Of Size

Climate Warming Affects Antarctic Ice Sheet Stability


A five-nation scientific team has published new evidence that even a slight rise in atmospheric concentrations of carbon dioxide, one of the gases

Flex Appeal: Researchers Create Carbon Nanotube Muscles


Researchers for decades have been developing polymers and other materials they hope to someday use to create artificial muscles that, when given an electrical charge, mimic the real thing more cheaply and effectively than the hydraulic systems and electric motors used today. A group of scientists at the University of Texas at Dallas' Alan G. MacDiarmid NanoTech Institute reports in Science today that they have demonstrated a fundamentally new type of artificial muscle, consisting almost exclusively of carbon nanotubes, which can operate at extremely low temperatures that would cause other artificial muscle systems to freeze, and at very high temperatures that would cause other muscle systems to decompose.

MOON DANCE:


The Hubble Space Telescope captured four of Saturn's moons transiting, or passing in front of their planet, at once. In this February image from Hubble's Wide Field Planetary Camera 2, Saturn's largest moon, Titan, and its shadow are plainly visible across the ringed planet's northern hemisphere. The much smaller Mimas is barely visible as a whitish dot below and to the left of Titan (just above the rings), but can be found next to the more apparent dark shadow it casts across the gas giant. To the left of the image is Dione, which appears to hover above the rings, and Enceladus, the tiny speck at the rings' edge. Hubble's view was made possible by the fact that the plane of Saturn's rings aligns with our line of sight in 2009—the rings will be perfectly "edge-on" on September 4. (At that time, however, Saturn's position relative to the sun will largely preclude observation from Earth.) This alignment takes place every 15 years or so as Saturn makes its 29.5-year orbit around the sun and has not occurred since 1995–1996.

The human brain is on the edge of chaos


Cambridge-based researchers provide new evidence that the human brain lives "on the edge of chaos", at a critical transition point between randomness and order. The study, published March 20 in the open-access

Baby boomlet: US births in 2007 break 1950s record


(AP) -- More babies were born in the United States in 2007 than any year in the nation's history, topping the peak during the baby boom 50 years earlier, federal researchers reported.

Microsoft Hopes To Win Back Browser Market Share With Internet Explorer 8


(PhysOrg.com) -- Microsoft has a lot at stake with the success of Internet Explorer 8, since it lost 7 percent of its browser market to the Firefox, Safari, and Chrome browsers. IE8 has been in release candidate

Carbon Nanotube Artificial Muscles for Extreme Temperatures


(PhysOrg.com) -- Researchers at the UT Dallas Alan G. MacDiarmid NanoTech Institute have demonstrated a fundamentally new type of artificial muscle, which can operate at extreme temperatures where no other

New simulation shows consequences of a world without Earth's natural sunscreen (w/Video)


(PhysOrg.com) -- The year is 2065. Nearly two-thirds of Earth's ozone is gone -- not just over the poles, but everywhere. The infamous ozone hole over Antarctica, first discovered in the 1980s, is a year-round .

Ice Progression: West Antarctic ice comes and goes, rapidly


Researchers today worry about the collapse of West Antarctic ice shelves and loss of the West Antarctic ice sheet, but little is known about the past movements of this ice. Now climatologists from Penn State

Utilization of High Performance Computing


High-capability computing research is performed to support and improve interaction between science observations and science models, along with creating state-of-the-art engineering simulations. Computer scientists and modelers make use of JPL's hardware, middleware, and application resources to help investigate science and engineering questions that could support future NASA missions.

Everyday Science:


* Children come to understand the living world through a complex interplay of informal and formal educational experiences. We are developing synthetic accounts of children's development of biological understanding from their activities in and out of school (cf. folk biology). Also, because children encounter highly discrepant images of science across their activities, we are studying how children come to understand the nature and purposes of scientific inquiry and knowledge (cf. folk epistemology).
* Digital Technologies in Youth Culture: Pervasive digital technologies—like instant messaging, videogames, and cell phones—are being increasingly interwoven into children's everyday activities. We are documenting how such devices are influencing their social practices and development in order to better understand how children learn with and about these digital technologies.
* Everyday Argumentation: We are documenting how children engage in and attend to argument across the settings and activities of their lives. This allows us to understand the various forms and purposes of that argumentation, as well as the social and cognitive competencies children demonstrate in different contexts. By examining the processes and products of everyday argument, we hope to ultimately inform how children can develop scientific argumentation practices.

Science and technology


A brilliant new approach

Light-emitting diodes will transform the business of illumination, especially with new production breakthroughs

Wednesday, March 18, 2009

Science and technology studies


Science and technology studies (STS) is the study of how social, political, and cultural values affect scientific research and technological innovation, and how these in turn affect society, politics, and culture. More than two dozen universities worldwide offer baccalaureate degrees in STS; about half of these also offer doctoral or master's programs.

STS scholars tend to be inspired by one or both of the following:

* The discovery of relationships between scientific and technological innovations and society, from new and revealing perspectives, with the assumption that science and technology are socially embedded.
* Concern over the direction and the risks of science and technology.

History

STS is a new and expanding subject; for example, in 2005, four major U.S. universities announced new STS programs. Like most interdisciplinary programs, it emerged from the confluence of a variety of disciplines and disciplinary subfields, all of which had developed an interest -- typically during the 1960s or 1970s -- in viewing science and technology as socially embedded enterprises.

Early developments

The key disciplinary components of STS took shape independently, beginning in the 1960s, and developed in isolation from each other well into the 1980s, although Ludwik Fleck's 1935 monograph Genesis and Development of a Scientific Fact anticipated many of STS's key themes:

* Science studies, a branch of the sociology of scientific knowledge that places scientific controversies in their social context.
* History of technology, which examines technology in its social and historical context. Starting in the 1960s, some historians questioned technological determinism, a doctrine that can induce public passivity to technological and scientific 'natural' development. At the same time, some historians began to develop similarly contextual approaches to the history of medicine.
* History and philosophy of science (1960s). After the publication of Thomas Kuhn's well-known The Structure of Scientific Revolutions (1962), which attributed changes in scientific theories to changes in underlying intellectual paradigms, programs were founded at the University of California, Berkeley and elsewhere that brought historians of science and philosophers together in unified programs.
* Science, technology, and society. In the mid- to late 1960s, student and faculty social movements at universities in the U.S., UK, and Europe helped to launch a range of new interdisciplinary fields (such as Women's Studies) that were seen to address relevant topics that the traditional curriculum ignored. One such development was the rise of "science, technology, and society" programs, which are also -- confusingly -- known by the STS acronym. Drawn from a variety of disciplines, including anthropology, history, political science, and sociology, scholars in these programs created undergraduate curricula devoted to exploring the issues raised by science and technology. Unlike scholars in science studies, history of technology, or the history and philosophy of science, they were and are more likely to see themselves as activists working for change rather than dispassionate, "ivory tower" researchers.[citation needed] As an example of the activist impulse, feminist scholars in this and other emerging STS areas addressed themselves to the exclusion of women from science and engineering.
* Science, engineering, and public policy studies emerged in the 1970s from the same concerns that motivated the founders of the science, technology, and society movement: A sense that science and technology were developing in ways that were increasingly at odds with the public’s best interests. The science, technology, and society movement tried to humanize those who would make tomorrow’s science and technology, but this discipline took a different approach: It would train students with the professional skills needed to become players in science and technology policy. Some programs came to emphasize quantitative methodologies, and most of these were eventually absorbed into systems engineering. Others emphasized sociological and qualitative approaches, and found that their closest kin could be found among scholars in science, technology, and society departments.

During the 1970s and 1980s, leading universities in the U.S., UK, and Europe began drawing these various components together in new, interdisciplinary programs. For example, in the 1970s, Cornell University developed a new program that united science studies and policy-oriented scholars with historians and philosophers of science and technology. Each of these programs developed a unique identity due to variation in the components that were drawn together, as well as its location within its university. For example, the University of Virginia's STS program united scholars drawn from a variety of fields (with particular strength in the history of technology); however, the program's teaching responsibilities -- it is located within an engineering school and teaches ethics to undergraduate engineering students -- mean that all of its faculty share a strong interest in engineering ethics.

The "turn to technology"

A decisive moment in the development of STS was the mid-1980s addition of technology studies to the range of interests reflected in science studies programs. During that decade, two works appeared in close succession that signaled what Steve Woolgar was to call the "turn to technology": Social Shaping of Technology (MacKenzie and Wajcman, 1985) and The Social Construction of Technological Systems (Bijker, Hughes et al., 1987). MacKenzie and Wajcman primed the pump with a highly readable collection of articles attesting to the influence of society on technological design. In a seminal article, Trevor Pinch and Wiebe Bijker attached all the legitimacy of the sociology of scientific knowledge to this development by showing how the sociology of technology could proceed along precisely the theoretical and methodological lines established by the sociology of scientific knowledge. This was the intellectual foundation of the field they called the social construction of technology.

The "turn to technology" helped to cement an already growing awareness of underlying unity among the various emerging STS programs. More recently, there has been an associated turn to materiality, whereby the socio-technical and material co-produce each other. This is especially evident in work in STS analyses of biomedicine (such as Carl May, Nelly Oudshoorn, and Andrew Webster).

Professional associations

Founded in 1975, the Society for Social Studies of Science initially provided scholarly communication facilities -- including a journal (Science, Technology, and Human Values) and annual meetings -- that were mainly attended by science studies scholars, but the society has since grown into the most important professional association of science and technology studies scholars worldwide. The society's members also include government and industry officials concerned with research and development as well as science and technology policy; scientists and engineers who wish to better understand the social embeddedness of their professional practice; and citizens concerned about the impact of science and technology in their lives. Proposals have been made to add the word "technology" to the association's name, thereby reflecting its stature as the leading STS professional society, but there seems to be widespread sentiment that the name is long enough as it is.

In Europe, the European Society for the Study of Science and Technology (EASST) was founded in 1981 to stimulate communication, exchange and collaboration in the field of studies of science and technology.

Founded in 1958, the Society for the History of Technology initially attracted members from the history profession who had interests in the contextual history of technology. After the "turn to technology" in the mid-1980s, the society's well-regarded journal (Technology and Culture) and its annual meetings began to attract considerable interest from non-historians with technology studies interests.

Less identified with STS, but also of importance to many STS scholars in the US, are the History of Science Society, the Philosophy of Science Association, and the American Association for the History of Medicine. In addition, there are significant STS-oriented special interest groups within major disciplinary associations, including the American Anthropological Association, the American Political Science Association, and the American Sociological Association.