The devastation and loss of life from Hurricane Katrina will send this storm into the history books as one of the worst natural disasters ever to strike the United States. It is already being called the worst natural disaster since the 1906 San Francisco earthquake that leveled that city.
It will certainly go into the books as the most destructive hurricane, in terms of monetary damage, in U.S. history, although Hurricane Camille in 1969 was meteorologically a much stronger storm.
Like Katrina, Camille struck the Mississippi coast hard. Unlike Katrina, however, it struck while at maximum intensity. It remains one of the most intense storms ever to hit the U.S., with sustained winds of about 190 miles per hour at landfall. Gusts were estimated at more than 220 miles per hour in places.
But in 1969, the region was relatively sparsely populated. The death toll, estimated to be at least 255, would have been much higher if Camille had hit New Orleans, just to the west.
And that is very nearly what Katrina did. Katrina was a huge storm, with hurricane-force winds extending many dozens of miles from the eye wall. Even so, if it had come ashore only a few miles farther west, New Orleans would have taken the full force of the storm instead of the glancing blow it got.
Even with that close side-swipe, however, New Orleans suffered extreme devastation, and most of it was due to the city's topography. Much of the city lies below sea level, relying on levees to hold back the Mississippi River and Lake Pontchartrain.
When Camille struck the Mississippi coast near Biloxi in 1969, it produced a storm surge of 22 feet. There were reports of people climbing onto their rooftops to escape the encroaching ocean, even as far as two miles inland.
Yet, even knowing that such a scenario would surely repeat itself one day, residents vowed to rebuild. And so they did.
And now the devastation has been repeated with Katrina. But even before the debris had begun to be cleared away, news reports showed despondent residents vowing to stay and rebuild.
Hurricanes Katrina and Camille, along with 1992's Andrew, which like Katrina struck both Florida and the Gulf Coast and left billions of dollars in damage, were all natural disasters. As such, one might believe the extreme devastation could not have been avoided. After all, the technology does not exist now, let alone in 1969, to change the course of a hurricane or to dissipate one.
But much of the loss of life and property could have been prevented.
In 1906, San Franciscans had no idea they were living on a major fault line. But after their rude awakening, they knew the score. They could rebuild, or they could look to build elsewhere. They chose to rebuild.
In 1969, the Gulf Coast along Mississippi had a few scattered towns and small cities. After their destruction by Camille, residents could have moved to safer ground, but they decided to stay put and rebuild.
New Orleans is a major city built below sea level in a region that is prone to hurricane strikes. It would be illogical to expect a major city, especially one with such historic significance, to pack it in and move upstream. But if it did, it would avoid being destroyed again in the future.
Today, million-dollar homes are being built right on top of major fault lines in San Francisco and Los Angeles. People are moving as close as they can to the sea shore in Florida and along the Gulf Coast.
It’s the same everywhere. In Italy, the city of Naples is paying people to move out of the parts of the city that lie in the red zone around Mount Vesuvius. Many are refusing to leave, partly because the rich volcanic soil is so good for growing wine grapes.
I don’t mean to suggest that people who build near the seashore, near volcanoes, or on top of fault lines have it coming. And I certainly don’t mean to be insensitive to the tragedy.
But when people make a conscious decision to live their lives in flood plains, below sea level near the coast, on a fault line, or close to an active volcano, they are gambling with their lives and property. Eventually, they will lose the wager. It’s inevitable.
Last week, nature cashed in on the people of New Orleans and the Mississippi coast.
Wednesday, August 31, 2005
Thursday, August 25, 2005
With Oil Prices So High, It's Alternative Fuels to the Rescue
With the price of crude oil nearing $70 a barrel, forcing gasoline prices to record highs, there are those who wonder if the doom-and-gloom forecasts that say we’ll run out of petroleum sometime between 2012 and 2030 might be coming true even sooner.
One of those gloomy scenarios, the Olduvai theory, was first introduced by Richard Duncan in 1989. It claims industrialized civilization can last no longer than 100 years. Duncan put the starting point of industrialized civilization at 1930.
It predicts modern society will reach an “Olduvai cliff” by the year 2012, with continual rolling blackouts and permanent gas shortages, leading to economic collapse. So far, each pre-cliff event predicted by the theory has come true.
A similar prediction is the Hubbert peak oil theory, although it lacks such exact dates as the Olduvai theory. Introduced by geophysicist M. K. Hubbert, the peak oil theory states that world oil production will form a steep bell-shaped curve, with decline in production mirroring the rapid increase in oil production seen in the twentieth century.
Many oil experts place the year of peak oil production at 2007, although some say it will be next year.
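Hubbert's model treats cumulative production as a logistic, S-shaped curve, which makes the yearly production rate a symmetric bell. Here is a minimal sketch of that idea in Python, using illustrative numbers of my own rather than anything Hubbert or the analysts above actually published:

```python
import math

# Hubbert-style production curve: the derivative of a logistic curve for
# cumulative production. All numbers here are illustrative assumptions,
# not Hubbert's fitted values.
URR = 2000.0       # ultimately recoverable oil, billions of barrels (assumed)
PEAK_YEAR = 2007   # peak year assumed for illustration
STEEPNESS = 0.06   # how sharply production rises and falls (assumed)

def annual_production(year):
    """Billions of barrels per year implied by the logistic model."""
    x = math.exp(-STEEPNESS * (year - PEAK_YEAR))
    return URR * STEEPNESS * x / (1.0 + x) ** 2   # symmetric bell, peaking at PEAK_YEAR

for year in (1970, 1990, 2007, 2030, 2050):
    print(year, round(annual_production(year), 1))
```

With these made-up parameters the curve peaks at about 30 billion barrels a year in 2007 and declines as steeply as it rose, which is the shape, not the exact numbers, that the theory predicts.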
Of course, most of these predictions assume that there will be no new technologies coming down the pike that will create alternatives to petroleum, which is the main source of the world’s energy now and has been for more than 100 years.
But alternative fuel sources have existed for years. Until now, however, they have been too expensive to be taken seriously.
Between 1920 and 1970, the price of oil remained fairly constant, even dropping slightly. It peaked sharply in 1973 and again in 1980. In fact, the 1980 spike was more dramatic than this year’s increase in oil prices.
But on average, the price of oil has remained generally too low to justify any large-scale production of alternative fuels.
During World War II, however, Germany did rely on an alternative method of fuel production. It is called the Fischer-Tropsch process, named for Franz Fischer and Hans Tropsch, the two German chemists who developed it in the 1920s.
The process produces synthetic petroleum by reacting hydrogen gas with carbon monoxide over a catalyst to yield long-chain hydrocarbons, which can then be refined into diesel fuel, kerosene, and gasoline. South Africa also used the process during the apartheid years.
Both South Africa and Germany were cut off from petroleum imports, so they used their vast supplies of coal as the feedstock for the raw ingredients the Fischer-Tropsch process needs. Although expensive, the process supplied those countries with much of the fuel they needed.
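In rough outline, the chemistry runs in two steps. This is a simplified sketch; real plants use iron or cobalt catalysts, carefully controlled temperatures and pressures, and produce a mix of hydrocarbon chain lengths rather than a single product:

Gasification: C + H2O -> CO + H2 (coal plus steam yields synthesis gas)
Fischer-Tropsch synthesis: (2n+1) H2 + n CO -> CnH2n+2 + n H2O (synthesis gas yields chain hydrocarbons that can be refined into diesel and gasoline)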
The world has vast reserves of coal. The U.S. also has rich deposits of coal in many areas, including Indiana.
While crude oil hovered below $25 per barrel, in 1999 dollars, for much of the last century, gasifying coal for use in the Fischer-Tropsch process was not economical. A barrel of oil produced that way would cost about $32.
But now that the price of crude has risen to more than twice that figure, some oil experts, and some politicians, are taking another look at synthetically produced oil.
Last week Gov. Brian Schweitzer of Montana announced a plan that would use Montana’s vast coal reserves to power the United States for the next 40 years. And his plan is attracting the attention of oil analysts from all over the U.S.
Schweitzer claims his plan can produce gasoline for about a dollar per gallon. He said Montana is sitting on more usable energy resources than the whole of the Middle East.
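As a rough back-of-the-envelope check, a standard oil barrel holds 42 gallons, so synthetic crude at the roughly $32 per barrel cited above works out to about 76 cents a gallon before refining, distribution, and taxes. That at least puts the dollar-a-gallon figure in a plausible neighborhood.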
Of course, Montana is not the only state with coal reserves. The main problem, other than the initial investment capital, is that nobody wants a coal mine in his back yard. If we really do elect to use coal gasification to produce synthetic oil, it would mean an increase in strip mining of coal across the country.
But Schweitzer said it could be accomplished without any major environmental impact, at least for the foreseeable future. And if the world is to avoid the Olduvai cliff, we may not have much choice.
Technology Behind Gadgets Enhances Their Enjoyment
Those who know me probably also know that I like gadgets. Yes, I just love my electronic widgets and doodads.
Take for example back in the mid-1980s when the new-fangled stereo component, the CD player, started to hit the market. It didn’t take very long for “record” stores to completely replace their stocks of vinyl albums with the new digital medium, the compact disc. And it didn’t take long for me to own one.
I’m never among the first to jump onto the bandwagon of new devices. I know that first-generation devices are always bare-bones and expensive. I grit my teeth and wait it out until the price comes down and they have worked out the bugs.
But if the technology catches on, I’m usually there for round two.
It was a similar situation when DVDs started to become popular. I love DVDs. I own a DVD player that also plays Super Audio CD and DVD-Audio, an audio format that has four times the resolution of standard CDs. I also own two DVD recorders: one in my computer and one on top of my TV set, where it replaced my aging VCR.
To me, the technology that makes CDs and DVDs work is fascinating. Teenagers today, who may not have ever seen a vinyl record, take the technology for granted. But I know how different a CD is from an old-fashioned record.
The 45-RPM single and the LP were based on exactly the same technology Thomas Edison used when he first recorded his voice by etching a groove into a tinfoil-wrapped cylinder. It is an exceedingly simple concept.
Put a needle in the grooves of a vinyl record, start it rotating, and the grooves cause the needle to vibrate. All you have to do to hear it is amplify it.
In the old days, the amplification came from the needle vibrating a diaphragm connected to the tone arm. The diaphragm's vibrations were amplified by the resonance chamber of the Victrola, as the players were called.
Later, electronic amplifiers connected to speakers took the place of the diaphragm. Finally, high-fidelity, or hi-fi, stereo players became the norm. But even they used Edison's technique of setting a needle into etched grooves that caused it to vibrate.
The CD and DVD work on completely different principles from vinyl records. Both use a laser to read microscopic pits stamped into the metallic layer of a plastic disc. The pits scatter the laser light, while the flat areas between them reflect it onto a sensor. Nothing vibrates at all.
A microcomputer chip converts that series of on-and-off laser signals into the only language a computer understands: 1s and 0s. Vast numbers of these 1s and 0s are then combined to form the code the computer recognizes as a particular sound.
This digital code is then converted into an analog sound signal by another component, the digital-to-analog converter, and is then amplified and sent to the speakers.
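To make that concrete, here is a toy Python sketch of the idea. It is my own illustration, not how any real player's electronics are built: it samples a tone the way a CD does, 44,100 times per second with 16-bit values, then scales those integers back toward an analog-style waveform the way a digital-to-analog converter would before amplification.

```python
import math

# Toy illustration of CD-style digital audio (44,100 samples per second,
# 16-bit values). This is a sketch of the general idea, not the actual
# decoding pipeline inside any particular player.
SAMPLE_RATE = 44_100    # CD sampling rate, samples per second
MAX_16BIT = 32_767      # largest value a signed 16-bit sample can hold

def record_tone(freq_hz, duration_s):
    """Sample a pure tone into 16-bit integers, roughly as a CD stores sound."""
    n = int(SAMPLE_RATE * duration_s)
    return [round(MAX_16BIT * math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE))
            for t in range(n)]

def to_analog(samples):
    """Scale stored integers back to a waveform between -1.0 and 1.0,
    which is roughly the job of the digital-to-analog converter."""
    return [s / MAX_16BIT for s in samples]

stored_samples = record_tone(440, 0.001)   # one millisecond of the note A
waveform = to_analog(stored_samples)
print(stored_samples[:5])
print([round(v, 3) for v in waveform[:5]])
```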
To me, it’s all quite fascinating. It adds an extra dimension to listening to good music. I can now not only sit back and appreciate the high-quality sound of a brilliantly mixed DVD-Audio recording in 5.1 surround sound, but also appreciate the technology that went into making it all happen.
Most people don’t have to think about it; they just listen and enjoy. But, to me, thinking about how the technology works is part of the enjoyment.