Saturday, January 30, 2010

Yes, Evolution Really is a Proven Fact

I do not need to be convinced of the fact of evolution. I’ve known that organisms have evolved ever since I took biology in high school all those decades ago. I also know that not everyone shares my conviction that evolution has happened.

Oh, I know, even the most ardent creationist admits that evolution happens up to a point, within a species. But they go on blindly denying that one species can evolve into another.

Although I needed no convincing, I decided to read Richard Dawkins’ newest book on the subject, The Greatest Show on Earth: The Evidence for Evolution. I’m not even completely finished with the book yet, but I have to say, if I were ever a fence-sitter, I would no longer be straddling. I have taught biology in high school and middle school for years, and I know all the standard lines of evidence commonly taught to students. But I must admit I did not know how deep the evidence supporting evolution ran until I started reading the book.

Forget fossils. Yes, they still provide grand evidence supporting evolution. But, as Dawkins states, fossils are neither the only nor the best evidence for evolution. In fact, evolution would be on solid ground even if not a single fossil existed.

Take the DNA evidence, for example. I’ve taught genetics as part of my science classes for as long as I’ve been a teacher, but the story of how DNA provides astonishing evidence for evolution was a revelation to me. I was aware that humans and chimpanzees share 98 percent of their DNA. That is striking enough, but it becomes far more compelling when one compares the similarities in the genomes of several different species of animals that are not closely related.

Currently, scientists have devised a family tree of more than 3,000 species of organisms, including plants, animals, and bacteria. Unlike the trees of life I became familiar with in high school and college, which were based primarily on fossil evidence, this tree is built from DNA sequences. The fossil-based trees are still quite accurate, given that fossils do exist and do provide striking evidence of ancestry. But the family tree Dawkins describes is far more precise.

Creationists are fond of pointing out how statistically unlikely it would be for evolution to take place purely by chance. “It would be like throwing pieces of a watch in a box, shaking it, and having the pieces accidentally assemble into a watch,” I’ve heard it said. Of course those who say this have no clue about how evolution works. It is not driven by pure chance, but by selection pressures - natural selection.
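
The difference between pure chance and cumulative selection is easy to demonstrate. Here is a minimal sketch in Python, in the spirit of the “weasel” program Dawkins described back in The Blind Watchmaker (the target phrase, mutation rate, and brood size here are my own illustrative choices, not anything from the book):

    import random

    # Toy model of cumulative selection: random mutation plus non-random survival.
    TARGET = "METHINKS IT IS LIKE A WEASEL"
    ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

    def fitness(candidate):
        """Number of positions that already match the target phrase."""
        return sum(a == b for a, b in zip(candidate, TARGET))

    def mutate(candidate, rate=0.05):
        """Copy the string, changing each letter with a small probability."""
        return "".join(random.choice(ALPHABET) if random.random() < rate else c
                       for c in candidate)

    # Start from a random string; each generation, breed 100 mutant copies
    # and keep the best one only if it is an improvement on the parent.
    current = "".join(random.choice(ALPHABET) for _ in TARGET)
    generations = 0
    while current != TARGET:
        generations += 1
        best = max((mutate(current) for _ in range(100)), key=fitness)
        if fitness(best) > fitness(current):
            current = best

    print(f"Reached the target in {generations} generations.")

Run it and it typically hits the target in a few hundred generations at most. Pure chance, by contrast, would need on the order of 27 to the 28th power tries, a 41-digit number, to stumble on the same 28-letter string. Selection is the whole ballgame.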

But the family tree of life created by DNA evidence can provide some real statistics. Those statistics show that the similarities in DNA sequences among multiple species can only be the result of shared ancestry. Any other explanation would be so extremely unlikely as to be negligible. In other words, DNA has statistically proven evolution.
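
Dawkins lays out the real statistics; here is just a crude back-of-the-envelope illustration of my own to show the flavor of the argument. If two DNA sequences were unrelated, each letter (A, C, G, or T) would match by chance only about a quarter of the time. The odds of getting 98 percent agreement over even a thousand sites work out like this:

    from fractions import Fraction
    from math import comb

    n = 1000             # sites compared; real genome comparisons span millions
    k = 980              # matching sites, i.e., 98 percent agreement
    p = Fraction(1, 4)   # chance that two random bases happen to agree

    # Exact binomial tail, kept as a rational number to avoid floating-point
    # underflow: the probability of at least k matches out of n by pure chance.
    tail = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    digits = len(str(tail.denominator // tail.numerator))
    print(f"P(98% agreement by chance) is roughly 1 in 10^{digits}")

That comes out to roughly 1 in 10 to the 550th power, for a mere thousand sites. Common ancestry is not just the best explanation; it is the only one left standing.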

But what if you just don’t believe in statistics? That might only be because they tell you something you don’t want to know. Nevertheless, DNA is not the only line of evidence for evolution beyond fossils. Darwin himself realized that the fossil record was incomplete. Although we have vastly more fossil evidence today than Darwin did, the record will never be complete. Fossilization is just too rare an event for that.

But we still have what Darwin had. We have comparative anatomy. Things such as homologous structures and convergent anatomy exist only because species have evolved. It is well beyond the scope of this blog to explain the evidence in detail. Books such as the one by Dawkins do a great job providing all the evidence anyone should ever need.

Suffice it to say that evolution of species through natural selection is now a proven fact. Sure it is still a theory, but only because, like all scientific theories, there is room for improvement in the details. Einstein’s theory of gravity is still a theory, yet we take advantage of it daily when we use our GPS devices. Rocket scientists used its predictions to place satellites in precise orbits. And the theory of evolution has been used extensively to make real-world predictions in medical science, genetics, and paleontology, to name a few.

Researchers in Switzerland have even managed to simulate natural selection using robots. The robots "evolved" over several generations to do their jobs better, to cooperate with each other more, and even to behave civilly toward each other. Researchers programmed the robots to produce random changes in their neural nets, which led to various behavior changes over time. Only the changes that improved performance were kept, just as in natural selection.
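
The news reports I saw didn’t spell out the algorithm, but the loop they describe, random changes kept only when they help, is simple enough to sketch. Here is a toy version in Python; the list of numbers and the scoring function are stand-ins of my own invention, not the actual robot neural nets:

    import random

    # Hypothetical "ideal" controller settings the robots are groping toward.
    TARGET_WEIGHTS = [0.2, -0.7, 0.5, 0.9]

    def performance(weights):
        """Higher is better: negative squared distance from ideal behavior."""
        return -sum((w - t) ** 2 for w, t in zip(weights, TARGET_WEIGHTS))

    # Start with a random controller.
    controller = [random.uniform(-1, 1) for _ in TARGET_WEIGHTS]

    for generation in range(1000):
        # A random change to the controller: the "mutation."
        mutant = [w + random.gauss(0, 0.1) for w in controller]
        # Keep the change only if the robot does its job better: the "selection."
        if performance(mutant) > performance(controller):
            controller = mutant

    print(f"Final score after 1,000 generations: {performance(controller):.6f}")

No foresight, no designer, just variation and selection, and the score climbs toward its maximum all the same.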

There really is no room for any doubt. The fact that 44 percent of Americans doubt evolution is due to the vocal minority of evangelical truth deniers who care nothing for evidence. They have convinced those who were capable of being convinced. But anyone who could be talked into creationism is, by definition, open to persuasion. It is they who really should read Dawkins’ book or others like it so that they can be exposed to the truth, along with the evidence to back it up. That is something the truth-denying creationists can never provide.

Saturday, January 23, 2010

Finders Keepers?

There they were, two folded dollar bills lying flat on the tile floor at McAlister's Deli. I was one table up the aisle from where they lay. And there was a party of four at the table next to the dropped dough, but none of them saw it lying there. There was a party of five at the table across the aisle, and the man on the end was within arm's length of it, but he didn't see it either.

My daughter, sitting in the booth with me and facing away from the money, noticed that people at other tables farther from the cash had spotted it, too. They were all eyeing it.

I kept wondering if anyone was going to get up and make a mad dash for the dough, but everyone showed restraint. It was, after all, only two bucks. But it was a FREE two bucks for anyone willing to bend over for it.

My daughter told me that she has taken a lot of personality quizzes, especially online, and one of the questions is invariably about what to do with found money. Would you keep it? Would you turn it in to the cashier, or at the lost and found, or maybe to the police? Or would you just ignore it, like everyone was doing at the restaurant?

It really depends on the circumstances, I think, and maybe on the amount. If you see a few bucks lying on the ground or on the floor and there is no one around, most people would pick it up and pocket it. I know I would.

If you found, say, $1,000 or even $100 lying around, would you turn it in? That's some serious cash for most folks. Unless I knew, or at least suspected, whom it belonged to, I would pocket it without remorse. Sure, it belonged to someone. And they lost it; I found it. So now it belongs to me.

It's the same thing as when a cashier accidentally gives me back too much money. The store can afford it; it was the cashier's mistake, and I just benefited. It's nothing personal; it's business. Again, I would feel no remorse for keeping the booty.

But two bucks on the floor next to a table in a crowded restaurant called for a more subtle approach. I considered standing up, walking over, and asking the man at the table next to it if he had dropped the money. After all, if I was going to be bending over right next to him, decorum would dictate that I give him a heads-up.

But I was busy finishing my soup and I certainly didn't want to look greedy by making a dash for the money and then returning to my table to finish my meal. If I had already been on my way out and saw it, that would be different.

Still, there was free money lying on the floor no more than a dozen feet from me and I felt compelled to go pick it up. But I showed restraint. I figured if it was still there when I finished I would get it.

Seeing money on the floor and leaving it there is, to me, kind of like losing a sneeze or refraining from popping the bubble wrap. It doesn't really matter that it was only two bucks; it was begging to be picked up.

When I was about 10 years old, I found a dollar lying on the sidewalk near my home. I couldn't wait to spend it. I went down to our local toy store and spent half an hour trying to decide how to spend my new-found treasure. Back then, a dollar would buy some cool stuff.

That memory popped into my head at the restaurant. But two dollars today would have been like finding two bits back in those days. I would still have been happy to pick it up, but it would not have bought many thrills, even for a 10-year-old.

But, alas, I had no excuse for picking the money up off the restaurant floor without looking like a cheap miser. And after about two minutes, though it seemed like forever, one of the servers walked by, bent over, and nonchalantly picked it up. Finders keepers.

Sunday, January 17, 2010

Daily Newscasts have become Daily Fluff-casts

In my last post I complained about how the news media often encourage public misunderstanding of the facts by insisting that both sides of every story be reported, even if one side is based on faulty science. The examples I included were anthropogenic climate change and the theory of evolution. Few legitimate climatologists dispute that humans are affecting the modern climate, yet the truth deniers on the far right get an equal voice in the media. And virtually no biological scientist disputes the fact of evolution, yet nearly half of the general population rejects the science in favor of an ancient creation myth, because evolution’s opponents have an equal voice in the media.

Well, I’m not quite finished ragging on the news media, especially TV news coverage of hard-luck stories. I watch local news coverage on TV almost every day because I still believe the benefits of getting the information outweigh the sensationalism that is so often intermingled with the reporting.

And it’s not just television news that is at fault. Newspapers over the last several years have followed the trend of filling up their front pages with soft news and human interest stories at the expense of real news that I prefer to read about.

Once upon a time, if you opened up a newspaper to read a front-page story, all you really needed to do was read the headline and the first paragraph, and you were well informed of the basic facts. If you wanted more details, you read the rest of the story. These days the headline often sensationalizes a minor point of the story, and the first paragraph reads like the beginning of a novel. To get to the guts of the story, you have to read far down the page.

That style works fine for the entertainment pages or human interest stories that belong on internal sections of the paper, but not for the front page or the main news pages. These days, the front page of some newspapers is more than half filled with a huge color photograph of somebody doing something cute.

On television, stories that might barely qualify for news coverage are often expanded to include interviews not only of the victims but of their neighbors and friends. Sad stories of people’s trials and tribulations dominate, eating up several minutes of air time apiece, highlighted by interviews of people breaking into tears on camera. Local news shows now last 90 minutes, so if there is not enough real news, they fill the airtime with fluff.

An example is a story I remember about a local woman who was partially disabled and who couldn’t afford to pay her heating bill. The story was generally about those who are forced to use space heaters in their homes when their gas has been shut off. Sometimes space heaters cause fires, especially when not used properly. So this poor woman had a single heater that she used to heat her living space. The woman was afraid to go to sleep for fear that the heater would fail and cause a fire.

The underlying story is legitimate. But it could easily have been covered by reading a paragraph about the situation in general while showing images of people, like the woman being featured, huddled around a space heater. But no. They had to interview the woman, and I was forced to listen to her sob story as she broke into tears and the camera cut to the face of the reporter looking all concerned. The story went on like that for several minutes.

Then there are the disaster stories of tornadoes or earthquakes. In my region, a tornado is often the cause of a local disaster. And obviously, if a tornado hits and causes damage, it is worth reporting on local TV news. But most channels don’t stop at simply reporting that a tornado hit, destroyed this building and that storefront, and killed X number of people. They have to go in and find the poor soul whose house was just destroyed and ask him how he feels about the situation. And invariably, the man or woman being interviewed thanks God for allowing them to survive the catastrophe, never mind that their neighbor didn’t.

When I was the news editor of my hometown newspaper, readers could scan the headlines for stories that interested them, read the first paragraph, and be reasonably informed. Those who were interested in the story could read the whole thing. As Sgt. Friday was so often misquoted, “Just the facts, Ma’am.”

That might seem like dry news coverage, but I say let the news itself decide how juicy the coverage is. A tornado is sensational enough; you don’t have to embellish. Besides, there are legitimate places to cover fluff, both in newspapers and on TV news shows. Fluff does not ever have to be added to hard news stories. To me, that’s bad journalism.

Sunday, January 10, 2010

Truth Deniers Given Credibility by Media

A commenter on one of my blog posts lamented, “I believe that the worst thing that has happened in my life, I'm 65, is 24 hour news.” While I disagree with this writer’s opinion on ubiquitous and continuous news coverage, I do wholeheartedly agree with the point he was making: that when you have so much time to tell the news, a lot of what you’re reporting is not really news. It’s opinion, it’s fluff, or it’s just time-killing babble.

And it goes deeper than just being annoying. It often reaches the point where what is reported is a gross exaggeration of the news. This happens when, in an unyielding attempt to be fair and balanced, news producers, directors, and reporters seek the “other side” of what amounts to a simple fact.

Take, as an example, the question of whether or not global warming is at least partially caused by human activity. Almost everyone agrees with the fact that the world has been getting warmer over the past century or so. The sticking point is whether this is part of a natural cycle of change or if it is caused by carbon dioxide and other greenhouse gases produced by human activity.

Climate scientists working for an array of unconnected organizations, such as the National Academy of Sciences, the National Oceanic and Atmospheric Administration, the U.S. Geological Survey, and the Royal Society, along with thousands of peer-reviewed climate studies going back more than 15 years, say that not only is the climate warming, but it is warming due to human influence. In fact, 97 percent of climatologists agree that anthropogenic global warming is real.

Of course, science isn’t ruled by consensus, but by evidence. And the vast majority of the evidence supports the conclusion that humans can and do adversely affect climate and other natural phenomena on a global scale. Consider the hole in the ozone layer, a scare that began in the 1970s. The fear back then was that ozone depletion would result in increasing risks of melanoma and that, if it continued, people would need protection from the sun anytime they went outside.

Conspiracy theorists of the day claimed it was mostly hype and that a call to replace chlorofluorocarbons with less harmful propellants in aerosol cans and in refrigerating units worldwide would be too costly. But that was in the days before 24-hour news networks and the Internet came on the scene. Conspiracy theorists had less of a voice because they had no easy way to propagate their agenda.

In a decade-long campaign to rid the atmosphere of ozone-depleting gases, the U.S. and other industrialized nations banded together and replaced chlorofluorocarbons with innocuous gases. The ozone hole stopped growing, and the ozone layer began to heal. Catastrophe averted.

Global warming is at least as big a problem for the global environment as was ozone depletion. But truth deniers of today have a much bigger platform from which to shout their claims. They have blogs, Internet forums, and numerous 24-hour news channels hungry for something to report.

The Dallas Morning News ran a piece called “Balance of Opinion: The global-warming debate” last month. The headline says it all. When 97 percent of experts agree that something is true, why is there a debate going on at all, let alone one that grabs headlines?

Another example of the news industry’s unrelenting focus on balancing every story with a counterclaim can be seen in the evolution vs. creation debate. Creationists, for the most part, no longer call themselves creationists, because the Supreme Court rightfully ruled that creationism is religion, not science, and therefore cannot be taught in public schools. So instead of creationism, they now support intelligent design “theory.” It’s exactly the same bunk in a shiny new package.

Recent polls say that about 45 percent of Americans believe that life was created in a week by God about 6,000 years ago. In other words, nearly half the population believes that a Bronze-Age fairy tale is literal truth.

Now, this wouldn’t be surprising if it were 1810 instead of 2010. Even most scientists 200 years ago accepted the biblical tale of creation. They had little or no evidence to suggest it wasn’t true back then. But, unlike other scientific theories that are embraced by the general public (gravity, atoms, cells, etc.) the theory of evolution is attacked because it is the theory that is in direct opposition to an Old Testament telling of how we came to be. Besides, most people see humans as a superior species and so have an air of arrogance that won’t let them believe that we share a common ancestor with monkeys.

But the scientific evidence supporting evolution is no less formidable than the evidence supporting the atomic theory. And while polls show lukewarm public support for the theory at best, 97 percent of biologists say they accept evolution as factual. Yes, that is the same percentage as the proportion of climatologists who say global warming is caused by human activity.

Neither of these theories, evolution or anthropogenic global warming, is debated among scientists as being true or false. They are obviously true. Any scientific debate focuses on the minutiae: the details of pinpointing timelines for evolutionary or environmental changes, and the mechanisms of change.

Yet the news media treat the theory of evolution as though there is a great amount of scientific opposition to it. There isn’t. Only those who do not understand evolution are against it. And most truth deniers do not wish to understand what they are against; otherwise they would have to stop denying it.

But, in the interest of being “fair,” news reports often include spurious counterclaims to well-established scientific theories (and in science, a theory is a proposition that is well-accepted and has lots of supporting evidence to back it up).

Would these same news organizations give equal time to the Flat Earth Society members who claim that a spherical earth is just a conspiracy theory? Are there any organized efforts to include a flat earth theory as an alternative to the spherical earth theory that is taught in geography classes in school? No, that would be inane.

It is equally inane to suggest that the universe, and all life in it, was created in six days and that the earth is only a few thousand years old. It is equally inane to subscribe to the contention that humans do not have any effect on global climate.

But until news organizations stop treating truth deniers as equals, they will continue to have a loud global voice.

Sunday, January 03, 2010

Everything's Better with Cheese

I love cheese.

Cheese is one of my favorite foods. Almost any savory food goes better with cheese. Even some desserts go better with soft, mild cheeses. Take cheesecake, for example. Cheese is its prime ingredient.

My favorite types of cheeses are the softer varieties, such as brie. I’m less of a fan of mozzarella. I don’t like pizzas made with whole slices of mozzarella, because the cheese tends to gum up in my mouth, which makes it difficult for me to chew and swallow. That’s why I don’t eat steak; I have a phobia about swallowing anything that doesn’t dissolve away as I chew it. Shredded mozzarella works best on pizza for me.

I love goat cheese. It really perks up the flavor and texture of spaghetti. Brie goes especially well with eggs and smoked salmon. And on a sandwich, it’s hard to beat havarti.

At the Old Spaghetti Factory, they serve several dishes with mizithra cheese. Mizithra with butter sauce over angel hair pasta is one of my favorite dishes in the world.

But out of all the different varieties of cheese that I’ve enjoyed, and there have been lots of them, one of my favorites is plain old American process cheese. Process cheese is a blend of cheeses, usually Colby and cheddar, sometimes with other added ingredients such as salt and whey. It is made with emulsifiers so that it melts more smoothly and doesn’t separate when heated the way most unprocessed cheeses do. For that reason, it is great for cooking or for melting on hot sandwiches, such as grilled cheese sandwiches, and on cheeseburgers.

In the U.S., process cheese is usually called American cheese or pasteurized process cheese food (if there are other ingredients besides cheese in it). It is sold as “cheese slices” in the UK, as “Laughing Cow” in France, and simply as process cheese or processed cheese in some other countries.

I also like cottage cheese, which is the only cheese I know of that is eaten with a fork as a side dish. It consists of a mixture of broken curd and whey.

When I go to the movies, I always order a little tub of cheese dip with my popcorn. I like queso dip and other cheese spreads such as Cheez Whiz. And of course, I like Velveeta, which is classified as a process cheese product rather than a process cheese food because it has less than 50 percent actual cheese.

Finally, one of my favorite cheeses for spreading and dipping is Philadelphia cream cheese. It is also great with scrambled eggs and in omelets.

So why did I deviate from my usual, more controversial topics in writing this blog entry? I was just snacking on one of my favorite cheese snacks, a Mini Bell, along with some sardines, and I started wondering whether other countries sold American cheese as pre-wrapped singles. So I decided to do a little research on the Web. My policy is that if I am going to do research on something, I might as well write about it. So there you go.

Friday, January 01, 2010

It's the Year 2010 - That's Twenty Ten

Happy New Year and welcome to 2010 everyone. And that’s “Twenty Ten” NOT “Two Thousand Ten.” I admit it was easier to say the year 2009 as “two thousand nine,” and all the previous years in the millennium as well. But that habit has got to stop.

“Why is that?” I hear you asking. Well, it’s a matter of syllable economics. We are now a society of texting and tweeting, and we value brevity, especially if nothing is lost by being brief. “Two thousand ten,” or as some say it, “two thousand and ten,” has four or five syllables. “Twenty ten” has but three. It’s easier to say once you get past the habit of saying “two thousand.” We’ve been doing that for ten years now, and that’s enough.

The syllable thing kind of evened out when pronouncing the years of the first decade. For example, “two thousand nine” has the same number of syllables as “twenty-O-nine,” although those who say “two thousand and nine” are still at a disadvantage. But it’s that “O” in the middle that screws things up. It’s not really an “O” but a zero. And nobody uses the word aught anymore. I even heard one news reporter on TV use the phrase “twenty-O-ten” in describing some future plans for our city. That sounds like a bad pun on an Irish name.

But wait; don’t we enumerate other things using the longer terminology? If I were counting the number of words in one of my blogs (a very long blog) and there were 2,010 of them, I would say I typed “two thousand and ten” words. Yes, but you see that little curvy thing after the 2? That’s called a comma. And when we write numbers past 999, we use commas after every third digit from the right. That breaks the number up, so we can say “two thousand and ten” when we are counting items.

Yes, I know our calendar is a way of counting years. But traditionally, there is no comma in year numbers, and by tradition we have never used “thousand” in the pronunciation of any year, with the possible exception of the first 10 years of the second millennium, 1000 through 1009 (although, technically, 1000 was still part of the first millennium; read on).

And it’s not just the sheer number of syllables that matters; it’s how we pronounce them. “Twenty” has two syllables compared to three for “two thousand,” but just listen to how we pronounce “thousand.” The first syllable is a strong diphthong; it’s almost like saying two syllables. And the second syllable has the “n” sound, which is almost always extended. It takes a bit longer to say “and” than it does to say “at,” even though both words are monosyllabic. OK, the first syllable of “twenty” has an “n,” too. Leave me alone; I’m trying to make a point here.

So for the sake of brevity and tradition, and so we won’t end up saying “two thousand and twenty” when 2020 arrives, we need to all correct old habits and start saying “twenty ten,” “twenty eleven,” etc.

Now, what about this decade thing? Is the first decade of the third millennium now history? Well, no, it isn’t, unless you started counting the decade at the year 2000 instead of 2001 as you should have. Back when we started assigning numbers to our years, we used only Roman numerals. There is no zero in Roman numerals, so the first year of the first millennium in the Common Era was “I.” That’s year 1. So counting the first decade, or ten-year period, from year 1, we see that year 10 was the last year of the first decade. Year 11 started the second decade. Continuing on for the next 2,000 years, we see that the third millennium started in 2001, as did its first decade. So the second decade must begin in 2011.
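
For anyone who wants the arithmetic spelled out, a tiny Python sketch makes the boundaries explicit:

    # With no year zero, decade n of the Common Era runs from 10*(n-1)+1 to 10*n.
    def decade_bounds(n):
        return 10 * (n - 1) + 1, 10 * n

    print(decade_bounds(1))    # (1, 10): the very first decade
    print(decade_bounds(201))  # (2001, 2010): first decade of the third millennium
    print(decade_bounds(202))  # (2011, 2020): so the second one begins in 2011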

But, alas, it makes it tougher to label decades that way. We always refer to decades as the twenties or thirties. And sometimes we give them a modifier, such as the Gay Nineties or the Roaring Twenties. But the actual decade of the Roaring Twenties started in 1921. The year 1920 is part of the twenties, but it’s not part of the decade of the twenties. And therein lies the confusion.

I am willing to concede the decade question for the sake of simplicity, and because my opinion on it matters not to the masses of people, and the news media, who insist that we have just begun the new decade of the third millennium. I mean, Tiger Woods was already named Athlete of the Decade even though the decade officially has one more year to go.

Still, for the sake of consistency and clarity, I will let myself be assimilated with regard to the decade matter. Just know that, officially, the first decade of the third millennium is not really over yet.

So with all that said, the next question is what to call the new decade. I guess it makes sense to call it the “twenty tens.” We shouldn’t say “twenty teens” because that leaves out 2010 through 2012. I think the second decade of last century was called the “nineteen tens.”

I still don’t know what to call the first decade. I don’t think anyone else does either. When referring to the years 1900 through 1909, I’ve heard people say “the first decade of the twentieth century.” That’s a bit cumbersome. In our future retrospectives will we refer to the first 10 years of the twenty-first century as just that or will we call them the “twenty-O-ones,” or maybe the “aughts”? None of it sounds all that enticing.

Finally, do we use CE after the year, or AD? It is becoming more common to use CE and BCE to indicate whether a year falls after or before year 1. There was no year zero, as I said; the year we now call 1 BCE came right before the year 1 that began the first decade moving forward. Most people still refer to those earlier years as BC, meaning “Before Christ.” Years after that are designated AD, for the Latin Anno Domini, meaning “in the year of our Lord.”

The problem with that arises for those of us who do not subscribe to Christianity. It’s not 2010 in the year of MY lord, since I don’t claim ownership of a lord. Back when the Christian Church was in charge of all matters political, it could proclaim “our” and have it apply to everyone, whether they wanted it to or not. The tradition has stuck, but the meaning is still offensive to those who claim other religions or no religion at all.

So I use the alternative method, CE and BCE, which mean “Common Era” and “Before Common Era” respectively. The designation is neutral with regard to religion and should be adopted by everyone, because it offends no one (except perhaps the fundamentalist Christians, who are quite easy to offend).

So it is now the year 2010 CE. That’s “Twenty Ten.” And it is now the decade of the Twenty Tens. I hope it’s a good one for all of us.

Oh, and just for the sake of full disclosure, there are 1,233 words in this blog entry. That’s one thousand, two hundred thirty-three.