A commenter on one of my blog posts lamented, “I believe that the worst thing that has happened in my life, I'm 65, is 24 hour news.” While I disagree with this writer’s opinion on ubiquitous, continuous news coverage, I whole-heartedly agree with the point he was making: when you have that much airtime to fill, a lot of what you report is not really news. It’s opinion, it’s fluff, or it’s just time-killing babble.
And it goes deeper than just being annoying. It often reaches the point where what is reported is a gross exaggeration of the news. This happens when, in an unyielding attempt to be fair and balanced, news producers, directors, and reporters seek the “other side” of what amounts to a simple fact.
Take, as an example, the question of whether global warming is at least partially caused by human activity. Almost everyone agrees that the world has been getting warmer over the past century or so. The sticking point is whether this is part of a natural cycle of change or whether it is caused by carbon dioxide and other greenhouse gases produced by human activity.
Climate scientists working for an array of unconnected organizations, such as the National Academy of Sciences, the National Oceanic and Atmospheric Administration, the U.S. Geological Survey, and the Royal Society, along with thousands of peer-reviewed climate studies going back more than 15 years, say that not only is the climate warming, but it is warming due to human influence. In fact, 97 percent of climatologists agree that anthropogenic global warming is real.
Of course, science isn’t ruled by consensus, but by evidence. But the vast majority of the evidence supports the notion that humans can and do adversely affect climate and other natural phenomena on a global scale. Consider the depletion of the ozone layer that came to public attention in the 1970s. The scare back then was that ozone depletion would bring rising rates of melanoma and that, if it continued, people would need protection from the sun anytime they went outside.
Conspiracy theorists of the day claimed it was mostly hype and that replacing chlorofluorocarbons with less harmful propellants in aerosol cans and refrigeration units worldwide would be too costly. But that was before 24-hour news networks and the Internet came on the scene. Conspiracy theorists had less of a voice because they had no easy way to propagate their agenda.
In a decade-long campaign to rid the atmosphere of ozone-depleting gases, the U.S. and other industrialized nations banded together and replaced chlorofluorocarbons with more benign gases. The ozone layer slowly began to heal. Catastrophe averted.
Global warming is at least as big a problem for the global environment as was ozone depletion. But truth deniers of today have a much bigger platform from which to shout their claims. They have blogs, Internet forums, and numerous 24-hour news channels hungry for something to report.
The Dallas Morning News ran a piece called “Balance of Opinion: The global-warming debate” last month. The headline says it all. When 97 percent of experts agree that something is true, why is there a debate going on at all, let alone one that grabs headlines?
Another example of the news industry’s unrelenting focus on balancing every story with a counterclaim can be seen in the evolution vs. creation debate. Creationists, for the most part, no longer call themselves creationists, because the Supreme Court rightfully ruled that creationism is not science but religion, and therefore cannot be taught in public schools. So instead of creationism, they now support intelligent design “theory.” It’s exactly the same bunk in a shiny new package.
Recent polls say that about 45 percent of Americans believe that life was created in a week by God about 6,000 years ago. In other words, nearly half the population believes that a Bronze-Age fairy tale is literal truth.
Now, this wouldn’t be surprising if it were 1810 instead of 2010. Even most scientists 200 years ago accepted the biblical tale of creation; they had little or no evidence to suggest it wasn’t true. But unlike other scientific theories the general public embraces (gravity, atoms, cells, etc.), the theory of evolution is attacked because it stands in direct opposition to the Old Testament telling of how we came to be. Besides, most people see humans as a superior species, and that air of arrogance won’t let them believe we share a common ancestor with monkeys.
But the scientific evidence supporting evolution is no less formidable than the evidence supporting atomic theory. In the same polls that show lukewarm public support for the theory at best, 97 percent of biologists say they accept evolution as factual. Yes, that’s the same percentage as the climate scientists who say global warming is caused by human activity.
Neither of these theories, evolution or global warming, is debated among scientists as being true or false. They are obviously true. Any scientific debate focuses on the minutiae: pinpointing timelines for evolutionary or environmental changes and working out the mechanisms of change.
Yet the news media treat the theory of evolution as though there is a great amount of scientific opposition to it. There isn’t. Only those who do not understand evolution are against it. And most truth deniers do not wish to understand what they are against; otherwise they would have to stop denying it.
But, in the interest of being “fair,” news reports often include spurious counterclaims to well-established scientific theories (and in science, a theory is not a guess; it is a well-accepted explanation backed by a large body of supporting evidence).
Would these same news organizations give equal time to Flat Earth Society members who claim that a spherical earth is just a conspiracy? Are there any organized efforts to teach flat earth theory as an alternative to the spherical earth in school geography classes? No, that would be inane.
It is equally inane to suggest that the universe, and all life in it, was created in six days and that the earth is only a few thousand years old. It is equally inane to subscribe to the contention that humans do not have any effect on global climate.
But until news organizations stop treating truth deniers as equals, they will continue to have a loud global voice.
Sunday, January 10, 2010
Sunday, January 03, 2010
Everything's Better with Cheese
I love cheese.
Cheese is one of my favorite foods. Almost any savory food goes better with cheese. Even some desserts go better with soft, mild cheeses. Take cheesecake, for example. Cheese is its prime ingredient.
My favorite types of cheese are the softer varieties, such as brie. I’m less of a fan of mozzarella. I don’t like pizzas topped with whole slices of mozzarella, because the cheese tends to gum up in my mouth, which makes it difficult for me to chew and swallow. That’s why I don’t eat steak; I have a phobia about swallowing anything that doesn’t dissolve away as I chew it. Shredded mozzarella works best on pizza for me.
I love goat cheese. It really perks up the flavor and texture of spaghetti. Brie goes especially well with eggs and smoked salmon. And on a sandwich, it’s hard to beat havarti.
At the Old Spaghetti Factory, they serve several dishes with mizithra cheese. Mizithra with butter sauce over angel hair pasta is one of my favorite dishes in the world.
But of all the different varieties of cheese I’ve enjoyed, and there have been lots of them, one of my favorites is plain old American process cheese. Process cheese is a blend of cheeses, usually Colby and cheddar, sometimes with other added ingredients such as salt and whey. It is made with emulsifiers so that it melts more smoothly and doesn’t separate when heated the way most unprocessed cheeses do. For that reason, it is great for cooking or for hot sandwiches, such as grilled cheese sandwiches and cheeseburgers.
In the U.S., process cheese is usually called American cheese, or pasteurized process cheese food if there are other ingredients besides cheese in it. It is sold as “cheese slices” in the UK, under brand names such as “The Laughing Cow” in France, and simply as process cheese or processed cheese in some other countries.
I also like cottage cheese, which is the only cheese I know of that is eaten with a fork as a side dish. It consists of a mixture of broken curd and whey.
When I go to the movies, I always order a little tub of cheese dip with my popcorn. I like queso dip and other cheese spreads such as Cheez Whiz. And of course, I like Velveeta, which is classified as a process cheese product rather than a process cheese food because it contains less than 50 percent actual cheese.
Finally, one of my favorite cheeses for spreading and dipping is Philadelphia cream cheese. It is also great with scrambled eggs and in omelets.
So why did I deviate from my usual, more controversial topics in writing this blog entry? I was just snacking on one of my favorite cheese snacks, a Mini Babybel, along with some sardines, and I started wondering whether other countries sold American cheese as pre-wrapped singles. So I decided to do a little research on the Web. My policy is that if I’m going to research something, I might as well write about it. So there you go.
Friday, January 01, 2010
It's the Year 2010 - That's Twenty Ten
Happy New Year and welcome to 2010 everyone. And that’s “Twenty Ten” NOT “Two Thousand Ten.” I admit it was easier to say the year 2009 as “two thousand nine,” and all the previous years in the millennium as well. But that habit has got to stop.
“Why is that?” I hear you asking. Well, it’s a matter of syllable economics. We are now a society of texting and tweeting, and we value brevity, especially if nothing is lost by being brief. “Two thousand ten,” or as some say it, “two thousand and ten,” has four or five syllables. “Twenty ten” has but three. It’s easier to say once you get past the habit of saying “two thousand.” We’ve been doing that for ten years now, and that’s enough.
The syllable count kind of evened out when pronouncing the years of the first decade. For example, “two thousand nine” has the same number of syllables as “twenty-O-nine,” although those who say “two thousand and nine” are still at a disadvantage. But it’s that “O” in the middle that screws things up. It’s not really an “O” but a zero, and nobody uses the word “aught” anymore. I even heard one news reporter on TV use the phrase “twenty-O-ten” in describing some future plans for our city. That sounds like a bad pun on an Irish name.
But wait; don’t we enumerate other things using the longer terminology? If I were counting the number of words in one of my blogs (a very long blog) and there were 2,010 of them, I would say I typed “two thousand and ten” words. Yes, but do you see that little curvy thing after the 2? That’s called a comma. When we write numbers past 999, we use commas after every third digit from the right. That breaks the number up, so we can say “two thousand and ten” when we are counting items.
Yes, I know our calendar is a way of counting years. But traditionally, there is no comma in year numbers, and by tradition, we have never used “thousand” in the pronunciation of any year, with the possible exception of the first 10 years of the second millennium, 1000 through 1009 (although, technically, 1000 was still part of the first millennium; read on).
And it’s not just the sheer number of syllables that matters; it’s how we pronounce them. “Twenty” has two syllables compared to three for “two thousand,” but just listen to how we pronounce “thousand.” The first syllable is a strong diphthong; it’s almost like saying two syllables. And the second syllable has the “n” sound, which is almost always drawn out. It takes a bit longer to say “and” than it does to say “at,” even though both words are monosyllabic. OK, the first syllable of “twenty” has an “n,” too. Leave me alone; I’m trying to make a point here.
So for the sake of brevity and tradition, and so we won’t end up saying “two thousand and twenty” when 2020 arrives, we need to all correct old habits and start saying “twenty ten,” “twenty eleven,” etc.
Now, what about this decade thing? Is the first decade of the third millennium now history? Well, no, it isn’t, unless you started counting this decade at the year 2000 instead of 2001, as you should have. Back when we started assigning numbers to our years, we used only Roman numerals. There is no zero in Roman numerals, so the first year of the first millennium in the Common Era was “I.” That’s year 1. Counting the first decade, or ten-year period, from year 1, we see that year 10 was the last year of the first decade and year 11 began the second. Continuing on for the next 2,000 years, we see that the third millennium started in 2001, as did its first decade. So the second decade must begin in 2011.
But, alas, it makes it tougher to label decades that way. We always refer to decades as the twenties or thirties. And sometimes we give them a modifier, such as the Gay Nineties or the Roaring Twenties. But the actual decade of the Roaring Twenties started in 1921. The year 1920 is part of the twenties, but it’s not part of the decade of the twenties. And therein lies the confusion.
I am willing to concede the decade question for the sake of simplicity, and because my opinion on it matters little to the mass of people, and the news media, who insist that we have just begun the new decade of the third millennium. I mean, Tiger Woods was already named Athlete of the Decade even though the decade officially has one more year to go.
Still, for the sake of consistency and clarity, I will allow myself to be assimilated with regard to the decade matter. Just know that officially, the first decade of the third millennium is not really over yet.
So with all that said, the next question is what to call the new decade. I guess it makes sense to call it the “twenty tens.” We shouldn’t say “twenty teens” because that leaves out 2010 through 2012. I think the second decade of last century was called the “nineteen tens.”
I still don’t know what to call the first decade. I don’t think anyone else does either. When referring to the years 1900 through 1909, I’ve heard people say “the first decade of the twentieth century.” That’s a bit cumbersome. In our future retrospectives will we refer to the first 10 years of the twenty-first century as just that or will we call them the “twenty-O-ones,” or maybe the “aughts”? None of it sounds all that enticing.
Finally, do we use CE after the year, or AD? It is becoming more common to use CE and BCE to indicate whether a year falls after or before the start of the era. There was no year zero, as I said; the year just before the year 1 that began the first decade was 1 BCE. Most people still refer to those earlier years as BC, meaning “Before Christ.” Years since then are designated AD, short for the Latin Anno Domini, meaning “in the year of our Lord.”
The problem with that arises for those of us who do not subscribe to Christianity. It’s not 2010 in the year of MY lord, since I don’t claim ownership of a lord. Back when the Christian Church was in charge of all matters political, it could proclaim “our” and have it apply to everyone, whether they wanted it to or not. The tradition has stuck, but the meaning is still offensive to those who claim other religions or no religion at all.
So I use the alternative designations, CE and BCE, which mean “Common Era” and “Before the Common Era,” respectively. They are neutral with regard to any religion and should be adopted by everyone, because they offend no one (except perhaps the fundamentalist Christians, who are quite easy to offend).
So it is now the year 2010 CE. That’s “Twenty Ten.” And it is now the decade of the Twenty Tens. I hope it’s a good one for all of us.
Oh, and just for the sake of full disclosure, there are 1,233 words in this blog entry. That’s one thousand, two hundred thirty-three.