Sunday, March 30, 2008

Spring Break through the Ages

Here I am in the middle of spring break, that time of year when students and teachers get to take a much-needed break from each other and soak up the springtime sun, if it ever stops raining.

The spring break tradition goes way back in history. In the modern U.S., spring break brings to mind Florida beaches, or California ones for those who live on the West Coast, fun in the sun, too much drinking, and wet t-shirt contests.

The first spring breaks occurred in ancient Egypt when the pharaohs decreed that those learning how to build pyramids could take a week off to recuperate. In 590 BCE, however, the tradition almost ended as soon as it began when unruly slave teens knocked over one of the aqueducts that supplied water to the city, dousing the partying students with gallons of fresh water.

The pharaoh became so enraged that he beheaded the ring leaders and sent the others back to their lessons. It did, however, provide a beginning for the wet t-shirt contests that students often enjoy today.

Some ancient manuscripts, which the Council of Nicaea decided in 325 CE to leave out of the Bible, describe how Jesus gave his disciples a spring break two weeks before the Last Supper. He was disappointed that his students decided to use their break time for recreation instead of meditation. He scolded them upon their return, which left one of them, Judas, rather bitter.

Middle Eastern spring breaks went out of style during the third century CE as it became apparent that much of the Middle East does not actually have a spring.

The tradition continued in full force, however, throughout Europe during the medieval period. Feudal schoolchildren relished their time away from learning how to become good peasants. They would often gather in groups of a hundred or more, ransack their lord’s livestock and journey down to the Mediterranean for a splash.

But spring break can sometimes be more than a time for rest and relaxation. Isaac Newton was on spring break from Cambridge in 1666 when, as he relaxed under an apple tree, an apple fell from a branch and struck him on the head, sparking his insight into the law of gravity.

Spring break has also had some very negative consequences throughout history. In March of 1815, for example, Napoleon decided to cancel spring break for his soldiers who were in training for an important battle to come. He felt they needed to concentrate on the war at hand rather than go off partying.

In June of that year, Napoleon’s army was defeated at Waterloo by a Prussian army whose soldiers had all enjoyed spring break earlier in the year. The rested Prussian force, along with Wellington’s allied army, handily defeated Napoleon’s soldiers, many of whom were still bitter about not having been allowed time off in the spring.

Unconfirmed reports also tell how soldiers from the North helped defeat the Rebel forces during the Civil War while vacationing in northern Florida in 1865. The infamous Battle of Spring Break was a turning point in the War Between the States.

As for me, I’m just enjoying my time away from my students, as I’m sure they are all very happy not to be sitting in the classroom today, learning history and science.

Sunday, March 23, 2008

Was Our Easter Dinner Really a Lunch?

Have you ever eaten hotdogs, spaghetti, or pizza for breakfast with ice cream for dessert? Some people may grab whatever is left over in the fridge on their way to work just to get some food into them. But few people actually prepare the above-mentioned foods as their primary breakfast meal. They are better eaten for lunch or dinner.

But if you look at the original Middle English definition of dinner, it actually meant breakfast. The Middle English term was derived from the Old French word disner, which came from the Latin disiūnāre, meaning to break one’s fast.

Of course, today’s word for the first meal of the day makes the point about breaking one’s fast directly: breakfast. But the old French tradition was to have the first meal of the day around nine o’clock, and it was the biggest meal of the day. Later, the most important meal of the day migrated toward the evening hours. This was especially true in America and Australia. So the word dinner simply came to mean the main meal of the day, whatever time you ate it.

But my daughter and I got into an etymological dispute the other day regarding our family’s Easter dinner. It was held at two o’clock in the afternoon, so she insisted that it be called Easter lunch. But in keeping with the original root meaning of the word, I claimed that dinner would be the best description of the meal because it would be the most important meal of that day.

She provided encyclopedic evidence claiming that, although in England dinner is the main meal of the day, in the U.S. it is the meal taken between five and eight in the evening, thus making our two o’clock meal a lunch.

Oh yes, we have heated family debates over such important stuff as word usage and what certain things mean in a historical context. It’s not all about politics or religion in our family.

But since we’re on the subject, let’s not leave out supper. When I was a kid, supper meant the evening meal and dinner was taken at midday. That terminology is still in common usage in the South. But, in traditional southern style, the midday meal usually was the largest meal of the day, eaten between 11:00 AM and noon, before the afternoon chores. The evening meal was lighter in comparison.

In modern usage, a dinner is a more formal meal and is almost always associated with special meals such as those eaten at Thanksgiving, Christmas, and Easter, regardless of the time of day.

Supper comes from the Middle English souper, which derives from sup, meaning a small swallow or mouthful. Our word soup has the same derivation. So supper means a small meal eaten after dinner. And lunch, by way of luncheon, likely traces back to the Middle English nonschench, meaning a noon drink. Lunch is defined as a light meal eaten at midday.

So who won our family debate about Easter dinner? Well, no one really. My daughter still insists that dinner must be eaten after five o’clock, basing her stance on a definition she found in Wikipedia. And although I agree with her that the American norm is to call the evening meal dinner and a midday meal lunch, the importance of the meal takes priority over the time it is eaten, so our Easter feast would be a dinner.

In the end, we all had a good time and were plenty full. And, unlike our traditional Christmas meal consisting of breakfast-style foods eaten at midday, we all agreed that this one was not brunch.

Sunday, March 16, 2008

Kids Find the True Nature of Easter

When I was a kid, Easter was probably my third-favorite holiday, behind Halloween and Christmas. Christmas was, of course, the big one. It meant getting lots of cool toys and candy. Halloween was good because I liked the costumes and it was even more fun to get lots of candy.

Easter was fun because I got to hunt Easter eggs and because, yes, there was even more candy. It wasn’t as cool as Halloween, because Easter happens in springtime. I always liked the fall, even as a kid. And Easter didn’t bring quite as much candy.

I don’t think I ever really believed in the Easter Bunny. I enjoyed the concept, but the idea of a huge rabbit bringing Easter eggs to kids while they slept seemed far-fetched, even when I was six. And I was never sure what rabbits had to do with eggs anyway.

I still liked Easter as an adult with kids, because now I could watch them hunt Easter eggs. I was never good at it. I remember being in many Easter egg hunts as a kid, but I can’t remember ever actually finding any eggs.

Once, I got to go with a group of kids by bus to Indianapolis where we were dumped off at a huge field filled with hard-boiled goodies. When they said go, I, along with a few dozen other kids, scrambled across the field. They were all picking up colorful gems. I found nothing, except one broken egg that someone had spotted before me and decided to leave behind.

So when I had my own kids, they would get together with their cousins, and we made sure that everyone found a fair share of Easter eggs for their baskets.

I always knew the religious significance of Easter. But in church, precious little was ever mentioned about what was most important to me, the candy, the eggs, and the baskets full of goodies. So when I had kids of my own, I decided to find out more about this holiday.

As it turns out, a form of Easter existed long before it became connected to Christianity. The ancient Saxons celebrated the return of spring with an uproarious festival commemorating their goddess of offspring and of springtime, Eastre. When the second-century Christian missionaries encountered the tribes of the north with their pagan celebrations, they attempted to convert them to Christianity. They did so, however, in a clandestine manner, because they wanted their beliefs to infiltrate the pagans slowly. It apparently worked.

So what about the Easter bunny? Did ancient bunnies really lay colorful eggs? Well, of course not. The Easter Bunny is not a modern invention. The symbol originated with the pagan festival of Eastre. The goddess Eastre was worshipped by the Anglo-Saxons through her earthly symbol, the rabbit.

The Germans brought the symbol of the Easter rabbit to America. It was widely ignored by Christians until shortly after the Civil War. In fact, Easter itself was not widely celebrated in America until after that time.

As for the egg, the colored egg symbol also predates modern Easter. The exchange of eggs in the springtime is a custom that was centuries old when Easter was first celebrated by Christians.

From the earliest times, the egg was a symbol of rebirth in most cultures. Eggs were often wrapped in gold leaf or, if you were a peasant, colored brightly by boiling them with the leaves or petals of certain flowers.

Today, it’s good to know the tradition lives on. Children still hunt colored eggs and place them in Easter baskets along with the modern versions of real Easter eggs: those made of plastic or chocolate candy.

And so, the more festive nature of Easter, stripped of the tacked-on encumbrance of religious influence, continues to be carried on by our children, who help put the fun back in an otherwise solemn holiday.

Sunday, March 02, 2008

Future Vision: A Planet-Sized Computer

Imagine what it would be like if every household, every place of business, and every manufacturing plant had to generate its own electricity. What if there were no power companies that provided the juice to run your lights, appliances, and computers?

Every house would have its own generator. Every homeowner would be responsible for its upkeep, and someone in the house would have to keep an eye on the utilization curve to avoid running the generator when it wasn’t necessary. Business and industry would have much bigger headaches running their own power-generating plants.

That’s the way electricity was heading until Thomas Edison and Samuel Insull built the first power grid. The power grid is a huge network of wires and transformers into which power companies pump their electricity. Controllers keep a balance between supply and demand within the grid so the system doesn’t get overloaded.

But if they do their jobs correctly, we don’t have to worry about it. We turn on a light switch and it gets brighter. We pop bread in the toaster and we get toast in three minutes. We turn on our high-definition televisions and see our favorite shows.

But when it comes to computing, everybody is doing it the way we would all be getting our electricity if Edison and company hadn’t developed the power grid. We all generate our own computing power. Every company and household that uses computers, which is the vast majority of us, must purchase its own computer box. And sometimes the size and power of those computers don’t really match our needs.

Nicholas Carr, in a new book, The Big Switch, imagines a world in which computing power is handled more like electricity is handled today, like a utility. When we connect to the Internet, what if our computer were used as part of a giant worldwide network of computers, adding its computing power to all the others? It would be like one huge planet-sized computer, and everyone plugged into it would have a virtual supercomputer, the world’s largest one, at his or her fingertips.

Like the power grid, there would be people in charge of information flow, regulating it for peak times and lull times. Carr even believes that the World Wide Computer, as he calls it, would eventually gain a degree of artificial intelligence on its own.

Of course, Carr warns that the World Wide Computer won’t be the panacea to solve all the world’s problems. Electricity was supposed to do that 100 years ago, but it didn’t happen. It did, however, make our world a much better and more convenient place.

Big companies, for example, may not need to buy their own very expensive supercomputers. They could just purchase whatever computing power they needed from the grid. And individual users would pay a monthly fee determined solely by how much information and computing power they pulled off the World Wide Computer.
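To make that pay-for-what-you-use idea concrete, here is a minimal sketch, in Python, of how a utility-style compute bill might be tallied. Everything in it is invented for illustration; the rates, units, and function name are my own assumptions, not anything from Carr’s book.

    # Hypothetical utility-style billing for the World Wide Computer:
    # like an electric bill, you pay only for what you actually used.
    def monthly_bill(cpu_hours, gb_transferred,
                     cpu_rate=0.08, transfer_rate=0.10):
        """Total charge for one month of metered computing (made-up rates)."""
        return cpu_hours * cpu_rate + gb_transferred * transfer_rate

    # A light user and a heavy user pay very different amounts.
    print(f"Light user: ${monthly_bill(20, 5):.2f}")      # $2.10
    print(f"Heavy user: ${monthly_bill(2000, 500):.2f}")  # $210.00

Just as with electricity, the meter replaces the up-front purchase: nobody has to guess how big a machine to buy.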

There are downsides to having a computer the size of the planet. Imagine if it came down with a mega-virus or something. But if Carr’s vision comes true, computing power, like electricity, will change the world again. And most of the changes will be to make our lives much easier.