While thinking of how to sum up 2011, I looked backwards to my last blog post from 2010 and was jarred by the similarity of the two years.  There it all was again: the same sort of political scandals, the same news of war in the Middle East, the same tedious celebrity hijinks–only the world-shaking environmental catastrophe had changed (the Gulf of Mexico oil spill was supplanted by the Fukushima Daiichi nuclear disaster).  It made me question the optimism of last year’s New Year’s post, in which I ultimately concluded that technology was rolling forward and thereby bringing us both knowledge and the resources needed to live a better, happier life.

So this year I am going to base my final post around the worst thing that happened in 2011: the Fukushima Daiichi nuclear disaster.  This spring, three nuclear reactors on the northeast coast of Honshu melted down after being shaken by an earthquake and inundated by a once-in-a-lifetime tsunami.  The reactors, designed in the sixties and built in the early seventies, were decades out of date.  Early mistakes made by engineers trying to rectify the situation compounded the problem.  This event has already been responsible for several worker deaths (although those occurred not as a result of radiation but rather from disaster conditions caused by the earthquake and flood).  It is estimated that, over the coming decades, fatalities from cancer could ultimately stretch up into the tens or perhaps even the hundreds!

Hindsight is 20/20, but, seriously, was this the best place for a series of fission reactors?

The fear generated by the incident has caused a global anti-nuclear backlash.  Plans for next-generation nuclear plants have been put on hold while existing power plants have been shut down.  Germany is exiting the nuclear energy business entirely.  Japan is building a host of ineffective wind plants and setting its advantages in fission power aside.  Developing nations like India, Brazil, and South Africa are reassessing their nuclear power plans.  The United States is suddenly building more gas power plants.  Even France is backing away from nuclear energy.

Anti-nuclear demonstrators march in Cologne (AP Photo/dapd/Roberto Pfeil)

Of course, cold-blooded, analytically minded readers who missed out on the media circus around the Fukushima incident might be wondering why a few (potential) deaths outweigh the deaths of the 20,000 people killed by the tsunami outright, or the hundreds of thousands of people killed worldwide in traffic accidents, or the millions of victims of North Korean famine.  Those kinds of casualties are all very ordinary and dull, whereas the people who (might possibly) die (someday) from nuclear contamination face a very unusual, rare, and scary end.

Isn’t it worse that ten men might someday die of cancer than that 10,000 men die outright from coal mining accidents?

Well no, not really.  The hype around nuclear accidents was used by fear-mongers to peddle their own energy agenda–on the surface that agenda might seem to be earth-friendly green energy, but since such a thing doesn’t really exist yet, the beneficiaries of nuclear power’s decline will be oil and gas producers, who already operate the largest and most lucrative industry on earth.  The crisis also allowed media sources to garner viewers and readers by means of frightening headlines (in fact, that’s what I’m doing with this post).  The nuclear industry must become bigger to fit the needs of a world running out of fossil fuel (but with a quickly growing population of consumers), and our next generation of technology will likely require more energy rather than less.

Nigerians fight an oil pipeline explosion which burned hundreds of people to death

But, thanks to a disaster involving equipment that was four decades out of date, which killed two people (from blood loss and contusion), humankind is abandoning the pursuit of inexpensive inexhaustible green energy for the foreseeable future.  At best, the next-generation nuclear designs now on the drawing boards or in early stages of construction will be reevaluated and made safer, but at worst we will fall into a long era of dependence on fracked gas and foreign oil–a gray age of stagnation.  Our leaders will greenwash this development by pretending that solar and wind energy are becoming more effective—but so far this has not been true at all.

I hope my flippant tone has not made it seem like I am making light of the tragedy that befell Japan, a peace-loving nation which is an unparalleled ally and friend.  I really am sad for every soul lost to the tsunami and I feel terrible for people who are now forced to live with the nebulous fear of cancer (especially the brave workers who raced into known danger to fix the stricken plant).  Similarly, I worry about the Nigerians burned to death in pipeline accidents, the Pakistanis killed in friendly fire accidents, and the bicyclists run over by minivan drivers.  To care about the world is to worry and face grief.

Tsunami Memorial Stone

But coping with such worries and sadness is the point of this essay.  Our fears must not outweigh our bright hopes.  We must keep perspective on the actual extent of our setbacks and not allow them to scare us away from future progress.  Only bravery combined with clear-headed thought will allow us to move forward.  Undoing this year’s mistakes is impossible, but it is still possible to learn from them and not live in fear of trying again.  I wrote about the energy sector because of its primacy within the world economy—but I dare say most industries are facing such a crisis to one extent or another.

If we turn back or freeze in place, we will be lost–so onwards to 2012 and upward to great things.  And of course happy new year to all of my readers!

[And as always–if you feel I am utterly misguided in my energy policy or any other particular, just say so below.]

I am always frustrated when the “who we lost in 2011” obituary lists come out and they are filled with actors and popular entertainers (although I am rather pleased that this year’s list contained so many despots, terrorists, and mass murderers).

Good riddance!

Although I enjoyed M*A*S*H and Columbo, televised entertainments are not foremost on my list of human accomplishments.  Therefore, here is my (not at all comprehensive) overview of various important people who died in 2011.  I have tried to concentrate on scientists, doctors, and heroes (as I tend to hold them in the highest respect), but some painters, toymakers, and fantasy illustrators crept onto my list thanks to my own professional background.  We will miss these notable people who passed on in 2011:

John “Jack” Ertle Oliver (September 26, 1923 – January 5, 2011) was a geologist who provided scientific data supporting the (then controversial) plate tectonic model of continental drift.

Milton Levine (November 3rd, 1913 – January 16th, 2011) was a toy inventor who created Uncle Milton’s Ant Farm—one of the ultimate fad toys. More than 20 million units were sold during Levine’s lifetime. In 1956, while at a Fourth of July picnic, he became entranced by a mound of ants.  His fascination with the teeming colony of hymenopterans led him to found Uncle Milton’s Toys.

Uncle Milton's Ant Farm (one of 20,000,000)

Frank Buckles (February 1st, 1901 – February 27th, 2011) was the last living American veteran of World War I.  He drove ambulances in the mud of France and was still driving the tractor on his West Virginia farm until he was 103.  He was one of the last survivors of the so-called “Lost Generation,” passing away of natural causes at the age of 110.

Frank Buckles in his World War I Uniform

Simon van der Meer (November 24th, 1925 – March 4th, 2011) was a particle physicist from the Netherlands who shared a Nobel Prize for work that led to the discovery of the W and Z bosons, the particles which carry the weak nuclear force.

Paul Baran (1926 – March 26th, 2011) was a Polish-American engineer who invented packet switching techniques critical to the internet.  He additionally helped develop many other technologies including cable modems, interactive TV, and airport metal detectors.

Baruch Samuel “Barry” Blumberg (July 28th, 1925 – April 5th, 2011) received a Nobel Prize in Medicine for identifying the Hepatitis B virus, for which he subsequently developed a diagnostic test and a vaccine.  He patented his vaccine and then distributed it for free to international pharmaceutical companies (thereby saving millions of people from a life of disease, serious liver complications, and early death).

Baruch Samuel “Barry” Blumberg

Rosalyn Sussman Yalow (July 19th, 1921 – May 30th, 2011) was the second woman to win the Nobel Prize for Medicine, in recognition of her work developing the radioimmunoassay, an in vitro immune assay technique which revolutionized the field of endocrinology.

Lucian Freud (December 8th, 1922 – July 20th, 2011) was a figurative painter who crafted impasto portraits of normal people in anguished poses. His fleshy nudes were so un-erotic and anti-beautiful that they took on their own strange heroic dimension.

Reflection, self portrait (Lucian Freud, 1985, oil on canvas)

Elliot Handler (April 9, 1916 – July 21, 2011) was a toy-maker and businessperson who co-founded Mattel (the “el” stood for Elliot).  He designed or popularized famous toys including Barbie, Burp Gun, Chatty Cathy, and Hot Wheels.

The first Barbie doll shown at New York Toy Fair in 1959.

Gen. John M. Shalikashvili (June 27th, 1936 – July 23rd, 2011) was the first foreign-born soldier to rise up through the American army to become Chairman of the Joint Chiefs of Staff.  His father, Prince Dimitri Shalikashvili (1896–1978), was a Georgian nobleman who served in the army of Imperial Russia before fleeing the Bolshevik Revolution to Poland.

Wilson Greatbatch (September 6th, 1919 – September 27th, 2011) invented the implantable cardiac pacemaker, now carried by countless survivors of heart disease.

John McCarthy (September 4th, 1927 – October 24th, 2011) was a cognitive scientist and computer pioneer who coined the phrase “Artificial Intelligence” in 1956.  He created the LISP programming language.

Lynn Margulis (March 5th, 1938 – November 22nd, 2011) was a cell biologist and philosopher best known for her theory on the symbiotic origin of eukaryotic organelles.  Her contributions were critical to the endosymbiotic theory—the accepted scientific consensus concerning the manner in which certain organelles were formed.  She also helped to formulate the Gaia hypothesis, which posits that all life is linked together as a super-organism.

Darrell K. Sweet (August 15th, 1934 – December 5th, 2011) was a fantasy illustrator famous for providing cover art for novels such as the Wheel of Time series and the Xanth series.

Robot Adept (Darrell K. Sweet, mixed media)

Václav Havel (October 5th, 1936 – December 18th, 2011) was a Czech playwright, essayist, and political dissident who ended up becoming the last president of Czechoslovakia and the first president of the Czech Republic as the iron curtain came down across Europe.  I have a special fondness for Havel since he wrote “The Memorandum,” the first play I acted in during high school.  I played the officious pedant “Lear,” mouthpiece of the latest inane concept sweeping through a hidebound bureaucracy.  I enjoyed the role intellectually but didn’t really get Havel till I grew up and went to work in an office.

Conifers are amazing! Also happy holidays from Ferrebeekeeper.

It is the holiday season and decorated conifers are everywhere.  Seeing all of the dressed-up firs and spruces reminds me that Ferrebeekeeper’s tree category has so far betrayed a distinct bias towards angiosperms (flowering plants).  Yet conifers predate all flowering trees by a vast span of time.  The first conifer fossils we have found date to the late Carboniferous period (about 300 million years ago), whereas the first fossils of angiosperms appear in the Cretaceous (about 125 million years ago), although the flowering plants probably originated earlier in the Mesozoic.

The first known conifer trees resembled modern Araucaria trees.  They evolved from a (now long-extinct) ancestral gymnosperm tree which could only live in warm swampy conditions—a watery habitat was necessary since these progenitor trees coped poorly with dryness and probably relied on motile sperm.  Instead of relying on free-swimming gametes and huge seeds, the newly evolved conifers used wind to carry clouds of pollen through the air and were capable of producing many tiny seeds which could survive drying out.  Because the evergreen cone-bearing trees could survive in drier conditions, the early conifers had immense competitive advantages.  These advantages were critical to survival as the great warm swamps of the Carboniferous dried out.  The continents, which had been separated by shallow oceans and seas, annealed together into the baking dry supercontinent of Permian Pangaea.  In the arid deserts and mountains, the conifers were among the only plants which could survive.

Pay attention to the Trees in this Painting, not the Dinosaurs (art by Jon Taylor)

This ability to live through any condition helped the conifers get through the greatest mass extinction in life’s history—the Permian–Triassic (P–Tr) extinction event (known to paleontologists as “the Great Dying”).  Thereafter, throughout the Mesozoic, they were the dominant land plants (along with cycads and ginkgos, which had evolved at about the same time).  The Mesozoic saw the greatest diversity of conifers ever—the age of dinosaurs could just as well be called the age of conifers.  Huge herds of sauropods grazed on vast swaths of exotic conifers.  Beneath these strange sprawling forests, the carnosaurs hunted, the early birds glided through endless green canyons, and the desperate little mammals darted out to grab and hoard the pine nuts of the time.

The Great Boreal Forests of Canada (photo by Chad Delany)

Although flowering plants rapidly came to prominence towards the end of the Cretaceous and have since become the most diverse plants, today’s conifers are not in any way anachronisms or primitive also-rans.  They still out-compete the flowering trees in cold areas and in dry areas.  Conifers entirely dominate the boreal forests of Asia, Europe, and North America—arguably the largest continuous ecosystem on the planet except for the pelagic ocean.  They form entire strange ecosystems in the Araucaria moist forests of South America—which are relics of the great conifer forests of Antarctica (the southern continent was once a warmer, happier place before tectonics and climate shift gradually dragged its inhabitants to frozen death).

Contemporary Araucaria Forest in South America (photo by Garth Lenz)

The largest trees—the sequoias and redwoods—are conifers.  The oldest trees—bristlecone pines and clonal spruces—are conifers (excepting, of course, ancient clonal colonies such as aspen groves).  Conifers are probably the most commercially important trees since they are fast-growing staples of the pulp and timber industries.  Timber companies sometimes buy up hardwood forests, clear-cut the valuable native deciduous trees, and plant fast-growing pines in their place to harvest for pulp.  In fact, all of the Christmas trees which are everywhere around New York come from a similar farming process.  The conifers are nearly everywhere—they have one of the greatest success stories in the history of life.  It is no wonder they are the symbol of life surviving through the winter to come back stronger.  They have done that time and time again through the darkest and driest winters of the eons.

A Grove of Giant Sequoias (Sequoiadendron giganteum)

In the Northern Hemisphere, today is the first day of winter.  As always, this change of season occurs on the winter solstice, the shortest day of the year.  Last night was actually the longest night of the year—so I suppose we can now look forward to the gradual return of the sun bit by bit (even as the weather worsens toward the true cold of January and February).

To celebrate winter (admittedly my least favorite season), here is a gallery of winter personifications.  Each wears an icy crown and most of them look cold, haughty, indifferent, or cruel.  I am including these ice kings and queens under Ferrebeekeeper’s mascot category even though they are not really cheering for a team or a product.  “Personification” seems close enough to the definition of mascot to ensure that I won’t get in trouble with WordPress (although, as ever, I invite any comments or arguments below).

Snow Queen (by Vladislav Erko)

Winter King and Queen (Source unknown)

Old-fashioned cartoon Ice Monarch

Katy Perry? How did you get in my blog and why are you dressed as queen of winter?

Ded Moroz (Дед Мороз), “Grandfather Frost,” plays a similar gift-giving role to Santa Claus in Slavic cultures

Title Character from “The Snow Queen” by Birmingham Repertory Theatre

The Ice King, a villain from “Adventure Time” on Cartoon Network

The Ice Princess Tatiana from Dolphin Mall in Miami.  She looks like she’s saying “I don’t know where you parked your car.”

Tilda Swinton as the White Witch of Narnia (possible pretender to the throne of winter)

“snow king” card

A snow queen Halloween costume available for sale online

From the Illamasqua Art-of-Darkness makeup collection (click for link)

I would hang around and make some funny comments about all of the monarchs of winter, but all of the white hair and piercing eyes are starting to weird me out a little (to say nothing of Katy Perry’s vacuous stare).  Have you ever noticed how summer, spring, and fall are not represented as maniacal tyrants with wicked crowns?  I’m looking forward to getting back to those other seasons.  In the meantime, have a wonderful winter!

The Piraiba (Brachyplatystoma filamentosum)

Today features a short post concerning one of the strangest-looking groups of catfish—which is truly saying something, since the entire order of catfish appears rather odd.  Brachyplatystoma is a genus of catfish from central South America which includes the largest catfish from that continent, Brachyplatystoma filamentosum, the so-called Goliath catfish or Piraiba, which is capable of reaching up to 3.6 metres (12 ft) in length and can weigh up to 200 kg (440 pounds).  The Piraiba is hunted for food and sport, both with hooks and with harpoons.  All Brachyplatystoma catfish are swift, sleek fish which live by hunting, but whereas the other species mostly hunt fish, the Piraiba has been known to eat primates.  Specimens have been found with monkeys in their digestive systems, and attacks on humans are darkly rumored (although ichthyologists scoff that the mighty fish only scavenges the remains of such terrestrial animals).

Brachyplatystoma capapretum (photo by Enrico Richter)

The other Brachyplatystoma catfish species are smaller than the giant Brachyplatystoma filamentosum, but they all have the elongated flattened nose which characterizes the genus.  One of these species, B. tigrinum, has especially lovely stripes.  Although an unusual fish, it is caught in sufficient quantities to be available in specialty stores for home aquariums, where its long nose, pretty stripes, and interesting behavior fetch a premium price.

Brachyplatystoma tigrinum (Zebra shovelnose catfish)

Brachyplatystoma tigrinum

When I was a boy, I was wandering around in my grandfather’s storage shed when I found a ragged hand-woven sack filled with mystery blobs.  These powdery golden-orange nuggets were hard (albeit slightly gummy) and they had a translucent glow.  When I inquired about the alien substance, my grandfather pulled a glowing ember from the fire and set one of the weird nuggets on top of the hot coal.  Immediately an aromatic cloud of smoke welled up from the lump and filled the room.  The glorious smell was simultaneously like lemon and pine but with deep strange medicinal undertones of cedar and some unidentifiable otherworldly spiciness.  It was transfixing.  The blobs were frankincense, obtained in Somalia during the fifties (my grandparents and mother and uncle were living there on diplomatic/government business).  The unprepossessing amber lumps turned out to be the incense of kings and gods.

Frankincense

Frankincense has been harvested from the arid deserts of Southern Arabia and Northeast Africa since prehistoric times.  The hardened resin, which is also known as olibanum, is the product of tiny scrub trees of the genus Boswellia.  The sacred frankincense tree, Boswellia sacra, produces an especially fine grade (although the same tree can produce different grades of frankincense depending on the time of year).

A Boswellia Tree (Frankincense Tree)

Frankincense trees are tough, capable of surviving on misty breezes from the ocean, rocky limestone soil, and little else.  Certain species of Boswellia trees are able to produce a disk-like bulb which adheres to sheer rock.  The trees can thereby cling to boulders and cliffs in severe windstorms.  The incense is harvested by carefully scraping a wound in the tree’s bark and then returning later to collect the hardened resin (although such mistreatment is said to gradually diminish the trees).

For countless centuries, bags of frankincense and other aromatic resins were the chief trade products of the southern Arabian Peninsula (in what are today Yemen and Oman).  These compounds were of tremendous importance to the ancient Egyptians for both cosmetic and funerary purposes.  In Biblical times, incense was traded throughout the Middle East and in the classical Greco-Roman world.  The fragrant resins were even exported to ancient India and dynastic China, where they became part of traditional medicine and ritual.

An earthenware censer with lead glaze from the Eastern Han dynasty, 25-220 AD

This incense trade was allegedly centered in the quasi-mythical lost city of Iram of the Pillars.  In this oasis at the edge of the Rub’ al Khali dunes, the Ubar people dwelled in a beautiful columned city.  According to the Quran, the city of Iram met an apocalyptic doom when its ruler, King Shaddad, defied the warnings of the prophet Hud.  Shaddad’s impiety caused Allah to smite the entire region into the sands.  All of this was regarded as mythology until space-based imaging systems (including Landsat, SPOT, and shuttle data) revealed that ancient caravan trails did indeed center on a collapsed oasis.  It is speculated that, over millennia, the inhabitants drained the ancient subterranean aquifer, which ultimately caused the ground to collapse—a salutary lesson for the aquifer-based cities of the American West!  Whatever the cause, the frankincense industry contracted greatly around 300 AD, although plenty of resin still went to medicinal and liturgical buyers.

Frankincense is purported to have many pharmacological uses, particularly as an anti-inflammatory agent, an anti-depressant, and an anti-cancer treatment.  Although initial clinical studies of these claims seem encouraging, the safety and efficacy of frankincense are still being tested and reviewed.  Sources on the web suggest that a recent study by Johns Hopkins biologists and doctors from Hebrew University in Jerusalem found that inhaling frankincense incense could alleviate anxiety and depression (but again, my sources are unclear, so don’t run out and start eating frankincense if you are suffering from holiday blues).  Even if frankincense does not provide us with a new class of wonder drugs, it remains useful for deterring insects, including the deadly malarial mosquitoes.  Additionally, as noted above, frankincense smells wonderful.  Maybe you should run to your local caravan and pick some up.

Or wherever you go for incense these days...

A large Victorian gingerbread house created by the Disney Corporation as a centerpiece

Since the winter solstice is only a few days away, now seems like a good time for a festive holiday post to warm up the long cold nights. Long-time readers know about Ferrebeekeeper’s obsession with all things gothic.  To cheer up the dark season here is a post which combines the beauty of gothic architecture with the sugary appeal of candy!

Like gothic art, gingerbread has a very long tradition which stretches back to late antiquity.  It was introduced to Western Europe by Gregory of Nicopolis (Gregory Makar), an Armenian monk and holy man who moved to France in 992 AD.  Whole communities would specialize in gingerbread baking, and nearly every European country developed its own intricate traditions and recipes.  In Germany and Scandinavia it became traditional to make two sorts of gingerbread—a soft gingerbread for eating (which was said to aid digestion) and a hard gingerbread which could be stored or used for building.

Here then is a little gallery of some gothic gingerbread constructions which I found around the web.  They really look too good to eat, but if you are interested in making your own version, the cooks/artists who made the gingerbread cathedral immediately below have also put up an instructional webpage.

Seriously, if you follow that link you can make this!

Another Disney Gingerbread House from the "American Adventure" Pavilion

(Image:thoughtdistillery.com/2004/12/13/74)

Even in sugar, icing, and gingerbread, the beauty of gothic architecture shines through! Best wishes for sweet thoughts and happy dreams as the nights grow long and the wind blows outside the door (unless, of course, you are in the tropics or the southern hemisphere, in which case, can I come stay with you?).

A Young Cheetah Threatens a Hat

Things have been pretty grim here at Ferrebeekeeper lately, what with the inexorable takeover of the labor market by machines, the child-killing Christmas demon Krampus, and the death of the universe.  To cheer things up as we go into the weekend, here is a post about baby cheetahs.  Some people may claim this topic is a cynical attempt to exploit the endearing cubs and drive up ratings.  To those naysayers I respond “baby cheetahs!”

Cheetah Cubs must survive by hiding (image from http://cutearoo.com)

Cheetahs (Acinonyx jubatus) are well known as the fastest land animal–capable of running at blazing speeds of up to 120 km/h (75 mph).  To run at such a velocity the cheetah was forced to forgo some offensive advantages possessed by other comparably-sized cats.  Cheetahs’ jaws are smaller and their claws are permanently fixed in place–which makes their slashing implements shorter and duller than the razor-sharp claws of other hunting cats.  Because they concentrate on running prowess to hunt, they can never risk a sports injury from fighting.  These adaptations make it difficult for mother cheetahs to defend their cubs from predators.  Naturally the tiny cubs cannot rely on the mother cheetah’s best defense—her legendary speed.

A Mother Cheetah with her cubs at Masai Mara National Reserve in Kenya

Female cheetahs gestate for ninety to ninety-eight days and give birth to a litter of 3 to 9 cubs which each weigh 150 to 300 g (5.3 to 11 oz.) at birth.  Since they are so small and slow (and since they impede their mother’s hunting), cubs suffer from high mortality.  Evolution, however, has utilized certain tricks to minimize the danger they face.  Unlike many feline cubs, cheetahs are born already covered with spots.  They are adept from a young age at hiding within thorny scrub.  Additionally the cubs have a remarkable adaptation to aid their defense.  Until they are near maturity, they possess a long punk-rock mantle of downy hair along their necks.  These wild manes act like ghillie suits—breaking up the cubs’ outlines when they are hidden in dense scrub.  The mantles also mimic the Don King-style hair of the honey badger (well known as one of the craziest, bravest, angriest small animals of the savannah).  No animals want to mess with honey badgers, since the angry badgers despise their own lives only slightly less than those of other living things and are thus extremely unpredictable.

Cheetah Cub

Honey Badger

When cheetahs reach adolescence they lose their mantles and acquire their extraordinary speed, but they still have a certain kittenish playfulness.  I was once in the Washington DC zoo on a Sunday morning (when the cheetahs are each given a frozen rabbit as a treat).  The cheetah run in the National Zoo is long and narrow, giving the animals space to build up full speed.  The male adolescent cheetahs were excited for their rabbits.  They were crouching and slinking back and forth faster than most people could run.  One of the adolescent cheetahs got too close to the powerful electric fence surrounding the enclosure and there was a sizzling “pop” as he accidentally touched his delicate nose to the wire.  The young male ran off and, because cheetahs are bred to the bone for the chase, his brother ran after him.  They ran faster and faster, becoming an exquisite blur.  The elegant forms left footprints of fire behind them until the first cheetah came to a (10 meter) sliding stop and emitted an otherworldly angry chirp-yowl.  The spectacle only lasted a moment, but compared to those cheetahs, all other runners I have seen–athletes, racehorses, greyhounds, rabbits–seemed slow and awkward.

Yesterday Ferrebeekeeper described the Luddite movement, an anti-technology workers’ revolt which occurred near the beginning of the Industrial Revolution.  The revolt centered on the idea that labor-saving machines destroy jobs, a concept which economists decry as the “Luddite fallacy.”  Most neoclassical economists believe that, even if machines cause job losses in certain industries, such losses are more than offset by the attendant fall in prices for consumers.  The history of the world since the beginning of the Industrial Revolution has borne this idea out, as more and more goods have become available to wider and wider markets.  The history of first world nations reflects a sort of anti-Luddite narrative: farmers are not needed to plough the land thanks to greater agricultural productivity, so they go to work in factories.  Factories then become more productive thanks to machines and cheap competition, so the factory workers become tertiary sector employees.  The tertiary sector consists of service jobs where employees do not necessarily make or produce anything tangible but instead offer support, experience, or knowledge—for example nurses, lawyers, waste-disposal professionals, casino employees, courtesans, financiers, and the like (some economists posit that there is a quaternary sector of scientists, professors, computer geniuses, artists, and bloggers—the creative sector—but we needn’t get into that here).
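To make the economists’ reasoning concrete, here is a toy calculation (my own illustrative sketch, which assumes constant-elasticity demand and a price that falls in proportion to labor cost; it is not drawn from any particular economics text).  Suppose each worker produces a units of some good, so an industry selling Q units employs L = Q/a workers, and suppose consumers respond to the lower price p with demand elasticity ε:

$$ L = \frac{Q}{a}, \qquad p \propto \frac{1}{a}, \qquad Q \propto p^{-\varepsilon} \quad\Longrightarrow\quad L \propto a^{\varepsilon - 1} $$

On these toy assumptions, better machines (a larger a) increase employment within an industry only when demand is elastic (ε > 1); when demand is inelastic, the Luddites’ fear comes true for that industry, and the displaced workers must move on to another sector, which is exactly the farm-to-factory-to-services progression sketched above.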

Industrial Robots installed at Kia Motors' Slovakian plant by Hyundai Heavy Industries

Since the dawn of the Industrial era, this progression has worked admirably for creating economic progress.  And, during that time, machines have been constantly improving.  The horseless carriage once put horses, hostlers, and livery stables out of work but provided automakers with jobs; then robot arms and mechanized welding units came along to supplant those auto-workers.  The displaced autoworkers all had to go out and become radiologists, actuaries, sex-workers, and restaurateurs.  Now, however, machines are becoming sophisticated enough to invade the tertiary sector.  Subtle computer programs are proving superior to trained (overworked) radiologists at finding the tiniest nascent tumors.  Accountants are being replaced by TurboTax and QuickBooks.  Weird Japanese scientists have built robots which…um, make sushi and pour drinks.  It seems like this trend is going to gobble up a great many service jobs in the near future, from all strata of society.

A world where machines are able to replace white-collar workers would mean the hollowing out of the middle class.  The international corporations and plutocrats making software, robots, and automated factories would become extravagantly rich while the rest of us would have to struggle to find niches the machines haven’t taken over.  A huge economic slump would grip the developed world–as average consumers became unable to buy the goods turned out by those factories.  Hmm, that seems awfully familiar.

"But the Roomba is my friend."

So are the Luddites finally correct?  Should we go out and smash our computers and Roombas?  Well… it isn’t like we can stop what we are doing.  To move forward in science and manufacturing we are going to need better thinking machines.  At some point these machines will be better at thinking than we are…and they will also be better than us at making machines.  That point will be the technological singularity, and it seems that we are on that path, unable to turn back.  Perhaps we will end up with a race of omniscient omnipotent servants (yay!).  Perhaps we will combine with machines and become mighty cyborgs.  Perhaps we will end up as housepets or as a mountain of skulls the robots walk on and laugh at.  I don’t know.  Nobody does.  Yikes!  How did this essay about a nineteenth-century protest movement take us to this destination?

Welcome to Utopia--Beep!

In the meantime, it would be useful if people would talk more about what we want from our technology and how we can get there.  The fact that having better machines is currently splitting society into some dysfunctional Edwardian plutocracy is disquieting.  It means we are not thinking hard enough or using our imaginations.  We should start doing so now…while we are still allowed to!

Ned Ludd was a person with severe developmental problems back in the 18th century, when society lacked effective ways of assisting people with disabilities.  In the cruel parlance of the time he was a “half-wit.”  Supposedly, Ned worked as a weaver in Anstey (an English village which was the gateway to the ancient Charnwood forest).  In 1779, something went wrong—either Ned misunderstood a confusing directive, or he was whipped for inefficiency, or the taunts of the villagers drove him to rage.  He picked up a hammer and smashed two brand-new stocking frames (a sort of mechanical knitting machine used to quickly weave textiles).  Then he fled into the wilderness, where he lived as a free man.  Some say that in the primal forests he learned to become a king.

The Leader of the Luddites (Anonymous, published in 1812 by Messrs. Walker and Knight)

Ned is important not because of his life (indeed, it seems likely that he was not real—or, at best, he was just barely real) but because he was mythicized into a larger-than-life figure around whom the Luddite movement coalesced.  This diffuse social rebellion had some roots in the austere and straitened times of the Napoleonic Wars, but it was mostly a direct response to the first sweeping changes wrought by the Industrial Revolution.  Skilled weavers and textile artisans were aggrieved that machines operated by unskilled workpeople could easily produce much more fabric than trained artisans using traditional methods.  The unskilled workpeople were angry at being underpaid and mistreated in the harsh, dangerous early factories.  This anger was combined with widespread popular discontent about the privations of war and the rapacity of the elite.  Free companies of rebels met and drilled at night in Sherwood Forest or on vacant moorland.  Anonymous malefactors smashed the new machines.  Mills burnt down and factory owners were threatened.  It was whispered that it was all the work of “King Ludd,” whose rough signatures appeared on broadsheets and threatening letters.

The first wave of Luddite rebellion broke out in 1811, centered in Nottingham and the surrounding areas.  It is interesting that the same region which came up with Robin Hood, the hero-thief of folklore, was also responsible for remaking Ned Ludd from a lumpen outsider into a bellicose king of anti-technology.  Disgruntled (male) artisans marched in women’s clothes and called themselves “Ned Ludd’s wives.”  Circulars were addressed from the “king’s” office deep in Sherwood Forest.

The original teasing tone quickly vanished as Luddite uprisings broke out across Northern England in the subsequent months and years until British regulars were sent in to quash them.   For a brief period, there were more redcoats putting down Luddite insurrections in England than there were fighting Napoleon on the Iberian Peninsula. Professional soldiers made short work of the rebels and Parliament hastily enacted a series of laws which made “machine-breaking” a capital crime. A number of Luddites were executed and others were transported to Australia.

Ned Ludd escaped these reprisals by being from a different era (and fictional).  “Luddite” has now become a preferred label for all people who eschew technology.  The half-wit King Ned still lives on in the imagination of people who have lost their jobs to the march of progress and in the nightmares of technophiles and economists.  Indeed, one of the great constructs/truisms of economics is the Luddite fallacy—which holds that labor-saving devices increase unemployment.  Neoclassical economists (who named the concept) assert that it is a fallacy because labor-saving devices decrease the cost of goods—allowing more consumers to obtain them.  However, there are some who believe this has only been the case so far because the machines have not become sophisticated enough.  Once a certain threshold of technology is passed, thinking machines might replace many skilled positions as well as unskilled ones.  This simultaneously awesome and horrifying concept will be our theme tomorrow.
