Since 2006, beekeepers in Europe and North America have been reporting mysterious mass die-offs of honeybees. Although such die-offs have sometimes afflicted beekeepers in the past, the scale of beehive failures after 2006 was unprecedented: bee populations crashed around the world. Since bees are directly responsible for pollinating a huge variety of domestic crops (particularly fruits and nuts), the threat to our food supply and agricultural base extended far beyond the honey production which people associate with bees.

An entire community of free-wheeling apiarists came into the limelight. For generations these mavericks would load up their trucks with hives of bees and drive to orchards in bloom. For the right…honorarium…they would release the bees to pollinate the almonds, broccoli, onions, apples, cherries, avocados, citrus, melons, etcetera etcetera which form the non-cereal base of the produce aisle (as an aside, I find it fascinating that there is a cadre of people paid to help plants reproduce by means of huge clouds of social insects; if you tried to explain all this to extraterrestrials, they would shake their heads and mutter about what perverts earthlings are).
As bees have declined, honey has naturally become more expensive, but so too have a great many other agricultural staples. Not only has the great dying hurt farmers and food shoppers, it has also affected entire ecosystems, perhaps altering them for many years to come. “Pollinator Conservation” (an article from the Renewable Resources Journal) notes that “Cross-pollination helps at least 30 percent of the world’s crops and 90 percent of our wild plants to thrive.”
Scientists have been rushing to get to the bottom of this worldwide problem, pointing fingers at varroa mites (invasive parasitic vampire mites from China), pesticides, global warming, transgenic crops, cell phone towers, habitat destruction, and goodness knows what else. The lunatic fringe has leaped into the fray with theories about super bears, aliens, and Atlantis (although I could add that sentence to virtually any topic). So far no theory has proven conclusive: exasperated entomologists have been throwing up their hands and saying maybe it’s a combination of everything.

An extremely cool illustration from Bayer (the original inventor/patent holder of the compound) of imidacloprid acting on insect nerves
Yesterday (March 29th, 2012) two studies released in “Science” magazine made a more explicit link between colony collapse and neonicotinoid insecticides. The first study suggested that colonies exposed to imidacloprid (one of the most widely used pesticides worldwide) produced 85% fewer queens than the control colonies. The second study tracked individual bees with radio chips (!) to discover that bees dosed with thiamethoxam were twice as likely to suffer homing failure and not return to the hive. Suspicion has focused on neonicotinoid poisons as a culprit in colony collapse disorder for years (the compounds were hastened into use in the nineties because they were so benign to vertebrates); however, the rigorously reviewed and carefully controlled studies in “Science” bring an entirely new level of evidence to the problem. Unfortunately this also brings a new variety of problems to the problem, since neonicotinoids are tremendously important to agriculture in their own right (sorry Mother Earth) and since they are such handy poisons for, you know, not killing us and our pets and farm animals.
A few weeks ago Ferrebeekeeper featured a post about belemnites, extinct cephalopods from the Mesozoic which teemed in immense schools through the reptile-haunted oceans of that bygone era. Yet belemnites were certainly not the only cephalopods which swam in the Mesozoic seas. Numerous shelled cephalopods, the ammonites, were widespread in every sort of marine habitat. Ammonites are personal favorites of mine, so I am not going to write a comprehensive explanation/description of the subclass. Instead I wish to give you an idea of how big ammonites could get by providing a few pictures of large ammonite fossils which have been discovered. Imagine these monsters jetting through the water, their huge tentacles trailing and their big intelligent eyes scanning for giant predatory reptiles, and you will have a better idea of the Mesozoic oceans!
To balance yesterday’s post about the dog star, today we feature three whimsical cat paintings by Tokyo-born surrealist Tokuhiro Kawai. I am calling Kawai a surrealist, but perhaps it would be more accurate to call him a painter of fantastical narratives: all of his works seem to have some sort of magical fairy-tale story behind them. Although the three monarchical cats shown here are lighthearted, some of Kawai’s other paintings are much more melodramatic and feature fearsome conflicts between devils, angels, and heroes.
Each of these paintings features a Scottish Fold housecat either wearing a crown or being ceremonially crowned. The little black and white cat is so self-assured and regal that we hardly wonder at its elevation to the throne. With broad gleaming eyes and fur that looks as though the viewer could touch it, the cat seems real. One wonders if perhaps it belongs to the artist.
Kawai has a particular gift for painting animals, and many of his compositions are filled from top to bottom with flamingos, foxes, owls, ammonites, and pelicans. Cats seem to be his favorite, and they are pictured as conquerors, tyrants, and gods: in one of his pictures a feisty cat has killed an angel as though it were a songbird and is holding the limp corpse in its fangs while standing like a stylite atop a classical column. Fortunately the cat in these three paintings does not seem as violent. The little kitty is clearly dreaming about the trappings of power, about what it would be like to wield absolute authority and be pampered all day. Knowing my own pet housecat’s personality, I believe that such an interpretation of feline psychology is not entirely a stretch.
The brightest star in the night sky is Sirius. Only 8.6 light years from Earth, Sirius A is about 25 times more luminous than the sun. Because of its brightness, the star was well known in ancient times: it was named Sopdet in Ancient Egypt and it was the basis of the Egyptian calendar. After a 70-day absence from the skies, Sirius (or Sopdet) first became visible just before sunrise near the summer solstice, just prior to the annual Nile floods. Greek and Roman astronomers philosophized and speculated about Sirius (which they called “the dog star” because it is the chief star of the constellation Canis Major). Arabs knew the star as Aschere, “the leader.” Polynesians used it as a principal focus of their astonishing oceanic navigation. Over countless millennia, Sirius has worked its way deep into human consciousness as one of the immutable landmarks of the night sky.
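Those two figures alone account for the star's preeminence. Here is a quick back-of-the-envelope Python sketch (the sun's absolute magnitude and the light-year-to-parsec conversion are standard textbook constants I have supplied, not figures from this post) showing that a star 25 times as luminous as the sun, seen from 8.6 light years away, comes out at roughly the apparent brightness astronomers actually measure for Sirius:

```python
import math

# Assumed standard constants (not from the post above)
SUN_ABS_MAG = 4.83        # absolute visual magnitude of the sun
LY_PER_PARSEC = 3.262     # light-years per parsec

luminosity = 25.0         # solar luminosities (figure from the post)
distance_ly = 8.6         # light-years (figure from the post)

# Absolute magnitude from luminosity, then apparent magnitude from distance
abs_mag = SUN_ABS_MAG - 2.5 * math.log10(luminosity)
distance_pc = distance_ly / LY_PER_PARSEC
apparent_mag = abs_mag + 5 * math.log10(distance_pc / 10.0)

print(f"Estimated apparent magnitude: {apparent_mag:.2f}")
# Prints roughly -1.6; the measured value for Sirius is about -1.46,
# brighter (more negative) than any other star in the night sky.
```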
So imagine the shock when it was discovered that Sirius is not alone. The bright star we know is actually Sirius A, a star with twice the mass of the sun. In 1844 the German astronomer Friedrich Bessel hypothesized a tiny companion for Sirius based on irregularities in the star’s proper motion. Then, in 1862, as the American Civil War was being fought, the American telescope maker Alvan Graham Clark first observed the tiny companion, Sirius B (thereafter affectionately known as “the Pup”). Sirius B has nearly the same mass as the sun (0.98 solar masses) but it is only 12,000 kilometers (7,500 miles) in diameter, roughly the same diameter as Earth.
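Those two numbers imply a staggering density. As a minimal back-of-the-envelope sketch (the value of the solar mass is a standard constant I have supplied, not a figure from this post), squeezing 0.98 solar masses into a 12,000-kilometer sphere works out to roughly two metric tons per cubic centimeter:

```python
import math

SOLAR_MASS_KG = 1.989e30            # assumed standard value, not from the post

mass_kg = 0.98 * SOLAR_MASS_KG      # mass of Sirius B (figure from the post)
radius_m = 12_000e3 / 2.0           # 12,000 km diameter (figure from the post)

volume_m3 = (4.0 / 3.0) * math.pi * radius_m ** 3
density_kg_m3 = mass_kg / volume_m3

print(f"Mean density: {density_kg_m3:.2e} kg/m^3")
# Prints roughly 2e9 kg/m^3, i.e. about two metric tons per cubic centimeter:
# a sugar-cube of white dwarf material would outweigh a car.
```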
Today Sirius B is the closest white dwarf star to planet Earth. However, it has not always been so: Sirius B began its life as a luminous blue B-type main sequence star with a mass five times that of the sun. About 124 million years ago, as the dinosaurs grazed on the first magnolias, Sirius B fused its way through the hydrogen in its core. As Sirius B began fusing helium into heavier elements like carbon and oxygen, it expanded into a red giant star with a diameter 10 to 100 times that of the sun. Then Sirius B ran out of nuclear fuel. Without the heat generated by nuclear fusion to support it, the star underwent gravitational collapse and shrank into a hyperdense white dwarf. These tiny stars are extremely dense and hot when they are formed, but since they generate no new energy their heat gradually radiates away over billions of years until the stars are completely black and dead.
Although Sirius B is largely composed of a carbon-oxygen mixture, its core is overlaid by an envelope of lighter elements. Hydrogen, being lightest, forms the outermost layer (which is why the little star currently appears uniformly white).
Spring has come early this year and the beautiful tulip-like petals of New York City’s magnolia trees are already beginning to fall into great drifts of white and pink. If you stop and pick up one of the pretty petals from such a pile you will be surprised by its leathery resilience. The durability of magnolia petals is not coincidental: the flowers are different from those of other common flowering trees because magnoliid trees were among the first flowering trees to evolve. The earliest known fossils of such flowers date from the early Cretaceous period, around 130 million years ago. Magnolia petals are tough because they were originally meant to attract the attention of beetles rather than bees (which do not appear in the fossil record until 100 million years ago). Since there were no insects specially adapted to live as pollinators when magnolia-like trees first appeared, the petals and reproductive structures of these first flowering trees had to be robust enough to survive the attentions of hungry, clumsy beetles (a toughness which has been passed on to the modern ornamental trees).

Beetles on a Magnolia flower by Beatriz Moisset http://pollinators.blogspot.com/2011/06/magnolias-and-beetle-pollination.html
Paleobotanists have not yet unraveled the entire history of the evolution of flowering plants (indeed, Charles Darwin called the abrupt appearance of flowers in the fossil record “the abominable mystery”); however, magnolia-like trees appeared long before the great radiation of angiosperms which occurred approximately 100 million years ago. The first magnoliid trees must have seemed tremendously strange: explosions of color and shape surrounded by great, uniformly green forests of gymnosperm trees (like the familiar conifers). Magnolia blossoms betray evidence of their ancient lineage through several “primitive” features: the petals are nearly indistinguishable from the sepals; each flower has many stamens arranged in spiral rows; there are multiple pistils; and all of the stamens and pistils are supported by a “fingerlike receptacle.”
By attracting the attention of animals (either through the colorful appearance and appealing scent of their flowers, or by means of edible nectar and fruit), flowering plants were better able to reproduce themselves. Magnolias spread around the temperate world and began the complicated, interdependent relationship which all sorts of animals (including humans) have with flowering plants.
Today’s post combines two major Ferrebeekeeper topics to get an unexpectedly mild result! I imagined that by combining crowns and serpents I would get some sort of spectacular king cobra or a mythological crown made of golden serpents and rubies, but what turned up instead was the Southeastern crowned snake (Tantilla coronata), a slender, dusky-colored snake with a little sand-colored diadem.
The crowned snake is indigenous to the American Southeast, from southern Virginia down through the Carolinas and Georgia to the northern panhandle of Florida. Unlike the regal snakes of my imagination, the crowned snake is a tiny creature which measures only 15 to 20 centimeters (6 to 8 inches) long and lives on small arthropods like scorpions, spiders, and insects. Although not dangerous to humans or other mammals, the crowned snake possesses an extremely mild venom which it slowly chews into its prey like an old man deliberately eating a biscuit.
Hmm, not what I expected from a crowned serpent!
Here at Ferrebeekeeper we have explored all sorts of meanings of the word “Gothic”: from Gothic painting to Gothic literature to Gothic history to Gothic architecture, tracing the changing history of this word has taken us through thousands of years of western culture (and beyond). Although this blog has already described the Gothic alphabet created by Bishop Wulfila in order to proselytize to the Gothic tribes, the term Gothic has yet another meaning in the context of the Roman alphabet.

Black-letter book hand by Jacobus de Voragine, from his Legenda aurea, 1312; in the British Museum, London
From approximately 1150 until well into the 17th century, Blackletter script was widely used throughout Western Europe. Renaissance humanists from 15th-century Italy onward thought the various Blackletter scripts were barbaric, and they called the fonts “Gothic” to belittle them as un-Roman and hence primitive (the Germans, however, did not share such prejudice, and Blackletter, or Gothic, scripts are still sometimes used for writing and printing German).
The main Blackletter scripts were Schwabacher, Fraktur, and, above all, Textualis (although typographers may disagree with me about this, and indeed about everything I am writing here). Blackletter was developed in a world where ink made of bone soot was cheap but parchment was heavy and expensive. Letters and Lettering by Frank Chouteau Brown describes the development of Blackletter as a pursuit of both beauty and practicality:
The original Gothic letter was a gradual outgrowth from the round Roman Uncial. Its early forms retained all the roundness of its Uncial parent; but as the advantages of a condensed form of letter for the saving of space became manifest, (parchment was expensive and bulky) and the beauty of the resulting blacker page was noticed, the round Gothic forms were written closer and narrower…until a form was evolved in which the black overbalanced the white–the Blackletter which still survives in the common German text of to-day. Thus, though a Gothic letter may not be a Blackletter, a Blackletter is always Gothic, because it is constructed upon Gothic lines.
Thus generations of medieval scholars, scribes, and copyists carefully transcribed the Roman classics by hand into Blackletter manuscripts. When Gutenberg cut the type for his 42-line Bible (an original copy of which I saw last year at the Huntington) he chose Textualis as the form most appropriate for the printed word of God.
I find Blackletter fonts to be extremely beautiful (although I am disquieted by how quickly they edge towards illegibility). In this short article I have not attempted to provide any sort of comprehensive overview of the endless variety of Gothic fonts, but I have tried to include a sampling of these extremely gorgeous alphabets.
There is one addendum to all of this. Nobody seems able to stop calling things “Gothic,” and of course contemporary typeface designers were no exception. Your word processing program probably has a variety of modern sans-serif typefaces named “something-something Gothic” (“Franklin Gothic” being especially popular). These typefaces have nothing to do with Gothic Blackletters but instead are a throwback to true classical Roman and Etruscan letters. Unfortunately, American publishers and designers called the new fonts “Gothic” since that was their term for German alphabets (and it was 19th-century German designers who first introduced some of these letterforms).
The winter is gradually passing away into spring–which should be an exultant season for flower gardeners. Yet the results in my back yard are extremely discouraging because the ferocious squirrels of Brooklyn have eaten all of my crocuses! Despite planting an immense number of the hardy little flowers, I am still bereft of spring color. I guess I should have expected something like this after the infernal bushy-tailed rodents ate all the glass bulbs from the Christmas lights…
As it turns out, squirrels are not the only ones who love the flavor of crocuses. One of the world’s most precious spices is made from the little flowers. The gourmet spice saffron consists of the harvested stigmas of the saffron crocus (Crocus sativus). The saffron crocus has been domesticated since antiquity to provide the costly spice, and the plant literally owes its existence to the human appetite for the powdery threads. Crocus sativus is a descendant of Crocus cartwrightianus, a wild crocus from the rocky scree of the eastern Mediterranean. As humankind selectively planted the plants with the longest stigmas (and hence the most delicious saffron), the little crocus developed into a completely different, and completely dependent, species. Crocus sativus now has magnificent long crimson stigmas, but the artificial selection came at a terrible cost. The plant is a male-sterile triploid, incapable of sexual reproduction thanks to its extra set of chromosomes. Saffron crocuses can only reproduce asexually, and they require human assistance to prosper. The spice is still prohibitively expensive since the little plants must be planted and harvested by hand.
Saffron is known to recorded history as early as the 7th century BC (when it was mentioned in an Assyrian botanical treatise); however, archaeological and genetic evidence suggest that saffron has been harvested for at least four millennia! Since saffron contains over 150 volatile and aroma-yielding compounds, I am not going to try to describe it to you; you’ll just have to get some yourself. My favorite dish which uses the red-gold threads is mussels with saffron, vermouth, and cream!
Of course I am cheating a little bit by writing this article in the spring–the saffron crocus is really an autumn flowering plant. However I felt like my slaughtered crocuses deserved some sort of memorial tribute. Of course if I really wanted to commemorate the slain flowers I could turn to my paint box. In addition to being a spice, saffron is also a color—a deep orangey gold reminiscent of foods prepared with the spice. Strangely, for a color so steeped in the sensory joys of living, saffron has also come to represent worldly renunciation. Buddhist monks wear robes of deep saffron and the top bar of the Indian flag is the same rich orange-yellow. The flag’s designers hoped that the color would inspire India’s leaders to set aside material gains and dedicate themselves to the welfare of the people, but, alas, in all societies such selfless dedication is even rarer than the rarest spice.
Just kidding: aside from zoos and the pet trade, Ireland famously has no snakes. It is one of the few large snake-free landmasses on Earth, joined only by New Zealand, Iceland, Greenland, and Antarctica (well, everywhere far enough north or south is snake-free: the reptiles don’t really thrive in places with permafrost or truly cold winters). Legend has it that it was Saint Patrick who drove the snakes out of Ireland. Standing on a great hill, he lifted up his crosier and focused divine energy upon the unlucky reptiles, which then writhed en masse into the sea and never returned to the emerald island.
It has always been a bit unclear to me why Saint Patrick would do such a thing. Ecosystems which undergo such catastrophic changes tend to go haywire with great alacrity! Fortunately the story is entirely a myth. If snakes ever lived in Ireland (and it doesn’t seem like they did), they were long gone by the time the first Christians showed up. The real reason is even more interesting than the dramatic Moses-like power of Saint Patrick, but, as with most actual answers, it is also more complex.
Evidence suggests that snakes evolved 130 million years ago, during the Cretaceous. At the time Ireland was, um, underwater at the bottom of a warm chalky sea. Early snakes slithered their way across land bridges, rafted to islands on drifting logs, and swam (like the sea snakes) from island to island, but during the Mesozoic there was no Ireland for them to go to.
When the Mesozoic era ended in the great ball of fire, the continents again shifted. Snakes went through a substantial evolutionary radiation during the Miocene, and the original python-like snakes evolved into many different forms. These new varieties of snakes slithered into grasslands, deserts, forests, and oceans around the world, but they still could not get to Ireland (now above the waves) because a cold ocean was in their way. Then came the ice ages. To quote the National Zoo’s essay on “Why Ireland Has No Snakes”:
The most recent ice age began about three million years ago and continues into the present. Between warm periods like the current climate, glaciers have advanced and retreated more than 20 times, often completely blanketing Ireland with ice. Snakes, being cold-blooded animals, simply aren’t able to survive in areas where the ground is frozen year round. Ireland thawed out for the last time only 15,000 years ago.
So Ireland remains snake-free because of the world’s temperamental geology. The island was underwater or covered by ice during the eras when snakes might have arrived; geography has conspired against serpents coming to Eire and setting up shop. The age of humans, however, has been marked by numerous introduced species cropping up everywhere. I wonder how long Ireland will stay snake-free when a pet shop accident or a crazy hobbyist could unleash a plague of serpents on the green island. The fact that such a thing has yet to happen seems almost as miraculous as the original myth.