Could there be a solution that appeals to vegans, vegetarians and meat lovers alike? It's called the post-animal bio-economy, and among other things it involves growing laboratory meat from stem cells.
A totally painless method for animals, who could at last lead their happy lives without us giving up a steak, a glass of milk or a poached egg. These would not be alternative products, but the same products, developed in a much more sustainable way compared to linear agriculture (which has proven problematic for the amount of land, chemicals, pesticides, energy, water and labor it requires, and for its greenhouse gas emissions).
Take the highly controversial foie gras: in the near future it could be produced from the stem cells gathered from the tip of a duck’s feather. It might seem a bit sci-fi, but the first lab foie gras is already here, and this journalist tasted it.
There are just two obstacles: on one hand, the costs of lab meat are still too high for large-scale production (but this shouldn't take too long to fix); on the other, there's the small detail that this is a cultural, and not just agricultural, revolution. It remains to be seen how traditional farmers will react, and above all whether consumers are ready to try these new cruelty-free products.
The city of Braunau am Inn, in Austria, is sadly known as the birthplace of a certain dictator called Adolf. But it should be remembered for another reason: the story of Hans Steininger, a burgomaster who on September 28, 1567, was killed by his own beard. A thick and prodigiously long set of hair, which turned out to be fatal during a great fire: while escaping the flames, mayor Hans forgot to roll up his 2-meter-long beard and tuck it in his pocket, as he usually did; he tripped on it and fell down the stairs, breaking his neck.
As in the 1500s there was no such thing as the Darwin Awards, his fellow citizens placed a nice plaque on the side of the church and preserved the killer beard, still visible today at Braunau's Civic Museum.
But if you think silly deaths are an exclusively human achievement, hear this: “due to the humidity in its environment and how slowly a sloth moves, plant life will grow in its fur. This, combined with poor eyesight, leads to some sloths grabbing their own arms, thinking it’s a tree branch, and falling to their deaths.” (via Seriously Strange)
Furthermore, there's the genius rodent who slipped into a 155-year-old mousetrap on exhibit in a museum. Slow clap.
You’re always so nervous and depressed, they said.
Why don’t you learn a musical instrument, just to chill out and amuse yourself?, they said.
Serves them right.
The Flying Dutchman of the 20th century was called SS Baychimo, a cargo ship that got stuck in the Alaskan ice in 1931 and was abandoned there. For the next 38 years the ghost ship kept turning up and was spotted on several occasions; somebody even managed to board it, but each time the Baychimo successfully escaped without being recovered. (Thanks, Stefano!)
The terrible story of “El Negro”: when collectors of natural curiosities didn’t just ship animal skins back to Europe from the Colonies, but also the skin of human beings they dug out of their graves during the night.
Since we’re talking about human remains, the biggest traveling mummy exhibit was launched eight years ago (featuring a total of 45 mummies). You never got to see it? Neither did I. Here are some nice pictures.
Japanese aesthetics permeates even the smallest details: take a look at these two pages from a late 17th-century manuscript showing the different kinds of designs for wagashi (typical pastries served during the tea ceremony). Ante litteram food porn.
Some researchers from the University of Wisconsin and the University of Maryland created music specifically designed to be appealing to cats, with frequencies and sounds that should be, at least in theory, "feline-centric". The tracks can be bought here even if, to be honest, my cats didn't seem particularly impressed by the music samples. But then again, those two are fussy and spoiled rotten.
Among the most bizarre museums, there is the wonderful Museum of Broken Relationships. It consists of objects, donated by the public, that symbolize a terminated relationship: the pearl necklace given as a gift by a violent fiancé to his girlfriend, in an attempt to be forgiven for his latest abuse; an axe used by a woman to chop all of her ex-girlfriend's furniture into pieces; the Proust volumes that a husband read out loud to his wife, the last 200 pages still untouched, as their relationship ended before they'd finished reading the book. Well, can a love story ever last longer than the Recherche? (via Futility Closet)
1948, University of Tübingen, Germany.
Zoologist H. M. Peters was frustrated. He was conducting photographic research on the way orb-weaver spiders build their webs, but he had encountered a problem: the arachnids he was studying insisted on performing this task of astounding engineering only at night, in the very early hours of the morning. This schedule, besides forcing him to get up at an ungodly hour, made photographic documentation quite hard, as the spiders preferred to move in total darkness.
One day Peters decided to call on a colleague, young pharmacologist Dr. Peter N. Witt, for assistance. Would it be possible to somehow drug the spiders, so they would change this routine and start weaving their webs when the sun was already up?
Witt had never had any experience with spiders, but he soon realized that administering tranquilizers or stimulants to the arachnids was easier than he thought: the little critters, constantly thirsty for water, quickly learned to drink from his syringe.
The results of this experiment, alas, turned out to be pretty worthless to zoologist Peters. The spiders kept on building their webs during the night, but that was not the worst part of it. After swallowing the medicine, they weren’t even able to weave a decent web: as if they were drunk, the arachnids produced a twisted mesh, unworthy of being photographed.
After this experience, a disheartened Peters abandoned his project.
In Dr. Witt’s mind, instead, something had clicked.
Common spiders (Araneidae) are anything but "common" when it comes to weaving. They build a new web every morning, and if by the end of the day no insect is trapped, they simply eat it. This way, they are able to recycle silk proteins for weeks: during the first 16 days without food, the webs look perfect. When the spider gets really hungry, it begins saving energy by building a wider-meshed web, suitable for catching only larger insects (the spider is in need of a substantial meal).
After all, for a spider the web isn't just a way to gather food, but an essential instrument for relating to the surrounding world. Most of these arachnids are almost totally blind, and they use the vibrations of the strands like a radar: from the perceived movements they can understand what kind of insect just snagged itself on the web, and whether it is safe for them to approach it; they can notice if even a single thread has broken, and they confidently head in the right direction to repair it; they furthermore use the web as a means of communication in mating rituals, where the male spider remains on the outer edges and rhythmically plucks the strings to inform the female of his presence, in order to seduce her without being mistaken for a juicy snack.
During his experimentation with chemicals, Dr. Witt noticed that there seemed to be a significant correspondence between the administered substance and the aberrations the spiderweb showed. He therefore began feeding the spiders different psychoactive drugs and recording the variations in their weaving patterns.
Dr. Witt's study, published in 1951 and revised in 1971, was limited to statistical observation, without attempting to provide further interpretations. Yet the results could lend themselves to a fascinating, if not very orthodox, reading: it looked like the spiders were affected by the drugs in much the same way humans are.
Under the influence of weed, they started building their webs regularly enough, but soon lost interest once they got to the outer rings; on peyote or magic mushrooms, the arachnids' movements became slower and heavier; after being microdosed with LSD, the web's design became geometrically perfect (not unlike the kaleidoscopic visions reported by human users), while more massive doses completely inhibited the spiders' abilities; lastly, caffeine produced out-of-control, schizoid results.
Clearly this "humanized" interpretation is hardly scientific, to say the least. In fact, what really interested Witt was the possibility of using spiders to ascertain the presence of drugs in human blood or urine, as they had proved sensitive to minimal concentrations that could not be instrumentally detected at the time. His research continued for decades, and Witt went from being a pharmacologist to being an authority on spiders. He was able to recognize his little spiders one by one just by looking at their webs, and his fascination with these invertebrates never faded.
He kept on testing their skills in several other experiments, altering their nervous system through laser stimulation, administering huge quantities of barbiturates, and even sending them into orbit. Even in the absence of gravity, in what Witt called "a masterpiece in adaptation", after just three days in space the spiders were able to build a nearly perfect web.
Near the end of the Seventies, Witt discontinued his research. In 1984 J. A. Nathanson re-examined Witt’s data, but only in relation to the effects of caffeine.
In 1995 Witt saw his study come back to life when NASA successfully repeated it, with the help of statistical analysis software: the research showed that spiders could be used instead of mice to test the toxicity of various chemicals, a procedure that could save time and money.
Anyway, there is not much to worry about regarding the fate of these invertebrates.
Spiders are among the very few animals that survived the biggest mass extinction that ever took place, and they are able to withstand atmospheric conditions that would be intolerable to the majority of insects. True rulers of the world for millions of years, they will still be here a long time, even after our species has run its course.
Some days ago I was contacted by a pathologist who recently discovered Bizzarro Bazar, and said she was particularly impressed by the website's "lack of morbidity". I could not help but seize the opportunity to chat a bit about her wonderful profession: here is what she told me about the different aspects of this little-known job, which is all about studying deformity, dissimilarity and death in order to understand what keeps us alive.
What led you to become a pathologist?
When I was sixteen I decided I had to understand disease and death.
The pathologist's work is multifaceted and varied, and mostly carried out on living persons… or at least on surgically removed parts of living persons; but undoubtedly one of the routine activities is post-mortem diagnosis, and this is precisely one of the reasons behind my choice, I won't deny it. Becoming a pathologist was the best way to draw on my passion for anatomy, turning it into a profession, and what's more I would also have the opportunity to exorcise my fear of death by getting accustomed to it… getting my hands dirty and looking at it up close. I wanted to understand and investigate how people die. Maybe part of it had to do with my visual inclination: pathology is a morphologic discipline which requires sharp visual memory and attention to macroscopic and microscopic details, to differences in shape, to nuances in color.
Is there some kind of common prejudice against your job? How did you explain your “vocation” to friends and relatives?
Actually the general public is not precisely aware of what the pathologist does, hence a certain morbid curiosity on the part of non-experts. Most of them think of Kay Scarpetta, from Cornwell's novels, or CSI. When people asked me about my job, at the beginning of my career, I gave detailed explanations of all the non-macabre aspects of my work, namely the importance of a histological diagnosis in oncology, in order to plan the correct treatment. I did this to avoid a certain kind of curiosity, but I was met with puzzled looks. To cut a long story short, I would then admit: "I also perform autopsies", and eventually there was a spark of interest in their eyes. I never felt misjudged, but I sometimes noticed some sort of uneasiness. And maybe some slightly sexist prejudice (the unasked question being how a normal girl could be into this kind of thing); those sexy female pathologists you find in novels and TV series were not fashionable yet, and at the postgraduate school I was the only woman. As for friends and relatives… well, my parents never got in the way of my choices… I believe they still haven't figured out exactly what I do, and if I try to tell them they ask me to spare them the details! As for my teenage kids, who are intrigued by my job, I try to draw their attention to the scientific aspects. In the medical environment there is still this idea of the pathologist as some kind of nerd genius, or a person who is totally hopeless in human interactions and therefore seeks shelter in a specialization that is not directly centered on the doctor-patient relationship. Which is not necessarily true anymore, by the way, as pathologists often perform biopsies, and therefore interact with the patient.
Are autopsies still important today?
Let's clarify: in Italy, the anatomopatologo is not a forensic pathologist, but is closer to what would be known in America as a surgical pathologist. The autopsies the pathologist performs are on people who died in a hospital (and not on the deceased who fell from a height or committed suicide, for instance), to answer a very specific clinical question, while the forensic autopsy is carried out by the medico legale (forensic physician) on behalf of the DA's office.
One would think that, with the development of imaging tests in radiology, the post-mortem examination would by now have become outdated. In some facilities they perform the so-called "virtual autopsy" through CAT scans. In reality, in those cases in which a diagnosis could not be determined during the deceased's life, an autopsy is still the only exam capable of clarifying the final cause of death. Besides direct examination, it makes it possible to take organ samples to be studied under the microscope with conventional staining, or to be submitted for more refined tests, such as molecular biology. In the forensic field, direct examination of the body allows us to gather information on the chronology, environment and manner of death, details no other exam could provide.
There is of course a great difference (both on a methodological and emotional level) between macroscopic and microscopic post mortem analysis. In your experience, for scientific purposes, is one of the two phases more relevant than the other or are they both equally essential?
They are both essential, and tightly connected to each other: one cannot do without the other. The visual investigation guides the subsequent optical microscopy exam, because the pathologist samples one specific area of tissue, and not another, to be submitted to the lab on the grounds of his visual perception of dissimilarity.
In my experience of autopsy rooms, albeit limited, I have noticed some defense strategies being used to cope with the most tragic aspects of medical investigation. On one hand a certain humor, though never disrespectful; and, on the other, little precautions aimed at preserving the dignity of the body (but which may also have the function of pushing away the idea that an autopsy is an act of violation). How did you get used to the roughest side of your job?
I witnessed my first autopsy during my first year in medical school, and I still remember every detail of it today, 30 years later. I nearly fainted. However, once I got over the initial shock, I learned to focus on single anatomical details, as if I were a surgeon in the operating room, proceeding with great caution, avoiding useless cuts, always keeping in mind that I'm not working on a corpse, but on a person. With his own history, his loved ones, presumably with somebody outside that room who is now crying over the loss. One thing I always do, after the external exam and before I begin to cut, is cover up the face of the dead person. Perhaps with the illogical intent of preventing him from seeing what I'm about to do… and maybe to avoid the unpleasant feeling of being watched.
Are there subjects that are more difficult to work with, on the emotional level?
Children.
Are autopsies, as a general rule, open to a non-academic public in Italy? Would you recommend witnessing an autopsy?
No: forensic autopsies are not accessible, for obvious reasons, since there is often a trial underway; neither are the diagnostic post-mortem examinations in hospitals. I wouldn't know whether to recommend seeing an autopsy to anyone. But I do believe every biology or medicine student should be allowed in.
One of the aspects that always fascinated me about pathological anatomy museums is the vitality of disease, the exuberant creativity with which forms can change: the pathological body is fluid, free, forgetful of those boundaries we think are fixed and insurmountable. You just need to glance at some bone tumors, which look like strange mineral sponges, to see the disease as a terrible blooming force.
Maybe this feeling of wonder before a Nature both so beautiful and so deadly was what animated the first anatomists: a sort of secret respect for the disease they were fighting, not much different from the hunter's reverential fear as he studies his prey before the kill. Have you ever experienced this sense of the sublime? Does the apparent paradox of the passionate anatomist (how can one be a disease enthusiast?) have something to do with this admiration?
To get passionate, in our case, means to feel inclined towards a certain field, a certain way of doing research, a certain method and approach which links a morphologic phenomenon to a functional phenomenon. We do not love disease, we love a discipline which teaches us to see (Domine, ut videam) in order to understand the disease. And, hopefully, cure it.
And yes, of course there is the everyday experience of the sublime, the aesthetic experience, the awe at shapes and colors, and the information they convey. If we know how to interpret it.
Speaking of the vitality of disease: today we recognize in some teratologic specimens proof of the attempts through which evolution gropes around, one failed experiment after another. How many of these maladies (literally, "being not apt") are actually the exact opposite, an adaptation attempt? Is every example of mutation (which a different genetic drift might have elevated to a dominant phenotype) always pathological?
What I really mean to ask is, of course, another one of those questions that any pathological anatomy museum inevitably suggests: what are the actual boundaries of the Norm?
The norm is established on a statistical basis, following a Gaussian distribution curve, but what falls beyond the 90th percentile (or below the 10th) is not necessarily unnatural, or unhealthy, or sick. It is just statistically less represented in the general population with respect to the phenotype we are examining. Whether a statistically infrequent trait will prove to be an advantage, only time will tell.
The limits of the norm are therefore conventionally established on a mathematical basis. What is outside of the norm is just more uncommon. Biology undergoes constant transformation (on account of new medicines and therapies, climatic and environmental change, great migrations…), and therefore we are always confronted with new specimens coming in. That is why our job is always evolving, too.
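Just to make the convention she describes concrete (a toy sketch, not anything used in actual diagnostic practice, and the numbers below are made up), a "normal range" in this statistical sense is nothing more than the interval between two percentiles of the measured values:

```python
# Toy illustration of a percentile-based "norm": values outside the
# 10th-90th percentile interval are merely less represented in the
# sample, not necessarily pathological. All measurements are simulated.
import numpy as np

rng = np.random.default_rng(0)
measurements = rng.normal(loc=100, scale=15, size=1000)  # hypothetical trait values

low, high = np.percentile(measurements, [10, 90])
print(f"reference interval (10th-90th percentile): {low:.1f} - {high:.1f}")

outside = ((measurements < low) | (measurements > high)).mean()
print(f"fraction of the sample outside this 'norm': {outside:.0%}")  # ~20% by construction
```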
I didn’t expect such a technical answer… mine was really a “loaded” question. As you know, for years I have been working on the concepts of dissimilarity, exoticism and diversity, and I wanted to provoke you – to see whether from your standpoint a mutant body could also be considered as a somewhat revolutionary space, a disruptive element in a context demanding total compliance to the Norm.
Ask a loaded question… and you'll get a convenient answer. You're talking about a culture demanding compliance with a social norm; I replied in terms of biology complying with a norm established by the scientific community on a frequency-based statistical calculation, and which is therefore still conventional. In reality, deformity appears in unexpected ways, and should more correctly be described following a probabilistic logic rather than a frequentist one. But I'm beginning to sound technical again.
I have seen respected professors light up like children before some pathological wet specimens. The feeling I had was that the medical gaze in some ways justified an interest in extreme visions, usually off-limits to the general public. Is it an exclusively scientific interest? Is it possible to be passionate about this kind of work without being somehow fascinated by the bizarre?
There could be a little self-satisfaction at times. But in general there is sincere passion and enthusiasm for the topic, and that surely cannot be faked. It is a job you can only do if you love it.
Our whole discipline is based on the differential diagnosis between "normal" and "pathological". I could say that everything pathological is dysmorphic with respect to the norm, and therefore bizarre, different. So yes, you have to feel a fascination for the bizarre. And be very curious.
The passion for the macabre is a growing trend, especially among young people, and it is usually deemed negative or cheap, and strongly opposed by Italian academics. This does not happen in other countries (not just the US, but also the UK, for instance), where the ability to arouse curiosity in a vast public, sometimes playing on pop and dark aspects, has become a common element of museums' communication strategies. Come for the macabre, stay for the science. If young people are drawn to the subject via the macabre imaginary, do you think in time this could lead to the education of new, trustworthy professionals?
Yes, it's true, there is a growing interest; I'm thinking of some famous anatomical exhibitions which attracted so many visitors that the closing date had to be postponed. There is also my kids' favorite TV show, about the most absurd ways to die. I believe all this really is an incentive, and should be used as a basis to arouse curiosity about the scientific aspects of these topics. I think we can and must use this attraction to the macabre to bring people, and particularly youngsters, closer to science, even more so in these times of neoshamanic drifts and pseudo-scientific rants. Maybe it could also serve the purpose of admitting that death is part of our daily lives, and of finding a way to relate to it. Unlike in Anglo-Saxon countries, in Italy there is still a religious, cultural and legislative background that partially gets in the way (we have laws making it hard to dissect bodies for study, and I also think of the deeply rooted idea that an autopsy is a violation/desecration of the corpse, up to those prejudices against science and knowledge that lead to grotesque actions like the petition to close the Lombroso Museum).
Has your job changed your relationship with death and dying in any way?
I would say it actually changed my relationship with life and living. My worst fear is no longer a fear of dying. I mostly fear pain, and physical or mental decay, with all the limitations they entail. I hope for a very distant, quick and painless death.
With your twenty years' experience in the field, can you think of some especially curious anecdotes or episodes you have come across?
Many, but I don’t feel comfortable relating episodes that revolve around a person’s remains. But I can tell you that I often do not wonder how these people died, but rather how in the world they could be alive in the first place, given all the diseases I find! And, to me, life looks even more like a precariously balanced wonder.
Even mice sing.
We have known that for 50 years, but we are only recently beginning to understand the complexity of their songs. Part of the difficulty of studying mouse songs lies in their ultrasonic vocalizations, at frequencies the human ear cannot perceive: in the wild, this kind of call happens, for example, when a mouse pup calls for its mother.
In April, in Frontiers in Behavioral Neuroscience, a new Duke University study appeared, showing that mouse songs are really much more intricate than expected.
Researchers Jonathan Chabout, Abhra Sarkar, David B. Dunson and Erich D. Jarvis exposed mice to different social contexts and, using new purpose-built software, analysed the frequency modulation and duration of these ultrasonic calls. They were able to break the songs down into "syllables" and clusters of sound repeated with a certain rhythm, and to discover how these vary according to the situation.
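To give a rough sense of what such an analysis involves, here is a minimal sketch (in Python) of how ultrasonic "syllables" might be segmented from a recording by thresholding the energy in the ultrasonic band. This is not the software used in the Duke study; the file name, the 35-125 kHz band and the threshold are illustrative assumptions.

```python
# Naive syllable segmentation for ultrasonic mouse vocalizations.
# Assumes a mono recording sampled fast enough (>= 250 kHz) to capture
# the ultrasonic band; the file name and parameters are placeholders.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

rate, audio = wavfile.read("mouse_recording.wav")  # hypothetical recording
freqs, times, power = spectrogram(audio, fs=rate, nperseg=1024, noverlap=512)

# Total energy in the band where mouse calls typically live (~35-125 kHz).
band = (freqs >= 35_000) & (freqs <= 125_000)
band_energy = power[band, :].sum(axis=0)

# A frame counts as "voiced" if its band energy clearly exceeds background.
threshold = band_energy.mean() + 2 * band_energy.std()
voiced = band_energy > threshold

# Group consecutive voiced frames into syllables and report onset/duration.
syllables, start = [], None
for i, v in enumerate(voiced):
    if v and start is None:
        start = i
    elif not v and start is not None:
        syllables.append((times[start], times[i - 1]))
        start = None
if start is not None:
    syllables.append((times[start], times[-1]))

for onset, offset in syllables:
    print(f"syllable at {onset:.3f} s, lasting {(offset - onset) * 1000:.1f} ms")
```

The actual analysis in the paper is of course far more refined, classifying syllable types and the order in which they are sung; the sketch above only shows the first, most basic step of isolating the calls.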
If a male mouse is exposed to female urine, and is therefore convinced that a female is somewhere nearby, his singing becomes louder and more powerful, if somewhat less accurate; to wake a sleeping female, he uses the same song, but the "syllables" are now pronounced much more clearly.
Female mice seem to prefer songs that are complex and rich in variations; even so, when a male finds himself near an available female, his elaborate courting song switches to a simpler tune. Once the potential mate has been attracted, in fact, our little mouse needs to save energy to chase her around and try to mate.
The mouse's ability to sing is not as articulate as that of songbirds; and yet, changes in syntax according to social context prove that the songs convey some meaning and serve a precise purpose. Researchers are not sure to what extent mice are able to learn to modify their vocalizations (as birds do), or to what extent they simply choose from fixed patterns. Forthcoming studies will try to answer this question.
It is nice to better understand the world of rodents, but why is it so important?
The goal of these studies is actually relevant to humans as well. In the last decade, we have come to understand how extremely similar mice are to us on a genetic level; discovering how and to what extent they are able to learn new "syllables" could play a fundamental part in the study of autism spectrum disorders, particularly in regard to communication deficits and the neural circuits controlling vocal learning.