This article originally appeared on #ILLUSTRATI n.53 – L’uomo che piantava gli alberi
When I was still attending film school, I wrote a brief script for a short film about a man who bought a flower, a big orchid. He went home, put the orchid on his bed, caressed it, kissed it and, finally, made love to it.
The subject appeared quite comical but my intention was to make a sort of test or, even better, an exercise in style: would I be able to shoot such an absurd sex scene without making it ridiculous? Would I be able to make it even romantic? Was it possible to produce the well-known “suspension of disbelief” in the audience in such an extreme situation?
At that time, I didn’t know that I wasn’t being particularly original.
There is, in fact, a real albeit rare paraphilia, or rather a form of fetishism, which consists in deriving sexual arousal from trees and plants. The term for it is ‘dendrophilia’, and it may have been coined by Lawrence Buell in reference to the famous writer Henry David Thoreau’s love of trees (a totally innocent feeling, in that case).
Like other paraphilias, dendrophilia is one of those topics tabloids delight in: when the editorial staff runs out of news, they can simply send a photographer into the courtyard to take some shots of a lady kissing a tree, and the article is done: “I want to marry a poplar!”.
But does dendrophilia really exist?
A seminal 2007 study from Bologna University estimated that only 5% of all fetishisms refer to inanimate objects that have no relationship with the human body. Within such a small group, it is no wonder that the even more exclusive niche of people with a thing for plants remains almost invisible; this is compounded by the shame of talking about it, and by the fact that this sexual preference does not usually cause any problems. It is easy to understand, then, why there is no record of case studies dedicated to dendrophilia in medical literature.
Assuming this sort of paraphilia exists, we can reasonably infer, from what we know about other forms of fetishism, that its manifestations could be much less weird than we might expect. Most of the appeal of a fetish relies on the smell, the texture and the appearance of the object, which becomes important on an evocative level in stimulating arousal. In this respect, it is likely that the potential dendrophile simply finds the texture of bark, the smoothness or the colour of leaves, or the shape of a root extremely pleasant; contact, sometimes associated with a sexually relevant distant memory, becomes effective in stimulating arousal.
This is not really different from those people who derive arousal from velvet: we tend to define fetishism as pathological only when it becomes a sine qua non for sexual gratification, and indeed many practitioners working in the field distinguish between optional, preferred and exclusive paraphilias. Only the last are considered sexual disorders; this distinction, which can be found in the DSM-5 as well (the most widely used diagnostic manual in psychiatry), is in fact the distinction between real fetishism and fetishistic behaviour.
Last June, a little scandal broke out in Palermo: a video installation by the Chinese artist Zheng Bo, named Pteridophilia (literally, “love of ferns”), popped up among the plants of the amazing Botanical Gardens. The video shows seven young men having sex with plants in a Taiwanese forest.
Considering the ruckus raised by this artistic video, maybe it’s a good thing that I never made my short film. Which, with a little plot twist, ended up playing with the subtle distinction between pathological fetishism and a simple fetishistic act “by proxy”: after making love to the orchid, the main character went to gently lay it on the grave of the only woman he had ever loved.
Mrs. Josephine M. Bicknell died only one week before her sixtieth birthday; she was buried in Cleburne, Texas, at the beginning of May, 1928.
Once the coffin was lowered into the ground, her husband James C. Bicknell stood watching as the grave was filled with a thick layer of cement; he waited for an hour, maybe two, until the cement had dried completely. Eventually James and the other relatives could head back home, relieved: nobody would be able to steal Mrs. Bicknell’s body – not the doctors, nor the other collectors who had tried to obtain it.
It is strange to think that a lifeless body could be tempting for so many people.
But the lady who was resting under the cement had been famous across the United States, many years before, under her maiden name: Josephine Myrtle Corbin, the Four-Legged Girl from Texas.
Myrtle was born in 1868 in Lincoln County, Tennessee, with a rare fetal anomaly called dipygus: her body was perfectly formed from her head down to her navel, below which it divided into two pelvises, and four lower limbs.
Her two inner legs, although capable of movement, were rudimentary, and at birth they were found lying flat on her belly. They resembled those of a parasitic twin, but in reality there was no twin: during fetal development, her pelvis had split along the median axis (in each pair of legs, one was atrophic).
between each pair of legs there is a complete, distinct set of genital organs, both external and internal, each supported by a pubic arch. Each set acts independently of the other, except at the menstrual period. There are apparently two sets of bowels, and two ani; both are perfectly independent,– diarrhoea may be present on one side, constipation on the other.
Myrtle joined Barnum Circus at the age of 13. When she appeared on stage, nothing gave away her unusual condition: apart from the particularly large hips and a clubbed right foot, Myrtle was an attractive girl and had an altogether normal figure. But when she lifted her gown, the public was left breathless.
She married James Clinton Bicknell when she was 19 years old, and the following year she went to Dr. Lewis Whaley on account of a pain in her left side coupled with other worrying symptoms. When the doctor announced that she was pregnant in her left uterus, Myrtle reacted with surprise:
“I think you are mistaken; if it had been on my right side I would come nearer believing it”; and after further questioning he found, from the patient’s observation, that her right genitals were almost invariably used for coitus.
That first pregnancy sadly ended with an abortion, but later on Myrtle, who had retired from show business, gave birth to four children, all perfectly healthy.
Given the enormous success of her show, other circuses tried to replicate the lucky formula – but charming ladies with supernumerary legs were nowhere to be found.
With typical sideshow creativity, the problem was solved by resorting to some ruse.
The two following diagrams show the trick used to simulate a three-legged and a four-legged woman, as reported in the 1902 book The New Magic (source: Weird Historian).
If you search for Myrtle Corbin’s pictures on the net, you can stumble upon some photographs of Ashley Braistle, the most recent example of a woman with four legs.
The pictures below were taken at her wedding, in July 1994, when she married a plumber from Houston named Wayne: their love had begun after Ashley appeared in a newspaper interview, declaring that she was looking for an “easygoing and sensitive guy”.
Unfortunately, on May 11, 1996, Ashley’s life ended in tragedy when she tried skiing and struck a tree.
Did you guess it?
Ashley’s touching story is actually a trick, just like the ones used by circus people at the turn of the century.
This photographic hoax comes from another bizarre “sideshow”, namely the Weekly World News, a supermarket tabloid known for publishing openly fake news under funny and inventive headlines (“Mini-mermaid found in tuna sandwich!”, “Hillary Clinton adopts a baby alien!”, “Abraham Lincoln was a woman!”, and so on).
The “news” of Ashley’s demise in the July 4, 1996 issue.
Another example of a Weekly World News cover story.
To end on a more serious note, here’s the good news: nowadays caudal duplications can, in some instances, be surgically corrected after birth (it happened for example in 1968, in 1973 and in 2014).
And luckily, pouring cement is no longer needed to prevent jackals from stealing an extraordinary body like that of Josephine Myrtle Corbin Bicknell.
Seven centimeters long and carved from the bone of an unidentified bird, this perfect needle (complete with an eye for the thread) was produced more than 50,000 years ago – not by Homo sapiens proper, but by the mysterious Denisovans: settled in the Altai mountains of Siberia, these archaic humans still remain partly an enigma for paleontologists. But this needle, found in their cave in 2016, is proof of their technological advancement.
Needles Under The Skin: the inexplicable delay of Western medicine
The effects of the bites of venomous snakes and insects pointed clearly to the possibility of the introduction of drugs through punctures in the skin. In primitive societies, the application for therapeutic purposes of plant and animal products through cutaneous incisions is practiced […], and the use of poisoned arrows may be regarded as a crude precursor of hypodermic and intramuscular medication.
We could trace another “crude precursor” of intramuscular injections back to Sir Robert Christison‘s 1831 proposal, suggesting that whalers fix a vial of prussic acid to their harpoons in order to kill whales more quickly.
And yet, despite all these clues, the first proper hypodermic injection for strictly medical purposes did not take place until the mid-Nineteenth Century. Until then, syringes (which had been around for centuries) were mainly used for suction, for instance to draw off the fluids that accumulated in abscesses. Enemas and nasal irrigation had been in use since Roman times, but nobody had thought to inject medications under the skin.
Physicians had tried, with varying results, to scar the epidermis with irritants and deposit the drug directly on the resulting ulcer, or they sliced the skin with a lancet, as in bloodletting, and inserted salts (of morphine, for example) through the cut. In 1847, G. V. Lafargue was the first to think of combining inoculation with acupuncture, building a long, thick hollow needle filled with morphine paste. Other methods were being tested as well, such as sewing a silk thread, imbued with drugs, directly into the patient’s skin.
The first true hypodermic syringe was invented in 1853 by Scottish doctor Alexander Wood, as reported in his New Method of Treating Neuralgia by Subcutaneous Injection (1855). Almost at the same time, the French physician Charles Pravaz had devised his own version. By the end of the Nineteenth Century, hypodermic injections had become a widespread procedure in the medical field.
Needles In The Flesh: the bizarre clinical case of the “needle woman”
Published in 1829 by Giuseppe Ferrario, Chief Surgeon at the Ospedale Maggiore in Milan, La donna dagli aghi reports a strange case that began in June 1828.
A 19-year-old woman, Maria Magni, “peasant, of scrofulous appearance, but with a passionate temper”, was admitted to the hospital because of severe pain.
One April morning, the year before, she had found a light blue piece of paper on the ground which contained 70 to 80 steel sewing needles. In order not to lose them, she had pinned them to her blouse cuff. But Maria suffered from epileptic fits, and a few hours later, as she was working in the vineyard, “she fell victim to the usual spasms, and convulsive bouts. Under these abnormal and violent muscular movements […] she believes that she unwillingly pushed the needles she had pinned to her shirt through her right arm – which was naked, as is the case among our peasants – as well as through her breast”. The needles didn’t cause her any trouble until three months later, when the pain became unbearable; she then decided to go to the hospital.
The doctor on duty hesitated to admit her, for fear she had syphilis: Magni had tried alternative treatments, and had applied “many varied remedies, cataplasms, ointments, blistering drugs and other ulcerating substances, etc., with the intention of exciting the needles out of her skin”, but this only resulted in her body being covered by sores.
Enter Doctor Ferrario, who during the first 35 days of treatment bled her 16 times, applied more than 160 leeches to her temples, and administered vesicants, frictions, decoctions, salts and various tinctures. But the daily epileptic fits were terrible, and nothing seemed to work: “all the physicians, stunned by the woman’s horrible condition, predicted an approaching and inevitable death”.
Upon hearing the story of the needles, though, Ferrario began to wonder if some of them were still stuck inside the young woman’s body. He examined her wounds and indeed began to feel something thin and hard within the flesh; but touching those spots triggered epileptic fits of unheard-of violence. Ferrario described these bouts with typical 19th-Century literary flourishes, in the manner of Gothic novels, a language which today sounds oddly inappropriate in a medical context:
the poor wretched girl, bracing on her nape and feet, pushed her head between her shoulders while jumping high above the bed, and arched her bust and arms on account of the spasmodic contraction of dorsal muscles […] she was shaking and screaming, and angrily wrapped her body in her arms at the risk of suffocating […]. There was involuntary loss of urine and feces […]. Her gasping, suffocated breath, her flaccid and wrinkled breast which appeared beneath her shirt, torn to pieces; the violence with which she turned her head on her neck, and with which she banged it against the walls and threw it back, hanging from the side of the bed; her red and bulging eyes, sometimes dazed, sometimes wide open, almost coming out of their sockets, glassy and restless; the obscene clenching of her teeth, the foamy, bloody matter that she squirted and vomited from her dirty mouth, her swollen and horribly distorted face, her black hair, soaked in drool, which she flapped around her cranium […] all this inspired the utmost disgust and terror, as it was the sorrowful image of an infernal fury.
Ferrario then began extracting the needles out of the woman’s body, performing small incisions, and his record went on and on much in the same way: “this morning I discovered a needle in the internal superior region of the right breast […] After lunch, having cut the upper part of the arm as usual, I extracted the needle n. 14, very rusty, with its point still intact but missing the eye […] from the top of the mons pubis I extracted the needle n. 24, rusty, without point nor eye, of the length of eight lines.”
The pins were hard to track down; they moved through the muscles from one day to the next, so much so that the physician even tried using big horseshoe magnets to locate them.
The days went by, and as the number of extracted needles grew, so did the suspicion that the woman might be deceiving the doctors: Maria Magni just kept expelling needles over and over again. Ferrario began to wonder whether she was secretly inserting the needles into her own body.
But before accusing her, he needed proof. He had her searched and kept under strict surveillance, and he even tried leaving some “bait” needles lying around the patient’s bed, to see if they disappeared. Nothing.
In the meantime, starting from extraction number 124, Miss Magni began throwing up needles.
The physician had to ask himself: had these needles reached the digestive tract through the diaphragm? Or had Magni swallowed them on purpose? One thing is sure: vomiting needles caused the woman such distress that “having been so unwell, I doubt she ever swallowed any more after that, but she might have resorted to another less uncomfortable and less dangerous opening, to continue her malicious introduction of needles into the body”.
The “less uncomfortable opening” was her vagina, from which many a new needle was removed.
As if all this was not enough, rumors had spread that the “needle woman” was actually a witch, and hospital patients began to panic.
An old countrywoman, recovering in the bed next to Magni’s, became convinced that the woman had fallen victim to a spell, and had then turned into a witch on account of the magic needles. Being in the bed next to her, the old lady believed that she herself might fall under the spell. She didn’t want to be touched by the young woman, nor by me, for she believed I might be a sorcerer too, because I was able to extract the needles so easily. This old lady fell for this nonsense so completely that she started screaming all day long like a lunatic, and really became frenzied and delirious, and many leeches had to be applied to her head to calm her down.
Eventually one day it was discovered where Magni had been hiding the needles that she stuck in her body:
Two whole needles inside a ball of yarn; four whole needles wrapped in paper between the mattress and the straw, all very shiny; a seventh needle, partly rusted, pinned under a bed plank. Several inmates declared that Maria Magni had borrowed four needles from them, never returning them, with the excuse that they had broken. The ill-advised young woman, seeing she was surrounded and exposed […] faked violent convulsions and started acting like a demon, thrashing about on the bed and hurting the assistants. She ended by simulating furious ecstasy, during which she talked about purely fictional beings, called upon the saints and the devils, then began swearing, then horribly blasphemed angels, saints, demons, physicians, surgeons and nurses alike.
After a couple of days of this performance, Magni confessed. She had implanted the needles under her own skin, placed them inside her vagina and swallowed them, taking care to hide the pierced areas until the “tiny red hole” had cicatrized and disappeared.
In total, 315 needles were retrieved from Maria Magni’s body.
In the epilogue of his essay, Ferrario points out that this was not even the first recorded case: in 1821, 363 needles had been extracted from the body of young Rachel Hertz; another account concerns a girl who survived for more than 24 years after ingesting 1,500 needles. Another woman, Genueffa Pule, was born in 1763 and died at the age of 37, and an autopsy was carried out on her body: “upon dissecting the cadaver, in the upper, inner part of each thigh, precisely inside the triceps, masses of pins and needles were found under the teguments, and all the muscles teemed with pins and needles”.
Ferrario ascribes the motivation for these actions to pica, or to superstition. Maria claimed that other women of the village had encouraged her to swallow the needles in order to emulate the martyred saints, as a sort of apotropaic ritual. More plausibly, this was just a lie the woman told when she found herself cornered.
In the end, the physician admits his inability to understand:
It is undoubtedly a strange thing for a sane person to imagine how pain – a sensation shunned even by the most ignorant people, and abhorred by human nature – could be sometimes sought out and self-inflicted by a reasonable individual.
As I was going through pathology archives, in search of studies that could have some similarities with the Magni story, I came upon one, then two, then several other reports regarding an even more unbelievable occurrence: sewing needles found in the encephalon of adult patients, often during routine X-rays.
How did those needles get there? The answer is quite awful: these are all cases of failed infanticide.
The possibility of infanticide by inserting pins through the fontanelle is mentioned in the Enciclopedia legale ovvero Lessico ragionato by F. Foramiti (1839), where the author includes a (chilling) list of all the methods with which a mother can kill her own child, among which appears “puncturing the fontanelle and the brain with a thin sharp dagger or a long and strong needle”.
But the practice, properly documented in medical literature only from 1914 onwards, had already appeared in Persian novels and texts: perhaps the fact that the method was well known in the ancient Middle East is the reason why most of the forty recorded cases were documented in Turkey and Iran, with a minority coming from Southeast Asia, Europe and the United States. In Italy there were two known cases, one in 1987 and the 2010 case mentioned above.
Most of these patients didn’t show any particular neurological symptoms: the sewing needles, having been embedded in the brain for so many years, are often not even removed; a surgical procedure, at that point, would be more dangerous than leaving them in situ.
This was the case for the only known occurrence reported in Africa, a 4-year-old child carrying a 4.5 cm needle through his brain. At the time the report was filed, in 2014, the needle was still there: “no complications were noted, the child had normal physical and mental development with excellent performance at school”.
Of course, discovering at the age of forty that someone – your parents, or maybe your grandparents – tried to kill you when you were just months old must be a shock.
It happened to Luo Cuifen, a Chinese woman born in 1976, who showed up at the hospital in 2007 because of blood in her urine, and discovered she had 26 sewing needles in her body, piercing vital organs such as the lungs, liver, kidneys and brain. Her story is related to the discrimination against newborn girls in rural China, where a son is more welcome than a daughter because he can carry on the family name, perform funeral rituals for the ancestors, and so on. In Luo’s case, it was most likely her grandparents who attempted the infanticide when she was but months old (even if this theory cannot be proven, as her grandparents had already passed away).
In more recent cases, recorded in Tunisia, China and Brazil, it was discovered that the children had respectively three, twelve and even fifty needles stuck in their bodies.
The cases of people surviving for decades with a needle in their brain are obviously an exception – as one of the studies put it, this is the “tip of the iceberg”.
A needle wound can be almost invisible. What is really disquieting is the thought of all those infanticides that are carried out “successfully”, without ever being discovered.
Sometimes the smallest objects can turn out to be the most useful. And the most lethal.
My gratitude goes to Mariano Tomatis, who recommended La donna dagli aghi, which he discovered during his studies on 19th-century magnetism, and which started this research.
Paracelsus‘ homunculus, the result of complicated alchemic recipes, is an allegorical figure that has fascinated the collective unconscious for centuries. Its fortune soon exceeded the field of alchemy, and the homunculus was borrowed by literature (Goethe, to quote but one example), psychology (Jung wrote about it), cinema (take the wonderful, ironic Pretorius scene from The Bride of Frankenstein, 1935) and the world of illustration (I’m thinking in particular of Stefano Bessoni). Even today the homunculus hasn’t lost its appeal: the mysterious videos posted by a Russian youtuber, purportedly showing some strange creatures developed through unlikely procedures, have scored tens of millions of views.
Yet I will not focus here on the classic, more or less metaphorical homunculus, but rather on the way the word is used in pathology.
Yes, because, unbeknownst to you, a rough human figure could be hiding inside your own body.
Welcome to a territory where the grotesque bursts into anatomy.
Let’s take a step back to how life starts.
In the beginning, the fertilized cell (zygote) is but one cell; it immediately starts dividing, generating new cells, which in turn proliferate, transform, migrate. After roughly two weeks, the different cellular populations organize into three main areas (germ layers), each one with its given purpose — every layer is in charge of the formation of a specific kind of structure. These three specialized layers gradually create the various anatomical shapes, building the skin, nerves, bones, organs, apparatuses, and so on. This metamorphosis, this progressive “surfacing” of order ends when the fetus is completely developed.
Sometimes it might happen that this very process, for some reason, gets activated again in adulthood.
It is as if some cells, falling for an unfathomable hallucination, believed they still are at an embryonic stage: therefore they begin weaving new structures, abnormal growths called teratomas, which closely resemble the outcome of the first germ differentiations.
These mad cells start producing hair, bones, teeth, nails, sometimes cerebral or thyroid matter, even whole eyes. Histologically these tumors, benign in most cases, can appear solid, wrapped inside cysts, or both.
In very rare cases, a teratoma can be so highly differentiated as to take on an anthropomorphic shape, albeit a rudimentary one. These are the so-called fetiform teratomas (homunculi).
Clinical reports of this anomaly really have an uncanny, David Cronenberg quality: one homunculus, found in 2003 inside an ovarian teratoma in a 25-year-old virginal woman, showed the presence of brain, spinal cord, ears, teeth, thyroid gland, bone, intestines, trachea, phallic tissue and one eye in the middle of the forehead.
In 2005 another fetiform mass had hairs and arm buds, with fingers and nails. In 2006 a reported homunculus displayed one upper limb and two lower limbs complete with feet and toes. In 2010 another mass presented a foot with fused toes, hair, bones and marrow. In 2015 a 13-year-old patient was found to carry a fetiform teratoma exhibiting hair, vestigial limbs, a rudimentary digestive tube and a cranial formation containing meninges and neural tissue.
What causes these cells to try and create a new, impossible life? And are we sure that the minuscule, incomplete fetus wasn’t really there from the beginning?
Among the many hypotheses proposed, in fact, there is also the idea that homunculi (difficult to study because of their scarcity in scientific literature) may not be actual tumors, but rather the remnants of a parasitic twin, encapsulated within its sibling’s body during the embryonic phase. If this were the case, they would not qualify as teratomas, falling instead into the fetus in fetu category.
But the two phenomena are generally regarded as separate.
To distinguish one from the other, pathologists rely on the existence of a spinal column (which is present in the fetus in fetu but not in teratomas), on their localization (teratomas are chiefly found near the reproductive area, the fetus in fetu within the retroperitoneal space) and on zygosity (teratomas are often differentiated from the surrounding tissues, as if they were “fraternal twins” in regard to their host, while the fetus in fetu is homozygous).
The study of these anomalous formations might provide valuable information for the understanding of human development and parthenogenesis (essential for the research on stem cells).
But the intriguing aspect is exactly their problematic nature. As I said, each time doctors encounter a homunculus, the issue is always how to categorize it: is it a teratoma or a parasitic twin? A structure that “emerged” later, or a shape which was there from the start?
It is interesting to note that this very uncertainty has also existed in regard to normal embryos for over 23 centuries. The debate focused on a similar question: do fetuses arise from scratch, or are they preexistent?
This is the ancient dispute between the supporters of epigenesis and preformationism, between those who claimed that embryonic structures formed out of indistinct matter, and those who thought that they were already included in the egg.
Aristotle, while studying chicken embryos, had already speculated that the unborn child’s physical structures acquire solidity little by little, guided by the soul; in the Eighteenth Century this theory was disputed by preformationism. According to the enthusiasts of this hypothesis (endorsed by high-profile scholars such as Leibniz, Spallanzani and Diderot), the embryo was already perfectly formed inside the egg, ab ovo, only too small to be visible to the naked eye; during development, it would simply grow in size, as a baby does after birth.
Where did this idea come from? An important part was surely played by a well-known etching by Nicolaas Hartsoeker, who was among the first scientists to observe seminal fluid under the microscope, as well as being a staunch supporter of the existence of minuscule, completely formed fetuses hiding inside the heads of sperm cells.
And Hartsoeker, in turn, had taken inspiration precisely from the famous alchemical depictions of the homunculus.
In a sense, the homunculus appearing in an ancient alchemist’s vial and the ones studied by pathologists nowadays are not that different. They can both be seen as symbols of the enigma of development, of the fundamental mystery surrounding birth and life. Miniature images of the ontological dilemma which has been forever puzzling humanity: do we appear from indistinct chaos, or did our heart and soul exist somewhere, somehow, before we were born?
Little addendum of anatomical pathology (and a bit of genetics)
by Claudia Manini, MD
Teratomas are germ cell tumors composed of an array of tissues derived from two or three embryonic layers (ectoderm, mesoderm, endoderm) in any combination.
The great majority of teratomas are benign cystic tumors, mainly located in the ovary and containing mature (adult-type) tissues; rarely they contain embryonal tissues (“immature teratomas”) and, if so, they have a higher malignant potential.
The origin of teratomas has been a matter of interest, speculation, and dispute for centuries because of their exotic composition.
The parthenogenetic theory, which suggests an origin from the primordial germ cell, is now the most widely accepted. The other two theories, one suggesting an origin from blastomeres segregated at an early stage of embryonic development and the other suggesting an origin from embryonal rests, have few adherents currently. Support for the germ cell theory has come from the anatomic distribution of the tumors, which occur along the body midline, the path of migration of the primordial germ cells; from the fact that the tumors occur most commonly during the reproductive age (epidemiologic-observational but also experimental data); and from cytogenetic analysis, which has demonstrated genotypic differences between homozygous teratomatous tissue and heterozygous host tissue.
The primordial germ cells are the common origin of the gametes (spermatozoa and oocytes, the mature germ cells), which contain a single set of 23 chromosomes (haploid cells). During fertilization two gametes fuse together and originate a new cell, which has a diploid and heterozygous genetic pool (a double set of 23 chromosomes from two different organisms).
On the other hand, the cells composing a teratoma show an identical genetic pool between the two sets of chromosomes.
Thus teratomas, even when they unexpectedly give rise to fetiform structures, are a different phenomenon from the fetus in fetu, and they fall within the scope of tumoral, not malformative, pathology.
All this does not lessen the impact of the observation, and a certain awe in considering the differentiation potential of one single germ cell.
Kurman RJ et al., Blaustein’s Pathology of the Female Genital Tract, Springer, 2011
Prat J., Pathology of the Ovary, Saunders, 2004
The fourth book in the Bizzarro Bazar Collection, published by Logos, is finally here.
While the first three books deal with those sacred places in Italy where a physical contact with the dead is still possible, this new work focuses on another kind of “temple” for human remains: the anatomical museum. A temple meant to celebrate the progress of knowledge, the functioning and the fabrica, the structure of the body — the investigation of our own substance.
The Morgagni Museum in Padova, which you will be able to explore thanks to Carlo Vannini‘s stunning photography, is not devoted to anatomy itself, but rather to anatomical pathology.
Forget the usual internal architecture of organs, bones and tissues: here the flesh has gone insane. In these specimens, dried, wet or tanned following Lodovico Brunetti’s method, the inconceivable vitality of disease becomes the real protagonist.
A true biological archive of illness, the collection of the Morgagni Museum is a veritable time machine, allowing us to observe deformities and pathologies which have by now been eradicated; before its display cases and cabinets we gaze upon the countless, excruciating ways our bodies can fail.
A place of inestimable value for the history it contains: the history of the victims, of those who fell along the path of discovery, as much as of the men who took up the fight against disease, the pioneers of medical science, the tale of their commitment and persistence. Among its treasures are many extraordinary intersections between anatomy and art.
The path I undertook for His Anatomical Majesty was particularly intense on an emotional level, also on account of some personal reasons; when I began working on the book, more than two years ago, disease — which up until then had remained an abstract concept — had just reached me in all its destabilizing force. This is why the Museum, and my writing, became for me an initiatory voyage into the mysteries of the flesh, through its astonishments and uncertainties.
The subtitle’s oxymoron, that obscure splendour, is the most concise expression I could find to sum up the dual state of mind I lived in during my study of the collection.
Those limbs marked by suffering, those still expressive faces behind the amber formaldehyde, those impossible fantasies of enraged cells: all this led me to confront the idea of an ambivalent disease. On one hand we are used to demonizing sickness; but, with much the same surprise that comes with learning that the biblical Satan is really a dialectical “adversary”, we might be amazed to find that disease is never just an enemy. Its value resides in the necessary questions it raises. I therefore gave myself over to the enchantment of its terrible beauty, to the dizziness of its open meaning. I am sure the same fruitful uneasiness I felt is the unavoidable reaction for anyone crossing the threshold of this museum.
The book, created in close collaboration with the University of Padova, is enriched by museological and historical notes by Alberto Zanatta (anthropologist and curator of the Museum), Fabio Zampieri (researcher in the history of medicine), Maurizio Rippa Bonati (associate professor of the history of medicine) and Gaetano Thiene (professor of anatomical pathology).
For some days now I have been receiving suggestions about Dr. Masaichi Fukushi‘s tattoo collection, belonging to Tokyo University Pathology Department. I am willing to write about it, because the topic is more multifaceted than it looks.
Said collection is both well-known and somewhat obscure.
Born in 1878, Dr. Fukushi was studying the formation of nevi on the skin around 1907, when his research led him to examine the correlation between the movement of melanin through the vascularized epidermis and the injection of pigments under the skin in tattoos. His interest was further fueled by a peculiar discovery: the presence of a tattoo seemed to prevent the signs of syphilis from appearing in that area of the body.
In 1920 Dr. Fukushi joined the Mitsui Memorial Hospital, a charitable institution where treatment was offered to the most disadvantaged social classes. In this environment he came in contact with many tattooed persons and, after a short period in Germany, he continued his research on the formation of congenital moles at Nippon Medical University. Here, often carrying out autopsies, he developed an original method of preserving tattooed epidermis taken from corpses; he therefore began collecting various samples, managing to stretch the skin so that it could be exhibited inside a glass frame.
It seems Dr. Fukushi did not have an exclusively scientific interest in tattoos, but was also quite compassionate. Tattooed people, in fact, often came from the poorest and most troubled brackets of Japanese society, and Fukushi’s sympathy for the less fortunate even pushed him, in some instances, to cover the expenses of those who could not afford to complete an unfinished tattoo. In return, the doctor asked for permission to remove their skin post mortem. His passion for tattoos also took the form of photographic records: he collected more than 3,000 pictures, which were destroyed during the bombing of Tokyo in WWII.
This was not the only loss, for a good number of tattooed skins were stolen in Chicago as the doctor was touring the States giving a series of academic lectures between 1927 and 1928.
Fukushi’s work gained international attention in the 1940s and 1950s, when several articles appeared in the press, such as the one above, published in Life magazine.
As we said earlier, the collection endured heavy losses during the 1945 bombings. However some skin samples, which had been secured elsewhere, were saved and, after being handed down to Fukushi’s son Katsunari, may still be kept today inside the Pathology Department, though not accessible to the public. It is said that among the specimens there are some nearly complete skin suits, showing tattoos over the whole body surface. All this is hard to verify, as the Department is not open to visitors and no official information seems to be available online.
Then again, if in the Western world tattooing is by now such a widespread trend that it hardly sparks any controversy, it still remains quite taboo in Japan.
Some time ago, the great Italian tattoo artist Pietro Sedda (author of the marvelous Black Novel For Lovers) told me about his latest trip to Japan, and how tattooists in that country still operate almost in secret, in small, anonymous parlors with no store signs, often hidden inside ordinary apartment buildings. The fact that tattoos are normally seen in a negative light could be related to the traditional association of this art form with yakuza members, even though in some youth circles fashion tattoos are quite common nowadays.
A tattoo stigma existed in Western countries up to half a century ago, ratified by explicit prohibitions in papal bulls. One famous exception was the tattoos made by the “marker friars” of the Loreto Sanctuary, who painted Christian, propitiatory or widowhood symbols on the hands of the faithful. But in general the only ones who traditionally decorated their bodies were the outcast, marginalized members of the community: pirates, mercenaries, deserters, outlaws. In his most famous essay, Criminal Man (1876), Cesare Lombroso classified every tattoo variation he had encountered in prisoners, interpreting them through his (now outdated) theory of atavism: criminals were, in his view, evolutionarily backward individuals who tattooed themselves as if responding to an innate primitiveness, typical of the savage peoples who, not surprisingly, practiced tribal tattooing.
Coming back to the human hides preserved by Dr. Fukushi, this is neither the only nor the largest collection of its kind. The record goes to London’s Wellcome Collection, which houses around 300 individual pieces of tattooed skin (as opposed to the 105 specimens allegedly stored in Tokyo), dating back to the end of the 19th century.
The edges of these specimens show a typical arched pattern due to being pinned while drying. And the world opened up by these traces from the past is quite touching, as are the motivations that can be guessed behind an indelible inscription on the skin. Today a tattoo is often little more than a basic decoration, a tribal motif (the meaning of which is often ignored) around an ankle, an embellishment that turns the body into a sort of narcissistic canvas; in a time when a tattoo was instead a symbol of rebellion against the establishment, and in itself could cause many troubles, the choice of the subject was of paramount relevance. Every love tattoo likely implied a dangerous or “forbidden” relationship; every sentence injected under the skin by the needle became the ultimate statement, a philosophy of life.
These collections, however macabre they may seem, open a window on a non-aligned sensibility. They are, so to speak, an illustrated atlas of that part of society which is normally not contemplated nor sung by official history: rejects, losers, outsiders.
Collected at a time when they were intended as a taxonomy of symbols allowing the identification and prevention of specific “perverse” psychologies, they now speak of a humanity who let their freak flag fly.
(Thanks to all those who submitted the Fukushi collection.)
Some days ago I was contacted by a pathologist who recently discovered Bizzarro Bazar, and said she was particularly impressed by the website’s “lack of morbidity”. I could not help but seize the opportunity of chatting a bit about her wonderful profession: here is what she told me about the different aspects of this not so well-known job, which is all about studying deformity, dissimilarities and death to understand what keeps us alive.
What led you to become a pathologist?
When I was sixteen I decided I had to understand disease and death.
The pathologist’s work is quite varied and multifaceted, and mostly carried out on living persons… or at least on surgically removed parts of living persons; but undoubtedly one of the routine activities is autopsy diagnosis, and this is exactly one of the reasons behind my choice, I won’t deny it. Becoming a pathologist was the best way to draw on my passion for anatomy, turning it into a profession, and what’s more I would also have the opportunity to exorcise my fear of death by getting accustomed to it… getting my hands dirty and looking at it up close. I wanted to understand and investigate how people die. Maybe part of it had to do with my visual inclination: pathology is a morphologic discipline which requires sharp visual memory and attention to macroscopic and microscopic details, to differences in shape, to nuances in color.
Is there some kind of common prejudice against your job? How did you explain your “vocation” to friends and relatives?
Actually the general public is not precisely aware of what the pathologist does, hence a certain morbid curiosity on the part of non-experts. Most of them think of Kay Scarpetta, from Cornwell’s novels, or of CSI. When people asked me about my job, at the beginning of my career, I gave detailed explanations of all the non-macabre aspects of my work, namely the importance of a histological diagnosis in oncology for planning the correct treatment. I did this to avoid a certain kind of curiosity, but I was met with puzzled looks. To cut things short, I would then admit: “I also perform autopsies”, and at last a spark of interest would appear in their eyes. I never felt misjudged, but I sometimes noticed some sort of uneasiness. And maybe some slightly sexist prejudice (the unasked question being how a normal girl could be into this kind of thing); those sexy female pathologists you find in novels and TV series were not fashionable yet, and at the postgraduate school I was the only woman. As for friends and relatives… well, my parents never got in the way of my choices… I believe they still haven’t exactly figured out what I do, and if I try to tell them they ask me to spare them the details! As for my teenage kids, who are intrigued by my job, I try to draw their attention to the scientific aspects. In the medical world there is still this idea of the pathologist as some kind of nerd genius, or a person who is totally hopeless at human interaction and therefore seeks shelter in a specialization not directly centered on the doctor-patient relationship. Which is not necessarily true anymore, by the way, as pathologists often perform biopsies, and therefore interact with the patient.
Are autopsies still important today?
Let’s be clear: in Italy, the anatomopatologo is not a forensic pathologist, but closer to what in America would be called a surgical pathologist. The autopsies the pathologist performs are on people who died in a hospital (not on the deceased who fell from a height or committed suicide, for instance), to answer a very specific clinical question, while the forensic autopsy is carried out by a forensic physician on behalf of the DA’s office.
One would think that, with the development of radiological imaging, the post-mortem examination would by now have become outdated. In some facilities the so-called “virtual autopsy” is performed through CAT scans. In reality, in those cases in which a diagnosis could not be determined during the deceased’s life, an autopsy is still the only exam capable of clarifying the final cause of death. Besides direct examination, it allows organ samples to be taken and studied under the microscope with conventional staining, or submitted to more refined tests, such as molecular biology. In the forensic field, direct examination of the body allows us to gather information on the chronology, environment and manner of death, details no other exam could provide.
There is of course a great difference (both on a methodological and emotional level) between macroscopic and microscopic post mortem analysis. In your experience, for scientific purposes, is one of the two phases more relevant than the other or are they both equally essential?
They are both essential, and tightly connected to each other: one cannot do without the other. The visual investigation guides the following optic microscopy exam, because the pathologist samples a specific area of tissue, and not another, to be submitted to the lab on the grounds of his visual perception of dissimilarity.
In my experience of autopsy rooms, albeit limited, I have noticed some defense strategies being used to cope with the most tragic aspects of medical investigation. On one hand a certain humor, though never disrespectful; and, on the other, little precautions aimed at preserving the dignity of the body (but which may also have the function of pushing away the idea that an autopsy is an act of violation). How did you get used to the roughest side of your job?
I witnessed my first autopsy during my first year in medical school, and I still remember every detail of it even today, 30 years later. I nearly fainted. However, once I got over the first impact, I learned to focus on single anatomical details, as if I were a surgeon in the operating room, proceeding with great caution, avoiding useless cuts, always keeping in mind that I’m not working on a corpse, but on a person. With his own history, his loved ones, presumably with somebody outside that room who is now crying over the loss. One thing I always do, after the external exam and before I begin to cut, is cover the dead person’s face. Perhaps with the illogical intent of preventing him from seeing what I’m about to do… and maybe to avoid the unpleasant feeling of being watched.
Are there subjects that are more difficult to work with, on the emotional level?
Are autopsies, as a general rule, open to a non-academic public in Italy? Would you recommend witnessing an autopsy?
No: forensic autopsies are not accessible, for obvious reasons, since there is often a trial underway; neither are the diagnostic post-mortem examinations performed in hospitals. I wouldn’t know whether to recommend seeing an autopsy to anyone. But I do believe every biology or medicine student should be allowed in.
One of the aspects that always fascinated me about pathological anatomy museums is the vitality of disease, the exuberant creativity with which forms can change: the pathological body is fluid, free, forgetful of those boundaries we think are fixed and insurmountable. You just need to glance at some bone tumors, which look like strange mineral sponges, to see the disease as a terrible blooming force.
Maybe this feeling of wonder before a Nature both so beautiful and deadly was what animated the first anatomists: a sort of secret respect for the disease they were fighting off, not much different from the hunter’s reverential fear as he studies his prey before the kill. Have you ever experienced this sense of the sublime? Does the apparent paradox of the passionate anatomist (how can one be a disease enthusiast?) have something to do with this admiration?
To get passionate, in our case, means to feel inclined towards a certain field, a certain way of doing research, a certain method and approach which links a morphologic phenomenon to a functional phenomenon. We do not love disease, we love a discipline which teaches us to see (Domine, ut videam) in order to understand the disease. And, hopefully, cure it.
And yes, of course there is the everyday experience of the sublime, the aesthetic experience, the awe at shapes and colors, and the information they convey. If we know how to interpret it.
Speaking of the vitality of disease: today we recognize in some teratological specimens proof of the attempts by which evolution gropes its way forward, one failed experiment after another. How many of these maladies (literally, “not being apt”) are actually the exact opposite, an attempt at adaptation? Is every mutation (which a different genetic drift might have raised to a dominant phenotype) necessarily pathological?
What I really mean to ask is, of course, another one of those questions that any pathological anatomy museum inevitably suggests: what are the actual boundaries of the Norm?
The norm is established on a statistical basis, following a Gaussian distribution curve, but what falls beyond the 90th percentile (or below the 10th) is not necessarily unnatural, unhealthy, or sick. It is just statistically less represented in the general population with respect to the phenotype we are examining. Whether a statistically infrequent character turns out to be an advantage, only time will tell.
The limits of the norm are therefore conventionally established on a mathematical basis. What lies outside the norm is just more uncommon. Biology undergoes constant transformation (on account of new medicines and therapies, climatic and environmental change, great migrations…), and therefore we are always confronted with new specimens coming in. That is why our job, too, is always evolving.
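The statistical notion of the norm described here can be made concrete with a small numeric sketch. Assuming, purely for illustration, a normally distributed trait (the mean and standard deviation below are invented, not clinical data), the conventional "norm" is just the band between two quantiles of the Gaussian:

```python
from statistics import NormalDist

# Purely illustrative values: a hypothetical normally distributed trait
# with mean 45.0 and standard deviation 3.0 (not real clinical data).
population = NormalDist(mu=45.0, sigma=3.0)

# The conventional "norm" as described in the interview: the band
# between the 10th and 90th percentiles of the distribution.
lower = population.inv_cdf(0.10)
upper = population.inv_cdf(0.90)

print(f"conventional norm: {lower:.1f} to {upper:.1f}")
# By construction, about 20% of the population falls outside this band:
# statistically infrequent, but not thereby pathological.
```

The point of the sketch is that the bounds follow mechanically from the chosen percentiles: move the cutoffs and the "norm" moves with them, which is exactly the conventionality the interviewee describes.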
I didn’t expect such a technical answer… mine was really a “loaded” question. As you know, for years I have been working on the concepts of dissimilarity, exoticism and diversity, and I wanted to provoke you – to see whether from your standpoint a mutant body could also be considered as a somewhat revolutionary space, a disruptive element in a context demanding total compliance to the Norm.
Ask a loaded question… and you’ll get a convenient answer. You are talking about a culture demanding compliance with a social norm; I replied in terms of biology demanding compliance with a norm established by the scientific community on a frequency-based statistical calculation, which is therefore still conventional. In reality, deformity appears in unexpected ways, and should more correctly be described following a probabilistic rather than a frequentist logic. But I’m beginning to sound technical again.
I have seen respected professors light up like children before certain pathological wet specimens. The feeling I had was that the medical gaze somehow justified an interest in extreme visions, usually off-limits to the general public. Is it an exclusively scientific interest? Is it possible to be passionate about this kind of work without being somehow fascinated by the bizarre?
There could be a little self-satisfaction at times. But in general there is sincere passion and enthusiasm for the topic, and that surely cannot be faked. It is a job you can only do if you love it.
Our whole discipline is based on the differential diagnosis between “normal” and “pathological”. I could say that everything pathological is dysmorphic with respect to the norm, and therefore bizarre, different. So yes, you have to feel a fascination for the bizarre. And be very curious.
The passion for the macabre is a growing trend, especially among young people; it is usually deemed negative or cheap, and strongly opposed by Italian academics. This does not happen in other countries (not just the US, but also the UK, for instance), where a common element of museum communication strategies has become the ability to arouse curiosity in a vast public, sometimes playing on pop and dark aspects. Come for the macabre, stay for the science. If young people are drawn to the subject via the macabre imaginary, do you think this could in time lead to the education of new, trustworthy professionals?
Yes, it’s true, there is a growing interest, I’m thinking of some famous anatomical exhibitions which attracted so many visitors they had to postpone the closing date. There is also my kids’ favorite TV show about the most absurd ways to die. I believe that all this is really an incentive and should be used as a basis to arouse curiosity on the scientific aspects of these topics. I think that we can and must use this attraction for the macabre to bring people and particularly youngsters closer to science, even more so in these times of neoshamanic drifts and pseudo-scientific rants. Maybe it could also serve the purpose of admitting that death is part of our daily lives, and to find a way to relate to it. As opposed to the Anglo-Saxon countries, in Italy there still is a religious, cultural and legislative background that partially gets in the way (we have laws making it hard to dissect bodies for study, and I also think of the deeply-rooted idea that an autopsy is a violation/desecration of the corpse, up to those prejudices against science and knowledge leading to grotesque actions like the petition to close the Lombroso Museum).
Has your job changed your relationship with death and dying in any way?
I would say it actually changed my relationship with life and living. My worst fear is no longer a fear of dying. I mostly fear pain, and physical or mental decay, with all the limitations they entail. I hope for a very distant, quick and painless death.
With your twenty years’ experience in the field, can you think of some especially curious anecdotes or episodes you came across?
Many, but I don’t feel comfortable relating episodes that revolve around a person’s remains. But I can tell you that I often do not wonder how these people died, but rather how in the world they could be alive in the first place, given all the diseases I find! And, to me, life looks even more like a precariously balanced wonder.
There are many disorders for which science has not yet been able to find a certain origin and cause.
Under the common term “stuttering” we usually group several kinds of speech impediments, of varying severity; beyond specialist classifications, what is clear even to laymen is that those who suffer from this kind of verbal disfluency end up under severe stress, to the point of being reluctant to start a conversation, suffering anxiety attacks and even, in the most extreme cases, withdrawing from social life. It is a vicious circle, because if stuttering causes anxiety, anxiety in turn worsens its symptoms: the stutterer must therefore overcome a constant feeling of inadequacy, struggling endlessly against the loss of control.
The exact causes of stuttering have not been discovered, just as no truly definitive cure for the problem has yet been found; the anxiety factor is undoubtedly fundamental, as shown by those situations in which, under reduced stress (for instance, talking on the phone), the symptoms tend to fade considerably, if not disappear altogether.
Among the first to stress the importance of the psychological side of stuttering (the patients’ thoughts, attitudes and emotions) was Dr. Wendell Johnson. Recognized today as one of the most influential speech pathologists, he focused his work on these issues at a time, the 1930s, when field studies were in their infancy: yet the data gathered in his research on stuttering children are still the most extensive and exhaustive available to psychologists.
Despite the many effective therapies he initiated, and a whole life devoted to understanding and treating this disorder (from which he himself suffered), Johnson is often remembered only for an unfortunate and ethically questionable experiment, which over time has become sadly famous.
Wendell Johnson was convinced that stuttering was not genetic, but strongly influenced by external factors such as upbringing, self-esteem and, in general, the child’s developmental environment. To prove his theory, in 1939 Johnson devised a complex experiment which he entrusted to a university student, Mary Tudor, under his supervision. The aim of the project was to verify how much praise and reproach influence language development: Tudor would try to “cure” the stuttering of some children by praising their way of speaking, while at the same time (here comes the thorny part) inducing it in other children perfectly able to speak, through constant attacks on their self-esteem. It was decided that the little human guinea pigs would be orphans, as they were easy to find and lacked parental figures who might interfere with the project.
In a veterans’ orphanage in Iowa, Johnson and Tudor selected twenty-two children aged 5 to 15, all of whom had lost their parents in the war; of these, only ten were stutterers. The stuttering children were divided into two groups: to those in the experimental group IA, Tudor was to repeat that their speech was excellent and that they had nothing to worry about. The control group IB received no particular suggestions or compliments.
Then there were the twelve fluent children: they too were divided into two groups, IIA and IIB. The luckier ones were those in the second control group (IIB), who were educated in a normal, proper way. Group IIA, instead, is the real bone of contention: these children, all able to speak well, were led to believe that their speech showed the worrying beginnings of a stutter. During her visits, Tudor pressed them, pointing out their every slightest slip and reciting scripts previously agreed upon with her professor: “We have come to the conclusion that you have serious speech problems… you have many of the symptoms of a child who is beginning to stutter. You must try to stop yourself immediately. Use your willpower… Do anything to keep from stuttering… Don’t even speak unless you know you can do it right. You see how that boy stutters, don’t you? Well, he certainly started just this way.”
The experiment ran from January to May, with Mary Tudor talking to each child for 45 minutes every two or three weeks. The children in group IIA, targeted for their imaginary speech defects, immediately felt the effects of the treatment: their grades worsened, and their confidence crumbled completely. A nine-year-old girl began refusing to speak and kept her eyes covered with one arm all the time; a five-year-old became very quiet. A fifteen-year-old girl, trying to avoid stuttering, repeated “Ah” more and more frequently between words; reproached for this too, she fell into a sort of loop and started snapping her fingers to stop herself from saying “Ah”.
Over the five months of the experiment, the children of group IIA became introverted and insecure. Mary Tudor herself felt the research had gone too far: seized by guilt, she returned to the orphanage three times after the experiment was over, to make up for the damage she was convinced she had caused. So, on her own initiative, she tried to make the children of group IIA understand that, in reality, they had never truly stuttered at all. Whether this belated act of mercy restored the little orphans’ confidence, or disoriented their already confused minds even further, we will never know.
According to Johnson, the results of the experiment showed that true stuttering could arise from a misdiagnosis of the problem within the family: even with the best intentions, parents could mistake small speech defects, perfectly normal during growth, for stuttering, and magnify them to the level of a genuine pathology. The psychologist nonetheless realized that his experiment rested on rather delicate ethical ground, and decided not to publish it, but to make it freely available in the University of Iowa library.
More than sixty years passed, until in 2001 an investigative journalist at the San Jose Mercury News uncovered the whole affair, and immediately sensed he could build a sensational scoop on it. Johnson, who had died in 1965, was considered one of the highest-profile speech scholars, respected and admired; the media furor sparked by the revelation of the experiment fueled an intense debate on the ethics of his work. The University publicly apologized for having funded the “Monster Study” (as the newspapers immediately dubbed it), and on August 17, 2007, six of the orphans still alive obtained $950,000 in compensation from the State for the psychological and emotional wounds they had suffered because of the experiment.
Was this study really so “monstrous”? Did the children of group IIA remain stutterers for the rest of their lives?
In reality, they never became stutterers at all, even though Johnson claimed to have proved his anti-genetic thesis. Mary Tudor had spoken of “unmistakable consequences” for the orphans’ language skills, yet none of the children in group IIA was later diagnosed with a stutter. Some of them testified in court that they had become introverted, but of a truly induced stutter there was not a trace.
Today’s speech pathologists vary considerably in their assessment of the negative effects Johnson’s research might have caused. As for the ethics of the project itself, it should not be forgotten that sensibilities were different in the 1930s, and no international scientific guidelines on human experimentation yet existed. Surprisingly, the most questionable aspect of it all remains the scientific one: professors Nicoline G. Ambrose and Ehud Yairi, in an analysis of the experiment conducted after 2001, are extremely critical of the results, which they consider flawed by the hasty and confused design and by Tudor’s “second thoughts”. Even the idea that stuttering is a behavior the child develops because of parental psychological pressure — a notion Johnson strenuously believed in and repeated like a mantra to the end of his days — is not at all corroborated by the experiment’s data, given that some of the children who were supposed to develop a stutter had actually even improved.
The real stain on Johnson’s brilliant career, then, would not be so much his lack of scruples as his lack of scruple: once stripped of all its sensationalistic elements and analyzed objectively, the research turned out to be less serious than expected in its consequences, but more muddled and tendentious in the results it claimed.
The Monster Study is still an experiment almost universally considered infamous and reprehensible, and it certainly is by today's moral standards, given that it caused undeniable emotional distress to a group of minors already tried enough by the death of their parents. But, as noted, those were different times; before long, far more terrifying human experiments would come to light, this time on our side of the ocean.
To this day, although the precise etiology of the disorder remains unknown, stuttering is believed to have genetic and neurological causes.
Sons of a prestigious gynecologist and an opera singer, the Collyer brothers, Homer and Langley, have become over time iconic New York figures on account of their eccentric life and tragic end: unfortunately, the syndrome to which they unwittingly gave their name is not rare at all, and indeed seems to be slowly but steadily on the rise, especially in large cities.
Born at the end of the nineteenth century, from boyhood the two brothers showed remarkable gifts that promised a life of success: keen and lively in intelligence, they both enrolled at Columbia University and earned degrees, Langley in engineering and Homer in admiralty law. Langley was also an excellent pianist and performed at Carnegie Hall, but he abandoned his musical career quite early, after the first negative reviews. He later used his engineering knowledge to patent a few contraptions, none of which met with success.
From 1909 onward the Collyers lived with their parents in Harlem: at the time it was a fashionable, predominantly white neighborhood, home to many well-to-do professionals and big names in finance and show business. But in 1919 their father, Herman, left the family and moved to another house; he would die four years later. The brothers' mother died in 1929.
Harlem in that period was changing face, turning from a residential area into a decidedly disreputable neighborhood. The Collyer brothers, now orphans, reacted to this change by growing ever more reclusive, and ever more eccentric. They began collecting objects found in the streets, and stopped throwing out their garbage. Their house at 2078 Fifth Avenue gradually filled with the most disparate items, which the Collyers piled up everywhere: toys, battered baby carriages, pieces of violins, ropes and electric cables tangled around stacks of years-old newspapers, heaps of boxes full of broken glasses, chests crammed with sheets of every kind, bundles of dozens of umbrellas, candelabras, mannequin parts, 14 pianos, an entire disassembled automobile, and an endless array of other worthless junk.
Like the ancient Greek poet whose name he bore, in the 1930s Homer went blind. Langley then decided to take care of his brother, and the two men's life became even more mysterious and secluded. Neighborhood kids threw stones at their windows and called them "the ghost brothers". Homer remained buried in the house, inside the dense network of tunnels carved through the garbage that by now filled it to the ceiling; Langley went out only rarely, to fetch the hundred oranges a week he fed his brother in the absurd conviction that they would restore his sight. He grew increasingly obsessed with the idea that some intruder might break into their distorted, overcrowded universe and destroy the intimacy they had carved out for themselves: so, good engineer that he was, he built a whole series of booby traps, more or less deadly, which he scattered and hid among the objects crammed into every room. Anyone daring to enter their world would pay dearly.
And yet it was precisely one of these traps, perhaps camouflaged too well, that doomed the two Collyer brothers. In March 1947, while bringing Homer his dinner, crawling through a tunnel dug through a wall of bundled newspapers, Langley accidentally triggered one of his lethal devices: the wall of suitcases and old magazines collapsed on him, killing him instantly. A few meters away sat Homer, blind and by then paralyzed, powerless to help his brother. He would die of starvation and cardiac arrest a few days later.
The police, alerted by a neighbor, broke into the apartment on March 21, ten hours after Homer's death. But Langley's body, buried under a layer of garbage, was not found right away, and investigators attributed the unbearable smell to the immense quantity of refuse. A manhunt was launched to locate the missing brother, while the authorities, proceeding with extreme caution to avoid the traps, emptied the apartment little by little.
After nearly two weeks of searching through the 180 tons of refuse the Collyers had accumulated over the years, officers found Langley on April 8, his corpse already prey to rats.
The Collyer brothers became famous because they embody a typically New York reality: apartments are often so small, and people so sedentary, that the problem of so-called hoarders, compulsive collectors of junk, often spirals out of control. But the phenomenon is far from confined to the Big Apple.
Hoarders unfortunately exist everywhere, even in Italy, as any firefighter will tell you. They are typically individuals, with a high proportion of the elderly among them, forced into an extremely solitary life; the syndrome begins with the difficulty of parting with mementos and cherished objects, and is often aggravated by financial worries. The habit of "never throwing anything away" soon turns, for these individuals, into an outright phobia of being separated from their things, even the most useless ones. Those affected often fill their rooms to the point of preventing the normal functions those rooms were designed for: one can no longer cook in the kitchen, sleep in the bedroom, and so on. The uncontrolled accumulation of junk obviously poses a danger to the hoarder and to others, and it hampers rescue operations in case of fire or other accidents.
Disposophobia is also known as Collyer syndrome, in memory of the two eccentric brothers. Likewise in memory of this sad, strange story, a tiny "pocket park", called Collyer Park, now stands on the site where their house once was.
Here is the Wikipedia page on disposophobia. And on this page you will find a series of photographs of hoarders' apartments.
Let us welcome our new guest blogger, Marialuisa, who wrote the following article.
Deformity, flesh, and blood are all elements that somehow push us into contact with the purely physical side of our being.
Accustomed to building an image of ourselves that often depends on abstract, artificial elements such as social status, fashion, and power, we frequently forget that we are bodies of bone and blood. And so deformity (of a wound, of putrefaction, of disease) enters our eyes to remind us of what we are: how fragile and, at bottom, how simple we become, stripped of our tricks.
Jenny Saville, an English painter born in 1970, distills in her paintings the raw fascination of deformity: she exposes obscenely fat, wounded, bruised bodies and lets us enter the lives of these two-dimensional characters. She allows us to see them as truly human precisely because they are imperfect; and we cannot help staring, because these wounds, these mounds of fat, these exposed genitals that do not match the face of their owner are something new, something real, so alive as to be more beautiful than the sterile, glossy images we call "perfect" every day. The artist, interested in what she calls the "pathology of painting", produces enormous canvases whose sheer scale almost lets us feel the porosity of the skin and lose ourselves among the folds and cuts of these violated bodies. Our eyes are guided, by the composition of the image and the skillful use of color, along a precise visual path.
In this way, Jenny Saville makes us participants in something that has happened, in an act. She leads us to imagine what is truly obscene (and which, as the word's etymology suggests, remains off-stage): a life spent gorging to defeat sadness, or the fist that came down on a boy's face, or the anguish of those who feel imprisoned in the wrong body. Violence, whether self-inflicted or suffered, is the primary theme; the bodies are the real protagonists, reminding us that this, in the end, is what we are.
Here is the artist's profile on the website of the Gagosian Gallery in New York. More complete than the Italian version is the English-language Wikipedia page dedicated to the artist.