Victorian Hairwork: Interview with Courtney Lane

Part of the pleasure of collecting curiosities lies in discovering the reactions they cause in various people: seeing the wonder arise on the faces of onlookers always moves me, and gives meaning to the collection itself. Among the objects that, at least in my experience, evoke the strongest emotional response are without doubt mourning-related accessories, and in particular those extraordinary 19th-century decorative works made by braiding a deceased person’s hair.

Be it a small brooch containing a simple lock of hair, a framed picture or a larger wreath, there is something powerful and touching in these hairworks, and the feeling they convey is surprisingly universal. You could say that anyone, regardless of their culture, experience or provenance, is “equipped” to recognize the archetypal value of hair: to use it in embroidery, jewelry and decoration is therefore an eminently magical act.

I decided to discuss this peculiar tradition with an expert, who was so kind as to answer my questions.
Courtney Lane is a real authority on the subject, not just on its history but also on its practical side: she studied the original techniques with the intent of bringing them back to life, convinced that this ancient craft can still fulfill its function of preserving memory today.

Can you tell us a bit about yourself?

I am a Victorian hair artist, historian, and self-proclaimed professional weirdo based in Kansas City. My business is called Never Forgotten where, as an artist, I create modern works of Victorian style sentimental hairwork for clients on a custom basis as well as making my own pieces using braids and locks of antique human hair that I find in places such as estate sales at old homes. As an academic, I study the history of hairwork and educate others through lectures as well as online video, and I also travel to teach workshops on how to do hairwork techniques.

Hairwork by Courtney Lane.

Where does your interest for Victorian hairwork come from?

I’ve always had a deep love for history and finding beauty in places that many consider to be dark or macabre. At the young age of 5, I fell in love with the beauty of 18th and 19th century mausoleums in the cemeteries near the French Quarter of New Orleans. Even as a child, I adored the grand gesture of these elaborate tombs for memorializing the dead. This led me to develop a particular fondness for the Victorian era and the funerary customs of the time.
Somewhere along the line in studying Victorian mourning, I encountered the idea of hairwork. A romantic at heart, I’d already known of the romantic value a lock of hair from your loved one could hold, so I very naturally accepted that it would also be a perfect relic to keep of a deceased loved one. I found the artwork to be stunning and the sentiment to be of even greater beauty. I wondered why it was that we no longer practiced hairwork widely, and I needed to know why.
I studied for years trying to find the answers and eventually I learned how to do the artwork myself. I thoroughly believed that the power of sentimental hairwork could help society reclaim a healthier relationship with death and mourning, and so I decided to begin my business to create modern works, educate the public on the often misunderstood history of the artform, and ensure that this sentimental tradition is “Never Forgotten”.

How did hairwork become a popular mourning practice historically? Was the hair collected before death or post mortem? Was it always related to grieving?

Hairwork has taken on a variety of purposes, most of which have been inherently sentimental, but it has not always been related to grieving. With the death of her husband, Queen Victoria fell into a deep mourning which lasted the remaining 40 years of her life. This, in turn, created a certain fashionability, almost a fetishism, of mourning in the Victorian era. Most people today believe all hairwork served to process a loss, but between the 1500s and early 1900s hairwork also included romantic keepsakes from a loved one or family mementos, and sometimes served as memorabilia from an important time in one’s life. As an example, many of the large three-dimensional wreaths you can still see actually served as a form of family history. Hair was often collected from several (often living) members of the family and woven together to create a genealogy. I’ve seen other examples of hairwork simply commemorating a major life event such as a first communion or a wedding. Long before hairwork became an art form, humans had already been exchanging locks of hair; so it’s only natural that there were instances of couples wearing jewelry that contained the hair of their living lovers.

As far as mourning hairwork is concerned, the hair was sometimes collected post mortem, and sometimes the hair was saved from an earlier time in their life. As hair was such an important part of culture, it was often saved when it was cut whether or not there was an immediate plan for making art or jewelry with it.
The idea of using hair as a mourning practice largely stems from Catholicism in the Middle Ages and the power of saintly relics in the church. The relic of a saint is more than just the physical remains of their body, rather it provides a spiritual connection to the holy person, creating a link between life and death. This belief that a relic can be a substitute for the person easily transitioned from public, religious mourning to private, personal mourning.
Of the types of relics (bone, flesh, etc.), hair is by far the most accessible to the average person, as it does not need any sort of preservation to avoid decomposition, unlike the rest of the body; collecting it from the body is as simple as using a pair of scissors. Hair is also one of the most identifiable parts of a person, so even though a piece of bone might be just as much of a relic, hair is a part of your loved one that you see every day in life, and can continue to recognize after death.

Was hairwork strictly a high-class practice?

Hairwork was not strictly high-class. Although hairwork was kept by some members of the upper class, it was predominantly a middle-class practice. Some hairwork was done by professional hairworkers, and of course, anyone commissioning them would need the means to do so; but a lot of hairwork was done in the home, usually by the women of the family. With this being the case, the only expenses would be the crafting tools (which many middle-class women would already likely have around the home), and the jewelry findings, frames, or domes to place the finished hairwork in.

How many people worked at a single wreath, and for how long? Was it a feminine occupation, like embroidery?

Hairwork was usually, but not exclusively done by women and was even considered a subgenre of ladies’ fancy work. Fancy work consisted of embroidery, beadwork, featherwork, and more. There are even instances of women using hair to embroider and sew. It was thought to be a very feminine trait to be able to patiently and meticulously craft something beautiful.
As far as wreaths are concerned, the number of people who would work together to create one varied. Only a few are well documented enough for us to know for sure.
I’ve also observed dozens of different techniques used to craft flowers in wreaths and some techniques are more time consuming than others. One of the best examples I’ve seen is an incredibly well documented piece that indicates that the whole wreath consists of 1000 flowers (larger than the average wreath) and was constructed entirely by one woman over the span of a year. The documentation also specifies that the 1000 flowers were made with the hair of 264 people.


Why did it fall out of fashion during the 20th century?

Hairwork started to decline in popularity in the early 1900s. There were several reasons.
The first reason was the growth of hairwork as an industry. Several large companies and catalogues started advertising custom hairwork, and many people feared that sending out for the hairwork rather than making it in the home would take away from the sentiment. Among these companies was Sears, Roebuck and Company, and in one of their catalogues in 1908, they even warned, “We do not do this braiding ourselves. We send it out; therefore we cannot guarantee same hair being used that is sent to us; you must assume all risk.” This, of course, deterred people from using professional hairworkers.
Another reason lies with the development and acceptance of germ theory in the Victorian era. The more people learned about germs and the more sanitary products were being sold, the more people began to view the human body and all its parts as a filthy thing. Along with this came the thought that hair, too, was unclean and people began to second guess using it as a medium for art and jewelry.
World War I also had a lot to do with the decline of hairwork. Not only was there a general depletion in resources for involved countries, but more and more women began to work outside of the home and no longer had the time to create fancy work daily. During war time when everybody was coming together to help the war effort, citizens began to turn away from frivolous expenses and focus only on necessities. Hair at this time was seen for the practical purposes it could serve. For example, in Germany there were propaganda posters encouraging women to cut their long hair and donate it to the war effort when other fibrous materials became scarce. The hair that women donated was used to make practical items such as transmission belts.
With all of these reasons working together, sentimental hairwork was almost completely out of practice by the year 1925; no major companies continued to create or repair hairwork, and making hairwork at home was no longer a regular part of daily life for women.

19th-century hairworks have become trendy collectors’ items; this is due in part to a fascination with Victorian mourning practices, but it also seems to me that these pieces hold a special value, as opposed to other items like regular brooches or jewelry, because of – well, the presence of human hair. Do you think we might still be attaching some kind of “magical”, symbolic power to hair? Or is it just an expression of morbid curiosity for human remains, albeit in a mild and not-so-shocking form?

I absolutely believe that all of these are true. Especially amongst people less familiar with these practices, there is a real shock value to seeing something made out of hair. When I first introduce the concept of hairwork to people, some find the idea to be disgusting, but most are just fascinated that the hair does not decompose. People today are so out of touch with death that they immediately think of hair as a part of the body and don’t understand how it can still be perfectly pristine over a hundred years later. For those who don’t often ponder their own mortality, thinking about the fact that hair can physically live on long after they’ve died can be a completely staggering realization.
Once the initial surprise and morbid curiosity have faded, many people recognize a special value in the hair itself. Amongst serious collectors of hair, there seems to be a touching sense of fulfillment in the opportunity to preserve the memory of somebody who once was loved enough to be memorialized this way – even if they remain nameless today. Some may say it is a spiritual calling, but I would say at the very least it is a shared sense of mortal empathy.

What kind of research did you have to do in order to learn the basics of Victorian hairworks? After all, this could be described as a kind of “folk art”, which was meant for a specific, often personal purpose; so were there any books at the time holding detailed instructions on how to do it? Or did you have to study original hairworks to understand how it was done?

Learning hairwork was a journey for me. First, I should say that there are several different types of hairwork and some techniques are better documented than others. Wire work is the type of hairwork you see in wreaths and other three-dimensional flowers. I was not able to find any good resources on how to do these techniques, so in order to learn, I began by studying countless wreaths. I took every opportunity I could to study wreaths that were out of their frame or damaged so I could try to put them back together and see how everything connected. I spent hours staring at old pieces and playing with practice hair through trial and error.
Other techniques are palette work and table work. Palette work includes flat pictures of hair which you may see in a frame or under glass in jewelry, and table work includes the elaborate braids that make up a jewelry chain such as a necklace or a watch fob.
The Lock of Hair by Alexanna Speight and Art of Hair Work: Hair Braiding and Jewelry of Sentiment by Mark Campbell teach palette work and table work, respectively. Unfortunately, being so old, these books use archaic English and also reference tools and materials that are no longer made or not as easy to come by. Even after reading these books, it takes quite a bit of time to find modern equivalents and to practice with a few substitutions to find the best alternative. For these reasons, I would love to write an instructional book explaining all three of these core techniques in an easy-to-understand way using modern materials, so that hairwork as a craft can be more accessible to a wider audience.

Why do you think this technique could still be relevant today?

The act and tradition of saving hair is still present in our society. Parents often save a lock of their child’s first haircut, but unfortunately that lock of hair will stay hidden away in an envelope or a book, rarely to be seen again. I’ve also gained several clients just from meeting someone who has never heard of hairwork, but who still felt compelled to cut a lock of hair from their deceased loved one to keep. Their eyes consistently light up when they learn that they can wear it in jewelry or display it in artwork. Time and again, these people ask me if it’s weird that they saved this hair. Often, they don’t even know why they did. It’s a compulsion that many of us feel, but we don’t talk about it or celebrate it in our modern culture, so they think they’re strange or morbid even though it’s an incredibly natural thing to do.
Another example is saving your own hair when it’s cut. Especially in instances of cutting hair that’s been grown very long or hair that has been locked, I very often encounter people who have felt so much of a personal investment in their own hair that they don’t feel right throwing it out. These individuals may keep their hair in a bag for years, not knowing what to do with it, only knowing that it felt right to keep. This makes perfect sense to me, because hair throughout history has always been a very personal thing. Even today, people identify each other by hair whether it be length, texture, color, or style. Different cultures may wear their hair in a certain way to convey something about their heritage, or individuals will use their own creativity or sense of self to decide how to wear their hair. Whether it be for religion, culture, romance, or mourning, the desire to attach sentimental value to hair and the impulse to keep the hair of your loved one are inherently human.
I truly believe that being able to proudly display our hair relics can help us process some of our most intimate emotions and live our best lives.

You can visit Courtney Lane’s website Never Forgotten, and follow her on Facebook, Instagram, Twitter, and YouTube. If you’re interested in the symbolic and magical value of human hair, here is my post on the subject.

Dreams of Stone

Stone appears to be still, unchangeable, untouched by the tribulations of living beings.
Being outside of time, it has always pointed back to the concept of Creation.
Nestled, inaccessible, closed inside the natural chest of rock, those anomalies we call treasures lie waiting to be discovered: minerals of the strangest shapes, unexpected colors, otherworldly transparency.
When a stone is broken, designs may be uncovered which seem to be the work of an intellect. One can recognize panoramas, human figures, cities, plants, cliffs, ocean waves.

Who is the artist that hides these fantasies inside the rock? Are they created by God’s hand? Or were these visions and landscapes dreamed by the stone itself, and engraved in its heart?

If during the Middle Ages these stone motifs were probably seen as evidence of the anima mundi, at the beginning of the modern period they had already been relegated to the status of simple curiosities.
16th- and 17th-century naturalists, in their wunderkammern and in books devoted to the wonders of the world, classified the pictures discovered in stone as “jokes of Nature” (lusus naturæ). In fact, Roger Caillois writes (La scrittura delle pietre, Marietti, 1986):

The erudite scholars, Aldrovandi and Kircher among others, divided these wonders into genres and species according to the image they saw in them: Moors, bishops, shrimps or water streams, faces, plants, dogs or even fish, tortoises, dragons, skulls, crucifixes, anything a fervid imagination could recognize and identify. In reality there is no being, monster, monument, event or spectacle of nature, of history, of fairy tales or dreams, nothing that an enchanted gaze couldn’t see inside the spots, designs and profiles of these stones.

It is curious to note, incidentally, that these “caprices” were brought up many times during the long debate regarding the mystery of fossils. Leonardo Da Vinci had already guessed that sea creatures found petrified on mountain tops could be the remains of living organisms, but in the following centuries fossils came to be thought of as mere whims of Nature: if stone was able to reproduce a city skyline, it could well create imitations of seashells or living things. Only by the mid-18th century did fossils cease to be considered lusus naturæ.

Among all kinds of pierres à images (“image stones”), there was one in which the miracle most often recurred. A specific kind of marble, found near Florence, was called pietra paesina (“landscape stone”, or “ruin marble”) because its veinings looked like landscapes and silhouettes of ruined cities. Maybe the fact that quarries of this particular marble were located in Tuscany explains why the first school of stone painting was established at the court of the Medici family; other workshops specializing in this minor genre arose in Rome, in France and in the Netherlands.


Aside from the pietra paesina, which was perfect for conjuring marine landscapes or rugged desolation, other kinds of stone were used, such as alabaster (for celestial and angelic suggestions) and basanite, used to depict night views or to represent a burning city.

Perhaps it all started with Sebastiano del Piombo’s experiments with oil on stone, which had the intent of creating paintings that would last as long as sculptures; but in fact the colors did not stand the test of time on polished slates, and this technique proved to be far from eternal. Sebastiano del Piombo, who was interested in a refined and formally strict research, abandoned the practice, but the method had an unexpected success within the field of painted oddities — thanks to a “taste for rarities, for bizarre artifices, for the ambiguous, playful interchange of art and nature that was highly appreciated both during 16th-century Mannerism and the baroque period” (A. Pinelli in Repubblica, January 22, 2001).

Therefore many renowned painters (Jacques Stella, Stefano della Bella, Alessandro Turchi also known as l’Orbetto, Cornelis van Poelemburgh) began to use the veinings of the stone to produce painted curios, in tension between naturalia and artificialia.

Following the inspiration offered by the marble scenery, they added human figures, ships, trees and other details to the picture. Sometimes little was needed: it was enough to paint a small balcony, the outline of a door or a window, and the shape of a city immediately gained an outstanding realism.

In this way Johann König, Matieu Dubus, Antonio Carracci and others used the ribbon-like ornaments and profound brightness of agate, the coils and curves of alabaster. In pious subjects, the painter drew the mystery of a milky supernatural flare from the deep, translucent hues; or, if he wanted to depict a Red Sea scene, he just had to crowd the vortex of waves, already suggested by the veinings of the stone, with frightened victims.

Filippo Napoletano was especially well-versed in this eccentric genre, which between the 16th and 18th centuries was the object of an extensive trade.
In 1619 the painter offered Cosimo II de’ Medici seven stories of Saints painted on “polished stone called alberese”, and some of his works still retain a powerful quality, on account of their innovative composition and vivid expressive intensity.
His extraordinary depiction of the Temptations of Saint Anthony, for instance, is a “little masterpiece [where] the artist’s intervention is minimal, and the Saint’s entire spiritual drama finds its echo in the melancholy of a landscape of Dantesque tone” (P. Gaglianò on ExibArt, December 11, 2000).

The charm of a stone that “mimics” reality, giving the illusion of a secret theater, remains unaltered today, as Caillois elegantly explains:

Such simulacra, hidden on the inside for a long time, appear when the stones are broken and polished. To an eager imagination, they evoke immortal miniature models of beings and things. Surely, chance alone is at the origin of the prodigy. All similarities are after all vague, uncertain, sometimes far from truth, decidedly gratuitous. But as soon as they are perceived, they become tyrannical and they offer more than they promised. Anyone who knows how to observe them, relentlessly discovers new details completing the alleged analogy. These kinds of images can miniaturize for the benefit of the person involved every object in the world, they always provide him with a copy which he can hold in his hand, position as he wishes, or stash inside a cabinet. […] He who possesses such a wonder, produced, extracted and fallen into his hands by an extraordinary series of coincidences, happily imagines that it could not have come to him without a special intervention of Fate.

Still, unchangeable, untouched by the tribulations of living beings: it is perhaps appropriate that when stones dream, they give birth to these abstract, metaphysical landscapes, endowed with a beauty as alien as the beauty of rock itself.

Several artworks from the Medici collections are visible in a wonderful and little-known museum in Florence, the Opificio delle Pietre Dure.
The best photographic book on the subject is the catalogue Bizzarrie di pietre dipinte (2000), curated by M. Chiarini and C. Acidini Luchinat.

The Abominable Vice

Among the bibliographic curiosities I have been collecting for years, there is also a little book entitled L’amico discreto. It’s the 1862 Italian translation of The Silent Friend (1847) by R. and L. Perry; aside from 100 beautiful anatomical plates, the book also bears a priceless subtitle: Observations on Onanism and Its Baneful Results, Including Mental and Sexual Incapacity and Impotence.

Just by skimming through the table of contents, it’s clear how masturbation was indicated as the main cause for a wide array of conditions: from indigestion to “hypochondriac melancholy”, from deafness to “bending of the penis”, from an emaciated complexion to the inability to walk, in a climax of ever more terrible symptoms preparing the way for the ultimate, inevitable outcome — death.
One page after the other, the reader learns why onanism is to be blamed for such illnesses, specifically because it provokes an

excitement of the nervous system [which] by stimulating the organs to transient vigour, brings, ere middle life succeeds the summer of manhood, all the sensible infirmities and foibles of age; producing in its impetuous current, such an assemblage of morbid irritation, that even on trivial occasions its excitement is of a high and inflammable character, and its endurance beyond the power of reason to sustain.

But this is just the beginning: the worst damage is on the mind and soul, because this state of constant nervous stimulation

places the individual in a state of anxiety and misery for the remainder of his existence, — a kind of contingency, which it is difficult for language adequately to describe; he vegetates, but lives not: […] leading the excited deviating mind into a fertile field of seductive error — into a gradual and fatal degradation of manhood — into a pernicious, disgraceful, and ultimately almost involuntary application of those inherent rights which nature wisely instituted for the preservation of her species […] in defiance of culture, moral feeling, moral obligation, and religious impressions: thus the man, who, at the advent of youth and genius was endowed with gaiety and sociality, becomes, ere twenty-five summers have shed their lustre on him, a misanthrope, and a nadir-point of discontent! What moral region does that man live in? […] Is it nothing to light the gloomy torch that guides, by slow and melancholy steps to the sepulchre of manhood, in the gay and fascinating spring-time of youth and ardent desire; when the brilliant fire of passion, genius, and sentiment, ought to electrify the whole frame?

This being a physiology and anatomy essay, its embellishments and evocative language (closer to second-rate poetry than to science) seem oddly out of place today, and we can smile upon reading its absurd theories; yet The Silent Friend is just one of many 19th-century texts demonizing masturbation, a genre popular ever since 1712, when an anonymous priest published a volume called Onania, followed in 1760 by L’Onanisme by the Swiss doctor Samuel-Auguste Tissot, which rapidly became a best-seller of its time.
Now, if physicians reacted in such a harsh way against male masturbation, you can guess their stance on female auto-eroticism.

Here, the repulsion for an act already considered aberrant was joined by all those ancestral fears regarding female sexuality. From the ancient vagina dentata (here is an old post about it) to Plato’s description of the uterus (hystera) as an aggressive animal roaming through the woman’s abdomen, by way of the theological precepts of the Biblical-Christian tradition, medicine inherited a somber, essentially misogynistic vision: female sexuality, a true repressed collective unconscious, was perceived as dangerous and ungovernable.
Another text in my library is the female analogue of Tissot’s Onania: written by J.D.T. de Bienville, La Ninfomania ovvero il Furore Uterino (“Nymphomania, or The Uterine Fury”) was originally published in France in 1771.
I’m pasting here a couple of passages, which show a style very similar to the previous quotes:

We see some perverted young girls, who have conducted a voluptuous life over a long period of time, suddenly fall prey to this disease; and this happens when forced retirement is keeping them from those occasions which facilitated their guilty and fatal inclination. […] All of them, after they are conquered by such malady, occupy themselves with the same force and energy with those objects which light in their passion the infernal flame of lewd pleasure […], they indulge in reading lewd Novels, that begin by bending their heart to soft feelings, and end up inspiring the most depraved and gross incontinence. […] Those women who, after taking a few steps in this horrible labyrinth, miss the strength to come back, are drawn almost imperceptibly to excesses, which after corrupting and damaging their good name, deprive them of their own life.

The book goes on to describe the hallucinatory state into which nymphomaniacs fall, frantically hurling themselves at men (by nature all chaste and pure, it seems), and barely leaving them “the time to escape their hands”.
Of course, this is an 18th-century text. But things did not improve in the following century: during the 19th century, in fact, the ill-concealed desire to repress female sexuality found one of its cruelest incarnations, the so-called “extirpation”.

This euphemism was used to indicate the practice of clitoridectomy, the surgical removal of the clitoris.
Everybody knows that female genital mutilation continues to be a reality in many countries, and it has been the focus of several international campaigns to abandon the practice.
It seems hard to believe that, far from being solely a tribal tradition, it became widespread in Europe and in the United States within the frame of modern Western medicine.
Clitoridectomy, a simple yet brutal operation, was based on the idea that female masturbation led to hysteria, lesbianism and nymphomania. The perfect circular reasoning behind this theory was the following: in mental institutions, insane female patients were often caught masturbating, therefore masturbation had to be the cause of their lunacy.

One of the most fervent promoters of extirpation was Dr. Isaac Baker Brown, English gynaecologist and obstetrical surgeon.
In 1858 he opened a clinic in Notting Hill, and his therapies became so successful that Baker Brown resigned from Guy’s Hospital to work privately full time. By means of clitoridectomy, he was able to cure (if we are to trust his own words) several kinds of madness, epilepsy, catalepsy and hysteria in his patients: in 1866 he published a nice little book on the subject, which was praised by the Times because Brown had “brought insanity within the scope of surgical treatment”. In his book, Brown reported 48 cases of female masturbation, the heinous effects on the patients’ health, and the miraculous results of clitoridectomy in curing the symptoms.

We don’t know for sure how many women ended up under the enthusiastic doctor’s knife.
Brown would have probably carried on with his mutilation work, if he hadn’t made the mistake of setting up a publicity campaign to advertise his clinic. Even then, self-promotion was considered ethically wrong for a physician, so on April 29, 1866, the British Medical Journal published a heavy j’accuse against the doctor. The Lancet followed shortly after, then even the Times proved to have changed position and asked if the surgical treatment of illness was legal at all. Brown ended up being investigated by the Lunacy Commission, which dealt with the patients’ welfare in asylums, and in panic he denied he ever carried out clitoridectomies on his mentally ill patients.

But it was too late.
Even the Royal College of Surgeons turned away from him, and a meeting decided (by 194 votes to 38) his removal from the Obstetric Society of London.
R. Youngson and I. Schott, in A Brief History of Bad Medicine (Robinson, 2012), highlight the paradox of this story:

The extraordinary thing was that Baker Brown was disgraced, not because he practised clitoridectomy for ridiculous indications, but because, out of greed, he had offended against professional ethics. No one ever suggested that there was anything wrong with clitoridectomy, as such. Many years were to pass before this operation was condemned by the medical profession.

And many more before masturbation was eventually freed from medical criminalization and moral prejudice: at the beginning of the 20th century doctors still recommended the use of constrictive laces and gears, straitjackets, and even shock treatments like cauterization or electroconvulsive therapy.

A 1903 patent to prevent erections and nocturnal pollutions through the use of spikes, electric shocks and an alarm bell.

Within this dreadful galaxy of old anti-masturbation devices, there’s one looking quite harmless and even healthy: corn flakes, which were invented by the famous Dr. Kellogg as a dietary adjuvant against the temptations of onanism. And yet, whenever cereals didn’t do the trick, Kellogg advised that young boys’ foreskins be sewn with wire; as for young girls, he recommended burning the clitoris with phenol, which he considered

an excellent means of allaying the abnormal excitement, and preventing the recurrence of the practice in those whose will-power has become so weakened that the patient is unable to exercise entire self-control.
The worse cases among young women are those in which the disease has advanced so far that erotic thoughts are attended by the same voluptuous sensations that accompany the practice. The author has met many cases of this sort in young women, who acknowledged that the sexual orgasm was thus produced, often several times daily. The application of carbolic acid in the manner described is also useful in these cases in allaying the abnormal excitement, which is a frequent provocation of the practice of this form of mental masturbation.

(J. H. Kellogg, Plain Facts for Old And Young, 1888)

It was not until the Kinsey Reports (1948-1953) that masturbation was eventually legitimized as a natural and healthy part of sexuality.
All in all, as Woody Allen put it, it’s just “sex with someone you love”.

On the “fantastic physiology” of the uterus, there is a splendid article (in Italian) here. Wikipedia also has a page on the history of masturbation. I also recommend Orgasm and the West: A History of Pleasure from the Sixteenth Century to the Present, by R. Muchembled.

Freaks: Gaze and Disability

Introduction: those damn colored glasses

The image below is probably my favorite illusion (in fact I wrote about it before).

At first glance it looks like a family in a room, having breakfast.
Yet when the picture is shown to people living in some rural parts of Africa, they see something different: a family having breakfast in the open, under a tree, while the mother balances a box on her head, perhaps to amuse her children. This is not an optical illusion, it’s a cultural one.

The origins of this picture are uncertain, but it is not relevant here whether it was actually used in a psychological study, nor whether it betrays a prejudice about life in the Third World. The power of this illustration lies in underlining how culture is an inevitable filter of reality.

It reminds me of a scene in Werner Herzog’s documentary film The Flying Doctors of East Africa (1969), in which the doctors find it hard to explain to the local population that flies carry infections; showing big pictures of the insects and describing their dangers has little effect because people, unaccustomed to the conventions of our graphic representations, do not understand that the pictures are to scale, and think: “Sure, we will watch out, but around here flies are never THAT big”.

Much as we may not want to admit it, our vision is socially conditioned. Culture is like a pair of glasses with colored lenses: quite useful on many occasions for deciphering the world, but harmful on many others, and it is hard to take these glasses off by mere willpower.

‘Freak pride’ and disability

Let’s address the issue of “freaks”: originally a derogatory term, the word has by now gained a peculiar cultural charm and, as such, I have always used it with the purpose of fighting pietism and giving diversity its just value.
Every time I have set out to talk about human marvels, I have experienced first-hand how difficult it is to write about these people.

Reflecting on the most correct angle to address the topic means to try and take off culture’s colored glasses, an almost impossible task. I often wondered if I myself have sometimes succumbed to unintended generalizations, if I unwillingly fell into a self-righteous approach.
Sure enough, I have tried to tell these amazing characters’ stories through the filter of wonder: I believed that – equality being a given – the separation between the ordinary and the extra-ordinary could be turned in favor of disability.
I have always liked those “deviants” who decided to take back their exotic bodies, their distance from the Norm, in some sort of freak pride that would turn the concept of handicap inside out.

But is it really the most correct approach to diversity and, in some cases, disability? To what extent is this vision original, and to what extent is it merely derived from a long cultural tradition? What if the freak, all pride aside, actually just wanted an ordinary dimension, and what he was looking for was the comfort of an average life? What is the most ethical narrative?

This doubt, I think, arose from a paragraph by Fredi Saal, born in 1935, a German author who spent the first part of his existence between hospitals and care homes because he was deemed “uneducable”:

No, it is not the disabled person who experiences him- or herself as abnormal — she or he is experienced as abnormal by others, because a whole section of human life is cut off. Thus this very existence acquires a threatening quality. One doesn’t start from the disabled persons themselves, but from one’s own experience. One asks oneself, how would I react, should a disability suddenly strike, and the answer is projected onto the disabled person. Thus one receives a completely distorted image. Because it is not the other fellow that one sees, but oneself.

(F. Saal, Behinderung = Selbstgelebte Normalität, 1992)

As much as the idea of a freak pride is dear to me, it may well be another subconscious projection: I may just like to think that I would react to disability that way… and yet one more time I am not addressing the different person, but rather my own romantic and unrealistic idea of diversity.

We obviously cannot look through the eyes of a disabled person; there is an insuperable barrier, but it is the same one that ultimately separates all human beings. The “what would I do in that situation?” Saal talks about, the act of projecting ourselves onto others, is something we do endlessly, and not just with the disabled.

The figure of the freak has always been ambiguous – or, better, what is hard to understand is our own gaze on the freak.
I think it is therefore important to trace the origins of this gaze, to understand how it evolved: we could even discover that this thing we call disability is actually nothing more than another cultural product, an illusion we are “trained” to recognize in much the same way we see the family having breakfast inside a living room rather than out in the open.

In my defense, I will say this: if it is possible for me to imagine a freak pride, it is because the very concept of freak does not come out of the blue, and does not even entail disability. Many people working in freakshows were disabled, others were not. That was not the point. The real characteristic that brought those people on stage was the sense of wonder they could evoke: some bodies were admired, others caused scandal (being seen as unbearably obscene), but the public bought the ticket to be shocked, amazed and shaken in their certainties.

In ancient times, the monstrum was a divine sign (it shares its etymological root with the Italian verb mostrare, “to show”), which had to be interpreted – and very often feared, as a warning of doom. But if the monstrous sign was usually seen as a bearer of misfortune, some disabilities were not (for instance blindness and lunacy, which were considered forms of clairvoyance; see V. Amendolagine, Da castigo degli dei a diversamente abili: l’identità sociale del disabile nel corso del tempo, 2014).

During the Middle Ages the problem of deformity becomes much more complex: on one hand, physiognomy suggested a correlation between ugliness and a corrupted soul, and literature offers many examples of enemies being libeled through the description of their physical defects; on the other, theologians and philosophers (Saint Augustine above all) considered deformity just another instance of Man’s penal condition on this earth, so much so that in the Resurrection all signs of it would be erased (J. Ziegler in Deformità fisica e identità della persona tra medioevo ed età moderna, 2015); some Christian female saints even went to the extreme of invoking deformity as a penance (see my Ecstatic Bodies: Hagiography and Eroticism).
Deformity also precluded access to priesthood (ordo clericalis), on the basis of a famous passage from Leviticus in which those who have imperfect bodies are forbidden to offer sacrifice on the altar (P. Ostinelli, Deformità fisica…, 2015).

The monstrum becoming mirabile, worthy of admiration, is a more modern idea, but it was around well before traveling circuses, before Tod Browning’s “One of us!”, and before hippie counterculture seized it: this concept is opposed to the other great modern invention regarding disability, which is commiseration.
The whole history of our relationship with disability fluctuates between these two poles: admiration and pity.

The right kind of eyes

In the German exhibition Der (im)perfekte Mensch (“The (im)perfect Human Being”), held in 2001 in the Deutsches Hygiene Museum in Dresden, the social gaze at people with disabilities was divided into six main categories:

– The astonished and medical gaze
– The annihilating gaze
– The pitying gaze
– The admiring gaze
– The instrumentalizing gaze
– The excluding gaze

While this list can certainly be discussed, it has the merit of tracing some possible distinctions.
Among all the kinds of gaze listed here, the most troubling might be the pitying gaze, because it implies the observer’s superiority, and a definitive judgment on a condition which, to the eyes of the “normal” person, cannot but seem tragic: it expresses a self-righteous, intimate certainty that the other is a poor cripple to be pitied. The underlying thought is that there can be no luck, no happiness in being different.

The concept of poor cripple, which (although hidden behind more politically correct words) is at the core of all fund-raising marathons, is still deeply rooted in our culture, and conveys a distorted vision of charity – often more focused on our own “pious deed” than on people with disabilities.

As for the pitying gaze, the most ancient historical example we know of is this 1620 print, kept at the Tiroler Landesmuseum Ferdinandeum in Innsbruck, which shows a disabled carpenter called Wolffgang Gschaiter lying in his bed. The text explains how this man, after suffering unbearable pain in his left arm and back for three days, found himself completely paralyzed. For fifteen years, the print tells us, he was only able to move his eyes and tongue. The purpose of the print was to collect donations and charity money, and readers were invited to pray for him in the nearby church of the Three Saints in Dreiheiligen.

This pamphlet is interesting for several reasons: in the text, disability is explicitly described as a “mirror” of the observer’s own misery, thereby establishing the idea that one must think of oneself while watching it; a distinction is made between body and soul to heighten the drama (the carpenter’s soul can be saved, his body cannot); and the expression “poor cripple” is recorded for the first time.
But most of all, this little piece of paper is one of the very first examples of mass communication in which disability is associated with the idea of donations, of fund-raising. Basically what we see here is a proto-telethon, focused on charity and church prayers to cleanse the public conscience, and at the same time an instrument in line with Counter-Reformation propaganda (see V. Schönwiese, The Social Gaze at People with Disabilities, 2007).

During the previous century, another kind of gaze had already developed: the clinical-anatomical gaze. This 1538 engraving by Albrecht Dürer shows a woman lying on a table, while an artist meticulously draws the contour of her body. Between the two figures stands a framework, on which stretched-out strings divide the painter’s vision into small squares, so that he can accurately transpose it onto a sheet of paper equipped with the same grid. Each curve, each detail is broken down and replicated thanks to this device: vision becomes the leading sense, organized in an aseptic, geometric, purely formal frame. This was the phase in which a real cartography of the human body was developed, and in this context deformity was studied in much the same manner. This is the “astonished and medical gaze”, which shows no sign of ethical or pitying judgment, but whose ideology is actually one of mapping, dividing, categorizing and ultimately dominating every possible variable of the cosmos.

In the wunderkammer of Ferdinand II, Archduke of Austria (1529-1595), inside Ambras Castle near Innsbruck, there is a truly exceptional portrait. A portion of the painting was originally covered by a red paper curtain: those visiting the collection in the Sixteenth Century might have seen something close to this reconstruction.

Those willing and brave enough could pull the paper aside to admire the whole picture: thus the subject’s limp and deformed body appeared, portrayed in raw detail and with coarse realism.

What Sixteenth-Century observers saw in this painting, we cannot know for sure. To understand how relative such views are, it suffices to recall that at the time “human marvels” included, for instance, foreigners from exotic countries, and that a sub-category of foreigners were the cretins said to inhabit certain geographic regions.
In books like Giovan Battista de’ Cavalieri’s Opera ne la quale vi è molti Mostri de tute le parti del mondo antichi et moderni (1585), people with disabilities can be found alongside monstrous apparitions, legless persons are depicted next to mythological Chimeras, and so on.

But the red paper curtain in the Ambras portrait is an important signal, because it means that such a body was on the one hand considered obscene, capable of upsetting the spectator’s sensibility; on the other hand, the bravest or most curious onlookers could face the whole image. This leads us to believe that monstrosity in the Sixteenth Century had at least partially been released from the idea of prodigy, and freed from the superstitions of previous centuries.

This painting is therefore a perfect example of “astonished and medical” gaze; from deformity as mirabilia to proper admiration, it’s a short step.

The Middle Path?

The admiring gaze is the one I have often opted for in my articles. My writing and thinking practice coincides with John Waters’ approach, when he claims he feels some kind of admiration for the weird characters in his movies: “All the characters in my movies, I look up to them. I don’t think about them the way people think about reality TV – that we are better and you should laugh at them.”

And yet, here we run the risk of falling into the opposite trap, an excessive idealization. It may well be because of my peculiar allergy to the concept of “heroes”, but I am not interested in giving hagiographic versions of the life of human marvels.

All these thoughts, which I have shared with you, lead me to believe there is no easy balance. One cannot talk about freaks without running into some kind of mistake or generalization, without falling victim to the deception of the colored glasses.
Every communication between us and those with different/disabled bodies happens in a sort of limbo, where our gaze meets theirs. And in this space there can never be a truly authentic confrontation, because from a physical perspective we are separated by experiences too far apart.
I will never be able to understand other people’s bodies, and neither will they understand mine.

But maybe this distance is exactly what draws us together.

“Everyone stands alone at the heart of the world…”

Let’s consider the only reference we have – our own body – and try to break the habit.
I will borrow the opening words from the introduction I wrote for Nueva Carne by Claudio Romo:

Our bodies are unknowable territories.
We can dismantle them, cut them up into ever smaller parts, study their obsessive geometries, meticulously map every anatomical detail, rummage in their entrails… and their secret will continue to escape us.
We stare at our hands. We explore our teeth with our tongues. We touch our hair.
Is this what we are?

Here is the ultimate mind exercise, my personal solution to the freaks’ riddle: the only sincere and honest way I can find to relate to diversity is to make it universal.

Johnny Eck woke up in this world without the lower limbs; his brother, on the contrary, emerged from the confusion of shapes with two legs.
I too am equipped with feet, including toes I can observe, down there, moving whenever I want them to. Are those toes still me? I do not know the reach of my own identity, nor whether there is an exact point where its extension begins.
On closer view, my experience and Johnny’s are different yet equally mysterious.
We are all brothers in the enigma of the flesh.

I would like to ideally sit with him — with the freak, with the “monster” — out on the porch of memories, before the sunset of our lives.
‘So, what did you think of this strange trip? Of this strange place we wound up in?’, I would ask him.
And I am sure that his smile would be like mine.

La Morgue, yesterday and today

Regarding the Western taboo about death, much has been written on how its “social removal” happened roughly in conjunction with WWI and the institution of the great modern hospitals; still, it would be more correct to speak of a removal and medicalization of the corpse. The subject of death, in fact, has been widely addressed throughout the Twentieth Century: a century heavily imbued with funereal meditations, on account of its history of unprecedented violence. What has vanished from our daily lives is rather the presence of dead bodies and, most of all, putrefaction.

Up until the end of the Nineteenth Century, the relationship with human remains was inevitable and accepted as a natural part of existence, not just in the preparation of a body at home, but also in the actual experience of so-called unnatural deaths.
One of the most striking examples of this familiarity with decomposition is the infamous Morgue in Paris.

Established in 1804 to replace the depository for dead bodies which in previous centuries had been housed in the prison of the Grand Châtelet, the Morgue stood in the heart of the capital, on the île de la Cité. In 1864 it was moved to a larger building on the point of the island, right behind Notre Dame. The word had been used since the Fifteenth Century to designate the cell where criminals were identified; in jails, prisoners were put “at the morgue” to be recognized. From the Sixteenth Century onward, the word came to refer exclusively to the place where corpses were identified.

Due to the vast number of violent deaths and of bodies pulled out of the Seine, this mortuary was constantly filled with new “guests”, and soon transcended its original function. The majority of visitors, in fact, had no missing relatives to recognize.
The first ones to have different reasons to come and observe the bodies, which were laid out on a dozen black marble tables behind a glass window, were of course medical students and anatomists.

This receptacle for the unknown dead found in Paris and the faubourgs of the city, contributes not a little to the forwarding of the medical sciences, by the vast number of bodies it furnishes, which, on an average, amount to about two hundred annually. The process of decomposition in the human body may be seen at La Morgue, throughout every stage to solution, by those whose taste, or pursuit of science, leads them to that melancholy exhibition. Medical men frequently visit the place, not out of mere curiosity, but for the purpose of medical observation, for wounds, fractures, and injuries of every description occasionally present themselves, as the effect of accident or murder. Scarcely a day passes without the arrival of fresh bodies, chiefly found in the Seine, and very probably murdered, by being flung either out of the windows which overhang the Seine river, or off the bridges, or out of the wine and wood-barges, by which the men who sell the cargoes generally return with money in their pockets […]. The clothes of the dead bodies brought into this establishment are hung up, and the corpse is exposed in a public room for inspection of those who visit the place for the purpose of searching for a lost friend or relative. Should it not be recognised in four days, it is publicly dissected, and then buried.

(R. Sears, Scenes and sketches in continental Europe, 1847)

This description is, however, much too “clean”. Despite the precautions taken to keep the bodies at a low temperature, and to bathe them in chloride of lime, the smell was far from pleasant:

For most of the XIX Century, and even from an earlier time, the smell of cadavers was part of the routine in the Morgue. Because of its purpose and mode of operation, the Morgue was the privileged place for cadaveric stench in Paris […]. In fact, the bodies that had stayed in the water constituted the ordinary reality at the Morgue. Their putrefaction was especially spectacular.

(B. Bertherat, Le miasme sans la jonquille, l’odeur du cadavre à la Morgue de Paris au XIXe siècle,
in Imaginaire et sensibilités au XIXe siècle, Créaphis, 2005)

What is curious (and quite incomprehensible) for us today is how the Morgue could soon become one of the trendiest Parisian attractions.
A true theatre of death, a public exhibition of horror, it was visited each day by dozens of people of all backgrounds, as it certainly offered the thrill of a unique sight. It was a must for tourists visiting the capital, as the diaries of the time attest:

We left the Louvre and went to the Morgue where three dead bodies lay waiting identification. They were a horrible sight. In a glass case one child that had been murdered, its face pounded fearfully.

(Adelia “Addie” Sturtevant’s diary, September 17, 1889)

The most enlightening description comes from the wonderful and terrible pages Émile Zola devoted to the mortuary. His words evoke a perfect image of the Morgue experience in the XIX Century:

In the meantime Laurent imposed on himself the task of passing each morning by the Morgue, on the way to his office. […] When he entered the place an unsavoury odour, an odour of freshly washed flesh, disgusted him and a chill ran over his skin: the dampness of the walls seemed to add weight to his clothing, which hung more heavily on his shoulders. He went straight to the glass separating the spectators from the corpses, and with his pale face against it, looked. Facing him appeared rows of grey slabs, and upon them, here and there, the naked bodies formed green and yellow, white and red patches. While some retained their natural condition in the rigidity of death, others seemed like lumps of bleeding and decaying meat. At the back, against the wall, hung some lamentable rags, petticoats and trousers, puckered against the bare plaster. […] Frequently, the flesh on the faces had gone away by strips, the bones had burst through the mellow skins, the visages were like lumps of boned, boiled beef. […] One morning, he was seized with real terror. For some moments, he had been looking at a corpse, taken from the water, that was small in build and atrociously disfigured. The flesh of this drowned person was so soft and broken-up that the running water washing it, carried it away bit by bit. The jet falling on the face, bored a hole to the left of the nose. And, abruptly, the nose became flat, the lips were detached, showing the white teeth. The head of the drowned man burst out laughing.

Zola further explores the ill-concealed erotic tension such a show could provoke in visitors, both men and women. A liminal zone — the boundary between Eros and Thanatos — which to our modern sensibility is even more “dangerous”.

This sight amused him, particularly when there were women there displaying their bare bosoms. These nudities, brutally exposed, bloodstained, and in places bored with holes, attracted and detained him. Once he saw a young woman of twenty there, a child of the people, broad and strong, who seemed asleep on the stone. Her fresh, plump, white form displayed the most delicate softness of tint. She was half smiling, with her head slightly inclined on one side. Around her neck she had a black band, which gave her a sort of necklet of shadow. She was a girl who had hanged herself in a fit of love madness. […] On a certain occasion Laurent noticed one of the [well-dressed ladies] standing at a few paces from the glass, and pressing her cambric handkerchief to her nostrils. She wore a delicious grey silk skirt with a large black lace mantle; her face was covered by a veil, and her gloved hands seemed quite small and delicate. Around her hung a gentle perfume of violet. She stood scrutinising a corpse. On a slab a few paces away, was stretched the body of a great, big fellow, a mason who had recently killed himself on the spot by falling from a scaffolding. He had a broad chest, large short muscles, and a white, well-nourished body; death had made a marble statue of him. The lady examined him, turned him round and weighed him, so to say, with her eyes. For a time, she seemed quite absorbed in the contemplation of this man. She raised a corner of her veil for one last look. Then she withdrew.

Finally, the Morgue was also an ironically democratic attraction, just like death itself:

The morgue is a sight within reach of everybody, and one to which passers-by, rich and poor alike, treat themselves. The door stands open, and all are free to enter. There are admirers of the scene who go out of their way so as not to miss one of these performances of death. If the slabs have nothing on them, visitors leave the building disappointed, feeling as if they had been cheated, and murmuring between their teeth; but when they are fairly well occupied, people crowd in front of them and treat themselves to cheap emotions; they express horror, they joke, they applaud or whistle, as at the theatre, and withdraw satisfied, declaring the Morgue a success on that particular day.
Laurent soon got to know the public frequenting the place, that mixed and dissimilar public who pity and sneer in common. Workmen looked in on their way to their work, with a loaf of bread and tools under their arms. They considered death droll. Among them were comical companions of the workshops who elicited a smile from the onlookers by making witty remarks about the faces of each corpse. They styled those who had been burnt to death, coalmen; the hanged, the murdered, the drowned, the bodies that had been stabbed or crushed, excited their jeering vivacity, and their voices, which slightly trembled, stammered out comical sentences amid the shuddering silence of the hall.

(É. Zola, Thérèse Raquin, 1867)

In the course of its activity, the Morgue was only sporadically criticized, and only for its location, deemed too central. The curiosity about seeing the bodies was evidently not perceived as morbid, or at least not considered particularly improper: articles on the famous mortuary and its dead residents appeared regularly in the newspapers, which gladly devoted space to the most mysterious cases.
On March 15, 1907 the Morgue was definitively closed to the public, for reasons of “moral hygiene”. Times were already changing: in just a few years Europe was bound to know such a saturation of dead bodies that they could no longer be seen as an entertainment.

And yet, the desire and impulse to observe the signs of death on the human body never really disappeared. Today they survive in the virtual morgues of internet websites offering pictures and videos of accidents and violence. Distanced by a computer screen, rather than the ancient glass wall, contemporary visitors wander through these hyperrealistic mortuaries where bodily frailness is articulated in all its possible variations, witnesses to death’s boundless imagination.
The most striking thing, when surfing these bulletin boards where the obscene is displayed as in a shop window, is seeing how users react. In this extreme underground scene (which would make an interesting subject for a study in social psychology) a wide array of people can be found, from the more or less casual visitor in search of a thrill to the expert “gorehounds”, who seem to collect these images like trading cards and who, with every newly posted video, show off by discussing its technical and aesthetic quality.
Perhaps in an attempt to exorcise the disgust, another constant is the recourse to an unpleasant and out-of-place humor; and it is impossible to read these jokes, which might appear indecent and disrespectful, without thinking of those “comical companions” described by Zola, who jested before the horror.

Aggregators of brutal images might entail a discussion on freedom of information, on the ethics and licitness of exhibiting human remains, and we could ask ourselves if they really serve an “educational” purpose or should be rather viewed as morbid, abnormal, pathological deviations.
Yet such fascinations are all but unheard of: it seems to me that this kind of curiosity is, in a way, intrinsic to the human species, as I have argued in the past.
On closer inspection, this is the same autoptic instinct, the same will to “see with one’s own eyes” that not so long ago (in our great-great-grandfathers’ time) turned the Paris Morgue into a sortie en vogue, a popular and trendy excursion.

The new virtual morgues constitute a niche and, when compared to the crowds lining up to see the swollen bodies of drowning victims, our attitude is certainly more complex. As we’ve said in the beginning, there is an element of taboo which was much less present at the time.
To our eyes the corpse still remains an uneasy, scandalous reality, sometimes even too painful to acknowledge. And yet, consciously or not, we keep going back to fixing our eyes on it, as if it held a mysterious secret.

 

The premature babies of Coney Island

Once upon a time on the circus or carnival midway, among the smell of hot dogs and the barkers’ cries, spectators could witness some amazing side attractions, from fire-eaters to bearded ladies, from electric dancers to the most exotic monstrosities (see for instance some previous posts here and here).
Beyond our fascination for a time of naive wonder, there is another less-known reason for which we should be grateful to old traveling fairs: among the readers who are looking at this page right now, almost one out of ten is alive thanks to the sideshows.

This is the strange story of how amusement parks, and a visionary doctor’s stubbornness, contributed to save millions of human lives.

Until the end of the XIX Century, premature babies had little or no chance of survival. Hospitals did not have neonatal units providing efficient solutions to the problem, so preemies were given back to their parents to be taken home — practically, to die. In all evidence, God had decided that those babies were not destined to survive.
In 1878 a famous Parisian obstetrician, Dr. Étienne Stéphane Tarnier, visited the Jardin d’Acclimatation where he saw, among other displays, a new method for hatching poultry in a controlled, hydraulically heated environment, invented by a Paris Zoo keeper; the doctor immediately thought of testing that same system on premature babies, and commissioned a similar box which allowed the temperature of the newborn’s environment to be controlled.
After the first positive experiments at the Maternity Hospital in Paris, the incubator was soon equipped with a bell that rang whenever the temperature rose too high.
The doctor’s assistant, Pierre Budin, further developed the Tarnier incubator, studying on one hand how to isolate and protect the frail newborns from infectious disease, and on the other the correct quantities and methods of feeding.

Despite the encouraging results, the medical community still failed to recognize the usefulness of incubators. This skepticism mainly stemmed from a widespread mentality: as mentioned before, the common attitude towards premature babies was quite fatalistic, and the death of weaker infants had been considered inevitable since ancient times.

Thus Budin decided to send his collaborator, Dr. Martin Couney, to the 1896 World Exhibition in Berlin. Couney, the true hero of our story, was an uncommon character: besides his training as an obstetrician, he had strong charisma and a real gift for showmanship; these virtues would prove fundamental to the success of his mission, as we shall see.
Couney, with the intent of stirring up a bit of a fuss in order to spread the news more widely, had the idea of exhibiting live premature babies inside his incubators. He had the nerve to ask Empress Augusta Victoria herself for permission to use some infants from the Charity Hospital in Berlin. He was granted the favor, since the newborns were considered destined for certain death anyway.
But not one of the infants lodged inside the incubators died, and Couney’s exhibition, called Kinderbrutanstalt (“child hatchery”), immediately became the talk of the town.

This success was repeated the following year in London, at the Earl’s Court Exhibition (drawing 3,600 visitors a day), and in 1898 at the Trans-Mississippi Exhibition in Omaha, Nebraska. In 1900 he came back to Paris for the World Exhibition, and in 1901 he attended the Pan-American Exhibition in Buffalo, NY.

The building housing the incubators in Buffalo.

The incubators at the Buffalo Exhibition.

But in the States Couney met even stronger resistance to this innovation, let alone to its implementation in hospitals.
It must be stressed that although he was exhibiting a medical device, his incubator stand was invariably (and much to his disappointment) confined to the entertainment section of the various fairs rather than the scientific one.
Maybe this was the reason why, in 1903, Couney took a courageous decision.

If Americans thought incubators were just some sort of sideshow stunt, well then, he would give them the entertainment they wanted. But they would have to pay for it.

The Infant Incubators building at the 1901 Pan-American Exposition.

The baby incubator exhibit at the Alaska-Yukon-Pacific Exposition, 1909.

Couney moved to New York for good, and opened a new attraction at the Coney Island amusement park. For the next forty years, every summer, the doctor exhibited premature babies in his incubators for a quarter. Spectators flocked in to contemplate the extremely underweight babies, so vulnerable and delicate as they slept in their temperate glass boxes. “Oh my, look how tiny!“, you could hear people utter as the crowd filed along the railing separating them from the aisle where the incubators were lined up.

In order to accentuate the minuscule size of his preemies, Couney began resorting to a few tricks: if a baby wasn’t small enough, he would wrap more blankets around its little body to make it look tinier. Madame Louise Recht, a nurse who had been at Couney’s side since the very first exhibitions in Paris, would from time to time slip her ring over a baby’s hand to demonstrate how thin its wrist was; in reality, the ring was oversized even for the nurse’s own fingers.

Madame Louise Recht with a newborn baby.

A preemie wearing the nurse’s sparkler on his wrist.

Couney’s enterprise, which soon grew into two separate incubation centers (one in Luna Park, the other in Dreamland), might seem quite cynical today. But it was not.
All the babies hosted in his attractions had been turned down by city hospitals and given back to parents who had no hope of saving them; the “Incubator Doctor” promised families that he would treat the babies at no expense on their part, as long as he could exhibit the preemies in public. The 25 cents people paid to see the newborns completely covered the high costs of incubation and feeding, even leaving a modest profit for Couney and his collaborators. This way, parents had a chance to see their baby survive without paying a cent, and Couney could keep raising awareness about the importance and effectiveness of his method.
Couney made no racial distinctions either, exhibiting Black babies alongside white babies, an attitude that was quite rare in America at the beginning of the century. Among the “guests” displayed in his incubators was, at one point, Couney’s own premature daughter, Hildegarde, who later became a nurse and worked with her father on the attraction.

Nurses with babies at Flushing World Fair, NY. At the center is Couney’s daughter, Hildegarde.

Besides his two establishments at Coney Island (one of which was destroyed in the terrible Dreamland fire of 1911), Couney continued touring the US with his incubators, from Chicago to St. Louis to San Francisco.
In forty years he treated around 8,000 babies and saved at least 6,500; but his tireless persistence in popularizing the incubator had much larger effects. In the long run, his efforts contributed to the opening of the first neonatal intensive care units, which are now common in hospitals all around the world.

After a peak in popularity during the first decades of the XX Century, by the end of the 1930s the success of Couney’s incubators began to wane. They had become an old and trite attraction.
When the first premature infant station opened at Cornell’s New York Hospital in 1943, Couney told his nephew: “my work is done“. After forty years of what he had always considered propaganda for a good cause, he shut down his Coney Island enterprise for good.

Martin Arthur Couney (1870–1950).

The majority of information in this post comes from the most accurate study on the subject, by Dr. William A. Silverman (Incubator-Baby Side Shows, Pediatrics, 1979).

(Thanks, Claudia!)