Though not the height of fashion, a white cotton shirtwaist was the unofficial uniform of schoolteachers in the Edwardian Philippines. Having used a chalkboard for a good part of my own teaching career, I can attest that having your sleeves already be white is extremely practical. Two of my previous heroines, Georgina and Allegra, thought so too.
According to the Indianapolis Journal on January 1, 1900: “The shirtwaist will be with us more than ever this summer. Women are wearing shirtwaists because they are comfortable, because they can be made to fit any form, and because they are mannish.” Fashion historian Catherine Gourley explains that “it was similar to a man’s shirt. It had a stiff, high-necked collar and buttons down the front. Women often wore one with a floppy bow or tie. Some pinned a brooch to the collar.”
In contrast, high fashion in the first decade of the 1900s was a structured Gibson Girl silhouette that looked a lot like that of the previous century, particularly the painfully small waist. The badly named “health” corset “pushed the bust forward and the hips back in an attempt to avoid pressure on the abdomen,” according to the timeline of the Fashion Institute of Technology (FIT) of the State University of New York. The shape was top-heavy with dramatic sleeves, “enhanced with petticoats that had full backs and smooth fronts” (FIT).
Dresses did not loosen until around 1910, but fortunately Sugar Communion is quite epic in scope, so I can explore newer fashion templates that look far more comfortable. I was surprised by how 1920s-esque they looked, and then I found that FIT agreed with me: “While changes in women’s fashion that manifested in the 1920s are often attributed to changes due to World War I, many of the popular styles of the twenties actually evolved from styles popular before the war and as early as the beginning of the decade.”
I paged through only a few of the plates at the Costume Institute Collections at The Met to get an idea of what I would like to see Liddy wear, when she gets the chance—when she is not tending to patients in a practical shirtwaist, that is.
I think the geometric patterns on the above skirt would appeal, though Liddy is not likely to be seen at entertainments like horse races, nor would she approve, probably.
See what I mean by the roaring twenties vibe? Ignore the hat on the right, which seems to be an inspiration for Dr. Seuss’s cat. Both of these dresses seem so elegant. The one on the left I can see Miss Fisher wearing while she solves a murder mystery.
I do not understand the knotted kerchief hanging off the belt in the right illustration above, but that blouse and skirt are otherwise very modern. Also, women began to dare to show some ankle—racy, I know!—though not bare skin. My heroine Liddy has neither the time nor the inclination for hose, so socks and boots are her daily wear.
I think that back in the 1980s I had a blouse like the one above on the left. No feathered hats for me or Liddy, though.
These plates tell me that clothing was starting to become more comfortable, and even high fashion followers did not want to be dependent on a maid to dress them all the time.
Can you imagine having a lady’s maid in 2020? “The yoga pants again, ma’am?”
My favorite stuffed animal as a child was a weird-looking turtle named Snoozie. My bedtime stories were mostly Snoozie skits—half-Muppet Show, half Lion King—as written and performed by my father. When my beloved Snoozie tore a seam, my father stitched him up. The surgeon of the house did all the sewing. My father also removed my splinters with the tip of an eight-inch butcher’s knife. Since I could not stand to look at the knife, I watched his face as he concentrated. He never missed one, and it never hurt.
As I grew older, I loved to hear tales of my father’s training in medical school, like when he had to draw his own blood because his partner had passed out. He filled the syringe and handed it over when the other guy woke up. Another classmate devoted only one line in his notebook to each day’s lecture. Later, if anyone had a question about what was said a month ago in physiology, this fellow would look up the right dated line and reprise the professor’s entire hour-long talk verbatim, even the bad jokes.
Despite this steady diet of stories, my father did not believe in pressuring his only child to follow in his footsteps—not that it was much of a choice for me after college. I am a bit embarrassed to admit that I did not take a single laboratory science course after high school, and that omission would have been a problem on my application—in the 1990s. In the 1890s, not so much. Harvard Medical School accepted nearly all applicants. Well, all male applicants. The president of the university considered coeducation “a thoroughly wrong idea which is rapidly disappearing.”
Fortunately, coeducation did not disappear and, also fortunately, other medical schools at the time did accept women, including Ohio Medical University, where my next heroine, Liddy, will be trained. She will be one of about three women in her class of forty-nine. (My father went there too. By the 1960s, it was known as the Ohio State University College of Medicine. Go Bucks!)
Liddy will be unusual because she will have a bachelor’s degree when she starts medical school—something only eight percent of American medical students had in 1894, when she began. And those eight percent probably came from the bottom of their respective college classes. Scholars with promise went into teaching or the clergy. Physicians were considered “coarse and uncultivated . . . devoid of intellectual interests.” There was a real danger that too much science would “overcrowd” their limited minds. There were no written examinations at Harvard Medical School. None. In fact, that would have been impossible, one professor complained, because half of his students “could barely write.” He was not making a joke about doctors’ poor penmanship.
How could this be?
The Humoral System (Pre-Gilded Age)
Let’s talk first about what people thought made us sick. For far too long—from the ancient Greeks to the middle of the Victorian age—the European system of medicine described the human body as a balance of four substances called humors. If you had too much blood, the first of the four, it made you sanguine—courageous, hopeful, even amorous. Too much yellow bile turned you choleric, or hot-tempered. Black bile produced melancholic scholars, Shakespeare’s favorite. Too much phlegm slowed you down, made you apathetic. Your “sense of humor,” as it was known, even dictated which internal organs were most likely to fail you, like a combined CT-scan-slash-Myers-Briggs personality test.
Blood was the only humor that could be spilled on command, so bleeding became a popular treatment for any imbalance. If you were sick in the eighteenth century, you headed off to your neighborhood barber-surgeon, and maybe got a few teeth pulled while you were there. In 1793, when Founding Father Dr. Benjamin Rush faced a yellow fever epidemic in Philadelphia—then the nation’s capital—he treated one hundred people a day by draining two liters of blood per person. That’s about forty percent of the blood in their bodies! Half of Rush’s patients died. When George Washington fell ill from a throat infection in 1799, he was bled the same amount by his doctor. He died. Washington’s physician, like Rush before him, and like the barber-surgeons before them, used a specific scalpel named after a medieval weapon. It was called a “little lance,” or a lancet. A publication named The Lancet was and still is a leading medical journal. That’s like naming an education blog The Paddle.
Less extreme than the lancet were leeches, or parasitic worms. At the beginning of the nineteenth century, Britain imported 42 million leeches a year, seven million for London alone. That was about three leeches per person, but it still wasn’t enough. One British doctor admitted to using the same leeches on fifty different patients in succession—not realizing that he was exposing that fiftieth patient to blood-borne diseases from the last forty-nine people he treated. No wonder Napoleon called medicine “the science of murderers.”
He should know. He had been given another favorite prescription of the age: calomel, or mercurous chloride, a magical tonic for almost any ailment, from tuberculosis to ingrown toenails. It was another humoralist treatment: if you did not want to drain blood, you might choose to purge your patient from both ends with powdered mercury. Among the many, many symptoms of mercury poisoning are tremors, loss of teeth, and amnesia. Oh, and death.
No, I’m not blowing smoke up your ass. Wait, did you ever wonder why we say such a thing? The biggest fear of the eighteenth and nineteenth centuries was, shockingly, not doctors themselves but the prospect of those doctors burying you alive. George Washington’s last words were instructions not to conduct any funeral for three days, just in case his physicians were not capable of distinguishing between life and death. Apparently, he had not heard of the latest sure-fire test, a tobacco smoke enema. Blowing smoke through a tube into a person’s nether region was sure to animate any phlegmatic—even before Dr. Previnaire added a bellows, a hand-held blower like the one I use in my fireplace, to create his patented anal tobacco furnace. The Academy of Sciences in Brussels gave Dr. Previnaire a prize for his work (Bondeson 139).
This is not medicine, you say; it’s snake oil! Absolutely, another popular remedy.
There were some bright spots. British Naval surgeon Dr. James Lind discovered that oranges and lemons helped his sailors recover from scurvy, but he did not know why. He did not even know what Vitamin C was. Still locked into a humoralist mindset, he believed scurvy was caused by cold, wet sea air and a lack of exercise. And, yes, vaccination did exist at this time—in fact, a form of vaccination has been around for a thousand years—but originally no one could explain how it worked.
It was not until the population medicine studies of Pierre Louis in France in the 1820s and 1830s that people looked at the data and said maybe bleeding doesn’t work. Louis introduced a new way of examining the efficacy of treatment: looking at large numbers of similar patients and studying their reactions to different applications of medicine. It was the first baby step toward clinical trials, though his method was not yet randomized and his sample sizes were not very large.
Bloodletting faded from life more slowly than the patients who were being bled. Despite a very public debate between doctors in the 1850s, the practice persisted in textbooks as late as 1942. Part of the appeal may have been its accessibility and affordability: there were bloodletters everywhere, and they were cheap “health care.”
Another reason it persisted: no one had yet proven another theory of disease. All the pieces were there. Contagion was not a new concept: even as far back as the Islamic scholar Ibn Sina, there was an idea that disease could be spread by touch. Animalcules, or microscopic organisms, had been seen as early as the 1670s. Dr. John Snow (not that Jon Snow) had shown it was not miasma, noxious urban gases, that caused cholera but something the sick had passed into the water through their feces. Snow did not make this discovery with a microscope, though, but with a map showing clusters of cases around certain well pumps.
But Snow did not really change long-term thinking. The handle was reinstalled on the Broad Street pump in London a couple of weeks later, after the cholera crisis had passed. Maybe, they thought, Snow did not really know what he was talking about. Mysterious waterborne poison, indeed.
Gilded Age Medicine
You cannot change the answers until you change the questions. And you cannot change the questions until you admit what you don’t know. What was in the air—or water—that we were not seeing? At the beginning of the Gilded Age, Louis Pasteur introduced an anthrax vaccine in 1881 and a rabies vaccine in 1885. Pasteur’s best frenemy, German physician Robert Koch, isolated the bacterium that causes tuberculosis in 1882. In 1884, he did the same for cholera. These were four of the worst disease bogeymen of the modern age. Modern bacteriology and immunology were born.
By the way, the man who introduced these two rivals, Koch and Pasteur, was Dr. Joseph Lister, the first surgeon to disinfect wounds and sterilize surgical equipment. You know his name as the root of the brand name Listerine. Yes, you are rinsing your mouth with surgical antiseptic. Please continue to do so.
It would take time before the best and brightest of the American college set would pursue a career in medicine. And, like my character Liddy, if you wanted the best post-graduate education, you really had to go to Europe. While earlier in the century that may have meant Edinburgh or Paris, by 1890 that meant Germany or Austria, and in particular the Allgemeines Krankenhaus (General Hospital) of Vienna. (And you ate dinner at the Riedhof too!)
Back in the US, it was not until 1910 that medical education truly changed. Two of the richest men to ever live, John D. Rockefeller and Andrew Carnegie, funded the Flexner Report, which was like an early US News & World Report ranking guide to medical schools—and like all of those publications, it was deeply flawed. The publication of the Flexner Report in 1910 is credited with creating the modern scientific medical school system in the US, but it also directly or indirectly caused the closure of many medical schools for women and African Americans. Those that had been coeducational reduced their admission of women, partly because they had a rise in male applicants. One study calls an unintended consequence of Flexner’s report “the near elimination of women in the physician workforce between 1910 and 1970.”
Nevertheless, the Gilded Age must have been very exciting to live through. Every day, it would seem, more diseases were being identified and explained. Notice that I did not say cured. Calomel was still popular in the early 1880s, as were chocolate-covered arsenic tablets. Aspirin existed, but no one knew how it worked until 1971! Cannabis was legal until the xenophobic backlash against refugees fleeing unrest south of the border after the 1910 Mexican Revolution, and then this effective pain reliever was demonized.
Anesthesia for surgery was still limited to ether and cocaine. Cocaine was quite handy, actually, and it was sold in lozenge form for toothaches. Bayer Pharmaceuticals introduced a new form of cough relief that they said was just as good as morphine, but not as habit-forming. They trademarked this miracle compound: Heroin. You could buy two vials for $1.50 from Sears, complete with carrying case and dosage instructions for children!
Paul Ehrlich was playing around with dye stains when he stumbled upon the inspiration for a chemotherapy treatment for syphilis that would eventually be known as Salvarsan. He and his assistant, Sahachirō Hata, introduced their “magic bullet” to the world in 1909. It was an actual medicine with laboratory-tested results, and really the importance of this fact cannot be overstated. There was no other treatment for syphilis at this time. (And masturbation was discouraged in the strongest moral terms. See more on syphilis in historical romance—or, really, the lack of it.) The administration of Salvarsan was technically complicated and cumbersome, though, and the disease had to be caught in time. Ehrlich had wanted to discover a “magic bullet” for what ailed us, but nothing was that simple. Eventually, post-Gilded Age, sulfa drugs were introduced (1930s) and penicillin and other antibiotics shortly thereafter, but old habits of calomel and bloodletting died harder than they should have.
Opioid addiction rates are not the only modern parallels to Gilded Age medicine. We still distribute poisons that would make the merchants of mercury blush. For example, botulism bacteria produce a paralyzing substance so toxic that one teaspoon could kill as many as a million people. You know it as Botox, a medically recognized treatment for cerebral palsy and chronic migraines. Or you might have it injected into your face to smooth your wrinkles. No judgment.
Progress is not always a straight line. Leeches and maggots are making a comeback—raised in sterile conditions, fortunately, and shipped to an intensive care unit near you. The leech releases an enzyme that keeps blood vessels open, which is essential in reattachment surgery, particularly of fingers and toes. Maggots are good for recurring skin ulcers caused by drug-resistant infections like MRSA. Maggots only eat dead tissue—as long as you get the right type—and they also release an enzyme that promotes healing. And even bloodletting, or phlebotomy therapy, may be used today for specific disorders involving overproduction of red and white blood cells and excess iron.
The medicine of World War I is also making a comeback. Bacteriophages are viruses that destroy bacteria. Honestly, they look like creepy spiders from a horror movie. They are hard to keep alive in transport—which is why they were tossed aside when antibiotics were discovered—but in an era of resistant superbugs, they may be the answer.
My father is now retired from stitching up humans and stuffed animals. There are many talented, highly-trained, and impressive women and men who have taken his place. This Thanksgiving I am grateful for them all, from emergency room nurses to the scientists behind messenger RNA vaccine development. But if this somewhat sordid tour of medical history has taught us anything, it is this: whether you are doctor or patient, teacher or student, we need to keep in mind the wise words of 12th-century rabbi, scientist, and physician Maimonides: “Teach thy tongue to say ‘I do not know,’ and thou shalt progress.”
Even Maimonides should have trained his tongue better. After all, he believed in bloodletting.
Want to know more about the history of medicine? I used a collection of podcasts introduced in my previous post, and I cannot recommend them highly enough! For more on sex education manuals of the time, check out my random sampling.
I drove two hours to attend Latin Mass and, predictably, understood not a word. The church was not struck by lightning, though, so I am counting it a win.
Let’s start at the beginning. People say write what you know, and it is good advice…that I do not follow very often. Okay, well sometimes I do: I’ve written two teacher characters so far. My heroines in past and future books hail from Boston (near where I currently live); Fairmont, West Virginia, where my mother moved in high school and the home of my favorite pepperoni rolls; and Columbus, Ohio, where I grew up. I love inserting sports into my historical novels because I am a football and volleyball coach who grew up playing softball and dated a baseball player in high school. Even the hymns used in my novella are favorites from daily singing at the Episcopal school where I teach.
But I have nothing in common with a Roman Catholic priest in 1900. I have written men before, though not celibate men who spent their entire young adult life in the seminary listening to lectures in Latin. When trying something completely different, research matters. I want to write Andres Gabiana as authentically, respectfully, and convincingly as possible.
Where to start? I read. And I read. And I read. You can follow my progress on Goodreads, if you like. What follows is not going to give you any spoilers about the upcoming novel, Sugar Communion. It is more like a stream-of-consciousness book report (which I would admittedly never accept from my own students). Here goes:
I’ve read twenty-two priest and nun memoirs so far. I’ve read three written by priests who, after struggling with celibacy, rededicated themselves to their vows and remained active priests. I have read three written by children of priests and nuns. I have read one by a man who came close to entering the seminary—he lived with religious orders and went on retreats—but ultimately decided against it. Mostly, though, I have targeted memoirs (fifteen of them) written by Roman Catholic priests who left the Church. And, like most of the other hundred thousand American priests who have left, they did so in order to take part in consensual, adult relationships. I really cannot emphasize these last three words enough: Consensual. Adult. Relationships. If marriage is a sacrament and a human right, and the Church says it is, then these priests left to exercise that right.
Sadly, consensual adult relationships with priests are not the average Bostonian’s first thought, but here’s the problem: the priests who sexually abused children in this diocese hid inside the Church. They did not leave it. And that has cost the bishops: nineteen American dioceses have been bankrupted by $3 billion in court judgments, according to the National Catholic Reporter, all because the Church refused to listen to victims and victims’ families and reassigned these criminals to new parishes rather than turning them in to the authorities. Pedophile priests are a small—and incredibly destructive—fraction of those who have broken their celibacy vows. Celibacy does not cause pedophilia. Institutionally, though, it can create the conditions that allow it to thrive, if the seed is already planted: a flawed selection process for priests, sexually immature men in positions of power, a culture of secrecy and shame around sex, and possibly a celibate’s lack of a parental impulse to protect children.
In order to separate my story as far as I can from this pattern, my heroine is a few years older than my priest (both are in their 30s); she is a professional (medical doctor) in her own right; and she is not a member of his parish. Andres is also a good man and a good priest.
He is a good priest, I swear, even by the teachings of the Church itself. Did you know that throughout the first eleven hundred years of Christian history, the leadership—including popes, bishops, and parish priests—could legally wed and celebrate the faith as married men? (I did not know this, either, not until I read two academic treatments from experts A. W. Richard Sipe and William E. Phipps, which are the basis of most of the historical information to follow.) The Jewish tradition celebrated married love and required it of priests and rabbis. Not only was Jesus a Galilean Jew, but his role could be best described as an early rabbi (teacher and scholar). There is evidence that Jesus himself may have been married (and maybe widowed) by the time of his ministry. We know Peter was married. Paul was widowed. Moreover, in the early Jesus Movement, women played significant roles in ministry, church leadership, and funding.
So where did Catholic clerical celibacy and patriarchy come from? Pre-Christian Greek philosophers like Plato and Aristotle. If you didn’t know, these guys were pretty big misogynists, as were most Athenian men. It is from their teachings that early Christian saints decided that male genitals and the whole of women were created by Satan. A female was a defective male, Saint Thomas Aquinas said, quoting Aristotle.
Even worse, once clerical celibacy was required—not until 1139, mind—it inaugurated the most corrupt period in the Church’s history. Marriage was eschewed as foul, while concubinage, pedophilia, and rape drew only mild cautions that were often ignored. Everyday churchgoers needed protection from ravenous clergy who hunted their wives and daughters. Those few priests who wanted to live moral lives by marrying their partners found themselves excommunicated and their wives enslaved. Schisms and war erupted. It was a nasty time of division and violence, and it was overseen by the very men who brought the Church celibacy.
Today Catholic clergy do not even agree upon the definition of celibacy, let alone practice it consistently. At any one time, Sipe says, only about half the clergy in the United States is celibate. What I have learned from the memoirs is that most priests were not given any training in how to be celibate while they were in seminary, other than a few lectures on Eve’s temptations and the corruption of the earthly sphere. They might also be taught the official Catholic teaching on homosexuality as a “disordered” behavior, despite recent studies estimating that over half of American priests today would identify themselves as gay or bisexual if they were free to do so. The person who first encouraged me to try a Latin Mass is a practicing Catholic who currently lives with his common-law husband, the love of his life, in Arizona. Had this friend been free to be a married gay priest, he would have been one of the very best. Good people of all genders are lost to the priesthood because, for reasons that have nothing to do with their morals and leadership qualities, they are not allowed to apply.
Sexual liaisons are not the only relationships that seminaries restricted, I have learned. In the nineteenth and early twentieth centuries, seminaries did not want their charges to even have close friendships. The instructors monitored who walked with whom between buildings like they were overseeing cotillion dance cards. Nor was a seminarian allowed to remain in close contact with his own family. Trips home—even for weddings and funerals—were very limited. The future priest was the property of the Church and not the other way around. Even after ordination, the vow of celibacy allows this control to continue for a lifetime. Bachelor priests are easier to move without notice, and they have no widows or heirs to claim Church property. Not surprisingly, then, the theme that came out strongest in all the memoirs is loneliness.
By immersing myself in these memoirs, I have been able to live, albeit briefly, in the culture that will shape Andres Gabiana. I took extensive notes, and I even bought a scanner to enter them! Most of what I learned will never make it to the fiction page, but it still helps to set the scene in my head.
Most of the memoirs on my reading list took place between the 1940s and the 1980s, mostly in the United States and Ireland, though one was set in rural Brazil. I do not read Spanish or Filipino, which limits my Philippines-based sources. However, many of the orders operating in the Philippines were European-based, and their rules applied internationally. The Church is also a hierarchical organization following its own canons (code of law) applied throughout every diocese.
The Brazilian account exposed one flawed assumption from my previous books. In the provinces of predominantly Catholic countries in the early twentieth century, priests would have been in short supply. No curate would have had the luxury of ministering at one tiny chapel at Hacienda Altarejos full-time. Poor Andres. His job just got a lot harder. You’ll see.
Research itself will only take you so far, though. Some things you have to witness. For example, even if you are Catholic, forget (almost) everything you know about mass. The Latin Rite (pre-1962) is not just conducted in Latin, a language that most laypeople do not understand; the priest also keeps his back to the congregation for the vast majority of the service. Half the time he whispers. The only chance for participation is at communion, which is still not a verbal exchange. I had to see the whole thing in person to understand it, so this Monday morning I went to Latin Mass.
On the face of it, the ritual seems designed to be incomprehensible. I barely saw the Host and never saw the priest consume the sacramental bread and wine. It was like watching a cashier make change from across the room. In a court of law, I could not testify that he actually did it. And, to be honest, that confused me more than the silence. It’s not great showmanship—or is it? Maybe what appeals to people in the service is the mystery: “a religious truth known or understood only by divine revelation.” Awe and enigma have fueled religions from the beginning.
I was most impressed by the server, or altar boy. (It does not have to be a boy, by the way. It can be a layman, a subdeacon, a deacon, or another priest. Needless to say, he does have to be male.) I would say the boy was about the age of my students, going into ninth grade. He had to know more than just when to ring the bell: he had to answer for the congregation, since we never spoke. This meant he had to know a lot of Latin, and he had to say it clearly. In fact, I found it easier to understand his elocution than the priest’s because he proudly sorta shouted. He did not go to school for eight or twelve years to learn how to manage this mass; he learned his part on his own time. He probably takes Latin at the local Catholic school, but still.
I am not sure Roman Catholicism would have survived as the largest denomination of Christianity these past fifty years if it had stayed so inscrutable, but the Latin Rite does have its attractions—especially for the priest, I imagine. He is more remote, powerful, and enigmatic. This had to be, at least partly, the draw of a vocation. As all the memoirs made clear, the whole family took on an elevated status in the parish once they had a son in the seminary.
(I do not know if this last part is still true because traditional geographic parishes are breaking down in favor of “personal parishes,” or parishes based on nationality, language, or other specializations. The church I went to was a personal parish centered around the Latin Rite, for example. There is a growing conservative Catholic movement in these personal parishes, and you will see them more and more throughout the United States.)
When I went to mass, I never spoke to the priest about any of my reactions. I never spoke to him at all. He did not seem particularly stern or unapproachable—he was younger than me, probably in his late 20s or early 30s, and he sported a well-trimmed beard. I did not talk to him because he wasn’t standing at the back of the church shaking hands as people left. Maybe he greets the parish after High Mass on Sundays? I will go sometime to find out, but I am still not sure what I would ask him. I could ask why he chose to be a part of a religious order dedicated to the Latin Rite, the Priestly Fraternity of St. Peter, but that seems like more than a two-minute conversation.
There’s one place for sure that the mass-goer can talk to the priest: in confession before the service begins. In the memoirs I read, though, most priests disliked confession. It is not the voyeuristic extravaganza you might expect. It is everyday stuff at best (cursing, gossip, impure thoughts) and troubling at worst (domestic violence), without clear ways to intercede and provide help without violating the seal. The confessed murders at the center of crime dramas almost never happen, despite each priest hearing dozens if not hundreds of confessions a week for his entire career—not that anyone wants a murderer confessing to them, of course.
Actually, one theme that came from both the academic books and the memoirs is that confession can mire a priest in the muddy sludge of the material world—lust, greed, corruption—for which the seminary’s tight rules do not prepare him. Often he is ordained before he truly understands what he is agreeing to. He goes from not talking about sex at all to parishioners asking him questions about sex (e.g., “Is oral sex with my husband a sin?”). At the time that celibacy became a discipline in the Roman Catholic Church, most priests would have lived only about ten to fifteen years after their ordination. Now they live fifty or more. Statistically, the hardest year for a priest is the thirteenth after his ordination, and by that point many priests have reached a crisis.
In the time that Andres will be a priest, it was almost impossible to leave the clerical office. Though it is easier now to be laicized, or “reduced” to the non-clerical state, it can still take years, even decades, because the Church is very good at burying paperwork. Meanwhile, departing priests are told to stay far, far away from their old dioceses and all their old friends, some of whom have cut them off anyway. Loneliness can beget more loneliness. And despite what you read in the press, there is no such thing as an ex-priest in the Roman Catholic Church. A priest is a priest forever, even if he is no longer allowed to hear confessions, which is done on behalf of the bishop. A laicized priest can still administer some sacraments, like the Eucharist and Extreme Unction, but he can no longer serve as a deacon (the position he held before ordination). In other words, his status is…complicated.
Let me thank all the priests (and children of priests) who wrote their memoirs. They have been willing to share their most personal thoughts with me, a stranger. It has been a summer of learning. If you have comments on this book report, please join my Facebook group, History Ever After, and post them there. The real test, dear reader, will be writing Sugar Communion, and there my work is just beginning.