Unfit to Print; or A History of Bad News: The Party Press, Penny Papers, and Yellow Journalism

The epithet “fake news” is a curious thing. In its truest sense, it refers to media hoaxes, false stories formatted to look as if they originated from legitimate news sources, which are then widely spread online. These media hoaxes can best be compared to the hoaxes of 19th-century newspapers that I have previously spoken about in great detail. Many of these online news hoaxes are, like the Great Moon Hoax, the Balloon-Hoax, and the New York Zoo hoax before them, perpetrated in order to earn the hoaxer money. In the 19th-century press, this was achieved through increased circulation, but online news hoaxes earn money through clicks and ad revenue. They are also comparable to the many fake news stories of Joseph Mulhatton, which I have written about, in that they may be small in scale, anonymous, and perpetrated just for a laugh. But the online news hoaxes of today may also be used as disinformation and propaganda by a foreign power or a political campaign. Still, this too may not be so very different from newspaper practices in the past. The thing is that this term, “fake news,” has often been lobbed at reputable, legacy news organizations simply because their news coverage is inconvenient or unflattering. It is a mark of the post-truth era that politicians can have their corruption exposed by investigative journalists but can save face simply by calling the news reports “fake.” Meanwhile, some of the most outrageously false and dangerous news reporting doesn’t typically get called “fake news.” I am thinking here of the various platforms of Rupert Murdoch’s conservative media empire. Murdoch’s sensational publications in the UK are not called “fake news,” but rather tabloids, and here in America, they masquerade as a “fair and balanced” alternative to the “mainstream media” that they try to undermine by calling fake. Their stock in trade is projection and gaslighting. They are pissing in their viewers’ mouths and telling them it’s drinking water, all while screaming that they cannot trust what comes out of their tap. As I write this, Fox News host Tucker Carlson has been actively discouraging COVID vaccination with blatantly false claims that widespread deaths have been caused by the vaccines. Fox News pundits Jeanine Pirro, Maria Bartiromo, and Lou Dobbs were instrumental in spreading baseless election fraud conspiracy claims that incited insurrection and murder on January 6th. And it’s not limited to the sensational cable news channel. During the election, the Murdoch-owned New York Post was instrumental in promoting the false October Surprise story about Hunter Biden’s laptop, citing only Steve Bannon and Rudy Giuliani as its sources. While conservative personalities decry “biased journalism” as “fake news,” they prop up the most biased and fake journalistic outlets active today. Ironically, though, all these complaints about bias and fake news, whether aimed at the mainstream media or at alternative media, seem implicitly nostalgic, looking back mournfully on some vague former time when the press is supposed to have been a paragon of fairness and objectivity. But did that golden age of journalistic integrity ever exist? And can the history of American journalism put the exploits of Murdoch’s News Corp in context?

In order to trace the evolution of the news industry, we must first understand its beginnings. The origins of newspapers can be found in England, in the 17th century, when wealthy country gentlemen who wished to stay informed as to the goings-on at the royal court were obliged to hire correspondents who would write them letters with the latest gossip. These were correspondents in the oldest sense of the term, in that they corresponded with their employers through the Royal Mail service, sending “news-letters.” Here in America, our first news service evolved from this tradition in the very early 18th century, with the Boston News-Letter. These gossip publications were typically published by a postmaster and consumed right there in the post office, the hub of each village’s communication with the world. By the time such news-letters were circulating here in the U.S., though, they had begun to evolve back in England. The news that people wanted was political, coming out of Parliament, which convened in secret, so that only certain invited visitors could observe the proceedings from the “Stranger’s Gallery.” Some visitors saw an opportunity and would write reports from memory of what was said and done, or would even furtively try to take down notes while in attendance. Gradually, these political news-letters moved beyond simply reporting parliamentary gossip and began publishing opinions. Formerly, political opinions had been printed and disseminated as broadsides and pamphlets, but that role was taken over by newspaper editors in the early 18th century, and from there, it did not take long for newspapers to become the mouthpieces of certain political parties. This was the dawn of the “party press” age of journalism, both in England and America, when newspapers reported facts and opinions with a view toward benefiting the image of a certain political party and winning readers to that party’s side. And which party a newspaper supported was not dependent on the editor’s own views. Rather, politicians and parties subsidized newspapers, making them official organs. One misconception about news publications today, and a principal criticism of the “mainstream media,” is that it is a cardinal sin for a journalist to express an opinion or take a side, but opinion has been a major element of journalism going all the way back to its beginnings. In later years, it’s true, ideas of journalistic integrity demanded that editorial opinion be clearly labeled and separated from “news,” but even so, some leaning toward and favoring of a certain viewpoint, even in reportage, has always been acceptable and even encouraged. It’s what is called “editorial direction,” and it is a major draw for a paper. Readers who don’t like one paper’s views can find another that better suits them. In fact, when we talk about “freedom of the press,” it is precisely the freedom to express opinions that is protected, not the freedom to report facts.

First issue of the Boston News-Letter, regarded as the first continuously published newspaper in British North America. Published April 24, 1704. Public Domain.

Perhaps Rupert Murdoch and his News Corp represent a return to the age of the party press. Interestingly, his newspaper the New York Post started out as an organ of the Federalist Party, founded in 1801 by Alexander Hamilton after the election of Thomas Jefferson, about which I spoke a great deal in more than one episode last autumn. Hamilton wanted a mouthpiece to compete with the Jeffersonian National Intelligencer, which had begun publication the year before. And it should be said that the party press era was not entirely negative. Indeed, partisan journalism has been credited by James Baughman of the Center for Journalism Ethics with greatly increasing democratic participation and voter turnout, and with 2020’s spike in voter participation, we may be seeing the same effect today. But let me qualify. In the party press era, as today, newspapers colored the facts, gave one-sided versions of events, and ignored or chose not to emphasize stories that made political rivals look good. This is certainly something that we observe today, on both sides, even if more so on one side than the other. That kind of bias is one thing, but making up the news, misrepresenting or inventing events, or purposely misreporting in order to make one party look good or another party look bad is something else entirely. One egregious example of such manufacturing of the news during the party press era occurred when Samuel Johnson, reporting on parliamentary proceedings for Edward Cave’s Gentleman’s Magazine, apparently made up an entire speech that he wasn’t present in the Stranger’s Gallery to hear, basing his remarks on a second-hand description of the speech. This is certainly deceptive, but the fact that Johnson was widely praised for his seeming neutrality, despite Johnson himself confessing that he always did what he could to make the Whigs look bad in his reporting, demonstrates that the kind of brazenly false news leveraged as propaganda that we see today may not have shown up until later. Here in America, in the 19th century, the party press gave us lurid personal scandals, like the competing newspaper coverage of Andrew Jackson’s marriage—some characterizing him as having seduced a wife away from her husband and married her before her divorce was final, and others depicting him as an innocent victim of character assassination, asserting that her previous marriage had been abusive and that Jackson did not know the divorce was not yet final—but these are instances of gossip that are hard to characterize as purposeful disinformation. I struggle to find instances of outright deception in the party press era, so let us move on.

In 1833, a new age of journalism commenced with the so-called Penny Press era, when Benjamin Day founded the New York Sun. What made the Sun and other penny papers different was, first of all, the price—one cent compared to the six cents that most other papers cost—but also the intended audience. Using the steam press, penny publishers were able to mass produce their papers cheaply, which meant that, in order to make money, they needed the masses, many of whom were not interested in the politics peddled by partisan papers, to take up reading the news. So they changed their approach, writing shorter, more easily digestible pieces using simpler language, and focusing on practical and relevant news over politics, as well as on more human stories. In this way, penny papers appealed to the common folk and immigrants, or as Day put it, “mechanics and the masses generally.” With this wider readership also came better advertising revenue, and thus the business model of newspapers was changed forever, relying more on wide circulation than on political subsidization for profit. And of course, the Penny Press era gave us the first dramatic examples of really fake news, the Great Moon Hoax and the Balloon-Hoax, which I have previously discussed in some detail. It is tempting to suggest, then, that whenever there is a sea change in the accessibility of news and the medium by which it is delivered, fake news flourishes. It began with the Penny Press, continued with the advent of radio broadcasting and Orson Welles’s War of the Worlds, and reached its height with the Internet, the ultimate democratization of information access.

Samuel Johnson, an early manufacturer of partisan news. Public Domain.

But that would be an oversimplification. First of all, as I recently discussed, the fake news related to the War of the Worlds broadcast was not on the radio but rather in the papers, which exaggerated the panic it had caused. And furthermore, the era of the penny press also led us to the highest ideals of journalistic integrity. This was the era in which the independent press emerged. Benjamin Day’s rival, James Gordon Bennett, expressed these ideals clearly when he founded his competing penny paper, the New York Herald: “We shall support no party—be the agent of no faction or coterie, and we care nothing for any election, or any candidate from president down to constable.” It was a declaration of the press as an autonomous and objective force that could act as a check on political power. The Herald itself may not have lived up to the ideals Bennett espoused. It would go on to engage in the very kind of hoaxing it criticized the Sun for perpetrating, like its New York Zoo hoax about escaped animals, and it would by no means remain carefully neutral in politics, in fact favoring the nativist, anti-Catholic Know-Nothing Party. And as a penny paper, its coverage tended to the sensational, especially in the Robinson-Jewett affair, a notorious sex and murder case, during which Bennett became the first to issue extra editions. But Bennett pioneered journalistic practices like reportage based on the observations of correspondents and interviews. His paper was one of the first to uncover local corruption as well, a practice of investigative journalism that would go on to inspire some of the greatest work of the independent press in the 19th century, as when the New York Times’ coverage of Boss Tweed became instrumental in taking down the Tammany Hall political machine in the 1870s. The development of the penny press is therefore clearly related to the development both of news hoaxes and of our highest journalistic standards. Still, the kind of hoaxes and sensationalism that came out of the penny press was not the kind of disinformation and propaganda we see from partisan news outlets today. Perhaps, then, we can find a forerunner of this kind of fake news in the so-called “yellow press.”

In many ways, the era of Yellow Journalism also evolved from the practices of the Penny Press and independent journalism. What Joseph Pulitzer and William Randolph Hearst, the two eventual magnates of the yellow press, had learned from their predecessors was that news was best told as a “story.” They took to heart what James Gordon Bennett once asserted, that the purpose of news “is not to instruct but to startle and amuse.” For Pulitzer, in his St. Louis Post-Dispatch, and later in his New York paper, the World, this meant investigative watchdog journalism such as the independent press had pioneered earlier in the 19th century. Pulitzer likewise crusaded against corruption, but often as a way of courting poor immigrant readers, for example focusing on exposing conditions in the tenements where these prospective readers lived. Since his crusading was in some cases a matter of business, Pulitzer has been credited with inventing muckraking, the kind of journalism that seeks to cause outrage and scandal even when it may not be warranted. On the other hand, William Randolph Hearst, who got his start in San Francisco with his paper the Examiner, courted female readers with human interest stories and a certain brand of muckraking story that came to be known as a “sob story.” The first such story began with rumors of a poorly managed hospital. The paper chose a female cub reporter to investigate, which she did by pretending to faint on the street in order to be admitted. This reporter, Winifred Black, wrote a story about women’s treatment in the hospital that was said to have made women sob with every line, earning her the nickname “sob sister” and launching her career as a muckraker with a dedicated audience of women readers. Other newspapers even tried to recreate this success, building whole teams, or “sob squads,” to churn out similar stories. These two newspaper empires, Pulitzer’s and Hearst’s, with their comparable approaches to muckraking and sensational reporting, may have come closest to the kind of political engineering that we see from Rupert Murdoch and News Corp today, for they are widely credited with having led the U.S. into war with Spain. But how accurate is that characterization?

Hearst’s self-congratulatory newspaper coverage of his own jailbreak exploit in Cuba. Public Domain.

Since the mid-19th century, there had been more than one armed rebellion in Cuba against the island’s Spanish colonial rulers. In 1895, rumblings of revolution began again, organized by Cuban exiles in the U.S. and Latin America and commencing as a series of simultaneous uprisings against colonial authority. In the U.S., Pulitzer’s and Hearst’s newspapers favored the cause of the rebels and vilified the Spanish. Hearst especially seems to have put the entire weight of his editorial influence into convincing the American public that they should cheer on the rebels, and eventually, that the U.S. must itself become involved. Hearst’s motivations are a matter of some dispute. Humanitarian concern and democratic ideals may indeed have played a substantial role, making the yellow press’s focus on the rebellion less selfish and unsavory than it is typically portrayed. But it is clear that Hearst, even then, had political ambitions, and to be seen as bolstering the cause of democracy would certainly burnish his reputation. Then there is the distinct possibility that he believed war with Spain would be good for business. Indeed, it would turn out to be a massive boon to his newspapers’ circulation, so perhaps, as his usual characterization would have it, Hearst did after all have his eye on the bottom line, though it was more likely a combination of these motivations. But Hearst’s desire for the U.S. to go to war with Spain and his willingness to foment it by manipulating public opinion have certainly been exaggerated, to the point that they have become a myth. The favorite anecdote of those who promote this view is that Hearst sent an illustrator to Cuba in January 1897, and when the illustrator wrote him saying, “There is no trouble here. There will be no war,” Hearst replied, “You furnish the pictures. I’ll furnish the war.” However, media historian W. Joseph Campbell has shown through meticulous research that this exchange almost certainly never took place. First of all, there is no evidence for the existence of these telegrams, and the anecdote first appeared in a memoir by a correspondent of Hearst’s who was not involved at all and was actually in Europe at the time, and who used the anecdote not to disparage the yellow press but to praise its foresight. Second, in 1897, there was already war in Cuba, and that was Hearst’s whole reason for sending the illustrator there. Moreover, Hearst’s editorial position in January of 1897 was that the rebels would succeed in throwing off the colonial yoke; his campaign for U.S. military intervention would not begin for some time. While this exchange is certainly a myth, though, it should not be thought that Hearst did not overstep in his newspaper coverage of the situation in Cuba.

Since the “sob story” had served him so well, Hearst cast about for the story of a mistreated woman. Such a story would appeal emotionally both to female readers who imagined what they would do if they ever found themselves in such distress and to male readers who fancied themselves chivalrous. He found just the story he needed in Evangelina Cosío y Cisneros. This 18-year-old woman had traveled to the Isle of Pines in Cuba with some companions in 1896, intending to visit her father, a rebel who was confined to the island. According to later accounts, a Spanish colonel came to her room one night and made unwanted sexual advances on her, but her companions came to her aid upon hearing her cries. Pulling her attacker off of her, they tied him to a chair, but a patrol of other Spanish officers happened upon the scene and arrested them. Evangelina Cisneros was charged with luring the colonel into a trap and thrown into a jail for prostitutes. Hearst’s flagship paper, the New York Journal, turned Cisneros into a paragon of feminine purity who was being brutalized, kept among fallen women, and regularly subjected to abuse. She became in his newspaper columns a symbol of all the innocent Cuban women whom the Spanish were ravishing and debasing, making of the rebels who fought to protect these damsels in distress heroes firmly planted on the moral high ground. In fact, there is evidence that Cisneros was more of a pants-wearing, cigar-chomping rebel herself, and that she may in fact have been enacting some rebel plan at the time of her arrest, but otherwise, much of Hearst’s portrayal of her situation seems accurate. This “Flower of Cuba,” as the Hearst papers called her, was to be shipped overseas to a Spanish penal colony in North Africa. But Hearst had other plans. In one of the most astonishing instances of manufacturing the news in American history, he sent a rough-and-tumble correspondent, Karl Decker, to Havana to break her out of jail, and they succeeded. Exactly how this daring escape was effected remains unclear—there are stories of a file being smuggled in to her, or of drugged bonbons that she used to put her cellmates to sleep. Regardless of how it was accomplished, we know that she was successfully sprung from the jail, hidden, and then put secretly aboard a ship back to America, where Hearst had her paraded around the country to tell Americans about the cruelties of the Spanish in Cuba. Rather than causing a scandal over the legality of such an unsanctioned action overseas or the role of the press in making news, Hearst’s exploit was widely praised, of course in Hearst’s papers but in others as well. Pulitzer, though, was quick to suggest that the jailbreak was a hoax, insisting that the Spanish authorities must have allowed it to happen, and this has been an enduring characterization of the affair. However, W. Joseph Campbell has uncovered through an examination of contemporary diplomatic correspondence that certain U.S. diplomats were involved with the jailbreak, and that they faced considerable risks in being involved. Moreover, the Spanish authorities ordered a search for Cisneros after her disappearance, showing that they had not winked at her escape. So while some myths certainly surround the involvement of the yellow press in Cuba in 1897, this was not one of them. William Randolph Hearst did indeed orchestrate the jailbreak of a Cuban political prisoner.

While the “Cisneros Affair” certainly galvanized the American public to espouse the cause of the Cuban rebels, it was another dramatic event that is usually identified as the tipping point for American military intervention in the Cuban War of Independence. This event is not disputed as a myth, but it has turned out to be an enduring mystery. In January of 1898, U.S. President William McKinley had the battleship USS Maine anchored in Havana Harbor as a demonstration of U.S. power and determination to protect U.S. citizens in the war-torn country. On February 15th, while the crew slept in their quarters in the forward end of the vessel, an explosion occurred. There had been 354 men aboard the ship, and 266 of them perished in the explosion and the resulting fires as the ship sank into the harbor. Hearst’s Journal and Pulitzer’s World both put this tragedy on the front page every day afterward, of course, asserting that the explosion had been an attack, an act of war. Hearst even offered a $50,000 reward for evidence of who was responsible. The U.S. Navy wasted no time in launching an inquiry, which determined within a month that an underwater mine had detonated, in turn igniting the ship’s forward magazine. However, some of the experts consulted in the inquiry came to a different conclusion, suggesting instead that coal in the bunker adjacent to the magazine had spontaneously combusted. This scenario would have been more consistent with the findings of a Spanish inquiry, which argued that it is unusual for a ship’s munitions to explode when it is sunk by an underwater mine. Moreover, a spout of water would typically be seen when a mine detonates, and dead fish would afterward be found, neither of which was the case in Havana Harbor that evening. Numerous investigations have failed to resolve this mystery. Perhaps the Spanish inquiry’s conclusions are less than trustworthy, since the Spanish were surely seeking to absolve themselves. And there is just as little evidence for an internal explosion as there is for an external one. In fact, the spontaneous combustion of coal appears to be just as uncommon as a ship’s magazine detonating after being struck by a mine. But none of this mattered to the yellow press, which ignored the Spanish inquiry and the dissenting expert opinions and declared to the world that the Spanish had murdered U.S. Navy men in a brazen act of war. Less than a month after the Navy’s inquiry published its findings, Congress declared war, and many an American sailor was heard to repeat the headlines of the yellow press in their war cries: “Remember the Maine! To hell with Spain!”

Hearst’s inflammatory newspaper coverage of the USS Maine explosion. Public Domain.

In considering the yellow press as a possible precursor of today’s disinformation outlets, we must reconsider what we presume to be true about yellow journalism. Historians have shown the “I’ll furnish the war” quote to have been a myth, and the truth about the complex array of factors that led to U.S. involvement in the Cuban War of Independence has likely been obscured by this appealing fiction. It is not as though the American public would not have learned of the events without the yellow press or would not otherwise have come to favor U.S. involvement. There were other, more respectable newspapers also reporting on the Cuban rebellion then, just as today there are many more scrupulous news outlets that consumers of Rupert Murdoch’s brand of news could seek out instead. And yellow journalism and public opinion alone did not sway President McKinley to pursue war. There had been jingoists in Congress pushing the same war agenda every step of the way. In the same way, today, Fox News and conservative media generally may not be inventing the talking points and leading this disinformation war so much as following the lead of the GOP, recognizing their niche market and continually pursuing their audience down whatever fringe path they’ve been led. Notwithstanding these parallels, though, I still find no examples from the history of American journalism that match the brazen manipulation, the invention of false narratives, the shameless promotion of disinformation regardless of potential public harms that we see in the media produced by Rupert Murdoch’s News Corp, especially Fox News Channel. They seem to represent the worst of every era: beholden to a political party as in the party press era, trading in oversimplified sensationalism in order to appeal to the everyman as in the penny paper era, and willing to manufacture news as in the age of yellow journalism. But Hearst and his empire were not really anything like Murdoch and News Corp. First of all, Hearst was driven by political ambitions, trying to parlay his newspaper platform into a Democratic presidential nomination. Murdoch appears to be motivated by the pursuit of wealth and a sinister ideology. Hearst envisioned a kind of “journalism of action” that would engage in democratic and humanitarian activism, certainly for the purposes of self-promotion but never seeking to do harm. But just this year, Fox News has been promoting conspiracy theories that encouraged the overturning of a free and fair election and engaging in anti-vaxxer science denial that will result in lost lives; they are attacking democracy and public health. To me, this seems like a new and unprecedented form of flawed journalism. We find ourselves in the era of the propaganda press. And what’s truly scary is that Fox News and the other outlets within Murdoch’s News Corp are no longer even the worst offenders, having shown clear signs of tempering their rhetoric in at least some of their programs—typically the ones that they would have a hard time claiming are for entertainment purposes only. Disinformation purveyors have proliferated in the last couple of decades, with Newsmax, Breitbart, and One America News Network becoming the worst offenders. But credible news reporting and reliable outlets remain, the most impartial and conscientious being the reportage of the Associated Press and Reuters. As for major legacy newspapers and other big cable news channels, yes, bias and the favoring of viewpoints are present, as they always have been.
American consumers of news need to stop expecting anything different, learn to read laterally across platforms for a wider variety of editorial slants, and concentrate, most importantly, on rooting out barefaced propaganda.

Further Reading

Borch, Fred L., and Robert F. Dorr. "Maine's sinking still a mystery, 110 years later." Navy Times, sec. Transitions, 21 Jan. 2008, p. 36. NewsBank: Access World News – Historical and Current, infoweb.newsbank.com/apps/news/document-view?p=WORLDNEWS&docref=news/11E7C6BBE345B960.

Campbell, W. Joseph. “‘I’ll Furnish the War’: The Making of a Media Myth.” Getting It Wrong: Ten of the Greatest Misreported Stories in American Journalism, University of California Press, 2010, pp. 9-25.

———. “Not a Hoax: New Evidence in the New York Journal’s Rescue of Evangelina Cisneros.” American Journalism, vol. 19, no. 4, 2002, pp. 67-94. W. Joseph Campbell, PhD, fs2.american.edu/wjc/www/nothoax.htm.

———. “Not Likely Sent: the Remington-Hearst ‘Telegrams.’” Journalism & Mass Communication Quarterly, vol. 77, no. 2, 2000, pp. 405-422. W. Joseph Campbell, PhD, fs2.american.edu/wjc/www/documents/Article_NotLikelySent.pdf.

———. “William Randolph Hearst: Mythical Media Bogeyman.” BBC News, 14 Aug. 2011, www.bbc.com/news/world-us-canada-14512411.

Lowry, Elizabeth. “The Flower of Cuba: Rhetoric, Representation, and Circulation at the Outbreak of the Spanish-American War.” Rhetoric Review, vol. 32, no. 2, 2013, pp. 174–190. JSTOR, www.jstor.org/stable/42003444.

Park, Robert E. “The Natural History of the Newspaper.” American Journal of Sociology, vol. 29, no. 3, 1923, pp. 273–289. JSTOR, www.jstor.org/stable/2764232.

Pérez, Louis A. “The Meaning of the Maine: Causation and the Historiography of the Spanish-American War.” Pacific Historical Review, vol. 58, no. 3, 1989, pp. 293–322. JSTOR, www.jstor.org/stable/3640268.

Taylor, William. “USS Maine Explosion.” Disasters and Tragic Events: An Encyclopedia of Catastrophes in American History, vol. 1, ABC-CLIO, 2014, pp. 164-166. EBSCOhost, search-ebscohost-com.ezproxy.deltacollege.edu/login.aspx?direct=true&db=nlebk&AN=781660&site=ehost-live&scope=site.

Extra! Extra! Extra-Terrestrial Hoaxes!

While I have written a full-length second part to Hidden Bodies, my history of astronomical discovery, delving further into the blind alleys and paths to discovery that astrophysical science has taken, from aether to dark matter, that episode is reserved for patrons, so in lieu of its transcript, I present this blog post, formerly a patron exclusive, which I have released to podcast listeners instead.

*

While in Hidden Bodies, I focused on wrong ideas about the solar system, in this post, I want to tell the related stories of some famous hoaxes about extraterrestrial life, hoaxes which are themselves surrounded by misconceptions and myths. We’ll begin with a story that has to do with the son of astronomer William Herschel. If you recall, William Herschel was a vocal proponent of the idea that the Moon, Mars, and even the Sun were inhabited by intelligent life. I spoke briefly about Herschel’s beliefs about the inhabitation of the Sun, and in more detail about the observations he believed demonstrated that Mars was populated. Before all that, though, and much closer to home, he claimed in 1776 to have spotted large swathes of vegetation on the lunar surface that he believed to be cultivated, leading him to search the Moon for towns. The circular marks on the Moon that today we know to be craters he believed were villages, cities, and even vast metropolises. What makes Herschel’s claims even more interesting is the fact that, about 60 years later, in 1835, his son John Herschel was famously credited with actually seeing in great detail the vegetable and animal life on the moon in what is considered to be one of the greatest hoaxes of all time.

On August 21st, 1835, the cheap “penny press” newspaper the New York Sun printed what it said was a reprint of an article that had originally appeared in the Edinburgh Journal of Science. The article discussed William’s son John Herschel—a famed astronomer in his own right—his gargantuan telescope in South Africa, and the amazing discoveries he had recently made. The article made claims that, though seemingly plausible, were completely untrue. John Herschel had indeed built a large telescope at the Cape of Good Hope, but it did not sport a 7-ton lens capable of 42,000 times magnification, and with it he certainly did not spy planets in other solar systems, which, as we know from the last episode, mankind was unable to detect until the 1990s. The biggest bombshell of the article, that Herschel had confirmed the existence of life on the Moon, was not emphasized, despite its enormous implications, and thus the first article in this series attracted little attention. It was not until the second piece, five days later, that readers began to take notice. Through his telescope, the paper claimed, Herschel had observed that the Moon’s surface was covered with dark red flowers and populated with strange animal life, such as horned goats and bison-like beasts that moved in herds, all of which had a distinctive appendage that crossed the forehead from ear to ear, not to mention a bizarre globular amphibian that rolled along the beaches. In the third installment of the series said to have been printed in the Edinburgh Journal of Science, it was revealed that intelligent life existed on the Moon, in the form of beaver-like creatures that went about on two feet, carrying their babies in their arms and living in huts from which issued smoke, indicating they had mastered fire. Finally, in the fourth piece, the observation of some winged humanoids, which Herschel supposedly named Vespertilio-homo, or man-bats, was described. Apparently, these beings spoke to each other intelligently and lived in elaborate structures. Some blatantly racist notions also cropped up in the account, as it was described that these dark-colored man-bats looked only slightly more intelligent than orangutans and tended to engage in public copulation, but that there existed another group of man-bats of a lighter color that was described as superior, more beautiful, and like unto angels. According to these articles, the lunar beings had constructed gigantic geometrical structures, perhaps as a way to signal us Earthlings.

Portrait of a man-bat (Vespertilio-homo). Public Domain.

After the second article in this series appeared, circulation of the Sun began to increase, and many other penny papers began to republish the pieces. Many appear to have believed the story entirely. There are accounts of Christian organizations planning how best to convert the man-bats to Christianity, and of humanitarian societies convening meetings to discuss how they might provide aid to the lunar needy. It would be weeks before the articles’ sources could be checked, whereupon it was discovered that the Edinburgh Journal of Science had actually shut down about two years before the articles were said to have been published by it. A rival paper then accused the Sun’s editor, Richard Adams Locke, a descendant of English philosopher John Locke, of having written the pieces himself in order to attract readers with sensationalism. Locke and the Sun never retracted the story; as recently as 2010, the paper playfully wrote that it was “looking into” claims that there are no lunar man-bats. The paper’s response seems to suggest that only a fool would have taken the pieces seriously, a stance which accords well with an alternative interpretation of the so-called Great Moon Hoax as actually being a satire penned by Locke that had not been widely understood at the time of its publication. And there is support for this interpretation. According to this version of events, Locke had written the pieces in order to satirize Scottish astronomer and “Christian Philosopher” Thomas Dick, as well as other proponents of so-called Natural Theology, which looked to astronomy and physics for proof of the existence of God. Dr. Dick and others, having learned that even a drop of water can be observed microscopically to be teeming with life, reasoned that the universe must likewise be crawling with life forms. Dr. Dick even put forth an estimate of the universe’s population, suggesting it was probably around 22 trillion, that the Moon itself must have about 4.2 billion inhabitants, and that we should build huge geometrical glyphs in order to send them a message. Supposedly, on the eve of publishing his articles, Locke told friends that if the work were taken seriously, or were scorned as a mere hoax, then his satire had failed. In the end, it may never be known whether he had only taken inspiration from Dr. Dick in a scheme to increase newspaper circulation, or whether he truly meant it as a trenchant criticism of Dr. Dick and his ilk and simply accepted the derision he later received as penance for having failed in his literary endeavor.

Dr. Dick, the “Christian Philosopher” and “Natural Theologist” whose ideas about extra-terrestrial life Richard Locke may have been satirizing in his moon hoax. Public Domain.

Today, though, it is not the Great Moon Hoax that takes the title of the greatest media hoax of all time, nor is it the Balloon-Hoax written by Edgar Allan Poe for the same newspaper some years later. Rather, that title usually goes to Orson Welles’s War of the Worlds radio broadcast on the night before Halloween 1938, another hoax with roots in the Martian speculations of William Herschel, which, elaborated by later astronomers, helped inspire H. G. Wells to write the novel that Orson Welles was adapting. The reason this incident is usually credited as the biggest media hoax is the mass panic that it supposedly elicited, but of course, a hoax is typically a purposeful act. Orson Welles, who appeared before news reporters a day later to insist that he found it baffling how anyone could have mistaken his radio drama for a genuine news broadcast, certainly does not seem to have intended to cause a panic, even though in later years, he certainly seems to have relished the notoriety it had earned him. The content of the broadcast was designed to create some verisimilitude, with an orchestral performance being interrupted by dramatized news bulletins that described the launch of spacecraft seen on Mars and their landing in a rural area, followed by their attacks, wielding poison gases and heat rays, until finally New York had been obliterated. In Welles’s defense, the broadcast was introduced as radio theatre, and since the entire episode transpired within the program’s hour-long runtime, any rational or intelligent person should have surmised that the story was unfolding a little too quickly. There was simply no way that the Martians made their trip of millions of miles and decimated American military forces all within half an hour. The fact that the program announced the declaration of martial law alone should have tipped off listeners, as the wheels of government simply don’t move that fast. And the fact that the Martian invaders supposedly fell victim to germs before the hour was over also beggared belief, as that’s quite a fast onset for the common cold. But the seemingly credible explanation is that listeners tuned in late and didn’t listen to the entire broadcast before panicking, running out into the streets, and choking the roadways in their attempts to flee. The stories are many, almost all from newspapers that appeared the next day. According to these accounts, after listening to the broadcast, terror seized millions; many fled their homes with their families, causing accidents and deaths in the stampede to reach some place of safety, while others chose to stay where they were but attempted suicide, preferring death by their own hands to being cooked alive by alien heat rays. But how accurate is this characterization of the panic that ensued?

A sample of the sensational newspaper headlines in the days following the War of the Worlds broadcast.

Most accounts come from newspapers, as I said, or from a 1940 Princeton psychological study, The Invasion from Mars: A Study in the Psychology of Panic. However, as media critic and historian W. Joseph Campbell writes in his book Getting It Wrong, most sociologists now agree that this Princeton study was foundationally flawed. It relied only on interviews with 135 people, all known to have been frightened by the program, which simply cannot prove the kind of widespread alarm and panicked flight that it claimed to prove. So we must look to the newspapers themselves. These, however, are little better: each newspaper that covered the kerfuffle the morning after tended to claim that thousands in its area had fled their homes or choked up telephone lines with panicked calls, but only mentioned one or two vague anecdotal examples. There is an important distinction to be made between real evidence and anecdotal evidence here. Offering one or two or even dozens of anecdotes never proves that something is common or widespread. It only ever proves that it happened once, or twice, or a dozen times, depending on how many anecdotes are offered. Campbell, who examined the coverage of 36 major newspapers, observed that the anecdotes provided to support their claims about mass panic were typically lacking in detail. Moreover, what their examples demonstrated in many cases was not that people were mistaking the broadcast for a genuine news bulletin, but rather that rumor-mongers who had heard second- or third-hand that some calamity was transpiring were running around town getting people riled up without having ever listened to a moment of the broadcast. Perhaps the only real evidence of a large-scale reaction to the program was the reports of phone lines being backed up with increased traffic, in many cases with calls to local newspapers and police stations in order to ascertain what was happening. But of course, this is not a panic reaction. Rather, if you had misconstrued the nature of a radio program or heard someone shouting about alien invasion and the end of the world, calling a newspaper or a police station to confirm that, in fact, it was a false alarm would actually be the most calm and rational thing to do.

The press dressing down Orson Welles the day after his radio broadcast. Public domain.

So how do we explain this sensationalized version of the events following the War of the Worlds broadcast, which has since become ingrained in our culture as a lasting myth? One explanation is that the broadcast happened in the evening, and newspapers went to print in the morning, leaving them little time to do any in-depth collection of reports and evidence. In fact, on such a timeline, when major newsworthy events seemed to be transpiring, papers tended to rely on the wire services. It’s clear that many of the newspapers that covered the supposed panic over the radio broadcast were simply reprinting claims that had come over the wire, often word for word. It was essentially the same phenomenon that seems to have transpired, on a smaller scale than the newspapers claimed, the night before. It was a contagion of false alarm. Some scattered misunderstanding of the broadcast was spread as a rumor by people acting as Paul Reveres, as Campbell puts it. In the same way, Associated Press round-ups of some anecdotes that suggested sweeping panic were spread like a rumor themselves, until the broad and uniform newspaper coverage appeared to prove that mass terror had swept the country the night before, when in fact there was no strong evidence that it had. Additionally, the fact that newspapers considered radio a rival medium for news consumption prompted them to latch onto the story and embellish it, especially in editorials following the initial coverage, chiding radio and suggesting that any medium that could create such panic using sensationalism might need to face strict censorship in the near future. Ironically, though, it appears that it was the newspapers themselves that were engaging in sensationalism and stirring up something of a moral panic about the trustworthiness of their competitor.

Further Reading

Campbell, W. Joseph. Getting It Wrong: Ten of the Greatest Misreported Stories in American Journalism. University of California Press, 2010.

Vida, István Kornél. “The ‘Great Moon Hoax’ of 1835.” Hungarian Journal of English and American Studies (HJEAS), vol. 18, no. 1/2, 2012, pp. 431–441. JSTOR, www.jstor.org/stable/43488485. Accessed 14 May 2021.

Hidden Bodies: A Brief and Incomplete History of Astronomical Discovery

It is not certain when Aristotle wrote his book On the Heavens, but it is thought to have been written sometime around 350 BCE. In it, he addresses the cosmological debates of his day, for example asserting the weakness of the arguments of flat-earthers. As I’ve discussed before, the view of the Earth as spherical was common, even popular, all the way back then, and championed by Aristotle. However, in laying out his model of the universe, he favored a geocentric cosmology, viewing Earth as the center of the universe, an immutable and eternal constant with the other planets, the Sun, and the stars revolving around it, and beyond the stars, a spiritual plane that he called the Sphere of the Prime Mover. Even then, though, there were alternative views. As Aristotle notes, the Pythagorean philosopher Philolaus believed that the Earth revolved around a Central Fire. This Central Fire was not the Sun, however, which in his view also revolved around the center along with the Earth, and he further believed that on the other side of this Central Fire at the universe’s center was an Antichthon, or Counter-Earth, a strange idea that survived long enough to become the fodder of sci-fi. Modern astronomers have even been obliged to disprove the existence of such a phantom planet, which would be detectable because of its gravitational influence on other planetary bodies. But Philolaus’s model influenced Aristarchus, who saw the Central Fire as being one and the same as the Sun, building a heliocentric model of the universe and even suggesting that the stars were themselves other suns. But Aristarchus’s model was largely rejected in favor of the Aristotelian geocentric model, thereafter developed by Hipparchus of Nicaea and Ptolemy of Alexandria, who tweaked the model to suggest that each heavenly body, in its orbit of Earth, also moved in an epicycle, or a small circle, performing little loop-de-loops as it revolved around us. The heliocentric view of the universe would not rise again, as it were, until the 16th century, when the Polish canon Nicolaus Copernicus’s On the Revolutions of the Celestial Spheres set forth a model of the universe that the Church rejected. Then in 1610, when Galileo recognized that the planetary bodies he’d been observing were moons orbiting Jupiter, not revolving around Earth, the geocentric model of the universe was in its death throes. However, this new model still held that we were very close to the center of the universe: our sun, Sol. This notion would not be shattered until the 20th century, when Harlow Shapley, who would go on to head the Harvard College Observatory, placed our solar system in the boondocks of the Milky Way galaxy. Still, the Milky Way, it was thought, even by Shapley, was the only galaxy there was, until Edwin Hubble showed that there were other galaxies beyond ours, proving it to Shapley in what Shapley described as a “letter that destroyed my universe.” Thus goes the march of scientific progress. When we believe we understand something, our illusions are obliterated by the next discovery. Today, we have the multiverse theory to suggest that our universe may not even be the only one, making our existence feel more and more insignificant.

*

In my recent blog post covering the history of immunological science and the development of vaccine technology, as well as in a patron bonus on germ theory, I found it interesting to explore both the hits and misses of scientific progress. It illustrates well the scientific principle that only through experimentation, the collection of evidence, observation, and comparison can truth be established. We see in the history of science the concept that, as Isaac Newton once wrote, each generation stands on the shoulders of giants, building upon what has already been proven and disproving what has not in order to achieve a more perfect understanding of our world and universe. I find this gratifying because of how very different it often seems from historiography. Don’t get me wrong. Professional historians work tirelessly to revise and perfect our understanding of the past. The term “revisionist history” has in fact unfairly developed a negative connotation, when in reality, every professional historian engages in measured and evidence-supported revision. But since history is often viewed as static and unchanging, our evolving understanding of it often takes a long time to catch up. Textbooks continue to disseminate oversimplified narratives rife with myths and misconceptions. That, of course, is the bread and butter of this blog. Take, for example, Copernicus and Galileo, about whom there remains a wealth of myths that even scientists like Carl Sagan were known to repeat. The first is the Demotion Myth, the idea that the heliocentric model represented a demotion of the Earth’s place in the universe. Actually, according to medieval and early modern beliefs, in which the center was the worst place to be, like the center of Dante’s model of hell, moving Earth away from the center was in reality something of a promotion. Further myths claim that Copernicus waited until his deathbed to publish his heliocentric model, but the timing was more of a coincidence. Likewise, many myths surround Galileo, from apocryphal experiments atop the Leaning Tower of Pisa wrongfully attributed to him, to erroneously crediting him with the invention of the telescope, the thermometer, and the scientific method. Folklore tells us Galileo was excommunicated, convicted of heresy, and immured in a dungeon. He was indeed put on trial, but it was for breach of an agreement with the Holy Office. As the Pope had endorsed his work, Galileo had agreed not to present the Copernican model as proven fact, but rather to discuss the pros and cons of all cosmological models. Pope Urban VIII was actually sympathetic to the Copernican model, but when Galileo broke his agreement and presented it as fact, it put the Pope in an awkward situation. Far from being tossed in a dungeon, though, Galileo was sentenced to live in a 5-room suite in the Palace of the Holy Office, was able to receive guests and, records show, could come and go at great liberty, if not quite as he pleased. These myths make it clear that, while the science to which these men contributed was built upon and furthered, the history of their lives was muddled and misrepresented. So let us retire, this once, from the benighted realm of Historical Blindness, and bask in the light of empirical science, specifically the luminous realm of astronomy, where wrong ideas have also abounded but have almost universally been overcome by reason.

Saturn depicted by Galileo (top), Huygens (middle), and Cassini (bottom). Image credit: RM Chapple

One of Galileo’s wrong ideas had to do with Saturn. In 1610, he was the first to observe this planet using a telescope, and he saw what appeared to be a triple planet, or a large planet with a moon on either side. However, two years later, he observed that these moons had disappeared, and two years after that, seeing that they appeared to have returned, he speculated that they were some kind of arms. The shape of Saturn would baffle astronomers for a long time, sometimes appearing as three bodies and sometimes as an egg on its side. Some fifty years after Galileo first spied it, the Dutch astronomer Christiaan Huygens discovered Saturn’s moon Titan using a telescope superior to Galileo’s, and in the process, he observed these arms of Saturn, which appeared to pierce it through its center, making it look something like a spinning top, but which then vanished with time. It was Huygens who hypothesized that these arms were a ring around the planet, one that appeared as arms, or through inferior optics as moons, when tilted toward the Earth, but that became impossible to discern when viewed edge-on. Huygens’s ring theory was not widely accepted at first, but other astronomers came around to his way of thinking, coming to believe that Saturn had a ring around it… a solid ring, for if it were not entirely solid, how could it possibly hold together as the planet spun? A hint came some fifteen years later, when Giovanni Domenico Cassini observed a gap in the ring. This gap proved it was not some giant ring of stone, all of a piece, so the mystery deepened. How did the isolated masses within the ring remain in place? Perhaps the ring was gaseous, or composed of fluid? Not until the mid-19th century did James Clerk Maxwell demonstrate mathematically that none of these possibilities would have been stable. Thus it was discovered that Saturn’s ring must be composed of small unconnected particles. Almost forty years later, in 1895, James Keeler would further our understanding, observing spectroscopically that the rings did not rotate as a unit, the inner portions moving at a different speed than the outer, just as independently orbiting particles should. So the history of astronomical discovery shows us that even when we see something with our own eyes, what we are seeing may not be entirely apparent.
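
To make Keeler’s test concrete (this is just a back-of-the-envelope sketch of the physics, not anything in Keeler’s own notation), consider how the speed of ring material should vary with distance r from Saturn’s center in the two competing cases:

$$v_{\text{solid}}(r) = \omega r \qquad \text{versus} \qquad v_{\text{particles}}(r) = \sqrt{\frac{G M_{\text{Saturn}}}{r}}$$

A solid ring must spin like a wheel, every part sharing one angular speed, so its outer edge moves fastest; a swarm of free particles must obey Kepler’s laws, with speed falling off as $1/\sqrt{r}$, so the inner edge moves fastest. The Doppler shifts in Keeler’s spectrograms matched the second curve, confirming Maxwell’s unconnected particles.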

Such was the case with Mars when William Herschel studied it with one of his homemade telescopes during the late 18th century. He made some important discoveries during his study, having to do with the rotation and axial tilt of the red planet. He also observed that the white spots at Mars’s poles, observed by both Cassini and Huygens and hypothesized earlier that century to be polar ice caps, changed size according to the season. This confirmation that ice existed on Mars helped to fuel Herschel’s speculation that the planet was inhabited. Herschel vocally maintained that all planets were inhabited, envisioning the moon as being akin to the English countryside, and even suggesting that the Sun supported life… not on its scorching surface, of course, but in some cooler underground realm, like a reverse Hollow Earth theory in which it is somehow hotter on the surface and more temperate within. Thus it’s no surprise that when Herschel viewed and mapped Mars in 1783, he asserted that all the visibly darker areas were oceans and declared the planet capable of supporting life, encouraging the perennial belief in Martians. With the 19th-century construction of more and more advanced observatories, further mapping of Mars was accomplished, starting in 1877 with Giovanni Virginio Schiaparelli in Milan. It was he who named these supposed Martian continents and seas, giving them mythological names. Schiaparelli also observed something surprising. There appeared to be a network of pale lines in certain regions, which Schiaparelli called canali. When this news reached the English-speaking world, his work was translated, and it was discovered that there were “canals” on Mars. At the time, canals were something of an engineering wonder, and the recent completion of the Suez Canal had been touted as a great accomplishment of mankind. So when the English-speaking world heard “canal,” they thought of massive artificial waterways, which would mean intelligent and industrious Martians. In reality, Schiaparelli’s word canali had been mistranslated. A more accurate translation would have been “channels,” a word less suggestive of engineering. But it was too late. The Martians were out of the bag. American astronomer Percival Lowell mapped what he saw as a network of canals with “oases” at certain intersections, speculating that the drying of the Martians’ planet had forced them to draw water from their polar ice caps. But again, what they had spied through their telescopes was not as clear as they believed. These features could be glimpsed only briefly and occasionally through Earth’s shimmering and shifting atmosphere, and as it turned out, they were an optical illusion. The lighter and darker regions were not oceans and continents, but rather what are called albedo features. Rather than bodies of water, or vegetation, as an alternative theory had it, they didn’t correspond to topographical features at all and were likely just areas where wind had swept away the pale surface dust to reveal the darker ground beneath. As for the canali, they were an artifact of the human eye, which created phantom lines between briefly visible features, and the suggestion of infrastructure introduced by the English mistranslation added a psychological element, such that astronomers were looking for lines, expecting to find them, and staring until they saw them, making Mars something like the Magic Eye posters of the 1990s.

Martian canals depicted by Percival Lowell. Public Domain.

While sometimes astronomers were looking right at something and unable to discern what it was, or believed it was something it was not because of what others had suggested, other times they searched and searched for something that wasn’t even there, again because some had insisted it would be there. The enduring search for a “phantom planet” is the perfect example of this. In the early 18th century, astronomers began to hypothesize about the regularity in distance between the planets in our solar system, culminating in 1772 with Bode’s Law, a formula for predicting the distances of the planets that it was hoped would make the discovery of new planets possible. And indeed, it did. In 1781, William Herschel spotted Uranus, which he at first believed to be a comet. It was, of course, the seventh planet from the sun, right where Bode’s Law said a planet should be. After that, Johann Bode, one of the originators of the law, urged astronomers to search for another planet between Mars and Jupiter, as there was a gap there indicating the presence of another planet. This hypothetical planet was the subject of much interest at the turn of the century, and a group of astronomers who fancied themselves the Celestial Police devoted themselves to finding it. A discovery of a heavenly body in that slot between the orbits of Mars and Jupiter was made the next year, but not by one of the Celestial Police. One Giuseppe Piazzi discovered a body that he dubbed Ceres, and it was thought the predicted planet had been found. However, the next year, Heinrich Olbers discovered another body with about the same orbit: Pallas. After that came Juno and Vesta, and it became clear that numerous objects were in orbit there. Thus the asteroid belt was discovered, and a theory emerged that it represented the remains of a larger planet that had once orbited there where Bode’s Law had predicted a planet would be found. This “lost planet” was named Phaeton in an 1823 pamphlet by Johann Gottlieb Radlof, a German schoolteacher who took this theory and used it as a catastrophist explanation of certain myths and biblical incidents. In this way he was something of a predecessor of Immanuel Velikovsky, the catastrophist to whom I devoted an entire episode in my Chronological Revision Chronicles. But as usual, despite such pseudoscience, science marches on. Ceres and the other largest asteroids in the belt are today considered dwarf planets at most, and the belt is believed to be material that simply never accreted into a planet because it was too perturbed by Jupiter’s gravitational influence.
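For the curious, the rule itself is simple enough to state and check. Here is a minimal sketch (my own, using the usual modern statement of the Titius-Bode formula) showing why the n = 3 slot between Mars and Jupiter looked so suspiciously empty, and why Uranus seemed to vindicate the law:

```python
def bode_distance(n: int) -> float:
    """Titius-Bode rule: predicted distance from the sun in AU, a = 0.4 + 0.3 * 2^n.
    Mercury is conventionally taken as the base term of 0.4 AU."""
    return 0.4 + 0.3 * 2**n

slots = {
    0: "Venus",                      # predicted 0.7, observed ~0.72
    1: "Earth",                      # predicted 1.0, observed 1.0
    2: "Mars",                       # predicted 1.6, observed ~1.52
    3: "(the gap -> Ceres, 1801)",   # predicted 2.8, Ceres orbits at ~2.77
    4: "Jupiter",                    # predicted 5.2, observed ~5.20
    5: "Saturn",                     # predicted 10.0, observed ~9.58
    6: "Uranus",                     # predicted 19.6, observed ~19.19
}
for n, body in slots.items():
    print(f"n={n}: {bode_distance(n):5.1f} AU  {body}")
```

It’s worth noting that the rule is now generally regarded as a numerical coincidence rather than a physical law; it fails badly for Neptune.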

Nearly a hundred years after the formation of the Celestial Police and the search for the lost planet, astronomers found themselves again all aflutter in search of a theorized planet, this one between Mercury and the Sun. It all started in 1843, when French astronomer Urbain Le Verrier constructed a model of Mercury’s orbit around the Sun according to Newtonian laws. When he had a chance to verify his model by observing Mercury’s transit, or its crossing of the disk of the Sun, however, the observations failed to confirm it. It appeared there was some excess in the precession of Mercury’s perihelion for which Le Verrier simply could not account. Le Verrier’s solution was that there must be another small planet between Mercury and the Sun that was affecting Mercury’s orbit. He called this theoretical planet Vulcan, not in some prescient anticipation of Star Trek but, of course, after the Roman fire god, from whose name we also derive the word “volcano.” It wasn’t long before he received some confirmation of his theory in the form of an amateur astronomer’s claim to have spotted this previously unseen planet crossing the Sun. Although another astronomer expressed doubt, having observed the sun at the same moment and seen nothing of the sort, Le Verrier accepted the account as evidence in favor of his theory, and the amateur astronomer was lauded for his sighting. Perhaps it’s no surprise, then, that afterward, many amateur astronomers began to come forward claiming without any corroborative evidence that they too had observed Vulcan in years past. What Le Verrier needed, though, were sightings of the planet in transit that were confirmed by more than one astronomer. Since a solar eclipse would provide the best conditions for such a sighting, the total solar eclipse of 1878 served as their best opportunity to confirm the existence of Vulcan. Astronomers from all over converged by rail on the American West, gathering at the summit of Pikes Peak in Colorado or overrunning the small town of Separation, Wyoming, places that just happened to lie on the eclipse’s path of totality, where the shadow of the moon would sweep over the country. Respected astronomers at these different locations did indeed sight something they believed to be an intra-mercurial planet, and not just one, but two! Excitement built that not only Vulcan but also another unknown planet had been discovered. Unfortunately, none of their coordinates matched, and the idea that four new planets had been discovered between Mercury and the Sun simply beggared belief. So the search continued, especially during eclipses, until, in 1915, Einstein’s general theory of relativity satisfactorily explained the excess precession of Mercury, which meant there was no longer any reason to believe that Vulcan existed. Now, it is thought that many of the objects supposedly spied in transit were sunspots or perhaps artifacts of telescopic optics that had been damaged, like Icarus, when pointed too close to the Sun.
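In hindsight, the size of the anomaly that launched the whole Vulcan affair can be recovered with a few lines of arithmetic. This is a back-of-the-envelope sketch of the general-relativistic perihelion advance (my own illustration, using the standard formula and standard orbital constants, not anything from the episode’s sources):

```python
import math

GM_SUN = 1.32712440018e20  # sun's gravitational parameter, m^3/s^2
C = 2.99792458e8           # speed of light, m/s
A = 5.791e10               # Mercury's semi-major axis, m
E = 0.2056                 # Mercury's orbital eccentricity
PERIOD_DAYS = 87.969       # Mercury's orbital period

# Relativistic perihelion advance per orbit: 6*pi*G*M / (c^2 * a * (1 - e^2))
per_orbit_rad = 6 * math.pi * GM_SUN / (C**2 * A * (1 - E**2))

orbits_per_century = 36525 / PERIOD_DAYS
arcsec_per_century = per_orbit_rad * orbits_per_century * math.degrees(1) * 3600
print(f"{arcsec_per_century:.1f} arcseconds per century")  # ~43.0
```

Those roughly 43 arcseconds per century are precisely the excess that Newtonian perturbations could not explain, so once Einstein’s theory produced the number with no extra planet required, Vulcan evaporated.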

Astronomers gather in Separation, Wyoming, in 1878 to observe a solar eclipse in hopes of confirming the existence of the planet Vulcan. Image courtesy of the Carbon County Museum.

Despite this difficulty in sighting even our closest planetary neighbors, humanity has long held the conviction that the planets in the cosmos are innumerable. The Greek philosopher Epicurus wrote to Herodotus that “the universe is boundless both in the number of the bodies and in the extent of the void” and that “there are infinite worlds both like and unlike this world of ours.” But for much of human history the only things that could be spied beyond our immediate solar system were luminous bodies: stars. The very fact that planets do not emit light made it essentially impossible to see any so-called exoplanets beyond our solar system. Planets, of course, only reflect light, and when searching for planets that orbit other stars, the great distance and the faintness of their reflected light in comparison to the brightness of their stars’ light make them hidden bodies out there in the void. In fact, hard as it may seem to believe, we did not have any evidence for the existence of planets outside of our own solar system until the 20th century. Astronomer Peter van de Kamp hypothesized that planets outside our solar system could be detected by their gravitational effect on the movement of the stars they orbited. His first couple of identifications resulted in hypothetical planets that seemed far too large to be planets, but in 1963, he detected a wobble in the movement of Barnard’s Star, 36 trillion miles from us, and thus the belief in extrasolar planets, and possibly planets that can support life, was bolstered. Currently, the existence of more than 4,000 exoplanets has been confirmed, but surprisingly, van de Kamp’s discoveries are not among them. As it turns out, all of the star wobbles that van de Kamp took as evidence for the presence of a planet were actually caused by adjustments to the optics of his telescope. The first real evidence for the existence of exoplanets didn’t actually arrive until the early 1990s, a fact which I find astonishing.
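To appreciate how easily an instrumental artifact could masquerade as a planet, consider the sheer smallness of the signal van de Kamp was chasing. Here is an illustrative sketch of the astrometric method’s basic geometry (the specific masses and distances below are my own assumptions, chosen to roughly resemble the Barnard’s Star claim, not figures from the episode’s sources):

```python
M_JUPITER_IN_SUNS = 0.000955  # Jupiter's mass as a fraction of the sun's

def wobble_arcsec(m_planet_mj: float, m_star_suns: float,
                  a_au: float, d_pc: float) -> float:
    """Angular semi-amplitude of a star's reflex wobble, in arcseconds.
    The star and planet orbit a shared center of mass, so the star traces
    a tiny ellipse of size (m_planet / m_star) * a, seen from distance d."""
    return (m_planet_mj * M_JUPITER_IN_SUNS / m_star_suns) * a_au / d_pc

# A roughly Jupiter-and-a-half-mass planet at 4.4 AU around a small red dwarf
# (~0.16 solar masses) about 1.8 parsecs away:
alpha = wobble_arcsec(m_planet_mj=1.6, m_star_suns=0.16, a_au=4.4, d_pc=1.83)
print(f"wobble: {alpha * 1000:.0f} milliarcseconds")  # ~23 mas
```

A wobble of a couple hundredths of an arcsecond, accumulated across photographic plates spanning decades, is exactly the kind of signal that a mid-project change to a telescope’s optics can mimic.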

Although evidence that van de Kamp’s calculations favoring a planet orbiting Barnard’s Star had been corrupted appeared decades earlier, as recently as 2013 astronomers were still seeking to definitively rule out the existence of his planets, which they appear to have finally done. This is the strength and power of science. It takes nothing for granted. Although the existence of oceans and canals on Mars has been ruled out, the existence of water on the red planet remains a topic of much study. In 2011, high resolution images from the Mars Reconnaissance Orbiter showed dark streaks on some slopes indicating seasonal water flow over the surface of the planet, and just last year, radar data from the European Space Agency’s Mars Express spacecraft detected possible underground lakes. And while the Phaeton theory that the asteroid belt is the remains of a destroyed planet is only supported by fringe pseudoscientists like Zecharia Sitchin, ideas about phantom planets and other hidden bodies within our solar system continue to be entertained. The notion of a planet Vulcan is mostly extinct, but some astronomers suggest that intramercurial objects, which they call vulcanoids, could still exist and help to explain the various mysterious sightings in the 19th century. There is even some support for an unseen “Planet X,” but since Pluto has been demoted to a mere dwarf planet in the Kuiper Belt beyond Neptune, it is typically referred to as Planet 9 these days. Indeed, the only credible theory for the existence of another planet within our solar system places it in the same neighborhood, for the unusual orbits of the objects in the Kuiper Belt have led astronomers to hypothesize that a large body hidden far beyond Pluto may account for them. So, in astronomy and science generally, as in history, it is unwise to suggest too soon that the truth has been entirely settled.

Further Reading

Bakker, Frederick A. “The End of Epicurean Infinity: Critical Reflections on the Epicurean Infinite Universe.” Space, Imagination and the Cosmos from Antiquity to the Early Modern Period, edited by F. Bakker, D. Bellis, and C. Palmerino, Studies in History and Philosophy of Science, vol. 48, Springer, 2018, pp. 41-67. SpringerLink, link.springer.com/chapter/10.1007/978-3-030-02765-0_3.

Baron, David. “The American Eclipse of 1878 and the Scientists Who Raced West to See It.” Mental Floss, 28 July 2017, www.mentalfloss.com/article/503114/american-eclipse-1878-and-scientists-who-raced-west-see-it.

Bartusiak, Marcia. Dispatches from Planet 3: Thirty-Two (Brief) Tales on the Solar System, the Milky Way, and Beyond. Yale University Press, 2018.

Basalla, George. Civilized Life in the Universe : Scientists on Intelligent Extraterrestrials. Oxford University Press, 2006. Internet Archive, archive.org/details/civilizedlifeinu0000basa/page/n3/mode/2up.

Choi, Jieun, et al. “Precise Doppler Monitoring of Barnard's Star.” The Astrophysical Journal, vol. 764, no. 2, 31 Jan. 2013, pp. 131-142. IOPScience, iopscience.iop.org/article/10.1088/0004-637X/764/2/131.

Matson, John. “50 Years Ago an Astronomer Discovered the First Unambiguous Exoplanet (or So He Thought).” Scientific American, 30 May 2013, blogs.scientificamerican.com/observations/50-years-ago-an-astronomer-discovered-the-first-unambiguous-exoplanet-or-so-he-thought/.

O’Callaghan, Jonathan. “Water on Mars: Discovery of Three Buried Lakes Intrigues Scientists.” Nature, 28 Sep. 2020, www.nature.com/articles/d41586-020-02751-1.

Sant, Joseph. “The Copernican Myths.” Scientus.org, 2019, www.scientus.org/Copernicus-Myths.html.

---. “The Galileo Myths.” Scientus.org, 2020, www.scientus.org/Galileo-Myths.html.

Anti-Vaccinationism, a Historical Hindrance to Herd Immunity

Antivaccinationism title card.jpg

The Reverend Cotton Mather was an exceptionally influential minister among New England Puritans. I recently discussed his role in spreading the fame of the enigmatic Dighton Rock, and his influence on the witch-hunters of Salem is widely known. Few, however, are aware of his role in popularizing early smallpox immunization efforts in America. In 1706, Mather was gifted a slave named Onesimus. Upon receipt, he looked this gift man in the mouth, as it were, searching his body for the telltale scars of a former smallpox infection and asking Onesimus if he had already had the disease. Onesimus showed Mather a small scar where he had been inoculated against the disease in Libya, the country of his birth. Mather afterward questioned numerous slaves in the area and found that the practice, which had come to be called variolation after the Latin name of smallpox, variola, was quite common and appeared very successful. Indeed, there was a long history to the practice of variolation against smallpox, a rudimentary form of immunization that involved purposely introducing biological material from an infected person—preferably one with a mild case of the disease—into the system of an uninfected person. The method goes as far back as 1000 CE in China, where they ground up infected scabs and blew them up people’s noses like snuff. By Mather’s day, the practice typically consisted of extracting pus from a smallpox sore and placing it under an uninfected person’s skin through a small incision on the arm. A variolated patient did develop a case of smallpox, but a milder and less deadly form, afterward developing immunity against the brutal “Speckled Monster” that had ravaged mankind for thousands of years and had spread the world over through trade routes, exploratory contact, and war. The more Mather learned about it, the more he supported the practice in Boston, and during a terrible smallpox outbreak in the 1720s, he convinced a local physician in his congregation to inoculate almost 300 Bostonians. While about 14% of those who contracted smallpox during this outbreak died from it, only six people, or about 2% of those variolated, died. But nevertheless, some did die, and this resulted in one of the first major backlashes against immunization efforts. Some declared that it was bad medicine, as it purposely caused a wound and an infection, while others saw it as a “devilish invention,” suggesting that some contracted smallpox as a punishment from God, and that protecting them from God’s wrath was wrong. The debate was so fiery that one opponent of variolation lobbed an improvised bomb into Cotton Mather’s house through a window. On it was a note that read, “Mather, you dog, Damn you, I’ll inoculate you with this.” Ironically, the only reason Mather was even able to read the note was that the poorly made bomb failed to detonate. This furor over smallpox variolation would last for some time, but it would eventually subside, and the practice of variolation would become widespread, even among the nation’s Founding Fathers, like Thomas Jefferson, who traveled from Virginia to Pennsylvania to be variolated; John Adams, whose wife and children were variolated; and George Washington, who made variolation a requirement for all American soldiers. But of course, this was by no means the end of resistance to immunization science. And today, as we try desperately to get the Covid-19 pandemic under control in order to preserve lives and salvage our economies, it’s more important than ever to understand the history of anti-immunization rhetoric in order to refute its current iterations and encourage widespread vaccination.

*

As some of you may recall, I touched on historical anti-vaccination claims before, in an episode about Alfred Russel Wallace, whose contribution to anti-vaccinationist rhetoric will be discussed here again, but there is far more to the topic, and I’ve long wanted to talk about it. Now, with the push to vaccinate here in America and the corresponding push against vaccination by the practice’s critics, it seems like the right time to do so. Vaccination coverage in America actually remains very high, with about 95% of children receiving the recommended doses of vaccines, which indicates that resistance to vaccination is not as widespread as one might think. Anti-vaxxers would like us to think that they are a massive movement, and with the amount of news coverage they get, one might assume they are. However, recent efforts by Facebook to curb anti-vaccination misinformation have uncovered that two-thirds of all the anti-vax content on social media appears to come from only a dozen online sources, called the “Disinformation Dozen.” This does not mean that such misinformation should be allowed to spread unchecked, though, and when it comes to the Covid vaccine, this science denial becomes even more dangerous. Many adults, even those who are susceptible to anti-vaccine pseudoscience and conspiracy theory, still get their children vaccinated because of the vaccination requirements of schools, and beyond this, they only espouse anti-vaccination claims in an abstract way, the same way they’ll share conjecture that 9/11 was an inside job, or that JFK’s murder was actually the result of a shadowy government conspiracy. “You know what I heard…” they’ll tell friends over beers, but the idea never affects their behavior beyond their yearly decision of whether or not to get a flu shot, because they had all their vaccinations as children. But the Covid vaccine is new, admittedly rushed, and must be administered to adults, who are far more susceptible to severe cases of Covid-19 than children, in order for us to reach herd immunity. So now these adults who fancy themselves “free-thinkers” and already have the seeds of anti-vax misinformation in their minds are making the decision not to be vaccinated. Even if they are not swayed by the absurd fringe claims that Bill Gates is putting microchips in us through the vaccine, or that, as Congress’s resident lunatic Marjorie Taylor Greene has asserted, Covid-19 vaccination records are the Mark of the Beast prophesied in Revelation, they remain hesitant due to distrust of the pharmaceutical industry or the government, or because of an imperfect understanding of the science behind this vaccine and vaccination in general. Before I was vaccinated, a family member told me that I didn’t need to get the shot because people were reaching herd immunity against Covid-19, which of course is false and ignores the basic fact that herd immunity to infectious diseases like Covid-19 is only achieved through widespread vaccination. So to start this history of anti-vaccinationism, we need to lay a foundation of basic understanding by discussing immunization science.

Rev. Cotton Mather’s house, scene of a failed bombing by someone who opposed Mather’s support of smallpox inoculation. Public Domain.

Most of the credit for the development of vaccination science goes to Edward Jenner, an English doctor who accepted the efficacy of inoculation, having been variolated himself as a child. In the 1790s, Jenner noticed that, in a given population suffering a smallpox outbreak, milkmaids seemed rarely to catch the disease. His hypothesis was that these milkmaids had developed immunity because of their exposure to cows that were infected with a similar disease, cowpox. More specifically, when these milkmaids came into physical contact with the cowpox pustules on udders, they were exposed to a pox that created only mild symptoms in humans and conferred immunity against the far more virulent and deadly smallpox. To understand how deadly and how virulent full-blown smallpox was, consider its symptoms. After a couple of days of fever and body aches, a rash appeared inside the mouth and then spread over the entire body, becoming pustules, which might break, creating bloody sores. The most severe form had a fatality rate of up to 33%, with victims dying because of blood toxicity, infection, and blood loss. Those who survived typically suffered terrible scarring, sometimes over their entire body, and often blindness as well. In 18th-century Europe, the death toll reached about 400,000 a year. After a year of Covid-19 that saw nearly 1 million deaths across Europe (and 1.3 million deaths across the Americas), this may seem tame, but given the differences in population, the comparison isn’t so simple. Suffice to say that the Speckled Monster was a violent, global scourge. Therefore, Jenner’s discovery was a medical miracle. He took pus from a cowpox pustule on the hand of a milkmaid and applied it to the arm of his gardener’s 9-year-old son using the variolation method, and months later, when he exposed the boy to smallpox several times, the boy never contracted the disease. Thus, the vaccine was born, named after the Latin word for cow, vacca. Jenner wrote a book on the topic in 1798, and within five years, it had been translated into five languages and vaccination programs were underway in developed nations and colonies all over the world. There actually remains some mystery over whether early vaccinations were all derived from cowpox, as some early samples have been tested and shown to have been taken from a similar animal disease, horsepox. Regardless, the principle had been established, and through vaccination, the disease smallpox has been virtually eradicated. Because of this and how the science has been used to combat the spread of other diseases, the development of this immunization technique is considered the foremost medical breakthrough in the history of mankind.

At the time that Jenner developed the vaccine, the medical community’s understanding of how and why it worked was imperfect. After all, this was before the widespread acceptance of germ theory. Today, we know that vaccines work by activating the body’s immune system and relying on its memory. Amazingly, when the immune system is exposed to the bacteria and viruses that cause disease, it is afterward able to remember certain features of them, like surface proteins, so that it will be better able to fight them off again in the future. This is called adaptive immunity. Vaccines have come a long way since the dangerous days of cutting open an arm and inserting infected pus into the wound, but the idea remains the same—the human body was better able to resist cowpox, and remembering features of that virus made it easier for the body to defeat the smallpox virus. Since then, vaccines have become less dangerous, using weakened viruses or bacteria, or even rendering them incapable of replicating by killing them with formaldehyde or other chemicals. Some introduce only parts of a pathogen for the immune system to remember, or just toxins that a pathogen produces. Regardless, the central mechanism is the same. Think of antibody response as a bloodhound ready to track down and neutralize an intruder; the vaccine is just giving our bodies’ bloodhounds the scent to help them find and attack the invader. Now some of the Covid vaccines, the Pfizer and Moderna shots, use a brand-new method of exposing us to the viral proteins we want our immune system to remember: mRNA, or messenger RNA. The injected designer mRNA directs our own cells to build the virus surface proteins that make Covid-19 so virulent. Therefore, no part of the virus is ever introduced into a vaccinated person’s system. Rather, cells are programmed to teach our immune systems how to battle the virus, should it ever enter our bodies. While this technology is new for vaccines against viruses, which may cause some vaccine hesitancy, it’s actually not as new as some believe. Scientists have been studying mRNA’s use in the creation of cells that mimic stem cells and in the development of a vaccine for cancer for more than 30 years, with hundreds of scientific papers published and dozens of longstanding clinical trials. So to suggest that this technology, which essentially programs the body’s immune response in the same way as traditional vaccines except through a different mechanism, was only developed in a rush during the last year is inaccurate.

Dr. Jenner performing his first vaccination. Public Domain.

As the example of backlash against variolation efforts in colonial America shows, anti-vaccination sentiment is also not a new development and has been around since the dawn of immunization science. Many years after the development of Jenner’s smallpox vaccine, a variety of laws in 19th-century England made vaccination compulsory. The Vaccination Act of 1840 outlawed the outmoded and far more dangerous technique of variolation and provided free vaccination to the poor, but by 1853, with vaccination rates not improving, a new act was passed making vaccination of infants required by law, with parents liable to be fined or imprisoned if they did not comply. This compulsory vaccination program was expanded by the Vaccination Act of 1867, which required all children under 14 to be vaccinated and began levying fines on doctors who failed to report families that resisted vaccination. In 1871, punitive measures against the poor who failed to comply included the confiscation of property and placement in a workhouse. It was in response to these draconian laws, which were actually pretty typical of laws governing the poor in the Victorian Age, that robust anti-vaccination activism emerged. As one might expect, a central complaint among these first organized anti-vaccinationists was the power of the state over personal liberty and its persecution of those who refused or were hesitant to be vaccinated. There were also, though, critics who complained that vaccination science was unproven, that vaccination caused other diseases such as syphilis, or that illness actually emanated from decaying organic matter—the miasma theory of disease—and thus injecting oneself with what was essentially poison could not actually prevent disease. Rather, these “sanitarians” or “anti-contagionists” asserted that keeping city streets clean was the only way to prevent disease. Alternatively, there was again, as in Mather’s day, religious opposition on the grounds that immunization interfered in God’s plans, but with a new spin. Critics of the 19th century like John Gibbs claimed that death by disease was foreordained. Therefore, if vaccination reduced death by smallpox, there would just be more death by consumption, whooping-cough, or measles, for divine providence could not be thwarted.

In 1867, John Gibbs’s cousin founded the Anti-Compulsory Vaccination League, and throughout the 1870s, the cause became popular among the working class and the poor in provincial organizations. It was among these small northern town associations that the most extreme justifications for resistance were prominent, and it was among them that it became a movement of civil disobedience, with organized refusal to comply with the law resulting in some leaders of the movement being imprisoned for their beliefs. Meanwhile, among middle-class intellectuals in London, the London Society for the Abolition of Compulsory Vaccination was organized. Among these was Alfred Russel Wallace, the co-discoverer with Darwin of the scientific principle of natural selection, who like other vaccine critics took a more holistic view of health and likewise distrusted the growth of state power and medical authoritarianism. He also took a sanitarian view by attributing reductions in smallpox infection rates to general improvements in sanitation. Wallace further argued, much as some have about today’s Covid vaccines, that more study and experiment were necessary to prove the efficacy of the smallpox vaccine. At the time, Wallace was working with imperfect statistical evidence, and of course, the systematic tests he proposed have since been completed, and then some. Therefore, it’s hard to characterize Wallace, whose heroic refutation of flat-earthers I have previously discussed, as anti-scientific, despite his obsession with seances and spiritualism. This demonstrates, though, that 19th-century anti-vaccination rhetoric was not solely the domain of anti-intellectual denialists. Eventually, in 1896, all of these organizations combined into one National Anti-Vaccination League that concentrated on parliamentary change. In 1898, they achieved victory when a new law was passed allowing abstention from vaccination on the grounds of “conscientious objection” (the first use of the phrase, which would only later be applied in the context of the refusal of military service). However, by this time, vaccination had so reduced rates of smallpox infection that compulsion was no longer a necessity, making this victory more of a pacifying concession.

Wood engraving depicting fears over a compulsory vaccination act. By E.L. Sambourne, courtesy Wellcome Images (CC BY 4.0)

This would not be the end of anti-vaccination activism against compulsory smallpox vaccination, however. The formation of anti-vaccination leagues had spread to New England in the mid-1880s, and compulsory vaccination of children being made a prerequisite of school enrollment precipitated a surge of anti-vaccination rhetoric that lasted through the Progressive Era of the early 20th century. Much of the resistance came at first from farming families, who complained that the transient fevers that often resulted from vaccination kept their kids out of the fields and prevented them from bringing in the crop before the beginning of the school year. Some religious opposition was present here as well, though from a somewhat unorthodox source. I am referring to the Swedenborgian Church, which believes in the unusual mystical prophecies of Emanuel Swedenborg, who claimed to have had transportive visions that allowed him to talk with angels and demons. Swedenborg claimed the Last Judgment had begun in the mid-18th century, and that the Second Coming of Christ had actually happened, through the revelation of his own teachings. In 1906, the Swedenborgian Church in Bryn Athyn, Pennsylvania, or the General Church of New Jerusalem, resisted vaccination against smallpox during an outbreak because of their devotion to homeopathic medicine, and from among the members of this church emerged John Pitcairn, anti-vaccination giant and founder of the Anti-Vaccination League of America. However, the majority of the opposition to compulsory vaccination in early 20th-century America came from parents who believed their children had been injured by a vaccine. In 1914, a New York state Republican committee delegate named James Loyster lost his son to infantile paralysis. Because the death occurred three weeks after a vaccination, he came to believe that the vaccination had caused it—the old post hoc ergo propter hoc logical fallacy. This prompted him to undertake a crusade against vaccination. He surveyed many residents of upstate New York, and he believed that he had discovered some fifty cases of injury and death caused by vaccination, which he argued far outweighed the mere three deaths by smallpox during the same period, apparently never considering that the smallpox death rate had become so small precisely because of vaccination. Among the bereaved parents who lobbied against compulsory vaccination was one illustrator whose daughter, some time after vaccination, had become paralyzed and died due to a heart defect. The limp-limbed ragdoll this illustrator designed, Raggedy Ann, which he said reminded him of his late daughter, would many years later become something of a symbol for anti-vaccinationists. James Loyster took the anecdotes he had gathered from people like the Raggedy Ann doll creator, and he made pamphlets, which he distributed to New York legislators. Eventually, he succeeded in getting the state’s compulsory vaccination law altered. The Jones-Tallett amendment rescinded the requirement in rural areas and towns with populations of fewer than fifty thousand, except in cases of an outbreak, when compulsory vaccination could be enforced once again. However, Loyster saw the law as a defeat, since in cities of more than fifty thousand, the amendment actually expanded vaccination requirements.

As with anti-vaccinationism in 19th-century England, it’s clear that it would be unfair to characterize the philosophy of compulsory vaccination objectors as uniform. Beyond objections to the temporary loss of child labor and fears of perceived dangers, there was objection to the laboratory experimentation on animals that took place to develop vaccines. These vaccine critics were early animal rights activists called anti-vivisectionists, who protested, as the name implies, surgical experimentation on live animals. Though they may seem like odd bedfellows with animal rights activists today, libertarians also had their ideological forerunners among anti-vaccinationists. The two sons of John Pitcairn, the Swedenborgian founder of the Anti-Vaccination League of America, inherited the organization and likewise were leaders in the movement, and they also funded the Sentinels of the Republic, a conservative political organization that opposed federal overreach and socialism in any form, rejecting any social reforms of the day, including New Deal legislation, limitations on child labor, and even the establishment of the Department of Education. Among many anti-vaxxers of the Progressive Era, a government that compelled citizens to do anything, even for reasons of public health, looked a lot like Bolshevism. Then there were those who saw it as a matter of personal liberty, believing their control over the kind of medical care they received to be as sacrosanct as the freedom of religion. Among these were religious groups like the Swedenborgians, whose doctrines included a philosophy on health. Christian Scientists emerged as a similar group. They believed that illness was not actually physical and could best be overcome by appealing for recovery through prayer. The group was widely criticized around the turn of the century after some cases of children dying because they had only been “treated” by practitioners of Christian Science rather than by actual doctors. Charges were made that followers of Christian Science were unlawfully presenting themselves as medical practitioners, and alternative medicine was indeed another common thread among anti-vaccinationists. There were a variety of alternative medicine movements that positioned themselves in opposition to vaccination. Naturopathy was one, with its emphasis on natural therapeutics. Physical Culture was a movement emphasizing natural foods, exercise, and fresh air that also engaged in germ theory denialism, contending that disease resulted only from unclean living and poor fitness. Then there was chiropractic… that’s right, chiropractors. Now don’t get me wrong. I have benefitted from spinal adjustments myself, or at least I believe I have, but chiropractic started as a form of absolute quackery. True believer chiropractors claimed that people only became ill because of spinal misalignment, which disrupted the body’s flow of energy. One can almost hear a low voice out of the past moaning: woowoooo….

Anti-compulsory vaccination rally, 1919. Image credit: Wikimedia Commons User Cavernia (CC BY 2.0)

In the 1930s, anti-vaccinationism in America declined. First, a series of Supreme Court decisions found that compulsory vaccination laws were perfectly constitutional, and then the figures who had spearheaded major anti-vaccinationist organizations passed away. By the end of the 1970s, smallpox had been eradicated because of vaccination, and numerous other diseases, including measles, diphtheria, pertussis, and polio, are now kept under control, also through immunization. Modern anti-vaccinationism didn’t really show up until 1998, when a British physician named Andrew Wakefield published a paper titled “Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children” in The Lancet’s Early Report feature. The paper speculated that the vaccine for measles, mumps, and rubella, or MMR, might cause gastrointestinal inflammation, which in turn could be the cause of autism, a developmental disorder diagnosis that had become and continues to become more and more common in children. When it came out that Wakefield had falsified data and acted unethically in subjecting children to unnecessary spinal taps, his paper was retracted, he was disgraced, and his license to practice medicine was revoked. Nevertheless, it has become a common claim among anti-vaccinationists today that vaccines cause autism. Wakefield has since made a pseudoscience career for himself by further fanning the flames of this disinformation, claiming that children who receive too many vaccines too quickly or receive them while unwell or on other medication are susceptible to being turned autistic. Other claims have to do with the presence of thimerosal, used as a preservative in vaccines, because it contains mercury, which is a known neurotoxin. Everyone seems to know that mercury was used by hatters in hat making, and that it drove them insane, thus the term “mad as a hatter.” However, the thimerosal in vaccines contained ethyl mercury, which does not cross the blood-brain barrier the way its counterpart, methyl mercury, does, so this concern was groundless, and regardless, the chemical was removed from vaccines in 2001 at the urging of federal health authorities simply to pacify anti-vaxxers. While it’s true that we don’t really know why autism incidence has risen so dramatically since the 1970s, the fact that we can now more clearly recognize and more accurately screen for signs of being on the autism spectrum might account for much of the increase in diagnoses. The fact is that no vaccine-autism connection has ever been found despite numerous scientific studies being conducted specifically to determine whether one exists. Studies that found zero association between the MMR vaccine and autism have been published in the scholarly journal Pediatrics, the similarly titled Journal of Pediatrics, The Lancet, the Journal of Child Psychology and Psychiatry, and the New England Journal of Medicine, to name a few. But anti-vaxxers don’t let a little thing like empirical evidence get in the way of their convictions.

Again, though, as in the past, modern anti-vaccinationists do not all approach the subject with the same philosophy and motivation. Objections predicated on religious doctrine remain, though with some variations. As with those who long ago objected to interfering with Providence, some anti-vaxxers, like Suzanne Humphries of the International Medical Council on Vaccination, will say that the pharmaceutical industry has set itself up to replace God, instilling fear in humanity that they will be struck down by pestilence if they do not accept this peculiar communion in a syringe. Then there is the streak of environmentalism among anti-vaxxers: hippies who object to the animal products in vaccines or to what they deem to be toxic chemical ingredients, even though any chemical component that might be harmful in larger amounts exists in vaccines only in such a negligible quantity that the body can easily eliminate it. Environmentalist objection to vaccines sometimes blends with religious objection, arguing that the body’s natural defenses are better equipped to fight disease without technological aid because that is how God designed it in His infinite wisdom. The view that the fight against vaccination is a fight for personal liberty remains, but it has blended in an interesting way with feminism. Specifically, second-wave feminism introduced the idea that medical paternalism was a key element of the patriarchy, and that women must fight to reclaim control of their healthcare from the male-dominated medical industry. This view that women know their bodies best translated into anti-vaccination rhetoric in the form of mothers insisting that their maternal instincts were far more trustworthy than professional medical advice. This notion of the superiority of personal instincts over expert opinion would not be reserved only for women or mothers, though. Since the 1970s, while second-wave feminism was encouraging women to wrest away the reins of their healthcare from the patriarchy, the notion of respect for patient autonomy was spreading as a core principle of medical ethics. This was in response to some terrible ethical violations, some of which were indeed perpetrated by vaccine researchers who tested their vaccines on children with developmental disabilities. With the advent of the Internet, patient autonomy took on new meaning, for suddenly patients could easily self-diagnose, and doctors, who depended on positive online reviews and satisfaction scores to run a successful practice and maintain Medicare reimbursement, were under pressure to give patients what they wanted. Add to this the spread of misinformation online, and you have a recipe for quackery and anti-intellectualism. All of these undercurrents can be observed in the remarks of minor celebrity and prominent anti-vaccinationist Jenny McCarthy during her television appearance on Oprah. “The University of Google is where I got my degree,” she explained in defense of her claims that her son Evan became autistic because of a vaccine, adding, “my science is named Evan, and he’s at home. That’s my science.” When pushed to clarify how she could be so certain, she credited her own “mommy instinct” for letting her “know what’s going on in his body.”

Father of modern anti-vaccinationism Andrew Wakefield, doubling down. Image credit: Wikimedia Commons user Bladość (CC BY 4.0)

But clearly the most common thread in all anti-vaccination rhetoric is the belief that vaccines cause harm or injury, especially to children. The fact is, though the risk of harm from vaccines may be minimal today, there have historically been cases of serious damage done by vaccines. As mentioned, the early practice of variolation was almost barbarous by comparison with vaccination, involving crude purposeful infection that often still resulted in death. Then after Jenner’s breakthrough using material from people infected with cowpox, it turned out that the donors of infected material sometimes were also infected with other diseases that were then passed right along to the person being vaccinated. There were cases in which numerous children caught syphilis and died because of this cross contamination through vaccination. Even in the late 19th and early 20th centuries, when vaccination had become more sterile, episodes still occurred with bacteria-contaminated needles or vaccination solutions that were contaminated with other diseases. In the 1950s, when Jonas Salk’s polio vaccine went to market through numerous pharmaceutical companies, one incident saw batches shipped with live instead of dead viruses, resulting in a manmade polio outbreak in which 200 people were paralyzed and ten killed, many of them children. But these were contained events, which were discovered and corrected. Much anti-vaccination rhetoric, as with most science denialism, relies on baseless conspiracy theory, claiming that the government and the pharmaceutical industry collude to cover up the harms of vaccines in order to make money. However, the simple fact that harms have been identified, admitted, and addressed disproves this. The fact that vaccine manufacturers are shielded from liability by legislation certainly contributes to this conspiracy theory. For example, in America, when Republican President Ronald Reagan signed a law creating the National Vaccine Injury Compensation Program, he set up a mechanism for reviewing claims of vaccine harm and compensating plaintiffs who demonstrate some probability that an injury was suffered due to vaccination. This program encourages manufacturers to continue making vaccines by removing the threat of lawsuits. Some might argue that this means manufacturers no longer need to worry about safety, but this is not accurate, as there is tremendous government oversight of vaccine safety. The CDC and FDA jointly run the Vaccine Adverse Event Reporting System, to which anyone can contribute reports, and the CDC has its own separate program as well, the Vaccine Safety Datalink, which collects safety data directly from health care providers. As for the big government conspiracy theorists’ claims that all of these government agencies are in on it, the fact is that vaccination requirements are decided by local and state governments, not some shadowy federal monolith. While it’s true that vaccination recommendations are passed down by these federal agencies, these recommendations are made by technical advisory groups that hold meetings that are open to the public. As usual, conspiracy theories like these break down under simple, clear logic.

The victory of immunization science over diseases has always been relative. While variolation against smallpox carried a 1-2% chance of death, this was vastly preferable to the 10-33% chance of death from smallpox. The idea has always been that the benefits justify the risks. The risks today are minimal, but they are still present. For example, the MMR vaccine comes with some small risk of febrile seizure, but then again, these convulsions also occur naturally in a small percentage of children, are typically short and mostly harmless, and vaccines can in some cases prevent febrile seizures by protecting against diseases that cause them. For vaccination to be considered medically vital, we need only determine that its potential benefits outweigh the risk of side effects, and that can easily be demonstrated by the elimination of smallpox and polio by vaccination, as well as the initial elimination of measles before flagging vaccination rates allowed for its recent resurgence. This resurgence of measles also demonstrates that this decision of whether benefits outweigh risks is not a matter of individual preference. Vaccination against infectious disease is a matter of public health that requires collective action. This is the case for compulsory vaccination. Many today don’t understand the concept of herd immunity. It was touted last year as a kind of laissez-faire solution to the Covid-19 pandemic, as if it would be over with more quickly if we all ran out and purposely infected ourselves like one big chicken pox party. The fact is that herd immunity is only achieved through mass vaccination. The more people are immune to a disease, the less it will spread, which confers safety even on those who aren’t immune, but the rates of immunity required to achieve herd immunity vary depending on how contagious the disease might be. For the flu, a vaccination rate of 50-75% of the population confers herd immunity, but for measles, 83-94% immunity is needed, and for whooping-cough, 92-94% immunity must be achieved. Numbers like that don’t tend to occur without mass vaccination. We may not yet know what percentage of the populace must be vaccinated to achieve herd immunity against Covid-19, but considering how contagious it is, we should assume that we will need something like 94% of the populace to be immunized. So if you are considering not getting vaccinated, or if you know someone who is hesitant, urge them, convince them, explain to them that we need to take action. If they are leery of the mRNA technology in the Pfizer and Moderna vaccines, encourage them to get the Johnson & Johnson vaccine, which instead uses a disabled adenovirus that contains some genetic material from the novel coronavirus in order to instruct the immune system on how to combat it. If they are disappointed that the Johnson & Johnson vaccine only has a 66% efficacy, let them know that this is likely just an aberration based on the fact that the clinical trials were held during the pandemic’s surge, and remind them that many flu vaccines also have a 60% efficacy and are still considered effective. We have a chance to take some control over this virus and to seize some form of normalcy that may pave the way for an economic recovery, but we all have to trust science and do our part. We’ve done it before, and we can do it again.
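The arithmetic behind those thresholds is worth seeing once, because it makes plain why a contagious disease demands near-universal coverage. This is a minimal sketch using the textbook threshold formula; the reproduction numbers below are illustrative rough estimates of my own, not figures from this episode’s sources:

```python
def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune: 1 - 1/R0, where R0 is
    the average number of people one case infects in a susceptible population."""
    return 1 - 1 / r0

def coverage_needed(r0: float, vaccine_efficacy: float) -> float:
    """With an imperfect vaccine, more people must be vaccinated to reach
    the same immune fraction: threshold / efficacy."""
    return herd_immunity_threshold(r0) / vaccine_efficacy

for disease, r0 in {"flu": 2.0, "early Covid-19 estimate": 3.0, "measles": 15.0}.items():
    print(f"{disease} (R0 ~ {r0}): threshold ~{herd_immunity_threshold(r0):.0%}")

# e.g., an R0 of 3 and a 90%-effective vaccine -> ~74% of everyone must be vaccinated
print(f"coverage needed: {coverage_needed(3.0, 0.90):.0%}")
```

Notice how steeply the threshold climbs with contagiousness: doubling R0 from 2 to 4 moves the target from half the population to three quarters of it, which is why “just let it spread” was never a plan.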

*

Until next time … remember… there’s denialism and then there’s post-denialism. True denialism disguises itself as logical and even scholarly, and because of that is very insidious. If it is not countered, it evolves to become post-denialism, the obtuse, purposeful opposition to reason that requires no evidence, just the desire to undermine. It’s the difference between a pseudo-academic denialist manifesto and a tweet that just says Covid-19 is a hoax. We must address the first in order to prevent the second.

Further Reading

Colgrove, James. “‘Science in a Democracy’: The Contested Status of Vaccination in the Progressive Era and the 1920s.” Isis, vol. 96, no. 2, 2005, pp. 167–191. JSTOR, www.jstor.org/stable/10.1086/431531.

Garde, Damien. “The story of mRNA: How a once-dismissed idea became a leading technology in the Covid vaccine race.” STAT, 10 Nov. 2020, www.statnews.com/2020/11/10/the-story-of-mrna-how-a-once-dismissed-idea-became-a-leading-technology-in-the-covid-vaccine-race/.

"New clues on the historical origin of the vaccine used to eradicate smallpox." ScienceDaily, 11 October 2017, www.sciencedaily.com/releases/2017/10/171011180558.htm.

Porter, Dorothy, and Roy Porter. “The Politics of Prevention: Anti-Vaccinationism and Public Health in Nineteenth-Century England.” Medical History, vol. 32, no. 3, 1988, pp. 231-252. National Center for Biotechnology Information, doi: 10.1017/s0025727300048225.

Rothstein, Aaron. “Vaccines and Their Critics, Then and Now.” The New Atlantis, no. 44, 2015, pp. 3–27. JSTOR, www.jstor.org/stable/43551422.

Weston, Kathryn M. “Killing the Speckled Monster: Riots, Resistance, and Reward in the Story of Smallpox Vaccination.” Health and History, vol. 18, no. 2, 2016, pp. 138–144. JSTOR, www.jstor.org/stable/10.5401/healthhist.18.2.0138.

The Story of Chevalier d'Eon and the Inadequacy of the Historical Analysis of Transgender Identity

d'Eon title card.jpg

In the 1890s, German medical doctor Magnus Hirschfeld noticed a disturbing trend among his homosexual patients. Many of them attempted suicide, and many more bore scars from suicide attempts. When one young military man left behind a suicide note urging Dr. Hirschfeld to “contribute to a future when the German fatherland will think of us in more just terms,” Hirschfeld embarked upon his life’s work of advancing sexual science and devoting himself to sexual rights activism. He founded the Scientific-Humanitarian Committee in 1897 to defend gay rights and work against societal hostility toward homosexuals. Around 1904, Hirschfeld was called to consult on the case of a suicidal man who had been hospitalized after an electrical accident. This turned out to be one Martha Baer, who had been raised as a woman but had chosen to express himself as a man and go by the name Karl. Karl Baer was not a female-to-male transgender person as we would recognize one today. In fact, he had been born biologically male, but due to the relatively common birth defect known as hypospadias, in which the urethral opening forms somewhere other than the tip of the penis, his sex was misidentified. Because he was raised as a girl, this erroneous gender assignment caused him especially great distress after puberty, when he began to grow body and facial hair. His sexual attraction toward women led him to identify as a lesbian, and eventually to identify as a man. Dr. Hirschfeld diagnosed Karl Baer’s gender misidentification based on biology, performed corrective surgery, and helped Baer obtain official gender reassignment in the eyes of the law. While this surgery was essentially a hypospadias repair, it is viewed by many as the first gender reassignment surgery, and certainly Dr. Hirschfeld went on to advance social and medical knowledge of intersex individuals as well as cases of what he would call transvestism and transsexuality. At his Institute for Sexual Research in Berlin, Hirschfeld provided counseling and treatment for a wide variety of people struggling with their sexualities and gender identities. While he still often viewed transgender individuals as suffering from a physical or psychological disorder, he pushed for reform and understanding and was in many ways ahead of his time. Unsurprisingly, his work was not popular with the Nazis, who shut down his institute in 1933. While in exile in Paris, at a cinema, Magnus Hirschfeld had the distinct displeasure of watching a newsreel in which fascists plundered his institute and burned his life’s work. Today, many conservative and religious ideologues justify their intolerance of and opposition to transgender rights by claiming that it is a modern phenomenon, a symptom of liberal decadence, but in truth there is nothing new about transgender identity, just as there is nothing new about blaming changing gender notions on societal decadence. But in many cases, as in the case of Hirschfeld’s institute, this history has been erased by the intolerant.

*

This post started out as another survey, this time listing and discussing all the transgender figures in history in an effort to demonstrate the ubiquity of the phenomenon throughout history. I envisioned it as a refutation of the critics who try to assert that this is a new phenomenon. However, what I found is that this is a difficult task. No doubt, it is an easy thing to turn up lists of historical figures who have been embraced as forerunners by the transgender community. But I feel compelled to dig deeper and consider some further points. First, we would today recognize an entire spectrum of subsets within the term “transgender.” Dr. Hirschfeld rightly recognized transvestites, or “cross-dressers”—people who were comfortable with their biological sex but enjoyed dressing in clothes typically reserved for the opposite sex—as separate from transsexuals, or those who longed to live entirely as a member of the sex they were not assigned at birth, or even to surgically alter their biological sex in order to more fully realize this desire. Our modern terminology did not arrive until the 1960s, when psychiatrist Robert Stoller began using terms like “gender identity” and “gender assignment.” The term “transgender” wasn’t coined until 1970, by activist Virginia Prince, and was originally used to denote heterosexual transvestites. Today, the term is used more generally as an umbrella term, including those who identify as genderfluid, non-binary, or genderqueer. We now recognize sex assigned at birth and gender identity as just two elements in a larger spectrum of personal identity, which also includes gender expression and sexual orientation, the latter of which can be further broken down into physical attraction versus emotional attraction. This evolution of our terminology and understanding of personal identity is what complicates our ability to identify trans figures in history. If we use it as an umbrella term, then it’s simpler. Any persons in history whose outward gender expression is recorded as being at variance with their birth-assigned sex can comfortably be labeled transgender. However, the critics of trans rights typically aren’t denying the presence in history of figures known to have dressed in the clothing their society reserved for the opposite sex. So it would seem some further differentiation and categorization is needed to refute those who desire to erase trans people from history. For example, everyone is familiar with the legends of Joan of Arc and Mulan, women who dressed as men in order to fight as soldiers. Few would argue that they are transgender figures, a fact that clearly indicates how important motivation is to making such an identification. The fact is that it is very difficult to discern what motivated gender transgressive behavior by historical figures, especially past a certain point in time, when polite society dictated that people should not speak or write about such matters. Thus we are left with only inferences and educated guesses. For example, there were many women who dressed as men in order to fight in the American Civil War, but one in particular, Albert Cashier, might be differentiated from the others and from figures like Mulan and Joan of Arc as well, for he chose to live as a man for more than fifty years, long after the conclusion of the war.

19th century depiction of Elagabalus in priestly vestments. Public Domain.

So to discern whether a given figure may reasonably be raised as an example of a transgender individual, we must scour the historical record for evidence of their motivations for gender transgressive behaviors like transvestism, or for some indication of their feelings about their gender identities. Another problem the historian faces in attempting this is the reliability of the primary sources that may give us such evidence. A perfect example of this difficulty is Varius Avitus Bassianus, who in 218 CE, amid a rebellion against Roman Emperor Macrinus, was raised up as a figurehead emperor at just 14 years old, taking the name Elagabalus. As the story is told, this emperor, assigned male at birth, alienated the aristocracy by dressing in clothing considered suitable only for women and by painting his face and wearing jewelry in an effeminate manner. It is even asserted that he wanted to be addressed as a woman, stating, “Call me not Lord, for I am a Lady,” and that he promised generous rewards to any physician who could surgically alter his genitals to be like those of a woman. This certainly sounds like a clear example not just of an ancient historical figure who identified as a gender opposite to the one assigned him at birth, but also of a trans person seeking gender reassignment surgery long before any such procedure had been invented. In my view, this is a strong argument for the presence of trans people throughout human history. The problem for the conscientious historian, however, is that the three primary sources from which all of this information about Elagabalus is gleaned, the works of Cassius Dio and Herodian and his biography in the Roman collection Historia Augusta, are all manifestly hostile toward Elagabalus. The first two were written by contemporaries who had been alienated, like so many others in his realm, by Elagabalus’s religious practices, and the last was compiled long after his death. So unhappy were the Roman elite with young Elagabalus as a figurehead that they ended up having him murdered within a few years of installing him, and his biographers in some passages include patently fictitious accounts of his outrages. As such, the accounts of this Roman Emperor’s gender transgressions may or may not be true.

Long before the terms “transsexual” or “transgender” existed, the Age of Enlightenment gave the western world the term “Eonism” to describe such transgression of gender norms. The origin of this term, a remarkable person named the Chevalier d’Eon, serves as the perfect example both of the presence of transgender figures in our historical past and of the inadequacy of the historical record in helping us determine with any certainty the true feelings of historical trans figures regarding their gender identities. As such, the remainder of this episode will be devoted to this unusual and fascinating individual. Born Charles-Geneviève-Louis-Auguste-André-Timothée d'Éon de Beaumont to a noble but not wealthy family in 1728, he lived his life as a young man, and thus I will refer to him using the masculine pronoun for now, for clarity’s sake. A gifted scholar, he graduated as a student of civil and canon law at 21 years old, in 1749, and embarked on a political career as a secretary to a series of administrators before being appointed a royal censor. In this role, he read literature and history extensively, but rather than working to ban books, he appears to have spent his time acquainting himself with the most current Enlightenment thinkers. He began to develop political philosophies of his own, which tended toward the idea of benevolent despotism: that absolute monarchs were the ideal agents of progressive reform, taking power away from the aristocracy and the church and imposing policies inspired by Enlightenment principles. With such an ideology, it’s no surprise that he sought to serve the purposes of King Louis XV by entering the diplomatic corps and working with the French Ambassador to Russia for a few years. Despite being of slight build and a “pretty boy,” as some described him, he thereafter entered military service during the Seven Years’ War between Britain and France, serving as a dragoon and sustaining an injury. He was afterward sent to London, appointed secretary to the ambassador negotiating the Treaty of Paris that would end the Seven Years’ War. For his service in drafting this treaty, he was awarded a knighthood, earning the title Chevalier. Throughout his military career and during his years living in London, despite his slender frame and somewhat effeminate beauty, d’Eon earned himself a reputation as a manly fellow. He always went about in his dragoon’s uniform, proudly defended his reputation and honor, and eagerly escalated personal conflicts by frequently challenging other men to duels, as though to prove his own manhood. This aspect of his character is especially interesting considering what would later be revealed.

Chevalier D'Éon, as a younger man. Mezzotint by Vispré. Credit: Wellcome Collection. Attribution 4.0 International (CC BY 4.0)

Outwardly, the Chevalier d’Eon was a rousing success in London. After the signing of the Treaty of Paris, d’Eon returned to London to take charge of diplomatic affairs between the two nations, being promoted to the position of plenipotentiary minister during an interim period when the previous ambassador had returned to Paris and a replacement had yet to be appointed. Chevalier d’Eon was a fixture of King George III’s court and made numerous social connections among the English nobility. He was living a life of luxury, rubbing elbows with the wealthy and powerful. But then the rug was pulled out from under him. A new ambassador arrived, the Comte de Guerchy, whom d’Eon thought of as his inferior in intellect. D’Eon was knocked back down to the role of secretary, a demotion that caused him keen humiliation. Brazenly, d’Eon refused to step down from his post, and the newspaper coverage of his defiance was widely read because of just how entertaining it was. When pestered by an agent of Guerchy, the newly appointed ambassador, d’Eon refused to grant the man an audience because he was not highborn and challenged him to a duel when he persisted. This set off a pamphlet war, with Guerchy denouncing d’Eon’s refusal to step down and d’Eon, explosively, claiming that Guerchy had tried to drug him, kidnap him, and have him murdered. Indeed, d’Eon even pressed charges against Guerchy for attempting to have him assassinated, a legal action which Guerchy tried unsuccessfully to have nullified. Amid this feud, Chevalier d’Eon published Guerchy’s private correspondence, seemingly to embarrass him. In retaliation, Guerchy had d’Eon charged with libel in France, to which of course d’Eon refused to return, thereby making d’Eon an outlaw in his home country. In England, however, d’Eon still enjoyed wide support. King George III refused his extradition, and the public, reading about d’Eon’s stand against his king in newspapers that portrayed his struggle sympathetically, began to view him as a hero of the people, pushing back against royal abuses of power. He was compared with Britain’s own John Wilkes, a radical journalist who around the same time had made enemies of king and prime minister alike and had likewise been transformed into an outlaw who enjoyed popular support. But there was more to all of this than the public was aware. Chevalier d’Eon’s publication of private papers was, unbeknownst to all, a powerful message to King Louis XV. It was, in fact, an overt threat, for Chevalier d’Eon kept a volatile secret that the King of France could not afford to see revealed.

We know now that, even before his work as a secretary to the Ambassador to St. Petersburg, a young Chevalier d’Eon had been enlisted into King Louis XV’s espionage service, “Le Secret du Roi,” or the King’s Secret. In general, this spy organization within the diplomatic corps was tasked with gathering foreign intelligence and working toward the king’s purposes even when they were at cross-purposes with the goals his diplomats were publicly trying to reach. For example, while in Russia, d’Eon had been working on some schemes in Poland. Outwardly, Louis supported Polish independence, but through his agents in the King’s Secret, he was endeavoring to install his cousin on the Polish throne. In England, Louis’s secret agenda was even more devious. While Chevalier d’Eon and the Ambassador drafted the Treaty of Paris to end hostilities between France and Britain, King Louis had his agents making inroads with and bribing politicians, with the end goal of launching an invasion. Chevalier d’Eon had actually been placed in an even better position to work toward Louis’s goals when he was made plenipotentiary minister, and as such he had been spending a great deal of money in his efforts to forge relationships that he could leverage. It has been suggested that one reason d’Eon was replaced was that he was spending too much money, and likewise that this was the reason he refused to step down: he had become too accustomed to his lifestyle. Whatever the case, when he defied King Louis, he knew that his knowledge of the King’s machinations might work to his benefit, and his publication of private correspondence was a shot across the bow, warning that he was all too willing to betray confidences. King Louis could not afford for his plan to invade Britain to be revealed, as it would precipitate war before the French were prepared to renew hostilities. However, as time went on and d’Eon failed to preserve his title, his pension, and his reputation through implied threat and negotiation, his leverage began to lose its power. During the early 1770s, the rumblings of colonial rebellion in America weakened d’Eon’s position, for Britain had already begun to suspect that France might use the opportunity, while Britain was embroiled in a war with her own colonies, to break the treaty and make war on them again. However, by that point in d’Eon’s years-long negotiation with Louis XV, there was another secret of d’Eon’s in play, one that had been revealed and would have far greater consequences for his reputation.

Chevalier d’Eon as an older man. Public Domain.

In 1770, rumors began to circulate among the British that, despite his masculine comportment, the slight and comely Chevalier d’Eon was actually a woman. Some began to portray him as a heroine who became a man solely to serve her country, first in the military and thereafter in politics, like a modern Joan of Arc, or to use a more recent and British example, like Hannah Snell, who had donned men’s clothing to travel in search of the husband who had abandoned her and ended up serving as a foot soldier fighting against the Jacobite rebellion of 1745. Most, though, doubted the rumor entirely, and the matter became the subject of much wagering in London, resulting in a betting pool being started at the London Stock Exchange itself. Interestingly, Chevalier d’Eon refused to confirm or deny the rumor and would not submit to a physical examination that would put the matter to rest, stating that it would be a great dishonor. When King Louis XV died in 1774, d’Eon renewed his negotiations with Louis XVI, seeking to maintain his title and pension and finally return to France. Due to the shrewd negotiations of playwright Pierre de Beaumarchais, who was working on the King’s behalf, the question of Chevalier d’Eon’s sex was made central to the matter. In a courtroom setting, witnesses swore that they had seen d’Eon in the nude, affirming that he was in fact biologically a woman. Using these testimonies to convince Louis XVI that d’Eon was a woman, Beaumarchais hammered out a deal: if d’Eon wanted to resolve the matter and return to France, he must adopt female dress and habits and agree to live the rest of his days as the woman he was. If he did so, he could keep his pension, but he would be stripped of his title of plenipotentiary minister, because a woman could not hold such a position. To the surprise of many, d’Eon accepted, returned to France, and lived as a woman the rest of her days, some 33 years. She was legally recognized as a biological woman, and she lived for many of those years as a sort of celebrity. She participated in fencing tournaments and was thought of as a kind of warrior lady, like the legendary Amazons. Thus it was a great shock when, after her death in 1810, a post-mortem examination revealed that she possessed “male organs in every respect perfectly formed.”

It is unsurprising that Chevalier d’Eon, or Chevalière d’Eon, as she would be known by the feminine form of her title, features prominently on most online lists of historical trans figures. She is remembered by many as a male-to-female transgender person who had the audacity and genius to come out in the 18th century by convincing the world that she had actually been a woman cross-dressing as a man and was now simply discontinuing her transvestism to live as the sex assigned her at birth. Her 1779 ghostwritten memoir, The Military, Political, and Private Life of Mademoiselle d’Eon, explained it thus: she had been raised as a boy by her father because he required a son in order to ensure an inheritance from his in-laws. However, d’Eon’s post-mortem proved that this was just a cover story, and some other observations of the physician who examined the Chevalière’s body, specifically that she was round of limb and curvaceous, with “breasts remarkably full,” have led some to suggest that she was in some way biologically intersex. Yet historians typically take an opposing view that is equally understandable. They might suggest that whatever feminine aspects her body showed were a result of weight gain and the shaping that occurs from wearing women’s clothing, such as corsets. These academics use the male pronoun and look at the circumstances surrounding d’Eon’s transition to argue he was a hoaxer, asserting that his adoption of women’s garments was simply a means to resolve his conflict with the French crown and negotiate his return home, or that it was not even his idea, and that he was forced by Beaumarchais, the playwright and negotiator for the crown, to humiliate himself by adopting women’s attire in order to return from exile. They will point to the fact that he never showed a desire to dress as a woman before the conclusion of his negotiation, that he in fact preferred his military uniform, and that he even resisted the stipulation to dress as a woman for two years. Indeed, d’Eon even challenged Beaumarchais’s partner, who claimed d’Eon had confided to him that he was a woman, to a duel, a challenge the man easily avoided by saying it was dishonorable to fight a woman. Then there are further reports from the remainder of d’Eon’s life that, although he dressed as a woman to comply with the terms of his repatriation, he refused to behave as one would expect a lady to behave, climbing into and out of carriages without aid and remaining after dinner parties to socialize with the men when the women retired.

Portrait of Chevalier d’Eon in women’s clothing. Public Domain.

Nearly every academic point raised to claim Chevalière d’Eon was a mere impostor can be refuted fairly easily and logically, starting with the claim that she never wore women’s clothing before 1777. According to her own memoir, while traveling to Russia on a diplomatic mission she had dressed as a woman in order to evade the British, and while at the court of Empress Elizabeth of Russia, she would have taken part in that monarch’s unusual weekly masquerades, which obliged everyone to cross-dress and weren’t exactly masquerades in that no masks were to be worn. The notion that she did not wish to dress as a woman receives a further blow from evidence that d’Eon promptly purchased an entire set of lady’s garments as soon as it became clear that dressing as a woman would be a stipulation of the repatriation agreement. Furthermore, d’Eon’s early resistance to dressing as a woman after the agreement was signed, as well as her challenging of Beaumarchais’s partner to a duel, has an alternative explanation. It turned out that, in making d’Eon’s gender identity a part of the negotiations, Beaumarchais and his partner were motivated by greed. Recall that the rumors of Chevalier d’Eon’s womanhood had sparked a great deal of wagering in London. The odds favored d’Eon being a man, so Beaumarchais and his partner stood to win a lot of money if they bet that d’Eon was a woman and then were able to legally prove it. And this is exactly what they did, producing witnesses willing to perjure themselves by stating they had seen or touched Chevalier d’Eon’s female genitals. So d’Eon’s early refusal to comply with the order to dress as a woman, as well as her anger and attempt to duel these men, was prompted by her discovery that they had used her gender identity in a scheme to make money. However, there is every reason to suspect that the original rumor about d’Eon being a woman was started by d’Eon herself, and that it was she who suggested using the repatriation negotiation to make this gender identity official. Remember that at any time, d’Eon could have disabused everyone of this notion. And she had good reason to do so, for maintaining her female gender identity ensured that she could not retain her title of plenipotentiary minister, something she had previously made central to her demands. It stands to reason, then, that living as a woman was worth more to her than maintaining her status. When Beaumarchais’s partner afterward tried to sue the bettors who refused to pay on their wagers and was obliged to produce evidence of d’Eon’s womanhood in a court of law, he did so counting on the fact that d’Eon would not show up to present evidence to the contrary, which she did not. And finally, following the French Revolution and Louis XVI’s execution in 1793, d’Eon returned to London and thus was no longer subject to the terms of her repatriation agreement, yet she continued to live as a woman until her death in 1810. Some will argue that she enjoyed her reputation as a heroic woman too much to give it up, and others suggest that reversing the transition would have been too great a dishonor, but the simplest explanation, which accords with all the events of her life before that, is that Chevalière d’Eon had always secretly wanted to live as a woman and had schemed in the 1770s to find a brilliant way to do so despite the gender norms of her society.

Even when the academic literature about Chevalière d’Eon concedes that she may have wanted her gender transition, it tends to search for some further explanation, as though unwilling to accept the simple and clear idea that she had always identified as a woman, even throughout her military career and her constant belligerent dueling. Some point to d’Eon’s sexual orientation to suggest a homosexual lifestyle may have led to the gender transition, observing as evidence that there is no record of his engaging in heterosexual affairs. In fact, however, he lived for years in London apparently having an affair with his married landlady, who during the repatriation negotiations swore that she regularly slept with d’Eon, had seen d’Eon naked, and knew d’Eon to be a woman, which was at least in part, or in some sense, a lie. Likewise, during several of her later years, d’Eon lived with a widow. But regardless of these relationships, the fact is that Chevalière d’Eon’s sexual orientation proves nothing about her gender identity, a fact that seems to escape some historians. Still other academics look to gender norms and evolving notions of masculinity in Enlightenment Europe in an effort to “explain” her gender transition. They point to the “effeminacy” that had become more and more common among upper-class men, especially courtiers like Chevalière d’Eon. This was, after all, the age of fops and Macaronis, who preferred dainty and embroidered hose and garters, and who wore their wigs high with dangling curls in a style previously popular only among women. However, to suggest that d’Eon’s gender transition was a result of the 18th-century feminization of men would be to ignore the fact that in most circles, even those in which d’Eon moved, “effeminate” was still considered an extremely offensive descriptor for men, an insult over which duels were commonly fought, and the “effeminacy” of courtiers was one of the greatest criticisms of decadent court life. But again, regardless of the accuracy of this argument, it misses the mark on an even more fundamental level. Searching for a cultural “cause” of d’Eon’s gender transition is a tacit denial that she might have identified as a woman her entire life, regardless of her social environment.

A sketch of Chevalière d’Eon as an older woman. Public Domain.

Another academic view of Chevalière d’Eon’s gender transition is that it was more of a philosophical or theological statement. Once again, these academics look for an “explanation” of her transition in notions about gender roles then prominent in the zeitgeist, and also in the literature that d’Eon kept in her library. They make much of the fact that Chevalière d’Eon owned one of the largest collections of querelle des femmes literature in the world. This proto-feminist literary genre attempted to refute the notion, common in that era, that women were a corrupting influence in royal courts and society in general, arguing instead that women were actually morally superior to men and that men should imitate their virtuous ways. But again, with the knowledge of gender identity we have today, Chevalière d’Eon’s transition does not require some further explanation. Furthermore, it’s backward to suggest d’Eon’s feminist views caused her transition, when it seems far more likely that her feelings about her gender led her to seek out and collect literature that complemented her deepest convictions about herself. Academics who suggest her transition was a result of her theological principles make this same mistake, but at least they rely on her own words, from an unfinished autobiography. Later in life, Chevalière d’Eon found religion, leaning specifically toward Jansenist Christianity, about which I spoke a great deal in my series on miracles in Enlightenment France. In her autobiography, though, she revealed a theological reason for disregarding her gender entirely. According to her, the bodies God gives us are absolutely corrupt, a tenet with which many devout Christians would agree wholeheartedly. Chevalière d’Eon’s point is that sexuality and gender variance are meaningless to God and irrelevant to whether or not one achieves salvation. In the end, all physical differences, whether of biological sex, or infirmity, or race, will vanish when in paradise we are reborn in glory. When historians look at this and suggest these beliefs might “explain” her gender transition, it boggles the mind. If anything, these views, which she seems to have developed long after her transition, served to comfort and reassure her that there was nothing morally wrong with the acceptance of her gender identity. In fact, rather than picking this theology of gender apart for some better understanding of a gender transition that should be simple to understand, I think it should be viewed as a remarkable interpretation of Christian doctrine that promotes tolerance, one that many Christians today would do well to consider adopting.

Recently, the opponents of trans rights (or should I more accurately say the oppressors of trans people?) have shifted the focus of their arguments. In the recent past, they focused on transgender women posing a threat to cisgender women, portraying them as public restroom prowlers. More recently, Harry Potter author J. K. Rowling suggested that recognizing the rights of trans women somehow negates the rights of cisgender women. You may notice that much of this opposition is focused on male-to-female trans people, as it seems the gender identity of these individuals in particular upsets patriarchal notions of masculinity. But another thread in recent arguments against recognizing the struggles of trans people as legitimate focuses instead on the science of human biology. Nothing could be more ironic than QAnon conspiracy enthusiast congresswoman Marjorie Taylor Greene posting a sign outside her office that states “There are TWO genders: MALE & FEMALE. ‘Trust the Science!’” Of course, I recognize that the irony of her saying “trust the science” may not actually be lost on her, since she put the phrase in scare quotes, but I doubt she has thought through the problem of mockingly repeating a phrase while at the same time seeming to use it in earnest. Regardless, let’s close our discussion by talking a bit about science and transgender issues. Science has, in fact, come a long way since the days of Dr. Hirschfeld’s institute. The fact is, science no longer tells us there are only two genders. One example is the 2018 article “Only Two Sex Forms but Multiple Gender Variants: How to Explain?” in the academic journal Communicative & Integrative Biology, in which the author concludes that “there are probably as many different gender variants as there are sexually reproducing individuals.” The error of Marjorie Taylor Greene’s argument, of course, comes from the erroneous conflation of sex and gender, but science no longer recognizes only two sexes either. This can be clearly discerned in the 2015 Nature article “Sex Redefined,” which states that “[t]he idea of two sexes is simplistic. Biologists now think there is a wider spectrum than that.”

But if we look at the history of scientific examination, and specifically the psychological studies of gender transition conducted throughout the 1990s, we see some of the same problems that are present in the academic exploration of transgender history. One tested hypothesis was that boys who were “pretty” or deemed beautiful were treated more like girls and because of this treatment became transgender. Findings seemed to support this hypothesis, but the whole notion could be inverted, such that these “pretty boys” had actually changed their appearance to fit their gender identity, rather than changing their identity to match how others viewed them. Likewise, a similar study’s findings suggested that girls who were seen as “ugly” were treated like boys and then identified as male, even though it appears those who later identified as male may originally have been seen as less pretty because they had cut their hair short to complement their burgeoning transgender identities. Researchers also tested the hypothesis that parents with depression or borderline personality disorder somehow turned their children trans, but again, the notion could be flipped to suggest that these parents actually suffered depression and BPD because of the mistreatment and struggles their transgender children were enduring. Then there was a study that sought to confirm that parents who allowed their kids to play with the “wrong” toys were somehow responsible for their children becoming transgender, when obviously the child’s desire to play with the toys deemed inappropriate to his or her gender was already present. What all these studies had in common was that psychologists and researchers viewed transgender identity as a mental illness, “gender identity disorder.” Because of this, they invariably approached their studies as experiments to determine the “problem.” This is exactly how historians often approach the study of transgender figures in history. If they are not trying to disprove entirely the notion that a historical person was transgender as we understand it today, they seek to determine what might have “caused” the person to dress or act as a member of the opposite gender. They try to explain away these figures’ gender transgressions because historians are trained to look more deeply, to avoid supposition, and to doubt the obvious. However, the story of Chevalière d’Eon shows us that, just as scientists struggled to measure and test transgender identity, so too historians often dismiss the apparent in their efforts to analyze it. It seems that academics would do well to remember that practicing critical thinking does not mean rejecting what is clear and evident.

A portrait capturing Chevalière d’Eon’s beauty. Public Domain.

Further Reading

Ainsworth, Claire. “Sex Redefined.” Nature, 18 Feb. 2015, www.nature.com/news/sex-redefined-1.16943.

Clark, Anna. “The Chevalier D'Eon and Wilkes: Masculinity and Politics in the Eighteenth Century.” Eighteenth-Century Studies, vol. 32, no. 1, 1998, pp. 19–48. JSTOR, www.jstor.org/stable/30054266.

De Loof, Arnold. “Only Two Sex Forms but Multiple Gender Variants: How to Explain?” Communicative & Integrative Biology, vol. 11, no. 1, 2018. NCBI, www.ncbi.nlm.nih.gov/pmc/articles/PMC5824932/.

“The first Institute for Sexual Science (1919-1933).” Magnus-hirschfeld.de, magnus-hirschfeld.de/ausstellungen/institute/.

Icks, Martijn. The Crimes of Elagabalus. I.B. Tauris, 2011.

Kates, Gary. “The Transgendered World of the Chevalier/Chevalière D'Eon.” The Journal of Modern History, vol. 67, no. 3, 1995, pp. 558–594. JSTOR, www.jstor.org/stable/2124220.

Lander, James. “A Tale of Two Hoaxes in Britain and France in 1775.” The Historical Journal, vol. 49, no. 4, 2006, pp. 995–1024. JSTOR, www.jstor.org/stable/4140148.

Stevens, Heidi. “Marjorie Taylor Greene wants us to ‘trust the science’ on transgender rights. Here’s the science.” Chicago Tribune, 26 Feb. 2021, www.chicagotribune.com/columns/heidi-stevens/ct-heidi-stevens-marjorie-taylor-greene-transphobia-trust-science-0226-20210226-atqjgfqht5hzjl54qjbeu4hk24-story.html.

Turban, Jack. “The Disturbing History of Research into Transgender Identity.” Scientific American, 23 Oct. 2020, www.scientificamerican.com/article/the-disturbing-history-of-research-into-transgender-identity/.