This is the first part of a two-part article.
The systematic dismantling of public health infrastructure now underway in the United States is not an aberration. It is not the product of one man’s ignorance, one administration’s malice, or a temporary rupture in an otherwise sound institutional order. Yet the dominant media narrative has treated it as precisely that—a shocking departure, a catastrophe that arrived without warning. What is missing from that reporting is history. Without it, the present crisis cannot be explained, its depth cannot be measured and the social forces capable of reversing it cannot be identified.
The agencies now being gutted, including the Centers for Disease Control and Prevention, the Food and Drug Administration, the National Institutes of Health and, globally, the World Health Organization, were not the automatic products of civilizational progress. They were built under specific historical conditions and won through specific social struggles, and they have always been vulnerable to reversal under conditions of broader social crisis. Understanding what is being destroyed today requires understanding how and why it was built.
It is in this context that the career of Dr. Stanley Plotkin demands serious attention. Plotkin, widely revered as the “godfather of vaccines,” is 93 years old. In the March 2, 2026 edition of STAT News, journalist Helen Branswell traced the arc of his career and recorded his despairing assessment of the present moment in a profile titled “A titan of vaccine development sees his field’s achievements slip away.” Plotkin’s verdict was unsparing: the field’s achievements are slipping away, vaccine nihilism is rising, and he does not know how to counter it. His longtime colleague Walter Straus noted that Plotkin is watching his life’s work dismantled, in some cases repudiated, on specious grounds.
Branswell’s profile is a work of genuine sympathy for a scientist who has earned it. But sympathy without analysis has its own political function at a moment of crisis. It renders Plotkin’s despair moving rather than explicable, leaving the reader with grief but no framework for understanding why this is happening or what would need to change. In this, the STAT piece is not an exception to the broader media failure—it is its most accomplished expression.
For nearly seven decades, Plotkin’s work has been driven by a deep conviction in medicine as a transformative social force. As a 15-year-old growing up in the Bronx, he read Sinclair Lewis’s Arrowsmith and Paul de Kruif’s Microbe Hunters, books that shaped his dedication to biological science and vaccinology. Reflecting on his origins in a recent Q&A with the World Socialist Web Site, Plotkin explained the animating conviction of his career: “It struck me that science could be a social mission changing people’s lives.”
Today, however, that lifelong mission is colliding with a brutal political reality. The systematic destruction of public health protections, driven by the fascistic agenda of the Trump-Kennedy administration, threatens to undo the work of generations of researchers. Faced with this regression, Plotkin bluntly warned the WSWS: “The decrease in support of science will result in fewer prevention or cures. That is obvious.”
Plotkin’s grief is entirely legitimate, and his scientific achievements—most notably the rubella vaccine that eliminated the disease in the United States—are of genuine historical magnitude. He sees the demise of institutions that were themselves historically specific—built during a particular moment when ruling elites, under particular pressures, found it expedient to fund the scientific and public health infrastructure he spent his career inside. That moment has passed. What replaced it is what we are now witnessing.
This report traces the science Plotkin built, the history that made it possible, and—unavoidably—the ideological limits of the worldview that leaves one of its foremost champions without the political tools to defend it. The defense of public health, including vaccines, is a political struggle.
Rubella: the disease medicine failed to see
For more than two centuries, the medical establishment fundamentally misunderstood rubella. Friedrich Hoffmann made the first clinical description in 1740; the name “rubella”—from the Latin for “little red”—was coined in 1866 by Henry Veale, an English surgeon observing an outbreak in India.
By 1938, Japanese researchers Hiro and Tosaka had confirmed viral transmission, yet the pathogen itself would remain uncaptured for another two decades. The disease was considered so benign that popular magazines promoted “German measles parties,” encouraging parents to expose their daughters before they reached childbearing age. The medical profession did not fear rubella because its catastrophic damage was invisible: it struck the developing fetus in the womb, leaving the presenting patient apparently well. The real victims were unborn children and the women who carried them.

When World War II broke out, the mobilization of troops created conditions for mass contagion. In 1939 and 1940, a severe rubella epidemic swept through the crowded confines of Australia’s wartime army training camps. Young soldiers carried the infection home on leave, transmitting the virus to their families and the wider community.
In early 1941, a 50-year-old ophthalmologist from Sydney named Norman McAlister Gregg noticed an unusually high incidence of infants presenting at his clinic with atypical congenital cataracts. The medical dogma of the era held that congenital defects were strictly inherited, not caused by environmental infections, and that the placenta was an impenetrable barrier to disease.
Gregg’s breakthrough came because he heard what the medical establishment had trained itself not to hear. Overhearing a conversation in his waiting room between two mothers of cataract-affected infants, he caught a detail both women mentioned: they had suffered from German measles during pregnancy. Investigating the records of every affected infant he could identify, Gregg established that 68 of 78 children with congenital cataracts had been exposed to rubella in the womb.
Gregg’s 1941 paper revolutionized the study of birth defects, establishing for the first time that an environmental viral agent could cause congenital malformations. The medical establishment’s response was dismissive. An editorial in The Lancet in 1944 suggested that Gregg had not adequately proven causation, reflecting a broader professional skepticism that the placenta could be breached by infection. Overseas medical audiences proved equally resistant—when the Australian pediatrician Sir Lorimer Dods traveled to the United States in 1947 to present Gregg’s findings, he later wrote of watching the assembled physicians: “You could see them all doubting.”
The resistance persisted until a University of Sydney statistician, Oliver Lancaster, fully vindicated Gregg in 1951, demonstrating mathematically that epidemics of deafness in Australia tracked precisely with rubella outbreaks nine months earlier—a correlation extending back to 1879, meaning congenital rubella had been causing preventable damage for generations without being recognized.
The lifelong human toll of the disease was documented with clinical precision in a 60-year follow-up study of Gregg’s original patients. The consequences of congenital rubella syndrome (CRS) were not passing childhood afflictions—they were permanent alterations of the body that accumulated across a lifetime. When 50 of the original cohort were reviewed in 1967 at the age of 25, 48 were deaf, 26 had cataracts or retinopathy, 14 had cardiac defects and five had intellectual disabilities.
By their 60-year review in 2000–2001, the damage had continued to evolve. Sixty-eight percent showed aortic valve sclerosis. Twenty-two percent had developed diabetes—nearly double the expected rate for their age group. Eight of 11 women had experienced early menopause. New cataracts had developed in three subjects over the preceding decade; eight had glaucoma. The virus, contracted in the womb 60 years earlier, was still making itself felt.
The social consequences were stark. By the time they reached 60, only eight of the subjects were still working; many had been forced into early retirement or lived on pensions, and nine had never married. Yet reflecting on their lives, these survivors expressed satisfaction that, because of the vaccine, “today’s young Australians do not have to cope with the problems they had to overcome.”
Despite Gregg’s discovery in 1941, two more decades would pass before the virus was isolated, and nearly another decade after that before a vaccine was ready. The gap was catastrophic. In the spring of 1963, a massive rubella epidemic erupted in Europe and spread to the United States in 1964 and 1965.
The thousands of afflicted infants left in the epidemic’s wake revealed that CRS was far more extensive than Gregg had observed in 1941. Beyond the core destruction of the optic lens, the cochlea and the heart, the virus caused systemic damage throughout the brain, lungs, liver, spleen and bone marrow, producing encephalitis, intellectual disability, pneumonia, and hepatitis.
Plotkin, who was practicing pediatrics in Philadelphia during the epidemic, recorded its human cost in a 2006 retrospective. “Those of us who were practicing pediatrics or obstetrics during those years remember with poignancy the many tragedies we witnessed as families struggled with decisions about therapeutic abortions and severely damaged infants,” he wrote. The numbers bore out the memory: at the height of the epidemic, Plotkin calculated that 1 percent of all births in Philadelphia bore the virus’s damage.
Science as social inheritance: the cell culture revolution, 1949–1962
The eventual elimination of congenital rubella cannot be understood outside the broader revolution in mid-century virology. As the WSWS has documented in its account of the measles vaccine, the 1949 breakthrough by John Franklin Enders and his colleagues—demonstrating that poliovirus could be grown in non-neural human tissue in laboratory culture—was the methodological key that unlocked modern vaccinology.
Enders’s innovation was not merely an individual triumph but a conceptual revolution that remade the field—establishing that viruses need not be cultivated in living animals, that they could be studied, weakened and eventually controlled in glass.
Science is a social and institutional inheritance, not a series of isolated individual discoveries. It was no coincidence that one of the scientists who isolated the rubella virus was Dr. Thomas Huckle Weller—a student of Enders, his co-recipient of the 1954 Nobel Prize, and the direct carrier of his methodological legacy. The transmission of technique across generations, sustained by the publicly funded institutions of the postwar era, made the rubella vaccine possible long before any single researcher sat down to make it.
For more than two decades after Norman Gregg identified rubella’s catastrophic effects on the fetus, the virus itself remained invisible to science, making vaccine development impossible. Then, in late 1962, two separate teams of researchers independently cracked the problem, publishing their results in the same volume of the same journal.
At the Harvard School of Public Health, Weller and his colleague Franklin Neva cultured the virus in human amnion cells, detecting its presence through characteristic cytopathic changes.
Working simultaneously at the Walter Reed Army Institute of Research in Washington, D.C., Paul Parkman, alongside Edward Buescher and Malcolm Artenstein, isolated the virus by a different method entirely. Recognizing that rubella spread easily through military barracks—where young men from isolated rural communities were crowded together with no prior immunity—Parkman’s team collected throat washings from infected soldiers at Fort Dix, New Jersey. They then demonstrated the virus’s presence in African green monkey kidney cells, using a technique of viral interference: a second virus, ECHO-11, failed to produce its usual cytopathic effect in cells already occupied by rubella, revealing the invisible pathogen indirectly.
The timing of this dual isolation was a critical, life-saving development. It arrived just before the catastrophic global rubella epidemic of 1963–1965. For the first time, researchers and doctors had the serological tools to accurately diagnose rubella infection, allowing them to confirm the disease in pregnant women and study the pathogenesis of the virus.
Paul Parkman’s life embodied the ethos of a public-spirited science that is now under systematic attack. The son of a postal clerk in the small town of Weedsport, New York—his father raised turkeys to finance his medical education—Parkman went on to build a career defined by his refusal to treat scientific discovery as a commercial asset.
When Parkman and his colleague Harry Meyer developed the first licensed rubella vaccine, they assigned their patents directly to the federal government—not to Merck, not to a private firm—so that the vaccine could reach as many people as possible, as quickly and affordably as possible.
Reflecting on his career in a 2002 retrospective, he wrote: “With the exception of safe drinking water, vaccines have been the most successful medical interventions of the 20th century. As I look back on my career, I have come to think that perhaps I was involved in the easy part. It will be for others to take on the difficult task of maintaining the protections that we struggled to achieve. We must prevent the spread of this vaccine nihilism, for if it were to prevail, our successes could be lost.” Dr. Paul Parkman died in May 2024 at the age of 91—long enough to watch his warning become prophecy, long enough to see the vaccine nihilism he feared take hold in the very institutions that had once celebrated his work.
Hilary Koprowski and the Wistar: the institutional conditions of discovery
Dr. Stanley Plotkin’s rubella vaccine did not emerge from isolated genius. It required specific institutional conditions—a particular laboratory, a particular director and a particular moment in the postwar funding of science.
Born in Warsaw in 1916, Hilary Koprowski was a polymath who completed his medical degree at the University of Warsaw and simultaneously studied piano at the Warsaw Conservatory of Music, later earning a further diploma at the Accademia di Santa Cecilia in Rome after fleeing the 1939 Nazi invasion. Koprowski and his wife made their way through Rome and Portugal to Brazil, where he worked for the Rockefeller Foundation’s Yellow Fever Research Service before joining Lederle Laboratories in New York in 1944.
At Lederle, Koprowski developed the world’s first live oral polio vaccine, which he tested first on himself in 1948 and later, in 1950, on a group of children at Letchworth Village in New York. Yet the medical establishment and the federal government ultimately backed the competing vaccines developed by Jonas Salk and Albert Sabin. It was Sabin’s oral vaccine that achieved global licensure—a result that, in the words of a colleague reflecting after Koprowski’s death, left him “deeply disappointed because it cost him the Nobel Prize.”
In 1957, Koprowski took over as director of the Wistar Institute of Anatomy and Biology in Philadelphia—a facility he later described as little more than “a moribund anatomic museum” known for breeding laboratory rats and displaying anatomical specimens. He transformed it completely. He cleared the museum, renovated the laboratories and built a biomedical research center of international standing. His gift was atmospheric as much as scientific: he minimized bureaucracy, demanded interdisciplinary exchange and drew researchers from across the world who found in Wistar an environment that, as one colleague recalled, “facilitated the practice of science” and was never, under Koprowski, boring.
Plotkin’s entry into this environment was a calculated move. Facing the military draft following his medical internship, he joined the Epidemic Intelligence Service (EIS) at the CDC—and when offered a list of public health service assignments, chose the one at the Wistar Institute, not for its anthrax research but for its director. “I knew that Hilary Koprowski had taken over at the Wistar Institute, and my reasoning was that if I went to the institute, I could get into his laboratory,” Plotkin recently recalled. He walked into Koprowski’s office, asked to join his polio research, and was accepted. The rubella vaccine, though neither man knew it yet, had found its home.
The 1964–65 rubella epidemic
The global rubella pandemic swept the United States in 1964 and 1965, infecting an estimated 12.5 million Americans. In the weeks after exposure, most adults experienced only a fleeting rash and mild fever. The full horror revealed itself months later, when the babies began to be born.
The epidemic resulted in 11,250 fetal deaths and miscarriages, 2,100 neonatal deaths and 20,000 infants born with CRS. In Philadelphia alone, Plotkin calculated that 1 percent of all births bore the virus’s mark.
The 1964–65 epidemic also precipitated a social and political crisis that reshaped American law and the lives of women. Faced with the terrifying prospect of giving birth to severely damaged infants, women exposed to rubella began to demand control over their pregnancies and access to therapeutic abortions, spurring some of the first reforms of abortion law in the United States.
This clash between medical necessity and laws criminalizing abortion culminated in 1966 in the case of the “San Francisco Nine”—nine physicians charged by the California State Board of Medical Examiners for performing abortions on women exposed to rubella during pregnancy. The medical establishment rallied in their defense: the deans of 128 medical schools across the country publicly supported the accused doctors, and the state ultimately dropped the charges in 1970.
The epidemic was also a case study in how a public health crisis falls unequally along class lines. The ability to secure a safe, legal abortion when exposed to rubella was overwhelmingly determined by wealth and social position. Women of means, with access to private physicians, could generally obtain the procedure; for poor and working-class women, the same medical necessity was routinely denied.
Those with private physicians could navigate hospital “therapeutic abortion committees” to secure safe procedures under a psychiatric or medical guise. For working-class women relying on subsidized hospital outpatient clinics, that same choice was denied by the identical committees. Historical records from the period documented that for every publicly supported patient permitted to terminate a rubella-exposed pregnancy, nine privately paying patients received the procedure—the same disease, an entirely different outcome, determined solely by income.
The epidemic laid bare with brutal clarity what the absence of a vaccine meant in practice. Vaccines had already broken the back of polio, were driving smallpox toward eradication, and had begun eliminating measles and diphtheria from entire generations. The work underway at the Wistar Institute carried on that history, and the rubella epidemic had just made the stakes undeniable.
To be continued
