Cocaine Toothache Drops and Cigarettes on Prescription: American Medicine's Jaw-Dropping 100-Year Transformation
Picture this: you walk into a pharmacy in 1900 with a toothache. The pharmacist hands you a small bottle labeled Lloyd's Cocaine Toothache Drops — Instantaneous Cure! You give it to your seven-year-old. Nobody bats an eye.
Or imagine your doctor, sometime around 1950, finishing your checkup and casually recommending a cigarette brand to help with your nerves. Printed ads in medical journals backed him up. Camel ran a campaign for years with the tagline: More doctors smoke Camels than any other cigarette.
This wasn't fringe quackery. This was mainstream American medicine — practiced by licensed professionals, endorsed by institutions, and trusted by millions of ordinary people. The distance between that world and the one we live in now is almost impossible to overstate.
When the Pharmacy Was the Wild West
Before the Pure Food and Drug Act of 1906, the United States had virtually no federal oversight of what could be sold as medicine. Patent medicines — bottled remedies with catchy names and zero accountability — flooded the market. Many contained alcohol, opium, morphine, or cocaine in quantities that would horrify any modern pharmacist.
Mrs. Winslow's Soothing Syrup, marketed to help teething babies sleep through the night, contained morphine. Bayer — yes, the aspirin company — launched heroin as a commercial product in 1898, marketing it as a "non-addictive" substitute for morphine and recommending it for coughs, bronchitis, and even as a children's remedy. It was available at pharmacies without a prescription.
The thinking wasn't entirely reckless by the standards of the time. Germ theory was still relatively new. The mechanisms of addiction weren't understood. Clinical trials as we know them didn't exist. Doctors were doing their best with genuinely limited tools — but those tools included substances we now classify as controlled narcotics.
Cigarettes, Radiation, and the Art of the Confident Guess
The early 20th century produced some medical recommendations that are almost comical in hindsight — except that real people followed them and real harm resulted.
In the 1930s and 1940s, some physicians recommended smoking as a treatment for anxiety and even certain respiratory conditions. The logic was that cigarette smoke could "soothe" irritated airways. Tobacco companies aggressively courted the medical establishment, and many doctors were happy to lend their credibility to ad campaigns. The link between smoking and lung cancer wasn't firmly established in the American medical mainstream until landmark epidemiological studies appeared in the early 1950s, and the Surgeon General's report confirming it didn't arrive until 1964 — and even then, the tobacco industry spent decades funding research designed to muddy the waters.
Radium, discovered by Marie and Pierre Curie in 1898, was briefly embraced as a health tonic. Radium-laced water was sold in the early 1900s as an energy supplement. A product called Radithor — essentially distilled water spiked with radium — was marketed as a cure for everything from arthritis to impotence. Wealthy industrialist Eben Byers drank nearly 1,400 bottles before his jaw literally disintegrated. He died in 1932. The Wall Street Journal ran his obituary under the headline: The Radium Water Worked Fine Until His Jaw Came Off.
Surgery was another frontier of confident improvisation. Anesthesia was unreliable, infection was rampant, and the concept of sterile technique wasn't universally adopted until well into the 20th century. Before antiseptics became standard practice, a hospital was often a more dangerous place than staying home.
The Slow, Painful Build Toward Evidence-Based Medicine
The transformation didn't happen overnight — it happened through decades of hard science, regulatory reform, and sometimes tragic trial and error.
The 1937 sulfanilamide disaster, in which a toxic solvent (diethylene glycol) used in a liquid formulation of the drug killed over 100 people including children, directly led to the Federal Food, Drug, and Cosmetic Act of 1938, which required safety testing before a drug could be marketed. The thalidomide crisis of the early 1960s — which caused severe birth defects in thousands of children in Europe and was narrowly avoided in the US thanks to a cautious FDA reviewer named Frances Kelsey — led to the Kefauver-Harris Amendment of 1962, which required manufacturers to prove that a drug was effective as well as safe.
The randomized controlled trial became the gold standard for medical research. The connection between lifestyle factors and chronic disease was mapped with increasing precision. Vaccines transformed the landscape of infectious disease. Surgical techniques became refined, sterile, and survivable in ways that would have seemed like science fiction to a Civil War-era surgeon.
Where We Are Now
Today, an American patient has access to a healthcare system that — for all its very real and well-documented problems with cost and access — is built on a foundation of evidence that simply didn't exist 100 years ago.
MRI machines can image soft tissue in real time. Blood tests can flag cancer markers before symptoms appear. Medications go through years of clinical trials before a single prescription is written. Doctors are required to disclose conflicts of interest. The pharmaceutical industry is heavily regulated in ways that would be unrecognizable to the patent medicine salesmen of 1900.
None of this means modern medicine is perfect. It isn't. Mistakes still happen. Gaps in knowledge still exist. Bias and inequality still shape who gets good care and who doesn't. But the baseline — the floor of what we consider acceptable medical practice — has been raised so dramatically that comparing it to the era of heroin cough syrup and radium water feels like comparing a smartphone to a telegraph.
The doctors of 1900 weren't stupid or malicious. They were working with what they had. What we have now is just incomparably better — and it's worth remembering that the next time a medical advancement feels unremarkable.
Somewhere, a patient in 1920 would have given anything for it.