We have a tendency to romanticize the past. Think about the food your great-grandparents (or even their parents) ate in childhood and you might imagine farm-fresh produce, pure milled grains, and pristine meat and dairy. But if they were living in the United States during the mid-to-late 19th century, that vision of food utopia wasn't likely reality.
Before 1906, there were no federal food safety regulations in the US. Local grocers were a wild west of unlabeled additives, untested chemicals, and inedible fillers. In the gap between the industrialization of the food system during the mid-1800s and those first laws dictating what could be sold as food, working-class Americans spent decades eating "mostly crap," says Deborah Blum, a Pulitzer Prize-winning science journalist. In her 2018 book, The Poison Squad, Blum details the origin story of the landmark Pure Food and Drug Act.
As more folks left farm life behind and came to rely on manufactured food, "an enormous amount of food fraud" emerged, Blum tells Popular Science. Nowadays, the overwhelming majority of people continue to purchase their food from grocery aisles, but the food we buy there is much less liable to make us sick. So, how did we get from that past to the present? And, with regulatory agencies including the FDA facing enormous cuts, what might the future hold?
Ground Shells, Brick Dust, and Bones
European countries, including Britain, Germany, and France, passed food safety regulations about 50 years before the US did. In classic American style, we eschewed top-down restrictions and gave the free market free rein. In lieu of federal regulation, there was a haphazard patchwork of state and local laws surrounding certain foods pre-1906. Massachusetts, for instance, passed "An Act Against Selling Unwholesome Provisions" in 1785. But unsafe practices consistently fell through the cracks and into consumers' stomachs, says Blum. In some cases, food wasn't food at all.
Pre-pasteurization, milk spoilage and bacterial growth were major problems. Away from the farm, dairy had to travel farther and keep for longer if people in cities were going to buy it. So, the dairy section became a hotbed of questionable additives. Borax, which you may recognize as a general-purpose pesticide, was used as a milk and butter preservative. Formaldehyde (AKA embalming fluid) was also a common milk additive and antibacterial agent. In addition to preserving the milk, formaldehyde also reportedly had a slightly sweet flavor, which helped mask the taste of rot, Blum explains.

In cheese, lead compounds were added to boost its golden color. Plaster of Paris, gypsum, and other white, powdery fillers made their way into milk and flour for color and texture. Flour was often portioned out by the grocer in store, with mixed results. "If you went to a very honest grocer, you might get real flour. If you didn't, you might get a mix," she says.
Coffee and spices were particularly terrible offenders. Ground coffee was often about 80 to 90 percent adulterated in the mid-19th century, says Blum. It might be made up of ground bone, blackened with lead, or charred seeds and plant matter. Spices were frequently 100 percent adulterated; in other words, entirely made up of something other than what they were sold as. Cinnamon was frequently brick dust. Ground pepper could have been ground shells or charred rope. "Probably a good half of these products had some adulteration, depending on what you were looking at and how much you were willing to pay," she says. The wealthy were generally able to afford higher-quality, authentic, uncontaminated products.
But for everyone else, the problem of food fraud was so prevalent that people developed a suspicion of ground coffee, Blum says. Consumers started opting for whole beans instead, wherever possible. Suddenly, there was a market for counterfeit whole coffee beans, made of pigeon beans and peas, or even wax and clay. "You can find flyers that went to grocers that said, 'you can multiply your profits with our super-cheap dirt beans.'"
In her research, Blum found a record of a congressional hearing where a food manufacturer described producing and selling a "strawberry jam" that was entirely red dye, corn syrup, and grass seed. His defense for the practice: "we have to be competitive in the market and other people are doing it too," paraphrases Blum.
As all of this was going on, people had little idea what they were consuming. "There was no labeling," she notes. There were, however, many cases of people falling ill. In an Indiana orphanage, multiple children died from formaldehyde poisoning. In New York state, an estimated 8,000 infants died from adulterated "swill milk" in a single year.
Pushing for Purity
Calls for change came from multiple fronts, including women's groups and the growing "pure food" movement of the late 1800s, says Blum. But one chemist and physician, Harvey Washington Wiley, proved particularly dedicated and ultimately influential.
Wiley began noting and publishing reports on food contaminants during his work at the USDA in the 1880s and '90s. His primary job was to develop alternatives to sugar cane, but he started studying and cataloging adulteration in butter, milk, and honey, and later in spices and alcoholic beverages. That's where much of our data on food adulteration at the time comes from, notes Blum. Soon, Wiley was releasing regular bulletins on food adulterants and advocating for national laws.
Many of his early attempts ended in failure. Congressional representatives received a lot of money from the food industry, and weren't receptive to Wiley's science-backed pleas for labels, transparency, and contamination regulations, says Blum. "He keeps pushing for it. The industry keeps shooting it down, and the political dogfight continues," she says.

But then, Wiley shifted tactics. He began conducting a series of experiments that he called the "hygienic table trials" with a group of USDA employees, later dubbed "the poison squad." All of the dozen or so participants willingly and knowingly signed up to receive three freshly prepared meals, seven days a week, for six months from the newly created USDA test kitchen. Yet, along with their nourishing meals, a subset of the participants was also fed additives commonly found in adulterated food. "You could never have gotten this sort of study approved today," says Blum. "He poisoned his co-workers."
The group worked their way through borax, boric acid, salicylic acid, benzoic acid, sulfur dioxide, formaldehyde, copper sulfate, and saltpeter, among other things. Unsurprisingly, the squad was frequently sick, and the experiment garnered a ton of publicity. "If you go to newspapers of the time, every single one had a story: 'Americans are eating poison,'" Blum says.
The fervor, paired with the public outcry in response to Upton Sinclair's book The Jungle, about Chicago's meatpacking plants, led politicians to change their tune. In 1906, Congress passed both the Meat Inspection Act and the Pure Food and Drug Act (colloquially known as "Wiley's Law"). Later, the Pure Food and Drug Act would be replaced by the Food, Drug, and Cosmetic Act of 1938, which has been extensively revised and updated since. From these laws emerged the modern USDA, responsible for regulating meat and poultry products, and the FDA, responsible for all other foods and pharmaceuticals.
The Future of Food
Since the start of federal food regulation, states have beefed up their policies and the food industry has adopted its own standards. Many companies have even signed on to efforts like the Global Food Safety Initiative, which involves third-party testing beyond what's legally required. Plus, the mere existence of federal law means that people can sue when things go wrong. Litigation is a big driver of compliance and caution at the corporate level, says Blum.
Yet the FDA still plays a key role in oversight, research, and responding to emerging threats like bird flu in milk, says Brian Schaneberg, a chemist and director of the Institute for Food Safety and Health (IFSH) at Illinois Tech. At IFSH, academic researchers collaborate directly with industry and FDA scientists, and the institute hosts multiple federal projects and labs. Research there includes improving infant formula safety, preventing food contamination from packaging, blocking pathogens in food manufacturing and produce, investigating the causes of illness outbreaks, and validating Grade A milk. "We really touch a lot of areas," Schaneberg tells Popular Science.
Recently, the Trump Administration slashed more than 3,500 FDA jobs amid broader slapdash federal cuts. Despite claims to the contrary, these layoffs included dozens of scientists who conduct quality control and proficiency testing on everything from infant formula to dairy products and pet foods. The cuts have temporarily left the Center for Processing Innovation at IFSH almost entirely unstaffed, Schaneberg notes: from 15 staff, they're down to four. Other labs across the country were also impacted.
After public pushback, FDA leadership promised last week to reinstate scientists in key roles and reopen a handful of the shuttered labs. At Schaneberg's institute, federal scientists have been told they'll be reinstated, though he notes they haven't received formal notices confirming their rehiring. The long-term fate of FDA research and testing labs also remains uncertain as proposed major budget cuts and a massive reorganization loom. The currently proposed Trump Administration plan would shift most food testing to the states.
"I'm definitely concerned," says Schaneberg. He doesn't see any clear, immediate threat to consumers, but in the long term, he is worried about the FDA's ability to ensure food safety if the agency is equipped with fewer staff and resources. "I still think all the big companies are going to do the best thing they can because they don't want to hurt their brands and they don't want to impact people." And many states might have the ability to fill gaps. Yet there are always bad actors, new brands, new additives, and unknowns, he notes.
It may be much rarer than it once was, but the FDA still detects unsettling instances of food contamination. In 2023 and 2024, the agency investigated high lead and chromium levels in cinnamon applesauce pouches marketed to children, notes Martin Bucknavage, a senior food safety extension specialist in the Department of Food Science at Penn State University. "There's those types of things that pop up, and it's like, who else is going to go through and do that?" he says. The FDA has expertise in the science and the supply chains that few other institutions have, Bucknavage says, along with the ability and authority to respond quickly.
With rapid changes and reorganization on the horizon, it's hard to predict what the effect will be, he adds. "I think immediate-term, our food supply is going to be safe," Bucknavage says. After all, FDA inspections are far less frequent than companies' own safety tests and measures. But without the final layer of oversight, it's possible something could be lost down the line, he says.
Blum, with all her knowledge of the treacherous food landscape of decades past, agrees. "I'm not sitting here saying catastrophe, because we don't actually know," she says. "But there's nothing in what the [Trump Administration] is doing that you would look at and say, 'oh this makes us safer.'"