Milk Wars: Infant Formula And Breastfeeding

Introduction

Over the past few decades, infant feeding has been under the spotlight of public opinion. In the United States and other developed countries, debates have arisen over maternal leave policies, breastfeeding in public, and the risks of formula feeding. A similar infant feeding debate exists in developing countries, but there the discussion centers on unethical infant formula promotion, poor sanitation, and infectious diseases linked to feeding practices. Since 1990, the World Health Organization (WHO) has officially recommended exclusive breastfeeding for the first six months of an infant’s life. However, the advice given by health professionals and organizations has been variable and controversial since the early 1900s. Breastfeeding became a highly debated topic with the rise of infant formula, which shifted infant feeding from a private matter to a public one. A review of the history of infant feeding and the evolution of these disagreements reveals that the formula vs breastfeeding ‘milk wars’ were not determined by scientific theory and empirical studies alone. Politics, economics, and socio-cultural norms also fueled both sides of the debate. The contradictory messages that have circulated throughout history reflect changes in scientific knowledge nested within these socio-ecological forces.

Throughout history, European and American scientists have led the way with their hypotheses and theories about health. Until the late 1800s, the miasma theory remained the most popular explanation for why disease occurs. Edwin Chadwick, William Farr, and Florence Nightingale were among the proponents of this theory, which equated bad odors with disease. Given the empirical evidence available, scientists believed the theory adequately explained outbreaks of cholera, plague, and other infections. However, in the late 1800s, Louis Pasteur and others introduced the germ theory. Robert Koch defined his four postulates for establishing that a microbe causes a disease in the 1890s, and germ theory replaced miasma as the accepted scientific explanation of how disease spreads. The rise of germ theory also fueled the popularity of infant formula over animal milk or wet nurses as a breastmilk alternative, making infant feeding both safer and simpler for many mothers. This popularity soon spread from the developed world to developing countries as Westernized infant formula marketing began to gain power over women’s feeding choices. Opinions about breastfeeding and infant formula have clashed since the mass introduction of formula at the turn of the 20th century. The debate between the two feeding practices gradually gained steam throughout the 20th century and continues today. Tracking the evolution of these feeding practices illustrates the complexity of the milk wars and the forces behind them.

The Alternative Breast Milk Market

The need for breast milk alternatives is rooted in biological, social, and economic reasons. According to recent market research, the breast milk substitute market is predicted to reach USD 22.1 billion by 2025; on a global basis, the market is expected to exceed USD 119 billion by 2025. These astounding figures reflect declining breastfeeding rates, rising birth rates in some countries, and the growing number of working women. However, the demand for breast milk substitutes is not novel. Throughout history, milk alternatives have always had a place in society. For varying reasons, including lactation failure, illness, death, personal preference, workplace policies, social attitudes, and economic pressures, the demand for breastmilk alternatives has ebbed and flowed. During the 19th century, it was more common for mothers to die during childbirth, so alternative ways had to be available to feed a surviving infant whose birth mother was not there to breastfeed. Aside from maternal deaths, many obstacles to breastfeeding fuel the need for alternative milk supplies for infants. Chronic or infectious illness can result in lactation failure, in which not enough milk is produced. In addition, infections such as HIV-1 (Human Immunodeficiency Virus-1) and CMV (Cytomegalovirus) can be transmitted from mother to infant through breast milk and put the infant at risk of disease.

Furthermore, some women experience lactation failure for unknown reasons. At the turn of the 20th century, as the United States was rapidly urbanizing and industrializing, more women began working in factories. The Gilded Age, driven by money and power, began to change the position of women in society. Instead of being homemakers with time to feed their babies on demand, women were now in the workforce with limited feeding schedules. The introduction of infant-care manuals and detailed feeding schedules emphasized timed feeds. The attempt to schedule a natural, biological process, in turn, reduced women’s milk supplies. More and more mothers began to complain to their doctors about a lack of milk, and the medicalization of feeding came into play. At the time, one of the most popular theories held that lactation was a function the female body was losing through evolution. Another theory was that educating girls during their puberty years led the brain to compete with the reproductive system for energy; these doctors therefore believed that increased brain use was reducing milk output. With these ideas circulating in the medical community, there was widespread fear that lactation failure was becoming a growing and irreversible problem.

It is now known that lactation failure without disease or injury can be due to insufficient glandular tissue, which limits the amount of milk the breast can produce on demand. This type of lactation failure is part of the mother’s natural anatomy and is usually not diagnosed until the baby attempts to feed after birth. However, as seen at the turn of the 20th century, maternal behaviors like scheduling or restricting feeds can also induce lactation failure, because the baby’s sucking is what stimulates milk production. These issues persist today in restrictive work schedules that often give mothers neither adequate time nor privacy to pump or feed. Therefore, whether lactation failure is part of a woman’s biological make-up or arises from feeding behaviors, a reduced milk supply is rarely the mother’s choice. External forces play a role: biology, social attitudes towards pumping in the workplace, and economic pressures on women in low-paying jobs, such as waitressing, that offer no lunch breaks or paid time off.

Nevertheless, throughout history some women who were able to breastfeed have chosen not to. It has been documented that in the 19th and 20th centuries, predominantly wealthy and royal families paid someone else to breastfeed their infants. Sometimes these mothers were working and were not with their babies enough to feed them; other times, breastfeeding was considered a low-class practice, so wet nurses were hired. In more recent times, some women choose modern breast milk alternatives like infant formula or animal milk because of the convenience of having a bottle ready, the ability of the father or other family members to feed the infant, and the avoidance of the mess or pain of breastfeeding. For women who are able to breastfeed and face the choice of whether to do so, the influence of social norms, the marketing of substitutes, and cost all play a role in the decision-making process. The prevalence of breastfeeding versus alternatives follows patterns that can be understood when the trends are nested within their socio-ecological environments.

Infant Feeding

Since lactation failure and the choice not to breastfeed have been documented throughout history, the substitutes for breast milk have an evolution of their own. The first medical encyclopedia, the Papyrus Ebers, from Egypt in 1550 BC, advises rubbing the mother’s back with swordfish bones to help bring in a supply of milk. From around 2000 BC until the 20th century, wet nurses were the primary alternative feeding method. A wet nurse is defined as “a woman who breastfeeds another’s child,” and wet nursing was a well-organized profession in the United States. These jobs were often filled by poor African American women who sometimes could not support their own children. For this and other reasons, the practice raised many ethical issues and drew objections during the Middle Ages and the Renaissance. Nevertheless, it remained popular with wealthy families who adopted orphans as cheap labor or who did not want to, or could not, breastfeed their own infants.

Aside from ethical issues, early medical reports describing wet nursing show distrust of the practice. These writings also reflect the scientific viewpoints of the time and the lack of knowledge about how disease spreads. In the early 14th century, medical professionals advised wet nurses to move around to keep their milk flowing. The wet nurse was also recommended to be a healthy 25-35 year old who had recently delivered a male child. Such advisories highlight both the era’s views of male superiority and its theories about the mechanism of milk production. During the Middle Ages, society came to regard wet nursing as a practice that should be used only in desperate circumstances of lactation failure. For example, in 1577, Omnibonus Ferrarious, an Italian scientist interested in feeding practices, emphasized that the infant’s mother was a better choice than a wet nurse where possible. Other scientists, like the Frenchman Jacques Guillemeau, continued arguing against wet nursing into the 17th century. In his work, The Nursing of Children, Guillemeau explained that breast milk could transmit imperfections from the wet nurse to the child and subsequently to the infant’s parents. One specific claim was that wet nurses should not have red hair, because the hot temperament commonly attributed to redheads was thought to taint the breast milk. These ideas that qualities and imperfections could pass from person to person through milk added to the early push for mothers to breastfeed their own children.

Despite this backlash, many wealthy families hired wet nurses out of cultural, societal, or personal preference or biological necessity. Even when the medical recommendations of the time differed, external social, biological, and cultural forces shaped a woman's decision. Many women claimed that breastfeeding was unfashionable because it would ruin their figures, prevent them from wearing the clothes they wanted, and interfere with social activities. Furthermore, some parents hired a wet nurse because it was cheaper than hiring a housemaid or a business hand; that way, the mother could attend to the housework or the administrative duties of her husband’s business without being disrupted by breastfeeding. In turn, governments began to respond to the regular practice of wet nursing with laws intended to keep poor women from neglecting their own infants in order to earn money nursing another woman’s infant.

Throughout the 18th century, the scientific arguments against the use of wet nurses were finally heard by the wealthy families that employed them. However, just as more wealthy mothers began breastfeeding their own infants, the wet nurse business shifted to low-income families. The industrial revolution increased the laboring duties of mothers and left them little choice but to rely on alternative feeding practices for their young infants. Since poorer families could not afford a safe wet nurse, infant mortality rates were heavily influenced by untrustworthy and underqualified wet nurses, who often engaged in dangerous practices like using opiates to help infants sleep. The rise in unsafe practices prompted more scientific writing aimed at raising awareness of the benefits of breastfeeding. The works of William Buchan, author of Domestic Medicine, and others spread distrust of wet nurses and helped push mothers of all socioeconomic statuses away from the practice by the end of the 18th century.

In line with the rise of the germ theory of disease, the development of a hygienic feeding bottle also played a significant role in the decline of wet nursing. The bottle offered another widespread alternative that was deemed safe and easy to use. Before the 19th century, feeding vessels of all kinds had been used to give infants animal milk or bread soaked in water. However, these vessels were often hard to clean, and the build-up of germs led to high rates of infant morbidity and mortality. During the mid-19th century, the modern feeding bottle started to evolve. In 1896, a refined open-ended bottle was developed in England and became very popular. Leather and rubber teats added to the bottle created a feeding bottle similar to the ones we see today. This invention turned scientific attention to the nutritional value of non-human milks that could easily be put in bottles in place of breast milk. The shift to artificial feeding provided a gateway to infant formula as a medicalized, marketable, and modern alternative to breast milk.
