PUBLISHED BLOG ARTICLES:

Exam Room Sterilization: A Troubling History

If you had been wheeled into a surgeon’s operating theater before 1885, your odds of surviving the procedure were roughly a coin flip.

Even relatively minor injuries like compound fractures could lead to severe sepsis complications. Surgeons believed that infection spread through a mechanism known as miasma--or “bad air”--and the only precaution they took against it was to keep operating areas well-ventilated, presumably so all that “bad air” could escape. It didn’t help that many surgeons would readily don garments splashed with pus and dried blood from previous procedures, and that neither hands nor surgical instruments were washed before or after an operation. Perhaps most egregious of all, in some rural areas it was common practice to apply a poultice of warm cow dung directly to the wound site to promote healing.

exam room sterilization.png

At this time, Victorian-era doctors were only just grappling with the concept of germ theory, with many of them entirely dismissive of the idea that invisible microbes could be floating around in the air. Before there was a strong antiseptic foothold in the medical community, surgeons felt that germ theory was about as scientific as spiritualism and its claims of ghostly encounters and wisdom beyond the grave delivered via possessed planchette. 

There is, however, a long pre-germ theory history of people trying to temper the spread and scope of infection. Ancient Egyptians used pitch or tar, resins, and aromatic herbs to ward off the spread of disease during body embalming. In 500 BC, Persians were aware that silver drinking vessels could preserve the purity of the water stored within. Even the Romans knew to sterilize medical implements by heating them over a fire or boiling them in a large cauldron of water before use. This knowledge, however, was lost, along with many other things, in the Middle Ages. As the bubonic plague spread across the European continent, stupefied healers desperately burned sulfur and wood in an attempt to chase away the devouring sickness, with little success.

It wasn’t until 1865 that Joseph Lister, a British surgeon, encountered a document on the existence of microbes, a research piece published as Recherches sur la putrefaction by French chemist Louis Pasteur. Against this new backdrop of germ theory, Lister hypothesized that the same microscopic life might also be responsible for the rapid spread of infection after surgery. To test his theory, he used carbolic acid--now more typically known as phenol--to sterilize medical implements, garments, the patient’s wound site, and his own hands before operating. The results were immediate. In his practice alone, mortality from post-surgical infection dropped by nearly 40 percent.

In modern medicine, there are plenty of options for physicians when it comes to keeping exam and operating rooms as sterile as possible. We are lucky to have disposable garments, high-powered autoclaves and dry heat sterilizers, as well as germicidal cleaners to keep surfaces clean and hostile to microbial life. Without the pioneering efforts of undaunted germ theorists like Lister and Pasteur, we would not enjoy such high standards of sterilization in physician offices today. 


The Science of Paralysis and the Central Nervous System

Nearly 5.6 million people in the US alone live with some form of paralysis day to day, which works out to nearly 1 in every 50 people. Despite how common the condition is, the mechanisms underlying paralysis are varied, complex, and, as with most symptoms originating in the brain, not especially well understood.
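
As a rough sanity check on that statistic (and assuming, purely for illustration, a US population of around 300 million at the time the estimate was made), the arithmetic works out like this:

5,600,000 / 300,000,000 ≈ 0.019, or about 1 in 53

That is a little under 2 percent of the population, which rounds to the “nearly 1 in every 50 people” figure above.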

The most common conditions that can lead to long-term paralysis include spinal cord injuries, a broken neck, nerve-damaging diseases, and autoimmune diseases. The initial causes of these conditions are numerous: everything from a stroke to a car crash to a workplace injury to progressive autoimmune disease can result in living with paralysis. Paralysis can be localized--occurring only in a specific region like a limb or the face--or generalized in form. Paralysis is generally considered a chronic, life-long condition, but there are select circumstances in which it can be temporary and symptoms can improve over time.

But how does something like paralysis occur? To answer that, it’s important to understand the central nervous system.

neuron paralysis.png

Essentially, the central nervous system is made up of the brain and the spinal cord, and operates by way of nerve cells, or neurons, that form a web of interconnected signal relays throughout the body. To understand how monumental the central nervous system is to the body, consider this statistic: the brain uses about 20 percent of the total oxygen breathed in, all while accounting for only 2 percent of the body’s total mass. Consisting of roughly 86 billion densely interconnected neurons, the brain is rightly viewed as one of the most complex organs in the human body, if not the most complex.

The central nervous system is responsible for controlling all manner of physical responses in the body, from conscious movement to internal processes such as breathing, heart rate, hormone release, and body temperature. In order for this electrical relay to function properly, neurons must be supported by healthy glial cells. Glial cells, or neuroglia, assist nerve cells in a variety of ways: anchoring neurons to their blood supply, building the myelin sheaths that insulate electrical signals, providing a scaffold on which neurons can grow, and lining the brain’s ventricles to produce cerebrospinal fluid. Paralysis is most likely to occur when there is significant damage to some portion of the central nervous system. A damaged neuron can no longer transmit signals up or down the chain between the brain and the limbs, so motor control is effectively dampened or silenced entirely.

There are, however, a multitude of options for paralysis treatment and support.

Mobility equipment such as wheelchairs, limb braces, and external supports can help people living with paralysis remain autonomous and active. Physiotherapy with a trained physical therapist can aid in maintaining limb strength and muscle mass. Occupational therapy provides a way for people living with paralysis to adapt to everyday life at home and at work through tasks that might otherwise prove challenging. And of course there is a whole range of medications designed to relieve the pain, stiffness, and muscle spasms associated with paralysis.

As the field continues to develop, neurologists and physicians alike are sure to discover further therapeutic options that augment and improve everyday life for people living with paralysis.


History of Medicinal Honey

Virtually every civilization has utilized honey for its medicinal properties, going back as far as 8,000 BCE. The Assyrians and Chinese used it as a topical treatment for wounds, and nearly all Egyptian medicines contained honey mixed with wine and milk. The ancient Greeks mixed a tincture of vinegar and honey to treat pain at wound sites, a particularly common remedy for soldiers’ war injuries. The Qur’an contains illustrious passages detailing the significance of honey as a remedy for ailments. While it might not be a cure-all, there is good reason for honey’s reputation for ambrosia-like properties throughout medical history.

Honey’s antimicrobial properties were first verified in 1892, and since then it has gone on to perform well in a variety of clinical trials. An inhibitory effect has been shown against more than 60 species of bacteria, including aerobes and anaerobes, and both gram-positive and gram-negative organisms. This is an especially useful trait given the growing microbial resistance to antibiotics.

medicinal honey illustration.png

So how does honey work as a natural bactericidal agent? 

When applied topically, honey draws moisture out of the wound area, dehydrating any growing bacterial strains. Honey’s sugar content is also high enough to prove toxic to microbes and hinder colony growth, and its low pH discourages bacteria that have trouble growing in an acidic environment. As a triple threat, honey also contains an enzyme called glucose oxidase, which gradually breaks down its sugars to generate hydrogen peroxide, a third antibacterial agent.
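
For readers who want the chemistry spelled out, the net glucose oxidase reaction can be written in simplified form (in reality the enzyme first produces a lactone, which then hydrolyzes into gluconic acid):

glucose (C6H12O6) + O2 + H2O --glucose oxidase--> gluconic acid (C6H12O7) + hydrogen peroxide (H2O2)

It is this slow, steady trickle of hydrogen peroxide, rather than a single concentrated dose, that is thought to let honey suppress bacterial growth without irritating the surrounding tissue.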

A variety of clinical evidence suggests that a topical application of honey forms a protective, non-adhesive barrier dressing and can stimulate healing while reducing the potential for scarring. Further study is needed, but honey also appears to have a positive effect on several other conditions, including gastrointestinal tract disease and fungal infection, and it may be useful as an anti-inflammatory. Another boon is honey’s relative safety: allergy potential is low in the general population, the main exception being infants under one year of age, who are vulnerable to infant botulism and should not be given honey.

Although science has shed light on the fallacies of ancient practices like bloodletting and the theory of the four humors, honey has remained a mainstay of wound care since ancient times. As more research is done, honey may reveal even more medicinal benefits than science is already aware of, pushing it closer and closer to that glittering ideal medicine described in the passages of the Qur’an.


The History of X-Rays

Perhaps the most useful advancement in medical technology, x-rays have been a fixture of the medical and dental communities for a surprisingly short time.

In 1895, Wilhelm Conrad Roentgen, a German physicist and mechanical engineer, was at work in his laboratory, toying with a cathode-ray tube. Cathode-ray tubes were popular experimental tools in electromagnetic research at the time, so Roentgen was perfectly in his element as he evacuated the air from the tube and experimented with applying high voltage to its contents. To Roentgen’s surprise, when he covered the tube with dense black paper to block out its light, some objects nearby still lit up with a ghostly phosphorescence. After further study--most famously in producing the world’s first x-ray radiograph, an image of the glowing bones of his wife’s hand--Roentgen determined that he had stumbled upon an invisible ray, a kind of light outside the sliver of the electromagnetic spectrum that humans can see with the naked eye.

history of x-rays.png

It didn’t take long for the medical community to latch onto x-rays as an invaluable diagnostic tool. Because they provided a complete view of bone without having to sever or puncture the skin, they gave physicians an almost all-seeing view of a patient’s body. Within a month of their initial discovery, surgeons had already incorporated x-rays into their surgical routines. Having an x-ray done even became something of a fad for the general public for a while, with everyday consumers using coin-operated machines at penny arcades to take radiographs of their hands and feet as quick, touristy mementos.

There were, of course, downsides to letting consumers cheerfully dish out doses of radiation at their own discretion, though the risks associated with x-rays weren’t fully appreciated until the mid-20th century. During the atomic era, when governments around the world were myopically focused on building and refining weaponized uranium, the effects of excess radiation on human health came to the scientific forefront. Ionizing radiation damages cells, shreds DNA, and pushes cellular function outside its normal parameters. Unchecked, x-rays are capable of doing the same.

Presently, x-ray radiographs still stand out as a hallmark of medical innovation. Physicians and dental professionals alike use them routinely to assess bone health, spot cavities and hairline fractures, and provide general, non-invasive visual support during an exam. With the advent of personal computers, physicians have been able to digitize x-ray radiographs and store high-resolution digital files with ease. Linear accelerators allow physicians to generate penetrating, high-energy x-rays, most notably for precisely targeted radiation therapy. And the technology itself has improved since Roentgen’s crude cathode-ray tube experiment, too. Man-made isotopes are far more powerful than those that occur naturally, offering technicians a wide array of energy levels and half-lives to work with during imaging.

As a miracle of modern medicine, x-rays have come a long way. From a dim experimental curiosity in a late 19th-century German laboratory to a routine full-mouth radiographic panel at dental offices worldwide, x-rays are and will continue to be an invaluable diagnostic tool.


Healthcare-Acquired Infections: How to Keep Facilities Safe

There is a silent killer in healthcare facilities that has gone largely unaddressed. One in ten patients in acute hospital care environments develops some kind of healthcare-acquired infection during their stay. The 6th leading cause of death in the United States, healthcare-acquired infections cost the health industry upwards of $88 billion annually and contribute to mortality rates of up to 6 percent in hospitals countrywide. The greatest microbiological threats attributed to these infections are methicillin-resistant Staphylococcus aureus, S. epidermidis, and multi-drug-resistant gram-negative aerobes.

bacteria illustration.png

Most healthcare-acquired infections are associated with invasive medical devices and complications from surgical procedures. Catheter use, which is extensive in the health industry and especially in ICUs and long-term care facilities, contributes to a significant portion of healthcare-acquired infections. Gram-negative bacteria in particular are associated with both ventilator contamination and complications from unsanitary catheter care: pneumonia typically arises from compromised ventilators, whereas urinary tract and bloodstream infections commonly stem from a lack of catheter sterilization. These infections arise during or shortly after a medical stay, precisely when the patient’s immune system is at its most vulnerable. For elderly, infirm, and very young hospital patients, these compromises in sterility can be life-threatening.

Fortunately, there are steps to take to keep hospitals, intensive care units, and long-term healthcare facilities free of these infectious agents.

The hand hygiene of hospital staff goes a long way toward preventing microbial spread. Sterile gloves, aseptic handling techniques, antimicrobial flushes, and 2 percent chlorhexidine skin preparation together provide a strong defense against infection. With drug effectiveness waning due to growing microbial resistance, and with the high capital costs required to develop new drugs, cutting infection off at the pass with safe handling techniques is the best path toward reducing hospital-acquired infections.


The Ancient and Surprising History of Surgery

While it may seem intuitive to describe surgery as a ‘modern’ medical practice, its origins can in fact be traced back thousands of years, deep into the Pre-Classical and even Neolithic eras.

The earliest archaeological evidence for surgical practice dates to an astounding 12,000 BCE. To put a time frame like that in perspective, writing wasn’t developed until around 3,200 BCE in Mesopotamia and Egypt. Early people were practicing rudimentary surgery millennia before recorded history!

Prehistoric surgery centered on a practice called trepanation, which involves scooping or drilling a hole into the patient’s skull. This ancient practice was used as a panacea for everything from relieving migraines or intracranial swelling to cleaning out fractured skull fragments after a war injury, and even as a means of managing misunderstood mental health conditions that were viewed as a malignant spirit trapped within the patient’s cranial cavity.

history of surgery.png

In 500 BCE, the first examples of plastic surgery sprang up in India. Having one’s nose cut off was a common punishment for a past crime, so reformed felons would have their noses reconstructed via early rhinoplasty to avoid the social stigma. Surgery in Ancient Greece included a variety of makeshift surgical work, including the setting of broken bones, bloodletting, the draining of fluid from the lungs of patients suffering from pneumonia, and the severing of gangrenous limbs. The Mayans were at the forefront of global surgical practice at this time and performed routine dentistry, filling cavities with flecks of jade, turquoise, quartz, or hematite. The Incas had master surgeons specializing in head injuries and cranial surgery, and records show they had substantially better success rates than surgeons during the American Civil War, nearly four centuries later.

Another major leap forward came around 1000 AD with the highly esteemed ‘father of surgery’, Al-Zahrawi, and his all-encompassing surgical text, the Kitab al-Tasrif. It was a cutting-edge compendium of every known practice and procedure, including orthopedics, military surgery, and ear, nose, and throat surgery. A combination of Al-Zahrawi’s collected knowledge and local folk remedies remained the go-to surgical manual for nearly eight hundred years. Until the 18th century there was little formal surgical training in Europe, so most surgeons learned their trade through apprenticeship, much as one might learn blacksmithing or another artisan skill. In fact, because surgeons were seen as ‘lesser’ physicians during this period, many women trained and practiced as surgeons. It wasn’t until the 1700s, with the development of medical colleges and academic institutions, that women were excluded from practicing.

As far as patient care was concerned, it really wasn’t until modernity that a surgical procedure became anything short of horrific to experience. With little regard for infection control, pain management, bodily fluid contamination, or proper wound maintenance, being a patient before modern anesthesia was akin to enduring torture. Many physicians believed it was important to keep patients alert and awake during surgery, and would periodically rouse a patient who appeared to be in danger of losing consciousness. Opium and alcohol were used as analgesics only sparingly, and never in quantities large enough to diminish patient consciousness. Japanese surgeons were the first to administer true general anesthesia, in the early 1800s; in the decades that followed, ether, chloroform, and locally administered cocaine became more commonplace anesthetics in operating rooms around the world.

Far from being a modern invention, surgery has grown and developed into a vastly safer and more pleasant patient experience over the millennia, all thanks to talented medical pioneers and advocates around the world.