Fiction Writing Samples — http://medium.com/@ren.t.graham


Opioid Crisis in the US

It’s no secret that the United States is in the midst of an opioid epidemic. Even though it took until 2017 for the US Department of Health and Human Services to officially declare the opioid crisis a public health emergency, overdose rates have been escalating since the late 1990s.


Patients want to trust their physicians when it comes to prescribed medication, but for decades, doctors were pressured by big pharmaceutical companies to over-prescribe opioid painkillers. Pharmaceutical companies, prioritizing profits over patient wellness, assured prescribers that opioid painkillers were non-addictive and safe to use. With no other affordable pain relief options at their disposal, doctors came to rely on prescribing economical opioid painkillers for patients with chronic pain or acute muscle soreness, and for those recovering after surgery.

When it comes to chronic diseases like cancer, even stronger opioids are needed to help patients manage pain. The synthetic opioid fentanyl was created for this express purpose. Fentanyl is 50 to 100 times more potent than morphine, which makes it incredibly dangerous if abused. Because it produces a more powerful psychoactive effect and requires far less product for the same high, fentanyl is much cheaper to manufacture and distribute than traditional opioids. While that makes fentanyl an economic boon for illegal drug manufacturers looking to ship more product, it has saturated the market with heroin, cocaine, and other drugs cut with large amounts of fentanyl, often unbeknownst to the user. Fentanyl overdoses are uniquely difficult to treat because of the drug’s strength; oftentimes multiple doses of the rescue medication naloxone are needed to reverse them.

So who is most at risk for opiate overdose?

Nearly 80 percent of opioid overdoses in the United States occur among white, low-income Americans, and they are especially prevalent in the Appalachian and Rust Belt regions of the country. There are a variety of socioeconomic factors at play here, but the lack of available jobs has certainly been a major contributor to the widespread depression of these regions. Low-income Americans who cannot afford the tuition costs associated with a college degree are less likely to find stable work and thus more likely to turn to illicit drugs as a kind of emotional salve. Military veterans are another demographic in which opioid abuse is rampant. Without access to necessary therapeutic treatment for chronic pain or PTSD developed as a result of military service, veterans account for a disproportionately high number of opioid-related deaths.

To really drive home the scale of the epidemic, consider that more than 130 people in the United States die every day from opioid overdoses, and that in 2017 alone, there were roughly 47,000 opioid-related deaths. The economic burden of the opioid crisis exceeds $78 billion a year, including the tallied costs of healthcare, lost productivity, addiction treatment, and criminal justice involvement. With 1 in 3 Medicare beneficiaries receiving an opioid prescription in 2016, and roughly 30 percent of patients misusing their opioid prescriptions, overdose fatalities are only projected to rise.

While the news certainly seems dire, there are a variety of things that can be done to combat the epidemic.

First and foremost, it’s important to ensure that proper justice is dealt to pharmaceutical companies that were well aware of the dangers of their painkillers and continued to aggressively market them to medical professionals anyway. Next, it’s essential to build infrastructure that provides long-term support for those suffering from opioid addiction. Safe injection clinics destigmatize addiction while providing around-the-clock nurse supervision and naloxone treatment in the event of an overdose, and they can offer sufferers informed options for treating their addiction moving forward. Lastly, it’s critical to invest in new pain management research to find methods that don’t rely on addictive medication. With these three key avenues funded and in place, physicians, politicians, and healthcare advocates will be able to create a safer environment for vulnerable groups of people suffering not just from the effects of opioids, but from addictions of all kinds.


Exam Room Sterilization: A Troubling History

If you had been wheeled into a surgeon’s operating theater before 1885, your odds of surviving the procedure were little better than a coin flip.

Even relatively minor injuries like compound fractures could lead to severe septic complications. Surgeons believed that infection spread through a mechanism known as miasma--or “bad air”--and the only precaution they took against it was to keep operating areas well-ventilated, presumably so all that “bad air” could escape. It didn’t help that many surgeons would readily don garments splashed with pus and dried blood from previous procedures, and that neither hands nor surgical instruments were washed before or after an operation. Perhaps most egregious of all, in some rural areas it was common practice to apply a poultice of warm cow dung directly to the wound site to promote healing.


At this time, Victorian-era doctors were only just grappling with the concept of germ theory, and many were entirely dismissive of the idea that invisible microbes could be floating around in the air. Before antisepsis gained a foothold in the medical community, surgeons felt that germ theory was about as scientific as spiritualism, with its claims of ghostly encounters and wisdom from beyond the grave delivered via possessed planchette.

There is, however, a long pre-germ-theory history of people trying to temper the spread and scope of infection. Ancient Egyptians used pitch or tar, resins, and aromatic herbs to ward off the spread of disease during body embalming. In 500 BC, Persians were aware that silver drinking vessels could preserve the purity of the water stored within. Even the Romans knew to sterilize medical implements by heating them over a fire or boiling them in a large cauldron of water before use. This knowledge, however, was lost, along with much else, in the Middle Ages. As the bubonic plague spread across the European continent, desperate healers burned sulfur and wood in hopes of chasing away the devouring sickness, with little success.

It wasn’t until 1865 that Joseph Lister, a British surgeon, encountered French chemist Louis Pasteur’s research on the existence of microbes, published as Recherches sur la putréfaction. Against this newfound backdrop of germ theory, Lister hypothesized that the same microscopic life might also be responsible for the rapid spread of infection after surgery. To test his theory, he used carbolic acid--now more typically known as phenol--to sterilize medical implements, garments, the patient’s wound site, and his own hands before operating. The results were immediate. In his practice alone, mortality from post-surgical infection dropped by nearly 40 percent.

In modern medicine, there are plenty of options for physicians when it comes to keeping exam and operating rooms as sterile as possible. We are lucky to have disposable garments, high-powered autoclaves and dry heat sterilizers, as well as germicidal cleaners to keep surfaces clean and hostile to microbial life. Without the pioneering efforts of undaunted germ theorists like Lister and Pasteur, we would not enjoy such high standards of sterilization in physician offices today. 


The Science of Paralysis and the Central Nervous System

Nearly 5.6 million people in the US alone live with paralysis of some kind day to day, which works out to roughly 1 in every 50 people. Despite being such a common medical condition, the exact mechanisms underlying paralysis are varied, complex, and, as with most symptoms originating in the central nervous system, not especially well understood.

The most common conditions that can lead to long-term paralysis include spinal cord injuries, broken necks, nerve-damaging diseases, and autoimmune diseases. The initial causes of these conditions are numerous: everything from a stroke to a car crash to a workplace injury to progressive autoimmune disease can result in living with paralysis. Paralysis can be localized--occurring only in a specific region like a limb or the face--or generalized. It is usually considered a chronic, lifelong condition, but in certain select circumstances it can be temporary, and symptoms can improve over time.

But how does something like paralysis occur? To answer that, it’s important to understand the central nervous system.


Essentially, the central nervous system is made up of the brain and the spinal cord, and operates by way of nerve cells, or neurons, that form a web of interconnected signal relays throughout the body. To understand how monumental the central nervous system is to the body, consider this statistic: the brain itself uses about 20 percent of the total oxygen breathed in, all while accounting for only 2 percent of the body’s total mass. Consisting of an estimated 86 billion interconnected neurons, the brain is rightly viewed as one of, if not the, most complex organs in the human body.

The central nervous system is responsible for controlling all manner of physical responses in the body, from conscious movement to internal processes such as breathing, heart rate, hormone release, and body temperature. In order for this electrical relay to function properly, neurons must be protected by healthy glial cells. Glial cells, or neuroglia, support nerve cells in a variety of ways, including anchoring neurons to the blood supply, supporting the creation of the myelin sheaths that insulate electrical signals, providing a scaffold on which neurons can grow, and lining the brain’s ventricles to supply cerebrospinal fluid. When there is significant damage to any portion of the central nervous system, paralysis is likely to occur. A damaged neuron can no longer transmit a signal up or down the chain between the brain and the limbs, so motor control is effectively dampened or entirely silenced.

There are, however, a multitude of options for paralysis treatment and support.

Mobility equipment such as wheelchairs, limb braces, and exterior supports can help people living with paralysis to remain autonomous and active. Physiotherapy with a trained physical therapist can aid in maintaining limb strength and muscle mass. Occupational therapy provides a way for people living with paralysis to adapt to everyday life at home and work through tasks that may prove challenging. And of course there is a whole market of medicine designed to relieve pain, stiffness, and muscle spasms associated with paralysis.

Neurology remains an emerging field of study, and neurologists and physicians alike are sure to discover further therapeutic options in the future that augment and improve everyday life for people living with paralysis.


History of Medicinal Honey

Virtually every civilization has utilized honey for its medicinal properties, going back as far as 8,000 BCE. The Assyrians and Chinese used it as a topical solution for wounds, and almost all Egyptian medicine contained honey mixed with wine and milk. The ancient Greeks mixed a tincture of vinegar and honey to treat pain from wound sites, especially common as a remedy for a soldier’s war injuries. There are illustrious passages in the Qur’an detailing the significance of honey as a remedy for ailments of all kinds. While it might not be a cure-all, there is good reason for honey’s reputation for ambrosia-like properties throughout medical history.

Honey’s antimicrobial properties were first verified in 1892 and since then it has gone on to perform well in a variety of clinical trials. An inhibitory effect has been shown in over 60 species of bacteria, including aerobes and anaerobes as well as both gram-positive and gram-negative bacteria. This is an especially useful trait considering the growing microbial resistance to antibiotics. 


So how does honey work as a natural bactericidal agent? 

When applied topically, honey draws moisture out of the wound area, dehydrating any growing bacterial strains. Honey’s sugar content is also high enough to prove toxic to microbes and hinder colony growth, and its low pH discourages bacteria that have trouble growing in an acidic environment. On top of all that, honey contains an enzyme called glucose oxidase, which gradually breaks down its sugars to generate hydrogen peroxide, yet another antibacterial agent.

A variety of clinical evidence suggests that topically applied honey forms a protective, non-adhesive barrier dressing and can stimulate healing while reducing the potential for scarring. Further study is needed, but honey also seems to have a positive effect on several other conditions, including gastrointestinal tract disease and fungal infection, and it may be useful as an anti-inflammatory. Another boon honey offers is its relative safety: allergy potential is low in the general population, though honey should never be given to infants because of the risk of botulism.

Although science has shone a light on the fallacies of ancient practices like bloodletting and the theory of the four humors, honey has remained a mainstay of wound care since ancient times. As more research is done, honey may reveal itself to contain even more medicinal benefits than science is already aware of, pushing it closer and closer to that glittering ideal medicine outlined in the passages of the Qur’an.


The History of X-Rays

Perhaps the most useful advancement in medical technology, x-rays have been inextricably linked with the medical and dental communities for little more than a century.

In 1895, Wilhelm Conrad Roentgen, a German physicist and mechanical engineer, was at work in his laboratory, toying with a cathode-ray tube. Cathode-ray tubes were popular experimental tools in electromagnetic field research at the time, and so Roentgen was perfectly in his element as he evacuated the air from the tube and experimented with applying high voltage to its contents. To Roentgen’s surprise, when he covered the tube with dense black paper to block out its light, a fluorescent screen nearby continued to glow with a ghostly phosphorescence. After further study--most famously in producing the world’s first radiograph, an image of the glowing bones of his wife’s hand--Roentgen determined that he had stumbled upon an invisible ray, a kind of light outside the sliver of the electromagnetic spectrum that humans can see with the naked eye.


It didn’t take long for the medical community to latch onto x-rays as an invaluable diagnostic tool. Because they provided a complete view of the bone without having to sever or puncture skin, they allowed physicians an almost all-seeing view of a patient’s body. Within a month of their initial discovery, surgeons had already incorporated x-rays into their surgical routines. Having an x-ray done even became something of a fad for the general public for a while, with everyday consumers using coin-operated machines at penny arcades to take radiographs of hands and feet as quick, touristy mementos.

There were, of course, downsides to allowing consumers to cheerfully dish out doses of radiation at their own discretion, though the risks associated with x-rays weren’t fully realized until the mid-20th century. During the atomic era, when governments around the world were myopically focused on building and refining weaponized uranium, the effects of excess radiation on human health came to the scientific forefront. Ionizing radiation damages cells, shreds DNA, and pushes cellular function outside its normal parameters. Unchecked, x-rays are capable of doing the same.

Presently, x-ray radiographs still stand out as a hallmark of medical innovation. Physicians and dental professionals alike use them routinely to diagnose bone health, detect cavities and hairline fractures, and provide general, non-invasive visual support during an exam. With the advent of personal computers, physicians have been able to digitize x-ray radiographs and store high-resolution digital files with ease. Linear accelerators allow physicians to generate penetrating radiation for use in precise, high-quality diagnostic imaging. And the technology itself has improved since Roentgen’s crude cathode-ray tube experiment, too. Man-made isotopes are far more powerful than those that occur naturally, and offer x-ray technicians a wide array of energy levels and half-lives to work with during imaging.

As a miracle of modern medicine, x-rays have come a long way. From a dim experimental laboratory curiosity in late-19th-century Germany to a routine full-mouth diagnostic panel at dentist offices worldwide, x-rays are and will continue to be an invaluable diagnostic tool.


Healthcare-Acquired Infections: How to Keep Facilities Safe

There is a silent killer in healthcare facilities that has gone largely unaddressed. One in ten patients in acute hospital care environments develops some kind of healthcare-acquired infection during their stay. As the 6th leading cause of death in the United States, healthcare-acquired infections cost the health industry upwards of $88 billion annually and contribute up to a 6 percent mortality rate in hospitals countrywide. The greatest microbiological threats attributed to these infections are methicillin-resistant Staphylococcus aureus, S. epidermidis, and multi-drug-resistant gram-negative aerobes.


Most healthcare-acquired infections are associated with invasive medical devices and complications from surgical procedures. Catheter use, which is extensive in the health industry and especially in ICUs and long-term care facilities, contributes to a significant portion of healthcare-acquired infections. Gram-negative bacteria in particular are associated with both ventilator contamination and complications from unsanitary catheter care. Pneumonia typically arises from compromised ventilators, whereas urinary tract and bloodstream infections commonly stem from a lack of catheter sterilization. These infections arise during or shortly after a medical stay, precisely when the patient’s immune system is at its most vulnerable. For elderly, infirm, and younger hospital patients, these sterility compromises can be life-threatening.

Fortunately, there are steps to take to keep hospitals, intensive care units, and long term healthcare facilities free of these infectious agents.

The hand hygiene of hospital staff goes a long way towards preventing microbial spread. Utilizing sterile gloves, aseptic handling techniques, antimicrobial flushes, and 2 percent chlorhexidine skin preparation provides an ideal defense against infection. With reduced drug longevity due to growing microbial resistance and the high capital costs required for new drug development, cutting infection off at the pass with safe handling techniques is the best path forward for reducing healthcare-acquired infections.


Living with Arthritis: Mechanism, Symptoms, and Long Term Care

Often seen as a hallmark of age, arthritis affects nearly 50 million adults in the United States alone. Though it is generally associated with an older population, there are roughly 300,000 children in the U.S. who are also living with arthritis. The surprising truth here is that what we know as ‘arthritis’ is not even a single disease, but a set of symptoms associated with several different kinds of disease.


There are four distinct categories that most types of arthritis fall under, and over a hundred even more specialized subcategories that arthritis can break into from there. For the sake of brevity and simplicity, we’ll only go over the four major categories here: degenerative arthritis, inflammatory arthritis, infectious arthritis, and metabolic arthritis.

Degenerative arthritis, or osteoarthritis, is the most globally prevalent type of arthritis, and occurs most frequently in the hands, knees, and hips. Because these regions are typically associated with movement, it’s natural that there is a higher degree of cartilage wear in these areas. Over time, when the cartilage cushioning between bones is completely worn away, the bones begin to rub and shift against one another. This can cause stiffness, both acute and chronic pain, and decreased range of motion, and can contribute to an inability to do day-to-day activities that were once a staple part of a person’s life. As far as management techniques for osteoarthritis are concerned, there are a plethora of options. Finding ways to balance periods of activity and rest, practicing therapeutic muscle-strengthening exercises, utilizing hot/cold therapy products, and taking over-the-counter pain relief medication can all contribute to successful osteoarthritis management.

The second category is inflammatory arthritis, the best-known form of which is rheumatoid arthritis. Here, the immune system malfunctions and mistakenly attacks the joints, which in turn causes uncontrolled inflammation and pain. Over time, this immune system assault on vulnerable structures can cause internal organ damage, vision loss, and joint erosion. For this reason, it’s important to treat suspected cases of rheumatoid arthritis quickly and aggressively. Disease-modifying antirheumatic drugs, or DMARDs, are often the best treatment option when it comes to facilitating disease remission.

Infectious arthritis, another form arthritis can take, is just as straightforward as it sounds: bacteria, viruses, or fungi infiltrate and infect sensitive joints and trigger severe inflammation and pain. The most common microbial offenders are salmonella, shigella, chlamydia, gonorrhea, and hepatitis C. Luckily, infectious arthritis is relatively easy to treat, and when bacteria are the culprit, a course of antibiotics usually takes care of the unpleasant symptoms.

The last category of arthritis we’ll discuss here is metabolic arthritis, also known colloquially as gout. Gout occurs when there is a significant build-up of uric acid--a byproduct of the metabolic breakdown of purine nucleotides. When uric acid adheres to the joints, it crystallizes and forms painful, needle-like spears along the joint fissure. Gout is strongly influenced by diet, weight, and exercise, so following a healthy day-to-day regimen can prevent and manage gout flare-ups fairly effectively.

Early diagnosis of arthritis symptoms generally results in a better pain management outcome for patients. A doctor may order blood, urine, joint fluid, and/or x-ray tests to determine whether a patient is suffering from arthritis. X-rays in particular can provide a direct view of possible joint-space narrowing and osteophyte formation in osteoarthritis cases. On the treatment side, physical therapists are able to provide care and therapeutic guidance for maintaining range of motion and joint limberness over the long term.

With arthritis diagnosis numbers growing annually, it’s important to remain proactive in reaching out for support and an expert opinion on concerning symptoms. Remember, early therapeutic care can lead to a lifetime of pain-free arthritis management.


The Ancient and Surprising History of Surgery

While it may seem intuitive to describe surgery as a ‘modern’ medical practice, its origins in fact can be traced back thousands of years, deep into the Pre-Classical and even Neolithic era.

The earliest archaeological evidence for surgical practice dates to an astounding 12,000 BCE. To put a time frame like this in perspective, writing wasn’t developed until around 3200 BCE in Mesopotamia and Egypt. Early people were practicing rudimentary surgery long before recorded history!

Prehistoric surgery centered on a practice called trepanation, which involves scooping or drilling a hole into the skull of the patient. This ancient practice was used as a panacea for everything from relieving migraines or intracranial swelling to cleaning fractured skull fragments out of a war injury, and even as a means of managing misunderstood mental health conditions that were viewed as a malignant spirit trapped within the patient’s cranial cavity.


Around 500 BCE, the first examples of plastic surgery sprang up in India. Having one’s nose cut off was a common punishment for a past crime, and so reformed felons would have their noses reconstructed via early rhinoplasty to avoid the social stigma. Surgery in Ancient Greece included a variety of makeshift surgical work, including the setting of broken bones, bloodletting, the draining of fluid from the lungs of patients suffering from pneumonia, and the severing of gangrenous limbs. The Maya were at the forefront of global surgical practice, performing routine dentistry and filling cavities with flecks of jade, turquoise, quartz, or hematite. The Incas had master surgeons specializing in head injuries and cranial surgery, and records show they had substantially better success rates than surgeons during the American Civil War, centuries later.

Another major leap forward came around 1000 AD, with the highly esteemed ‘father of surgery’ Al-Zahrawi and his all-encompassing surgical text Kitab al-Tasrif. It was a cutting-edge compendium of every known practice and procedure, including orthopedics, military surgery, and ear, nose, and throat surgery. A combination of Al-Zahrawi’s collected knowledge and local folk remedies remained the go-to surgical manual for nearly eight hundred years. Up until the 18th century, there was no formal medical training in Europe, and so most surgeons learned their trade through apprenticeship, much like one might learn blacksmithing or another artisan skill. In fact, because surgeons were seen as ‘lesser’ physicians during this period, many women trained and practiced as surgeons. It wasn’t until the 1700s, with the development of medical colleges and academic institutions, that women were excluded from practicing.

As far as patient care was concerned, it wasn’t until modernity that a surgical procedure became anything short of horrific to experience. With little regard for infection control, pain management, bodily fluid contamination, or proper wound maintenance, being a patient before modern anesthesia was akin to torture. Many physicians believed it was important to keep patients alert and awake during surgery, and would periodically rouse the patient if it appeared they were in danger of losing consciousness. Opium and alcohol were used as analgesics only sparingly, and never in quantities large enough to diminish patient consciousness. Japanese surgeons were the first to implement true general anesthesia in the early 1800s, after which ether, chloroform, and locally administered cocaine became more commonplace globally as anesthetics in the operating room.

While far from a modern practice, surgery has grown and developed to be a vastly safer and more pleasant patient experience over the millennia, all thanks to talented medical pioneers and advocates all over the world.


Social Media Ad Copy Examples


  • Shop QuickMedical's wide range of patient aid products, perfect for long-term care facilities, clinics, and hospital use. Provide patients with welcoming hygiene care essentials away from home, including wipes, body wash, deodorant, and disposable razors, among dozens of other product categories.


  • The Mediana DT-100 Smart Thermometer offers non-contact infrared temperature measurement with a rapid response time. Using heat-mapping technology, the DT-100 is the next step in hygienic, easy-to-use patient temperature measurement. Shop today!


  • Electrode Prep Products from Parker Labs ensure a conductive connection between skin and electrodes, transmitting electrical impulses clearly for ECG, TENS, muscle stimulation, and other procedures. Parker Labs Electrode Prep Products are offered in gels and creams, and are ideal for hospitals, electrotherapy clinics, and other medical facilities.


  • QM Elite nitrile gloves are multi-purpose, with applications extending from laboratory use to routine medical exams. With textured fingertips to provide dexterity support and a range of size options, look no further for your clinic’s new favorite exam glove.