Thursday, January 29, 2009

Family Surrogate-Based Research for Alzheimer’s Patients

I just read that a new study is finally lending support to surrogate consent for Alzheimer’s patients. Having been a caregiver for a parent with Alzheimer’s, I say: about time. I find it appalling that this “surrogate” consent was not allowed earlier. As if research for Alzheimer’s needed any more stalling. How can we advance research on Alzheimer’s and dementia if studies are not done on patients suffering from these diseases in their varying stages? For instance, when my father was diagnosed with Alzheimer’s, he was immediately prescribed Aricept. However, because his disease advanced so rapidly, the Aricept was not beneficial for him, and no other drugs were available to slow his dementia. According to the article:

“By the time they (patients) have been diagnosed with Alzheimer’s disease, many patients’ decision-making ability is so impaired that they cannot give informed consent to participate in research studies. Close family members are left with the decision, but there is no clear policy for this so-called “surrogate” consent. Because of that, research about the increasingly common disease is often stalled. But a new study led by the University of Michigan Health System suggests that older Americans are very supportive of family surrogate-based research, and would support having their family members enroll them in research in case of future incapacity. The study appears in the new issue of the journal Neurology.”

Although the article notes that “ethnic and racial minority groups were slightly less willing to participate in surrogate-based research,” there has been support from these groups. I know that I am a strong supporter of Alzheimer’s research and the concept of surrogate consent. I only wish this decision had been made four years earlier, when my father could still have benefited. The article mentions that, “The rates of Alzheimer’s disease are rising rapidly; in 2000, there were 4.5 million Americans with the incurable disease, and by 2050, this number is projected to be 12.5 million if no effective treatments are found.” Although my Dad will not be one of the lucky ones to benefit from this new decision, I’m happy that millions of others will. Who knows, this research may even help me one day, or one of my siblings.

I hope some of you dislike me relatively soon.

This is my first blog entry ever, anywhere. I had no idea how I was going to approach it. As I read the various blogs of my peers with their different perspectives and approaches, I kept asking myself, “What should a biomedical ethics (bioethics) blog be?”

During the past Presidential election cycle, bloggers from all corners of the political spectrum, immolating (I think the term “flaming” is no longer adequate for the current level of political discourse) those of even slightly differing viewpoints, defined (fairly or unfairly) what it means to “blog.” However, my training in medical writing leaves no doubt that opinion has absolutely no place in biomedical writing. I think we can all agree that any individual blog, the collective blogosphere, and even the internet in general are poor places to gain credible knowledge of evidence-based medicine. (However, there are online repositories and databases of evidence-based medical information, such as http://www.ncbi.nlm.nih.gov/pubmed, http://www.medlineplus.gov/, and http://www.cochrane.org/.) By their nature, blogs have a point of view. They are for commentary and (more often) advocacy, not journalism.

In a bioethics blog, we are not primarily concerned about whether scientific medicine is effective or safe. That is for the scientists, the doctors, and (ultimately) the FDA to decide. In a bioethics blog, we are primarily concerned with the ethics associated with the application of scientific medicine. We have to ponder the difficult, even ugly, questions. A lot of these questions are in the news today:

No one questions the contention that stem-cell research may lead to medical advances. The ethical question, however, is what happens when fetuses become valuable and thus a commodity. Will there be incentives to harvest fetuses? What kind of behavior will that spur?

A related question (which is considered by some to be old and cliché but is more difficult and relevant than ever) is, “When does life begin?”

A wildly disproportionate percentage of health care expenditures in a person’s life is incurred at the very end of his or her life. It is a difficult question, but are those expenditures justifiable?

What is more important, providing “equitable” health care for all patients or preserving providers’ incentives to be innovative and efficient? (If you think that is an easy choice of the former over the latter, then think again.)

Who is to blame for the explosion of medicine-related lawsuits in America, “greedy, predatory lawyers” or “incompetent, indifferent doctors?”

Should the government place extra taxes on the manufacturers, sellers, and consumers of certain products (soda, fast food, tobacco, alcohol, etc.) or even ban them altogether? Where do self-responsibility and individual choice come in? Who decides what is bad for another? Where does it end? Could limits on TV watching eventually be mandated by law?

It will be fun exploring these issues. I tend to alienate everyone on every side of every issue, because I do not subscribe to any one orthodoxy. If at least one of you does not end up personally disliking me, then my blog will not have been very good.

Wednesday, January 28, 2009

Extra Extra: War on Salt?

This time, Dr. Frieden, commissioner of New York City’s Department of Health and Mental Hygiene, has sodium reduction in mind. He is known for his previous campaigns against trans fats and smoking, and for requiring calorie counts on menus. However, salt will be harder to fight, since not all of its effects on health are proven. The only established low-sodium eating plan we have is the DASH (Dietary Approaches to Stop Hypertension) diet, which emphasizes foods low in sodium, calories, and fat.

Dr. Frieden’s plan is to reduce the sodium in prepared and packaged foods. He expects that if sodium is reduced little by little, the average person will be able to adhere to a low-sodium diet. Salt, like spiciness, is an acquired taste; if we can gradually reduce consumption and wean people off it, most could likely get by on blander-tasting foods. Still, this is not a situation we can achieve anytime soon, and perhaps never.

That is mostly because salt is in everything; even if you keep it out of processed foods, it is still in your daily diet. The RDA (Recommended Dietary Allowance) advises that the average person consume less than 2,400 mg of sodium a day, about 1 teaspoon of table salt. Most of us have hit that teaspoon by the end of breakfast. The only ways the average person can follow a low-sodium diet are discipline, medical necessity, or completely wiping out all sodium-rich foods, which is nearly everything. Reducing salt in packaged foods can help, but we still have a long way to go to reach the low-sodium goal. Ultimately, it is people’s choice to eat what they want, and if they want to be in good health, to eat sparingly. A little bit of everything is better for you than a lot of everything. As the saying goes, “don’t take more on your plate than you can handle.” The same goes for life.

The Epidemic That Wasn’t

The Epidemic That Wasn’t (The New York Times: January 26, 2009) reports on the outcomes of children born in the 1980s and ‘90s who had experienced prenatal exposure to crack cocaine, which was a prevalent issue at that time. Researchers have been following these children through adolescence and beyond to monitor the long-term effects of fetal exposure to crack on their brain development and behavior. The surprising results of this research indicate that, although the harmful effects are present, consistent, and measurable, they are subtle and have less of an impact than had been predicted.

Cocaine does slow fetal growth. Infants are smaller at birth and their head size is smaller, but this is a temporary effect that normalizes as these children grow. The largest of several studies of the IQ scores of cocaine-exposed children showed that the exposed children averaged 4 points lower on IQ scores at age 7 than unexposed children. However, a pooled analysis of studies of more than 4,000 children did not reveal any significant effect on IQ or language development in cocaine-exposed children aged 4 to 13. Cocaine-exposed children can have difficulty focusing on and performing tasks, particularly if visual attention is required. Behavioral effects of fetal exposure to cocaine include increased frequency of defiant behavior and misconduct, more so in boys than girls. However, experts point out that these children are not distinguishable from “normal” children in a group and that some of these effects are more likely related to socioeconomic factors, such as stress in the home, substandard education, inadequate access to healthcare, exposure to violence, and poverty.

There is no doubt that in utero drug exposure is harmful to developing fetuses. But, as pointed out in the article, the problem may have been handled as a moral issue rather than a health problem. During the 1990s many women were prosecuted and jailed for illegal drug use during pregnancy, and it is common for women who use illegal drugs during pregnancy to lose custody of their children. According to 2006 and 2007 Department of Health and Human Services data, 5.2% of pregnant women reported illicit drug use; however, the same data showed that two to three times as many pregnant women used alcohol and tobacco (11.6% and 16.4%, respectively), both of which are legal and have well-documented effects on fetal development and lifetime health. In contrast, the long-term effects of cocaine on children’s cognitive development and behavior appear relatively small when compared to the effects of alcohol, and comparable to those of tobacco. So far, no link has been established between prenatal cocaine exposure and predisposition to drug use, although such a link has been reported in the case of prenatal tobacco exposure.

Cocaine-exposed children and their mothers are often stigmatized and vulnerable to negative expectations from society. Numerous evidence-based interventions and resources are available to help mothers regain custody of their children and start a new life, including drug and alcohol recovery programs, halfway houses, parenting classes, counseling, vocational training, financial and life coaching, and Head Start programs for the children. As unfair as the treatment of crack-addicted mothers appears in comparison to that of alcohol- and tobacco-addicted mothers, these salutary interventions might not otherwise have been available to them.

I remember when crack babies were a topic of conversation. Heartbreaking stories abounded of little babies born to crack-addicted mothers—inconsolable infants who required constant holding by volunteers in the nurseries of hospitals. These highly irritable newborns were expected to mature into a generation of problem children and sociopaths, a future burden on society. My initial reaction to this article was one of surprise, and fear that cocaine use might somehow be viewed as benign in terms of its long-term effects on development and behavior. But my deeper reaction was in response to our moral judgments about what is legal and illegal, regardless of the real impact of harmful behaviors. Tobacco companies continue to flourish and market their products to young people in spite of the well-documented long-term effects of tobacco on the smoker, on the fetus, and on those who inhale secondhand smoke. In contrast, cancer patients and those with chronic pain continue to fight for access to medical marijuana, which is still illegal in most states. I am relieved and happy that the crack-exposed children are doing better than expected. I am also hopeful that this new information may put a fresh perspective on the dangers of prenatal alcohol and tobacco exposure and how we, as a society, deal with these issues.

Infectious Mononucleosis: The Annie of Diseases?
By: Lisa Menard

In an article published on January 21, 2009, Carolyn Sayre introduced us to a teenager named Chelsea Day and her struggles with infectious mononucleosis. We soon find that Chelsea is just one of many who are affected by the disease. For the 35 to 40 percent of adolescents and young adults who develop symptoms, this disease has a major impact on their quality of life; however, according to Dr. Robert Frenck, mono is a disease that "doesn't get the respect it deserves."

Mononucleosis, or mono, is also known as "the kissing disease" because it is spread by close contact. About 95 percent of adults in the U.S. have been infected with Epstein-Barr, the herpes virus that causes mono, by age 35 to 40; however, most people who become infected with Epstein-Barr virus never develop symptoms.

Since mono is so common, some experts fear the disease has become trivialized among physicians and the research community; however, mono is anything but trivial for the young people who get it.

There is still no vaccine or antiviral drug that can ward off Epstein-Barr virus or treat the resulting infection; there simply isn’t enough demand for a vaccine to warrant expensive research. Dr. Gary Simon echoed this sentiment when he stated, “There are more serious things. In general, this is a self-limited illness; 99.9 percent of people will get better.”

On the other hand, Dr. Hank Balfour, who agrees that further research is warranted, blames big pharma for the orphan situation. According to Dr. Balfour, “If they (big pharma) don’t see it as a blockbuster, then they don’t want anything to do with it.”

I agree with Dr. Balfour's views, but only to a certain degree. Is this a disease that deserves further research? Yes. Is the study of mono an orphan situation? Yes. However, this is also the point where I tend to disagree with Dr. Balfour.

The study of the disease is not an orphan situation because of big pharma’s desire for blockbuster drugs; rather, it is an orphan situation because of Dr. Simon’s main point: there are more serious things, and 99.9 percent of people will get better. Can we say the same for cancer, heart disease, or HIV/AIDS?

Orphan diseases are rarely studied for a reason. Teenagers such as Chelsea Day may feel it is unjust that their suffering is trivialized; however, the decision is clear when we compare which action produces the most advantages and fewest disadvantages for the most people. This is where pharmacoeconomics comes into play.

If we are going to raise the topic of monetary value, then we must consider the cost-versus-benefit relationship. Money is being poured into the study of diseases such as cancer and HIV/AIDS because they have a more profound impact on the healthcare system as a whole. Think of it this way: which is greater, the cost of treating non-life-threatening mono for a couple of months, or years of treatments for cancer or HIV/AIDS?

Extending the monetary question a little further, should orphan situations still be a topic of controversy? Incentives such as tax credits and seven years of marketing exclusivity make clinical research on rare diseases more feasible. In the case of mono, the disease may affect more than 200,000 people in the U.S., but there is no expectation that the cost of developing a drug or vaccine would be recovered from sales; these incentives may make up the difference.

In the meantime, scientists such as Dr. Balfour are working on treatments to help patients recover faster. Last year, Belgian researchers reported the results of a mid-stage trial in which a study vaccine reduced the rate of mono infections from 10 percent (control group) to 2 percent (vaccine group). So, while the controversy still simmers, Chelsea Day and all future mono sufferers can be assured that “the sun will come out…tomorrow.”