
posted by Fnord666 on Monday May 22 2017, @07:11AM   Printer-friendly
from the prescription-for-a-disaster dept.

Arthur T Knackerbracket has found the following story:

Patients would no longer have to wake up in the middle of the night to take their pills, Purdue told doctors. One OxyContin tablet in the morning and one before bed would provide "smooth and sustained pain control all day and all night."

When Purdue unveiled OxyContin in 1996, it touted 12-hour duration.

On the strength of that promise, OxyContin became America's bestselling painkiller, and Purdue reaped $31 billion in revenue.

But OxyContin's stunning success masked a fundamental problem: The drug wears off hours early in many people, a Los Angeles Times investigation found. OxyContin is a chemical cousin of heroin, and when it doesn't last, patients can experience excruciating symptoms of withdrawal, including an intense craving for the drug.

The problem offers new insight into why so many people have become addicted to OxyContin, one of the most abused pharmaceuticals in U.S. history.

Over the last 20 years, more than 7 million Americans have abused OxyContin, according to the federal government's National Survey on Drug Use and Health. The drug is widely blamed for setting off the nation's prescription opioid epidemic, which has claimed more than 190,000 lives from overdoses involving OxyContin and other painkillers since 1999.

The internal Purdue documents reviewed by The Times come from court cases and government investigations and include many records sealed by the courts. They span three decades, from the conception of OxyContin in the mid-1980s to 2011, and include emails, memos, meeting minutes and sales reports, as well as sworn testimony by executives, sales reps and other employees.

The documents provide a detailed picture of the development and marketing of OxyContin, how Purdue executives responded to complaints that its effects wear off early, and their fears about the financial impact of any departure from 12-hour dosing.

Reporters also examined Food and Drug Administration records, Patent Office files and medical journal articles, and interviewed experts in pain treatment, addiction medicine and pharmacology.

Experts said that when there are gaps in the effect of a narcotic like OxyContin, patients can suffer body aches, nausea, anxiety and other symptoms of withdrawal. When the agony is relieved by the next dose, it creates a cycle of pain and euphoria that fosters addiction, they said.

OxyContin taken at 12-hour intervals could be "the perfect recipe for addiction," said Theodore J. Cicero, a neuropharmacologist at the Washington University School of Medicine in St. Louis and a leading researcher on how opioids affect the brain.

Patients in whom the drug doesn't last 12 hours can suffer both a return of their underlying pain and "the beginning stages of acute withdrawal," Cicero said. "That becomes a very powerful motivator for people to take more drugs."

-- submitted from IRC


Original Submission

 
  • (Score: 0) by Anonymous Coward on Monday May 22 2017, @06:28PM (2 children)

    by Anonymous Coward on Monday May 22 2017, @06:28PM (#513647)

    Before citing it as strong evidence, you should note that that estimate of 250,000 deaths is based on data from three papers that collectively accounted for 35 "preventable" deaths. No, that's not a typo. That's quite an extrapolation.

    I went to table 1 and checked the first reference in that table (ref 11). It looks like that one alone dealt with >10,000x more records than you claim:

    Of the total of 323,993 deaths among patients who experienced one or more PSIs from 2000 through 2002, 263,864, or 81%, of these deaths were potentially attributable to the patient safety incident(s)

    http://www.providersedge.com/ehdocs/ehr_articles/Patient_Safety_in_American_Hospitals-2004.pdf [providersedge.com]

    On the other hand, I haven't checked this paper in detail; it is quite possible it will end up being normal medical research quality (extremely crappy). But getting such an estimate does not seem like it would be problematic in principle (besides the trouble with defining "error"), so that would be the fault of the NIH, CDC, etc. for not funding studies to collect this important info.

  • (Score: 2) by AthanasiusKircher on Thursday May 25 2017, @08:44PM (1 child)

    by AthanasiusKircher (5291) on Thursday May 25 2017, @08:44PM (#515697) Journal

    I don't normally respond to ACs these days, but I need to correct an error here. If you actually read the study in the link you provided (rather than merely its "summary"), you'll find the following statement on page 6:

    We determined that the 16 PSIs we studied may have contributed to 263,864 deaths in the Medicare population from 2000 through 2002. Eighty-one percent of these preventable deaths were potentially attributable to the patient safety incident.

    These "weasel words" are there for very good reasons, despite being juxtaposed with seemingly contradictory rhetoric like "these preventable deaths."

    Those "263,864 deaths" quoted in the meta-study were extrapolated from analysis of "16 PSIs," which stands for "patient safety indicator." In other words, they didn't actually examine any specific cases to determine whether a "preventable death" occurred due to the details of the case. Instead, they extrapolated on the basis of vague issues that potentially indicate a problem with "patient safety." Some of those "indicators" seem clearer than others (see Appendix A for the list). For example, "foreign body left during procedure" sounds like a clear medical error, though again whether it was a primary cause of death was not investigated in any specific case in that study. On the other hand, "Post-operative hemorrhage or hematoma" -- well, lots of people experience bleeding post-op, especially if they don't adhere to doctor's instructions. Trying to extrapolate how many "preventable deaths" occurred based on an "indicator" like that seems problematic, though.

    So, how did they come up with their numbers? Well, if you look at Appendix F of your link, you'll see they extrapolated based on statistics from this study [jamanetwork.com]. Except that study didn't determine mortality or "preventable deaths" from individual case review either; rather, it used a sort of "case-control" methodology to look at the difference in outcomes between patients who did and did not experience these "PSIs." On that basis, they calculated the "excess mortality" likely due to those PSIs.
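
    To make that extrapolation step concrete, here is a rough sketch (in Python, with made-up numbers -- none of these figures come from either paper) of the kind of arithmetic an "excess mortality" attribution involves: take the difference in death rates between patients with and without a given PSI, then multiply by the number of patients who had that PSI.

        # Illustrative only: hypothetical numbers, not figures from either study.
        # "Excess mortality" attribution: the difference in death rates between
        # patients with and without a patient safety indicator (PSI), multiplied
        # by the number of patients flagged with that PSI.
        mortality_with_psi = 0.09      # hypothetical death rate among patients with the PSI
        mortality_without_psi = 0.05   # hypothetical death rate among matched patients without it
        patients_with_psi = 10_000     # hypothetical number of flagged cases

        excess_rate = mortality_with_psi - mortality_without_psi
        attributable_deaths = excess_rate * patients_with_psi
        print(f"estimated attributable deaths: {attributable_deaths:.0f}")  # -> 400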

    That may sound a little better methodologically (and I agree), but then you read their conclusion: "one can infer that the 18 types of medical injuries may add to a total of 2.4 million extra days of hospitalization, $9.3 billion excess charges, and 32,591 attributable deaths in the United States annually."

    So, your linked study took those estimates of "excess mortality" and applied them to a new dataset to extrapolate possible deaths and the possible medical errors that may have contributed to them. It then came up with an estimate for Medicare patients alone that is 2.7 times higher than the estimate for ALL patients in the U.S. in the study I linked (the same study from which they got their mortality estimates), even though the study I linked did a much less rigorous analysis.
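
    For what it's worth, here is the quick arithmetic behind that 2.7x figure (my own back-of-the-envelope check in Python, assuming the 263,864 Medicare deaths are spread evenly across the three study years, 2000 through 2002):

        # Back-of-the-envelope check of the 2.7x figure. Assumes the Medicare
        # deaths are spread evenly across the three study years (2000-2002).
        medicare_deaths_2000_2002 = 263_864  # from the report linked above
        study_years = 3
        jama_annual_estimate = 32_591        # attributable deaths/year, all U.S. patients (JAMA study)

        medicare_per_year = medicare_deaths_2000_2002 / study_years  # ~87,955 per year
        ratio = medicare_per_year / jama_annual_estimate             # ~2.7
        print(f"{medicare_per_year:,.0f} deaths/year, about {ratio:.1f}x the JAMA estimate")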

    Anyhow, I stick to my original statement: only 35 actual cases were studied and determined to be preventable based on individual facts. I'm willing to accept a more rigorous case-control analysis or something as a way to extrapolate a broader estimate, but I don't see evidence that your linked study or the broader metastudy that's being discussed here used such methods. And given that their own source for methodology estimated the annual death rate as nearly an order of magnitude lower, I'd say there are serious red flags here.

    • (Score: 2) by AthanasiusKircher on Thursday May 25 2017, @09:05PM

      by AthanasiusKircher (5291) on Thursday May 25 2017, @09:05PM (#515703) Journal

      even though the study I linked did a much less rigorous analysis.

      Sorry -- meant to say "much MORE rigorous."

      Bottom line: the ~250k/year estimate comes from a metastudy that rests on 3 studies which together looked at 35 actual preventable deaths, plus one other study whose extrapolations borrowed methodology from yet another study -- a study that itself came up with an estimate of only ~32k/year.