The overuse of antibiotics in large-scale animal farming is particularly egregious; many animals are given low doses of antibiotics as part of their feed [1]. This gives bacteria plenty of time and selective pressure to become resistant, and it's probably the worst thing we could do for the long-term health of humanity.
The Wiki statement that Europe has banned antibiotics for growth promotion since 2006 may be true legally, but not in practice.
In fact, Denmark has used antibiotics to prevent infections in pig farming [1] until at least the summer of 2016 [2]. As a result, some farms have 88% of their pig population infected with MRSA (methicillin-resistant Staphylococcus aureus) [3], and over 10,000 Danes have been infected with pig-MRSA [4].
The ban hits #1. #3 really doesn't deserve to be banned - it's a legitimate use of antibiotics in veterinary medicine. #2 is where there's still a problem, because the use of antibiotics as a preventative runs the gamut from a perfectly legitimate veterinary intervention to an egregious misuse of antibiotics depending on the circumstance.
China doesn't really care about the environment. Their top priority is growth in order to avoid social unrest, and to feed all those people they need all the food they can get - so if antibiotics make farming more productive, they'll use them.
The US, on the other hand, has the problem that Big Ag is a powerful lobby. And Trump, given his total lack of understanding of climate and environmental issues, won't do anything to change that.
China is in a phase of its development where the environment is not the most important thing. Once they're rich enough they'll put resources into fixing that.
This is how every rich country went through industrialization. Not because they were all immoral, but because they were poor.
The good news is that China is somehow doing this in fast forward. Each year of Chinese development seems to accomplish what takes three years in a typical economy, and their enormous push for solar power is (I hope) just one sign of what's to come.
China uses the most renewable energy of any nation and just committed to a 2+ billion expansion of that development. It's never wise to be dismissive and generalize about an entire country when it is a complex landscape of ideals and practices.
Could be because they need to, not because they want to. The air pollution in some of their cities is downright hazardous, so they have to do something, and with the falling price of solar it also makes economic sense by now. I don't buy that they are suddenly really interested in sustainability.
US corn production may have started by feeding people, then animals, then substituting for other foods (corn syrup replacing sugar), then being burned in cars - which just shows that people tend to find ever wider demand for products as they get cheaper.
As meat prices drop, demand may end up in pet food or 'corn' flakes, but the end use doesn't really matter. In competitive markets producers tend to make a fixed percentage of sales as profit, so increasing total sales tends to also increase industry profits.
First, "Europe banned antibiotics" is a massive oversimplification, and ignores that similar steps are being taken in the U.S.
Second, while the European bans are doing good things, it's far from perfect: http://www.cidrap.umn.edu/news-perspective/2016/10/eu-report... . Use of fluoroquinolones, macrolides and polymyxins (including colistin) rose - Europe is weirdly fond of them.
For those who are concerned about the issue, one good step is to change your diet to a whole-plant diet plus wild-caught fish (e.g. salmon). Livestock meat, especially red meat and processed meat, has adverse health effects, especially in North America.
When it's wild-caught, make sure it's sustainably harvested. Alaskan fish is actually great in this regard. (Go figure one of their biggest industries is OK with government regulation.)
But seriously, antibiotic resistance is a population-wide problem and you personally not eating meat won't protect you against antibiotic resistant bacteria.
Not to mention the idea that red meat specifically causes adverse health effects in most people is an old idea that modern nutrition studies have mostly not shown to be true.
You've misunderstood the parent post. By embracing non-polluting alternatives, you promote and induce market growth along environmentally healthy lines and deny resources to unhealthy options. This may not protect you immediately, but over time it can absolutely work provided a sufficiently large group is able to act collectively.
There is plenty of evidence that this approach has worked in other domains to create behavioral changes in large, impersonal organizations.
Maybe I did misunderstand. In that case I would think that buying locally raised, organic, grass-fed beef would be a more comparable alternative than wild salmon (Though neither is sustainable enough to replace the current levels of beef consumption for everyone).
The thing to note, however, is that once the horse is out of the barn, closing the barn door is futile. In fact it may very well be counterproductive if it leads to infected meat spreading further.
Antibiotics are a renewable resource. If we could prohibit all use of a class of antibiotics for a time, probably decades, bacteria would stop spending energy on resistance and become vulnerable again.
Antibiotic resistance is a problem with a political solution. It just takes coordination and cooperation worldwide to do the right thing instead of chasing short-term profit.
So I guess we'd better hope for a technological solution instead.
Unable to edit my prior comment, but found research that appears to dispute your claim; citing notable research might help me better understand your claim.
It is very possible that I'm missing something, but that's not how evolution works to my knowledge. Adoption of substantially new genetic features is rapid, but refinement or removal of unrelated genetic code is slow if it is not impacted by the substantive mutations.
For more information, see
"Long-term phenotypic evolution of bacteria":
No expert, but theoretically new mutations that eliminate resistance may be beneficial by allowing the newly mutated bacteria to outgrow the resistant ones if the mutation leads to spending less energy.
Edit: there will also be non-resistant bacteria in the population so the mutation to remove resistance may not even be necessary.
'Use it or lose it' applies to bacteria far more than to, for example, humans.
Evolution doesn't work the way you think it does. It's not adaptation to something for all eternity but adaptation to current surroundings - once the context changes, the genome will change too.
The process is simple - the first bacteria that ditch the antibiotic resistance gene will multiply more and use more of the available resources, so other bacteria that keep that gene (which becomes a resource hog since it no longer contributes to survival) will die out.
Antibiotic resistant genes don't just disappear when we stop using antibiotics. They will remain in the gene pool effectively forever at low levels.
After the first time antibiotic resistance is developed, the gene frequency in the bacteria population may drop to nearly zero after it isn't so useful. But it will come back again very quickly with the reintroduction of that antibiotic. The time scale will be much quicker than when the bacteria first developed antibiotic resistance.
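That pattern - slow decay of a costly resistance gene, fast rebound once the antibiotic returns - can be sketched with a toy haploid selection model. This is my own illustration, not from any study cited here, and the fitness numbers are made up:

```python
# Toy haploid selection model (illustrative only; fitness values are invented).
# p is the frequency of the resistant strain. Resistance carries a small
# fitness cost when no antibiotic is present, and a large advantage when one is.

def step(p, w_res, w_sen):
    """One generation of selection: new frequency of the resistant strain."""
    mean_fitness = p * w_res + (1 - p) * w_sen
    return p * w_res / mean_fitness

def simulate(p0, w_res, w_sen, generations):
    p = p0
    for _ in range(generations):
        p = step(p, w_res, w_sen)
    return p

# Antibiotic withdrawn: resistance costs ~1% fitness. The frequency falls,
# but only slowly, and never actually reaches zero.
p = simulate(0.9, w_res=0.99, w_sen=1.0, generations=500)  # ends up around 0.05

# Antibiotic reintroduced: the sensitive strain is heavily selected against,
# and the residual resistant fraction takes over within a handful of generations.
p = simulate(p, w_res=1.0, w_sen=0.1, generations=5)
```

The asymmetry falls out of the numbers: a 1% fitness cost takes hundreds of generations to push the gene down, while strong antibiotic selection restores it almost immediately from the low-level reservoir.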
Widespread use of antibiotics (especially at low doses) was and is a criminal mistake.
You are right, but perspective is needed. How quick is quickly? Is resistance quicker to return than it is to be lost? Or slower to return than to be lost? If the latter, we can rotate certain types of ABs effectively.
> Widespread use of antibiotics (especially at low doses) was and is a criminal mistake.
This is not only a human-made thing. Widespread AB use is common in nature. But AB use for growth promotion IS criminal. And ABs should be given as a last resort, not as usual practice. Good nutrition and adequate supplementation can cover the rest:
- Retinol is epic for mucosal layer protection.
- Vitamin D is a great anti-infective agent.
- Vitamin C is awesome prevention and a potential cure for Helicobacter pylori.
- Many spices are fantastic as ABs.
- Fermented foods such as kefir and stuff like spirulina are great addition.
- High carb diet reduces immunity a lot.
You don't really need ABs every freaking day - it's typical for kids to have 6 or more courses in a year. My kid is raised with the above principles and more, and has never had an AB, even in kindergarten (7 years old now).
There were also some other problematic procedures, such as appendectomy, that influence this, as well as the use of ABs for viral diseases (typically to prevent opportunistic infections) or for diseases they don't affect much, where they can even make the organ worse, such as otitis.
NSAID use is also contributing A LOT to this. People now bring a temperature down at 37.5 °C, and if you don't do it at 38 °C you are considered a lunatic. I simply do not do it until 40.5 °C (talking about kids; the exact protocol is contextual and depends on age, duration, etc.). The science has shown that when you suppress fever, the disease lasts longer, and mortality is proven to be higher in animals. The brain knows perfectly well when to stop a fever, unless it is broken itself (i.e. meningitis).
You are proposing coordination at the national or international level, among many different companies, medical institutions, and governments.
Just getting a ban on the use of antibiotics for non-infection related reasons will be difficult enough.
I am not proposing anything that radical. The first thing I would probably do is stop using subtherapeutic ABs for growth promotion in animals. That would cut 50% or more of the world's AB usage and is certainly the easiest thing to try.
It would appear that the science is still not decided on this. If the resistance is carried on plasmids, it may disappear relatively rapidly from a population if the selection pressure for it is removed:
> The presence of ascorbate induced a 50-75% decrease in minimal inhibitory concentrations of different antibiotics for resistant strains. When ascorbate is added, formerly subinhibitory concentrations of penicillin or tetracycline have an increased inhibitory effect on resistant strains and even induced the death of 25-93% of the initial population. These results suggest that ascorbate can induce the loss of several plasmids of S. aureus, and that the levels of antibiotic resistance are also affected by the presence of this compound.
Vitamin C saved the day again! This level requires intravenous treatment though, something considered fringe science at this moment for lame reasons.
> Antibiotic resistant genes don't just disappear when we stop using antibiotics. They will remain in the gene pool effectively forever at low levels.
They very likely already were in the gene pool before we went on the antibiotics binge. Remember that our antibiotics were discovered in nature. They're used by other species to fight bacteria. Penicillin was used by moulds for unknown millions of years before we coopted it.
Since bacteria multiply within days or even hours, evolution works very fast in them. This is why they develop resistance within a few decades. How fast they would lose the resistance if we stopped is AFAIK not something we can know until we do it and see what happens.
> They very likely already were in the gene pool before we went on the antibiotics binge. Remember that our antibiotics were discovered in nature. They're used by other species to fight bacteria. Penicillin was used by moulds for unknown millions of years before we coopted it.
Those antibiotic resistant genes likely weren't in pathogenic bacteria that affect humans though. And by improperly using antibiotics, we've created an environment where those genes cross species boundaries from where they originated.
What makes you think that pathogenic bacteria never contained antibiotic resistant genes before humans developed antibiotics? I've never seen a study that proves that antibiotic resistant genes were never in pathogenic bacteria.
Seems quite possible that antibiotic resistant genes were in all types of bacteria and the volume thereof responds to selection pressure.
You would be right if the antibiotics resistance was 'free', i.e. had no downside, but IIRC it does pose a cost of some sort so when not in an environment with antibiotics there is evolutionary pressure to lose the resistance.
Standard theory of evolution: organisms that spend energy on features that do not contribute to their fitness will be at a disadvantage in the long run compared to others of otherwise similar type that do not. The others will have more energy to devote to reproduction.
Citing notable research might help me better understand your claim; it is very possible that I'm missing something, but that's not how evolution works to my knowledge, adoption of substantially new genetic features is rapid, but refinements or removal of unrelated genetic code is slow if it is not impacted by the substantive mutations.
I wish I could have read more than the abstract, but maybe this [1] ... I would especially like to know what they mean by "Unfortunately, the available data suggest that the rate of reversibility will be slow at the community level"
> In hospitals, both modelling and analysis of the correlations between antibiotic resistance and variation in antibiotic use show that alterations in antibiotic use can cause rapid changes (in the order of days to months) in the frequency of resistance. By contrast, when the fitness cost of resistance is the main driving force behind its reversal, the rate of change is expected to be much slower (months to years).
I think what that means is that hospitals are able to create more selective pressure by rotating through different kinds of antibiotics.
Said antibiotics are most often used for growth promotion, rather than to treat disease. That's becoming less common, but it's still a complex problem.
Yes and no. Said antibiotics are given because we know that the diet we're feeding them will cause them to develop infections. Cows aren't naturally healthy eating corn/soy.
For so many reasons, we need to end crop subsidies in this country.
It's been known since the 1950s that giving penicillin to livestock increases growth rate. Farmers have understood that and used it.
The antibiotics counter and interact in all sorts of ways we don't really understand (similar to how we don't really understand how the human gut works), but one of the ways they surely work is to counter problems caused by unsuitable living conditions.
Ending crop subsidies is probably a good idea in most of the world, but it certainly won't solve the overuse of antibiotics in agriculture.
In a way this is surprising to me. Shouldn't this destroy the gut flora, making it harder to digest/use the food? Then again, their entire gut flora might be antibiotic resistant...
Several of the antibiotics in use as feed additives act selectively, changing the mix of microorganisms rather than destroying them all.
I swear I'm not a shill for big ionophore, but this is something they do in ruminants, they shift the mix away from one that tends to lead to "bloat" when the animals are fed a lot of grain.
There are three reasons for using antibiotics in livestock:
1. Growth promotion
2. Prophylactic treatment (what you're suggesting)
3. The treatment of actual disease
Those go from the most to the least problematic - but I was mostly addressing the idea that a decrease in antibiotics would result in more animal products with bacterial contamination. For #1, that's not true, and for #2 it's not linear (infections in animals don't necessarily mean contaminated food products).
When we talk about reducing antibiotic use in livestock, it's usually focused on #1 and putting #2 in the hands of vets (prophylactic treatments when there's actual risk of disease). Rarely do we actually suggest banning #3.
> For so many reasons, we need to end crop subsidies in this country.
We also need to reduce consumption of meat, and work towards raising it more healthily. It should be common sense that an unstressed animal fed the diet it evolved to thrive on would yield better meat (e.g. grass-fed beef is objectively superior to beef from corn-fattened cows).
Of course it is more expensive to do this, but so be it. There are plenty of cheap sources of nutritious proteins; people could be directed to adjust their expectations. Of course, there are many deep pocketed industry resources that would be against such an adjustment, and there's a new deep-pockets friendly administration in charge, so there's little hope of this happening in the near future...
> [banning use of antibiotics in animal feed] may very well be counterproductive if it leads to infected meat spreading further.
This isn't why they put antibiotics in the food; it's because animals that eat bulk antibiotics gain more weight over the short term than animals that don't. Probably by killing off all the gut bacteria (except the antibiotic resistant ones) that might compete for nutrients. The animals are grown quickly then killed.
If an animal is actually sick it's treated differently, or simply destroyed. Even farmers who don't use antibiotics to stimulate growth will treat their sick animals!
BTW, the resistant gut bacteria we're talking about are the ones that get people sick, like E. coli.
Historically, a Chinese Medicine doctor was paid a retainer to keep their patients healthy. If a patient became sick, the doctor would not be paid until the patient's health returned. In a similar vein, a doctor that resorted to surgery was considered an inferior doctor. If he/she did their job correctly and helped their clients stay healthy, there would be no need to perform surgery.
Perhaps the current incentive for drug companies is the problem. They don't care if we stay healthy (which is what we all want), they only treat illness. In fact you could argue it's in the best interest of drug companies to keep us sick!
Maybe there's an alternative way to fund them, or an alternative way to deal with infections in the first place.
> If he/she did their job correctly and helped their clients stay healthy, there would be no need to perform surgery.
Sheer nonsense. This assumes that all surgeries are a result of poor preventative care which is trivially false.
Why does it seem like the folks criticizing the profit motive of pharmaceutical companies see no such perverse motive in those selling holistic treatments? I don't suppose that "alternative way of dealing with infections" happens to involve spending money regularly on herbs and acupuncture...
The top cause of death in the United States is heart disease, which is also the most preventable. Assuming surgical intervention is proportional to each cause of death's occurrence in the population, then yeah, maybe not all, but there would be significantly less surgery if there were more prevention.
However, doctors only contribute slightly to the status quo. The food industry more generally is to blame. As well as the government, for subsidizing unhealthy things.
> doctors only contribute slightly to the status quo. The food industry more generally is to blame.
Worth remembering whenever you see people comparing different healthcare systems based on longevity. Doctors are not always the primary factor in lifespans.
> there would be significantly less surgery if there were more prevention.
True but that's a considerably diminished claim from the OP's, which was that all surgery is a sign of doctor failure. Also, the unscientific treatments they linked to would do nothing to prevent heart disease, and may even discourage patients from seeking care that can help them.
> Why does it seem like the folks criticizing the profit motive of pharmaceutical companies see no such perverse motive in those selling holistic treatments?
One reason is to do with scale, and perceived credibility.
A pharmaceutical company is a vast organisation, which is able to invest huge sums of money in advertising, marketing, lobbying, PR, insurance, etc., and they have the benefit of being seen as having scientific rigour and government-administered quality assurance as inherent features of their offering.
This all means their products can be sold to many millions of people who trust the recommendation of their physician, even if it offers minimal benefit (e.g., antidepressants or statins prescribed to people who would benefit far more from lifestyle changes) or causes significant harm (see current spike in addictions and fatalities linked to opioid painkillers like Fentanyl).
A natural/holistic health practitioner is normally an independent small business operator, offering treatment to about 20-30 people per week, most of whom are generally healthy and just want to stay that way through good nutrition, reduced stress, better musculoskeletal alignment, etc.
Most natural/holistic practitioners make a comparable income to a contract software developer, and have no intention or ability (or more importantly, stock market pressure) to make any more than that. They rely heavily on repeat business and word-of-mouth recommendations, and therefore must consistently help their clients achieve an improved quality of life.
When people die whilst under conventional medical/pharmaceutical treatment, the general attitude is "that's sad, but hey it was a scientifically validated treatment, so, what else could we do?"
Whereas any natural/holistic practitioner who causes serious harm or death of a patient is subjected to thorough scrutiny by regulators and the media, and indignant sharing on social media. Those found to be negligent are quickly put out of business.
The crucial difference between the pharmaceutical industry and the alternative health sector is centralisation vs fragmentation.
Due to being vast, centralised entities, pharmaceutical companies can utilise mass marketing, lobbying and the perception of scientific validity and government endorsement to get their products into the hands of customers, and can continue to do so even if their products cause widespread harm [1].
But every natural/holistic practitioner is an independent operator servicing a small local area, and must work hard to provide a high degree of satisfaction to every customer every day, and face the reality that a single instance of serious harm will put them out of business for good.
The article you cited demonstrating the size of the industry only confirms that for the most part, practitioners are doing a pretty good job.
I'm not arguing that mainstream medicine and pharma shouldn't exist, or should even be dramatically changed. I happily use it, and find it very beneficial. But I also know from personal experience and simple market economics, that the oft-touted-here attitude, that most/all natural/holistic/alternative medicine is useless/harmful snake-oil peddled by charlatans to morons, is misinformed and mostly false.
You'd think the HN audience would be smart enough to actually look at evidence in regards to medicine, but there's a lot of Steve Jobs types who'd rather pay for a well placebo-ed death. I wouldn't mind if they weren't so insistent on preaching their bullshit everywhere they go and trying to pass it off as having any basis.
It's kinda disheartening to dedicate years and years to studying medicine, only to realize there's more money and satisfied patients to be made in hustling bullshit to new-age fools who will preach your crap right up until they die of an easily treated disease.
The problem is that we currently treat the two views as opposites, and people defending one discard all benefit of the other.
But it's true that you are happy to get surgery in case of a grave emergency, with heavy drugs and a strong hospital system. It's also true that the most widely used health system is actually very bad at keeping people healthy.
But no, all pharma-related practice is not a B-movie drug-dealer plot. And no, all alternative health practice is not hippie voodoo trying to cure cancer by placing hot stones on your third eye.
No, people are very bad at keeping people healthy. Modern medicine has almost eliminated acute disease as a leading cause of death in the developed world and most of our health problems are now self inflicted. Evidence based medicine has extended the average lifespan by decades despite our best efforts.
Alternative medicine may not all be hippie voodoo but it all has one thing in common: all the data all of its salesmen have gathered before profiting hand over fist has amounted to nothing. All of the data gathered since then has amounted to nothing except the rare fortunate placebo effect, early deaths, and needless suffering. Once every few decades we stumble on a gem like meditation because medicine works by testing and gathering data on everything, including folk remedies.
Our most common treatment for metastasized cancers is to inject radioactive material into our bodies in the hope that it kills cancer cells faster than healthy tissue! We will clearly do anything, so all any "alternative" medicine has to do is prove its effectiveness. There is lots of value in a variety of traditional remedies that utilize the antimicrobial properties of tea and honey or the blood thinners in many herbs, but that isn't going to make anyone any money except your neighborhood grocery store or ethnic market. The rest of alternative medicine is just a bunch of snake oil salesmen whose livelihoods depend on rejecting the foundations of evidence based medicine.
There is no way you can judge "all the data", because "alternative medicine" is as broad a category as "languages of the world". It's everything that is not sanctioned. It includes a lot of things, good and bad, crazy and sane.
E.g., calibrated fasting is alternative medicine. Data on fasting has amounted to many things, including improved immune system response and reduced side effects of heavy drugs. There is nothing magical or mysterious about it.
It is "alternate medicine" because you won't see most doctors recommend fasting to your typical patient. There is no moral judgment in that. I'm not saying it's good or bad, it's just a fact.
You have one vision of "alternative medicine", and it's limited to a stereotype. I understand why. The louder champions of many alternative medicines use pseudoscience, repeated by stay-at-home mothers on Facebook. It does crash the credibility of much interesting information.
However, dismissing what can be valuable information because of that means missing opportunities. In particular, a lot of chronic diseases have treatments outside of mainstream medicine that make living with them much more bearable.
Again, it's not exclusive. You don't have to reject one to get the other. But trying a few things outside of ones own bubble can get some wonderful unexpected results.
I make a point of trying many such things. I'd say 9 times out of 10 it's a waste of my time and money. Sometimes it's even unhealthy. But when you do find something that brings strong benefits, it's usually really worth it.
It's a shame one has to do it that way. I'd much rather have professionals advise me on those. That's why it's so important not to close the door.
The "all alternative medicine is bunk" crowd would do well to consider the story of the 2015 Nobel Prize in Medicine winner Youyou Tu [1], who demonstrated (yes, using modern medical research techniques) the efficacy of a traditional Chinese medicine remedy for treating malaria.
Given stories like this, I fully expect that whatever turns out to be the next widely-adopted antibiotic treatment, will be something that has long been used in traditional or alternative medicine.
One promising candidate is Pulsed Electromagnetic Fields [2].
That is a bit of a mischaracterization. There will always be anecdotal evidence of unscientific or proto-scientific approaches working some of the time. The problem is that overwhelmingly most of the time they don't. That malarial treatment for example was not "long been used in traditional medicine." It was discovered in an ancient text and not part of contemporary traditional Chinese medicine, which had "forgotten" it and replaced it with ineffective treatments, due to its fundamentally unscientific nature. It was only through science that they separated what actually works from the vast bulk of placebos.
Wormwood-based remedies for parasitic infections have been consistently used in folk medicine in Europe and Asia for many centuries. Is it really such a simple truth that it had been "forgotten" and "replaced ... with ineffective treatments"? Or could it be something more nuanced, like that malaria ceased to be a problem in areas where this particular wormwood-based treatment had previously been in use? Regardless, wormwood-based remedies have remained widespread in various forms.
Yes, of course it took modern medical research practices to convert it into a potent form, demonstrate its efficacy and combine it with other agents for maximum benefit against malaria.
But it also took someone - one person - to be willing to go against the tide of vicious contempt towards traditional medicine like that which has been on full display upthread here, and painstakingly study the ancient literature and see where it might lead.
That's one person who I'm quite sure would not ever have said anything like "I wouldn't mind if they weren't so insistent on preaching their bullshit everywhere they go and trying to pass it off as having any basis".
It's all very well to casually dismiss all non-mainstream remedies with "the problem is that overwhelmingly most of the time they don't", just as it's all very well to dismiss every new startup or technological innovation as being highly likely to fail. At least 95% of the time you might be right, but the time you're wrong is when it matters most, when you're talking about a new remedy for malaria, or, perhaps one day, a remedy for antibiotic-resistant bacterial infections.
Please just keep it in mind. Millions of lives saved already, because one person was able to suppress the temptation to sneer and let their curiosity win out.
I think you missed the rest of my post, especially where I ended:
> There is lots of value to a variety of traditional remedies that utilize the antimicrobial properties of tea and honey or blood thinners in many herbs but that isn't going to make anyone any money except your neighborhood grocery store or ethnic market. The rest of alternative medicine is just a bunch of snake oil salesmen whose livelyhoods depends on rejecting the foundations of evidence based medicine.
Perhaps we are working from different definitions of "alternative medicine." Wikipedia defines it: "Alternative medicine or fringe medicine are practices claimed to have the healing effects of medicine but are disproven, unproven, impossible to prove, or only harmful." I also have to acknowledge WebMD's definition which is "Alternative medicine is a term that describes medical treatments that are used instead of traditional (mainstream) therapies." I use a middle of the road definition that creates a (perhaps false, perhaps not) dichotomy between evidence based and alternative medicine. Both are broad fields but it boils down to this: when there's enough interesting anecdata or popularity for a therapy that is "alternative," someone eventually pays for a small study to examine it. When enough such studies are done with positive (or at least no conclusively negative) results, metastudies are compiled and eventually the science trickles through evidence based medicine until it hits medical textbooks.
With extended family who are old Soviet trained doctors, I am exposed to a lot of folk remedies, what you would call alternative medicine, but a quick search on Google scholar reveals many peer reviewed studies explaining why many of these remedies work and examining whether there are statistically significant effects. There's not enough profit motive for the pharmaceutical industry to spend nine and ten figure sums on something they can't patent and sell for billions but that doesn't mean we don't know how various tars slow down skin cell replication and help with psoriasis, how antimicrobials in tea, garlic, and honey work, or that many forms of fasting have strong evidence in life extension, let alone general health. This is all evidence based, not alternative, medicine and lumping it all in with reflexology, chakras, faith healing, or magnet therapy is a disservice to our existing scientific knowledge and legitimizes a snake oil industry.
Thanks - I should have been more specific about purely surgical treatments as I was with the other two groups so as to maximise the benefit of your knowledge :)
The context of this thread is more about drugs and there's a lot of discussion about pharmaceuticals so that was what I was really asking about.
But I guess radio/chemotherapy is on the list, even though I'm uncomfortable with the amount of collateral damage they do to the patient in the process (I've seen it happen, so it's a painful topic).
Look - I get the point you are trying to make.
If you discount infectious diseases and surgeries, there are a lot of things off the table as far as a 'cure' full stop goes.
But that is incredibly disingenuous to medicine. You're removing all the things we do incredibly well, to the point where infectious diseases have, for the most part (at least until we reach the disaster warned about in the article we're commenting on), ceased to be a major burden of disease in the modern world.
That is just remarkable. The reason people are able to make claims like 'all we do is treat sick people, we don't cure them' is because the things that we do well, we do so fucking well it isn't a major issue anymore.
I know very well that my job in a modern first world country hospital revolves largely around treating people with chronic diseases, and that for the most part, I can't do much for them.
For example, when I work in the Emergency Department (most of my work these days), it is rare for me to 'pick up' a patient who has never had a hospital admission. Most people swing in and out, and in and out, until one day they don't make it out again.
They have chronic diseases caused by a lifetime of either 'just living' or living hard (smoking, drinking, eating too much) and the accumulated damage is BEYOND modern medicine's ability to fix it.
Personally, I think that for many of these chronic diseases, we are beyond any quick fix ever being developed - I envisage a future world where we literally grow someone a new body and attach their head to it so they can go again. The damage occurs at such a cellular (and slightly above that) level, and is so widespread, that it is beyond my comprehension how any treatment we are looking at could fix it. Although I am an optimist: who would have ever thought we would have the biological treatments we have now for certain diseases, modern miracles that they are?
But we are still WINNING - as pointed out elsewhere in the comments, we now have a cure for Hepatitis C.
Hepatitis C is a horrible condition. When I was at med school (graduated 2013) we were taught the 'Rule of 20' for Hep C: 20% clear it; of the remaining 80%, 20% have no issue; of that remaining 80%, 20% go on to develop fulminant liver failure. Now we can cure the fucking thing. Absolutely amazing. And there's talk of stem cell injections for damaged cardiac muscle; and since I left med school the average life expectancy for malignant melanoma has risen from <12 months to close to 4 years and growing. This is remarkable, and anyone who says differently doesn't understand the facts, and is being willfully ignorant of the enormous efforts of thousands upon thousands of people who work tirelessly in labs and hospitals all over the world to bring about a better quality of life for the sick.
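Read literally, that mnemonic compounds as a product of fractions. A back-of-envelope sketch (the numbers are the mnemonic's, not clinical data):

```python
# 'Rule of 20' for Hep C, compounded as taught (a teaching mnemonic, not clinical data)
infected = 1.0
cleared = 0.20 * infected        # 20% clear the virus
chronic = infected - cleared     # 80% remain chronically infected
no_issue = 0.20 * chronic        # of those, 20% have no issue
at_risk = chronic - no_issue     # 64% with ongoing disease
fulminant = 0.20 * at_risk       # of those, 20% develop fulminant liver failure
print(f"{fulminant:.1%}")        # roughly 12.8% of everyone infected
```

So by the rule's own arithmetic, roughly one in eight infections ends in fulminant liver failure - which is what makes an outright cure such a big deal.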
The point was that mainstream medicine is good for structural and acute problems. The majority of other stuff it probably makes worse, and there is low motivation for doing it better, given that chronic diseases keep the business running. With cancer, the price of the medication can destroy not only a person but entire families.
That's because people in general, probably you too, are willing to take the tradeoff for resolution of acute problem over long term consequences. Most people are willing to pay for expensive cancer treatments even with the socioeconomic consequences because the alternative is death. Now whether they should or not is a different discussion altogether. Every medical intervention has associated risks and side effects - from the most routine blood draw to a simple laparoscopic hysterectomy to the antipsychotic to treat your schizophrenia. We are willing to take those risks because we decide that it is worth it.
And I would be careful to not make sweeping generalizations about how mainstream medicine makes chronic problems worse. First, most chronic problems are due to population and socioeconomic determinants of health that are often well out of the scope of healthcare providers at least in the inpatient setting. Second, even with chronic problems we have made advances: think about insulin for diabetes, antihypertensives for blood pressure, or statins for heart disease.
That is why I said the majority of stuff; some treatments are clearly good, like insulin for type 1 diabetes. For type 2 diabetes, however, insulin also kills patients through hypoglycemia (as the body adapts to its symptoms) in what I deem to be unacceptable numbers, while there may be more promising treatments in "alternative medical circles". Statins are useful only for a small subset of patients - clearly overprescribed, just like antibiotics - and lead to diabetes, dementia and other adverse effects.
The points you made about the causes of those diseases are irrelevant when we talk about treatments. The reasons for a disease are not the reasons for clearly suboptimal treatments. And you certainly can't justify destroying a family to lengthen the life of a cancer patient by a few months, maybe a year, in the general case?
> ... are willing to take the tradeoff for resolution of acute problem over long term consequences
Meh... the majority of acute treatments have very low long-term consequences. Antibiotics are not one of those, as evidence shows that gut flora may take a very long time to recover (or may never recover) even after a single course (and we know that dysfunctional gut flora is bad). But I can probably take a bet that a few doses of ibuprofen will not harm me at all. What I am saying is that standard medicine does have many phenomenal things, but people should probably look elsewhere for help with the stuff it doesn't handle well at all. This requires everybody to be very well informed, which is not something most humans have the time or motivation for.
> We are willing to take those risks because we decide that it is worth it.
No, "we" are taking those risks because doctors tells us its safe and that risks are trivial or non existent (while at the same time bashing supplements as dangerious?!). "First do no harm" should probably be deleted from mission statement. You don't really think that random person knows anything about medicine or that it questions doctor ? Its mostly buisnis only.
You're doing a lot of hand waving that is not only unbacked by the evidence but smells strongly of drinking the Kool-Aid of alternative medicine.
Modern medicine not having tools to fix chronic disease says more about the second law of thermodynamics than it does about the state of modern medicine, and thinking that alternative modalities (which have been tried and tested ad nauseum) will do any better is complete madness. But you're welcome to waste your money if you wish.
Now, when you get to the price of cancer treatments: fortunately I live and practice medicine in a Western country with a universal health system, so I don't have to have discussions with patients about whether they will be able to afford a treatment - everyone gets everything they need. The US would do well to introduce a good system here, but apparently that smells too much like communism... even if it does create the horrible situations you allude to.
> But that is incredibly disingenuous to medicine. You're removing all the things we do incredibly well - to the point where infectious diseases, for the most part - until we reach this disaster warned about in the article we are commenting on - have ceased to be a major burden of disease in the modern world.
Please don't take it this way - I am in no way diminishing the achievements of modern Western medicine. To say that it has saved the lives of millions of people would be a gross understatement and this fact should rightly be celebrated. I was 'discounting' those things only because they are so obvious that you would frankly have to be dumb not to acknowledge them.
Surgery is a more complicated topic, of course - someone once gave me the example of a doctor explaining to a patient due for heart surgery that they were saving some veins in the patient's body as donor tissue for when they would inevitably be back for a repeat. So it's not exactly ideal, right? Yet such surgery is proclaimed a success.
Similarly, when the 'cure' involves cutting away parts of the body as a lesser evil (or chemo/radiotherapy, which is similar in a way), then there is also much room for improvement?
> For example, when I work in the Emergency Department (most of my work these days), it is rare for me to 'pick up' a patient who has never had a hospital admission. Most people swing in and out, and in and out, until one day they don't make it out again. They have chronic diseases caused by a lifetime of either 'just living' or living hard (smoking, drinking, eating too much) and the accumulated damage is BEYOND modern medicine's ability to fix it.
This is the real point that I wanted to get to. You can go a long way with correct nutrition and a healthy lifestyle, and yet there seems to be relatively little attention paid to it (I don't consider some leaflets in the GP surgery, or an article here and there in the press, adequate attention). The same goes for what you said in your earlier comment about early diagnosis of a condition before it has a chance to really set in and cause irreparable damage. This is not a problem for individual doctors to solve; it should be addressed by medicine in general as well as by government policy (and consequently with a lot more money than is currently being invested in it, it seems). Otherwise what you are doing is the same thing we call 'firefighting' in IT - constantly patching up the symptoms and never fixing the root cause of the issue.
And it is here that Chinese medicine could have the biggest impact (alongside Western medicine rather than in opposition to it), with its emphasis on prevention, in-depth and continuous diagnosis, tailoring the treatment to the individual, and holistic treatment of the patient (as an interconnected process rather than a collection of individual parts to be treated or removed as necessary). Uneducated people immediately think of acupuncture and herbs as being what Chinese medicine is about, but these are only tools, and are considered second-class medicine anyway compared to preventing illness and enhancing health.
> And it is here that Chinese medicine could have the biggest impact... with its emphasis on prevention, in-depth and continuous diagnosis, tailoring the treatment to the individual and holistic treatment of the patient (as an interconnected process rather than a collection of individual parts to be treated or removed as necessary).
That is exactly what medical primary care providers do, up to the point of the magical "holistic treatment", where your definition turns muddy. Clearly, doctors understand that the body is an interconnected process. The only difference is the treatments you pay for are actually studied scientifically to validate their efficacy.
>Surgery is a more complicated topic of course - someone gave me an example once of a doctor explaining to a patient due for heart surgery that they were saving some veins in their body as donor tissue for when they would inevitably be back for a repeat. So, it's not exactly ideal, right? Yet such surgery is proclaimed as a success.
Given the alternative of dying now, it may very well be a success to get whatever time that surgery has granted them even if they'll need repeat surgeries in the future. As the physician you're addressing said- we would love to grow these people new hearts or entire bodies but our medical knowledge hasn't yet reached that point.
As to the bit on correct lifestyle, you're speaking as if this isn't something that all relevant health care professionals are already preaching. The unfortunate reality is that most of our patients would rather slowly kill themselves than adopt the lifestyle changes we recommend. Even without our input they're bombarded daily with nonstop messaging about how they should be slimmer, eat better, and exercise more. Culture is powerful, and a lifetime of overeating and being sedentary isn't going to be remedied in a 15-minute conversation with a physician, despite our best efforts.
>This is not a problem for the individual doctors to solve, but it should be something addressed by medicine in general as well as government policy (and consequently a lot more money than is currently being invested into it it seems). Otherwise what you are doing is the same thing as we call 'firefighting' in IT - constantly fixing up the symptoms and not fixing the root cause of the issue.
I agree with you here, you need to incentivise people who would otherwise not plan ahead. We need to design our cities better, we need to get people more active, we need to improve social supports and interactions so that people's mental health is better. Sadly it doesn't look like much of the world is in a hurry to fix these problems.
I do however completely disagree with you in regard to Chinese medicine, which is, from the position of efficacy, utterly worthless.
Show me a well-designed study that proves otherwise and I'll eat my hat, but in my years of study and investigation of alternative medicine, the only study I am aware of that showed any usefulness of Chinese medicine proved that it was 'time with practitioner' that produced the improvement, i.e. talking to someone about your problems and having a sympathetic ear.
This makes sense, because so many problems and hospital admissions are psychogenically driven - people derive great comfort from being able to externalise their existential angst. But Chinese medicine is not medicine. Strip the bullshit and hand waving from it and call them therapists, and go to your Chinese therapist - but don't call it medicine, because they aren't treating shit.
Over there, it's common for (real, medical, certified) doctors to prescribe (as in, write on a sheet of paper and stamp) specific herbal teas. Which are sold in pharmacies. And they do help for sure.
Yes, you're right that adopting the philosophy of ancient China verbatim isn't a perfect fit for our modern world. But the sentiment - that doctors should make more money when we're healthy than when we're not - is worth thinking about. Currently the entire medical field is reactive: you get sick, you get treated. I've never heard of doctors administering preventative care. We don't have a profession that fills the niche of "I'm feeling fine now, but I want to take the steps necessary to prevent things that might ail me in the near future".
There is a whole field of preventative medicine (also called population medicine) that aims to reduce the burden of disease.
Ever seen an advert for exercise? Seen warnings about levels of obesity? Seen an anti smoking sign? Received a vaccination?
They are all driven by doctors, who aim to change behaviour to reduce the burden of disease.
There is enough disease and sickness to keep all the doctors of the world employed even if we were actually able to fix all the problems people currently have.
The major problem is, we don't have the tools to fix the issues that are major in Western societies: people suffer chronic diseases, and the problem with chronic diseases is that they are very hard to treat (degenerative bone disorders, arthritis, emphysema and bronchitis, cardiovascular disease, neurodegeneration, diabetes and diabetes-related complications, and many, many others).
Don't drink the Kool-Aid that gets thrown around... the problem is that changing behaviour is hard - no-one wants to do it until they start suffering, and by then it's too late. The idea that doctors are sitting there enjoying a windfall from patients they allow to get sick just so they can keep bread on the table is quite frankly extremely offensive and ignorant.
> I've never heard of doctors administering preventative care.
"Making every contact count", a UK initiative to make sure that doctors[1] use every opportunity to promote smoking cessation, alcohol use reduction, weight loss, and other lifestyle adjustments.
Though note that the NHS does, to some extent, incorporate that idea: doctors are paid by the state and not by individual sick patients. Hence doctors don't have an incentive to 'keep their patients sick, but alive', which could be argued to be the case in systems such as Germany's or the US's, where doctors are paid for each individual visit by each individual patient.
> "Making every contact count", a UK initiative to make sure that doctors[1] use every opportunity to promote smoking cessation, alcohol use reduction, weight loss, and other lifestyle adjustments.
Wow, that's truly the nanny state at work. I like tobacco, I enjoy smoking and I'm working on my weight. I hate it when I go to a physician's office and get offered the standard blurb about how smoking tobacco is apparently only slightly less lethal than mainlining cyanide with an arsenic chaser.
Frankly, that's a major reason why I don't bother going to see one. My body's working well enough, and while it'd be nice to know that my organs are in good condition, I don't want to deal with someone who doesn't respect me.
Can your parents give you free smoking cessation tools such as nicotine replacements? Can they refer you to weight loss groups, and give you 3 free months?
Why would you need a free smoking cessation tool when your cigarettes cost you several euros or pounds per day? Any smoking cessation tool is already less expensive than your vice.
But then, of course, my comment was at least in part sarcastic. Advice can always be a good thing; on the other hand, my first reaction to the idea is that I'd like my doctor to listen to whatever health issue I have and solve it, rather than giving me paternalistic advice. As if we didn't have enough of that already.
There is some evidence to suggest that annual physicals do more harm than good [1][2] (although it's fair to add that there are competing viewpoints on that).
This sounds like lifetime insurance. When you're healthy, you pay premiums. When you're sick, the insurance company has to spend a lot of money treating you. In a well functioning market, the insurance company would have great incentive to engage in preventative care to keep you healthy.
I don't think that's a very good point. For people to be willing to pay the ever-rising costs of the best drugs, they are going to have to work, and drug companies aren't going to make much money once they've lost the trust of their customers. Perhaps the funding is problematic in one way or another, but I doubt it's in drug companies' best interest to keep us sick.
This isn't a hypothetical problem but a well known issue within the pharmaceutical industry. The drug companies don't purposefully make drugs that cause harm in order to sell more of their other products but the economic incentives in the United States healthcare market are extremely misaligned. The end result is, essentially, an industry that "keeps us sick" to maximize profit not because of malice, but because of a market failure.
Market size and total profit over the lifetime of the patent is the primary factor that is considered at every step from random screening to clinical trials. Spinning up a clinical trial requires many tens or hundreds of man-years of work by very specialized and hard to find scientists/doctors before you even sign up your first human subject so even the biggest pharma companies have limited bandwidth. This forces them to be very selective, even when they have multiple high-potential targets, and that means choosing to run the clinical trials for drugs with the highest potential ROI and lowest regulatory approval risk. For a variety of reasons, cures are usually much more complex and difficult to develop and get approved than simple drugs that treat symptoms and since the US has a complex, opaque network of insurance companies instead of a single payer, no one is in a position to negotiate prices that maximize healthcare ROI at scale.
There is little incentive for pharmaceutical companies to develop cures for diseases when they can provide a "good enough" quality-of-life boost for the entire lifetime of the patient and sell it for the exact same price. This means that oftentimes cures won't even be researched past animal testing, because the pharma company has a large pipeline of less risky, more profitable drugs to get through the FDA.
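The incentive argument above amounts to a crude expected-value screen. A sketch of that logic (every number here is invented purely for illustration; real pipeline models weigh far more factors):

```python
# Toy expected-value screen for drug candidates.
# All figures (probabilities, $bn sales, years, costs) are hypothetical.
def expected_value(p_approval, annual_sales_bn, revenue_years, trial_cost_bn):
    """Expected net value: chance of approval x lifetime revenue - trial cost."""
    return p_approval * annual_sales_bn * revenue_years - trial_cost_bn

# A chronic-symptom treatment sells for the rest of the patient's life...
chronic_tx = expected_value(0.12, annual_sales_bn=2.0, revenue_years=10, trial_cost_bn=1.5)
# ...while a cure shrinks its own market once the initial wave of patients is treated.
cure = expected_value(0.08, annual_sales_bn=6.0, revenue_years=4, trial_cost_bn=2.5)

print(chronic_tx > cure)  # in this toy model the chronic treatment wins the slot
```

The point of the toy is only that when trial slots are scarce, a screen like this systematically favors long-revenue-tail candidates; whether real portfolios behave this way is exactly what the replies below dispute.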
This is wrong on a number of levels. I've worked for a couple biotech companies and typically the commercial viability of a drug isn't examined until about phase 1. And even then it's a very rough estimate. I highly doubt any company is calculating an ROI at the screening stage.
Second, it's typically the science that limits the effectiveness of therapy, not the potential ROI. The reason why many drugs only treat symptoms is because that's the best science can do right now.
Third, curing a disease can often have a much better ROI than just treating symptoms forever. Look at Gilead's hepatitis C cure. Before it came to market the drug revenue for that disease was maybe $2 billion. The first year that Gilead's drugs were on the market, they sold $12 billion. The cure was way more profitable.
This. Especially for antibiotics, there aren't really many levers you could use to manipulate the efficacy of a new drug - all of that is driven by the mechanism of action. There's really no "Let's make it 25% less effective..."
I think you are both right; both forces (and more) are at work. Market effects are not about what individuals plan or want; they're about system outcomes.
If, just as a thought experiment, it were possible to treat a disease quickly with a cheap intervention - say, a herbal drug that is patent-free and costs mere cents per liter - there would be no business in it, so no company could viably pursue it. The market selects for treatments that generate a lot of profit. The idea that this always, or even merely "usually", translates into the best option from a public-health point of view only holds if you close your eyes and make lots of assumptions about how well the market magic works - in a market with extremely unequal information and very high pressure on customers (when you are sick).
On the other hand, you are right that symptom treatment usually is the best we can do. The body is built during the early years, and from then on it's maintenance - including what the body itself does, not just what the doctors do (it does not regenerate most structures, only repairs them). What medicine offers is more and more horrible hacks, workarounds, kludges and patches, delaying the long decline more or less well. Most drugs don't even repair anything; they just push levers and buttons in the body, and nothing structural happens unless the body does it on its own. So if the structures don't regenerate, the button-pushing drug has to be taken lifelong.
That is so fundamental to complex life forms that current medicine simply cannot work against it - maybe it never will, because replacing a structure in the grown body is much more difficult than growing it along with everything else. For example, finding the path for new axons after a spinal injury is much harder than building everything together during very early development. Medicine would not just have to achieve what nature already does (when the body is assembled) - it would have to become far better. Repairing something is often much more difficult than starting from scratch.
> If, just as a thought experiment, it was possible to treat a disease quickly with a cheap drug or intervention, for example a cheap herbal drug that is available patent-free and for mere cents per liter, there would be no business in it, so no company could viably pursue this.
At that level of cheapness, individual patients can buy it themselves. If it became popular enough, somebody would research it. That's already happened with many Traditional Chinese Medicine herbs. Of course, they turned out to be ineffective but one of them might have worked.
Uhm.. it won't be MADE. It won't be RESEARCHED. It won't be MARKETED. It won't go through CLINICAL TRIALS.
I wonder how low the average HN reader IQ has gotten lately that my insightful comment got downvoted and that I have to deal with crap like yours. Way too many total little teenagers and idiots on this site now.
> This is wrong on a number of levels. I've worked for a couple biotech companies and typically the commercial viability of a drug isn't examined until about phase 1. And even then it's a very rough estimate. I highly doubt any company is calculating an ROI at the screening stage.
Either you're using a wildly different definition of "biotech companies" than the vast majority of people in the life sciences industry or we're talking about entirely different things. Comparing biotech ventures to established pharmaceutical giants and their research pipelines is like comparing a seed stage social network startup to Google: it's a nonsensical and uninformative exercise. The entire biotech industry exists because the pharmaceutical conglomerates are willing to pay hundreds of millions to billions of dollars for a single promising patent acquisition as long as VCs, LPs, and Joe Shmoe's pension shoulder the 99% failure rate. When you're a biotech, you're not afraid of cannibalizing existing revenue or wasting precious resources better allocated elsewhere and pharmaceutical companies are perfectly willing to let everyone else light their money on fire. The successful biotech investors that have decades of positive returns most certainly look at market potential when deciding where to invest but even they have to make wild shots in the dark because they lack the massive amount of data that pharmaceutical companies work with.
Oh and "typically the commercial viability of a drug isn't examined until about phase 1" is downright false unless "until about" means years in advance. Any biotech company that hasn't figured out whether to treat their potential targets as normal or orphaned drugs by the time they get to animal testing has no business developing medicine and spending other people's money in the process. That most basic step necessitates analyzing the market potential of drugs.
> Second, it's typically the science that limits the effectiveness of therapy, not the potential ROI. The reason why many drugs only treat symptoms is because that's the best science can do right now.
That's obvious, given that ROI doesn't affect biochemistry. There is a whole universe of molecules, but which ones we actually find out about and bring to market depends on a complex equation balancing luck, science, and a bunch of actuaries. Biotech companies can ignore that last one because it's not their cash that is going up in flames.
> Third, curing a disease can often have a much better ROI than just treating symptoms forever. Look at Gilead's hepatitis C cure. Before it came to market the drug revenue for that disease was maybe $2 billion. The first year that Gilead's drugs were on the market, they sold $12 billion. The cure was way more profitable.
I don't think you could have chosen a worse example to make your case. Sofosbuvir was a once in a lifetime discovery made by a biotech startup, Pharmasset, that resulted in the largest pre-approval acquisition and one of the fastest FDA approvals for a major drug in the history of medicine. That company was started in the late 90s with the express purpose of curing hepatitis C and the second it was clear that Sofosbuvir was on the road to breakthrough therapy designation by the FDA, every single pharma company was scrambling to acquire it. The only way to get such a designation is if the existing treatments had a high toxicity, caused many unbearable side effects, and weren't that effective in treating chronic infections with a rapidly oncoming patent cliff. For all intents and purposes, there was no consistently effective treatment for hepatitis C, let alone a cure, before Sofosbuvir.
Oh, and what happened next? Gilead lost almost half of its market cap from its 2015 peak after missing quarter after quarter of earnings estimates and downgrading revenue projections for its entire antiviral portfolio. If competition continues chipping away at their hep C products, their 2011-2015 gains (80+% of their peak market cap) will be wiped out. That's what a cure gets you: a shrinking market and brutal competition that wants to cash in before the well runs dry.
1. Yes, companies will target certain diseases based on their economic viability. However, the work they do is about 5% of what's required for a fully vetted molecule. Take the number of patients, multiply by some price pulled out of the air, and they can slap "a $5B potential market" on their pitch slides. Until they actually have some clinical data and start to dig into the pricing/reimbursement environment, those numbers are just wild-ass guesses. How do I know? My old job was doing commercial assessment for in-licensing and phase 2 molecules. Even the big companies had to shut down programs post-phase 1 because their initial commercial assessment was way off the mark. To say that hundreds of clinical programs never make it past screening because they don't have a good ROI isn't true. Companies just don't have enough data that early to make those kinds of calls.
2. Sure, Gilead lost almost half their market cap in 2015 when they were missing estimates. But that was simply because the earnings goals had become pretty ridiculous. The fact remains that the company made tens of billions of dollars over a couple of years selling a cure. Both of their HCV drugs sold more in two years than most drugs sell over their entire lifecycle. They were simply a cash cow.
The increasing cost relates to drug companies running out of useful simple molecules. If you count the number of new drugs each year that treat common problems (i.e. more than 100,000 patients per year in the US) and are not closely related to an existing drug, you get a very small number.
Sure, we have a lot of drugs related to morphine, but most of them are not different enough from morphine to be worth high prices. When Viagra goes generic at the end of the year, the hurdle for the next male 'enhancement' drug ratchets up.
Well, sort of. It would depend how the government negotiates the payments given to the companies that provide medical services. If they stick with the standard fee-for-service model we mostly use now, there wouldn't be much difference.
How so? Even under fee-for-service a single payer org has an incentive to provide preventative services so they don't have to pay for problems later, no?
Well, there are two issues. (Two I can think of, anyway; there may be more.) One is that without universal insurance via a single-payer system, there will be people who aren't insured and can't afford preventative services. So, single payer solves that problem.
The other problem is that in a fee-for-service model, health care providers get paid a lot more for major procedures than minor ones, and they may get paid again to fix any complications that arise. So, there isn't a good incentive structure to reward fixing problems quickly and cheaply. Whether single payer fixes that problem will depend on how it's implemented. (I think there was an experiment done a while back where Medicare stopped reimbursing hospitals for patients who were re-admitted due to infections following surgery. The result was that the rate of surgical complications went down.)
I kind of wonder if maybe there should be a bounty program for fixing minor problems before they turn into major problems. Just this last week, I had an adenoma removed from my colon. I haven't seen the bill yet, but it was a pretty routine, minor procedure. If I hadn't had it removed, it would have probably become cancerous within the next year or two (according to the doctor that removed it), and if it were any bigger it might have required a much more invasive procedure to remove. So, fixing it now saves a whole lot of medical expenses down the road and saves me a whole lot of unpleasant experiences. It seems kind of a shame that the doctor who removed my adenoma isn't rewarded (except in the knowledge of a job well done) for that.
That's called a "capitated" payment plan, and we already use those on a lot of contracts with physicians. They create perverse incentives (withholding care), and tend not to work well (it implies that physicians are capable of significantly improving patient outcomes over their current level of performance, and choose not to.)
To get an insurer to want to maximize your health care, you have to guarantee that the insurer saves money if they make you healthy, and loses money if they don't.
If patients can switch insurers at any time, then an insurer that spends a lot of money on preventative care only sees a small fraction of the benefit; they lose it whenever the patient switches away. And if you spend 'too much' money on preventative care, you'll probably end up losing customers to cut-rate insurance providers with lower premiums.
If instead we could measure health perfectly, then you could set up a system where if a person is made unhealthier by an insurer, then the old insurer has to compensate the new insurer whenever someone switches. But we don't really know how to measure health, if by 'health' we mean having a low probability that a person will become sick in the near future; we usually can only notice once a person becomes sick.
> In fact you could argue it's in the best interest of drug companies to keep us sick!
This assumes there's only one drug company. If company A's medicine heals me and company B's medicine does not, then I choose company A's and company B goes out of business.
So no, a drug company does not have an incentive to keep me sick.
The current system is imperfect, but it's not outright fraudulent. In general, if a company claims "Drug X gives 10% more pain relief than aspirin", they have to support that claim with evidence, which can be reviewed (and contested, if the methodology, outcome, or other factors don't support the claim).
That's not to say there aren't improvements that can be made: All Trials [1][2] is one campaign to improve the system.
This supposes that the docs are the ones responsible for the choice of giving antibiotics or not at the population/industry level. This is already questionable.
A second thing your statement supposes, is that people prescribing drugs in first-line care have the same agenda as people doing medical politics, or at least are not concerned about the well-being of their patients. Every profession has its black sheep, but I think you severely underestimate the drive docs have to see their patients do well, and the self-esteem hit we get when things go wrong.
Whenever you assume the innate goodness of someone overrides their natural desire to earn money, you end up with a problem. With doctors, having to legislate the gifts received from pharmaceutical companies is an obvious counterpoint to the idea that the vast majority of doctors are 100% ethical and only make choices for health reasons. Clearly a lot of them were influenced by pharma money.
Alexander Fleming gave the following warning, way back in his 1945 Nobel speech:
> It is not difficult to make microbes resistant to penicillin in the laboratory by exposing them to concentrations not sufficient to kill them, and the same thing has occasionally happened in the body.
BBC did a very good radio segment about the penicillin discovery in their "50 Things That Made the Modern Economy" radio series: http://www.bbc.co.uk/programmes/p04pfn2z
A few important points about antibiotic resistance (as far as understood by a lay person that likes to read articles on the topic):
1) Resistance is inevitable. It doesn't matter if everyone finishes their prescribed treatment or not, bacteria will develop resistance to antibiotics one way or another.
2) Antibiotic resistance comes at a cost to the bacteria. In the absence of antibiotics, bacteria will lose their resistance. It is pretty unlikely that you will get infected with antibiotic-resistant bacteria when you bruise your knee in the dirt.
3) Multi-resistant bacterial infections mostly occur in a clinical setting, where people are especially vulnerable to infections (people on a respirator, people with a central venous line). You can prevent multi-resistant infections just like you prevent normal infections: Use sterile gloves, isolate people, follow procedure protocols precisely, etc.
Multi-resistant pathogens are a problem, but it's far from the doomsday scenario painted in those "it's the end of antibiotics" articles.
> 1) Resistance is inevitable. It doesn't matter if everyone finishes their prescribed treatment or not, bacteria will develop resistance to antibiotics one way or another.
Not wrong, but it paints a pretty black and white picture of something that has a lot of shades of gray. The way we use antibiotics strongly influences how quick antibiotic-resistant bacteria evolve.
With very good usage discipline you could even try to kill all resistant bacteria by use of different antibiotics before the bacteria become multi-resistant.
#3 - consider the lady who caught a "superbug" while she was treated in India for a hip fracture. Hospitals, and especially hospitals in India, will not use sterile gloves or isolate patients with hip fractures. This means patients with common flu, injuries, or other benign health issues can also catch these bacteria, and a scenario where dozens if not hundreds of patients catch superbugs in an Indian hospital is likely.
Be very careful about semantics; the statement is:
Bacteria will become resistant over time when exposed to antibiotics at concentrations insufficient to kill all of them. However, when the selective pressure of antibiotics is removed, they will "lose" resistance, as that selective pressure is no longer applied.
As the poster above mentioned, resistance to antibiotics is energetically costly: bacteria do things like increase the number of efflux pumps in their cell membranes to pump antibiotics back out. When plenty of nutrients are available because other strains have died, this is an acceptable tradeoff. Without that pressure, however, the strains with fewer defense mechanisms, which also need fewer nutrients to survive, will again outcompete them and become dominant.
Finally, just to reiterate: it is exposure at levels not high enough to kill all strains that causes resistance to develop. If you could maintain concentrations so high that no strain survives, resistance would never develop. Unfortunately, this is virtually impossible in practice, as adverse safety events would occur too frequently at those 'scorched earth' levels.
If you could ban an antibiotic, bacteria would lose their resistance to it in time. Resistance consumes a lot of energy, and the bacteria without it would simply outcompete the resistant varieties, driving them to extinction within thousands or millions of generations.
But as long as people keep using the antibiotic, the bacteria will continue getting more resistant.
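The dynamics being described here (a fitness cost to resistance that only matters once the drug is withdrawn) are easy to sketch in a toy model. Every number below is made up for illustration - a 5% growth penalty for carrying resistance, a 60% per-generation kill rate for susceptible cells under treatment - but the qualitative behavior is the one both posters are pointing at:

```python
# Toy competition model (illustrative numbers only) between a susceptible
# strain and a resistant strain that pays a small fitness cost. Under
# antibiotic pressure the resistant strain sweeps; without the drug, the
# cheaper susceptible strain slowly wins the population back.

def simulate(generations, antibiotic, cost=0.05, kill=0.60,
             susceptible=0.99, resistant=0.01):
    """Return the resistant fraction after `generations` rounds of growth.

    cost: relative growth penalty paid by the resistant strain
    kill: fraction of susceptible cells killed per generation by the drug
    """
    for _ in range(generations):
        s_growth = 2.0 * ((1.0 - kill) if antibiotic else 1.0)
        r_growth = 2.0 * (1.0 - cost)
        s = susceptible * s_growth
        r = resistant * r_growth
        total = s + r
        susceptible, resistant = s / total, r / total  # keep fractions
    return resistant

print(f"with antibiotic:    {simulate(50, antibiotic=True):.3f}")
print(f"without antibiotic: {simulate(50, antibiotic=False):.3f}")
```

Shrink `cost` toward zero and the resistant strain lingers almost indefinitely after the drug is withdrawn - which is why the size of the fitness cost, not just its existence, is the crux of the argument.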
Electricity generation companies are paid not just for the electricity they produce, but also for reserve capacity. Because that's the only way to keep the lights on reliably.
Seems like this would work for antibiotics and snake antivenoms.
An electric company doesn't have to invent a new plant design, or even build an entirely new plant, to have reserves; with antibiotics you do. Building reserve antibiotics isn't simple, and even holding back a known antibiotic doesn't guarantee it'll work when you need it. We kind of already do that with carbapenems, which are supposed to be the antibiotics of last resort.
The problem is the commercial model for funding and research, as a new antibiotic is guaranteed to be a commercial failure, and the risks are high.
If the EU, the US, and maybe China funded research - both at universities and in some way that made it economically interesting for commercial entities - there would be new drugs.
You have to remember the time constraint on drug profitability (~15 years for a US patent on a new molecular entity).
A new antibiotic would be held in reserve except in exceptional circumstances. Even if it was put to use on day one for extensively drug resistant TB and the like, there are too few patients to find a functional price point. It's sort of like the situation with many rare diseases.
My favored approach for solving this problem is using some sort of prize system (Sanders had a bill to do this for all drugs last year if you want to see a model). Effectively, the government offers $X billion for a new antibiotic meeting criteria Y and Z, ensuring that the private sector has financial incentive. In most plans, the drug IP would become public, allowing for immediate low cost generics.
If you develop a new antibiotic then it won't be used, it will be held in reserve for when current antibiotics no longer work. We don't want to expose bacteria to new antibiotics until we have to. This is why it would be a commercial failure.
A change to the model where the government(s) develop their own antibiotics so that they can sit on them might be in order. It's not much of a vote winner but is as essential as defence.
Developing them on a prize basis or as government IP is one way.
Another way is to shelve the start of the patent window along with the drug. If we know it works but it's not needed yet, don't start the countdown to patent expiration until it is needed.
The former perhaps lowers costs for the drug more effectively. The latter gives more incentive to the drug companies to keep doing their research.
Suppose you have a new antibiotic that effectively kills multiresistant pathogens. The correct action in this case would be to hold it back and only use it when all else fails. This means you sell at very low volumes, and even if the prices are astronomical revenue is fairly low. Now, in addition, you can sell it on the side in places with a less moral or more weakly enforced regulatory regime, and you end up with overprescription, undercompliance, and use on livestock. As a manufacturer, you can recover costs and make profit much more easily in the second scenario, so it happens whenever the pharma companies can get away with it, which is every time. If you were to force them to hold back the new antibiotic for last resort cases only, it would indeed be guaranteed to be a commercial failure, since volumes are not sufficient to justify bringing it to market.
Using a current example: fidaxomicin is not the most widely used therapy for C. difficile infections, but it is used - and admittedly it is very expensive. But there's also very little evidence of it being used in grey markets or for off-label purposes.
And I'm unconvinced it's destined to be a commercial failure, given Merck is pushing money into it.
Even something being used as a "drug of last resort" may have fairly high volume. For example, colistin, which is one of those drugs, is often used in combination with other antimicrobials to help combat resistance, and is pretty much universally available to hospitals in the U.S.
Give them the same patent term, but don't start the term until the drug is in use rather than when it's discovered. That way their patent window is just as long when it's really needed. Pay them some small lost-income type of fee to keep them from selling it for widespread use until it's needed.
Time to stimulate research into phage therapy instead of granting longer patents to antibiotics. The field seems promising with a long history in Russia but mostly ignored in the west until recently. https://en.m.wikipedia.org/wiki/Phage_therapy
Phages are being researched. This literally comes up in every single HN thread on antibiotics, so once more, posting my "Why Phages Aren't the Answer" shortlist. Note that I love phage therapy - this is the problems as seen by someone who doesn't think it's a dead end.
Phage therapy is neat, it really is, but there are a couple major issues:
- There is no such thing as a "broad spectrum" phage. You can't do empirical treatment using phages, and there's not really "off the shelf" phage therapy - it tends to be a bespoke creation for a particular infection.
- There are some serious regulatory problems, similar to those experienced by fecal transplant treatments. We're not yet really equipped to think about handling evolving, custom microbes as a treatment.
- Because of the first point, it's going to require considerably more lab capacity than most clinical settings currently have, and considerable delays until treatment.
- There are also some biosafety issues around phage prep, but those are easily solvable.
It's a great way to treat particularly resistant or hard-to-treat infections, but it's not a particularly great general solution. There's a reason it was abandoned in countries with easy access to antibiotics - antibiotics are just roundly superior in basically every respect.
As someone who used to work in the phage area: there are in fact "broad spectrum" phages. The problem is that the methods most groups use to isolate phages select for narrow-host-range phages. My group isolated hundreds of broad-host-range phages by using a better protocol.
You are right that phage therapy is very challenging under current regulations. If we want phage treatments we are going to need to change how drugs are licensed.
Do you have a link to a paper on that protocol? I'd be interested in reading more about it, because I've never encountered a phage-prep technique intended for clinical use that wasn't targeted.
I didn’t think the protocol was that amazing so it is just described in passing in our papers (see [1] for an example).
The basic idea is really simple. Phages come as both generalists and specialists (and all grades in between). The specialists grow on a limited range of bacterial strains, while the generalists can grow on a wide range of bacteria (even across genera). The specialist phages grow faster than the generalists on a single bacterial strain, as they are better adapted to their specific host. The problem is that when most groups isolate phages from the environment, they only use one bacterial strain at a time, so they end up isolating specialists rather than generalists; the generalists are just too slow to form visible plaques in the presence of specialists.
The solution to isolating broad-host-range phages is to use multiple bacterial strains in the enrichment and isolation process (we used up to 50 at a time). Under these conditions the generalists grow faster since they have more hosts they can infect (i.e. a specialist can only reproduce in one bacterial strain, while a generalist can reproduce in several). This tips the isolation process toward pulling out generalist phages with a broad host range. I have to say, when I entered the field I thought this was the way everyone isolated phages, as it is so obvious, but I was wrong.
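That selection argument fits in a couple of lines. This is my own toy model, not anything from the papers: I assume a specialist phage gets a bigger burst on its one host, while a generalist grows more slowly but can infect every strain in the enrichment culture, and the burst-size numbers are invented:

```python
# Back-of-the-envelope model of the enrichment trick: a phage's per-cycle
# growth is its burst size times the fraction of the host pool it can
# infect (1/n strains for a specialist, all of them for a generalist).
# Adding strains to the enrichment shifts the balance toward generalists.

def enrichment_winner(n_strains, specialist_burst=3.0, generalist_burst=1.5):
    """Return which phage type outgrows the other in an n-strain culture."""
    specialist_growth = specialist_burst * (1.0 / n_strains)
    generalist_growth = generalist_burst * 1.0
    return "specialist" if specialist_growth > generalist_growth else "generalist"

print(enrichment_winner(1))   # single-strain isolation -> specialist
print(enrichment_winner(50))  # 50-strain enrichment    -> generalist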
The major issue is that we don't really have the licensing framework for a living thing.
The "drug" keeps changing. The neat part of phage is that they co-evolve along with the bacteria they're feeding on, but it means there's no really consistent formulation. It's unique, and it's not fixed.
The same problem affects fecal transplants. "The healthy gut flora of a donor" is much harder to approve and regulate than "Famotidine 10mg, Calcium carbonate 800mg, Magnesium hydroxide 165mg" (the antacids on my desk).
In its apparent desire to blame Congress for the end of antibiotics, this article misses a big reason companies aren't developing new antibiotics anymore: they just don't work like they used to. Every new generation of antibiotic is effective for a shorter period than the previous.
This will come as a surprise to Merck, which recently bought Cubist to boost their antibiotics research arm, and which is actively funding research in that area.
Or GSK, which has partnered with a bunch of European universities to develop new antibiotics against gram negatives.
I don't think it will "come as a surprise" to any major public pharma company. They are all in the business of managing risk:
* that a drug test will fail trials
* that they will be beaten to market by a competing drug
* that a contamination in their products will cause damage to patients
* that R&D costs will not be recovered before the patent expires
* keeping R&D costs low enough to be competitive, but high enough to have a pipeline of future products
IIRC, general antibiotics have never lasted more than 18 months before we found bacteria that became resistant to them. Pharma companies know this better than the rest of us.
They are betting/hedging on getting to market first with a multi-drug resistant (MDR) bug killer. Everyone on Earth will need a new antibiotic as the current MDR bugs become more widespread. And the more widespread the current MDR bugs are, the more valuable a fresh antibiotic would be (it would be an effective monopoly).
18 months is the time until any resistance is found, but that doesn't mean we hang up the antibiotic. We do susceptibility testing, we use higher doses, we use combination therapy, etc.
> IIRC, general antibiotics have never lasted more than 18 months before we found a bacteria that became resistant to them.
This does not seem to be correct. With the most recently introduced antibiotics - levofloxacin, linezolid, and ceftaroline - resistance has developed very rapidly. But that didn't use to be the case; older antibiotics lasted much longer before resistance developed. [1]
I am not sure why this is. Are these newer antibiotics more fragile (i.e., easier to develop resistance to), or are they being misused more broadly? If it is the latter, then perhaps we should stop developing new antibiotics until better usage practices are enforced. It is stupid to spend a lot of money developing antibiotics only to throw it away by misusing them.
> general antibiotics have never lasted more than 18 months before we found a bacteria that became resistant to them
But this doesn't mean the drug won't be used. Drug resistance rates are often in the single percent of patients. You can have resistance out there, but still have a huge market.
They should be blaming doctors and patients for using them incorrectly. Doctors used to give them to people who just had a cold, for example. People insisted on being given some sort of medicine.
In India, you can still buy antibiotics over the counter.
This issue is not as easy as it seems. Doctors used to give them for a cold, and still do (even in the best medical practices). There are two important things to be aware of:
- Differentiating "common cold" (viral infection) from a bacterial infection, which would be a correct indication for antibiotics, is often impossible. If you want the best chance for your particular patient, what will you do? Now, if you want the best chance for the whole patient population, what will you do? Not the same thing.
- If you don't give antibiotics, things go wrong, and the patient's family comes back wielding pitchforks, what do you do? This is less of a problem nowadays, but in small communities and rural environments it can still be a very real question.
This is also a bit of a cultural issue. Now, I agree completely that OTC antibiotics make no sense at all.
Both your points are technically correct (the best kind of correct), but miss the mark in practice. There's no real reason most of the time to differentiate between the common cold and other upper respiratory infections, unless complications ensue. Simply put, a healthy, non-immunocompromised person will fight off basically any upper respiratory infection without a problem.
A healthy person, absolutely, if he is 30 years old. Now how will you handle the healthy 80-year-old patient? Will you take the risk of a bacterial infection that may kill him, even though he is currently perfectly healthy and still hikes regularly? And what if the patient is 1 year old?
I think you underestimate the potential impact of common infections. This is a grey zone. There is no correct answer. Everything depends on where you put your safety/futility cutoff for the treatment decision, which is what doctors are paid for.
Had my father made it to 80, I strongly suspect he'd've wanted to take the risk to help preserve the antibiotics for younger people.
Of course this doesn't necessarily generalise, but there are plenty of older English people with the "I don't want to be a bother" mindset, so it might do better than you think.
I don't agree with the premise. It's hospital-acquired pneumonia that is the greatest concern and the biggest incubator for resistance. There's really no need for GPs to withhold antibiotics from at-risk groups to take at home.
> non-immunocompromised person will fight off basically any upper respiratory infection without a problem.
I am curious if this is true, i.e., does medical science hold the above as an accepted fact by general consensus?
I ask because my understanding is that the 1918 flu (aka Spanish flu), which was an H1N1 strain (an upper respiratory infection in humans, intestinal in birds), predominantly killed young adults, i.e., healthy non-immunocompromised people. Here's the Wikipedia remark:
"the 1918 pandemic predominantly killed previously healthy young adults."
"a rapid progressive respiratory failure and death through a cytokine storm (overreaction of the body's immune system). It was then postulated that the strong immune reactions of young adults ravaged the body, whereas the weaker immune systems of children and middle-aged adults resulted in fewer deaths among those groups"
But there's some conflicting explanations:
"special circumstances (malnourishment, overcrowded medical camps and hospitals, poor hygiene) promoted bacterial superinfection that killed most of the victims typically after a somewhat prolonged death bed
"
If it's the flu, it's not treatable with antibiotics. Antivirals, general care, symptom relief, and vaccines are the best we've got.
You've got to remember, too, that it wasn't an especially virulent strain of influenza alone that caused the deaths from 1917-1919. It spread quickly among the war-weary, crowded, exposure-riddled young men who lived with chickens and hogs at the battlefronts. It was common practice at the time to logistically set up small-pen animals close to the long, mostly stable fronts as a source of local food rather than shipping in so much preserved food. Those young men then travelled far and wide across the continent and even overseas as their service rotations came and especially when hostilities ceased.
As is common, the war's conditions and aftereffects killed far more people than died in combat. It was a nasty flu, but humans helped it out about the best we could.
Flu is actually a systemic infection, not a respiratory infection. It's not very treatable anyway, except symptomatically. If the Spanish flu came back today, we'd have almost as hard a time dealing with it as we did then.
Sure. And when they're, you know, ten, with a sore throat, some cervical lymph node tenderness, no cough, no tonsillar exudates? In other words, every case of sore throat that walks in the door?
1/5 chance it's Group A strep and worth throwing abx at. And... then what? We know the stats, we have a non-specific presentation, and the traditional reason to throw abx at this patient is that we want to minimize the chance of subsequent heart or kidney disease (a post-infectious immune hypersensitivity reaction). 4/5 chance they don't need abx; 1/5 chance they'll benefit, but not by a lot. But how do you even attach a utility analysis to "a small reduction in the likelihood of heart and/or kidney damage"? The magnitude of the risk is hard to comprehend subjectively in the face of the small reduction in incidence. And you will see those consequences play out; you're going to see so many sore throats that the small odds will manifest.
This isn't one of those, "stupid doctors, they don't know what they're doing!" things. It's something that doesn't readily yield to utility metrics.
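For what it's worth, the "utility analysis" can at least be written down, even if the inputs are the hard part. The 1/5 prior is from the comment above; the benefit and cost weights below are pure guesses, invented only to show the shape of the calculation:

```python
# Sketch of an expected-value comparison for the sore-throat decision.
# All utility numbers are illustrative guesses, not clinical data:
# antibiotics only help in the strep case, and carry a small cost
# (side effects, population-level resistance pressure) either way.

P_STREP = 0.20                 # prior probability of Group A strep
BENEFIT_IF_STREP = 1.0         # benefit in arbitrary utility units
ABX_COST = 0.15                # side effects + resistance pressure

ev_treat = P_STREP * BENEFIT_IF_STREP - ABX_COST
ev_wait = 0.0                  # baseline: watchful waiting

print(f"EV(treat everyone): {ev_treat:+.2f}")
print(f"EV(wait):           {ev_wait:+.2f}")
```

Nudge `ABX_COST` above 0.20 and the sign of the answer flips, which is exactly the point being made: the decision is dominated by utilities nobody can measure well.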
Except that the throat culture takes 48 hours to come back, plus the time to get the patient started on the abx after that, for an infection that has a 5-day course to begin with and walked into your office somewhere around day 2. Wait 48+ hours on the backup culture and you've missed your window to do anything with it.
...which is a big part of why it used to be common, and isn't now.
When I went to the doc with a cold and a high fever, they didn't give me antibiotics; they first looked for bacteria. It's not that hard to do, it just costs more than writing a prescription.
Rapid tests are not available for all types of infections. Furthermore, the doc may not want to risk superinfection with another bug if treating a fragile patient.
Rapid tests for strep throat are regulated by guidelines, and are part of standard care.
Unfortunately, that does work in most cases. I'm curious what the success rates would be if all zithromax (?) prescriptions were replaced with sugar pills. I've gotten that drug many times for what may or may not have been strep or bronchitis.
There is actually a company doing this. They were handing out samples at IDWeek.
It's called the "ZeePack" or something of the sort (a play on the azithromycin/Zithromax 'Z-Pak') and is essentially vitamins and herbal supplements you can give to cold patients so they feel like they have been given something.
Seems like one of the primary features of this antibiotic was to have a chemical that was new at the time, so patent protection could be enjoyed from that point forward.
Different features can have different priorities at different times, sometimes there's nothing "engineering" can do about it, it's a marketing thing.
Wherever cardiac health might fall on the priorities list, it's probably not way up there for this compound.
disclaimer: haven't worked for a drug producer since I was first out of college
The "success" rate would still be very high, as most people would surmount the infection. However, the "failure" rate (people dying) would be much higher as well...
I am sure you can see that would be an ethical and practical problem. Where would you put the "safety" cutoff of not giving zithromax?
In the 1918 influenza pandemic, for example, many people died of a bacterial pneumonia following the flu infection. So you see, the impact of "1-tier" infections is very real without antibiotics.
EDIT: by all means, downvoters, make your point explicit!
Sadly, I read this as a thinly veiled attempt to argue for Congress to fund antibiotic research.
Why? Well, the article practically spells out the why: pharma is not incentivised to bring new antibiotics to market, but they can create lots of "new" ones that are similar to the old ones. Why not change the model and create a system where pharma companies can churn out "new and innovative" antibiotics for a nice guaranteed government payout? "Win win" for pharma, politicians, and, who knows, maybe farmers and people?
Cynical? Extremely. An element of truth in my view? Almost certainly.
The way I view it, we have too many different options available on the market. We should look to remove a decent percentage of them from circulation in as many countries as possible, to allow resistant strains to lose their resistance. Yes, this would impact pharma companies, but there could be a rotational system, with extensions to patents as required, to keep them happy.
Another big reason is the lack of financial incentives for companies. A lot of new antibiotics are for very small populations (you'll only use a new antibiotics for the rare cases of drug resistance).
As a result, drug companies are looking at spending $100M to make a new drug, when they'd be lucky to make their money back on it.
If you are serious with your question, instead of just trying to jockey for hacker forum dominance, a very quick google search will answer your question. It is possible that I misunderstood your Q, and your tone, and if so, I apologize.
I searched google.com for "increased antibiotic resistance over time" and found two sources that seem pretty legit. One is the World Health Organization, the other is the US National Library of Medicine. Both articles appear to address the question I believe you raised.
Thank you! My key question was less around increasing numbers of antibiotic-resistant pathogens and more toward the claim that antibiotics released today produce resistant strains faster than ones produced say fifty years ago. The timeline in your second link should help me piece that together...
The parent article is by two lawyers. Here's a better introductory article by a biochemist.[1] A key point is that there are a finite number of small molecules which can potentially be used as antibiotics. Throwing money at finding them may not work against a depleting resource.
In evolving resistance to multiple antibiotics, do bacteria tend to become weaker or less fit in other ways? I would think there must be some trade-offs involved if they can no longer use certain molecules in their cell walls or metabolic pathways. Or to put it another way: if there are resistant and non-resistant bacteria in a particular environment, will the non-resistant population tend to outcompete the resistant population in the absence of antibiotics?
The conventional assumption is that antibiotic resistance comes at a substantial fitness cost, so in the absence of selective pressure, susceptible bacteria will outcompete resistant ones.
This is a nice, clear, elegant theory.
It hasn't worked in practice. Community-acquired MRSA rates have risen, even as the selective pressure from antibiotics has declined, and the general observation is that the fitness cost is nowhere near as big as we assumed it would be.
Interesting. I didn't know about Community-acquired MRSA. From some googling, it sounds like its spread is currently still limited to at-risk communities (athletes that share equipment, kids in daycare).
There still has to be a substantial selective cost, otherwise the resistant strain would quickly spread through the whole population, right?
We honestly don't have a great handle on its prevalence.
It would only spread quickly through the population if there were also a lot of selective pressure from antibiotics in the community. There's definitely some selective cost to multi-drug resistance, but it's proving not to be nearly as big a hurdle as conventional wisdom suggested.
Yes, resistance to antibiotics comes with a cost. There are several mechanisms for antibiotic resistance. For example, the bacteria could produce a protein that pumps the antibiotic out of the cell. Producing this protein is expensive, so in the absence of antibiotics, bacteria tend to lose the genes responsible for it.
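The selection dynamics described above can be sketched with a toy simulation. This is purely illustrative, not a biological model: the growth and kill rates are made-up numbers chosen so that the resistant strain pays a small fitness cost (e.g. the expense of producing an efflux pump) but survives antibiotic exposure.

```python
# Toy competition between a susceptible and a resistant strain.
# All rates are made-up illustrative numbers, not measured values.

def resistant_fraction(antibiotic_present, generations=100):
    susceptible, resistant = 1000.0, 1000.0
    for _ in range(generations):
        if antibiotic_present:
            susceptible *= 0.5   # the drug kills most susceptible cells
        else:
            susceptible *= 1.10  # full growth rate without the drug
        resistant *= 1.08        # slightly slower: cost of the pump protein
        # Renormalize so we track relative frequencies, not raw counts
        total = susceptible + resistant
        susceptible = susceptible / total * 2000
        resistant = resistant / total * 2000
    return resistant / (susceptible + resistant)

print(resistant_fraction(antibiotic_present=True))   # ~1.0: resistance sweeps
print(resistant_fraction(antibiotic_present=False))  # ~0.14: cost slowly loses out
```

Note how slowly the resistant fraction declines without the drug: with a 2% per-generation cost it is still around 14% after 100 generations, which is one way to see why a small fitness cost doesn't make resistance disappear quickly once the selective pressure is removed.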
Evolution is part of life: while some bacteria adapt to antibiotics, other organisms (fungi, other bacteria, ...) compete with them, and they usually find ways to kill each other. It's part of evolution, but evolution takes time, and discovering these new mechanisms also takes time. So by the time we discover a new antibiotic, it can be too late for a lot of people.
But it doesn't happen on demand. Bacteria evolve fast, but that doesn't mean they will produce antibiotics that work against other bacteria. You have to be in the right place at the right moment, and then wait many years before you can use that antibiotic in humans.
The article seems to have thrown in the towel with regard to existing antibiotics, which seems like a waste, since there's so much more we could do. Restrict antibiotics the same way we restrict morphine. Censure doctors who prescribe antibiotics when they aren't needed. Do not allow them to be used in animal feed for any reason. We're talking about a vital resource that can save millions of lives. Allowing this resource to be depleted by frivolous use is downright criminal.
China, for example, is by far the world's largest abuser of antibiotics [1]. The average person in China takes ten times the amount of antibiotics annually that the average person in the US takes (and when you consider the number of people in question, it's that much more staggering). You're spot on that it's a dramatic challenge figuring out how to convince a nation like China to tamp down on that, given the scale involved (e.g. a billion people being commonly prescribed antibiotics for a common cold).
Other countries have followed the American example with regard to lots of bad policies. They might very well follow it with regard to a good policy. At least trying would be much better than just throwing our hands in the air.
Well, if the U.S. enforced import restrictions on food from antibiotic-fed farms, plus visa restrictions, the rest of the world would quickly change its tune.
You seem to be assuming some of these things aren't being done.
Antibiotic stewardship is a major push in most hospitals, and there are national level evaluation programs for antibiotic overuse.
But, as people have noted, there are other countries also driving usage (for example, China heavily uses colistin, while it's a drug of last resort in the U.S.), and once resistance is established, it's very hard to make it go away.
At the individual level, I've always thought of antibiotic misuse as (at least in some part) a symptom of our resistance to authority. People end their antibiotic course early (and then usually use the remaining pills another time) because, of course, they know better. I've run into people like this many times.
Now, perhaps doctors should take part of the blame for not explaining why you should finish your course of antibiotics, but users should take some of the responsibility too.
To play devil's advocate, since this doesn't seem to get mentioned often:
1. People ending their antibiotic course early is a drop in the ocean compared to the abuse of antibiotics in agriculture.
2. The advice to finish the course was coined back when GPs did house calls. Now, if you use up all your current supply, what do you do next time you get sick? By the time you know you need an antibiotic, you're not going to be well enough to hump your arse across town to petition a doctor for a prescription. The official answer might be to call an ambulance, but the ambulances and hospitals are hopelessly overloaded already, and that's never mind the situation in the US, where a hospital visit can bankrupt you even if you have insurance.
People shouldn't have a supply for use whenever they feel like it. Antibiotics are a last resort, not a first!
If someone has some left over then they did not finish the previous course, which means that they have potentially allowed the strongest bacteria from the previous illness to live on.
Maybe it's not common knowledge, but taking antibiotics is essentially putting evolution into overdrive. At first you kill off the weakest bacteria and then only later the most able to survive. If someone ends their course early, they are allowing the best adapted bacteria to live on and multiply.
Perhaps the person wouldn't be getting sick the second time if they had finished their course in the first place.
Also, taking antibiotics does harm to a person's own flora, not just the invaders. So please don't take antibiotics if you don't need them, and finish the course when you do.
> Now, if you use up all your current supply, what do you do next time you get sick?
> By the time you know you need an antibiotic
How do you know you need an antibiotic?
Even doctors are discouraged from self-prescribing, so without seeing a medical practitioner, and having whichever tests they feel are appropriate for the symptoms, how can you be sure that those antibiotics are the correct treatment?
If, say, you were experiencing diarrhea that was preventing you from going to the GP, then an over-the-counter remedy such as Imodium should give enough short-term relief to allow a visit to the GP.
If you really are so severely unwell that you can't reach a GP, then perhaps an ambulance really is appropriate.
To address your question of hospital-induced bankruptcy, that's a dysfunction in the US healthcare system: it doesn't make it medically appropriate to keep a stash of antibiotics, although it does contribute to unsafe practices such as self-diagnosing and self-treatment.
An excellent question. If house calls aren't going to be brought back, the best I can think of is to allow consulting over the phone.
> If you really are so severely unwell that you can't reach a GP, then perhaps an ambulance really is appropriate.
Come on, if everyone called an ambulance every time they were too sick to travel, the system would probably collapse under the overload. Besides, every time you set foot inside a hospital, you expose yourself to the risk of picking up an antibiotic-resistant infection.
> If house calls aren't going to be brought back, the best I can think of is to allow consulting over the phone.
My local (UK) GP does do this, although I'm not sure how widespread it is. The times I've had a GP consult over the phone, I've been happy with the outcome. My GP surgery also offers home visits, although they ask for a telephone consult first.
> if everyone called an ambulance every time they were too sick to travel
I was thinking specifically of cases where someone is too sick to travel, but also doesn't improve for a couple of days. Many illnesses such as winter vomiting bug (norovirus) and food poisoning are usually short-lived, and will either ease off enough to allow the patient to get to the GP, or be severe enough that a GP wouldn't treat it in any case, but instead refer it to a hospital.
Don't play devil's advocate. Only advocate positions you believe in.
2) is countered by the fact that antibiotics are not needed by most of those people; and antibiotics don't reduce the severity or duration of the disease by much.
See for example conjunctivitis. Most people attend their doctors and expect to be given antibiotics. This is the case even though doctors find it hard to tell the difference between bacterial and viral conjunctivitis; and that antibiotics don't reduce the severity of the illness, and only reduce duration by one third of one day.
>"Don't play devil's advocate. Only advocate positions you believe in."
That's terrible advice. The main reason for playing Devil's advocate is to try to see things from a different perspective to build a fuller understanding. This is frequently useful.
Well, in this case it's not that I'm saying "yes, holding onto a personal supply of antibiotics is the right answer"; I'm saying there is a real problem to which the official advice offers no solution. Conjunctivitis is certainly a good example. As for what would be a solution, I think it would help if you could get basic medical advice and an antibiotic prescription over the phone. (To the person about to point out the potential downsides of this: yes, I'm aware of them, but what's your better solution?)
There is actually very little evidence as to what should constitute a full dose of antibiotics[1], and some doctors and researchers are now starting to discuss this (however, due to public health message concerns, it's mostly kept pretty quiet). In many infections, once bacteria fall below a certain level, the body's immune system can kill off the remainder.
They should indeed. Would it be too much to expect literate people to read the bit of paper that comes in the prescription package, making clear that the patient should complete the course?
It's such a shame we used them all on livestock. We raise around 60 billion land animals every year, and probably more than 90% are being given last-line-of-defense antibiotics for faster growth. A perfect pool for some unbelievable evolution of antibiotic-resistant bacteria, and yet we want to ban antibacterial soaps and other minor things.
But at least the steaks and muttons and whatevz can be enjoyed.
This is a global health crisis waiting to happen. Incentives need to be created for pharma to create many new broad spectrum antibiotics. Longer patents, financial or tax incentives, even subsidies, whatever it takes.
The solution is already rapidly inbound, few are talking about it for some reason though. Within three or so years, everyone will be talking about the obvious solution to it. You'll see dozens upon dozens of articles pop up in that time, talking about the same thing...
For less than $20,000 you can start experimenting with new attacks on infection and resistance in your kitchen (so to speak; I'd suggest a real lab), using CRISPR. You can order the bacterial samples you need very inexpensively and just begin working on it. Based on the current legal position of Broad and Berkeley, you can also do any work you want around Cas9 or Cpf1 in the US without concerns about licensing/patents (unless or until you plan to commercialize).
Only a few interesting things are out there about it now, that'll change very soon:
Honestly, drug bounties seem so much better than patents across the board. Patents are better when you're inventing the market too, but even then they're no panacea, as it may take a long time to build the market after the initial invention, eating into the patent's lifespan. True innovation often suffers as a result.
Here, not only is the market clear-cut, but this benefits new entrants, as the immediate windfall allows for more risk-taking. It's liquidity for innovation.
Finally, the cost for patients/emergencies can be dirt-cheap, which is good, because just as one should have insurance to cover future emergencies, so should society budget for the R&D up-front.
There may be a lot of antibiotics present in the soil which haven't been exploited yet because it's hard to grow in the lab the organisms which produce them. However techniques are being developed to circumvent this problem and here's an early discovery:
Hopefully other, fundamentally newer ways to fight infection will be found. In the meantime, getting fit and healthy now seems like a slightly wiser choice than it already was.
I don't think this is true. When we initially found out about antibiotics, in about 15 years we got to about 5 working general purpose antibiotics (general = works against pretty much every pathogen). Since then we've found 1 more, in over 70 years.
Even if this new drug turns out to be a total success, counting on finding new drugs in the near future seems to be an extremely long shot. It's not going to happen. Rates of discovery predict availability of future drugs.
Furthermore the rate at which bacteria adapt to drugs has also increased. Adaptation happens faster and faster. While I don't know how that applies to this particular drug, a naive extrapolation would seem to indicate it won't last half a decade.
That is completely false. Just a Google search will reveal to you that dozens of new antibiotics have been found since but they didn't last long. BTW, Penicillin also lasted less than a year before bacteria developed resistance to it.
It'll be interesting to see if/when a non-antibiotic solution to killing bacteria comes along. I read about a novel approach a couple of months ago that uses a polymer to destroy bacteria.[1] I'm not qualified to comment on the merits of the science.
1. Always finish your course of antibiotics. Even if you're feeling better. The course is as long as it is for a reason.
2. If for some reason you choose not to do (1) above, do not ever give someone what is left over from your course.
There is little evidence behind the length of the antibiotic course[1]. Depending on the type of infection, coming off antibiotics early may be better for you (OTOH, this can lead to a bad outcome for you too).
Don't eat meat, eggs, or dairy that use antibiotics in their normal rearing practices (organic is a good alternative; or look for antibiotic-free foods, or minimise eating those altogether). Talk about this with others in a non-confrontational and non-judgemental way.
Also, don't take antibiotics yourself unless you are certain they are necessary; many infections are non-bacterial, and even some bacterial infections can be fought off by your body quite well. Look after your immune system; it looks after you.
Ironically, taking antibiotics might negatively impact your immune system due to their impact on your intestinal flora (mostly bacteria, a lot of which are killed during the course), which is increasingly being linked to healthy immune function.
News is made to scare people, so however frightening it sounds, it's probably not going to be that bad. There's probably no value in taking any action besides going to work as usual and providing value to society in the way you're best at.
One possible reason not to worry is antibiotic cycling. Bacteria lose their resistance to individual antibiotics when we stop giving them exposure to them, so the idea is that we could breed out old resistance while breeding in new resistance and always have a drug available that they're defenseless against.
Send the last three paragraphs of the article to your politicians. Excerpt:
> Congress should instead reward manufacturers that bring a targeted, highly innovative antibiotic to market with a substantial financial prize; in exchange, manufacturers would surrender their patent.
This premise is going to turn out to be entirely incorrect.
There's a wave of therapeutic approaches to antibiotic resistance coming in the next 10-15 years, courtesy of CRISPR (gene editing broadly). We're going to end up having hundreds of new experimental angles of attack on resistance and infection. In fact, by far the bigger problem is going to be narrowing down the vast array of attack options that CRISPR is going to unleash.
These articles, repeating the same hyped-up fear, are missing what's right around the corner (which usually happens with such statements of doom, à la the claims from decades ago that the world would run out of food). And best of all, it's going to be inexpensive, relatively speaking, and extremely fast-paced to make progress in that direction. Cas9 and its superior alternatives, such as Cpf1, arrived just in time.
I sincerely hope this is true, but 10-15 years ago high-throughput computational screening was going to usher in a new wave of targets and molecules that acted against them, and we were going to unlock whole new families of antimicrobials.
Yes! Antibiotics (as an issue) are now associated with guilt and hubris. I trust CRISPR technologies will yield new weapons to fight infections and other diseases.
The problem is that antibiotics are no longer being developed. They could be, but it is not financially worthwhile for a large pharma company to do so.
Consider this: development takes years (>10) and costs billions due to failed attempts (over 9 in 10 fail), testing, etc. The drug then goes to market, and no government will buy it, because the price is high to recoup the billions spent developing the ten-plus candidates of which only one made it to market. In the end, governments force the companies to sell at a loss should there be an epidemic. Then, to cap it all, some company in another part of the world where IP is not honoured rips off the drug and floods the neediest markets with a cheap knock-off.
Where is the incentive for a pharma company to go through this?
And before anyone says the FDA is overcautious: the pharma companies are cautious themselves. If they pushed through a drug that didn't work or caused serious side effects, there would be legal implications that could wipe out some or all of the financial gains from the drug. Back to having no incentive.
The woman died because the antibiotic that could have saved her life was not approved in the US by the FDA, not because no antibiotic could possibly have saved her.
>Although the patent system is good at producing new blood-pressure medications and cardiovascular drugs, it’s not the right fit for antibiotics.
I don't know if I want to get into an arms race with microbes where we keep trying to discover new antibiotics, only to have them develop resistance. Maybe the better solution is to limit antibiotics to life-threatening situations and let people build up their natural resistance.
Doesn't it strike anyone that, instead of constantly trying to kill everything in sight, perhaps we should figure out a way to add super-probiotics to our microbiome, creating little beneficial bacterial armies to fight the war against bugs, instead of nuking our whole flora, which leaves us with zero immunity and no ability to fight anything off?
How are we able to create vaccines against some bacteria, and why couldn't we do it for more strains, e.g. MRSA?
My son spent some time in the hospital fighting pneumonia. Now his doctors are looking into why his body didn't develop immunity from the pneumococcal vaccine (Prevnar 13). I was always under the impression only viruses could be immunized against.
Wait until enough people are sick or dying at the same time to impact the economy, or some high-ranking politician finds his dick rotting away from an untreatable STD, and they'll soon plough money into the problem.
I've never really bought these doomsday scenario claims, if only because I feel like innovation is constantly taking place, and these projections necessarily assume no new innovations (how could they predict innovation, after all?).
I agree. Furthermore, now that people are accustomed to not dying all the time, even with no new technology we would probably tolerate more serious measures to maintain this safety, even if it came to draconian quarantine and hygiene laws, or just wearing a face mask every day.
Remember how Ebola didn't cause an epidemic in any moderately developed country despite being one of the easiest diseases to transmit?
> Ebola didn't cause an epidemic in any moderately developed country despite being one of the easiest diseases to transmit?
Ebola isn't airborne, and an infected person isn't contagious until they start to present symptoms. Although it's a very serious disease, its presentation makes an outbreak relatively easy to recognise and control.
An even more direct way for the government to push discovery of new antibiotics would be for the government to do the R&D itself. Of course, that would offend certain ideologues.
[1] https://en.wikipedia.org/wiki/Antibiotic_use_in_livestock#Un...