

No Pain, All Gain: The Case for Farming Organs in Brainless Humans

10 Jun, 17 | by Iain Brassington

Guest post by Ruth Stirton, University of Sussex (@RuthStirton) and David Lawrence, Newcastle University (@Biojammer)

It is widely acknowledged that there is a nationwide shortage of organs for transplantation purposes.  In 2016, 400 people died whilst on the organ waiting list.  Asking for donors is not working fast enough.  We should explore all avenues to alleviate this problem, which must include considering options that appear distasteful.  As the world gets safer, and fewer young people die in circumstances conducive to the donation of their organs, there is only so much that increased efficiency in collection (through improved procedures and storage) can do to increase the number of human organs available for transplantation. Xenotransplantation – the transplantation of animal organs into humans – gives us the possibility of saving lives that we would certainly lose otherwise.

There are major scientific hurdles in the way of transplanting whole animal organs into humans, including significant potential problems with incompatibility and consequent rejection.  There is, however, useful similarity between human and pig cells, which means that using pigs as the source of organs is the most likely to be viable.  Assuming, for the moment, that we can solve the scientific challenges with doing so, the bigger issue is the question of whether we should engage in xenotransplantation.

A significant challenge to this practice is that it is probably unethical to use an animal in this way for the benefit of humans.

We’re all Gonna Die… Eventually

6 Oct, 16 | by Iain Brassington

It might just be a product of the turnover of people with whom I have much professional contact, but I’ve not heard as much about human enhancement in the past couple of years as I had in, say, 2010.  In particular, there seems to be less being said about radical life extension.  Remember Aubrey de Grey and his “seven deadly things”?  The idea there was that senescence was attributable to seven basic processes; those basic processes are all perfectly scrutable and comprehensible biological mechanisms.  Therefore, the argument went, if we just put the time and effort into finding a way to slow, halt, or reverse them, we could slow, halt, or reverse aging.  Bingo.  Preventing senescence would also ensure maximum robustness, so accidents and illnesses would be less likely to kill us.  To all intents and purposes, we’d be immortal.  Some enterprising people of an actuarial mindset even had a go at predicting how long an immortal life would actually last.  Eventually, you’ll be hit by a bus.  But you might have centuries of life to live before that.

Dead easy.

I was always a bit suspicious of that.  The idea that death provides meaning to life is utterly unconvincing; but the idea that more life is always a good thing is unconvincing, too.  What are you going to do with it?  In essence, it’s one thing to feel miffed that one isn’t going to have the time and ability to do all the things that one wants to do: life is a necessary condition for any good.  But that doesn’t mean that more life is worth having in its own right.  Centuries spent staring at a blank wall aren’t made any better by dint of being alive.

But a letter published this week in Nature suggests that there is an upper end to human lifespan after all.  In essence, the demographic data seem to suggest that there’s an upper limit to survivability.  That being the case, we should stop worrying about making people live longer and longer, and concentrate on what’s going on during the 125 years or so that Dong, Milholland and Vijg think is allotted to us.

What is a Moral Epigenetic Responsibility?

23 Aug, 16 | by miriamwood

Guest Post by Charles Dupras & Vardit Ravitsky

Re: The ambiguous nature of epigenetic responsibility

Epigenetics is a recent yet promising field of scientific research. It explores the influence of the biochemical environment (food, toxic pollutants) and the social environment (stress, child abuse, socio-economic status) on the expression of genes, i.e. on whether and how they will switch ‘on’ or ‘off’. Epigenetic modifications can have a significant impact on health and disease later in life. Most strikingly, it has been suggested that some epigenetic variants (or ‘epi-mutations’) acquired during one’s life could be transmitted to offspring, thus having long-term effects on the health of future generations.

Epigenetics is increasingly capturing the attention of social scientists and ethicists, because it brings attention to the importance of environmental exposure for the developing foetus and child as a risk factor for common conditions such as cardiovascular disease, diabetes, obesity, allergies and cancers. Scholars such as Hannah Landecker, Mark Rothstein and Maurizio Meloni have argued that epigenetics may be used to promote various arguments in ongoing debates on environmental and social justice, as well as intergenerational equity. Some have even suggested that epigenetics could lead to novel ways of thinking about moral responsibilities for health.

Is it fair that disadvantaged populations are exposed to an inequitable share of harmful environments – such as polluted areas – that are epigenetically detrimental to their health? Who should be held responsible for protecting children and future generations from epigenetic harm induced by their environments? Should we hold parents accountable for the detrimental epigenetic impact of their behaviour on their children? And how should we manage the possible risks of stigmatisation of, and discrimination against, people we consider blameworthy for inflicting epigenetic harm on others? These sensitive questions call for a nuanced investigation of the impact epigenetics can have on our understanding of moral responsibility.


Enhancement as Nothing More than Advantageous Bodily and Mental States

20 May, 16 | by BMJ

Guest Post by Hazem Zohny

Some bodily and mental states are advantageous: a strong immune system, a sharp mind, strength.  These are advantageous precisely because, in most contexts, they are likely to increase your chances of leading a good life.  In contrast, disadvantageous states – e.g. the loss of a limb, a sense, or the ability to recall things – are likely to diminish those chances.

One way to think about enhancement and disability is in such welfarist terms.  A disability is no more than a disadvantageous bodily or mental state, while to undergo an enhancement is to change that state into a more advantageous one – that is, one that is more conducive to your well-being.  This would hugely expand the scope of what is considered disabling or enhancing.  For instance, there may be all kinds of real and hypothetical things you could change about your body and mind that would (at least potentially) be advantageous: you could mend a broken arm or stop a tumour from spreading, but you could also vastly sharpen your senses, take a drug that makes you more likeable, stop your body from expiring before the age of 100, or even change the scent of your intestinal gases to a rosy fragrance.

Would all such changes be instances of enhancement?

Patient Views about Consent, Confidentiality & Information-Sharing in Genetic Medicine

29 Apr, 16 | by BMJ

Guest post by Sandi Dheensa, Angela Fenwick and Anneke Lucassen

Imagine you’re a clinician in genetic medicine.  For a while, you’ve been seeing Joe Bloggs, a patient with a mutation in a gene that’s caused a hereditary form of colon cancer.  As is your standard practice, you help Joe identify who in his family is also at risk and spend some time talking about how he’ll tell them.  The Bloggses are a large bunch: Joe has children, siblings, nieces, nephews, aunts, uncles, and cousins, all of whom might have the mutation.  Anyone who tests positive would be eligible for regular bowel screening, which – while not pleasant – makes it much more likely that any cancer will be caught at a treatable stage.  Unfortunately, despite all this, you’ve reason to believe that Joe hasn’t told his relatives anything and now you’re unsure what to do.

What are your options?  You might say Joe’s confidentiality and autonomy are paramount: it’s up to him what he does, and, as his doctor, you’ve done your part by telling him the cancer is heritable.  Or you might argue that Joe’s family needs to know – but how and when?  The GMC says you can share a patient’s personal information without consent if the benefit of doing so outweighs the risk: does the situation meet this criterion?  What if you share the information and Joe sues you for breaching his confidentiality?  But what if you don’t say anything and a relative develops a cancer that could’ve been prevented?  Won’t their trust in the health service be shaken if they knew you’d chosen not to share?  Indeed, the UK and Netherlands have recently seen cases where relatives questioned the health service’s non-disclosure of relevant information.

Taking a “joint account” view of confidentiality from the outset would’ve avoided these situations.  The joint account involves viewing genetic and personal information as distinct: the possible inheritance of cancer is common to the whole Bloggs family, but that Joe has stage III bowel cancer is personal.  If genetic information is confidential to the family, not just the tested patient, you’d have told Joe upfront, before even drawing his blood, that you’d look into sensitive and appropriate ways to let his relatives know the information if they might benefit from it.  Later down the line, when it materialised he hadn’t told his family, it would’ve been easier to negotiate what to do.

In our recent JME paper, we explored the views of people affected by hereditary cancer and other conditions regarding the distinction between genetic and personal information, the levels of confidentiality afforded respectively, and healthcare professionals’ roles and responsibilities toward their patients’ relatives.

In line with the joint account approach, our interviewees considered their signs, symptoms, and diagnoses as personal, but thought genetic risk was familial and that their relatives needed to know about it.

How We Feel about Human Cloning

7 Apr, 16 | by BMJ

Guest post by Joshua May

Suppose you desperately want a healthy child to build a family of your own.  As is increasingly common, however, you can’t do it naturally – whether from infertility, a genetic disease you don’t want to pass on, or a non-traditional relationship.  If you seek a genetic connection with the child, there are some limitations to the main alternatives: adoption, surrogacy, and in vitro fertilization.  You may yearn for more options.

How would you feel about cloning?  Take the nucleus of a cell from yourself or a loved one, then put it into an egg that will eventually develop into a baby that shares nearly all the genes of the donor cell.  The resulting baby will simply be a kind of ‘delayed twin’ of the donor.

Most people believe this is immoral.  There’s a bit more support for therapeutic uses that merely create new tissue, for example.  But, at least in the US and UK, people overwhelmingly condemn cloning for the purposes of creating new human lives.  In fact, a recent poll suggests there is little disagreement in America over this issue, where human cloning is among the most widely condemned topics (alongside polygamy and infidelity).

That’s what people think, but how do they feel?  Controversial bioethical issues often generate intense feelings.  Some bioethicists treat cloning in particular as a line in the sand that we mustn’t cross, for fear of sliding down a slippery slope to a dystopia.

Consider Leon Kass, who played a major role in public policy as chair of George W. Bush’s President’s Council on Bioethics.  Kass argues that there is wisdom in repugnance toward human cloning, allowing us to ‘intuit and feel, immediately and without argument, the violation of things that we rightfully hold dear’.  As opposed to mere unease or sadness, Kass and some others have argued that disgust is such a powerful and distinctive emotion that we should take it seriously as a moral guide when deliberating about ethical issues.

An empirical claim lurks.  Such bioethicists assume that people in general share their reaction of repugnance. Besides, if we can uncover the emotional reactions people tend to feel toward disputed moral issues, then we can better understand why they hold the beliefs they do.  Does the prospect of cloning humans make us sick?  Scared?  Sad?  Angry?  Excited?  At ease?

In my paper, I provide some initial evidence that people (at least in the States) feel primarily anxious and curious about human reproductive cloning.  These were the most frequently self-reported negative and positive emotions, not disgust, fear, sadness, anger, excitement, amusement, comfort, or joy.

How to be a good (consequentialist) bioethicist…

6 Jul, 15 | by David Hunter

There has recently been a pattern of papers (and I am not going to identify which ones) that I take to be slightly embarrassing to academic bioethicists, because they portray us in a less than flattering light – whether through the naive mistakes they seem to make, or the outlandish, poorly argued claims they advance. I have noticed a trend for these to come from relatively new consequentialist bioethicists, and, being the helpful sort that I am, the aim of this blog post is to help consequentialist bioethicists avoid falling into these pitfalls.


The Moral Desirability of Early Fatherhood

5 Jun, 15 | by Iain Brassington

Guest Post by Kevin Smith

It is well known that the risk of disorders resulting from chromosomal abnormalities, such as Down’s syndrome, correlates with advancing maternal age.  Less widely known is the correlation between the age of fathers and an increased risk of a range of disorders in their resultant offspring, the most prominent of which are neuropsychiatric conditions including schizophrenia and autism.  This is the paternal age effect, the importance of which has recently become clear through a growing body of molecular genetic and epidemiological data.

The paternal age effect results from new mutations occurring in the stem cells from which sperm cells are derived, resulting in an accumulating mutational burden as the male ages.  Genetic abnormalities resulting from these paternal mutations are usually subtle at the molecular level (involving as little as a single nucleotide change), highly heterogeneous, and do not usually result in detectable foetal abnormalities.  Accordingly, the opportunity for prenatal detection of such cases is very limited.  (By contrast, the genetic abnormalities associated with maternal aging typically involve substantive chromosomal aberrations, comprise a relatively restricted range of commonly occurring forms, and frequently produce marked foetal defects; these features ensure that routine screening and testing reveals the majority of such cases prior to birth, permitting termination of affected foetuses.)  Additionally, and again in contrast to genetic aberrations associated with maternal age, paternal de novo mutations are transmitted through successive generations of males, with a concomitant intergenerational accumulation of genetic abnormalities.  Moreover, at least in Western societies, the average age of fatherhood is increasing markedly, a situation that will increase the burden of these mutations.  It follows that the age of potential fathers is of significant ethical importance.

My paper explores the ethical aspects of paternal age, in respect of both individual procreative decisions and societal responsibilities.  I argue inter alia that, somewhat contrary to the commonplace disapproval of young parents, early fatherhood is ethically desirable.  The most immediate practical means to achieve a reduction in paternal de novo mutations and associated genetic disorders would be the promotion of sperm banking amongst young males.

Read the full paper here.

This could get Personal

5 Dec, 14 | by Iain Brassington

And so 23andMe has launched in the UK.

For those not familiar with it, 23andMe allows individuals to swab themselves and have their genome analysed, at a cost of £125. The company is offering to generate a report covering about a hundred traits, giving information on a range of things from the potentially important to the merely fun: the list includes tests for the presence or absence of inherited conditions such as Tay-Sachs and beta thalassaemia; risk factors relating to things like Alzheimer’s; how much DNA you have in common with Neanderthals; and earwax type.

To be honest, I’d’ve thought that by the time you’ve got £125 to spend on a test like this, you’d probably know all you’d ever want to know about your earwax, but… well, apparently there’s more.  Joy.

Anyway: BBC Breakfast invited me to witter on about it the other day.  I only got a couple of minutes, and so didn’t get to say much; shamelessly, I’m going to think aloud a little bit here.  My basic starting point is that it’s hard to see why the test per se is too big a problem: all else being equal, who would begrudge a person information about himself?  All the same, I think that there are questions that are probably worth asking.  (NB: in what follows, whenever I mention 23andMe, the point should be taken to cover any company offering a similar service.)  So, in no particular order…

Growing a Kidney Inside a Pig Using your own DNA: The Ethics of ‘Chimera Organs’

6 Nov, 14 | by Iain Brassington

Guest post by David Shaw

Imagine that you’re in dire need of a new kidney. You’re near the top of the waiting list, but time is running out and you might not be lucky enough to receive a new organ from a deceased or living donor. But another option is now available: scientists could take some of your skin cells, and from them derive stem cells that can then be added to a pig embryo. Once that embryo is implanted and carried to term, the resulting pig will have a kidney that is a perfect genetic match to you, and the organ can be transplanted into your body within a few months without fear of immune rejection. Would you prefer to take the risk of waiting for an organ donated by a human, which would require you to take immunosuppressant drugs for the rest of your life? Or would you rather receive a “chimera organ”?

This scenario might seem far-fetched, but it is quite likely to be a clinical reality within a decade or so. Scientists have already used the same technique to grow rat organs inside mice, and it has also been shown to work in different types of pig. Although clinical trials in humans have not yet taken place, using these techniques to create human organs inside animals could solve the current organ scarcity problem by increasing the supply of organs, saving thousands of lives each year in Europe alone. As illustrated in the example, organs created in this way could be tailored to the individual patient’s DNA, allowing transplantation without the risk of immune rejection. However, the prospect of growing organs of human origin within (non-human) animals raises several ethical issues, which we explore in our paper.

Although chimera organs are ‘personalised’ and unlikely to be rejected, one of the major concerns about using organs transplanted from animals is the risk of ‘zoonosis’ – the possibility that an animal virus might be transmitted along with the organ, resulting in a new disease that could cause a pandemic.


Journal of Medical Ethics

Analysis and discussion of developments in the medical ethics field.
