A New Normal

Advances in genetics have reshaped medicine, but many bumps lie ahead


Jimi Olaghere was born with sickle cell anemia, a genetic condition so named for the sickle-shaped red blood cells that are the hallmark of the disease. Oxygen-starved, sticky, and misshapen, the cells tend to clump, causing debilitating pain crises and damage to organs like the heart, lungs, and eyes.

In Nigeria, where Olaghere spent much of his childhood, he received nothing more than acetaminophen and ibuprofen for his pain episodes. As a teenager and then as a young adult in the United States, Olaghere endured pain that grew ever more agonizing. He had his gallbladder removed and suffered two heart attacks in his 30s. Hydroxyurea, one of a handful of drugs for treating sickle cell anemia, plus periodic blood transfusions and morphine drips in the ER offered only partial relief.

The sole chance for a cure was a bone marrow transplant, an uncertain proposition that entailed finding a well-matched donor and facing the prospect of grueling chemotherapy to wipe out his immune system. Of course, Olaghere had heard about gene editing and its promise to cure genetic illnesses, but back in 2018, it was still deemed the stuff of medical science fiction. He knew sickle cell anemia would be his lifelong companion. So, when he came across a news story about a woman with the disease who was cured by the gene-editing tool CRISPR, he immediately called the doctor’s office.

Today, Olaghere is among a growing number of patients with sickle cell anemia who have undergone curative CRISPR-based gene editing. The approach involves editing the underlying genetic abnormality in a person’s stem cells so that they can make a healthy form of hemoglobin and healthy red blood cells.

Since the treatment, Olaghere has not had a single pain crisis, but his life has been transformed in ways far more profound than the absence of pain.

“I am able to go out and dream,” Olaghere told the audience at the eighth annual Precision Medicine Symposium on Sept. 20, organized by Harvard Medical School’s Department of Biomedical Informatics. “I can wake up and do what I want to do, not what sickle cell wants me to do. At the end of the day, if I am unable to achieve these dreams, the blame is on me. I am glad that I failed. Sickle cell did not make me fail.”

Advances in genomics, an explosion of genetic testing, and the emergence of new technologies over the past 20 years have led to tectonic shifts in medicine. Olaghere’s story is a striking illustration of the promise of this brave new world.

Yet, as is often the case, new insights answer some questions and raise others. When it comes to genetic variation, what really is “normal”? Which gene variants are benign and which ones portend disease? How do physicians and patients make sense of genetic risk scores to determine whether, when, and how someone should be treated?

To illuminate a path forward, the symposium brought together genomic experts, computational biologists, data scientists, and clinicians.

The dark side of disruptive new technologies

Gene editing is transforming lives and opening up exciting new frontiers.

“The goal is that gene editing will become available not only for sickle cell disease and beta thalassemia, which is what we are doing now, but for many other diseases,” said Haydar Frangoul, who led the team that cured Olaghere at TriStar Centennial Medical Center in Nashville, Tennessee. “There was a crack in the door, and we opened it. Hopefully, many more patients with genetic disorders will be able to benefit from this therapy.”

Yet, Olaghere noted, CRISPR and similar new technologies also pose daunting challenges related to access and affordability.

“Some 100,000 people suffer from sickle cell disease in the U.S. There are four million with sickle cell disease in West Africa. I don’t even know what the number is worldwide,” Olaghere said. “Even if we cure everyone in the United States, that’s not even 10 percent of the world’s population with this condition.”

Zak Kohane, co-organizer of the symposium and chair of the Department of Biomedical Informatics in the Blavatnik Institute at HMS, noted three challenges central to the use of new technologies — cost, access, and patient selection.

“We’re quite wealthy in this country. We can afford expensive therapies, but there are many countries that do not have such budgets. How do we afford screening, small-molecule drugs, and cellular therapies, which are incredibly effective, but rarely get cheaper with time in the United States?” Kohane asked.

The mode of therapy delivery could make a critical difference in cost reduction, experts said. In the case of CRISPR, that would mean the gene editing of a patient’s cells, which now takes place in the lab, could instead be accomplished by giving the patient a pill.

“What we’ve done is proof of principle that gene editing of cells can be done in vitro, but in an ideal world, I hope that we can do this in vivo — you would basically take a drug and do the gene editing that way. If we are unable to move to a drug-in-a-bottle, it’s going to be very difficult worldwide, especially in countries with poor resources,” Frangoul said.

Another critical aspect of new genetic treatments and genomic tests is their public health value versus their value to the individual patient, which are not always aligned.

For many new therapies, the value is incontrovertible, but from a public health perspective, practical questions emerge about the value proposition, Kohane said: Given the high cost of some new treatments, how many babies could a physician screen for this disease or treat for asthma instead?

“Think about the more than a billion dollars spent a year caring for patients with sickle cell disease in the United States — that does not even account for lost wages or missing work or school,” Frangoul said. “You can make an easy economic decision that many therapies are a fraction of the cost of what it’d take to care for patients long-term.”

Emerging genetic technologies can also pose ethical conundrums. Experts agreed that the value of knowledge gleaned from a test should be weighed against a patient’s ability to act on the information provided.

“My question about screening for some disorders is the impact on the patient and their ability to get health insurance and afford health care,” Frangoul said. “It’s great to do screening, but what’s the plan for how the result might impact the patient? These are questions we need to ask whenever we develop a new screening test.”

What does ‘normal’ mean, anyway?

The human genome is made up of more than 20,000 genes, but the biologic function for most of them remains unknown. Indeed, to date, only 4,684 genes have been directly implicated in disease development.

Adding another twist to an already convoluted biologic plot is the astounding genetic variation from person to person. Individuals typically harbor millions of gene variants.

Which of these variants are benign and inconsequential, and which ones might signal disease down the road? Again, for the vast majority of gene variants, the answer remains unknown.

“This is far from an abstract academic exercise,” said symposium co-organizer Arjun “Raj” Manrai, HMS assistant professor of biomedical informatics. “Determining what is normal variation may determine a genetic diagnosis, who’s at risk in the family, and who’s eligible to receive life-changing therapy.”

Despite enormous recent advances in genomic science, genetic testing today creates more uncertainty than diagnostic clarity, said Heidi Rehm, HMS professor of pathology and chief genomics officer in the Department of Medicine at Massachusetts General Hospital.

Rehm recently analyzed 1.5 million genetic tests done over two years by major sequencing labs across the United States to compare the rates of inconclusive findings related to variants of unknown significance versus these tests’ diagnostic results. Both specific gene-panel tests and the more comprehensive whole-genome sequencing generated diagnostic certainty at disappointingly low rates — in 10 percent and 17.5 percent of cases, respectively.

Another problem is the interpretation of what gene variants mean. Addressing it should start with the establishment of more uniform, clear standards, said Birgit Funke, a molecular geneticist and vice president for genomic health at Sema4.

Making sense of genetic data — both in terms of gene-function discovery and variant interpretation — should be tackled on multiple fronts, experts said: by curating more data, by optimizing clinical encounters between researchers and physicians, and by developing new machine-learning approaches for gene variant interpretation.

On the data-collection front, to chip away at the problem researchers must curate genetic data globally and convene genomic expert panels to weigh in on the function of genes and the role of variants of unknown significance, as well as to review all existing evidence to determine whether it’s sufficient to implicate a gene or a variant in disease. One such project, the NIH-funded ClinGen, is aimed at creating an authoritative central resource that defines the clinical relevance of genes and gene variants for clinical use and for research use. Such efforts must also include the design of faster, simpler ways to query databases and share data.

In the clinic, genetic testing is already used in various ways to inform medical care. Yet, the use of genomic data is by no means uniform across clinical settings, specialties, and disease areas. One pain point is the disconnect between primary care and specialty practices.

To address the problem, Rehm launched the Preventive Genomics Clinic at Mass General, which counsels patients about genetic risk and guides them through genetic-testing decisions. The clinic offers preventive genomic screening, guidance on diagnostic genetic testing, and interpretation of previous genetic test results. To cope with the demand, Rehm and colleagues launched a genetics e-consult service through which primary and specialty physicians ask questions and seek help with result interpretation and guidance on whether to test patients in the first place.

On the variant-interpretation front, there is widespread and ongoing confusion among both clinicians and patients, all of whom are asking important questions, said Sharon Plon, professor of molecular and human genetics at Baylor College of Medicine and an HMS alumna.

One of the things the field needs to wrestle with is the development of new standards for reporting the findings of genetic tests and for writing reports that are clear to the general medical community and patients, Plon said.

Clinicians need to have a concrete framework to communicate results to patients using categories such as “certain,” “highly likely,” and “tentative,” said Vandana Shashi, professor of pediatrics at the Duke University School of Medicine and a member of the Undiagnosed Diseases Network.

Panelists spoke of the ever-growing need for more genetic counselors skilled in interpreting genetic findings and in counseling patients on the most prudent course of action. But the bigger problem remains one of scale — medical science has not yet caught up with the volume of data. The significance and meaning of many gene variants remain poorly understood. Furthermore, as the science evolves, so does the interpretation of variation, with some variants being reclassified over time.

On the AI front, machine learning will play an increasingly important role as new AI tools rapidly sift through vast reams of genetic data to help clarify the meaning of gene variants. One such tool, called EVE, developed by computational biologists at HMS and colleagues at Oxford University, debuted in 2021. When applied to more than 36 million variants across 3,219 disease-associated proteins and genes, EVE indicated more than 256,000 human gene variants of unknown significance that should be reclassified as benign or pathogenic. Researchers hope that in combination with current clinical approaches, EVE and similar tools can improve clinical assessments of genetic mutations and boost diagnostic and prognostic accuracy.
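To make the idea concrete, the minimal Python sketch below shows how per-variant scores from a model of this kind might be bucketed into benign, pathogenic, and uncertain calls, with a middle band left unclassified. It is an illustration only; the variant IDs, scores, and thresholds are invented and do not reflect EVE's actual outputs or interface.

```python
# Minimal sketch: turning per-variant model scores into classifications.
# The variant IDs, scores, and thresholds are invented for illustration
# and are not EVE's actual outputs or interface.

def classify_variant(score, benign_max=0.3, pathogenic_min=0.7):
    """Bucket a 0-to-1 pathogenicity score into three classes, keeping a
    middle 'uncertain' band so ambiguous variants are not over-called."""
    if score <= benign_max:
        return "likely benign"
    if score >= pathogenic_min:
        return "likely pathogenic"
    return "uncertain"

# Toy inputs: placeholder variant IDs with made-up model scores
model_scores = {"variant_001": 0.08, "variant_002": 0.52, "variant_003": 0.91}

for variant_id, score in model_scores.items():
    print(f"{variant_id}: score={score:.2f} -> {classify_variant(score)}")
```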

Making sense of polygenic risk scores

Most rare diseases are caused by a single defective gene, but many common and chronic conditions arise from the minuscule effects of thousands of genes acting synergistically to amplify risk.

Enter polygenic risk scores — algorithms that gauge overall risk for a disease based on the combined effect of the many gene variants a person carries.
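At its core, a polygenic risk score is a weighted sum: each variant's estimated effect size is multiplied by the number of risk alleles a person carries at that position, and the products are added up. The Python sketch below illustrates that arithmetic with invented effect sizes and genotypes; real scores draw on thousands to millions of variants and require careful calibration, as the panelists went on to discuss.

```python
# Minimal sketch of polygenic-risk-score arithmetic: a weighted sum of
# risk-allele counts. The variant IDs, effect sizes, and genotypes are
# invented for illustration; real scores use far more variants and need
# calibration across populations.

# Per-variant effect sizes (e.g., log-odds weights from an association study)
effect_sizes = {"rs_A": 0.12, "rs_B": -0.05, "rs_C": 0.30}

# One person's risk-allele counts at those variants (0, 1, or 2 copies)
allele_counts = {"rs_A": 2, "rs_B": 1, "rs_C": 0}

# Score = sum of (effect size x allele count) across variants
score = sum(effect_sizes[v] * allele_counts[v] for v in effect_sizes)
print(f"Raw polygenic score: {score:.2f}")  # 0.24 - 0.05 + 0.00 = 0.19
```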

Moderator Shamil Sunyaev, professor of biomedical informatics at HMS, asked why these risk scores are not being applied more commonly to inform patient care.

Scientifically, one inherent problem with polygenic risk scores is that it is hard to disentangle the role of genes from lifestyle factors. Another problem is the lack of diverse datasets to render polygenic scores more reliable across individuals with different genetic ancestry.

“I see the single greatest limitation is the fact that our genetic datasets are vastly Euro-centric,” said Alicia Martin, assistant professor of medicine at HMS and an investigator in the Analytic and Translational Genetics Unit at Mass General. “We’re still not where we need to be if we are trying to reflect national and global representation. To overcome this, we need to democratize genomics.”

Clinically, the central challenges in using polygenic risk scores are patient selection — who should get tested and when — understanding the clinical meaning of a person’s score, and deciding whether to treat based on the results.

Despite all this, polygenic risk scores are already informing care for patients with certain diseases.

In some forms of cardiovascular illness, for example, a polygenic risk score exceeds the predictive value of traditional risk factors such as cholesterol and high blood pressure, said Aniruddh Patel, an HMS instructor in medicine at Mass General. And using these scores in combination with other risk assessment tools for cardiac illness augments clinicians’ ability to identify people at elevated risk, he added.

Even among people with elevated polygenic risk for cardiac illness, DNA is not destiny, Patel said. Improved lifestyle and statin medications can dramatically reduce risk — a compelling argument for using polygenic risk scoring in this context.

Even so, questions linger about how many people such scores apply to and whether they carry over across populations. Another challenge is how to convey test results and contextualize their meaning for both frontline clinicians and patients.

“I graduated med school in 2014. I had not heard of the term polygenic risk score until I started doing research over the past five to six years,” Patel said. “The vast majority of primary care physicians who’d be dealing with risk-score reports coming in need to be educated. As far as educating patients, getting into the idea of probability is a complicated concept. We need better tools to convey risk.” 

The concept of normality can seem deceptively simple — flagging whatever falls outside the typical range. Yet, it has bedeviled physicians and scientists for ages and will continue to challenge them even more as new information emerges.

“Deepening our understanding of normality is central not only to genetic testing but to many areas, all areas of clinical practices,” Manrai said. “I hope the perspectives that we heard today will help us enrich our understanding of normality and help us achieve the promise of precision medicine.”