The whole proposition was and still is so wild to me: self-organizing, distinct little molecular legos, which fit together a certain way and form sequences corresponding to another set of building blocks, proteins, which then go on to construct, well, everything. I mean literally everything. I’m talking about DNA.
The early 2000s were like a golden age in genomics research. I was getting my undergraduate degree when the Human Genome Project had just wrapped up a few years earlier. Some of it was done in my hometown, Houston, at Baylor College of Medicine. An offshoot of that was the HapMap Project, which genotyped many individuals of different ethnicities to find distinct inheritance patterns and, in doing so, mapped out ‘haplotypes’: sets of gene variants that tend to be inherited together. With the use of haplotypes, geneticists were able to trace human ancestry and understand specific inheritance patterns.
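To make haplotypes concrete, here is a minimal sketch (the alleles and counts are invented for illustration; this is not the HapMap data or method):

```python
from collections import Counter

# Toy haplotypes: phased alleles at three nearby SNP sites, written as one
# string per chromosome copy. Because nearby variants tend to be inherited
# together, a few combinations dominate in a real population.
chromosomes = [
    "A-C-T", "A-C-T", "A-C-T", "A-C-T",   # common haplotype 1
    "G-T-T", "G-T-T", "G-T-T",            # common haplotype 2
    "A-T-C",                              # rare recombinant
]

for haplotype, n in Counter(chromosomes).most_common():
    print(f"{haplotype}: {n} of {len(chromosomes)} chromosomes")
```

Of the eight combinations these alleles could form in principle, only three appear, and two of them account for almost everything; that clustering is what lets geneticists trace ancestry from haplotypes.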
It felt like a singularly optimistic era, one so auspicious as to connect the strands of deep evolutionary time—memories of ancient humans as they marauded across continents, carrying with them new mutations and phasing out others—with the gleaming technological promise of molding the very fate of our species with gene therapy, personalized medicine and novel treatments beyond anyone’s wildest dreams. We knew our past, we understood our present and we would soon know our future. More than anything, simply put, it was hopeful.
Rather, I was hopeful. Like a 2-D avatar or emoji standing on the early slope of a majestic logistic curve, just before the inflection point was apparent, falsely assuming that there was nowhere to go but up.
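For what it’s worth, the curve I have in mind is the logistic function (my gloss on the metaphor, with L, k and t_0 standing for the ceiling, the steepness and the inflection time):

$$ f(t) = \frac{L}{1 + e^{-k(t - t_0)}} $$

Well below t_0 it is indistinguishable from an exponential; at t = t_0 comes the inflection point; beyond it the curve flattens toward the ceiling L, the asymptote. Standing to the left of t_0, the local slope alone cannot tell you which curve you are on.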
Part of this hopefulness derived from being so close to the Texas Medical Center in Houston, of which Baylor College of Medicine is a part. It is the largest medical center in the world and even has its own dedicated zip code. Two titans of cardiothoracic surgery turned rivals, Dr. Michael DeBakey and Dr. Denton Cooley, were still operating then (though at different hospitals), and their presence embodied a palpable, inescapable mythos. Memorial Hermann Hospital’s LifeFlight program, an air ambulance service, was one of the first of its kind, and its founder, Dr. James “Red” Duke, was a legendary trauma surgeon. While still in training at Dallas’ Parkland Hospital, Red was on the trauma team that responded when John F. Kennedy was assassinated. He famously observed on the spot, “I can’t do much for a dead man.”
Then there’s MD Anderson, which has frequently ranked number one or two in cancer treatment since the rankings began in 1990. It’s considered one of the world’s few medical moonshots; if any place has a shot at curing a significant portion of cancers, this is it.
I suppose it’s in the very nature of legends to be grandiose and somewhat contrived, but their promise is based on a sort of canonical triumph. At the very least, these legends created the sense of optimism and fervor necessary to drive many young minds to pursue careers in science.
But by the time I had finished my training, something had changed.
The nearly twenty years since the Human Genome Project was completed have been an utter disappointment. Not a single major treatment or medical intervention came to fruition as a direct result. As it turns out, genetic variants are responsible for only a handful of rare diseases; the common ones afflicting millions, such as Alzheimer’s and heart disease, are more complicated. These are heterogeneous conditions that cannot be so cleanly mapped to genetic code, and are very often a mixture of genes, nurture, environment and pure luck. What’s more, data sharing and open access, a central promise of the Human Genome Project, hasn’t panned out either. What we have now is a convoluted set of databases, barriers and intentional siloing (though often justified), making ongoing research on existing data far more onerous than it needs to be. Sadly, what started as peering into our past to connect with our distant ancestors is recognizable to the public today as a handful of DNA testing services whose only benefit to society, besides providing casual dinner party conversation fodder, is cheap paternity tests.
I should have seen this coming. It was the same trajectory modern physics took. Since the triumphs of Relativity and Quantum theory (perhaps the highest low-hanging fruit that we hairless apes have yet been able to reach), and the coalescing around the Standard Model as the only “new” branch capable of being subjected to experimental rigor, there really hasn’t been much else (as for string theory, people either say it’s dead or on its deathbed at the Science Hospice). It’s no wonder, then, that prospective physics PhD candidates today are actively counseled out of “foundational physics” and into other fields such as artificial intelligence or biophysics.
When the Higgs Boson was discovered, I got a text from my brother (who happens to be a physics major turned AI engineer):
“So yeah, they confirmed its existence.”
“Pretty cool.”
“Lol yeah.”
This horrifyingly hollow exchange in response to the discovery of what popular science called “the God Particle” basically says it all (the name was a bastardization of its original and more apt moniker, “the Goddamn Particle,” a nod to how difficult it was to detect). The experiment that definitively found the Higgs particle was carried out at the Large Hadron Collider, a massive ring-like structure 17 miles in circumference, lined with powerful magnets that steer beams of particles so they crash into one another at ungodly speeds. It was through these collisions that the much-coveted and once purely theoretical Higgs Boson was drawn out. The whole project was estimated to have cost around 13 billion dollars. No doubt it was an epic marvel of engineering and experimental design. But in the end, it was just confirmation of existing theory. Nothing fundamentally novel was discovered. The asymptote was plainly evident, stretching on forever along an invisible, seemingly uncrossable barrier.
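For a feel for “ungodly speeds,” a back-of-envelope sketch (the inputs are my assumptions for illustration: roughly 4 TeV per proton beam, as in the 2012 run, and a proton rest energy of about 0.938 GeV):

```python
import math

# Back-of-envelope proton speed at the LHC.
# Assumed inputs: ~4 TeV per beam (2012 run) and a proton rest energy
# of ~0.938 GeV; both are assumptions for illustration.
beam_energy_gev = 4000.0   # energy per proton, in GeV
proton_rest_gev = 0.938    # proton rest energy m*c^2, in GeV

gamma = beam_energy_gev / proton_rest_gev   # Lorentz factor E / (m c^2)
beta = math.sqrt(1.0 - 1.0 / gamma**2)      # speed as a fraction of c

print(f"gamma = {gamma:.0f}")               # ~4264
print(f"v = {beta:.10f} c")                 # ~0.9999999725 c
print(f"shortfall from c: {1 - beta:.1e}")  # ~2.7e-08
```

The protons fall short of light speed by a few parts in a hundred million, all to confirm a particle first predicted in the 1960s.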
On the subject of medicine, the Human Genome Project proved quite prophetic. There’s no better parallel than cancer research, which was similarly marked by lofty ambitions, a high price tag and a long wait. Though targeted and immune therapies such as checkpoint blockade have made relatively big strides in certain cancers, and more potential targets are continuously unearthed, one can safely say that for about a dozen cancers survival rates have not changed since the ‘70s. Among these are pancreatic, brain, esophageal, hepatobiliary, lung and some forms of leukemia. One can also say that for most patients, any incremental longevity gains are far eclipsed by the negative biopsychosocial consequences of the exponentially increasing costs of care. Keep in mind that all of this was occurring as the replication crisis in science was coming to a head. And oh yeah, some entire subfields of clinical research might have been built on pure fraud. Just this year it came to light that research underpinning the long-presumed cause of Alzheimer’s disease, the amyloid protein, might have been fabricated. If that turns out to be the case, it means billions of dollars over the course of a decade were spent chasing the wrong lead.
But wait, what about Silicon Valley? They went from mobile apps to private space travel after all (for the sake of argument, let’s conveniently ignore Theranos). Many exciting startups are indeed taking big swings at very hard problems, from miniaturizing nuclear fusion to manufacturing in zero gravity. But the biggest bets are being made in artificial intelligence. Presently there is no human-like, general AI. The most cutting-edge machine learning algorithms are capable of creating art from scratch, playing chess better than any human and analyzing vast data sets to find ever-subtler patterns — all of which rely on an existing paradigm (“priors,” if you will) and a lot of data. What they cannot yet do is generate new paradigms, ascertain what even constitutes data, collect and appraise new information and come up with novel theories. IBM’s Watson (the reigning champion of Jeopardy) set out to “eradicate cancer” and entered into a partnership with MD Anderson. After a lot of data crunching and 62 million dollars, the project failed to produce anything and was unceremoniously disbanded.
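A toy version of that limitation (my illustration, not a description of Watson or any real system):

```python
# A nearest-neighbor "learner": it can only ever answer with labels it has
# already seen. No query, however strange, makes it invent a new category
# or question whether these were the right features to collect.
def nearest_neighbor(train, query):
    def dist(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    _, label = min(train, key=lambda item: dist(item[0], query))
    return label

data = [((0, 0), "cat"), ((1, 0), "cat"), ((5, 5), "dog")]
print(nearest_neighbor(data, (4, 4)))      # -> dog
print(nearest_neighbor(data, (100, 100)))  # -> dog, never a third class
```

The paradigm (two features, two labels) is fixed in advance; the data only lets the model interpolate within it.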
The younger generations see it now. There isn’t much to get excited about in the pure sciences. The college-to-PhD-to-postdoc-to-professorship-to-tenure pipeline is hardly about science anymore. It’s a Faustian bargain to secure a comfortable living after a decade or more of highly skilled yet poorly compensated labor. Science funding as a whole has serious problems. It’s often infiltrated by corporations and tends to reward safe, niche and somewhat repetitive research. The NIH, an agency of the federal government, is the single most important funding apparatus for basic science research in the US, and as such it wields a tremendous amount of influence. It’s a rather top-down organization controlled by a relatively small group, and the NIH director is nominated by the president and can serve unlimited terms. The previous director, Dr. Francis Collins, served for 12 years before retiring in 2021.
Where did Collins cut his teeth, exactly? Well, he ran the Human Genome Project.
“I don’t think we’re living in an incredibly fast technological age.” - Peter Thiel
So what’s left for aspiring scientists and intellectuals? What’s the sales pitch? Well, it’s not really science. There is of course the cut-and-dried advertisement of a stable job within medicine, tech and even academia (though finance and trading have sucked up a good bit of this). But beyond that, the pitch becomes quite bizarre.
Undergraduates and medical students are now being encouraged to solve abstract or all-encompassing societal problems in faraway domains, ranging from race and gender to climate change and global diplomacy. The British Medical Journal recently published an article on climate change solutions, not because medicine has made so many strides that it has resources left over, but for precisely the opposite reason: medicine has stagnated. Climate change itself is an interesting case study of this phenomenon; people would much rather frame climate as a moral and societal issue than a technological one, all the while actively overlooking a multitude of possible solutions. One of the most powerful climate groups, the Sierra Club, is “unequivocally opposed” to nuclear energy, a technology with the potential to be the cleanest energy source.
I could name many more instances, but the evidence is abundant for anyone willing to acknowledge it. If I sound like a science pessimist, rest assured I am not. I am a science ONLY-ist. Without a healthy, dynamic and culturally sane science infrastructure we are, simply put, doomed.
Contending with scientific asymptotes has a silver lining. It forces us to stop, reflect and ask questions. Should we be funding the same kinds of research? Should we broaden what’s considered “acceptable discourse” and re-examine contrarian, unpopular, under-explored or, dare I say, “fringe” viewpoints? Should a small group at select agencies and corporations have such an outsize say? Should education prepare us for silos or a multidisciplinary environment?
I actually think the Human Genome Project and LHC/CERN were worthy endeavors, and they will likely bear fruit in unexpected areas. But I also think a lot of money is wasted, that standards have diminished, that scientific censorship has run amok and, most importantly, that we have lost the ability to sell science to the younger generations on its own merits.
That is a travesty and it has to change.
The Scientific Asymptote
a thought along these lines that has long niggled at me:
the issue with modern foundational science is that it has become incredibly specialized and rarefied. where once there were "renaissance men" who could be at the cutting edge of many fields, now it takes a whole career to get anywhere near a tiny edge piece in just one.
this has a pernicious dual effect:
1. by the time you get there, you tend to have been sucked into an orthodoxy by the exigent needs of funding, publication, and tenure and will seek to defend it from challenge.
2. it makes interdisciplinary work much harder as few are cutting edge in multiple fields and interdisciplinary interaction is where so many breakthroughs and interesting findings occur.
everything is siloed and gathered into dogmatic fiefdoms desperate to never change paradigms. it's a sort of recipe for stultification and stagnation.
i suspect the future is in getting fields to interact with one another as was once the purpose of collecting them in one place (like research parks or universities). this mission has been lost.
in its rediscovery may lie the path to once more making progress.
(oddly, this may be a field for which AI is better suited than humans...)
As Moore's Law collapses
As more replication crises occur
As nothing new comes from pharma or physics
As we refine science vs discover anything new
As science slowly fossilizes
As our dominant culture asserts supremacy of one way
Slowly the seeds spread for revolutions in thought, discourse, science and culture
It'll be messy AF, but there is no better option.
Some part of society needs to go SS3