Why do you think you're right?

I estimate the likelihood of a high-consequence biological event stemming directly from human bioengineering activities by 2030 at around 17%, based on (1) accelerating technological democratization, (2) insufficient oversight and governance, and (3) the inherent dual-use risks of synthetic biology (as noted in the question background). While such an event is far from inevitable, the convergence of these factors creates a precarious threshold where intent, error, or hubris could trigger a catastrophe.


First, technological accessibility has outpaced regulatory safeguards. Bioengineering tools like CRISPR, gene synthesis, and AI-driven pathogen design have become cheap, widely available, and user-friendly (see, for example, DIY CRISPR kits). A single individual with modest training and readily obtainable resources could engineer pathogens using open-source algorithms or mail-order DNA sequences. In 2023, researchers demonstrated that AI could design novel antimicrobial peptides and toxic molecules with equal ease, highlighting the dual-use dilemma. Meanwhile, DIY biohacking communities and unregulated labs in jurisdictions with weak oversight (e.g., some private labs in Asia or Africa) operate with minimal accountability. Unlike nuclear technology, bioengineering requires no rare materials or large infrastructure, and the technical knowledge needed to apply it is increasingly accessible via the internet and LLMs (e.g., ChatGPT's ability to troubleshoot lab protocols). This democratization drastically lowers the barrier to misuse. Several expert surveys now rank bioengineered pandemics among the top global risks, citing LLM-enabled proliferation.


Second, there are relevant precedents, and high-risk research has increased as a consequence. In the past 50 years there have been three suspected lab-origin outbreaks (the 1977 H1N1 re-emergence, the 2003 SARS-1 leaks, and likely COVID-19, as supported by 2023 and 2025 US intelligence assessments), suggesting a ~6% annual base rate, now amplified by technological advances. The COVID-19 pandemic has intensified gain-of-function research. Such work, which enhances pathogen transmissibility or lethality to “preempt pandemics,” is conducted in hundreds of BSL-3/4 labs globally, many in countries with opaque safety cultures. Biolab breaches are neither new nor extremely rare. While most incidents have been minor, the proliferation of labs engineering novel pathogens (e.g., avian influenza strains with mammalian adaptation) raises the statistical probability of a catastrophic leak.


Source: Global Biolabs Report 2023
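As a rough sanity check, the base-rate arithmetic above can be sketched in a few lines of Python. The Poisson model and the ~6-year horizon to 2030 are simplifying assumptions of mine, not a stated methodology; the result is an upper bound before discounting for how many leaks would actually be "high-consequence":

```python
import math

# Assumption from the text: 3 suspected lab-origin outbreaks in ~50 years.
events = 3
years = 50
annual_rate = events / years  # 0.06/year, i.e. the "~6% base rate"

# Probability of at least one such event over ~6 years to 2030,
# treating occurrences as a simple Poisson process (my assumption).
horizon = 6
p_at_least_one = 1 - math.exp(-annual_rate * horizon)

print(round(annual_rate, 3))     # 0.06
print(round(p_at_least_one, 3))  # ≈ 0.302
```

Discounting that ~30% for the fraction of leaks severe enough to count as "high-consequence" brings the figure into the neighborhood of the 17% estimate above.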


Third, there is the risk of intentional weaponization by state or non-state actors. Biological weapons are increasingly attractive to rogue states and terrorists due to their low cost and high impact. North Korea and Russia have been accused of maintaining clandestine bioweapon programs, while groups like ISIS have attempted to weaponize natural pathogens. Engineered pathogens, such as antibiotic-resistant plague or hybrid viruses, could bypass conventional defenses. Notably, AI-driven tools could automate the design of such agents, enabling malicious actors to circumvent technical hurdles. In 2024, the U.S. Department of Defense warned that LLMs had already been used to develop and refine bioweapons.


In my view, the 1% probability that some forecasters have argued for here ignores the many risks outlined above.

Why might you be wrong?
  • Overestimation of DIY capabilities: Engineering a high-consequence pathogen may require tacit knowledge beyond current AI/LLM guidance.
  • Improved countermeasures: Advances in universal vaccines, pathogen-agnostic antivirals, or AI-driven outbreak prediction could mitigate spread.
  • Ethical norms: Stronger self-regulation by scientists (e.g., gene synthesis moratoriums) may reduce risks.
  • Attribution failures: Events caused by bioengineering might be misclassified as natural, lowering perceived probability.
