
Alzheimer’s Research Faces a Data Crisis

At WIRED Health 2026, John Hardy revealed a critical gap in Alzheimer’s research: the science is ahead of the data infrastructure. The bottleneck isn’t biology; it’s bytes.

The treatments are already here. That was the first thing John Hardy said at WIRED Health on March 14, 2026. Not “we’re close.” Not “we’re making progress.” The therapies exist. We can slow cognitive decline. We can detect amyloid-beta and tau earlier than ever. But none of it matters at scale, because the data systems meant to support diagnosis, treatment, and research are in ruins.

Key Takeaways

  • John Hardy claims effective Alzheimer’s treatments are already available—but 90% of patients won’t benefit due to systemic data gaps
  • The bottleneck isn’t scientific discovery; it’s the lack of centralized, interoperable health data across clinics and trials
  • Current diagnostic tools rely on imaging and biomarkers that are underutilized because they’re siloed in disconnected hospital systems
  • Hardy warns that without urgent infrastructure investment, new therapies will fail to reach patients even if approved
  • The UK Biobank and ADNI are cited as rare models—but they cover only a fraction of global need

The Treatment Is Not the Problem

That’s the most counterintuitive part of what Hardy laid out. For decades, Alzheimer’s research chased a cure. The focus was on beta-amyloid, tau tangles, inflammation, synaptic loss. Now, after billions in funding and decades of trial failures, we have drugs that work—modestly, conditionally, but they work.

Lecanemab, donanemab, remternetug—they’ve shown 27% to 35% slowing of decline in early-stage patients. That’s not a cure. But it’s enough to delay nursing home placement, preserve independence, reduce caregiver burden. These are meaningful wins.

And yet, Hardy didn’t spend his talk celebrating them. He didn’t even mention their names until the Q&A. Because, as he put it: “It doesn’t matter if you’ve got the right key if the door’s locked from the inside.”

Data Is the Real Bottleneck

The door, in this case, is a global healthcare system that can’t identify who should get these drugs. We can detect amyloid buildup with PET scans. We can measure phosphorylated tau in cerebrospinal fluid. Blood tests for p-tau217 are now 93% accurate in predicting Alzheimer’s pathology. But who gets tested?

Not enough people. In the U.S., only 1 in 5 patients with early cognitive symptoms receives biomarker testing. In the UK, it’s worse—closer to 1 in 10. Why? Cost, access, and a complete absence of integrated data systems.

Imagine a primary care doctor in Leeds sees a 68-year-old with memory complaints. She suspects early Alzheimer’s. But her clinic doesn’t have a protocol to order a blood test. The local hospital’s PET scanner is booked for oncology. The patient’s cognitive assessments are in a paper file. Even if the test comes back positive, the specialist who could start treatment is 80 miles away—and doesn’t share electronic records with her.

That’s not an outlier. That’s the norm.

The Infrastructure Gap No One’s Fixing

Drug developers aren’t building data pipelines. Pharma companies run trials with tightly controlled datasets—but those end when the trial does. Hospitals collect data, but it’s trapped in incompatible EHRs. Research biobanks like the UK Biobank have deep longitudinal data, but they’re research tools, not clinical pathways.

Hardy called this a “tragedy of distributed failure.” No single entity owns the problem. The NIH funds discovery, not interoperability. The NHS prioritizes acute care. The FDA approves drugs, not data standards.

And so we have effective treatments idling in a world that can’t deliver them.

The Cost of Inaction

Alzheimer’s costs the U.S. $360 billion annually. By 2030, that could hit $500 billion. We’re spending that money on late-stage care, emergency visits, institutionalization. Meanwhile, early intervention could cut those costs—by delaying progression, reducing hospitalizations, keeping people at home longer.

But the economic case for early treatment collapses if we can’t find patients early. And we can’t find them because we’re not connecting the dots.

Consider this: the Alzheimer’s Disease Neuroimaging Initiative (ADNI) has spent $250 million over 20 years building a gold-standard dataset. It’s used in over 2,000 papers. But it includes only about 2,000 participants. The U.S. has over 6 million people with Alzheimer’s.

That’s not a dataset. That’s a sketch.

  • Only 3% of U.S. adults over 65 have had cognitive screening in the past year
  • Fewer than 10% of memory clinics routinely use blood-based biomarkers
  • No national health system mandates amyloid screening for early dementia
  • The average diagnostic delay from symptom onset to confirmation is 3.2 years
  • During that delay, patients miss the window for disease-modifying therapies

Hardy’s Warning: Science Outran Systems

“We’ve done the hard part,” Hardy said. “We’ve figured out the biology. Now we’re failing at the logistics.” That’s not just a problem for patients. It’s a crisis for the entire research pipeline.

When trials depend on recruiting patients with confirmed amyloid pathology, and only a fraction of eligible people are diagnosed, recruitment slows to a crawl. Trial sites burn through budgets waiting for participants. Sponsors delay launches. Investors pull back.

And every year of delay means 2 million more people enter the system too late for treatment.

Hardy pointed to the UK Biobank as a model—not because it’s perfect, but because it’s unified. It links genetic data, imaging, blood markers, and cognitive scores for 500,000 people. Researchers can query it in real time. But it wasn’t built for care. It wasn’t built to trigger clinical action.

“We need something like that,” Hardy said, “but embedded in the health system. Not a research island. A clinical network.”

“We’ve figured out the biology. Now we’re failing at the logistics.” — John Hardy, WIRED Health 2026

What Competitors Are Building—and Where They’re Falling Short

Some companies are trying to close the data gap—but their efforts remain fragmented. Roche’s Elecsys p-tau217 blood test is available in Europe and under FDA review, but it’s only used in select academic centers. The test costs $400 and isn’t routinely covered by insurers. Even when results are available, they don’t automatically feed into primary care EHRs like Epic or Cerner.

Biogen and Eisai are working with a consortium of U.S. health systems—including Mayo Clinic and Mass General Brigham—to pilot integrated diagnostic pathways for lecanemab. But those programs cover fewer than 50,000 patients combined. The infrastructure they’re building—custom APIs, biomarker dashboards, referral workflows—isn’t being standardized or shared.

Startups like C2N Diagnostics and ALZpath offer blood-based tests and data platforms, but they operate outside mainstream care. Their tools require clinicians to opt in, manually upload data, and interpret results without decision support. None are FHIR-enabled or connected to payer systems.

In the EU, the Joint Programme – Neurodegenerative Disease Research (JPND) has funded cross-border data sharing projects in 30 countries. But national privacy laws, especially Germany’s strict interpretation of GDPR, have blocked real-time access. One project in the Netherlands linked 15,000 patient records across three provinces—only after a two-year legal review.

The irony? The technical pieces exist. HL7 FHIR standards allow secure, modular data exchange. Google Health’s Common Data Model has shown it can harmonize EHRs across disparate systems. But adoption is voluntary. Without mandates, incentives, or public funding, integration stays patchy.
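To make the FHIR point concrete, here is a minimal sketch of what an interoperable blood-biomarker result could look like as a FHIR R4 Observation resource, built in plain Python. The function name, patient ID, and the `code` value are illustrative placeholders (p-tau217 coding would need the lab’s assigned LOINC code in a real deployment); this is a shape sketch, not a production integration.

```python
import json

def ptau_observation(patient_id: str, value_pg_ml: float) -> dict:
    """Build a minimal FHIR R4 Observation for a plasma p-tau217 result.

    The specific coding values below are illustrative placeholders,
    not official clinical codes.
    """
    return {
        "resourceType": "Observation",
        "status": "final",
        "category": [{"coding": [{
            "system": "http://terminology.hl7.org/CodeSystem/observation-category",
            "code": "laboratory"}]}],
        "code": {"coding": [{
            "system": "http://loinc.org",
            "code": "XXXXX-X",  # placeholder: a real system would use the assigned LOINC code
            "display": "Phosphorylated tau 217, plasma"}]},
        "subject": {"reference": f"Patient/{patient_id}"},
        "valueQuantity": {
            "value": value_pg_ml,
            "unit": "pg/mL",
            "system": "http://unitsofmeasure.org",
            "code": "pg/mL"},
    }

# A result expressed this way can be POSTed to any FHIR server and
# queried by any FHIR-aware EHR, which is the "modular exchange" Hardy's
# argument depends on.
print(json.dumps(ptau_observation("example-123", 0.42), indent=2))
```

Because the resource is just standardized JSON, the same payload works against Epic, Cerner, or a research registry, which is exactly why voluntary adoption (rather than technical difficulty) is the sticking point.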

The Bigger Picture: Why This Matters Now

This isn’t just about Alzheimer’s. It’s about how medicine handles complex chronic diseases in the 21st century. Diabetes, heart failure, Parkinson’s—they all depend on early detection, longitudinal tracking, and coordinated care. Alzheimer’s is simply the first neurodegenerative condition where we have both effective treatments and validated biomarkers. That makes it a test case.

If we can’t deploy drugs like lecanemab at scale, what does that say about our readiness for other precision therapies? The pipeline is filling. Novo Nordisk is testing semaglutide for cognitive protection. AC Immune is developing vaccines targeting multiple tau isoforms. But none will matter if patients aren’t identified early and tracked consistently.

The U.S. Centers for Medicare & Medicaid Services (CMS) recently proposed a rule requiring cognitive assessments for beneficiaries over 65 during annual wellness visits. That’s a start—but without linked biomarker data, such screenings are just check-the-box exercises. Same with the UK’s NHS Long Term Plan, which promises expanded dementia diagnosis rates by 2030. It allocates £90 million for memory services, but zero for data integration.

Meanwhile, private insurers are hesitant. UnitedHealthcare and Aetna won’t cover amyloid PET scans without prior neuropsych testing and specialist referral—steps that can take months. Medicare’s reimbursement for blood-based biomarkers remains unclear, even as tests like ALZpath p-tau217 move toward commercial availability.

We’re stuck in a loop: payers demand real-world evidence of cost savings, but we can’t generate that evidence without broader testing, and we can’t scale testing without payer support. Breaking that cycle requires a coordinated push—something no single stakeholder is willing to lead.

What This Means For You

If you’re building health tech, this isn’t someone else’s problem. The gap between discovery and delivery is where engineers, data architects, and software founders have more leverage than clinicians or regulators. The tools exist to integrate EHRs, standardize biomarker reporting, automate risk scoring, and connect primary care to specialty networks. The missing piece isn’t technical—it’s coordination.

Startups that can bridge research data and clinical workflows—without reinventing the wheel—will be critical. Think API layers for biomarker data, FHIR-compliant dementia modules, or edge-computing tools that run cognitive assessments on low-bandwidth devices. This isn’t about chasing AI hallucinations. It’s about plumbing. And plumbing pays when the faucet finally turns on.
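The “automate risk scoring, connect primary care to specialty networks” idea can be sketched in a few lines. The triage rule below is a toy illustration under assumed thresholds (a MoCA cutoff of 26 and a binary biomarker flag are common conventions, but the routing logic here is invented for this example and is not clinical guidance).

```python
from dataclasses import dataclass

@dataclass
class ScreeningResult:
    moca_score: int          # Montreal Cognitive Assessment, 0-30 (higher is better)
    ptau217_positive: bool   # blood biomarker flag, threshold set by the testing lab

def referral_priority(r: ScreeningResult) -> str:
    """Toy triage rule: route patients with both an abnormal cognitive
    screen and a positive blood biomarker straight to specialist
    assessment. Thresholds are illustrative, not clinical guidance."""
    if r.moca_score < 26 and r.ptau217_positive:
        return "refer-specialist"
    if r.moca_score < 26 or r.ptau217_positive:
        return "repeat-screening"
    return "routine-monitoring"

# With these illustrative thresholds, an abnormal screen plus a positive
# biomarker triggers the specialist pathway.
print(referral_priority(ScreeningResult(moca_score=23, ptau217_positive=True)))
```

The value of even a trivial rule like this is where it runs: embedded in the primary care EHR at the point of the wellness visit, it closes the loop between screening and referral that the article argues is currently missing.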

Because here’s the real question: what good is a cure if the system can’t deliver it?

Sources: Wired, original report

About AI Post Daily

Independent coverage of artificial intelligence, machine learning, cybersecurity, and the technology shaping our future.
