The Future of Backup: AI, Tape, Glass Storage and DNA

Article by: Matt Parker, Infrastructure Engineer at Synextra

Backups don’t get much airtime in conversations about the future of tech. AI, quantum computing, the next wave of cyberattacks — those get the column inches. But in the latest episode of Experts in Polo Shirts, Michael, Alex, and Matt made a pretty compelling case that the future of backup technology is quietly going through one of its most interesting periods in decades — and that the infrastructure keeping your data recoverable when everything goes wrong deserves a lot more attention than it gets.

The stakes are real. The UK Cyber Security Breaches Survey 2025 found that the proportion of UK businesses affected by ransomware doubled year-on-year, with 74% of large businesses reporting some form of cyber breach or attack. Even excluding the ransom itself, the average cost for a UK business to recover from an attack has climbed to $2.58 million. How you store and protect your backups has never mattered more.

Here’s what they covered — and why it’s worth paying attention to.

AI has been in your backup stack longer than you think

The AI story in backup isn’t some future concept. It’s already embedded in products — it’s just older than the current hype cycle.

Vendors have been integrating machine learning-based AV and EDR engines directly into backup pipelines for years, scanning files as they stream from source to destination and flagging malicious content before it gets written. That’s AI. It just doesn’t look like a chatbot, so nobody’s writing breathless articles about it. Some vendors are now calling this category BDR — Backup Detection and Response — essentially a rebrand that reflects how deeply security tooling has become embedded in the backup process itself.
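The in-line scanning idea is straightforward to sketch. Here's a minimal, hypothetical version in Python: `scan_for_malware` stands in for the trained AV/EDR engine a real product would embed, and here just checks a SHA-256 blocklist. The point is the shape of the pipeline: content is inspected in flight, and only clean files ever reach the repository.

```python
import hashlib

# Hypothetical blocklist - a real product embeds a trained ML engine here.
# (This entry is the well-known SHA-256 of an empty file, used for the demo.)
KNOWN_BAD = {"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"}

def scan_for_malware(data: bytes) -> bool:
    """Stand-in verdict function: flag content whose hash is known-bad."""
    return hashlib.sha256(data).hexdigest() in KNOWN_BAD

def backup_stream(files, write):
    """Scan each file in flight; only clean content is written to the
    destination. Flagged files are quarantined (here, just listed)."""
    flagged = []
    for name, data in files:
        if scan_for_malware(data):
            flagged.append(name)  # quarantine instead of writing
            continue
        write(name, data)
    return flagged
```

Real engines classify on behaviour and content features rather than exact hashes, but the write-path interception is the same.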

Where the more recent LLM-style reasoning becomes genuinely useful is intelligent alerting. Most backup monitoring is still surprisingly dumb: you get an email on Sunday morning saying a job failed, someone investigates, repeat. The smarter version — something that notices the same server has failed every Sunday for a month, reads the logs, reasons through the likely cause, and sends you an alert that actually tells you what’s broken and how to fix it — is a meaningful improvement on the status quo. Not AI for its own sake. AI that means fewer Monday mornings staring at log files.
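The pattern-spotting half of that (noticing the recurring failure before any LLM explains it) needs nothing exotic. A sketch, with hypothetical server names and a made-up threshold:

```python
from collections import Counter
from datetime import date

def recurring_failures(failures, min_repeats=4):
    """failures: (server, date) pairs for failed backup jobs.
    Flags any (server, weekday) combination that recurs min_repeats
    times - exactly the 'same server, every Sunday' pattern."""
    pattern = Counter((server, day.strftime("%A")) for server, day in failures)
    return [
        f"{server}: {n} failures on a {weekday} - check the weekly schedule"
        for (server, weekday), n in pattern.items()
        if n >= min_repeats
    ]

# Four consecutive Sundays of failures on the same host:
log = [("sql-01", date(2025, 3, d)) for d in (2, 9, 16, 23)]
```

An LLM layer would then take the flagged pattern plus the job logs and draft the "here's what's broken and how to fix it" alert; the detection itself is just bookkeeping.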

The oldest solution to the newest threat

There’s a persistent assumption that tape is a legacy technology on its way out. It isn’t. It never really left — and in an era of ransomware, it’s arguably more relevant than it’s been in years.

Research found that in 93% of ransomware incidents, threat actors specifically target connected backup repositories — resulting in 75% of victims losing at least some of their backups during the attack. The UK has felt this acutely. The Blue Yonder ransomware attack is a recent example of how quickly the damage cascades — disrupting supply chain management services with direct impact on Sainsbury’s and Morrisons. Ransomware is only effective if it can reach your data. A physical air gap removes that possibility entirely. As Matt puts it: “You cannot beat a physical air gap. Take it out of the machine, it doesn’t require any power, you put it on a shelf.” No network interface, no API endpoint, nothing to encrypt remotely.

This thinking is reflected in the evolution of backup best practice. The traditional 3-2-1 rule — three copies of data, on two different media types, with one stored offsite — has been updated for the ransomware era. The enhanced 3-2-1-1-0 framework adds two critical layers: one immutable copy and zero errors in restoration testing. Tape is a natural fit for that offline, immutable copy.
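The framework is concrete enough to check mechanically. A sketch of a compliance check, assuming a simple hypothetical inventory schema (each copy described by its media type and offsite/immutable flags):

```python
def check_3_2_1_1_0(copies, restore_test_errors):
    """copies: list of dicts like
    {"media": "tape", "offsite": True, "immutable": True}.
    Returns the list of 3-2-1-1-0 rules the inventory fails to meet."""
    gaps = []
    if len(copies) < 3:
        gaps.append("fewer than 3 copies")
    if len({c["media"] for c in copies}) < 2:
        gaps.append("fewer than 2 media types")
    if not any(c["offsite"] for c in copies):
        gaps.append("no offsite copy")
    if not any(c["immutable"] for c in copies):
        gaps.append("no immutable copy")
    if restore_test_errors != 0:
        gaps.append("restore tests not error-free")
    return gaps
```

A disk copy on-prem, a cloud copy offsite, and an immutable tape offsite passes all five rules; drop the tape and the check reports the missing immutable copy.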

The honest caveat is that tape’s total cost of ownership is regularly undersold. It’s not just the media — it’s the hardware, the storage space, and the ongoing maintenance. Tapes sitting on a shelf still need to be periodically loaded, read, and rewritten to prevent bit rot. That’s an operational overhead that doesn’t appear in the per-terabyte cost comparisons. But for organisations that need to store large volumes of data locally and keep it genuinely air-gapped, the economics still make sense — and the ransomware protection case is stronger than ever.
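To see why per-terabyte comparisons mislead, it helps to model the overheads explicitly. Every figure below is a hypothetical placeholder, not vendor pricing; the point is which terms the naive comparison leaves out:

```python
def tape_tco(tb, years, media_per_tb=5.0, drive=4000.0,
             maintenance_per_year=1200.0, refresh_every_years=5):
    """Illustrative-only tape TCO sketch. All figures are made-up
    placeholders. The naive 'cost' is just tb * media_per_tb; the real
    bill adds the drive hardware, ongoing maintenance, and periodic
    refresh cycles (re-buying media and rewriting it to avoid bit rot,
    modelled crudely here as repeating the media cost)."""
    media = tb * media_per_tb
    refreshes = years // refresh_every_years
    return media * (1 + refreshes) + drive + maintenance_per_year * years
```

With these placeholder numbers, 100 TB kept for 10 years costs 17,500 rather than the 500 a pure media-cost comparison would suggest. The ratio is invented, but the structure of the gap is real.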

The one vulnerability nobody has a patch for yet: “They hack the office robot, and then get the robot to go and load the tapes manually.” For now, that remains theoretical.

Microsoft is storing data in kitchen glass – seriously

Microsoft’s Project Silica is one of those research projects that sounds like science fiction until you look at what they’ve actually built. As Matt puts it: “You’ve all seen Star Trek, right? The computer crystals — that’s becoming a reality almost.”

The concept: store data in small, square glass platters — roughly DVD-sized, about half a centimetre thick — by using a femtosecond laser to alter the molecular structure of the glass itself. Data is written in 3D layers (the laser has enough precision to write one layer, then write another beneath it), and read back using ordinary light and microscopy. It’s WORM storage — write once, read many — which means it’s immutable backup storage by default.

The numbers are significant. Raw capacity upwards of 7 terabytes per platter. Durability of at least 1,000 years, with up to 10,000 years achievable in controlled conditions. Unlike magnetic media, the glass isn’t affected by temperature, electricity, or light exposure.

There’s been a significant development since the episode was recorded too. Microsoft recently published new research in Nature showing they’ve cracked how to store data in borosilicate glass — the same material used in kitchen cookware — rather than the expensive fused silica the project previously relied on. The new technique stores hundreds of layers of data in glass just 2mm thick, with improvements to both writing speed and reader simplicity. Microsoft itself describes this as directly addressing one of the key barriers to commercialisation: the cost and availability of storage media.

In practice, the system involves a warehouse of small robots retrieving the right platter on demand. Read/write speeds are not yet competitive with modern storage, and Microsoft’s own messaging suggests the research phase is now complete: the company has indicated it will share its findings with the wider scientific community rather than announce a commercial roadmap. But for use cases where speed doesn’t matter and longevity does, the proposition is hard to argue with. Legal records, regulated data, government archives, anything with a 15- or 20-year retention requirement: write it to glass and stop worrying about it.

There’s also a recycling angle worth noting. Because it’s just glass, a platter can be melted down and recast when data needs to be updated or overwritten. Close to 100% recyclable, with minimal material waste.

The most extreme storage idea that might actually work

DNA storage is still largely in the research phase, but the underlying data density figures are genuinely difficult to comprehend. The team put it simply: “10 billion songs’ worth of data in a litre of liquid.”

To put that in context — we’re talking around 60 petabytes per litre of DNA solution. Most organisations will never generate a petabyte of data across their entire existence. As an information medium, the storage density of DNA makes everything else look crude by comparison.
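The two figures are consistent with each other, assuming a typical compressed audio file of around 6 MB per song, which is a quick back-of-envelope check:

```python
# Sanity check: "10 billion songs in a litre" vs "~60 PB per litre",
# assuming ~6 MB per song (a typical compressed audio file).
songs = 10_000_000_000
mb_per_song = 6
petabytes = songs * mb_per_song / 1_000_000_000  # 1 PB = 10^9 MB
```

10 billion songs at 6 MB each comes out to exactly 60 petabytes, so the litre-of-liquid framing and the density figure are the same claim stated two ways.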

The durability claims are similarly striking — potentially billions of years, though the honest counterpoint is that DNA is an organic molecule and degrades under real-world conditions. The fact that viable dinosaur DNA has never been recovered from amber suggests the upper bound is somewhat theoretical.

The practical infrastructure for reading and writing data at any kind of scale doesn’t exist yet — and the failure modes are admittedly novel. As Alex points out: “Disc four’s gone mouldy again.” But the total volume of data in the world continues to grow — projections had global data creation exceeding hundreds of zettabytes by 2025, and those projections appear to be roughly accurate — so the pressure to find radically higher-density storage solutions is only going to increase. DNA storage is likely part of that answer, eventually.

So what does this mean for your backups?

AI-enhanced monitoring, tape, Project Silica, and DNA storage are all responses to the same underlying problem: data volumes are growing faster than conventional storage can handle, ransomware means immutable backup storage matters more than it ever has, and long-term retention requirements aren’t going away.

The future of backup technology probably isn’t one approach replacing another. It’s a layered strategy — intelligent alerting on top of storage media that’s physically robust, naturally immutable, and built for the long term.

The glass platter sitting in a Microsoft warehouse, waiting for a robot to retrieve it in the year 3026, is just a particularly vivid illustration of where that’s heading.

Backup strategy is one of those things that’s easy to deprioritise until it’s too late. If you’d like a straightforward conversation about where your organisation stands, we’re always happy to chat.
