5 Ways Model Context Protocol Will Transform Disaster Recovery

By Sebastian Straub, Principal Solutions Architect at N2W

Imagine that finding and restoring a file from backups were as easy as asking ChatGPT to draft an email. Or that you could use a Large Language Model (LLM) to help double-check whether any important data is missing from your backups. Or that AI could validate that data was properly restored following an incident.

Thanks to the Model Context Protocol, or MCP, these and many other innovative use cases in the world of data backup and recovery have become possible. Indeed, it’s not hyperbole to say that MCP is poised to reshape certain facets of data protection by making it much easier to integrate agentic AI capabilities into data backup and recovery operations.

This is why, if you haven’t heard of MCP (which is more than possible because the technology remains quite new), now is the time to familiarize yourself with the standard and the potential use cases it supports in the realm of data protection and beyond.

Model Context Protocol: The Latest Innovation In Data Backup and Recovery

What is MCP?

The Model Context Protocol, or MCP, is a standardized framework for connecting AI agents to data sources and tools that are external to an AI model.

Put another way, MCP provides a standard interface between AI models (like those behind Anthropic Claude or ChatGPT) and external systems, allowing the models to interact with resources that they would otherwise not be able to access, and to perform actions that they otherwise couldn’t handle.

For instance, you could use MCP to build an AI agent that integrates an AI model with Gmail. Using the agent, you could issue requests like “remove all of the emails I haven’t opened in thirty days” or “find the email where I asked Jane about the picnic,” and it would complete the task for you.

MCP is a big deal because it helps to enable new possibilities in the field known as agentic AI, which focuses on using AI agents to perform autonomous tasks. Until now, most of the innovation related to generative AI technology has centered on tools and services that are essentially just chatbots, like ChatGPT. Those are great if you need to generate text or code. But if you want to perform other actions – like orchestrating backups or finding files inside backup data – there has historically not been an easy way to do it using generative AI.

MCP closes this gap by providing a standardized way to develop AI agents that can interface with a variety of models, as well as a variety of external tools and services.

How does MCP work?

The MCP architecture is based on a server-client model. An MCP server is a type of process that makes a certain data source or software tool available for use by an AI agent. MCP clients are the AI agents that respond to user requests by interacting with MCP servers and AI models.

This means that to use MCP, you must first create an MCP server that exposes whichever type of data and/or tools you want to work with. In the context of data backup and recovery, MCP servers could do things like make backup data accessible to AI agents, or make it possible for AI agents to perform automated tasks using the command-line interface or APIs provided by backup and recovery software.
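Under the hood, MCP messages follow the JSON-RPC 2.0 format, and clients invoke the tools a server exposes through methods like “tools/call”. As a rough illustration, here is what such a request might look like for a hypothetical backup tool – the tool name and arguments are invented for this example, not part of any real product:

```python
import json

# Shape of an MCP "tools/call" request. MCP messages follow JSON-RPC 2.0;
# the tool name and arguments below are hypothetical, standing in for a
# backup tool that an MCP server might expose.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "list_backup_jobs",          # hypothetical tool name
        "arguments": {"status": "failed"},   # hypothetical arguments
    },
}

print(json.dumps(request, indent=2))
```

An MCP client would send a message shaped like this to the server, which would run the underlying backup command or API call and return the result for the AI agent to interpret.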

The role of MCP in data backup and recovery

MCP is already being used to address various use cases. (For example, check out the “official” list of MCP servers on GitHub, which highlights the types of systems and platforms that MCP can connect with using open source servers.)

But the MCP use cases that we find most interesting center on data backup and recovery. In this area, MCP could potentially enable a wide variety of innovations, such as the following.

1. Formulating backup policies

Using MCP, data protection teams could ask questions about the data present on the systems they need to support as a way of determining what, exactly, to back up and how often to back it up. They could then configure backup policies accordingly.

This would be especially useful if, for example, a team needed to know which data changes most often and should therefore be backed up most frequently. Rather than having to look at file metadata manually, they could use MCP to automate the process.
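To make the idea concrete, a “which data changes most often” check is the kind of logic an MCP server could expose as a tool. The sketch below, which assumes nothing beyond the Python standard library, ranks files by how recently they were modified:

```python
import os
import time

def rank_by_change_recency(root, days=7):
    """Return paths under `root` modified within the last `days` days,
    newest first -- a rough proxy for "changes most often" that an MCP
    server could expose as a tool for an AI agent to query."""
    cutoff = time.time() - days * 86400
    recent = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                mtime = os.path.getmtime(path)
            except OSError:
                continue  # file vanished or is unreadable; skip it
            if mtime >= cutoff:
                recent.append((mtime, path))
    return [path for _, path in sorted(recent, reverse=True)]
```

An agent answering “what should we back up most frequently?” could call a tool like this and then translate the result into a suggested backup schedule.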

2. Monitoring backups

A lot can go wrong when creating backups. Historically, the only way to figure out what happened was to investigate manually.

But with an MCP server that connects to backup data sources and tools, backup admins could pose questions in natural language when investigating backup issues. They could ask, for example, which files are missing from a recent backup due to I/O problems. They could even direct backup tools to reattempt backing up skipped or corrupted files. The result would be faster backups with fewer errors.
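The core of such a completeness check is simple set logic. A minimal sketch – with hypothetical data shapes, since no real backup product’s manifest format is assumed here – might look like this:

```python
def missing_from_backup(source_files, backup_manifest):
    """Files present at the source but absent from a backup manifest --
    the kind of check an MCP server could expose so an agent can answer
    "which files are missing from the latest backup?" in plain language."""
    return sorted(set(source_files) - set(backup_manifest))

# Hypothetical example data:
source = ["/data/a.db", "/data/b.db", "/data/c.log"]
manifest = ["/data/a.db", "/data/c.log"]
print(missing_from_backup(source, manifest))
```

The value MCP adds is not the set arithmetic itself, but letting an agent run this check on demand and report the result conversationally, or feed the missing files back into a retry request.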

3. Exploring backed-up data

File-level recovery from data backups has also traditionally been a slow, manual affair. Hyperscalers like AWS require pre-indexing in order to restore individual files or folders, which can lead to a host of problems. MCP could accelerate it significantly by making it possible to instruct AI agents to do things like “restore the most recent version of document X from backups.” Or, as a more detailed example involving restoring a historical version of a file, an AI agent could be asked to do something like “find the backup of file X that was created prior to the last time John logged into the system, then restore it.”

MCP could also enable the creation of AI agents that check whether backups are complete, providing an added level of assurance that the data organizations need for recovery will actually be there when disaster strikes.

Here again, the benefit of MCP for data backup and recovery would be faster operations and a more streamlined approach to nuanced recovery tasks.

4. Orchestrating data recovery

As every seasoned backup admin knows, creating backups is only half the data protection battle – if that. The real challenge often lies in restoring files and systems following an outage or incident.

There are innovative cloud native platforms that can already provide powerful recovery capabilities to help with this process. But MCP could enhance them by allowing admins to issue nuanced instructions during recovery operations. For example, if a team knew that a certain server had been compromised by malware, they could use MCP and an AI agent to tell their recovery tools not to restore any files from that server, or to restore them only using older backups that predated the malware infection.
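The selection logic behind that kind of instruction could look something like the following sketch, which picks a restore point per host while excluding post-infection backups for compromised hosts. Everything here – the data shapes and the function itself – is a hypothetical illustration, not any vendor’s implementation:

```python
def select_restore_points(backups, compromised_hosts, infection_time):
    """For each host, pick the newest backup, except that for hosts in
    `compromised_hosts` only backups created before `infection_time`
    are considered. `backups` is a list of (host, timestamp, backup_id)
    tuples; returns {host: backup_id}."""
    chosen = {}
    for host, ts, backup_id in backups:
        if host in compromised_hosts and ts >= infection_time:
            continue  # skip restore points that may carry the malware
        best = chosen.get(host)
        if best is None or ts > best[0]:
            chosen[host] = (ts, backup_id)
    return {host: bid for host, (ts, bid) in chosen.items()}
```

An agent mediating between an admin’s instruction (“don’t restore anything from that server after Tuesday”) and the recovery tool would translate the instruction into parameters like these.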

Capabilities like these would save crucial time during recovery – and given that downtime can cost hundreds of thousands of dollars per minute, those time savings are very valuable.

5. Cost-optimizing backup and recovery

Speaking of money, MCP could also help organizations to add cost efficiency to their data protection strategies.

There are many ways to reduce disaster recovery costs, such as moving older backups to lower-cost storage tiers and using IT asset prioritization to reduce backup frequency for low-priority assets. A major challenge surrounding strategies like these, however, is that they require manual analysis and configuration to implement.

With MCP, teams could potentially use AI to help assess their backup and recovery needs, then configure backup tools and policies accordingly. In this way, agentic AI could potentially identify disaster recovery cost-savings opportunities that a business is overlooking, while also reducing the strain placed on IT teams.
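One simple form such an assessment could take is age-based tiering. The thresholds below are an illustrative policy invented for this sketch, not a recommendation from any vendor or cloud provider:

```python
def suggest_tier(age_days):
    """Map a backup's age to a storage tier. The thresholds are a
    hypothetical policy an agent might propose for review."""
    if age_days <= 30:
        return "hot"
    if age_days <= 180:
        return "warm"
    return "archive"

def tiering_plan(backups):
    """backups: list of (backup_id, age_days) pairs.
    Returns a {backup_id: suggested_tier} plan an admin could approve
    before any data is actually moved."""
    return {bid: suggest_tier(age) for bid, age in backups}
```

The important design point is that the agent proposes the plan and a human approves it – consistent with the caution about AI errors discussed later in this article.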

The state of MCP for data backup

One important point to emphasize about MCP and data protection is that, to date, very little has actually been done to create MCP-based tooling for backup and recovery. The use cases described above therefore remain largely speculative; the software to support them doesn’t yet exist.

But that’s due mostly to the fact that MCP, which Anthropic introduced in late 2024, remains quite new. Most early MCP server implementations targeted low-hanging use cases (like email or consumer cloud storage management) that benefit large sets of users. It will naturally take some time before enterprise-focused MCP tooling emerges for narrower domains, like data backup and recovery.

But there’s no technical reason why creating such tools would be difficult. It’s likely only a matter of time before we start to see MCP-powered AI agents become a common part of data protection routines.

Limitations of MCP for data protection

Another important point to note is that MCP has clear limitations, such as:

• Security: Any data that is exposed through an MCP server could potentially be leaked or abused. For this reason, MCP should only be used to work with backup data within secure, access-controlled environments.

• Capability limitations: MCP agents primarily rely on command-line interfaces or APIs to carry out tasks. If no CLI or API is available for a given tool, MCP usually has no way to implement a given capability. Technically, it’s possible to emulate mouse clicks so that MCP can integrate with tools that only expose a graphical interface, but this is much more complicated and tool-specific.

• Risk of errors: Like all forms of generative AI technology, MCP is subject to hallucination risks, which could cause agents to do things that admins didn’t intend. It’s important to check their work and never blindly entrust important workflows to AI alone.

These and other limitations mean that MCP – and agentic AI in general – are not full-fledged replacements for traditional backup and recovery platforms or for skilled backup admins. MCP can provide important feature enhancements when used in conjunction with data recovery tools, but someone still has to tell MCP agents what to do, and backup and recovery software needs to provide the functionality that allows them to do it.

The future of MCP and data protection

The bottom line: Don’t expect MCP to revolutionize every aspect of data backup and recovery. But do expect it to become an important resource for teams aiming to reduce manual toil and build better bridges between AI and backup and recovery software. For backup admins, taking advantage of agentic AI as a data protection assistant is likely to become an important skill.

___

Author Bio: Sebastian Straub, Principal Solutions Architect at N2W

Sebastian is the Principal Solutions Architect at N2W, bringing more than two decades of experience in enterprise technology, data protection and cybersecurity. With previous critical roles at Dell, Oracle, the FBI and the Department of Defense, he has established himself as a leading expert in enterprise security, backup and DR, and identity management solutions.

[email protected]

