Data sovereignty creates an illusion of security: the real battle is software integrity

Amidst a tense global environment, a rapidly evolving technological landscape, and a wave of new regulations, such as the EU’s Digital Operational Resilience Act (DORA), data sovereignty has become the security strategy of choice.
Governments and businesses alike are betting that keeping data within national borders makes it safer, more compliant, and easier to manage.
The push toward sovereign clouds is an attempt to solve a 21st-century problem with a 20th-century mindset. The premise of data localization assumes that security comes from borders and geography. In fact, modern cyber risk is less about where data resides and more about whether the software that processes it can be trusted.
If the software supply chain (the network of code, tools, dependencies, and processes used to build, package, and deliver software) is insecure or riddled with vulnerabilities, then a well-intentioned focus on data geography is a surface-level fix that leaves the core problem of data security unaddressed.
Where data localization can fail
The decision to localize data does address one real concern: political control. In theory, it gives local governments control over data, protecting it from foreign access or censorship.
However, this move is primarily administrative and political, and does not improve technical security. Storing data on local servers does nothing to protect the applications and code running on that cloud infrastructure.
More than 90% of the code in the systems we use today, from banking services to the platforms we use to view content, is open source. That code is assembled from thousands of components written by strangers on the internet, all over the world.
Regardless of where it is deployed, open source software may contain bugs or vulnerabilities, introduced either unintentionally or deliberately.
A server may sit in London or San Francisco, but a critical library it depends on may have been written by an engineer in Bengaluru and could contain a zero-day vulnerability injected by a threat actor operating out of Moscow.
Data localization does not solve this problem. The risk is determined not by the physical location of the data center, but by the security of the components in the software supply chain. If the code is compromised, the location of the data becomes irrelevant: it can be exfiltrated or corrupted wherever it sits.
An open source puzzle
There is a tension at the heart of the sovereign cloud debate. Sovereignty pushes toward isolation, a step back toward a closed-off mentality. Yet businesses today rely on the agility, speed, and interoperability of the open source development model to support innovation and stay competitive.
We cannot afford to view the global software development ecosystem with suspicion; rather, we should encourage more integration and, crucially, more validation.
But how does one balance the speed and ease of open source development with the security and compliance that regulators and customers now require?
A mature security strategy for 2026 will embrace a global, open development environment and instead focus on tools that verify the integrity and provenance of every line of code, giving every part of the software stack a traceable history.
More visibility into where software was written and by whom (whether a human developer or an AI), as well as where the data ultimately resides, will help organizations and governments build and innovate with more confidence.
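To make that concrete, here is a rough sketch, not drawn from the article or any specific vendor's tooling, of what integrity checking can look like in practice: every third-party artifact is pinned to a known-good SHA-256 digest, and anything that does not match is rejected before it enters the build. The artifact names and digests below are hypothetical placeholders.

```python
# Minimal sketch of supply-chain integrity checking: each third-party
# artifact is pinned to a known-good SHA-256 digest, and anything that
# is missing or does not match is rejected before the build proceeds.
import hashlib
import sys
from pathlib import Path

PINNED_DIGESTS = {
    # Hypothetical artifacts and digests, for illustration only.
    "payments-lib-4.2.1.tar.gz": "placeholder-sha256-digest-1",
    "auth-widget-0.9.3.tar.gz": "placeholder-sha256-digest-2",
}

def sha256_of(path: Path) -> str:
    """Stream the file in chunks so large artifacts need not fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify(artifact_dir: Path) -> bool:
    """Return True only if every pinned artifact is present and unmodified."""
    ok = True
    for name, expected in PINNED_DIGESTS.items():
        candidate = artifact_dir / name
        if not candidate.exists():
            print(f"MISSING  {name}")
            ok = False
        elif sha256_of(candidate) != expected:
            print(f"TAMPERED {name}")
            ok = False
        else:
            print(f"OK       {name}")
    return ok

if __name__ == "__main__":
    # Fail the pipeline if any pinned artifact is missing or altered.
    sys.exit(0 if verify(Path("vendor")) else 1)
```

In real pipelines this role is played by lockfiles, signed artifacts, and provenance attestations rather than a hand-rolled script, but the principle is the same: trust is established by verification, not by geography.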
The real strength in 2026
The consequences of neglecting software integrity are dire and more visible than ever. Recent events such as the M&S and Jaguar Land Rover cyberattacks, or the widespread AWS outage, show that a software ecosystem is only as strong as its weakest link.
These failures rarely originate where the data is hosted. They stem from compromised libraries, vulnerable build systems, and opaque supply chains that are not fully monitored.
To build true resilience in 2026, the conversation must change. We cannot keep treating security as a border to defend or a box to keep data in. We should treat the code we rely on every day as critical infrastructure.
This means ensuring that no vulnerable component arrives undetected, that code has not been tampered with, and that any incident can be traced back to its root cause. It also means accepting that the code engineers rely on today is global by nature, and designing security strategies that reflect that fact rather than fighting it.
This is the only pragmatic and scalable way to deliver the security and control we need to survive and thrive in today’s computing world.



