How one energy giant tackled legacy SAP decommissioning at scale
We spoke with Pierre Adloff, SNP’s engagement lead for a large-scale legacy decommissioning program at one of the world’s largest energy companies. He shares what the project looked like from the inside – and what SAP program leaders should think about before their next system retirement.
Your contact
Pierre Adloff
Business Development Executive, SNP Group
You ran the RFP response for one of the largest energy companies in the world. What was the starting point for that engagement?
In March 2023, we received an RFP consisting of 60 documents, marking the beginning of our collaboration – we had no prior relationship with the company. It covered legal archiving for SAP legacy systems, and it was exceptionally detailed. For the next five months, this was essentially my full-time focus.
What was the core business problem they were trying to solve?
Like many large enterprises running SAP for decades, they had accumulated a significant number of legacy ECC systems that were no longer operationally active but still held data subject to strict regulatory retention requirements – in some cases for ten, fifteen, or even twenty-five years.
One approach is to keep those systems running in read-only mode. That preserves access to the data, but it creates a real headache: You’re maintaining infrastructure you can no longer patch or upgrade because the operating systems, databases, and SAP versions are obsolete – a growing cybersecurity exposure. You can’t move the system forward, and you can’t easily switch it off either, because you never know whether it will come back up again. The cost and risk compound over time, especially when you factor in the potential fines if you are unable to produce data requested by the authorities.
The shift that Kyano Datafridge (SNP’s tool for application retirement) enables is straightforward: Instead of archiving the system, you archive the data. You extract it, structure it for long-term legal accessibility, store it in a compliant cloud environment, and then decommission the underlying system completely. The compliance obligation is fully met, the security exposure is gone, and you stop paying to run infrastructure that serves no operational purpose.
How did the business case land internally at the customer?
Their internal champion put it very simply to leadership: “What is the fine if we can’t produce this data when a regulator asks for it?” In their case, the exposure ran to tens of millions of euros. That was the end of the business case conversation. Everything else followed from there: cost of implementation, contract structure, and timeline.
It’s a question worth asking in any large enterprise sitting on legacy SAP systems. The risk of not having a compliant archiving strategy is often much more concrete than it appears.
The evaluation process was rigorous – close to a thousand questions. What stood out?
Honestly, what stood out was that we could answer all of them positively. Legal requirements, technical architecture, security certifications, scalability, service model – every dimension was covered. I was still learning the product in depth myself as we worked through the RFP, which tells you something about the breadth of what was being asked. But Kyano Datafridge and the SNP team based in Bratislava had an answer for everything.
The one area where we had to extend ourselves was storage. The customer wanted a full SaaS model – software, services, and cloud storage under one contract, with a single vendor accountable for the whole stack. We structured an arrangement with a hyperscaler to make that work. That completeness, I think, was a meaningful factor in the final decision.
What should SAP program leaders think about when structuring a legal archiving contract?
Two things that we learned the hard way – or rather, that the customer’s board was wise enough to catch before signature.
The first is duration. Legal archiving obligations can run to 25 years. A two- or three-year contract – which is what procurement teams often default to – doesn’t fit that reality. If you’re going to tender for this, make sure the contract duration reflects the actual retention period. It will save significant time in the approval process.
The second is scope. If your organization is planning an SAP S/4HANA migration, the number of systems that will eventually need legal archiving is almost certainly much larger than the systems you’re thinking about today. A migration from SAP ECC to SAP S/4HANA generates its own wave of legacy systems. It’s worth scoping for that pipeline from the start, not revisiting it later.
In this case, addressing both points properly meant the eventual contract was a genuine framework – covering the full portfolio and giving both sides a solid foundation for a multi-year program. One further crucial point is to build in a growth mechanism: Agree pricing upfront, in a simple and transparent way, for the case that the number of systems increases.
How did the actual delivery go?
The target from contract signature was 15 systems decommissioned within 12 months – running in parallel with a knowledge transfer program for the customer’s primary system integrator partner. It was an ambitious timeline, and the delivery team met it cleanly, on schedule, and with no issues.
That first delivery was important beyond the immediate scope. It built the trust that opened up every subsequent conversation. Now in 2026, the program has grown to 30 systems, with projections of 50 to 70 systems as the full SAP S/4HANA migration continues to roll out.
Were there any outcomes that surprised you?
There was one that we hadn’t fully anticipated: When the customer compared the cloud storage costs to what they were paying under their own direct infrastructure contracts, the difference was striking – the archive storage was significantly more cost-effective. That prompted a natural next question: If this works so well for decommissioned legacy systems, why not apply the same model to archiving across productive systems as well?
That question has since become a program in its own right, now extending to the decommissioning of OpenText environments across their estate. What began as a compliance-driven project for legacy SAP has broadened into a much wider data infrastructure conversation.
Any final advice for a CTO or SAP program lead reading this who is facing a similar decision?
Don’t wait for the migration project to force the question. The time to think about your legal archiving strategy is before you start decommissioning, not during it. The compliance obligations are real, the cybersecurity risk of leaving legacy systems running is real, and the cost of a well-structured archiving program is almost always lower than the cost of the alternative – maintaining ageing infrastructure indefinitely. And the potential fines for being unable to produce data requested by the authorities are of a different order of magnitude entirely, which makes every other business case discussion secondary.
The other thing I’d say is to think about your system integrator relationship early. In this program, the system integrator’s deep presence at the customer was an important part of making the delivery model work. If you have an established SI, getting them involved in the knowledge transfer from the start makes the whole program more sustainable.