Why “Go to Datasphere” isn’t always the answer
The clock is ticking on SAP BW’s end-of-support. By 2027, organizations worldwide will have to make a strategic decision they can no longer put off: what will your analytics landscape look like in a world where SAP BW is no longer supported?
For years, the answer seemed obvious: just switch to SAP Datasphere and you’re all set. But the world of data has changed fundamentally. And that assumption deserves to be seriously challenged.
The ground has shifted
SAP itself has made the most significant move. With the launch of SAP Business Data Cloud and native connectors to Databricks, Snowflake, and later this year Microsoft Fabric, SAP is no longer positioning itself as the destination for all your data. It is positioning itself as the source.
Transactional excellence, semantic richness, domain expertise: SAP. Everything else? That’s open to debate.
This is a major shift. For the first time, SAP customers have genuine, architecturally supported freedom to choose where their analytics workloads run. And with that freedom comes a new challenge: most organizations don’t know how to make that choice effectively.
Meanwhile, the “Big 3” (Databricks, Snowflake, and Fabric) have come of age. These are no longer niche tools for data scientists and hyperscalers. They are mature, enterprise-grade platforms with robust governance, proven BI capabilities, and ecosystems that are growing by leaps and bounds.
Many enterprises are already using these platforms for non-SAP workloads. And they want to consolidate, not duplicate.
The question for SAP BW customers today is not whether they should modernize, but how and with whom.
Why “Just go to Datasphere” isn’t enough
SAP Datasphere is a robust product. For organizations that primarily want to make SAP data accessible with minimal complexity and a short learning curve, it remains the most natural evolution from BW. It retains the semantic layer, speaks the language your BW team already knows, and integrates seamlessly with SAP Business Data Cloud.
That said: Datasphere has a deliberate focus. SAP Business Data Cloud is designed as a broader ecosystem. Some capabilities have been intentionally placed elsewhere:
- ML and AI workloads are the domain of Databricks and Snowflake, not Datasphere. If your organization has ambitions beyond traditional reporting, such as real-time scoring, feature engineering, or large-scale model training, those use cases point to a different part of the stack.
- Platforms such as Databricks or Snowflake are better suited to flexible, heterogeneous environments. They are built for complex, multi-source engineering workloads.
- Vendor dependency is worth serious consideration. For some organizations, the BDC ecosystem is the right long-term solution. For others, a more open architecture, connecting SAP data to a platform they already have, is the better strategic approach.
For organizations with a higher level of engineering maturity, existing investments in cloud platforms, or AI ambitions, Datasphere may not be the right solution, even though it is the easiest to sell internally.
The opportunity that no one takes
Here’s an interesting point: the market recognizes this complexity, but few partners are equipped to navigate it objectively.
The major system integrators have both SAP BI practices and cloud data practices. However, these often operate in silos, with different commercial incentives, different talent pools, and limited ability to provide truly independent advice at the intersection of the two. Clients are steered toward the practice that is most eager to secure business.
Cloud-native firms have in-depth knowledge of Databricks and Snowflake, but they don’t know SAP. They don’t analyze your BW metadata and logic, they don’t assess the complexity of your transformation logic, and they lack a frame of reference for what a migration actually entails on the SAP side.
The opportunity, and it’s a big one, lies precisely at the intersection: organizations that are fluent in SAP BW, Datasphere, Snowflake, Databricks, and Fabric, and that provide advice that is truly platform-agnostic.
What good advice looks like
When we take on a BW modernization project, we don’t start with a platform recommendation. We begin with two parallel assessments: a technical one and an organizational one. Only once both aspects are clear do we present a platform recommendation.
The technical scan: knowing what you’re working with
Most BW modernization projects underestimate the complexity of the source environment, simply because no one has ever properly mapped it out.
Over the course of decades, BW landscapes accumulate thousands of objects: InfoProviders, transformations, data flows, process chains, ABAP routines, and HANA Calculation Views. Many of these were created years ago by people who have long since left the company. A significant portion is no longer actively used.
The question is: which ones really matter, and how difficult will they be to migrate?
This is where our tooling makes a difference.
Automated BW model extraction
The BW Model Extractor is a tool that connects directly to the BW system and captures the entire technical landscape in a fraction of the time it would take to do so manually. It retrieves object inventories, usage statistics, load frequencies, data volumes, transformation logic, ABAP complexity scores, dependency chains, and more.
The result is a comprehensive, structured view of the BW environment. It is not filtered through anyone’s assumptions or political sensitivities, but is derived directly from the system itself.
Interactive complexity reports
Raw metadata isn’t enough. That’s why the extractor’s output directly powers interactive complexity reports that make the landscape visible and navigable for both technical teams and business stakeholders.
These reports allow you to segment the environment by: object type, business domain, usage frequency, data volume, transformation complexity, and estimated migration effort.
What does this mean in practice? You walk into a steering committee meeting and use visual and interactive tools to show exactly how large the landscape is, which parts are actively being used, where the complexity lies, and how different migration strategies affect scope and costs.
No more gut feelings. No more consulting estimates based on sampling. Real decisions, based on your actual BW system.
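To make the idea of such segmentation concrete, here is a minimal sketch in Python. All object names, fields, and thresholds below are hypothetical illustrations, not the extractor's actual schema; the real tooling works with far richer metadata (ABAP complexity scores, dependency chains, load frequencies, and more).

```python
from datetime import date

# Hypothetical slice of a BW object inventory, as a list of records.
inventory = [
    {"name": "ZSD_C01", "type": "InfoCube", "last_used": date(2019, 3, 1),  "queries_90d": 0},
    {"name": "ZFI_A02", "type": "ADSO",     "last_used": date(2025, 6, 12), "queries_90d": 480},
    {"name": "ZMM_C07", "type": "InfoCube", "last_used": date(2021, 11, 5), "queries_90d": 2},
]

def segment(objects, cutoff=date(2024, 1, 1), min_queries=5):
    """Split objects into decommissioning candidates and migration scope,
    based on last usage date and recent query counts (illustrative rules)."""
    decommission, migrate = [], []
    for obj in objects:
        if obj["last_used"] < cutoff and obj["queries_90d"] < min_queries:
            decommission.append(obj["name"])
        else:
            migrate.append(obj["name"])
    return decommission, migrate

decommission, migrate = segment(inventory)
print(decommission)  # long-unused objects: clean these up before migrating
print(migrate)       # actively used objects: these define the real scope
```

Even this toy rule set illustrates why decommissioning analysis comes first: every object that drops out of scope here never has to be migrated, tested, or paid for.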
Automated migration roadmap generation
Based on the complexity analysis, we generate a migration roadmap that sequences objects and workflows into logical, risk-driven "waves". This is not a generic template. It is derived from the actual dependency graph of your BW system: which objects feed into which others, what can be migrated independently, and where the critical paths lie.
The roadmap takes into account:
- Decommissioning potential (typically, 30–50% of a BW landscape can be cleaned up before migration begins, which dramatically reduces the scope and costs)
- Migration wave sequencing
- Parallel-run periods
- Cutover requirements
- Mapping of each object to the corresponding construct on the target platform

This way, you know in advance what needs to be converted, what can be reused, and what needs to be rebuilt.
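Conceptually, wave sequencing can be derived from the dependency graph with a layered topological sort: objects with no unmigrated upstream feeders form wave 1, objects that depend only on wave 1 form wave 2, and so on. A minimal sketch, with a purely illustrative dependency graph (this is the underlying idea, not our actual roadmap engine):

```python
from collections import defaultdict

def build_waves(dependencies):
    """Group objects into migration waves via a layered topological sort
    (Kahn's algorithm): each wave contains only objects whose upstream
    feeders were migrated in an earlier wave."""
    indegree = defaultdict(int)
    downstream = defaultdict(list)
    nodes = set(dependencies)
    for obj, feeders in dependencies.items():
        nodes.update(feeders)
        for feeder in feeders:
            indegree[obj] += 1
            downstream[feeder].append(obj)
    waves = []
    ready = sorted(n for n in nodes if indegree[n] == 0)
    while ready:
        waves.append(ready)
        next_ready = []
        for node in ready:
            for dep in downstream[node]:
                indegree[dep] -= 1
                if indegree[dep] == 0:
                    next_ready.append(dep)
        ready = sorted(next_ready)
    return waves

# Illustrative graph: object -> list of upstream feeders
deps = {
    "SALES_REPORT": ["SALES_ADSO"],
    "SALES_ADSO": ["SALES_SOURCE"],
    "FIN_REPORT": ["FIN_ADSO"],
    "FIN_ADSO": [],
}
print(build_waves(deps))
# [['FIN_ADSO', 'SALES_SOURCE'], ['FIN_REPORT', 'SALES_ADSO'], ['SALES_REPORT']]
```

Everything inside a wave can, in principle, be migrated in parallel, which is what makes a risk-driven, incremental cutover possible.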
WavePath: AI-driven dataflow generation
The best part: we don't just analyze your BW system; we use the output to automatically generate data models in the target environment. That's where WavePath comes in.
WavePath is an AI-powered migration agent. It was specifically designed to accelerate the transition from SAP BW to BDC and its partners by automating the generation of data models, the most time-consuming and error-prone part of any BW migration.

Here’s what WavePath does:
- Reads existing BW data models and transformation logic
- Generates a wave-by-wave deployment plan that respects object dependencies
- Recreates those models directly as Datasphere, Databricks, Snowflake, or Fabric models, with logic and dependencies intact
The result is a migration process that is not only faster but also significantly more reliable: no manual transcription errors, no missed dependencies, and no inconsistencies between what is analyzed and what is built.
All generated data flows are automatically aligned with our standardized reference architecture for each target platform. This means that each migration wave is deployed into an environment that is consistent, maintainable, and managed, not a collection of individually created objects that will become the next generation of technical debt.
This isn’t a one-click solution. Complex ABAP and AMDP transformations still require human judgment and manual rework by BI specialists. But for standard data models that follow predictable patterns (and in most BI landscapes, that is the vast majority), WavePath dramatically reduces the migration effort. It shifts the team’s focus to the truly difficult problems.
WavePath vs. SAP’s Data Product Generator: understanding the difference
At this point, a valid question arises: why not just use SAP BW PCE and the BW Data Product Generator (DPG)? What does WavePath bring to the table?
Good question. The honest answer: they solve fundamentally different problems. DPG is a data replication tool. WavePath is a migration and development tool.
The practical difference:
The DPG replicates BW InfoProvider data to Datasphere’s managed object store within BDC, essentially copying the data from BW to a file layer in the cloud. It handles the “shift” phase of the BW modernization journey, creating local tables in the HDLFS layer from BW InfoProviders. It excels at what it does: extracting data from BW and getting it into BDC quickly.
However, there are critical limitations that are not always highlighted in SAP’s presentations:
The DPG replicates data, not models. For each InfoProvider, a LocalTable is created containing basic metadata such as data types, descriptions, and field names. This LocalTable can then be used in Datasphere for further modeling, but that modeling must still be done by hand.
The DPG does not convert your BW transformation logic, your ABAP routines, your AMDP code, or your business rules into functional Datasphere, Snowflake, or Databricks models. It simply provides the raw data. What you build on top of that is still your responsibility.
Using the DPG also requires that BW first be deployed in BDC’s private cloud: upgrading to the private cloud edition is a prerequisite for using SAP BW within SAP Business Data Cloud, and it comes at a significant cost.
The DPG is a useful tool in a specific scenario: organizations that are migrating their BW to BDC’s private cloud and want to quickly expose InfoProvider data for use in Datasphere or Databricks. For that use case, it gets the job done.
But it’s not a migration tool. It doesn’t reduce the modeling effort in your target environment. It doesn’t analyze what should and shouldn’t be migrated. It doesn’t generate a roadmap. And it doesn’t work at all if you stay on-premises, or if you want to decommission BW instead of just connecting to the cloud.
WavePath tackles the entire migration challenge. The two can also complement each other: DPG replicates data to the BDC object store, while WavePath rebuilds the analytical models on top of it.
Understanding what each tool does, and being honest about the gaps: that is exactly the kind of guidance missing from most vendor meetings today.

Our partner: Seapark Consultancy
WavePath was developed in close collaboration with Seapark Consultancy, an Ireland-based SAP BW migration specialist. Like us, they are one of only a handful of SAP partners worldwide that have been specifically recognized by SAP for BW migrations.
We work together to deliver a depth of BW migration expertise that is truly unique in the market. We know what a BW landscape looks like from the inside. We know where the complexities lie. And we have the tools to uncover them quickly and cost-effectively.

The organizational assessment: is your organization ready for change?
Technical readiness is only half the equation. We have seen technically straightforward migrations fail because the organizational conditions were not in place: no clear data ownership, no governance framework, insufficient skills within the team, and a history of failed change programs that made stakeholders skeptical.
We have also seen technically complex migrations succeed because leadership was aligned, the team was motivated, and the organization had built up real data management capabilities over time.
The organizational scan is where we assess that second dimension. And for this, we’ve partnered with another fantastic company from our SUPERP group: SynTouch, the leading Dutch specialist in data management and integration.
SynTouch offers a proven, structured approach to organizational data maturity assessment, based on the DAMA DMBOK (Data Management Body of Knowledge) framework, the globally recognized standard for professional data management.
Their Data Management Maturity Scan provides organizations with a clear, evidence-based picture of where they stand across the key areas that determine the success of a modernization program.

SynTouch’s methodology includes the option to repeat the assessment over time, establish a baseline, and track organizational development as the modernization program progresses.
Together, the technical scan and the organizational scan provide something most clients have never had before: a genuine, evidence-based foundation for a platform decision. And a realistic picture of what it will cost to implement.
Choosing the right platform for the right organization
Not every organization has to make the same choice. And we’re happy to say so.
For organizations that primarily want to access and report on SAP data, have limited engineering capacity, and prefer low operational complexity, SAP Datasphere (often supplemented with SAP Business Data Cloud) is often the right solution. The BW team’s existing expertise translates well, time-to-market is faster, and there is strong architectural alignment with SAP’s direction.
For organizations with broader analytics ambitions, BI-centric workloads, and a preference for simplicity, Snowflake is often a good fit. It is SQL-native (offering a short learning curve for SAP BI teams), operationally clean, and excels at the kind of managed, concurrent analytics that most business intelligence use cases require. Its layered architecture also aligns well with the BW LSA++ framework that many customers already understand.
For organizations with true data engineering maturity, teams that run complex pipelines, have streaming requirements, or want to build ML capabilities at scale, Databricks is often the superior choice. Its power comes with complexity, but for the right organization, it’s the right trade-off.
Many real-world architectures will be hybrid: SAP as the authoritative source, connected via the Business Data Cloud to a modern analytics platform, with Databricks handling AI workloads alongside Snowflake or Fabric for BI.
This isn’t indecision; it’s often the architecturally sound solution.
A bridge, not a binary choice
The “SAP vs. Cloud” narrative is outdated. The market has moved on, SAP has moved on, and the most forward-thinking customers have moved on.
What organizations need are partners who operate at the intersection: with enough expertise in SAP to interpret a BW system with clinical precision, enough experience with cloud data platforms to provide unbiased comparative advice, and enough independence to prioritize the customer’s outcome over any relationship with a platform vendor.
That is the position we have built together: 25 years of SAP BI expertise, a robust cloud engineering practice, and a growing track record of successful modernizations spanning the entire spectrum, from Datasphere to Snowflake and Databricks to Fabric.
All of this is supported by our world-class BW migration tooling: WavePath’s AI-driven data model generation, SynTouch’s proven organizational data maturity framework, and our partnership with Seapark.
We offer something truly unique: an assessment that is both technically rigorous and organizationally honest. And an execution capability built to deliver results.
If your organization is approaching the BW 2027 deadline without a clear plan, now is the time to develop one. Not because the deadline is tomorrow, but because making the right decision takes time. And undoing the wrong one is costly…
Tick-tock

Wondering which platform is right for your organization?
Louis from our SAP BI team will give you objective advice.

Louis ter Voorde
Business Development
