Migration from SAP BW to BW Bridge & Datasphere - A practical example

Anyone who has ever been involved in an SAP BW migration project knows: the larger and more complex the system, the more important it is to proceed methodically and to standardize as much as possible. Migrating from SAP BW to SAP Datasphere also confronts companies with the challenge of switching from their familiar on-premise solution to a public cloud environment. This transformation requires not only careful planning, but also a clear vision of how to leverage the advantages of the new cloud technology while preserving proven business logic.

SAP BW Bridge is designed for exactly this latter aspect: it acts as a link between the systems and enables the migration of existing models and business logic. In this joint blog post by NextLytics and bluetelligence, we use a practical project example to show how this approach can be implemented successfully. Admittedly, using the bridge means partially foregoing the advantages promised by a greenfield approach in Datasphere. On the other hand, as a “faster” migration option, it offers benefits that are well worth considering. The following reasons speak in favor of using BW Bridge:

  • Investment in BW models: Often, a lot of effort has been expended in the development of specific BW models that contain complex business logic and customized data flows – sometimes with extensive ABAP logic.
  • Limited availability of CDS-based extractors: In cases where SAP-based source systems do not yet provide the required CDS extractors (e.g. SAP ECC systems), classic S-API extractors must be used. However, it is not recommended to connect these directly to SAP Datasphere, as there are limitations, for example, when DataSources require mandatory selection fields or certain delta methods are not supported.
  • Lack of specific skills: Often, teams have strong BW and ABAP expertise, but not enough knowledge of SQL or Python, which are used in SAP Datasphere, to rebuild the models natively.

In our example of a real project, all these factors applied. We were dealing with a large, historically grown BW system of an energy supplier with highly specialized models that reflect the specifics of the complex and highly regulated energy market. The project objective was to preserve the existing models, business logic and data flows and to rebuild only the reporting layer in SAP Datasphere. In this context, the use of bluetelligence's Performer Suite also turned out to be a key success factor for an efficient and well-targeted migration.

Vision and approach

Before we present the project objective in detail, it is important to explain a special feature of BW Bridge in connection with SAP Datasphere. Although queries can be transferred to the BW Bridge during migration, they can neither be executed as in the classic BW system nor used as a source for other data targets. Instead, queries are provided in the BW Bridge as metadata, which means they can serve as a basis for the entity import into SAP Datasphere. This feature plays a central role in our approach.

The following diagram illustrates the project's main objective:

[Figure 1: BW Bridge]

The project defined several key steps that were crucial for the migration from SAP BW to SAP Datasphere.

1. Tool-supported transfer via Shell Conversion:
  • Migration of models and data flows up to the composite provider
2. Development and transfer of master queries:
  • Mapping of all global and local calculated and restricted key figures as well as all characteristics relevant for reporting
  • Creation per composite provider
  • Transfer using Shell Conversion
3. Generation of analytics models:
  • Entity import of the master queries
  • Automatic creation of the analytics models based on imported queries
4. Conversion of reporting:
  • Conversion of query-based reporting to analytics models in the various front-end tools

By strategically using entity import, we were able to automatically generate a large number of the required objects and thus significantly reduce the time required for modeling in Datasphere.

The diagram is deliberately simplified and focuses on the core aspects of the migration. The complete concept includes additional important components:

  • Further non-SAP source systems
  • Detailed layer concept
  • SPACE concept for the collaboration between central IT and specialist departments
  • Implementation of authorizations via data access controls
  • Consideration of specific requirements depending on the data recipient (e.g. via ODBC or OData)

For detailed information, feel free to take a look at our article on the SAP Datasphere reference architecture: Datasphere Reference Architecture - Overview & Outlook

Practical use of the Performer Suite in the project

Master-Query Analysis

One of the key challenges of the project was to create the key figure definitions as templates for the master queries – to do this, we had to analyze the structures of a large number of queries from the customer's BW system. Even in the planning phase, it was clear to us that we would need a powerful, tool-supported solution for this. As a partner of bluetelligence, we knew the strengths of the Performer Suite and therefore decided to use a temporary license in the project.

The goal was to create a master query for each composite provider containing all global and local calculated and restricted key figures as well as all characteristics relevant for reporting. To carry over only the key figures actually relevant for reporting, we first analyzed all queries executed in the last 18 months using the System Scout analysis “Data Loads and Usages”:

[Figure 2: BW Bridge]

This analysis provided us with a list of all relevant queries for each composite provider. We used the Query SetCard Designer to create a central list of the definitions of all global and local calculated and restricted key figures. We created a SetCard in which we collected all the necessary information on the key figures used in the queries:

[Figure 3: BW Bridge]

With the help of the Docu Performer (a tool in the suite), we were then able to export the queries to Excel according to our SetCard template. It was important to deactivate the “One file per documentation” setting so as not to create a separate Excel file for each query. The result was a consolidated overview of the key figures for each composite provider. We used this overview as a template for creating the master queries.



Planning of the migration and data flow analysis

At the beginning of the migration, we developed a structured wave plan that relied on a step-by-step conversion over the course of the project instead of a single big bang at the end. This approach allowed us to iteratively migrate reporting scenarios from SAP BW to SAP Datasphere, test them, and conduct user acceptance tests (UAT) with the departments.

[Figure 4: BW Bridge]

[Figure 5: BW Bridge]

This strategy offered several advantages:

  1. Risk minimization through step-by-step conversion
  2. Early stabilization of the new environment
  3. Continuous learning and optimization of the approach
  4. Flexibility to respond to challenges without disrupting the overall project

In a first step, the customer split the wave planning at the BW InfoArea level. Based on these specifications, we identified the affected composite providers using the entity lists in the Performer Suite. We created lists of these providers, enriched with additional attributes, and made them available to the customer. This allowed for a precise readjustment of the planning, since not all composite providers from an InfoArea could necessarily be migrated in the same wave due to dependencies.
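
To give an impression of this step, the following minimal sketch shows how such a wave candidate list could be derived from an exported entity list with a few lines of Python. The column names and the InfoArea-to-wave mapping are purely illustrative assumptions and do not reflect the Performer Suite's actual export format.

```python
# Illustrative sketch only: deriving wave candidate lists from an exported
# entity list of composite providers. Column names and the InfoArea mapping
# are assumptions for this example, not the Performer Suite's export format.
import pandas as pd

entities = pd.read_excel("entity_list_composite_providers.xlsx")

# Wave planning as specified by the customer, keyed by InfoArea
wave_by_infoarea = {
    "ZIA_ENERGY_SALES": 1,
    "ZIA_GRID_OPERATIONS": 2,
    "ZIA_REGULATORY": 3,
}
entities["WAVE"] = entities["INFOAREA"].map(wave_by_infoarea)

# Enrich with attributes that help readjust the planning later
entities["HAS_ABAP_TRANSFORMATION"] = entities["TRANSFORMATIONS_WITH_ABAP"] > 0

# One candidate list per wave, handed over to the customer for review
for wave, providers in entities.dropna(subset=["WAVE"]).groupby("WAVE"):
    providers.to_excel(f"wave_{int(wave)}_candidates.xlsx", index=False)
```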

Our task was to migrate the data flows for each composite provider completely and smoothly using the BW conversion tool in shell conversion mode. Although the conversion tool can determine dependencies itself via a scope analysis, it proved useful to spread the objects manually across several conversion tasks. The automatic scope analysis often returns very extensive object lists, which increases the risk of aborted conversion runs when dependencies are complex. Furthermore, certain dependencies, such as lookups in ABAP transformations, are not recognized automatically.

Our solution was to manually define the scope for each conversion step to achieve better manageability and reduce the complexity of the migration. Based on this approach, we developed the following prioritization for the migration (a small sketch of such a scope worklist follows the list):

  1. InfoProviders (InfoObjects, aDSOs, CompositeProviders)
  2. Transfer of native DDIC objects via abapGit into the BW Bridge ABAP stack (for example, lookups on Z tables in transformations, or routine coding swapped out into ABAP classes)
  3. Transformations
  4. DTPs and process chains
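
As a small illustration of this ordering, the sketch below groups a hypothetical conversion scope into the four steps above. The object names are invented; the type codes follow the usual BW object types, while "DDIC" simply stands for the native dictionary objects moved via abapGit.

```python
# Illustrative sketch: ordering one conversion task's scope by the object-type
# priorities described above. Object names are invented; "DDIC" is a generic
# stand-in for Z tables and classes transferred via abapGit.
from collections import defaultdict

PRIORITY = {
    "IOBJ": 1, "ADSO": 1, "HCPR": 1,   # InfoObjects, aDSOs, CompositeProviders
    "DDIC": 2,                          # native dictionary objects (abapGit)
    "TRFN": 3,                          # transformations
    "DTPA": 4, "RSPC": 4,               # DTPs and process chains
}

scope = [
    ("ZSALES_CP", "HCPR"), ("ZSALES_A01", "ADSO"), ("ZTAB_REGION", "DDIC"),
    ("TR_ZSALES_A01", "TRFN"), ("DTP_ZSALES_A01", "DTPA"),
]

# Group the scope of one conversion task into ordered migration steps
steps = defaultdict(list)
for name, obj_type in scope:
    steps[PRIORITY[obj_type]].append(name)

for prio in sorted(steps):
    print(f"Step {prio}: {', '.join(steps[prio])}")
```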

The DDIC objects had to be manually adapted and activated in the BW Bridge's ABAP Cloud environment on SAP BTP. This environment is more restrictive than classic ABAP and only allows APIs released by SAP, so most existing ABAP developments required significant refactoring.

To create the scope lists, we needed a complete listing of all objects in the data flow, including transformations and DTPs, for each composite provider. We used the data flow analysis in the Performer Suite for this. This enabled us to assign all objects in the data flow to a scenario (in our case, a migration wave).

[Figure 6: BW Bridge]

We were then able to filter the entity list in the Performer Suite by the corresponding scenario and output all of its objects as a list.

[Figure 7: BW Bridge]

These lists served as work lists for the migration and were enriched with additional information (e.g. whether a transformation uses ABAP coding) and the migration status. This method allowed us to manage the objects to be migrated in each wave in a transparent and structured way.
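
To give a rough idea of what such a work list entry contained, here is a minimal, purely illustrative structure; the field names and status values are assumptions for the example, not a fixed format from the project.

```python
# Illustrative sketch of a work list entry per wave, enriched with an ABAP
# flag and a migration status. Field names and status values are assumptions.
from dataclasses import dataclass

@dataclass
class WorklistEntry:
    object_name: str
    object_type: str          # e.g. ADSO, HCPR, TRFN, DTPA
    uses_abap_coding: bool    # flags transformations that need refactoring
    status: str = "open"      # e.g. open / converted / tested / accepted

wave_1 = [
    WorklistEntry("ZSALES_CP", "HCPR", uses_abap_coding=False),
    WorklistEntry("TR_ZSALES_A01", "TRFN", uses_abap_coding=True),
]

open_items = [e.object_name for e in wave_1 if e.status == "open"]
print(f"Wave 1, still open: {open_items}")
```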

Further possible uses in the project

The Performer Suite proved to be a powerful tool in our migration project. Particularly valuable was how easily and quickly it let us create lists of BW objects for various purposes.

  • For ad hoc analysis, the suite (specifically, System Scout) provided quick insights into data flows that proved to be more detailed and user-friendly than the data flow view in the BW modeling tools.
  • In cases where models were archived rather than migrated, we used the suite to quickly create complete model documentation.
  • For development testing, the suite was particularly useful for quickly determining which InfoProviders were loaded via which process chains. This quick overview helped us to manage the development test process more efficiently.

Challenges and lessons learned during the project

During the project, we naturally also encountered challenges and limitations despite the tool support. These were mainly restrictions on the part of SAP, but there were also capabilities that the Performer Suite does not (yet) offer.

Exporting the query SetCards for 3.x query versions proved problematic, which made manual checks necessary. For larger query selections, occasional crashes occurred, so we had to spread the queries belonging to a composite provider across several export jobs. Consolidating the individually documented queries onto a single sheet required additional effort, which we handled by developing a macro. A more targeted analysis of all global and local key figures of a composite provider, for example as a System Scout analysis, would be helpful in the future.
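
In the project we solved this consolidation with an Excel macro; conceptually, the same step could also be scripted, for example along the lines of the Python sketch below. The folder layout, sheet structure and column name are assumptions for illustration only.

```python
# Illustrative sketch of consolidating split query exports into one key figure
# overview per composite provider. The folder layout, sheet structure and
# column name are assumptions; the project itself used an Excel macro here.
from pathlib import Path
import pandas as pd

frames = []
for workbook in Path("exports/ZSALES_CP").glob("*.xlsx"):
    # Assumption: each export job wrote one workbook with one sheet per query
    for sheet_name, sheet in pd.read_excel(workbook, sheet_name=None).items():
        sheet["QUERY"] = sheet_name
        frames.append(sheet)

overview = pd.concat(frames, ignore_index=True)

# Keep each key figure definition only once across all queries
overview = overview.drop_duplicates(subset=["KEY_FIGURE_TECHNICAL_NAME"])
overview.to_excel("ZSALES_CP_key_figure_overview.xlsx", index=False)
```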

The major roadblock for our master-query approach was the extensive restrictions on the entity import in Datasphere, which are listed in detail in an SAP Note (https://me.sap.com/notes/2932647). The import process was not transparent: it was not possible to see which problems occurred and required manual corrections. Some query features prevented the import completely, others were simply skipped, resulting in incomplete key figure definitions. Especially critical were the restrictions on the supported formula operators. The limitation to the basic operations (+, -, *, /) makes the entity import almost unusable for real queries, whose complexity often goes far beyond these simple operations. In view of these problems, manually creating the key figures in Datasphere ultimately proved to be more efficient. We therefore focused the entity import on the composite providers, generated the analytics models from them, and created the key figures manually in the analytics models, using the query SetCard lists we had built for the master queries as a basis.
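
Such a pre-check does not require much. The following sketch simply flags formulas that use anything beyond the four basic operators and would therefore have to be rebuilt manually; the formula strings are invented examples, not customer definitions, and the check is deliberately simplistic.

```python
# Illustrative sketch: flagging calculated key figures whose formulas go
# beyond the basic operators (+, -, *, /) supported by the entity import.
# The formulas below are invented examples, not project data.
import re

def needs_manual_rebuild(formula: str) -> bool:
    # Remove quoted key figure references and numeric literals, then look for
    # anything that is not a basic operator, a bracket or whitespace.
    stripped = re.sub(r"'[^']*'|\d+(\.\d+)?", "", formula)
    return bool(re.search(r"[^+\-*/()\s]", stripped))

formulas = {
    "ZKF_MARGIN": "'ZKF_REVENUE' - 'ZKF_COST'",
    "ZKF_SHARE": "'ZKF_REVENUE' / 'ZKF_TOTAL' * 100",
    "ZKF_PEAK_LOAD": "MAX('ZKF_DEMAND', 'ZKF_RESERVE')",
}

for name, formula in formulas.items():
    action = "rebuild manually" if needs_manual_rebuild(formula) else "entity import candidate"
    print(f"{name}: {action}")
```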

When creating the object lists for the migration waves, we encountered another challenge in the Performer Suite: the data flow analysis did not allow a direct assignment of the DTPs to the scenarios. We had to iteratively check the InfoProviders of a scenario in the entity view using the parent column to determine which DTPs belonged to which InfoProvider and then assign them to the scenario. A simpler way to capture all components of a data flow holistically, including DTPs and without workarounds, would have simplified and sped up the process considerably.
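
In spirit, the workaround boils down to letting each DTP inherit the scenario of its parent InfoProvider. The sketch below shows one way this could be scripted; the column names and type codes are assumptions about an exported entity view, not a documented format.

```python
# Illustrative sketch of the workaround: DTPs inherit the migration wave
# (scenario) of their parent InfoProvider. Column names and type codes are
# assumptions about an exported entity view, not a documented format.
import pandas as pd

entities = pd.read_excel("entity_view_export.xlsx")

# Scenario assignment is known for the InfoProviders ...
provider_scenario = (
    entities[entities["TYPE"].isin(["ADSO", "HCPR"])]
    .set_index("OBJECT")["SCENARIO"]
)

# ... and is passed on to the DTPs through their parent InfoProvider
dtps = entities["TYPE"] == "DTPA"
entities.loc[dtps, "SCENARIO"] = entities.loc[dtps, "PARENT"].map(provider_scenario)

entities.to_excel("entity_view_with_dtp_scenarios.xlsx", index=False)
```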

Summary and outlook

In summary, the use of the Performer Suite in our project provided enormous value: it reduced manual effort and potential sources of error and sped up the entire migration process. The advantage of the project license was that we only had to purchase a temporary license for this specific use case. Our intention was not to integrate the Performer Suite into the daily IT routine, but to use it specifically for our purpose, so no expert knowledge of all facets of the tool was required.

We are aware that we could possibly have gained even more helpful information from the tool or could have continued working with the lists directly in the tool. Often, we fell back on the traditional method that we frequently see at customers and normally warn against: exporting to Excel, followed by creative further processing of the exported data. But sometimes you just have to be pragmatic! If it was easier for the project participants, not all of whom had access to the tool, to work with object lists in the familiar environment of Excel, then that was absolutely fine with us.

In addition to the features already mentioned with regard to key figure and data flow analysis, we still see potential for future enhancements that could further increase the benefits of the Performer Suite in such projects:

  • More ABAP-based information in the data flow analysis, especially in terms of ABAP Cloud language syntax. For example, it would be very helpful if the data flow analysis could show which transformations use ABAP coding that is not ABAP Cloud-compatible.
  • It is great to see that bluetelligence is investing heavily in expanding its analyses in the direction of SAP Datasphere and SAP Analytics Cloud. In addition to views, it will also be possible to analyze Analytics Models and Task Chains via the Performer Suite in the future, which will help in migration projects when comparing the old BW world with the new Datasphere world. Unfortunately, with our current version of the tool, we were only able to connect to the old BW system, but not to Datasphere or BW Bridge.
  • Of course, it would be ideal if the Migration Booster, which supports migrations from BW to BW/4HANA, could also be used for BW Bridge projects in the future. Compared to the SAP conversion tool, it enables a much more convenient and targeted migration and also allows new naming conventions to be assigned. It remains to be seen whether SAP will provide the necessary APIs on the BTP and whether the market for BW Bridge migrations is large enough to justify such an investment. But it's never too early to start making wishes – Christmas is coming soon :-)

Do you have questions about SAP Datasphere or SAP BW Bridge? Are you trying to build up the necessary know-how in your department or do you need support with a specific issue? Please do not hesitate to contact us. We look forward to exchanging ideas with you! 

Learn more about SAP Datasphere


David

David has 13 years of experience as an SAP consultant in the areas of SAP Data & Analytics and the energy industry. His expertise includes customer support and consulting as well as IT conception, architecture and development of SAP solutions. He is particularly specialized in SAP Planning (BPC and SAP Analytics Cloud), SAP BW/4HANA and SAP Datasphere. In his spare time, David enjoys freediving and is an enthusiastic home barista.
