Milestone reached: NextLytics becomes Databricks partner

NextLytics is now an official consulting partner of Databricks! We are pleased to take this step and further expand our successful ongoing work with the Databricks data and AI platform. NextLytics stands for personal, trustworthy and independent consulting in the areas of data, analytics and business intelligence. With this partnership, we officially add a new facet to our many years of expertise in the system landscapes of SAP Business Intelligence products and open-source software alternatives. We are convinced of the possibilities that Databricks and its associated architectural and technical concepts open up for our customers, and we see great potential in combining the platform with other products from the broad spectrum of data science and analytics.

Why Databricks?

Databricks is at the cutting edge of data management, combining the best of two worlds: the flexibility and scalability of data lake storage and the structural advantages of a data warehouse. The result is the so-called Lakehouse architecture – a platform that unites analytics, data engineering and machine learning capabilities under one roof. No other platform currently integrates these different approaches as seamlessly as Databricks. For users, the previously unwieldy concepts and software libraries for distributed data storage and parallelized processing are tightly integrated into the infrastructure: "Big Data" suddenly looks like a normal database, freeing data scientists from old technical restrictions on insightful data analysis. At the same time, data scientists and business stakeholders can use the platform's integrated analytics tools to analyze trends in the data and make critical, data-driven decisions.

We have been successfully working with Databricks and Apache Spark for our customers for more than two years and we know how powerful this combination is. Whether it's processing huge amounts of data, automating time-critical ETL processes, or optimizing machine learning workflows, Databricks gives us the tools to develop powerful, scalable solutions for our customers. At the same time, Databricks focuses on a structured, collaborative and code-driven approach, thus opening up the possibility of transferring established software development best practices to working with large amounts of data: automated tests, quality controls, review and approval mechanisms for critical processes, and a complete change history of the code used and the data stored in Databricks. 
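The complete change history mentioned above can be illustrated with a toy sketch in plain Python. This is not the Delta Lake API – just an illustration of the underlying idea of versioned, append-only table storage with "time travel" reads:

```python
from copy import deepcopy

class VersionedTable:
    """Toy append-only table: every write creates a new version,
    and any past version can be read back ("time travel")."""

    def __init__(self):
        self._versions = []  # list of (operation, snapshot-of-rows)

    def write(self, operation, rows):
        self._versions.append((operation, deepcopy(rows)))

    def read(self, version=None):
        # Default: latest version, like a normal table read.
        if version is None:
            version = len(self._versions) - 1
        return self._versions[version][1]

    def history(self):
        # Conceptually similar to Delta Lake's DESCRIBE HISTORY output.
        return [(i, op) for i, (op, _) in enumerate(self._versions)]

table = VersionedTable()
table.write("CREATE", [{"id": 1, "status": "new"}])
table.write("UPDATE", [{"id": 1, "status": "processed"}])

print(table.history())        # [(0, 'CREATE'), (1, 'UPDATE')]
print(table.read(version=0))  # original state before the update
```

In Databricks, this bookkeeping happens transparently for every managed table, which is what makes the audit trail and review mechanisms described above possible.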


Databricks user interface for the Unity data catalog. Databricks-managed tables include a complete and accessible change history. Details about the time of the change and links to the notebook from which the changes originated allow data changes to be traced.


Databricks Delta Live Tables user interface: the Delta Live Table itself is displayed in the upper area, while all relevant events and status changes are listed in the event log in the lower section.

Databricks as a building block in your system landscape

Databricks shines as a central platform solution for data-based challenges and opens up a wide range of possibilities for integration and combination with other systems. It is characterized by a large number of connectivity options to other tools and platforms. These include a wide range of data sources, whose data can be collected and processed in Databricks, as well as BI tools such as Apache Superset or Microsoft PowerBI. Git integration for code version control also makes it easier to work collaboratively on data pipelines and ensures the necessary transparency when making changes to data processing logic.

This is just a small teaser, as Databricks integrates seamlessly into the environments of the major cloud providers Microsoft Azure, Amazon Web Services and Google Cloud Platform, enabling a wide range of personalized connections to be set up for a variety of use cases. 

What does this partnership mean for our customers?

Becoming a Databricks partner enables us to offer our customers even more comprehensive support – both technically and strategically. We present five examples of how we generate added value for our customers when working with Databricks:

1. Optimized data pipelines

With Databricks, we develop scalable data pipelines that work efficiently even with large amounts of data. By using Databricks features like Delta Lake and Delta Live Tables, we ensure data transparency, reliability, and the ability to perform near real-time analysis.




2. Best practices for your development teams

Our aim is not only to build functional solutions, but also to strengthen your team. We bring DevOps best practices that support your developers – from standardized workflows for operating Databricks to customized Python libraries that we have developed for working with Databricks and Spark.

Databricks DevOps processes using Azure DevOps as an example: above the three Databricks system environments sits the Infrastructure-as-Code process, which defines, checks, and deploys system components such as workspaces and authorization management, Spark clusters, and Databricks jobs using Terraform. The development process for data processing in the Lakehouse is outlined in the lower half: SQL and Python notebooks as well as central Python libraries are developed on the Databricks dev workspace, versioned and synchronized with Git, automatically tested, and delivered via deployment pipeline.
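As a small illustration of what "automatically tested" means for data logic, here is a plain unit test for a data-cleansing helper. `normalize_country` is a hypothetical example function, not part of our actual libraries:

```python
def normalize_country(code: str) -> str:
    """Map inconsistent source-system country codes to ISO alpha-2."""
    mapping = {"GER": "DE", "D": "DE", "UK": "GB"}
    code = code.strip().upper()
    return mapping.get(code, code)

def test_normalize_country():
    # Assertions like these run automatically in the deployment pipeline,
    # so a broken mapping never reaches the production workspace.
    assert normalize_country("ger") == "DE"
    assert normalize_country(" uk ") == "GB"
    assert normalize_country("FR") == "FR"  # already ISO: passes through

test_normalize_country()
print("all assertions passed")
```

Because transformations live in versioned Python libraries rather than ad-hoc notebook cells, they can be tested exactly like any other software artifact.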

3. Machine Learning made easy

Databricks provides the perfect environment to train, test and deploy machine learning models. Through tools like MLflow, we help you manage the entire machine learning lifecycle, from initial idea to deployment.
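The tracking idea behind MLflow can be sketched in a few lines of plain Python. The real API uses `mlflow.start_run()`, `mlflow.log_param()` and `mlflow.log_metric()`; the toy classes below only illustrate the concept of logging parameters and metrics per run and comparing runs afterwards:

```python
# Toy experiment tracker illustrating the MLflow pattern (not the MLflow API).
runs = []

class Run:
    def __init__(self, name):
        self.name, self.params, self.metrics = name, {}, {}

    def log_param(self, key, value):
        self.params[key] = value      # e.g. hyperparameters

    def log_metric(self, key, value):
        self.metrics[key] = value     # e.g. evaluation scores

def start_run(name):
    run = Run(name)
    runs.append(run)                  # every run stays queryable later
    return run

run = start_run("churn-model-v1")     # hypothetical model name
run.log_param("max_depth", 5)
run.log_metric("auc", 0.87)

# Later in the lifecycle: pick the best run as a deployment candidate.
best = max(runs, key=lambda r: r.metrics.get("auc", 0.0))
print(best.name)
```

MLflow adds to this the model registry, artifact storage, and deployment hooks that carry a model from experiment to production.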

4. Integration into your cloud environment

Whether AWS, Azure or Google Cloud – we'll make sure Databricks integrates seamlessly into your existing infrastructure. This way, you can use your data platform efficiently without rebuilding your architecture from the ground up.

5. SAP and Databricks

SAP-made products can be found in many companies, but they do present some hurdles when it comes to integrating with systems from other providers. We are familiar with both worlds and help our customers find the best way to exchange data between SAP and Databricks for every scenario. As a partner of both SAP and Databricks, we are familiar with the roadmaps of both providers and stay up to date for our customers in terms of integration options.


Simplified process of a streaming data pipeline between SAP systems and Databricks using the change data capture (CDC) method. On the SAP side, delta-capable data objects must be set up, for example via CDS views; these can then be transferred to the data lakehouse by batch routine or real-time query.
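The merge step of such a CDC pipeline can be sketched in plain Python. This is a simplified illustration of what a `MERGE INTO` statement does on the Databricks side; the key column and operation codes are hypothetical:

```python
# Target table in the lakehouse, keyed by id.
target = {1: {"id": 1, "status": "open"},
          2: {"id": 2, "status": "open"}}

# Delta records extracted from the SAP side: insert / update / delete.
changes = [
    {"op": "U", "row": {"id": 1, "status": "closed"}},  # update
    {"op": "I", "row": {"id": 3, "status": "open"}},    # insert
    {"op": "D", "row": {"id": 2}},                      # delete
]

def apply_changes(target, changes):
    """Apply CDC records to the target, like a MERGE INTO on the key."""
    for c in changes:
        key = c["row"]["id"]
        if c["op"] == "D":
            target.pop(key, None)
        else:  # insert or update: upsert by key
            target[key] = c["row"]
    return target

apply_changes(target, changes)
print(sorted(target))  # remaining keys after the merge
```

In practice, Delta Lake executes this merge transactionally and at scale, so the lakehouse copy stays consistent with the SAP source even under streaming load.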

Our vision: Transforming data into value

Our mission is to empower companies to become data-driven. Data is one of the most important resources today – but only when it is used meaningfully does it develop its true value.

With Databricks, we have a platform that enables us to do just that:

  • Process data faster
  • Design scalable processes
  • Facilitate data-based decision-making

And best of all: with Databricks' Lakehouse architecture, we can break down the traditional boundaries between business intelligence, data engineering and data science and combine everything in one platform.

Together into the future

We at NextLytics are convinced that the partnership with Databricks is a big step in the right direction. Together with our customers, we want to build the data platforms of the future - platforms that are not only powerful, but also sustainable and flexible.

Would you like to find out more about how we can support you with Databricks? Then feel free to contact us - we look forward to exchanging ideas with you!



Markus

Markus has been a Senior Consultant for Machine Learning and Data Engineering at NextLytics AG since 2022. With significant experience as a system architect and team leader in data engineering, he is an expert in microservices, databases and workflow orchestration - especially in the field of open-source solutions. In his spare time, he tries to optimize the complex system of growing vegetables in his own garden.

