State-of-the-art Workflow Orchestration with Apache Airflow

Digital workflow management is growing in importance

Automated digital workflows are key to efficient and successful business processes: departments can focus on the important decisions and always have the latest reports at their disposal, analytical dashboards show up-to-date information, and prediction models are trained with the best data available. Availability and accountability of data are crucial for trust in data-driven decision making and planning. Trust the world’s most popular open source workflow orchestration platform to run your workflows like clockwork: Apache Airflow.

Apache Airflow is the leading open source workflow orchestration framework. It runs the time-critical processes and data pipelines of hundreds of companies worldwide, including pioneers of the data-centric business model such as Airbnb, Pinterest, Spotify, and Zalando. Cloud giants AWS and Google Cloud Platform offer Airflow as part of their regular service catalog, and Astronomer even builds its whole product and company around the open source project.

 


 

The success of Airflow is based on the project’s core principles of scalability, extensibility, dynamism, and elegance. The modular microservices architecture, built with the Python programming language, lends itself to meeting almost any technical requirement one can encounter. Workflow definitions in Airflow are expressed as Python code, which makes the system easy to learn for anyone with some programming experience. Since Airflow is not specifically a data pipeline tool, there are no boundaries to what business logic can be automated with the system. The extended Airflow ecosystem boasts connector libraries for almost any third-party system one can think of, including AWS, Azure, Oracle, Exasol, Salesforce, and Snowflake. NextLytics can even provide you with SAP integration! And last but not least, there are no license fees and no vendor lock-in to fear, thanks to the permissive Apache software license.
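To illustrate how workflows are expressed as Python code, here is a minimal sketch of a DAG (directed acyclic graph), assuming Airflow 2.x; the DAG name, schedule, and task callables are illustrative placeholders:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pulling data from a source system")

    def load():
        print("writing data to a target system")

    # hypothetical pipeline: names and schedule are placeholders
    with DAG(
        dag_id="example_pipeline",
        start_date=datetime(2023, 1, 1),
        schedule_interval="@daily",   # run once per day
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> load_task  # load runs only after extract succeeds

A few lines like these are all it takes for the scheduler to pick up, run, and monitor the workflow.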

Reliable workflow orchestration is an absolute necessity for the digital organization. Continue reading or download our whitepaper to learn more about how Airflow can contribute to your company’s ongoing success and how NextLytics can support you on that journey.

Whitepaper: Effective Workflow management with Apache Airflow 2.0

How do you manage your workflows with Apache Airflow? Which application scenarios are feasible in practice? And how do the features of the new major release address the current challenges of workflow management?
 

Digital workflows with the open source platform Apache Airflow

Creating advanced workflows in Python

In Apache Airflow, workflows are created with the Python programming language. The entry hurdle is low: within a few minutes you can define even complex workflows with dependencies on third-party systems and conditional branches.
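As a sketch of such a conditional branch (assuming Airflow 2.3 or later for EmptyOperator and the logical_date context key; all names are illustrative):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.empty import EmptyOperator
    from airflow.operators.python import BranchPythonOperator

    def choose_path(**context):
        # route weekday runs to the full load, weekend runs to a light load
        if context["logical_date"].weekday() < 5:
            return "full_load"
        return "light_load"

    with DAG(
        dag_id="branching_example",   # hypothetical name
        start_date=datetime(2023, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        branch = BranchPythonOperator(task_id="branch", python_callable=choose_path)
        branch >> [EmptyOperator(task_id="full_load"), EmptyOperator(task_id="light_load")]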


 

Schedule, execute and monitor workflows

The program-controlled scheduling, execution, and monitoring of workflows runs smoothly thanks to the interaction of the components. Performance and availability can be adapted to even your most demanding requirements.


 

Best suited for Machine Learning

Your Machine Learning requirements are met in the best possible way: even complex ML workflows can be ideally orchestrated and managed with Apache Airflow, and differing software and hardware requirements can be easily accommodated.


 

Reliable orchestration of third-party systems

The standard installation of Apache Airflow already includes numerous integrations with common third-party systems, allowing you to realize a robust connection in no time. And without risk: connection credentials are stored encrypted in the backend.


 

Ideal for the Enterprise Context

Excellent scalability meets the requirements of start-ups and large corporations alike. As a top-level project of the Apache Software Foundation with its origins at Airbnb, economical deployment at large scale was a goal from the beginning.


 

A glance at the comprehensive intuitive web interface

A major advantage of Apache Airflow is the modern, comprehensive web interface. With role-based authentication, the interface gives you a quick overview and serves as a convenient access point for managing and monitoring workflows.

[Screenshots of the web interface: start page, task editing, logs, an example workflow, complex workflows, tree view]

The orchestration of third-party systems is realized through numerous existing integrations; a usage sketch follows the list below.

  • Apache Hive

  • Kubernetes Engine

  • Amazon DynamoDB

  • Amazon S3

  • Amazon SageMaker

  • Databricks

  • Hadoop Distributed File System (HDFS)

  • Bigtable

  • Google Cloud Storage (GCS)

  • Google BigQuery

  • Google Cloud ML Engine

  • Azure Blob Storage

  • Azure Data Lake

  • ...
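As a sketch of how such an integration is used inside a task (assuming the Postgres provider package is installed; the connection id and query are illustrative):

    from airflow.providers.postgres.hooks.postgres import PostgresHook

    def copy_rows():
        # credentials are resolved from Airflow's encrypted connection store
        hook = PostgresHook(postgres_conn_id="warehouse_db")  # hypothetical connection id
        for row in hook.get_records("SELECT id, amount FROM orders LIMIT 10"):
            print(row)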


 

The workflow management platform for your demands

Flexibility through customization
Adaptability is provided by numerous plugins, macros, and individual classes. Since Airflow is completely based on Python, the platform can in principle be modified down to its very core. Adapt Apache Airflow to your current needs at any time.
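A minimal plugin sketch, assuming Airflow 2.x; the plugin and macro names are illustrative:

    from airflow.plugins_manager import AirflowPlugin

    def fiscal_year(logical_date):
        # hypothetical custom macro; in templates it becomes available as
        # {{ macros.nextlytics_plugin.fiscal_year(logical_date) }}
        return logical_date.year if logical_date.month < 10 else logical_date.year + 1

    class NextLyticsPlugin(AirflowPlugin):
        name = "nextlytics_plugin"
        macros = [fiscal_year]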

Truly scalable
Scaling with common systems like Celery, Kubernetes, and Mesos is possible at any time. In this context, lightweight containerization can be employed.

Completely free of charge
The workflow management platform is quickly available without license fees and with minimal installation effort. You can always use the latest versions to their full extent, free of charge.

Benefit from a whole community
As the de facto standard for workflow management, Airflow benefits not only from a broad user base but also from dedicated developers around the world. Current ideas and their implementation in code can be found online.

Agility by simplicity
Workflow definition is greatly accelerated by the implementation in Python, and workflows benefit from the flexibility this offers. In the web interface, with its excellent usability, troubleshooting and changes to workflows can be carried out quickly.

State-of-the-art workflow management with Apache Airflow 2.X

The new major release of Apache Airflow offers a modern user interface and new functions:

  • Fully functional REST API with numerous endpoints for two-way integration of Airflow with other systems such as SAP BW
  • Functional definition of workflows to implement data pipelines and improve data exchange between tasks using the TaskFlow API (see the sketch after this list)
  • Just-in-time scheduling based on change detection or the availability of required data items
  • Interval-based checking of start conditions with Smart Sensors, which keep the load on the workflow management system as low as possible
  • Dynamic task creation and scaling based on metrics of the current data flow
  • Improved business logic monitoring through integration with data observability frameworks
  • Increased usability in many areas (simplified Kubernetes operator, reusable task groups, automatic refresh of the web interface)
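A short sketch of the TaskFlow API introduced with Airflow 2.0; return values are passed between tasks automatically via XCom (names and values are illustrative):

    from datetime import datetime

    from airflow.decorators import dag, task

    @dag(start_date=datetime(2023, 1, 1), schedule_interval="@daily", catchup=False)
    def taskflow_pipeline():

        @task
        def extract() -> dict:
            return {"orders": 42}   # handed to the next task via XCom

        @task
        def transform(data: dict) -> int:
            return data["orders"] * 2

        @task
        def load(value: int):
            print(f"loading value {value}")

        load(transform(extract()))

    taskflow_pipeline()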


 

Do you have any questions or need support for your next AI project?

We will be happy to assist you in implementing or optimizing your AI-based application with our know-how and show you how Machine Learning can provide added value for you and your company.
 


Automatic file transfer

Thanks to the large number of integrations with other systems, the transfer of data and files can be realized easily. So-called sensors check start conditions, such as the existence of a file, at periodic intervals. For example, a CSV file can be loaded from a cloud service into a database. In this way, unnecessary manual work can be automated.
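A sketch of such a sensor-triggered file load, assuming Airflow 2.x; the file path, schedule, and load logic are illustrative:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from airflow.sensors.filesystem import FileSensor

    def load_csv():
        print("loading /data/incoming/orders.csv into the database")

    with DAG(
        dag_id="file_transfer",   # hypothetical name
        start_date=datetime(2023, 1, 1),
        schedule_interval="@hourly",
        catchup=False,
    ) as dag:
        wait_for_file = FileSensor(
            task_id="wait_for_file",
            filepath="/data/incoming/orders.csv",  # hypothetical path
            poke_interval=300,     # re-check every five minutes
            timeout=60 * 60,       # fail after one hour of waiting
        )
        load = PythonOperator(task_id="load_csv", python_callable=load_csv)
        wait_for_file >> load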



Triggering of external process chains via API

If there is no easy way to integrate a system via a task operator (and the community has not provided a solution yet), a connection via APIs and HTTP requests is still possible at any time. For example, a process chain in SAP BW can be started and tracked synchronously. Conversely, integrating other systems with Airflow through its excellent API is possible without any problems.
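As a sketch, a task that triggers an external process chain over HTTP (assuming the HTTP provider package is installed; the connection id, endpoint, and payload are hypothetical and not a real SAP BW API):

    from airflow.providers.http.operators.http import SimpleHttpOperator

    trigger_chain = SimpleHttpOperator(
        task_id="trigger_process_chain",
        http_conn_id="sap_bw",             # hypothetical stored connection
        endpoint="process_chains/start",   # hypothetical endpoint
        method="POST",
        data='{"chain_id": "ZPC_SALES"}',  # hypothetical payload
        headers={"Content-Type": "application/json"},
    )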


Executing ETL workflows

Your data management will benefit the most from Apache Airflow. It has never been easier to combine different structured data sources into one target table. Airflow was originally developed around the needs of Extract-Transform-Load (ETL) workflows and offers smart concepts, such as reusable workflow parts, to build even complex workflows quickly and robustly.
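A sketch of such a reusable workflow part using a TaskGroup, assuming Airflow 2.x; the source systems and target table are illustrative:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from airflow.utils.task_group import TaskGroup

    def etl_group(group_id: str, source: str) -> TaskGroup:
        # reusable extract/load block, parameterized by source system
        with TaskGroup(group_id=group_id) as group:
            extract = PythonOperator(
                task_id="extract",
                python_callable=lambda: print(f"extracting from {source}"),
            )
            load = PythonOperator(
                task_id="load",
                python_callable=lambda: print(f"loading {source} into the target table"),
            )
            extract >> load
        return group

    with DAG(
        dag_id="multi_source_etl",   # hypothetical name
        start_date=datetime(2023, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        etl_group("erp", "the ERP system") >> etl_group("crm", "the CRM system")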



Implementing Machine Learning processes

Many processes that are ideally implemented as workflows exist not only during the development of a Machine Learning application. The productive execution of a model can also be implemented as a workflow and thus provide, for example, current forecast data at fixed intervals. Data preparation and training of the current model version can likewise be realized easily with Airflow.
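A sketch of such a recurring training and forecasting workflow with the TaskFlow API; the paths and scoring logic are illustrative placeholders:

    from datetime import datetime

    from airflow.decorators import dag, task

    @dag(start_date=datetime(2023, 1, 1), schedule_interval="@weekly", catchup=False)
    def forecast_pipeline():

        @task
        def prepare_data() -> str:
            # collect and clean the latest feature data; returns its path
            return "/data/features/latest.parquet"   # hypothetical path

        @task
        def train(features_path: str) -> str:
            # retrain the current model version on fresh data
            return "/models/forecast.bin"            # hypothetical path

        @task
        def predict(model_path: str):
            print(f"writing current forecasts with {model_path}")

        predict(train(prepare_data()))

    forecast_pipeline()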

Would you like to know more about Machine Learning?

Find interesting articles about this topic in our Blog

Increase Efficiency with Apache Airflow Managed Service Operations

Any data-driven business needs at least one orchestration service to automate and streamline...


Implementing Single Sign On (SSO) Authentication in Apache Airflow

Apache Airflow is an open-source orchestration platform that provides a Python code–based interface...


Efficient Continuous Deployment Monitoring with Apache Airflow

Continuous Integration and Continuous Deployment (CI/CD) is an integral part of every modern...
