
Efficient Continuous Deployment Monitoring with Apache Airflow

Continuous Integration and Continuous Deployment (CI/CD) is an integral part of every modern software development life cycle. A DevOps engineer is tasked with ensuring that the application is tested, built and deployed properly to a QA environment and/or delivered to the customers in a timely manner. This pipeline usually runs many times a day, and the results of each individual job are documented immediately. For example, if a unit test fails or the code has syntax errors, the team is notified right away and can start fixing the issue. Once the deployment is finished, however, runtime errors can still arise and cause the application to exit abnormally. Failing to establish a connection to the database, failed validation of environment variables and misconfigured Docker containers are problems that - among others - can only be detected after the CI/CD pipeline has finished.

If your system does not include sophisticated monitoring tools like Prometheus and Grafana, a simple Apache Airflow DAG can do the job just fine. Running every few hours, it can identify problematic Docker containers and compose a report which is emailed to the DevOps team or sent to a Google Chat space as a notification. Let’s see how this DAG can be implemented.

Collect problematic containers

Our application consists of a REST API and a Postgres database. This pair is deployed on the QA environment every time we open a merge request and gets redeployed when new commits are available. All container names are prefixed with the project name (“my-project” in this example), followed by the branch name.

First, we develop two shell scripts to identify abnormally exited and restarting containers. Fortunately, this can be accomplished by leveraging the powerful Docker CLI and the grep tool.

collect_exited_containers.sh

#!/bin/bash

docker ps -a --filter 'status=exited' --format '{{.Names}}\t\t\t\t{{.Status}}' | grep -v 'Exited (0)' | grep 'my-project'

collect_restarting_containers.sh

#!/bin/bash

docker ps -a --filter 'status=restarting' --format '{{.Names}}\t\t\t\t{{.Status}}' | grep 'my-project'

The scripts filter all Docker containers based on their status (exited or restarting), format the output and select only the ones associated with our project. Note that we exclude containers that exited with status code 0, since this indicates a normal termination.

We also give execute permissions to both scripts.

$ chmod +x collect_exited_containers.sh collect_restarting_containers.sh

Develop Airflow DAG for monitoring

The DAG consists of five tasks:

  • collect_exited_containers
  • collect_restarting_containers
  • compose_report
  • send_email_report
  • send_notification_report

The first two are self-explanatory: we have already outlined the shell scripts that identify problematic containers, so the task implementations are fairly straightforward.

[Screenshots: collect_restarting_containers and collect_exited_containers task definitions]




The report will be composed by this simple Python callback function.

[Screenshots: compose_report callback and task definition]
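A sketch of that callback, assuming it runs as the python_callable of a PythonOperator: the message assembly is kept in a plain helper function, and the task pulls the two collection results from XCom (the task ids match the task list above):

```python
def build_report(exited: str, restarting: str) -> str:
    """Combine the two collection results into a human-readable report.

    Empty strings mean nothing problematic was found in that category.
    """
    sections = []
    if exited.strip():
        sections.append("Abnormally exited containers:\n\n" + exited.strip())
    if restarting.strip():
        sections.append("Restarting containers:\n\n" + restarting.strip())
    if not sections:
        return ""  # nothing to report
    sections.append("Please check the container logs in order to fix the related issues.")
    return "\n\n".join(sections)


def compose_report(ti):
    # Fetch the stdout of the two collection tasks from XCom; the return
    # value is pushed to XCom for the two reporting tasks downstream.
    exited = ti.xcom_pull(task_ids="collect_exited_containers") or ""
    restarting = ti.xcom_pull(task_ids="collect_restarting_containers") or ""
    return build_report(exited, restarting)
```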

We basically fetch the results from the two previous tasks and, if they are not empty, add them to the final message. A typical message looks like this:

Abnormally exited containers:

my-project_api_documentation-upgrades          Exited (137) 2 days ago
my-project_api_localization-data-refactoring   Exited (137) 2 days ago

Restarting containers:

my-project_api_establish_db_connection         Restarting (2) 6 seconds ago

Please check the container logs in order to fix the related issues.

 

Sending an email report can be accomplished with the EmailOperator.

[Screenshot: send_email_report task definition]

And pushing a notification to a Google Chat space is done by sending a POST request to a predefined webhook.

[Screenshots: push callback and send_notification_report task definition]
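A minimal sketch of that push, using the requests library: Google Chat incoming webhooks accept a JSON body with a "text" field, and the webhook URL would be supplied via op_kwargs, an Airflow Variable or a connection rather than hardcoded:

```python
import requests


def build_chat_payload(message: str) -> dict:
    # Google Chat incoming webhooks expect a JSON body with a "text" field.
    return {"text": message}


def send_notification_report(ti, webhook_url: str):
    """Python callable for the notification task (webhook_url is assumed
    to be passed in via op_kwargs or read from an Airflow Variable)."""
    message = ti.xcom_pull(task_ids="compose_report")
    if message:  # skip the notification when there is nothing to report
        response = requests.post(webhook_url, json=build_chat_payload(message))
        response.raise_for_status()
```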

The tasks can be arranged like so, thus creating dependencies between them.

[Screenshots: task dependency definition and the cicd_check_monitoring DAG grid view in Airflow]

Continuous Deployment with Apache Airflow - Our Conclusion

With this simple sequence of tasks, we can create a mechanism that monitors the deployed Docker containers of an application and notifies us if something is wrong. This use case once again demonstrates the versatility of Airflow and its ability to integrate and communicate with multiple systems.

You can find the whole code for this DAG here. If you have questions about this implementation, or if you’re wondering how you can use Apache Airflow to empower your data-driven business, we are happy to help. Contact the Data Science and Engineering team at NextLytics today.

Learn more about Apache Airflow


Apostolos

Apostolos has been a Data Engineering Consultant at NextLytics AG since 2022. He has experience from research projects on deep learning methodologies and their applications in fintech, as well as a background in backend development. In his spare time he enjoys playing the guitar and staying up to date with the latest news in technology and economics.

Got a question about this blog?
Ask Apostolos

