Databricks Engineer


At DAC.digital, we are constantly growing our business. As part of our growth strategy, we are starting a strategic partnership with one of the biggest consulting companies in Germany with a strong focus on Data Management. Their clients include major banking, insurance, retail, telecommunications, and automotive brands.


Key information:

Salary:

  • 17 500 – 25 000 PLN net/month – pure B2B contract

  • 15 000 – 22 000 PLN net/month – B2B contract (days off included)

Requirements


Technology stack:

  • Microsoft Azure

  • Azure Data Factory

  • Databricks

  • Log Analytics

  • Metric Dashboards



Nice to have:

  • German language

  • ServiceNow

  • experience in the production industry

  • experience in the retail industry

  • ITIL


It is vital that you have:

  • strong experience working with Microsoft Azure and its core services;

  • hands-on expertise in building and managing data pipelines using Azure Data Factory;

  • proficiency in working with Databricks for data processing and analytics;

  • solid understanding of Log Analytics for monitoring and troubleshooting system performance;

  • experience creating and maintaining Metric Dashboards to track key performance indicators and system health;

  • knowledge of English (min. B2);

  • experience working in agile methodologies (Scrum, Kanban);

  • strong communication skills;

  • eagerness to learn and share knowledge;

  • willingness to travel when necessary (conferences, coworking events, etc.).

Responsibilities


You will be responsible for supporting our team in:

  • monitoring system health by identifying and addressing critical errors through Log Dashboards or email alerts;

  • ensuring that performance metrics remain within acceptable thresholds using Metrics Dashboards;

  • reviewing and resolving open or pending ServiceNow tickets;

  • verifying progress and updates on maintenance activities via the DevOps Maintenance Board;

  • reviewing Microsoft Azure recommendations to implement performance and security best practices;

  • monitoring Azure Data Factory runtimes to confirm successful pipeline executions;

  • identifying and renewing expiring secrets stored in Azure Key Vault;

  • investigating and resolving failed data pipelines or jobs to ensure smooth operations;

  • optimizing pipeline runs and improving data product supply efficiency, particularly for data flows between Azure and SAP;

  • participating in daily team meetings to report progress and align on priorities.

Offer


  • possibility to work 100% remotely or on-site at our office in Gdańsk;

  • B2B contract with 30 days off included;

  • private medical care;

  • group insurance;

  • pre-paid card or Sport Card;

  • referral program;

  • real Agile practices;

  • employee well-being online platform;

  • cafeteria benefits.
