The person in this role will work with project teams developing a modern data platform in a cloud environment for global brands. They will work in an international environment, specialise in the architecture of cloud-based Big Data and Business Intelligence solutions and the latest technologies in this area, and promote the principles of the DataOps Manifesto.
Responsibilities:
- Proposing improvements to existing solutions and designing new ones
- Coordinating architectural arrangements with Architects on the client side and other vendors
- Coordinating the work of a team of Data Engineers (responsible for developing data platforms and ETL/ELT processes) and Data Analysts (responsible for the data model and report development) on the BitPeak side, across multiple projects running in parallel for the same client; working closely with Project Managers
- Optimisation of the production process through process and tool changes
- Enforcing and improving platform development standards, and setting new ones where gaps are identified
- Ensuring the quality of delivered solutions, including conducting code reviews
- Working "at the grassroots" as Data Engineer and Data Analyst (to stay in touch with the technology)
- Keeping up to date with current trends, tools and services in the Data area
Requirements:
- Minimum 5 years of experience in designing and building Business Intelligence, ETL/ELT, Data Warehouse, Data Lake, Big Data solutions
- Practical knowledge of various relational (e.g. SQL Server/Azure SQL Database, Oracle, Redshift, PostgreSQL) and non-relational (e.g. MongoDB, Cosmos DB, DynamoDB, Neo4j) database engines
- Very good knowledge of SQL and Python (min. 5 years' experience)
- Knowledge of process orchestration and data processing solutions, in particular Spark/Databricks (including Structured Streaming, Delta Lake, etc.), Snowflake, Azure Data Factory, Apache Airflow and dbt
- Understanding of the following areas: data governance, data quality, data visualisation
- Data modelling skills (Star schema / Lakehouse / Medallion / Data Vault / Data Mesh / Common Data Model / Corporate Data Model)
- Advanced skills in working with Git repositories (Bitbucket/GitHub)
- Familiarity and experience with data services offered by the Azure and/or AWS platforms
- Flexibility, self-reliance and efficiency in action, as well as responsibility for assigned tasks
- Knowledge of CI/CD practices and experience in configuring CI/CD pipelines
- Knowledge of issues and specialised solutions in the area of Data Governance
- Experience in conducting code reviews, managing a team and delegating tasks
- Practical knowledge of English at a minimum of B2 level (C1+ preferred)
We offer:
- Global projects in multiple clouds - we work with clients from all over the world using modern cloud technologies
- Certification reimbursement - we fund exams and certifications from Microsoft, AWS, Databricks and Snowflake
- Time to learn - 60 paid hours per year
- Flexible approach - you can choose to work from home or meet at our offices
- Personalised benefits - medical care, subsidised sports packages, language tuition, a new employee referral bonus (up to PLN 15,000), as well as annual and media bonuses