Your contribution to TenneT
What does a Cloud Data Platform Engineer do at TenneT?
As a Cloud Data Platform Engineer in the Digital & Data organization, you are a key player in the design, implementation and maintenance of TenneT's cutting-edge TenneT Data Cloud (TDC). You will work in one of our DevOps teams, collaborating with your colleagues on a daily basis. You set up and maintain Azure services such as Azure Data Factory, Azure Databricks, and Microsoft Fabric. Your stakeholders are mainly the other Digital & Data DevOps teams; together with them, you translate their needs into smart technical solutions. You automate workflows, ensure seamless integration with various data sources, and guard the performance of the TDC so that we can guarantee a high level of availability and reliability. Because you are curious and eager to learn, you stay up to date with the latest Azure technologies and best practices.
What do we expect from you?
- Hands-on design, development, and implementation of services on Azure;
- Build and maintain CI/CD pipelines for efficient and automated deployment and testing of data engineering processes;
- Utilize scripting and development skills in languages such as Python, Java, and other relevant languages to enhance data processing and manipulation;
- Demonstrate expertise in Azure services, ensuring efficient data storage, data ingestion, data transformation, and data analytics;
- Develop and integrate APIs to enable smooth data communication and interaction with external systems;
- Implement automated workflows and integrations;
- Work with Infrastructure as Code (IaC) to provision and manage cloud infrastructure on Azure.
Your profile and background
- Minimum of 5 years of experience in a comparable role, with a focus on implementing, optimizing and maintaining large-scale analytics platforms;
- Hands-on experience with Azure services, specifically Azure Data Lake Storage, Azure Data Factory, Azure Databricks, Azure Functions, and Log Analytics;
- In Azure Data Factory, you've designed or implemented metadata-driven pipelines;
- In Databricks, you've designed or implemented workflows and pipelines, ideally using Databricks Asset Bundles;
- Knowledge of continuous integration and continuous deployment (CI/CD) methodologies and tools, preferably Azure DevOps or GitHub;
- Hands-on experience with deploying cloud infrastructure using Infrastructure as Code, preferably Terraform;
- Knowledge of containerization technologies such as Docker or Kubernetes;
- Solid scripting and development skills. Knowledge of Python and SQL is a plus;
- Hands-on experience with setting up alerting in Azure and incident management tools like Splunk On-Call or PagerDuty;
- Familiarity with automated workflows and integrations;
- Understanding of API development and integration;
- Excellent problem-solving abilities and a proactive approach to troubleshooting and optimizing data engineering processes;
- Strong communication skills and the ability to work collaboratively in a team environment;
- Knowledge of big data technologies and frameworks like Hadoop, Spark, and Kafka.
Our recruiting process
Our offer
This will be our challenge
Additional information
- The weekly working time is 40 hours
- The contract is for an indefinite period
- Job interviews for this position will be (partly) conducted in English. Please send us your application documents in English as well
Salary