DataOps Engineer
An international group operating in the services sector is looking for a DataOps Engineer.
Key Responsibilities
* Collaborate with Data Engineering, DevOps, and Architecture teams to design, deploy, and operate scalable, reliable data infrastructure supporting data ingestion, analytics, and AI projects;
* Build, automate and manage data platform environments (data lakes, data warehouses, streaming systems) leveraging AWS services and Infrastructure as Code practices (an illustrative sketch follows this list);
* Implement and maintain CI/CD pipelines for data workflows, ensuring high availability, observability, and security across all environments;
* Develop monitoring, logging, and alerting systems to ensure performance, reliability, and cost optimization of data workloads;
* Contribute to the evolution of a data-centric culture by enabling fast, safe, and repeatable deployment of data solutions;
* Work within an Agile team with a collaborative mindset, contributing to continuous improvement of processes, automation, and platform reliability.
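To give a flavor of the day-to-day work, here is a minimal, hypothetical Infrastructure as Code sketch in Python using AWS CDK (one of the IaC tools listed under Required Skills). The stack and bucket names are illustrative placeholders, not part of the actual platform.

```python
# Hypothetical sketch: provisioning a versioned, encrypted S3 data-lake bucket
# with AWS CDK (Python). All resource names are placeholders.
from aws_cdk import App, Stack, RemovalPolicy
from aws_cdk import aws_s3 as s3
from constructs import Construct


class DataLakeStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Raw-zone bucket for ingested data: versioned, encrypted, never public.
        s3.Bucket(
            self,
            "RawDataBucket",
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
            removal_policy=RemovalPolicy.RETAIN,
        )


app = App()
DataLakeStack(app, "data-lake-raw")
app.synth()
```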
Required Skills
* Strong experience with AWS services (e.g. S3, Glue, ECS, EKS, Lambda, CloudFormation, IAM, CloudWatch);
* Solid understanding of CI/CD pipelines and tools (e.g. GitHub Actions, Jenkins, CodePipeline, dbt Cloud);
* Hands-on experience with Infrastructure as Code (Terraform, AWS CDK, or CloudFormation);
* Familiarity with data orchestration tools (Airflow, Prefect, Dagster) and ETL/ELT frameworks (see the sketch after this list);
* Proficiency in Python or other scripting languages for automation and operational tasks;
* Experience with containerization and orchestration (Docker, Kubernetes);
* Good knowledge of monitoring and observability tools (Prometheus, Grafana, ELK, Datadog);
* Strong focus on reliability, automation, and scalability of data systems.
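For illustration only, a minimal Airflow DAG sketch of the kind of orchestration work referenced above; the schedule, task names, and task bodies are hypothetical placeholders rather than a description of the group's actual pipelines.

```python
# Hypothetical sketch: a tiny daily extract-and-load DAG using the Airflow
# TaskFlow API. Task bodies are placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["example"])
def daily_ingest():
    @task
    def extract() -> list[dict]:
        # Placeholder for pulling raw records from a source system.
        return [{"id": 1, "value": 42}]

    @task
    def load(records: list[dict]) -> None:
        # Placeholder for writing records to the data lake / warehouse.
        print(f"loaded {len(records)} records")

    load(extract())


daily_ingest()
```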
Smart working: mostly remote, with 2 days per month on-site in Milan and great flexibility.