KIKO MILANO was founded in 1997 in Milan and has since revolutionized how cosmetics are sold globally. We strive to offer an incredible variety of products, textures & colours, and a multi-sensorial experience built on innovative technology, always at an affordable price. Based in Italy, and true to its DNA, KIKO MILANO combines trustworthy quality, creativity and stunning aesthetics.
Your role at KIKO MILANO
We are looking for a Senior Data Engineer to join KIKO MILANO’s IT Team, reporting to the Data, AI & Cloud Director.
This person will be responsible for designing, building, and optimizing scalable data pipelines and data platforms within a cloud environment (AWS/Databricks) to support advanced analytics and AI use cases. The role focuses on developing reliable data ingestion and transformation frameworks, implementing efficient data models for Data Warehouses and Data Marts, and ensuring high performance, security, and governance of the data infrastructure.
In particular, you will:
* Design, build, and optimize scalable and reliable ETL/ELT data pipelines to guarantee high performance and sustainable costs
* Develop and validate data ingestion frameworks for various sources (APIs, databases, streaming sources)
* Implement data transformation, aggregation, and validation logic to prepare data for analytical and AI use cases
* Contribute to the strategic technical roadmap and architecture for the Databricks Enterprise Data Platform (Data Lake, Data Warehouse, Data Mesh/Fabric) in AWS
* Monitor, tune, and troubleshoot data infrastructure components in the cloud environment to ensure high availability and performance
* Design and implement optimal data models (e.g., Star Schema, 3NF) for Data Warehouses and operational Data Marts
* Ensure adherence to security and governance standards in all data pipeline development
* Understand business requirements and translate them into technical design and implementation
What you will need to succeed
* Deep expertise in modern data architectures (Data Lakehouse, Data Mesh, Dimensional Modeling)
* Strong expertise with the Databricks platform
* Expertise in modern front-end presentation layers (e.g., Power BI)
* Expertise in Big Data technologies (e.g., Spark) and workflow orchestration tools (e.g., Airflow, ADF)
* Strong focus on cost optimization and Cloud FinOps within the data domain
* Familiarity with AWS services (VPC, EC2, S3, RDS, Lambda, ECS/EKS, IAM) is a plus
* Strong understanding of automation, Infrastructure as Code, and CI/CD pipelines in a data context
* Experience with Machine Learning Operations (MLOps) and productionizing AI models is highly preferred
* Familiarity with data virtualization and data fabric concepts
* Fluency in English (required for technical documentation and international alignment)
We would love it if you had
* Technical Leadership abilities, capable of driving technical direction
* Proactive attitude towards technical debt reduction and process improvement
* Strong analytical abilities to troubleshoot and resolve complex data flow issues
* Experience coordinating work with external system integrators
* Ability to work effectively within an agile team environment, collaborating with Data Scientists and Business Analysts