Generali Employee Benefits
Established in Trieste, Italy, in 1831, Assicurazioni Generali SpA is a business with a long history. The Generali Group is one of Europe’s largest multiline insurers by market capitalization and ranks among the top five insurers in the world by global premium income.
Generali Employee Benefits (GEB) is the employee benefits division of the Generali Group. The GEB Network, composed of more than 100 local insurance companies, is one of the leading partners in international employee benefits management, serving more than 1,400 international corporate customers.
For more information, please visit our website www.geb.com
Role Overview
As Data Engineering Manager, you will lead the technical evolution and operational excellence of GEB’s Data Ingestion Tool platform. You will act as both technical architect and team leader, driving the design, development, and scalability of data ingestion pipelines while ensuring alignment with business needs and enterprise architecture standards.
You will collaborate closely with the Product Owner, business analysts, IT teams, and key stakeholders across the COO and business departments. Your role combines strategic oversight, hands-on technical leadership, and mentoring of data engineering practices, supporting the transition from legacy systems to a modern, cloud-based data ingestion framework.
Key Responsibilities
Technical Leadership & Architecture
* Define and evolve the ingestion platform architecture using PySpark, Databricks, and Azure Data Services.
* Lead the implementation of scalable ingestion pipelines following the Medallion architecture (Bronze, Silver, Gold layers), and proactively initiate technical improvements.
* Ensure robust data quality validation, metadata management, and access control using Unity Catalog and Delta Lake.
* Oversee integration with APIs (REST, GraphQL), third-party tools, and enterprise systems.
* Coordinate continuous improvement with the Product Owner, driving functional and business changes with measurable impact, covering both process-oriented and technological improvements (ML, AI).
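The Medallion layering mentioned above can be sketched with a minimal, illustrative example. The actual platform uses PySpark, Databricks, and Delta Lake; the snippet below uses plain Python only to show the Bronze → Silver → Gold flow, and all record fields, sample values, and function names are hypothetical.

```python
from collections import defaultdict

# Bronze: raw records landed as-is from a source system (fields are hypothetical).
bronze = [
    {"policy_id": "P-1", "country": "IT", "premium": "1200.50"},
    {"policy_id": "P-2", "country": "it", "premium": "980.00"},
    {"policy_id": None,  "country": "FR", "premium": "x"},  # malformed record
]

def to_silver(records):
    """Silver: validate and standardize; drop records failing basic checks."""
    silver = []
    for r in records:
        if not r["policy_id"]:
            continue  # reject rows without a business key
        try:
            premium = float(r["premium"])
        except ValueError:
            continue  # reject rows with unparseable amounts
        silver.append({
            "policy_id": r["policy_id"],
            "country": r["country"].upper(),  # standardize country codes
            "premium": premium,
        })
    return silver

def to_gold(records):
    """Gold: aggregate into a consumption-ready view (premium by country)."""
    totals = defaultdict(float)
    for r in records:
        totals[r["country"]] += r["premium"]
    return dict(totals)

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'IT': 2180.5}
```

In a PySpark implementation the same stages would typically be Delta tables, with the Silver validation expressed as DataFrame filters and the Gold view as a grouped aggregation.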
Team & Delivery Management
* Mentor and guide data engineers, promoting best practices in development, testing, and CI/CD.
* Translate business requirements into technical specifications and oversee their implementation.
* Collaborate with the Product Owner to align technical delivery with product roadmap and business priorities.
* Drive Agile delivery practices, including sprint planning, backlog grooming, and release management.
Operational Excellence
* Ensure platform reliability through proactive monitoring, incident management, and performance optimization.
* Maintain and evolve documentation in Jira and Confluence, including training and enablement materials.
* Support stakeholder engagement through technical presentations, workshops, and training sessions.
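As one illustration of the proactive monitoring called out above, a pipeline health check might compare a feed's last load time and row count against thresholds before raising an incident. This is a stdlib-only sketch: the threshold values, parameter names, and alert messages are assumptions, and in practice such checks would run against Delta tables via PySpark or the platform's monitoring tooling.

```python
from datetime import datetime, timedelta, timezone

def check_ingestion_health(last_load_at, row_count,
                           max_staleness=timedelta(hours=24),
                           min_rows=1):
    """Return a list of alert strings; an empty list means the feed looks healthy.

    Thresholds and semantics are illustrative, not the platform's actual rules.
    """
    alerts = []
    age = datetime.now(timezone.utc) - last_load_at
    if age > max_staleness:
        alerts.append(f"stale feed: last load {age} ago")
    if row_count < min_rows:
        alerts.append(f"row count {row_count} below minimum {min_rows}")
    return alerts

# A feed loaded two hours ago with 5,000 rows raises no alerts.
recent = datetime.now(timezone.utc) - timedelta(hours=2)
print(check_ingestion_health(recent, 5000))  # []
```

Returning a list of alerts rather than raising immediately lets a scheduler aggregate findings across many feeds into a single incident or report.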
Skills & Competencies
* Strong leadership and communication skills, with the ability to influence technical direction and engage stakeholders.
* Proven experience in designing and implementing scalable data architectures.
* Ability to balance strategic oversight with hands-on technical execution.
* Proactive problem-solver with a structured approach to managing complexity.
* Comfortable working in Agile teams and managing multiple priorities.
* Collaborative mindset with a focus on team development and knowledge sharing.
Qualifications & Experience
* Minimum 7 years of experience in data engineering.
* Experience in insurance or financial services is a strong asset.
* Proficient in: Python (PySpark), Databricks, Delta Lake, Azure Data Services, PostgreSQL, REST/GraphQL APIs, CI/CD tools (Azure DevOps), Git, and optionally Terraform.
* Familiarity with Medallion architecture and Unity Catalog.
* Experience with Agile methodologies and tools (Jira, Confluence).
* Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
* Fluent in English; additional languages are a plus.