Data Warehouse Specialist (Google Cloud Platform)
Generali is one of the largest global insurance and asset management providers.
Established in ****, and present in 50 countries worldwide, the Generali Group is one of the most significant players in the global insurance and asset management industry.
Generali supports each firm in innovating and growing with full investment autonomy, ensuring they are set up to develop sustainable and innovative solutions.
With the stability of Generali Group backing, the platform enables investors to access distinctive strategies and experts' insights.
GEB is a global Employee Benefits platform that helps Multinational Corporates succeed by protecting and enhancing the physical, emotional & financial wellbeing of their human capital.
Driven by customer service, innovation, and operational excellence, GEB is built on an ecosystem of partnerships to support clients on their Environmental, Social & Governance journey.
Its presence is truly global (127 countries) and reliable thanks to 136 trusted local Network Partners, who enable the provision of focused expertise and support to 298 Lifecycle Pooling coordinated multinational programmes, 324 other global solutions and 62 Captive programmes, with a premium volume of €***** billion (YE **** figures).
For more information, please visit our website.
The Data Warehouse Specialist plays a key role in the Data Governance, Automation and Process Optimization team, supporting the design, development, and maintenance of scalable data infrastructure.
This role focuses on building robust data pipelines, optimizing data flows, and enabling secure, efficient access to enterprise data for analytics and reporting.
The Specialist will collaborate with cross-functional teams to ensure data integrity, automation, and compliance across GEB's cloud-based data ecosystem.
Main accountabilities (non-exhaustive list)
Design and implement data extraction pipelines using GCP components such as Dataproc, Data Fusion, and Composer/Airflow
Develop and maintain DAGs for ETL processes and automate environment recreation using Google Cloud SDK and API scripting
Integrate Cloud Functions (Gen 1 & 2) and Secret Manager into data workflows for secure and scalable operations
Build and optimize BigQuery datasets for enterprise reporting and analytics
Apply AEAD encryption and manage Dataplex environments where applicable
Write and maintain advanced SQL queries and Shell scripts for data manipulation and automation
Support release management activities, including Cloud Build deployment processes
Collaborate with business analysts and stakeholders to understand data requirements and translate them into technical specifications
Ensure data quality, consistency, and security across all data warehouse environments
Document data models, processes, and architecture for internal use and compliance
Coordinate with testing teams to validate data components and ensure alignment with business expectations
Maintain up-to-date knowledge of emerging data technologies, warehousing standards, and best practices
Undertake additional responsibilities and tasks reasonably aligned with the scope of this role, as required by evolving business needs
The above list is not exhaustive and may be amended or adapted at any time by the Employer, at its own discretion, in accordance with business needs.
Qualifications
Very solid hands-on experience with Google Cloud Platform (GCP) components, including Dataproc, Data Fusion, Composer, Cloud Functions, BigQuery, and Secret Manager
Strong proficiency in Python scripting and SQL is a must
Solid understanding of data modelling, ETL design, and data warehouse architecture
Experience with Shell scripting and automation of cloud environments
Familiarity with release management and CI/CD pipelines (Cloud Build experience is a plus)
Experience with AEAD encryption and Dataplex is considered a strong asset
Strong communication skills with the ability to explain technical concepts to non-technical stakeholders
High level of autonomy and accountability
Self-initiative, proactive mindset, and team-oriented approach
Flexibility in working hours to accommodate global collaboration
Master's degree in computer science, data engineering, information systems, or a related field
Minimum of 6 years of experience in data warehousing or business intelligence roles within an international environment
Certification in Google Professional Cloud Data Engineer is highly desirable
Experience with employee benefits, reinsurance, or financial services data is a plus
Familiarity with data governance, compliance, and security best practices
Willingness to travel occasionally as required by business needs
Fluent in English; any other language would be considered an asset
This recruitment deals with a permanent and full-time position, based in Assago (close to Milan, Italy).
Generali Employee Benefits' commitment to recruiting
Generali Employee Benefits is committed to promoting equal opportunities in employment.
Candidates will receive equal treatment regardless of age, disability, gender reassignment, marital or civil partner status, pregnancy or maternity, race, color, nationality, ethnic or national origin, religion or belief, sex or sexual orientation.
At Generali, we believe that it is our differences that make the difference.
At the heart of everything we do, we value the fact that we are all human beings, unique in our own ways, bringing different cultures, lifestyles, mindsets, and preferences.
Our commitment is to leverage this Diversity to create long-term value, to be innovative, sustainable, to make the difference for our people, our clients, our partners as well as our communities.
We strive to promote a culture where D&I is embedded in how we work and do business every day.
All of us around the world are taking actions every day to create an inclusive and accessible workplace, where every person feels empowered to take ownership to challenge biases and lead the transformation with a human touch.
Generali Employee Benefits endeavours to contact candidates within 21 days of application.
However, if you do not hear back within 3 weeks, please assume that, on this occasion, your application has unfortunately not been successful.
Personal data collected will be used strictly for recruitment purposes only.
All unsuccessful applications will be destroyed no later than 3 months after the closing of this recruitment campaign.
Seniority Level
Mid-Senior level
Employment Type
Full-time
Job Function
Information Technology
Industry
Insurance