The Group is building a shared Data Platform and a centralized Data Factory to industrialize data and AI use cases at international scale.

The operating model is federated: each country contributes to local execution while relying on a common foundation (standards, data models, patterns, tooling, and technical governance).

In this context, the Local Data Engineer is the country-level "builder" profile. The role focuses on developing and operating data pipelines, delivering locally used datasets, and organizing a structured handover to Data Analysts for usage and functional evolutions.

### Mission

We're looking for a Data Engineer who thrives on building robust, scalable data pipelines and turning complex data into high-quality, analytics-ready products.

This is a hands-on role where you'll design, build, and deploy end-to-end data pipelines, from ingestion and transformation through to modelling and exposure, ensuring they are production-ready, monitored, and optimised for performance and cost.

You'll work closely with Data Analysts to shape requirements and deliver clean, well-documented datasets that drive real business value. Clear ownership boundaries are key: you'll own the engineering robustness and industrialisation, while Analysts focus on insight and impact.

### Reporting Organization (dual reporting)

- **Technical reporting**: Tech Lead Data / Data Architect (Data Factory)
- **Functional reporting**: Local Head of Data (Country) → local priorities, allocation, business coordination, adoption, and satisfaction of local teams

Daily collaboration with: Country Data Analysts, local business stakeholders, and the central Data Factory.
### What You'll Be Doing

- Designing and deploying scalable data pipelines in Azure
- Transforming multi-source data into structured, analytics-ready datasets
- Ensuring reliability through monitoring, testing, and quality controls
- Applying best practices across CI/CD, Git, security, RBAC, and observability
- Contributing to a "build once, reuse across countries" Data Factory model
- Supporting smooth handovers to Data Analysts through documentation and knowledge sharing

### Tech Environment

- Batch and event-driven processing
- n8n (where relevant within the framework)
- Git, CI/CD, monitoring and alerting tools
- Data governance, security, and access management standards

### What We're Looking For

- Depth of experience in data engineering
- Strong SQL and data modelling expertise
- Experience building and operating production-grade pipelines
- Cloud experience (Azure preferred)
- Comfortable working in a federated, multi-country environment
- A collaborative mindset and the ability to partner effectively with Data Analysts

This role would suit someone who enjoys combining technical depth with real business impact: building data products that are reliable, reusable, and built to scale.

### Who Are Entegra

Entegra is a performance improvement company and the world's leading hospitality performance partner, helping clients buy better, operate smarter, and perform more sustainably.

Leveraging over €36 billion in global purchasing power, we deliver savings, services, and operational excellence that drive client growth and impact.

Part of the Sodexo Group, a global leader in Quality-of-Life Services, Entegra operates across 10 countries in North America, Continental Europe, and the UK & Ireland, employing more than 450 colleagues and achieving double-digit growth year on year.
We're committed to equity of opportunity and to fostering a culture that values the strength of a diverse workforce. As a Disability Confident Committed employer, we actively remove barriers to inclusion and ensure everyone has the chance to thrive.

At Entegra, inclusivity drives performance: we create an environment where every colleague feels supported, valued, and empowered to succeed.