<h3>Responsibilities</h3>
<ul>
<li>Create and maintain optimal data pipeline architecture by designing and implementing data ingestion solutions on AWS, using AWS native services (such as Glue and Lambda) or data management technologies (such as Talend or Informatica)</li>
<li>Design and optimize data models on the AWS Cloud using AWS data stores such as Redshift, DynamoDB, RDS, and S3</li>
<li>Design operations architecture and conduct performance engineering for large-scale data lakes in production environments</li>
<li>Participate in client design workshops and provide trade-offs and recommendations towards building solutions</li>
<li>Mentor other engineers in coding best practices and problem solving</li>
</ul>
<h3>Required Skills</h3>
<ul>
<li>Working experience in a cloud-native environment in one of the three major public clouds (GCP, AWS, Azure); at least 5 years' experience on AWS is preferred</li>
<li>Experience with and knowledge of Big Data architectures, both cloud and on-premise</li>
<li>Working experience with AWS infrastructure and networking</li>
<li>AWS collection services: Kinesis, Kafka, Database Migration Service</li>
<li>Main AWS storage services: S3, RDS, Redshift, DynamoDB</li>
<li>Main AWS compute services: EC2, Lambda, ECS, EKS</li>
<li>Experience in building and delivering proofs of concept to address specific business needs, using the most appropriate techniques, data sources, and technologies</li>
<li>Working experience migrating workloads from on-premise to cloud environments</li>
<li>Experience monitoring distributed infrastructure using AWS tools or open-source ones such as CloudWatch, Prometheus, and the ELK stack</li>
<li>Proven experience in Java, Scala, Python, and shell scripting</li>
<li>Working experience with Apache Spark, Databricks, Azure Data Factory, Azure Synapse, and other Azure-related ETL/ELT tools</li>
<li>AWS certification: AWS Certified Solutions Architect and/or AWS Certified Data Analytics</li>
<li>Working experience with Agile methodology and Kanban</li>
<li>SQL language knowledge</li>
<li>Experience working with source code management tools such as AWS CodeCommit or GitHub</li>
</ul>
<p>Location: Bologna, Roma, Milano, Torino, Bari, Cosenza, Napoli, Treviso, Pisa, and Salerno</p>
<p>Third parties fraudulently posing as NTT DATA recruiters</p>
<p>NTT DATA recruiters will never ask job seekers and candidates for payment or banking information during the recruitment process, for any reason. Please remain vigilant of third parties that may try to impersonate NTT DATA recruiters, either in writing or by phone, in an attempt to deceptively obtain personal data or money from you. All email communications from an NTT DATA recruiter will be associated with an @ email address. NTT DATA will not use any non-NTT DATA or personal email domains (Gmail, Yahoo, etc.) or personal communication channels (WhatsApp, Facebook, etc.) at any time during the recruitment process. If you suspect any fraudulent activity, please.</p>
<h3>Seniority level</h3>
<ul>
<li>Mid-Senior level</li>
</ul>
<h3>Employment type</h3>
<ul>
<li>Full-time</li>
</ul>
<h3>Job function</h3>
<ul>
<li>Engineering and Information Technology</li>
</ul>
<h3>Industries</h3>
<ul>
<li>IT Services and IT Consulting</li>
</ul>