Responsibilities

A Data Architect is an IT expert who enables data-driven decision making by collecting, transforming, and publishing data. At NTT Data, a Data Architect designs, builds, operationalizes, secures, and monitors data processing systems, with particular emphasis on security and compliance, scalability and efficiency, reliability and fidelity, and flexibility and portability. The main mission of a Data Architect is to turn raw data into information, creating insight and business value.

- Build large-scale batch and real-time data pipelines with data processing frameworks on the Google Cloud Platform (GCP).
- Use an analytical, data-driven approach to develop a deep understanding of fast-changing business needs.
- Collaborate with the team to evaluate business needs and priorities, liaise with key business partners, and address team needs related to data systems and management.
- Participate in project planning by identifying milestones, deliverables, and resource requirements; track activities and task execution.

Requirements

Required Skills

- Bachelor's degree in Computer Science, Computer Engineering, or a related field.
- 5-10 years of experience in a data engineering role.
- Expertise in software engineering using Scala, Java, or Python.
- Advanced SQL skills, preferably with BigQuery.
- Good knowledge of Google managed services such as Cloud Storage, BigQuery, Dataflow, Dataproc, and Data Fusion.
- Experience using workflow management tools.
- Good understanding of GCP architecture for batch and streaming data.
- Strong knowledge of data technologies and data modeling.
- Expertise in building modern, cloud-native data pipelines and operations following an ELT philosophy.
- Experience with data migration and data warehousing.
- Strong understanding of how to organize, normalize, and store complex data to support both ETL processes and end-user access.
- Passion for designing ingestion and transformation processes that combine data from multiple sources into cohesive data assets.
- Good understanding of developer tools, CI/CD, etc.
- Excellent communication skills; empathy with end users and internal customers.

Nice-to-have

- Experience with Big Data ecosystems such as Hadoop, Hive, HDFS, and HBase.
- Experience with Agile methodologies and DevOps principles.