Job Description
Responsibilities

A Data Architect is an IT expert who enables data-driven decision making by collecting, transforming, and publishing data. At NTT Data, a Data Architect should be able to design, build, operationalize, secure, and monitor data processing systems with an emphasis on security, compliance, scalability, efficiency, reliability, fidelity, flexibility, and portability. The main mission of a Data Architect is to turn raw data into information, creating insights and business value.

- Build large-scale batch and real-time data pipelines using data processing frameworks on the GCP cloud platform (a minimal sketch appears at the end of this posting).
- Apply an analytical, data-driven approach to understand rapidly changing business needs.
- Collaborate with the team to evaluate business needs, liaise with key business partners, and address team requirements related to data systems and management.
- Participate in project planning; identify milestones, deliverables, and resource requirements; and track activities and task execution.

Required Skills

- Bachelor’s degree in Computer Science, Computer Engineering, or a related field.
- 5-10 years of experience in a data engineering role.
- Expertise in software engineering with Scala, Java, or Python.
- Advanced SQL skills, preferably with BigQuery.
- Good knowledge of Google managed services such as Cloud Storage, BigQuery, Dataflow, Dataproc, and Data Fusion.
- Experience with workflow management tools.
- Solid understanding of GCP architecture for batch and streaming data processing.
- Strong knowledge of data technologies and data modeling.
- Expertise in building modern, cloud-native data pipelines and operations following an ELT approach.
- Experience with data migration and data warehousing.
- Ability to organize, normalize, and store complex data effectively, supporting ETL processes and end-user needs.
- Passion for designing the ingestion and transformation of data from multiple sources to create cohesive data assets.
- Good understanding of developer tools, CI/CD pipelines, etc.
- Excellent communication skills and empathy towards end users and internal customers.

Nice-to-have

- Experience with Big Data ecosystem tools such as Hadoop, Hive, HDFS, and HBase.
- Experience with Agile methodologies and DevOps principles.
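For context on the kind of work described above, here is a minimal sketch of a streaming pipeline of the sort Dataflow runs: an Apache Beam job in Python that reads JSON events from Pub/Sub and appends them to BigQuery. This is purely illustrative, not NTT Data's actual stack; the project, topic, table, and schema names are hypothetical placeholders.

    # Minimal illustrative sketch: streaming events from Pub/Sub into BigQuery
    # with Apache Beam (the framework executed by GCP Dataflow).
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def run():
        # streaming=True marks this as an unbounded (real-time) pipeline.
        options = PipelineOptions(streaming=True)

        with beam.Pipeline(options=options) as p:
            (
                p
                # Hypothetical topic path; substitute a real project/topic.
                | "ReadEvents" >> beam.io.ReadFromPubSub(
                    topic="projects/my-project/topics/events"
                )
                # Pub/Sub delivers raw bytes; decode each message as JSON.
                | "ParseJson" >> beam.Map(json.loads)
                # Keep only well-formed records; a production pipeline would
                # route failures to a dead-letter sink rather than drop them.
                | "FilterValid" >> beam.Filter(lambda e: "user_id" in e)
                # Hypothetical table and schema; WRITE_APPEND adds new rows.
                | "WriteToBQ" >> beam.io.WriteToBigQuery(
                    "my-project:analytics.events",
                    schema="user_id:STRING,event_type:STRING,ts:TIMESTAMP",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                )
            )


    if __name__ == "__main__":
        run()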