Lead Data Engineer
Bain & Company
Latina, 30-06-2025

Summary

Location: Latina
Business division:
Contract type:
Publication date: 30-06-2025
Job Description

We are proud to be consistently recognized as one of the world’s best places to work, a champion of diversity and a model of social responsibility. We are currently the #1-ranked consulting firm on Glassdoor’s Best Places to Work list and have maintained a spot in the top four on Glassdoor’s list for the last 13 years. We believe that diversity, inclusion and collaboration are key to building extraordinary teams. We hire people with exceptional talents, abilities, and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally.
WHO YOU’LL WORK WITH
As a member of Bain's Advanced Analytics Group (AAG), you’ll join a talented team of diverse and inclusive analytic and engineering professionals dedicated to solving complex challenges for our clients. We work closely with generalist consultants and clients to develop data-driven strategies and innovative solutions.
WHERE YOU’LL FIT WITHIN THE TEAM
As a Lead Data Engineer, you will leverage your experience to implement and refine technical solutions across various industries. You will engage in the entire engineering lifecycle, focusing on designing, developing, optimizing, and deploying sophisticated data engineering solutions and infrastructure at a production scale suitable for the world’s largest companies.
WHAT YOU’LL DO:

Develop data and software solutions to address large-scale enterprise challenges for Bain's clients, serving as the data engineer and expert within a cross-functional team.
Develop and maintain long-lasting products that support internal or client needs.
Collaborate closely with and influence general consulting teams to identify analytics solutions for client business problems and to execute those solutions.
Collaborate with data engineering leaders to develop and advocate for modern data engineering concepts to both technical audiences and business stakeholders.
Enable data and technology for data science, analytics, and other application use cases via data engineering.
Perform transformations at scale including cleaning, enriching, de-duping, joining, and correlating structured, semi-structured, or unstructured data.
Define and implement new deployment techniques, tooling, and infrastructure automation within Bain, including full software development lifecycle activities such as designing, writing documentation, and conducting code reviews.
Participate in infrastructure engineering for the data ecosystem, including development, testing, deployment, and release.
Provide technical guidance to external clients and internal stakeholders in Bain.
Contribute to industry-leading innovations that generate significant impact for clients.
Stay current with emerging trends and technologies in cloud computing, data analysis, and software engineering, proactively seeking opportunities to enhance the analytics platform.
Travel is required (30%).

ABOUT YOU:

Proven experience in end-to-end data and software engineering within product engineering or professional services organizations, including project setup, testing, and dependency and build management.
Master’s degree in Computer Science, Engineering, or a related technical field.
Minimum of 5 years of experience.
At least 3 years at Senior or Staff level, or equivalent.

Technical Skills and Knowledge:

Working knowledge (3+ years) of programming languages such as Python, Scala, C/C++, Java, C#, or Go.
Experience deploying serverless data pipelines via containerization and Terraform orchestration.
Experience in data ingestion using modern ETL frameworks like Airflow, Beam, Luigi, Spark, Nifi, or similar.
Experience (3+ years) with SQL or NoSQL databases such as PostgreSQL, SQL Server, Oracle, MySQL, Redis, MongoDB, Elasticsearch, Hive, HBase, Teradata, Cassandra, Redshift, Snowflake.
Experience with cloud platforms (AWS, Azure, GCP) or Kubernetes via Terraform, with understanding of failover, high availability, and scalability.
Experience with DevOps practices, CI/CD pipelines, GitHub Actions, and version control.
Experience optimizing schema and performance of SQL and ETL pipelines in data lake and warehouse environments.
Strong fundamentals in computer science including data structures, algorithms, automated testing, object-oriented programming, and understanding of computer architecture implications on software performance.
Experience working within agile methodologies.

Interpersonal Skills:

Strong communication skills, with the ability to explain technical solutions to colleagues and clients from various disciplines.
Curiosity, proactivity, and critical thinking.
Ability to collaborate effectively across different levels and regions.
