Data Engineer (Python, Kafka, Hadoop/HDFS/Hive, Snowflake, Apache Iceberg) Job at NTT DATA, Inc., Indiana

  • NTT DATA, Inc.
  • Indiana

Job Description

Responsibilities:

The Engineer will be part of the datastore-migration Factory team, which is responsible for the end-to-end datastore migration from the on-prem Data Lake to an AWS-hosted Lakehouse. This is a high-visibility, crucial project for Goldman Sachs.
The Engineer's responsibilities include:

Pipeline Migration
Logic & Scheduling: Refactoring and migrating extraction logic and job scheduling from legacy frameworks to the new Lakehouse environment.
Data Transfer: Executing the physical migration of underlying datasets while ensuring data integrity.
Stakeholder Engagement: Acting as a technical liaison to internal clients, facilitating "handoff and sign-off" conversations with data owners to ensure migrated assets meet business requirements.

Consumption Pattern Migration
Code Conversion: Translating and optimizing legacy SQL and Spark-based consumption patterns (raw and modeled) for compatibility with Snowflake and Iceberg.
Usage Analysis: Understanding usage patterns to deliver the required data products.
Stakeholder Engagement: Acting as a technical liaison to internal clients, facilitating "handoff and sign-off" conversations with data owners to ensure migrated assets meet business requirements.
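
To make the code-conversion work concrete, here is one illustrative dialect gap of the kind this role covers: exploding an array column in legacy Hive SQL versus its Snowflake equivalent. This is only a sketch of the pattern; the table and column names are hypothetical.

```python
# Legacy Hive/Spark SQL: array explosion via LATERAL VIEW.
# (Table "orders" and column "items" are hypothetical examples.)
hive_sql = """
SELECT o.order_id, item
FROM orders o
LATERAL VIEW explode(o.items) exploded AS item
"""

# Snowflake has no LATERAL VIEW explode; the equivalent construct
# is LATERAL FLATTEN, which yields a VALUE column per array element.
snowflake_sql = """
SELECT o.order_id, f.value AS item
FROM orders o,
LATERAL FLATTEN(input => o.items) f
"""
```

Migration tooling typically catalogs such construct-to-construct mappings so converted queries can be reviewed and validated systematically rather than rewritten ad hoc.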

Data Reconciliation & Quality
A rigorous approach to data validation is required. Candidates must work with reconciliation frameworks to build confidence that migrated data is functionally equivalent to that already used within production flows.
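
As a minimal sketch of what such a reconciliation check might look like, the following compares two extracts by row count and an order-insensitive checksum. It is a toy in pure Python; a production framework would run these checks against the actual source and target stores, and all names here are hypothetical.

```python
import hashlib

def fingerprint(rows):
    """Order-insensitive fingerprint of a set of rows (list of dicts).

    Each row is canonicalized (sorted key/value pairs), the canonical
    forms are sorted, and the whole set is hashed, so row ordering
    differences between source and target do not cause false alarms.
    """
    canonical = sorted(repr(sorted(r.items())) for r in rows)
    return hashlib.sha256("\n".join(canonical).encode()).hexdigest()

def reconcile(source_rows, target_rows):
    """Return a small reconciliation report for two extracts."""
    return {
        "row_count_match": len(source_rows) == len(target_rows),
        "checksum_match": fingerprint(source_rows) == fingerprint(target_rows),
    }

# Hypothetical example: on-prem extract vs. migrated Lakehouse table.
on_prem = [{"id": 1, "amt": 10.0}, {"id": 2, "amt": 20.5}]
lakehouse = [{"id": 2, "amt": 20.5}, {"id": 1, "amt": 10.0}]  # same data, different order
print(reconcile(on_prem, lakehouse))  # both checks pass
```

Real reconciliation frameworks extend this idea with per-column aggregates, sampling, and tolerance rules for floating-point and timestamp drift.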

The Engineer will also need to work with our other internal data management platforms, and must have an aptitude for learning new workflows and language constructs as necessary.

Technical Skills:

Basic Qualifications
Education: Bachelor’s or Master’s degree in Computer Science, Applied Mathematics, Engineering, or a related quantitative field.
Experience: Minimum of 3-5 years of professional "hands-on-keyboard" coding experience in a collaborative, team-based environment; ability to troubleshoot SQL and basic scripting experience.
Languages: Professional proficiency in Python or Java.
Methodology: Deep familiarity with the full Software Development Life Cycle (SDLC), CI/CD best practices, and Kubernetes (K8s) deployment experience.

Core Data Engineering Competencies:  Candidates must demonstrate a sophisticated understanding of the following modeling concepts to ensure data correctness during reconciliation:
Temporal Data Modeling: Managing state changes over time (e.g., SCD Type 2).
Schema Management: Expertise in schema evolution (e.g., in Apache Iceberg) and enforcement strategies.
Performance Optimization: Advanced knowledge of data partitioning and clustering.
Architectural Theory: Balancing Normalization vs. Denormalization and the strategic use of Natural vs. Surrogate Keys.
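
As an illustration of the SCD Type 2 concept listed above, here is a minimal sketch of the merge logic in Python. The dimension structure and column names are hypothetical; in practice this would be implemented as a MERGE in Snowflake or Spark SQL.

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # open-ended effective_to marking the current row

def apply_scd2(dimension, incoming, as_of):
    """Apply SCD Type 2: close out changed rows and append new versions.

    dimension: list of dicts with a natural key 'id', a tracked
    attribute 'city', and validity columns 'effective_from'/'effective_to'.
    incoming:  list of dicts with 'id' and 'city' as of the load date.
    """
    # Index the currently-open version of each entity by its natural key.
    current = {r["id"]: r for r in dimension if r["effective_to"] == HIGH_DATE}
    for row in incoming:
        live = current.get(row["id"])
        if live is None or live["city"] != row["city"]:
            if live is not None:
                live["effective_to"] = as_of  # expire the superseded version
            dimension.append({"id": row["id"], "city": row["city"],
                              "effective_from": as_of, "effective_to": HIGH_DATE})
    return dimension

dim = [{"id": 1, "city": "NYC",
        "effective_from": date(2020, 1, 1), "effective_to": HIGH_DATE}]
apply_scd2(dim, [{"id": 1, "city": "Chicago"}], date(2024, 6, 1))
# dim now holds two versions: the NYC row closed at 2024-06-01,
# and a current Chicago row open-ended at HIGH_DATE.
```

Validating this kind of history preservation is exactly where reconciliation between the legacy and migrated stores must be temporal-aware, not just a snapshot comparison.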

Technical Stack Requirements: 
While candidates are not expected to be experts in every tool, the collective team must cover the following technologies:

Extraction & Logic: Kafka, ANSI SQL, FTP, Apache Spark

Data Formats: JSON, Avro, Parquet

Platforms: Hadoop (HDFS/Hive), Snowflake, Apache Iceberg, Sybase IQ

Core Competencies:

Demonstrates strong integrity and consistently models good conduct and ethical decision-making.
Acts as a trusted team player who collaborates effectively across multiple teams and functions.
Communicates with clarity and confidence - concise written updates, structured verbal briefings, and proactive stakeholder management.
Works effectively with global teams across time zones and cultures; builds alignment and resolves issues constructively.
Delivery-focused with a strong sense of ownership; drives work to closure and meets commitments.
Brings high energy and urgency to achieve targets while maintaining quality and professionalism.
Shows intellectual curiosity; asks thoughtful questions, surfaces risks early, and seeks feedback to continuously improve.
