3,174 ETL Engineer jobs in India

Senior ETL Engineer

Pune, Maharashtra Confidential

Posted today

Job Description

We are seeking a highly skilled and experienced Senior ETL Developer to join our dynamic team. This role is crucial in ensuring the integrity, usability, and performance of our data solutions. The ideal candidate will have extensive experience with ETL processes, database design, and Informatica PowerCenter/IICS.

Key Responsibilities:

ETL Development and Maintenance:

  • Engage with stakeholders to understand business objectives and design effective ETL processes aligned with organizational goals.
  • Maintain existing ETL processes, ensuring data accuracy and adequate process performance.

Data Warehouse Design & Development:

  • Develop and maintain essential database objects, including tables, views, and stored procedures, to support data analysis and reporting functions.
  • Proficiently utilize SQL queries to retrieve and manipulate data as required.

Data Quality and Analysis:

  • Analyze datasets to identify gaps, inconsistencies, and other quality issues, and devise strategic solutions to enhance data quality.
  • Implement data quality improvement strategies to ensure the accuracy and reliability of data.

Performance Optimization:

  • Investigate and resolve database and query performance issues to ensure optimal system functionality.
  • Continuously monitor system performance and make recommendations for improvements.

Business Collaboration:

  • Collaborate with business users to gather comprehensive data and reporting requirements.
  • Facilitate user-acceptance testing in conjunction with the business, resolving any issues that arise.

Qualifications:

  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • Minimum of 5 years of hands-on experience with Informatica PowerCenter and Informatica Intelligent Cloud Services (IICS).
  • Proven expertise in designing, implementing, and managing ETL processes and data warehouses.
  • Proficiency with SQL and experience in optimizing queries for performance.
  • Strong analytical skills with the ability to diagnose data issues and recommend comprehensive solutions.
  • Excellent communication and interpersonal skills to effectively collaborate with cross-functional teams.
  • Detail-oriented with strong problem-solving capabilities.

Skills Required
SQL, Data Modeling, ETL Tools, Python, Data Warehousing, Apache Spark, Cloud Technologies, Data Integration
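The data-quality analysis duties above (finding gaps, inconsistencies, and duplicates) can be sketched as a small SQL profiling check. This is only an illustration: the table, columns, and sample rows below are invented, not from the posting.

```python
import sqlite3

# In-memory database standing in for a warehouse staging table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "acme", 120.0), (2, None, 75.5), (2, None, 75.5), (3, "globex", None)],
)

def profile(conn, table, column):
    """Count NULLs in `column` and fully duplicated rows in `table` (sketch:
    the GROUP BY column list is hardcoded for the toy `orders` schema)."""
    nulls = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()[0]
    dupes = conn.execute(
        f"SELECT COALESCE(SUM(n - 1), 0) FROM "
        f"(SELECT COUNT(*) AS n FROM {table} GROUP BY order_id, customer, amount)"
    ).fetchone()[0]
    return nulls, dupes

print(profile(conn, "orders", "customer"))  # (2, 1): two NULL customers, one duplicate row
```

The same counts would feed a remediation strategy (e.g. rejecting or quarantining the duplicate rows before load).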

Data ETL Engineer

Pune, Maharashtra IDT Corporation

Posted today

Job Description

IDT () is a communications and financial services company founded in 1990 and headquartered in New Jersey, US. Today it is an industry leader in prepaid communication and payment services and one of the world's largest international voice carriers. We are listed on the NYSE, employ over 1,500 people across 20+ countries, and have revenues in excess of $ billion.

We are looking for a Mid-level Business Intelligence Engineer to join our global team. If you are highly intelligent, motivated, ambitious, and ready to learn and make a direct impact, this is your opportunity! The individual in this role will perform data analysis, ELT/ETL design, and support functions to deliver on strategic initiatives to meet organizational goals across many lines of business.

* The interview process will be conducted in English.

Responsibilities:

  • Develop, document, and test ELT/ETL solutions using industry standard tools (Snowflake, Denodo Data Virtualization, Looker).
  • Recommend process improvements to increase efficiency and reliability in ELT/ETL development.
  • Extract data from multiple sources, integrate disparate data into a common data model, and integrate data into a target database, application, or file using efficient ELT/ ETL processes.
  • Collaborate with Quality Assurance resources to debug ELT/ETL development and ensure the timely delivery of products.
  • Should be willing to explore and learn new technologies and concepts to provide the right kind of solution.
  • Target and result oriented with strong end user focus.
  • Effective oral and written communication skills with BI team and user community.
Requirements:

  • 5+ years of experience in ETL/ELT design and development, integrating data from heterogeneous OLTP systems and API solutions, and building scalable data warehouse solutions to support business intelligence and analytics.
  • Demonstrated experience in utilizing python for data engineering tasks, including transformation, advanced data manipulation, and large-scale data processing.
  • Experience in data analysis, root cause analysis and proven problem solving and analytical thinking capabilities.
  • Experience designing complex data pipelines extracting data from RDBMS, JSON, API and Flat file sources.
  • Demonstrated expertise in SQL and PLSQL programming, with advanced mastery in Business Intelligence and data warehouse methodologies, along with hands-on experience in one or more relational database systems and cloud-based database services such as Oracle, MySQL, Amazon RDS, Snowflake, Amazon Redshift, etc.
  • Proven ability to analyze and optimize poorly performing queries and ETL/ELT mappings, providing actionable recommendations for performance tuning.
  • Understanding of software engineering principles and skills working on Unix/Linux/Windows Operating systems, and experience with Agile methodologies.
  • Proficiency in version control systems, with experience in managing code repositories, branching, merging, and collaborating within a distributed development environment.
  • Excellent English communication skills.
  • Interest in business operations and comprehensive understanding of how robust BI systems drive corporate profitability by enabling data-driven decision-making and strategic insights. 
Pluses:

  • Experience in developing ETL/ELT processes within Snowflake and implementing complex data transformations using built-in functions and SQL capabilities.
  • Experience using Pentaho Data Integration (Kettle) / Ab Initio ETL tools for designing, developing, and optimizing data integration workflows. 
  • Experience designing and implementing cloud-based ETL solutions using Azure Data Factory, DBT, AWS Glue, Lambda and open-source tools.
  • Experience with reporting/visualization tools (e.g., Looker) and job scheduler software.
  • Experience in Telecom, eCommerce, International Mobile Top-up.
  • Education: BS/MS in computer science, Information Systems or a related technical field or equivalent industry expertise.
  • Preferred Certification: AWS Solution Architect, AWS Cloud Data Engineer, Snowflake SnowPro Core.
  • Please attach CV in English. The interview process will be conducted in English. Only accepting applicants from India.
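The multi-source pipeline work described above (extracting from RDBMS, JSON, API, and flat-file sources into a common data model) can be sketched minimally. All field and record names here are invented for illustration:

```python
import csv
import io
import json

# Two toy feeds standing in for an API response and a flat file.
api_payload = json.loads('[{"id": 1, "amt": "12.50"}, {"id": 2, "amt": "7.25"}]')
flat_file = io.StringIO("id,amount\n3,20.00\n4,5.75\n")

def normalize(records):
    """Map heterogeneous source fields onto one common record shape."""
    out = []
    for r in records:
        out.append({
            "source_id": int(r["id"]),
            # Different sources name the same measure differently.
            "amount": float(r.get("amt", r.get("amount", 0))),
        })
    return out

rows = normalize(api_payload) + normalize(csv.DictReader(flat_file))
print(len(rows), sum(r["amount"] for r in rows))  # 4 45.5
```

In a real ELT job the normalized rows would be staged into a warehouse table (e.g. in Snowflake) rather than kept in memory.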

    Senior ETL Engineer

    Pune, Maharashtra MicroStrategy India

    Posted today

    Job Description

    Company Description

    Strategy (Nasdaq: MSTR) is at the forefront of transforming organizations into intelligent enterprises through data-driven innovation. We don't just follow trends; we set them and drive change. As a market leader in enterprise analytics and mobility software, we've pioneered the BI and analytics space, empowering people to make better decisions and revolutionizing how businesses operate.

    But that's not all. Strategy is also leading a groundbreaking shift in how companies approach their treasury reserve strategy, boldly adopting Bitcoin as a key asset. This visionary move is reshaping the financial landscape and solidifying our position as a forward-thinking, innovative force in the market. Four years after adopting the Bitcoin Standard, Strategy's stock has outperformed every company in the S&P 500.

    Our people are the core of our success. At Strategy, you'll join a team of smart, creative minds working on dynamic projects with cutting-edge technologies. We thrive on curiosity, innovation, and a relentless pursuit of excellence.

    Our corporate values—bold, agile, engaged, impactful, and united—are the foundation of our culture. As we lead the charge into the new era of AI and financial innovation, we foster an environment where every employee's contributions are recognized and valued.

    Join us and be part of an organization that lives and breathes innovation every day. At Strategy, you're not just another employee; you're a crucial part of a mission to push the boundaries of analytics and redefine financial investment.

    Position Overview:

    Job Location: Working full time from the Strategy Pune office.
    We are seeking a highly skilled and experienced Senior ETL Developer to join our dynamic team. This role is crucial in ensuring the integrity, usability, and performance of our data solutions. The ideal candidate will have extensive experience with ETL processes, database design, and Informatica PowerCenter/IICS.

    Key Responsibilities:

    ETL Development and Maintenance:

    • Engage with stakeholders to understand business objectives and design effective ETL processes aligned with organizational goals.

    • Maintain existing ETL processes, ensuring data accuracy and adequate process performance.

    Data Warehouse Design & Development:

    • Develop and maintain essential database objects, including tables, views, and stored procedures to support data analysis and reporting functions.

    • Proficiently utilize SQL queries to retrieve and manipulate data as required.

    Data Quality and Analysis:

    • Analyze datasets to identify gaps, inconsistencies, and other quality issues, and devise strategic solutions to enhance data quality.

    • Implement data quality improvement strategies to ensure the accuracy and reliability of data.

    Performance Optimization:

    • Investigate and resolve database and query performance issues to ensure optimal system functionality.

    • Continuously monitor system performance and make recommendations for improvements.

    Business Collaboration:

    • Collaborate with business users to gather comprehensive data and reporting requirements.

    • Facilitate user-acceptance testing in conjunction with business, resolving any issues that arise.

    Qualifications:

    • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

    • Minimum of 5 years of hands-on experience with Informatica PowerCenter and Informatica Intelligent Cloud Services (IICS).

    • Proven expertise in designing, implementing, and managing ETL processes and data warehouses.

    • Proficiency with SQL and experience in optimizing queries for performance.

    • Strong analytical skills with the ability to diagnose data issues and recommend comprehensive solutions.

    • Excellent communication and interpersonal skills to effectively collaborate with cross-functional teams.

    • Detail-oriented with strong problem-solving capabilities.

    Additional Information

    The recruitment process includes online assessments as a first step (English, logic, business). We send them via e-mail; please also check your SPAM folder.



    Platform/ETL Engineer

    Bengaluru, Karnataka Confidential

    Posted today

    Job Description

    Senior Consultant - Platform Engineer

    Responsibilities

    • Design, build, and optimize platform solutions in Azure Databricks to support scalable, high-performance, and secure data engineering workflows.

    • Implement and maintain platform automation for infrastructure, monitoring, and deployment pipelines using modern DevOps practices.

    • Align with best practices for platform architecture, governance, and security within Databricks.

    • Collaborate with data engineers, modellers, and testers to ensure the platform meets technical and business requirements.

    • Proactively monitor, troubleshoot, and resolve issues to maintain platform stability and performance.

    Role Requirements

    • Strong platform engineering experience with Azure Databricks and related Azure services.

    • Familiarity with DevOps principles and CI/CD pipeline tools (e.g., Azure DevOps, Jenkins, GitHub Actions).

    • Background in Azure system architecture, networking, Windows and Linux configuration, and DevOps or SRE roles.

    • Ability to design scalable solutions using Databricks features such as Delta Lake, Unity Catalog, and cluster policies.

    • Write and deploy Infrastructure as Code (IaC)/Configuration as Code (CaC) (Azure Resource Manager, Puppet, Ansible, Terraform, or similar).

    • Strong problem-solving skills, with the ability to think strategically and avoid tactical or short-term solutions when possible.

    • Experience working in Agile/Scrum teams.

    Experience/Skillset

    • 8+ years of experience in platform engineering or a related field.

    • Strong knowledge of Databricks, including Spark optimization, cluster management, and governance frameworks.

    • Good knowledge of Azure DevOps based toolchains.

    • Proficiency in Python, SQL, and PySpark, with the ability to implement robust and efficient solutions in Databricks.

    • Expertise in Azure services such as Azure Data Factory, Azure Storage, and Azure Key Vault.

    • Strong knowledge of SDLC and Agile methodologies.

    • Excellent verbal and written communication skills.

    • Proven ability to manage multiple projects and priorities effectively while delivering to tight deadlines.


    Skills Required
    Machine Learning, LLM
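As a loose illustration of the platform-governance responsibilities above (cluster policies, secure configuration), the sketch below validates a cluster configuration against an invented policy. The policy fields and limits are hypothetical and do not follow the actual Databricks cluster-policy schema:

```python
import json

# Invented governance policy: required tags and a worker-count ceiling.
POLICY = {"required_tags": {"cost_center", "environment"}, "max_workers": 8}

def validate_cluster(config):
    """Return a list of policy violations for a cluster config dict."""
    errors = []
    missing = POLICY["required_tags"] - set(config.get("custom_tags", {}))
    if missing:
        errors.append("missing tags: " + ", ".join(sorted(missing)))
    if config.get("num_workers", 0) > POLICY["max_workers"]:
        errors.append("num_workers exceeds policy maximum")
    return errors

cfg = json.loads('{"num_workers": 12, "custom_tags": {"environment": "dev"}}')
print(validate_cluster(cfg))
```

A check like this would typically run in the CI/CD pipeline before any infrastructure change is deployed.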

    Cloud & ETL Engineer Lead Sr

    Mumbai, Maharashtra Anicalls (Pty) Ltd

    Posted today

    Job Description

    • 9+ years of experience in IT, data warehousing, and ETL (Extract, Transform, and Load).
    • Experience in Informatica (PowerCenter/PowerMart/PowerExchange).
    • Experience in SQL Server with CUBE.
    • Extensive experience with Informatica ETL transformations, including Lookup, Filter, Expression, Router, Normalizer, Joiner, Update, Rank, Aggregator, Stored Procedure, Sorter, and Sequence Generator, and in creating complex mappings.
    • Experience implementing the entire project life cycle.
    • Experience in Unix shell scripts for automation of the ETL process.
    • Able to provide the end-to-end architecture of the ETL process for loading the staging/landing zone and the audit control process.
    • Experience in creating detailed technical design documents, including source-to-target mapping docs.
    • Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, joins, and hash indexes in Oracle/Teradata and PostgreSQL databases.
    • Expertise in data modeling techniques such as dimensional data modeling, star schema modeling, snowflake modeling, and fact and dimension tables.
    • Experienced in mapping performance optimization in Informatica.
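The dimensional-modeling concepts listed above (fact and dimension tables in a star schema) can be illustrated with a toy in-memory join; the schema and data are invented:

```python
# Dimension table: product attributes keyed by surrogate key.
dim_product = {
    1: {"name": "widget", "category": "tools"},
    2: {"name": "gadget", "category": "toys"},
}
# Fact table: one row per sale, carrying the dimension key and a measure.
fact_sales = [
    {"product_id": 1, "qty": 3},
    {"product_id": 2, "qty": 5},
    {"product_id": 1, "qty": 2},
]

def sales_by_category(facts, dim):
    """Join the fact table to the dimension and aggregate the measure."""
    totals = {}
    for f in facts:
        cat = dim[f["product_id"]]["category"]
        totals[cat] = totals.get(cat, 0) + f["qty"]
    return totals

print(sales_by_category(fact_sales, dim_product))  # {'tools': 5, 'toys': 5}
```

In a warehouse this is simply a `JOIN ... GROUP BY` across the fact and dimension tables; the snowflake variant would further normalize the dimension.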

    Senior ETL Engineer/Consultant Specialist

    Pune, Maharashtra HSBC

    Posted today

    Job Description

    Some careers shine brighter than others.

    If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

    HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

    We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.

    In this role, you will:

  • Design and Develop ETL Processes: Lead the design and implementation of ETL processes using a range of batch/streaming tools to extract, transform, and load data from various sources into GCP. Collaborate with stakeholders to gather requirements and ensure that ETL solutions meet business needs.
  • Data Pipeline Optimization: Optimize data pipelines for performance, scalability, and reliability, ensuring efficient data processing workflows. Monitor and troubleshoot ETL processes, proactively addressing issues and bottlenecks.
  • Data Integration and Management: Integrate data from diverse sources, including databases, APIs, and flat files, ensuring data quality and consistency. Manage and maintain data storage solutions in GCP (e.g., BigQuery, Cloud Storage) to support analytics and reporting.
  • GCP Dataflow Development: Write Apache Beam-based Dataflow jobs for data extraction, transformation, and analysis, ensuring optimal performance and accuracy. Collaborate with data analysts and data scientists to prepare data for analysis and reporting.
  • Automation and Monitoring: Implement automation for ETL workflows using tools like Apache Airflow or Cloud Composer, enhancing efficiency and reducing manual intervention. Set up monitoring and alerting mechanisms to ensure the health of data pipelines and compliance with SLAs.
  • Data Governance and Security: Apply best practices for data governance, ensuring compliance with industry regulations (e.g., GDPR, HIPAA) and internal policies. Collaborate with security teams to implement data protection measures and address vulnerabilities.
  • Documentation and Knowledge Sharing: Document ETL processes, data models, and architecture to facilitate knowledge sharing and onboarding of new team members. Conduct training sessions and workshops to share expertise and promote best practices within the team.

    To be successful in this role, you should meet the following requirements:

  • Experience: Minimum of 5 years of industry experience in data engineering or ETL development, with a strong focus on DataStage and GCP. Proven experience in designing and managing ETL solutions, including data modeling, data warehousing, and SQL development.
  • Technical Skills: Strong knowledge of GCP services (e.g., BigQuery, Dataflow, Cloud Storage, Pub/Sub) and their application in data engineering. Experience with cloud-based solutions, especially in GCP; a cloud-certified candidate is preferred. Experience and knowledge of big data processing in batch and streaming modes; proficient in big data ecosystems such as Hadoop, HBase, Hive, MapReduce, Kafka, Flink, and Spark. Familiarity with Java and Python for data manipulation on cloud/big data platforms.
  • Analytical Skills: Strong problem-solving skills with a keen attention to detail. Ability to analyze complex data sets and derive meaningful insights.
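As a rough sketch of the batch/streaming processing mentioned above, the snippet below computes fixed-window event counts in plain Python, standing in for what a Beam/Dataflow job would do with event-time windows. The window size and events are illustrative:

```python
from collections import defaultdict

WINDOW = 60  # fixed window size in seconds (illustrative)

def window_counts(events):
    """Count (timestamp_seconds, key) events per fixed event-time window."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts // WINDOW * WINDOW  # floor to the window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(5, "click"), (42, "click"), (61, "view"), (119, "click")]
print(window_counts(events))
# {(0, 'click'): 2, (60, 'view'): 1, (60, 'click'): 1}
```

A real streaming job would additionally handle late data with watermarks and triggers, which Beam provides but this batch sketch does not.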

    ETL Automation Engineer

    Noida, Uttar Pradesh Strategic Talent Partner

    Posted 3 days ago

    Job Description

    As a QA Automation Engineer specializing in Data Warehousing, you will play a critical role in ensuring that our data solutions are of the highest quality. You will work closely with data engineers and analysts to develop, implement, and maintain automated testing frameworks for data validation, ETL processes, data quality, and integration. Your work will ensure that data is accurate, consistent, and performs optimally across our data warehouse systems.

    Responsibilities

    • Develop and Implement Automation Frameworks: Design, build, and maintain scalable test automation frameworks tailored for data warehousing environments.
    • Test Strategy and Execution: Define and execute automated test strategies for ETL processes, data pipelines, and database integration across a variety of data sources.
    • Data Validation: Implement automated tests to validate data consistency, accuracy, completeness, and transformation logic.
    • Performance Testing: Ensure that the data warehouse systems meet performance benchmarks through automation tools and load testing strategies.
    • Collaborate with Teams: Work closely with data engineers, software developers, and data analysts to understand business requirements and design tests accordingly.
    • Continuous Integration: Integrate automated tests into the CI/CD pipelines, ensuring that testing is part of the deployment process.
    • Defect Tracking and Reporting: Use defect-tracking tools (e.g., JIRA) to log and track issues found during automated testing, ensuring that defects are resolved in a timely manner.
    • Test Data Management: Develop strategies for handling large volumes of test data while maintaining data security and privacy.
    • Tool and Technology Evaluation: Stay current with emerging trends in automation testing for data warehousing and recommend tools, frameworks, and best practices.

    Job Qualifications:

    Requirements and skills

    • At least 4+ years of experience.

    • Solid understanding of data warehousing concepts (ETL, OLAP, data marts, data vault, star/snowflake schemas, etc.).

    • Proven experience in building and maintaining automation frameworks using tools like Python, Java, or similar, with a focus on database and ETL testing.

    • Strong knowledge of SQL for writing complex queries to validate data, test data pipelines, and check transformations.

    • Experience with ETL tools (e.g., Matillion, Qlik Replicate) and their testing processes.

    • Performance testing.

    • Experience with version control systems like Git.

    • Strong analytical and problem-solving skills, with the ability to troubleshoot complex data issues.

    • Strong communication and collaboration skills.

    • Attention to detail and a passion for delivering high-quality solutions.

    • Ability to work in a fast-paced environment and manage multiple priorities.

    • Enthusiastic about learning new technologies and frameworks.

    Experience with the following tools and technologies is desired.

    • QLIK Replicate
    • Matillion ETL
    • Snowflake
    • Data Vault Warehouse Design
    • Power BI
    • Azure Cloud – Including Logic App, Azure Functions, ADF
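The data-validation responsibilities above (consistency, accuracy, and completeness checks) often reduce to source-versus-target reconciliation. A minimal sketch, with invented rows standing in for query results from the two systems:

```python
# Rows as (key, amount): stand-ins for extracts queried from the source
# system and the warehouse target (the data here is invented).
source_rows = [(1, 10.0), (2, 20.5), (3, 30.0)]
target_rows = [(1, 10.0), (2, 20.5), (3, 30.0)]

def reconcile(src, tgt):
    """Row-count, key-set, and measure-sum checks between two extracts."""
    checks = {
        "row_count": len(src) == len(tgt),
        "key_set": {r[0] for r in src} == {r[0] for r in tgt},
        "amount_sum": abs(sum(r[1] for r in src) - sum(r[1] for r in tgt)) < 1e-9,
    }
    return checks, all(checks.values())

checks, passed = reconcile(source_rows, target_rows)
print(passed)  # True
```

Wired into a CI/CD pipeline, assertions like these fail the build when a load drops or mutates rows, and the per-check dict pinpoints which invariant broke.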

    ETL Informatica Engineer

    Bengaluru, Karnataka NTT DATA

    Posted today

    Job Description

    Job Description

    Req ID:  314638 

    NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.

    We are currently seeking an ETL Informatica Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

    Job Duties: As a Data Engineer, you will be a member of the Controls Engineering, Measurement and Analytics (CEMA)-Technology Risk Solutions development team, with specific focus on sourcing data and developing data solutions for Legal, Compliance, Audit, and Risk Management functions within Morgan Stanley. In this role you will be primarily responsible for the development of data workflows, views, and stored procedures, in addition to performing data analysis and monitoring and tuning queries and data loads. You will be working closely with data providers, data scientists, data developers, and data analytics teams to facilitate the implementation of client-specific business requirements and requests.
    YOUR KEY RESPONSIBILITIES: 
    • To develop ETLs, stored procedures, triggers, and views on our DB2-based Data Warehouse
    • To perform data profiling and analysis on source system data to ensure that source system data can be integrated and represented properly in our models
    • To monitor the performance of queries and data loads and perform tuning as necessary
    • To provide assistance and guidance during the QA & UAT phases to quickly confirm the validity of potential issues and to determine the root cause and best resolution of verified issues
    SKILLS / QUALIFICATIONS
    • Bachelor’s degree in Computer Science, Software Engineering, Information Technology, or related field required 
    • At least 5 years of experience in data development and solutions in highly complex data environments with large data volumes
    • At least 5 years of experience developing complex ETLs with Informatica PowerCenter
    • At least 5 years of SQL/PL-SQL experience, with the ability to write ad-hoc and complex queries to perform data analysis
    • At least 5 years of experience developing complex stored procedures, triggers, MQTs, and views on IBM DB2 (experience with v10.5 a plus)
    • Experience with performance tuning DB2 tables, queries, and stored procedures
    • Experience with scripting languages like Perl and Python
    • Experience with data ingestion into Hadoop a plus
    • Experience with Autosys or Airflow a plus
    • An understanding of E-R data models (conceptual, logical and physical)
    • Strong understanding of advanced data warehouse concepts (factless fact tables, temporal and bi-temporal models, etc.)
    • Strong analytical skills, including a thorough understanding of how to interpret customer business requirements and translate them into technical designs and solutions
    • Experience with both Waterfall and Agile development methodologies
    • Strong verbal and written communication skills; able to collaborate effectively with a variety of IT and business groups across regions and roles, and to interact with all levels of the organization
    • Self-starter with a proven ability to manage multiple concurrent projects with minimal supervision; can manage a complex, ever-changing priority list and resolve conflicts among competing priorities
    • Strong problem-solving skills; able to identify where focus is needed and bring clarity to business objectives, requirements, and priorities

    Minimum Skills Required: Strong hands-on Informatica experience, including the PowerCenter, data development, SQL/PL-SQL, and IBM DB2 qualifications listed above.


    About NTT DATA

    NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.

    NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
