Statusneo - Data Engineer (ETL)
Posted today
Job Description
Role: ETL Developer
Location: Mumbai
Experience: 3-5 Years
Skills - ETL, BDM, Informatica, Data Integrator
Role Overview:
We are seeking a skilled ETL Developer with experience in Informatica, Big Data Management (BDM), and Data Integrator. The ideal candidate will have a strong background in data extraction, transformation, and loading (ETL) processes, with a focus on optimizing data integration solutions for complex data environments. You will play a critical role in designing and implementing ETL workflows to support our business intelligence and data warehousing initiatives.
Key Responsibilities:
- Design, develop, and maintain ETL processes using Informatica, BDM, and Data Integrator.
- Collaborate with data architects and business analysts to understand data requirements and translate them into ETL solutions.
- Optimize ETL processes for performance, scalability, and reliability.
- Conduct data quality assessments and implement data cleansing procedures.
- Monitor and troubleshoot ETL processes to ensure timely and accurate data integration.
- Work with large datasets across multiple data sources, including structured and unstructured data.
- Document ETL processes, data flows, and mappings to ensure clarity and consistency.
Required Skills:
- 3-5 years of experience in ETL development with a strong focus on Informatica, BDM, and Data Integrator.
- Proficiency in SQL and database technologies (e.g., Oracle, SQL Server, MySQL).
- Experience with big data technologies and frameworks.
- Strong analytical and problem-solving skills.
- Familiarity with data warehousing concepts and best practices.
- Excellent communication and collaboration skills.
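The data-cleansing responsibility above can be sketched, independent of any Informatica mapping or real schema, as a few plain rules; the record fields and the dedupe key here are hypothetical, purely to illustrate the kind of logic an ETL developer encodes:

```python
def cleanse(records, key="customer_id"):
    """Apply basic cleansing rules: trim string fields, normalize blank
    strings to None, and drop duplicate records by key (first one wins)."""
    seen = set()
    cleaned = []
    for rec in records:
        # Trim whitespace; an all-whitespace or empty string becomes None.
        rec = {
            k: (v.strip() or None) if isinstance(v, str) else v
            for k, v in rec.items()
        }
        if rec.get(key) in seen:
            continue  # duplicate key: keep only the first occurrence
        seen.add(rec.get(key))
        cleaned.append(rec)
    return cleaned

rows = [
    {"customer_id": "C1", "name": "  Asha  ", "city": ""},
    {"customer_id": "C1", "name": "Asha", "city": "Mumbai"},
    {"customer_id": "C2", "name": "Ravi", "city": "Pune"},
]
print(cleanse(rows))  # 2 rows: C1 (trimmed, city None) and C2
```

In a real Informatica or BDM flow the same rules would live in expression and aggregator transformations rather than Python, but the logic is the same.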
About Statusneo:
We accelerate your business transformation by leveraging best-fit CLOUD NATIVE technologies wherever feasible. We are DIGITAL consultants who partner with you to solve and deliver. We are experts in CLOUD NATIVE TECHNOLOGY CONSULTING & SOLUTIONS. We build, maintain, and monitor highly scalable, modular applications that leverage the elastic compute, storage, and networking of leading cloud platforms. We CODE your NEO transformations. #StatusNeo
Business domain experience is vital to the success of neo transformations empowered by digital technology. Domain experts ask the right business questions to diagnose and address challenges. Our consultants combine your domain expertise with our digital excellence to build cutting-edge cloud solutions.
ETL Developer
Posted today
Job Description
About PTR Global
PTR Global is a leader in providing innovative workforce solutions, dedicated to optimizing talent acquisition and management processes. Our commitment to excellence has earned us the trust of businesses looking to enhance their talent strategies. We cultivate a dynamic and collaborative environment that empowers our employees to excel and contribute to our clients' success.
Job Summary
We are seeking a highly skilled ETL Developer to join our team in Chennai. The ideal candidate will be responsible for designing, developing, and maintaining ETL processes, as well as data warehouse design and modeling, to support our data integration and business intelligence initiatives. This role requires proficiency in T-SQL, Azure Data Factory (ADF), and SSIS, along with excellent problem-solving and communication skills.
Responsibilities
- Design, develop, and maintain ETL processes to support data integration and business intelligence initiatives.
- Utilize T-SQL to write complex queries and stored procedures for data extraction and transformation.
- Implement and manage ETL processes using SSIS (SQL Server Integration Services).
- Design and model data warehouses to support reporting and analytics needs.
- Ensure data accuracy, quality, and integrity through effective testing and validation procedures.
- Collaborate with business analysts and stakeholders to understand data requirements and deliver solutions that meet their needs.
- Monitor and troubleshoot ETL processes to ensure optimal performance and resolve any issues promptly.
- Document ETL processes, workflows, and data mappings to ensure clarity and maintainability.
- Stay current with industry trends and best practices in ETL development, data integration, and data warehousing.
Must Haves
- At least 4 years of experience as an ETL Developer or in a similar role.
- Proficiency in T-SQL for writing complex queries and stored procedures.
- Experience with SSIS (SQL Server Integration Services) for developing and managing ETL processes.
- Knowledge of ADF (Azure Data Factory) and its application in ETL processes.
- Experience in data warehouse design and modeling.
- Knowledge of Microsoft's Azure cloud suite, including Data Factory, Data Storage, Blob Storage, Power BI, and Power Automate.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
- Strong attention to detail and commitment to data quality.
- Bachelor's degree in Computer Science, Information Technology, or a related field is preferred.
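The testing-and-validation responsibility above usually includes reconciling a loaded target against its source extract. A minimal sketch of that check follows; the function and field names are illustrative, not part of any PTR Global codebase, and in practice the row sets would come from T-SQL queries rather than literals:

```python
def reconcile(source_rows, target_rows, key):
    """Compare source and target extracts by key: report keys missing
    from the target and keys present only in the target. A common
    post-load validation step in an ETL pipeline."""
    src = {r[key] for r in source_rows}
    tgt = {r[key] for r in target_rows}
    return {
        "missing_in_target": sorted(src - tgt),
        "extra_in_target": sorted(tgt - src),
    }

result = reconcile(
    [{"id": 1}, {"id": 2}, {"id": 3}],
    [{"id": 2}, {"id": 3}, {"id": 4}],
    key="id",
)
print(result)  # {'missing_in_target': [1], 'extra_in_target': [4]}
```

An empty report on both sides is the pass condition; anything else feeds the troubleshooting loop the posting describes.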
ETL Developer
Posted today
Job Description
About Company: They balance innovation with an open, friendly culture and the backing of a long-established parent company, known for its ethical reputation. We guide customers from what’s now to what’s next by unlocking the value of their data and applications to solve their digital challenges, achieving outcomes that benefit both business and society.
About Client:
Our client is a global digital solutions and technology consulting company headquartered in Mumbai, India. The company generates annual revenue of over $4.29 billion (₹35,517 crore), reflecting a 4.4% year-over-year growth in USD terms. It has a workforce of around 86,000 professionals operating in more than 40 countries and serves a global client base of over 700 organizations.
Our client operates across several major industry sectors, including Banking, Financial Services & Insurance (BFSI), Technology, Media & Telecommunications (TMT), Healthcare & Life Sciences, and Manufacturing & Consumer. In the past year, the company achieved a net profit of $53.4 million (₹4,584.6 crore), marking a 1.4% increase from the previous year. It also recorded a strong order inflow of $5.6 billion, up 15.7% year-over-year, highlighting growing demand across its service lines.
Key focus areas include Digital Transformation, Enterprise AI, Data & Analytics, and Product Engineering—reflecting its strategic commitment to driving innovation and value for clients across industries.
Role - Scripting & Monitoring Automation Engineer
Experience - 4-6 Years
Location - Chennai
Immediate joiners preferred.
JD Summary:
Looking for an engineer with expertise in scripting and automation for batch job scheduling and monitoring. Must have hands-on experience with Autosys, Oracle Scheduler, or Airflow; scripting in Python, Shell, or PowerShell; and exposure to Talend or Cognos.
Key Responsibilities:
- Automate & monitor job scheduling workflows.
- Integrate with ETL/BI tools.
- Troubleshoot & optimize batch processes.
- Support CI/CD & DR planning.
Requirements:
- Bachelor’s degree in IT/related field.
- Good communication & analytical skills.
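The automate-and-monitor responsibility above boils down to polling a scheduler until a job reaches a terminal state. A minimal sketch follows; `get_status` is a placeholder for whatever scheduler query is actually in use (parsing Autosys `autorep` output, the Airflow REST API, or an Oracle Scheduler view), and the status strings are hypothetical:

```python
import time

def wait_for_job(get_status, timeout_s=300, poll_s=5,
                 clock=time.monotonic, sleep=time.sleep):
    """Poll a batch job until it reaches a terminal state or times out.
    `clock` and `sleep` are injectable so the loop can be tested without
    real waiting."""
    deadline = clock() + timeout_s
    while True:
        status = get_status()
        if status in ("SUCCESS", "FAILURE"):
            return status
        if clock() >= deadline:
            return "TIMEOUT"
        sleep(poll_s)

# Simulated scheduler: reports RUNNING twice, then SUCCESS.
statuses = iter(["RUNNING", "RUNNING", "SUCCESS"])
print(wait_for_job(lambda: next(statuses), sleep=lambda s: None))  # SUCCESS
```

The same loop, wired to a real status command and an alerting hook on FAILURE or TIMEOUT, is the core of most batch-monitoring scripts.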
ETL Developer
Posted today
Job Description
Job Title: ETL Developer – DataStage, AWS, Snowflake
Experience: 5–7 Years
Location: Remote
Job Type: Full-time
About the Role
We are looking for a talented and motivated ETL Developer / Senior Developer to join our data engineering team. You will work on building scalable and efficient data pipelines using IBM DataStage (on Cloud Pak for Data), AWS Glue, and Snowflake. You will collaborate with architects, business analysts, and data modelers to ensure timely and accurate delivery of critical data assets supporting analytics and AI/ML use cases.
Key Responsibilities
- Design, develop, and maintain ETL pipelines using IBM DataStage (CP4D) and AWS Glue/Lambda for ingestion from varied sources like flat files, APIs, Oracle, DB2, etc.
- Build and optimize data flows for loading curated datasets into Snowflake, leveraging best practices for schema design, partitioning, and transformation logic.
- Participate in code reviews, performance tuning, and defect triage sessions.
- Work closely with data governance teams to ensure lineage, privacy tagging, and quality controls are embedded within pipelines.
- Contribute to CI/CD integration of ETL components using Git, Jenkins, and parameterized job configurations.
- Troubleshoot and resolve issues in QA/UAT/Production environments as needed.
- Adhere to agile delivery practices, sprint planning, and documentation requirements.
Required Skills and Experience
- 4+ years of experience in ETL development, with at least 1–2 years in IBM DataStage (preferably the CP4D version).
- Hands-on experience with AWS Glue (PySpark or Spark) and AWS Lambda for event-based processing.
- Experience working with Snowflake: loading strategies, streams and tasks, zero-copy cloning, and performance tuning.
- Proficiency in SQL, Unix scripting, and basic Python for data handling or automation.
- Familiarity with S3, version control systems (Git), and job orchestration tools.
- Experience with data profiling, cleansing, and quality validation routines.
- Understanding of data lake/data warehouse architectures and DevOps practices.
Good to Have
- Experience with Collibra, BigID, or other metadata/governance tools
- Exposure to Data Mesh/Data Domain models
- Experience with agile/Scrum delivery and Jira/Confluence tools
- AWS or Snowflake certification is a plus
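The Snowflake loading strategies named above typically center on an upsert from a staging dataset into a curated table (a MERGE statement, often driven by streams and tasks). Stripped of Snowflake SQL, the logic is sketched below; the dictionaries standing in for tables and the `id` key are hypothetical:

```python
def merge_upsert(target, staged, key="id"):
    """Mimic a Snowflake MERGE: update rows whose key already exists in
    the target, insert rows whose key does not. `target` maps key -> row
    and stands in for the curated table; `staged` for the staging data."""
    inserted = updated = 0
    for row in staged:
        if row[key] in target:
            target[row[key]].update(row)  # WHEN MATCHED THEN UPDATE
            updated += 1
        else:
            target[row[key]] = dict(row)  # WHEN NOT MATCHED THEN INSERT
            inserted += 1
    return inserted, updated

curated = {1: {"id": 1, "amount": 100}}
counts = merge_upsert(curated, [{"id": 1, "amount": 150}, {"id": 2, "amount": 75}])
print(counts)  # (1, 1): one insert, one update
```

Because a MERGE keyed this way is idempotent, re-running a failed load does not duplicate rows, which is why the pattern dominates curated-layer loads.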