2,973 Consultant Etl Developer jobs in India
Job No Longer Available
This position is no longer listed on WhatJobs. The employer may be reviewing applications, may have filled the role, or may have removed the listing.
However, we have similar jobs available for you below.
Principal Consultant, DB ETL Developer
Posted today
Job Description
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.
Inviting applications for the role of Principal Consultant, DB ETL Developer
In this role, you will be responsible for coding, testing, and delivering high-quality work, and you should be willing to learn new technologies.
Responsibilities
· Design, code, and maintain databases, ensuring their stability, reliability, and performance.
· Research and suggest new database products, services, and protocols.
· Ensure all database programs meet company and performance requirements.
· Collaborate with other database teams and owners of different applications.
· Modify databases according to requests and perform tests.
· Maintain and own databases in all environments.
Qualifications we seek in you!
Minimum Qualifications
· BE/B Tech/MCA
· Excellent written and verbal communication skills
Preferred Qualifications/ Skills
· A bachelor’s degree in Computer Science or a related field.
· Hands-on development experience in Sybase, DB2, and ETL technologies.
· Extensive work on data integration and on designing and developing reusable interfaces.
· Advanced experience in Sybase, shell scripting, Unix, database design and modelling, ETL technologies, and Informatica.
· Hands-on experience with Snowflake or Informatica, including:
o Expertise in Snowflake data modelling and ELT using Snowflake SQL, implementing complex stored procedures and applying data warehouse and ETL best practices.
o Designing, implementing, and testing cloud computing solutions using Snowflake technology.
o Expertise in advanced Snowflake features such as resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone, and time travel, and an understanding of how to use them.
o Creating, monitoring, and optimizing ETL/ELT processes (Talend, Informatica) while migrating solutions from on-premises to public cloud platforms.
· Expert-level understanding of data warehousing, core database concepts, and relational database design.
· Skilled at writing and editing large, complex SQL statements.
· Experience writing stored procedures and performing optimization and performance tuning.
· Strong technology acumen and a deep strategic mindset.
· Proven track record of delivering results.
· Proven analytical skills and experience making decisions based on hard and soft data.
· A desire and openness to learning and continuous improvement, both for yourself and your team members.
· Exposure to SDLC tools such as JIRA, Confluence, SVN, TeamCity, Jenkins, Nolio, and Crucible.
· Experience with DevOps, CI/CD, and Agile methodology.
· Experience with Business Intelligence tools is good to have.
· Familiarity with Postgres and Python is a plus.
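The requirements above repeatedly mention building reusable data-integration interfaces. As a minimal sketch, assuming a plain-Python stand-in (all names here, such as `run_etl`, are invented for illustration and not taken from the posting), a reusable ETL interface might take this shape:

```python
# Minimal sketch of a reusable extract/transform/load interface.
# Every function and record name here is illustrative, not from the posting.

def extract(rows):
    """Extract: yield raw records from a source (here, an in-memory list)."""
    for row in rows:
        yield row

def transform(record):
    """Transform: normalize field names and types."""
    return {
        "id": int(record["ID"]),
        "name": record["Name"].strip().title(),
    }

def load(records, target):
    """Load: append transformed records to a target store; return its size."""
    for rec in records:
        target.append(rec)
    return len(target)

def run_etl(source_rows, target):
    """Wire the three stages together; each stage is independently swappable."""
    return load((transform(r) for r in extract(source_rows)), target)

source = [{"ID": "1", "Name": "  alice "}, {"ID": "2", "Name": "BOB"}]
warehouse = []
loaded = run_etl(source, warehouse)
print(loaded)                # 2
print(warehouse[0]["name"])  # Alice
```

The point of the sketch is the seam between stages: because each stage only consumes and produces plain records, a Sybase extractor or a Snowflake loader could be swapped in without touching the other two.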
Data Migration
Posted today
Job Description
Education Qualification-
Competencies-
Must have migration skills
Must have experience with S/4HANA
Master Data Management experience
Certificates-
SAP Certified Technology Associate - OS/DB Migration for SAP NetWeaver 7.52
Skills-
Must have good data migration skills
Good experience with the S/4HANA database
Must have experience with other SAP modules
Data migration
Posted today
Job Description
Vertiv, a $ global organization with nearly 27,000 employees, designs, builds and services critical infrastructure that enables vital applications for data centers, communication networks, and commercial and industrial facilities. We support today’s growing mobile and cloud computing markets with a portfolio of power, thermal and infrastructure management solutions.
Job Summary
The Data Migration – Item MDM role will manage the extract, transform, and load (ETL) of item-related data from/to Oracle PD/PIM and Oracle EBS.
Responsibilities :
- Work as part of the Data Migration team performing ETL activities in:
- Oracle Fusion Cloud PD and PIM: updating item attributes and BOMs; loading new item, document, attachment, and BOM information.
- Oracle EBS: migrating all master and transactional data; updating item attribution and BOM information.
- Previous experience with data migration of item-related data in Oracle PD/PIM or Oracle EBS is a must.
- Adhere to a data migration strategy and usage of specific data migration tools.
- Identify risks and issues in a timely manner and escalate for resolution as needed.
- Manage data quality across different phases of the data migration and make sure that data is fit for purpose.
- Knowledge of Fusion Data Migration tools including FBDI/HDL/ADFDI and Fusion Web Services.
- Work collaboratively to ensure data is cleansed in a timely manner.
- Substantial experience working with databases and ETL tools capable of data cleansing.
- Perform data migration audit, reconciliation and exception reporting.
- Work with subject matter experts and project team to identify, define, collate, document and communicate the data migration requirements.
- Work across multiple functional work streams to understand data usage and implications for data migration.
- Support initiatives for data integrity and governance. Perform source data identification and analysis to manage source to target data mapping.
- Managing master and transactional data, including creation, updates, and deletion.
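The audit, reconciliation, and exception-reporting duties above reduce to comparing source and target record sets by key. This is a hedged plain-Python sketch under assumed record shapes (the `reconcile` name and the `item_id` key are illustrative, not from the posting):

```python
# Illustrative migration-reconciliation report: compare source and target
# record sets by key and flag missing, unexpected, and mismatched rows.
# Record layout and all names are assumptions made for this example.

def reconcile(source, target, key="item_id"):
    src = {r[key]: r for r in source}
    tgt = {r[key]: r for r in target}
    missing = sorted(set(src) - set(tgt))      # extracted but never landed
    unexpected = sorted(set(tgt) - set(src))   # landed without a source row
    mismatched = sorted(
        k for k in set(src) & set(tgt) if src[k] != tgt[k]
    )
    return {
        "source_count": len(src),
        "target_count": len(tgt),
        "missing_in_target": missing,
        "unexpected_in_target": unexpected,
        "mismatched": mismatched,
    }

source = [{"item_id": 1, "desc": "bolt"}, {"item_id": 2, "desc": "nut"}]
target = [{"item_id": 1, "desc": "bolt"}, {"item_id": 3, "desc": "washer"}]
report = reconcile(source, target)
print(report["missing_in_target"])     # [2]
print(report["unexpected_in_target"])  # [3]
```

In practice the same shape of report is produced with SQL set operations or an ETL tool's audit step; the exception lists are what feed the escalation and cleansing work the posting describes.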
Requirements :
- Bachelor's Degree in Information Technology, Process Management or related degree or experience
- At least 4 years of combined experience in item/product data migration specifically extract, transform and load.
- Candidates should have 2+ years of experience in Oracle Fusion and Oracle EBS data migration roles.
- Business Knowledge: Demonstrates strong knowledge of current and possible future policies, practices, trends, technology, and information related to the business and the organization.
- Communication: Demonstrates excellent listening and communication skills (written and verbal)
- Initiative : Works independently and is highly motivated to initiate and accept new challenges
- Judgment/Decision Making : Makes solid decisions based on a mixture of analysis, wisdom, experience, and judgment.
- Managing & Adapting to Change : Readily adapts to changes in priority of initiatives and overall strategic direction within a multi-plant, geographically widespread organization.
- Professionalism : Exhibits appropriate attributes in all aspects of performance and demeanor
- Teamwork : Organizes and directs effective teams at the cross-functional level that consistently achieve stated goals
- Results Oriented : Bottom-line oriented and can be counted on to consistently meet and exceed goals.
Data Migration
Posted today
Job Description
Greetings from Netsach - A Cyber Security Company.
We are looking for an experienced SAP MDG Applications consultant; the candidate must have worked in S/4HANA on at least one AMS or implementation project.
Job Title: SAP Data & MDG Application
Exp: 5-10yrs
No of Openings: 3
Job Location: Bangalore/Hyderabad/Mumbai/Kolkata
Work Type: Fulltime - Hybrid
Interested candidates please share your resume at and post in netsachglobal.com.
Experience working in S/4HANA on at least one AMS or implementation project.
Along with the above, the candidate should have strong knowledge of:
In-depth exposure to SAP MDG applications - F, S, C, M (Financial, Supplier, Customer, and Material master data maintenance)
Prior functional experience with data migration and the SAP FI, SAP SD, or SAP MM modules
Participation in requirement-gathering sessions and documenting the FSD
Good understanding of the standard MDG process and the ability to guide users around MDG
Experience with SAP MDG S/4HANA configuration (including the data model, BRFplus, and Floorplan Manager)
Experience configuring rule-based workflows
Experience integrating business process requirements with the technical implementation of SAP Master Data Governance
Experience developing user statistics reports in MDG
Knowledge of generating statistics reports for material master data cleansing activities
Thank You
Emily Jha
+91
Netsach - A Cyber Security Company
Data Migration Engineer
Posted 1 day ago
Job Description
**Data Migration Engineer**
About Oracle FSGIU - Finergy:
The Finergy division within Oracle FSGIU is dedicated to the Banking, Financial Services, and Insurance (BFSI) sector. We offer deep industry knowledge and expertise to address the complex financial needs of our clients. With proven methodologies that accelerate deployment and personalization tools that create loyal customers, Finergy has established itself as a leading provider of end-to-end banking solutions. Our single platform for a wide range of banking services enhances operational efficiency, and our expert consulting services ensure technology aligns with our clients' business goals.
**Job Summary:**
We are seeking a skilled Data Migration Engineer with expertise in AWS, Databricks, Python, PySpark, and SQL to lead and execute complex data migration projects. The ideal candidate will design, develop, and implement data migration solutions to move large volumes of data from legacy systems to modern cloud-based platforms, ensuring data integrity, accuracy, and minimal downtime.
**Job Responsibilities**
**Software Development:**
+ Design, develop, test, and deploy high-performance and scalable data solutions using Python, PySpark, SQL
+ Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications.
+ Implement efficient and maintainable code using best practices and coding standards.
**AWS & Databricks Implementation:**
+ Work with Databricks platform for big data processing and analytics.
+ Develop and maintain ETL processes using Databricks notebooks.
+ Implement and optimize data pipelines for data transformation and integration.
+ Utilize AWS services (e.g., S3, Glue, Redshift, Lambda) and Databricks to build and optimize data migration pipelines.
+ Leverage PySpark for large-scale data processing and transformation tasks.
**Continuous Learning:**
+ Stay updated on the latest industry trends, tools, and technologies related to Python, SQL, and Databricks.
+ Share knowledge with the team and contribute to a culture of continuous improvement.
**SQL Database Management:**
+ Utilize expertise in SQL to design, optimize, and maintain relational databases.
+ Write complex SQL queries for data retrieval, manipulation, and analysis.
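As an illustration of the kind of complex SQL query the bullets above call for, the following sketch uses Python's stdlib `sqlite3` as a stand-in for a production database; the `orders` table, its columns, and all values are invented for the example:

```python
# Aggregate-plus-subquery example of a "complex SQL" report query,
# run against an in-memory SQLite database. Schema and data are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
INSERT INTO orders VALUES
  (1, 'acme', 120.0), (2, 'acme', 80.0),
  (3, 'globex', 300.0), (4, 'initech', 50.0);
""")

# Per-customer totals, keeping only customers whose total exceeds the
# average of all per-customer totals (200 vs average ~183.33 here).
query = """
SELECT customer, SUM(amount) AS total
FROM orders
GROUP BY customer
HAVING total > (SELECT AVG(t) FROM
                (SELECT SUM(amount) AS t FROM orders GROUP BY customer))
ORDER BY total DESC;
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('globex', 300.0), ('acme', 200.0)]
```

The same pattern (aggregate, compare against a derived-table aggregate, order the result) carries over directly to the larger RDBMS and Databricks SQL contexts the posting targets.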
**Qualifications & Skills:**
+ Education: Bachelor's degree in Computer Science, Engineering, Data Science, or a related field. Advanced degrees are a plus.
+ 3 to 5+ years of experience in Databricks and big data frameworks
+ Proficiency in AWS services and data migration
+ Experience with Databricks Unity Catalog
+ Familiarity with batch and real-time processing
+ Data engineering experience with strong skills in Python, PySpark, and SQL
+ Certifications: AWS Certified Solutions Architect, Databricks Certified Professional, or similar are a plus.
**Soft Skills:**
+ Strong problem-solving and analytical skills.
+ Excellent communication and collaboration abilities.
+ Ability to work in a fast-paced, agile environment.
Career Level - IC1
**About Us**
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry-leaders in almost every sector-and continue to thrive after 40+ years of change by operating with integrity.
We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.
Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.
We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing or by calling +1 in the United States.
Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Data Migration Engineer
Posted 1 day ago
Job Viewed
Job Description
**Data Migration Engineer**
About Oracle FSGIU - Finergy:
The Finergy division within Oracle FSGIU is dedicated to the Banking, Financial Services, and Insurance (BFSI) sector. We offer deep industry knowledge and expertise to address the complex financial needs of our clients. With proven methodologies that accelerate deployment and personalization tools that create loyal customers, Finergy has established itself as a leading provider of end-to-end banking solutions. Our single platform for a wide range of banking services enhances operational efficiency, and our expert consulting services ensure technology aligns with our clients' business goals.
**Job Summary:**
We are seeking a skilled Data Migration Engineer with expertise in AWS, Databricks, Python, PySpark, and SQL to lead and execute complex data migration projects. The ideal candidate will design, develop, and implement data migration solutions to move large volumes of data from legacy systems to modern cloud-based platforms, ensuring data integrity, accuracy, and minimal downtime.
**Job Responsibilities**
**Software Development:**
+ Design, develop, test, and deploy high-performance and scalable data solutions using Python, PySpark, SQL
+ Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications.
+ Implement efficient and maintainable code using best practices and coding standards.
**AWS & Databricks Implementation:**
+ Work with Databricks platform for big data processing and analytics.
+ Develop and maintain ETL processes using Databricks notebooks.
+ Implement and optimize data pipelines for data transformation and integration.
+ Utilize AWS services (e.g., S3, Glue, Redshift, Lambda) and Databricks to build and optimize data migration pipelines.
+ Leverage PySpark for large-scale data processing and transformation tasks.
**Continuous Learning:**
+ Stay updated on the latest industry trends, tools, and technologies related to Python, SQL, and Databricks.
+ Share knowledge with the team and contribute to a culture of continuous improvement.
**SQL Database Management:**
+ Utilize expertise in SQL to design, optimize, and maintain relational databases.
+ Write complex SQL queries for data retrieval, manipulation, and analysis.
**Qualifications & Skills:**
+ Education: Bachelor's degree in Computer Science, Engineering, Data Science, or a related field. Advanced degrees are a plus.
+ 3 to 5+ Years of experience in Databricks and big data frameworks
+ Proficient in AWS services and data migration
+ Experience in Unity Catalogue
+ Familiarity with Batch and real time processing
+ Data engineering with strong skills in Python, PySpark, SQL
+ Certifications: AWS Certified Solutions Architect, Databricks Certified Professional, or similar are a plus.
**Soft Skills:**
+ Strong problem-solving and analytical skills.
+ Excellent communication and collaboration abilities.
+ Ability to work in a fast-paced, agile environment.
**Responsibilities**
**Job Responsibilities**
**Software Development:**
+ Design, develop, test, and deploy high-performance and scalable data solutions using Python, PySpark, SQL
+ Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications.
+ Implement efficient and maintainable code using best practices and coding standards.
**AWS & Databricks Implementation:**
+ Work with Databricks platform for big data processing and analytics.
+ Develop and maintain ETL processes using Databricks notebooks.
+ Implement and optimize data pipelines for data transformation and integration.
+ Utilize AWS services (e.g., S3, Glue, Redshift, Lambda) and Databricks to build and optimize data migration pipelines.
+ Leverage PySpark for large-scale data processing and transformation tasks.
**Continuous Learning:**
+ Stay updated on the latest industry trends, tools, and technologies related to Python, SQL, and Databricks.
+ Share knowledge with the team and contribute to a culture of continuous improvement.
**SQL Database Management:**
+ Utilize expertise in SQL to design, optimize, and maintain relational databases.
+ Write complex SQL queries for data retrieval, manipulation, and analysis.
Career Level - IC1
**About Us**
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and we continue to thrive after 40+ years of change by operating with integrity.
We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.
Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.
We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing or by calling +1 in the United States.
Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Data Migration Engineer
Posted 1 day ago
Job Description
**Data Migration Engineer**
About Oracle FSGIU - Finergy:
The Finergy division within Oracle FSGIU is dedicated to the Banking, Financial Services, and Insurance (BFSI) sector. We offer deep industry knowledge and expertise to address the complex financial needs of our clients. With proven methodologies that accelerate deployment and personalization tools that create loyal customers, Finergy has established itself as a leading provider of end-to-end banking solutions. Our single platform for a wide range of banking services enhances operational efficiency, and our expert consulting services ensure technology aligns with our clients' business goals.
**Job Summary:**
We are seeking a skilled Data Migration Engineer with expertise in AWS, Databricks, Python, PySpark, and SQL to lead and execute complex data migration projects. The ideal candidate will design, develop, and implement data migration solutions to move large volumes of data from legacy systems to modern cloud-based platforms, ensuring data integrity, accuracy, and minimal downtime.
**Job Responsibilities**
**Software Development:**
+ Design, develop, test, and deploy high-performance, scalable data solutions using Python, PySpark, and SQL.
+ Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications.
+ Implement efficient and maintainable code using best practices and coding standards.
**AWS & Databricks Implementation:**
+ Work with the Databricks platform for big data processing and analytics.
+ Develop and maintain ETL processes using Databricks notebooks.
+ Implement and optimize data pipelines for data transformation and integration.
+ Utilize AWS services (e.g., S3, Glue, Redshift, Lambda) and Databricks to build and optimize data migration pipelines.
+ Leverage PySpark for large-scale data processing and transformation tasks.
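In a Databricks setting, the pipeline bullets above would typically be PySpark jobs reading from and writing to S3. The shape of that extract-transform-load work can be sketched library-free in plain Python (the file layout and record schema here are hypothetical illustrations, not taken from the posting):

```python
import csv
import io

def transform(record):
    """Transform step: normalise one legacy record (hypothetical schema)."""
    return {
        "customer_id": int(record["id"]),
        "email": record["email"].strip().lower(),
    }

def run_pipeline(source, sink):
    """Extract rows from a CSV source, transform each, load into the sink."""
    reader = csv.DictReader(source)
    writer = csv.DictWriter(sink, fieldnames=["customer_id", "email"])
    writer.writeheader()
    count = 0
    for record in reader:
        writer.writerow(transform(record))
        count += 1
    return count

# In production the source and sink would be S3 objects handled through
# Spark DataFrames; in-memory buffers keep this sketch self-contained.
legacy = io.StringIO("id,email\n7, Alice@Example.COM \n9,bob@example.com\n")
target = io.StringIO()
migrated = run_pipeline(legacy, target)  # → 2 rows migrated
```

In PySpark the same steps would map onto `spark.read`, `withColumn` transformations, and a `write` to the target format, with Spark parallelising the per-record work across the cluster.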
**Continuous Learning:**
+ Stay updated on the latest industry trends, tools, and technologies related to Python, SQL, and Databricks.
+ Share knowledge with the team and contribute to a culture of continuous improvement.
**SQL Database Management:**
+ Utilize expertise in SQL to design, optimize, and maintain relational databases.
+ Write complex SQL queries for data retrieval, manipulation, and analysis.
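As one illustration of the kind of query the role calls for, here is a CTE plus window-function query against a hypothetical `transactions` table. SQLite is used only so the example is self-contained; the SQL itself runs on most relational databases:

```python
import sqlite3

# In-memory database with a hypothetical transactions table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE transactions (
        account_id INTEGER,
        txn_date   TEXT,
        amount     REAL
    );
    INSERT INTO transactions VALUES
        (1, '2024-01-05', 120.0),
        (1, '2024-01-20',  80.0),
        (2, '2024-01-11', 300.0),
        (2, '2024-02-02',  50.0),
        (3, '2024-02-14',  40.0);
""")

# Per-account totals ranked with a window function, keeping only the
# accounts whose total exceeds the average total across all accounts.
query = """
    WITH totals AS (
        SELECT account_id, SUM(amount) AS total
        FROM transactions
        GROUP BY account_id
    )
    SELECT account_id, total,
           RANK() OVER (ORDER BY total DESC) AS rnk
    FROM totals
    WHERE total > (SELECT AVG(total) FROM totals)
    ORDER BY rnk;
"""
rows = conn.execute(query).fetchall()
print(rows)  # → [(2, 350.0, 1), (1, 200.0, 2)]
```

Note that the window function is evaluated after the `WHERE` filter, so the ranks apply only to the surviving rows.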
**Qualifications & Skills:**
+ Education: Bachelor's degree in Computer Science, Engineering, Data Science, or a related field. Advanced degrees are a plus.
+ 3-5+ years of experience with Databricks and big data frameworks.
+ Proficiency with AWS services and data migration.
+ Experience with Databricks Unity Catalog.
+ Familiarity with batch and real-time processing.
+ Strong data engineering skills in Python, PySpark, and SQL.
+ Certifications: AWS Certified Solutions Architect, Databricks Certified Professional, or similar are a plus.
**Soft Skills:**
+ Strong problem-solving and analytical skills.
+ Excellent communication and collaboration abilities.
+ Ability to work in a fast-paced, agile environment.
Career Level - IC1