173 ETL Architect jobs in India

ETL Architect

Pune, Maharashtra Accelirate Inc.

Posted 10 days ago


Job Description

ETL Lead

Location: Pune, India (Hybrid/Remote)

Job Type: Full-Time

Experience: 10+ years


About Company

Accelirate is a leading AI and automation firm that builds intelligent agents transforming enterprise operations. We’re a certified partner of UiPath, Microsoft, and Salesforce, automating more than 1,500 processes annually across industries. We’re proud to be recognized as a Top Workplaces USA winner five years in a row. Our work empowers humans, agents, and robots to collaborate and work smarter.


Job Summary:

We are seeking an experienced ETL Lead / ETL Architect to design, develop, and optimize scalable data pipelines. This role involves working with ETL tools (e.g., Matillion, Informatica, Talend, SSIS, ADF, AWS Glue), cloud data platforms (Snowflake, Azure, AWS, GCP), and data warehousing best practices to ensure efficient data integration and processing.

Key Responsibilities:

  • Architect end-to-end data warehousing solutions for the customer.
  • Design and implement high-performance ETL workflows for structured and unstructured data.
  • Develop and manage data pipelines on cloud platforms like Snowflake, Redshift, or Azure Synapse.
  • Ensure data quality, governance, and security across all ETL processes.
  • Collaborate with cross-functional teams to define data requirements and integration strategies.
  • Optimize ETL performance, scalability, and cost efficiency.
  • Lead and mentor a team of ETL developers.


Required Skills & Experience:

  • 10+ years of experience in ETL development and data integration.
  • Expertise in ETL tools (Matillion, Informatica, Talend, SSIS, AWS Glue, etc.).
  • Strong SQL skills and proficiency in Python or Scala for automation.
  • Hands-on experience with cloud data warehouses (Snowflake, Redshift, Azure Synapse, BigQuery).
  • Understanding of data modelling, data lakes, and workflow orchestration (Airflow, ADF, AWS Step Functions); see the sketch following this list.
  • Strong problem-solving and leadership skills.
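
A minimal sketch of the orchestration piece mentioned above, written as an Apache Airflow DAG. The pipeline name, task bodies, and schedule are hypothetical placeholders (not taken from this posting), and it assumes Airflow 2.x with the standard PythonOperator:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw records from a source system.
    print("extracting source rows")


def transform():
    # Placeholder: apply cleansing and business rules.
    print("transforming staged rows")


def load():
    # Placeholder: write curated results to the warehouse (e.g. Snowflake).
    print("loading into warehouse")


with DAG(
    dag_id="daily_sales_etl",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+; earlier 2.x uses schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run the three steps strictly in order.
    extract_task >> transform_task >> load_task
```

In a real pipeline the three callables would typically hand off through staging tables rather than in-process, and the operators might be tool-specific (e.g. Snowflake or ADF provider operators) instead of plain Python functions.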


Preferred Qualifications:

  • Experience with performance tuning and cost optimization of ETL workflows.
  • Knowledge of real-time data processing and streaming technologies (Kafka, Kinesis, Azure Event Hubs, etc.).
  • Familiarity with data cataloguing, lineage tracking, and metadata management.
  • Hands-on experience in data security and compliance (GDPR, HIPAA, SOC 2).
  • Certifications in ETL Tools, Snowflake, AWS, Azure, or GCP would be appreciated.

Data and Analytics ETL Architect

Bengaluru, Karnataka Astellas Pharma

Posted 2 days ago


Job Description

Do you want to be part of an inclusive team that works to develop innovative therapies for patients? Every day, we are driven to develop and deliver innovative and effective new medicines to patients and physicians. If you want to be part of this exciting work, you belong at Astellas!
Astellas' Global Capability Centres (GCCs) are strategically located sites that give Astellas the ability to access talent across various functions in the value chain and to co-locate core capabilities that are currently dispersed. Our three GCCs are located in India, Poland and Mexico.
The GCCs will enhance our operational efficiency, resilience and innovation potential, enabling a timely response to changing business demands.
Our GCCs are an integral part of Astellas, guided by our shared values and behaviors, and are critical enablers of the company's strategic priorities, sustainable growth, and commitment to turning innovative science into VALUE for patients. This is a hybrid position based in Bangalore, India.
**Purpose and Scope:**
As a Data and Analytics Architect specializing in Business Intelligence (BI) and ETL (Extract, Transform, Load) technologies, you will play a crucial role in designing, developing, and optimizing data solutions for our organization. Your expertise in BI tools and ETL processes will contribute to the success of our data-driven initiatives.
**Essential Job Responsibilities:**
+ Data Strategy Contribution: Contribute to the organization's data strategy by identifying opportunities for data-driven insights and improvements.
+ Participate in smaller, focused mission teams to deliver value-driven solutions aligned to our global and bold move priority initiatives and beyond.
+ Design, develop and implement robust and scalable data analytics using modern technologies.
+ Collaborate with cross-functional teams and practices across the organization, including Commercial, Manufacturing, Medical, DataX, GrowthX, and support other X (transformation) Hubs and Practices as appropriate, to understand user needs and translate them into technical solutions.
+ Provide technical support to internal users, troubleshooting complex issues and restoring system uptime as quickly as possible.
+ Champion continuous improvement initiatives, identifying opportunities to optimize the performance, security, and maintainability of existing data and platform architecture and other technology investments.
+ Participate in the continuous delivery pipeline, adhering to DevOps best practices for version control, automation, and deployment, and ensuring effective management of the FoundationX backlog.
+ Leverage your knowledge of data engineering principles to integrate with existing data pipelines and explore new possibilities for data utilization.
+ Stay up to date on the latest trends and technologies in data engineering and cloud platforms.
**Qualifications:**
**Required**
+ Bachelor's Degree in computer science, information technology, or related field (or equivalent experience.)
+ 3+ years of experience using Qlik, PowerBI or equivalent technologies
+ Analytical mindset and logical thinking.
+ Familiarity with Business Intelligence and Data Warehousing concepts.
+ Web integration skills (Qlik Sense).
+ Advanced SQL knowledge.
+ Understanding of stored procedures, triggers, and tuning.
+ Experience with other BI tools (Tableau, D3.js) is a plus.
+ Business Intelligence Tools:
+ QLIK: Proficiency in designing, developing, and maintaining QLIK applications. Experience with QLIK Sense and QLIKView is highly desirable, as is experience with NPrinting and Qlik Alerting.
+ Expert at creating and consuming complex Qlik/Power BI data models.
+ Power BI: Strong expertise in creating interactive reports, dashboards, and visualizations using Power BI. Knowledge of DAX (Data Analysis Expressions) is essential, along with Power Automate (MS Flow) or Power BI alerts.
+ Data Modeling and Integration:
+ Best practice knowledge of modelling data that is to be consumed by QlikSense or PowerBI
+ Ability to design and implement logical and physical data models.
+ Familiarity with data warehousing concepts, including star schema, snowflake schema, and data marts.
+ Understanding of ETL (Extract, Transform, Load) processes and data integration techniques.
+ Ensure data security and compliance with privacy regulations (e.g., GDPR).
+ Validate access controls and encryption mechanisms.
+ SQL and Database Management:
+ Proficiency in SQL for querying and manipulating data.
+ Knowledge of database management systems (e.g., SQL Server, Oracle, MySQL).
+ Data Governance and Quality:
+ Understanding of data governance principles, data lineage, and metadata management
+ Experience ensuring data quality, consistency, and accuracy
+ Programming Languages:
+ Basic understanding of SQL, Python, or R for data validation (see the sketch following this list)
+ Data Analysis:
+ Knowledge of statistical analysis and data visualization tools (e.g., Tableau, Power BI, Posit)
+ Pharmaceutical Domain:
+ Understanding of pharmaceutical data (clinical trials, drug development, patient records).
+ Attention to Detail: Meticulous review of requirements and data quality.
+ Experience performance tuning BI data models and reports to ensure apps are responsive to users' needs.
+ Proven skill at creating dashboards of pixel-perfect quality.
+ The ability to work with end users to uncover business requirements and turn these into powerful, action-oriented applications.
+ Create and maintain technical documentation.
+ Ability to communicate and work effectively with other technical team members to agree on the best solution.
+ A self-starter able to work independently but who knows when to collaborate with the wider team to ensure the best overall solution.
+ Experience working with data warehousing, data modelling and architecture.
+ Understanding of Qlik, Tableau and/or Power BI architecture or any equivalent technology.
+ Ability to install, configure and upgrade Qlik/Power BI or equivalent systems.
+ Analytical Thinking: Demonstrated ability to lead ad hoc analyses, identify performance gaps, and foster a culture of continuous improvement.
+ Agile Champion: Adherence to DevOps principles and a proven track record with CI/CD pipelines for continuous delivery.
+ Collaboration and Communication: Work closely with cross-functional teams, including data engineers, data scientists, and business stakeholders.
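
Since the qualifications above mention Python for data validation, here is a small, hedged sketch of what such a check might look like with pandas. The file name and column names are hypothetical placeholders, not Astellas data:

```python
import pandas as pd

# Hypothetical extract; the file and columns are placeholders for illustration only.
df = pd.read_csv("patient_visits.csv")

checks = {
    "patient_id is never null": df["patient_id"].notna().all(),
    "visit_id is unique": df["visit_id"].is_unique,
    "visit_date parses as a date": pd.to_datetime(df["visit_date"], errors="coerce").notna().all(),
    "dose_mg within expected range": df["dose_mg"].between(0, 1000).all(),
}

# Collect and report any failed checks before data is loaded downstream.
failed = [name for name, passed in checks.items() if not passed]
if failed:
    raise ValueError(f"Data validation failed: {failed}")
print("All validation checks passed.")
```
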
**Working Environment**
At Astellas we recognize the importance of work/life balance, and we are proud to offer a hybrid working solution allowing time to connect with colleagues at the office with the flexibility to also work from home. We believe this will optimize the most productive work environment for all employees to succeed and deliver. Hybrid work from certain locations may be permitted in accordance with Astellas' Responsible Flexibility Guidelines.
"Beware of recruitment scams impersonating Astellas recruiters or representatives. Authentic communication will only originate from an official Astellas LinkedIn profile or a verified company email address. If you encounter a fake profile or anything suspicious, report it promptly to LinkedIn's support team through LinkedIn Help"
#LI-CH1
Category FoundationX
Astellas is committed to equality of opportunity in all aspects of employment.
EOE including Disability/Protected Veterans

Data Integration Architect

Navi Mumbai, Maharashtra Reliance New Energy

Posted 9 days ago


Job Description

full-time

The global power market is amidst a fundamental transition from a central (predictable, vertically integrated, one-way) model to a distributed (intermittent, horizontally networked, bidirectional) one, with increasing penetration of renewables playing a key role in this transition.

RIL's newly created Distributed Renewables (RE) business intends to accelerate this transition by providing safe, reliable, affordable, and accessible distributed green energy solutions to India's population, thereby improving quality of life.

Digital is the key enabler for the business to scale up through the three pillars of agility, delightful customer experience, and data-driven decision making.

Work Location : Navi Mumbai

Department: Digital, Distributed Renewable Energy

Reporting to: Head, Digital Initiatives, Distributed Renewables

Job Overview:

We are seeking a highly skilled and experienced Data and Integration Architect to join our team. This role is crucial for designing and implementing robust data and integration architectures that support our company's strategic goals. The ideal candidate will possess a deep understanding of data architecture, data modeling, integration patterns, and the latest technologies in data integration and management. This position requires a strategic thinker who can collaborate with various stakeholders to ensure our data and integration frameworks are scalable, secure, and aligned with business needs.

Key Responsibilities:

1.   Data Architecture Design: Develop and maintain an enterprise data architecture strategy that supports business objectives and aligns with the company's technology roadmap.

2.   Integration Architecture Development: Design and implement integration solutions that seamlessly connect disparate systems both internally and with external partners, ensuring data consistency and accessibility.

3.   Data Governance and Compliance: Establish and enforce data governance policies and procedures to ensure data integrity, quality, security, and compliance with relevant regulations.

4.   System Evaluation and Selection: Evaluate and recommend technologies and platforms for data integration, management, and analytics, ensuring they meet the organization's needs.

5.   Collaboration with IT and Business Teams: Work closely with IT teams, business analysts, and external partners to understand data and integration requirements and translate them into architectural solutions.

6.   Performance and Scalability: Ensure the data and integration architecture supports high performance and scalability, addressing future growth and technology evolution.

7.   Best Practices and Standards: Advocate for and implement industry best practices and standards in data management, integration, and architecture design.

8.   Troubleshooting and Optimization: Identify and address data and integration bottlenecks, performing regular system audits and optimizations to improve performance and efficiency.

9.   Documentation and Training: Develop comprehensive documentation for the data and integration architectures. Provide training and mentorship to IT staff and stakeholders on best practices.

Qualifications:

1.   Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.

2.   Minimum of 7 years of experience in data architecture, integration, or a related field, with a proven track record of designing and implementing large-scale data and integration solutions.

3.   Expert knowledge of data modeling, data warehousing, ETL processes, and integration patterns (APIs, microservices, messaging); see the messaging sketch following this list.

4.   Experience with cloud-based data and integration platforms (e.g., AWS, Azure, Google Cloud Platform) and understanding of SaaS, PaaS, and IaaS models.

5.   Strong understanding of data governance, data quality management, and compliance regulations (e.g., GDPR, HIPAA).

6.   Proficient in SQL and NoSQL databases, data integration tools (e.g., Informatica, Talend, MuleSoft), and data visualization tools (e.g., Tableau, Power BI).

7.   Excellent analytical, problem-solving, and project management skills.

8.   Outstanding communication and interpersonal abilities, with the skill to articulate complex technical concepts to non-technical stakeholders.
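
To illustrate the messaging pattern referenced in the qualifications above, here is a minimal sketch of publishing an event to a Kafka topic with the kafka-python client. The broker address, topic name, and event fields are hypothetical placeholders, and Kafka is only one of several messaging options such a role might use:

```python
import json

from kafka import KafkaProducer

# Hypothetical broker and topic; replace with real cluster details.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publishing meter readings as events keeps producers and downstream
# consumers (analytics, billing, monitoring) loosely coupled.
event = {"meter_id": "MTR-0001", "kwh": 12.4, "recorded_at": "2024-01-01T00:00:00Z"}
producer.send("meter-readings", value=event)
producer.flush()
```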

What We Offer:

1.   Opportunities for professional growth and advancement.

2.   A dynamic and innovative work environment with a strong focus on collaboration and continuous learning.

3.   The chance to work on cutting-edge projects, making a significant impact on the company's data strategy and operations.

This position offers an exciting opportunity for a seasoned Data and Integration Architect to play a key role in shaping the future of our data and integration strategies. If you are passionate about leveraging data to drive business success and thrive in a dynamic and collaborative environment, we encourage you to apply.



Data Integration Engineer

Chennai 600031, Tamil Nadu Kavi Software Technologies Private Limited

Posted 4 days ago


Job Description

Permanent
We’re seeking a skilled Data Integration Engineer with 10+ years of experience to design, develop, and maintain robust data pipelines and integration solutions. The ideal candidate will have hands-on experience with Informatica, ETL processes, and cloud data platforms.

Key Responsibilities:

·    Develop and maintain ETL workflows using Informatica.

·    Design and implement data pipelines for ingestion, transformation, and loading.

·    Work with SQL and Python to manipulate and analyse data.

·    Integrate data across various systems and platforms, including GCP and BigQuery.

·    Ensure data quality, consistency, and security across all integrations.

·    Collaborate with data architects, analysts, and business stakeholders.

Required Skills:

·    Strong experience with Informatica and ETL development.

·    Proficiency in Python and SQL.

·    Hands-on experience with Google Cloud Platform (GCP) and BigQuery (see the sketch following this list).

·    Solid understanding of data integration best practices and performance optimization.
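
As a rough illustration of the GCP/BigQuery requirement above, here is a minimal sketch using the google-cloud-bigquery client library. It assumes application-default credentials are configured, and the project, dataset, table, and column names are hypothetical placeholders:

```python
from google.cloud import bigquery

# Assumes application-default credentials (e.g. via `gcloud auth application-default login`).
client = bigquery.Client()

query = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM `my-project.sales.orders`        -- hypothetical table
    WHERE order_date >= @start_date
    GROUP BY customer_id
    ORDER BY total_spend DESC
    LIMIT 10
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
    ]
)

# Run the parameterized query and print the aggregated rows.
for row in client.query(query, job_config=job_config).result():
    print(row.customer_id, row.total_spend)
```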


SAP Data Integration

Karnataka Awign Expert

Posted 12 days ago


Job Description

This is a remote position.

Duration: 6 months
Location: Remote
Timings: Full Time (as per company timings)
Notice Period: Immediate joiners only
Experience: 6-9 Years

We seek a Senior Data Integration Developer with deep expertise in SAP Data Intelligence to support a large-scale enterprise data program. You will be responsible for designing, building, and optimizing SAP DI pipelines for data ingestion, transformation, and integration across multiple systems.

Key Responsibilities:

  • Design, develop, and deploy data integration pipelines in SAP Data Intelligence.
  • Integrate SAP and non-SAP data sources, ensuring scalability and performance.
  • Implement data quality checks, metadata management, and monitoring.
  • Collaborate with MDM teams, functional consultants, and business analysts to meet integration requirements.
  • Troubleshoot issues and optimize workflows for efficiency.
  • Prepare technical documentation and handover materials.

Required Skills:

  • 6+ years of data integration experience, with at least 3 years in SAP Data Intelligence.
  • Strong skills in SAP DI Graphs, Operators, and connectivity with SAP HANA, S/4HANA, and cloud platforms.
  • Experience with data transformation, cleansing, and enrichment processes.
  • Proficiency in Python, SQL, and integration protocols (REST, OData, JDBC); see the sketch following this list.
  • Strong problem-solving and debugging skills.
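
To make the integration-protocol requirement concrete, here is a minimal sketch of pulling records from an OData service over REST in Python. The endpoint, entity set, and credentials are hypothetical placeholders, and it uses the generic requests library rather than any SAP Data Intelligence operator:

```python
import requests

# Hypothetical OData service and entity set; replace with real values.
BASE_URL = "https://example.com/odata/v2"
ENTITY_SET = "SalesOrders"


def fetch_odata_entities(base_url, entity_set, top=100):
    """Fetch up to `top` records from an OData entity set as JSON."""
    response = requests.get(
        f"{base_url}/{entity_set}",
        params={"$top": top, "$format": "json"},
        auth=("api_user", "api_password"),  # placeholder credentials
        timeout=30,
    )
    response.raise_for_status()
    payload = response.json()
    # OData v2 wraps results in {"d": {"results": [...]}}; v4 uses {"value": [...]}.
    return payload.get("d", {}).get("results", payload.get("value", []))


if __name__ == "__main__":
    for record in fetch_odata_entities(BASE_URL, ENTITY_SET, top=10):
        print(record)
```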

Data Engineer-Data Integration

Navi Mumbai, Maharashtra IBM

Posted today


Job Description

**Introduction**
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
**Your role and responsibilities**
As Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing.
Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
* Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
* Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements
* Work in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors.
* Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results
Your primary responsibilities include:
* Develop & maintain data pipelines for batch & stream processing using Informatica PowerCenter or cloud ETL/ELT tools.
* Liaise with business team and technical leads, gather requirements, identify data sources, identify data quality issues, design target data structures, develop pipelines and data processing routines, perform unit testing and support UAT.
* Work with data scientists and the business analytics team to assist in data ingestion and data-related technical issues.
**Required technical and professional expertise**
* Expertise in Data Warehousing / Information Management / Data Integration / Business Intelligence using the ETL tool Informatica PowerCenter
* Knowledge of cloud platforms, Power BI, and cloud data migration.
* Experience in Unix shell scripting and Python
* Experience with relational SQL, Big Data, etc.
**Preferred technical and professional experience**
* Knowledge of MS-Azure Cloud
* Experience in Informatica PowerCenter
* Experience in Unix shell scripting and Python
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.

Data Engineer-Data Integration

Pune, Maharashtra IBM

Posted today


Job Description

**Introduction**
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
**Your role and responsibilities**
* As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the client's needs.
* Your primary responsibilities include:
* Design, build, optimize and support new and existing data models and ETL processes based on our client's business requirements
* Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data driven organization.
* Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.
**Required technical and professional expertise**
* Design, develop, and maintain Ab Initio graphs for extracting, transforming, and loading (ETL) data from diverse sources to various target systems.
* Implement data quality and validation processes within Ab Initio, including data modelling and analysis.
* Collaborate with data architects and business analysts to understand data requirements and translate them into effective ETL processes.
* Analyse and model data to ensure optimal ETL design and performance.
* Ab Initio Components: Utilize Ab Initio components such as Transform Functions, Rollup, Join, Normalize, and others to build scalable and efficient data integration solutions. Implement best practices for reusable Ab Initio components.
**Preferred technical and professional experience**
* Optimize Ab Initio graphs for performance, ensuring efficient data processing and minimal resource utilization. Conduct performance tuning and troubleshooting as needed.
* Work closely with cross-functional teams, including data analysts, database administrators, and quality assurance, to ensure seamless integration of ETL processes.
* Participate in design reviews and provide technical expertise to enhance overall solution quality and documentation.
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.

Data Engineer-Data Integration

Navi Mumbai, Maharashtra IBM

Posted 1 day ago


Job Description

**Introduction**
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
**Your role and responsibilities**
As Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing.
Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
* Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
* Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements
* Work in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors.
* Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results
Your primary responsibilities include:
* Develop & maintain data pipelines for batch & stream processing using Informatica PowerCenter or cloud ETL/ELT tools.
* Liaise with business team and technical leads, gather requirements, identify data sources, identify data quality issues, design target data structures, develop pipelines and data processing routines, perform unit testing and support UAT.
* Work with data scientists and the business analytics team to assist in data ingestion and data-related technical issues.
**Required technical and professional expertise**
* Expertise in Data Warehousing / Information Management / Data Integration / Business Intelligence using the ETL tool Informatica PowerCenter
* Knowledge of cloud platforms, Power BI, and cloud data migration.
* Experience in Unix shell scripting and Python
* Experience with relational SQL, Big Data, etc.
**Preferred technical and professional experience**
* Knowledge of MS-Azure Cloud
* Experience in Informatica PowerCenter
* Experience in Unix shell scripting and Python
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.

Data Engineer-Data Integration

Navi Mumbai, Maharashtra IBM

Posted 2 days ago


Job Description

**Introduction**
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
**Your role and responsibilities**
* As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the client's needs.
* Your primary responsibilities include:
* Design, build, optimize and support new and existing data models and ETL processes based on our client's business requirements.
* Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data driven organization.
* Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.
**Required technical and professional expertise**
* Design, develop, and maintain Ab Initio graphs for extracting, transforming, and loading (ETL) data from diverse sources to various target systems.
* Implement data quality and validation processes within Ab Initio, including data modeling and analysis.
* Collaborate with data architects and business analysts to understand data requirements and translate them into effective ETL processes.
* Analyze and model data to ensure optimal ETL design and performance.
* Ab Initio Components: Utilize Ab Initio components such as Transform Functions, Rollup, Join, Normalize, and others to build scalable and efficient data integration solutions. Implement best practices for reusable Ab Initio components
**Preferred technical and professional experience**
* Optimize Ab Initio graphs for performance, ensuring efficient data processing and minimal resource utilization. Conduct performance tuning and troubleshooting as needed.
* Work closely with cross-functional teams, including data analysts, database administrators, and quality assurance, to ensure seamless integration of ETL processes.
* Participate in design reviews and provide technical expertise to enhance overall solution quality and documentation.
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.

Data Engineer-Data Integration

Kochi, Kerala IBM

Posted 2 days ago


Job Description

**Introduction**
A career in IBM Consulting is rooted by long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio
**Your role and responsibilities**
* As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the client's needs.
* Your primary responsibilities include:
* Design, build, optimize and support new and existing data models and ETL processes based on our client's business requirements
* Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data driven organization.
* Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need.
**Required technical and professional expertise**
* Design, develop, and maintain Ab Initio graphs for extracting, transforming, and loading (ETL) data from diverse sources to various target systems.
* Implement data quality and validation processes within Ab Initio, including data modelling and analysis.
* Collaborate with data architects and business analysts to understand data requirements and translate them into effective ETL processes.
* Analyse and model data to ensure optimal ETL design and performance.
* Ab Initio Components: Utilize Ab Initio components such as Transform Functions, Rollup, Join, Normalize, and others to build scalable and efficient data integration solutions. Implement best practices for reusable Ab Initio components.
**Preferred technical and professional experience**
* Optimize Ab Initio graphs for performance, ensuring efficient data processing and minimal resource utilization. Conduct performance tuning and troubleshooting as needed.
* Work closely with cross-functional teams, including data analysts, database administrators, and quality assurance, to ensure seamless integration of ETL processes.
* Participate in design reviews and provide technical expertise to enhance overall solution quality and documentation.
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.