25 Data Engineer jobs in Kochi

Data Engineer

Kochi, Kerala Exult Global

Posted 3 days ago

Job Description

Role: Data Engineer

Experience: 3-5 yrs

Notice Period: Immediate

Work Mode: Hybrid


Requirements:

• Proficient in Python (including popular packages such as Pandas and NumPy) and SQL

• Strong background in distributed data processing and storage (e.g. Apache Spark, Hadoop)

• Large-scale (TBs of data) data engineering skills: data modelling and building production-ready ETL pipelines

• Development experience with at least one cloud (Azure highly preferred; AWS or GCP also considered)

• Knowledge of data lake and data lakehouse patterns

• Knowledge of ETL performance tuning and cost optimization

• Knowledge of data structures, algorithms, and good software engineering practices


Nice to Have:

• Experience with Azure Databricks

• Knowledge of DevOps for ETL pipeline orchestration (tools: GitHub Actions, Terraform, Databricks Workflows, Azure DevOps)

• Certifications (Databricks, Azure, AWS, GCP) are highly preferred

• Knowledge of code version control (e.g. Git)
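
For context, the requirements above centre on Python (Pandas, NumPy), SQL and Spark-based ETL over terabyte-scale data. Below is a minimal sketch of what such a pipeline commonly looks like; the dataset, columns and paths are hypothetical and are not taken from the posting.

```python
# Minimal PySpark ETL sketch. Dataset, columns and paths are hypothetical
# placeholders; the posting does not specify a concrete pipeline.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("sales-etl-sketch")
    .getOrCreate()
)

# Extract: read raw CSV landed by an upstream process (path is illustrative).
raw = spark.read.option("header", True).csv("/landing/sales_raw/")

# Transform: type casting, basic cleansing and a simple dimensional-style model.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .dropna(subset=["order_id", "order_date"])
       .dropDuplicates(["order_id"])
)

daily_revenue = (
    clean.groupBy("order_date", "region")
         .agg(F.sum("amount").alias("revenue"),
              F.countDistinct("order_id").alias("orders"))
)

# Load: write partitioned Parquet so downstream readers can prune by date.
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet("/curated/daily_revenue/")
```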


Data Engineer-Data Integration

Kochi, Kerala IBM

Posted 1 day ago

Job Description

**Introduction**
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio.
**Your role and responsibilities**
* As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating source-to-target pipelines/workflows and implementing solutions that address the client's needs.
* Your primary responsibilities include:
* Design, build, optimize and support new and existing data models and ETL processes based on our client's business requirements
* Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization.
* Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need it.
**Required technical and professional expertise**
* Design, develop, and maintain Ab Initio graphs for extracting, transforming, and loading (ETL) data from diverse sources to various target systems.
* Implement data quality and validation processes within Ab Initio.
* Data modelling and analysis: collaborate with data architects and business analysts to understand data requirements and translate them into effective ETL processes.
* Analyse and model data to ensure optimal ETL design and performance.
* Ab Initio components: utilize components such as Transform Functions, Rollup, Join, Normalize, and others to build scalable and efficient data integration solutions, and implement best practices for reusable Ab Initio components.
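
Ab Initio is a proprietary, graph-based ETL tool, so its components cannot be shown directly here. Purely as a loose analogy, the PySpark sketch below mirrors the three components named above: Rollup as a grouped aggregation, Join as a keyed join, and Normalize as exploding a repeating field into rows. All tables and columns are invented for illustration, and nothing below is Ab Initio's own API.

```python
# Loose PySpark analogy of the Ab Initio components named in the posting.
# Ab Initio itself is graph-based and proprietary; nothing here is its API.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("abinitio-analogy").getOrCreate()

orders = spark.createDataFrame(
    [(1, "C01", "widget;gadget", 120.0), (2, "C02", "widget", 40.0)],
    ["order_id", "customer_id", "items", "amount"],
)
customers = spark.createDataFrame(
    [("C01", "Alice"), ("C02", "Bob")],
    ["customer_id", "name"],
)

# "Rollup": aggregate order amounts per customer.
rollup = orders.groupBy("customer_id").agg(F.sum("amount").alias("total_amount"))

# "Join": enrich the rollup with customer attributes on a key.
joined = rollup.join(customers, on="customer_id", how="left")

# "Normalize": turn the repeating items field into one row per item.
normalized = orders.withColumn("item", F.explode(F.split("items", ";")))

joined.show()
normalized.select("order_id", "item").show()
```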
**Preferred technical and professional experience**
* Optimize Ab Initio graphs for performance, ensuring efficient data processing and minimal resource utilization. Conduct performance tuning and troubleshooting as needed.
* Work closely with cross-functional teams, including data analysts, database administrators, and quality assurance, to ensure seamless integration of ETL processes.
* Participate in design reviews and provide technical expertise to enhance overall solution quality. Create and maintain documentation.
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.

Data Engineer-Data Warehouse

Kochi, Kerala IBM

Posted 1 day ago

Job Description

**Introduction**
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
**Your role and responsibilities**
* Minimum 3 years of experience developing application programs that implement ETL workflows, creating ETL jobs and data models in datamarts using Snowflake, DBT, Unix and SQL technologies.
* Redesign Control-M batch processing so that ETL job builds run efficiently in production.
* Study existing systems to evaluate their effectiveness and develop new systems to improve efficiency and workflow.
* Responsibilities:
* Perform requirements identification; conduct business program analysis, testing, and system enhancements while providing production support.
* The developer should have a good understanding of working in an Agile environment and of JIRA and SharePoint tools. Good written and verbal communication skills are a must, as the candidate is expected to work directly with client counterparts.
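
For orientation only: a Snowflake-backed datamart load like the one described above is often driven from Python using the snowflake-connector-python package. The sketch below runs a single illustrative MERGE; the account settings, warehouse, schema and table names are placeholders, and the posting does not prescribe this approach.

```python
# Illustrative only: running a datamart load statement against Snowflake
# from Python. Credentials, objects and the MERGE statement are placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="DATAMART",
)
try:
    cur = conn.cursor()
    # A typical incremental-load pattern: merge staged rows into a target table.
    cur.execute(
        """
        MERGE INTO dim_customer t
        USING stg_customer s ON t.customer_id = s.customer_id
        WHEN MATCHED THEN UPDATE SET t.name = s.name
        WHEN NOT MATCHED THEN INSERT (customer_id, name) VALUES (s.customer_id, s.name)
        """
    )
    print("rows affected:", cur.rowcount)
finally:
    conn.close()
```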
**Required technical and professional expertise**
* An intuitive individual with an ability to manage change and proven time management skills
* Proven interpersonal skills, contributing to team efforts by accomplishing related results as needed
* Maintains up-to-date technical knowledge by attending educational workshops and reviewing publications
**Preferred technical and professional experience**
* Responsible for developing triggers, functions, and stored procedures to support this effort
* Assist with impact analysis of changing upstream processes to Data Warehouse and Reporting systems. Assist with design, testing, support, and debugging of new and existing ETL and reporting processes.
* Perform data profiling and analysis using a variety of tools. Troubleshoot and support production processes. Create and maintain documentation.
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.

Data Engineer - Kochi

682001 Kochi, Kerala ₹800,000 - ₹1,100,000 per annum VGreenTEK

Posted 27 days ago

Job Description

Permanent

Job Role: Data Engineer

Experience: 4 to 5 years

Job Location: Kochi, Kerala

Job Type: Full-time, Permanent

Notice: Immediate/15 days or currently serving (LWD within 30 days)

Candidates from Kerala will be given first priority.

We are seeking an experienced and driven Data Engineer with 4-5 years of hands-on experience in building scalable data infrastructure and systems. You will play a key role in designing and developing robust, high-performance ETL pipelines and managing large-scale datasets to support critical business functions. This role requires deep technical expertise, strong problem-solving skills, and the ability to thrive in a fast-paced, evolving environment.

Key Responsibilities


1) Design, develop, and maintain scalable and reliable ETL/ELT pipelines for processing large volumes of data (terabytes and beyond).
2) Model and structure data for performance, scalability, and usability.
3) Work with cloud infrastructure (preferably Azure) to build and optimize data workflows.
4) Build and manage data lake/lakehouse architectures in alignment with best practices.
5) Optimize ETL performance and manage cost-effective data operations.
6) Collaborate closely with cross-functional teams including data science, analytics, and software engineering.
7) Ensure data quality, integrity, and security across all stages of the data lifecycle.

Required Skills & Qualifications:


1) 4 to 5 years of relevant experience in data engineering.
2) Advanced proficiency in Python, including libraries such as Pandas and NumPy.
3) Strong skills in SQL for complex data manipulation and analysis.
4) Hands-on experience with Apache Spark, Hadoop, or similar distributed systems.
5) Proven track record of handling large-scale datasets (TBs) in production environments.
6) Cloud development experience with Azure (preferred), AWS, or GCP.
7) Solid understanding of data lake and data lakehouse architectures.
8) Expertise in ETL performance tuning and cost optimization techniques.
9) Knowledge of data structures, algorithms, and modern software engineering practices.
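
The required skills above include ETL performance tuning and cost optimization. As a hedged illustration of what is usually meant, the PySpark snippet below applies partition pruning, a broadcast join for a small dimension table, and coalesced partitioned output; dataset names and paths are invented, not taken from the posting.

```python
# Illustrative performance-tuning patterns only; tables and paths are invented.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("etl-tuning-sketch").getOrCreate()

# Read only the partitions needed: filtering on a partition column lets
# Spark prune directories instead of scanning the whole lake.
events = (
    spark.read.parquet("/lake/events/")
         .filter(F.col("event_date") == "2024-01-01")
)

# Broadcast the small dimension so the join avoids a full shuffle.
countries = spark.read.parquet("/lake/dim_country/")
enriched = events.join(broadcast(countries), on="country_code", how="left")

# Reduce small-file overhead (and downstream cost) before writing.
(
    enriched.coalesce(32)
            .write.mode("overwrite")
            .partitionBy("event_date")
            .parquet("/curated/events_enriched/")
)
```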

Soft Skills:


1) Strong communication skills with the ability to explain complex technical concepts clearly and concisely.
2) Self-starter who learns quickly and takes ownership.
3) High attention to detail with a strong sense of data quality and reliability.
4) Comfortable working in an agile, fast-changing environment with incomplete requirements.

Preferred Qualifications:


1) Experience with tools like Azure Data Factory or similar.
2) Familiarity with CI/CD and DevOps in the context of data engineering.
3) Knowledge of data governance, cataloging, and access control principles.


Data Engineer-Data Platforms-AWS

Kochi, Kerala IBM

Posted 1 day ago

Job Description

**Introduction**
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
**Your role and responsibilities**
As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.
* Responsibilities:
* Experienced in building data pipelines to ingest, process, and transform data from files, streams and databases, processing the data with Spark, Python, PySpark, Scala, and Hive, with HBase or other NoSQL databases, on Cloud Data Platforms (AWS) or HDFS
* Experienced in developing efficient software code for multiple use cases built on the platform, leveraging the Spark framework with Python or Scala and Big Data technologies
* Experience in developing streaming pipelines
* Experience working with Hadoop/AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka
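
The responsibilities above mention streaming pipelines built with Spark and Kafka on AWS. Below is a minimal Spark Structured Streaming sketch; the broker address, topic name, message schema and S3 locations are illustrative assumptions, not details from the posting.

```python
# Minimal Spark Structured Streaming sketch; broker, topic, schema and S3
# paths are illustrative assumptions, not details from the posting.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
])

# Ingest: subscribe to a Kafka topic (requires the spark-sql-kafka package).
raw = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "orders")
         .load()
)

# Transform: Kafka values arrive as bytes; parse the JSON payload.
orders = (
    raw.selectExpr("CAST(value AS STRING) AS json")
       .select(F.from_json("json", schema).alias("o"))
       .select("o.*")
)

# Load: append parsed records to S3 as Parquet, with checkpointing.
query = (
    orders.writeStream.format("parquet")
          .option("path", "s3a://example-bucket/curated/orders/")
          .option("checkpointLocation", "s3a://example-bucket/checkpoints/orders/")
          .outputMode("append")
          .start()
)
query.awaitTermination()  # blocks until the streaming query is stopped
```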
**Required technical and professional expertise**
* A total of 3-5+ years of experience in Data Management (DW, DL, Data Platform, Lakehouse) and data engineering
* Minimum 4+ years of experience in Big Data technologies, with extensive data engineering experience in Spark with Python or Scala
* Minimum 3 years of experience on Cloud Data Platforms on AWS
* Experience in AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB
* Good to excellent SQL skills
**Preferred technical and professional experience**
* Certification in AWS and Databricks, or Cloudera Certified Spark Developer
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.

Data Engineer-Data Platforms-Azure

Kochi, Kerala IBM

Posted 1 day ago

Job Description

**Introduction**
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
**Your role and responsibilities**
As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform.
* Experienced in building data pipelines to ingest, process, and transform data from files, streams and databases, processing the data with Spark, Python, PySpark and Hive, with HBase or other NoSQL databases, on the Azure Cloud Data Platform or HDFS
* Experienced in developing efficient software code for multiple use cases built on the platform, leveraging the Spark framework with Python or Scala and Big Data technologies
* Experience in developing streaming pipelines
* Experience working with Hadoop/Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka
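
For context, Azure-side pipelines of the kind described above typically read raw data from ADLS Gen2 and register curated output as queryable tables. The sketch below assumes a hypothetical storage account, container, dataset and table, and assumes cluster-level authentication is already configured; it is an illustration, not this team's actual setup.

```python
# Illustrative ADLS Gen2 read/write; storage account, container, table and
# authentication setup are assumptions, not details from the posting.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("adls-ingest-sketch")
    .enableHiveSupport()
    .getOrCreate()
)

# ADLS Gen2 paths use the abfss:// scheme; credentials (e.g. a service
# principal) are assumed to be configured on the cluster already.
source_path = "abfss://raw@examplestorage.dfs.core.windows.net/clicks/2024/"

clicks = spark.read.json(source_path)

# Aggregate raw click events into a daily per-user summary.
sessions = (
    clicks.withColumn("event_ts", F.to_timestamp("event_ts"))
          .groupBy("user_id", F.to_date("event_ts").alias("event_date"))
          .agg(F.count(F.lit(1)).alias("events"))
)

# Persist as a managed table (assumes a "curated" database exists in the
# metastore) so analysts can query it with SQL.
sessions.write.mode("overwrite").saveAsTable("curated.user_daily_events")
```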
**Required technical and professional expertise**
* A total of 5-8 years of experience in Data Management (DW, DL, Data Platform, Lakehouse) and data engineering
* Minimum 4+ years of experience in Big Data technologies, with extensive data engineering experience in Spark with Python or Scala
* Minimum 3 years of experience on Cloud Data Platforms on Azure
* Experience in Databricks / Azure HDInsight / Azure Data Factory, Synapse, SQL Server DB
* Good to excellent SQL skills
**Preferred technical and professional experience**
* Certification in Azure and Databricks, or Cloudera Certified Spark Developer
* Experience in Databricks / Azure HDInsight / Azure Data Factory, Synapse, SQL Server DB
* Knowledge or experience of Snowflake will be an added advantage
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.

Data Engineer-Data Platforms-Azure

Kochi, Kerala IBM

Posted 1 day ago

Job Description

**Introduction**
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
**Your role and responsibilities**
* Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field.
* 7+ years of total experience in data engineering projects and 4+ years of relevant experience with Azure technology services and Python
* Azure: Azure Data Factory, ADLS (Azure Data Lake Store), Azure Databricks
* Mandatory programming languages: PySpark, PL/SQL, Spark SQL
* Database: SQL DB
* Experience with Azure: ADLS, Databricks, Stream Analytics, SQL DW, Cosmos DB, Analysis Services, Azure Functions, serverless architecture, ARM templates
* Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
* Experience with object-oriented/functional scripting languages: Python, SQL, Scala, Spark SQL, etc.
* Data warehousing experience with strong domain knowledge
**Required technical and professional expertise**
* An intuitive individual with an ability to manage change and proven time management skills
* Proven interpersonal skills, contributing to team efforts by accomplishing related results as needed
* Maintains up-to-date technical knowledge by attending educational workshops and reviewing publications
**Preferred technical and professional experience**
* Experience with Azure: ADLS, Databricks, Stream Analytics, SQL DW, Cosmos DB, Analysis Services, Azure Functions, serverless architecture, ARM templates
* Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
* Experience with object-oriented/functional scripting languages: Python, SQL, Scala, Spark SQL, etc.
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.