20,925 ETL jobs in India

ETL Data Engineer

New Delhi, Delhi The Techgalore

Posted today


Job Description

Please rate the candidate (from 1 to 5, where 1 is lowest and 5 is highest) in these areas:

  1. Big Data
  2. PySpark
  3. AWS
  4. Redshift

Position Summary

Experienced ETL Developers and Data Engineers are needed to ingest and analyze data from multiple enterprise sources into Adobe Experience Platform.

Requirements

  • About 4-6 years of professional technology experience, mostly focused on the following:
  • 4+ years of experience developing data ingestion pipelines using PySpark (batch and streaming).
  • 4+ years of experience with multiple data-engineering services on AWS, e.g. Glue, Athena, DynamoDB, Kinesis, Kafka, Lambda, Redshift, etc.
  • 1+ years of experience working with Redshift, especially the following:

o Experience and knowledge of loading data from various sources, e.g. S3 buckets and on-prem data sources, into Redshift.

o Experience optimizing data ingestion into Redshift.

o Experience designing, developing and optimizing queries on Redshift using SQL or PySpark SQL.

o Experience designing tables in Redshift (distribution keys, compression, vacuuming, etc.).

Experience developing applications that consume services exposed as REST APIs. Experience and ability to write and analyze complex and performant SQL.
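To illustrate the Redshift table-design points above (distribution keys, sort keys, column compression), here is a minimal sketch in plain Python that renders the kind of DDL involved; the table name, columns and encodings are hypothetical examples, not part of the posting:

```python
# Illustrative sketch only: renders Redshift-style CREATE TABLE DDL from a
# column spec. Table, column names and encodings are made-up examples.

def render_redshift_ddl(table, columns, distkey, sortkeys):
    """Build a CREATE TABLE statement with a distribution key and sort keys."""
    col_lines = [f"    {name} {ctype} ENCODE {encoding}"
                 for name, ctype, encoding in columns]
    body = ",\n".join(col_lines)
    return (
        f"CREATE TABLE {table} (\n{body}\n)\n"
        f"DISTSTYLE KEY\nDISTKEY ({distkey})\n"
        f"SORTKEY ({', '.join(sortkeys)});"
    )

ddl = render_redshift_ddl(
    "web_events",
    [("event_id", "BIGINT", "az64"),
     ("user_id", "BIGINT", "az64"),
     ("event_ts", "TIMESTAMP", "az64"),
     ("payload", "VARCHAR(4096)", "zstd")],
    distkey="user_id",       # co-locate a user's rows on one slice
    sortkeys=["event_ts"],   # range-restricted scans on time
)
print(ddl)
```

Picking the join/filter column as DISTKEY and the time column as SORTKEY is the usual starting point the bullet above alludes to; real choices depend on query patterns.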

Special Consideration given for

  • 2 years of developing and supporting ETL pipelines using enterprise-grade ETL tools like Pentaho, Informatica or Talend
  • Good knowledge of data modelling (design patterns and best practices)
  • Experience with reporting technologies (e.g. Tableau, Power BI)

What you'll do

Analyze and understand customers' use cases and data sources; extract, transform and load data from a multitude of customers' enterprise sources and ingest it into Adobe Experience Platform.

Design and build data ingestion pipelines into the platform using PySpark

Ensure ingestion is designed and implemented in a performant manner to support the throughput and latency needed.

Develop and test complex SQLs to extract, analyze and report on the data ingested into the Adobe Experience Platform.

Ensure the SQLs are implemented in compliance with best practices so they are performant.

Migrate platform configurations, including the data ingestion pipelines and SQL, across various sandboxes.

Debug any issues reported on data ingestion, SQL or any other functionalities of the platform and resolve the issues.

Support Data Architects in implementing the data model in the platform.

Contribute to the innovation charter and develop intellectual property for the organization.

Present on advanced features and complex use case implementations at multiple forums.

Attend regular scrum events or equivalent and provide updates on the deliverables.

Work independently across multiple engagements with no or minimal supervision.





StatusNeo - Data Engineer (ETL)

Mumbai, Maharashtra Nexthire

Posted today


Job Description

Role: ETL Developer
Location: Mumbai
Experience: 3-5 Years

Skills - ETL, BDM, Informatica, Data Integrator

Role Overview:
We are seeking a skilled ETL Developer with experience in Informatica, Big Data Management (BDM), and Data Integrator. The ideal candidate will have a strong background in data extraction, transformation, and loading (ETL) processes, with a focus on optimizing data integration solutions for complex data environments. You will play a critical role in designing and implementing ETL workflows to support our business intelligence and data warehousing initiatives.

Key Responsibilities:

  • Design, develop, and maintain ETL processes using Informatica, BDM, and Data Integrator.
  • Collaborate with data architects and business analysts to understand data requirements and translate them into ETL solutions.
  • Optimize ETL processes for performance, scalability, and reliability.
  • Conduct data quality assessments and implement data cleansing procedures.
  • Monitor and troubleshoot ETL processes to ensure timely and accurate data integration.
  • Work with large datasets across multiple data sources, including structured and unstructured data.
  • Document ETL processes, data flows, and mappings to ensure clarity and consistency.
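The data-quality responsibilities above can be sketched in plain Python. This is only an illustration of the kinds of checks such a role performs (the posting's actual tooling is Informatica/BDM); the field names, threshold and sample records are hypothetical:

```python
# Illustrative data-quality sketch: null-rate profiling and deduplication
# over a batch of records. All names and sample data are made up.

def null_rates(rows, fields):
    """Fraction of missing values per field across a batch of records."""
    total = len(rows)
    return {f: sum(1 for r in rows if r.get(f) in (None, "")) / total
            for f in fields}

def dedupe(rows, key):
    """Keep the first record seen for each business key."""
    seen, out = set(), []
    for r in rows:
        if r[key] not in seen:
            seen.add(r[key])
            out.append(r)
    return out

batch = [
    {"id": 1, "email": "a@x.com"},
    {"id": 1, "email": "a@x.com"},   # duplicate business key
    {"id": 2, "email": None},        # missing email
]
rates = null_rates(batch, ["id", "email"])  # {"id": 0.0, "email": 0.333...}
clean = dedupe(batch, key="id")             # keeps two records
```

In an Informatica/BDM workflow the same checks would typically be expressed as mappings and data-quality rules rather than hand-written code.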

Required Skills:

  • 3-5 years of experience in ETL development with a strong focus on Informatica, BDM, and Data Integrator.
  • Proficiency in SQL and database technologies (e.g., Oracle, SQL Server, MySQL).
  • Experience with big data technologies and frameworks.
  • Strong analytical and problem-solving skills.
  • Familiarity with data warehousing concepts and best practices.
  • Excellent communication and collaboration skills.

About StatusNeo:

We accelerate your business transformation by leveraging best fit CLOUD NATIVE technologies wherever feasible. We are DIGITAL consultants who partner with you to solve & deliver. We are experts in CLOUD NATIVE TECHNOLOGY CONSULTING & SOLUTIONS. We build, maintain & monitor highly scalable, modular applications that leverage elastic compute, storage and network of leading cloud platforms. We CODE your NEO transformations. #StatusNeo

Business domain experience is vital to the success of neo transformations empowered by digital technology. Experts in domain ask the right business questions to diagnose and address. Our consultants leverage your domain expertise & augment our digital excellence to build cutting edge cloud solutions.


Data ETL Engineer

Pune, Maharashtra IDT Corporation

Posted today


Job Description

IDT () is a communications and financial services company founded in 1990 and headquartered in New Jersey, US. Today it is an industry leader in prepaid communication and payment services and one of the world’s largest international voice carriers. We are listed on the NYSE, employ over 1500 people across 20+ countries, and have revenues in excess of $ billion. We are looking for a Mid-level Business Intelligence Engineer to join our global team. If you are highly intelligent, motivated, ambitious, and ready to learn and make a direct impact, this is your opportunity! The individual in this role will perform data analysis, ELT/ETL design and support functions to deliver on strategic initiatives to meet organizational goals across many lines of business. The interview process will be conducted in English.

Responsibilities:

  • Develop, document, and test ELT/ETL solutions using industry standard tools (Snowflake, Denodo Data Virtualization, Looker).
  • Recommend process improvements to increase efficiency and reliability in ELT/ETL development.
  • Extract data from multiple sources, integrate disparate data into a common data model, and load data into a target database, application, or file using efficient ELT/ETL processes.
  • Collaborate with Quality Assurance resources to debug ELT/ETL development and ensure the timely delivery of products.
  • Should be willing to explore and learn new technologies and concepts to provide the right kind of solution.
  • Target and result oriented with strong end user focus.
  • Effective oral and written communication skills with BI team and user community.
Requirements:

  • 5+ years of experience in ETL/ELT design and development, integrating data from heterogeneous OLTP systems and API solutions, and building scalable data warehouse solutions to support business intelligence and analytics.
  • Demonstrated experience in utilizing python for data engineering tasks, including transformation, advanced data manipulation, and large-scale data processing.
  • Experience in data analysis, root cause analysis and proven problem solving and analytical thinking capabilities.
  • Experience designing complex data pipelines extracting data from RDBMS, JSON, API and Flat file sources.
  • Demonstrated expertise in SQL and PLSQL programming, with advanced mastery in Business Intelligence and data warehouse methodologies, along with hands-on experience in one or more relational database systems and cloud-based database services such as Oracle, MySQL, Amazon RDS, Snowflake, Amazon Redshift, etc.
  • Proven ability to analyze and optimize poorly performing queries and ETL/ELT mappings, providing actionable recommendations for performance tuning.
  • Understanding of software engineering principles and skills working on Unix/Linux/Windows Operating systems, and experience with Agile methodologies.
  • Proficiency in version control systems, with experience in managing code repositories, branching, merging, and collaborating within a distributed development environment.
  • Excellent English communication skills.
  • Interest in business operations and comprehensive understanding of how robust BI systems drive corporate profitability by enabling data-driven decision-making and strategic insights. 
Pluses:

  • Experience in developing ETL/ELT processes within Snowflake and implementing complex data transformations using built-in functions and SQL capabilities.
  • Experience using Pentaho Data Integration (Kettle) / Ab Initio ETL tools for designing, developing, and optimizing data integration workflows. 
  • Experience designing and implementing cloud-based ETL solutions using Azure Data Factory, DBT, AWS Glue, Lambda and open-source tools.
  • Experience with reporting/visualization tools (e.g., Looker) and job scheduler software.
  • Experience in Telecom, eCommerce, International Mobile Top-up.
  • Education: BS/MS in computer science, Information Systems or a related technical field or equivalent industry expertise.
  • Preferred Certification: AWS Solution Architect, AWS Cloud Data Engineer, Snowflake SnowPro Core.
  • Please attach your CV in English. The interview process will be conducted in English. Only accepting applicants from India.
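As a small illustration of the requirement above about designing pipelines that extract from RDBMS, JSON, API and flat-file sources, the sketch below flattens a nested JSON payload into flat, warehouse-style columns; the payload and column names are hypothetical:

```python
# Hypothetical sketch: flattening a nested JSON API payload into a flat
# column/value row suitable for staging in a warehouse table.

def flatten(record, parent="", sep="_"):
    """Recursively flatten nested dicts into one flat dict of columns."""
    flat = {}
    for key, value in record.items():
        col = f"{parent}{sep}{key}" if parent else key
        if isinstance(value, dict):
            flat.update(flatten(value, col, sep))
        else:
            flat[col] = value
    return flat

payload = {"order_id": 42,
           "customer": {"id": 7, "region": "EMEA"},
           "total": 19.99}
row = flatten(payload)
# row: {"order_id": 42, "customer_id": 7, "customer_region": "EMEA", "total": 19.99}
```

In a Snowflake-based ELT flow this shaping step would more often be done in SQL (e.g. with semi-structured-data functions) after landing the raw JSON, but the transformation is the same idea.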

    ETL

    Pune, Maharashtra BugendaiTech

    Posted today


    Job Description

    **Job Id**: BTJ21003

    **Technology / Domain**: ETL

    **Role**: ETL Developer

    **Job description**:

    - 2+ years building, deploying, and maintaining end-to-end (data lake to visualization) ETL pipelines
    - High proficiency with SQL
    - Proficiency with Looker (or similar BI tool)
    - Experience with conceptual, logical, and physical data modeling
    - Proficiency in Python (or other OOP languages)
    - Experience with version control and deploying production code
    - Demonstrable experience querying and transforming data programmatically
    - Able to analyze data and critically examine results for patterns
    - Familiarity with dbt, Jenkins, Apache Airflow, AWS (S3, Lambda, EC2, IAM), and stats software packages (R, Python pandas, etc.) is a big plus.

    ETL

    Delhi, Delhi Nityo Infotech

    Posted today


    Job Description


    **Job Code**:
    JD-19626

    **JOB DESCRIPTION**: ETL Developer

    - Experience: 3 to 5 yrs
    - Shift time: 10 am to 7 pm (should be available as per our international/national client needs)
    - Budget: 18 to 25 LPA
    - No. of positions: 2
    - Location: Bangalore (WFO/Hybrid)
    - Notice period: Immediate joiner

    Responsibilities:

    - Independently plans, designs, develops, executes and monitors complex data integration activities to support project delivery and daily operation.
    - Expert in defining, implementing, debugging and optimizing data integration mappings and scripts from a variety of data sources.
    - Spearheads development of ETL code, metadata definitions and models, queries, scripts, schedules, work processes and maintenance procedures, and identifies opportunities to optimize the sizing, performance and efficiency of existing processes/procedures.
    - Mentors less experienced analysts on proper standards/techniques to improve their accuracy and efficiency.
    - Performs unit testing, system integration testing and regression testing, and assists with user acceptance testing.
    - Articulates business requirements in a technical solution that can be designed and engineered.
    - Consults with the business to develop documentation and communication materials to ensure accurate usage and interpretation of data.
    - Develops technical understanding of how data flows from various source systems and source types to fine-tune data integration solutions.
    - Works independently or as part of a team to deliver data warehouse ETL projects.
    - Adheres to established standards and best practices and provides input for improvement of those processes.
    - Self-motivated team player who can work with minimal supervision and can adapt to a quickly changing environment.

    **Experience Required**:
    3 - 5 Years

    **Industry Type**:
    IT

    **Employment Type**:
    Permanent

    **Location**:
    India

    **Roles & Responsibilities**:
    **Expertise & Qualification**:
    B.Tech

    Sr. Software Engineer - ETL (Extract, Transform, Load) Job

    Indore, Madhya Pradesh YASH Technologies

    Posted today


    Job Description

    YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation.

    At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world and it drives us beyond generational gaps and disruptions of the future.

    We are looking to hire ETL (Extract, Transform, Load) professionals in the following areas:

    Experience: 4-6 years

    Be involved in implementation and maintenance, and participate in data loads for the Microsoft Dynamics CRM platform. Coordinate with the data team on requirements and apply technical expertise in data clean-up, data profiling and data movement from Sybase to Dynamics CRM using ETL concepts and the SSIS tool. SQL, ETL concepts and data knowledge are desired to perform the activities of this role efficiently.

    Roles and responsibilities:

  • Experience in Sybase, SQL Server and ETL concepts for data clean-up, data quality and data load activities is a must.
  • Understanding of complex SQL queries (stored procedures, SQL functions, transformations, etc.) for troubleshooting and fixing data load issues is a must.
  • Understanding of the Microsoft Dynamics CRM application and the Azure cloud environment (Azure Blob, Azure Function Apps) is an added advantage and not primarily required.
  • Understanding of developed ETL SSIS jobs and Git environment concepts.
  • Should be collaborative and communicate with the team in understanding the prioritization of tasks for completion of project deliverables.

    At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence aided with technology for continuous learning, unlearning, and relearning at a rapid pace and scale.

    Our Hyperlearning workplace is grounded upon four principles:

  • Flexible work arrangements, free spirit, and emotional positivity
  • Agile self-determination, trust, transparency, and open collaboration
  • All support needed for the realization of business goals
  • Stable employment with a great atmosphere and ethical corporate culture


    Sr ETL Data Engineer - HL7

    BigRio

    Posted 1 day ago


    Job Description

    Job Title: Sr ETL Data Engineer - HL7

    Location: Remote – India (UK Shift)

    Type: Full-Time


    About BigRio:

    BigRio is a remote-based, technology consulting firm headquartered in Boston, MA. We specialize in delivering advanced software solutions that include custom development, cloud data platforms, AI/ML integrations, and data analytics. With a diverse portfolio of clients across industries such as healthcare, biotech, fintech, and more, BigRio offers the opportunity to work on cutting-edge projects with a team of top-tier professionals.


    About the Role:

    We are seeking a highly skilled and detail-oriented Sr Data ETL Engineer to join our team supporting a leading healthcare client. This is a remote, full-time opportunity based in India, aligned with UK business hours. The ideal candidate will have deep expertise in data pipelines, ETL and HL7-structured data, and familiarity with EMR and EHR systems like ModMed.


    Key Responsibilities:


    • Build and maintain robust ETL pipelines to ingest and transform clinical and operational data.
    • Integrate data from various healthcare sources using HL7, ADT, SUI, and Formsite-based inputs.
    • Ensure accuracy, integrity, and security of sensitive healthcare data.
    • Collaborate with application developers and clinical teams to understand requirements and deliver scalable data solutions.
    • Provide data extracts and reports as needed, working closely with analytics and product teams.
    • Work independently and effectively in a remote, distributed team environment during UK hours.

    Required Skills:

    • 5+ years of experience in data engineering with strong proficiency in ETL and healthcare.
    • Proven expertise in building and maintaining ETL pipelines in a healthcare or regulated environment.
    • Deep understanding of healthcare data formats and protocols: HL7, ADT, SUI, Formsite, etc.
    • Working experience with EHR platforms, particularly ModMed or similar (e.g., Epic, Cerner).
    • Familiarity with data privacy standards and compliance (HIPAA or similar frameworks).
    • Comfortable working in agile environments and using tools like Jira and Confluence.
    • Excellent communication skills in English (verbal and written).
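The HL7/ADT skills above can be illustrated with a minimal sketch in plain Python: HL7 v2 messages are carriage-return-separated segments whose fields are pipe-delimited. A production pipeline would normally use a dedicated HL7 library, and the message content below is entirely made up:

```python
# Minimal HL7 v2 parsing sketch. Real pipelines use a proper HL7 library;
# this only shows the segment/field structure. The ADT message is fictional.

def parse_hl7(message):
    """Split an HL7 v2 message into {segment_id: [list of field lists]}."""
    segments = {}
    for raw in message.strip().split("\r"):   # segments end with carriage return
        fields = raw.split("|")               # fields are pipe-delimited
        segments.setdefault(fields[0], []).append(fields)
    return segments

adt = ("MSH|^~\\&|EHR|CLINIC|WAREHOUSE|DW|202401011200||ADT^A01|MSG0001|P|2.5\r"
       "PID|1||MRN12345||DOE^JANE||19800101|F\r"
       "PV1|1|I|WARD^101^A")
parsed = parse_hl7(adt)
# MSH-1 is the field separator itself, so MSH-9 (message type) lands at index 8.
msg_type = parsed["MSH"][0][8]      # "ADT^A01"
patient_id = parsed["PID"][0][3]    # "MRN12345"
```

An ADT^A01 (admit) event like this is the kind of input such a pipeline would normalize before loading into the warehouse.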

    Nice to Have:

    • Experience with cloud data services (AWS/GCP/Azure).
    • Familiarity with scripting languages like Python or Bash.
    • ModMed (preferred) or other EHR experience.
    • Understanding of database version control and CI/CD workflows.

    Shift Details:

    • This role follows UK business hours (approx. 1:00 PM to 10:00 PM IST).

    Flexibility for occasional overlap with US teams is a plus.




    Equal Opportunity Statement

    BigRio is an equal opportunity employer. We prohibit discrimination and harassment of any kind based on race, religion, national origin, sex, sexual orientation, gender identity, age, pregnancy, status as a qualified individual with disability, protected veteran status, or other protected characteristic as outlined by federal, state, or local laws. BigRio makes hiring decisions based solely on qualifications, merit, and business needs at the time. All qualified applicants will receive equal consideration for employment.


