4,152 Data Integration jobs in India

Data Integration

Chennai, Tamil Nadu · ₹900000 - ₹1200000 · Virtusa

Posted today


Job Description

  • Design and implement scalable data pipelines using SnapLogic, Fivetran, and DBT.
  • Develop and optimize data models and queries in Snowflake using SQL and Python.
  • Integrate data from various sources using REST/SOAP APIs with appropriate authentication methods.
  • Collaborate with cross-functional teams to understand data requirements and deliver high-quality solutions.
  • Maintain version control and CI/CD pipelines using GitHub and related tools.
  • Ensure data quality, integrity, and security across all integrations.
  • Troubleshoot and resolve data pipeline issues in a timely manner.
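
The duties above combine REST API extraction with Snowflake loading. As a rough illustration of that pattern (not Virtusa's actual stack or schema), a minimal Python sketch using a hypothetical endpoint, staging table, and environment variables:

```python
# Minimal sketch: pull records from a hypothetical REST endpoint and land them
# in a Snowflake staging table. Endpoint URL, table, and credential names are
# illustrative placeholders, not taken from the posting.
import os
import requests
import snowflake.connector

def fetch_orders(token: str) -> list[dict]:
    # Bearer-token authentication; an API-key header would work similarly.
    resp = requests.get(
        "https://api.example.com/v1/orders",
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    # Assumes each item carries order_id, amount, and updated_at fields.
    return resp.json()["items"]

def load_to_snowflake(rows: list[dict]) -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",
        database="RAW",
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        cur.executemany(
            "INSERT INTO stg_orders (order_id, amount, updated_at) "
            "VALUES (%(order_id)s, %(amount)s, %(updated_at)s)",
            rows,
        )
    finally:
        conn.close()

if __name__ == "__main__":
    load_to_snowflake(fetch_orders(os.environ["API_TOKEN"]))
```

In practice a tool such as SnapLogic or Fivetran would own the extraction and dbt would model the staged data; the sketch only shows the raw API-to-warehouse hop.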

Required Skills:

  • Strong experience with SnapLogic and Fivetran for data integration.
  • Proficiency in Snowflake, SQL, and Python for data engineering tasks.
  • Hands-on experience with DBT for data transformation and modeling.
  • Familiarity with REST/SOAP APIs and various authentication methods (OAuth, API keys, etc.).
  • Experience with CI/CD pipelines and GitHub for version control and deployment.
  • Excellent problem-solving and communication skills.

Preferred Qualifications:

  • Experience in cloud platforms such as AWS.
  • Knowledge of data governance and compliance best practices.
  • Prior experience in agile development environments.

About Virtusa

Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with us.

Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.

Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.


Data Integration

Noida, Uttar Pradesh · ₹1500000 - ₹2000000 · Crenovent

Posted today


Job Description

Job Description – Data Integration & Architecture Engineer

Location: Noida

Team: Platform Engineering – RevAi Pro

Type: Full-time

About Us

Crenovent Technologies is building RevAi Pro, a next-gen revenue operations platform that unifies data, AI agents, and SaaS workflows across industries (SaaS, Banking, Insurance, E-commerce, and Financial Services). Our platform uses Azure Fabric + PostgreSQL + pgvector to power near real-time revenue intelligence with deep compliance, governance, and multi-tenancy at its core.

As part of our early engineering team, you'll work on the foundational data architecture that powers everything from AI meeting assistants to compensation modelling.

The Role

We are looking for a Data Integration & Architecture Engineer who will design and operate the pipelines that unify CRM, billing, support, and SaaS app data into our Azure-native data fabric. You'll own the Bronze → Silver → Gold transformations, Fabric → Postgres egress flows, and tenant-level governance models.

This role sits at the intersection of data engineering and platform architecture — perfect for someone who enjoys solving complex data puzzles and wants to grow into a Lead Data Architect role.

What You'll Do

  • Build ingestion pipelines (CDC, batch, webhook, file-based) from CRMs (Salesforce, Dynamics, HubSpot), billing systems (Stripe, Zuora), and support tools (Zendesk, JSM) into Azure Fabric Bronze/Silver layers.
  • Curate standardized models in Fabric Silver → Gold using the Canonical RevAi Schema (CRS), including SCD2 handling, PII tagging, and business-rule conformance.
  • Operate low-latency egress pipelines from Fabric → Postgres (staging → ext_* → app consumption), ensuring sub-second query performance for AI agents.
  • Implement tenant isolation via RLS in Postgres and Purview policies in Fabric, ensuring strict multi-tenancy and compliance with residency rules (a minimal RLS sketch follows this list).
  • Own observability & SLAs: freshness checks, rowcount anomaly detection, and DQ tests across Bronze/Silver/Gold.
  • Contribute to FinOps dashboards by instrumenting pipeline cost metrics, cache hit rates, and egress efficiency.
  • Collaborate with AI engineers to ensure pgvector embeddings and agent run data integrate seamlessly into the operational data store.
  • Write runbooks for DR/retention/erasure, and actively participate in quarterly pen-test preparations.
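
On the tenant-isolation point above, Postgres row-level security keys every query to the current tenant via a policy. A minimal sketch, assuming hypothetical table, column, and setting names (none come from the posting), with psycopg2 on the Python side:

```python
# Minimal RLS sketch: the DDL would be applied once by a migration; application
# sessions then set the tenant before querying. Assumes the application role is
# not the table owner (RLS does not apply to owners/superusers by default).
import psycopg2

RLS_DDL = """
ALTER TABLE app.opportunities ENABLE ROW LEVEL SECURITY;
CREATE POLICY tenant_isolation ON app.opportunities
    USING (tenant_id = current_setting('app.current_tenant')::uuid);
"""

def rows_for_tenant(dsn: str, tenant_id: str) -> list:
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            # Scope this session to one tenant; the policy filters every read.
            cur.execute(
                "SELECT set_config('app.current_tenant', %s, false)",
                (tenant_id,),
            )
            cur.execute("SELECT id, stage, amount FROM app.opportunities")
            return cur.fetchall()
```

RLS only covers the Postgres side of the egress path; masked views and Purview policies would handle the Fabric side.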

What We're Looking For

Must-Have Skills:

  • 2–4 years in data engineering / integration roles.
  • Hands-on with Azure Data Factory, Azure Fabric (OneLake/Lakehouse), or equivalent (Databricks/Snowflake acceptable).
  • Strong SQL and PostgreSQL experience (MERGE, staging, RLS, partitioning); a staging MERGE sketch follows this list.
  • Experience with CDC, batch ingestion, or event streaming (Airflow, dbt, Kafka, or Fabric pipelines).
  • Comfort with data governance concepts (Purview, PII tagging, masked views, data lineage).
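
For the MERGE/staging item above, a typical pattern is to land raw rows in a staging schema and promote them with a single MERGE. A minimal sketch against Postgres 15+, with illustrative table and column names:

```python
# Minimal sketch of a staging-table upsert with Postgres MERGE (Postgres 15+).
# Table and column names are illustrative placeholders.
import psycopg2

MERGE_SQL = """
MERGE INTO app.accounts AS tgt
USING staging.accounts AS src
    ON tgt.account_id = src.account_id
WHEN MATCHED THEN
    UPDATE SET name = src.name, tier = src.tier, updated_at = src.updated_at
WHEN NOT MATCHED THEN
    INSERT (account_id, name, tier, updated_at)
    VALUES (src.account_id, src.name, src.tier, src.updated_at);
"""

def promote_staging(dsn: str) -> None:
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(MERGE_SQL)  # committed when the connection block exits
```

On engines without MERGE, INSERT ... ON CONFLICT DO UPDATE achieves a similar upsert.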

Nice-to-Have Skills:

  • Knowledge of pgvector, embeddings, or AI data models.
  • Prior exposure to multi-tenant SaaS architectures.
  • Experience with observability tooling (freshness checks, data quality gates, anomaly detection).
  • Familiarity with FinOps or cost-optimization in data pipelines.

Data Integration

Bengaluru, Karnataka · ₹1500000 - ₹2000000 · IBM

Posted today


Job Description

Introduction
Joining the IBM Technology Expert Labs teams means you'll have a career delivering world-class services for our clients. As the ultimate expert in IBM products, you'll bring together all the necessary technology and services to help customers solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating time to value confidently and ensuring speed and insight while our clients focus on what they do best: running and growing their business.

Excellent onboarding and industry-leading learning culture will set you up for a positive impact, while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators—always willing to help and be helped—as you apply passion to work that will positively impact the world around us.

Your Role And Responsibilities
As a Delivery Consultant, you will work closely with IBM clients and partners to design, deliver, and optimize IBM Technology solutions that align with your clients' goals. In this role, you will apply your technical expertise to ensure world-class delivery while leveraging consultative skills such as problem-solving, issue-/hypothesis-based methodologies, communication, and service orientation. As a member of IBM Technology Expert Labs, a team that is client focused, courageous, pragmatic, and technical, you'll collaborate with clients to optimize and trailblaze new solutions that address real business challenges.

If you are passionate about success with both your career and solving clients' business challenges, this role is for you. To help achieve this win-win outcome, a 'day-in-the-life' of this opportunity may include, but not be limited to…

  • Solving Client Challenges Effectively: Understanding clients' main challenges and developing solutions that help them reach true business value by working through the phases of design, development, integration, implementation, migration, and product support with a sense of urgency.
  • Agile Planning and Execution: Creating and executing agile plans where you are responsible for installing and provisioning, testing, migrating to production, and day-two operations.
  • Technical Solution Workshops: Conducting and participating in technical solution workshops.
  • Building Effective Relationships: Developing successful relationships at all levels — from engineers to CxOs — with experience navigating challenging debate to reach healthy resolutions.
  • Self-Motivated Problem Solver: Demonstrating a natural bias towards self-motivation, curiosity, initiative in addition to navigating data and people to find answers and present solutions.
  • Collaboration and Communication: Strong collaboration and communication skills as you work across the client, partner, and IBM team.

Preferred Education
Bachelor's Degree

Required Technical And Professional Expertise
In-depth knowledge of the IBM Data & AI portfolio.

  • 18+ years of experience in software services
  • 10+ years of experience in the planning, design, and delivery of one or more products from the IBM Data Integration and IBM Data Intelligence product platforms
  • Experience in designing and implementing solutions on IBM Cloud Pak for Data, IBM DataStage NextGen, and Orchestration Pipelines

In addition to the above, experience with the following IBM tools and technologies is also required:

  • IBM Knowledge Catalog
  • Manta
  • Data Product Hub
  • Cloud Pak for Data
  • MDM

  • 10+ years' experience with ETL and database technologies
  • Experience in architectural planning and implementation for the upgrade/migration of these specific products

  • Experience in designing and implementing Data Quality solutions
  • Experience with installation and administration of these products
  • Excellent understanding of cloud concepts and infrastructure
  • Excellent verbal and written communication skills are essential

Preferred Technical And Professional Experience

  • Experience with any of DataStage, Informatica, SAS, Talend products
  • Experience with any of IKC, IGC, Axon
  • Experience with IBM Knowledge Catalog
  • Experience with Manta
  • Experience with Data Product Hub
  • Experience with Cloud Pak for Data
  • Experience with MDM
  • Experience with programming languages like Java/Python
  • Experience with AWS, Azure, Google Cloud, or IBM Cloud platforms
  • Experience with Red Hat OpenShift
  • Good to have Knowledge: Apache Spark, Shell scripting, GitHub, JIRA

Data Integration

Bengaluru, Karnataka · ₹900000 - ₹1200000 · Headsnminds Consultants

Posted today


Job Description

Informatica Support and Admin Engineer (5 positions)

Job location: Bengaluru

Experience range: 4 to 8 years

  • Expertise in Informatica Administration tasks including Installation, Configuration of domains, Code Promotions/ Migrations, managing users, groups, and associated privileges, performing backups, and restoring domain components for Informatica tools.
  • Familiarity with application support models and working in a 24*7 support environment using ITIL processes.
  • Hands-on experience in developing ETL mappings, and workflows and providing production support for critical data warehouse environments.
  • Hands-on experience in administration and support of Informatica PowerCenter, Data Quality, Informatica Web Services, PowerExchange, Informatica Cloud (IDMC), DVO, etc.
  • Familiarity with related tools for version control such as Bitbucket and GitHub, and scheduling tools like AutoSys.
  • Hands-on experience in setting up the security for Informatica environments/domains.

Data Integration Engineer

Bengaluru, Karnataka · ₹1400000 - ₹1500000 · RM Technologies

Posted today


Job Description

Job Title: Data Integration Engineer

Location: Bangalore

Job Type: Full-time

Experience: 4+ years

Immediate joiners only (maximum 30 days notice period)

Budget: 15 LPA

Budget: 14 LPA

Role Overview

We are seeking a highly motivated
Data Integration Engineer
to join our engineering team. In this role, you will be responsible for designing and maintaining scalable data pipelines, integrating diverse data sources, and ensuring seamless data availability for analytics and decision-making. You will collaborate with cross-functional teams to build robust, cloud-based data solutions that drive business impact.

Key Responsibilities

  • Design, build, and maintain scalable data pipelines and APIs using Python (see the sketch after this list).
  • Integrate data from third-party APIs, internal systems, and diverse sources into unified platforms.
  • Transform large and unstructured datasets into structured, usable formats for analytics.
  • Collaborate with product, engineering, and business teams to define data requirements and deliver solutions.
  • Leverage AWS services (EC2, S3), Snowflake, and Databricks to scale and optimize data infrastructure.
  • Ensure high performance, reliability, and responsiveness of data applications.
  • Write clean, maintainable, and well-documented code with a focus on quality and scalability.
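
As referenced in the first responsibility above, a data-facing API in Python often pairs a small web framework with object storage. A minimal sketch with FastAPI and boto3, using a hypothetical bucket and event shape (not this employer's actual system):

```python
# Minimal sketch of a small ingestion API: accept JSON events over REST and
# land them in S3 as one JSON object per event. Bucket, prefix, and field
# names are illustrative placeholders.
import json
import uuid
from datetime import datetime, timezone

import boto3
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
s3 = boto3.client("s3")
BUCKET = "example-raw-events"  # hypothetical bucket

class Event(BaseModel):
    source: str
    payload: dict

@app.post("/events")
def ingest(event: Event) -> dict:
    # One object per event keeps the example simple; real pipelines batch.
    key = f"events/{datetime.now(timezone.utc):%Y/%m/%d}/{uuid.uuid4()}.json"
    s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(event.dict()))
    return {"status": "accepted", "s3_key": key}
```

A real pipeline would add validation, batching, and authentication; the sketch only shows the request-to-S3 hop.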

Required Skills & Experience

  • Strong proficiency in Python and data libraries such as Pandas.
  • Experience with web frameworks (Django, FastAPI, or Flask).
  • Hands-on experience with NoSQL databases (MongoDB or similar).
  • Proficiency in working with RESTful APIs and data formats like JSON.
  • Experience with AWS services (EC2, S3) and cloud data platforms (Snowflake, Databricks).
  • Solid understanding of data exploration, troubleshooting, and data transformation.
  • Real-world experience with large-scale data systems in cloud environments.
  • Ability to work effectively in a fast-paced, high-growth, and deadline-driven environment.
  • Self-starter with strong problem-solving and ownership mindset.
  • Comfortable working with messy or unstructured data.

Preferred Qualifications

  • Bachelor's or Master's degree in Computer Science, Engineering, or related field.
  • Exposure to Big Data technologies (e.g., Spark, Hadoop) and Machine Learning workflows is a plus.

Data Integration Lead

Pune, Maharashtra · ₹1000000 - ₹2500000 · Wenger & Watson

Posted today


Job Description

  • Experience in IDQ development around data profiling, cleansing, parsing, standardization, verification, matching and data quality exception monitoring and handling
  • Designing, developing, testing, deploying, and documenting a project's data quality procedures and their outputs.
  • Profile the project source data, define or confirm the definition of the metadata, cleanse and accurately check the project data, check for duplicate or redundant records, and provide information on how to proceed with backend ETL processes.
  • Experience with creating SQL procedures, views, subqueries, multiple SQL joins, and unions to perform various data extraction tasks, and with developing PL/SQL triggers for updating or inserting new records
  • Minimum 9-13 years of hands-on experience with Informatica Data Quality (IDQ) toolset (Version 10.x preferred)
  • Working experience in Informatica PowerCenter
  • Strong SQL skills with the ability to tune and optimize complex queries
  • Experience in the following will be an added advantage:
  • Reporting tools: Axon, EDC

Data Integration Engineer

₹900000 - ₹1200000 · Hexaware Technologies

Posted today


Job Description

Hiring for Big Data Lead

Experience: 4-6 years

Work location: Bangalore/Pune/Chennai

Work Mode: Hybrid

Notice Period: Immediate to 30 days

Primary Skills : AWS S3, DMS, Glue, Lambda, Redshift, Python, SQL, Git, CI/CD, Agile delivery
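
As a rough illustration of how the listed primary skills fit together (purely hypothetical resource names, not Hexaware's project), an AWS Lambda handler triggered by an S3 object can issue a Redshift COPY through the Redshift Data API:

```python
# Minimal sketch: S3-triggered Lambda that loads the new object into Redshift
# via the Redshift Data API. Cluster, database, role, and table names are
# illustrative placeholders.
import boto3

redshift = boto3.client("redshift-data")

def handler(event, context):
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]
    copy_sql = (
        f"COPY analytics.raw_events FROM 's3://{bucket}/{key}' "
        "IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role' "
        "FORMAT AS JSON 'auto';"
    )
    # Asynchronous submission; Glue or Step Functions would normally track status.
    resp = redshift.execute_statement(
        ClusterIdentifier="analytics-cluster",
        Database="analytics",
        DbUser="loader",
        Sql=copy_sql,
    )
    return {"statement_id": resp["Id"]}
```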


Data Integration Architect

Bengaluru, Karnataka · ₹1200000 - ₹2500000 · Alstom

Posted today


Job Description

OVERALL PURPOSE OF THE ROLE:

The purpose of this role is to build in-house technical expertise in the data integration area and to deliver data integration services for the platform.

Primary Goals and Objectives:

This role owns the delivery model for Data Integration services. The person is responsible for building technical expertise on data integration solutions and providing data integration services, and is viewed as an expert in solution design, development, performance tuning, and troubleshooting for data integration.

RESPONSIBILITIES:

Technical -

  • Hands-on experience architecting and delivering solutions related to enterprise integration, APIs, service-oriented architecture, and technology modernization
  • 3-4 years of hands-on experience with the design and implementation of integrations in Dell Boomi
  • Understand business requirements and functional requirement documents, and design a technical solution to meet those needs
  • Strong grounding in Master Data Management, migration, and governance best practices
  • Extensive data quality and data migration experience, including proficiency in data warehousing, data analysis, and conversion planning for data migration activities
  • Lead and build data migration objects as needed for conversions of data from different sources
  • Should have architected integration solutions using Dell Boomi for cloud, hybrid, and on-premise integration landscapes
  • Ability to build and architect a high-performing, highly available, highly scalable Boomi Molecule infrastructure
  • In-depth understanding of enterprise integration patterns and the ability to apply them in the customer's IT landscape
  • Assists project teams during system design to promote the efficient re-use of IT assets; advises project teams during system development to assure compliance with architectural principles, guidelines, and standards
  • Adept in building Boomi processes with error handling, email alerts, and logging best practices
  • Should be proficient in using enterprise-level and database connectors
  • Excellent understanding of REST, with in-depth understanding of how Boomi processes can expose and consume services using different URIs and media types
  • Understand Atom, Molecule, and Atmosphere configuration and management, platform monitoring, performance-optimization suggestions, platform extension, and user-permissions control
  • Knowledge of API governance and skills such as caching, DB management, and data warehousing
  • Should have hands-on experience in configuring AS2 and SFTP involving different authentication methods
  • Thorough knowledge of process deployment, applying extensions, setting up schedules, Web Services user management, process filtering, and process reporting
  • Should be expert in XML and JSON activities such as creation, mapping, and migration
  • Should have worked on integration of SAP, SuccessFactors, SharePoint, cloud-based apps, web applications, and engineering applications
  • Support and resolve issues related to data integration deliveries or the platform

Project Management

  • Deliver data integration projects using the data integration platform
  • Manage partner deliveries by setting up governance of their deliveries
  • Understand project priorities, timelines, budget, and deliverables and the need to proactively push yourself and others to achieve project goals

Managerial:

  • This is an individual-contributor role that also operationally manages a small technical team

Qualifications & Skills:

  • 10+ years of experience in the area of enterprise integrations
  • Minimum 3-4 years of experience with Dell Boomi
  • Should have working experience with databases such as SQL Server and with data warehousing
  • Hands-on experience with REST, SOAP, XML, JSON, SFTP, EDI
  • Should have worked on integration of multiple technologies such as SAP, web, and cloud-based apps.

EDUCATION: B.E.

BEHAVIORAL COMPETENCIES:

  • Demonstrate excellent collaboration skills as person will be interacting with multiple business units, Solution managers and internal IT teams
  • Should have excellent analytical and problem solving skills
  • Coaches, supports, and trains other team members
  • You demonstrate excellent communication skills

TECHNICAL COMPETENCIES & EXPERIENCE:

  • Technical expertise in Dell Boomi for data integration is a MUST.

  • Language Skills: English

  • IT Skills: Dell Boomi, SQL, REST APIs, EDI, JSON, XML
  • Location for the role / travel (%): Bangalore; approximately 5% travel.
  • Contract Type/ Bonus (OPTIONAL)

Data Integration Specialist

Mohali, Punjab · ₹104000 - ₹130878 · Jorie Health

Posted today


Job Description

We are seeking a skilled Data Integration Specialist with expertise in Electronic Health Records (EHR) and strong proficiency in HL7 and FHIR standards. The ideal candidate will have a deep understanding of healthcare data integration processes and will be responsible for ensuring seamless data flow between healthcare systems, improving interoperability, and optimizing data exchange across various platforms. The role demands expertise in FHIR R4 and the ability to implement and manage EHR systems efficiently.

Responsibilities

  • Data Integration: Design, implement, and maintain data integration solutions to connect EHR systems with other healthcare platforms, ensuring the exchange of accurate and secure patient information.
  • FHIR & HL7 Standards: Develop and configure FHIR (R4) interfaces and HL7-based systems for seamless integration across various healthcare systems and third-party applications (a minimal client sketch follows this list).
  • EHR Systems Expertise: Work with Electronic Health Records (EHR) systems to enhance functionality, improve interoperability, and streamline workflows.
  • Troubleshooting & Support: Identify, troubleshoot, and resolve integration issues, ensuring minimal downtime and disruption in healthcare services.
  • Compliance & Security: Ensure that all data integration processes comply with HIPAA and other relevant healthcare regulations and standards.
  • Collaboration: Collaborate with cross-functional teams, including software developers, project managers, and healthcare professionals, to understand requirements and implement integration solutions.
  • Documentation: Maintain detailed documentation of integration processes, data mappings, and system configurations for future reference and compliance audits.
  • Training & Knowledge Sharing: Provide training to internal teams on integration processes and tools, ensuring proper usage and understanding of new systems.
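
For the FHIR bullet above, FHIR R4 exposes resources over a RESTful API, so reading a Patient is a single GET with the FHIR JSON media type. A minimal sketch, with a hypothetical server base URL and token (not Jorie Health's systems):

```python
# Minimal sketch: read a Patient resource from a FHIR R4 server over its
# RESTful API. The base URL, patient id, and token are placeholders; the
# Accept header and resource path follow FHIR R4 REST conventions.
import requests

FHIR_BASE = "https://fhir.example.org/r4"  # hypothetical server

def get_patient(patient_id: str, token: str) -> dict:
    resp = requests.get(
        f"{FHIR_BASE}/Patient/{patient_id}",
        headers={
            "Accept": "application/fhir+json",
            "Authorization": f"Bearer {token}",  # e.g. a SMART-on-FHIR access token
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    patient = get_patient("example-id", "dummy-token")
    print(patient.get("resourceType"), patient.get("id"))
```

HL7 v2 feeds typically arrive over MLLP through an engine such as Mirth Connect rather than plain HTTP, so this sketch covers only the FHIR side.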

Requirements & skills

  • Experience: Minimum 5 years of experience in healthcare data integration, with a focus on EHR systems, HL7, and FHIR.
  • Certifications:
    • HL7 Certification
    • FHIR R4 Certification
  • Technical Skills:
    • Hands-on experience with HL7 messaging protocols, FHIR standards, and EHR platforms.
    • Proficiency in FHIR R4 and its applications in healthcare integration.
    • Strong knowledge of data integration tools and platforms (e.g., Mirth Connect, Informatica, MuleSoft).
    • Familiarity with RESTful APIs and web services.
  • Education: Bachelor's degree in Computer Science, Health Informatics, Information Technology, or a related field.

  • Communication Skills: Strong written and verbal communication skills, with the ability to work collaboratively in a cross-functional team.

Additional Preferred Skills:

  • Experience with FHIR RESTful APIs and HL7 v2.x/v3 messaging standards.
  • Background in healthcare data standards (CDA, CCD, CCDS).
  • Understanding of interoperability frameworks and patient data exchange protocols.
  • Prior experience working with cloud-based healthcare solutions (AWS, Azure, etc.).
  • Familiarity with SQL and databases related to healthcare systems.

Data Integration Delivery

Pune, Maharashtra · ₹1200000 - ₹2400000 · HR Central

Posted today


Job Description

Role & responsibilities

  • Data Integration Development: Develop, implement, and maintain robust and scalable data integration solutions on designated middleware platforms, adhering to project timelines and technical specifications. Apply a DevOps mindset to development, contributing to the optimization of processes and tasks through automation (e.g., CI/CD pipelines, Jenkins, GitLab).
  • Operational Excellence: Provide operational support for integrations running on the integration platforms, ensuring high availability and performance. Adhere to company-wide ITIL processes (incident, change, problem management) within a 24/7 support window.
  • Documentation & Standards: Create and maintain comprehensive documentation of integration flows and system interfaces, and adhere to established standards and design patterns.
  • Solution Implementation & Best Practices: Collaborate with data architects and integration specialists to implement data integration solutions that are robust, scalable, and performant, aligning with overall data architecture principles. Ensure adherence to technical best practices and coding standards.
  • Stakeholder Collaboration: Collaborate effectively with project stakeholders, including business users and IT teams, to understand requirements and deliver solutions. Communicate technical details and progress clearly.
  • Process Contribution: Contribute to the identification and implementation of opportunities to optimize data integration processes, improve efficiency, and enhance data quality through automation, standardization, and new tool adoption.
  • Risk & Issue Management: Identify and escalate project risks and issues, and contribute to the development of mitigation strategies to ensure successful project outcomes.
  • Quality Assurance: Assist in executing testing strategies with different stakeholders for data integration solutions, ensuring data accuracy, completeness, and adherence to business rules before deployment.
  • Technical Expertise: Stay abreast of the latest trends and technologies in data integration, ETL/ELT, data streaming, and cloud data services, applying relevant innovations to projects.

Preferred Skill Set:

  • Bachelor's or Master's degree in Computer Science, Information Systems, or a related technical field.
  • 5+ years of hands-on experience in data integration development and support, with major expertise on the Kong platform (see the sketch after this list).
  • Strong understanding of and hands-on experience with the following integration and ETL tools: OIC, Kafka, Fivetran, Boomi, Workato, or willingness to master these platforms.
  • Experienced in applying DevOps principles and practices.
  • Solid analytical, problem-solving, and debugging abilities with a keen eye for detail.
  • Strong communication and interpersonal skills, with the ability to collaborate effectively with various teams.
  • Ability to manage multiple technical priorities in a fast-paced environment.
  • Actively participate in a 24/7 support rotation schedule to ensure continuous operational coverage.
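
On the Kong item above, gateway configuration is commonly driven through Kong's Admin API (or declarative YAML). A minimal sketch that registers a service and a route, assuming the Admin API is reachable on localhost:8001 and using placeholder names (not this employer's setup):

```python
# Minimal sketch: register an upstream service and a route on a Kong gateway
# via its Admin API. Service name, upstream URL, and path are illustrative.
import requests

KONG_ADMIN = "http://localhost:8001"

def register_service(name: str, upstream_url: str, path: str) -> None:
    # Create the service entity pointing at the upstream...
    requests.post(
        f"{KONG_ADMIN}/services",
        json={"name": name, "url": upstream_url},
        timeout=10,
    ).raise_for_status()
    # ...then attach a route so the gateway proxies requests on `path` to it.
    requests.post(
        f"{KONG_ADMIN}/services/{name}/routes",
        json={"name": f"{name}-route", "paths": [path]},
        timeout=10,
    ).raise_for_status()

if __name__ == "__main__":
    register_service("orders-api", "http://orders.internal:8080", "/orders")
```

In a CI/CD setup the same configuration would typically be expressed declaratively (e.g., with decK) and applied from the pipeline rather than ad hoc.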

 
