7,263 Integration jobs in India

Data Integration Engineer

Bengaluru, Karnataka · ₹900,000 - ₹1,200,000 per year · Ekfrazo Technologies Private Limited

Posted today

Job Description

Role: Data Integration Engineer

Location: Bangalore

Shift Time: 2-11 PM, with cab facility

Experience: 5 to 7 years

  • Familiarity with using APIs for application development.
  • Knowledge of Python and experience with ETL.
  • Good working knowledge of GCP and GCP serverless functions.
  • Good working experience with Unix/Linux.
  • Prior knowledge of Instana, or of monitoring tools in general, is desirable.
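
To make the stack above concrete, here is a minimal, illustrative Python sketch (not part of the posting) of an ETL step deployed as a GCP serverless function: it pulls records from a hypothetical REST API, applies a light transform, and writes the batch to Cloud Storage. The URL, bucket, and field names are invented for illustration.

    import json

    import functions_framework
    import requests
    from google.cloud import storage

    SOURCE_URL = "https://example.com/api/orders"   # hypothetical source API
    TARGET_BUCKET = "example-landing-zone"          # hypothetical GCS bucket

    @functions_framework.http
    def ingest_orders(request):
        # Extract: fetch one batch of records from the source API.
        records = requests.get(SOURCE_URL, timeout=30).json()
        # Transform: keep a few fields and normalize amounts to float.
        rows = [
            {"id": r["id"], "customer": r["customer"], "amount": float(r["amount"])}
            for r in records
        ]
        # Load: write the batch as newline-delimited JSON to Cloud Storage.
        blob = storage.Client().bucket(TARGET_BUCKET).blob("orders/batch.json")
        blob.upload_from_string("\n".join(json.dumps(row) for row in rows))
        return f"loaded {len(rows)} rows", 200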

Data Integration Engineer

₹900,000 - ₹1,200,000 per year · Innova ESI

Posted today

Job Description

Must Have:

  • Experience in Data Engineering with a strong focus on Databricks
  • Proficiency in Python, SQL, and Spark (PySpark) programming
  • Hands-on experience with Delta Lake, Unity Catalog, and MLflow
  • Experience working with CI/CD pipelines

Nice to Have:

  • Exposure to the Azure ecosystem and its services
  • Experience developing ELT/ETL frameworks
  • Automation of workflows for ingesting structured, semi-structured, and unstructured data
  • Familiarity with data visualization tools such as Power BI
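
As a rough illustration of the Databricks stack listed above (not a requirement from the posting), the following Python sketch shows a minimal PySpark job that reads raw CSV, cleans it, and writes a Delta table. Paths and the Unity Catalog table name are hypothetical; on Databricks a SparkSession is normally provided as spark.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()

    # Read a raw CSV drop (hypothetical path).
    raw = spark.read.option("header", True).csv("/mnt/landing/customers.csv")

    # Light cleanup: de-duplicate and stamp the ingestion time.
    cleaned = (
        raw.dropDuplicates(["customer_id"])
           .withColumn("ingested_at", F.current_timestamp())
    )

    # Write a managed Delta table (hypothetical Unity Catalog three-part name).
    cleaned.write.format("delta").mode("overwrite").saveAsTable("main.crm.customers")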

Data Integration Engineer

Noida, Uttar Pradesh · ₹2,000,000 - ₹2,500,000 per year · Crenovent

Posted today

Job Description

Job Description – Senior Data Integration Engineer (Azure Fabric + CRM Integrations)

Location: Noida

Employment Type: Full-time

About Crenovent Technologies

Crenovent is building RevAi Pro, an enterprise-grade Revenue Operations SaaS platform that integrates CRM, billing, contract, and marketing systems with AI agents and Generative AI search. Our vision is to redefine RevOps with AI-driven automation, real-time intelligence, and industry-specific workflows.

We are now hiring a Senior Data Integration Engineer to lead the integration of CRM platforms (Salesforce, Microsoft Dynamics, HubSpot) into Azure Fabric and to enable secure, multi-tenant ingestion pipelines for RevAi Pro.

Role Overview

You will be responsible for designing, building, and scaling data pipelines that bring CRM data into Azure Fabric (OneLake, Data Factory, Synapse-style pipelines) and transform it into RevAi Pro's standardized schema (50+ core fields, industry-specific mappings).

This is a hands-on, architecture + build role where you will work closely with RevOps SMEs, product engineers, and AI teams to ensure seamless data availability, governance, and performance across multi-tenant environments.

Key Responsibilities

  1. Data Integration & Pipelines

  2. Design and implement
    data ingestion pipelines
    from Salesforce, Dynamics 365, and HubSpot into Azure Fabric.

  3. Build
    ETL/ELT workflows
    using Azure Data Factory, Fabric pipelines, and Python/SQL.
  4. Ensure
    real-time and batch sync
    options for CRM objects (Leads, Accounts, Opportunities, Forecasts, Contracts).

  5. Schema & Mapping

  6. Map CRM fields to RevAi Pro's
    standardized schema (50+ fields across industries)
    .

  7. Maintain schema consistency across SaaS, Banking, Insurance, and E-commerce use cases.
  8. Implement data transformation, validation, and enrichment logic.

  9. Data Governance & Security

  10. Implement
    multi-tenant isolation
    policies in Fabric (Purview, RBAC, field-level masking).

  11. Ensure
    PII compliance, GDPR, SOC2 readiness
    .
  12. Build audit logs, lineage tracking, and monitoring dashboards.

  13. Performance & Reliability

  14. Optimize pipeline performance (latency, refresh frequency, cost efficiency).

  15. Implement
    autoscaling, retry logic, error handling
    in pipelines.
  16. Work with DevOps to set up CI/CD for Fabric integrations.

  17. Collaboration

  18. Work with
    RevOps SMEs
    to validate business logic for CRM fields.

  19. Partner with AI/ML engineers to expose clean data to agents and GenAI models.
  20. Collaborate with frontend/backend developers to provide APIs for RevAi Pro modules.
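
As one possible illustration of the ingestion responsibilities above (an editorial sketch, not the company's actual pipeline), the Python snippet below queries Opportunity records from the Salesforce REST API and lands them in a Fabric Lakehouse through the OneLake DFS endpoint, which is ADLS Gen2 compatible. The org URL, token handling, workspace, and lakehouse names are assumptions.

    import json
    import requests
    from azure.identity import DefaultAzureCredential
    from azure.storage.filedatalake import DataLakeServiceClient

    SF_INSTANCE = "https://example.my.salesforce.com"   # hypothetical Salesforce org
    SF_TOKEN = "..."                                    # OAuth access token, obtained elsewhere
    SOQL = "SELECT Id, Name, StageName, Amount FROM Opportunity"

    # Extract: one page of Opportunity records via the Salesforce REST query endpoint.
    resp = requests.get(
        f"{SF_INSTANCE}/services/data/v58.0/query",
        params={"q": SOQL},
        headers={"Authorization": f"Bearer {SF_TOKEN}"},
        timeout=60,
    )
    records = resp.json()["records"]

    # Load: OneLake exposes an ADLS Gen2-style endpoint; the workspace acts as the file system.
    service = DataLakeServiceClient(
        account_url="https://onelake.dfs.fabric.microsoft.com",
        credential=DefaultAzureCredential(),
    )
    file_client = service.get_file_system_client("RevOpsWorkspace").get_file_client(
        "RevAiLakehouse.Lakehouse/Files/raw/salesforce/opportunity.json"
    )
    file_client.upload_data(json.dumps(records), overwrite=True)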

Required Skills & Experience

  • 3+ years in Data Engineering / Integration roles.
  • Strong expertise in Microsoft Azure Fabric, including OneLake, Data Factory, Synapse pipelines, and Power Query.
  • Hands-on experience with CRM APIs and data models: Salesforce, Dynamics 365, HubSpot.
  • Strong SQL and Python for data transformations.
  • Experience with ETL/ELT workflows, schema mapping, and multi-tenant SaaS data handling.
  • Knowledge of data governance tools (Azure Purview, RBAC, PII controls).
  • Strong grasp of cloud security and compliance (GDPR, SOC 2; HIPAA optional).
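
To illustrate the schema-mapping requirement, a small hedged Python sketch follows: it maps raw Salesforce and HubSpot field names onto a standardized schema and validates required fields. The target field names are invented for illustration; the actual RevAi Pro schema (50+ fields) is proprietary and not shown here.

    # Hypothetical source-to-standard field maps (illustrative only).
    FIELD_MAP = {
        "salesforce": {"Id": "crm_id", "StageName": "stage", "Amount": "amount"},
        "hubspot": {"hs_object_id": "crm_id", "dealstage": "stage", "amount": "amount"},
    }
    REQUIRED = {"crm_id", "stage", "amount"}

    def to_standard(record: dict, source: str) -> dict:
        # Map raw CRM field names onto the standardized schema.
        mapping = FIELD_MAP[source]
        row = {std: record.get(src) for src, std in mapping.items()}
        # Validate: every required standardized field must be present.
        missing = REQUIRED - {k for k, v in row.items() if v is not None}
        if missing:
            raise ValueError(f"{source} record missing fields: {sorted(missing)}")
        row["amount"] = float(row["amount"])
        return row

    # Example: to_standard({"Id": "0061x00000", "StageName": "Closed Won", "Amount": "1200"}, "salesforce")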

Preferred (Nice to Have)

  • Prior experience building integrations for Revenue Operations, Sales, or CRM platforms.
  • Knowledge of middleware (MuleSoft, Boomi, Workato, Azure Logic Apps).
  • Familiarity with AI/ML data pipelines.
  • Experience with multi-cloud integrations (AWS, GCP).
  • Understanding of business RevOps metrics (pipeline, forecast, quota, comp plans).

Soft Skills

  • Strong ownership and problem-solving ability.
  • Ability to translate business needs (RevOps fields) into technical data pipelines.
  • Collaborative mindset with cross-functional teams.
  • Comfortable working in a fast-paced startup environment.

Data Integration Engineer

Bengaluru, Karnataka · ₹900,000 - ₹1,200,000 per year · GR Engineering Projects

Posted today

Job Description


  • Strong proficiency in Python and data libraries like Pandas.
  • Experience with web frameworks like Django, FastAPI, or Flask.
  • Hands-on experience with MongoDB or other NoSQL databases.
  • Proficiency in working with RESTful APIs and JSON.
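
As a rough sketch of the Python / FastAPI / MongoDB combination above (purely illustrative, not from the posting), the endpoint below accepts a JSON record and stores it in a MongoDB collection. The connection string, database, and collection names are placeholders.

    from fastapi import FastAPI
    from pydantic import BaseModel
    from pymongo import MongoClient

    app = FastAPI()
    # Placeholder connection string, database, and collection names.
    collection = MongoClient("mongodb://localhost:27017")["ingest"]["events"]

    class Event(BaseModel):
        source: str
        payload: dict

    @app.post("/events")
    def create_event(event: Event):
        # Persist the validated JSON document; model_dump() assumes Pydantic v2 (use .dict() on v1).
        result = collection.insert_one(event.model_dump())
        return {"inserted_id": str(result.inserted_id)}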

Data Integration Engineer

₹104,000 - ₹130,878 per year · Mobile Programming

Posted today

Job Description

Interested candidates can DM or call me.

Job Description: Data Integration Engineer

Position: Data Integration Engineer

Experience: 5-8 years

Overview

We are seeking a skilled Data Integration Engineer to lead the integration of client data from multiple source systems—including QuickBooks, Excel, CSV files, and other legacy platforms—into Microsoft Access or SQL Server. This role will focus on designing and automating data pipelines, ensuring data accuracy, consistency, and performance across systems.

Responsibilities

  • Collaborate with Implementation and technical teams to gather data integration requirements.
  • Design and implement automated data pipelines to extract, transform, and load (ETL) data into Access or SQL databases.
  • Analyze and map source data to target schemas, ensuring alignment with business rules and data quality standards.
  • Develop and document data mapping specifications, transformation logic, and validation procedures.
  • Automate data extraction and transformation using tools such as SQL, Python, or ETL platforms.
  • Ensure referential integrity and optimize performance of integrated data systems.
  • Validate integrated data against source systems to ensure completeness and accuracy.
  • Support testing and troubleshooting during integration and post-deployment phases.
  • Maintain documentation of integration processes, mappings, and automation scripts.
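
One way the CSV-to-SQL-Server portion of such a pipeline could look in Python (an illustrative sketch only; the posting does not prescribe this tooling) is pandas plus SQLAlchemy, with a simple row-count validation against the source extract. Server, database, file, and table names are placeholders.

    import pandas as pd
    from sqlalchemy import create_engine, text

    # Placeholder connection details; any SQL Server ODBC driver string works here.
    engine = create_engine(
        "mssql+pyodbc://user:password@SERVER/ClientDB?driver=ODBC+Driver+17+for+SQL+Server"
    )

    # Extract and transform: read the export and coerce types.
    df = pd.read_csv("quickbooks_invoices.csv", parse_dates=["invoice_date"])
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")

    # Load into a staging table.
    df.to_sql("stg_invoices", engine, if_exists="replace", index=False)

    # Validate: target row count should match the source extract.
    with engine.connect() as conn:
        loaded = conn.execute(text("SELECT COUNT(*) FROM stg_invoices")).scalar()
    assert loaded == len(df), f"row count mismatch: {loaded} loaded vs {len(df)} read"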

Required Skills & Qualifications

  • Strong experience with Microsoft Access and/or SQL Server (queries, schema design, performance tuning).
  • Proficiency in data transformation and automation using SQL, Excel, or scripting languages.
  • Experience with ETL processes and data integration best practices.
  • Ability to troubleshoot data issues and resolve discrepancies independently.
  • Excellent documentation and communication skills.
  • Experience with ERP systems is a plus.
  • Strong data mapping skills and the ability to translate business requirements into technical specifications.
  • Prior experience in automating data workflows and building scalable integration solutions.

Primary Skills

SQL, Microsoft Access, ETL, Azure Data Factory (ADF)

Data Integration Engineer

Karnataka · ₹400,000 - ₹1,200,000 per year · BLUEWINGS

Posted today

Job Description

Key Details:

  • Location: Bangalore (onsite)
  • Type: Work from office, Bangalore Electronic City

We are seeking a motivated Data Integration Engineer to join our engineering team. This individual will play a critical role in integrating and transforming large-scale data to power intelligent decision-making systems.

Key Responsibilities

  • Design, build, and maintain data pipelines and APIs using Python.
  • Integrate data from various sources including third-party APIs and internal systems.
  • Work with large, unstructured datasets and transform them into usable formats.
  • Collaborate with cross-functional teams to define data requirements and deliver timely solutions.
  • Leverage cloud-based services, especially AWS (EC2, S3), Snowflake / Databricks to scale data infrastructure.
  • Ensure high performance and responsiveness of applications.
  • Write clean, maintainable code with a focus on craftsmanship.
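
As an illustration of the API-to-AWS part of these responsibilities (a sketch under assumed names, not the company's actual pipeline), the Python snippet below pulls JSON from a hypothetical third-party API, flattens it with pandas, and stages it in S3 as Parquet for later loading into Snowflake or Databricks.

    import io
    import boto3
    import pandas as pd
    import requests

    API_URL = "https://example.com/api/v1/shipments"   # hypothetical third-party API
    BUCKET = "example-data-lake"                       # hypothetical S3 bucket

    # Extract and flatten nested JSON into a tabular frame.
    records = requests.get(API_URL, timeout=30).json()
    df = pd.json_normalize(records)

    # Stage as Parquet in S3 (to_parquet needs pyarrow or fastparquet installed).
    buffer = io.BytesIO()
    df.to_parquet(buffer, index=False)
    boto3.client("s3").put_object(
        Bucket=BUCKET,
        Key="raw/shipments/shipments.parquet",
        Body=buffer.getvalue(),
    )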

Required Skills & Experience

  • Strong proficiency in Python and data libraries like Pandas.
  • Experience with web frameworks like Django, FastAPI, or Flask.
  • Hands-on experience with MongoDB or other NoSQL databases.
  • Proficiency in working with RESTful APIs and JSON.
  • Familiarity with AWS services (EC2, S3) and with Snowflake / Databricks.
  • Solid understanding of data mining, data exploration, and troubleshooting data issues.
  • Real-world experience with large-scale data systems in cloud environments.
  • Ability to thrive in a fast-paced, high-growth, deadline-driven setting.
  • Self-starter with a strong sense of ownership and a passion for problem-solving.
  • Comfortable working with messy or unstructured data.

Preferred Qualifications

  • Bachelor's or Master's degree in Computer Science.
  • Exposure to Big Data and Machine Learning technologies is a plus.

Interview Process for selected candidates

  1. First Round: Conducted via Google Meet.

  2. Second Round: Technical round, conducted face to face.

Job Type: Full-time

Pay: ₹40,000.00 - ₹120,000.00 per month

Benefits:

  • Health insurance
  • Paid sick time
  • Provident Fund

Ability to commute/relocate:

  • Electronic City, Bengaluru, Karnataka: Reliably commute or planning to relocate before starting work (Required)

Location:

  • Electronic City, Bengaluru, Karnataka (Required)

Work Location: In person

Data Integration Engineer

₹104,000 - ₹130,878 per year · Accendia Technologies Pvt. Ltd

Posted today

Job Description

Role Overview

We are seeking a motivated Data Integration Engineer to join our engineering team. This individual will play a critical role in integrating and transforming large-scale data to power intelligent decision-making systems.

Key Responsibilities

  • Design, build, and maintain data pipelines and APIs using Python.
  • Integrate data from various sources including third-party APIs and internal systems.
  • Work with large, unstructured datasets and transform them into usable formats.
  • Collaborate with cross-functional teams to define data requirements and deliver timely solutions.
  • Leverage cloud-based services, especially AWS (EC2, S3), Snowflake / Databricks to scale data infrastructure.
  • Ensure high performance and responsiveness of applications.
  • Write clean, maintainable code with a focus on craftsmanship.

Required Skills & Experience

  • Strong proficiency in Python and data libraries like Pandas.
  • Experience with web frameworks like Django, FastAPI, or Flask.
  • Hands-on experience with MongoDB or other NoSQL databases.
  • Proficiency in working with RESTful APIs and JSON.
  • Familiarity with AWS services (EC2, S3) and with Snowflake / Databricks.
  • Solid understanding of data mining, data exploration, and troubleshooting data issues.
  • Real-world experience with large-scale data systems in cloud environments.
  • Ability to thrive in a fast-paced, high-growth, deadline-driven setting.
  • Self-starter with a strong sense of ownership and a passion for problem-solving.
  • Comfortable working with messy or unstructured data.
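
As a small, hypothetical example of working with messy data in Python (not taken from the posting), the routine below normalizes column names, de-duplicates, and coerces types in a raw pandas extract; the column names are invented.

    import pandas as pd

    def clean_extract(df: pd.DataFrame) -> pd.DataFrame:
        # Normalize column names, drop duplicates, and coerce types in a messy raw extract.
        out = df.copy()
        out.columns = [c.strip().lower().replace(" ", "_") for c in out.columns]
        out = out.drop_duplicates(subset=["record_id"])
        out["amount"] = pd.to_numeric(out["amount"], errors="coerce")
        out["created_at"] = pd.to_datetime(out["created_at"], errors="coerce")
        return out.dropna(subset=["record_id"])

    # Example usage (hypothetical file):
    # tidy = clean_extract(pd.read_json("export.json"))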

Preferred Qualifications

  • Bachelor's or Master's degree in Computer Science.
  • Exposure to Big Data and Machine Learning technologies is a plus.

Data Integration Engineer

Bengaluru, Karnataka · Hexaware Technologies

Posted today

Job Description

Description: SSIS conversion to Ab Initio

Responsibilities for Ab Initio developer:

  • Knowledge of databases and database concepts to support design and development of Data Warehouse / Data Lake applications.
  • Analyze business requirements and work with the business and the data modeler to support data flow from source to target destination.
  • Follow release and change processes: distribution of software builds and releases to development, test, and production environments.
  • Adhere to the project's SDLC process (Agile), participate in team discussions and scrum calls, and work collaboratively with internal and external team members.
  • Develop ETL using Ab Initio; databases: MS SQL Server, big data, and text/Excel files.
  • Engage in the intake/release/change/incident/problem management processes.
  • Prioritize and drive all relevant support priorities, including incident, change, problem, knowledge, and engagement with projects.
  • Develop and document a high-level conceptual data process design for review by the architect and data analysts, which will serve as a basis for writing ETL code and designing test plans.
  • Thoroughly unit test ETL code to ensure error-free, efficient delivery.
  • Analyze several aspects of the code prior to release to ensure that it will run efficiently and can be supported in the production environment.
  • Provide data modeling solutions.
  • Work as an independent developer, seeking support from seniors on the team to ensure smooth delivery.

Qualifications for Ab Initio developer:

  • Minimum 7+ years of professional experience in ETL (Ab Initio), RDBMS (MS SQL Server), and batch processing.
  • Strong in-depth knowledge of RDBMS databases and Data Warehouse / Data Lake concepts.
  • Proficient in Ab Initio, MS SQL Server, Hadoop/MongoDB, and other scripting to support ETL development.
  • 4+ years of experience in Data Warehouse / Data Lake development using the Ab Initio ETL tool, as well as the MS SQL database.
  • Experience with scripting languages such as T-SQL, Hive/HQL, batch scripting, shell scripting, and other popular scripting languages.
  • Experience working in a complex development environment with large data volumes.
  • Strong communication, teamwork, and analytical skills.
  • Ability to articulate advanced technical topics to both technical and non-technical staff.

Data Integration Engineer

Pune, Maharashtra · Hexaware Technologies

Posted today

Job Description

Description:

  • Develop and maintain conceptual, logical, and physical data models with its corresponding metadata.
  • Perform data mapping based on data source schemas and reverse-engineer existing transformations from multiple source database systems on the cloud data platform to meet corporate standards.
  • Conduct data analysis and capture data requirements.
  • Work closely with all the squad and product owners to implement data strategies.
  • Validate logical data models with business subject matter experts.
  • Work with the development team to ensure all requirements are captured and reflected in the data model.
  • Work with the DBA team to design physical models to satisfy optimal performance.
  • Active participation in metadata definition and management.

Skillsets:

  • Experience in evaluating the data models of existing data recording systems
  • Experience in creating logical data models and data flows
  • Understanding of data modeling best practices
  • Sound data modeling techniques using a data modeling tool such as Erwin or ER/Studio
  • Experience in data modeling for a Data Warehouse / Data Platform
  • Experience in data modeling for commercial insurance systems, primarily Lloyd's and the London Market
