7,786 Integration jobs in India

Data Integration Engineer

₹900,000 - ₹1,200,000 per year · Innova ESI

Posted today


Job Description

Must Have:

  • Experience in Data Engineering with a strong focus on Databricks
  • Proficiency in Python, SQL, and Spark (PySpark) programming
  • Hands-on experience with Delta Lake, Unity Catalog, and MLFlow
  • Experience working with CI/CD pipelines
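The must-haves above center on Databricks, PySpark, and Delta Lake, where the core daily pattern is an incremental upsert into a managed table. As an illustration only (plain dicts stand in for DataFrames so the sketch runs without a Spark cluster; in Databricks this would be `DeltaTable.merge` / `MERGE INTO`), the upsert semantics look like:

```python
def merge_upsert(target, updates, key):
    """Upsert `updates` into `target` on `key`: the same semantics as a
    Delta Lake MERGE INTO (WHEN MATCHED UPDATE / WHEN NOT MATCHED INSERT),
    modeled with plain dicts so it runs anywhere."""
    merged = {row[key]: row for row in target}   # index existing rows by key
    for row in updates:
        # update the matching row if present, otherwise insert a new one
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return sorted(merged.values(), key=lambda r: r[key])

# Example: a daily CRM extract merged into an existing table (invented data)
silver = [{"id": 1, "name": "Asha"}, {"id": 2, "name": "Ravi"}]
extract = [{"id": 2, "name": "Ravi K"}, {"id": 3, "name": "Meera"}]
print(merge_upsert(silver, extract, "id"))
# -> [{'id': 1, 'name': 'Asha'}, {'id': 2, 'name': 'Ravi K'}, {'id': 3, 'name': 'Meera'}]
```

The table and field names here are hypothetical; the point is the matched-update / not-matched-insert split that Delta Lake executes transactionally.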

Nice to Have:

  • Exposure to the Azure ecosystem and its services
  • Experience developing ELT/ETL frameworks
  • Automation of workflows for ingesting structured, semi-structured, and unstructured data
  • Familiarity with data visualization tools such as Power BI.

Data Integration engineer

Bengaluru, Karnataka · ₹900,000 - ₹1,200,000 per year · Ethos HR

Posted today


Job Description

Location: Bangalore

Role Overview

We are seeking a motivated Data Integration Engineer to join our engineering team. This individual will play a critical role in integrating and transforming large-scale data to power intelligent decision-making systems.

Key Responsibilities

  • Design, build, and maintain data pipelines and APIs using Python.
  • Integrate data from various sources including third-party APIs and internal systems.
  • Work with large, unstructured datasets and transform them into usable formats.
  • Collaborate with cross-functional teams to define data requirements and deliver timely solutions.
  • Leverage cloud-based services, especially AWS (EC2, S3), Snowflake / Databricks to scale data infrastructure.
  • Ensure high performance and responsiveness of applications.
  • Write clean, maintainable code with a focus on craftsmanship.

Required Skills & Experience

  • Strong proficiency in Python and data libraries like Pandas.
  • Experience with web frameworks like Django, FastAPI, or Flask.
  • Hands-on experience with MongoDB or other NoSQL databases.
  • Proficiency in working with RESTful APIs and JSON.
  • Familiarity with AWS services: EC2, S3, Snowflake / Databricks.
  • Solid understanding of data mining, data exploration, and troubleshooting data issues.
  • Real-world experience with large-scale data systems in cloud environments.
  • Ability to thrive in a fast-paced, high-growth, deadline-driven setting.
  • Self-starter with a strong sense of ownership and a passion for problem-solving.
  • Comfortable working with messy or unstructured data.
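Several of the skills above (RESTful APIs, JSON, messy or unstructured data) reduce in practice to one step: flattening nested API payloads into tabular rows. A minimal stdlib sketch of that step follows; in real pipelines `pandas.json_normalize` does the same job, and the payload below is invented for illustration:

```python
def flatten(record, prefix=""):
    """Flatten a nested JSON-style dict into a single-level row,
    joining keys with dots, e.g. {"user": {"id": 1}} -> {"user.id": 1}."""
    row = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            row.update(flatten(value, prefix=f"{name}."))  # recurse into nesting
        else:
            row[name] = value
    return row

# A shape typical of third-party API responses (hypothetical data)
payload = {"id": 7, "user": {"name": "Asha", "geo": {"city": "Bengaluru"}}}
print(flatten(payload))
# -> {'id': 7, 'user.name': 'Asha', 'user.geo.city': 'Bengaluru'}
```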

Preferred Qualifications

  • Bachelor's or Master's degree in Computer Science.
  • Exposure to Big Data and Machine Learning technologies is a plus.


Data Integration Engineer

Noida, Uttar Pradesh · ₹2,000,000 - ₹2,500,000 per year · Crenovent

Posted today


Job Description – Senior Data Integration Engineer (Azure Fabric + CRM Integrations)

Location: Noida

Employment Type: Full-time

About Crenovent Technologies

Crenovent is building RevAi Pro, an enterprise-grade Revenue Operations SaaS platform that integrates CRM, billing, contract, and marketing systems with AI agents and Generative AI search. Our vision is to redefine RevOps with AI-driven automation, real-time intelligence, and industry-specific workflows.

We are now hiring a Senior Data Integration Engineer to lead the integration of CRM platforms (Salesforce, Microsoft Dynamics, HubSpot) into Azure Fabric and enable secure, multi-tenant ingestion pipelines for RevAi Pro.

Role Overview

You will be responsible for designing, building, and scaling data pipelines that bring CRM data into Azure Fabric (OneLake, Data Factory, Synapse-style pipelines) and transform it into RevAi Pro's standardized schema (50+ core fields, industry-specific mappings).

This is a hands-on, architecture + build role where you will work closely with RevOps SMEs, product engineers, and AI teams to ensure seamless data availability, governance, and performance across multi-tenant environments.

Key Responsibilities

Data Integration & Pipelines

  • Design and implement data ingestion pipelines from Salesforce, Dynamics 365, and HubSpot into Azure Fabric.
  • Build ETL/ELT workflows using Azure Data Factory, Fabric pipelines, and Python/SQL.
  • Ensure real-time and batch sync options for CRM objects (Leads, Accounts, Opportunities, Forecasts, Contracts).

Schema & Mapping

  • Map CRM fields to RevAi Pro's standardized schema (50+ fields across industries).
  • Maintain schema consistency across SaaS, Banking, Insurance, and E-commerce use cases.
  • Implement data transformation, validation, and enrichment logic.

Data Governance & Security

  • Implement multi-tenant isolation policies in Fabric (Purview, RBAC, field-level masking).
  • Ensure PII compliance, GDPR, and SOC2 readiness.
  • Build audit logs, lineage tracking, and monitoring dashboards.

Performance & Reliability

  • Optimize pipeline performance (latency, refresh frequency, cost efficiency).
  • Implement autoscaling, retry logic, and error handling in pipelines.
  • Work with DevOps to set up CI/CD for Fabric integrations.

Collaboration

  • Work with RevOps SMEs to validate business logic for CRM fields.
  • Partner with AI/ML engineers to expose clean data to agents and GenAI models.
  • Collaborate with frontend/backend developers to provide APIs for RevAi Pro modules.
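The schema-and-mapping work above can be sketched as a field map plus validation: rename each source CRM field to its standardized target, drop what is unmapped, and fail loudly if required fields are missing. The field names below are hypothetical stand-ins; the actual 50+ field RevAi Pro schema is not public:

```python
# Hypothetical map: Salesforce-style field -> standardized field (illustrative only)
SALESFORCE_MAP = {
    "AccountName": "account_name",
    "Amount": "opportunity_amount",
    "StageName": "stage",
}

def map_crm_record(record, field_map, required=("account_name",)):
    """Rename source fields to the standardized schema and validate that
    required target fields arrived; unmapped source fields are dropped."""
    mapped = {target: record[src] for src, target in field_map.items() if src in record}
    missing = [f for f in required if f not in mapped]
    if missing:
        raise ValueError(f"missing required fields: {missing}")
    return mapped

raw = {"AccountName": "Acme", "Amount": 120000, "StageName": "Negotiation", "Junk": 1}
print(map_crm_record(raw, SALESFORCE_MAP))
# -> {'account_name': 'Acme', 'opportunity_amount': 120000, 'stage': 'Negotiation'}
```

Keeping the map as data rather than code is what lets one pipeline serve SaaS, Banking, Insurance, and E-commerce variants: each industry supplies its own map.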

Required Skills & Experience

  • 3+ years in Data Engineering / Integration roles.
  • Strong expertise in Microsoft Azure Fabric, including OneLake, Data Factory, Synapse pipelines, and Power Query.
  • Hands-on experience with CRM APIs & data models: Salesforce, Dynamics 365, HubSpot.
  • Strong SQL and Python for data transformations.
  • Experience with ETL/ELT workflows, schema mapping, and multi-tenant SaaS data handling.
  • Knowledge of data governance tools (Azure Purview, RBAC, PII controls).
  • Strong grasp of cloud security & compliance (GDPR, SOC2; HIPAA optional).
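The "retry logic, error handling" responsibility listed earlier is worth making concrete. Azure Data Factory exposes retries declaratively per activity; for custom Python pipeline steps the hand-rolled equivalent is a small exponential-backoff wrapper. A sketch, with toy delays and an invented flaky step:

```python
import time

def with_retries(fn, attempts=3, base_delay=0.01):
    """Run `fn`, retrying on exception with exponential backoff:
    delays of base, 2x base, 4x base, ... between attempts."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise                                 # out of retries: surface the error
            time.sleep(base_delay * (2 ** attempt))   # back off before the next try

# Demo: a transient failure that succeeds on the third call (invented)
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "rows"

print(with_retries(flaky_extract))  # -> rows
```

In production the except clause would catch only transient error types (timeouts, throttling) so that genuine data errors fail fast instead of being retried.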

Preferred (Nice to Have)

  • Prior experience building integrations for Revenue Operations, Sales, or CRM platforms.
  • Knowledge of middleware (MuleSoft, Boomi, Workato, Azure Logic Apps).
  • Familiarity with AI/ML data pipelines.
  • Experience with multi-cloud integrations (AWS, GCP).
  • Understanding of business RevOps metrics (pipeline, forecast, quota, comp plans).

Soft Skills

  • Strong ownership and problem-solving ability.
  • Ability to translate business needs (RevOps fields) into technical data pipelines.
  • Collaborative mindset with cross-functional teams.
  • Comfortable working in a fast-paced startup environment.

Data Integration Engineer

₹800,000 - ₹1,600,000 per year · Mobile Programming

Posted today


Job Description

Interested candidates can DM or call me.


Position: Data Integration Engineer

Experience: 5-8 years

Overview

We are seeking a skilled Data Integration Engineer to lead the integration of client data from multiple source systems—including QuickBooks, Excel, CSV files, and other legacy platforms—into Microsoft Access or SQL Server. This role will focus on designing and automating data pipelines, ensuring data accuracy, consistency, and performance across systems.

Responsibilities

  • Collaborate with Implementation and technical teams to gather data integration requirements.
  • Design and implement automated data pipelines to extract, transform, and load (ETL) data into Access or SQL databases.
  • Analyze and map source data to target schemas, ensuring alignment with business rules and data quality standards.
  • Develop and document data mapping specifications, transformation logic, and validation procedures.
  • Automate data extraction and transformation using tools such as SQL, Python, or ETL platforms.
  • Ensure referential integrity and optimize performance of integrated data systems.
  • Validate integrated data against source systems to ensure completeness and accuracy.
  • Support testing and troubleshooting during integration and post-deployment phases.
  • Maintain documentation of integration processes, mappings, and automation scripts.
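The responsibilities above amount to a validate-then-load ETL loop: parse the extract, apply data-quality rules, quarantine bad rows, and load the rest. A minimal sketch using sqlite3 as a stand-in for Access/SQL Server (with SQL Server the same flow would typically go through pyodbc; the table and columns are invented):

```python
import csv
import io
import sqlite3

def load_clients(csv_text, conn):
    """Parse a CSV extract, validate each row, load good rows into SQL,
    and report (loaded, rejected) counts."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS clients (id INTEGER PRIMARY KEY, name TEXT NOT NULL)"
    )
    rows, rejects = [], []
    for row in csv.DictReader(io.StringIO(csv_text)):
        if row["id"].isdigit() and row["name"].strip():    # data-quality rule
            rows.append((int(row["id"]), row["name"].strip()))
        else:
            rejects.append(row)                            # quarantine bad rows
    conn.executemany("INSERT INTO clients (id, name) VALUES (?, ?)", rows)
    conn.commit()
    return len(rows), len(rejects)

conn = sqlite3.connect(":memory:")
extract = "id,name\n1,Acme Ltd\n2,  \nx,Bad Id\n3,Globex"   # invented extract
print(load_clients(extract, conn))  # -> (2, 2)
```

Keeping the rejects rather than silently dropping them is what makes the later validation step ("validate integrated data against source systems") possible: loaded + rejected should reconcile with the source row count.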

Required Skills & Qualifications

  • Strong experience with Microsoft Access and/or SQL Server (queries, schema design, performance tuning).
  • Proficiency in data transformation and automation using SQL, Excel, or scripting languages.
  • Experience with ETL processes and data integration best practices.
  • Ability to troubleshoot data issues and resolve discrepancies independently.
  • Excellent documentation and communication skills.
  • Experience with ERP systems is a plus.
  • Strong data mapping skills and ability to translate business requirements into technical specifications.
  • Prior experience in automating data workflows and building scalable integration solutions.

Primary Skills

SQL, Microsoft Access, ETL, ADF


Data Integration Engineer

Bengaluru, Karnataka · ₹900,000 - ₹1,200,000 per year · GR Engineering Projects

Posted today


Job Description


  • Strong proficiency in Python and data libraries like Pandas.
  • Experience with web frameworks like Django, FastAPI, or Flask.
  • Hands-on experience with MongoDB or other NoSQL databases.
  • Proficiency in working with RESTful APIs and JSON.


Data integration engineer

₹1,500,000 - ₹2,500,000 per year · Accendia Technologies Pvt. Ltd

Posted today


Job Description

Role Overview

We are seeking a motivated Data Integration Engineer to join our engineering team. This individual will play a critical role in integrating and transforming large-scale data to power intelligent decision-making systems.

Key Responsibilities

  • Design, build, and maintain data pipelines and APIs using Python.
  • Integrate data from various sources including third-party APIs and internal systems.
  • Work with large, unstructured datasets and transform them into usable formats.
  • Collaborate with cross-functional teams to define data requirements and deliver timely solutions.
  • Leverage cloud-based services, especially AWS (EC2, S3), Snowflake / Databricks to scale data infrastructure.
  • Ensure high performance and responsiveness of applications.
  • Write clean, maintainable code with a focus on craftsmanship.

Required Skills & Experience

  • Strong proficiency in Python and data libraries like Pandas.
  • Experience with web frameworks like Django, FastAPI, or Flask.
  • Hands-on experience with MongoDB or other NoSQL databases.
  • Proficiency in working with RESTful APIs and JSON.
  • Familiarity with AWS services: EC2, S3, Snowflake / Databricks.
  • Solid understanding of data mining, data exploration, and troubleshooting data issues.
  • Real-world experience with large-scale data systems in cloud environments.
  • Ability to thrive in a fast-paced, high-growth, deadline-driven setting.
  • Self-starter with a strong sense of ownership and a passion for problem-solving.
  • Comfortable working with messy or unstructured data.

Preferred Qualifications

  • Bachelor's or Master's degree in Computer Science.
  • Exposure to Big Data and Machine Learning technologies is a plus.

Data Integration Engineer

Karnataka · ₹480,000 - ₹1,440,000 per year · BLUEWINGS

Posted today


Job Description

Key Details:

  • Location: Bangalore (Onsite)
  • Type: Work From Office, Bangalore Electronic City

We are seeking a motivated Data Integration Engineer to join our engineering team. This individual will play a critical role in integrating and transforming large-scale data to power intelligent decision-making systems.

Key Responsibilities

  • Design, build, and maintain data pipelines and APIs using Python.
  • Integrate data from various sources including third-party APIs and internal systems.
  • Work with large, unstructured datasets and transform them into usable formats.
  • Collaborate with cross-functional teams to define data requirements and deliver timely solutions.
  • Leverage cloud-based services, especially AWS (EC2, S3), Snowflake / Databricks to scale data infrastructure.
  • Ensure high performance and responsiveness of applications.
  • Write clean, maintainable code with a focus on craftsmanship.

Required Skills & Experience

  • Strong proficiency in Python and data libraries like Pandas.
  • Experience with web frameworks like Django, FastAPI, or Flask.
  • Hands-on experience with MongoDB or other NoSQL databases.
  • Proficiency in working with RESTful APIs and JSON.
  • Familiarity with AWS services: EC2, S3, Snowflake / Databricks.
  • Solid understanding of data mining, data exploration, and troubleshooting data issues.
  • Real-world experience with large-scale data systems in cloud environments.
  • Ability to thrive in a fast-paced, high-growth, deadline-driven setting.
  • Self-starter with a strong sense of ownership and a passion for problem-solving.
  • Comfortable working with messy or unstructured data.

Preferred Qualifications

  • Bachelor's or Master's degree in Computer Science.
  • Exposure to Big Data and Machine Learning technologies is a plus.

Interview Process for selected candidates

  1. First Round: conducted via Google Meet.
  2. Second Round: technical round, face to face.

Job Type: Full-time

Pay: ₹40,000.00 - ₹120,000.00 per month

Benefits:

  • Health insurance
  • Paid sick time
  • Provident Fund

Ability to commute/relocate:

  • Electronic City, Bengaluru, Karnataka: Reliably commute or planning to relocate before starting work (Required)

Location:

  • Electronic City, Bengaluru, Karnataka (Required)

Work Location: In person


data integration engineer

Bengaluru, Karnataka · ₹1,000,000 - ₹1,400,000 per year · Employee Hub

Posted today


Job Description: Data Integration Engineer

Location: Bangalore

Role Overview

We are seeking a motivated Data Integration Engineer to join our engineering team. This individual will play a critical role in integrating and transforming large-scale data to power intelligent decision-making systems.

Key Responsibilities

  • Design, build, and maintain data pipelines and APIs using Python.
  • Integrate data from various sources including third-party APIs and internal systems.
  • Work with large, unstructured datasets and transform them into usable formats.
  • Collaborate with cross-functional teams to define data requirements and deliver timely solutions.
  • Leverage cloud-based services, especially AWS (EC2, S3), Snowflake / Databricks to scale data infrastructure.
  • Ensure high performance and responsiveness of applications.
  • Write clean, maintainable code with a focus on craftsmanship.

Required Skills & Experience

  • Strong proficiency in Python and data libraries like Pandas.
  • Experience with web frameworks like Django / FastAPI / Flask.
  • Hands-on experience with MongoDB or other NoSQL databases.
  • Proficiency in working with RESTful APIs and JSON.
  • Familiarity with AWS services: EC2, S3, Snowflake / Databricks.
  • Solid understanding of data mining, data exploration, and troubleshooting data issues.
  • Real-world experience with large-scale data systems in cloud environments.
  • Ability to thrive in a fast-paced, high-growth, deadline-driven setting.
  • Self-starter with a strong sense of ownership and a passion for problem-solving.
  • Comfortable working with messy or unstructured data.

Preferred Qualifications

  • Bachelor's or Master's degree in Computer Science.
  • Exposure to Big Data and Machine Learning technologies is a plus.

Job Types: Full-time, Permanent

Pay: Up to ₹1,400,000.00 per year

Work Location: In person


Data Integration Engineer

Mumbai, Maharashtra · Hexaware Technologies

Posted today


Job Description

Description

JD - Data Analyst (MS SQL Server)

  • 5+ years of SQL development experience on MS SQL Server.
  • Designing and implementing database structures: creating tables, views, stored procedures, and other database objects.
  • Troubleshooting database issues: identifying and resolving database errors, performance issues, and other problems.
  • Experience in performance tuning, query optimization, and constructing dynamic queries.
  • Developing and maintaining database applications: writing SQL queries, creating reports, and developing database-driven applications.
  • Collaborating with other IT professionals: working with developers, system administrators, and others to ensure that the database meets the needs of the organization.
  • Good to have hands-on experience with the Vermilion Reporting Suite (VRS): developing and maintaining reports that provide insights into financial data, such as performance, risk, and compliance reports.
  • Ensuring data accuracy and consistency: validating data and ensuring that it is accurate and consistent across all reports.
  • Added advantage: experience in SSIS package development, testing, and deployment.
  • Provide support on projects including designing and maintaining metadata models and complex ETL packages: building SSIS packages, importing data from files, performing file operations, and tuning SSIS packages to ensure accurate and efficient movement of data.
  • Gathering and analyzing business requirements: working with business stakeholders to understand their data needs and translating those needs into Power BI solutions.
  • Perform unit testing and validation testing.
  • Knowledge of the Financial Markets or Asset Management domain.
  • Willing to learn new technology; must have good analytical skills.
  • Must have good communication skills (verbal and written) and be able to connect and coordinate with clients.
  • Good to have knowledge of SSRS, Crystal Reports, etc.
  • Working knowledge of Git, JIRA, and Control-M.
  • Staying up to date with new technologies: keeping up with the latest trends in financial reporting and applying them to the organization's Vermilion Reporting Suite.
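The "constructing dynamic queries" requirement above deserves a concrete note: dynamic SQL is safe only when column names come from a whitelist and values go through bind parameters. A sketch against sqlite3 (standing in for MS SQL Server; the table and columns are invented for illustration):

```python
import sqlite3

ALLOWED_FILTERS = {"region", "product"}   # whitelist of filterable columns

def build_report_query(filters):
    """Build a dynamic WHERE clause: column names are checked against a
    whitelist, values become bind parameters, never string concatenation."""
    clauses, params = [], []
    for column, value in filters.items():
        if column not in ALLOWED_FILTERS:
            raise ValueError(f"unknown filter: {column}")
        clauses.append(f"{column} = ?")
        params.append(value)
    where = " WHERE " + " AND ".join(clauses) if clauses else ""
    return f"SELECT region, product, amount FROM sales{where}", params

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                 [("APAC", "fx", 10.0), ("EMEA", "fx", 20.0)])
sql, params = build_report_query({"region": "APAC"})
print(conn.execute(sql, params).fetchall())  # -> [('APAC', 'fx', 10.0)]
```

On SQL Server the same discipline applies with `pyodbc` (or `sp_executesql` in T-SQL); only the parameter placeholder syntax differs.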
