7,786 Integration jobs in India
Data Integration Engineer
Posted today
Job Description
Must Have:
- Experience in Data Engineering with a strong focus on Databricks
- Proficiency in Python, SQL, and Spark (PySpark) programming
- Hands-on experience with Delta Lake, Unity Catalog, and MLflow (see the sketch after this posting)
- Experience working with CI/CD pipelines
Nice to Have:
- Exposure to the Azure ecosystem and its services
- Experience developing ELT/ETL frameworks
- Automation of workflows for ingesting structured, semi-structured, and unstructured data
- Familiarity with data visualization tools such as Power BI
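For context on the Delta Lake bullet above, the following is a minimal, hedged sketch of writing and reading a Delta table from PySpark. It assumes a local session with the open-source delta-spark package (pip install delta-spark); on Databricks the session and catalog would already be configured, and the path shown is purely illustrative.

```python
# Minimal Delta Lake round trip with PySpark; assumes `pip install delta-spark`.
from delta import configure_spark_with_delta_pip
from pyspark.sql import SparkSession

builder = (
    SparkSession.builder.appName("delta-demo")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# Write a tiny DataFrame as a Delta table (illustrative path).
df = spark.createDataFrame([(1, "alice"), (2, "bob")], ["id", "name"])
df.write.format("delta").mode("overwrite").save("/tmp/demo_delta")

# Read it back; Delta adds ACID transactions and time travel on top of Parquet.
spark.read.format("delta").load("/tmp/demo_delta").show()
```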
Data Integration Engineer
Posted today
Job Description
Location: Bangalore
Role Overview
We are seeking a motivated Data Integration Engineer to join our engineering team. This individual will play a critical role in integrating and transforming large-scale data to power intelligent decision-making systems.
Key Responsibilities
Design, build, and maintain data pipelines and APIs using Python.
Integrate data from various sources including third-party APIs and internal systems.
Work with large, unstructured datasets and transform them into usable formats.
Collaborate with cross-functional teams to define data requirements and deliver timely solutions.
Leverage cloud-based services, especially AWS (EC2, S3) and Snowflake/Databricks, to scale data infrastructure.
Ensure high performance and responsiveness of applications.
Write clean, maintainable code with a focus on craftsmanship.
Required Skills & Experience
Strong proficiency in Python and data libraries like Pandas.
Experience with web frameworks like Django / FastAPI / Flask (see the FastAPI sketch after this posting).
Hands-on experience with MongoDB or other NoSQL databases.
Proficiency in working with RESTful APIs and JSON.
Familiarity with AWS services: EC2, S3, Snowflake / Databricks.
Solid understanding of data mining, data exploration, and troubleshooting data issues.
Real-world experience with large-scale data systems in cloud environments.
Ability to thrive in a fast-paced, high-growth, deadline-driven setting.
Self-starter with a strong sense of ownership and a passion for problem-solving.
Comfortable working with messy or unstructured data.
Preferred Qualifications
Bachelor's or Master's degree in Computer Science.
Exposure to Big Data and Machine Learning technologies is a plus.
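To make the Python/FastAPI/JSON requirements above concrete, here is a hedged sketch of a small ingestion endpoint that normalizes posted JSON records with Pandas. The route name and payload shape are assumptions for illustration, not part of the posting.

```python
# A small FastAPI ingestion endpoint; run with `uvicorn app:app` if saved as app.py.
from fastapi import FastAPI
import pandas as pd

app = FastAPI()

@app.post("/ingest")
def ingest(records: list[dict]):
    # Flatten semi-structured JSON records into a tabular frame.
    df = pd.json_normalize(records)
    df = df.dropna(how="all")  # drop rows with no usable values
    return {"rows": len(df), "columns": list(df.columns)}
```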
Data Integration Engineer
Posted today
Job Description
Job Description – Senior Data Integration Engineer (Azure Fabric + CRM Integrations)
Location: Noida
Employment Type: Full-time
About Crenovent Technologies
Crenovent is building RevAi Pro, an enterprise-grade Revenue Operations SaaS platform that integrates CRM, billing, contract, and marketing systems with AI agents and Generative AI search. Our vision is to redefine RevOps with AI-driven automation, real-time intelligence, and industry-specific workflows.
We are now hiring a Senior Data Integration Engineer to lead the integration of CRM platforms (Salesforce, Microsoft Dynamics, HubSpot) into Azure Fabric and enable secure, multi-tenant ingestion pipelines for RevAi Pro.
Role Overview
You will be responsible for designing, building, and scaling data pipelines that bring CRM data into Azure Fabric (OneLake, Data Factory, Synapse-style pipelines) and transform it into RevAi Pro's standardized schema (50+ core fields, industry-specific mappings).
This is a hands-on, architecture + build role where you will work closely with RevOps SMEs, product engineers, and AI teams to ensure seamless data availability, governance, and performance across multi-tenant environments.
Key Responsibilities
Data Integration & Pipelines
- Design and implement data ingestion pipelines from Salesforce, Dynamics 365, and HubSpot into Azure Fabric.
- Build ETL/ELT workflows using Azure Data Factory, Fabric pipelines, and Python/SQL.
- Ensure real-time and batch sync options for CRM objects (Leads, Accounts, Opportunities, Forecasts, Contracts).
Schema & Mapping
- Map CRM fields to RevAi Pro's standardized schema (50+ fields across industries); see the mapping sketch after this posting.
- Maintain schema consistency across SaaS, Banking, Insurance, and E-commerce use cases.
- Implement data transformation, validation, and enrichment logic.
Data Governance & Security
- Implement multi-tenant isolation policies in Fabric (Purview, RBAC, field-level masking).
- Ensure PII compliance and GDPR/SOC2 readiness.
- Build audit logs, lineage tracking, and monitoring dashboards.
Performance & Reliability
- Optimize pipeline performance (latency, refresh frequency, cost efficiency).
- Implement autoscaling, retry logic, and error handling in pipelines.
- Work with DevOps to set up CI/CD for Fabric integrations.
Collaboration
- Work with RevOps SMEs to validate business logic for CRM fields.
- Partner with AI/ML engineers to expose clean data to agents and GenAI models.
- Collaborate with frontend/backend developers to provide APIs for RevAi Pro modules.
Required Skills & Experience
- 3+ years in Data Engineering / Integration roles.
- Strong expertise in Microsoft Azure Fabric, including OneLake, Data Factory, Synapse pipelines, and Power Query.
- Hands-on experience with CRM APIs & data models: Salesforce, Dynamics 365, HubSpot.
- Strong SQL and Python for data transformations.
- Experience with ETL/ELT workflows, schema mapping, and multi-tenant SaaS data handling.
- Knowledge of data governance tools (Azure Purview, RBAC, PII controls).
- Strong grasp of cloud security & compliance (GDPR, SOC2; HIPAA optional).
Preferred (Nice to Have)
- Prior experience building integrations for Revenue Operations, Sales, or CRM platforms.
- Knowledge of middleware (MuleSoft, Boomi, Workato, Azure Logic Apps).
- Familiarity with AI/ML data pipelines.
- Experience with multi-cloud integrations (AWS, GCP).
- Understanding of business RevOps metrics (pipeline, forecast, quota, comp plans).
Soft Skills
- Strong ownership and problem-solving ability.
- Ability to translate business needs (RevOps fields) into technical data pipelines.
- Collaborative mindset with cross-functional teams.
- Comfortable working in a fast-paced startup environment.
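As a hedged illustration of the CRM-to-schema mapping this role describes, the sketch below queries Salesforce's REST API and remaps a few Account fields to snake_case target names. The API version, field map, and bearer-token handling are assumptions; RevAi Pro's actual 50+ field schema and Fabric ingestion path are not public.

```python
# Sketch: pull Salesforce Accounts via the REST query endpoint and remap fields.
import requests

# Hypothetical mapping from Salesforce field names to a standardized schema.
FIELD_MAP = {"Name": "account_name", "Industry": "industry", "AnnualRevenue": "annual_revenue"}

def fetch_accounts(instance_url: str, token: str) -> list[dict]:
    soql = "SELECT Name, Industry, AnnualRevenue FROM Account"
    resp = requests.get(
        f"{instance_url}/services/data/v59.0/query",  # assumed API version
        headers={"Authorization": f"Bearer {token}"},
        params={"q": soql},
        timeout=30,
    )
    resp.raise_for_status()
    return [
        {dst: rec.get(src) for src, dst in FIELD_MAP.items()}
        for rec in resp.json()["records"]
    ]
```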
Data Integration Engineer
Posted today
Job Description
Interested candidates can DM or call me.
Job Description: Data Integration Engineer
Position: Data Integration Engineer
Experience: 5-8 years
Overview
We are seeking a skilled Data Integration Engineer to lead the integration of client data from multiple source systems, including QuickBooks, Excel, CSV files, and other legacy platforms, into Microsoft Access or SQL Server. This role will focus on designing and automating data pipelines, ensuring data accuracy, consistency, and performance across systems.
Responsibilities
- Collaborate with Implementation and technical teams to gather data integration requirements.
- Design and implement automated data pipelines to extract, transform, and load (ETL) data into Access or SQL databases (a minimal load sketch follows this posting).
- Analyze and map source data to target schemas, ensuring alignment with business rules and data quality standards.
- Develop and document data mapping specifications, transformation logic, and validation procedures.
- Automate data extraction and transformation using tools such as SQL, Python, or ETL platforms.
- Ensure referential integrity and optimize performance of integrated data systems.
- Validate integrated data against source systems to ensure completeness and accuracy.
- Support testing and troubleshooting during integration and post-deployment phases.
- Maintain documentation of integration processes, mappings, and automation scripts.
Required Skills & Qualifications
- Strong experience with Microsoft Access and/or SQL Server (queries, schema design, performance tuning).
- Proficiency in data transformation and automation using SQL, Excel, or scripting languages.
- Experience with ETL processes and data integration best practices.
- Ability to troubleshoot data issues and resolve discrepancies independently.
- Excellent documentation and communication skills.
- Experience with ERP systems is a plus.
- Strong data mapping skills and ability to translate business requirements into technical specifications.
- Prior experience in automating data workflows and building scalable integration solutions.
Primary Skills
SQL, Microsoft Access, ETL (ADF)
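As a minimal sketch of the pipeline work above, the snippet loads a CSV extract into SQL Server with Pandas and SQLAlchemy and then validates the row count. The connection string, file name, and table are placeholders, and a real pipeline would add type mapping and error handling.

```python
# Load a CSV extract into SQL Server and validate completeness (placeholders throughout).
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine(
    "mssql+pyodbc://user:pass@server/clients_db?driver=ODBC+Driver+18+for+SQL+Server"
)

df = pd.read_csv("quickbooks_export.csv")                 # hypothetical source file
df["invoice_date"] = pd.to_datetime(df["invoice_date"])   # normalize types before load

df.to_sql("invoices", engine, if_exists="append", index=False)

# Validate the load against the source extract.
with engine.connect() as conn:
    loaded = conn.execute(text("SELECT COUNT(*) FROM invoices")).scalar()
print(f"source rows: {len(df)}, loaded rows: {loaded}")
```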
Data Integration Engineer
Posted today
Job Description
- Strong proficiency in Python and data libraries like Pandas.
- Experience with web frameworks like Django, FastAPI, or Flask.
- Hands-on experience with MongoDB or other NoSQL databases (see the sketch below).
- Proficiency in working with RESTful APIs and JSON.
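For the MongoDB bullet, here is a hedged sketch of a basic document insert and query with PyMongo; the connection string, database, and collection names are placeholders.

```python
# Basic PyMongo insert/query; assumes a reachable MongoDB instance.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # placeholder connection string
events = client["demo_db"]["events"]

events.insert_one({"source": "rest_api", "payload": {"id": 1, "status": "ok"}})
for doc in events.find({"source": "rest_api"}).limit(5):
    print(doc["payload"])
```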
Data Integration Engineer
Posted today
Job Description
Role Overview
We are seeking a motivated Data Integration Engineer to join our engineering team. This individual will play a critical role in integrating and transforming large-scale data to power intelligent decision-making systems.
Key Responsibilities
- Design, build, and maintain data pipelines and APIs using Python.
- Integrate data from various sources including third-party APIs and internal systems.
- Work with large, unstructured datasets and transform them into usable formats.
- Collaborate with cross-functional teams to define data requirements and deliver timely solutions.
- Leverage cloud-based services, especially AWS (EC2, S3) and Snowflake/Databricks, to scale data infrastructure (see the S3 sketch after this posting).
- Ensure high performance and responsiveness of applications.
- Write clean, maintainable code with a focus on craftsmanship.
Required Skills & Experience
- Strong proficiency in Python and data libraries like Pandas.
- Experience with web frameworks like Django, FastAPI, or Flask.
- Hands-on experience with MongoDB or other NoSQL databases.
- Proficiency in working with RESTful APIs and JSON.
- Familiarity with AWS services: EC2, S3, Snowflake / Databricks.
- Solid understanding of data mining, data exploration, and troubleshooting data issues.
- Real-world experience with large-scale data systems in cloud environments.
- Ability to thrive in a fast-paced, high-growth, deadline-driven setting.
- Self-starter with a strong sense of ownership and a passion for problem-solving.
- Comfortable working with messy or unstructured data.
Preferred Qualifications
- Bachelor's or Master's degree in Computer Science.
- Exposure to Big Data and Machine Learning technologies is a plus.
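Since this posting leans on AWS (EC2, S3), here is a hedged sketch of reading an S3 object into Pandas with boto3; the bucket and key are illustrative, and credentials are assumed to come from the standard AWS credential chain.

```python
# Read a CSV object from S3 into a DataFrame (illustrative bucket/key).
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")  # credentials resolved via the standard AWS chain
obj = s3.get_object(Bucket="example-data-bucket", Key="exports/users.csv")
df = pd.read_csv(io.BytesIO(obj["Body"].read()))
print(df.head())
```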
Data Integration Engineer
Posted today
Job Description
Key Details:
- Location: Bangalore (on-site)
- Type: Work from office, Bangalore Electronic City
We are seeking a motivated Data Integration Engineer to join our engineering team. This individual will play a critical role in integrating and transforming large-scale data to power intelligent decision-making systems.
Key Responsibilities
- Design, build, and maintain data pipelines and APIs using Python.
- Integrate data from various sources including third-party APIs and internal systems.
- Work with large, unstructured datasets and transform them into usable formats.
- Collaborate with cross-functional teams to define data requirements and deliver timely solutions.
- Leverage cloud-based services, especially AWS (EC2, S3) and Snowflake/Databricks, to scale data infrastructure (see the Snowflake sketch after this posting).
- Ensure high performance and responsiveness of applications.
- Write clean, maintainable code with a focus on craftsmanship.
Required Skills & Experience
- Strong proficiency in Python and data libraries like Pandas.
- Experience with web frameworks like Django, FastAPI, or Flask.
- Hands-on experience with MongoDB or other NoSQL databases.
- Proficiency in working with RESTful APIs and JSON.
- Familiarity with AWS services: EC2, S3, Snowflake / Databricks.
- Solid understanding of data mining, data exploration, and troubleshooting data issues.
- Real-world experience with large-scale data systems in cloud environments.
- Ability to thrive in a fast-paced, high-growth, deadline-driven setting.
- Self-starter with a strong sense of ownership and a passion for problem-solving.
- Comfortable working with messy or unstructured data.
Preferred Qualifications
- Bachelor's or Master's degree in Computer Science.
- Exposure to Big Data and Machine Learning technologies is a plus.
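For the Snowflake side of the stack above, a hedged sketch of running a query through the Snowflake Python connector; the account, credentials, warehouse, and table are placeholders.

```python
# Query Snowflake via the official Python connector (placeholder credentials).
import snowflake.connector

conn = snowflake.connector.connect(
    account="example-account",
    user="demo_user",
    password="***",
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT source, COUNT(*) FROM events GROUP BY source")
    for source, n in cur.fetchall():
        print(source, n)
finally:
    conn.close()
```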
Interview Process for selected candidates
First Round: Conducted via Google Meet.
Second Round: Technical round, face to face.
Job Type: Full-time
Pay: ₹40,000.00 - ₹120,000.00 per month
Benefits:
- Health insurance
- Paid sick time
- Provident Fund
Ability to commute/relocate:
- Electronic City, Bengaluru, Karnataka: Reliably commute or planning to relocate before starting work (Required)
Location:
- Electronic City, Bengaluru, Karnataka (Required)
Work Location: In person
Data Integration Engineer
Posted today
Job Description
Job Description: Data Integration Engineer
Location: Bangalore
Role Overview
We are seeking a motivated Data Integration Engineer to join our engineering team. This individual will play a critical role in integrating and transforming large-scale data to power intelligent decision-making systems.
Key Responsibilities
- Design, build, and maintain data pipelines and APIs using Python.
- Integrate data from various sources including third-party APIs and internal systems.
- Work with large, unstructured datasets and transform them into usable formats.
- Collaborate with cross-functional teams to define data requirements and deliver timely solutions.
- Leverage cloud-based services, especially AWS (EC2, S3) and Snowflake/Databricks, to scale data infrastructure.
- Ensure high performance and responsiveness of applications.
- Write clean, maintainable code with a focus on craftsmanship.
Required Skills & Experience
- Strong proficiency in Python and data libraries like Pandas.
- Experience with web frameworks like Django / FastAPI / Flask.
- Hands-on experience with MongoDB or other NoSQL databases.
- Proficiency in working with RESTful APIs and JSON.
- Familiarity with AWS services: EC2, S3, Snowflake / Databricks.
- Solid understanding of data mining, data exploration, and troubleshooting data issues.
- Real-world experience with large-scale data systems in cloud environments.
- Ability to thrive in a fast-paced, high-growth, deadline-driven setting.
- Self-starter with a strong sense of ownership and a passion for problem-solving.
- Comfortable working with messy or unstructured data.
Preferred Qualifications
- Bachelor's or Master's degree in Computer Science.
- Exposure to Big Data and Machine Learning technologies is a plus.
Job Types: Full-time, Permanent
Pay: Up to ₹1,400,000.00 per year
Work Location: In person
Data Integration Engineer
Posted today
Job Description
Description
JD - Data Analyst (MS SQL Server)
- 5+ years of SQL development experience on MS SQL Server, designing and implementing database structures: creating tables, views, stored procedures, and other database objects.
- Troubleshooting database issues: identifying and resolving database errors, performance issues, and other problems.
- Experience in performance tuning, query optimization, and constructing dynamic queries (a parameterized-query sketch follows this description).
- Developing and maintaining database applications: writing SQL queries, creating reports, and developing database-driven applications.
- Collaborating with other IT professionals: working with developers, system administrators, and others to ensure that the database meets the needs of the organization.
- Good to have hands-on experience with the Vermilion Reporting Suite (VRS): developing and maintaining reports that provide insights into financial data, such as performance, risk, and compliance reports.
- Ensuring data accuracy and consistency: validating data and ensuring that it is accurate and consistent across all reports.
- Added advantage: experience in SSIS package development, testing, and deployment.
- Provide support on projects including designing and maintaining metadata models and complex ETL packages: building SSIS packages, importing data from files, handling file operations, and tuning SSIS packages to ensure accurate and efficient movement of data.
- Gathering and analyzing business requirements: working with business stakeholders to understand their data needs and translating those needs into Power BI solutions.
- Perform unit testing/validation testing.
- Knowledge of the Financial Markets or Asset Management domain.
- Willing to learn new technology; must have good analytical skills.
- Must have good communication (verbal, written) skills; able to connect and coordinate with clients.
- Good to have knowledge of SSRS, Crystal Reports, etc.
- Practical knowledge of Git, JIRA, and Control-M.
- Keeping up to date with new technologies: following the latest trends and developments in financial reporting and applying them to the organization's Vermilion Reporting Suite.
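As a hedged companion to the SQL Server skills listed above, the sketch below runs a parameterized query through pyodbc, the usual safe alternative to string-built dynamic SQL; the connection string and table are placeholders.

```python
# Parameterized SQL Server query via pyodbc (placeholder DSN and table).
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=localhost;"
    "DATABASE=reports;UID=demo;PWD=***;TrustServerCertificate=yes"
)
cur = conn.cursor()
cur.execute(
    "SELECT report_name, run_date FROM dbo.report_runs WHERE run_date >= ?",
    "2024-01-01",  # the ? placeholder keeps the query safe from injection
)
for name, run_date in cur.fetchall():
    print(name, run_date)
conn.close()
```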