938 Azure Data Factory jobs in India

Azure Data Factory

Chennai, Tamil Nadu | Cognizant

Posted 1 day ago


Job Description

**Job Summary**
The Sr. Developer role focuses on leveraging Azure Data Factory to design and implement data solutions in a hybrid work model. With 9 to 12 years of experience, the candidate will contribute to optimizing data processes and enhancing efficiency. Experience in Cards & Payments is advantageous, enabling impactful contributions to the company's data-driven initiatives.
**Responsibilities**
+ Develop and implement data solutions using Azure Data Factory to streamline data processes and enhance efficiency.
+ Collaborate with cross-functional teams to understand data requirements and translate them into effective data models.
+ Optimize data pipelines to ensure seamless integration and processing of large datasets.
+ Analyze and troubleshoot data-related issues, providing timely resolutions to maintain data integrity.
+ Ensure data security and compliance with industry standards, safeguarding sensitive information.
+ Contribute to the continuous improvement of data architecture and infrastructure.
+ Utilize domain knowledge in Cards & Payments to enhance data solutions and drive business insights.
+ Participate in code reviews and provide constructive feedback to peers, fostering a culture of excellence.
+ Stay updated with the latest advancements in Azure Data Factory and related technologies.
+ Document data processes and workflows to ensure clarity and facilitate future enhancements.
+ Support the deployment of data solutions in a hybrid work model, ensuring smooth transitions and minimal disruptions.
+ Engage in knowledge sharing sessions to promote best practices and innovative solutions.
+ Collaborate with stakeholders to align data solutions with business objectives, maximizing impact on company goals.
**Qualifications**
+ Possess extensive experience in Azure Data Factory, demonstrating proficiency in designing and implementing data solutions.
+ Have a strong understanding of data integration and processing techniques, ensuring efficient data workflows.
+ Experience in the Cards & Payments domain is a plus, providing valuable insights into industry-specific data needs.
+ Demonstrate excellent problem-solving skills with the ability to troubleshoot and resolve data-related issues effectively.
+ Exhibit strong communication skills, enabling effective collaboration with cross-functional teams.
+ Show commitment to continuous learning and staying updated with emerging data technologies.
Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.
This advertiser has chosen not to accept applicants from your region.

Azure Data Factory

Chennai, Tamil Nadu | Cognizant

Posted 1 day ago


Job Description

**Job Summary**
We are seeking a Sr. Developer with 8 to 10 years of experience to join our team. The ideal candidate will have expertise in Databricks SQL, Databricks Workflows, and PySpark, along with domain knowledge in Medicare and Medicaid Claims, Claims, and Payer. This hybrid role requires a proactive individual who can contribute to our projects during day shifts without the need for travel.
**Responsibilities**
+ Develop and maintain scalable data processing solutions using Databricks SQL and PySpark to enhance data analysis capabilities.
+ Collaborate with cross-functional teams to design and implement Databricks Workflows that streamline data operations and improve efficiency.
+ Analyze Medicare and Medicaid Claims data to identify trends and insights that can drive business decisions and improve healthcare outcomes.
+ Ensure data integrity and accuracy by implementing robust data validation and quality checks within the Databricks environment.
+ Optimize existing data pipelines to improve performance and reduce processing time, contributing to faster decision-making processes.
+ Provide technical expertise and support to team members, fostering a collaborative environment that encourages knowledge sharing and innovation.
+ Monitor and troubleshoot data processing workflows to ensure seamless operations and minimize downtime.
+ Document technical specifications and processes to maintain a comprehensive knowledge base for future reference and training purposes.
+ Stay updated with the latest advancements in Databricks and PySpark technologies to continuously improve data processing capabilities.
+ Collaborate with stakeholders to understand business requirements and translate them into technical solutions that align with company objectives.
+ Participate in code reviews and provide constructive feedback to ensure high-quality code standards are maintained across the team.
+ Contribute to the development of best practices and guidelines for data processing and analysis within the organization.
+ Support the company's mission by leveraging data insights to improve healthcare services and positively impact society.
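The data-validation and quality-check responsibilities above can be sketched as a small rule-driven check. This is a minimal plain-Python illustration (in practice such checks would run inside the Databricks environment the listing describes); the field names `claim_id` and `amount` are invented for the example:

```python
# Minimal sketch of rule-driven data validation over row dictionaries.
# Field names ("claim_id", "amount") are illustrative, not from the listing.

def validate(rows, rules):
    """Split rows into (valid_rows, violations) by applying named rules."""
    valid, violations = [], []
    for row in rows:
        failed = [name for name, check in rules.items() if not check(row)]
        if failed:
            violations.append((row, failed))
        else:
            valid.append(row)
    return valid, violations

rules = {
    "claim_id_present": lambda r: bool(r.get("claim_id")),
    "amount_non_negative": lambda r: (r.get("amount") or 0) >= 0,
}

rows = [
    {"claim_id": "C1", "amount": 120.0},
    {"claim_id": "",   "amount": 55.0},   # fails claim_id_present
    {"claim_id": "C3", "amount": -10.0},  # fails amount_non_negative
]
valid, violations = validate(rows, rules)
print(len(valid), len(violations))  # 1 2
```

Violations would typically be routed to a quarantine table or alerting channel rather than silently dropped.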
**Qualifications**
+ Possess strong expertise in Databricks SQL, Databricks Workflows, and PySpark, with a proven track record of successful implementations.
+ Demonstrate in-depth knowledge of the Medicare and Medicaid Claims, Claims, and Payer domains, with the ability to apply this knowledge to real-world scenarios.
+ Exhibit excellent problem-solving skills and the ability to work independently in a hybrid work model.
+ Show proficiency in data validation and quality assurance techniques to ensure data accuracy and reliability.
+ Have strong communication skills to effectively collaborate with cross-functional teams and stakeholders.
+ Display a commitment to continuous learning and staying abreast of industry trends and technological advancements.
+ Hold a bachelor's degree in Computer Science, Information Technology, or a related field; relevant certifications are a plus.
Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.

Azure Data Factory

Chennai, Tamil Nadu | ₹1500000 - ₹2500000 / year | Artech

Posted today


Job Description

Job Title: Developer

Work Location: Chennai, TN.

Skill Required: Digital: Cloud DevOps; Digital: Microsoft Azure; Digital: DevOps Continuous Integration and Continuous Delivery (CI/CD); Azure Data Factory

Experience Range in Required Skills: 5-7 years

Job Description:

• Data Pipeline Design: Architect and implement scalable, reliable data pipelines using Databricks, Spark, and Delta Lake

• Medallion Architecture Implementation: Ingest raw data into the Bronze layer, process and clean it in the Silver layer, and aggregate/transform it for analytics in the Gold layer

• Data Modeling: Design and optimize data models for efficient storage, retrieval, and analytics

• Data Quality & Security: Implement data validation, quality checks, and security controls throughout the pipeline

• Automation & Monitoring: Automate pipeline execution, monitor performance, and troubleshoot issues

• Collaboration: Work closely with data scientists, analysts, and business stakeholders to deliver high-quality, analytics-ready data
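The Bronze/Silver/Gold flow described above can be sketched in plain Python. A production pipeline would use Spark DataFrames and Delta Lake tables as the listing states; this sketch only illustrates the layer-by-layer flow, and the record fields are made up:

```python
# Bronze: ingest raw records as-is, tagging ingestion metadata.
raw = [
    {"order_id": "A1", "amount": "10.5", "country": "IN"},
    {"order_id": "A2", "amount": "bad",  "country": "IN"},   # malformed amount
    {"order_id": "A3", "amount": "7.0",  "country": "US"},
]
bronze = [dict(r, _source="orders.csv") for r in raw]

# Silver: clean and type the bronze records, dropping rows that fail parsing.
silver = []
for r in bronze:
    try:
        silver.append({"order_id": r["order_id"],
                       "amount": float(r["amount"]),
                       "country": r["country"]})
    except ValueError:
        pass  # with Delta Lake, this row might be quarantined instead

# Gold: aggregate the cleaned data for analytics (revenue per country).
gold = {}
for r in silver:
    gold[r["country"]] = gold.get(r["country"], 0.0) + r["amount"]

print(gold)  # {'IN': 10.5, 'US': 7.0}
```

The key property of the pattern is that each layer is derived from the previous one, so a bad transformation can be re-run without re-ingesting source data.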

Essential Skills:

• Databricks Certified Associate Developer for Apache Spark (or equivalent)

• Strong Python/Scala for Spark development

• Experience with Delta Lake and Spark DataFrame API

• Proven experience migrating from Cloudera to Databricks

• Data pipeline orchestration (Databricks Jobs, Airflow, etc.)

• Data quality and security best practices

Desirable Skills:

• Databricks Certified Data Engineer Associate

• SQL and advanced query optimization

• Cloud platforms (Azure, AWS, GCP)

• Real-time/streaming data processing (Spark Structured Streaming)

• DevOps/MLOps (CI/CD, Docker, Kubernetes)

• Experience with data governance and lineage tools


Azure Data Factory

Gurugram, Haryana | ₹900000 - ₹1200000 / year | Multicloud4u Technologies

Posted today


Job Description

• Technical expertise in Azure Data Factory, SQL, Oracle, and Snowflake
• Finance or ERP domain experience
• SQL Server and Oracle databases
• Snowflake data warehouses
• Data Factory, Synapse, and related services


Azure Data Factory

Pune, Maharashtra | ₹900000 - ₹1200000 / year | ID4 Consultancy

Posted today


Job Description

• Azure Data Factory: Experience in building and managing data pipelines.
• Azure Data Lake Gen 2: Proficient in data storage and management within Azure's data lake environment.
• Azure Databricks / Apache Spark: Hands-on skill with distributed data processing, transformations, and analytics.
• Power BI: Expertise in data visualization and reporting.

Good to have

• Basic SQL Performance Tuning: Ability to write and optimize SQL queries.
• Data Governance & Unity Catalog: Understanding of data governance principles and experience with Unity Catalog for data management.
• Certification: Microsoft DP-203 (Azure Data Engineer Associate) certification.
• CI/CD Pipelines: Experience implementing CI/CD pipelines for Azure Data Factory or Databricks projects.


Azure Data Factory

₹1500000 - ₹2500000 / year | IDESLABS PRIVATE LIMITED

Posted today


Job Description

We are looking for a skilled professional with 12-16 years of experience to join our team as an Azure Data Engineer in Bengaluru. The ideal candidate will have expertise in writing Terraform code to deploy Azure Data Services on the Azure platform.

Roles and Responsibility

  • Design, develop, and implement data pipelines using Azure Data Factory.
  • Deploy and manage Azure Data Lake and Azure Databricks services.
  • Develop and maintain Terraform templates and modules for data infrastructure.
  • Collaborate with cross-functional teams to ensure seamless integration of data systems.
  • Troubleshoot and resolve issues related to data quality and performance.
  • Ensure compliance with security and governance standards.
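Since Terraform also accepts JSON-syntax configuration (`*.tf.json`), a minimal Azure Data Factory deployment can be sketched by emitting that JSON from Python. The `azurerm_data_factory` resource and its attributes follow the azurerm provider, but all names and values here are placeholders, not a tested module:

```python
import json

# Sketch: render a *.tf.json fragment declaring an Azure Data Factory
# instance. Resource/attribute names assume the azurerm provider;
# "example-adf", "centralindia", and "example-rg" are placeholders.
config = {
    "resource": {
        "azurerm_data_factory": {
            "adf": {
                "name": "example-adf",
                "location": "centralindia",
                "resource_group_name": "example-rg",
            }
        }
    }
}
tf_json = json.dumps(config, indent=2)
print(tf_json)
```

Writing the configuration from code like this is one way to template per-environment deployments; hand-written `.tf` files with variables are the more common approach.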

Job Requirements

  • Strong knowledge of Azure CLI, Terraform, and Bitbucket.
  • Experience with Azure Subscription, Networking, Resource Groups, and ETL services deployment.
  • Familiarity with GitHub is a plus.
  • Excellent problem-solving skills and attention to detail.
  • Ability to work collaboratively in a team environment.
  • Strong understanding of data engineering principles and practices.

Azure Data Factory

₹1500000 - ₹2500000 / year | Infosys

Posted today


Job Description

Educational Requirements

MCA, MSc, Bachelor of Engineering, BBA, BSc

Service Line

Data & Analytics Unit

Responsibilities

A day in the life of an Infoscion:
- As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction.
- You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain.
- You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews.
- You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes.
- You would be a key contributor to building efficient programs and systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you.

Additional Responsibilities:

  • Knowledge of more than one technology
  • Basics of Architecture and Design fundamentals
  • Knowledge of Testing tools
  • Knowledge of agile methodologies
  • Understanding of Project life cycle activities on development and maintenance projects
  • Understanding of one or more Estimation methodologies, Knowledge of Quality processes
  • Basics of business domain to understand the business requirements
  • Analytical abilities, Strong Technical Skills, Good communication skills
  • Good understanding of the technology and domain
  • Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods
  • Awareness of latest technologies and trends
  • Excellent problem solving, analytical and debugging skills

Technical and Professional Requirements:

  • Primary skills: Technology->Cloud Platform->Azure Development & Solution Architecting

Preferred Skills:

Technology->Cloud Platform->Azure Development & Solution Architecting

Generic Skills:

Technology->Cloud Integration->Azure Data Factory (ADF)



Azure Data Factory

₹1500000 - ₹2500000 / year | Tekskills

Posted today


Job Description

Role & responsibilities

  • Azure Data Factory (ADF), Azure Databricks with Python experience
  • Specifically looking for hands-on experience in Databricks Delta Live Tables (DLT), Expectations, and Workflows orchestration, which are extensively used on the ADI Data Engineering side, along with PySpark programming and the Unity Catalog feature
  • Good to know: Azure Synapse, Snowflake
  • Decent communication skills
  • Able to collaborate with multiple project teams
  • Excellent stakeholder management skills
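Delta Live Tables Expectations attach row-level quality constraints to a table definition via decorators such as `@dlt.expect_or_drop`. Outside a Databricks runtime, the idea can be mimicked with a small decorator; this is a plain-Python sketch of the concept, not the DLT API itself:

```python
import functools

def expect_or_drop(name, predicate):
    """Mimic of a DLT drop-style expectation: filter out failing rows."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            rows = fn(*args, **kwargs)
            kept = [r for r in rows if predicate(r)]
            dropped = len(rows) - len(kept)
            # DLT records expectation metrics; here we just report them.
            print(f"expectation {name!r}: dropped {dropped} row(s)")
            return kept
        return inner
    return wrap

@expect_or_drop("valid_amount", lambda r: r["amount"] > 0)
def orders():
    # Stand-in for a table-producing function; data is invented.
    return [{"id": 1, "amount": 10}, {"id": 2, "amount": -3}]

result = orders()
```

In real DLT, the equivalent declaration sits on a `@dlt.table` function and the platform tracks pass/fail metrics per expectation.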



Azure Data Factory

Bengaluru, Karnataka | ₹2000000 - ₹2500000 / year | SWITS DIGITAL Private Limited

Posted today


Job Description

Job Title: Azure Data Factory

Experience: 7+ years

Location: PAN India (Preferred: Bangalore / Mumbai)

Key Responsibilities

  • Design, build, and implement data pipelines using Fabric, Azure Data Factory, PySpark, SparkSQL, SQL, and Azure DevOps.
  • Develop and maintain ETL scripts and workflows to enable seamless data integration across multiple sources.
  • Analyze functional specifications and design dimensional data models, KPIs, and metrics to support business reporting needs.
  • Ingest data from diverse applications while ensuring compliance with business SLAs.
  • Implement data security measures including encryption, masking, and access controls.
  • Define and maintain data validation rules, quality checks, and profiling reports.
  • Execute data migration and conversion from legacy systems to modern cloud-based data platforms.
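The masking responsibility above can be illustrated with a minimal column-masking helper. Real implementations would typically rely on platform features such as SQL Dynamic Data Masking or Unity Catalog column masks; this plain-Python sketch, with invented column names, shows only the idea:

```python
def mask(value, keep=4, char="*"):
    """Mask all but the last `keep` characters of a value."""
    s = str(value)
    if len(s) <= keep:
        return char * len(s)
    return char * (len(s) - keep) + s[-keep:]

def mask_columns(rows, columns):
    """Return copies of row dicts with the given columns masked."""
    return [dict(r, **{c: mask(r[c]) for c in columns if c in r}) for r in rows]

rows = [{"member_id": "123456789", "state": "TN"}]
print(mask_columns(rows, ["member_id"]))  # [{'member_id': '*****6789', 'state': 'TN'}]
```

Masking at read time like this complements, but does not replace, encryption at rest and access controls on the underlying storage.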

Primary Skills

  • 5+ years of hands-on experience in Azure Data Factory for pipeline orchestration and integration.
  • Strong expertise in PySpark & SparkSQL for distributed data processing.
  • Advanced SQL programming skills with query optimization.
  • Deep understanding of Data Warehousing concepts and dimensional modeling.
  • Proven experience in end-to-end data engineering projects with large-scale datasets.

Core Skills

  • Azure Data Factory | PySpark | SparkSQL
  • Advanced SQL | Data Warehousing | ETL Development
  • Fabric | Azure DevOps | CI/CD Pipelines
  • Dimensional Modeling | KPI & Metrics Development
  • Data Migration | Data Security | Quality & Validation