29,956 Azure Data Engineer Data Factory Synapse Analytics jobs in India
Data Warehousing Engineer
Posted today
Job Description
Who are we looking for?
- We are looking for a candidate with 7+ years of database development experience, including at least 5 years of relevant experience, and strong SQL skills in creating database objects such as tables, stored procedures, DDL/DML triggers, views, indexes, cursors, functions, and user-defined data types.
Technical Skills:
- 7+ years of database development experience, with a minimum of 5 years of relevant experience.
- Strong PL/SQL experience in creating database objects such as tables, stored procedures, DDL/DML triggers, views, indexes, cursors, functions, and user-defined data types.
- Expertise in Oracle performance tuning concepts, including Oracle hints and the EXPLAIN PLAN tool.
- Strong experience using SQL and PL/SQL features such as built-in functions, analytic functions, cursors, cursor variables, native dynamic SQL, bulk binding techniques, and packages/procedures/functions wherever applicable to process data efficiently.
- Strong understanding of data warehousing and Extract, Transform, Load (ETL) processes.
- Sound understanding of RDBMS (Oracle).
- Experience using Oracle SQL*Loader/external file utilities to load files.
- Good to have: experience with the Snowflake cloud data platform, including utilities such as SnowSQL and Snowpipe, and data loading within the cloud (AWS or Azure).
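As a small illustration of the analytic (window) function pattern the list above calls for, here is a sketch that uses SQLite rather than Oracle purely for portability; the table and column names are hypothetical, not taken from any listed role.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (region TEXT, amount REAL)")
cur.executemany("INSERT INTO orders VALUES (?, ?)",
                [("NORTH", 100.0), ("NORTH", 250.0), ("SOUTH", 75.0)])

# Analytic (window) functions rank and aggregate within a partition
# without collapsing rows the way GROUP BY would.
rows = cur.execute("""
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
           SUM(amount) OVER (PARTITION BY region) AS region_total
    FROM orders
    ORDER BY region, rnk
""").fetchall()
for row in rows:
    print(row)
```

The same `RANK() OVER (PARTITION BY ...)` shape carries over to Oracle SQL and PL/SQL with minor syntax differences.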
Data Warehousing Specialist
Posted today
Job Description
This role is ideal for an analytical thinker with a wide range of skills in database administration, reporting and dashboarding disciplines.
We seek an experienced data engineer with extensive experience in T-SQL (stored procedures, functions, triggers, ad hoc queries), SSRS (design/development, subscriptions, query performance tuning, and BAU support), ETL, Azure data storage, Azure Data Factory and pipelines, Azure Synapse, and data warehousing. Some exposure to Power BI and Azure analytics is beneficial.
Data Warehousing Specialist
Posted today
Job Description
Tata Consultancy Services is hiring a Senior SAP HANA Modeling / SAC / Datasphere Consultant with 4 to 12 years of experience for its Bangalore, Hyderabad, Chennai, and Pune locations.
#SAP #HANAModeling #SAC #SAPDatasphere #BW4HANA #Analytics #SAPJobs #HyderabadJobs #TCS #DataAnalytics #SAPBTP #Hiring #CloudAnalytics #TechJobs
SAP Datasphere (DWC) Modeling: Build flexible and scalable data models (like a central hub) to connect data from different sources.
SAP Analytics Cloud (SAC) Development:
Create executive dashboards, reports, and interactive stories.
Develop planning and budgeting solutions in SAC Planning.
Data Integration: Connect data from SAP systems (S/4HANA, BW/4HANA) and non-SAP sources into Datasphere.
HANA Modeling: Use native SAP HANA/HANA Cloud (e.g., calculation views) for high-performance data processing.
Consulting: Work closely with business users to understand their needs and translate them into technical designs.
Performance: Tune data models and queries to ensure fast reporting speeds.
Governance: Implement data security, access control, and quality checks.
Required Skills:
Expertise in SAP Cloud Analytics: Deep, hands-on experience with SAP Datasphere and SAP Analytics Cloud (SAC).
Data Modeling: Strong background in data warehousing and modeling concepts (e.g., star schema).
Technical Tools: Proficiency in SQL and experience with modern ETL/ELT tools.
SAP Ecosystem: Good knowledge of SAP S/4HANA, BW/4HANA, and the SAP Business Technology Platform (BTP).
Soft Skills: Excellent client communication, problem-solving, and team leadership.
Experience Level Expectation
5-8 Years (Senior Consultant): Proven track record of end-to-end project delivery and strong independent technical work.
8-12 Years (Lead/Architect): Experience leading multiple projects, designing enterprise-level solutions, and mentoring teams.
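To illustrate the star-schema concept listed under Data Modeling above, here is a minimal sketch using Python's built-in sqlite3 module; the table and column names are hypothetical, chosen only to show the fact/dimension split.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes, one row per product.
cur.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT)")
# Fact table: numeric measures plus foreign keys into the dimensions.
cur.execute("CREATE TABLE fact_sales (product_key INTEGER, qty INTEGER, revenue REAL)")

cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Laptop", "Electronics"), (2, "Desk", "Furniture")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 2, 2000.0), (1, 1, 950.0), (2, 3, 600.0)])

# A typical star-schema query: join fact to dimension, aggregate by attribute.
result = cur.execute("""
    SELECT p.category, SUM(f.revenue) AS total_revenue
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.category
    ORDER BY p.category
""").fetchall()
print(result)  # [('Electronics', 2950.0), ('Furniture', 600.0)]
```

The same shape, with date and customer dimensions added, is what tools like SAP Datasphere or a BW InfoProvider model at enterprise scale.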
Data Warehousing Specialist
Posted today
Job Description
SoftMaster Technology Solutions provides end-to-end IT services, including custom software development, web & mobile application development, cloud solutions, data analytics & AI, and enterprise solutions. They offer training and placement programs for fresh graduates and professionals in IT careers through online, offline, and hybrid learning modes.
Role Description
This is a full-time on-site role located in Hyderabad for a Data Engineer at SoftMaster Technology Solutions. The Data Engineer will be responsible for data engineering tasks, including data modeling, ETL processes, data warehousing, and data analytics.
Qualifications
- Data Engineering and Data Modeling skills
- Experience with Extract Transform Load (ETL) processes
- Data Warehousing and Data Analytics skills
- Strong analytical and problem-solving skills
- Proficiency in SQL and programming languages like Python or R
- Knowledge of cloud platforms like AWS, Azure, or Google Cloud is a plus
- Bachelor's or Master's degree in Computer Science or related field
Data Warehousing Solutions Engineer
Posted today
Job Description
About Position:
We are seeking a skilled Data Engineer with hands-on experience in Azure Cloud technologies to join our dynamic data team. This role is ideal for someone passionate about building scalable data pipelines, designing robust data architectures, and enabling data-driven decision-making across the organization. You will play a key role in shaping our data infrastructure, ensuring high-quality data ingestion, transformation, and modeling to support analytics and business intelligence initiatives.
- Role: Azure Data Engineer
- Location: All Persistent Locations
- Experience: 5+ years
- Job Type: Full Time Employment
What You'll Do:
- Develop and maintain data pipelines using Azure Data Factory (ADF) for efficient ingestion from diverse sources.
- Transform and process large-scale datasets using PySpark within Databricks, ensuring optimized performance and scalability.
- Perform advanced data profiling and analysis using SQL to ensure data quality, consistency, and readiness for downstream consumption.
- Design and implement data architecture, including data warehousing solutions and Lakehouse Medallion Architecture, to support both operational and analytical workloads.
- Apply dimensional modeling techniques to structure data for intuitive and efficient querying by business users and analysts.
- Collaborate with cross-functional teams including data scientists, analysts, and business stakeholders to understand data requirements and deliver impactful solutions.
- Data engineering on Azure Cloud: data ingestion (ADF), transformation (PySpark on Databricks), data profiling, and advanced SQL
- Data design/patterns: data warehousing, dimensional modelling, and Lakehouse Medallion Architecture
Expertise You'll Bring:
- Proven experience in Azure Data Factory (ADF) for orchestrating data workflows and integrations.
- Strong proficiency in PySpark and Databricks for scalable data transformation and processing.
- Expertise in Advanced SQL for data profiling, cleansing, and complex querying.
- Solid understanding of data warehousing concepts, including dimensional modeling and ETL best practices.
- Familiarity with Lakehouse architecture, especially the Medallion pattern (Bronze, Silver, Gold layers) for organizing and optimizing data lakes.
- Ability to design and implement data solutions that are secure, scalable, and aligned with business goals.
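The Medallion pattern (Bronze, Silver, Gold) mentioned above can be sketched without Databricks or Delta Lake; the plain-Python sketch below, with hypothetical field names, shows only the layering idea: raw ingestion, then cleansing and typing, then a business-level aggregate.

```python
# Bronze: raw records as ingested, including malformed ones.
bronze = [
    {"id": "1", "amount": "100.5", "country": "IN"},
    {"id": "2", "amount": "bad", "country": "IN"},   # malformed amount
    {"id": "3", "amount": "40.0", "country": "US"},
]

def to_silver(rows):
    """Silver: cleansed, typed records; invalid rows are filtered out."""
    out = []
    for r in rows:
        try:
            out.append({"id": int(r["id"]), "amount": float(r["amount"]),
                        "country": r["country"]})
        except ValueError:
            continue  # a real pipeline would route this row to a quarantine table
    return out

def to_gold(rows):
    """Gold: business-level aggregate, ready for reporting."""
    totals = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'IN': 100.5, 'US': 40.0}
```

In a Databricks implementation each layer would typically be a Delta table and the transforms would be PySpark jobs, but the contract between layers is the same.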
Benefits:
- Competitive salary and benefits package
- Culture focused on talent development with quarterly growth opportunities and company-sponsored higher education and certifications
- Opportunity to work with cutting-edge technologies
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
- Annual health check-ups
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents
Values-Driven, People-Centric & Inclusive Work Environment:
Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds.
- We support hybrid work and flexible hours to fit diverse lifestyles.
- Our office is accessibility-friendly, with ergonomic setups and assistive technologies to support employees with physical disabilities.
- If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment.
Let's unleash your full potential at Persistent
"Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind."
Senior Data Warehousing Consultant
Posted today
Job Description
About Invenio
Invenio is the largest independent global SAP solutions provider serving the public sector, as well as offering specialist skills in media and entertainment. We bring deep expertise combined with advanced technologies to enable organisations to modernise so they can operate at the speed of today’s business. We understand the complexities of international businesses and public sector organisations, working with stakeholders to drive change and create agile organisations of tomorrow using the technologies of today. Learn more at
Role - SAP BO BW Senior Consultant
Location-Delhi/Mumbai/Pune/Noida/Hyderabad
Responsibilities
- Document all technical and functional specifications for implemented solutions.
- Proficient in BW/B4H and ABAP/CDS, with experience in analysis, design, and development.
- Collaborate with clients to gather business requirements and translate them into BI/BW technical solutions.
- Interact with key stakeholders/support members in different areas of BW.
- Provide technical solutions to fulfil business requests using SAP's BW.
- Design, develop, configure, migrate, test, and implement SAP BW 7.x data warehousing solutions using SAP BW, BW/4HANA, and related tools.
- Ensure data accuracy, integrity, and consistency in the SAP landscape.
- Optimize performance of queries, reports, and data models for better efficiency.
- Manage delivery of services against agreed SLAs and manage escalations both internally and externally.
- Understand client business requirements, processes, and objectives, and develop the necessary product adjustments to fulfil clients' needs.
- Develop process chains to load and monitor data loading.
- Provide technical guidance and mentorship to junior consultants and team members.
- Design and build data flows, including InfoObjects, Advanced DataStore Objects (ADSOs), CompositeProviders, transformations, DTPs, and DataSources.
- Conduct requirement gathering sessions and provide design thinking approach.
- Work closely with clients to understand their business needs and provide tailored solutions.
- Build and maintain strong relationships with key stakeholders, ensuring satisfaction and trust.
- Manage and mentor a team of consultants, ensuring high-quality delivery and skill development.
- Facilitate knowledge sharing and promote the adoption of new tools and methodologies within the team.
- Act as an escalation point for technical and functional challenges; well experienced in handling P1 and P2 situations.
Skills & Qualifications
- Bachelor’s degree in IT or equivalent, with 6 to 8 years of experience in one or more SAP modules.
- At least four full life cycle SAP BW implementations, at least two of them with BI 7.x experience (from Blueprint/Explore through Go-Live).
- Ability to use Service Marketplace to create tickets, research notes, review release notes and solution roadmaps as well as provide guidance to customers on release strategy.
- Exposure to other SAP modules and integration points.
- Strong understanding of SAP BW architecture, including BW on HANA, BW/4HANA, and SAP S/4HANA integration.
- Knowledge of SAP ECC, S/4HANA, and other SAP modules.
- Proficiency in SAP BI tools such as SAP BusinessObjects, SAP Lumira, and SAP Analytics Cloud.
- Experience with data modelling, ETL processes, and SQL.
- Certifications in SAP Certified Application Associate - SAP Business Warehouse (BW), SAP Certified Application Associate - SAP HANA.
- Should be well versed in acquiring data through different extraction methods.
- Flexible to work in shifts based on the project requirement.
- Strong skills in SAP BI/BW, BW/4HANA and BW on HANA development and production support experience.
- Excellent communication, client management, and stakeholder engagement abilities.
- Extensive work on BW user exits, start routines, and end routines, with expertise in ABAP/4.
- Extensive work on standard DataSource enhancements and InfoProvider enhancements.
- In-depth knowledge and understanding of SAP BI Tools such as: Web Intelligence, Analysis for Office, Query Designer.
- End-to-end experience: can independently investigate issues from the DataSource/extractor level to the BI report level, with strong problem-solving skills.
- End-to-end development experience: can build extractors, model within SAP BW, and develop reporting solutions, including troubleshooting development issues.
Invenio is an equal opportunities employer. We do not discriminate based on race, colour, creed, religion, nationality, ancestry, citizenship status, age, sex or gender (including pregnancy and related conditions), gender identity or expression, sexual orientation, marital status, military service, veteran status, genetic information, or any other characteristic protected by applicable laws. Invenio’s management team is committed to this policy in all areas of employment, including recruitment, hiring, placement, promotion, training, compensation, benefits, and workplace environment.
ETL and Data Warehousing Engineer
Posted today
Job Description
Mandatory Skills: SQL, ETL, Hadoop, PySpark
Required Skills
- Pipeline/ETL (Extract, Transform, Load) processes, API integration, scripting languages (Python), and big data technologies (Trino, Iceberg, DuckDB/Parquet).
- Database design, data modeling, and data warehousing; SQL and at least one cloud platform.
- Analytical tools (Superset, Power BI), statistical analysis, and SQL to derive insights from data.
- Data governance principles and data quality.
- Data management and data integration projects using iPaaS tools like IBM Sterling, MuleSoft, and Dell Boomi.
Preferred Skills
- Experience with SQL, ETL, PySpark, and Hadoop
- Experience with containerization technologies (e.g., Docker, Kubernetes) will be an added advantage
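As a rough illustration of the data-quality checks mentioned in the skills above, the sketch below implements two simple rule types (required columns and non-null columns) in plain Python; the function and record names are hypothetical.

```python
def check_quality(rows, required, non_null):
    """Run simple data-quality checks and return a list of violations.

    rows: list of dict records; required: columns that must exist;
    non_null: columns that must not be None or empty.
    """
    violations = []
    for i, row in enumerate(rows):
        for col in required:
            if col not in row:
                violations.append((i, col, "missing column"))
        for col in non_null:
            if row.get(col) in (None, ""):
                violations.append((i, col, "null or empty value"))
    return violations

records = [
    {"order_id": 1, "customer": "acme"},
    {"order_id": 2, "customer": ""},      # empty value
    {"customer": "globex"},               # missing order_id
]
issues = check_quality(records, required=["order_id", "customer"],
                       non_null=["customer"])
print(issues)
```

Production pipelines would usually express the same rules declaratively (e.g., in a data-quality framework) and gate promotion of a batch on the violation count.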
Mandatory Skills: SQL, ETL, PySpark
Job Location - Bangalore, Pune, Chennai
Experience - 5+ yrs
Round of interview - Online assessment / Technical round / Manager
About Finacle
Finacle is an industry leader in digital banking solutions. We partner with emerging and established financial institutions to inspire better banking. Our cloud-native solution suite and SaaS services help banks to engage, innovate, operate, and transform better. We are a business unit of EdgeVerve Systems, a wholly-owned product subsidiary of Infosys – a global technology leader with over USD 15 billion in annual revenues. We are differentiated by our functionally-rich solution suite, composable architecture, culture, and entrepreneurial spirit of a start-up. We are also known for an impeccable track record of helping financial institutions of all sizes drive digital transformation at speed and scale.
Today, financial institutions in more than 100 countries rely on Finacle to help more than a billion people and millions of businesses to save, pay, borrow, and invest better.
Finacle website (
Disclaimer: EdgeVerve Systems does not engage external manpower agencies or charge candidates any fees for recruitment. If you encounter such scams, please report them immediately.
SAP Data Warehousing Solutions Lead
Posted today
Job Description
About the Position:
We seek an experienced professional to lead our team in delivering data warehousing solutions using SAP's Business Intelligence platform. The ideal candidate will possess a deep understanding of technical and functional specifications, as well as excellent collaboration and communication skills.
Key Responsibilities:
- Develop detailed technical and functional specifications for implemented solutions.
- Collaborate with clients to gather business requirements and translate them into BI/BW technical solutions.
- Provide expert-level technical solutions to meet business requests using SAP's BW.
- Design, develop, configure, migrate, test and implement SAP BW 7.x data warehousing solutions.
- Evaluate and ensure data accuracy, integrity, and consistency in the SAP landscape.
Required Skills & Qualifications:
- Bachelor's Degree in IT or equivalent with 6-8 years of experience in one or more SAP modules.
- At least four full life cycle SAP BW implementations, at least two of them with BI 7.x experience.
- Proficiency in using Service Marketplace to create tickets, research notes, review release notes and solution roadmaps.
- Knowledge of other SAP modules and integration points.
Technical Program Manager - Data Warehousing
Posted today
Job Description
D&A TECH PM
Experience Required: 12 to 14 years
Tech Project Manager – Job description
- The Project Manager will be responsible for driving project management activities in the Azure cloud
- Strong understanding of DW and data lake process execution, from acquiring data from source systems to visualization
- Required skills/experience: overall 12 to 14 years of IT experience in data warehousing, with a minimum of 4 years of relevant experience in project and people management
- Responsible for end-to-end project execution and delivery across multiple clients
- Exposure to Azure skills: ADF, Databricks, PySpark, Synapse
- Continuous learning and improvements in Engineering, Technical & Process capabilities
- Deep hands-on technical expertise in technical architecture and delivery in Data Warehouse
- Must have experience managing at least 2 end-to-end data lake and data warehouse projects
- Owns end to end accountability of the deliverables
- Good communication skills; should be able to organize and distribute work across the team.
- Expected to manage a team of 15+ people with multiple modules/products running in parallel.
- Strong leadership skills, including coaching, team-building, and conflict resolution
- Project Management skills including time and risk management, resource prioritization and project structuring
- Strong interpersonal, analytical, problem-solving, and decision-making skills.
- Provide presales support: requirement understanding, estimation, and sizing
Data Pipelines Architect
Posted today
Job Description
We're revolutionizing the future of healthcare analytics by crafting pipelines that are reliable, observable, and continuously improving in production.
This role is fully remote, open to candidates based in Europe or India, with periodic team gatherings.
- Design scalable ETL data pipelines using Python (Pandas, PySpark) and SQL, orchestrated with Airflow to deliver high-quality insights.
- Develop and maintain a robust SAIVA Data Lake/Lakehouse on AWS, ensuring quality, governance, scalability, and accessibility.
- Run and optimize distributed data processing jobs with Spark on AWS EMR and/or EKS for enhanced performance.
- Implement batch and streaming ingestion frameworks (APIs, databases, files, event streams) to collect and process diverse data sources.
- Enforce validation and quality checks to ensure reliable analytics and ML readiness.
- Monitor and troubleshoot pipelines with CloudWatch, integrating observability tools like Grafana, Prometheus, or Datadog for proactive issue resolution.
- Automate infrastructure provisioning with Terraform, following AWS best practices for efficient deployment.
- Manage SQL Server, PostgreSQL, and Snowflake integrations into the Lakehouse for seamless data exchange.
- Participate in an on-call rotation to support pipeline health and resolve incidents quickly, ensuring minimal downtime.
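The Airflow-style orchestration described above boils down to running tasks in dependency order. The sketch below is a minimal stand-in, with hypothetical task names, using the standard library's topological sorter over an explicit dependency map; a real DAG would attach operators, retries, and schedules to each node.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: ingest two sources, transform, then load the warehouse.
# Each key's value is the set of tasks that must run before it (upstream deps).
dag = {
    "ingest_api":   set(),
    "ingest_files": set(),
    "transform":    {"ingest_api", "ingest_files"},
    "load_dwh":     {"transform"},
}

def run_task(name):
    # Stand-in for an Airflow operator: ingest, Spark job, COPY INTO, etc.
    print(f"running {name}")

order = list(TopologicalSorter(dag).static_order())
for task in order:
    run_task(task)
```

`load_dwh` is guaranteed to run last and `transform` only after both ingests, which is exactly the ordering contract Airflow enforces between upstream and downstream tasks.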