251 Data Warehouse jobs in India

Data Warehouse Architect

Bengaluru, Karnataka Cvent

Posted 5 days ago


Job Description

About the Company


Cvent is a leading meetings, events, and hospitality technology provider with more than 4,800 employees and ~22,000 customers worldwide, including 53% of the Fortune 500. Founded in 1999, Cvent delivers a comprehensive event marketing and management platform for marketers and event professionals and offers software solutions to hotels, special event venues and destinations to help them grow their group/MICE and corporate travel business. Our technology brings millions of people together at events around the world. In short, we’re transforming the meetings and events industry through innovative technology that powers the human connection.


The DNA of Cvent is our people, and our culture has an emphasis on fostering intrapreneurship – a system that encourages Cventers to think and act like individual entrepreneurs and empowers them to take action, embrace risk, and make decisions as if they had founded the company themselves. At Cvent, we value the diverse perspectives that each individual brings. Whether working with a team of colleagues or with clients, we ensure that we foster a culture that celebrates differences and builds on shared connections.


About the Role

We are looking for an IT professional with 4-12 years of hands-on Snowflake Data Cloud experience to lead the design, implementation, and optimization of scalable data solutions. The ideal candidate excels in Snowflake architecture, data warehousing, data lakes, serverless/cloud automation, and AI/LLM concepts.


Key Responsibilities:


  • Architect and deliver end-to-end data solutions on Snowflake.
  • Demonstrate deep understanding of data warehousing concepts and work with structured, semi-structured, and unstructured data.
  • Apply deep expertise in Snowflake architecture (compute, storage, security, data sharing).
  • Use Snowflake native load utilities (e.g., Snowpipe, COPY INTO) for data ingestion and transformation (see the sketch after this list).
  • Develop and manage API-based integrations and automation.
  • Lead data governance, quality, security, and compliance efforts.
  • Optimize performance and manage costs.
  • Oversee Snowflake infrastructure provisioning, scaling, and monitoring.
  • Act as primary stakeholder contact; drive meetings and translate business needs.
  • Foster innovation in data architecture, analytics, and AI-driven solutions.
  • Collaborate with cross-functional teams.
  • Stay current on Snowflake advancements, especially AI features like Cortex.
  • Proactively perform POCs for new Snowflake features and drive their adoption.
  • Leverage data lake frameworks and serverless technologies.
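
For context on the Snowpipe/COPY INTO item above, here is a minimal sketch of both load paths using snowflake-connector-python. The account, stage, and table names are hypothetical placeholders, not details from this posting.

```python
# Hypothetical sketch: bulk load with COPY INTO, then continuous
# ingestion with a pipe (Snowpipe). All identifiers are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",        # placeholder account identifier
    user="etl_user",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
cur = conn.cursor()

# One-shot bulk load: Snowflake tracks which staged files it has
# already loaded, so re-running this is effectively idempotent.
cur.execute("""
    COPY INTO raw_orders
    FROM @orders_stage
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""")

# Continuous ingestion: a pipe wraps the same COPY statement so new
# files landing on the stage are loaded automatically.
cur.execute("""
    CREATE PIPE IF NOT EXISTS orders_pipe AUTO_INGEST = TRUE AS
    COPY INTO raw_orders
    FROM @orders_stage
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""")
conn.close()
```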



Qualifications:


  • Bachelor's or Master's degree in Computer Science, IT, or a related field.
  • 7-12 years of Snowflake Data Cloud experience.
  • Deep understanding of Snowflake architecture and ecosystem.
  • Strong grasp of data warehousing concepts and diverse data structures.
  • Expertise in Snowflake native load utilities.
  • API integration and automation experience.
  • Data governance, security, and compliance background.
  • Performance optimization and cost management skills.
  • Experience with Snowflake infrastructure management.
  • Excellent communication and stakeholder management.
  • Innovative, AI-driven mindset.
  • Hands-on Python for data engineering/automation.
  • Knowledge of AI, LLMs, and Snowflake AI features such as Cortex (a sketch follows this list).
  • Experience with data lake frameworks and serverless technologies.
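
As a pointer to the Cortex item above, the sketch below calls a Cortex LLM function from SQL through the same Python connector. The model name and prompt are illustrative only, and Cortex availability varies by region and account edition.

```python
# Hypothetical sketch: invoking Snowflake Cortex's COMPLETE function
# from SQL via snowflake-connector-python. Connection details are
# placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="analyst", password="***",
    warehouse="QUERY_WH",
)
cur = conn.cursor()
cur.execute("""
    SELECT SNOWFLAKE.CORTEX.COMPLETE(
        'mistral-large',
        'Summarize in one sentence: event registrations rose 12% QoQ.'
    )
""")
print(cur.fetchone()[0])   # the completion text returned by the model
conn.close()
```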


Preferred


  • Familiarity with data modelling, Python, PySpark, and warehousing best practices.
  • Exposure to MLOps, AI/ML pipelines, or deploying AI models.
  • Snowflake or relevant cloud certifications.

Snowflake Data Warehouse

Chennai 600086, Tamil Nadu 2coms

Posted 11 days ago


Job Description

Job Title: Application Lead – Snowflake Data Warehouse
Location: Chennai, India
Experience: Minimum 5 years in Snowflake Data Warehouse
Education: 15 years of full-time education required

Job Summary:

As an Application Lead, you will be responsible for leading the design, development, and configuration of data-driven applications, with a focus on Snowflake Data Warehouse. Acting as the primary point of contact, you will collaborate with cross-functional teams to ensure application requirements are met while maintaining alignment with business goals. You will guide your team throughout the development lifecycle, resolve technical challenges, and ensure delivery excellence in both performance and quality.

Roles & Responsibilities:

Act as the Subject Matter Expert (SME) for Snowflake Data Warehouse solutions.

Lead and manage a development team, ensuring high performance and collaboration.

Take responsibility for team-level decisions and accountability for deliverables.

Collaborate with multiple teams across the organization to drive key architectural and strategic decisions.

Provide innovative and scalable solutions to data-related challenges, both within the immediate team and across projects.

Facilitate knowledge sharing, promote adoption of best practices, and support ongoing team development.

Monitor project milestones, ensure timely delivery of application components, and maintain a focus on quality and efficiency.

Drive improvements in data warehousing processes and contribute to continuous optimization.

Professional & Technical Skills:

Must-Have Skills:

Strong proficiency in Snowflake Data Warehouse with at least 5 years of hands-on experience.

Deep understanding of cloud-based data solutions and scalable architecture.

Good-to-Have Skills:

Experience with ETL processes and data integration tools (e.g., Informatica, Talend, Matillion).

Proficiency in SQL and data modeling techniques (e.g., dimensional/star-schema models); see the sketch after this list.

Knowledge of performance tuning and optimization for data warehouse solutions.
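
To illustrate the dimensional-modeling skill mentioned above, here is a minimal star-schema sketch (one fact table, two dimensions) issued through snowflake-connector-python. All table and column names are invented for illustration; Snowflake records the key constraints as metadata but does not enforce them.

```python
# Hypothetical sketch: a small star schema (dimensions + fact).
import snowflake.connector

DDL = """
CREATE TABLE IF NOT EXISTS dim_customer (
    customer_key  INTEGER IDENTITY PRIMARY KEY,
    customer_id   VARCHAR,          -- natural key from the source system
    customer_name VARCHAR,
    region        VARCHAR
);
CREATE TABLE IF NOT EXISTS dim_date (
    date_key  INTEGER PRIMARY KEY,  -- e.g. 20240131
    full_date DATE,
    month     INTEGER,
    year      INTEGER
);
CREATE TABLE IF NOT EXISTS fact_sales (
    customer_key INTEGER REFERENCES dim_customer (customer_key),
    date_key     INTEGER REFERENCES dim_date (date_key),
    quantity     INTEGER,
    net_amount   NUMBER(18, 2)      -- additive measure
);
"""

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="DW", schema="MART",
)
conn.execute_string(DDL)   # runs each semicolon-separated statement
conn.close()
```

Queries then join the fact table to its dimensions and aggregate the additive measures, which is what keeps star schemas fast and simple for BI tools to consume.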


BI & Data Warehouse Data Engineer

Bengaluru, Karnataka Astellas Pharma

Posted 2 days ago


Job Description

As part of the Astellas commitment to delivering value for our patients, our organization is currently undergoing transformation to achieve this critical goal. This is an opportunity to work on digital transformation and make a real impact within a company dedicated to improving lives.
DigitalX, our new information technology function, is spearheading this value-driven transformation across Astellas. We are looking for people who excel at embracing change, managing technical challenges, and communicating clearly. We are seeking committed and talented MDM Engineers to join our new FoundationX team, which lies at the heart of DigitalX. As a member of FoundationX, you will play a critical role in ensuring our MDM systems are operational and scalable and continue to contain the right data to drive business value. You will play a pivotal role in building, maintaining, and enhancing our MDM systems.
This position is based in Bangalore, India. We recognize the importance of work/life balance and believe in optimizing the most productive work environment for all employees to succeed and deliver.
**Purpose and Scope:**
As a Junior Data Engineer, you will play a crucial role in assisting with the design, build, and maintenance of our data infrastructure, focusing on BI and DWH capabilities. Working with the Senior Data Engineer, your foundational expertise in BI, Databricks, PySpark, SQL, Talend, and other related technologies will be instrumental in driving data-driven decision-making across the organization. You will also play a pivotal role in building, maintaining, and enhancing our systems across the organization. This is a fantastic global opportunity to use your proven agile delivery skills across a diverse range of initiatives, utilize your development skills, and contribute to the continuous improvement and delivery of critical IT solutions.
**Essential Job Responsibilities:**
+ Collaborate with FoundationX Engineers to design and maintain scalable data systems.
+ Assist in building robust infrastructure using technologies like PowerBI, Qlik or alternative, Databricks, PySpark, and SQL.
+ Contribute to ensuring system reliability by incorporating accurate business-driving data.
+ Gain experience in BI engineering through hands-on projects.
Data Modelling and Integration:
+ Collaborate with cross-functional teams to analyze requirements and create technical designs, data models, and migration strategies.
+ Design, build, and maintain physical databases, dimensional data models, and ETL processes specific to pharmaceutical data.
Cloud Expertise:
+ Evaluate and influence the selection of cloud-based technologies such as Azure, AWS, or Google Cloud
+ Implement data warehousing solutions in a cloud environment, ensuring scalability and security.
BI Expertise:
+ Leverage and create PowerBI, Qlik or equivalent technology for data visualization, dashboards, and self-service analytics.
Data Pipeline Development:
+ Design, build, and optimize data pipelines using Databricks and PySpark, ensuring data quality, reliability, and scalability (see the PySpark sketch after this list).
+ Application Transition: Support the migration of internal applications to Databricks (or equivalent) based solutions.
+ Collaborate with application teams to ensure a seamless transition.
+ Mentorship and Leadership: Lead and mentor junior data engineers. Share best practices, provide technical guidance, and foster a culture of continuous learning.
+ Data Strategy Contribution: Contribute to the organization's data strategy by identifying opportunities for data-driven insights and improvements.
+ Participate in smaller, focused mission teams to deliver value-driven solutions aligned to our global and bold move priority initiatives and beyond.
+ Design, develop and implement robust and scalable data analytics using modern technologies.
+ Collaborate with cross-functional teams and practices across the organization, including Commercial, Manufacturing, Medical, DataX, and GrowthX, and support other X (transformation) Hubs and Practices as appropriate, to understand user needs and translate them into technical solutions.
+ Provide technical support to internal users, troubleshooting complex issues and restoring system availability as quickly as possible.
+ Champion continuous improvement initiatives, identifying opportunities to optimize the performance, security, and maintainability of existing data and platform architecture and other technology investments.
+ Participate in the continuous delivery pipeline, adhering to DevOps best practices for version control, automation, and deployment, and ensuring effective management of the FoundationX backlog.
+ Leverage your knowledge of data engineering principles to integrate with existing data pipelines and explore new possibilities for data utilization.
+ Stay up to date on the latest trends and technologies in data engineering and cloud platforms.
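
Since the responsibilities above centre on Databricks/PySpark pipelines with data-quality guarantees, here is a minimal PySpark sketch of that shape. The paths, columns, and quality rules are hypothetical, not taken from this posting.

```python
# Hypothetical sketch: ingest raw CSV, apply basic data-quality rules,
# and publish a partitioned table for BI. Paths and columns are invented.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bi-dwh-pipeline-sketch").getOrCreate()

raw = (spark.read
       .option("header", True)
       .csv("/mnt/raw/sales/"))                 # hypothetical landing zone

clean = (raw
         .dropDuplicates(["order_id"])                    # de-duplicate
         .filter(F.col("order_id").isNotNull())           # reject bad keys
         .withColumn("order_date", F.to_date("order_date")))

# Publish for downstream dashboards; partitioning by date keeps
# incremental reads cheap.
(clean.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("/mnt/curated/sales/"))
```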
**Qualifications:**
**Required**
+ Bachelor's degree in Computer Science, Information Technology, or a related field (Master's preferred), or equivalent experience.
+ 3-5+ years of experience in data engineering, with a strong understanding of BI technologies, PySpark, and SQL, and of building and optimizing data pipelines.
+ 3-5+ years of experience with data engineering and integration tools (e.g., Databricks, Change Data Capture).
+ 3-5+ years of experience utilizing cloud platforms (AWS, Azure, GCP); a deeper understanding of, or certification in, AWS and Azure is considered a plus.
+ Experience with relational and non-relational databases.
+ Any relevant cloud-based integration certification at foundational level or above (any Qlik or BI certification, AWS Certified DevOps Engineer, AWS Certified Developer, any Microsoft Certified Azure qualification, or another relevant credential, e.g., RESTful APIs, AWS, CDMP, MDM, DBA, SQL, SAP, TOGAF, API, CISSP, VCP).
+ Experience in MuleSoft (Anypoint platform, its components, Designing and managing API-led connectivity solutions).
+ Experience in AWS (environment, services and tools), developing code in at least one high level programming language.
+ Experience with continuous integration and continuous delivery (CI/CD) methodologies and tools
+ Experience with Azure services related to computing, networking, storage, and security
+ Understanding of cloud integration patterns and Azure integration services such as Logic Apps, Service Bus, and API Management
**Preferred**
+ Subject Matter Expertise: a strong understanding of data architecture, engineering, operations, and reporting within the Life Sciences/Pharma industry across the Commercial, Manufacturing, and Medical domains.
+ Experience in other complex, highly regulated industries will also be considered across diverse areas such as Commercial, Manufacturing, and Medical.
+ Data Analysis and Automation Skills: proficient in identifying, standardizing, and automating critical reporting metrics and modelling tools.
+ Analytical Thinking: Demonstrated ability to lead ad hoc analyses, identify performance gaps, and foster a culture of continuous improvement.
+ Technical Proficiency: Strong coding skills in SQL, R, and/or Python, coupled with expertise in machine learning techniques, statistical analysis, and data visualization.
+ Agile Champion: Adherence to DevOps principles and a proven track record with CI/CD pipelines for continuous delivery.
**Working Environment**
At Astellas we recognize the importance of work/life balance, and we are proud to offer a hybrid working solution allowing time to connect with colleagues at the office with the flexibility to also work from home. We believe this will optimize the most productive work environment for all employees to succeed and deliver. Hybrid work from certain locations may be permitted in accordance with Astellas' Responsible Flexibility Guidelines.
#LI-CH1
Category FoundationX
Astellas is committed to equality of opportunity in all aspects of employment.
EOE including Disability/Protected Veterans

BI & Data Warehouse Data Engineer

Bengaluru, Karnataka Astellas Pharma

Posted 2 days ago


Job Description

As part of the Astellas commitment to delivering value for our patients, our organization is currently undergoing transformation to achieve this critical goal. This is an opportunity to work on digital transformation and make a real impact within a company dedicated to improving lives.
DigitalX, our new information technology function, is spearheading this value-driven transformation across Astellas. We are looking for people who excel at embracing change, managing technical challenges, and communicating clearly.
This position is based in Bangalore, India. We recognize the importance of work/life balance and believe in optimizing the most productive work environment for all employees to succeed and deliver.
**Purpose and Scope:**
As a Junior Data Engineer, you will play a crucial role in assisting with the design, build, and maintenance of our data infrastructure, focusing on BI and DWH capabilities. Working with the Senior Data Engineer, your foundational expertise in BI, Databricks, PySpark, SQL, Talend, and other related technologies will be instrumental in driving data-driven decision-making across the organization. You will also play a pivotal role in building, maintaining, and enhancing our systems across the organization. This is a fantastic global opportunity to use your proven agile delivery skills across a diverse range of initiatives, utilize your development skills, and contribute to the continuous improvement and delivery of critical IT solutions.
**Essential Job Responsibilities:**
+ Collaborate with FoundationX Engineers to design and maintain scalable data systems.
+ Assist in building robust infrastructure using technologies like PowerBI, Qlik or alternative, Databricks, PySpark, and SQL.
+ Contribute to ensuring system reliability by incorporating accurate business-driving data.
+ Gain experience in BI engineering through hands-on projects.
+ Data Modelling and Integration:
+ Collaborate with cross-functional teams to analyse requirements and create technical designs, data models, and migration strategies.
+ Design, build, and maintain physical databases, dimensional data models, and ETL processes specific to pharmaceutical data.
+ Cloud Expertise:
+ Evaluate and influence the selection of cloud-based technologies such as Azure, AWS, or Google Cloud.
+ Implement data warehousing solutions in a cloud environment, ensuring scalability and security.
+ BI Expertise:
+ Leverage and create PowerBI, Qlik or equivalent technology for data visualization, dashboards, and self-service analytics.
+ Data Pipeline Development:
+ Design, build, and optimize data pipelines using Databricks and PySpark. Ensure data quality, reliability, and scalability.
+ Application Transition: Support the migration of internal applications to Databricks (or equivalent) based solutions. Collaborate with application teams to ensure a seamless transition.
+ Mentorship and Leadership: Lead and mentor junior data engineers. Share best practices, provide technical guidance, and foster a culture of continuous learning.
+ Data Strategy Contribution: Contribute to the organization's data strategy by identifying opportunities for data-driven insights and improvements.
+ Participate in smaller, focused mission teams to deliver value-driven solutions aligned to our global and bold move priority initiatives and beyond.
+ Design, develop and implement robust and scalable data analytics using modern technologies.
+ Collaborate with cross-functional teams and practices across the organisation, including Commercial, Manufacturing, Medical, DataX, and GrowthX, and support other X (transformation) Hubs and Practices as appropriate, to understand user needs and translate them into technical solutions.
+ Provide technical support to internal users, troubleshooting complex issues and restoring system availability as quickly as possible.
+ Champion continuous improvement initiatives, identifying opportunities to optimise the performance, security, and maintainability of existing data and platform architecture and other technology investments.
+ Participate in the continuous delivery pipeline, adhering to DevOps best practices for version control, automation, and deployment, and ensuring effective management of the FoundationX backlog.
+ Leverage your knowledge of data engineering principles to integrate with existing data pipelines and explore new possibilities for data utilization.
+ Stay up to date on the latest trends and technologies in data engineering and cloud platforms.
**Qualifications:**
**Required**
+ Bachelor's degree in Computer Science, Information Technology, or a related field (Master's preferred), or equivalent experience.
+ 1-3+ years of experience in data engineering, with a strong understanding of BI technologies, PySpark, and SQL, and of building and optimizing data pipelines.
+ 1-3+ years of experience with data engineering and integration tools (e.g., Databricks, Change Data Capture).
+ 1-3+ years of experience utilizing cloud platforms (AWS, Azure, GCP); a deeper understanding of, or certification in, AWS and Azure is considered a plus.
+ Experience with relational and non-relational databases.
+ Any relevant cloud-based integration certification at foundational level or above (any Qlik or BI certification, AWS Certified DevOps Engineer, AWS Certified Developer, any Microsoft Certified Azure qualification, or another relevant credential, e.g., RESTful APIs, AWS, CDMP, MDM, DBA, SQL, SAP, TOGAF, API, CISSP, VCP).
+ Experience in MuleSoft (Anypoint platform, its components, Designing and managing API-led connectivity solutions).
+ Experience in AWS (environment, services and tools), developing code in at least one high level programming language.
+ Experience with continuous integration and continuous delivery (CI/CD) methodologies and tools
+ Experience with Azure services related to computing, networking, storage, and security
+ Understanding of cloud integration patterns and Azure integration services such as Logic Apps, Service Bus, and API Management
**Preferred**
+ Subject Matter Expertise: a strong understanding of data architecture, engineering, operations, and reporting within the Life Sciences/Pharma industry across the Commercial, Manufacturing, and Medical domains.
+ Experience in other complex, highly regulated industries will also be considered across diverse areas such as Commercial, Manufacturing, and Medical.
+ Data Analysis and Automation Skills: proficient in identifying, standardizing, and automating critical reporting metrics and modelling tools.
+ Analytical Thinking: Demonstrated ability to lead ad hoc analyses, identify performance gaps, and foster a culture of continuous improvement.
+ Technical Proficiency: Strong coding skills in SQL, R, and/or Python, coupled with expertise in machine learning techniques, statistical analysis, and data visualization.
+ Agile Champion: Adherence to DevOps principles and a proven track record with CI/CD pipelines for continuous delivery.
**Working Environment**
At Astellas we recognize the importance of work/life balance, and we are proud to offer a hybrid working solution allowing time to connect with colleagues at the office with the flexibility to also work from home. We believe this will optimize the most productive work environment for all employees to succeed and deliver. Hybrid work from certain locations may be permitted in accordance with Astellas' Responsible Flexibility Guidelines.
#LI-CH1
Category FoundationX
Astellas is committed to equality of opportunity in all aspects of employment.
EOE including Disability/Protected Veterans

Data Engineer-Data Warehouse

Kochi, Kerala IBM

Posted 1 day ago


Job Description

**Introduction**
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology
**Your role and responsibilities**
* Minimum 3 years of experience developing application programs that implement ETL workflows by creating ETL jobs and data models in data marts using Snowflake, dbt, Unix, and SQL technologies.
* Redesign Control-M batch processing so that ETL job builds run efficiently in production.
* Study existing systems to evaluate their effectiveness and develop new systems that improve efficiency and workflow.
* Responsibilities:
* Perform requirements identification; conduct business program analysis, testing, and system enhancements while providing production support.
* The developer should have a good understanding of working in an Agile environment and of JIRA and SharePoint tools. Good written and verbal communication skills are a must, as the candidate is expected to work directly with the client counterpart.
**Required technical and professional expertise**
* An intuitive individual with an ability to manage change and proven time management skills.
* Proven interpersonal skills, contributing to the team effort by accomplishing related results as needed.
* Up-to-date technical knowledge maintained by attending educational workshops and reviewing publications.
**Preferred technical and professional experience**
* Develop triggers, functions, and stored procedures to support this effort (see the sketch after this list).
* Assist with impact analysis of upstream process changes on Data Warehouse and Reporting systems. Assist with the design, testing, support, and debugging of new and existing ETL and reporting processes.
* Perform data profiling and analysis using a variety of tools. Troubleshoot and support production processes. Create and maintain documentation.
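
As a concrete reading of the stored-procedure item above, the sketch below creates and calls a Snowflake Scripting stored procedure that rebuilds a reporting table, driven from snowflake-connector-python. All object names are hypothetical; in practice the posting's own stack (dbt models, Control-M scheduling) would sit around a step like this.

```python
# Hypothetical sketch: create and invoke a Snowflake SQL stored
# procedure that refreshes a datamart table. Names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="DW", schema="MART",
)
cur = conn.cursor()
cur.execute("""
    CREATE OR REPLACE PROCEDURE refresh_daily_sales()
    RETURNS STRING
    LANGUAGE SQL
    AS
    $$
    BEGIN
        TRUNCATE TABLE daily_sales;
        INSERT INTO daily_sales
            SELECT order_date, SUM(net_amount)
            FROM fact_sales
            GROUP BY order_date;
        RETURN 'daily_sales refreshed';
    END;
    $$
""")
cur.execute("CALL refresh_daily_sales()")
print(cur.fetchone()[0])   # -> 'daily_sales refreshed'
conn.close()
```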
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.

Data Engineer-Data Warehouse

Bangalore, Karnataka IBM

Posted 2 days ago


Job Description

**Introduction**
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
**Your role and responsibilities**
* Minimum 3 years of experience developing application programs that implement ETL workflows by creating ETL jobs and data models in data marts using Snowflake, dbt, Unix, and SQL technologies.
* Redesign Control-M batch processing so that ETL job builds run efficiently in production.
* Study existing systems to evaluate their effectiveness and develop new systems that improve efficiency and workflow.
* Responsibilities:
* Perform requirements identification; conduct business program analysis, testing, and system enhancements while providing production support.
* The developer should have a good understanding of working in an Agile environment and of JIRA and SharePoint tools. Good written and verbal communication skills are a must, as the candidate is expected to work directly with the client counterpart.
**Required technical and professional expertise**
* An intuitive individual with an ability to manage change and proven time management skills.
* Proven interpersonal skills, contributing to the team effort by accomplishing related results as needed.
* Up-to-date technical knowledge maintained by attending educational workshops and reviewing publications.
**Preferred technical and professional experience**
* Develop triggers, functions, and stored procedures to support this effort.
* Assist with impact analysis of upstream process changes on Data Warehouse and Reporting systems. Assist with the design, testing, support, and debugging of new and existing ETL and reporting processes.
* Perform data profiling and analysis using a variety of tools. Troubleshoot and support production processes. Create and maintain documentation.
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.


Sr Data Warehouse Engineer

Vadodara, Gujarat Kiash Solutions LLP

Posted 26 days ago


Job Description

Location: Remote (initially the candidate needs to come to our office in Vadodara for 5 days, and then it would be work from office).

CTC: up to 23 LPA

Must-have skills (mentioned on the resume):

  • SQL Server
  • Data Warehouse
  • Dimensional Modeling
  • Azure (Data Lake, Data Factory, Databricks): all three required
  • Power BI
  • SSIS
  • Python
  • Tableau
  • SSRS

Must have:

  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • 7+ years of experience with Microsoft SQL Server.
  • Expertise in building Data Warehouses using SQL Server.
  • Hands-on experience with Dimensional Modeling using facts and dimensions.
  • Expertise in SSIS and Python for ETL development.
  • Strong experience in Power BI for reporting and data visualization.
  • Strong understanding of relational database design, indexing, and performance tuning.
  • Ability to write complex SQL scripts, stored procedures, and views (see the sketch after this list).
  • Experience with Git and JIRA.
  • Problem-solving mindset and analytical skills.
  • Excellent communication and documentation abilities.
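
To make the stored-procedure/view item above concrete, here is a hedged sketch against SQL Server using pyodbc: a reporting view over a star schema plus a supporting index. The driver name, server, and object names are assumptions for illustration, not details from the posting.

```python
# Hypothetical sketch: a reporting view and a covering index on the
# fact table that supports it. All names are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=dw-server;DATABASE=SalesDW;Trusted_Connection=yes;"
)
cur = conn.cursor()

# Monthly rollup for Power BI / SSRS consumers.
cur.execute("""
    CREATE OR ALTER VIEW dbo.v_monthly_sales AS
    SELECT d.[year], d.[month], SUM(f.net_amount) AS net_amount
    FROM dbo.fact_sales AS f
    JOIN dbo.dim_date  AS d ON d.date_key = f.date_key
    GROUP BY d.[year], d.[month];
""")

# Covering index: seek on the join key, answer from the INCLUDE column.
cur.execute("""
    CREATE INDEX ix_fact_sales_date
    ON dbo.fact_sales (date_key) INCLUDE (net_amount);
""")
conn.commit()
conn.close()
```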


