1,714 Data Management Server Roles jobs in India
Data Architecture
Posted today
Job Description
About the Company:
Our client is a trusted global innovator of IT and business services. They help clients transform through consulting, industry solutions, business process services, digital and IT modernization, and managed services, enabling clients and society to move confidently into the digital future. Committed to clients' long-term success, they combine global reach with local client attention to serve customers in over 50 countries around the globe.
· Job Title: Data Architecture
· Location: Bangalore
· Experience: 5+ yrs
· Job Type: Contract-to-hire
· Notice Period: Immediate joiners only
Mandatory Skills:
Work mode: Hybrid (twice a week in office)
Shift: 1 PM to 11 PM
Must have:
5-10 years of experience in the Life Insurance & Annuities domain
Exposure to creating data maps for policy master, agency master, claims master, etc.
Exposure to data architecture, data model design, data extraction, and data validation
Skills:
10+ years of experience in the Life Insurance & Annuities domain
6+ years of relevant experience in Data Architecture
Strong knowledge of Life Insurance functional and business processes for the US market
Strong knowledge of the end-to-end life policy cycle and transactions
Strong understanding of Life Insurance data, mapping, and migration
Exposure to creating data maps for policy master, agency master, claims master, etc.
Exposure to data architecture, data model design, data extraction, and data validation
Excellent communication and stakeholder management skills.
Specialist, Data Architecture
Posted 1 day ago
Job Description
We're Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day - quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we're involved. If you want to make an impact on a global scale, come make a difference at Fiserv.
**Job Title**
Specialist, Data Architecture
+ 4-7 years of experience with SSIS/SQL; responsible for the development of ETL and reporting solutions
+ Strong knowledge of SSIS packages, design principles, and best practices
+ Experience with requirements gathering, technical analysis, and writing technical specifications
+ Strong database fundamentals
+ Good knowledge of data warehousing and data modeling concepts
+ Good communication skills
+ Ability to work in a distributed team environment with minimal supervision
+ The position does not require shift work; however, flexibility to overlap with US hours is required
+ Good knowledge of writing SQL commands, queries, and stored procedures
+ Knowledge of Snowflake preferred
+ Knowledge of Python/PySpark preferred
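As a rough illustration of the SQL-centric ETL and reporting work the bullets above describe, the sketch below builds a tiny staging table and runs a reporting-style aggregation. It uses Python's built-in sqlite3 rather than SQL Server/SSIS, and all table and column names are hypothetical, not part of any actual Fiserv system.

```python
import sqlite3

# Hypothetical staging table; names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_transactions (txn_id INTEGER, merchant TEXT, amount REAL);
    INSERT INTO stg_transactions VALUES
        (1, 'acme', 120.0), (2, 'acme', 80.0), (3, 'globex', 50.0);
""")

# A reporting-style aggregation of the kind an ETL/reporting role produces:
rows = conn.execute("""
    SELECT merchant, COUNT(*) AS txn_count, SUM(amount) AS total
    FROM stg_transactions
    GROUP BY merchant
    ORDER BY merchant
""").fetchall()
print(rows)  # [('acme', 2, 200.0), ('globex', 1, 50.0)]
```

In a real SSIS package the same aggregation would typically live in a data-flow task or a stored procedure invoked by the package.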
Thank you for considering employment with Fiserv. Please:
+ Apply using your legal name
+ Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).
**Our commitment to Diversity and Inclusion:**
Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.
**Note to agencies:**
Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.
**Warning about fake job posts:**
Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
Data Architecture Expert
Posted today
Job Description
We are seeking a skilled Data Architect to lead our finance team's data strategy. This role involves designing and implementing end-to-end data and analytics solutions that drive business value.
Responsibilities:
- Develop technical solutions for scalable analytical solutions leveraging cloud and big data technologies
- Design and drive end-to-end data and analytics solution architecture
- Develop conceptual/logical/physical data models for analytics solutions
- Ensure industry-accepted data architecture principles are integrated
- Provide mentoring on data architecture design and requirements
- Review solution requirements and architecture for technology selection and resource efficiency
Requirements
To be successful in this role, you will need demonstrated experience delivering multiple data solutions, plus ETL development experience with SQL Server, Azure Synapse, and HDInsight. Advanced study or knowledge of computer science or software engineering is also required.
What We Offer
Our organization offers a dynamic work environment with opportunities for growth and development. We encourage innovation, teamwork, and continuous learning.
Why Choose Us
Join us to make a meaningful impact on our organization's success. Together, we can drive business outcomes through data-driven insights.
Data Architecture Specialist
Posted today
Job Description
About Us
We are a leading engineering company that delivers business outcomes for hundreds of enterprises globally.
Our Expertise:
- Data and AI/ML Engineering
- Cybersecurity Solutions
- SDx and Digital Workspace Services
We offer a range of services including Professional and Advisory Services, Managed Services, and Talent Acquisition & Platform Resell Services.
Job Description:
The successful candidate will be responsible for building additional platform capabilities, optimizing, and maintaining robust data pipelines and platforms that empower data-driven decision-making across the organization.
Responsibilities:
- Collaborate with stakeholders during requirements clarification and sprint planning sessions to ensure alignment with business objectives.
- Design and implement technical solutions, including ETL pipelines, leveraging PySpark to extract, transform, and load data efficiently.
- Build solutions to integrate the two main data platforms (Palantir Foundry and Databricks).
- Integrate the data platforms with other platforms, including incident and monitoring tools, identity management, data observability, etc.
- Optimize existing ETL processes for improved performance, scalability, and reliability.
- Develop and maintain unit and integration tests to ensure quality and resilience.
- Provide support to QA teammates during the acceptance process.
- Resolve production incidents as a third-line engineer, ensuring system stability and uptime.
Required Skills and Qualifications:
- Bachelor's degree in IT or a related field.
- Minimum 8 years in IT/Data-related roles.
Technical Expertise:
- Proficient in PySpark for distributed computing and Python for ETL development.
- Advanced SQL skills for writing and optimizing complex queries.
- Familiarity with ETL tools, processes, and data warehousing platforms, particularly Databricks.
- Solid understanding of data modeling, including dimensional modeling, normalization, and schema design.
- Experienced with version control tools such as Git.
- Knowledge of monitoring tools to track and optimize pipeline performance.
- Knowledge of scheduling tools.
- Proficiency in data freshness and quality frameworks.
- Agile Methodologies: Comfortable with Agile practices, including sprint planning, stand-ups, and retrospectives.
- Collaboration Tools: Skilled in using Azure DevOps for team collaboration and project management.
- Problem-Solving: Strong debugging and troubleshooting abilities for complex data engineering issues.
- Communication: Exceptional written and verbal communication skills, with the ability to articulate technical concepts to non-technical stakeholders.
This role requires a passionate and motivated individual with extensive experience in data platform engineering, ETL development, and expertise in PySpark and Python. Good communication skills are a must, and candidates need to be articulate, precise, and concise.
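The responsibilities above emphasize ETL transforms backed by unit tests. A minimal sketch of that pattern is below: the transform is written as a pure function so it can be tested in isolation, mirroring how PySpark DataFrame transforms are usually structured (PySpark itself is omitted so the sketch stays self-contained; the record fields are hypothetical).

```python
def deduplicate_latest(records, key="id", version="updated_at"):
    """Keep only the most recent record per key - a common ETL cleanup step."""
    latest = {}
    for rec in records:
        k = rec[key]
        # Retain the record with the greatest version value for each key.
        if k not in latest or rec[version] > latest[k][version]:
            latest[k] = rec
    return sorted(latest.values(), key=lambda r: r[key])

# Unit-test style check, as called for in the responsibilities above:
raw = [
    {"id": 1, "updated_at": "2024-01-01", "status": "open"},
    {"id": 1, "updated_at": "2024-02-01", "status": "closed"},
    {"id": 2, "updated_at": "2024-01-15", "status": "open"},
]
clean = deduplicate_latest(raw)
assert [r["status"] for r in clean] == ["closed", "open"]
```

Keeping business logic in plain, testable functions like this makes the acceptance and QA-support duties listed above much easier to carry out.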
Data Architecture Professional
Posted today
Job Description
Role Summary:
Our organisation is seeking a highly skilled and experienced Data Architecture Professional to join our team. As a key member of our technology department, you will be responsible for designing, developing, and supporting conceptual/logical/physical data models for analytics solutions.
Key Responsibilities:
- Develop technical solutions to deliver scalable analytical solutions leveraging cloud and big data technologies.
- Design, develop, and support conceptual/logical/physical data models for analytics solutions.
- Ensure industry-accepted data architecture principles are integrated with allied disciplines and coordinated roll-out strategies are in place.
- Drive the design, sizing, setup of Azure environments and related services.
- Provide mentoring on data architecture design and requirements to development and business teams.
- Advise on new technology trends and possible adoption to maintain competitive advantage.
- Participate in pre-sales activities and publish thought leadership.
- Work closely with founders to drive technology strategy for the organisation.
- Lead technology team recruitments in various areas of data analytics.
Requirements:
To succeed in this role, you will need:
- 10+ years of experience delivering multiple data solutions.
- Experience with ETL development using SSIS, Data Factory, and related Microsoft technologies.
- In-depth skills with SQL Server, Azure Synapse, Azure Databricks, HDInsight, and Azure Data Lake.
- Experience with different data models, including normalized, denormalized, star, and snowflake models.
- Data Quality Management (Microsoft DQS) and data architecture standardization experience.
- Advanced study or knowledge of computer science or software engineering.
- Familiarity with principles and practices involved in development and maintenance of software solutions.
What We Offer:
Our organisation offers a dynamic and supportive work environment, opportunities for career growth and professional development, and a competitive salary and benefits package.
Data Architecture Specialist
Posted today
Job Description
Job Overview:
We are seeking a highly skilled Technical Architect with 12+ years of experience to oversee and guide the development of advanced data architectures, business intelligence solutions, and ETL workflows using Azure Databricks (ADB), Power BI, and related technologies. This is a strategic role in which you will collaborate with cross-functional teams to design robust, scalable, and efficient data systems that meet business objectives.
Key Responsibilities:
- Architecture Design & Strategy:
- Lead the design and architecture of data pipelines, ETL workflows, and BI solutions leveraging Azure Databricks (ADB) and Power BI.
- Develop high-level solutions, ensuring scalability, performance, and cost-effectiveness.
- Create data models, schemas, and architecture blueprints for various business units.
- Guide and mentor teams in implementing best practices for data processing, transformation, and storage.
- ETL Solutions:
- Design, develop, and optimize ETL workflows using Azure Databricks, Azure Data Factory, and other Azure services.
- Integrate data from multiple sources into a centralized data lake or warehouse for reporting and analytics.
- Ensure ETL processes are efficient, automated, and error-free.
- Business Intelligence & Reporting:
- Lead the implementation of Power BI solutions for reporting, dashboards, and data visualization.
- Collaborate with stakeholders to understand business requirements and deliver actionable insights through visual reports.
- Ensure that Power BI reports are optimized for performance, usability, and scalability.
- Collaboration & Leadership:
- Work closely with cross-functional teams (business analysts, data engineers, software engineers, and stakeholders) to gather requirements and deliver solutions.
- Provide technical guidance and mentorship to junior team members and foster a culture of continuous learning.
- Translate business requirements into technical specifications and deliver scalable and reliable solutions.
- Cloud & Data Technologies:
- Utilize Azure cloud services such as Azure Databricks, Azure SQL Database, Data Factory, ADLS, Blob Storage, and Azure Synapse to manage and orchestrate data workflows.
- Stay up-to-date with the latest trends in cloud computing, data architecture, and business intelligence.
- Quality Assurance & Best Practices:
- Establish coding standards, data governance practices, and security protocols for all data-related processes.
- Conduct code reviews and performance tuning, and ensure data integrity and accuracy.
- Design disaster recovery and backup strategies to ensure data availability and reliability.
Required Qualifications:
- 12+ years of experience as a Technical Architect or similar role, with a focus on Azure Databricks, Power BI, and ETL.
- Expertise in designing and implementing data architectures using Azure Databricks (ADB).
- Strong proficiency in Power BI for building scalable reports and dashboards.
- In-depth knowledge of ETL tools and processes, particularly Azure Data Factory and other Azure-based ETL solutions.
- Proficiency in SQL and familiarity with data warehousing concepts (e.g., star schema, snowflake schema).
- Strong understanding of cloud computing and Azure services, including storage, compute, and security best practices.
- Experience with data lake architecture, data pipelines, and data governance.
- Ability to understand complex business requirements and translate them into technical solutions.
- Strong communication skills with the ability to collaborate across business and technical teams.
- Leadership and mentoring experience, guiding junior team members to achieve project goals.
Preferred Qualifications:
- Certification in Azure (e.g., Azure Solutions Architect, Azure Data Engineer).
- Experience with other BI tools or visualization platforms (e.g., PowerApps).
- Knowledge of programming/scripting languages such as Python, Scala, or DAX.
- Familiarity with DevOps practices in data pipelines and CI/CD workflows.
- Experience with Agile methodologies and project management tools like JIRA or Azure DevOps.
Data Architecture Manager
Posted today
Job Description
Job Title: Senior Data Architect – OSS
Experience: Minimum 10+ years of hands-on experience in data architecture, data modelling, and designing large-scale data platforms.
Location: Bangalore (Onsite)
Why should you choose us?
Rakuten Symphony is a Rakuten Group company that provides global B2B services for the mobile telco industry and enables next-generation, cloud-based, international mobile services. Building on the technology Rakuten used to launch Japan’s newest mobile network, we are taking our mobile offering global. To support our ambitions to provide an innovative cloud-native telco platform for our customers, Rakuten Symphony is looking to recruit and develop top talent from around the globe. We are looking for individuals to join our team across all functional areas of our business – from sales to engineering, support functions to product development. Let’s build the future of mobile telecommunications together!
What Do We Expect From You
As a Senior Data Architect within Product Architecture, you will address the critical challenge of a fragmented data model across our various teams. This pivotal role is responsible for unifying disparate data sources - including telemetry, service graphs, and Root Cause Analysis (RCA) inputs - into a single, coherent, and consistent data model. This foundational work is a prerequisite for advancing our capabilities in Generative AI, comprehensive observability, and robust service assurance, enabling a truly data-driven approach across our product portfolio.
Responsibilities:
Unified Data Model Design & Ownership:
- Lead the architectural design and implementation of a unified data model that consolidates telemetry, service graphs, RCA, and other critical operational data.
- Define conceptual, logical, and physical data models that support diverse product needs while ensuring consistency and interoperability.
- Establish and maintain data architecture standards, principles, and best practices across the product organization.
Data Integration & Pipeline Architecture:
- Architect scalable and efficient data pipelines for ingesting, transforming, and harmonizing data from various heterogeneous sources.
- Design robust data integration patterns (e.g., streaming, batch processing) to ensure data availability and freshness for downstream consumers.
- Ensure data quality, integrity, and security throughout the data lifecycle.
Enabling AI, Observability & Assurance:
- Ensure the unified data model serves as a foundational prerequisite for GenAI applications, providing clean, contextualized data for training and inference.
- Design data structures and access patterns that enhance observability capabilities, enabling real-time monitoring, analytics, and alerting.
- Architect data solutions that power advanced service assurance functionalities, enabling accurate fault correlation, root cause analysis, and performance management.
Data Governance & Lifecycle Management:
- Collaborate with data governance teams to establish data ownership, definitions, and stewardship across product domains.
- Define data retention policies, archiving strategies, and data lifecycle management processes.
- Ensure compliance with data privacy regulations and security best practices.
Technical Leadership & Collaboration
- Provide senior technical leadership and guidance to engineering teams on data modelling, database technologies, and data architecture best practices.
- Work closely with other architects (AIML, Observability, Workflow Management), data scientists, and product managers to understand data requirements and deliver impactful solutions.
- Champion a data-first mindset and promote data literacy across the organization.
Qualifications :
- 10+ years of hands-on experience in data architecture, data modelling, and designing large-scale data platforms.
- Proven track record of unifying fragmented data landscapes and building robust, scalable data models for complex domains.
- Expertise in designing data solutions that support real-time analytics, observability, and machine learning applications.
- Strong background in architecting data pipelines and integration strategies for high-volume, diverse data sources.
Technical Skills
- Deep expertise in relational and NoSQL databases (e.g., Cassandra, ClickHouse) and TSDBs (e.g., Cortex, VictoriaMetrics).
- Strong experience with stream processing technologies (e.g., Kafka, Flink, Spark Streaming) and batch processing (e.g., Apache Spark).
- Proficient in data modelling techniques (dimensional, relational, graph) and data schema design.
- Knowledge of data governance frameworks and tools.
- Proficient in SQL and at least one relevant programming language.
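The stream-processing skills listed above center on windowed aggregation over telemetry events. The sketch below shows a tumbling-window average in pure Python, the same computation Kafka/Flink/Spark Streaming jobs perform at scale; the event timestamps and values are hypothetical.

```python
from collections import defaultdict

def tumbling_window_avg(events, window_s=60):
    """Group (timestamp, value) events into fixed windows and average each.

    A tumbling window partitions time into fixed, non-overlapping buckets;
    each event falls into exactly one bucket based on its timestamp.
    """
    buckets = defaultdict(list)
    for ts, value in events:
        # Floor the timestamp to the start of its window.
        buckets[ts // window_s * window_s].append(value)
    return {start: sum(vs) / len(vs) for start, vs in sorted(buckets.items())}

events = [(5, 10.0), (30, 20.0), (65, 30.0)]   # spans two 60-second windows
print(tumbling_window_avg(events))  # {0: 15.0, 60: 30.0}
```

Production systems add watermarks and state management for late or out-of-order events, which is where engines like Flink earn their keep.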
Domain Knowledge
- Familiarity with telecom operational data, including network telemetry, service topology/graphs, and alarm/event data.
Analytical and Problem-Solving Skills
- Exceptional analytical skills to identify data relationships, inconsistencies, and design optimal data structures.
- Strong problem-solving abilities for complex data integration and quality challenges.
Collaboration & Communication
- Excellent written and verbal communication skills, able to articulate complex data architectural concepts to technical and business stakeholders.
- Proven ability to collaborate effectively with cross-functional teams, driving consensus on data standards.
Educational Background
- Bachelor’s degree in Computer Science, Data Science, Information Systems, or a related technical field.
RAKUTEN SHUGI PRINCIPLES:
Our worldwide practices describe specific behaviours that make Rakuten unique and united across the world. We expect Rakuten employees to model these 5 Shugi Principles of Success.
- Always improve, always advance. Only be satisfied with complete success - Kaizen.
- Be passionately professional. Take an uncompromising approach to your work and be determined to be the best.
- Hypothesize - Practice - Validate - Shikumika. Use the Rakuten Cycle to succeed in unknown territory.
- Maximize Customer Satisfaction. The greatest satisfaction for workers in a service industry is to see their customers smile.
- Speed! Speed! Speed! Always be conscious of time. Take charge, set clear goals, and engage your team.