Data Modeling Advisor
Posted 7 days ago
Job Description
About Evernorth:
Evernorth Health Services, a division of The Cigna Group (NYSE: CI), creates pharmacy, care, and benefits solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people.
Data Modeling Advisor
Position Summary:
The Health Services Data Design and Metadata Management team is hiring an Architecture Senior Advisor to work across all projects. The work involves understanding and driving data design best practices, including data modeling, mapping, and analysis, and helping others apply them across strategic data assets. The data models are wide-ranging and must include the appropriate metadata to support and improve our data intelligence. Data design centers on standard health care data (eligibility, claim, clinical, and provider data) across structured and unstructured data platforms.
Job Description & Responsibilities:
- Perform data analysis, data modeling, and data mapping following industry and Evernorth data design standards for analytics/data warehouses and operational data stores across various DBMS and platform types, including Teradata, Oracle, Hadoop, Databricks, cloud databases, and data lakes.
- Perform data analysis, profiling, and validation, contributing to data quality efforts to understand data characteristics and ensure data is correct and fit for use (a minimal profiling sketch follows this list).
- Participate in and coordinate data model metadata development processes to support ongoing development efforts (data dictionary, NSM, and FET files), maintenance of data model/data mapping metadata, and linking of our data design metadata to business terms, data models, mapping documents, ETL jobs, and data model governance operations (policies, standards, best practices).
- Facilitate and actively participate in data model/data mapping reviews and audits, fostering collaborative working relationships and partnerships with multidisciplinary teams.
- Provide guidance, mentoring, and training as needed in data modeling, data lineage, DDL code, and the associated toolsets (Erwin Data Modeler, Erwin Web Portal, Erwin Model Mart, Erwin Data Intelligence Suite, Alation).
- Assist with the creation, documentation, and maintenance of Evernorth data design standards and best practices involving data modeling, data mapping, and metadata capture, including data sensitivity, data quality rules, and reference data usage.
- Develop and facilitate strong partnerships and working relationships with Data Governance, delivery, and other data partners.
- Continuously improve operational processes for data design metadata management for global and strategic data.
- Interact with business stakeholders and IT to define and manage data design. Coordinate, collaborate, and innovate with Solution Verticals, Data Lake teams, and IT and business portfolios to ensure alignment of data design metadata and related information with ongoing programs (cyber risk and security) and development efforts.
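A minimal data-profiling sketch of the kind referenced above, using Python/pandas with hypothetical claim columns (claim_id, member_id, paid_amount), could look like this:

```python
import pandas as pd

# Hypothetical claim extract; column names are illustrative only.
claims = pd.DataFrame({
    "claim_id": ["C001", "C002", "C002", "C004"],
    "member_id": ["M10", "M11", None, "M12"],
    "paid_amount": [120.50, 80.00, 80.00, -5.00],
})

# Basic profiling: row count, null rates, duplicate keys, and out-of-range values.
profile = {
    "row_count": len(claims),
    "null_rate": claims.isna().mean().to_dict(),
    "duplicate_claim_ids": int(claims["claim_id"].duplicated().sum()),
    "negative_paid_amounts": int((claims["paid_amount"] < 0).sum()),
}
print(profile)
```

In practice, checks like these would be driven by the documented data quality rules and reference data usage rather than hard-coded thresholds.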
Experience Required:
- 11 to 13 years' experience with data modeling (logical/physical data models, canonical structures, etc.) and SQL code.
Experience Desired:
- Subject-matter-expert level of experience preferred.
- Experience executing data model / data lineage governance across business and technical data.
- Experience utilizing data model/lineage/mapping/analysis management tools for business, technical, and operational metadata (Erwin Data Modeler, Erwin Web Portal, Erwin Model Mart, Erwin Data Intelligence Suite, Alation).
- Experience working in an Agile delivery environment (Jira, Confluence, SharePoint, Git, etc.)
Education and Training Required:
- Advanced degree in Computer Science or a related discipline and at least six, typically eight or more, years' experience in all phases of data modeling, data warehousing, data mining, data entity analysis, logical database design, and relational database definition, or an equivalent combination of education and work experience.
Primary Skills:
- Physical Data Modeling, Data Warehousing, Metadata, Reference Data, Data Mapping
- Data Mining, Teradata, Data Quality, Excellent Communication Skills, Data Analysis, Oracle
- Data Governance, Database Management System, Jira, DDL, Data Integration, Microsoft, SharePoint, Database Modeling, Confluence, Agile, Marketing Analysis, Operations, Topo, Data Lineage, Data Warehouses, Documentation
- Big Data, Web Portal, Maintenance, Erwin, SQL, Unstructured Data, Audit, Git, Pharmacy
- DBMS, Databricks, AWS
Azure Data Modeling Specialist
Posted today
Job Description
Tezo is a new generation Digital & AI solutions provider, with a history of creating remarkable outcomes for our customers. We bring exceptional experiences using cutting-edge analytics, data proficiency, technology, and digital excellence.
Data Modeler – Azure Data Engineering
Location: Hyderabad
Experience Level: 8–13 Years
- 12+ years of experience in data management and data architecture, including 5+ years focused on data modeling.
- Expertise in dimensional modeling (star/snowflake), normalized models, and data vault/data lake modeling.
- Strong experience with SQL and the Azure cloud ecosystem: Azure Synapse, Data Factory, Data Lake, SQL DB, Databricks.
- Proven experience designing data models for enterprise data warehouses, data lakes, and analytics platforms.
- Experience working with business glossary, metadata management, and data catalog tools (e.g., Purview, Collibra).
- Knowledge of ETL/ELT processes, data pipelines, and data integration patterns.
- Excellent communication, stakeholder management, and documentation skills.
Key Responsibilities
- Data Modelling & Architecture
- Design, develop, and maintain conceptual, logical, and physical data models for data warehouse, data lakehouse, and transactional systems.
- Implement and optimize dimensional models (star/snowflake) and data vault/lakehouse models aligned with business needs; a minimal star-schema sketch follows this list.
- Define and enforce data modeling standards, naming conventions, and metadata management practices.
- Collaborate with architects to define the data architecture blueprint, ensuring scalability, governance, and performance.
- Cloud Data Engineering (Azure)
- Partner with Azure data engineers to implement data models using services such as Azure Data Lake, Azure Synapse Analytics, Azure SQL Database, Data Factory, and Databricks.
- Contribute to the design and optimization of data ingestion, transformation, and orchestration pipelines in Azure.
- Participate in data governance, master data management (MDM), and data quality initiatives.
- Collaboration & Stakeholder Engagement
- Work with business teams and data analysts to understand reporting and analytical requirements.
- Partner with enterprise architects to align modeling practices with data strategy and enterprise standards.
- Document data models, lineage, and definitions using enterprise metadata tools.
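A minimal star-schema sketch of the kind referenced above, written in Python and using the built-in sqlite3 module purely to validate illustrative DDL (table and column names are hypothetical, not specified by the posting), could look like this:

```python
import sqlite3

# In-memory database used only to check that the DDL is well-formed.
conn = sqlite3.connect(":memory:")

# A minimal star schema: one fact table joined to two dimensions via surrogate keys.
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT NOT NULL,
    segment       TEXT
);

CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,   -- e.g. 20240131
    full_date TEXT NOT NULL,
    year      INTEGER NOT NULL,
    month     INTEGER NOT NULL
);

CREATE TABLE fact_sales (
    sales_key    INTEGER PRIMARY KEY,
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
    quantity     INTEGER NOT NULL,
    net_amount   REAL NOT NULL
);
""")
print("star schema created")
```

On Azure, the same logical design would typically be deployed to Synapse or Databricks rather than SQLite; the sketch only illustrates the fact/dimension split and surrogate-key joins.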
ETL and Data Modeling Specialist
Posted today
Job Description
1. Cloud & Infrastructure
- AWS services: must be proficient in building scalable data pipelines and managing cloud-native ETL workflows.
- Snowflake: moderate understanding of Snowflake architecture.
- CI/CD (Terraform or CloudFormation, Jenkins, Bitbucket): for infrastructure-as-code and deployment automation.
2. Programming & Scripting
- Python & PySpark: ability to write efficient scripts for data transformation and pipeline orchestration; knowledge of Spark or other distributed processing frameworks.
- SQL: Advanced querying, optimization, and data modelling.
3. ETL & Data Modelling
- Familiarity with event-driven architectures, API-based data sources, data quality validation, archival strategies, and incremental loading techniques (a watermark-based incremental-load sketch follows below).
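A watermark-based incremental-load sketch of the kind referenced above, assuming PySpark with hypothetical S3 paths and column names (updated_at, order_id), could look like this:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("incremental_load_sketch").getOrCreate()

# Hypothetical paths and columns; in practice these would come from job configuration.
source_path = "s3://example-bucket/raw/orders/"
target_path = "s3://example-bucket/curated/orders/"

# Watermark: the highest updated_at already present in the target (None on first run).
try:
    last_loaded = (
        spark.read.parquet(target_path)
        .agg(F.max("updated_at").alias("wm"))
        .collect()[0]["wm"]
    )
except Exception:  # target does not exist yet
    last_loaded = None

incoming = spark.read.parquet(source_path)
if last_loaded is not None:
    # Keep only rows newer than the watermark.
    incoming = incoming.filter(F.col("updated_at") > F.lit(last_loaded))

# Deduplicate within the batch on the business key, then append the new rows.
incoming.dropDuplicates(["order_id"]).write.mode("append").parquet(target_path)
```

A production pipeline would usually persist the watermark in a control table and handle late-arriving updates with a merge rather than a plain append; the sketch only shows the basic pattern.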
SAP Data Engineer (DataSphere/BW/HANA/Data Modeling)
Posted today
Job Description
Role: SAP Data Engineer (DataSphere/BW/HANA/Data Modeling)
Client NBCUniversal
Location: Remote in India
Duration: 12-month contract with extensions
Job Description:
Provide state-of-the-art technical support for SAP Datasphere and BW4 system maintenance and enhancements, including integration with other systems. This includes participating in project implementation, design, and development activities to support successful business adoption of the new solution, as well as ongoing production support and enhancement post go-live. This role requires a strong understanding of Financial/HR Reporting/BPC and proven experience in designing and delivering high-quality solutions, through technical development, that meet overall business requirements.
Candidate's core skill set must include:
- DataSphere
- S/4HANA
- SAC
- BW Data Modeling
Plusses:
- SAP ABAP, HANA/AMDP/CDS, SOAP/OData/REST APIs, etc.
Junior Data Scientist - Predictive Modeling
Posted 17 days ago
Job Description
Responsibilities:
- Assist senior data scientists in collecting, cleaning, and preparing large datasets for analysis.
- Develop and implement predictive models using various machine learning algorithms (e.g., regression, classification, clustering); a minimal classification sketch follows this list.
- Perform exploratory data analysis to identify trends, patterns, and insights.
- Create data visualizations to effectively communicate findings to technical and non-technical stakeholders.
- Contribute to the development and testing of new algorithms and statistical models.
- Collaborate with cross-functional teams to understand data needs and project requirements.
- Document methodologies, findings, and code for reproducibility and knowledge sharing.
- Participate in team meetings, brainstorming sessions, and code reviews.
- Learn and apply new data science techniques and tools as needed.
- Assist in the deployment and monitoring of machine learning models in production environments.
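A minimal predictive-modeling sketch of the kind referenced above, using scikit-learn on synthetic data (the library choice and dataset are illustrative, not specified by the posting), could look like this:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic dataset standing in for a cleaned, prepared feature table.
X, y = make_classification(n_samples=1000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Simple baseline classifier; in practice, model choice follows exploratory analysis.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("test AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

A baseline like this is typically the starting point before comparing richer models and tuning hyperparameters with a senior data scientist.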
Qualifications:
- Currently pursuing or recently completed a Bachelor's or Master's degree in Computer Science, Statistics, Mathematics, Data Science, or a related quantitative field.
- Foundational knowledge of statistical concepts and machine learning algorithms.
- Proficiency in programming languages such as Python or R.
- Familiarity with data manipulation libraries (e.g., Pandas, NumPy) and data visualization tools (e.g., Matplotlib, Seaborn).
- Experience with SQL for database querying.
- Strong analytical and problem-solving skills.
- Excellent written and verbal communication abilities.
- Eagerness to learn and adapt to new technologies.
- Ability to work independently and collaboratively in a remote team setting.
- Previous internship or project experience in data analysis or machine learning is a plus.
Data Storage Specialist
Posted today
Job Description
Primary Skill: DELL, PURE, Brocade
Secondary Skill: Cisco, HPE & Netapp
Years of experience: 5-8
Job Description
Design and Implementation
- Architect, deploy, and manage SAN solutions with a focus on DELL, PURE, Cisco, Brocade, and HPE technologies.
- Design and implement scalable, high-availability SAN environments to support business applications and disaster recovery solutions.
Administration and Maintenance
- Perform regular SAN maintenance, updates, and patches to ensure optimal performance and security.
- Monitor SAN performance and troubleshoot issues to minimize downtime and data loss.
- Manage SAN capacity planning and performance tuning.
Database Management Engineer
Posted today
Job Description
Greetings From TCS!
TCS presents an excellent opportunity for "Primavera Omega P6 requirement"
Job Title: Primavera Omega P6 requirement
Location: Hyderabad/ Pune/ Indore
Experience Range: 6 Years & Above
Job Description:
- Hands-on experience with Primavera modules: P6 EPPM, P6 Professional, P6 Analytics, P6 API, and P6 web services.
- Working experience with P6 installations, configurations, and upgrades, with strong knowledge of the P6 database schema.
- Ability to create custom BI reports and application-specific reports.
- Knowledge of integrating P6 with other applications.
- Implement enhancements and custom features aligned with business needs.
- Provide subject matter expertise in Primavera application to the customer
- Project planning and scheduling knowledge to address user queries on functionality.
- End-user support: able to resolve technical and functional issues.