3,496 Data Modeling jobs in India

Data Modeling

Hyderabad, Andhra Pradesh Virtusa

Posted today


Job Description

Data Modeling - CREQ Description GDT WPB DF Data Expert JD:

Mandatory Skills
8-12 years of strong Data Analysis and/or Data Modeling experience.
Strong financial-domain and data analysis skills, with experience covering requirement gathering, elicitation, gap analysis, data analysis, effort estimation, and reviews, plus the ability to translate high-level functional data or business requirements into technical solutions, database designs, and data mappings.
Comprehensive understanding of conceptual, logical, and physical data modeling; create and deliver high-quality data models by following agreed data governance and standards. Maintain quality metadata and data-related artefacts that are accurate, complete, consistent, unambiguous, reliable, accessible, traceable, and valid.
Should be an individual contributor with a good understanding of the SDLC and Agile methodologies.
A team player with a self-starter approach and a sense of ownership; a problem solver with a solution-seeking approach and the ability to work in a fast-paced, continuously changing environment.
Excellent communication and stakeholder-management skills; capable of building rapport and relationships.
Act as a liaison between business and technical teams, bridging gaps and helping both business and technical teams deliver projects successfully.
Other Skills and Tools: SQL, MS Office tools, GCP BigQuery, Erwin, Visual Paradigm (preferable).
Responsibilities
Support the delivery of complex transformation program development within data portfolio.
Work within agile, multi-skilled teams to create world-class products that serve our customers' needs.
Perform elicitation and analysis of business-change, functional, and non-functional requirements across a range of stakeholders; work with cross-asset IT teams to interface those requirements and deliver a working reconciliation solution.
Own and produce relevant analysis and modeling artefacts that enable development teams to develop working products.
Understand the user journey end to end which goes beyond the system.
Provide advanced business knowledge and technical support for requirements development.
Create/enhance logical and physical data models by adhering to the agreed standards, to fulfil both business as well as technical requirements and expectations.
Undertake metadata analysis, including but not limited to naming of logical entities and attributes and physical tables and columns, definitions, and appropriate data types and lengths.
Create and maintain the data models.
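The modeling work described above moves from logical entities to physical tables plus catalogue metadata. A minimal sketch of that last step (hypothetical entities, SQLite standing in for the target DBMS; real work would follow the agreed naming standards and governance):

```python
import sqlite3

# Hypothetical physical model for a "customer holds account" relationship,
# using common conventions (singular entity names, *_id surrogate keys).
ddl = """
CREATE TABLE customer (
    customer_id   INTEGER PRIMARY KEY,
    full_name     TEXT    NOT NULL,
    date_of_birth TEXT
);
CREATE TABLE account (
    account_id   INTEGER PRIMARY KEY,
    customer_id  INTEGER NOT NULL REFERENCES customer(customer_id),
    product_code TEXT    NOT NULL,
    opened_date  TEXT    NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)

# Metadata for the data dictionary can be read back from the catalogue.
cols = [row[1] for row in conn.execute("PRAGMA table_info(account)")]
print(cols)  # ['account_id', 'customer_id', 'product_code', 'opened_date']
```

The same catalogue query is the starting point for keeping model artefacts and deployed schemas in sync.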

Primary Location: Hyderabad, Andhra Pradesh, India
Job Type: Experienced
Primary Skills: Data Modelling for Analytical (OLAP)
Years of Experience: 6


Travel No

Data Modeling

Bengaluru, Karnataka Photon

Posted today


Job Description

Job Summary:

We are seeking a skilled Data Engineer with expertise in Data Modeling, SQL, Snowflake, Python, AWS, and NoSQL. Experience in NoSQL Data Modeling is a plus.

Key Responsibilities:
  • Design and implement data models to support analytical and operational workloads.
  • Develop, optimize, and manage SQL queries for data extraction, transformation, and loading (ETL).
  • Work extensively with Snowflake to build scalable data pipelines and warehouses.
  • Develop and maintain Python scripts for data processing and automation.
  • Implement AWS services for cloud-based data solutions.
  • Work with NoSQL databases to handle semi-structured and unstructured data.
  • Ensure data accuracy, consistency, and security across various storage systems.
  • Collaborate with data scientists, analysts, and software engineers to deliver business insights.

Required Skills:
  • Strong experience in Data Modeling (Relational & NoSQL).
  • Proficiency in SQL and experience with database technologies.
  • Hands-on experience with Snowflake for data warehousing.
  • Strong programming skills in Python for data processing.
  • Expertise in AWS cloud services for data infrastructure.
  • Experience working with NoSQL databases.

Good to Have:
  • Experience in NoSQL Data Modeling best practices.

Location: Bangalore
Experience: 6-9 Years

Data Modeling Advisor

Hyderabad, Andhra Pradesh Evernorth

Posted 5 days ago


Job Description

About Evernorth:

Evernorth Health Services, a division of The Cigna Group (NYSE: CI), creates pharmacy, care, and benefits solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people.

Data Modeling Advisor
Position Summary:

The Health Services Data Design and Metadata Management team is hiring an Architecture Senior Advisor to work across all projects. The work involves understanding and driving data design best practices, including data modeling, mapping, and analysis, and helping others to apply them across strategic data assets. The data models are wide-ranging and must include the appropriate metadata to support and improve our data intelligence. Data design centers around standard health care data (eligibility, claim, clinical, and provider data) across structured and unstructured data platforms.

Job Description & Responsibilities:

  • Perform data analysis, data modeling, and data mapping following industry and Evernorth data design standards for analytics/data warehouses and operational data stores across various DBMS types, including Teradata, Oracle, Hadoop, Databricks, cloud platforms, and data lakes.
  • Perform data analysis, profiling and validation, contributing to data quality efforts to understand data characteristics and ensure data correctness/condition for use.
  • Participate in and coordinate data model metadata development processes to support ongoing development efforts (data dictionary, NSM, and FET files), maintenance of data model/data mapping metadata, and linking of our data design metadata to business terms, data models, mapping documents, ETL jobs, and data model governance operations (policies, standards, best practices).
  • Facilitate and actively participate in data model/data mapping reviews and audits, fostering collaborative working relationships and partnerships with multidisciplinary teams.
  • Provide guidance, mentoring, and training as needed in data modeling, data lineage, DDL code, and the associated toolsets (Erwin Data Modeler, Erwin Web Portal, Erwin Model Mart, Erwin Data Intelligence Suite, Alation). Assist with the creation, documentation, and maintenance of Evernorth data design standards and best practices involving data modeling, data mapping, and metadata capture, including data sensitivity, data quality rules, and reference data usage. Develop and facilitate strong partnerships and working relationships with Data Governance, delivery, and other data partners. Continuously improve operational processes for data design metadata management for global and strategic data.
  • Interact with Business stakeholders and IT in defining and managing data design. Coordination, collaboration, and innovation with Solution Verticals, Data Lake teams, IT & Business Portfolios to ensure alignment of data design metadata and related information with ongoing programs (cyber risk and security) and development efforts.
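The data-dictionary maintenance described above can be partly mechanized by reading column metadata straight from the catalogue. A sketch with a hypothetical claim table (SQLite standing in for Teradata/Oracle):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE claim (
        claim_id    INTEGER PRIMARY KEY,  -- surrogate key
        member_id   INTEGER NOT NULL,     -- links to eligibility data
        service_dt  TEXT    NOT NULL,
        paid_amount REAL
    )
""")

# Build data-dictionary rows (name, type, nullable) from the catalogue;
# PRAGMA table_info returns (cid, name, type, notnull, default, pk).
dictionary = [
    {"column": row[1], "type": row[2], "nullable": not row[3]}
    for row in conn.execute("PRAGMA table_info(claim)")
]
for entry in dictionary:
    print(entry)
```

Entries generated this way would still need the business definitions, sensitivity flags, and glossary links the posting calls out, which cannot be derived from the schema alone.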

Experience Required:

  • 11 to 13 years' experience with data modeling (logical / physical data model, canonical structures, etc.) and SQL code;

Experience Desired:

  • Subject matter expertise level experience preferred
  • Experience executing data model / data lineage governance across business and technical data.
  • Experience utilizing data model / lineage / mapping / analysis management tools for business, technical, and operational metadata (Erwin Data Modeler, Erwin Web Portal, Erwin Model Mart, Erwin Data Intelligence Suite, Alation)
  • Experience working in an Agile delivery environment (Jira, Confluence, SharePoint, Git, etc.)

Education and Training Required:

  • Advanced degree in Computer Science or a related discipline and at least six (typically eight or more) years' experience in all phases of data modeling, data warehousing, data mining, data entity analysis, logical database design, and relational database definition, or an equivalent combination of education and work experience.

Primary Skills:

  • Physical Data Modeling, Data Warehousing, Metadata, Reference Data, Data Mapping
  • Data Mining, Teradata, Data Quality, Excellent Communication Skills, Data Analysis, Oracle
  • Data Governance, Database Management System, Jira, DDL, Data Integration, Microsoft, SharePoint, Database Modeling, Confluence, Agile, Marketing Analysis, Operations, Topo, Data Lineage, Data Warehouses, Documentation
  • Big Data, Web Portal, Maintenance, Erwin, SQL, Unstructured Data, Audit, Git, Pharmacy
  • DBMS, Databricks, AWS

Data Modeling - BLR

Photon

Posted today


Job Description

Job Summary:

We are seeking a skilled Data Engineer with expertise in Data Modeling, SQL, Snowflake, Python, AWS, and NoSQL. The ideal candidate will be responsible for designing and implementing scalable data solutions, ensuring efficient data storage, retrieval, and processing. Experience in NoSQL Data Modeling is a plus.

Key Responsibilities:
  • Design and implement data models to support analytical and operational workloads.
  • Develop, optimize, and manage SQL queries for data extraction, transformation, and loading (ETL).
  • Work extensively with Snowflake to build scalable data pipelines and warehouses.
  • Develop and maintain Python scripts for data processing and automation.
  • Implement AWS services for cloud-based data solutions.
  • Work with NoSQL databases to handle semi-structured and unstructured data.
  • Ensure data accuracy, consistency, and security across various storage systems.
  • Collaborate with data scientists, analysts, and software engineers to deliver business insights.

Required Skills:
  • Strong experience in Data Modeling (Relational & NoSQL).
  • Proficiency in SQL and experience with database technologies.
  • Hands-on experience with Snowflake for data warehousing.
  • Strong programming skills in Python for data processing.
  • Expertise in AWS cloud services for data infrastructure.
  • Experience working with NoSQL databases.

Good to Have:
  • Experience in NoSQL Data Modeling best practices.

Location: Bangalore

    Experience: 6-9 Years
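The NoSQL data modeling skill this role highlights usually comes down to choosing between embedding and referencing related data. A minimal sketch with hypothetical customer/order documents (plain Python dicts standing in for a document store):

```python
# Two classic NoSQL modeling choices for the same data (hypothetical document
# shapes; a real design would weigh read patterns and document-growth limits).

# 1) Embedding: orders live inside the customer document -- one read, no joins.
embedded = {
    "customer_id": "C1",
    "name": "Asha",
    "orders": [
        {"order_id": "O1", "total": 250.0},
        {"order_id": "O2", "total": 99.0},
    ],
}

# 2) Referencing: orders are separate documents linked by key -- avoids
#    unbounded document growth when a customer can have very many orders.
customers = {"C1": {"customer_id": "C1", "name": "Asha"}}
orders = [
    {"order_id": "O1", "customer_id": "C1", "total": 250.0},
    {"order_id": "O2", "customer_id": "C1", "total": 99.0},
]

# Reassembling the referenced form (an application-side "join") reproduces
# the embedded view.
rebuilt = dict(customers["C1"])
rebuilt["orders"] = [
    {"order_id": o["order_id"], "total": o["total"]}
    for o in orders
    if o["customer_id"] == "C1"
]
assert rebuilt == embedded
```

Embedding optimizes for one-read access; referencing trades an extra lookup for bounded document size and independent writes.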


Data Modeling L4

    Tredence

    Posted today


    Job Description

About Tredence

Tredence is a global data science solutions provider founded in 2013 by Shub Bhowmick, Sumit Mehra, and Shashank Dubey, focused on solving the last-mile problem in AI. Headquartered in San Jose, California, the company embraces a vertical-first approach and an outcome-driven mindset to help clients win and accelerate value realization from their analytics investments. The aim is to bridge the gap between insight delivery and value realization by providing customers with a differentiated approach to data and analytics through tailor-made solutions. Tredence is 2,000-plus employees strong with offices in San Jose, Foster City, Chicago, London, Toronto, and Bangalore, with the largest companies in retail, CPG, hi-tech, telecom, healthcare, travel, and industrials as clients.

Role and Responsibilities
- Working closely with the data analyst and data platform teams to analyze and evaluate existing data systems, continuously updating and optimizing data models.
    - Working closely with Business and Analytics team to understand business needs and translate them into long-term solution data models.
    - Working with the data ingestion teams from different business areas to understand the data availability and format and to create conceptual data models and data flows.
    - Jointly accountable, together with Platform and Development teams, for the performance of the reporting solutions produced.
    - Developing best practices for data coding naming conventions, default values, semantics, data dictionary, etc. to ensure consistency within the system and act as an educator and ambassador for best practices when it comes to the data modelling.
    - Reviewing modifications of existing systems for cross-compatibility.
    - Implementing data strategies and developing physical data models.
Job Location: Bangalore, Chennai, Gurgaon, Pune

Requirements
- Bachelor's degree in computer science, information technology, or a similar field.
- Eight years of hands-on experience with physical and relational data modelling, with 4+ years of overall industry experience.
    - Expert knowledge of metadata management and related tools.
    - Knowledge of mathematical foundations and statistical analysis.
    - Strong interpersonal skills.
    - Excellent communication and presentation skills.
    - Advanced troubleshooting skills
Qualification & Experience: 7 to 14 Years

Why Join Tredence?

There is a reason we are one of the fastest-growing private companies in the country! You will have the opportunity to work with some of the smartest and most fun-loving people in the data analytics space. You will work with the latest technologies and interface directly with key decision stakeholders at our reputed clients, some of the largest and most innovative global business brands. Our people are our greatest asset and we value every one of them. We are an equal opportunity employer who adheres to our core values and reflects this in our day-to-day life. So please come and see why we're so successful in one of the most competitive and fastest-growing industries in the world.

    Big Data Engineer, Data Modeling

    Hyderabad, Andhra Pradesh data.ai

    Posted today


    Job Description

What can you tell your friends when they ask you what you do?

We’re looking for an experienced Big Data Engineer who can create innovative new products in the analytics and data space. You will participate in the development that creates the world's #1 mobile app analytics service. Together with the team, you will build out new product features and applications using agile methodologies and open-source technologies. You will work directly with Data Scientists, Data Engineers, Product Managers, and Software Architects, and will be on the front lines of coding new and exciting analytics and data mining products. You should be passionate about what you do and excited to join an entrepreneurial start-up.

    To ensure we execute on our values we are looking for someone who has a passion for:

As a Big Data Engineer, you will be in charge of model implementation and maintenance, building a clean, robust, and maintainable data processing program that can support these projects on huge amounts of data. This includes:

  • Able to design and implement complex data product components based on requirements with possible technical solutions.
  • Write data programs using Python (e.g., pyspark) with a commitment to maintaining high-quality work while being confident in dealing with data mining challenges.
  • Discover feasible new technologies in the Big Data ecosystem (for example, the Hadoop ecosystem) and share them with the team from your professional perspective.
  • Get up to speed in the data science and machine learning domain, implementing analysis components in a distributed computing environment (e.g., MapReduce implementation) with instruction from Data Scientists.
  • Be comfortable conducting detailed discussions with Data Scientists regarding specific questions related to specific data models.
  • You should be a strong problem solver with proven experience in big data.

You should recognize yourself in the following:

  • Hands-on experience and deep knowledge of the Hadoop ecosystem.
  • Must: PySpark, MapReduce, HDFS.
  • Plus: Storm, Kafka.
  • Must have 2+ years of Linux environment development experience.
  • Proficient in programming with Python & Scala; experience with Pandas, Sklearn, or other data science and data analysis toolsets is a big plus.
  • Experience in data pipeline design & automation.
  • A background in data mining, analytics and data science component implementation, and the machine learning domain, plus familiarity with common algorithms and libraries, is a plus.
  • Passion for cloud computing (AWS in particular) and distributed systems.
  • You must be a great problem solver with the ability to dive deeply into complex problems and emerge with clear and pragmatic solutions.
  • Good communication and cooperation skills for working globally.
  • Major in Math or Computer Science.
  • You are driven by passion for innovation that pushes us closer to our vision in everything we do. Centering around our purpose and our hunger for new innovations is the foundation that allows us to grow and unlock the potential in AI.
  • You are an Ideal Team Player: You are hungry and no, we are not talking about food here. You are humble, yet love to succeed, especially as a team! You are smart, and not just book smart, you have a great read on people.
  • This position is located in Hyderabad, India.
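The MapReduce-style analysis components mentioned above can be sketched in plain Python, with no cluster; this is a pure-Python stand-in for what PySpark would distribute across executors:

```python
from collections import defaultdict

def map_phase(records):
    # Emit (key, 1) pairs, as a mapper would for each input split.
    for record in records:
        for word in record.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    # Group by key and sum, as the shuffle + reducers would.
    grouped = defaultdict(int)
    for key, count in pairs:
        grouped[key] += count
    return dict(grouped)

# Hypothetical app-event log lines.
logs = ["app opened", "app crashed", "App opened"]
counts = reduce_phase(map_phase(logs))
print(counts)  # {'app': 3, 'opened': 2, 'crashed': 1}
```

In PySpark the same shape appears as `flatMap` followed by `reduceByKey`; the framework handles partitioning and the shuffle.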

We are hiring for our engineering team at our data.ai India subsidiary entity, which is in the process of being established. While we await approval from the Indian government, hires will be interim employees of Innova Solutions, our Global Employer of Record.


    Data Modeling Techniques And Methodologies

    560001 Bangalore, Karnataka 2coms

    Posted 391 days ago


    Job Description

    Hi All,

    Greetings for the day!

    We are hiring for one of our renowned IT Global MNC Clients.

    PFB the job details 

Job Title: Data Modeling Techniques and Methodologies

Job Location: Bangalore / Mumbai / Chennai / Hyderabad / Kolkata

Total Exp: 6+ years

Rel Exp: 5+ years

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Data Modeling Techniques and Methodologies.
- Strong understanding of relational and dimensional data modeling concepts.
- Experience with data modeling tools such as ERwin or PowerDesigner.
- Knowledge of SQL and database management systems.
- Familiarity with data integration and ETL processes.
- Good-to-Have Skills: Experience with data governance and metadata management.
- Experience with cloud-based data platforms such as AWS or Azure.
- Knowledge of big data technologies like Hadoop or Spark.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Data Modeling Techniques and Methodologies.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

    Architect - Data Modeling (Bangalore / Chennai / Hyderabad)

    560001 Bangalore, Karnataka 2coms

    Posted 526 days ago


    Job Description

    Permanent
About Client

Our Client is a global leader in AI and Analytics, helping Fortune 1000 companies solve their toughest challenges. They have a team of 4,000+ technologists and consultants based in the US, Canada, the UK, India, Singapore, and Australia, working closely with clients across CPG, Retail, Insurance, BFS, Manufacturing, Life Sciences, and Healthcare. Our client is Great Place to Work-Certified and ranked among the 'Best' and 'Fastest Growing' analytics firms by Inc., Financial Times, Economic Times, and Analytics India Magazine.

Job Responsibility

As an Architect, you will work to solve some of the most complex and captivating data management problems that would enable them as a data-driven organization, seamlessly switching between the roles of individual contributor, team member, and Data Modeling Architect as demanded by each project to define, design, and deliver actionable insights.

On a typical day, you might:
- Engage the clients and understand the business requirements to translate those into data models.
- Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights.
- Create and maintain the Source-to-Target Data Mapping document, which includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Gather and publish Data Dictionaries.
- Maintain data models, capture data models from existing databases, and record descriptive information.
- Build data warehouses and data marts (on Cloud) while performing data profiling and quality analysis.
- Use version control to maintain versions of data models.
- Design and develop data extraction and integration code modules.
- Partner with the data engineers and testing practitioners to strategize ingestion logic, consumption patterns, and testing.
- Ideate to design and develop the next-gen data platform by collaborating with cross-functional stakeholders.
- Monitor project progress to keep the leadership teams informed on milestones, impediments, etc.
- Coach team members and review code artifacts.

Requirements
- 10+ years of experience in the Data space.
- Significant experience in one or more RDBMS (Oracle, DB2, and SQL Server).
- Real-time experience working in OLAP & OLTP database models (dimensional models).
- Comprehensive understanding of Star schema, Snowflake schema, and Data Vault modelling; also of any ETL tool, Data Governance, and Data Quality.
- An eye for analyzing data and comfort following agile methodology.
- Adept understanding of any of the cloud services is preferred (Azure, AWS & GCP).
- Enthusiasm to coach team members, collaborate with various stakeholders across the organization, and take complete ownership of deliverables.
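The Star schema the requirements above call out puts one central fact table at the grain of the business event, with foreign keys out to denormalized dimensions. A minimal sketch with a hypothetical sales mart (SQLite for portability):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales  (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount      REAL
);
INSERT INTO dim_date    VALUES (20240101, 2024, 1), (20240201, 2024, 2);
INSERT INTO dim_product VALUES (1, 'CPG'), (2, 'Retail');
INSERT INTO fact_sales  VALUES (20240101, 1, 100.0), (20240201, 1, 50.0), (20240101, 2, 75.0);
""")

# Typical OLAP query: slice the fact table by a dimension attribute.
total = conn.execute("""
    SELECT SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    WHERE p.category = 'CPG'
""").fetchone()[0]
print(total)  # 150.0
```

A Snowflake schema would further normalize the dimensions (e.g. product into product and category tables); Data Vault instead splits the model into hubs, links, and satellites for auditability.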

    Data Engineer - Data Modeling (Bangalore / Chennai / Hyderabad)

    560001 Bangalore, Karnataka 2coms

    Posted 550 days ago


    Job Description

    Permanent
About Client

Our Client is a global leader in AI and Analytics, helping Fortune 1000 companies solve their toughest challenges. They have a team of 4,000+ technologists and consultants based in the US, Canada, the UK, India, Singapore, and Australia, working closely with clients across CPG, Retail, Insurance, BFS, Manufacturing, Life Sciences, and Healthcare. Our client is Great Place to Work-Certified and ranked among the 'Best' and 'Fastest Growing' analytics firms by Inc., Financial Times, Economic Times, and Analytics India Magazine.

Job Responsibility

The Data Modeler will be responsible for the design, development, and maintenance of data models and standards for Enterprise data platforms:
- Understand the business requirements and the corresponding data sets.
- Translate business requirements into working logical and physical Service Layer data models.
- Create and maintain the Logical Data Model (LDM) and Physical Data Model (PDM), applying best practices and providing business insights.
- Create and maintain the Source-to-Target Data Mapping document, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Gather and publish Data Dictionaries: maintain data models, capture data models from existing databases, and record descriptive information.
- Build data warehouses and data marts (on Cloud) while performing data profiling and quality analysis.
- Use version control to maintain versions of data models.
- Design and develop data extraction and integration code modules.
- Work with the Data Engineers to define the ingestion logic, ingestion frequency, and data consumption patterns.

Requirements

Required Experience, Skills & Competencies:
- Strong experience in one or more RDBMS systems (such as Oracle, DB2, SQL Server)
- Good understanding of Star schema, Snowflake schema, and Data Vault modelling
- Good experience in OLTP and OLAP systems
- Excellent data analysis skills
- Good experience with one or more Cloud DWs (e.g. Snowflake, Redshift, Synapse)
- Experience on one or more cloud platforms (e.g. AWS, Azure, GCP)
- Understanding of the DevOps process
- Hands-on experience in any data modelling tool
- Good understanding of one or more ETL tools
- Excellent communication skills
- Minimum 3 years of relevant hands-on experience
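The Source-to-Target Data Mapping document described above can itself be represented as data and applied mechanically. A sketch with hypothetical field names (real mappings also carry business rules, allowed values, and glossary terms):

```python
# Each entry: target column -> (source column, transformation rule).
# All names here are hypothetical, for illustration only.
mapping = {
    "customer_nm": ("CUST_NAME", str.strip),
    "open_dt":     ("OPEN_DATE", lambda v: v.replace("/", "-")),
    "balance_amt": ("BAL",       float),
}

def apply_mapping(source_row, mapping):
    # Produce a target row by looking up each source field
    # and applying its transformation rule.
    return {tgt: rule(source_row[src]) for tgt, (src, rule) in mapping.items()}

src = {"CUST_NAME": "  Asha ", "OPEN_DATE": "2024/01/31", "BAL": "99.50"}
tgt = apply_mapping(src, mapping)
print(tgt)  # {'customer_nm': 'Asha', 'open_dt': '2024-01-31', 'balance_amt': 99.5}
```

Keeping the mapping as data rather than hard-coded ETL logic is what lets the same document drive both the ingestion code and the lineage metadata.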