6 Database Management jobs in Delhi
Event And Database Management Internship in Delhi
Posted today
Job Description
Selected Intern's Day-to-day Responsibilities Include
- Calling and inviting potential exhibitors, including C-level executives, decision-makers, and industry leaders.
- Connecting with existing and prospective clients, fostering strong professional relationships, and identifying opportunities for collaboration.
- Ensuring meticulous data entry into spreadsheets and databases, maintaining the highest level of accuracy and organization.
- Gathering essential information from clients while building and nurturing long-term relationships for mutual growth.
- Taking ownership of additional responsibilities as assigned, contributing to the team's overall success.
- Assisting in managing, cleaning, and updating company databases to ensure data accuracy and integrity.
- Performing data entry, validation, and cleansing to maintain high-quality datasets.
About Company: Messe Stuttgart India Pvt. Ltd., a subsidiary of Landesmesse Stuttgart GmbH, is aiming to be the leading force in the Indian trade fair and exhibition industry. With a commitment to facilitating business connections and inspiring growth, we curate and organize world-class trade events that bring together industry leaders, innovators, and stakeholders from various sectors. Through our portfolio of events, we create valuable opportunities for businesses to showcase their products and services, connect with potential partners, and stay at the forefront of their respective industries. With a focus on excellence and a dedication to delivering exceptional experiences, Messe Stuttgart India is your trusted partner for successful trade fairs and exhibitions in India.
Virtual Assistant for Data Entry and Database Management
Posted today
Job Description
We are seeking a detail-oriented and reliable Virtual Assistant with experience in data entry to support our business operations. The role involves organizing, inputting, and updating data in spreadsheets and databases to ensure accuracy and efficiency. This is a remote position, ideal for a proactive freelancer who can work independently and deliver high-quality results.
Key Responsibilities:
• Perform accurate data entry into Google Sheets, Microsoft Excel, or CRM platforms.
• Organize and clean data to maintain consistent and error-free records.
• Verify and cross-check data for accuracy and completeness.
• Update and manage databases with current information (e.g., customer details, product inventories, or business records).
• Handle light administrative tasks, such as scheduling or email organization, as needed.
• Follow provided guidelines to ensure data integrity and confidentiality.
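Responsibilities like cleaning, verifying, and deduplicating records often reduce to small scripted checks. As a hedged illustration only (the field names and validation rules below are hypothetical, not taken from this posting), a short Python routine might validate and deduplicate customer records before they are written back to a sheet or CRM:

```python
import re

def clean_records(rows):
    """Validate and deduplicate customer records.

    Each row is a dict with hypothetical 'name' and 'email' fields.
    Rows with a missing name or malformed email are dropped; duplicates
    (by email, case-insensitive) collapse to the first occurrence.
    """
    email_re = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
    seen, cleaned = set(), []
    for row in rows:
        email = row.get("email", "").strip().lower()
        name = row.get("name", "").strip()
        if not name or not email_re.match(email):
            continue  # fails validation: skip
        if email in seen:
            continue  # duplicate: keep first occurrence only
        seen.add(email)
        cleaned.append({"name": name, "email": email})
    return cleaned

rows = [
    {"name": "Asha Rao", "email": "Asha@example.com"},
    {"name": "Asha Rao", "email": "asha@example.com "},  # duplicate
    {"name": "", "email": "noname@example.com"},          # missing name
    {"name": "Bad Email", "email": "not-an-email"},       # invalid email
]
print(clean_records(rows))  # only the first record survives
```

In practice the same pattern applies whether the source is a Google Sheet export, a CSV, or a CRM API dump: normalize, validate, then deduplicate on a stable key.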
Requirements:
• Proven experience in data entry or virtual assistant roles (please provide examples of previous work).
• Proficiency in Microsoft Excel, Google Sheets, and/or CRM tools (e.g., HubSpot, Zoho, or similar).
• Strong attention to detail and commitment to accuracy.
• Excellent time management and ability to meet deadlines.
Data Integration & Modeling Specialist
Posted today
Job Description
Job Title: Data Integration & Modeling Specialist
Job Type: Contract
Location: Remote
Duration: 6 Months
Job Summary:
We are seeking a highly skilled Data Integration & Modeling Specialist with hands-on experience in developing common metamodels, defining integration specifications, and working with semantic web technologies and various data formats. The ideal candidate will bring deep technical expertise and a collaborative mindset to support enterprise-level data integration and standardization initiatives.
Key Responsibilities:
Develop common metamodels by integrating requirements across diverse systems and organizations.
Define integration specifications, establish data standards, and develop logical and physical data models.
Collaborate with stakeholders to align data architectures with organizational needs and industry best practices.
Implement and govern semantic data solutions using RDF and SPARQL.
Perform data transformations and scripting using TCL, Python, and Java.
Work with multiple data formats including FRL, VRL, HRL, XML, and JSON to support integration and processing pipelines.
Document technical specifications and provide guidance on data standards and modeling best practices.
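Of the formats listed above, a Fixed Record Layout (FRL) is the simplest: each field occupies a fixed character range within each record. As an illustrative sketch (the field names and offsets below are invented, not drawn from any NIEM or project specification), slicing a fixed-width line into named fields and re-emitting it as JSON might look like:

```python
import json

# Hypothetical fixed record layout: name (cols 0-10), city (10-18), year (18-22).
FRL_FIELDS = [("name", 0, 10), ("city", 10, 18), ("year", 18, 22)]

def parse_frl(line):
    """Slice one fixed-width record into a dict of stripped field values."""
    return {name: line[start:end].strip() for name, start, end in FRL_FIELDS}

record = parse_frl("Asha Rao  Delhi   2024")
print(json.dumps(record))  # fixed-width in, JSON out
```

Variable and hierarchical layouts (VRL, HRL) add delimiters and nesting, but the same parse-then-reserialize shape underlies most such integration pipelines.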
Required Qualifications:
3+ years of experience (within the last 8 years) in developing common metamodels, preferably using NIEM standards.
3+ years of experience (within the last 8 years) in:
Defining integration specifications
Developing data models
Governing data standards
2+ years of recent experience with:
Tool Command Language (TCL)
Python
Java
2+ years of experience with:
Resource Description Framework (RDF)
SPARQL Query Language
2+ years of experience working with:
Fixed Record Layout (FRL)
Variable Record Layout (VRL)
Hierarchical Record Layout (HRL)
XML
JSON
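RDF and SPARQL, required above, both operate on subject-predicate-object triples. The sketch below is a deliberately simplified stand-in (plain Python with no rdflib, and the `ex:` URIs are invented) that shows the pattern-matching idea behind a SPARQL basic graph pattern, not a real triple store:

```python
# Toy triple store: each entry is a (subject, predicate, object) triple.
triples = [
    ("ex:Delhi", "ex:locatedIn", "ex:India"),
    ("ex:Messe", "ex:basedIn", "ex:Delhi"),
    ("ex:India", "ex:partOf", "ex:Asia"),
]

def match(pattern):
    """Match one triple pattern; None behaves like a SPARQL variable."""
    s, p, o = pattern
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Roughly analogous to: SELECT ?s WHERE { ?s ex:locatedIn ex:India }
print(match((None, "ex:locatedIn", "ex:India")))
```

Real SPARQL engines additionally join multiple patterns over shared variables, but each pattern is evaluated by exactly this kind of per-position match.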
Senior Data Scientist - Actuarial Modeling
Posted 4 days ago
Job Description
Responsibilities:
- Develop, validate, and implement advanced actuarial and statistical models.
- Apply machine learning and data mining techniques to insurance data.
- Extract, clean, and transform large, complex datasets from multiple sources.
- Perform exploratory data analysis to identify trends and insights.
- Assess model performance, interpret results, and provide actionable recommendations.
- Collaborate with actuarial teams, underwriters, and business stakeholders.
- Communicate complex quantitative findings clearly and concisely to diverse audiences.
- Stay current with industry trends, new technologies, and best practices in data science and actuarial science.
- Contribute to the innovation of new modeling methodologies and approaches.
- Document models and methodologies thoroughly.
Qualifications:
- Master's degree or Ph.D. in Statistics, Data Science, Mathematics, Actuarial Science, or a related quantitative field.
- Minimum of 7 years of experience in data science, actuarial modeling, or a similar quantitative role within the insurance industry.
- Proven expertise in statistical modeling, machine learning algorithms, and predictive analytics.
- Strong proficiency in programming languages such as Python or R, and SQL.
- Experience with big data technologies and cloud platforms (e.g., AWS, Azure, GCP).
- Excellent understanding of actuarial principles and insurance products.
- Strong analytical, problem-solving, and critical thinking skills.
- Exceptional communication and presentation abilities.
- Ability to work independently and manage projects effectively in a remote setting.
- Knowledge of actuarial software (e.g., Prophet, GGY Axis) is a plus.
Data Engineer - Dimension/Fact Modeling
Posted today
Job Description
A data engineer specializing in dimensional modeling designs, builds, and maintains the fact and dimension tables within data warehouses, structuring data so it can be queried efficiently for reporting.
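For context on the fact/dimension split described above, a minimal star schema can be sketched with stdlib sqlite3 (the table and column names here are illustrative, not a prescribed design): a dimension table holds descriptive attributes, a fact table holds measures keyed to it, and reports join the two.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes, one row per product.
cur.execute("CREATE TABLE dim_product "
            "(product_id INTEGER PRIMARY KEY, name TEXT, category TEXT)")
# Fact table: numeric measures, keyed to the dimension by product_id.
cur.execute("CREATE TABLE fact_sales "
            "(sale_id INTEGER PRIMARY KEY, "
            " product_id INTEGER REFERENCES dim_product(product_id), "
            " quantity INTEGER, amount REAL)")

cur.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(1, 1, 2, 20.0), (2, 1, 3, 30.0)])

# A typical report: aggregate the facts, label them via the dimension.
cur.execute("""
    SELECT d.name, SUM(f.quantity), SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.name
""")
print(cur.fetchall())  # [('Widget', 5, 50.0)]
```

The same shape scales to real warehouses: facts grow quickly and stay narrow, while dimensions stay small and carry the attributes reports group by.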