23 Data Analytics jobs in Jaipur
Data Analytics Mentor
Posted today
Job Description
Job Title: Data Analytics Mentor
Job Type: Full-Time
Location: Remote (India)
Experience Required: 6+ Years
Job Description
We are hiring a full-time Data Analytics Mentor to join our team in a remote capacity. This role is focused on delivering high-quality training, mentorship, and career guidance to aspiring data analysts. The ideal candidate will have strong industry experience, excellent communication skills, and a passion for helping learners become job-ready through hands-on support and real-world insights.
Key Responsibilities
- Deliver live, interactive online sessions covering core data analytics topics such as data wrangling, data visualization, statistical analysis, Excel, SQL, and Python.
- Guide learners through practical assignments and capstone projects to strengthen their problem-solving skills and portfolio.
- Provide personalized feedback, career mentorship, and interview preparation assistance.
- Collaborate with the internal team to improve curriculum and ensure training content remains current and relevant to industry needs.
- Evaluate learner performance through project reviews and one-on-one guidance.
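For illustration, a minimal pandas sketch of the kind of hands-on data-wrangling exercise covered in the sessions above; the file name and column names are hypothetical:

```python
import pandas as pd

# Load a small, messy sales extract (hypothetical file and columns).
df = pd.read_csv("sales_raw.csv")

# Basic cleaning: drop exact duplicates, fix types, handle missing values.
df = df.drop_duplicates()
df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
df["revenue"] = pd.to_numeric(df["revenue"], errors="coerce")
df = df.dropna(subset=["order_date", "revenue"])

# Simple aggregation for a report: monthly revenue by region.
monthly = (
    df.assign(month=df["order_date"].dt.to_period("M"))
      .groupby(["month", "region"], as_index=False)["revenue"]
      .sum()
)
print(monthly.head())
```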
Required Qualifications
- A minimum of 6 years of experience in data analytics, business intelligence, or a related role.
- Proficiency in SQL, Excel, Python (Pandas, NumPy), and data visualization tools such as Tableau, Power BI, or Google Data Studio.
- Strong understanding of data analysis techniques, data storytelling, and reporting methods.
- Experience in mentoring, training, or teaching in a professional or academic setting is preferred.
- Excellent communication and interpersonal skills.
- Relevant certifications (e.g., Microsoft Data Analyst Associate, Tableau Desktop Specialist) will be an added advantage.
Additional Information
- This is a remote, full-time employment opportunity.
- Compensation will be based on experience and aligned with industry standards.
- Candidates must be legally authorized to work in India.
Data Analytics Faculty
Posted today
Job Description
Work Location: Mavelikara, Kerala
Job description:
Key Responsibilities:
- Deliver classroom and online sessions on data analytics tools and techniques including Excel, SQL, Python, R, Tableau, Power BI, and data visualization.
- Design and update course content, lesson plans, assessments, and hands-on projects aligned with industry requirements.
- Teach concepts such as data cleaning, statistical analysis, data mining, predictive analytics, machine learning basics, and business intelligence.
- Guide students through capstone projects and case studies.
- Conduct evaluations and provide feedback to improve student performance and engagement.
- Stay current with industry trends, tools, and best practices to keep the curriculum relevant.
- Mentor students on career paths, interview preparation, and project building.
- Preference will be given to candidates from surrounding areas.
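For illustration, a minimal scikit-learn sketch touching the machine learning basics and predictive analytics topics listed above; the data is synthetic and exists only for classroom demonstration:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Synthetic example: predict monthly sales from advertising spend.
rng = np.random.default_rng(42)
ad_spend = rng.uniform(10, 100, size=200).reshape(-1, 1)
sales = 3.5 * ad_spend.ravel() + rng.normal(0, 20, size=200)

X_train, X_test, y_train, y_test = train_test_split(
    ad_spend, sales, test_size=0.25, random_state=42
)

model = LinearRegression()
model.fit(X_train, y_train)
predictions = model.predict(X_test)

print("MAE:", mean_absolute_error(y_test, predictions))
print("Predicted sales at spend=80:", model.predict([[80]])[0])
```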
Qualifications & Requirements:
- Bachelor’s or Master’s degree in Data Science, Statistics, Computer Science, Mathematics, or a related field.
- Minimum 2–3 years of professional experience in data analytics or business intelligence.
- Strong command over analytics tools such as Excel, SQL, Python/R, Tableau, and Power BI.
- Excellent communication and presentation skills.
- Ability to explain complex concepts clearly to learners with different levels of experience.
- Passion for teaching and mentoring.
Job Types: Full-time, Permanent
Benefits:
- Health insurance
- Provident Fund
Language:
- English (Preferred)
Work Location: Mavelikara, Alappuzha
Data Analytics Mentor
Posted 13 days ago
Job Description
Job Title: Data Analytics Mentor
Job Type: Full-Time
Location: Remote (India)
Experience Required: 6+ Years
Job Description
We are hiring a full-time Data Analytics Mentor to join our team in a remote capacity. This role is focused on delivering high-quality training, mentorship, and career guidance to aspiring data analysts. The ideal candidate will have strong industry experience, excellent communication skills, and a passion for helping learners become job-ready through hands-on support and real-world insights.
Key Responsibilities
- Deliver live, interactive online sessions covering core data analytics topics such as data wrangling, data visualization, statistical analysis, Excel, SQL, and Python.
- Guide learners through practical assignments and capstone projects to strengthen their problem-solving skills and portfolio.
- Provide personalized feedback, career mentorship, and interview preparation assistance.
- Collaborate with the internal team to improve curriculum and ensure training content remains current and relevant to industry needs.
- Evaluate learner performance through project reviews and one-on-one guidance.
Required Qualifications
- A minimum of 6 years of experience in data analytics, business intelligence, or a related role.
- Proficiency in SQL, Excel, Python (Pandas, NumPy), and data visualization tools such as Tableau, Power BI, or Google Data Studio.
- Strong understanding of data analysis techniques, data storytelling, and reporting methods.
- Experience in mentoring, training, or teaching in a professional or academic setting is preferred.
- Excellent communication and interpersonal skills.
- Relevant certifications (e.g., Microsoft Data Analyst Associate, Tableau Desktop Specialist) will be an added advantage.
Additional Information
- This is a remote, full-time employment opportunity.
- Compensation will be based on experience and aligned with industry standards.
- Candidates must be legally authorized to work in India.
Data Analytics Teacher
Posted today
Job Description
- **Descriptive Analytics**: Focuses on understanding what happened.
- **Diagnostic Analytics**: Explores why something happened.
- **Predictive Analytics**: Forecasts what might happen in the future.
- **Prescriptive Analytics**: Suggests actions to take based on the data.
Data analytics is widely used across industries, from healthcare to retail, to optimize operations, predict trends, and improve strategies.
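A minimal pandas/NumPy sketch mapping each of the four types to one concrete step; the sales figures are invented for illustration:

```python
import numpy as np
import pandas as pd

# Hypothetical monthly sales figures for two stores.
df = pd.DataFrame({
    "month": pd.period_range("2024-01", periods=6, freq="M").astype(str).tolist() * 2,
    "store": ["A"] * 6 + ["B"] * 6,
    "sales": [100, 110, 120, 115, 130, 140, 90, 85, 80, 82, 78, 75],
})

# Descriptive: what happened? Average sales per store.
print(df.groupby("store")["sales"].mean())

# Diagnostic: why? Compare average month-over-month change by store.
df["change"] = df.groupby("store")["sales"].diff()
print(df.groupby("store")["change"].mean())

# Predictive: what might happen? Fit a simple trend for store A.
a = df[df["store"] == "A"]["sales"].to_numpy()
slope, intercept = np.polyfit(np.arange(len(a)), a, deg=1)
print("Forecast for store A next month:", round(slope * len(a) + intercept, 1))

# Prescriptive: what should we do? A simple rule based on the trend.
if slope > 0:
    print("Recommendation: maintain current strategy for store A.")
else:
    print("Recommendation: investigate declining sales for store A.")
```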
**Job Types**: Part-time, Permanent
Pay: ₹5,000.00 - ₹10,000.00 per month
**Benefits**:
- Flexible schedule
Schedule:
- Day shift
- Morning shift
Work Location: In person
Senior Data Analytics Engineer
Posted today
Job Description
Role Overview:
We are seeking a skilled and motivated Data Analytics Engineer to join our growing data team. This role is ideal for someone with a strong foundation in SQL, experience using Sigma Computing, and a deep understanding of Snowflake. You will work closely with cross-functional teams to design and implement scalable data models and build insightful dashboards that drive strategic business decisions.
Key Responsibilities:
Design scalable and efficient data models that align with business goals and growth needs.
Architect robust data storage and processing pipelines for structured and unstructured data.
Manage, tune, and optimize the Snowflake data warehouse for performance, cost efficiency, and scalability.
Collaborate with business teams to design and deliver interactive dashboards and reports using Sigma Computing.
Ensure high data quality, consistency, and governance across all analytics processes.
Work cross-functionally with engineering, product, and business stakeholders to translate data needs into actionable insights.
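By way of illustration only, a rough sketch of the kind of dimensional rollup this role might run on Snowflake using the snowflake-connector-python package; the account details, warehouse, and table/column names are placeholders, not actual project assets:

```python
import snowflake.connector

# Connect to Snowflake (credentials and warehouse names are placeholders).
conn = snowflake.connector.connect(
    account="your_account",
    user="your_user",
    password="your_password",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="MARTS",
)

# A typical star-schema rollup: fact table joined to a date dimension,
# aggregated for a dashboard consumed downstream (e.g., in Sigma).
query = """
    SELECT d.fiscal_quarter,
           f.region,
           SUM(f.revenue)                AS total_revenue,
           COUNT(DISTINCT f.customer_id) AS active_customers
    FROM fact_orders f
    JOIN dim_date d ON f.order_date_key = d.date_key
    WHERE d.fiscal_year = 2024
    GROUP BY d.fiscal_quarter, f.region
    ORDER BY d.fiscal_quarter, f.region
"""

cur = conn.cursor()
try:
    cur.execute(query)
    result = cur.fetch_pandas_all()  # requires the pandas/pyarrow extras
    print(result.head())
finally:
    cur.close()
    conn.close()
```

In practice, a rollup like this would usually be exposed to Sigma Computing as a curated view or materialized table rather than queried ad hoc.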
Required Skills & Experience
Proven experience in writing complex and optimized SQL queries.
Strong hands-on experience with Snowflake and understanding of its architecture and performance tuning.
Experience building data visualizations and reports using Sigma Computing.
Solid understanding of data warehousing concepts, dimensional modeling, and analytics architecture.
Ability to work with large datasets and make data accessible, usable, and insightful.
Familiarity with data governance, security, and access control best practices.
Excellent problem-solving, communication, and analytical skills.
Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, or a related field.
Must Have Qualifications
Solid experience in writing and optimizing complex SQL queries.
Proficiency with at least one modern BI tool (e.g., Sigma Computing, Tableau, Looker, or Power BI).
Exposure to Python for data transformations or automation.
Familiarity with cloud data platforms such as Snowflake, Databricks, Google Cloud Platform (GCP), or Amazon Redshift.
Strong analytical and critical thinking skills.
Good communication skills and the ability to work cross-functionally.
Data Analytics Intern - Github Repository Analysis
Posted today
Job Description
We're seeking a motivated Data Analytics Intern to join our team and dive deep into GitHub repository data. You'll work with large datasets of software repositories, contributor activity, and development patterns to uncover actionable insights that drive our product and engineering decisions.
This is an excellent opportunity for students or recent graduates interested in data science, software analytics, and open-source ecosystems to gain hands-on experience with real-world data at scale.
What You'll Do
- **Extract and analyze GitHub repository data** using APIs and web scraping techniques
- **Build automated data pipelines** to collect repository metrics, commit histories, and contributor patterns
- **Create compelling visualizations and dashboards** to communicate findings to technical and non-technical stakeholders
- **Conduct statistical analysis** on code quality, development velocity, and project health metrics
- **Research trends** in programming languages, frameworks, and open-source project adoption
- **Collaborate with engineering teams** to identify metrics that matter for software development
- **Present insights and recommendations** to leadership based on your analysis
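As a rough sketch of the extraction work described above, the snippet below pulls basic repository metrics from the public GitHub REST API with requests; the example repository and the GITHUB_TOKEN environment variable are placeholders you would swap for your own targets and credentials:

```python
import os
import requests

# An optional token raises the API rate limit; any public repository works.
TOKEN = os.environ.get("GITHUB_TOKEN", "")
headers = {"Accept": "application/vnd.github+json"}
if TOKEN:
    headers["Authorization"] = f"Bearer {TOKEN}"

owner, repo = "pandas-dev", "pandas"  # example repository

# Repository-level metrics: stars, forks, open issues.
r = requests.get(f"https://api.github.com/repos/{owner}/{repo}",
                 headers=headers, timeout=30)
r.raise_for_status()
meta = r.json()
print(meta["stargazers_count"], meta["forks_count"], meta["open_issues_count"])

# Recent contributor activity: authors of the last 100 commits.
r = requests.get(
    f"https://api.github.com/repos/{owner}/{repo}/commits",
    params={"per_page": 100},
    headers=headers,
    timeout=30,
)
r.raise_for_status()
authors = [c["commit"]["author"]["name"] for c in r.json()]
print(f"{len(set(authors))} distinct authors in the last {len(authors)} commits")
```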
What You'll Learn
- Advanced GitHub API usage and repository mining techniques
- Large-scale data processing and analysis workflows
- Data visualization best practices for technical audiences
- Software development metrics and their business impact
- Experience with cloud-based analytics platforms
- Professional data science project management
Required Qualifications
- Currently pursuing or recently completed a degree in **Data Science, Computer Science, Statistics, Mathematics, or related field**
- **Programming proficiency in Python or R** with experience in data manipulation libraries (pandas, dplyr, etc.)
- **Basic understanding of Git and GitHub** workflows
- **SQL knowledge** for database querying and data extraction
- **Data visualization experience** (matplotlib, seaborn, ggplot2, Tableau, or similar)
- Strong **analytical thinking and problem-solving skills**
- Excellent **written and verbal communication** abilities
- **Self-motivated** with ability to work independently and manage multiple projects
Preferred Qualifications
- Experience with **GitHub API, GraphQL, or other developer APIs**
- Knowledge of **software engineering concepts** (code review processes, CI/CD, testing)
- Familiarity with **cloud platforms** (AWS, GCP, Azure) and big data tools
- Experience with **statistical analysis and hypothesis testing**
- Background in **machine learning or predictive modeling**
- Previous internship or project experience in **data analytics or software development**
- Interest in **open-source software** and development communities
Technical Environment
You'll work with:
- **Languages**: Python, SQL, R
- **Tools**: Jupyter notebooks, Git, GitHub API
Application Requirements
Please submit:
- **Resume** highlighting relevant coursework and projects
- **Cover letter** explaining your interest in data analytics and software development
- **Portfolio or GitHub profile** showcasing data analysis projects (required)
- **Optional**: Link to a project analyzing any public dataset
Pay: ₹5,000.00 - ₹10,000.00 per month
**Education**:
- Bachelor's (preferred)
Senior Data Analytics Engineer – Azure Data Stack | Remote
Posted 2 days ago
Job Description
Title: Senior Data Analytics Engineer – Azure Data Stack | Remote
Location: Remote (Must overlap with US Eastern/Central time zones; 2 PM–11 PM IST shift acceptable)
Experience Level: Senior | 3–5+ years in data engineering
Role Overview:
We’re looking for a Senior Data Analytics Engineer to join our Global Delivery - DADP team, building high-performance, Azure-native data solutions. You’ll work directly with clients to translate complex business needs into scalable data platforms, applying your deep expertise across Azure Data Factory, Databricks, Synapse, and modern data warehousing patterns.
This is a remote consulting role with real client impact, autonomy, and a fast-paced, agile environment.
Senior Full Stack SDE with Data Engineering for Analytics
Posted today
Job Description
Summary
Truckmentum is seeking a Senior Full Stack Software Development Engineer (SDE) with deep data engineering experience to help us build cutting-edge software and data infrastructure for our AI-driven Trucking Science-as-a-Service platform. We’re creating breakthrough data science to transform trucking — and we’re looking for engineers who share our obsession with solving complex, real-world problems with software, data, and intelligent systems.
You’ll be part of a team responsible for the development of dynamic web applications, scalable data pipelines, and high-performance backend services that drive better decision-making across the $4 trillion global trucking industry. This is a hands-on role focused on building solutions by combining Python-based full stack development with scalable, modern data engineering.
About Truckmentum
Just about every sector of the global economy depends on trucking. In the US alone, trucks move 70%+ of all freight by weight (90+% by value) and account for $40 billion in annual spending (globally $4+ trillion per year). Despite this, almost all key decisions in trucking are made manually by people with limited decision support. This results in significant waste and lost opportunities. We view this as a great opportunity.
Truckmentum is a self-funded, seed-stage venture. We are now validating our key data science breakthroughs with customer data and our MVP product launch to confirm product-market fit. We will raise $4-6 million in funding this year to scale our Data Science-as-a-Service platform and bring our vision to market at scale.
Our Vision and Approach to Technology
The back of our business cards reads “Moneyball for Trucking”, which means quantifying hard-to-quantify hidden insights and then using those insights to make much better business decisions. If you don’t want “Moneyball for Trucking” on the back of your business card, then Truckmentum isn’t a good fit.
Great technology begins with customer obsession. We are obsessed with trucking companies' needs, opportunities, and processes, and with building our solutions into the rhythm of their businesses. We prioritize rapid development and iteration on large-scale, complex data science problems, backed by actionable, dynamic data visualizations. We believe in an Agile, lean approach to software engineering, backed by a structured CI/CD approach, professional engineering practices, clean architecture, clean code, and testing.
Our technology stack includes AWS Cloud, MySQL, Snowflake, Python, SQLAlchemy, Pandas, Streamlit and AGGrid to accelerate development of web visualization and interfaces.
About the Role
As a Senior Full Stack SDE with Data Engineering for Analytics, you will be responsible for designing and building the software systems, user interfaces, and data infrastructure that power Truckmentum’s analytics, data science, and decision support platform. This is a true full stack role — you’ll work across frontend, backend, and data layers using Python, Streamlit, Snowflake, and modern DevOps practices. You’ll help architect and implement a clean, extensible system that supports complex machine learning models, large-scale data processing, and intuitive business-facing applications.
You will report to the CEO (Will Payson), a transportation science expert with 25 years in trucking, who has delivered $1B+ in annual savings for FedEx and Amazon. You will also work closely with the CMO/Head of Product, Tim Liu, who has 20+ years of experience in building and commercializing customer-focused digital platforms, including in logistics.
Responsibilities and Goals
- Design and build full stack applications using Python, Streamlit, and modern web frameworks to power internal tools, analytics dashboards, and customer-facing products.
- Develop scalable data pipelines to ingest, clean, transform, and serve data from diverse sources into Snowflake and other cloud-native databases.
- Implement low-latency, high-availability backend services to support data science, decision intelligence, and interactive visualizations.
- Integrate front-end components with backend systems and ensure seamless interaction between UI, APIs, and data layers.
- Collaborate with data scientists / ML engineers to deploy models, support experimentation, and enable rapid iteration on analytics use cases.
- Define and evolve our data strategy and architecture, including schemas, governance, versioning, and access patterns across business units and use cases.
- Implement DevOps best practices, including testing, CI/CD automation, and observability, to improve reliability and reduce technical debt.
- Ensure data integrity and privacy through validation, error handling, and secure design.
- Contribute to product planning and roadmaps by working with cross-functional teams to estimate scope, propose solutions, and deliver value iteratively.
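To make the stack concrete, a minimal sketch of a Streamlit page backed by pandas in the spirit of the dashboards described above; the CSV file and column names are invented for illustration and are not part of the actual codebase:

```python
# app.py  (run with: streamlit run app.py)
import pandas as pd
import streamlit as st

st.title("Lane Profitability (illustrative)")

# In production this data would come from Snowflake; a local CSV keeps the sketch self-contained.
@st.cache_data
def load_data() -> pd.DataFrame:
    return pd.read_csv("lane_metrics.csv", parse_dates=["week"])

df = load_data()

# Simple interactive filter on a hypothetical "lane" column.
lanes = sorted(df["lane"].unique())
selected = st.multiselect("Lanes", lanes, default=lanes[:5])
filtered = df[df["lane"].isin(selected)]

# Headline metric and supporting views.
st.metric("Average margin", f"{filtered['margin_pct'].mean():.1f}%")
st.line_chart(filtered.pivot_table(index="week", columns="lane", values="margin_pct"))
st.dataframe(filtered)
```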
Required Qualifications
- 5+ years of professional software development experience, with a proven track record of building enterprise-grade, production-ready software applications for businesses or consumers, working in an integrated development team using Agile and Git / GitHub.
- Required technology experience with the following technologies in a business context:
- Python as primary programming language (5+ years’ experience)
- Pandas, Numpy, SQL
- AWS and/or GCP cloud configuration / deployment
- Git / GitHub
- Snowflake, and/or Redshift or Big Query
- Docker
- Airflow, Prefect or other DAG orchestration technology
- Front end engineering (e.g., HTML/CSS, JavaScript, and component-based frameworks)
- Hands-on experience with modern front-end technologies — HTML/CSS, JavaScript, and component-based frameworks (e.g., Streamlit, React, or similar).
- Experience designing and managing scalable data pipelines, data processing jobs, and ETL/ELT
- Experience in defining Data Architecture and Data Engineering Architecture, including robust pipelines, and building and using cloud services (AWS and/or GCP)
- Experience building and maintaining well-structured APIs and microservices in a cloud environment.
- Working knowledge of, and experience applying, data validation, privacy, and governance
- Comfort working in a fast-paced, startup environment with evolving priorities and an Agile mindset.
- Strong communication and collaboration skills — able to explain technical tradeoffs to both technical and non-technical stakeholders.
Desirable Experience (i.e., great but not required)
- Desired technology experience with the following technologies in a business context:
- Snowflake
- Streamlit
- Folium, Plotly, AG Grid
- Kubernetes
- Javascript, CSS
- Flask, Fast API and SQLAlchemy
- Exposure to machine learning workflows and collaboration with data scientists or MLOps teams.
- Experience building or scaling analytics tools, business intelligence systems, or SaaS data products.
- Familiarity with geospatial data and visualization libraries (e.g., Folium, Plotly, AG Grid).
- Knowledge of CI/CD tools (e.g., GitHub Actions, Docker, Terraform) and modern DevOps practices.
- Contributions to early-stage product development — especially at high-growth startups.
- Passion for transportation and logistics, and for applying technology to operational systems.
Why Join Truckmentum
At Truckmentum, we’re not just building software — we’re rewriting the rules for one of the largest and most essential industries in the world. If you’re excited by real-world impact, data-driven decision making, and being part of a company where you’ll see your work shape the product and the business, this is your kind of team.
Some of the factors that make this a great opportunity include:
- Massive market opportunity: Trucking is a $4T+ global industry with strong customer interest in our solution.
- Real business impact: Our tech has already shown a 5% operating margin gain at pilot customers.
- Builder’s culture: You’ll help define architecture, shape best practices, and influence our direction.
- Tight feedback loop: We work directly with real customers and iterate fast.
- Tech stack you’ll love: Python, Streamlit, Snowflake, Pandas, AWS — clean, modern, focused.
- Mission-driven team: We’re obsessed with bringing "Moneyball for Trucks" to life — combining science, strategy, and empathy to make the complex simple, and the invisible visible.
We value intelligence, curiosity, humility, clean code, measurable impact, clear thinking, hard work and a focus on delivering results. If that sounds like your kind of team, we’d love to meet you.
- PS. If you read this far, we assume you are focused and detail-oriented. If you think this job sounds interesting, please fill in a free personality profile on and email a link to the outcome to to move your application to the top of the pile.