122 Data Analytics jobs in Kochi
Data Analytics Instructor
Posted 1 day ago
Job Description
Why Join Us?
We’re looking for highly skilled, passionate, and motivated individuals who are excited
about shaping the next generation of data professionals. Here’s why you should
consider joining our team:
1. Empower Future Talent – Play a key role in transforming lives by teaching real-world data skills.
2. Growth & Recognition – Fast-track your career in a rapidly growing ed-tech
startup with leadership opportunities.
3. Innovative Work Culture – Thrive in a dynamic, collaborative, and flexible work
environment that values creativity and ownership.
4. Be a Part-Owner of the Company – ESOPs offered based on performance as a loyalty benefit.
5. Make Real Impact – Your contribution directly affects learners’ career success.
What You’ll Do
As a Data Analytics Instructor at SkilloVilla, you will be instrumental in delivering
cutting-edge, industry-relevant training to aspiring data professionals. Your core
responsibilities will include:
1. Training Delivery – Deliver structured, interactive, and engaging training
sessions tailored to varying learner levels – from beginners to advanced.
2. Content Development – Research and create up-to-date, comprehensive, and practical training materials and resources.
3. Project-Based Learning – Guide students through real-world data analytics
projects covering topics such as data analysis, machine learning, statistical
modeling, and data visualization.
4. Mentorship & Support – Provide personalized guidance, resolve student
queries, and support learners through challenges in their educational and career
journey.
5. Interactive Learning – Create a collaborative, problem-solving learning
environment that encourages active participation and critical thinking.
6. Assessment & Feedback – Evaluate student progress through quizzes,
assignments, and hands-on projects while offering constructive feedback.
7. Placement Support – Offer interview preparation, resume reviews, and career
guidance to help learners secure job placements.
8. Training Content Management – Manage training assets including session
presentations, assignments, quizzes, and other learning materials.
What You’ll Need
We’re looking for someone who combines strong technical expertise with a passion for
teaching and mentoring. The ideal candidate should possess:
1. Training Expertise – Proven experience delivering training in Tableau, Excel, SQL,
Python, Statistics, and Power BI.
2. Technical Proficiency – Strong knowledge of data tools and libraries such as Pandas, NumPy, Scikit-learn, and TensorFlow, plus familiarity with Machine Learning, Big Data, and BI tools (a short illustrative sketch follows this list).
3. Analytical Thinking – Ability to interpret large datasets, identify insights, and
perform data-driven decision-making. Skilled in statistical analysis and model
building.
4. Teaching & Mentorship Skills – Passion for education with the ability to explain
complex concepts in a simple and engaging manner.
5. Communication Skills – Ability to deliver training in English, with excellent
verbal and written communication skills to convey complex ideas clearly to
diverse learners.
6. Up-to-Date Knowledge – Awareness of current trends, tools, and best practices
in the field of data analytics.
7. Engaging Delivery Style – Ability to make learning enjoyable and effective
through interactivity and real-world examples.
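As a flavour of the hands-on material an instructor would walk learners through, here is a minimal sketch of exploratory analysis with Pandas. The file name and columns are hypothetical placeholders, not SkilloVilla course assets.

```python
# Minimal sketch: exploratory analysis with Pandas.
# "sales.csv" and its columns are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("sales.csv", parse_dates=["order_date"])

# Basic cleaning: drop duplicate rows and fill missing revenue with 0.
df = df.drop_duplicates()
df["revenue"] = df["revenue"].fillna(0)

# Summarise monthly revenue per region for a quick dashboard-style view.
monthly = (
    df.groupby([df["order_date"].dt.to_period("M"), "region"])["revenue"]
      .sum()
      .reset_index(name="total_revenue")
)
print(monthly.head())
```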
Compensation
• Position Type : Full-Time
• Salary : As per industry standards and experience
• ESOPs : Offered based on performance as part of loyalty benefits
Perks & Benefits
1. Impact-Driven Role – Help students launch high-paying careers in data
analytics.
2. Work with Leadership – Collaborate directly with industry leaders and the
founding team.
3. Career Growth – Opportunity to grow into leadership or curriculum design roles.
4. Learning Environment – Continuous learning through internal training and
knowledge-sharing sessions.
About SkilloVilla
SkilloVilla is on a mission to bridge the gap between education and employment by
empowering graduates with industry-relevant skills. We focus on real-world job
requirements, enabling learners to secure high-paying jobs and fast-track their careers.
Our platform includes upskilling programs, resume support, mock interviews, and
placement opportunities with top-tier companies.
Our Founding Team
• Ronak Agrawal (CEO): IIT Delhi, Ex-Business Head, Cuemath
• Rajat Agrawal (CTO): IIIT Hyderabad, Ex-Tech Lead, Swiggy
• Deepak Kharol (COO): IIT Delhi, Ex-Sr. Growth Manager, Swiggy
Our Culture
1. Innovators at Heart – We constantly explore new ways to solve real problems.
2. Fast Executors – We move quickly and efficiently in everything we do.
3. Ownership Mindset – Every individual owns their outcomes and drives impact.
4. Customer-Centric Approach – Learners are at the core of every decision we
make.
5. Open & Fun Work Environment – We value transparency, collaboration, and a
youthful vibe.
Ready to inspire future data professionals and make a lasting impact? Join
SkilloVilla and be a part of the change!
Data Analytics & Insights Analyst
Posted 1 day ago
Job Description
Data Analytics & Insights Analyst
Astreya offers comprehensive IT support and managed services. These services include Data
Center and Network Management, Digital Workplace Services (like Service Desk, Audio Visual, and
IT Asset Management), as well as Next-Gen Digital Engineering services encompassing Software
Engineering, Data Engineering, and cybersecurity solutions. Astreya's expertise lies in creating
seamless interactions between people and technology to help organizations achieve operational
excellence and growth.
Job Description
We are seeking an experienced Data and Insights Analyst to join our analytics division.
- You will be aligned with our Data Analytics and BI vertical and help us generate insights by leveraging the latest analytics techniques to deliver value to our clients.
- You will apply your expertise to build world-class solutions, solving business problems and addressing technical challenges using Google platforms and technologies.
- You will utilize existing tools, frameworks, standards, and patterns to create the architectural foundations and services necessary for analytics applications that scale from multi-user to enterprise-class.
- You will work as part of the Google Analytics team, which provides analytics, actionable insights, and recommendations to internal and external organizations to optimize ROI and operational efficiency.
Requirements
Experience & Education
- 5+ years of progressive experience in data analytics and business intelligence
- Bachelor's degree required, preferably in Computer Science, Analytics, Statistics, or a related field
- Proven experience in the IT services industry or a managed services provider environment
Technical Expertise
- Extensive experience in Data Science and Advanced Analytics delivery teams
- Strong statistical programming experience in SQL and Python
- Experience working with large data sets and big data platforms such as GCP (BigQuery, Vertex AI), AWS, and Microsoft Azure
- Solid knowledge in at least one of the following –
- Multivariate Statistics, Reliability Models, Markov Models, Stochastic models
- Classification, Regression, Clustering
- Ensemble Modelling (random forest, boosted trees, etc.); a brief illustrative sketch follows this list
- Experience in at least one of these business domains: Supply Chain, Marketing Analytics, Customer Analytics, Digital Marketing, eCommerce
- Extensive experience in client engagement and business development
- Ability to work in a global collaborative team environment
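For candidates unfamiliar with the ensemble modelling requirement above, here is a minimal, illustrative sketch of the kind of work involved. It uses scikit-learn on synthetic data; the dataset and parameters are illustrative assumptions, not part of Astreya's stack.

```python
# Minimal sketch: ensemble classification with a random forest (scikit-learn).
# Synthetic data only; parameters are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Generate a toy dataset standing in for, e.g., service-desk ticket features.
X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Fit a random-forest ensemble and evaluate it on the held-out split.
model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print("Hold-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```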
Industry Knowledge
- Understanding of ITIL and IT service management frameworks
- Experience with service desk metrics and KPIs
- Knowledge of data center and network management analytics
- Familiarity with cybersecurity analytics and reporting
Graduate Trainee - Data Analytics
Posted 9 days ago
Job Description
Your responsibilities will include assisting in the development of data models, performing statistical analysis, creating compelling data visualizations, and contributing to data-driven decision-making processes. You will gain exposure to various analytical tools and techniques, including SQL, Python/R, and business intelligence platforms. This role is a unique chance to develop your skills in data manipulation, interpretation, and reporting.
We are looking for candidates with a strong quantitative background, a curious mind, and a passion for data. A degree in a relevant field such as Statistics, Mathematics, Computer Science, Economics, or Engineering is required. Excellent analytical and problem-solving skills, along with strong attention to detail, are essential. While prior experience is not mandatory, a foundational understanding of data analysis concepts and programming languages will be advantageous.
Key Learning Areas & Responsibilities:
- Assisting in data collection, cleaning, and preprocessing.
- Performing descriptive and inferential statistical analysis.
- Developing and implementing data visualization dashboards (e.g., Tableau, Power BI).
- Learning and applying SQL for data extraction and manipulation (see the sketch after this list).
- Gaining experience with programming languages like Python or R for data analysis.
- Supporting senior analysts in generating reports and presentations.
- Understanding business requirements and translating them into analytical tasks.
- Contributing to team projects and problem-solving discussions.
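To make the SQL and Python learning areas above concrete, here is a minimal, illustrative sketch of pulling data with SQL and summarising it in Python. The database, table, and column names are hypothetical placeholders.

```python
# Minimal sketch: extract with SQL, summarise with Pandas.
# The SQLite file, table, and columns are hypothetical placeholders.
import sqlite3
import pandas as pd

conn = sqlite3.connect("analytics.db")
query = """
    SELECT department, salary
    FROM employees
    WHERE hire_date >= '2023-01-01'
"""
df = pd.read_sql_query(query, conn)
conn.close()

# Descriptive statistics per department, the kind of output a trainee would report.
summary = df.groupby("department")["salary"].agg(["count", "mean", "median"])
print(summary)
```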
Requirements:
- Recent graduate with a Bachelor's or Master's degree in Statistics, Mathematics, Economics, Computer Science, Engineering, or a related quantitative field.
- Strong analytical and logical reasoning abilities.
- Proficiency in Microsoft Excel.
- Basic understanding of programming concepts (Python or R is a plus).
- Excellent communication and teamwork skills.
- Eagerness to learn and a proactive attitude.
- Ability to work collaboratively in a team environment.
Data Science Specialist
Posted today
Job Description
Job Title: N8N Developer / Automation Engineer
Experience: 1–4 years (Immediate Joiners Preferred)
Location: (Your Location) / Remote
Job Summary:
We are looking for a skilled N8N Developer to design, develop, and maintain workflow automations and integrations across multiple systems. The ideal candidate should have strong experience with N8N, API integrations, and Node.js, and be able to create efficient automation solutions that improve operational efficiency.
Key Responsibilities:
- Design, develop, and deploy automated workflows using N8N.
- Integrate multiple systems and APIs (REST, GraphQL, etc.) to automate business processes (a brief illustrative sketch follows this list).
- Build and maintain custom N8N nodes when required using Node.js and JavaScript.
- Monitor, troubleshoot, and optimize existing workflows for performance and reliability.
- Collaborate with cross-functional teams to identify automation opportunities.
- Implement error-handling, logging, and version control for workflows.
- Document workflows, processes, and integration logic for internal teams.
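As a rough illustration of the webhook-driven integrations described above, the sketch below triggers a workflow by POSTing JSON to an N8N webhook URL. The URL and payload are hypothetical placeholders, not a real endpoint.

```python
# Minimal sketch: trigger an N8N workflow via its webhook endpoint.
# The URL and payload are hypothetical placeholders.
import requests

WEBHOOK_URL = "https://automation.example.com/webhook/new-lead"  # hypothetical

payload = {
    "lead_id": 1234,
    "source": "contact_form",
    "email": "user@example.com",
}

# POST the event; the workflow on the N8N side decides what happens next.
response = requests.post(WEBHOOK_URL, json=payload, timeout=10)
response.raise_for_status()
print("Workflow triggered, status:", response.status_code)
```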
Required Skills & Experience:
- Hands-on experience with N8N workflow automation.
- Strong knowledge of Node.js, JavaScript, and RESTful APIs.
- Familiarity with webhooks, OAuth, and JSON data structures.
- Experience working with databases (MongoDB, MySQL, PostgreSQL, etc.).
- Understanding of cloud platforms (AWS, GCP, or similar) is a plus.
- Ability to debug and optimize API calls and workflow logic.
- Excellent problem-solving and analytical skills.
Preferred Skills (Good to Have):
- Experience with Zapier, Make (Integromat), or other automation tools.
- Exposure to Docker and CI/CD pipelines.
- Basic understanding of DevOps and API security.
- Knowledge of Express.js or other backend frameworks.
Qualification:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
Please send me your resume on /
Data Science Intern
Posted 1 day ago
Job Description
NLP Data Science Intern
Did you notice a shortage of food at supermarkets during COVID? Have you heard about the recent issues in the global shipping industry, or perhaps about the shortages of microchips? These problems are called supply chain disruptions. They have been increasing in frequency and severity, and they are threatening our very way of life.
Our vision is to advance society's capacity to withstand shocks and stresses. Kavida.ai believes the only way to ensure security is through supply chain resiliency. We are on a mission to help companies proactively manage supply chain disruption risks using integrated data.
Our Story
In March 2020, over 35 academics, data scientists, students, and software engineering volunteers came together to address the food shortage issues caused by the pandemic (Covid19foodsupply.com). A core team of 9 was formed and spun off into a startup, and the rest is history.
Our investors include one of the world's largest supply chain quality & compliance monitoring companies, a £1.25bn apparel manufacturer, and some very impressive angel investors.
Social Impact:
Social impact is in our DNA. We believe private sector innovation is the only way to address social problems at scale. If we achieve our mission, humanity will always have access to its essential goods for sustenance. No more shortages of food, PPE, medicine, etc.
Our Culture :
Idea Meritocracy:
The best ideas win. We only care about what is right, not who is right. We know arriving at the best answer requires constructive tension. Sometimes it can get heated but it's never personal. Everyone contributes to better ideas knowing they will be heard but also challenged.
Drivers Not Passengers:
We think as owners who drive the bus, not as passengers. We are self-starters and never wait for instructions. We are hungry for autonomy, trust, and responsibility. Everyone is a leader because we know leadership is a trait, not a title. Leaders drive growth and navigate the chaos.
We Figure Out The Answers:
We trust our ability to figure stuff out. We do not need all the information to start answering the question. We can connect the dots and answer difficult questions with logic.
Customer & Mission Obsessed:
Our customers are our heroes and we are obsessed with helping them: understanding their supply chains better, resolving their biggest headaches, and advancing their competitiveness.
Learning and growth
We all take personal responsibility for becoming smarter, wiser, more skilled, and happier. We are obsessed with learning about our industry and improving our own skills, and with our personal growth.
Job Description:
As a member of our Research team, you will be responsible for researching, developing, and coding agents using state-of-the-art LLMs with automated pipelines.
- Write code for the development of our ML engines and microservices pipelines.
- Use, optimize, train, and evaluate state-of-the-art GPT models.
- Research and develop agentic pipelines using LLMs.
- Research and develop RAG-based pipelines using vector DBs (a brief illustrative sketch follows this list).
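For context on the RAG work mentioned above, here is a minimal sketch of the retrieval step: embedding documents and selecting the most relevant passage for a prompt. It uses TF-IDF instead of a vector DB and only prints the prompt; everything here is an illustrative assumption, not Kavida.ai's actual stack.

```python
# Minimal RAG-style retrieval sketch using TF-IDF instead of a vector DB.
# Documents, the query, and the prompt template are illustrative placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Port congestion in Shanghai is delaying electronics shipments by two weeks.",
    "A drought in Taiwan is constraining semiconductor fabrication capacity.",
    "Cotton prices rose 12% after flooding disrupted harvests in Pakistan.",
]
query = "Why are microchips in short supply?"

# Embed documents and the query, then retrieve the closest document by cosine similarity.
vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(documents)
query_vector = vectorizer.transform([query])
best_idx = cosine_similarity(query_vector, doc_vectors).argmax()

# In a real pipeline this prompt would be sent to an LLM; here we just print it.
prompt = f"Answer using this context:\n{documents[best_idx]}\n\nQuestion: {query}"
print(prompt)
```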
Essential Requirements:
- Prompt engineering and agentic LLM frameworks such as LangChain / LlamaIndex
- Good understanding of vectors/tensors and RAG pipelines
- Knowledge of building NLP systems using transfer learning or building custom NLP systems from scratch using TensorFlow or PyTorch.
- In-depth knowledge of DSA, async programming, Python, and containers.
- Knowledge of transformers and NLP techniques is essential, and deployment experience is a significant advantage.
Salary Range: ₹15,000 - ₹25,000
We are offering a full-time internship position to final-year students. The internship will last for an initial period of 6-12 months before converting to a full-time job, depending on suitability for both parties. If the applicant is a student who needs to return to university, they can continue with the program on a part-time basis.