1873 Lead Data Science Positions jobs in Bengaluru
Big Data Lead
Posted today
Big Data Developer
Posted 4 days ago
Job Description
Years of experience: 4 - 7 years
Location: Bangalore, Gurgaon
Job Description:
- Experience working with the Spark framework; good understanding of core concepts, optimizations, and best practices
- Good hands-on experience writing code in PySpark; should understand design principles and OOP
- Good experience writing complex queries to derive business-critical insights
- Hands-on experience with stream data processing
- Understanding of Data Lake vs. Data Warehouse concepts
- Knowledge of machine learning would be an added advantage
- Experience with NoSQL technologies – MongoDB, DynamoDB
- Good understanding of test-driven development
- Flexibility to learn new technologies
Roles & Responsibilities:
- Design and implement solutions for problems arising out of large-scale data processing
- Attend/drive various architectural, design and status calls with multiple stakeholders
- Ensure end-to-end ownership of all assigned tasks, including development, testing, deployment and support
- Design, build & maintain efficient, reusable & reliable code
- Test implementation, troubleshoot & correct problems
- Capable of working both as an individual contributor and as part of a team
- Ensure high quality software development with complete documentation and traceability
- Fulfil organizational responsibilities (sharing knowledge & experience with other teams/ groups)
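The test-driven development expectation above can be made concrete with a small, self-contained sketch (plain Python rather than PySpark so it runs anywhere; the records and the `revenue_by_region` helper are hypothetical illustrations, not part of the posting):

```python
# TDD-style workflow: specify the expected business insight first,
# then implement the transformation until the assertion passes.
from collections import defaultdict

def revenue_by_region(orders):
    """Aggregate order amounts per region, skipping cancelled orders."""
    totals = defaultdict(float)
    for order in orders:
        if order.get("status") != "cancelled":
            totals[order["region"]] += order["amount"]
    return dict(totals)

# The test, written before (or alongside) the implementation.
orders = [
    {"region": "south", "amount": 100.0, "status": "completed"},
    {"region": "south", "amount": 50.0, "status": "cancelled"},
    {"region": "north", "amount": 75.0, "status": "completed"},
]
assert revenue_by_region(orders) == {"south": 100.0, "north": 75.0}
```

The same specify-then-implement loop applies to PySpark code, where the test would assert on a small DataFrame instead of a list of dicts.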
Big Data Specialist
Posted 4 days ago
Job Description
Role Overview
We are seeking a highly skilled Big Data Engineer to join our team. The ideal candidate will have strong experience in building, maintaining, and optimizing large-scale data pipelines and distributed data processing systems. This role involves working closely with cross-functional teams to ensure the reliability, scalability, and performance of data solutions.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines and ETL processes.
- Work with large datasets using Hadoop ecosystem tools (Hive, Spark).
- Build and optimize real-time and batch data processing solutions using Kafka and Spark Streaming.
- Write efficient, high-performance SQL queries to extract, transform, and load data.
- Develop reusable data frameworks and utilities in Python.
- Collaborate with data scientists, analysts, and product teams to deliver reliable data solutions.
- Monitor, troubleshoot, and optimize big data workflows for performance and cost efficiency.
Must-Have Skills
- Strong hands-on experience with Hive and SQL for querying and data transformation.
- Proficiency in Python for data manipulation and automation.
- Expertise in Apache Spark (batch and streaming).
- Experience working with Kafka for streaming data pipelines.
Good-to-Have Skills
- Experience with workflow orchestration tools (e.g., Airflow)
- Knowledge of cloud-based big data platforms (AWS EMR, GCP Dataproc, Azure HDInsight).
- Familiarity with CI/CD pipelines and version control (Git).
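The streaming side of this role centres on windowed aggregation over event streams (Kafka feeding Spark Streaming). The core idea can be sketched in plain Python without a cluster; the timestamps and function name below are illustrative only:

```python
# Tumbling-window event counting: the basic aggregation pattern behind
# Kafka + Spark Streaming pipelines, simulated over an in-memory list.
from collections import Counter

def tumbling_window_counts(event_timestamps, window_seconds):
    """Count events per fixed, non-overlapping time window."""
    counts = Counter()
    for ts in event_timestamps:
        # Align each event to the start of its window.
        window_start = (ts // window_seconds) * window_seconds
        counts[window_start] += 1
    return dict(counts)

# Events at timestamps (in seconds), bucketed into 10-second windows.
events = [1, 3, 9, 11, 14, 25]
assert tumbling_window_counts(events, 10) == {0: 3, 10: 2, 20: 1}
```

In a real pipeline, Spark Structured Streaming performs this same grouping with a window expression over an event-time column, with watermarking to bound late data.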
Big Data Developer
Posted 16 days ago
Job Description
Experience: 4 - 7 years
Location: Bangalore
Job Description:
- Strong experience working with the Apache Spark framework, including a solid grasp of core concepts, performance optimizations, and industry best practices
- Proficient in PySpark with hands-on coding experience; familiarity with unit testing, object-oriented programming (OOP) principles, and software design patterns
- Experience with code deployment and associated processes
- Proven ability to write complex SQL queries to extract business-critical insights
- Hands-on experience in streaming data processing
- Familiarity with machine learning concepts is an added advantage
- Experience with NoSQL databases
- Good understanding of Test-Driven Development (TDD) methodologies
- Demonstrated flexibility and eagerness to learn new technologies
Roles & Responsibilities
- Design and implement solutions for problems arising out of large-scale data processing
- Attend/drive various architectural, design and status calls with multiple stakeholders
- Ensure end-to-end ownership of all assigned tasks, including development, testing, deployment and support
- Design, build & maintain efficient, reusable & reliable code
- Test implementation, troubleshoot & correct problems
- Capable of working both as an individual contributor and as part of a team
- Ensure high quality software development with complete documentation and traceability
- Fulfil organizational responsibilities (sharing knowledge & experience with other teams/ groups)
Big Data Developer
Posted 17 days ago
Job Description
Job Description:
- Experience working with the Spark framework; good understanding of core concepts, optimizations, and best practices
- Good hands-on experience writing code in PySpark; should understand design principles and OOP
- Good experience writing complex queries to derive business-critical insights
- Hands-on experience with stream data processing
- Understanding of Data Lake vs. Data Warehouse concepts
- Knowledge of machine learning would be an added advantage
- Experience with NoSQL technologies – MongoDB, DynamoDB
- Good understanding of test-driven development
- Flexibility to learn new technologies
Roles & Responsibilities:
- Design and implement solutions for problems arising out of large-scale data processing
- Attend/drive various architectural, design and status calls with multiple stakeholders
- Ensure end-to-end ownership of all assigned tasks, including development, testing, deployment and support
- Design, build & maintain efficient, reusable & reliable code
- Test implementations, troubleshoot & correct problems
- Capable of working both as an individual contributor and as part of a team
- Ensure high-quality software development with complete documentation and traceability
- Fulfil organizational responsibilities (sharing knowledge & experience with other teams/groups)
Big Data Architect
Posted 18 days ago
Job Description
Job Title - Big Data Architect
Location - Bangalore/ Pune
Experience: 10 to 16 years
Experienced profile with a strong background in integration data architecture, data modeling, and database design; proficient in SQL and familiar with at least one cloud platform. Good understanding of data integration and management tools (MuleSoft, IBM Sterling Integrator, Talend, Informatica). Knowledge of ETL, data warehousing, and Big Data technologies.
Skills Requirements:
- Strong organizational and communication skills.
- Work with client architects; drive data-architecture-related client workshops, internal meetings, proposals, etc.
- Strong understanding of NiFi architecture and components
- Experience with data formats like JSON, XML, and Avro
- Knowledge of data protocols like HTTP, TCP, and Kafka
- Create a data strategy and vision for the larger team; coach and provide subject-matter training
- Understanding of data governance principles and data quality, including database design, data modeling and cloud architecture
- Familiarity with data governance and security best practices
- Knowledge of containerization and orchestration (Docker and Kubernetes)
Responsibilities:
- High-level designs, data architecture, and data pipelines for Apache NiFi and the AI-NEXT platform
- Ensure database performance, data quality, integrity, and security
- Guide the team through solution implementation
- Partner with internal product architecture, engineering, security, and other teams
- Support the pre-sales team on data solutions
- Optimize and troubleshoot NiFi workflows for performance, scalability, and reliability
- Collaborate with cross-functional teams to integrate NiFi with other systems, including databases, APIs, cloud services, and other backend apps
Big Data Developer
Posted 2 days ago
Job Description
Experience: 4 - 7 years
Location: Bangalore
Job Description:
- Strong experience working with the Apache Spark framework, including a solid grasp of core concepts, performance optimizations, and industry best practices
- Proficient in PySpark with hands-on coding experience; familiarity with unit testing, object-oriented programming (OOP) principles, and software design patterns
- Experience with code deployment and associated processes
- Proven ability to write complex SQL queries to extract business-critical insights
- Hands-on experience in streaming data processing
- Familiarity with machine learning concepts is an added advantage
- Experience with NoSQL databases
- Good understanding of Test-Driven Development (TDD) methodologies
- Demonstrated flexibility and eagerness to learn new technologies
Roles & Responsibilities
- Design and implement solutions for problems arising out of large-scale data processing
- Attend/drive various architectural, design and status calls with multiple stakeholders
- Ensure end-to-end ownership of all assigned tasks, including development, testing, deployment and support
- Design, build & maintain efficient, reusable & reliable code
- Test implementation, troubleshoot & correct problems
- Capable of working both as an individual contributor and as part of a team
- Ensure high quality software development with complete documentation and traceability
- Fulfil organizational responsibilities (sharing knowledge & experience with other teams/ groups)
Big Data Developer
Posted 3 days ago
Job Description
Job Description:
- Experience working with the Spark framework; good understanding of core concepts, optimizations, and best practices
- Good hands-on experience writing code in PySpark; should understand design principles and OOP
- Good experience writing complex queries to derive business-critical insights
- Hands-on experience with stream data processing
- Understanding of Data Lake vs. Data Warehouse concepts
- Knowledge of machine learning would be an added advantage
- Experience with NoSQL technologies – MongoDB, DynamoDB
- Good understanding of test-driven development
- Flexibility to learn new technologies
Roles & Responsibilities:
- Design and implement solutions for problems arising out of large-scale data processing
- Attend/drive various architectural, design and status calls with multiple stakeholders
- Ensure end-to-end ownership of all assigned tasks, including development, testing, deployment and support
- Design, build & maintain efficient, reusable & reliable code
- Test implementations, troubleshoot & correct problems
- Capable of working both as an individual contributor and as part of a team
- Ensure high-quality software development with complete documentation and traceability
- Fulfil organizational responsibilities (sharing knowledge & experience with other teams/groups)
Big Data Architect
Posted 4 days ago
Job Description
Job Title - Big Data Architect
Location - Bangalore/ Pune
Experience: 10 to 16 years
Experienced profile with a strong background in integration data architecture, data modeling, and database design; proficient in SQL and familiar with at least one cloud platform. Good understanding of data integration and management tools (MuleSoft, IBM Sterling Integrator, Talend, Informatica). Knowledge of ETL, data warehousing, and Big Data technologies.
Skills Requirements:
- Strong organizational and communication skills.
- Work with client architects; drive data-architecture-related client workshops, internal meetings, proposals, etc.
- Strong understanding of NiFi architecture and components
- Experience with data formats like JSON, XML, and Avro
- Knowledge of data protocols like HTTP, TCP, and Kafka
- Create a data strategy and vision for the larger team; coach and provide subject-matter training
- Understanding of data governance principles and data quality, including database design, data modeling and cloud architecture
- Familiarity with data governance and security best practices
- Knowledge of containerization and orchestration (Docker and Kubernetes)
Responsibilities:
- High-level designs, data architecture, and data pipelines for Apache NiFi and the AI-NEXT platform
- Ensure database performance, data quality, integrity, and security
- Guide the team through solution implementation
- Partner with internal product architecture, engineering, security, and other teams
- Support the pre-sales team on data solutions
- Optimize and troubleshoot NiFi workflows for performance, scalability, and reliability
- Collaborate with cross-functional teams to integrate NiFi with other systems, including databases, APIs, cloud services, and other backend apps