512 Data Scientists jobs in Noida

Big Data

Noida, Uttar Pradesh Iris Software

Posted today


Job Description

JD:

- Minimum EAP experience: 7 years
- Responsibilities for Big Data Engineer:
- Perform data cleaning, integration, validation, and analysis
- Extensive experience in developing (must), deploying (must), and maintaining (must) the big data ecosystem
- Create and maintain optimal data pipeline architecture
- Assemble large, complex data sets that meet functional / non-functional business requirements
- Create complex data processing jobs in PySpark to load data from an RDBMS (Oracle) and process it according to complex business rules (must); a sketch of this kind of job follows the tech-stack list below
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues (must) and support their data infrastructure needs.
- Qualifications for Big Data Engineer:
- Strong track record of building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Working SQL knowledge and experience working with relational databases
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with structured datasets.
- Strong track record of building processes that support data transformation, data structures, metadata, dependency management, and workload management.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores (optional)
- Experience in supporting and working with cross-functional teams in a dynamic environment.
- Tech Stack
- Experience with big data tools:

- Hadoop ecosystem (must), Hive (must)
- Impala (must)
- Sqoop (must)
- Python (must)
- Spark with Python (must)
- PySpark (must), including experience in Spark tuning
- Spark SQL (must)
- Any one RDBMS (good to have)
- Any cloud platform (AWS S3 or Azure)
- Rich experience in Python as a scripting language (must)
- Rich experience in building optimal PySpark jobs that handle huge data loads (must)
- Rich experience in capacity management and gauging memory needs for large-scale data processing (must)
- Experience with stream-processing systems: Storm, Spark Streaming, etc. (good to have)
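
As a rough illustration of the PySpark-on-Oracle work listed above, the sketch below reads a table over JDBC, applies one example business rule, and writes partitioned output. It is only a sketch: the connection URL, table, columns, rule, and output path are hypothetical, and the tuning settings (fetch size, partitioned reads, shuffle partitions) are generic starting points rather than a prescribed configuration.

```python
# Minimal sketch only: JDBC URL, credentials, schema/table, columns, and the business
# rule are hypothetical placeholders, not this employer's actual pipeline.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("oracle_orders_load")                      # hypothetical job name
    .config("spark.sql.shuffle.partitions", "200")      # tune to data volume
    .getOrCreate()
)

# Parallel JDBC read from Oracle (the Oracle JDBC driver jar must be on the classpath).
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//db-host:1521/ORCLPDB1")  # hypothetical
    .option("dbtable", "SALES.ORDERS")                           # hypothetical
    .option("user", "etl_user")
    .option("password", "change-me")                             # placeholder
    .option("fetchsize", "10000")            # larger fetch size speeds up bulk reads
    .option("numPartitions", "8")            # parallel reads across ORDER_ID ranges
    .option("partitionColumn", "ORDER_ID")
    .option("lowerBound", "1")
    .option("upperBound", "10000000")
    .load()
)

# Example "business rule": keep completed orders and derive a revenue band.
curated = (
    orders
    .filter(F.col("STATUS") == "COMPLETED")
    .withColumn(
        "REVENUE_BAND",
        F.when(F.col("AMOUNT") >= 10000, "HIGH").otherwise("STANDARD"),
    )
)

# Write partitioned Parquet for downstream jobs; repartition controls output file sizes.
(
    curated.repartition("REVENUE_BAND")
    .write.mode("overwrite")
    .partitionBy("REVENUE_BAND")
    .parquet("s3a://curated-bucket/orders/")             # hypothetical output path
)
```

For large Oracle extracts, the partitioned-read options (partitionColumn, lowerBound, upperBound, numPartitions) are usually the first tuning lever, since a single JDBC connection becomes the bottleneck long before Spark's executors do.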

**Job Summary**:
**Role Based Competencies**:

- Tech - Requirement Management S/W
- Tech - NFR
- Tech - High level design
- Tech - Code quality and coding standards
- Beh - Result Orientation
- Tech - Code Reviews
- Tech - Build Management
- Tech - Unit Testing
- Beh - Information Seeking
- Tech - Agile Methodology
- Tech - Analytical Problem Solving
- Beh - Communication
- Beh - Customer Orientation
- Beh - Collaboration

**Mandatory Competencies**:

- Big Data - Hadoop
- Big Data - Hive
- Big Data - Impala
- Big Data - PySpark

**Good to Have Competencies**:

Big Data Developer

Delhi, Delhi Kresta Softech Private Limited

Posted 4 days ago


Job Description

Role Highlights:
Position: Big Data Engineer
Experience: 4+ years
Location: All India (Remote) / Hyderabad (Hybrid)
Notice Period: Immediate or 7-day joiners only

Job Overview:

Must-have skills: Big Data, Scala, AWS, and Python or Java

Big Data Developer

Delhi, Delhi Affine

Posted 4 days ago


Job Description

Experience: 5 to 9 years

Must have Skills:
Kotlin/Scala/Java
Spark
SQL
Spark Streaming
Any cloud (AWS preferable)
Kafka/Kinesis/any streaming service
Object-Oriented Programming
Hive, ETL/ELT design experience
CI/CD experience (ETL pipeline deployment)
Data Modeling experience

Good to Have Skills:
Git/similar version control tool
Knowledge of CI/CD and microservices

Role Objective:

The Big Data Engineer will be responsible for expanding and optimizing our data and database architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.

Roles & Responsibilities:
Sound knowledge of Spark architecture, distributed computing, and Spark Streaming (see the streaming sketch after this list).
Proficient in Spark, including RDD and DataFrame core functions, troubleshooting, and performance tuning.
Good understanding of object-oriented concepts and hands-on experience with Kotlin/Scala/Java, with excellent programming logic and technique.
Good grasp of functional programming and OOP concepts in Kotlin/Scala/Java.
Good experience in SQL.
Managing a team of Associates and Senior Associates and ensuring utilization is maintained across the project.
Able to mentor new members during onboarding to the project.
Understand client requirements and be able to design, develop from scratch, and deliver.
AWS cloud experience is preferable.
Experience in analyzing, re-architecting, and re-platforming on-premises data warehouses to cloud data platforms (AWS preferred).
Leading client calls to flag delays, blockers, and escalations, and to collate requirements.
Managing project timelines and client expectations, and meeting deadlines.
Should have played project and team management roles.
Facilitate meetings within the team on a regular basis.
Understand business requirements, analyze different approaches, and plan deliverables and milestones for the project.
Optimization, maintenance, and support of pipelines.
Strong analytical and logical skills.
Ability to comfortably tackle new challenges and learn.
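
The streaming sketch referenced in the list above: a minimal PySpark Structured Streaming job that consumes a Kafka topic, parses JSON events, and maintains a watermarked per-minute aggregate. Broker, topic, schema, and checkpoint path are hypothetical, and the same shape applies in Scala, Kotlin, or Java through the corresponding Spark APIs.

```python
# Minimal sketch only: broker address, topic, event schema, and checkpoint path are
# hypothetical, and the console sink stands in for a real table or message sink.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("events_stream").getOrCreate()

event_schema = StructType([
    StructField("user_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read a Kafka topic as a streaming DataFrame (needs the spark-sql-kafka package).
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")   # hypothetical broker
    .option("subscribe", "events")                       # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka values arrive as bytes; parse the JSON payload into typed columns.
events = (
    raw.select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Per-minute totals per user, with a watermark so streaming state stays bounded.
per_minute = (
    events
    .withWatermark("event_time", "10 minutes")
    .groupBy(F.window("event_time", "1 minute"), "user_id")
    .agg(F.sum("amount").alias("total_amount"))
)

query = (
    per_minute.writeStream
    .outputMode("update")
    .format("console")                                        # replace with a real sink
    .option("checkpointLocation", "/tmp/checkpoints/events")  # hypothetical path
    .start()
)
query.awaitTermination()
```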



Big Data Engineer

Noida, Uttar Pradesh Training Basket

Posted today


Job Description

We are looking for passionate B.Tech freshers with strong programming skills in Java who are eager to start their careers in Big Data technologies. The role offers exciting opportunities to work on real-time big data projects, data pipelines, and cloud-based data solutions.


Requirements
  • Assist in designing, developing, and maintaining big data solutions.
  • Write efficient code in Java and integrate it with big data frameworks.
  • Support the building of data ingestion, transformation, and processing pipelines (see the sketch after this list).
  • Work with distributed systems and learn technologies like Hadoop, Spark, Kafka, Hive, and HBase.
  • Collaborate with senior engineers on data-related problem-solving and performance optimization.
  • Participate in debugging, testing, and documentation of big data workflows.
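
A minimal sketch of the ingest-transform-load pipeline shape mentioned above. It is written in PySpark purely for brevity; this role's primary language is Java, and the same structure applies with Spark's Java API. File paths and column names are hypothetical.

```python
# Beginner-level ingest -> transform -> load sketch; paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_daily_ingest").getOrCreate()

# Ingest: read raw CSV files dropped by an upstream system.
raw = spark.read.option("header", "true").csv("hdfs:///raw/orders/2025-01-01/")

# Transform: cast types, drop malformed rows, derive a partition column.
orders = (
    raw
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount").isNotNull())
    .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write cleaned data as Parquet, partitioned by date, for downstream queries.
orders.write.mode("append").partitionBy("order_date").parquet("hdfs:///curated/orders/")
```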

Required Skills:
  • Strong knowledge of Core Java and OOP concepts.
  • Good understanding of SQL and database concepts.
  • Familiarity with data structures and algorithms.
  • Basic knowledge of Big Data frameworks (Hadoop/Spark/Kafka) is an added advantage.
  • Problem-solving skills and eagerness to learn new technologies.

Eligibility Criteria:
  • Education: B.Tech (CSE/IT or related fields).
  • Batch: (specific, e.g., 2024/2025 pass-outs).
  • Experience: Fresher (0–1 year).



Benefits
  • Training and mentoring in cutting-edge Big Data tools and technologies.
  • Exposure to live projects from day one.
  • A fast-paced, learning-oriented work culture.




Big Data Engineer

Noida, Uttar Pradesh Kiash Solutions LLP

Posted today


Job Description

Basic Qualifications:

Bachelor's degree or higher in Computer Science or an equivalent degree, and 3–10 years of related work experience.

In-depth experience with a big data cloud platform, preferably Azure.

Strong grasp of programming languages (Python, PySpark, or equivalent) and a willingness to learn new ones.

Experience writing database-heavy services or APIs.

Experience building and optimizing data pipelines, architectures, and data sets.

Working knowledge of queueing, stream processing, and highly scalable data stores.

Experience working with and supporting cross-functional teams.

Strong understanding of structuring code for testability.

Preferred Qualifications:

Professional experience implementing and maintaining MLOps pipelines in MLflow or AzureML (a minimal tracking sketch follows this list).

Professional experience implementing data ingestion pipelines using Data Factory.

Professional experience with Databricks and coding with notebooks.

Professional experience processing and manipulating data using SQL and Python code.

Professional experience with user training, customer support, and coordination with cross-functional teams.
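
A minimal MLflow tracking sketch for the MLOps item above, assuming a hypothetical tracking server, experiment name, and toy scikit-learn model. A real AzureML or Databricks setup would differ in configuration but follows the same log-params/log-metrics/log-model pattern.

```python
# Minimal MLflow tracking sketch; tracking URI, experiment name, and model are hypothetical.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

mlflow.set_tracking_uri("http://mlflow.internal:5000")   # hypothetical tracking server
mlflow.set_experiment("churn-model")                      # hypothetical experiment name

# Toy dataset standing in for real feature tables.
X, y = make_classification(n_samples=1_000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="baseline"):
    params = {"C": 1.0, "max_iter": 200}
    model = LogisticRegression(**params).fit(X_train, y_train)

    mlflow.log_params(params)                             # record hyperparameters
    accuracy = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_metric("accuracy", accuracy)               # record evaluation metric
    mlflow.sklearn.log_model(model, "model")              # persist the model artifact
```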



Big Data Engineer

Noida, Uttar Pradesh Confidential

Posted today


Job Description

Description

We are seeking a skilled Big Data Engineer to join our team in India. The ideal candidate will have a strong background in designing and implementing data architectures and pipelines, and will be responsible for ensuring the efficient processing and storage of large datasets.

Responsibilities
  • Design and implement scalable data pipelines to support data ingestion, processing, and storage.
  • Collaborate with data scientists and analysts to understand data requirements and provide necessary data solutions.
  • Optimize and maintain existing data architectures to ensure high performance and reliability.
  • Develop and maintain documentation for data engineering processes and data flow diagrams.
  • Monitor and troubleshoot data pipeline issues to ensure data integrity and availability.
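
One way to approach the monitoring responsibility above: a short PySpark sketch that runs basic integrity checks (row-count floor, null keys, duplicate keys) and fails the job loudly when any check breaks. The table path, key column, and thresholds are hypothetical.

```python
# Minimal data-integrity check sketch; table path, key column, and thresholds are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pipeline_quality_checks").getOrCreate()

df = spark.read.parquet("s3a://warehouse/curated/customers/")   # hypothetical curated table

checks = {
    # A silent upstream failure often shows up first as a row-count collapse.
    "row_count_above_minimum": df.count() >= 100_000,
    # The business key should never be null.
    "no_null_keys": df.filter(F.col("customer_id").isNull()).count() == 0,
    # Duplicate keys usually point at a bad join or double ingestion.
    "no_duplicate_keys": df.groupBy("customer_id").count().filter("count > 1").count() == 0,
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # In a real pipeline this would raise an alert; here the job just fails loudly.
    raise RuntimeError(f"Data quality checks failed: {failed}")
print("All data quality checks passed.")
```
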
Skills and Qualifications
  • 5-10 years of experience in Big Data technologies such as Hadoop, Spark, and Kafka.
  • Proficient in programming languages such as Java, Python, or Scala.
  • Experience with data modeling and database design, including both SQL and NoSQL databases.
  • Strong understanding of ETL processes and tools.
  • Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.
  • Knowledge of data warehousing solutions and architectures.
  • Experience with containerization technologies like Docker and orchestration tools like Kubernetes.

Education
Bachelor Of Computer Application (B.C.A), Bachelor Of Technology (B.Tech/B.E), Master in Computer Application (M.C.A), Post Graduate Diploma in Computer Applications (PGDCA), Masters in Technology (M.Tech/M.E)
Skills Required
Hadoop, Spark, Kafka, SQL, NoSQL, Python, Data Warehousing, ETL, Data Modeling, Cloud Services

 
