5,513 Big Data Solutions jobs in India
Big Data Solutions Specialist
Posted today
Job Description
Job Title: Data Engineer
Key Responsibilities:
- Design and build scalable big data pipelines using Scala, PySpark, Spark SQL, and Spark Streaming on Databricks.
- Develop and maintain real-time data processing solutions using Kafka Streams and similar event-driven platforms.
- Implement cloud-based solutions on Azure leveraging services such as Azure Data Factory (ADF), Azure Functions, and Azure Cosmos DB.
- Build microservices with Core Java 8+, Spring Boot, and Docker.
- Collaborate on system design including API development, event-driven architecture, and front-end development with JavaScript and React.
- Ensure application reliability through monitoring tools like Grafana and New Relic.
- Utilize modern CI/CD tools like Git, Jenkins, Kubernetes, and Argo CD for deployment and version control.
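The pipeline work described above centers on keyed, windowed aggregation of event streams. As a rough, pure-Python sketch of that pattern (the event names and window size are illustrative assumptions, not part of this role's actual stack):

```python
from collections import defaultdict

def windowed_counts(events, window_secs=60):
    """Group (timestamp, key) events into fixed windows and count per key,
    mirroring the keyed, windowed aggregation a streaming job performs."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs
        counts[(window_start, key)] += 1
    return dict(counts)

# Events at 5s, 30s, 70s, 90s fall into the [0, 60) and [60, 120) windows.
events = [(5, "click"), (30, "click"), (70, "view"), (90, "click")]
result = windowed_counts(events)
```

A Spark Structured Streaming or Kafka Streams job expresses the same grouping declaratively and additionally handles late data, state stores, and scale; the sketch only shows the core computation.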
Requirements:
- 5+ years of professional experience as a software or data engineer.
- Strong programming skills in Scala, Python, and Java.
- Experience with Databricks, Spark SQL, Spark Streaming, and PySpark.
- Hands-on experience with Azure cloud services and data engineering tools.
- Solid knowledge of microservices development, Spring Boot, and Core Java 8+.
- Familiarity with event-driven platforms, Kafka, and CI/CD pipelines.
- Strong problem-solving and communication skills.
- Bachelor's or Master's degree in Computer Science or related field preferred.
This role offers a unique opportunity to work on cutting-edge big data projects and collaborate with experienced professionals in the field.
As a data engineer, you will have the chance to develop your skills in a fast-paced environment and contribute to the growth and success of our organization.
We offer competitive compensation packages, opportunities for career advancement, and a dynamic work environment that values innovation and teamwork.
Join our team and be part of a collaborative and supportive community that is passionate about delivering high-quality solutions.
Big Data Solutions Architect
Posted today
Job Description
We are seeking highly skilled and motivated professionals to support data engineering initiatives in the pharmaceutical domain. The ideal candidate will have hands-on expertise in cloud-based big data platforms, SQL, and Python programming, and a strong understanding of pharma/life sciences data.
Key Responsibilities:
- Create and optimize scalable data pipelines
- Transform complex datasets for analytics and business intelligence purposes
Requirements:
- A minimum of 3–5 years of experience in data engineering roles
- Strong hands-on experience with cloud-based big data platforms for data processing and pipeline development
- Proficiency in SQL for querying, transforming, and troubleshooting data
- Solid programming skills in Python for data manipulation and automation
- Proven experience working with pharmaceutical or life sciences data, including familiarity with industry-specific data structures and compliance considerations
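The transformation and compliance duties above can be sketched in plain Python; the record fields (`patient_id`, `drug`, `dose_mg`) and the pseudonymization step are hypothetical illustrations of the kind of cleansing a pharma pipeline performs, not this employer's actual schema:

```python
import hashlib

def transform_trial_records(records):
    """Normalize raw clinical-trial rows for analytics: drop incomplete
    rows, standardize the drug name, cast the dose to a number, and
    pseudonymize the patient ID (a common compliance step)."""
    out = []
    for r in records:
        if not r.get("patient_id") or not r.get("drug"):
            continue  # reject rows missing required fields
        out.append({
            # One-way hash so the raw identifier never reaches analytics.
            "patient_key": hashlib.sha256(r["patient_id"].encode()).hexdigest()[:12],
            "drug": r["drug"].strip().lower(),
            "dose_mg": float(r.get("dose_mg", 0)),
        })
    return out
```

In production the same logic would typically run as a PySpark or SQL transformation inside the cloud platform's pipeline tooling rather than row-by-row in a loop.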
Big Data Solutions Architect
Posted today
Job Description
Azure Data Engineer Role
We are seeking an experienced Azure Data Engineer to join our team. The ideal candidate will have a strong background in designing and developing data pipelines, ETL workflows, and scalable data models.
- Design and develop efficient data pipelines and ETL workflows using Databricks and Azure Data Factory.
- Build scalable data models and Delta Lake architectures for analytics and AI applications.
- Develop and optimize Python-based data processing scripts (PySpark, pandas, APIs).
- Manage and tune SQL and Postgres databases for schema design, indexing, and query performance.
- Integrate Vector Databases (pgvector, Qdrant, Pinecone) for advanced AI search solutions.
- Collaborate on Generative AI applications leveraging Azure OpenAI or similar platforms.
- Implement CI/CD pipelines and Git-based workflows for continuous data deployment and version control.
Required Skills:
- Python programming skills (PySpark, pandas, REST APIs).
- Databricks expertise (workflows, notebooks, Delta Lake).
- Advanced SQL knowledge for data modeling and optimization.
- Azure experience: Data Factory, Data Lake Gen2, Synapse Analytics.
- Postgres proficiency (schema design, indexing, tuning).
- Vector Databases and Generative AI concepts.
- CI/CD pipelines and Git-based version control familiarity.
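The vector-database skills listed above boil down to similarity search over embeddings. A minimal pure-Python sketch of the core operation (the toy two-dimensional vectors are illustrative; real embeddings have hundreds of dimensions):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_k(query, corpus, k=2):
    """Rank stored embeddings by similarity to the query - the operation
    a vector database indexes and accelerates."""
    scored = sorted(corpus.items(),
                    key=lambda kv: cosine_similarity(query, kv[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]
```

Systems like pgvector, Qdrant, or Pinecone perform this ranking with approximate-nearest-neighbor indexes instead of the exhaustive scan shown here, which is what makes retrieval-augmented Generative AI applications fast at scale.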
Big Data Solutions Architect
Posted today
Job Description
We are seeking a highly skilled Big Data Solutions Architect to join our team. The ideal candidate will have expertise in designing, building, and optimizing big data pipelines using various technologies.
Big Data Solutions Architect
Posted today
Job Description
Big Data Solutions Architect
In this role, you will design and implement scalable big data pipelines using cutting-edge technologies.
- Data Engineering: Design, build and optimize big data pipelines using Scala, PySpark, Spark SQL, Spark Streaming and Databricks.
- Backend Development: Build scalable microservices with Core Java (8+) and Spring Boot.
- Cross-functional collaboration: Work cross-functionally with data engineers, software developers and architects to deliver high-quality solutions.
Technical Expertise:
- Strong programming skills in Scala, Python and Java.
- Experience with Databricks, Spark SQL, Spark Streaming and PySpark.
- Solid knowledge of microservices development with Spring Boot.
Education:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
The ideal candidate will have strong expertise in big data technologies, cloud platforms, microservices and system design, with the ability to build efficient data-driven applications.
Big Data Solutions Architect
Posted today
Job Description
We are looking for a skilled Big Data Engineer to design, build, and optimize big data pipelines using Scala, PySpark, Spark SQL, Spark Streaming, and Databricks.
This role requires hands-on experience across data engineering, backend development, and cloud deployment, along with a strong foundation in modern DevOps and monitoring practices.
- The ideal candidate will have expertise in designing scalable, efficient, data-driven applications.
- They should have experience with real-time data processing solutions using Kafka Streams or similar event-driven platforms.
- Implementing Azure-based solutions leveraging services such as Azure Data Factory (ADF) and Azure Functions.
- Building scalable microservices with Core Java 8+ and Spring Boot.
- Collaborating on system design, including API development and event-driven architecture.
- Familiarity with front-end development (JavaScript, React) as needed.
- Maintaining application reliability through monitoring tools such as Grafana, New Relic, or similar.
- Utilizing modern CI/CD tools (Git, Jenkins, Kubernetes, Argo CD, etc.) for deployment and version control.
Requirements:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field preferred.
- Strong problem-solving and communication skills.
- Experience with API design and event-driven architecture (nice to have).
- Front-end development experience with React and JavaScript (nice to have).
The ideal candidate should have:
- 5+ years of professional experience as a Software/Data Engineer or Full Stack Engineer.
- Strong programming skills in Scala, Python, and Java.
- Hands-on experience with Databricks, Spark SQL, Spark Streaming, and PySpark.
- Experience with Azure cloud services and data engineering tools.
- Solid knowledge of microservices development with Spring Boot.
- Familiarity with event-driven platforms such as Kafka.
- Experience with CI/CD pipelines and containerization/orchestration tools.
Big Data Solutions Architect
Posted today
Job Description
Data Engineering Opportunity
We are seeking an experienced Data Engineer to join our team. As a key member of our data & AI practice, you will design, build, and optimize end-to-end data solutions using Microsoft's data platforms.
Key Responsibilities
- Design and implement scalable data models for real-time and batch processing.
- Develop and optimize data pipelines for efficient data processing.
- Build and maintain high-quality data systems using ADF, Synapse, and Databricks.
Required Skills & Qualifications
- 5–7 years of data engineering experience with a strong focus on Microsoft technologies.
- Expertise in SQL, T-SQL, Spark SQL, Python/Scala, DAX, Power Query (M).
- Proficiency in real-time streaming (Event Hubs/Kafka) and batch processing.
- A proven track record in implementing medallion architecture and enterprise BI solutions.
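The medallion architecture mentioned above stages data through bronze (raw), silver (cleansed), and gold (business-ready) layers. A minimal sketch of the idea with plain Python dictionaries (the sales fields are illustrative; in practice each layer would be a Delta Lake table):

```python
def to_bronze(raw_rows):
    """Bronze: land raw rows as-is, tagged with their layer."""
    return [dict(row, _layer="bronze") for row in raw_rows]

def to_silver(bronze_rows):
    """Silver: cleanse and conform - drop malformed rows, cast types."""
    silver = []
    for row in bronze_rows:
        try:
            silver.append({"region": row["region"].strip().upper(),
                           "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue  # quarantine-worthy row; dropped in this sketch
    return silver

def to_gold(silver_rows):
    """Gold: aggregate into a business-ready table (revenue per region)."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals
```

Each stage only reads the previous layer, which is what makes the pattern auditable: the raw bronze data is never mutated, and the gold layer can always be rebuilt from it.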
Why This Role Matters
You will have the opportunity to work on cutting-edge Microsoft data projects, collaborate with global teams, and contribute to solutions that shape the future of business decision-making.
Big Data Solutions Expert
Posted today
Job Description
Job Title: Big Data Architect
We are seeking a highly skilled Full Stack engineer to build scalable and efficient data-driven applications using big data technologies, cloud platforms, microservices, and system design.
Key Responsibilities:
- Design and optimize big data pipelines using Scala, PySpark, Spark SQL, and Databricks.
- Develop real-time data processing solutions using Kafka Streams or similar event-driven platforms.
- Implement cloud-based solutions on Azure services like Azure Data Factory and Azure Functions.
- Build scalable microservices with Core Java and Spring Boot.
- Collaborate on system design including API development and event-driven architecture.
Qualifications:
- 5+ years of professional experience as a Software/Data Engineer or Full Stack Engineer.
- Strong programming skills in Scala, Python, and Java.
- Experience with Databricks, Spark SQL, and Azure cloud services.
- Solid knowledge of microservices development with Spring Boot.
We offer a challenging opportunity for professionals looking to advance their careers and contribute to the success of our organization. Our team is committed to fostering a collaborative environment that encourages creativity and innovation.
The ideal candidate will have a passion for working with large-scale data systems and a commitment to delivering high-quality results. If you're interested in taking your career to the next level, we encourage you to apply.
Big Data Solutions Expert
Posted today
Job Description
Data Engineer
About the Role:
We are seeking a skilled data engineer to design and optimize scalable data pipelines. As part of our team, you will collaborate with data scientists and business stakeholders to drive data-driven decision-making.
Key Responsibilities:
- Architect and maintain scalable data pipelines to process structured and unstructured data from diverse sources.
- Collaborate with cross-functional teams to gather and analyze requirements, develop database solutions, and support application development efforts.
Required Skills:
- Hands-on experience with modern data platforms such as Snowflake, Redshift, BigQuery, or Databricks.
- Strong knowledge of T-SQL and SQL Server Management Studio (SSMS).
- Experience in writing complex stored procedures and views, and in query performance tuning on large datasets.
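The query-tuning skills above hinge on indexing: a predicate backed by an index can seek directly to matching rows instead of scanning the table. A small sketch using Python's built-in SQLite (the table and index names are made up, and SQLite is only a stand-in for SQL Server here; the indexing principle is the same):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders (customer, total) VALUES (?, ?)",
                 [("acme", 120.0), ("acme", 80.0), ("globex", 40.0)])
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")

# With the index in place, this filter seeks rather than scanning every row.
plan = conn.execute("EXPLAIN QUERY PLAN "
                    "SELECT SUM(total) FROM orders WHERE customer = ?",
                    ("acme",)).fetchall()
total = conn.execute("SELECT SUM(total) FROM orders WHERE customer = ?",
                     ("acme",)).fetchone()[0]
```

In SQL Server the analogous check is the execution plan in SSMS, where an "Index Seek" operator replacing a "Table Scan" signals the same improvement.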
We offer a dynamic work environment, opportunities for professional growth and development, and a competitive compensation package.
Big Data Solutions Specialist
Posted today
Job Description
We are currently looking for a skilled Data Platform Architect to enhance our data infrastructure.
- The ideal candidate will have experience with distributed computing and ETL development using PySpark, as well as advanced SQL skills for writing complex queries.
This role requires strong collaboration and communication skills, with the ability to work closely with stakeholders and technical teams to deliver high-quality solutions.
Key Responsibilities:
- Develop and implement efficient ETL pipelines using PySpark to extract, transform, and load data.
- Collaborate with cross-functional teams to integrate data platforms with other systems and tools.
- Design and implement robust data pipelines and platforms to support business decision-making.
- Collaborate with stakeholders during requirements clarification and sprint planning sessions to ensure alignment with business objectives.