7,324 Databricks Python jobs in India
AWS Glue Databricks Python
Posted today
Job Viewed
Job Description
The candidate must have 4 years of relevant experience in Databricks, AWS, and Python.
Hands-on experience on the AWS Cloud platform, especially S3, Glue, and Lambda.
Experience in Spark scripting.
Senior Databricks Python Developer
Posted today
Job Description
Databricks and Python Engineer
Job Responsibilities
Software Development:
Design, develop, test, and deploy high-performance and scalable data solutions using Python, PySpark, and SQL.
Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications.
Implement efficient and maintainable code using best practices and coding standards.
Databricks Platform:
Work with Databricks platform for big data processing and analytics.
Develop and maintain ETL processes using Databricks notebooks.
Implement and optimize data pipelines for data transformation and integration.
Continuous Learning:
Stay updated on the latest industry trends, tools, and technologies related to Python, SQL, and Databricks.
Share knowledge with the team and contribute to a culture of continuous improvement.
SQL Database Management:
Utilize expertise in SQL to design, optimize, and maintain relational databases.
Write complex SQL queries for data retrieval, manipulation, and analysis.
Mandatory Skills:
- 4 to 8 Years of experience in Databricks and big data frameworks
- Advanced proficiency in AWS, including EC2, S3 and container orchestration (Docker, Kubernetes)
- Proficient in AWS services and data migration
- Experience with Unity Catalog
- Familiarity with batch and real-time processing
- Data engineering with strong skills in Python, PySpark, SQL
Databricks with Python
Posted today
Job Description
Role: Databricks with Python
Experience range: 5-8 years
Location: Hyderabad/ Bangalore
NOTE: Notice period must be up to 45 days
Job description:
Must Have:
- Extensive expertise in designing and implementing data load processes using Azure Data Factory, Azure Databricks, Delta Lake, Azure Data Lake Storage, and Python/PySpark
- Proficient with Databricks & Python
- Senior developers with full database / data warehouse / data mart development capabilities
- Senior SME with SQL Server required.
- Proficiency with standard file formats and markup language technology such as JSON, XML, XSLT.
- Proficiency with Azure Cloud data environments highly desired.
- Excellent verbal communication skills. Good problem-solving skills. Attention to detail.
- Proficiency with data warehousing principles and design
Nice to have:
- Familiarity with OData Services, Kafka, and Azure Data Factory
- Familiarity with Terraform, Unity Catalog
- Familiarity with other database technology such as Mongo, Cosmos, MySQL or Oracle a plus
Stanra - Databricks Developer - Python/SQL
Posted today
Job Description
Job Title: Databricks Developer (Contract)
Contract Duration: 4 months, extendable based on performance
Job Location: Remote
Job Timings: India evening shift (till 11:30 PM IST)
Experience Required: 4+ years
Job Description
We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment.
Key Responsibilities
- Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift.
- Work in multi-cloud environments including AWS, Azure, and GCP.
- Implement workflow orchestration using Airflow or similar frameworks.
- Design, implement, and manage data warehouse solutions, schema evolution, and data versioning.
- Collaborate with cross-functional teams to deliver high-quality data solutions.
Required Skills & Experience
- 4+ years of hands-on experience in Databricks, Python, Spark (PySpark), DBT, and AWS data services.
- Strong experience with SQL and large-scale datasets.
- Hands-on exposure to multi-tenant environments (AWS/Azure/GCP).
- Knowledge of data modeling, data warehouse design, and best practices.
- Good understanding of workflow orchestration tools like Airflow.
python databricks developer
Posted today
Job Description
Role: Python Databricks Developer
Experience Required: 7+ years
Location: Hybrid – Bangalore, Pune, Mumbai, Hyderabad, Noida
Work Hours: 11:00 AM – 8:00 PM (4-hour US overlap)
Skills
- Databricks
- Python
- Spark
- SQL
- AWS (S3, Lambda)
- Airflow
- Healthcare data
- ETL systems (DataStage)
Big Data
Posted today
Job Description
Greetings from Teknikoz
Experience: 5-7 Years
Skills Required: Big Data and Hadoop Ecosystems, Python, PySpark
Specific activities required:
- Lead the implementation of infrastructure via code and provide strategic advice/recommendations for the development and advancement of Microsoft Azure technologies based on previous research on trends in public cloud environments.
- Integrate and automate the delivery of standardised Azure deployments, in conjunction with orchestration products such as Azure DevOps with Terraform, Azure ARM templates and other modern deployment technologies.
- Act as the escalation point for level three Azure-related issues, providing technical support and fault resolution, as well as guidance and mentoring of operational run teams, both locally and remotely throughout the organization.
- Ensure the appropriate gathering of business requirements and their translation into appropriate solutions.
- Maintain and deliver all related documentation for the design, development, build, and deployment methods used, ensuring the source of control of all applicable code is stored and managed properly.
- Provide guidance and assistance to all support teams.
- Provide complementary support and leadership in hardening and security testing.
KEY COMPETENCIES:
- Tertiary qualifications in a relevant discipline, with relevant certifications in Microsoft Azure.
- Worked as a Data Engineer on Azure Cloud.
- Good knowledge of PySpark and Azure Data Factory.
- Comprehensive knowledge of public cloud environments and industry trends.
- Significant experience supporting, designing, and developing public cloud solutions via Infrastructure as Code, including Terraform and ARM.
- Extensive DevOps experience.
- The ability to communicate effectively and work collaboratively with diverse team members.
- Demonstrated experience in security hardening and testing.
- Proven ability in creating and updating accurate documentation.
- Excellent verbal and written communication skills.
- Willingness and flexibility to work outside of standard office hours, and on weekends as required.
Big Data
Posted today
Job Description
Minimum Experience: 10 years
We are looking for a Big Data / Cloud Architect to become part of Atgeir's Advanced Data Analytics team. The desired candidate is a professional with a proven track record of working on Big Data and Cloud Platforms.
Required Skills:
- Work closely with customers, understand customer requirements and render those as architectural models that will operate at large scale and high performance, and advise customers on how to run these architectural models on traditional Data Platforms (Hadoop-based) as well as Modern Data Platforms (Cloud-based).
- Work alongside customers to build data management platforms using Open Source technologies as well as Cloud Native services.
- Extract best-practice knowledge, reference architectures, and patterns from these engagements for sharing with the Advanced Analytics Centre of Excellence (CoE) team at Atgeir.
- Highly technical and analytical, with 10 or more years of ETL and analytics systems development and deployment experience.
- Strong verbal and written communication skills are a must, as well as the ability to work effectively across internal and external organizations and virtual teams.
- Ability to understand complex business requirements and render them as prototype systems with quick turnaround time.
- Implementation and tuning experience in the Big Data ecosystem (such as Hadoop, Spark, Presto, Hive), databases (such as Oracle, MySQL, PostgreSQL, MS SQL Server), and data warehouses (such as Redshift, Teradata, etc.).
- Knowledge of foundation infrastructure requirements such as networking, storage, and hardware optimization, with hands-on experience with one of the clouds (GCP / Azure / AWS) and/or data cloud platforms (Databricks / Snowflake).
- Proven hands-on experience with at least one programming language among Python, Java, Go, and Scala.
- Willingness to work hands-on on the projects.
- Ability to lead and guide large teams.
- Architect-level certification on one of the clouds will be an added advantage.
Big Data
Posted today
Job Description
Why Join Us?
Are you inspired to grow your career at one of India's Top 25 Best Workplaces in the IT industry? Do you want to do the best work of your life at one of the fastest growing IT services companies? Do you aspire to thrive in an award-winning work culture that values your talent and career aspirations? It's happening right here at Iris Software.
About Iris Software
At Iris Software, our vision is to be our clients' most trusted technology partner, and the first choice for the industry's top professionals to realize their full potential. With over 4,300 associates across India, the U.S.A., and Canada, we help our enterprise clients thrive with technology-enabled transformation across financial services, healthcare, transportation & logistics, and professional services.
Our work covers complex, mission-critical applications with the latest technologies, such as high-value complex Application & Product Engineering, Data & Analytics, Cloud, DevOps, Data & MLOps, Quality Engineering, and Business Automation.
Working at Iris
Be valued, be inspired, be your best.
At Iris Software, we invest in and create a culture where colleagues feel valued, can explore their potential, and have opportunities to grow.
Our employee value proposition (EVP) is about "Being Your Best" – as a professional and person. It is about being challenged by work that inspires us, being empowered to excel and grow in your career, and being part of a culture where talent is valued. We're a place where everyone can discover and be their best version.
Job Description
Must have:
- 7+ years of dev experience
- Big data developer with exposure to Java
- Spark/Scala/Hive experience
- Good background in SQL
Nice To Have
- Familiarity/Experience with Cloud
Mandatory Competencies
Big Data - Big Data - HIVE
Big Data - Big Data - SPARK
Beh - Communication and collaboration
Database - Database Programming - SQL
Programming Language - Scala - Scala
Perks And Benefits For Irisians
At Iris Software, we offer world-class benefits designed to support the financial, health and well-being needs of our associates to help achieve harmony between their professional and personal growth. From comprehensive health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, we're committed to providing a supportive and rewarding work environment.
Join us and experience the difference of working at a company that values its employees' success and happiness.
Big Data
Posted today
Job Description
We're Hiring –
Big data
Posted today
Job Description
Role & responsibilities
Responsibilities
A day in the life of an Infoscion
- As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction.
- You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain.
- You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews.
- You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to the organizational guidelines and processes.
- You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you.
Additional Responsibilities:
- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project life cycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies; knowledge of quality processes
- Basics of the business domain to understand the business requirements
- Analytical abilities, strong technical skills, good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
- Awareness of the latest technologies and trends
- Excellent problem-solving, analytical, and debugging skills
Technical and Professional Requirements:
- Primary skills: Technology -> Functional Programming -> Scala - Big Data
Preferred Skills:
Technology -> Functional Programming -> Scala
Preferred candidate profile