4,785 Autonomous Systems jobs in India
Autonomous Systems Engineer
Posted today
Job Description
Job Description: Autonomous Navigation Specialist (Robotics Engineer)
Role Overview
We are seeking an experienced Autonomous Navigation Specialist to design, develop, and optimize advanced navigation systems for autonomous vehicles, drones, and robotic platforms. The ideal candidate will have a proven track record of building robust systems that can operate independently and safely in GPS-denied and complex environments. This role requires deep expertise in algorithm development, multi-sensor fusion, and real-world deployment of navigation stacks.
Key Responsibilities
- Lead the design and implementation of Simultaneous Localization and Mapping (SLAM) pipelines for autonomous platforms.
- Develop and optimize path planning and motion control algorithms to enable safe and efficient navigation.
- Integrate and calibrate diverse sensors, including IMUs, LiDARs, cameras, GNSS, and radar, into a cohesive navigation framework.
- Implement and maintain sensor fusion algorithms (e.g., EKF, UKF, graph-based optimization); a minimal EKF-style sketch follows this list.
- Conduct extensive simulation, testing, and validation using tools like ROS/ROS2, Gazebo, CARLA, or similar platforms.
- Collaborate with hardware, perception, and AI teams to deliver end-to-end autonomous navigation solutions.
- Troubleshoot system performance in field tests and iteratively improve reliability under real-world conditions.
- Mentor junior engineers and contribute to technical roadmaps and research directions.
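The sensor-fusion bullet above names EKF/UKF-style estimators. Below is a small, self-contained sketch of the predict/update cycle for a constant-velocity state fused with a GNSS-style position fix; with these linear models it reduces to a plain Kalman filter, and a full EKF would additionally supply Jacobians of nonlinear motion and measurement models. All dimensions and noise values are illustrative assumptions, not part of this role's actual stack.

```python
# Minimal illustrative predict/update step (NumPy only).
# State: [x, y, vx, vy]; prediction uses a constant-velocity model,
# the update fuses a GNSS-style position fix. Noise values are made up.
import numpy as np

def ekf_predict(x, P, dt, q):
    """Propagate state and covariance with a constant-velocity model."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    Q = q * np.eye(4)                  # process noise (tuned per platform)
    return F @ x, F @ P @ F.T + Q

def ekf_update(x, P, z, r):
    """Fuse a position measurement z = [px, py]."""
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    R = r * np.eye(2)                  # measurement noise
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(4) - K @ H) @ P
    return x_new, P_new

x, P = np.zeros(4), np.eye(4)
x, P = ekf_predict(x, P, dt=0.1, q=0.01)
x, P = ekf_update(x, P, z=np.array([1.0, 2.0]), r=0.5)
print("state estimate:", x)
```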
Required Skills & Experience
- Strong background in robotics, computer vision, or control systems (Master’s or PhD preferred).
- 2+ years of hands-on experience developing and deploying navigation systems for autonomous drones, robots, or vehicles.
- Deep expertise in SLAM (visual, LiDAR, or visual-inertial), path planning, and multi-sensor fusion.
- Proficiency in C++ and Python, with extensive experience in ROS/ROS2.
- Strong understanding of control theory, state estimation, and probabilistic robotics.
- Experience with simulation tools (Gazebo, CARLA, AirSim, or equivalent).
- Familiarity with GPU/embedded platforms (NVIDIA Jetson, FPGA, or similar).
- Demonstrated ability to solve complex problems and deliver production-ready systems.
Preferred Qualifications
- Experience with autonomous drones (UAVs) and ground robots (UGVs).
- Publications or patents in navigation, robotics, or related domains.
- Hands-on experience with field testing of autonomous navigation systems.
- Knowledge of safety standards and regulatory requirements for autonomous systems.
What We Offer
- Opportunity to work on cutting-edge autonomous systems with global impact.
- A highly collaborative environment with hardware and AI experts.
- Growth opportunities in a fast-scaling deep-tech startup.
Autonomous Systems Engineer
Posted today
Job Description
Build the Future of Humanoid Robotics
At gephr Labs, we're not just building robots—we're democratizing humanoid robotics for the entire world. We're looking for a talented Robotics Engineer to join our mission of making humanoid robot development accessible to everyone through our open-source humanoid training platform.
The Opportunity
You'll be at the forefront of humanoid robotics, contributing to gephr's open-source humanoid training platform that's changing how the world trains and deploys humanoid robots. This isn't just another engineering role; it's a chance to shape technology that thousands of developers, researchers, and companies will build upon.
What makes this role special:
- Your code will be public, impactful, and used by the global robotics community
- Collaborate with an open-source community of brilliant minds
- Help define the standards for the next generation of humanoid platforms
What You'll Do
Core Responsibilities:
- Design and implement control systems for humanoid locomotion, manipulation, and balance
- Develop and optimize training pipelines for humanoid behaviours using VLAs, RL and imitation learning
- Build robust simulation environments and sim-to-real transfer workflows
- Contribute to the core gephr platform architecture—kinematics, dynamics, perception, and control
- Create comprehensive documentation, tutorials, and examples for the open-source community
- Collaborate with external contributors and maintain high code quality standards
- Debug and solve complex robotics challenges in both simulation and hardware
- Push the boundaries of what's possible with accessible humanoid robotics
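As a taste of the simulation work described above (building simulation environments and sim-to-real workflows), here is a minimal MuJoCo sketch that loads a toy single-hinge model and steps its dynamics under a constant torque. The XML model and the torque value are purely illustrative assumptions; a real humanoid pipeline would load the platform's own assets and drive the actuators from a learned policy.

```python
# Minimal MuJoCo example: load a toy model and step the physics.
import mujoco

XML = """
<mujoco>
  <worldbody>
    <body name="link" pos="0 0 1">
      <joint name="hinge" type="hinge" axis="0 1 0"/>
      <geom type="capsule" size="0.05" fromto="0 0 0 0 0 -0.5" mass="1"/>
    </body>
  </worldbody>
  <actuator>
    <motor joint="hinge" gear="1"/>
  </actuator>
</mujoco>
"""

model = mujoco.MjModel.from_xml_string(XML)
data = mujoco.MjData(model)

for _ in range(1000):
    data.ctrl[:] = 0.1           # constant torque command (placeholder for a policy)
    mujoco.mj_step(model, data)  # advance the simulation by one timestep

print("joint angle:", data.qpos[0], "joint velocity:", data.qvel[0])
```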
What We're Looking For
Required:
- Bachelor's or Master's in Robotics, Computer Science, Mechatronics or related field
- 2+ years of hands-on experience in robotics (manipulation, locomotion, or mobile robotics)
- Strong programming skills in Python and C++
- Experience with robotics frameworks (ROS/ROS2, MuJoCo, PyBullet, Isaac, or similar)
- Solid understanding of kinematics, dynamics, and control theory
- Comfort working with Linux and command-line tools
- Passion for open source and collaborative development
Bonus Points:
- Experience with VLA models, reinforcement learning and imitation learning
- Familiarity with humanoid robots or legged locomotion
- Active GitHub profile with robotics projects
- Experience with sim-to-real transfer and hardware integration
- Knowledge of computer vision and sensor fusion
- Contributions to open-source robotics projects
- Experience with modern ML frameworks (PyTorch, JAX, TensorFlow)
Why Gephr Labs?
Impact: Your work will be used by researchers, hobbyists, and companies worldwide. Every commit you make helps democratize advanced robotics.
Open Source First: We believe the future of robotics is open. Full transparency, community-driven development, and standing on the shoulders of giants.
Learning Culture: Work with cutting-edge technology daily. We encourage experimentation, rapid iteration, and learning from failures.
Flexibility: We trust you to do great work. Remote-friendly with flexible hours because great engineering doesn't happen on a fixed schedule.
Community: Join a passionate community of robotics enthusiasts who contribute, collaborate, and push each other to build amazing things.
The Stack
- Languages: Python, C++, CUDA
- Frameworks: ROS2, PyTorch, MuJoCo, NVIDIA Isaac
- Tools: Git, Docker, CI/CD pipelines
- Hardware: Real humanoid platforms
How to Apply
Send us:
- Your resume
- Your GitHub profile (we want to see your code!)
- A brief note about:
- A robotics problem you're proud of solving
- Why you're excited about open-source humanoid robotics
- What you'd contribute to the gephr humanoid training platform
Think you're a fit but don't check every box? Apply anyway. We value passion, curiosity, and the drive to learn just as much as experience.
Start contributing to the gephr humanoid training platform and we'll know this is your calling:
Apply at or via LinkedIn.
"The future of robotics won't be built behind closed doors—it'll be built in the open, together."
Lead Autonomous Systems Engineer
Posted today
Job Description
Key Responsibilities:
- Lead the design and development of core autonomous driving software modules, including perception, prediction, planning, and control.
- Oversee the integration of various sensors (LiDAR, radar, cameras, IMU) and their data fusion algorithms.
- Develop and implement advanced algorithms for environment perception and understanding.
- Design and validate control strategies for vehicle maneuverability and safety.
- Collaborate with cross-functional teams, including hardware engineers, software developers, and testing personnel.
- Define system requirements, architecture, and specifications for autonomous driving features.
- Conduct simulations and real-world testing to validate system performance and safety.
- Mentor and guide junior engineers, fostering a culture of technical excellence.
- Stay updated with the latest advancements in autonomous driving technology and industry best practices.
- Contribute to the development of robust safety cases and validation methodologies for autonomous systems.
- Manage project timelines and deliverables effectively, ensuring successful product launches.
Qualifications:
- Master's or Ph.D. in Electrical Engineering, Mechanical Engineering, Computer Science, Robotics, or a related field.
- Significant experience in developing and implementing autonomous driving systems or robotics.
- Proficiency in C++, Python, and relevant robotics frameworks (ROS).
- Strong knowledge of sensor fusion, Kalman filters, particle filters, and other state estimation techniques.
- Experience with machine learning frameworks (e.g., TensorFlow, PyTorch) for perception and prediction tasks.
- Solid understanding of control theory and its application in automotive systems.
- Experience with simulation tools (e.g., CARLA, Gazebo) and data analysis.
- Excellent leadership, communication, and interpersonal skills.
- Proven ability to work effectively in a hybrid team environment.
- Experience with automotive safety standards (e.g., ISO 26262) is a plus.
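Particle filters appear in the state-estimation requirements above; the toy bootstrap filter below localizes a robot on a line from noisy offset measurements to a single known landmark. The landmark position, noise levels, and the fake control/measurement log are illustrative assumptions only.

```python
# Toy bootstrap particle filter for 1-D localization (NumPy only).
# Motion: x += u + noise; measurement: noisy signed offset to a landmark (landmark - x).
import numpy as np

rng = np.random.default_rng(0)
N = 1000
landmark = 10.0
particles = rng.uniform(0.0, 20.0, N)            # initial belief over position
weights = np.full(N, 1.0 / N)

def predict(particles, u, motion_std=0.2):
    return particles + u + rng.normal(0.0, motion_std, particles.size)

def update(particles, weights, z, meas_std=0.5):
    expected = landmark - particles              # predicted measurement per particle
    lik = np.exp(-0.5 * ((z - expected) / meas_std) ** 2)
    weights = weights * lik + 1e-300             # avoid an all-zero weight vector
    return weights / weights.sum()

def resample(particles, weights):
    idx = rng.choice(particles.size, particles.size, p=weights)
    return particles[idx], np.full(particles.size, 1.0 / particles.size)

for u, z in [(1.0, 8.8), (1.0, 7.9), (1.0, 7.1)]:  # fake control/measurement log
    particles = predict(particles, u)
    weights = update(particles, weights, z)
    particles, weights = resample(particles, weights)

print("estimated position:", np.average(particles, weights=weights))
```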
Senior Autonomous Systems Engineer
Posted 10 days ago
Job Description
Key Responsibilities:
- Design, develop, and implement algorithms for autonomous systems, including perception, localization, planning, and control.
- Develop and integrate software modules for autonomous navigation and operation.
- Conduct simulation and real-world testing of autonomous systems.
- Analyze system performance data and identify areas for improvement.
- Collaborate with cross-functional teams to define system requirements and specifications.
- Research and evaluate new technologies and methodologies in autonomous systems.
- Troubleshoot and resolve complex technical issues in real-time.
- Ensure the safety, reliability, and efficiency of autonomous system operations.
- Create and maintain comprehensive technical documentation.
- Mentor junior engineers and contribute to the team's knowledge base.
- Stay current with industry trends and advancements in robotics and AI.
Qualifications:
- Master's or Ph.D. in Robotics, Computer Science, Electrical Engineering, or a related field with a focus on autonomous systems.
- Minimum of 7 years of experience in the development of autonomous systems or robotics.
- Proficiency in programming languages such as C++ and Python.
- Experience with robotics middleware (e.g., ROS).
- Strong understanding of control theory, path planning algorithms, and sensor fusion techniques.
- Experience with machine learning for robotics applications (e.g., object detection, state estimation).
- Familiarity with simulation environments (e.g., Gazebo, Isaac Sim).
- Excellent analytical, problem-solving, and debugging skills.
- Strong communication and collaboration skills, with the ability to work effectively in a fully remote team.
- Experience with software development best practices, including version control (Git) and testing.
- Familiarity with various sensors (LiDAR, cameras, IMUs) and their data processing.
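Path planning is listed among the core skills above. The sketch below implements plain A* on a small 4-connected occupancy grid with a Manhattan heuristic; the grid and start/goal are made-up examples, and a production planner would add costmaps, kinodynamic constraints, and trajectory smoothing on top.

```python
# Minimal A* planner on a 4-connected occupancy grid (illustrative only).
import heapq

def astar(grid, start, goal):
    """grid: list of rows, 0 = free, 1 = occupied; start/goal: (row, col)."""
    rows, cols = len(grid), len(grid[0])
    def h(a, b):                        # Manhattan-distance heuristic
        return abs(a[0] - b[0]) + abs(a[1] - b[1])
    open_set = [(h(start, goal), 0, start, None)]
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, node, parent = heapq.heappop(open_set)
        if node in came_from:           # already expanded with a better cost
            continue
        came_from[node] = parent
        if node == goal:                # reconstruct the path by walking parents
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc), goal), ng, (nr, nc), node))
    return None                         # no path found

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))
```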
Senior Autonomous Systems Engineer
Posted 10 days ago
Job Description
Key Responsibilities:
- Design, develop, and implement robust algorithms for autonomous vehicle systems, including perception, sensor fusion, localization, path planning, and control.
- Develop and maintain simulation environments for testing and validating autonomous driving software.
- Collaborate with cross-functional teams, including hardware engineers, AI researchers, and safety experts.
- Analyze large datasets from vehicle testing to identify areas for improvement and implement corrective actions.
- Write high-quality, well-documented, and testable code in C++ and/or Python.
- Lead technical discussions and provide mentorship to junior engineers.
- Contribute to the definition of system requirements and architecture for autonomous driving features.
- Stay abreast of the latest research and advancements in the field of autonomous systems and AI.
- Perform root cause analysis for system failures and implement effective solutions.
- Work closely with safety teams to ensure all systems meet rigorous safety standards.
- Contribute to the overall strategy and roadmap for autonomous vehicle development.
- Participate in field testing and validation activities.
Qualifications:
- Master's or Ph.D. in Computer Science, Electrical Engineering, Robotics, or a related field with a specialization in autonomous systems or AI.
- Minimum of 7 years of hands-on experience in developing and deploying autonomous driving software.
- In-depth knowledge of key autonomous systems components: perception (e.g., LiDAR, camera, radar), sensor fusion, localization, planning, and control.
- Proficiency in C++ and/or Python, with experience in relevant libraries (e.g., ROS, OpenCV, PCL).
- Strong understanding of machine learning concepts as applied to autonomous systems.
- Experience with simulation tools (e.g., CARLA, Gazebo) and data analysis pipelines.
- Excellent debugging, problem-solving, and analytical skills.
- Proven ability to lead complex technical projects and mentor team members.
- Strong communication and collaboration skills, with the ability to work effectively in a remote, distributed team.
- Familiarity with functional safety standards (e.g., ISO 26262) is a plus.
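OpenCV appears in the library list above; as a small, concrete example of the perception building blocks involved, the snippet below detects ORB features in two consecutive frames and matches them, the kind of data association that feeds visual odometry or loop-closure detection. The file names are placeholders.

```python
# Tiny OpenCV perception snippet: ORB feature detection and matching.
import cv2

# Placeholder frame paths; real pipelines pull frames from a camera driver or bag file.
img1 = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Brute-force Hamming matcher with cross-check for reciprocal best matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
print(f"{len(matches)} matches; best distance {matches[0].distance if matches else None}")
```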
Autonomous Systems Control Engineer
Posted today
Job Description
Job Title: RL Research Engineer (Planning & Control)
Location: Vapi, Gujarat
Employment Type: Full-Time
Overview
We are seeking a highly skilled Reinforcement Learning (RL) Research Engineer specializing in planning and control. The role focuses on designing learning-based planners and policies (RL, imitation learning, model-based) and integrating them with classical control approaches to enable safe, efficient, and robust autonomous operation across multiple domains including humanoids, AGVs, cars, and drones.
Key Responsibilities
- Develop and train policies from human demonstrations and teleoperation data.
- Implement safe reinforcement learning approaches with constraints.
- Design long-horizon planners using world models and uncertainty-aware control.
- Implement safety shields, fallback controllers, and verify-before-deploy pipelines.
- Collaborate with cross-functional teams to integrate RL policies with control systems.
- Conduct sim-to-real transfer and ensure policies generalize in real-world settings.
- Design reward functions and implement offline RL and behavioral cloning strategies.
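The last bullet above mentions behavioral cloning; here is a deliberately small PyTorch sketch that fits an MLP policy to (state, action) pairs by regression. The dimensions, network size, and the random tensors standing in for teleoperation data are assumptions for illustration only.

```python
# Minimal behavioral-cloning sketch in PyTorch.
import torch
import torch.nn as nn

STATE_DIM, ACTION_DIM = 32, 8   # placeholder dimensions

policy = nn.Sequential(
    nn.Linear(STATE_DIM, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, ACTION_DIM),
)
optimizer = torch.optim.Adam(policy.parameters(), lr=3e-4)
loss_fn = nn.MSELoss()

# Fake demonstration batch standing in for teleoperation logs.
states = torch.randn(1024, STATE_DIM)
actions = torch.randn(1024, ACTION_DIM)

for epoch in range(10):
    pred = policy(states)
    loss = loss_fn(pred, actions)   # imitate the demonstrated actions
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```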
Must-Haves
- 4–8+ years of experience in RL and control systems.
- Strong expertise in Model Predictive Control (MPC), Control Barrier Functions (CBFs), reachability analysis, or similar methods.
- Master’s or PhD in Robotics, Control, AI, or a related field.
- Experience with sim-to-real transfer, reward design, offline RL, and behavioral cloning.
Nice-to-Haves
- Experience with multi-agent reinforcement learning.
- Knowledge of hierarchical options and diffusion policies.
- Familiarity with long-horizon task planning in complex environments.
Success Metrics
- Task success rate in target domains.
- Rate of human or system interventions during execution.
- Compliance with energy, jerk, and other control limits.
- Minimization of constraint violations in real-world deployment.
Domain Notes
Humanoids:
- Stable locomotion and bimanual task RL.
AGVs (Autonomous Ground Vehicles):
- Navigation in mixed human zones, traffic rule compliance, and aisle etiquette.
Cars:
- Interactive merges, handling unprotected turns, and safe navigation in dynamic traffic.
Drones:
- Wind-robust flight, safe landing and perching maneuvers.
Application Instructions
Interested candidates may apply by sending their resume and cover letter to with the subject line: “RL Research Engineer (Planning & Control) Application”.
Autonomous Systems Robotics Engineer
Posted today
Job Description
About Clutterbot
Imagine a world where you never have to worry about picking up clutter again! At our robotics startup, we understand the challenges families face trying to balance work and daily responsibilities while keeping their homes tidy. That's why we're developing a cutting-edge household robot that will revolutionize the way you live. This safe and innovative robot will drive around your house, effortlessly picking up toys, clothes, and other items off the floor and organizing them neatly into containers. Say goodbye to clutter and hello to a more efficient and stress-free home life!
As part of our close-knit team, you'll have the opportunity to work on cutting-edge technology and be at the forefront of the robotics industry. You'll also have the chance to grow your skills and expertise through collaboration with experienced professionals in the field. Join us in building the future of home automation and experience a fulfilling and rewarding career journey!
About The Role
We are seeking a highly skilled and experienced Robotics Engineer to lead our SLAM (Simultaneous Localization and Mapping) efforts within our advanced mobile robotics system. This role involves both technical leadership and hands-on development, requiring the individual to guide a small team while also actively designing, implementing, and optimizing complex behaviors for autonomous mobile robots. The successful candidate will leverage expertise in ROS, C++, and Python to develop cutting-edge solutions.
What you’ll be doing
1. Develop and implement SLAM algorithms for mobile robotic systems, ensuring accurate localization and mapping in dynamic environments.
2. Design and implement robust perception algorithms for reliable object detection and tracking.
3. Perform multi-sensor calibration and fusion, integrating data from cameras, LIDAR, IMU, and other sensors.
4. Work closely in collaboration with navigation and control teams to generate reliable environment mapping to enable smarter planning capabilities.
5. Own comprehensive testing and evaluation efforts for SLAM and perception algorithms, ensuring robust performance in real-world scenarios.
6. Collaborate with cross-functional teams including software engineers, hardware engineers, and product managers to integrate SLAM and perception capabilities into our robotic systems.
7. Optimize algorithms for real-time performance and ensure their scalability across different hardware platforms.
8. Stay up to date with the latest advancements in SLAM and perception technologies, and actively contribute to research and development efforts.
9. Provide technical guidance and mentorship to team members, fostering a culture of continuous learning and improvement.
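Item 1 above covers SLAM development; graph-based SLAM in particular reduces to nonlinear least squares over a pose graph. The toy below optimizes a 1-D pose chain with one loop-closure edge using SciPy. Real systems optimize SE(2)/SE(3) poses with information matrices and robust kernels (typically via g2o, GTSAM, or Ceres); all numbers here are made up.

```python
# Toy 1-D pose-graph optimization: three odometry edges plus one loop closure,
# solved by nonlinear least squares.
import numpy as np
from scipy.optimize import least_squares

# Edges: (i, j, measured displacement x_j - x_i); the last edge is a loop closure.
edges = [(0, 1, 1.0), (1, 2, 1.1), (2, 3, 0.9), (0, 3, 2.7)]

def residuals(x):
    r = [x[0]]                                    # anchor the first pose at 0
    r += [(x[j] - x[i]) - meas for i, j, meas in edges]
    return np.array(r)

x0 = np.array([0.0, 1.0, 2.0, 3.0])               # initial guess from raw odometry
sol = least_squares(residuals, x0)
print("optimized poses:", np.round(sol.x, 3))
```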
Your Background / Skills
1. Bachelor's or Master's degree in Robotics, Computer Science, Electrical Engineering, or a related field (Ph.D. is a plus).
2. Minimum of 5 years of proven experience in robotics development, with a strong track record in both technical execution and team/project management.
3. Solid understanding of SLAM techniques and algorithms, such as Extended Kalman Filters, Particle Filters, Graph-based SLAM, etc.
4. Strong background in computer vision and 3D geometry, with practical experience in object detection, pattern recognition, and tracking algorithms.
5. Proven hands-on experience in designing SLAM systems for mobile robots, including defining system architecture, selecting appropriate sensors, implementing data fusion strategies, and ensuring scalability in real-world environments.
6. Proficiency in programming languages such as C++, Python, and popular robotics frameworks like ROS2 (Robot Operating System) and Gazebo.
7. Demonstrated experience with sensor calibration and fusion (cameras, LIDAR, IMU).
8. Excellent mathematical and analytical skills, with the ability to efficiently implement complex algorithms.
9. Proven ability to solve complex problems with minimal supervision in a timely manner.
10. Strong communication skills, both verbal and written, with the ability to present complex technical concepts to both technical and non-technical stakeholders.
11. Ability to collaborate effectively with remote and global teams in a fast-paced, collaborative environment.
Preferred Qualifications:
1. Familiarity with semantic mapping and/or learning-based pipelines for SLAM use-cases.
2. Familiarity with common SLAM sensor drivers (LIDAR, cameras, IMU, etc.).
3. Knowledge of path planning and behavior trees.
4. Experience with CI/CD infrastructure and software packaging.
5. Experience with simulation frameworks like Nvidia Isaac, Habitat, etc.
Benefits
1. Competitive compensation package
2. Team building activities
3. Flexible work culture
4. Company-Sponsored Devices.
Lead Autonomous Systems Engineer
Posted today
Job Description
About Us & The Role
At Truxt.Ai, we're not just another startup—we're solving large enterprises' biggest data paradoxes and pioneering the world's first fully autonomous software operations. As an early-stage company driven by a meaningful mission, we seek exceptional engineering talent who values purpose, long-term impact, and significant equity ownership over traditional salary compensation.
As a Founding Senior Machine Learning Engineer, you'll be instrumental in architecting our AI-driven autonomous systems, directly influencing our core ML infrastructure, model deployment strategies, and data engineering decisions. You won't just be implementing models—you'll have the freedom to research, experiment, and deploy cutting-edge ML solutions while building substantial ownership in our company's future.
What Makes This Opportunity Different:
- Freedom to Execute: Set your own goals, timelines, and approaches without bureaucratic hurdles.
- True Ownership: Founding engineers receive substantial equity packages (competitive with senior engineer salaries at scale), ensuring you directly benefit from the value you create.
- Technical Challenge: Solve complex real-world problems with meaningful impact.
- Impact-Driven Culture: Directly influence product direction, company culture, and growth strategy.
- Transparent Leadership: Collaborative environment, weekly strategy sessions, and full visibility into company health and roadmap.
- Exponential Upside Potential: Your equity stake grows with company success—potential for life-changing returns as we scale.
Apply Only If You:
- Believe your work should be meaningful and aligned with your values.
- Seek freedom to turn cutting-edge research and ideas into reality.
- Are tired of incremental feature development on uninspiring products.
- Want to build something from scratch with significant technical challenges.
- Dream big and prioritize ownership, autonomy, and exponential growth potential over guaranteed salary.
- Thrive without hand-holding and proactively create your own direction.
- Are financially positioned to accept equity-only compensation during our early growth phase.
Our Promise to You
- Zero Bureaucracy: No VP approvals or innovation theater—just you, meaningful problems, and the freedom to solve them.
- Transparent Leadership: Open, collaborative culture with weekly strategy sessions.
- Impact-Driven Culture: Your voice directly shapes our product and strategy.
- Significant Equity Package: Founding-level equity stake that reflects your early-stage commitment and risk.
Compensation Structure
- Base Salary: Rs. 0 (equity-only) until we close funding; better than market average post-funding.
- Future Upside: Participation in all future funding rounds and liquidity events.
What We're Looking For:
ML Expertise
- Passionate Builder: Proactive problem-solver who identifies, iterates, and ships solutions.
- ML Veteran: 5+ years building and deploying production ML systems at scale
- Deep Learning Specialist: Hands-on experience with transformer architectures, LLMs, and generative AI
- MLOps Mastery: Expert in ML pipeline orchestration, model monitoring, and A/B testing frameworks
- Programming Excellence: Proficiency in Python, with experience in PyTorch/TensorFlow, and familiarity with systems languages (Go, Rust, C++)
Technical Requirements:
- Enterprise AI: Experience deploying ML models in enterprise environments with strict SLAs
- Autonomous Systems: Background in reinforcement learning, multi-agent systems, or autonomous decision-making
- Data Engineering: Comfortable with large-scale data processing, feature engineering, and real-time inference
- Cloud ML Platforms: Experience with AWS SageMaker, Google AI Platform, or Azure ML
- Model Optimization: Knowledge of model compression, quantization, and edge deployment
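Model quantization is named in the model-optimization requirement above; as a minimal illustration, the PyTorch snippet below applies post-training dynamic quantization to a placeholder MLP. A real deployment would benchmark accuracy and latency on the actual model, and the layer sizes here are arbitrary assumptions.

```python
# Post-training dynamic quantization of a placeholder model in PyTorch.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10)).eval()

quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8   # quantize Linear weights to int8
)

x = torch.randn(1, 512)
print(quantized(x).shape)   # same interface, smaller weights, faster int8 CPU matmuls
```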
Ready to Build Something Extraordinary? If you're done optimizing ad algorithms or CRUD apps and eager to tackle genuinely impactful problems while building true ownership in a revolutionary company, send us your profile.
Accelerated Application Process for ML Engineers: To expedite your application and demonstrate your problem-solving skills, please complete one of the following:
- Kaggle Competition: Participate in any active Kaggle competition and share your results
- Research Contribution: Share a recent ML project, paper, or open-source contribution you're proud of
"No great engineer ever joined a startup just for salary—join us for the mission, the freedom, the opportunity to turn your boldest ideas into reality, and the chance to own a meaningful piece of the future we're building together."
Autonomous Systems Research Engineer
Posted today
Job Description
Job Title: VLM Research Engineer
Location: Vapi, Gujarat
Employment Type: Full-Time
Overview
We are seeking a highly skilled VLM Research Engineer to build multimodal (vision-language-action) models for instruction following, scene grounding, and tool use across platforms. The role involves developing advanced models that bridge perception and language understanding for autonomous systems.
Key Responsibilities
- Pretrain and finetune VLMs, aligning them with robotics data including video, teleoperation, and language.
- Build perception-to-language grounding for referring expressions, affordances, and task graphs.
- Develop Toolformer/actuator interfaces to convert language intents into actionable skills and motion plans.
- Create evaluation pipelines for instruction following, safety filters, and hallucination control.
- Collaborate with cross-functional teams for integration of models into robotics platforms.
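To make "perception-to-language grounding" concrete, the sketch below scores a few candidate referring expressions against an image with an off-the-shelf CLIP model from Hugging Face transformers. This is only zero-shot image-text matching; a full vision-language-action stack would add detection, affordance reasoning, and action decoding. The model checkpoint, image path, and phrases are assumptions.

```python
# Zero-shot grounding sketch: score referring expressions against an image with CLIP.
from PIL import Image
import torch
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("scene.jpg")                      # placeholder image path
phrases = ["a red mug on the table", "an open drawer", "a screwdriver"]

inputs = processor(text=phrases, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)
probs = outputs.logits_per_image.softmax(dim=-1)     # image-vs-phrase similarity

for phrase, p in zip(phrases, probs[0].tolist()):
    print(f"{p:.2f}  {phrase}")
```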
Must-Haves
- Master’s or PhD in a relevant field.
- 1–2+ years of experience in Computer Vision/Machine Learning.
- Strong proficiency in PyTorch or JAX; experience with LLMs and VLMs.
- Familiarity with multimodal datasets, distributed training, and RL/IL.
Nice-to-Haves
- Experience with world models, diffusion-policy integration, and speech interfaces.
- Familiarity with sim-to-real transfer in robotics applications.
Success Metrics
- Task success rate on language-based tasks.
- Grounding precision and latency.
- Sim-to-real performance retention.
Domain Notes
Humanoids:
- Language-guided manipulation and tool use.
AGVs (Autonomous Ground Vehicles):
- Natural language tasking for warehouse operations; semantic maps.
Cars:
- Gesture and sign interpretation; driver interaction.
Drones:
- Natural language mission specification; target search and inspection.
Application Instructions
Interested candidates may apply by sending their resume and cover letter to with the subject line: “VLM Research Engineer Application”.
Autonomous Systems Software Developer
Posted today
Job Description
Embedded Software Developer
Location: Mumbai, India (On-site / Project sites)
Job Type: Full-Time | Mid-Level | Experience: 2–5 years
Company: DroneStark Technologies
Company Overview:
DroneStark Technologies is a leading provider of high-performance drones and autonomous systems for defence, industrial, and research applications. Our mission is to design, manufacture, and deploy cutting-edge robotics platforms with real-world impact. If you are passionate about working on mission-critical systems and flying robots with real autonomy, this is the team for you.
Role Overview:
We are seeking a hands-on Embedded Software Developer with a passion for robotics, drone technology, and real-time systems. You will collaborate closely with our hardware and autonomy teams to develop control software that powers aerial and ground platforms. Your responsibilities will encompass writing low-level drivers, integrating ROS-based pipelines on real hardware, and more.
Key Responsibilities:
* Develop, test, and optimise embedded software for UAVs and UGVs
* Interface with sensors, motor controllers, and flight control systems (ArduPilot, PX4)
* Integrate and deploy autonomy stacks using ROS/ROS2 on Jetson, Raspberry Pi, and other platforms
* Handle real-time communication over MAVLink, SBUS, UART, I2C, SPI, CAN
* Debug and test systems in field conditions alongside the integration team
* Design startup scripts, watchdogs, and hardware-software fault handling logic
* Collaborate with electronics, firmware, and simulation teams for full-stack development
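MAVLink handling (mentioned in the responsibilities above) is often prototyped with pymavlink before it is moved into firmware or a ROS bridge. A minimal sketch, assuming a SITL or real autopilot broadcasting on the common default UDP port:

```python
# Connect to an autopilot over MAVLink, wait for a heartbeat, read attitude.
from pymavlink import mavutil

master = mavutil.mavlink_connection("udp:127.0.0.1:14550")   # assumed SITL endpoint
master.wait_heartbeat()                                      # blocks until the autopilot is heard
print(f"heartbeat from system {master.target_system}, component {master.target_component}")

for _ in range(5):
    msg = master.recv_match(type="ATTITUDE", blocking=True, timeout=5)
    if msg is None:
        break
    print(f"roll={msg.roll:.3f} pitch={msg.pitch:.3f} yaw={msg.yaw:.3f}")
```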
Required Skills:
- Strong experience in C/C++ and Python programming languages.
- Hands-on experience with ROS/ROS2 for navigation, SLAM, VIO, URDF, TF, and other functionalities.
- Proficiency in embedded Linux environments such as Ubuntu, Yocto, and Raspbian.
- Experience integrating and debugging ArduPilot/PX4 systems.
- Familiarity with microcontrollers including STM32, Arduino, and Teensy.
- Knowledge of tools like Gazebo, RViz, QGroundControl, and MAVProxy.
Bonus Skills (Preferred):
- Experience working with drones, autonomous rovers, or robotic arms.
- Familiarity with Jetson platforms (Nano/Orin/AGX) and Raspberry Pi 4/5/CM4.
- Knowledge of FPGA programming, GPR integration, and custom PCB bring-up.
- Familiarity with network-based control systems, streaming video interfaces, and safety-critical systems.
Preferred Qualifications:
- Deployment of code on real hardware, not just simulations.
- Enjoyment of field testing, debugging in challenging conditions, and pushing code that flies.
- Self-motivation, curiosity, and a passion for building functional systems.
- Excitement about working in a fast-paced, startup-style environment where work matters daily.
Application Process:
Submit your resume, portfolio/GitHub link, and a brief note detailing your most challenging robotics project. Consider including any experience working with systems that provide global remote access, SLAM/VIO, or multi-mode control (4WD, Crab, Ackermann).