Data Solutions Coordinator (Regulatory Intelligence)

Mount, Kerala Redica Systems

Posted 3 days ago


Job Description

About the Company

Redica Systems is a data analytics start-up serving over 200 customers in the life sciences sector, with a particular focus on Pharmaceuticals and MedTech. Our team is distributed globally, with headquarters in Pleasanton, CA. Redica’s platform empowers companies to enhance product quality and stay ahead of evolving regulations. Using proprietary processes, we leverage one of the industry’s most comprehensive datasets, sourced from hundreds of health agencies and FOIA records. Our customers use Redica Systems to more effectively and efficiently manage their inspection preparation, monitor their supplier quality, and perform regulatory surveillance. More information is available at redica.com.


About the Role

As a Data Solutions Coordinator on the Regulatory Intelligence team at Redica Systems, you will play a critical role in monitoring and analyzing the global regulatory landscape within the life sciences industry. Your primary responsibility will be to support our regulatory intelligence products and services by actively participating in global monitoring, surveillance, and data acquisition activities. You will work closely with cross-functional teams to identify emerging trends in the pharmaceutical and MedTech regulatory environments and ensure the accuracy and integrity of our regulatory data. This role is ideal for someone passionate about regulatory affairs who enjoys diving deep into data and thrives in a collaborative, fast-paced environment. If you are driven by the opportunity to make a significant impact in a high-growth company and are eager to contribute to our mission of helping companies navigate complex regulatory landscapes, we want to hear from you!


Responsibilities

  • Actively participate in global regulatory intelligence activities, including monitoring, acquiring, tracking and categorizing information related to the evolving global regulatory landscape within the life sciences industry.
  • Maintain a thorough awareness of all current and relevant regulations, guidelines, policies, procedures, and practices.
  • Utilize a strong understanding of GXP to proactively identify, categorize, and analyze regulations, rules, and guidance document changes.
  • Collaborate with teams to identify and assess emerging trends in the pharmaceutical and medtech regulatory environments.
  • Conduct data quality checks to ensure the accuracy, completeness, and integrity of regulatory data (see the sketch after this list).
  • Monitor and report on key performance indicator (KPI) metrics related to regulatory activities.
  • Ensure that all data acquisition and processing processes are clearly defined and consistently followed, while also recommending and implementing improvements where necessary.
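
For illustration, a minimal sketch of the kind of automated data quality check described above, written in pandas. The file name, columns, and rules are hypothetical placeholders, not Redica's actual schema:

```python
# A minimal data-quality sketch in pandas; file, columns, and rules are
# hypothetical placeholders, not the actual regulatory schema.
import pandas as pd

df = pd.read_csv("regulatory_updates.csv", parse_dates=["published_date"])

checks = {
    # Completeness: required fields must be populated.
    "missing_agency": int(df["agency"].isna().sum()),
    "missing_doc_type": int(df["doc_type"].isna().sum()),
    # Integrity: no duplicate records for the same document.
    "duplicate_rows": int(df.duplicated(subset=["doc_id"]).sum()),
    # Accuracy: publication dates must not lie in the future.
    "future_dates": int((df["published_date"] > pd.Timestamp.today()).sum()),
}

for name, count in checks.items():
    status = "OK" if count == 0 else f"FAIL ({count} rows)"
    print(f"{name}: {status}")
```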


Qualifications

  • 2-4 years of experience in the pharmaceutical or medical device industries, with a focus on regulatory affairs or quality assurance.
  • A Master’s degree (M.S.) in Regulatory Affairs or Quality Assurance is preferred.


Required Skills

  • Communicates Effectively: Develops and delivers clear, concise communications across multiple platforms, ensuring accurate and timely sharing of regulatory intelligence insights, updates, and trends with teams and stakeholders.
  • Decision Quality: Analyze and evaluate regulatory data, rules, and guidance document changes to make informed, timely decisions that drive forward the organization’s regulatory intelligence activities.
  • Plans and Aligns: Organize and prioritize multiple time-sensitive projects related to monitoring, tracking, and categorizing global regulatory information, maintaining a high level of attention to detail to ensure data accuracy and relevance.
  • Nimble Learning: Demonstrate a commitment to continuous learning by staying updated on the evolving global regulatory landscape, using new insights to improve regulatory intelligence processes and outcomes.
  • Tech Savvy: Leverage digital tools and technology solutions to monitor regulatory changes, ensuring data accuracy and integrity while contributing to the company’s digital transformation and innovation in regulatory intelligence.
  • Engaged: You share our values and possess the essential competencies needed to thrive at Redica, as outlined here.


Additional Information

It’s a dynamic role blending regulatory expertise with data management and process improvement in a tech-forward environment. We offer competitive salaries, comprehensive benefits packages, and a dynamic team where you can grow and develop your skills.

Top Pharma Companies, Food Manufacturers, MedTech Companies, and Service firms from around the globe rely on Redica Systems to mine and process government inspection, enforcement, and registration data in order to quantify risk signals about their suppliers, identify market opportunities, benchmark against their peers, and prepare for the latest inspection trends. Our data and analytics have been cited by major media outlets such as MSNBC, WSJ, and the Boston Globe.

We are committed to creating a diverse and inclusive workplace where everyone feels welcomed and valued. We believe that diversity of perspectives, backgrounds, and experiences is essential to our success. We are always looking for talented individuals who can bring unique skills and perspectives to our team. All your information will be kept confidential according to EEO guidelines.


Azure Data Engineer

Mount, Kerala MindBrain

Posted today


Job Description

Position: Azure Data Engineer

Location: Remote

Experience: 4+ Years

Notice Period: Immediate Joiners Only

Overview:

We are looking for an experienced Azure Data Engineer to work in a hybrid Developer + Support role. The position involves enhancing and supporting existing Data & Analytics solutions using Azure technologies, ensuring performance, scalability, and reliability.

Key Skills (Must-Have):

  • Azure Databricks
  • PySpark
  • Azure Synapse Analytics

Responsibilities:

  • Design, develop, and maintain data pipelines using Azure Data Factory, Databricks, and Synapse.
  • Perform data cleansing, transformation, and enrichment using PySpark (see the sketch after this list).
  • Handle incident classification, root cause analysis, and resolution.
  • Conduct code reviews and fix recurring/critical bugs.
  • Coordinate with SMEs and stakeholders for issue resolution.
  • Collaborate with teams to deliver robust and scalable data solutions.
  • Contribute to CI/CD processes via Azure DevOps.
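
For illustration, a minimal PySpark sketch of the cleansing and transformation step described above, as it might run in an Azure Databricks notebook. The ADLS paths and column names are placeholder assumptions:

```python
# A minimal cleansing/enrichment sketch; paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided implicitly on Databricks

raw = spark.read.json("abfss://landing@yourlake.dfs.core.windows.net/orders/")

clean = (
    raw.dropDuplicates(["order_id"])                          # deduplication
       .withColumn("amount", F.col("amount").cast("double"))  # type conversion
       .withColumn("order_date", F.to_date("order_date"))     # normalization
       .filter(F.col("amount").isNotNull())                   # drop bad rows
       .withColumn("ingested_at", F.current_timestamp())      # enrichment
)

clean.write.mode("overwrite").parquet(
    "abfss://curated@yourlake.dfs.core.windows.net/orders/"
)
```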

Requirements:

  • 4–6 years of experience in Azure Data Engineering.
  • Strong skills in Databricks, Synapse, ADLS Gen2, Python, PySpark, and SQL.
  • Knowledge of file formats (JSON, Parquet) and databases (Teradata, Snowflake preferred).
  • Experience in Azure DevOps and CI/CD pipelines.
  • Familiarity with ServiceNow for incident/change management.
  • Strong communication, problem-solving, and time management skills.

Nice-to-Have:

  • Power BI experience
  • DP-203 certification



Azure Data Engineer

Mount, Kerala Affine

Posted 3 days ago


Job Description

Experience: 5-8 years



Must Have Skills:

  • Azure Databricks
  • Azure Data Factory
  • PySpark
  • Spark SQL
  • ADLS


Responsibilities:

  • Design and build data pipelines using Spark SQL and PySpark in Azure Databricks (see the sketch after this list)
  • Design and build ETL pipelines using ADF
  • Build and maintain a Lakehouse architecture in ADLS / Databricks.
  • Perform data preparation tasks including data cleaning, normalization, deduplication, type conversion etc.
  • Work with DevOps team to deploy solutions in production environments.
  • Control data processes and take corrective action when errors are identified. Corrective action may include executing a workaround process and then identifying the cause and solution for data errors.
  • Participate as a full member of the global Analytics team, providing solutions for and insights into data related items.
  • Collaborate with your Data Science and Business Intelligence colleagues across the world to share key learnings, leverage ideas and solutions and to propagate best practices.
  • You will lead projects that include other team members and participate in projects led by other team members.
  • Apply change management tools including training, communication and documentation to manage upgrades, changes and data migrations.
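
For illustration, a minimal sketch combining Spark SQL and PySpark in Databricks to land a curated Delta table in ADLS, in the spirit of the Lakehouse responsibilities above; the paths, view name, and columns are assumptions:

```python
# A minimal Lakehouse-style sketch: Spark SQL for preparation, Delta for
# storage. Paths, view name, and columns are illustrative assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.read.parquet(
    "abfss://raw@yourlake.dfs.core.windows.net/sales/"
).createOrReplaceTempView("raw_sales")

# Data preparation in Spark SQL: deduplication, normalization, type conversion.
curated = spark.sql("""
    SELECT DISTINCT
           CAST(sale_id AS BIGINT)        AS sale_id,
           LOWER(TRIM(region))            AS region,
           CAST(amount AS DECIMAL(18, 2)) AS amount,
           TO_DATE(sale_date)             AS sale_date
    FROM raw_sales
    WHERE sale_id IS NOT NULL
""")

# Delta provides the ACID layer a Lakehouse in ADLS/Databricks relies on.
(curated.write.format("delta")
        .mode("overwrite")
        .save("abfss://curated@yourlake.dfs.core.windows.net/sales_delta/"))
```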

Azure Data Engineer

Mount, Kerala Xebia

Posted 9 days ago


Job Description

Job Title: Data Engineer – Azure Stack

Locations: Chennai, Hyderabad, Bengaluru, Gurugram, Jaipur, Bhopal, Pune (Hybrid – 3 days/week in office)

Experience: 5+ Years


Sr. Azure Data Engineer

Mount, Kerala Mindfire Solutions

Posted 10 days ago


Job Description

About the Job

We are seeking a highly skilled and motivated Senior Azure Data Engineer to join our growing data team. The ideal candidate will have a strong background in cloud-based data engineering with hands-on experience across a variety of Azure services. This role will play a critical part in designing, building, optimizing, and maintaining scalable data solutions that support our business objectives.


Core Responsibilities

  • Design and implement robust and scalable data pipelines using Azure Data Factory, Azure Data Lake, and Azure SQL.
  • Work extensively with Azure Fabric, Cosmos DB, and SQL Server to develop and optimize end-to-end data solutions.
  • Perform Database Design, Data Modeling, and Performance Tuning to ensure system reliability and data integrity.
  • Write and optimize complex SQL queries to support data ingestion, transformation, and reporting needs.
  • Proactively implement SQL optimization and preventive maintenance strategies to ensure efficient database performance (see the sketch after this list).
  • Lead data migration efforts from on-premise to cloud or across Azure services.
  • Collaborate with cross-functional teams to gather requirements and translate them into technical solutions.
  • Maintain clear documentation and follow industry best practices for security, compliance, and scalability.
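
For illustration, a minimal sketch of the set-based, parameterized T-SQL pattern this role calls for, run against Azure SQL from Python via pyodbc. The connection string, table, and cutoff date are hypothetical:

```python
# A minimal sketch of a set-based, parameterized T-SQL statement run from
# Python via pyodbc; connection string, table, and cutoff are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=yourserver.database.windows.net;DATABASE=yourdb;"
    "UID=youruser;PWD=yourpassword"
)
cursor = conn.cursor()

# One set-based UPDATE instead of a row-by-row loop: a common optimization
# that also keeps the plan cache effective via parameterization.
cursor.execute(
    """
    UPDATE o
    SET    o.status = 'archived'
    FROM   dbo.orders AS o
    WHERE  o.order_date < ?
    """,
    "2023-01-01",
)
conn.commit()
print(f"{cursor.rowcount} rows archived in a single statement")
```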


Required Skills

Proven experience working with:

  • Azure Fabric
  • SQL Server
  • Azure Data Factory
  • Azure Data Lake
  • Cosmos DB


Strong hands-on expertise in:

  • Complex SQL queries
  • SQL query efficiency and optimization
  • Database design and data modeling
  • Data migration techniques and performance tuning

Solid understanding of cloud infrastructure and data integration patterns in Azure.


Nice to have

  • Microsoft Azure certifications related to Data Engineering or Azure Solutions Architecture.
  • Experience working in agile environments with CI/CD practices.
  • Experience with SSIS/SSRS


Qualifications

  • 5+ years of experience in the software industry
  • B.Tech/M.Tech in CS/IT, or related field.
  • Excellent verbal and written communication skills.

Sr Azure Data Engineer - Remote work

Mount, Kerala Techolution

Posted today


Job Description

We are seeking a skilled Sr Azure Data Engineer with hands-on experience in modern data engineering tools and platforms within the Azure ecosystem. The ideal candidate will have a strong foundation in data integration, transformation, and migration, along with a passion for working on complex data migration projects.


Job Title: Sr. Azure Data Engineer

Location: Remote work

Work Timings: 2:00 PM – 11:00 PM IST

No of Openings: 2


Please Note: This is a pure Azure-specific role. If your expertise is primarily in AWS or GCP, we kindly request that you do not apply.


Key Responsibilities:

  • Lead the migration of large-scale SQL workloads from on-premise environments to Azure, ensuring high data integrity, minimal downtime, and performance optimization (see the sketch after this list).
  • Design, develop, and manage end-to-end data pipelines using Azure Data Factory or Synapse Data Factory to orchestrate migration and ETL processes.
  • Build and administer scalable, secure Azure Data Lakes to store and manage structured and unstructured data during and post-migration.
  • Utilize Azure Databricks, Synapse Spark Pools, Python, and PySpark for advanced data transformation and processing.
  • Develop and fine-tune SQL/T-SQL scripts for data extraction, cleansing, transformation, and reporting in Azure SQL Database, SQL Managed Instances, and SQL Server.
  • Design and maintain ETL solutions using SQL Server Integration Services (SSIS), including reengineering SSIS packages for Azure compatibility.
  • Collaborate with cloud architects, DBAs, and application teams to assess existing workloads and define the best migration approach.
  • Continuously monitor and optimize data workflows for performance, reliability, and cost-effectiveness across Azure platforms.
  • Enforce best practices in data governance, security, and compliance throughout the migration lifecycle.
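
For illustration, a minimal Databricks sketch of one migration step named above: lifting an on-premises SQL Server table into the lake over JDBC with partitioned reads. The JDBC URL, credentials, bounds, and table name are placeholder assumptions:

```python
# A minimal on-prem-to-Azure extract sketch; JDBC URL, credentials, bounds,
# and table name are placeholder assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

source = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://onprem-host:1433;databaseName=sales")
    .option("dbtable", "dbo.customers")
    .option("user", "migration_user")
    .option("password", "********")
    # Partitioned reads keep large extracts parallel and restartable.
    .option("partitionColumn", "customer_id")
    .option("lowerBound", "1")
    .option("upperBound", "10000000")
    .option("numPartitions", "16")
    .load()
)

# Land as Delta so post-migration integrity checks can compare row counts.
(source.write.format("delta")
       .mode("overwrite")
       .save("abfss://migrated@yourlake.dfs.core.windows.net/customers/"))
```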

Required Skills and Qualifications:

  • 3+ years of hands-on experience in data engineering, with a clear focus on SQL workload migration to Azure.
  • Deep expertise in: Azure Data Factory / Synapse Data Factory, Azure Data Lake, Azure Databricks / Synapse Spark Pools, Python and PySpark, SQL
  • SSIS – design, development, and migration to Azure
  • Proven track record of delivering complex data migration projects (on-prem to Azure, or cloud-to-cloud).
  • Experience re-platforming or re-engineering SSIS packages for Azure Data Factory or Azure-SSIS Integration Runtime.
  • Microsoft Certified: Azure Data Engineer Associate or similar certification preferred.
  • Strong problem-solving skills, attention to detail, and ability to work in fast-paced environments.
  • Excellent communication skills with the ability to collaborate across teams and present migration strategies to stakeholders.



If you believe you are qualified and are looking forward to setting your career on a fast-track, apply by submitting a few paragraphs explaining why you believe you are the right person for this role.


To know more about Techolution, visit our website:




About Techolution:

Techolution is a next gen AI consulting firm on track to become one of the most admired brands in the world for "AI done right". Our purpose is to harness our expertise in novel technologies to deliver more profits for our enterprise clients while helping them deliver a better human experience for the communities they serve.


At Techolution, we build custom AI solutions that produce revolutionary outcomes for enterprises worldwide. Specializing in "AI Done Right," we leverage our expertise and proprietary IP to transform operations and help achieve business goals efficiently.


We are honored to have recently received the prestigious Inc 500 Best In Business award, a testament to our commitment to excellence. We were also awarded - AI Solution Provider of the Year by The AI Summit 2023, Platinum sponsor at Advantage DoD 2024 Symposium and a lot more exciting stuff! While we are big enough to be trusted by some of the greatest brands in the world, we are small enough to care about delivering meaningful ROI-generating innovation at a guaranteed price for each client that we serve.


Our thought leader, Luv Tulsidas, wrote and published a book in collaboration with Forbes, “Failing Fast? Secrets to succeed fast with AI”. Refer here for more details on the content -

Let's explore further!

Uncover our unique AI accelerators with us:

1. Enterprise LLM Studio: Our no-code DIY AI studio for enterprises. Choose an LLM, connect it to your data, and create an expert-level agent in 20 minutes.

2. AppMod.AI: Modernizes ancient tech stacks quickly, achieving over 80% autonomy for major brands!

3. ComputerVision.AI: Offers customizable Computer Vision and Audio AI models, plus DIY tools and a Real-Time Co-Pilot for human-AI collaboration!

4. Robotics and Edge Device Fabrication: Provides comprehensive robotics, hardware fabrication, and AI-integrated edge design services.

5. RLEF AI Platform: Our proven Reinforcement Learning with Expert Feedback (RLEF) approach bridges Lab-Grade AI to Real-World AI.


Some videos you wanna watch!

  • Computer Vision demo at The AI Summit New York 2023
  • Life at Techolution
  • GoogleNext 2023
  • Ai4 - Artificial Intelligence Conferences 2023
  • WaWa - Solving Food Wastage
  • Saving lives - Brooklyn Hospital
  • Innovation Done Right on Google Cloud
  • Techolution featured on Worldwide Business with Kathy Ireland
  • Techolution presented by ION World’s Greatest


Visit us @ : to learn more about our revolutionary core practices and how we enrich the human experience with technology.


Data Engineering Manager

Mount, Kerala Rocket Learning

Posted 3 days ago

Job Viewed

Tap Again To Close

Job Description

Role: Data Engineering Manager

Experience: 5-8 years

Location: Bengaluru/ Remote within India

Compensation: 28-30 LPA based on experience


Role Overview:

We are seeking an experienced Data Engineering Manager to lead our data engineering team in building and managing scalable data infrastructure that powers our education platform. The ideal candidate will have hands-on expertise in AWS, Data Lakes, Python, and big data technologies , along with proven leadership skills to mentor a team of data engineers. This role is critical in ensuring data-driven decision-making and enabling scalable analytics for India's largest early childhood education initiative.


Key Areas of Responsibility

Team Leadership and Management

  • Lead and mentor a team of data engineers, fostering a culture of collaboration, innovation, and accountability.
  • Provide technical guidance, career growth opportunities, and performance feedback to team members.

Data Infrastructure and Scalability

  • Design, build, and maintain scalable data pipelines, data lakes, and warehouses to support analytics and machine learning.
  • Ensure data reliability, quality, and security across all data systems.

Strategic Data Initiatives

  • Drive the adoption of best practices in data engineering, including data governance, metadata management, and data lineage.
  • Collaborate with cross-functional teams (product, analytics, engineering) to align data infrastructure with organizational goals.


Responsibilities in Detail

  • Architect and implement data lakes and ETL pipelines using AWS services (Glue, Redshift, S3, Athena, Lambda); see the sketch after this list.
  • Optimize data storage, processing, and retrieval for performance and cost-efficiency.
  • Develop and enforce data governance policies to ensure compliance and data integrity.
  • Integrate data from multiple sources (APIs, databases, streaming) into a unified analytics platform.
  • Work closely with data scientists and analysts to enable advanced analytics and machine learning workflows.
  • Stay updated with emerging technologies (e.g., Delta Lake, Snowflake, Spark) and advocate for their adoption where beneficial.
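
For illustration, a minimal PySpark sketch of an S3 data-lake ETL step like those above (a Glue job would follow the same shape); the bucket names, paths, and columns are placeholder assumptions:

```python
# A minimal S3 data-lake ETL sketch in plain PySpark; bucket names, paths,
# and columns are assumptions, not the actual platform schema.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

events = spark.read.json("s3://your-raw-bucket/app-events/")

daily = (
    events.withColumn("event_date", F.to_date("event_ts"))
          .groupBy("event_date", "district")
          .agg(F.countDistinct("user_id").alias("active_learners"))
)

# Partitioned Parquet keeps downstream Athena/Redshift Spectrum scans cheap.
(daily.write.mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://your-curated-bucket/daily-engagement/"))
```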


Critical Success Factors

  • Technical Expertise:
  • Proficiency in Python and SQL for data processing and pipeline development.
  • Hands-on experience with AWS data stack (Glue, Redshift, S3, EMR, Kinesis).
  • Strong understanding of data lake architectures, ETL/ELT frameworks, and big data technologies (Spark, Hadoop).
  • Familiarity with data orchestration tools (Airflow, Luigi) and infrastructure-as-code (Terraform, CloudFormation).
  • Leadership and Management:
  • Proven experience leading and growing data engineering teams (5+ members).
  • Ability to mentor engineers and foster a culture of continuous learning.
  • Strategic Thinking:
  • Demonstrated ability to align data infrastructure with business objectives.
  • Keen eye for optimizing data workflows and reducing technical debt.
  • Collaboration Skills:
  • Strong communication skills to work effectively with technical and non-technical stakeholders.

Senior Developer- AI & Data Engineering

Mount, Kerala PS Human Resources and Consultants

Posted 3 days ago


Job Description

Job Summary:

We are seeking a highly skilled and motivated Senior Developer with a strong background in AI development, Python programming, and data engineering. The ideal candidate will have hands-on experience with OpenAI models, Machine Learning, Prompt Engineering, and frameworks such as NLTK, Pandas, and NumPy. You will work on developing intelligent systems, integrating APIs, and deploying scalable solutions using modern data and cloud technologies.


Key Responsibilities:

  • Design, develop, and optimize intelligent applications using OpenAI APIs and machine learning models.
  • Create and refine prompts for Prompt Engineering to extract desired outputs from LLMs (Large Language Models); see the sketch after this list.
  • Build and maintain scalable, reusable, and secure REST APIs for AI and data applications.
  • Work with large datasets using Pandas, NumPy, and SQL, and integrate text analytics using NLTK.
  • Collaborate with cross-functional teams to understand requirements and translate them into technical solutions.
  • Use the Function Framework to encapsulate business logic and automate workflows.
  • Apply basic knowledge of cloud platforms (AWS, Azure, or GCP) for deployment and scaling.
  • Assist in data integration, processing, and transformation for Big Data systems.
  • Write clean, maintainable, and efficient Python code.
  • Conduct code reviews, mentor junior developers, and lead small projects as needed.
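
For illustration, a minimal sketch of prompt-engineered text classification with the OpenAI Python client and Pandas. The model name, prompt wording, and columns are assumptions, not a prescribed design:

```python
# A minimal prompt-engineering sketch; model name, prompt wording, and
# DataFrame columns are assumptions for illustration only.
import pandas as pd
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

df = pd.DataFrame({"ticket": ["App crashes on login", "Love the new dashboard"]})

def classify(text: str) -> str:
    # Prompt engineering: constrain the output space explicitly.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; substitute your deployment
        messages=[
            {"role": "system",
             "content": "Label the user feedback as 'bug', 'praise', or "
                        "'other'. Reply with the label only."},
            {"role": "user", "content": text},
        ],
    )
    return response.choices[0].message.content.strip()

df["label"] = df["ticket"].apply(classify)
print(df)
```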


Required Skills & Qualifications:

  • Minimum 3 years of experience in Python development with a strong focus on AI and ML.
  • Proven expertise in OpenAI tools and APIs.
  • Hands-on experience with Machine Learning models and Prompt Engineering techniques.
  • Solid programming skills using Python, along with libraries like Pandas, NumPy, and NLTK.
  • Experience developing and integrating REST APIs.
  • Working knowledge of SQL and relational database systems.
  • Familiarity with Function Frameworks and modular design patterns.
  • Basic understanding of cloud platforms (AWS/GCP/Azure) and Big Data concepts.
  • Strong problem-solving skills and ability to work in a fast-paced environment.
  • Excellent communication and collaboration skills.



Senior Data Analytics Engineer – Azure Data Stack | Remote

Mount, Kerala Strategic Systems Inc

Posted 3 days ago


Job Description

Title: Senior Data Analytics Engineer – Azure Data Stack | Remote

Location: Remote (Must overlap with US Eastern/Central time zones; 2 PM–11 PM IST shift acceptable)

Experience Level: Senior | 3–5+ years in data engineering


Role Overview:

We’re looking for a Senior Data Analytics Engineer to join our Global Delivery - DADP team, building high-performance, Azure-native data solutions. You’ll work directly with clients to translate complex business needs into scalable data platforms, applying your deep expertise across Azure Data Factory, Databricks, Synapse, and modern data warehousing patterns.

This is a remote consulting role with real client impact, autonomy, and a fast-paced, agile environment.



Senior Full Stack SDE with Data Engineering for Analytics

Mount, Kerala Truckmentum

Posted 12 days ago


Job Description

Summary

Truckmentum is seeking a Senior Full Stack Software Development Engineer (SDE) with deep data engineering experience to help us build cutting-edge software and data infrastructure for our AI-driven Trucking Science-as-a-Service platform. We’re creating breakthrough data science to transform trucking — and we’re looking for engineers who share our obsession with solving complex, real-world problems with software, data, and intelligent systems.


You’ll be part of a team responsible for the development of dynamic web applications, scalable data pipelines, and high-performance backend services that drive better decision-making across the $4 trillion global trucking industry. This is a hands-on role focused on building solutions by combining Python-based full stack development with scalable, modern data engineering.


About Truckmentum

Just about every sector of the global economy depends on trucking. In the US alone, trucks move 70%+ of all freight by weight (90+% by value) and account for $40 billion in annual spending (globally $4+ trillion per year). Despite this, almost all key decisions in trucking are made manually by people with limited decision support. This results in significant waste and lost opportunities. We view this as a great opportunity.


Truckmentum is a self-funded seed stage venture. We are now validating our key data science breakthroughs with customer data and our MVP product launch to confirm product-market fit. We will raise $4-6 million in funding this year to scale our Data Science-as-a-Service platform and bring our vision to market at scale.


Our Vision and Approach to Technology

The back of our business cards reads “Moneyball for Trucking”, which means quantifying hard-to-quantify hidden insights, and then using those insights to make much better business decisions. If you don’t want “Moneyball for Trucking” on the back of your business card, then Truckmentum isn’t a good fit.


Great technology begins with customer obsession. We are obsessed with trucking companies' needs, opportunities, and processes, and with building our solutions into the rhythm of their businesses. We prioritize rapid development and iteration on large-scale, complex data science problems, backed by actionable, dynamic data visualizations. We believe in an Agile, lean approach to software engineering, backed by a structured CI/CD approach, professional engineering practices, clean architecture, clean code and testing.


Our technology stack includes AWS Cloud, MySQL, Snowflake, Python, SQLAlchemy, Pandas, Streamlit and AGGrid to accelerate development of web visualization and interfaces.


About the Role

As a Senior Full Stack SDE with Data Engineering for Analytics, you will be responsible for designing and building the software systems, user interfaces, and data infrastructure that power Truckmentum’s analytics, data science, and decision support platform. This is a true full stack role — you’ll work across frontend, backend, and data layers using Python, Streamlit, Snowflake, and modern DevOps practices. You’ll help architect and implement a clean, extensible system that supports complex machine learning models, large-scale data processing, and intuitive business-facing applications.


You will report to the CEO (Will Payson), a transportation science expert with 25 years in trucking, who has delivered $1B+ in annual savings for FedEx and Amazon. You will also work closely with the CMO/Head of Product, Tim Liu, who has 20+ years of experience in building and commercializing customer-focused digital platforms including in logistics.


Responsibilities and Goals

- Design and build full stack applications using Python, Streamlit, and modern web frameworks to power internal tools, analytics dashboards, and customer-facing products (see the sketch after this list).

- Develop scalable data pipelines to ingest, clean, transform, and serve data from diverse sources into Snowflake and other cloud-native databases.

- Implement low-latency, high-availability backend services to support data science, decision intelligence, and interactive visualizations.

- Integrate front-end components with backend systems and ensure seamless interaction between UI, APIs, and data layers.

- Collaborate with data scientists / ML engineers to deploy models, support experimentation, and enable rapid iteration on analytics use cases.

- Define and evolve our data strategy and architecture, including schemas, governance, versioning, and access patterns across business units and use cases.

- Implement DevOps best practices, including testing, CI/CD automation, and observability, to improve reliability and reduce technical debt.

- Ensure data integrity and privacy through validation, error handling, and secure design.

- Contribute to product planning and roadmaps by working with cross-functional teams to estimate scope, propose solutions, and deliver value iteratively.
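
For illustration, a minimal sketch of the full-stack shape described above: a Streamlit page reading from Snowflake into Pandas. The credentials, table, and columns are placeholder assumptions, not Truckmentum's actual schema:

```python
# A minimal Streamlit-on-Snowflake sketch; credentials, table, and columns
# are placeholder assumptions.
import pandas as pd
import snowflake.connector
import streamlit as st

@st.cache_data(ttl=600)  # cache query results for 10 minutes
def load_lane_costs() -> pd.DataFrame:
    conn = snowflake.connector.connect(
        account="your_account", user="your_user", password="********",
        warehouse="ANALYTICS_WH", database="TRUCKING", schema="CURATED",
    )
    try:
        # The connector is DB-API compliant, so pd.read_sql works directly.
        return pd.read_sql(
            "SELECT lane, avg_cost_per_mile, loads FROM lane_costs", conn
        )
    finally:
        conn.close()

st.title("Lane Cost Explorer")
df = load_lane_costs()
lane = st.selectbox("Lane", df["lane"].unique())
st.dataframe(df[df["lane"] == lane])
```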


Required Qualifications

- 5+ years of professional software development experience, with a proven track record of building enterprise-grade, production-ready software applications for businesses or consumers, working in an integrated development team using Agile and Git / GitHub.

- Required technology experience with the following technologies in a business context:

  • Python as primary programming language (5+ years’ experience)
  • Pandas, Numpy, SQL
  • AWS and/or GCP cloud configuration / deployment
  • Git / GitHub
  • Snowflake, and/or Redshift or BigQuery
  • Docker
  • Airflow, Prefect or other DAG orchestration technology
  • Front end engineering (e.g., HTML/CSS, JavaScript, and component-based frameworks)

- Hands-on experience with modern front-end technologies — HTML/CSS, JavaScript, and component-based frameworks (e.g., Streamlit, React, or similar).

- Experience designing and managing scalable data pipelines, data processing jobs, and ETL/ELT

- Experience in defining Data Architecture and Data Engineering Architecture, including robust pipelines, and building and using cloud services (AWS and/or GCP)

- Experience building and maintaining well-structured APIs and microservices in a cloud environment.

- Working knowledge of, and experience applying, data validation, privacy, and governance

- Comfort working in a fast-paced, startup environment with evolving priorities and an Agile mindset.

- Strong communication and collaboration skills — able to explain technical tradeoffs to both technical and non-technical stakeholders.


Desirable Experience (i.e., great but not required)

- Desired technology experience with the following technologies in a business context:

  • Snowflake
  • Streamlit
  • Folium, Plotly, AG Grid
  • Kubernetes
  • JavaScript, CSS
  • Flask, FastAPI, and SQLAlchemy

- Exposure to machine learning workflows and collaboration with data scientists or MLOps teams.

- Experience building or scaling analytics tools, business intelligence systems, or SaaS data products.

- Familiarity with geospatial data and visualization libraries (e.g., Folium, Plotly, AG Grid).

- Knowledge of CI/CD tools (e.g., GitHub Actions, Docker, Terraform) and modern DevOps practices.

- Contributions to early-stage product development — especially at high-growth startups.

- Passion for transportation and logistics, and for applying technology to operational systems.


Why Join Truckmentum

At Truckmentum, we’re not just building software — we’re rewriting the rules for one of the largest and most essential industries in the world. If you’re excited by real-world impact, data-driven decision making, and being part of a company where you’ll see your work shape the product and the business, this is your kind of team.


Some of the factors that make this a great opportunity include:

- Massive market opportunity: Trucking is a $4T+ global industry / strong customer interest in solution

- Real business impact: Our tech has already shown a 5% operating margin gain at pilot customers.

- Builder’s culture: You’ll help define architecture, shape best practices, and influence our direction.

- Tight feedback loop: We work directly with real customers and iterate fast.

- Tech stack you’ll love: Python, Streamlit, Snowflake, Pandas, AWS — clean, modern, focused.

- Mission-driven team: We’re obsessed with bringing "Moneyball for Trucks" to life — combining science, strategy, and empathy to make the complex simple, and the invisible visible.


We value intelligence, curiosity, humility, clean code, measurable impact, clear thinking, hard work and a focus on delivering results. If that sounds like your kind of team, we’d love to meet you.


  • P.S. If you read this far, we assume you are focused and detail-oriented. If you think this job sounds interesting, please fill in a free personality profile on and email a link to the outcome to to move your application to the top of the pile.