Optimal Trajectories for Autonomous Human-Following Carts

Developed an autonomous shopping cart system that uses gesture-based navigation to assist users in crowded environments, reducing physical effort and promoting safer, contactless shopping experiences.

Research · Robotics · Automation · AI/ML

Overview

About the Project

The human-following autonomous shopping cart system was developed to assist users in crowded environments by reducing physical effort and enabling safer, contactless shopping. The project aimed to create a gesture-based navigation system for autonomous carts that could follow users and avoid obstacles in real time.

ROLE
Researcher, Developer

TIMELINE
July 2020 - August 2021

TOOLS
ROS, Gazebo, RGB-D cameras, Python

TEAM
Myself, Dr. Jingang Yi, Merrill Edmonds, Faiza Sikandar, Tarik Yigit

The Challenge

In crowded shopping environments, traditional carts posed hygiene risks and required physical effort from users. There was a need for an autonomous, human-following solution that could navigate safely while maintaining distance, minimizing contact, and reducing user effort.

The Solution

The human-following shopping cart system uses RGB-D cameras to recognize user gestures and follows the user based on skeletal pose estimation. It plans the cart's path in real time from live sensor data, ensuring collision-free navigation and safe user interaction.
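To make the gesture side of this concrete, here is a minimal sketch of how a "follow me" or "wait" command could be classified from skeletal keypoints produced by an RGB-D pose estimator. The keypoint names, coordinate convention, and thresholds are illustrative assumptions, not the project's actual classifier.

```python
# Hypothetical gesture classifier over 3D skeletal keypoints (x, y, z in
# metres). Assumes an image-style y-axis: smaller y means higher in the frame.
# Keypoint names and thresholds are illustrative, not the project's values.

def classify_gesture(keypoints):
    """Return 'follow', 'wait', or 'none' from a dict of named keypoints."""
    wrist = keypoints.get("right_wrist")
    shoulder = keypoints.get("right_shoulder")
    if wrist is None or shoulder is None:
        return "none"
    # Hand raised well above the shoulder -> start following.
    if wrist[1] < shoulder[1] - 0.15:
        return "follow"
    # Arm extended sideways at roughly shoulder height -> wait in place.
    if abs(wrist[1] - shoulder[1]) < 0.10 and abs(wrist[0] - shoulder[0]) > 0.40:
        return "wait"
    return "none"
```

In practice the decision would be smoothed over several frames so a single noisy pose estimate does not toggle the cart's behavior.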

Understanding the Problem

The research identified key challenges in traditional shopping experiences, particularly during the pandemic. Shared physical carts increased the risk of pathogen transmission, and many existing autonomous cart systems lacked robustness in crowded environments.

Key Insights

  • No effective gesture-based navigation system existed for autonomous carts in crowded environments.
  • Existing cart systems struggled with real-time path optimization in dynamic, high-traffic environments.
  • Human-following behaviors required a balance of safety, user convenience, and non-intrusive interactions.

This research formed the foundation for the cart's gesture-based interaction and real-time navigation features.

Funding

This project was funded by the Rutgers University Louis Stokes Alliance for Minority Participation (LSAMP) program, which supports the research efforts of minority students in STEM fields. The funding enabled access to essential tools and resources for the development of the human-following autonomous shopping cart system.

Experimentation

The project underwent two stages of experimentation: large-scale simulations using ROS/Gazebo and real-world tests with a 3-wheeled omni-directional robot.

Simulations

Simulations were conducted in a virtual supermarket environment to replicate a crowded setting. The multi-cart system, consisting of up to 40 autonomous shopping carts, was equipped with RGB-D cameras, laser scanners, and depth sensors. The carts navigated through aisles, avoiding obstacles and other carts, while following human targets. The simulation environment allowed for testing of user-following behavior, goal selection, and obstacle avoidance across different traffic levels.
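One simple way to keep many carts collision-free in an aisle, sketched below, is a repulsive-potential velocity command: attraction toward the goal plus repulsion from nearby obstacles and other carts. This is an illustrative technique only; the planner actually used in the simulations may differ, and the gains and distances here are made-up tuning values.

```python
import math

def avoidance_velocity(cart, goal, obstacles, k_att=1.0, k_rep=0.5, influence=1.5):
    """Potential-field sketch: attract toward goal, repel from obstacles.

    cart, goal, and each obstacle are (x, y) positions in metres; the gains
    k_att/k_rep and the influence radius are illustrative assumptions.
    """
    # Attractive component pulls straight toward the goal.
    vx = k_att * (goal[0] - cart[0])
    vy = k_att * (goal[1] - cart[1])
    for ox, oy in obstacles:
        dx, dy = cart[0] - ox, cart[1] - oy
        d = math.hypot(dx, dy)
        if 1e-6 < d < influence:
            # Repulsion grows sharply as the cart nears the obstacle.
            gain = k_rep * (1.0 / d - 1.0 / influence) / d**2
            vx += gain * dx
            vy += gain * dy
    return vx, vy
```

With no obstacles the command points directly at the goal; an obstacle between cart and goal bends or reverses the command, which is the collision-avoidance behavior exercised across the different traffic levels.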

Real-World Testing

In the real-world experiments, the system was tested using a 3-wheeled omni-directional robot. The robot was fitted with forward-facing color and depth cameras for gesture recognition and navigation. Retro-reflective markers were attached to the robot, and Vicon motion-tracking cameras precisely captured both the human pose and the robot's movement. These tests evaluated the cart's ability to follow users from gesture-based commands and avoid obstacles in dynamic, real-world environments.

Key Findings

  • The system effectively followed users’ gestures, switching between following, waiting, and approaching behaviors.
  • In simulations, the multi-cart system demonstrated collision-free navigation even in high-traffic environments.
  • Real-world tests showed reliable gesture-based interaction, with minimal tracking errors and a high degree of responsiveness to user commands.
  • The cart system efficiently navigated crowded spaces, adapting to human movement while maintaining a safe distance.
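The following/waiting/approaching switching noted above can be summarized as a small state machine. The state names come from the findings; the distance thresholds and transition rules below are illustrative assumptions, not the project's measured parameters.

```python
# Minimal behavior state machine for the cart. Thresholds are assumed values.
FOLLOW_DISTANCE = 1.2   # desired trailing distance (m), illustrative
APPROACH_TRIGGER = 3.0  # gap large enough to close first, illustrative

def next_behavior(state, gesture, distance_to_user):
    """Return the cart's next behavior given the latest gesture and range.

    gesture is 'follow', 'wait', or None (no new command this cycle).
    """
    if gesture == "wait":
        return "waiting"
    if gesture == "follow":
        # Close a large gap before settling into steady following.
        return "approaching" if distance_to_user > APPROACH_TRIGGER else "following"
    if state == "approaching" and distance_to_user <= FOLLOW_DISTANCE:
        return "following"
    return state  # no new command: keep the current behavior
```

Keeping the behavior logic this explicit makes the safety property easy to check: a "wait" gesture always halts the cart regardless of its current state.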

These experiments validated the system’s ability to operate in both controlled and real-world environments, highlighting its robustness in crowded spaces and its potential for practical deployment.

Results

The project resulted in a functional prototype capable of safely following users in crowded environments, demonstrating its effectiveness through large-scale simulations and real-world experiments with a 3-wheeled omni-directional robot.

Outcomes

The cart system successfully balanced user-following with obstacle avoidance, ensuring a smooth shopping experience. The gesture-based control provided an intuitive way for users to interact with the cart, enhancing safety and usability.

Conclusion

The human-following autonomous shopping cart is an innovative solution that offers a safer, more efficient shopping experience. It showcases how gesture-based navigation and human-centered design can create intuitive, effective tools for real-world environments.

This project highlighted the importance of addressing user needs in complex environments and leveraging cutting-edge technology to enhance everyday experiences.