The sharing economy has transformed urban transportation, and shared bicycles are a prominent example: they address urban mobility needs by providing an affordable and convenient "last mile" travel option. Despite their growing popularity, a critical challenge remains: the dynamic allocation of e-bikes, particularly in station-based sharing systems. This research addresses that problem with multi-armed bandit algorithms, framing allocation as a sequential decision under uncertainty. The study conducts experiments on real-world February data from New York City's bike-sharing system, applying Upper Confidence Bound (UCB), Thompson Sampling, and Sliding Window Thompson Sampling, and evaluates performance using cumulative regret, optimal-arm selection rate, and Mean Absolute Error (MAE). The findings show that multi-armed bandit algorithms handle dynamic allocation effectively, with Sliding Window Thompson Sampling proving especially adaptable under sudden environmental changes. The study thus provides a data-driven methodology for e-bike allocation and a step toward more efficient shared transportation systems and urban mobility infrastructure.
Fuhua Lin, M. Ali Akber Dewan, M. Nguyen
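The abstract highlights Sliding Window Thompson Sampling's adaptability to sudden environmental changes. A minimal sketch of the general technique follows, assuming Bernoulli rewards (e.g. whether a rebalanced bike at a station is rented); the class name, window size, and the two-arm simulation are illustrative assumptions, not the paper's actual implementation.

```python
import random
from collections import deque

class SlidingWindowThompsonSampling:
    """Thompson Sampling over a sliding window of recent observations.

    Only the last `window` (arm, reward) pairs inform each arm's Beta
    posterior, so stale data is forgotten and the policy can track
    non-stationary reward rates.
    """

    def __init__(self, n_arms, window, rng=None):
        self.n_arms = n_arms
        self.history = deque(maxlen=window)  # oldest entries drop out
        self.rng = rng or random.Random()

    def select_arm(self):
        # Count successes/failures per arm, but only inside the window.
        succ = [0] * self.n_arms
        fail = [0] * self.n_arms
        for arm, reward in self.history:
            if reward:
                succ[arm] += 1
            else:
                fail[arm] += 1
        # Draw one sample per arm from Beta(1 + s, 1 + f); play the argmax.
        samples = [self.rng.betavariate(1 + succ[a], 1 + fail[a])
                   for a in range(self.n_arms)]
        return max(range(self.n_arms), key=samples.__getitem__)

    def update(self, arm, reward):
        self.history.append((arm, reward))
```

Because the deque caps the history, an arm whose reward rate collapses mid-stream loses its inflated posterior within roughly one window of rounds, which is the adaptability property the abstract attributes to this algorithm relative to plain Thompson Sampling.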