JOURNAL ARTICLE

Feudal Latent Space Exploration for Coordinated Multi-Agent Reinforcement Learning

Xiangyu Liu, Ying Tan

Year: 2022 · Journal: IEEE Transactions on Neural Networks and Learning Systems · Vol: 34 (10) · Pages: 7775-7783 · Publisher: Institute of Electrical and Electronics Engineers

Abstract

In this article, we investigate how multiple agents learn to coordinate for efficient exploration in reinforcement learning. Although straightforward, independent exploration of the joint action space of multiple agents becomes exponentially more difficult as the number of agents increases. To tackle this problem, we propose feudal latent-space exploration (FLE) for multi-agent reinforcement learning (MARL). FLE introduces a feudal commander that learns a low-dimensional global latent structure instructing multiple agents to explore in a coordinated manner. Under this framework, the multi-agent policy gradient (PG) is adopted to optimize both the agent policies and the latent structure end-to-end. We demonstrate the effectiveness of this method in two multi-agent environments that require explicit coordination. Experimental results validate that FLE outperforms baseline MARL approaches that use independent exploration strategies in terms of mean rewards, efficiency, and the expressiveness of coordination policies.
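The abstract's core idea can be illustrated with a minimal sketch: a "commander" maps the global state to a low-dimensional latent vector z, and each agent's policy conditions on its local observation together with the shared z, so exploration is correlated across agents rather than independent. All dimensions, weight shapes, and function names below are illustrative assumptions, not details from the paper; the real method trains both components end-to-end with a multi-agent policy gradient, whereas this sketch only shows the sampling path.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not taken from the paper).
N_AGENTS, OBS_DIM, LATENT_DIM, N_ACTIONS = 3, 4, 2, 5

# Feudal commander: a linear map from the global state to latent z.
W_cmd = rng.normal(scale=0.1, size=(LATENT_DIM, N_AGENTS * OBS_DIM))
# Per-agent policy weights over actions, conditioned on (local obs, shared z).
W_pi = rng.normal(scale=0.1, size=(N_AGENTS, N_ACTIONS, OBS_DIM + LATENT_DIM))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def act(global_state):
    """Commander emits z; each agent samples an action from pi(a | obs_i, z).

    Returns the latent z, the joint action, and the joint log-probability,
    which a policy-gradient update would scale by the return so that the
    gradient flows through both W_pi and W_cmd end-to-end.
    """
    z = np.tanh(W_cmd @ global_state.ravel())
    actions, joint_logp = [], 0.0
    for i in range(N_AGENTS):
        inp = np.concatenate([global_state[i], z])
        probs = softmax(W_pi[i] @ inp)
        a = rng.choice(N_ACTIONS, p=probs)
        actions.append(int(a))
        joint_logp += np.log(probs[a])
    return z, actions, joint_logp

state = rng.normal(size=(N_AGENTS, OBS_DIM))
z, actions, joint_logp = act(state)
```

Because every agent sees the same z, a single latent sample biases all agents toward a compatible joint behavior, which is the coordination effect the abstract describes; independent per-agent noise would instead explore the joint action space uniformly.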

Keywords:
Reinforcement learning; Feudalism; Computer science; Artificial intelligence; Machine learning

Metrics

Cited By: 24
FWCI (Field-Weighted Citation Impact): 4.50
References: 63
Citation Normalized Percentile: 0.93 (in top 10%)


Topics

Reinforcement Learning in Robotics
Physical Sciences → Computer Science → Artificial Intelligence
Evolutionary Algorithms and Applications
Physical Sciences → Computer Science → Artificial Intelligence
Evolutionary Game Theory and Cooperation
Social Sciences → Social Sciences → Sociology and Political Science