JOURNAL ARTICLE

Trust and communication in human-machine teaming

Abstract

Intelligent highly automated systems (HASs) are increasingly being created and deployed at scale across a broad range of purposes and operational environments. In uncertain or safety-critical environments, HASs are frequently designed to cooperate seamlessly with humans, thus forming human-machine teams (HMTs) to achieve collective goals. Trust plays an important role in this dynamic: humans need to be able to develop an appropriate level of trust in their HAS teammate(s) to form an HMT capable of safely and effectively working towards goal completion. Using Autonomous Ground Vehicles (AGVs) as an example of an HAS used in dynamic social contexts, we explore interdependent teaming and communication between humans and AGVs in different contexts and examine the role of trust and communication in these teams. Drawing on lessons from the AGV example for the design of HASs used in HMTs more broadly, we argue that trust is experienced and built differently in different contexts, necessitating context-specific approaches to designing for trust in such systems.

Keywords:
Interdependence, Computer science, Context, Common ground, Human–computer interaction, Knowledge management, Process management, Engineering, Psychology, Social psychology, Sociology

Metrics

Cited by: 9
FWCI (Field-Weighted Citation Impact): 2.55
References: 23
Citation Normalized Percentile: 0.86

Topics

Human-Automation Interaction and Safety (Social Sciences → Psychology → Social Psychology)
Ethics and Social Impacts of AI (Social Sciences → Safety Research)