Trust in Adaptive Automation: The Role of Etiquette in Tuning Trust via Analogic and Affective Methods
Keywords: trust, adaptive automation, trust tuning methods, etiquette behaviors
Abstract: In this paper, we begin by discussing definitions of trust and settle on the one provided by Lee and See (2004). This definition emphasizes the nature of trust as an attitude toward the uncertain future actions of an agent. We discuss some important implications of this definition for adaptive automation systems, including that (a) trust is not synonymous with user acceptance; in fact, trust should be tuned so that operators make accurate usage decisions, (b) trust becomes more important with complex, adaptive automation precisely because it becomes less plausible for a human operator to fully understand what the automation will do, and how, in all contexts, and (c) as an attitude, trust is produced and affected by methods other than rational cognition. In fact, Lee and See provide a model with three methods of tuning trust (analytic, analogic, and affective), with special emphasis on their roles in tuning trust in adaptive automation. Of these, we argue that the latter two will be more important in human interaction with adaptive automation than they are with traditional automation. We define and discuss a method of tuning analogic and affective trust: the “etiquette” of human interaction with automation. We provide examples from two recent projects, one involving a laboratory experiment and the other involving human interaction with automation in a realistic full-mission simulation, which illustrate trust effects produced by etiquette-based manipulations of design and system behavior.