CISER's Role in Research
Develop and apply intelligent systems to a range of Navy mission areas, including intelligence, logistics, cyber, test and evaluation, force health and personnel data, and operational decision making.
Offer a core research program to engage faculty and students in investigating questions of high interest to the group’s sponsors.

Observational Oversight for Understanding Trust in Interactive Human and AI Systems
Trust in artificial intelligence is an important field of research influencing developments such as autonomous cars and aircraft, the employment of drone swarms, and military decision aid tools. Understanding trust between humans and artificially intelligent systems will help accelerate decision-making cycles, inform better system design, and avoid both automation bias and under-utilization of intelligent systems. Using the Hoff and Bashir model of trust in automation as a theoretical foundation, we create a three-dimensional interface for a mobile network control system (NCS) composed of unmanned vehicles (UxVs). In support of a ground team element, the NCS uses adaptive submodularity to configure a topology of network nodes that maximizes communications, sensor coverage, and network robustness. Users interact with the NCS through a virtual reality (VR) interface, suggesting UxV positions that are then evaluated by adaptive submodularity. The VR system captures user interaction events, user positions, and the total utility of the NCS as reported by adaptive submodularity. These measurements may be analyzed post-event for potential indicators of trust development, how quickly the user learned to use the system, and whether the user's ability to work with the NCS to accomplish a goal improved over time.
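The abstract does not specify the utility function or the exact optimization routine the NCS uses, so the following is only a minimal sketch of the general idea: greedy placement of UxV network nodes to maximize a monotone submodular coverage-style utility under a cardinality budget. The positions, the communications radius, and the functions utility and greedy_place are hypothetical placeholders, not the thesis implementation, and the sketch omits the adaptive aspect (re-planning after observing outcomes or user-suggested positions).

import math

# Hypothetical ground-team positions and candidate UxV positions in a 2-D plane.
GROUND_TEAM = [(0.0, 0.0), (3.0, 1.0)]
CANDIDATES = [(float(x), float(y)) for x in range(-2, 6) for y in range(-2, 6)]
COMM_RANGE = 2.5  # assumed communications radius of a single UxV node


def utility(nodes):
    """Hypothetical monotone submodular utility: count of ground-team members
    and grid cells lying within COMM_RANGE of at least one placed node."""
    covered = 0
    for target in GROUND_TEAM + CANDIDATES:
        if any(math.dist(target, node) <= COMM_RANGE for node in nodes):
            covered += 1
    return covered


def greedy_place(budget):
    """Repeatedly add the candidate with the largest marginal utility gain,
    the standard (1 - 1/e)-approximation strategy for maximizing a monotone
    submodular function under a cardinality constraint."""
    placed = []
    for _ in range(budget):
        base = utility(placed)
        best, best_gain = None, 0
        for cand in CANDIDATES:
            if cand in placed:
                continue
            gain = utility(placed + [cand]) - base
            if gain > best_gain:
                best, best_gain = cand, gain
        if best is None:  # no remaining candidate improves utility
            break
        placed.append(best)
    return placed


if __name__ == "__main__":
    topology = greedy_place(budget=4)
    print("Placed nodes:", topology)
    print("Total utility:", utility(topology))

In the interactive setting described above, a user-suggested UxV position could be scored by the same marginal-gain computation and either accepted or countered by the planner; the reported total utility is what the VR system logs alongside user interaction events for post-event analysis.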