Adapted trust game

Contributed by: YangLu
Characteristics:

Trust Type: Value-based; Interaction: Collaboration; Stage: Future/Perceived; Risk: Financial; System: Virtual; Test Environment: ITL-Online; Measurement: Behavioural; Pattern: Investment/Trust game;

Description

The participant completes a trading task with an artificial assistant (Assisto) to reduce the number of times they will have to carry out a subsequent image classification task. The use case is a variation on the investment/trust game. Participants begin with 10 points and are told that every point deducts 1 image from the total they will have to classify. They can give points to Assisto; however many they give are tripled. Assisto then gives back anywhere between 0 points and its total (up to 40, depending on how many points the participant initially gave). This determines the final point score and therefore the number of classifications the participant must carry out in the next task.
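A minimal sketch of the scoring logic described above, for illustration only: the names are hypothetical, and Assisto's own 10-point endowment is inferred from the "up to 40" figure rather than stated explicitly in this entry.

    PARTICIPANT_START = 10   # participant's initial points
    ASSISTO_START = 10       # assumption: Assisto's own endowment, inferred from the "up to 40" ceiling

    def round_outcome(invested: int, returned: int) -> int:
        """Participant's final point total after one exchange with Assisto."""
        assert 0 <= invested <= PARTICIPANT_START
        assisto_total = ASSISTO_START + 3 * invested   # the investment is tripled
        assert 0 <= returned <= assisto_total          # Assisto may return any share of its total
        return (PARTICIPANT_START - invested) + returned

    # Example: investing all 10 points and receiving 20 back leaves 20 points,
    # i.e. 20 fewer images to classify in the follow-up task.
    print(round_outcome(invested=10, returned=20))   # -> 20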

Commentary

The paper actively seeks a use case that can be applied across different platforms to test trust levels as a substitute for self-reported assessment. Risk is 'financial' in nature, since the participant pays in time if the investment does not pay off: a lower final point score means more images to classify. A preliminary questionnaire is used to check that participants understand the task, not to test for trust.

Original purpose

To evaluate the use case itself as a transferable measure of trust that can be used to predict participant trust levels in subsequent human-agent collaboration.

RRI issues

None.

Source

Herse, S., Vitale, J., Johnston, B., & Williams, M. A. (2021, March). Using trust to determine user decision making & task outcome during a human-agent collaborative task. In Proceedings of the 2021 ACM/IEEE International Conference on Human-Robot Interaction (pp. 73-82).
