| Affiliations: | College of Sciences |
| Team Leader: | |
| Faculty Mentor: | Mustapha Mouloua, PhD |
| Team Size: | 7 |
| Open Spots: | 0 |
| Team Member Qualifications: | We seek research assistants who demonstrate professionalism, attention to detail, and strong interest in human subjects research, human–AI interaction, and cognitive or simulation-based research. Preferred applicants will have experience or coursework in psychology, cognitive science, human factors, modeling and simulation, computer science, or a related STEM field. Familiarity with IRB protocols, experimental software (e.g., E-Prime, Qualtrics), and basic research methods and data handling is beneficial. Team members must be able to follow standardized protocols precisely, communicate professionally with participants, maintain confidentiality, troubleshoot basic technical issues, and support accurate data collection. We value reliability, strong organizational skills, and interest in contributing to future research involving AI systems, cognitive performance, and advanced simulation technologies. Required Qualifications: You must complete Human Research (Curriculum Group), Human Subjects Research - Group 2. Social / Behavioral Research Investigators and Key Personnel (Course Learner Group), 1 - Basic Course (Stage), as well as Human Research (Curriculum Group), Research and HIPAA Privacy Protections (Course Learner Group), 1 - Basic Course (Stage). |
| Description: | This study investigates how different levels and styles of artificial intelligence (AI) explanations affect human cognitive workload, perceived autonomy, and decision-making performance. Participants complete a working memory task followed by a computer-based medical triage decision task in which an AI system provides recommendations at varying levels of explanation transparency (low, optimal, high, or adaptive). The study measures working memory capacity, decision accuracy, reaction time, perceived autonomy, cognitive workload, and user engagement with AI systems. The goal is to understand how AI explanation design influences human performance and experience, and to support the development of AI systems that provide useful information without overwhelming users. |