Author:
Joshi, Siddhant Sanjay; Davis, Kirsten; Czerwionka, Lori; Troncoso, Elisa Camps; Montalvo, Francisco J.
Source:
Proceedings of the ASEE Annual Conference & Exposition; 2022, p1-18, 18p
Abstract:
Engineers face complex and multidisciplinary problems in the modern work environment. To understand and solve these complex problems, engineers require systems thinking skills that allow them to consider the interconnected technical and contextual factors. Therefore, it is important to provide engineering students with opportunities to develop these skills during their education. Part of this process is developing assessment approaches that can help instructors measure students' systems thinking ability. A variety of approaches have been used in the literature to assess the development of systems thinking, including surveys, interviews, design projects, and scenario-based instruments. Scenario-based assessments can offer a more in-depth view of student learning than typical surveys while also being faster to analyze than open-ended data such as interviews. However, many of the available scenario-based assessments claim to assess similar skills, making it challenging to identify which one fits the needs of a particular educational context. To help address this challenge, we compared two scenario-based assessments, the Village of Abeesee scenario [1] and the Energy Conversion Playground (ECP) design task [2], to understand the concepts of systems thinking emphasized by each instrument and how students' scores on the assessments are related. The participants in this study were 19 undergraduate engineering students enrolled in an interdisciplinary humanities-based engineering course in Spring 2021. We administered both scenario-based assessments at the start and end of the semester to examine the change in students' scores over time. We then compared the assessment results from each instrument by examining average scores for each of the systems thinking dimensions as well as individual total scores on each assessment. Lastly, we compared the experience of scoring the assessments from the perspective of the instructor or researcher using them. Based on our findings, we make recommendations about when an instructor might choose to use one assessment or the other. Our results can inform future research and assessment projects that aim to assess students' systems thinking skills by comparing both student outcomes and instructor experience for these scenario-based assessments. [ABSTRACT FROM AUTHOR]
Database:
Complementary Index