
Article Reviews for Human-Centred Design and Student Facing Dashboard

By Hyeyung Park April 14, 2024 - 11:47am

My research area is interdisciplinary, combining theory and practice. Co-designing a dashboard is interesting, but I was concerned about the convoluted design process. I searched for a practical, easy-to-follow dashboard design method. However, I found no streamlined process for creating human-centred dashboards. Even researchers with experience designing dashboards could not present a tangible design process. I read two articles today, hoping for valuable guidelines for my research. Sadly, the vague dashboard design processes they described disappointed me again.

Article Information

Ahn, J., Campos, F., Hays, M., & DiGiacomo, D. (2019). Designing in context: Reaching beyond usability in learning analytics dashboard design. Journal of Learning Analytics, 6(2), 70–85. https://doi.org/10.18608/jla.2019.62.5

Ahn et al. (2019) explored human-centred design (HCD) methods, together with contextual design and design tensions, when developing visual analytics systems for educators within the context of improvement science and research–practice partnerships. The article used a design narrative as its methodological approach, describing the history and evolution of a design over time: it tells the story of the design process by documenting the experience of developing dashboards to support middle school mathematics teachers' pedagogical practices. The article illuminated how adopting these design methods fundamentally influences the design choices and focal questions undertaken, and it aimed to inform improvement goals directly through district partners' appropriation and repurposing of tools. I selected and read this article to find a practical framework for designing a dashboard, because no research articulates a tangible HCD process. Disappointingly, this article also failed to present a clear guide to applying HCD to dashboard design. To save time and effort later, key concepts from the article are listed below:

Human-centred design (HCD) is a design approach that focuses on understanding users, their environment, and their needs as central to technology solutions development. HCD involves users throughout the design and development process to ensure the resulting tools are usable and meet user requirements.

Contextual design analyzes users' work processes and systems to create solutions that fit seamlessly into their existing practices.

Design tensions are a critical consideration in the design process. They are trade-offs or balancing acts between design goals, user needs, and constraints. Identifying and managing these tensions is crucial for developing effective and impactful learning analytics tools.

Improvement science (IS) is a research approach that aims to improve educational practices through iterative, evidence-based interventions. The article links IS to learning analytics design by suggesting that LA tools should be developed to support continuous improvement in educational contexts.

Research-practice partnerships (RPPs) are partnerships between researchers and practitioners to solve practical educational problems. The article emphasizes the value of RPPs in informing the design process of LA tools, ensuring that the tools are relevant and usable in natural educational settings.

Learning dashboards are visual analytics tools that present data about learners’ activities, progress, and achievements to support educators in making informed decisions. The article focuses on the design of these dashboards. It discusses how to make them more effective by incorporating HCD methods, addressing design tensions, and aligning them with IS goals and RPP collaborative frameworks.

Sensemaking is the process by which people interpret data. In the context of learning analytics dashboards, sensemaking is crucial for teachers and educators to understand and use the data presented to improve teaching practices and student learning outcomes.

Article Information

Reid, D., & Drysdale, T. (2024). Student-facing learning analytics dashboard for remote lab practical work. IEEE Transactions on Learning Technologies, 17, 1037–1050. https://doi.org/10.1109/TLT.2024.3354128

The main research question of the article centers on evaluating the effectiveness of a student-facing learning analytics (SFLA) dashboard in supporting students' completion of remote lab activities. The article finds that the SFLA dashboard is an effective means of formative assessment during remote laboratory activities: it helps students understand and complete their tasks, and students themselves recognized it as a valuable aid. This suggests that the dashboard not only contributes to better academic outcomes but is also well received by learners. There is a dearth of articles on student-facing dashboards. The takeaways from this article were that SFLA dashboards improved online learning outcomes and increased completion rates. This study confirms that dashboards are effective for learners when incorporated well into the learning design.

For future reference, key concepts from this article are listed below:

A student-facing learning analytics (SFLA) dashboard is a digital tool that gives students real-time performance feedback during remote laboratory tasks. The dashboard utilizes data analytics to generate visualizations of student actions and compare them with expected task procedures, aiming to support students in identifying and bridging gaps in their understanding or execution.
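
To make this concrete for myself, here is a minimal sketch (my own illustration, not the authors' implementation) of the kind of comparison such a dashboard might perform behind the scenes: matching a student's logged actions against an expected task procedure and summarizing which steps appear complete. The step names, log format, and function names are all hypothetical assumptions.

```python
# Illustrative sketch only: compare logged student actions against an
# expected remote-lab procedure and report which steps appear complete.
# Step names and the action-log format are hypothetical, not taken from
# Reid & Drysdale (2024).

from dataclasses import dataclass


@dataclass
class StepFeedback:
    step: str
    completed: bool


def assess_progress(expected_steps: list[str], student_actions: list[str]) -> list[StepFeedback]:
    """Mark each expected step as completed if the student performed it."""
    performed = set(student_actions)
    return [StepFeedback(step, step in performed) for step in expected_steps]


if __name__ == "__main__":
    # Hypothetical expected procedure for a remote pendulum lab.
    expected = ["calibrate_sensor", "set_amplitude", "record_oscillation", "export_data"]
    # Hypothetical action log streamed from the remote lab apparatus.
    actions = ["calibrate_sensor", "record_oscillation"]

    for feedback in assess_progress(expected, actions):
        status = "done" if feedback.completed else "missing"
        print(f"{feedback.step}: {status}")
```

A real SFLA dashboard would render this comparison as visualizations rather than printed text, but the underlying idea of checking observed actions against an expected procedure is what gives students the gap-bridging feedback described above.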

Formative assessment is a core educational concept that informs the SFLA dashboard. It involves providing continuous, constructive feedback to students about their learning progress, helping them gauge their current understanding and see what they need to do to improve. The dashboard serves as a platform for delivering this type of assessment in a remote learning context.