Multimodal data to design visual learning analytics for understanding regulation of learning

Omid Noroozi*, Iman Alikhani, Sanna Järvelä, Paul A. Kirschner, Ilkka Juuso, Tapio Seppänen

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

55 Citations (Scopus)


The increased interest in multimodal data collection in the learning sciences demands new and powerful methodological and analytical techniques and technologies. It is especially challenging for learning scientists to handle, analyse, and interpret complex multimodal data when investigating regulation of learning in collaborative settings, as this data can be cognitive, social, and/or emotional in nature, and much of it is covert. The aim of this paper is to present ways to simplify the analysis and use of rich multimodal data by learning scientists. This is done by making primarily invisible regulation processes, and their accompanying social and contextual reactions, visible, measurable, and ultimately interpretable. To facilitate data visualisation and processing with respect to the regulation of learning, a Graphical User Interface (GUI) known as SLAM-KIT has been designed. SLAM-KIT reveals principal features of complex learning environments by allowing users to navigate through learners' data and its statistical characteristics. This kit has practical implications, as it simplifies complex information and data while making them available to researchers through visualisation and analysis. Our short-term goal is to simplify this tool for teachers and learners.

Original language: English
Pages (from-to): 298-304
Number of pages: 7
Journal: Computers in Human Behavior
Publication status: Published - Nov 2019


  • Collaborative learning
  • Learning
  • Learning analytics
  • Multimodality
  • Regulation of learning
