yarn-sense

This GitHub page describes the Teamwork Analytics organisation's repositories, which contain code for data processing, cleaning, analysis and visualisation.


Welcome to the yarn-sense project for Teamwork Analytics

This research project aims to provide evidence for teachers and students to reflect upon their face-to-face (f2f) teamwork activity. The code described on this page belongs to a multimodal learning analytics (MMLA) architecture implemented to automatically generate MMLA user interfaces that support reflection practices.

These are some examples of MMLA interfaces:

Repositories

This GitHub project explains the code used to generate multimodal learning analytics interfaces to support teamwork activity. Different data modalities are normally collected in an in-the-wild study, and each modality is independently pre-processed.
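As a rough illustration of this independent per-modality pre-processing, the sketch below uses hypothetical cleaning steps and function names (the real scripts live in the repositories listed below); it only shows the pattern of routing each modality's raw data through its own pipeline.

```python
# Toy sketch: each modality gets its own, independent pre-processing step.
# The cleaning rules here (silence removal, coordinate rounding) are
# illustrative assumptions, not the project's actual processing.

def preprocess_audio(samples):
    """Drop silent frames (amplitude 0) as a toy cleaning step."""
    return [s for s in samples if s != 0]

def preprocess_positions(points):
    """Round raw (x, y) tracker coordinates to one decimal place."""
    return [(round(x, 1), round(y, 1)) for x, y in points]

# Registry mapping each modality name to its pipeline.
PIPELINES = {
    "audio": preprocess_audio,
    "positions": preprocess_positions,
}

def preprocess_all(raw):
    """Apply each modality's own pipeline to its own raw data."""
    return {name: PIPELINES[name](data) for name, data in raw.items()}

raw = {
    "audio": [0, 3, 0, 7],
    "positions": [(1.234, 5.678)],
}
clean = preprocess_all(raw)
```

Keeping the pipelines independent like this means a change to one modality's pre-processing cannot affect another's.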

  1. Obs-tools repo: the observation tool and UI. With this application, 1) observations can be captured and 2) different proxemics visualisations (e.g. proxemics ego-networks and full networks) can be automatically generated. Modalities: Epistemic, Actions
  2. multimodal-audio repo: Modalities: Audio and Video
  3. MultimodalData repo: contains all the scripts used for data collection, storage and pre-processing, organised per modality.
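To give an intuition for the proxemics ego-networks that Obs-tools generates, here is a minimal sketch under assumed inputs: pairwise distances between participants and a hypothetical proximity threshold. The participant names, threshold value and `ego_network` helper are all illustrative, not part of the actual tool.

```python
# Toy sketch of deriving a proxemics ego-network from pairwise distances.
# THRESHOLD and the sample data are assumptions for illustration only.

THRESHOLD = 1.5  # metres; closer than this counts as a proxemic tie (assumed)

def ego_network(focal, distances):
    """Return the set of people within THRESHOLD metres of the focal person."""
    return {other
            for (a, b), d in distances.items()
            if d < THRESHOLD and focal in (a, b)
            for other in (a, b) if other != focal}

# Hypothetical pairwise distances (metres) at one moment of the activity.
distances = {
    ("nurse1", "nurse2"): 0.8,
    ("nurse1", "doctor"): 2.4,
    ("nurse2", "doctor"): 1.1,
}
```

An ego-network shows only one person's proxemic ties, while a full network would include every pair below the threshold.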

Architecture

The architecture for implementing the MMLA solution is described in the reference implementation in:

A full explanation of the architecture is presented in the paper cited above. A brief explanation of the high-level architecture is presented here.

Additionally, a Reference Implementation of the architecture can be consulted here.

License

All the products [code] described in this project are licensed under the Creative Commons CC0 1.0 Universal licence. Please read the details here.

Want to contribute or use the tool?

We encourage you to contact us if you would like to collaborate: reach out via the support contact and we can work out how best to do so.