yarn-sense

This GitHub page describes the repositories of the Teamwork Analytics organisation. The repositories contain code for data processing, cleaning, analysis and visualisation.


Reference Implementation

The high-level architecture described above was implemented in a real scenario comprising 40 sessions, in which data was collected from different teams.

The implementation diagram describes how each component of the architecture was used to automatically generate the MMLA interfaces.


Observation tool:

A, K and Observations were developed using a Node.js Express framework with Angular as the front end. The source code for this repository can be found here.

Multimodal data collection

The multimodal data collection for our implementation consists of different applications, one per modality. The reference implementation presented here explains how the indoor positioning data was collected and stored. In this repository you will find all the scripts that we used for the reference implementation.

A. JavaScript script
B. Python script

A. Python implementation

The starting point of the application is ProximityLocalisation.py. This script reads the raw data into a DataFrame in Python.
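
As a rough illustration of this first step, the raw export could be loaded as follows. The file path, the CSV format and the helper name are assumptions for this sketch, not details taken from ProximityLocalisation.py itself:

```python
import pandas as pd

# Assumed location and format of the raw indoor-positioning export:
# a CSV file with one row per tracker reading.
RAW_DATA_PATH = "data/raw_positioning.csv"  # hypothetical path


def read_raw_positions(path: str = RAW_DATA_PATH) -> pd.DataFrame:
    """Load the raw positioning data into a pandas DataFrame."""
    return pd.read_csv(path)


if __name__ == "__main__":
    raw_df = read_raw_positions()
    print(raw_df.head())
```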

Once the application reads the data, it must be formatted. The Python script formatingDataSetProximity.py is the first formatting process that we run on the raw positioning data. In this way, we guarantee that the attributes trackerId, x, y, rotation and sessionId are in the right format and normalised.
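
A minimal sketch of such a formatting pass is shown below; the concrete rules in formatingDataSetProximity.py, including what "normalised" means exactly, may differ:

```python
import pandas as pd

EXPECTED_COLUMNS = ["trackerId", "x", "y", "rotation", "sessionId"]


def format_proximity_data(raw_df: pd.DataFrame) -> pd.DataFrame:
    """Illustrative formatting pass: keep only the expected attributes,
    coerce them to consistent types and drop unreadable rows."""
    df = raw_df[EXPECTED_COLUMNS].copy()

    # Identifiers as strings, coordinates and rotation as floats.
    df["trackerId"] = df["trackerId"].astype(str)
    df["sessionId"] = df["sessionId"].astype(str)
    for col in ("x", "y", "rotation"):
        df[col] = pd.to_numeric(df[col], errors="coerce")

    # Discard rows whose coordinates could not be parsed.
    df = df.dropna(subset=["x", "y"])

    # Keep rotation inside a single 0-360 degree range
    # (one possible reading of "normalised").
    df["rotation"] = df["rotation"] % 360

    return df
```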

Please get in touch if you are interested in developing or contributing applications for other modalities.

Multimodal modelling

The scripts used for the multimodal modelling are:

  1. The Python script enumerateTrackersProximity.py is used to contextualise the data and replace the trackerId with the role that students performed during the simulation.
  2. The Python script distancesProximity.py is in charge of computing, second by second, how far apart the nurses are from each other. In this way, the interpersonal proxemics can be calculated and included in a matrix (see the sketch after this list). Interpersonal distances can be: intimate, personal, social and public.
  3. Finally, the script visualisationProximity.py visualises the outcomes in the form of a bar chart, an ego-network or a full-graph network. This script also generates specific narratives for the bar charts, which are included as part of the visualisation.
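
The sketch below illustrates how steps 1 and 2 could be combined: trackers are mapped to roles and the pairwise distances are classified into proxemic zones. The tracker-to-role mapping, the per-second timestamp column, the metre unit and the zone thresholds (approximate Hall proxemics) are assumptions of this sketch, not values taken from the repository scripts:

```python
import itertools
import math

import pandas as pd

# Hypothetical mapping from trackerId to the role a student performed;
# the real mapping lives in enumerateTrackersProximity.py.
TRACKER_ROLES = {"tracker-01": "nurse-1", "tracker-02": "nurse-2"}


def proxemic_zone(distance_m: float) -> str:
    """Classify a distance (assumed to be in metres) into a proxemic zone,
    using approximate Hall thresholds."""
    if distance_m < 0.45:
        return "intimate"
    if distance_m < 1.2:
        return "personal"
    if distance_m < 3.6:
        return "social"
    return "public"


def interpersonal_distances(df: pd.DataFrame) -> pd.DataFrame:
    """For every second, compute the distance between each pair of roles
    and label it with a proxemic zone (sketch of distancesProximity.py)."""
    df = df.assign(role=df["trackerId"].map(TRACKER_ROLES))
    rows = []
    # Assumes the formatted data carries a per-second "timestamp" column.
    for ts, frame in df.groupby("timestamp"):
        frame = frame.dropna(subset=["role"]).drop_duplicates(subset="role")
        positions = frame.set_index("role")[["x", "y"]]
        for a, b in itertools.combinations(positions.index, 2):
            d = math.dist(positions.loc[a], positions.loc[b])
            rows.append({"timestamp": ts, "roleA": a, "roleB": b,
                         "distance": d, "zone": proxemic_zone(d)})
    return pd.DataFrame(rows)
```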

Data Visualisation

To visualise the outcomes from the multimodal modelling process we use two different technologies: vis.js and Angular.
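
As an illustration of how the modelled proximity network could be handed to a vis.js Network component, the sketch below aggregates the proxemic matrix into the nodes/edges JSON structure that vis.js expects. The aggregation rule (seconds spent within social distance or closer) and the output file name are assumptions of this sketch:

```python
import json

import pandas as pd


def export_visjs_network(distances: pd.DataFrame,
                         out_path: str = "network.json") -> None:
    """Aggregate the per-second proxemic matrix into a weighted network
    and write it as nodes/edges JSON for a vis.js Network."""
    # Count how many seconds each pair spent within social distance or closer.
    close = distances[distances["zone"].isin(["intimate", "personal", "social"])]
    weights = close.groupby(["roleA", "roleB"]).size().reset_index(name="seconds")

    roles = sorted(set(weights["roleA"]) | set(weights["roleB"]))
    nodes = [{"id": role, "label": role} for role in roles]
    edges = [{"from": r.roleA, "to": r.roleB, "value": int(r.seconds)}
             for r in weights.itertuples()]

    with open(out_path, "w") as fh:
        json.dump({"nodes": nodes, "edges": edges}, fh, indent=2)
```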

Layered Storytelling