Many of the most important functions in society are undertaken by large groups or teams. Emergency response, product development, health care, education, and economic activity are pursued in the context of large, dynamic, interacting networks of groups. Theory and research on such networks of groups are much less developed than research on isolated small groups or formal organizations. A major challenge for research on networks of groups is the difficulty of collecting and analyzing the huge bodies of high-resolution, high-volume observational data necessary to study these large, dynamic networks. The goal of this project is to address this challenge by applying advanced computing applications to capture, manage, annotate, and analyze these massive observational sets of video, audio, and other data. The resulting data analysis system, GroupScope, will enable breakthrough research into social interaction in large, dynamic groups to be conducted much more quickly and with much higher reliability than was previously possible. It will do this by automating as many functions as possible to the highest degree possible: managing huge volumes of video, audio, and sensor data; transcription; parsing audio for critical discourse events; annotation and indexing of video streams; and coding interaction. These first-pass analyses can then be supplemented by human analysts, whose analyses will in turn feed into machine learning that improves the computerized analysis.
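The human-in-the-loop cycle described above — an automated first pass over the data stream, human corrections of that pass, and corrections feeding back into the machine analysis — can be sketched as follows. This is only a minimal illustrative sketch: the class names, the amplitude-threshold detector, and the update rule are hypothetical stand-ins, not GroupScope's actual components or API.

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    """One labeled event in a time-indexed data stream."""
    timestamp: float
    label: str
    source: str  # "machine" or "human"

@dataclass
class ToyEventDetector:
    """Hypothetical stand-in for the automated audio/video analysis:
    flags any sample whose signal value meets a learned threshold
    as a 'critical' event; everything else is 'routine'."""
    threshold: float = 0.8

    def first_pass(self, stream):
        # Automated first-pass coding of the whole stream.
        return [
            Annotation(t, "critical" if v >= self.threshold else "routine", "machine")
            for t, v in stream
        ]

    def learn_from(self, corrections):
        # Toy "machine learning" step: lower the threshold to the
        # smallest signal value a human analyst labeled critical,
        # so similar events are caught on the next pass.
        critical = [v for v, label in corrections if label == "critical"]
        if critical:
            self.threshold = min(self.threshold, min(critical))

detector = ToyEventDetector()
stream = [(0.0, 0.2), (1.0, 0.9), (2.0, 0.6), (3.0, 0.1)]
machine_pass = detector.first_pass(stream)   # only the 0.9 sample is "critical"

# A human analyst reviews the output and relabels the 0.6 sample as critical;
# the correction is fed back and the detector adapts.
detector.learn_from([(0.6, "critical")])
second_pass = detector.first_pass(stream)    # now the 0.6 sample is also "critical"
```

The design point the sketch illustrates is that machine and human codings share one annotation format, so corrections can flow back into training without a separate data path.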
GroupScope will be developed in collaboration with social scientists studying emergency response teams, children’s playground behavior, distributed teams, and product development teams. Once developed, GroupScope will be deployed in a cyberenvironment, a Web 2.0-based cyberinfrastructure that enables a community of researchers to collaborate on common problems. The cyberenvironment will enable multiple researchers to analyze and code the same group data, for both small groups and large dynamic groups and networks. Multiple analyses and codings from diverse perspectives will enable the discovery of previously unsuspected relationships among different levels and layers of human interaction. These analyses can also be linked to participants’ survey responses, connecting observed behavior to the realm of perceptions and traits.
Many of the most fundamental advances in science have come through the development of new instruments, such as more powerful telescopes or microscopes that allow scientists to view molecules. In the same way, GroupScope will shed light on the workings of critical functions performed by real-world groups such as emergency response units, health care teams, stock exchanges, and military units. GroupScope will also have applications in training those who work in multi-team systems, such as first responders to disasters: it can be used to record and “grade” training sessions, giving participants feedback on both the strengths and weaknesses of their approaches.
Noshir Contractor is the PI of this project, along with co-PIs Marshall Scott Poole, David Forsyth, and Mark Hasegawa-Johnson of the University of Illinois Urbana-Champaign and Feniosky Peña-Mora of Columbia University. Peter Bajcsy, Associate Director of Image Analysis for the Institute for Computing in the Humanities, Arts, and Social Sciences (I-CHASS); Alex Yahja, Assistant Director of Modeling for I-CHASS; and Dorothy Espelage, School of Education, are research scientists on this project. The project uses an experimental platform designed by Leslie DeChurch and Dan Doty at the University of Central Florida. SONIC is collaborating with I-CHASS and the National Center for Supercomputing Applications at the University of Illinois Urbana-Champaign on this project.
This project is funded by a grant from the National Science Foundation (award number BCS-0940851).