Stress and productivity patterns of interrupted, synergistic, and antagonistic office activities

Data reconstruction of workers’ stress, distractions, and productivity in the knowledge economy
The final design of this study was almost a year in the making (fall 2017 to spring 2018). In its final form, it was a comprehensive microcosm of everything happening in a modern office staffed by knowledge workers. Creative writing, email disruptions, and presentations to critical reviewers (emulating managers) were all accounted for in the experimental protocol. The sensing instruments, the data acquisition methods, and the software tools developed by our team were state of the art. They enabled us to quantify the participants' stress levels, emotions, and intellectual output moment by moment as the protocol unfolded. It was truly a God's eye view into knowledge workers, the quintessential workforce of the 21st century.

The research was part of the collaborative work among my lab, Gloria's group at the University of California, Irvine, and Ricardo's lab at Texas A&M. We decided to run the experiment at all three sites, establishing identical set-ups, so that PhD students across the three groups would benefit equally from the research experience.

During the summer of 2018, when the experiment was under way, 21st century American research was on full display. The team had over a dozen people from different ethnic backgrounds and with different expertise. It included Americans, Canadians, Indians, Bangladeshis, Brazilians, Arabs, and Chinese who specialized in affective computing, human-computer interaction, psychology, data analytics, and software engineering. All the PhD students across the three labs were newly minted, which added to the challenge. It is nothing short of a miracle how all these disparate units came together and pulled this challenging project off.

In late summer 2018, as the experiment was winding down, we felt that we had something good, worth dashing for that year's CHI deadline. Because we did not have time to fully curate and validate the entire dataset, we focused on a specific data panel that dealt with different forms of distractions. The result was a repudiation of long-held, one-size-fits-all ergonomic edicts about email handling. For example, for people with neurotic tendencies email disruptions were a blessing, while for people low in neuroticism they were a menace. In November, we heard back from the CHI reviewers, and it looked like the paper was 'a go'.

We realized that we had just scratched the surface, yet we already had a significant discovery and a major win. It was imperative, we thought, to curate and validate the full dataset, so that the scientific community at large would have the opportunity to delve into this treasure trove. Our initial estimate was that we would be done by February 2019. As is usually the case, we were overoptimistic. Because of the multimodal and comprehensive nature of the data, we had to devise new methods to perform quality control and cross-validation. Many of the data channels we employed had not been used before within such an integrated framework. Even where individual channels had been used, there were no accounts in the literature of how to properly capitalize on their synergy. It was clear that constructing multimodal affective computing datasets was a science all by itself, to which the research community had not paid enough attention. We decided to change this, and I believe we did, by expending significant time, effort, and money. The manuscript and the validated dataset were ready in summer 2019, almost a year after the conduct of the experiment. Scientific Data was the obvious choice for submitting our work. Highly favorable reviews came back a couple of months later, and here we are: the dataset paper went live on November 8, 2019.

More information about the project and visualizations can be found on our website: 
