Emotion analysis can be beneficial to researchers in decision making

Posted by Guest Blogger on Wed 13 Jan. 2016 - 3 minute read

By David Schindler

Introducing µCap

David Schindler is the laboratory manager of the interdisciplinary decision making laboratory MELESSA at the University of Munich, currently visiting the University of Pennsylvania.

Many researchers interested in human behavior have used Noldus FaceReader in the past to determine the emotional states of their subjects. By now, it has become something of a standard tool for analyzing emotions in small-scale psychological studies, for gaining insight into consumer behavior in marketing research, and in many other areas.

In need of larger sample sizes?

Very recently, experimental economists, as well as experimental psychologists who need larger sample sizes (up to 400 subjects), have taken an interest in analyzing the role of emotions in decision making. Those studies are typically conducted in computerized laboratories, where subjects, across several sessions, sit in small cubicles at separate computer workstations and, with the help of experimental software, make decisions or work on different kinds of tasks.

While FaceReader in general is capable of handling large numbers of participants, it has no built-in routines to link it to the standard experimental software packages that exist (in economics, the top dog is z-Tree [1], while psychologists use a diverse set of tools, with E-Prime and Qualtrics among the most common).

FaceReader and z-Tree

The recent interest in emotional responses as correlates of decision making in economics has sparked a series of papers that use FaceReader in connection with z-Tree, e.g. [2,3,4]. This increased interest in such studies has led us [5] to develop µCap (muCap), a software package that links video footage to phases of the experiment, making the footage suitable for automated analysis in FaceReader.

µCap

The idea behind µCap is very simple: the program constantly reads out a specific pixel in the top left corner of the screen. Whenever this pixel changes color, µCap creates a timestamp in a .csv file. This timestamp records the exact time (on the client computer) at which the color change happened and marks a point of interest to the researcher.
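As a rough illustration of this polling idea (this is not the actual µCap implementation; the pixel position, polling interval, output file name, and the use of Pillow are assumptions made for the sketch):

```python
# Sketch of the pixel-polling idea behind µCap (illustrative only, not the
# actual µCap source). Assumes Pillow is installed; the marker position,
# polling rate, and output file name are arbitrary choices.
import csv
import time
from datetime import datetime

from PIL import ImageGrab  # pip install pillow

MARKER_POS = (0, 0)            # screen coordinate of the watched pixel (top-left corner)
POLL_INTERVAL = 0.05           # seconds between checks
OUTPUT_FILE = "timestamps.csv" # hypothetical output name


def current_marker_color():
    """Grab a 1x1 screenshot at the marker position and return its RGB value."""
    x, y = MARKER_POS
    return ImageGrab.grab(bbox=(x, y, x + 1, y + 1)).getpixel((0, 0))


def watch_marker():
    """Write a timestamp row whenever the marker pixel changes color."""
    last_color = current_marker_color()
    with open(OUTPUT_FILE, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "color"])
        while True:
            color = current_marker_color()
            if color != last_color:
                # Local client time at which the color change was detected
                writer.writerow([datetime.now().isoformat(), color])
                f.flush()
                last_color = color
            time.sleep(POLL_INTERVAL)


if __name__ == "__main__":
    watch_marker()
```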

While our release version of µCap contains a sample file showing how this color change can be triggered in z-Tree, the tool is not limited to z-Tree. Any experimental software that allows the color of a specific screen area to be changed can be used to connect to FaceReader.

As an example, consider a decision-making experiment in which subjects choose between a risky option (e.g. receive either $5 or $0 with equal probability) and a safe option (e.g. receive $2) after having worked through several other decision problems. Just by looking at video footage of the participants’ faces, the experimenter cannot tell at which point in time subjects saw this particular decision problem. With the help of µCap, the experimenter can trigger a color change when the decision problem of interest is first displayed, record the exact time at which this happened, and link it to the footage.
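The release version ships a z-Tree example for this; as a generic stand-in, a sketch of the experiment-side marker in Python/Tkinter might look like the following (the marker size, colors, and screen layout are assumptions, not part of µCap):

```python
# Generic sketch of the experiment-side marker: a small patch in the top-left
# corner changes color when the decision problem of interest is shown.
# This Tkinter stand-in only illustrates the idea; µCap's own example uses z-Tree.
import tkinter as tk

MARKER_COLORS = {"baseline": "#000000", "risky_vs_safe": "#FF0000"}  # hypothetical codes


class ExperimentScreen:
    def __init__(self, root):
        self.root = root
        # Small patch in the top-left corner that µCap watches for color changes
        self.marker = tk.Canvas(root, width=10, height=10, highlightthickness=0,
                                bg=MARKER_COLORS["baseline"])
        self.marker.place(x=0, y=0)
        self.text = tk.Label(root, font=("Arial", 16))
        self.text.pack(expand=True)

    def show_decision_of_interest(self):
        """Display the risky-vs-safe choice and flip the marker color."""
        self.text.config(text="Option A: $5 or $0 with equal probability\n"
                              "Option B: $2 for sure")
        self.marker.config(bg=MARKER_COLORS["risky_vs_safe"])


if __name__ == "__main__":
    root = tk.Tk()
    root.attributes("-fullscreen", True)
    screen = ExperimentScreen(root)
    # Show the decision problem of interest after 3 seconds, for demonstration
    root.after(3000, screen.show_decision_of_interest)
    root.mainloop()
```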

µConfig, µCap, µProject

µCap consists of three independent tools that together allow fully automated recording and FaceReader analysis of a virtually unlimited number of subjects:

  • µConfig is a visual interface in which experimenters define the color codes at which timestamps should be recorded.
  • µCap, the main tool, records the videos and creates the timestamps.
  • µProject automatically creates a new project in FaceReader, so that the analysis can be started with a single click.
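The exact file formats used by these tools are not described here; as a rough, hedged illustration of the post-processing step (column names, the file name, and the recording start time are assumptions; the real µProject builds the FaceReader project itself), the timestamp file could be turned into offsets within the recorded video like this:

```python
# Rough illustration of mapping µCap-style timestamps onto the recorded video.
# Column names, file names, and the recording-start convention are assumptions.
import csv
from datetime import datetime

RECORDING_START = datetime.fromisoformat("2016-01-13T14:00:00")  # hypothetical session start


def timestamps_to_video_offsets(path="timestamps.csv"):
    """Return (seconds_into_video, color) pairs for each recorded color change."""
    offsets = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            event_time = datetime.fromisoformat(row["timestamp"])
            seconds = (event_time - RECORDING_START).total_seconds()
            offsets.append((seconds, row["color"]))
    return offsets


if __name__ == "__main__":
    for seconds, color in timestamps_to_video_offsets():
        print(f"{seconds:8.2f} s  marker color {color}")
```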

µCap can be used free of charge in academic contexts. You can find the latest downloadable version and the corresponding paper at http://mucap.david-schindler.de/.

Analyze emotions

The µCap software makes it feasible to run medium- to even large-scale experiments and analyze emotions using FaceReader with only a few clicks. While we hope that the research community will benefit from our tool, we also hope that future versions of our software can benefit from input from our user base.

References

  1. Fischbacher, U. (2007). z-Tree: Zurich toolbox for ready-made economic experiments. Experimental Economics, 10(2), 171-178.
  2. Joffily, M., Masclet, D., Noussair, C. N., & Villeval, M. C. (2014). Emotions, Sanctions, and Cooperation. Southern Economic Journal, 80(4), 1002-1027.
  3. Nguyen, Y., & Noussair, C. N. (2014). Risk aversion and emotions. Pacific Economic Review, 19(3), 296-312.
  4. Breaban, A., & Noussair, C. N. (2013). Emotional state and market behavior. Working paper.
  5. Doyle, L., & Schindler, D. (2015). µCap: Connecting FaceReader™ to z-Tree. Munich Discussion Paper No. 2015-4.