CARMA
Software for Continuous Affect Rating and Media Annotation (c) Jeffrey M Girard, 2014-2023
CARMA is a media annotation program that collects continuous ratings while displaying audio and video files. It is designed to be highly user-friendly and easily customizable. CARMA enables researchers and study participants to provide moment-by-moment ratings of multimedia files using a computer mouse, keyboard, or joystick. The rating scale can be customized in several ways, including its labels and numerical range. Annotations can be displayed alongside the multimedia file and saved for easy import into statistical analysis software. CARMA is intended for researchers in affective computing, human-computer interaction, and the social sciences who need to capture the unfolding of subjective experience and observable behavior over time.
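Because ratings are saved in a plain tabular format, they can be loaded directly into most analysis environments. Below is a minimal sketch in Python (using pandas) of importing an exported ratings file and summarizing it; the filename and the "Time" and "Rating" column names are illustrative assumptions, so check the header of your own export before using them.

```python
# Minimal sketch: load an exported CARMA ratings file and summarize it.
# Assumptions (not guaranteed by CARMA itself): the export is a CSV named
# "ratings_export.csv" with a "Time" column in seconds and a "Rating"
# column holding the continuous rating values.

import pandas as pd

ratings = pd.read_csv("ratings_export.csv")

# Overall distribution of the continuous ratings.
print(ratings["Rating"].describe())

# Mean rating within consecutive 10-second windows, assuming one row
# per sampled time point.
ratings["window"] = (ratings["Time"] // 10).astype(int)
print(ratings.groupby("window")["Rating"].mean())
```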
History
CARMA was first published by Jeffrey Girard in 2014 under the GNU General Public License version 3 (GPLv3). Users are free to use, distribute, and modify the program as outlined in the license. CARMA is a modernization of Gottman and Levenson's affect rating dial. A journal article describing CARMA and its use was published in 2014; note, however, that the program and its functionality have changed substantially since that initial publication.
Citation
Users must agree to cite the following article in all publications making use of CARMA:
Girard, J. M. (2014). CARMA: Software for continuous affect rating and media annotation. Journal of Open Research Software, 2(1), e5. https://doi.org/10.5334/jors.ar
@article{CARMA,
  author  = {Girard, Jeffrey M},
  journal = {Journal of Open Research Software},
  title   = {{CARMA}: Software for continuous affect rating and media annotation},
  year    = {2014},
  volume  = {2},
  number  = {1},
  pages   = {e5},
  doi     = {10.5334/jors.ar}
}
Papers Using CARMA
- Kaczmarek, L. D., Behnke, M., Enko, J., Kosakowski, M., Guzik, P., & Hughes, B. M. (in press). Splitting the affective atom: Divergence of valence and approach-avoidance motivation during a dynamic emotional experience. Current Psychology. https://doi.org/10/gf2trk
- Dhamija, S., & Boult, T. E. (2018). Automated action units vs. expert raters: Face off. In Proceedings of the IEEE Winter Conference on Applications of Computer Vision.
- Leins, D. A., Zimmerman, L. A., & Polander, E. N. (2017). Observers’ real-time sensitivity to deception in naturalistic interviews. Journal of Police and Criminal Psychology. https://doi.org/10.1007/s11896-017-9224-2
- Hammal, Z., Cohn, J. F., Heike, C., & Speltz, M. L. (2015). What can head and facial movements convey about positive and negative affect? In Proceedings of the International Conference on Affective Computing and Intelligent Interaction. https://doi.org/10.1109/ACII.2015.7344584
- Hammal, Z., Cohn, J. F., Heike, C., & Speltz, M. L. (2015). Automatic measurement of head and facial movement for analysis and detection of infants’ positive and negative affect. Frontiers in ICT, 2(21). https://doi.org/10.3389/fict.2015.00021
- Dworkin, J. (2015). Capturing emotional suppression as it naturally unfolds in couple interactions. Haverford College. Retrieved from http://hdl.handle.net/10066/16644