<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article article-type="research-article" dtd-version="2.3" xml:lang="EN" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Int J Public Health</journal-id>
<journal-title>International Journal of Public Health</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Int J Public Health</abbrev-journal-title>
<issn pub-type="epub">1661-8564</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="publisher-id">1607094</article-id>
<article-id pub-id-type="doi">10.3389/ijph.2024.1607094</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Public Health Archive</subject>
<subj-group>
<subject>Original Article</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Impact of the War in Ukraine on the Ability of Children to Recognize Basic Emotions</article-title>
<alt-title alt-title-type="left-running-head">Loshenko et al.</alt-title>
<alt-title alt-title-type="right-running-head">War&#x2019;s Impact on Children&#x2019;s Emotions</alt-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name>
<surname>Loshenko</surname>
<given-names>Oleksandra</given-names>
</name>
<xref ref-type="corresp" rid="c001">&#x2a;</xref>
<uri xlink:href="https://loop.frontiersin.org/people/2631255/overview"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Pal&#xed;&#x161;ek</surname>
<given-names>Petr</given-names>
</name>
<uri xlink:href="https://loop.frontiersin.org/people/2724160/overview"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Straka</surname>
<given-names>Ond&#x159;ej</given-names>
</name>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Jab&#x16f;rek</surname>
<given-names>Michal</given-names>
</name>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Porte&#x161;ov&#xe1;</surname>
<given-names>&#x160;&#xe1;rka</given-names>
</name>
</contrib>
<contrib contrib-type="author">
<name>
<surname>&#x160;ev&#x10d;&#xed;kov&#xe1;</surname>
<given-names>Anna</given-names>
</name>
<uri xlink:href="https://loop.frontiersin.org/people/1066275/overview"/>
</contrib>
</contrib-group>
<aff>
<institution>Faculty of Social Studies</institution>, <institution>Psychology Research Institute</institution>, <institution>Masaryk University</institution>, <addr-line>Brno</addr-line>, <country>Czechia</country>
</aff>
<author-notes>
<fn fn-type="edited-by">
<p>
<bold>Edited by:</bold> <ext-link ext-link-type="uri" xlink:href="https://loop.frontiersin.org/people/1684737/overview">Daryna Pavlova</ext-link>, Ukrainian Institute for Social Research after Olexander Yaremenko, Ukraine</p>
</fn>
<fn fn-type="edited-by">
<p>
<bold>Reviewed by:</bold> <ext-link ext-link-type="uri" xlink:href="https://loop.frontiersin.org/people/2638751/overview">Nazreen Rusli</ext-link>, International Islamic University Malaysia, Malaysia</p>
<p>
<ext-link ext-link-type="uri" xlink:href="https://loop.frontiersin.org/people/2638974/overview">&#x141;ukasz Kominek</ext-link>, Military University of Technology in Warsaw, Poland</p>
<p>
<ext-link ext-link-type="uri" xlink:href="https://loop.frontiersin.org/people/2641727/overview">T. Santhi Sri</ext-link>, K L University, India</p>
</fn>
<corresp id="c001">&#x2a;Correspondence: Oleksandra Loshenko, <email>psiholog3003@gmail.com</email>
</corresp>
</author-notes>
<pub-date pub-type="epub">
<day>13</day>
<month>05</month>
<year>2024</year>
</pub-date>
<pub-date pub-type="collection">
<year>2024</year>
</pub-date>
<volume>69</volume>
<elocation-id>1607094</elocation-id>
<history>
<date date-type="received">
<day>15</day>
<month>01</month>
<year>2024</year>
</date>
<date date-type="accepted">
<day>24</day>
<month>04</month>
<year>2024</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#xa9; 2024 Loshenko, Pal&#xed;&#x161;ek, Straka, Jab&#x16f;rek, Porte&#x161;ov&#xe1; and &#x160;ev&#x10d;&#xed;kov&#xe1;.</copyright-statement>
<copyright-year>2024</copyright-year>
<copyright-holder>Loshenko, Pal&#xed;&#x161;ek, Straka, Jab&#x16f;rek, Porte&#x161;ov&#xe1; and &#x160;ev&#x10d;&#xed;kov&#xe1;</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/">
<p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p>
</license>
</permissions>
<abstract>
<sec>
<title>Objectives</title>
<p>This study assessed emotion recognition skills in school-age children in wartime conditions in Ukraine.</p>
</sec>
<sec>
<title>Methods</title>
<p>An online survey based on the concept of basic emotions was administered to a sample of 419 schoolchildren from Ukraine and a control group of 310 schoolchildren from the Czech Republic, aged 8 to 12.</p>
</sec>
<sec>
<title>Results</title>
<p>Ukrainian children did not differ from the control group in judging the intensity of anger and fear. There was no evidence that the emotions of anger, fear, and sadness were recognized more accurately in the Ukrainian group. However, children from Ukraine were better than Czech children at recognizing positive emotions.</p>
</sec>
<sec>
<title>Conclusion</title>
<p>Increased threat exposure and wartime experience did not impair the accuracy with which children who experience war in Ukraine identify emotions such as fear, nor their assessment of the intensity of basic emotions. Still, it is important to continue studying the long-term consequences of military conflicts in order to deepen our understanding of their impact on human mental functioning.</p>
</sec>
</abstract>
<kwd-group>
<kwd>children</kwd>
<kwd>emotion recognition</kwd>
<kwd>emotion intensity</kwd>
<kwd>emotion recognition accuracy</kwd>
<kwd>war</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<sec id="s1">
<title>Introduction</title>
<p>Being able to recognize emotions is an important characteristic of normal social functioning. In childhood, emotion recognition skills are one of the basic factors in forming proper social interaction, healthy communication in relationships, and subsequently healthy psychosocial development, which has increased researchers&#x2019; interest in this topic [<xref ref-type="bibr" rid="B1">1</xref>, <xref ref-type="bibr" rid="B2">2</xref>]. However, most existing studies focus on the typical processes of children&#x2019;s emotion recognition, while social and environmental contexts are considered to a lesser extent. Hence, there is a significant gap in the literature regarding hostile environments and their impact on a child&#x2019;s ability to assess and identify the emotional states of other people. In particular, existing studies have not considered the context of war, one of the most traumatic environmental factors.</p>
<p>Nevertheless, several earlier studies show a relationship between the ability of younger individuals to recognize emotions and the level of threat and traumatic experiences in childhood [<xref ref-type="bibr" rid="B3">3</xref>&#x2013;<xref ref-type="bibr" rid="B5">5</xref>]. In particular, according to Pollak et al., children who lived in a threatening and abusive environment recognized a lower percentage of emotions (59%) than well-off children (66%) [<xref ref-type="bibr" rid="B3">3</xref>]. That is, an increase in the danger of the environment might lead to deterioration in recognizing emotions.</p>
<p>The war in Ukraine has upended the lives of the children living in the region in ways hardly imaginable to those living in peaceful conditions. Growing up in wartime entails disruptions to normal development in multiple areas [<xref ref-type="bibr" rid="B6">6</xref>]. Therefore, this study focuses on the impact of war-related trauma on the ability of Ukrainian school-aged children to recognize basic emotions.</p>
<sec id="s1-1">
<title>Ability to Recognize Emotions</title>
<p>Researchers attribute emotion recognition to a type of social cognition, namely, a person&#x2019;s ability to perceive, process, and interpret social information [<xref ref-type="bibr" rid="B7">7</xref>]. Slu&#x161;nien&#x117; notes that a person&#x2019;s ability to capture, identify and analyze emotions is a sign of sustainable personality development [<xref ref-type="bibr" rid="B8">8</xref>]. Adolphs indicates that humans prefer using physiognomy in recognizing emotions, and early processing of faces is one of the functions of the cerebral cortex [<xref ref-type="bibr" rid="B9">9</xref>]. In particular, according to Guarnera et al., younger children use facial information to recognize emotions such as happiness, anger, sadness, surprise, or disgust [<xref ref-type="bibr" rid="B10">10</xref>].</p>
<p>The crucial role of emotion recognition in social interaction is integrated into Crick and Dodge&#x2019;s social information processing model [<xref ref-type="bibr" rid="B11">11</xref>]. According to the model, social information processing begins with the situation encoding stage: before assessing the situation and taking the necessary actions, children find out what exactly happened; that is, they evaluate the context. Similarly, Tyng et al. identify the importance of recognizing emotions in their direct influence on the formation of adaptive behaviors through learning [<xref ref-type="bibr" rid="B12">12</xref>]. These findings are supported by Halberstadt et al., who argued that children&#x2019;s ability to interpret emotions is highly significant for their healthy social and cognitive development [<xref ref-type="bibr" rid="B1">1</xref>].</p>
</sec>
<sec id="s1-2">
<title>Effects of Threatening Environment on the Perceived Intensity of Emotions</title>
<p>A child can recognize basic emotions such as joy, anger, sadness, fear, surprise, and disgust [<xref ref-type="bibr" rid="B13">13</xref>]. Still, Garcia and Tully point out that the accuracy of emotion recognition increases with age, meaning that mild and moderate emotions are better recognized in later childhood [<xref ref-type="bibr" rid="B13">13</xref>]. In addition, recognizing happiness and anger does not require intense expression, while sadness is recognized better when it is clearly expressed [<xref ref-type="bibr" rid="B13">13</xref>].</p>
<p>Earlier studies also report the universality of emotion recognition in childhood. Younger children are generally poorer at interpreting emotions than older children [<xref ref-type="bibr" rid="B14">14</xref>], and Gao and Maurer likewise show that emotion recognition performance improves with a child&#x2019;s life experience [<xref ref-type="bibr" rid="B15">15</xref>]. Mancini et al. also examined changes in the ability of children living in normal conditions to recognize emotions from facial expressions as their age increased [<xref ref-type="bibr" rid="B16">16</xref>]. In addition, positive emotions are recognized more clearly by children of all ages compared to negative ones, regardless of cultural context [<xref ref-type="bibr" rid="B17">17</xref>].</p>
<p>Nevertheless, a threatening environment can influence which emotions of other people are prioritized by children, leading them to judge such emotions as more intense than children from a non-threatening environment do. Ardizzi et al. show that when reading facial expressions of emotions, disadvantaged street children orient more towards negative emotions [<xref ref-type="bibr" rid="B18">18</xref>]. In particular, when comparing the two groups, the researchers found that street children more readily identified anger (28.29% versus 7.10% for well-off children) and fear (16% versus 5.3%). In this regard, such children may tend to exaggerate the presence of anger in other people and downplay their positive emotions. McLaughlin states that children in a threatening environment tend to overreact to threats due to their natural survival reflex in dangerous settings [<xref ref-type="bibr" rid="B19">19</xref>]. Similarly, Pollak et al. show that abused children interpret anger as accurately as well-off children but interpret sadness less accurately (<italic>p</italic> &#x3c; 0.05) [<xref ref-type="bibr" rid="B3">3</xref>].</p>
<p>Still, we do not know of any studies that focus on the war context. It can be assumed, however, that hostilities also present a threat; therefore, they could affect the ability of children with military conflict experience to read anger, sadness, and fear in a similar way.</p>
<p>Hence, our Hypothesis 1 assumes this effect to be present in children from wartime Ukraine:</p>
<p>
<statement content-type="h1" id="H1">
<label>H1</label>
<p>Children in wartime Ukraine rate anger and fear as more intense than the control group.</p>
</statement>
</p>
</sec>
<sec id="s1-3">
<title>Effects of Threatening Environment on the Accuracy of Emotions Recognition</title>
<p>A threatening environment can also affect children&#x2019;s ability to interpret (judge) others&#x2019; emotions accurately. By alteration of judgment we mean that a facial or body expression signaling one emotion is mistakenly interpreted as a mark of another emotion: a person experiencing (and expressing) joy is, for instance, perceived as angry. Nevertheless, the literature is not unanimous concerning the effect of threats on this type of accuracy.</p>
<p>Scrimin et al. compared emotion recognition in children affected by the Beslan terrorist attack to a control group, with participants balanced for age and gender. Children with the experience of a terrorist attack correctly recognized clearly expressed emotions, such as anger and sadness. However, the affected children seemed to label facial expressions showing anger and sadness as anger more often than those from the control group, and they recognized the clearly expressed emotion of sadness less accurately.</p>
<p>On the other hand, Frankenhuis and de Weerth claim that children who have experienced violence or abuse can recognize even unexpressed, hidden, or distorted aggressive emotions [<xref ref-type="bibr" rid="B20">20</xref>]. Pollak et al. have drawn similar conclusions: children who have been abused have a unique and traumatizing social experience, and thus their sensitivity to negative emotions increases [<xref ref-type="bibr" rid="B5">5</xref>]. These findings are consistent with those by Scrimin et al. and Frankenhuis and de Weerth [<xref ref-type="bibr" rid="B20">20</xref>, <xref ref-type="bibr" rid="B21">21</xref>]. Furthermore, according to B&#xe9;rub&#xe9; et al., the cognitive processes of abused children are more active, and they react more sharply to negative emotional manifestations [<xref ref-type="bibr" rid="B22">22</xref>]. At the same time, the reaction of children with traumatic experiences to emotions is distinctive: such a child easily recognizes happiness but reads anger and fear even better, and this skill persists into adulthood.</p>
<p>Cicchetti takes a different position, noting that the negative emotional experiences that children with traumatic experiences of abuse have been through impair their ability to recognize emotions [<xref ref-type="bibr" rid="B23">23</xref>]. In this regard, the existing literature supports the idea that stressors, threatening environments, and experiences of violence and abuse could affect a child&#x2019;s ability to recognize emotions correctly. At the same time, some conclusions are contradictory. For example, some researchers argue that children who live in a threatening environment or have traumatic experiences are less able to recognize emotions [<xref ref-type="bibr" rid="B3">3</xref>, <xref ref-type="bibr" rid="B21">21</xref>]. Other researchers point to the increased accuracy of children&#x2019;s recognition of certain emotions [<xref ref-type="bibr" rid="B18">18</xref>, <xref ref-type="bibr" rid="B20">20</xref>, <xref ref-type="bibr" rid="B22">22</xref>]. In addition, there are no studies assessing the stressor of war in terms of its effect on children&#x2019;s emotional development and ability to read emotions. It is therefore possible to hypothesize that wartime is also a variant of a threatening environment and may thus affect the overall accuracy of children&#x2019;s emotion recognition, including their recognition of negative emotions.</p>
<p>Therefore, Hypothesis 2 suggests that this effect is present in children from wartime Ukraine:</p>
<p>
<statement content-type="h2" id="H2">
<label>H2</label>
<p>Children in wartime Ukraine are, on average, worse at recognizing emotions than the control group, but they are better than the control group at identifying the emotions of anger, sadness, and fear.</p>
</statement>
</p>
</sec>
<sec id="s1-4">
<title>Present Study</title>
<p>The present study is, to the best of our knowledge, the first attempt to study the peculiarities of emotion recognition by children in war conditions. Our literature review suggests that the effects of wartime can be twofold: (1) affecting the perceived intensity of emotions or (2) affecting the accuracy of recognition. As stated in our hypotheses, we expect children in wartime Ukraine to judge anger, sadness, and fear as more intense, as observed in earlier studies, but to be worse overall at identifying emotions and their expression than the control group of children living in a safe environment in the Czech Republic. We included the Czech children as a comparison sample because we did not have the opportunity to use a pre-war group in the Ukrainian sample. Based on the results of previous studies, we also assume that aspects of children&#x2019;s basic emotion recognition are universal under normal conditions, making the Czech control group suitable for comparison. Our study tests these assumptions and provides exploratory findings.</p>
</sec>
</sec>
<sec sec-type="methods" id="s2">
<title>Methods</title>
<p>The hypotheses and analysis plan were preregistered on OSF (<ext-link ext-link-type="uri" xlink:href="https://osf.io/cpk3b">https://osf.io/cpk3b</ext-link>), the data and the analytic script are publicly available in the OSF project (<ext-link ext-link-type="uri" xlink:href="https://osf.io/c6d8x">https://osf.io/c6d8x</ext-link>).</p>
<sec id="s2-1">
<title>Sample</title>
<p>In total, we analyzed the results from 729 children aged 8&#x2013;14 years (343 (47%) girls, 386 (53%) boys). See <xref ref-type="table" rid="T1">Table 1</xref> for age and country distribution.</p>
<table-wrap id="T1" position="float">
<label>TABLE 1</label>
<caption>
<p>Age distribution of the sample (Ukraine, Czech Republic, 2023).</p>
</caption>
<table>
<thead valign="top">
<tr>
<th align="left">Age</th>
<th align="center">Czech Republic</th>
<th align="center">Ukraine</th>
<th align="center">Total</th>
</tr>
</thead>
<tbody valign="top">
<tr>
<td align="left">8</td>
<td align="center">12</td>
<td align="center">116</td>
<td align="center">128</td>
</tr>
<tr>
<td align="left">9</td>
<td align="center">41</td>
<td align="center">38</td>
<td align="center">79</td>
</tr>
<tr>
<td align="left">10</td>
<td align="center">52</td>
<td align="center">105</td>
<td align="center">157</td>
</tr>
<tr>
<td align="left">11</td>
<td align="center">54</td>
<td align="center">92</td>
<td align="center">146</td>
</tr>
<tr>
<td align="left">12</td>
<td align="center">69</td>
<td align="center">64</td>
<td align="center">133</td>
</tr>
<tr>
<td align="left">13</td>
<td align="center">74</td>
<td align="center">3</td>
<td align="center">77</td>
</tr>
<tr>
<td align="left">14</td>
<td align="center">8</td>
<td align="center">1</td>
<td align="center">9</td>
</tr>
<tr>
<td align="left">Total</td>
<td align="center">310</td>
<td align="center">419</td>
<td align="center">729</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>The participants in the &#x201c;wartime Ukraine&#x201d; group were citizens of Ukraine who lived on the territory of Ukraine after the start of the Russian invasion. Most of the participants (83%) attended schools in the Chernihiv region, which borders the invading country and was briefly under Russian occupation in March 2022. This suggests that they were situated in more traumatic war-related settings.</p>
<p>To establish a control group, the experimental tasks were administered to a comparable sample of children living in a country affected neither by war nor by similar adverse conditions. The Czech Republic was chosen as the reference country, as it is geographically, culturally, and linguistically close to Ukraine. All participants in the Czech sample attended schools in the South Moravian or Moravian-Silesian regions.</p>
</sec>
<sec id="s2-2">
<title>Procedure and Data Collection</title>
<p>Data were collected using a Qualtrics online survey. We also inquired about the participants&#x2019; socio-demographic characteristics and their current emotional state for all basic emotions (happiness, anger, sadness, surprise, fear, and disgust) on a ten-point scale. The total response time did not exceed 20&#xa0;min.</p>
<p>Children participated voluntarily with the informed consent of their legal guardians. The research plan was approved by the Research Ethics Committee of Masaryk University (EKV-2022-085). To ensure safety, the children were surveyed online at home. Participation was anonymous and could be terminated at any time without providing reasons. Due to the ongoing hostilities in Ukraine, it was not feasible to collect a representative sample or to ensure that all of the children were impacted in a similar way. We delve deeper into this limitation in the Discussion section.</p>
</sec>
<sec id="s2-3">
<title>Measures</title>
<p>As stimuli, we used a subset of items from the Test for identification of socio-emotional deficits [<xref ref-type="bibr" rid="B24">24</xref>]. This test is a new diagnostic tool developed between 2018 and 2022 as part of a research project funded by the Technology Agency of the Czech Republic (project number: TL01000494). The test draws on the Mayer-Salovey-Caruso model of emotional abilities [<xref ref-type="bibr" rid="B25">25</xref>], which posits that while emotions are subjective, the capacity to perceive, interpret, and manage them is akin to cognitive abilities, and individual differences in these abilities can be assessed and measured.</p>
<p>The items designed to measure emotion perception consist of photographs and short video clips. In each item, one actor portrays a specific emotion, mainly via facial expression and, to a lesser degree, by body posture and/or movement. Both the photographs and the video clips were taken in a neutral studio setting with a monotone light-grey background. The video scenes are not staged as narratives: there is no setup or prior action indicating the emotion. The actors are of both genders and various ages; because there are fewer actors than test items, some actors appear in multiple items.</p>
<p>Content-wise, the test builds on the notion of &#x201c;basic emotions.&#x201d; This concept was popularized by Ekman, who argued that several emotions and their facial expressions are universally comprehensible across different cultures and may thus be considered basic, experienced and understood by all people, probably on an innate basis [<xref ref-type="bibr" rid="B26">26</xref>]. The original list comprised joy, sadness, fear, anger, surprise, and disgust. Later, this concept was revised, primarily because the list of basic, universally comprehensible emotions is probably considerably broader (up to 18 different emotions) [<xref ref-type="bibr" rid="B27">27</xref>]. However, including a higher number of emotions would markedly increase the time requirements of our design; for this reason, we kept to the original &#x201c;basic six&#x201d; [<xref ref-type="bibr" rid="B28">28</xref>].</p>
<p>From the pool of items comprising the perception part of the Test for identification of socio-emotional deficits, we chose two photographs and two video clips for each of the six emotions, yielding 24 items. For each emotion, one photograph and one video clip portrayed a mild expression, while the other pair depicted a more intense expression. Participants viewed the 12 photo items first, followed by the 12 videos, in randomized order.</p>
<p>The participants were tasked with selecting the depicted emotion from a list of basic emotions located below the item. A scroll bar with a 10-point scale was also present in the lower part of the computer screen, with which the participants indicated the intensity of the emotion displayed in the item (1 &#x3d; lowest intensity, 10 &#x3d; highest intensity).</p>
<p>At the very beginning of the experimental set, before presenting the items, we asked the participants questions regarding their friends, interests, hobbies, etc. Then, they were asked to rate the current intensity of their own emotions (again using a 10-point scale). A separate scale was presented for each emotion of the &#x201c;basic six&#x201d; list.</p>
</sec>
<sec id="s2-4">
<title>Data Analysis</title>
<p>The data were analyzed in R (4.1.2) using the packages lme4, gorica, restriktor, mirt, performance, and DHARMa [<xref ref-type="bibr" rid="B29">29</xref>&#x2013;<xref ref-type="bibr" rid="B33">33</xref>]. Data manipulation was conducted via tidyverse [<xref ref-type="bibr" rid="B34">34</xref>]. The confirmatory models were parametrized as a linear mixed model for the intensity ratings (given their expected Gaussian distribution) and a generalized linear mixed model for accuracy (given its binomial distribution). Both models included random intercepts for participants (representing their ability) and random intercepts for items (representing their difficulty).</p>
<p>After fitting the confirmatory models, we employed informative hypothesis testing, which allows one to specify order-restricted hypotheses and test their relative likelihood after accounting for their parsimony [<xref ref-type="bibr" rid="B35">35</xref>]. In other words, we first specified a restriction on the regression coefficients of interest (e.g., for H1 we set the Ukrainian group to have higher average ratings of anger and fear intensity, i.e., <italic>b</italic> &#x3e; 0). The software then computes the likelihood of this hypothesis given the data and controls for the degree of restriction (with less restricted hypotheses receiving a larger penalty for complexity). Finally, it compares the resulting likelihood to that of an unrestricted hypothesis (which has the maximum likelihood but the largest complexity penalty, because its parameter space is not restricted at all), to the complement (the exact opposite of the specified hypothesis), or to another prespecified hypothesis.</p>
</sec>
</sec>
<sec sec-type="results" id="s3">
<title>Results</title>
<sec id="s3-1">
<title>Descriptives</title>
<p>Means and standard deviations of the item intensity ratings are shown in <xref ref-type="table" rid="T2">Table 2</xref> below. Mean difference in intensity ratings between Ukrainian and Czech children was negligible, <italic>d</italic> &#x3d; 0.04. Item-rest correlations are not reported because the ratings do not assume a reflective measurement model.</p>
<table-wrap id="T2" position="float">
<label>TABLE 2</label>
<caption>
<p>Intensity ratings item statistics (Ukraine, Czech Republic, 2023).</p>
</caption>
<table>
<thead valign="top">
<tr>
<th align="left"/>
<th align="center">N</th>
<th align="center">M</th>
<th align="center">SD</th>
</tr>
</thead>
<tbody valign="top">
<tr>
<td align="left">surprise_1</td>
<td align="center">727</td>
<td align="center">4.30</td>
<td align="center">2.10</td>
</tr>
<tr>
<td align="left">anger_1</td>
<td align="center">726</td>
<td align="center">6.40</td>
<td align="center">2.20</td>
</tr>
<tr>
<td align="left">fear_1</td>
<td align="center">724</td>
<td align="center">5.30</td>
<td align="center">2.10</td>
</tr>
<tr>
<td align="left">disgust_1</td>
<td align="center">726</td>
<td align="center">5.00</td>
<td align="center">2.20</td>
</tr>
<tr>
<td align="left">joy_1</td>
<td align="center">726</td>
<td align="center">7.00</td>
<td align="center">2.30</td>
</tr>
<tr>
<td align="left">surprise_2</td>
<td align="center">723</td>
<td align="center">6.00</td>
<td align="center">2.30</td>
</tr>
<tr>
<td align="left">fear_2</td>
<td align="center">726</td>
<td align="center">6.10</td>
<td align="center">2.30</td>
</tr>
<tr>
<td align="left">anger_2</td>
<td align="center">726</td>
<td align="center">7.60</td>
<td align="center">2.20</td>
</tr>
<tr>
<td align="left">joy_2</td>
<td align="center">723</td>
<td align="center">7.20</td>
<td align="center">2.40</td>
</tr>
<tr>
<td align="left">sadness_1</td>
<td align="center">724</td>
<td align="center">6.40</td>
<td align="center">2.30</td>
</tr>
<tr>
<td align="left">disgust_2</td>
<td align="center">721</td>
<td align="center">7.60</td>
<td align="center">2.10</td>
</tr>
<tr>
<td align="left">sadness_2</td>
<td align="center">719</td>
<td align="center">5.60</td>
<td align="center">2.30</td>
</tr>
<tr>
<td align="left">surprise_3</td>
<td align="center">715</td>
<td align="center">5.10</td>
<td align="center">2.30</td>
</tr>
<tr>
<td align="left">fear_3</td>
<td align="center">705</td>
<td align="center">5.50</td>
<td align="center">2.40</td>
</tr>
<tr>
<td align="left">joy_3</td>
<td align="center">703</td>
<td align="center">6.50</td>
<td align="center">2.30</td>
</tr>
<tr>
<td align="left">fear_4</td>
<td align="center">702</td>
<td align="center">7.10</td>
<td align="center">2.30</td>
</tr>
<tr>
<td align="left">anger_3</td>
<td align="center">698</td>
<td align="center">6.30</td>
<td align="center">2.40</td>
</tr>
<tr>
<td align="left">surprise_4</td>
<td align="center">701</td>
<td align="center">6.20</td>
<td align="center">2.30</td>
</tr>
<tr>
<td align="left">disgust_3</td>
<td align="center">702</td>
<td align="center">7.20</td>
<td align="center">2.20</td>
</tr>
<tr>
<td align="left">sadness_3</td>
<td align="center">695</td>
<td align="center">5.70</td>
<td align="center">2.30</td>
</tr>
<tr>
<td align="left">anger_4</td>
<td align="center">695</td>
<td align="center">7.10</td>
<td align="center">2.20</td>
</tr>
<tr>
<td align="left">joy_4</td>
<td align="center">694</td>
<td align="center">5.70</td>
<td align="center">2.60</td>
</tr>
<tr>
<td align="left">sadness_4</td>
<td align="center">689</td>
<td align="center">6.60</td>
<td align="center">2.30</td>
</tr>
<tr>
<td align="left">disgust_4</td>
<td align="center">691</td>
<td align="center">6.50</td>
<td align="center">2.10</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>Interestingly, the children showed strong individual tendencies in assessing intensity (<italic>ICC</italic> &#x3d; .32), a source of variance even stronger than the item characteristics themselves (<italic>ICC</italic> &#x3d; .12).</p>
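For readers unfamiliar with intraclass correlations: in a crossed random-intercepts model, each ICC is simply one random term's share of the total variance. A minimal Python sketch of this variance partition follows; the variance values are illustrative assumptions chosen so the components sum to 1, not the study's fitted components, and the actual models were fitted in R with lme4.

```python
# Crossed random-intercepts decomposition: rating_ij = mu + u_i + v_j + e_ij,
# where u_i is the participant intercept and v_j the item intercept.
# The variance values below are illustrative assumptions, not fitted estimates.
var_participant = 0.32  # between-participant variance (rating tendency)
var_item = 0.12         # between-item variance (item-typical intensity)
var_residual = 0.56     # residual variance

def icc(component, components):
    """Intraclass correlation: one component's share of the total variance."""
    return component / sum(components)

parts = [var_participant, var_item, var_residual]
print(f"participant ICC = {icc(var_participant, parts):.2f}")  # 0.32
print(f"item ICC = {icc(var_item, parts):.2f}")                # 0.12
```

With these assumed components, the participant term accounts for a larger share of rating variance than the item term, mirroring the pattern reported above.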
<p>Means and item-rest correlations of the emotion recognition items are shown in <xref ref-type="table" rid="T3">Table 3</xref>. Mean difference in accuracy between Ukrainian and Czech children was small (<italic>d</italic> &#x3d; 0.15).</p>
<table-wrap id="T3" position="float">
<label>TABLE 3</label>
<caption>
<p>Accuracy item statistics (Ukraine, Czech Republic, 2023).</p>
</caption>
<table>
<thead valign="top">
<tr>
<th align="left"/>
<th align="center">N</th>
<th align="center">Item-rest r</th>
<th align="center">M</th>
</tr>
</thead>
<tbody valign="top">
<tr>
<td align="left">surprise_1</td>
<td align="center">728</td>
<td align="center">.32</td>
<td align="center">0.82</td>
</tr>
<tr>
<td align="left">anger_1</td>
<td align="center">726</td>
<td align="center">.20</td>
<td align="center">0.89</td>
</tr>
<tr>
<td align="left">fear_1</td>
<td align="center">724</td>
<td align="center">.32</td>
<td align="center">0.86</td>
</tr>
<tr>
<td align="left">disgust_1</td>
<td align="center">724</td>
<td align="center">.26</td>
<td align="center">0.84</td>
</tr>
<tr>
<td align="left">joy_1</td>
<td align="center">726</td>
<td align="center">.31</td>
<td align="center">0.99</td>
</tr>
<tr>
<td align="left">surprise_2</td>
<td align="center">724</td>
<td align="center">.35</td>
<td align="center">0.92</td>
</tr>
<tr>
<td align="left">fear_2</td>
<td align="center">726</td>
<td align="center">.00</td>
<td align="center">0.33</td>
</tr>
<tr>
<td align="left">anger_2</td>
<td align="center">727</td>
<td align="center">.31</td>
<td align="center">0.94</td>
</tr>
<tr>
<td align="left">joy_2</td>
<td align="center">723</td>
<td align="center">.40</td>
<td align="center">0.93</td>
</tr>
<tr>
<td align="left">sadness_1</td>
<td align="center">725</td>
<td align="center">.35</td>
<td align="center">0.96</td>
</tr>
<tr>
<td align="left">disgust_2</td>
<td align="center">723</td>
<td align="center">.51</td>
<td align="center">0.94</td>
</tr>
<tr>
<td align="left">sadness_2</td>
<td align="center">722</td>
<td align="center">.33</td>
<td align="center">0.72</td>
</tr>
<tr>
<td align="left">surprise_3</td>
<td align="center">715</td>
<td align="center">.33</td>
<td align="center">0.82</td>
</tr>
<tr>
<td align="left">fear_3</td>
<td align="center">706</td>
<td align="center">.18</td>
<td align="center">0.62</td>
</tr>
<tr>
<td align="left">joy_3</td>
<td align="center">707</td>
<td align="center">.53</td>
<td align="center">0.98</td>
</tr>
<tr>
<td align="left">fear_4</td>
<td align="center">707</td>
<td align="center">.19</td>
<td align="center">0.79</td>
</tr>
<tr>
<td align="left">anger_3</td>
<td align="center">700</td>
<td align="center">.18</td>
<td align="center">0.64</td>
</tr>
<tr>
<td align="left">surprise_4</td>
<td align="center">706</td>
<td align="center">.24</td>
<td align="center">0.90</td>
</tr>
<tr>
<td align="left">disgust_3</td>
<td align="center">703</td>
<td align="center">.57</td>
<td align="center">0.97</td>
</tr>
<tr>
<td align="left">sadness_3</td>
<td align="center">698</td>
<td align="center">.39</td>
<td align="center">0.94</td>
</tr>
<tr>
<td align="left">anger_4</td>
<td align="center">698</td>
<td align="center">.31</td>
<td align="center">0.94</td>
</tr>
<tr>
<td align="left">joy_4</td>
<td align="center">697</td>
<td align="center">.32</td>
<td align="center">0.89</td>
</tr>
<tr>
<td align="left">sadness_4</td>
<td align="center">693</td>
<td align="center">.23</td>
<td align="center">0.63</td>
</tr>
<tr>
<td align="left">disgust_4</td>
<td align="center">691</td>
<td align="center">.27</td>
<td align="center">0.86</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>Item characteristics were stronger predictors of accuracy (<italic>ICC</italic> &#x3d; .27) than participant abilities (<italic>ICC</italic> &#x3d; .11). We assumed the emotion recognition items would measure a single cluster within the Mayer-Salovey-Caruso model; hence, the measure was expected to behave as a reflective, unidimensional construct.</p>
<p>To test this assumption, we fit a <italic>2&#xa0;PL IRT</italic> model (see De Ayala), which explained the data adequately: M<sub>2</sub>(252) &#x3d; 562.45, <italic>p</italic> &#x3c; .001; RMSEA &#x3d; .043, 90% CI [.038, .048]; SRMSR &#x3d; .06; TLI &#x3d; .87 [<xref ref-type="bibr" rid="B36">36</xref>]. On the item level, fear_2 was problematic: its low correlation with the other items distorted the IRT estimates (discrimination <italic>a</italic> close to zero). Content-wise, responses to the item were split between fear (33%) and disgust (37%). Removing it had no appreciable effect on model fit but improved the standard errors of the item parameter estimates, so we excluded it from further analyses. The accuracy items had borderline reliability (average split-half &#x3d; .72).</p>
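To illustrate why a discrimination parameter near zero makes an item uninformative (the fear_2 problem above), here is a minimal sketch of the 2PL item response function; the parameter values are hypothetical, not the study's estimates:

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL item response function: P(correct answer | ability theta)."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

theta = np.linspace(-3, 3, 7)                 # a range of abilities
steep = p_correct(theta, a=1.5, b=0.0)        # a well-discriminating item
flat = p_correct(theta, a=0.05, b=0.0)        # a fear_2-like item, a close to 0

# A discriminating item separates low- from high-ability children...
assert steep[-1] - steep[0] > 0.9
# ...whereas with a near zero the curve is almost flat: the item carries
# nearly no information about ability, and its estimates become unstable.
assert flat[-1] - flat[0] < 0.1
```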
</sec>
<sec id="s3-2">
<title>Intensity Rating Models (H1)</title>
<p>Hereafter, in line with the recommendations of, e.g., Mosalves et al., we report the intermediate models we estimated to gradually build the final model [<xref ref-type="bibr" rid="B37">37</xref>].</p>
<p>Our baseline model (random item intercepts only) explained a moderate portion of the total variance in intensity ratings (total <italic>ICC</italic> &#x3d; .44). Adding gender and age did not improve the model: &#x3c7;<sup>2</sup>(2) &#x3d; 4.71, <italic>p</italic> &#x3d; .095, &#x394;AIC &#x3d; &#x2212;3, &#x394;BIC &#x3d; &#x2212;21. Including the effect of country yielded a slight, non-significant improvement: &#x3c7;<sup>2</sup>(1) &#x3d; 3.78, <italic>p</italic> &#x3d; .052, &#x394;AIC &#x3d; 2, &#x394;BIC &#x3d; 6. A fixed predictor distinguishing between depictions of fear or anger (1) and others (0) worsened the model: &#x3c7;<sup>2</sup>(1) &#x3d; 0.53, <italic>p</italic> &#x3d; .47, &#x394;AIC &#x3d; &#x2212;2, &#x394;BIC &#x3d; &#x2212;1. Finally, we fit the confirmatory model, which included all of the above plus the interaction between the country and fear/anger variables, once again worsening the fit: &#x3c7;<sup>2</sup>(1) &#x3d; 0.007, <italic>p</italic> &#x3d; .93, &#x394;AIC &#x3d; &#x2212;2, &#x394;BIC &#x3d; &#x2212;10.</p>
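The model-building statistics above (&#x3c7;², &#x394;AIC, &#x394;BIC) all derive from the maximized log-likelihoods of two nested models. A sketch of the arithmetic, using hypothetical log-likelihood values rather than the study's actual fits (positive deltas favor the larger model, matching the sign convention in the text):

```python
from math import log
from scipy.stats import chi2

def compare_nested(ll_small, k_small, ll_big, k_big, n):
    """Likelihood-ratio test and AIC/BIC deltas for two nested models."""
    lr = 2.0 * (ll_big - ll_small)      # chi-square statistic
    df = k_big - k_small                # number of extra parameters
    p = chi2.sf(lr, df)
    aic = lambda ll, k: 2 * k - 2 * ll
    bic = lambda ll, k: k * log(n) - 2 * ll
    # Positive deltas favor the bigger model:
    d_aic = aic(ll_small, k_small) - aic(ll_big, k_big)
    d_bic = bic(ll_small, k_small) - bic(ll_big, k_big)
    return lr, df, p, d_aic, d_bic

# Hypothetical log-likelihoods, not the study's fitted values:
lr, df, p, d_aic, d_bic = compare_nested(-5000.0, 4, -4997.6, 5, n=700)
# BIC penalizes the extra parameter more heavily than AIC when n is large.
```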
<p>Therefore, we concluded that our explanatory variables did not explain a substantial share of the variance in intensity scores. See <xref ref-type="table" rid="T4">Table 4</xref> for the final model parameters.</p>
<table-wrap id="T4" position="float">
<label>TABLE 4</label>
<caption>
<p>Linear mixed model for intensity ratings (Ukraine, Czech Republic. 2023).</p>
</caption>
<table>
<thead valign="top">
<tr>
<th align="left"/>
<th align="center">b</th>
<th align="center">SE</th>
<th align="center">t</th>
</tr>
</thead>
<tbody valign="top">
<tr>
<td align="left">intercept</td>
<td align="center">&#x2212;0.47</td>
<td align="center">0.19</td>
<td align="center">&#x2212;2.48</td>
</tr>
<tr>
<td align="left">gender (1 &#x3d; boy)</td>
<td align="center">&#x2212;0.04</td>
<td align="center">0.04</td>
<td align="center">&#x2212;0.98</td>
</tr>
<tr>
<td align="left">age</td>
<td align="center">0.04</td>
<td align="center">0.01</td>
<td align="center">2.62</td>
</tr>
<tr>
<td align="left">country (1 &#x3d; UKR)</td>
<td align="center">0.09</td>
<td align="center">0.05</td>
<td align="center">1.90</td>
</tr>
<tr>
<td align="left">type (1 &#x3d; anger/fear)</td>
<td align="center">0.11</td>
<td align="center">0.16</td>
<td align="center">0.69</td>
</tr>
<tr>
<td align="left">country:type</td>
<td align="center">0.002</td>
<td align="center">0.02</td>
<td align="center">0.08</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<fn>
<p>Note: Var(participants) &#x3d; 0.32, Var(items) &#x3d; 0.13, Var(e) &#x3d; 0.56.</p>
</fn>
</table-wrap-foot>
</table-wrap>
<p>To formally test our hypothesis (H1), we used GORICA to estimate the penalized likelihood of this restriction (i.e., the respective interaction term&#x2019;s <italic>b</italic> being &#x3e;0) as opposed to the exact opposite hypothesis (Hc). H1 was only slightly more supported than Hc (GORICA weights of .501 vs. .499). See <xref ref-type="fig" rid="F1">Figure 1</xref> for a visualization of the interaction term.</p>
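GORICA weights are formed like Akaike weights: each hypothesis's GORICA value is transformed with exp(&#x2212;value/2) and normalized across the competing hypotheses. A sketch with hypothetical GORICA values (the actual analysis would use dedicated software such as the R restriktor package); two nearly tied hypotheses yield weights near .50/.50, as for H1 vs. Hc here:

```python
import math

def gorica_weights(values):
    """Turn GORICA values (smaller = better) into relative support weights."""
    best = min(values)
    rel = [math.exp(-0.5 * (v - best)) for v in values]  # shift for stability
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical, nearly tied GORICA values for two hypotheses:
w_h1, w_hc = gorica_weights([102.30, 102.31])
print(round(w_h1, 3), round(w_hc, 3))  # → 0.501 0.499
```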
<fig id="F1" position="float">
<label>FIGURE 1</label>
<caption>
<p>Interaction between country and emotion type - anger and fear versus others (Ukraine, Czech Republic. 2023).</p>
</caption>
<graphic xlink:href="ijph-69-1607094-g001.tif"/>
</fig>
</sec>
<sec id="s3-3">
<title>Accuracy models (H2)</title>
<p>Our baseline model (same as above) explained a moderate portion of the total variance in accuracy (total <italic>ICC</italic> &#x3d; .38). Including gender and age greatly improved the model fit: &#x3c7;<sup>2</sup>(2) &#x3d; 27.40, <italic>p</italic> &#x3c; .001, &#x394;AIC &#x3d; 23, &#x394;BIC &#x3d; 7. Adding country as a fixed predictor yielded another noticeable improvement: &#x3c7;<sup>2</sup>(1) &#x3d; 33.53, <italic>p</italic> &#x3c; .001, &#x394;AIC &#x3d; 32, &#x394;BIC &#x3d; 24. Distinguishing between items depicting fear, sadness, or anger (1) and others (0) improved the fit further: &#x3c7;<sup>2</sup>(1) &#x3d; 4.77, <italic>p</italic> &#x3d; .029, &#x394;AIC &#x3d; 2, &#x394;BIC &#x3d; 5. Finally, we added the interaction between country and the fear/sadness/anger predictor, again noticeably improving the model fit: &#x3c7;<sup>2</sup>(1) &#x3d; 20.97, <italic>p</italic> &#x3c; .001, &#x394;AIC &#x3d; 19, &#x394;BIC &#x3d; 12. In effect, Ukrainian children recognized positive emotions more accurately than Czech children did. See <xref ref-type="table" rid="T5">Table 5</xref> for the final model parameters.</p>
<table-wrap id="T5" position="float">
<label>TABLE 5</label>
<caption>
<p>Generalized linear mixed model for accuracy (Ukraine, Czech Republic. 2023).</p>
</caption>
<table>
<thead valign="top">
<tr>
<th align="left"/>
<th align="center">b</th>
<th align="center">SE</th>
<th align="center">t</th>
</tr>
</thead>
<tbody valign="top">
<tr>
<td align="left">intercept</td>
<td align="center">2.82</td>
<td align="center">0.42</td>
<td align="center">6.75</td>
</tr>
<tr>
<td align="left">gender (1 &#x3d; boy)</td>
<td align="center">&#x2212;0.25</td>
<td align="center">0.07</td>
<td align="center">&#x2212;3.51</td>
</tr>
<tr>
<td align="left">age</td>
<td align="center">&#x2212;0.04</td>
<td align="center">0.02</td>
<td align="center">&#x2212;1.48</td>
</tr>
<tr>
<td align="left">country (1 &#x3d; UKR)</td>
<td align="center">0.74</td>
<td align="center">0.10</td>
<td align="center">7.54</td>
</tr>
<tr>
<td align="left">type (1 &#x3d; anger/sadness/fear)</td>
<td align="center">&#x2212;0.81</td>
<td align="center">0.45</td>
<td align="center">&#x2212;1.80</td>
</tr>
<tr>
<td align="left">country:type</td>
<td align="center">&#x2212;0.45</td>
<td align="center">0.10</td>
<td align="center">&#x2212;4.67</td>
</tr>
</tbody>
</table>
</table-wrap>
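Because the Table 5 coefficients are on the log-odds scale, the country effect for anger/sadness/fear items is the sum of the country main effect and the interaction. A sketch using the reported fixed effects (the gender and age terms and all random effects are ignored here, so the numbers are illustrative rather than exact model predictions):

```python
import math

def inv_logit(x):
    """Convert log-odds to probability."""
    return 1.0 / (1.0 + math.exp(-x))

# Fixed effects from Table 5 (log-odds scale):
intercept, b_country, b_type, b_interaction = 2.82, 0.74, -0.81, -0.45

def log_odds(country, negative):
    # country: 1 = UKR, 0 = CZ; negative: 1 = anger/sadness/fear item
    return (intercept + b_country * country + b_type * negative
            + b_interaction * country * negative)

# The Ukrainian advantage on the log-odds scale, by item type:
adv_other = log_odds(1, 0) - log_odds(0, 0)     # 0.74
adv_negative = log_odds(1, 1) - log_odds(0, 1)  # 0.74 - 0.45 = 0.29

# The interaction shrinks, but does not reverse, the country effect:
assert adv_other > adv_negative > 0
```

The shrunken but still positive advantage on negative-emotion items is what drives the relative Ukrainian advantage on positive emotions described in the text.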
<p>To formally test H2, we again used GORICA, testing the restriction that the respective interaction term&#x2019;s <italic>b</italic> is &#x3e;0 and the main effect of country is <italic>b</italic> &#x3c; 0, as opposed to the exact opposite hypothesis (Hc). Unsurprisingly, given that both coefficients had the opposite direction in our sample, H2 received no support compared to Hc (GORICA weights of 0 vs. 1). See <xref ref-type="fig" rid="F2">Figure 2</xref> for a visualization of the interaction term.</p>
<fig id="F2" position="float">
<label>FIGURE 2</label>
<caption>
<p>Interaction between country and emotion type - anger or fear or sadness versus others (Ukraine, Czech Republic. 2023).</p>
</caption>
<graphic xlink:href="ijph-69-1607094-g002.tif"/>
</fig>
<p>The fixed terms in neither of the final models exhibited concerning multicollinearity (<italic>VIF</italic> &#x3c; 2), and there were no concerns regarding the distribution of the residuals or heteroscedasticity.</p>
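The VIF diagnostic reported above can be reproduced by regressing each fixed predictor on the remaining ones (VIF&#x2009;&#x3d;&#x2009;1/(1&#x2212;R²)). A minimal numpy sketch on simulated predictors (not the study's data); independent predictors give VIFs near 1:

```python
import numpy as np

def vif(X, j):
    """Variance inflation factor of predictor column j in design matrix X."""
    y = X[:, j]
    others = np.delete(X, j, axis=1)           # regress column j on the rest
    beta, *_ = np.linalg.lstsq(others, y, rcond=None)
    rss = ((y - others @ beta) ** 2).sum()
    tss = ((y - y.mean()) ** 2).sum()
    r2 = 1.0 - rss / tss
    return 1.0 / (1.0 - r2)

rng = np.random.default_rng(1)
n = 700
X = np.column_stack([
    np.ones(n),                  # intercept
    rng.integers(0, 2, n),       # gender-like binary predictor
    rng.normal(10, 2, n),        # age-like continuous predictor
    rng.integers(0, 2, n),       # country-like binary predictor
])
vifs = [vif(X, j) for j in range(1, X.shape[1])]  # skip the intercept
assert all(v < 2 for v in vifs)  # independent predictors -> VIF near 1
```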
</sec>
<sec id="s3-4">
<title>Sensitivity Analyses</title>
<p>The effects of children&#x2019;s current happiness, sadness, fear, and disgust were often significant but not substantial. Only happiness had a notable effect on intensity ratings (<italic>b&#x2a;</italic> &#x3d; 0.15). Nevertheless, the inclusion of these variables did not change the evaluation of our hypotheses.</p>
</sec>
</sec>
<sec sec-type="discussion" id="s4">
<title>Discussion</title>
<p>This study examined the ability of school-age children to recognize other people&#x2019;s emotions when exposed to the threats triggered by the war in Ukraine. Previous studies have shown that a threatening environment can distort emotion recognition skills. However, wartime should be considered a separate threat factor, since individuals facing military conflict are more vulnerable, including in their emotional state, and more traumatized. Nevertheless, the results surprisingly show that Ukrainian children&#x2019;s emotional condition and their ability to recognize emotions under threat diverge from these expectations. Below, we interpret these results and our exploratory findings.</p>
<p>Our hypotheses implied that the Ukrainian children would exhibit the same patterns as reported in previous research. However, none of our expectations was supported by the data. In summary, fear and anger were not rated as more intense than the other emotions (H1). Perhaps most surprisingly, the Ukrainian children were not less accurate in judging emotions than the control group (H2).</p>
<p>We found that children from both groups were less attuned to the type of emotion and more focused on assessing its intensity. Generally, however, the intensity ratings in the Ukrainian and Czech groups were almost identical. This may indicate that young people in wartime remain attentive and sensitive to the strength of the emotions expressed, which is a natural response [<xref ref-type="bibr" rid="B19">19</xref>]. Some atypical patterns in wartime children&#x2019;s recognition of emotional intensity could also be observed. In particular, we assumed that it would be more difficult for them to determine the intensity of weak emotions; for the most part, however, they identified even weak emotions as emotions of considerable intensity [<xref ref-type="bibr" rid="B18">18</xref>]. That this feature held across all emotions, and that recognition of emotional intensity did not differ between the Ukrainian and Czech groups, was an interesting and unexpected finding. Accordingly, we may assume that children from Ukraine who go through war experiences do not lose the ability to determine the intensity of fear and anger. Furthermore, children who are exposed to war experiences and are at higher risk of threats seem to identify emotions almost equally well in both static and dynamic manifestations.</p>
<p>Important observations were also made regarding the accuracy with which children living in war settings recognize emotions. Initially, it was assumed that children from Ukraine would be worse at recognizing emotions than those in the Czech Republic. This hypothesis was not supported, since the results did not reveal significant differences in the accuracy of recognizing all basic emotions between the Ukrainian and Czech groups. For this reason, we can say that hazardous environments and traumatic war experiences do not generally impair a child&#x2019;s ability to recognize emotions, contradicting the findings of Scrimin et al. and corroborating those of Frankenhuis and de Weerth [<xref ref-type="bibr" rid="B20">20</xref>, <xref ref-type="bibr" rid="B21">21</xref>]. At the same time, the recognition of anger, sadness, and fear by Ukrainian children did not differ from that of the Czech group. Earlier studies indicated that traumatic experiences in young people increased their accuracy in recognizing fear and anger [<xref ref-type="bibr" rid="B22">22</xref>]. However, comparing the results of the Czech and Ukrainian groups shows that children living in dangerous and in safe conditions appear to recognize negative emotions with the same accuracy, which differs radically from the findings of previous studies focused on types of risk other than war threats. For example, the current findings are inconsistent with a number of studies according to which children exposed to threats and abuse, and those who find themselves in difficult circumstances, are better at interpreting anger and fear and worse at recognizing positive emotions [<xref ref-type="bibr" rid="B4">4</xref>, <xref ref-type="bibr" rid="B18">18</xref>&#x2013;<xref ref-type="bibr" rid="B21">21</xref>]. In turn, we found that Ukrainian children recognized positive emotions more accurately than Czech children. This may indicate that war affects emotion recognition in a non-standard way and motivates children to ignore negative impressions and focus on life&#x2019;s pleasant moments. We can also assume that war generally does not have a significant effect on emotion recognition in childhood. However, it is important to note that this study did not analyze the severity of traumatic experiences, which could hypothetically influence aspects of emotion recognition. In other words, the difficult environment in which this research project was conducted made it impossible to determine the extent of the threats faced by the children and to test their relation to emotion recognition accuracy, suggesting that the observations may not apply to the general population of young people living in war settings.</p>
<sec id="s4-1">
<title>Limitations of the Study and Future Areas for Research</title>
<p>This study has several limitations. First, it did not consider the degree of risk to the participants; that is, it did not assess the children&#x2019;s proximity to active war zones or the frequency of military threats in their area of residence. In addition, the study did not take into account the characteristics of children&#x2019;s experiences of war trauma, which could be associated with the emotion recognition variables. Nor did it examine the quality of life and economic wellbeing of the participants, although these factors might hypothetically also influence emotion recognition. It is also important to recognize that the study results are closely tied to age and cultural characteristics and therefore cannot be generalized to the entire population of children who face military threats. Accordingly, we conclude that age and culture need to be considered when assessing the war&#x2019;s impact on children&#x2019;s emotion recognition in order to gain a deeper understanding of these features. Finally, the use of an online survey may have limited the accuracy of data collection. This circumstance requires researchers to be especially careful when interpreting the results and to involve two or more experts in the assessment process to avoid biases. At the same time, these limitations open areas for future research. Beyond the factors already mentioned, it is important to study the long-term impact of wartime threats on children&#x2019;s emotion recognition. On the face of it, it might seem unlikely that an impairment of emotion perception would manifest itself in the post-war (and hence peaceful) period if it was not found during the obviously more stressful situation of the war itself. However, there are several reasons why it might. First, such research could provide additional data about the effects of this type of stress when it lasts for a long time (longer than the span between the start of the war and the realization of the current study). Second, the post-war period might make it possible to administer a longer set of emotion perception tasks than was feasible now. Such a longer and more complex design might permit the detection of subtler effects that could not be captured with the relatively limited set of materials in the present design.</p>
</sec>
</sec>
</body>
<back>
<sec id="s5">
<title>Ethics Statement</title>
<p>Children participated voluntarily with the informed consent of their legal guardians. The research plan was approved by the Research Ethics Committee of Masaryk University (EKV-2022-085). To ensure safety, the children were surveyed online at home. Participation was anonymous and could be terminated at any time without providing reasons.</p>
</sec>
<sec id="s6">
<title>Author Contributions</title>
<p>OL was responsible for conceptualizing the research, designing the study, and overseeing data collection and analysis, and played a central role in drafting the manuscript. PP actively participated in the study&#x2019;s design, data collection, and analysis, and contributed substantially to the literature review and method section of the paper. OS was responsible for specific aspects of the statistical analyses and contributed to the results and discussion sections of the paper. MJ contributed to the literature review, the framing of the research questions, and data analysis. &#x160;P contributed to the interpretation of results and assisted in synthesizing findings and discussing their implications. A&#x160; provided valuable insights into the interpretation of results and took part in manuscript revision and finalization. All authors contributed to the article and approved the submitted version.</p>
</sec>
<sec sec-type="funding-information" id="s7">
<title>Funding</title>
<p>The author(s) declare that financial support was received for the research, authorship, and/or publication of this article. This study is part of the project &#x2018;MUNI/A/1523/2023&#x2019;, and it received support from the Grant Projects of Specific Research Program at the Masaryk University Faculty of Social Studies.</p>
</sec>
<sec sec-type="COI-statement" id="s8">
<title>Conflict of Interest</title>
<p>The authors declare that they do not have any conflicts of interest.</p>
</sec>
<ref-list>
<title>References</title>
<ref id="B1">
<label>1.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Halberstadt</surname>
<given-names>AG</given-names>
</name>
<name>
<surname>Denham</surname>
<given-names>SA</given-names>
</name>
<name>
<surname>Dunsmore</surname>
<given-names>JC</given-names>
</name>
</person-group>. <article-title>Affective Social Competence</article-title>. <source>Soc Dev</source> (<year>2001</year>) <volume>10</volume>(<issue>1</issue>):<fpage>79</fpage>&#x2013;<lpage>119</lpage>. <pub-id pub-id-type="doi">10.1111/1467-9507.00150</pub-id>
</citation>
</ref>
<ref id="B2">
<label>2.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Shu</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Xie</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Yang</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>Z</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>Z</given-names>
</name>
<name>
<surname>Liao</surname>
<given-names>D</given-names>
</name>
<etal/>
</person-group> <article-title>A Review of Emotion Recognition Using Physiological Signals</article-title>. <source>Sensors</source> (<year>2018</year>) <volume>18</volume>(<issue>7</issue>):<fpage>2074</fpage>. <pub-id pub-id-type="doi">10.3390/s18072074</pub-id>
</citation>
</ref>
<ref id="B3">
<label>3.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Pollak</surname>
<given-names>SD</given-names>
</name>
<name>
<surname>Cicchetti</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Hornung</surname>
<given-names>K</given-names>
</name>
<name>
<surname>Reed</surname>
<given-names>A</given-names>
</name>
</person-group>. <article-title>Recognizing Emotion in Faces: Developmental Effects of Child Abuse and Neglect</article-title>. <source>Dev Psychol</source> (<year>2000</year>) <volume>36</volume>(<issue>5</issue>):<fpage>679</fpage>&#x2013;<lpage>88</lpage>. <pub-id pub-id-type="doi">10.1037/0012-1649.36.5.679</pub-id>
</citation>
</ref>
<ref id="B4">
<label>4.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Pollak</surname>
<given-names>SD</given-names>
</name>
<name>
<surname>Sinha</surname>
<given-names>P</given-names>
</name>
</person-group>. <article-title>Effects of Early Experience on Children&#x2019;s Recognition of Facial Displays of Emotion</article-title>. <source>Dev Psychol</source> (<year>2002</year>) <volume>38</volume>(<issue>5</issue>):<fpage>784</fpage>&#x2013;<lpage>91</lpage>. <pub-id pub-id-type="doi">10.1037/0012-1649.38.5.784</pub-id>
</citation>
</ref>
<ref id="B5">
<label>5.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Pollak</surname>
<given-names>SD</given-names>
</name>
<name>
<surname>Messner</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Kistler</surname>
<given-names>DJ</given-names>
</name>
<name>
<surname>Cohn</surname>
<given-names>JF</given-names>
</name>
</person-group>. <article-title>Development of Perceptual Expertise in Emotion Recognition</article-title>. <source>Cognition</source> (<year>2009</year>) <volume>110</volume>(<issue>2</issue>):<fpage>242</fpage>&#x2013;<lpage>7</lpage>. <pub-id pub-id-type="doi">10.1016/j.cognition.2008.10.010</pub-id>
</citation>
</ref>
<ref id="B6">
<label>6.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Frounfelker</surname>
<given-names>RL</given-names>
</name>
<name>
<surname>Islam</surname>
<given-names>N</given-names>
</name>
<name>
<surname>Falcone</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Farrar</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Ra</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Antonaccio</surname>
<given-names>CM</given-names>
</name>
<etal/>
</person-group> <article-title>Living Through War: Mental Health of Children and Youth in Conflict-Affected Areas</article-title>. <source>Int Rev Red Cross</source> (<year>2019</year>) <volume>101</volume>(<issue>911</issue>):<fpage>481</fpage>&#x2013;<lpage>506</lpage>. <pub-id pub-id-type="doi">10.1017/s181638312000017x</pub-id>
</citation>
</ref>
<ref id="B7">
<label>7.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ferretti</surname>
<given-names>V</given-names>
</name>
<name>
<surname>Papaleo</surname>
<given-names>F</given-names>
</name>
</person-group>. <article-title>Understanding Others: Emotion Recognition in Humans and Other Animals</article-title>. <source>Genes, Brain Behav</source> (<year>2019</year>) <volume>18</volume>(<issue>1</issue>):<fpage>e12544</fpage>. <pub-id pub-id-type="doi">10.1111/gbb.12544</pub-id>
</citation>
</ref>
<ref id="B8">
<label>8.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Slu&#x161;nien&#x117;</surname>
<given-names>G</given-names>
</name>
</person-group>. <article-title>Possibilities for Development of Emotional Intelligence in Childhood in the Context of Sustainable Education</article-title>. <source>Discourse Commun Sust Edu</source> (<year>2019</year>) <volume>10</volume>(<issue>1</issue>):<fpage>133</fpage>&#x2013;<lpage>45</lpage>. <pub-id pub-id-type="doi">10.2478/dcse-2019-0010</pub-id>
</citation>
</ref>
<ref id="B9">
<label>9.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Adolphs</surname>
<given-names>R</given-names>
</name>
</person-group>. <article-title>Recognizing Emotion From Facial Expressions: Psychological and Neurological Mechanisms</article-title>. <source>Behav Cogn Neurosci Rev</source> (<year>2002</year>) <volume>1</volume>(<issue>1</issue>):<fpage>21</fpage>&#x2013;<lpage>62</lpage>. <pub-id pub-id-type="doi">10.1177/1534582302001001003</pub-id>
</citation>
</ref>
<ref id="B10">
<label>10.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Guarnera</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Hichy</surname>
<given-names>Z</given-names>
</name>
<name>
<surname>Cascio</surname>
<given-names>MI</given-names>
</name>
<name>
<surname>Carrubba</surname>
<given-names>S</given-names>
</name>
</person-group>. <article-title>Facial Expressions and Ability to Recognize Emotions From Eyes or Mouth in Children</article-title>. <source>Europe&#x2019;s J Psychol</source> (<year>2015</year>) <volume>11</volume>(<issue>2</issue>):<fpage>183</fpage>&#x2013;<lpage>96</lpage>. <pub-id pub-id-type="doi">10.5964/ejop.v11i2.890</pub-id>
</citation>
</ref>
<ref id="B11">
<label>11.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Crick</surname>
<given-names>NR</given-names>
</name>
<name>
<surname>Dodge</surname>
<given-names>KA</given-names>
</name>
</person-group>. <article-title>A Review and Reformulation of Social Information-Processing Mechanisms in Children&#x2019;s Social Adjustment</article-title>. <source>Psychol Bull</source> (<year>1994</year>) <volume>115</volume>(<issue>1</issue>):<fpage>74</fpage>&#x2013;<lpage>101</lpage>. <pub-id pub-id-type="doi">10.1037/0033-2909.115.1.74</pub-id>
</citation>
</ref>
<ref id="B12">
<label>12.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Tyng</surname>
<given-names>CM</given-names>
</name>
<name>
<surname>Amin</surname>
<given-names>HU</given-names>
</name>
<name>
<surname>Saad</surname>
<given-names>MN</given-names>
</name>
<name>
<surname>Malik</surname>
<given-names>AS</given-names>
</name>
</person-group>. <article-title>The Influences of Emotion on Learning and Memory</article-title>. <source>Front Psychol</source> (<year>2017</year>) <volume>8</volume>:<fpage>1454</fpage>. <pub-id pub-id-type="doi">10.3389/fpsyg.2017.01454</pub-id>
</citation>
</ref>
<ref id="B13">
<label>13.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Garcia</surname>
<given-names>SE</given-names>
</name>
<name>
<surname>Tully</surname>
<given-names>EC</given-names>
</name>
</person-group>. <article-title>Children&#x2019;s Recognition of Happy, Sad, and Angry Facial Expressions Across Emotive Intensities</article-title>. <source>J Exp Child Psychol</source> (<year>2020</year>) <volume>197</volume>:<fpage>104881</fpage>. <pub-id pub-id-type="doi">10.1016/j.jecp.2020.104881</pub-id>
</citation>
</ref>
<ref id="B14">
<label>14.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gordillo Le&#xf3;n</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Nieto M&#xe1;</surname>
<given-names>P</given-names>
</name>
<name>
<surname>Arana Mart&#xed;nez</surname>
<given-names>JM</given-names>
</name>
<name>
<surname>Mestas Hern&#xe1;ndez</surname>
<given-names>L</given-names>
</name>
<name>
<surname>L&#xf3;pez P&#xe9;rez</surname>
<given-names>RM</given-names>
</name>
</person-group>. <article-title>Role of Experience in the Neurology of Facial Expression of Emotions</article-title>. <source>Revista de Neurolog&#xed;a</source> (<year>2015</year>) <volume>60</volume>(<issue>07</issue>):<fpage>316</fpage>&#x2013;<lpage>20</lpage>. <pub-id pub-id-type="doi">10.33588/rn.6007.2014403</pub-id>
</citation>
</ref>
<ref id="B15">
<label>15.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gao</surname>
<given-names>X</given-names>
</name>
<name>
<surname>Maurer</surname>
<given-names>D</given-names>
</name>
</person-group>. <article-title>Influence of Intensity on Children&#x2019;s Sensitivity to Happy, Sad, and Fearful Facial Expressions</article-title>. <source>J Exp Child Psychol</source> (<year>2009</year>) <volume>102</volume>(<issue>4</issue>):<fpage>503</fpage>&#x2013;<lpage>21</lpage>. <pub-id pub-id-type="doi">10.1016/j.jecp.2008.11.002</pub-id>
</citation>
</ref>
<ref id="B16">
<label>16.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Mancini</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Agnoli</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Baldaro</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Bitti</surname>
<given-names>PE</given-names>
</name>
<name>
<surname>Surcinelli</surname>
<given-names>P</given-names>
</name>
</person-group>. <article-title>Facial Expressions of Emotions: Recognition Accuracy and Affective Reactions During Late Childhood</article-title>. <source>Emotions and Their Influence on Our Personal, Interpersonal and Social Experiences</source> (<year>2018</year>) <fpage>21</fpage>&#x2013;<lpage>39</lpage>. <pub-id pub-id-type="doi">10.4324/9781315100319-3</pub-id>
</citation>
</ref>
<ref id="B17">
<label>17.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Widen</surname>
<given-names>SC</given-names>
</name>
<name>
<surname>Russell</surname>
<given-names>JA</given-names>
</name>
</person-group>. <article-title>Children&#x2019;s Recognition of Disgust in Others</article-title>. <source>Psychol Bull</source> (<year>2013</year>) <volume>139</volume>(<issue>2</issue>):<fpage>271</fpage>&#x2013;<lpage>99</lpage>. <pub-id pub-id-type="doi">10.1037/a0031640</pub-id>
</citation>
</ref>
<ref id="B18">
<label>18.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ardizzi</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Martini</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Umilt&#xe0;</surname>
<given-names>MA</given-names>
</name>
<name>
<surname>Sestito</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Ravera</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Gallese</surname>
<given-names>V</given-names>
</name>
</person-group>. <article-title>When Early Experiences Build a Wall to Others&#x2019; Emotions: An Electrophysiological and Autonomic Study</article-title>. <source>PLoS ONE</source> (<year>2013</year>) <volume>8</volume>(<issue>4</issue>):<fpage>e61004</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0061004</pub-id>
</citation>
</ref>
<ref id="B19">
<label>19.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>McLaughlin</surname>
<given-names>KA</given-names>
</name>
</person-group>. <article-title>Future Directions in Childhood Adversity and Youth Psychopathology</article-title>. <source>J Clin Child Adolesc Psychol</source> (<year>2016</year>) <volume>45</volume>(<issue>3</issue>):<fpage>361</fpage>&#x2013;<lpage>82</lpage>. <pub-id pub-id-type="doi">10.1080/15374416.2015.1110823</pub-id>
</citation>
</ref>
<ref id="B20">
<label>20.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Frankenhuis</surname>
<given-names>WE</given-names>
</name>
<name>
<surname>de Weerth</surname>
<given-names>C</given-names>
</name>
</person-group>. <article-title>Does Early-Life Exposure to Stress Shape or Impair Cognition?</article-title> <source>Curr Dir Psychol Sci</source> (<year>2013</year>) <volume>22</volume>(<issue>5</issue>):<fpage>407</fpage>&#x2013;<lpage>12</lpage>. <pub-id pub-id-type="doi">10.1177/0963721413484324</pub-id>
</citation>
</ref>
<ref id="B21">
<label>21.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Scrimin</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Moscardino</surname>
<given-names>U</given-names>
</name>
<name>
<surname>Capello</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Alto&#xe8;</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Axia</surname>
<given-names>G</given-names>
</name>
</person-group>. <article-title>Recognition of Facial Expressions of Mixed Emotions in School-Age Children Exposed to Terrorism</article-title>. <source>Dev Psychol</source> (<year>2009</year>) <volume>45</volume>(<issue>5</issue>):<fpage>1341</fpage>&#x2013;<lpage>52</lpage>. <pub-id pub-id-type="doi">10.1037/a0016689</pub-id>
</citation>
</ref>
<ref id="B22">
<label>22.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>B&#xe9;rub&#xe9;</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Turgeon</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Blais</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Fiset</surname>
<given-names>D</given-names>
</name>
</person-group>. <article-title>Emotion Recognition in Adults With a History of Childhood Maltreatment: A Systematic Review</article-title>. <source>Trauma Violence Abuse</source> (<year>2021</year>) <volume>24</volume>(<issue>1</issue>):<fpage>278</fpage>&#x2013;<lpage>94</lpage>. <pub-id pub-id-type="doi">10.1177/15248380211029403</pub-id>
</citation>
</ref>
<ref id="B23">
<label>23.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cicchetti</surname>
<given-names>D</given-names>
</name>
</person-group>. <article-title>Socioemotional, Personality, and Biological Development: Illustrations From a Multilevel Developmental Psychopathology Perspective on Child Maltreatment</article-title>. <source>Annu Rev Psychol</source> (<year>2016</year>) <volume>67</volume>(<issue>1</issue>):<fpage>187</fpage>&#x2013;<lpage>211</lpage>. <pub-id pub-id-type="doi">10.1146/annurev-psych-122414-033259</pub-id>
</citation>
</ref>
<ref id="B24">
<label>24.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Porte&#x161;ov&#xe1;</surname>
<given-names>SO</given-names>
</name>
<name>
<surname>Jab&#x16f;rek</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Pal&#xed;&#x161;ek</surname>
<given-names>P</given-names>
</name>
<name>
<surname>Jajcaj</surname>
<given-names>F</given-names>
</name>
<name>
<surname>&#x160;romov&#xe1;</surname>
<given-names>V</given-names>
</name>
<etal/>
</person-group> <source>Subtest Pro Identifikaci Socioemo&#x10d;n&#xed;ch Deficit&#x16f; [Subtest for the Identification of Socio-Emotional Deficits]</source>. <publisher-name>Masarykova univerzita</publisher-name> (<year>2022</year>). <comment>Available from: <ext-link ext-link-type="uri" xlink:href="https://www.invenio.muni.cz/o-inveniu-zakladni-informace">https://www.invenio.muni.cz/o-inveniu-zakladni-informace</ext-link>
</comment>.</citation>
</ref>
<ref id="B25">
<label>25.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Mayer</surname>
<given-names>JD</given-names>
</name>
<name>
<surname>Caruso</surname>
<given-names>DR</given-names>
</name>
<name>
<surname>Salovey</surname>
<given-names>P</given-names>
</name>
</person-group>. <article-title>The Ability Model of Emotional Intelligence: Principles and Updates</article-title>. <source>Emot Rev</source> (<year>2016</year>) <volume>8</volume>(<issue>4</issue>):<fpage>290</fpage>&#x2013;<lpage>300</lpage>. <pub-id pub-id-type="doi">10.1177/1754073916639667</pub-id>
</citation>
</ref>
<ref id="B26">
<label>26.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ekman</surname>
<given-names>P</given-names>
</name>
</person-group>. <article-title>An Argument for Basic Emotions</article-title>. <source>Cogn Emot</source> (<year>1992</year>) <volume>6</volume>(<issue>3&#x2013;4</issue>):<fpage>169</fpage>&#x2013;<lpage>200</lpage>. <pub-id pub-id-type="doi">10.1080/02699939208411068</pub-id>
</citation>
</ref>
<ref id="B27">
<label>27.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cordaro</surname>
<given-names>DT</given-names>
</name>
<name>
<surname>Sun</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Kamble</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Hodder</surname>
<given-names>N</given-names>
</name>
<name>
<surname>Monroy</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Cowen</surname>
<given-names>A</given-names>
</name>
<etal/>
</person-group> <article-title>The Recognition of 18 Facial-Bodily Expressions Across Nine Cultures</article-title>. <source>Emotion</source> (<year>2020</year>) <volume>20</volume>(<issue>7</issue>):<fpage>1292</fpage>&#x2013;<lpage>300</lpage>. <pub-id pub-id-type="doi">10.1037/emo0000576</pub-id>
</citation>
</ref>
<ref id="B28">
<label>28.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Keltner</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Sauter</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Tracy</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Cowen</surname>
<given-names>A</given-names>
</name>
</person-group>. <article-title>Emotional Expression: Advances in Basic Emotion Theory</article-title>. <source>J Nonverbal Behav</source> (<year>2019</year>) <volume>43</volume>(<issue>2</issue>):<fpage>133</fpage>&#x2013;<lpage>60</lpage>. <pub-id pub-id-type="doi">10.1007/s10919-019-00293-3</pub-id>
</citation>
</ref>
<ref id="B29">
<label>29.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bates</surname>
<given-names>D</given-names>
</name>
<name>
<surname>M&#xe4;chler</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Bolker</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Walker</surname>
<given-names>S</given-names>
</name>
</person-group>. <article-title>Fitting Linear Mixed-Effects Models Using lme4</article-title>. <source>J Stat Softw</source> (<year>2015</year>) <volume>67</volume>(<issue>1</issue>):<fpage>1</fpage>&#x2013;<lpage>48</lpage>. <pub-id pub-id-type="doi">10.18637/jss.v067.i01</pub-id>
</citation>
</ref>
<ref id="B30">
<label>30.</label>
<citation citation-type="web">
<person-group person-group-type="author">
<name>
<surname>Kuiper</surname>
<given-names>RM</given-names>
</name>
</person-group>. <article-title>GORICA: Evaluation of Inequality Constrained Hypotheses Using GORICA R Package Version 0.1.2</article-title> (<year>2021</year>). <comment>Available from: <ext-link ext-link-type="uri" xlink:href="https://CRAN.R-project.org/package=gorica">https://CRAN.R-project.org/package&#x3d;gorica</ext-link>
</comment> (<comment>Accessed 2021</comment>).</citation>
</ref>
<ref id="B31">
<label>31.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Vanbrabant</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Rosseel</surname>
<given-names>Y</given-names>
</name>
</person-group>. <article-title>An Introduction to Restriktor: Evaluating Informative Hypotheses for Linear Models</article-title>. <source>Small Sample Size Solutions</source> (<year>2020</year>) <fpage>157</fpage>&#x2013;<lpage>72</lpage>. <pub-id pub-id-type="doi">10.4324/9780429273872-14</pub-id>
</citation>
</ref>
<ref id="B32">
<label>32.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Choi</surname>
<given-names>Y-J</given-names>
</name>
<name>
<surname>Asilkalkan</surname>
<given-names>A</given-names>
</name>
</person-group>. <article-title>R Packages for Item Response Theory Analysis: Descriptions and Features</article-title>. <source>Meas Interdiscip Res Perspect</source> (<year>2019</year>) <volume>17</volume>(<issue>3</issue>):<fpage>168</fpage>&#x2013;<lpage>75</lpage>. <pub-id pub-id-type="doi">10.1080/15366367.2019.1586404</pub-id>
</citation>
</ref>
<ref id="B33">
<label>33.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Hartig</surname>
<given-names>F</given-names>
</name>
</person-group> <source>Package &#x2018;DHARMa.&#x2019;</source> (<year>2020</year>). <comment>Available from: <ext-link ext-link-type="uri" xlink:href="https://cran.microsoft.com/snapshot/2020-07-04/web/packages/DHARMa/DHARMa.pdf">https://cran.microsoft.com/snapshot/2020-07-04/web/packages/DHARMa/DHARMa.pdf</ext-link>
</comment> (<comment>Accessed 2021</comment>).</citation>
</ref>
<ref id="B34">
<label>34.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Wickham</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Averick</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Bryan</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Chang</surname>
<given-names>W</given-names>
</name>
<name>
<surname>McGowan</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Fran&#xe7;ois</surname>
<given-names>R</given-names>
</name>
<etal/>
</person-group> <article-title>Welcome to the Tidyverse</article-title>. <source>J Open Source Softw</source> (<year>2019</year>) <volume>4</volume>(<issue>43</issue>):<fpage>1686</fpage>. <pub-id pub-id-type="doi">10.21105/joss.01686</pub-id>
</citation>
</ref>
<ref id="B35">
<label>35.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Hoijtink</surname>
<given-names>H</given-names>
</name>
</person-group> <source>Informative Hypotheses: Theory and Practice for Behavioral and Social Scientists</source>. <publisher-loc>New York</publisher-loc>: <publisher-name>CRC Press</publisher-name> (<year>2012</year>).</citation>
</ref>
<ref id="B36">
<label>36.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>De Ayala</surname>
<given-names>RJ</given-names>
</name>
</person-group> <source>The Theory and Practice of Item Response Theory</source>. <publisher-loc>New York</publisher-loc>: <publisher-name>The Guilford Press</publisher-name> (<year>2008</year>).</citation>
</ref>
<ref id="B37">
<label>37.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Monsalves</surname>
<given-names>MJ</given-names>
</name>
<name>
<surname>Bangdiwala</surname>
<given-names>AS</given-names>
</name>
<name>
<surname>Thabane</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Bangdiwala</surname>
<given-names>SI</given-names>
</name>
</person-group>. <article-title>LEVEL (Logical Explanations and Visualizations of Estimates in Linear Mixed Models): Recommendations for Reporting Multilevel Data and Analyses</article-title>. <source>BMC Med Res Methodol</source> (<year>2020</year>) <volume>20</volume>(<issue>1</issue>):<fpage>3</fpage>. <pub-id pub-id-type="doi">10.1186/s12874-019-0876-8</pub-id>
</citation>
</ref>
</ref-list>
</back>
</article>