The entry into force of the EU AI Act affects the development, delivery and use of AI systems. Suppliers and users of AI systems need to assess to what extent their systems and practices are affected by the regulation and take action to comply with its requirements. AI systems with the purpose of identifying or inferring emotions or intentions of natural persons on the basis of their biometric data – so-called “emotion recognition systems” or “emotion AI systems” – are subject to certain rules in the AI Act.
Article 5 (f) of the AI Act prohibits certain practices involving emotion recognition systems, stating that such systems are not permitted “in the areas of workplace and education institutions”.
iMotions’ product offerings include functionality that utilizes AI systems that, in iMotions’ view, constitute “emotion recognition systems” according to the definition in the AI Act. The functionalities affected are:
- Facial expression analysis
- Voice analysis
Against this background, iMotions has, in cooperation with its suppliers of the affected AI systems, attempted to assess the applicability of the prohibition in Article 5 (f) of the AI Act to iMotions’ – and consequently its customers’ – practices. Note that iMotions’ assessment is not to be considered guidance or advice to take, or refrain from taking, any action. iMotions encourages each of its customers to conduct its own analysis and assessment and welcomes an open discussion on these issues.
Prohibition within the area of education institutions
iMotions has analysed and assessed the scope of the prohibition of emotion recognition systems in the area of education institutions, as iMotions’ university customers could potentially be affected.
In summary, iMotions’ position regarding the scope of the prohibition in Article 5 (f) is that the prohibited practices relate specifically to education and the educational setting – and are not intended to affect research operations conducted at universities. Although a university would by definition be covered by the term “education institution”, the context and purpose of the prohibition, as further explained in recital 44, is one aspect that leads iMotions to assume that research activities using emotion recognition systems are not covered by the prohibition, even where the test subjects involved happen to be students at the education institution. In these situations, the students are not being analyzed in their capacity and role as students.
Extract from recital 44:
Considering the imbalance of power in the context of work or education, combined with the intrusive nature of these systems, such systems could lead to detrimental or unfavourable treatment of certain natural persons or whole groups thereof. Therefore, the placing on the market, the putting into service, or the use of AI systems intended to be used to detect the emotional state of individuals in situations related to the workplace and education should be prohibited.
That said, it is iMotions’ assessment that research activities that involve students in their educational setting (lectures, tests, exams, etc.) may fall under the prohibition. As the AI Act does not include a general exception for research, this particular use case – involving students in the educational setting – should be avoided until further guidance on the applicability of the prohibition in Article 5 (f) is provided.
Prohibition within the area of the workplace
iMotions has analysed and assessed the scope of the prohibition of emotion recognition systems in the area of the workplace, as iMotions’ corporate and academic customers could potentially be affected.
Given that iMotions’ products are intended for research, the use cases relevant for assessment are limited to research conducted in workplaces and involving groups consisting of employees, potential employees or externally recruited respondents, e.g. panelists. Use for purposes other than research has been excluded from iMotions’ assessment.
In summary, iMotions’ position regarding the scope of the prohibition in Article 5 (f) as it relates to workplaces is that the prohibited practices reasonably do not include research activities conducted within the research division of a company, even where the test subjects involved happen to be engaged through employment. In these situations, the employees are not being analyzed in their capacity and role as employees.
That said, iMotions recognizes that use of Emotion AI in a workplace setting will require thorough risk analysis and assessment, both against the AI Act and against other relevant regulations such as the GDPR.
___________
Regardless of whether research is conducted at an educational institution or another kind of institution, participants must be informed of, and consent to, the purpose of a data collection and the ways in which their data will be processed.
iMotions is providing this information to offer transparency regarding its analysis and assessment of the effect of the prohibited AI practices provisions on certain use cases and customers. iMotions emphasizes that this statement is not to be considered guidance or advice to take, or refrain from taking, any action. iMotions encourages each of its customers to conduct its own analysis and assessment and welcomes an open discussion on these issues.
Note: Since iMotions drafted this statement, the EU Commission has, on February 4, 2025, published guidelines on the prohibited practices in Article 5. iMotions will review and, where relevant, update this statement based on the Commission’s guidelines. iMotions encourages its customers to read the guidelines.