GreenBook Interview with Peter Hartzbech

This interview was originally conducted by Leonard Murphy at GreenBook – The Future of Insights and posted on the 8th of December 2021. It is reproduced on this blog with kind permission. The original post can be found here.

By: Leonard Murphy

08 December 2021

I have been an unabashed fan of applied behavioral science in insights for many years. Not only is working to understand the nonconscious drivers of behavior and decisions endlessly fascinating in general, but amazing technological strides have unlocked the ability to scale nonconscious insights in ways that seem to expand almost daily. However, until relatively recently, the insights and analytics industry had been slow to adopt these advances. As in so many things, the pandemic prompted a sea change. The combination of a massive digital-centric shift in consumer behavior and an imperative need from buyers of insights to understand a rapidly changing realignment of consumer values and decision-making has spurred significant growth in behavioral science.

Many companies have benefited from this growth, but a clear leader has emerged recently. Smart Eye AB, the global leader in Human Insights AI, has joined forces with Emotion AI pioneer Affectiva and iMotions, the leading biosensor software platform, through its recent acquisitions, creating a true, integrated powerhouse delivering unparalleled insights into human behavior.

In today’s CEO series interview, I chat with Peter Hartzbech, CEO of iMotions, on the story behind consolidating these three companies under one roof and what the future of nonconscious measurement technology may bring.

Peter and I had not chatted previously, and I was very impressed with his energy and vision. He has crafted a bold strategy to build a truly global leader that will drive new use cases, technological innovation, and business impact in a way few others will be able to achieve. That said, there are many other organizations with their own unique offerings that are growing rapidly as well – look for continued market forces to keep pushing everyone ahead.

During our conversation, we decided it would be interesting to bring him, Martin, and Rana back for a group discussion on the future of technology-driven applied behavioral science. Look for that early next year. In the meantime, I think you’ll find this deeply interesting, engaging, and provocative. Enjoy!

Transcript (edited for clarity)

Lenny Murphy: Hi, everybody. It’s Lenny Murphy here in our continuing CEO Series. And today, it is my honor and privilege to be joined by Peter Hartzbech, Founder and CEO of iMotions, which if you’ve been following the news, has been doing a lot of cool stuff lately. So Peter, welcome.

Peter Hartzbech: Thank you very much, Leonard. It’s amazing to be here today. So it’s afternoon here in Denmark, so you know, I’m sorry I had to get you up this early. [LAUGHS]

Lenny Murphy: Oh, that’s good. You know, I’m sure you’re aware that I have lots of kids. So being up early is – that happens no matter what. So if it had been a Saturday, maybe I would have been a little grumpier, but we’re OK. Appreciate that. [LAUGHS]

So as I mentioned, anybody who’s been following this kind of unprecedented level of M&A and funding activity in the market, iMotions has come up pretty frequently with some of the deals that you have done. And I won’t steal your thunder. Why don’t you tell us a little bit about the iMotions story and where you are recently with this massive growth trajectory that you’ve been on.

Peter Hartzbech: Yeah. Well, thank you very much. Yes, I will do that, try to do it shortly. [LAUGHS] So as you probably know, it’s been a 17-year journey. I started in the eye tracking industry in 2004, 2005. I think we made it to some trade shows back in the day – AIF and MIA and all these, I recall.

But since then, it’s been a wild ride – a lot of ups and downs, as it is with most startups. But since 2011, we have been profitable and growing substantially every year. So it’s been a pretty exciting [INAUDIBLE] strategy that, of course, a lot of companies don’t follow. But that’s also why we came through COVID really well: we are a very stable company, and we have had a lot of clients for many years that have been supporting the company.

So maybe I should just quickly explain, for the ones who don’t know what iMotions is. Basically, we are a software house – we are focused on software. We started out working with eye trackers and eye tracking. But what we do today is called multimodal research: we combine a lot of different biosensors together into one easy-to-use software platform.

So for example, if you have a website, you can test your website. We can measure exactly where people look, with their eyes, through the eye tracker. And then you could, for example, put on a GSR sensor – galvanic skin response – which measures the intensity of the emotional response you have. If you’re frustrated or irritated, then your GSR, your arousal, goes up.

And then we also measure facial expressions. We work with Affectiva and Realeyes as well, but primarily Affectiva, who have built a facial expression engine that can measure a lot of things from your face – like brow furrow if you are frustrated when you look at the website, joy, and so on.

So that’s the basis of the company. We are focused on software, as I said, but we have also built a large ecosystem of suppliers. All the different hardware companies that academic or commercial researchers call on, we have more or less integrated into the platform. So we are a one-stop shop for people who want to start with biometric research. You can buy the hardware, training, and software together, so we can help you build your lab.

But on top of that, we are not only lab-based software. We also have an online solution that combines facial expressions, eye tracking, and surveys, and also our mobile platform – this is the future of the research lab, where you can have a phone and wear sensors for a longer time, for more longitudinal studies. So those are kind of the three main evolution areas for our research products.

So yeah, and then, now you mentioned the M&A activity. We have been thinking about how we should bring the company to the next level growth-wise and so on. Since we are bootstrapped, it’s been a pretty hardcore journey. You have to make the money, then you invest in new sales team members, who have to be ramped up, and so on.

So of course, it’s a tough ride but a very secure ride, so to say. So we wanted to figure out how we could boost the growth and the potential of the human behavior research market in the future. And that’s why we have now been acquired by Smart Eye.

And maybe I can just talk shortly about what Smart Eye has been doing. They have been at it for 22 years or something. So Martin, the CEO of Smart Eye, has been in it for 22 years. I’m still only a teenager – I’ve only worked in the industry for 17 years. [LAUGHS] So I’m the teen in that relationship.

But they have been building eye trackers primarily for hardcore research labs, such as NASA and so on, but also for a lot of automotive companies – so for [INAUDIBLE] purposes. And then they have used that to scale into the automotive industry. So basically, they are the world leader within safety systems in cars, what is called driver monitoring systems – basically, the camera that observes you from behind the wheel. And by law, I think by 2025, both in the EU and the US, you actually have to have a driver monitoring system in the car. So it’s a super exciting vertical and area to be working in.

And then the third part of this M&A activity is that Smart Eye was actually acquiring Affectiva – which most of you out there also know – maybe mostly for the media analytics side, where they have built a facial expression engine for advertising testing, video ads, trailer testing, and so on. But actually, they also had a focus on the automotive industry.

But they are not doing the driver monitoring systems. They were kind of in the next wave of things, which is called interior sensing. It’s basically more observation of the whole car, where the system can see, for example, that there’s a baby in the car. So if you leave the car when it’s 100 degrees Fahrenheit outside, the car would alert you, because there are actually 50 babies dying every year from this, unfortunately.

So that’s called interior sensing, inside the cabin. So there’s both, you can say, a fit between the three companies with regard to the automotive industry, but also with the human behavior research industry and multimodal research. So it’s really, really super exciting. This is mostly about the behavior research side, I think, but maybe I can finish the automotive side of it here.

So the next generation of that – the first was driver monitoring systems. Then the interior sensing is coming, and that was what Affectiva was really one of the world leaders in. And then the next wave is a kind of multimodal automotive safety, so to say, where you have more sensors inside the car. And that’s, of course, where iMotions can bring some really important knowledge to the table – combining different sensors and frame rates and so on for the next generation in automotive.

So that is super exciting. Yeah, and it’s really cool. I have worked with both Rana and Martin Krantz from Smart Eye – they’re personal friends. And I’m looking forward to it – we’re going to be leading the executive team together, with the CFO, going forward, across the three companies. So in some ways, it’s also more like a merger of three companies. But of course, this was an acquisition.

But I see this as a long-term future for iMotions. I’m still very excited about it and hope to stick around for a lot of years and keep building. And now we actually have the platform of being on the stock exchange, right? Since Smart Eye was on the stock exchange, that’s also why this deal could be done, so to say. And now we have all three companies there. Yeah, and it’s possible, of course, to raise substantial money for the future if we need to.

So yeah, so that’s the short story, just to – I mean, where we are today is I think we have approximately 1,300 clients in 80 countries inside iMotions. So now it’s about how do we get that to 5,000 or 10,000 clients and then move ahead.

Lenny Murphy: Yeah, that is an exciting story, right? And knowing all three of you, and particularly Rana – I mean, I consider Rana a personal friend, and have for years. I just love it when great people come together to do great things. So it’s heartwarming as much as it’s exciting from a business standpoint.

And from that perspective, I think there had been two kinds of nascent categories in the industry until 2020. One was digital qualitative, in all of its permutations, and the other was nonconscious measurement; neither had hit scale yet – lots of cool innovation, lots of great activity.

And I certainly noticed – I mean, it was obvious on the qual side that, [SNAPS FINGERS] yes, that got flipped. But I expected – and I think you have validated this as well – that as the world shifted to digital out of necessity, eye tracking and facial coding and understanding emotional states in general, when everybody is going, ah, the world is changing so rapidly and consumer responses and behavior change so rapidly, would also become very in demand. And I think we’ve seen that play out across the category.

Other companies in your space play variations on a theme, whether it’s implicit or facial coding or eye tracking or galvanic skin response, whatever the case may be. That rising tide has lifted all those boats.

Peter Hartzbech: Yeah.

Lenny Murphy: And you’re an example, then, of what happens when that starts to occur – we start to see some consolidation to try to find synergies and grow the category bigger. So it’s exciting to see you doing that. It’s time for that to happen within the category. And I think it’s a real breakthrough and incredibly interesting. So hats off.

Peter Hartzbech: Well, thank you, thank you. Yeah, and if I can add to it, this market started very fragmented, right? There were a few hardware eye-tracking companies – they did a little bit of eye tracking and some analysis on top. There were GSR companies, EEG companies. It was all very fragmented, and nobody really worked together, right?

And then iMotions came in around 2011 and thought, OK, how can we combine all of these things with a more horizontal approach? And now, bringing the technologies into the same place, I just want to say, though, that we actually are very focused on keeping our ecosystems. So iMotions will be running independently. That’s also why I stayed the CEO, and the whole team and management inside iMotions is the same. It’s very important for us to have these hardware partners and other software algorithms so we can make sure the researcher gets the best possible product, right?

But of course, in the longer term, there are a lot of ways that we can help get the products moving in the right direction, both within facial expressions and eye tracking. But I think that’s really what is exciting about this – there are also ways that we could do new business models. When you control both some of the technology and the software, you can make sharper business models that fit the needs of the clients better. And I think that might be one of the things that most have been struggling with.

If you’re an eye-tracking company, for example, you have to sell expensive hardware and then have a little bit of software. But you have to start from zero every year. It’s very hard to make any recurring revenue and build a large, sustainable business. So I think the combination of these three companies is going to be really, really exciting.

Lenny Murphy: Yeah. Now – and I want to be conscious of time, but there are kind of two things that are interesting here, too. I believe there’s also a piece of the business within health and wellness, and particularly diagnostics. Is exploring that going to continue as well? As somebody with a neurological disorder, it’s interesting for me to think through that. So yeah.

Peter Hartzbech: So personally, it’s very important to me. Of course, I cannot control exactly what’s going to be done. But the original reason why we went multimodal and actually built the iMotions multimodal solution was that my mother passed away from Parkinson’s, right? And you could see all these physical signs, but she couldn’t articulate it. And you know, it’s not until someone has passed away that you can actually diagnose it.

So for the families out there, just the fact that you don’t know what’s happening, that’s one thing, right? You cannot treat them. You cannot work with these patients. So that is really, really an important part of iMotions’ vision. And we still hope to continue to do a lot of that.

So the way that I see it is that, in general, iMotions does R&D horizontally – so for example in the health care industry, but also gaming and a lot of other industries. But in health care specifically, we have a lot of clients already, both in academia and in the commercial space, that are actually using our software to try to diagnose autism, Alzheimer’s, Parkinson’s, OCD, post-traumatic stress, all these disorders. So that is a very important part of the future, the way I see it.

And the timing of when it is, and if it is, and how it’s going to happen, of course, I don’t know. But at least on an R&D level, we will continue that, and continue evolving not only the lab product but also the phone product, I think. Alongside the lab, there is the phone for more longitudinal studies – you can observe the patients in the home, where you have patches on and so on, with your phone. I think that’s a really, really important part of the future for iMotions and the group.

Lenny Murphy: That’s exciting. You know, I don’t know about you, but to have great commercial success and try to do good at the same time – there is no better combination, in my mind.

Peter Hartzbech: Exactly. And I think those three companies – that’s one of the reasons we also did it. I think, from iMotions’ side, it’s like, I know Rana very well, as I said, and Martin. And they’re really good people that also want to change the world and save lives and so on.

So the whole culture and the vision of the three companies are already very aligned. There’s, of course, a big risk when you do something like this. But I feel very confident, also because our teams have worked together across the companies for many years. I think we have worked with Affectiva for more than seven years and brought them into the academic market. And with Smart Eye, I think it’s approximately five years now. So to work with people who want to change the world and think big, that is really exciting.

Lenny Murphy: Absolutely. Well, you know, I don’t know about you, but the older I get, my mantra really is, I only want to do things that I like with people that I like.

Peter Hartzbech: Yeah.

Lenny Murphy: Now, obviously, life doesn’t always work that way. But it’s certainly an aspirational goal.

Peter Hartzbech: And it’s [INAUDIBLE]

Lenny Murphy: Yeah. So last question, let’s talk for the last few minutes. Lots and lots of discussion and activity now around the Metaverse. For those that have maybe read Ready Player One or seen the movie, that’s the context that I think of – all right, we’re going to build the OASIS, which was that construct, or The Matrix. I don’t know which one is [LAUGHS] maybe more accurate. But in terms of how that technology works, it certainly would seem to be an incredibly fruitful opportunity for the integration of various biometric and nonconscious measures into hardware as well as software.

So are you exploring what that looks like – if we all have VR headsets in the next five years because Zuckerberg decides this is what he’s going to bet it all on – particularly from a research application? I mean, I can imagine – how does the shelf test change when it’s all VR? I think those things can be really exciting and scary at the same time. But any thoughts?

Peter Hartzbech: Yeah, so we’ve already been in VR for, I would say, maybe four or five years. It started with some really almost hand-hacked-together VR headsets – that’s what we started with, doing eye tracking. Today, with the Varjo headset, for example, it’s really like real life. It looks like the two of us standing here, right?

So the technology is there. There’s something with the pricing that probably has to come down a bit in order for it to happen. But already now you have extremely precise eye tracking inside the VR headsets. And it’s also used for the rendering, to do it more optimally, you know, so you don’t have to render too much where you’re not looking, for example.

So I think eye tracking is a core component of VR in general. But I also think, of course, other sensors, as we talked about – I know a lot of companies who are looking at patches you can put on here, so you have some facial expressions, or if you get warm in the face, for arousal measurement and so on. So there’s no doubt that this is going to scale as well. And whether it’s going to happen in the metaverse and all that, I would leave that up to Zuckerberg to decide. But if it is on a smaller scale, we are ready for that. We are also ready for the larger scale.

But of course, I’m also slightly worried about the privacy, because it’s going to be a little bit scary when you’re sitting there and someone can basically derive everything that’s happening and everything you’re looking at and so on. And you know, personally, I’d rather there be an application that’s useful and people choose to be part of it, right? So let’s say, for example, diagnosis through VR when you have post-traumatic stress, or you have arachnophobia – you have a spider coming over, and you can train yourself.

That kind of application, I think, is really huge. And that’s going to happen no matter what. And that’s kind of maybe where we are most focused. So I think there’s going to be some privacy discussions if we are going to go into this big OASIS one day and sit here at home. [LAUGHS] So yeah, but it’s super exciting.

But for research purposes, it’s already relatively popular. Of course, it’s in the early stage, but we still have a lot of clients that are exploring it now. And as you said, for example, for shelf testing and stuff like that – already a lot of the large fast-moving consumer goods companies are doing it and so on.

I think the trick that needs to be solved, though, is how you quickly create the environments – the virtual environments – with Unity, for example.

Lenny Murphy: Non-scalable. Yeah.

Peter Hartzbech: Unity needs to evolve a bit more so that normal researchers can sit down and use it, and so that you have a large library of standard environments – also for knee operations or whatever you want to use for training and all that. That has to evolve a lot more. And that is basically what stops it a little bit right now: people have to spend a lot of time building, let’s say, a doctor’s operating room where he has to do a knee operation, for example. That can take months or half a year or more. So that is what needs to move a bit quicker. And then I think it’s definitely going to happen.

Lenny Murphy: Yeah. Yeah, I agree wholeheartedly. That’s always been the challenge – even 15 years ago, we were doing the virtual magazine flips in Flash, right? Great and interesting from a research standpoint, but building it was incredibly time-consuming and expensive. So I agree, that’s been the case for gamification as a whole, that idea – it is that rendering component.

Peter, this has been great. I think you and I could go on for a really long time. And hopefully, we will have another opportunity to do that. I would even love to do a trio conversation with you and Rana and Martin at some point.

Peter Hartzbech: Let’s do it.

Lenny Murphy: Yeah, let’s do it. But hats off. Kudos. You’ve broken down a barrier that needed to be broken down in the industry to help create scale and show that these technologies absolutely have multiple applications, not just in the research space, and can be an incredibly exciting business model across the board. So congratulations on all the success.

Peter Hartzbech: Thank you very much.

Lenny Murphy: And thanks for spending time with us today.

Peter Hartzbech: It’s all about the team, man. It’s the team that has done it. So I want to send out a thanks to them. So yeah, great. Thank you very much.

Lenny Murphy: Thank you, Peter. I guess it’s not too early to say happy holidays. So we’re almost there.

Peter Hartzbech: [INAUDIBLE] Happy holidays to you.

Lenny Murphy: We’ll go ahead and say that. All right, thank you so much. You have a great day.

Peter Hartzbech: Thank you.

Lenny Murphy: Bye. Bye-bye.
