
tinyBuild don't spy on employees with AI, says CEO who suggested doing so to identify "time vampires"

"It works" but was "hypothetical"

In a talk at this week's Develop:Brighton conference, tinyBuild CEO Alex Nichiporchik gave examples of how large language models such as ChatGPT could be used by video game studios to identify "potential problematic players on the team." The suggestions included feeding employees' text chats and video call transcripts into a system to detect certain words which could indicate burnout and "time vampires."

After receiving criticism online, Nichiporchik tweeted to say that parts of his presentation had been taken out of context and that the examples were "hypothetical".

Well I'm not going to embed a tinyBuild trailer am I.

"We do not use AI tools for HR, this part of the presentation was hypothetical," said Nichiporchik in the final tweet of a thread.

During the presentation, Nichiporchik described a process he called "I, Me Analysis", as reported by Whynow Gaming. The process involves feeding Slack transcripts, and automated transcriptions from video calls, into ChatGPT in order to count how many times an employee uses the words "I" and "me".

"There is a direct correlation between how many times someone uses ‘I’ or ‘me’ in a meeting, compared to the amount of words they use overall, to the probability of the person going to a burnout," Nichiporchik reportedly said. "I should really copyright this, because to my knowledge, no one has invented this."

He also spoke about how a similar process could be used to identify "time vampires" - employees who talk too much in meetings. "Once that person is no longer with the company or with the team, the meeting takes 20 minutes and we get five times more done."

Through these systems, Nichiporchik suggested a company might be able to "identify someone who is on the verge of burning out, who might be the reason the colleagues who work with that person are burning out, and you might be able to identify it and fix it early on."

Whynow Gaming also report that Nichiporchik said tinyBuild had run these processes retroactively on former employees, but were now beginning to use them actively. "We had our first case last week, where a studio lead was not in a good place, no one told us. Had we waited for a month, we would probably not have a studio. So I’m really happy that that worked," he reportedly said.

This would seem to contradict Nichiporchik's insistence that tinyBuild "do not use AI tools for HR" and that the presentation was purely "hypothetical", claims he repeated on the tinyBuild blog.

As many have already pointed out, using large language models for employee surveillance under the guise of detecting burnout is deeply dystopian. "Black Mirror-y", to borrow Nichiporchik's own description. So is describing employees as "time vampires", or suggesting that a ChatGPT process might lead to someone being "no longer with the company." (Plus, as someone else has surely pointed out, you don't need a large language model to count the number of instances of a word in a text. You could just press ctrl+F in a decent text editor.)
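(To labour that last point: the whole of "I, Me Analysis" fits in a few lines of Python, no large language model in sight. Everything below, from the function name to the sample transcript, is invented purely for illustration.)

```python
# The entirety of "I, Me Analysis", no large language model required.
# The sample transcript is made up for illustration.
import re

def i_me_ratio(transcript: str) -> float:
    """Fraction of words in the transcript that are 'I' or 'me'."""
    words = re.findall(r"[A-Za-z']+", transcript)
    if not words:
        return 0.0
    hits = sum(1 for word in words if word.lower() in ("i", "me"))
    return hits / len(words)

meeting = "I think we should ship it. Me? I just want the meeting to end."
print(f"I/me ratio: {i_me_ratio(meeting):.0%}")  # -> I/me ratio: 21%
```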

I use "I" and "me" a lot in work meetings, but that's only because I fight burnout by singing the hook from Because I'm Me at the start of each Teams call. Let me do a presentation at the next Develop.
