Two very important things about AI:
1. AI is incapable of insight. I don't mean now, I mean ever.
For something to count as an insight, it has to either expand our knowledge or connect two dots in a way that has never been done before. It's no accident that neurodivergence is overrepresented in research: you need to be hypercurious and a little bit weird to see things that others might miss.
A bot, by contrast, defaults to the average. Even when presented with new information, it will synthesise it in exactly the same way as everyone else would; virtually anyone looking at it would get the same answer. If that's all you're capable of, you're not a very good researcher.
2. Wading into the weeds is the core of research. It's not something to skip because it IS the job.
When you work through transcripts line by line, you deeply internalise the information in them. Each time you pick apart and recombine that information, as you do during analysis, you embed it deeper into your memory.
By the time you come to present the results, you understand them deeply. If you try to skip that stage, you have learned nothing.
As you say, when you get the whole team together to go through the affinity map, that embeds the information in their memories, too, and they also draw insights. My process is subtly different - I get everyone to silently write down everything they found interesting or surprising and then talk us through it - but the result is the same.
I go through it in advance (as a sort of pathfinder) and then again with the team, both to get the richest, deepest insights and so that they can learn, too.