The argument that you "need a loop because otherwise what calls the event listener" does not hold water. Admittedly, on any mainstream OS you do indeed have such a loop, and event listeners do work that way, but it is entirely possible to make an interrupt-driven system that works without any loop at all.
But you still would not want to structure a game that way.
The thing that makes a loop the most appealing solution is that your loop becomes what in real-time programming is referred to as a cyclic executive. The idea is that you can make the relative execution rates of the various system activities deterministic with respect to one another. The overall rate of the loop may be controlled by a timer, and that timer may ultimately be an interrupt, but with modern OSes, you will likely see evidence of that interrupt as code that waits for a semaphore (or some other synchronization mechanism) as part of your "main loop".
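As a rough illustration (a C++ sketch, assuming a hypothetical 100 Hz tick), that timer usually surfaces as nothing more than a blocking wait at the bottom of the loop:

```cpp
#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    constexpr auto tick = std::chrono::milliseconds(10); // 100 Hz cycle

    auto next_frame = clock::now() + tick;
    while (true) {
        // ... one frame's worth of work goes here ...

        // This blocking wait is where the timer shows up: the OS wakes
        // us at each frame boundary, so every pass starts on schedule.
        std::this_thread::sleep_until(next_frame);
        next_frame += tick;
    }
}
```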
So why do you want deterministic behavior? Consider the relative rates of processing of your user's inputs and the baddies' AIs. If you put everything into a purely event-based system, there's no guarantee that the AIs won't get more CPU time than your user, or the other way round, unless you have some control over thread priorities, and even then, you're apt to have difficulty keeping timing consistent.
Put everything in a loop, however, and you guarantee that your AIs' time-lines are going to proceed in a fixed relationship with respect to your user's time. This is accomplished by making a call out from your loop to give the AIs a timeslice in which to decide what to do, a call out to your user-input routines to poll the input devices and find out how your user wants to behave, and a call out to do your rendering.
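Here is a minimal sketch of that structure; the names (`poll_input`, `update_ai`, `render`) are placeholders of my own, not any particular engine's API:

```cpp
// Placeholder subsystems; empty stubs stand in for real implementations.
void poll_input() { /* sample keyboard, mouse, gamepad */ }
void update_ai()  { /* each AI decides its next move */ }
void render()     { /* draw the current frame */ }

// One pass of the cyclic executive: each subsystem gets exactly one
// timeslice per frame, so AI time advances in lockstep with user time.
void run_one_frame() {
    poll_input();  // how does the user want to behave?
    update_ai();   // AIs react, one step per user step
    render();      // show the result
}

int main() {
    run_one_frame(); // in a real game this sits inside the timed loop above
}
```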
With such a loop, you have to watch that you are not taking more time processing each pass than actually goes by in real time. If you're trying to cycle your loop at 100 Hz, all your loop's processing had better finish up in under 10 ms, otherwise your system is going to get jerky. In real-time programming, it's called overrunning your time frame. A good system will let you monitor how close you are to overrunning, and you can then mitigate the processing load however you see fit.
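One common way to do that monitoring (sketched below, assuming the same 100 Hz loop as above) is to timestamp each pass and compare the elapsed time against the frame budget:

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    constexpr auto budget = std::chrono::milliseconds(10); // 100 Hz budget

    for (int frame = 0; frame < 600; ++frame) {
        auto start = clock::now();

        // ... input, AI, and rendering work for this pass ...

        auto used = clock::now() - start;
        if (used > budget) {
            // Overran the time frame: log it so the load can be trimmed.
            std::printf("frame %d overran its 10 ms budget\n", frame);
        }
        std::this_thread::sleep_until(start + budget); // wait out the rest
    }
}
```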