It’s a rather simple question, but a somewhat fundamental one in computing in general.
Let’s say we have a player-controlled character in an open world. The world is large. In this game world we have a vast number of trigger zones that are responsible for changing the music, the lighting, animations, and so on. And the game clock runs at 60 fps.
In this situation we have so many subroutines sensitive to “has the player entered the area?” that we can count them by the thousands. Here’s where I’m not sure. Since we have so many trigger subroutines, it seems we can’t simply have each of them check whether the player has entered its area every 60th of a second. Even though a single such “listening” check doesn’t take up many resources, scaling the world bigger and bigger would surely create a performance hit with this approach. So there must be some other mechanism to manage all of those trigger zones. I’m inclined to think the mechanism goes like this:
- The player character’s subroutine constantly (at 60 fps) updates its position.
- With each update frame, the player character’s subroutine says something like: “Hey, I’m at position x(5.0), y(3.0), z(10.5), so what branch of code should I execute at this position?” The response comes from an indexed map that holds every possible coordinate and the corresponding code to execute. (Of course, we could apply lossless compression to save memory.)
Am I correct? …or does every subroutine listen for input every update frame? (I hope it doesn’t.)
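To make my guess concrete, here’s a small sketch of the mechanism I imagine. All names are hypothetical, and I’m assuming the “indexed map” is really a uniform grid of cells rather than a literal entry per floating-point coordinate, since those can’t be enumerated:

```python
# Sketch of the hypothesized mechanism: instead of every trigger polling
# the player each frame, triggers are bucketed into grid cells, and each
# frame we look up only the triggers registered in the player's cell.
# All names here are hypothetical, not any real engine's API.

CELL_SIZE = 10.0  # world units per grid cell (assumption)

def cell_of(x, y, z):
    """Quantize a world position to an integer grid-cell key."""
    return (int(x // CELL_SIZE), int(y // CELL_SIZE), int(z // CELL_SIZE))

class TriggerIndex:
    def __init__(self):
        self._by_cell = {}  # cell key -> list of trigger callbacks

    def register(self, x, y, z, callback):
        self._by_cell.setdefault(cell_of(x, y, z), []).append(callback)

    def on_player_moved(self, x, y, z):
        # Per-frame cost is proportional to the triggers in ONE cell,
        # not to the thousands of triggers in the whole world.
        for callback in self._by_cell.get(cell_of(x, y, z), []):
            callback()

index = TriggerIndex()
events = []
index.register(5.0, 3.0, 10.5, lambda: events.append("change music"))

index.on_player_moved(5.0, 3.0, 10.5)    # same cell -> trigger fires
index.on_player_moved(500.0, 3.0, 10.5)  # distant cell -> nothing runs
```

This way the inactive triggers sitting in distant cells consume zero CPU per frame; only the dictionary lookup for the player’s current cell runs.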
Edit addition:
I’m not asking whether games divide their worlds into chunks, areas, or grid cells to save memory; I’m asking about their approach to saving CPU/GPU cycles, and how inactive triggers are treated.
Thank you to those who have already given their answers.