I’m hoping someone can help me make sense of this graph. I’m trying to see how CPU utilization changes as more game sessions get packed onto the same server instance. Here, I opened up about 50 sessions over the course of roughly 15 minutes.
The red line is game sessions, and the brown line is “CPU Utilization.” The brown line is what’s confusing me.
Usually, when I think of CPU utilization, I think of it being basically 0% when nothing is happening, and then it creeps up to 100% as more things happen. And, in fact, that is what I saw last night when I did a similar test using a different instance type.
But here, it starts around 80% and drops to 30% as we reach 50 game sessions. Then it goes back up to 80% the moment the game sessions end. That seems to directly conflict with what I saw last time.
I suppose one possibility is that by “CPU utilization,” Amazon really means CPU “availability,” which would explain why the value seems inversely related to the number of game sessions. But if that were the case, why does it hover between 80% and 100% when the instance is just sitting idle with no game sessions? And why does this seem to be the opposite of the behavior I witnessed during my previous test?
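To make the “availability” hypothesis concrete, here’s a tiny sketch that inverts the numbers I read off my graph (the readings are rough eyeball values, not exact data). If the metric really reports free CPU, then actual busy CPU would be 100 minus the reported value:

```python
# Sanity-check the "utilization is actually availability" hypothesis:
# if the dashboard reports FREE CPU, actual busy CPU = 100 - reported.
# The readings below are rough values eyeballed from my graph.

readings = {
    0: 80,   # reported % with no game sessions running
    50: 30,  # reported % at ~50 game sessions
}

for sessions, reported in readings.items():
    busy_if_inverted = 100 - reported
    print(f"{sessions} sessions: reported {reported}% -> "
          f"{busy_if_inverted}% busy if the metric is really availability")
```

Under that reading, the instance would sit at ~20% busy while idle and climb to ~70% busy at 50 sessions, which at least moves in the direction I’d expect, though it still doesn’t explain the idle baseline or the discrepancy with last night’s test.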
Thanks so much in advance for any light you can shed on this issue!