Originally posted by Mezlo
I think about 'lag' in three ways (rough sketch in code below):
- Network latency:
  - How long does it take a packet to travel from the user's machine to the server?
  - How long does it take to travel the other way?
- "Server latency": the time from receiving a request to finishing its execution.
- "Client latency": the time from receiving info from the server to finishing acting on that info.
So, let's say there are 50 other players near a player's character, all doing stuff. The server dutifully sends all that info to someone's ready-to-retire-yesterday computer, and the data gets there fast. BUT, it takes the tired old computer 10 seconds to finish digesting all that info and draw the screen.
That gives you a 0.1 "frames per second" display rate, at best. That player has what I call a "client latency" problem, but what's his complaint?
"It's laggy."
Now, let's say a zone server is overloaded--too many people at location _____. The server is having a hard time keeping up, and is slow to send out the "Here are the 50 people near you" message that answers the (implied) client request of "Hey, what's around my character?"
Another user, with his $5000 desktop gaming monster, is displaying the game at a crisp 100+ frames per second. But the server is only sending updates about the stuff around him every 5 seconds. His complaint? Well, of course it would be:
"It's laggy."
In both cases, the problem wasn't caused by network latency--both were issues with computational load. But to the end user, it's all 'lag'.

