
Server Performance Configuration Tweaks?

Ninjaboy

Member
Aug 2, 2006
I was wondering if there were any performance settings for RO?

Our server has an abundance of CPU and bandwidth readily available, so I want to tweak the server to use more of our resources. I noticed that when our server is full with 32 players we are only using about 25% CPU and memory at peak.

The problem, however, is that our pings are about 50-75ms higher than they should be for some reason. This only happens once the server gets to about 26-32 players. We have more than enough CPU and bandwidth, so I want to use more of our resources to lower those pings and increase performance at the cost of CPU/bandwidth usage.
 
Zetsumei said:
I've had the same problem since launch. Changing the tickrate does nothing (so it's not CPU dependent); you can put it at 50 or at 10 and it won't change the ping problem. Changing netspeed won't help either: putting it at 25000, or just 5000, or anything in between all gives the same problem, and at 28+ players pings go up.

Interesting, because according to the nettick document it should have some effect. I'm going to test the Unix version and see if it's more effective at managing the resources. VMware is an awesome program if you're running a Windows box; you can virtualize a Unix session using a .iso file on your Windows server.
 
Upvote 0
Update:

After doing some testing and a little more research, I concluded that the Unreal engine is probably one of the worst server-side engines in terms of resource usage out there. Battlefield 2 and Savage can support 64 players on the same hardware it takes Unreal to support more than 26 players.

The short version: if you want to run a 32 player server you're going to need raw single-CPU horsepower. It should be a dedicated box only running Red Orchestra, 3.2GHz+ (or equivalent), with 2 GB of memory. If you have a dual-CPU, hyperthreaded, etc. box, it won't make a difference; it will report 25% CPU usage, of course, but will be eating 95-100% of one CPU. I'd probably disable hyperthreading as well, as this has been known to cause issues with most dedicated servers, such as Battlefield 2 (see the BF2 v1.3 dedicated server recommendations in the patch notes).


If you are not meeting this minimum requirement, you will experience what appears to be "lag" on certain server-side CPU-intensive maps like Arad. For example, a player who should be at 60ms will show as 200ms or even more. It's not actual network lag but lag from the server not being able to update the players quickly enough. There are other theories about the cause, including one claim that it makes no difference and is purely cosmetic, but in my experience the gameplay quality does feel degraded. It could mean that the Unreal 2 engine simply shows odd, cosmetic behavior like this past a certain player count or on certain maps without affecting gameplay. It could also mean that insanely high pings for local players are a server performance issue, and you should check your server's CPU usage.


I'll be testing an Intel 950D server and the new Woodcrest Xeon processor to see how they perform with Unreal. My final conclusion is that with a 2.8GHz dual Xeon system and 2 GB of memory you can't effectively run more than 26 players with all the maps enabled.
 
Last edited:
Upvote 0
If you guys want more players on your servers then learn how to adjust the client rates. Unreal engine servers are fantastic. I run a 32 player server on a 1.8 GHz A64 CPU perfectly fine & no one complains about lag.

The settings you want are:

MaxClientRate=15000
MaxInternetClientRate=10000
Those are default numbers. I run at 7000 for both.

Tickrate is useless. Leave it at its default.

It's also very helpful to know your upload speed. Download is almost always far faster than upload; for instance, my home connection is 2 Mbit up & 15 Mbit down.
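
If it helps, here's roughly how that block looks in a server INI. I'm assuming the standard Unreal Engine 2 layout here (the [IpDrv.TcpNetDriver] section in RedOrchestra.ini, or whatever your install's main INI is called), so double-check the names against your own file:

[IpDrv.TcpNetDriver]
; defaults are 15000 / 10000; I run 7000 for both
MaxClientRate=7000
MaxInternetClientRate=7000
; NetServerMaxTickRate in this same section is the "tickrate"; I just leave it at its default

Restart the server after editing so the new rates actually get picked up.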
 
Last edited:
Upvote 0
Devourer said:
If you guys want more players on your servers then learn how to adjust the client rates. Unreal engine servers are fantastic. I run a 32 player server on a 1.8 GHz A64 CPU perfectly fine & no one complains about lag.

The settings you want are:

MaxClientRate=15000
MaxInternetClientRate=10000
Those are default numbers. I run at 7000 for both.

Tickrate is useless. Leave it at its default.

It's also very helpful to know your upload speed. Download is almost always far faster than upload; for instance, my home connection is 2 Mbit up & 15 Mbit down.

I hate to burst your bubble, but MaxInternetClientRate will have almost no effect on your server unless your server doesn't have enough bandwidth to update the players at the recommended speeds. The client rate works as follows:


"The netspeed decides how much data you want to send to the server each second. A netspeed of 5000 will try to send 5000 bytes of data each second. And yes, for the smart ones out there that already guessed it, netspeed of 13690 will send 13690 bytes pr second. Also, it tells the server how much data you want to receive each second. The servers seem to not allow going under 2000, clients have a minimum of 500.

An interesting aspect of the netspeed is that it limits your framerate. Your maximum expected framerate online with netspeed 5000 is 5000/64 = 78fps. You should not get more, and you will often get less. So why /64, you ask? Well, it's kinda simple. Each time your computer does an update, it sends about 64 bytes of data. So the great doods at Epic thought: let's do netspeed/64 and limit framerate that way, so the client does not exceed netspeed bytes sent per second."

So there would be only two reasons to change that rate: #1, if an individual client machine can't handle the expected framerate of a 10000 netspeed; or #2, if your server is limited on bandwidth. Lowering it can also cause your clients to be shooting at the target and still "missing".
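
To put rough numbers on that formula for the rates being thrown around in this thread (just the /64 rule from the quote above, nothing fancy):

netspeed 5000: 5000/64 ≈ 78 fps cap
netspeed 7000: 7000/64 ≈ 109 fps cap
netspeed 10000: 10000/64 ≈ 156 fps cap

So capping at 7000 only costs framerate for clients who could push well past ~109 fps; the bigger trade-off is the reduced amount of data per update.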

Tickrate 30, Netspeed 500

vs.

Tickrate 30, Netspeed 5500


The "tickrate" does effect the server by telling it how fast to process the information.

Tickrate 20 | Tickrate 40

This is using one of the older Unreal engines; however, the developers have stated the variables affect the game in the same manner, since their meaning hasn't changed in the newer engines. I can't imagine an A64 at 1.8GHz being able to handle a 32 player server effectively.
 
Upvote 0
I'm actually glad you can't "imagine" an A64 3000+ running at its default 1.8GHz handling a 32 player server effectively. Even after saying
I noticed that when our server is full with 32 players we are only using about 25% CPU and memory at peak.
Must mean I am doing something right. & if you think client rates do not affect your overall bandwidth on your server, that's more than fine as well. F6 will tell the client what speed they are getting in game. Funny how, when I hit F6 in game, it says exactly what bandwidth I am allowing the clients from the server.

Feel free to test different MaxClientRate & MaxInternetClientRate. I'm sure you'll see a difference but if you want to insist there isn't then that's fine with me as well. Doesn't hurt my server one bit.

Edit: My dedicated server listed my Client "netspeed" at 7000 within the dedicated server cmd window. Just like I set it. I do not have unlimited upload so I need to throttle some of the bandwidth.
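
For anyone curious, the rough math behind that throttling, assuming a full 32 player server where every client actually hits their cap:

32 players x 7000 bytes/s ≈ 224 KB/s ≈ 1.8 Mbit/s upstream
32 players x 10000 bytes/s ≈ 320 KB/s ≈ 2.6 Mbit/s upstream

On a 2 Mbit upload, that difference is exactly why I cap at 7000.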
 
Last edited:
Upvote 0
Devourer said:
I'm actually glad you can't "imagine" an A64 3000+ running at its default 1.8GHz handling a 32 player server effectively. Even after saying

Read the entire thread and this will make sense to you. I stated my reasoning behind why it was reporting 25% (hyperthreading + one other processor that also hyperthreads) later in the post. Meaning the game is actually capped out and overloaded, since ROOST doesn't take advantage of hyperthreading and isn't multi-threaded.

Must mean I am doing something right. & if you think client rates do not affect your overall bandwidth on your server, that's more than fine as well. F6 will tell the client what speed they are getting in game. Funny how, when I hit F6 in game, it says exactly what bandwidth I am allowing the clients from the server.

I didn't say they don't affect the overall bandwidth on the server. Quite the contrary: I said that is the only reason to even use or limit the client rates. I also implied that if you lower that rate it might save your bandwidth but decrease the overall game quality for the players, especially with 32, since less information is being sent/updated. It will also limit the maximum FPS a client can achieve. See the previous reply.


Feel free to test different MaxClientRate & MaxInternetClientRate. I'm sure you'll see a difference but if you want to insist there isn't then that's fine with me as well. Doesn't hurt my server one bit.

Well, of course if you change the maximum data your clients are allowed to receive you're going to see a difference. The only problem is it's not the one I want to see.

Edit: My dedicated server listed my Client "netspeed" at 7000 within the dedicated server cmd window. Just like I set it. I do not have unlimited upload so I need to throttle some of the bandwidth.

That's the only thing that has made sense so far. If your upload speed is limited, then you can throttle it back to prevent clients from lagging out. However, any time you limit the default recommended settings you will lose gameplay quality; to what degree is, of course, subjective. If you look at the links above you can see that lower client rates result in a less fluid hitbox, because clients are actually "warping" or "teleporting" from one spot to the next. This means that if someone is running right in front of me, I stand a good chance of missing him even though he's visually on my screen, because only every 5th frame of the other player is actually going to register.

A 3000+ probably isn't enough to handle a 32 player server on vehicle and arty maps. I have not tested an AMD64 3000+, however, so I can't say for certain. I do know that a 3.4GHz Pentium D 950 works and a 2.8GHz Xeon does not. I err on the side of caution when it comes to lowering bandwidth consumption below the default limits, but I suppose there is always room to tweak and adjust.
 
Upvote 0
The dedicated server for the Unreal 2.5 Engine is quite CPU hungry. I highly recommend a dedicated processor of Intel 3.0 GHz and up, or an AMD 3000+ and up, for a 32 player server. The RAM usage, however, is nowhere near 2 GB; the server's RAM usage should never go over 512 MB (if not much lower than that). While BF2 can do 64 player servers, there is also an issue of fidelity and the types of levels the engine can do. BF2 can't do highly complex geometry (for things like building interiors) like Unreal can. We're working on some CPU improvements for the servers for future patches, but we can't really rewrite the engine or anything of that nature. Unreal will always be Unreal, and it is just CPU heavy.
 
Upvote 0
I would just like to post for the record: I did read the thread, & never was I really replying to you, "Ninjaboy". You stated
We have more than enough CPU and bandwidth
in your first post. Even though that doesn't really make sense, as right before that you stated
our pings are about 50-75ms higher than they should be for some reason
which totally sounds like a lack of bandwidth to me... In any case, I was replying to post #2 "[5thW]Heide", who said
Our server is more than capable of running 32 players.
Plenty of CPU and memory left BUT everybody gets a huge ping-jump.
So we are forced to limit to 28,
& to #6 "Zetsumei", both of which most certainly sound like a lack of bandwidth. Although given what Zetsumei said, I believe he has pretty limited bandwidth.
 
Upvote 0
Devourer said:
I would just like to post for the record. I did read the thread & never was I really replying to you "Ninjaboy". You stated in your 1st post. Even though that doesn't really make sense as right before you stated which totally sounds like a lack of bandwidth to me...... In any case. I was replying to post #2 "[5thW]Heide" who said & #6 "Zetsumei" which both most certainly sound like a lack of bandwidth. Although given what Zetsumei said I believe he has pretty limited bandwidth.

Since you can't read a few replies up (see reply #8) I'll re-quote the entire thing:



"Update:

After doing some testing and a little more research, I concluded that the Unreal engine is probably one of the worst server-side engines in terms of resource usage out there. Battlefield 2 and Savage can support 64 players on the same hardware it takes Unreal to support more than 26 players.


The short version: if you want to run a 32 player server you're going to need raw single-CPU horsepower. It should be a dedicated box only running Red Orchestra, 3.2GHz+ (or equivalent), with 2 GB of memory. If you have a dual-CPU, hyperthreaded, etc. box, it won't make a difference; it will report 25% CPU usage, of course, but will be eating 95-100% of one CPU. I'd probably disable hyperthreading as well, as this has been known to cause issues with most dedicated servers, such as Battlefield 2 (see the BF2 v1.3 dedicated server recommendations in the patch notes).


If you are not meeting this minimum requirement, you will experience what appears to be "lag" on certain server-side CPU-intensive maps like Arad. For example, a player who should be at 60ms will show as 200ms or even more. It's not actual network lag but lag from the server not being able to update the players quickly enough. There are other theories about the cause, including one claim that it makes no difference and is purely cosmetic, but in my experience the gameplay quality does feel degraded. It could mean that the Unreal 2 engine simply shows odd, cosmetic behavior like this past a certain player count or on certain maps without affecting gameplay. It could also mean that insanely high pings for local players are a server performance issue, and you should check your server's CPU usage.


I'll be testing an Intel 950D server and the new Woodcrest Xeon processor to see how they perform with Unreal. My final conclusion is that with a 2.8GHz dual Xeon system and 2 GB of memory you can't effectively run more than 26 players with all the maps enabled."




So you see, my reply #8 was a follow-up to my original post, both of which are in this very thread. If you're going to quote someone, make sure you bring in all the information in the thread, as #8 led to my final conclusion, which is clearly backed up by RO support.


Furthermore, testing my 950D I found we were actually at 90% usage with 32 players on one map with the default settings. That would probably scare me away from using anything less than a 3.4GHz CPU or equivalent. I invite you to join the server; you'll notice the difference right off the bat compared to all the other 32 player servers out there, especially on the vehicle maps.

I tested your 7000 net settings, and I found that at least 20% of the bullets "vanished" compared to the normal 10000 net setting when someone was moving.
 
Upvote 0
[RO]Ramm-Jaeger said:
The dedicated server for the Unreal 2.5 Engine is quite CPU hungry. I highly recommend a dedicated processor of Intel 3.0 GHz and up, or an AMD 3000+ and up, for a 32 player server. The RAM usage, however, is nowhere near 2 GB; the server's RAM usage should never go over 512 MB (if not much lower than that). While BF2 can do 64 player servers, there is also an issue of fidelity and the types of levels the engine can do. BF2 can't do highly complex geometry (for things like building interiors) like Unreal can. We're working on some CPU improvements for the servers for future patches, but we can't really rewrite the engine or anything of that nature. Unreal will always be Unreal, and it is just CPU heavy.

Yes, I noticed that the RAM usage was incredibly minimal. That really surprised me, but it's impressive. BF2 and especially Savage 1 are both memory hogs. You always want the dedicated server to take up as little CPU as possible, but it wasn't a big issue for me. I just needed to test out a few server boxes, settings, and tweaks to see exactly what it would take to make a 32 player server feel fluid and smooth. The new server I put ROOST on is the fastest ROOST server out there at the moment and performed beautifully at 32 players. I'm eagerly awaiting a Woodcrest box to see how the new 5160 runs the game.
 
Upvote 0