
Post your average FPS during intense combat.

Intel Core 2 Duo E7500 @ 3.2GHz
BFG Tech GeForce GTX 260 OCX Maxcore, overclocked to 700MHz core, 1,500MHz shader, and 2,500MHz (effective) memory
Asus P5Q SE Plus motherboard
4GB DDR2 RAM
Windows 7 Ultimate x64
OCZ GameXStream 700W PSU
1680x1050 resolution
Nvidia ForceWare 195.81 beta drivers (latest available)
Refresh rate 85Hz
Vsync enabled
All settings maxed, with 16x AF and 4x AA.

After some benchmarking with Fraps, I recorded a few two-minute gameplay videos and ran average-FPS benchmarks on them. I get 85 FPS even in intense scenarios.
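As a side note on methodology: the headline number depends on how you average. A minimal sketch (the frame times are made-up sample data, not anyone's actual capture) of the difference between the correct frames-over-total-time average and naively averaging instantaneous FPS readings:

```python
# Per-frame render times in seconds, as a tool like Fraps would log them.
frame_times = [1/90, 1/85, 1/40, 1/120, 1/85]  # made-up sample data

# Correct overall average: total frames divided by total elapsed time.
avg_fps = len(frame_times) / sum(frame_times)

# Naive average of instantaneous per-frame FPS values; this overweights
# the fast frames and overstates how smooth the run actually felt.
naive_avg = sum(1/t for t in frame_times) / len(frame_times)

print(round(avg_fps), round(naive_avg))  # → 74 84
```

The one slow 40fps frame drags the true average down much more than the naive average suggests, which is why "averaging my averages" later in this thread tends to read high.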
 
Lots and lots of variables to take into consideration with this scoring: depending on the map, the area, and what's being drawn, my average can be anything from 400+ down to under 40, so the best I can do is say my average is around 200 FPS. That's by averaging my averages :)

Taking those variables out of the picture, my highest FPS in KF to date is 1553 (box room, no objects, low settings). Can't seem to push it past 1600 yet ;)

The average score above is based on:

System:

E8500 @ 3.9GHz
4GB DDR2 @ 1106MHz
HD 4870 X2 2GB @ stock

Settings

1680x1050
Maxed settings, 16x AF and 8x AA

Filthy.
 
Lots and lots of variables to take into consideration with this scoring: depending on the map, the area, and what's being drawn, my average can be anything from 400+ down to under 40, so the best I can do is say my average is around 200 FPS. That's by averaging my averages :)

:p Yeah, but I wrote average during intense combat, not idling without zombies.
Also, in many maps my FPS drops significantly during intense combat.
Can you see the screenshot in the first post? In it, the GPU only has to "draw" my team, and the FPS is still just 32 :confused:.
 
A couple of you blokes are playing at 1680 x 1050 on rigs that could easily handle 1920 x 1200-gauge gameplay, so the constant 60+ fps isn't exactly surprising.
I do 19x12 on a fairly powerful machine w/ SLI GTX 260 (c216 x 2), 4 gigs of ram, with a substantially overclocked E8500, and not a damn thing in KF can make the machine flinch. It's a totally over-powered spec for an Unreal 2.5 based game.
Hell, it's pretty much over-powered (or, very adequately powered) for most Unreal 3-based titles I've played.
All set for Killing Floor 2 and RO 2, although I'd love to transition to twin GTX 275s at some point.
 
The 30fps thing only applies to the minimum needed to produce the illusion of movement to the human eye. The true limits of the human eye's perception haven't been confirmed, so in any situation, the higher the FPS the better.

Now the refresh rate thing is more accurate. If your monitor only refreshes the image every so often, that's essentially a cap, because the image is only cycled that often (a 60Hz refresh is a visible cap at 60fps, and 120Hz is a cap at 120fps, since the monitor can't physically recycle the image any faster).

So, basically, even if you churn out 300fps and you only have a 60Hz monitor... well, those extra 240fps mean jack.
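The cap described above fits in a few lines. A simplified sketch only: real double-buffered vsync quantizes to divisors of the refresh rate when a frame misses its deadline (60 → 30 → 20), which this ignores:

```python
# Displayed frame rate: with vsync on, the screen can't show more frames
# per second than it refreshes, no matter how fast the GPU renders.
def displayed_fps(rendered_fps, refresh_hz, vsync=True):
    return min(rendered_fps, refresh_hz) if vsync else rendered_fps

print(displayed_fps(300, 60))   # → 60: the extra 240fps never reach the screen
print(displayed_fps(300, 120))  # → 120
print(displayed_fps(45, 60))    # → 45: below refresh, nothing is cut off
```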

EDIT: Just read Gunthak's post, and he basically summed this up perfectly. Ignore me please, haha.

Anyway, specs for a wide-range of equipment -

Laptop 1 -
Intel Core Duo @ 2.16GHz
2GB DDR2 RAM (2 x 1GB)
ATI Radeon X1600 @ 256MB
Windows XP Home Edition SP3
1440 x 900 Res. @ ~90fps - Settings on Normal (w/ no AA)

Laptop 2 -
Intel Core 2 Duo @ 2.66GHz
2GB DDR2 RAM (2 x 1GB)
Nvidia GeForce 8800M GTX @ 512MB
Windows Vista 32-bit Home Premium SP2
1280 x 720 Res. @ ~120fps - Settings on Normal (w/ no AA)

Desktop 1 -
Intel Core i7 920 @ 2.67GHz
6GB DDR3 RAM (3 x 2GB)
ATI Radeon HD 4870 @ 4GB (2 x 2GB)
Windows Vista 64-bit Home Premium SP2
1600 x 1050 Res. @ ~200fps - Settings on Highest (w/ no AA)

Desktop 2 -
Intel Pentium 4 @ 3.0GHz
2GB DDR RAM (2 x 1GB)
ATI Radeon 9800 @ 128MB
Windows XP Home Edition SP3
800 x 600 Res. @ ~20fps - Settings on Lowest (w/ no AA)

Screenies from Desktop #1 (using "stat fps", so it prints in the right-hand corner) -
Number 1
Number 2

Something about the anti-aliasing usually kills my average FPS by about 100 frames. I've only seen this kind of dip in KF, though; when I play other games on Desktop #1, I can crank the AA to 4x without a hitch, and sometimes 8x for a loss of ~20 frames.

For example, in Counter-Strike: Source my settings are completely maxed on Desktop #1, including HDR lighting, full AA, etc., and I average ~250fps. However, my monitor is only 120Hz, so anything over 120fps doesn't really matter, I suppose. :p
 
Wow, now I know just how overkill this system is in comparison.

Win7 Ultimate (formerly XP Pro 64-bit)
i7 @ 2.94GHz (quad core @ 3GHz, not OC'd)
16GB (formerly 8GB) RAM
3x BFG Tech GeForce GTX 295 (2x 1792MB, not OC'd)
RAID HDD setup: 4x 1TB (two 2TB pairs)
Combo cooling, liquid + fans (a bit loud at full blast, but I don't mind)

1280x1024 res, highest detail; the engine will lag and crash long before my FPS drops.

Pretty much a full tower server being used as a desktop
 
Wow, now I know just how overkill this system is in comparison.

Win7 Ultimate (formerly XP Pro 64-bit)
i7 @ 2.94GHz (quad core @ 3GHz, not OC'd)
16GB (formerly 8GB) RAM
3x BFG Tech GeForce GTX 295 (2x 1792MB, not OC'd)
RAID HDD setup: 4x 1TB (two 2TB pairs)
Combo cooling, liquid + fans (a bit loud at full blast, but I don't mind)

1280x1024 res, highest detail

Why would you pair those specs with a 1280x1024 monitor? It's like putting scooter tires on a Hummer or something. Total mismatch of power to rendering output potential. Give yourself a late Christmas gift, and pair your beast of a machine with a proper 19x12 or 25x16 widescreen monitor! Enjoy all that power man, don't bury it inside a tiny display res.
*the 1200p monitors are dirt cheap lately.
 
A couple of you blokes are playing at 1680 x 1050 on rigs that could easily handle 1920 x 1200-gauge gameplay, so the constant 60+ fps isn't exactly surprising.
I do 19x12 on a fairly powerful machine w/ SLI GTX 260 (c216 x 2), 4 gigs of ram, with a substantially overclocked E8500, and not a damn thing in KF can make the machine flinch. It's a totally over-powered spec for an Unreal 2.5 based game.
Hell, it's pretty much over-powered (or, very adequately powered) for most Unreal 3-based titles I've played.
All set for Killing Floor 2 and RO 2, although I'd love to transition to twin GTX 275s at some point.

Yeah, sorry about that, but I only have a 20-inch monitor and its native res is 1680x1050. I know I could run 1920x1200 in high-end games, and I've seen rigs running GTX 260 cards play games like MW2 maxed at 2560x1600 at 60+ frames. I need a new monitor, don't I? Also, guys: Vsync, good or bad?

Win7 Ultimate (formerly XP Pro 64-bit)
i7 @ 2.94GHz (quad core @ 3GHz, not OC'd)
16GB (formerly 8GB) RAM
3x BFG Tech GeForce GTX 295 (2x 1792MB, not OC'd)
RAID HDD setup: 4x 1TB (two 2TB pairs)
Combo cooling, liquid + fans (a bit loud at full blast, but I don't mind)

How much did that beast cost you? It's got to be over the $3,000 mark. Man, the only way to make that rig better would be to replace the GTX 295s with HD 5970s and use Eyefinity to run six monitors.
 
Why would you pair those specs with a 1280x1024 monitor? It's like putting scooter tires on a Hummer or something. Total mismatch of power to rendering output potential. Give yourself a late Christmas gift, and pair your beast of a machine with a proper 19x12 or 25x16 widescreen monitor! Enjoy all that power man, don't bury it inside a tiny display res.
*the 1200p monitors are dirt cheap lately.

Because my desk is small I have no use for anything over 1280x1024, and I LOATHE widescreen with a passion, because it KILLS so many of the older games I play that don't support it.

How much did that beast cost you? It's got to be over the $3,000 mark. Man, the only way to make that rig better would be to replace the GTX 295s with HD 5970s and use Eyefinity to run six monitors.

This system is over the $4k mark. My other system (I'm in the process of upgrading it) will probably hit well over $6k by the time I'm done.

Similar specs, though; it's a dual-proc setup (Intel Skulltrail, dual Core 2 Extreme).
 
I LOATHE widescreen with a passion, because it KILLS so many of the older games I play that don't support it

That's a total non-issue if you don't use a 'scaling' setting in your video control panel.
All your old 4:3 non-widescreen games will simply be displayed in their proper, non-stretched/scaled original format. Meanwhile, you get to enjoy all the newer games that benefit (visually and gameplay-wise, with far greater view range/superior FoV) from 16:9 and 16:10 'HD' widescreen formats. There's no excuse!
Well, except for the limited physical desktop space you mentioned. Moving from a (guessing) 19-inch 1280x1024 display to a 23-inch 1920x1200 isn't that much of a space increase, however...
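The FoV gain from widescreen can be made concrete. A quick sketch, assuming a Hor+ game that keeps vertical FoV fixed as the aspect ratio widens (the 73.74° value is just the vertical FoV that yields a 90° horizontal FoV at 4:3, chosen for illustration):

```python
import math

# Horizontal FoV for a given vertical FoV and display aspect ratio,
# using the standard Hor+ relation: hfov = 2*atan(tan(vfov/2) * aspect).
def horizontal_fov(vertical_fov_deg, aspect):
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect))

for name, aspect in [("4:3", 4/3), ("16:10", 16/10), ("16:9", 16/9)]:
    print(name, round(horizontal_fov(73.74, aspect), 1))
# → 4:3 90.0
# → 16:10 100.4
# → 16:9 106.3
```

So a 16:10 panel shows roughly 10° more of the world side-to-side than a 4:3 one at the same vertical FoV, which is the "far greater view range" being described.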
 
Because my desk is small I have no use for anything over 1280x1024, and I LOATHE widescreen with a passion, because it KILLS so many of the older games I play that don't support it.

This system is over the $4k mark. My other system (I'm in the process of upgrading it) will probably hit well over $6k by the time I'm done.

Similar specs, though; it's a dual-proc setup (Intel Skulltrail, dual Core 2 Extreme).

Ooh, nice computer you've got there... can I have it? WTF DO YOU DO FOR A LIVING, YOU RICH BASTARD!!!

ALLLEEERRRTTT:
Does anyone know when the GT300-series cards will come out? I'm waiting on them so I can buy a GTX 285 when the price drops.
 