
Video suggestions

SgtThompson

Grizzled Veteran
Aug 15, 2006
Top priority, optimize the engine. No offense, but for how the game looks, it should not run as poorly as it does in some cases.

Next, I would recommend adding Anti-aliasing to the game. This is quite basic and I don't see how it never made it in.

Another big one. Fix the widescreen display issues. This includes using the proper aspect ratio and extending horizontally instead of chopping the top and bottom of the screen off.

Think Half-Life 2. They did widescreen perfectly. Take note!
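The Hor+ behaviour being asked for (keep the vertical FOV fixed and widen the horizontal FOV, instead of cropping the top and bottom) boils down to simple trigonometry. A minimal Python sketch, assuming a base 90° horizontal FOV at 4:3, which is a common convention and not something stated in the thread:

```python
import math

def horplus_fov(base_hfov_deg, base_aspect, target_aspect):
    """Scale a horizontal FOV from a base aspect ratio (e.g. 4:3)
    to a target aspect ratio, keeping the vertical FOV fixed --
    the "Hor+" widescreen behaviour."""
    base_hfov = math.radians(base_hfov_deg)
    # Vertical FOV implied by the base horizontal FOV and aspect.
    vfov = 2 * math.atan(math.tan(base_hfov / 2) / base_aspect)
    # Widen the horizontal FOV to fill the target aspect.
    hfov = 2 * math.atan(math.tan(vfov / 2) * target_aspect)
    return math.degrees(hfov)

# 90 degrees at 4:3 becomes roughly 106.26 degrees at 16:9.
print(round(horplus_fov(90.0, 4 / 3, 16 / 9), 2))
```

A Vert- engine does the opposite: it keeps the horizontal FOV and shrinks the vertical one, which is exactly the "chopping the top and bottom off" complaint above.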

Finally, improve HDR. It blurs the screen instead of providing a "realistic visual experience."

Don't get me wrong, I love the game. But, there's room for improvement.
 
Next, I would recommend adding Anti-aliasing to the game. This is quite basic and I don't see how it never made it in.
Nowadays you set this in the menu of your graphics card's driver, as that setting overrides any in-game setting anyway (unless it's set to accept the in-game setting). You can even create different profiles for different games, so there is absolutely no need for in-game AA, AF, VS, TB, etc. settings.

I agree about the rest of what you said though, especially the optimizing part, although I don't think it's possible for them to do much about it. The engine is dated, and the older an engine is, the worse its looks-to-performance ratio gets.
If the game was running on UE1 and looked like that, it would run even worse.
So it's not "for such an old engine it runs bad" but actually "BECAUSE of such an old engine it runs bad", and without an engine overhaul you can only change so much.
There isn't going to be an engine overhaul though. Not when they are already working on their next project on a new engine to begin with.
It's a decent suggestion, but it remains a utopian dream.
 
Top priority, optimize the engine. No offense, but for how the game looks, it should not run as poorly as it does in some cases.
Unreal Engine 2.5 is being pushed past its limits, so what you currently see is already heavily optimized.

It wouldn't be much better though.
Next, I would recommend adding Anti-aliasing to the game. This is quite basic and I don't see how it never made it in.
Anti-aliasing is mostly controlled externally, via the display drivers.
Since DirectX 9, afaik, you can't have anti-aliasing controlled in-game.
Another big one. Fix the widescreen display issues. This includes using the proper aspect ratio and extending horizontally instead of chopping the top and bottom of the screen off.

Think Half-Life 2. They did widescreen perfectly. Take note!
Half-Life 2 uses the Source engine, which is almost entirely 3D, even in the GUI.

It's also optimized for "exotic" screen resolutions; Unreal Engine 2.5 isn't, and the way it is now, that's hardcoded in.

And there's probably no way we are going to see that changed.
Finally, improve HDR. It blurs the screen instead of providing a "realistic visual experience."
The HDR you are talking about is tone mapping run with floating-point calculations on the Shader Model 3.0 arithmetic logic units.

Since Unreal Engine 2.5 doesn't support Shader Model 3.0, there's no way we can have real HDR in.
 
And there's probably no way we are going to see that changed.
The HDR you are talking about is tone mapping run with floating-point calculations on the Shader Model 3.0 arithmetic logic units.

Since Unreal Engine 2.5 doesn't support Shader Model 3.0, there's no way we can have real HDR in.

<besserwisser>
Actually, no game features real HDR images, since no screen in the world is capable of representing the entire dynamic range. Real HDR would mean being able to show all levels of energy, from pitch black to the strength of the sun. You can store this information, yes, so there are real HDR images, but you cannot show the entire dynamic range on any display device. Enter tone mapping, which maps the entire dynamic range down to a low dynamic range that can be shown on our monitors. In fact, most games don't use HDR images, only Higher Dynamic Range ones, since you can get a pretty good effect by simply storing double the screen's output capacity in the image. As long as you have some spectrum to shift within with the tone mapping, you can get the effects you're looking for, even if they are not the real thing.

Finally, support for Shader Model X is set in the graphics drivers and not in the game engine, is it not? GLSL support is at least defined in the drivers, and I can't see why HLSL should be any different. After all, it is a question of whether the GPU can handle the functionality asked for or not. Writing the shader itself is not a big deal, really.
</besserwisser>
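The tone-mapping step described above can be sketched in a few lines. A minimal Python example using the simple Reinhard operator L/(1+L), which is one of many possible curves and not necessarily what any particular engine uses:

```python
def reinhard_tonemap(hdr_values):
    """Map unbounded HDR luminance values into [0, 1) with the
    simple Reinhard operator L / (1 + L), so arbitrarily bright
    inputs compress toward white instead of clipping."""
    return [v / (1.0 + v) for v in hdr_values]

# Dark values pass through almost unchanged; bright ones compress.
print(reinhard_tonemap([0.0, 0.5, 1.0, 10.0, 1000.0]))
```

However bright the stored value, the mapped result stays below 1.0, which is exactly the "mapping the entire dynamic range down to a low dynamic range" idea.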
 
<uberbesserwisser>
Of course there are no real HDR images in 3d videogames and just for your info, they aren't even 3d!:eek:
Of course there are no real HDR images in 3D video games, and just for your info, they aren't even 3D! :eek:
That's why it's called HDR rendering: because it renders an image that looks like it was HDR, the same way it's called 3D rendering because it renders images that appear to be 3D although they aren't (your screen stays flat). It's just that everyone is too lazy to write "HDR rendering", so we simply refer to the effect as HDR. The problem is that RO doesn't support HDR but only bloom, which is fake fake HDR, so to speak. Kind of like those shader constructs some Unreal Engine 2 mappers use to fake bump maps. That bloom isn't without fault, and one fault is that it looks bad and blurs the whole picture. I hear it's especially bad on older ATI cards, but I don't know about that.
</uberbesserwisser>
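The bloom described here is usually built from three steps: keep only the pixels above a brightness threshold, blur them, and add the result back onto the frame. A toy 1-D Python sketch of that pipeline (real implementations work on 2-D framebuffers with separable Gaussian blurs; threshold and strength values here are arbitrary):

```python
def bloom_1d(pixels, threshold=0.8, strength=0.5):
    """Toy bloom: bright-pass filter, 3-tap box blur, additive blend.
    The blurred bright pass is added on top of neighbouring pixels,
    which is why bloom smears light and blurs the image."""
    # Bright pass: keep only what exceeds the threshold.
    bright = [max(p - threshold, 0.0) for p in pixels]
    # 3-tap box blur, clamping indices at the edges.
    n = len(bright)
    blurred = [
        (bright[max(i - 1, 0)] + bright[i] + bright[min(i + 1, n - 1)]) / 3.0
        for i in range(n)
    ]
    # Additive blend back onto the original image.
    return [p + strength * b for p, b in zip(pixels, blurred)]

print(bloom_1d([0.1, 0.2, 1.0, 0.2, 0.1]))
```

Note how the bright centre pixel spills onto its dimmer neighbours after the blur; that spill is the "blurs the whole picture" complaint in a nutshell.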
 
<besserwisser>
Actually, no game features real HDR images, since no screen in the world is capable of representing the entire dynamic range. Real HDR would mean being able to show all levels of energy, from pitch black to the strength of the sun. You can store this information, yes, so there are real HDR images, but you cannot show the entire dynamic range on any display device. Enter tone mapping, which maps the entire dynamic range down to a low dynamic range that can be shown on our monitors. In fact, most games don't use HDR images, only Higher Dynamic Range ones, since you can get a pretty good effect by simply storing double the screen's output capacity in the image. As long as you have some spectrum to shift within with the tone mapping, you can get the effects you're looking for, even if they are not the real thing.

Finally, support for Shader Model X is set in the graphics drivers and not in the game engine, is it not? GLSL support is at least defined in the drivers, and I can't see why HLSL should be any different. After all, it is a question of whether the GPU can handle the functionality asked for or not. Writing the shader itself is not a big deal, really.
</besserwisser>
Wasn't talking about real HDR though, because that will probably never exist.

Also, having shaders run externally is way too risky. GLSL is OpenGL-only; HLSL is Direct3D-only and must be done through the Direct3D layer.
Directly using the drivers without a universal (rendering) API often leads to bugs and crashes, unless you're using assembly to communicate directly with the hardware.

But unfortunately Microsoft shut that gate a long time ago.


EDIT: What Murphy said. ;)
 
@ Murph and SgtH3nry3:

I only tried to point out the meaning of HDR, just because the term is so often misused. There is quite a lot of development going on when it comes to acquiring HDR images, and also with the display problem (Higher Dynamic Range displays ^^), so it's a shame when a fake solution like RO's is called HDR. The rest of the text was not there to lecture you; I just had very little to do at work today. I hope someone finds it interesting.

When it comes to the shaders: the whole "Shader Model #" stuff is connected to the Direct3D series, is it not? I've never heard the term when talking about GL shaders, and I can't imagine that ATI and nVidia would have agreed on some sort of standard. What I was aiming for is that support for the functionality is not determined by the game engine but rather by what is implemented in the driver. If it's supported by the card and implemented in the driver, then the engine can use it without any problems.
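The card → driver → engine chain described above can be illustrated with a capability check: the driver reports what the hardware supports, and the engine picks a shader path accordingly. All names in this sketch are invented for illustration; they are not a real driver API:

```python
# Hypothetical caps table as a driver might report it to the engine.
REPORTED_CAPS = {"max_pixel_shader_version": (2, 0)}  # e.g. an SM 2.0 card

def pick_shader_path(caps):
    """Choose a rendering path from the reported shader model.
    The engine only consumes what the driver exposes."""
    major, _minor = caps["max_pixel_shader_version"]
    if major >= 3:
        return "sm3_hdr_tonemap"   # float tone mapping wants SM 3.0
    if major == 2:
        return "sm2_bloom"         # fall back to bloom-style fakery
    return "fixed_function"        # no programmable shaders at all

print(pick_shader_path(REPORTED_CAPS))  # -> sm2_bloom
```

This is also why an engine can still lack a feature even when the driver supports it: the engine has to contain a code path that asks for it.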
 
It's also optimized for "exotic" screen resolutions; Unreal Engine 2.5 isn't, and the way it is now, that's hardcoded in.

UT2K4 supports all widescreen resolutions perfectly; it's only locked in Ostfront. Together with the auto-cropping, which wasn't in UT2K4 I believe. So circles are still circles and not ovals. You just miss a bit of the top or bottom instead of gaining more on the left or right, i.e. a more zoomed-in view; which is better would depend on the end user.
 