
Unreal Engine 4 to be shown off this year (2012)

HeadClot (Grizzled Veteran)
It is that time again...

No, not the time when Santa comes to your house... this is much more awesome than that. The Unreal Engine is getting a whole new engine number: Unreal Engine 4.

http://www.g4tv.com/thefeed/blog/post/720663/unreal-engine-4-to-be-revealed-in-2012-according-to-epics-mark-rein/

The following is a tech demo called Samaritan, made with Unreal Engine 3:

Unreal Engine 3: Official Samaritan Demo - YouTube

Here is what I found on Unreal Engine 4:

http://www.highperformancegraphics.org/previous/www_2009/presentations/TimHPG2009.pdf

Let the general discussion begin!
 
Let's see if they can pull it off.
The goals they have in mind are nice according to that presentation from 2009, but are they actually working towards them?
That Samaritan demo ran on triple GTX 580 SLI, not in CPU software (which is what they want, if I understood the presentation correctly).
It really would be awesome if they could get rid of all the middleware, and I hope we see signs of that in Unreal Engine 4.
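For anyone wondering what rendering "in CPU software" would even look like, here is a toy sketch (entirely my own illustration, not anything from Epic or the presentation): every pixel shaded by ordinary C++ code, spread across cores with threads, with no DirectX or driver in the loop.

[code]
// Toy "software rendering" sketch: shade every pixel on the CPU,
// no DirectX, no driver, no GPU. Illustration only, not engine code.
#include <cstdint>
#include <functional>
#include <thread>
#include <vector>

static const int W = 1280, H = 720;

// A made-up per-pixel "shader" running as ordinary C++ code.
uint32_t shadePixel(int x, int y) {
    uint8_t r = static_cast<uint8_t>(255 * x / W);  // horizontal gradient
    uint8_t g = static_cast<uint8_t>(255 * y / H);  // vertical gradient
    return (0xFFu << 24) | (r << 16) | (g << 8);    // ARGB, blue left at 0
}

// Shade a horizontal band of the framebuffer [y0, y1).
void shadeRows(std::vector<uint32_t>& fb, int y0, int y1) {
    for (int y = y0; y < y1; ++y)
        for (int x = 0; x < W; ++x)
            fb[y * W + x] = shadePixel(x, y);
}

int main() {
    std::vector<uint32_t> framebuffer(W * H);
    unsigned n = std::thread::hardware_concurrency();
    if (n == 0) n = 4;  // fall back if the count is unknown

    // Split the screen into bands, one worker thread per band.
    std::vector<std::thread> workers;
    int rowsPerThread = H / static_cast<int>(n);
    for (unsigned i = 0; i < n; ++i) {
        int y0 = static_cast<int>(i) * rowsPerThread;
        int y1 = (i + 1 == n) ? H : y0 + rowsPerThread;
        workers.emplace_back(shadeRows, std::ref(framebuffer), y0, y1);
    }
    for (auto& t : workers) t.join();  // framebuffer now holds the image
}
[/code]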
 
"It really would be awesome if they could get rid of all the middleware..."


Get rid of middleware? Why? Should they reinvent the wheel instead of adopting middleware?
 
If they can actually make something that looks better and runs better by using the full power of the CPU, without wasting cycles on DirectX and drivers, I don't see why they shouldn't do it.

The thing is, CPUs didn't go the Larrabee way. Based on the filename of that PDF, Tim Sweeney gave that presentation three years ago. A lot has changed since then ;)

Edit: "0 out of 1 members like this post."
Haha, some people are just sad. Downvote for disagreeing with a post. Tough times.
 
But then that only means they won't be able to pull it off; it doesn't mean it wouldn't be awesome if they did. Notice that I used the conditional in my post.

Well, I just think it's kind of far-fetched.
I'm guessing they'll ditch DX9 and do a similar thing to what DICE did with Frostbite 2 and incorporate all the DX10+ stuff into UE4, meaning things like parallel draw calls and better instancing of meshes (see the sketch below for why instancing matters).
I wouldn't be surprised if they adopted real-time lighting instead of lightmaps, with the exception of ambient light maybe.
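To put a rough number on why instancing matters: the per-draw-call overhead (driver validation, state setup) gets paid once for N copies of a mesh instead of N times. Here is a toy cost model in C++; the microsecond figures are invented purely for illustration, only the shape of the math is the point.

[code]
// Toy cost model for draw calls vs. instancing. The "costs" are made-up
// numbers; the point is only the shape of the math: per-call overhead is
// paid once per call, per-instance work is paid either way.
#include <cstdio>

int main() {
    const int instances = 10000;          // copies of the same mesh
    const double callOverheadUs = 30.0;   // hypothetical driver/API overhead per draw call
    const double perInstanceUs  = 2.0;    // hypothetical GPU work per mesh copy

    // DX9 style: one draw call per copy of the mesh.
    double naive = instances * (callOverheadUs + perInstanceUs);

    // DX10+ style: one instanced call (e.g. DrawIndexedInstanced) for all copies.
    double instanced = callOverheadUs + instances * perInstanceUs;

    std::printf("one call per mesh: %.0f us\n", naive);      // 320000 us
    std::printf("single instanced:  %.0f us\n", instanced);  // 20030 us
}
[/code]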
 
The predictions, or at least the problems cited, are still valid though.

As machine performance improves, there will definitely be a continuation of the push that has happened everywhere else in computing: have the computer do more of the work so that the job gets done sooner (writing the software, that is, not the software's actual job running sooner).

Technically, code would be faster if it were all written in assembly. That is not feasible anymore: machine processing time is cheap, developer time is not.
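You can see that trade-off in miniature without even dropping to assembly. Below is a sketch I wrote for illustration: the same multiply-add loop, first the easy way, then hand-vectorized with SSE intrinsics. The second is roughly what "closer to the metal" buys you, and also what it costs in developer time.

[code]
// The developer-time vs. machine-time trade-off in miniature:
// the same multiply-add written the easy way and the hand-tuned way.
#include <immintrin.h>  // SSE intrinsics (any x86-64 CPU has SSE)
#include <cstddef>

// Easy to write, easy to read; a good compiler may auto-vectorize it.
void scaleAddPlain(float* dst, const float* src, float k, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        dst[i] += k * src[i];
}

// Hand-vectorized with SSE: 4 floats per instruction, but more code to
// get wrong, and it quietly assumes n is a multiple of 4.
void scaleAddSSE(float* dst, const float* src, float k, std::size_t n) {
    __m128 kv = _mm_set1_ps(k);               // broadcast k to all 4 lanes
    for (std::size_t i = 0; i < n; i += 4) {
        __m128 s = _mm_loadu_ps(src + i);      // load 4 source floats
        __m128 d = _mm_loadu_ps(dst + i);      // load 4 destination floats
        _mm_storeu_ps(dst + i, _mm_add_ps(d, _mm_mul_ps(kv, s)));
    }
}
[/code]

And the punchline: a decent optimizing compiler will often auto-vectorize the plain version anyway, which is exactly the "machine time is cheap, developer time is not" argument in action.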

It is a trend that will hit the video game industry sometime soon. Just look at the speed with which graphics got better (for a while) but now seem to have somewhat stagnated. How good does Crysis look, even compared to a brand-new game like BF3 or ROHOS? Crysis came out in 2007. Now compare Crysis to the best-looking game of 2002, and you tell me whether that progress slowed down between those two periods.


For the time being it looks like we are moving to a hybrid model first, although to some extent those predictions are bang on. AMD's new 7000 series cards support, guess what, x86 addressing, and IIRC, to some extent, running x86-compiled code. The same goes for the Nvidia 600 series cards.
On the CPU end, we now have serious video cards being embedded into CPUs. This is currently only for the low-to-medium-end stuff, but that is how these trends start.

"
The AMD Radeon HD 7000 series will be 28nm-based GPUs and are expected to ship with support for x86 addressing in a unified CPU/GPU address space, integrate PowerTune support, RISC MIMD instructions replacing VLIW SIMD instructions for GPGPU computing, XDR2 video RAM, PCI Express 3.0 support, and various other interesting changes.
"

The XDR2 RAM was a flop and didn't happen (costs).
http://www.phoronix.com/scan.php?page=news_item&px=MTAxNjY

So some unification is starting to happen here, except it seems to be video cards adopting CPU technology rather than the other way round (the vectorized processing). This is probably because Intel's more recent 50-core Knights Corner chip boasts 1 teraflop, while the top video cards from AMD and Nvidia can reach around 4 teraflops each (again IIRC, although I am quite sure the high-end AMD and Nvidia numbers were about 4x the Knights Corner figure).
 
UE4 has always been said to be aimed at the next generation of consoles (i.e. the Xbox 720 and PS4, assuming they are still to be called that), but since those have decided to be tardy, I'm not really sure what Epic is going to make of that.

Not to mention that, considering the rumoured specs already, they feel more like last LAST gen than next.
 