Our decision process when it comes to implementing third-party technologies goes roughly like this:
- Does it enhance the game in a meaningful way?
- Can all of our players benefit from it (multi-platform compatibility)?
- Do we have time to implement it?
- Does Unreal support it out of the box?
- Is it fun?
We’re running on an incredibly tight schedule. Because of this, we’re being very careful about which features we want/need to implement.
We have a billion-and-one ideas, so we have to make informed decisions about which of them to pursue.
The bottom line is: fun comes first.
If it doesn’t add fun to the game, we push it back on our priority list.
Yes, adding rain or snow would be neat, and having realistic-looking hair would be a quality boost.
But Tower Unite is about having fun with friends, so we want to spend our time implementing features which enhance that part of the game.
Graphics and effects are secondary to gameplay, always.
Also, as other people have mentioned, implementing NVidia-only features drives a wedge between the two PC GPU demographics: NVidia and AMD.
We don’t want to do that. Our goal is to deliver a consistent experience across all platforms. Adding some of these features could ruin that.
Now I’ll address some of your questions / comments directly:
NVidia PhysX is integrated into Unreal out of the box. It’s the physics engine that drives Unreal Engine 4. PhysX is accelerated by NVidia GPUs, but it can also run on your CPU if you’re using a different graphics card.
I don’t know how the snow/rain stuff factors in; I’ll have to look into that.
CUDA is NVidia’s GPU parallel computing platform. It lets you write massively parallel code that gets executed across the graphics card’s many CUDA cores.
It does not do water simulation on its own. We’d have to write that component ourselves (which ain’t easy).
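To give a rough idea of what writing CUDA code involves, here’s a minimal sketch (not from our codebase, purely illustrative): each GPU thread processes one array element in parallel. A real water sim would do far more per cell, but this is the basic shape of it.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Toy kernel: each GPU thread updates one element of the output array.
__global__ void addArrays(const float* a, const float* b, float* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x; // this thread's element
    if (i < n)
        out[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;          // ~1M elements
    size_t bytes = n * sizeof(float);

    float *a, *b, *out;
    cudaMallocManaged(&a, bytes);   // unified memory, visible to CPU and GPU
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&out, bytes);

    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    addArrays<<<blocks, threads>>>(a, b, out, n);
    cudaDeviceSynchronize();        // wait for the GPU to finish

    printf("out[0] = %f\n", out[0]); // expect 3.0

    cudaFree(a); cudaFree(b); cudaFree(out);
    return 0;
}
```

And of course, all of this only runs on NVidia hardware, which is exactly the cross-vendor problem mentioned above.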
Nah. We’d have to author our own hair simulation data, and it doesn’t run well on AMD hardware.
This is possible, but why implement something that only players with beefy machines can use? Seems like a waste of time if you ask me.
Maybe somewhere down the line, when these newer technologies become more universally compatible or “beefy computers” become “normal computers”, we’ll consider implementing them.
Anyway, I hope this answers any questions about where we stand on NVidia tech.