NVIDIA Announces GameWorks SDK 3.1
by Daniel Williams on March 16, 2016 7:20 AM EST
Innovation is hard work. Doing work that has already been done elsewhere can be satisfying, but also annoying - no one wants to reinvent the wheel every time. In the realm of 3D graphics, we are not limited to creating our wares from scratch - toolsets such as NVIDIA GameWorks are provided to developers, allowing them to incorporate advanced graphics rendering and physical simulation features into their products. The latest version, NVIDIA GameWorks 3.1, is being released this week.
NVIDIA GameWorks SDK 3.1 introduces three new graphics technologies involving shadows and lighting. NVIDIA Volumetric Lighting simulates how light behaves as it scatters through the air, and was showcased in Fallout 4. Moving over to shadows, we see NVIDIA Hybrid Frustum Traced Shadows (HFTS), which renders shadows that start as hard shadows near the casting object and transition to soft shadows further away. Lastly among the new graphics features is NVIDIA Voxel Accelerated Ambient Occlusion (VXAO), which NVIDIA dubs the highest-quality ambient occlusion algorithm. What makes it better than previous techniques is its ability to compute occlusion against all scene geometry in world space, whereas older screen-space techniques can only account for geometry visible to the camera.
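To make the world-space versus screen-space distinction concrete, here is a minimal, hypothetical CPU sketch of the ambient occlusion idea (purely illustrative - not NVIDIA's VXAO algorithm, which voxelizes the scene on the GPU). Occlusion at a surface point is estimated by testing hemisphere directions against every occluder in the scene, whether or not the camera can see it:

```cpp
#include <cmath>
#include <vector>

struct Vec3 { double x, y, z; };
struct Sphere { Vec3 c; double r; };   // a stand-in occluder

// Does the ray p + t*d (t > 0) intersect the sphere?
bool hits(const Vec3& p, const Vec3& d, const Sphere& s) {
    Vec3 oc{p.x - s.c.x, p.y - s.c.y, p.z - s.c.z};
    double a = d.x * d.x + d.y * d.y + d.z * d.z;
    double b = oc.x * d.x + oc.y * d.y + oc.z * d.z;
    double c = oc.x * oc.x + oc.y * oc.y + oc.z * oc.z - s.r * s.r;
    double disc = b * b - a * c;
    return disc >= 0.0 && (-b + std::sqrt(disc)) / a > 1e-9;
}

// Fraction of hemisphere sample directions above an upward-facing
// surface point that reach open sky: 1.0 = fully open, 0.0 = fully
// occluded. Because every occluder in `scene` is tested, off-screen
// geometry darkens the result too - the world-space property that
// screen-space AO techniques lack.
double ambientVisibility(const Vec3& p, const std::vector<Sphere>& scene) {
    const Vec3 dirs[5] = {
        {0, 1, 0}, {0.7, 0.7, 0}, {-0.7, 0.7, 0}, {0, 0.7, 0.7}, {0, 0.7, -0.7}
    };
    int open = 0;
    for (const Vec3& d : dirs) {
        bool blocked = false;
        for (const Sphere& s : scene)
            if (hits(p, d, s)) { blocked = true; break; }
        if (!blocked) ++open;
    }
    return open / 5.0;
}
```

A screen-space variant of the same loop could only test occluders present in the depth buffer, so an object just outside the frame would cast no occlusion at all - the artifact VXAO is claimed to avoid.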
Adding to the roster of PhysX features is NVIDIA PhysX-GRB, a new implementation of NVIDIA's PhysX rigid body dynamics SDK. This new implementation provides a hybrid CPU/GPU pipeline that NVIDIA claims can improve performance by a factor of up to 6x for moderate to heavy simulation loads, especially those that are heavy on compute shader register resources. NVIDIA Flow is the other update to PhysX, introducing the ability to simulate and render combustible fluids such as fire and smoke; this time, simulation will not be confined to a bounding box. This should lead to much more flexibility and usefulness in games and other software in the future.
DiLi - Wednesday, March 16, 2016
Do you understand what open means? The code legally cannot be modified or shared. Have you even seen the NDA you agree to with NV when you decide to use GameWorks in your game? It's pretty clear the agreement restricts data between YOU and NV, and you cannot post or use modified code without NV's consent. That is why I will not support GameWorks or NV proprietary tech in any software I create - it's not open and can't be "optimized" without consent.
If you don't believe it, go and try to get the "open" source code from Nvidia and read the agreement carefully.
As for "other then h/w physx I think everything runs on AMD"
Really? Care to list at least one?
close - Wednesday, March 16, 2016
He won't list any. Most people have no idea what GameWorks actually is. They assume it's a bunch of options in the game's "graphics" menu that you just disable or enable and it's just like changing the resolution.
Dribble - Wednesday, March 16, 2016
"He won't list any. Most people have no idea what GameWorks actually is. They assume it's a bunch of options in the game's "graphics" menu that you just disable or enable and it's just like changing the resolution."
It is a bunch of options in the game's graphics menu that you can enable and disable. It's coded exactly the same way the options the dev added were coded (C++/C#/DX, compiled by the Microsoft compiler). It's just that Nvidia wrote them and provided them as libraries, exactly the same way as all the third-party libraries a game uses (of which there will be dozens, written by all sorts of companies, most of them not open source).
Dribble - Wednesday, March 16, 2016
I don't think you really understand what you are talking about - you know just because it's called "OpenCL" doesn't mean you have to let everyone see your code? C is open too and you can't look at everyone's C code. It's just a language. The important point being it runs not only on Nvidia but on AMD as well; that is true of GameWorks - it runs on AMD hardware, mostly as well as it runs on Nvidia hardware. Most of the performance deficits tend to be because Nvidia hardware is better at tessellation and the tessellation level by default is set too high for AMD to run well. That is something the devs can fix by lowering the tessellation level (or AMD can force it in drivers) and then it tends to run fine.
"Really? Care to list at least one?"
Everything I think - pick a GameWorks game and tell me what GameWorks effects you don't see on an AMD card.
DiLi - Thursday, March 17, 2016
"I don't think you really understand what you are talking about - you know just cause it's called "OpenCL" doesn't mean you have to let everyone see your code? C is open too and you can't look at everyone's C code. It's just a language."
Useless "dribble" to avoid actually reading and commenting on the license? How does OpenCL enter this discussion?
" The important point being it runs not only on Nvidia but on AMD as well, that is true of gamesworks - it runs on AMD hardware, mostly as well as it runs on Nvidia hardware."
Do you honestly believe that? If that were the case, then there would be absolutely no need to even have an NV card. Stop joking; you know this is not true at all.
Also, look at this link and tell me what effects you are referencing:
I'm just curious, and want to know exactly which one you want to learn about today. Just because a Gameworks game runs fine on AMD does not mean those effects are enabled.
nathanddrews - Wednesday, March 16, 2016
NVIDIA already does charge "whatever they like" for products. Ever heard of the Titan? NVIDIA's goal is to stay in business, pay its employees, please shareholders, and excite its customer base. NVIDIA will never have a legal monopoly as long as Intel continues to integrate graphics into nearly all of its CPUs. GameWorks is a product that they sell to developers. Whatever impact it has on AMD customers is literally not NVIDIA's business - so why would they ever want to optimize it for AMD? That's on the developers of the game, just like it's on the developers to use DX11, DX12, Vulkan or all of the above.
close - Wednesday, March 16, 2016
There's a difference between "not optimizing for AMD" and "intentionally crippling everything not Nvidia". Which basically translates into AMD, since Intel has no say in the game world.
close - Wednesday, March 16, 2016
And no, they don't charge what they like, they charge what the market likes. Because for now AMD is still there to undercut their prices. If things go like this and AMD has to fight 'fair' business practices from Intel and Nvidia alike, they won't be around for more than 2 or 3 years. And you'll get the chance to find out exactly what Nvidia 'likes' to charge.
When Intel was offering 'incentives' to use their CPUs, somebody must have used the exact reasoning you're using now: 'What impact it has on AMD customers is not Intel's business.' Apparently it wasn't quite so. Abusing your position to force a competitor out of the game will sooner or later prove disappointing, either to Nvidia or to the customers.
And BTW, try asking a developer who actually worked with GW what they think about the "optimizations" in the SDK. The reason you can't properly optimize for AMD is because it's not just about optimizing; you have to jump the hurdles put there specifically to prevent this. And if history has taught (some of) us anything, it's that every company will abuse its position for another chunk of cash. Intel, Microsoft, Apple, Google, Facebook. But Nvidia wouldn't do this, because fanboys all over the world voted not to...
Sttm - Wednesday, March 16, 2016
More price fear mongering from AMD fanboys. They have been saying this shit for years about Intel, and guess what, a solid i5 still costs about $200, a great i7 about $330... AMD hasn't been competitive this decade and yet Intel's prices haven't skyrocketed...
This fear mongering was proven wrong with Intel, and it's wrong about Nvidia.
rhysiam - Wednesday, March 16, 2016
Yes, and that brand new $200 i5 is about 25% faster than the $200 Sandy Bridge i5 which was released more than 5 years ago. Price may not have moved much, but neither has performance! Of course lack of competition isn't the only factor in that equation, but it's a big one nonetheless.