NVIDIA Introduces Revolutionary Neural Texture Compression for Material Textures
TheDeeGee
Just need to convince shareholders to ship optimized games.
Undying
And convince nvidia to ship graphics cards with memory to have room for the textures.
tsunami231
Are tensor cores required, or can all cards do it?
Compression vs. compression, there's a huge difference though.
cucaulay malkin
The visual improvement is impressive; it needs widespread support. I guess with a used 3090 you could easily have this running at 4K without needing to rob a bank.
schmidtbag
I like the idea of reducing VRAM usage, but I assume this is going to be pretty GPU intensive.
HybOj
"16 times more texels"
hmmm
((( 16 times the detail!! )))
Jokes aside, this is the only way to keep the old (3080) GFX cards alive
Chrysalis
They really don't want to ship proper amounts of VRAM, it seems.
Undying
Vram usage will increase exponentially.
cucaulay malkin
Dunno how people can see this and complain; must be sour.
Low/med + NTC will look a lot better than high/ultra without it. And no, it does not cause excessive VRAM usage by itself: as evidenced in the photo, there's just an 8% difference. AMD talking up VRAM and textures ultimately led to Nvidia owners getting another awesome feature that AMD won't have. Wish my 6800 with 16G could use this too, but 8G cards will now have better textures with this enabled.
Embra
Pretty cool sounding.
Would older-gen GPUs be able to utilize this?
cucaulay malkin
prolly just needs tensor cores.
Noisiv
https://research.nvidia.com/labs/rtr/neural_texture_compression/assets/ntc_medium_size.pdf
It's just a lot of matrix multiplication, so it doesn't strictly need tensor cores, but
it's 10x faster on CUDA using half-precision tensor cores while using only 2 GB of VRAM, compared to a generic PyTorch implementation, which requires 18 GB.
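To give a feel for why this is "just matrix multiplication": NTC-style decoding amounts to running a tiny MLP per texel, i.e. a couple of half-precision matmuls. The sketch below is a hypothetical illustration in NumPy; the layer sizes, latent dimension, and texel count are made up and not taken from the paper.

```python
import numpy as np

# Hypothetical NTC-style decode: a small MLP evaluated per texel.
# All shapes here are invented for illustration only.
rng = np.random.default_rng(0)

latents = rng.standard_normal((4096, 16)).astype(np.float16)  # per-texel latent codes
w1 = rng.standard_normal((16, 32)).astype(np.float16)         # hidden-layer weights
w2 = rng.standard_normal((32, 4)).astype(np.float16)          # output layer -> RGBA

hidden = np.maximum(latents @ w1, 0)  # ReLU activation
texels = hidden @ w2                  # decoded RGBA values, one row per texel
print(texels.shape)                   # prints (4096, 4)
```

On a GPU the same two matmuls would map directly onto half-precision tensor-core instructions, which is where the reported 10x speedup over a generic PyTorch path would come from.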
Reddoguk
The 5090 will have 32 GB of VRAM and a 512-bit bus reaching 2 TB/s of bandwidth, maybe...
rocky01
Does this news make up for their huge arrogance and disregard even for their own partners like EVGA? Leaving tons of their users behind with these 'shortages' and astronomical prices is a clue to what they think of customers. I left AMD years ago after they consigned my three-year-old card to 'legacy' status, not worthy of updates. Nvidia is better, barely. Getting too big for their britches, no question.
H83
The idea looks cool, but the potential gains look very optimistic...
Still, even a 25% improvement would already be very good.
Also, the timing from Nvidia is perfect! ;)