The Three New GeForce GTX 1070 and 1080 Related Technologies

Quote: If I have understood correctly how Ansel works, a game would disable Ansel in MP and activate it in SP.
You are reacting to a reaction of a reaction. So read my 1st post here to get the picture...
Quote: 1st, it would be undetectable and you would have no proof it was used for cheating. 2nd, would you prevent everyone with an nVidia GPU from playing competitively online? And the data are there, extractable from the sound. The moment people get a way to dump the scene at will, there will be scripts doing the rest.
How can it be used for cheating? It's essentially noclip with a photographic filter. There is no transparency option. Do people cheat in CS:GO right now using noclip? No. How is this any different?
Moderator
I don't think they will give that benefit in MP. It's about equality, so it's just another awesome extra people can use in single player. I'm glad things like this are happening on PC; it needs more innovative things like this.
Quote: I don't think they will give that benefit in MP. It's about equality, so it's just another awesome extra people can use in single player. ...
I don't even understand the argument. You can free-roam right now in like 90% of games. It's disabled in multiplayer. This is literally the exact same thing, except it gives you Instagram filters and options to output at a higher resolution by slicing the FOV. This isn't some magical process that's hijacking the game engine.
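To make the FOV-slicing point concrete, here is a rough sketch of how a higher-resolution capture can be stitched from sub-frustums. The helper names and the 4x4 grid are my own illustration, not Ansel's actual API:

```cpp
// Sketch: stitch a high-res screenshot by slicing the camera's FOV into
// a grid of asymmetric sub-frustums and rendering one tile per slice.
// Illustrative only -- not Ansel's real API.
#include <cmath>
#include <cstdio>

struct Frustum { float l, r, b, t, n, f; };

// Symmetric frustum for the original camera (like gluPerspective).
Frustum fullFrustum(float vfovDeg, float aspect, float n, float f) {
    float t = n * std::tan(vfovDeg * 0.5f * 3.14159265f / 180.0f);
    return { -t * aspect, t * aspect, -t, t, n, f };
}

// Off-center sub-frustum covering tile (ix, iy) of a tilesX x tilesY grid.
Frustum tileFrustum(const Frustum& full, int ix, int iy, int tilesX, int tilesY) {
    float w = (full.r - full.l) / tilesX, h = (full.t - full.b) / tilesY;
    return { full.l + ix * w, full.l + (ix + 1) * w,
             full.b + iy * h, full.b + (iy + 1) * h, full.n, full.f };
}

int main() {
    Frustum full = fullFrustum(60.0f, 16.0f / 9.0f, 0.1f, 100.0f);
    // Render each of the 16 tiles at 1920x1080 with its own sub-frustum,
    // then paste them side by side: same view, one 7680x4320 image.
    for (int iy = 0; iy < 4; ++iy)
        for (int ix = 0; ix < 4; ++ix) {
            Frustum tile = tileFrustum(full, ix, iy, 4, 4);
            std::printf("tile (%d,%d): l=%.3f r=%.3f b=%.3f t=%.3f\n",
                        ix, iy, tile.l, tile.r, tile.b, tile.t);
        }
}
```

Each tile is still a normal render of the same scene from the same point, so it's exactly the free-roam camera you already have, just sampled more finely.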
Quote: Ansel, how lovely. The only positive about this is that the game has to support it. In other words, nVidia is not forcefully taking that geometry information. But I am pretty sure that soon enough there will be tons of cheaters using this to take a snapshot of the scenery and render it on a 2nd screen with transparency ON.
Have you not used your brain for once and considered that the developers clearly know this and have disabled it for multiplayer? Also, that example is totally ridiculous; there isn't even a transparency option. I honestly don't think you thought that comment through before posting :3eyes:
Quote: Have you not used your brain for once and considered that the developers clearly know this and have disabled it for multiplayer? ...
If it is 3D, then it is stereoscopic. Do you know what makes images show depth? The difference in position between having one viewport and having two for stereoscopy (sketched below)? I apparently used those few brain cells I still have 🙂 However you ...
Quote: Ansel, how lovely. The only positive about this is that the game has to support it. In other words, nVidia is not forcefully taking that geometry information.
I guess you know how easy it is for a game's anticheat to control what a 3rd-party proprietary library does and does not do. Or how easy it is to protect your game against a wrapper written for such a library, especially if the game code is made in a way that gives this 3rd-party library unrestricted access to the scene and settings.
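For what it's worth, the depth-from-two-viewports relation I mean is just triangulation. A minimal sketch with made-up numbers (the focal length and baseline are assumptions, not anything nVidia ships):

```cpp
// Depth from stereo disparity: with eye baseline B and focal length f,
// a point whose image shifts by d pixels between the two viewports sits
// at depth Z = f * B / d. Numbers below are made up for illustration.
#include <cstdio>

int main() {
    const float f = 1000.0f; // focal length in pixels (assumed)
    const float B = 0.065f;  // viewport separation in meters (typical IPD)
    for (float d : { 65.0f, 13.0f, 6.5f })  // disparity in pixels
        std::printf("disparity %5.1f px -> depth %.1f m\n", d, f * B / d);
}
```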
The audio feature is really nice, almost a necessity. But I don't like that nVidia is doing it, because that means it is going to be proprietary and GPU-bound. Though this is very complex audio processing, a discrete sound card (or even the CPU) should be handling it, not the GPU. In a simplified perspective, think of it like this:

1. The CPU is the first to know about the layout of the 3D area.
2. The CPU then tells the GPU what the area looks like.
3. The GPU renders everything.
4. The GPU adds filters to make up for lens correction, and then outputs the display.
5. The GPU processes the sound effects based on the environment.
6. The GPU sends this info back to the CPU.
7. The CPU tells the sound card what to play and how to play it.
8. The sound card plays the audio (add another step if it does its own layer of processing).

We want to be LOWERING latency in the GPU, not increasing it. As stated in step 1, the CPU already knows what the 3D area looks like. For the most part, the CPU is twiddling its thumbs while waiting for the GPU to finish rendering the scene, let alone render the audio too. Anyone meddling with VR should be smart enough to have a hefty CPU, which should have plenty of spare resources to process this info. So not only could step 6 be eliminated entirely, steps 5, 7, and 8 could be done in parallel with everything else the GPU is doing.
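A rough sketch of the alternative I'm describing, with purely hypothetical types (this is not any real engine's or nVidia's API): the CPU solves the audio on a worker thread while the GPU renders, so the round-trip in steps 5-6 disappears.

```cpp
// Sketch: run the audio solve on a CPU worker thread, in parallel with
// GPU rendering, and hand the result straight to the sound card.
// Every type and method here is hypothetical, for illustration only.
#include <thread>

struct Scene {};
struct AudioMix {};
struct AudioSolver { AudioMix solve(const Scene&) { return {}; } }; // occlusion/reverb on CPU
struct SoundCard  { void queue(const AudioMix&) {} };
struct Renderer   { void render(const Scene&) {} };

void perFrame(const Scene& scene, Renderer& gpu, AudioSolver& audio, SoundCard& card) {
    std::thread audioJob([&] {
        card.queue(audio.solve(scene)); // steps 5-8, done on the CPU side
    });
    gpu.render(scene);                  // GPU renders the same frame meanwhile
    audioJob.join();
}

int main() {
    Scene s; Renderer r; AudioSolver a; SoundCard c;
    perFrame(s, r, a, c);
}
```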
The extra horsepower by itself is lowering latency already; this is just another way of taking advantage of the GPU's strength. A CPU could probably be used to a lesser degree. Real-time ray tracing is out of its league, though I'm not sure how such audio processing compares with visuals in terms of performance impact. I guess they can cut some more corners in accuracy to lessen the performance penalty. Maybe even a dedicated GeForce card could be used for both PhysX & OptiX 🙂
Does anyone know if their new high-bandwidth SLI bridges are going to look like the ones pictured? I hope these do not interfere with third-party liquid cooling setups. It just looks like there is an added lip that could interfere with certain connections.
Love the SMP - Simultaneous Multi-Projection! It should have been done years ago by both nVidia and AMD... When I used to play stuff like Battlefield on three 24" monitors, even my 7-year-old nephew used to laugh that the two side monitor images were all warped and distorted, so they weren't really much use or good to look at straight on! If it is hardware-based, does that mean the correction will work in existing games like Battlefield 4 and Star Wars Battlefront? That might make my next card an nVidia then, unless AMD does something spectacular...
Quote: Love the SMP - Simultaneous Multi-Projection! It should have been done years ago by both nVidia and AMD... If it is hardware-based, does that mean the correction will work in existing games like Battlefield 4 and Star Wars Battlefront? ...
A viewport is defined by a few numbers, and the correction is very simple math. This type of correction can be done, and likely is done, in software. They had a good idea. But I'll ask you a question about angle control: you have to tell the driver at what angle the screens are turned. What happens if someone hacks the driver and unlocks 360° freedom? Will you play with a 2nd screen looking behind you? This may be good for 360° projection in a room, that's for sure.
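To show how few numbers are involved, here is a minimal sketch (the function is mine, not the driver's): the side screen is just the center view yawed by the panel's physical angle, and nothing in the math stops that angle from being 180°.

```cpp
// Sketch: a side monitor's view direction is the center view rotated
// about the vertical axis by the panel's physical angle. Set the angle
// to 180 degrees and the "side" screen looks straight behind you.
#include <cmath>
#include <cstdio>

// Forward vector after yawing by the monitor angle (right-handed,
// camera looks down -Z at 0 degrees). Function name is illustrative.
void monitorForward(float angleDeg, float out[3]) {
    float a = angleDeg * 3.14159265f / 180.0f;
    out[0] = std::sin(a);
    out[1] = 0.0f;
    out[2] = -std::cos(a);
}

int main() {
    float fwd[3];
    for (float angle : { -40.0f, 0.0f, 40.0f, 180.0f }) { // triple rig + the "hack"
        monitorForward(angle, fwd);
        std::printf("angle %+6.1f -> forward (%+.2f, %+.2f, %+.2f)\n",
                    angle, fwd[0], fwd[1], fwd[2]);
    }
}
```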
The Simultaneous Multi-Projection looks very very nice. If I still ran triple monitors I would be extremely interested in a pair of 1070s for that feature combined with 8GB VRAM.
Quote: The audio feature is really nice, almost a necessity. But I don't like that nVidia is doing it, because that means it is going to be proprietary and GPU-bound. Though this is very complex audio processing, a discrete sound card (or even the CPU) should be handling it, not the GPU. ...
Hmm, that's not the way I read it. I think you'll find it's an SDK/middleware used as part of the creation process. Just like you bake lighting, you now bake audio, so your audio bounces off 3D surfaces according to the pre-baked audio solver. It's really no different from any of the other solvers they've got for fluids, particles, lighting, etc. Hey, but I stand to be corrected.
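If that reading is right, the runtime side could be as cheap as a convolution with the pre-baked response. A minimal sketch under that assumption (not VRWorks' actual interface):

```cpp
// Sketch: the expensive part (ray-tracing the room into an impulse
// response) happens offline at bake time; at runtime the game just
// convolves the dry sound with the baked response. Illustrative only.
#include <vector>
#include <cstdio>

std::vector<float> applyBakedReverb(const std::vector<float>& dry,
                                    const std::vector<float>& impulse) {
    std::vector<float> wet(dry.size() + impulse.size() - 1, 0.0f);
    for (size_t i = 0; i < dry.size(); ++i)
        for (size_t j = 0; j < impulse.size(); ++j)
            wet[i + j] += dry[i] * impulse[j]; // direct-form convolution
    return wet;
}

int main() {
    std::vector<float> dry{1.0f, 0.5f, 0.25f};  // dry source samples
    std::vector<float> ir{1.0f, 0.0f, 0.3f};    // baked room response
    for (float s : applyBakedReverb(dry, ir)) std::printf("%.3f ", s);
    std::printf("\n");
}
```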
Quote: Zotac GeForce GTX 1080, 8GB GDDR5X, DVI, HDMI, 3x DisplayPort (ZT-P10800A-10P)
I can't believe nVidia keeps supporting DVI. It really should die, but since I have a dual-link DVI Korean monitor, I can't really complain.
Quote: I can't believe nVidia keeps supporting DVI. ...
Right... I cannot understand that decision either.
Quote: Love the SMP - Simultaneous Multi-Projection! It should have been done years ago by both nVidia and AMD...
For sure...! I do wish I'd kept my BenQ triple setup now.
Quote: Does anyone know if their new high-bandwidth SLI bridges are going to look like the ones pictured? ...
Can anyone provide any info on this subject? I sure hope they do not shoot themselves in the foot like MSI did with the release of their aftermarket Lightning coolers. Those coolers did not allow for the standard PCB SLI connection due to the large, oversized heatpipes. I had to use flexi ribbon-style bridges and figure out which card went to which bridge connector and so on... I mean, it all worked in the end, after buying quite a few looooooong SLI ribbons that I did not even know they made that big, but it was horrendous to look at.
Quote: Last year the Hybrid 980 Ti took a month to become available. And to be honest, with the new card drawing half the power of the 980 Ti, I'm not sure it'll be worth it this time.
Why is that? If the performance is there, then it should not matter, right? I always say: OC... OC... OC! Especially if they give you headroom with monster cooling.