PCIe 4.0 spec on its way - double the bandwidth
mohsentux
PCIe 4.0 + NVlink = Madhouse 😀
fusion
Yo dawg, heard you're sick of upgrading so we made pci express 4 so everything you just bought can be upgraded just after you upgraded. We're also upgrading our upgrade path so you can upgrade again after you've just upgraded with the last upgrade. Don't forget to check out our upgrade forums.
TheDeeGee
GPU-wise, PCI-E Gen3 isn't even a bottleneck yet.
Even a TITAN at 8x Gen3 is only about 0.5 FPS lower than at 16x.
miffywiffy
What I've learnt these days is that new hardware is generally irrelevant; just upgrade when you have to. I still have a 670 and nothing is pushing that, I've got a 750 Ti in one system and can even triple-screen iRacing at 80 FPS with it maxed out, lol.
It's gotten to the point where new hardware isn't exciting; the software isn't keeping up. I haven't needed to upgrade a CPU since 2009. I have upgraded, but I haven't needed to, as that old CPU still isn't a bottleneck. I haven't needed to upgrade a GPU since 2012, and even then it was a 670, which was more mid-to-high and not very expensive.
Really what excites me...
- New motherboard features: having stuff like Bluetooth and AC wireless built in was the reason I switched to Haswell, not the CPU itself.
- SSD prices coming down: HDD prices and capacities have pretty much peaked, but SSDs still need to come down in price so we can use them in 100% of our systems and finally ditch HDDs.
- Sound cards: still waiting for one with good drivers; so far onboard is far superior. It lacks an amp, but I'm hoping either a mobo company takes audio to the next level or we finally get good sound cards. The latency on the current crop is really annoying, and the conflicting issues with software like Rocksmith make them irrelevant to me, along with their optical not being true optical.
Loobyluggs
Whoah.
Mato87
Well then, I think my computer will be obsolete by the end of 2015; I think the beginning of 2016 will be the right time for an upgrade 😀
It's funny, because my last computer lasted more than 6 years, but the computer I have now won't last more than 3 😀
weasel
Times are changing. I remember a time when I would upgrade mobo, CPU, and RAM every year; now I'm 2 years on the same main rig and 3 years on the same second rig and still satisfied. :)
Calmmo
They know computers have slowed down considerably in terms of advancement. They need to find new ways to milk us.
Fender178
Also, I doubt we are going to notice any difference between Gen 3 and Gen 4, even though it is supposed to be double the bandwidth.
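For anyone curious where the "double the bandwidth" claim comes from, here's a rough per-lane sketch using the transfer rates and line encodings from the published PCIe specs (real-world throughput is lower once protocol overhead is counted):

```python
# Rough per-lane PCIe throughput per generation.
# Transfer rates and line encodings are from the published PCIe specs;
# actual usable throughput is lower due to packet/protocol overhead.
GENS = {
    # gen: (GT/s per lane, payload bits, total bits of line code)
    1: (2.5, 8, 10),     # 8b/10b encoding
    2: (5.0, 8, 10),     # 8b/10b encoding
    3: (8.0, 128, 130),  # 128b/130b encoding
    4: (16.0, 128, 130), # 128b/130b encoding
}

def lane_gbps(gen):
    """Usable GB/s per lane after line-code overhead."""
    gt_s, payload, total = GENS[gen]
    return gt_s * payload / total / 8  # GT/s -> GB/s

for g in sorted(GENS):
    print(f"Gen {g}: {lane_gbps(g):.3f} GB/s per lane, "
          f"x16 ~ {16 * lane_gbps(g):.1f} GB/s")
```

So a Gen 4 x16 slot lands around 31.5 GB/s versus roughly 15.75 GB/s for Gen 3, exactly double, which is also why an x8 Gen 4 link matches a full x16 Gen 3 link.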
Mineria
http://www.elderscrollsonline.com/
I ran the game completely fluidly at >60 FPS with maxed-out settings @ 1920x1080 on a Q9650 + GTX 260 system.
Skyrim isn't bad either when you play it out of the box without mods.
They made a game engine that works.
rl66
-Tj-
http://en.wikipedia.org/wiki/Skylake_%28microarchitecture%29
NVLink is for corporations.
Us normal peeps will be stuck with regular PCIe 4.0, probably with Pascal.
Intel's Skylake will apparently support PCIe 4.0, but only the Skylake-E LGA2011 variant.
schmidtbag
Seems kind of pointless to me. Except for people running two dual-GPU cards, as far as I'm concerned PCIe 2.0 is still good enough for the average gamer.
Honestly, though, I see a greater underlying issue in all of this. Let's say you have a high-bandwidth PCIe x1 card, such as an HDMI capture card. If it's designed for PCIe 4.0, there's a good chance the card will noticeably under-perform in a PCIe 1.0 or 2.0 slot. While backwards compatibility is certainly welcome, I feel like PCIe's progression is going to result in a lot of confusion and frustration for people with outdated machines. So far, I haven't encountered any non-GPU PCIe cards that specify which generation they're recommended to work with, though I don't have many to begin with.
ScoobyDooby
http://www.reactionface.info/sites/default/files/images/1310428349083_0.png
Just doesn't feel right without his photo.