Intel Z370 Chipset Could Support Kaby Lake - But Intel Will Not Allow It
user1
It's about the same, if you have the right tools.
Koniakki
Six CFL cores @5GHz?!
https://s26.postimg.org/5o2ta0j3d/images.jpg
I get you boss!
https://s26.postimg.org/wm1s8va21/200_s.gif
Hilbert Hagedoorn
Administrator
Meanwhile your a-hole rating just went through the roof.
Venix
nevcairiel
Why would I even want a Z370 with a Kaby Lake CPU? I mean, I could understand people wanting Coffee Lake on Z270, so they can keep using their old boards, but why buy a new board for an old CPU? Why not buy both new in that case? It's not like Z370 has magic new features you would replace your board for. The number of people this would ever affect is so small...
Emille
nosirrahx
H83
wavetrex
This is getting more ridiculous by the month.
Not only is Kaby Lake just a higher-clocked Skylake, but now adding 2 more cores for Coffee Lake means yet another mobo?? Which isn't even compatible with the IDENTICAL socket from last year?
Intel can shove their 4.8-5.0Ghz up their tight a$$es, it doesn't really matter anymore. Games are more and more parallel (thanks, 8-core consoles !) and raw frequency is becoming less and less relevant each passing year.
EVERYTHING ELSE, any professional work, benefits more from core counts + decent IPC than just raw frequency.
This is the nail in the coffin, switching to AMD. Next buy: Most likely Threadripper.
DLD
Idiotel; Scumbagtel...couple of options for the new name of the financial company...
D3M1G0D
Intel seems really keen on segmenting their product lines lately. Seems that the way they artificially blocked off lanes and memory channels on X299 was just the start; they're now resorting to artificially limiting which CPUs can be used on their boards. I wouldn't be surprised if they did this with every CPU going forward, and I would also not be surprised if someone finds a way around it. Delidding kits were just the start - pretty soon we'll have enabling kits which can enable blocked CPUs! I smell a profit opportunity here.
tunejunky
But who other than us geeks will/can do it?
Agent-A01
BarryB
Seems this is the issue: make the hardware faster to cope with lazy devs and piss-poor optimisation on console-to-PC ports!!
Aura89
Agent-A01
http://cdn.mos.cms.futurecdn.net/gQFon9WmbFGGk8TqS4hMCT-650-80.png
So many games like this, especially ones from the past couple of years that only use 4 threads.
An i5 provides a much better experience for someone with a 120/144Hz etc. monitor.
Ryzen will be obsolete by the time games commonly utilize 10+ threads.
Low resolutions? Focus on things that don't matter?
I don't understand what you're talking about there but games like this are very common.
Robbo9999
kroks
A lot of fully working features are disabled for commercial reasons, it's not surprising coming from Intel (HT, Overclocking, VROC!...)
claydough
I would have been happy sticking with X58 and an i7-980X for the past 7 years if the SATA 3 controller wasn't so bad on the RE III and PCIe 3 hadn't been introduced right after that purchase.
( Ouch... no M.2, U.2, NVMe, etc. Since adopting SSDs I doubt my user experience would feel that storage speed or graphics lane gen compromise. It's still bothersome. )
But now that PCIe 4 & 5 are back to back in little over a year...
That just seems too soon to pull the trigger on a major purchase when, comparatively, that would probably kneecap future purchases and upgrade maneuvering for another 8 years?
In which case...
I would view any major purchase now as a temp stopgap measure with little upgrade viability?
Don't see where the drama is anymore with chipsets. The last time I bought a mb with an eye to CPU upgrade viability I went with AMD and got butt hurt almost immediately when they dropped my socket 939 choice!
( doh! That fx60 was pretty boss tho )
But since Moore's law has been squeezed anyway...
It actually seems as if all the advancement opportunities come with new chipsets?
( lane increases, memory gen/speed, i/o bandwidth: HDMI 2.0, DisplayPort 1.4, thunderbolt, usb 3.1, NVME... Even upcoming Optane iops storage performance are the only intriguing technological considerations next to GPU upgrades that hold the most weight making upgrades exciting? )
The last thing I wanted to hear was that Coffee Lake wouldn't require a new Z370 chipset! Isn't it obvious that such chipset advancements are where all the sexy happens nowadays anyway?
The determining factor should not be whether or not advancement and update paths are forced...
I still consider myself an enthusiast. What would determine my possible temp purchase path right before the back to back PCIE 4/5 updates is whether or not Intel brings exciting features/POWER with that forced path?
It becomes a whole different story if Z370 brings, say...
38 lanes and quad-channel memory in addition to that Coffee Lake core increase!
In which case that new 6 core single threaded calculation performance king suddenly becomes my 6 core HEDT workstation upgrade!
( as far as an i7-980x holdout is concerned.)
Or the tableau becomes even uglier if nothing intriguing/new is brought to the proverbial table with that forced path...
Which would certainly stink of arrogance and a greedy grab in light of the vocal competition mandate against a market of boring increments.
Either way ( good or bad ) the results will be more telling because of z370.
I have no confidence at this point...
But not ready to congratulate AMD EITHER for SIMPLY waking up out of a decade of absent hibernation. I have my fingers crossed for RYZEN'S 2nd iteration...
Hopefully timed for a Threadripper PCIe 5-equipped chipset I can upgrade GPUs on for another 8 years!
( chipset + GPU upgrade longevity is the only true sexy pronz )
claydough
Then again...
Everything seems like a bad timing purchase or painfully temporary.
The way monitors are folding in existing/aging advanced TV tech at an insanely high, gamer-brand-baited premium...
It is becoming apparent that they are going to milk out speedy low-latency monitors with HDR color accuracy and bit depth, in addition to the sharp contrast ratios enabled by self-emitting-diode screens...
Seemingly? 6 years behind the TV industry.
Been a hardware enthusiast since the mid 90's and I don't ever remember a hardware tech market where "the timing" of purchases made every major investment ( save broadband internet ) seem like such an expensive "temporary" proposition in light of an obvious and inevitable end game solution just out of reach.
For the past 5 years I have been bemoaning the death of the 32" TV set market ( relevant 32" sets, anyway ) that used to rule 4:3. In which case we would have had that temporary solution to wait out the insanity in the monitor market.
( dependent on game modes that were given attention )