Seven games benchmarked before and after Denuvo was removed

vbetts:

So you're telling me that multiple sources that are well known in the online PC communities are all lying, and one person has experienced none of this? https://www.hardwarecanucks.com/forum/hardware-canucks-reviews/76333-i7-2600k-vs-i7-8700k-upgrading-worthwhile.html https://www.techspot.com/review/1546-intel-2nd-gen-core-i7-vs-8th-gen/ https://www.anandtech.com/bench/product/2109?vs=287 Now, being at 60 Hz, I could understand how you wouldn't see any bottleneck.
Yes, I'm saying that. Everything I've said is from personal experience. You said it yourself: what is the "norm" here? You already said that 1080p is the "standard" when we look at things, because that's what the majority uses, but I doubt the majority of those users run anything other than 60 Hz. Sadly, the Steam hardware survey doesn't show that.
Koniakki:

He's "angry" at HH for making a VALID remark for the 2600k. lol
Shoo, fanboy. I actually went through the effort to prove that his remark wasn't valid, because in the video the CPU wasn't a limiting factor in any of those games. So the "saturated and bottlenecked" remark is invalid. Period.
What I'm reading here is that Intel hasn't gotten any better since the 2600k? Sorry. Y'all made me do it!
Killian38:

What I'm reading here is that Intel hasn't gotten any better since the 2600k? Sorry. Y'all made me do it!
Nah, the newer ones are better in most things. But we're applying the "standard" here, which is 1080p gaming (and presumably 60 Hz), which is what the majority of PC gamers run. On that basis, yes, the 2600k would be just about as good as the 8700k (at this moment in time). I will definitely upgrade at the end of this year or early next; I've tried to push it for as long as possible. At least, if they're 8-core/16-thread; otherwise I'll try another year of waiting. I even ran dual HD 6870s when I bought the hardware. However, for 1080p60 gaming, right now, an upgrade isn't needed. Especially not a €350+150+200 (CPU/mobo/RAM) upgrade.
Assassin's Creed Origins and Final Fantasy XV are two games using enough threads that they can really CPU-limit certain processors, though it's usually still more about GPU performance. Not exactly sure how the work is divided, but I believe FF:XV uses 6 threads and AC:Origins a pretty hefty 8, and it doesn't scale down, so for that game in particular even hexa-cores see some CPU limitation, though not as badly as quads do. (Deferred render contexts, was it? Aided by the CPU, but it doesn't scale down to lower-core processors very well, so it hurts performance instead, even on newer processors if they're quad-core.)

Ubisoft isn't usually one to patch out DRM, though, and I guess Square isn't considering removing it anytime soon, if ever, if it's not outright required, so it's not easy to compare performance before and after DRM. Plus there's custom stuff and other things running, such as Final Fantasy XV having additional layers of protection, and these are known to interfere with things a bit.

EDIT: For now these two (AC:O in particular) are more the exception than the rule. We'll probably see more of it over the next few years, and then we'll see how well it scales, I guess.

EDIT: Well, this thread moved fast. 😀
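For anyone unfamiliar with the term: neither engine's source is public, so the snippet below is only a minimal, generic D3D11 sketch of the deferred render context pattern described above (all engine-side details are hypothetical, not Ubisoft's or Square's actual code). Worker threads record draw calls into command lists on their own deferred contexts, and the immediate context replays them serially; with fewer cores than recording threads, the workers contend instead of overlapping, which is the scaling problem on quad cores.

```cpp
// Minimal sketch of D3D11 deferred render contexts (error handling omitted).
// Assumes a valid ID3D11Device; RecordWorker's draw calls are placeholders.
#include <d3d11.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

void RecordWorker(ID3D11Device* device, ComPtr<ID3D11CommandList>& outList)
{
    // Each worker records on its own deferred context; recording is only
    // thread-safe per context, never across contexts.
    ComPtr<ID3D11DeviceContext> deferred;
    device->CreateDeferredContext(0, &deferred);

    // ... record state changes and draw calls on `deferred` here ...

    // Bake the recorded commands into a command list for later replay.
    deferred->FinishCommandList(FALSE, &outList);
}

void SubmitFrame(ID3D11Device* device, ID3D11DeviceContext* immediate,
                 unsigned workerCount)
{
    std::vector<ComPtr<ID3D11CommandList>> lists(workerCount);
    std::vector<std::thread> workers;

    // Recording is spread across workerCount threads. A scheme tuned for
    // 8 recording threads oversubscribes a 4-core CPU, so the threads
    // fight for cores instead of running in parallel.
    for (unsigned i = 0; i < workerCount; ++i)
        workers.emplace_back(RecordWorker, device, std::ref(lists[i]));
    for (auto& w : workers)
        w.join();

    // Replay is serial on the immediate context.
    for (auto& list : lists)
        immediate->ExecuteCommandList(list.Get(), FALSE);
}
```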
Kaaskop:

Shoo fanboy, I actually went through effort to proof that his remark wasn't valid. Because in the video the CPU wasn't a limiting factor in any of those games. So the "saturated and bottlenecked" remark is invalid. Period.
There are nicer ways to do it, but to each their own; you can't force people to have manners. Whatever point you were trying to make, correct or not, the comment below was disrespectful.
Kaaskop:

Can you stop trashing the 2600k? .....
A clearly inappropriate and uncalled-for comment directed at a well-known website owner/reviewer.
Kaaskop:

..... This, talking down about the CPU. He might have had a different opinion before in that other article, but his tone in this one is obviously talking down on that CPU.
That looks more like a direct and personal accusation, and a false one at that, rather than an attempt to prove a point. Clearly it seems you couldn't care less about the bottleneck point and are just using it as an excuse for the real reason for the outburst, which is obvious in the comment above and isn't even a valid one to begin with. Don't worry, buddy. The 2600k is a goo... ehm, I mean a great CPU still. No one is saying otherwise. 😉
lol when did he trash the 2600k retard its a old as hell cpu if you got the money to buy a 1080ti why pair it with a 2600k lol time to upgrade get money
Moderator
kieron fleming:

lol when did he trash the 2600k retard its a old as hell cpu if you got the money to buy a 1080ti why pair it with a 2600k lol time to upgrade get money
This is the only warning I'm going to give you: no posts like this. And no one respond to it. Come on guys, we were having a good argument there! You don't get that too many times around here 😛 (This is not sarcasm, I actually liked discussing it with @Kaaskop, cheers buddy!)
alanm:

Wish there was a non-Denuvo version of AC Origins to test. That game hammers the hell out of the CPU like no other game before it, even when paused or alt-tabbed out.
What also irritates me is the stutter in the cutscenes. Rime had issues too, and they were resolved when the DRM was removed. I wouldn't be surprised if the same turns out to be true here ... on top of Ubisoft going overboard with DRM whenever they can.
I've been reading this thread, and I can say for certain from personal experience that a 2600K severely bottlenecks a GTX 1080 Ti at 1080p. Hell, nowadays even my current 6700K is bottlenecking some games...
Administrator
Kaaskop:

Can you stop trashing the 2600k? Like I commented on his YT too, besides a few games (like AC:O or The Division), this CPU isn't a bottleneck in many games. Besides that, his was even at stock speeds, while most people who still have one run it at 4.2 or higher. So 4 cores/8 threads at 4.2 GHz (minimum) isn't sufficient these days and could affect results? Let alone not be "a match" for a GTX 1080 Ti?
I am not trashing the 2600k; I think it was a great processor back in the day that can still hold its ground, which is one of the reasons I did this article. For this specific article, though, it should not have been used, as yes, it exaggerates the performance hit compared to modern CPUs. Not sure why you feel so insulted about it; nothing negative is being said about the 2600K. All I am saying is that it should not have been used for this test.
"We all know" - stop right there. We don't. In fact it's been proven that denuvo does not affect performance at all. If it does, it's so negligible that it's barely detectable.
Kaaskop:

Can you stop trashing the 2600k? Like I commented on his YT too, besides a few games (like AC:O or The Division), this CPU isn't a bottleneck in many games. So unless you've got actual proof that this CPU was affecting anything besides the occasional CPU bottleneck in some instances... In his video it ONLY bottlenecked around 7:30, while underground in Agents. So why would this CPU not be sufficient to handle such games and affect results, when you can clearly see the CPU usage in his video? Besides that, his was even at stock speeds, while most people who still have one run it at 4.2 or higher. So 4 cores/8 threads at 4.2 GHz (minimum) isn't sufficient these days and could affect results? Let alone not be "a match" for a GTX 1080 Ti?
Well, it's a very old CPU now. That CPU is indeed a bottleneck in many, many games; even a 6700K/7700K is a bottleneck. Every machine has one, but if you want better performance in games right now, I highly suggest moving to an 8600K at minimum, which is an absolute beast of a CPU and well worth the upgrade.
RealNC:

Performance is not the only concern. What about games that never get updated to remove Denuvo? When Denuvo stops working (the Denuvo server goes away and/or Denuvo itself is no longer compatible with future Windows versions), your game doesn't work anymore. See your various StarForce, SecuROM, or Tages game CDs and DVDs from the 2000s: good f'ing luck getting them to work. Meanwhile, the pirated, illegal copies of those discs work perfectly fine, so you have to break the law to play the game you paid for. Denuvo should be illegal.
Actually, it's not illegal if you own the game already. StarForce, I have to agree, was the worst. It was known to actually kill physical drives; it killed two of my DVD drives alone.
Irenicus:

Actually, it's not illegal if you own the game already.
You can't find direct downloads of these games yet. They're not "abandonware" or anything like that; you need to go through P2P, and that makes it illegal, and you can get caught, because P2P means you're uploading too, not just downloading. In fact, the uploading part is how people get caught, and it's the data that's presented to the authorities if you deny it. On top of that, you risk infecting your machine with malware. All because of this kind of aggressive DRM, which effectively puts an expiration date on the game it "protects."
In other words, there's no need to waste time with DRM, because it gets cracked anyway.
For sure, the 2600k at stock presents a bottleneck in CPU-bound games. BUT isn't it better to run a CPU like the 2600k in this test? If Denuvo is indeed *wasting* CPU resources, that will show up more easily on an older but still very capable CPU, no? While if you use, say, an 8700k at 4.5 or a 2700X at 4.2, most likely neither of those CPUs will be affected, no? They have performance left to spare, no? Anyway, my thinking is that with a 2600k you can potentially detect the slightest allocation of CPU throughput to anything that isn't related to game performance, while the CPUs I mentioned have plenty left to spare for Denuvo, and the 1080 Ti is there to make sure it isn't the bottleneck, since this is a CPU test. Is my logic faulty here?
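For what it's worth, that logic can be put into numbers. The toy model below (every frame time in it is an assumption for illustration, not a measurement) treats frame rate as set by whichever of CPU or GPU time per frame is larger: a fixed per-frame DRM cost stays invisible while the fast CPU sits under the GPU's frame time, but shows up immediately on the slower CPU.

```cpp
// Toy bottleneck model: fps is limited by the slower of the CPU and GPU stages.
// All frame times below are assumed values, purely for illustration.
#include <algorithm>
#include <cstdio>

int main()
{
    const double gpuMs     = 10.0; // assumed GPU frame time (1080 Ti class, 1080p)
    const double drmMs     = 2.0;  // assumed fixed per-frame DRM overhead
    const double fastCpuMs = 6.0;  // assumed 8700k-class CPU frame time
    const double slowCpuMs = 12.0; // assumed 2600k-class CPU frame time

    auto fps = [&](double cpuMs) {
        return 1000.0 / std::max(cpuMs, gpuMs); // slower stage sets the pace
    };

    // Fast CPU: 6 ms -> 8 ms, both under the 10 ms GPU time, so the game
    // stays GPU-bound at 100 fps and the overhead is invisible.
    std::printf("fast CPU: %.0f -> %.0f fps\n",
                fps(fastCpuMs), fps(fastCpuMs + drmMs));

    // Slow CPU: 12 ms -> 14 ms, the CPU is the limit, so the drop shows:
    // ~83 fps -> ~71 fps.
    std::printf("slow CPU: %.0f -> %.0f fps\n",
                fps(slowCpuMs), fps(slowCpuMs + drmMs));
    return 0;
}
```

By the same model, though, a fast CPU pushed out of GPU-bound territory (lower resolution, stronger GPU) would show the overhead too, which is presumably why the article's concern is about exaggerating the hit rather than hiding it.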
airbud7:

for 1 or 2 fps? :D
You showed 1440p, where the bottleneck starts to fall on the GPU rather than the CPU. You'll find that if you go down to 720p or 1080p, the bottleneck is back on the CPU, and when even stronger GPUs arrive in the coming years, 1440p will become CPU-limited too. That being said, these are small differences that only really show once you drop the resolution... but consider how many people still use 720p/1080p screens.
All my games run perfectly smoothly with my beloved 2600k, and I have had no problems with it in seven years. Upgrading to a newer processor that would also run all my games smoothly isn't worth several hundred dollars for a new CPU, motherboard, and RAM. I remember a time when we had to upgrade hardware every other year. Those times are over.
@Iwontreadit For 1080p 60 fps it's still a very capable CPU, there's no question about that! The only reason I replaced my 3770k was that the motherboard went to motherboard heaven, and yes, I remember those times too. Right now it's not that simple: you have to weigh what the upgrade will cost against what you get in return. Take my system, an R5 1600 and a GTX 1060; I play at 1080p 60 Hz... yeah, yeah, I know, I'm a pleb... So say I want to move to 4K: I'd need at least 500 for a new monitor and another 700 for a GPU. Depending on how your finances are, you can go "sure, 1200 euros, pay and play," or you can be like me and decide to keep that 1200 for a few more years until 4K is closer to mainstream, at more mainstream prices! There's nothing wrong with going for it... hell, I would if I didn't have other bills running!
Hilbert Hagedoorn:

I am not trashing the 2600k; I think it was a great processor back in the day that can still hold its ground, which is one of the reasons I did this article. For this specific article, though, it should not have been used, as yes, it exaggerates the performance hit compared to modern CPUs. Not sure why you feel so insulted about it; nothing negative is being said about the 2600K. All I am saying is that it should not have been used for this test.
I'm sorry, but I actually do see this article as being rather negative about the CPU. Let's just agree to disagree on whether it's negative or not. Like I said above, the CPU is (still) amazing, and unless the next €350-400 i7 (or i9, if they're going a different route) has 8c/16t, I will keep mine for as long as I can. For 1080p60 this CPU can still hold its ground in most games. However, at higher refresh rates (and thus framerates), I do think it will bottleneck quite a bit more than what I'm experiencing.
vbetts:

Come on guys, we were having a good argument there! You don't get that too many times around here 😛 (This is not sarcasm, I actually liked discussing it with @Kaaskop, cheers buddy!)
I'm glad you didn't treat me as a complete idiot. For me it's quite difficult to put what I mean into actual text, and I sometimes come across as rude or childish, even though I try to back up my words with as much evidence as I can to show why my standpoint is what it is. Anyway, it was a good one, and I had fun trying to convince you of my point while you did the same for yours. https://i.imgur.com/LoMTw5X.gif