AMD Ryzen 5 4600H and Ryzen 7 4800H pop up in 3DMark: as fast as desktop?

Really impressive, and that is at a lower TDP.
Insane 🙂
It's nice that the mobile market might be getting some love. Years ago I remember seeing a few desktop CPUs make it to the laptop market, though those were straight-up desktop chips. These run nicely; if the next Surface Pro gets one of these I might have to grab one for that performance bump, and even if it's Zen+ level performance, that is still mighty impressive.
My next dream ThinkPad couldn't be released soon enough!
Just wait until AMD is on TSMC's 3nm gate-all-around process in 3-4 years. I have a weird prediction that Intel and AMD will put Nvidia out of business by the end of the decade. Who is buying a GPU if APUs have greater than 2080 Ti performance? Not many.
JamesSneed:

Just wait until AMD is on TSMC's 3nm gate-all-around process in 3-4 years. I have a weird prediction that Intel and AMD will put Nvidia out of business by the end of the decade. Who is buying a GPU if APUs have greater than 2080 Ti performance? Not many.
Depends on how ray tracing goes. I also don't think we can get that much performance, and let's not forget that game coding nowadays is quick, lazy and dirty, which means it's not optimal at all; that's why you need all that GPU power to run games that look barely better than they did in 2010.
FrostNixon:

let's not forget that game coding nowadays is quick, lazy and dirty, which means it's not optimal at all; that's why you need all that GPU power to run games that look barely better than they did in 2010.
I strongly disagree. Compare the overall scene, character and animation detail in a game like Red Dead Redemption 2 to Crysis 2 (2011). The quality is night-and-day different. Furthermore, most modern shaders and methods, photogrammetry for example, are arguably more complex processes (especially in upfront material design) than literally all the shaders in an entire game from 2010. The problem is the increased cost of building out these worlds (AAA studios today are topping 500+ people on projects) and diminishing returns with complex shaders (back in 2010 you could still easily double poly counts and get massively improved graphics; now you have to calculate the light bloom through a skin shader on a dozen characters to make a better-looking game). Are some projects rushed with poor optimization? Sure - but studios like DICE, Guerrilla Games, id, Naughty Dog, 4A, Ninja Theory, Cloud Imperium, etc. are all pushing the envelope graphically and far surpassing games from 2010.
JamesSneed:

Just wait until AMD is on TSMC's 3nm gate-all-around process in 3-4 years. I have a weird prediction that Intel and AMD will put Nvidia out of business by the end of the decade. Who is buying a GPU if APUs have greater than 2080 Ti performance? Not many.
This post is so weird to me. If APUs have greater than 2080 Ti performance by then, there will be some 4080 Ti or 5080 Ti that's faster. If some portion of a chip of a given size is dedicated to something other than graphics, then an add-on board that's entirely dedicated to graphics will always be faster.
Damn, stronger than my R5 2600 o_O
JamesSneed:

Just wait until AMD is on TSMC's 3nm gate-all-around process in 3-4 years. I have a weird prediction that Intel and AMD will put Nvidia out of business by the end of the decade. Who is buying a GPU if APUs have greater than 2080 Ti performance? Not many.
Nvidia isn't going anywhere, especially not at the hands of AMD and Intel. Neither of them can match the performance of Nvidia's cards: AMD doesn't have the funding, and Intel doesn't have the experience (or even products) to compete with Nvidia directly. I'd be surprised if Intel can even compete with AMD with their first GPU launch. Add in the fact that Nvidia also exists in other markets that AMD and Intel chips don't fit, and it's even less likely that Nvidia will be out of business any time soon. Nvidia is involved in the development of self-driving cars, which they expect to take off in the next few years. I think if Nvidia were to leave the dedicated GPU market, it would be because it's either no longer profitable or the consumer PC market has died. I don't see any chance of AMD and/or Intel (or really any other company, for that matter) driving Nvidia into bankruptcy.
Denial:

This post is so weird to me. If APUs have greater than 2080 Ti performance by then, there will be some 4080 Ti or 5080 Ti that's faster. If some portion of a chip of a given size is dedicated to something other than graphics, then an add-on board that's entirely dedicated to graphics will always be faster.
Agreed... though I'd be disappointed if we're only up to a 4080 Ti or 5080 Ti by the end of the decade, especially when we're expecting the 3000 series to launch in the coming months.
sykozis:

Nvidia isn't going anywhere, especially not at the hands of AMD and Intel. Neither of them can match the performance of Nvidia's cards: AMD doesn't have the funding, and Intel doesn't have the experience (or even products) to compete with Nvidia directly. I'd be surprised if Intel can even compete with AMD with their first GPU launch... ...I don't see any chance of AMD and/or Intel (or really any other company, for that matter) driving Nvidia into bankruptcy...
Let's be thankful for innovation and, oh yes, time. Rule #1: nothing stands still.
sykozis:

Nvidia isn't going anywhere, especially not at the hands of AMD and Intel. Neither of them can match the performance of Nvidia's cards: AMD doesn't have the funding, and Intel doesn't have the experience (or even products) to compete with Nvidia directly. I'd be surprised if Intel can even compete with AMD with their first GPU launch. Add in the fact that Nvidia also exists in other markets that AMD and Intel chips don't fit, and it's even less likely that Nvidia will be out of business any time soon. Nvidia is involved in the development of self-driving cars, which they expect to take off in the next few years. I think if Nvidia were to leave the dedicated GPU market, it would be because it's either no longer profitable or the consumer PC market has died. I don't see any chance of AMD and/or Intel (or really any other company, for that matter) driving Nvidia into bankruptcy. Agreed... though I'd be disappointed if we're only up to a 4080 Ti or 5080 Ti by the end of the decade, especially when we're expecting the 3000 series to launch in the coming months.
Yeah, I know it's a weird post. At some point not that far off there will be virtually no dedicated cards in 90% of PCs. Sure, Nvidia will have faster cards, but I'm saying that in the next 5 years or so more than 90% of people won't want to pay extra for the increase in performance. AMD and Intel will have enough density to do graphics in APUs the way only a dedicated card could before. Having a CPU chiplet, a GPU chiplet and a few stacks of HBM shared between both will also offer some performance synergies. Honestly, if you can get 2080 Ti performance (I actually think APUs will have more power than this, but there's nothing on the market to point at) out of this kind of chip, how many people are buying a dedicated GPU, even if we are talking 5 years from now? Every dedicated card in history has been obsoleted, at least in the desktop and lower segments, and GPUs will be no different. It's not a matter of if but when.
JamesSneed:

Yeah, I know it's a weird post. At some point not that far off there will be virtually no dedicated cards in 90% of PCs. Sure, Nvidia will have faster cards, but I'm saying that in the next 5 years or so more than 90% of people won't want to pay extra for the increase in performance. AMD and Intel will have enough density to do graphics in APUs the way only a dedicated card could before. Having a CPU chiplet, a GPU chiplet and a few stacks of HBM shared between both will also offer some performance synergies. Honestly, if you can get 2080 Ti performance (I actually think APUs will have more power than this, but there's nothing on the market to point at) out of this kind of chip, how many people are buying a dedicated GPU, even if we are talking 5 years from now? Every dedicated card in history has been obsoleted, at least in the desktop and lower segments, and GPUs will be no different. It's not a matter of if but when.
You are exactly right, and don't worry about the bruised egos of fanboys. The fact of the matter is that the entire consumer PC market is going mobile. PERIOD. Indeed, if it wasn't for mobile RTX, Nvidia wouldn't have made their targets the last two quarters, just as Jensen said. The simple fact that just about every game platform has game streaming (independent of the end-use GPU) furthers this even more. And don't say anything about game streaming sucking - it does, but these are early days and no one is giving up on it. WE are enthusiasts, WE are the sort to own GPUs and not mind (too much) the added expense, but the VAST majority of people would be delighted to have "2080 Ti" performance in an APU. Nvidia will be making (more) money in more lucrative segments, A.I. and automotive, where they aren't going anywhere, but AIBs have long been in decline.
JamesSneed:

Yeah, I know it's a weird post. At some point not that far off there will be virtually no dedicated cards in 90% of PCs. Sure, Nvidia will have faster cards, but I'm saying that in the next 5 years or so more than 90% of people won't want to pay extra for the increase in performance. AMD and Intel will have enough density to do graphics in APUs the way only a dedicated card could before. Having a CPU chiplet, a GPU chiplet and a few stacks of HBM shared between both will also offer some performance synergies. Honestly, if you can get 2080 Ti performance (I actually think APUs will have more power than this, but there's nothing on the market to point at) out of this kind of chip, how many people are buying a dedicated GPU, even if we are talking 5 years from now? Every dedicated card in history has been obsoleted, at least in the desktop and lower segments, and GPUs will be no different. It's not a matter of if but when.
I feel like that's already the case now? Your original post made it sound like "who's going to buy a GPU (that currently buys one)" - not total PC share. The vast majority of my friends and family don't have PCs at all; they just browse on tablets/phones. Work-wise, the majority of people can get away with a NUC/laptop - at my current company and my last one, everyone aside from the CAD guys runs on integrated setups. PC-gaming-wise, though (as far as actual games and not web ones go), I'm not sure I agree. I think gaming will go entirely cloud-based before 90% of gamers run integrated.
I can't wait to get a G14 with a 4800HS and a 1660 Ti.
JamesSneed:

Just wait until AMD is on TSMC's 3nm gate-all-around process in 3-4 years. I have a weird prediction that Intel and AMD will put Nvidia out of business by the end of the decade. Who is buying a GPU if APUs have greater than 2080 Ti performance? Not many.
APUs with the power of a 2080 Ti? Really? Aren't you exaggerating a little, maybe? Not to mention that by that time Nvidia will have something much more powerful, like others have said before.
tunejunky:

You are exactly right, and don't worry about the bruised egos of fanboys. The fact of the matter is that the entire consumer PC market is going mobile. PERIOD. Indeed, if it wasn't for mobile RTX, Nvidia wouldn't have made their targets the last two quarters, just as Jensen said. The simple fact that just about every game platform has game streaming (independent of the end-use GPU) furthers this even more. And don't say anything about game streaming sucking - it does, but these are early days and no one is giving up on it. WE are enthusiasts, WE are the sort to own GPUs and not mind (too much) the added expense, but the VAST majority of people would be delighted to have "2080 Ti" performance in an APU. Nvidia will be making (more) money in more lucrative segments, A.I. and automotive, where they aren't going anywhere, but AIBs have long been in decline.
Actually, the market is going to the cloud and to streaming services, and those services need big, powerful GPUs to render the games; guess who's going to provide those? Not to mention that when the moment arrives that Nvidia can't sell desktop GPUs anymore, it probably means AMD and Intel also can't sell their desktop CPUs, because everything will have moved to cloud/streaming services...
So, to be clear, I meant that 90% of PC gamers won't have a dedicated GPU sometime in the next 5 years. It's pretty obvious mobile, consoles, cloud solutions, etc. will help make this a reality by reducing the number of PC gamers, thus devaluing the whole market. If you look at the densities TSMC is projecting for the 3nm node, you can see AMD really could have a GPU with 2-3x the power of a 5700 XT in a small 200-250 mm² chiplet. Then you pair that GPU chiplet with an 8-core CPU chiplet and 32GB of HBM. You now have something that can play games at 4K really well, and all that memory bandwidth would make the CPU side really shine. This is what I'm expecting to see from AMD in 2023. Intel will be doing the same around then as well, with EMIB and Foveros on their new 7nm EUV process. This is why Intel has a huge focus on dedicated GPUs: it's not because they plan to make them forever, it's because they want to take the tech into a chiplet and tie it in via EMIB. Intel wisely saw this future coming from AMD, so they must have a competitive GPU solution for when people stop buying separate CPUs and dedicated GPUs. What I am saying is that Nvidia will have almost no consumer GPU sales once AMD and Intel pull this off.
Hope your assumptions are on track, but I somewhat doubt it. GPUs will exist as long as there is a market for them; my guess is they will be around for the next 10 years. We can barely drive a 4K60 display, let alone 120 Hz+, not to mention dual displays on next-gen VR. If anything, we have seen GPUs enter other areas that were more CPU-bound, like editing/rendering/transcoding/simulations/learning, etc. My guess is we will have GPUs for the next 10 years, and Nvidia will continue to raise prices if no one can compete with them; as long as we move to higher resolutions we will continue to need faster GPUs.
Nvidia is most likely going to pick up an ARM design company soon (yes, I know about Denver) - they are losing out on lots of exascale projects due to the lack of a CPU architecture. CXL is only going to go so far. Windows is becoming less and less dependent on x86 regardless.