Patent again confirms AMD is pursuing big.LITTLE architecture
LEEc337
With GPUs too: I heard mention that desktop Ryzens were going to start including GPU cores
LEEc337
Hopefully by then Intel will have Xe cores on their CPUs, and AMD some RDNA/RDNA2 cores at least on the 7000 series, and everybody can have some sort of gaming straight out of the box
Undying
Gaming out of the box is a myth. You need a dedicated GPU if you want to game. I would rather see them not waste space on the chip for an iGPU when you can use that space for more cores.
schmidtbag
I'm a little surprised it's taken AMD this long to go big.LITTLE. Their chiplet design makes it relatively easy to apply, and they've already had performance issues due to the Windows scheduler for at least two years, so I figured this is one of those situations where they'd say "while we're at it, let's just add little cores," since the scheduler needs to understand how to use those too.
Since next-gen APUs will be on DDR5 and based on RDNA, they ought to be a lot more potent. If you intend to game on an APU, you're not going to need more than 8 cores anyway. I too would rather have more cores, but I think it's good to have the option.
BLEH!
Only buying if it has more than 8 GPU cores. Come on AMD, seriously, you have the capability!
waltc3
Thing is, when companies register patents, it does not necessarily follow that they will ever use them. Many hardware patents are filed "in case we decide to go that route," and so on; it provides design latitude and offsets future licensing fees. I've always thought of SMT logical cores as sort of "little" processors. They are, in fact, less powerful than the main cores, but so far we can only turn them all on or all off: there is no "partial SMT" setting, so you either use them all or use none of them. It might be cool to see a different approach down the road. I see the best application in mobile computing, possibly, where battery life is a premium consideration. Time will tell.
Reddoguk
I wouldn't mind a CPU with a decent iGPU in it, as long as it does something for those of us who also have a dedicated GPU. Maybe only for an AMD/AMD system, but allowing both GPUs to work together could be a possible way to improve something like load times, or to pool memory across both GPUs.
I think in the next few years and CPU iterations we will see much better APUs, with things like DirectStorage from MS and DX getting better too. We'll have to wait and see what WDDM 3.0 and DX_FL12.2 bring to the table, but things are looking promising, and thanks to the next-gen consoles, really. Who'd have thought that consoles would actually be useful for once?
tunejunky
i'm not saying that discrete gpu's are dead or dying. i'm saying that consoles have finally gotten competitive. the main drawback to a "console master race" is the simple fact that consoles have far longer product cycles, so while they're competitive today they will be less so against rtx 4xxx or rdna 3 by winter 2022 - but depending on the production of said cards (production = pricing), many pc gamers with old cards may switch to consoles.
the other shoe that needs dropping is the game devs. while there may be a handful (or fewer) of new games that benefit from all of the eye candy, dev studios are hesitant to lay down sys reqs demanding more than kepler-era gpus. which brings them back to the easy meat of console development and porting. so this is a perfect storm for pc gamers.
Embra
GPUs may only exist at the high end in a few years.