AMD Navi spotted in macOS Mojave

How is this news now? This happened last October, or was it November?
Driver signing on Apple devices doesn't work the way it does on a PC:
- AMD works as-is because the drivers are already in the OS distribution (like drivers for nearly everything a normal user would plug into their own machine... less stress, by the way).
- Why AMD and not NVidia, then? Because Apple chose AMD as the OEM part in their machines (as they chose NVidia in previous generations). It's Apple's choice, not AMD's or NVidia's.
- NVidia is a third party and doesn't care much about Mac users, as they represent less each year (both in Mac volume and in potential clients); Mac users also don't buy much outside the Apple world, and without driver signing that's a problem.
About Apple's new CPU, there is a growing rumor that it could be ARM-based (the high-end iPad Pro is already at a powerful laptop's level). Apple is Apple and can't be compared to the PC world, despite using the same hardware.
David3k:

Typical nvidia fanboy, who only ever assumes that criticism of nvidia is somehow just opposing brand loyalty. Grow up, kid.
More to the point on Apple: whatever you have inside, AMD, Nvidia, IBM, Foxconn, or Intel, you still have an Apple from Apple, and users don't care what brand is inside; they just want it to work as-is.
I'll focus a bit on the bright news; the negativity in this thread is killing me. The Navi 16 seems like the next go-to card: if it's priced correctly, within the $300 range, with performance similar to the 2060 (the same way the RX 580 is almost on par with the 1060), this will be a great product, and I would gladly toss my cash at it.
rl66:

- Why AMD and not NVidia, then? Because Apple chose AMD as the OEM part in their machines (as they chose NVidia in previous generations). It's Apple's choice, not AMD's or NVidia's.
Previous gen? The last MacBook Pro with NVIDIA graphics (optionally included) was released in 2014; the iMac in 2013, the Mac Pro in 2009, and the MacBook in 2009.
FrostNixon:

I'll focus a bit on the bright news; the negativity in this thread is killing me. The Navi 16 seems like the next go-to card: if it's priced correctly, within the $300 range, with performance similar to the 2060 (the same way the RX 580 is almost on par with the 1060), this will be a great product, and I would gladly toss my cash at it.
They're more likely different configurations of one chip for laptops (or even the graphics portion of an APU), with the number indicating active CUs.
Is that radeon6000 a codename?
David3k:

Oh sure, let's forget all about how and why nvidia is, these days, completely stonewalled by Apple. Surely it has nothing to do with their past working history. Let's also forget how, for a time, nvidia was the only add-on graphics option for Apple. I wonder why that ended?
If memory serves, there was an issue with Nvidia GPUs in MacBook Pros a while ago, which led to mass returns because all the GPUs in that version of the MacBook Pro failed after normal usage, and Apple had to pay to fix them... I think it was around then that Apple started to dislike Nvidia.
KissSh0t:

If memory serves, there was an issue with Nvidia GPUs in MacBook Pros a while ago, which led to mass returns because all the GPUs in that version of the MacBook Pro failed after normal usage, and Apple had to pay to fix them... I think it was around then that Apple started to dislike Nvidia.
Yes, you are right, it was the NVidia 8600 that overheated. NVidia's position was that they only sold the GPU, so it was Apple's fault; Apple's position was that that line of GPUs was a defective batch. It ended with everyone suing each other, and in the end it turned out a defective batch of GPUs had been sold, and the lawyers made a lot of money from both sides. From my own experience, an unsatisfied client may well always be right; finding a solution together is a good way to save money and to make the commercial relationship stronger. NVidia failed completely in this story.
Nvidia sells a chip on a substrate, in a package, to manufacturers; for the 8600 they tried a thinner-than-industry-standard substrate. When they failed, industry experts said "told you so." Nvidia said it was the solder bumps failing. Everyone else rolled their eyes and said no one else was having this problem. Cue Nvidia going back to the accepted standard minimum substrate thickness. Not coincidentally, an 8600 GTS was the last Nvidia card I bought.
Moderator
Astyanax:

The failed GPUs Apple bought were implemented by Apple; typical Apple fanboy behavior is to forget that fact and blame the chip provider for the chip buyer's implementation failure. There were never any defects found in the G8x chips themselves; it always came down to brittle solder pads, as implemented by the likes of Apple, Dell, and HP. People like to forget that ATI Mobility-based laptops had brittle "bumps" too; the market was just so small at the time that it got barely any tech media traction. http://www.ee.cityu.edu.hk/~ycchan/publications-ycchan/PeerReviewedJournals/PeerJournal-206.pdf https://www.vogons.org/viewtopic.php?f=46&t=44126&start=20#p433375 No. There is a management-level decision that the OSX developers do not agree with at play here; Nvidia is shadow-banned from providing drivers for Mojave.
No, no, no. You are not going to start more trolling, and you are not going to throw any insults. This is your only warning.
tunejunky:

Well boys (and the one or two gals), the floodgates are opening. The Navi variants (at least two guaranteed in the first launch, most likely three) are being leaked on social media in Taiwan and China by second- and third-party suppliers and AIBs. I already mentioned the PS5 rumors (which gained steam over the last 48 hours), which isn't surprising, as the console market basically paid for Navi development. The more interesting aspect to me is how quietly everyone is taking Sony basically absorbing the HTPC market, as custom builds will not be as cost-effective or as attractive. When you have an 8-core CPU (with or without HT) and a dedicated 1440p GPU (on an SoC), you are getting a lot of functionality. Depending on Sony for the software/interface may be an issue, and it remains to be seen how much they know they have a tiger by the tail.
While there is definitely something to it, I honestly don't believe they'll grab the "HTPC" market, since such a market... has never existed in the first place. What exists are consoles and TVs with apps... and Sony won't be making TVs any more than they already do. But definitely, the console is the Blu-ray player (it has been since the PS3), though there's actually not much more to it. And for that, honestly, you don't need 8 cores or a 1440p GPU. I do agree with you that the HTPC market lives off the machine also being usable as a PC, which the consoles never will be, since that would require a much more open platform than a console platform.
rl66:

Yes, you are right, it was the NVidia 8600 that overheated. NVidia's position was that they only sold the GPU, so it was Apple's fault; Apple's position was that that line of GPUs was a defective batch. It ended with everyone suing each other, and in the end it turned out a defective batch of GPUs had been sold, and the lawyers made a lot of money from both sides. From my own experience, an unsatisfied client may well always be right; finding a solution together is a good way to save money and to make the commercial relationship stronger. NVidia failed completely in this story.
It wasn't overheating; the devices operated within their intended temperature specification. Industry-wide, solder at the time was transitioning to RoHS, and everyone got stung by "brittle bumps". The Xbox 360 has an IBM CPU and a customized Radeon under the hood, and it had the same issues.
Bitey:

Nvidia sells a chip on a substrate, in a package, to manufacturers; for the 8600 they tried a thinner-than-industry-standard substrate. When they failed, industry experts said "told you so." Nvidia said it was the solder bumps failing. Everyone else rolled their eyes and said no one else was having this problem. Cue Nvidia going back to the accepted standard minimum substrate thickness. Not coincidentally, an 8600 GTS was the last Nvidia card I bought.
Except that history and scientific study of the matter (rather than news-outlet scare-mongering) tell a very different story. Recall that the only sources that deep-dived the issue happened to be written by a certain Charlie.
David3k:

The second part of your "history lesson" forgets the part where nvidia has a long-standing history of throwing their partners under the bus to deflect blame from themselves when something goes wrong.
That has been Apple's business practice since inception; in fact, Steve Jobs behaved this way in his personal life too. The "Apple" doesn't fall far from the tree, so to speak. Have you seen any of Louis Rossmann's videos?
Interesting, I came to read a discussion about new GPUs, but all I see is just a bunch of people arguing about Apple and Nvidia. Anyway, I hope the new cards do come around August/September; that's around the time I want to upgrade. We also need more competition to get GPU prices back to normal. As for the PS5, I don't think it will be interesting as an HTPC, because I doubt that Sony will allow the same functionality as a PC. On the other hand, the PS5 could be somewhat more powerful, and with GPU prices as they are now, it could be an actual alternative to PC gaming.
Backstabak:

As for the PS5, I don't think it will be interesting as an HTPC, because I doubt that Sony will allow the same functionality as a PC. On the other hand, the PS5 could be somewhat more powerful, and with GPU prices as they are now, it could be an actual alternative to PC gaming.
Now that consoles are getting decent CPUs (Ryzen), we could actually see a generation of 60 FPS gaming. If that's the case, consoles will be far more attractive to me. If nothing else, multi-core utilization on PC ports should theoretically improve, so it seems like a win all around. My hope, however, is that devs don't just spend all the extra power on higher resolutions and over-the-top polygon counts that don't add much to the experience, but we'll see.
Moderator
Backstabak:

Interesting, I came to read a discussion about new GPUs, but all I see is just a bunch of people arguing about Apple and Nvidia.
Same here, so guys, can we not turn this into an Apple-vs-the-world topic?
Backstabak:

On the other hand, the PS5 could be somewhat more powerful, and with GPU prices as they are now, it could be an actual alternative to PC gaming.
dirthurts:

Now that consoles are getting decent CPUs (Ryzen), we could actually see a generation of 60 FPS gaming. If that's the case, consoles will be far more attractive to me. If nothing else, multi-core utilization on PC ports should theoretically improve, so it seems like a win all around. My hope, however, is that devs don't just spend all the extra power on higher resolutions and over-the-top polygon counts that don't add much to the experience, but we'll see.
I hate to ask this, but hasn't it always been like this: consoles being pretty good hardware-wise when they are released? Let's look at it again four years down the road... Also, just as with old console ports, it always comes down to how much a dev invests in a port... not what could have been done, but how much time the publisher gives them to even invest in a proper port. But we can all hope that faster consoles and more widespread API adoption will bring fewer "limits" to the PC in the future.