AMD CEO Dr. Lisa Su believes artificial intelligence (AI) will play a critical role in the future of the industry

AI is already in use all over; it's just now coming out into the open. The stuff being done in secret today would read as sci-fi, and later it reaches the public, and on and on. Just look at MidJourney: type in "cyberpunk as an 80s action movie", maybe give it some links as references, and it spits out stills or a short clip that can be cut into a YouTube video, like the one I linked in the CP77 thread. That's a lot of lost work for digital artists and others. Huge open-world game maps could be roughed in from spreadsheet metrics, Google Maps data, etc., with period assets generated too, and then some temp contractors come in to detail it out and are gone in three months. There are all sorts of uses for AI: drones, for example, with no need for pilots, or where the pilot is just window dressing, and planes that can bank at G-forces no human could take. Robotics is huge in engineering right now; every industry that can robotize will, and with the right AI it goes even further. It's a death spiral; the real cattle cars will be (soylent) green-powered on the way to the "factories".
Horus-Anhur:

It has already flipped. AMD already stated they will add Tensor units to RDNA4. But this is for late 2024, a very long time in the computer space.
Once those cards arrive, AMD GPU prices will go up a good chunk too. I highly doubt they'll sell at lower prices than Nvidia once both have those cores. It would be nice if that were the start of prices coming back down, but that's probably not happening.
tsunami231:

Once those cards arrive, AMD GPU prices will go up a good chunk too. I highly doubt they'll sell at lower prices than Nvidia once both have those cores. It would be nice if that were the start of prices coming back down, but that's probably not happening.
AMD has already shown they're not the budget brand any longer. You only have to look at how expensive an AM5 X670 setup is compared to Z790.
geogan:

"Two weeks ago, Microsoft said it had bought tens of thousands of Nvidia’s AI-focused processors, the A100 GPU, in order to power the workload of OpenAI. Nvidia has sold 20,000 H100s, the successor to that chip, to Amazon for its cloud computing AWS service, and another 16,000 have been sold to Oracle." Those are staggering amounts of expensive GPUs being sold now to big tech companies and this is why NVidia is now reaming rewards for all the ground work it's put into this area for years. I wonder what Oracle is going to produce after buying that many H100s?! Source: https://www.theguardian.com/technology/2023/mar/26/cryptocurrencies-add-nothing-useful-to-society-nvidia-chatbots-processing-crypto-mining
Probably an AI whose first words would be: "We are Oracle and we are many. To save time let's just assume that we are never wrong!"
H83:

Unfortunately, AMD doesn't have the resources to bet on every possible market, so they have to lag behind in some of them and leave others completely unattended.
You're right - I tend to forget that they have CPUs (and I own one, so imagine how I can forget), GPUs, the server market, consumer products, etc. But there's a lack of vision here: if you can envision where the next hot thing will be, you win. Nvidia has been pushing AI since 2016-17 and has won, at least for now, and you have to give credit to the leather jacket man, because a good CEO must have great vision and he was right to push for this.
Horus-Anhur:

CDNA has had tensor units since the first iteration. And Xilinx was producing and selling FPGAs and ACAPs before the merger, and is still doing it after the merger. AMD might be a bit behind the game compared to Nvidia, but not as much as you think.
You're also kind of right: they may not be as far behind as I thought, but they missed a big cash train here, and development and innovation can't exist without money. Nvidia, because of their success in this field, now has the money to develop even further - they will not stop and wait for the others, while "the others" must get the money from somewhere. By the way, AMD buying Xilinx was a great move; it proves they are interested, but let's see how they intend to use it. With an Nvidia GPU you can have a neural network at home - who could have believed that ten years ago? We use neural networks a lot at work for scientific simulations and other kinds of experiments, and I love CUDA - it's really beautiful in its simplicity, very easy to apply in your code, and already very advanced. AMD launched GPUFORT two years ago as an alternative to CUDA, and nobody cared... and without software and community support, you can add as many tensor cores as you want and it won't matter. Here Nvidia is far in front again, because they nailed it with CUDA as well. By the way, Nvidia launched CV-CUDA last year - an open-source derivative of CUDA for computer vision - so they are not sleeping just because others are not at their level. The one good thing about Nvidia: they are not Intel with its lazy refreshes.
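For anyone who hasn't touched CUDA: the simplicity barbacot mentions is real. A complete SAXPY program is a minimal sketch of it, using only the standard CUDA runtime API; the array size and launch parameters below are arbitrary illustrative choices, not anything taken from the thread.

```
#include <cstdio>
#include <cuda_runtime.h>

// SAXPY: y = a*x + y. Each GPU thread handles one array element.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;          // ~1M elements, an arbitrary size
    float *x, *y;

    // Unified memory: the same pointers are valid on host and device.
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover all n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);    // expect 4.0 = 2*1 + 2
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```

Compile it with nvcc and each of the million elements gets its own GPU thread; that kernel-plus-launch pattern is most of what "applying CUDA in your code" amounts to.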
barbacot:

You're also kind of right: they may not be as far behind as I thought, but they missed a big cash train here, and development and innovation can't exist without money. Nvidia, because of their success in this field, now has the money to develop even further - they will not stop and wait for the others, while "the others" must get the money from somewhere. With an Nvidia GPU you can have a neural network at home - who could have believed that ten years ago? We use neural networks a lot at work for scientific simulations and other kinds of experiments, and I love CUDA - it's really beautiful in its simplicity, very easy to apply in your code, and already very advanced. AMD launched GPUFORT two years ago as an alternative to CUDA, and nobody cared... and without software and community support, you can add as many tensor cores as you want and it won't matter. Here Nvidia is far in front again, because they nailed it with CUDA as well. By the way, Nvidia launched CV-CUDA last year - an open-source derivative of CUDA for computer vision - so they are not sleeping just because others are not at their level. The one good thing about Nvidia: they are not Intel with its lazy refreshes.
You are missing my point. I'm not saying that Nvidia is not the market leader. What I'm saying is that AMD also has products in AI, which is very different from what you were saying, that AMD had nothing at all. Will AMD ever take the crown from Nvidia? I seriously doubt it. Jensen is not an incompetent moron like Brian Krzanich. But AMD, like several other AI companies, can carve out a part of the market for themselves.
But why is she saying something we've known for almost 10 years now? Brownie points?
barbacot:

You're right - I tend to forget that they have CPUs (and I own one, so imagine how I can forget), GPUs, the server market, consumer products, etc. But there's a lack of vision here: if you can envision where the next hot thing will be, you win. Nvidia has been pushing AI since 2016-17 and has won, at least for now, and you have to give credit to the leather jacket man, because a good CEO must have great vision and he was right to push for this. You're also kind of right: they may not be as far behind as I thought, but they missed a big cash train here, and development and innovation can't exist without money. Nvidia, because of their success in this field, now has the money to develop even further - they will not stop and wait for the others, while "the others" must get the money from somewhere. By the way, AMD buying Xilinx was a great move; it proves they are interested, but let's see how they intend to use it.
Well, it's easier to forge a future path when you only have one product line (GPUs) to develop rather than several, like AMD. But I'm the first to admit that Jensen has been an amazing leader for Nvidia, understanding that great hardware is only as good as the software that supports it and creating new markets for Nvidia that are now paying off big time. It's a shame he also seems to be an ass, like 90% of highly successful CEOs...
Horus-Anhur:

You are missing my point. I'm not saying that Nvidia is not the market leader. What I'm saying is that AMD also has products in AI, which is very different from what you were saying, that AMD had nothing at all. Will AMD ever take the crown from Nvidia? I seriously doubt it. Jensen is not an incompetent moron like Brian Krzanich. But AMD, like several other AI companies, can carve out a part of the market for themselves.
You're right - it must be my male period - I can't think straight at that time of the month :D
Maddness:

AMD has already shown they're not the budget brand any longer. You only have to look at how expensive an AM5 X670 setup is compared to Z790.
I have noticed their prices are "almost" as ridiculous as Nvidia's; it's getting there. So once tensor cores get added, it'll be much the same. I think prices are gonna get a hell of a lot worse before they get better. I really hope Intel pulls a rabbit out of its ass and starts making competitive cards at half the price, but that probably won't happen either.
barbacot:

Probably an AI whose first words would be: "We are Oracle and we are many. To save time let's just assume that we are never wrong!"
Well, considering that even now I was getting failures because a column alias was over 30 characters long (in 12.1 - I know, an older version), maybe they could improve their DB first so it's not stuck in the 1980s.
geogan:

Well, considering that even now I was getting failures because a column alias was over 30 characters long (in 12.1 - I know, an older version), maybe they could improve their DB first so it's not stuck in the 1980s.
...more like the 1970s - that limit has been there from the start... I don't like it. In my youth I interned for a short period at an Oracle division, and it was like MS's worst conceptions folded ten times over. Never used it after that: MySQL, MariaDB (the fork of MySQL, which by the way Oracle killed off after it bought Sun - they were like "No, we will support MySQL, we are committed to open source" - blah, corporate lies), and PostgreSQL (best GIS extensions for maps IMHO), or even MS SQL - quite good actually for big projects.
Reddoguk:

Surely it's quantum computers that will deal with AI best; the way they can be used to crack encryption thousands of times faster than normal CPUs matters. Imagine ChatGPT 10 years from now. If you were on a phone line with such a bot, you might never know it's a bot, which is scary and bad for people's jobs. AI could eventually replace many, many jobs, because of course businesses will use bots to replace real people: one, money, and two, bots don't need sleep. I mean, we talk about sentient beings, but apart from self-awareness we don't even know what sentience is. I'm not going to go into how a human brain works, but what we do know is that we don't know what consciousness is. So how would we know whether a machine has become conscious or not, when consciousness just means having knowledge of something?
Once quantum computers are a thing - and it's a matter of when now, not if - all current hardware will become obsolete.
barbacot:

...more like the 1970s - that limit has been there from the start... I don't like it. In my youth I interned for a short period at an Oracle division, and it was like MS's worst conceptions folded ten times over. Never used it after that: MySQL, MariaDB (the fork of MySQL, which by the way Oracle killed off after it bought Sun - they were like "No, we will support MySQL, we are committed to open source" - blah, corporate lies), and PostgreSQL (best GIS extensions for maps IMHO), or even MS SQL - quite good actually for big projects.
I worked for Oracle too, about 20 years ago... but I still have to use it today... I think government/corporate clients like the feeling of safety of a big-name DB behind their web system.