But can it play Battlefield V on Intel Gen12 Xe?

Nvidia's time running out? How long before people don't care for dedicated GPUs anymore? Running BF5 at 1080p/30 Hz on an ultrabook is truly impressive.
XenthorX:

Nvidia's time running out? How long before people don't care for dedicated GPUs anymore?
Not anytime soon, but yeah, I do agree the market will likely keep shrinking, with only enthusiasts buying them. With RTX, VR, and ever-increasing refresh rates, the demand for GPU horsepower will be there for the foreseeable future.
XenthorX:

Nvidia's time running out? How long before people don't care for dedicated GPUs anymore? Running BF5 at 1080p/30 Hz on an ultrabook is truly impressive.
Impressive would be if it ran at 1080p/60 Hz on medium settings. We know nothing about the settings used, and no one games at 30 fps.
But..... can it run Windows 3.1 ....
Desktop Xe please.
XenthorX:

Nvidia's time running out? How long before people don't care for dedicated GPUs anymore? Running BF5 at 1080p/30 Hz on an ultrabook is truly impressive.
This is just silly. Dedicated GPUs aren't anywhere close to being obsolete. People still complain about the performance impact of ray tracing, and that runs on dedicated silicon in Nvidia's implementation (the only one actually on the market). And that's just one example; there are many other aspects of current rendering techniques that aren't physically accurate at all, even on behemoth cards that pull 10x the power an iGPU uses. Imagine the complaints when running these advanced games under the restrictions of an integrated GPU... Photorealism in real time is close, but not here yet.

And framerates keep climbing, but in a world where people complain about not being able to run at 4K/60 FPS, I seriously doubt we'll see iGPUs competing for AAA gaming in the next decade.

I would bet with a lot more confidence on a paradigm shift toward unified RAM between CPU and GPU, similar to what we have on consoles and (sort of) on iGPUs today. That way each processor would do its own thing in a specialized manner, with its own power budget and likely its own physical space, while being able to share and manipulate resources in parallel on an ultra-fast common memory pool.
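The shared memory pool described above already exists in a limited form on the PC side: CUDA's managed ("unified") memory gives the CPU and GPU a single pointer into one allocation, with the runtime migrating pages behind the scenes rather than the truly shared physical RAM that consoles and iGPUs use. Below is a minimal sketch of that idea; the kernel, buffer size, and scale factor are arbitrary illustrations, not anything from the article or the comments.

    // Minimal sketch: CUDA managed memory as an approximation of a
    // CPU/GPU shared memory pool. Build with: nvcc unified_demo.cu
    #include <cstdio>
    #include <cuda_runtime.h>

    // Hypothetical kernel: scales every element of a buffer in place.
    __global__ void scale(float *data, int n, float factor) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) data[i] *= factor;   // GPU writes straight into the shared allocation
    }

    int main() {
        const int n = 1 << 20;
        float *data = nullptr;

        // One allocation, visible to both CPU and GPU -- no explicit cudaMemcpy.
        cudaMallocManaged(&data, n * sizeof(float));

        for (int i = 0; i < n; ++i) data[i] = 1.0f;   // CPU initializes it in place

        scale<<<(n + 255) / 256, 256>>>(data, n, 2.0f);
        cudaDeviceSynchronize();                      // wait before the CPU reads it back

        printf("data[0] = %f\n", data[0]);            // prints 2.000000
        cudaFree(data);
        return 0;
    }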
Just imagine for a second if Nvidia hadn't opened the door to ray tracing with its previous generation.