When the AI bubble pops, what will remain? Cheap GPUs at firesale prices, skilled applied statisticians looking for work, and open source models that already do impressive things, but will grow far more impressive after being optimized:
That has never worked well. It might give high average framerates on paper, but it introduces jitter that produces a worse overall experience. In fact, Gamers Nexus just released a video on a better way to measure this, and it touches on the problems with multi-GPU setups:
https://youtu.be/qDnXe6N8h_c
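
To illustrate the point about averages hiding jitter, here's a rough Python sketch (my own illustration, not taken from the video; the frametime numbers are made up) comparing average FPS against 1%-low FPS for two traces that have the same average:

```python
import statistics

def fps_metrics(frametimes_ms):
    """Return average FPS and '1% low' FPS from a list of frametimes in ms."""
    avg_fps = 1000.0 / statistics.mean(frametimes_ms)
    slowest_first = sorted(frametimes_ms, reverse=True)   # slowest frames first
    cutoff = max(1, len(frametimes_ms) // 100)            # worst 1% of frames
    low_1pct_fps = 1000.0 / statistics.mean(slowest_first[:cutoff])
    return avg_fps, low_1pct_fps

# Smooth trace: every frame takes ~10 ms (about 100 FPS).
smooth = [10.0] * 1000

# Jittery trace: same 10 ms average, but frames alternate between 5 ms and 15 ms,
# roughly the kind of uneven pacing you can get with alternate-frame-rendering SLI.
jittery = [5.0, 15.0] * 500

for name, trace in [("smooth", smooth), ("jittery", jittery)]:
    avg, low = fps_metrics(trace)
    print(f"{name}: avg {avg:.0f} FPS, 1% low {low:.0f} FPS")
# Both traces report ~100 FPS average, but the jittery one has far worse 1% lows.
```

Both traces average about 100 FPS, yet the jittery one drops to roughly 67 FPS on its worst 1% of frames, which is the kind of inconsistency an average framerate number completely hides.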
I think that you misunderstood my comment.
The video shows how SLI makes the frame pacing more inconsistent, which is a known issue when multiple GPUs work together to solve the same problem.
What I am talking about is more like Nvidia Optimus. This is a common technology on laptops, where the display is connected to the low-power iGPU, while games can use the dedicated Nvidia GPU.
I don’t know about potential frame pacing issues with these technologies, and it doesn't seem to have been addressed in the video either. However, I know that newer laptops have a switching chip (a MUX switch) that connects the display directly to the dedicated GPU, which, I think, aims at lowering latency.