Nvidia's $1599 RTX 4090 GPU reviewed

Originally published at: Nvidia's $1599 RTX 4090 GPU reviewed | Boing Boing

2 Likes

Man, it’s been a really long time since I’ve paid attention to gaming rigs.
Aside from the eye-popping price - 450W power draw from a video card? Holy crap.

9 Likes

If existence in the Matrix ends up being privatized, I imagine graphics cards will end up taking the place of rent in the inequality equation. Based on my current housing situation, let’s just say it’s a good thing I’m still nostalgic for VGA…

4 Likes

For me this latest generation is kind of a miss. Numbers-wise it looks good if you’re a few generations behind, but the power draw and prices are up there. Also, the size of the 4090 is comically gigantic.

4 Likes

A few more generations and we will have motherboard-sized GPUs with a plug-in for a Raspberry Pi-sized single-board computer with CPU, RAM, and storage.

9 Likes

I joked with a friend that you might need rebar inside the PC case to hold the GPU up.

2 Likes

In some ways, Apple is on to something with going to external GPUs. The physical size and power requirements of modern GPUs mean they crossed over into “second computer” territory a long time ago. Trying to keep stuffing that second computer into your first computer, with all the space, power, and thermal problems that creates, is getting into diminishing returns.

5 Likes

For me, having my GPU external to my PC would not be completely ideal, but with that said I certainly would not have bought as huge a case as I did in the first place. I think that’s a good trade-off: I would be willing to have a more manageable case size if I knew I didn’t have to worry about whether my GPU would fit.

1 Like

Yeah, e-PCIe (external PCIe) is an interesting idea and strangely underutilized, as is Thunderbolt; they’ve both been around for a while. I had a guy using a TB3 eGPU (a 1070 at the time) with an XPS 13, and it was pretty good, except that you’re dependent on the manufacturers implementing things properly. Dell sucks in that respect: they didn’t implement the full TB3 spec and only wired up two of the four PCIe lanes >:(
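For a rough sense of what the eGPU link costs you, here’s a back-of-the-envelope bandwidth comparison (a sketch using nominal per-lane PCIe rates; the cut-down x2 case is what a half-spec laptop implementation gives you, and real TB3 PCIe tunneling throughput is lower still in practice):

```python
# Back-of-the-envelope link bandwidth comparison: desktop PCIe slot vs. TB3 eGPU.
# Nominal figures only; actual TB3 PCIe tunneling throughput is lower in practice.

PCIE3_GBPS_PER_LANE = 8 * (128 / 130)    # PCIe 3.0: 8 GT/s, 128b/130b encoding
PCIE4_GBPS_PER_LANE = 16 * (128 / 130)   # PCIe 4.0: 16 GT/s

links = {
    "Desktop x16 slot (PCIe 4.0)":      16 * PCIE4_GBPS_PER_LANE,
    "Full TB3 eGPU link (x4 PCIe 3.0)":  4 * PCIE3_GBPS_PER_LANE,
    "Cut-down TB3 link (x2 PCIe 3.0)":   2 * PCIE3_GBPS_PER_LANE,
}

for name, gbps in links.items():
    print(f"{name}: ~{gbps:.0f} Gbit/s ({gbps / 8:.1f} GB/s)")
```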

2 Likes

Yeap. Not really rebar, but my system has one of these:
https://www.google.com/search?q=GPU+bolster&tbm=isch

1 Like

External GPUs still aren’t supported on Apple Silicon, and no one knows if they will be.

3 Likes

In what way is this OK? It feels borderline irresponsible that GPUs draw this much power. I mean, I love gaming, but I’m the guy who frets about how much his furnace fan draws during the summer.

3 Likes

I watched a review of several of the third-party cards and many of them came with extra brackets or support bits.

I haven’t actively played PC games in about 20 years and didn’t even know such a thing existed for home rigs. I’ve also been a Mac user at home since like 2011…
In the server infrastructure world that I live in, you will see very expensive GPU cards in use in VM clusters hosting virtual desktops for people who need better graphics in that application. But for a four-node cluster, you are hosting a LOT of virtual desktops with one card in each node and the hypervisor passing it through to the VMs.
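To put a toy number on how many desktops one card per node can serve, here’s a sketch; the framebuffer size, vGPU profile size, and node count are made-up illustrative values, not a real deployment:

```python
# Toy vGPU capacity estimate for a small VDI cluster.
# All numbers are illustrative assumptions, not a real configuration.

NODES = 4              # hypothetical four-node cluster, one GPU per node
FRAMEBUFFER_GB = 24    # hypothetical datacenter card with 24 GB of framebuffer
PROFILE_GB = 2         # hypothetical 2 GB vGPU profile per virtual desktop

desktops_per_card = FRAMEBUFFER_GB // PROFILE_GB
total_desktops = NODES * desktops_per_card
print(f"{desktops_per_card} desktops per card, {total_desktops} across the cluster")
```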

1 Like

The fireplace screensaver will feel real.

4 Likes

Ugh that’s too bad. Another random Apple pivot, maybe.

2 Likes

Yeah, but even those huge partitionable GPUs pale in performance per dollar when you stack them up against these consumer ones, although that much pixel-pushing power comes with a big power draw, and loads of waste heat to get rid of too.
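As a rough illustration of how that performance-per-dollar gap shakes out, here’s a toy comparison; the 4090’s $1599 list price is from the review, while the datacenter card’s price and relative performance are placeholder assumptions:

```python
# Toy performance-per-dollar comparison. The consumer card's price is the 4090's
# list price; the datacenter card's price and relative performance are made up.

cards = {
    "Consumer card (RTX 4090)": {"price_usd": 1599,  "relative_perf": 1.0},
    "Datacenter card":          {"price_usd": 10000, "relative_perf": 0.9},  # hypothetical
}

for name, card in cards.items():
    perf_per_1k = card["relative_perf"] / (card["price_usd"] / 1000)
    print(f"{name}: {perf_per_1k:.2f} relative performance per $1000")
```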

Oh, totally. Those Tesla cards and the like are just for the “decent” performance necessary for certain tasks. I’ve seen them used for CAD on virtual desktops, but honestly, the few people in an organization who really need CAD or something similar are better off having a dedicated workstation. VDI is best for general workloads, which is most people in an office.

I have a gaming rig with a big power-sucking video card but I only use it a few hours a week and am happy to pay for/generate power required. That seems like the definition of “OK” to me.
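A quick ballpark of what those few hours a week cost in electricity (the 450 W figure is the card’s rated draw; the hours per week and price per kWh are assumptions):

```python
# Ballpark electricity cost of gaming on a 450 W card a few hours per week.
# The wattage is the card's rated board power; hours and rate are assumptions.

CARD_WATTS = 450        # RTX 4090 rated board power
HOURS_PER_WEEK = 5      # assumed "a few hours a week"
PRICE_PER_KWH = 0.15    # assumed electricity price, USD per kWh

kwh_per_week = CARD_WATTS / 1000 * HOURS_PER_WEEK
cost_per_year = kwh_per_week * 52 * PRICE_PER_KWH
print(f"~{kwh_per_week:.2f} kWh per week, roughly ${cost_per_year:.0f} per year")
```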

I can see that, because it’s a lot of thermal load split amongst a bunch of different tasks. That seems more efficient?

I’m not trying to be judgemental - only a few hours a week is probably fine? I’ve been out of the loop for a few years.