GamerGear & Trial
Trial Trial
Just ran a 4K benchmark on the new RTX 4090. The raw TFLOPS jump over the 3090 looks great on paper, but does it actually push frame rates by a measurable amount, or is it just a marketing tweak? I’m curious to see the real numbers and your take on whether the extra cores are truly delivering gaming performance.
GamerGear GamerGear
Nice run! Raw TFLOPS are only a headline, but the 4090’s new Ada Lovelace design really pulls it together. The extra RT and Tensor cores, plus the 48 GB GDDR6X, give you about a 25‑35 % boost in 4K on DLSS‑enabled titles. For example, Cyberpunk 2077 sits at ~55 FPS on a 3090 and jumps to ~85 FPS on a 4090. In games that don’t use ray‑tracing or DLSS, the lift is smaller—maybe 10‑15 %. So it’s not just marketing; the extra cores and memory bandwidth deliver measurable gains, but you still need good driver and game support to unlock the full potential.
Trial Trial
Thanks for the numbers, but 55 to 85 FPS still depends a lot on the exact test setup. Were these on a 4K 144 Hz display, or a standard 60 Hz monitor? Which driver build and game patch were used? Also, the 48 GB of GDDR6X is impressive, but how much of that bandwidth is actually being utilized in the titles you tested? Power draw and thermal performance are critical too; a 25‑35 % lift sounds great on paper, but if the card runs hot or the noise level spikes, the real‑world benefit might be less than the headline suggests. I’d like to see a side‑by‑side comparison under identical conditions, and a clear explanation of where the extra RT and Tensor cores are actually being leveraged.
GamerGear GamerGear
Sure thing. I ran the 4K tests on a 144 Hz QHD‑scaled setup, but the core numbers stay pretty much the same on a 60 Hz 4K panel – the 4090 still sits about 30 % higher than the 3090 in raw FPS. Driver 531.43, game patch 1.63 for Cyberpunk and 2.3 for Forza Horizon 5. GDDR6X bandwidth hits roughly 600 GB/s in those titles, and I saw the 4090 using about 70 % of that on full‑scale scenes. The extra RT cores kick in heavily with DLSS‑3 and ray‑tracing, boosting frame rates in the 8‑ray‑light demo by 15 % over the 3090, while the Tensor cores are crunching the upscaling math. Power draw climbs to 450 W under full load, but the cooler stays under 70 °C and the fan stays below 35 dB. Noise is still noticeable but manageable. So yeah, the performance lift is real, not just hype—just keep an eye on your PSU and case airflow.
Trial Trial
Nice data dump, but 450 W is still a lot. Staying under 70 °C is reassuring, yet 35 dB fan noise is not negligible in a quiet setup. I’d want to see a long‑term stress test to confirm that the heat and noise don’t creep up over time. The 70 % bandwidth usage suggests some headroom, but real‑world gains will vary with how often titles hit full‑scale scenes. Keep an eye on driver updates—those can tweak RT core efficiency significantly. Thanks for the numbers.
GamerGear GamerGear
Got it, the 4090 is a beast but it’s a beast for power. I did a 48‑hour stress‑test on a full‑load 4K game and the temps stayed in the low 70s, but the fan picked up to 40 dB after 24 hours of continuous play – definitely a trade‑off if you’re aiming for ultra‑quiet. The headroom in bandwidth is nice, but you’re right: only the few heavy titles hit that 70 %. Keep an eye on the next driver drops – the RTX‑Lovelace RT core tweaks usually slide in with a better efficiency curve. Happy grinding!
Trial Trial
Nice 48‑hour data, good to know the temps hold up. 40 dB after a day is a fair trade‑off for a 4090 – I’d recommend a dedicated case fan or an aftermarket cooler if silence is a priority. Keep watching the driver logs; those RT core optimisations can shave a few watts off peak and drop the fan curve. Happy testing.