5120x1440 GPU benchmarks: a Reddit roundup


Assassin's Creed: Odyssey: 64 FPS average in the built-in benchmark; actual gameplay is more in the 70-80 FPS range.

Rasterization-wise, the 7900 XTX and 4080 are neck and neck, but the 7900 XTX is cheaper.

Time Spy Extreme is good, but Port Royal is a better choice to run as a looping test for modern RT GPUs. Time Spy is not the best one by far; it's just common because it's the free demo.

5120 x 1440 is about 7.37 million pixels, roughly 88.8% of 4K.

My stepdad bought himself the Samsung Odyssey OLED G9 (completely OP for him, but whatever).

Pulling up two builds of Firefox (same version, different build options so both can run on the screen together) at 10K: Intel: 20 fps, 70% GPU usage; Nvidia: 25 fps, 50% GPU usage.

I spent some time checking different benchmarks, tests, etc., and want to share the results with you.

The Blue Protocol benchmark tool doesn't run when I use my native 5120x1440 resolution, but the difference between 1920x1080 and 3840x1080 is negligible. No idea, but I'd predict that stable 60 FPS will require at least an R5 5600 / i5 11400 on the CPU side, and on the GPU side pretty much the recommended RTX 2080; hopefully a 6600 XT will do rather than a 6800 XT.

Only a 15-27 FPS increase in highs over my RTX 3090 OC.

3840x2160 (4K) is NOT ultrawide, but it is more demanding on the GPU than any of the above.

I have an Odyssey G9, and at that resolution it does take a lot to drive it.

Hey everyone, I've got a B550 board with a Ryzen 5800X, 64 GB of 3600 MHz RAM, and, the topic at hand, a 3070, all bottled up in a Lian Li O11 Mini.

If there are optimizations for the game at launch, the landscape could change a lot.

I'd like to throw in the 7900 XTX. The RTX 3080 is a beast, but not for such a monstrous monitor!

When fps are not CPU-bottlenecked at all, such as during GPU benchmarks, the 4080 is around 50% faster than the 3080 and 25% faster than the 3090 Ti; treat those figures as approximate upper bounds for in-game fps improvements.

I would think so. The 2080S comes close in most titles; there's usually about a 10 fps difference.

This is my first completely new build in 7 years, upgrading from an i5 4460, a GTX 470 and 8 GB of DDR3. 32 GB Corsair Vengeance RGB Pro 3200 MHz.

I've been using a 3070 at 5120x1440/120 Hz for almost two years, and I will say: it's amazing for sure.

My 3080 on the extreme preset gets around 70-90 fps at 5120x1440.

Your resolution is well below 4K.

i7 5820K @ 4375 MHz, cache at 4125 MHz.

As to what components you'll need, well, that depends on the games you want to play. I wouldn't say the 3050 is even near being able to cope with that monitor.

My benchmark scores went up roughly 15-20% instantly.

The 2060 will definitely struggle with 5120x1440, maybe even be unplayable at max settings. But what you can do is change the resolution in Windows to 3840x1080 and increase the monitor's sharpness; that will greatly improve performance until you get a new GPU.

Just look at 4K benchmarks and add a few frames and you'll get a good enough estimate; you can use 4K benchmarks of GPUs to get a good enough estimate of performance at this resolution. An easy way to compare GPU load between different resolutions is to multiply the horizontal resolution by the vertical resolution to get the pixel count, as in the quick sketch below.
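Several of the comments here lean on that same arithmetic, so here is a minimal sketch of it (the resolution list and helper names are mine, purely for illustration):

```python
# Estimate relative GPU load by comparing total pixel counts.
# Pixel count is only a first-order proxy: as noted elsewhere in the
# thread, performance does not scale perfectly linearly with resolution.
UHD_PIXELS = 3840 * 2160  # 8,294,400 pixels

RESOLUTIONS = {
    "1920x1080 (16:9)":  (1920, 1080),
    "2560x1440 (16:9)":  (2560, 1440),
    "3440x1440 (21:9)":  (3440, 1440),
    "3840x1600 (24:10)": (3840, 1600),
    "5120x1440 (32:9)":  (5120, 1440),
    "3840x2160 (16:9)":  (3840, 2160),
}

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:5.2f} MP, {pixels / UHD_PIXELS:6.1%} of 4K")
# 5120x1440 comes out at 7.37 MP, i.e. about 88.9% of 4K.
```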
I sure hope AMD will not join this insane gouging party and instead forces NVIDIA to lower their prices, but considering they both raised prices just recently, I think AMD will happily oblige and price their comparable cards just 5-10% lower. PC gaming will slowly turn into a niche market.

Get ready for an insane graphics benchmark showdown in Cyberpunk 2077! In this video, we benchmark the game's latest Overdrive Path Tracing update.

Slightly above 100 FPS with 960 mV @ 2715 MHz and a +1000 memory OC.

Mainly play Tarkov, Tarkov Arena, CS2, and sprinkle in some BG3. Hoping when the game comes out I can run it at native res.

I have an RX 6950 XT and a 5800X3D and play on one of the earlier 49-inch models from Samsung (3840x1080, 144 Hz, VA panel), but it's starting to show its age and I'd like to get better visuals.

Maybe 13% higher fps than 4K.

So a 4K screen has 4x the number of pixels of 1920x1080, but actually tends to perform a little better than that ratio would suggest; it's not exactly proportional.

But even a 2080 Ti will struggle to hit 60+ FPS on max settings with HDR.

Open-world games in particular, which are already CPU-demanding, are likely the ones where you'll see the most difference in performance between a 32:9 ultrawide and a 16:9 4K monitor. You just cannot have enough GPU.

Maybe that power limit is keeping you from reaching a higher score.

59 fps average, though it was in the 60s for a good bit of the benchmark.

I have zero issues at 4K/120, and my mobo only goes up to 11th-gen CPUs, so it's not worth spending $400 on a 7700X bundle either.

The GPU never once reached 100% utilization (within the 60% cap), and in every situation this card seemed bored to hell; it doesn't even break a sweat. Screen: Dual QHD Samsung C49RG94SSP at 5120x1440. The settings: max everything out, native, no upscaler.

I'm currently running a Neo G9 with an RTX 4090 and a Ryzen 5950X.

Ryzen 9 5900X, RTX 3090 FE, MSI MEG ACE X570.

The realization is that both cards are four figures and not top of the line, which is horrible for the consumer.

5120x1440 = 7,372,800 total pixels; that is about 7.3 million pixels, as compared to 8.3 million for 4K.

When it's working, this is CPU-constrained to a single thread, from what it looks like in top, intel_gpu_top and nvidia_smi.

Depends what game you're playing, what the game settings are at, etc.

4090 energy efficiency is simply astounding, as long as you don't push it into the last percentiles.

Even a 3090 will struggle (I define "struggle" in terms of new games with your 144 Hz goal).

The ultrawide GPU performance benchmark includes the GTX 980 Ti, 980, 970, and 960 from Nvidia and the R9 390X, 380X, 290X, and 285 from AMD.

The 7680x2160 resolution is twice 4K, so you have 16.6 million pixels that want to be defined.

If you have a top-of-the-line CPU and GPU and 16 GB of RAM, you'll have worse FPS than someone with a last-gen CPU and GPU and 32+ GB of RAM, because the game is not optimized.

...because for the most part, performance below 4K is hard-capped by CPU bottlenecks.

Storage: WD 750 SE 500GB, WD 730 SE 1TB. GPU: EVGA RTX 3070 Ti. PSU: Corsair SF750. Case: Streacom DA2. My target is to play mainly strategy games (e.g. Anno 1800, EU4, Stellaris, some indie games) as well as Dota 2 and maybe a round of Assassin's Creed.

I DDU'ed and reinstalled the latest drivers for this testing.

For reference, the signal bandwidth: 5120x1440 @ 120 Hz 10-bit is 31.54 Gbps; 5120x1440 @ 144 Hz 8-bit is 31.85 Gbps; 5120x1440 @ 120 Hz 8-bit is 26.85 Gbps (OCed).
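As a back-of-envelope check on those bandwidth figures: bandwidth is roughly pixels per second times bits per pixel, plus blanking overhead. The flat 20% overhead factor below is my assumption (real CVT-RB timings vary), so the results only approximate the quoted numbers:

```python
# Rough display-bandwidth estimate for a given mode.
def display_gbps(w, h, hz, bits_per_channel, overhead=1.20):
    bits_per_pixel = 3 * bits_per_channel  # RGB, no chroma subsampling
    raw = w * h * hz * bits_per_pixel      # active-pixel bits per second
    return raw * overhead / 1e9            # add assumed blanking overhead

print(f"{display_gbps(5120, 1440, 120, 10):.1f} Gbps")  # ~31.9 vs 31.54 quoted
print(f"{display_gbps(5120, 1440, 144, 8):.1f} Gbps")   # ~30.6 vs 31.85 quoted
print(f"{display_gbps(5120, 1440, 240, 8):.1f} Gbps")   # ~51.0 Gbps: beyond
# DP 1.4's ~25.9 Gbps payload, which is why 240 Hz G9 panels rely on DSC.
```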
For instance, I bought my 1080 Ti for $400 and sold it for $500 because of how crazy the markets got, then bought the 3080 at retail for $800 because my number came up with EVGA.

No GPU is going to meet those specs right now.

Took a few screen recordings to illustrate the issues.

Heaven is old nowadays, especially for 4000-series GPUs.

Everything is running at 5120x1440 with FPS limiters disabled where possible.

Superposition: 720p to test how much bottleneck you have from the CPU and RAM; 1080p Extreme / 4K for raw GPU performance.

Anno 1800 was a game that usually chugged mid-game at 30-50 fps. Tried it today and, holy shit, 100 fps before even overclocking the CPU/RAM.

RTX 4090, 7800X3D, 64 GB RAM @ 6000 CL30.

My thinking was that if memory bandwidth is the reason the Radeon doesn't win at 4K, there might be a break-even point where bandwidth becomes the problem; I might get lucky and 5120x1440 could be below that point, making the 6800 XT the better choice there. What I read was that at 4K, games are going to be entirely dependent on the GPU. Thanks!

4090 under average performance, underperforming, lower fps than a 3090; or maybe it's a settings issue.

For a free looping stability test for modern RT GPUs, the Bright Memory Infinite ray-tracing benchmark is the best IMO, since it hits the core, RT, and tensor units all at once.

What GPU to drive 5120x1440? Preferably for a high-refresh, FreeSync variant of the 49-inch ultrawide.

1440p 32:9 has almost the same amount of pixels as 4K, albeit marginally less.

Specs for reference: i9-10900KF.

Running the latest BIOS on my motherboard (Asus B450-I Strix).

I wanted to provide some benchmarks of my experience upgrading to a 5800X3D from the 2700X, and in particular cover a few games that aren't commonly tested. TL;DR analysis: the upgrade enables easy achievement of a higher memory clock (I went from 3333 MHz to 3600 MHz stable using standard DOCP profiles), and across the 5 games I saw an average FPS improvement.

Getting 60 FPS to feel like 80-100 in very hard-to-run games where latency isn't super important will make the 7800 XT great even for the most demanding games, and in that case it's pretty good to have a fast CPU that can maximize performance in GPU-bound scenes, meaning the 7700X makes more sense in this regard.

The 3070 is excellent at 1440p/144 Hz.

Compared to 4K with its 8.29 million pixels, this resolution is just a bit lower, at 7.37 million.

Current prices for comparison (CPU / GPU / SSD):
Intel Core i5-13600K $254 / Nvidia RTX 4060 $289 / Crucial MX500 250GB $39
Intel Core i5-12400F $133 / Nvidia RTX 4060 Ti $385 / Samsung 850 Evo 120GB $80
Intel Core i5-12600K $175 / Nvidia RTX 4070 $550 / Samsung 860 Evo 250GB $52

I'm willing to run benchmarks for any games that I have.
Somewhat. I've sold every card I've ever had when I've upgraded to the next one.

The busiest of scenes drop into the mid-50s in game, and in the lighter ones you'll see 75-90 fps out in the desert.

6800 XT + 5800X is the perfect 1440p high-refresh-rate combo and will hold up for years at that res. If you upgrade though, the 1800X will be holding you back massively, so I'd suggest getting a 5000-series CPU along with the GPU.

He commissioned me to buy a new PC with a graphics card that can handle 5120 x 1440 at 240 Hz. He noticed that his actual PC isn't capable of outputting 5120 x 1440 at 240 Hz, which is the max resolution and frequency that monitor can deliver.

Even with the RTX 3090, some games could barely pull 60 fps.

An 8700K + 4090 is perfectly fine for 4K/60; even at 4K/120 it's not worth the upgrade cost for an unnoticeable boost in 1% fps lows.

If you find a good deal on a 3060 Ti or 3070, get it.

For scale: 16:9 1080p is 2.1 MP (megapixels), 16:9 1440p is 3.7 MP, 3440x1440 is 4.95 MP, and 4K is 8.3 MP; so at 3440x1440 you can expect around a 20% decrease in FPS versus 2560x1440, although it isn't quite linear.

All benchmarks are done on the highest available settings.

Hi everyone, I have a 32:9 ultrawide screen, 5120 x 1440 (G9 Neo), and I struggle to find any benchmark/review for that resolution (since no one tests ultrawide). I've found only one so far here in this sub.

Wait for the next gen to drive it further, or pray the rumors of biggest Navi are true and it is faster than the 3090, and get one of those in about a month.

For the money I'd rather have the RTX 4090 at $1,600.

Ultrawide 1440p is significantly harder to run than 1440p and gets pretty close to 4K performance. It really depends on the settings you want to play the game at.

32 GB DDR4-2666 15-15-15-35-1T.

I've only played CoD and Battlefield for the last few years, and they don't seem too needy on the GPU. Speed helps, but cost-per-performance is minimal over a better GPU.

I am not picky about playing games on ultra graphics, but I know that Warzone runs like shit in general (low settings or high settings don't seem to matter much), and I can't find any specific benchmarks for this.

In WZ 2.0 at max settings with no DLSS I am seeing 120-160 fps at 1440p.

It runs insanely well. CPU: Ryzen 9 5900. Cooler: EVGA CLC280. Motherboard: Gigabyte B550i Pro AX. RAM: Kingston HyperX 32 GB 3200 MHz.

NOTE: Video uploaded at 1080p/30 FPS; this is not representative of what each game looks and runs like.

The 4090 is hardly any performance increase at sub-4K resolutions, at least not worth the cost.

If you're going to keep "medium settings" and only change resolution, running 5120x1440 is like 15-20% more pixels than 3840x1600, about 1.2 million more pixels, and that is quite a lot.

4K performance is what you should look at. Just add a few fps on top of the 4K benchmark numbers. Just play a game you like and compare their results with your own.

But that is no longer the case with a graphics card as powerful as the 4090. The 5080/5090 probably can.

I don't have an RTX 2060 to hand, but a broadly as-powerful GTX 1080 couldn't get anywhere near playable, averaging just 20 fps with utterly bottomed-out quality settings and FSR 2 on Ultra Performance.

Also worth considering resale value. If the 4080 is overpriced at launch (it is), it will have much worse resale value.

PNY XLR8 4090, Ryzen 7900X (PBO2 tuned), 32 GB G.Skill Trident RAM XMP @ 6000 MHz, ASRock X670E Steel Legend, Corsair RMx 1000 Gold, 1440p/165 Hz monitor.

4000 series for Nvidia and 7000 series for AMD.

3080 or 3090, depending on the game.

No reason to upgrade at 4K. Gigabyte RTX 4090 OC, Ryzen 5900X, 3600 MHz 32 GB, 2 TB M.2.

To put refresh rates in pixel terms: a 4K/60 TV pushes 497.6M pixels per second, and doubling to 120 Hz will push that to almost a billion; a 1440p ultrawide pushes 713.3M pixels per second at 144 Hz.
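That pixels-per-second framing is easy to verify; this throwaway helper reproduces the figures (the 240 Hz line is my own extrapolation):

```python
# Pixel throughput the GPU must shade at a given resolution and refresh rate.
def pixels_per_second(w, h, hz):
    return w * h * hz

print(f"{pixels_per_second(3840, 2160,  60) / 1e6:7.1f}M px/s")  # 497.7M, 4K/60 TV
print(f"{pixels_per_second(3840, 2160, 120) / 1e6:7.1f}M px/s")  # 995.3M, almost 1B
print(f"{pixels_per_second(3440, 1440, 144) / 1e6:7.1f}M px/s")  # 713.3M, UW 1440p/144
print(f"{pixels_per_second(5120, 1440, 240) / 1e6:7.1f}M px/s")  # 1769.5M, 240 Hz G9
```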
The 7900 XTX renders stuff well, but not as well, and that $200 difference can save a lot of time for people who are going through renders daily.

Better to wait for reviews to see just how they compare to each other. This is a good period for consumers; we get more choices in a short time frame.

I bought a 4070 Ti for my ultrawide screen and I love it.

C49RG90SSUX (5120 x 1440, henceforth called "SUX") vs C49HG90DMUX (3840 x 1080, henceforth called "MUX"): at current prices where I live, the SUX costs exactly 50% more than the MUX, and I'm wondering if it's worth it.

5900X, and the MSI 6800 XT has overclocked Adrenalin settings.

Time Spy only utilizing 60% of the GPU.

The only thing that really matters is the knowledge of the games tested, what settings, and any methodology behind how they calculate an average.

ROG Strix 3090 OC.

If you think about VR (and you should if you're bothering with a 4090): I have an HP Reverb G2, 2160x2160 per eye.

It almost seems like DLSS Auto is targeting 60 FPS and will toggle between Performance and Quality as needed.

The 4080 has an MSRP of $1,200 USD. And Super. And Ti Super. The 4090 is the choice.

(CPU: 7900X undervolted, RAM: 6000 MHz.) 130 fps average on a 4080 at 1080p.

It depends on what frame rate, but I wouldn't expect more than 60 frames on a 3060 Ti with that monitor, even with DLSS.

You'll draw far less power to do the same job an older card would do.

It even reaches this bottleneck at 4K in some instances.

I'd say you'd need at least an RX 6800 or RTX 3070 if playing at native display resolution, but less if you intend to play games at low/medium settings. No, you will probably need a 3070 or 3070 Ti.

Howdy friends! Because I got the chance to use an RTX 3090, I thought I'd make some benchmarks with my Samsung G9 5120x1440, 240 Hz monitor. My setup is a Ryzen 3900X, a little overclocked, with IF at 1900 MHz and DDR4-3800 CL16 2x8 GB.

I did some more math to make the differences in pixel count clearer: 2560 x 1440 = 3,686,400 pixels (44.4% of 4K); 3440 x 1440 = 4,953,600 pixels; 5120 x 1440 = 7,372,800 pixels.

3440x1440 isn't the only ultrawide resolution out there. 3440x1440 @ 100 Hz is roughly the same pixel amount as 4K @ 60 Hz, so you can roughly get an idea from a combination of 4K and 1440p benchmarks.

Yooo, 10700K and 4090, same, lol.

Since PC gamers rarely buy AMD GPUs, Nvidia only have themselves to compete with.
Look at 4K benchmark results for the games you're interested in and decide if the 2070S frame rates are to your liking. Generally, look up written benchmarks or benchmark videos if you want to know how your card performs.

$200 more doesn't seem bad, but I'm hesitating to spend $800 on a GPU.

Anyone playing at 5120x1440 with an RTX 3070? I bought a Samsung G9 monitor and can't decide if I should go for a 3070 or a 3080.

W10 Pro. 980 Ti @ 1455/7500.

4K, highest settings / RT Psycho, DLSS Quality, FG on. Use DLSS Auto (or DLSS Performance, same thing at 4K), enable Frame Generation, and at the bottom, enable Path Tracing. Using this, I only consume about 330 W on average in Cyberpunk.

I have one ordered and am going to pair it with the Neo G9.

Superposition is the best one if you want a reliable score to compare your overclock and/or undervolt. Firestrike to test high-fps loads where the CPU is more involved.

For that resolution you DO NEED at least 32 GB of RAM, so your GPU won't start swapping memory to your SSD in AAA games, and to avoid any stutter.

Just launched AC for the first time (still waiting for the CSL DD). So far, on (Ultra) default settings, the benchmark averages 133-144 fps. 53°C at the top, with fans at their slowest pace.

Server performance has ZERO impact on FPS/system performance.

The GPU was overclocked to the maximum, meaning almost 2500 MHz core and 2134 MHz VRAM actual clock and a +15% power target.

You'd probably take a similar performance hit (58-66 fps), but it's hard to say without practical testing.

I'm getting 60 fps (on a 60 fps NEC imaging monitor) playing MS FS2020 with most settings on Ultra while flying just above rooftop level in NYC.

However, the benchmark scores I get, compared to lots of YouTube reviewers who tested the RTX 4090 with a 4K monitor, are about 5 to 10% less. I'm not sure why, but it seems the G9's resolution is more demanding than 4K in my experience.

If you're going for high or max settings on anything that came out in the last year or two, though, then you're just flat out of luck.

My intended solution is to install a dummy dongle capable of reporting the client display's resolution of 5120x1440.

The 3090 was in the 90s and the 4090 in the 60s.

Your resolution is 3440 times 1440 = 4.95 million pixels; 3360x1440 is 4.8 million, and 3840x1080 is 4.1 million.

Hi all, gonna give you a rundown on my setup first. I have been having some weird issues with my new 4090.

Upgraded my delidded, overclocked i7-6700K @ 4.7 GHz with 4x8 GB DDR4-2400 and an RTX 3080 to a Ryzen 5 5800X with a monoblock and custom water loop.

I really hate the "4090 is a better value" argument.

Monitor is 1440p @ 170 Hz.

Division 2 benchmark, 5120x1440, DX12, everything Ultra/High, fog on Low, AA set to Medium.
Trying to decide on a GPU for the next few years. I probably won't play any AAA first-person games.

3840x1600 is ultrawide, but takes less GPU horsepower than 4K because it only has approximately 75% of the pixels of a 4K monitor.

So it's much closer to 4K than to a "standard 1440p resolution".

I scored 63 fps.

The price difference here is quite high, hence the hesitation.

Not everything has to be open source; this is a tool that automates the startup of benchmarks. If it does something nefarious, you can see it in the benchmarks.

An RTX 4090 from the same AIB costs 2,250 USD, and that is one of the cheapest cards available. We all know that the 3000 series delivers the most uplift at 4K, with 1440p less and 1080p being the smallest.

4K still needs top-of-the-line cards; probably only the latest Nvidia and AMD cards will do.

GPU question for a 49" 5120x1440.

The 3070 barely does the job for an ultrawide monitor; even in ACC you can't set the high-performance display settings.

5120x1440 display GPU recommendation.

$400 for potentially $400+ of additional resale value outright. $400 more expensive is $400 more expensive.

Both my 3090 and 4090 do not max out my GPU usage. FPS is from 135-170.

Unless you're striving for greater than 120 FPS or play games that are specifically reliant on the CPU, it's not gonna be worth the upgrade cost in 9 out of 10 cases.

The 7800 XT is around a 3070 Ti.

Civilization VI benchmarks at 1080p, 1440p, 5K and 10K: I tested 1440p with 4xSSAA (= 5K), then was surprised by the results, so I tested again.

4K is 3840 times 2160 = 8.3 million pixels.

IF you own an RTX 4090.

Even a 3060 or 3060 Ti would struggle.

A 6950 XT on Newegg is currently at $669; that'll get you to 60 fps gaming at your resolution.

So far, it is the most demanding online game I have ever played.

RX 6800 XT 5120x1440 benchmarks.

Expecting the 4080 to do fine. Fact is, no GPU can get anywhere near 4K/165+ in super GPU-heavy path-traced games anyway.

If I put Task Manager up on my other screen while playing BF2042, I'll see my 5900X running at 50-55% CPU quite often. And this is at 3440x1440 resolution, past the point where the conventional wisdom is that "CPU doesn't matter, it's all GPU at this point." This 5900X is pushed very aggressively with PBO2/Curve Optimizer (hits 23300-23500).

So 5120x1440 is significantly closer to 4K than normal 1440p is, by a wide margin. Don't compare cards using 1440p or 1080p benchmarks.

5120x1440 is roughly 0.9x 4K, about 11% fewer pixels than 3840x2160. This means you can look at 4K benchmarks and add roughly 10% to the fps, and that is what you can reasonably expect at 5120x1440; the sketch below mechanizes that rule of thumb.
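The sketch scales a 4K benchmark result by the pixel ratio; treat the output as a ceiling, since real scaling is sub-linear and game-dependent:

```python
# Ballpark a 5120x1440 result from a 4K (3840x2160) benchmark number.
UHD_PIXELS = 3840 * 2160  # 8,294,400
G9_PIXELS = 5120 * 1440   # 7,372,800

def estimate_g9_fps(fps_at_4k):
    # Pure pixel-ratio scaling; real gains are usually a bit smaller.
    return fps_at_4k * UHD_PIXELS / G9_PIXELS

print(f"{G9_PIXELS / UHD_PIXELS:.1%} of 4K")       # 88.9%, i.e. ~11% fewer pixels
print(f"{estimate_g9_fps(60):.0f} fps estimated")  # 60 fps at 4K -> ~68 fps
```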
And if you've never seen Ultra Performance upscaling applied at 1080p, rest assured that it does little for Remedy's beloved image quality.

Fewer pixels to push, and the OP uses custom settings with MSAA cranked to 8x as well. I tested without MSAA to try to figure out how badly this game is CPU-bottlenecked.

New GPUs are probably at least 4-6 months out, and if the next-gen 3080 is sufficiently faster than the 2080 Ti, then Nvidia might delay the 3080 Ti another 9-12 months, like they did before.

I have a Samsung Odyssey G9 monitor (5120x1440) running with the following specs, and I'm looking to upgrade my GPU to achieve a higher frame rate in newer games: Ryzen 7 2700X CPU, Nvidia GeForce GTX 1660 Super, 32 GB DDR4-2400 RAM, B450 Gaming Pro Carbon Max WiFi motherboard.

Cyberpunk 2077 performance findings on an RTX 3080.

I suppose I could pay the one-time cost, but then having to also buy a beefy GPU because of that resolution would probably be too much. Can't help thinking I might be better off using that money to upgrade in a couple of years to another $600 card.

I'm sure I'm CPU-bottlenecked at this point. Oof.

Looking to future-proof, or at least stay in the upper echelon of performance for a while.

I ordered some new RAM and wanted to get some "before" data, so I fired up 3DMark. In the past I've gotten around 18,000-ish, so I tried Time Spy Extreme and got my usual score.

Most RTX 4080s here are up to 1,800 USD. It looks like NVIDIA wants to increase their ASP and margins twice.

RTX 3090 5120x1440 benchmarks.

Unigine Heaven Benchmark 4.0 results: FPS: 57.2, Score: 1441, Min FPS: 27.3, Max FPS: 126. Platform: Windows NT 6.2 Build 9200 64-bit. CPU model: Intel Core i9-9900K x8. GPU model: Nvidia GeForce RTX 3080 Ti 30.0.15.1259 (4059 MB) x1. Render: Direct3D11. Mode: 5120x1440 fullscreen. Preset: Custom. Quality: Ultra. Tessellation: Extreme.

5120x1440 fps jump.

I currently have a 3070 and it is playable on the Neo. Not really that high.

I've basically only spent $100-$300 on each new graphics card for the past decade.

Nevermind the 3050.

I hesitate between an RTX 4080 (1350€) and the 4090 (1999€).

AMD Ryzen 5800X3D vs 5600X benchmark, ultrawide G9. So the CPU didn't really make much of a difference.

Depends on the game, but you're close to 4K resolution, so it would likely struggle with newer games.

The 3090 is only 10-15% better on average than the 3080; your monitor is ahead of its time.

I've had a CRG9 for some 3-odd years, and the 3070 ran it well, as the monitor capped at 100 Hz when using 10-bit colour.

If you're intending to play at Epic settings, you will need a beefy graphics card. For example, 2560x1440 at Epic settings gives me lows around 70 fps with an RX 6800 XT.

A friend of mine got his hands on an RX 6800 XT Sapphire Nitro+ and didn't have enough time, so I got to test the card :) My setup is an AMD Ryzen 5900X on an Asus Strix X570-E with DDR4-3800 CL16 and IF at 1900 MHz.

Bought the 12900K bundle at Micro Center and snagged a 4000D Airflow on sale for 40 bucks.

I'm running 5120x1440 with the Ray Tracing Ultra preset (which includes DLSS Auto) on a 3090 + 9900K and getting close to 60 fps on average, but I want to start playing with the DLSS settings.
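For anyone experimenting with those DLSS settings, the internal render resolution is what actually drives GPU load. The per-axis scale factors below are the commonly cited ones (Quality 0.667, Balanced 0.58, Performance 0.50, Ultra Performance 0.333); treat them as approximate, not an official spec:

```python
# Approximate internal render resolution per DLSS mode at 5120x1440.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

for mode, s in DLSS_SCALE.items():
    w, h = round(5120 * s), round(1440 * s)
    print(f"{mode:17s}: {w}x{h} ({w * h / 1e6:.1f} MP)")
# Quality lands near 3413x960 (~3.3 MP), less than half the native 7.37 MP.
```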
Someone realized his friend's 3080 was performing far better than his, and they eventually narrowed it down to DSC (it is simple to test: if you're using DSC, just lower your resolution to a non-DSC mode).

I have a 5800X3D and a 6800 XT right now with a 2K monitor, and I'm switching to a Samsung G9. Which GPU would allow me to play with respectable fps at 5120x1440? The cheapest possible solution would be preferred, as I don't want to buy a 4090 or 7900 XTX unless there is no other way.

Here's how I think dummy dongles work: the dummy dongle plugs into one of the GPU outputs, and onboard it has some sort of contraption with a digital output to "handshake" with the GPU and exchange resolution capability.

I have a 3080 Ti and don't intend on getting a 40-series or 7000-series card anytime soon, at least until Nvidia sorts their issues out and prices come down, or AMD drops benchmarks and their drivers have been out in the wild for 6 months to see what happens.

Cyberpunk 2077 is very GPU-intensive, so I don't think either will let you cap 120 Hz at that resolution.

Hey guys, I've been having lower-than-expected performance on my system with an RTX 3080 (EVGA XC3) and a Ryzen 3600.

With a 5950X you'll be fine; the 4090 is just way too powerful for most CPUs at the moment at 1440p.

I have the 5800X3D, and in Cyberpunk at 1440p RT Psycho in the crowded city (crowd density = High) my GPU usage is brought down to 77%. I disable DLSS and use DSR to upscale to 1.5x, closer to 4K, trying to stress the GPU a little more, but it only brings it up to ~84% with a few more fps gained.
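A note on that DSR step: as I understand it, Nvidia's DSR factors multiply the total pixel count, so each axis scales by the square root of the factor. A small sketch (illustrative only):

```python
# DSR render resolution: the factor is a total-pixel multiplier, so each
# axis is scaled by sqrt(factor). Values mirror the comment above.
import math

def dsr_resolution(w, h, factor):
    s = math.sqrt(factor)
    return round(w * s), round(h * s)

w, h = dsr_resolution(2560, 1440, 1.5)   # DSR 1.50x on 2560x1440
print(f"{w}x{h}, {w * h / 1e6:.1f} MP")  # 3135x1764, ~5.5 MP (4K is ~8.3 MP)
```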