The 4070 is basically a more efficient 3080 with extra features for $600. It's a decent value in the 4000 lineup, but not amazing: about 22% slower than the 4070 Ti, and 25% cheaper. The 6950 XT at $610 is about 5% to 20% faster in games and has equal RT performance, but no DLSS 3. I ended up getting a 4070 Ti about a month ago; I might have gotten this instead if it had been available. The 4000 series has about the same performance per tier as the 3000. The 3000 series could support DLSS 3, but it won't, because then there'd be almost no reason to get a 4000. The only reason Nvidia needed to switch away from the Samsung node process was to improve power efficiency enough to make a 4090 and 4090 Ti.
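The value argument can be made concrete with some quick math. A rough sketch below, using only the ballpark prices and percentages quoted in this thread (the 4070 Ti at an assumed $800 MSRP, the 4070 at ~22% slower, the 6950 XT at a rough midpoint of "5% to 20% faster than the 4070"), not actual benchmark data:

```python
# Rough perf-per-dollar comparison using the ballpark numbers from this thread.
# "perf" is relative performance with the 4070 Ti normalized to 1.00.
cards = {
    "4070 Ti": {"price": 800, "perf": 1.00},  # assumed $800 MSRP
    "4070":    {"price": 600, "perf": 0.78},  # ~22% slower per the post above
    "6950 XT": {"price": 610, "perf": 0.88},  # midpoint of "5-20% faster" than the 4070
}

for name, c in cards.items():
    # frames-per-dollar proxy: relative performance per $100 spent
    value = c["perf"] / c["price"] * 100
    print(f"{name}: {value:.3f} perf per $100")
```

On these assumed numbers the 6950 XT comes out ahead on raw perf per dollar, with the 4070 second and the 4070 Ti last, which matches the "decent value, not amazing" read above.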
I guess I forgot to post that I bought the Sapphire 7900 XTX Nitro+ when it started becoming available at more sensible prices on Newegg. I'm sure a motherboard and associated system RAM upgrade would get more out of it, but it has so much brute strength that I'm not too distressed about it right now. I plan to hold out until the next generation of CPUs before I do the major upgrade to AM5, unless something major breaks and necessitates doing that sooner. The big premium over the standard reference model is worth it in my view: despite being noticeably more powerful than the 6900 XT it replaced, it's both quieter and MUCH cooler.
The 4060 Ti reviews are pretty crazy, ranging from "it's perfectly okay" (Linus) to "do not buy" (Gamers Nexus) to "laughably bad, DOA" (Hardware Unboxed). Performance-wise it sits between the 3060 Ti and 3070. It's even slower than the 3060 Ti when the memory bus matters. The last Nvidia card with a 128-bit memory bus was the 650 Ti back in 2012. Nvidia plans to release a 16GB version of the card in July, but I'm wondering if that extra memory will even help. All things considered, I guess it's not bad for $400. It's just disappointing that there is no generational improvement this gen. Nvidia is giving us better power efficiency and DLSS 3, but reducing memory and memory bus width, and providing very little improvement in frames per dollar.
8GB VRAM is bad enough, but a 128-bit memory bus is abysmal. There are older games where the narrow bus is going to shit all over performance. You’re paying the price of a 70 card ($400) and getting something that’s built like a 50 card (128-bit memory bus). EVGA leaving the GPU market makes more sense now. The 60 type cards are the biggest sellers, and they suck this time.
One thing that has been nice with my last two cards being AMD cards is that they tend to throw a massive amount of VRAM on them. I frequently worried about that with Nvidia cards and wondered how long it would be before I needed more memory, but I rarely even think about it now with a monstrous 24 GB. It's overkill, but on the other hand, I never have to worry about it.
I hope AMD continues the trend, but they went with 8GB on the 7600. Granted, the MSRP is only $269 and not $400 like the 4060 Ti's, but the 10GB 6700 is faster and about the same price. What the hell, AMD.
Something's gotta give. If they don't deliver 16GB/256-bit for $400 or less, a lot of people will just switch to consoles for their AAA gaming. It's not even like RAM is expensive. Micron and Samsung are sitting on mountains of the stuff because they overproduced it. They could put more RAM on the cards without substantially raising prices. They just choose not to.
So many horrible AAA console ports are killing PC gaming. Part of the problem is that consoles have more VRAM available than most PCs: the PS5 and Xbox Series X each have 16GB of unified memory, and the Series S has 8GB plus a slower 2GB pool. They can say that 8GB should be enough for 1080p, but that doesn't take into account shitty ports designed around consoles with 10GB+ available to graphics.
Given the choice between a $1000+ video card that actually has enough VRAM to compete with the consoles and a $400-$500 console, almost everyone will pick the console. Add to that the console versions of games actually being well optimized while the PC versions this gen have been dumpster fires, and getting the console is a no-brainer. PC is a mess right now. It might come back eventually, but it's in a real slump.
DLSS was recently upgraded to 3.5 with more features, including better ray tracing (ray reconstruction), and FSR is being upgraded to 3.0 with frame generation and other features, including a claim that it can generate extra frames for every DX11 and DX12 game. I can't help but wonder what the latency would be with that feature, but it's nonetheless interesting that it encompasses such a massive number of games. AMD indicated Anti-Lag will be upgraded to Anti-Lag+, but didn't provide many specifics about how much of an improvement to expect, which will be important for this new frame-generation stuff. AMD also said the lower-end cards they just revealed are the last major introduction of the RDNA3 lineup.
I've been pretty impressed with DLSS 3. I've seen occasional weird artifacts, but they haven't been too annoying; if 3.5 fixes that, even better. I haven't been as impressed with DLSS 2, but I suppose there are times, like in competitive FPS games, where it might be the best choice.