AMD Navi

Discussion in 'Technology' started by bfun, Jul 5, 2019.

  1. It's more of a high-resolution issue. I've seen Rainbow Six Siege exceed the 8 GB of VRAM I have when I turn up certain settings like AA, and R6S isn't exactly the modern-day Crysis, technically or graphically. I'd assume future games with massive texture files could use more. I'm expecting the baseline for texture sizes to increase considerably given the I/O hardware in the new consoles.
     
  2. Looks like AMD is finally back in the high-end game.

    6800 - $579 > 3070 - $499
    6800 XT - $649 = 3080 - $699
    6900 XT - $999 = 3090 - $1499



    These are AMD's own charts and the results are probably cherry picked. No independent benchmarks yet.

     
  3. Eh, not cheap enough that I see large numbers of people switching back to AMD. Put the 6800 at a sub $400 price point and the 6800XT at $500 and then you might see a big change.
     
  4. I wasn't blown away either, but it looks like they're still offering the best performance per dollar at these prices. The 6800 is set to compete with the 3070 Ti, which doesn't exist yet. The 6900 XT might make a killing as a 4K card that's $500 cheaper than the 3090. The 6800 XT is a mixed bag: it's slightly cheaper and maybe a little faster than the 3080, so whichever card is more available might be the winner.
     
  5. It's probably embellished a bit given the numbers are coming from AMD, but on the other hand, the real, independently verified numbers at launch will probably improve with driver updates. Most of the ray tracing tests I've seen are skewed in Nvidia's favor, which isn't shocking given Nvidia has had RTX cards for a while. I'm expecting AMD to continue making improvements to be more competitive over time, though. There was early skepticism that the Ryzen chips were more for workstation purposes and that the design wasn't suited for gaming. AMD definitely was not content with that and continued to improve aggressively with each generation. I'm also curious to see if the new AMD-powered consoles using similar technologies will give them some advantages in game development. Any developer making multiplatform games is going to get very familiar and comfortable with RDNA2 tech.
     
  6. Looks like the 6800 XT beats the 3080 a little at 1080p, matches it at 1440p, and loses a little at 4K. The ray tracing performance is pretty bad, on par with a last-gen 2080 Ti. Also, the 6800 XT can gain or lose 1% to 5% performance depending on Smart Access Memory (SAM) support. All things considered, I think the 3080 might be the better deal. The 6800 beats the 3070 by a good margin, but it also costs $80 more, so no clear winner there. Both brands are completely sold out and unavailable, so I guess they can charge whatever they want right now.
     
  7. No clear winner is ideal for us. Supply is currently limited but eventually competition will be good for consumers.
     
  8. I'm wondering if the scalpers are winning. If people are actually buying from scalpers, the products will never hit the shelves. A quick look at eBay shows roughly 1,500 Nvidia 3000-series cards, 400 AMD 6800s, 500 Ryzen 5000s, 11,000 PS5s, and 9,000 Xboxes.
     
  9. Near launch with massive stock shortages they will. Not long term.
     
  10. #50 AKS, Jun 4, 2021
    Last edited: Jun 5, 2021



    AMD's FidelityFX Super Resolution (FSR) has been officially revealed, although the limited footage, particularly with YouTube compression and artifacts, needs a lot more follow-up and independent, first-hand evaluation. It will be available for recent AMD video cards and Ryzen tech, works with a decent number of Nvidia cards, and supports the PS5 and new Xbox consoles. It's boasting some pretty substantial increases in frame rates, but my guess is the image quality isn't going to be quite where Nvidia's DLSS 2.0 is currently. My expectation for image quality has been somewhere between DLSS 1.0 and 2.0, but I suppose we'll see soon because it launches this month, on June 22nd. They used Godfall as their demo example, which is sadly by default probably the best thing about the game now, because no one seemed to like anything else about it.

    I think there is quite a lot of potential here that might not be immediately apparent. The fact that practically every recent video card, the PS5, and the Xbox Series S and X can all use it is going to speed adoption and developer familiarity significantly. Multiplatform games can use it on every platform and don't need as much customization for each version. It also doesn't require the training at extremely high resolution on supercomputers used by DLSS. I'm expecting the hivemind/braindead reaction to be that it doesn't look as good as DLSS 2.0 and is thus "useless," but I doubt developers are going to see it that way. I think it just needs to look pretty good, provide a decent amount of computational savings, and be reasonably easy to implement to win many developers over. From a practicality standpoint, I can see FSR being a lot more appealing for multiplatform development, particularly for smaller teams without the resources for the AI training used with DLSS. It could step in as a replacement for checkerboard rendering if the performance and results are good.
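    To illustrate why no AI training is needed: a purely spatial upscaler is just a fixed interpolation filter followed by a sharpening pass over a single frame. This toy Python sketch is NOT AMD's actual EASU/RCAS code, just the general shape of the idea (bilinear upscale, then a simple sharpen), with made-up parameter values:

    ```python
    # Toy illustration of a spatial upscaler (not AMD's actual FSR algorithm):
    # upscale one low-res frame with a fixed filter, then sharpen.
    # No per-game training data is involved at any point.

    def bilinear_upscale(img, scale):
        """Upscale a 2D grayscale image (list of lists) by an integer factor."""
        h, w = len(img), len(img[0])
        out_h, out_w = h * scale, w * scale
        out = [[0.0] * out_w for _ in range(out_h)]
        for y in range(out_h):
            for x in range(out_w):
                # Map the output pixel back into source coordinates.
                sy = min(y / scale, h - 1)
                sx = min(x / scale, w - 1)
                y0, x0 = int(sy), int(sx)
                y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
                fy, fx = sy - y0, sx - x0
                # Blend the four nearest source pixels.
                top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
                bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
                out[y][x] = top * (1 - fy) + bot * fy
        return out

    def sharpen(img, amount=0.5):
        """Very rough stand-in for a contrast-adaptive sharpening pass."""
        h, w = len(img), len(img[0])
        out = [row[:] for row in img]
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                # Push each interior pixel away from its neighborhood average.
                avg = (img[y-1][x] + img[y+1][x] + img[y][x-1] + img[y][x+1]) / 4
                out[y][x] = img[y][x] + amount * (img[y][x] - avg)
        return out

    low_res = [[0.0, 1.0], [1.0, 0.0]]     # a tiny 2x2 "frame"
    high_res = sharpen(bilinear_upscale(low_res, 4))
    print(len(high_res), len(high_res[0]))  # 8 8
    ```

    The point of the sketch is that everything here is a fixed function of the current frame, which is why it can ship on any GPU or console, whereas DLSS leans on a trained network and dedicated hardware.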
     
  11. FSR and DLSS are pretty amazing concepts but it's not clear yet what kind of support they will get from game developers. If it ever really catches on, I'm going to pop for a 32" 4k monitor. Regardless, I'm going to be really interested in seeing what a 5700XT can do with FSR.
     
  12. I've been using DLSS 2.0 for a while, and it's great when it's available, but it requires some fairly significant resources to implement, namely AI training with absurdly high-resolution reference images (16K in some cases, according to Nvidia's own description of the process), which is probably not going to be easily accessible for every game, particularly from smaller developers.

    This is the 2021 list of DLSS 2.0 supported games: https://www.ozarc.games/best-dlss-games/

    Not all are the biggest AAA blockbusters, but most games are probably not going to land on this list.

    FSR has the potential for a much broader range of adoption that includes the consoles and most modern video cards, without the AI training requirements. The next Nintendo device is rumored to use some form of DLSS, so perhaps they've got plans to make it more accessible to a broad range of games; not much is known about the new machine yet. I'm eager to see what developers think of these technologies, particularly on consoles, where computational resources are always going to be constrained to some extent and thus the need for these efficiency techniques will be significant.
     
  13. It almost seems like AMD is doing the same thing they did with FreeSync. FreeSync isn't better than G-Sync, but it runs on both AMD and Nvidia hardware, so it's pretty widely accepted. I'm not sure if FSR will be as good as DLSS, but if it works with team red, blue, or green, developers might be more inclined to use it.
     
  14. Tim from Hardware Unboxed made a lengthy but superb overview video of AMD's FSR tech, the best of any I've seen on the topic.



    I've also seen comments from developers who claim they got FSR running in their game in a day, or even a matter of hours. Most of the developer feedback I've seen has been pretty positive, which I think reflects one of AMD's main objectives in getting this working on every platform. As I expected, smaller developers are impressed with its simplicity and ease of implementation, which I think could be particularly handy for the consoles and keep the 60 to 120 Hz performance-mode trend going strong even for smaller teams.