AMD reveals RDNA4 architecture, Radeon RX 9070 GPUs, and Ryzen 9000 X3D CPUs

Scorpus

What just happened? The big surprise out of CES 2025 is that AMD hasn't fully launched new RDNA4 GPUs at the show, instead deciding to merely "preview" the new architecture and a couple of graphics card models. As of writing, AMD has officially announced the RDNA4 graphics architecture, Radeon RX 9070 series of GPUs, ML-powered FSR 4 upscaling, and new Ryzen 9000 X3D desktop CPUs. We've got all the information for you here, plus a few thoughts on what AMD has shown off so far.

AMD has announced new Radeon RX 9070 XT and Radeon RX 9070 graphics cards, but has done so without providing any specifications, performance figures, pricing, or a release date, outside of a vague "Q1 2025." This is pretty disappointing given that many were expecting full product details from AMD's RDNA4 launch – you'll just have to keep waiting to learn everything about these models.

It really seems like AMD is not willing to commit to providing concrete information ahead of Nvidia's GeForce 50 series announcement, which we're also expecting later today. This gives AMD more flexibility to respond to Nvidia's new generation, for better or worse. If they really wanted to disappoint gamers this generation and ensure RDNA4 is a flop, they would wait for RTX 50 series pricing and make their equivalent models slightly cheaper – a strategy that categorically didn't work with the previous generation and absolutely should not be attempted again. But who knows, we might be in for a repeat anyway.

A more positive perspective is that AMD might be waiting to see what Nvidia does to ensure their products are highly competitive and to better market their cards in comparison to Nvidia. If Nvidia decides to give us a decent price-to-performance improvement this generation, then AMD could respond with even more aggressive pricing later or make minor configuration adjustments. Realistically, the most important factor for AMD is how good RDNA4 looks in comparison to Nvidia's new products.

New Radeon RX 9070 GPUs

AMD is announcing the Radeon RX 9070 XT and RX 9070, which will be available in Q1 2025 from a range of partners, including the usual brands like Asus, Gigabyte, Sapphire, and PowerColor. MSI will not be producing RDNA4 products, having exited that market during the RDNA3 generation, though they haven't provided an official reason as to why. Our understanding is that there's some sort of relationship issue at the moment between AMD and MSI.

You can see a variety of partner models in the image at the top of this story, but AMD also seems to have snuck a new reference model into the picture, in both a three- and two-fan design. This reference design was previously revealed through an AMD ad a few weeks ago.

As for the RDNA4 architecture, AMD has made only some light disclosures at CES. RDNA4 will be built on the same family of TSMC nodes as RDNA3 – in this case a 4nm node, whereas RDNA3's compute die was built on N5. The expectation is that RDNA4 uses a different design to RDNA3, no longer separating the GPU cores and memory controller into separate dies on separate nodes, but there's been no confirmation of exactly what this looks like.

AMD is touting a wide range of architectural improvements with RDNA4. The headlining feature is an "optimized" compute unit; AMD told us in a briefing that this includes a "significant" change to the architecture, higher IPC, and increased frequency – which sounds like rather more than a mere optimization. This suggests that RDNA4 should see a notable performance improvement from a similar number of compute units to RDNA3 – RDNA3 models typically clocked between 2.1 and 2.3 GHz for their game clock, so pushing that up in combination with better IPC should lead to a healthy performance gain.

In addition, RDNA4 features a "massive" change to the AI aspects of the architecture, and while AMD kept this vague, they did say they've added a number of capabilities to the design. The ray tracing engine has been overhauled for improved performance, but we have no numbers to back this up yet. There's also a second-gen "Radiance Display Engine," suggesting further improvements to display connectivity relative to RDNA3.

The media encoder is "substantially" better according to AMD, and they actually provided some specifics to back up this claim. A footnote reveals this is comparing H.264 image quality using VMAF quality scores between RDNA3 and RDNA4 in three games (Borderlands 3, Far Cry 6, and Watch Dogs Legion) at 1080p and 4K. H.264 was a massive weak point for AMD's media encoder, so it's encouraging to see these claims targeting improvement in that area given H.264 is still widely used. Hopefully, there are no hardware bugs with the encoder this time, like outputting AV1 at 1082p instead of 1080p, a real issue with RDNA3.

AMD has not provided concrete performance figures for RDNA4 just yet, however they shared a teaser on this slide talking about Radeon branding. In the dark gray, we have existing generations like the GeForce 40 series and RX 7000 series, and a dashed line indicating where AMD believes the RTX 5000 series will fall – a bit faster than current models. Then on the right, we have the positioning for both the RX 9070 and RX 9060 series, and yes, there's a mention here of the 9060 with no further information.

Where the RX 9070 series matches up with current cards is around the level of the RTX 4070 Ti and RX 7900 XT for the highest-tier 9070 model, which has long been rumored as the performance ceiling for RDNA4.

AMD is suggesting the lower RX 9070 model could be in the range of the RX 7800 XT, and then we'll see the RX 9060 series offering performance at best between the 7700 XT and 7800 XT, and for the lowest-tier cards, around the 7600 XT.

AMD says this new naming scheme was chosen for two reasons. First, to match their direct competitor, essentially copying Nvidia to make it easier for people to figure out how AMD cards compare. Basically, they're saying the RX 9070 series is a competitor to the RTX 5070 series, though whether that means for both performance and pricing remains to be seen. Second, they said the 9000 series name was chosen to align with their CPU series and also because 8000 series GPU branding is being used for mobile designs with RDNA3.5.

The name doesn't really matter too much, to be honest. What matters is whether AMD can deliver a strong GPU with competitive features and great value. Renaming their lineup so it looks more like an Nvidia card isn't the key to more sales. The key is creating a product similar to whatever the RTX 5070 ends up being, justifying the matched name, and then smashing Nvidia down with better pricing and value.

FSR 4 = Machine-learning-based upscaling

AMD also announced a couple of new features coming with RDNA4 GPUs. There's "Adrenalin AI", okay... Then the much bigger and more important feature for gamers, FSR 4, which is finally moving to machine-learning-based upscaling. There's not much detail being provided here either (see the pattern?), outside of FSR 4 being developed for RDNA4, and designed for high-quality 4K upscaling.

The hope is that FSR 4's quality will improve to the point of being truly competitive with DLSS, which is more likely to happen with an AI-enhanced algorithm like Nvidia already uses and like Intel uses with XeSS. But we haven't been provided any demonstration of FSR 4, and we don't know when it will launch.

There are a few additional tidbits though. FSR 4 is designed to use the AI accelerators built into RDNA4, but AMD hasn't confirmed whether FSR 4 is exclusive to RDNA4 graphics cards or whether it will be broadly available like FSR 3, which works not only on older AMD GPUs but also with chips from Nvidia and Intel.

AMD says FSR 4 will provide not just an image quality improvement compared to FSR 3, but a performance improvement as well, which is interesting.

We also spotted a rather interesting footnote, which reads: "AMD FSR 4 upgrade feature only available on AMD Radeon RX 9070 series graphics for supported games with AMD FSR 3.1 already integrated." This implies that AMD has developed some sort of conversion system that will take a game with FSR 3.1 and automatically upgrade it to FSR 4 when you have an RX 9070 graphics card.

We asked AMD about this strange footnote that doesn't seem related to any of the main points on the slide, and they declined to elaborate or explain this, which is a little unusual. If this feature does exist in the way the footnote seems to suggest, it would be a very solid way to launch FSR 4, given that FSR 3.1 is already available in around 50 games.

Game support is crucial because FSR 4 would not enhance the value of RDNA4 if it took two to three years for it to be supported in a wide number of games. In that sort of scenario, DLSS would still have a huge advantage in game support, and thus would still be worth paying for. But if FSR 4 can launch with decent game support right off the bat, it will go a long way to closing that gap, provided image quality and performance also hold up.

Various board partners will be showing off Radeon RX 9070 models at CES 2025, so we'll hopefully get some hands-on time with them in the coming few days right here in Las Vegas.

New Ryzen 9 9950X3D and Ryzen 9 9900X3D CPUs

AMD also announced two additional desktop Ryzen 9000 CPUs: the Ryzen 9 9950X3D and Ryzen 9 9900X3D. These are exactly as expected, bringing 2nd-gen 3D V-Cache technology to 16- and 12-core CPUs, providing a single CPU with great performance for gamers and creators.

If you had to guess what the specifications of these CPUs looked like, you'd probably be right.

The Ryzen 9 9950X3D with 16 cores packs 128MB of total L3 cache: 64MB from the original Zen 5 CCDs and 64MB stacked beneath one of the CCDs – the other CCD goes without V-Cache, in a similar design to the 7950X3D. Scheduling is still required to ensure games run on the 8-core CCD with V-Cache, while the other CCD is designed to run at the highest possible frequencies – in this case, 5.7 GHz, the same max clock speed as the 7950X3D.

The TDP is also identical to the 7950X3D at 170W, so this is basically a straight Zen 4 to Zen 5 core architecture upgrade.

The 9900X3D, the 12-core model, is also what you'd expect: 128MB of L3 cache and up to a 5.5 GHz max boost frequency, 100 MHz lower than the 7900X3D. The TDP remains the same at 120W. No pricing was given for either of these CPUs, and AMD has only said they will be available in Q1 of 2025.

As for performance, AMD claims in their own benchmarks that the 9950X3D will deliver roughly equivalent performance to the 9800X3D in games, within 1%. This means an 8% improvement compared to the 7950X3D in AMD's testing across 40 titles at 1080p high settings – which, again, AMD says is similar to how the 9800X3D compared to the 7800X3D. Relative to Intel's Core Ultra 9 285K, AMD believes the 9950X3D should be 20% faster in games.

In productivity apps and creator workloads, AMD is quoting a 13% increase in performance relative to the 7950X3D across 20 apps and a 10% increase relative to the Core Ultra 9 285K. This is why AMD believes the 9950X3D will be the best part for both gamers and creators, as it offers both better gaming and better productivity performance than the 285K in their testing. Of course, we'll have to verify whether that's true in our full review when the parts launch, so stay tuned for that.

Other AMD announcements at CES 2025 relate to mobile parts, which we'll be covering shortly, including new laptop hardware and chips for gaming handhelds. There are tons of mentions of AI, too, because of course, that's the trend. Some of the laptop CPUs even have AI in the name, like the AMD Ryzen AI Max+ Pro 395, which is a real name for a real product.

That was more of a teaser from AMD than a full product launch, but all of these new products are not too far away. We should learn more about RDNA4 in particular shortly. Stay tuned for more CES 2025 coverage here on TechSpot.

 
I like the idea of AMD holding back like this, though after the new RTX line launches with pricing (later today), AMD needs to have stock ready and get a jump on filling orders, knowing the RTX line is going to be in short supply and mostly go to scalpers.
 
I played the AI drinking game and I can barely type right now. cheers to those who did! I ran out of beer about halfway through

AI AI AI AI AI AI AI AI AI AI

I'm gonna have to get 6 pack if I'''n gonna watch nVidias but I think I shoud aboid the high alcohol stuff
 
Glad I'm not the only one who noticed the writing quality of this article. TERRIBLE. AI AI AI. Another round, please.
 
Definitely disappointing, but AMD knows it has a very weak hand this round and it's best to keep their cards close until NV plays theirs.

Don't need another 5700 XT/5600 XT scenario where prices were bouncing around and cards were getting shipped with an old BIOS that users had to flash, all for the sake of remaining price/performance competitive against NV.
 
It was a shitty presentation. Hopefully AMD waits for Nvidia to release product details and prices, and adjusts their offerings accordingly. This would align with Huynh’s talk about gaining market share. So maybe a well priced (from the start) AMD gpu at last?
 
Feels like AMD stiffed Sony on the PS5 Pro, when I see them launching RDNA4 only a couple of months later and the lower-end 9600 touted to be in the same sort of performance bracket as the 7700 XT. Ehhh.

The GPUs still feel like an entire generation behind Nvidia, maybe more at this point. You will probably see a two-year-old 4070 Ti turning over most of AMD's brand-new product stack as soon as you enable any ray tracing. Pricing is everything, as usual. I guess if the 9070 XT is under $700 it'll sell to somebody.
 
So do you remember how when the XBONE and PS4 came out, graphics were suddenly really awesome and really optimized? That seemed to taper off with the Series X and the PS5. What everyone did get is basically the same graphics fidelity on hardware that was faster. We have publishers, not really devs (but not exclusively), spending money on how they can make games generate more revenue and be more addictive. They're spending less money on development and optimization and more of it on how to make selling loot boxes more addictive.

This is why, despite even a 4060 being significantly more powerful than a 1080 Ti, it doesn't really FEEL like we've made any improvements. GPUs have gotten A LOT faster over the last 3 generations. Regardless of who has the fastest card, both have made incredible improvements, but many people don't feel the upgrade like they used to.

This isn't from a lack of trying, but from a lack of optimization. The devs don't have to spend nearly as much time on optimization, and the top brass of the gaming company just wants to make as much money as possible. "Well, everyone can go get these fast cards now. We can spend half as much on optimization and twice as much on marketing skins and battle passes." That kicks the can down the road to the people buying the cards.

The companies selling us the games (not the developers, I want to make the difference very distinct) think that since we have faster hardware, they can give us a similar experience with fewer development costs. The problem is that hardware costs have risen to the point where the same consumer who would buy an 80-class card every generation, well, maybe he's looking at a 70 Ti or maybe just a 70-class card. His experience in new games is going down relative to what he is used to.

Now, for me, I have built two "money is no object" machines in my life. Currently, I game less than when I was younger; that's just how it is. There is also the fact that there are very few games these days that I find worth spending that kind of money on. I'm still young enough that a few grand in my 401K will make a big difference at retirement, but not so young that I can YOLO my money away. I'm also not old enough yet to justify spending money on myself, as my life isn't slowing down. My last "money was no object" machine was at 30 and my next will probably be in my late 40s. For now, I'm happy with "acceptable" performance at what I would consider an "acceptable" price. While I certainly could afford a "money is no object" machine whenever I want, I actually kind of enjoy working within the limitations I have placed on myself.

That said, I will not make excuses for publishers who won't pay for development optimization, and the fact that they won't pay for it is the main reason I can't justify hardware costs like I used to. On top of that, games just aren't that fun anymore. I was still excited for every new game coming out well into my early 30s. Now, I can count the games I've found worth playing in the last 5 years on one hand: 2077, BG3 and Space Marine 2. To be honest, I'm still playing EVE these days on a 12700 and a 6700 XT 90% of the time. I'm actually downloading Neverwinter Nights EE on Steam right now, so I'm gonna give that a go again soon, but I'm certain I got my point across.
 
Speaking of optimization in games, mandatory beta testing by customers has become the norm as well.
But that is 100% on us. We buy games in any condition, sometimes for full price. The last bit of decency devs have today is when they at least put an early-access label on their game when it is not ready.
Which makes me believe this is just human nature – I mean the desire to have the thing as soon as possible, no matter the cost.
It must be a well-known trick they teach in marketing schools.
 
Dear Tim,

I do indeed see a pattern, and one you keep repeating often in your first CES article: you didn't/don't understand the scope of the announcement.

"Join AMD executives, alongside partners and customers, to hear how AMD is expanding its leadership across PCs and gaming, highlighting the breadth of the company’s high-performance computing and AI product portfolio."
^^
Your whole article is very off-putting, because AMD doesn't just make graphics cards for gamers, and you keep referencing how little detail or how vague AMD was with their opening day announcements. And YOU portray this as if such information isn't forthcoming in a panel, or that AMD is hiding something, or running away from something.

Why are you so suspicious..? Most of us are highly anxious to see/hear more.



Secondly, the following paragraph is simply biased and misplaced:
"Game support is crucial because FSR 4 would not enhance the value of RDNA4 if it took two to three years for it to be supported in a wide number of games. In that sort of scenario, DLSS would still have a huge advantage in game support, and thus would still be worth paying for. But if FSR 4 can launch with decent game support right off the bat, it will go a long way to closing that gap, provided image quality and performance also hold up."

Time to face the facts, as most realists understand now in 2025: CUDA is dead in gaming, and for the last 4-5 years games have been designed and developed for RDNA, because it's unified across console and PC.

PlayStation 5/Xbox and handhelds have more exclusive games/sales than NV. And to that point, like so many here, you are so stuck in the past (along with all my expensive EVGA cards) and do not understand that DLSS has no weight.

Game developers first develop their games based on RDNA and AMD's hardware. I am sorry if that stark news threatens your stock portfolio, but when GTX owners and early RTX owners have to use FSR in games, that is a huge win for AMD. 1st- and 2nd-gen RTX owners can't even use the latest DLSS and will be thanking AMD with their next purchase.


FSR is the gaming industry standard; it's agnostic and available in every game on the market. So what gap is FSR trying to close, so much that YOU hope AMD can close said gap faster with FSR 4..?

What I want to know is how Nvidia is going to stuff DLSS into all the leading games coming out on console..? Where you need such technology the most.

Not so much in $800+ GPUs <-- Gamers buy those for their raw raster performance.
 
The naming scheme is weird; it feels like the "performance rating" and similar copycat naming of the late '90s/early 2000s, where CPUs would be named not for their clock speed but for what Pentium speed they measured up to in performance, or some other arbitrary competitor-based metric. AMD should be clearly carving out their own product line, not haphazardly chasing Nvidia, who is already a step ahead.
 
The question is whether FSR 4 will be ready on launch day, or, like FSR 3 frame generation, arrive later...
 
AI is taking up all the space. They didn't show anything about RDNA4 – literally AMD's worst presentation ever.

 
Radeon RX 9700 renamed to RX 9070...

Because AMD doesn't want people to be confused with their CPUs, which are also now at the 9000 series with parts like the Ryzen 9700X or Ryzen 9800X3D, so the new naming scheme makes sense for GPUs.
 
The interesting thing about AMD with the RX 9070 XT is how they seem focused on comparing this next-gen AMD card to Nvidia's now past-gen (4070 vs 9070). Now factor in AMD's idea of using the 9070 to gain or buy market share from Nvidia: just how cheap will they need to sell the RX 9070 XT (when the RTX 5070 is $549 and the RX 9070 XT is a 4070-performance-level card) for the team green fanboys to jump ship on Nvidia? It's going to have to be far less than the 5070's $549.
 
After AMD's announcement I was thinking: great, AMD has finally caught up with the green team's former generation, yay and yawn!
After Nvidia's vapourware (pardon me, DLSS 4) fest, I think that may not be such a bad thing after all.
All hinging on AMD's upcoming independently tested price/performance ratio, this fool hopes.
 
Breaking news: AMD renames the RX 9070 XT once again, to the AMD Radeon 4070 Ti, and prepares for AIB preorders in Jan. 2025 – with the Nvidia RTX 5000 series being such a failure due to DLSS 4, pricing, and next-gen design, combined with AMD's next-gen GPU performance levels, prompting a further rename and delaying the CES full live presentation of the new RX 4070 Ti and RX 4070. ;)
 