DDR5-4800 trades blows with DDR4-3200 in leaked Alder Lake benchmarks

Hahahahah, technology advancements FTW! Even my sorry *** DDR3-2400 system hits 34.1 GB/s read / 37.6 GB/s write (G.Skill TridentZ / 4790K, stock).
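A quick back-of-the-envelope check, assuming dual channel with the usual 64-bit bus per channel (measured numbers always land a bit below the theoretical peak):

# Rough theoretical peak bandwidth = transfers/s x 8 bytes per transfer x channels
def peak_gb_s(mt_per_s, channels=2):
    return mt_per_s * 8 * channels / 1000

print(peak_gb_s(2400))  # DDR3-2400 dual channel: 38.4 GB/s theoretical,
                        # so ~34-38 GB/s measured is in the right ballpark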

Here's to early adopters! Prices wouldn't be as high if it weren't for you! :)))
 
Here's to early adopters! Prices wouldn't be as high if it weren't for you! :)))
Indeed. If you absolutely, positively have to be the first kid on your block to have something, it's only fair that you should help defray the R & D costs.

Put into context, and more bluntly: "bragging rights cost money; how badly do you need to hear yourself talk?" :rolleyes:
 
In theory the DDR4 spec started at 1600, but in reality nobody used anything less than 2133 even in the early days of the standard. Within a year of Intel's first mainstream consumer DDR4 platform (Z170), 3000-3200 was pretty much the price/performance sweet spot and available everywhere.

I expect the same here. You're probably looking at 6400 as the bottom end of widely sold DDR5 kits in a year's time, and it'll go up quickly from there.

Good point, but it's worth noting that DDR3 reached 1600 in the standard, and that before DDR4 was introduced DDR3 2133 was common (kind of like DDR4 3200 today, I'd say; correct me if you think the analogy is wrong), DDR3 2400 was like current DDR4-3600, and DDR3 3000 was available.

That made DDR4-1600 quite pointless. It's far from the situation with DDR5, where the baseline 4800 is near the top of the line for DDR4. I agree with you that we're likely to standardise on higher speeds even in the short run, but DDR5 4800 is nowhere near as pointless as DDR4 1600 was.

In any case, I wouldn't read much into the current numbers. DDR5 4800 should have a real bandwidth advantage over DDR4 3200. The DDR5 CPU's slower speed in this sample probably has an effect, and the platform is far from mature. I'm pretty confident that by release date we will see a meaningful difference, and probably even more so with the CPU generation after that.
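On paper, at least, the gap is there (same rough dual-channel, 8-bytes-per-transfer math; real benchmarks won't hit these peaks):

# Theoretical peak bandwidth, dual channel, 8 bytes per transfer
ddr4_3200 = 3200 * 8 * 2 / 1000   # 51.2 GB/s
ddr5_4800 = 4800 * 8 * 2 / 1000   # 76.8 GB/s
print(f"{ddr5_4800 / ddr4_3200 - 1:.0%} more peak bandwidth for DDR5-4800")  # 50% more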
 
So, today I have learned that I can save $300 by not buying any DDR5 and just sticking with the 32 GB of DDR4-3200 currently in my 9900K PC if I decide to upgrade to an Alder Lake 12900K build in Q1 2022, seeing as it's compatible with DDR4.

Good!

 
Indeed. If you absolutely, positively have to be the first kid on your block to have something, it's only fair that you should help defray the R & D costs.

Put into context, and more bluntly: "bragging rights cost money; how badly do you need to hear yourself talk?" :rolleyes:

Says the one with 17,025 posts, lol. Not that I want to ignore the "cranky" part of your name, but do you actually have an opinion in tune with your age, other than name-calling or projecting?

Bragging rights cost money? Thank god people with low self-worth can always buy new stuff to feel better about themselves. If you remember your Economics 101, price is dictated by supply and demand, not by R&D. You can ASK whatever price you want for a product. Some *****s bought a $999 iPhone app that simply added the icon of a diamond/gem to their screen. At the other end, VW loses about a million dollars on every Veyron they sell. But who am I to reply to strangers without 2 simultaneously firing synapses. :)
 
I'm gonna wait till DDR5 kits and motherboards are on the shelves with full compatibility with whatever generation of Intel Core i9 is available, and then upgrade to a new desktop.

I'm sure I'll see significant speed and stability improvements - although I'm doing fine right now.

Honestly, if I were you I would skip 12th gen and wait for the beta testers to try it out first.
 
This guy gets it! There's pretty much no chance DDR5 with its high latency will compare to Samsung B-die at 3800 CL14 with tight tRFC. Still, if the new Intel chips can get me to 280 fps in Warzone I'm buying. I'm CPU-limited at 1080p.
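Rough math on that latency gap, taking CL40 as an assumed timing for a typical early DDR5-4800 kit (not a figure from the article):

# Absolute CAS latency in ns: CL cycles divided by the memory clock (half the MT/s rate)
def cas_ns(mt_per_s, cl):
    return 2000 * cl / mt_per_s

print(cas_ns(3800, 14))  # tuned DDR4 B-die, 3800 CL14: ~7.4 ns
print(cas_ns(4800, 40))  # assumed early DDR5-4800 CL40: ~16.7 ns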

You really need that 280 fps? Are you that bad at the game? What's next, 360 fps at 720p? LMAO
 
That's because Skylake's iGPU (HD 530) had a boot-up memory limit of 2133 MHz.

I'm sort of a wimp, and never run memory past the maximum boot speed of the onboard iGPU. (You know, in case a video card blows up, you don't have to R&R the memory; just hook the monitor to the board.) I fibbed a bit: I'm wimpy, lazy, and cheap. ;)

You know you could run faster memory and then just lower the speed on it?
 
Says the one with 17,025 posts, lol. Not that I want to ignore the "cranky" part of your name, but do you actually have an opinion in tune with your age, other than name-calling or projecting?
Have you checked my join date? I've been "projecting" for nigh on 15 years.

Besides, I just gave my general opinion, whereas you're commenting directly on my character, my ability to think, me "living up to my screen name", etc. Which, while not directly "name calling", so to speak, is an ad hominem attack. (And BTW, off topic, not that I'm a stickler for that sort of thing.)
Bragging rights cost money?
As a matter of fact, they do. Why else would someone pay $2,000,000 for some crap antiquated "Super Mario" game cartridge? I also believe he felt it might get him laid more, since the average high-maintenance barracuda has to know there's plenty more money where that came from. If nothing else, said gold digger might be able to squeeze a few hundred thou out of him in a sexual harassment lawsuit.
If you remember your Economics 101, price is dictated by supply and demand, not by R&D. You can ASK whatever price you want for a product. Some *****s bought a $999 iPhone app that simply added the icon of a diamond/gem to their screen.
I don't recall "Economics 101", since I never signed up for that course.

I am, however, up to about "paying attention to the news", oh, say, about 6201. And IIRC, big pharma always blames the staggering cost of a newly released, patented drug on "massive R&D costs", and gets away with that explanation every time. While Viagra was still under patent protection, the sticker price was $2,000 a month. The message there being, "woodies cost money, how many do you feel you need?"
VW loses about a million dollars on every Veyron they sell. But who am I to reply to strangers without 2 simultaneously firing synapses. :)
Well, major automakers spend gobs and gobs of money on their racing divisions. But they do it under their own brand name. So, if Chevy is winning a bunch of races, they'll likely sell thousands upon thousands of their standard autos, to compensate for, and likely exceed, their investment at the track. Have you ever heard of NASCAR? Or were you too busy studying economics?

If Volkswagen loses a million on every Veyron they sell, they should re-brand it as a Volkswagen instead of a Bugatti, and take it to the track. It might help them sell a lot more plain old Volkswagens.

While I'm here (I'm about to go rest my two lone brain cells), do you mind if I ask how your 3rd post went for you? Did it make you feel important, maybe like you'd accomplished something? Or possibly like you've righted some "wrong" for TechSpot?
 
You really need that 280 fps? Are you that bad at the game? What's next, 360 fps at 720p? LMAO
Honestly, I'm 40 years old and I'm competing with super-sweaty 20-year-olds in Warzone. We play wagers occasionally, so yes, I need the 280 fps on my 280 Hz monitor. My Human Benchmark scores are around 180 ms, which isn't great. My aim is pretty good though, thanks to hundreds of hours in KovaaK's. The lower input lag of high-refresh-rate gaming really helps me compete with the kids. Also, pretty much all CS pros play at lower than 1080p with a stretched (4:3) aspect ratio, not far off 720p. If I used a controller I'd probably do better at Warzone, but I'm MKB for life. Thanks for your toxicity though!
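For anyone wondering what those milliseconds look like, the frame-to-frame interval alone (just one piece of the total input-lag chain) is simply 1000 divided by the refresh rate:

# Frame interval in ms at a given refresh rate (only one part of total input lag)
for hz in (60, 144, 280):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per frame")
# 60 Hz -> 16.67 ms, 144 Hz -> 6.94 ms, 280 Hz -> 3.57 ms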
 
Well, major automakers spend gobs and gobs of money on their racing divisions. But they do it under their own brand name. So, if Chevy is winning a bunch of races, they'll likely sell thousands upon thousands of their standard autos, to compensate for, and likely exceed, their investment at the track. Have you ever heard of NASCAR? Or were you too busy studying economics?

Whilst I totally agree with your point, I think using NASCAR as your example wasn't a good choice.
Pretty much no one outside North America watches it or knows the manufacturers or drivers.
 
No one outside of the southern US watches NASCAR... F1 is way more popular in Canada.
There are other racing circuits here in the US, and I believe Formula 1 as well. If not F1, then "IndyCar" racing. I'm old enough to remember when they were running those nasty old "Offenhauser" four-bangers. (I even spelled that correctly, first try.)

I do recall watching a "strange" racing event here in the US (?), which was held on an F1-style track, but the cars were of many different classes, from stock to full-on F1 designs. I've never had any interest in oval-track racing whatsoever.

I'm from "the union" part of the country, and can assure you I have absolutely no interest in watching stock cars go round and round in circles at 200 MPH. Usually, the "highlights" of the horrific crashes on the 11 o'clock news are more than sufficient to satisfy the sadistic side of my nature.

NASCAR is our most "notorious" (?) racing organization, which is why I used it as a citation. Although, it's just not the same now that they're not allowed to wave the Confederate flag around anymore.

My interest in watching sports only goes as far as women's beach volleyball, women's figure skating, and women's artistic gymnastics. The shoulders on the women (girls?) in standard gymnastics are a bit off-putting. IMO, it's not "ladylike" to be built like John Cena.

OK, so I'm a lecherous old fart, what of it? ;)
 
Honestly, I'm 40 years old and I'm competing with super-sweaty 20-year-olds in Warzone. We play wagers occasionally, so yes, I need the 280 fps on my 280 Hz monitor. My Human Benchmark scores are around 180 ms, which isn't great. My aim is pretty good though, thanks to hundreds of hours in KovaaK's. The lower input lag of high-refresh-rate gaming really helps me compete with the kids. Also, pretty much all CS pros play at lower than 1080p with a stretched (4:3) aspect ratio, not far off 720p. If I used a controller I'd probably do better at Warzone, but I'm MKB for life. Thanks for your toxicity though!

I myself run a 165 Hz monitor, but honestly I don't know why :joy: I mostly play single-player games and only a few online ones; my brother talked me into it, saying it was so good, but I barely see or feel a difference. As for the input lag, we are talking about milliseconds here, which definitely doesn't make a difference.
 
I thought I explained why I ran the stock speed RAM.

Or were you trying to say, "you know you can run memory faster than the speed on it"? Big difference.

My comment wasn't complete... what I was trying to say was that you could have run memory faster than 2133 MHz, and if your GPU broke down, just clock it down.
 
My comment wasn't complete... what I was trying to say was that you could have run memory faster than 2133 MHz, and if your GPU broke down, just clock it down.
Gotcha. (y) (Y)

Although that does assume you can get iGPU output into the BIOS at the higher memory speed.

I do have a spare 8400 GS lying around for such emergencies. I suppose that would be a good workaround.
 
Fortunately, with AM5 still three quarters away, we have time for DDR5 RAM to mature before the real next-gen future-proofing begins.
 
Good point, but it's worth noting that DDR3 reached 1600 in the standard, and that before DDR4 was introduced DDR3 2133 was common (kind of like DDR4 3200 today, I'd say; correct me if you think the analogy is wrong), DDR3 2400 was like current DDR4-3600, and DDR3 3000 was available.

That made DDR4-1600 quite pointless. It's far from the situation with DDR5, where the baseline 4800 is near the top of the line for DDR4. I agree with you that we're likely to standardise on higher speeds even in the short run, but DDR5 4800 is nowhere near as pointless as DDR4 1600 was.

In any case, I wouldn't read much into the current numbers. DDR5 4800 should have a real bandwidth advantage over DDR4 3200. The DDR5 CPU's slower speed in this sample probably has an effect, and the platform is far from mature. I'm pretty confident that by release date we will see a meaningful difference, and probably even more so with the CPU generation after that.

Judging by the state in which DDR4 was released, and the JEDEC specs, which show that DDR5 will likely have worse CAS latency than DDR4 (when measured in ns, not cycles), I strongly suspect that we'll be waiting until the second CPU generation using DDR5 to see any meaningful benefits from DDR5. Even if the RAM itself is better, the first generation of memory controllers (from both Intel and AMD) tends to not perform very well.
 
I strongly suspect that we'll be waiting until the second CPU generation using DDR5 to see any meaningful benefits from DDR5.
I suspect we'll see meaningful benefits when decent iGPUs are released. GPUs tend to care a lot more about bandwidth than about latency. For general computing, the extra bandwidth probably won't matter much.

Still, I do expect that the bandwidth itself will be measurably higher, which is just not the case in the bandwidth benchmark quoted here. It's possible that this is down to Alder Lake being a DDR4/DDR5 hybrid platform, but I certainly hope that's not the case.
 
Another workaround would be a BIOS reset by pulling the battery out for a few seconds :-P
Jeez, make me feel like a noob why dontcha?

That said, since I don't have any experience with "overclocked" (so to speak) memory, IDK whether, when the BIOS is reset, it will be polled at a stock speed or at its rated speed.
 
Jeez, make me feel like a noob why dontcha?

That said, since I don't have any experience with "overclocked" (so to speak) memory, IDK whether, when the BIOS is reset, it will be polled at a stock speed or at its rated speed.

It always goes back to stock speeds. I have DDR4-3600 memory; if I were to pull my battery out, the memory would go back to 2133 MHz.
 
I'll probably hold off on DDR5 for a few years. Currently I'm waiting for Zen 5 before I contemplate upgrading; I went from a 3570K to my 9700K, after all.

Maybe by then I can afford a midrange GPU to go with it...
 