Aside from the formal press releases Corsair had already issued announcing the new Carbide 100R, the Hydro H110i GT all-in-one liquid cooler, the HG10 N780 GPU bracket and their new flash storage options, there were a couple of interesting things worth discussing at their suite regarding DRAM. A small portion of the suite featured the recently released GIGABYTE X99-SOC Champion (which we have reviewed), and plugged into it was a set of orange DDR4-3400 memory.

Up until this point, Corsair's DDR4 lineup had stretched from DDR4-3200 (we have a kit of this in for testing) at $740 for 4x4GB to DDR4-3333 at $910, but DDR4-3400 pushes the margin out a bit further. With most DRAM, binning for higher speeds runs into the law of diminishing returns – more ICs have to be binned to find the ones that reach the higher speed. As a result, these modules will be pretty expensive, and because X99 needs four modules to reach quad-channel bandwidth, a user has to buy all four. At the suite, they even had them running with a small overclock to DDR4-3500:

This sets them up to be very expensive. They are currently on Corsair's website at $999.99 for a 4x4GB kit. To put that into context, two 4x8GB DDR4-2133 C13 kits will run just over $1000 combined, making these modules almost 4x the cost per GB of base JEDEC frequency memory. The DDR4-3400 modules are rated at 16-18-18, which is looser than JEDEC at 2133, but relative to the frequency on offer the primary sub-timings are still fairly tight. Compatibility for this kit is so far only listed with the X99-SOC Champion.
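As a quick sanity check on that "almost 4x the cost per GB" figure, the arithmetic works out roughly as follows (the combined DDR4-2133 price is an approximation, as above):

```python
# Back-of-the-envelope cost-per-GB comparison, using the prices quoted above.
kit_3400_price = 999.99            # 4x4GB DDR4-3400 kit
kit_3400_capacity_gb = 4 * 4       # 16 GB total

kits_2133_price = 1000.0           # two 4x8GB DDR4-2133 C13 kits combined (approx.)
kits_2133_capacity_gb = 2 * 4 * 8  # 64 GB total

per_gb_3400 = kit_3400_price / kit_3400_capacity_gb    # ~62.50 $/GB
per_gb_2133 = kits_2133_price / kits_2133_capacity_gb  # ~15.63 $/GB

print(f"DDR4-3400: ${per_gb_3400:.2f}/GB")
print(f"DDR4-2133: ${per_gb_2133:.2f}/GB")
print(f"Ratio: {per_gb_3400 / per_gb_2133:.1f}x")
```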

While the kit was impressive and did catch my eye, the following wall image caught my attention more:

Here it explicitly states that a module size of 16GB is coming to DDR4 in 2015. Unfortunately no other information could be crowbarred out of Corsair regarding time frame or pricing, but we were able to speak with a memory manufacturer who said it should be coming in the near future. We will be working hard with Corsair to secure some testing kits if they pop up, but it means that soon we should (hopefully) start to see 128GB UDIMM arrangements on X99. It might also mean another round of BIOS updates to help support a full 8x16GB configuration. These would most likely start at DDR4-2133, as this would have the highest yields.
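That 128GB ceiling follows directly from a fully populated X99 board, sketched here for clarity (eight DIMM slots is typical for X99, though some boards carry fewer):

```python
# Capacity math for a fully populated X99 board with the teased 16GB UDIMMs.
channels = 4           # X99 is quad channel
dimms_per_channel = 2  # most X99 boards carry eight DIMM slots in total
module_size_gb = 16    # the 16GB module size slated for 2015

total_gb = channels * dimms_per_channel * module_size_gb
print(f"{total_gb}GB maximum")
```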


  • Sivar - Tuesday, January 13, 2015 - link

    I have 32GB of DDR4 and Windows 8.1, and even running 10 concurrent 1080p video encodes (using x264 on placebo mode), 30+ Chrome tabs, Steam, and Visual Studio 2013, I have never been able to hit >16GB of RAM usage.

    Other than VMs and astonishingly memory-hungry content creation apps -- both of which are used by a vanishingly small percentage of the PC market -- I don't know how much demand will exist for such large RAM sizes.
    There was a time when more RAM was always better because the need was just around the corner. While >8GB will eventually be required, the growth curve has been flat for longer than at any point in history.
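For anyone wanting to replicate the commenter's kind of observation, a minimal POSIX-only sketch of reading total physical RAM from Python's standard library (Windows needs a different approach, e.g. WMI, so this is an illustration rather than a portable tool):

```python
import os

# POSIX-only: query physical memory via sysconf (not available on Windows).
page_size = os.sysconf("SC_PAGE_SIZE")    # bytes per memory page
phys_pages = os.sysconf("SC_PHYS_PAGES")  # number of physical pages
total_gib = page_size * phys_pages / 2**30

print(f"Total physical RAM: {total_gib:.1f} GiB")
```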
  • inighthawki - Tuesday, January 13, 2015 - link

    If Microsoft ever implemented OS-level support for RAM disks and made it consumer accessible, that would be pretty useful. Or find a way for the OS to more intelligently use the RAM as a disk cache to bypass disk access completely for commonly used applications, which I suppose could theoretically provide power savings on mobile devices.
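The "RAM as a disk cache" idea the comment describes can be illustrated in user space with a small read-through cache sketch (the function name and cache size are assumptions for illustration; real OS page caches do this transparently at the kernel level):

```python
from functools import lru_cache

@lru_cache(maxsize=128)  # keep up to 128 files' contents resident in RAM
def read_cached(path: str) -> bytes:
    """First call reads from disk; repeat calls for the same path are
    served straight from memory, bypassing disk access entirely."""
    with open(path, "rb") as f:
        return f.read()
```

After the first access to a given path, `read_cached.cache_info().hits` climbs while the disk stays idle, which is the effect being described.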
  • Klimax - Wednesday, January 14, 2015 - link

    A RAMDisk has been part of Windows since forever. You can add a RAMDisk driver. It is GUI-less, though, and its parameters are edited through the registry. More info and source code (it is a very simple driver):
  • inighthawki - Wednesday, January 14, 2015 - link

    Very interesting, didn't know about this. Thanks for sharing!
  • atticus14 - Wednesday, January 14, 2015 - link

    So while I don't count as a typical consumer, with the amount of RAM Chrome takes up... I think I could use a few of those.
  • bitech - Wednesday, January 14, 2015 - link

    Why would you want RAM requirements to rise? What good would it do to make consumers require more than 8GB of RAM?
  • Flunk - Tuesday, January 13, 2015 - link

    Cool, looks like we're getting close to a reason to upgrade my 2500K. Skylake or AMD's next-gen chip seem interesting and they both use DDR4.
  • foxtrot1_1 - Tuesday, January 13, 2015 - link

    Yeah, Skylake is the first significant upgrade for Sandy Bridge users. My 2500K + GTX 670 FTW box can still manage just about everything I need at 1080p; really looking forward to the step up to PCIe SSDs and DDR4.
  • CaedenV - Tuesday, January 13, 2015 - link

    Agreed, a lot is riding on Skylake. I have an i7 2600 and very few things really peg the CPU; the newer hardware (outside of the iGPU, which goes unused) is only better at very specific tasks such as encryption. I am at the point where I want to upgrade... but an upgrade would be largely useless right now.
    Even my old-ish GTX 570 can keep up with most things today (at least at 1080p) and is really only held back by having just 1GB of RAM on board rather than by any sort of processing bottleneck.
    Skylake + PCIe SSDs + DDR4 + a newer GPU might actually make a case for an upgrade. Hopefully it will be faster, offer more cores, and have significantly lower idle power use for when I am not using it but still have to have it on. But if the new chips offer what the last few generations have offered, the same horsepower at lower wattage, then I may just upgrade the GPU and be done with it.
  • TiGr1982 - Tuesday, January 13, 2015 - link

    Going by Intel's "tradition" of recent years, Skylake won't bring any impressive x86 performance improvements. Another 5%, I guess, not taking new Skylake-specific instructions into account (correct me if I'm wrong).
    Besides, Skylake is initially rumoured to be locked (non-overclockable). So I have big doubts it will be worth upgrading to from an i7-2600 (or even an i7-2600K) from a purely x86 performance point of view.
    I personally have a Haswell i7-4790K and plan to use it for three more years, or until the motherboard dies of old age, because I believe an upgrade to a newer platform in the next 3 years won't really be worth the money.
