Comments Locked

101 Comments


  • yeeeeman - Thursday, December 24, 2020 - link

    while this is an interesting product, it is one SLOW machine and even a Sandy Bridge based system trounces it.
  • Silver5urfer - Thursday, December 24, 2020 - link

    PS4 and XB1 get crushed by Q6600 Core2Quad, Sandy i7 2600K obliterates it. And that SB processor is still relevant even today can play a lot of games which came in 2020.
  • d0x360 - Thursday, December 24, 2020 - link

A lot of 2D indies maybe... and surely nothing that's using Unity for its engine.

    I mean hell I have an i7 5820k with an all core OC of 4.6ghz (stable to 5.4), that's 6 cores btw. I also have 32 gigs of ddr4 quad channel and a 2080ti ftw3 ultra +170 core +1400mem.

    I found that this year my CPU just isn't cutting it.. except for in cyberpunk oddly enough. In watch dogs legion I'm getting 30fps at 1080p with ray tracing off and DLSS on perf. Yeah that's probably because it's ubi soft but still there have been plenty of other games that came out in the 2nd half of this year that offer unacceptable performance and it's always because I'm cpu bound.

    No it doesn't help that I play at 4k... I like dropping to 3200x1800 but some games freak out when using a custom resolution so then I have to go to 1440p which looks soft on a 4k display.

    So an i7 2600k... Nah. No way. I doubt it even "crushes" the base consoles.
  • loki1944 - Thursday, December 24, 2020 - link

    Yes; it does, because even Bloomfield will demolish an xbox one X. I have C2Q 6600, 9650, i7 920, 930, 960, 980X, 870, 4770K, 6850K, 7800X and i5 750, 6600K, and 9400. There is simply no comparison. If your 5820k isn't cutting it then it's probably dying.
  • Silver5urfer - Thursday, December 24, 2020 - link

    "CBR11.5 marks
    AMD QuadCore A6-5200 2.0GHz: 1,99pts
    Intel Core2 Quad Q6600 2.4GHz: 2,70pts
    Intel OctaCore Atom C2750 2.4GHz (@2.6GHz Turbo): 3,81pts
    AMD EightCore FX-9370 @ 2.4GHz: 4,16pts
    Intel Core i7-4770K @ 2.4GHz: 5,18pts

    source: http://pc.watch.impress.co.jp/docs/topic/review/20...
    source: http://adrenaline.uol.com.br/biblioteca/analise/78...

    the PS4 CPU runs at 1.6GHz,
    1.5GHz quad core Jaguar scores 1.5
    1.6GHz would probably be something around 1.6
    8 core 1.6GHz Jaguar maybe around 3-3.2?

    Q6600 at 3GHz achieves around 3.3
    stock around 2.7"

A quick google instead of wasting my time... And that's the CBR11.5 score. Then if we take the i7 2600K, it is over. Per GN, a 2600K at 4.7GHz @ 1.35v plays games at stock Ryzen 2700 levels.

    End of story.

    8th gen console HW is utter garbage.

    And using Cybertrash ? That is a horrible game and a horrible benchmark as well.
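The back-of-the-envelope estimates in the quoted numbers above assume a multi-thread score scales linearly with clock and core count. A minimal sketch of that model, using the quoted CB11.5 figures as inputs (the linear scaling itself is an assumption and ignores memory bandwidth, turbo, and cache effects):

```python
# Linear clock/core scaling model for the Cinebench R11.5 guesses above.
# Baseline scores come from the quoted figures; the linear model is an
# assumption and ignores memory bandwidth, turbo, and cache effects.

def scale_score(base_score, base_clock, base_cores, clock, cores):
    """Scale a multi-threaded score linearly by clock and core count."""
    return base_score * (clock / base_clock) * (cores / base_cores)

# Quad-core Jaguar at 1.5 GHz scores ~1.5; estimate 8 cores at 1.6 GHz (PS4):
ps4_estimate = scale_score(1.5, 1.5, 4, 1.6, 8)
print(round(ps4_estimate, 1))  # 3.2, matching the "3-3.2?" guess

# Q6600 scores ~2.7 at its stock 2.4 GHz; estimate it overclocked to 3 GHz:
q6600_oc = scale_score(2.7, 2.4, 4, 3.0, 4)
print(round(q6600_oc, 2))  # 3.38, close to the quoted "around 3.3"
```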
  • moozooh - Thursday, December 24, 2020 - link

    > I found that this year my CPU just isn't cutting it.. except for in cyberpunk oddly enough.

    That's because it's written for four cores and doesn't use any modern instruction sets (except one instance of AVX which can be safely patched out to play on pre-SB hardware). Then again, the same is applicable to most modern games, still, sadly. It's almost 2021, and almost no game makes good use of more than four cores.
  • raystryker - Friday, December 25, 2020 - link

My Core 2 Quad Q6600 and R9 270 outpaced both consoles with no trouble... I wouldn't say "crushed" but yeah. I gamed on that combo through 2018....
  • Hrel - Saturday, January 23, 2021 - link

    The performance you're indicating is indicative of a bad overclock. Probably also throwing lost bits in memory like crazy. Your hardware is capable of FAR FAR more than you're getting.
  • bunnyfubbles - Friday, March 26, 2021 - link

    your 5820K @ 4.6GHz should be at least roughly equivalent to a Ryzen 2600 if not 3600 or modern, non K i5... if it really isn't cutting it when you appear to be setting yourself up for a GPU limitation with your choice of resolution, there is likely something critically wrong with your system.
  • MetaCube - Friday, December 25, 2020 - link

    "And that SB processor is still relevant even today can play a lot of games which came in 2020."

    Nice one.
  • powerarmour - Saturday, December 26, 2020 - link

    It's certainly interesting, but it's about 7 years too late performance wise.
  • shabby - Thursday, December 24, 2020 - link

    Disappointed you didn't test cyberpunk 2077 on it...
  • YB1064 - Thursday, December 24, 2020 - link

    He might have, but it probably crashed.
  • eastcoast_pete - Thursday, December 24, 2020 - link

    Had the same thought. Even if it would load, playing at 1-2 fps would be an experience one would want to forget.
  • shabby - Thursday, December 24, 2020 - link

I'm sure at 360p it could pull off a solid 30fps 😂
  • eastcoast_pete - Thursday, December 24, 2020 - link

    Good one! Ian, can you comment on whether it would even load Cyberpunk 2077, never mind run it?
  • Ian Cutress - Thursday, December 24, 2020 - link

I don't actually own it. 🤪
  • brucethemoose - Thursday, December 24, 2020 - link

It runs on Adrenalin 17.

"...our system shipped with beta versions of Adrenaline 17.12, which indicates we have December 2017 drivers. None of AMD's regular driver packages will recognize this system as it uses a custom embedded processor. Some games will refuse to run because the drivers are so old."
  • beginner99 - Thursday, December 24, 2020 - link

These Jaguar cores were kinda slow/subpar when the consoles released, and now it's just ridiculously bad. Actually amazing what console devs managed to do with these.
  • d0x360 - Thursday, December 24, 2020 - link

    True but also remember how tightly optimized you can get with your code when you know the hardware and also when you can code to the hardware without needing to worry so much about abstraction layers, other overhead and DRM that a publisher might slap on after you've already optimized the game.

Great example there is ubi soft. If you bought AC Odyssey on Steam then there were 4 or 5 layers of DRM: Uplay's, Steam's, Denuvo, and another Denuvo-caliber one whose name escapes me, and there may have been one more.

Anyways, it was the 2 main DRM schemes (Denuvo and the other big one lol) chewing up CPU cycles. I remember playing that game and seeing CPU use average 70%, but then I tried a cracked version (I owned the game) and CPU use dropped to a 40% average.

    So there are lots of extra things on the pc side that get in the way and that's not even counting problems with the system.
  • FunBunny2 - Thursday, December 24, 2020 - link

    "also remember how tightly optimized you can get with your code when you know the hardware and also when you can code to the hardware without needing to worry so much about abstraction layers, other overhead and DRM that a publisher might slap on after you've already optimized the game."

    you can thank Mitch for writing 1-2-3 only to 8086 assembler and DOS. from that gossamer layer of an 'operating system' we got neato games and viruses that happily fiddled the hardware.
  • Flunk - Thursday, December 24, 2020 - link

    I'm hoping the new consoles will bring better AI and interactivity because they have massively better CPUs than the previous generation.
  • StuntFriar - Thursday, December 24, 2020 - link

    I was working in a dev studio at the time that was using its own in-house engine for a series of niche sports titles (Cricket, Rugby, etc...) and we couldn't believe how weak the Jaguar cores were compared to the PowerPC cores on the PS3 and 360.

    The GPUs on the PS4 and XBone were obviously superior but we were heavily bottlenecked on the CPU side, because our engine made use of a managed Lua interpreter for all of our game scripting (basically, the majority of gameplay code) which ran on the main thread.

    These were games that easily ran at 30fps on a PS3 which were now struggling to hit 15fps on an XBone. The team had to port a lot of the Lua to native C++ code, as well as a huge amount of other optimisations to get it running well.
  • Jorgp2 - Friday, December 25, 2020 - link

What?

These 8 out-of-order Jaguar cores should be an order of magnitude faster than the in-order cores of the PS3.
  • StuntFriar - Saturday, December 26, 2020 - link

    As a whole, yes. Core for core, the old PPC cores on the PS3 and 360 were faster at some things.

    Most game engines in 2010 were heavily single-threaded with only a few things handled on other threads. If you were migrating to the PS4 and XBone, there was usually a fair bit of work to do.
  • lmcd - Sunday, December 27, 2020 - link

    PS3 had 2 PPE cores and 6 SPE cores. The 2 PPE cores certainly exceeded Jaguar cores for interpreted workloads, considering how narrow Jaguar designs are, how limited their cache was/is, and the fact that (IIRC) a Jaguar core's out of order depth is lower than that of a PPE PowerPC core.
  • at_clucks - Saturday, December 26, 2020 - link

@StuntFriar, the reason PS3 games may have issues when running on the Jaguar is the same reason why they may have issues running on much faster x86 CPUs, and the reason many console games run like crap on much faster PCs. The porting isn't perfect (read "they're often a mess") and the Cell CPU in the PS3 is very different from x86. Devs had a bad time writing for the PS3 so "porting" usually meant "rewrite from scratch". Now why rewrite a particular game to hit 1:1 between the PS3 and the next gen consoles when the latter could vastly outperform the PS3 in every regard, so you'd actually go for much higher targets? And many of the inexperienced devs that tried to just port PS3 to x86 1:1 of course had an even worse time because they were trying to "adapt" PS3 code to run on something vastly different. Maybe let us know what games ran 30FPS on PS3 but were "struggling" to hit 15FPS on XBone (I know of absolutely no game that runs at 15FPS on XBone). But you were a dev back then so I'm sure you know all this ;).

    The PS3 was a dead end from a programming perspective so any attempts to "port" instead of rewrite was destined to fail badly especially at the hands of the average code monkey. There are things that run great on paper but "porting" them to a digital computer is not that great if you *really* want to replicate everything 1:1. Rewriting has a reason.

    It's like saying you could easily hit 30Km/h on a bicycle but on a motorcycle you can barely hit 2Km/h while pedaling the wheels with your own 2 feet just like on the bicycle, proof that motorcycles are slower.
  • StuntFriar - Saturday, December 26, 2020 - link

    We didn't release the game at 15fps on PS4/XBone - we optimised it so that it ran better.

Read what I wrote again. We had a managed Lua VM running on the main thread. This is typically fine because the single PowerPC core on the Cell processor (or each of the equivalent ones on the 360) was strong enough to run it along with the C++ main game thread.

    A single Jaguar core couldn't match it, which is why we had to rely less on Lua and ported most of it to C++. The engine guys also did some optimisations to spread some of the work across the other cores - the engine also ran on the 360 and already farmed off stuff like animation on a separate thread. I don't know the exact details because I wasn't on that project but they got it together in the end and it ran fine on PS4 and XBone when released.
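A toy frame-budget model of what is described above: a single-threaded script workload that fits a 30 fps frame on one strong core blows the budget on a weaker one. All timings below are invented purely for illustration:

```python
# Toy main-thread frame budget: script time scales inversely with the
# core's interpreted-code speed. Every number here is invented for
# illustration; it is not measured data from the actual engine.

def fps(script_ms_on_ref, core_speed_vs_ref, other_ms):
    """Frame rate when script + other engine work share one thread."""
    frame_ms = script_ms_on_ref / core_speed_vs_ref + other_ms
    return 1000.0 / frame_ms

# Hypothetical: 20 ms of Lua per frame on a PPE-class reference core,
# plus 13 ms of other main-thread work -> roughly 30 fps.
print(round(fps(20, 1.0, 13)))  # 30

# The same script on a core half as fast at interpreted code -> ~19 fps.
# The fix described above: port most of the Lua to C++ and spread work
# across the other cores, shrinking the main-thread cost.
print(round(fps(20, 0.5, 13)))  # 19
```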
  • at_clucks - Tuesday, December 29, 2020 - link

It's like saying that gas is better than electric because you tried putting gas in your battery and it couldn't handle it. Yes, the Jaguar cores can't handle code that was written and optimized specifically for some completely different type of core. No wonder most people consider game devs bottom of the barrel devs: very little understanding in general, very low quality code. And before you argue, read what you wrote again. And think if you'd want any software to run with the same kind of glitches and crashes games do.
  • at_clucks - Tuesday, December 29, 2020 - link

And just to not leave you hanging, since you won't reach the conclusion yourself: the recent Xbox 8-core Jaguar sits at ~150GFLOPS. Which is... almost exactly what the Cell could do. And I won't get into integer performance, where the Cell still pretty much used an abacus. Now given the difficulty of actually optimizing for the Cell, since good game devs are rarer than hen's teeth, pretty much no game got close to that, while on x86 even code monkeys could do it.

    It's relatively hard to compare *CPU to CPU* since the Cell and AMD's APUs are very different. But they switched to x86 because it was better in all shapes and forms, better power, better performance, and you could actually get that performance even with code monkeys. And it's a good thing they did, even with ~15 years of (claimed) hindsight your understanding is still that code written and optimized for one architecture and design ran poorly when straight up ported to another one that was as different as they could get.
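The ~150 GFLOPS figure above is reproducible with the standard peak-FLOPS formula; the clock and the 8 single-precision FLOPs per cycle per core (128-bit SIMD add + multiply pipes, no FMA on Jaguar) are assumptions for illustration, not official numbers:

```python
# Peak single-precision throughput = cores x clock (GHz) x FLOPs/cycle.
# Assumed: 8 Jaguar cores at 2.3 GHz (Xbox One X), 8 SP FLOPs per cycle
# per core (128-bit add + mul pipes, no FMA). Illustrative only.

def peak_gflops(cores, clock_ghz, flops_per_cycle):
    return cores * clock_ghz * flops_per_cycle

print(round(peak_gflops(8, 2.3, 8), 1))  # 147.2 GFLOPS, i.e. the "~150" above
```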
  • StuntFriar - Saturday, December 26, 2020 - link

    Also worth pointing out that it was a multi-platform engine, which ran pretty well on a contemporary Core 2 Quad / Core i3 and budget DX9 GPUs.
  • The_Assimilator - Thursday, December 24, 2020 - link

    Why would anyone want to build a 2020 system with these 2016 processors, which were garbage back then and are even more so now?
  • Qasar - Thursday, December 24, 2020 - link

    heh, why not ??
  • brucethemoose - Thursday, December 24, 2020 - link

    > You can get the board shipped for $125
  • JfromImaginstuff - Friday, December 25, 2020 - link

    Oh hmmmmm "Just because" seems reason enough
  • JfromImaginstuff - Friday, December 25, 2020 - link

    As an addendum, it makes no sense, that was just sarcasm
  • Hifihedgehog - Tuesday, December 29, 2020 - link

For an arcade cabinet, maybe? :) Heck, when arcade cabinet building became big in the early 2000s, many people used Xbox motherboards. Using this would be very reminiscent of that. For a $120 board, it is not bad, and while emulation will be restricted to the Dreamcast era and earlier given the slowness of the CPU, it offers a different flavor of flexibility from your typical RetroPie system.
  • lmcd - Thursday, December 24, 2020 - link

    The same reason why this is available is also why it's so bad. DDR3 was a bottleneck even in low end GPUs at the time, let alone a midrange unit like in the Xbox One. Doubling memory bandwidth would probably get 35% performance gains or higher. Obviously that would be complicated on this platform.
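The gap described above is easy to put numbers on with the usual theoretical-bandwidth formula (channels x 64-bit bus x transfer rate); the configurations below are illustrative:

```python
# Theoretical DRAM bandwidth: channels x 8 bytes (64-bit bus) x MT/s.

def bandwidth_gb_s(channels, mt_per_s):
    return channels * 8 * mt_per_s / 1000  # decimal GB/s

print(bandwidth_gb_s(2, 1600))  # 25.6   - dual-channel DDR3-1600
print(bandwidth_gb_s(2, 2133))  # 34.128 - dual-channel DDR3-2133
print(bandwidth_gb_s(4, 2133))  # 68.256 - the console's 256-bit DDR3,
                                # which it further backs with 32 MB eSRAM
```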
  • eastcoast_pete - Thursday, December 24, 2020 - link

    Wasn't there a console-like system on the market in China that was based on a custom-built Ryzen-based APU, and used GDDR memory? I believe that even runs Win10.
  • brucethemoose - Thursday, December 24, 2020 - link

    Yep:

    https://www.anandtech.com/show/13381/subor-z-conso...

    https://www.eurogamer.net/articles/digitalfoundry-...

    And then the team supposedly got disbanded: https://www.eurogamer.net/articles/2019-05-15-subo...

    Both publications got their hands on the physical hardware, but they never got a chance to thoroughly bench it, as far as I can tell.

I'm surprised it wasn't mentioned here, unless I missed it.
  • Ian Cutress - Thursday, December 24, 2020 - link

    I'm actually quite embarrassed I never gave the Subor Z console a full review.
    I really should do that, despite the fact it's not available any more, even in China.
  • cosminmcm - Thursday, December 24, 2020 - link

    Do it!
  • Hifihedgehog - Tuesday, December 29, 2020 - link

    Yes!!!
  • brucethemoose - Friday, December 25, 2020 - link

    Y'all still have it!?

    Do it! I'd love to read it too!
  • Jorgp2 - Friday, December 25, 2020 - link

That's what I was going to ask
  • eastcoast_pete - Friday, December 25, 2020 - link

Please do! If nothing else, it's an interesting, even unique setup, in some ways similar to Apple's M1. How so? Essentially an all-in-one design with capable graphics and unified memory, just like Apple's mobile SoC, but in x86/x64, and here with higher-bandwidth memory (GDDR); I would also love to know if and how the usually higher latencies of VRAM affect the CPU part of the performance.
Similar to what others have written here: if Microsoft would sell their Series S or (better) X with Win 10 for maybe $100 more than the consoles, I would be very interested. That is, if running the CPU with VRAM doesn't hurt everyday computing tasks too badly.
  • loki1944 - Thursday, December 24, 2020 - link

    I think you mean Xbox one X, not S, for 2560 graphics cores.

    "All three had eight Jaguar cores, but varied in graphics cores, from 768 for the One and One S, up to 2560 in the Xbox One S"
  • Kangal - Thursday, December 24, 2020 - link

    Ian's also got some graphs wrong in the CPU Benchmark section. He forgot to upload the CineBench r20 Multi-Thread graph, and instead has uploaded the Single-Thread graph TWICE. Lol.
  • Ian Cutress - Thursday, December 24, 2020 - link

    R20 graph is replaced with Dwarf Fortress.
All the other 80+ benchmark tests and data are in www.anandtech.com/bench - I didn't think padding out several pages with all of our data would be helpful.
  • Kangal - Saturday, December 26, 2020 - link

    No problems.
    I know to get the best idea of how a chipset performs, you need as many benchmarks as possible. I've just generally found CineBench to be a reliable way to gauge overall/rough performance. Thanks for the anandtech benchmark tabs, I do like the update, though needs more data points :D
  • azfacea - Thursday, December 24, 2020 - link

Sounds like a solution without a problem. Much more expensive than a Raspberry Pi and other SBCs for pretty much all things CPU. I guess it has more GPU, but also way higher power, and the GPU is nothing amazing in 2020. So what's the point?
  • Fulljack - Thursday, December 24, 2020 - link

It's literally for a niche market in Japan, as the article states.
  • brucethemoose - Thursday, December 24, 2020 - link

    Can y'all run some GPU compute benches?

    This could potentially be used as a nice budget renderfarm board, especially if the framebuffer can be bumped up. Also, I bet CPU <-> GPU transfers are pretty fast.
  • Glock24 - Thursday, December 24, 2020 - link

    Can it use higher clocked DDR3 RAM? For example DDR3-1866?
  • MrCommunistGen - Friday, December 25, 2020 - link

    Yeah, I'd be really interested to see if performance increases appreciably using some faster RAM.

I only have experience with two AMD APUs from this era, but both are in prebuilt systems scavenged from my company's eWaste pile:

1. A cheapo Acer tower with an A10-7800 (Kaveri) which heavily throttles the CPU cores if there is any 3D load -- causing a CPU bottleneck in many cases. It shipped with DDR3-1600. I bought some Crucial JEDEC-spec DDR3L-1866 (no XMP profile required) to see if that'd help at all, but the system seems to be capped to 1600MHz despite the A10-7800 supporting DDR3-2133.

    2. An even more cheapo HP "tower" which ended up being an empty box with an external power brick and a mobile A8-6410 (Beema) on an ITX motherboard. Despite Beema officially supporting DDR3L-1866, this machine is also capped to 1600MHz.

All of this is to say, I wouldn't be at all surprised if, even when paired with faster DDR3 sticks, the A9-9820 didn't actually run at higher memory speeds.
  • boozed - Thursday, December 24, 2020 - link

    Apologies in advance...

    RRRAARRWHHGWWR!
  • Bigos - Thursday, December 24, 2020 - link

    Does it run Linux?
  • abufrejoval - Thursday, December 24, 2020 - link

    Android x86 should be fun to try!
  • Oxford Guy - Thursday, December 24, 2020 - link

    'Why console processors have never made it into the PC market before'?

    Because Jaguar has worse IPC than even Piledriver, right?

    'At the time AMD had two CPU core designs that were worth some merit: Bulldozer cores, with its 1 core/2 backend design that has since been branded a large dumpster fire for the company, or Jaguar cores, aimed more for the low power/high efficiency market. Weighing in the performance target of this generation of consoles, both companies decided on having eight Jaguar cores.'

    Or Jaguar cores, aimed at making more money for AMD because the die size is smaller -- not because the CPU is any better than Piledriver. On the contrary.

    The parasitic 'consoles' drained life out of PC gaming with these terrible cores. There is no good reason to have two artificial x86 walled gardens in addition to the PC gaming platform. Yes, there are reasons but none of them outweigh the cost, from the point of view of the consumer rather than companies like Sony, MS, and AMD.

These "consoles" actually put AMD in the position of competing against the PC gaming platform. So much for the common unjustified belief that AMD is some kind of white knight on a horse, fighting the good fight for the PC gamer. Instead, the company was content to feed us Polaris, a weak design from the start designed to make money by keeping die size small, and lame GPUs like the Radeon VII. The transistor count in that one is nothing to write home about. All the marveling about how much performance AMD has been able to get on 7nm since then generally neglects to mention that the Radeon VII was not a high-performance design in the first place.
  • Alistair - Thursday, December 24, 2020 - link

    Polaris is one of the best GPUs ever made. Relax. The CPU might have been bad but the GPUs in the PS4 and later Polaris were both great.
  • Oxford Guy - Friday, December 25, 2020 - link

    Polaris was a low-end design when it was released and it was rehashed endlessly.
  • Jorgp2 - Friday, December 25, 2020 - link

    Jaguar has a higher IPC than Piledriver.
  • Oxford Guy - Friday, December 25, 2020 - link

That's not what I had read, but okay. Was there time to update the design beyond Piledriver in time for the "consoles"?

And, if Jaguar really was so competitive, then why did AMD use Piledriver, Steamroller, and Excavator?
  • jjem002 - Saturday, December 26, 2020 - link

They continued with Piledriver and such since those could clock 50%+ higher than Jaguar, so in the desktop space the performance lost from IPC could be made up for by increasing the power limits.
  • AntonErtl - Saturday, December 26, 2020 - link

    As someone pointed out, clock rate matters as much as IPC; Piledriver has up to 5GHz (4.3GHz @ 95W), while Jaguar was available only up to 2.4GHz (Opteron X2170).

    As for IPC, I can only give you Bobcat (AMD E-450) and Excavator (Athlon X4 845) numbers. On our LaTeX benchmark, I see IPC=0.97 on the Bobcat and IPC=1.35 on the Excavator, with the overall result that the Athlon X4 845 runs the benchmark 3.2 times as fast as the E-450.

Supposedly Jaguar has 15% higher IPC than Bobcat, and Excavator also has some IPC advantage over Piledriver, so maybe Jaguar has roughly the same IPC as Piledriver.
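Those numbers hang together under the usual decomposition performance = IPC x clock. The IPC figures are the quoted LaTeX-benchmark measurements; the clocks (E-450 at 1.65 GHz, Athlon X4 845 boosting to around 3.8 GHz) are assumed values, not part of the measurement:

```python
# Speedup between two cores on the same instruction stream:
#   speedup = (IPC_b / IPC_a) * (clock_b / clock_a)
# IPC values are the quoted LaTeX-benchmark measurements; the clock
# speeds are assumed nominal/boost values, not measured.

def speedup(ipc_a, clock_a, ipc_b, clock_b):
    return (ipc_b / ipc_a) * (clock_b / clock_a)

# E-450 (Bobcat, IPC 0.97, ~1.65 GHz) vs X4 845 (Excavator, IPC 1.35):
print(round(speedup(0.97, 1.65, 1.35, 3.8), 2))  # ~3.21, near the 3.2x measured
```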
  • Oxford Guy - Saturday, December 26, 2020 - link

    "so maybe Jaguar has roughly the same IPC as Piledriver"

That makes more sense. Since Piledriver was a very unoptimized design, it seems very reasonable to conclude that better performance could have been had from the CMT path, had AMD bothered to do it. Instead, AMD went cheap, both for the "consoles" and in how it did Steamroller and Excavator. Similarly, that's how we got Polaris forever.

    Piledriver was designed, intentionally, to rely on high clocks to be relevant. The thing is, though, it could run at a good clock (see the 8320E) with decent power consumption. I doubt that Jaguar had anything to offer beyond smaller die size on a cheap inferior node (28 nm bulk).

    Maybe that's impressive to some but it doesn't impress me. The "consoles" with their super lame Jaguar design, were a drain on the PC gaming platform. People paid to be suckered.
  • Gigaplex - Thursday, December 24, 2020 - link

    The fact it's impossible to get drivers for this suggests that this is not an AMD sanctioned release. It's a deal-breaker for Windows usage. It would be interesting to see if the Linux drivers work with it.
  • Alistair - Thursday, December 24, 2020 - link

    Come on Microsoft, release the Surface desktop already, build it on the Series S's modern 7nm CPU. $300 for the console, charge $600 for the PC, I'd buy a ton of them.
  • dontlistentome - Saturday, December 26, 2020 - link

    Me too. Dinky case, multiple monitor output and quiet fan please. Maybe a square version of the Series S in sensible matt black.
  • zodiacfml - Thursday, December 24, 2020 - link

Saw this on AliExpress back then, but a quick search on its performance easily makes this a pricey and environmentally unfriendly paperweight. I find it hard to believe Chuwi would bother with this. They're better off with those old Xeon processors, with plenty of availability and performance, though discrete graphics won't be an option for Chuwi.
  • Ian Cutress - Thursday, December 24, 2020 - link

    It looks like Chuwi is making sure that their hardware all meets 2.35 GHz and 896 SPs. The Aliexpress ones seem to vary in what is actually unlocked.

    Chuwi made this for the Japanese PC market. It's a very unique market.
  • zodiacfml - Friday, December 25, 2020 - link

That is impressive; I think they're paying a lot more for this part/spec. I don't expect this to be affordable, maybe $500, though that's cheap for the Japanese market.
AMD and MS should do something about this console/PC for the present and the future, maybe bundle 2 or 3 games with the console plus the price of the Windows OS...
  • orangpelupa - Thursday, December 24, 2020 - link

Have you tried manually updating the driver through Device Manager, browsing to the .inf file from the official AMD driver that's the closest to this weird APU (R7 or RX)?

    Windows will complain blah blah blah, just ignore it. Keep ignoring it and try each driver one by one (in the same family).

    Also try to disable windows driver signature enforcement, then manually edit the official amd driver inf with this gpu VID.

I used to do those things to update the Radeon GPU on my old Asus laptop.
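The procedure described above boils down to adding this GPU's PCI hardware ID to a newer driver's .inf device list. A minimal sketch of that edit; the section name, the install section `ati2mtag_R575`, and the `DEV_XXXX` hardware ID are placeholders rather than real values, and the modified .inf still requires driver-signature enforcement to be disabled:

```python
# Sketch: append a hardware-ID line to an .inf device section so Windows
# will match the driver to this GPU. Section names, the install section
# "ati2mtag_R575", and the DEV_XXXX ID are placeholders for illustration.

def add_hwid(inf_text, section, device_name, hwid):
    """Insert a device line right after the named [section] header."""
    out = []
    for line in inf_text.splitlines():
        out.append(line)
        if line.strip().lower() == f"[{section}]".lower():
            out.append(f'"{device_name}" = ati2mtag_R575, {hwid}')
    return "\n".join(out)

sample = ('[ATI.Mfg.NTamd64.10.0]\n'
          '"Radeon RX 560" = ati2mtag_R575, PCI\\VEN_1002&DEV_67EF')
patched = add_hwid(sample, "ATI.Mfg.NTamd64.10.0",
                   "AMD A9-9820 GPU", "PCI\\VEN_1002&DEV_XXXX")
print(patched.splitlines()[1])  # the newly inserted device line
```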
  • Ian Cutress - Thursday, December 24, 2020 - link

    I didn't go too deep. The Chuwi Aerobox was working as-is, so I left it there when it wasn't as straightforward as it could be.
  • BushLin - Thursday, December 24, 2020 - link

    I'd bet it would take less time to paste the PCI identifier into a recent driver .inf file than it'd take to figure out which games don't run and write the article section about the drivers.
  • Lord of the Bored - Saturday, December 26, 2020 - link

    But the article section about the drivers still needs to be written. You can't just gloss over a complete lack of drivers on the internet, even if you do get to provide driver modding instructions.
  • abufrejoval - Thursday, December 24, 2020 - link

    And here I thought this was going to be a piece on how to recycle discarded consoles as Windows 10 thin clients... (or Android-x86 for that matter).

    Because there is going to be a lot of these going into most likely rather eco-fiendish recycling very soon.

Still, what I find puzzling is that 8-core code must have become the norm with that generation of consoles, yet even the latest and greatest like Microsoft's latest Flight Sim never seem to touch more than perhaps 2-3 cores, while they take ages to get going and perform rather poorly even on slightly above average hardware (using a Ryzen 7 5800X/RTX 2080ti combo).
  • Eirikr - Friday, December 25, 2020 - link

    Would love to see you this performance under Linux with the latest open source graphics drivers
  • Eirikr - Friday, December 25, 2020 - link

    Meant to say would love to see you review the performance.
  • Ptosio - Friday, December 25, 2020 - link

If Microsoft really cared about the environment to even 1% of the extent of their greenwashing attempts, they would allow Windows software to be run on their Xbox machines.

Then people who already own powerful x86 and GPU units in the form of a console would not be forced to essentially buy the same thing twice, cluttering the planet in the process.
  • Oxford Guy - Friday, December 25, 2020 - link

Don't forget Sony, the company that advertised the ability to run Linux on the PS3 to sell them and then yanked away the ability. Then, in court, the company tried to call the plaintiffs bad-apple 'nobodies'. The company argued that 'no one' had any interest in running Linux on the machines, of course after the military among others did exactly that. A relentless pattern of bad faith from a corporation? Perish the thought.
  • osx86h3avy - Saturday, December 26, 2020 - link

My understanding was that allowing Linux was a knee-jerk reaction to the OG Xbox being hacked to run Linux et al, and Sony was like "pfft, y'all want it, we will give it to ya" in a sort of bad-faith way originally... not that it makes it OK to then drop support later like they did, but it always struck me as more of an FU to M$ than serious consideration for the community
  • Oxford Guy - Saturday, December 26, 2020 - link

    Sony chose to make effort to advertise the feature. It wasn't just listed in small print.

    Sony then pulled a bait and switch. I think I heard the suggestion that it was done as a way to protect BluRay discs from being ripped. Regardless of the reason, Sony had no legal or ethical way to cut a feature out of the product it had sold to people. It's like the scandal of e-book providers reaching into your device to delete books you purchased.
  • Lord of the Bored - Saturday, December 26, 2020 - link

    To my recollection, it was for tax purposes. It let them call it a computer rather than a game console in some regions.
  • Rookierookie - Saturday, December 26, 2020 - link

    Microsoft obviously doesn't want to affect the Windows PC market.

    Sony, though, I don't know. They sell PCs, yes, but I don't know if VAIO is big enough that Sony would actually be harmed if they enabled the capability on their consoles. They've also done something similar before, by making the PS2 a DVD player as well.
  • Oxford Guy - Saturday, December 26, 2020 - link

    Microsoft obviously wants to have its cake (a "console" it can make money on) and eat it, too (Windows PC gaming).

    Too bad the public isn't smart enough to say no and demand a common software layer (Vulkan + OpenGL) to go with the common hardware layer (x86).
  • Alexvrb - Friday, December 25, 2020 - link

    This would have made *SOME* sense if it was released years ago and used faster DDR3 in a quad channel configuration like the consoles themselves. Even better if they had worked with AMD to enable the eSRAM but that's a long shot for a niche APU. The actual configuration used is just awful for anything other than dumping old stock of DDR3.

    At least we are finally seeing a similar cache used to good effect on PC with the RX 6000 series.
  • eastcoast_pete - Friday, December 25, 2020 - link

    Now I know they aren't really directly comparable, but how much better or faster is this setup when compared with, let's say, a Snapdragon 865+ based gaming smartphone like the Asus ROG III? Is there a way to run Aztec Ruins or whatever on this Chuwi box, just to get an idea? My strong suspicion is that the gaming smartphone will actually hold its own.
  • B3an - Saturday, December 26, 2020 - link

I'd love to see this compared to high-end phone SoCs from the last few years. I've been saying all this time that they've been faster than the base XO console for a long time now.
  • lmcd - Monday, December 28, 2020 - link

    The SRAM cache enables nearly double the memory bandwidth in scenarios that are apparently very important and/or very common, so it's not a small thing to gloss over. And likewise this isn't a platform Windows is at all optimized for. I agree that in CPU-bottlenecked scenarios you're clearly right but it's pretty easy to push a videogame toward GPU-bottlenecked imo.
  • osx86h3avy - Saturday, December 26, 2020 - link

    I didn't read all the comments, so someone may have mentioned this already, but if they used a smaller box and added a quad-port gigabit NIC (if possible), it could make a neat platform for a pfSense router. I dunno though; not if it runs at 150W, but perhaps with the right NIC they could get that down. Or perhaps add more SATA ports for a cheap NAS? Other than that, it seems like more of a curiosity than a real option. And really, there are already so many far cheaper and better solutions for those uses that it doesn't make much sense, unless they can cut costs somewhere, like with just 4GB of onboard memory, eMMC storage, no audio, etc. But then it's a whole other beast, all for the sake of just not throwing away decade-old silicon, which it seems is the best thing to do with these.
  • phoenix_rizzen - Saturday, December 26, 2020 - link

    The table on the first page doesn't match up with any of the text describing the APU in the Aerobox.

    CPU and GPU specs are incorrect.
  • regsEx - Sunday, December 27, 2020 - link

    And they're wondering why CB2077 runs at 20 fps on that ancient, slow hardware. It's actually doing very well on that hardware, as they said.
  • kaidenshi - Sunday, December 27, 2020 - link

    "The concept of the modern console built on the same x86 architecture as desktop and gaming PCs came into effect with the 8th Generation of consoles ā€“ weā€™re talking the Playstation 4 and the Xbox One, both of which used the semi-custom services of AMD to build specific processors for these machines."

    Ahem, the original Xbox would like a word...
  • Mitch89 - Tuesday, December 29, 2020 - link

    Was thinking the same thing. Original Xbox used a Pentium III running a modded version of Windows.
  • artk2219 - Sunday, December 27, 2020 - link

    It may have taken a little less time if you had used a toothpick to mechanically remove most of the paste before going to town with the alcohol. Either way, I haven't seen a paste job that bad in a while, although Dell does a similarly bad job more often than not.
  • Kuhar - Monday, December 28, 2020 - link

    Can you please add a comparison to the Apple M1? FPS should be about on par when using Rosetta, Windows, and heavy-duty games, right?
  • cheapcomputers - Monday, December 28, 2020 - link

    That was interesting. As soon as I started reading, I wondered how an Athlon 2c/4t would compare. Thanks for including some of that.

    I wonder if they can actually sell this garbage? Several months ago it would have been no problem for me to get an Athlon, mb, and DDR4 for less than $200.

    I wonder how my old a4-6300 would compare. I still use it occasionally to install OSes and for file storage.
  • dromoxen - Friday, January 1, 2021 - link

    This is the sort of system I might/could be interested in... except it's horribly outdated and slow. Something with decent performance and supremely low power usage (ergo very low heat output) would be more appropriate, e.g. a laptop WITHOUT a screen, keyboard, charger, or battery.
    Where most of these SFF items fail is their lack of graphics muscle.
    Thanks, NV, for killing off MXM (the micro-sized graphics card), which was Earth's last, best hope.
    The DeskMini seems far better??
  • Athlex - Monday, January 4, 2021 - link

    "despite there being no mention of MSI anywhere else on the product or through the stack"

    Looks like there's a Microstar part number just below the M.2 SATA slot - MS-7C28
  • Calista - Thursday, January 7, 2021 - link

    My biggest worry is Chuwi; that company has MASSIVE issues with quality control and has had them for years. They let the most obvious minor flaws slip, which makes me wonder what major flaws they also let slip.
