
  • jhoff80 - Sunday, June 2, 2013 - link

    This article and the power consumption stats just make me wish that Intel would just make it easier to get a hold of their -T chips for end users. A 35W or 45W chip would be great for me, but the only thing that has full retail availability is the 65W one. (And it's not because it's so early in launch, it's always been way too difficult to get -T versions.)
  • EnzoFX - Sunday, June 2, 2013 - link

    Not to mention expensive! You get the same results by undervolting/underclocking, typically.
  • Laststop311 - Monday, June 3, 2013 - link

    You are correct in a way, but you could undervolt the T series as well and get better thermal performance than the 65 W version; at least that is my experience. If I were making an HTPC, I would use the i7-4770T or the i7-4650T, if that's this year's equivalent of the i7-3770T. The power consumption is amazing, and proper 24 Hz is great for 1080p24 playback. An upgrade to the HTPC just isn't in my budget right now, and Ivy Bridge + GTX 660 isn't a bad HTPC. My PC budget is going to an ultrabook upgrade this year; the increased battery life and performance is insane. The i7-980X desktop still does not have a large enough upgrade to make it worth it. Ivy Bridge-E is not THAT much faster, and I don't think even Haswell-E next year will be enough to justify a desktop upgrade.
  • Death666Angel - Tuesday, June 4, 2013 - link

    "but you could undervolt the T series as well and get better thermal performance then the 65 watt version."
    Not to the same extent. The T series will already be running much tighter voltages than normal SKUs. While you may save 15% power consumption by undervolting a normal SKU, undervolting an already power-efficient SKU would probably yield sub-5% savings.
  • vnangia - Sunday, June 2, 2013 - link

    Well, it helps that there are 35W parts this time around - at least on the timeline. IVB didn't get any 35W parts, so the HTPC is still on SNB, and yeah, I could definitely use the incremental improvements to QuickSync.
  • jhoff80 - Sunday, June 2, 2013 - link

    Yes, but I'm not talking about only 35W specific chips. The i7-3770T was just as difficult to get as any other -T series chip, because they don't sell them to end-users directly.
  • vnangia - Sunday, June 2, 2013 - link

    I'm agreeing with you! What I was trying to say is that Intel announced low-TDP SNB parts and delivered: SNB had a bunch of -T versions available to end users at both the low end (G4xx, G5xxT, 2100T, 2120T) and the high end (2390T, 2500T). I bought my 2100T at a Microcenter B&M, for instance.

    By contrast, Intel didn't announce any end-user -T (and just a handful of -S) parts, and we saw that IVB had virtually no -T parts available. I'm optimistic that now that they've announced a few -T parts at the high end, we might actually see them materialize in the retail chain, and hopefully it bodes well for -T parts at the low end.

    Fortunately (*knocks on wood*) the current SNB-based HTPC is still going strong, so I don't feel the need to upgrade. If and when I do, though, I expect that it won't be so clear cut - I may end up going with AMD's lineup, despite the relative paucity of AMD ITX boards.
  • jhoff80 - Monday, June 3, 2013 - link

    Sorry, I must've misunderstood.
  • Krysto - Monday, June 3, 2013 - link

    This is insane. Why use a $400 Intel Haswell media box for 4K video when you can use the much cheaper and much more efficient Mali T622-based media boxes that should be appearing next year?

    http://blogs.arm.com/multimedia/977-a-new-branch-f...
  • NirXY - Monday, June 3, 2013 - link

    "should be appearing next year"
  • heffeque - Monday, June 3, 2013 - link

    Well... the AMD A4-5000 seems to be perfect for an HTPC, and I don't see it in this comparison.
    Why not try comparing what the AMD A4-5000 can do (4K, 23 Hz, etc.) versus this Haswell system?
    The CPU isn't that good, but there's no need for much CPU in HTPC systems, and also... the price. Just look at the price.
  • meacupla - Monday, June 3, 2013 - link

    When you play back Hi10P or Silverlight content, having a fast CPU helps immensely, since those formats don't have DXVA support.
  • halbhh2 - Tuesday, June 4, 2013 - link

    Consider prices: at a suggested $122, the new A10-6700 is going to be interesting as the real competition to this Intel chip.
  • majorleague - Wednesday, June 5, 2013 - link

    Here is a YouTube link showing the 3DMark11 and Windows Experience Index ratings for the 4770K 3.5 GHz Haswell, not overclocked.
    This is apparently around 10-20 fps slower than the 6800K in most games. And almost twice the price!!
    YouTube link:
    http://www.youtube.com/watch?v=k7Yo2A__1Xw
  • JDG1980 - Monday, June 3, 2013 - link

    You can't use madVR on ARM. And most ARM platforms are highly locked down so you may be stuck with sub-par playback software from whoever the final vendor is.
  • HisDivineOrder - Tuesday, June 4, 2013 - link

    Because we don't live in next year, Doc Brown?
  • BMNify - Wednesday, June 12, 2013 - link

    For the same reason that QS isn't being used far more today: Intel and ARM devs talk the talk, but they don't listen to, or even stay in contact with, the number-one video quality partners - the x264 and ffmpeg devs - and don't provide their patches for review and official inclusion in those two key code bases, so the ARM/Intel low-level video encode/decode APIs never actually get used.
  • MrSpadge - Monday, June 3, 2013 - link

    Use an i5 and the price drops almost in half. Then undervolt it a bit and a regular CPU will only draw 40-50 W under sustained load - which media playback doesn't create anyway.
  • Mayuyu - Sunday, June 2, 2013 - link

    2-pass encodes do not offer any improvement in compression efficiency in x264. The only time you would want to use a 2-pass encode is to hit a certain file size.

    QuickSync is irrelevant because its H.264 encodes are inferior in quality to Xvid (which has been outdated for a long time now).
  • raulizahi - Thursday, August 29, 2013 - link

    @Mayuyu, 2-pass x264 encodes using VBR do offer improvements in compression efficiency at the same video quality. I have proven it many times. An example: targeting 720p50 at 3 Mbps VBR, on the first pass I get a certain quality; on the second pass I get noticeably better quality.
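
    For reference, a minimal sketch of the kind of bitrate-targeted 2-pass encode being described here, assuming a standalone x264 build; the file names and the 3000 kbps target are placeholders:

        # Pass 1: analysis only; writes a stats file and discards the video output.
        x264 --pass 1 --bitrate 3000 --preset slow -o /dev/null input.y4m
        # Pass 2: re-encodes using the stats, spending the 3 Mbps budget where the
        # first pass found the content needs it most.
        x264 --pass 2 --bitrate 3000 --preset slow -o output.mkv input.y4m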
  • mindbomb - Sunday, June 2, 2013 - link

    The current version of madVR does support DXVA native, actually.
  • gevorg - Sunday, June 2, 2013 - link

    The near-$300 price of the i7-4765T is extremely prohibitive for HTPC use. The majority of users will find AMD's Trinity APUs perfect for the HTPC job.

    Also, unless Intel handicapped it, you should be able to downclock any i7 Haswell CPU to be near i7-4765T speed/TDP. This is possible with Sandy Bridge and Ivy Bridge chips.
  • meacupla - Sunday, June 2, 2013 - link

    The only problem with Trinity is the rather limited choice of mITX mobos, plus the rather high power consumption and thermal output, which makes them not ideal for compact HTPCs...

    Although, granted, at $300 for the CPU alone, I'd much rather buy an Xbox One or PS4.
  • HisDivineOrder - Tuesday, June 4, 2013 - link

    You just listed four problems while saying "the only problem with Trinity." That's the real problem with AMD's options: there's like "one problem" for everyone.
  • Spunjji - Tuesday, June 4, 2013 - link

    Except for those of us for whom there are none, and/or who are prepared to live with limitations so as not to have to shell out $300 on a CPU.
  • vnangia - Sunday, June 2, 2013 - link

    Very true. The SNB low-TDP parts were within spitting distance of their equivalent regular-TDP parts (about $25-50 more), not $200 more.
  • JDG1980 - Sunday, June 2, 2013 - link

    If you can wait six months or so, you're probably going to be better off going with Kaveri. AMD is going to substantially increase the GPU power of their APUs and switch to a homogeneous memory architecture so everything uses GDDR5. What little I've heard (which may not be reliable) seems to indicate that the GPU in Kaveri may be about on par with the discrete 7750. I don't know if they can pull that off, but if they even come close then they will have basically rendered all sub-$100 discrete GPUs obsolete.
  • lmcd - Sunday, June 2, 2013 - link

    Inaccurate. $100 GPUs will have improved by Kaveri's release. And AMD's drivers won't necessarily meet the expectations set here either.
  • medi02 - Monday, June 3, 2013 - link

    This driver FUD is getting old...
  • Spunjji - Tuesday, June 4, 2013 - link

    Very old, but don't expect it to stop.
  • HisDivineOrder - Tuesday, June 4, 2013 - link

    I've heard this song and dance before. It never happens. Plus, limiting people to pre-determined amounts of GDDR5 for an HTPC seems like an exercise in being stupid.
  • Spunjji - Tuesday, June 4, 2013 - link

    Yeah, I'm not buying that rumour. Doesn't make much sense.
  • JDG1980 - Sunday, June 2, 2013 - link

    It's good to see that Intel finally got around to fixing the 23.976 fps bug, which was the biggest show-stopper for using their integrated graphics in an HTPC.

    Regarding madVR, I'd be interested to see more benchmarks. How high can you push the settings before hitting a wall with GPU utilization? How about on the GT3e - if this ever shows up in an all-in-one Mini-ITX board or NUC, it might be a great choice for HTPCs. Can it handle the good scaling algorithms?

    My own experience is that anti-ringing doesn't add that much GPU load. I recently upgraded to a Radeon HD 7750, and it can handle anti-ringing filters on both luma and chroma with no problem. Chroma upscaling works fine with 3-tap Jinc, and luma also can do this with SD content (even interlaced), but for the most demanding test clip I have (1440x1080 interlaced 60 fields per second) I have to downgrade luma scaling to either Lanczos 3-tap or SoftCubic 80 to avoid dropping frames. (The output destination is a 1080p TV.) I suspect a 7790 or 7850 could handle 3-tap Jinc for both chroma and luma at all resolutions and frame rates up to full HD.

    By the way, I found a weird problem with madVR - when I ran GPU-Z in the background to monitor load, all interlaced content dropped frames. Didn't matter what settings I used. Closing GPU-Z ended the problem. I was still able to monitor GPU load with Microsoft's "Process Explorer" application and this did not cause any problems.

    Regarding 4K output, did you test whether DisplayPort 60 Hz 4K works properly? This might be of interest to some users, especially if the upcoming Asus 4K monitor is released at a reasonable price point. I know people have had to use some odd tricks to get the Sharp 4K monitor to do native resolution at 60 Hz with existing cards.
  • ganeshts - Monday, June 3, 2013 - link

    This is very interesting. What version of GPU-Z were you using? I will check whether my Jinc / anti-ringing dropped frames were due to GPU-Z running in the background. I did do the initial setup when GPU-Z wasn't active, but obviously the benchmark runs were done with GPU-Z active in the background. Did you see any difference in GPU load between GPU-Z and Process Explorer when playing interlaced content with dropped frames?
  • JDG1980 - Monday, June 3, 2013 - link

    I was using the latest version (0.7.1) of GPU-Z. The strange part is that the GPU load calculation was correct - it was just dropping frames for no reason, it wasn't showing the GPU as being maxed out. For the video card, I was using the newest stable Catalyst driver (13.4, I believe) from AMD's website. The OS is Windows 7 Ultimate (64-bit).

    The only reason I suspected GPU-Z is because after searching a bunch of forums to try to find out why interlaced content (even SD with low madVR settings) wouldn't play properly, I found one other user who said he had to turn off GPU-Z. I cannot say if this is a widespread issue and it's possible it may be limited to certain system configurations or certain GPUs. Still worth trying, though. Thanks for the follow-up!
  • tential - Sunday, June 2, 2013 - link

    I don't understand the H.264 Transcoding Performance chart at all - can someone help?

    QuickSync does more FPS at 720p than at 1080p. This makes sense.

    x264 on the Core i3 and Core i7 posts higher FPS at 1080p but lower at 720p. Why is this?
  • ganeshts - Monday, June 3, 2013 - link

    Maybe the downscaling of the frame from 1080p to 720p sucks up more resources, causing the drop in FPS? Remember that the source is 1080p...
  • tential - Monday, June 3, 2013 - link

    OK, so if I'm downscaling to 720p, why does FPS increase with QuickSync but decrease with the processor?

    They move in OPPOSITE directions: one increases (QuickSync), one decreases (CPU). Wouldn't it be the same both ways?
  • ganeshts - Monday, June 3, 2013 - link

    Downscaling is also hardware accelerated in QS mode. Hardware transcode is faster for 720p frames than for 1080p frames, and the time taken to downscale is much lower than the time taken to transcode the 'extra pixels' in the 1080p version.
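
    For a rough sense of scale, assuming standard frame sizes, the encoder has about 2.25x fewer pixels to compress per frame after the downscale:

        1920 x 1080 = 2,073,600 pixels per frame (1080p source)
        1280 x  720 =   921,600 pixels per frame (720p target)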
  • elian123 - Monday, June 3, 2013 - link

    Ganesh, you mention "The Iris Pro 5200 GPUs are reserved for BGA configurations and unavailable to system builders". Does that imply that there won't be motherboards for sale with the 4770R integrated? Will the 4770R only be available in complete systems?
  • StardogChampion - Monday, June 3, 2013 - link

    I am wondering about this comment as well. Everything I've read seemed to indicate it would be available in mini-ITX form for building AIOs (so likely thin mini-ITX). Haswell will be a big disappointment without availability of the BGA packages in mini-ITX form.
  • Sivar - Monday, June 3, 2013 - link

    Thank you for the article.
    Note that x264 is a specific software encoder, not a type of video or a thing that can be accelerated ("While full x264 acceleration using QuickSync...")
    H.264 is the video standard.

    Also note that x264, the CPU-based encoding software, does not need to run in 2-pass mode to get great quality. 2-pass mode is ONLY for when you want a specific file size regardless of quality. If you want a specific quality, you use quality (CRF) mode. --crf 23, for example, returns a small (though variable, depending on content) file size and good quality.
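
    For comparison with the 2-pass example earlier in the thread, a minimal sketch of the constant-quality invocation referred to here, again assuming a standalone x264 build with placeholder file names:

        # CRF mode: a single pass at a fixed quality level; file size varies with content.
        # Lower CRF means higher quality and larger files (23 is x264's default).
        x264 --crf 23 --preset slow -o output.mkv input.y4m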
  • ganeshts - Monday, June 3, 2013 - link

    Sivar,

    I did specifically want to mention full x264 acceleration using QuickSync -- That is because x264 is the H.264 encoder of choice for many users. The most beneficial addition to the CPU would be the ability to get hardware acceleration when using x264 with ANY set of options. That is simply not going to be possible with QuickSync (or, for that matter, any hardware-based encoder).

    Yes, agreed about the mistaken mention of 2-pass for improved quality. I will update it shortly.
  • Spawne32 - Monday, June 3, 2013 - link

    People always fail to realize a key element in every one of these releases: how small the enthusiast market truly is. All of us posting here in this comment section are a small fraction of the overall market Intel targets; this is part of the reason AMD suffers so tragically with their current lineup. Power consumption and price are the two biggest factors in a regular consumer's mind when purchasing a PC, be it laptop or desktop. Performance numbers rarely play a factor. I don't know what AMD is doing over there, but I long for a day when AMD can actually challenge Intel and drive prices down even further, because these $230-400 starting prices for "mainstream" Intel processors prove once again why I refuse to invest in them regardless of performance. The marginal increase in speed in my day-to-day activities does not warrant the price being paid for something that is obsolete in 1-2 years. AMD's highest-priced processor right now is $179.99; its comparable Intel counterpart in Haswell is $349.99. You do the math.
  • bji - Monday, June 3, 2013 - link

    Either the increases in speed with each successive generation are great enough to render previous generations obsolete, or the increases in speed with each successive generation are small enough that the previous generation is not rendered obsolete. You can't have it both ways just to try to make Intel look bad, sorry.

    I don't know what margin Intel is making on these parts - do you? Remember that they are sinking large R & D and transistor budgets into these minor speed increases, and at the same time sinking lots of money into developing the next generation of process technology. If $300 is not worth it to you, don't buy the part; Intel won't be able to sustain their R & D budgets if nobody buys the results.
  • Deuge - Monday, June 3, 2013 - link

    If one of the GT3 or GT3e parts comes out in a refreshed NUC, I'd love to see a review of it from an HTPC perspective. Very interested to hear if it can handle Lanczos + AR or Jinc.
  • dbcoopernz - Monday, June 3, 2013 - link

    Is the inability to use LAV with DXVA-native for madVR an Intel limitation? The devs of both the LAV filters and madVR have told me (on the doom9 forum) that DXVA-native is fine for madVR on AMD GPUs.
  • BMNify - Monday, June 3, 2013 - link

    DXVA native DOES work with AMD using LAV filters and MadVR... I'm using it as I type (watching MotoGP)
  • ganeshts - Monday, June 3, 2013 - link

    It also works with the Haswell piece. I will update the article ASAP.
  • BMNify - Monday, June 3, 2013 - link

    An APU is the go-to for HTPC builders. And stop with the power this and thermals that... undervolt it, toss in a Pico PSU, suspend to memory when not in use, and enjoy. Take the hundreds saved and buy a Kabini or two as clients.

    If we're talking balls-to-the-wall processing might, absolutely, let's talk Intel - but not for a simple HTPC.
  • meacupla - Monday, June 3, 2013 - link

    There's like... exactly one mITX FM2 mobo even worth considering, out of a grand total of two. One of them catches fire, and neither of them has Bluetooth or WiFi.

    LGA1155 and LGA1150 have at least four each.
  • TomWomack - Monday, June 3, 2013 - link

    The mITX FM2 motherboard that I bought last week has bluetooth and wifi; they're slightly kludged in (they are USB modules apparently glued into extra USB ports that they've added), but I don't care.

    The Haswell mITX boards aren't available from my preferred supplier yet, so I've gone for micro-ATX for that machine.
  • BMNify - Monday, June 3, 2013 - link

    I know you're trolling, but the fact is more people are content converting their 5-year-old C2D cookie-cutter desktop into an HTPC ($50 video card + case + IR receiver = job done) than buying all new kit.
    We reached the age of "good enough" years ago. Money is tight, and with all the available gadgets on the market (and more to come) people are looking to make it go as far as possible. Intel is going to find it harder and harder to get their high-margin silicon into the homes of the average family. Good-enough ARM mobile + good-enough x86 allows people to own more devices and still pay the bills. It looks like AMD has accepted this; they've taken their lumps and are moving forward in this "new world". I'm not sure what Intel's long-term strategy is, but I'm a bit concerned.
  • Veroxious - Tuesday, June 4, 2013 - link

    Agreed 100%. I am using an old Dell SFF with an E2140 LGA775 CPU running XBMCbuntu. It works like a charm. I can watch movies while simultaneously adding content. That PC is near silent. What incentive do I have for upgrading to a Haswell based system? None whatsoever.
  • kallogan - Monday, June 3, 2013 - link

    2.0 GHz, seriously??? The 45 W Sandy Bridge i5-2500T was at 2.3 GHz base and 3.3 GHz turbo. LOL at this useless CPU gen.
  • kallogan - Monday, June 3, 2013 - link

    Forget my comment - I didn't see it was an i7 with 8 threads. A 35 W TDP is not bad either. But the 45 W Core i7-3770T would still smoke this.
  • Montago - Monday, June 3, 2013 - link

    I must be blind... I don't see the regression you are talking about.

    HD4000 QSV usually gets smudgy and blocky, and I don't see that in HD4600... so I think you are wrong in your statements.

    Comparing the frames, there is little difference, and none I would ever notice while watching the movie on a handheld device like a tablet or smartphone.

    The biggest problem with QSV is not the quality, but the file size :-(
    QSV output is usually 2x larger than x264.
  • ganeshts - Monday, June 3, 2013 - link

    Montago,

    Just one example of the many that can be unearthed from the galleries:

    Look at Frame 4 in the 720p encodes in full size here:

    HD4600: http://images.anandtech.com/galleries/2836/QSV-720...
    HD4000: http://images.anandtech.com/galleries/2839/QSV-HD4...

    Look at the horns of the cattle in the background to the right of the horse. The HD4000 version is sharper and more faithful to the original compared to the HD4600 version, even though the target bitrate is the same.

    In general, when looking at the video being played back, these differences added up to a very evident quality loss.

    Objectively, even the FPS took a beating with the HD4600 compared to the HD4000. There is definitely some driver issue in managing the new Haswell QuickSync modes.
  • nevcairiel - Monday, June 3, 2013 - link

    The main Haswell performance test from Anand at least showed improved QuickSync performance over Ivy, as well as something called the "Better Quality" mode (which was slower than Ivy, but it was never specified what that really meant).
  • ganeshts - Monday, June 3, 2013 - link

    Anand used MediaEspresso (CyberLink's commercial app), while I used HandBrake. As far as I remember, MediaEspresso doesn't allow specification of a target bitrate (at least as of the time I used it, a year or so back), just "better quality" or "better performance". HandBrake does allow setting a target bitrate, so the modes being used by HandBrake might be completely different from those used by MediaEspresso.

    As we theorize, some new Haswell modes which are probably not being used by MediaEspresso are making the transcodes take longer and produce worse quality.
  • eio - Sunday, June 23, 2013 - link

    Great example! Very interesting.
    I agree with Montago that for most snapshots HD4600 is significantly better than HD4000, retaining much more texture, even for this frame 4 at 1080p.
    But at 720p, HD4600 shows the trade-off of keeping more fine-grained texture: it looks like HD4600 has regressed on low-contrast, large-scale structural information.
    As you said, this type of regression can be more evident in video than in snapshots.
  • eio - Sunday, June 23, 2013 - link

    Another thing that surprises me: x264 is a clear loser in this test. I don't understand why. What are the specific params that HandBrake used to call x264?
  • nevcairiel - Monday, June 3, 2013 - link

    @ganeshts

    I'm curious - what did you use for DXVA2N testing of VC-1?
    LAV Video doesn't support VC-1 DXVA2 on Intel, at least on Ivy Bridge, and I doubt Haswell changed much (although it would be a nice surprise; I'll see for myself in a few days).
  • ganeshts - Monday, June 3, 2013 - link

    Hendrik,

    I made a note that DXVA2N for interlaced VC-1 has software fallback.

    That issue is still not fixed in Haswell. That is why you see QuickSync consuming lower power compared to DXVA2N for the interlaced VC-1 sample.
  • zilexa - Monday, June 3, 2013 - link

    To be honest, now that I have a near-perfect Raspberry Pi setup, I would never buy a Core ix/AMD Ax HTPC anymore. It's a huge waste of money for an almost unnoticeable image quality improvement.
    The Raspberry Pi uses at most 6.5 W, usually much less. Speed in XBMC is no issue anymore, and it plays back all my movies just fine (Batman IMAX x264 rip, 7-15 Mbps). I play mostly downloaded TV shows, streams, and occasionally a movie. It also takes care of the whole download process in the background. So I don't even have a computer at home anymore. I sold my old AMD 780G-based Silverstone M2 HTPC for €170 and it was the best decision ever.

    Still cool to read about the high-end possibilities of HTPC/madVR - or really just video playback and encoding, because that's what this is about. But I would never buy a system just to support this. An HTPC, in my opinion, is for being in lazy mode and able to play back your shows/movies, photos, and streams in good HD quality and audio.

    If all you need is an HTPC, in my opinion there is no need for such an investment in a computer system which is meant for a huge variety of computing tasks.
  • jwcalla - Monday, June 3, 2013 - link

    It's going to depend on individual needs of course, and I think your Raspberry Pi is at the other extreme, but otherwise I kind of have the same reaction. This has got to be an $800+ build for an HTPC, and I begin to wonder if that is a practical approach.

    Owing to the fact that Intel's entire marketing strategy is to oversell to the consumer (i.e., sell him much more than he really needs), it seems that sometimes these reviews follow that strategy too closely. For an HTPC? Core i3 at the max, and even that's being generous. If one needs certain workloads like transcoding and such, then maybe a higher-end box is needed. But then I question whether that kind of stuff is appropriate for an HTPC.
  • superjim - Monday, June 3, 2013 - link

    Play back a raw M2TS 1080p 60 fps file on your Pi and get back to me.
  • phoenix_rizzen - Monday, June 3, 2013 - link

    How did you get around the "interface is not accelerated" issue on the RPi? I found it completely useless when trying to navigate the XBMC interface itself (you know, to select the show to watch). Sure, once the video was loaded, and processing moved over to the hardware decoder, things ran smooth as silk.

    I sold my RPi two weeks after receiving it due to this issue. Just wasn't worth the headaches. I've since moved to a quad-core Athlon II running off an SSD with a fanless NVIDIA dGPU. So much nicer to work with.
  • vlado08 - Monday, June 3, 2013 - link

    What about Frame Rate Conversion (FRC) capability?
  • ericgl21 - Monday, June 3, 2013 - link

    Ganesh,

    Let's assume you have two 4K/60p video files playing in a loop at the same time for a duration of 3 hours.
    Is it possible that Iris or Iris Pro could play those two video streams at the same time, without dropping frames and without the processor throttling throughout the entire movie playback ?
    I mean, connecting two 4K TVs, one to the HDMI port and the other to the DisplayPort, and outputting each video to each TV. Would you say the Iris / Iris Pro is up to this task? Could you test this scenario?
  • ganeshts - Monday, June 3, 2013 - link

    Haven't come across a 4Kp60 sample yet. All the stuff on YouTube is at a max. of 30 fps, and I have some samples sourced from other platforms that are QuadFHD at 30 fps. Please pass on any 4Kp60 clips that you have.

    I know there are two crazy scene encodes with 4Kp50 (Crowd Run 2160p) and a 250 Mbps one (Ducks Take Off). No hardware decoder I have seen has been able to handle either properly. So, I doubt 4Kp60 is going to work :| That said, if I get a chance, I will definitely evaluate the Iris / Iris Pro.
  • madwolfa - Monday, June 3, 2013 - link

    Why do you still need 23.976 Hz support now that madVR has the "Smooth Motion" feature? I couldn't care less now...
  • Dug - Monday, June 3, 2013 - link

    Because not everyone can use madVR, wants to, or even knows what it is - never mind setting it up properly.
  • HOSH - Monday, June 3, 2013 - link

    Personally, I think this is going in the right direction, but I am wondering what low-power settings we could use with the Core i7-4750HQ or the Core i7-4770R on a Mini-ITX HTPC-style board, since they both have the Iris Pro 5200. From reading the reviews here, the Iris Pro 5200 should be closer to what NVIDIA or AMD currently has to offer in HTPC discrete graphics, but on-die for a cleaner system.
  • Aikouka - Monday, June 3, 2013 - link

    Is it worthwhile to assume that the poor QuickSync performance is just a software problem? I've been interested in gaining QuickSync support, but the performance presented isn't that enticing.
  • ganeshts - Monday, June 3, 2013 - link

    Very much possible. I am going to evaluate a driver downgrade to see if the issue is in the latest drivers.
  • superjim - Monday, June 3, 2013 - link

    Why do we need an i7 for HTPC duty? A 45 W Core 2 Duo or Athlon II system is plenty when paired with a 6570/430 or better GPU. Sure, it uses more power, but that's hardly a problem (in either money or heat). What is the usage scenario for an HTPC that needs an i7?
  • Aikouka - Monday, June 3, 2013 - link

    They probably didn't have access to a lower-end Haswell processor... especially since Intel hasn't released the i3 Haswell processors yet.
  • superjim - Monday, June 3, 2013 - link

    Makes sense, but even an i3 is overkill for an HTPC, as another commenter suggested. I think Trinity has a pretty tight grip on the bang-for-the-buck HTPC right now. Richland will only make that better.
  • Penti - Monday, June 3, 2013 - link

    It really depends on the amount of post-processing done on the HTPC; a Trinity/Richland or Intel chip with integrated graphics, or something like an HD 6450, really isn't enough for everyone. Obviously a fast CPU is also good as a fallback when there is no hardware acceleration.
  • Penti - Monday, June 3, 2013 - link

    Plus old hardware is old and not available anymore.
  • phoenix_rizzen - Monday, June 3, 2013 - link

    Quad-core Athlon-II, CPU fan configured to spin down as needed, case fan unplugged, SSD, nVidia 210 GPU (fanless) running Linux + XBMC. Sub-$300 CDN.

    Why would you need an i7 for an HTPC? Why would you need a skookum dGPU? And why would you be transcoding on the HTPC? The HTPC should just play the movies on the screen that's attached to it, nothing more. The movies shouldn't reside on the HTPC, and you should be plugging in mobile devices to transfer movies to/from them. That's what the skookum "server" in the other room is for. :)
  • solnyshok - Wednesday, October 23, 2013 - link

    This is quite an old thread, but I wanted to add that it strikes me that my HTPC usage model is totally different from the one you described. I use an Atom-based HTPC (dual-core 2.1 GHz) which is on 24x7, doing playback to an HDMI 1080p TV plus torrents and file serving for the home network. It draws up to 10 W and is fanless. No madVR, though.
  • benamoo - Monday, June 3, 2013 - link

    I'm wondering why no one mentioned the upcoming Ouya console, and possibly the many more ARM-based media player boxes coming to the market next year.

    I've been an HTPC user for years now, but it's not worth it anymore to invest in such a costly/bulky/noisy system simply for HTPC tasks. Sure, repurposing an old system is great - actually, that's what I've been doing - but building a new one from scratch (especially with a Core i7) seems to be a huge waste of money IMO.

    I have high hopes for Ouya (and similar ARM/Android-powered boxes). Hopefully the experience will be so good that we can finally rid ourselves of this Wintel duopoly.

    Don't get me wrong, I still believe an HTPC is the best media center box out there. But these boxes can offer very similar results at a fraction of the cost.
  • rennya - Wednesday, June 5, 2013 - link

    Mainly because those ARM players have crappy GUIs and limited support for file formats and containers? Try playing a Matroska file that uses segment linking and has a 10-bit H.264 1080p24 video stream at 10 Mbps or more, a DTS-HD MA 7.1 track, and a fully-styled SSA subtitle track, and you will see the Ouya console crash and burn.
  • sireangelus - Tuesday, June 4, 2013 - link

    Can someone explain to me why people don't get themselves a laptop with some remote functionality and use that as an HTPC? Shouldn't it be less expensive, have a lower TDP, and be more useful?
  • HisDivineOrder - Tuesday, June 4, 2013 - link

    I sincerely hope they fixed the 23.976 bug in the IGP that is included with Bay Trail. If they did, there's your HTPC of choice for anyone not obsessed with MadVR.
  • halbhh2 - Tuesday, June 4, 2013 - link

    Such a careful review makes me want to see the new A10-6700 put through the same paces.
  • majorleague - Wednesday, June 5, 2013 - link

    Here is a YouTube link showing the 3DMark11 and Windows Experience Index ratings for the 4770K 3.5 GHz Haswell, not overclocked.

    YouTube link:
    http://www.youtube.com/watch?v=k7Yo2A__1Xw
  • eio - Saturday, June 22, 2013 - link

    According to the snapshots, to my eyes, the QSV quality of HD4600 is significantly better than HD4000 and x264...
  • Surlias - Friday, December 6, 2013 - link

    Hey, I just bought my first 120 Hz capable HDTV, and I'm wondering how to configure it for both gaming and proper 24p media playback. Can I set a custom resolution of 1920x1080 @ 120 Hz (GTX 770), just like you can with a 120 Hz monitor, and then leave it at that setting all the time? Then games would be able to VSync at up to 120 fps, and media would be able to lock in at 24p because it's an even divisor of 120. Or is this not possible on a TV due to HDMI limitations? If so, will it be necessary to manually switch back and forth between 60 Hz and a 24 Hz custom resolution depending on the usage situation (since 24 Hz would be awful for gaming)? I've always found this particular subject confusing and I'm hoping someone can help me understand how this works. FYI, I mostly use XBMC for media playback, which has an option to "sync to display refresh rate", which I assume would be essential for enabling 24p playback.
  • redmist77 - Tuesday, April 8, 2014 - link

    I'm amazed at how well Haswell locks the video and audio sync. I can output 23.976 Hz video to my plasma TV and all of the AV sync graphs in MediaPortal or MPC-HC are *dead* flat, and there's never a stutter. 24 Hz, 50 Hz and 59.94 Hz are all perfect too. That's possibly the holy grail of an HTPC. I haven't found any other bugs (DXVA2 etc.) either, so it's looking good. DTS-HD and Dolby TrueHD bitstreaming also work perfectly. 0-255 RGB HDMI output requires a registry fix, but that's no big deal (the default is 16-235).

    The next great challenge is a gaming HTPC that uses Intel 4600 for video and a giant PCI-E card when launching a game ;) I've sort of got it working using DisplaySwitch.exe and two HDMI inputs on my AV receiver but it's not quite seamless.
  • khmara - Friday, May 30, 2014 - link

    The flickering seen during the 4K test with Haswell on the Seiki 4K TV was due to the Intel graphics settings defaulting the refresh rate to 29 Hz. If you manually change it to 30 Hz, the flickering is eliminated.
  • Tassadar - Thursday, June 12, 2014 - link

    Hi all,

    I have an HTPC with an Intel Haswell and I can't get 23.976 Hz; I get 24 Hz even if I set 23 in the Intel panel properties.

    I have run CustomModeApp.exe, but I can only enter whole numbers (no decimals) for the frequency. I also tried 23, but it doesn't allow me to accept it.

    Any help?

    Thanks
  • Gadgety - Monday, August 18, 2014 - link

    "... but the HDMI link never got locked (the display would keep flickering on and off). The frequency of locking was inversely proportional to the HDMI cable length... We will update this section as and when we reach closure on the issue with ASRock / Intel."

    This was in June 2013. Still no closure after 13 months?
