
Original Link: https://www.anandtech.com/show/9986/ces-2016-roundup-total-editor-recall
CES 2016 Roundup: Total Editor Recall
by AnandTech Staff on January 26, 2016 11:00 AM EST
Posted in: Trade Shows, CES 2016
Another year, another Consumer Electronics Show - though it seems its official name is now simply CES. It remains one of the biggest shows of the year for technology, if not the biggest. Covering PCs, smartphones, TVs, IoT, the home and the car, CES promises to have it all. It's just a shame that the week involves so many press events and 7am-to-midnight meeting schedules that it can be difficult to take it all in, especially with 170,000 people attending. Nonetheless, we all took plenty of information away, and we asked each editor to describe the most memorable parts of their show.
Senior PC Editor, Ian Cutress
CES is a slightly different show for me compared to the other editors - apart from flying in from Europe, which makes the event a couple of days longer, it actually isn't my priority show; that honor goes to Computex in June. Despite this, and despite companies like ASUS cancelling their press events because everything they would have announced will come closer to Computex anyway, CES this year felt like my busiest event ever. Headline announcements like AMD's Polaris are a great way to get the adrenaline going, but a couple of other announcements were super exciting too.
First up, I want to loop back to ASUS. Despite the lack of a press event, their PR mailshot just before the show mentioned a 10G Ethernet switch being launched. At the time I mistook it for a 10-port 10G switch, when the device actually offers 2x10G + 8x1G ports, but even in that configuration the $300 price is hard to ignore. Moving a workspace to 10G, especially 10GBase-T, means getting a capable switch, which at the moment costs a minimum of around $800 for an 8-port unit. So bringing that down to something more palatable is a good thing.
Having visited Huawei in China at the end of last year to talk about the Kirin 950, it was good to see the Mate 8 launched alongside Andrei's launch-day review. ARM's A72 microarchitecture - the thinner, lighter and more powerful upgrade to the A57 - was here in the flesh, built on TSMC's 16FF+ node. When we spoke with Huawei and HiSilicon before the launch, they were promoting some impressive numbers, especially on power efficiency, which Andrei tested and confirmed. Whereas 2015 was a relative dud for mobile on Android, 2016 should breathe some life into an ever-expanding market with the introduction of the A72 and 16/14nm processes.
Speaking of things that should come to life in 2016, virtual reality is on the rise, and talk of VR was ever-present at the show. Not only the kits themselves (I had another go at the HTC Vive with iBuyPower while HTC filmed it) but also the hardware that powers them, with AMD's Raja Koduri stating that true VR requires 16K per eye at 240 Hz. While we're far away from that right now, we saw new hardware gracing the scene, such as EVGA's VR Edition hardware that provisions all the USB ports needed, or full-on systems like MSI's Vortex. The Vortex is interesting by virtue of the fact that it is laid out essentially like the Mac Pro, with a single CPU and two GPUs in a triangular configuration sharing a single heatsink and a single fan, in nothing bigger than a small wastepaper bin. While the design is purely aimed at the gaming crowd, the professional look paired with a Core i7 and two GTX 980 Ti graphics cards (or any upcoming 14/16nm cards), plus Thunderbolt 3 ports, makes it a mini powerhouse for gaming and VR.
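To put Raja's figure in perspective, here is a back-of-envelope throughput sketch. The assumptions are mine, not AMD's: I take '16K' to mean a 15360x8640 panel (by analogy with 4K/8K naming) and ignore blanking intervals and compression.

```python
# Back-of-envelope pixel throughput for "16K per eye at 240 Hz".
# Assumption (mine, not AMD's): "16K" means 15360 x 8640 pixels.

def pixels_per_second(width, height, refresh_hz, eyes=1):
    # Raw pixel rate, ignoring blanking intervals and any compression.
    return width * height * refresh_hz * eyes

vr_target = pixels_per_second(15360, 8640, 240, eyes=2)
uhd_60 = pixels_per_second(3840, 2160, 60)  # a single 4K/60 monitor today

print(f"VR target : {vr_target / 1e9:5.1f} billion pixels/s")
print(f"4K at 60Hz: {uhd_60 / 1e9:5.2f} billion pixels/s")
print(f"Ratio     : {vr_target / uhd_60:.0f}x")
# VR target :  63.7 billion pixels/s
# 4K at 60Hz:  0.50 billion pixels/s
# Ratio     : 128x
```

In other words, the stated end goal is over a hundred times the pixel throughput of driving a single 4K monitor at 60 Hz, which explains why nobody expects to get there on current hardware.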
I got super excited for a couple of other things, but perhaps not for the normal reasons. First was storage - Mushkin showed us an early PVT board of their new 2TB drive, but said it was a precursor to a 9mm 4TB model coming in at $500. That pushes pricing down to $0.122 per GB, although in that configuration, due to the internal RAID controller and the way the capacity is split, the drive takes a hit on IOPS and power consumption. Nonetheless, it seems a good route to cheap SSD-style storage.
(Edit 2016-01-26: Mushkin has clarified their comments to us: they are aiming for below $0.25/GB, which puts the drive south of $1000. The $500 figure is more of an end goal several years down the line for this sort of capacity.)
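For reference, the per-gigabyte arithmetic behind both figures works out as follows (treating the marketing "4TB" as 4096 GB):

```python
# Per-gigabyte pricing for a 4TB drive, treating "4TB" as 4 x 1024 GB.
capacity_gb = 4 * 1024

for price in (500, 1000):
    print(f"${price} / {capacity_gb} GB = ${price / capacity_gb:.3f}/GB")
# $500 / 4096 GB = $0.122/GB   <- the original end-goal figure
# $1000 / 4096 GB = $0.244/GB  <- roughly the clarified sub-$0.25/GB target
```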
The other part was Cooler Master's new MasterWatt power supply, with an integrated ARM controller and Bluetooth. This gives the user, either via internal USB or a smartphone app, access to power consumption metrics, per-rail loading, and the recording functionality that I've badly wanted in a power supply for a while. With the right command-line tools and recording, I ideally want several of these to power my next generation of testbeds and get a metric ton more data for our reviews. I've pitched several ideas to CM about how we can use them in the future, and they seem very willing to work towards a common goal, so watch this space.
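Cooler Master hadn't published a software interface for the MasterWatt at the time of writing, so purely as an illustration of the kind of logging I have in mind, here is a sketch of a testbed power logger. Everything protocol-related is hypothetical: the device path, the 'READ' query and the CSV reply format are all invented for the example.

```python
# Hypothetical sketch only: logs PSU telemetry to CSV, assuming the MasterWatt
# exposed a simple line-based text protocol over its internal USB header.
# The port name, "READ" command and reply format are invented for illustration.
import csv
import time

import serial  # pyserial

with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as psu, \
        open("psu_log.csv", "w", newline="") as f:
    log = csv.writer(f)
    log.writerow(["time_s", "w_12v", "w_5v", "w_3v3"])
    start = time.time()
    while time.time() - start < 60:               # capture one minute
        psu.write(b"READ\n")                      # hypothetical query
        reply = psu.readline().decode().strip()   # e.g. "84.2,11.5,3.1"
        if reply:
            log.writerow([f"{time.time() - start:.2f}", *reply.split(",")])
        time.sleep(0.1)                           # ~10 samples per second
```

A log like this, captured in lockstep with a benchmark run, is exactly the kind of extra data I'd like to fold into reviews.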
My big show of the year is going to be Computex in early June, for which a number of the usual tech companies have already stated they have big release plans. Roll on 2016...!
Senior Laptop Editor, Brett Howse
Every year we converge on Las Vegas to see all the new technology announcements from around the world. The show is massive, and there is always a struggle to see as much as you can in the limited time available. This year I stepped up as the senior laptop editor, so there was plenty for me to cover in the couple of days I had boots on the ground, so to speak.
I think for me the most amazing announcement was the incredible number of devices that were available with OLED displays. I've long been a fan of the deep black levels OLED can produce, and I've been wanting to see it on notebooks. We got that and more. There were several notebooks announced with OLED, including the Lenovo ThinkPad X1 Yoga, the HP x360, and the Alienware 13, but I think the most impressive was the 30-inch Dell UltraSharp monitor. The price is very high, but new uses of technology always start out that way.
OLED is not the holy grail either. It has power draw issues at high APL (average picture level), which is pretty much the standard for PC workflows full of white backgrounds. But as a first step into the PC sector, I was very impressed with the displays on all of the devices. It doesn't even appear that the price premium is going to be that bad on the notebooks, so hopefully we will see more innovation with these displays.
Another trend I've noticed, and this didn't start at CES, is just how thin and light powerful notebooks have become. Everyone was keen to show off their latest Skylake-based notebooks, and it's fairly stunning how much computer you can pack into such a small body. The LG gram is a great example of this: LG used traditional Ultrabook parts in a 15-inch chassis, and when you pick up a 15-inch notebook that weighs under three pounds, it's almost like it's not there. Very impressive.
Often the trade-off for this is battery life, though, and with rare exceptions many companies have been going the smartphone route, where thin and light trumps extra battery life. Even so, notebooks like the ThinkPad X1 Carbon got thinner and lighter this year while also increasing battery capacity; high-density batteries come into play here. I think it's a trend we'll continue to see, and hopefully device efficiency makes up the difference. Everyone has different needs as far as battery life goes, and even the worst performers in 2016 last far longer than notebooks from only a couple of years ago.
The final thing I want to bring up is the decline in the PC market. Depending on what numbers you look at, the PC market dropped 6-10% in 2015 in unit sales. I suppose that's not a huge surprise considering how we interact with different devices now, and we have not seen an increase in system requirements for Windows since Vista in 2007, so old computers are still getting by. I'm as guilty of this as anyone, since I just upgraded to an i7-6700K system from an i7-860 system I believe I built in 2009. My old system was still good, and rarely did I wish for more performance. Tablets have also taken some of our usage share, and with their lightweight operating systems people are finding they have a long lifespan. But despite the bad news about PC sales, the situation has actually been very good for consumers.
The quality of the laptops released at CES was clearly a step ahead of what it has been in the past. Quality used to be secondary to quantity, but with the drop in volume, everyone has had to step up their game. Build quality, display quality, and performance have all taken a big step forward, especially among companies competing in the premium space. Users also want a nice looking laptop, and we saw the Dell XPS 13 come along last year and set the bar pretty high. Dell then took that styling and applied it to the XPS 15 to great effect, as well as to their new Latitude 13. No one else is standing idle though, and there have been some excellent designs from other companies as well, like the HP x360 and the Lenovo Yoga 900.
So although the numbers suggest an industry in decline, I saw plenty at CES to make me hopeful for the future: at the very least, the devices are improving for those who want them.
Mobile and Tablet Editor, Brandon Chester
This year's CES was my second time attending the show. I was quite new to AnandTech during my first CES, and while this year's show seemed even bigger in scale, for me personally it felt much less frantic. CES is an interesting event for me because there aren't always many major announcements relating to new mobile products, but because of how prolific the smartphone has become, most of the announcements tie in with the mobile world in some fashion.
One of the big pushes this year was VR. This isn't unexpected, given the number of vendors trying to get involved now that it appears the category will be much bigger than the niche market some expected it to be in the early days of the Oculus Rift.
At CES, I was able to try the HTC Vive and the Gear VR. I wasn’t able to try the Oculus Rift, but everything I’ve heard from other people and read online says that the Vive is the best of these three major offerings. I have to admit that I was quite impressed by the demos for the Vive, and you can read a bit more about that here.
One of the barriers to adoption that I can see with the Vive is the fact that setting up the Lighthouse tracking stations requires a substantial amount of space, and I don't think many users are going to have a room that they can dedicate to VR. In most cases, I expect that VR will end up being a way to increase immersion without requiring the user to move about any more than existing games require. Unfortunately, this won't take full advantage of VR's potential, but that's just the truth of how most people want (or are able) to play games if we consider it a mass market venture.
As a mobile editor, the most interesting thing about VR to me is how much it owes to advances in the smartphone market. The only reason any of these products exist in their current forms is the work done to create small displays with ever higher resolutions, along with the great advancements made in manufacturing OLED displays. I don't think the current VR headsets are where they need to be in this regard; in my experience you need to move away from PenTile subpixel layouts in order to avoid the chromatic aliasing and other artifacts that the current headsets exhibit. The displays will eventually have to enter the realm of 4K and even 8K panels in order to have a high enough resolution to mimic how you actually see the world. Still, the fact that such high resolution OLED panels are available right now is the direct result of the technology advancing to serve smartphones, and display technology wouldn't be where it needed to be for these first-generation VR headsets if that had never occurred.
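As a rough sanity check on that claim, consider the pixel density needed to match normal visual acuity. The numbers below are my own approximations, not from any vendor: 20/20 vision resolves roughly 60 pixels per degree, and a headset covers on the order of 110 degrees per eye.

```python
# Rough check on the "4K and even 8K" claim: how many pixels per eye would it
# take to match normal visual acuity? Assumptions (mine, not from the show):
# 20/20 vision resolves ~60 pixels per degree, and a headset covers roughly
# 110 degrees horizontally and vertically per eye.

ACUITY_PPD = 60          # pixels per degree for 20/20 vision (approximate)
FOV_H, FOV_V = 110, 110  # per-eye field of view in degrees (approximate)

print(f"Needed per eye : {ACUITY_PPD * FOV_H} x {ACUITY_PPD * FOV_V}")
print("4K panel       : 3840 x 2160")
print("8K panel       : 7680 x 4320")
# Needed per eye : 6600 x 6600 -- between today's 4K and 8K panel classes,
# and that is a whole panel dedicated to a single eye.
```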
On the tablet side of things, I saw two notable tablet launches at the show this year. The first was Huawei's MediaPad M2 10, a mid-range Android tablet using the Kirin 930. More interesting was the launch of Samsung's TabPro S. This announcement was a surprise to me, and even more surprising was the fact that Samsung decided to go the Windows route with their 2-in-1 instead of using Android. Samsung may have realized that moving to a 2-in-1 amplifies the issues with Android tablets, while using Windows allows you to provide an experience that leans more toward a laptop, which is what I feel many 2-in-1 buyers are really looking for anyway.
Something absent from this year's show was smartwatches. Part of this certainly has to do with how much of the market Apple has grabbed, coupled with the fact that they don't attend the show. Even so, I was surprised to see very little promotion from other vendors and nothing in the way of new announcements. We did see new finishes for the Huawei Watch and Samsung Gear S2, but no completely new hardware. Perhaps we'll see more at MWC in February.
While smartwatches were missing in action, that isn't to say that wearables as a whole were missing from the show. Fitness trackers were on display in multiple places, and head-mounted displays in a similar style to Google Glass were being shown off as well. Zeiss's Smart Optics technology for making discreet smart glasses was definitely the most interesting thing going on in that category, despite being only a tech demo right now. I hope they're already in talks with companies to get this technology into future commercial products.
The last area that I ended up seeing a lot of at the show was television. In hindsight this is a bit surprising since I don’t even have a television or any sort of cable service, but I suppose that my interest in display technology played a part. There were two main things that happened in the TV space. The first is the adoption of wider color gamuts and support for HDR in the standards for UltraHD content.
Both of these are important, although I am very disappointed by the efforts of a group of companies to push DCI-P3 support into these specifications alongside Rec. 2020, simply because their display technology isn't yet capable of reproducing the full Rec. 2020 color space. I was able to see quantum dot panels this year that covered over 90% of the Rec. 2020 gamut, and the use of a second, smaller gamut may cause problems down the line with input and output chains that don't handle color management for P3 content properly once displays move to Rec. 2020. Consumers with newer Rec. 2020 displays might end up seeing oversaturated pictures, while owners of DCI-P3 panels may have to deal with undersaturated Rec. 2020 content.
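To make the failure mode concrete (my own illustration, not something shown at CES): if a playback chain treats P3-encoded RGB as Rec. 2020 and passes it to the panel unmanaged, every value gets stretched onto the wider primaries. The sketch below builds RGB-to-XYZ matrices from the published D65 primaries of each gamut and shows the chromaticity shift for a saturated red.

```python
# Unmanaged P3 content on a Rec. 2020 display: the same RGB triplet lands on
# a different (more saturated) chromaticity. The primaries and the matrix
# construction are the standard published values and procedure.
import numpy as np

def rgb_to_xyz_matrix(r, g, b, white):
    """Standard RGB->XYZ matrix from xy primaries and an xy white point."""
    def col(xy):  # chromaticity (x, y) -> XYZ column with Y = 1
        x, y = xy
        return np.array([x / y, 1.0, (1 - x - y) / y])
    prims = np.column_stack([col(r), col(g), col(b)])
    scale = np.linalg.solve(prims, col(white))  # white maps to the white point
    return prims * scale  # scale each primary column

D65 = (0.3127, 0.3290)
M_P3   = rgb_to_xyz_matrix((0.680, 0.320), (0.265, 0.690), (0.150, 0.060), D65)
M_2020 = rgb_to_xyz_matrix((0.708, 0.292), (0.170, 0.797), (0.131, 0.046), D65)

rgb = np.array([1.0, 0.1, 0.1])  # a saturated red, linear light
for label, M in (("intended (P3)   ", M_P3), ("displayed (2020)", M_2020)):
    X, Y, Z = M @ rgb
    s = X + Y + Z
    print(f"{label} -> x={X/s:.3f}, y={Y/s:.3f}")
# The Rec. 2020 interpretation lands closer to the spectral locus, i.e. the
# red is shown more saturated than the content was graded for.
```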
The second thing I noticed about the TV space is the lack of advancement from OLED. This is mainly due to Samsung's choice to push LCD panels with quantum dots, which takes the largest OLED manufacturer out of the race. While they briefly stepped into the OLED TV market a few years ago, Samsung has continued primarily as an LCD TV manufacturer since then.
There are a couple of important things to consider here. I haven't yet seen an OLED panel approaching full coverage of the Rec. 2020 gamut, which is part of the reason why DCI-P3 has been put into the UltraHD standards. This is somewhat confusing, because DCI-P3 is a gamut designed for cinema use, yet it now coexists with the Rec. 2020 gamut that will be used for UHDTV. OLED's limited peak brightness also limits the range of bright shades for HDR content, but its black level allows for better detail in dark areas. Something to note is that light incident upon the display ends up negating the advantage of OLED's black levels due to reflections, so those black levels only provide an advantage in an environment where other light sources are properly blocked out.
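To put a number on that reflection point (with illustrative assumptions of my own, not measurements): suppose a screen reflects about 0.5 nits of room light back at the viewer. That reflected light becomes the effective black floor for any panel technology.

```python
# Effective contrast under ambient light. All figures are assumptions for
# illustration: ~0.5 nits of reflected room light, a 400-nit OLED with a
# near-zero native black, and a 400-nit LCD with ~4000:1 native contrast.

def effective_contrast(white_nits, black_nits, reflected_nits):
    return (white_nits + reflected_nits) / (black_nits + reflected_nits)

print(f"OLED, dark room: {effective_contrast(400, 0.0005, 0):10,.0f}:1")
print(f"OLED, lit room : {effective_contrast(400, 0.0005, 0.5):10,.0f}:1")
print(f"LCD,  lit room : {effective_contrast(400, 0.10,   0.5):10,.0f}:1")
# OLED, dark room:    800,000:1
# OLED, lit room :        800:1
# LCD,  lit room :        668:1
```

Under these assumptions, even modest ambient light collapses OLED's thousand-fold contrast advantage to almost nothing, which is why the viewing environment matters so much.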
With these things in mind, it does make sense that Samsung is pushing in a different direction. Looking at the TV market, I don't see OLED becoming a complete successor to LCD, while I do expect it to do so in the mobile space. TVs often have static interface elements, and issues like burn-in and emitter aging will be difficult to control. Improvements to emitter materials will allow for higher peak brightness and a greater color gamut, but it seems like OLED may be more of a stopgap technology than a long-term play.
Looking strictly at the mobile market, I don't think there was a lot to be excited about at this year's CES. Those sorts of announcements are usually reserved for MWC anyway, so that's to be expected. If you expand your view to the technology market as a whole, there were a lot of interesting things going on. I think VR is going to be big, even though the display technology isn't where it needs to be yet. Early adopters will help drive further investment, which will drive technology improvements, and eventually prices will come down as well. I think head-mounted displays in general will also become more widely adopted as technologies are created to implement them in more discreet ways, and Zeiss's demo was a great example of how quickly things can move.
As for the display and TV market, I think the move to Rec. 2020 will be delayed as manufacturers ship DCI-P3 panels instead, which is quite unfortunate. HDR has the potential to greatly improve the dynamic range of video content, and it will be interesting to see which of the several proposals for HDR content encoding ends up being adopted most widely. As for panel technology, I think LCD is going to stick around for longer than people expect, and OLED will probably end up existing alongside LCD rather than replacing it, with a future technology such as MicroLED eventually replacing both down the line. As always, technology keeps moving forward in many different ways.
SSD Editor, Billy Tallis
With only a few months as an AnandTech editor under my belt, this was my first CES and my first time in Las Vegas. The scale of the event is almost incomprehensible (I frequently relied on my smartphone's GPS to navigate, even indoors), and my schedule was packed with meeting after meeting - ten different companies in one day can get hectic. But it was worth it to meet all the company contacts I'd only been introduced to by email, meet most of my fellow AT writers, and get hands-on with upcoming SSD technology.
While I would have loved to take the time to look at all the drones flying around (in cages, fortunately) or wait in line to try some of the VR demos, I only had about two hours to explore the CES show floor. That wasn't even enough time to walk past half of the exhibits. I stopped at a few booths to drool over FLIR's thermal cameras while imagining how a PCIe M.2 SSD might light up under their gaze, to gawk at wireless routers competing to have the most antennas, and to look for cameras worth upgrading to, but what I have to report on is almost entirely about SSDs.
For months, the SSD market has largely been in a holding pattern, awaiting next-generation components to build into products that are actually exciting. The high-end SATA market hasn't budged; the low-end market has seen gradual price decreases from transitions to TLC NAND, cheaper controllers, and Toshiba's 15nm flash finally replacing their 19nm flash in volume. The PCIe SSD market still suffers from too few choices: expensive drives from Intel and Samsung, and a handful of outdated drives that don't support PCIe 3.0 or NVMe and aren't any cheaper for it. But the doldrums will be over soon, as 3D NAND and NVMe are about to become widely available from every brand.
We've covered the announcements and roadmap updates from the SSD controller vendors, but haven't highlighted many of the retail products that will be incorporating them. Some of the products demoed were fairly unsurprising, such as Plextor's M7V value SATA drive, the successor to the M6V (switching from MLC with the SM2246EN controller to TLC with a Marvell controller); they get to cut costs and gain some more room to differentiate their product using in-house custom firmware. The OCZ Trion 150 improves on the Trion 100 by moving from Toshiba's A19nm TLC to their 15nm TLC, with no changes in performance specifications. Aside from cosmetic differences that aren't necessarily finalized, most of the new Phison-based products don't stand out from the crowd, and there's not much to say about them individually.
The most unusual drive was clearly Mushkin's prototype for a 4TB model in their Reactor line. In order to hit that capacity they're putting two SM2246EN controllers behind a JMicron JMS562. The latter chip is one you'd more commonly expect to find in a multi-bay USB hard drive enclosure, but it can use one of its three SATA channels as the host interface instead of USB, turning it into a transparent RAID controller. This reportedly hurts random access performance, but Mushkin is expecting to be able to ship the 4TB model for a mere $500, which would greatly help it find a niche.
Plextor's M8Pe NVMe drive, using Marvell's 88SS1093 controller, will be available as an add-in card or in an M.2 2280 form factor. They had a mock-up of a wraparound heatspreader for the M.2 model with a motif similar to the add-in card's heatsink. This is the first M.2 drive we've seen with any sort of heatsink or heatspreader on it, which may become more important as performance increases.
ADATA's exhibit impressed me with the sheer breadth of their product line. Between their consumer, enterprise and industrial SSDs they were showing off drives based on virtually every controller except Phison's. Their IM2P3738N industrial M.2 drive is using Marvell's low-cost 88NV1140 PCIe 3.0 x1 NVMe controller, the first deliberately low-end NVMe product. ADATA crammed all the other new stuff into one demo system: a 2.5" drive with IMFT 3D NAND, an M.2 prototype with Silicon Motion's SM2260 NVMe controller, and a U.2 drive with Marvell's 88SS1093 NVMe controller. The latter drive was in a PCIe to 2.5" U.2 riser card that looked like it would be a handy addition to my testbed. We've asked for a couple in order to do power testing!
News Editor, Anton Shilov
There are several things that people want from their personal computers these days: mobility, a high-resolution display, high performance across a wide range of applications (including demanding PC games), sleek design, and in some cases a small form-factor setup. While it is possible to get an ultra-thin laptop, a powerful desktop and a high-resolution monitor, it is pretty hard to get everything in one package. Apparently, companies like ASUS, MSI and Razer know how to partly solve the problem, and we saw their solutions at CES.
Modern microprocessors and solid-state drives can offer desktop-class performance in an ultra-thin notebook. However, when it comes to performance in graphics-intensive applications, it is simply impossible to build a leading-edge GPU - which is required to play the latest games at ultra-HD (4K) resolution - into an ultra-slim system. Modern high-end GPUs can dissipate 300W of heat, which is impossible to cool in a laptop, and even a 100W GPU will ruin battery life and require a sophisticated cooling solution, which means a thicker design. The only way to enable proper graphics performance on a small form-factor PC is to use an external graphics adapter. While some may argue that external graphics solutions are only useful for a fraction of the market, this is not entirely true. In addition to notebooks, external graphics adapters could be plugged into small form-factor PCs like the Intel NUC, or into all-in-one systems with decent displays, which are expanding in utility, particularly in enterprise markets.
External graphics adapters are not new. Back in 2007/2008 AMD introduced its external graphics port (XGP) technology, code-named Lasso. AMD's XGP allowed a graphics card to be connected to a PC using a PCIe 2.0 x8 or x16 interface, which guaranteed a sufficient amount of bandwidth for the time. Unfortunately, XGP relied on a sophisticated proprietary connector that was made by only one company and was rather expensive. As a result, it never took off.
ASUS, Alienware, MSI and some other companies have also introduced external graphics solutions for their mobile PCs over the past decade. However, their GPU boxes were either proprietary or had performance limitations. For example, ASUS' first-generation XG Station contained an NVIDIA GeForce 8600 GT GPU and used the ExpressCard interface, providing rather low performance even for 2007. More recently, Alienware and MSI introduced proprietary external graphics solutions compatible only with their own laptops. These docks relied on an external PCIe x4 interface and were compatible with rather powerful video cards (thanks to featuring their own PSUs), and hence could bring serious performance improvements. Unfortunately, both Alienware's Graphics Amplifier and MSI's Gaming Dock were only compatible with select laptops from those two companies.
At CES 2016 several companies introduced new external GPU boxes that can accommodate high-end graphics adapters. At least one of the solutions uses the Thunderbolt 3 interface, with up to 40 Gb/s of bandwidth, and is compatible with a variety of PCs. Others continue to be proprietary, but use more modern connectors such as USB Type-C. Since they can offer desktop-class performance, they can help build gaming systems around ultra-thin notebooks, AIOs or SFF desktops.
ASUS was among the first hardware makers to offer external graphics solutions for laptops in the mid-2000s, and at CES 2016 it demonstrated its all-new XG Station 2, which is designed for the company's upcoming notebooks. The XG Station 2 is compatible with any ASUS video card based on an AMD Radeon or NVIDIA GeForce GPU that consumes up to 350W of power, which means that you can install almost any board into this dock. The XG Station 2 uses two USB Type-C cables that support up to 32 Gb/s of transfer rate (equivalent to PCIe 3.0 x4), but relies on a proprietary architecture. The external GPU kit from ASUS seems to be a powerful solution, and it even allows using the laptop's own display to play games on the external video card. However, since it is a proprietary technology, it will not be compatible with non-ASUS systems.
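That "32 Gb/s equals PCIe 3.0 x4" equivalence is easy to verify: PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding. A quick check (the Thunderbolt comparison is my own note, since TB3's headline rate still tunnels a PCIe 3.0 x4 link for the GPU):

```python
# PCIe 3.0: 8 GT/s per lane, 128b/130b encoding.
lanes, rate_gt, encoding = 4, 8, 128 / 130
usable = lanes * rate_gt * encoding
print(f"PCIe 3.0 x4 usable bandwidth: {usable:.1f} Gb/s")  # ~31.5, i.e. "32 Gb/s"
# Thunderbolt 3's 40 Gb/s is the total link rate; the PCIe tunnel inside it
# is still a PCIe 3.0 x4 connection, so eGPU bandwidth is in the same range.
```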
The manufacturer did not reveal much about its plans for the XG Station 2 and compatible laptops, so it is unknown how competitive ASUS' external graphics solutions will be. Nonetheless, it is good to know that the world's largest supplier of gaming laptops intends to offer external GPUs as an option.
MSI already offers Gaming Docks for select laptops. At CES 2016 the company demonstrated an external graphics solution for its all-in-one gaming PCs. The external GPU dock for AIO systems is compatible only with the company's 27XT 6QE and with NVIDIA graphics cards, but it uses a PCI Express interconnect and can house almost any contemporary GeForce (with the exception of dual-GPU cards and some non-reference boards). The solution looks close to commercial, and it will no doubt become a key selling point of MSI's gaming AIOs this year.
The form factor of the dock is tailored for all-in-one PCs, hence it cannot easily be connected to laptops or SFF systems. Moreover, since the implementation of the PCI Express link and GPU power delivery is clearly proprietary, MSI's external graphics boxes will only be compatible with its own AIOs. Keeping in mind that MSI needs to sell systems to gamers, even a proprietary external GPU box makes great sense for the company, as it allows MSI to offer almost any video card with its AIOs, unlike other PC makers.
Razer, which is mostly known for its peripherals and gaming laptops, decided not to use any proprietary technologies with its new Razer Blade Stealth ultrabook and Razer Core external GPU dock. Everything is based on industry-standard components, and hence the Core can be connected to almost any system with Thunderbolt 3.
Unlike ASUS and MSI, which are still finalizing their new external GPU technologies, Razer is already taking pre-orders for the Stealth notebook. The Stealth is just 0.52 inches thick, but it features a 12.5-inch IGZO display with a 2560x1440 or even 3840x2160 resolution, along with an Intel Core i7-6500U processor. The system can be equipped with up to 8 GB of LPDDR3-1866 memory, up to 512 GB of PCIe SSD storage, 802.11ac Wi-Fi, a built-in webcam and so on. The laptop starts at $999, which is comparable to other ultrabooks.
The Razer Core is an external GPU box that connects to a PC using the Thunderbolt 3 interface at up to 40 Gb/s (an appropriate cable is included). The enclosure features its own 500 W PSU and is compatible with graphics cards up to 375 W TDP. The GPU box also features four additional USB 3.0 ports as well as a Gigabit Ethernet controller.
Razer does not sell the Core just yet, hence its pricing is unknown. Nonetheless, its reliance on Thunderbolt 3 and its compatibility with a variety of laptops and SFF PCs make it a very interesting product. If the company decides not to artificially limit the Core's compatibility with third-party PCs, the external GPU box could become rather popular among owners of notebooks and SFF PCs with Thunderbolt 3.
General industry trends show that modern PCs are becoming smaller and sleeker, but high-end graphics adapters remain rather large and power hungry. As a result, external graphics solutions for mobile and small form-factor personal computers are just what the doctor ordered. However, proprietary solutions are not always good, especially for systems from smaller suppliers or for anyone wanting to be 'truly' universal; that said, locking a user into a certain ecosystem might guarantee future sales. Thunderbolt 3-based external GPU boxes look very promising because they combine relatively high transfer rates with simplicity and industry-standard cables (which means relatively affordable pricing).
In fact, after seeing Razer's Core, it seems pretty clear that after a decade in development, external graphics is finally on its way to being done right.