AMD Ryzen 4000 Mobile APUs (anandtech.com)
432 points by neogodless on Jan 6, 2020 | 307 comments



And laptop manufacturers will now proceed to stuffing these into bargain bin laptops with shitty 1080 displays. To this day desktop/workstation Ryzen CPUs are not treated as a premium product in spite of their superior performance.

I wish Apple would adopt this at least for some products. It'd give AMD more credibility, and Apple would be able to negotiate pricing more easily with Intel.


Well, Dell, Lenovo and Asus are on board, and at least Asus seems to be gearing up for some premium offerings. So far, AMD's laptop chips have been mid-level performers with mediocre battery life. At least in their marketing slides, they are potentially addressing both issues: Zen 2 on 7nm allows for higher core counts, higher clocks and lower TDP. In theory they've also improved the idle states to reduce power usage under more conditions.

Whether the above theoretical improvements materialize, and whether the chips show up in desirable laptop chassis, remains to be seen, but at least we can be optimistic, given the desktop wins we've seen this year.


From the link:

> the company said that they have rearchitected a good portion of the power delivery in the APU in order to be able to power down and power gate more elements of the SoC than was previously possible

It appears they did reduce power usage; how much remains to be seen in practice.


The perf/power graphs there are some bullshit synthetic metric with combined CPU+GPU performance. So indeed it is yet to be seen if this translates into better battery life.


(Disclaimer, I submitted the linked HN post and wrote the linked blog post.)

I talk about this explicitly here [0]. I think it's important that this be called-out widely and given attention. As I said in the first comment there, it's hard to vote with your wallet when the option you want doesn't exist.

[0] https://news.ycombinator.com/item?id=21975877


Reminds me of the lack of non-wide hi-dpi monitors. Nothing at 4:3, 3:2, or 16:10 exists anymore—even if you were willing to pay $10,000+.


Check out the new Dell XPS 13, to be released today [1]:

> Screen resolution on the XPS 13 has jumped accordingly, now up to 1,920 x 1,200 on the FHD model, and 3,840 x 2,400 on the 4K model. Dell calls this FHD+ and 4K+ for the added aspect ratio, but it should be noted that touch is still an optional feature and isn’t standard.

[1] https://www.digitaltrends.com/computing/dell-xps-13-2020-eve...


I saw that and am happy about it. But, until they come out with a 15" supporting Linux it is still a no-go. Maybe next year.


> supporting Linux

Wait, the Dell XPS 15 doesn't work well with Linux? Could you share a bit on why?

(I was actually thinking of getting a XPS 15 myself, and putting Linux on it.)


I, and every developer in my company, have been running Linux in various flavours on XPS 15s for several years now; it works just as well as on any other device I've used it on.

I use an XPS 13 at home also running Linux (Ubuntu/Manjaro) and have no unusual issues there that I haven't also seen in my desktop.

Saying it doesn't work well in Linux is either bad luck with a particular model or just plain misrepresentation.


I think it's bad luck with a particular model for me, and I can't even put my finger on why, exactly. The wireless and graphics are really unstable, though. Many live USBs will not boot into a GUI mode.

It was really aggravating for a brief period, then I decided that I'd just relegate that machine to being "the windows box" and moved on.


That's unfortunate. Which model do you have? We use models from the past 5 or so years, all with Nvidia cards, the only recent issue we've had was with a new model that arrived yesterday and Ubuntu 18.04 didn't have a wireless driver, but quick connection over ethernet and an update fixed that.

You also have to switch RAID to AHCI out of the box (if it came with Windows) but it's otherwise been smooth sailing from there.

As for graphics, Prime/Primus/Optimus (whichever it is, I'm never sure) is an issue with all machines and not a Dell issue; Linux support for switchable graphics is, in my experience, just terrible.

The machine I'm on now (XPS 15 9560) has the Nvidia GPU always on; I use it plugged in 99% of the time anyway. I'm running Ubuntu 18.04 for work.


I think mine is a 9570.

Agreed on the switchable graphics issue. I think mine was compounded by a nouveau issue with this particular nvidia part. One fix I'd have been happy with would've been to have intel graphics always on. That matches what I do better anyway. But I could not find a stable way to do that.


I found that installing the Nvidia proprietary driver and switching to Intel mode kept it happily locked to Intel; then there is a fix for Intel graphics in Xorg (a conf change) to stop tearing in videos like YouTube. Performance for things that use GPU rendering (like certain IDEs :() wasn't great, so I switched back to Nvidia.

We are all in various states of Nvidia/Intel proprietary/nouveau drivers/modes here depending on personal preference but none of us have any real day to day issues (On Ubuntu 18.04 at least, which is our preferred distro).


I'll have to give that a try. Fedora's the (very slightly) preferred option here mostly because we're usually deploying to red hat stuff. But that is not a religion and would take a back seat to a working laptop. If the XPS15 worked at native resolution with good wireless, it'd be much nicer than my current linux machine.


A bunch of the Ops guys here run Fedora, I've heard it runs better than Ubuntu, particularly Gnome, but in the dev team we're debian so I've not tried running Fedora.


I initially tried Debian, but it didn't really work. Fedora works much better: I didn't have any problems, and it's just the better distro (this from a >10-year happy Debian and Darwin user).


Mine does not. Service tag 35CRPQ2 (a late 2018 model). I've tried Fedora 29, 30 and 31 and Pop OS 19.04 and 19.10.

The wireless and graphics are not stable.

I tried for a brief period to get something working, then decided to designate it as my "windows box" and move on. It's been quite a nice Windows 10 system, to the extent that that's a thing. I'd probably even like it if I could bend telemetry, the auto-installed/advertised store apps and updates completely to my will.


Dell officially supports Linux on all of its Developer Edition (Project Sputnik) computers. The current lineup includes the XPS 13/15 and many Precision laptop/desktop models:

https://www.dell.com/en-us/work/shop/overview/cp/linuxsystem...


They don't seem to have any AMD processors. :( I'm thinking about system76 but I don't know anybody who has one.


You mean Intel-based System76 machines, right? Yeah, they are worth looking into now since they ship with coreboot.


No I was thinking about an AMD system from https://system76.com/desktops


Oh, I don't think there is enough to differentiate System76 in desktops other than aesthetics right now. You can probably get better bang for the buck with Supermicro parts if you forgo the aesthetics. They are doing interesting work with coreboot in the laptop space though.


Ok thanks for the heads up.


I had the newest XPS 15 at my last job two months ago. With a new enough kernel (5.1 or 5.2) I got everything but the fingerprint reader working.


Probably, but they usually come out with a Precision dev edition with Linux installed, no MS tax. That’s what I usually buy.


Most probably due to that Nvidia gpu in there.


Last year Dell's lineup was leaked and seems to be on track, so expect a new XPS 15" around June.


XPS 15 works great with Ubuntu 18.04 thus far. I've been running it for a year now without any problems I can recall.


Isn't the Microsoft Surface Book's screen 3:2?

edit: Apparently it's called the Surface Laptop, and yes, it has a 3:2 aspect ratio.

https://www.microsoft.com/en-ca/p/surface-laptop-3/8VFGGH1R9...


I need a Linux laptop or standalone monitor.


15 years ago I had a Dell laptop with a 15.4" 1920x1200 (16:10) screen. Nowadays you can't find something like that except on esoteric "Toughbooks" by Panasonic...


Coincidentally enough, Dell's recent upgrade to the XPS 13 includes a 16:10 monitor. Announced only a few days ago, available today.

https://www.arstechnica.com/gadgets/2020/01/dell-updates-xps...


16:10 was the perfect aspect ratio. I held onto my 24" 1920x1200 Dell Monitor until just last year. I recently tried to upgrade to a new 16:10 and it was impossible. Disappointing.


I have a similar Dell Pentium 3, still works, from 2002. 1600x1200 15" 4:3 screen. I still occasionally use the thing for nostalgia's sake.


I had that laptop. Amazing screen (and a tank of a laptop).

Back then the other cool thing was the 17" MacBook Pro. It was glorious.


I remember it! Awesome.


God yes. Why can't screen manufacturers make 3:2 desktop screens? We can't all be gamers or movie watchers.


I have a Huawei with a 3:2 13" display; it's so much smaller compared to a same-size 13" ZenBook with "no bezels".

And the keyboard is the old "2015 MacBook" keyboard.

https://imgur.com/xGES3Wz


Try finding 5:4 monitors


I want a 1:√2 ratio, like an A4 sheet. But irrational pixel sizes haven't been invented yet.


16:9 is quite common and not too far from 16:10. It also happens to match up with popular consumer HD video standards (1080p, "4k" 2160p), which is probably a good portion of why it has enduring popularity.


The wider the panel, the smaller the area it has for the same diagonal (i.e. a 14" 16:9 panel has a smaller area than a 14" 16:10 one), allowing more panels to be cut from the same material and increasing the yield in case of material defects.

In other words, wider panels are cheaper to make, and because advertising quotes the diagonal and not the area, it still allows advertising in a way that doesn't put the manufacturer or OEM at a disadvantage.
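If anyone wants to sanity-check that, here's a quick sketch (mine, not the commenter's; just diagonal-to-area geometry) comparing a 14" panel at different aspect ratios:

    import math

    def panel_area(diagonal_in, aspect_w, aspect_h):
        # Scale the aspect ratio so its diagonal matches the panel diagonal.
        scale = diagonal_in / math.hypot(aspect_w, aspect_h)
        width, height = aspect_w * scale, aspect_h * scale
        return width * height

    for ratio in [(16, 9), (16, 10), (3, 2)]:
        print(ratio, round(panel_area(14, *ratio), 1))
    # (16, 9) 83.8   (16, 10) 88.1   (3, 2) 90.5  -- square inches at a 14" diagonal

So at the same advertised diagonal, the 16:9 cut really is the smallest (cheapest) panel.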


Consumer preference has nothing to do with why it's more popular. The industry decided as a whole to go with 16:9 screens because they cost less, and consumers had no choice. Similar to how most flagship smartphones have lost their headphone jacks even though no customer is rushing to buy a phone just because it doesn't have a headphone jack.


But consumers do run for ever-thinner devices, and a 3.5mm jack will always make the device at least 4mm thick (probably more, to account for the front display and backside assembly).


And when they got too thin and started bending in pockets folks complained until they got sturdier and thicker again.


Ryzen 3000, aka Zen+ mobile, wasn't a premium product. The GPU was certainly better than Intel's, but the CPU wasn't; the perf/watt simply wasn't there.

I hope they'll stuff these into bargain bin laptops with 1080p displays though. I'm in the market, and don't care to pay more to use more resources for higher dpi that I can barely see the difference with and just causes software headaches. :P


I'd just like to see high quality 1440p panel laptops come into vogue. Why do I need a 4K 13" laptop when I can't discern the difference from 1440p at a regular sitting distance? Wasted pixels and wasted battery. Also, it seems like the 4K panels are the only ones with wide color gamut and proper color accuracy.


Happy to see that I'm not the only one in this camp. I don't see the need for such pixel density on the 13-inch Mac laptops. What a waste of resources. In fact, I'm happy with a 1080p matte display.


The bigger issue is that if you're lucky with 1080p, you'll get 100% sRGB coverage. In a world where many phones and tablets support DCI-P3, why would I want or expect fewer pixels and less color gamut coverage from a laptop that generally costs more?


OK. I can't discern that much of a difference in color, and FWIW I don't even know what DCI-P3 is. My work is mostly coding and email, so 1080p seems fine. I can only imagine the insane battery life you can get from current batteries with a 1080p laptop display.


I use the same machine for code as well as photography, so color accuracy and depth are something I value.


This. Ryzen has been great in the desktop space, but it hasn't quite put its best foot forward in mobile where Intel has been the better performance and battery life option, graphics aside.

With Ryzen 4000 we're finally seeing a truly competitive, perhaps even better (we'll have to see how battery life compares) APU. The premium choices should come along if the chips measure up this time around.


There were OEM mess-ups that exacerbated the moderate performance of AMD laptops, like configuring single-channel setups for no obvious reasons even with dual slots, poor thermal management, and more.


I was recently browsing budget laptops and was astonished to find that 1366 x 768 displays are still a thing.


According to Mozilla's data [1], 1366x768 was just this past year eclipsed as the most common desktop resolution by the only incrementally superior 1920x1080.

[1] https://data.firefox.com/dashboard/hardware


It may be as close as you can get to actually being incrementally superior, but double the pixels does make a big difference in this case.


There are at least 2 normal increments between 720p and 1080p.


The point being made is that doubling the number of pixels is quite significant. Calling it incremental is a little short-sighted IMO.


I was originally going to note exactly that, but then I actually did the math: 1366x768 is about 1.05 megapixels and 1920x1080 is about 2.07 megapixels, which is still pretty close to an integer increment (roughly one megapixel more) when looked at through that lens, so I decided that was worth accepting (and interesting enough in a nerdy way to allude to).


Not a hardware guy -- would that be 720i and 1080i?


At least for wide screen ratios, there’s 1366x768 and 1440x900


1280×720 (720p), 1366×768, 1440x900, 1680×1050, 1920×1080 (1080p FHD)


I am still waiting for cheap HiDPI monitors and more OS support.

Seems that most monitors nowadays are focusing on high framerates for gaming and colorspace support for visual creatives. The latter are usually very expensive.

I just want some 27"-32" monitors with very high dpi, framerates and color accuracy be damned.


>Seems that most monitors nowadays are focusing on high framerates for gaming and colorspace support for visual creatives. The latter are usually very expensive.

Disregarding the high framerates, I've often wondered if someone ever wrote a display application that achieves higher dynamic range by alternating pixel values over frames. Sure, it would need to be calibrated, say by a photodiode stuck against the screen, low-pass filtering the signal...

It seems like one might "try out HDR" on conventional monitors, to find out if they are worth it... which also raises the question of whether these HDR monitors aren't simply storing the last frame(s) to deduce what the appropriate 24bpp (8b per channel) intensities should be, such that for static images and a static gaze the illusion of HDR is recreated... perhaps this exact trick is what caused ghosting in the first generations of LCD monitors?
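As a very rough illustration of the idea (my sketch, not anything a monitor is known to do; it ignores gamma, panel response time and the calibration step entirely), temporal dithering to fake extra bit depth looks something like this:

    import numpy as np

    def temporal_dither(frame_hi, frame_index, extra_bits=2):
        """Approximate an (8 + extra_bits)-bit image on an 8-bit panel by
        alternating between the two nearest 8-bit levels across frames.
        Averaged over 2**extra_bits frames, the panel shows roughly the
        higher-precision intensity."""
        cycle = 2 ** extra_bits
        base = frame_hi >> extra_bits            # lower 8-bit level
        frac = frame_hi & (cycle - 1)            # how many frames get base + 1
        bump = (frame_index % cycle) < frac      # spread those frames over the cycle
        return np.clip(base + bump, 0, 255).astype(np.uint8)

    # e.g. show a 10-bit test ramp as four alternating 8-bit frames
    ramp10 = np.tile(np.arange(1024, dtype=np.uint16), (8, 1))
    frames = [temporal_dither(ramp10, i) for i in range(4)]

In practice the eye (or your hypothetical photodiode) does the low-pass filtering across the frame sequence.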


I don't know if $700 qualifies as cheap, but the iiyama ProLite XB2779QQS-S1 (5K) seems to fit your description. I am quite happy with mine.


I had one of those I bought in 2017 for $200. Used it to play with distros and do lab work. It was definitely worth it. It even had another RAM slot, so I could stick in another module to make it 8GB.

You get used to the display; it's just text editing anyway.


You really don't; you either deal with not being able to see much text at once, or with absolutely crappy font rendering.

When I needed an underpowered cheap laptop in 2016 for text editing, I bought an Asus X-something that had 8 hours of battery life, some anemic Atom CPU and 4GB of RAM for 210€, and even that came with a 1080p screen.


I'm not the kind to stack millions of words in a microscopic font. I prefer to work from afar with a comfortably sized font, which didn't cause much trouble. Another big reason why I don't like IDEs is that they have too many panels taking up space; terminals don't have this problem.


I'm sorry, but I work with size 14 Consolas in Visual Studio Code, which is a far cry from a microscopic font, and the difference in rendering quality at the same size on the two resolutions is massive.


Yep, 1080 is still (sadly) considered not too bad for a laptop.


I have the reverse problem. I want a laptop with a 1TB SSD, but those seem to appear only in configurations with other premium options, such as 4K displays. My 55 year old eyes can't see 4K resolution, so I'd prefer to save the money, power, and time not painting them. 1920x1080 works for me.


There are a couple of options:

1. Buy a cheap laptop with replaceable memory and NVMe slots and upgrade them yourself for less than the preconfigured options cost. Dells typically have replaceable memory and NVMe drives.

2. Buy a cheap laptop and put a 1TB NVMe drive in a Thunderbolt 3 / SuperSpeed 10 enclosure. This would be the way to go for a MacBook Pro.

If you're going to do SuperSpeed 10 (~1 gigabyte/sec):

NVME disk: https://www.amazon.com/Crucial-1TB-NAND-NVMe-PCIe/dp/B07J2Q4...

SuperSpeed 10 enclosure: https://www.amazon.com/Plugable-Tool-free-Enclosure-Thunderb...

If you want to do something faster (3 to 4 gigabytes/s) then you should look at the Samsung Evo/Evo Plus/Pro NVMe sticks and a proper Thunderbolt 3 enclosure for NVMe like this one:

https://www.amazon.com/WAVLINK-Certified-Thunderbolt-Enclosu...

FWIW, I have that Plugable SuperSpeed enclosure above and a Samsung Evo NVMe stick, and the SuperSpeed 10 enclosure is the bottleneck. I get 1 gigabyte/sec in reads and writes.


Most laptops can have their components swapped out. I've upgraded my RAM [4->16GB] and swapped out the slow non-NVMe 128GB M.2 boot SSD for a 2TB NVMe SSD.

I used a cheap M.2->SATA tray in my laptop's second slot with the 128GB SSD and Macrium Reflect* to mirror the old boot disk to the new SSD.

Next up I'm going to get a 4TB data SSD to replace the 128, and swap out the laptop's 60Hz display for a 144Hz display.

Unless the laptops you're considering have soldered drives, are under unusually stringent aftercare plans (enterprise), or you don't want to mess with hardware, these are pretty straightforward tasks. You can probably Google "(laptop model) ssd swap site:youtube.com" and watch someone go through the steps. If YouTube doesn't have anything, replace YouTube with Reddit.

* https://www.macrium.com/reflectfree


> My 55 year old eyes can't see 4K resolution

My old eyes really appreciate the significantly sharper text. If you can't see things, well increase the font size!!


I'm with the parent. Driving a 4K display is really power hungry and the advantages are basically zero if you are using a 16 point font anyway. I'm at 1920x1080 on a 13" laptop display. If I've got my math right, that's 169 dpi. A point is 1/72", which gives me 2.35 dots per point, or about 37 pixels of height for a 16 point font. With anti-aliasing, that's absolutely fine for readability.

If I had a monitor that was twice the size (reasonable on a desktop), I'd be looking for 4K. Or potentially if I wanted 8 pt fonts to be shaped nicely... But of course, I couldn't read them anyway on my display... Anyway, on a mobile platform (what the original article is talking about), 4K is a bit of a loser unless you have super eyes, IMHO.
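Spelling out that arithmetic (just restating the numbers above as a quick check):

    import math

    diag_px = math.hypot(1920, 1080)   # ~2203 px along the diagonal
    dpi = diag_px / 13.0               # ~169 dpi on a 13" panel
    px_per_pt = dpi / 72               # a point is 1/72", so ~2.35 px per point
    print(round(dpi), round(px_per_pt, 2), round(16 * px_per_pt))
    # 169 2.35 38  -- i.e. a 16 pt glyph gets ~37-38 px of height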


13" is tiny for getting work done, I suppose it hd makes sense there. I have a 15 and 20", and 4k makes it tolerable.


13" laptop fits in a handbag (it's A4 sized) and mine weighs 800 grams. I get 8-10 hours of real web development (with Docker, emacs, Firefox) on a single charge (I'm often without power for the entire day). Again... mobile :-) If you just want a luggable, then 4K can be a reasonable option -- especially if work is paying for it. However, for the extremely nomadic work style that I have 1080p, big battery is way better.


may I ask which model laptop you use? 8-10hrs at 1080p is very good. I get 6-8hrs on an Intel N3060 at 720p, but that's not great for development.


I posted this in the sibling post, but I'll copy it here: It's a GZ73/NL-NRA https://dynabook.com/direct/onyx-blue-pgz73nl-nra.html Actually, I was mistaken. It's 859 grams :-) That includes the battery. This particular model has only a 256 gig drive on it, though. I should also point out that it's an Intel box. I'm impatiently waiting for an AMD box with similar battery performance.

I should also say that I use Arch Linux with i3, powertop settings and set my backlight down to about 7% to get the longest battery settings. This is fine indoors, but if I'm working outdoors or in a sunny place, I'll have to up the backlight which will cut a couple of hours off the peak performance.


many thanks, impressive stuff


Is that 800g with a battery in it?! What laptop is this?


It's a GZ73/NL-NRA https://dynabook.com/direct/onyx-blue-pgz73nl-nra.html Actually, I was mistaken. It's 859 grams :-) That includes the battery. This particular model has only a 256 gig drive on it, though. I should also point out that it's an Intel box. I'm impatiently waiting for an AMD box with similar battery performance.


thanks, very impressive. if only it were available to buy ><


ditto! I'm off to Japan in a couple of months, I'll try and come back ~859g heavier!


I don't understand why they don't sell it elsewhere. It was the same with the previous model I bought. You've got a full power, A4 sized, < 1kg laptop that gives you a full day's work on a charge. How can they not sell that? And eventually Toshiba decided they had to exit the laptop market because they couldn't make any money at it... It's baffling.


They do sell it here, it's just under a different brand name and model number:

https://us.dynabook.com/computers/laptops/portege/X30/X30-F1...

It's ~1kg, but that's mostly due to additional RAM and the like.


That's not actually the same machine. There is a similar model on the Japanese side. It's missing some extra USB-A connectors and an actual network connector. I think the extra weight is actually because battery technology is different. It is possible that they can't sell the lighter / longer life battery in the US. But, you are right anyway. It's close enough :-) I'm glad they are finally realising that this is a useful configuration for some people.


Hot off the press there's this[0] guy weighing in at 870g.

[0] http://uk.dynabook.com/press/releases/dynabook-europe-porteg...


It's rare enough to find an 8th-gen i7 in anything without a >=15" screen and a keyboard with a numpad. Dell ultrabooks, perhaps, but they start at 1.2kg AFAICS.


I've been using Dynabooks (no longer made by Toshiba) for precisely this reason. They have a variety of strange models that can give you low resolution displays with very long battery life. The only problem is that I'm not sure you can find those weird configurations outside of Japan. Now that they are a Taiwan company, I'm hoping so (especially because I'd love it if the concept would catch on). Anyway, troll their truly horrible website, using the search function. That's how I tend to find the model I need. Also, note that in Japan the price on the website is inflated. I found that if I "became a member" (give them my email address), the price was 30% off. No idea why they have such a bizarre (and seemingly counter-productive) pricing system. No affiliation with Dynabook or Toshiba -- just been my goto laptop for 10-15 years.


What do you mean they're no longer made by them? They still have them for sale here:

https://us.dynabook.com/computers/laptops/portege/X30/


Yes, but Dynabook is now a Taiwanese company. Toshiba still sell the computers that are made, though. There is a "design" element apparently, but the two companies are supposed to move farther apart over time... Or at least that's what I'm led to believe.


You can buy a 1TB SATA SSD for like $120 or a 1TB NVMe SSD for around twice that.


It really depends on the laptop how easy that replacement is, but yeah, that can be a _great_ option. A friend got a great deal on a laptop that would have been perfect except it was spinning-rust-only. So we upgraded that to an SSD and saved probably $400 for an hour's work.


There are plenty of under-$500 laptops with 15.6" FHD IPS screens where you can upgrade the RAM, M.2 and 2.5" storage slots. I just updated a $300 Acer with 16 GB of RAM and a 1 TB SSD; it cost me less than $200 extra.


macOS has an option to change the size of text independently from the resolution. This is how I use it on 4K displays. Can't remember the exact setting name. Other operating systems might have similar features.


Are NVMe slots a thing yet?

I have a 1TB one. It's pretty tiny. It cost me $70. Why is it so hard to put one of these inside a laptop?


It's really not that hard to find a laptop with upgradeable SSDs and even RAM. We just have to stop buying laptops with soldered, non-upgradeable everything.


I had the exact same problem, so I bought a Dell XPS 15, then swapped out the SSD for a 1TB one that was faster, too.


My thinkpad x395 has 1TB storage. It's a 13" laptop with 1080p IPS screen, Ryzen 3700U and 16TB ram.


I was amazed with 16 TB RAM for a laptop, but obviously it was 16 GB instead


Oh, I'd love to have 16TB of RAM indeed.


May I ask why 1080 is sad? I was on 1366x768 during the early 2010s but switched to 1080 over the last couple of years. While my personal desktop is on 1440p, 1080 is still very much a 'good enough' resolution for corporate work and day-to-day browsing. I see this trend sticking around for the next 5 years at least, because most consumers don't really see a non-marginal benefit in moving to 1440p.


I have two Dell 4k monitors that allow for incredibly sharp text. Weren't that expensive. Can pry them out of my cold dead hands.

Only thing I wish for is a squarer aspect ratio, 16:10, 3:2, etc, which (even more) sadly aren't allowed any more.


Yup, the Microsoft Surface line has really spoiled me here. I was hoping to see more 3:2 ratio displays as a result. Not sure why but everything just went wrap around or double wide instead of a more productive A9 ratio.

I have heard rumors that Microsoft will release a display to complement the line. I hope it's true.


1080p is barely big enough for 2 windows side-by-side. A bit short in general, too. We had 1920x1200 before the 16:9 craze took over (a decade ago?).


> May I ask why is 1080 sad?

A serious laptop these days might have 1800 rows or so. 1080 is pretty miserable. Pixels are going to be large and blocky and text (which is what most corporate work and day to day browsing is) is going to be blurry and poorly defined.


Actually, most of the time laptops (besides super expensive MacBooks) are very bad at using 4K, and the screen looks way blurrier than a nice Sony Vaio 1080p screen (what an amazing computer that was) from almost 10 years ago.

Especially with Windows still sucking at really handling 4K.


Even MacBooks are scaled to something that looks like a crisper 1440x900 (the 13-inch at least). And they aren’t 4K native resolution. It’s nice to have better looking font, but it’s the kind of feature that sort of disappears for me over a long session unless I’m comparing 2 displays.

In terms of usable screen space, 1080p can be either adequate or a little too much depending on display size. I have to zoom my X270 in a bit.


I run a 4K desktop and a 4K laptop, and both have excellent visuals with crisp text on Win10. The only things that still suck at 4K are old apps, but there's only so much the OS can do about them short of just refusing to run apps that are too old (which is the Apple way of handling such things).


If you get a screen appropriate for running at 2x zoom then there shouldn't be any visual problems caused by the rescaling.


1440 is a thing, you don't have to go right to 4k (which yeah is often going to be a mistake in a budget to mid-tier machine).


1920x1080 on 14” and you don’t need scaling. That’s the reason why I go with that resolution. Scaling is still not perfect on Windows.


Look at Asus's new line of premium laptops. All AMD Ryzen 4000. Ryzen 9 CPUs have been out of stock for months as enthusiasts and professionals jump at the value of getting 16 cores for ~$750. I think the tide of the market is shifting significantly, especially in consumers' minds.


"ASUS DITCHES Intel" - yesterdays vid from LTT about Asus at CES 2020: https://www.youtube.com/watch?v=hGUESEq75ZI


What's wrong with a 1080p display on a laptop? My current laptop has a 2560×1440 screen and I can almost guarantee that my next one will be 1080p matte if I can help it. There is just no point in me wasting compute on pixels that I'll never see.


Ditto; I opted for the 4K display in my XPS 15, and as gorgeous as it may be it's beyond unnecessary for text editing, web browsing, and even watching video.

If I had to spec out the same machine today, I would opt for the 1080p display. Not just for the increased battery life, but better overall graphics performance as well. Unless you're doing serious graphics work, it's just not necessary.

4K has its place, but smaller displays where you'd be hard-pressed to notice the difference at typical viewing distances is just not it.


Text rendering when the OS is running in a HiDPI mode (e.g. 4x as many pixels used for each character) is not a waste. It's more legible and looks wonderful. No chance of not noticing it in action.


I wonder how we've gotten by for 40+ years without HiDPI font rendering. (sarcasm)


Feel free to stick to spinning hard drives too.


https://www.ultrabookreview.com/34951-asus-zephyrus-g14-14-i...

Screen (three options):
- 14 inch, FHD 1920 x 1080 px resolution, IPS-level, 60 Hz, matte, 100% sRGB, Pantone Validated
- 14 inch, FHD 1920 x 1080 px resolution, IPS-level, 120 Hz, matte, 100% sRGB, Pantone Validated
- 14 inch, WQHD 2560 x 1440 px resolution, IPS-level, 60 Hz, matte, 100% sRGB, Pantone Validated

But I'm sure there will be many other offerings, and possibly Apple as well. But who knows, Apple may just license the core and launch a hybrid x86/ARM system. Fun times though.


Why does the 15" have 240 hz but the 14" only goes up to 120hz?

The AnimeMatrix display looks kind of cool though.


Thermals? No good reason? General availability of panels?


Prelude: I am a HiDPI junkie: I had an IBM T221 200 ppi monitor more than 12 years ago. I currently have a Dell XPS 13 9350 with 3200x1800 at 13.3" (276ppi).

However, my next ultrabook (with a 14" or smaller display) will probably feature what you call "a shitty 1080 display". 1080p at 13.3" is about 166ppi, which is just fine for a laptop because you're not as close to the display as you are to a smartphone or a watch.


Agreed wholly. I spent hours trying to find a laptop with a higher resolution screen than 1080 I could replace my T470 (14") with. After getting it I realized I significantly liked the 1080 panel better - aside from the fact I wish it had a higher nit rating. Not only that but the 1080 panel is much more power efficient and with the dual batteries on it I can get more than 10 hours out of it running Linux (and everything works really well). I have a hard time context switching between my work provided laptop (MBP 15", 32GB, Touchbar, etc) and it. The screen and the keyboard are jarring and, again, would rather do all my work on the smaller screen 1080 laptop with a real keyboard.

Apple could do wonders in a 14" MBP format with a typing focused keyboard (no Touchbar) and a 1080 panel. Would trade in my existing corporate laptop in a heartbeat. Especially if it had 32GB of RAM (needed for VM/container setup and could actually use 48-64, but that isn't going to happen anytime soon).


They made the 13" Air for the last ten years with a 13.3 900p screen

> Apple could do wonders in a 14" MBP format with a typing focused > keyboard (no Touchbar) and a 1080 panel.


While it might seem like a nitpick - they're not equal IMO. Putting a 13" MBA next to the T470 (I actually own one, funny enough - A1369) is not remotely the same end user experience. Also, Macbook Air is not in the same target market as Macbook Pro. What I'd ideally want is a compact machine with tech specs that align with the MBP line. CPU and RAM that the Air line will never see. I will agree that the keyboard on the MBA was pretty good. Apple truly messed that up.

In my T470 I have 32GB of RAM (user upgradeable) a dual core i7 @ 2.8GHz and an M.2 NVME slot which leaves me the flexibility for upgrade beyond the 500GB drive I bought it with. As it stands right now it's also got an open slot for a 2.5" SATA drive as well. It also has two batteries - a 3 cell 24Wh internal battery (field replaceable) and a 6 cell 72Wh external swappable battery. My one gripe is it only has one USB-C (it is Thunderbolt). I bought it in 2017 and it's still like brand new. Keyboard is still fantastic, screen hinge shows no sign of failing, I've never used a case to protect it and it continues to take a daily beating. If something fails, it's likely I can buy the parts and replace them with little to no fanfare. Honestly, if Apple sold something like this - it would sell like crazy. Lenovo now even supports Linux like a (mostly) first class citizen with regard to firmware and drivers.

For fun I just measured both and was looking at the screens side by side - the T470 has a bit more width, which is probably what I like since I do a lot of side by side comparison, and the additional pixels on the screen are at just the right scaling for how I like to work (which is probably why the higher resolution MBP I have feels jarring when switching).

For reference, what I measured: MBA: 11 5/16"w x 7 1/8"h (13.3" diag); T470: 12 3/16"w x 6 7/8"h (14" diag).


To nitpick, you'd have to compare with the similar model from when the MacBook Air came out.

The T410 (a brick with a terrible screen and a battery jutting out the rear) or the X201 (a nicer brick with a tiny trackpad).

I definitely agree with you that the current T470/80/90 is a pretty nice machine; it's worlds ahead of what the ThinkPad was when the Air was 'new'.


^ And I will, by the way, continue to insist that this was and is a great machine. People liked to hate on the screen but it was fine.

Great keyboard, great touchpad, pretty small, stellar battery life. I'm currently typing on the 11 inch variation.


I thought the same thing. But it only works if display scaling is perfect. Windows 10 just isn't quite there yet, especially if you mix in an external monitor of a different PPI.

1080 ~ 13/14" isn't quite big enough for me to use native, and not enough DPI to use scaled.

Lenovo's 1440p and Apple's 1600p displays are ideal. 1800p would actually be a little better. The Surface book's 2000p screen is best of all plus it is 3:2....


"And laptop manufacturers will now proceed to stuffing these into bargain bin laptops with shitty 1080 displays"

Which is good, as we would have some serious processing power without having to take out a second mortgage on the house. At 15", a 1080 display is anything but shitty (assuming it has decent gamut and dynamic range). Higher resolutions are much better served by standalone monitors. 40" 4K works just fine for me, while on a 15" screen 4K either makes fonts too small, or, if I increase the scaling, there is no real difference from 1080.


IIRC, high dpi removes the need for font anti-aliasing, ClearType, ...


Why would I care about that from a user's point of view? Games use their own font rendering and regular business software looks just fine. And paying hundreds of dollars more for a tiny 4K display is, in my opinion, not very good ROI.


Lenovo makes a T450 (or something, I forget) that's a really good workstation laptop. They also have an AMD SKU of this laptop with everything except the upgraded screens.

Part of the reason the X1 Carbon is great is its 14" 1440p screen. Now give me an AMD version and I would be all over it.


The T495 is the T490 with Ryzen. I am so going to get the next model with Ryzen 4000. The only question is how Lenovo is going to solve the naming problem ;)


The T490's successor is going to be called T14, the T590's is the T15, and the X390's the X13, and so on. They'll probably go the "n-th generation (202X)" route for model years, as they've been doing with the X1 series. We'll see what they're going to call the AMD versions. Source: https://www.notebookcheck.net/Lenovo-ThinkPad-T14-X13-T490-X...


They should just call them T800 and T1000 if you ask me


I couldn't wait so I got the E495. I wanted the T495, but I'm not going to pay a premium for a laptop that's worse than previous revisions (no option for second battery, soldered RAM).

However, since I got a good deal on mine, I'll consider getting an upgrade if the next batch of laptops look good. I hope Lenovo backtracks on soldering on RAM and removing the option for a second battery for the T-series. If I want thin, I'll get an X-series, I got the T-series because it's a work horse that can be configured according to my needs. I'm not satisfied with the solution of having a USB battery bank as a backup.


Seems solved already - the AMD variant is simply labeled as being 5 better than the Intel.

Or did you mean the 490+10 = 4100 vs 500 naming problem?


The latter - historically they added or removed digits when the number overflowed. Both aren't very good solutions IMO.


>> Now give me an AMD version [of X1 Carbon] and I would be all over it.

And so would I. X1 has been my workhorse for the past 5 years or so. If the native Linux GPU drivers work well, I'd upgrade to a Radeon APU based X1 Carbon. I don't even care if it's expensive.


Have the X395 (3500U config) -- running Pop!_OS without issue.


Looks awesome, but the 16G of RAM and the 14" 1440p are what makes the X1 awesome.


The X395 is available with 16GB RAM. Don't recall the screen resolution. I looked into buying one but my XPS 13 developer edition's warranty doesn't expire until next October.

My next laptop will be a Zen 2, regardless of whether or not Dell steps up to the plate. I hope they do. I prefer the fit and finish of my current XPS13 to my previous X230. Also the XPS 13 developer edition ships with Linux, which I approve of.


I only saw an 8GB option on the site. Resolution maxes out at 1080p.


The T495s should be the solution for this but annoyingly they don't offer the 1440p screen like they do on the T490s (that can even go to 32GB of RAM).

But an AMD X1 would be awesome indeed. The better chassis and 4K screen would make it the ideal machine for me. I'd love it if they did that in the Gen 8 but I'm not holding my breath...


I configured mine with 16G RAM, but yeah, the screen options max out at 1920x1080 and 300nits. That said, for a laptop with this form factor it's got an amazing keyboard.


I think it will take a little more time. The average customer that doesn't follow the news has to get used to AMD not being the cheap stuff. But I think that is slowly changing, with AMD now being competitive in the notebook, desktop and server markets. I think Dell/Alienware just presented their new flagship gaming desktop with an AMD CPU.


The average customer doesn’t even notice AMD vs Intel.


On a typical laptop sized screen (13-17") I don't find > 1080p displays to be that much better, the pixels are already tiny. If it provides significant cost savings it seems like a good decision to me. Of course also offering higher res options for those who want to pay for them would be nice.


The main benefit is text. After using 4K for a while, regular 1080p looks very fuzzy and pixelated to me solely because of text.


I was recently looking at various Ryzen 3xxxU series laptops to help a family member trying to get something on a budget. One thing I noticed while researching them is that a lot of people think Ryzen systems are cheap builds because they require two RAM channels for the iGPU to be any good. So, on a Dell system with 8GB RAM, instead of getting one 8GB DIMM and one free slot, you end up with two 4GB DIMMs and no free slots. A lot of people think this is Dell cutting corners on builds by shoving in cheaper parts and subsequently making it harder to upgrade the laptop. I think there's a market education component here that isn't going to be easy to tackle.


Shipping dual-channel is better because 99% of people never upgrade.


I agree. But that speaks to the market education component I was referring to. For people that don't know the difference, they're probably not looking at a premium build anyway. And for those that want the capability to upgrade, occupying both RAM slots with lower capacity DIMMs appears to be a worse situation than you get with a typical Intel build. Moreover, since they're occupied with 4GB DIMMs a fair number of people think this is done as a cost saving measure on a cheap machine. I'm not an industry analyst, but that strikes me as something that's going to be difficult for AMD if they're looking to move into the higher end of the market.


I think it will be different this time. And the reason is that Intel has really dropped the ball. Even if Intel's chips are better, they have had massive shortages over the past few years. If you aren't buying 100s of laptops a week, there were periods where Dell would not sell you a modern business laptop at any price because they didn't have Intel chips.

I’m hoping the system integrators have learned their lesson and go out of their way to support and promote AMD for their own sakes.


>I wish Apple would adopt this at least for some products. It'd give AMD more credibility, and Apple would be able to negotiate pricing more easily with Intel.

Maybe I am reading too much into a small thing, but this is the first time I remember any vendor having Apple's product image and logo in their presentation. Normally, for other brands, using their logos and images is simply a request away. Not Apple. Apple has historically distanced itself from any vendor using its logo as some sort of endorsement, since winning the Apple contract is a quality label in itself.

In almost all other cases you can mention, speak of, or use the text "Apple" with regard to Apple using your component. But I don't ever recall the product image and logo being used.

I think the Ryzen 4000 series fits the MacBook Pro 14" or MacBook Air Retina perfectly. Or even the MacBook Pro 16". Instead of using a gaming-focused Navi / RDNA GPU, it is bundled with a GPGPU-focused Vega, which Apple has already been optimising for their Metal compute.

I'm just waiting until USB 4 / Thunderbolt 3 is fully opened up to non-Intel solutions; I guess Apple will be good to go then.


AMD GPUs are powering their Mac Pro (and macbook):

https://www.amd.com/en/graphics/radeon-apple

I suspect they also allowed them to use the logo.


Yes, and they have been for years if not a decade, along with Intel, PowerVR, Marvell, Broadcom, Samsung, LG, etc. There are more than 200 companies in the Apple supply chain, and it is very rare to see Apple mentioned or their logo used.

The first rule about being a vendor of Apple is you don't talk about being a vendor of Apple.


Was Apple mentioned somewhere in the slide deck that Anandtech has as the second part of the article? Or was it in a video version of the presentation? Super interested by what you're saying here.


AMD had a slide showing a bunch of partners, including Apple (for GPUs only, of course).


It was shown in the liveblog [1]. Just to give you an example of an organisation which wants to use Apple's logo but likely hasn't got permission, because Apple does not want to officially endorse it: [2]

[1]https://images.anandtech.com/doci/15333/IMG_20200106_140421....

[2] https://aomedia.org/membership/members/


Fascinating. I remember seeing an article rumoring a very expensive Apple desktop. Possibly related?


1080p is in that sweet spot where it looks good enough without requiring buggy and inconsistent HiDPI scaling.


I’m using HiDPI on Linux/KDE and haven’t had any issues for a long time. Windows can’t be much worse. I think you need to reevaluate.

1080p might be fine for a phone. There's nothing 'sweet' about blurry text[1] on a low-DPI laptop screen, in my opinion. Especially if you're buying new. Paying real money for obsolete tech makes no sense.

[1] https://webcube-general.s3.amazonaws.com/eizo/media/contenta...


16:9 though... Absolute garbage. 16:10 at the least is a must, but now that 3:2 exists, everything else just looks so hopelessly inadequate.


1:sqrt(2) is pretty much the ideal ratio anyway, A4 in landscape :)


3:2 is MS Surface only, yes?


The MateBook X Pro (and others in the MateBook series) as well, and I saw something new released (non MS/Huawei) with it this week too. So hopefully it will become more common.


I'm curious about how many people actually have laptops with resolutions higher than 1080p and use them without display scaling.

People buy a high resolution laptop and turn the scaling on 200%, effectively nullifying the benefits of a high resolution display.


I'm sorry but you don't know what you're talking about.

Using a hi-dpi display at native res on a 15'' screen is a recipe to kill your eyes, whereas scaling to 200% gives you butter-smooth text that relaxes them. That is the benefit.

When I was young, I was all for keeping text small and stuffing as much info on the screen as I could. Nowadays, I just struggle to work on any native-res screen: unscaled low-dpi looks like crap, unscaled hi-dpi is tiring. My world changed with the 2012 retina MBP and I simply cannot go back.


There is scaling that just makes every pixel a 2x2 pixel area on the screen, but there is also scaling that increases the effective resolution. I think me551ah was talking about the former kind, which also seems useless to me; just buy a 1080p display. The second kind, though, makes everything nicer.


Fractional scaling is stupid. 150% × 4K = 1440p logical, so just buy a 1440p screen; 150% on a 4K panel means a misaligned pixel grid. If you need more real estate and want HiDPI then you need a 5K display, not 4K. 5K is 1440p@2x.
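To make that arithmetic concrete (just the division, assuming the usual 3840x2160 "4K" and 5120x2880 "5K" panels):

    def logical_resolution(w_px, h_px, scale):
        # The workspace the OS exposes is the physical resolution divided
        # by the scale factor; HiDPI sharpness comes from how many physical
        # pixels back each logical pixel.
        return w_px / scale, h_px / scale

    print(logical_resolution(3840, 2160, 1.5))  # (2560.0, 1440.0) -- 1 logical px = 1.5 physical px
    print(logical_resolution(5120, 2880, 2.0))  # (2560.0, 1440.0) -- 1 logical px = 2x2 physical px

Same 2560x1440 workspace either way; the difference is that 1.5 physical pixels per logical pixel can never line up with the pixel grid, which is the misalignment described above.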

Fractional scaling should be under seven “Are you sure? What you’re doing is stupid.” pop-ups. Instead at least Windows 10 actually makes 150% the default. Boggles my mind.

Here’s a good article series on this topic by Elementary OS dev:

https://medium.com/elementaryos/what-is-hidpi-and-why-does-i...

https://medium.com/elementaryos/top-3-misconceptions-about-h...


There are non-English computer users who benefit greatly from high DPI displays, especially with CJK characters; and there are a lot of us.


With scaling you still get smooth fonts and crisp images (if available at higher resolutions than "screen pixels"). The benefit of a high resolution display really isn't to fit more stuff on the screen anymore (especially at 12-14").


It seems, after reading all of these comments, that the ONLY significant improvement of HiDPI is font rendering. I am a 3D artist and do not run HiDPI as it is a waste for my use case. I can see how most people don't do artwork and prefer the font rendering of HiDPI. Please don't think we all want nice font rendering; sometimes we need that 100% scaling.


> People buy a high resolution laptop and turn the scaling on 200%, effectively nullifying the benefits of a high resolution display.

The benefit is the higher resolution itself. Scaling doesn’t change the resolution. You still get the benefit.


200% scaling is the intended use case.

E.g. for text:

https://www.eizoglobal.com/library/basics/pixel_density_4k/t...


A high resolution display is useful for more than just seeing additional things on the screen at once.


I'd say that scaling up is exactly the benefit of a HiDPI display; if you don't, you keep the same crappy low resolution rendering, only at a smaller size!


The image is crisper to me than on a regular non-scaled display.


How's the power usage of these Ryzen chips vs Intel vs Apple's home-brew ones? The consumer cares a lot more about battery life than they do about getting the last 10% of performance out of a chip these days.


These will be the first Zen 2 7nm laptop APUs from AMD. The article goes into detail about getting more performance at the same power, the greater IPC, and some architectural improvements to reduce power usage throughout the SoC to extend battery life. Of course, watch for reviews once retail units are available!


> And laptop manufacturers will now proceed to stuffing these into bargain bin laptops with shitty 1080 displays. To this day desktop/workstation Ryzen CPUs are not treated as a premium product in spite of their superior performance.

Seems like all, or nearly all, of ASUS's laptops this quarter, from premium to value, are Ryzen; shipping with 120Hz+ displays, some of them IPS.


One would think 1080p displays aren't even profitable to make these days compared to the cost of a HiDPI panel, just like we don't see 800x600 or 1024x768 displays anymore. Eventually, whatever money is being saved by a lower-res display will be too marginal to make it reasonable to put in new products (i.e. when 2x DPI becomes the new 1x).


Can't blame just the manufacturers. AMD seems to do no QA on its BIOS before handing it to OEMs. Most of the previous generation's AMD laptops had issues with Linux. AMD BIOS updates are still not supported under Linux. If AMD doesn't care about the core enthusiast crowd, why should people care?


Most BIOSes can update themselves from the system setup menu. Just download the BIOS update in an appropriate format, put it somewhere that's accessible to the BIOS, and you're set.


Linux users are not the core enthusiast crowd for laptops.

Even among my extremely tech savvy crowd of friends, I only know one guy who runs linux as their desktop environment.


Updating the BIOS is so rare that I really don't mind going into the BIOS to do an upgrade, especially since an unexpected reboot/freeze can brick your BIOS. I update it maybe once a year, and only if there's some feature that I need.


Can you provide some examples? I own two AMD-powered laptops and neither had BIOS issues.


https://wiki.archlinux.org/index.php/Laptop/Lenovo#E_series

And note that https://fwupd.org/lvfs/firmware/new does not list any AMD laptops. So even if you get a functional laptop, you won't be able to update the bios easily.


That's really unfortunate. I appreciate you sharing that with me. I hope this gets more attention.

But is the BIOS something controlled by AMD or the OEMs? My whole point in this and other threads is that OEMs are handling things poorly, and that's what's affecting AMD's prospects in the mobile market. The BIOS is a highly device-specific firmware, so that's why I'm asking.


The development of the BIOS is (typically) done by a company like AMI that interfaces with silicon manufacturers like AMD/Intel and laptop manufacturers like Lenovo/Dell. All of them are to blame for BIOS bugs since almost all laptop platforms (barring Chromebooks and a few others) are locked down in a way that nobody other than them can fix the issues.


> So even if you get a functional laptop, you won't be able to update the bios easily.

Extracting a ZIP on a FAT/NTFS USB key is difficult?


Nothing is difficult. The point is that there is no excuse for using a) Windows, b) FreeDOS, c) geteltorito, or any hacks and workarounds in 2020 for updating the BIOS of your laptop.

To your point, the method of putting the BIOS file on a USB stick is also not universal. I don't think any Lenovo laptops support that. And while passable, that's also kludgy.


I updated my A485 BIOS recently using fwupd; it was much easier than on Windows, just run a command and restart. To me the year of the Linux desktop is already here :)

But yeah, it seems that not all machines are supported; you either get lucky or have to do research beforehand.


Replying to self for an edit: the T495 has BIOS updates on fwupd but the E series does not. So it's a case of price differentiation by Lenovo.


Appreciate the update. I'm glad that you brought this issue up. It's an important one to me and one I think needs more attention.


I think that this will depend on the actual performance of the CPU.

While Apple is a huge manufacturer, MS did choose to use them in their Surface laptops. And as much as I wanted the previous mobile Ryzen CPUs to take off, most of the reviews were against them in the battery life tests, while the performance tests weren't enough to make up for it. The same was true for any AMD-powered laptop, not only for Surfaces.


Microsoft is also onboard, I think? Their latest Surface Laptop 3 has top-priced models with AMD CPUs


Previous gen. Not competitive with the 1065g7 in the 13” Surface Laptop 3.


It was a good first step, hopefully they iterate with the newer gen (see sibling comment) and keep making a more premium offering than has been standard lately.


I doubt Apple cares that much about x64’s future on the sort of manufacturing timescales (2-4 years) they work on.

I bet they already have an approximate release date for the MacBook Air that ships with a 64-bit ARM AnX CPU. It's been brewing for at least 3 years now.

2021, perhaps?


Do you think Apple would bring back universal binaries?


Not necessary under the App Store model, I think - don’t they already have partial/conditional downloading of hardware-platform-specific assets running on iOS?

I wonder how many fucks they give about non-App Store apps. I rely on them, but I know Apple is using the App Store to herd developers toward embracing best practices for security like sandboxing and entitlements and userspace drivers, et c.

Also, there is nothing to “bring back” AFAIK; the os x binary format (mach-o, isn’t it?) still supports multiple architectures like it did the last time we all went through this dance. See also: 64 bit iOS/watchOS binaries.


"Shitty 1080p displays"? Back in the day, 1024px was the highest accessible resolution even on desktop hardware, never mind laptops. And now we're even getting that vertical resolution on a wide-screen display.


"Back in the day" you could hack CRTs to output 2560 by 1600 at over 75 Hz and I'm pretty sure I've seen resolutions in excess of 4000 px width.


> Back in the day, 1024px was the highest accessible resolution

You're going back a bit there. 1920x1200 LCDs have been around for about two decades, and 1600x1200 was achievable on CRTs in the mid 90s.


> Back in the day, 1024px was the highest accessible resolution even on desktop hardware

Well, sure, a very ancient day, in computing terms. The first QXGA (2048×1536) notebook was in 2002.


Back in the day my 386sx was enough to run 1024x768

So? What does that have to do with 20 years later today?


Back in the day we had decent aspect ratios.


A lot of things were tolerable "back in the day". Nowadays I'd rather pay $100 more and have much lower eye strain, if such an option is available.


To be fair, 1080 is quite ‘tolerable’ up to 17in.


How would having more than 1080p on a 13-15" screen prevent eye strain? Or does 'shitty' here mean a cheap TN panel, not the 1080p resolution itself?


Internet commenters all have amazing eyesight (which also seems to improve along with hardware).


I don't have amazing eyesight anymore thanks to staring at shitty low res displays for almost two decades before Apple released Retina Display. I haven't used a low res display since then. I also run my stuff with pretty large font sizes, and use a 32" 4K IPS display as my primary monitor for work. I advise others to do the same.


A 4K 32” isn’t that great. The minimum I would use is a 4K 28”, but even then I still see pixels. The best solution right now for good desktop resolution is an iMac 27” 5k, which has similar DPI to a modern high end laptop. And running without the need for anti-aliasing is great for reading text in a code editor.


I have a 27" 4K display that I run at 1440p because 2160p is too small for me to read text easily. I'm on Linux and don't use any scaling. 1440p on a 27" at roughly 3ft viewing distance is slightly better than Retina quality according to this[0] calculator.

[0] https://www.designcompaniesranked.com/resources/is-this-reti...


Use "large text" setting in accessibility settings (assuming you're using Ubuntu), and zoom in your browser, terminal, and text editor using their own settings. That's what I do. Looks great. Though my monitor is 32".


Thanks for the suggestion! I'm on Arch though and pretty happy with my current setup anyhow.


Interesting. I myself can never view native 4K under 32". Text is just too small for me.

> The best solution right now for good desktop resolution is an iMac 27” 5k, which has similar DPI to a modern high end laptop.

These iMacs are by default downscaled to 1440p.


This doesn’t even make sense. A modern OS and its applications can deal with whatever resolution you throw at them; the text in Visual Studio Code will look just as big on a high-DPI display as on a low-DPI one. "Native" is only a relevant term when an application has a fixed resolution and must be upscaled, downscaled, or resized to its native resolution. Those are mostly games; real applications that manipulate text shouldn't be included (though some old legacy ones are, and I don't use any of those).

iMacs by default are not downscaled unless you are using fixed-resolution applications. Simply put, no scaling is needed.


Put it further away from you - it's good for your eyes to focus further out.


"I still see pixels" is a silly argument. Just put a translucent/matte filter on the screen if you care that much.


It’s the difference between reading text on a piece of paper printed on a modern laser printer and one printed on an old-fashioned dot matrix printer. Both are readable, but one looks much better than the other. You can test this by printing out a screenshot of a PDF vs. printing the PDF directly.

And yes, I use matte filters on my high-DPI screens, especially my iPad.


And an offset-printed book page will look "much better" than even the laser-printed one, but do people really care at that point? That's pretty much the same as complaining that a 4K resolution is too pixely.


4K is too pixelated if the screen is large enough. I had a Hitachi 28” 4K before I upgraded to an iMac 27” 5K. The text on the Hitachi looked like crap: you could see the pixel corners and subpixel anti-aliasing color bleeding through on 10, 12, and larger fonts (that is on Windows; on a Mac you would see lots of blurriness via font smoothing).

Aesthetics matter. For a few hundred dollars more, it feels like I’m almost looking at paper (if it weren’t for the backlighting).


At 32 inches you're only getting 138 DPI. Much like a 17 inch 1080p screen, it's acceptable but it could be a lot clearer.

The standard from decades ago is 96. Just wanting to double that, to make the pixels not blatant, is not extravagant.

Ignore the raw resolution here. Talk about DPI.
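For reference, the arithmetic behind those density figures; a quick sketch (the panel sizes below are just common examples, not anyone's specific monitor):

    import math

    panels = {
        '24" 1080p': (1920, 1080, 24),
        '32" 4K':    (3840, 2160, 32),
        '27" 5K':    (5120, 2880, 27),
        '15.6" 4K':  (3840, 2160, 15.6),
    }
    for name, (w, h, diag) in panels.items():
        # PPI = diagonal pixel count / diagonal size in inches
        print(f"{name}: {math.hypot(w, h) / diag:.0f} PPI")
    # -> roughly 92, 138, 218 and 282 PPI respectively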


DPI is only half of the picture though (forgive the pun). Viewing distance is the other half, your eyes can't tell the difference between 100 DPI and 300 DPI if you're far enough away.


So what, do you read books ten feet away to reduce eye strain?

A modern desk gives you about two feet between you and your monitor, which isn’t overly close, and you’ll still very much notice the difference between 138 and 220 DPI.


Only for my 150" books ;)

I sit 3ft away from my monitors, at that distance I can't notice a difference.


Bingo. And it's less stressful for the eyes to focus on an object that's further away. Net result: I might actually have some vision left over by the time I retire. Worth the cost.


Yeah but we're talking about computer screens, not televisions. The viewing distance is around 20-30 inches and 100DPI definitely does not cut it at that distance.


You're not wrong, but around 30" is where it starts not to matter (for 1440p anyway), and I happen to sit ~36" away from my monitors. See [0].

[0] https://news.ycombinator.com/item?id=21977571


'Retina' is just one metric. When it comes to crisp edges, there's still a difference up to 5 or 10 times the PPI.

https://en.wikipedia.org/wiki/Hyperacuity_(scientific_term)


Also, you’d have to scale up text to 150% anyway (exactly to avoid straining your eyes).


Which is fine. The lines of fonts are much less blurry that way, especially if subpixel rendering is off.


Lenovo makes good laptops with AMD. Not sure what you call premium exactly. I wouldn't call Apple laptops premium; I personally don't view them as well designed. The keyboard, for example, is atrocious.


Linux tech tips made a surprisingly long stream of videos, all boldly blasting how they were 'perf leader'. I'd be surprised if brands don't follow youtube and such a little.


I think you mean Linus Tech Tips (not affiliated with Linux or Linus Torvalds).


Torvalds is pro AMD too anyway ;)


yeah auto suggest fail


Re: Apple using AMD cpu.

Apple can't until either Intel open-sources Thunderbolt or Apple drops Thunderbolt support.

Years ago Intel said they would, but they still haven't.

Edit: I originally wrote Lightning and meant to say Thunderbolt.


There are a handful of AMD systems with Thunderbolt already. Presumably the situation will be better with USB 4.0.


>Years ago Intel said they would but they still haven’t yet.

They did, and it is called USB 4. The top USB 4 profile provides full TB3 compatibility.


Intel has made Thunderbolt royalty free.


It may be royalty free but getting access to the chips and reference designs still requires signing an NDA with Intel (which they may or may not even give you a chance to do). They are still the gatekeepers for the technology because no one else makes comparable chips.


It's still not royalty free yet.

There are AMD motherboards with Thunderbolt support, but they aren't certified.


I like the 1080p display in my laptop. In fact, the 2-in-1 convertible I bought was mainly offered in 4k and the 1080p option was harder to find.

So, it totally depends on the buyer.


Sounds like Microsoft is in it for the long haul. Hopefully they come out with more SKUs with top-end hardware to pilot the industry forward for premium AMD laptops.



1080p is good enough on a 15.6" screen plus you get extra battery time. Not everyone needs 4k.


>bargain bin laptops with shitty 1080 displays

The era of shitty 1366x768 displays is finally over.


I wonder if Apple knows something about what Intel has in store, or if they just don't care because the performance of their ARM-type chips will be so vastly better that they can emulate x86 and still be faster than what Intel has.

That being said, the Mac Pro with a 64-core Ryzen chip and 4TB of memory would have been absolutely nuts.


Precisely. I am wondering if there are some behind-the-scenes deals with Intel to put premium stuff only into their laptops or motherboards. Another thing: the latest TRX40 boards all have at most 4 PCIe slots, which is a joke for a workstation. Imagine a 64-core CPU with only 4 PCIe slots... I don't believe in coincidences across all manufacturers; it doesn't make sense to be less competitive.


Intel/AMD don't sell chips; they sell "platforms" and most vendors won't deviate too much. And 16 lanes times 4 slots does equal 64. If you want more there's always Epyc.


Lenovo announced some Yoga laptop with Ryzen 4000, which is an AMD version of their Intel laptop. The differences? No 4k and only 8GB RAM for AMD. 4-core Intel has 16GB, 8-core AMD has 8GB RAM. Coincidence, right?


I've had it with Intel and laptop OEMs.


x299 has up to 7 PCIe slots with only 28 PCIe 3 lanes. TRX40 has 64 PCIe 4 lanes. They could easily support a mode with 8 PCIe slots in PCIe 3 mode and full lanes. Instead we get 4 PCIe slots... My Deep Learning rig has 6 GPUs already.


Well, I hope this works out for AMD; currently they apparently can't compete on power efficiency, at least judging by one of the more interesting reviews from 2019:

"The Microsoft Surface Laptop 3 Showdown: AMD's Ryzen Picasso vs. Intel's Ice Lake"

https://www.anandtech.com/show/15213/the-microsoft-surface-l...

Edit: looks like we'll see more Intel/AMD head-to-head designs, e.g.: https://www.anandtech.com/show/15305/acer-swift-3-either-wit...


Not that I want to apologize for poor performance, but I remember feeling let down by that review because some obvious differences between the platforms were not highlighted that should feed into the conclusions, not the least of which was the drastically different memory used between the two platforms.

The article was billed as "let's see the difference between AMD and Intel" but there were significant platform differences that made it not quite apples-to-apples.


Picasso is a nearly two-year-old design on 12nm being compared to Intel's latest Sunny Cove design on their (profusely) bleeding-edge process. It was a valid comparison, as that is what Microsoft was selling. But I expect a better showing with process parity.


The article on The Verge quoted AMD as doubling the power efficiency from Zen+ (with 7nm Zen 2). Very interested to see real world battery life before making any judgements.


Any idea if these support full HW acceleration of VP9 (Youtube)?


The previous gen has it, so yes.


And about AV1?


Very unlikely. They'd be the first to ship customer PC parts with AV1 decode support. But the next generation almost certainly will. Same goes for other vendors, for Nvidia post-Ampere, for Intel post-Icelake/Tigerlake/whatever the next is nowadays, etc.

Otherwise, there are AV1 decode IPs available, including a SoC or two. Plus some very recently announced set-top boxes and TVs. So you'll definitely be seeing some hardware with AV1 support shipping this year.


Can't wait. I'm really hoping AV1 is the be-all end-all and we can all stop moving to new codecs.


AV1's real advantage is also the free licensing, something nothing out of the MPEG-LA cartel will ever offer.


AV1 is inferior to h265, and the successor to h265 should come soon, rekt AV1, and no longer have big royalty issues.

BTW, the hardware cost of not using h265 far outweighs the royalty cost. It has always been pure economic nonsense for Google (YouTube) to exclude h265.


Could you elaborate on the weaknesses of AV1? I hadn't heard that. Is it because of the wavelet-based i-frames that h265 uses? Or a bunch of little things?


I think he may be referring to AV1's economic weakness, not the actual quality itself. As in, the cost to bring quality to this level while disregarding encoder and decoder complexity. Kostya has a rant about it here [1]. He was the guy that made RealVideo work across all platforms.

Personally, I am giving the industry the benefit of the doubt and one last chance on VVC / H.266.

[1] https://codecs.multimedia.cx/2018/12/why-i-am-sceptical-abou...


> As in the cost to bring quality to this level while disregarding encoder and decoder complexity.

AV1 is amazing for YouTube, Netflix and torrent-scale videos currently.

I can't find the reference for this, but in the beginning the codec developers and big companies had meetings. The answer from big companies for how much encode time increase would be acceptable was 100x to 1000x the current standard. Thus the design.

Of course the encode time problem will be solved for regular users too once AV1 encode ASICs for consumer hardware enter the market in 3–5 years. There are a few solutions already offering cloud FPGA encoding alongside beefy servers. If streaming bandwidth costs are a significant issue for you, then you can easily afford that.


I wouldn't be so optimistic https://xkcd.com/927/


That's not why next-generation codecs keep being made.

I remember back in the 90s / 00s when a CPU would stutter doing MPEG-2 decode, and later MPEG-4 decode with the Xvid or DivX codecs. Eventually, CPUs got faster, and guess what?

Faster CPUs mean you can "afford" better compression / decompression routines. No one uses Xvid or DivX anymore, because CPUs are so fast that we want better compression (not faster compression).

H264 was then popularized, and I remember in the late 00s / early 10s when "Netbooks" would stutter decoding H264. ASIC chips were invented to better decode H264. But guess what? Today, CPUs / GPUs are so much faster that H264 is now too easy, and we desire even better compression.

Repeat the story again for H265 and VP9, the next generation (with VP9 patent-free becoming more popular).

AV1 is designed to be the next step. It's still too slow for most people, but ASICs are being developed so that cell phones can run the AV1 decoder. Eventually, computers will get much faster overall, and everyone can then move to AV1.

-----------

Eventually, computers will be so fast that even AV1 is "too quick", and a new, slower, better form of compression will be invented.

We move to newer codecs because our technology continues to change. Now maybe AV1 will be the "end all, be all" of codecs, but it will be a sad day if that is true. Because for codec-progress to end implies that CPUs have stopped improving.

Moore's law seems to be dying, and later nodes (5nm, 3nm, etc. etc.) are taking longer and longer to research-and-develop. Maybe our computers will stop improving soon...


The speed of CPUs doesn't really matter for codecs these days. Only for adoption during the first few years until fixed-function hardware starts shipping.

Besides, most of the processing burden is on encoding. CPU performance hasn't resulted in better codecs, but rather advances in encoding methods. AV1 encode is 100x slower than previous codecs, but that's fine due to the savings at scale.


We need NUCs from AMD !


Check out MiniPC[1], which is essentially the same thing, but made by OEMs instead of by Intel directly.

- [1] https://www.amd.com/en/products/embedded-minipc-solutions


DeskMini A300 by AsRock. Slightly larger than NUC, but can house full power 65W desktop CPU.


This one is using desktop CPUs (nothing against them) and unfortunately not the newest Ryzen versions.


It supports the 3400G, which is the most recent with integrated graphics.

https://www.asrock.com/nettop/AMD/DeskMini%20A300%20Series/#...


Ah, thanks for that information. I'm not super familiar with AMD's product line and somehow thought that all Zen 2 chips have integrated graphics. So we can expect some more powerful G-line APUs in the future too, I guess...


Yeah hoping to see something like that soon, will be pretty awesome I think.


Those already exist. For example, look at the HP ProDesk machines.


The Zotac CA621 nano features the 15W Ryzen 3 3200U with Radeon Vega 3 graphics


But the Ryzen 3xxxU is Zen+ and not the current generation (Zen 2)


+1 These seem like they'd be perfect for it.


Anybody know of any companies planning to ship these in a NUC-style form factor?


ASRock did the DeskMini A300 last year which was quite popular. I'm hoping they will refresh it with an updated motherboard for the new APUs coming this year.


It was the only one I believe. I'm still on the lookout for a good MiniPC solution that doesn't have compromises like a soldered CPU or requiring SODIMMs etc.



I'm looking forward to building a mini HTPC with this. Hope it can handle 8K HDR encode/decode well.


I got the previous Ryzen 3000 edition in my new cheap Lenovo, and it kills all my big Intel machines in all benchmarks. I cannot use it for benchmarking, as it drops frequencies from 4.3 to 1.5 as it likes (or does temporary freezes), but for testing and dev the AMD works wonders.


Since when are CPUs called APU?


APU = CPU + GPU in a single package or die.


These APUs won't fix today's problems in laptops. Idle power consumption is largely due to all the components together, and bigger batteries are just more expensive. I'm still waiting for more efficient SSDs. There's no point in an efficient SoC if you still need a large drive and a bright display.

And of note: a 1.8 GHz base freq is on the low end of the performance/watt curve we've seen from their other products. Maybe AMD expects most workloads not to fully use all 8 cores and lets the boost algorithm max out the cores?

Also, where's PCIe 4? My guess is they are waiting a cycle on purpose due to power constraints.


> And of note: a 1.8 GHz base freq is on the low end of the performance/watt curve we've seen from their other products. Maybe AMD expects most workloads not to fully use all 8 cores and lets the boost algorithm max out the cores?

The 1.8 GHz base freq is just to hit the desired 15W TDP. The approach is to pick a TDP, say 15W, then adjust the base freq for the core count to hit it. That's why the 8C part ends up at a 1.8 GHz base while the lower-end 4C part has a 2.9 GHz base. Then let turbo be the thing everyone actually uses on a daily basis, because base frequency is irrelevant; it's not actually an input into anything the CPU does, for either AMD or Intel. The CPU monitors its power draw to stay within a power budget and a temperature budget. What speed it ends up running at is then dependent not just on how many cores are used but also on what type of instructions they are running. It's a fully dynamic system these days, making single-number specs useless.
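A toy model of that trade-off, with entirely made-up constants (assuming per-core power scales roughly with frequency squared because voltage tracks frequency, and ignoring uncore/iGPU power), just to show why more cores in the same budget means a lower base clock:

    def base_freq_ghz(cores, package_w, k=0.45):
        # Rough model: package_w ~= cores * k * f^2; solve for f.
        return (package_w / (cores * k)) ** 0.5

    for cores in (4, 6, 8):
        print(f"{cores} cores @ 15 W -> ~{base_freq_ghz(cores, 15):.2f} GHz base")
    # -> ~2.89, ~2.36, ~2.04 GHz; the real 8C part sits lower still (1.8 GHz),
    # presumably because this ignores uncore/graphics power and binning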


Man, I hope the TDP is adjustable, as well as the voltage, like many Intel chips. Would be great to see what these chips can do.


All AMD Ryzen chips, even the cheapest $50 3000G (technically branded an 'Athlon'), are fully unlocked and you can adjust everything to your heart's desire.

Unlike Intel, AMD does not charge extra for unlocked chips.

It's not like it matters tho anyway: with Zen 2, PBO pretty much pushes your chip up to the limits of its thermal constraints. I'm looking at Ryzen Master right now, and my TDP reads "1000W".


Given this is a laptop it doesn't really make sense to adjust the TDP even if you can. You're going to be limited by the laptop's cooling solution, which is going to be at best sufficient for factory settings but nothing more. More commonly it's actually not quite sufficient for factory settings, leading to thermal throttling over sustained loads.


Oh I am well aware. Laptop coolers are barely good enough, without exception. It's fascinating really, it's like they want their systems to run at close to throttling temperatures, so they last a tad bit longer than their warranty.

Doesn't bother me, I've got 2 hands and 2 eyes, anything can be modified with those. Except encrypted BIOSes, for now :D


That's not true at all. The SoC still makes a significant difference. The recent Surface laptops are the perfect demonstration of this: https://www.anandtech.com/show/15213/the-microsoft-surface-l...

SSD efficiency would be the thing that's nearly irrelevant. What even stresses the disk at all in a typical ultrabook workload?

Under load, though, mobile CPUs hitting north of 40W makes them, along with the GPU, the most power-hungry component by a lot. Display power is like 5-10W at "normal" brightness levels.


"Under load though mobile CPUs hitting north of 40W" Most ultrabooks feature 15watt parts. They might hit 17watt but will definitely throttle after that. Yes, if you buy a laptop with a high tdp part like 35 then yes, it will consume the most power.

My problem is that idle power consumption is what mostly dictates battery life. and the APU wont always be in C0. Especially when doing mundane things like a powerpoint presentation or watching a movie. in those cases the CPU consumes only around 5-8W on todays mobile processors. Like you said, the screen brightness is already at 5-10 watts. Factor in that you're either streaming a film from netflix, adding another couple watts for your ethernet or wifi chip. Or you're streaming it from your ssd. Meaning your ssd will be kept out of its lowest power state and also consume a couple watts on top of all your other components.

Unless you're rendering for hours on your battery, your CPU wont be the top contender for power consumption.

"SSD efficiency would be the thing that's nearly irrelevant." you'd be suprised. most hdd's are still more power efficient than flash storage. Especially the bigger capacities have active idle power consumption of 1.5w that is, before we're adding any reads or writes to it. Yes, ssd's can power down but its not a silver bullet.


>"SSD efficiency would be the thing that's nearly irrelevant." you'd be suprised. most hdd's are still more power efficient than flash storage. Especially the bigger capacities have active idle power consumption of 1.5w that is, before we're adding any reads or writes to it. Yes, ssd's can power down but its not a silver bullet.

You will need data to prove your point. Most laptop SSDs have lower idle and active power than HDDs AND faster read/write response times, hence they drop back to their idle power state much more quickly.

i.e. SSD energy per workload is significantly lower than HDD.


I don't have perfect apples-to-apples comparisons, but I can paint you a pretty good picture. Active idle power consumption is mostly determined by the controller and the amount of RAM on the device; read and write energy costs are very difficult to determine (for someone not working at Samsung). First of all, almost all old SSDs beat new ones in power consumption, which makes sense as the controller needs more energy to read smaller and smaller TLC NAND, especially with the move to NVMe drives. You can see that here: https://www.anandtech.com/show/9702/samsung-950-pro-ssd-revi...

The idle power draw is also specified in the spec sheet, which is why I didn't bother with data. The EVO drives don't have as much RAM and have a more forgiving controller; they are along the 0.5W lines I think.

So just how do these drives compare to HDDs, and how do they consume power on everyday tasks? I don't have any articles on modern mobile HDDs because they kinda went out of style, but one article gives you the best-case scenario: https://www.tomshardware.com/reviews/ssd-hard-drive,1968-10....

Old, low-capacity SATA flash drives are the most power-efficient devices. Hopefully AnandTech or Tom's Hardware will retest NVMe drives in laptops to hammer the final nail into the coffin of NVMe SSD power consumption compared to disks.


Sigh. This is HN, not AnandTech; there are people here who design SSD controllers and firmware for a living.

> The idle power draw is also specified in the spec sheet, which is why I didn't bother with data.

Which is why you should check. I should also remind you that laptop SSDs are tuned with energy usage in mind rather than performance.

A modern 970 EVO desktop NVMe drive idles at 0.03W. You are off by a factor of 10.

The same drive has a deep idle state of 0.005W. There, your idle power figure is off by 100x.

Let's assume a laptop HDD has similar idle and standby power (which it doesn't).

Compared to an HDD, the same drive has 10x the sequential read speed and 1,000x to 100,000x the random read performance. Let's pick a middle-of-the-pack figure of 100x.

Even if the active power of the SSD were double that of the HDD (which it isn't), in a mixed workload where the SSD finishes 20x faster, the SSD would still have saved 10x the energy.

> hammer the final nail into the coffin of NVMe SSD power consumption compared to disks

There is no final nail needed. The difference is so drastic, and there is a reason why every laptop manufacturer jumped to SSDs: not because they were cheaper, but because they save a huge amount of energy and hence battery.
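The energy-per-workload argument in back-of-the-envelope form; the power and throughput numbers below are illustrative assumptions in the spirit of the figures above, not measured values:

    def workload_energy_j(active_w, idle_w, data_mb, throughput_mb_s, window_s=60):
        """Energy to move `data_mb`, then idle for the rest of a fixed window."""
        busy_s = data_mb / throughput_mb_s
        return active_w * busy_s + idle_w * max(window_s - busy_s, 0)

    hdd = workload_energy_j(active_w=2.0, idle_w=0.8,  data_mb=500, throughput_mb_s=100)
    ssd = workload_energy_j(active_w=4.0, idle_w=0.03, data_mb=500, throughput_mb_s=1500)
    print(f"HDD ~{hdd:.0f} J, SSD ~{ssd:.0f} J per 60 s window")
    # -> HDD ~54 J, SSD ~3 J: the SSD draws more while active, but finishes
    # ~15x sooner and then idles near zero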


> Most ultrabooks feature 15-watt parts. They might hit 17 watts but will definitely throttle after that.

That's not at all how Intel's TDP or turbo works.

From the anandtech surface comparison I linked, the Intel part is the 15w i7-1065G7 ( https://ark.intel.com/content/www/us/en/ark/products/196597/... )

And yet, when hit with a multithreaded load it immediately shot to over 40w power draw: https://images.anandtech.com/doci/15213/CinbenchR20MT.png

That "15w TDP" is just the limit it will sustain. It regularly spikes over it, and by a lot, for 10-30 seconds at a time.

Or in Intel's technical terms the spec that they provide is the PL1 TDP. The spec that they don't provide is the higher PL2 TDP. Turbo is initially governed by PL2, not PL1.

> Especially when doing mundane things like a PowerPoint presentation or watching a movie. In those cases the CPU consumes only around 5-8W on today's mobile processors.

And per the anandtech link you provided, Samsung's specs for the 950 Pro NVMe drive mention 70mW idle and 2.5mW DevSlp power draw, and anandtech measured idle power draw at 20mW.

This puts CPU power draw at 100x larger than the SSD's in the simple video & PowerPoint presentation use cases.
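For anyone trying to picture the PL1/PL2 behaviour: a crude sketch of the idea. The parameter names mirror Intel's terminology, but the values and the simple step function are my assumptions; the real implementation budgets turbo with an exponentially weighted moving average of power over a time constant (tau), not a hard cut-off:

    def power_budget_w(seconds_into_burst, pl1=15.0, pl2=44.0, tau_s=28.0):
        # Simplified: allow roughly PL2 for ~tau seconds, then fall back to sustained PL1.
        return pl2 if seconds_into_burst < tau_s else pl1

    for t in (0, 10, 30, 120):
        print(f"t={t:>3}s -> budget {power_budget_w(t):.0f} W")
    # -> 44 W at the start of a burst, 15 W sustained; which is why a "15 W" part
    # can legitimately show 40+ W in a short benchmark run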



