
Still only supports one display. Also, they increased prices across the board.

Quite happy with my M1 Pro, a beast and a hell of a purchase.



This is disappointing. It has 2 Thunderbolt 4 ports but can only drive one external display. So unnecessary. This would be the perfect machine for my home and work setups, but I have 2 external displays in both cases.


They are 100% doing this intentionally. They want to drive power users toward spending more and they know that many of us will…


Really disappointing considering they now support up to a 6K external display, yet you can't do 2x 1080p or 2x 4K.

I think Apple knows a lot of customers care about this and want it to be a barrier getting them into a pro machine. The cheapest laptop they sell with multi-external-monitor support is $1k more than their cheapest laptop overall ($2k vs $1k).


Wait, you can't extend the display to a second screen?


Yes, but only one external display plus the built-in display.


There was some dock from a 3rd party vendor that let you do more screens, but I can't remember which one...


There are a few. They are able to do this by using something called Display Stream Compression. While it may be fine for some, a lot of us would prefer not to have a diminished experience with a compressed stream.


Display Stream Compression (DSC) is fine. It is not a "diminished experience". DSC is visually lossless.

Instead, those docks use a technology called DisplayLink which has nothing to do with DSC. DisplayLink means that external monitors are basically "software" displays that are tremendously slower and often very limited in resolutions and frame rates. Having any DisplayLink display connected also breaks HDCP and can cause other problems.


The relevant standard is proprietary, but Wikipedia quotes it, confirming that "visually lossless" is marketing lies:

https://en.wikipedia.org/wiki/DisplayPort#Display_Stream_Com...


"Marketing lies" is unnecessarily inflammatory. I googled before posting to see if I could find anyone legitimately complaining about DSC, and it really seemed like pretty much everyone was happy with it.

There are always people like "audiophiles" who claim to be able to distinguish impossibly small differences, and there is perhaps a very small number of people with exceptional hearing who actually do... but 320kbps compressed audio is "audibly lossless" for most of the population. The exact same thing applies here, by all appearances. I'm sure there are mp3 test cases where the compression does something terrible, just like with DSC... that just isn't what people actually encounter day to day.

I can't see the second study linked, which is on IEEE, but if you look at the first one, Figure 4 shows that DSC was "visually lossless" in almost all test cases. Let me quote one thing from that study:

> As described above, the HDR content was selected to challenge the codecs, in spite of this both DSC 1.2a and VDC-M performed very well. This finding is consistent with previous series of experiments using SDR images.

So, this testing was done with samples that would challenge the codecs... and they still did great. It doesn't appear to be "marketing lies" at all. It appears to be a genuine attempt to describe a technology that enables new capabilities while dealing with the bandwidth limitations of the available hardware.
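To put the bandwidth problem in perspective, here's a rough back-of-the-envelope calculation, a sketch only: real links also carry blanking and protocol overhead, and the ~3:1 ratio is just DSC's typical operating point.

    import Foundation

    // Rough numbers: raw video bandwidth vs. DisplayPort link payload.
    let width = 3840.0, height = 2160.0   // 4K
    let refreshHz = 60.0
    let bitsPerPixel = 30.0               // 10 bits per channel, RGB

    let rawGbps = width * height * refreshHz * bitsPerPixel / 1e9
    print(String(format: "Raw 4K@60 10-bit: %.1f Gbit/s", rawGbps))  // ~14.9

    // Usable payload of common 4-lane DisplayPort links:
    let hbr2 = 17.28   // DP 1.2 (HBR2)
    let hbr3 = 25.92   // DP 1.4 (HBR3)
    print("HBR2 payload: \(hbr2) Gbit/s, HBR3 payload: \(hbr3) Gbit/s")

    // At DSC's typical ~3:1 compression, the same stream needs about a
    // third of the bandwidth, which is how one link can carry more
    // pixels (or more displays) than it otherwise could.
    print(String(format: "With ~3:1 DSC: %.1f Gbit/s", rawGbps / 3.0))  // ~5.0

A single raw 4K@60 10-bit stream already eats most of an HBR2 link; with DSC it fits several times over, which is exactly the headroom those multi-display docks are exploiting.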

Do you have some terrible personal experience with DSC to share? Did you do a blind test so that you weren't aware of whether DSC was enabled or not when making your judgments? Are you aware that almost all non-OLED monitors (especially high refresh rate) always have artifacts around motion, even without DSC?

I haven't personally had a chance to test out DSC other than perhaps some short experiences, which is why I based my initial comment on googling what other people experienced and how Wikipedia describes it. You pointed me to a study which seems to confirm that DSC is perfectly fine.


>in almost all test cases

Common sense suggests that "visually lossless" means no detectable difference by the naked eye ever, not in "almost all test cases". MP3 is a very old codec, and it's possible that there are still some "killer samples" that can be ABXed by skilled listeners with good equipment even when encoded by a modern version of LAME. A better example of something that could reasonably be called "audibly lossless" might be something like Opus at 160kbps, for which I've seen no evidence of any successful ABX. But even that is usually called "transparent", not "audibly lossless", so not only is "visually lossless" a lie, the name itself is propaganda.


> Common sense suggests that "visually lossless" means no detectable difference by the naked eye ever, not in "almost all test cases".

Common sense suggests no such thing. When you buy a bottle of “water”, it actually has a bunch of stuff in it that isn’t water. How dare they?! When someone says “I’ll be there in 15 minutes”, it is highly unlikely that they will show up in exactly 900 seconds. Such liars! Why are you even meeting them? When people say airplanes are “safe”, you might angrily point out how many people have died, not realizing that “safe” is relative to other things and not an absolute in that context. This is common across basically everything in life. “There are no absolutes.” If you think common sense is to automatically assume every statement that even looks remotely absolute is intended to be taken absolutely… that is not common. Short statements will come off as absolute, when they are just intended to be taken as approximate, but even absolutes are usually meant to be taken as slightly less than absolute.

“Visually lossless” is a description of the by far most common experience with DSC. They’re not describing it as truly lossless, so you know there is some loss occurring. It is natural to assume that in extraordinary circumstances, that loss might be noticeable side by side… but you don’t have a side by side when using a monitor most of the time, so the very lossy human vision system will happily ignore small imperfections.

> so not only is "visually lossless" a lie, the name itself is propaganda.

Your whole comment shows that you don’t understand how communication works. It is “visually lossless” as far as people are concerned. The study shows that! This is not at all what propaganda looks like.

When Apple labeled their iPhone screen a “retina screen” because people would no longer notice the pixels, I suppose you called that a “lie” as well because you could lean in really close or use a microscope? The retina display density achieved its stated goal.

There is literally no point in continuing this discussion when you take such an absolutist position and refuse to consider what alternative communications would look like. How about “99.9% visually lossless”? That would be even more confusing to people.

Communicating complicated concepts succinctly is a lossy process. Language is lossy. As they say, “all models are wrong, but some are useful.”


I understand perfectly that "visually lossless" was chosen to emotionally manipulate people by triggering positive associations with the word "lossless", despite not actually being lossless, or even transparent. Language is lossy, but that does not excuse corporations twisting it further in their attempts to exploit you.


Who is being exploited in this situation? I've never heard of anyone successfully selling a product because DSC is "visually lossless". It's a short explanation for a complicated technology, on the rare occasion that anyone googles DSC to try to understand what it is.

Your continued use of inflammatory and frankly incorrect language isn't helping your case. If this is "exploitative" marketing language that is "lying", you should file a case with some consumer protection body. You have also failed to demonstrate how you would communicate the overwhelming effectiveness of DSC that the study showed.


"High fidelity lossy" would be an honest name.


Sure, that name would have been fine too.

I don’t think “honesty” is an issue at play here either way, as I have discussed in great detail (and with many examples that you surely have encountered), given how people (unfortunately?) communicate in the real world.

If the study had shown something substantially different (or if people online were frequently having bad experiences), I would totally have been onboard with calling it marketing overreach and lies.


DSC doesn't solve the hardware limitation of only being able to drive a single external display on the M1; that's a hardware thing that cannot be changed. You have confused it with DisplayLink, which is basically another graphics card (which is why it "solves" this problem), but the experience is worse because it's CPU-intensive/software-rendered.


Good catch. Definitely meant DisplayLink.


I bought a DisplayLink dock (plus whatever else Amazon recommended) and followed the online tutorials, but I couldn't get it to work with 2 external monitors. It isn't straightforward.


Does the 13" MBP support multiple displays?

Sorry- I'm horrible at reading Apple Specs and inferring the capabilities


Just the one external screen (two screens total, including the internal one).

https://www.apple.com/macbook-pro-13/specs/

People have gotten round it by connecting additional screens using DisplayLink adapters.


Awesome thanks for the assist! That page makes it clear, I guess I'm actually just horrible at sifting through the marketing to find the spec page :)


DisplayLink is alright-ish for light office work or coding, but not much else.


Can you use two external screens if you disable the internal screen? That's what I do now with a ThinkPad.


No, sadly you can't.


My 13" 2014 MBP supports 2 mDP + 1 HDMI = total 3 external displays.

Running an external display at 4K@60Hz is possible but not straightforward; it requires patching the Core Graphics framework or using a 3rd-party boot loader. Newer models don't have this limitation, AFAIK.


Intel != Apple Silicon


Wait, really? I was using 2 external displays alongside the built-in display on my M1 just a few days ago. Or is it a limitation only with M1 MacBook Airs?


> on my m1 just a few days ago

M1 != M1 Pro/Max/Ultra.

If you have an M1 Pro or M1 Max or M1 Ultra, that is not "[your] m1".

Each chip has significantly different capabilities in a number of aspects. As far as display support goes,

M1 = 1 external display[0]

M1 Pro = 2 external displays

M1 Max = 4 external displays (3 USB-C + 1 HDMI)[1]

[0]: the exception is the M1 Mac Mini, which doesn't have an internal display, so it can use two external displays.

[1]: once again, the desktop version without a built-in monitor can support one additional monitor, so the Mac Studio with M1 Max can support 5 displays.
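If you want to sanity-check what your machine is actually driving (as opposed to what the spec sheet says it can), here's a minimal sketch using the Quartz Display Services API in CoreGraphics. It enumerates currently active displays, not the chip's maximum:

    import CoreGraphics

    // List the displays macOS is currently driving and flag the built-in one.
    let maxDisplays: UInt32 = 16
    var displayIDs = [CGDirectDisplayID](repeating: 0, count: Int(maxDisplays))
    var count: UInt32 = 0

    if CGGetActiveDisplayList(maxDisplays, &displayIDs, &count) == .success {
        print("Active displays: \(count)")
        for id in displayIDs.prefix(Int(count)) {
            let kind = CGDisplayIsBuiltin(id) != 0 ? "built-in" : "external"
            print("\(id): \(CGDisplayPixelsWide(id))x\(CGDisplayPixelsHigh(id)) (\(kind))")
        }
    }

On a plain M1 laptop you'd expect to see at most the built-in panel plus one external display listed here, no matter how many adapters you chain (DisplayLink screens excepted, since those are software displays).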


Is there a technical reason that the M1 only supports a single external monitor (an optimized intended experience), or is it just market segmentation?


Every GPU on the market supports a limited number of monitors. There are fixed-function (not programmable in a traditional sense) blocks of silicon that are used to support each monitor.

M1’s GPU came equipped to only support the internal monitor and one external monitor… a very slim configuration, but that’s likely influenced by its smartphone processor ancestry. Smartphones don’t need to power a bunch of displays.

The larger M1 chips have bigger GPUs with more of those fixed function blocks.

It isn’t artificial market segmentation at a software level, but it is certainly market segmentation at a hardware level, and something they knew would happen when they designed these chips.

In the end, they were pretty spot on about the market segments. Most people want/need external display support… but one external display is plenty for most people. People who need more are likely to also want more in general, and the higher end options satisfy that.

It still would have been nice for them to upgrade things for M2.


> People who need more are likely to also want more in general, and the higher end options satisfy that.

I disagree. The topic comes up repeatedly whenever Apple Silicon is discussed. It’s my impression that for quite a lot of us the base M1 or M2 would be everything we wish for from a pure performance perspective. Yet the limited display output options are the only thing that force us towards the Pro and higher tiers.

It seems like a deliberate limitation and I don’t like this form of product segmentation.


Thank you for the thoughtful response.

I agree that they were pretty spot on with the market segmentation. I'm one of the folks who doesn't need more than a single external monitor, and I consider myself a power user when it comes to resource consumption. I just wish the cost of RAM would come down, holy moly.


Got it, I thought they were saying it was a limitation of the chip not the specific laptop they had. Thanks for the clarification!


It is a limitation of the chip. The M1 chip and the M1 Pro chip are not the same chip.

The laptop itself has nothing to do with it. If they decided to put an M1 Pro chip into the MacBook Air, it would be able to have 2 external displays.


Alright, thanks.


M1 Ultra = Every display known to man.


Apple probably could support 10 displays off of M1 Ultra, but I guess they decided to leave some displays for the rest of us.


The 14- and 16-inch MacBook Pros support multiple external screens up to 6K. https://www.apple.com/macbook-pro-14-and-16/specs/


That's the case for the M1 Pro and M1 Max. The regular M1 only supports a single external display.


The M2 as well, unfortunately.


Only supports one external display, as opposed to the 14"/16" machines that can do maybe 3 or 4?


They did? Seems to be the same price as M1 MacBook Pro.



