
Not to speak for anyone else, but one thing I gently disagree with:

>Given that Hackintoshers are a particular bunch who don’t take kindly to the Apple-tax[...]

I have zero issues with an Apple premium or paying a lot for hardware. I think a major generator of interest in hackintoshes has been that there are significant segments of computing that Apple has simply completely (or nearly completely) given up on, including essentially any non-AIO desktop system above the Mini. At one point they had quite competitive PowerMacs and then Mac Pros covering the range from $2k all the way up to $10k+, and while sure there was some premium, there was feature coverage, and they got regular yearly updates. They were "boring", but in the best way. There didn't need to be anything exciting about them. The prices did steadily inch upward, but far more critically, sometime between 2010 and 2012 somebody at Apple decided the MP had to be exciting or something and created the Mac Cube 2, except this time forcing it by eliminating the MP entirely. And it was complete shit, and to zero surprise it never got a single update (since they totally fucked the power/thermal envelope, there was nowhere to go), and users completely lost the ability to make up for that. And then that was it, for 6 years. Then they did a kind-of-sort-of-OK update, but at a bad point given that Intel was collapsing, and forced in some of their consumer design in ways that really hurt the value.

The hackintosh, particularly virtualized ones in my opinion (running macOS under ESXi deals with a ton of the regular problem spots), has helped fill that hole as Frankenstein MP 2010s finally hit their limits. I'm sure Apple Silicon will be great for a range of systems, but it won't help in areas that Apple just organizationally doesn't care about/doesn't have the bandwidth for, because that's not a technology problem. So I'm a bit pessimistic/wistful about that particular area, even though it'll be a long time before the axe completely falls on it. It'll be fantastic and it's exciting to see the return of more experimentation in silicon, but at the same time it was a nice dream for a decade or so to be able to freely take advantage of a range of hardware the PC market offered which filled holes Apple couldn't.



Apple does not want to cater to the hackintosh/enthusiast market because they are the most price conscious segment. Targeting that segment means putting out extremely performant, low-margin commodity machines. Doing so then cannibalizes the market for their ultra-high-end stuff.

Not only that, though. Enthusiasts are also extremely fickle and quick to jump ship to a cheaper hardware offering. If you look at all of Apple’s other markets, you’ll see loads of brand loyalty. Fickle enthusiasts don’t fit the mould.


When Apple first mandated kext signing (Mountain Lion?) they explicitly whitelisted certain community-built kexts used for Hackintoshes. IMO Apple and the Hackintosh community have mutually benefited until now. Many who became accustomed to macOS through a Hackintosh have eventually invested in Apple products.

Considering Apple has only gone after those who profited by selling pre-built Hackintoshes, and not everyone profiting from the Hackintosh scene, I would say Apple did care about the Hackintosh community in some way.

I thought the higher performance/price of Hackintoshes, especially with Ryzen, might force Apple to act differently, but now with the M1, Apple needn't worry about Hackintosh performance/price anymore.


> Apple does not want to cater to the hackintosh/enthusiast market because they are the most price conscious segment. Targeting that segment means putting out extremely performant, low-margin commodity machines. Doing so then cannibalizes the market for their ultra-high-end stuff.

Looking over the shoulder at a 64-core Threadripper with 256GB of ECC RAM, 3090FE, Titan RTX and Radeon VII, yeah right. Some of us do Hackintoshing because we want more dope specs than what Apple offers and customizability that comes with PC hardware.


What Apple could have done is continue supplying something like the G4 towers. Those were stunningly beautiful machines, and practical.


What if they simply decided that they didn't care for that part of the market? At some point we should just accept that.


Sure, but there's a legit gap in the datacenter— not having a sanely, legally rackable OS X machine is a pretty big problem for a lot of organizations. Not everyone wants to do their Jenkins builds or generate homebrew bottles on a Mac Mini under someone's desk.


Is this really an issue? They sell shelves that let you rack 2 Mac Minis in 1U space. You can also buy a rack mount Mac Pro if you want to spend really big bucks.


Isn’t the newest Mac Pro available in rack form?


Ah, so it is, and the thermal story there is definitely much better than with the Mini, there being a clear intake/exhaust flow. OTOH, there's still likely a gap in terms of management features, and the starting price of $6.5k for a 4U system is definitely going to be a barrier for some. Good to know there's at least something, anyway.


None of those use cases seem relevant for iOS or macOS development.


Surely you'd want CI builds for your app? I suppose you can always go the sassy option and just offload this problem onto Travis or CircleCI, but then they're the ones stuck figuring out how to rack thousands of Mac Minis, dealing with thermals in a machine that isn't set up for hot/cold aisles, a computer that doesn't have a serial port or dedicated management interface, etc.

If you're a big enough org or the app is for internal use, this might not be an option anyway. At that point I imagine most people just give up on it and figure out how to run macOS on a generic VM. But at that point you have to convince your IT department that it's worth it doing a thing that is definitely unsupported and in violation of the TOS.

Or maybe some of these are big enough that they are able to approach Apple and get a special license for N concurrent instances of macOS running on virtualized hardware? Who knows.


No company on the planet is big enough for Apple to make exceptions like that. All of them either use a cloud provider or a custom rack design just for Mac Minis.


Companies like Google or Microsoft aren't big enough? Google's Chrome and Microsoft Office alone, I would wager, are more than big or popular enough to get special treatment.

Adobe is smaller by contrast but I'd speculate has a much deeper relationship with Apple as well


All of them use Mac Minis as far as I know.


Nope, just build straight from Xcode.


Well sure, for a single person team. But as soon as you're working with other people, surely you want an independent machine making builds and running tests— this is literally item 3 on the Joel Test.


You would be surprised how some teams actually develop code, even timesharing iMacs among teams.


I wonder what the overlap is between those teams who do not invest in their infrastructure, and those who ship broken products.


If you ever get a chance to meet employees at CircleCI or some other CI provider at a conference after Covid is over, consider asking them about how they rack Mac Minis.


Good for them, I just use Xcode.


Ah yes, how could they have been so blind. They should have just put Xcode in the server racks.


What servers?


The entire thread is about running OS X in data centers. In data centers you run servers, if you hadn't noticed.


pjmlp's view appears to be that because their customers, who are not experts, don't know enough to ask for continuously tested software, they don't believe it is their professional responsibility to provide that either. This allows them to dismiss any complaints about macOS in datacenters as irrelevant.


On the contrary, testing doesn't come for free, and everyone gets what they care to pay for.


I'm not a consultant, but I believe it would be an ethical failing on my part to hand someone else a piece of code without extensive, automated testing and CI.


You have an 80-hour budget to deliver X features, no compromise or no pay; feel free to decide how to deal with testing.


Well, thank you for providing the first compelling argument as to why software practices need to be more formally regulated. Providing CI/CD should be the industry norm and expected default.


macOS is not a server OS, and the Xserve long ago stopped being an option.


I can't tell if you're being facetious or you just don't care about automated testing and continuous integration for your code.


CI/CD is largely ignored in plenty of consulting gigs.

I care to the extent customers care.


So true. I've done a lot of freelance work over the past 20 years. CI/CD has never come up. You're sometimes lucky if you can even set up a test system / site.


I pity your customers.


Their choice, no need to pity them.


Apple tried selling Xserves for years.


And to this day they're still the most gorgeous servers ever made, especially the Xserve RAID.


I don't think the answer is for Apple to force people into buying custom server hardware any more than it is to force them into making janky rack setups for Mac Minis.

The answer that most people would like to see would be a stripped down, non-GUI macOS that's installable at no cost in virtualization environments, or maybe with some evaluation scheme like Windows Server has, which effectively makes it free for throwaway environments like build agents.


> The answer that most people would like to see would be a stripped down, non-GUI macOS that's installable at no cost in virtualization environments

That's called "Darwin" and it's theoretically open source, but there doesn't seem to be a useful distribution of it. Whether that's due to lack of community interest or lack of Apple support is the question.


A useful distribution (for building anyway) would require all the headers and binaries from macOS, which wouldn’t be distributable, right? So you’d have to have enough of a free system to be able to get to the point where that stuff could be slurped out of a legit macOS installation. Sounds like an interesting challenge.


Okay, but offering no machine suitable for developers and power users will eventually hurt them when those users leave the whole ecosystem.


> offering no machine suitable for developers and power users

This perception strikes me as having warped in from a different decade. Nowadays, at least in my neck of the woods, developers almost universally use laptops, and Apple's still plenty competitive in the (high end) laptop department.

For the most part, the only developers I know who still use desktops are machine learning folks who don't like the cloud and instead keep a Linux tower full of GPUs in a closet somewhere. And then remote into it from a laptop. Half the time it's a MacBook, half the time it's a XPS 13. And they were never going to consider a Mac Pro for their training server, anyway, because CUDA.

I couldn't speak to power users, but my sense is that, while it meant something concrete in the '90s, nowadays it's a term that only comes out when people want to complain about the latest update to Apple's line of computers.


I work in games, where we write C++ in a multi-million-LOC base. Every developer in my company has a minimum of 12 cores and 96GB RAM. All of the offices are backed by build farms on top of this. There are entire industries that rely on very high end hardware. (Of course we also rely on lots of Windows-only software too, but that's only an issue once the hardware is solved.)


Fair, and we could spend ages listing all the different kinds of people who have really specific job descriptions that require them to have traditional, stationary workstations. And then we could follow that up with lists of all the reasons why they need to be running Windows or Linux on said workstations, and couldn't choose comparable Apple hardware even if it were available.

But I don't think that we need to beat a dead horse like that. The more interesting one would be to figure out some interesting and non-trivially-sized cross-section of people who both need a workstation-class computer, and have the option of even considering using OS X for the purpose.


The main reason to buy Apple x86 machines, for any OS developer, was that Apple has to keep their number of hardware variants to a minimum, and you can run compatibility (and truly same-hardware performance) tests against any OS, since OS X was the only one locked to its hardware. The same might be true for ARM if there are adequate GPL drivers so as not to exclude Linux/Android, etc.


I'm not sure that's true. At least in my experience, Boot Camp seemed almost designed to cripple Windows by contrast to OS X.

The last time I used it (the last MBP with Ethernet built in; I want to say 2012 or 2013?), some of the features "missing" in Boot Camp were:

- No EFI booting. Instead we emulate a (very buggy!) BIOS

- No GPU switching. Only the hot and power-hungry AMD GPU is exposed and enabled

- Minimal power and cooling management. Building Gentoo in a VM got the system up to a recorded 117 degrees Celsius in Speccy!

- Hard disk in IDE mode only, not SATA! Unless you booted up OS X and ran some dd commands on the partition table to "trick" it into running as a SATA mode disk

The absolute, crushing cynic in me has always felt that this was a series of intentional steps. Both a "minimum viable engineering effort" and a subtle way to simply make Windows seem "worse" by showing it performing worse on a (forgive the pun) "Apples to Apples" configuration. After all, Macs are "just Intel PC's inside!" so if Windows runs worse, clearly that's a fault of bad software rather than subtly crippled hardware


I think we used rEFIt... I remember it would be a bit finicky, but I never really had to boot Windows since my product had no equivalent, and these days I don't boot OS X, though firmware updates would be nice.


What if Apple decided that they don't gain that much from AAA games, so they don't care to offer hardware that those companies might run on?

I have the feeling that Apple just cares about apps for iOS (money-wise). What's the minimum they need to do so people write iOS apps?

If this hardware, incidentally, is good for your use case, all is good. If not, they might just shrug it off and decide you're too niche (i.e. not adding much value to their ecosystem) and abandon you.


Yes, I think they view the basic mid-range tower box as a nearly-extinct form. Like corded telephones & CRTs.

They choose to make the mac pro as some kind of halo product, I guess. But really the slice of people who need more power than an iMac, and less than this "Linux tower full of GPUs" or a render farm, they judge to be very small indeed. This wasn't true in the 90s, when laptops (and super-slim desktops) came with much bigger compromises.


I don't think they think it's a small market; I think they think it's a commoditized market with very thin margins. That form-factor has a literal thousand+ integrators building for it, and also in many segments (e.g. gaming) people build their own to save even more money. Those aren't the sort of people who are easily swayed to pay an extra $200+ of pure margin in exchange for "integration" and Genius Bar "serviceability" (the latter of which they could mostly do themselves given the form-factor.)


I guess people into hot-rodding, especially for games, have never been Apple's target. (Even if they are numerous, and I actually have no idea how large this segment is.) Besides price-sensitivity, wouldn't they be bored if there were only 3 choices? Maybe we will find out when the M2-xs or whatever arrives.


> suitable for developers and power users

Important to note 'some' here. I'm a developer and power user, and haven't had a desktop computer in almost 10 years.


Me too. I last had a desktop, at work, about that long ago, and have not bought a desktop computer for myself in a lot longer. Laptops got very good and I can still plug it into a monitor and external controllers when I need to. I don’t need a server at home because of the cloud and broadband.


> Okay, but offering no machine suitable for developers

You mean... software developers? The same people who almost universally use a Mac?


There's a massive US-centric bubble when it comes to Apple. iPhones and MacBooks are not in the majority, let alone universal, among software developers as a whole; they're just in pockets.


The five dollar latte crowd is willing to pay and consume. Walk into any café and good luck finding non-Apple machines. (Occasionally there will be a Surface or two, especially if you live in Seattle).


They are the most visible, but that does not mean they are the most important part of the ecosystem. Plus, in 5 years they will be reconsidering their workplace setup due to back pain and/or carpal tunnel. And Apple asks an arm and a leg for all ergonomic accessories like external monitors and docks.


> Plus, in 5 years they will be reconsidering their workplace setup due to back pain and/or carpal tunnel.

This is where I'm at.

I don't know if other people are built from sturdier stuff than me or what, but typing on a laptop to any significant extent leaves me with tendonitis for several days. And staring at a laptop screen too long leaves me with neck pain.

Laptops are a nightmare in terms of ergonomics.

It's been a bit of a blessing for me because I only have a laptop at home, and it basically means I can't take work home with me.

But I'm pretty seriously considering upgrading to a traditional desktop sometime in the next year.


Laptops are my ergonomic savior. I make sure it's on my lap, and that my elbows are on softly padded armrests and hang down gently, and this has given me decades of work after fierce carpal tunnel inflammation.

I also use a Wacom tablet comfortably placed on a table to my right.


You just... buy from someone else? You don't have to buy an external monitor or a dock from Apple.


Sure - so now they're unsuitable for developers?


> The same people who almost universally use a Mac?

This has become steadily less true since about 2012, in my experience. I don't know any full time developers still using an Apple laptop. The keyboard situation caused a lot of attrition. I finally stopped support for all Apple hardware at my company months ago, simply to get it out of my headspace. Will Fusion360 again be completely broken by an Apple OS update? Am I going to have to invest time making our Qt and PyQt applications work, yet again, after an Apple update? Are Apple filesystem snapshots yet again going to prove totally defective? The answer is "no", because we really need to focus on filling customer orders, so we're done with Apple. ZFS snapshots function correctly. HP laptop keyboards work OK. Arch Linux and Windows 10 (with shutup10 and mass updates a few times per year) get the job done without getting in my face every god damned day.


> I don’t know any full time developers still using an Apple laptop.

Fascinating. I can name a few startups in my town that use Apple. One just IPO'd (Root), another is about to (Upstart). There are others as well.

The big companies it's hit or miss. Depends on if they are working on big enterprise applications or mobile/web. Mobile and web teams are all on MacBook Pros, and the big app dev teams aren't.

When I was last in Mountain View they were on Mac as well but I know that depends on personal preference.


>The same people who almost universally use a Mac?

* in very specific places and conditions.

Actual numbers from every single credible survey puts macs at a grand maximum of 25%.


Well, most corporations don't give developers a choice in what computer they use. I doubt that makes them unsuitable.


Macs keep dropping off our domain, so there's no real way to maintain their provisioned state.


I use a mac because other developers do in my office. But I'd be just as productive on a linux or windows machine.

For a while OS X had the edge because it had a nice interface while still offering a lot of Unix. Now Windows and Linux have caught up in the areas they were lacking before. Meanwhile Apple has been caring less and less about people using the CLI.


Quite possibly - I was a huge Apple fan who's now using a PC because I was fed up with the lack of viable options for me.


Apple will certainly offer an ARM-based MacPro, but I'm assuming it'll be a very different beast - current one maxes out at 1.5TB of RAM and it doesn't seem likely anyone will integrate that much memory on a chip anytime soon ;-)

Memory bandwidth is one key feature behind the M1's performance. When Apple builds an ARM-based Mac Pro, we can expect something with at the very least 5 DDR5 channels per socket. It's clear from this that the M1 is a laptop/AIO/compact-desktop chip.


The M1 already has 8 LPDDR4x channels per socket, running at 4266MHz.
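For what it's worth, those figures pin down the peak bandwidth arithmetically. A quick sketch, assuming the usual 16-bit LPDDR4x channel width (the channel count and data rate are taken from the comment above):

```python
# Peak DRAM bandwidth = bus width (in bytes) x transfer rate.
# Assumed config: 8 x 16-bit LPDDR4x channels at 4266 MT/s.
channels = 8
bits_per_channel = 16       # LPDDR4x channels are 16 bits wide
transfers_per_s = 4266e6    # 4266 MT/s data rate

bus_bytes = channels * bits_per_channel // 8   # 16-byte-wide bus total
peak_gbs = bus_bytes * transfers_per_s / 1e9
print(f"{peak_gbs:.1f} GB/s")  # ~68.3 GB/s peak
```

That lands in the same range as a dual-channel DDR4 desktop times two, not workstation territory, which fits the point that a Mac Pro-class part would need a different memory subsystem.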


My bad. I was looking at the specs. It's 300GBps, which is roughly 5x DDR5 IIRC.

So yes, at the very least 8 DDR4 channels, or one per core, but I'd expect more from a workstation-class board.

Now, speaking of the board, all those memory channels will be funny.


Funny? 8 channels is the standard AMD Epyc socket. Most threadrippers (AMD's workstation chip) are 4 channel, but there is a variant that's 8 channel.


I would expect more, so that cores don't get memory starved. The M1 has 4 fast cores and 4 slow ones. If we imagine an M2 with 8 fast cores, I would expect it to need 16 channels to have the same performance. That's a lot.


Dunno, the M1 CPU package is tiny, thin, power efficient, etc. It's got 4 memory chips inside the package. I don't see any particular reason why a slightly larger package could have 4 memory chips on one side, and 4 chips on the other to double the memory bandwidth and memory size.

However the M1 is already pretty large (16B transistors), upgrading to 8 fast cores is going to significantly increase that. Maybe they will just go to a dual CPU configuration which would double the cores, memory bandwidth, and total ram.

Or move the GPU and ML accelerator offchip.


I'm a developer and haven't had a desktop in 15 years or so. It's been a mix of Thinkpads (IBM then Lenovo) and MBPs.

I'm guessing very few developers need the extra power a desktop offers over a high-end laptop.


even a low end laptop. i work in clojure for finance. digital nomad. thinkpad x220.


I think the developers and power users that still use desktop machines/towers are either a very CPU-power-hungry niche exception, or the more backwards ones, and thus least likely to influence/be imitated by anyone...


I beg to differ (as a developer on a desktop). The reason for developing on a desktop is that my productivity is much higher with 3 screens, one of which is a 40-inch, a full 101-key keyboard, and a mouse.


> The reason for developing on a desktop is that my productivity is much higher with 3 screens

Those requirements don’t dictate a desktop[0]. Also, the physical size of the monitor is irrelevant, it’s the resolution that matters. Your video card doesn’t care if you have a 40” 4K monitor or an 80” 4K monitor, to it, it’s the same load.

The reason I still have a cheese grater Mac Pro desktop at all is because I have 128gb RAM in it and have tasks that need that much memory.

[0] I’ve connected eight external monitors to my 16” MBP (with laptop screen still enabled, so 9 screens total). I don’t use the setup actively, did it as a test, but it very much works. The setup was as follows:

TB#1 - 27” LG 5K @ 5120x2880

TB#2 - TB3<->TB2 adapter, then two 27” Apple Thunderbolt Displays @ 2560x1440

TB#3 - eGPU with AMD RX580, then two 34” ultrawides connected over HDMI @ 3440x1440, two 27” DisplayPort monitors @ 2560x1440

TB#4 - TB3<->TB2 adapter, then 27” Apple Thunderbolt Display @ 2560x1440

So that’s almost 50 million pixels displayed on around 4,000 square inches of screens driven by a single MBP laptop.
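The "almost 50 million" figure checks out if you tally the panels listed; a quick sketch (the 16" MBP's internal panel is assumed at its native 3072x1920, which the comment doesn't state):

```python
# Pixel tally for the 9-screen setup described above.
screens = [
    (5120, 2880),                # TB#1: LG 5K
    (2560, 1440), (2560, 1440),  # TB#2: two Apple Thunderbolt Displays
    (3440, 1440), (3440, 1440),  # TB#3: two 34" ultrawides over HDMI
    (2560, 1440), (2560, 1440),  # TB#3: two 27" DisplayPort monitors
    (2560, 1440),                # TB#4: Apple Thunderbolt Display
    (3072, 1920),                # internal panel (assumed native res)
]
total = sum(w * h for w, h in screens)
print(f"{total:,} pixels")  # 48,983,040 -> "almost 50 million"
```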


How was it to move (or find) the cursor?

(I kid, I kid)


You kid, but it legit was an issue. I’ve used at least 3 monitors (if not 1-2 more) for over a decade now, so I’ve experience there, but going up to 9 even for a short while, it was definitely an issue.


Yeah I’m with you. Laptops are great, but they sacrifice a lot for the form factor. Remove the constraint of needing an integrated screen, keyboard, touch pad and battery, and you can do much more. Sure you can dock it, but docked accessories are always second class citizens relative to the integrated stuff.


All of which are available on modern laptops


Laptop user here; I also have 3 screens. I do use the MBP's keyboard, but never felt like that cost me productivity. I use a normal mouse as well. The only reason I can think of to need a desktop is the extra CPU/GPU capacity you can get.


> The only reason I can think of to need a desktop is the extra CPU/GPU capacity you can get.

Or RAM


Or internal peripherals. If I want 20TB of storage, and I don't want external chassis all over the place, I need a desktop with at least a couple of 3.5" bays.


You mean you don't like paying $500 for 8GB of soldered RAM?


Nope, not what I'm saying at all (in part because your comment is hyperbolic and untrue). Some folks need more than 64GB of RAM, which is the highest amount most laptops support.


He's not that far off. Apple asks $200 for 8 GB, so he is in the same order of magnitude. For comparison, I bought 16 GB DDR4 ECC (unregistered) sticks this week for 67 EUR apiece (before VAT).


Great, so you bought a different type of RAM in a completely different form factor and paid a different price. This is on “processor package” RAM and will thus have an entirely different price basis than a removable stick would, not even factoring in the Apple Tax.

Furthermore, how is that relevant to the point _I_ was making about needing more than 64GB of RAM? If you both want to go off on a tangent, fine, do so, but don't try to put words in my mouth while doing it.


> Great, so you bought a different type of RAM in a completely different form factor and paid a different price.

It's called "using an example", or an "illustrative example". For comparison, I used a type of RAM that is traditionally much more expensive than what you find in laptops.

> This is on “processor package” RAM and will thus have an entirely different price basis than a removable stick would,

No.

1) The same price is being asked for RAM in non-M1 models.

2) You could put any price tag you want on it, because the item is single-sourced; the vendor can pull a quote out of thin air and you cannot find an exact equivalent on the market. Therefore, for comparison, a functionally and parametrically similar item is being used.

> how is that relevant to the point _I_ was making about needing more than 64gb of RAM?

You get a different product, one that supports more RAM.

> If you both want to tangent, fine do so, but don’t try to put words in my mouth while doing it.

Could you point out where I did that? I was pointing out that your note about the GP being hyperbolic is untrue: he was in the ballpark.


> I was pointing out that your note about the GP being hyperbolic is untrue: he was in the ballpark.

The $500 figure is essentially as in the ballpark as $80 is: both are off by 2.5x. Claiming they are "same order of magnitude, so it's not hyperbolic" is laughable. $100k and $250k are both the same order of magnitude, but are radically different prices, no?
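For the record, the ratios being argued over work out like this, using only the prices quoted in the thread (and taking the 67 EUR stick as roughly $80, as implied above):

```python
# The "both are off by 2.5x" claim, per stick, from prices quoted upthread.
hyperbolic = 500   # the "$500 for 8GB" figure
apple      = 200   # Apple's asking price for an 8 GB upgrade
retail     = 80    # ~67 EUR for a 16 GB ECC stick, taken as ~$80

print(hyperbolic / apple)  # 2.5: the hyperbole overshoots Apple's price 2.5x
print(apple / retail)      # 2.5: Apple's price is 2.5x the retail stick
```

Per gigabyte the gap is larger still, since the retail stick is 16 GB against Apple's 8 GB.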


at work, when at the office, they are always pushing screens on us. i keep thinking it's some pork deal with dell. my whole team either plugs a laptop into one screen, or just works straight on the laptop. maybe we're not cool.


A quick Google will turn up several serious usability studies that show more screen real estate == higher productivity. It depends a lot on the type of work, of course, but for development a larger screen would mean less scrolling and tab switching => less context switching => so your brain gets more done.


It's probably the ergonomics police.


Or... they're old and can't see the tiny laptop screen, or get back pain when using a laptop all hunched over. To be honest, I don't know how anyone does serious work on them.


You can connect a laptop to 2, 3, or 4 external screens. Which many do. You don't need a tower for that.


Apple itself is selling its new chips as making faster devices. If only a niche wanted that speed, Apple probably wouldn't be pushing it so hard as part of the pitch.


That would be if everything else was equal. Everything else is NOT equal. People also want portability, small size, battery life, etc.

If more than a niche had speed as its sole priority, then they would already use desktops, but most (80%+) use laptops today.

But of the majority that uses laptops, most would like a faster machine. Just would prefer it was also a laptop.


Considering how much the gaming side of the PC market will drop on just a single card, I am more of the opinion that Apple chose to avoid this market because it did not want the association with gaming, as if that were beneath their machines.

At times there seemed to be a real disdain for the people who loved upgrading their machines, as well as those who gamed on them. Apple's products were not meant to be improved by anyone other than Apple, and you don't sully them with games. The Mac Pro seems to be the ultimate expression of "you are not worthy", from the base system, which was priced beyond reason, to the monitor and stand. It was the declaration of "fine, if you want to play, then it will cost you", because they didn't really care about enthusiasts of the wrong sort: games and such.


Why do they highlight games at every WWDC and product announcement?


To let the fans have a bathroom break.


Being “fickle” is kind of hard to apply to a market segment, because it's not a synchronized monolith. There clearly is demand for the kind of decent machines hackintoshes make possible. It's just that the mobile market is much higher ROI. So any R&D other Apple products get comes from coincidental opportunities. This entire M1 change is a happy accident.

So it's not that hackintosh builders are anything in particular; it's that they're outnumbered by iPhone buyers a million to one.


They certainly wouldn't want to scare away developers; others have suffered greatly from neglecting them, and developers seem to be getting rarer. You always want the enthusiasts, and of course they buy new hardware and look for ways to make it work for them. Many devs also have a high income, so price isn't as important anymore.


A big part of saving Apple was Jobs killing the clone program. That lesson probably still resonates in the halls of Apple even if allowing hackintoshes is a different thing without the same risks.


I would be extraordinarily grateful for some insight into why this comment was downvoted.


I think the parent comment is just completely ignoring the argument of the post they reply to.

Just looking at the first sentences:

GP: > I have zero issues with an Apple premium or paying a lot for hardware.

parent: > the hackintosh/enthusiast market [...] are the most price conscious segment


Then buy the new Mac Pro? I don't understand why that's not an option for GP.


Because as the sibling comments point out, the price of a Mac Pro isn't just an "Apple Tax^WPremium" over a desktop machine but is an order of magnitude more expensive (assuming you don't care about workstation-class components, i.e. Xeon Ws, Radeon Pro GPUs and ECC RAM).

There's an enormous price gap between a Mac Mini and the Mac Pro (especially when the Mini now has higher single-threaded performance than the base Pro...) which Apple has widened in the last decade or two.


I've had a continuous string of Mac Pros from the G3 up to the 2012 (MacPro 5,1), my main workhorse. I have continually updated and expanded it.

The 2013 mac pro was a mess. pass.

The latest mac pro... I think it wasn't just expensive, it was sort of sucker expensive.


> The 2013 mac pro was a mess.

I appreciate that the 2013 Mac Pro wasn't for you, but it was perfect for me: small but powerful. Firstly, RAM: I was able to install 64 GiB, which enabled me to run Cloud Foundry on ESXi on Virtual Workstation on macOS. Non-Xeon chipsets maxed out at (IIRC) 16 GiB and then later 32 GiB—not enough.

Secondly, size & esthetics: it fits on my very small console table that I use as a desk. I have a modest apartment in San Francisco, and my living room is my office, and although I had a mini-tower in my living room, I didn't like the looks.

Third, expandability: I was able to upgrade the RAM to 64 GiB, the SSD to 1 TB. I was able to upgrade the monitor to 4k. It has 6 Thunderbolt connections.

My biggest surprise was how long it has lasted: I typically rollover my laptops every year or so, but this desktop? It's been able to do everything I've needed it to do for the last 7 years, so I continue to use it.

[edited for grammar]


While the form factor was cool, how pissed would you have been if it broke and you were buying the exact same machine, for the same price (give or take), in 2018?

Part of the "mess", I'd argue, was that Apple backed themselves into a thermal corner where they couldn't update the machine but also wouldn't cut its price so it got steadily worse value as time wore on.


> but also wouldn't cut its price so it got steadily worse value as time wore on

This has long been an issue for Apple products. It's why the best time to buy an Apple product is right after an update.


You're not wrong but the Mac Pro went a particularly long time between updates.


Oh, definitely. Look at the Apple TVs for another example. In both cases, if Apple would drop the price, even just yearly, they would sell so many more units.


But my workhorse has had so many upgrades. Lots of storage in and out. I have a bunch of drive sleds. I updated the graphics card more than once. Presently it has 2x6 core, 5 ssds (one in a pcie slot), a 10tb hard disk, a pcie usb3 card, and a gtx980.


I just got a new Mac Pro. The only real upgrade I did from Apple was to the 12 core Xeon. Other than that I kept the base 32GB memory, though I did get a 1TB SSD from the 256GB base offering.

... then I went to NewEgg and got 192GB of memory for $800ish, rather than Apple's exorbitant $3,000. And seriously, why? Same manufacturer, same specs. And convenience factor? It took a good 45 seconds to install the memory, and I'd wager anyone could do it (it's on the 'underside' of the motherboard, all by itself, and has a little chart on the memory cover to tell you exactly what slots to use based on how many modules you have).

And then I bought a 4x M.2 PCIe card and populated it with 2TB SSDs (that exceed the Apple, with sustained R/W of 4500MB/s according to Blackmagic) for just around $1,100, versus the $2,000 Apple wanted. Only downside is that it cannot be the boot drive (or maybe it can, but it can't be the _only_ drive).


> The latest mac pro... I think it wasn't just expensive, it was sort of sucker expensive

It's the kind of Mac that makes you get an iMac to put on your desk and a beefy Linux server-grade box you hide somewhere, but that does all your heavy lifting.


Yep, thats what I did.


Some tools and OSs make it easier than others. I used to do a lot of work from my IBM 43P AIX workstation (great graphics card, huge monitor, model M keyboard) that actually ran on a more mundane Xeon downstairs. X made it even practical to browse the web on the 43P. It attracted some really confused looks in the office.


Exactly this. The closest to this would be an i7 iMac, but not everyone wants an AIO PC. It’s kind of a bummer. We finally have an iPhone for everyone, even a high-end small form factor option. Whoever is responsible for that decision, please take a look at the Mac lineup next.


There's even precedent for it: the iMac/iMac Pro. The Pro model has workstation-class hardware in it while the non-Pro does not.

Ideally the enhanced cooling from the Pro models would trickle down to the non-Pro. By all reports the (i)Mac Pro is virtually silent but in the low-power ARM world a desktop machine that size could almost be passively cooled, even under load.


Give them time maybe?

I bet Apple would love to release an all-in-one iMac Pro powered by an iteration on the M1. They could put a Dolby Vision 8k display in it and drag race against Threadripper machines for UHD video workloads.


Part of me laughs that an ARM chip could compete with a Threadripper; the other part of me seriously thinks it could happen.


I mean, the iMac Pro came out in 2017 and there isn't much sign of anything trickling down to the standard iMac. Rumour is that the ARM Mac Pro will be significantly smaller than the Intel one - it'll be interesting to see how (or if) they support discrete GPUs.


I don't totally agree with GP, but I think their overall point was that for a long time (all of the 2010s?) there was just no decent Mac Pro.

Outside of the Mac mini, the most powerful desktop machines were actually iMacs, with all the compromises that come with the form factor, and the trashcan Mac Pro, which was thermally constrained.

In that period, no amount of money would get you peak storage + network + graphics performance, for instance.

We are now in a slightly better place where as you point out, throwing insane amounts of money towards Apple solves most of these issues. Except for those who don't want a T2 chip, or need an open bootloader.


Agreed. Do not see anything worth downvoting at all.


Agree completely.

I don’t know that the “Apple tax” moniker is really fair anymore, either.

The machines have always commanded a premium for things that enthusiasts don’t see value in (I.e. anything beyond numeric spec sheet values), so most critics completely miss the point of them.

There’s a valid argument to be made that they’re also marked up to higher margins than rivals even beyond the above, but I’m not sure any end user has really ever eaten that cost. If you buy a MacBook, there has always been someone (students) to buy it back 3/5/10 years down the road for a significant chunk of its original outlay. That doesn’t happen with any other laptop; they’re essentially scrap (or worth next to nothing) within 5 years. After 10 years I might actually expect the value to hold static or even increase for its collector value (e.g. clamshell iBook G3s).

The total cost of ownership for Apple products is actually lower over three years than any rival products I’m aware of.


> The machines have always commanded a premium for things that enthusiasts don’t see value in (I.e. anything beyond numeric spec sheet values), so most critics completely miss the point of them.

It's not just intangibles. I really like using Macs, but my latest computer is a Dell XPS 17. This is not a cheap computer if you get the 4k screen, 64GB of RAM and the good graphics card. At those prices, you should consider the MBP16. The MBP is better built, has a better finish and just feels nicer.

Thing is, Dell will sell me an XPS 17 with a shitty screen because I don't care about the difference and would rather optimise battery life. I can get 3rd party RAM and SSDs. I can get a lesser graphics card because I don't need that either. I can get a more recent Intel CPU. And I can get the lesser model with a greater than 25% discount (they wouldn't sell me the better models with a discount though).

I think some of the Apple Tax is them not being willing to sell you a machine closer to your needs, not allowing some user-replaceable parts, and not offering discounts.


It works both ways: if you can get something in Apple hardware, you will get the nice version of it. If you can't get something there, you will have to do without.

Example: I've been looking at X1 Nano. It is improvement compared to other lines (it has 16:10 display finally!), but it is still somewhere in the middle of the road.

The competitor from Apple has a slightly better display, much better wifi, and no option for LTE/5G.

The Nano has a 2160x1350, 450-nit display with Dolby Vision. Apple has a 2560x1600, 400-nit (Air)/500-nit (MBP) display with P3. The slightly higher resolution means that Apple would display 9 logical pixels using 8 physical when using the 1440x900@2X resolution (177% scale), but to get a similar scale on the Nano would mean displaying 8 logical pixels using 6 physical (150% scale). Similarly, Dolby Vision is an unknown (how would it get used?), while the P3 from Apple is a known.
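
The scaling arithmetic above can be sketched out. This is a rough model of macOS-style HiDPI scaling (render at twice the logical resolution, then downsample to the panel), using only the widths from the comment; the function name is just for illustration:

```python
from fractions import Fraction

def backing_to_panel(panel_w, logical_w, render_factor=2):
    """Ratio of rendered (backing-store) pixels to physical panel pixels
    along one axis: render at render_factor * logical, then downsample."""
    return Fraction(logical_w * render_factor, panel_w)

# MacBook Pro: 2560-wide panel, "looks like 1440" -> 2880 rendered / 2560 physical
print(backing_to_panel(2560, 1440))  # 9/8, i.e. 9 rendered pixels per 8 physical

# X1 Nano: 2160-wide panel at the same logical 1440 -> 2880 rendered / 2160 physical
print(backing_to_panel(2160, 1440))  # 4/3, i.e. 8 rendered pixels per 6 physical
```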

The X1 Nano has 2x2 MIMO wifi - Intel AX200 - with no option for anything better. There are only two antennas in the display frame; you cannot add more (OK, three, but the third is for cellular and cannot be used for wifi if you forego cellular). Apple ships with 4x4 MIMO. If you have a decent AP at the office or at home, it is a huge difference, yet no PC vendors are willing to improve here.

The cellular situation is the exact opposite: you can get a cellular module for ThinkPads, and you cannot for Apple at all, so if you go this route, you have to live with workarounds.


Yes and no. To be honest I did the same back-of-the-napkin math that you did prior to buying my MBP - the thing is the TCO is even worse if you customise the machine.

Example - a Mac is a Mac for resale purposes. If I attempt to later sell an XPS that I've opened up and put an SSD and a couple of SODIMMs in, I now need to recoup my cost on all of those things. The problem is that someone looking at a used XPS with an upgraded SSD and upgraded RAM is statistically unlikely to fully investigate and value the (probably really good) parts you upgraded it with; they're just going to see X, Y, Z numbers and price accordingly.

Generally though, a 5 year old Windows laptop with 16GB RAM still commands the value of a 5 year old Windows laptop as best I could tell looking at resale values.


I wasn’t trying to address the resale value, only the tax part. The perception of the tax comes from Apple simply not offering compromised parts for a particular set of parameters - and from other manufacturers being willing to sell at large discounts regularly!


Can’t argue with this. They gouge absurdly on storage and memory.


> I don’t know that the “Apple tax” moniker is really fair anymore, either.

I think it's still accurate and honestly that's apple's business model.

I think the resale value for a student macbook doesn't really matter. It still costs the student - while they are poor - as much as 4x what other students pay for their laptop. Many students are paying $250 for their laptop.


I threw out a laptop from 2008 that was top-of-the-line at the time: $3,000. I bought it in the US while on vacation, when the dollar was at such a low that I got the device for the equivalent of something like €1,200.

I couldn't sell that device for half of that a year and a half later. I got a newer laptop in 2016, again very specced out for a laptop, about €1,800; I couldn't sell it for €800 two years later. I still use that last one because I didn't want to sell it so far under what the market value should be.

If you try to sell anything Apple related that isn't more than 5 years old you won't have that problem at all. You can get a good value for the device and sell it without too much of a hassle.

Even if you're a student you would likely be better off buying the cheapest macbook you can find (refurbished or second hand if needed). If you don't like the OS you can just install Windows or a Linux distro on it.

Are you sure about that 250$ number? Because I don't think that's a very realistic number.


> Are you sure about that 250$ number? Because I don't think that's a very realistic number.

For note taking, word processing, basic image editing, web browsing, video playing, etc, you can easily get a capable enough laptop for that price.

This is not comparing like-for-like in terms of what the machines can do, of course. Apple's range doesn't even remotely try to cover that part of the market so a direct comparison is unfair if you are considering absolute price/capability of the devices irrespective of the user's requirements, but for work that doesn't involve significant computation that bargain-basement unit may adequately do everything many people need it to do (assuming they don't plan to also use it for modern gaming in non-working hours).

> If you try to sell anything Apple related that isn't more than 5 years old you won't have that problem at all.

Most people don't consider the resale value of a machine when they buy it. For that to be a fair comparison you have to factor in the chance of it being in good condition after a couple of year's use (this will vary a lot from person to person) and the cost of any upgrades & repairs needed in that time (again more expensive for Apple products by my understanding).

And if you buy a $500 laptop and hand it down or bin it, then you are still better off (assuming you don't need a powerful machine) than if you dropped $3,000 for an iDevice and later sold it for $2,000.

> what the market value should be.

"Market value" is decided by what the market will bare, not what we want to be able to sell things for, and new & second hand are often very different markets.


> Are you sure about that 250$ number? Because I don't think that's a very realistic number.

I'm not a student but it's pretty close I think.

I invested $350 into a Chromebook that runs native Linux[0] about 4 years ago and it's still going strong as a secondary machine I use when I'm away from my main workstation.

It has a 13" 1080p IPS display, 4gb of memory, an SSD, a good keyboard and weighs 2.9 pounds. It's nothing to write home about but it's quite speedy to do every day tasks and it's even ok for programming where I'm running decently sized Flask, Rails and Phoenix apps on it through Docker.

If I had to use it as my primary development machine for web dev I wouldn't be too disappointed. It only starts falling apart if you need to do anything memory intensive like run some containers while also running VMs, but you could always spend a little more and get 8gb of memory to fix that problem.

I'm sure nowadays (almost 5 years later) you could get better specs for the same price.

[0]: https://nickjanetakis.com/blog/transform-a-toshiba-chromeboo...


Right, except that when you stop using the Chromebook every day or move on to something better, will it have residual value or just go to landfill?

I love Chromebooks, don't get me wrong, but the problem I've come to realise over time is that many are specced and priced just about at a point where they'll quickly move into obsolescence not long after purchase - at which point the only thing keeping them out of the ground is your willingness to tolerate them after the updates have stopped.

The Mac will still be worth a good chunk of money to someone.

I have a Chromebook Flip here that I adored for several years that I couldn't give away now.


Answering your question accurately depends on how long it ends up lasting.

For example if it works well enough for another 4 years, now we need to ask the question on whether or not you could get reasonable value out of an 8+ year old Mac. I never sold one so I'm not sure. My gut tells me it's going to be worth way less than what you bought it for even if it's in good condition.

But more generally, yeah I have no intentions on re-selling this thing if I decide I'm done with it before it physically breaks. I'd probably donate it or give it away for free (if someone wanted it).

I don't see that as too bad tho. If I can get 7-8 years out of $350 device I'm pretty happy, especially if the next one costs about the same.

It's a tough comparison tho because a decently decked out MBP is going to be like 8x as expensive but also have way better specs.


I think it is realistic. It's easy to think people have money to spend on a $999 laptop when you are living in a first-world country. 90% of the world probably couldn't afford that.


> Are you sure about that 250$ number? Because I don't think that's a very realistic number.

I think it's fairly realistic. The Dell Latitude 7250 is probably a good representative of what you can get used for ~$220-$300 US these days: https://www.ebay.com/sch/i.html?_from=R40&_trksid=p2380057.m... The dual-core processor should still be serviceable for everyday work, at ~1.3kg it's light enough to carry around all day, a 1080p resolution should be OK on a 12" screen, and it can take up to 16GiB of RAM, though holding out for one with 16GiB preinstalled will definitely tend to push the cost up to nearer $300: https://www.ebay.com/sch/i.html?_from=R40&_trksid=p2380057.m...

(Then any laptop with similar specs except with a 2-in-1 form factor tends to cost a fair bit more, but that's not a must-have for most students or anyone who might have been considering a MacBook.)


> Are you sure about that 250$ number? Because I don't think that's a very realistic number.

I got an i7 T430s for 200€ a few years ago and it's still plenty fast for coding, so I don't see why this number wouldn't be realistic.


"I think the resale value for a student macbook doesn't really matter"

It's literally the only thing that matters if you're the seller.

If you have your choice of two items to sell 5 years from now, you ideally want to be selling the item that's worth substantially more to the buyer, rather than trying to sell something worthless.

Assuming there's nothing dishonest happening, it's really up to the market to price.

Thing is, the student buying the MacBook is probably going to be substantially better off that way too, in that it will likely retain proportionally more of its value from that point too.


> I don’t know that the “Apple tax” moniker is really fair anymore, either.

Apple on the new Mac Pro that I got a month ago: 192GB memory? That will be $3,000. NewEgg? We'll sell you the same specced memory from the same manufacturer for $800. And you get to keep/sell the baseline 32GB memory.

8TB SSD? $2,000, thanks. OWC and NewEgg? Here, have a PCIe 4xM.2 card and 4 2TB SSDs for $1,100. Oh, and they'll be 50% faster, you just can't have them as the only drive on the system (my Apple SSD runs at around 2800MB/s, the alternative, 4500MB/s).

So they are entirely marked up, and look in any forum - by far most people are not doing what I'm doing, and just "going straight Apple for convenience", though the memory installation was less than 1 minute, and the SSD installation less than 5, including unboxing, seating the 4 drives, reinstalling the heatsink on the card and installing. I get "my time is money", and "it just works" (which, as we know, more and more is less the case with Apple), but really, for me, that was a $3,100 savings for <10 minutes effort.
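
A quick sanity check of the arithmetic above, using only the prices quoted in this comment:

```python
# Prices as quoted above (USD)
apple_ram, newegg_ram = 3000, 800    # 192GB of memory
apple_ssd, diy_ssd = 2000, 1100      # 8TB of NVMe storage

savings = (apple_ram - newegg_ram) + (apple_ssd - diy_ssd)
print(savings)  # 3100
```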


In terms of high performance products, I’m actually really excited for the next Mac Pro. They’ve got novel design options open to them that no rival has.

The M1 costs Apple relatively little to produce per unit. I would expect them to keep the overall design for a Mac Pro but have stacked modules, such that the side wall of the Mac Pro is a grid of 4 or more such modules, each with co-located memory like the M1 has. Obviously performance would depend on the application being amenable to a design like that, but a 32- or 64-core 5nm Mac Pro is not out of the question, and it would be impossible for any Hackintosh to match its performance in the next few years.

Even after capacity frees up at TSMC for AMD to move to 5nm, they won’t be able to co-locate memory like the M1 does due to standards compliance with DRAM sticks.

I think the next couple of years will be really turbulent for other vendors - the M1 is likely far more significant for the PC market due to how disruptive it is than it is for the Mac market.


It might force Intel / AMD / Broadcom to get serious about hardware and at least integrate more of the components for notebooks. Maybe not go full on OEM, but a lot more than the CPU, because M1 is probably fundamentally winning with SoC design.

I would like to know if they are using fundamentally better batteries, and how much a 5nm process lead is behind this.

But I will hand it to Apple if they finally did something to break the 4-8 hour battery life limit, a limit that always seemed to stay the same despite node shrink after node shrink, with about the same on-screen performance for usual browsing/productivity use.

I was pretty distrustful of the ARM move, but if they deliver this for the Macbook Pro, I'll hop to ARM.

Associated with the CPU people "getting serious" is them pushing an OS, which would have to be Linux. Intel should have done this 20 years ago, at least as leverage to make Windows improve itself.


AMD would be able to do DRAM on package for the lowest wattage "ultrabook" chips, at the cost of producing a very different package for them vs. the bigger laptops that are expected to have upgradable SODIMMs. But I doubt that this "co-location" is that huge for performance. Whatever memory frequency and timings Apple is using are likely easily achievable through the regular mainboard PCB, maybe at the cost of slightly more voltage. DDR4 on desktop is overclockable to crazy levels and that's going through lots of things (CPU package - socket pins - board - slots - DIMMs).

> stacked modules such that the side wall of the Mac Pro is a grid of 4 or more such modules each with co-located memory like the M1 has

Quad or more package NUMA topology?? The latency would absolutely suck.


Why would latency suck? 64 cores are already only beneficial for algorithms that are parallelizable, and the most common class of parallelizable algorithm is data-parallel. So shouldn't the hardware and OS be able to present the programmer with the illusion of uniform memory, and just automatically arrange for processing to happen on the compute resources closest to the RAM, or move the memory closer to the appropriate compute resource as required?


Yeah, I'm no kernel developer, but I've been replying to anyone saying 'just stick n * M1 in it' that even AMD has been trying to move back to more predictable memory access latency, less NUMA woes.


But in general we're moving toward even less uniform memory, with some of it living on a GPU. NUMA pretended that all memory was the same latency, because C continues to pretend we're on a faster PDP-11, but this seems like a step in the wrong direction as for how high-performance computation is progressing.


> I have zero issues with an Apple premium or paying a lot for hardware.

Especially if the margins allow them to not engage in silliness on the software side of things like violating privacy and serving ads in the OS.

There are certainly places that Apple can be criticized, but I think in these two areas they're acting pretty well.


What I don't understand: Windows 10 Pro comes with tons of pre-installed junk - Candy Crush etc. Just charge what you need to; businesses don't want this stuff.


Well, large businesses at least get Windows 10 Enterprise, which doesn't come with all of that nonsense. The real shame is that you can't get Windows 10 Enterprise without a volume license.


You can get Win 10 Pro. I don’t remember my system having any of that junk preinstalled


It installed itself on any Win 10 machine I saw (most of which run Pro), except those running Enterprise (or LTSC/LTSB), or an education license.

Er, I heard that junk doesn’t install itself on “Pro for Workstations,” but I’m not certain and even then that’s another hundred dollars more expensive than Pro.

And even Enterprise still comes with a lot of junk that, like, 90% of users won’t need. Windows AR? Paint 3D? And so on… half the things in the start menu of a stock Windows 10 Pro install are either crap like Candy Crush, fluff like 3D viewer, or niche like the Windows AR thing.

The worst part about this is that there’s definitely a middle ground between not including anything and pushing crap on people — both nearly every Linux distro I have ever seen, as well as Apple nail that balance, and to be frank with the App Store or Microsoft Store and such I really don’t see the need to include hardly anything.


You do get obtrusive telemetry (Cortana is a good example), but you avoid Candy Crush et al.

You can disable some of the telemetry during install as well.


Just run any of the reclaim series. Pure gold if you ask me.

https://gist.github.com/alirobe/7f3b34ad89a159e6daa1



You're correct that they aren't preinstalled, but once you connect to the internet they will be downloaded and installed automatically.


You can get a 90 day demo of Windows 10 Enterprise from Microsoft: https://www.microsoft.com/en-us/evalcenter/evaluate-windows-...

Install it in a virtual machine, every 90 days make a new virtual machine from scratch. That or use the secret code to reset the demo days. Enter an Enterprise key when you want to register it for real.


Or just run Linux.


you are kidding, right?


There's 'a lot' (2-3k) and there is 'silly, can never justify unless I'm in some super niche segment in Audio/Video production' (10-20k).


> it won't help in areas that Apple just organizationally doesn't care about/doesn't have the bandwidth for because that's not a technology problem

I would posit that Apple is always going to keep macOS working on some workstation-class hardware, just because that kind of machine is what Apple's software engineers will be using internally, and they need to write macOS software using macOS.

Which means one of two things:

1. If they never release a workstation-class Apple Silicon chip, that'll likely mean that they're still using Intel/AMD chips internally, and so macOS will likely continue to be compiled for Intel indefinitely.

2. If they do design workstation-class Apple Silicon chips for internal use, they may as well also sell the resulting workstation-class machines to people at that point. (Or, to rearrange that statement: they wouldn't make the chips if they didn't intend to commercialize them. Designing and fabbing chips costs too much money!)

Which is to say, whether it be a Hackintosh or an Apple Mac Pro, there's always going to be something to cater to workstation-class users of Apple products — because Apple itself is full of workstation-class users of Apple products.


> I would posit that Apple is always going to keep macOS working on some workstation-class hardware, just because that kind of machine is what Apple's software engineers will be using internally, and they need to write macOS software using macOS.

I hope I'm not out of line here, but this is not what a "workstation" is. "Workstation" actually has a specific meaning in the realm of enterprise computing solutions, and developers do not (generally) use workstations.

A workstation is something that, say, the people at Pixar use, or Industrial Light and Magic. It's an incredibly powerful machine that can handle the most intensive of tasks. Software development is generally not such a task, unless you're frequently re-compiling LLVM from source or something. (And even then, it's a world of difference.)

Apple's software developers, like most software developers who use Apple machines, use MacBook Pros (for the most part). Sometimes Mac Minis if they need multiple test machines, and I'm sure there are some who also have Mac Pros. But overwhelmingly, development is done on laptops that they dock while at work and take home with them after. (This was my experience when I interned there, anyway.)


Apple develops not just macOS, but also application software like Logic and FCPX. The engineers writing that code need to test it on full-scale projects (probably projects on loan to them from companies like Pixar.)

But moreover, changes to foundational macOS libraries can cause regressions in the performance of this type of software, and so macOS developers working on systems like Quartz, hardware developers working on the Neural Engine, etc., also work with these apps and their datasets as regression-test harnesses.

See also: the Microsoft devs who work on DirectX.

All of this testing requires "workstation" hardware. (Or servers, but Apple definitely isn't making server hardware that can run macOS at this point. IIRC, they're instead keeping macOS BSD-ish enough to be able to write software that can be developed on macOS and then deployed on NetBSD.)


Are Pixar and ILM really using workstations as you describe them, or render farms?


"I would posit that Apple is always going to keep macOS working on some workstation-class hardware, just because that kind of machine is what Apple's software engineers will be using internally, and they need to write macOS software using macOS."

I have always hoped that we could rely on that heuristic - that internal Apple usage of their own products would guarantee that certain workflows would be unbroken.

In practice, this has never held up.

Over the past 10-12 years it has been reinforced over and over and over: Apple engineers use single monitor systems with scattered, overlapping windows which they interact with using mousey-mousey-everything and never keyboard shortcuts.

They perform backups of critical files - and manage financial identities - using their mp3 player.

The fact that multiple monitors - and monitor handoff - is broken in fascinating new ways with every version of OSX tells you how Apple folks are (and are not) using their own products.


It sounds like you have an issue with the lack of window snapping keyboard shortcuts that are in Windows 10, as well as iPhone backups happening in iTunes until they moved to finder, and iCloud being connected to iTunes although it's managed in SysPrefs. And you have seen some regressions along with the successive improvements to display management. Is that fair?

If so, what is the connection to professional workflows on macOS?


Yeah, I think it's pretty clear, based on all the bugs with multi-monitor support down to their Mac minis, everyone at Apple must be running iMac Pros, and maybe they're using Sidecar to make their iPad into an extra screen.


If you don't have issues with paying a lot for hardware, why don't you buy a Mac Pro?


A machine that won't outperform the new Mini: $6,000.

16 cores, 32GB of RAM, the least powerful GPU on the list, 2TB of storage: $8,799.

28 cores, 48GB of RAM, the same GPU, 4TB of storage: $14,699.

MSRP on a 2020 Toyota Corolla: $19,600.

AMD Ryzen Threadripper 3970X, 32-core, 3.7 GHz, Socket sTRX4: $2,629.90.

Cost of the same basic GPU: about $219.

Cost of a complete system equivalent to the almost-car-priced Mac: about $4,200.

$9k-$15k isn't "a lot", it's a crazy amount. $15k is a quarter of the median household's income.

Most of planet Earth can't sink $6,000 into a computer, let alone $15,000. Under Apple, the standard expandable board-in-a-box with room to grow is a category available to 1% of the US and 0.1% of the world.


Exactly this. I was fine with the Mac Pro until 2012: it was a little more expensive than PCs, but not outrageously so (maybe 30% more, a tax I was fine paying given the OS and that it was quite a well-built machine).

The new Mac Pro is 3-4x the price of a machine built around AMD with equivalent performance. I'm building a Threadripper for exactly this reason. Most of the issue is Intel vs AMD: AMD's Threadrippers are an amazing deal in performance per dollar, and Apple has an aversion to offering decent GPUs.


Yes, it is not a commodity machine, but it sort of goes beyond expensive to almost insulting.

If it was 1/2 the price I think it could fairly be called premium priced.


There is also the iMac and iMac Pro range. The Mac Pro is clearly demarcated as a "money is no object" product.


I'd be much more interested in the iMac if it wasn't built inside a monitor.


The M1 was released on the MacBook Air, MacBook Pro, and Mac Mini.

Buying an iMac now would seem to be a poor decision.

From what I'm seeing in some of the comments, people are so lost in the past years of Apple being the performance laggard that they can't get their heads out of their arses to see how awesome this is.

I am sitting here right now wondering if I should invest more of my savings directly in Apple stock, at least temporarily to ride their sales wave, or if I should buy a Mac mini and a nice wide curved monitor with a mechanical keyboard from WASD and be f'ing awesome all of a sudden.

The only reason I'm not hitting the buy button is that all of that isn't $135. There's no particular reason for that amount, but if it said $135, I'd have already paid and been drinking beer to celebrate one of the happiest purchases I ever made.


The new Mini is certainly impressive and suitable for many tasks, but it's not a replacement for a full desktop machine. Memory and storage are very limited, and the GPU, while great for the MacBook Air, is far from desktop performance. Also, for a desktop, the port selection is very limited.


Given that they just bumped the iMac in August, we might be waiting a bit before we start to see ARM iMacs.


The iMac in many senses isn't a replacement for a proper desktop. You can't expand the disk storage, and many iterations had no great graphics cards; this seems somewhat better now, but an upgrade means replacing everything, including the screen. You can't even clean out the fans after a few years.

Yes, I own an iMac, as it's the closest thing to a desktop machine Apple sells, but a replacement for what the Mac Pro used to be it is not.


3970x 32-core is < $2000

I got curious...

  - amd 3990x 64-core 128 thread    $3849
  - 256gb g.skill ddr 3600          $978
  - noctua nh-u14s cooler           $80
  - samsung 980 pro 1tb nvme pcie4  $229
  - evga 1000w power supply         $207
  - fractal design define 7         $170
  - asus trx40-pro mb               $396
  - nvidia rtx 3080                 $699 (?)
  - steve jobs: a biography         $15  (hardcover)

  = a really maxed out system       $6623
probably 1/2 that for a 5950x/am4 system

EDIT: ok, I had to know...

  - amd 5950x 16 core 32 thread     $799
  - 128gb g.skill ddr 3600          $489
  - noctua nh-u12s cooler           $60
  - samsung 980 pro 1tb nvme pcie4  $229
  - evga 850w power supply          $139
  - fractal design define 7 compact $130
  - asus x570-pro mb                $240
  - nvidia rtx 3070                 $499 (?)
  - steve jobs: a biography         $14  (kindle)
  - linux with kde mac-look icons   $0

  = a really really great system    $2599


Looks pretty nice, but many would be better off with:

  - amd 5600x or 5700x (saving $500 or $400)
  - samsung 970 pro 2tb for $229 (twice the space for the same price)
  - rtx 3060 (in a few weeks)

You'll save a fair bit (around $600), run quite a bit cooler, be much easier to keep quiet, and have twice the disk space. Or buy 2x2TB NVMe (motherboards with two M.2 slots are common these days).

Sure, the 5600x/5700x isn't as fast in throughput, but how often do you max more than 6/8 cores? Per-core performance is near identical, and with more memory bandwidth per core you run into fewer bottlenecks.

I bet that over a few years more people would notice the doubled disk space than the missing extra cores.


I don't use Apple products, but I think the high price of a workstation is not something specific to Apple.

From a discussion I had with a friend recently, I found that Precision workstations from Dell or Z workstations from HP have similar prices for similar performance (sometimes prices can reach $40k or $70k).

When comparing the Mac Pro to an enthusiast PC build, yes, the Mac Pro is "overpriced", but the Mac Pro uses a Xeon, which is pricier than a Ryzen (even if it's inferior performance-wise), and a pro GPU, which also costs more than a consumer GPU (again, even if performance is inferior). The price of an Nvidia Quadro is always higher than a GeForce GPU with the same specs.

You can see a spec/price comparison I did when having the discussion with my friend here : https://mega.nz/file/Nj4UnSJR#fBdZfn3zoZ8boxap35-GWEgDlicH3R...


It's not just "enthusiast PC builds". You can buy plenty of PCs in that middle category of high-but-not-extreme performance without the certifications, Quadros and vendor markup that a full-on "workstation" model has. And they're perfectly fine choices for many professional use cases.

For that market, the Mac Pro is overpriced (the high-end Dell/HP workstations are too), and Apple doesn't make anything more suited for it. That's the criticism. That the Mac Pro is acceptably priced compared to the Dell/HP workstations doesn't matter if that's not what you need.


My bad, I misunderstood the context of the comment. Thanks for the clarification.

For me personally, outside of laptops and phones, I don't see the appeal of using Apple hardware (unless you want to use MacOS X).


> I think the high price of a workstation is not something specific to Apple.

A professional workstation, with support, services, guaranteed replacement components, guaranteed service-life, maintenance contracts and so on is very different from an enthusiast-built machine.

It's like comparing a BMW M3 to a tricked out VW Golf. You can fit a bigger, badder engine under the VW's hood, stiffen the suspension, replace the gearbox and so on but, in the end, you can get one straight from the dealer and not everyone is inclined to assemble a car from parts.

Did that once. It's fun, educational and not very practical.


Well, VW does sell a tricked-out Golf, straight from the factory (GTI, R) :) Though it is more of a competitor to the M135i than the M3.


Exactly. You can make a VW Golf that performs like an M3, but, in the end, it'll be a lot of work for an unreliable car.


And the M3 is reliable? BMWs are famous for being leased since repair costs will kill you once your warranty is up.


Depends on the model. The E-series 3-ers were very reliable; for the F-series and newer it is exactly as you wrote.

The leasing thing has a slightly different reason: BMWs are bought by a market segment that always wants something new. They would not drive an older car, even if it was reliable; it would not be cool enough. Unfortunately, since circa 2010 BMW also figured this out, and since then their cars stopped being good -- they don't have to last -- and are just expensive.


Yes and no. $15k hardware is for people who use it professionally. By "professionally" I mean that they can book such a purchase as a cost and pay less tax. From my perspective it does not matter if I pay ~$15k to the tax office or to Apple.

In Europe the incentive to buy expensive goods as a company (cars, fancy office furniture, etc.) is even bigger because of VAT (much bigger than US sales tax).


That just isn't how taxes work. If you spend 15k of your profits on capital goods, you don't reduce your taxes by 15k, because no one was going to charge you 100% tax on it. You save only the tax rate applied to the purchase amount.


And depreciation.


Or you can put your heavy workloads in the cloud and book them as opex.


which is the way to go


It depends. If it's a constant workload, it may be better to lease the server and operate it on-prem. If it's spiky, then cloud and on-demand/spot is the best option.


You can get the VAT off, sure, but the rest you just get to pay out of pre-tax profit. In the UK that means a 20% saving.

So a machine that's £5k retail becomes £4,167 without VAT, effectively £3,333 if you take the tax savings into account, which only apply if your company is in profit. A £15k machine would still effectively cost you £10k.

It's a big saving, sure, but it's still a very expensive machine.
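For what it's worth, that arithmetic can be sketched in a few lines (a toy calculation assuming UK rates of 20% VAT and 20% corporation tax, the VAT fully reclaimable, and the company in profit; the function name is just for illustration):

```python
def effective_cost(retail, vat_rate=0.20, corp_tax_rate=0.20):
    # Strip the reclaimable VAT, then apply corporation-tax relief
    # on the remaining ex-VAT price (only available if in profit).
    ex_vat = retail / (1 + vat_rate)
    return ex_vat * (1 - corp_tax_rate)

print(round(effective_cost(5_000)))   # ~£3,333
print(round(effective_cost(15_000)))  # £10,000
```

So the discount is roughly a third off retail, not the whole spend.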


Some people pay more than 20% tax. At 50% tax the savings look pretty good.


That tax rate is corporation tax, not a personal tax. Does any European country have a 50% corporation tax?

You're right if you start thinking "Well, I run my own company, so the cost compared to paying myself that cash as a dividend is much smaller", but that only really applies to those of us who run our own small companies, own them fully, run them profitably, and have already pumped our personal earnings up to that level. And then we're on to the question of what that box is for and why it's needed: is it a company asset or a personal one?

And remember that you get to apply the same percentage discount to any other machine: your £15k Apple box may come down to a conceptual £5k hit on your pocket, if you're paying 50% personal tax on top of the company taxes, but a £4-5k Zen 3 box with dual Nvidia 3090s will come in at £1,333-£1,667 by the same metric and quite likely perform better...


If you can run your business on a Zen box with an Nvidia 3090 in it, good for you!


I mean, if you're not running macOS-specific stuff, then a top-end Zen 3 box with a couple of 3090s in it is going to have more grunt than a $15k Mac Pro with a Xeon and a Vega II Duo.

But I wasn't really here to talk about comparative value anyway - this was a tax discussion!


Not sure what needs to be discussed there, I know how much tax I pay and which tools I need to get my job done.


Back at the top you seemed to be saying that the entire spend would come off your tax; I think that's why people picked you up on it.


I think you are talking to the wrong poster ;-)


Sure it's not the first time!


I'm not sure a VAT (Value Added Tax) deduction is going to save anyone's Christmas, considering the base Mac Pro is 5,320 EUR ($6,333) tax-free.


Doesn't this mean that you cannot use the computer for personal use, or that you can only reclaim the business-use proportion of that VAT?


Strictly speaking, yes: there is the expectation that the machine is used entirely for business purposes if the business is paying for it; otherwise it might be considered a benefit in kind. It's not so much about VAT then as PAYE.

However I feel no particular guilt that the workstation I use for my full-time dev day-job also has a windows partition for gaming in the evening, and I hope that the tax authorities would see things the same way! It's not like the asset isn't a justified business purchase.


Obviously “I’m willing to pay a lot” can mean a wide range of things, but pretty clearly the comment is talking about paying a moderate premium over the competition. The same way a MacBook Pro model might cost $2,500 where you could get a similarly specced Windows laptop for $1,800, or an iPhone might cost 30% more than a similar Android flagship.

It’s an order of magnitude different with the Mac Pro: the base model is a $6,000 machine that will perform like a ~$1,500 PC. And the base model makes no sense to buy; it’s really a $10k-$30k machine. It’s a completely different product category.


I don't mind paying for a new Apple, but I do mind paying for repairs on a system that fails just after the warranty runs out. Paying $1500 for a new computer every 5 years or so is good. Paying $1500 for a computer every 16 months, not so much.

The monitor on my MacBook Pro just died, and I bought it July of last year. The repair was about $850 USD. Luckily my credit card covered the hardware warranty, but I'm kind of wishing I'd bought AppleCare.


AppleCare for Macs is definitely worth it. They even replaced the display on my MBP after 5 years when I called (the key seems to be to call someone at Apple directly). This was a part they should've recalled, but anyway, I wouldn't have gotten a new display without AppleCare.


There was a display replacement program for 2012-2015 MBP13s, due to "staingate". Though I recall it covered 4 years from the date of purchase, or so.


Yep it was related to that, even though AppleCare ended up replacing my screen after 6 years (not 5 like I initially remembered). It was an unexpected bonus and I’m still quite happy about it. But the key was to call AppleCare directly and not go to the geniuses.


I wonder if there will be one for 2019 MacBook Pros. The repair shop said the replacement display was also faulty, so they had to order a replacement replacement.


Is this the connector problem where the screen doesn’t turn on?


Unless absolutely everything in a PC fails at once, you can just repair it and swap components as needed. That's not really doable with a Mac, given that you can't buy the components from Apple.


Because even a $3k non-Apple machine outperforms the $5k base-model Mac Pro by a large margin, and if I spent $5k outside Apple it would be even more ridiculous: dual RTX 3090 + 24-core Threadripper ridiculous.



