
What Apple could have done is to continue supplying something like the G4 towers. Those were stunningly beautiful machines, and practical.


What if they simply decided that they didn't care for that part of the market? At some point we should just accept that.


Sure, but there's a legit gap in the datacenter— not having a sanely, legally rackable OS X machine is a pretty big problem for a lot of organizations. Not everyone wants to do their Jenkins builds or generate homebrew bottles on a Mac Mini under someone's desk.


Is this really an issue? They sell shelves that let you rack 2 Mac Minis in 1U space. You can also buy a rack mount Mac Pro if you want to spend really big bucks.


Isn’t the newest Mac Pro available in rack form?


Ah, so it is, and the thermal story there is definitely much better than with the Mini, there being a clear intake/exhaust flow. OTOH, there's still likely a gap in terms of management features, and the starting price of $6.5k for a 4U system is definitely going to be a barrier for some. Good to know there's at least something, anyway.


None of those use cases seem relevant for iOS or macOS development.


Surely you'd want CI builds for your app? I suppose you can always go the SaaS-y option and just offload this problem onto Travis or CircleCI, but then they're the ones stuck figuring out how to rack thousands of Mac Minis, dealing with thermals in a machine that isn't set up for hot/cold aisles, a computer that doesn't have a serial port or dedicated management interface, etc.

If you're a big enough org or the app is for internal use, this might not be an option anyway. At that point I imagine most people just give up on it and figure out how to run macOS on a generic VM. But at that point you have to convince your IT department that it's worth doing something that is definitely unsupported and in violation of the ToS.

Or maybe some of these are big enough that they are able to approach Apple and get a special license for N concurrent instances of macOS running on virtualized hardware? Who knows.


No company on the planet is big enough for Apple to make exceptions like that. All of them either use a cloud provider or a custom rack design just for Mac Minis.


Companies like Google or Microsoft aren't big enough? Chrome and Office alone, I would wager, are more than big or popular enough to get special treatment.

Adobe is smaller by contrast, but I'd speculate it has a much deeper relationship with Apple as well.


All of them use Mac Minis as far as I know.


Nope, just build straight from Xcode.


Well sure, for a single person team. But as soon as you're working with other people, surely you want an independent machine making builds and running tests— this is literally item 3 on the Joel Test.
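(For the unfamiliar, the independent build step itself is trivial to script; the hard part is the macOS hardware it has to run on. A minimal sketch of what a CI agent might execute, assuming a hypothetical project and scheme name; xcodebuild is Apple's real CLI, everything else here is illustrative:)

    # Minimal CI build/test step for an iOS app, as a build agent might run it.
    # "MyApp.xcodeproj", the scheme, and the simulator name are hypothetical placeholders.
    import subprocess
    import sys

    def run(cmd):
        print("+", " ".join(cmd))
        return subprocess.run(cmd).returncode

    if __name__ == "__main__":
        rc = run([
            "xcodebuild",
            "-project", "MyApp.xcodeproj",   # hypothetical project
            "-scheme", "MyApp",              # hypothetical scheme
            "-destination", "platform=iOS Simulator,name=iPhone 12",
            "clean", "test",
        ])
        sys.exit(rc)  # a non-zero exit is what marks the CI job as failed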


You would be surprised how some teams actually develop code, even timesharing iMacs among teams.


I wonder what the overlap is between those teams who do not invest in their infrastructure, and those who ship broken products.


If you ever get a chance to meet employees at CircleCI or some other CI provider at a conference after Covid is over, consider asking them about how they rack Mac Minis.


Good for them, I just use Xcode.


Ah yes, how could they have been so blind. They should have just put Xcode in the server racks.


What servers?


The entire thread is about running OS X in data centers. In data centers you run servers, in case you hadn't noticed.


pjmlp's view appears to be that because their customers, who are not experts, don't know enough to ask for continuously tested software, they don't believe it is their professional responsibility to provide that either. This allows them to dismiss any complaints about macOS in datacenters as irrelevant.


On the contrary, testing doesn't come for free, and everyone gets what they care to pay for.


I'm not a consultant, but I believe it would be an ethical failing on my part to hand someone else a piece of code without extensive, automated testing and CI.


You have an 80-hour budget to deliver X features, no compromise or no pay; feel free to decide how to deal with testing.


Well, thank you for providing the first compelling argument as to why software practices need to be more formally regulated. Providing CI/CD should be the industry norm and expected default.


macOS is not a server OS, and the Xserve long ago stopped being an option.


I can't tell if you're being facetious or you just don't care about automated testing and continuous integration for your code.


CI/CD is largely ignored in plenty of consulting gigs.

I care to the extent customers care.


So true. I've done a lot of freelance work over the past 20 years. CI/CD has never come up. You're sometimes lucky if you can even set up a test system / site.


I pity your customers.


Their choice, no need to pity them.


Apple tried selling Xserves for years.


And to this day they're still the most gorgeous servers ever made, especially the Xserve RAID.


I don't think the answer is for Apple to force people into buying custom server hardware any more than it is to force them into making janky rack setups for Mac Minis.

The answer that most people would like to see would be a stripped down, non-GUI macOS that's installable at no cost in virtualization environments, or maybe with some evaluation scheme like Windows Server has, which effectively makes it free for throwaway environments like build agents.


> The answer that most people would like to see would be a stripped down, non-GUI macOS that's installable at no cost in virtualization environments

That's called "Darwin" and it's theoretically open source, but there doesn't seem to be a useful distribution of it. Whether that's due to lack of community interest or lack of Apple support is the question.


A useful distribution (for building anyway) would require all the headers and binaries from macOS, which wouldn’t be distributable, right? So you’d have to have enough of a free system to be able to get to the point where that stuff could be slurped out of a legit macOS installation. Sounds like an interesting challenge.


Okay, but offering no machine suitable for developers and power users will eventually hurt them when those users leave the whole ecosystem.


> offering no machine suitable for developers and power users

This perception strikes me as having warped in from a different decade. Nowadays, at least in my neck of the woods, developers almost universally use laptops, and Apple's still plenty competitive in the (high end) laptop department.

For the most part, the only developers I know who still use desktops are machine learning folks who don't like the cloud and instead keep a Linux tower full of GPUs in a closet somewhere. And then remote into it from a laptop. Half the time it's a MacBook, half the time it's an XPS 13. And they were never going to consider a Mac Pro for their training server, anyway, because CUDA.

I couldn't speak to power users, but my sense is that, while it meant something concrete in the '90s, nowadays it's a term that only comes out when people want to complain about the latest update to Apple's line of computers.


I work in games, where we write C++ in a multi-million-LOC codebase. Every developer in my company has a minimum of 12 cores and 96GB of RAM. All of the offices are backed by build farms on top of this. There are entire industries that rely on very high-end hardware. (Of course we also rely on lots of Windows-only software too, but that's only an issue once the hardware is solved.)


Fair, and we could spend ages listing all the different kinds of people who have really specific job descriptions that require them to have traditional, stationary workstations. And then we could follow that up with lists of all the reasons why they need to be running Windows or Linux on said workstations, and couldn't choose comparable Apple hardware even if it were available.

But I don't think that we need to beat a dead horse like that. The more interesting one would be to figure out some interesting and non-trivially-sized cross-section of people who both need a workstation-class computer, and have the option of even considering using OS X for the purpose.


The main reason for any OS developer to buy Apple x86 machines was that Apple keeps its number of hardware variants to a minimum, so you could run compatibility (and truly same-hardware performance) tests against any OS, since OS X was the only one locked to its hardware. The same might be true for ARM if there are adequate GPL drivers so that Linux/Android, etc. aren't excluded.


I'm not sure that's true. At least in my experience, Boot Camp seemed almost designed to cripple Windows compared to OS X.

The last time I used it (the last MBP with Ethernet built in; I want to say 2012 or 2013?), some of the features "missing" in Boot Camp were:

- No EFI booting. Instead we emulate a (very buggy!) BIOS

- No GPU switching. Only the hot and power hungry AMD GPU is exposed and enabled

- Minimal power and cooling management. Building Gentoo in a VM got the system up to a recorded 117 degrees Celsius in Speccy!

- Hard disk in IDE mode only, not SATA! Unless you booted up OS X and ran some dd commands on the partition table to "trick" it into running as a SATA mode disk

The absolute, crushing cynic in me has always felt that this was a series of intentional steps. Both a "minimum viable engineering effort" and a subtle way to simply make Windows seem "worse" by showing it performing worse on a (forgive the pun) "Apples to Apples" configuration. After all, Macs are "just Intel PC's inside!" so if Windows runs worse, clearly that's a fault of bad software rather than subtly crippled hardware


I think we used rEFIt... I remember it would be a bit finicky, but I never really had to boot Windows since my product had no equivalent, and these days I don't boot OS X, though firmware updates would be nice.


What if Apple decided that they don't gain that much from AAA games, so they don't care about offering hardware that those companies might run on?

I have the feeling that Apple just cares about apps for iOS (money-wise). What's the minimum they need to do so that people write iOS apps?

If this hardware, incidentally, is good for your use case, all is good. If not, they might just shrug it off and decide you're too niche (i.e. not adding enough value to their ecosystem) and abandon you.


Yes, I think they view the basic mid-range tower box as a nearly-extinct form. Like corded telephones & CRTs.

They choose to make the mac pro as some kind of halo product, I guess. But really the slice of people who need more power than an iMac, and less than this "Linux tower full of GPUs" or a render farm, they judge to be very small indeed. This wasn't true in the 90s, when laptops (and super-slim desktops) came with much bigger compromises.


I don't think they think it's a small market; I think they think it's a commoditized market with very thin margins. That form-factor has a literal thousand+ integrators building for it, and also in many segments (e.g. gaming) people build their own to save even more money. Those aren't the sort of people who are easily swayed to pay an extra $200+ of pure margin in exchange for "integration" and Genius Bar "serviceability" (the latter of which they could mostly do themselves given the form-factor.)


I guess people into hot-rodding, especially for games, have never been Apple's target. (Even if they are numerous, and I actually have no idea how large this segment is.) Besides price-sensitivity, wouldn't they be bored if there were only 3 choices? Maybe we will find out when the M2-xs or whatever arrives.


> suitable for developers and power users

Important to note 'some' here. I'm a developer and power user, and haven't had a desktop computer in almost 10 years.


Me too. I last had a desktop, at work, about that long ago, and have not bought a desktop computer for myself in a lot longer. Laptops got very good and I can still plug it into a monitor and external controllers when I need to. I don’t need a server at home because of the cloud and broadband.


> Okay, but offering no machine suitable for developers

You mean... software developers? The same people who almost universally use a Mac?


There's a massive US-centric bubble when it comes to Apple. iPhones and MacBooks are not in the majority, let alone universal, among software developers as a whole; they're just popular in pockets.


The five dollar latte crowd is willing to pay and consume. Walk into any café and good luck finding non-Apple machines. (Occasionally there will be a Surface or two, especially if you live in Seattle).


They are the most visible, but that does not mean they are the most important part of the ecosystem. Plus, in 5 years they will be reconsidering their workplace setup due to back pain and/or carpal tunnel. And Apple asks an arm and a leg for all the ergonomic accessories, like an external monitor and a dock.


> Plus, in 5 years they will be reconsidering their workplace setup due to back pain and/or carpal tunnel.

This is where I'm at.

I don't know if other people are built from sturdier stuff than me or what, but typing on a laptop to any significant extent leaves me with tendonitis for several days. And staring at a laptop screen too long leaves me with neck pain.

Laptops are a nightmare in terms of ergonomics.

It's been a bit of a blessing for me because I only have a laptop at home, and it basically means I can't take work home with me.

But I'm pretty seriously considering upgrading to a traditional desktop sometime in the next year.


Laptops are my ergonomic savior. I make sure it's on my lap, and that my elbows are on softly padded armrests and hang down gently, and this has given me decades of work after fierce carpal tunnel inflammation.

I also use a Wacom tablet comfortably placed on a table to my right.


You just... buy from someone else? You don't have to buy an external monitor or a dock from Apple.


Sure - so now they're unsuitable for developers?


> The same people who almost universally use a Mac?

This has become steadily less true since about 2012, in my experience. I don’t know any full-time developers still using an Apple laptop. The keyboard situation caused a lot of attrition. I finally stopped support for all Apple hardware at my company months ago, simply to get it out of my headspace. Will Fusion360 again be completely broken by an Apple OS update? Am I going to have to invest time making our Qt and PyQt applications work, yet again, after an Apple update? Are Apple filesystem snapshots yet again going to prove totally defective? The answer is “no”, because we really need to focus on filling customer orders, so we’re done with Apple. ZFS snapshots function correctly. HP laptop keyboards work ok. Arch Linux and Windows 10 (with shutup10 and mass updates a few times per year) get the job done without getting in my face every god damned day.


> I don’t know any full time developers still using an Apple laptop.

Fascinating. I can name a few startups in my town that use Apple. One just IPO'd (Root), another is about to (Upstart). There are others as well.

The big companies it's hit or miss. Depends on if they are working on big enterprise applications or mobile/web. Mobile and web teams are all on MacBook Pros, and the big app dev teams aren't.

When I was last in Mountain View they were on Mac as well but I know that depends on personal preference.


>The same people who almost universally use a Mac?

* in very specific places and conditions.

Actual numbers from every single credible survey put Macs at a grand maximum of 25%.


Well, most corporations don't give developers a choice in what computer they use. I doubt that makes them unsuitable.


Macs keep dropping off our domain, so there's no real way to maintain their provisioned state.


I use a mac because other developers do in my office. But I'd be just as productive on a linux or windows machine.

For a while OS X had the edge because it had a nice interface while still offering a lot of Unix. Now Windows and Linux have caught up in the areas where they were lacking before. Meanwhile, Apple has been caring less and less about people using the CLI.


Quite possibly - I was a huge Apple fan who's now using a PC because I was fed up with the lack of viable options for me.


Apple will certainly offer an ARM-based Mac Pro, but I'm assuming it'll be a very different beast - the current one maxes out at 1.5TB of RAM, and it doesn't seem likely anyone will integrate that much memory on a chip anytime soon ;-)

Memory bandwidth is one key feature impacting the M1's performance. When Apple builds an ARM-based Mac Pro, we can expect something with at the very least 5 DDR5 channels per socket. It's clear from this that the M1 is a laptop/AIO/compact-desktop chip.


The M1 already has 8 LPDDR4x channels per socket, running at 4266MHz.
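(For back-of-the-envelope purposes, peak bandwidth is roughly channels × channel width × transfer rate. A rough sketch; the channel widths below are the commonly cited ones, 16-bit for an LPDDR4X channel and 64-bit for a DDR4 DIMM channel, and should be treated as assumptions:)

    # Rough peak-bandwidth estimate: channels * channel_width_bytes * transfer rate.
    # Channel widths assumed: 16-bit for LPDDR4X, 64-bit for a DDR4 DIMM channel.
    def peak_gb_per_s(channels: int, width_bits: int, mt_per_s: int) -> float:
        return channels * (width_bits / 8) * mt_per_s / 1000

    # M1 as described above: 8 narrow LPDDR4X channels at 4266 MT/s
    print(peak_gb_per_s(8, 16, 4266))   # ~68 GB/s

    # A hypothetical workstation board with 8 DDR4-3200 DIMM channels (Epyc-style)
    print(peak_gb_per_s(8, 64, 3200))   # ~205 GB/s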


My bad. I was looking at the specs. It's 300GBps, which is roughly 5x DDR5 IIRC.

So yes, at the very least 8 DDR4 channels, or one per core, but I'd expect more from a workstation-class board.

Now, speaking of the board, all those memory channels will be funny.


Funny? 8 channels is the standard AMD Epyc socket. Most threadrippers (AMD's workstation chip) are 4 channel, but there is a variant that's 8 channel.


I would expect more, so that cores don't get memory starved. The M1 has 4 fast cores and 4 slow ones. If we imagine an M2 with 8 fast cores, I would expect it to need 16 channels to have the same performance. That's a lot.


Dunno, the M1 CPU package is tiny, thin, power efficient, etc. It's got 4 memory chips inside the package. I don't see any particular reason why a slightly larger package could have 4 memory chips on one side, and 4 chips on the other to double the memory bandwidth and memory size.

However the M1 is already pretty large (16B transistors), upgrading to 8 fast cores is going to significantly increase that. Maybe they will just go to a dual CPU configuration which would double the cores, memory bandwidth, and total ram.

Or move the GPU and ML accelerator offchip.


I'm a developer and haven't had a desktop in 15 years or so. It's been a mix of Thinkpads (IBM then Lenovo) and MBPs.

I'm guessing very few developers need the extra power a desktop offers over a high-end laptop.


even a low end laptop. i work in clojure for finance. digital nomad. thinkpad x220.


I think the developers and power users that still use desktop machines/towers are either a very CPU-power-hungry niche exception, or the more backwards ones, and thus least likely to influence/be imitated by anyone...


I beg to differ (as a developer on a desktop). The reason for developing on a desktop is that my productivity is much higher with 3 screens, one of which is a 40-inch, plus a full 101-key keyboard and a mouse.


> The reason for developing on a desktop is that my productivity is much higher with 3 screens

Those requirements don’t dictate a desktop[0]. Also, the physical size of the monitor is irrelevant, it’s the resolution that matters. Your video card doesn’t care if you have a 40” 4K monitor or an 80” 4K monitor, to it, it’s the same load.

The reason I still have a cheese grater Mac Pro desktop at all is because I have 128GB of RAM in it and have tasks that need that much memory.

[0] I’ve connected eight external monitors to my 16” MBP (with laptop screen still enabled, so 9 screens total). I don’t use the setup actively, did it as a test, but it very much works. The setup was as follows:

TB#1 - 27” LG 5K @ 5120x2880

TB#2 - TB3<->TB2 adapter, then two 27” Apple Thunderbolt Displays @ 2560x1440

TB#3 - eGPU with AMD RX580, then two 34” ultrawides connected over HDMI @ 3440x1440, two 27” DisplayPort monitors @ 2560x1440

TB#4 - TB3<->TB2 adapter, then 27” Apple Thunderbolt Display @ 2560x1440

So that’s almost 50 million pixels displayed on around 4,000 square inches of screens driven by a single MBP laptop.
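(The pixel math roughly checks out. The 16” MBP's internal panel resolution, 3072x1920, is an assumption here; the rest are the resolutions listed above:)

    # Total pixels across the setup listed above.
    # The MBP's internal panel (3072x1920) is assumed; it isn't stated above.
    screens = [
        (5120, 2880),                 # TB#1: LG 5K
        (2560, 1440), (2560, 1440),   # TB#2: two Apple Thunderbolt Displays
        (3440, 1440), (3440, 1440),   # TB#3: two 34" ultrawides
        (2560, 1440), (2560, 1440),   # TB#3: two DisplayPort monitors
        (2560, 1440),                 # TB#4: Apple Thunderbolt Display
        (3072, 1920),                 # internal 16" MBP panel (assumed)
    ]
    total = sum(w * h for w, h in screens)
    print(f"{total:,} pixels")        # ~49 million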


How was it to move (or find) the cursor?

(I kid, I kid)


You kid, but it legit was an issue. I’ve used at least 3 monitors (if not 1-2 more) for over a decade now, so I’ve experience there, but going up to 9 even for a short while, it was definitely an issue.


Yeah I’m with you. Laptops are great, but they sacrifice a lot for the form factor. Remove the constraint of needing an integrated screen, keyboard, touch pad and battery, and you can do much more. Sure you can dock it, but docked accessories are always second class citizens relative to the integrated stuff.


All of which are available on modern laptops


Laptop user here, I also have 3 screens. I do use the MBP's keyboard, but never felt like that cost me productivity. I use a normal mouse as well. The only reason I can think of to need a desktop is the extra CPU/GPU capacity you can get.


> The only reason I can think of to need a desktop is the extra CPU/GPU capacity you can get.

Or RAM


Or internal peripherals. If I want 20TB of storage, and I don't want external chassis all over the place, I need a desktop with at least a couple of 3.5" bays.


You mean you don't like paying $500 for 8GB of soldered RAM?


Nope, not what I’m saying at all (in part because your comment is hyperbolic and untrue). Some folks need more than 64GB of RAM, which is the highest amount most laptops have.


He is not that far off. Apple asks $200 for 8 GB, so he is in the same order of magnitude. For comparison, this week I bought 16 GB DDR4 ECC (unregistered) sticks for 67 EUR per piece (before VAT).


Great, so you bought a different type of RAM in a completely different form factor and paid a different price. This is on “processor package” RAM and will thus have an entirely different price basis than a removable stick would, not even factoring in the Apple Tax.

Furthermore, how is that relevant to the point _I_ was making about needing more than 64GB of RAM? If you both want to go off on a tangent, fine, do so, but don’t try to put words in my mouth while doing it.


> Great, so you bought a different type of RAM in a completely different form factor and paid a different price.

It's called "using an example" or an "illustrative example". For comparison, I used a type of RAM that is traditionally much more expensive than what you find in laptops.

> This is on “processor package” RAM and will thus have an entirely different price basis than a removable stick would,

No.

1) The same price is being asked for RAM in non-M1 models.

2) You could put any price tag on it that you want: because the item is single-sourced, the vendor can pull a quote out of thin air and you cannot find an exact equivalent on the market. Therefore, for comparison, a functionally and parametrically similar item is used.

> how is that relevant to the point _I_ was making about needing more than 64gb of RAM?

You get a different product, one that supports more RAM.

> If you both want to tangent, fine do so, but don’t try to put words in my mouth while doing it.

Could you point out where I did that? I was pointing out that your note about the GP being hyperbolic is untrue; he was in the ballpark.


> I was pointing out that your note about the GP being hyperbolic is untrue; he was in the ballpark.

That's about as much "in the ballpark" as $80 is; both are off by 2.5x. Claiming they are "the same order of magnitude, so it's not hyperbolic" is laughable. $100k and $250k are the same order of magnitude, but they are radically different prices, no?
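(Spelling out the 2.5x arithmetic with the figures quoted in this subthread; the 67 EUR ≈ $80 conversion is approximate:)

    # The "both are off by 2.5x" arithmetic, using the numbers quoted above.
    apple_price   = 200   # USD, Apple's quoted upgrade price for 8 GB
    claimed_price = 500   # USD, the hyperbolic figure upthread
    street_price  = 80    # USD, roughly 67 EUR for a 16 GB ECC stick

    print(claimed_price / apple_price)   # 2.5: the claim overshoots Apple's price by 2.5x
    print(apple_price / street_price)    # 2.5: Apple's price is 2.5x the street stick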


at work, when at the office, they are always pushing screens on us. i keep thinking it's some pork deal with dell. my whole team either plugs a laptop into one screen, or just works straight on the laptop. maybe we're not cool.


A quick Google will turn up several serious usability studies showing that more screen real estate == higher productivity. It depends a lot on the type of work, of course, but for development a larger screen means less scrolling and tab switching => less context switching => your brain gets more done.


It's probably the ergonomics police.


Or... they're old and can't see the tiny laptop screen, or get back pain from using a laptop all hunched over. To be honest, I don't know how anyone does serious work on them.


You can connect a laptop to 2-3-4 external screens. Which many do. You don't need a tower for that.


Apple itself is selling its new chips as making for faster devices. If only a niche wanted that speed, Apple probably wouldn't be pushing it so hard as part of the pitch.


That would be the case if everything else were equal. Everything else is NOT equal. People also want portability, small size, battery life, etc.

If more than a niche had speed as its sole priority, then they would already use desktops, but most (80%+) use laptops today.

But of the majority that uses laptops, most would like a faster machine. Just would prefer it was also a laptop.



