I was an intern at SGI in the summer of 1998, when we shipped the latest minor version of IRIX, 6.5. I worked on a test suite for IRIX's pthreads implementation, and got to ship a teeny, tiny bit of real code that fixed a real-time hold-off in pthread_mutex_t. (IRIX is a hard RTOS, you see.) As things happened, the dot-dot releases of that minor version would be the last releases of IRIX to roll off the software assembly lines before SGI put it in maintenance mode for the last darn-near-30 years.
In 2000, I was the 20th-or-so full-time engineer at VMware, where I worked for 9 years. Then I was at Facebook from 2009 to 2016, where I worked on the search backend (now replaced), HHVM (which still runs the Big Blue Application, a shrinking portion of the Meta Empire), and started FAIR in 2015 (which finally seems to have turned around the "open" sign with Yann's departure).
In 2016 I started at Slack as Chief Architect, where I mostly did not write a ton of code. I worked on a job queue scheduler which I would not be surprised to find has been replaced. And after that I was mostly encouraging/advising people doing Real Work.
All of which is to say, it is quite possible that the last code I've worked on professionally that is out there running on customer machines ... is that libpthread mutex bug fix from when I was barely old enough to drink.
I was a young systems programmer in that decade, which was one of the most vibrant of my life, and I had a lot of projects on Irix, particularly in Mountain View, necessitating my weekly flight from Burbank to San Jose for 3 days on site, porting and hacking and generally having a great ol' Irix time .. and oh, how I loved my trips into the SGI parts of town, the Birds of a Feather meetings discussing Irix vs. Linux (and SunOS and *BSD, oh no!), the flight simulator facility on the SGI campus where I would regularly get trounced by Air cadets in a matter of seconds .. the beautiful buildings that looked like they belonged under my desk or atop the Indy I had at home .. the confident air of the SGI engineers at lunch on the Oracle campus, the crazy ports of naughty things to naughty hardware (Netscape Navigator on Nintendo 64, oh my, how naughty you were, SGI!)
If only SGI had not made that Microsoft deal, had a bit more respect for their hardware engineers, and instead actually built a laptop to compete with Apple's famed tiBook. It's one of my favourite alternative-universe daydreams .. what if the tiBook was an SGI tiBook, running Irix out of the gate .. would we have quite the Big Fruity Company dilemma we suffer today? What would an SGI iPhone have looked like?
Off to play some Tranquility and calm myself down a bit.
SGI creates a low power cpu for Apple to use in portable devices, eventually in desktops and laptops (no Arm).
And either:
SGI launches low budget PC with playstation 1 level 3d graphics as soon as they could compete with win3.1/95, running Irix.
Or:
A few years after that SGI launches what is essentially the Voodoo 2.
Any way you look at it the only possible future for SGI was low cost mass market devices. Just a matter of picking which one, they picked none.
Yes.. some interesting thoughts there, MIPS in my pocket: hell yeah.
The crazy thing is, SGI did have internal research projects to do such things .. they had engineers working on porting Netscape to the N64, which could very well have served as the basis for a more interesting consumer-end mass market device. Imagine if someone at SGI had put a cell modem in the mix somehow, yikes.
Well, it's all a dream. Meanwhile I still have all my SGI gear, and I'm not afraid to admit I've been looking at 3DFX Voodoo cards on eBay a little more than I should have today ..
>Yes.. some interesting thoughts there, MIPS in my pocket: hell yeah.
The PSP, and twice, as it had an r3k interpreter/loader for PSX games.
Also, you can call me crazy, but I played NetHack on the PSP with the CFW mod setting the clock from 222 MHz down to 50 MHz, making the battery last a few hours longer...
The GCWZero was a MIPS console too, and pcsx-rearmed had optimisations for that too.
There have been a couple of GCWZero clones made in more recent years (e.g. from Anbernic) running the same (or a derivative) Linux-based OS with JZ4770 MIPS SoC and software compatibility. Too bad Ingenic never released any successor to the SoC though.
Fahrenheit was the end. As soon as that was declared the way forward, nothing happened except engineers going off to work for Nvidia, which nobody at SGI seemed to have a problem with.
You can't change a company that sells products for a minimum of £10K into a company that sells products for £2K, and the PC was making the old business model impossible. Apart from anything else, there were some good tools on the PC, not least MS Office and Adobe Photoshop. The situation was doomed once you didn't need SGI to do decent 3D. They never would have reinvented themselves for this age, sad to say.
> sells products for a minimum of £10K to a company that sells products for £2K
Well .. Apple ended up doing it. Why couldn't SGI? /s
Oh, I know why SGI couldn't do it: elitism. They were high on their own hubris for the latter part of the '90s, when they should have been humbled by 3DS Max and Animation:Master eating their lunch .. and used that humility to build products that made people Think Different™ .. they already had a market doing just that, thinking differently to everyone else (who were bleating "Unix is dying, it's gonna die, let it die!" at a fever pitch), but that market thought quite a bit too highly of themselves, methinks .. (I know, I was there, and I was one of them .. apart from the "Unix is dying" bit, I never once thought that from the day I had a MIPS RISC/os-based Magnum pizzabox plopped on my desk and was told to do something productive with it..)
Apple literally did that with Mac OS X: Unix geekdom married to the A/V media writers' Mac UI background. It attracted both kinds of white-collar, college-educated users. And since it came with XQuartz, you could run old legacy GL software at higher speeds... and hire a graphics expert in between to do fancy PDFs/images for the articles and the press releases.
GNU/Linux with KDE 3 could have been close, but sadly it was too fragmented. If not, well... imagine a fully libre Qt from the beginning, GTK not existing (no reason for GTK+/GNOME, as KDE would have been good enough), automagic conversion of Motif code to Qt at blazing speeds, and Qt themselves releasing high-quality C bindings. It could have been unstoppable, even more than Apple. No ESD vs. artsd; the Pipewire-style merging of Pulse/ESD and the like would have happened long ago. KParts would have left D-Bus and COM/OLE in the dust. KHTML/WebKit would have been even more powerful.
Fedora wouldn't be the reference distro; maybe Slackware, with dependencies handled by slapt-get and a nice GUI installer for newbies. A whole different world, where smartphones would provide both an input interface... and a sliding keyboard.
Yes, I agree with you on all points - and in hindsight it seems mad that SGI didn't see Apple as the competitors they eventually became, until it was far, far too late.
I rue the day someone at SGI decided to make a deal with the Microsoft devil.
> the crazy ports of naughty things to naughty hardware (Netscape Navigator on Nintendo 64, oh my, how naughty you were, SGI!)
I remember in my youth when I first discovered Linux, soon after discovering that it ran on all sorts of architectures and starting to wonder how many of the computing devices I owned I could get running Linux.
The N64 and a Mac LC III were the only two I never managed to make it happen on. The idea of IRIX on one somehow never even crossed my mind, even though in hindsight it seems so obvious.
I read there was a plan to bring some kind of network platform to the N64, but I was completely unaware there was a port of Netscape to it -- and googling doesn't show anything either!
Do you have any more info? Is that something you ever had a copy of?
No, I only ever witnessed it being tested on N64 hardware during my visits to Oracle to have lunch with friends .. it was a "what if" kind of project, as far as I recall, to see if it could be done. I guess it wasn't viable.
It's weird to consider that the only code I have written that is still running (except for my own machines) is likely minor patches to various open source projects 20 years ago (and some games, but that's something else).
It's an easter egg on the website that usually goes unnoticed. It's our first time on the front page of HN, so it's a little overutilized right now. Capital-C clears it.
Hi Fil! Congrats on all the amazing progress on Fil-C.
We needed to port all the user-level fork(2) calls to vfork(2) when working on uClinux, a port of Linux to MMU-less microcontrollers[1]. It used to be that paging MMUs were kinda expensive (those TLBs! so much associativity!!!), and the CPU on your printer/ethernet card/etc. might not have that much grit. Nowadays not so much.
Still. A hard-and-fast use for vfork(2), as requested perhaps.
Amazing. I can practically smell that owl it looks so darned owl-like.
From the article it doesn’t seem as though photorealism per se was a goal in training; was that just emergent from human preferences, or did it take some specific dataset construction mojo?
I love owls. Photorealism was one of the focus areas for training because the "AI look" (e.g. plastic skin) was the biggest complaint about the FLUX.1 model series. Photorealism was achieved through careful curation of both the finetuning and preference datasets.
When Diego first showed me this animation, I wasn't completely sure what I was looking at, because I assumed the left and right sides were like composited together or something. But it's a unified screen recording; the right, generated side is keeping pace with the riffing the artist does in the little paint program on the left.
There is no substitute for low latency in creative tools; if you have to sit there holding your breath every time you try something, you aren't just linearly slowed down. There are points that are just too hard to reach in slow, deliberate, 30+ second steps that a classical diffusion generation requires.
When I first heard about consistency, my assumption was that it was just an accelerator. I expected we'd get faster, cheaper versions of the same kinds of interactions with visual models we're used to seeing. The fine hackers at Krea did not take long to prove me wrong!
There is no substitute for real-time when you're doing creative work.
That's why GitHub Copilot works so well; that's why ChatGPT struck a chord with people—it streamed the characters back to you quite fast.
At first, I was skeptical too. I asked myself, “what about Photoshop 1.0? They surely couldn't do it in real-time.” It turns out that even then you needed it. Of course, the compute wasn't there to do a simple translation of all the rasterized pixel values that form an image within a layer, but there was a trick: they showed you an outline that would tell you, the user, where the content _will_ render if you let the mouse go.
And it did blow up! But not as much as changing the UI towards a (familiar) chat interface.
Good point! I agree with it but forgot to mention it: interaction matters.
With GitHub Copilot you are in familiar terrain, your code editor; with ChatGPT, you are talking to it the same way you'd talk to an assistant, via chat/email.
And we, at KREA, don't think it'll be the exception for AI for creativity.
That's definitely true, the chat format (vs the completion format) made all the difference. So much so that ChatGPT blew up even though it was inferior in capabilities to GPT-3, just because it was (much) more usable.
As an investor, I hope you’re ready to bankroll the inevitable legal battles. These are not going to be restricted to the big players. Eleuther was recently sued, and they’re a non profit.
The moment you try to market this, you need to be prepared for the lawsuit. I’m preparing for one, and all I did was assemble a dataset. This model is built off of work which most people (rightly or wrongly) believe is not yours to sell.
I’m still not sure how I feel about it. I was forced to confront the question a few days ago, and I’ve been in a holding pattern since then. I’m not so much concerned about the lawsuits as getting the big question right. Ethics has a funny way of sneaking up on you in the long run.
At the very least, be prepared for a lengthy, grisly smear campaign. Two people wrote stories insinuating I somehow profited off of books3. Your crew will be profiting with intent.
One reason I’ve considered bowing out of ML is that I’d rather not be verbally spit on for the rest of eternity. It’s nice to have the support of colleagues, but unless you really care solely about money, you’ll be classified in the same bucket as Zuck: widely respected if successful by the people that matter, but never able to hold a normal relationship again. Most people probably prefer that tradeoff, but go into this with eyes wide open: you will be despised.
The way out is to help train a model on Creative Commons images. I don’t know if there’s enough data. And it’s certainly a bad idea to wait; your only chance of dominating this market is to iterate quickly, which means using existing models. But at this point, lawsuits are table stakes. You need to be prepared for when they happen, not if.
Also, join me in at least one sleepless night pondering the ethics of profiting off of this. Normally people only mention this as a social signal, not because they actually care. But if you sit down and think it through from first principles, the ethics (legality aside) is not at all clear. This also isn’t a case of a Snowmaker startup (https://x.com/snowmaker/status/1696026604030595497?s=61&t=jQ...); he notes that this only works when you have the general population on your side. All of those examples are of startups violating laws that people felt were dumb. Whereas I can tell you from firsthand trauma that copyright enthusiasts are religiously fanatical. Worse, they might be on the right side of the ethics question.
This was the first time in my life that a startup’s ethics gave me pause. Not just yours, but everyone who’s building creative tools off of these models. You’ll face a stiff headwind. Valve, for example, won’t approve any game containing any work generated by your tools. And everyone else is trying to build their own moat.
I’m not saying you should give up. I’m saying, really sit down and go through the mental exercise of deciding if this is a battle you want to fight for at least three years legally and five years socially. I’m happy to provide examples of the type of abuse you and your team will face, ranging from “sticks and stones”-level insults to people directly calling for criminal liability (jail time). The latter is exceedingly unlikely, but being ostracized by the general public is not.
At the very least, you’ll need to have a solid answer prepared if you start hiring people and candidates ask for your stance. This comment is as much for your team as for you as an investor, since all of you will face these questions together.
Hi sillysaurusx. I'd love to get into contact with you. As someone who contributes datasets to academic NLP, you have a unique and interesting perspective on these questions.
I can't reach out to you via twitter as I am not a verified member, so I will reach out via email.
It's "out of favor" because it completely failed as a research program. Let's not equivocate about this; it's nice to understand heuristic search, and there was a time when things like compilation were poorly understood enough to seem like AI. But as a path towards machines that succeed at cognitive tasks, these approaches are like climbing taller and taller trees in the hopes of getting to the moon.
Slight correction: HipHop for PHP was cleanroom, including rewriting large families of native extensions to work with its C++ runtime, although it eventually developed workalikes for the PHP dev headers to ease development. Source: I worked on HHVM, its JIT successor that initially shared its source tree and runtime.
Facebook developers seem to have a surprising amount of free time to go around reinventing things that are not obviously social network features. (Or to have had it in the 2010s, at least.)
Note WhatsApp had 35 employees when they were acquired and Instagram had 13. At that size you need to be productive at managing servers but you're probably not thinking how great it'd be to have a "whole new programming language and source control system" team.
WhatsApp and Instagram at the point of acquisition were simpler than Facebook is (and was), or even compared to what it is now. Once you scale you start to need a lot of engineers to help keep things standing up and everyone on the same page.
WhatsApp had like half a billion monthly active users when they were acquired, that could be considered fairly large scale, no? But I agree with your point in general.
Yes, but WhatsApp is a point-to-point communication tool with mostly small groups. Each individual message doesn't need to be distributed to a potentially very large audience like in Facebook, making processing and coordination of nodes smaller and simpler.
Note though that other large projects with similar scaling-git problems tended to just write wrapper tools to work around it, see how Chromium and Android do it.
The idea that "virtualization" began with Xen in 2004 is rather difficult to read as an early VMware employee. Before QEMU independently discovered it, VMware was JIT'ing unrestricted x86 to a safe x86 subset from 1999 on[1]. Hardware support for trap-and-emulate virtualization came to the market in the mid-'aughts, after VMware had proven the market demand for it.
When I was at VMware in the 'aughts, VESA often saved us as an unaccelerated option for guests that didn't yet have a driver for our virtual display. Was there really no VESA driver for the 9x family? Or does QEMU's BIOS not do it or something?
BearWindows and SciTech Display Doctor are the two VESA drivers which come to mind for Windows 9x. If I remember correctly Bear will also work in 3.x.
I remember these being somewhat frustrating to get working with VirtualBox. I never tried with QEMU.
I've personally moved away from virtualization for older OSes and to emulation. It just seems much easier to deal with even if it's more resource intensive.