As a longtime NeXTSTEP user, I still remember the first time I perused the filesystem of Mac OS X 10.0 (Cheetah) in 2001. And I distinctly remember thinking to myself that Apple basically took NeXTSTEP and slapped their own window manager and app framework on it only. The filesystem and toolset were nearly identical, and I even found several config files (I don't remember which ones) that still had NeXTSTEP listed in the comments.
As I moved into iOS app development in the 2010s, NSObject (NS = NeXTSTEP) remained a constant reminder of that same lineage.
> Apple basically took NeXTSTEP and slapped their own window manager and app framework on it only
Yup, that's precisely it, and Apple made no secret of this. As early as the mid-'90s, Apple knew their OS was a dead end and they needed to start over with a new OS as the base. Their in-house effort (Copland) was becoming an obvious failure, and it was apparent they needed outside help. At the time a lot of people thought BeOS was going to be it, but that deal fell through once it became clear that having Steve come back was a priority, and buying NeXT came with him. But it was always gonna be the case that the OS would be totally replaced and only surface-level UI conventions would be kept from the classic Mac OS.
I think Steve ended up being a better salesperson than Jean-Louis Gassée. And to be fair, for all the incredible effort behind it, I don't think that BeOS was as mature as NextStep.
As I recall, there were some early UI efforts that essentially copied the classic Mac OS feel (the Mac OS 8/Copland look) onto NextStep, but they were dropped in favor of OS X's Aqua design (which took inspiration from the iMac design).
Mac OS X Server (Rhapsody) used the classic Mac OS look over a NextStep base. You can find x86 versions online which work in a VM, for nostalgia and research.
To be a bit more specific, Rhapsody DR2 (1998) was the last x86 build in the Mac OS lineage until the x86 Developer Transition Kit arrived with Mac OS X Tiger 10.4.1 in 2005, and Rhapsody DR1 was the first, and only other, pre-Tiger x86 release, though unreleased x86 ports of earlier, pre-NEXTSTEP Mac OS releases were demonstrated internally[1].
Additionally, NeXT shipped several x86 releases of NEXTSTEP and its successor OpenStep (NEXTSTEP 3.1–3.3, 1993–1995; OpenStep 4.0–4.2, 1996–1997) prior to the acquisition, all of which also run under virtualization with some effort. That said, I'd personally recommend the Previous emulator[2] for running older NEXTSTEP builds: it runs reasonably fast on modern hardware, and quite a bit of historically interesting NEXTSTEP software was never released for versions of NEXTSTEP running on non-NeXT hardware (Mathematica 1.0, Lotus Improv, WordPerfect, and the original CERN WorldWideWeb browser come to mind, though source ports of the latter exist).
[1] https://en.wikipedia.org/wiki/Star_Trek_project
[2] https://www.nextcomputers.org/forums/index.php?topic=2642.17...
Do command line programs built on Rhapsody DR2 (x86) run on Tiger (x86) or vice versa?
But holy shit were those BeBoxes sexy as hell. As a teenager, seeing them for the first time, in an era of soulless, beige, badly named Macs like the "Performa", made a lasting impression that I still remember 30 years later.
Those twin vertically arranged CPU usage LEDs running up the sides of the case, pulsing as the box churned through multiple windows of buttery-smooth video playback, while the operator simultaneously read and wrote to the disk, accessed the network, and manipulated the filesystem (without ever stuttering, dropping frames, or beachballing) were really quite something at the time. BeOS could multitask in a way nobody else was doing, and macOS still cannot match it.
Still think it would have been interesting to not let some of that tech die on the vine.
and they had a "geek port" -- what nerd wouldn't love a machine with a GEEK port?
It was a GPIO port you could wire stuff into; I think it even had ADCs and DACs for analog signals. The case-mounted LED CPU meter bars were sick - like an '80s amplifier VU meter. Amazing bit of kit, those machines - true hacker boxen.
At the time Apple looked at BeOS, the print subsystem was... far from complete. There were probably many deal killers with Be, but this was the one I remember people kept repeating.
> I don't think that BeOS was as mature as NextStep.
It certainly was not. It was single user only - no way to log in with different users, no accounts, just a single default user.
Networking performance was awful in R3 and R4. In R5 they replaced the user-space network stack with BONE, an in-kernel IP stack that promised better performance. By then it was too late, as the Palm sale was around the corner. I remember talking to JBQ (Jean-Baptiste Queru) on IRC about this and how it affected their microkernel design, and JBQ stated that their claim of being a microkernel was for marketing purposes only.
It was a multimedia first system designed by multimedia geeks. Fun for its time and had a lot of great ideas.
Nitpick: BONE was part of R5.1 "Dano", which got leaked but never publicly released.
>I don't think that BeOS was as mature as NextStep
BeOS was interesting but also kind of a joke. I remember trying it out and receiving error messages written as unhelpful haikus. I could only think that this was not a serious OS, and that was the end of BeOS for me. Here are a few:
"Errors have occurred. We won't tell you where or why. Lazy programmers."
"The code was willing. It considered your request, But the chips were weak."
"Cables have been cut. Southwest of Northeast somewhere. We are not amused."
As a programmer at the time, I was confounded by these awful error messages. It just made the whole thing seem like a joke. I had no time for this. I'd never consider writing software for a platform that obscured the error behind a haiku.
Is it really better to print “Unhandled exception” or “Internal server error”?
Sometimes there just isn’t more context available to an error. This was even more the case 30 years ago, when errors were often nothing more than a numeric code — and then you look it up and it’s just some “unspecified data error.”
BeOS tried to make light of that quandary.
I think it's probably the case that the moment the computer throws an error, potentially having lost some work-in-progress data or caused some other "funny" consequence, is exactly the moment the user is least receptive to jokes.
Just because the computer doesn't know why it crashed doesn't mean a human won't. At least print out a stack trace, or something, then print the haiku. Give someone a chance at figuring out what went wrong. Maybe there was a developer mode I wasn't aware of, but at least Windows at the time (3.1 and 95) was capable of giving the user some indication of what part of the software crashed, whether or not the user knew what to do with it. My first impression of the OS is what led me to avoid it; it's that simple. I don't know why other people didn't adopt it, and I can't speak for them.
Actually didn't know that about BeOS, and suddenly the name of the Haiku Project makes a lot more sense to me.
I thought those haiku error messages were only in NetPositive (the BeOS web browser).
> their own window manager and app framework on it only
Window manager in this case including Quartz/Core Graphics (replacing Display PostScript) as well as a complete UI facelift/transplant that turned the NeXTSTEP file browser into the Mac OS X Finder (even if it imperfectly copied the spatial orientation of the classic Finder).
App framework being not only a Mac flavor of AppKit/Foundation/NS* (Cocoa), but also a virtualized legacy Mac OS layer (Classic) as well as a hybrid API (Carbon) that worked on both Mac OS 9 and OS X, providing a fairly complete Mac app transition strategy (Classic -> Carbon -> Cocoa).
Cocoa also worked with the (later abandoned) Java bridge as well as Objective-C++ and OSA/AppleScript/etc.
The BeOS guy asked for too much, from what I recall. I think the BeOS deal fell through before Mr. Jobs quietly returned in 1997.
As I recall, Be was asking on the order of $80 million; NeXT was acquired for about $400 million.
I found this reference: roughly an $80 million valuation, but Be wanted upwards of $200 million. "In 1996, Apple Computer decided to abandon Copland, the project to rewrite and modernize the Macintosh operating system. BeOS had many of the features Apple sought, and around Christmas time they offered to buy Be for $120 million, later raising their bid to $200 million. However, despite estimates of Be's total worth at approximately $80 million,[citation needed] Gassée held out for $275 million, and Apple balked. In a surprise move, Apple went on to purchase NeXT, the company their former co-founder Steve Jobs had earlier left Apple to found, for $429 million, with the high price justified by Apple getting Jobs and his NeXT engineers in tow. NeXTSTEP was used as the basis for their new operating system, Mac OS X."
The high price was also justified by the success of WebObjects at the time, which was seen as a potential fuel for an IPO for NeXT, even though WebObjects was not what Apple was buying it for. This Computer History Museum article goes into that angle in detail: https://computerhistory.org/blog/next-steve-jobs-dot-com-ipo...
“BeOS did not have printing” was the insult thrown around at the time.
I don’t remember the exact number, but BeOS was too incomplete at the time to spend what they were asking, and maybe to purchase at all. There was no way to print documents, which still mattered a lot for a desktop OS in 1996. It needed a lot of work.
Now, in retrospect, Apple had time; Mac OS X wasn’t ready for the mainstream until 2003-2004.
To be fair, printing in 1995-6 was a huge can of worms and hell on earth.
Send PostScript, done. Today it's figuring out which driver will properly rasterize exotic things like ligatures, because we decided that throwing a real CPU in the printer was a mistake.
Unless you were using anything from that tiny obscure Hewlett Packard operation and didn’t want to shell out for a module. HP never promoted Postscript. It was far from universal as an embedded PDL.
> that throwing a real CPU in the printer was a mistake.
The CPU in any decently modern printer is still many times more powerful than what was in an original LaserWriter (30 ppm and up needs power, even if it's simple transforms and not running wankery). It's not just about CPU power: modern laser printers still support page description and vector languages like PCL and PDF (and many have some half-assed, often buggy PS "compatibility", e.g. BRScript). The bigger mistake is using a general-purpose Turing tarpit that is "powerful" rather than a true high-level, built-for-purpose PDL. PostScript just isn't very good and was always a hack.
> Send PostScript, done.
The other problem, of course, being that raw PostScript as a target for most applications is not at all elegant and is ironically too low-level. So even if you wanted PostScript, an OS that didn't provide something more useful to most applications was missing core functionality. The jwz quote about regexes applies just as well.
> Unless you were using anything from that tiny obscure Hewlett Packard operation and didn’t want to shell out for a module. HP never promoted Postscript. It was far from universal as an embedded PDL.
It's why HP had a series of printers marketed explicitly for Macintosh use, whose difference from the otherwise identical model was that the PostScript interpreter module was included as standard, as the Mac didn't really support non-PostScript printers with anything resembling usability.
There was a time when the fastest 68k processor Apple shipped was in the LaserWriter (12 MHz, versus 8 MHz in the Mac).
I seem to recall a story of someone internal to Apple figuring out how to run a compiler or other batch processing system on the LaserWriter as a faster quasi-coprocessor attached to a Mac.
I remember that time. I was taking a graduate level intro to graphics class and we had an assignment to write a ray-tracer and turn in a printout of the final image along with a printout of the source code. The instructor allowed any programming language, so I used a different one for each assignment.
For the ray-tracing assignment I used PostScript: the PS image operator calls a procedure to return each sample in the image, and the transform matrix made scaling the image easy.
My code was two pages long, up from one page because of the many comments. I think the next shortest was 15 pages. It also ran faster than most others because of the faster processor.
Don Lancaster (outside of Apple) did that. In fact, he ignored the Mac and connected a LaserWriter directly to his Apple II, and programmed in straight PostScript. Used that language the rest of his life. All the PDFs on his site were hand-crafted.
Oh, I knew that was coming. This interesting but ancient piece of trivia just illustrates how slow micros were back then. It's not like printers don't have more, and multiple, CPUs today. Not like whatever poorly written, outsourced-to-India "managed" shit and other features are going to run on a potato. Whatever is driving the color touch LCD on even the Walmart econoshit is many times more powerful than that 12 MHz 68k.
I still have no idea what the GP's point was. You can just as easily run a rasterizer on the host; if it has bugs it has bugs, and where it lives doesn't matter.
Further rose-tinting, of course: the LaserWriter was $20k, and it'd be a decade-plus before a monochrome laser dropped under $1k. I'm gonna guess the Canon with the shitty drivers is 10x cheaper and faster.
The amount of data transfer for 300x300 DPI full-page images is high even now; most printers still render fonts and such internally.
It really isn't that much, though. A 1200x1200 DPI monochrome image on Letter-size paper (not even considering margins) is on the order of 16 MiB uncompressed. And bitmaps of text and line art compress down heavily (and you can use a bitmap atlas or a prerendered bitmap font technique as well). It's also usually easier to upgrade the RAM in a printer than its crappy firmware.
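For anyone who wants to sanity-check that figure, the arithmetic is just page area times resolution squared at one bit per pixel. A throwaway sketch (nothing more than back-of-the-envelope math):

    // page_size.m - compile with: clang -framework Foundation page_size.m -o page_size
    // Back-of-the-envelope check of the ~16 MiB figure above.
    #import <Foundation/Foundation.h>

    int main(void) {
        double pixels = (8.5 * 1200.0) * (11.0 * 1200.0);  // Letter at 1200x1200 dpi
        double bytes  = pixels / 8.0;                       // monochrome: 1 bit per pixel
        NSLog(@"%.1f MiB uncompressed", bytes / (1024.0 * 1024.0));  // prints ~16.1 MiB
        return 0;
    }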
> most printers still render fonts and such internally.
Many printers have some scalable font rendering capability, but it is often not usable in practice for high fidelity. You absolutely can raster on the host to either a bitmap font, or make use of the PDL's native compression. Most lower end printers (which is pretty much the bulk of what is sold) do not have the capability to render arbitrary TrueType fonts, for instance. A consumer/SOHO level Canon laser using UFRII is going to rely on the host for rastering arbitrary fonts.
I have a modern Canon laser printer that does not properly implement ligatures because of obscure driver issues. What I see on the screen is not what is printed.
Text layout is hard and unfortunately drivers and firmware are often buggy (and as printing is lower and lower margin that doesn’t get better). But just throwing a weird language engine in doesn’t actually solve any of those problems.
Text layout doesn't need to be done when the source is a PDF. Make printers do the PDF and let Adobe control trademark access via conformity tests and life is good.
The biggest errors I’ve found are when the PDF says it’s using FontX or whatever, the printer claims to support the same version, and it’s subtly different.
The PDF tool assumes it doesn't have to send the full font, and the document gets garbled. Printing as an image sometimes gets around this.
> Text layout doesn't need to be done when the source is a PDF.
PDF isn't entirely a panacea, since it's complex enough that printing any random PDF isn't trivial at all; but sure, close enough. Before, though, you were talking about PostScript.
> Make printers do the PDF and let Adobe control trademark access via conformity tests and life is good.
PDF printers aren't all that uncommon. So why doesn't your Canon do this? These aren't technical issues at all. This is an economic/financial problem, as mentioned (doing business with Adobe!). It isn't about part cost; a CPU 100x more powerful than the one in the LaserWriter costs nothing.
'96 was still a few years before all printers natively supported PostScript - HPs had their own vector graphics language, for example.
Many printers in common use were still “one font wonders” and that resulted in lots of fun.
ISTR those of us using HP or Apple printers were generally in pretty good shape at the time. Can’t vouch for other brands.
Apple didn't support anything other than PostScript natively at the time, so their printers came with postscript support. HP made special models for use with Macs that shipped with PostScript included.
I wonder why exactly Copland went off the rails. Do we have anyone from the Copland team here on HN who can share their view?
One reason was that they tried to have both a preemptive OS with memory protection and backwards compatibility with existing OS extensions, which frequently patched OS calls left and right, read and wrote memory across process boundaries, etc.
For example, there was no support for screen savers in classic Mac OS; screen savers such as After Dark hacked that in by patching various OS calls. There was no support for Adobe Type 1 fonts; Adobe Type Manager hacked that in. There was no support for showing a row of extension icons on-screen as they loaded; an informal protocol was created by various extension writers to support that.
I know there was more to it than that, but it seems to me that a better-managed Apple in some alternate history might have just thrown all the extensions "support" out, shipped, and said there would be a better-documented API for that kind of crap in the future. Probably also not wasted time on OpenDoc, because there was no shot that was ever going anywhere if you considered it from Microsoft's, QuarkXPress's, and Adobe's perspectives.
Which is basically one of the reasons Mac OS X actually ended up shipping. You got a Classic VM in an OS that otherwise wasn't afraid of making breaking changes, but you had a sliding scale from Classic to Carbon to Cocoa to fix your software eventually. Also, OpenDoc got thrown out of consideration very early in the housecleaning process at Apple.
It's incredible that it only crashed once a day!
Even Mac OS X early on had problems.
I actually maintained a "Crash Log" at the entrance to my cubicle, recording how many times a given application had crashed that day/week/month/year and ultimately extended it to record even the preceding years.
Quark XPress 6 required a colour-coding that reached into the hundreds for a given year (my recovery folder got cleared out once a week and had hundreds of GBs of files in it most Fridays), and Adobe Acrobat wasn't far behind (but with no recovery folder), while the OS itself was quite reliable --- work done using TeXShop and other Cocoa apps rarely crashed or had problems.
It was a big change from NeXTstep, where I can recall only one software crash and two hardware faults (SCSI) during college. That was the high-water mark of my GUI experience: a NeXT Cube (w/ a Wacom ArtZ and a scanner) and an NCR-3125 (running Go Corp.'s PenPoint) and an Apple Newton MessagePad, all connected together over a serial interface, used to write papers, take notes, and do graphic design work.
From the Wikipedia page, it sounds like they were just adding feature after feature. As an aside, it is very interesting that Apple twice attempted to replace classic Mac OS with a microkernel-based OS, but the replacement only succeeded when they figured out how to ship NeXT with fresh paint and a slightly improved window manager; it really is just NeXT if you compare the two. (A/UX: https://en.wikipedia.org/wiki/A/UX, Copland: https://en.wikipedia.org/wiki/Copland_(operating_system))
I think “slightly improved” and “fresh paint” undersells the work Apple did. Replacing the NeXTstep Display PostScript server wasn’t a small thing!
This was the second go at it; they had already tried once with A/UX. Even though it wasn't as well integrated, there was nonetheless that previous experience.
As I noted in another comment, it was a remarkable accomplishment that Mac OS X was reasonably recognizable to Mac users and ran the same apps (via Classic and Carbon), all while having a completely different core OS under the hood (though as another poster noted they had prior experience building Mac-on-Unix environments.)
Apple has pulled off a number of surprisingly successful "brain transplant" transitions for the Mac platform, including moving from 68K to PowerPC to Intel to ARM. In each case, the user experience remained largely the same and old apps continued to run via emulation.
With PDF? Afaik it's a subset of PS so their main work would be in reducing the amount of PS features.
Mike Paquette wrote extensively about the effort involved in replacing Display PostScript (which NeXT mostly wrote) with "Display PDF", as Quartz is often called.
See postings to usenet on comp.sys.next.*
I'd love a link or two if you can dig them up.
It would have been in a thread like this one:
https://groups.google.com/g/comp.sys.mac.advocacy/c/OLc8T6KO...
You missed MkLinux, but of the three (Copland, A/UX, and MkLinux), only one of those was ever slated to be the future of the Macintosh. A/UX was very interesting, but it was also encumbered with a fairly expensive UNIX license and was never really a consideration as a replacement for the existing Macintosh System software. Neither was MkLinux.
I’d love to know more about its design too
A bunch of documentation was posted here, although I’m not sure the links still work (might need to go through archive.org)
Apple published a book about it: https://dl.acm.org/doi/10.5555/524838
Yes. Focus is about saying “no.” None of the Copland management or marketing people ever said “no”. So instead of just porting the Mac toolbox to C and a microkernel and shipping that as a first step, it had to have a new filesystem API (so abstract that it will support non-hierarchical filesystems!), and a new OO system for handling text strings, and a new UI toolkit, a new help system, oh, and an OpenDoc-based Finder… A lot of potentially good ideas, but you simply cannot do that all at once.
It wasn’t actually a completely lost cause. They could have shipped it in six months if they’d just simplified the damn thing to a native UI toolbox on top of a microkernel, API additions cut to the bone. (One of the problems with that plan, though, is that product marketing wants all the features you can cram in, so you have to be willing to tell them “no” too.)
Anyway, Gil Amelio and Ellen Hancock also didn’t know how to manage software projects, so instead of fixing what they had, they decided to buy outside tech. Which means we got the iPod, the iPhone, etc. But, in context, they were total dumbfucks; they could have gotten there with what they had.
Gil took over as CEO two years after the Copland project began under Michael Spindler, and turned things around enough that Mac OS 8 shipped a few weeks after he was ousted by the board (after Jobs broke a gentleman's agreement he had with Amelio and sold his 1.5M shares, tanking the share price).
Gil Amelio was not a great CEO by any measure, but he deserves credit for helping save Apple. He uttered the first "no": "No, we are not going to ship this Copland thing; it's hopeless. We either have to buy another OS and build from that, or sell the company, because we're out of time."
That is a bold claim. If they did not know how to manage software projects, how could they have pulled that off? For a technical solution to exist you need management know-how, financial viability, or at least a market that lets you play it out.
That is why, even if you were right, without Steve Jobs there would be no Apple.
The world needs a proper mix of the important things. Communism always falls not because the ideal is wrong but because it fails at the human level. You need a mix. A proper mix.
They even “buried” Mac OS 9 in a WWDC keynote.
Steve Jobs coming back was not really a deciding factor. In retrospect Steve Jobs was essential to Apple's success after taking over, but when the acquisition happened, there was little reason to expect that. The conventional wisdom at the time was that Steve Jobs did a good job founding Apple and running the Macintosh project, but that he wasn't fit to run a company the size of Apple. NeXT, if anything, vindicated that impression with a string of high-profile product flops. At the time of the acquisition, Apple was dying, and NeXT had spent over a decade and billions of dollars growing into a company that could, at best, hope to be acquired by a dying Apple.
The choice of NeXT over Be mostly came down to NeXTStep being a genuinely better operating system than BeOS in many ways, as well as Gassée overestimating his own negotiating position. Steve Jobs managed to take over the company, but this was clearly not the intended outcome, because almost all of the executives involved in the decision to acquire NeXT were fired in the process, and after Jobs took over he also forced almost the entire board to resign.
Jobs also (in)famously sold all of the AAPL stock that he received in the NeXT acquisition, saying that he had no faith in Apple's leadership at the time.
In retrospect, this sale cost Jobs a ton of money. Of course he ended up doing fine, mostly because of the Pixar acquisition by Disney.
Yes, this was one of the moves in his coup against Amelio, and part of what led to Amelio being fired by the board, at which point Steve Jobs took over on an "interim" basis and promised to search for a new permanent CEO.
Not quite. He sold his shares in 1985. I think technically it was all except one so he could still get investor reports, but that last part might be apocryphal.
https://www.cultofmac.com/news/today-in-apple-history-steve-...
You're looking at the wrong year. From 1997: https://www.sfgate.com/business/article/steve-jobs-confirms-...
Oh, you know what? Good point. I forgot that the NeXT sale included a transfer of stock and wasn’t even thinking about it. I stand corrected!
NS stood for NeXT and Sun. NX was the original prefix before OpenSTEP.
EDIT: I might be wrong about the first part, see my comment below.
According to Wikipedia, NS stood for NeXTSTEP: https://en.wikipedia.org/wiki/Foundation_Kit#:~:text=This%20....
I do remember people in my circles thinking it stood for NeXT Software (which was likely a more suitable name for its use at the time).
This is what I’ve heard. If true it’s funny as it effectively eliminates Sun’s inclusion in the acronym because everyone assumes it’s NextStep.
Nope. NeXT switching from NX to NS predates Sun's involvement.
Uh, that‘s interesting! Do you have a source for this (or is it firsthand knowledge?)
I have an original "NeXTstep Concepts" manual with the NX prefix. I can't remember where I heard the "NeXT and Sun" explanation for the first time, but a Google search shows that the prefix change could've actually occurred before the collaboration with Sun. It could stand for NeXT Software, as it's closer to the transition to a software company.
The "NS" prefix was introduced with OpenStep in 1994[1]. OpenStep was a collaboration between NeXT and Sun[2]. So the S in "NS" referring to Sun is certainly plausible!
[1] https://developer.apple.com/library/archive/documentation/Co...
A person that was involved with the project claims the prefix predates Sun's involvement [1].
> Totally wrong. The NS prefix predated Sun signing on to implement the OpenStep spec by quite a bit. It was either NS for “NeXTStep” or “New System”. None of us can remember which, though, but definitely not NeXT-Sun.
Per bbum and others, NS long predated Sun getting involved.
I first read that in a Fabien Sanglard book, probably Game Engine Black Book: DOOM?
I remember the Unix-ness was a big part of OS X’s nerd popularity. People were talking about real Unix terminals, for example.
Later, Windows also aimed for the same thing with their new console app and Linux support. Yet macOS has remained the same. The Terminal app feels essentially unchanged and there's no good app package service (brew and the like are third party and can mess up your system).
Even Xcode is, well… look how extensions were restricted.
Modern macOS feels boring, but also not aimed at developers.
Back in the early 2000s it was a top choice if you wanted some kind of unixy box with a polished GUI desktop that "just worked", especially if you wanted a laptop. BSD and Linux were fine, but as desktop OSes they were a very different experience from today and took way more tinkering, even on a desktop PC, as anyone who had to write their own X11 config will tell you. Today, installing a Linux desktop distro is so easy and hardware compatibility is so good that the tables have turned. Also, if you are the type of user that wants a big DE (no judgement), the Linux DEs today are far more polished; people still complain, but if you go back in time it was a mess. These days macOS seems extremely restrictive and awkward by comparison: on the one hand a huge chunk of the userland got stuck in time, and on the other Apple has become more and more hostile to any kind of change or customisation on the more unixy side of the system.
Sun had an agreement with Toshiba for Solaris laptops, but they were rather pricey.
UNIX is stuck in time; hardly anything has improved beyond file systems and small API additions, and that is what macOS is measured against: POSIX certification.
Note that the only standard UNIX UI is CDE, and anything 3D isn't part of POSIX.
ZFS, BcacheFS, HammerFS... I think OpenBSD will have a better FS soon.
On modern FSes, the Plan 9/9front ones are pretty much ahead of almost everything; but Plan 9 is a Unix 2.0, it went further. On 3D, forget POSIX: OpenGL was the de facto API and now it's Vulkan, and the most common multimedia middleware API is SDL2.
Yeah, but none of that is UNIX(tm).
While IRIS GL was born on IRIX, its successor OpenGL was placed under ARB stewardship, which after the Longs Peak disaster became Khronos.
Vulkan only exists thanks to AMD offering Mantle to Khronos (an API designed originally for game consoles, very much not UNIX), and had it not been for AMD, Khronos would still be thinking about what OpenGL vNext was supposed to look like.
SDL also has very little to do with UNIX history, as it was created originally to port games from Windows to Mac OS (not OS X) and BeOS.
Linux isn't too bad on desktops these days (at least if you've got well-suited hardware), but laptops are still hit or miss. Even laptops with well-supported hardware running something like powertop often have middling battery life, which isn't helped by the fact that on many distros things like H.264 hardware acceleration aren't enabled in browsers out of the box.
System76 is great and all but we really, really need a vertically integrated “Apple” of Linux laptops that goes the extra mile to dial in everything to perfection.
I don’t think much has changed, MacOS is still a good desktop Unix for development that is simple and just works, and Linux still lacks good hardware support, especially on laptops. Although in truth most of my MacOS development is actually in Linux containers.
I use macOS and Pop!_OS as my two main systems and both work really well. I got over the bleeding-edge Linuxism years ago as my 20s passed into obscurity. Both are respectable systems, and my need to tweak and adjust my setup has gone by the wayside. These days I just want something that works, and both work fine for the jobs I need of them and can run a couple of weeks or more with no crashes or need to reboot until the next security updates come out. Linux does have limited hardware support, but it's easy to remedy if you do a little research; macOS is naturally limited in hardware selection. Both are easily worked around.
> The Terminal app feels essentially unchanged
Is this supposed to be a bad thing?! It's a rock-solid workhorse. If they changed it I would stop trusting macOS to be any better than the linux flavor of the month
Good point, but:
* I'd prefer better font rendering
* I'd like to move the cursor backwards and forwards in long commands easier, maybe even with the mouse (!). Use case is, say, a long curl command and I press up-arrow to repeat it, but want to tweak something. So I press and hold left arrow and wait.
* Semi-transparent background was cool in the 2000s; these days everything is blurred like it's Vista (not a criticism, Windows does it too, it's just UI fashion.) A terminal background is a place where that might be genuinely useful.
* This may be the shell, but when I reboot and restart I get my Terminal tabs reopened to the same working directories, but the history is identical between all of them. I run different commands in different folders for different reasons, and I want to be able to press up-arrow to get back the last command(s) per tab.
I am far from a Terminal, shell, or anything related expert. There may be solutions to the above. But third party terminal apps exist so there must be a market for some problems & solutions.
> * I'd like to move the cursor backwards and forwards in long commands easier, maybe even with the mouse (!). Use case is, say, a long curl command and I press up-arrow to repeat it, but want to tweak something. So I press and hold left arrow and wait.
Ctrl-A moves you to the beginning of the line (and Ctrl-E to the end of the line); Option-left-arrow moves you left by word, option-right-arrow right by word.
> I'd like to move the cursor backwards and forwards in long commands easier, maybe even with the mouse (!).
You can. Just hold down the option key and click wherever you want in the command.
>I'd like to move the cursor backwards and forwards in long commands easier, maybe even with the mouse
In Terminal.app you may alt-click to make the cursor jump to where you’ve clicked. Besides, I use alt-arrows to jump between words: I don’t remember whether that’s out of the box, though. In any case, you may configure the relevant codes in the Keyboard section of the preferences.
What’s wrong with Terminal.app’s font rendering? Looks as good as any other mac app to my eyes, at least on HiDPI displays.
> I'd like to move the cursor backwards and forwards in long commands easier
I believe basic GNU readline semantics are followed here, and in most text-editing fields elsewhere, with emacs keybindings - Ctrl-A for start of line, Ctrl-E for end, Esc-F for forward word, etc. "Esc e" to open $EDITOR with the command line in a tmp file.
The Terminal seems like one of those things that management declares to be “done” and then nobody touches it for a decade, until somebody notices that there’s enough user outcry for a big overhaul. That’s what happened on Windows, where the console rotted for decades until there was finally enough attention to justify the (actually very nice!) new Windows Terminal. Hopefully there will be, because Terminal.app is indeed lagging far behind iTerm and other alternatives now.
I’ve tried iTerm several times and have never been able to understand the appeal. The most interesting feature to me is the Visor-style dropdown terminal but that feature doesn’t work half as well as the Visor haxie for Terminal.app did, somehow.
> because Terminal.app is indeed lagging far behind iTerm and other alternatives now.
iTerm always struck me as slow and bloated. I think it's just a matter of taste.
iTerm2 is so much better it's not even funny, but Terminal is fine and I don't recall it ever crashing on me, unlike WezTerm, which I tried for a while but which would, without fail, lock up the terminal every couple of days to the point that I'd have to kill the session. Luckily whatever I'm doing is usually running in tmux anyway.
I've never had brew mess up my system and I've been using it and macOS for over a decade... How can it mess up your system? I think one time brew itself got wedged on updates and I didn't bother to figure out what it was; I followed the instructions on how to properly blow it away and reinstalled with barely a hitch.
Brew winds up hogging an inappropriate amount of disk space on every system I’ve ever installed it on. I switched to nix recently and haven’t had that problem yet.
That being said I haven’t investigated and it could be user error. But brew can absolutely bork your shit
Having it cache a bunch of crap to your disk isn't exactly "borking your shit". Just delete it. There's probably some settings flag you can export in your shell .rc to control that behavior, but it's so not a problem that it hasn't been worth my time to go look up how to do it.
> I remember the Unix-ness was a big part of OS X’s nerd popularity.
> there’s no good app package service
It's called the App Store.
Not exactly...
The App Store installs what you would otherwise install through a .dmg or .pkg. That is, if you install, for example, Android Studio, Docker, and UTM, you will have three QEMU executables, one for each app.
Homebrew does quite a good job as a package manager for the Mac; however, it's far from how package managers work in Linux distros. For example, by running ``sudo pacman -Syu`` I upgrade everything that is installed, including the kernel, standard libraries, Python packages, language packages, manpages and so on. On a Mac, I have to upgrade the system through system updates, Homebrew packages through ``brew upgrade``, Python packages through pip, the App Store-installed stuff through the App Store, and the manually installed apps through whatever way they are upgraded.
> For example, by running ``sudo pacman -Syu`` I upgrade everything that is installed, including the kernel, standard libraries, Python packages, language packages, manpages and so on.
I actually view this as a liability. System package management should be distinct from userspace package management.
You can still use pip inside pyenv or similar. Pacman would install the system-wide stuff, so you don't need to worry about the libraries that a package you have installed is using.
Mentioning this classic XKCD: https://xkcd.com/1987/. It only made sense after using a Mac. While using Manjaro it was quite organized: only the pacman-installed libraries and the ones in my own projects. Now on a Mac I have the default Python, the several Python versions pulled in as dependencies by Homebrew, and so on.
> You can still use pip inside pyenv or similar.
That seems to be the worst of all possible worlds.
Perhaps not the best (that would maybe be something like Nix), but it's the opposite of what we have on the Mac.
I think my point was:
> Pacman would install the system-wide stuff
Most system-wide stuff is userspace. I was more referring to a divide between the OS core (kernel, drivers, init scripts) and userspace. Upgrading userspace should not incidentally leave the system unable to boot.
Agreed. Joining system and user packages leads to cases like "how did my python upgrade break yum".
It won't happen if the packages are well maintained. Note that if you want a different version for your project, you should use pyenv or similar and have finer control over its dependencies.
Using pyenv is separating system and user/project packages. You could take it one step further and use pkgsrc, Nix, or Homebrew. That way you can get a newer Redis/RabbitMQ/whatever, not just a newer Python.
You can use topgrade (https://github.com/topgrade-rs/topgrade) to update system, homebrew, npm, python, cargo with only one command. It also works on Linux and Windows.
Do developers use the app store? 99% of what I install on my computer isn't available through the app store. I just use it for Apple apps (Pages etc). Pretty much everything else is freely available and more fully featured outside the app store.
Plus, it's spammed with low-quality for-profit crapware—the iOSification of an otherwise fantastic platform
Yes, they publish their apps on the App Store and make money from customers.
As for Apple making life easier for developers by making their OS more like Linux, that is not good for the rest of their users, and these users are more important than developers. It's preferable that developers jump through some hoops, rather than making the OS worse for non-developers.
There's the third option of "not enshittifying the os but also not making it more like linux"
When's the last time you used the App Store to install a CLI utility or a shared library?
Usually Xcode.
xcode-select --install
That doesn’t install Xcode.
Which forces sandboxing and App Store regulations on everything on it.
There’s a really good reason the Mac App Store isn’t that big.
I don’t think you can call the Mac App Store “good”.
Sure, it's horrible. It only makes it easier than any other platform for developers to make money by selling their software to users. And it only makes it easy and secure for users to purchase and install software. Awful.
Apple decided they only care about porn editors. Basically. lol.
It's amazing that still today, you find NSStrings and NS prefixed stuff all over working code.
It's actually hard not to know anything about the old AppKit, as much as Apple would have you believe that it's all SwiftUI now.
Apple is pretty clear in its intention of making SwiftUI the blessed UI toolkit; however, they haven't deprecated AppKit or UIKit in any way and keep updating them, as they demonstrate at every WWDC with "What's new in AppKit" (e.g. for 2024: https://youtube.com/watch?v=McKYDogUICg).
They also provide means to mix-and-match AppKit and SwiftUI in both ways. In no way are they trying to "have you believe it's all SwiftUI now". It is simply the next generation of UI frameworks.
I feel that SwiftUI still has a ways to go. I like the ideas and philosophy, but the execution is still a work in progress.
First, they really need to beef up the docs.
Next, they need to stop punishing people for "leaving the lane." Not everyone wants their app to look and behave like a bundled Apple app.
I dislike Autolayout, and UIKit definitely has a lot of "old school" flavor, but with IB and UIKit, I can make an app that can do just about anything that I want.
I love autolayout. Learning how to lay stuff out has never felt so easy. Interface Builder felt confusing and arbitrary, like having to learn an entirely new language just to get basic behavior working correctly. Plus, it didn't compile to code but to a "nib" file you had to work with in abstruse and unintuitive ways. At least I can debug code; how the hell can I debug a nib? Most of the exposed functionality was difficult to google and assumed you knew what all the obscure icons (like springs and arrows and lines) meant. Very confusing and frustrating.
Meanwhile autolayout was very intuitive. Define your variables, define your constraints, and it just magically works.
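For anyone who hasn't written it by hand, a minimal, purely illustrative sketch of what "define your constraints" looks like in code, using Objective-C and UIKit's layout anchors (the class name and strings are invented for the example):

    // Illustrative only: pin a label below the safe area and center it horizontally.
    #import <UIKit/UIKit.h>

    @interface DemoViewController : UIViewController
    @end

    @implementation DemoViewController
    - (void)viewDidLoad {
        [super viewDidLoad];

        UILabel *title = [[UILabel alloc] init];
        title.text = @"Hello, Auto Layout";
        // Opt out of the legacy springs-and-struts translation...
        title.translatesAutoresizingMaskIntoConstraints = NO;
        [self.view addSubview:title];

        // ...then just state the constraints; the layout engine solves them.
        [NSLayoutConstraint activateConstraints:@[
            [title.topAnchor constraintEqualToAnchor:self.view.safeAreaLayoutGuide.topAnchor
                                            constant:16.0],
            [title.centerXAnchor constraintEqualToAnchor:self.view.centerXAnchor]
        ]];
    }
    @end

Rotate the device or resize the window and the label stays put; that is the "it just magically works" part.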
I sound like a maniac whenever I express my desire for Autolayout over HTML/CSS.
I will in no way deny the sheer power the latter/reach/etc has and I can see why people like it. Autolayout, while tricky to get at first, pretty much "just works" when I write it nowadays... and the older I get, the more annoying I find the cascade in CSS.
I generally use storyboards, but I do need to occasionally do programmatic stuff. That's a pain, but I can make it work.
Are you still using them? It seems like Apple is basically killing them…?
Yeah. The Interface Builder/Storyboard Editor is a slow-ass bug farm, but it works. They are still maintaining and improving it.
Apple may be pushing SwiftUI, but they aren’t stupid enough to kill off UIKit/AppKit.
I’ll lay odds that most AAA programs are still ObjC.
> First, they really need to beef up the docs.
This is a common refrain I hear w.r.t. Apple, and while I write very little native mobile code I have to agree. It's so sparse and rarely has more than what I could find out by hovering over a function/variable in my IDE.
Would it kill them to add a paragraph explaining why or how you use the thing? To add code samples? Must be too much for one of the most valuable companies in the world… What kills me is that this is actively hurting their own platforms. I can understand some of Apple's moves, but this one is so incredibly short-sighted.
Those of us "of a certain age," know that Apple used to have the best documentation in the industry. Many of the folks that wrote the docs, had more credentials than the ones writing the code.
I love when I find some fantastic doc in the Apple Doc “Archive” with some huge banner at the top reminding me that there is newer documentation available that has, at best, 10% of the content.
> What kills me is that this is actively hurting their own platforms
I don't think they care. Any useful application will be re-created by Apple and bundled with their OS eventually anyway. Outside devs aren't really necessary for anything but games.
> Not everyone wants their app to look and behave like a bundled Apple app.
As (now just) a user of macOS, that’s EXACTLY what I want - consistency, not the output of some branding or UX person trying to mark their territory.
As with everything, "it depends."
It's entirely possible to have unique branding, yet maintain consistency with standard Apple UX.
Just takes flexibility and compromise. Many designers aren't so good at that.
On a related note, did you ever try out Kai's Power Tools[0]? Now that was a nonstandard UX. Some folks really loved it, but it was an acquired taste.
Actually yes! And Kai's Power Goo as well - both of those I hated. I really want apps to fit into the platform, and not try to be special snowflakes, regardless of what the platform is. Admittedly defining what a "Windows app" looks like has got harder over the years, and there were some dark years on the Mac where people tried to rationalize when to use brushed metal and so forth...
The broader point is, I'm very glad Apple are making it harder for people to go off piste with regards to look and feel. Some people might be capable of doing a good job, but like with advertising, the well has been poisoned, and I assume that everyone doing it at all has poor intentions.
Games get a pass (though I don't personally play any on desktop computers).
Yeah just like the Home app…or the calculator that can’t be resized…or System Settings (oooh which way is the text going to run in this text box? Nobody knows!)… or Game Center, or Apple Music, or any of the other half assed first party apps on the platform that all use different layouts and UI components and behaviors.
The consistency you seek is gone, and has been for ages, if it ever even existed.
All of those are bad apps, which desperately need porting to proper native UI technology instead of Catalyst, in many cases.
The fact that some first party apps are bad is not an excuse for third parties (who are inherently less trustworthy) to run rampant though.
I’m achieving very custom looks in SwiftUI. What exactly do you find punishing?
Here's an example from a couple of weeks ago:
I wrote a small SwiftUI app, to display simple bar charts of data from our social app. Number of active users, vs. ones that haven't signed in, user acceptance/rejection rates, etc.
The SwiftUI Charts library is pretty good for this. I consume a CSV file into a dataframe, and add some extra computed properties, etc.
I wanted to add the ability to pinch to zoom, so users don't need to look at the entire dataset, and try to find individual days, etc.
I banged my head for a couple of days, getting it working the way I needed (TL;DR, I did. Just took a while). I can add a magnification gesture adornment, but the docs for it suck, the docs for the charts suck, the docs for the viewbuilder suck, the docs for the view suck, and even the docs for the dataframe suck. I basically had to find out what I needed, by trial and error, and examining the exposed protocols. Some of the stuff is crazy easy, like translating the pinch to a usable number, but then, I wanted to add a bit of custom handling, for the edges, and allowing the user to pan the chart, while also being able to individually inspect bars (I ended up giving up on that, and have an ugly scrubber scrollbar under the chart). It also doesn't work on the Mac, which sucks, because I do a lot of admin stuff on the Mac.
It's a long story, and I'm sure that you would find fault with my work (one of the things geeks love to do, is point out errors made by other geeks), but it shipped, it works, and we've been using it for days.
And, in the end, even though it exhibits nonstandard behavior, it looks exactly like a Settings App panel. That's fine. It's a backend dashboard, but I'd be upset if it was something we needed to expose to end-users of the app.
That's how they treated Carbon, until they didn't.
You can actually find some NX stuff in active headers as well
The preferences file for com.apple.applescript holds a bunch of <data> keys that are base64-encoded NeXT "NXStreamType" objects, or "typedstream"; the format would change based on the endianness of the system it was running on. The only docs I could find were in a header file from 1999 written by Bertrand Serlet.
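If you want to poke at those blobs yourself, Foundation's legacy NSUnarchiver still reads the typedstream format (it's the same encoding NSArchiver inherited from NeXT). A rough, exploratory sketch; the path is an assumption and decoding can easily fail for classes that no longer exist:

    // peek_typedstream.m - compile with: clang -framework Foundation peek_typedstream.m -o peek_typedstream
    // Exploratory only: try to decode the typedstream <data> blobs with the
    // legacy (deprecated) NSUnarchiver, which still reads the old NeXT format.
    #import <Foundation/Foundation.h>

    int main(void) {
        @autoreleasepool {
            NSString *path = [@"~/Library/Preferences/com.apple.applescript.plist"
                                 stringByExpandingTildeInPath];
            NSDictionary *prefs = [NSDictionary dictionaryWithContentsOfFile:path];
            [prefs enumerateKeysAndObjectsUsingBlock:^(id key, id value, BOOL *stop) {
                if (![value isKindOfClass:[NSData class]]) return;
                @try {
                    id obj = [NSUnarchiver unarchiveObjectWithData:value];
                    NSLog(@"%@ -> %@ (%@)", key, obj, [obj class]);
                } @catch (NSException *e) {
                    NSLog(@"%@ -> could not decode: %@", key, e.reason);
                }
            }];
        }
        return 0;
    }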
A tangent I know, but looking at those old screenshots really made me miss that era of OS X. The first versions of Aqua with pinstripes were a bit busy for my liking, but by the Mountain Lion time frame it was just lovely. Actual buttons! Soft gradients! Icons that had colour!
I am still very sad that the point we started getting high-DPI displays everywhere was about the same time we decided to throw away rich icons and detail in UIs in favour of abstract line art and white-on-white windows.
Maybe it was on purpose? Those fancy textures and icons are probably a lot more expensive to produce when they have to look good with 4x the pixels.
iOS 4 on an iPhone 4 and OS X whatever-it-was that was on the initial retina MacBook Pros are still very clear in my memory. Everything looked so good it made you want to use the device just for the hell of it.
It’s because the higher the resolution, the worse those kinds of design effects look. It’s why they’re not much used in print design and look quite tacky when they are.
At low resolutions you need quite heavy-handed effects to provide enough contrast between elements, but on better displays you can be much more subtle.
It’s also why fonts like Verdana, which were designed to be legible on low resolution displays, don’t look great in print and aren’t used much on retina interfaces.
The font point aside, which I do agree with, the rest of your comment sounds very subjective to me.
I too prefer more distinction between different UI elements than is fashionable in recent years - and, make no mistake, that’s all it is: fashion - and don’t see why higher resolutions preclude that. That’s not to say we have to ape what was done 10 or 15 years ago, but we can certainly take things in a more interesting and usable direction than we’ve chosen to do since around 2013.
I find myself clicking the wrong window by mistake a lot more frequently than I did back in the day due, I think, to current design trends.
I don't understand why the effects would look worse at higher resolution, or how they add contrast. The tacky part I do understand, as well as the point about screen fonts like Verdana.
To choose a relevant counter-example: the Macintosh System Software prior to version 7 was also very flat. System 7 to 7.5.5 introduced depth in a subtle and limited manner. It was only around System 7.6 that they started being heavy-handed, something that I always attributed to following the trends in other operating systems.
It’s because at higher resolutions you can see the flaws more easily.
They’d have to be implemented perfectly every time, otherwise the whole thing becomes a mess. Not everyone will bother to do this.
Also, often when creating designs, things look better the more you take away rather than add.
There are a couple of places where macOS still has Aqua-style icons. Not sure I should name them in case someone sees this and removes them, but... eh... set up a VPN but leave the credentials blank so you're prompted when you connect: that dialog has a beautiful icon.
It looks _just fine_ on a Retina display.
When Retina displays were introduced with the iPhone 4, gel-style iOS also looked just fine.
In print, we're interacting with paper and a fake reflective style looks odd. On a computer, we're interacting with glass and something reflective or high-detail feels very suitable. It matches the look to the medium.
> the point we started getting high-DPI displays everywhere was about the same time we decided to throw away rich icons and detail in UIs in favour of abstract line art and white-on-white windows.
I might have an alternative explanation.
I often think about something I saw, a long time ago, in one of those print magazines about house decoration, which also featured sample house blueprints. That particular issue had a blueprint for a house which would be built on a plot that already had a large boulder. Instead of removing the boulder, the house was built around it; it became part of the house, and guided its layout.
In the same way, the restrictions we had back then (lower display resolutions, reduced color palette, pointing device optional) helped guide the UI design. Once these restrictions were lifted, we lost that guidance.
> Maybe it was on purpose? Those fancy textures and icons are probably a lot more expensive to produce when they have to look good with 4x the pixels.
That's an interesting observation. If it was indeed on purpose, I wonder whether they were weighting it based on the effort on Apple's designers/developers/battery usage or the effort it would have drawn from 3rd party developers.
The stark whiteness of the "light mode" colors that have become standard since the rise of flat UI is, I believe, a greatly under-credited cause of the rise in popularity of dark mode. Modern light-mode UI is not fun to look at even at relatively low screen brightness, whereas the middle grays it replaced were reasonably easy on the eyes even at high brightness.
Nah, it's because of mobile.
All flat boxes is easier to do with 1,000+ different screen resolutions.
Ah yes, when it hit, one of the first things I did was to replace as much of the Denim and "AquaFresh Stripes" as possible with a flat gray texture.
I've also noticed that as screens got larger, screen real estate got cheaper, so UI design doesn't require as much effort, and it shows.
Long live Snow Leopard! It made my mac fly. A whole release dedicated to making Leopard better. It was amazing, peak macOS.
100% agree; if I could revive it to run it on modern arm hardware I would in a heartbeat.
Leopard was my first Mac OS, and Snow Leopard (obviously) my second, and boy was it great. I miss it so much...
I run an iMac G4 with 10.5 as a home music player. The strange thing is that it feels so easy to use. All the ingredients are the same in modern macOS but the feel is very different.
It’s hard to say why. Clarity in the UI is a big one (placement and interaction, not the theme, ie what we’d call UX today). But the look of the UI (colour, depth) really adds something too. Seeing a blue gel button sparks a sense of joy.
I disdain the modern UI, especially how it treats the scrollbar. On macOS, even with "Always Show Scrollbar" turned on, applications and web pages try their worst to hide scrollbars or make them unclickable. Check ChatGPT's web page, for example.
I don't know who the hell had the original idea to do that, but I'll curse in my head for eternity.
Not to mention the macOS scrollbars are ugly as sin now.
Yeah, but I'm mostly focused on the functionality: it's too narrow and sometimes not clickable when you are just off by a bit. The Windows one is ugly AF too but at least it's wider.
> by the Mountain Lion time frame it was just lovely. Actual buttons! Soft gradients! Icons that had colour!
You may be thinking of Tiger, because Apple already started removing color from Finder icons and such in Leopard.
Leopard also introduced a transparent menu bar and 3D Dock.
flat UI was the triumph of mediocre repeatability over humane user interaction
If only there were theming available to recreate those old looks and styles.
Copland, the failed OS whose place NeXT was acquired to fill, had themes.
You also had Kaleidoscope[0]. That had some crazy themes[1].
[0] https://www.macintoshrepository.org/1706-kaleidoscope
[1] https://web.archive.org/web/20191021204432/https://twitter.c...
Mac OS 8.5 and above technically had theming support as well (presumably salvaged from Copland), but Apple removed the themes from the final version of 8.5 and never released any of them. I'm not sure many third-party ones were made either; as another commenter notes, Kaleidoscope was already fairly established as the theming tool for classic Mac OS, and it worked with older stuff like System 7.
For me, seeing old OSes always reminds me of the bad stuff: slow CPUs, slow networking, slow disks, limited functionality.
Maybe I'm a bit too negative but for example when people romanticise stuff from the middle ages I can't help but think of how it must have smelled.
Those who romanticize the past tend to highlight the best points and gloss over the low points, which is likely better than dismissing it altogether.
It's also worth noting that some points mentioned either didn't matter as much, or aren't true in an absolute sense. Slow networking wasn't as much of an issue since computers as a whole didn't have the capacity to handle huge amounts of data, while limited functionality depends upon the software being used. On the last point, I find a lot of modern consumer applications far more limiting than older consumer applications.
Slow networking? Most people’s networking hardware is still only as fast as the best PowerMac you could buy over 20 years ago. Only in the last few years has 2.5GbE become noticeably common.
To me, Finder often seems slower now with SSDs and Apple silicon than it was with spinning drives and PPC. And the Mac boots slower!!
Apple's software today is poorly optimized. They're depending on hardware to do all the work.
Around 2010, I started learning Objective-C to be part of the whole native mobile development movement. What I didn’t know when getting into this was how much of a history lesson I would have to participate in to understand the background behind so many aspects of the language and the core frameworks.
I miss that era!
Yes, I had a similar experience with Objective-C. While I found it generally odd, it makes complete sense as a C/C++ alternative with reflection capabilities and a lean event loop. I disliked the memory management. The language hasn’t aged well but that doesn’t mean it lacked clever ideas for its time.
I think it's aged wonderfully.
Given its requirement to be a pure superset of C, it's a far better (IMHO) version of "C with objects" than C++ has ever managed. If you have a good understanding of C, ObjC is just:
- ok, do all that stuff you did and pretty much forget about[*] memory management, we'll do that for you. This is utterly liberating.
- you want dictionaries, arrays, sets etc ? Gotcha covered
- you want real natively-built-in UTF support ? Yep, got that.
- you can even dynamically create/inspect/alter classes at runtime, which makes things like plugins an absolute doddle.
- on top of all that, if you really need something to be in C or C++, just add it. Pure superset, remember, so your other-language code (or 3rd party libraries) will link just fine.
What I never understood was why people (and to be clear, I am not aiming this at the parent post) would take a look at the [...] syntax and immediately move on. It reminds me of people who board an aeroplane and get whizzed to another continent at enormous speeds, elevation and concomitant pressure-differences and then complain about the in-service drinks. The drinks!
I'm an unabashed fan of ObjC - it pretty much hits the perfect sweet spot between the low-level access that C provides and the over-engineered bloat that C++ ended up as. There's a really minimal extra cognitive load (over C) in ObjC for all the extra facilities the language provides (and in some cases, like memory management, a lot less). It's a real shame that Apple were the only ones to get on board.
[*] Two caveats.
1) You do still need to understand strong vs weak references. Since ARC is "Automatic Reference Counting", two objects that each hold a strong reference to each other won't ever be automatically released.
2) If you actually do explicitly call malloc() on something, it's on you to call free(). The reasons to call malloc() are far fewer than under C, but the option is still there for you (being a superset of 'C') if you want to get your hands dirty.
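To make caveat 1 concrete, here's a minimal sketch (hypothetical Parent/Child classes, not anyone's real code) of the classic retain cycle under ARC and the conventional weak back-reference that avoids it; assumes clang with -fobjc-arc:

    #import <Foundation/Foundation.h>

    @class Parent;

    @interface Child : NSObject
    // weak: the child does not keep its parent alive, so no cycle forms
    @property (nonatomic, weak) Parent *parent;
    @end
    @implementation Child
    - (void)dealloc { NSLog(@"Child deallocated"); }
    @end

    @interface Parent : NSObject
    // strong: the parent owns its child
    @property (nonatomic, strong) Child *child;
    @end
    @implementation Parent
    - (void)dealloc { NSLog(@"Parent deallocated"); }
    @end

    int main(void) {
        @autoreleasepool {
            Parent *p = [Parent new];
            p.child = [Child new];
            p.child.parent = p;   // safe: the back-reference is weak
        }
        // Both dealloc logs appear once the scope ends. Had 'parent' been
        // declared strong, neither object would ever be released.
        return 0;
    }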
This is more or less where I'm at. It's an amazing language that people have an irrational hatred for due to bracket calls.
Also, I know verbosity isn't for everyone, but let me tell you this: every single ObjC project I return to, I know what the hell is going on because of that verbosity. It might suck to write but it's amazing when put against the test of time.
Even just using plain C, CoreFoundation and Grand Central Dispatch make for a really nice standard library. The fact that CoreFoundation bridges with Cocoa types is icing on the cake, and just makes everything feel so cohesive in a way that's hard to appreciate unless you've used it.
It's a shame that, despite being open source, there hasn't been much interest in adopting CF or GCD outside the Apple ecosystem.
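As a rough illustration of that cohesion, here's a minimal sketch (a hypothetical example of mine, assuming macOS with clang and ARC): a CoreFoundation string created with the plain C API, handed to Cocoa via toll-free bridging, then logged from a GCD block:

    #import <Foundation/Foundation.h>

    int main(void) {
        @autoreleasepool {
            // Plain C CoreFoundation API...
            CFStringRef cfGreeting =
                CFStringCreateWithCString(kCFAllocatorDefault,
                                          "hello from CoreFoundation",
                                          kCFStringEncodingUTF8);

            // ...toll-free bridged into a Cocoa NSString (ARC takes ownership)
            NSString *greeting = (__bridge_transfer NSString *)cfGreeting;

            // GCD is the same plain C API whether called from C or ObjC
            dispatch_semaphore_t done = dispatch_semaphore_create(0);
            dispatch_async(dispatch_get_global_queue(QOS_CLASS_DEFAULT, 0), ^{
                NSLog(@"%@ (length %lu)", greeting,
                      (unsigned long)greeting.length);
                dispatch_semaphore_signal(done);
            });
            dispatch_semaphore_wait(done, DISPATCH_TIME_FOREVER);
        }
        return 0;
    }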
That ability to seamlessly intermix C, C++, and Objective-C I believe was a big driver for the numerous high quality third party apps that OS X became known for in the 2000s. Obj-C paired with AppKit, IB, and KVO made building a good UI a cinch, and for speed critical or cross platform parts under the hood you had C++ and its massive ecosystem of libraries. It was a powerful combination.
I like your answer but want to clarify what I meant by aging well: programming languages such as Rust and Zig are taking the place of C and C++, and ObjC is not growing. I think there are ideas behind ObjC that could iterate further and take a new place, whether that's called ObjC 2 or shows up in other new programming languages. The interesting things about ObjC are not completely outdated; many are original, just not followed up by other initiatives (yet?). Memory management is one of the things you and I both pointed out.
I enjoy the "Smalltalk style" of ObjC though.
> I like your answer but want to clarify what I meant by aging well: programming languages such as Rust and Zig are taking the place of C and C++, and ObjC is not growing
This is a false dichotomy, because the primary alternative to Objective-C is Swift if you're writing apps for Apple platforms. And if you're not writing apps for Apple platforms, you wouldn't use ObjC or Swift, which are mostly Apple-specific and Apple-controlled.
Your comment seems off-topic. If you review the entire thread, the discussion is focused on the programming language itself, not platform dichotomies. While it’s clear that Swift is the alternative to Objective-C, the discussion here centers on Objective-C as a viable option compared to C/C++, etc, independent of platform considerations.
> If you review the entire thread, the discussion is focused on the programming language itself, not platform dichotomies.
This is not true. The OP said, "Around 2010, I started learning Objective-C to be part of the whole native mobile development movement." In other words, the OP learned Objective-C in order to write iPhone apps. As far as I'm aware, there's no ObjC toolchain for Android.
> Objective-C as a viable option compared to C/C++, etc, independent of platform considerations.
It doesn't make sense to consider ObjC independent of platform considerations.
> It doesn't make sense to consider ObjC independent of platform considerations.
I don't think that's true. You can code in ObjC on Windows and Linux - it's just nowhere near mainstream on those platforms. Clang is all you need, and the GNUstep project provides the runtime. I have an SDL app up and running in Visual Studio which is written in ObjC - SDL is just doing the rendering.
ObjC is not an Apple-created technology; it was invented by Brad Cox and used by NeXTSTEP, so it became the de facto language of choice for Apple's frameworks when NeXT did the reverse takeover of Apple.
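As a rough illustration of how little is actually needed, here's a minimal sketch (not the commenter's SDL project) of a plain Foundation program built with clang against GNUstep on Linux or Windows. The build line is one common invocation, assuming the gnustep-config helper is installed; exact flags vary by setup:

    // build (typical invocation, flags vary by setup):
    //   clang $(gnustep-config --objc-flags) hello.m \
    //         $(gnustep-config --base-libs) -o hello
    #import <Foundation/Foundation.h>

    int main(void) {
        @autoreleasepool {
            NSArray *frameworks = @[ @"Foundation (gnustep-base)",
                                     @"AppKit (gnustep-gui)" ];
            NSLog(@"Objective-C without Xcode: %@",
                  [frameworks componentsJoinedByString:@", "]);
        }
        return 0;
    }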
> it's just nowhere near mainstream on those platforms.
That's an understatement.
> I have an SDL app up and running in Visual Studio which is written in ObjC
What's your background, though? When did you learn ObjC, and why?
> ObjC is not an Apple-created technology
I know that, but nonetheless its usage outside of Apple development is close to nil.
I'm not claiming it's popular outside of MacOS. I just think your statement is too strong.
It doesn't matter what my background is. It is possible (supported, even, by Microsoft) to create ObjC binaries. They even had (until recently; the project seems to be abandoned) a port of Cocoa derived from The Cocotron to make it easier to bring Apple apps over to Windows.
That project went nowhere and was never more than a basic kind of Hello World; it dates from the early Windows 10 days, when Windows Phone still mattered.
The Android variant, of similar age, is what was at the genesis of WSL, after being ramped down as well.
My point is that every extant implementation of Objective-C outside of Apple's own developer tools is inspired by and used by current or former Apple platform developers.
Nobody today who was never an Apple platform developer has any interest in Objective-C. And that's why my statement is not too strong.
Objective-C developers are a dying breed even within the Apple developer community. I continue to use Objective-C exclusively myself, but I'm a minority.
They will need to rewrite Metal into something else, as it is Objective-C with Swift bindings.
Hence why Metal-Cpp has a mini Objective-C interop layer.
> It doesn't make sense to consider ObjC independent of platform considerations.
It's worth noting that others may view this differently (no Apple pun intended). For example, there are GNU implementations and other alternatives that demonstrate Objective-C implementations beyond Apple’s ecosystem.
Also, the influence of WebObjects has been underappreciated.
EOF was probably the first ORM and Direct To WS the first web-based no-code tool.
Absolutely. WO was a brilliantly designed framework (especially for the time), and being somewhat disillusioned with the state of web development in the last decade, I'm still using it as the UI layer for some of my own applications. It just can't be beat when it comes to throwing together a quick app, essentially being AppKit for the web. And as you say, its influence was great, although I often wish it had been even greater.
EOF was a great ORM framework as well and I never really understood ORM hate - until I had to use ORM frameworks other than EOF which generally feel … not that great. I ditched EOF a decade back though, due to it being, well, dead, and replaced it with Cayenne which is an excellent, actively developed ORM that feels very much inspired by EOF's design principles.
In the last few years, I've been working on a WO inspired framework (to the point of almost being a WO clone on the component/templating side) as a side project. It's still very raw when seen from the outside, no documentation and still operating under a bad codename - but hoping to make a release and port my remaining WO apps in the coming year. Hopefully it will add at least a bit to WO's influence on the web development world :).
Especially hilarious when you think of the rising popularity of HTMX.
WebObjects' at-the-time revolutionary model of using the URL for state management would work really well with the new trend back toward server-side rendered components.
Totally. I've been very happy to see the world embrace htmx in the last year and it's given me confidence knowing I'm doing the right thing with ng-objects.
The methodology htmx uses is in many ways identical to what we've been doing in the WO world for almost 20 years using Ajax.framework (which I don't know if you're familiar with), a WO plugin framework that most importantly adds "partial page updates". So you can wrap a part of a page/component in a container element, and target it so only that element gets rendered/replaced on the client side when an action is invoked (link clicked, form submitted etc.).
And yes, combined with WO's stateful server-side rendering and URLs, it's ridiculously powerful. I usually design my WO apps so users never actually see a stateful URL; they always land on "static URLs" while stateful intra-page work happens through page replacements. I love it.
It is basically a whole generation rediscovering what we were doing in the 2000s, now that the SPA craziness has gone too far.
It also influenced the design of Distributed Objects Everywhere at Sun with OpenStep, which eventually got rewritten into what became Java EE.
Anyone familiar with Java EE will find a familiar home in WebObjects, especially the Java rewrite.
Probably true, but I can confirm that this relationship does not go both ways :). I absolutely hated going from WO to Java EE back in the day. But I understand it's gotten better in recent years.
True enough that it has its own Wikipedia page,
https://en.m.wikipedia.org/wiki/Distributed_Objects_Everywhe...
WebObjects' Java EE compatibility,
https://en.m.wikipedia.org/wiki/WebObjects
And docs,
https://developer.apple.com/library/archive/documentation/Le...
Yeah, I've deployed a few WO apps in Java EE environments. How that works is WO basically uses a servlet (adaptor) for request handling, which bridges calls and converts them from the Java EE APIs to the WO-specific APIs. You don't actually interact with the Java EE APIs much (or at all).
I just meant that going from WO to Java EE didn't feel very nice :).
I see, thanks.
> Along with analysis and debugging tools, Apple still gives away everything needed to build apps for the Mac, iPhone, or iPad.
Very conveniently glossing over the fact that developers still have to pay an annual Apple Developer Program subscription fee in order to be able to distribute their apps. TANSTAAFL, as always.
Very conveniently glossing over the fact that if you are developing for the Mac, no, you don't. You can distribute it outside the store without paying anything.
iOS, yep you're right.
If you choose not to pay Apple for the privilege of macOS development, you will need to teach users increasingly more arcane tricks to get the app running. As of the latest macOS release, the old trick of "right click -> open" stopped working, and the new trick is "open -> go to system settings and click on a magic button -> open again".
You don't pay Apple for the privilege of development, you pay them for the privilege of guaranteeing your users you are a legit developer who cares about their safety by registering and letting your app be reviewed.
Considering it would take less than a day for Apple's registration scheme to be overrun with billions of fake app builders if they didn't put in a small monetary roadblock, I don't see how this situation could be improved.
This has little bearing on desktop software, which usually doesn't go through the App Store. Apple does not (yet?) require review for traditionally distributed desktop app bundles or executable binaries. The developer fee is paid in that case just to get a signing certificate. The increasing number of hoops necessary to get unsigned things to run seems to just be funneling more developers into paying up and becoming beholden to Apple so they can stop the nagging of their users.
I think GP's point still stands for signing certificates. The need to pay something increases the barrier to entry: you can't just create a million developer accounts to get a million signing certificates after Apple bans one of them.
I think this is fine. If you're a business, the developer fee is not a significant expense, and it makes the whole ecosystem work smoothly. If you're a hobbyist, student, open source developer, or otherwise in a position where you won't make that money back quickly, macOS provides a workaround for opening unsigned apps. This is so different from the terrible situation on iOS.
Desktop is much better than mobile in terms of developer and end-user freedom, but it's getting worse over time. It's becoming clear that desktop freedom is viewed by the major OS vendors as a holdover of a bygone era and not something they want to maintain forever. There have been some shifts in user base and preferred devices that have ameliorated this a bit, but it seems likely that consumer non-mobile operating systems are going to either vanish (perhaps lingering on in a prosumer/professional variant which will be ghettoized) or else morph into something much closer to a mobile OS (where getting full system access locks you out of the ecosystem).
I agree with you. I still believe the current status quo of Apple charging for signing certificates is reasonable. If that changes in the future, I will be upset about it then and will say so.
Until the mid-2010s, most apps were unverified and people trusted the distribution channels where they got them from.
Which is why they would get 10 browser extensions out of nowhere.
iOS can sideload. Is that not allowed in the development license?
Not just distribute, even to run them locally on your own devices for longer than a few days.
I remember seeing the Finder running on NeXT at a Halloween party at the Omni Group in 1999. That was a cool experience.
I just wish it would be more widely available.
Anyone using GNUstep successfully?
I kind of hate that Objective-C killed Object Pascal and that we don't have a mainstream modern kernel written in Pascal.
Object Pascal was sadly mostly gone by the time NeXT was acquired; C++ was already the main Apple language.
Object Pascal got replaced with the MPW environment, and a C++ version of the Toolbox was introduced.
Additionally, Metrowerks, with its PowerPlant C++ framework, eventually became the main Apple partner for Mac OS development.
It surprised me that Steve Jobs would be so open to Unix.
I thought that with his not-invented-here syndrome, desire to control everything, and attraction to simplicity and graphical UIs, he would have hated Unix.
How did he come to love Unix enough to build NeXTSTEP on it?
Steve Jobs was very open about taking things from elsewhere and refining them for consumption.
Lisa and Mac were products of his seeing the Smalltalk GUI at his visit to PARC. There was nothing off-the-shelf, so they had to be built from scratch.
Of NeXT he said that he had been so bamboozled by the GUI at his PARC visit that he missed the other two, arguably more important, concepts: OO and networking.
NeXT used as many off-the-shelf components as possible: Ethernet + TCP/IP for the network, Unix for the OS, Adobe's Display PostScript for graphics, Stepstone's Objective-C for the OO parts (which in turn mashed together C and Smalltalk). It bundled TeX, Sybase SQL Server, a bunch of scripting languages, Webster's dictionary, etc.
They only built themselves what they absolutely had to in order to get the machine and user experience they wanted.
> Steve Jobs was very open about taking things from elsewhere and refining them for consumption.
See also: forking KHTML into WebKit to build Safari when MS cancelled Internet Explorer for macOS and the platform was left without a robust browser choice. Notable for two reasons: that they were somewhat comfortable letting MSIE reign for so long rather than making an in-house option, and that they didn't start over from scratch when they finally did.
It’s funny that Apple originally wanted Mozilla (proto-Firefox) but couldn’t figure out how to build it on Mac OS X in a reasonable amount of time.
> couldn’t figure out how to build it
Well, no. They evaluated the existing choices and decided that KDE's code was a better fit.
> Melton explained in an e-mail to KDE developers that KHTML and KJS allowed easier development than other available technologies by virtue of being small (fewer than 140,000 lines of code), cleanly designed and standards-compliant.
According to Ken Kocienda's book (he was one of the original developers of Safari), that email is a post-hoc rationalization. The "evaluation" was literally him and another guy trying to build Mozilla for weeks and failing, and then someone else joining the team and quickly building Konqueror instead.
> According to Ken Kocienda's book (he was one of the original developers of Safari), that email is a post-hoc rationalization.
Kocienda's book doesn't say that Melton's email is a post-hoc rationalization. It doesn't say anything about that email on a meta level. It merely gives a straightforward account of the project's history from Kocienda's perspective. There is zero contradiction between this, the message from Melton on the KDE mailing list, or the other historical accounts (e.g. on Wikipedia) that cite Melton.
The official story from Melton has always been that they originally tried to start with Mozilla, found it too difficult to work with, so they abandoned it and adopted KDE folks' work as the basis for their Mac-native browser.
The most relevant passage from that email:
> When we were evaluating technologies over a year ago, KHTML and KJS stood out. Not only were they the basis of an excellent modern and standards compliant web browser, they were also less than 140,000 lines of code. The size of your code and ease of development within that code made it a better choice for us than other open source projects.
(GeekyBear's "Well, no" here is definitely wrong if the particulars from Kocienda's account are in fact accurate, but that's simply poor synthesis on some HNer's part and doesn't make Melton's version a retcon. The accounts we have from those involved are consistent.)
If the people evaluating your code can't get it to build, I'd say that's a good sign that your code isn't ready to be taken up by others.
And WebKit eventually birthed Chromium. Truly the circle of life.
I’m pretty sure your history is off here. There was a 5-year agreement around 1998 to keep Office available for the Mac and to make IE the default (but not the only bundled) browser.
Safari was shipped almost exactly at the end of that agreement, and the announcement as to IE for Mac being discontinued was 6 months later.
A fun bit of trivia: that IE was a different rendering engine (Tasman, not Trident which was used on Windows).
Indeed - it was also a lot more standards-compliant than Trident, though not without its own share of fuckery, as befitted the age.
He wasn't; his position regarding UNIX beards was well known.
Supporting UNIX was a business opportunity to go against Sun and other graphical workstations.
There are recordings of NeXT meetings, and his famous appearance at USENIX, regarding this.
Note that everything that matters on NeXTSTEP is based on Objective-C and the framework kits, with zero POSIX beyond what was needed for those government and graphics workstation contracts.
Those who care to watch one of those videos themselves might draw a different conclusion from the one you've expressed.
https://www.youtube.com/watch?v=CtnX1EJHbC0
His interview with Unix Today is arguably a better point of reference[0], here's a pull quote:
> we believe very strongly that these Unix battles are foolish, that they're not serving our needs as the Unix community and, more importantly, the problems with the Unix marketplace are not that there are three versions of Unix, all 5 percent apart, that need to battle it to the death for one version. The problem is that Unix is only part of what is considered a modern operating system
He didn't start a company to sell Unix machines out of some malice toward the bearded. He started it as the Next Step after Unix as it was at the time. Rather successfully in the long term.
[0]: https://www.tech-insider.org/unix/research/1991/11.html
Maybe you should dive into additional references then,
https://news.ycombinator.com/item?id=34449912
And in that meeting he expresses the point that UNIX was relevant due to the workstation market, not because he was such a big UNIX fan.
Maybe he got influenced by Pixar guys: https://www.youtube.com/watch?v=iQKm7ifJpVE
Even though IRIX had its quirks.
I'm not sure the timeline adds up for that - maybe NeXT came before he bought Pixar?
Steve Jobs left Apple and founded NeXT in late 1985 with the intent of developing a 3M computer: 1 MB of memory, 1 million pixels and 1 million instructions per second; or powerful enough to run wet lab simulations.
Jobs bought Pixar in 1986, when they were developing their own computer systems. Luxo Jr. was shown at SIGGRAPH that same year, one part advertisement for their computer and one part fun hobby project, because some of the Pixar guys aspired to one day do a fully computer-animated, full-length feature film of their own. This worked out very, very well for them, eventually. But they also stopped developing the Pixar Image Computer in 1990, in part because Jobs was losing a lot of money propping up both NeXT and Pixar.
Development of NeXTSTEP began in 1986 under Avie Tevanian, based upon the Mach kernel he had co-developed at Carnegie Mellon, which was intended to replace the kernel in BSD (which at this point, I believe, was still just BSD and years away from fragmentation). NeXTSTEP 0.8 was previewed in October 1988 and all the core pieces were there: the Mach kernel, BSD, DriverKit, AppKit, FoundationKit, the Objective-C runtime, and the NeXTSTEP GUI. 1.0 came in 1989.
IRIX 3.0 was released in 1987, debuting the 4Sight window manager, which isn't too similar to what was released in NeXTSTEP but does use NeWS and IRIS GL; it was, however, based on System V UNIX. It wasn't until Pixar started making movies, I think actually starting with Toy Story, that they bought Silicon Graphics workstations. For Toy Story, the render farm also started off using SGI but eventually moved to Sun computers.
So if anything, IRIX and NeXTSTEP are probably a decent example of convergent evolution given they were both (at least initially) in the business of making high end graphical workstations and neither needed to reinvent the wheel for their target market.
SGI use within Lucasfilm (and thus Pixar) goes way back to the IRIS 1000/2000 era, so definitely 83/84 afaik.
Sure, but given the timeline, it’s unlikely the decision came about simply because he was influenced by “the Pixar guys”. I pointed out that the goal for the first NeXT computers was to be able to do wet lab simulations, and this was due to a conversation Jobs had with Paul Berg while Jobs was still at Apple. They met again after Jobs founded NeXT before drawing up the initial spec in September 1985.
More likely the decision to use Mach/BSD was because Avie Tevanian was the project lead for the operating system.
4Sight also didn't debut until IRIX 3.0 (1987, also when it picked up the IRIX name); prior to that they used mex, which I traced back as far as 1985. Before that I'm not sure, but I don't think they had a window manager, and it seems unlikely they would have prior to 1985.
yeah, makes sense.
It's a far-fetched idea anyway. It's a five-month difference: NeXT in Sep '85 and Pixar in Feb '86.
The more likely scenario is they wanted to come to market as fast as possible with limited resources, so porting the Mach kernel and BSD (both proven, robust things) to their platform was probably the fastest route. It'd also come with an existing base of developers to attract, and carried some weight if they targeted the workstation market.
edit: this is what made me think he might have been influenced: Steve Jobs did actually launch another "cube" two years before the NeXTcube, developed in the time before he bought Pixar. This thing required an SGI/Sun to be attached: https://en.wikipedia.org/wiki/Pixar_Image_Computer
> I'm not sure the timeline adds up for that - maybe NeXT came before he bought Pixar?
Jobs became majority stakeholder of Pixar in 1986. NeXT was incorporated in 1988.
* https://en.wikipedia.org/wiki/Pixar#Independent_company_(198...
* https://en.wikipedia.org/wiki/NeXT_Computer
But Unix workstations were a thing even before then: 68k-based systems were already around in the 1980s, with Sun (taking just one example) releasing their first product in 1982:
* https://en.wikipedia.org/wiki/Sun-1
IRIX-based systems on MIPS were big in the 1990s (post-Jurassic Park), but SGI also started with 68k-based systems.
They launched the NeXTcube in 1988, but they incorporated in Sep 1985.
I mean, Mach 2 was cutting-edge and freely available from CMU. Probably less a love of UNIX and more the necessity of having a practical base for a new workstation OS.
“We saw the invention of the fricking World Wide Web on NeXTSTEP”
I can’t stand when people bring this up with such pride. Like the web couldn’t have come about on SPARC or anything other than the glorious Steve Jobs cube.
Probably not, as it was also a matter of available tooling.
Rails, as a concept, existed in Tcl and Python, AOLserver, Vignette, our Safelayer, Zope,...
Yet it took Ruby, and a cool demo, to put that idea into world-scale motion.