Even though it's off-topic, my favourite case is from a build by the same guy. Nowadays it would work with just a Ryzen 9700X:
https://fabiensanglard.net/the_beautiful_machine/index.html
[Edit] Maybe not completely off-topic since it would be my dream PC.
I bought this case a couple of years ago after this article was linked here.
I love it. It's beautifully engineered. Top quality. It sits at the corner of my desk proudly silent.
I'm likely about to upgrade the pc within but the case will remain a strong feature of my desk.
Do you use it as a gaming PC (or for other high GPU load activities)? And if so, what's your take on noise under load?
Edit: I guess this is a senseless question if the case really only uses passive cooling. I was assuming there would still be fans somewhere.
I despise my current PC's fan noise and I'm always on the lookout for a quieter solution.
It's a dev workstation for me.
Currently inside is an i7-9600, which I limit to 3.6 GHz, and a cheap 1050 Ti.
The CPU is technically over the case's TDP limit, but with the frequency cap in place I never exceed about 70°C, and due to my workloads I'm rarely maxing the CPU anyway (one way to set such a cap is sketched below).
There is zero noise under any load. There are no moving parts inside the case at all: no spinning HDD, no PSU fan, no CPU fan, no GPU fan.
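For anyone wanting to replicate the frequency cap: on Linux the cpufreq sysfs interface exposes it as scaling_max_freq, in kHz. A minimal sketch, assuming root and a standard cpufreq driver; the 3.6 GHz value just mirrors the cap above:

    import glob

    CAP_KHZ = 3_600_000  # 3.6 GHz, written in kHz as sysfs expects

    # Cap every core's maximum frequency (needs root).
    for path in glob.glob("/sys/devices/system/cpu/cpu*/cpufreq/scaling_max_freq"):
        with open(path, "w") as f:
            f.write(str(CAP_KHZ))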
> I guess this is a senseless question if the case really only uses passive cooling.
Are there senseless questions?
It can be used for gaming if your demands are met by an Nvidia GTX 1650.
MonsterLabo built passive cases that could cool hotter components; sadly, the company seems defunct now.
Did you have no success upgrading your fans (Noctua etc)? Still too loud? How about water cooling?
It's an HP OEM (because I moved countries during the pandemic and getting parts where I settled was ridiculously more expensive).
The CPU has an AIO cooler (and the radiator fans are loud). The GPU has very loud fans too, but isn't liquid-cooled.
It's four years old at this point and I might just build something else rather than try to retrofit this one to sanity (which I doubt is possible without dumping the GPU anyway).
I bought my current gaming desktop off a friend as he didn't need it anymore when I was looking for an upgrade. It had an AIO cooler. The pump made so much noise, and it seemed like I had to fiddle with fan profiles forever to get it to have sane cooling. I swapped it for a $30 Cooler Master Hyper 212 and a Noctua case fan. It cools well enough for the CPU to stay above stock speeds pretty much all the time, and it's much quieter than the AIO cooler was. I'm not suggesting this CPU cooler is the best one out there, just pointing out that it's not like one needs to spend $100+ on a cooler to get pretty good performance.
The GPU still gets kind of loud during intense graphics gaming sessions but when I'm not gaming the GPU fans often aren't even spinning.
Honestly at this point it's not so much about money as it is about whether or not this particular case/setup/components combo is salvageable with minimal effort.
The CPU fan is rarely an issue (it mostly just goes bananas when IntelliJ gets its business on with Gradle on a new project XD).
The GPU is the main culprit and I'm not sure there's any solution there that doesn't involve just replacing it.
Depending on the fans it may be possible to re-oil the bearings.
Interesting idea. I feel like the fan noise from my GPU is just air moving, but maybe not.
Just last week I moved from using a Noctua NH-U12S to cool my 5950X to an ARCTIC Liquid Freezer III Pro 360 AIO liquid cooler (my first time using liquid cooling), and while I expected the difference to be big, I didn't realize how big.
Now my CPU usually idles at ~35°C, which is just 5 degrees above the ambient temperature (because of summer...), and hardly ever goes above 70°C even under load, while staying super quiet. I realize now I should have done the upgrade years ago.
Now if I could only get water cooling for the radiator/GPU I'm using. Unfortunately no water blocks available for it (yet) but can't wait to change that too, should have a huge impact as well.
I love the attention to detail in this post. I've thought about picking up one of those Vortex86 based ITX boards like the ITX-Llama [1] since you get the joy of running on real hardware but don't have to worry about tracking down a soundblaster card, network cards, etc. Assuming that they ever come back in stock that is.
This post is meticulous documentation; its attention to detail is remarkable, and it all looks so clean. How much did your project cost in total, though, out of curiosity (tools and failed attempts included)?
Reading through the post, sadly nothing worked the first time round (bravo to the poster for his perseverance), and while things got slightly better, IT "stuff" is still surprisingly fiddly and fragile.
The quality of the build and the technical detail of the handbooks are areas where things got remarkably worse - how could we let that happen? How can children learn how stuff works without schematics of the devices they own and love?
This is a pretty common behaviour. My dad has bought both his dream Amigas and his dream car, a Triumph TR6. I bought my dream childhood console, a Game Boy Advance SP (I only had a regular Game Boy Advance).
I also bought a few consoles (GB, NES, N64, PS2) that I was never allowed to own or play, except for the NES, which I didn't own but did play due to its popularity. My parents were pretty strict about my studies and piano practice, so I didn't even have much time for TV, and games were considered not only wasteful but also evil.
The thing is, I never played those consoles after purchasing them. I don't have any nostalgic feelings towards them, except for the NES. I actually felt sorry for myself, because when I tried to wake my inner kid up I discovered he had died a long time ago.
I'll probably give them to a friend's kid if he so wishes, or donate them to some local museum.
And the common realization then is: what did I find so interesting and special about this (as a child)?
Cannot confirm.
I often look fondly at the hardware I have.
I recently built one PC for each PC generation of the '90s (486, Pentium 1-2, Athlon).
Still love them even after having built them.
Getting back into DOS is quite interesting, since it's so different from PCs today.
Past threads
https://news.ycombinator.com/item?id=44021824 May, 2025 (86 comments)
https://news.ycombinator.com/item?id=44023088 May, 2025 (0 comments)
https://news.ycombinator.com/item?id=44026363 May, 2025 (1 comment)
Honest question: Will building a high-end PC still be a thing in 10 years? I've built all of mine in the last 20 years. Just finished my first AMD build. But I don't think it'll be possible or allowed after a few more CPU iterations. Sure, you'll be able to do builds with the CPU tech available up to when it stops, but I seriously doubt that the cutting-edge chip tech ten years hence will be available to hobbyists. Tell me why I'm wrong.
I think this hinges on what one considers "cutting edge CPU tech". Is it "newer and better CPU tech than before" or "the highest end CPU tech of the particular day".
If the latter ("the highest end CPU tech of the particular day"), I think it's going to keep getting harder and harder, with more top end options like the M4 Max being "prebuilt only", but I don't think it'll go to 0 options in as short as 10 years from now.
If the former ("newer and better CPU tech than before") I think it'll last even longer than the above, if not indefinitely, just because technology will likely continue to grow consistently enough that even serving a small niche better than before will always eventually be a reasonable target market despite what is considered mainstream.
You're going to have to unpack "allowed". Are you saying that the Apple model will win so heavily that separate parts will not be available? What change are you expecting?
NVIDIA not selling cutting-edge parts other than in bulk is a phenomenon of the AI bubble, which will eventually deflate. (I'm not saying it will go away, just that the massive training investments are unsustainable without revenue eventually catching up.)
no, you tell us why you think the next ten years are going to be different from the last thirty
One possible reason: to achieve the performance improvements, we are seeing more integrated and soldered-together stuff, limiting later upgrades. The Framework Desktop from the modular, user-upgradeable laptop company, has soldered-on memory because they had to "choose two" between memory bus performance, system stability, and user-replaceable memory modules.
If the product succeeds and the market starts saying that this is acceptable for desktops, I could see more and more systems going that way to get either maximum performance (in workstations) or space/power optimisation (e.g. N100-based systems). Then other manufacturers not optimising for either of these things might start shipping soldered-together systems just to get the BoM costs down.
> The Framework Desktop from the modular, user-upgradeable laptop company, has soldered-on memory because they had to "choose two" between memory bus performance, system stability, and user-replaceable memory modules.
No need to pick on Framework here, AMD could not make the chip work with replaceable memory. How many GPUs with user replaceable (slotted) memory are there? Zero snark intended
That’s a laptop. It’s soldered for space constraints.
There are high-speed memory module form factors. They just add thickness and cost, and they're not widely available yet.
Most use cases need the high speed RAM attached to the GPU, though. Desktop CPUs are still on 2-channel memory and it’s fine. Server configs go to 12-channel or more, but desktop hasn’t even begun to crack the higher bandwidth because it’s not all that useful compared to spending the money on a GPU that will blow the CPU away anyway.
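To put rough numbers on that, peak theoretical DRAM bandwidth is channels × transfer rate × bus width. A quick back-of-the-envelope sketch (the DDR5-6000 speed is my own assumption, purely for illustration):

    # Peak theoretical DRAM bandwidth: channels x transfers/s x bytes per transfer.
    # A DDR5 channel has a 64-bit (8-byte) data bus.
    def peak_bandwidth_gbs(channels: int, mega_transfers: int, bus_bytes: int = 8) -> float:
        return channels * mega_transfers * 1e6 * bus_bytes / 1e9

    print(peak_bandwidth_gbs(2, 6000))   # desktop, dual channel -> 96.0 GB/s
    print(peak_bandwidth_gbs(12, 6000))  # server, 12 channels   -> 576.0 GB/s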
I'm pretty sure the "Framework Desktop" is a desktop, not a laptop.
The Framework Desktop is not a laptop. The clue is in the name...
The only market for desktops is gaming. Hence Nvidia will just slap a CPU on their board and use the unified memory model to sell you an all-in-one solution. Essentially a desktop console.
Maybe some modularization will survive for slow storage. But other than that, demand for modular desktops is dead.
Cases will probably survive since gamers love flashy rigs.
There are a handful of professional uses for a workstation that are hard to beat with a laptop.
If you're compiling code, you generally want as much concurrency as you can get, as well as great single core speed when the task parallelism runs out. There aren't really any laptops with high core counts, and even when you have something with horsepower, you run into thermal limits. You can try and make do with remoting into a machine with more cores, but then you're not really using your laptop, it might as well be a Chromebook.
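As a sketch of why core count maps so directly onto compile times: build systems farm out one compiler job per source file, something like this (the gcc call and the src/ layout are placeholders, not any particular project's build):

    import subprocess
    from concurrent.futures import ProcessPoolExecutor
    from pathlib import Path

    def compile_one(src: Path) -> int:
        # One compiler job per source file; a real build system does this at scale.
        return subprocess.call(["gcc", "-c", str(src), "-o", str(src.with_suffix(".o"))])

    sources = list(Path("src").glob("*.c"))  # placeholder project layout

    # ProcessPoolExecutor defaults to one worker per CPU, so a 32-core
    # workstation runs 32 compile jobs at once; a 4-core laptop runs 4.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(compile_one, sources))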
> There are a handful of professional uses for a workstation
I've historically built my own workstations. My premise is that my most recent build may be my last or second to last. In ten years, I will still have a workstation - but not one that I build from parts.
All of these can be done much better in the cloud (I can spawn as big a machine as my pocket can afford). And with today's tooling (VS Code & JetBrains remote development) you don't even notice that you're developing on a remote machine rather than your local one.
So the desktop developer market is for those who are not willing to use the cloud. And this is a very small minority.
(FYI I am not endorsing cloud over local development, I'm just stating where the market is)
Much of my PhD thesis was/is done traveling in places with poor, poor Internet. Currently on my laptop in rural Calabria, where I pull a blazing fast 60 kbps, sometimes. Would be very irritating waiting for the compiler/theorem prover to go brr, remotely… I can hardly edit a Google doc out here!
This doesn’t contradict your minority point, but it really does make me appreciate local-first.
CS thesis that requires traveling, tell us more! What's the topic? :)
Perhaps the Italian girlfriend wasn't in the same place as the mainframe the theorem prover ran on? ;)
If I had been in Italy, perhaps my Ph.D. would never have been finished...
Yes, until the day you get an attack by a North American Fiber-Seeking Backhoe, losing your gigabit+ connection and your entire set of tools with it.
I mean, there are also preppers with power generators, solar panels, and dry food and water tanks waiting for the apocalypse to happen. Again, this is a very small minority.
>The only market for desktops is gaming.
I disagree. My premise isn't that desktops are going away. It's that DIY custom-build desktops are destined for the trash heap of history since you'll no longer be able to buy CPUs and memory. We will be buying desktops like the HP Z2 Mini Workstation - or the 10 years from now equivalent.
>Cases will probably survive since gamers love flashy rigs
But only as a retro theme thing? Would enthusiasts just put a Z2 Mini, for example, inside the case, wire up the lights, and call it a day?
There is still a lot of productivity work that benefits from the power of desktops: engineering (Ansys, etc.), local AI development, 3D modeling, working with large C++/Rust codebases, scientific computing, and so on. And related to gaming, there is of course the huge game developer market too. There is a reason Nvidia and AMD still make workstation-class GPUs for big bucks.
But all of that hinges on fast off-chip memory. If manufacturers agree that this memory and the SoC need to be soldered, there's not much left to swap out except PCIe boards.
If the processor comes with a built-in GPU, NPU, and RAM, will you really be building the system?
Sure. Building a PC already is barely building anything. You buy a handful of components and click them into each other.
While that is mostly true, there is a large variety of motherboards. It took me a while to find something with the right SATA and PCIe slots that I wanted. But after that it is just a screwdriver and some cable ties.
A lot of flexibility still exists
RAM? Are we expecting on-chip RAM any time soon?
Apple's done it since 2020. Intel was planning to, but walked it back. It dramatically increases performance, and allows vendors to sell you RAM at 8x the market price, and requires you to replace your entire computer to upgrade it, thereby inducing you to overspend on RAM from the outset so that you don't have to spend even more to replace the entire system later.
There's literally no reason for shareholders not to demand this from every computer manufacturer. Pay up, piggie.
Exactly. Better performance and higher profits. Seems inevitable to me.
Yes, as that's already the case with phones. There is more to a phone than the SoC.
Who builds phones?
It's mostly factory workers. But hobbyists could do so too if they wanted. Most people want to just buy something that works out of the box so it's not a popular option.
Looking at those pictures made me realise I could _smell_ it…
Got kicked right in the nostalgia I guess
This is nice, but without a CRT monitor (he's using an IPS) it's not quite the real thing regarding the actual on-screen experience.
It's much less important for VGA games than for console games. Most used 320x200 resolution, which was line-doubled to 320x400 then displayed on a monitor capable of at least 640x480, so you had distinct and moderately sharp pixels. The monitor was natively progressive scan, so you didn't get the exaggerated spacing between scan lines that you got on consoles using non-standard field timing to force 240p on a 480i TV. And the refresh rate at this resolution was 70Hz, but very few games ran at 70fps, so you lost most of the benefit of the low persistence of CRTs.
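For the curious, the line doubling is literally just each row scanned out twice. A toy sketch with numpy (the random frame stands in for a real mode 13h framebuffer):

    import numpy as np

    # A stand-in 320x200 palettized frame, like VGA mode 13h would produce.
    frame_320x200 = np.random.randint(0, 256, size=(200, 320), dtype=np.uint8)

    # The VGA hardware scans each line out twice, turning 320x200 content
    # into 320x400 scanlines on a progressive 70Hz monitor.
    frame_320x400 = np.repeat(frame_320x200, 2, axis=0)

    print(frame_320x400.shape)  # (400, 320)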
Without a CRT, the soul of retro computing doesn't glow as warm.
I always really liked the handle on my old upgraded Lenovo E73. It made it much easier to transport when going to university and back for the holidays, and I'm sad that most cases don't have one. Even a hinged one that sits flat against the top of the case when folded down would be awesome.
Typing this on a very old Model M keyboard :)
Honestly, that IBM PS/1 model 2168 computer case looks retro, but also classy and nerdy. I miss the days when computers had enough free bays for external media drives. I really like physical media like CD-ROMs, Blu-ray discs, and floppy disks (I don't use those anymore). It's getting harder and harder to find a computer case that has a bay for a Blu-ray/CD drive.
The prior owner took great care of that machine. It looks new.
oh man, what a trip down memory lane. i started building PCs in college with 386/486s and last year rebuilt my silly custom loop watercooled workstation. :)
and yes: the supplied pc docs back then >>>>>>>> supplied pc docs today
Were they 80486 or i486 8) SX or DX?
My first "PC" was a Sinclair ZX80. I got my soldering iron out.
Much later on (1986ish) my Dad bought a Commodore 64, unfortunately he plugged the power lead into the video socket, when me and my brother arrived home for Chrimbo. Dad got it repaired and it served us very well for several years.
I still have that C64 and it was repaired again a few years ago (re-capped). It now has a USB interface etc. I also have an original Quickshot II joystick and it still works fine.
My first "real" PC was a 80286 based thing. A maths co pro (80287) was a Chrimbo prezzie too and costed something like £110. It had a whole 1MB RAM and the co processor enabled me to run a dodgy copy of AutoCAD. Yes, AutoCAD used to run in 1MB of RAM! The next version needed something mad like 32MB minimum.
Most of what I know about maintaining and assembling computers I learned from the pictures in my Aptiva manual when I was a kid.
my childhood dreams include a Gravis UltraSound
Or Roland MT-32. That was the dream.
I lived that dream, and it was good.
I have been wanting to do this with 2000s era Athlon system
The rise of retro computing and gaming is a wonderful thing.
MiSTer has been a huge boon for me in terms of saving space and having access to old computers. I have it in an old pizza box case and connected to my old IBM CRT monitor.
I have a modern mouse and mechanical keyboard, but I tried to make everything as beige as possible...
Ah, this is the perfect machine to replicate John Carmack's work. It's not a NeXT, but it's pretty strong to do development on.
wow, this whole blog is a treasure!
I just finished building a water-cooled Threadripper 9980X machine today (you can go see it on Reddit in r/watercooling or r/threadripper).
My first ever build was a 386 though.
What fond memories.
>Joystic port or MIDI port? Back in the days I only ever thought of the DA-15 as the Game port[1]. To me it was only meant to welcome a joystick and play flight simulators. Little did I know that it could also be used as an output to send MIDI commands to a MPU 401-UART!
With no latency of course because USB hadn't been invented yet.
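And MPU-401 in UART mode is just raw MIDI bytes shoved at the port. A toy sketch of a note-on/note-off pair (the /dev/midi path is an assumption for a Linux box exposing an OSS-style raw MIDI device; under DOS the same bytes would go to the MPU-401 data port, conventionally at 0x330):

    import time

    # MIDI note-on: status byte 0x90 = note on, channel 1; then note number and velocity.
    note_on  = bytes([0x90, 60, 100])  # middle C at medium velocity
    note_off = bytes([0x80, 60, 0])

    # /dev/midi is an assumed device node, not guaranteed to exist on every system.
    with open("/dev/midi", "wb") as midi:
        midi.write(note_on)
        midi.flush()
        time.sleep(1.0)
        midi.write(note_off)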
>My SC-55ST came without a power supply. That was the opportunity to understand better the power requirement marking on the back. Voltage and Amperage are obvious but one must also pay attention to the polarity sign. The SC-55ST uses a negative center[7].
This is the "standard" for guitar effects pedals due to the ordinary switching power socket component on their PCB. The outer connector of the barrel jack does the switching by pushing the conductor away from the internal battery pole and over to the external supply when it is plugged in. This would switch the same way physically whether it was positive or negative, except these are often very sensitive or high-gain audio circuits and every bit of earth ground integrity can be essential for the metal enclosures and coaxial cables to shield the inner audio signal properly.
This SC-55ST may not have an internal 9V battery like a guitar pedal would, but it was designed to run on a Roland/Boss AC adapter anyway, which is the top-shelf wall wart with highly regulated, clean power for studio use. Roland set the standard for center-ground with their Boss pedals and adapters, which basically steamrolled everyone else. For this application the power supply itself doesn't need any shielding at all, but the audio needs as much shielding as it can get.