Probably the biggest mistake you can make when it comes to building a PC is believing it will save you money. There’s the obvious time sink of researching just what it is you’ll be doing, learning why some parts will or won’t go together, and sourcing everything in the first place, plus the omnipresent possibility of something going horribly, horribly wrong and there being no recourse beyond opening your wallet again. What you might save in money, you’ll lose in time… and possibly a little sleep.
I’ve built PCs, and I don’t miss it. At this point in my life, I’d rather pay for the peace of mind of professional assembly than for the satisfaction of building something with my own two hands (and having to be my own technical support). It was certainly a learning experience, and I’ll always be grateful for that. In terms of cost, I didn’t pay as much attention as I should have, but I know at least once or twice I had to basically start over and source a new part because something was amiss, including swapping out an entire motherboard because I plugged something in wrong. To be fair, I was assembling PCs circa 2010-2015, which meant I had it easy. There’s no shortage of tutorials on PC assembly, and the way PCs are built in this era is a much different animal than it was 10 or 20 years earlier. To put it in perspective, I didn’t learn to solder until 2016, and then only for my job. None of the PCs I built required any soldering, which was most certainly not the case 20 or 30 years earlier.
In food terms, you’re assembling a sandwich. The bread is the case, the lettuce is the motherboard, the pickles are the RAM, the meat is the CPU, the mayo is the thermal paste… it’s not a perfect metaphor, but you get the idea. You’ve got individual components that just snap together to eventually form a fully working personal computer workstation. In other words, in terms of time, I actually had it pretty easy.
Now, about those pickles…
Without getting into a long-winded rundown of just what RAM (Random Access Memory) is, think of it this way: The hard drive is your long-term memory. It’s your name, your family’s faces, the streets you grew up on, and anything else you don’t want to forget. The more of it you have, the more you can remember without having to forget something first. RAM is your short-term memory plus your ability to multitask. The more of it you have, the more you can do at once. It’s also probably the simplest and most straightforward upgrade you can perform on your computer. I like to tell people that if you can change a diaper, you are overqualified to swap out the RAM in a computer. Even if you need someone to talk you through it the first time, after your first go, you’ll be showing it off at parties.
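If you’d like to see that long-term/short-term split on your own machine, here’s a minimal sketch in Python. It assumes you’ve installed the third-party psutil package, and it just reports totals; which disk counts as “primary” will vary by system.

```python
# A rough look at the long-term vs. short-term memory split.
# Requires the third-party psutil package: pip install psutil
import psutil

GIB = 1024 ** 3  # bytes per gibibyte

# "Long-term memory": capacity of the primary disk/partition.
disk = psutil.disk_usage("/")
print(f"Storage (long-term): {disk.total / GIB:.1f} GiB total, "
      f"{disk.free / GIB:.1f} GiB free")

# "Short-term memory": installed RAM and how much is free right now.
ram = psutil.virtual_memory()
print(f"RAM (short-term):    {ram.total / GIB:.1f} GiB total, "
      f"{ram.available / GIB:.1f} GiB available")
```

Run it while you work, and if that available number keeps hovering near zero, that’s the telltale sign a RAM upgrade would actually help.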
In my personal experience, RAM has always been very costly. I’ve easily spent more on RAM than I have on motherboards (spontaneous failures notwithstanding). There have been many reasons for this. Games tend to be a bit RAM-hungry, especially online games, where you’re often running at least two or three other applications at the same time, one of which is doubtless a web browser with two dozen tabs open. During the pandemic, when we were all stuck at home and relying on our computers to check in at work, start that live-streaming channel, or virtually send our kids to school, RAM prices went up. That old desktop you bought a few years earlier wasn’t going to cut it, but you couldn’t afford to replace the whole thing, so you swapped out a few parts.
I’m sorry to tell you, but you were over a barrel. We all were.
There was demand, so the cost of the supply went up. It’s simple economics. Ironically, something that should have been the cheapest upgrade (sticks of RAM don’t have nearly as much meat on their bones as a motherboard, a CPU, or a hard drive) became a hot commodity.
One company has understood this better than anybody: Apple. Actually, that’s a slight lie; Apple understands the economics, but ultimately plays by its own rules. As a rule, once you’ve purchased an Apple product like a MacBook or an iMac, you cannot swap out or upgrade the RAM, no matter how many diapers you’ve changed or how much sleep you’re willing to lose. When you make that purchase, you’d better be thinking of how it will impact the next seven generations. Otherwise, you’re going to be left wanting down the road. So, do you splurge now and hope to be content for the next few years, or do you settle and upgrade more often? Apple technically wins in either case.
I talked before about the pandemic driving up the price of RAM. Well, now the newest plague to befall mankind is artificial intelligence.
Without getting into a long-winded rundown of generative AI and large language models, the important thing to understand is that these chatbots and image generators need a lot of processing power and a lot of short-term memory. The tech companies behind these AI tools are building more and larger data centers all over the country and even around the world. The demand for RAM is so out of control that Micron, one of the largest manufacturers of RAM, decided to stop selling to consumers and focus on its corporate clients, the ones building the data centers.
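To get a feel for the scale involved, here’s a back-of-the-envelope calculation. The model size is illustrative, not any particular product, but the arithmetic shows why “a lot of short-term memory” is an understatement:

```python
# Back-of-the-envelope: how much memory does it take just to *hold*
# a large language model's weights? (Illustrative numbers only.)
params = 70e9          # a hypothetical 70-billion-parameter model
bytes_per_param = 2    # 16-bit floating point = 2 bytes per weight

weights_gb = params * bytes_per_param / 1e9
print(f"Weights alone: ~{weights_gb:.0f} GB")   # ~140 GB

# And that's before the working memory needed to actually serve
# requests (activations, caches for every open chat, and so on),
# multiplied across thousands of machines in a data center.
```

Multiply that across racks upon racks of servers, and it’s not hard to see why a manufacturer would rather sell memory by the pallet to data centers than by the stick to you and me.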
What this means for consumers is that if you thought the RAM prices during the pandemic (and possibly before that) were bad, you haven’t seen anything yet. The only thing that’s going to stop it is this whole AI bubble finally collapsing on itself, because no amount of Sam Altman saying, “Trust me, bro!” is going to make this so-called business model sustainable. This is the part where I humblebrag. As an actor friend of mine once said, “Save your rotten fruit for the parking lot. I’ll have more places to hide.”
Back in 2020, shortly after I bought my house at the start of the lockdowns, I purchased a Mac mini as a housewarming gift to myself. I’d wanted one since they debuted back around 2005, but the planets never quite aligned just right for me to make the decision. Fifteen years later, the alignment netted me a 2018 Intel-based model with 8 gigabytes of RAM. I could have sworn I opted for 16, but I think I talked myself down since I was going to use it mostly for writing, drawing, and at most some 3D modeling in SketchUp. Anyway, despite the low hardware specs, the little gray box I nicknamed Gray Rock served me very well for the next five years. Even after Apple changed its processors from Intel to a homegrown lineup known as Apple Silicon, the little Gray Rock that could was holding up just fine and dandy.
I knew I couldn’t keep this up forever, though. While the hardware would probably last many more years, software is another problem. With the change in processors, applications were leaving the old architecture behind and being optimized for the new kid on the block. This sort of hardware-upgrade/software-optimization leapfrog happens even without a paradigm shift in processor types, but Apple’s new in-house strategy lit a fire under developers to get with the times.
There was, of course, a new version of the Mac mini, but I have to say I wasn’t too impressed with it. I mean, sure, it’s nice and compact, but it presented a problem for me. While I like Apple’s hardware offerings such as the Mac mini and the iPads, I’m less keen on Apple’s accessories. I don’t like their keyboards, I don’t like their mice, and while their monitors are nice, you can do a lot better for a lot less. As for the mice and keyboards, Apple cares more about form than function in this regard, and that means favoring wireless over having cables crisscrossing what’s supposed to be a sleek, minimalist setup. In other words, my mouse and keyboard needed to be plugged in, and the redesigned Mac mini dropped the USB-A ports they plug into, which would have forced me to use an adapter. That’s a pain, so I looked at other options. I guess I’d simply turned into too much of a prosumer for the Mac mini’s casual demographic.

This led me to the Mac Studio, a higher-end desktop with a similar-ish form factor to Gray Rock, the notable difference being the Studio’s doubled height to accommodate a massive heatsink and cooling fan. Needless to say, it had a lot more horses under the bonnet than my model that rolled off the assembly line seven years ago. It also had more RAM out of the gate, with the minimum being 36 gigabytes. This was starting to look like the perfect upgrade: it had over four times the RAM, a new processor, and I could plug in all my favorite accessories without some clumsy adapter. Throw in a special discount that doubled the hard drive space for the same price, and I took it as a sign.
In the first week of August of this year, my new Mac Studio arrived and Gray Rock was shipped back to be recycled. In its honor, the new Studio was given a similar nickname, Grimlock (after a Transformer), and it’s been a great upgrade considering how little extra room it takes up compared to its predecessor. I may not be pushing its specs with my typical workload, but I’m going for a slow burn rather than anything fast and furious. Even when I’m using a 3D modeling program like Blender, I’m only really using it to make perspective and shadow reference models for drawing.
I’ve been thinking about the timing of my purchase in relation to the spikes in RAM prices. I was under the impression, since Apple’s processors are made in-house and their RAM works a little differently from the Intel-based models (it’s unified memory, packaged right alongside the processor rather than sitting in separate sticks), that Apple was safe from the increased demand. After all, they already charge a premium for RAM when you’re configuring your initial purchase, so I was kind of ahead of the game in that regard. At least, I thought that was the case. Turns out Apple still relies on the same memory manufacturers for its own machines, and it’s only a matter of time before the price spikes affect Apple’s own price tags (which already have a reputation).
For whatever it’s worth to you out there thinking of buying a new PC or taking on the risk of building one, I absolutely hate that this is happening. There is no smug look on my face as I sit at my Studio typing this out. I hate the reason for it happening most of all. I hate this far-off pipe dream of some kind of computer-generated hive mind, and the idea that all we need to get there is to build more power- and water-hungry data centers and for you to keep asking Gemini to make you images of Mickey Mouse cleaning an assault rifle while it writes your college term papers for you.
Apple Intelligence is available on both Grimlock and Sapphire (my iPad). It has not been enabled. I have no desire to enable it. Few things in this universe would please me more than for the parts of their processors dedicated to AI features to be used for something else.