03 November 2024

Mac mini-er

OR:
Pint-Sized Port Problems 

I’m not one for New Year’s Resolutions. I don’t think I ever have been. Something I did learn recently from YouTuber and podcaster CGPGrey was that instead of setting a specific goal, you can try having a theme for the year. For this year, I decided it would be the year of upgrades. It sounds like an excuse to spend money, but bear in mind I make a very strong distinction between a purchase and an investment. After all, the keyword is “upgrade.” For starters, I upgraded my iPad because I wanted more storage. I bought a new bed. I got some furniture. The biggest investment has been a water softener. I also had to get a new toilet, but that was more spur of the moment, though it nonetheless qualifies as an investment.
Now, I’m debating how much further to take my upgrades. Specifically, I’m looking down from the monitor and just above my keyboard to the very device I’m using to write this. I have a 2018 Intel-based Mac mini that I bought as a housewarming gift to myself in April of 2020. It only began showing its age this year while I was working on Inktober. I decided to work entirely in SketchUp this year, building my shot composition and 3D modeling skills rather than working in ink and paper. During that time, I realized just how slow and cumbersome a task as simple as rotating the camera can be for the little gray box. Simply put, six years is a lifetime for a desktop in this climate, especially considering the massive paradigm shift Apple took by abandoning Intel processors in favor of something homegrown. That homegrown solution is now on its fourth iteration with the M4 processor.
My first Apple Silicon device was my new iPad Air, which features an M1 processor. Even though it’s an Air and not a Pro, it runs circles around my old 10.5-inch Pro’s A10X Fusion chip, so I’m very interested to see what it’s like on a desktop. 
I wasn’t expecting Apple to announce a new Mac mini anytime soon. I’d already scoped out a new mini and even a Studio in case I was really dead-set on future-proofing myself. It’s about a thousand-dollar difference between the two for double the RAM (the mini caps out at 32GB while the Studio has 64).
As for the new M4-based Mac mini, I find myself rather ambivalent. For perspective, I’ve been an unapologetic fanboy of the mini ever since its debut in 2005. The only reason I didn’t get one is because I had it in my head that I wanted to play some PC games, so I settled for a Compaq Presario. Comparing the two, the mini looked positively space-age. It was hard to believe a full desktop was in that little 7.7”x7.7” chassis. Sure, small form factor PCs weren’t especially rare, but they were typically marketed towards business applications, not general consumers. It’s puzzling to me how much criticism the mini would get in later iterations for not making very many drastic hardware changes. On the whole, before the M1-based minis, the biggest shift in hardware specs came with the removal of the optical disc drive. Even with the Apple Silicon upgrade, it was still the same 7.7-inch square. For my money, I was still impressed.
Speaking of removing hardware, I should get to what’s got me so ambivalent about the newest M4 Mac mini. In reducing the chassis size from 7.7 inches to 5, a few notable compromises had to be made. Key among these is the complete and utter absence of USB-A ports. The classic, bulky rectangle that exploded onto the scene in the early 2000s has now been completely replaced by USB-C. I’m bothered by this because my keyboard and mouse setup is still firmly in camp USB-A. Technically, Apple has only gotten rid of one port, as the M4 mini has 3 USB-C ports in the back and 2 in the front. Previous models had 4 USB-C ports and 2 USB-A ports in the back.
Some of you may be saying, “but they make adapters,” and that’s where I have to give Apple a lot of credit for how they implemented the new ports: they’re vertical instead of horizontal. On my current mini, virtually every port is occupied, including all four USB-C ports. The last USB-C port, though, is a bit of an odd duck because of the convoluted workaround needed to let me use one of my favorite peripherals. The 3D mouse I use to move the camera in SketchUp has a wireless dongle. It’s USB-A. As there are only 2 USB-A ports on the mini and both are occupied by a keyboard and mouse, I had to get an adapter to let me plug it into a USB-C port. You’d think, “Okay, problem solved. What’s wrong?” What’s wrong is that the adapter is too wide to actually fit on the back of the mini thanks to the other devices plugged in around it. This led me to kill two birds with one stone. I bought a hub that rests under my mini, plugs into that last USB-C port, and gives me 4 USB-A ports, 1 USB-C port, and two memory card slots. The 3D mouse’s dongle, still in its adapter, is plugged into the hub’s one USB-C port.
“Okay…” you’re starting to ask, “if it’s got four USB-A ports, why not plug the dongle into one of those?” And that’s a good question. I tried doing that at first, but my 3D mouse has a problem. Sometimes the mini forgets that it’s plugged in, and if I start up SketchUp, it doesn’t wake up and give me a leg up on my 3D modeling game. So, I have to unplug the dongle and plug it back in before I can use it. Unfortunately, Satechi, the company that made the USB hub, made the USB-A ports really, really tight, so un/plugging anything requires a very firm commitment from the user. Meanwhile, un/plugging from the USB-C port is super easy.
Speaking of Satechi, I guess I’m waiting to see how they handle the new mini’s design. If they make a hub that works like my current one, I may go for the new mini after all. They’ve got their work cut out for them. For one, on the old Mac minis, the only vent is on the back. The new mini has a vent on the bottom, so any hub would have to work around it. Additionally, instead of the power button being on the back right corner of the chassis, it’s now on the bottom front left corner.
Without a hub, I'll just have to take stock of exactly what I need to keep plugged in. Maybe that can be part of next year's theme. 

28 September 2024

Amp Up Your Dental Game


I got a new toothbrush, and I really like it, but it has a design choice I feel very conflicted about. It’s rechargeable, which is a step up from my previous one that used a AAA battery (though you easily got more than 6 months of life out of it), but it charges through USB. Not only that, they didn’t bother including a wall adapter.

Here’s what you need to understand about USB: the only consistent standard across all USB ports is that they deliver 5 volts. If you know anything about electricity, you know that’s only a third of the equation. Amperage is another third and arguably the most important. You’ll see it on your wall adapters, in that really small print that’s only one shade lighter or darker than the rest of the housing. How they list it may be a little hard to decipher, but you’ll generally see something along the lines of “5V/1.0A.”
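
For the arithmetic-minded, the missing third of the equation is wattage, and the three are related by simple multiplication. Here’s a quick back-of-the-envelope sketch in Python; the numbers are illustrative, not from any particular charger’s label, and it assumes USB’s 5-volt baseline rather than the higher voltages modern fast chargers can negotiate:

    # Power (watts) = voltage (volts) x current (amps).
    def watts(volts: float, amps: float) -> float:
        return volts * amps

    # Working backwards from a wattage-only label, assuming USB's 5 V
    # baseline (real fast chargers can negotiate higher voltages).
    def amps_from_watts(watts: float, volts: float = 5.0) -> float:
        return watts / volts

    print(watts(5.0, 1.0))        # the classic "5V/1.0A" adapter: 5.0 W
    print(amps_from_watts(20.0))  # a "20W" label at 5 V works out to 4.0 A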

Before we move on to explain amperage, let me give you a more real-world scenario involving batteries. You’ve probably only had to replace the batteries in your TV remote every few years, and chances are you’re using something like Rayovac, a brand that’s not exactly known for high-use applications compared to the Coke and Pepsi of the battery world, Energizer and Duracell. If you’re old enough to remember portable CD players, you know those things went through AA batteries before the end of the week. Obviously, a portable CD player is a very different device from a TV remote, which is practically an overengineered flashlight. It’s not constantly running while you’re watching TV. Both devices can run on a pair of AA batteries, but the CD player draws far more amps than the remote.

Needless to say, an electric toothbrush is not a portable CD player. It’s not power hungry, hence a AAA battery lasting several months in my old toothbrush. Even the rechargeable cell in the new one isn’t exactly a hog. In the case of rechargeables, amperage is an indicator of how quickly something can be charged. You might be thinking, “What’s the big deal? It charges faster, so get a wall adapter with a higher amperage.” However, as an old fable once told us, slow and steady wins the race. The race, in this case, is the overall life and longevity of the battery. Without getting into the chemistry of lithium-ion cells like those in your smartphone, the first 80% or so of the charge seems to happen pretty fast, even with a fairly slow charger like one rated at 1A. That last 20%, though, is often going to be a bit slow by comparison, and that’s also when you’re probably going to notice your phone getting a little hot. When you pour water into an empty glass, you slow down and ease off the pour as you get near the top of the glass in order to avoid spilling. That’s more or less what’s happening in that battery when it’s charging.
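
To make the shape of that pour concrete, here’s a toy model in Python: full speed up to 80%, then a taper. The numbers are entirely made up and none of the real chemistry is here; it’s just the shape of the curve:

    # Toy model: constant current to 80%, then a tapering top-off phase.
    def charge_rate(soc: float, full_rate: float = 1.0) -> float:
        if soc < 0.8:
            return full_rate                  # fast, constant-current phase
        return full_rate * (1.0 - soc) / 0.2  # current tapers as the cell fills

    soc, hours, step = 0.0, 0.0, 0.01         # start empty; 36-second steps
    time_at_80 = None
    while soc < 0.99:
        soc += charge_rate(soc) * step
        hours += step
        if time_at_80 is None and soc >= 0.8:
            time_at_80 = hours
    print(f"0-80%: {time_at_80:.1f} h; 80-99%: {hours - time_at_80:.1f} h")

Even in this crude sketch, the last fifth of the charge takes nearly as long as the first four-fifths combined.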

Now, here’s the problem: the faster you charge your battery, the more strain you put on the whole system, because you’re more likely to “spill”. For a time, I was using a wireless charging pad for my iPhone 12 mini. Wireless charging pads, in my view and with the benefit of hindsight, are terrible devices that nobody should ever use ever, and you’re an awful person for insisting otherwise. It takes a lot more energy to transmit electricity wirelessly than through a simple cable, and a lot of that energy is wasted in the process. Despite this inefficiency, this particular wireless charging pad was charging my phone really quickly. Between this and an issue with an app running in the background when it wasn’t supposed to, my battery’s health began declining and my phone wasn’t lasting through the day. It would literally drain in about 4-5 hours of moderate use. Even setting it down on a table overnight would leave it dead by morning. I hadn’t had the phone for very long, so upgrading through my cell carrier was out of the question. I was outside my warranty window as well, so I couldn’t rely on that. I had to pay out of pocket and take my phone to an Apple Store (because mailing it in would have taken weeks) to have the battery replaced. Really, taking it there was a bigger hassle than the 40-dollar cost of the procedure. Since then, the battery health has been declining again from routine use, all despite charging it the old-fashioned way with a 5V/1A charger, the little white cube that came with one of my older iPhones. Apple doesn’t make this little cube anymore, and now all of their chargers list the wattage, leaving you to figure out the amperage. To be fair, newer iPhones have more sophisticated charging circuitry in them and can handle faster charging operations if one so desires.

Put simply, if you’re the sort of power user who upgrades their phone every 18 months or whatever their carrier offers, the life and longevity of the battery probably isn’t going to crop up until you’re in the market for a new one anyway. So, for these people, faster charging methods may be more practical. If, however, you’re a moderate user and/or the kind who likes to hold onto their phone until the duct tape holding the glass back together rots away, the two most important numbers you need to know are 5 and 1, which are voltage and amperage, respectively.

So, back to my toothbrush*. It’s using the slowest charger I have, which is going to be fine for it. As for my iPhone 12 mini, I’m just going to start charging it by plugging it into my Mac.

*I should mention in the interest of full disclosure that this toothbrush utilizes wireless charging. That is to say, wireless charging isn’t necessarily a bad thing, but coupling it with fast charging is an utter waste of energy on a phone or comparable device. It’s certainly a waste on an electric toothbrush, hence going the slow and steady route.

02 September 2024

Quoth the Elephant, Nevernote

Someone on Threads asked whatever happened to blogging. I thought about it and remembered how, after Richard Wright wrote Black Boy and Native Son (among many other works), he learned about haiku and wrote several thousand of his own. That’s kind of what I feel has happened with blogging in light of social media. You’re able to more or less do the same thing as on a place like Blogger or WordPress, but it’s more concise and direct. There’s engagement and dialogue (for better or for worse). With blogging, that still exists, but it’s somehow on a different scale. Some will write comments, while others will write blog entries of their own responding to that original post.

I’ve had a few ideas for longer form entries like I used to, but I simply haven’t had the energy to do any deep dives of late. So, I’m writing this instead.

I’ve decided to part ways with Evernote. I’ve been using them since 2009 back when I had a Windows phone. Back when I had multiple devices from different companies, it was great. A note started on one could be finished on another and they’d all sync up and talk to each other.

Since 11 SEP 22 SUN, I’ve been using Evernote as a journal. I was having some shoulder pain that was severely limiting my normal range of motion, so I used it to keep track of my pain as well as note when I took any ibuprofen. As time went on, I used it to document when I took my medications, namely my antidepressants and mood stabilizer. The latter makes me a little drowsy, so I have to be careful when I take it. If it’s too early, I fall asleep on the couch and disrupt my normal sleep schedule. If it’s too late, it’s harder to get up in the morning.

In addition to keeping tabs on my medication, it was also a general sort of journal, sporadically documenting the various goings-on in my life, sometimes acting as an aid for me to track my depressive cycles and try to rationalize what I was feeling at the time. It’s easily the longest journal I’ve ever kept (not counting Blogger, which has no real regularity to it) and Evernote was great for updating it. Then, it stopped being great.

Despite the journal being nothing but text, updating it became extremely tedious and slow, especially in the mobile app. The desktop app wasn’t much better. According to Evernote, the file is only a few hundred kilobytes, a far cry from the 200MB max size they say a note can be on a subscription plan. I wrote in to their customer support, but received no reply. Couple this with their price hike (70 to 130 USD) at the start of the year for me, and it’s a lot of goodwill down the drain. As for what I was paying for, their scratchpad was finicky to use, not always syncing between devices, and other times when it would sync, it would double-paste whatever was written from one device to another. This eventually got fixed, but sometimes it would still slip. Their worst offense in all this, really, was their pushing of AI features. You bump up the price, struggle to handle a 200KB text file, barely handle syncing on the scratchpad, and now you’re telling me to use the AI features? Finish this candy bar before you go unwrapping another. It’s just poor management, chasing trends rather than focusing on and leaning into your strengths.

Meanwhile, Apple Notes seems to have gotten better, if only by comparison. Before, I always felt like it was fighting me. I’d use it on occasion, but I simply didn’t like using it. Now, though, it seems so much more polished, keeping in line with the Apple fan adage, “It just works.” Besides, now that all of my devices are Apple, I don’t need Evernote’s best feature, which was being a middleman among the devices. In fact, I wrote the draft for this entry in Apple Notes rather than Evernote. I don't plan on migrating my stuff from Evernote on any grand scale, only as needed. As an experiment, I started my meds journal as a new document, then copied and pasted the old one into the new space. No lag on the desktop, but we'll see how it holds up in the morning with the mobile app. If Notes has similar issues with large text files, at least I'm not paying an annual subscription to have it not work.

15 June 2024

A Random Anecdote About Perspective

Once upon a time, when the Syfy Channel could spell, they had a show called Trailer Park. As the name implies, they showed old sci-fi and horror movie trailers, typically from the 1950s and 60s. At the end of each episode, they'd have some commentary from one of a rotating cast of guest speakers, whose job was to offer a sobering perspective or witty insight into what we'd just seen on the show. I think it was an episode about apocalyptic movies where one of them offered this little story about a physicist giving a lecture on the sun.

My memory is a little fuzzy, but here's the best I can do at retelling it. 

A physicist was giving a lecture on the makeup of the sun. At one point, he explained that in 5 billion years, the sun would expand to a red giant, possibly consuming the earth in the process, if not at least making life intolerable for whatever would survive. He was interrupted by someone in the audience who seemed rather panicked when they asked, "How soon!?" Thinking he'd maybe given out the wrong number, he restated, "5 billion years." The audience member breathed a sigh of relief and explained, "I'm sorry, I thought you said 5 MILLION years." 

27 May 2024

Peddle Meddle

This recent kerfuffle around Tesla's Cybertruck and its accelerator pedal coming loose and jamming in the footwell of the vehicle has reminded me of something I read about John DeLorean and the production of the iconic DMC-12 that would take Marty McFly backwards and forwards in time.

The story goes that during preproduction, ol' JZD was trying every chance he could to cut corners on design and testing, trying to retain as much of his original vision as possible. His logic was that Lotus (who were helping in the production) frequently skipped a lot of steps that the larger automobile manufacturers take, especially in design and testing. Basically, if it was good enough for Colin Chapman, it was good enough for John DeLorean.

Finally, someone challenged him. He explained that the reason Lotus can take a seemingly more cavalier attitude to design and testing is that they're typically only making a few hundred examples of a vehicle at most. If they find a flaw really late in production, even as cars come off the assembly line, they can fix it right then and there and repeat the fix for every vehicle that follows. DeLorean wasn't making a few hundred DMC-12s. He wasn't making a few thousand. He was making tens of thousands in one of the most sophisticated assembly lines of its time. If it was found that a wheel arch was too narrow or, let's say, the accelerator pedal came loose and got stuck in the footwell, that's a repair you're going to have to repeat tens of thousands of times.

The worst part about what's happening with the Cybertruck is that even some of the repairs aren't going so well. They're using 3D-printed jigs to reattach the pedal with a Phillips-head wood screw, and some of the examples that have gotten back to customers have the screw misaligned. They're literally botching the repairs despite having all the necessary tools for what is quite possibly the laziest, most half-assed repair anyone could do for a vehicle.

Bear in mind, I'm not one of these "I told you so" types. I had legitimate faith that Tesla would succeed. After all, it managed to do what I thought would be impossible in my lifetime, and that's make electric cars cool. For perspective, I used to drive a Nissan Cube, and I got asked once if it was electric. Considering other vehicles like the Toyota Prius and the Nissan Leaf, nobody was expecting an electric car like a Tesla to come out of the concept phase. That it's turned out to have more than a few smoke and mirrors behind it, like exaggerated ranges and questions about the safety of its battery packs in collisions, isn't surprising. What is surprising is that not only did they cut corners on the adhesive holding the accelerator in place, but the design of the footwell was such that the pedal can get jammed.

Sadly, another thing that's not surprising is that there are those who are still defending Tesla in light of this recent event. Something needs to change. Someone needs to challenge the rule over there and tell DeLorean he's not Lotus.

04 May 2024

Everything Pro is New Again

Top: iPhone charger. Left: iPad Pro 10.5" (2nd Gen.) charger. Right: iPad Air (5th Gen.) charger.

After much shopping around and even deeper thought, I decided a few days ago to upgrade the iPad Pro I've had since late 2017 early, rather than waiting until after October. This was when B&H (where I was getting the iPad from) was closed for Passover and Apple had just announced its upcoming iPad event on the 7th of May.

I had my sights on an iPad Pro, my thought process being that after Apple announced its new range of allegedly expensive iPad Pros, people would be snatching up existing models like it was a run on the bank. Granted, this iPad Pro, with its full terabyte of storage, was going to carry a significant cost. Put simply, though, if the rumors are true, this terabyte model would have cost the same as a baseline new iPad Pro with maybe 128GB of storage.

The more I thought about it, though, and the more I looked around at other iPad models, the more I came to the realization it may not be as good of an investment as I was previously thinking. My current iPad Pro has 64GB of storage, and in the 7 or so years I've had it, I've taken up 41GB of it. Most of this consists of drawing applications. I have a few games, but I think the largest file size on any of them is 2GB (and there are fewer than a half-dozen in total). Overall, I'm very deliberate and intentional with how I use my iPad. It's a drawing device, not a communication device like my iPhone, and not a multimedia workhorse like my Mac mini. I don't like having all my eggs in one basket, another reason I decided against splurging on a higher-end iPad Pro. If I really needed something for, say, editing videos or 3D modeling, I'd just upgrade my Mac mini, which I may well do at the end of this year depending on how I feel when the time comes. I prefer a mouse to a stylus when it comes to 3D modeling, anyway.

Then there's how the iPad range has evolved from 2017, when my second generation iPad Pro was the fresh face in the crowd. At that time, only the Pro series offered support for the Apple Pencil. After using a number of third-party styluses over the years, I can say with certainty that the Pencil is the best of them, and it made the iPad Pro worth its asking price. Now, all of the iPads offer Pencil support. I'm still waiting for, at the very least, Pencil support on an iPhone Pro model, but that may well never happen. I wouldn't want a Pro line iPhone anyway. Pro has become a very flimsy word with Apple. When the iPad Pros were new, you got Pro features, such as the aforementioned Pencil. Nowadays, a Pro iPad still has a number of "exclusive" features, but one of those features is storage space, which makes the whole offering feel a little bit... unbecoming, for lack of a more tactful term. Like I said, my iPad Pro is 64GB, which was the lowest amount of storage available at that time. I wasn't bothered by this, as it kept the price down and I was only going to be using it for drawing anyway, and as a rule, drawing apps aren't very resource intensive. However, even being careful with what I put on the tablet, I knew this wasn't going to be sustainable, that I'd eventually run out of space.

With my iPhones, I have a rule to double my storage with each upgrade, as I'm way less choosy about what does or doesn't get installed on them. My current iPhone, a 12 mini, has 256GB of storage (I don't remember my previous iPhone 7's storage; I think it was 128). Many iPhones go higher than that (512GB) before we have to start adding Pro or Max to their names. As for iPads, any model that's not a Pro only has 2 sizes available, 64GB and 256GB. I was worried this would fill up quickly, but then I remembered those posts on social media that compare just how much a billion is compared to even a million. They usually use grains of rice or counting seconds, but you get the idea.

Taking that as a starting point, I made a graph to put into perspective just how much 256GB is compared to my current 64GB, and how little the 41GB that took 7 years to build up actually amounts to in the grand scheme of things.
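
For anyone who'd rather see the numbers than the graph, the same comparison is simple enough to sketch in a few lines of Python (using the nominal marketing gigabytes from above, nothing more precise):

    # The gist of the graph in text form: what 41GB of accumulated files
    # looks like against each capacity.
    used_gb = 41
    for capacity_gb in (64, 256):
        pct = used_gb / capacity_gb * 100
        bar = "#" * round(pct / 2)            # one '#' per two percent
        print(f"{capacity_gb:>3}GB: {bar:<32} {pct:.0f}% used")

Seven years of use fills about 64% of the old tablet but only about 16% of the new one.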


The iPad Air is a strange entry in the iPad series. I'm not even sure Apple knows exactly what it's supposed to represent. When it first launched, its claim to fame was being thinner, lighter, and more powerful than other models of iPad. This was long before the Pro series came out and the price of the standard, adjective-less iPad was going down into the budget-friendly range. Against its more premium Pro compatriots, I suppose you could say it's the premium device without the premium price. It still has Pencil support, and the display is actually slightly bigger than my 10.5" Pro model's. Supposedly it's not as bright, but by a factor so small and in a unit of measure so poorly understood that I wouldn't have known if I hadn't been told. Speaking of the display, the other trade-off is that it's not as smooth and flowing as a Pro model. I'll save you the block of technobabble and just say that the screen of an iPad Pro refreshes at a rate of 120Hz (120 pictures a second), so animations look smoother than at the typical refresh rate of 60Hz, which is what other iPads and most displays you've seen in the wild have on offer.
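
If you want the technobabble-free difference in numbers, it comes down to how long each picture stays on screen, which is just one divided by the refresh rate:

    # Refresh rate -> time between new pictures, in milliseconds.
    for hz in (60, 120):
        print(f"{hz}Hz: a new picture every {1000 / hz:.1f} ms")

At 120Hz a new picture arrives roughly every 8.3 milliseconds instead of every 16.7, which is why motion looks that much smoother.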

To put this in perspective, and at the risk of gushing about iPads, even the very worst iPad screen is better than the screens on most high-end premium Android-based tablets. Samsung has some models that come close, and the Surface lineup from Microsoft is a damn good competitor, but dollar for dollar, the displays on iPads are leagues above everyone else's. Android tablets can be had for pretty cheap, but the display is always the biggest corner that gets cut, especially in the overall resolution.

For perspective on that, a Samsung Galaxy Tab S6 Lite has a total resolution of 2000 by 1200 pixels and goes for about 250USD. A current (10th generation), adjective-less iPad has a total resolution of 2360 by 1640 in roughly the same size screen as the Galaxy tablet, and starts at around 330USD. That's fairly close, and it's taken a long time to get to this point. There are other differences to consider, and I don't pretend for an instant this is a perfect 1:1 comparison, so take the numbers with a grain of salt. The point is, in terms of a digital drawing experience, there are fewer compromises on the iPad, and you do get your money's worth if you're willing to pay that little bit extra.
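
Running those numbers, with the same grain of salt applied, gives a crude pixels-per-dollar figure. A sketch, using only the rough prices and resolutions quoted above:

    # The comparison above as arithmetic. Prices are the approximate
    # street prices from the post, not current listings.
    tablets = {
        "Galaxy Tab S6 Lite": (2000, 1200, 250),  # width, height, ~USD
        "iPad (10th gen)": (2360, 1640, 330),
    }
    for name, (w, h, usd) in tablets.items():
        px = w * h
        print(f"{name}: {px:,} pixels, ~{px // usd:,} pixels per dollar")

By that crude measure, the iPad actually comes out ahead, roughly 11,700 pixels per dollar to the Galaxy's 9,600, which is the "money's worth" point in numbers.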

I've only had my iPad Air for a day as of this writing, and I honestly can't tell the difference in displays between it and my old Pro. I wouldn't care anyway. Similarly, I don't care that the camera's not as good as a current Pro offering, because I don't take pictures or shoot video with my tablet anyway, except maybe to snag a quick reference while I'm drawing, and I'm just as likely to grab my phone and use AirDrop to send it over.

As for the 7th, I'll certainly watch and see what Apple's got on offer. I'm bound to use up 256GB eventually. 

24 March 2024

Keeping Watch in the Walled Garden

Full disclosure: I'm writing this on a Mac mini while my iPhone 12 mini charges and before I do a little drawing on my iPad Pro later. 

Officially, I identify as platform agnostic. I don't care what operating system I have to use; I will make it work. I grew up on classic Mac OS, I switched over to Windows XP shortly after college, I was a Linux user for a while, and my phone choices have been equally all over the map. I'm far from an early adopter, but you name it, I've probably tried it. Ever heard of Maemo? Symbian?

For my money, Apple makes exactly two good products, the Mac mini and the iPad Pro. I often joke to people, "I don't have an iPhone. I have an iPad... Oh, by the way, did you know Apple makes a camera? It takes calls too for some reason." Also in the interest of full disclosure, I have a smartwatch from Fossil that works well enough with my iPhone. I have no desire to buy an Apple Watch. For perspective, my favorite feature of my Fossil is that it doesn't have a screen. It's got a plain, ordinary watch face (which I think is called a complication if you're in the know), and the hands occasionally move to different positions based on what notifications I've set it to. In recent months, however, this has stopped working. It's a little irritating, but I don't miss it. It certainly doesn't make me want to buy an Apple Watch. Those could well be famous last words, but on the whole, the only Watch feature that had my attention was using it as a painter's palette for Procreate, and I don't even think that feature is supported anymore. As for my Fossil, I've actually stopped wearing it altogether. I simply don't need a watch that much, and its uses as a pedometer and sleep tracker are simply no longer part of my daily habit.

As of this writing, the Department of Justice has taken Apple to court over its monopolistic practices, particularly in the smartphone market. Among its talking points was the fact the Apple Watch currently only works with iPhones. Android users are simply out of luck and have to settle for a Garmin, a Fossil, a Fitbit.... 

I'm not saying the DOJ doesn't have some kind of a case against Apple. What I am saying is that their angle of attack speaks more to their lack of knowledge not only of the technology, but of the market itself. It's tempting to compare this to 20 years ago, when Microsoft was under the microscope for its own monopolistic practices, but this is far from apples to apples (no pun intended). That's why I brought up all the other smartwatch brands, smartwatch brands that work with both Apple and Android devices. Moreover, I'm a little confused by this talking point. The DOJ is saying Apple has too much of the smartwatch market share with a watch that is exclusive to Apple products. So, what happens if they add Android support? Wouldn't that give them a bigger piece of the pie they allegedly already have too much of? Won't that just land them back in court for the exact same offense but now with slightly different wording? Wouldn't that make any ruling on the matter tantamount to self-incrimination?

Personally, I don't know anyone who wants an Apple Watch so badly they're willing to give up their Android phone to get one. That's simply not the market we live in. Once upon a time, I worked for a cell carrier that did not win the iPhone exclusivity contract that went to AT&T. Needless to say, this led to a deluge of calls about when the iPhone was coming to our neck of the woods and what hoops would have to be jumped through to make it happen. A workaround did emerge at one point. The iPhone's exclusivity was limited to the United States, so it was possible to use the phone on our network... in Germany. That meant you'd have to import a German iPhone and use a German SIM card, which meant all of your calls would be international ones. Would you believe someone I spoke with still wanted one after all of that was explained to them? It would literally have cost them more to use the phone for a month than to buy the thing outright, because all of their calls would be to a German number on a North American network. I have no idea if this person went through with their plan, though I like to think he got as far as putting in a bid on an eBay listing before losing it and taking it as a sign to simply wait for the iPhone to be available on other networks. He didn't have to wait long and, frankly, those early iterations of the iPhone are among the many reasons I'm not an early adopter. My first iPhone was a 5c, which was only supposed to be a temporary device while my Sony Xperia phone was getting its battery replaced. None of my iPhones have been the newest model.

There was likely a certain time in my life when I suffered from the dreaded Fear of Missing Out, but I was a kid, and like all kids, I grew up. So, when I see people lining up to buy a phone or a game console on launch day, I don't envy them. In fact, I only sort of feel bad for them. Do you know who was the first person in North America to own a Nintendo 64? Neither do I. I don't think anyone knows. Even with social media, I don't think we could ever know for certain the name and face of the first person to own a PlayStation 5. I'm sure it was a big deal to them, whoever they are, but let's not pretend that this FOMO business is anything to give in to or follow like a cartoon character floating toward a pie cooling on a windowsill.

I think the DOJ is misunderstanding FOMO, treating it like some perfectly rational mindset in a consumer-driven economy regulated (loosely) by a democratic republic. Someone really wanting an Apple Watch does not make the niche Apple has carved out for itself a monopoly, even if that niche is worth several billion dollars. 

"What do you mean this person has to get a new phone for their watch to work!? This is an injustice!" 

Seriously, where is this attitude when it comes to big Pharma or the oil companies? 

That'$ only a $lightly rhetorical que$tion. 

My point is that the monopoly Apple has allegedly made for itself is nothing like the one that Microsoft got called out on 20-some years ago, or the one AT&T was broken up over however long before that. I can't run macOS on a Microsoft Surface tablet. I can't play Nintendo Switch games on my iPhone. I can't use a BMW part in my Honda. The list goes on. If Apple wants to paint themselves into a corner by playing their cards close to the chest, let them, and most importantly, let the market decide whether that's playing fair in the game of capitalism.

25 February 2024

A Little Tipsy

Let me start by admitting that I am by no means an expert on aerospace engineering and could most certainly not land anything on the moon or Mars or anywhere else in our solar system. I don't think I ever even succeeded at that egg drop challenge everyone did in elementary school (I'm not even sure I participated). The point is that my opinion on the Odysseus lander is only that, one of 10,000 opinions from someone with no expertise on the subject.

Remember that email chain letter from a few years ago about how the booster rockets on the space shuttle are based on the width of a horse's ass? It's got to do with train tracks being based on old wagon trails, which in turn owe their dimensions to Roman chariots, and you see where this is going. The credibility of the story is debatable, but the point is that for anything we construct, eventually some part of the process is going to be arbitrary or born of a necessity that's no longer a problem to overcome. 

I was thinking about this when I saw the Odysseus lander. I wondered if it had to be designed the way it is to properly fit in the payload space of the rocket, hence it effectively being taller than it is wide. 

From what I can find, the Falcon 9 rocket is about 3.66m (12 feet) wide. I don't know what that is in horse asses, but let's leave that rabbit hole be. I don't feel like researching what other landing craft it housed or exactly how much of that diameter is usable cargo space and not insulation or whatever else it takes to get that phone booth into space. All I want to know is why it was designed that way when so many other craft understand the importance of placing your center of gravity as low as possible. 


This is the Philae lander, part of the Rosetta mission to Comet 67P/Churyumov-Gerasimenko back in 2014 (launched in 2004, by the way). It had a rough landing of its own. One missed harpoon and a failed thruster caused it to bounce twice before finally landing and resuming its mission. You'll notice it only has 3 legs and is overall fairly low to the ground.


This is the Viking lander. It landed on Mars in 1976. It also only has 3 legs and is overall wider than it is tall. Are you noticing a pattern yet?

The Odysseus lander has six legs, but according to reports, one leg may have snapped, sending it onto its side, possibly leaning against a rock. I don't want to diminish their accomplishment, but ignoring decades of sensible design doesn't endear me to their cause.


This is a tensegrity robot, designed by the Creative Machines Lab out of New York City. It uses tensioning rods to maintain its shape and can even roll by telescoping and expanding its legs in a sequence. Theoretically, a science payload could be suspended in the center of this "skeletal ball" and with no obvious up or down could land in any orientation and be able to correct itself. Also, depending on where we drop it, it may not even need a chute or booster to slow it down. As stated, though, this is all theoretical as only various field tests have been conducted to prove the concept of rolling as a viable means of locomotion on uneven terrain. 

It makes me wonder if Intuitive Machines, creators of the Odysseus lander, maybe had to answer to some higher-ups who weren't too keen on any unproven technologies, instead opting for something that at least resembled a more conventional design. 

23 January 2024

The Death of Drafts

Something I need to get in the habit of is using Evernote to write my rough drafts before bringing them into Blogger, rather than my usual method of writing into Blogger directly. The interface is fine when I need to insert things like links or videos, but as far as the core activity of writing the entry, it's not always as smooth and pleasant a user experience as it should be. I feel the same way about WordPress, though to a lesser degree because the block system it uses is relatively intuitive. Blogger is too much like a word processor; it's very much about the WYSIWYG philosophy.

To better illustrate what I mean, we have to talk about 2001: A Space Odyssey. A lot of people don't understand the relationship between the movie and the book, specifically which came first. Technically, the book was written first, but it was meant to serve as something of a wiki for the screenplay. As Arthur C. Clarke put it, "Before you can make a movie, you need a script. Before you write a script, you need a story." Screenplays, by their very nature, aren't really meant for embellishing details or offering backstory to a spotlighted item. There's an unwritten rule in screenwriting: no paragraph of more than 3 lines. Screenplays are designed to be concise and quick to read, each page amounting to roughly one minute of screen time. Novels can play with time and structure in ways screenplays can't, hence why some novels are deemed "unfilmable."

There's another rule about screenplays: the first version you submit is always your first draft, no matter how many rewrites it went through up until that point. When I was in school, we made these things called rough drafts. They were handwritten on paper, and your teacher would eventually hand them back with red marks all over them, pointing out every little mistake you made (sometimes with helpful suggestions on what to do next). You would then completely start over and write out another paper, taking the edits to heart. I'm sure this is still technically done, only with digital files instead of sheets of notebook paper. My point is that word processing created a convenience in terms of editing your written work. Some writers forgo "drafts" altogether and edit as they go along, never even bothering with the "version history" feature some word processors and notetaking applications provide. There's no more starting over, barring any serious fundamental hiccup like finding out a source is inaccurate or an entire premise is off the mark. It's overall a more nebulous process. You can still number your drafts, but those can be reserved for page-one rewrites rather than little typos or moving a sentence from the end of a paragraph to the start of one.

This concludes the experiment. Draft was successfully transferred from Evernote to Blogger.

13 January 2024

I Am The Alphabet And The Omega

Google has declared my site unfit for advertising. I imagine it's due to a low level of traffic. I'm far from an influencer, and if I had to guess, my WordPress blog probably gets more views than anything I've written here in the past year.

And that's fine. 

The upshot to all of this is that there are no ads on my site, which is how it should be. Contrast this with YouTube, where a channel can be demonetized but still have ads all over it. The channel in question simply does not get the money. Maybe that's how it should be, though. After all, what advertisers were most afraid of when it came to YouTube was their money going to content creators they disagreed with or otherwise didn't want to be associated with. The platform still got the support it needed, and users were free to share what they wanted to (within reason).

Of course, I'm sure Blogger takes a lot fewer resources to maintain than a data hog like YouTube, so Google didn't have much of a choice when it came to their "advertiser friendly" policies. It's what makes Elon Musk's recent middle finger to advertisers so frustrating. He's both right and wrong for more or less the same reason. Businesses have a right to decide who they will and won't do business with, just as consumers have a choice in where they spend their money. That's called the free market. However, just like George Carlin said about the American Dream, you have to be asleep to believe it. Right now, big businesses hold more power than they deserve, and the average consumer is in no position to do anything to their bottom line and therefore effect change in an unfair market.

When consumers boycott, it takes a town, so to speak. When businesses boycott, it's usually a handful of guys in suits sitting around a table, never mind the hundreds or thousands of employees who may feel otherwise. This is the fundamental flaw with voting with your wallet, something I used to advocate for quite vehemently. When people vote, it's one vote per person. When dollars vote... well, they can't actually vote. The people holding the dollars do. As Douglas Adams said, "On the whole, it was not the small green pieces of paper that were unhappy." 

As for Blogger, I intend to stay; monetization wasn't the goal... not really, anyway. I've been comparing WordPress with Blogger for a few weeks now as the former keeps sending discount codes my way. Blogger technically offers a lot while asking for "nothing" (besides your data) in return. The only real paid feature on Blogger is registering a domain name. WordPress, meanwhile, offers a more streamlined user experience, but only if you play their paywall game. Automattic is not Alphabet; they can't afford to give WordPress away for free. They want money, not data. 

As of now, WordPress is my after-show for Blogger. Anything I don't feel like posting here usually ends up over there. At most, WordPress is my backup if Google decides they don't like keeping Blogger around. It wouldn't be the first time I've had to migrate from a blogging service, and if the Muskrat's acquisition of Twitter has affirmed anything for me, it's that no platform is permanent.