Decay-Proof Record Scroll
an infrequently updated online chronicle of a chronic dilettante
10 January 2026
The Correct Resolution of Paper
31 December 2025
Last One Out, Hit The Lights
I mentioned before that I'd been having issues with posting to Blogger due to an OS update that made WebKit not work so well on the desktop environment. The biggest problem was that when I tried to post an image using the menu bar, I got an error message saying my Google account couldn't be accessed, even though I'm here now writing this.
This problem affected both Safari and DuckDuckGo. Other browsers don't seem to have it, and it's confined to the desktop experience; I have no issue with my iPad. This was very frustrating, as I hadn't had to rely on workarounds for a very long time when it comes to updating Blogger. There have been several OS updates since this problem started, but none of them have fixed the issue... or have they?
When I made my previous entry, I wrote it on my Mac with the intention that I would finish it using my iPad to add all the images. On a whim, though, I decided to try the old drag and drop method. I resized the window, clicked on the image file on my desktop, and dragged it over the body of the text.
Success.
It's a small victory and rather cumbersome, but it's better than trying to use the iPad version of Safari to fill this out. The Blogger interface is really best suited for desktops and notebooks. Sadly, there's no longer a Blogger app like there is for WordPress, which perfectly adapts the blogging experience for tablets.
See you all next year.
28 December 2025
Bleep You, Got Mine (and I'm sorry)
13 December 2025
Tweet Back
It's recently been announced that a small startup called Operation Bluebird is trying to relaunch classic Twitter, arguing that the Elongated Muskrat has allowed the name and logo to lapse. I'm no legal expert, but there may be something of a leg to stand on. As a rule, a trademark is only enforceable so long as the company in question keeps using the mark consistently. That is, if a company rebrands, giving itself a new name and logo and abandoning the old ones, it risks losing its rights to those previous assets. This is intended to encourage competition rather than allowing companies to essentially sit on their rights. There's a lot more to this sort of move, but those are the broad strokes.
As for the new Twitter, it’s set to launch in early 2026 and is currently letting people reserve usernames and handles for when it finally launches. As of this writing, it’s at a little over 150,000 applicants.
I don’t intend to be one of them.
I can respect the effort on display here, and there’s clearly a love and affection for what Twitter once was, but I don’t think there’s any chance of catching the same lightning in a bottle. Twitter’s acquisition by Musk fractured a big part of the social media landscape, and I think things are all the better for it.
Once upon a time, I called Twitter my favorite social networking site, warts and all. It’s hard to describe what exactly I loved about it, but if I had to put it into a single coherent thought, it had an immediacy and a conciseness to it that you didn’t really get out of other platforms. It took the status update aspect from the likes of MySpace and Facebook and made that the entire site. It was also very accessible. I’m old enough to remember when you could use text messaging to post Tweets, back in the days of flip phones and T9 predictive text. That may seem rather quaint now with smartphones, but this was kind of a big deal back in the day. You weren’t bound to a swivel chair in front of a desktop, you weren’t lugging a laptop, and you didn’t have to break the bank buying one of those new fangled smart devices that HTC was making. If you had a phone and a good connection, you could submit a small message to a public square. I remember once reading an article about some activist tweeting only one word: Arrested. I don’t know the exact circumstances, but you certainly couldn’t have made such a quick and concise post to such a wide audience while seated at your desktop as the SWAT team kicked your door in.
Over time, the site evolved to include a few quality of life features, such as the ability to post images, the ability to post links in a way that didn't count against your character limit, and eventually a doubling of the character limit from 140 to 280. On a side note, I love the reason this upgrade happened. The story goes that Twitter wasn't very big in Japan until a massive earthquake hit the nation. Suddenly, people all over Japan were signing up for Twitter to keep in touch during the crisis. That may not seem like a big deal, but you have to consider that the Japanese language is built different from us European/Latin-based types. To a Japanese person, that 140-character limit may as well have been a 140-word limit since a single character in Japanese can be a letter, a word, or even a short phrase depending on the usage. Microblogging was the chocolate to Kanji's peanut butter. As word of these longer-than-normal tweets spread, people around the world wanted in. Obviously, you can't change a language overnight and emojis can only get you so far, so Twitter opted to double the character limit. The platform's biggest paradigm shift was done out of jealousy for the Japanese language.
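To make the character math concrete, here's a rough illustration (the Japanese sentence is just one I picked for the comparison, not anything from Twitter's history):

```python
# Same thought, two writing systems: count the characters each one needs.
english = "It is raining today"
japanese = "今日は雨です"  # roughly the same sentence in Japanese

print(len(english))   # 19 characters
print(len(japanese))  # 6 characters
```

Pack meaning that densely and 140 characters starts to feel downright roomy.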
Despite all this, Twitter grew with the times by sticking to a very practical model. This drew in a lot of new users and eventually Twitter became the go-to place for news organizations to seek out statements from famous people who had now graced the platform with their presence. There would be an incident or scandal or some other controversy, the offending parties would release statements to Twitter (as opposed to directly to the press) and you’d see a screenshot of that tweet on the news, be it on the TV or on the website or anywhere else you’d get your news. There was a direct line between a person of importance and the general masses. Of course, this was a bit of an illusion as it was just as easy for a celebrity to post a Tweet themselves as it would be for them to hire a full-time social media manager to post on their behalf. Still, it came with a sense of authenticity. Barring any hacking, there wasn’t anything on that feed that a user wouldn’t want there.
However, this wasn’t going to last. Nothing does. Something at sometime was going to come along and disrupt the whole operation. The bigger they are, the harder they fall, and Twitter was no exception.
The fall came in the form of a buyout by a narcissistic billionaire who felt that Twitter wasn’t being as transparent and honest as it should be with what kind of content was and wasn’t allowed on their site. One of the events preceding this takeover was Twitter banning a number of high profile users for violating terms of service, including Alex Jones and Donald Trump. This was viewed by Musk as Twitter not being a platform supporting free speech despite its insistence on being the digital village’s public square. Musk seems to have trouble grasping the fact that free speech does not extend to things like slander and libel or hate speech or calls for violence and harassment. His view seemed to be that people would get to say whatever they want and that the consequences of these actions would just somehow magically work themselves out. Ironically, he’d go back on this promise of totally free speech as he’d start cracking down on satire accounts or impersonations of people and organizations.
As the old saying goes, be careful what you wish for because you just might get it.
So, what’s happened since Twitter imploded? We’ve seen a number of other social sites step up to fill the gap. The centralized source of direct information is now decentralized. It’s no longer “So and so Tweeted yesterday…” but now “The blah blah blah posted on Substack that…” or “What’s his name wrote on Medium...” or “… the company announced on its Threads account.” Among many other new names and faces to the scene. Sure, some of them have been around for some time, but now they’ve found a new purpose serving as a place of refuge for those fleeing the Muskrat. There’s no longer one name in the directory. The monopoly that Twitter built for itself through raw determination crumbled under its own weight and now it’s no longer top dog in the social media scene.
In the end, people don’t need a new Twitter because they’ve already found one, whether it’s Bluesky or Threads or Substack or Medium or WordPress. While Operation Bluebird is more than welcome to prove me wrong, I don’t think they’re going to achieve what they set out to do because it’s physically impossible to replicate the success of Twitter. Even if they were to, what safeguards do they have against history repeating itself?
In the interest of full disclosure, I abandoned my Twitter account on the very day of my 15th anniversary of signing up. I keep it around for a few reasons, partly because it's costing Musk money to keep it up and running, but mostly because there are a number of very talented artists there who have yet to jump ship because they don't want to lose the audience they've built up over the years.
18 DEC 25 THU Update: Well, this hit a snag earlier than I thought it would.
01 December 2025
My Slop Could Beat Your Slop
Photo by Fruggo
Let me tell you about someone on Quora, someone we’re going to call J. J had requested my answer to a question regarding YouTube videos. Here is the question verbatim:
My YouTube channel talks about self development, I currently use stock videos from Vecteezy (I give attribution as instructed), motion graphics and Ai voice over narration to make videos. Will my channel get monetized?
I see questions like this all the time. They're all worded slightly differently, and they don't all involve using AI, but my brain hears it the same way every time: I want to participate in the Boston Marathon, but I'm really, really slow. If I show up on a dirt bike, will I be allowed to race?
We can probably have a very deep and thoughtful conversation about the future of AI and how it could potentially be used as a productive tool that aids people in their chosen endeavor. I don't doubt that. We could probably also have a similar discussion about steroids, though the public attitude about those seems pretty clear. Remember when we stopped calling them steroids and simply referred to them by the blanket term Performance Enhancing Drugs? That wasn't to broaden the definition to include other drugs so much as it was a way for those using said drugs to not sound like they were taking the easy way out. After all, it's only ENHANCING their performance. They're still working out and training; they just need that little extra edge because they've plateaued in their routine. Is that really so bad?
Of course, doubtless at least one of you has raised a hand in objection and pointed out that content creation for social media platforms is not a competition like it is with athletics. To that I can only say, “Fair, but when monetization is involved and stated as a goal, you’ve made it into one.” We can’t all be Jimmy Donaldson any more than we can all touch the FIFA trophy. Even if we take monetization out of the equation, you’re trying to gain an audience, and that audience only has so much time in the day to consume content. As a wise man said, time is money. It’s even called the attention economy.
Before I could answer J’s question, I needed a little context, just to see if I was possibly missing something fundamental. I asked why he couldn’t narrate the videos himself. Maybe there’s a good reason. I mean, I don’t like the sound of my voice, so who am I to judge? Maybe he doesn’t feel it would be a good fit for the subject matter. Maybe he’s got a really thick accent and is difficult to understand.
J answered in two separate replies, the first being,
“But with the ai voice over is it monetizable?”
J, I asked you why you couldn’t do the narration yourself so I could understand your circumstances that are leading you to ask about the AI voiceover. I asked as a comment on your question so I’d have more information upon which to base my answer. Repeating the question to me isn’t very helpful. The second was,
“Usually my voice over produces unclear audio”
This doesn’t really answer the question, either. “Unclear” isn’t terribly specific. In hopes of coaxing a little more detail out of him, I offered the following advice, “That’s an easy fix. Even voice notes on an iPhone can produce clear audio. If your emphasis is on self-development, you need to demonstrate that you’re developed enough to share your message more directly rather than hiding behind a machine voice. It’s all about authenticity. Visuals are one thing, but audio is what can really make or break a video.” There was no response from J to this. What’s “unclear” remains unclear.
Going back to the response about monetization, this was when I decided to check out J’s profile. There was only this one question on his profile, and he had given only one answer to another question.
Here is that other question verbatim:
If I use an AI generated image in my video and add voiceover to the video and upload it on my YouTube channel, will it get monetized?
Here is J’s answer to that question:
“It is best if you go through YouTube's monetization policy.
From your question, your videos might fall under -LOW EFFORT”
So, for those playing at home, we’ve got one content creator that is using stock videos and wants to use an AI for narration, and another content creator that is using AI generated images and a potentially non-AI voiceover (that’s important). The daylight appears to be measurable in seconds, doesn’t it? Curious if J has actually gone through YouTube’s monetization policy to know that this particular combination of sound and vision is ineligible.
When I brought this up to J, this was his response,
“Yes but there was a significant difference in our content type
That person said they wanted to use still images+ai voiceover only in their videos
But my videos use videoclips, edits and motion graphics+ai voiceover
Our content type is totally different”
Actually, J, that person didn’t say their narration would be rendered by AI. They said they’d “add voiceover to the video” after mentioning using AI-generated images. You made an assumption and tried to insist that using stock assets was more effort-intensive than using AI-generated ones, which is a healthy enough discussion we could have. After all, you’re both using something you didn’t personally create. Someone else did the work and offered it willingly to be used for other people’s videos. The AI-generated assets are a product of data scraping the work of others, regardless of their choice in the matter, but those results are also tailored to a specific input prompt. We could split hairs over who’s putting more effort into the visual portion of their videos until doomsday, but it’s certainly fair to say they’re both low effort compared to people who produce their own visual content, from the humble vlog to the elaborate and collaborative animated story time video.
I should point out that there are many content creators who integrate stock assets into their videos along with their own video and audio content. The important distinction to make here is that the stock footage is not being used as a crutch, much less a foundation. It is supplemental to the original portions of the content. The same goes for something like music from YouTube’s audio library or other stock music resources. These are parts of larger works and their contributions are ultimately secondary to what the content creator brings to the table.
The problem with what you’re doing, J, is that you want the backup band to be more than backup. You’re trying to pile up enough supplemental material that there’s no longer any primary content from you beyond possibly the barest bones of a script and overall vision. Given that, this is why I point out there’s barely any daylight between what you’re trying to do and what you called out that other content creator for asking.
The point I've been trying to make to you is that you need to put more of YOU in what YOU are producing for YOUTube. It's all about authenticity. The reason you'll hear so many people complain about AI Slop content is that it's all so impersonal and lacking in heart. It's designed to chase a trend and feed an ever-changing algorithm, not actually appeal to anyone. It's junk food, and it's not even good junk food. The flavor's gone in an instant and if the calories were any emptier, they'd collapse in on themselves and form little black holes. If that's the best you can bring to the table, then all you're doing is getting yourself lost in the noise. Why should anyone give your work attention if you're not going to give it your own attention and instead leave a machine to do nearly all of your heavy lifting?
My advice to you is that if you can’t take that step to make your content more personal, then don’t make your content. If you can’t be yourself, why should anyone care about you?
30 November 2025
The Great UnGoogling: The Sign
My point is that the problem with WordPress is more with me than with anything they do. I suppose it's fair to say that Blogger spoiled me. Everything Blogger offers is free (unless you want a domain name) because Google is a big evil monolith of a company that gets its money by other means (in my case, cloud storage). WordPress is not Google. At least, they're nowhere near the size of Google and have fewer means at their disposal to keep the lights on.
Recently, I've encountered an issue with Blogger using my browser of choice, Safari. It acted like I wasn't logged in, even though I was. The worst aspect of this problem was being unable to upload or otherwise post any images. After a tedious back-and-forth with Google's worthless support, I eventually figured out the issue is with WebKit, the browser engine at the heart of both Safari and my other browser, DuckDuckGo. I waited through a few updates to my OS and had better luck, but as of this writing I'm still not able to post images. That's a slight lie; I can post images, but I have to use another browser that doesn't rely on WebKit, such as Edge or Firefox. I don't want to do that. I know this sounds like being stubborn, but I'm just tired of workarounds. Updates always cause certain pieces of software to break; that's unavoidable. What's frustrating, however, is how long an issue can persist. In a time of software-as-a-service, I don't think there's any excuse for a known issue: you've got the revenue, you've got the personnel, fix it... and no pizza or energy drinks until you do.
This all coincides with a recent change I’ve been making over the course of the past year to UnGoogle my life. It started with no longer relying on Google for logins, password management, or two-factor authentication. That was a very big step and so far it’s had no downsides. I do still have a Google account overall, including a Gmail and cloud storage. Gmail is the best email service I’ve ever used and the cloud storage is really just a backup for other storage services I have, including some good old-fashioned physical drives I have to plug in to my desktop. Blogger is my biggest anchor, the most important reason I haven’t abandoned Google entirely. The recent WebKit fiasco, however, has given me pause. Couple that with the promo pricing offers that WordPress likes to email me, and I can’t help but see it as a sign to jump ship and go all in on WordPress for my blogging needs going forward.
So, given the scope of this endeavor as well as my reservations, I consulted my decision matrix. The decision matrix is a spreadsheet that catalogs the results of three virtual assistants asked to flip a coin. At the risk of sounding spiritual, I concentrate on the question while each coin is being flipped. Heads means go for it. Tails means don't do it. The verdict is best two out of three. The assistants are Siri, Alexa, and Google. Siri said to go for it while Alexa and Google said not to. The only time I've ever vetoed the decision matrix was when I upgraded my phone from a 12 mini to a 15. I'd held on to the 12 for a long time, even replacing the battery at one point, and the price on the 15 was pretty hard to beat since it was on the way out. As for WordPress, there's still a chance I veto the matrix and go all in, but don't hold your breath. Besides, the deal would only last for three years, and then I'd have to pay closer to full price again. I just don't feel like it's worth it right now.
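If you want to play along at home without the smart speakers, here's a minimal sketch of the same best-two-out-of-three logic; the assistant names are just labels, and the spreadsheet step is left out:

```python
import random

ASSISTANTS = ("Siri", "Alexa", "Google")

def consult_decision_matrix(question: str) -> bool:
    """Flip one fair coin per assistant; heads means go for it, tails means don't."""
    print(f"Question: {question}")
    flips = {name: random.choice(["heads", "tails"]) for name in ASSISTANTS}
    for name, result in flips.items():
        print(f"{name}: {result}")
    verdict = sum(result == "heads" for result in flips.values()) >= 2
    print("Verdict:", "go for it" if verdict else "don't do it")
    return verdict

consult_decision_matrix("Go all in on WordPress?")
```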
15 November 2025
Please Standby
Okay, it's later in the day and here's the progress. When I wrote the first part of the update and fixed the embedded video, I was at a Windows workstation. I didn't think the OS would make a difference since I tried two different browsers with the same result (or lack thereof). Thinking maybe the issue had been fixed, I tried it on my Mac when I got home, and got the same input lag and general unresponsiveness when using Safari. I even tried DuckDuckGo again and had the same issue. Those are my only two browsers, as I deleted Chrome some time ago and I don't like Firefox. So, I downloaded Microsoft Edge, which is where I'm currently writing this. Whatever is going on with Blogger, it's purely a macOS issue. I guess this is what I get for signing up for betas.
Now I'm trying out Brave and it's working just fine. What in Hel's Realm is wrong with Safari and DuckDuckGo?
21 September 2025
The Barbarians at SETI
Hey, who’s up for a little existential dread wrapped up in nostalgia by way of a 90-minute toy commercial featuring blood, body horror, and genocide?
Whenever nostalgia-based content creators bring up pieces of media that “traumatized” us as kids, there’s a fairly familiar list of usual suspects and often with Jim Henson’s The Dark Crystal at the tippy top. Other entries on the list may include The Transformers Movie, The Neverending Story, and The Secret of NIMH, among many others of that particular era. I saw these films back in the day (or at least parts of them before turning off the TV in horror) and they certainly had an effect on me not unlike that on my peers. However, there’s one movie that it seems no one talks about that definitely had an impact on me as a kid, and that’s the G.I.Joe Movie from 1987. What separates this one from the others on those lists is a kind of double-whammy, scaring me as a kid and then unexpectedly giving me severe existential dread as an adult.
Casting my mind back to around 1987 or maybe 1988, I was in my neighbor’s basement watching cartoons with their kids, and the G.I.Joe Movie was on. Originally slated for theaters, the poor box office performance of Hasbro’s previous animated outing (Transformers) led the film to be released directly to video and ultimately to television in a serialized format. The movie centers around the titular Joes going up against an ancient race hiding away under a dome of ice in a frozen wasteland. Called Cobra-La, these ancient serpent worshippers plan to launch a bio-terrorism attack on the whole of humanity. Massive spore-pods are launched into orbit. Upon ripening, the spores will descend upon the Earth, infecting everyone they come in contact with, reducing humanity to mindless beasts. I think this was my first exposure to the concept of a fate worse than death. Seeing these ordinary adults going about their day violently transformed into scaly, savage subhumans is pretty dark for a 6 or 7-year old kid. It was certainly a stark contrast to the Joes’ usual goofy conflicts with the ruthless terrorist organization Cobra, who only wanted to rule over the civilizations of the world rather than outright destroy them. I guess this was also my first introduction to the idea of different kinds of evil, those who wanted to take over the world versus those who wanted to watch it burn.
Fast-forward many years, maybe to the last five or so. I’m a grown man in his forties watching random movie clips and reviews on YouTube. The G.I.Joe movie came up and reignited memories of the absolute body horror that was Cobra Commander’s origin story, among other things like the aforementioned mindless beasts. This time, though, something else caught my attention, something that went over my head back in the 80’s. Cobra-La’s main gimmick throughout the movie is that their technology is organically-based. In other words, they don’t build things so much as grow them. Nearly everything at their disposal is alive. Even door keys are odd-looking beetles and rolling out the red carpet involves a literal army of little crab-like creatures. Kind of puts the Flintstones in a new light, doesn’t it?
Cobra-La is ruled over by Golobulus, voiced by the original G.I.Joe himself, Burgess Meredith. Around the halfway point in the movie, he gives us an exposition and lore dump about how Cobra-La came to be as it is. 40,000 years ago, they lived in harmony with nature, engineering it to their will and establishing an advanced civilization in this period of pre-human history. Climate change, specifically an ice age, brings their entire way of life to a screeching halt and forces them to take refuge under an ice dome. Following this is the rise of what Golobulus calls “the barbarians”. He’s of course talking about humans as we’re shown a pack of Neanderthals poking around a forest in search of their next meal. The flashback makes a time skip worthy of 2001: A Space Odyssey and shows humanity launching the space shuttle. Narration from Golobulus highlights a key difference between the age of Cobra-La and these pesky barbarians known as man. Whereas Cobra-La used organic matter as its foundation, human beings harnessed inorganic materials like stone and metal.*
Let’s think about that a minute. There we were, eons ago, poking around the woods eating grubs and fruits and whatever else we could forage. Somewhere along the way, one of our ancestors had an epiphany. If we took a stick and sharpened it to a point using a rock, we could take on larger prey and enjoy a greater feast. Then, someone hit on the idea of taking those rocks and using the antlers and bones of that prey to shape them into whatever we needed to hunt ever-bigger prey. Following that, someone else noticed a special kind of rock, one that’s got shiny bits in it, and who doesn’t love a bright, shiny object? One discovery leads to another, and next thing you know we’re using spools of copper wire and these weird things called semiconductors to send messages over great distances. Needless to say, as far as we know, no other species on our planet has made these discoveries, much less built on them to a point of manipulating electromagnetic radiation to communicate. Some animals use tools, it’s true, and whales have a surprisingly wide-reaching communications network, but that’s hardly competition in the tech sector… unless we’re vastly underestimating the whales and Star Trek IV is dead on the money.
Speaking of alien intelligences, SETI, the Search for Extraterrestrial Intelligence, casts a fairly broad net in their search for life beyond our own world. Primarily, though, they're interested in establishing communication with alien civilizations that have reached at least the point of sending radio waves through space. I mean, if you sank all that grant money into leasing time on a massive radio telescope, that's where you'd focus most of your efforts. That's not entirely fair. There's actually a decent precedent for the notion that our first contact with an intelligence beyond our solar system could be in the form of telecommunications, namely the famous Wow! Signal, detected in 1977 and originating from somewhere in the constellation of Sagittarius. While we don't know what the signal necessarily "said", the fact that it was such a strong signal suggests it was more than some natural phenomenon like a charged hydrogen cloud or one of many other potential explanations floating around since the discovery.
I’m firmly, ardently of the belief that we are far from alone in the universe. Given the billions and billions of galaxies, each with billions and billions of stars, with billions of planets in each galaxy, it would be an astronomical impossibility for our little blue marble to be some grand exception to the rule by having life on it. Of course, we have to talk about the word “alone” in this context. As for what kind of life is out there in the universe, I’m in Camp Sagan in that while there’s most certainly life beyond our home star’s Oort Cloud, the likelihood of that life having visited us and walking among us is a little sketchy. The distances are vast enough for radio messages to take lifetimes at best, much less spacecraft. That is, life could exist, just not the kind that’s followed the same path that we have.
This is tapping on the lid of a very, very big can of worms in terms of answering the question of whether or not we're alone in the universe: the Fermi Paradox, the Drake Equation, the Dark Forest, and, probably most concerning, the Great Filter. The Great Filter is the notion that there may be some great barrier in the progress and development of any intelligent species that's either extremely difficult to overcome or downright impossible. This barrier could be most anything, from exhausting resources to self-destruction by warfare. Essentially, it's a point at which a civilization cannot sustain itself and either needs a massive paradigm shift or collapses on itself in extinction. As far as humanity goes, there are two possibilities: the Filter is either ahead of us or, hopefully, behind us. Maybe harnessing those inorganic materials and developing technology was the Filter. Maybe leaving this planet or at least getting a start on asteroid mining is the Filter.
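For reference, since it gets name-dropped above: the Drake Equation is just a chain of multiplied factors, more of a structured guess than a law, and the values people plug into it are anyone's:

```latex
N = R_{*} \cdot f_{p} \cdot n_{e} \cdot f_{\ell} \cdot f_{i} \cdot f_{c} \cdot L
```

Here N is the number of civilizations in our galaxy we might detect, R* the rate of star formation, f_p the fraction of stars with planets, n_e the number of potentially habitable planets per such star, f_l the fraction of those where life appears, f_i the fraction that develops intelligence, f_c the fraction that leaks detectable signals, and L how long such a civilization keeps transmitting. The Great Filter is, in effect, an argument about which of those factors is vanishingly small.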
Of course, that leaves the final question of whether or not we’re the first to get past the Filter.
*I learned while writing this of a film from 1959 called The Atomic Submarine, which features humanity dealing with an aquatic UFO that is able to heal itself from attack by way of an organically-based technology.
01 September 2025
The Great UnGoogling: First Steps
Google pays Apple around $18 billion every year for their search engine to be the default in the Safari browser. For perspective, that's roughly NASA's annual budget. Bear in mind, this isn't some exclusivity deal. You can change the default search engine in Safari with just a little spelunking into the menu system. Like most things with Apple, the solution is simple, just not obvious.
If you’re on an iPhone:
Go to Settings.
Find Safari.
You’ll see an option for Search Engine.
Tapping on it reveals a list of options with Google at the top.
Take your pick.
You’re all clear, Kid. Now let’s blow this thing and go home.
The desktop environment has fewer steps, but works much the same way; it’s neatly tucked away in settings, far from hidden, but not calling attention to itself. Like I said, simple, not obvious. As for that list of options, it may seem like slim pickings, and it kind of is. Once upon a time, it was kind of the Wild West. There was Yahoo! and Google, of course, but there was also the likes of Excite and Lycos. Those two are still technically around, but they’re very much shadows of their former selves and not selectable on Safari’s shortlist. In fact, the only search engine I know of that came on the scene with a lot of hype and died almost as quickly was Cuil (pronounced “cool”), lasting only from 2008 to 2010. As to how we came to have one big slice with many rail-thin slices that would blow away in a gentle breeze if you tried to serve them on a plate, that’s a discussion people have earned degrees investigating. So, let’s just call it (Un)Natural Selection Meets The Law of Averages and carry on.
Touching back on those thin slices, they all stand a very good chance of becoming bigger pieces of the pie with the recent backlash Google has gotten for failing to uphold their founding mantra of "don't be evil". There have been too many issues to cover in great detail here, from exploiting behavioral economics to legal troubles to policy changes contrary to earlier promises. Speaking only for myself, I've been making moves since the start of the year to slowly but surely de-Google my life, a kind of unintentional New Year's Resolution. I don't plan on quitting Google entirely, mostly because of my blog here as well as my YouTube channel, Gmail, and maybe Google Drive (which I'm considering moving to an external drive). My first real step in this direction was eliminating instances wherein I use my Google account to log in to a service. That took a very long time, but was ultimately painless and now I log in to those services through other means.
My next step was choosing a new default search engine, which brings us to that list of options. Yahoo! was an outright no from me. I closed my email account there some time ago and the only real attachment I have to them is Flickr, which I almost never use, at best serving as a plan B if I decide to give up on Instagram. Bing is okay; it’s by Microsoft, a company I have very mixed feelings about, but overall it’s a perfectly competent search engine. I don’t know a lot about Ecosia, except that it’s some kind of nonprofit that plants trees. I simply don’t see a gimmick like that being sustainable, which makes me sad as it’s a very noble cause.
In the end, I chose DuckDuckGo, and I've been using it for a few weeks now. It is partly built on Bing and primarily emphasizes privacy. What this means is that when you search something, DuckDuckGo doesn't keep track of where you've been and what you've searched for, and does not use this data to target ads to you or otherwise tailor your search results to your browsing habits. I've never used Google without logging in to my Google account, so I don't know what the search experience is like for everyone else. The consensus of late is that the search results feel fundamentally broken, pushing AI features that are hit or miss at best and giving special favor to companies that buy advertising space on the platform rather than emphasizing relevance. Few things in this world irk me more than someone saying, "Just Google it!" as it exposes their ignorance of just how Google works. Put simply, my search results are not going to look like yours because of our different browsing habits. Google tracks what you look for and tries to find results that fit, often creating a kind of echo chamber.
One of the criticisms of DuckDuckGo and to a similar extent Bing is that the search results aren’t as “good” as Google. What is meant by this is that the results are less tailored and require a little more heavy lifting on your part. This isn’t the best example, but you’ll get the idea: when I put “flip a coin” into Google, it loads an applet that flips a virtual coin. When I put “flip a coin” into DuckDuckGo, I don’t get an applet. Instead, I get a list of websites that offer random number generators with coin-flipping options, along with articles detailing the history of the coin flip as well as the mathematical probabilities of using different currencies in the flipping. In other words, it’s not flipping the coin for me, it’s directing me to where I can get help with flipping a coin as I don’t have one on me and need to decide where I’m going to go for lunch. DuckDuckGo isn’t trying to be the answer. It knows it’s a directory. Google is trying to be the answer, at least for simple things like flipping coins or telling me the current temperature in Madagascar or the results of an election. At least, they started with simple things. Now, they’re trying to get AI to deliver more concise answers to more complex questions. I once asked Google which finger types the 6 on a keyboard. The first result was an AI summary that insisted it was the ring finger on the right hand. Needless to say, this is very wrong*. I tried again with different wording, thinking maybe I confused it. The result was the same. Sometime later, when I asked the same question, it did away with the AI summary and gave me a list of websites about learning to type.
To be fair, DuckDuckGo does have some AI features and will occasionally show a summary at the top, but this is actually a pretty rare occurrence, even when I ask a direct question rather than inputting a string of keywords. Sometimes, the AI summary will only show a blank bar with an option for me to generate the summary. Other times, it's placed far down the list, sometimes after the 5th or 6th result. The point is that DuckDuckGo isn't trying to force it on me, and I appreciate that. It's still interested in the technology and wants it to be better, but it knows to make it more or less "opt-in" compared to Google.
I’ll have further entries on my UnGoogling progress, but I wanted to start it off light and simple, something that most people can do without having to make any serious commitments or disrupt any routines.
09 August 2025
The First Byte Is With The i
Supposedly, the colored bands across the apple presented a challenge as far as making the badges, especially considering every machine was going to have one somewhere on its casing. It was costly and tedious, but the company knew what it wanted. The rainbow would eventually fade in favor of a single color depending on which of their iMacs and/or iBooks you purchased. Following that, they stuck with a flat monochrome, though the rainbow does technically live on in their new range of iMacs echoing their classic predecessors from the late 1990’s and early 2000’s.
What’s unique about Apple’s branding is that tech companies, as a rule, shy away from anything flashy or colorful.
Keep It Simple, Stupid.
Microsoft’s Windows has had a similar trajectory in terms of branding, trading its colorful, curvy squares and swish for a flat and sober arrangement of squares. Of course, outside of Microsoft’s Surface lineup, you don’t really see their logo on the machines that carry their operating system, except maybe for a small foil sticker somewhere on the casing, typically next to the one for the processor and the one for the graphics card.
Then there’s ViewSonic, which doesn’t know what it wants in terms of branding.
I’ve used ViewSonic monitors for years and they’ve never let me down. They’re reasonably priced, they have a wide selection, and they offer a high quality image. That said, they frustrate and disappoint me in a way only the decisions of a large tech company can.
Before I made the leap from a large Android phone from Sony to an iPad for my drawing, I poked around the Android market to see what would meet my needs. Unfortunately for me, most of the Android tablet offerings didn’t really have creative productivity in mind. Their target audience was people who just wanted to read a book, browse the web, or watch a streaming service. Even the few that tried to appeal to the artist crowd usually came up short and asked a pretty high price, so high that you might as well have saved yourself some money and just gotten a damn iPad, which I did.
Android tablets are slim pickings these days, dominated by only half a handful of companies, namely Amazon and Samsung. To the latter’s credit, they’ve made a pretty good go of giving the iPad a run for its money with creative professionals. Before that, though, it seemed all the major players wanted a piece of the Apple-dominated pie. I even remember Toshiba offering a line of fairly reasonably priced tablets. They had terrible displays, but that was really where a lot of these companies cut the most corners. So, when I saw that ViewSonic was making Android tablets, I was elated.
ViewSonic has a logo that stands out among the other tech companies, even giving Apple some stiff competition, though it's a little anemic on the academic analysis front. It's three Gouldian finches huddled together in a neat little row, their feathers a vibrant mix of yellows, blues, purples, and a touch of red around the eyes of the outer two. It is really less of a logo and more of a promotional illustration, not unlike Apple's original woodblock logo of Sir Isaac Newton sitting under a tree (before they "got to the point" and went with the apple itself).
I thought, “Can you imagine how cool that would be to be rocking a tablet with those three birds where anyone else would expect to see an Apple logo!?” Between that and ViewSonic offering quality monitors, it seemed like a no-brainer. Then, I browsed the selection and saw no trace whatsoever of the finches. Not even the bezel on the front had them like they do for some of their monitors. All they had was the ViewSonic name embossed on the back. I was furious, that special kind of furious you reserve for the most first world of first world problems, the one that has you toppling a chair as you storm out of the room before marching back in to continue hurling abuse at the screen.
“AND ANOTHER THING…!”
Seriously… you idiots! You had a Gouldian opportunity to show up Apple and make a name for yourself in the tablet space, and you blew it. It didn't help they also weren't offering any better resolution or image quality than their competition, indicating they simply put their name on something so they could say, "Yeah, we tried the tablet thing, but it didn't work out. Let's go back to making monitors." The lineup did not last, and the Android tablet market, saturated with everything except colorful logos, would implode practically overnight.
I bring this all up because I was considering upgrading my monitor to an ultrawide (the 21:9 ratio instead of the typical 16:9). I found a model that seemed to tick all the boxes except having the finches on it. That's not really a dealbreaker for me, just disappointing. When I looked at the monitor's specifications, something caught my attention. It's not uncommon for monitors to have built-in USB hubs to help with cable management. It's actually pretty smart. You've typically got your PC tower on the floor under your desk, so rather than get extension cables for all of your peripherals, you plug one long cable from the PC to the monitor, and then plug all of your peripherals into the monitor.
However, this particular model had only one USB port, but proudly proclaimed:
"Connect your compatible computer through the dual HDMI 2.0 and one DisplayPort inputs. A USB-A 2.0 port allows you to connect peripherals such as a wireless keyboard and mouse adapter through the monitor."

I was very puzzled by this. How can you plug a mouse and keyboard into the monitor and then somehow get them to connect to your computer? Is there some obscure feature to the HDMI or DisplayPort connection that I don't know about? I mean, both HDMI and DisplayPort carry audio information; maybe there's another channel in the mix for other purposes. I decided to get some clarification and try my luck with the Q&A section of the B&H listing where I was originally browsing. A staff member got back to me within about 24ish hours and clarified that the USB port on the monitor is there to plug in a USB thumb drive for the sole purpose of upgrading the monitor's firmware.
As of this writing, the B&H page for the monitor is unchanged, still claiming that the USB port on the monitor is for computer peripherals. I don’t blame B&H for that; they’re getting their sales blurb directly from ViewSonic. So, I went to the source and found this:
"Two HDMI (v2.0) inputs, one DisplayPort (v1.4) input, and one USB-A (v2.0) input offer flexible connectivity so you can directly connect your keyboard, mouse and other peripherals.
Change the way you work and play with the VX3418-2K monitor today.
*165Hz refresh rate with DisplayPort only"

Now I was really puzzled. This is the source. This is the company making the thing. Do they know something I don't? I found their sales support email and wrote them to get some clarification on the listing. Like B&H, they were very prompt in their response:
Thank you for contacting ViewSonic!
Taking a look at the VX3418-2K, I can confirm that the USB-A is only for updating the monitors firmware. I have included an image of the user guide that explicitly states it.
Not too sure why it is marketed in this manner, but this is something we can take a look at and correct. Thank you for pointing this out.
So, they did… sort of… clumsily... and hastily:
"Two HDMI (v2.0) inputs, one DisplayPort (v1.4) input, and one USB-A (v2.0) input** offer flexible connectivity so you can directly connect your keyboard, mouse and other peripherals.
Change the way you work and play with the VX3418-2K monitor today.
*165Hz refresh rate with DisplayPort only
**For firmware updgrade (sic) use only"

I hear LG monitors are pretty good.