09 August 2025

The First Byte Is With The i

People have probably earned Master’s degrees analyzing Apple’s classic rainbow logo. The apple represents knowledge and the rainbow represents order. Oh, but there’s a catch: there’s a bite taken out of the apple and the colors are all out of order, representing disruption, a rebellious upset of the status quo. It’s hard for most people to imagine now, but once upon a time Apple was the scrappy underdog that relied on its fans for its marketing as much as on anything from a professional advertising agency.

Supposedly, the colored bands across the apple presented a challenge when it came to manufacturing the badges, especially considering every machine was going to have one somewhere on its casing. It was costly and tedious, but the company knew what it wanted. The rainbow would eventually fade in favor of a single color depending on which of their iMacs and/or iBooks you purchased. Following that, they stuck with a flat monochrome, though the rainbow does technically live on in their new range of iMacs, which echo their classic predecessors from the late 1990s and early 2000s.

What’s unique about Apple’s branding is that tech companies, as a rule, shy away from anything flashy or colorful.

Keep It Simple, Stupid.

Microsoft’s Windows has had a similar trajectory in terms of branding, trading its colorful, curvy squares and swish for a flat and sober arrangement of squares. Of course, outside of Microsoft’s Surface lineup, you don’t really see their logo on the machines that carry their operating system, except maybe for a small foil sticker somewhere on the casing, typically next to the one for the processor and the one for the graphics card. 

Then there’s ViewSonic, which doesn’t know what it wants in terms of branding.

I’ve used ViewSonic monitors for years and they’ve never let me down. They’re reasonably priced, they have a wide selection, and they offer a high-quality image. That said, they frustrate and disappoint me in a way only the decisions of a large tech company can.

Before I made the leap from a large Sony Android phone to an iPad for my drawing, I poked around the Android market to see what would meet my needs. Unfortunately for me, most of the Android tablet offerings didn’t really have creative productivity in mind. Their target audience was people who just wanted to read a book, browse the web, or watch a streaming service. Even the few that tried to appeal to the artist crowd usually came up short and asked a pretty high price, so high that you might as well have saved yourself some money and just gotten a damn iPad, which I did.

Android tablets are slim pickings these days, dominated by only half a handful of companies, namely Amazon and Samsung. To the latter’s credit, they’ve made a pretty good go of giving the iPad a run for its money with creative professionals. Before that, though, it seemed all the major players wanted a piece of the Apple-dominated pie. I even remember Toshiba offering a line of fairly reasonably priced tablets. They had terrible displays, but that was really where a lot of these companies cut the most corners. So, when I saw that ViewSonic was making Android tablets, I was elated. 


ViewSonic has a logo that stands out among the other tech companies, even giving Apple some stiff competition, though it’s a little anemic on the academic analysis front. It’s three Gouldian finches huddled together in a neat little row, their feathers a vibrant mix of yellows, blues, purples, and a touch of red around the eyes of the outer two. It’s really less of a logo and more of a promotional illustration, not unlike Apple’s original woodblock logo of Sir Isaac Newton sitting under a tree (before they “got to the point” and went with the apple itself).

I thought, “Can you imagine how cool it would be to be rocking a tablet with those three birds where anyone else would expect to see an Apple logo!?” Between that and ViewSonic offering quality monitors, it seemed like a no-brainer. Then I browsed the selection and saw no trace whatsoever of the finches. Not even the bezel on the front had them, the way it does on some of their monitors. All they had was the ViewSonic name embossed on the back. I was furious, that special kind of furious you reserve for the most first world of first world problems, the kind that has you toppling a chair as you storm out of the room before marching back in to continue hurling abuse at the screen.

“AND ANOTHER THING…!”

Seriously… you idiots! You had a Gouldian opportunity to show up Apple and make a name for yourselves in the tablet space, and you blew it. It didn’t help that they also weren’t offering any better resolution or image quality than their competition, indicating they simply put their name on something so they could say, “Yeah, we tried the tablet thing, but it didn’t work out. Let’s go back to making monitors.” The lineup did not last, and the Android tablet market would implode practically overnight, saturated with everything except colorful logos.

I bring this all up because I was considering upgrading my monitor to an ultrawide (the 21:9 ratio instead of the typical 16:9). I found a model that seemed to tick all the boxes except having the finches on it. That’s not really a dealbreaker for me, just disappointing. When I looked at the monitor’s specifications, something caught my attention. It’s not uncommon for monitors to have built-in USB hubs to help with cable management. It’s actually pretty smart. You’ve typically got your PC tower on the floor under your desk, so rather than get extension cables for all of your peripherals, you plug one long cable from the PC to the monitor and then plug all of your peripherals into the monitor.

However, this particular model had only one USB port, but proudly proclaimed: 
“Connect your compatible computer through the dual HDMI 2.0 and one DisplayPort inputs. A USB-A 2.0 port allows you to connect peripherals such as a wireless keyboard and mouse adapter through the monitor.”
I was very puzzled by this. How can you plug a mouse and keyboard into the monitor and then somehow get them to connect to your computer? Is there some obscure feature of the HDMI or DisplayPort connection that I don’t know about? I mean, both HDMI and DisplayPort carry audio information; maybe there’s another channel in the mix for other purposes. I decided to get some clarification and try my luck with the Q&A section of the B&H listing where I was originally browsing. A staff member got back to me within about 24 hours and clarified that the USB port on the monitor is there to accept a USB thumb drive for the sole purpose of upgrading the monitor’s firmware.

As of this writing, the B&H page for the monitor is unchanged, still claiming that the USB port on the monitor is for computer peripherals. I don’t blame B&H for that; they’re getting their sales blurb directly from ViewSonic. So, I went to the source and found this: 
“Two HDMI (v2.0) inputs, one DisplayPort (v1.4) input, and one USB-A (v2.0) input offer flexible connectivity so you can directly connect your keyboard, mouse and other peripherals.

Change the way you work and play with the VX3418-2K monitor today.

*165Hz refresh rate with DisplayPort only”
Now I was really puzzled. This is the source. This is the company making the thing. Do they know something I don’t? I found their sales support email and wrote them to get some clarification on the listing. Like B&H, they were very prompt in their response: 
Thank you for contacting ViewSonic!

Taking a look at the VX3418-2K, I can confirm that the USB-A is only for updating the monitors firmware. I have included an image of the user guide that explicitly states it.

Not too sure why it is marketed in this manner, but this is something we can take a look at and correct. Thank you for pointing this out.

So, they did… sort of… clumsily… and hastily:
“Two HDMI (v2.0) inputs, one DisplayPort (v1.4) input, and one USB-A (v2.0) input** offer flexible connectivity so you can directly connect your keyboard, mouse and other peripherals.

Change the way you work and play with the VX3418-2K monitor today.

*165Hz refresh rate with DisplayPort only

**For firmware updgrade (sic) use only”
I hear LG monitors are pretty good.

03 August 2025

H'Elio, Anybody There?


So, I have a question, but it’s a bit of a complex one that needs a lot of context for it to make sense, so please bear with me.

I recently saw Elio and my overall impression of it could best be summed up in a single word: Solid. It doesn’t excel in any area, but it doesn’t fall short anywhere, either. Another film I use the word “solid” to describe is The Last Starfighter, and Elio has a nice little nod to that film I appreciated. I do share the sentiment of a number of Pixar fans that their films simply don't pack the same punch they once did. For me, they peaked with The Incredibles and they’ve simply never been able to recapture that special something. To be fair, I haven’t seen Coco, and that seems almost universally loved by all who’ve seen it. I also rather enjoyed Elemental, though that was more for its visuals and world building than anything else it had to offer. Even Lightyear had its moments despite being an overall lackluster execution of an ill-conceived afterthought. Maybe there’s been a paradigm shift at Pixar, or maybe I’m just getting older. I mean, it couldn’t possibly be the latter, could it?

The criticisms of Elio seem to come in two flavors. One has to do with the art style being too similar to past entries, namely Luca and Turning Red. As much as I see the point, I can’t help but wonder if we’re not dealing with a little bit of cherry-picking here. Okay, Elio’s characters look like they could fit right in alongside those in Turning Red and Luca, but could you say that about Elemental, Soul, Lightyear, or Inside Out 2? Those all seem to have pretty distinct art styles from one another. Frankly, if Pixar movies are starting to blur together, it’s got more to do with the number of sequels they make than anything else. After all, why shouldn’t The Incredibles 2 have the same art style as The Incredibles? It would be rather odd if they took it in a drastically different direction. The same goes for the Toy Story films, as any changes to the art style seem to have more to do with technical innovations since the first film came out all the way back in 1995 than with any creative decision from on high. 1995? Damn, I am getting older.

The other flavor of critique concerns Elio’s marketing. There didn’t seem to be any shortage of YouTubers and social media influencers insisting they didn’t even know the movie was out because they never saw any ads for it. Well, that’s easy enough to explain. Elio was delayed by almost a year because of the SAG-AFTRA strikes, so chances are most if not all of that initial marketing budget was spent pushing a date that’s no longer valid. Marketing a movie is expensive and it’s not easy to make sharp right turns or pump the brakes. If the movie’s release date is pushed further down the pipe, that’s more money Pixar has to ask from Uncle Walt to get the movie out in front of people prior to the actual release date. That said, this is where I start to approach the question I brought up at the start. I use YouTube the way most people probably use Netflix or Hulu or HBO Max. It’s rare for me to binge a show or series of movies on one of those platforms, but I’m rather embarrassed to admit how many hours I’ve spent binging retro videogame reviews or story time animations or ASMR content. My point is that I’ve been seeing plenty of ads and trailers for Elio leading up to its release date. I don’t know where these people are getting the idea that they didn’t know the movie was coming out, because I’ve sure been made aware over the past several weeks.

So, here’s the question at long last: Are you guys all using adblockers?

Can you really complain about not knowing when a movie is coming out if you’ve cut off the mainstream means of marketing? How did you know about all the other movies you might have seen instead of Elio? I mean, I saw Superman the day before I saw Elio, and just like Elio, it had plenty of trailers in front of YouTube videos. That movie doesn’t seem to be doing so badly in theaters, so what am I missing here? I don’t think the art direction critique has a lot of credibility, and the lack of marketing is only understandable to a point when you consider how people are consuming content in the first place.

Pixar and Disney have released a statement effectively blaming moviegoers for Elio’s lackluster box office performance, insisting that they go out of their way to make these original stories only for people to not go and see them, all while complaining about there being too many sequels, reboots, and remakes. I don’t want to say they have a point, but I also don’t think they’re completely wrong. We turn out in droves for the latest entry in a long-running series, then say we’re tired of sequels, and finally add insult to injury by ghosting those movies that do try to be their own thing rather than part of some legacy.

There’s some study out there about how what people say they want doesn’t always match up with what they really want. It goes something like this: a group of participants was asked how they like their coffee, either rich and dark or milky and weak. Overwhelmingly, people said they liked their coffee rich and dark. However, when it came to their real-world spending habits, overwhelmingly the drink of choice was milky and weak. So, what’s going on here? One conclusion drawn was that people said they liked rich and dark because it sounded more appealing in their heads. Maybe they wanted to avoid some social stigma of not being able to handle this adult beverage like adults. In the end, when it came to actually having to drink coffee in their daily routine, they went with milky and weak because they didn’t want to deal with the harshness of the alternative. Speaking only for myself, when I used to drink coffee, if I ordered from Starbucks, I would order a Tall Blonde. I did this for two reasons. For one, it’s fun to say. For two, I’d take that Tall Blonde over to the counter with all the sugar and cream and add each until it tasted just right to me. In other words, I didn’t trust the baristas to know exactly what I wanted (I mean, I barely knew), so I took the basic foundation and built on it to my own tastes.

There’s no adding cream and sugar to a movie, not unless your theater has a liquor license and you’re really good at coming up with drinking games on the fly. Considering how expensive it can be to go to the movies these days, and with the economy in and out of the drunk tank since January, your only real choice is to potentially have a bad time at a movie where you don’t know what to expect, or to play it safe with a franchise that’s never let you down.

Much like my original question, the issue is complex and requires a lot of context to understand. Are people really unaware of certain movies coming out and saying as much, or are they sticking with what they know to avoid taking a risk?