There’s a lot of hullabaloo recently about ChatGPT and AI. It’s not surprising. The feats that these text systems are accomplishing are astounding, and creepily human-like. It raises the question: are we at the cusp of Artificial General Intelligence?
I’m not an AI expert. But I do know that most people don’t understand what it’s all about, and a lot of media just gets it horribly wrong. For many of us, if something smells and feels like it’s human-made, passing a Turing test, we implicitly intuit that the system is in fact “intelligent”, in a sense similar to how we intuit our own cerebral capabilities. We get an emotional response because the system is sufficiently human-like to convince us in our guts, which makes the leap to that conclusion feel logical, if not inevitable.
It turns out that everything we’ve built so far that claims some level of AI is in fact quite dissimilar to how our brains actually work. Well, sort of. To be clear, we just don’t know how our brains work. And systems like ChatGPT are in vital respects similarly black-boxed and inexplicable. “They just work.” Enormous neural nets with hundreds of billions of parameters (sound familiar?) end up producing language patterns that so convincingly replicate our own abilities that it confounds and amazes us. Is it as smart as us?
ChatGPT is certainly emulating our language construction semantics in a completely different way, on a completely different substrate. But the results are almost the same. So at minimum, it’s a fantastic tool. But it’s not a human. And it’s miles away from “general” intelligence. It’s a huge matrix of weighted coefficients that were machine learned ad nauseam to basically output something that makes sense to us, and is fairly narrowly applicable only to text generation.
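That “huge matrix of weighted coefficients” idea can be made concrete with a toy sketch. To be clear, this is my own illustrative example and nothing like ChatGPT’s actual scale or architecture: a tiny hand-written bigram table stands in for the learned weights, and “generation” is just repeatedly sampling the next word in proportion to those weights.

```python
# Toy sketch of weighted next-token generation (illustrative only; real
# models learn billions of weights rather than using a hand-made table).
import random

vocab = ["the", "cat", "sat", "on", "mat", "."]

# counts[w][n] plays the role of a learned weight: roughly, how often
# word n followed word w in some imagined training data.
counts = {
    "the": {"cat": 5, "mat": 3},
    "cat": {"sat": 8},
    "sat": {"on": 8},
    "on":  {"the": 8},
    "mat": {".": 8},
}

def next_token(word, rng):
    """Sample the next word in proportion to its weight."""
    options = counts[word]
    tokens = list(options)
    weights = [options[t] for t in tokens]
    return rng.choices(tokens, weights=weights, k=1)[0]

def generate(start="the", max_len=10, seed=0):
    """Generate words until a period appears or max_len is reached."""
    rng = random.Random(seed)
    out = [start]
    while out[-1] != "." and len(out) < max_len:
        out.append(next_token(out[-1], rng))
    return " ".join(out)

print(generate(seed=42))  # a short, grammatical-looking word sequence
```

Scale the table up by a factor of a few billion, learn the weights from a large chunk of the Internet instead of writing them by hand, and condition on a long window of context rather than one previous word, and you have the rough shape of the trick.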
Part of the wow factor here should be tempered by understanding that English is really “only” about 50,000 words, and there are only “so many” ways to put those words together in a manner that produces meaning. Don’t get me wrong — ChatGPT is four-letter-word-ing amazing, and certainly cool. But maybe constructing intelligible text is not as “complicated” as we think it is. Human language is astounding, and something that interests me to no end. However, perhaps it’s not as complex as it feels.
If you want to seriously nerd out on this topic, read this Stephen Wolfram article. It’s really, really long, and there’s a lot of math, and a lot of terms in quotes (spoiler: for good reason), but it seems incredibly sensible to me. I admit most of the math was outside my wheelhouse. But there’s still some great insight here about neural networks and machine learning and what’s really going on under the hood. There’s another, more easily digestible, related article below as well. Enjoy!
Recently, I made a major logistical decision to switch completely from Windows to macOS. I had been running Boot Camp for many years on MacBook Pro laptops, almost exclusively booting into Windows.
OK, wait. Why would I do such a silly thing?
It all started decades ago, when I became a diehard IBM ThinkPad fanboy. The devices were solid and reliable, without superfluous bells and whistles, and just got the job done. They usually lasted a few years before the specs simply weren’t good enough anymore; it was rarely hardware failure or flimsy construction that forced an upgrade.
Then IBM sold its personal computer division to Lenovo. Lenovo carried on the ThinkPad tradition, but over the years the quality started to dip, and the unnecessary extras started piling on, whether quirky hardware options or tag-along software packages à la Dell. I started getting frustrated.
Then, I decided to jump hardware ships to Apple, and never looked back. Although extremely marked up in price (the “Apple Tax”), the machines were just built well. Aluminum chassis. Reliable keyboard. They just worked, well above par compared with similar makes and models. (I am not claiming they were or are perfect — just that the standard was better and they felt more robust.) I started with Boot Camp from day one. I never even tried Parallels. I seldom booted into macOS, as there was almost never a pressing need.
A couple of things happened recently. First, Apple abandoned Intel CPUs for its own silicon. That means Boot Camp is out of the question for future models, and I knew I was getting close to needing a hardware upgrade after a few years with my current MacBook Pro.
Next, my son stopped playing Fortnite, which we would occasionally enjoy together, and hence I stopped as well. So I didn’t need a gaming experience anymore. Gaming on macOS is lackluster and sparse, but that is irrelevant if you’re not even gaming at all.
So, with a pending M1 or M2 chip in my destiny, I decided it was time to abandon Windows completely, and go all-in with macOS. It took me around two months to fully embrace the decision. The execution was painful and took a few weeks. Now the dust has settled, and I’m actually really pleased. The hardest part was finding a suitable replacement for Quicken for Windows, which I use to keep track of our home finances. I settled eventually on Banktivity, and it’s working well for me. So I’m a happy Apple geek.
Except…
After the switch, and spurred on by the switch itself, I got totally screwed by Apple on a major UX failure. I do bear a minimum of responsibility, but nonetheless, I firmly believe it’s mostly a mistake on Apple’s part.
I have for years had two SIM cards — one for Denmark, and one for the US. And whenever I’ve upgraded my iPhone, I’ve kept a defunct phone sitting at home with my AT&T SIM card, set to auto-forward calls to voicemail so I don’t get dinged by international calling rates. This way I could still get regular SMS messages to my US number. I was delighted when I could auto-forward these messages to my main iPhone. Huzzah! I was able to get my messages all in one place.
Then All Awesome Things Apple took it up a notch. Once I started using macOS daily, I saw that I was also getting iMessage and SMS texts directly on my computer. I actually found this to be completely fantastic.
At one point, my old phone, with the AT&T SIM card, started complaining about space problems. I went into Settings on the phone and looked at Storage, and could see that my Messages app was eating up over 3 GB of space. “Huh,” I thought. “I don’t really need to keep all these messages on this old phone. I can just keep them only on my new phone, and on macOS.”
So I flipped the switch for “Message History” from “Keep Messages Forever” to “Keep Messages for 30 Days”. And Dumb Me presumed (wrongly) that this setting was local to my phone. Not global to my entire iCloud account.
I never got any warning message on the old phone to the tune of, “Hey, are you sure about that? You’re about to remove 3 GB of message history that we know you have meticulously backed up for more than a decade, because you like to be able to see old messages with your family and friends. You like to peruse photos and videos and sound clips you have joyously exchanged with people you care about. Oh, and perhaps more importantly, this is a global setting, which means that all those messages will also POOF disappear on your other iPhone and your MacBook, and even a Cloud-based backup of your phone will not restore them. They will be wiped from existence. Are you really, really sure you want to make this change?”
No, there was no warning. Just a silent deletion of all my messages. I first noticed around a week later. It was too late.
And a week before that, I had purged some old iPhone backups on my disk, because they were on the Windows partition, and I just didn’t think I would care about them ever again. Joke’s on me!
Should I have thought twice before making that change to the settings in my old iPhone 5s? Yes. Do I think it’s a massive UX failure by Apple? Yes. I was on the phone with them in Ireland for like 45 minutes complaining about this. I’m hoping Apple considers some UI changes in the Messages Settings to at least warn users that the “Message History” setting is iCloud global — if you are using iCloud-based Messages, that is. That was also something I had recently changed. “Might as well back that up in the cloud too, right?” I would have been better off never backing it up in iCloud. But I just didn’t really know.
Anyway, <rant off>. I’m still overwhelmingly pleased with transitioning from Windows to Mac. I’m not gaming at all anyway, and I’ve adjusted fine to using Mail instead of Thunderbird (it was too buggy for me on macOS with multiple IMAP accounts). And I love having a real shell terminal. In general, the fairly seamless integration between and among apps and devices in the Apple Universe is well-executed, whereas it always feels like a kludgy add-on for Windows.
If you need emotional support to make the change, feel free to reach out. 🙂
Some of my fondest memories as a kid were from the Palo Alto Research Center (PARC). Xerox created the environment to foster innovation in technology. And there was so much cool innovation going on. I got to play around with it all first-hand, as a 7-year-old in 1978, when my father joined along with other cohorts from CMU. Until my dad left in the early/mid-80s for Digital Equipment Corporation, Xerox PARC was a second home for me. There were so many new cool things being done at that time — networking, graphical user interfaces, email, you name it. My favorite pastime was playing Trek.
Xerox pulled out of PARC decades ago. And now Xerox has pulled its final plug. Lots of memories. Lots of nostalgia.
Years ago, when my dad, Phil Karlton, was working at Netscape, he touted a now-famous phrase:
There are only two hard things in Computer Science: cache invalidation and naming things.
It’s been fun following this quote on the Internet for the past two decades. It’s clearly been so sticky because it’s both funny and true. There have been lots of riffs on it, as well, all of which are delightful. Thanks, Internet.
Today marks the inauguration of the 45th President of the United States. The Donald. Everyone is coping with this in totally different ways, from elation to abject horror. We seem more and more divided, and the language each side speaks is incomprehensible to the other.
The only way to get out of this quagmire of disillusion and angst is communication and dialog. Until everyone can feel like they’re being heard, respected, and understood, we ain’t goin’ nowhere but down. The more we disagree, the more we need to open the channels of discussion, and shut down hatred and reactionary responses.
You can’t tell someone they’re stupid, expect them to suddenly have a moment of quiet self-reflection, and just suddenly agree with you. “Oh wait, you’re right! I am stupid! How did I not realize this before?” Said no one. Ever.
We must dispel and stomp out ignorance. Not by trashing those we deem as lacking. But by lifting them up. By sharing, by finding common ground, and working as a team.
Either way, I’m gonna miss the Obamas in the White House. I look forward to how they will contribute to this world in the years to come.
I thoroughly and wholeheartedly agree with this assessment of what’s happening to our collective consciousness since the rise of social media. We feel like the world is hurting and worsening, but the opposite is true.
I always loved Georgia O’Keeffe’s work. My mom did as well. We had one of those large coffee table books at home with a bunch of her paintings. I always loved to flip through the pages and see good art.
Now there’s a new book coming out showing some of her watercolor studies.
I haven’t posted anything in over a year. Sorry about that. Either I haven’t had anything interesting to say, or I haven’t prioritized the time to say something interesting. I am placing all bets on the latter.
Things that have occupied my brain for the past 14 months, aside from work and family:
About two weeks ago, a fairly cool live video tweet service was launched, aptly named Meerkat.
Not surprisingly, I started getting lots ‘n lots of hits on my site, from folks who, I naturally assume, assumed that meerkat.com would be the app’s domain.
Sorry folks. It’s just me. The occasional (annual?) blogger.