Back in 2011 when Apple announced the iPad I believed it marked the real beginning of the personal computing era, rather than the beginning of the Post-PC Era as was being pronounced at the time. As I argued then, and still think is the case, we have historically thought of a personal computer as a device that one person uses. In other words, “personal” meant one. My argument back then was that “personal” really should mean what everyone knows it to mean: computing specific to you, the user. I imagine personal computing as the mashup of hardware + software + Internet + intelligence.

Today I learned via manton about Rabbit Inc. and the mobile device and operating system it has introduced, built around a Large Language Model and what the company calls a Large Action Model, to provide a user interface based on natural language processing and to execute actions rather than generate text and pictures. I don’t know whether the NLP works as well as demoed, but the demo at least shows exactly how I imagined personal computing should work. Speech is our primary mode of interaction, and typing is really abnormal and frankly unknown to the majority of people in the world.

I think the software is the most important part of what Rabbit Inc. has made; the purpose of the device is to prove the software’s functioning and capability. If it does perform as well as shown, I expect the company and product to be acquired by one of the big tech firms.