My latest video looks at ChatGPT alternatives that can be run locally on personal computers, both PCs and Macs.
I first look at Nvidia’s Chat with RTX, a tool that lets users run a ChatGPT-like chatbot locally. Chat with RTX only works with Nvidia’s newer 30- or 40-series GPUs, which could be a limitation for some users. I tested it on a Lenovo Legion 5 Pro (affiliate link) with an RTX 4060 GPU on board. Disclosure: the laptop is on loan from Lenovo.
I then tried GPT4All, an open-source large language model client that offers similar functionality to Chat with RTX but without the need for high-end GPU hardware. Like Chat with RTX, GPT4All is user-friendly, requiring minimal setup and no advanced developer tools. GPT4All runs on a variety of operating systems, including macOS, Linux, and Windows, which broadens its accessibility. For optimal performance, however, 16 GB of system RAM is recommended, especially on Windows.
In testing these platforms, I observed that while these AI models are capable, they are not nearly as good as ChatGPT. My test involved having each AI summarize one of my prior video transcripts for a blog post. More often than not, they got the context of the video wrong and even made things up rather than sticking to the facts in the source text they were summarizing.
But this does show how quickly AI technology is moving out of large data centers and into something that can run locally on a laptop. I was particularly impressed with how fast and responsive GPT4All was on my M2 MacBook Air compared to a Lenovo ThinkBook running a 13th-generation Intel processor.
Both chat clients let the user choose from a number of different large language models. Although I only looked at three of those models in the video, there are many more available as free downloads to explore. These models are being updated all the time, so I’m sure we’ll see some rapid improvements as the year progresses.