Did I just create the world's smallest AI server?
I successfully installed Ollama in Termux on my degoogled Unihertz Jelly Star, reputed to be the world's smallest smartphone, with a 3-inch screen. The Jelly Star packs 8+7 GB of RAM. I then downloaded and ran the distilled DeepSeek-R1:7b locally on the device. Is it slow? Yes. But it still steadily outputs text word by word, does not crash, and takes no longer than a couple of minutes to respond in full. Anyone have other examples of micro AI workstations?
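For anyone wanting to try the same thing, here is a minimal sketch of the Termux setup. It assumes a current Termux build whose repositories ship an `ollama` package (older builds may require compiling Ollama from source), so treat the exact commands as an approximation rather than a verified recipe:

```shell
# Minimal sketch: running Ollama on Android via Termux.
# Assumes a recent Termux install whose repos include an ollama package.

# Bring the package index and installed packages up to date.
pkg update && pkg upgrade -y

# Install Ollama from the Termux repositories.
pkg install -y ollama

# Start the Ollama server in the background within the Termux session...
ollama serve &

# ...then pull and run the distilled 7B DeepSeek-R1 model interactively.
# The first run downloads the quantized weights, which are several GB.
ollama run deepseek-r1:7b
```

The default quantized 7B weights take several gigabytes of storage, so the initial `ollama run` pull can take a while on a phone; once cached, subsequent runs start straight into the interactive prompt.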