Pro to Technology@lemmy.world · English · 13 days ago
Google quietly released an app that lets you download and run AI models locally (github.com)
48 comments · 246 upvotes · cross-posted to: localllama@sh.itjust.works
Greg Clarke · English · 3 points · 12 days ago
Has this actually been done? If so, I assume it would only be able to use the CPU.
@Euphoma@lemmy.ml · English · 7 points · 12 days ago
Yeah, I have it in Termux; Ollama is in the Termux package repos. The generation speed does feel like CPU speed, but I'm not sure.
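For reference, Ollama exposes a local HTTP API (by default on port 11434), so a model served this way can be queried from a script on the same device. A minimal sketch, assuming `ollama serve` is running in Termux and a small model has already been pulled; the model tag `llama3.2:1b` is illustrative:

```python
# Minimal sketch: query a locally running Ollama server (default port 11434).
# Assumes `ollama serve` is running and a small model has been pulled,
# e.g. `ollama pull llama3.2:1b` (model name is an example, not prescribed here).
import json
import urllib.request


def generate(prompt: str, model: str = "llama3.2:1b") -> str:
    """Send one non-streaming generation request to the local Ollama API."""
    req = urllib.request.Request(
        "http://127.0.0.1:11434/api/generate",
        data=json.dumps({"model": model, "prompt": prompt, "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    print(generate("Why does on-device inference on a phone usually run on the CPU?"))
```

On a phone without GPU acceleration exposed to Ollama, responses like this are generated entirely on the CPU, which matches the speed described above.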