Using LLMs on Your Phone Locally in Three Easy Steps
This post was originally published on Medium: https://medium.com/@frank.t.f.ye/using-llms-on-your-phone-locally-in-three-easy-steps-30367c3a60a3

Recently, I've been exploring ways to run Large Language Models (LLMs) locally on my phone, and I'm excited to share what I've learned. With the latest mobile devices packing increasingly powerful hardware, running your own AI models locally has become a reality. The recent release of smaller but still capable Llama 3.2 models makes it possible to run them on both iOS and Android devices. This is particularly useful for tasks like drafting emails when you have no internet access, or when you have privacy concerns about cloud-based AI services.

iOS Users

If you're using Apple devices like me, setting up a local LLM is surprisingly straightforward. Just head to the App Store and download "LLM Farm", an impressive local LLM runner developed by Artem (link), and it's completely open-source!...
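For a sense of what "running an LLM locally" looks like in practice, here is a minimal sketch of loading a small quantized Llama 3.2 GGUF model and generating text on your own machine using the llama-cpp-python bindings. This is an illustration under assumptions, not part of the app setup: the model filename is hypothetical, the parameters are just reasonable defaults, and the snippet runs on a laptop rather than a phone, but the basic idea (a small quantized model plus a local runtime, no network calls) is the same one these mobile runners implement.

```python
# Minimal local-inference sketch using the llama-cpp-python bindings.
# The model path is a placeholder: any small quantized GGUF model
# (e.g. a Llama 3.2 1B/3B instruct build) downloaded to disk will do.
from llama_cpp import Llama

llm = Llama(
    model_path="./llama-3.2-1b-instruct-q4_k_m.gguf",  # hypothetical filename
    n_ctx=2048,    # context window size
    n_threads=4,   # CPU threads; tune for your device
)

# Generate entirely on-device: no internet connection is needed,
# which is exactly the offline/privacy use case described above.
output = llm(
    "Draft a short, polite email asking to reschedule tomorrow's meeting.",
    max_tokens=200,
    temperature=0.7,
)

print(output["choices"][0]["text"])
```

Apps like LLM Farm wrap this same kind of workflow behind a mobile UI, so you pick a model once and then just type prompts.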