Local AI refers to running AI applications directly on a device instead of relying on (distant) cloud servers. On-device AI runs on commodity hardware (e.g. older PCs), consumer devices (e.g. smartphones, wearables), and other kinds of embedded devices. A simple local AI application can often meet the needs of a project or analysis while reducing the risks that come with sending data to public models.
Additional benefits include:

- Privacy / data security: data stays on the device and under one's own control.
- Accessibility: Small Language Models (SLMs) are far more affordable to train and use, run on a wide range of hardware, and work without an internet connection (offline).
- Sustainability: running a small model locally typically consumes significantly less energy than comparable cloud instances.