local.ai
Experiment with AI models locally without having to set up a full-blown ML stack. Powered by a native app designed to simplify the whole process, from downloading a model to starting an inference server. No GPU required!
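Once the app has started an inference server, a client can talk to it over plain HTTP. The sketch below builds (but does not send) such a request in Python; the port `8000` and the `/completions` endpoint path are assumptions for illustration, not documented values of local.ai.

```python
import json
from urllib import request

# Assumed base URL: local.ai's actual host/port may differ.
BASE_URL = "http://localhost:8000"

def build_completion_request(prompt: str, max_tokens: int = 64) -> request.Request:
    """Build (but do not send) an HTTP request for a local text completion."""
    payload = json.dumps({"prompt": prompt, "max_tokens": max_tokens}).encode()
    return request.Request(
        f"{BASE_URL}/completions",  # hypothetical endpoint path
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_completion_request("Hello, local model!")
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` would then return the model's completion, assuming a server is listening at that address.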
Visits: 15.0K
Country: United States