Support

Get help with InferrLM and learn how to make the most of your AI assistant

Email Support

Send us your questions, feedback, or report issues

sage_mastermind@outlook.com

GitHub Issues

Report bugs or request features on our GitHub repository

Open an Issue →

Frequently Asked Questions

Do I need an account to use InferrLM?

No! You can use local AI models without creating an account. An account is only required if you want to use cloud-based models (ChatGPT, Claude, Gemini, DeepSeek) with your own API keys.

Are my conversations private?

Yes! Conversations with local models are stored entirely on your device and never transmitted to our servers. When using remote models with your API keys, conversations go directly to those providers.

How do I download AI models?

Go to Models → Download Models to see our curated list of models optimized for mobile devices. Select a model and tap download. Models are fetched directly from HuggingFace.
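For illustration only, the sketch below shows how a model file can be fetched from HuggingFace's public "resolve" download endpoint, which is the kind of direct download described above. The repository and file names are placeholders, not a specific InferrLM model, and this is not the app's own download code.

```typescript
// Sketch: fetching a GGUF model file from HuggingFace's public
// /resolve/<revision>/<path> endpoint. Names below are placeholders.
import { writeFile } from "node:fs/promises";

async function downloadModel(repo: string, file: string): Promise<void> {
  const url = `https://huggingface.co/${repo}/resolve/main/${file}`;
  const response = await fetch(url);
  if (!response.ok) throw new Error(`Download failed: HTTP ${response.status}`);
  // Fine for small test files; large model files should be streamed to disk.
  const data = Buffer.from(await response.arrayBuffer());
  await writeFile(file, data);
  console.log(`Saved ${file} (${data.length} bytes)`);
}

// Placeholder repository and file name - substitute a real model to try this.
downloadModel("example-org/example-model-GGUF", "example-model-q4_k_m.gguf")
  .catch(console.error);
```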

What is RAG and how do I use it?

RAG (Retrieval-Augmented Generation) allows AI to reference your documents for more accurate responses. Upload documents via the attachment button in chat, and enable RAG in settings to process them locally on your device.

How does the local server feature work?

The local server runs on your device and exposes REST APIs on your WiFi network. Start it from the Server tab to access your AI assistant from browsers on other devices. Data stays on your local network.
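As a rough illustration, this is how a client on the same WiFi network could call such a REST API. The IP address, port, endpoint path, and response shape below are placeholders rather than InferrLM's documented API, so check the Server tab in the app for the actual address and routes.

```typescript
// Hypothetical example of calling the on-device server from another machine
// on the same network. Replace host, port, and path with the values shown
// in the app's Server tab.
const SERVER = "http://192.168.1.50:8080";          // placeholder address
const ENDPOINT = `${SERVER}/v1/chat/completions`;   // placeholder route

async function ask(prompt: string): Promise<string> {
  const response = await fetch(ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!response.ok) throw new Error(`Request failed: HTTP ${response.status}`);
  const data = await response.json();
  // The response shape is also an assumption; adjust to the real payload.
  return data.choices?.[0]?.message?.content ?? JSON.stringify(data);
}

ask("Summarize my last note in one sentence.").then(console.log).catch(console.error);
```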

Can I use vision models?

Yes! Download a vision-capable model such as SmolVLM2 (which includes its projector file). Then use the camera button in chat to capture an image, or pick one from your gallery, and send it to the vision model.

Which devices are supported?

InferrLM supports iOS 12.0+ and Android devices. For best performance with larger models, we recommend devices with at least 6GB RAM. Smaller models work well on most modern smartphones.

How do I delete my account?

Go to Settings → Account → Delete Account in the app. Your account data will be removed from our servers within 30 days. Local data remains on your device until you uninstall the app.

Is InferrLM open source?

Yes! InferrLM is licensed under AGPL-3.0. Visit our GitHub repository to view the source code, contribute, or report issues.

Need More Help?

If you cannot find the answer to your question, feel free to reach out to us directly. We typically respond within 24-48 hours.

Contact Support