Supported Models in Refact.ai
Cloud Version
With Refact.ai, access state-of-the-art models in your VS Code or JetBrains plugin and select the optimal LLM for each task.
AI Agent models
- GPT-4.1 (default)
- Claude 3.7 Sonnet
- Claude 3.5 Sonnet
- GPT-4o
- o3-mini
Chat models
- GPT-4.1 (default)
- Claude 3.7 Sonnet
- Claude 3.5 Sonnet
- GPT-4o
- GPT-4o-mini
- o3-mini
For select models, click the Think button to enable advanced reasoning, which helps the AI solve complex tasks. Available only in the Refact.ai Pro plan.
Code completion models
- Qwen2.5-Coder-1.5B
Configure Providers (BYOK)
Refact.ai gives you the flexibility to connect your own API key and use external LLMs like Gemini, Grok, OpenAI, DeepSeek, and others. For a step-by-step guide, see the Configure Providers (BYOK) documentation.
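Before adding a key in the plugin, it can be useful to confirm it actually works. The sketch below is an illustrative check that is independent of Refact.ai: it assumes the provider exposes an OpenAI-compatible endpoint, and the base URL and model id are placeholders you would replace with your provider's values.

```python
# Sanity-check an external provider key before configuring it as a BYOK provider.
# Illustrative only; base_url and model are placeholders, not values required by Refact.ai.
import os

from openai import OpenAI  # pip install openai

client = OpenAI(
    api_key=os.environ["PROVIDER_API_KEY"],          # your own key
    base_url="https://api.example-provider.com/v1",  # provider endpoint (placeholder)
)

response = client.chat.completions.create(
    model="example-model-name",  # placeholder model id
    messages=[{"role": "user", "content": "Reply with OK if you can read this."}],
    max_tokens=5,
)
print(response.choices[0].message.content)
```

If the script prints a reply, the key and endpoint are valid and can be added as a provider in the plugin.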
Self-Hosted Version
In Refact.ai Self-hosted, you can choose among 20+ models, ready for any task. The full, always up-to-date lineup is in the Known Models file on GitHub.
Completion models
| Model Name | Fine-tuning support |
|---|---|
| Refact/1.6B | ✅ |
| Refact/1.6B/vllm | ✅ |
| starcoder/1b/base | ✅ |
| starcoder/1b/vllm | |
| starcoder/3b/base | ✅ |
| starcoder/3b/vllm | |
| starcoder/7b/base | ✅ |
| starcoder/7b/vllm | |
| starcoder/15b/base | |
| starcoder/15b/plus | |
| starcoder2/3b/base | ✅ |
| starcoder2/3b/vllm | ✅ |
| starcoder2/7b/base | ✅ |
| starcoder2/7b/vllm | ✅ |
| starcoder2/15b/base | ✅ |
| deepseek-coder/1.3b/base | ✅ |
| deepseek-coder/1.3b/vllm | ✅ |
| deepseek-coder/5.7b/mqa-base | ✅ |
| deepseek-coder/5.7b/vllm | ✅ |
| codellama/7b | ✅ |
| stable/3b/code | |
| wizardcoder/15b | |
Chat models
| Model Name |
|---|
| starchat/15b/beta |
| deepseek-coder/33b/instruct |
| deepseek-coder/6.7b/instruct |
| deepseek-coder/6.7b/instruct-finetune |
| deepseek-coder/6.7b/instruct-finetune/vllm |
| wizardlm/7b |
| wizardlm/13b |
| wizardlm/30b |
| llama2/7b |
| llama2/13b |
| magicoder/6.7b |
| mistral/7b/instruct-v0.1 |
| mixtral/8x7b/instruct-v0.1 |
| llama3/8b/instruct |
| llama3/8b/instruct/vllm |
Integrations
In self-hosted mode, you can also configure OpenAI and Anthropic API integrations.
- Go to the Model Hosting page → 3rd Party APIs section and toggle the switches for OpenAI and/or Anthropic.
- Click the API Keys tab to be redirected to the integrations page (or go via Settings → Credentials).
- Enter your OpenAI and/or Anthropic key.
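Before pasting keys into the Credentials page, you may want to verify they are valid. The snippet below is a minimal sketch that is independent of Refact.ai; it assumes the official `openai` and `anthropic` Python SDKs are installed, and the Claude model id is only an example that may need updating.

```python
# Quick validity check for OpenAI and Anthropic keys before adding them
# under Settings -> Credentials. Illustrative only; not part of Refact.ai.
import os

import anthropic           # pip install anthropic
from openai import OpenAI  # pip install openai

# OpenAI: listing models succeeds only with a valid key.
openai_client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
print("OpenAI models available:", len(list(openai_client.models.list())))

# Anthropic: a minimal message round-trip confirms the key works.
anthropic_client = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])
reply = anthropic_client.messages.create(
    model="claude-3-5-sonnet-latest",  # example model id; adjust as needed
    max_tokens=5,
    messages=[{"role": "user", "content": "Reply with OK."}],
)
print("Anthropic reply:", reply.content[0].text)
```

If both calls succeed, the keys can be entered in the integrations page and the corresponding third-party models will become available in Model Hosting.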