Supported Models in Refact

Cloud Version of Refact

Completion models

  • Refact/1.6B

Chat models

  • GPT 3.5
  • GPT 4 (Pro plan)

Self-Hosted Version of Refact

In the self-hosted version of Refact, you can choose from the following models:

Completion models

Fine-tuning is supported for a subset of the models below:

  • Refact/1.6B
  • Refact/1.6B/vllm
  • starcoder/1b/base
  • starcoder/1b/vllm
  • starcoder/3b/base
  • starcoder/3b/vllm
  • starcoder/7b/base
  • starcoder/7b/vllm
  • starcoder/15b/base
  • starcoder/15b/plus
  • starcoder2/3b/base
  • starcoder2/3b/vllm
  • starcoder2/7b/base
  • starcoder2/7b/vllm
  • starcoder2/15b/base
  • deepseek-coder/1.3b/base
  • deepseek-coder/1.3b/vllm
  • deepseek-coder/5.7b/mqa-base
  • deepseek-coder/5.7b/vllm
  • codellama/7b
  • stable/3b/code
  • wizardcoder/15b
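
If you want to call a self-hosted completion model programmatically rather than through an IDE plugin, a request might look like the minimal sketch below. It assumes the server is reachable at http://localhost:8008 and exposes an OpenAI-style /v1/completions endpoint; the address, path, and field names are assumptions, so verify them against your deployment before relying on this.

```python
# Minimal sketch: request a code completion from a self-hosted Refact server.
# Assumptions (verify against your deployment): the server listens on
# http://localhost:8008 and accepts OpenAI-style /v1/completions requests.
import requests

REFACT_URL = "http://localhost:8008/v1/completions"  # assumed address and path

payload = {
    "model": "Refact/1.6B",           # any completion model from the list above
    "prompt": "def fibonacci(n):\n",  # code context to complete
    "max_tokens": 50,
    "temperature": 0.2,
}

response = requests.post(REFACT_URL, json=payload, timeout=60)
response.raise_for_status()
print(response.json())
```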

Chat models

  • starchat/15b/beta
  • deepseek-coder/33b/instruct
  • deepseek-coder/6.7b/instruct
  • deepseek-coder/6.7b/instruct-finetune
  • deepseek-coder/6.7b/instruct-finetune/vllm
  • wizardlm/7b
  • wizardlm/13b
  • wizardlm/30b
  • llama2/7b
  • llama2/13b
  • magicoder/6.7b
  • mistral/7b/instruct-v0.1
  • mixtral/8x7b/instruct-v0.1
  • llama3/8b/instruct
  • llama3/8b/instruct/vllm
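
A chat model can be called in a similar way. The sketch below assumes the self-hosted server exposes an OpenAI-style /v1/chat/completions endpoint on port 8008; the URL, path, and response shape are assumptions to check against your own deployment.

```python
# Minimal sketch: send a chat request to a self-hosted Refact server.
# Assumptions (verify against your deployment): the server listens on
# http://localhost:8008 and accepts OpenAI-style /v1/chat/completions requests.
import requests

REFACT_URL = "http://localhost:8008/v1/chat/completions"  # assumed address and path

payload = {
    "model": "llama3/8b/instruct",  # any chat model from the list above
    "messages": [
        {"role": "user", "content": "Explain what a binary search does."}
    ],
    "max_tokens": 200,
}

response = requests.post(REFACT_URL, json=payload, timeout=60)
response.raise_for_status()
print(response.json())  # response layout assumed to follow the OpenAI chat format
```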

For an up-to-date list of models, see the Known Models file on GitHub.

Integrations

Refact.ai offers OpenAI and Anthropic API integrations.

To enable these integrations, navigate to the Model Hosting page and activate the OpenAI and/or Anthropic integrations by toggling the switch in the 3rd Party APIs section.

3rd Party APIs Section

Click the API Keys tab link to be redirected to the integrations page. Alternatively, you can access the integrations page by opening the Settings dropdown menu in the header and selecting Credentials.

Model Hosting Page with Dropdown Expanded

On the Credentials page, you can specify your OpenAI and/or Anthropic API keys.
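
Before pasting a key into the Credentials page, you may want to confirm it is valid. The sketch below is not part of Refact; it simply lists models from each provider's public API as a sanity check, using the standard OpenAI bearer-token header and the Anthropic x-api-key/anthropic-version headers.

```python
# Optional sanity check (not part of Refact): verify that an API key works
# by listing the provider's models before entering it in the Credentials page.
import requests

OPENAI_KEY = "sk-..."         # your OpenAI API key
ANTHROPIC_KEY = "sk-ant-..."  # your Anthropic API key

# OpenAI: a 200 response means the key is accepted.
r = requests.get(
    "https://api.openai.com/v1/models",
    headers={"Authorization": f"Bearer {OPENAI_KEY}"},
    timeout=30,
)
print("OpenAI key OK:", r.status_code == 200)

# Anthropic: the models endpoint requires the x-api-key and anthropic-version headers.
r = requests.get(
    "https://api.anthropic.com/v1/models",
    headers={"x-api-key": ANTHROPIC_KEY, "anthropic-version": "2023-06-01"},
    timeout=30,
)
print("Anthropic key OK:", r.status_code == 200)
```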