
QuickStart

Get up and running with Jan in minutes. This guide will help you install Jan, download a model, and start chatting immediately.

    Step 1: Install Jan

    1. Download Jan
    2. Install the app (Mac, Windows, Linux)
    3. Launch Jan

    Step 2: Download a Model

    We recommend starting with Jan v1, our 4B-parameter model optimized for reasoning and tool calling:

    1. Go to the Hub Tab
    2. Search for Jan v1
    3. Choose a quantization that fits your hardware (a rough size estimate is sketched after this step):
      • Q4_K_M (2.5 GB) - Good balance for most users
      • Q8_0 (4.28 GB) - Best quality if you have the RAM
    4. Click Download

    Download Jan v1
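The sizes above follow roughly from bits-per-weight: a 4B-parameter model at about 4.9 bits per weight lands near 2.5 GB, and at about 8.5 bits per weight near 4.3 GB. A minimal back-of-the-envelope sketch (the bits-per-weight figures are approximations, not exact GGUF numbers):

```python
# Rough GGUF size estimate: parameters * bits-per-weight / 8 bytes.
# Bits-per-weight values are approximations for illustration only.
PARAMS = 4e9  # Jan v1 has roughly 4 billion parameters

for name, bpw in [("Q4_K_M", 4.9), ("Q8_0", 8.5)]:
    size_gb = PARAMS * bpw / 8 / 1e9
    print(f"{name}: ~{size_gb:.2f} GB")  # close to the 2.5 GB / 4.28 GB listed above
```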

    Hugging Face models: some require an access token. Add yours in Settings > Model Providers > Llama.cpp > Hugging Face Access Token.

    Add HF Token
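If a model on Hugging Face is gated, the same access token also works outside the app, for example with the huggingface_hub Python package. A minimal sketch, assuming the repo id `janhq/Jan-v1-4B-GGUF` and the filename `Jan-v1-4B-Q4_K_M.gguf` (both are assumptions; check the model page for the exact names):

```python
# Sketch: download a GGUF file using a Hugging Face access token.
# The repo id and filename are assumptions -- verify them on the model page.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="janhq/Jan-v1-4B-GGUF",    # assumed repo id
    filename="Jan-v1-4B-Q4_K_M.gguf",  # assumed filename
    token="hf_your_token_here",        # your Hugging Face access token
)
print(path)  # local path of the downloaded model file
```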

    Step 3: Enable GPU Acceleration (Optional)


    For Windows/Linux with compatible graphics cards:

    1. Go to Settings > Hardware
    2. Toggle GPUs to ON

    Turn on GPU acceleration
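If you're not sure whether your card is supported, a reasonable first check is that the vendor driver is installed and sees the GPU (for example, `nvidia-smi` listing your card on NVIDIA systems).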

    Step 4: Start Chatting

    1. Click the New Chat icon
    2. Select your model from the dropdown in the input field
    3. Type your message and start chatting

    Create New Thread

    Try asking Jan v1 questions like:

    • “Explain quantum computing in simple terms”
    • “Help me write a Python function to sort a list”
    • “What are the pros and cons of electric vehicles?”

Jan organizes conversations into threads for easy tracking and revisiting.

  • Left sidebar shows all conversations
  • Click any chat to open the full conversation
  • Favorites: Pin important threads for quick access
  • Recents: Access recently used threads

Favorites and Recents

To rename a thread:

  1. Hover over a conversation in the sidebar
  2. Click the three-dots icon
  3. Click Rename
  4. Enter the new title and save

Context Menu

To delete threads:

Single thread:

  1. Hover over the thread in the sidebar
  2. Click the three-dots icon
  3. Click Delete

All threads:

  1. Hover over the Recents category
  2. Click the three-dots icon
  3. Select Delete All

Customize how models respond:

  • Use the assistant dropdown in the input field, or go to the Assistant tab to create custom instructions
  • Instructions work across all models


Add an Assistant Instruction
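For example, an instruction like "Answer concisely and show code samples in Python" shapes responses whether you're chatting with Jan v1 locally or with a cloud model.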

Fine-tune model behavior:

  • Click the Gear icon next to your model
  • Adjust parameters in Assistant Settings
  • Switch models via the model selector

Chat with a Model
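The exact parameters available depend on the model and provider, but they typically include sampling controls such as temperature (higher values produce more varied output) and limits such as context length.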

Connect to OpenAI, Anthropic, Groq, Mistral, and others:

  1. Open any thread
  2. Select a cloud model from the dropdown
  3. Click the Gear icon beside the provider
  4. Add your API key (ensure sufficient credits; a quick key check is sketched below)

Connect Remote APIs
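If a cloud provider isn't responding in Jan, it can help to confirm the key itself works. A quick sanity check using the official openai Python package, assuming an OpenAI key and a model your account can access (`gpt-4o-mini` here is only an example):

```python
# Sanity check: confirm an OpenAI API key is valid and has credits.
# The model name is an example -- use any model your account can access.
from openai import OpenAI

client = OpenAI(api_key="sk-...")  # the same key you paste into Jan
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello in one word."}],
)
print(response.choices[0].message.content)
```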

For detailed setup, see Remote APIs.