Synopsis

sc chat [-hV] [-m=MODEL] [MESSAGE]

Description

Chat with a bot using various LLM providers.

Start interactive or single-shot conversations with AI models. Supports multiple models from Ollama and other providers.

REPL Mode:

  sc chat
  sc> Hello, how are you?
  sc> exit

Single-shot Mode:

  sc chat "What is the capital of France?"

Using Different Models:

  sc chat -m llama3.2 "Explain machine learning"
  sc chat --model codellama "Write a Python function"

With Custom Endpoint:

  sc --base-url http://192.168.1.100:11434 chat "Hello"

Usage Modes

REPL Mode

When no message is provided, sc chat starts in REPL mode:

  sc chat
  sc> Hello, how are you today?
  Hello! I'm doing well, thank you for asking. How can I help you today?
  sc> What is the weather like?
  I don't have access to current weather data, but I can help you with weather-related questions or point you to weather services.
  sc> exit

Single-shot Message Mode

Provide a message as an argument for one-time questions:

sc chat "What is the capital of France?"
The capital of France is Paris.

Model Selection

You can specify different models for different use cases:

General Purpose

  sc chat -m llama3.2 "Explain quantum computing"

Code Generation

  sc chat -m codellama "Write a Python function to sort a list"

Creative Writing

  sc chat -m mistral "Write a short story about space exploration"

REPL Commands

In REPL mode, you can use these special commands (an example session follows the list):

/exit or /quit or /bye

Exit the chat session.

/help or /?

Show available commands.

/clear

Clear the chat history.
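
A sample session using these commands (the responses shown are illustrative and may vary by version):

  sc chat
  sc> /help
  Available commands: /exit, /quit, /bye, /help, /?, /clear
  sc> /clear
  Chat history cleared.
  sc> /exit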

Options

-h, --help

Show this help message and exit.

-m, --model=MODEL

Specify the LLM model to use for this conversation.

Overrides the default model from configuration.

Examples: llama3.2, codellama, mistral, phi3

-V, --version

Print version information and exit.

Arguments

[MESSAGE]

Optional message to send to the bot.

If not provided, enters REPL mode.

In REPL mode, type /exit or /quit or /bye to end the session.
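
Quote messages containing spaces or shell metacharacters so the shell passes them as a single MESSAGE argument:

  sc chat "Summarize the difference between TCP and UDP"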

Examples

Start REPL session with default model:

  sc chat

Ask a quick question:

sc chat "How do I install Python?"

Use a specific model:

  sc chat -m codellama "Show me a Java hello world example"

Connect to a remote Ollama instance (no message is given, so this starts a REPL session against the remote server):

  sc --base-url http://192.168.1.100:11434 chat -m llama3.2

Configuration

The chat command respects these configuration settings:

provider

LLM provider selection

providers.ollama.model

Default model to use when none is specified

providers.ollama.base-url

Ollama server endpoint

Set these with:

  sc config --set provider=ollama
  sc config --set providers.ollama.model=llama3.2
  sc config --set providers.ollama.base-url=https://ollama.example.com/api/
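
The -m/--model flag overrides these settings for a single invocation; for example, with the configuration above, the following call still uses codellama:

  sc chat -m codellama "Write a unit test for this function"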

See Also

sc(1), sc-config(1)