Synopsis
sc chat [-hV] [-m=MODEL] [MESSAGE]
Description
Chat with a bot using various LLM providers.
Start interactive or single-shot conversations with AI models. Supports multiple models from Ollama and other providers.
REPL Mode:
sc chat
sc> Hello, how are you?
sc> exit
Single-shot Mode:
sc chat "What is the capital of France?"
Using Different Models:
sc chat -m llama3.2 "Explain machine learning"
sc chat --model codellama "Write a Python function"
With Custom Endpoint:
sc --base-url http://192.168.1.100:11434 chat "Hello"
Usage Modes
REPL Mode
When no message is provided, sc chat starts in REPL mode:
sc chat
sc> Hello, how are you today?
Hello! I'm doing well, thank you for asking. How can I help you today?
sc> What is the weather like?
I don't have access to current weather data, but I can help you with weather-related questions or point you to weather services.
sc> exit
Multimodal Mode
REPL mode supports multimodal prompts with image attachments using the @ symbol:
sc chat
sc> Describe this image @/path/to/image.png
This image shows a beautiful landscape with mountains in the background...
sc> Compare these screenshots @"screen 1.jpg" @'screen 2.png'
Looking at both screenshots, I can see the following differences...
sc> What's in this chart? @./data/chart.jpg
The chart displays quarterly sales data showing...
Supported Image Formats
- JPEG (.jpg, .jpeg)
- PNG (.png)
- GIF (.gif)
- WebP (.webp)
- BMP (.bmp)
File Path Features
- Tab completion: Press Tab after @ to browse and select files
- Absolute paths: @/full/path/to/image.jpg
- Relative paths: @./images/photo.png or @../docs/diagram.jpg
- Quoted paths: @"path with spaces.jpg" or @'another path.png'
- Multiple files: Attach multiple images in a single prompt
Single-shot Message Mode
Provide a message as an argument for one-time questions:
sc chat "What is the capital of France?"
The capital of France is Paris.
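Because single-shot mode takes the message as an argument and exits after one reply, it composes well with shell scripts. A minimal sketch, assuming `sc chat` writes only the model's reply to stdout:

```shell
# Capture a one-off answer in a variable for later use in a script.
# Assumption: `sc chat` prints the reply to stdout and nothing else.
answer=$(sc chat "What is the capital of France?")
echo "Model said: $answer"
```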
Model Selection
You can specify different models for different use cases:
General Purpose
sc chat -m llama3.2 "Explain quantum computing"
Code Generation
sc chat -m codellama "Write a Python function to sort a list"
Creative Writing
sc chat -m mistral "Write a short story about space exploration"
REPL Commands
In REPL mode, you can use these special commands:
- /exit, /quit, or /bye - Exit the chat session
- /help or /? - Show available commands
- /clear - Clear the chat history
Multimodal Features
File Attachment Syntax
Use the @ symbol to attach image files to your prompts:
- @filename.jpg - Attach a single image file
- @"path with spaces.png" - Use quotes for paths containing spaces
- @./relative/path.gif - Relative paths from the current directory
- @/absolute/path/image.webp - Absolute file paths
Tab Completion
Press Tab after typing @ to:
- Browse the file system
- Filter to show only supported image formats
- Navigate directories with arrow keys
- Select files with Enter
File Validation
The system automatically:
- Validates file existence before sending
- Checks for supported image formats
- Resolves relative paths to absolute paths
- Provides helpful error messages for invalid files
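The first two checks are easy to reproduce yourself, for example to pre-filter files before an interactive session. This is a hedged sketch of equivalent checks in POSIX shell, not the tool's actual implementation:

```shell
# Hedged sketch: accept a path only if the file exists and carries a
# supported image extension (mirrors the validation described above).
is_supported_image() {
  case "$1" in
    *.jpg|*.jpeg|*.png|*.gif|*.webp|*.bmp) [ -f "$1" ] ;;
    *) return 1 ;;
  esac
}
touch /tmp/demo.png            # stand-in file for the demonstration
is_supported_image /tmp/demo.png && echo "ok: /tmp/demo.png"
is_supported_image /tmp/notes.txt || echo "rejected: /tmp/notes.txt"
```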
Options
- -h, --help
  Show this help message and exit.
- -m, --model=MODEL
  Specify the LLM model to use for this conversation. Overrides the default model from configuration. Examples: llama3.2, codellama, mistral, phi3.
- -V, --version
  Print version information and exit.
Arguments
- [MESSAGE]
  Optional message to send to the bot. If not provided, enters REPL mode. In REPL mode, type /exit, /quit, or /bye to end the session.
Examples
Start REPL session with default model:
sc chat
Ask a quick question:
sc chat "How do I install Python?"
Use a specific model:
sc chat -m codellama "Show me a Java hello world example"
Analyze an image in REPL mode:
sc chat
sc> What do you see in this image? @./screenshot.png
sc> Compare these two charts @chart1.jpg @chart2.jpg
Connect to remote Ollama instance:
sc --base-url http://192.168.1.100:11434 chat -m llama3.2
Use multimodal prompts with different image formats:
sc chat
sc> Analyze this diagram @/home/user/docs/flowchart.png
sc> What's the difference between @"image A.jpg" and @'image B.webp'?
sc> Describe the contents of @../screenshots/ui.gif
Configuration
The chat command respects these configuration settings:
- provider - LLM provider selection
- providers.ollama.model - Default model to use when none is specified
- providers.ollama.base-url - Ollama server endpoint
Set these with:
sc config --set provider=ollama
sc config --set providers.ollama.model=llama3.2
sc config --set providers.ollama.base-url=https://ollama.example.com/api/
See Also
sc(1), sc-config(1)