Synopsis

sc [-hV] [--base-url=BASE_URL] [COMMAND]

Description

A runtime for AI chatbots supporting multiple LLM providers. sc is a command-line interface for interacting with Large Language Models (LLMs) from various providers including Ollama and OpenAI. It provides chat capabilities, configuration management, and RAG (Retrieval-Augmented Generation) functionality.

Examples
  • sc chat "What is the weather today?"

  • sc chat -m llama3.2 "Explain quantum computing"

  • sc config --set providers.ollama.model=codellama

  • sc rag --etl=vectorStore file:///path/to/document.pdf

Getting Started

sc is designed to make AI interactions simple and powerful. Before using the tool, you may want to initialize your configuration:

sc config init

This creates a configuration directory at $HOME/.sc/ and sets up default values for connecting to Ollama.
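To confirm the defaults were written, read them back with --get:

sc config --get providers.ollama.base-url
sc config --get providers.ollama.model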

Common Workflows

Basic Chat Session

Start a REPL chat session with the default model:

sc chat

Quick Question

Ask a single question without entering REPL mode:

sc chat "What is machine learning?"

Using Different Models

Specify a different model for specialized tasks:

sc chat -m codellama "Write a Python function to calculate fibonacci numbers"

Processing Documents

Use RAG to ingest and process documents from local files or URLs:

sc rag file:///path/to/document.pdf -o out.txt
sc rag https://example.com/article.html --etl=vectorStore

Configuration Management

View and modify configuration settings:

sc config --get providers.ollama.model
sc config --set providers.ollama.model=llama3.2
sc config --file

Options

--base-url=BASE_URL

Ollama API endpoint URL.

Overrides the configuration file setting.

-h, --help

Show this help message and exit.

-V, --version

Print version information and exit.

Commands

chat

Chat with a bot using various LLM providers.

config

Manage configuration settings for the sc CLI.

help

Display help information about the specified command.

rag

Interact with the RAG (Retrieval-Augmented Generation) system.

Configuration

sc uses a hierarchical configuration system:

  1. Command line options (highest priority)

  2. Configuration file at $HOME/.sc/config

  3. Default values (lowest priority)
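
For example, a --base-url passed on the command line wins over the base-url stored in the configuration file, for that invocation only:

sc config --set providers.ollama.base-url=http://localhost:11434
sc --base-url=http://192.168.1.100:11434 chat "Hello"    # this call uses 192.168.1.100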

Configuration File Format

The configuration file uses YAML format:

provider: ollama
providers:
  ollama:
    base-url: http://localhost:11434
    model: llama3.2
chat-memory:
  jdbc:
    url: jdbc:hsqldb:mem:testdb
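
The dotted keys accepted by config --get and config --set mirror this structure. Assuming --set accepts any key path that appears in the file (only the providers.ollama.* keys are shown being set elsewhere in this page), the chat-memory URL above could be changed with:

sc config --set chat-memory.jdbc.url=jdbc:hsqldb:mem:testdb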

Environment Variables

Settings can also be supplied as environment variables. Variable names are derived by uppercasing the configuration key and replacing dots and hyphens with underscores:

export PROVIDERS_OLLAMA_BASE_URL=http://192.168.1.100:11434
export PROVIDERS_OLLAMA_MODEL=codellama
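
Assuming these variables are read at startup, a standard shell prefix scopes an override to a single invocation:

PROVIDERS_OLLAMA_MODEL=codellama sc chat "Write a unit test for this function"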

Extended Examples

REPL Chat with Custom Model

sc --base-url=http://192.168.1.100:11434 chat -m codellama

Batch Document Processing

# Process multiple documents
for doc in *.pdf; do
  sc rag "file://$(pwd)/$doc" --etl=vectorStore
done
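
A variation of the same loop writes each document's output to a matching text file, using the -o option shown under Processing Documents (this assumes -o behaves the same way for every input type):

# Extract each PDF to a sibling .txt file
for doc in *.pdf; do
  sc rag "file://$(pwd)/$doc" -o "${doc%.pdf}.txt"
done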

Configuration Setup for Remote Ollama

sc config init
sc config --set providers.ollama.base-url=http://192.168.1.100:11434
sc config --set providers.ollama.model=llama3.2
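
To verify the new endpoint is in effect, read the setting back and ask a quick question:

sc config --get providers.ollama.base-url
sc chat "Hello"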

Files

$HOME/.sc/config

Main configuration file

$HOME/.sc/

Configuration directory

Bugs