Introduction
AIChat is a versatile tool that allows you to interact with various Large Language Models (LLMs) directly from your terminal. It's designed to be user-friendly for beginners, yet powerful enough for advanced users who want to automate workflows using macros and shell integration.
Terminal Setup
To begin, install AIChat; the exact process depends on your operating system. Running `aichat` for the first time will prompt you to create a configuration file, which stores your API key and other settings. This file is typically located at `~/.config/aichat/config.yaml` on Linux, `C:\Users\Alice\AppData\Roaming\aichat\config.yaml` on Windows, or `/Users/Alice/Library/Application Support/aichat/config.yaml` on macOS (where `Alice` is your username). During this initial setup you'll be asked to specify the platform and API key.
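As a sketch, a minimal `config.yaml` might look like the following; the key names match AIChat's documented layout, but the model name and API key shown are placeholders:

```yaml
# Minimal illustrative config.yaml -- model and api_key are placeholders.
model: openai:gpt-4o
clients:
  - type: openai
    api_key: sk-your-key-here
```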
Environment Management
AIChat also supports environment variables, which are useful for sensitive information such as API keys or model paths; keeping such values out of configuration files is essential.
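For instance, many providers' keys can be supplied through an environment variable instead of the config file. `OPENAI_API_KEY` is a typical example; the exact variable names depend on your provider:

```shell
# Export the key in your shell profile rather than storing it in config.yaml.
# The value here is a placeholder.
export OPENAI_API_KEY="sk-your-key-here"
```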
AIChat REPL
The heart of AIChat is its Read-Eval-Print Loop (REPL), which provides an interactive environment for engaging with LLMs. Here’s a simplified view of the REPL workflow:
REPL Command Workflow
- Command input (e.g., `.model gpt-4o`) ⬇️
- Processing (AIChat interprets the command and config settings) ⬇️
- Output generation (command response or AI output) ⬇️
- Optional continuation (e.g., `.continue`, `.regenerate`)
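The workflow above can be sketched as a toy loop in plain shell; the `AI>` echo is a stand-in for the model call, and `.exit` stands in for AIChat's dot-commands:

```shell
# Toy read-eval-print loop: read input, handle dot-commands, print a reply.
out="$(
  while IFS= read -r line; do
    case "$line" in
      .exit) break ;;                     # dot-command: leave the loop
      *)     printf 'AI> %s\n' "$line" ;; # a real REPL would query the model
    esac
  done <<'EOF'
hello
.exit
EOF
)"
printf '%s\n' "$out"   # prints: AI> hello
```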
Here are some basic examples of using AIChat from the command line:

- `aichat`: Starts the REPL.
- `aichat Tell a joke`: Sends a simple query to the LLM.
- `aichat -e install nvim`: Generates a shell command from the description and offers to execute it.
- `aichat -c fibonacci in js`: Generates pure code output.
- `aichat -m openai:gpt-4o`: Selects the LLM to use.
- `aichat -r role1`: Uses the role `role1`.
- `aichat -s`: Starts a temporary session.
- `aichat -s session1`: Uses the session `session1`.
- `aichat -a agent1`: Uses the agent `agent1`.
- `aichat --rag rag1`: Uses the RAG `rag1`.
- `aichat --info`: Shows system information.
- `aichat -r role1 --info`: Shows role info.
- `aichat -s session1 --info`: Shows session info.
- `aichat -a agent1 --info`: Shows agent info.
- `aichat --rag rag1 --info`: Shows RAG info.
Beginner Tips:
- 💡 Start simple: enter `aichat Tell me a joke` to experience the magic!
- 💡 Use `aichat --info` to explore your configuration and capabilities.
Macros
Macros are a powerful feature that let you automate sequences of REPL commands. They are defined in YAML files located in the `<aichat-config-dir>/macros/` directory. For example, a macro to generate a commit message could be defined in a `generate-commit-message.yaml` file like this:

```yaml
steps:
  - .file `git diff` -- generate git commit message
```

You can then execute it with `aichat --macro generate-commit-message`.
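A quick way to create that macro file from the shell (`~/.config/aichat` is the Linux config directory; adjust the path for your OS):

```shell
# Write the macro definition into the macros directory.
MACRO_DIR="$HOME/.config/aichat/macros"
mkdir -p "$MACRO_DIR"
cat > "$MACRO_DIR/generate-commit-message.yaml" <<'EOF'
steps:
  - .file `git diff` -- generate git commit message
EOF
```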
Imagine chaining `.file` and `.prompt` commands to write and deploy a script. Create a macro named `deploy-script.yaml` in `<aichat-config-dir>/macros/` like this:

```yaml
steps:
  - .file script.py -- improve code style
  - .file script.py -- deploy changes
```

Run it with `aichat --macro deploy-script` for instant automation.
Function Calling
AIChat supports function calling, enabling it to interact with external tools. You can configure tools in your configuration file using `mapping_tools`. For example:

```yaml
function_calling: true
mapping_tools:
  fs: 'fs_cat,fs_ls,fs_mkdir,fs_rm,fs_write'
use_tools: null
```

You can then use the `fs` tool group by setting `use_tools: fs`.
Shell Assistant
AIChat can act as a shell assistant, generating commands based on your natural language input. It’s aware of the OS and shell you are using, and will generate appropriate commands for your system.
Shell Integration and Autocompletion
AIChat also offers shell integration and autocompletion: press `alt+e` to get intelligent completions in your terminal, and the completion scripts suggest commands, options, and filenames as you type, enhancing your workflow. Here's how to set up shell integration:
Shell Integration Steps
- Install the AIChat binary.
- Add `aichat` to PATH.
- Source the completion script for your shell, e.g. for bash:

```shell
source <(aichat completion bash)
```

Integration scripts are available for various shells, including bash, zsh, PowerShell, fish, and nushell.
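To make the bash hook persistent, one approach is to append the sourcing line to `~/.bashrc`; this sketch is idempotent, so re-running it won't add duplicates (other shells use their own rc files and completion commands):

```shell
# Append the bash completion hook to ~/.bashrc, but only once.
RC="$HOME/.bashrc"
touch "$RC"
grep -q 'aichat completion bash' "$RC" || \
  echo 'source <(aichat completion bash)' >> "$RC"
```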
Generate Code & Use Files
The `-c`/`--code` flag will produce pure code output, while `-f`/`--file` can be used to send file contents to LLMs. For example:

- `aichat -c fibonacci in js` will generate JavaScript code.
- `aichat -f data.txt` will send the contents of `data.txt` to the LLM.
- `aichat -f image.png ocr` will send the image for OCR analysis.
Real-World Use Cases:
Quickly summarize documents with AIChat: use `aichat -f report.txt summarize` to get key insights from long files.
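A sketch of the round trip: create a sample report file, then (with AIChat installed and configured) feed it to the model. The file contents here are made up for illustration:

```shell
# Create a sample report file to summarize.
printf 'Q3 revenue grew 12%% while support tickets fell 8%%.\n' > report.txt
# With AIChat installed and configured, you would then run:
#   aichat -f report.txt summarize
```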
ARGC and JQ Enhancements
These enhancements enable more sophisticated input and output manipulation, letting AIChat fit seamlessly into scripting workflows.
Gemini EXP and Ollama Integration
AIChat supports integration with Gemini EXP and alternatives using Ollama, providing greater flexibility in selecting the models you want to use.
Troubleshooting
Here’s a table to help with common issues:
| Issue | Cause | Solution |
|---|---|---|
| API key error | Missing/incorrect API key | Update the config file, or check settings with `aichat --info`. |
| Model not loading | Model path not set correctly | Verify the model in `~/.config/aichat/config.yaml`. |
| Command not found | Binary not in PATH | Add AIChat to PATH, e.g. `export PATH=$PATH:/path/to/aichat`. |
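For the PATH fix, `/path/to/aichat` is a placeholder for whatever directory actually contains the binary:

```shell
# Append the install directory to PATH for the current session; add the
# same line to your shell rc file to make it permanent.
export PATH="$PATH:/path/to/aichat"
```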
This guide provides a foundation for understanding AIChat. By experimenting and using these features, you can unlock a powerful set of LLM tools from your command line. If you need more specific guidance, don’t hesitate to ask!