
Setup Options

Choose the installation method that best suits your needs:

Option A: Using pyspur Python Package

This is the quickest way to get started. Python 3.12 or higher is required.
pip install pyspur
pyspur init my-project
cd my-project
The init command creates a new directory containing a .env file.
pyspur serve --sqlite
By default, this starts the PySpur app at http://localhost:6080 using a SQLite database. For a more stable experience, we recommend configuring a Postgres instance URL in the .env file, as sketched below.
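For reference, such an entry might look like the following (a sketch with placeholder credentials; check the .env file that pyspur init generated for the exact variable name it expects):
DATABASE_URL=postgresql://pyspur:change-me@localhost:5432/pyspur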
You can customize your PySpur deployment in two ways:
a. Through the app (Recommended):
  • Navigate to the API Keys tab in the app
  • Add your API keys for various providers (OpenAI, Anthropic, etc.)
  • Changes take effect immediately
b. Manual Configuration:
  • Edit the .env file in your project directory
  • It is recommended to configure a postgres database in .env for more reliability
  • Restart the app with pyspur serve; add --sqlite if you are not using Postgres (see the example below)
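A typical restart looks like one of the following, depending on the database you configured:
pyspur serve            (Postgres URL set in .env)
pyspur serve --sqlite   (no Postgres; keep the default SQLite)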

Option B: Deploying with Docker

This is the recommended way for production deployments.

First, install Docker by following the official installation guide for your operating system.
Once Docker is installed, create a new PySpur project with:
curl -fsSL https://raw.githubusercontent.com/PySpur-com/pyspur/main/start_pyspur_docker.sh | bash -s pyspur-project
This will:
  • Start a new PySpur project in a new directory called pyspur-project
  • Set up the necessary configuration files
  • Start the PySpur app automatically, backed by a local Postgres Docker instance
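If you prefer to inspect the script before executing it (an equivalent, download-first form of the same command):
curl -fsSL https://raw.githubusercontent.com/PySpur-com/pyspur/main/start_pyspur_docker.sh -o start_pyspur_docker.sh
less start_pyspur_docker.sh
bash start_pyspur_docker.sh pyspur-project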
Go to http://localhost:6080 in your browser.
You can customize your PySpur deployment in two ways:
a. Through the app (Recommended):
  • Navigate to the API Keys tab in the app
  • Add your API keys for various providers (OpenAI, Anthropic, etc.)
  • Changes take effect immediately
b. Manual Configuration:
  • Edit the .env file in your project directory
  • Restart the services with:
    docker compose up -d
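If a container does not pick up the new values after that, forcing a recreate (a standard Docker Compose option, not PySpur-specific) usually helps:
    docker compose up -d --force-recreate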
    

Using Local Models with Ollama

  1. Start the Ollama service with:
OLLAMA_HOST="0.0.0.0" ollama serve
  2. Update your .env file with:
OLLAMA_BASE_URL=http://host.docker.internal:11434
(host.docker.internal lets a Docker container reach the host; if you installed PySpur with pip and run it directly on the host, use http://localhost:11434 instead)
  3. Download models using: ollama pull <model-name>
  4. Select Ollama models from the sidebar for LLM nodes
Note: PySpur only works with models that support structured output and JSON mode. Most newer models support these, but please confirm in the Ollama documentation that the model you plan to use does.
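To confirm that Ollama is up and see which models you have pulled, you can query its REST API; the /api/tags endpoint lists local models:
curl http://localhost:11434/api/tags
(run this on the host; from inside a Docker container, substitute host.docker.internal for localhost)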

Next Steps

After installation, you can:
  • 🪄 Create New Workflow: Click “New Spur” to create a workflow from scratch
  • 📋 Use Templates: Start with one of our pre-built templates
  • 💾 Import Spur JSONs: Import spurs shared by other users
  • 🌐 Deploy as API: Single click using the “Deploy” button in the top bar

Need Help?