Installation

Quick guide: see Quick Start Guide (installation in 2 minutes).

Technical terms: see Glossary for uv, S3, Docker, etc.

Prerequisites

  • Python 3.13+ (3.13.7 expected)

  • uv (package manager)

  • S3 Access (Garage Storage) - see Garage S3 Storage

Local Installation

1. Clone the Repository

git clone https://github.com/julienlafrance/backtothefuturekitchen.git ~/mangetamain
cd ~/mangetamain/10_preprod

2. Create the Virtual Environment

uv sync

This command creates the virtual environment and installs all dependencies defined in pyproject.toml.

3. S3 Configuration

Copy S3 credentials to 96_keys/credentials:

[s3fast]
aws_access_key_id = YOUR_KEY
aws_secret_access_key = YOUR_SECRET

Data is automatically loaded from S3 when the application starts.
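
If you want to check S3 access outside the application, here is a minimal sketch of the same kind of loading. It reads the [s3fast] profile from 96_keys/credentials and lists the Parquet files with boto3; the endpoint matches the one used elsewhere in this guide, but the bucket name is a hypothetical placeholder, not the application's actual code:

import configparser
import boto3

# Read the [s3fast] profile from the project-local credentials file
config = configparser.ConfigParser()
config.read("96_keys/credentials")
creds = config["s3fast"]

# S3 client pointed at the Garage endpoint (bucket name below is an assumption)
s3 = boto3.client(
    "s3",
    endpoint_url="http://s3fast.lafrance.io",
    aws_access_key_id=creds["aws_access_key_id"],
    aws_secret_access_key=creds["aws_secret_access_key"],
)

# List the Parquet files available to the application
for obj in s3.list_objects_v2(Bucket="mangetamain").get("Contents", []):
    print(obj["Key"], obj["Size"])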

4. Launch the Application

uv run streamlit run src/mangetamain_analytics/main.py

The application is accessible at http://localhost:8501

Installation with Docker

1. Clone the Repository

git clone https://github.com/julienlafrance/backtothefuturekitchen.git ~/mangetamain
cd ~/mangetamain/30_docker

2. Launch with Docker Compose

# PREPROD Environment (port 8500)
docker-compose up -d

# OR PRODUCTION Environment (port 8501)
docker-compose -f docker-compose-prod.yml up -d

Access to Environments

  • PREPROD: http://localhost:8500

  • PRODUCTION: http://localhost:8501

Docker Volumes

Docker volumes map the local code (10_preprod/) into the container.

Advantages:

  • Source code mounted read-only inside the container (prevents accidental modifications from within it)

  • Changes visible immediately (Streamlit hot reload)

  • Persistent DuckDB data

Docker Management

View logs:

docker-compose logs -f mangetamain_preprod

Restart after adding a new dependency:

# 1. Add dependency locally
cd ~/mangetamain/10_preprod
uv add new-dependency

# 2. Restart container
cd ~/mangetamain/30_docker
docker-compose restart

Stop services:

docker-compose down

Rebuild completely:

docker-compose up -d --force-recreate --build

Docker Debug

Enter the container:

docker-compose exec mangetamain_preprod bash
# Then inside the container:
uv run python -c "import streamlit; print(streamlit.__version__)"

Check container health:

docker-compose ps
# Should display: healthy

Real-time logs:

docker-compose logs -f --tail=100

Docker Development Workflow

  1. Modify code: Edit files in 10_preprod/src/

  2. View changes: Streamlit reloads automatically

  3. Add dependency: uv add package then docker-compose restart

  4. Debug: docker-compose logs -f or enter the container

Cleanup

Stop and remove container:

docker-compose down

Complete cleanup (images, volumes, networks):

docker system prune -a
# Warning: Removes ALL unused Docker images

Tests

Run Unit Tests

cd ~/mangetamain/10_preprod
uv run pytest tests/unit/ -v --cov=src

Expected result: 93% coverage (118 tests)

Check Code Quality

# PEP8
uv run flake8 src/ tests/ --config=../.flake8

# Formatting
uv run black --check src/ tests/

# Docstrings
uv run pydocstyle src/ --config=../.pydocstyle

Main Dependencies

  • streamlit >= 1.50.0

  • plotly >= 5.24.1

  • pandas >= 2.2.3

  • numpy >= 2.2.6

  • duckdb >= 1.4.0

  • polars >= 1.19.0

  • loguru >= 0.7.3

  • pytest >= 8.5.0 (dev)

  • pytest-cov >= 6.0.0 (dev)

The complete list is available in 10_preprod/pyproject.toml.
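
As a quick post-install sanity check, the following sketch prints the installed version of each core dependency (run it with uv run python; the loop is illustrative and not part of the project):

from importlib.metadata import version

# Core runtime dependencies listed above; versions should meet the minimums
for pkg in ("streamlit", "plotly", "pandas", "numpy", "duckdb", "polars", "loguru"):
    print(f"{pkg}: {version(pkg)}")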

Installation Verification

Verify uv

uv --version
# Expected: uv 0.8.22 or higher

If uv is not installed:

# Linux/macOS
curl -LsSf https://astral.sh/uv/install.sh | sh

# Windows (PowerShell)
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"

Verify Python

python3 --version
# Expected: Python 3.13.7

The project requires Python 3.13+ for compatibility with all dependencies.

Verify Environment

After uv sync, check that the project environment resolves to the expected interpreter and packages:

uv run python --version
# Expected: Python 3.13.7

uv run python -c "import streamlit; print(streamlit.__version__)"
# Expected: 1.50.0 or higher

Verify S3

Test S3 connection:

cd ~/mangetamain/50_test
pytest test_s3_parquet_files.py -v

If the test fails, verify:

  1. The file 96_keys/credentials exists and contains the correct keys

  2. The S3 endpoint is accessible: ping s3fast.lafrance.io

  3. DNAT iptables rules are configured (if applicable)

See: Garage S3 Storage for detailed S3 Garage configuration.

Troubleshooting

Error: "uv: command not found"

Solution: Install uv with the official script:

curl -LsSf https://astral.sh/uv/install.sh | sh
source $HOME/.cargo/env  # Reload PATH

Error: "Python 3.13 not found"

Solution: uv can install Python automatically:

uv python install 3.13

Or install manually from https://www.python.org/downloads/

Error: "No credentials found"

Symptom: the message "S3 credentials not found" appears in the app

Solution:

  1. Create the directory: mkdir -p 96_keys

  2. Create the file 96_keys/credentials with INI format:

[s3fast]
aws_access_key_id = YOUR_KEY
aws_secret_access_key = YOUR_SECRET
  3. Check permissions: chmod 600 96_keys/credentials

Error: "Connection timeout" when connecting to S3

Possible causes:

  1. Network: Check connectivity: curl -I http://s3fast.lafrance.io

  2. Firewall: Check that port 80 is open

  3. DNAT: Configure bypass for maximum performance

DNAT Solution (optional, 10x performance gain):

sudo iptables -t nat -A OUTPUT -p tcp -d 192.168.80.202 --dport 80 \
     -j DNAT --to-destination 192.168.80.202:3910

See: Garage S3 Storage, section "DNAT Performance Optimization".

Error: pytest tests fail

Solution:

  1. Verify that the environment is up to date:

cd ~/mangetamain/10_preprod
uv sync
uv run pytest --version  # Expected: pytest 8.5.0+
  2. Run tests with verbosity:

uv run pytest tests/unit/ -vv
  3. If a specific module fails, test it in isolation:

uv run pytest tests/unit/test_color_theme.py -v

Port 8501 already in use

Symptom: "Address already in use" at startup

Solution:

  1. Identify the process using the port:

# Linux/macOS
lsof -i :8501

# Windows
netstat -ano | findstr :8501
  2. Stop the process or use another port:

uv run streamlit run src/mangetamain_analytics/main.py --server.port 8502

Application loads slowly

First launch: 5-10 seconds (normal, data is loaded from S3)

If subsequent launches are slow: check the Streamlit cache

Solution:

  1. In the app, click the menu (⋮) > "Clear cache"

  2. Or delete cache manually:

rm -rf ~/.streamlit/cache
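
If the data-loading functions are cached with st.cache_data (an assumption about the app's implementation, not confirmed here), the cache can also be cleared from inside the app, for example with a small debug button:

import streamlit as st

# Hypothetical debug control: clears everything cached with
# st.cache_data / st.cache_resource for the running session
if st.sidebar.button("Clear cache"):
    st.cache_data.clear()
    st.cache_resource.clear()
    st.rerun()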

S3 Performance < 100 MB/s

Solution: Configure DNAT bypass to reach 500-917 MB/s

See: Garage S3 Storage, section "DNAT Bypass Performance".
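
To measure the throughput you actually get, a minimal sketch that times the download of one file (the client setup mirrors the S3 Configuration section; the bucket and object key are hypothetical placeholders):

import configparser
import time

import boto3

# Reuse the [s3fast] credentials (see S3 Configuration above)
config = configparser.ConfigParser()
config.read("96_keys/credentials")
creds = config["s3fast"]

s3 = boto3.client(
    "s3",
    endpoint_url="http://s3fast.lafrance.io",
    aws_access_key_id=creds["aws_access_key_id"],
    aws_secret_access_key=creds["aws_secret_access_key"],
)

# Time the download of one large Parquet file (bucket/key are hypothetical)
start = time.perf_counter()
body = s3.get_object(Bucket="mangetamain", Key="RAW_recipes.parquet")["Body"].read()
elapsed = time.perf_counter() - start
print(f"{len(body) / elapsed / 1e6:.1f} MB/s")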

Additional Resources