Quick Start for New Developers
Get up and running with the Portfolio Analytics System locally
Before starting, ensure you have:
- Python 3.10+
- Docker & Docker Compose
- PostgreSQL Client
- Kafka (via Docker Compose)
- Git
Optional (for advanced workflows):
- kubectl & Helm (if deploying to Kubernetes locally)
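If you want a quick sanity check that these tools are available before continuing, a small script like the one below can help. It is a convenience sketch only, not part of the repository; it simply verifies your Python version and that the expected binaries are on your `PATH`.

```python
# check_prereqs.py -- convenience sketch, not part of the repository.
import shutil
import sys

REQUIRED_TOOLS = ["git", "docker", "docker-compose", "psql"]

def main() -> int:
    ok = True
    if sys.version_info < (3, 10):
        print(f"Python 3.10+ required, found {sys.version.split()[0]}")
        ok = False
    for tool in REQUIRED_TOOLS:
        if shutil.which(tool) is None:
            print(f"Missing tool on PATH: {tool}")
            ok = False
    print("All prerequisites found." if ok else "Some prerequisites are missing.")
    return 0 if ok else 1

if __name__ == "__main__":
    raise SystemExit(main())
```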
Clone the repository and move into the project root:
- `git clone <repo-url>`
- `cd portfolio-analytics-system`
Create a `.env` file in the root folder with:
- `DATABASE_URL=postgresql://user:password@localhost:5432/portfolio_db`
- `KAFKA_BROKER=localhost:9092`
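The services are Python applications, so these variables end up being read from the environment at startup. The sketch below is illustrative only (the real services may use a dedicated configuration library and different module names); it shows one plausible way the two values could be loaded.

```python
# settings.py -- illustrative sketch of reading the .env values; the actual
# services may load configuration differently.
import os
from dataclasses import dataclass

@dataclass(frozen=True)
class Settings:
    database_url: str
    kafka_broker: str

def load_settings() -> Settings:
    # Assumes the variables are already exported into the environment
    # (for example by Docker Compose or by sourcing the .env file).
    return Settings(
        database_url=os.environ["DATABASE_URL"],
        kafka_broker=os.environ.get("KAFKA_BROKER", "localhost:9092"),
    )

if __name__ == "__main__":
    print(load_settings())
```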
Start the local infrastructure with Docker Compose:
- `docker-compose -f docker-compose.dev.yml up -d`

This will start:
- PostgreSQL
- Kafka + Zookeeper
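Before running migrations or starting services, it can save time to confirm the containers are actually accepting connections. The following sketch (standard library only) polls the default ports; the host and port values are assumptions based on a standard local setup and may differ if `docker-compose.dev.yml` maps them elsewhere.

```python
# wait_for_infra.py -- waits until PostgreSQL and Kafka accept TCP connections.
# Ports are assumed defaults; adjust if docker-compose.dev.yml maps them differently.
import socket
import time

SERVICES = {"PostgreSQL": ("localhost", 5432), "Kafka": ("localhost", 9092)}

def wait_for(name: str, host: str, port: int, timeout: float = 60.0) -> None:
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=2):
                print(f"{name} is accepting connections on {host}:{port}")
                return
        except OSError:
            time.sleep(1)
    raise TimeoutError(f"{name} did not become reachable on {host}:{port}")

if __name__ == "__main__":
    for name, (host, port) in SERVICES.items():
        wait_for(name, host, port)
```

Run it with `python wait_for_infra.py`; it exits once both ports respond or raises after the timeout.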
Apply the database schema with Alembic:
- `alembic upgrade head`
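`alembic upgrade head` applies every migration script up to the latest revision. For orientation, a revision file has the shape sketched below; the revision id, table, and columns here are hypothetical and only illustrate the format, since the real migrations ship with the repository.

```python
"""create portfolios table (hypothetical example revision)"""
from alembic import op
import sqlalchemy as sa

# Hypothetical identifiers -- the repository's own revisions define the real chain.
revision = "0001_create_portfolios"
down_revision = None
branch_labels = None
depends_on = None

def upgrade() -> None:
    op.create_table(
        "portfolios",  # hypothetical table, for illustration only
        sa.Column("id", sa.Integer, primary_key=True),
        sa.Column("name", sa.String(255), nullable=False),
    )

def downgrade() -> None:
    op.drop_table("portfolios")
```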
Start each service with Uvicorn. Example for the Cost Calculator Service:
- `cd services/calculators/cost_calculator_service`
- `uvicorn app.main:app --reload --port 8001`
Repeat for other services as needed (Ingestion, Cashflow, Position, Valuation, API).
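The `app.main:app` target passed to Uvicorn is the ASGI application object inside each service. As a rough illustration only (the real modules contain the actual business logic, and the endpoint shown here is hypothetical), a minimal module of that shape looks like this:

```python
# app/main.py -- minimal sketch of the ASGI app that `uvicorn app.main:app` loads.
from fastapi import FastAPI

app = FastAPI(title="Cost Calculator Service")

@app.get("/health")  # hypothetical endpoint, shown for illustration only
def health() -> dict:
    return {"status": "ok"}
```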
Run the unit tests. Example for the Cost Calculator Service:
- `pytest tests/unit --disable-warnings -q`
Run all services’ tests:
- `pytest --disable-warnings -q`
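The unit tests follow standard pytest conventions. The example below is purely hypothetical (the helper function and file name are invented) and is only meant to show the kind of test module that `pytest tests/unit` discovers:

```python
# tests/unit/test_example.py -- hypothetical test, shown only to illustrate the
# pytest layout; the real test modules live under each service's tests/ folder.

def weighted_average_cost(quantities, prices):
    """Toy helper used by this example; not a function from the repository."""
    total_quantity = sum(quantities)
    return sum(q * p for q, p in zip(quantities, prices)) / total_quantity

def test_weighted_average_cost():
    assert weighted_average_cost([10, 30], [100.0, 120.0]) == 115.0
```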
Run the integration tests against the test stack:
- `docker-compose -f docker-compose.test.yml up -d`
- `pytest tests/integration --disable-warnings -q`
- `docker-compose -f docker-compose.test.yml down`
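Integration tests exercise the running test stack rather than mocks. As a hypothetical example of that style (not taken from the repository; it assumes the `DATABASE_URL` from the `.env` above and that SQLAlchemy with a PostgreSQL driver is installed), a connectivity test could look like:

```python
# tests/integration/test_database_connectivity.py -- hypothetical example of an
# integration-style test; assumes the DATABASE_URL from the .env file.
import os

import pytest
from sqlalchemy import create_engine, text

@pytest.fixture(scope="module")
def engine():
    url = os.environ.get(
        "DATABASE_URL",
        "postgresql://user:password@localhost:5432/portfolio_db",
    )
    return create_engine(url)

def test_database_is_reachable(engine):
    with engine.connect() as conn:
        assert conn.execute(text("SELECT 1")).scalar() == 1
```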
Start the Query Service:
- `cd services/query_service`
- `uvicorn app.main:app --reload --port 8080`
Query portfolio positions:
- `curl -H "X-Correlation-ID: TEST:uuid" http://localhost:8080/portfolios/1001/positions`
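The same request can be made from Python, which is convenient when scripting checks. The snippet below mirrors the curl call above; it assumes the `requests` package is installed, and the `TEST:<uuid>` correlation ID format follows the placeholder in the curl example.

```python
# query_positions.py -- Python equivalent of the curl call above; assumes the
# `requests` package is installed and the Query Service is running on port 8080.
import uuid

import requests

PORTFOLIO_ID = "1001"  # same example portfolio id as the curl command
url = f"http://localhost:8080/portfolios/{PORTFOLIO_ID}/positions"
headers = {"X-Correlation-ID": f"TEST:{uuid.uuid4()}"}

response = requests.get(url, headers=headers, timeout=10)
response.raise_for_status()
print(response.json())
```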
For more detail, see:
- [Observability & Logging](Observability-&-Logging.md) – Debugging with correlation IDs
- [Testing Guide](Testing-Guide.md) – Running tests in detail
- [Database Migrations](Database-Migrations.md) – Schema changes with Alembic
- [Production Migration Guide](Production-Database-Migration-Guide.md) – Migrating safely