
Quick Start for New Developers

Get up and running with the Portfolio Analytics System locally


1️⃣ Prerequisites

Before starting, ensure you have:

  • Python 3.10+
  • Docker & Docker Compose
  • PostgreSQL Client
  • Kafka (via Docker Compose)
  • Git

Optional (for advanced workflows):

  • kubectl & Helm (if deploying to Kubernetes locally)
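
A quick sanity check for the required tooling (output varies by platform; if you use the Docker Compose plugin instead of the standalone binary, substitute docker compose version):

python --version          # should report 3.10 or newer
docker --version
docker-compose --version
psql --version
git --version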

2️⃣ Clone the Repository

git clone <repo-url>
cd portfolio-analytics-system

3️⃣ Setup Environment Variables

Create a .env file in the root folder:

DATABASE_URL=postgresql://user:password@localhost:5432/portfolio_db
KAFKA_BROKER=localhost:9092
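
The credentials and database name above are placeholders and need to match what docker-compose.dev.yml provisions. If you also want these variables available in your current shell session (for example, to run Alembic or a service outside Docker), one simple way to load them is:

export $(grep -v '^#' .env | xargs)   # loads KEY=VALUE pairs from .env (simple values only)
echo $DATABASE_URL                    # confirm the variable is set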

4️⃣ Start Dependencies (DB + Kafka)

docker-compose -f docker-compose.dev.yml up -d

This will start:

  • PostgreSQL
  • Kafka + Zookeeper
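
To confirm the containers came up cleanly (container names depend on docker-compose.dev.yml):

docker-compose -f docker-compose.dev.yml ps            # all containers should be Up
docker-compose -f docker-compose.dev.yml logs --tail=50   # recent logs if something failed to start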

5️⃣ Run Alembic Migrations

Apply the database schema:

alembic upgrade head
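
To verify that the migration ran (this assumes Alembic is configured to read DATABASE_URL from your environment or .env):

alembic current    # prints the revision the database is now at
alembic history    # lists all known revisions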

6️⃣ Start Services Locally

For example, to start the Cost Calculator Service:

cd services/calculators/cost_calculator_service
uvicorn app.main:app --reload --port 8001

Repeat for other services as needed (Ingestion, Cashflow, Position, Valuation, API).
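
Once a service is up, a quick way to confirm it is responding: the uvicorn app.main:app entry point suggests a FastAPI application, and FastAPI apps usually expose interactive docs at /docs (adjust the port per service):

curl -s http://localhost:8001/docs -o /dev/null -w "%{http_code}\n"   # expect 200 if the service is serving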


7️⃣ Run Unit Tests

For example, for the Cost Calculator Service:

pytest tests/unit --disable-warnings -q

Run all services’ tests:

pytest --disable-warnings -q
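
A few handy pytest variants while iterating (standard pytest options):

pytest tests/unit -x -q           # stop at the first failure
pytest tests/unit -k "cost" -q    # run only tests whose names match a keyword (example keyword)
pytest tests/unit -s              # show print/log output instead of capturing it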

8️⃣ Run Integration Tests (Kafka + DB)

docker-compose -f docker-compose.test.yml up -d
pytest tests/integration --disable-warnings -q
docker-compose -f docker-compose.test.yml down
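
If you want a completely clean slate between integration runs, tearing down with volumes removes the test database and Kafka data as well:

docker-compose -f docker-compose.test.yml down -v   # -v also removes named volumes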

9️⃣ Call the API

Start the Query Service:

cd services/query_service
uvicorn app.main:app --reload --port 8080

Query portfolio positions:

curl -H "X-Correlation-ID: TEST:uuid" http://localhost:8080/portfolios/1001/positions
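
For readable output, the response can be piped through Python's built-in JSON formatter (the correlation ID and portfolio ID above are sample values):

curl -s -H "X-Correlation-ID: TEST:uuid" http://localhost:8080/portfolios/1001/positions | python -m json.tool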
