Metricis Developer Guide¶
This guide covers everything developers need to know to contribute to and extend the Metricis platform.
Table of Contents¶
- Development Environment Setup
- Project Architecture
- Client Development
- Server Development
- Portal Development
- Database Management
- Background Workers
- Testing
- Code Style & Conventions
- Contributing
Development Environment Setup¶
Prerequisites¶
- Node.js 18+ - JavaScript runtime
- Python 3.11+ - Server runtime
- PostgreSQL 15+ - Database (can use Docker)
- Redis 7+ - Message broker for Celery (can use Docker)
- Git - Version control
Initial Setup¶
# Clone the repository
git clone https://github.com/your-org/metricis.git
cd metricis
# Install root dependencies (runs npm install in all workspaces)
npm install
# Set up Python virtual environment
cd server
python -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
pip install -r requirements.txt
# Copy environment template
cd ..
cp .env.example .env
Environment Variables¶
Create a .env file in the root directory:
# Environment
ENVIRONMENT=development
# Database
DATABASE_URL=postgresql+asyncpg://postgres:postgres@localhost:5432/metricis
# JWT Authentication (auto-generated in development, REQUIRED in production)
JWT_SECRET_KEY=your-secret-key-change-in-production
JWT_ALGORITHM=HS256
# Session (auto-generated in development, REQUIRED in production)
SESSION_SECRET_KEY=another-secret-key
# Session Storage Backend ("memory" for dev, "redis" for production)
SESSION_STORAGE_BACKEND=memory
# Redis (for Celery and optional session storage)
REDIS_URL=redis://localhost:6379/0
# Rate Limiting
RATE_LIMIT_PER_MINUTE=60
AUTH_RATE_LIMIT_PER_MINUTE=10
# SMTP (optional)
SMTP_HOST=smtp.example.com
SMTP_PORT=587
SMTP_USER=your-email@example.com
SMTP_PASSWORD=your-password
SMTP_FROM=noreply@example.com
# Twilio SMS (optional)
TWILIO_ACCOUNT_SID=your-account-sid
TWILIO_AUTH_TOKEN=your-auth-token
TWILIO_FROM_NUMBER=+15551234567
# Firebase Cloud Messaging (optional)
FCM_CREDENTIALS_PATH=/path/to/firebase-credentials.json
FCM_PROJECT_ID=your-project-id
# REDCap (optional)
REDCAP_TOKEN_YOUR_SITE=your-32-character-token
Starting Development Servers¶
# Start all services (client, portal, server)
npm run dev:all
# Or start individually:
npm run dev:client # Port 5173
npm run dev:portal # Port 3000
npm run dev:server # Port 8000
# Start background workers (in separate terminal)
cd server
source .venv/bin/activate
celery -A app.celery_app worker -Q reminders -l info
# Start Celery beat for scheduled tasks
celery -A app.celery_app beat -l info
Using Docker¶
# Start all services with Docker Compose
docker-compose up
# Or build and run specific services
docker-compose up client server redis postgres
Project Architecture¶
Monorepo Structure¶
metricis/
├── client/ # jsPsych assessment client
│ ├── src/
│ │ ├── main.ts # Entry point
│ │ ├── tasks/ # Cognitive task implementations
│ │ ├── services/ # API client, storage
│ │ ├── plugins/ # Custom jsPsych plugins
│ │ ├── i18n/ # Internationalization
│ │ └── config.ts # Task configuration
│ └── package.json
│
├── server/ # FastAPI backend
│ ├── app/
│ │ ├── main.py # FastAPI application
│ │ ├── config.py # Settings and configuration
│ │ ├── routers/ # API endpoints
│ │ ├── services/ # Business logic
│ │ ├── workers/ # Celery tasks
│ │ ├── db/ # SQLAlchemy models
│ │ └── celery_app.py
│ ├── alembic/ # Database migrations
│ └── requirements.txt
│
├── portal/ # React admin portal
│ ├── src/
│ │ ├── main.tsx # Entry point
│ │ ├── pages/ # Page components
│ │ ├── services/ # API client
│ │ ├── contexts/ # React contexts
│ │ └── types/ # TypeScript interfaces
│ └── package.json
│
└── docs/ # Documentation
Data Flow¶
- Assessment Flow:
    - Client starts a session via POST /api/session/start
    - jsPsych runs cognitive tasks, collecting trial data
    - Client submits results via POST /api/submit
    - Server stores data and optionally syncs to REDCap
- Notification Flow:
    - Scheduler service identifies visits needing reminders
    - Celery worker processes the reminder queue
    - Notifications are sent via email/SMS/push
    - Logs are stored in the notification_logs table
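The submission step of the assessment flow is an ordinary HTTP POST. The stdlib-only sketch below builds (but does not send) such a request; the payload field names (session_id, results) are illustrative assumptions, not the server's actual schema:

```python
import json
from urllib import request

BASE_URL = "http://localhost:8000"  # dev server port used in this guide

def build_submit_request(session_id: str, trials: list) -> request.Request:
    """Build the POST /api/submit request; payload field names are illustrative."""
    body = json.dumps({"session_id": session_id, "results": trials}).encode()
    return request.Request(
        f"{BASE_URL}/api/submit",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_submit_request("abc123", [{"task": "simple_rt", "rt": 512}])
print(req.get_method(), req.full_url)  # POST http://localhost:8000/api/submit
```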
Client Development¶
Adding a New Cognitive Task¶
- Create the task file (client/src/tasks/your-task.ts):
import { JsPsych } from 'jspsych';
import htmlKeyboardResponse from '@jspsych/plugin-html-keyboard-response';

export interface YourTaskSummary {
  mean_rt: number;
  accuracy: number;
  total_trials: number;
}

export function createYourTaskTimeline(jsPsych: JsPsych) {
  const timeline: any[] = [];

  // Instructions
  timeline.push({
    type: htmlKeyboardResponse,
    stimulus: '<p>Task instructions here...</p>',
    choices: [' '],
  });

  // Practice trials
  // ...

  // Main trials
  // ...

  return timeline;
}

export function calculateYourTaskSummary(data: any[]): YourTaskSummary {
  const trials = data.filter(d => d.task === 'your_task' && d.trial_type === 'response');
  const n = trials.length || 1; // guard against division by zero
  return {
    mean_rt: trials.reduce((sum, t) => sum + t.rt, 0) / n,
    accuracy: trials.filter(t => t.correct).length / n,
    total_trials: trials.length,
  };
}
- Add configuration (client/src/config.ts):
export const TASK_CONFIG = {
  // ... existing tasks
  your_task: {
    practice_trials: 5,
    main_trials: 40,
    stimulus_duration: 500,
    iti_min: 500,
    iti_max: 1500,
  },
};
- Register the task (client/src/main.ts):
import { createYourTaskTimeline } from './tasks/your-task';

// Add to timeline builder
if (taskId === 'your_task' || taskId === 'all') {
  timeline.push(...createYourTaskTimeline(jsPsych));
}
Working with jsPsych Plugins¶
Custom plugins go in client/src/plugins/. Follow the jsPsych plugin template.
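As a rough sketch of that template (jsPsych 7-style; the plugin name, parameter, and timing below are placeholders -- consult the current jsPsych plugin template for the exact shape):

```typescript
import { JsPsych, JsPsychPlugin, ParameterType, TrialType } from 'jspsych';

const info = {
  name: 'your-plugin', // placeholder name
  parameters: {
    stimulus: { type: ParameterType.HTML_STRING, default: '' },
  },
} as const;

type Info = typeof info;

class YourPlugin implements JsPsychPlugin<Info> {
  static info = info;

  constructor(private jsPsych: JsPsych) {}

  trial(display_element: HTMLElement, trial: TrialType<Info>) {
    display_element.innerHTML = trial.stimulus;
    // End the trial after a fixed delay, recording trial data
    this.jsPsych.pluginAPI.setTimeout(() => {
      display_element.innerHTML = '';
      this.jsPsych.finishTrial({ shown: true });
    }, 1000);
  }
}

export default YourPlugin;
```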
Internationalization¶
Add translations in client/src/i18n/locales/:
// en.json
{
  "your_task": {
    "instructions": "Press the spacebar when you see the target.",
    "feedback_correct": "Correct!",
    "feedback_incorrect": "Incorrect"
  }
}
Server Development¶
Adding a New API Endpoint¶
- Create/update router (server/app/routers/your_router.py):
from fastapi import APIRouter, Depends, HTTPException
from pydantic import BaseModel
from sqlalchemy.ext.asyncio import AsyncSession
from ..db import get_db
from ..services.your_service import YourService

class ItemCreate(BaseModel):
    """Request schema; in a real app this often lives in a schemas module."""
    name: str

router = APIRouter(prefix="/your-endpoint", tags=["Your Feature"])

@router.get("/")
async def list_items(db: AsyncSession = Depends(get_db)):
    service = YourService(db)
    return await service.list_items()

@router.post("/")
async def create_item(data: ItemCreate, db: AsyncSession = Depends(get_db)):
    service = YourService(db)
    return await service.create_item(data)
- Create service (server/app/services/your_service.py):
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select
from ..db.models import YourModel

class YourService:
    def __init__(self, db: AsyncSession):
        self.db = db

    async def list_items(self):
        result = await self.db.execute(select(YourModel))
        return list(result.scalars().all())

    async def create_item(self, data):
        item = YourModel(**data.model_dump())
        self.db.add(item)
        await self.db.commit()
        await self.db.refresh(item)
        return item
- Register router (server/app/main.py):
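The registration itself is a one-liner; a minimal sketch of server/app/main.py (the /api prefix is an assumption based on the endpoint paths used elsewhere in this guide):

```python
# server/app/main.py (sketch)
from fastapi import FastAPI

from .routers import your_router  # the router module created above

app = FastAPI()
app.include_router(your_router.router, prefix="/api")
```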
Working with the Notification Service¶
from ..services.notifications import NotificationService

async def send_participant_notification(db, participant_id, message):
    service = NotificationService(db)

    # Send email
    await service.send_email(
        recipient="user@example.com",
        subject="Assessment Reminder",
        body=message,
        participant_id=participant_id,
    )

    # Send SMS (requires Twilio configuration)
    await service.send_sms(
        recipient="+15551234567",
        body=message,
        participant_id=participant_id,
    )

    # Send push notification (requires FCM configuration)
    await service.send_push_notification(
        participant_id=participant_id,
        title="Assessment Reminder",
        body=message,
    )
Working with the PDF Report Service¶
from ..services.pdf_reports import PDFReportService

async def generate_study_pdf(db, study_id, participant_id, session_id):
    service = PDFReportService(db)

    # Generate study summary PDF
    pdf_bytes = await service.generate_study_summary_report(
        study_id=study_id,
        page_size="letter"  # or "a4"
    )

    # Generate participant report PDF
    pdf_bytes = await service.generate_participant_report(
        study_id=study_id,
        participant_id=participant_id,
        include_session_details=True,
        page_size="letter"
    )

    # Generate session scores PDF
    pdf_bytes = await service.generate_session_scores_report(
        study_id=study_id,
        session_id=session_id,
        page_size="letter"
    )
Working with Study Templates¶
from ..services.templates import TemplateService, TemplateCategory
# Initialize service (no DB required for listing)
service = TemplateService()
# List all templates
templates = service.get_all_templates()
# Filter by category
templates = service.get_templates_by_category(TemplateCategory.NEUROLOGICAL)
# Search templates
templates = service.search_templates("cognition")
# Get specific template
template = service.get_template("ms_assessment")
# Available template categories:
# - SCREENING: Quick screening batteries
# - NEUROLOGICAL: MS, Parkinson's, etc.
# - PSYCHIATRIC: Depression, anxiety, ADHD
# - PEDIATRIC: Child-specific assessments
# - AGING: Cognitive aging studies
# - ONCOLOGY: Chemo brain, cancer-related
# - SPORTS: Concussion, return-to-learn
Working with Participant Import¶
from ..services.participant_import import ParticipantImportService

async def import_participants(db, study_id, csv_content, excel_bytes):
    service = ParticipantImportService(db)

    # Import from CSV
    result = await service.import_from_csv(
        study_id=study_id,
        csv_content=csv_content,
        skip_duplicates=True,
        create_missing_sites=False
    )

    # Import from Excel
    result = await service.import_from_excel(
        study_id=study_id,
        excel_content=excel_bytes,
        sheet_name=None,  # Uses first sheet
        skip_duplicates=True,
        create_missing_sites=False
    )

    # Check result
    if result.success:
        print(f"Imported {result.imported_count} participants")
    else:
        for error in result.errors:
            print(f"Row {error.row}: {error.message}")

    # Generate import template CSV
    template_csv = service.generate_template_csv(include_optional=True)
Portal Development¶
Adding a New Page¶
- Create page component (portal/src/pages/YourPage.tsx):
import { useQuery } from '@tanstack/react-query';
import api from '../services/api';

export default function YourPage() {
  const { data, isLoading } = useQuery({
    queryKey: ['your-data'],
    queryFn: () => api.getYourData(),
  });

  if (isLoading) return <div>Loading...</div>;

  return (
    <div className="page-container">
      <h1>Your Page</h1>
      {/* Your content */}
    </div>
  );
}
- Add route (portal/src/App.tsx):
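A sketch of the route registration, assuming the portal uses react-router-dom (the route path below is illustrative):

```typescript
// portal/src/App.tsx (sketch -- path and placement are illustrative)
import YourPage from './pages/YourPage';

// Inside the existing <Routes> block:
<Route path="/your-page" element={<YourPage />} />
```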
- Add API method (portal/src/services/api.ts):
async getYourData(): Promise<YourDataType[]> {
  const response = await this.fetch('/api/your-endpoint');
  return response.json();
}
Form Validation with Zod¶
The portal uses Zod for runtime validation. Schemas are in portal/src/validation/schemas.ts.
- Using existing schemas:
import { useFormValidation } from '../validation';
import { studySchema } from '../validation/schemas';

function CreateStudyForm() {
  const {
    values,
    errors,
    setValue,
    handleSubmit,
  } = useFormValidation({
    schema: studySchema,
    initialValues: {
      code: '',
      name: '',
      description: '',
      status: 'draft',
      supported_languages: ['en'],
      consent_mode: 'internal',
    },
    validateOnChange: true,
  });

  const onSubmit = handleSubmit(async (data) => {
    await api.createStudy(data);
  });

  return (
    <form onSubmit={onSubmit}>
      <input
        value={values.code}
        onChange={(e) => setValue('code', e.target.value)}
      />
      {errors.code && <span className="error">{errors.code}</span>}
      {/* ... */}
    </form>
  );
}
- Creating new schemas:
import { z } from 'zod';

export const yourFormSchema = z.object({
  name: z.string().min(1, 'Name is required').max(100),
  email: z.string().email('Invalid email'),
  count: z.number().int().min(1).max(100),
  optional_field: z.string().optional(),
});

export type YourFormData = z.infer<typeof yourFormSchema>;
- Available validation schemas:
- studySchema - Study creation/update
- participantSchema - Participant enrollment
- batterySchema - Battery configuration
- siteSchema - Site management
- visitWindowSchema - Visit scheduling
- consentFormSchema - Consent forms
- redcapConfigSchema - REDCap settings
- loginSchema / registerSchema - Authentication
- exportOptionsSchema - Data export options
- participantImportRowSchema - CSV import validation
Database Management¶
Creating Migrations¶
cd server
source .venv/bin/activate
# Auto-generate migration from model changes
alembic revision --autogenerate -m "Add your_table"
# Or create empty migration
alembic revision -m "Custom migration"
# Apply migrations
alembic upgrade head
# Rollback one migration
alembic downgrade -1
Adding a New Model¶
- Create model (server/app/db/models.py):
class YourModel(Base):
    __tablename__ = "your_table"

    id: Mapped[uuid.UUID] = mapped_column(
        UUID(as_uuid=True), primary_key=True, default=uuid.uuid4
    )
    name: Mapped[str] = mapped_column(String(255), nullable=False)
    created_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True), server_default=func.now()
    )
- Generate and apply the migration: alembic revision --autogenerate -m "Add your_table", then alembic upgrade head
Background Workers¶
Creating a Celery Task¶
- Add task to worker (server/app/workers/your_worker.py):
from ..celery_app import celery_app

@celery_app.task(bind=True, name="app.workers.your_worker.your_task")
def your_task(self, param1: str) -> dict:
    # Task logic here
    return {"status": "completed", "param1": param1}
- Call task from API:
from ..workers.your_worker import your_task
# Async call (returns immediately)
result = your_task.delay("value")
# Get result (blocks)
result.get(timeout=30)
- Schedule periodic task (server/app/celery_app.py):
celery_app.conf.beat_schedule["your-periodic-task"] = {
    "task": "app.workers.your_worker.your_task",
    "schedule": 3600.0,  # Every hour
    "args": ("default_value",),
}
Testing¶
Running Tests¶
# Server tests (with test database)
cd server
source .venv/bin/activate
pytest tests/ -v # Run all tests
pytest tests/test_auth.py -v # Run specific file
pytest tests/ -v --cov=app # With coverage report
pytest tests/ -v -x # Stop on first failure
# Linting
cd server && ruff check app/
cd client && npm run lint
cd portal && npm run lint
# Type checking
cd server && mypy app/ --ignore-missing-imports
cd client && npx tsc --noEmit
cd portal && npx tsc --noEmit
Test Categories¶
The server has tests for:
- Authentication (tests/test_auth.py) - Login, register, token refresh, user profile
- Sessions (tests/test_session.py) - Session start, end, validation
- Submissions (tests/test_submit.py) - Data submission, authentication requirements
- Health (tests/test_health.py) - Health check endpoints
Writing Tests¶
Server tests (server/tests/test_your_feature.py):
import pytest
from httpx import AsyncClient
from sqlalchemy.ext.asyncio import AsyncSession

@pytest.mark.asyncio
async def test_your_endpoint(client: AsyncClient, db_session: AsyncSession):
    """Test your endpoint with authenticated client."""
    response = await client.get("/api/your-endpoint")
    assert response.status_code == 200
    data = response.json()
    assert "expected_field" in data

@pytest.fixture
async def authenticated_headers(client: AsyncClient, db_session: AsyncSession):
    """Get auth headers for protected endpoints."""
    from app.services.auth import create_user

    user = await create_user(db_session, email="test@example.com", password="Test123!")
    await db_session.commit()

    response = await client.post("/api/auth/login", json={
        "email": "test@example.com",
        "password": "Test123!",
    })
    tokens = response.json()["tokens"]
    return {"Authorization": f"Bearer {tokens['access_token']}"}
Continuous Integration¶
GitHub Actions automatically runs on every PR:
- Server Tests - pytest with PostgreSQL and Redis services
- Client Build - npm install, lint, typecheck, build
- Portal Build - npm install, lint, typecheck, build
- Security Scan - Bandit for Python, npm audit for JS
Code Style & Conventions¶
Naming Conventions¶
| Context | Convention | Examples |
|---|---|---|
| TypeScript files | kebab-case | simple-rt.ts |
| TypeScript variables | camelCase | sessionId |
| TypeScript interfaces | PascalCase | SessionConfig |
| Python files | snake_case | session_store.py |
| Python variables | snake_case | session_id |
| Python classes | PascalCase | SessionStore |
| API paths | kebab-case | /api/session/start |
| API fields | snake_case | session_id |
Linting¶
# Client
cd client && npm run lint
# Portal
cd portal && npm run lint
# Server (if configured)
cd server && ruff check .
Contributing¶
- Fork the repository
- Create a feature branch: git checkout -b feature/your-feature
- Make your changes
- Test your changes: pytest tests/ -v
- Lint your code: ruff check app/ and npm run lint
- Commit with descriptive messages
- Push to your fork
- Submit a pull request
CI will automatically verify your changes pass tests and linting.
Commit Message Format¶
Types: feat, fix, docs, style, refactor, test, chore
Example:
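A hypothetical commit message, assuming the common type(scope): subject convention (the scope and subject below are illustrative):

```
feat(portal): add participant import page

Adds CSV/Excel upload with row-level validation errors.
```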