Testing Workflow Guide¶
This document describes how to test batteries, assessments, and the complete data flow (including REDCap sync) before going live with real participants.
Development Server Ports¶
| Service | Port | URL |
|---|---|---|
| Server (FastAPI) | 8030 | http://localhost:8030 |
| Portal (Researcher) | 3030 | http://localhost:3030 |
| Client (jsPsych) | 3031 | http://localhost:3031 |
| Patient Portal | 3032 | http://localhost:3032 |
Start all servers:
Or start individually:
```shell
npm run dev:server         # Port 8030
npm run dev:portal         # Port 3030
npm run dev:client         # Port 3031
npm run dev:patient-portal # Port 3032
```
Testing Methods¶
1. Quick Test Mode (Recommended for Battery Testing)¶
The fastest way to test a battery without manual form entry. Opens directly in jsPsych with pre-populated participant data.
URL Format:

```
http://localhost:3031/?battery=<battery-uuid>&study=<study-code>&participant=<participant-code>&age=<age>&site=<site-uuid>
```

Example:

```
http://localhost:3031/?battery=2ef1f656-38e1-4483-980b-f6fdc3e8014c&study=STUDY010&participant=TEST-001&age=25&site=demo
```
Parameters:
- `battery` - UUID of the battery to test (required)
- `study` - Study code (required for site selection)
- `participant` - Participant code (required for quick test mode)
- `age` - Participant age (required for age-adaptive tasks)
- `site` - Site UUID or `demo` (required for quick test mode)
- `test_mode=true` - Optional; allows testing draft batteries
How to get a battery UUID:
1. Open Portal → Studies → Select Study → Batteries
2. Click on a battery to view details
3. Copy the UUID from the URL or battery info panel
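The quick-test URL can be assembled from its parameters in a small script; a minimal sketch in bash, using the example values from above:

```shell
# Assemble a Quick Test Mode URL from its required parameters.
BATTERY="2ef1f656-38e1-4483-980b-f6fdc3e8014c"   # battery UUID from the Portal
STUDY="STUDY010"
PARTICIPANT="TEST-001"
AGE=25
SITE="demo"

URL="http://localhost:3031/?battery=${BATTERY}&study=${STUDY}&participant=${PARTICIPANT}&age=${AGE}&site=${SITE}"
echo "$URL"
# Append &test_mode=true to the URL to test a draft battery.
```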
2. Dev Menu (Single Task Testing)¶
For testing individual cognitive tasks without a full battery:
```
http://localhost:3031/?task=cpt
http://localhost:3031/?task=simpleRT
http://localhost:3031/?task=all       # Full battery
```
3. Portal Test Mode¶
Testing from the Portal UI with participant links:

1. Enable Test Mode:
   - Portal → Studies → Select Study
   - Toggle "Test Mode" in the study selector dropdown
2. Generate a test link:
   - Navigate to Participants
   - Select a test participant
   - Click "Generate Assessment Link"
   - Copy and open the link
3. Or use Quick Test from the battery page:
   - Navigate to Batteries
   - Select battery → "Test" button
   - Enter test participant details
4. iOS Simulator Testing¶
For testing native Capacitor builds:

1. Start the Vite dev server with host exposure
2. Find your Mac's IP
3. Set environment variables for hot reload
4. Open the project in Xcode
5. Run in the simulator - the app will load from your dev server with hot reload

Debugging iOS:
- Safari → Develop → Simulator → app.metricis.app
- View console logs and network requests
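On macOS, the LAN IP you need for the native API URL can be printed with a one-liner (this assumes Wi-Fi is on interface `en0`; check `ifconfig` if your setup differs):

```shell
# Print this Mac's LAN IP for use in VITE_NATIVE_API_URL.
# Assumes the active network interface is en0 (typical for Wi-Fi).
ipconfig getifaddr en0
```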
REDCap Sync Testing¶
Prerequisites¶
- REDCap site configured in Portal (Settings → REDCap)
- Valid REDCap API token
- Participant enrolled in REDCap
Test Flow¶
1. Verify the REDCap configuration
2. Check the participant exists in REDCap:
   - Open the REDCap project
   - Verify the participant record exists
3. Run an assessment with a test participant:
   - Use Quick Test Mode or a Portal test link
   - Complete the battery
4. Verify sync:
   - Check server logs for sync status
   - Verify data appears in the REDCap project
   - Check `sync_status` in session records
Troubleshooting Sync Failures¶
If sync fails:
1. Check server logs for CRITICAL errors
2. Verify the REDCap API token is valid
3. Check REDCap field mappings
4. Manually retry the sync via the API
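A manual retry from the command line might look like the following. Note that the endpoint path here is an illustrative assumption, not confirmed by this guide; verify the actual retry route in the server's API definitions before using it:

```shell
# HYPOTHETICAL endpoint path -- check the server's actual routes first.
SESSION_ID="<session-uuid>"
curl -X POST "http://localhost:8030/api/redcap/sync/retry/${SESSION_ID}" \
  -H "Authorization: Bearer <token>"
```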
Patient Portal Testing¶
Magic Link Authentication¶
1. Generate a magic link:
   - Portal → Participants → Select Participant
   - Click "Send Login Link" or generate manually
2. Or create a JWT token directly (dev only)
3. Access the patient portal

Testing Patient Portal Tasks¶
- The patient portal requires valid authentication
- The participant must have scheduled visits/tasks
- Tasks are fetched from `/api/portal/tasks`
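As a quick sanity check, the tasks endpoint can be hit directly with curl. The `<jwt>` placeholder must be a real token obtained via a magic link (or a dev-only JWT), and the port assumes the default server configuration:

```shell
# Fetch scheduled tasks for the authenticated participant.
# Replace <jwt> with a valid token from a magic link or dev-only JWT.
curl "http://localhost:8030/api/portal/tasks" \
  -H "Authorization: Bearer <jwt>"
```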
Environment Configuration¶
All ports are configurable via environment variables for deployment flexibility.
Server (.env)¶
```shell
ENVIRONMENT=development
API_HOST=0.0.0.0  # Required for mobile testing
API_PORT=8030     # Server port (default: 8030)
ALLOWED_ORIGINS=http://localhost:3030,http://localhost:3031,http://localhost:3032,capacitor://localhost,capacitor://app.metricis.app
```
Client (.env or .env.local)¶
```shell
# API Configuration
VITE_API_URL=           # Override API URL for all environments
VITE_DEV_API_PORT=8030  # Development API port (default: 8030)
VITE_CLIENT_PORT=3031   # Client dev server port (default: 3031)

# For iOS/Android simulator testing - use your Mac's IP
VITE_NATIVE_API_URL=http://192.168.1.xxx:8030/api
```
Portal (portal/.env.local)¶
Patient Portal (patient-portal/.env.local)¶
Changing Ports¶
To use different ports (e.g., to avoid conflicts):

1. Update the server port in `server/.env`
2. Update the client environment
3. Update the portal/patient-portal proxy targets if needed
4. Update CORS origins in `server/.env` with the new client ports
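For example, moving the API to port 8040 and the client to 3041 (illustrative values, not defaults) touches these settings:

```shell
# server/.env -- new API port, plus CORS entry for the new client port
API_PORT=8040
ALLOWED_ORIGINS=http://localhost:3030,http://localhost:3041,http://localhost:3032

# client/.env.local -- point the client at the new API port
VITE_DEV_API_PORT=8040
VITE_CLIENT_PORT=3041
```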
Longitudinal Study Testing¶
Testing multi-visit longitudinal studies requires simulating time passage and generating data at multiple timepoints. This section covers the complete workflow for testing longitudinal studies.
Overview¶
Longitudinal study testing uses three main features:
- Time Simulation - Simulate future dates to test visit windows and scheduling
- Synthetic Data - Generate test participants with realistic assessment data
- Dev Mode - Bypass consent and other constraints for testing
Enable Test Mode for a Study¶
- Navigate to Portal → Studies → Select Study → Overview
- Scroll to "Development & Testing Mode" section
- Configure the test settings:
- Notification Dry-Run: Log notifications without sending
- Time Simulation: Advance study time for testing
- Synthetic Data Generation: Enable to allow test data creation
Step 1: Generate Synthetic Participants¶
Generate test participants with enrollment dates and visit schedules:
From Portal UI:
1. Enable the "Synthetic Data Generation" toggle
2. Click "Generate Data"
3. Configure options:
   - Participants: number to generate (1-100)
   - Enrollment Spread: spread enrollment over N days (simulates staggered enrollment)
   - Generate assessments: create initial assessment data
   - Assign to arms: randomly assign study arms
   - Assign to sites: randomly assign sites
4. Click "Generate"
From API:

```shell
curl -X POST "http://localhost:8030/api/testing/studies/{study_id}/generate-synthetic-data" \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -d '{
    "participant_count": 10,
    "enrollment_spread_days": 30,
    "generate_assessments": true,
    "generate_schedules": true,
    "assign_arms": true,
    "assign_sites": true
  }'
```
Synthetic participants are marked with codes like `STUDY001-SYN-0001` and can be cleared later.
Step 2: Simulate Time Passage¶
Advance the study's effective date to test visit window transitions:
From Portal UI:
1. In "Development & Testing Mode" section
2. Use time control buttons: -1 Week, -1 Day, +1 Day, +1 Week
3. The "Effective Date" display shows the simulated date
4. Click "Update Visit Statuses" to recalculate visit states
From API:

```shell
# Set time offset (+30 days = 1 month in the future)
curl -X POST "http://localhost:8030/api/testing/studies/{study_id}/time-simulation" \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -d '{"offset_days": 30}'

# Update visit statuses based on the new effective date
curl -X POST "http://localhost:8030/api/testing/studies/{study_id}/update-visit-statuses" \
  -H "Authorization: Bearer <token>"
```
Visit Status Transitions:
As effective date advances, visits transition through these states:
- scheduled → in_window (within visit window)
- in_window → overdue (past target date)
- overdue → grace_period (if configured)
- grace_period → missed (past grace period)
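These transitions can be sketched as a small shell function. This is a simplification for illustration only: the exact boundary rules (target date vs. window close vs. grace end) live server-side and are assumptions here, with dates expressed as plain day numbers:

```shell
# Sketch of visit-status resolution. Boundary choices are illustrative
# assumptions, not the server's exact rules.
visit_status() {
  local today=$1 open=$2 target=$3 close=$4 grace_end=$5
  if   [ "$today" -lt "$open" ];      then echo "scheduled"     # before the window
  elif [ "$today" -le "$target" ];    then echo "in_window"     # window open, on time
  elif [ "$today" -le "$close" ];     then echo "overdue"       # past target date
  elif [ "$today" -le "$grace_end" ]; then echo "grace_period"  # if configured
  else                                     echo "missed"        # past grace period
  fi
}

visit_status 12 10 14 17 21   # prints "in_window"
```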
Step 3: Generate Longitudinal Data¶
Generate assessment data across multiple visits for each participant:
From API:

```shell
curl -X POST "http://localhost:8030/api/testing/studies/{study_id}/generate-longitudinal-data" \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -d '{
    "visits_to_complete": 3,
    "participants": null
  }'
```
Options:
- `visits_to_complete`: number of visits to complete (0 = all scheduled visits)
- `participants`: list of participant IDs, or `null` for all synthetic participants
Step 4: Bulk Anchor Date Setting¶
Set anchor dates for multiple participants at once (useful for testing time-based workflows):
```shell
curl -X POST "http://localhost:8030/api/testing/studies/{study_id}/bulk-anchor-date" \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -d '{
    "anchor_date": "2024-01-15",
    "participants": null,
    "reason": "testing_longitudinal_workflow"
  }'
```
This will:
1. Set anchor dates for all synthetic participants
2. Regenerate visit schedules based on the new anchor date
Step 5: Test Patient Portal Experience¶
When time simulation is active, the Patient Portal shows a purple banner displaying the simulated date. This helps testers understand what date the system is using.
Patient Portal features in test mode:
- A purple "Simulated Date" banner shows the effective date and offset
- Task due dates reflect simulated time
- Visit windows are calculated from the simulated date
Dev Mode Panel (Per-Participant Testing)¶
For detailed participant-level testing, use the Dev Mode Panel on the Participant Detail page:
- Navigate to Portal → Participants → Select Participant
- Expand the "Dev Mode Tools" panel
- Available actions:
- Override Anchor Date: Set enrollment date directly
- Trigger Message Now: Send scheduled notifications immediately
- Force Consent Bypass: Allow scheduling without signed consent
Bypass status indicators:
- Consent: Valid | Bypassed | Missing
- Scheduling: Allowed | Bypassed
- Messaging: Allowed | Bypassed
Clear Test Data¶
When done testing, remove all synthetic data:
From Portal UI:
1. Open the "Development & Testing Mode" section
2. Click "Clear Synthetic Data"
3. Confirm deletion
From API:

```shell
curl -X DELETE "http://localhost:8030/api/testing/studies/{study_id}/synthetic-data" \
  -H "Authorization: Bearer <token>"
```
This removes:
- Synthetic participants (codes containing -SYN-)
- Their assessment sessions and responses
- Scheduled visits
- Anchor date records
Real participant data is never affected.
Complete Longitudinal Test Workflow¶
Here's the recommended sequence for comprehensive longitudinal study testing:
```
1. Create/configure study with visit schedule
   └─ Define visit windows (Baseline, Week 4, Week 12, etc.)

2. Enable test mode
   └─ Enable synthetic data generation
   └─ Enable notifications dry-run

3. Generate synthetic participants (10-20)
   └─ Spread enrollment over 30 days
   └─ Generate schedules automatically

4. Verify baseline visits
   └─ Check all participants have scheduled visits
   └─ Verify visit windows are correct

5. Simulate +30 days (Week 4)
   └─ Update visit statuses
   └─ Verify Week 4 visits are "in_window"
   └─ Test reminder triggering via Dev Mode Panel

6. Generate longitudinal data (2 visits)
   └─ Creates data for Baseline and Week 4

7. Simulate +90 days (Week 12)
   └─ Update visit statuses
   └─ Verify Week 4 visits show as "completed"
   └─ Verify Week 12 visits are "in_window"

8. Test patient portal experience
   └─ Log in as a test participant
   └─ Verify the correct tasks are shown
   └─ Complete an assessment

9. Test consent withdrawal
   └─ Remove consent from a participant
   └─ Verify scheduling is blocked
   └─ Verify messaging is blocked

10. Reset and clean up
    └─ Reset time simulation
    └─ Clear synthetic data
```
Testing API Reference¶
| Endpoint | Method | Description |
|---|---|---|
| `/api/testing/studies/{id}/config` | GET | Get test mode config |
| `/api/testing/studies/{id}/config` | PUT | Update test mode config |
| `/api/testing/studies/{id}/time-simulation` | POST | Set time offset |
| `/api/testing/studies/{id}/time-simulation/reset` | POST | Reset to today |
| `/api/testing/studies/{id}/update-visit-statuses` | POST | Recalculate visit statuses |
| `/api/testing/studies/{id}/generate-synthetic-data` | POST | Create test participants |
| `/api/testing/studies/{id}/generate-longitudinal-data` | POST | Create multi-visit data |
| `/api/testing/studies/{id}/bulk-anchor-date` | POST | Set anchor dates for multiple participants |
| `/api/testing/studies/{id}/synthetic-data` | DELETE | Clear all synthetic data |
Dev Mode API Reference¶
| Endpoint | Method | Description |
|---|---|---|
| `/api/dev/status` | GET | Get dev mode status |
| `/api/dev/anchor-date/override` | POST | Override participant's anchor date |
| `/api/dev/message/trigger-now/{id}` | POST | Trigger scheduled message immediately |
| `/api/dev/participants/{id}/bypass-preview` | GET | Preview what would be bypassed |
| `/api/dev/participants/{id}/consent-status` | GET | Get consent status |
| `/api/dev/participants/{id}/force-consent-bypass` | POST | Force bypass consent gate |
Testing Checklist¶
Before going live with a study:
Battery Testing¶
- [ ] Test battery loads correctly in browser
- [ ] Test all modules complete without errors
- [ ] Verify data submission succeeds
- [ ] Check summaries are calculated correctly
Longitudinal Study Testing¶
- [ ] Generate synthetic participants with schedules
- [ ] Simulate time passage through all visit windows
- [ ] Verify visit status transitions (scheduled → in_window → overdue)
- [ ] Test reminder triggering via Dev Mode Panel
- [ ] Generate longitudinal data across multiple visits
- [ ] Test consent withdrawal blocks scheduling/messaging
- [ ] Clear synthetic data after testing
REDCap Integration (if applicable)¶
- [ ] REDCap config is correct
- [ ] Test sync with a test participant
- [ ] Verify data appears in REDCap
- [ ] Check field mappings are correct
Patient Portal¶
- [ ] Magic link authentication works
- [ ] Scheduled tasks appear
- [ ] Assessment launch works
- [ ] Consent flow (if required) works
- [ ] Time simulation banner shows when active
Mobile Testing (if applicable)¶
- [ ] Test on iOS simulator
- [ ] Test on Android emulator
- [ ] Verify touch interactions work
- [ ] Check offline behavior (if applicable)
Common Issues¶
"Failed to Load Battery"¶
- Check API port (should be 8030)
- Verify battery UUID is correct
- Check CORS configuration
- Look for errors in browser console
iOS Simulator Network Errors¶
- Ensure the server is running on `0.0.0.0`
- Check that `VITE_NATIVE_API_URL` uses the correct IP
- Verify CORS includes the `capacitor://` origins
REDCap Sync Failures¶
- Check API token validity
- Verify event names match
- Check field mappings
- Review server logs for details
Authentication Issues (Patient Portal)¶
- Magic link tokens are single-use
- Check token hasn't expired
- Verify CORS allows portal origin