Chapter 3: Operational Guide
This chapter describes the standard procedures for developing, maintaining, and administering the "Augmented Financial Analyst" application.
3.1. The Ideal Development Workflow
Any change to the project, whether it's a bug fix, a new feature, or a simple text update, must follow this rigorous process to ensure production stability.
3.1.1. Create a Working Branch
Never work directly on the main branch. Every new task begins by creating a dedicated branch from the most up-to-date version of main.
# Ensure the local `main` branch is up-to-date
git checkout main
git pull origin main
# Create and switch to a new descriptive branch
git checkout -b <change-type>/<short-description>
Examples: `feature/add-new-chart`, `fix/login-page-bug`.
3.1.2. Local Code Modification
Make all necessary code changes in your local development environment (e.g., VS Code).
3.1.3. Local Test Validation
Before submitting your work, run the entire test suite to ensure your changes have not introduced any regressions. This command must be run from the root of the application repository: `pytest -v`.
All tests must pass (or be intentionally skipped).
3.1.4. Save Your Work (Commit)
Save your changes to the Git history with a clear and concise commit message, following the "Conventional Commits" convention.
# Add the modified files
git add .
# Create the commit
git commit -m "type(scope): description of the change"
Examples: `feat(dashboard): Add 52-week analysis panel`, `fix(pipeline): Correct currency conversion logic`.
3.1.5. Share and Review (Pull Request)
Push your working branch to the GitHub repository and open a "Pull Request" (PR).
Push your branch (`git push -u origin <branch-name>`) and, on GitHub, create the Pull Request targeting the `main` branch. The PR is the opportunity to describe your changes and lets the CI/CD system run the tests one last time in a clean environment.
3.1.6. Merging
Once the Pull Request is approved (reviewed and CI tests passed), merge it into main.
3.2. Data Maintenance Procedures
3.2.1. Update Portfolio Composition
- When: You buy or sell stocks, or modify quantities.
- Procedure:
- Open the `data/tipranks_raw.csv` file on your local machine.
- Modify, add, or delete the necessary rows.
- Crucial: for any new row, ensure you fill in the `Marketstack_Ticker` and `Marketstack_Currency` mapping columns after researching them on the Marketstack website.
- Follow the Ideal Development Workflow (commit, push, merge). The data pipeline will process the changes from the CSV.
3.2.2. Manually Run the Data Pipeline
- When: For a one-time data refresh or to force processing for a specific day after an error.
- Procedure:
- SSH into the VPS.
- Navigate to the project directory: `cd /var/www/qa-automated-pipeline`.
- Run the command: `docker compose exec app python -m code_source_simule.pipeline`.
- Important Behavior: This command targets the previous market day. Sunday runs are blocked; Monday runs are blocked by default and allowed only as a morning catch-up when the previously recorded run failed.
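This blocking behavior can be expressed as a small pure function. The sketch below is an illustration of the rule only, not the pipeline's actual implementation; the `last_run_failed` flag stands in for however the pipeline records the status of the previous run:

```python
from datetime import date

def is_run_allowed(run_date: date, last_run_failed: bool) -> bool:
    """Illustrative sketch of the pipeline's scheduling guardrail.

    Sunday runs are always blocked; Monday runs are allowed only as a
    morning catch-up when the previously recorded run failed; any other
    day is allowed.
    """
    weekday = run_date.weekday()  # Monday == 0 ... Sunday == 6
    if weekday == 6:   # Sunday: blocked unconditionally
        return False
    if weekday == 0:   # Monday: catch-up only after a recorded failure
        return last_run_failed
    return True
```

Under this rule, a Monday run after a clean Saturday run is a no-op, while a Monday run after a failed Saturday run proceeds as a catch-up.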
3.2.3. Configure SMTP Alerts
- When: Before enabling production operator email notifications.
- Procedure: Configure these environment variables on the target runtime (local shell, VPS, or CI secret store):
- `SMTP_HOST`
- `SMTP_PORT`
- `SMTP_USER`
- `SMTP_PASSWORD`
- `ALERT_EMAIL_TO`
- `SMTP_SECURITY` (`starttls`, `ssl`, `none`) — default `starttls`
- `SMTP_TIMEOUT_SECONDS` — default `10`
- Recommended profiles:
- Port `587` + `SMTP_SECURITY=starttls`
- Port `465` + `SMTP_SECURITY=ssl`
- Fail-open behavior: if SMTP is misconfigured or unreachable, the pipeline continues and logs a warning/event for diagnosis.
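The fail-open contract can be sketched as follows. This is an illustrative wrapper, not the project's actual alert code; the function name is an assumption, and only the `starttls` profile is shown:

```python
import logging
import smtplib

log = logging.getLogger("pipeline.alerts")

def send_alert_fail_open(host: str, port: int, user: str, password: str,
                         to_addr: str, subject: str, body: str,
                         timeout: float = 10.0) -> bool:
    """Try to send an operator alert; log and return False on any failure.

    The pipeline must keep running even when SMTP is misconfigured or
    unreachable, so no exception is allowed to escape this function.
    """
    try:
        with smtplib.SMTP(host, port, timeout=timeout) as smtp:
            smtp.starttls()                      # SMTP_SECURITY=starttls profile
            smtp.login(user, password)
            message = f"Subject: {subject}\r\n\r\n{body}"
            smtp.sendmail(user, [to_addr], message)
        return True
    except Exception as exc:
        # Fail-open: record the problem for diagnosis and carry on.
        log.warning("SMTP alert could not be sent: %s", exc)
        return False
```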
3.2.4. Troubleshoot Missing API Prices (Marketstack)
- When: UI history stops updating or alerts report a high number of missing API prices.
- Procedure:
- Confirm cron timing first (`crontab -l`) and verify the last run log (`tail -n 80 /var/log/cron-pipeline.log`).
- Reproduce the failing date directly against provider endpoints in Postman or curl.
- Compare `v1/eod` and `v2/eod` for the same symbols and date range before any code change.
- If `v2/eod` returns valid data and `v1/eod` does not, treat this as endpoint compatibility and align the pipeline's endpoint usage.
- Re-run one manual import and verify the DB/UI refresh before closing the incident.
- Rule: For external API incidents, provider-side endpoint/version reproduction is mandatory before implementing mitigations.
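For the v1/v2 comparison it helps to issue byte-for-byte comparable requests. The helper below simply builds the two URLs; the base URL and parameter names follow Marketstack's public EOD endpoint conventions, but treat them as assumptions to verify against the current provider documentation:

```python
from urllib.parse import urlencode

def eod_compare_urls(symbols: str, date_from: str, date_to: str,
                     access_key: str,
                     base: str = "https://api.marketstack.com") -> dict:
    """Build matching v1/eod and v2/eod request URLs for one symbol set.

    Using identical query strings ensures any difference in the responses
    comes from the endpoint version, not from the request itself.
    """
    query = urlencode({
        "access_key": access_key,
        "symbols": symbols,
        "date_from": date_from,
        "date_to": date_to,
    })
    return {version: f"{base}/{version}/eod?{query}" for version in ("v1", "v2")}
```

The two returned URLs can then be pasted directly into curl or Postman for side-by-side comparison.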
3.2.5. Run User E2E/API Commands (local Windows)
- When: To manually validate application login, reusable authentication state, dashboard rendering, and Marketstack API scenarios.
- Standard procedure:
- Open PowerShell in `e2e/`.
- Save secrets once (or after credential changes): `npm run secrets:save:windows`
- Rebuild login session state: `npm run test:auth:setup:windows`
- Validate access without re-entering login: `npm run test:skip-login:windows`
- If only the Marketstack API key changes: `npm run secrets:save:windows:marketstack`
- Run secure Marketstack API tests: `npm run test:marketstack:windows`
- Run the dedicated dashboard suite without opening the browser: `npm run test:dashboard`
- Run the same dashboard suite with the browser visible: `npm run test:dashboard:headed`
- Useful shortcuts:
- Full E2E suite: `npm run test:e2e`
- Local CI subset aligned with the Marketstack business suite: `npm run test:e2e:ci`
- Local Marketstack CI subset: `npm run test:marketstack:ci`
- Security rule: never store `MARKETSTACK_API_KEY` in plaintext in the repository; use only local Windows secrets (DPAPI) and GitHub secrets in CI.
3.3. Production Server Administration (VPS)
3.3.1. Check Application Status
- View active containers: `docker ps` (should show `qa-automated-pipeline-app-1` and `qa-automated-pipeline-db-1` with status `Up`).
- Application logs (Gunicorn/Flask): `cd /var/www/qa-automated-pipeline && docker compose logs -f app` (`-f` to follow in real time).
- Web server error logs: `sudo tail -f /var/log/nginx/error.log`.
3.3.2. Restart Services
- Full restart (App + DB): `cd /var/www/qa-automated-pipeline && docker compose restart`.
- Restart the application only: `cd /var/www/qa-automated-pipeline && docker compose restart app`.
- Restart Nginx: `sudo systemctl restart nginx`.
3.3.3. Manage the Automated Pipeline (cron)
- List scheduled tasks: `crontab -l`.
- Edit scheduled tasks: `crontab -e`.
- Recommendation: keep a simple schedule from Monday to Saturday morning (server local time), and let the pipeline guardrails enforce the Sunday block and the Monday catch-up-only-on-failure logic.
- Check the log of the last execution: `cat /var/log/cron-pipeline.log`.
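As an illustration, a Monday-to-Saturday schedule matching the paths above could look like this in `crontab -e`; the 07:30 run time and the `-T` flag (no TTY under cron) are assumptions to adapt to your server:

```crontab
# m  h  dom mon dow   command   (dow 1-6 = Monday..Saturday; Sunday never runs,
# and the pipeline's own guardrails additionally gate Monday catch-up runs)
30 7 * * 1-6 cd /var/www/qa-automated-pipeline && docker compose exec -T app python -m code_source_simule.pipeline >> /var/log/cron-pipeline.log 2>&1
```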
3.3.4. Interact with the Database
- SSH into the VPS.
- Navigate to the project directory: `cd /var/www/qa-automated-pipeline`.
- Load environment variables: `source .env`.
- Launch the MariaDB client: `docker compose exec db mysql -u"$DB_PROD_USER" -p"$DB_PROD_PASSWORD" "$DB_PROD_NAME"`.
3.4. Dependency Management
To add a new Python library (e.g., `new-library`):
1. On your local machine, with the venv activated, install the library: `pip install new-library`.
2. Update the `requirements.txt` file. This is the most important command to ensure reproducibility: `pip freeze > requirements.txt`.
3. Run the full test suite to confirm the new dependency introduces no regressions (`pytest -v`).
4. Follow the Ideal Development Workflow (commit, push, etc.). The deployment process should rebuild the Docker image with the new library (`--build`), making the dependency available in production.
3.5. Project Documentation Management
The project documentation (the chapters you are currently reading), the test coverage report, and the architect's log are managed with MkDocs.
3.5.1. Build and View Documentation Locally
The documentation source files live directly in the docs/ directory. To preview locally, run the MkDocs development server from the project root: `mkdocs serve`.
You can now open your browser to http://127.0.0.1:8000 to see the live preview.
Note: You must have the documentation dependencies installed: `pip install -r docs/docs-requirements.txt`.
3.5.2. Update the Documentation
- To modify the main documentation (like this guide), edit the Markdown files located in `docs/`.
- To update the functional test cases, edit the Markdown files in `test_cases/`.
- Changes are reflected immediately in the local preview server.
3.6. Formalized Debugging Plan
To ensure consistent, reliable, and well-documented bug fixes, all bugs must follow this standardized debugging workflow:
3.6.1. Overview of the Standardized Plan
- Create a GitHub Issue — Brief statement of the problem (the constat, i.e. the initial observation) without diagnosis details.
- Create a Dedicated Branch — Branch naming: `fix/<short-issue-description>`.
- Develop TDD Tests — Write tests that validate the expected behavior or expose the bug.
- Implement the Fix — Make code changes to pass the new tests.
- Test the Fix — Run new tests and validate the fix works.
- Execute Regression Tests — Ensure all existing tests still pass.
- Update Test Coverage — Generate and update coverage reports.
- Document in an Activity Report — Create a report following the established format (see examples in `docs/activity_report/`).
- Update Related Documentation — Review and update chapters 1, 2, and 3 if the fix impacts architecture or procedures.
- Formalize as a Project Rule — Verify this plan remains codified and updated in Chapter 3.
3.6.2. GitHub Issue Template
Every issue should be concise and state only the initial observation (no investigation or diagnosis):
## Bug Title
### Constat
Brief description of the symptom or failure observed in production.
Include error messages, dates, and reproducibility information if available.
---
(Investigation and solution details will be documented in the PR and activity report.)
3.6.3. Branch Naming Convention
- Bug fixes: `fix/<short-description>` (e.g., `fix/ticker-column-length-bug`)
- Features: `feature/<short-description>`
- Refactors: `refactor/<short-description>`
3.6.4. Writing TDD Tests
New tests should be added to the appropriate test file and:
- Use descriptive test IDs (e.g., `tc-pipe-ticker01`).
- Include a docstring explaining what is being validated.
- Validate both the happy path and edge cases (e.g., maximum length, empty values, invalid types).
Example structure:
@pytest.mark.test_id("tc-pipe-ticker01")
def test_insert_ticker_with_reasonable_length(self, clean_db):
    """Validates that a ticker of reasonable length inserts successfully."""
    # GIVEN: setup
    # WHEN: action
    # THEN: assertion
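Filled in, such a test might look like the sketch below. The `insert_ticker` helper, the in-memory store, and the length limit are all hypothetical stand-ins for the project's real fixture and database code:

```python
import pytest

MAX_TICKER_LENGTH = 16  # hypothetical schema limit, for illustration only

def insert_ticker(store: list, ticker: str) -> None:
    """Hypothetical stand-in for the pipeline's real DB-insert logic."""
    if not ticker:
        raise ValueError("empty ticker")
    if len(ticker) > MAX_TICKER_LENGTH:
        raise ValueError("ticker exceeds column length")
    store.append(ticker)

@pytest.mark.test_id("tc-pipe-ticker01")
def test_insert_ticker_with_reasonable_length():
    """Validates that a ticker of reasonable length inserts successfully."""
    # GIVEN: an empty in-memory store standing in for the clean_db fixture
    store = []
    # WHEN: inserting a normally sized ticker
    insert_ticker(store, "AAPL")
    # THEN: the ticker was stored unchanged
    assert store == ["AAPL"]
```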
3.6.6. Checklist Before Merging
- GitHub issue created with clear constat
- Dedicated branch created and pushed
- TDD tests written and passing
- Bug fix implemented and working
- All regression tests pass (40+ tests)
- Test coverage reports updated
- Activity report documented
- Related documentation chapters reviewed and updated
- Commit messages follow "Conventional Commits" standard
- Pull Request reviewed and approved
- Merged into `main` and scheduled for production deployment