Chapter 3: Operational Guide
This chapter is an operational guide. It describes the standard procedures for developing, maintaining, and administering the "Augmented Financial Analyst" application.
3.1. The Ideal Development Workflow
Any change to the project, whether it's a bug fix, a new feature, or a simple text update, must follow this rigorous process to ensure production stability.
3.1.1. Create a Working Branch
Never work directly on the main branch. Every new task begins by creating a dedicated branch from the most up-to-date version of main.
```bash
# Ensure the local `main` branch is up-to-date
git checkout main
git pull origin main

# Create and switch to a new descriptive branch
git checkout -b <change-type>/<short-description>
```

Examples: `feature/add-new-chart`, `fix/login-page-bug`.
3.1.2. Local Code Modification
Make all necessary code changes in your local development environment (e.g., VS Code).
3.1.3. Local Test Validation
Before submitting your work, run the entire test suite (`pytest -v`) to ensure your changes have not introduced any regressions. This command must be run from the root of the application repository.
All tests must pass (or be intentionally skipped).
3.1.4. Save Your Work (Commit)
Save your changes to the Git history with a clear and concise commit message, following the "Conventional Commits" convention.
```bash
# Add the modified files
git add .

# Create the commit
git commit -m "type(scope): description of the change"
```

Examples: `feat(dashboard): Add 52-week analysis panel`, `fix(pipeline): Correct currency conversion logic`.
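As a quick sanity check, a message in this shape can be validated with a few lines of Python. This is a minimal sketch: the list of types and the scope rule below are illustrative, not the project's official policy.

```python
import re

# Illustrative subset of Conventional Commits types; adjust to project policy.
COMMIT_RE = re.compile(r"^(feat|fix|docs|refactor|test|chore)\([\w-]+\): .+")

def is_conventional(message: str) -> bool:
    """Return True if the first line matches `type(scope): description`."""
    lines = message.splitlines()
    return bool(lines) and bool(COMMIT_RE.match(lines[0]))

print(is_conventional("feat(dashboard): Add 52-week analysis panel"))  # → True
print(is_conventional("update stuff"))  # → False
```

Such a check is often wired into a `commit-msg` Git hook so malformed messages are rejected before they reach review.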
3.1.5. Share and Review (Pull Request)
Push your working branch to the GitHub repository and open a "Pull Request" (PR).
On GitHub, create the Pull Request targeting the `main` branch. The PR is the opportunity to describe your changes and allow the CI/CD system to run the tests one last time in a clean environment.
3.1.6. Merging
Once the Pull Request is approved (reviewed and CI tests passed), merge it into main.
3.2. Data Maintenance Procedures
3.2.1. Update Portfolio Composition
- When: When you buy or sell stocks, or modify quantities.
- Procedure:
- Open the `data/tipranks_raw.csv` file on your local machine.
- Modify, add, or delete the necessary rows.
- Crucial: For any new row, ensure you fill in the `Marketstack_Ticker` and `Marketstack_Currency` mapping columns after researching them on the Marketstack website.
- Follow the Ideal Development Workflow (commit, push, merge). The data pipeline will process the changes from the CSV.
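To illustrate the crucial mapping requirement, here is a small sketch that flags rows whose Marketstack columns are empty. The two mapping column names come from this guide; the `Ticker` column and the helper itself are hypothetical, not part of the pipeline.

```python
import csv
import io

def missing_mappings(csv_text: str) -> list[str]:
    """Return tickers of rows whose Marketstack mapping columns are empty."""
    reader = csv.DictReader(io.StringIO(csv_text))
    bad = []
    for row in reader:
        if not row.get("Marketstack_Ticker") or not row.get("Marketstack_Currency"):
            bad.append(row.get("Ticker", "<unknown>"))
    return bad

# Hypothetical excerpt in the spirit of data/tipranks_raw.csv
sample = """Ticker,Quantity,Marketstack_Ticker,Marketstack_Currency
AAPL,10,AAPL,USD
SHOP,5,,
"""
print(missing_mappings(sample))  # → ['SHOP']
```

Running a check like this before committing catches the most common data-entry mistake: a new position without its Marketstack mapping.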
3.2.2. Manually Run the Data Pipeline
- When: For a one-time data refresh or to force processing for a specific day after an error.
- Procedure:
- SSH into the VPS.
- Navigate to the project directory: `cd /var/www/qa-automated-pipeline`.
- Run the command: `docker compose exec app python -m code_source_simule.pipeline`.
- Important Behavior: This command will run the pipeline for the previous day (D-1). For example, if you run it on a Tuesday at 4 PM, it will process Monday's data. It will not fetch the current prices for Tuesday.
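The D-1 behaviour can be pictured with a few lines of Python. This is a sketch of the scheduling rule described above, not the pipeline's actual code.

```python
from datetime import date, timedelta

def pipeline_target_date(run_date: date) -> date:
    """The pipeline processes the day before the run date (D-1)."""
    return run_date - timedelta(days=1)

# Run on Tuesday 2024-06-11 → processes Monday 2024-06-10
print(pipeline_target_date(date(2024, 6, 11)))  # → 2024-06-10
```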
3.3. Production Server Administration (VPS)
3.3.1. Check Application Status
- View active containers: `docker ps` (should show `qa-automated-pipeline-app-1` and `qa-automated-pipeline-db-1` with status `Up`).
- Application logs (Gunicorn/Flask): `cd /var/www/qa-automated-pipeline && docker compose logs -f app` (`-f` to follow in real-time).
- Web server error logs: `sudo tail -f /var/log/nginx/error.log`.
3.3.2. Restart Services
- Full restart (App + DB): `cd /var/www/qa-automated-pipeline && docker compose restart`.
- Restart the application only: `cd /var/www/qa-automated-pipeline && docker compose restart app`.
- Restart Nginx: `sudo systemctl restart nginx`.
3.3.3. Manage the Automated Pipeline (cron)
- List scheduled tasks: `crontab -l`.
- Edit scheduled tasks: `crontab -e`.
- Recommendation: Schedule an execution early in the morning (e.g., `30 4 * * 2-6` for 4:30 AM from Tuesday to Saturday). This ensures that the previous day's closing data is final.
- Check logs of the last execution: `cat /var/log/cron-pipeline.log`.
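To read the recommended schedule, recall that the fifth cron field is the day of week (0 = Sunday in most cron implementations). The toy parser below handles just this one `M H * * D-D` pattern, purely as an illustration:

```python
DAYS = ["Sun", "Mon", "Tue", "Wed", "Thu", "Fri", "Sat"]

def describe(entry: str) -> str:
    """Describe a simple `M H * * D-D` cron entry in plain words."""
    minute, hour, _dom, _month, dow = entry.split()
    start, end = (int(p) for p in dow.split("-"))
    days = ", ".join(DAYS[d] for d in range(start, end + 1))
    return f"{int(hour):02d}:{int(minute):02d} on {days}"

print(describe("30 4 * * 2-6"))  # → 04:30 on Tue, Wed, Thu, Fri, Sat
```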
3.3.4. Interact with the Database
- SSH into the VPS.
- Navigate to the project directory: `cd /var/www/qa-automated-pipeline`.
- Load environment variables: `source .env`.
- Launch the MariaDB client: `docker compose exec db mysql -u"$DB_PROD_USER" -p"$DB_PROD_PASSWORD" "$DB_PROD_NAME"`.
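The `.env` file read by `source .env` is a plain list of `KEY=value` lines. As an illustration, the variables used above could be loaded in Python like this. It is a minimal sketch that ignores quoting and `export` prefixes a real parser would handle, and the sample values are invented:

```python
def parse_env(text: str) -> dict[str, str]:
    """Parse simple KEY=value lines, skipping blanks and comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

# Hypothetical .env contents matching the variable names used above
sample_env = """# Production database credentials
DB_PROD_USER=analyst
DB_PROD_PASSWORD=secret
DB_PROD_NAME=portfolio
"""
print(parse_env(sample_env)["DB_PROD_NAME"])  # → portfolio
```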
3.4. Dependency Management
To add a new Python library (e.g., `new-library`):
1. On your local machine, with the venv activated, install the library: `pip install new-library`.
2. Update the `requirements.txt` file (typically `pip freeze > requirements.txt`). This is the most important step to ensure reproducibility.
3. Run the test suite locally to confirm nothing is broken (`pytest -v`).
4. Follow the Ideal Development Workflow (commit, push, etc.). The deployment process should rebuild the Docker image with the new library (`--build`), making the dependency available in production.
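Because reproducibility depends on exact pins, a quick check that every line in `requirements.txt` is pinned with `==` can look like this. The helper and the sample file contents are illustrative, not part of the project:

```python
import re

PIN_RE = re.compile(r"^[A-Za-z0-9._-]+==[\w.]+$")

def unpinned(requirements_text: str) -> list[str]:
    """Return requirement lines that are not pinned with `==`."""
    lines = [line.strip() for line in requirements_text.splitlines()]
    return [l for l in lines if l and not l.startswith("#") and not PIN_RE.match(l)]

# Hypothetical requirements.txt contents
sample = """flask==3.0.3
requests>=2.0
# comment
pandas==2.2.2
"""
print(unpinned(sample))  # → ['requests>=2.0']
```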
3.5. Project Documentation Management
The project documentation (the chapters you are currently reading), the test coverage report, and the architect's log are managed with MkDocs.
3.5.1. Build and View Documentation Locally
The documentation site is built from several source directories (`project_docs/docs`, `test_cases/`, etc.). A command is provided to assemble these sources into a final build directory that MkDocs can use.

The process is a two-step sequence:

1. Build the `docs` directory: From the project root, run `python manage.py build-docs`. This command creates (or cleans and recreates) the `docs/` directory, populating it with all necessary Markdown files.
2. Run the local preview server: Once the `docs/` directory is built, run the MkDocs development server (`mkdocs serve`). You can now open your browser to `http://127.0.0.1:8000` to see the live preview.

Note: You must have the documentation dependencies installed: `pip install -r project_docs/docs-requirements.txt`.
3.5.2. Update the Documentation
- To modify the main documentation (like this guide), edit the Markdown files located in `project_docs/docs/`.
- To update the functional test cases, edit the Markdown files in `test_cases/`.
- After making any changes, you must re-run `python manage.py build-docs` for your changes to be reflected in the local preview.