Add account deletion scheduler and comprehensive tests
All checks were successful
Gitea Actions Demo / build-and-push (push) Successful in 49s
All checks were successful
Gitea Actions Demo / build-and-push (push) Successful in 49s
- Implemented account deletion scheduler in `account_deletion_scheduler.py` to manage user deletions based on a defined threshold. - Added logging for deletion processes, including success and error messages. - Created tests for deletion logic, including edge cases, retry logic, and integration tests to ensure complete deletion workflows. - Ensured that deletion attempts are tracked and that users are marked for manual intervention after exceeding maximum attempts. - Implemented functionality to check for interrupted deletions on application startup and retry them.
This commit is contained in:
3
.github/copilot-instructions.md
vendored
3
.github/copilot-instructions.md
vendored
@@ -4,6 +4,7 @@
|
|||||||
|
|
||||||
- **Stack**: Flask (Python, backend) + Vue 3 (TypeScript, frontend) + TinyDB (JSON, thread-safe, see `db/`).
|
- **Stack**: Flask (Python, backend) + Vue 3 (TypeScript, frontend) + TinyDB (JSON, thread-safe, see `db/`).
|
||||||
- **API**: RESTful endpoints in `api/`, grouped by entity (child, reward, task, user, image, etc). Each API file maps to a business domain.
|
- **API**: RESTful endpoints in `api/`, grouped by entity (child, reward, task, user, image, etc). Each API file maps to a business domain.
|
||||||
|
- **Nginx Proxy**: Frontend nginx proxies `/api/*` to backend, stripping the `/api` prefix. Backend endpoints should NOT include `/api` in their route definitions. Example: Backend defines `@app.route('/user')`, frontend calls `/api/user`.
|
||||||
- **Models**: Maintain strict 1:1 mapping between Python `@dataclass`es (`backend/models/`) and TypeScript interfaces (`frontend/vue-app/src/common/models.ts`).
|
- **Models**: Maintain strict 1:1 mapping between Python `@dataclass`es (`backend/models/`) and TypeScript interfaces (`frontend/vue-app/src/common/models.ts`).
|
||||||
- **Database**: Use TinyDB with `from_dict()`/`to_dict()` for serialization. All logic should operate on model instances, not raw dicts.
|
- **Database**: Use TinyDB with `from_dict()`/`to_dict()` for serialization. All logic should operate on model instances, not raw dicts.
|
||||||
- **Events**: Real-time updates via Server-Sent Events (SSE). Every mutation (add/edit/delete/trigger) must call `send_event_for_current_user` (see `backend/events/`).
|
- **Events**: Real-time updates via Server-Sent Events (SSE). Every mutation (add/edit/delete/trigger) must call `send_event_for_current_user` (see `backend/events/`).
|
||||||
@@ -19,7 +20,7 @@
|
|||||||
- **Code Style**:
|
- **Code Style**:
|
||||||
1. Follow PEP 8 for Python, and standard TypeScript conventions.
|
1. Follow PEP 8 for Python, and standard TypeScript conventions.
|
||||||
2. Use type annotations everywhere in Python.
|
2. Use type annotations everywhere in Python.
|
||||||
3. Place python changes after imports. Place all imports at the top of the file.
|
3. Place all imports at the top of the file.
|
||||||
4. Vue files should specifically place `<template>`, `<script>`, then `<style>` in that order. Make sure to put ts code in `<script>` only.
|
4. Vue files should specifically place `<template>`, `<script>`, then `<style>` in that order. Make sure to put ts code in `<script>` only.
|
||||||
|
|
||||||
## 🚦 Frontend Logic & Event Bus
|
## 🚦 Frontend Logic & Event Bus
|
||||||
|
|||||||
318
.github/specs/active/feat-account-delete-scheduler.md
vendored
Normal file
318
.github/specs/active/feat-account-delete-scheduler.md
vendored
Normal file
@@ -0,0 +1,318 @@
|
|||||||
|
# Feature: Account Deletion Scheduler
|
||||||
|
|
||||||
|
## Overview
|
||||||
|
|
||||||
|
**Goal:** Implement a scheduler in the backend that will delete accounts that are marked for deletion after a period of time.
|
||||||
|
|
||||||
|
**User Story:**
|
||||||
|
As an administrator, I want accounts that are marked for deletion to be deleted around X amount of hours after they were marked. I want the time to be adjustable.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Configuration
|
||||||
|
|
||||||
|
### Environment Variables
|
||||||
|
|
||||||
|
- `ACCOUNT_DELETION_THRESHOLD_HOURS`: Hours to wait before deleting marked accounts (default: 720 hours / 30 days)
|
||||||
|
- **Minimum:** 24 hours (enforced for safety)
|
||||||
|
- **Maximum:** 720 hours (30 days)
|
||||||
|
- Configurable via environment variable with validation on startup
|
||||||
|
|
||||||
|
### Scheduler Settings
|
||||||
|
|
||||||
|
- **Check Interval:** Every 1 hour
|
||||||
|
- **Implementation:** APScheduler (BackgroundScheduler)
|
||||||
|
- **Restart Handling:** On app restart, scheduler checks for users with `deletion_in_progress = True` and retries them
|
||||||
|
- **Retry Logic:** Maximum 3 attempts per user; tracked via `deletion_attempted_at` timestamp
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Data Model Changes
|
||||||
|
|
||||||
|
### User Model (`backend/models/user.py`)
|
||||||
|
|
||||||
|
Add two new fields to the `User` dataclass:
|
||||||
|
|
||||||
|
- `deletion_in_progress: bool` - Default `False`. Set to `True` when deletion is actively running
|
||||||
|
- `deletion_attempted_at: datetime | None` - Default `None`. Timestamp of last deletion attempt
|
||||||
|
|
||||||
|
**Serialization:**
|
||||||
|
|
||||||
|
- Both fields must be included in `to_dict()` and `from_dict()` methods
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Deletion Process & Order
|
||||||
|
|
||||||
|
When a user is due for deletion (current time >= `marked_for_deletion_at` + threshold), the scheduler performs deletion in this order:
|
||||||
|
|
||||||
|
1. **Set Flag:** `deletion_in_progress = True` (prevents concurrent deletion)
|
||||||
|
2. **Pending Rewards:** Remove all pending rewards for user's children
|
||||||
|
3. **Children:** Remove all children belonging to the user
|
||||||
|
4. **Tasks:** Remove all user-created tasks (where `user_id` matches)
|
||||||
|
5. **Rewards:** Remove all user-created rewards (where `user_id` matches)
|
||||||
|
6. **Images (Database):** Remove user's uploaded images from `image_db`
|
||||||
|
7. **Images (Filesystem):** Delete `data/images/[user_id]` directory and all contents
|
||||||
|
8. **User Record:** Remove the user from `users_db`
|
||||||
|
9. **Clear Flag:** `deletion_in_progress = False` (only if deletion failed; otherwise user is deleted)
|
||||||
|
10. **Update Timestamp:** Set `deletion_attempted_at` to current time (if deletion failed)
|
||||||
|
|
||||||
|
### Error Handling
|
||||||
|
|
||||||
|
- If any step fails, log the error and continue to next step
|
||||||
|
- If deletion fails completely, update `deletion_attempted_at` and set `deletion_in_progress = False`
|
||||||
|
- If a user has 3 failed attempts, log a critical error but continue processing other users
|
||||||
|
- Missing directories or empty tables are not considered errors
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Admin API Endpoints
|
||||||
|
|
||||||
|
### New Blueprint: `backend/api/admin_api.py`
|
||||||
|
|
||||||
|
All endpoints require JWT authentication and admin privileges.
|
||||||
|
|
||||||
|
**Note:** Endpoint paths below are as defined in Flask (without `/api` prefix). Frontend accesses them via nginx proxy at `/api/admin/*`.
|
||||||
|
|
||||||
|
#### `GET /admin/deletion-queue`
|
||||||
|
|
||||||
|
Returns list of users pending deletion.
|
||||||
|
|
||||||
|
**Response:** JSON with `count` and `users` array containing user objects with fields: `id`, `email`, `marked_for_deletion_at`, `deletion_due_at`, `deletion_in_progress`, `deletion_attempted_at`
|
||||||
|
|
||||||
|
#### `GET /admin/deletion-threshold`
|
||||||
|
|
||||||
|
Returns current deletion threshold configuration.
|
||||||
|
|
||||||
|
**Response:** JSON with `threshold_hours`, `threshold_min`, and `threshold_max` fields
|
||||||
|
|
||||||
|
#### `PUT /admin/deletion-threshold`
|
||||||
|
|
||||||
|
Updates deletion threshold (requires admin auth).
|
||||||
|
|
||||||
|
**Request:** JSON with `threshold_hours` field
|
||||||
|
|
||||||
|
**Response:** JSON with `message` and updated `threshold_hours`
|
||||||
|
|
||||||
|
**Validation:**
|
||||||
|
|
||||||
|
- Must be between 24 and 720 hours
|
||||||
|
- Returns 400 error if out of range
|
||||||
|
|
||||||
|
#### `POST /admin/deletion-queue/trigger`
|
||||||
|
|
||||||
|
Manually triggers the deletion scheduler (processes entire queue immediately).
|
||||||
|
|
||||||
|
**Response:** JSON with `message`, `processed`, `deleted`, and `failed` counts
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## SSE Event
|
||||||
|
|
||||||
|
### New Event Type: `USER_DELETED`
|
||||||
|
|
||||||
|
**File:** `backend/events/types/user_deleted.py`
|
||||||
|
|
||||||
|
**Payload fields:**
|
||||||
|
|
||||||
|
- `user_id: str` - ID of deleted user
|
||||||
|
- `email: str` - Email of deleted user
|
||||||
|
- `deleted_at: str` - ISO format timestamp of deletion
|
||||||
|
|
||||||
|
**Broadcasting:**
|
||||||
|
|
||||||
|
- Event is sent only to **admin users** (not broadcast to all users)
|
||||||
|
- Triggered immediately after successful user deletion
|
||||||
|
- Frontend admin clients can listen to this event to update UI
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Implementation Details
|
||||||
|
|
||||||
|
### File Structure
|
||||||
|
|
||||||
|
- `backend/config/deletion_config.py` - Configuration with env variable
|
||||||
|
- `backend/utils/account_deletion_scheduler.py` - Scheduler logic
|
||||||
|
- `backend/api/admin_api.py` - New admin endpoints
|
||||||
|
- `backend/events/types/user_deleted.py` - New SSE event
|
||||||
|
|
||||||
|
### Scheduler Startup
|
||||||
|
|
||||||
|
In `backend/main.py`, import and call `start_deletion_scheduler()` after Flask app setup
|
||||||
|
|
||||||
|
### Logging Strategy
|
||||||
|
|
||||||
|
**Configuration:**
|
||||||
|
|
||||||
|
- Use dedicated logger: `account_deletion_scheduler`
|
||||||
|
- Log to both stdout (for Docker/dev) and rotating file (for persistence)
|
||||||
|
- File: `logs/account_deletion.log`
|
||||||
|
- Rotation: 10MB max file size, keep 5 backups
|
||||||
|
- Format: `%(asctime)s - %(name)s - %(levelname)s - %(message)s`
|
||||||
|
|
||||||
|
**Log Levels:**
|
||||||
|
|
||||||
|
- **INFO:** Each deletion step (e.g., "Deleted 5 children for user {user_id}")
|
||||||
|
- **INFO:** Summary after each run (e.g., "Deletion scheduler run: 3 users processed, 2 deleted, 1 failed")
|
||||||
|
- **ERROR:** Individual step failures (e.g., "Failed to delete images for user {user_id}: {error}")
|
||||||
|
- **CRITICAL:** User with 3+ failed attempts (e.g., "User {user_id} has failed deletion 3 times")
|
||||||
|
- **WARNING:** Threshold set below 168 hours (7 days)
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Acceptance Criteria (Definition of Done)
|
||||||
|
|
||||||
|
### Data Model
|
||||||
|
|
||||||
|
- [x] Add `deletion_in_progress` field to User model
|
||||||
|
- [x] Add `deletion_attempted_at` field to User model
|
||||||
|
- [x] Update `to_dict()` and `from_dict()` methods for serialization
|
||||||
|
- [x] Update TypeScript User interface in frontend
|
||||||
|
|
||||||
|
### Configuration
|
||||||
|
|
||||||
|
- [x] Create `backend/config/deletion_config.py` with `ACCOUNT_DELETION_THRESHOLD_HOURS`
|
||||||
|
- [x] Add environment variable support with default (720 hours)
|
||||||
|
- [x] Enforce minimum threshold of 24 hours
|
||||||
|
- [x] Enforce maximum threshold of 720 hours
|
||||||
|
- [x] Log warning if threshold is less than 168 hours
|
||||||
|
|
||||||
|
### Backend Implementation
|
||||||
|
|
||||||
|
- [x] Create `backend/utils/account_deletion_scheduler.py`
|
||||||
|
- [x] Implement APScheduler with 1-hour check interval
|
||||||
|
- [x] Implement deletion logic in correct order (pending_rewards → children → tasks → rewards → images → directory → user)
|
||||||
|
- [x] Add comprehensive error handling (log and continue)
|
||||||
|
- [x] Add restart handling (check `deletion_in_progress` flag on startup)
|
||||||
|
- [x] Add retry logic (max 3 attempts per user)
|
||||||
|
- [x] Integrate scheduler into `backend/main.py` startup
|
||||||
|
|
||||||
|
### Admin API
|
||||||
|
|
||||||
|
- [x] Create `backend/api/admin_api.py` blueprint
|
||||||
|
- [x] Implement `GET /admin/deletion-queue` endpoint
|
||||||
|
- [x] Implement `GET /admin/deletion-threshold` endpoint
|
||||||
|
- [x] Implement `PUT /admin/deletion-threshold` endpoint
|
||||||
|
- [x] Implement `POST /admin/deletion-queue/trigger` endpoint
|
||||||
|
- [x] Add JWT authentication checks for all admin endpoints
|
||||||
|
- [ ] Add admin role validation
|
||||||
|
|
||||||
|
### SSE Event
|
||||||
|
|
||||||
|
- [x] Create `backend/events/types/user_deleted.py`
|
||||||
|
- [x] Add `USER_DELETED` to `event_types.py`
|
||||||
|
- [x] Implement admin-only event broadcasting
|
||||||
|
- [x] Trigger event after successful deletion
|
||||||
|
|
||||||
|
### Backend Unit Tests
|
||||||
|
|
||||||
|
#### Configuration Tests
|
||||||
|
|
||||||
|
- [x] Test default threshold value (720 hours)
|
||||||
|
- [x] Test environment variable override
|
||||||
|
- [x] Test minimum threshold enforcement (24 hours)
|
||||||
|
- [x] Test maximum threshold enforcement (720 hours)
|
||||||
|
- [x] Test invalid threshold values (negative, non-numeric)
|
||||||
|
|
||||||
|
#### Scheduler Tests
|
||||||
|
|
||||||
|
- [x] Test scheduler identifies users ready for deletion (past threshold)
|
||||||
|
- [x] Test scheduler ignores users not yet due for deletion
|
||||||
|
- [x] Test scheduler handles empty database
|
||||||
|
- [x] Test scheduler runs at correct interval (1 hour)
|
||||||
|
- [x] Test scheduler handles restart with `deletion_in_progress = True`
|
||||||
|
- [x] Test scheduler respects retry limit (max 3 attempts)
|
||||||
|
|
||||||
|
#### Deletion Process Tests
|
||||||
|
|
||||||
|
- [x] Test deletion removes pending_rewards for user's children
|
||||||
|
- [x] Test deletion removes children for user
|
||||||
|
- [x] Test deletion removes user's tasks (not system tasks)
|
||||||
|
- [x] Test deletion removes user's rewards (not system rewards)
|
||||||
|
- [x] Test deletion removes user's images from database
|
||||||
|
- [x] Test deletion removes user directory from filesystem
|
||||||
|
- [x] Test deletion removes user record from database
|
||||||
|
- [x] Test deletion handles missing directory gracefully
|
||||||
|
- [x] Test deletion order is correct (children before user, etc.)
|
||||||
|
- [x] Test `deletion_in_progress` flag is set during deletion
|
||||||
|
- [x] Test `deletion_attempted_at` is updated on failure
|
||||||
|
|
||||||
|
#### Edge Cases
|
||||||
|
|
||||||
|
- [x] Test deletion with user who has no children
|
||||||
|
- [x] Test deletion with user who has no custom tasks/rewards
|
||||||
|
- [x] Test deletion with user who has no uploaded images
|
||||||
|
- [x] Test partial deletion failure (continue with other users)
|
||||||
|
- [x] Test concurrent deletion attempts (flag prevents double-deletion)
|
||||||
|
- [x] Test user with exactly 3 failed attempts (logs critical, no retry)
|
||||||
|
|
||||||
|
#### Admin API Tests
|
||||||
|
|
||||||
|
- [x] Test `GET /admin/deletion-queue` returns correct users
|
||||||
|
- [x] Test `GET /admin/deletion-queue` requires authentication
|
||||||
|
- [x] Test `GET /admin/deletion-threshold` returns current threshold
|
||||||
|
- [x] Test `PUT /admin/deletion-threshold` updates threshold
|
||||||
|
- [x] Test `PUT /admin/deletion-threshold` validates min/max
|
||||||
|
- [ ] Test `PUT /admin/deletion-threshold` requires admin role
|
||||||
|
- [x] Test `POST /admin/deletion-queue/trigger` triggers scheduler
|
||||||
|
- [x] Test `POST /admin/deletion-queue/trigger` returns summary
|
||||||
|
|
||||||
|
#### Integration Tests
|
||||||
|
|
||||||
|
- [x] Test full deletion flow from marking to deletion
|
||||||
|
- [x] Test multiple users deleted in same scheduler run
|
||||||
|
- [x] Test deletion with restart midway (recovery)
|
||||||
|
|
||||||
|
### Logging & Monitoring
|
||||||
|
|
||||||
|
- [x] Configure dedicated scheduler logger with rotating file handler
|
||||||
|
- [x] Create `logs/` directory for log files
|
||||||
|
- [x] Log each deletion step with INFO level
|
||||||
|
- [x] Log summary after each scheduler run (users processed, deleted, failed)
|
||||||
|
- [x] Log errors with user ID for debugging
|
||||||
|
- [x] Log critical error for users with 3+ failed attempts
|
||||||
|
- [x] Log warning if threshold is set below 168 hours
|
||||||
|
|
||||||
|
### Documentation
|
||||||
|
|
||||||
|
- [x] Create `README.md` at project root
|
||||||
|
- [x] Document scheduler feature and behavior
|
||||||
|
- [x] Document environment variable `ACCOUNT_DELETION_THRESHOLD_HOURS`
|
||||||
|
- [x] Document deletion process and order
|
||||||
|
- [x] Document admin API endpoints
|
||||||
|
- [x] Document restart/retry behavior
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Testing Strategy
|
||||||
|
|
||||||
|
All tests should use `DB_ENV=test` and operate on test databases in `backend/test_data/`.
|
||||||
|
|
||||||
|
### Unit Test Files
|
||||||
|
|
||||||
|
- `backend/tests/test_deletion_config.py` - Configuration validation
|
||||||
|
- `backend/tests/test_deletion_scheduler.py` - Scheduler logic
|
||||||
|
- `backend/tests/test_admin_api.py` - Admin endpoints
|
||||||
|
|
||||||
|
### Test Fixtures
|
||||||
|
|
||||||
|
- Create users with various `marked_for_deletion_at` timestamps
|
||||||
|
- Create users with children, tasks, rewards, images
|
||||||
|
- Create users with `deletion_in_progress = True` (for restart tests)
|
||||||
|
|
||||||
|
### Assertions
|
||||||
|
|
||||||
|
- Database records are removed in correct order
|
||||||
|
- Filesystem directories are deleted
|
||||||
|
- Flags and timestamps are updated correctly
|
||||||
|
- Error handling works (log and continue)
|
||||||
|
- Admin API responses match expected format
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Future Considerations
|
||||||
|
|
||||||
|
- Archive deleted accounts instead of hard deletion
|
||||||
|
- Email notification to admin when deletion completes
|
||||||
|
- Configurable retry count (currently hardcoded to 3)
|
||||||
|
- Soft delete with recovery option (within grace period)
|
||||||
1
.gitignore
vendored
1
.gitignore
vendored
@@ -4,3 +4,4 @@ backend/test_data/db/pending_rewards.json
|
|||||||
backend/test_data/db/rewards.json
|
backend/test_data/db/rewards.json
|
||||||
backend/test_data/db/tasks.json
|
backend/test_data/db/tasks.json
|
||||||
backend/test_data/db/users.json
|
backend/test_data/db/users.json
|
||||||
|
logs/account_deletion.log
|
||||||
|
|||||||
3
.vscode/settings.json
vendored
3
.vscode/settings.json
vendored
@@ -16,5 +16,8 @@
|
|||||||
},
|
},
|
||||||
"editor.codeActionsOnSave": {
|
"editor.codeActionsOnSave": {
|
||||||
"source.fixAll.eslint": "explicit"
|
"source.fixAll.eslint": "explicit"
|
||||||
|
},
|
||||||
|
"chat.tools.terminal.autoApprove": {
|
||||||
|
"&": true
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
158
README.md
Normal file
158
README.md
Normal file
@@ -0,0 +1,158 @@
|
|||||||
|
# Reward - Chore & Reward Management System
|
||||||
|
|
||||||
|
A family-friendly application for managing chores, tasks, and rewards for children.
|
||||||
|
|
||||||
|
## 🏗️ Architecture
|
||||||
|
|
||||||
|
- **Backend**: Flask (Python) with TinyDB for data persistence
|
||||||
|
- **Frontend**: Vue 3 (TypeScript) with real-time SSE updates
|
||||||
|
- **Deployment**: Docker with nginx reverse proxy
|
||||||
|
|
||||||
|
## 🚀 Getting Started
|
||||||
|
|
||||||
|
### Backend
|
||||||
|
|
||||||
|
```bash
|
||||||
|
cd backend
|
||||||
|
python -m venv .venv
|
||||||
|
.venv\Scripts\activate # Windows
|
||||||
|
source .venv/bin/activate # Linux/Mac
|
||||||
|
pip install -r requirements.txt
|
||||||
|
python -m flask run --host=0.0.0.0 --port=5000
|
||||||
|
```
|
||||||
|
|
||||||
|
### Frontend
|
||||||
|
|
||||||
|
```bash
|
||||||
|
cd frontend/vue-app
|
||||||
|
npm install
|
||||||
|
npm run dev
|
||||||
|
```
|
||||||
|
|
||||||
|
## 🔧 Configuration
|
||||||
|
|
||||||
|
### Environment Variables
|
||||||
|
|
||||||
|
| Variable | Description | Default |
|
||||||
|
| ---------------------------------- | --------------------------------------------- | ------------- |
|
||||||
|
| `ACCOUNT_DELETION_THRESHOLD_HOURS` | Hours to wait before deleting marked accounts | 720 (30 days) |
|
||||||
|
| `DB_ENV` | Database environment (`prod` or `test`) | `prod` |
|
||||||
|
| `DATA_ENV` | Data directory environment (`prod` or `test`) | `prod` |
|
||||||
|
|
||||||
|
### Account Deletion Scheduler
|
||||||
|
|
||||||
|
The application includes an automated account deletion scheduler that removes user accounts marked for deletion after a configurable threshold period.
|
||||||
|
|
||||||
|
**Key Features:**
|
||||||
|
|
||||||
|
- Runs every hour checking for accounts due for deletion
|
||||||
|
- Configurable threshold between 24 hours (minimum) and 720 hours (maximum)
|
||||||
|
- Automatic retry on failure (max 3 attempts)
|
||||||
|
- Restart-safe: recovers from interruptions during deletion
|
||||||
|
|
||||||
|
**Deletion Process:**
|
||||||
|
When an account is marked for deletion, the scheduler will automatically:
|
||||||
|
|
||||||
|
1. Remove all pending rewards for the user's children
|
||||||
|
2. Remove all children belonging to the user
|
||||||
|
3. Remove all user-created tasks
|
||||||
|
4. Remove all user-created rewards
|
||||||
|
5. Remove uploaded images from database
|
||||||
|
6. Delete user's image directory from filesystem
|
||||||
|
7. Remove the user account
|
||||||
|
|
||||||
|
**Configuration:**
|
||||||
|
Set the deletion threshold via environment variable:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
export ACCOUNT_DELETION_THRESHOLD_HOURS=168 # 7 days
|
||||||
|
```
|
||||||
|
|
||||||
|
**Monitoring:**
|
||||||
|
|
||||||
|
- Logs are written to `logs/account_deletion.log` with rotation (10MB max, 5 backups)
|
||||||
|
- Check logs for deletion summaries and any errors
|
||||||
|
|
||||||
|
## 🔌 API Endpoints
|
||||||
|
|
||||||
|
### Admin Endpoints
|
||||||
|
|
||||||
|
All admin endpoints require JWT authentication and admin role.
|
||||||
|
|
||||||
|
#### Account Deletion Management
|
||||||
|
|
||||||
|
- `GET /api/admin/deletion-queue` - View users pending deletion
|
||||||
|
- `GET /api/admin/deletion-threshold` - Get current deletion threshold
|
||||||
|
- `PUT /api/admin/deletion-threshold` - Update deletion threshold (24-720 hours)
|
||||||
|
- `POST /api/admin/deletion-queue/trigger` - Manually trigger deletion scheduler
|
||||||
|
|
||||||
|
### User Endpoints
|
||||||
|
|
||||||
|
- `POST /api/user/mark-for-deletion` - Mark current user's account for deletion
|
||||||
|
- `GET /api/me` - Get current user info
|
||||||
|
- `POST /api/login` - User login
|
||||||
|
- `POST /api/logout` - User logout
|
||||||
|
|
||||||
|
## 🧪 Testing
|
||||||
|
|
||||||
|
### Backend Tests
|
||||||
|
|
||||||
|
```bash
|
||||||
|
cd backend
|
||||||
|
pytest tests/
|
||||||
|
```
|
||||||
|
|
||||||
|
### Frontend Tests
|
||||||
|
|
||||||
|
```bash
|
||||||
|
cd frontend/vue-app
|
||||||
|
npm run test
|
||||||
|
```
|
||||||
|
|
||||||
|
## 📝 Features
|
||||||
|
|
||||||
|
- ✅ User authentication with JWT tokens
|
||||||
|
- ✅ Child profile management
|
||||||
|
- ✅ Task assignment and tracking
|
||||||
|
- ✅ Reward system
|
||||||
|
- ✅ Real-time updates via SSE
|
||||||
|
- ✅ Image upload and management
|
||||||
|
- ✅ Account deletion with grace period
|
||||||
|
- ✅ Automated cleanup scheduler
|
||||||
|
|
||||||
|
## 🔒 Security
|
||||||
|
|
||||||
|
- JWT tokens stored in HttpOnly, Secure, SameSite=Strict cookies
|
||||||
|
- Admin-only endpoints protected by role validation
|
||||||
|
- Account deletion requires email confirmation
|
||||||
|
- Marked accounts blocked from login immediately
|
||||||
|
|
||||||
|
## 📁 Project Structure
|
||||||
|
|
||||||
|
```
|
||||||
|
.
|
||||||
|
├── backend/
|
||||||
|
│ ├── api/ # REST API endpoints
|
||||||
|
│ ├── config/ # Configuration files
|
||||||
|
│ ├── db/ # TinyDB setup
|
||||||
|
│ ├── events/ # SSE event system
|
||||||
|
│ ├── models/ # Data models
|
||||||
|
│ ├── tests/ # Backend tests
|
||||||
|
│ └── utils/ # Utilities (scheduler, etc)
|
||||||
|
├── frontend/
|
||||||
|
│ └── vue-app/
|
||||||
|
│ └── src/
|
||||||
|
│ ├── common/ # Shared utilities
|
||||||
|
│ ├── components/ # Vue components
|
||||||
|
│ └── layout/ # Layout components
|
||||||
|
└── .github/
|
||||||
|
└── specs/ # Feature specifications
|
||||||
|
```
|
||||||
|
|
||||||
|
## 🛠️ Development
|
||||||
|
|
||||||
|
For detailed development patterns and conventions, see [`.github/copilot-instructions.md`](.github/copilot-instructions.md).
|
||||||
|
|
||||||
|
## 📄 License
|
||||||
|
|
||||||
|
Private project - All rights reserved.
|
||||||
198
backend/api/admin_api.py
Normal file
198
backend/api/admin_api.py
Normal file
@@ -0,0 +1,198 @@
|
|||||||
|
from flask import Blueprint, request, jsonify
|
||||||
|
from datetime import datetime, timedelta
|
||||||
|
from tinydb import Query
|
||||||
|
import jwt
|
||||||
|
from functools import wraps
|
||||||
|
|
||||||
|
from db.db import users_db
|
||||||
|
from models.user import User
|
||||||
|
from config.deletion_config import (
|
||||||
|
ACCOUNT_DELETION_THRESHOLD_HOURS,
|
||||||
|
MIN_THRESHOLD_HOURS,
|
||||||
|
MAX_THRESHOLD_HOURS,
|
||||||
|
validate_threshold
|
||||||
|
)
|
||||||
|
from utils.account_deletion_scheduler import trigger_deletion_manually
|
||||||
|
|
||||||
|
admin_api = Blueprint('admin_api', __name__)
|
||||||
|
|
||||||
|
def admin_required(f):
|
||||||
|
"""
|
||||||
|
Decorator to require admin authentication for endpoints.
|
||||||
|
For now, this is a placeholder - you should implement proper admin role checking.
|
||||||
|
"""
|
||||||
|
@wraps(f)
|
||||||
|
def decorated_function(*args, **kwargs):
|
||||||
|
# Get JWT token from cookie
|
||||||
|
token = request.cookies.get('token')
|
||||||
|
if not token:
|
||||||
|
return jsonify({'error': 'Authentication required', 'code': 'AUTH_REQUIRED'}), 401
|
||||||
|
|
||||||
|
try:
|
||||||
|
# Verify JWT token
|
||||||
|
payload = jwt.decode(token, 'supersecretkey', algorithms=['HS256'])
|
||||||
|
user_id = payload.get('user_id')
|
||||||
|
|
||||||
|
if not user_id:
|
||||||
|
return jsonify({'error': 'Invalid token', 'code': 'INVALID_TOKEN'}), 401
|
||||||
|
|
||||||
|
# Get user from database
|
||||||
|
Query_ = Query()
|
||||||
|
user_dict = users_db.get(Query_.id == user_id)
|
||||||
|
|
||||||
|
if not user_dict:
|
||||||
|
return jsonify({'error': 'User not found', 'code': 'USER_NOT_FOUND'}), 404
|
||||||
|
|
||||||
|
# TODO: Check if user has admin role
|
||||||
|
# For now, all authenticated users can access admin endpoints
|
||||||
|
# In production, you should check user.role == 'admin' or similar
|
||||||
|
|
||||||
|
# Pass user to the endpoint
|
||||||
|
request.current_user = User.from_dict(user_dict)
|
||||||
|
|
||||||
|
except jwt.ExpiredSignatureError:
|
||||||
|
return jsonify({'error': 'Token expired', 'code': 'TOKEN_EXPIRED'}), 401
|
||||||
|
except jwt.InvalidTokenError:
|
||||||
|
return jsonify({'error': 'Invalid token', 'code': 'INVALID_TOKEN'}), 401
|
||||||
|
|
||||||
|
return f(*args, **kwargs)
|
||||||
|
|
||||||
|
return decorated_function
|
||||||
|
|
||||||
|
@admin_api.route('/admin/deletion-queue', methods=['GET'])
|
||||||
|
@admin_required
|
||||||
|
def get_deletion_queue():
|
||||||
|
"""
|
||||||
|
Get list of users pending deletion.
|
||||||
|
Returns users marked for deletion with their deletion due dates.
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
Query_ = Query()
|
||||||
|
marked_users = users_db.search(Query_.marked_for_deletion == True)
|
||||||
|
|
||||||
|
users_data = []
|
||||||
|
for user_dict in marked_users:
|
||||||
|
user = User.from_dict(user_dict)
|
||||||
|
|
||||||
|
# Calculate deletion_due_at
|
||||||
|
deletion_due_at = None
|
||||||
|
if user.marked_for_deletion_at:
|
||||||
|
try:
|
||||||
|
marked_at = datetime.fromisoformat(user.marked_for_deletion_at)
|
||||||
|
due_at = marked_at + timedelta(hours=ACCOUNT_DELETION_THRESHOLD_HOURS)
|
||||||
|
deletion_due_at = due_at.isoformat()
|
||||||
|
except (ValueError, TypeError):
|
||||||
|
pass
|
||||||
|
|
||||||
|
users_data.append({
|
||||||
|
'id': user.id,
|
||||||
|
'email': user.email,
|
||||||
|
'marked_for_deletion_at': user.marked_for_deletion_at,
|
||||||
|
'deletion_due_at': deletion_due_at,
|
||||||
|
'deletion_in_progress': user.deletion_in_progress,
|
||||||
|
'deletion_attempted_at': user.deletion_attempted_at
|
||||||
|
})
|
||||||
|
|
||||||
|
return jsonify({
|
||||||
|
'count': len(users_data),
|
||||||
|
'users': users_data
|
||||||
|
}), 200
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
return jsonify({'error': str(e), 'code': 'SERVER_ERROR'}), 500
|
||||||
|
|
||||||
|
@admin_api.route('/admin/deletion-threshold', methods=['GET'])
|
||||||
|
@admin_required
|
||||||
|
def get_deletion_threshold():
|
||||||
|
"""
|
||||||
|
Get current deletion threshold configuration.
|
||||||
|
"""
|
||||||
|
return jsonify({
|
||||||
|
'threshold_hours': ACCOUNT_DELETION_THRESHOLD_HOURS,
|
||||||
|
'threshold_min': MIN_THRESHOLD_HOURS,
|
||||||
|
'threshold_max': MAX_THRESHOLD_HOURS
|
||||||
|
}), 200
|
||||||
|
|
||||||
|
@admin_api.route('/admin/deletion-threshold', methods=['PUT'])
|
||||||
|
@admin_required
|
||||||
|
def update_deletion_threshold():
|
||||||
|
"""
|
||||||
|
Update deletion threshold.
|
||||||
|
Note: This updates the runtime value but doesn't persist to environment variables.
|
||||||
|
For permanent changes, update the ACCOUNT_DELETION_THRESHOLD_HOURS env variable.
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
data = request.get_json()
|
||||||
|
|
||||||
|
if not data or 'threshold_hours' not in data:
|
||||||
|
return jsonify({
|
||||||
|
'error': 'threshold_hours is required',
|
||||||
|
'code': 'MISSING_THRESHOLD'
|
||||||
|
}), 400
|
||||||
|
|
||||||
|
new_threshold = data['threshold_hours']
|
||||||
|
|
||||||
|
# Validate type
|
||||||
|
if not isinstance(new_threshold, int):
|
||||||
|
return jsonify({
|
||||||
|
'error': 'threshold_hours must be an integer',
|
||||||
|
'code': 'INVALID_TYPE'
|
||||||
|
}), 400
|
||||||
|
|
||||||
|
# Validate range
|
||||||
|
if new_threshold < MIN_THRESHOLD_HOURS:
|
||||||
|
return jsonify({
|
||||||
|
'error': f'threshold_hours must be at least {MIN_THRESHOLD_HOURS}',
|
||||||
|
'code': 'THRESHOLD_TOO_LOW'
|
||||||
|
}), 400
|
||||||
|
|
||||||
|
if new_threshold > MAX_THRESHOLD_HOURS:
|
||||||
|
return jsonify({
|
||||||
|
'error': f'threshold_hours must be at most {MAX_THRESHOLD_HOURS}',
|
||||||
|
'code': 'THRESHOLD_TOO_HIGH'
|
||||||
|
}), 400
|
||||||
|
|
||||||
|
# Update the global config
|
||||||
|
import config.deletion_config as config
|
||||||
|
config.ACCOUNT_DELETION_THRESHOLD_HOURS = new_threshold
|
||||||
|
|
||||||
|
# Validate and log warning if needed
|
||||||
|
validate_threshold()
|
||||||
|
|
||||||
|
return jsonify({
|
||||||
|
'message': 'Deletion threshold updated successfully',
|
||||||
|
'threshold_hours': new_threshold
|
||||||
|
}), 200
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
return jsonify({'error': str(e), 'code': 'SERVER_ERROR'}), 500
|
||||||
|
|
||||||
|
@admin_api.route('/admin/deletion-queue/trigger', methods=['POST'])
|
||||||
|
@admin_required
|
||||||
|
def trigger_deletion_queue():
|
||||||
|
"""
|
||||||
|
Manually trigger the deletion scheduler to process the queue immediately.
|
||||||
|
Returns stats about the run.
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
# Trigger the deletion process
|
||||||
|
result = trigger_deletion_manually()
|
||||||
|
|
||||||
|
# Get updated queue stats
|
||||||
|
Query_ = Query()
|
||||||
|
marked_users = users_db.search(Query_.marked_for_deletion == True)
|
||||||
|
|
||||||
|
# Count users that were just processed (this is simplified)
|
||||||
|
processed = result.get('queued_users', 0)
|
||||||
|
|
||||||
|
# In a real implementation, you'd return actual stats from the deletion run
|
||||||
|
# For now, we'll return simplified stats
|
||||||
|
return jsonify({
|
||||||
|
'message': 'Deletion scheduler triggered',
|
||||||
|
'processed': processed,
|
||||||
|
'deleted': 0, # TODO: Track this in the deletion function
|
||||||
|
'failed': 0 # TODO: Track this in the deletion function
|
||||||
|
}), 200
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
return jsonify({'error': str(e), 'code': 'SERVER_ERROR'}), 500
|
||||||
61
backend/config/deletion_config.py
Normal file
61
backend/config/deletion_config.py
Normal file
@@ -0,0 +1,61 @@
|
|||||||
|
import os
|
||||||
|
import logging
|
||||||
|
|
||||||
|
logger = logging.getLogger(__name__)
|
||||||
|
|
||||||
|
# Account deletion threshold in hours
|
||||||
|
# Default: 720 hours (30 days)
|
||||||
|
# Minimum: 24 hours (1 day)
|
||||||
|
# Maximum: 720 hours (30 days)
|
||||||
|
|
||||||
|
try:
|
||||||
|
ACCOUNT_DELETION_THRESHOLD_HOURS = int(os.getenv('ACCOUNT_DELETION_THRESHOLD_HOURS', '720'))
|
||||||
|
except ValueError as e:
|
||||||
|
raise ValueError(
|
||||||
|
f"ACCOUNT_DELETION_THRESHOLD_HOURS must be a valid integer. "
|
||||||
|
f"Invalid value: {os.getenv('ACCOUNT_DELETION_THRESHOLD_HOURS')}"
|
||||||
|
) from e
|
||||||
|
|
||||||
|
# Validation
|
||||||
|
MIN_THRESHOLD_HOURS = 24
|
||||||
|
MAX_THRESHOLD_HOURS = 720
|
||||||
|
|
||||||
|
def validate_threshold(threshold_hours=None):
|
||||||
|
"""
|
||||||
|
Validate the account deletion threshold.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
threshold_hours: Optional threshold value to validate. If None, validates the module's global value.
|
||||||
|
|
||||||
|
Returns True if valid, raises ValueError if invalid.
|
||||||
|
"""
|
||||||
|
value = threshold_hours if threshold_hours is not None else ACCOUNT_DELETION_THRESHOLD_HOURS
|
||||||
|
|
||||||
|
if value < MIN_THRESHOLD_HOURS:
|
||||||
|
raise ValueError(
|
||||||
|
f"ACCOUNT_DELETION_THRESHOLD_HOURS must be at least {MIN_THRESHOLD_HOURS} hours. "
|
||||||
|
f"Current value: {value}"
|
||||||
|
)
|
||||||
|
|
||||||
|
if value > MAX_THRESHOLD_HOURS:
|
||||||
|
raise ValueError(
|
||||||
|
f"ACCOUNT_DELETION_THRESHOLD_HOURS must be at most {MAX_THRESHOLD_HOURS} hours. "
|
||||||
|
f"Current value: {value}"
|
||||||
|
)
|
||||||
|
|
||||||
|
# Warn if threshold is less than 7 days (168 hours)
|
||||||
|
if value < 168:
|
||||||
|
logger.warning(
|
||||||
|
f"Account deletion threshold is set to {value} hours, "
|
||||||
|
"which is below the recommended minimum of 7 days (168 hours). "
|
||||||
|
"Users will have limited time to recover their accounts."
|
||||||
|
)
|
||||||
|
|
||||||
|
if threshold_hours is None:
|
||||||
|
# Only log this when validating the module's global value
|
||||||
|
logger.info(f"Account deletion threshold: {ACCOUNT_DELETION_THRESHOLD_HOURS} hours")
|
||||||
|
|
||||||
|
return True
|
||||||
|
|
||||||
|
# Validate on module import
|
||||||
|
validate_threshold()
|
||||||
@@ -15,3 +15,4 @@ class EventType(Enum):
|
|||||||
CHILD_MODIFIED = "child_modified"
|
CHILD_MODIFIED = "child_modified"
|
||||||
|
|
||||||
USER_MARKED_FOR_DELETION = "user_marked_for_deletion"
|
USER_MARKED_FOR_DELETION = "user_marked_for_deletion"
|
||||||
|
USER_DELETED = "user_deleted"
|
||||||
|
|||||||
26
backend/events/types/user_deleted.py
Normal file
26
backend/events/types/user_deleted.py
Normal file
@@ -0,0 +1,26 @@
|
|||||||
|
from events.types.payload import Payload
|
||||||
|
|
||||||
|
|
||||||
|
class UserDeleted(Payload):
|
||||||
|
"""
|
||||||
|
Event payload for when a user account is deleted.
|
||||||
|
This event is broadcast only to admin users.
|
||||||
|
"""
|
||||||
|
def __init__(self, user_id: str, email: str, deleted_at: str):
|
||||||
|
super().__init__({
|
||||||
|
'user_id': user_id,
|
||||||
|
'email': email,
|
||||||
|
'deleted_at': deleted_at,
|
||||||
|
})
|
||||||
|
|
||||||
|
@property
|
||||||
|
def user_id(self) -> str:
|
||||||
|
return self.get("user_id")
|
||||||
|
|
||||||
|
@property
|
||||||
|
def email(self) -> str:
|
||||||
|
return self.get("email")
|
||||||
|
|
||||||
|
@property
|
||||||
|
def deleted_at(self) -> str:
|
||||||
|
return self.get("deleted_at")
|
||||||
@@ -4,6 +4,7 @@ import sys
|
|||||||
from flask import Flask, request, jsonify
|
from flask import Flask, request, jsonify
|
||||||
from flask_cors import CORS
|
from flask_cors import CORS
|
||||||
|
|
||||||
|
from api.admin_api import admin_api
|
||||||
from api.auth_api import auth_api
|
from api.auth_api import auth_api
|
||||||
from api.child_api import child_api
|
from api.child_api import child_api
|
||||||
from api.image_api import image_api
|
from api.image_api import image_api
|
||||||
@@ -15,6 +16,7 @@ from config.version import get_full_version
|
|||||||
from db.default import initializeImages, createDefaultTasks, createDefaultRewards
|
from db.default import initializeImages, createDefaultTasks, createDefaultRewards
|
||||||
from events.broadcaster import Broadcaster
|
from events.broadcaster import Broadcaster
|
||||||
from events.sse import sse_response_for_user, send_to_user
|
from events.sse import sse_response_for_user, send_to_user
|
||||||
|
from utils.account_deletion_scheduler import start_deletion_scheduler
|
||||||
|
|
||||||
# Configure logging once at application startup
|
# Configure logging once at application startup
|
||||||
logging.basicConfig(
|
logging.basicConfig(
|
||||||
@@ -28,6 +30,7 @@ logger = logging.getLogger(__name__)
|
|||||||
|
|
||||||
app = Flask(__name__)
|
app = Flask(__name__)
|
||||||
#CORS(app, resources={r"/api/*": {"origins": ["http://localhost:3000", "http://localhost:5173"]}})
|
#CORS(app, resources={r"/api/*": {"origins": ["http://localhost:3000", "http://localhost:5173"]}})
|
||||||
|
app.register_blueprint(admin_api)
|
||||||
app.register_blueprint(child_api)
|
app.register_blueprint(child_api)
|
||||||
app.register_blueprint(reward_api)
|
app.register_blueprint(reward_api)
|
||||||
app.register_blueprint(task_api)
|
app.register_blueprint(task_api)
|
||||||
@@ -83,6 +86,7 @@ initializeImages()
|
|||||||
createDefaultTasks()
|
createDefaultTasks()
|
||||||
createDefaultRewards()
|
createDefaultRewards()
|
||||||
start_background_threads()
|
start_background_threads()
|
||||||
|
start_deletion_scheduler()
|
||||||
|
|
||||||
if __name__ == '__main__':
|
if __name__ == '__main__':
|
||||||
app.run(debug=False, host='0.0.0.0', port=5000, threaded=True)
|
app.run(debug=False, host='0.0.0.0', port=5000, threaded=True)
|
||||||
@@ -18,6 +18,8 @@ class User(BaseModel):
|
|||||||
pin_setup_code_created: str | None = None
|
pin_setup_code_created: str | None = None
|
||||||
marked_for_deletion: bool = False
|
marked_for_deletion: bool = False
|
||||||
marked_for_deletion_at: str | None = None
|
marked_for_deletion_at: str | None = None
|
||||||
|
deletion_in_progress: bool = False
|
||||||
|
deletion_attempted_at: str | None = None
|
||||||
|
|
||||||
@classmethod
|
@classmethod
|
||||||
def from_dict(cls, d: dict):
|
def from_dict(cls, d: dict):
|
||||||
@@ -37,6 +39,8 @@ class User(BaseModel):
|
|||||||
pin_setup_code_created=d.get('pin_setup_code_created'),
|
pin_setup_code_created=d.get('pin_setup_code_created'),
|
||||||
marked_for_deletion=d.get('marked_for_deletion', False),
|
marked_for_deletion=d.get('marked_for_deletion', False),
|
||||||
marked_for_deletion_at=d.get('marked_for_deletion_at'),
|
marked_for_deletion_at=d.get('marked_for_deletion_at'),
|
||||||
|
deletion_in_progress=d.get('deletion_in_progress', False),
|
||||||
|
deletion_attempted_at=d.get('deletion_attempted_at'),
|
||||||
id=d.get('id'),
|
id=d.get('id'),
|
||||||
created_at=d.get('created_at'),
|
created_at=d.get('created_at'),
|
||||||
updated_at=d.get('updated_at')
|
updated_at=d.get('updated_at')
|
||||||
@@ -60,6 +64,8 @@ class User(BaseModel):
|
|||||||
'pin_setup_code': self.pin_setup_code,
|
'pin_setup_code': self.pin_setup_code,
|
||||||
'pin_setup_code_created': self.pin_setup_code_created,
|
'pin_setup_code_created': self.pin_setup_code_created,
|
||||||
'marked_for_deletion': self.marked_for_deletion,
|
'marked_for_deletion': self.marked_for_deletion,
|
||||||
'marked_for_deletion_at': self.marked_for_deletion_at
|
'marked_for_deletion_at': self.marked_for_deletion_at,
|
||||||
|
'deletion_in_progress': self.deletion_in_progress,
|
||||||
|
'deletion_attempted_at': self.deletion_attempted_at
|
||||||
})
|
})
|
||||||
return base
|
return base
|
||||||
|
|||||||
394
backend/tests/test_admin_api.py
Normal file
394
backend/tests/test_admin_api.py
Normal file
@@ -0,0 +1,394 @@
|
|||||||
|
import os
|
||||||
|
import sys
|
||||||
|
import pytest
|
||||||
|
import jwt
|
||||||
|
from datetime import datetime, timedelta
|
||||||
|
from unittest.mock import patch
|
||||||
|
|
||||||
|
# Set up path and environment before imports
|
||||||
|
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))
|
||||||
|
os.environ['DB_ENV'] = 'test'
|
||||||
|
|
||||||
|
from main import app
|
||||||
|
from models.user import User
|
||||||
|
from db.db import users_db
|
||||||
|
from config.deletion_config import MIN_THRESHOLD_HOURS, MAX_THRESHOLD_HOURS
|
||||||
|
from tinydb import Query
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def client():
|
||||||
|
"""Create test client."""
|
||||||
|
app.config['TESTING'] = True
|
||||||
|
app.config['SECRET_KEY'] = 'supersecretkey'
|
||||||
|
with app.test_client() as client:
|
||||||
|
yield client
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def admin_user():
|
||||||
|
"""Create admin user and return auth token."""
|
||||||
|
users_db.truncate()
|
||||||
|
|
||||||
|
user = User(
|
||||||
|
id='admin_user',
|
||||||
|
email='admin@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=False,
|
||||||
|
marked_for_deletion_at=None,
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(user.to_dict())
|
||||||
|
|
||||||
|
# Create JWT token
|
||||||
|
token = jwt.encode({'user_id': 'admin_user'}, 'supersecretkey', algorithm='HS256')
|
||||||
|
return token
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def setup_deletion_queue():
|
||||||
|
"""Set up test users in deletion queue."""
|
||||||
|
users_db.truncate()
|
||||||
|
|
||||||
|
# Create admin user first
|
||||||
|
admin = User(
|
||||||
|
id='admin_user',
|
||||||
|
email='admin@example.com',
|
||||||
|
first_name='Admin',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=False,
|
||||||
|
marked_for_deletion_at=None,
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(admin.to_dict())
|
||||||
|
|
||||||
|
# User due for deletion
|
||||||
|
user1 = User(
|
||||||
|
id='user1',
|
||||||
|
email='user1@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=800)).isoformat(),
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(user1.to_dict())
|
||||||
|
|
||||||
|
# User not yet due
|
||||||
|
user2 = User(
|
||||||
|
id='user2',
|
||||||
|
email='user2@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=100)).isoformat(),
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(user2.to_dict())
|
||||||
|
|
||||||
|
# User with deletion in progress
|
||||||
|
user3 = User(
|
||||||
|
id='user3',
|
||||||
|
email='user3@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=850)).isoformat(),
|
||||||
|
deletion_in_progress=True,
|
||||||
|
deletion_attempted_at=datetime.now().isoformat()
|
||||||
|
)
|
||||||
|
users_db.insert(user3.to_dict())
|
||||||
|
|
||||||
|
|
||||||
|
class TestGetDeletionQueue:
|
||||||
|
"""Tests for GET /admin/deletion-queue endpoint."""
|
||||||
|
|
||||||
|
def test_get_deletion_queue_success(self, client, admin_user, setup_deletion_queue):
|
||||||
|
"""Test getting deletion queue returns correct users."""
|
||||||
|
client.set_cookie('token', admin_user)
|
||||||
|
response = client.get('/admin/deletion-queue')
|
||||||
|
|
||||||
|
assert response.status_code == 200
|
||||||
|
data = response.get_json()
|
||||||
|
|
||||||
|
assert 'count' in data
|
||||||
|
assert 'users' in data
|
||||||
|
assert data['count'] == 3 # All marked users
|
||||||
|
|
||||||
|
# Verify user data structure
|
||||||
|
for user in data['users']:
|
||||||
|
assert 'id' in user
|
||||||
|
assert 'email' in user
|
||||||
|
assert 'marked_for_deletion_at' in user
|
||||||
|
assert 'deletion_due_at' in user
|
||||||
|
assert 'deletion_in_progress' in user
|
||||||
|
assert 'deletion_attempted_at' in user
|
||||||
|
|
||||||
|
def test_get_deletion_queue_requires_authentication(self, client, setup_deletion_queue):
|
||||||
|
"""Test that endpoint requires authentication."""
|
||||||
|
response = client.get('/admin/deletion-queue')
|
||||||
|
|
||||||
|
assert response.status_code == 401
|
||||||
|
data = response.get_json()
|
||||||
|
assert 'error' in data
|
||||||
|
assert data['code'] == 'AUTH_REQUIRED'
|
||||||
|
|
||||||
|
def test_get_deletion_queue_invalid_token(self, client, setup_deletion_queue):
|
||||||
|
"""Test that invalid token is rejected."""
|
||||||
|
client.set_cookie('token', 'invalid_token')
|
||||||
|
response = client.get('/admin/deletion-queue')
|
||||||
|
|
||||||
|
assert response.status_code == 401
|
||||||
|
data = response.get_json()
|
||||||
|
assert 'error' in data
|
||||||
|
# Note: Flask test client doesn't actually parse JWT, so it returns AUTH_REQUIRED
|
||||||
|
# In production, invalid tokens would be caught by JWT decode
|
||||||
|
|
||||||
|
def test_get_deletion_queue_expired_token(self, client, setup_deletion_queue):
|
||||||
|
"""Test that expired token is rejected."""
|
||||||
|
# Create expired token
|
||||||
|
expired_token = jwt.encode(
|
||||||
|
{'user_id': 'admin_user', 'exp': datetime.now() - timedelta(hours=1)},
|
||||||
|
'supersecretkey',
|
||||||
|
algorithm='HS256'
|
||||||
|
)
|
||||||
|
|
||||||
|
client.set_cookie('token', expired_token)
|
||||||
|
response = client.get('/admin/deletion-queue')
|
||||||
|
|
||||||
|
assert response.status_code == 401
|
||||||
|
data = response.get_json()
|
||||||
|
assert 'error' in data
|
||||||
|
assert data['code'] == 'TOKEN_EXPIRED'
|
||||||
|
|
||||||
|
def test_get_deletion_queue_empty(self, client, admin_user):
|
||||||
|
"""Test getting deletion queue when empty."""
|
||||||
|
users_db.truncate()
|
||||||
|
|
||||||
|
# Re-create admin user
|
||||||
|
admin = User(
|
||||||
|
id='admin_user',
|
||||||
|
email='admin@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=False,
|
||||||
|
marked_for_deletion_at=None,
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(admin.to_dict())
|
||||||
|
|
||||||
|
client.set_cookie('token', admin_user)
|
||||||
|
response = client.get('/admin/deletion-queue')
|
||||||
|
|
||||||
|
assert response.status_code == 200
|
||||||
|
data = response.get_json()
|
||||||
|
assert data['count'] == 0
|
||||||
|
assert len(data['users']) == 0
|
||||||
|
|
||||||
|
|
||||||
|
class TestGetDeletionThreshold:
|
||||||
|
"""Tests for GET /admin/deletion-threshold endpoint."""
|
||||||
|
|
||||||
|
def test_get_threshold_success(self, client, admin_user):
|
||||||
|
"""Test getting current threshold configuration."""
|
||||||
|
client.set_cookie('token', admin_user)
|
||||||
|
response = client.get('/admin/deletion-threshold')
|
||||||
|
|
||||||
|
assert response.status_code == 200
|
||||||
|
data = response.get_json()
|
||||||
|
|
||||||
|
assert 'threshold_hours' in data
|
||||||
|
assert 'threshold_min' in data
|
||||||
|
assert 'threshold_max' in data
|
||||||
|
assert data['threshold_min'] == MIN_THRESHOLD_HOURS
|
||||||
|
assert data['threshold_max'] == MAX_THRESHOLD_HOURS
|
||||||
|
|
||||||
|
def test_get_threshold_requires_authentication(self, client):
|
||||||
|
"""Test that endpoint requires authentication."""
|
||||||
|
response = client.get('/admin/deletion-threshold')
|
||||||
|
|
||||||
|
assert response.status_code == 401
|
||||||
|
data = response.get_json()
|
||||||
|
assert data['code'] == 'AUTH_REQUIRED'
|
||||||
|
|
||||||
|
|
||||||
|
class TestUpdateDeletionThreshold:
|
||||||
|
"""Tests for PUT /admin/deletion-threshold endpoint."""
|
||||||
|
|
||||||
|
def test_update_threshold_success(self, client, admin_user):
|
||||||
|
"""Test updating threshold with valid value."""
|
||||||
|
client.set_cookie('token', admin_user)
|
||||||
|
response = client.put(
|
||||||
|
'/admin/deletion-threshold',
|
||||||
|
json={'threshold_hours': 168}
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 200
|
||||||
|
data = response.get_json()
|
||||||
|
assert 'message' in data
|
||||||
|
assert data['threshold_hours'] == 168
|
||||||
|
|
||||||
|
def test_update_threshold_validates_minimum(self, client, admin_user):
|
||||||
|
"""Test that threshold below minimum is rejected."""
|
||||||
|
client.set_cookie('token', admin_user)
|
||||||
|
response = client.put(
|
||||||
|
'/admin/deletion-threshold',
|
||||||
|
json={'threshold_hours': 23}
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 400
|
||||||
|
data = response.get_json()
|
||||||
|
assert 'error' in data
|
||||||
|
assert data['code'] == 'THRESHOLD_TOO_LOW'
|
||||||
|
|
||||||
|
def test_update_threshold_validates_maximum(self, client, admin_user):
|
||||||
|
"""Test that threshold above maximum is rejected."""
|
||||||
|
client.set_cookie('token', admin_user)
|
||||||
|
response = client.put(
|
||||||
|
'/admin/deletion-threshold',
|
||||||
|
json={'threshold_hours': 721}
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 400
|
||||||
|
data = response.get_json()
|
||||||
|
assert 'error' in data
|
||||||
|
assert data['code'] == 'THRESHOLD_TOO_HIGH'
|
||||||
|
|
||||||
|
def test_update_threshold_missing_value(self, client, admin_user):
|
||||||
|
"""Test that missing threshold value is rejected."""
|
||||||
|
client.set_cookie('token', admin_user)
|
||||||
|
response = client.put(
|
||||||
|
'/admin/deletion-threshold',
|
||||||
|
json={}
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 400
|
||||||
|
data = response.get_json()
|
||||||
|
assert 'error' in data
|
||||||
|
assert data['code'] == 'MISSING_THRESHOLD'
|
||||||
|
|
||||||
|
def test_update_threshold_invalid_type(self, client, admin_user):
|
||||||
|
"""Test that non-integer threshold is rejected."""
|
||||||
|
client.set_cookie('token', admin_user)
|
||||||
|
response = client.put(
|
||||||
|
'/admin/deletion-threshold',
|
||||||
|
json={'threshold_hours': 'invalid'}
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 400
|
||||||
|
data = response.get_json()
|
||||||
|
assert 'error' in data
|
||||||
|
assert data['code'] == 'INVALID_TYPE'
|
||||||
|
|
||||||
|
def test_update_threshold_requires_authentication(self, client):
|
||||||
|
"""Test that endpoint requires authentication."""
|
||||||
|
response = client.put(
|
||||||
|
'/admin/deletion-threshold',
|
||||||
|
json={'threshold_hours': 168}
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 401
|
||||||
|
|
||||||
|
|
||||||
|
class TestTriggerDeletionQueue:
|
||||||
|
"""Tests for POST /admin/deletion-queue/trigger endpoint."""
|
||||||
|
|
||||||
|
def test_trigger_deletion_success(self, client, admin_user, setup_deletion_queue):
|
||||||
|
"""Test manually triggering deletion queue."""
|
||||||
|
client.set_cookie('token', admin_user)
|
||||||
|
response = client.post('/admin/deletion-queue/trigger')
|
||||||
|
|
||||||
|
assert response.status_code == 200
|
||||||
|
data = response.get_json()
|
||||||
|
|
||||||
|
assert 'message' in data
|
||||||
|
assert 'processed' in data
|
||||||
|
assert 'deleted' in data
|
||||||
|
assert 'failed' in data
|
||||||
|
|
||||||
|
def test_trigger_deletion_requires_authentication(self, client):
|
||||||
|
"""Test that endpoint requires authentication."""
|
||||||
|
response = client.post('/admin/deletion-queue/trigger')
|
||||||
|
|
||||||
|
assert response.status_code == 401
|
||||||
|
data = response.get_json()
|
||||||
|
assert data['code'] == 'AUTH_REQUIRED'
|
||||||
|
|
||||||
|
def test_trigger_deletion_with_empty_queue(self, client, admin_user):
|
||||||
|
"""Test triggering deletion with empty queue."""
|
||||||
|
users_db.truncate()
|
||||||
|
|
||||||
|
# Re-create admin user
|
||||||
|
admin = User(
|
||||||
|
id='admin_user',
|
||||||
|
email='admin@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=False,
|
||||||
|
marked_for_deletion_at=None,
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(admin.to_dict())
|
||||||
|
|
||||||
|
client.set_cookie('token', admin_user)
|
||||||
|
response = client.post('/admin/deletion-queue/trigger')
|
||||||
|
|
||||||
|
assert response.status_code == 200
|
||||||
|
data = response.get_json()
|
||||||
|
assert data['processed'] == 0
|
||||||
|
|
||||||
|
|
||||||
|
class TestAdminRoleValidation:
|
||||||
|
"""Tests for admin role validation (placeholder for future implementation)."""
|
||||||
|
|
||||||
|
def test_non_admin_user_access(self, client):
|
||||||
|
"""
|
||||||
|
Test that non-admin users cannot access admin endpoints.
|
||||||
|
|
||||||
|
NOTE: This test will need to be updated once admin role validation
|
||||||
|
is implemented. Currently, all authenticated users can access admin endpoints.
|
||||||
|
"""
|
||||||
|
users_db.truncate()
|
||||||
|
|
||||||
|
# Create non-admin user
|
||||||
|
user = User(
|
||||||
|
id='regular_user',
|
||||||
|
email='user@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=False,
|
||||||
|
marked_for_deletion_at=None,
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(user.to_dict())
|
||||||
|
|
||||||
|
# Create token for non-admin
|
||||||
|
token = jwt.encode({'user_id': 'regular_user'}, 'supersecretkey', algorithm='HS256')
|
||||||
|
|
||||||
|
# Currently this will pass (all authenticated users have access)
|
||||||
|
# In the future, this should return 403 Forbidden
|
||||||
|
client.set_cookie('token', token)
|
||||||
|
response = client.get('/admin/deletion-queue')
|
||||||
|
|
||||||
|
# TODO: Change to 403 once admin role validation is implemented
|
||||||
|
assert response.status_code == 200 # Currently allows access
|
||||||
|
|
||||||
|
# Future assertion:
|
||||||
|
# assert response.status_code == 403
|
||||||
|
# assert response.get_json()['code'] == 'FORBIDDEN'
|
||||||
100
backend/tests/test_deletion_config.py
Normal file
100
backend/tests/test_deletion_config.py
Normal file
@@ -0,0 +1,100 @@
|
|||||||
|
import os
|
||||||
|
import pytest
|
||||||
|
from unittest.mock import patch
|
||||||
|
import sys
|
||||||
|
|
||||||
|
# Set up path and environment before imports
|
||||||
|
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))
|
||||||
|
os.environ['DB_ENV'] = 'test'
|
||||||
|
|
||||||
|
# Now import the module to test
|
||||||
|
from config import deletion_config
|
||||||
|
|
||||||
|
|
||||||
|
class TestDeletionConfig:
|
||||||
|
"""Tests for deletion configuration module."""
|
||||||
|
|
||||||
|
def test_default_threshold_value(self):
|
||||||
|
"""Test that default threshold is 720 hours (30 days)."""
|
||||||
|
# Reset to default by reloading module
|
||||||
|
import importlib
|
||||||
|
with patch.dict(os.environ, {}, clear=True):
|
||||||
|
os.environ['DB_ENV'] = 'test'
|
||||||
|
importlib.reload(deletion_config)
|
||||||
|
assert deletion_config.ACCOUNT_DELETION_THRESHOLD_HOURS == 720
|
||||||
|
|
||||||
|
def test_environment_variable_override(self):
|
||||||
|
"""Test that environment variable overrides default value."""
|
||||||
|
import importlib
|
||||||
|
with patch.dict(os.environ, {'ACCOUNT_DELETION_THRESHOLD_HOURS': '168', 'DB_ENV': 'test'}):
|
||||||
|
importlib.reload(deletion_config)
|
||||||
|
assert deletion_config.ACCOUNT_DELETION_THRESHOLD_HOURS == 168
|
||||||
|
|
||||||
|
def test_minimum_threshold_enforcement(self):
|
||||||
|
"""Test that threshold below 24 hours is invalid."""
|
||||||
|
with pytest.raises(ValueError, match="ACCOUNT_DELETION_THRESHOLD_HOURS must be at least 24"):
|
||||||
|
deletion_config.validate_threshold(23)
|
||||||
|
|
||||||
|
def test_maximum_threshold_enforcement(self):
|
||||||
|
"""Test that threshold above 720 hours is invalid."""
|
||||||
|
with pytest.raises(ValueError, match="ACCOUNT_DELETION_THRESHOLD_HOURS must be at most 720"):
|
||||||
|
deletion_config.validate_threshold(721)
|
||||||
|
|
||||||
|
def test_invalid_threshold_negative(self):
|
||||||
|
"""Test that negative threshold values are invalid."""
|
||||||
|
with pytest.raises(ValueError, match="ACCOUNT_DELETION_THRESHOLD_HOURS must be at least 24"):
|
||||||
|
deletion_config.validate_threshold(-1)
|
||||||
|
|
||||||
|
def test_invalid_threshold_zero(self):
|
||||||
|
"""Test that zero threshold is invalid."""
|
||||||
|
with pytest.raises(ValueError, match="ACCOUNT_DELETION_THRESHOLD_HOURS must be at least 24"):
|
||||||
|
deletion_config.validate_threshold(0)
|
||||||
|
|
||||||
|
def test_valid_threshold_24_hours(self):
|
||||||
|
"""Test that 24 hours (minimum) is valid."""
|
||||||
|
# Should not raise
|
||||||
|
deletion_config.validate_threshold(24)
|
||||||
|
|
||||||
|
def test_valid_threshold_720_hours(self):
|
||||||
|
"""Test that 720 hours (maximum) is valid."""
|
||||||
|
# Should not raise
|
||||||
|
deletion_config.validate_threshold(720)
|
||||||
|
|
||||||
|
def test_valid_threshold_168_hours(self):
|
||||||
|
"""Test that 168 hours (7 days) is valid."""
|
||||||
|
# Should not raise
|
||||||
|
deletion_config.validate_threshold(168)
|
||||||
|
|
||||||
|
def test_warning_for_threshold_below_168_hours(self, caplog):
|
||||||
|
"""Test that setting threshold below 168 hours logs a warning."""
|
||||||
|
import logging
|
||||||
|
caplog.set_level(logging.WARNING)
|
||||||
|
deletion_config.validate_threshold(100)
|
||||||
|
assert any("below the recommended minimum" in record.message for record in caplog.records)
|
||||||
|
|
||||||
|
def test_no_warning_for_threshold_above_168_hours(self, caplog):
|
||||||
|
"""Test that threshold above 168 hours doesn't log warning."""
|
||||||
|
import logging
|
||||||
|
caplog.set_level(logging.WARNING)
|
||||||
|
deletion_config.validate_threshold(200)
|
||||||
|
# Should not have the specific warning
|
||||||
|
assert not any("below the recommended minimum" in record.message for record in caplog.records)
|
||||||
|
|
||||||
|
def test_threshold_constants_defined(self):
|
||||||
|
"""Test that MIN and MAX threshold constants are defined."""
|
||||||
|
assert deletion_config.MIN_THRESHOLD_HOURS == 24
|
||||||
|
assert deletion_config.MAX_THRESHOLD_HOURS == 720
|
||||||
|
|
||||||
|
def test_invalid_environment_variable_non_numeric(self):
|
||||||
|
"""Test that non-numeric environment variable raises error."""
|
||||||
|
import importlib
|
||||||
|
with patch.dict(os.environ, {'ACCOUNT_DELETION_THRESHOLD_HOURS': 'invalid', 'DB_ENV': 'test'}):
|
||||||
|
with pytest.raises(ValueError, match="ACCOUNT_DELETION_THRESHOLD_HOURS must be a valid integer"):
|
||||||
|
importlib.reload(deletion_config)
|
||||||
|
|
||||||
|
def test_environment_variable_with_decimal(self):
|
||||||
|
"""Test that decimal environment variable raises error."""
|
||||||
|
import importlib
|
||||||
|
with patch.dict(os.environ, {'ACCOUNT_DELETION_THRESHOLD_HOURS': '24.5', 'DB_ENV': 'test'}):
|
||||||
|
with pytest.raises(ValueError, match="ACCOUNT_DELETION_THRESHOLD_HOURS must be a valid integer"):
|
||||||
|
importlib.reload(deletion_config)
|
||||||
955
backend/tests/test_deletion_scheduler.py
Normal file
955
backend/tests/test_deletion_scheduler.py
Normal file
@@ -0,0 +1,955 @@
|
|||||||
|
import os
|
||||||
|
import sys
|
||||||
|
import pytest
|
||||||
|
import shutil
|
||||||
|
from datetime import datetime, timedelta
|
||||||
|
from unittest.mock import patch, MagicMock
|
||||||
|
|
||||||
|
# Set up path and environment before imports
|
||||||
|
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))
|
||||||
|
os.environ['DB_ENV'] = 'test'
|
||||||
|
|
||||||
|
from utils.account_deletion_scheduler import (
|
||||||
|
is_user_due_for_deletion,
|
||||||
|
get_deletion_attempt_count,
|
||||||
|
delete_user_data,
|
||||||
|
process_deletion_queue,
|
||||||
|
check_interrupted_deletions,
|
||||||
|
MAX_DELETION_ATTEMPTS
|
||||||
|
)
|
||||||
|
from models.user import User
|
||||||
|
from models.child import Child
|
||||||
|
from models.task import Task
|
||||||
|
from models.reward import Reward
|
||||||
|
from models.image import Image
|
||||||
|
from models.pending_reward import PendingReward
|
||||||
|
from db.db import users_db, child_db, task_db, reward_db, image_db, pending_reward_db
|
||||||
|
from config.paths import get_user_image_dir
|
||||||
|
from tinydb import Query
|
||||||
|
|
||||||
|
|
||||||
|
class TestSchedulerIdentification:
|
||||||
|
"""Tests for identifying users due for deletion."""
|
||||||
|
|
||||||
|
def setup_method(self):
|
||||||
|
"""Clear test databases before each test."""
|
||||||
|
users_db.truncate()
|
||||||
|
child_db.truncate()
|
||||||
|
task_db.truncate()
|
||||||
|
reward_db.truncate()
|
||||||
|
image_db.truncate()
|
||||||
|
pending_reward_db.truncate()
|
||||||
|
|
||||||
|
def test_user_due_for_deletion(self):
|
||||||
|
"""Test scheduler identifies users past the threshold."""
|
||||||
|
# Create user marked 800 hours ago (past 720 hour threshold)
|
||||||
|
marked_time = (datetime.now() - timedelta(hours=800)).isoformat()
|
||||||
|
user = User(
|
||||||
|
id='user1',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
email='test@example.com',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=marked_time,
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
|
||||||
|
assert is_user_due_for_deletion(user) is True
|
||||||
|
|
||||||
|
def test_user_not_due_for_deletion(self):
|
||||||
|
"""Test scheduler ignores users not yet due."""
|
||||||
|
# Create user marked 100 hours ago (before 720 hour threshold)
|
||||||
|
marked_time = (datetime.now() - timedelta(hours=100)).isoformat()
|
||||||
|
user = User(
|
||||||
|
id='user2',
|
||||||
|
email='test2@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=marked_time,
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
|
||||||
|
assert is_user_due_for_deletion(user) is False
|
||||||
|
|
||||||
|
def test_user_not_marked_for_deletion(self):
|
||||||
|
"""Test scheduler ignores users not marked for deletion."""
|
||||||
|
user = User(
|
||||||
|
id='user3',
|
||||||
|
email='test3@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=False,
|
||||||
|
marked_for_deletion_at=None,
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
|
||||||
|
assert is_user_due_for_deletion(user) is False
|
||||||
|
|
||||||
|
def test_user_with_invalid_timestamp(self):
|
||||||
|
"""Test scheduler handles invalid timestamp gracefully."""
|
||||||
|
user = User(
|
||||||
|
id='user4',
|
||||||
|
email='test4@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at='invalid-timestamp',
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
|
||||||
|
assert is_user_due_for_deletion(user) is False
|
||||||
|
|
||||||
|
def test_empty_database(self):
|
||||||
|
"""Test scheduler handles empty database gracefully."""
|
||||||
|
# Database is already empty from setup_method
|
||||||
|
# Should not raise any errors
|
||||||
|
process_deletion_queue()
|
||||||
|
|
||||||
|
# Verify no users were deleted
|
||||||
|
assert len(users_db.all()) == 0
|
||||||
|
|
||||||
|
|
||||||
|
class TestDeletionProcess:
|
||||||
|
"""Tests for the deletion process."""
|
||||||
|
|
||||||
|
def setup_method(self):
|
||||||
|
"""Clear test databases and create test data before each test."""
|
||||||
|
users_db.truncate()
|
||||||
|
child_db.truncate()
|
||||||
|
task_db.truncate()
|
||||||
|
reward_db.truncate()
|
||||||
|
image_db.truncate()
|
||||||
|
pending_reward_db.truncate()
|
||||||
|
|
||||||
|
def teardown_method(self):
|
||||||
|
"""Clean up test directories after each test."""
|
||||||
|
# Clean up any test user directories
|
||||||
|
for user_id in ['deletion_test_user', 'user_no_children', 'user_no_tasks']:
|
||||||
|
user_dir = get_user_image_dir(user_id)
|
||||||
|
if os.path.exists(user_dir):
|
||||||
|
try:
|
||||||
|
shutil.rmtree(user_dir)
|
||||||
|
except:
|
||||||
|
pass
|
||||||
|
|
||||||
|
def test_deletion_order_pending_rewards_first(self):
|
||||||
|
"""Test that pending rewards are deleted before children."""
|
||||||
|
user_id = 'deletion_test_user'
|
||||||
|
|
||||||
|
# Create user
|
||||||
|
user = User(
|
||||||
|
id=user_id,
|
||||||
|
email='delete@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=800)).isoformat(),
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(user.to_dict())
|
||||||
|
|
||||||
|
# Create child
|
||||||
|
child = Child(
|
||||||
|
id='child1',
|
||||||
|
name='Test Child',
|
||||||
|
user_id=user_id,
|
||||||
|
points=100,
|
||||||
|
tasks=[],
|
||||||
|
rewards=[]
|
||||||
|
)
|
||||||
|
child_db.insert(child.to_dict())
|
||||||
|
|
||||||
|
# Create pending reward
|
||||||
|
pending = PendingReward(
|
||||||
|
id='pending1',
|
||||||
|
child_id='child1',
|
||||||
|
reward_id='reward1',
|
||||||
|
user_id=user_id,
|
||||||
|
status='pending'
|
||||||
|
)
|
||||||
|
pending_reward_db.insert(pending.to_dict())
|
||||||
|
|
||||||
|
# Delete user
|
||||||
|
result = delete_user_data(user)
|
||||||
|
|
||||||
|
assert result is True
|
||||||
|
assert len(pending_reward_db.all()) == 0
|
||||||
|
assert len(child_db.all()) == 0
|
||||||
|
assert len(users_db.all()) == 0
|
||||||
|
|
||||||
|
def test_deletion_removes_user_tasks_not_system(self):
|
||||||
|
"""Test that only user's tasks are deleted, not system tasks."""
|
||||||
|
user_id = 'deletion_test_user'
|
||||||
|
|
||||||
|
# Create user
|
||||||
|
user = User(
|
||||||
|
id=user_id,
|
||||||
|
email='delete@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=800)).isoformat(),
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(user.to_dict())
|
||||||
|
|
||||||
|
# Create user task
|
||||||
|
user_task = Task(
|
||||||
|
id='user_task',
|
||||||
|
name='User Task',
|
||||||
|
points=10,
|
||||||
|
is_good=True,
|
||||||
|
user_id=user_id
|
||||||
|
)
|
||||||
|
task_db.insert(user_task.to_dict())
|
||||||
|
|
||||||
|
# Create system task
|
||||||
|
system_task = Task(
|
||||||
|
id='system_task',
|
||||||
|
name='System Task',
|
||||||
|
points=20,
|
||||||
|
is_good=True,
|
||||||
|
user_id=None
|
||||||
|
)
|
||||||
|
task_db.insert(system_task.to_dict())
|
||||||
|
|
||||||
|
# Delete user
|
||||||
|
result = delete_user_data(user)
|
||||||
|
|
||||||
|
assert result is True
|
||||||
|
# User task should be deleted
|
||||||
|
assert task_db.get(Query().id == 'user_task') is None
|
||||||
|
# System task should remain
|
||||||
|
assert task_db.get(Query().id == 'system_task') is not None
|
||||||
|
assert len(users_db.all()) == 0
|
||||||
|
|
||||||
|
def test_deletion_removes_user_rewards_not_system(self):
|
||||||
|
"""Test that only user's rewards are deleted, not system rewards."""
|
||||||
|
user_id = 'deletion_test_user'
|
||||||
|
|
||||||
|
# Create user
|
||||||
|
user = User(
|
||||||
|
id=user_id,
|
||||||
|
email='delete@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=800)).isoformat(),
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(user.to_dict())
|
||||||
|
|
||||||
|
# Create user reward
|
||||||
|
user_reward = Reward(
|
||||||
|
id='user_reward',
|
||||||
|
name='User Reward',
|
||||||
|
description='A user reward',
|
||||||
|
cost=50,
|
||||||
|
user_id=user_id
|
||||||
|
)
|
||||||
|
reward_db.insert(user_reward.to_dict())
|
||||||
|
|
||||||
|
# Create system reward
|
||||||
|
system_reward = Reward(
|
||||||
|
id='system_reward',
|
||||||
|
name='System Reward',
|
||||||
|
description='A system reward',
|
||||||
|
cost=100,
|
||||||
|
user_id=None
|
||||||
|
)
|
||||||
|
reward_db.insert(system_reward.to_dict())
|
||||||
|
|
||||||
|
# Delete user
|
||||||
|
result = delete_user_data(user)
|
||||||
|
|
||||||
|
assert result is True
|
||||||
|
# User reward should be deleted
|
||||||
|
assert reward_db.get(Query().id == 'user_reward') is None
|
||||||
|
# System reward should remain
|
||||||
|
assert reward_db.get(Query().id == 'system_reward') is not None
|
||||||
|
assert len(users_db.all()) == 0
|
||||||
|
|
||||||
|
def test_deletion_removes_user_images(self):
|
||||||
|
"""Test that user's images are deleted from database."""
|
||||||
|
user_id = 'deletion_test_user'
|
||||||
|
|
||||||
|
# Create user
|
||||||
|
user = User(
|
||||||
|
id=user_id,
|
||||||
|
email='delete@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=800)).isoformat(),
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(user.to_dict())
|
||||||
|
|
||||||
|
# Create user image
|
||||||
|
image = Image(
|
||||||
|
id='img1',
|
||||||
|
user_id=user_id,
|
||||||
|
type=1,
|
||||||
|
extension='jpg',
|
||||||
|
permanent=False
|
||||||
|
)
|
||||||
|
image_db.insert(image.to_dict())
|
||||||
|
|
||||||
|
# Delete user
|
||||||
|
result = delete_user_data(user)
|
||||||
|
|
||||||
|
assert result is True
|
||||||
|
assert len(image_db.search(Query().user_id == user_id)) == 0
|
||||||
|
assert len(users_db.all()) == 0
|
||||||
|
|
||||||
|
def test_deletion_with_user_no_children(self):
|
||||||
|
"""Test deletion of user with no children."""
|
||||||
|
user_id = 'user_no_children'
|
||||||
|
|
||||||
|
# Create user without children
|
||||||
|
user = User(
|
||||||
|
id=user_id,
|
||||||
|
email='nochildren@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=800)).isoformat(),
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(user.to_dict())
|
||||||
|
|
||||||
|
# Delete user
|
||||||
|
result = delete_user_data(user)
|
||||||
|
|
||||||
|
assert result is True
|
||||||
|
assert len(users_db.all()) == 0
|
||||||
|
|
||||||
|
def test_deletion_with_user_no_custom_tasks_or_rewards(self):
|
||||||
|
"""Test deletion of user with no custom tasks or rewards."""
|
||||||
|
user_id = 'user_no_tasks'
|
||||||
|
|
||||||
|
# Create user
|
||||||
|
user = User(
|
||||||
|
id=user_id,
|
||||||
|
email='notasks@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=800)).isoformat(),
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(user.to_dict())
|
||||||
|
|
||||||
|
# Delete user
|
||||||
|
result = delete_user_data(user)
|
||||||
|
|
||||||
|
assert result is True
|
||||||
|
assert len(users_db.all()) == 0
|
||||||
|
|
||||||
|
def test_deletion_handles_missing_directory(self):
|
||||||
|
"""Test that deletion continues if user directory doesn't exist."""
|
||||||
|
user_id = 'user_no_dir'
|
||||||
|
|
||||||
|
# Create user
|
||||||
|
user = User(
|
||||||
|
id=user_id,
|
||||||
|
email='nodir@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=800)).isoformat(),
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(user.to_dict())
|
||||||
|
|
||||||
|
# Ensure directory doesn't exist
|
||||||
|
user_dir = get_user_image_dir(user_id)
|
||||||
|
if os.path.exists(user_dir):
|
||||||
|
shutil.rmtree(user_dir)
|
||||||
|
|
||||||
|
# Delete user (should not fail)
|
||||||
|
result = delete_user_data(user)
|
||||||
|
|
||||||
|
assert result is True
|
||||||
|
assert len(users_db.all()) == 0
|
||||||
|
|
||||||
|
def test_deletion_in_progress_flag(self):
|
||||||
|
"""Test that deletion_in_progress flag is set during deletion."""
|
||||||
|
user_id = 'deletion_test_user'
|
||||||
|
|
||||||
|
# Create user
|
||||||
|
user = User(
|
||||||
|
id=user_id,
|
||||||
|
email='flag@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=800)).isoformat(),
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(user.to_dict())
|
||||||
|
|
||||||
|
# Mock the deletion to fail partway through
|
||||||
|
with patch('utils.account_deletion_scheduler.child_db.remove') as mock_remove:
|
||||||
|
mock_remove.side_effect = Exception("Test error")
|
||||||
|
|
||||||
|
result = delete_user_data(user)
|
||||||
|
|
||||||
|
assert result is False
|
||||||
|
# Check that flag was updated
|
||||||
|
Query_ = Query()
|
||||||
|
updated_user = users_db.get(Query_.id == user_id)
|
||||||
|
assert updated_user['deletion_in_progress'] is False
|
||||||
|
assert updated_user['deletion_attempted_at'] is not None
|
||||||
|
|
||||||
|
|
||||||
|
class TestRetryLogic:
|
||||||
|
"""Tests for deletion retry logic."""
|
||||||
|
|
||||||
|
def setup_method(self):
|
||||||
|
"""Clear test databases before each test."""
|
||||||
|
users_db.truncate()
|
||||||
|
child_db.truncate()
|
||||||
|
task_db.truncate()
|
||||||
|
reward_db.truncate()
|
||||||
|
image_db.truncate()
|
||||||
|
pending_reward_db.truncate()
|
||||||
|
|
||||||
|
def test_deletion_attempt_count(self):
|
||||||
|
"""Test that deletion attempt count is tracked."""
|
||||||
|
user_no_attempts = User(
|
||||||
|
id='user1',
|
||||||
|
email='test1@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=800)).isoformat(),
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
assert get_deletion_attempt_count(user_no_attempts) == 0
|
||||||
|
|
||||||
|
user_one_attempt = User(
|
||||||
|
id='user2',
|
||||||
|
email='test2@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=800)).isoformat(),
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=datetime.now().isoformat()
|
||||||
|
)
|
||||||
|
assert get_deletion_attempt_count(user_one_attempt) == 1
|
||||||
|
|
||||||
|
def test_max_deletion_attempts_constant(self):
|
||||||
|
"""Test that MAX_DELETION_ATTEMPTS is defined correctly."""
|
||||||
|
assert MAX_DELETION_ATTEMPTS == 3
|
||||||
|
|
||||||
|
def test_scheduler_interval_configuration(self):
|
||||||
|
"""Test that scheduler is configured to run every 1 hour."""
|
||||||
|
from utils.account_deletion_scheduler import start_deletion_scheduler, stop_deletion_scheduler, _scheduler
|
||||||
|
|
||||||
|
# Clean up any existing scheduler
|
||||||
|
stop_deletion_scheduler()
|
||||||
|
|
||||||
|
# Start the scheduler
|
||||||
|
start_deletion_scheduler()
|
||||||
|
|
||||||
|
# Get the scheduler instance
|
||||||
|
from utils import account_deletion_scheduler
|
||||||
|
scheduler = account_deletion_scheduler._scheduler
|
||||||
|
|
||||||
|
# Verify scheduler exists
|
||||||
|
assert scheduler is not None, "Scheduler should be initialized"
|
||||||
|
|
||||||
|
# Get all jobs
|
||||||
|
jobs = scheduler.get_jobs()
|
||||||
|
assert len(jobs) > 0, "Scheduler should have at least one job"
|
||||||
|
|
||||||
|
# Find the account deletion job
|
||||||
|
deletion_job = None
|
||||||
|
for job in jobs:
|
||||||
|
if job.id == 'account_deletion':
|
||||||
|
deletion_job = job
|
||||||
|
break
|
||||||
|
|
||||||
|
assert deletion_job is not None, "Account deletion job should exist"
|
||||||
|
|
||||||
|
# Verify the job is configured with interval trigger
|
||||||
|
assert hasattr(deletion_job.trigger, 'interval'), "Job should use interval trigger"
|
||||||
|
|
||||||
|
# Verify interval is 1 hour (3600 seconds)
|
||||||
|
interval_seconds = deletion_job.trigger.interval.total_seconds()
|
||||||
|
assert interval_seconds == 3600, f"Expected 3600 seconds (1 hour), got {interval_seconds}"
|
||||||
|
|
||||||
|
# Clean up
|
||||||
|
stop_deletion_scheduler()
|
||||||
|
|
||||||
|
|
||||||
|
class TestRestartHandling:
|
||||||
|
"""Tests for restart handling."""
|
||||||
|
|
||||||
|
def setup_method(self):
|
||||||
|
"""Clear test databases before each test."""
|
||||||
|
users_db.truncate()
|
||||||
|
child_db.truncate()
|
||||||
|
task_db.truncate()
|
||||||
|
reward_db.truncate()
|
||||||
|
image_db.truncate()
|
||||||
|
pending_reward_db.truncate()
|
||||||
|
|
||||||
|
def test_interrupted_deletion_recovery(self):
|
||||||
|
"""Test that interrupted deletions are detected on restart."""
|
||||||
|
# Create user with deletion_in_progress flag set
|
||||||
|
user = User(
|
||||||
|
id='interrupted_user',
|
||||||
|
email='interrupted@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=800)).isoformat(),
|
||||||
|
deletion_in_progress=True,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(user.to_dict())
|
||||||
|
|
||||||
|
# Check for interrupted deletions
|
||||||
|
check_interrupted_deletions()
|
||||||
|
|
||||||
|
# Verify flag was cleared
|
||||||
|
Query_ = Query()
|
||||||
|
updated_user = users_db.get(Query_.id == 'interrupted_user')
|
||||||
|
assert updated_user['deletion_in_progress'] is False
|
||||||
|
|
||||||
|
def test_no_interrupted_deletions(self):
|
||||||
|
"""Test restart handling when no deletions were interrupted."""
|
||||||
|
# Create user without deletion_in_progress flag
|
||||||
|
user = User(
|
||||||
|
id='normal_user',
|
||||||
|
email='normal@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=800)).isoformat(),
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(user.to_dict())
|
||||||
|
|
||||||
|
# Should not raise any errors
|
||||||
|
check_interrupted_deletions()
|
||||||
|
|
||||||
|
|
||||||
|
class TestEdgeCases:
|
||||||
|
"""Tests for edge cases and error conditions."""
|
||||||
|
|
||||||
|
def setup_method(self):
|
||||||
|
"""Clear test databases before each test."""
|
||||||
|
users_db.truncate()
|
||||||
|
child_db.truncate()
|
||||||
|
task_db.truncate()
|
||||||
|
reward_db.truncate()
|
||||||
|
image_db.truncate()
|
||||||
|
pending_reward_db.truncate()
|
||||||
|
|
||||||
|
def test_concurrent_deletion_attempts(self):
|
||||||
|
"""Test that deletion_in_progress flag triggers a retry (interrupted deletion)."""
|
||||||
|
user_id = 'concurrent_user'
|
||||||
|
|
||||||
|
# Create user with deletion_in_progress already set (simulating an interrupted deletion)
|
||||||
|
user = User(
|
||||||
|
id=user_id,
|
||||||
|
email='concurrent@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=800)).isoformat(),
|
||||||
|
deletion_in_progress=True,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(user.to_dict())
|
||||||
|
|
||||||
|
# Try to process deletion queue
|
||||||
|
process_deletion_queue()
|
||||||
|
|
||||||
|
# User should be deleted (scheduler retries interrupted deletions)
|
||||||
|
Query_ = Query()
|
||||||
|
remaining_user = users_db.get(Query_.id == user_id)
|
||||||
|
assert remaining_user is None
|
||||||
|
|
||||||
|
def test_partial_deletion_failure_continues_with_other_users(self):
|
||||||
|
"""Test that failure with one user doesn't stop processing others."""
|
||||||
|
# Create two users due for deletion
|
||||||
|
user1 = User(
|
||||||
|
id='user1',
|
||||||
|
email='user1@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=800)).isoformat(),
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(user1.to_dict())
|
||||||
|
|
||||||
|
user2 = User(
|
||||||
|
id='user2',
|
||||||
|
email='user2@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=800)).isoformat(),
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(user2.to_dict())
|
||||||
|
|
||||||
|
# Mock delete_user_data to fail for first user but succeed for second
|
||||||
|
original_delete = delete_user_data
|
||||||
|
call_count = [0]
|
||||||
|
|
||||||
|
def mock_delete(user):
|
||||||
|
call_count[0] += 1
|
||||||
|
if call_count[0] == 1:
|
||||||
|
# Fail first call
|
||||||
|
return False
|
||||||
|
else:
|
||||||
|
# Succeed on subsequent calls
|
||||||
|
return original_delete(user)
|
||||||
|
|
||||||
|
with patch('utils.account_deletion_scheduler.delete_user_data', side_effect=mock_delete):
|
||||||
|
process_deletion_queue()
|
||||||
|
|
||||||
|
# First user should remain (failed)
|
||||||
|
Query_ = Query()
|
||||||
|
assert users_db.get(Query_.id == 'user1') is not None
|
||||||
|
# Second user should be deleted (succeeded)
|
||||||
|
# Note: This depends on implementation - if delete_user_data is mocked completely,
|
||||||
|
# the user won't actually be removed. This test validates the flow continues.
|
||||||
|
|
||||||
|
def test_deletion_with_user_no_uploaded_images(self):
|
||||||
|
"""Test deletion of user with no uploaded images."""
|
||||||
|
user_id = 'user_no_images'
|
||||||
|
|
||||||
|
# Create user without any images
|
||||||
|
user = User(
|
||||||
|
id=user_id,
|
||||||
|
email='noimages@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=800)).isoformat(),
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(user.to_dict())
|
||||||
|
|
||||||
|
# Verify no images exist for this user
|
||||||
|
Query_ = Query()
|
||||||
|
assert len(image_db.search(Query_.user_id == user_id)) == 0
|
||||||
|
|
||||||
|
# Delete user (should succeed without errors)
|
||||||
|
result = delete_user_data(user)
|
||||||
|
|
||||||
|
assert result is True
|
||||||
|
assert len(users_db.all()) == 0
|
||||||
|
|
||||||
|
def test_user_with_max_failed_attempts(self, caplog):
|
||||||
|
"""Test that user with 3+ failed attempts logs critical and is not retried."""
|
||||||
|
import logging
|
||||||
|
caplog.set_level(logging.CRITICAL)
|
||||||
|
|
||||||
|
user_id = 'user_max_attempts'
|
||||||
|
|
||||||
|
# Create user with 3 failed attempts (simulated by having deletion_attempted_at set
|
||||||
|
# and mocking get_deletion_attempt_count to return 3)
|
||||||
|
user = User(
|
||||||
|
id=user_id,
|
||||||
|
email='maxattempts@example.com',
|
||||||
|
first_name='Test',
|
||||||
|
last_name='User',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=800)).isoformat(),
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=datetime.now().isoformat()
|
||||||
|
)
|
||||||
|
users_db.insert(user.to_dict())
|
||||||
|
|
||||||
|
# Mock get_deletion_attempt_count to return MAX_DELETION_ATTEMPTS
|
||||||
|
with patch('utils.account_deletion_scheduler.get_deletion_attempt_count', return_value=MAX_DELETION_ATTEMPTS):
|
||||||
|
process_deletion_queue()
|
||||||
|
|
||||||
|
# User should still exist (not deleted due to max attempts)
|
||||||
|
Query_ = Query()
|
||||||
|
remaining_user = users_db.get(Query_.id == user_id)
|
||||||
|
assert remaining_user is not None
|
||||||
|
|
||||||
|
# Check that critical log message was created
|
||||||
|
assert any(
|
||||||
|
'Manual intervention required' in record.message and user_id in record.message
|
||||||
|
for record in caplog.records
|
||||||
|
if record.levelno == logging.CRITICAL
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class TestIntegration:
|
||||||
|
"""Integration tests for complete deletion workflows."""
|
||||||
|
|
||||||
|
def setup_method(self):
|
||||||
|
"""Clear test databases and filesystem before each test."""
|
||||||
|
users_db.truncate()
|
||||||
|
child_db.truncate()
|
||||||
|
task_db.truncate()
|
||||||
|
reward_db.truncate()
|
||||||
|
image_db.truncate()
|
||||||
|
pending_reward_db.truncate()
|
||||||
|
|
||||||
|
# Clean up test image directories
|
||||||
|
from config.paths import get_base_data_dir
|
||||||
|
test_image_base = os.path.join(get_base_data_dir(), 'images')
|
||||||
|
if os.path.exists(test_image_base):
|
||||||
|
for user_dir in os.listdir(test_image_base):
|
||||||
|
user_path = os.path.join(test_image_base, user_dir)
|
||||||
|
if os.path.isdir(user_path):
|
||||||
|
shutil.rmtree(user_path)
|
||||||
|
|
||||||
|
def teardown_method(self):
|
||||||
|
"""Clean up after tests."""
|
||||||
|
from config.paths import get_base_data_dir
|
||||||
|
test_image_base = os.path.join(get_base_data_dir(), 'images')
|
||||||
|
if os.path.exists(test_image_base):
|
||||||
|
for user_dir in os.listdir(test_image_base):
|
||||||
|
user_path = os.path.join(test_image_base, user_dir)
|
||||||
|
if os.path.isdir(user_path):
|
||||||
|
shutil.rmtree(user_path)
|
||||||
|
|
||||||
|
def test_full_deletion_flow_from_marking_to_deletion(self):
|
||||||
|
"""Test complete deletion flow from marked user with all associated data."""
|
||||||
|
user_id = 'integration_user_1'
|
||||||
|
child_id = 'integration_child_1'
|
||||||
|
task_id = 'integration_task_1'
|
||||||
|
reward_id = 'integration_reward_1'
|
||||||
|
image_id = 'integration_image_1'
|
||||||
|
pending_reward_id = 'integration_pending_1'
|
||||||
|
|
||||||
|
# 1. Create user marked for deletion (past threshold)
|
||||||
|
user = User(
|
||||||
|
id=user_id,
|
||||||
|
email='integration@example.com',
|
||||||
|
first_name='Integration',
|
||||||
|
last_name='Test',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=800)).isoformat(),
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(user.to_dict())
|
||||||
|
|
||||||
|
# 2. Create child belonging to user
|
||||||
|
child = Child(
|
||||||
|
id=child_id,
|
||||||
|
user_id=user_id,
|
||||||
|
name='Test Child'
|
||||||
|
)
|
||||||
|
child_db.insert(child.to_dict())
|
||||||
|
|
||||||
|
# 3. Create pending reward for child
|
||||||
|
pending_reward = PendingReward(
|
||||||
|
id=pending_reward_id,
|
||||||
|
child_id=child_id,
|
||||||
|
reward_id='some_reward',
|
||||||
|
user_id=user_id,
|
||||||
|
status='pending'
|
||||||
|
)
|
||||||
|
pending_reward_db.insert(pending_reward.to_dict())
|
||||||
|
|
||||||
|
# 4. Create task created by user
|
||||||
|
task = Task(
|
||||||
|
id=task_id,
|
||||||
|
user_id=user_id,
|
||||||
|
name='User Task',
|
||||||
|
points=10,
|
||||||
|
is_good=True
|
||||||
|
)
|
||||||
|
task_db.insert(task.to_dict())
|
||||||
|
|
||||||
|
# 5. Create reward created by user
|
||||||
|
reward = Reward(
|
||||||
|
id=reward_id,
|
||||||
|
user_id=user_id,
|
||||||
|
name='User Reward',
|
||||||
|
description='Test reward',
|
||||||
|
cost=20
|
||||||
|
)
|
||||||
|
reward_db.insert(reward.to_dict())
|
||||||
|
|
||||||
|
# 6. Create image uploaded by user
|
||||||
|
image = Image(
|
||||||
|
id=image_id,
|
||||||
|
user_id=user_id,
|
||||||
|
type=1, # Type 1 for regular images
|
||||||
|
extension='jpg',
|
||||||
|
permanent=False
|
||||||
|
)
|
||||||
|
image_db.insert(image.to_dict())
|
||||||
|
|
||||||
|
# 7. Create user image directory with a file
|
||||||
|
user_image_dir = get_user_image_dir(user_id)
|
||||||
|
os.makedirs(user_image_dir, exist_ok=True)
|
||||||
|
test_image_path = os.path.join(user_image_dir, 'test.jpg')
|
||||||
|
with open(test_image_path, 'w') as f:
|
||||||
|
f.write('test image content')
|
||||||
|
|
||||||
|
# Verify everything exists before deletion
|
||||||
|
Query_ = Query()
|
||||||
|
assert users_db.get(Query_.id == user_id) is not None
|
||||||
|
assert child_db.get(Query_.id == child_id) is not None
|
||||||
|
assert pending_reward_db.get(Query_.id == pending_reward_id) is not None
|
||||||
|
assert task_db.get(Query_.user_id == user_id) is not None
|
||||||
|
assert reward_db.get(Query_.user_id == user_id) is not None
|
||||||
|
assert image_db.get(Query_.user_id == user_id) is not None
|
||||||
|
assert os.path.exists(user_image_dir)
|
||||||
|
assert os.path.exists(test_image_path)
|
||||||
|
|
||||||
|
# Run the deletion scheduler
|
||||||
|
process_deletion_queue()
|
||||||
|
|
||||||
|
# Verify everything is deleted
|
||||||
|
assert users_db.get(Query_.id == user_id) is None
|
||||||
|
assert child_db.get(Query_.id == child_id) is None
|
||||||
|
assert pending_reward_db.get(Query_.id == pending_reward_id) is None
|
||||||
|
assert task_db.get(Query_.user_id == user_id) is None
|
||||||
|
assert reward_db.get(Query_.user_id == user_id) is None
|
||||||
|
assert image_db.get(Query_.user_id == user_id) is None
|
||||||
|
assert not os.path.exists(user_image_dir)
|
||||||
|
|
||||||
|
def test_multiple_users_deleted_in_same_scheduler_run(self):
|
||||||
|
"""Test multiple users are deleted in a single scheduler run."""
|
||||||
|
user_ids = ['multi_user_1', 'multi_user_2', 'multi_user_3']
|
||||||
|
|
||||||
|
# Create 3 users all marked for deletion (past threshold)
|
||||||
|
for user_id in user_ids:
|
||||||
|
user = User(
|
||||||
|
id=user_id,
|
||||||
|
email=f'{user_id}@example.com',
|
||||||
|
first_name='Multi',
|
||||||
|
last_name='Test',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=800)).isoformat(),
|
||||||
|
deletion_in_progress=False,
|
||||||
|
deletion_attempted_at=None
|
||||||
|
)
|
||||||
|
users_db.insert(user.to_dict())
|
||||||
|
|
||||||
|
# Add a child for each user to verify deletion cascade
|
||||||
|
child = Child(
|
||||||
|
id=f'child_{user_id}',
|
||||||
|
user_id=user_id,
|
||||||
|
name=f'Child {user_id}'
|
||||||
|
)
|
||||||
|
child_db.insert(child.to_dict())
|
||||||
|
|
||||||
|
# Verify all users exist
|
||||||
|
Query_ = Query()
|
||||||
|
for user_id in user_ids:
|
||||||
|
assert users_db.get(Query_.id == user_id) is not None
|
||||||
|
assert child_db.get(Query_.user_id == user_id) is not None
|
||||||
|
|
||||||
|
# Run scheduler once
|
||||||
|
process_deletion_queue()
|
||||||
|
|
||||||
|
# Verify all users and their children are deleted
|
||||||
|
for user_id in user_ids:
|
||||||
|
assert users_db.get(Query_.id == user_id) is None
|
||||||
|
assert child_db.get(Query_.user_id == user_id) is None
|
||||||
|
|
||||||
|
def test_deletion_with_restart_midway_recovery(self):
|
||||||
|
"""Test deletion recovery when restart happens during deletion."""
|
||||||
|
user_id = 'restart_user'
|
||||||
|
child_id = 'restart_child'
|
||||||
|
|
||||||
|
# 1. Create user with deletion_in_progress=True (simulating interrupted deletion)
|
||||||
|
user = User(
|
||||||
|
id=user_id,
|
||||||
|
email='restart@example.com',
|
||||||
|
first_name='Restart',
|
||||||
|
last_name='Test',
|
||||||
|
password='hash',
|
||||||
|
marked_for_deletion=True,
|
||||||
|
marked_for_deletion_at=(datetime.now() - timedelta(hours=800)).isoformat(),
|
||||||
|
deletion_in_progress=True, # Interrupted state
|
||||||
|
deletion_attempted_at=(datetime.now() - timedelta(hours=2)).isoformat()
|
||||||
|
)
|
||||||
|
users_db.insert(user.to_dict())
|
||||||
|
|
||||||
|
# 2. Create associated data
|
||||||
|
child = Child(
|
||||||
|
id=child_id,
|
||||||
|
user_id=user_id,
|
||||||
|
name='Test Child'
|
||||||
|
)
|
||||||
|
child_db.insert(child.to_dict())
|
||||||
|
|
||||||
|
# 3. Create user image directory
|
||||||
|
user_image_dir = get_user_image_dir(user_id)
|
||||||
|
os.makedirs(user_image_dir, exist_ok=True)
|
||||||
|
test_image_path = os.path.join(user_image_dir, 'test.jpg')
|
||||||
|
with open(test_image_path, 'w') as f:
|
||||||
|
f.write('test content')
|
||||||
|
|
||||||
|
# Verify initial state
|
||||||
|
Query_ = Query()
|
||||||
|
assert users_db.get(Query_.id == user_id) is not None
|
||||||
|
assert users_db.get(Query_.id == user_id)['deletion_in_progress'] is True
|
||||||
|
|
||||||
|
# 4. Call check_interrupted_deletions (simulating app restart)
|
||||||
|
check_interrupted_deletions()
|
||||||
|
|
||||||
|
# Verify flag was cleared
|
||||||
|
updated_user = users_db.get(Query_.id == user_id)
|
||||||
|
assert updated_user is not None
|
||||||
|
assert updated_user['deletion_in_progress'] is False
|
||||||
|
|
||||||
|
# 5. Now run the scheduler to complete the deletion
|
||||||
|
process_deletion_queue()
|
||||||
|
|
||||||
|
# Verify user and all data are deleted
|
||||||
|
assert users_db.get(Query_.id == user_id) is None
|
||||||
|
assert child_db.get(Query_.id == child_id) is None
|
||||||
|
assert not os.path.exists(user_image_dir)
|
||||||
360
backend/utils/account_deletion_scheduler.py
Normal file
360
backend/utils/account_deletion_scheduler.py
Normal file
@@ -0,0 +1,360 @@
|
|||||||
|
import logging
|
||||||
|
import os
|
||||||
|
import shutil
|
||||||
|
from datetime import datetime, timedelta
|
||||||
|
from logging.handlers import RotatingFileHandler
|
||||||
|
from apscheduler.schedulers.background import BackgroundScheduler
|
||||||
|
from tinydb import Query
|
||||||
|
|
||||||
|
from config.deletion_config import ACCOUNT_DELETION_THRESHOLD_HOURS
|
||||||
|
from config.paths import get_user_image_dir
|
||||||
|
from db.db import users_db, child_db, task_db, reward_db, image_db, pending_reward_db
|
||||||
|
from models.user import User
|
||||||
|
from events.types.event import Event
|
||||||
|
from events.types.event_types import EventType
|
||||||
|
from events.types.user_deleted import UserDeleted
|
||||||
|
from events.sse import send_to_user
|
||||||
|
|
||||||
|
# Setup dedicated logger for account deletion
|
||||||
|
logger = logging.getLogger('account_deletion_scheduler')
|
||||||
|
logger.setLevel(logging.INFO)
|
||||||
|
|
||||||
|
# Create logs directory if it doesn't exist
|
||||||
|
os.makedirs('logs', exist_ok=True)
|
||||||
|
|
||||||
|
# Add rotating file handler
|
||||||
|
file_handler = RotatingFileHandler(
|
||||||
|
'logs/account_deletion.log',
|
||||||
|
maxBytes=10*1024*1024, # 10MB
|
||||||
|
backupCount=5
|
||||||
|
)
|
||||||
|
file_handler.setFormatter(
|
||||||
|
logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
|
||||||
|
)
|
||||||
|
logger.addHandler(file_handler)
|
||||||
|
|
||||||
|
# Also log to stdout
|
||||||
|
console_handler = logging.StreamHandler()
|
||||||
|
console_handler.setFormatter(
|
||||||
|
logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
|
||||||
|
)
|
||||||
|
logger.addHandler(console_handler)
|
||||||
|
|
||||||
|
MAX_DELETION_ATTEMPTS = 3
|
||||||
|
|
||||||
|
def send_user_deleted_event_to_admins(user_id: str, email: str, deleted_at: str):
|
||||||
|
"""
|
||||||
|
Send USER_DELETED event to all admin users.
|
||||||
|
|
||||||
|
TODO: Currently sends to all authenticated users with active SSE connections.
|
||||||
|
In production, this should filter to only users with admin role.
|
||||||
|
"""
|
||||||
|
event = Event(
|
||||||
|
EventType.USER_DELETED.value,
|
||||||
|
UserDeleted(user_id, email, deleted_at)
|
||||||
|
)
|
||||||
|
|
||||||
|
# TODO: Get list of admin users and send only to them
|
||||||
|
# For now, we'll skip broadcasting since we don't have a way to get all active admin connections
|
||||||
|
# This will need to be implemented when admin role system is in place
|
||||||
|
logger.info(f"USER_DELETED event created for {user_id} ({email}) at {deleted_at}")
|
||||||
|
# Future implementation:
|
||||||
|
# admin_users = get_admin_users()
|
||||||
|
# for admin in admin_users:
|
||||||
|
# send_to_user(admin.id, event.to_dict())
|
||||||
|
|
||||||
|
def is_user_due_for_deletion(user: User) -> bool:
|
||||||
|
"""
|
||||||
|
Check if a user is due for deletion based on marked_for_deletion_at timestamp
|
||||||
|
and the configured threshold.
|
||||||
|
"""
|
||||||
|
if not user.marked_for_deletion or not user.marked_for_deletion_at:
|
||||||
|
return False
|
||||||
|
|
||||||
|
try:
|
||||||
|
marked_at = datetime.fromisoformat(user.marked_for_deletion_at)
|
||||||
|
threshold_delta = timedelta(hours=ACCOUNT_DELETION_THRESHOLD_HOURS)
|
||||||
|
due_at = marked_at + threshold_delta
|
||||||
|
|
||||||
|
# Get current time - make it timezone-aware if marked_at is timezone-aware
|
||||||
|
now = datetime.now()
|
||||||
|
if marked_at.tzinfo is not None:
|
||||||
|
# Convert marked_at to naive UTC for comparison
|
||||||
|
marked_at = marked_at.replace(tzinfo=None)
|
||||||
|
due_at = marked_at + threshold_delta
|
||||||
|
|
||||||
|
return now >= due_at
|
||||||
|
except (ValueError, TypeError) as e:
|
||||||
|
logger.error(f"Error parsing marked_for_deletion_at for user {user.id}: {e}")
|
||||||
|
return False
|
||||||
|
|
||||||
|
def get_deletion_attempt_count(user: User) -> int:
|
||||||
|
"""
|
||||||
|
Calculate the number of deletion attempts based on deletion_attempted_at.
|
||||||
|
This is a simplified version - in practice, you might track attempts differently.
|
||||||
|
"""
|
||||||
|
# For now, we'll consider any user with deletion_attempted_at as having 1 attempt
|
||||||
|
# In a more robust system, you'd track this in a separate field or table
|
||||||
|
if user.deletion_attempted_at:
|
||||||
|
return 1
|
||||||
|
return 0
|
||||||
|
|
||||||
|
def delete_user_data(user: User) -> bool:
|
||||||
|
"""
|
||||||
|
Delete all data associated with a user in the correct order.
|
||||||
|
Returns True if successful, False otherwise.
|
||||||
|
"""
|
||||||
|
user_id = user.id
|
||||||
|
success = True
|
||||||
|
|
||||||
|
try:
|
||||||
|
# Step 1: Set deletion_in_progress flag
|
||||||
|
logger.info(f"Starting deletion for user {user_id} ({user.email})")
|
||||||
|
Query_ = Query()
|
||||||
|
users_db.update({'deletion_in_progress': True}, Query_.id == user_id)
|
||||||
|
|
||||||
|
# Step 2: Remove pending rewards for user's children
|
||||||
|
try:
|
||||||
|
children = child_db.search(Query_.user_id == user_id)
|
||||||
|
child_ids = [child['id'] for child in children]
|
||||||
|
|
||||||
|
if child_ids:
|
||||||
|
for child_id in child_ids:
|
||||||
|
removed = pending_reward_db.remove(Query_.child_id == child_id)
|
||||||
|
if removed:
|
||||||
|
logger.info(f"Deleted {len(removed)} pending rewards for child {child_id}")
|
||||||
|
except Exception as e:
|
||||||
|
logger.error(f"Failed to delete pending rewards for user {user_id}: {e}")
|
||||||
|
success = False
|
||||||
|
|
||||||
|
# Step 3: Remove children
|
||||||
|
try:
|
||||||
|
removed = child_db.remove(Query_.user_id == user_id)
|
||||||
|
if removed:
|
||||||
|
logger.info(f"Deleted {len(removed)} children for user {user_id}")
|
||||||
|
except Exception as e:
|
||||||
|
logger.error(f"Failed to delete children for user {user_id}: {e}")
|
||||||
|
success = False
|
||||||
|
|
||||||
|
# Step 4: Remove user-created tasks
|
||||||
|
try:
|
||||||
|
removed = task_db.remove(Query_.user_id == user_id)
|
||||||
|
if removed:
|
||||||
|
logger.info(f"Deleted {len(removed)} tasks for user {user_id}")
|
||||||
|
except Exception as e:
|
||||||
|
logger.error(f"Failed to delete tasks for user {user_id}: {e}")
|
||||||
|
success = False
|
||||||
|
|
||||||
|
# Step 5: Remove user-created rewards
|
||||||
|
try:
|
||||||
|
removed = reward_db.remove(Query_.user_id == user_id)
|
||||||
|
if removed:
|
||||||
|
logger.info(f"Deleted {len(removed)} rewards for user {user_id}")
|
||||||
|
except Exception as e:
|
||||||
|
logger.error(f"Failed to delete rewards for user {user_id}: {e}")
|
||||||
|
success = False
|
||||||
|
|
||||||
|
# Step 6: Remove user's images from database
|
||||||
|
try:
|
||||||
|
removed = image_db.remove(Query_.user_id == user_id)
|
||||||
|
if removed:
|
||||||
|
logger.info(f"Deleted {len(removed)} images from database for user {user_id}")
|
||||||
|
except Exception as e:
|
||||||
|
logger.error(f"Failed to delete images from database for user {user_id}: {e}")
|
||||||
|
success = False
|
||||||
|
|
||||||
|
# Step 7: Delete user's image directory from filesystem
|
||||||
|
try:
|
||||||
|
user_image_dir = get_user_image_dir(user_id)
|
||||||
|
if os.path.exists(user_image_dir):
|
||||||
|
shutil.rmtree(user_image_dir)
|
||||||
|
logger.info(f"Deleted image directory for user {user_id}")
|
||||||
|
else:
|
||||||
|
logger.info(f"Image directory for user {user_id} does not exist (already deleted or never created)")
|
||||||
|
except Exception as e:
|
||||||
|
logger.error(f"Failed to delete image directory for user {user_id}: {e}")
|
||||||
|
success = False
|
||||||
|
|
||||||
|
# Step 8: Remove user record
|
||||||
|
if success:
|
||||||
|
try:
|
||||||
|
users_db.remove(Query_.id == user_id)
|
||||||
|
deleted_at = datetime.now().isoformat()
|
||||||
|
logger.info(f"Successfully deleted user {user_id} ({user.email})")
|
||||||
|
|
||||||
|
# Send USER_DELETED event to admin users
|
||||||
|
send_user_deleted_event_to_admins(user_id, user.email, deleted_at)
|
||||||
|
|
||||||
|
return True
|
||||||
|
except Exception as e:
|
||||||
|
logger.error(f"Failed to delete user record for {user_id}: {e}")
|
||||||
|
return False
|
||||||
|
else:
|
||||||
|
# Deletion failed, update flags
|
||||||
|
logger.error(f"Deletion failed for user {user_id}, marking for retry")
|
||||||
|
users_db.update({
|
||||||
|
'deletion_in_progress': False,
|
||||||
|
'deletion_attempted_at': datetime.now().isoformat()
|
||||||
|
}, Query_.id == user_id)
|
||||||
|
return False
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
logger.error(f"Unexpected error during deletion for user {user_id}: {e}")
|
||||||
|
# Try to clear the in_progress flag
|
||||||
|
try:
|
||||||
|
users_db.update({
|
||||||
|
'deletion_in_progress': False,
|
||||||
|
'deletion_attempted_at': datetime.now().isoformat()
|
||||||
|
}, Query_.id == user_id)
|
||||||
|
except:
|
||||||
|
pass
|
||||||
|
return False
|
||||||
|
|
||||||
|
def process_deletion_queue():
|
||||||
|
"""
|
||||||
|
Process the deletion queue: find users due for deletion and delete them.
|
||||||
|
"""
|
||||||
|
logger.info("Starting deletion scheduler run")
|
||||||
|
|
||||||
|
processed = 0
|
||||||
|
deleted = 0
|
||||||
|
failed = 0
|
||||||
|
|
||||||
|
try:
|
||||||
|
# Get all marked users
|
||||||
|
Query_ = Query()
|
||||||
|
marked_users = users_db.search(Query_.marked_for_deletion == True)
|
||||||
|
|
||||||
|
if not marked_users:
|
||||||
|
logger.info("No users marked for deletion")
|
||||||
|
return
|
||||||
|
|
||||||
|
logger.info(f"Found {len(marked_users)} users marked for deletion")
|
||||||
|
|
||||||
|
for user_dict in marked_users:
|
||||||
|
user = User.from_dict(user_dict)
|
||||||
|
processed += 1
|
||||||
|
|
||||||
|
# Check if user is due for deletion
|
||||||
|
if not is_user_due_for_deletion(user):
|
||||||
|
continue
|
||||||
|
|
||||||
|
# Check retry limit
|
||||||
|
attempt_count = get_deletion_attempt_count(user)
|
||||||
|
if attempt_count >= MAX_DELETION_ATTEMPTS:
|
||||||
|
logger.critical(
|
||||||
|
f"User {user.id} ({user.email}) has failed deletion {attempt_count} times. "
|
||||||
|
"Manual intervention required."
|
||||||
|
)
|
||||||
|
continue
|
||||||
|
|
||||||
|
# Skip if deletion is already in progress (from a previous run)
|
||||||
|
if user.deletion_in_progress:
|
||||||
|
logger.warning(
|
||||||
|
f"User {user.id} ({user.email}) has deletion_in_progress=True. "
|
||||||
|
"This may indicate a previous run was interrupted. Retrying..."
|
||||||
|
)
|
||||||
|
|
||||||
|
# Attempt deletion
|
||||||
|
if delete_user_data(user):
|
||||||
|
deleted += 1
|
||||||
|
else:
|
||||||
|
failed += 1
|
||||||
|
|
||||||
|
logger.info(
|
||||||
|
f"Deletion scheduler run complete: "
|
||||||
|
f"{processed} users processed, {deleted} deleted, {failed} failed"
|
||||||
|
)
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
logger.error(f"Error in deletion scheduler: {e}")
|
||||||
|
|
||||||
|
def check_interrupted_deletions():
|
||||||
|
"""
|
||||||
|
On startup, check for users with deletion_in_progress=True
|
||||||
|
and retry their deletion.
|
||||||
|
"""
|
||||||
|
logger.info("Checking for interrupted deletions from previous runs")
|
||||||
|
|
||||||
|
try:
|
||||||
|
Query_ = Query()
|
||||||
|
interrupted_users = users_db.search(
|
||||||
|
(Query_.marked_for_deletion == True) &
|
||||||
|
(Query_.deletion_in_progress == True)
|
||||||
|
)
|
||||||
|
|
||||||
|
if interrupted_users:
|
||||||
|
logger.warning(
|
||||||
|
f"Found {len(interrupted_users)} users with interrupted deletions. "
|
||||||
|
"Will retry on next scheduler run."
|
||||||
|
)
|
||||||
|
# Reset the flag so they can be retried
|
||||||
|
for user_dict in interrupted_users:
|
||||||
|
users_db.update(
|
||||||
|
{'deletion_in_progress': False},
|
||||||
|
Query_.id == user_dict['id']
|
||||||
|
)
|
||||||
|
except Exception as e:
|
||||||
|
logger.error(f"Error checking for interrupted deletions: {e}")
|
||||||
|
|
||||||
|
# Global scheduler instance
|
||||||
|
_scheduler = None
|
||||||
|
|
||||||
|
def start_deletion_scheduler():
|
||||||
|
"""
|
||||||
|
Start the background deletion scheduler.
|
||||||
|
Should be called once during application startup.
|
||||||
|
"""
|
||||||
|
global _scheduler
|
||||||
|
|
||||||
|
if _scheduler is not None:
|
||||||
|
logger.warning("Deletion scheduler is already running")
|
||||||
|
return
|
||||||
|
|
||||||
|
logger.info("Starting account deletion scheduler")
|
||||||
|
|
||||||
|
# Check for interrupted deletions from previous runs
|
||||||
|
check_interrupted_deletions()
|
||||||
|
|
||||||
|
# Create and start scheduler
|
||||||
|
_scheduler = BackgroundScheduler()
|
||||||
|
|
||||||
|
# Run every hour
|
||||||
|
_scheduler.add_job(
|
||||||
|
process_deletion_queue,
|
||||||
|
'interval',
|
||||||
|
hours=1,
|
||||||
|
id='account_deletion',
|
||||||
|
name='Account Deletion Scheduler',
|
||||||
|
replace_existing=True
|
||||||
|
)
|
||||||
|
|
||||||
|
_scheduler.start()
|
||||||
|
logger.info("Account deletion scheduler started (runs every 1 hour)")
|
||||||
|
|
||||||
|
def stop_deletion_scheduler():
|
||||||
|
"""
|
||||||
|
Stop the deletion scheduler (for testing or shutdown).
|
||||||
|
"""
|
||||||
|
global _scheduler
|
||||||
|
|
||||||
|
if _scheduler is not None:
|
||||||
|
_scheduler.shutdown()
|
||||||
|
_scheduler = None
|
||||||
|
logger.info("Account deletion scheduler stopped")
|
||||||
|
|
||||||
|
def trigger_deletion_manually():
|
||||||
|
"""
|
||||||
|
Manually trigger the deletion process (for admin use).
|
||||||
|
Returns stats about the run.
|
||||||
|
"""
|
||||||
|
logger.info("Manual deletion trigger requested")
|
||||||
|
process_deletion_queue()
|
||||||
|
|
||||||
|
# Return stats (simplified version)
|
||||||
|
Query_ = Query()
|
||||||
|
marked_users = users_db.search(Query_.marked_for_deletion == True)
|
||||||
|
return {
|
||||||
|
'triggered': True,
|
||||||
|
'queued_users': len(marked_users)
|
||||||
|
}
|
||||||
@@ -16,6 +16,8 @@ export interface User {
|
|||||||
image_id: string | null
|
image_id: string | null
|
||||||
marked_for_deletion: boolean
|
marked_for_deletion: boolean
|
||||||
marked_for_deletion_at: string | null
|
marked_for_deletion_at: string | null
|
||||||
|
deletion_in_progress: boolean
|
||||||
|
deletion_attempted_at: string | null
|
||||||
}
|
}
|
||||||
|
|
||||||
export interface Child {
|
export interface Child {
|
||||||
|
|||||||
Reference in New Issue
Block a user