Pull Request Process
This document defines our pull request standards and review process.
PR Size Guidelines
Rule: Keep PRs small and focused (<400 lines changed)
Small PRs:
- ✅ Are easier to review
- ✅ Get merged faster
- ✅ Have fewer bugs
- ✅ Are easier to revert if needed
If your PR is >400 lines, consider:
- Breaking it into multiple PRs
- Submitting database migrations separately
- Creating a feature branch and multiple PRs into it
PR Description Template
Every PR must include:
What Changed
Clear, concise description of what you built.
Example:
## What Changed
Added webhook subscription functionality to allow merchants to receive
real-time notifications when order statuses change.
Changes include:
- New `webhook_subscriptions` and `webhook_deliveries` database tables
- Webhook subscription CRUD endpoints
- Background worker to deliver webhooks with retry logic
- Signature verification for webhook security
Why It Changed
Explain the business context and problem being solved.
Example:
## Why
Merchants were polling our API every 5 minutes to check for order updates,
causing unnecessary load on our servers. This feature enables real-time
push notifications, reducing API load and improving merchant experience.
Related to: #123 (GitHub issue)
How to Test
Step-by-step instructions for reviewers to test your changes.
Example:
## How to Test
1. Pull the branch and run migrations:
   ```bash
   git checkout feature/webhook-subscriptions
   alembic upgrade head
   ```
2. Start the application and Celery worker:
   ```bash
   uvicorn app.main:app --reload
   celery -A app.worker worker --loglevel=info
   ```
3. Create a webhook subscription:
   ```bash
   curl -X POST http://localhost:8000/api/v1/webhooks/subscriptions \
     -H "Authorization: Bearer YOUR_TOKEN" \
     -H "Content-Type: application/json" \
     -d '{"url": "https://webhook.site/your-unique-id", "events": ["order.status_changed"]}'
   ```
4. Update an order status and verify the webhook is delivered to webhook.site
5. Check Celery logs to see delivery attempts

Postman collection: `Webhooks API.postman_collection.json` (in repo root)
### Screenshots/Videos (if UI changes)
Include before/after screenshots or screen recordings for UI changes.
**Example**:
```markdown
## Screenshots

### Before

### After

### Demo
[Loom video](https://loom.com/share/abc123)
```
Database Migrations Included?
Clearly state if migrations are included.
Example:
## Database Migrations
- ✅ Yes - Migration file: `migrations/versions/001_add_webhook_tables.py`
- ⚠️ **IMPORTANT**: Run `alembic upgrade head` before testing
or
## Database Migrations
- ❌ No database changes
API Docs Updated?
Confirm API documentation is updated.
Example:
## API Documentation
- ✅ Yes - Updated `docs/api/endpoints.md` with new webhook endpoints
or
## API Documentation
- ❌ No API changes
Complete PR Template
Copy this template for your PRs:
```markdown
## What Changed
[Brief description of changes]
- [Specific change 1]
- [Specific change 2]
- [Specific change 3]
## Why
[Business context and problem being solved]
Related issue: #[issue number]
## How to Test
1. [Step 1]
2. [Step 2]
3. [Step 3]
[Include cURL commands, Postman collection name, or test scripts]
## Screenshots/Videos
[If UI changes, include before/after screenshots or demo video]
## Database Migrations
- [ ] Yes - Migration file: `migrations/versions/XXX_description.py`
- [ ] No
**If yes**: Run `alembic upgrade head` before testing
## API Documentation
- [ ] Yes - Updated `docs/api/endpoints.md`
- [ ] No
## Checklist
- [ ] Tests added/updated and passing
- [ ] Docstrings added for all new functions
- [ ] Pre-commit hooks pass locally
- [ ] Manually tested all changes
- [ ] No console errors or warnings
- [ ] Code follows our standards (see docs/02-standards/)
```
Review Checklist for Authors
Before requesting review, verify:
Code Quality
- All functions have docstrings (Google-style; see the example after this list)
- No commented-out code
- No debug print statements or console.logs
- No TODOs or FIXMEs (create issues instead)
- Consistent naming conventions
- Error handling implemented
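For reference, the Google-style docstring format looks like the following sketch (the function and exception names are hypothetical, not part of our codebase):

```python
def deliver_webhook(subscription_id: int, payload: dict) -> bool:
    """Deliver a webhook payload to a subscriber's endpoint.

    Args:
        subscription_id: ID of the webhook subscription to deliver to.
        payload: JSON-serialisable event data to send.

    Returns:
        True if the endpoint acknowledged the delivery, False otherwise.

    Raises:
        SubscriptionNotFoundError: If no subscription exists for the given ID.
    """
    ...
```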
Testing
- Unit tests added for new functions
- Integration tests added for new endpoints
- All tests passing locally (`pytest` and `npm test`)
- Test coverage ≥80%
- Manual testing completed
Standards
- Code follows `../02-standards/code-standards.md`
- API patterns follow `../02-standards/api-patterns.md`
- Database design follows `../02-standards/database-patterns.md`
Pre-commit Hooks
- Pre-commit hooks pass locally
- Code formatted with Black/Prettier
- Imports sorted with isort
- Linting passes (flake8/eslint)
- Type checking passes (mypy)
Documentation
- API docs updated (if API changes)
- README updated (if setup changes)
- Migration instructions in PR description
- Comments explain complex logic
Performance
- No N+1 database queries (see the sketch after this list)
- Indexes added for new query patterns
- API endpoints respond in <500ms
- Background jobs used for long operations
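To illustrate the N+1 point, here is a minimal SQLAlchemy-style sketch; the `Order` model, its `items` relationship, and the `session` object are assumed purely for illustration:

```python
from sqlalchemy import select
from sqlalchemy.orm import selectinload

# N+1: one query for the orders, then one extra query per order for its items
orders = session.execute(select(Order)).scalars().all()
for order in orders:
    print(order.items)  # each access lazy-loads with its own SELECT

# Better: eager-load the relationship up front in a single extra query
orders = session.execute(
    select(Order).options(selectinload(Order.items))
).scalars().all()
for order in orders:
    print(order.items)  # already loaded, no additional queries
```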
Review Checklist for Reviewers
As a reviewer, check:
Functionality
- Changes match the PR description
- Code solves the stated problem
- Edge cases handled
- Error messages are helpful
Code Quality
- Code is readable and maintainable
- No unnecessary complexity
- Functions have single responsibility
- Naming is clear and consistent
- Docstrings are complete and accurate
Testing
- Tests cover main scenarios
- Tests cover error cases
- Tests are readable
- No flaky tests
Security
- No hardcoded secrets or API keys
- Authentication/authorisation implemented
- Input validation present
- SQL injection prevented (using ORM; see the sketch after this list)
- XSS prevented (proper escaping)
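To make the SQL injection point concrete, a short sketch contrasting interpolated SQL with parameterised queries; the `User` model, `session`, and `email` variable are assumptions for illustration only:

```python
from sqlalchemy import select, text

# Vulnerable: user input interpolated straight into the SQL string
rows = session.execute(text(f"SELECT * FROM users WHERE email = '{email}'"))

# Safe: bound parameters are escaped by the database driver
rows = session.execute(
    text("SELECT * FROM users WHERE email = :email"), {"email": email}
)

# Preferred: use the ORM query layer instead of hand-written SQL
user = session.execute(
    select(User).where(User.email == email)
).scalar_one_or_none()
```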
Performance
- No obvious performance issues
- Database queries are efficient
- Proper indexing
- Caching used where appropriate
Standards Compliance
- Follows our code standards
- Follows our API patterns
- Follows our database patterns
- Follows our Git workflow
How to Handle Feedback
As an Author
When you receive feedback:
- Don’t take it personally - Code review is about the code, not you
- Ask questions - If feedback is unclear, ask for clarification
- Discuss alternatives - If you disagree, explain your reasoning
- Make changes promptly - Address feedback within 24 hours
- Respond to all comments - Mark as resolved or respond
- Thank reviewers - Appreciate their time and input
Types of responses:
✅ Good responses:
- “Good catch! Fixed in commit abc123”
- “I considered that, but chose X because Y. Thoughts?”
- “Great suggestion! I’ve refactored to use your approach”
- “Can you clarify what you mean by…?”
❌ Bad responses:
- “This works fine” (dismissive)
- “I’ll fix it later” (create an issue if not fixing now)
- No response (always acknowledge feedback)
- Arguing without data/reasoning
As a Reviewer
When giving feedback:
- Be kind and constructive - Focus on the code, not the person
- Explain the why - Don’t just say “change this”, explain why
- Suggest alternatives - Offer specific improvements
- Acknowledge good work - Point out what’s done well
- Differentiate must-fix from nice-to-have - Be clear about priorities
Types of feedback:
✅ Good feedback:
- “Consider using a background job here since this API call can take 5+ seconds. See api-patterns.md for the capture-and-process pattern.”
- “Nice refactoring! This is much more readable. One suggestion: we could extract this into a service method for reusability.”
- “This needs an index on `user_id` since we’re filtering by it. See database-patterns.md rule #3.” (illustrative migration sketch below)
❌ Bad feedback:
- “This is wrong” (not specific)
- “Change this” (no explanation)
- “We don’t do it like that” (cite standards doc instead)
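As an example, the index suggestion above could be actioned with a small Alembic migration; a hypothetical sketch in which the table, column, and revision identifiers are placeholders:

```python
"""Add index on orders.user_id for filtering by user."""
from alembic import op

# Revision identifiers (placeholders)
revision = "002_add_orders_user_id_index"
down_revision = "001_add_webhook_tables"


def upgrade() -> None:
    # Supports queries that filter orders by user_id
    op.create_index("ix_orders_user_id", "orders", ["user_id"])


def downgrade() -> None:
    op.drop_index("ix_orders_user_id", table_name="orders")
```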
When to Merge
Merge when ALL of these are true:
- 2 approvals from team members (1 for hotfixes)
- All automated checks pass:
- Tests pass (pytest, npm test)
- Linting passes (flake8, eslint)
- Type checking passes (mypy)
- Pre-commit hooks pass
- All feedback addressed or acknowledged
- No unresolved conflicts
- Author confirms ready to merge
Automated Checks
These checks run automatically on every PR via GitHub Actions:
Backend Checks
✓ Tests (pytest)
✓ Code coverage (≥80%)
✓ Linting (flake8)
✓ Type checking (mypy)
✓ Security checks (bandit)
Frontend Checks
✓ Tests (Jest/Vitest)
✓ Code coverage (≥80%)
✓ Linting (eslint)
✓ Type checking (tsc)
✓ Build succeeds
All checks must pass before merging.
GitHub + Slack Integration
Our repository is connected to Slack to keep the team informed.
Automatic Notifications
The following events trigger Slack notifications in #dev-team:
PR Events:
- 🆕 PR opened - New pull request created
- 👀 Review requested - Someone is asked to review
- ✅ PR approved - PR receives an approval
- 💬 Comments added - New review comments posted
- 🔄 Changes requested - Reviewer requests changes
- ✅ PR merged - PR successfully merged
- ❌ PR closed - PR closed without merging
Build Events (in #alerts):
- ❌ Build failed - Tests or checks failing
- ✅ Build fixed - Previously failing build now passes
Deployment Events (in #deployments):
- 🚀 Deployed to staging - Code deployed to staging
- 🚀 Deployed to production - Code deployed to production
Notification Format
Slack notifications include:
- PR title and number
- Author name
- Link to PR
- Status (open, approved, merged, etc.)
- Commit message (for deployments)
Example Slack notification:
🆕 Pull Request Opened
#247 Add webhook subscription functionality
by @john-doe • myorg/myapp
[View Pull Request]
Setup
See ../06-tooling/github-slack-integration.md for setup instructions.
PR Etiquette
Response Times
- Authors: Address feedback within 24 hours
- Reviewers: Complete reviews within 24 hours
- Urgent PRs: Label with `urgent` and mention in Slack
Communication
- Use PR comments for code-specific feedback
- Use Slack for high-level discussion or blockers
- Use GitHub threads to keep discussions organised
- Resolve threads when feedback is addressed
Best Practices
✅ Do:
- Review PRs from others before requesting review on yours
- Keep PRs small and focused
- Write detailed PR descriptions
- Test before requesting review
- Thank reviewers for their time
❌ Don’t:
- Force push after reviews started (unless requested)
- Merge your own PRs without approvals
- Leave PRs open for >3 days without updates
- Request reviews from everyone (pick 1-2 relevant people)
- Submit PRs at 5pm on Friday
PR Workflow Diagram
```mermaid
graph TD
A[Create Feature Branch] --> B[Implement Feature]
B --> C[Write Tests]
C --> D[Run Tests Locally]
D --> E{Tests Pass?}
E -->|No| B
E -->|Yes| F[Run Pre-commit Hooks]
F --> G{Hooks Pass?}
G -->|No| B
G -->|Yes| H[Push to GitHub]
H --> I[Create Pull Request]
I --> J[Automated Checks Run]
J --> K{Checks Pass?}
K -->|No| B
K -->|Yes| L[Request Reviews]
L --> M[Reviewers Review]
M --> N{Approved?}
N -->|Changes Requested| B
N -->|Approved| O{2 Approvals?}
O -->|No| L
O -->|Yes| P[Merge PR]
P --> Q[Delete Branch]
Q --> R[Slack Notification Sent]
```
Common Scenarios
Scenario: PR has merge conflicts
- Pull latest main: `git checkout main && git pull`
- Checkout your branch: `git checkout your-feature-branch`
- Merge main: `git merge main`
- Resolve conflicts in your editor
- Commit resolution: `git commit`
- Push: `git push`
Scenario: Reviewer suggests major changes
- Discuss in PR comments or Slack first
- If agreed, make a new commit with changes
- If still large, consider:
- Creating a new PR with the original approach
- Opening a follow-up issue for the improvements
Scenario: CI fails but works locally
- Check CI logs for specific error
- Common causes:
- Different Python/Node version
- Missing environment variables
- Test depends on local state
- Fix and push again
Scenario: Need to update after long review
If your PR has been open for days:
- Pull latest main
- Rebase your branch: `git rebase main`
- Resolve conflicts if any
- Force push: `git push --force-with-lease`
- Re-request reviews if significant changes
Next Steps
- Review `git-workflow.md` for commit and branch standards
- Check `feature-development.md` for implementation workflow
- See `deployment.md` for post-merge deployment