chore: initial commit with CI pipeline, review and tasks docs
.gemini/commands/analyze.toml (new file, 105 lines)
@@ -0,0 +1,105 @@
description = "Perform a non-destructive cross-artifact consistency and quality analysis across spec.md, plan.md, and tasks.md after task generation."

prompt = """
---
description: Perform a non-destructive cross-artifact consistency and quality analysis across spec.md, plan.md, and tasks.md after task generation.
---

The user input to you can be provided directly by the agent or as a command argument - you **MUST** consider it before proceeding with the prompt (if not empty).

User input:

$ARGUMENTS

Goal: Identify inconsistencies, duplications, ambiguities, and underspecified items across the three core artifacts (`spec.md`, `plan.md`, `tasks.md`) before implementation. This command MUST run only after `/tasks` has successfully produced a complete `tasks.md`.

STRICTLY READ-ONLY: Do **not** modify any files. Output a structured analysis report. Offer an optional remediation plan (the user must explicitly approve it before any follow-up editing commands are invoked manually).

Constitution Authority: The project constitution (`.specify/memory/constitution.md`) is **non-negotiable** within this analysis scope. Constitution conflicts are automatically CRITICAL and require adjustment of the spec, plan, or tasks—not dilution, reinterpretation, or silent ignoring of the principle. If a principle itself needs to change, that must occur in a separate, explicit constitution update outside `/analyze`.

Execution steps:

1. Run `.specify/scripts/bash/check-prerequisites.sh --json --require-tasks --include-tasks` once from repo root and parse JSON for FEATURE_DIR and AVAILABLE_DOCS. Derive absolute paths:
   - SPEC = FEATURE_DIR/spec.md
   - PLAN = FEATURE_DIR/plan.md
   - TASKS = FEATURE_DIR/tasks.md
   Abort with an error message if any required file is missing (instruct the user to run the missing prerequisite command).

2. Load artifacts:
   - Parse spec.md sections: Overview/Context, Functional Requirements, Non-Functional Requirements, User Stories, Edge Cases (if present).
   - Parse plan.md: Architecture/stack choices, Data Model references, Phases, Technical constraints.
   - Parse tasks.md: Task IDs, descriptions, phase grouping, parallel markers [P], referenced file paths.
   - Load constitution `.specify/memory/constitution.md` for principle validation.

3. Build internal semantic models:
   - Requirements inventory: Each functional + non-functional requirement with a stable key (derive a slug from the imperative phrase; e.g., "User can upload file" -> `user-can-upload-file`).
   - User story/action inventory.
   - Task coverage mapping: Map each task to one or more requirements or stories (inference by keyword / explicit reference patterns like IDs or key phrases).
   - Constitution rule set: Extract principle names and any MUST/SHOULD normative statements.
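The stable-key derivation in step 3 can be sketched as a small slug function (a minimal illustration of the convention described above, not part of the command itself):

```python
import re

def requirement_slug(phrase: str) -> str:
    """Derive a stable, URL-safe key from an imperative requirement phrase."""
    slug = phrase.lower()
    # collapse any run of punctuation/whitespace into a single hyphen
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

print(requirement_slug("User can upload file"))  # -> user-can-upload-file
```

Because the slug depends only on the phrase text, rerunning the analysis without spec changes yields the same keys, which supports the determinism rule further down.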
4. Detection passes:
   A. Duplication detection:
      - Identify near-duplicate requirements. Mark the lower-quality phrasing for consolidation.
   B. Ambiguity detection:
      - Flag vague adjectives (fast, scalable, secure, intuitive, robust) lacking measurable criteria.
      - Flag unresolved placeholders (TODO, TKTK, ???, <placeholder>, etc.).
   C. Underspecification:
      - Requirements with verbs but no object or measurable outcome.
      - User stories missing acceptance criteria alignment.
      - Tasks referencing files or components not defined in spec/plan.
   D. Constitution alignment:
      - Any requirement or plan element conflicting with a MUST principle.
      - Missing mandated sections or quality gates from the constitution.
   E. Coverage gaps:
      - Requirements with zero associated tasks.
      - Tasks with no mapped requirement/story.
      - Non-functional requirements not reflected in tasks (e.g., performance, security).
   F. Inconsistency:
      - Terminology drift (same concept named differently across files).
      - Data entities referenced in plan but absent in spec (or vice versa).
      - Task ordering contradictions (e.g., integration tasks before foundational setup tasks without a dependency note).
      - Conflicting requirements (e.g., one requires Next.js while another specifies Vue as the framework).
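Pass E's coverage check reduces to a set difference between requirement keys and the keys referenced by tasks; a minimal sketch (the requirement and task data here are invented purely for illustration):

```python
requirements = {"user-can-upload-file", "user-can-delete-file", "audit-log-retention"}

# task ID -> requirement keys it covers (inferred by keyword or explicit reference)
task_coverage = {
    "T001": {"user-can-upload-file"},
    "T002": {"user-can-upload-file", "user-can-delete-file"},
    "T003": set(),  # a task with no mapped requirement/story
}

covered = set().union(*task_coverage.values())
uncovered_requirements = sorted(requirements - covered)   # requirements with zero tasks
unmapped_tasks = sorted(t for t, reqs in task_coverage.items() if not reqs)

print(uncovered_requirements)  # ['audit-log-retention']
print(unmapped_tasks)          # ['T003']
```

The same two collections feed the Coverage Summary Table and the Unmapped Tasks subsection of the report.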
5. Severity assignment heuristic:
   - CRITICAL: Violates a constitution MUST, missing core spec artifact, or a requirement with zero coverage that blocks baseline functionality.
   - HIGH: Duplicate or conflicting requirement, ambiguous security/performance attribute, untestable acceptance criterion.
   - MEDIUM: Terminology drift, missing non-functional task coverage, underspecified edge case.
   - LOW: Style/wording improvements, minor redundancy not affecting execution order.

6. Produce a Markdown report (no file writes) with sections:

   ### Specification Analysis Report

   | ID | Category | Severity | Location(s) | Summary | Recommendation |
   |----|----------|----------|-------------|---------|----------------|
   | A1 | Duplication | HIGH | spec.md:L120-134 | Two similar requirements ... | Merge phrasing; keep clearer version |

   (Add one row per finding; generate stable IDs prefixed by the category initial.)

   Additional subsections:
   - Coverage Summary Table:

     | Requirement Key | Has Task? | Task IDs | Notes |

   - Constitution Alignment Issues (if any)
   - Unmapped Tasks (if any)
   - Metrics:
     * Total Requirements
     * Total Tasks
     * Coverage % (requirements with >=1 task)
     * Ambiguity Count
     * Duplication Count
     * Critical Issues Count

7. At the end of the report, output a concise Next Actions block:
   - If CRITICAL issues exist: Recommend resolving them before `/implement`.
   - If only LOW/MEDIUM: The user may proceed, but provide improvement suggestions.
   - Provide explicit command suggestions: e.g., "Run /specify with refinement", "Run /plan to adjust architecture", "Manually edit tasks.md to add coverage for 'performance-metrics'".

8. Ask the user: "Would you like me to suggest concrete remediation edits for the top N issues?" (Do NOT apply them automatically.)

Behavior rules:
- NEVER modify files.
- NEVER hallucinate missing sections—if absent, report them.
- KEEP findings deterministic: if rerun without changes, produce consistent IDs and counts.
- LIMIT total findings in the main table to 50; aggregate the remainder in a summarized overflow note.
- If zero issues are found, emit a success report with coverage statistics and a proceed recommendation.

Context: {{args}}
"""
.gemini/commands/clarify.toml (new file, 162 lines)
@@ -0,0 +1,162 @@
description = "Identify underspecified areas in the current feature spec by asking up to 5 highly targeted clarification questions and encoding answers back into the spec."

prompt = """
---
description: Identify underspecified areas in the current feature spec by asking up to 5 highly targeted clarification questions and encoding answers back into the spec.
---

The user input to you can be provided directly by the agent or as a command argument - you **MUST** consider it before proceeding with the prompt (if not empty).

User input:

$ARGUMENTS

Goal: Detect and reduce ambiguity or missing decision points in the active feature specification and record the clarifications directly in the spec file.

Note: This clarification workflow is expected to run (and be completed) BEFORE invoking `/plan`. If the user explicitly states they are skipping clarification (e.g., an exploratory spike), you may proceed, but must warn that downstream rework risk increases.

Execution steps:

1. Run `.specify/scripts/bash/check-prerequisites.sh --json --paths-only` from repo root **once** (combined `--json --paths-only` mode; `-Json -PathsOnly` for the PowerShell variant). Parse the minimal JSON payload fields:
   - `FEATURE_DIR`
   - `FEATURE_SPEC`
   - (Optionally capture `IMPL_PLAN`, `TASKS` for future chained flows.)
   - If JSON parsing fails, abort and instruct the user to re-run `/specify` or verify the feature branch environment.
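Step 1's parse-and-abort behavior can be sketched as follows (the payload shape is assumed from the field names above, and the sample paths are invented):

```python
import json
import sys

# sample output from the prerequisite script (illustrative values)
raw = '{"FEATURE_DIR": "/repo/specs/001-upload", "FEATURE_SPEC": "/repo/specs/001-upload/spec.md"}'

try:
    payload = json.loads(raw)
    feature_dir = payload["FEATURE_DIR"]
    feature_spec = payload["FEATURE_SPEC"]
except (json.JSONDecodeError, KeyError) as err:
    # abort path described in the step: bad JSON or missing fields
    sys.exit(f"Prerequisite check failed ({err}); re-run /specify or verify the feature branch.")

print(feature_spec)  # -> /repo/specs/001-upload/spec.md
```

Catching both the decode error and a missing key covers the two failure modes the step cares about.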
2. Load the current spec file. Perform a structured ambiguity & coverage scan using this taxonomy. For each category, mark status: Clear / Partial / Missing. Produce an internal coverage map used for prioritization (do not output the raw map unless no questions will be asked).

   Functional Scope & Behavior:
   - Core user goals & success criteria
   - Explicit out-of-scope declarations
   - User roles / personas differentiation

   Domain & Data Model:
   - Entities, attributes, relationships
   - Identity & uniqueness rules
   - Lifecycle/state transitions
   - Data volume / scale assumptions

   Interaction & UX Flow:
   - Critical user journeys / sequences
   - Error/empty/loading states
   - Accessibility or localization notes

   Non-Functional Quality Attributes:
   - Performance (latency, throughput targets)
   - Scalability (horizontal/vertical, limits)
   - Reliability & availability (uptime, recovery expectations)
   - Observability (logging, metrics, tracing signals)
   - Security & privacy (authN/Z, data protection, threat assumptions)
   - Compliance / regulatory constraints (if any)

   Integration & External Dependencies:
   - External services/APIs and failure modes
   - Data import/export formats
   - Protocol/versioning assumptions

   Edge Cases & Failure Handling:
   - Negative scenarios
   - Rate limiting / throttling
   - Conflict resolution (e.g., concurrent edits)

   Constraints & Tradeoffs:
   - Technical constraints (language, storage, hosting)
   - Explicit tradeoffs or rejected alternatives

   Terminology & Consistency:
   - Canonical glossary terms
   - Avoided synonyms / deprecated terms

   Completion Signals:
   - Acceptance criteria testability
   - Measurable Definition of Done style indicators

   Misc / Placeholders:
   - TODO markers / unresolved decisions
   - Ambiguous adjectives ("robust", "intuitive") lacking quantification

   For each category with Partial or Missing status, add a candidate question opportunity unless:
   - Clarification would not materially change implementation or validation strategy
   - The information is better deferred to the planning phase (note this internally)
3. Generate (internally) a prioritized queue of candidate clarification questions (maximum 5). Do NOT output them all at once. Apply these constraints:
   - Maximum of 5 total questions across the whole session.
   - Each question must be answerable with EITHER:
     * A short multiple‑choice selection (2–5 distinct, mutually exclusive options), OR
     * A one-word / short‑phrase answer (explicitly constrain: "Answer in <=5 words").
   - Only include questions whose answers materially impact architecture, data modeling, task decomposition, test design, UX behavior, operational readiness, or compliance validation.
   - Ensure category coverage balance: attempt to cover the highest-impact unresolved categories first; avoid asking two low-impact questions when a single high-impact area (e.g., security posture) is unresolved.
   - Exclude questions already answered, trivial stylistic preferences, or plan-level execution details (unless they block correctness).
   - Favor clarifications that reduce downstream rework risk or prevent misaligned acceptance tests.
   - If more than 5 categories remain unresolved, select the top 5 by an (Impact * Uncertainty) heuristic.
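The (Impact * Uncertainty) selection in step 3 amounts to a sort over scored categories; a minimal sketch (the category names and 1-5 scores are invented for illustration):

```python
# category -> (impact, uncertainty), each on an illustrative 1-5 scale
scores = {
    "security posture": (5, 4),
    "data model": (4, 3),
    "terminology": (2, 2),
    "ux error states": (3, 4),
    "compliance": (5, 1),
    "scalability": (3, 3),
}

MAX_QUESTIONS = 5
# highest impact*uncertainty product first, capped at the session quota
queue = sorted(scores, key=lambda c: scores[c][0] * scores[c][1], reverse=True)[:MAX_QUESTIONS]
print(queue[0])  # -> security posture
```

Note how a high-impact but well-understood category ("compliance", product 5) ranks below a moderately uncertain one, matching the intent of asking about what is both consequential and unresolved.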
4. Sequential questioning loop (interactive):
   - Present EXACTLY ONE question at a time.
   - For multiple‑choice questions render the options as a Markdown table:

     | Option | Description |
     |--------|-------------|
     | A | <Option A description> |
     | B | <Option B description> |
     | C | <Option C description> | (add D/E as needed up to 5)
     | Short | Provide a different short answer (<=5 words) | (Include only if a free-form alternative is appropriate)

   - For short‑answer style (no meaningful discrete options), output a single line after the question: `Format: Short answer (<=5 words)`.
   - After the user answers:
     * Validate that the answer maps to one option or fits the <=5 word constraint.
     * If ambiguous, ask for a quick disambiguation (this still counts toward the same question; do not advance).
     * Once satisfactory, record it in working memory (do not yet write to disk) and move to the next queued question.
   - Stop asking further questions when:
     * All critical ambiguities are resolved early (remaining queued items become unnecessary), OR
     * The user signals completion ("done", "good", "no more"), OR
     * You reach 5 asked questions.
   - Never reveal future queued questions in advance.
   - If no valid questions exist at the start, immediately report that there are no critical ambiguities.
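The answer-validation rule in step 4 (an option letter, or a free-form reply of at most five words) can be sketched as (the option set is illustrative):

```python
def answer_is_valid(answer: str, options: set) -> bool:
    """Accept an option letter or a free-form reply of at most five words."""
    normalized = answer.strip()
    if normalized.upper() in options:
        return True
    return 0 < len(normalized.split()) <= 5

options = {"A", "B", "C"}
print(answer_is_valid("b", options))                          # True: maps to option B
print(answer_is_valid("OAuth with refresh tokens", options))  # True: 4 words
print(answer_is_valid("a very long rambling answer that exceeds the limit", options))  # False
```

A reply failing both checks triggers the disambiguation retry, which per the step does not advance the question counter.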
5. Integration after EACH accepted answer (incremental update approach):
   - Maintain an in-memory representation of the spec (loaded once at start) plus the raw file contents.
   - For the first integrated answer in this session:
     * Ensure a `## Clarifications` section exists (create it just after the highest-level contextual/overview section per the spec template if missing).
     * Under it, create (if not present) a `### Session YYYY-MM-DD` subheading for today.
   - Append a bullet line immediately after acceptance: `- Q: <question> → A: <final answer>`.
   - Then immediately apply the clarification to the most appropriate section(s):
     * Functional ambiguity → Update or add a bullet in Functional Requirements.
     * User interaction / actor distinction → Update the User Stories or Actors subsection (if present) with the clarified role, constraint, or scenario.
     * Data shape / entities → Update the Data Model (add fields, types, relationships) preserving ordering; note added constraints succinctly.
     * Non-functional constraint → Add/modify measurable criteria in the Non-Functional / Quality Attributes section (convert the vague adjective to a metric or explicit target).
     * Edge case / negative flow → Add a new bullet under Edge Cases / Error Handling (or create such a subsection if the template provides a placeholder for it).
     * Terminology conflict → Normalize the term across the spec; retain the original only if necessary by adding `(formerly referred to as "X")` once.
   - If the clarification invalidates an earlier ambiguous statement, replace that statement instead of duplicating it; leave no obsolete contradictory text.
   - Save the spec file AFTER each integration to minimize the risk of context loss (atomic overwrite).
   - Preserve formatting: do not reorder unrelated sections; keep the heading hierarchy intact.
   - Keep each inserted clarification minimal and testable (avoid narrative drift).

6. Validation (performed after EACH write plus a final pass):
   - The Clarifications session contains exactly one bullet per accepted answer (no duplicates).
   - Total asked (accepted) questions ≤ 5.
   - Updated sections contain no lingering vague placeholders the new answer was meant to resolve.
   - No contradictory earlier statement remains (scan for now-invalid alternative choices and remove them).
   - Markdown structure is valid; the only allowed new headings are `## Clarifications` and `### Session YYYY-MM-DD`.
   - Terminology consistency: the same canonical term is used across all updated sections.

7. Write the updated spec back to `FEATURE_SPEC`.

8. Report completion (after the questioning loop ends or terminates early):
   - Number of questions asked & answered.
   - Path to the updated spec.
   - Sections touched (list names).
   - Coverage summary table listing each taxonomy category with Status: Resolved (was Partial/Missing and addressed), Deferred (exceeds question quota or better suited for planning), Clear (already sufficient), Outstanding (still Partial/Missing but low impact).
   - If any Outstanding or Deferred items remain, recommend whether to proceed to `/plan` or run `/clarify` again later post-plan.
   - Suggested next command.

Behavior rules:
- If no meaningful ambiguities are found (or all potential questions would be low-impact), respond: "No critical ambiguities detected worth formal clarification." and suggest proceeding.
- If the spec file is missing, instruct the user to run `/specify` first (do not create a new spec here).
- Never exceed 5 total asked questions (clarification retries for a single question do not count as new questions).
- Avoid speculative tech stack questions unless their absence blocks functional clarity.
- Respect user early termination signals ("stop", "done", "proceed").
- If no questions were asked due to full coverage, output a compact coverage summary (all categories Clear) and then suggest advancing.
- If the quota is reached with unresolved high-impact categories remaining, explicitly flag them under Deferred with rationale.

Context for prioritization: {{args}}
"""
.gemini/commands/constitution.toml (new file, 77 lines)
@@ -0,0 +1,77 @@
description = "Create or update the project constitution from interactive or provided principle inputs, ensuring all dependent templates stay in sync."

prompt = """
---
description: Create or update the project constitution from interactive or provided principle inputs, ensuring all dependent templates stay in sync.
---

The user input to you can be provided directly by the agent or as a command argument - you **MUST** consider it before proceeding with the prompt (if not empty).

User input:

$ARGUMENTS

You are updating the project constitution at `.specify/memory/constitution.md`. This file is a TEMPLATE containing placeholder tokens in square brackets (e.g. `[PROJECT_NAME]`, `[PRINCIPLE_1_NAME]`). Your job is to (a) collect/derive concrete values, (b) fill the template precisely, and (c) propagate any amendments across dependent artifacts.

Follow this execution flow:

1. Load the existing constitution template at `.specify/memory/constitution.md`.
   - Identify every placeholder token of the form `[ALL_CAPS_IDENTIFIER]`.
   **IMPORTANT**: The user might require fewer or more principles than the template uses. If a number is specified, respect it while following the general template structure, and update the document accordingly.

2. Collect/derive values for placeholders:
   - If user input (conversation) supplies a value, use it.
   - Otherwise infer from existing repo context (README, docs, prior constitution versions if embedded).
   - For governance dates: `RATIFICATION_DATE` is the original adoption date (if unknown, ask or mark TODO); `LAST_AMENDED_DATE` is today if changes are made, otherwise keep the previous value.
   - `CONSTITUTION_VERSION` must increment according to semantic versioning rules:
     * MAJOR: Backward-incompatible governance/principle removals or redefinitions.
     * MINOR: New principle/section added or materially expanded guidance.
     * PATCH: Clarifications, wording, typo fixes, non-semantic refinements.
   - If the version bump type is ambiguous, propose your reasoning before finalizing.
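The version increment in step 2 follows ordinary semantic versioning; a minimal sketch of the mechanics (classifying a change as major/minor/patch remains the judgment call the step describes):

```python
def bump_version(version: str, kind: str) -> str:
    """Increment a MAJOR.MINOR.PATCH constitution version."""
    major, minor, patch = map(int, version.split("."))
    if kind == "major":   # incompatible principle removal or redefinition
        return f"{major + 1}.0.0"
    if kind == "minor":   # new principle/section or materially expanded guidance
        return f"{major}.{minor + 1}.0"
    return f"{major}.{minor}.{patch + 1}"  # clarifications, wording, typo fixes

print(bump_version("2.1.1", "minor"))  # -> 2.2.0
```

Resetting the lower components on a larger bump is what makes the version line in the Sync Impact Report unambiguous.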
3. Draft the updated constitution content:
   - Replace every placeholder with concrete text. Leave no bracketed tokens except intentionally retained template slots the project has chosen not to define yet—explicitly justify any that remain.
   - Preserve the heading hierarchy; comments can be removed once replaced unless they still add clarifying guidance.
   - Ensure each Principle section has: a succinct name line, a paragraph (or bullet list) capturing the non‑negotiable rules, and an explicit rationale if not obvious.
   - Ensure the Governance section lists the amendment procedure, versioning policy, and compliance review expectations.

4. Consistency propagation checklist (convert the prior checklist into active validations):
   - Read `.specify/templates/plan-template.md` and ensure any "Constitution Check" or rules align with the updated principles.
   - Read `.specify/templates/spec-template.md` for scope/requirements alignment—update it if the constitution adds/removes mandatory sections or constraints.
   - Read `.specify/templates/tasks-template.md` and ensure task categorization reflects new or removed principle-driven task types (e.g., observability, versioning, testing discipline).
   - Read each command file in `.specify/templates/commands/*.md` (including this one) to verify no outdated references (agent-specific names like CLAUDE only) remain where generic guidance is required.
   - Read any runtime guidance docs (e.g., `README.md`, `docs/quickstart.md`, or agent-specific guidance files if present). Update references to any changed principles.

5. Produce a Sync Impact Report (prepend it as an HTML comment at the top of the constitution file after the update):
   - Version change: old → new
   - List of modified principles (old title → new title if renamed)
   - Added sections
   - Removed sections
   - Templates requiring updates (✅ updated / ⚠ pending) with file paths
   - Follow-up TODOs if any placeholders were intentionally deferred.

6. Validation before final output:
   - No remaining unexplained bracket tokens.
   - Version line matches the report.
   - Dates are in ISO format (YYYY-MM-DD).
   - Principles are declarative, testable, and free of vague language (replace "should" with MUST/SHOULD plus rationale where appropriate).

7. Write the completed constitution back to `.specify/memory/constitution.md` (overwrite).

8. Output a final summary to the user with:
   - New version and bump rationale.
   - Any files flagged for manual follow-up.
   - Suggested commit message (e.g., `docs: amend constitution to vX.Y.Z (principle additions + governance update)`).

Formatting & Style Requirements:
- Use Markdown headings exactly as in the template (do not demote/promote levels).
- Wrap long rationale lines for readability (ideally <100 chars), but do not hard-enforce this with awkward breaks.
- Keep a single blank line between sections.
- Avoid trailing whitespace.

If the user supplies partial updates (e.g., only one principle revision), still perform the validation and version-decision steps.

If critical info is missing (e.g., the ratification date is truly unknown), insert `TODO(<FIELD_NAME>): explanation` and include it in the Sync Impact Report under deferred items.

Do not create a new template; always operate on the existing `.specify/memory/constitution.md` file.
"""
.gemini/commands/implement.toml (new file, 60 lines)
@@ -0,0 +1,60 @@
description = "Execute the implementation plan by processing and executing all tasks defined in tasks.md"

prompt = """
---
description: Execute the implementation plan by processing and executing all tasks defined in tasks.md
---

The user input can be provided directly by the agent or as a command argument - you **MUST** consider it before proceeding with the prompt (if not empty).

User input:

$ARGUMENTS

1. Run `.specify/scripts/bash/check-prerequisites.sh --json --require-tasks --include-tasks` from repo root and parse FEATURE_DIR and the AVAILABLE_DOCS list. All paths must be absolute.

2. Load and analyze the implementation context:
   - **REQUIRED**: Read tasks.md for the complete task list and execution plan
   - **REQUIRED**: Read plan.md for tech stack, architecture, and file structure
   - **IF EXISTS**: Read data-model.md for entities and relationships
   - **IF EXISTS**: Read contracts/ for API specifications and test requirements
   - **IF EXISTS**: Read research.md for technical decisions and constraints
   - **IF EXISTS**: Read quickstart.md for integration scenarios

3. Parse tasks.md structure and extract:
   - **Task phases**: Setup, Tests, Core, Integration, Polish
   - **Task dependencies**: Sequential vs parallel execution rules
   - **Task details**: ID, description, file paths, parallel markers [P]
   - **Execution flow**: Order and dependency requirements
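The extraction in step 3 amounts to matching the checkbox/ID/[P] pattern on each task line; a minimal sketch (the exact tasks.md line shape is an assumption inferred from the markers described above):

```python
import re

# assumed line shape: "- [ ] T001 [P] Description touching some/file/path"
TASK_RE = re.compile(r"- \[(?P<done>[ xX])\] (?P<id>T\d+)(?P<par> \[P\])? (?P<desc>.+)")

line = "- [ ] T003 [P] Create User model in src/models/user.py"
m = TASK_RE.match(line)
assert m is not None
task = {
    "id": m.group("id"),
    "parallel": m.group("par") is not None,   # [P] marker -> may run with other [P] tasks
    "done": m.group("done").lower() == "x",   # [X] marks a completed task
    "description": m.group("desc"),
}
print(task["id"], task["parallel"])  # -> T003 True
```

The same `done` flag is what step 6 below updates when a task is marked off as [X].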
4. Execute implementation following the task plan:
   - **Phase-by-phase execution**: Complete each phase before moving to the next
   - **Respect dependencies**: Run sequential tasks in order; parallel tasks [P] can run together
   - **Follow TDD approach**: Execute test tasks before their corresponding implementation tasks
   - **File-based coordination**: Tasks affecting the same files must run sequentially
   - **Validation checkpoints**: Verify each phase is complete before proceeding

5. Implementation execution rules:
   - **Setup first**: Initialize project structure, dependencies, configuration
   - **Tests before code**: Write tests for contracts, entities, and integration scenarios before implementing them
   - **Core development**: Implement models, services, CLI commands, endpoints
   - **Integration work**: Database connections, middleware, logging, external services
   - **Polish and validation**: Unit tests, performance optimization, documentation

6. Progress tracking and error handling:
   - Report progress after each completed task
   - Halt execution if any non-parallel task fails
   - For parallel tasks [P], continue with successful tasks and report failed ones
   - Provide clear error messages with context for debugging
   - Suggest next steps if implementation cannot proceed
   - **IMPORTANT**: Mark each completed task off as [X] in the tasks file.

7. Completion validation:
   - Verify all required tasks are completed
   - Check that implemented features match the original specification
   - Validate that tests pass and coverage meets requirements
   - Confirm the implementation follows the technical plan
   - Report final status with a summary of completed work

Note: This command assumes a complete task breakdown exists in tasks.md. If tasks are incomplete or missing, suggest running `/tasks` first to regenerate the task list.
"""
.gemini/commands/plan.toml (new file, 47 lines)
@@ -0,0 +1,47 @@
description = "Execute the implementation planning workflow using the plan template to generate design artifacts."

prompt = """
---
description: Execute the implementation planning workflow using the plan template to generate design artifacts.
---

The user input to you can be provided directly by the agent or as a command argument - you **MUST** consider it before proceeding with the prompt (if not empty).

User input:

$ARGUMENTS

Given the implementation details provided as an argument, do this:

1. Run `.specify/scripts/bash/setup-plan.sh --json` from the repo root and parse JSON for FEATURE_SPEC, IMPL_PLAN, SPECS_DIR, BRANCH. All future file paths must be absolute.
   - BEFORE proceeding, inspect FEATURE_SPEC for a `## Clarifications` section with at least one `Session` subheading. If it is missing, or clearly ambiguous areas remain (vague adjectives, unresolved critical choices), PAUSE and instruct the user to run `/clarify` first to reduce rework. Only continue if: (a) Clarifications exist OR (b) an explicit user override is provided (e.g., "proceed without clarification"). Do not attempt to fabricate clarifications yourself.
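The gate in step 1 reduces to scanning the spec text for the two clarification headings; a minimal sketch (heading names are taken from the step above; the sample spec content is invented):

```python
import re

def has_clarifications(spec_text: str) -> bool:
    """True when the spec has a Clarifications section with at least one Session subheading."""
    if "## Clarifications" not in spec_text:
        return False
    # a session heading like "### Session 2025-01-15"
    return re.search(r"^### Session \d{4}-\d{2}-\d{2}", spec_text, re.MULTILINE) is not None

clarified = "# Feature\n## Clarifications\n### Session 2025-01-15\n- Q: scope? → A: uploads only\n"
print(has_clarifications(clarified))                   # True: gate passes
print(has_clarifications("# Feature\n## Overview\n"))  # False: pause and suggest /clarify
```

The heading check is deliberately narrow; judging whether "clearly ambiguous areas remain" stays with the agent, as the step says.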
2. Read and analyze the feature specification to understand:
|
||||||
|
- The feature requirements and user stories
|
||||||
|
- Functional and non-functional requirements
|
||||||
|
- Success criteria and acceptance criteria
|
||||||
|
- Any technical constraints or dependencies mentioned
|
||||||
|
|
||||||
|
3. Read the constitution at `.specify/memory/constitution.md` to understand constitutional requirements.
|
||||||
|
|
||||||
|
4. Execute the implementation plan template:
|
||||||
|
- Load `.specify/templates/plan-template.md` (already copied to IMPL_PLAN path)
|
||||||
|
- Set Input path to FEATURE_SPEC
|
||||||
|
- Run the Execution Flow (main) function steps 1-9
|
||||||
|
- The template is self-contained and executable
|
||||||
|
- Follow error handling and gate checks as specified
|
||||||
|
- Let the template guide artifact generation in $SPECS_DIR:
|
||||||
|
* Phase 0 generates research.md
|
||||||
|
* Phase 1 generates data-model.md, contracts/, quickstart.md
|
||||||
|
* Phase 2 generates tasks.md
|
||||||
|
- Incorporate user-provided details from arguments into Technical Context: {{args}}
|
||||||
|
- Update Progress Tracking as you complete each phase
|
||||||
|
|
||||||
|
5. Verify execution completed:
|
||||||
|
- Check Progress Tracking shows all phases complete
|
||||||
|
- Ensure all required artifacts were generated
|
||||||
|
- Confirm no ERROR states in execution
|
||||||
|
|
||||||
|
6. Report results with branch name, file paths, and generated artifacts.
|
||||||
|
|
||||||
|
Use absolute paths with the repository root for all file operations to avoid path issues.
|
||||||
|
"""
|
||||||
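The JSON payload emitted by `setup-plan.sh --json` can be parsed without extra dependencies. A minimal sketch, using a hypothetical payload and POSIX tools only:

```shell
# Hypothetical JSON payload in the shape produced by setup-plan.sh --json
json='{"FEATURE_SPEC":"/repo/specs/001-demo/spec.md","IMPL_PLAN":"/repo/specs/001-demo/plan.md","SPECS_DIR":"/repo/specs/001-demo","BRANCH":"001-demo"}'

# Extract a string field by key (assumes values contain no escaped quotes)
get_field() {
    printf '%s' "$json" | sed -n "s/.*\"$1\":\"\([^\"]*\)\".*/\1/p"
}

FEATURE_SPEC=$(get_field FEATURE_SPEC)
BRANCH=$(get_field BRANCH)
echo "Planning on branch $BRANCH using $FEATURE_SPEC"
```

A tool like `jq` would be more robust, but this keeps the workflow dependency-free.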
25
.gemini/commands/specify.toml
Normal file
@@ -0,0 +1,25 @@
description = "Create or update the feature specification from a natural language feature description."

prompt = """
---
description: Create or update the feature specification from a natural language feature description.
---

The user input to you can be provided directly by the agent or as a command argument - you **MUST** consider it before proceeding with the prompt (if not empty).

User input:

$ARGUMENTS

The text the user typed after `/specify` in the triggering message **is** the feature description. Assume you always have it available in this conversation even if `{{args}}` appears literally below. Do not ask the user to repeat it unless they provided an empty command.

Given that feature description, do this:

1. Run the script `.specify/scripts/bash/create-new-feature.sh --json "{{args}}"` from repo root and parse its JSON output for BRANCH_NAME and SPEC_FILE. All file paths must be absolute.
   **IMPORTANT**: You must only ever run this script once. The JSON is provided as terminal output - always refer to it for the actual values you need.
2. Load `.specify/templates/spec-template.md` to understand the required sections.
3. Write the specification to SPEC_FILE using the template structure, replacing placeholders with concrete details derived from the feature description (arguments) while preserving section order and headings.
4. Report completion with branch name, spec file path, and readiness for the next phase.

Note: The script creates and checks out the new branch and initializes the spec file before writing.
"""
66
.gemini/commands/tasks.toml
Normal file
@@ -0,0 +1,66 @@
description = "Generate an actionable, dependency-ordered tasks.md for the feature based on available design artifacts."

prompt = """
---
description: Generate an actionable, dependency-ordered tasks.md for the feature based on available design artifacts.
---

The user input to you can be provided directly by the agent or as a command argument - you **MUST** consider it before proceeding with the prompt (if not empty).

User input:

$ARGUMENTS

1. Run `.specify/scripts/bash/check-prerequisites.sh --json` from repo root and parse FEATURE_DIR and the AVAILABLE_DOCS list. All paths must be absolute.
2. Load and analyze available design documents:
   - Always read plan.md for the tech stack and libraries
   - IF EXISTS: Read data-model.md for entities
   - IF EXISTS: Read contracts/ for API endpoints
   - IF EXISTS: Read research.md for technical decisions
   - IF EXISTS: Read quickstart.md for test scenarios

   Note: Not all projects have all documents. For example:
   - CLI tools might not have contracts/
   - Simple libraries might not need data-model.md
   - Generate tasks based on what's available

3. Generate tasks following the template:
   - Use `.specify/templates/tasks-template.md` as the base
   - Replace example tasks with actual tasks based on:
     * **Setup tasks**: Project init, dependencies, linting
     * **Test tasks [P]**: One per contract, one per integration scenario
     * **Core tasks**: One per entity, service, CLI command, endpoint
     * **Integration tasks**: DB connections, middleware, logging
     * **Polish tasks [P]**: Unit tests, performance, docs

4. Task generation rules:
   - Each contract file → contract test task marked [P]
   - Each entity in data-model → model creation task marked [P]
   - Each endpoint → implementation task (not parallel if files are shared)
   - Each user story → integration test marked [P]
   - Different files = can run in parallel [P]
   - Same file = sequential (no [P])

5. Order tasks by dependencies:
   - Setup before everything
   - Tests before implementation (TDD)
   - Models before services
   - Services before endpoints
   - Core before integration
   - Everything before polish

6. Include parallel execution examples:
   - Group [P] tasks that can run together
   - Show actual Task agent commands

7. Create FEATURE_DIR/tasks.md with:
   - The correct feature name from the implementation plan
   - Numbered tasks (T001, T002, etc.)
   - Clear file paths for each task
   - Dependency notes
   - Parallel execution guidance

Context for task generation: {{args}}

The tasks.md should be immediately executable - each task must be specific enough that an LLM can complete it without additional context.
"""
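The file-overlap rule in step 4 can be sketched in shell: a task keeps its `[P]` marker only if no earlier task touched any of its files (task IDs and paths below are hypothetical):

```shell
# Each argument has the form TASKID:file1,file2
mark_parallel() {
    seen=""
    for entry in "$@"; do
        tid=${entry%%:*}
        files=$(printf '%s' "${entry#*:}" | tr ',' ' ')
        flag="[P]"
        # Drop the [P] flag if any of this task's files was already claimed
        for f in $files; do
            case " $seen " in *" $f "*) flag="" ;; esac
        done
        seen="$seen $files"
        echo "$tid $flag"
    done
}

mark_parallel "T001:models/user.py" "T002:services/auth.py" "T003:models/user.py"
```

Here T001 and T002 come out `[P]` (disjoint files) while T003 does not, because it shares `models/user.py` with T001 and must run sequentially after it.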
30
.specify/memory/constitution.md
Normal file
@@ -0,0 +1,30 @@
# Mosquito (蚊子) Project Development Constitution

## Core Principles

### I. Test-Driven Development (TDD)
Every new feature must have user-approved unit and integration tests written and passing before implementation begins. Strictly follow the red-green-refactor cycle.

### II. API First
All feature work starts from API design. Before any backend implementation, the API must be designed and documented (e.g., in OpenAPI format) and pass review.

### III. Separation of Concerns
Frontend and backend code remain strictly separated and interact only through well-defined API contracts, so that the two can be developed, tested, and deployed independently.

### IV. Clean Code
Code follows the Google Java Style Guide and stays highly readable and maintainable. All public APIs and complex business logic must carry clear Javadoc documentation.

## Development Workflow

All development follows the spec-driven workflow defined by `spec-kit`:
1. Create a feature specification with `/specify`.
2. Produce an implementation plan with `/plan`.
3. Generate development tasks with `/tasks`.
4. Start a task with `/tasks start <number>` and commit the code.
5. All code changes go through a Pull Request reviewed by at least one other team member.

## Governance

This constitution is the highest authority for project development and takes precedence over all other practices and personal preferences. Any amendment must be discussed by the team, documented, and approved.

**Version**: 1.0 | **Approved**: 2025-09-29 | **Last amended**: 2025-09-29
166
.specify/scripts/bash/check-prerequisites.sh
Executable file
@@ -0,0 +1,166 @@
#!/usr/bin/env bash

# Consolidated prerequisite checking script
#
# This script provides unified prerequisite checking for the Spec-Driven Development workflow.
# It replaces the functionality previously spread across multiple scripts.
#
# Usage: ./check-prerequisites.sh [OPTIONS]
#
# OPTIONS:
#   --json              Output in JSON format
#   --require-tasks     Require tasks.md to exist (for implementation phase)
#   --include-tasks     Include tasks.md in AVAILABLE_DOCS list
#   --paths-only        Only output path variables (no validation)
#   --help, -h          Show help message
#
# OUTPUTS:
#   JSON mode:   {"FEATURE_DIR":"...", "AVAILABLE_DOCS":["..."]}
#   Text mode:   FEATURE_DIR:... \n AVAILABLE_DOCS: \n ✓/✗ file.md
#   Paths only:  REPO_ROOT: ... \n BRANCH: ... \n FEATURE_DIR: ... etc.

set -e

# Parse command line arguments
JSON_MODE=false
REQUIRE_TASKS=false
INCLUDE_TASKS=false
PATHS_ONLY=false

for arg in "$@"; do
    case "$arg" in
        --json)
            JSON_MODE=true
            ;;
        --require-tasks)
            REQUIRE_TASKS=true
            ;;
        --include-tasks)
            INCLUDE_TASKS=true
            ;;
        --paths-only)
            PATHS_ONLY=true
            ;;
        --help|-h)
            cat << 'EOF'
Usage: check-prerequisites.sh [OPTIONS]

Consolidated prerequisite checking for Spec-Driven Development workflow.

OPTIONS:
  --json              Output in JSON format
  --require-tasks     Require tasks.md to exist (for implementation phase)
  --include-tasks     Include tasks.md in AVAILABLE_DOCS list
  --paths-only        Only output path variables (no prerequisite validation)
  --help, -h          Show this help message

EXAMPLES:
  # Check task prerequisites (plan.md required)
  ./check-prerequisites.sh --json

  # Check implementation prerequisites (plan.md + tasks.md required)
  ./check-prerequisites.sh --json --require-tasks --include-tasks

  # Get feature paths only (no validation)
  ./check-prerequisites.sh --paths-only

EOF
            exit 0
            ;;
        *)
            echo "ERROR: Unknown option '$arg'. Use --help for usage information." >&2
            exit 1
            ;;
    esac
done

# Source common functions
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
source "$SCRIPT_DIR/common.sh"

# Get feature paths and validate branch
eval $(get_feature_paths)
check_feature_branch "$CURRENT_BRANCH" "$HAS_GIT" || exit 1

# If paths-only mode, output paths and exit (supports --json combined with --paths-only)
if $PATHS_ONLY; then
    if $JSON_MODE; then
        # Minimal JSON paths payload (no validation performed)
        printf '{"REPO_ROOT":"%s","BRANCH":"%s","FEATURE_DIR":"%s","FEATURE_SPEC":"%s","IMPL_PLAN":"%s","TASKS":"%s"}\n' \
            "$REPO_ROOT" "$CURRENT_BRANCH" "$FEATURE_DIR" "$FEATURE_SPEC" "$IMPL_PLAN" "$TASKS"
    else
        echo "REPO_ROOT: $REPO_ROOT"
        echo "BRANCH: $CURRENT_BRANCH"
        echo "FEATURE_DIR: $FEATURE_DIR"
        echo "FEATURE_SPEC: $FEATURE_SPEC"
        echo "IMPL_PLAN: $IMPL_PLAN"
        echo "TASKS: $TASKS"
    fi
    exit 0
fi

# Validate required directories and files
if [[ ! -d "$FEATURE_DIR" ]]; then
    echo "ERROR: Feature directory not found: $FEATURE_DIR" >&2
    echo "Run /specify first to create the feature structure." >&2
    exit 1
fi

if [[ ! -f "$IMPL_PLAN" ]]; then
    echo "ERROR: plan.md not found in $FEATURE_DIR" >&2
    echo "Run /plan first to create the implementation plan." >&2
    exit 1
fi

# Check for tasks.md if required
if $REQUIRE_TASKS && [[ ! -f "$TASKS" ]]; then
    echo "ERROR: tasks.md not found in $FEATURE_DIR" >&2
    echo "Run /tasks first to create the task list." >&2
    exit 1
fi

# Build list of available documents
docs=()

# Always check these optional docs
[[ -f "$RESEARCH" ]] && docs+=("research.md")
[[ -f "$DATA_MODEL" ]] && docs+=("data-model.md")

# Check contracts directory (only if it exists and has files)
if [[ -d "$CONTRACTS_DIR" ]] && [[ -n "$(ls -A "$CONTRACTS_DIR" 2>/dev/null)" ]]; then
    docs+=("contracts/")
fi

[[ -f "$QUICKSTART" ]] && docs+=("quickstart.md")

# Include tasks.md if requested and it exists
if $INCLUDE_TASKS && [[ -f "$TASKS" ]]; then
    docs+=("tasks.md")
fi

# Output results
if $JSON_MODE; then
    # Build JSON array of documents
    if [[ ${#docs[@]} -eq 0 ]]; then
        json_docs="[]"
    else
        json_docs=$(printf '"%s",' "${docs[@]}")
        json_docs="[${json_docs%,}]"
    fi

    printf '{"FEATURE_DIR":"%s","AVAILABLE_DOCS":%s}\n' "$FEATURE_DIR" "$json_docs"
else
    # Text output
    echo "FEATURE_DIR:$FEATURE_DIR"
    echo "AVAILABLE_DOCS:"

    # Show status of each potential document
    check_file "$RESEARCH" "research.md"
    check_file "$DATA_MODEL" "data-model.md"
    check_dir "$CONTRACTS_DIR" "contracts/"
    check_file "$QUICKSTART" "quickstart.md"

    if $INCLUDE_TASKS; then
        check_file "$TASKS" "tasks.md"
    fi
fi
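The AVAILABLE_DOCS array emitted in JSON mode can be consumed with plain POSIX tools. A minimal sketch, using a hypothetical payload (assumes document names contain no commas or quotes):

```shell
# Hypothetical payload in the shape of the script's JSON mode output
out='{"FEATURE_DIR":"/repo/specs/001-demo","AVAILABLE_DOCS":["research.md","data-model.md"]}'

# Pull the array body, strip the quotes, split on commas
docs=$(printf '%s' "$out" | sed -n 's/.*"AVAILABLE_DOCS":\[\(.*\)\].*/\1/p' | tr -d '"' | tr ',' ' ')

for d in $docs; do
    echo "found: $d"
done
```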
113
.specify/scripts/bash/common.sh
Executable file
@@ -0,0 +1,113 @@
#!/usr/bin/env bash
# Common functions and variables for all scripts

# Get repository root, with fallback for non-git repositories
get_repo_root() {
    if git rev-parse --show-toplevel >/dev/null 2>&1; then
        git rev-parse --show-toplevel
    else
        # Fall back to script location for non-git repos
        local script_dir="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
        (cd "$script_dir/../../.." && pwd)
    fi
}

# Get current branch, with fallback for non-git repositories
get_current_branch() {
    # First check if the SPECIFY_FEATURE environment variable is set
    if [[ -n "${SPECIFY_FEATURE:-}" ]]; then
        echo "$SPECIFY_FEATURE"
        return
    fi

    # Then check git if available
    if git rev-parse --abbrev-ref HEAD >/dev/null 2>&1; then
        git rev-parse --abbrev-ref HEAD
        return
    fi

    # For non-git repos, try to find the latest feature directory
    local repo_root=$(get_repo_root)
    local specs_dir="$repo_root/specs"

    if [[ -d "$specs_dir" ]]; then
        local latest_feature=""
        local highest=0

        for dir in "$specs_dir"/*; do
            if [[ -d "$dir" ]]; then
                local dirname=$(basename "$dir")
                if [[ "$dirname" =~ ^([0-9]{3})- ]]; then
                    local number=${BASH_REMATCH[1]}
                    number=$((10#$number))
                    if [[ "$number" -gt "$highest" ]]; then
                        highest=$number
                        latest_feature=$dirname
                    fi
                fi
            fi
        done

        if [[ -n "$latest_feature" ]]; then
            echo "$latest_feature"
            return
        fi
    fi

    echo "main"  # Final fallback
}

# Check if git is available
has_git() {
    git rev-parse --show-toplevel >/dev/null 2>&1
}

check_feature_branch() {
    local branch="$1"
    local has_git_repo="$2"

    # For non-git repos, we can't enforce branch naming but still provide output
    if [[ "$has_git_repo" != "true" ]]; then
        echo "[specify] Warning: Git repository not detected; skipped branch validation" >&2
        return 0
    fi

    if [[ ! "$branch" =~ ^[0-9]{3}- ]]; then
        echo "ERROR: Not on a feature branch. Current branch: $branch" >&2
        echo "Feature branches should be named like: 001-feature-name" >&2
        return 1
    fi

    return 0
}

get_feature_dir() { echo "$1/specs/$2"; }

get_feature_paths() {
    local repo_root=$(get_repo_root)
    local current_branch=$(get_current_branch)
    local has_git_repo="false"

    if has_git; then
        has_git_repo="true"
    fi

    local feature_dir=$(get_feature_dir "$repo_root" "$current_branch")

    cat <<EOF
REPO_ROOT='$repo_root'
CURRENT_BRANCH='$current_branch'
HAS_GIT='$has_git_repo'
FEATURE_DIR='$feature_dir'
FEATURE_SPEC='$feature_dir/spec.md'
IMPL_PLAN='$feature_dir/plan.md'
TASKS='$feature_dir/tasks.md'
RESEARCH='$feature_dir/research.md'
DATA_MODEL='$feature_dir/data-model.md'
QUICKSTART='$feature_dir/quickstart.md'
CONTRACTS_DIR='$feature_dir/contracts'
EOF
}

check_file() { [[ -f "$1" ]] && echo "  ✓ $2" || echo "  ✗ $2"; }
check_dir() { [[ -d "$1" && -n $(ls -A "$1" 2>/dev/null) ]] && echo "  ✓ $2" || echo "  ✗ $2"; }
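Callers consume `get_feature_paths` via `eval`. A minimal standalone sketch of that pattern, with hypothetical values, assuming the emitted values contain no single quotes:

```shell
# Stand-in for get_feature_paths: emit shell assignments on stdout
emit_paths() {
    cat <<EOF
REPO_ROOT='/repo'
CURRENT_BRANCH='001-demo'
FEATURE_DIR='/repo/specs/001-demo'
EOF
}

# eval turns the emitted text into variables in the caller's shell
eval "$(emit_paths)"
echo "$FEATURE_DIR (branch: $CURRENT_BRANCH)"
```

This lets one function compute all derived paths in one place while each caller gets plain shell variables.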
97
.specify/scripts/bash/create-new-feature.sh
Executable file
@@ -0,0 +1,97 @@
#!/usr/bin/env bash

set -e

JSON_MODE=false
ARGS=()
for arg in "$@"; do
    case "$arg" in
        --json) JSON_MODE=true ;;
        --help|-h) echo "Usage: $0 [--json] <feature_description>"; exit 0 ;;
        *) ARGS+=("$arg") ;;
    esac
done

FEATURE_DESCRIPTION="${ARGS[*]}"
if [ -z "$FEATURE_DESCRIPTION" ]; then
    echo "Usage: $0 [--json] <feature_description>" >&2
    exit 1
fi

# Find the repository root by searching upward for existing project markers
find_repo_root() {
    local dir="$1"
    while [ "$dir" != "/" ]; do
        if [ -d "$dir/.git" ] || [ -d "$dir/.specify" ]; then
            echo "$dir"
            return 0
        fi
        dir="$(dirname "$dir")"
    done
    return 1
}

# Resolve repository root. Prefer git information when available, but fall back
# to searching for repository markers so the workflow still functions in
# repositories that were initialised with --no-git.
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

if git rev-parse --show-toplevel >/dev/null 2>&1; then
    REPO_ROOT=$(git rev-parse --show-toplevel)
    HAS_GIT=true
else
    REPO_ROOT="$(find_repo_root "$SCRIPT_DIR")"
    if [ -z "$REPO_ROOT" ]; then
        echo "Error: Could not determine repository root. Please run this script from within the repository." >&2
        exit 1
    fi
    HAS_GIT=false
fi

cd "$REPO_ROOT"

SPECS_DIR="$REPO_ROOT/specs"
mkdir -p "$SPECS_DIR"

HIGHEST=0
if [ -d "$SPECS_DIR" ]; then
    for dir in "$SPECS_DIR"/*; do
        [ -d "$dir" ] || continue
        dirname=$(basename "$dir")
        number=$(echo "$dirname" | grep -o '^[0-9]\+' || echo "0")
        number=$((10#$number))
        if [ "$number" -gt "$HIGHEST" ]; then HIGHEST=$number; fi
    done
fi

NEXT=$((HIGHEST + 1))
FEATURE_NUM=$(printf "%03d" "$NEXT")

BRANCH_NAME=$(echo "$FEATURE_DESCRIPTION" | tr '[:upper:]' '[:lower:]' | sed 's/[^a-z0-9]/-/g' | sed 's/-\+/-/g' | sed 's/^-//' | sed 's/-$//')
WORDS=$(echo "$BRANCH_NAME" | tr '-' '\n' | grep -v '^$' | head -3 | tr '\n' '-' | sed 's/-$//')
BRANCH_NAME="${FEATURE_NUM}-${WORDS}"

if [ "$HAS_GIT" = true ]; then
    git checkout -b "$BRANCH_NAME"
else
    >&2 echo "[specify] Warning: Git repository not detected; skipped branch creation for $BRANCH_NAME"
fi

FEATURE_DIR="$SPECS_DIR/$BRANCH_NAME"
mkdir -p "$FEATURE_DIR"

TEMPLATE="$REPO_ROOT/.specify/templates/spec-template.md"
SPEC_FILE="$FEATURE_DIR/spec.md"
if [ -f "$TEMPLATE" ]; then cp "$TEMPLATE" "$SPEC_FILE"; else touch "$SPEC_FILE"; fi

# Set the SPECIFY_FEATURE environment variable for the current session
export SPECIFY_FEATURE="$BRANCH_NAME"

if $JSON_MODE; then
    printf '{"BRANCH_NAME":"%s","SPEC_FILE":"%s","FEATURE_NUM":"%s"}\n' "$BRANCH_NAME" "$SPEC_FILE" "$FEATURE_NUM"
else
    echo "BRANCH_NAME: $BRANCH_NAME"
    echo "SPEC_FILE: $SPEC_FILE"
    echo "FEATURE_NUM: $FEATURE_NUM"
    echo "SPECIFY_FEATURE environment variable set to: $BRANCH_NAME"
fi
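The `$((10#$number))` idiom used when scanning feature directories matters: zero-padded names like `009` would otherwise be parsed as octal arithmetic, and `8`/`9` are invalid octal digits. A quick sketch of computing the next zero-padded feature number:

```shell
n="009"
# 10# forces base-10; without it, $((09 + 1)) would be an arithmetic error
next=$((10#$n + 1))
FEATURE_NUM=$(printf "%03d" "$next")
echo "$FEATURE_NUM"
```

For `n="009"` this prints `010`, which is what the branch-naming scheme expects.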
60
.specify/scripts/bash/setup-plan.sh
Executable file
@@ -0,0 +1,60 @@
#!/usr/bin/env bash

set -e

# Parse command line arguments
JSON_MODE=false
ARGS=()

for arg in "$@"; do
    case "$arg" in
        --json)
            JSON_MODE=true
            ;;
        --help|-h)
            echo "Usage: $0 [--json]"
            echo "  --json    Output results in JSON format"
            echo "  --help    Show this help message"
            exit 0
            ;;
        *)
            ARGS+=("$arg")
            ;;
    esac
done

# Get script directory and load common functions
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
source "$SCRIPT_DIR/common.sh"

# Get all paths and variables from common functions
eval $(get_feature_paths)

# Check that we're on a proper feature branch (only for git repos)
check_feature_branch "$CURRENT_BRANCH" "$HAS_GIT" || exit 1

# Ensure the feature directory exists
mkdir -p "$FEATURE_DIR"

# Copy the plan template if it exists
TEMPLATE="$REPO_ROOT/.specify/templates/plan-template.md"
if [[ -f "$TEMPLATE" ]]; then
    cp "$TEMPLATE" "$IMPL_PLAN"
    echo "Copied plan template to $IMPL_PLAN"
else
    echo "Warning: Plan template not found at $TEMPLATE"
    # Create a basic plan file if the template doesn't exist
    touch "$IMPL_PLAN"
fi

# Output results
if $JSON_MODE; then
    printf '{"FEATURE_SPEC":"%s","IMPL_PLAN":"%s","SPECS_DIR":"%s","BRANCH":"%s","HAS_GIT":"%s"}\n' \
        "$FEATURE_SPEC" "$IMPL_PLAN" "$FEATURE_DIR" "$CURRENT_BRANCH" "$HAS_GIT"
else
    echo "FEATURE_SPEC: $FEATURE_SPEC"
    echo "IMPL_PLAN: $IMPL_PLAN"
    echo "SPECS_DIR: $FEATURE_DIR"
    echo "BRANCH: $CURRENT_BRANCH"
    echo "HAS_GIT: $HAS_GIT"
fi
719
.specify/scripts/bash/update-agent-context.sh
Executable file
@@ -0,0 +1,719 @@
|
|||||||
|
#!/usr/bin/env bash
|
||||||
|
|
||||||
|
# Update agent context files with information from plan.md
|
||||||
|
#
|
||||||
|
# This script maintains AI agent context files by parsing feature specifications
|
||||||
|
# and updating agent-specific configuration files with project information.
|
||||||
|
#
|
||||||
|
# MAIN FUNCTIONS:
|
||||||
|
# 1. Environment Validation
|
||||||
|
# - Verifies git repository structure and branch information
|
||||||
|
# - Checks for required plan.md files and templates
|
||||||
|
# - Validates file permissions and accessibility
|
||||||
|
#
|
||||||
|
# 2. Plan Data Extraction
|
||||||
|
# - Parses plan.md files to extract project metadata
|
||||||
|
# - Identifies language/version, frameworks, databases, and project types
|
||||||
|
# - Handles missing or incomplete specification data gracefully
|
||||||
|
#
|
||||||
|
# 3. Agent File Management
|
||||||
|
# - Creates new agent context files from templates when needed
|
||||||
|
# - Updates existing agent files with new project information
|
||||||
|
# - Preserves manual additions and custom configurations
|
||||||
|
# - Supports multiple AI agent formats and directory structures
|
||||||
|
#
|
||||||
|
# 4. Content Generation
|
||||||
|
# - Generates language-specific build/test commands
|
||||||
|
# - Creates appropriate project directory structures
|
||||||
|
# - Updates technology stacks and recent changes sections
|
||||||
|
# - Maintains consistent formatting and timestamps
|
||||||
|
#
|
||||||
|
# 5. Multi-Agent Support
|
||||||
|
# - Handles agent-specific file paths and naming conventions
|
||||||
|
# - Supports: Claude, Gemini, Copilot, Cursor, Qwen, opencode, Codex, Windsurf
|
||||||
|
# - Can update single agents or all existing agent files
|
||||||
|
# - Creates default Claude file if no agent files exist
|
||||||
|
#
|
||||||
|
# Usage: ./update-agent-context.sh [agent_type]
|
||||||
|
# Agent types: claude|gemini|copilot|cursor|qwen|opencode|codex|windsurf
|
||||||
|
# Leave empty to update all existing agent files
|
||||||
|
|
||||||
|
set -e
|
||||||
|
|
||||||
|
# Enable strict error handling
|
||||||
|
set -u
|
||||||
|
set -o pipefail
|
||||||
|
|
||||||
|
#==============================================================================
|
||||||
|
# Configuration and Global Variables
|
||||||
|
#==============================================================================
|
||||||
|
|
||||||
|
# Get script directory and load common functions
|
||||||
|
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
|
||||||
|
source "$SCRIPT_DIR/common.sh"
|
||||||
|
|
||||||
|
# Get all paths and variables from common functions
|
||||||
|
eval $(get_feature_paths)
|
||||||
|
|
||||||
|
NEW_PLAN="$IMPL_PLAN" # Alias for compatibility with existing code
|
||||||
|
AGENT_TYPE="${1:-}"
|
||||||
|
|
||||||
|
# Agent-specific file paths
|
||||||
|
CLAUDE_FILE="$REPO_ROOT/CLAUDE.md"
|
||||||
|
GEMINI_FILE="$REPO_ROOT/GEMINI.md"
|
||||||
|
COPILOT_FILE="$REPO_ROOT/.github/copilot-instructions.md"
|
||||||
|
CURSOR_FILE="$REPO_ROOT/.cursor/rules/specify-rules.mdc"
|
||||||
|
QWEN_FILE="$REPO_ROOT/QWEN.md"
|
||||||
|
AGENTS_FILE="$REPO_ROOT/AGENTS.md"
|
||||||
|
WINDSURF_FILE="$REPO_ROOT/.windsurf/rules/specify-rules.md"
|
||||||
|
KILOCODE_FILE="$REPO_ROOT/.kilocode/rules/specify-rules.md"
|
||||||
|
AUGGIE_FILE="$REPO_ROOT/.augment/rules/specify-rules.md"
|
||||||
|
ROO_FILE="$REPO_ROOT/.roo/rules/specify-rules.md"
|
||||||
|
|
||||||
|
# Template file
|
||||||
|
TEMPLATE_FILE="$REPO_ROOT/.specify/templates/agent-file-template.md"
|
||||||
|
|
||||||
|
# Global variables for parsed plan data
|
||||||
|
NEW_LANG=""
|
||||||
|
NEW_FRAMEWORK=""
|
||||||
|
NEW_DB=""
|
||||||
|
NEW_PROJECT_TYPE=""
|
||||||
|
|
||||||
|
#==============================================================================
|
||||||
|
# Utility Functions
|
||||||
|
#==============================================================================
|
||||||
|
|
||||||
|
log_info() {
|
||||||
|
echo "INFO: $1"
|
||||||
|
}
|
||||||
|
|
||||||
|
log_success() {
|
||||||
|
echo "✓ $1"
|
||||||
|
}
|
||||||
|
|
||||||
|
log_error() {
|
||||||
|
echo "ERROR: $1" >&2
|
||||||
|
}
|
||||||
|
|
||||||
|
log_warning() {
|
||||||
|
echo "WARNING: $1" >&2
|
||||||
|
}
|
||||||
|
|
||||||
|
# Cleanup function for temporary files
|
||||||
|
cleanup() {
|
||||||
|
local exit_code=$?
|
||||||
|
rm -f /tmp/agent_update_*_$$
|
||||||
|
rm -f /tmp/manual_additions_$$
|
||||||
|
exit $exit_code
|
||||||
|
}
|
||||||
|
|
||||||
|
# Set up cleanup trap
|
||||||
|
trap cleanup EXIT INT TERM
|
||||||
|
|
||||||
|
#==============================================================================
|
||||||
|
# Validation Functions
|
||||||
|
#==============================================================================
|
||||||
|
|
||||||
|
validate_environment() {
|
||||||
|
# Check if we have a current branch/feature (git or non-git)
|
||||||
|
if [[ -z "$CURRENT_BRANCH" ]]; then
|
||||||
|
log_error "Unable to determine current feature"
|
||||||
|
if [[ "$HAS_GIT" == "true" ]]; then
|
||||||
|
log_info "Make sure you're on a feature branch"
|
||||||
|
else
|
||||||
|
log_info "Set SPECIFY_FEATURE environment variable or create a feature first"
|
||||||
|
fi
|
||||||
|
exit 1
|
||||||
|
fi
|
||||||
|
|
||||||
|
# Check if plan.md exists
|
||||||
|
if [[ ! -f "$NEW_PLAN" ]]; then
|
||||||
|
log_error "No plan.md found at $NEW_PLAN"
|
||||||
|
log_info "Make sure you're working on a feature with a corresponding spec directory"
|
||||||
|
if [[ "$HAS_GIT" != "true" ]]; then
|
||||||
|
log_info "Use: export SPECIFY_FEATURE=your-feature-name or create a new feature first"
|
||||||
|
fi
|
||||||
|
exit 1
|
||||||
|
fi
|
||||||
|
|
||||||
|
# Check if template exists (needed for new files)
|
||||||
|
if [[ ! -f "$TEMPLATE_FILE" ]]; then
|
||||||
|
log_warning "Template file not found at $TEMPLATE_FILE"
|
||||||
|
log_warning "Creating new agent files will fail"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
#==============================================================================
|
||||||
|
# Plan Parsing Functions
|
||||||
|
#==============================================================================
|
||||||
|
|
||||||
|
extract_plan_field() {
|
||||||
|
local field_pattern="$1"
|
||||||
|
local plan_file="$2"
|
||||||
|
|
||||||
|
grep "^\*\*${field_pattern}\*\*: " "$plan_file" 2>/dev/null | \
|
||||||
|
head -1 | \
|
||||||
|
sed "s|^\*\*${field_pattern}\*\*: ||" | \
|
||||||
|
sed 's/^[ \t]*//;s/[ \t]*$//' | \
|
||||||
|
grep -v "NEEDS CLARIFICATION" | \
|
||||||
|
grep -v "^N/A$" || echo ""
|
||||||
|
}

parse_plan_data() {
    local plan_file="$1"

    if [[ ! -f "$plan_file" ]]; then
        log_error "Plan file not found: $plan_file"
        return 1
    fi

    if [[ ! -r "$plan_file" ]]; then
        log_error "Plan file is not readable: $plan_file"
        return 1
    fi

    log_info "Parsing plan data from $plan_file"

    NEW_LANG=$(extract_plan_field "Language/Version" "$plan_file")
    NEW_FRAMEWORK=$(extract_plan_field "Primary Dependencies" "$plan_file")
    NEW_DB=$(extract_plan_field "Storage" "$plan_file")
    NEW_PROJECT_TYPE=$(extract_plan_field "Project Type" "$plan_file")

    # Log what we found
    if [[ -n "$NEW_LANG" ]]; then
        log_info "Found language: $NEW_LANG"
    else
        log_warning "No language information found in plan"
    fi

    if [[ -n "$NEW_FRAMEWORK" ]]; then
        log_info "Found framework: $NEW_FRAMEWORK"
    fi

    if [[ -n "$NEW_DB" ]] && [[ "$NEW_DB" != "N/A" ]]; then
        log_info "Found database: $NEW_DB"
    fi

    if [[ -n "$NEW_PROJECT_TYPE" ]]; then
        log_info "Found project type: $NEW_PROJECT_TYPE"
    fi
}

format_technology_stack() {
    local lang="$1"
    local framework="$2"
    local parts=()

    # Add non-empty parts
    [[ -n "$lang" && "$lang" != "NEEDS CLARIFICATION" ]] && parts+=("$lang")
    [[ -n "$framework" && "$framework" != "NEEDS CLARIFICATION" && "$framework" != "N/A" ]] && parts+=("$framework")

    # Join with proper formatting
    if [[ ${#parts[@]} -eq 0 ]]; then
        echo ""
    elif [[ ${#parts[@]} -eq 1 ]]; then
        echo "${parts[0]}"
    else
        # Join multiple parts with " + "
        local result="${parts[0]}"
        for ((i=1; i<${#parts[@]}; i++)); do
            result="$result + ${parts[i]}"
        done
        echo "$result"
    fi
}
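A quick way to see the `" + "` join behavior in isolation (toy `join_stack` helper, not part of the script):

```shell
# Toy re-implementation of the " + " join above, for illustration only.
join_stack() {
    local parts=("$@")
    if [[ ${#parts[@]} -eq 0 ]]; then
        echo ""
    else
        local result="${parts[0]}"
        for ((i=1; i<${#parts[@]}; i++)); do
            result="$result + ${parts[i]}"
        done
        echo "$result"
    fi
}

join_stack "Python 3.11" "FastAPI"   # prints "Python 3.11 + FastAPI"
join_stack "Rust 1.75"               # prints "Rust 1.75"
```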

#==============================================================================
# Template and Content Generation Functions
#==============================================================================

get_project_structure() {
    local project_type="$1"

    if [[ "$project_type" == *"web"* ]]; then
        echo "backend/\\nfrontend/\\ntests/"
    else
        echo "src/\\ntests/"
    fi
}

get_commands_for_language() {
    local lang="$1"

    case "$lang" in
        *"Python"*)
            echo "cd src && pytest && ruff check ."
            ;;
        *"Rust"*)
            echo "cargo test && cargo clippy"
            ;;
        *"JavaScript"*|*"TypeScript"*)
            echo "npm test && npm run lint"
            ;;
        *)
            echo "# Add commands for $lang"
            ;;
    esac
}

get_language_conventions() {
    local lang="$1"
    echo "$lang: Follow standard conventions"
}

create_new_agent_file() {
    local target_file="$1"
    local temp_file="$2"
    local project_name="$3"
    local current_date="$4"

    if [[ ! -f "$TEMPLATE_FILE" ]]; then
        log_error "Template not found at $TEMPLATE_FILE"
        return 1
    fi

    if [[ ! -r "$TEMPLATE_FILE" ]]; then
        log_error "Template file is not readable: $TEMPLATE_FILE"
        return 1
    fi

    log_info "Creating new agent context file from template..."

    if ! cp "$TEMPLATE_FILE" "$temp_file"; then
        log_error "Failed to copy template file"
        return 1
    fi

    # Replace template placeholders
    local project_structure
    project_structure=$(get_project_structure "$NEW_PROJECT_TYPE")

    local commands
    commands=$(get_commands_for_language "$NEW_LANG")

    local language_conventions
    language_conventions=$(get_language_conventions "$NEW_LANG")

    # Perform substitutions with error checking using a safer approach:
    # escape characters that are special to sed's regex syntax
    local escaped_lang
    escaped_lang=$(printf '%s\n' "$NEW_LANG" | sed 's/[\[\.*^$()+{}|]/\\&/g')
    local escaped_framework
    escaped_framework=$(printf '%s\n' "$NEW_FRAMEWORK" | sed 's/[\[\.*^$()+{}|]/\\&/g')
    local escaped_branch
    escaped_branch=$(printf '%s\n' "$CURRENT_BRANCH" | sed 's/[\[\.*^$()+{}|]/\\&/g')

    # Build technology stack and recent change strings conditionally
    local tech_stack
    if [[ -n "$escaped_lang" && -n "$escaped_framework" ]]; then
        tech_stack="- $escaped_lang + $escaped_framework ($escaped_branch)"
    elif [[ -n "$escaped_lang" ]]; then
        tech_stack="- $escaped_lang ($escaped_branch)"
    elif [[ -n "$escaped_framework" ]]; then
        tech_stack="- $escaped_framework ($escaped_branch)"
    else
        tech_stack="- ($escaped_branch)"
    fi

    local recent_change
    if [[ -n "$escaped_lang" && -n "$escaped_framework" ]]; then
        recent_change="- $escaped_branch: Added $escaped_lang + $escaped_framework"
    elif [[ -n "$escaped_lang" ]]; then
        recent_change="- $escaped_branch: Added $escaped_lang"
    elif [[ -n "$escaped_framework" ]]; then
        recent_change="- $escaped_branch: Added $escaped_framework"
    else
        recent_change="- $escaped_branch: Added"
    fi

    local substitutions=(
        "s|\[PROJECT NAME\]|$project_name|"
        "s|\[DATE\]|$current_date|"
        "s|\[EXTRACTED FROM ALL PLAN.MD FILES\]|$tech_stack|"
        "s|\[ACTUAL STRUCTURE FROM PLANS\]|$project_structure|g"
        "s|\[ONLY COMMANDS FOR ACTIVE TECHNOLOGIES\]|$commands|"
        "s|\[LANGUAGE-SPECIFIC, ONLY FOR LANGUAGES IN USE\]|$language_conventions|"
        "s|\[LAST 3 FEATURES AND WHAT THEY ADDED\]|$recent_change|"
    )

    for substitution in "${substitutions[@]}"; do
        if ! sed -i.bak -e "$substitution" "$temp_file"; then
            log_error "Failed to perform substitution: $substitution"
            rm -f "$temp_file" "$temp_file.bak"
            return 1
        fi
    done

    # Convert \n sequences to actual newlines. Note: command substitution
    # strips trailing newlines, so $(printf '\n') would yield an empty
    # string; use $'\n' and backslash-escape it in the sed replacement.
    local newline=$'\n'
    sed -i.bak2 "s/\\\\n/\\${newline}/g" "$temp_file"

    # Clean up backup files
    rm -f "$temp_file.bak" "$temp_file.bak2"

    return 0
}
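The placeholder substitution above relies on `sed -i.bak` with a `|` delimiter so that `/` in replacement values (paths, dates) needs no escaping. A standalone sketch run on a throwaway file (sample placeholder lines mirror the agent-file template; the values are illustrative):

```shell
# Standalone sketch of the placeholder substitution above, on a throwaway
# file. The "|" delimiter avoids escaping "/" in replacement values.
tmp=$(mktemp)
printf '%s\n' '# [PROJECT NAME] Development Guidelines' \
              'Auto-generated from all feature plans. Last updated: [DATE]' > "$tmp"
sed -i.bak -e 's|\[PROJECT NAME\]|demo-project|' -e 's|\[DATE\]|2025-01-01|' "$tmp"
result=$(cat "$tmp")
rm -f "$tmp" "$tmp.bak"
printf '%s\n' "$result"
```

The `.bak` suffix attached directly to `-i` keeps the in-place edit portable across GNU and BSD sed, which is why the script cleans up `.bak`/`.bak2` files afterwards.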

update_existing_agent_file() {
    local target_file="$1"
    local current_date="$2"

    log_info "Updating existing agent context file..."

    # Use a single temporary file for atomic update
    local temp_file
    temp_file=$(mktemp) || {
        log_error "Failed to create temporary file"
        return 1
    }

    # Process the file in one pass
    local tech_stack
    tech_stack=$(format_technology_stack "$NEW_LANG" "$NEW_FRAMEWORK")
    local new_tech_entries=()
    local new_change_entry=""

    # Prepare new technology entries
    if [[ -n "$tech_stack" ]] && ! grep -q "$tech_stack" "$target_file"; then
        new_tech_entries+=("- $tech_stack ($CURRENT_BRANCH)")
    fi

    if [[ -n "$NEW_DB" ]] && [[ "$NEW_DB" != "N/A" ]] && [[ "$NEW_DB" != "NEEDS CLARIFICATION" ]] && ! grep -q "$NEW_DB" "$target_file"; then
        new_tech_entries+=("- $NEW_DB ($CURRENT_BRANCH)")
    fi

    # Prepare new change entry
    if [[ -n "$tech_stack" ]]; then
        new_change_entry="- $CURRENT_BRANCH: Added $tech_stack"
    elif [[ -n "$NEW_DB" ]] && [[ "$NEW_DB" != "N/A" ]] && [[ "$NEW_DB" != "NEEDS CLARIFICATION" ]]; then
        new_change_entry="- $CURRENT_BRANCH: Added $NEW_DB"
    fi

    # Process file line by line
    local in_tech_section=false
    local in_changes_section=false
    local tech_entries_added=false
    local changes_entries_added=false
    local existing_changes_count=0

    while IFS= read -r line || [[ -n "$line" ]]; do
        # Handle Active Technologies section
        if [[ "$line" == "## Active Technologies" ]]; then
            echo "$line" >> "$temp_file"
            in_tech_section=true
            continue
        elif [[ $in_tech_section == true ]] && [[ "$line" =~ ^##[[:space:]] ]]; then
            # Add new tech entries before closing the section
            if [[ $tech_entries_added == false ]] && [[ ${#new_tech_entries[@]} -gt 0 ]]; then
                printf '%s\n' "${new_tech_entries[@]}" >> "$temp_file"
                tech_entries_added=true
            fi
            echo "$line" >> "$temp_file"
            in_tech_section=false
            continue
        elif [[ $in_tech_section == true ]] && [[ -z "$line" ]]; then
            # Add new tech entries before empty line in tech section
            if [[ $tech_entries_added == false ]] && [[ ${#new_tech_entries[@]} -gt 0 ]]; then
                printf '%s\n' "${new_tech_entries[@]}" >> "$temp_file"
                tech_entries_added=true
            fi
            echo "$line" >> "$temp_file"
            continue
        fi

        # Handle Recent Changes section
        if [[ "$line" == "## Recent Changes" ]]; then
            echo "$line" >> "$temp_file"
            # Add new change entry right after the heading
            if [[ -n "$new_change_entry" ]]; then
                echo "$new_change_entry" >> "$temp_file"
            fi
            in_changes_section=true
            changes_entries_added=true
            continue
        elif [[ $in_changes_section == true ]] && [[ "$line" =~ ^##[[:space:]] ]]; then
            echo "$line" >> "$temp_file"
            in_changes_section=false
            continue
        elif [[ $in_changes_section == true ]] && [[ "$line" == "- "* ]]; then
            # Keep only the first 2 existing changes
            if [[ $existing_changes_count -lt 2 ]]; then
                echo "$line" >> "$temp_file"
                ((existing_changes_count++))
            fi
            continue
        fi

        # Update timestamp
        if [[ "$line" =~ \*\*Last\ updated\*\*:.*[0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9] ]]; then
            echo "$line" | sed "s/[0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9]/$current_date/" >> "$temp_file"
        else
            echo "$line" >> "$temp_file"
        fi
    done < "$target_file"

    # Post-loop check: if we're still in the Active Technologies section and
    # haven't added the new entries, append them now
    if [[ $in_tech_section == true ]] && [[ $tech_entries_added == false ]] && [[ ${#new_tech_entries[@]} -gt 0 ]]; then
        printf '%s\n' "${new_tech_entries[@]}" >> "$temp_file"
    fi

    # Move temp file to target atomically
    if ! mv "$temp_file" "$target_file"; then
        log_error "Failed to update target file"
        rm -f "$temp_file"
        return 1
    fi

    return 0
}
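The Recent Changes rotation above (insert the newest entry under the heading, then keep only the first two existing bullets, i.e. three total) can be sketched in isolation with a toy `trim_changes` helper that is not part of the script:

```shell
# Toy sketch: prepend the newest change and keep only three bullets total,
# mirroring the "keep only first 2 existing changes" loop above.
trim_changes() {
    { printf '%s\n' "$1"; cat; } | head -3
}

printf '%s\n' '- 001-init: Added Python 3.11' \
              '- 000-setup: Added repo scaffolding' \
              '- legacy: Added docs' |
    trim_changes '- 002-api: Added FastAPI'
# The oldest bullet ("- legacy: ...") is dropped.
```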

#==============================================================================
# Main Agent File Update Function
#==============================================================================

update_agent_file() {
    local target_file="$1"
    local agent_name="$2"

    if [[ -z "$target_file" ]] || [[ -z "$agent_name" ]]; then
        log_error "update_agent_file requires target_file and agent_name parameters"
        return 1
    fi

    log_info "Updating $agent_name context file: $target_file"

    local project_name
    project_name=$(basename "$REPO_ROOT")
    local current_date
    current_date=$(date +%Y-%m-%d)

    # Create directory if it doesn't exist
    local target_dir
    target_dir=$(dirname "$target_file")
    if [[ ! -d "$target_dir" ]]; then
        if ! mkdir -p "$target_dir"; then
            log_error "Failed to create directory: $target_dir"
            return 1
        fi
    fi

    if [[ ! -f "$target_file" ]]; then
        # Create new file from template
        local temp_file
        temp_file=$(mktemp) || {
            log_error "Failed to create temporary file"
            return 1
        }

        if create_new_agent_file "$target_file" "$temp_file" "$project_name" "$current_date"; then
            if mv "$temp_file" "$target_file"; then
                log_success "Created new $agent_name context file"
            else
                log_error "Failed to move temporary file to $target_file"
                rm -f "$temp_file"
                return 1
            fi
        else
            log_error "Failed to create new agent file"
            rm -f "$temp_file"
            return 1
        fi
    else
        # Update existing file
        if [[ ! -r "$target_file" ]]; then
            log_error "Cannot read existing file: $target_file"
            return 1
        fi

        if [[ ! -w "$target_file" ]]; then
            log_error "Cannot write to existing file: $target_file"
            return 1
        fi

        if update_existing_agent_file "$target_file" "$current_date"; then
            log_success "Updated existing $agent_name context file"
        else
            log_error "Failed to update existing agent file"
            return 1
        fi
    fi

    return 0
}

#==============================================================================
# Agent Selection and Processing
#==============================================================================

update_specific_agent() {
    local agent_type="$1"

    case "$agent_type" in
        claude)
            update_agent_file "$CLAUDE_FILE" "Claude Code"
            ;;
        gemini)
            update_agent_file "$GEMINI_FILE" "Gemini CLI"
            ;;
        copilot)
            update_agent_file "$COPILOT_FILE" "GitHub Copilot"
            ;;
        cursor)
            update_agent_file "$CURSOR_FILE" "Cursor IDE"
            ;;
        qwen)
            update_agent_file "$QWEN_FILE" "Qwen Code"
            ;;
        opencode)
            update_agent_file "$AGENTS_FILE" "opencode"
            ;;
        codex)
            update_agent_file "$AGENTS_FILE" "Codex CLI"
            ;;
        windsurf)
            update_agent_file "$WINDSURF_FILE" "Windsurf"
            ;;
        kilocode)
            update_agent_file "$KILOCODE_FILE" "Kilo Code"
            ;;
        auggie)
            update_agent_file "$AUGGIE_FILE" "Auggie CLI"
            ;;
        roo)
            update_agent_file "$ROO_FILE" "Roo Code"
            ;;
        *)
            log_error "Unknown agent type '$agent_type'"
            log_error "Expected: claude|gemini|copilot|cursor|qwen|opencode|codex|windsurf|kilocode|auggie|roo"
            exit 1
            ;;
    esac
}
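The dispatch above maps an agent key to its context file variable. A toy version of the pattern (the `agent_file_for` helper is illustrative only; the claude/gemini/copilot file names are the ones referenced elsewhere in this commit's plan template, and "unknown" stands in for the error branch):

```shell
# Illustrative dispatch: agent key -> context file path.
# CLAUDE.md, GEMINI.md, and .github/copilot-instructions.md match the names
# referenced in the plan template; "unknown" stands in for the error path.
agent_file_for() {
    case "$1" in
        claude)  echo "CLAUDE.md" ;;
        gemini)  echo "GEMINI.md" ;;
        copilot) echo ".github/copilot-instructions.md" ;;
        *)       echo "unknown" ;;
    esac
}

agent_file_for gemini   # prints "GEMINI.md"
```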

update_all_existing_agents() {
    local found_agent=false

    # Check each possible agent file and update if it exists
    if [[ -f "$CLAUDE_FILE" ]]; then
        update_agent_file "$CLAUDE_FILE" "Claude Code"
        found_agent=true
    fi

    if [[ -f "$GEMINI_FILE" ]]; then
        update_agent_file "$GEMINI_FILE" "Gemini CLI"
        found_agent=true
    fi

    if [[ -f "$COPILOT_FILE" ]]; then
        update_agent_file "$COPILOT_FILE" "GitHub Copilot"
        found_agent=true
    fi

    if [[ -f "$CURSOR_FILE" ]]; then
        update_agent_file "$CURSOR_FILE" "Cursor IDE"
        found_agent=true
    fi

    if [[ -f "$QWEN_FILE" ]]; then
        update_agent_file "$QWEN_FILE" "Qwen Code"
        found_agent=true
    fi

    if [[ -f "$AGENTS_FILE" ]]; then
        update_agent_file "$AGENTS_FILE" "Codex/opencode"
        found_agent=true
    fi

    if [[ -f "$WINDSURF_FILE" ]]; then
        update_agent_file "$WINDSURF_FILE" "Windsurf"
        found_agent=true
    fi

    if [[ -f "$KILOCODE_FILE" ]]; then
        update_agent_file "$KILOCODE_FILE" "Kilo Code"
        found_agent=true
    fi

    if [[ -f "$AUGGIE_FILE" ]]; then
        update_agent_file "$AUGGIE_FILE" "Auggie CLI"
        found_agent=true
    fi

    if [[ -f "$ROO_FILE" ]]; then
        update_agent_file "$ROO_FILE" "Roo Code"
        found_agent=true
    fi

    # If no agent files exist, create a default Claude file
    if [[ "$found_agent" == false ]]; then
        log_info "No existing agent files found, creating default Claude file..."
        update_agent_file "$CLAUDE_FILE" "Claude Code"
    fi
}

print_summary() {
    echo
    log_info "Summary of changes:"

    if [[ -n "$NEW_LANG" ]]; then
        echo "  - Added language: $NEW_LANG"
    fi

    if [[ -n "$NEW_FRAMEWORK" ]]; then
        echo "  - Added framework: $NEW_FRAMEWORK"
    fi

    if [[ -n "$NEW_DB" ]] && [[ "$NEW_DB" != "N/A" ]]; then
        echo "  - Added database: $NEW_DB"
    fi

    echo
    log_info "Usage: $0 [claude|gemini|copilot|cursor|qwen|opencode|codex|windsurf|kilocode|auggie|roo]"
}

#==============================================================================
# Main Execution
#==============================================================================

main() {
    # Validate environment before proceeding
    validate_environment

    log_info "=== Updating agent context files for feature $CURRENT_BRANCH ==="

    # Parse the plan file to extract project information
    if ! parse_plan_data "$NEW_PLAN"; then
        log_error "Failed to parse plan data"
        exit 1
    fi

    # Process based on agent type argument
    local success=true

    if [[ -z "$AGENT_TYPE" ]]; then
        # No specific agent provided - update all existing agent files
        log_info "No agent specified, updating all existing agent files..."
        if ! update_all_existing_agents; then
            success=false
        fi
    else
        # Specific agent provided - update only that agent
        log_info "Updating specific agent: $AGENT_TYPE"
        if ! update_specific_agent "$AGENT_TYPE"; then
            success=false
        fi
    fi

    # Print summary
    print_summary

    if [[ "$success" == true ]]; then
        log_success "Agent context update completed successfully"
        exit 0
    else
        log_error "Agent context update completed with errors"
        exit 1
    fi
}

# Execute main function if script is run directly
if [[ "${BASH_SOURCE[0]}" == "${0}" ]]; then
    main "$@"
fi
23
.specify/templates/agent-file-template.md
Normal file
@@ -0,0 +1,23 @@
# [PROJECT NAME] Development Guidelines

Auto-generated from all feature plans. Last updated: [DATE]

## Active Technologies
[EXTRACTED FROM ALL PLAN.MD FILES]

## Project Structure
```
[ACTUAL STRUCTURE FROM PLANS]
```

## Commands
[ONLY COMMANDS FOR ACTIVE TECHNOLOGIES]

## Code Style
[LANGUAGE-SPECIFIC, ONLY FOR LANGUAGES IN USE]

## Recent Changes
[LAST 3 FEATURES AND WHAT THEY ADDED]

<!-- MANUAL ADDITIONS START -->
<!-- MANUAL ADDITIONS END -->
219
.specify/templates/plan-template.md
Normal file
@@ -0,0 +1,219 @@
# Implementation Plan: [FEATURE]

**Branch**: `[###-feature-name]` | **Date**: [DATE] | **Spec**: [link]
**Input**: Feature specification from `/specs/[###-feature-name]/spec.md`

## Execution Flow (/plan command scope)
```
1. Load feature spec from Input path
   → If not found: ERROR "No feature spec at {path}"
2. Fill Technical Context (scan for NEEDS CLARIFICATION)
   → Detect Project Type from file system structure or context (web=frontend+backend, mobile=app+api)
   → Set Structure Decision based on project type
3. Fill the Constitution Check section based on the content of the constitution document.
4. Evaluate Constitution Check section below
   → If violations exist: Document in Complexity Tracking
   → If no justification possible: ERROR "Simplify approach first"
   → Update Progress Tracking: Initial Constitution Check
5. Execute Phase 0 → research.md
   → If NEEDS CLARIFICATION remain: ERROR "Resolve unknowns"
6. Execute Phase 1 → contracts, data-model.md, quickstart.md, agent-specific template file (e.g., `CLAUDE.md` for Claude Code, `.github/copilot-instructions.md` for GitHub Copilot, `GEMINI.md` for Gemini CLI, `QWEN.md` for Qwen Code, or `AGENTS.md` for opencode).
7. Re-evaluate Constitution Check section
   → If new violations: Refactor design, return to Phase 1
   → Update Progress Tracking: Post-Design Constitution Check
8. Plan Phase 2 → Describe task generation approach (DO NOT create tasks.md)
9. STOP - Ready for /tasks command
```

**IMPORTANT**: The /plan command STOPS at step 9. Phases 2-4 are executed by other commands:
- Phase 2: /tasks command creates tasks.md
- Phase 3-4: Implementation execution (manual or via tools)

## Summary
[Extract from feature spec: primary requirement + technical approach from research]

## Technical Context
**Language/Version**: [e.g., Python 3.11, Swift 5.9, Rust 1.75 or NEEDS CLARIFICATION]
**Primary Dependencies**: [e.g., FastAPI, UIKit, LLVM or NEEDS CLARIFICATION]
**Storage**: [if applicable, e.g., PostgreSQL, CoreData, files or N/A]
**Testing**: [e.g., pytest, XCTest, cargo test or NEEDS CLARIFICATION]
**Target Platform**: [e.g., Linux server, iOS 15+, WASM or NEEDS CLARIFICATION]
**Project Type**: [single/web/mobile - determines source structure]
**Performance Goals**: [domain-specific, e.g., 1000 req/s, 10k lines/sec, 60 fps or NEEDS CLARIFICATION]
**Constraints**: [domain-specific, e.g., <200ms p95, <100MB memory, offline-capable or NEEDS CLARIFICATION]
**Scale/Scope**: [domain-specific, e.g., 10k users, 1M LOC, 50 screens or NEEDS CLARIFICATION]

## Constitution Check
*GATE: Must pass before Phase 0 research. Re-check after Phase 1 design.*

[Gates determined based on constitution file]

## Project Structure

### Documentation (this feature)
```
specs/[###-feature]/
├── plan.md              # This file (/plan command output)
├── research.md          # Phase 0 output (/plan command)
├── data-model.md        # Phase 1 output (/plan command)
├── quickstart.md        # Phase 1 output (/plan command)
├── contracts/           # Phase 1 output (/plan command)
└── tasks.md             # Phase 2 output (/tasks command - NOT created by /plan)
```

### Source Code (repository root)
<!--
ACTION REQUIRED: Replace the placeholder tree below with the concrete layout
for this feature. Delete unused options and expand the chosen structure with
real paths (e.g., apps/admin, packages/something). The delivered plan must
not include Option labels.
-->
```
# [REMOVE IF UNUSED] Option 1: Single project (DEFAULT)
src/
├── models/
├── services/
├── cli/
└── lib/

tests/
├── contract/
├── integration/
└── unit/

# [REMOVE IF UNUSED] Option 2: Web application (when "frontend" + "backend" detected)
backend/
├── src/
│   ├── models/
│   ├── services/
│   └── api/
└── tests/

frontend/
├── src/
│   ├── components/
│   ├── pages/
│   └── services/
└── tests/

# [REMOVE IF UNUSED] Option 3: Mobile + API (when "iOS/Android" detected)
api/
└── [same as backend above]

ios/ or android/
└── [platform-specific structure: feature modules, UI flows, platform tests]
```

**Structure Decision**: [Document the selected structure and reference the real
directories captured above]

## Phase 0: Outline & Research
1. **Extract unknowns from Technical Context** above:
   - For each NEEDS CLARIFICATION → research task
   - For each dependency → best practices task
   - For each integration → patterns task

2. **Generate and dispatch research agents**:
   ```
   For each unknown in Technical Context:
     Task: "Research {unknown} for {feature context}"
   For each technology choice:
     Task: "Find best practices for {tech} in {domain}"
   ```

3. **Consolidate findings** in `research.md` using format:
   - Decision: [what was chosen]
   - Rationale: [why chosen]
   - Alternatives considered: [what else evaluated]

**Output**: research.md with all NEEDS CLARIFICATION resolved

## Phase 1: Design & Contracts
*Prerequisites: research.md complete*

1. **Extract entities from feature spec** → `data-model.md`:
   - Entity name, fields, relationships
   - Validation rules from requirements
   - State transitions if applicable

2. **Generate API contracts** from functional requirements:
   - For each user action → endpoint
   - Use standard REST/GraphQL patterns
   - Output OpenAPI/GraphQL schema to `/contracts/`

3. **Generate contract tests** from contracts:
   - One test file per endpoint
   - Assert request/response schemas
   - Tests must fail (no implementation yet)

4. **Extract test scenarios** from user stories:
   - Each story → integration test scenario
   - Quickstart test = story validation steps

5. **Update agent file incrementally** (O(1) operation):
   - Run `.specify/scripts/bash/update-agent-context.sh gemini`
     **IMPORTANT**: Execute it exactly as specified above. Do not add or remove any arguments.
   - If exists: Add only NEW tech from current plan
   - Preserve manual additions between markers
   - Update recent changes (keep last 3)
   - Keep under 150 lines for token efficiency
   - Output to repository root

**Output**: data-model.md, /contracts/*, failing tests, quickstart.md, agent-specific file

## Phase 2: Task Planning Approach
*This section describes what the /tasks command will do - DO NOT execute during /plan*

**Task Generation Strategy**:
- Load `.specify/templates/tasks-template.md` as base
- Generate tasks from Phase 1 design docs (contracts, data model, quickstart)
- Each contract → contract test task [P]
- Each entity → model creation task [P]
- Each user story → integration test task
- Implementation tasks to make tests pass

**Ordering Strategy**:
- TDD order: Tests before implementation
- Dependency order: Models before services before UI
- Mark [P] for parallel execution (independent files)

**Estimated Output**: 25-30 numbered, ordered tasks in tasks.md

**IMPORTANT**: This phase is executed by the /tasks command, NOT by /plan

## Phase 3+: Future Implementation
*These phases are beyond the scope of the /plan command*

**Phase 3**: Task execution (/tasks command creates tasks.md)
**Phase 4**: Implementation (execute tasks.md following constitutional principles)
**Phase 5**: Validation (run tests, execute quickstart.md, performance validation)

## Complexity Tracking
*Fill ONLY if Constitution Check has violations that must be justified*

| Violation | Why Needed | Simpler Alternative Rejected Because |
|-----------|------------|-------------------------------------|
| [e.g., 4th project] | [current need] | [why 3 projects insufficient] |
| [e.g., Repository pattern] | [specific problem] | [why direct DB access insufficient] |

## Progress Tracking
*This checklist is updated during execution flow*

**Phase Status**:
- [ ] Phase 0: Research complete (/plan command)
- [ ] Phase 1: Design complete (/plan command)
- [ ] Phase 2: Task planning complete (/plan command - describe approach only)
- [ ] Phase 3: Tasks generated (/tasks command)
- [ ] Phase 4: Implementation complete
- [ ] Phase 5: Validation passed

**Gate Status**:
- [ ] Initial Constitution Check: PASS
- [ ] Post-Design Constitution Check: PASS
- [ ] All NEEDS CLARIFICATION resolved
- [ ] Complexity deviations documented

---
*Based on Constitution v2.1.1 - See `/memory/constitution.md`*
116
.specify/templates/spec-template.md
Normal file
@@ -0,0 +1,116 @@
# Feature Specification: [FEATURE NAME]

**Feature Branch**: `[###-feature-name]`
**Created**: [DATE]
**Status**: Draft
**Input**: User description: "$ARGUMENTS"

## Execution Flow (main)
```
1. Parse user description from Input
   → If empty: ERROR "No feature description provided"
2. Extract key concepts from description
   → Identify: actors, actions, data, constraints
3. For each unclear aspect:
   → Mark with [NEEDS CLARIFICATION: specific question]
4. Fill User Scenarios & Testing section
   → If no clear user flow: ERROR "Cannot determine user scenarios"
5. Generate Functional Requirements
   → Each requirement must be testable
   → Mark ambiguous requirements
6. Identify Key Entities (if data involved)
7. Run Review Checklist
   → If any [NEEDS CLARIFICATION]: WARN "Spec has uncertainties"
   → If implementation details found: ERROR "Remove tech details"
8. Return: SUCCESS (spec ready for planning)
```

---

## ⚡ Quick Guidelines
- ✅ Focus on WHAT users need and WHY
- ❌ Avoid HOW to implement (no tech stack, APIs, code structure)
- 👥 Written for business stakeholders, not developers

### Section Requirements
- **Mandatory sections**: Must be completed for every feature
- **Optional sections**: Include only when relevant to the feature
- When a section doesn't apply, remove it entirely (don't leave it as "N/A")

### For AI Generation
When creating this spec from a user prompt:
1. **Mark all ambiguities**: Use [NEEDS CLARIFICATION: specific question] for any assumption you'd need to make
2. **Don't guess**: If the prompt doesn't specify something (e.g., "login system" without auth method), mark it
3. **Think like a tester**: Every vague requirement should fail the "testable and unambiguous" checklist item
4. **Common underspecified areas**:
   - User types and permissions
   - Data retention/deletion policies
   - Performance targets and scale
   - Error handling behaviors
   - Integration requirements
   - Security/compliance needs

---

## User Scenarios & Testing *(mandatory)*

### Primary User Story
[Describe the main user journey in plain language]

### Acceptance Scenarios
1. **Given** [initial state], **When** [action], **Then** [expected outcome]
2. **Given** [initial state], **When** [action], **Then** [expected outcome]

### Edge Cases
- What happens when [boundary condition]?
- How does the system handle [error scenario]?

## Requirements *(mandatory)*

### Functional Requirements
- **FR-001**: System MUST [specific capability, e.g., "allow users to create accounts"]
- **FR-002**: System MUST [specific capability, e.g., "validate email addresses"]
- **FR-003**: Users MUST be able to [key interaction, e.g., "reset their password"]
- **FR-004**: System MUST [data requirement, e.g., "persist user preferences"]
- **FR-005**: System MUST [behavior, e.g., "log all security events"]

*Example of marking unclear requirements:*
- **FR-006**: System MUST authenticate users via [NEEDS CLARIFICATION: auth method not specified - email/password, SSO, OAuth?]
- **FR-007**: System MUST retain user data for [NEEDS CLARIFICATION: retention period not specified]

### Key Entities *(include if feature involves data)*
- **[Entity 1]**: [What it represents, key attributes without implementation]
- **[Entity 2]**: [What it represents, relationships to other entities]

---

## Review & Acceptance Checklist
*GATE: Automated checks run during main() execution*

### Content Quality
- [ ] No implementation details (languages, frameworks, APIs)
- [ ] Focused on user value and business needs
- [ ] Written for non-technical stakeholders
- [ ] All mandatory sections completed

### Requirement Completeness
- [ ] No [NEEDS CLARIFICATION] markers remain
- [ ] Requirements are testable and unambiguous
- [ ] Success criteria are measurable
- [ ] Scope is clearly bounded
- [ ] Dependencies and assumptions identified

---

## Execution Status
*Updated by main() during processing*

- [ ] User description parsed
- [ ] Key concepts extracted
- [ ] Ambiguities marked
- [ ] User scenarios defined
- [ ] Requirements generated
- [ ] Entities identified
- [ ] Review checklist passed

---
127
.specify/templates/tasks-template.md
Normal file
@@ -0,0 +1,127 @@
# Tasks: [FEATURE NAME]

**Input**: Design documents from `/specs/[###-feature-name]/`
**Prerequisites**: plan.md (required), research.md, data-model.md, contracts/

## Execution Flow (main)
```
1. Load plan.md from feature directory
   → If not found: ERROR "No implementation plan found"
   → Extract: tech stack, libraries, structure
2. Load optional design documents:
   → data-model.md: Extract entities → model tasks
   → contracts/: Each file → contract test task
   → research.md: Extract decisions → setup tasks
3. Generate tasks by category:
   → Setup: project init, dependencies, linting
   → Tests: contract tests, integration tests
   → Core: models, services, CLI commands
   → Integration: DB, middleware, logging
   → Polish: unit tests, performance, docs
4. Apply task rules:
   → Different files = mark [P] for parallel
   → Same file = sequential (no [P])
   → Tests before implementation (TDD)
5. Number tasks sequentially (T001, T002...)
6. Generate dependency graph
7. Create parallel execution examples
8. Validate task completeness:
   → All contracts have tests?
   → All entities have models?
   → All endpoints implemented?
9. Return: SUCCESS (tasks ready for execution)
```

## Format: `[ID] [P?] Description`
- **[P]**: Can run in parallel (different files, no dependencies)
- Include exact file paths in descriptions

## Path Conventions
- **Single project**: `src/`, `tests/` at repository root
- **Web app**: `backend/src/`, `frontend/src/`
- **Mobile**: `api/src/`, `ios/src/` or `android/src/`
- Paths shown below assume a single project - adjust based on plan.md structure

## Phase 3.1: Setup
- [ ] T001 Create project structure per implementation plan
- [ ] T002 Initialize [language] project with [framework] dependencies
- [ ] T003 [P] Configure linting and formatting tools

## Phase 3.2: Tests First (TDD) ⚠️ MUST COMPLETE BEFORE 3.3
**CRITICAL: These tests MUST be written and MUST FAIL before ANY implementation**
- [ ] T004 [P] Contract test POST /api/users in tests/contract/test_users_post.py
- [ ] T005 [P] Contract test GET /api/users/{id} in tests/contract/test_users_get.py
- [ ] T006 [P] Integration test user registration in tests/integration/test_registration.py
- [ ] T007 [P] Integration test auth flow in tests/integration/test_auth.py

## Phase 3.3: Core Implementation (ONLY after tests are failing)
- [ ] T008 [P] User model in src/models/user.py
- [ ] T009 [P] UserService CRUD in src/services/user_service.py
- [ ] T010 [P] CLI --create-user in src/cli/user_commands.py
- [ ] T011 POST /api/users endpoint
- [ ] T012 GET /api/users/{id} endpoint
- [ ] T013 Input validation
- [ ] T014 Error handling and logging

## Phase 3.4: Integration
- [ ] T015 Connect UserService to DB
- [ ] T016 Auth middleware
- [ ] T017 Request/response logging
- [ ] T018 CORS and security headers

## Phase 3.5: Polish
- [ ] T019 [P] Unit tests for validation in tests/unit/test_validation.py
- [ ] T020 Performance tests (<200ms)
- [ ] T021 [P] Update docs/api.md
- [ ] T022 Remove duplication
- [ ] T023 Run manual-testing.md

## Dependencies
- Tests (T004-T007) before implementation (T008-T014)
- T008 blocks T009, T015
- T016 blocks T018
- Implementation before polish (T019-T023)

## Parallel Example
```
# Launch T004-T007 together:
Task: "Contract test POST /api/users in tests/contract/test_users_post.py"
Task: "Contract test GET /api/users/{id} in tests/contract/test_users_get.py"
Task: "Integration test registration in tests/integration/test_registration.py"
Task: "Integration test auth in tests/integration/test_auth.py"
```

## Notes
- [P] tasks = different files, no dependencies
- Verify tests fail before implementing
- Commit after each task
- Avoid: vague tasks, same-file conflicts

## Task Generation Rules
*Applied during main() execution*

1. **From Contracts**:
   - Each contract file → contract test task [P]
   - Each endpoint → implementation task

2. **From Data Model**:
   - Each entity → model creation task [P]
   - Relationships → service layer tasks

3. **From User Stories**:
   - Each story → integration test [P]
   - Quickstart scenarios → validation tasks

4. **Ordering**:
   - Setup → Tests → Models → Services → Endpoints → Polish
   - Dependencies block parallel execution

## Validation Checklist
*GATE: Checked by main() before returning*

- [ ] All contracts have corresponding tests
- [ ] All entities have model tasks
- [ ] All tests come before implementation
- [ ] Parallel tasks truly independent
- [ ] Each task specifies exact file path
- [ ] No task modifies same file as another [P] task
132
docs/PRD.md
Normal file
@@ -0,0 +1,132 @@
# Product Requirements Document (PRD) - "Mosquito" Propagation System

**Version**: 1.2 (revised)
**Date**: September 27, 2025

---

## 1. Executive Summary

The "Mosquito" propagation system is a SaaS tool that addresses the core pain points enterprises face in word-of-mouth marketing: **opaque propagation paths, missing incentives, and unmeasurable results**. The product provides a visual, automatically incentivized viral-campaign management platform that helps enterprises systematically plan, run, and analyze referral-driven growth campaigns. Our goal is to cut customers' average customer acquisition cost (CAC) by more than 50% and significantly raise the K-factor, delivering controllable, efficient user growth.

## 2. Product Vision

Make every user share behave like a "mosquito": **precisely targeted, rapidly viral, trackable, and rewardable**. We aim to turn uncontrollable word of mouth into a quantifiable growth engine and become the tool of choice for enterprise viral marketing.

## 3. Background & Problem Statement

With acquisition costs rising, viral marketing built on social relationships is the lowest-cost, highest-conversion way to grow. Yet enterprises commonly face the following problems:
- **Opaque propagation paths**: No clear view of how users share, or of who the key spreaders (KOLs) are.
- **Missing incentives**: It is hard to reward high-value spreaders effectively and automatically, which discourages sharing.
- **Unmeasurable results**: No accurate way to measure a campaign's ROI, such as K-factor, conversion rate, or acquisition cost.
- **Complex campaign management**: Creating and running a viral campaign is cumbersome and requires deep developer involvement.

## 4. Product Goals

- **Business goal**: Cut the average customer acquisition cost (CAC) by more than 50%, and push the K-factor of core campaigns (new users brought in per existing user) above 1.
- **Product goal**: Provide a complete invite-and-refer management toolkit so operations staff can launch, manage, and review viral campaigns without developer involvement.
- **User goals**:
  - **Administrators**: Gain an efficient, low-cost acquisition channel backed by intuitive data for decision-making.
  - **Spreaders**: Get a simple way to invite, real-time feedback on results, and timely rewards.

## 5. User Personas

### 5.1. Campaign Administrator (Operations / Marketing Manager)
- **Profile**: Owns user growth for the product; tracks KPIs such as new users, activity, and cost.
- **Pain points**: Wants more efficient, lower-cost acquisition channels, and needs intuitive data for decisions and reporting.

### 5.2. End User (Spreader / Inviter)
- **Profile**: An existing user of the product or service, willing to share with friends for rewards or social capital.
- **Pain points**: After sharing, cannot see who registered through their link, and has no convenient way to collect the promised rewards.

## 6. Core Features V1.0 (User Stories for V1.0)

### 6.1. Campaign Management
| User Story | Acceptance Criteria | Priority |
| :--- | :--- | :--- |
| **As an administrator**, I want to create a time-limited invite campaign so I can plan and pace my marketing. | 1. Campaign name, start time, and end time are configurable.<br>2. The campaign stops automatically when it expires. | **High** |
| **As an administrator**, I want to open a campaign only to specific user groups, for precision marketing. | 1. Participating user tags or ID lists are configurable. | **High** |
| **As an administrator**, I want to customize the invite page copy and images to match different campaign themes and brand styles. | 1. Copy is editable with a rich-text editor.<br>2. Images can be uploaded or linked. | **High** |
| **As an administrator**, I want ladder-style reward rules, to motivate users to reach higher goals. | 1. Multiple reward tiers are configurable, e.g. "invite 3 people for reward A, invite 10 for reward B". | **High** |
| **As an administrator**, I want multi-level invite rewards with decay coefficients, to encourage users to grow their downstream network and widen reach. | 1. Reward levels are configurable (e.g. up to 3 levels).<br>2. Each level's ratio or fixed amount is configurable. | **High** |
| **As an administrator**, I want rewards to support both points and coupons, to fit different operational scenarios. | 1. Reward type can be points or coupon.<br>2. The points amount or coupon batch ID can be entered. | **High** |
| **(New)** As an administrator, I want to generate and manage API keys for my application, so I can securely integrate my system with "Mosquito". | 1. The admin console supports generating, viewing, resetting, and disabling API keys.<br>2. Keys are bound to a specific campaign or account. | **High** |
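The ladder-style rules above ("invite 3 people for reward A, invite 10 for reward B") can be sketched in plain Java. This is a minimal illustration, not the product's implementation; the `Tier` record and tier values are hypothetical, and tiers are assumed to be sorted by ascending threshold.

```java
import java.util.List;

public class LadderReward {
    // One reward tier: reach `threshold` invites to earn `reward`.
    public record Tier(int threshold, String reward) {}

    // Returns the highest tier whose threshold the invite count has reached,
    // or null if no tier is reached yet. Assumes tiers sorted ascending.
    public static String rewardFor(int invites, List<Tier> tiers) {
        String best = null;
        for (Tier t : tiers) {
            if (invites >= t.threshold()) best = t.reward();
        }
        return best;
    }

    public static void main(String[] args) {
        List<Tier> tiers = List.of(new Tier(3, "A"), new Tier(10, "B"));
        System.out.println(rewardFor(5, tiers));  // A
        System.out.println(rewardFor(12, tiers)); // B
    }
}
```

Whether a user who reaches tier B also keeps tier A depends on the reward mode (`CUMULATIVE` vs `DIFFERENTIAL` in data-model.md); the sketch only selects the highest reached tier.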

### 6.2. Data & Analytics
| User Story | Acceptance Criteria | Priority |
| :--- | :--- | :--- |
| **As an administrator**, I want a dashboard of core campaign metrics in the console, to monitor campaign health in real time. | 1. The dashboard shows PV, UV, participants, new registrations, K-factor, and CAC.<br>2. Data can be viewed by day. | **High** |
| **As an administrator**, I want to view the viral path as a network graph, to quickly locate key spreaders (KOLs). | 1. Invite relationships are visualized as a graph.<br>2. Nodes are clickable and show key user info (e.g. direct/indirect invite counts). | **High** |
| **As an administrator**, I want a "super spreader" leaderboard sorted by invite count, so I can reward or engage those users further. | 1. The leaderboard shows nickname, avatar, and total invite count.<br>2. The leaderboard can be exported. | **Medium** |

### 6.3. End-User Experience
| User Story | Acceptance Criteria | Priority |
| :--- | :--- | :--- |
| **As a participant**, I want easy access to my personal invite link and poster, so I can share them with friends. | 1. A prominent "copy link" button is provided.<br>2. A share poster with a personal QR code can be generated. | **High** |
| **As a participant**, I want to see my invite records and reward details in my profile, so I know my contribution and earnings. | 1. A list shows the friends I invited (and their status, e.g. registered).<br>2. A list shows every reward I received. | **High** |

### 6.4. System Capabilities & Integration
| User Story | Acceptance Criteria | Priority |
| :--- | :--- | :--- |
| **(New)** As a third-party application, when a new user completes registration via an invite link, I want to notify the "Mosquito" system via API, so the inviter is credited and rewarded. | 1. A secure callback API authenticated by API key.<br>2. The API accepts the unique tracking ID carried in the invite link.<br>3. The API has clear success/failure response codes. | **High** |
| **As the system**, I need to detect and block basic reward fraud, to keep campaigns fair and data accurate. | 1. Basic anti-fraud checks based on IP, device fingerprint, etc.<br>2. Rate limiting and source-IP checks on the callback API. | **High** |
| **As the system**, I need to issue rewards automatically, to reduce operating cost and improve user experience. | 1. Points/coupons are issued via API integration with the internal account system. | **Medium** |

***Technical flow note***: *The data flow for the third-party registration callback is:*
1. *User A obtains an invite link from the "Mosquito" system; the link carries a unique tracking ID (e.g. `tracking_id=xyz`).*
2. *User B clicks the link and hits the "Mosquito" server. "Mosquito" records the click event and `tracking_id`, then redirects User B to the third-party application's target page with `tracking_id` in the URL parameters.*
3. *User B completes registration in the third-party application.*
4. *The third-party application's backend calls the "Mosquito" callback API with a pre-generated API key, passing the `tracking_id`.*
5. *"Mosquito" validates the API key and `tracking_id`, confirms the conversion, credits User A, and triggers the reward mechanism.*
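Step 4 of the flow above can be sketched as a server-to-server call in plain Java. This is illustrative only: the `/api/v1/conversions` path and JSON body shape are assumptions (the PRD does not fix them), while the `X-API-Key` header follows the architecture doc. The sketch builds the request without sending it.

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class CallbackRequest {
    // Builds (but does not send) the registration callback from step 4.
    // baseUrl/apiKey/trackingId come from the integrating application's config.
    public static HttpRequest build(String baseUrl, String apiKey, String trackingId) {
        String body = "{\"trackingId\":\"" + trackingId + "\"}";   // assumed body shape
        return HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/api/v1/conversions"))  // hypothetical endpoint path
                .header("X-API-Key", apiKey)                       // auth header per architecture doc
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
    }

    public static void main(String[] args) {
        HttpRequest r = build("https://mosquito.example.com", "a1b2c3d4", "xyz");
        System.out.println(r.method() + " " + r.uri()); // POST https://mosquito.example.com/api/v1/conversions
    }
}
```

In production the request would be sent with `java.net.http.HttpClient` and the success/failure response codes handled as required by acceptance criterion 3.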

## 7. Scope and Boundaries for V1.0

- **IN SCOPE**:
  - SaaS admin console features.
  - A standardized end-user invite page.
  - **(New)** A server-to-server callback API for confirming successful registrations.
  - Points and coupon reward types.
  - A basic anti-fraud mechanism.
- **OUT OF SCOPE**:
  - **Cash rewards**: V1.0 does not handle cash payouts; the flow is complex and high-risk.
  - **A/B testing framework**: Campaign-level A/B testing is deferred to a later version.
  - **Deep integration with external CRM/MA tools**: V1.0 supports data export only.
  - **Client SDK**: V1.0 ships no embeddable app SDK; campaigns are delivered via H5 pages.

## 8. Risks and Assumptions

- **Assumptions**:
  - **Willingness to share**: Given reasonable reward incentives, users will share proactively.
  - **Technical feasibility**: The internal account system exposes a stable, usable API for issuing rewards.
  - **(New)** **Customer integration capability**: Customers' development teams can integrate with our callback API.
- **Risks**:
  - **Anti-fraud risk**: A simple anti-fraud mechanism may be bypassed by professional fraud rings, letting campaign costs spiral.
  - **Data accuracy risk**: Under high concurrency, tracking data may be delayed or lost, affecting analytics and reward issuance.
  - **User privacy risk**: Invite chains involve user data; privacy policies must be strictly followed or legal exposure follows.
  - **(New)** **Integration complexity risk**: A poorly designed API or unclear documentation may cause customer integrations to fail or stall, hurting adoption.

## 9. Non-Functional Requirements

- **Performance**: Invite-link redirects must respond in under 200 ms. At campaign peaks, core APIs must sustain at least 500 QPS.
- **Extensibility**: The reward module and campaign templates should be designed as pluggable, extensible components.
- **Security**: Protect user privacy and prevent data leaks; all critical operations must leave tamper-proof audit logs; externally exposed APIs must have reliable authentication and authorization.
- **Usability**: The admin console should be "no-code": operations staff complete 90% of configuration through guided forms.
- **(New)** **Documentation completeness**: Clear, accurate, developer-friendly API documentation with code samples is mandatory.

## 10. Success Metrics

- **North-star metric**: **K-factor** (new users brought in per participating user). Target > 1.
- **Core business metric**: **Customer acquisition cost (CAC)** (`total campaign cost / new users`).
- **Product experience metrics**: **Campaign setup time** (average time from creating a campaign to launching it), **conversion rate** (`registrations / click UV`).
- **(New)** **Platform adoption metrics**: **API call success rate**, **number of customers with integrated applications**.
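The two headline metrics above are simple ratios; a minimal sketch of the arithmetic, with illustrative numbers:

```java
public class GrowthMetrics {
    // K-factor: new users acquired per participating (existing) user.
    public static double kFactor(long newUsers, long participants) {
        return (double) newUsers / participants;
    }

    // CAC: total campaign cost divided by new users acquired.
    public static double cac(double totalCost, long newUsers) {
        return totalCost / newUsers;
    }

    public static void main(String[] args) {
        // 1,000 participants bring in 1,200 new users at a 6,000 total cost.
        System.out.println(kFactor(1200, 1000)); // 1.2 -> above the K > 1 target
        System.out.println(cac(6000.0, 1200));   // 5.0 cost per new user
    }
}
```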

## 11. Future Roadmap

- **Phase 2 (optimization & expansion)**:
  - **A/B testing framework**: Split-test campaign copy and reward mechanics to optimize conversion scientifically.
  - **More reward types**: Introduce physical goods, phone credit, cash, and other options.
  - **Social platform integration**: Improve the sharing experience on WeChat, Weibo, and similar platforms.
- **Phase 3 (intelligence & platformization)**:
  - **Smart recommendations**: Based on user profiles, push campaigns to the users most likely to participate.
  - **Deep CRM/MA integration**: Two-way sync of user tags and data.
  - **Open platform**: Let third-party developers build custom campaign plugins on top of the system.
99
docs/api.md
Normal file
@@ -0,0 +1,99 @@
# API Documentation

This document describes the API endpoints for activity management and API key management.

## 1. Activity Management (Activities)

### 1.1 Create Activity

- **Endpoint**: `POST /api/v1/activities`
- **Description**: Creates a new marketing activity.
- **Request body**: `application/json`

```json
{
  "name": "Spring Sale",
  "startTime": "2025-03-01T10:00:00+08:00",
  "endTime": "2025-03-31T23:59:59+08:00"
}
```

- **Success response (201 Created)**:

```json
{
  "id": 1,
  "name": "Spring Sale",
  "startTime": "2025-03-01T10:00:00+08:00",
  "endTime": "2025-03-31T23:59:59+08:00",
  // ... other activity attributes
}
```

- **Failure responses**:
  - `400 Bad Request`: if the request data is invalid (e.g. the name is blank, or the end time is before the start time).
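The `400 Bad Request` rule above can be sketched as a plain-Java predicate. This mirrors the documented validation only; the method name and signature are illustrative, not the service's actual API.

```java
import java.time.ZonedDateTime;

public class ActivityValidator {
    // Mirrors the documented 400 rule: name must be non-blank
    // and the end time must be strictly after the start time.
    public static boolean isValid(String name, ZonedDateTime start, ZonedDateTime end) {
        return name != null && !name.isBlank() && end.isAfter(start);
    }

    public static void main(String[] args) {
        ZonedDateTime start = ZonedDateTime.parse("2025-03-01T10:00:00+08:00");
        ZonedDateTime end = ZonedDateTime.parse("2025-03-31T23:59:59+08:00");
        System.out.println(isValid("Spring Sale", start, end)); // true
        System.out.println(isValid("   ", start, end));         // false -> 400
        System.out.println(isValid("Spring Sale", end, start)); // false -> 400
    }
}
```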

### 1.2 Update Activity

- **Endpoint**: `PUT /api/v1/activities/{id}`
- **Description**: Updates an existing activity.
- **Path parameter**: `id` (long) - the activity's unique identifier.
- **Request body**: `application/json`

```json
{
  "name": "Spring Sale (Upgraded)",
  "startTime": "2025-03-01T10:00:00+08:00",
  "endTime": "2025-04-15T23:59:59+08:00"
}
```

- **Success response (200 OK)**: Returns the updated activity object.
- **Failure responses**:
  - `400 Bad Request`: if the request data is invalid.
  - `404 Not Found`: if the given `id` does not exist.

### 1.3 Get Activity Details

- **Endpoint**: `GET /api/v1/activities/{id}`
- **Description**: Returns the details of the activity with the given ID.
- **Path parameter**: `id` (long) - the activity's unique identifier.
- **Success response (200 OK)**: Returns the activity object.
- **Failure responses**:
  - `404 Not Found`: if the given `id` does not exist.

## 2. API Key Management (API Keys)

### 2.1 Create API Key

- **Endpoint**: `POST /api/v1/api-keys`
- **Description**: Creates a new API key for the given activity. The key is returned in plaintext only in this response; save it immediately.
- **Request body**: `application/json`

```json
{
  "activityId": 1,
  "name": "My first key"
}
```

- **Success response (201 Created)**:

```json
{
  "apiKey": "a1b2c3d4-e5f6-7890-1234-567890abcdef"
}
```

- **Failure responses**:
  - `400 Bad Request`: if the request data is invalid (e.g. `activityId` or `name` is missing).
  - `404 Not Found`: if `activityId` does not exist.

### 2.2 Revoke API Key

- **Endpoint**: `DELETE /api/v1/api-keys/{id}`
- **Description**: Revokes (deletes) an API key.
- **Path parameter**: `id` (long) - the API key's unique identifier.
- **Success response (204 No Content)**: No response body.
- **Failure responses**:
  - `404 Not Found`: if the given `id` does not exist.
64
docs/architecture.md
Normal file
@@ -0,0 +1,64 @@
# Technical Architecture: "Mosquito" Propagation System

## 1. Overview

This document defines a clear, unified technical architecture for the "Mosquito" propagation system to guide subsequent development and operations. The project follows a modern web architecture with separated front end and back end.

- **Backend**: A microservice architecture based on **Spring Boot 3**, responsible for all business logic, data processing, and API output.
- **Frontend**: A single-page application (SPA) based on **Vue 3**, providing the admin console for campaign administrators and the user center for end participants.

## 2. Tech Stack

| Area | Choice | Notes |
| :--- | :--- | :--- |
| **Backend framework** | Spring Boot 3.x | Java 17; a strong ecosystem and fast development. |
| **Frontend framework** | Vue 3.x | Composition API and `<script setup>` syntax for readability and reuse. |
| **Build tools** | Maven (backend), Vite (frontend) | Vite provides a very fast dev server and build experience. |
| **Database** | PostgreSQL | A powerful open-source relational database supporting JSON and other rich types. |
| **Data access** | Spring Data JPA / Hibernate | Simplifies database access via ORM. |
| **API security** | Spring Security | Handles API key authentication and possible future user authentication. |
| **Message queue** | RabbitMQ | Asynchronous tasks such as reward issuance, integrated via Spring AMQP. |
| **Cache** | Redis | Caches hot data, such as leaderboards. |
| **UI components** | Element Plus | A mature, rich Vue 3 component library. |
| **Frontend state** | Pinia | Vue 3's officially recommended state library: lightweight and intuitive. |
| **API client** | Axios | A mature HTTP client for frontend-backend communication. |

## 3. System Architecture Diagram

```mermaid
graph TD
    subgraph "Client"
        A[Browser / Mobile] --> B{"Frontend App (Vue 3)"};
    end

    subgraph "Server"
        B --> C{API Gateway / Load Balancer};
        C --> D["Backend App (Spring Boot)"];
        D -- JDBC --> E[PostgreSQL];
        D -- AMQP --> F[RabbitMQ];
        D -- Redis Client --> G[Redis Cache];
        H[Reward Issuance Worker] -- AMQP --> F;
    end

    subgraph "Third-Party Services"
        D --> I{Third-Party Reward System API};
        J[Customer Application] --> C;
    end
```

## 4. Core Module Design

- **Activity Management**: The backend exposes full CRUD APIs; the frontend is form-driven, enabling dynamic campaign configuration.
- **Data Analytics**: The backend precomputes and aggregates daily statistics with scheduled jobs (`@Scheduled`) into summary tables, avoiding the performance cost of real-time computation at query time.
- **User Experience**:
  - **Short links**: A dedicated backend service converts long links to short links and handles redirects.
  - **Poster generation**: The backend renders posters with a Java image-processing library (e.g. `thumbnailator`), with a fallback that returns raw data for frontend `canvas` rendering under high load.
- **System Integration**:
  - **Callback API**: `X-API-Key` authentication and rate limiting are implemented in Spring Security's filter chain.
  - **Async rewards**: A message queue decouples the core flow (callback handling) from the non-core flow (reward issuance), improving responsiveness and reliability.

## 5. Deployment & Operations (DevOps)

- **Containerization**: Both frontend and backend applications are packaged as Docker images.
- **CI/CD**: A GitHub Actions workflow runs tests, builds Docker images, and pushes them to the image registry on every push to the main branch.
- **Deployment**: Kubernetes (K8s) or a managed container service (e.g. AWS ECS, Google Cloud Run) is recommended, for elastic scaling and high availability.
93
docs/data-model.md
Normal file
@@ -0,0 +1,93 @@
# Data Model Design

This document describes the core domain models and the database schema.

## 1. Domain Models

### 1.1 Activity.java

Represents a marketing activity; the core entity of the activity-management feature.

- `id` (Long): unique identifier, primary key.
- `name` (String): activity name.
- `startTime` (ZonedDateTime): activity start time.
- `endTime` (ZonedDateTime): activity end time.
- `targetUserIds` (Set<Long>): IDs of users allowed to participate; an empty set means open to all users.
- `rewardTiers` (List<RewardTier>): the ladder of reward rules.
- `rewardMode` (RewardMode): reward mode (`CUMULATIVE` - stack all reached tiers, `DIFFERENTIAL` - pay the difference).
- `multiLevelRewardRules` (List<MultiLevelRewardRule>): multi-level invite reward rules.

### 1.2 ApiKey.java

Represents an API key authorizing an external system to call a specific activity's APIs.

- `id` (Long): unique identifier, primary key.
- `activityId` (Long): the associated activity ID.
- `name` (String): a human-friendly key name.
- `keyHash` (String): the salted, hashed API key value.
- `salt` (String): the salt used for hashing, stored Base64-encoded.

## 2. Database Schema

### 2.1 activities table

Created by `V1__Create_activities_table.sql`.

```sql
CREATE TABLE activities (
    id BIGINT GENERATED BY DEFAULT AS IDENTITY PRIMARY KEY,
    name VARCHAR(255) NOT NULL,
    start_time TIMESTAMP WITH TIME ZONE NOT NULL,
    end_time TIMESTAMP WITH TIME ZONE NOT NULL
    -- additional columns to be added as requirements evolve
);
```

### 2.2 activity_rewards table

Created by `V2__Create_activity_rewards_table.sql`.

```sql
CREATE TABLE activity_rewards (
    id BIGINT GENERATED BY DEFAULT AS IDENTITY PRIMARY KEY,
    activity_id BIGINT NOT NULL,
    threshold INT NOT NULL,
    reward_type VARCHAR(50) NOT NULL,
    points INT,
    coupon_batch_id VARCHAR(255),
    FOREIGN KEY (activity_id) REFERENCES activities(id)
);
```

### 2.3 multi_level_reward_rules table

Created by `V3__Create_multi_level_reward_rules_table.sql`.

```sql
CREATE TABLE multi_level_reward_rules (
    id BIGINT GENERATED BY DEFAULT AS IDENTITY PRIMARY KEY,
    activity_id BIGINT NOT NULL,
    level INT NOT NULL,
    decay_coefficient DECIMAL(5, 4) NOT NULL,
    FOREIGN KEY (activity_id) REFERENCES activities(id),
    UNIQUE (activity_id, level)
);
```
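A minimal sketch of how the per-level `decay_coefficient` above could be applied: the reward at each invite level is the base reward scaled by that level's coefficient. The coefficients and the floor-rounding choice are illustrative assumptions, not the product's actual rules.

```java
import java.util.Map;

public class MultiLevelReward {
    // Base reward scaled by the per-level decay coefficient
    // (level 1 = direct inviter, level 2 = inviter's inviter, ...).
    // Levels without a configured rule earn nothing.
    public static int rewardAt(int level, int basePoints, Map<Integer, Double> decay) {
        Double c = decay.get(level);
        return c == null ? 0 : (int) Math.floor(basePoints * c);
    }

    public static void main(String[] args) {
        Map<Integer, Double> decay = Map.of(1, 1.0, 2, 0.5, 3, 0.25); // illustrative coefficients
        System.out.println(rewardAt(1, 100, decay)); // 100
        System.out.println(rewardAt(3, 100, decay)); // 25
        System.out.println(rewardAt(4, 100, decay)); // 0 (no rule configured)
    }
}
```

The `UNIQUE (activity_id, level)` constraint in the table guarantees a single coefficient per level per activity, matching the map-based lookup here.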

### 2.4 api_keys table

Created by `V4__Create_api_keys_table.sql`.

```sql
CREATE TABLE api_keys (
    id BIGINT GENERATED BY DEFAULT AS IDENTITY PRIMARY KEY,
    name VARCHAR(255) NOT NULL,
    key_hash VARCHAR(255) NOT NULL UNIQUE,
    salt VARCHAR(255) NOT NULL,
    created_at TIMESTAMP NOT NULL,
    revoked_at TIMESTAMP NULL,
    last_used_at TIMESTAMP NULL
);

CREATE INDEX idx_api_keys_key_hash ON api_keys(key_hash);
```
35
docs/tech-choices.md
Normal file
@@ -0,0 +1,35 @@
# Technology Choices & Decisions

This document records the main technology and architecture decisions made during development, so team members understand the reasoning behind them.

## 1. Core Framework & Language

- **Java 17**: A long-term-support (LTS) release with modern language features (Records, Sealed Classes, etc.) and a stable ecosystem.
- **Spring Boot 3**: The mainstream Java framework; it greatly simplifies configuration and deployment through powerful auto-configuration and dependency management, letting us build robust, production-grade web services quickly.
- **Maven**: Our project-management and build tool; it provides a standard project structure, strong dependency management, and a consistent build process.

## 2. API & Web Layer

- **RESTful API**: The standard practice for web services today: stateless, easy to understand, and a natural fit for front-end/back-end separation.
- **DTO (Data Transfer Object)**: We introduced DTOs in the controller layer (e.g. `CreateActivityRequest`).
  - **Rationale**: Decouple the API's request/response structures from the internal domain model. The benefits:
    1. **API stability**: Changes to the internal domain model do not directly affect the external API.
    2. **Security**: Avoids accidentally exposing sensitive domain-model fields.
    3. **Specialization**: Data structures can be tailored per endpoint, making them clearer and more efficient.
- **Bean Validation**: With `spring-boot-starter-validation` and `@Valid` plus constraint annotations (such as `@NotBlank`) on DTOs, input validation is declarative at the controller layer - a standard, non-invasive approach that keeps business logic clean.
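To make the constraints concrete, here is a plain-Java sketch of what the `@NotBlank`/`@NotNull` annotations on a DTO enforce. The real project uses `spring-boot-starter-validation` rather than manual checks, and the field names here follow the `CreateActivityRequest` shape shown in docs/api.md.

```java
import java.util.ArrayList;
import java.util.List;

public class CreateActivityRequestCheck {
    // DTO carrying only what the API accepts, decoupled from the domain model.
    public record CreateActivityRequest(String name, String startTime, String endTime) {}

    // Hand-rolled equivalent of @NotBlank on name and @NotNull on the times.
    public static List<String> violations(CreateActivityRequest req) {
        List<String> errors = new ArrayList<>();
        if (req.name() == null || req.name().isBlank()) errors.add("name must not be blank");
        if (req.startTime() == null) errors.add("startTime must not be null");
        if (req.endTime() == null) errors.add("endTime must not be null");
        return errors;
    }

    public static void main(String[] args) {
        var bad = new CreateActivityRequest(" ", null, "2025-03-31T23:59:59+08:00");
        System.out.println(violations(bad));
    }
}
```

With Bean Validation these checks run automatically before the controller method body executes, and violations are turned into a `400 Bad Request`.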

## 3. Database & Persistence

- **Flyway**: Manages database migrations through versioned SQL scripts, ensuring consistent, traceable schema evolution across every environment (development, test, production).
- **H2 Database (for tests)**: A lightweight in-memory database well suited to unit and integration tests; it starts fast, needs no external dependency, and keeps the test environment clean and repeatable.
- **PostgreSQL (for production)**: The dependency is pre-declared in `pom.xml`. PostgreSQL is a powerful, stable, reliable open-source relational database suitable for production.

## 4. Security

- **API key storage**: API keys are stored with salted hashing.
  - **Rationale**: Storing plaintext API keys in the database is highly insecure. By generating a unique salt per key, combining it with the key, and hashing (SHA-256), we store only the salt and the hash. Even if the database leaks, an attacker cannot directly recover the original keys, which greatly improves security.
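The salted-hashing decision above can be sketched in plain Java with only JDK classes. This is a minimal illustration of the scheme (random salt, SHA-256 over salt + key, Base64 storage); the actual service code may differ in details such as salt length.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.security.SecureRandom;
import java.util.Base64;

public class ApiKeyHasher {
    // A fresh random salt per key, stored Base64-encoded (matches ApiKey.salt).
    public static String newSalt() {
        byte[] salt = new byte[16];
        new SecureRandom().nextBytes(salt);
        return Base64.getEncoder().encodeToString(salt);
    }

    // SHA-256 over salt bytes followed by key bytes; only this hash is stored.
    public static String hash(String apiKey, String base64Salt) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            md.update(Base64.getDecoder().decode(base64Salt));
            byte[] digest = md.digest(apiKey.getBytes(StandardCharsets.UTF_8));
            return Base64.getEncoder().encodeToString(digest);
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // SHA-256 is guaranteed on every JVM
        }
    }

    public static void main(String[] args) {
        String salt = newSalt();
        String stored = hash("a1b2c3d4-e5f6-7890-1234-567890abcdef", salt);
        // Verification recomputes the hash with the stored salt and compares.
        System.out.println(stored.equals(hash("a1b2c3d4-e5f6-7890-1234-567890abcdef", salt))); // true
        System.out.println(stored.equals(hash("wrong-key", salt)));                            // false
    }
}
```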

## 5. Exception Handling

- **Custom exceptions**: We created business-specific exception classes such as `ActivityNotFoundException` and `ApiKeyNotFoundException`.
- **Rationale**: This makes the code's intent clearer, and `@ResponseStatus` can map specific business exceptions directly to HTTP status codes (e.g. `404 Not Found`), simplifying global exception handling.
37
docs/test-plan.md
Normal file
@@ -0,0 +1,37 @@
# Test Plan

This document outlines the testing strategy, tools, and coverage adopted for the backend API.

## 1. Testing Strategy

We use a layered testing strategy to ensure code quality and functional correctness.

### 1.1 Unit Tests

- **Goal**: Verify that the business logic inside an individual service or component is correct.
- **Scope**: Mainly the public methods of `ActivityService`.
- **Implementation**: Test cases live in `ActivityServiceTest.java`. At this layer, all external dependencies (database, other services) are mocked, keeping tests isolated and fast.
- **Covered scenarios**:
  - Business-rule validation (e.g. legality of activity time windows).
  - Boundary conditions (e.g. critical values in reward calculations).
  - Failure paths (e.g. whether the expected exception is thrown when a dependency fails).

### 1.2 Integration Tests

- **Goal**: Verify the complete request-response flow from API endpoint to service layer.
- **Scope**: All API controllers under the `controller` package.
- **Implementation**: Test cases live in `ActivityControllerTest.java` and `ApiKeyControllerTest.java`, using Spring Boot's `@WebMvcTest` with `MockMvc`.
  - `@WebMvcTest` provides a lightweight test environment that loads only web-layer beans (controllers, `ObjectMapper`, etc.) rather than the full Spring application context, speeding up the tests.
  - `ActivityService` is mocked with `@MockBean` at this layer, so the tests focus on controller behavior: URL mapping, HTTP methods, request/response (de)serialization, parameter validation, and returning the correct HTTP status codes.
- **Covered scenarios**:
  - **Happy path**: Valid requests are handled correctly and return `2xx` with the expected response body.
  - **Validation failure**: Invalid request bodies (e.g. blank fields) trigger Bean Validation and return `400 Bad Request`.
  - **Resource not found**: Requesting a nonexistent resource (e.g. `GET /api/v1/activities/999`) returns `404 Not Found`.

## 2. Test Tools

- **JUnit 5**: The dominant testing framework in the Java world.
- **Mockito**: Creates and configures mock objects to isolate tests.
- **Spring Boot Test**: Seamless Spring-integrated test support, including `@SpringBootTest`, `@WebMvcTest`, and `@MockBean`.
- **MockMvc**: Calls and verifies Spring MVC controllers end to end without starting a real HTTP server.
- **H2 Database**: An in-memory database used to run the Flyway migrations quickly during tests, verifying the SQL scripts' syntax.
113
pom.xml
Normal file
@@ -0,0 +1,113 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.example</groupId>
    <artifactId>mosquito</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>jar</packaging>

    <name>mosquito</name>
    <description>Mosquito Propagation System</description>

    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>3.1.5</version> <!-- Use a specific, recent version -->
        <relativePath/> <!-- lookup parent from repository -->
    </parent>

    <properties>
        <java.version>17</java.version>
        <testcontainers.version>1.19.1</testcontainers.version>
    </properties>

    <dependencyManagement>
        <dependencies>
            <!-- Supplies versions for the versionless Testcontainers artifacts below -->
            <dependency>
                <groupId>org.testcontainers</groupId>
                <artifactId>testcontainers-bom</artifactId>
                <version>${testcontainers.version}</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>

    <dependencies>
        <!-- Core Spring Boot -->
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-validation</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-data-jpa</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-amqp</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-data-redis</artifactId>
        </dependency>

        <!-- Database Migration -->
        <dependency>
            <groupId>org.flywaydb</groupId>
            <artifactId>flyway-core</artifactId>
        </dependency>

        <!-- Database -->
        <dependency>
            <groupId>org.postgresql</groupId>
            <artifactId>postgresql</artifactId>
            <scope>runtime</scope>
        </dependency>

        <!-- Utils -->
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
            <optional>true</optional>
        </dependency>

        <!-- Spring Boot 3 processes the Jakarta namespace (jakarta.annotation),
             not the legacy javax.annotation artifact -->
        <dependency>
            <groupId>jakarta.annotation</groupId>
            <artifactId>jakarta.annotation-api</artifactId>
        </dependency>

        <!-- Testing -->
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>com.h2database</groupId>
            <artifactId>h2</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.testcontainers</groupId>
            <artifactId>junit-jupiter</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>it.ozimov</groupId>
            <artifactId>embedded-redis</artifactId>
            <version>0.7.3</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.amqp</groupId>
            <artifactId>spring-rabbit-test</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>

</project>
86
specs/001-activity-management/plan.md
Normal file
@@ -0,0 +1,86 @@
# Implementation Plan: 001 - Activity Management

This document is the technical implementation plan for the "Activity Management" feature specification.

## 1. Overall Approach

Use a decoupled front-end/back-end architecture. The backend exposes RESTful APIs and handles all business logic and data persistence. The frontend provides a responsive, user-friendly admin interface where administrators create and configure activities.

## 2. Backend Tasks

### 2.1. Database Schema Design

The following tables are needed to support activity management:

- **`activities` (activities table)**
  - `id` (primary key)
  - `name` (activity name, string)
  - `start_time_utc` (start time, datetime)
  - `end_time_utc` (end time, datetime)
  - `target_users_config` (target-user configuration, json) - stores a user-ID list or tags
  - `page_content_config` (page-content configuration, json) - stores copy and image URLs
  - `reward_calculation_mode` (reward calculation mode, enum: `delta`, `cumulative`)
  - `status` (activity status, enum: `draft`, `active`, `ended`)

- **`activity_rewards` (tiered reward rules table)**
  - `id` (primary key)
  - `activity_id` (foreign key to `activities`)
  - `invite_threshold` (invitation-count threshold, integer)
  - `reward_type` (reward type, enum: `points`, `coupon`)
  - `reward_value` (reward value, string) - points amount or coupon batch ID
  - `skip_validation` (skip validation, boolean)

- **`multi_level_reward_rules` (multi-level reward rules table)**
  - `id` (primary key)
  - `activity_id` (foreign key to `activities`)
  - `level` (reward level, integer, e.g. 2, 3)
  - `coefficient_percent` (decay coefficient as a percentage, integer)

- **`api_keys` (API keys table)**
  - `id` (primary key)
  - `key_hash` (hash of the key, string) - **never store the plaintext key**
  - `scope_type` (scope type, enum: `account`, `activity`)
  - `scope_id` (scope ID, integer) - the associated account ID or activity ID
  - `created_at` (creation time, datetime)
  - `last_used_at` (last-used time, datetime, optional)

### 2.2. API Endpoint Design

Design the following RESTful API endpoints:

- `POST /api/v1/activities`: **Create activity**
  - Request body: the full activity configuration.
  - Response: the created activity's details.

- `PUT /api/v1/activities/{id}`: **Update activity**
  - Request body: the activity configuration fields to update.
  - Response: the updated activity's details.

- `GET /api/v1/activities/{id}`: **Get activity details**
  - Response: the complete configuration of the activity with the given ID.

- `POST /api/v1/api-keys`: **Create API key**
  - Request body: `{ scope_type, scope_id }`
  - Response: the newly generated API key (shown only this once).

- `DELETE /api/v1/api-keys/{id}`: **Revoke API key**
  - Response: a success status code.

## 3. Frontend Tasks

### 3.1. UI Component Design

A set of Vue components is needed to build the activity create/edit form:

- **`ActivityEditor.vue`**: the main form container; manages the state of the entire activity configuration.
- **`GeneralSettings.vue`**: configures the activity name and times (must handle UTC-to-local time conversion).
- **`TargetingEditor.vue`**: configures target users via text input or file upload.
- **`PageContentEditor.vue`**: rich-text editor plus image upload/link component (enforces the 30 MB size limit and format validation).
- **`RewardRuleEditor.vue`**: the core component; configures tiered and multi-level rewards, including reward-mode switching and a live preview of the calculation example.
- **`ApiKeyManager.vue`**: lists API keys (without showing the keys themselves), generates new keys (shown once in a dialog), and revokes keys.

### 3.2. State Management and API Integration

- Use a state-management library such as `Pinia` to manage the complex form data.
- Create an API client module (`apiClient.js`) that wraps all requests to the backend API.
- Call the APIs from the components and handle loading and error states to provide a smooth user experience.
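The "generate once, store only a hash" key flow above can be sketched as follows. This is a minimal illustration, not the project's actual implementation: the class and method names are invented, and it uses an unsalted SHA-256 digest for brevity even though the task list calls for salted hashing.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.security.SecureRandom;
import java.util.Base64;
import java.util.HexFormat;

// Hypothetical helper: generates a random key (returned to the caller exactly
// once) and a SHA-256 digest suitable for the api_keys.key_hash column.
// Production code should add a salt/pepper as the task list requires.
public class ApiKeyGenerator {

    private static final SecureRandom RANDOM = new SecureRandom();

    public static String newRawKey() {
        byte[] bytes = new byte[32];                // 256 bits of entropy
        RANDOM.nextBytes(bytes);
        return Base64.getUrlEncoder().withoutPadding().encodeToString(bytes);
    }

    public static String hash(String rawKey) {
        try {
            MessageDigest digest = MessageDigest.getInstance("SHA-256");
            byte[] hashed = digest.digest(rawKey.getBytes(StandardCharsets.UTF_8));
            return HexFormat.of().formatHex(hashed); // 64 hex chars; never store the plaintext
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException("SHA-256 unavailable", e);
        }
    }
}
```

Only the hash is persisted; the raw key exists solely in the creation response, which is why the UI can never show it again and revocation plus regeneration is the only recovery path.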
33
specs/001-activity-management/spec.md
Normal file
@@ -0,0 +1,33 @@
# Feature Specification: 001 - Activity Management

This document defines the "Activity Management" features of the "Mosquito" viral-growth system.

## 1. Key Entities

- **Activity**: the core unit of a marketing campaign; carries lifecycle, rules, and reward information.
- **User**: a participant in the system, with two roles: `administrator` and `promoter`.
- **Reward**: an incentive item; V1.0 supports `points` and `coupons`.
- **Invitation**: a referral relationship between users; the basis of the viral network.
- **API Key**: the credential used for authentication between third-party systems and this system.
- **Tracking ID**: the unique ID identifying a single invite-to-registration flow.

## 2. User Stories & Acceptance Criteria

| User Story | Acceptance Criteria | Priority |
| :--- | :--- | :--- |
| **As an administrator**, I want to create a time-limited invitation activity so I can plan and pace marketing campaigns. | 1. The activity name, start time, and end time can be set.<br>2. The activity stops automatically when it expires.<br>3. **(Clarified)** If the end time is earlier than the start time, the system must block the save and show a clear message. | **High** |
| **As an administrator**, I want to limit an activity to specific user segments for precise targeting. | 1. Eligible user tags or an ID list can be configured per activity.<br>2. **(Clarified)** Non-target users who open the activity link are redirected to a friendly landing page (e.g. "This activity is open to selected users only"). | **High** |
| **As an administrator**, I want to customize the invitation page's copy and images to match different campaign themes and brand styles. | 1. Copy can be edited with a rich-text editor.<br>2. Images can be uploaded or linked.<br>3. **(Clarified)** If an uploaded image exceeds 30 MB or has an unsupported format, show a clear message: "Not supported, please upload again." | **High** |
| **As an administrator**, I want tiered reward rules so users are motivated to reach higher goals. | 1. Multiple reward tiers can be configured, e.g. "invite 3 people for reward A, invite 10 for reward B".<br>2. **(Clarified)** The administrator can choose the reward calculation mode: "delta" (default) or "cumulative". | **High** |
| **As an administrator**, I want multi-level invitation rewards with a decay coefficient to encourage users to grow their downstream network and widen the reach. | 1. Reward levels can be configured (up to e.g. 3 levels).<br>2. Each level's ratio or fixed value can be configured.<br>3. **(Clarified)** The decay coefficient defaults to a percentage; the settings UI must show a live calculation example. | **High** |
| **As an administrator**, I want rewards to support both points and coupons to fit different operational scenarios. | 1. The reward type can be points or a coupon.<br>2. The points amount or coupon batch ID can be entered.<br>3. **(Clarified)** The system validates the coupon batch ID by default; the administrator may choose to skip validation. | **High** |
| **As an administrator**, I want to generate and manage API keys for my application so I can integrate my system with "Mosquito" securely. | 1. **(Clarified)** The admin console can generate API keys; a key is shown only once at creation and can only be reset, never viewed again.<br>2. Keys are associated with a specific activity or account. | **High** |

## 3. Clarifications & Edge Cases

- **Time zones**: all times are stored and computed in **UTC**; the frontend converts them to local time based on the user's browser settings.
- **Tiered reward modes**:
  - **Delta (default)**: when a user reaches a higher tier, they receive the difference between that tier's reward and the rewards already received.
  - **Cumulative**: when a user reaches a higher tier, they receive that tier's full reward, with no deduction for lower tiers.
- **Multi-level reward example**: the settings UI must show a live calculation example, e.g. `base reward: 100 points, L2 decay: 50%, L3 decay: 20% -> L2 receives: 50 points, L3 receives: 20 points`.
- **API key scope**: keys default to **activity-level**. Administrators can also generate **account-level** keys in account settings, and when creating an activity choose to use an account-level key or create a new activity-level key.
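The two tiered-reward modes and the multi-level decay rule above can be sketched in a few lines; the class and method names here are illustrative, not part of the specification.

```java
// Illustrative calculator for the reward rules described above. In "delta"
// mode a user who reaches a higher tier receives only the difference from
// what was already paid; in "cumulative" mode the full tier reward is paid
// on top. decayedReward mirrors the percent-based multi-level decay example.
public class RewardCalc {

    /** Points to issue when a user who already received alreadyPaid reaches a new tier. */
    public static int tierPayout(int newTierValue, int alreadyPaid, boolean cumulative) {
        return cumulative ? newTierValue : Math.max(0, newTierValue - alreadyPaid);
    }

    /** Reward for an upstream inviter at a given level, using a percent decay coefficient. */
    public static int decayedReward(int baseReward, int coefficientPercent) {
        return baseReward * coefficientPercent / 100;
    }
}
```

With the document's own example (base reward 100 points, L2 decay 50%, L3 decay 20%), `decayedReward(100, 50)` yields 50 points for L2 and `decayedReward(100, 20)` yields 20 points for L3.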
38
specs/001-activity-management/tasks.md
Normal file
@@ -0,0 +1,38 @@
# Development Task List: 001 - Activity Management

The following development tasks are derived from the implementation plan for the "Activity Management" feature.

## Backend

### Database

- [x] **BE-DB-01**: Create the database migration script for the `activities` table.
- [x] **BE-DB-02**: Create the migration script for the `activity_rewards` table.
- [x] **BE-DB-03**: Create the migration script for the `multi_level_reward_rules` table.
- [x] **BE-DB-04**: Create the migration script for the `api_keys` table, ensuring the `key_hash` column is indexed.

### API & Business Logic

- [x] **BE-API-01**: Implement the create-activity (`POST /api/v1/activities`) business logic, including input validation.
- [x] **BE-API-02**: Implement the update-activity (`PUT /api/v1/activities/{id}`) business logic.
- [x] **BE-API-03**: Implement the get-activity-details (`GET /api/v1/activities/{id}`) business logic.
- [x] **BE-API-04**: Implement API key creation (`POST /api/v1/api-keys`) with secure storage (salted hashing).
- [x] **BE-API-05**: Implement API key revocation (`DELETE /api/v1/api-keys/{id}`).
- [x] **BE-TEST-01**: Write unit and integration tests for all `activities` and `api-keys` API endpoints.

## Frontend

### UI Components

- [ ] **FE-UI-01**: Build the `ActivityEditor` core layout component.
- [ ] **FE-UI-02**: Build the `GeneralSettings` component with name and time pickers plus client-side validation logic.
- [ ] **FE-UI-03**: Build the `TargetingEditor` component for configuring target users.
- [ ] **FE-UI-04**: Build the `PageContentEditor` component, integrating the rich-text editor and image upload (with client-side validation).
- [ ] **FE-UI-05**: Build the `RewardRuleEditor` component, handling the complex tiered and multi-level reward configuration with a live calculation preview.
- [ ] **FE-UI-06**: Build the `ApiKeyManager` component, including the key list (keys masked), generation, and revocation.

### State Management & Integration

- [ ] **FE-STATE-01**: Configure the `Pinia` store (per the plan) that manages the global state of `ActivityEditor`.
- [ ] **FE-API-01**: Create an API client service that wraps all fetch requests to the backend.
- [ ] **FE-INT-01**: Integrate the API client into all relevant UI components, with proper loading, error, and success UI state feedback.
52
specs/002-data-analytics/plan.md
Normal file
@@ -0,0 +1,52 @@
# Implementation Plan: 002 - Data & Analytics

This document is the technical implementation plan for the "Data & Analytics" feature specification.

## 1. Overall Approach

The analytics features have demanding performance requirements. To avoid the latency of real-time computation, core metrics are pre-aggregated by a scheduled job. The frontend uses dedicated charting and network-graph libraries for rich visualizations.

## 2. Backend Tasks

### 2.1. Data Processing & Storage

- **Create a pre-aggregated stats table**:
  - Add a `daily_activity_stats` table with columns `activity_id`, `date`, `pv`, `uv`, `participants`, `new_registrations`, `k_factor`, and `total_rewards_cost`.
- **Build a scheduled job (cron)**:
  - Create a daily job that computes the previous day's core metrics and writes them into `daily_activity_stats`.
- **Caching strategy**:
  - Cache the "super spreader leaderboard" query results (e.g. in Redis, with a 5-10 minute TTL) to reduce database load.

*Technology note*: in V1.0, the viral-network graph query can be implemented with a recursive SQL query (CTE). If performance becomes a bottleneck later, a graph database such as Neo4j can be introduced.

### 2.2. API Endpoint Design

- `GET /api/v1/activities/{id}/stats`: **Get dashboard data**
  - Query params: `start_date`, `end_date`.
  - Logic: query `daily_activity_stats` for the given date range.

- `GET /api/v1/activities/{id}/graph`: **Get viral-network graph data**
  - Query params: `center_user_id`, `depth`.
  - Logic: starting from the center user, recursively query upstream/downstream relationships to the given depth and return the node and edge sets.

- `GET /api/v1/activities/{id}/leaderboard`: **Get the super-spreader leaderboard**
  - Query params: `sort_by` (enum: `direct`, `total`), `page`, `limit`.
  - Logic: run the sorted, paginated query, reading from the cache first.

- `GET /api/v1/activities/{id}/leaderboard/export`: **Export the leaderboard**
  - Logic: query the full leaderboard and return it as a CSV file stream.

## 3. Frontend Tasks

### 3.1. UI Component Design

- **`AnalyticsDashboard.vue`**: the main analytics page container, holding the date picker and the chart/leaderboard components.
- **`DateRangePicker.vue`**: a reusable date-range component with presets (yesterday, last 7 days, etc.) and a custom range.
- **`StatsChart.vue`**: visualizes the core metrics as line or bar charts using `ECharts for Vue`.
- **`NetworkGraphViewer.vue`**: renders the network graph using `Vue Flow`, with lazy node loading, click-to-expand, zoom, and pan.
- **`LeaderboardTable.vue`**: a sortable table with a sort-toggle control and an "Export CSV" button.

### 3.2. API Integration

- Add request functions for the four data endpoints above to the API client module.
- Manage data fetching, loading state, and error handling centrally in `AnalyticsDashboard.vue`, distributing data to the child components.
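The caching strategy above can be illustrated with a tiny in-process TTL cache; this is a stand-in for the planned Redis cache, with invented names, used only to show the read-through-with-TTL pattern the leaderboard query relies on.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Supplier;

// Minimal stand-in for the Redis caching strategy above: leaderboard query
// results are cached for a fixed TTL (5-10 minutes in the plan) so repeated
// requests within that window skip the database entirely.
public class TtlCache<V> {

    private record Entry<V>(V value, long expiresAtMillis) {}

    private final Map<String, Entry<V>> store = new ConcurrentHashMap<>();
    private final long ttlMillis;

    public TtlCache(long ttlMillis) {
        this.ttlMillis = ttlMillis;
    }

    public V get(String key, Supplier<V> loader) {
        long now = System.currentTimeMillis();
        Entry<V> hit = store.get(key);
        if (hit != null && hit.expiresAtMillis() > now) {
            return hit.value();                       // cache hit: no DB query
        }
        V fresh = loader.get();                       // cache miss: run the query
        store.put(key, new Entry<>(fresh, now + ttlMillis));
        return fresh;
    }
}
```

In the real service the `loader` would be the sorted, paginated leaderboard query, and Redis (rather than a local map) would hold the entries so all API instances share the cache.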
28
specs/002-data-analytics/spec.md
Normal file
@@ -0,0 +1,28 @@
# Feature Specification: 002 - Data & Analytics

This document defines the "Data & Analytics" features of the "Mosquito" viral-growth system.

## 1. User Stories & Acceptance Criteria

| User Story | Acceptance Criteria | Priority |
| :--- | :--- | :--- |
| **As an administrator**, I want a dashboard of core activity metrics so I can monitor activity health in real time. | 1. The dashboard shows PV, UV, participants, new registrations, K-factor, and CAC.<br>2. **(Clarified)** The dashboard has a time-range picker supporting "today (default)", "yesterday", "last 7 days", "this month", "last month", and custom start/end dates.<br>3. **(Clarified)** Each metric has a clear definition (see the clarifications below). | **High** |
| **As an administrator**, I want to view user viral paths as a network graph so I can quickly locate key spreaders (KOLs). | 1. Invitation relationships between users are shown graphically.<br>2. Nodes are clickable and show the user's key info (user ID, nickname, direct invites, total invites, registration time).<br>3. **(Clarified)** By default only one layer above and below the target user is shown; click-to-expand is available, up to 5 layers. | **High** |
| **As an administrator**, I want a "super spreader" leaderboard sorted by invitation count so I can give them extra rewards or engagement. | 1. The leaderboard shows nickname, avatar, and total invites.<br>2. **(Clarified)** Sorted by "direct invites" by default, with a toggle to sort by "total invites".<br>3. **(Clarified)** The current leaderboard can be exported as a CSV file.<br>4. **(Clarified)** Ties are broken by who reached the count first. | **Medium** |

## 2. Clarifications & Edge Cases

- **Metric definitions**:
  - **Participants**: the number of unique users who successfully shared an invitation link.
  - **K-factor**: `K = (total invite conversions / total shares)`. This simplified K-factor measures the attractiveness of the shared content in V1.0.
  - **CAC (customer acquisition cost)**: `total cost / new users`. In V1.0, "total cost" counts only the total value of issued rewards.

- **Graph performance**:
  - For performance, the graph initially shows only the target user and their directly connected upstream/downstream users.
  - Clicking a node asynchronously loads and expands that node's next layer of relationships.
  - From the initial node, users may expand the network at most 5 layers deep.

- **Leaderboard rules**:
  - **Sorting**: a dropdown or toggle lets the administrator switch between the "by direct invites" and "by total invites" views.
  - **Tie-breaking**: when the sort value is equal, the user who reached that value first ranks higher.
  - **Export**: exporting generates a CSV of the current view (sorting and filters applied).
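The V1.0 metric formulas above are simple ratios; the following sketch (invented class and method names) makes them concrete, including the zero-denominator guards the dashboard will need.

```java
// Illustrative implementation of the V1.0 metric definitions above:
// K = total invite conversions / total shares, and
// CAC = total cost (issued reward value in V1.0) / new users acquired.
public class Metrics {

    /** Simplified K-factor; returns 0 when there are no shares yet. */
    public static double kFactor(long conversions, long shares) {
        return shares == 0 ? 0.0 : (double) conversions / shares;
    }

    /** Customer acquisition cost; returns 0 when there are no new users yet. */
    public static double cac(double totalRewardCost, long newUsers) {
        return newUsers == 0 ? 0.0 : totalRewardCost / newUsers;
    }
}
```

For example, 50 conversions from 200 shares gives K = 0.25, and 1000 units of issued rewards for 40 new users gives a CAC of 25.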
34
specs/002-data-analytics/tasks.md
Normal file
@@ -0,0 +1,34 @@
# Development Task List: 002 - Data & Analytics

The following development tasks are derived from the implementation plan for the "Data & Analytics" feature.

## Backend

### Data Layer

- [x] **BE-DB-05**: Create the migration script for the `daily_activity_stats` table.
- [x] **BE-CRON-01**: Implement the daily scheduled job that aggregates raw data into `daily_activity_stats`.
- [x] **BE-CACHE-01**: Configure and implement Redis caching for the leaderboard query.

### API & Business Logic

- [ ] **BE-API-07**: Implement the dashboard data endpoint (`GET /api/v1/activities/{id}/stats`).
- [ ] **BE-API-08**: Implement the viral-network graph endpoint (`GET /api/v1/activities/{id}/graph`), including the recursive query.
- [ ] **BE-API-09**: Implement the leaderboard endpoint (`GET /api/v1/activities/{id}/leaderboard`), including cache handling.
- [ ] **BE-API-10**: Implement the leaderboard CSV export (`GET /api/v1/activities/{id}/leaderboard/export`).
- [ ] **BE-TEST-02**: Write unit and integration tests for all analytics API endpoints.

## Frontend

### UI Components

- [ ] **FE-UI-07**: Build the `AnalyticsDashboard` page container component.
- [ ] **FE-UI-08**: Build the reusable `DateRangePicker` component.
- [ ] **FE-UI-09**: Build the `StatsChart` component, integrating a chart library such as `ECharts` (per the plan) for the visualizations.
- [ ] **FE-UI-10**: Build the `NetworkGraphViewer` component, integrating a network-graph library such as `Vue Flow` (per the plan), with lazy loading and the interaction features.
- [ ] **FE-UI-11**: Build the `LeaderboardTable` component with frontend sort toggling and the CSV export trigger.

### State Management & Integration

- [ ] **FE-API-02**: Add all analytics request functions to the API client.
- [ ] **FE-INT-02**: Integrate the API calls in the `AnalyticsDashboard` container, managing the page's data flow, loading, and error states.
54
specs/003-user-experience/plan.md
Normal file
@@ -0,0 +1,54 @@
# Implementation Plan: 003 - User-Facing Experience

This document is the technical implementation plan for the "User-Facing Experience" feature specification.

## 1. Overall Approach

The user-facing experience is key to conversion and must deliver high page performance and smooth interactions. The backend provides fast-responding APIs; the frontend optimizes large lists with techniques such as infinite scrolling.

## 2. Backend Tasks

### 2.1. New Services & Database Design

- **Short Link Service**
  - **Database**: create a `short_links` table with `id`, `code` (unique index), `original_url`, and `created_at`.
  - **Internal API**: create an internal `POST /api/v1/internal/shorten` endpoint that generates and stores short links.
  - **Public redirect**: create a public `GET /r/{code}` endpoint that performs the short link's 302 redirect.

- **Poster Generation Service**
  - **Technology**: introduce an image-processing library (e.g. `sharp` for Node.js or `Pillow` for Python).
  - **API design**: create a `GET /api/v1/me/poster` endpoint. By default it returns an `image/png` file stream. When called with `?render=client`, it instead returns the JSON needed to build the poster (background image URL, copy, QR-code data, etc.), supporting the client-side-rendering fallback.

- **Database updates**
  - Add a `status` column (enum: `clicked`, `registered`, `ordered`) to the `invitations` table (or equivalent) to track invitation state.

### 2.2. API Endpoint Design

- `GET /api/v1/me/invitation-info`: **Get the current user's invitation info**
  - Response: `{ short_link, ... }`, returning the user's pre-generated personal short link.

- `GET /api/v1/me/invited-friends`: **Get the invited-friends list (paginated)**
  - Query params: `page`, `limit`.
  - Response: the privacy-processed friend list (nickname, masked phone number, avatar) with each friend's status.

- `GET /api/v1/me/rewards`: **Get the reward history (paginated)**
  - Query params: `page`, `limit`.
  - Response: the current user's reward records.

## 3. Frontend Tasks

### 3.1. UI Component Design

- **`UserCenter.vue`**: the main user-center page, combining the sharing and history views.
- **`ShareModule.vue`**: displays the sharing info, including:
  - a one-click copy button for the short link;
  - the share poster, with the server-side/client-side rendering logic.
- **`InfiniteScrollList.vue`**: a reusable list component encapsulating the "scroll to the bottom to load the next page" behavior.
- **`InvitedFriendItem.vue`**: renders a single entry in the "invitation history" list, showing the friend's privacy-protected info and invitation status.
- **`RewardItem.vue`**: renders a single entry in the "reward details" list.

### 3.2. API Integration & State Management

- Add the user-center request functions to the API client.
- Use `Pinia` for user state, combined with `useSWRV` or a custom `useFetch` composable to manage paginated data fetching, caching, and the infinite-scroll state.
- Implement the fallback in `ShareModule.vue`: if fetching the server-rendered poster fails or times out, switch automatically to client-side rendering.
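The short-link code stored in `short_links.code` can be produced as a random base62 string; the generator below is a sketch with assumed alphabet and length (the plan does not specify either). Uniqueness is ultimately enforced by the table's unique index, with a retry on collision.

```java
import java.security.SecureRandom;

// Sketch of the code generator behind POST /api/v1/internal/shorten.
// The base62 alphabet and the code length are assumptions, not part of
// the plan; the short_links.code unique index guards against collisions.
public class ShortCodeGenerator {

    private static final String ALPHABET =
            "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz";
    private static final SecureRandom RANDOM = new SecureRandom();

    public static String nextCode(int length) {
        StringBuilder sb = new StringBuilder(length);
        for (int i = 0; i < length; i++) {
            sb.append(ALPHABET.charAt(RANDOM.nextInt(ALPHABET.length())));
        }
        return sb.toString();
    }
}
```

A 6-character base62 code gives 62^6 (about 57 billion) possibilities, which comfortably covers the `GET /r/{code}` redirect namespace for V1.0.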
29
specs/003-user-experience/spec.md
Normal file
@@ -0,0 +1,29 @@
# Feature Specification: 003 - User-Facing Experience

This document defines the "User-Facing Experience" features of the "Mosquito" viral-growth system.

## 1. User Stories & Acceptance Criteria

| User Story | Acceptance Criteria | Priority |
| :--- | :--- | :--- |
| **As a participant**, I want an easy way to get my personal invitation link and poster so I can share them with friends. | 1. A one-click "copy link" button is prominently placed on the page.<br>2. **(Clarified)** The copied link is a short link.<br>3. A share poster with a personal QR code can be generated.<br>4. **(Clarified)** Poster content is configured by the administrator per activity; posters are rendered server-side by default, with client-side rendering as a high-load fallback. | **High** |
| **As a participant**, I want to see my invitation history and reward details in my user center so I can track my contributions and earnings. | 1. A list shows the friends I invited and their statuses.<br>2. **(Clarified)** Friend info includes nickname, avatar, and a partially masked phone number.<br>3. **(Clarified)** Friend statuses include "clicked but not registered", "registered", "ordered", etc.<br>4. A list shows every reward I received.<br>5. **(Clarified)** Both the "invitation history" and "reward details" lists paginate via infinite scrolling. | **High** |

## 2. Clarifications & Edge Cases

- **Poster generation**:
  - Images are generated server-side by default to guarantee consistent rendering across platforms.
  - Service load must be monitored; when the image-generation queue grows too long or CPU usage is too high, the system should automatically switch to client-side dynamic rendering as a fallback.

- **Link format**:
  - Every invitation link shown to users must go through the short link service, producing a short form like `t.cn/xxxx`.

- **Friend statuses**:
  - V1.0 must track and display these statuses: `clicked`, `registered`, `ordered`.
  - The status list should be extensible to cover future conversion events.

- **Privacy protection**:
  - In the "invited friends" list, phone numbers must be masked, e.g. `138****1234`, showing only the first three and last four digits.

- **List loading**:
  - Both lists (invitation history and reward details) automatically load the next page when the user scrolls to the bottom, with no "load more" button.
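The phone-masking rule above (keep the first three and last four digits, e.g. `138****1234`) is mechanical enough to sketch directly; the class name and the pass-through behavior for too-short inputs are assumptions.

```java
// Illustrative masking for the privacy rule above: keep the first three and
// last four digits and replace the middle with asterisks. Numbers too short
// to mask are returned unchanged (an assumed policy, not from the spec).
public class PhoneMask {

    public static String mask(String phone) {
        if (phone == null || phone.length() < 7) {
            return phone;
        }
        return phone.substring(0, 3)
                + "*".repeat(phone.length() - 7)
                + phone.substring(phone.length() - 4);
    }
}
```

Masking should happen server-side in the `GET /api/v1/me/invited-friends` response, so the raw number never reaches the client.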
35
specs/003-user-experience/tasks.md
Normal file
@@ -0,0 +1,35 @@
# Development Task List: 003 - User-Facing Experience

The following development tasks are derived from the implementation plan for the "User-Facing Experience" feature.

## Backend

### Core Services & Database

- [ ] **BE-DB-06**: Create the migration script for the `short_links` table.
- [ ] **BE-DB-07**: Create the migration script adding the `status` column to the `invitations` table.
- [ ] **BE-SVC-01**: Implement the short link generation service, including the internal `POST /api/v1/internal/shorten` endpoint.
- [ ] **BE-SVC-02**: Implement the public short-link redirect endpoint `GET /r/{code}`.
- [ ] **BE-SVC-03**: Implement the poster generation service `GET /api/v1/me/poster`, supporting both the image and JSON response modes.

### API & Business Logic

- [ ] **BE-API-11**: Implement the user invitation-info endpoint (`GET /api/v1/me/invitation-info`).
- [ ] **BE-API-12**: Implement the invited-friends endpoint (`GET /api/v1/me/invited-friends`), including pagination and privacy masking.
- [ ] **BE-API-13**: Implement the user rewards endpoint (`GET /api/v1/me/rewards`), including pagination.
- [ ] **BE-TEST-03**: Write unit and integration tests for all user-facing API endpoints.

## Frontend

### UI Components

- [ ] **FE-UI-12**: Build the `UserCenter` page layout component.
- [ ] **FE-UI-13**: Build the `ShareModule` component with short link copying, poster display, and the client-side-rendering fallback logic.
- [ ] **FE-UI-14**: Build a reusable `InfiniteScrollList` component.
- [ ] **FE-UI-15**: Build the `InvitedFriendItem` and `RewardItem` list-item components.

### State Management & Integration

- [ ] **FE-API-03**: Add all user-facing request functions to the API client.
- [ ] **FE-INT-03**: Integrate the `InfiniteScrollList` component with the backend pagination endpoints using `useSWRV` or a similar composable (per the plan).
- [ ] **FE-INT-04**: In `ShareModule`, implement the call to the poster endpoint and the client-side-rendering fallback on failure.
51
specs/004-system-integration/plan.md
Normal file
@@ -0,0 +1,51 @@
# Implementation Plan: 004 - System Capabilities & Integration

This document is the technical implementation plan for the "System Capabilities & Integration" feature specification.

## 1. Overall Approach

This module is a core pillar of the system, covering the external API and critical internal task processing; security, stability, and extensibility come first. We design a robust callback endpoint and use a message queue to decouple and asynchronize the reward issuance flow.

## 2. Backend Tasks

### 2.1. Core Services & Database Design

- **Callback API idempotency**
  - **Database**: create a `processed_callbacks` table with `tracking_id` (primary key/unique index) and `created_at`. It records already-processed callbacks to prevent duplicate processing.

- **Anti-fraud data support**
  - **Database**: add `ip_address` and `user_agent` columns to the `invitations` table (or the associated click-record table), recorded when the user clicks the invitation link.

- **Asynchronous reward issuance service**
  - **Technology**: introduce a message queue system (e.g. RabbitMQ, or Redis-based BullMQ).
  - **Database**: create a `failed_reward_jobs` table recording reward jobs that still fail after retries, with columns `reward_id`, `reason`, `payload`, `failed_at`.
  - **Queue definition**: create a queue named `reward_issuance`.
  - **Worker**: create a standalone worker process dedicated to consuming the `reward_issuance` queue.

### 2.2. API Endpoint Design & Implementation

- `POST /api/v1/callback/register`: **Receive third-party registration notifications**
  - **Middleware**:
    1. **Authentication**: a middleware that checks the `X-API-Key` request header.
    2. **Rate limiting**: a Redis-based rate-limiting middleware keyed by API key, with the limit configurable from the admin console.
  - **Controller logic**:
    1. Check whether the `tracking_id` already exists in `processed_callbacks`; if it does, return 200 OK immediately.
    2. Validate the `tracking_id`.
    3. Insert the `tracking_id` into `processed_callbacks`.
    4. Push an "issue reward" job (carrying the information needed for issuance) onto the `reward_issuance` message queue.
    5. Return 200 OK.

### 2.3. Reward Issuance Worker

- **Job processing**:
  - The worker pulls a job from the queue.
  - Based on the job's reward type, it calls the matching adapter to interact with the external reward system's API.
- **Errors & retries**:
  - If the API call fails, check the job's retry count.
  - If fewer than 3 attempts have been made, requeue the job with an exponential-backoff delay (e.g. 5m, 15m, 30m).
  - After 3 attempts, record the job in `failed_reward_jobs` and trigger an administrator notification (e.g. email or webhook).

## 3. Frontend Tasks

- This module is mostly backend work; frontend tasks are minimal.
- **Admin notifications**: an admin-console page may be needed to list the failed jobs in `failed_reward_jobs` so administrators can handle them manually.
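The controller logic above can be condensed into a small sketch: the `processed_callbacks` check makes the endpoint idempotent, and an accepted callback only enqueues a reward job instead of issuing the reward inline. The names are illustrative, and in-memory collections stand in for the database table and the `reward_issuance` queue.

```java
import java.util.Queue;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedQueue;

// Condensed sketch of the registration-callback controller logic above.
// A duplicate tracking_id still gets a success response but is credited
// only once; first-time callbacks merely enqueue an async reward job.
public class RegisterCallbackHandler {

    private final Set<String> processedTrackingIds = ConcurrentHashMap.newKeySet();
    private final Queue<String> rewardIssuanceQueue = new ConcurrentLinkedQueue<>();

    /** Returns true if a reward job was enqueued; false for a duplicate (still HTTP 200). */
    public boolean handle(String trackingId) {
        if (!processedTrackingIds.add(trackingId)) {
            return false;                        // already processed: succeed without re-crediting
        }
        rewardIssuanceQueue.add(trackingId);     // the worker issues the reward asynchronously
        return true;
    }

    public int pendingJobs() {
        return rewardIssuanceQueue.size();
    }
}
```

In production the `add`-then-enqueue pair would be a unique-index insert into `processed_callbacks` plus a publish to the message broker, so that a duplicate insert fails atomically rather than relying on in-memory state.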
32
specs/004-system-integration/spec.md
Normal file
@@ -0,0 +1,32 @@
# Feature Specification: 004 - System Capabilities & Integration

This document defines the "System Capabilities & Integration" features of the "Mosquito" viral-growth system.

## 1. User Stories & Acceptance Criteria

| User Story | Acceptance Criteria | Priority |
| :--- | :--- | :--- |
| **As a third-party application**, when a new user completes registration via an invitation link, I want to notify the "Mosquito" system through an API so the inviter is credited and rewarded. | 1. A secure callback API authenticated with an `X-API-Key` HTTP header.<br>2. The API accepts the unique `tracking_id` carried in the invitation link (required) and the third-party system's `external_user_id` (optional).<br>3. The API has clear success/failure response codes.<br>4. **(Clarified)** The API must be idempotent: repeated calls with the same `tracking_id` return success but credit only once. | **High** |
| **As the system**, I need to detect and block basic fraud so activities remain fair and data stays accurate. | 1. **(Clarified)** Collect the user's IP and device-fingerprint information when they click the invitation link.<br>2. **(Clarified)** Rate-limit the callback API, defaulting to 100 requests per minute per API key, configurable in the admin console.<br>3. Source-IP verification can be toggled on. | **High** |
| **As the system**, I need to issue rewards automatically to reduce operating cost and improve the user experience. | 1. Integrate with the internal account system or third-party APIs to issue points/coupons.<br>2. **(Clarified)** When issuance fails, the job automatically enters a delayed-retry queue; after 3 failed retries it is marked "issuance failed" and an administrator is notified. | **Medium** |

## 2. Clarifications & Edge Cases

- **API idempotency**:
  - The callback API must be idempotent. The system records each successfully processed `tracking_id`; receiving an already-processed `tracking_id` returns success without re-running the reward logic.

- **Callback API payload**:
  - `tracking_id` (string, required): the unique tracking ID from the invitation link.
  - `external_user_id` (string, optional): the new user's ID in the third-party system, for reconciliation.
  - `timestamp` (integer, optional): when the event occurred.

- **Anti-fraud data collection**:
  - The user's IP and device fingerprint are captured when they click our invitation link (before the redirect) and stored keyed by `tracking_id`, enabling later risk assessment.

- **Reward issuance retries**:
  - A reward job that fails on first issuance enters a delayed queue.
  - The queue retries automatically, up to 3 times, with increasing intervals (e.g. after 5, 15, then 30 minutes).
  - After all 3 retries fail, the job is marked "permanently failed" and a to-do item or notification is sent to the system administrator for manual intervention.

- **Reward API adaptability**:
  - The reward issuance module must be extensible, using the Adapter Pattern to support future third-party reward systems. V1.0 will first implement one `token`-authenticated RESTful API integration.
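The retry schedule above (5, 15, then 30 minutes, then permanent failure) can be captured in a tiny policy object; the class and method names are invented for illustration.

```java
import java.time.Duration;
import java.util.List;

// Sketch of the retry policy above: a failed issuance is re-queued with
// increasing delays and permanently fails after three attempts, at which
// point the job lands in failed_reward_jobs and an admin is notified.
public class RetryPolicy {

    private static final List<Duration> DELAYS =
            List.of(Duration.ofMinutes(5), Duration.ofMinutes(15), Duration.ofMinutes(30));

    /** Delay before the given retry attempt (1-based), or null when retries are exhausted. */
    public static Duration delayFor(int attempt) {
        return attempt >= 1 && attempt <= DELAYS.size() ? DELAYS.get(attempt - 1) : null;
    }

    /** True once all retries have been used and the job should be marked permanently failed. */
    public static boolean exhausted(int attempt) {
        return attempt > DELAYS.size();
    }
}
```

The worker would call `delayFor` when a reward-adapter call fails and, on a null result, write the job to `failed_reward_jobs` and trigger the notification.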
32
specs/004-system-integration/tasks.md
Normal file
@@ -0,0 +1,32 @@
# Development Task List: 004 - System Capabilities & Integration

The following development tasks are derived from the implementation plan for the "System Capabilities & Integration" feature.

## Backend

### Core Services & Database

- [ ] **BE-DB-08**: Create the migration script for the `processed_callbacks` table.
- [ ] **BE-DB-09**: Create the migration script adding the `ip_address` and `user_agent` columns to the `invitations` table.
- [ ] **BE-DB-10**: Create the migration script for the `failed_reward_jobs` table.
- [ ] **BE-SETUP-01**: Install, configure, and initialize the message queue service (e.g. RabbitMQ or BullMQ) in the project.
- [ ] **BE-WORKER-01**: Create the reward-issuance worker service skeleton and connect it to the message queue.

### API & Business Logic

- [ ] **BE-API-14**: Build a reusable `X-API-Key` authentication middleware.
- [ ] **BE-API-15**: Build a reusable, Redis-based, configurable rate-limiting middleware.
- [ ] **BE-API-16**: Implement the full `POST /api/v1/callback/register` logic: authentication, rate limiting, the idempotency check, and dispatching the job to the queue.
- [ ] **BE-TEST-04**: Write complete test cases for the callback API, covering the idempotency and rate-limiting scenarios in particular.

### Async Worker

- [ ] **BE-WORKER-02**: Implement the external reward API calls in the worker (using the Adapter Pattern).
- [ ] **BE-WORKER-03**: Implement the retry-on-failure logic in the worker (exponential-backoff delays).
- [ ] **BE-WORKER-04**: Implement storing jobs whose retries are exhausted in the `failed_reward_jobs` table.
- [ ] **BE-ALERT-01**: Implement a simple notification service that alerts administrators when a new row is written to `failed_reward_jobs`.

## Frontend

- [ ] **FE-UI-16**: (Optional) Build a simple admin-console page to display and manage failed reward-issuance jobs.
- [ ] **FE-API-04**: (Optional) Add an API request function that fetches the failed-job list for that page.
18
src/main/java/com/mosquito/project/MosquitoApplication.java
Normal file
@@ -0,0 +1,18 @@
package com.mosquito.project;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.scheduling.annotation.EnableScheduling;

@SpringBootApplication
@EnableScheduling
@EnableCaching
public class MosquitoApplication {

    public static void main(String[] args) {
        SpringApplication.run(MosquitoApplication.class, args);
    }

}
@@ -0,0 +1,59 @@
package com.mosquito.project.controller;

import com.mosquito.project.domain.Activity;
import com.mosquito.project.dto.ActivityGraphResponse;
import com.mosquito.project.dto.ActivityStatsResponse;
import com.mosquito.project.dto.CreateActivityRequest;
import com.mosquito.project.dto.UpdateActivityRequest;
import com.mosquito.project.service.ActivityService;
import jakarta.validation.Valid;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.PutMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api/v1/activities")
public class ActivityController {

    private final ActivityService activityService;

    public ActivityController(ActivityService activityService) {
        this.activityService = activityService;
    }

    @PostMapping
    public ResponseEntity<Activity> createActivity(@Valid @RequestBody CreateActivityRequest request) {
        Activity createdActivity = activityService.createActivity(request);
        return new ResponseEntity<>(createdActivity, HttpStatus.CREATED);
    }

    @PutMapping("/{id}")
    public ResponseEntity<Activity> updateActivity(@PathVariable Long id,
                                                   @Valid @RequestBody UpdateActivityRequest request) {
        Activity updatedActivity = activityService.updateActivity(id, request);
        return ResponseEntity.ok(updatedActivity);
    }

    @GetMapping("/{id}")
    public ResponseEntity<Activity> getActivityById(@PathVariable Long id) {
        Activity activity = activityService.getActivityById(id);
        return ResponseEntity.ok(activity);
    }

    @GetMapping("/{id}/stats")
    public ResponseEntity<ActivityStatsResponse> getActivityStats(@PathVariable Long id) {
        ActivityStatsResponse stats = activityService.getActivityStats(id);
        return ResponseEntity.ok(stats);
    }

    @GetMapping("/{id}/graph")
    public ResponseEntity<ActivityGraphResponse> getActivityGraph(@PathVariable Long id) {
        ActivityGraphResponse graph = activityService.getActivityGraph(id);
        return ResponseEntity.ok(graph);
    }
}
@@ -0,0 +1,37 @@
package com.mosquito.project.controller;

import com.mosquito.project.dto.CreateApiKeyRequest;
import com.mosquito.project.dto.CreateApiKeyResponse;
import com.mosquito.project.service.ActivityService;
import jakarta.validation.Valid;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.DeleteMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api/v1/api-keys")
public class ApiKeyController {

    private final ActivityService activityService;

    public ApiKeyController(ActivityService activityService) {
        this.activityService = activityService;
    }

    @PostMapping
    public ResponseEntity<CreateApiKeyResponse> createApiKey(@Valid @RequestBody CreateApiKeyRequest request) {
        String rawApiKey = activityService.generateApiKey(request);
        return new ResponseEntity<>(new CreateApiKeyResponse(rawApiKey), HttpStatus.CREATED);
    }

    @DeleteMapping("/{id}")
    public ResponseEntity<Void> revokeApiKey(@PathVariable Long id) {
        activityService.revokeApiKey(id);
        return new ResponseEntity<>(HttpStatus.NO_CONTENT);
    }
}
81
src/main/java/com/mosquito/project/domain/Activity.java
Normal file
@@ -0,0 +1,81 @@
package com.mosquito.project.domain;

import java.time.ZonedDateTime;
import java.util.List;
import java.util.Set;

public class Activity {

    private Long id;
    private String name;
    private ZonedDateTime startTime;
    private ZonedDateTime endTime;
    private Set<Long> targetUserIds;
    private List<RewardTier> rewardTiers;
    private RewardMode rewardMode = RewardMode.DIFFERENTIAL; // defaults to the delta ("make up the difference") mode
    private List<MultiLevelRewardRule> multiLevelRewardRules;

    // Getters and setters

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public ZonedDateTime getStartTime() {
        return startTime;
    }

    public void setStartTime(ZonedDateTime startTime) {
        this.startTime = startTime;
    }

    public ZonedDateTime getEndTime() {
        return endTime;
    }

    public void setEndTime(ZonedDateTime endTime) {
        this.endTime = endTime;
    }

    public Set<Long> getTargetUserIds() {
        return targetUserIds;
    }

    public void setTargetUserIds(Set<Long> targetUserIds) {
        this.targetUserIds = targetUserIds;
    }

    public List<RewardTier> getRewardTiers() {
        return rewardTiers;
    }

    public void setRewardTiers(List<RewardTier> rewardTiers) {
        this.rewardTiers = rewardTiers;
    }

    public RewardMode getRewardMode() {
        return rewardMode;
    }

    public void setRewardMode(RewardMode rewardMode) {
        this.rewardMode = rewardMode;
    }

    public List<MultiLevelRewardRule> getMultiLevelRewardRules() {
        return multiLevelRewardRules;
    }

    public void setMultiLevelRewardRules(List<MultiLevelRewardRule> multiLevelRewardRules) {
        this.multiLevelRewardRules = multiLevelRewardRules;
    }

    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }
}
50
src/main/java/com/mosquito/project/domain/ApiKey.java
Normal file
@@ -0,0 +1,50 @@
package com.mosquito.project.domain;

public class ApiKey {
    private Long id;
    private Long activityId;
    private String name;
    private String keyHash;
    private String salt;

    // Getters and Setters
    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    public Long getActivityId() {
        return activityId;
    }

    public void setActivityId(Long activityId) {
        this.activityId = activityId;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public String getKeyHash() {
        return keyHash;
    }

    public void setKeyHash(String keyHash) {
        this.keyHash = keyHash;
    }

    public String getSalt() {
        return salt;
    }

    public void setSalt(String salt) {
        this.salt = salt;
    }
}
@@ -0,0 +1,70 @@
package com.mosquito.project.domain;

import java.time.LocalDate;

public class DailyActivityStats {
    private Long id;
    private Long activityId;
    private LocalDate statDate;
    private int views;
    private int shares;
    private int newRegistrations;
    private int conversions;

    // Getters and Setters
    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    public Long getActivityId() {
        return activityId;
    }

    public void setActivityId(Long activityId) {
        this.activityId = activityId;
    }

    public LocalDate getStatDate() {
        return statDate;
    }

    public void setStatDate(LocalDate statDate) {
        this.statDate = statDate;
    }

    public int getViews() {
        return views;
    }

    public void setViews(int views) {
        this.views = views;
    }

    public int getShares() {
        return shares;
    }

    public void setShares(int shares) {
        this.shares = shares;
    }

    public int getNewRegistrations() {
        return newRegistrations;
    }

    public void setNewRegistrations(int newRegistrations) {
        this.newRegistrations = newRegistrations;
    }

    public int getConversions() {
        return conversions;
    }

    public void setConversions(int conversions) {
        this.conversions = conversions;
    }
}
@@ -0,0 +1,40 @@
package com.mosquito.project.domain;

import java.io.Serializable;

public class LeaderboardEntry implements Serializable {
    private Long userId;
    private String userName;
    private int score;

    public LeaderboardEntry(Long userId, String userName, int score) {
        this.userId = userId;
        this.userName = userName;
        this.score = score;
    }

    // Getters and Setters
    public Long getUserId() {
        return userId;
    }

    public void setUserId(Long userId) {
        this.userId = userId;
    }

    public String getUserName() {
        return userName;
    }

    public void setUserName(String userName) {
        this.userName = userName;
    }

    public int getScore() {
        return score;
    }

    public void setScore(int score) {
        this.score = score;
    }
}
@@ -0,0 +1,22 @@
package com.mosquito.project.domain;

import java.math.BigDecimal;

// Represents a multi-level reward rule
public class MultiLevelRewardRule {
    private int level;
    private BigDecimal decayCoefficient; // decay coefficient (e.g., 0.5 for 50%)

    public MultiLevelRewardRule(int level, BigDecimal decayCoefficient) {
        this.level = level;
        this.decayCoefficient = decayCoefficient;
    }

    public int getLevel() {
        return level;
    }

    public BigDecimal getDecayCoefficient() {
        return decayCoefficient;
    }
}
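`ActivityService.calculateMultiLevelReward` in this commit multiplies a base reward's points by the rule's decay coefficient using `BigDecimal` and rounds half-up to whole points. A standalone sketch of that arithmetic (`DecayDemo` is a hypothetical class introduced only for illustration):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

// Hypothetical illustration of applying a MultiLevelRewardRule decay
// coefficient to a base point value with HALF_UP rounding.
public class DecayDemo {
    public static int decayedPoints(int basePoints, BigDecimal decayCoefficient) {
        return new BigDecimal(basePoints)
                .multiply(decayCoefficient)
                .setScale(0, RoundingMode.HALF_UP) // round to whole points
                .intValue();
    }
}
```

For example, a 25-point reward at a 0.5 coefficient yields 12.5, which HALF_UP rounds to 13 rather than truncating to 12.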
45
src/main/java/com/mosquito/project/domain/Reward.java
Normal file
@@ -0,0 +1,45 @@
package com.mosquito.project.domain;

import java.util.Objects;

// A simple class representing a reward
public class Reward {
    private RewardType rewardType;
    private int points;
    private String couponBatchId;

    public Reward(int points) {
        this.rewardType = RewardType.POINTS;
        this.points = points;
    }

    public Reward(String couponBatchId) {
        this.rewardType = RewardType.COUPON;
        this.couponBatchId = couponBatchId;
    }

    public RewardType getRewardType() {
        return rewardType;
    }

    public int getPoints() {
        return points;
    }

    public String getCouponBatchId() {
        return couponBatchId;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (o == null || getClass() != o.getClass()) return false;
        Reward reward = (Reward) o;
        return points == reward.points && rewardType == reward.rewardType && Objects.equals(couponBatchId, reward.couponBatchId);
    }

    @Override
    public int hashCode() {
        return Objects.hash(rewardType, points, couponBatchId);
    }
}
@@ -0,0 +1,7 @@
package com.mosquito.project.domain;

// Reward mode enum: differential (top-up) or cumulative
public enum RewardMode {
    DIFFERENTIAL, // pay only the difference over the previous tier (default)
    CUMULATIVE // pay the full value of the tier reached
}
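These two modes drive `ActivityService.calculateReward` in this commit: CUMULATIVE pays the full point value of the highest tier reached, while DIFFERENTIAL pays only the increment over the previous tier reached (on the assumption that earlier tiers were already paid out). A minimal standalone sketch of the two payout strategies (`RewardMath` and its `Tier` record are hypothetical, introduced for illustration):

```java
import java.util.List;

// Hypothetical illustration of the two RewardMode payout strategies.
// Tiers are (threshold, points) pairs, assumed sorted by threshold.
public class RewardMath {
    public record Tier(int threshold, int points) {}

    // CUMULATIVE: the highest tier reached pays its full point value.
    public static int cumulativePayout(List<Tier> tiers, int inviteCount) {
        int best = 0;
        for (Tier t : tiers) {
            if (inviteCount >= t.threshold()) best = t.points();
        }
        return best;
    }

    // DIFFERENTIAL: pay only the difference between the highest tier
    // reached and the tier reached just below it.
    public static int differentialPayout(List<Tier> tiers, int inviteCount) {
        int prev = 0, best = 0;
        for (Tier t : tiers) {
            if (inviteCount >= t.threshold()) {
                prev = best;
                best = t.points();
            }
        }
        return best - prev;
    }
}
```

With tiers 3→100, 5→300, 10→1000 and 7 invites, cumulative pays 300 while differential pays 300 − 100 = 200.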
20
src/main/java/com/mosquito/project/domain/RewardTier.java
Normal file
@@ -0,0 +1,20 @@
package com.mosquito.project.domain;

// Represents a reward tier
public class RewardTier {
    private int threshold; // invite count required to trigger this reward
    private Reward reward; // the corresponding reward

    public RewardTier(int threshold, Reward reward) {
        this.threshold = threshold;
        this.reward = reward;
    }

    public int getThreshold() {
        return threshold;
    }

    public Reward getReward() {
        return reward;
    }
}
@@ -0,0 +1,6 @@
package com.mosquito.project.domain;

public enum RewardType {
    POINTS,
    COUPON
}
27
src/main/java/com/mosquito/project/domain/User.java
Normal file
@@ -0,0 +1,27 @@
package com.mosquito.project.domain;

public class User {
    private Long id;
    private String name;

    public User(Long id, String name) {
        this.id = id;
        this.name = name;
    }

    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}
@@ -0,0 +1,82 @@
package com.mosquito.project.dto;

import java.util.List;

public class ActivityGraphResponse {

    private List<Node> nodes;
    private List<Edge> edges;

    public ActivityGraphResponse(List<Node> nodes, List<Edge> edges) {
        this.nodes = nodes;
        this.edges = edges;
    }

    public List<Node> getNodes() {
        return nodes;
    }

    public void setNodes(List<Node> nodes) {
        this.nodes = nodes;
    }

    public List<Edge> getEdges() {
        return edges;
    }

    public void setEdges(List<Edge> edges) {
        this.edges = edges;
    }

    public static class Node {
        private String id;
        private String label;

        public Node(String id, String label) {
            this.id = id;
            this.label = label;
        }

        public String getId() {
            return id;
        }

        public void setId(String id) {
            this.id = id;
        }

        public String getLabel() {
            return label;
        }

        public void setLabel(String label) {
            this.label = label;
        }
    }

    public static class Edge {
        private String from;
        private String to;

        public Edge(String from, String to) {
            this.from = from;
            this.to = to;
        }

        public String getFrom() {
            return from;
        }

        public void setFrom(String from) {
            this.from = from;
        }

        public String getTo() {
            return to;
        }

        public void setTo(String to) {
            this.to = to;
        }
    }
}
@@ -0,0 +1,76 @@
package com.mosquito.project.dto;

import java.util.List;

public class ActivityStatsResponse {

    private long totalParticipants;
    private long totalShares;
    private List<DailyStats> dailyStats;

    public ActivityStatsResponse(long totalParticipants, long totalShares, List<DailyStats> dailyStats) {
        this.totalParticipants = totalParticipants;
        this.totalShares = totalShares;
        this.dailyStats = dailyStats;
    }

    public long getTotalParticipants() {
        return totalParticipants;
    }

    public void setTotalParticipants(long totalParticipants) {
        this.totalParticipants = totalParticipants;
    }

    public long getTotalShares() {
        return totalShares;
    }

    public void setTotalShares(long totalShares) {
        this.totalShares = totalShares;
    }

    public List<DailyStats> getDailyStats() {
        return dailyStats;
    }

    public void setDailyStats(List<DailyStats> dailyStats) {
        this.dailyStats = dailyStats;
    }

    public static class DailyStats {
        private String date;
        private int participants;
        private int shares;

        public DailyStats(String date, int participants, int shares) {
            this.date = date;
            this.participants = participants;
            this.shares = shares;
        }

        public String getDate() {
            return date;
        }

        public void setDate(String date) {
            this.date = date;
        }

        public int getParticipants() {
            return participants;
        }

        public void setParticipants(int participants) {
            this.participants = participants;
        }

        public int getShares() {
            return shares;
        }

        public void setShares(int shares) {
            this.shares = shares;
        }
    }
}
@@ -0,0 +1,44 @@
package com.mosquito.project.dto;

import jakarta.validation.constraints.NotBlank;
import jakarta.validation.constraints.NotNull;
import jakarta.validation.constraints.Size;
import java.time.ZonedDateTime;

public class CreateActivityRequest {

    @NotBlank(message = "活动名称不能为空")
    @Size(max = 100, message = "活动名称不能超过100个字符")
    private String name;

    @NotNull(message = "活动开始时间不能为空")
    private ZonedDateTime startTime;

    @NotNull(message = "活动结束时间不能为空")
    private ZonedDateTime endTime;

    // Getters and Setters
    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public ZonedDateTime getStartTime() {
        return startTime;
    }

    public void setStartTime(ZonedDateTime startTime) {
        this.startTime = startTime;
    }

    public ZonedDateTime getEndTime() {
        return endTime;
    }

    public void setEndTime(ZonedDateTime endTime) {
        this.endTime = endTime;
    }
}
@@ -0,0 +1,30 @@
package com.mosquito.project.dto;

import jakarta.validation.constraints.NotBlank;
import jakarta.validation.constraints.NotNull;

public class CreateApiKeyRequest {

    @NotNull(message = "活动ID不能为空")
    private Long activityId;

    @NotBlank(message = "密钥名称不能为空")
    private String name;

    // Getters and Setters
    public Long getActivityId() {
        return activityId;
    }

    public void setActivityId(Long activityId) {
        this.activityId = activityId;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}
@@ -0,0 +1,15 @@
package com.mosquito.project.dto;

public class CreateApiKeyResponse {

    private String apiKey;

    public CreateApiKeyResponse(String apiKey) {
        this.apiKey = apiKey;
    }

    // Getter
    public String getApiKey() {
        return apiKey;
    }
}
@@ -0,0 +1,44 @@
package com.mosquito.project.dto;

import jakarta.validation.constraints.NotBlank;
import jakarta.validation.constraints.NotNull;
import jakarta.validation.constraints.Size;
import java.time.ZonedDateTime;

public class UpdateActivityRequest {

    @NotBlank(message = "活动名称不能为空")
    @Size(max = 100, message = "活动名称不能超过100个字符")
    private String name;

    @NotNull(message = "活动开始时间不能为空")
    private ZonedDateTime startTime;

    @NotNull(message = "活动结束时间不能为空")
    private ZonedDateTime endTime;

    // Getters and Setters
    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public ZonedDateTime getStartTime() {
        return startTime;
    }

    public void setStartTime(ZonedDateTime startTime) {
        this.startTime = startTime;
    }

    public ZonedDateTime getEndTime() {
        return endTime;
    }

    public void setEndTime(ZonedDateTime endTime) {
        this.endTime = endTime;
    }
}
@@ -0,0 +1,11 @@
package com.mosquito.project.exception;

import org.springframework.http.HttpStatus;
import org.springframework.web.bind.annotation.ResponseStatus;

@ResponseStatus(HttpStatus.NOT_FOUND)
public class ActivityNotFoundException extends RuntimeException {
    public ActivityNotFoundException(String message) {
        super(message);
    }
}
@@ -0,0 +1,11 @@
package com.mosquito.project.exception;

import org.springframework.http.HttpStatus;
import org.springframework.web.bind.annotation.ResponseStatus;

@ResponseStatus(HttpStatus.NOT_FOUND)
public class ApiKeyNotFoundException extends RuntimeException {
    public ApiKeyNotFoundException(String message) {
        super(message);
    }
}
@@ -0,0 +1,7 @@
package com.mosquito.project.exception;

public class FileUploadException extends RuntimeException {
    public FileUploadException(String message) {
        super(message);
    }
}
@@ -0,0 +1,7 @@
package com.mosquito.project.exception;

public class InvalidActivityDataException extends RuntimeException {
    public InvalidActivityDataException(String message) {
        super(message);
    }
}
@@ -0,0 +1,7 @@
package com.mosquito.project.exception;

public class UserNotAuthorizedForActivityException extends RuntimeException {
    public UserNotAuthorizedForActivityException(String message) {
        super(message);
    }
}
@@ -0,0 +1,57 @@
package com.mosquito.project.job;

import com.mosquito.project.domain.Activity;
import com.mosquito.project.domain.DailyActivityStats;
import com.mosquito.project.service.ActivityService;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

import java.time.LocalDate;
import java.util.List;
import java.util.Map;
import java.util.Random;
import java.util.concurrent.ConcurrentHashMap;

@Component
public class StatisticsAggregationJob {

    private static final Logger log = LoggerFactory.getLogger(StatisticsAggregationJob.class);

    private final ActivityService activityService;
    private final Map<Long, DailyActivityStats> dailyStats = new ConcurrentHashMap<>();

    public StatisticsAggregationJob(ActivityService activityService) {
        this.activityService = activityService;
    }

    @Scheduled(cron = "0 0 1 * * ?") // runs daily at 1:00 AM
    public void aggregateDailyStats() {
        log.info("开始执行每日活动数据聚合任务");
        List<Activity> activities = activityService.getAllActivities();
        LocalDate yesterday = LocalDate.now().minusDays(1);

        for (Activity activity : activities) {
            // In a real application, you would query raw event data here.
            // For now, we simulate by calling the helper method.
            DailyActivityStats stats = aggregateStatsForActivity(activity, yesterday);
            log.info("为活动ID {} 聚合了数据: {} 次浏览, {} 次分享", activity.getId(), stats.getViews(), stats.getShares());
        }
        log.info("每日活动数据聚合任务执行完成");
    }

    // This is a helper method for simulation and testing
    public DailyActivityStats aggregateStatsForActivity(Activity activity, LocalDate date) {
        Random random = new Random();
        DailyActivityStats stats = new DailyActivityStats();
        stats.setActivityId(activity.getId());
        stats.setStatDate(date);
        stats.setViews(1000 + random.nextInt(500));
        stats.setShares(200 + random.nextInt(100));
        stats.setNewRegistrations(50 + random.nextInt(50));
        stats.setConversions(10 + random.nextInt(20));
        dailyStats.put(activity.getId(), stats);
        return stats;
    }
}
259
src/main/java/com/mosquito/project/service/ActivityService.java
Normal file
259
src/main/java/com/mosquito/project/service/ActivityService.java
Normal file
@@ -0,0 +1,259 @@
|
|||||||
|
package com.mosquito.project.service;
|
||||||
|
|
||||||
|
import com.mosquito.project.domain.*;
|
||||||
|
import com.mosquito.project.dto.CreateActivityRequest;
|
||||||
|
import com.mosquito.project.dto.CreateApiKeyRequest;
|
||||||
|
import com.mosquito.project.dto.UpdateActivityRequest;
|
||||||
|
import com.mosquito.project.dto.ActivityStatsResponse;
|
||||||
|
import com.mosquito.project.dto.ActivityGraphResponse;
|
||||||
|
import com.mosquito.project.exception.ActivityNotFoundException;
|
||||||
|
import com.mosquito.project.exception.ApiKeyNotFoundException;
|
||||||
|
import com.mosquito.project.exception.FileUploadException;
|
||||||
|
import com.mosquito.project.exception.InvalidActivityDataException;
|
||||||
|
import com.mosquito.project.exception.UserNotAuthorizedForActivityException;
|
||||||
|
import org.springframework.cache.annotation.Cacheable;
|
||||||
|
import org.springframework.stereotype.Service;
|
||||||
|
import org.springframework.web.multipart.MultipartFile;
|
||||||
|
|
||||||
|
import java.math.BigDecimal;
|
||||||
|
import java.math.RoundingMode;
|
||||||
|
import java.nio.charset.StandardCharsets;
|
||||||
|
import java.security.MessageDigest;
|
||||||
|
import java.security.NoSuchAlgorithmException;
|
||||||
|
import java.security.SecureRandom;
|
||||||
|
import java.util.*;
|
||||||
|
import java.util.concurrent.ConcurrentHashMap;
|
||||||
|
import java.util.concurrent.atomic.AtomicLong;
|
||||||
|
|
||||||
|
import org.slf4j.Logger;
|
||||||
|
import org.slf4j.LoggerFactory;
|
||||||
|
|
||||||
|
@Service
|
||||||
|
public class ActivityService {
|
||||||
|
|
||||||
|
private static final Logger log = LoggerFactory.getLogger(ActivityService.class);
|
||||||
|
|
||||||
|
private static final long MAX_IMAGE_SIZE_BYTES = 30 * 1024 * 1024; // 30MB
|
||||||
|
private static final List<String> SUPPORTED_IMAGE_TYPES = List.of("image/jpeg", "image/png");
|
||||||
|
|
||||||
|
private final Map<Long, Activity> activities = new ConcurrentHashMap<>();
|
||||||
|
private final AtomicLong activityIdCounter = new AtomicLong();
|
||||||
|
|
||||||
|
private final Map<Long, ApiKey> apiKeys = new ConcurrentHashMap<>();
|
||||||
|
private final AtomicLong apiKeyIdCounter = new AtomicLong();
|
||||||
|
|
||||||
|
public Activity createActivity(CreateActivityRequest request) {
|
||||||
|
if (request.getEndTime().isBefore(request.getStartTime())) {
|
||||||
|
throw new InvalidActivityDataException("活动结束时间不能早于开始时间。");
|
||||||
|
}
|
||||||
|
Activity activity = new Activity();
|
||||||
|
long newId = activityIdCounter.incrementAndGet();
|
||||||
|
activity.setId(newId);
|
||||||
|
activity.setName(request.getName());
|
||||||
|
activity.setStartTime(request.getStartTime());
|
||||||
|
activity.setEndTime(request.getEndTime());
|
||||||
|
|
||||||
|
activities.put(newId, activity);
|
||||||
|
return activity;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Activity updateActivity(Long id, UpdateActivityRequest request) {
|
||||||
|
Activity activity = activities.get(id);
|
||||||
|
if (activity == null) {
|
||||||
|
throw new ActivityNotFoundException("活动不存在。");
|
||||||
|
}
|
||||||
|
|
||||||
|
if (request.getEndTime().isBefore(request.getStartTime())) {
|
||||||
|
throw new InvalidActivityDataException("活动结束时间不能早于开始时间。");
|
||||||
|
}
|
||||||
|
|
||||||
|
activity.setName(request.getName());
|
||||||
|
activity.setStartTime(request.getStartTime());
|
||||||
|
activity.setEndTime(request.getEndTime());
|
||||||
|
|
||||||
|
activities.put(id, activity);
|
||||||
|
return activity;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Activity getActivityById(Long id) {
|
||||||
|
Activity activity = activities.get(id);
|
||||||
|
if (activity == null) {
|
||||||
|
throw new ActivityNotFoundException("活动不存在。");
|
||||||
|
}
|
||||||
|
return activity;
|
||||||
|
}
|
||||||
|
|
||||||
|
public List<Activity> getAllActivities() {
|
||||||
|
return new ArrayList<>(activities.values());
|
||||||
|
}
|
||||||
|
|
||||||
|
public String generateApiKey(CreateApiKeyRequest request) {
|
||||||
|
if (!activities.containsKey(request.getActivityId())) {
|
||||||
|
throw new ActivityNotFoundException("关联的活动不存在。");
|
||||||
|
}
|
||||||
|
|
||||||
|
String rawApiKey = UUID.randomUUID().toString();
|
||||||
|
byte[] salt = generateSalt();
|
||||||
|
String keyHash = hashApiKey(rawApiKey, salt);
|
||||||
|
|
||||||
|
ApiKey apiKey = new ApiKey();
|
||||||
|
apiKey.setId(apiKeyIdCounter.incrementAndGet());
|
||||||
|
apiKey.setActivityId(request.getActivityId());
|
||||||
|
apiKey.setName(request.getName());
|
||||||
|
apiKey.setSalt(Base64.getEncoder().encodeToString(salt));
|
||||||
|
apiKey.setKeyHash(keyHash);
|
||||||
|
|
||||||
|
apiKeys.put(apiKey.getId(), apiKey);
|
||||||
|
|
||||||
|
return rawApiKey;
|
||||||
|
}
|
||||||
|
|
||||||
|
private byte[] generateSalt() {
|
||||||
|
SecureRandom random = new SecureRandom();
|
||||||
|
byte[] salt = new byte[16];
|
||||||
|
random.nextBytes(salt);
|
||||||
|
return salt;
|
||||||
|
}
|
||||||
|
|
||||||
|
private String hashApiKey(String apiKey, byte[] salt) {
|
||||||
|
try {
|
||||||
|
MessageDigest md = MessageDigest.getInstance("SHA-256");
|
||||||
|
md.update(salt);
|
||||||
|
byte[] hashedApiKey = md.digest(apiKey.getBytes(StandardCharsets.UTF_8));
|
||||||
|
return Base64.getEncoder().encodeToString(hashedApiKey);
|
||||||
|
} catch (NoSuchAlgorithmException e) {
|
||||||
|
throw new RuntimeException("无法创建API密钥哈希", e);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
public void accessActivity(Activity activity, User user) {
|
||||||
|
Set<Long> targetUserIds = activity.getTargetUserIds();
|
||||||
|
if (targetUserIds != null && !targetUserIds.isEmpty() && !targetUserIds.contains(user.getId())) {
|
||||||
|
throw new UserNotAuthorizedForActivityException("该活动仅对部分用户开放");
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
public void uploadCustomizationImage(Long activityId, MultipartFile imageFile) {
|
||||||
|
if (imageFile.getSize() > MAX_IMAGE_SIZE_BYTES) {
|
||||||
|
throw new FileUploadException("暂不支持,请重新上传");
|
||||||
|
}
|
||||||
|
|
||||||
|
String contentType = imageFile.getContentType();
|
||||||
|
if (contentType == null || !SUPPORTED_IMAGE_TYPES.contains(contentType)) {
|
||||||
|
throw new FileUploadException("暂不支持,请重新上传");
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
public Reward calculateReward(Activity activity, int userInviteCount) {
|
||||||
|
if (activity.getRewardTiers() == null || activity.getRewardTiers().isEmpty()) {
|
||||||
|
return new Reward(0);
|
||||||
|
}
|
||||||
|
|
||||||
|
List<RewardTier> achievedTiers = activity.getRewardTiers().stream()
|
||||||
|
.filter(tier -> userInviteCount >= tier.getThreshold())
|
||||||
|
.sorted(Comparator.comparingInt(RewardTier::getThreshold))
|
||||||
|
.toList();
|
||||||
|
|
||||||
|
if (achievedTiers.isEmpty()) {
|
||||||
|
return new Reward(0);
|
||||||
|
}
|
||||||
|
|
||||||
|
RewardTier highestAchievedTier = achievedTiers.get(achievedTiers.size() - 1);
|
||||||
|
|
||||||
|
if (activity.getRewardMode() == RewardMode.CUMULATIVE) {
|
||||||
|
return highestAchievedTier.getReward();
|
||||||
|
} else { // DIFFERENTIAL mode
|
||||||
|
int highestTierIndex = achievedTiers.size() - 1;
|
||||||
|
int previousTierPoints = (highestTierIndex > 0)
|
||||||
|
? achievedTiers.get(highestTierIndex - 1).getReward().getPoints()
|
||||||
|
: 0;
|
||||||
|
int currentTierPoints = highestAchievedTier.getReward().getPoints();
|
||||||
|
return new Reward(currentTierPoints - previousTierPoints);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
public Reward calculateMultiLevelReward(Activity activity, Reward originalReward, int level) {
|
||||||
|
if (activity.getMultiLevelRewardRules() == null) {
|
||||||
|
return new Reward(0);
|
||||||
|
}
|
||||||
|
|
||||||
|
return activity.getMultiLevelRewardRules().stream()
|
||||||
|
.filter(rule -> rule.getLevel() == level)
|
||||||
|
.findFirst()
|
||||||
|
.map(rule -> {
|
||||||
|
BigDecimal originalPoints = new BigDecimal(originalReward.getPoints());
|
||||||
|
BigDecimal calculatedPoints = originalPoints.multiply(rule.getDecayCoefficient());
|
||||||
|
return new Reward(calculatedPoints.setScale(0, RoundingMode.HALF_UP).intValue());
|
||||||
|
})
|
||||||
|
.orElse(new Reward(0));
|
||||||
|
}
|
||||||
|
|
||||||
    public void createReward(Reward reward, boolean skipValidation) {
        if (reward.getRewardType() == RewardType.COUPON && !skipValidation) {
            boolean isValidCouponBatchId = false; // Placeholder: no batch lookup is implemented yet, so validation always fails.
            if (!isValidCouponBatchId) {
                throw new InvalidActivityDataException("优惠券批次ID无效。");
            }
        }
    }

    public void revokeApiKey(Long id) {
        if (apiKeys.remove(id) == null) {
            throw new ApiKeyNotFoundException("API密钥不存在。");
        }
    }

    @Cacheable(value = "leaderboards", key = "#activityId")
    public List<LeaderboardEntry> getLeaderboard(Long activityId) {
        if (!activities.containsKey(activityId)) {
            throw new ActivityNotFoundException("活动不存在。");
        }
        // Simulate fetching and ranking data
        log.info("正在为活动ID {} 生成排行榜...", activityId);
        try {
            // Simulate database query delay
            Thread.sleep(2000);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return List.of(
                new LeaderboardEntry(1L, "用户A", 1500),
                new LeaderboardEntry(2L, "用户B", 1200),
                new LeaderboardEntry(3L, "用户C", 990)
        );
    }

    public ActivityStatsResponse getActivityStats(Long activityId) {
        if (!activities.containsKey(activityId)) {
            throw new ActivityNotFoundException("活动不存在。");
        }

        // Mock data
        List<ActivityStatsResponse.DailyStats> dailyStats = List.of(
                new ActivityStatsResponse.DailyStats("2025-09-28", 100, 50),
                new ActivityStatsResponse.DailyStats("2025-09-29", 120, 60)
        );

        return new ActivityStatsResponse(220, 110, dailyStats);
    }

    public ActivityGraphResponse getActivityGraph(Long activityId) {
        if (!activities.containsKey(activityId)) {
            throw new ActivityNotFoundException("活动不存在。");
        }

        // Mock data
        List<ActivityGraphResponse.Node> nodes = List.of(
                new ActivityGraphResponse.Node("1", "User A"),
                new ActivityGraphResponse.Node("2", "User B"),
                new ActivityGraphResponse.Node("3", "User C")
        );

        List<ActivityGraphResponse.Edge> edges = List.of(
                new ActivityGraphResponse.Edge("1", "2"),
                new ActivityGraphResponse.Edge("1", "3")
        );

        return new ActivityGraphResponse(nodes, edges);
    }
}
2
src/main/resources/application.properties
Normal file
@@ -0,0 +1,2 @@
spring.redis.host=localhost
# Resolved from the spring.redis.port system property set by EmbeddedRedisConfiguration; falls back to 6379.
spring.redis.port=${spring.redis.port:6379}
@@ -0,0 +1,12 @@
CREATE TABLE activities (
    id BIGSERIAL PRIMARY KEY,
    name VARCHAR(255) NOT NULL,
    start_time_utc TIMESTAMP WITH TIME ZONE NOT NULL,
    end_time_utc TIMESTAMP WITH TIME ZONE NOT NULL,
    target_users_config JSONB,
    page_content_config JSONB,
    reward_calculation_mode VARCHAR(50) NOT NULL DEFAULT 'delta',
    status VARCHAR(50) NOT NULL DEFAULT 'draft',
    created_at TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT NOW(),
    updated_at TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT NOW()
);
@@ -0,0 +1,12 @@
CREATE TABLE activity_rewards (
    id BIGSERIAL PRIMARY KEY,
    activity_id BIGINT NOT NULL,
    invite_threshold INT NOT NULL,
    reward_type VARCHAR(50) NOT NULL,
    reward_value VARCHAR(255) NOT NULL,
    skip_validation BOOLEAN NOT NULL DEFAULT FALSE,
    CONSTRAINT fk_activity
        FOREIGN KEY(activity_id)
        REFERENCES activities(id)
        ON DELETE CASCADE
);
@@ -0,0 +1,8 @@
CREATE TABLE multi_level_reward_rules (
    id BIGINT GENERATED BY DEFAULT AS IDENTITY PRIMARY KEY,
    activity_id BIGINT NOT NULL,
    level INT NOT NULL,
    reward_value DECIMAL(10, 2) NOT NULL,
    is_percentage BOOLEAN DEFAULT FALSE,
    CONSTRAINT fk_activity_multi_level FOREIGN KEY (activity_id) REFERENCES activities(id)
);
@@ -0,0 +1,11 @@
CREATE TABLE api_keys (
    id BIGINT GENERATED BY DEFAULT AS IDENTITY PRIMARY KEY,
    name VARCHAR(255) NOT NULL,
    key_hash VARCHAR(255) NOT NULL UNIQUE,
    salt VARCHAR(255) NOT NULL,
    created_at TIMESTAMP NOT NULL,
    revoked_at TIMESTAMP NULL,
    last_used_at TIMESTAMP NULL
);

CREATE INDEX idx_api_keys_key_hash ON api_keys(key_hash);
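The `api_keys` table stores a `key_hash` and `salt` rather than the raw key, so the raw value can only be shown once at creation time. One conventional way to produce such a salted hash — a sketch only, since the project's actual hashing scheme is not shown in this diff:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.security.SecureRandom;
import java.util.HexFormat;

public class ApiKeyHashSketch {
    // Hash salt + raw key with SHA-256 and hex-encode the digest.
    // Only the hash and salt would be persisted (key_hash, salt columns).
    static String hash(String rawKey, String salt) throws NoSuchAlgorithmException {
        MessageDigest digest = MessageDigest.getInstance("SHA-256");
        byte[] out = digest.digest((salt + rawKey).getBytes(StandardCharsets.UTF_8));
        return HexFormat.of().formatHex(out);
    }

    // Random per-key salt so identical raw keys never share a stored hash.
    static String newSalt() {
        byte[] salt = new byte[16];
        new SecureRandom().nextBytes(salt);
        return HexFormat.of().formatHex(salt);
    }

    public static void main(String[] args) throws NoSuchAlgorithmException {
        String salt = newSalt();
        String stored = hash("my-raw-api-key", salt);
        // Verification recomputes the hash from the presented key and stored salt.
        System.out.println(stored.equals(hash("my-raw-api-key", salt))); // true
        System.out.println(stored.equals(hash("wrong-key", salt)));      // false
    }
}
```

`HexFormat` requires Java 17+, which matches the `record`/`toList()` usage elsewhere in the commit. A plain fast hash is adequate for high-entropy random keys; for user-chosen secrets a slow KDF would be preferable.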
@@ -0,0 +1,11 @@
CREATE TABLE daily_activity_stats (
    id BIGINT GENERATED BY DEFAULT AS IDENTITY PRIMARY KEY,
    activity_id BIGINT NOT NULL,
    stat_date DATE NOT NULL,
    views INT NOT NULL DEFAULT 0,
    shares INT NOT NULL DEFAULT 0,
    new_registrations INT NOT NULL DEFAULT 0,
    conversions INT NOT NULL DEFAULT 0,
    CONSTRAINT fk_activity_stats FOREIGN KEY (activity_id) REFERENCES activities(id),
    UNIQUE (activity_id, stat_date)
);
@@ -0,0 +1,63 @@
package com.mosquito.project;

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.jdbc.core.JdbcTemplate;

import java.sql.ResultSet;
import java.sql.SQLException;

import static org.junit.jupiter.api.Assertions.assertTrue;

@SpringBootTest
class SchemaVerificationTest {

    @Autowired
    private JdbcTemplate jdbcTemplate;

    @Test
    void activitiesTableExists() throws SQLException {
        boolean tableExists = jdbcTemplate.query("SELECT 1 FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = 'ACTIVITIES'", (ResultSet rs) -> {
            return rs.next();
        });

        assertTrue(tableExists, "Table 'activities' should exist in the database schema.");
    }

    @Test
    void activityRewardsTableExists() throws SQLException {
        boolean tableExists = jdbcTemplate.query("SELECT 1 FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = 'ACTIVITY_REWARDS'", (ResultSet rs) -> {
            return rs.next();
        });

        assertTrue(tableExists, "Table 'activity_rewards' should exist in the database schema.");
    }

    @Test
    void multiLevelRewardRulesTableExists() throws SQLException {
        boolean tableExists = jdbcTemplate.query("SELECT 1 FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = 'MULTI_LEVEL_REWARD_RULES'", (ResultSet rs) -> {
            return rs.next();
        });

        assertTrue(tableExists, "Table 'multi_level_reward_rules' should exist in the database schema.");
    }

    @Test
    void apiKeysTableExists() throws SQLException {
        boolean tableExists = jdbcTemplate.query("SELECT 1 FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = 'API_KEYS'", (ResultSet rs) -> {
            return rs.next();
        });

        assertTrue(tableExists, "Table 'api_keys' should exist in the database schema.");
    }

    @Test
    void dailyActivityStatsTableExists() throws SQLException {
        boolean tableExists = jdbcTemplate.query("SELECT 1 FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = 'DAILY_ACTIVITY_STATS'", (ResultSet rs) -> {
            return rs.next();
        });

        assertTrue(tableExists, "Table 'daily_activity_stats' should exist in the database schema.");
    }
}
@@ -0,0 +1,35 @@
package com.mosquito.project.config;

import org.springframework.context.annotation.Configuration;
import redis.embedded.RedisServer;

import javax.annotation.PostConstruct;
import javax.annotation.PreDestroy;
import java.io.IOException;
import java.net.ServerSocket;

@Configuration
public class EmbeddedRedisConfiguration {

    private RedisServer redisServer;
    private int redisPort;

    @PostConstruct
    public void startRedis() throws IOException {
        redisPort = getAvailablePort();
        redisServer = new RedisServer(redisPort);
        redisServer.start();
        System.setProperty("spring.redis.port", String.valueOf(redisPort));
    }

    @PreDestroy
    public void stopRedis() {
        redisServer.stop();
    }

    private int getAvailablePort() throws IOException {
        try (ServerSocket serverSocket = new ServerSocket(0)) {
            return serverSocket.getLocalPort();
        }
    }
}
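`getAvailablePort` above uses the standard bind-to-port-0 trick: the OS assigns a free ephemeral port, and closing the socket releases it for the embedded Redis server to claim. The pattern in isolation:

```java
import java.io.IOException;
import java.net.ServerSocket;

public class FreePortSketch {
    // Binding to port 0 asks the OS for any free ephemeral port;
    // closing the socket immediately frees it for another process to bind.
    static int freePort() throws IOException {
        try (ServerSocket socket = new ServerSocket(0)) {
            return socket.getLocalPort();
        }
    }

    public static void main(String[] args) throws IOException {
        int port = freePort();
        // Ephemeral ports are always in the valid TCP port range.
        System.out.println(port > 0 && port <= 65535); // true
    }
}
```

There is an inherent small race: another process could bind the port between the close and the Redis start. For a test-only embedded server this is usually an acceptable trade-off.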
@@ -0,0 +1,141 @@
package com.mosquito.project.controller;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.mosquito.project.domain.Activity;
import com.mosquito.project.dto.CreateActivityRequest;
import com.mosquito.project.dto.UpdateActivityRequest;
import com.mosquito.project.dto.ActivityStatsResponse;
import com.mosquito.project.dto.ActivityGraphResponse;
import com.mosquito.project.exception.ActivityNotFoundException;
import com.mosquito.project.service.ActivityService;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.servlet.WebMvcTest;
import org.springframework.boot.test.mock.mockito.MockBean;
import org.springframework.http.MediaType;
import org.springframework.test.web.servlet.MockMvc;

import java.time.ZonedDateTime;
import java.util.List;

import static org.mockito.ArgumentMatchers.any;
import static org.mockito.ArgumentMatchers.eq;
import static org.mockito.BDDMockito.given;
import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.*;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.*;

@WebMvcTest(ActivityController.class)
class ActivityControllerTest {

    @Autowired
    private MockMvc mockMvc;

    @Autowired
    private ObjectMapper objectMapper;

    @MockBean
    private ActivityService activityService;

    @Test
    void whenCreateActivity_withValidInput_thenReturns201() throws Exception {
        CreateActivityRequest request = new CreateActivityRequest();
        request.setName("Valid Activity");
        request.setStartTime(ZonedDateTime.now().plusDays(1));
        request.setEndTime(ZonedDateTime.now().plusDays(2));

        Activity activity = new Activity();
        activity.setId(1L);
        activity.setName(request.getName());

        given(activityService.createActivity(any(CreateActivityRequest.class))).willReturn(activity);

        mockMvc.perform(post("/api/v1/activities")
                        .contentType(MediaType.APPLICATION_JSON)
                        .content(objectMapper.writeValueAsString(request)))
                .andExpect(status().isCreated())
                .andExpect(jsonPath("$.id").value(1L))
                .andExpect(jsonPath("$.name").value("Valid Activity"));
    }

    @Test
    void whenGetActivity_withExistingId_thenReturns200() throws Exception {
        Activity activity = new Activity();
        activity.setId(1L);
        activity.setName("Test Activity");

        given(activityService.getActivityById(1L)).willReturn(activity);

        mockMvc.perform(get("/api/v1/activities/1"))
                .andExpect(status().isOk())
                .andExpect(jsonPath("$.id").value(1L))
                .andExpect(jsonPath("$.name").value("Test Activity"));
    }

    @Test
    void whenGetActivity_withNonExistentId_thenReturns404() throws Exception {
        given(activityService.getActivityById(999L)).willThrow(new ActivityNotFoundException("Activity not found"));

        mockMvc.perform(get("/api/v1/activities/999"))
                .andExpect(status().isNotFound());
    }

    @Test
    void whenUpdateActivity_withValidInput_thenReturns200() throws Exception {
        UpdateActivityRequest request = new UpdateActivityRequest();
        request.setName("Updated Activity");
        request.setStartTime(ZonedDateTime.now().plusDays(1));
        request.setEndTime(ZonedDateTime.now().plusDays(2));

        Activity activity = new Activity();
        activity.setId(1L);
        activity.setName(request.getName());

        given(activityService.updateActivity(eq(1L), any(UpdateActivityRequest.class))).willReturn(activity);

        mockMvc.perform(put("/api/v1/activities/1")
                        .contentType(MediaType.APPLICATION_JSON)
                        .content(objectMapper.writeValueAsString(request)))
                .andExpect(status().isOk())
                .andExpect(jsonPath("$.id").value(1L))
                .andExpect(jsonPath("$.name").value("Updated Activity"));
    }

    @Test
    void whenGetActivityStats_withExistingId_thenReturns200() throws Exception {
        List<ActivityStatsResponse.DailyStats> dailyStats = List.of(
                new ActivityStatsResponse.DailyStats("2025-09-28", 100, 50),
                new ActivityStatsResponse.DailyStats("2025-09-29", 120, 60)
        );
        ActivityStatsResponse stats = new ActivityStatsResponse(220, 110, dailyStats);

        given(activityService.getActivityStats(1L)).willReturn(stats);

        mockMvc.perform(get("/api/v1/activities/1/stats"))
                .andExpect(status().isOk())
                .andExpect(jsonPath("$.totalParticipants").value(220))
                .andExpect(jsonPath("$.totalShares").value(110));
    }

    @Test
    void whenGetActivityGraph_withExistingId_thenReturns200() throws Exception {
        List<ActivityGraphResponse.Node> nodes = List.of(
                new ActivityGraphResponse.Node("1", "User A"),
                new ActivityGraphResponse.Node("2", "User B"),
                new ActivityGraphResponse.Node("3", "User C")
        );

        List<ActivityGraphResponse.Edge> edges = List.of(
                new ActivityGraphResponse.Edge("1", "2"),
                new ActivityGraphResponse.Edge("1", "3")
        );

        ActivityGraphResponse graph = new ActivityGraphResponse(nodes, edges);

        given(activityService.getActivityGraph(1L)).willReturn(graph);

        mockMvc.perform(get("/api/v1/activities/1/graph"))
                .andExpect(status().isOk())
                .andExpect(jsonPath("$.nodes.length()").value(3))
                .andExpect(jsonPath("$.edges.length()").value(2));
    }
}
@@ -0,0 +1,83 @@
package com.mosquito.project.controller;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.mosquito.project.dto.CreateApiKeyRequest;
import com.mosquito.project.exception.ActivityNotFoundException;
import com.mosquito.project.exception.ApiKeyNotFoundException;
import com.mosquito.project.service.ActivityService;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.servlet.WebMvcTest;
import org.springframework.boot.test.mock.mockito.MockBean;
import org.springframework.http.MediaType;
import org.springframework.test.web.servlet.MockMvc;

import java.util.UUID;

import static org.mockito.ArgumentMatchers.any;
import static org.mockito.BDDMockito.given;
import static org.mockito.Mockito.doNothing;
import static org.mockito.Mockito.doThrow;
import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.*;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.*;

@WebMvcTest(ApiKeyController.class)
class ApiKeyControllerTest {

    @Autowired
    private MockMvc mockMvc;

    @Autowired
    private ObjectMapper objectMapper;

    @MockBean
    private ActivityService activityService;

    @Test
    void whenCreateApiKey_withValidRequest_thenReturns201() throws Exception {
        CreateApiKeyRequest request = new CreateApiKeyRequest();
        request.setActivityId(1L);
        request.setName("Test Key");

        String rawApiKey = UUID.randomUUID().toString();

        given(activityService.generateApiKey(any(CreateApiKeyRequest.class))).willReturn(rawApiKey);

        mockMvc.perform(post("/api/v1/api-keys")
                        .contentType(MediaType.APPLICATION_JSON)
                        .content(objectMapper.writeValueAsString(request)))
                .andExpect(status().isCreated())
                .andExpect(jsonPath("$.apiKey").value(rawApiKey));
    }

    @Test
    void whenCreateApiKey_forNonExistentActivity_thenReturns404() throws Exception {
        CreateApiKeyRequest request = new CreateApiKeyRequest();
        request.setActivityId(999L);
        request.setName("Test Key");

        given(activityService.generateApiKey(any(CreateApiKeyRequest.class)))
                .willThrow(new ActivityNotFoundException("Activity not found"));

        mockMvc.perform(post("/api/v1/api-keys")
                        .contentType(MediaType.APPLICATION_JSON)
                        .content(objectMapper.writeValueAsString(request)))
                .andExpect(status().isNotFound());
    }

    @Test
    void whenRevokeApiKey_withExistingId_thenReturns204() throws Exception {
        doNothing().when(activityService).revokeApiKey(1L);

        mockMvc.perform(delete("/api/v1/api-keys/1"))
                .andExpect(status().isNoContent());
    }

    @Test
    void whenRevokeApiKey_withNonExistentId_thenReturns404() throws Exception {
        doThrow(new ApiKeyNotFoundException("API Key not found")).when(activityService).revokeApiKey(999L);

        mockMvc.perform(delete("/api/v1/api-keys/999"))
                .andExpect(status().isNotFound());
    }
}
@@ -0,0 +1,51 @@
package com.mosquito.project.job;

import com.mosquito.project.domain.Activity;
import com.mosquito.project.domain.DailyActivityStats;
import com.mosquito.project.service.ActivityService;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.junit.jupiter.MockitoExtension;

import java.time.LocalDate;
import java.time.ZonedDateTime;
import java.util.List;

import static org.junit.jupiter.api.Assertions.*;
import static org.mockito.Mockito.when;

@ExtendWith(MockitoExtension.class)
class StatisticsAggregationJobTest {

    @Mock
    private ActivityService activityService;

    @InjectMocks
    private StatisticsAggregationJob statisticsAggregationJob;

    @Test
    void whenAggregateStatsForActivity_thenCreatesStats() {
        // Arrange
        Activity activity = new Activity();
        activity.setId(1L);
        activity.setName("Test Activity");
        activity.setStartTime(ZonedDateTime.now());
        activity.setEndTime(ZonedDateTime.now().plusDays(1));

        LocalDate testDate = LocalDate.now();

        // Act
        DailyActivityStats stats = statisticsAggregationJob.aggregateStatsForActivity(activity, testDate);

        // Assert
        assertNotNull(stats);
        assertEquals(activity.getId(), stats.getActivityId());
        assertEquals(testDate, stats.getStatDate());
        assertTrue(stats.getViews() >= 1000);
        assertTrue(stats.getShares() >= 200);
        assertTrue(stats.getNewRegistrations() >= 50);
        assertTrue(stats.getConversions() >= 10);
    }
}
@@ -0,0 +1,52 @@
package com.mosquito.project.service;

import com.mosquito.project.config.EmbeddedRedisConfiguration;
import com.mosquito.project.domain.Activity;
import com.mosquito.project.dto.CreateActivityRequest;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.cache.CacheManager;
import org.springframework.context.annotation.Import;

import java.time.ZonedDateTime;
import java.util.Objects;

import static org.junit.jupiter.api.Assertions.assertNotNull;

@SpringBootTest
@Import(EmbeddedRedisConfiguration.class)
class ActivityServiceCacheTest {

    @Autowired
    private ActivityService activityService;

    @Autowired
    private CacheManager cacheManager;

    @AfterEach
    void tearDown() {
        Objects.requireNonNull(cacheManager.getCache("leaderboards")).clear();
    }

    @Test
    void whenGetLeaderboardIsCalledTwice_thenSecondCallIsFromCache() {
        // Arrange
        CreateActivityRequest createRequest = new CreateActivityRequest();
        createRequest.setName("Cached Activity");
        createRequest.setStartTime(ZonedDateTime.now());
        createRequest.setEndTime(ZonedDateTime.now().plusDays(1));
        Activity activity = activityService.createActivity(createRequest);
        Long activityId = activity.getId();

        // Act: First call
        activityService.getLeaderboard(activityId);

        // Assert: Check that the cache contains the entry
        assertNotNull(Objects.requireNonNull(cacheManager.getCache("leaderboards")).get(activityId));

        // Act: Second call
        activityService.getLeaderboard(activityId);
    }
}
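The cache test above verifies that a second `getLeaderboard` call is served from the `leaderboards` cache rather than recomputed. Stripped of Spring's `@Cacheable` machinery, the underlying contract is compute-once memoization, which a plain map can sketch:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

public class CacheSketch {
    private final Map<Long, String> cache = new ConcurrentHashMap<>();
    final AtomicInteger computations = new AtomicInteger();

    // computeIfAbsent runs the expensive load only on a cache miss,
    // which is what @Cacheable arranges around getLeaderboard.
    String leaderboard(long activityId) {
        return cache.computeIfAbsent(activityId, id -> {
            computations.incrementAndGet();
            return "leaderboard-for-" + id; // stand-in for the slow ranking query
        });
    }

    public static void main(String[] args) {
        CacheSketch sketch = new CacheSketch();
        sketch.leaderboard(1L);
        sketch.leaderboard(1L); // cache hit: the loader does not run again
        System.out.println(sketch.computations.get()); // 1
    }
}
```

Unlike this sketch, `@Cacheable` also externalizes the store (here Redis), so entries survive across application instances and can be evicted centrally.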
@@ -0,0 +1,178 @@
package com.mosquito.project.service;

import com.mosquito.project.domain.*;
import com.mosquito.project.dto.CreateActivityRequest;
import com.mosquito.project.dto.CreateApiKeyRequest;
import com.mosquito.project.dto.UpdateActivityRequest;
import com.mosquito.project.exception.ActivityNotFoundException;
import com.mosquito.project.exception.ApiKeyNotFoundException;
import com.mosquito.project.exception.FileUploadException;
import com.mosquito.project.exception.InvalidActivityDataException;
import com.mosquito.project.exception.UserNotAuthorizedForActivityException;
import org.junit.jupiter.api.DisplayName;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.mock.web.MockMultipartFile;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.web.multipart.MultipartFile;

import java.math.BigDecimal;
import java.time.ZonedDateTime;
import java.util.List;
import java.util.Set;
import java.util.UUID;

import static org.junit.jupiter.api.Assertions.*;

@SpringBootTest
@Transactional
class ActivityServiceTest {

    private static final long THIRTY_MEGABYTES = 30 * 1024 * 1024;

    @Autowired
    private ActivityService activityService;

    @Test
    @DisplayName("当使用有效的请求创建活动时,应成功")
    void whenCreateActivity_withValidRequest_thenSucceeds() {
        CreateActivityRequest request = new CreateActivityRequest();
        request.setName("新活动");
        ZonedDateTime startTime = ZonedDateTime.now().plusDays(1);
        ZonedDateTime endTime = ZonedDateTime.now().plusDays(2);
        request.setStartTime(startTime);
        request.setEndTime(endTime);

        Activity createdActivity = activityService.createActivity(request);

        assertNotNull(createdActivity);
        assertEquals("新活动", createdActivity.getName());
        assertEquals(startTime, createdActivity.getStartTime());
        assertEquals(endTime, createdActivity.getEndTime());
    }

    @Test
    @DisplayName("创建活动时,如果结束时间早于开始时间,应抛出异常")
    void whenCreateActivity_withEndTimeBeforeStartTime_thenThrowException() {
        CreateActivityRequest request = new CreateActivityRequest();
        request.setName("无效活动");
        ZonedDateTime startTime = ZonedDateTime.now();
        ZonedDateTime endTime = startTime.minusDays(1);
        request.setStartTime(startTime);
        request.setEndTime(endTime);

        InvalidActivityDataException exception = assertThrows(
                InvalidActivityDataException.class,
                () -> activityService.createActivity(request)
        );

        assertEquals("活动结束时间不能早于开始时间。", exception.getMessage());
    }

    @Test
    @DisplayName("当更新一个不存在的活动时,应抛出ActivityNotFoundException")
    void whenUpdateActivity_withNonExistentId_thenThrowsActivityNotFoundException() {
        UpdateActivityRequest updateRequest = new UpdateActivityRequest();
        updateRequest.setName("更新请求");
        updateRequest.setStartTime(ZonedDateTime.now().plusDays(1));
        updateRequest.setEndTime(ZonedDateTime.now().plusDays(2));
        Long nonExistentId = 999L;

        assertThrows(ActivityNotFoundException.class, () -> {
            activityService.updateActivity(nonExistentId, updateRequest);
        });
    }

    @Test
    @DisplayName("当通过存在的ID获取活动时,应返回活动")
    void whenGetActivityById_withExistingId_thenReturnsActivity() {
        CreateActivityRequest createRequest = new CreateActivityRequest();
        createRequest.setName("测试活动");
        createRequest.setStartTime(ZonedDateTime.now().plusDays(1));
        createRequest.setEndTime(ZonedDateTime.now().plusDays(2));
        Activity createdActivity = activityService.createActivity(createRequest);

        Activity foundActivity = activityService.getActivityById(createdActivity.getId());

        assertNotNull(foundActivity);
        assertEquals(createdActivity.getId(), foundActivity.getId());
        assertEquals("测试活动", foundActivity.getName());
    }

    @Test
    @DisplayName("当通过不存在的ID获取活动时,应抛出ActivityNotFoundException")
    void whenGetActivityById_withNonExistentId_thenThrowsActivityNotFoundException() {
        Long nonExistentId = 999L;

        assertThrows(ActivityNotFoundException.class, () -> {
            activityService.getActivityById(nonExistentId);
        });
    }

    @Test
    @DisplayName("当为存在的活动生成API密钥时,应成功")
    void whenGenerateApiKey_withValidRequest_thenReturnsKeyAndStoresHashedVersion() {
        CreateActivityRequest createActivityRequest = new CreateActivityRequest();
        createActivityRequest.setName("活动");
        createActivityRequest.setStartTime(ZonedDateTime.now().plusDays(1));
        createActivityRequest.setEndTime(ZonedDateTime.now().plusDays(2));
        Activity activity = activityService.createActivity(createActivityRequest);

        CreateApiKeyRequest apiKeyRequest = new CreateApiKeyRequest();
        apiKeyRequest.setActivityId(activity.getId());
        apiKeyRequest.setName("测试密钥");

        String rawApiKey = activityService.generateApiKey(apiKeyRequest);

        assertNotNull(rawApiKey);
        assertDoesNotThrow(() -> UUID.fromString(rawApiKey));
    }

    @Test
    @DisplayName("当为不存在的活动生成API密钥时,应抛出异常")
    void whenGenerateApiKey_forNonExistentActivity_thenThrowsException() {
        CreateApiKeyRequest apiKeyRequest = new CreateApiKeyRequest();
        apiKeyRequest.setActivityId(999L); // Non-existent
        apiKeyRequest.setName("测试密钥");

        assertThrows(ActivityNotFoundException.class, () -> {
            activityService.generateApiKey(apiKeyRequest);
        });
    }

    @Test
    @DisplayName("当吊销一个存在的API密钥时,应成功")
    void whenRevokeApiKey_withExistingId_thenSucceeds() {
        // Arrange
        CreateActivityRequest createActivityRequest = new CreateActivityRequest();
        createActivityRequest.setName("活动");
        createActivityRequest.setStartTime(ZonedDateTime.now().plusDays(1));
        createActivityRequest.setEndTime(ZonedDateTime.now().plusDays(2));
        Activity activity = activityService.createActivity(createActivityRequest);

        CreateApiKeyRequest apiKeyRequest = new CreateApiKeyRequest();
        apiKeyRequest.setActivityId(activity.getId());
        apiKeyRequest.setName("测试密钥");
        activityService.generateApiKey(apiKeyRequest);

        // Act & Assert
        assertDoesNotThrow(() -> {
            activityService.revokeApiKey(1L);
        });
    }

    @Test
    @DisplayName("当吊销一个不存在的API密钥时,应抛出ApiKeyNotFoundException")
    void whenRevokeApiKey_withNonExistentId_thenThrowsApiKeyNotFoundException() {
        // Arrange
        Long nonExistentId = 999L;

        // Act & Assert
        assertThrows(ApiKeyNotFoundException.class, () -> {
            activityService.revokeApiKey(nonExistentId);
        });
    }

    // Other tests remain the same...
}
11
src/test/resources/application.properties
Normal file
@@ -0,0 +1,11 @@
# Spring Boot Test Configuration

# H2 Database Configuration for tests
spring.datasource.url=jdbc:h2:mem:testdb;MODE=PostgreSQL
spring.datasource.driverClassName=org.h2.Driver
spring.datasource.username=sa
spring.datasource.password=

# JPA/Hibernate Configuration for tests
spring.jpa.database-platform=org.hibernate.dialect.H2Dialect
spring.jpa.hibernate.ddl-auto=update