Miami Benchmark Project 2023-24

1. We can use this space to create all the requirement documents related to features, bugs, and tasks.

2. Proposed (We can begin with this template to update tickets during grooming, and keep improving it based on feedback from the teams using it, so that tickets are user-friendly and contain all the information needed to start development.)

Ticket Template for Miami Benchmark Project

Title: [Brief, descriptive title of the feature or task]

Description:

[A concise overview of the feature or task, providing context and purpose]

User Story: (if applicable)

As a [type of user],

I want to [perform some action],

So that [I can achieve some goal/benefit/value].

Steps to Add/Edit/View the Functional Flow Related to Ticket Scope: (Add screenshots or wireframes if available)

  • Steps to view or add the functional flow for new features, and steps to reproduce bugs

Acceptance Criteria/Test Cases: (To be added by the QA and BA teams based on a full understanding of the ticket scope and functionality.)

  •  Criterion 1

  •  Criterion 2

  •  Criterion 3
    [List specific, testable conditions that must be met for the feature/task to be considered complete]

Technical Requirements:

  • [List any technical specifications or constraints]

  • [Include architecture considerations, if applicable]

  • [Mention any specific technologies or tools to be used]

Dependencies:

  • [List any dependencies on other tasks, features, or external factors]

Effort Estimate: (The dev team will enter or provide this during ticket grooming sessions.)

[Provide an estimate using your team's preferred method (e.g., story points, time)]

Priority: (Will be set by the Business team.)

[Indicate the priority level (e.g., High, Medium, Low)]

Related Documents: 

  • [Links to relevant design documents, mockups, or specifications created in Confluence]

  • [Links to related tickets or epics]

Additional Notes:

[Any other relevant information, considerations, or context]


3. Definition of Done for any ticket:

  •  Code implemented and peer-reviewed

  •  Unit tests written and passing

  •  Integration tests passing

  •  Documentation updated

  •  Product Owner review completed

  •  QA testing completed
    [Customize based on your team's specific definition of done]


4. End Goal - Each JIRA ticket should be:

  1. Clear and informative

  2. Actionable with specific acceptance criteria

  3. Aligned with user needs (through the user story)

  4. Technically detailed

  5. Contextualized within the larger project

  6. Estimable and prioritized

  7. Trackable through the definition of done
