At our company, FlowFabric, the quality guidelines require a peer review for all new or changed functionality. These peer reviews:
1. Result in better products
2. Reduce the required support
3. Lead to cost reduction, because solving issues in production costs 50-100 times more than in development
4. Help you learn from your colleagues, both by seeing new solutions and patterns and by thinking critically about their solutions
5. Improve customer satisfaction
A peer review has three steps:
1. Find the changes
2. Inspect and review the changes
3. Agree with the changes or report your findings and go back to step 1
For step 1 - Find the changes - the following is currently available within the modeler:
1. Commit management: one change at a time, with relevant comments added
2. Add annotations in microflows if needed
3. Mark/find technical debt annotated as TODO
4. Describe required data migrations if applicable
Suggestions for step 1 - Find the changes:
1. See relevant changes in history
2. Partial commits: don't commit everything that was modified, but select what to commit at the document level (microflow, page, domain model)
3. Retrieve the history of a specific document (page/microflow) and, when clicking a revision, see only the changes for that document rather than all changes
4. Mark items to check, optionally based on the history of a certain commit. This could be part of generic bookmarking in the Mendix modeler
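The partial-commit suggestion (point 2 above) is comparable to interactive staging in Git: the reviewer picks which changed documents go into a commit instead of committing everything. A minimal sketch, assuming a simple in-memory list of changed documents; the `Document` and `select_for_commit` names are invented for illustration and are not part of any Mendix API:

```python
# Sketch of a document-level partial commit (suggestion 2 above).
# All names here are illustrative, not an existing modeler API.
from dataclasses import dataclass

@dataclass
class Document:
    name: str        # e.g. "Customer_Overview" (page) or "ACT_CreateOrder" (microflow)
    kind: str        # "page", "microflow", "domain model"
    modified: bool   # changed since the last commit?

def select_for_commit(changed, names_to_commit):
    """Return only the modified documents the developer chose to commit."""
    return [d for d in changed if d.modified and d.name in names_to_commit]

docs = [
    Document("Customer_Overview", "page", True),
    Document("ACT_CreateOrder", "microflow", True),
    Document("Orders", "domain model", False),
]
staged = select_for_commit(docs, {"ACT_CreateOrder"})
print([d.name for d in staged])  # only the selected microflow is staged
```

Committing per document this way would also make each commit easier to review, since one commit maps to one logical change.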
For step 2 - Inspect and review the changes - the following is currently available:
1. Open an editor to inspect the changes visually
Suggestions for step 2 - Review the changes:
1. Compare tooling for expressions and XPaths, which is currently only available on merge conflicts; similar to GitHub's diff view
2. Compare tooling for visual elements like microflows and pages
3. Compare tooling for Java actions
4. Show two panels side by side
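For the text-based part of suggestion 1 (expressions and XPaths), the kind of compare view GitHub shows can be sketched with Python's standard `difflib`; the expressions below are made up for illustration:

```python
# Sketch of text compare for expressions/XPaths (suggestion 1 above),
# producing a unified diff like GitHub's. The expressions are invented.
import difflib

old_expr = "$Customer/TotalOrders > 10 and $Customer/Active"
new_expr = "$Customer/TotalOrders >= 10 and $Customer/Active"

diff = list(difflib.unified_diff(
    [old_expr], [new_expr],
    fromfile="before", tofile="after", lineterm=""))
print("\n".join(diff))
```

A visual compare for microflows and pages (suggestions 2 and 3) is harder, since it means diffing model elements rather than text, but the reviewer-facing idea is the same: show removed and added parts side by side.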
For step 3 - Documentation - the following is currently available:
1. Annotations and documentation
2. Commit comments
Suggestions for step 3 - Documentation:
1. A documentation system with (deep) links to details of pages and microflows, for example a specific line in an assignment or an XPath in a grid on a page. This could start with WYSIWYG annotations and links to fully qualified paths.
2. Extend the sprint with a peer review phase
3. Modeler: link to Sprintr to mark a user story as 'reviewed', and log the review in the Sprintr history
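The deep-link idea in suggestion 1 could be built on fully qualified paths. A sketch of what such links might look like; the `modeler://` URL scheme and the `deep_link` helper are entirely hypothetical, invented here to illustrate the idea:

```python
# Hypothetical deep-link scheme for review documentation (suggestion 1 above).
# The URL format is invented for illustration; it is not an existing Mendix feature.
from urllib.parse import quote

def deep_link(project, qualified_path, detail=None):
    """Build a link to a document detail, e.g. a line in a microflow assignment."""
    link = f"modeler://{project}/{quote(qualified_path, safe='/.')}"
    if detail:
        link += f"#{quote(detail)}"
    return link

print(deep_link("MyApp", "MyModule/ACT_CreateOrder", "assignment-line-3"))
```

Review comments stored with such links would stay navigable: clicking one could open the exact microflow line or page element being discussed.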
Please add other features or comment on these steps.
If you use JIRA, GitHub, TFS, etc., another idea would be to provide custom connectors and/or plugins for the modeler.
Excellent idea; just keep in mind that some clients don't use Sprintr... some use TFS, Git, or other tools to manage the lifecycle of user stories.
Being able to tag microflows would be a good solution (managing tags would be nice, but even something as simple as marking one as 'favorite/star' or similar would be helpful).
I like this idea very much. Better support for the peer review process embedded in the development platform would improve quality and speed up the review process.