Xinova Rubric Review Tool

Overview:
A key driver in achieving the 60% efficiency increase from submission to selection was an innovation to the Solution Review process. Previously, reviews, which were written by a selected subset of network members, were freeform text: each reviewer would simply write whatever they felt. These assessments had no structure, and, more importantly, they did not necessarily reflect the selection criteria the Customer had outlined when setting up the project. Xinova Project Managers then had to carefully read, summarize, and draft their own interpretation of each reviewer's thoughts, making for a time-intensive process.

To address this, I created a rubric-based review tool. This tool allowed Customers and Xinova Project Managers to either select from a template or draft new questions and statements. Each reviewer would then select the one statement they felt best answered each question.

For example, one question might be “Which of the following best describes your opinion of this solution?”, in which case the reviewer would select one of the following possible answers:

  1. This is not an innovation, this solution already exists

  2. This is a minor feature improvement; its value in addressing the problem requires additional consideration

  3. This is a useful feature improvement that may have some value in addressing the problem

  4. This innovation offers a significant improvement that should prove valuable in addressing the problem

  5. This is a groundbreaking innovation that will likely be very valuable in addressing this problem

These statements would then be weighted and combined using simple logic to draft a comprehensive review for each submission. For example, if two reviewers answered #3, four answered #4, and four answered #5, the logic would combine the responses to read: “While some reviewers felt this solution offered a useful feature improvement that may have some value, the majority of reviewers felt this solution offers a significant improvement and is possibly a groundbreaking innovation that should be very valuable in addressing this problem.”
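The combining logic above could be sketched roughly as follows. This is an illustrative assumption, not the production implementation: the phrasing snippets, the tie-breaking rule, and the `draft_review` function name are all invented here to show the idea of mapping rubric answers to text and merging minority and majority views into one sentence.

```python
from collections import Counter

# Illustrative phrasing snippets keyed by rubric answer number
# (assumed wording, not the actual production text).
PHRASES = {
    3: "offered a useful feature improvement that may have some value",
    4: "offers a significant improvement that should prove valuable",
    5: "is possibly a groundbreaking innovation that should be very valuable",
}

def draft_review(answers):
    """Combine rubric answers into one review sentence.

    Simple majority logic: the most common answer (ties broken toward
    the higher rating) becomes the main clause; any other answers are
    acknowledged first as minority views.
    """
    counts = Counter(answers)
    majority = max(counts, key=lambda a: (counts[a], a))
    minority = [a for a in sorted(counts) if a != majority]

    parts = []
    if minority:
        minority_text = " and ".join(PHRASES[a] for a in minority)
        parts.append(f"While some reviewers felt this solution {minority_text},")
    parts.append(
        f"the majority of reviewers felt this solution {PHRASES[majority]} "
        "in addressing this problem."
    )
    return " ".join(parts)

# Two reviewers answered #3, four answered #4, and four answered #5:
print(draft_review([3, 3, 4, 4, 4, 4, 5, 5, 5, 5]))
```

A real version would presumably handle every question in the rubric and stitch the resulting sentences into a full draft review, which is the "logic tree" deliverable described below.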

Deliverables:
Wireframe for the tool
Initial set of rubrics, questions, and answers
Logic tree for weighting and combining the text snippets
Pilot with Reviewers

Impact:
60% decrease in submission-to-selection time
Overwhelming support from the Project Managers
Significant member participation in creating the rubrics

Reviewers start by reading each solution….

…based on user testing, the first question Reviewers answer is a simple thumbs-up/thumbs-down. Reviewers felt this was a helpful step in removing the emotion from their reviews…

…as they progress in their review, a draft review is autogenerated using simple logic to combine the rubric responses…

…after completing the rubric, Reviewers can also enter any additional comments they have about the solution…

…once it has been sent, the entire review team can see the final review…


…while reading a solution, Reviewers can add comments or ask questions of the authors. Once they are ready, Reviewers begin the formal review process by pressing the Start Review button…

…each project has a customized set of questions and responses for its rubric. The Customer and Project Managers select questions from a library, or they can create new rubrics…

…rubrics typically comprise 10-20 questions. Even with more questions, these reviews take a fraction of the time to complete compared to the previous free-form review process…

…after all the Reviewers are finished, the Project Manager receives a Draft Review that includes comments from the Reviewers. The Project Manager can then make any final edits before sending the review to the member(s) who submitted the solution…

…Reviewers can join a discussion with the Project Manager about that solution, seeded with any comments made in the final step of their reviews.
