💡 Introducing Mandatory Comments for a More Transparent Evaluation Process
We've released an update that brings more insight and accountability to your evaluation process. While it may seem like a small change, it's designed to clarify how scores are provided and make the evaluation journey more transparent.
💬 New! Mandatory Comments for Evaluators
Based on valuable feedback, we've added a setting to the advanced evaluation module that lets you require evaluators to provide a comment with each score they give. This ensures every score is backed by a clear rationale, giving you the context needed to justify final decisions during consensus scoring.
This setting is off by default. If you want to use it for your evaluations, you will need to enable it in the advanced evaluation module settings.
🛠️ Other Enhancements & Fixes
We've also included a number of other improvements and fixes to enhance your experience across the platform.
- Mail Merge Codes: We've updated the mail merge codes for evaluation document generation to accurately show both initial and consensus scores.
- PDF Generation: We've resolved an issue where KPI labels were overlapping other text when generating a PDF document, ensuring a cleaner layout.
We hope these updates make your evaluation process more effective. As always, we appreciate your feedback on how we can continue to improve.