Thursday, August 30, 2012

Length of Time to Score

Earlier this year, we noticed a trend in our scoring practices that needed correcting. We'd like to update our members on this issue, how we have addressed it, and how we will continue to address it.

Every CloudSpokes challenge has three important dates: (1) the Start Date, (2) the End Date, and (3) the Winner Announced Date. Each date is set manually at the start of a challenge, and the only one that involves a healthy level of discretion is the Winner Announced Date.

The Winner Announced Date is clearly a prediction, but it's a prediction that we need to be very aware of when it comes to judging and scoring challenges.


What we noticed in the system, and heard from our members, was that we were either predicting our Winner Announced Date incorrectly or not getting our challenge reviewers to judge and score challenges within their allotted timeframe. We do not want to establish a practice of "missing deadlines" with the community.

In May we set up the dashboard below and began measuring how we were managing the scoring of challenges after the end date. The result of this focus is that over the last couple of months we've improved greatly on both our Winner Announced Date predictions and our time to score. Back in April, winners were actually announced an average of about 5 days after the projected Winner Announced Date - which isn't good. Today, we're proud to announce that winners are now announced an average of about 1 day before the projected Winner Announced Date:


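For anyone curious about the arithmetic behind a dashboard like this, here is a minimal sketch of how such metrics could be computed. The record structure and field names are illustrative assumptions, not the actual CloudSpokes data model or dashboard query:

    from datetime import date

    # Hypothetical challenge records: field names are illustrative, not the
    # actual CloudSpokes schema.
    challenges = [
        {"projected": date(2012, 8, 10), "actual": date(2012, 8, 9)},
        {"projected": date(2012, 8, 15), "actual": date(2012, 8, 15)},
        {"projected": date(2012, 8, 20), "actual": date(2012, 8, 22)},
    ]

    # Slip in days for each challenge: positive means the winner was announced
    # after the projected date, negative means ahead of it.
    slips = [(c["actual"] - c["projected"]).days for c in challenges]

    # Average slip across challenges (roughly the kind of figure a dashboard
    # like the one described above would track).
    average_slip = sum(slips) / len(slips)

    # Share of challenges announced on or before the projected date
    # (one of the metrics suggested in the comments below).
    on_time_rate = sum(1 for s in slips if s <= 0) / len(slips)

    print(f"Average days from projected to actual: {average_slip:+.1f}")
    print(f"Announced on or before projected date: {on_time_rate:.0%}")

A negative average slip corresponds to announcing winners ahead of the projected date, which is where we want to be.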
As CloudSpokes scales and grows, we'll definitely experience growing pains and find areas for improvement. Listening to feedback from the community and addressing it as quickly as possible will always be a priority. Thanks to all the community members who surfaced this concern earlier in the year, and for your trust that we would address it.

For future feedback, join us on our community forum.

2 comments:

  1. Interesting post. Thanks for incorporating members' feedback. Coordinating with numerous judges and meeting the scheduled deadlines is definitely a very tricky task.

    A couple of other charts that may be interesting to track are:
    a) Average number of days between Planned Winner Announced Date and End Date
    b) % of the contests in which the results were announced on or before the Planned Winner announced date

  2. Thanks Naveen! The closest data I have is the average "start date to end date" and the average "start date to actual winner announced date". For August, these were 6 days and 14 days, respectively.

    Regarding the % of contests scored on or before the planned date, I don't have that metric in front of me. What I can share is that only a handful of challenges go significantly past the planned Winner Announced Date. These challenges often have logistical issues, such as problems with submission packages or a reviewer being unable to proceed as scheduled. The vast majority of challenges are scored within a day of the planned date.
