Every CloudSpokes challenge has three important dates: the (1) Start Date, (2) End Date, and (3) Winner Announced Date. Each date is set manually at the start of a challenge, and the Winner Announced Date is the only one that involves a healthy level of discretion.
The Winner Announced Date is clearly a prediction, but it's a prediction that we need to be very aware of when it comes to judging and scoring challenges.
What we noticed in the system, and heard from our members, was that we were either predicting our Winner Announced Date incorrectly or not getting our challenge reviewers to judge and score challenges within their allotted timeframe. We do not want to establish a practice of missing deadlines with the community.
In May we set up the dashboard below and began measuring how we were managing the scoring of challenges after the end date. The result of this focus is that over the last couple of months, we've improved greatly on both our Winner Announced Date predictions and our time to score. Back in April, our actual announcement date was averaging around 5 days behind the projected Winner Announced Date, which isn't good. Today, we're proud to report that our actual date now averages about 1 day ahead of the projection:
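The metric behind that dashboard is simple: for each challenge, take the difference in days between the actual and projected Winner Announced Dates (positive means late), then average across challenges. A minimal sketch, using made-up sample dates since the real dashboard data isn't shown here:

```python
from datetime import date

# Hypothetical sample of challenges with projected vs. actual
# Winner Announced Dates (not real CloudSpokes data).
challenges = [
    {"projected": date(2012, 4, 2), "actual": date(2012, 4, 7)},
    {"projected": date(2012, 4, 9), "actual": date(2012, 4, 13)},
    {"projected": date(2012, 4, 16), "actual": date(2012, 4, 22)},
]

def average_slip_days(records):
    """Average of (actual - projected) in days; positive means we announced late."""
    deltas = [(r["actual"] - r["projected"]).days for r in records]
    return sum(deltas) / len(deltas)

print(average_slip_days(challenges))  # 5.0 for this sample: ~5 days late on average
```

A negative result would mean winners are being announced ahead of the projected date, which is the trend described above.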
As CloudSpokes scales and grows, we'll certainly experience growing pains and find areas for improvement. Listening to feedback from the community and addressing it as quickly as possible will always be a priority. Thanks to all the community members who surfaced this concern earlier in the year, and for your trust that we would address it.
For future feedback, join us on our community forum.