Competition - Score Calculation Options
When creating a new competition, users can select multiple metrics, varying scoring types (points or value), and advanced scoring options.
What are the advanced Scoring Options within competitions?
How are challenge scores calculated?
Can I view Scoring Projections for a competition?
Why do I see negative scores or drops in users' scores?
Scoring Options - Definitions
Each option below is described by its Recommended Use and its Behavior.
Use Per-User Average
Recommended Use: Recommended when competing groups are not the same size and the challenge time frame is at least one day. If an hour-long or half-day challenge uses averages, the full day's average is used in the score calculation. Because Activity and Objective Scores are inherently averaged, they already use a per-user average; this toggle has no additional impact on either.
Behavior: If toggled off, the metric value is used: the value of the metric associated with the entity generates the score. Example: if a user or team creates 10 new leads every day for one work week, their value is 50. If toggled on, the average value is used: when the entity being scored is a super entity (a team), the average metric value of its sub entities (users) generates the score. Example: Team A has two team members with weekly activity scores of 80 and 100, so their average score is 90 [(80 + 100) / 2].
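To make the toggle's effect concrete, here is a minimal Python sketch using the Team A example above. The team_score helper and its inputs are illustrative assumptions, not Ambition's actual data model.

```python
# Minimal sketch of the Use Per-User Average toggle (illustrative only).

team_a_weekly_scores = {"member_1": 80, "member_2": 100}  # Team A's activity scores

def team_score(scores_by_user, use_per_user_average):
    values = list(scores_by_user.values())
    if use_per_user_average:
        # Toggled on: average each sub entity's (user's) value.
        return sum(values) / len(values)
    # Toggled off: use the raw metric value associated with the team.
    return sum(values)

print(team_score(team_a_weekly_scores, use_per_user_average=True))   # 90.0
print(team_score(team_a_weekly_scores, use_per_user_average=False))  # 180
```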
Sum Daily Averages (Aggregation Method)
Recommended Use: Recommended for long durations where moving the average becomes increasingly difficult.
Behavior: For competitions that last longer than a day and therefore contain more than one daily metric value (Monday's value, Tuesday's value, and so on), Ambition must determine how to aggregate those values, for example by summing or averaging them. If toggled off, default scoring applies: the aggregation depends on the metric being scored. If the metric is an average metric, the daily values are averaged across the time span of the competition; if it is a sum metric, the daily values are summed. Note: Activity Score and Objective Score are average metrics. If toggled on, sum scoring applies: the metric type is ignored and the daily values are always summed together.
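A small sketch of how this toggle changes aggregation for a multi-day competition, assuming one value per competition day. The aggregate helper and the metric_type parameter are hypothetical names used only for illustration.

```python
# Sketch of the Sum Daily Averages toggle for a multi-day competition.

daily_values = [80, 100, 90]  # e.g. a 3-day Activity Score series

def aggregate(daily_values, metric_type, sum_daily_averages=False):
    if sum_daily_averages:
        # Toggled on: ignore the metric type and always sum the daily values.
        return sum(daily_values)
    if metric_type == "average":
        # Default for average metrics (Activity/Objective Score):
        # average the daily values across the competition's time span.
        return sum(daily_values) / len(daily_values)
    # Default for sum metrics: sum the daily values.
    return sum(daily_values)

print(aggregate(daily_values, "average"))                           # 90.0
print(aggregate(daily_values, "average", sum_daily_averages=True))  # 270
```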
Exclude Weekends
Recommended Use: Do not include weekends in the scoring calculation. Only applicable to competitions that span a Saturday or Sunday.
Behavior: If toggled on, all weekend values are ignored; any metrics a user generates over the weekend do not count toward the competition's score. If you are using a Monthly Objective Score scoring parameter and want the competition score to match the Group's Dashboard, make sure this is toggled off.
Out of Office Competitors Do Not Earn Points
Recommended Use: Employees who are out for sick days, PTO, etc. are not calculated into a team's score.
Behavior: If toggled on, metric values generated on days where employees are marked Out of Office are ignored.
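The sketch below illustrates both exclusion toggles (Exclude Weekends and Out of Office Competitors Do Not Earn Points) as a simple filter over daily records. The record structure and field names are assumptions, not Ambition's actual schema.

```python
# Illustrative sketch: exclusion toggles filter daily records out
# before any score is computed.

from datetime import date

records = [
    {"date": date(2021, 6, 4), "value": 20, "out_of_office": False},  # Friday
    {"date": date(2021, 6, 5), "value": 15, "out_of_office": False},  # Saturday
    {"date": date(2021, 6, 7), "value": 25, "out_of_office": True},   # Monday, PTO
]

def eligible(record, exclude_weekends, exclude_out_of_office):
    if exclude_weekends and record["date"].weekday() >= 5:  # 5 = Sat, 6 = Sun
        return False
    if exclude_out_of_office and record["out_of_office"]:
        return False
    return True

score = sum(r["value"] for r in records
            if eligible(r, exclude_weekends=True, exclude_out_of_office=True))
print(score)  # 20 -- only Friday's value counts
```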
Min Value*
Recommended Use: A "pay to play" setup: a competitor must cross a threshold before generating a score.
Behavior: The floor that must be reached before any score contribution is made. Note: once the threshold is crossed, a user's score reflects their full metric value rather than counting up from $0 at the threshold. Example: with a Min Value of $10,000 in Revenue, a user who reaches $10,500 scores on the full $10,500, not on the $500 above the threshold.
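A minimal sketch of the Min Value behavior described above, using the $10,000 Revenue example; the helper name is hypothetical.

```python
# Sketch of the Min Value threshold: below the floor a metric contributes
# nothing; once the floor is crossed, the full metric value counts.

def min_value_contribution(metric_value, min_value):
    return metric_value if metric_value >= min_value else 0

print(min_value_contribution(9_500, 10_000))   # 0      -- floor not reached
print(min_value_contribution(10_500, 10_000))  # 10500  -- full value counts
```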
Max Value*
Recommended Use: A "cap" on the contribution a competitor can receive from a single metric, preventing score inflation from any one metric.
Behavior: The ceiling at which no additional score contribution is made. Value scoring example: if value scoring (1:1) is used and a Max Value of 20 is set for Calls, the maximum points you can receive in the competition for Calls is 20. Points scoring example: if Calls are worth 2 points apiece and a Max Value of 20 is set, the maximum points you can receive in the competition for Calls is 40.
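A minimal sketch of the Max Value cap for both value scoring and points scoring; the points_per_unit parameter is an illustrative assumption (value scoring is simply 1 point per unit).

```python
# Sketch of the Max Value cap: the metric value is capped before any
# points multiplier is applied.

def max_value_contribution(metric_value, max_value, points_per_unit=1):
    return min(metric_value, max_value) * points_per_unit

print(max_value_contribution(35, 20))                     # 20 -- value scoring (1:1)
print(max_value_contribution(35, 20, points_per_unit=2))  # 40 -- 2 points per Call
```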
* Note about Thresholds and Multi-day Challenges
A minimum or maximum threshold applies to the Competition's duration as a whole.
Competition Calculations
Competition calculations also depend on the Competition Metric's metric type.
- Metric Type: Average
  - Scoring Type: Summed (Sum Daily Averages toggle selected)
  - Scoring Type: Averaged
- Metric Type: Sum
  - Scoring Type: Summed
  - Scoring Type: Averaged (Use Per-User Average toggle selected)
- Metric Type: Count
  - Scoring Type: Summed
  - Scoring Type: Averaged (Use Per-User Average toggle selected)
Metric Type: Average
User within Account Executive | Average Deal Size 6/1/21 | Average Deal Size 6/2/21 |
Colin | $10,000 | $15,000 |
Katie | $20,000 | $14,000 |
Caroline | $16,000 |
Table A: Average Deal Size for users within the Account Executive Role (Group) from 6/1/21 to 6/2/21
Scoring Type: Averaged
For a Competition that is 2 days long with data from Table A, Ambition performs a records-based average:
($10,000 + $20,000 + $15,000 + $14,000 + $16,000)/5 = $15,000
Scoring Type: Summed (Sum Daily Averages toggle selected)
For a Competition that is 2 days long with data from Table A, Ambition sums the users' daily averages:
($10,000 + $20,000 + $15,000 + $14,000 + $16,000) = $75,000
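The same Table A math, expressed as a short Python sketch. The dictionary layout is illustrative only; Caroline's missing day is simply an absent record, matching the records-based calculation above.

```python
# Worked version of Table A's calculations (Average metric type).

table_a = {
    "Colin":    [10_000, 15_000],
    "Katie":    [20_000, 14_000],
    "Caroline": [16_000],          # only one recorded day
}

records = [value for days in table_a.values() for value in days]

# Scoring Type: Averaged -- records-based average of every daily value.
print(sum(records) / len(records))  # 15000.0

# Scoring Type: Summed (Sum Daily Averages toggled on) -- sum of the daily values.
print(sum(records))                 # 75000
```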
Metric Type: Sum
User within Account Executive | Amount Closed 6/1/21 | Amount Closed 6/2/21 |
Colin | $20,000 | $30,000 |
Katie | $40,000 | $28,000 |
Caroline | $32,000 |
Table B: Amount Closed for users within the Account Executive Role (Group) from 6/1/21 to 6/2/21
Scoring Type: Averaged (Use Per-User Average toggle selected)
For a Competition that is 2 days long with data from Table B, Ambition performs a group-based average; with 3 users over 2 days there are 6 user-day values, and Caroline's missing day counts as $0:
($20,000 + $40,000 + $30,000 + $28,000 + $32,000 + 0)/6 = $25,000
Scoring Type: Summed
For a Competition that is 2 days long with data from Table B, Ambition will sum the available records.
($20,000 + $40,000 + $30,000 + $28,000 + $32,000) = $150,000
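The same Table B math as a sketch. For the group-based average, every user-day in the two-day window counts, so Caroline's missing day contributes $0; the structures below are illustrative only.

```python
# Worked version of Table B's calculations (Sum metric type).

table_b = {
    "Colin":    [20_000, 30_000],
    "Katie":    [40_000, 28_000],
    "Caroline": [32_000],
}
competition_days = 2
user_days = len(table_b) * competition_days  # 3 users * 2 days = 6

total = sum(sum(days) for days in table_b.values())

# Scoring Type: Averaged (Use Per-User Average toggled on) -- group-based average.
print(total / user_days)  # 25000.0

# Scoring Type: Summed -- sum of the available records.
print(total)              # 150000
```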
Metric Type: Count
User within Account Executive | Deal Count 6/1/21 | Deal Count 6/2/21 |
Colin | 2 | 2 |
Katie | 2 | 3 |
Caroline | 3 |
Table C: Deal Count for users within the Account Executive Role (Group) from 6/1/21 to 6/2/21
Scoring Type: Averaged (Use Per-User Average toggle selected)
For a Competition that is 2 days long with data from Table C, Ambition performs a group-based average; Caroline's missing day counts as 0:
(2 + 2 + 2 + 3 + 3 + 0)/6 = 2
Scoring Type: Summed
For a Competition that is 2 days long with data from Table C, Ambition will sum the users' counts.
(2 + 2 + 2 + 3 + 3) = 12
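Counts follow the same pattern as Table B; here is a compact sketch with Table C's numbers, using illustrative structures only.

```python
# Worked version of Table C's calculations (Count metric type).

table_c = {"Colin": [2, 2], "Katie": [2, 3], "Caroline": [3]}
user_days = len(table_c) * 2  # 3 users * 2 days = 6
total = sum(sum(days) for days in table_c.values())

print(total / user_days)  # 2.0 -- Averaged (Use Per-User Average toggled on)
print(total)              # 12  -- Summed
```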
Scoring Projections
Within the Competition's Overview page, users can see how they or their team stack up against other competitors. A user can view progress toward each Metric in the Challenge, their score progression, and a score projection (dotted line) calculated from historical performance data.
Projections are calculated by analyzing the past 12 weeks of data (if available) to make an educated guess at what an individual user or team will score. The more data the system collects, the more accurate projections become; however, a projection is still just a prediction.
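Ambition's exact projection model isn't documented here, so the sketch below is only a rough stand-in: it extrapolates a competitor's current score from a simple daily pace derived from up to 12 weeks of weekly history. The project_final_score helper and its parameters are hypothetical.

```python
# Hypothetical pace-based projection, NOT the product's actual algorithm.

def project_final_score(current_score, elapsed_days, total_days, weekly_history):
    if not weekly_history:
        # No history available: assume the current pace holds.
        daily_pace = current_score / max(elapsed_days, 1)
    else:
        # Use up to the last 12 weeks of weekly totals to estimate a daily pace.
        recent = weekly_history[-12:]
        daily_pace = sum(recent) / (len(recent) * 7)
    remaining_days = total_days - elapsed_days
    return current_score + daily_pace * remaining_days

print(project_final_score(300, elapsed_days=2, total_days=5,
                          weekly_history=[700, 770, 735]))  # 615.0
```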
Troubleshooting Negative Scores
Occasionally, a competitor's score can dip into negative values. When this happens, it is usually related to the competition's start time and starting value, and it can occur when a competition starts mid-day and lasts less than a day. In this configuration, the competition engine captures each metric's current value at the start of the competition and uses it as the starting point, then computes scores going forward from that starting value and the new incoming values. If an incoming metric record is a correction that lowers the metric's running total, the change translates into a negative competition score.
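A small sketch of the scenario described above: the engine snapshots the metric's total when the competition starts and scores the change against it, so a correction record that lowers the running total yields a negative score. The values and variable names are illustrative.

```python
# Illustrative numbers for a short, mid-day competition.

starting_value = 12_000               # metric total captured at competition start
incoming_totals = [12_500, 11_800]    # second record corrects (lowers) the total

score = incoming_totals[-1] - starting_value
print(score)  # -200 -- the correction pushed the score below the starting point
```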