We just hosted a demo competition for 16 different legaltech companies and allowed the audience to determine the winner. Here's how we did it, who won, and some reflections on the event...
The Methodology:
At the end of each presentation, we prompted the audience to vote on a 1-5 scale in three categories: (1) Innovation, (2) Ease of Use, and (3) the Problem Statement (i.e. is there a real problem here?). Voting was anonymous. We determined the winner by averaging each company's scores within each category and then summing those three category averages to get a final score, for a maximum possible score of 15.
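For anyone who wants to run the same math, here's a minimal sketch of that scoring in Python. The company names and vote numbers below are made up for illustration; they are not the actual demo day data.

```python
# Scoring sketch: final score = sum of the three category averages (max 15).
from statistics import mean

CATEGORIES = ["Innovation", "Ease of Use", "Problem Statement"]

# votes[company][category] is a list of 1-5 audience ratings (illustrative numbers only)
votes = {
    "Company A": {"Innovation": [5, 4, 5], "Ease of Use": [4, 4, 5], "Problem Statement": [5, 5, 4]},
    "Company B": {"Innovation": [3, 4, 3], "Ease of Use": [4, 3, 4], "Problem Statement": [4, 4, 3]},
}

def final_score(company_votes):
    """Average each category's ratings, then sum the three averages."""
    return sum(mean(company_votes[c]) for c in CATEGORIES)

# Rank companies by final score, highest first
for company in sorted(votes, key=lambda c: final_score(votes[c]), reverse=True):
    print(f"{company}: {final_score(votes[company]):.2f} / 15")
```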
Who won?
The overall winner was Marveri with a final score of 13.36. The runner-up was AltFee with a score of 11.88. Given that the final scores ranged from 9.30 to 13.36, the 1.48-point gap between 1st and 2nd was quite a large winning margin.
We also had a few category specific winners:
Commercial drafting category winner: ScreensAI
Backoffice tool winner: AltFee
Litigation tool winner: Goodfact
AI research winner: GC AI
Reflections:
1. I'm really happy we did audience voting... I think this led to a more honest outcome than having a small panel of judges doing the deciding. Bias is real, and legaltech is a small world. We all know each other. I'm glad we opted for more data here. It was also way more fun. 420 total votes were cast during the day.
2. Reflecting on the format a bit, I realized that packing all 16 demos into one day was too much, and audience fatigue began to set in during the afternoon session. This didn't seem to have any noticeable effect on average scores (i.e. the scores from morning and afternoon demos were pretty evenly distributed), but it did have an effect on participation rates (fewer audience members and fewer votes in the afternoon). That's not fair to the afternoon competitors... so it's something I'll reconsider next time.
3. Participation was incredible. The competitors came with their A-game and the energy was amazing. We also had a total audience count of 137... which is 54% of the 251 who signed up. For a midday event on a Thursday, that was amazing to see.
4. Voters didn't hold back. I attribute this to anonymity. When they didn't understand the product, or didn't see the value, they scored accordingly. We had a free-form field at the end where they could leave comments. Many did, with both positive and negative feedback.
5. The best-performing demos let their product do the talking. The time-to-aha! was short. Presenters were able to do this when the problem they were solving was obvious, so it didn't need much explaining.
If you attended demo day, please chime in with your feedback. I'd love to hear how we could have done things better. Because we WILL be doing it again!