Vivino, the crowd-sourced wine rating app, gets criticized for its five-star rating system. Wine consumers claim that it's too narrow a scale to distinguish products from one another, and that the classic 100-point system is more apt. Beyond that, five-star rating systems popularized by companies like Yelp, TripAdvisor, and Amazon get their fair share of criticism. The most common complaints are the following:
While these criticisms match my experience, they do not undermine the usefulness of five-star rating systems. Despite the narrow range, Vivino has developed a natural bell curve of wine ratings over the past eight years. This graph shows wines with over 20 ratings:
About half of the wines fall within the narrow range of 3.4-3.9. Like most things in life, the majority of wines are average: not too bad and not too good. This is what we'd expect to see in a high-functioning crowd-sourced rating system. We'd also expect a small portion of wines to fall several standard deviations below and above the average, and we see exactly that in the tails of the bell curve above.
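As a rough sanity check, here is a minimal sketch of that intuition. It assumes the ratings are approximately normal with a mean of about 3.65 and a standard deviation of about 0.37; both numbers are assumptions chosen to match the figures quoted above, not Vivino's actual data or methodology.

```python
# A minimal sketch, not Vivino's data: model ratings as a normal distribution
# with an assumed mean of 3.65 and an assumed standard deviation of 0.37.
from scipy.stats import norm

mean, std = 3.65, 0.37            # assumed parameters, not official figures
dist = norm(loc=mean, scale=std)

# Share of wines expected inside the "narrow" 3.4-3.9 band
share_in_band = dist.cdf(3.9) - dist.cdf(3.4)

# Share of wines expected more than two standard deviations from the mean
share_in_tails = 2 * dist.cdf(mean - 2 * std)

print(f"Share rated 3.4-3.9: {share_in_band:.0%}")    # roughly 50%
print(f"Share beyond +/- 2 SD: {share_in_tails:.0%}")  # roughly 5%
```

Under those assumed parameters, roughly half of all wines land in the 3.4-3.9 band and only a few percent sit far out in the tails, which is consistent with the shape of the curve described above.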
Lastly, we do see these ratings affect consumer decisions. Without sharing too much data on Vivino sales: while the average wine on Vivino is rated 3.6-3.7, the average wine sold on Vivino is rated above 4.0.