When Dashboards Fail Their Purpose
What reviewing dozens of dashboards taught me about clarity, value, and user needs
Lately I've been reviewing a lot of different dashboards, and the conversations often go something like this:
What is the dashboard about? Why was it created?
It's about team efficiency.
How is efficiency measured?
By the number of completed tasks and the speed of their completion.
Why do people look at this?
To understand who on the team is performing better or worse.
Which chart shows that?
[they scroll down the page and find a table with this data, or start applying a bunch of filters to get one of the charts to display what they need]
If the dashboard's main purpose can't be achieved just by looking at it, there's a good chance something's wrong with it. Either we don't understand the goal, or we've designed it in a way that's just not user-friendly.
The second type of question:
What can we say based on this chart?
That we have this particular price for the product.
Okay, we most likely already know what our product costs. What can we add in terms of value?
There are benchmarks and price dynamics in a table below.
Let's move that to the top and immediately show where our price stands relative to the benchmark and what its trend is.
Every chart on a dashboard should deliver some value and answer a specific user question. Sometimes you look at something and it's genuinely beautiful; you just enjoy it visually. But it doesn't actually solve any problem for the users: it's just a pretty picture about something they already knew. And then you painfully have to cut it...
P.S. I've simplified the examples a lot for illustration purposes, but I think the point comes across.

