The Meeting Health Score: How to Measure if Your Meetings Are Working

Most teams have no idea whether their meetings are getting better or worse. The Meeting Health Score changes that. Here's how to measure, track, and improve meeting effectiveness.

Vik Chadha - Founder, MeetingTango

You Manage What You Measure (Except Meetings)

Companies measure everything. Revenue gets a dashboard. Customer satisfaction gets an NPS score. Employee engagement gets a quarterly survey. Engineering teams track sprint velocity down to the decimal. Marketing measures cost per acquisition, conversion rates, and attribution across every channel.

But meetings? Meetings get nothing.

This is remarkable when you consider the investment. The average employee spends 392 hours per year in meetings — roughly 18% of their working time. For leadership teams, it is worse. Many executives spend 50 to 70 percent of their week in meetings. When you multiply those hours by loaded compensation, the average company is spending $29,000 per employee per year on meetings. For a 50-person company, that is nearly $1.5 million annually.

Any other investment of that size would have a dashboard, a quarterly review, and an executive sponsor. Meetings get a shrug.

This is not because people think meetings are fine. 71% of senior managers say meetings are unproductive and inefficient. But when you ask how they know this, the answer is always the same: gut feel. They sense that meetings are not working. They leave frustrated. They watch the same issues get discussed week after week without resolution. But they have no measurement, no baseline, and no way to track whether things are getting better or worse.

That is the problem I set out to solve with MeetingTango. And the answer starts with a single number: the Meeting Health Score.

What Is a Meeting Health Score?

The Meeting Health Score is a composite metric, rated 1 to 10, that combines multiple signals about meeting effectiveness into a single trackable number. Think of it as a credit score for your meetings — one number that synthesizes several underlying factors into something you can monitor over time.

What makes it different from a simple post-meeting survey is that it blends objective measures with subjective ones. It does not just ask "did you like the meeting?" It asks whether the meeting had structure, whether it respected time boundaries, whether it produced decisions, whether those decisions turned into completed work, and whether the team felt the time was well spent.

Most importantly, it is designed to be tracked over time. A single score is a snapshot. The trend — week over week, month over month — is what actually matters. A Meeting Health Score of 6.5 is not inherently bad. A Meeting Health Score that has been declining from 7.5 to 6.5 over the past eight weeks is a clear signal that something has changed and needs attention.

The 5 Components

The Meeting Health Score is built from five components, each measuring a different dimension of meeting effectiveness. Here is the framework.

1. Preparation Score — Was There an Agenda?

The single strongest predictor of meeting quality is whether the meeting had a structured agenda shared in advance. This is not a controversial claim. Research on meeting effectiveness consistently points the same way: meetings with agendas are roughly 30% shorter and produce twice as many concrete decisions as meetings without them.

The Preparation Score measures not just whether an agenda existed, but how it was built:

  • No agenda at all: 2 out of 10. (Not a zero, because even an unstructured meeting where people show up and talk has some minimal value.)
  • Last-minute agenda: 5 out of 10. Someone threw bullet points together five minutes before the call. Better than nothing, but the team had no time to prepare.
  • AI-generated agenda, reviewed by the team: 9 out of 10. The agenda was built from real inputs — outstanding action items, recurring topics, flagged metrics — and team members had the chance to review and adjust it before the meeting started.
  • Team actively contributed items in advance: 10 out of 10. Multiple team members added agenda items before the meeting, signaling engagement and ownership.

Weight: 20% of the total score.

The reason this matters so much is that preparation creates a compound effect. When people know the agenda in advance, they come prepared to discuss. When they come prepared, decisions happen faster. When decisions happen faster, the meeting stays on time. One component lifts all the others.

2. Time Discipline Score — Did It Run on Time?

Time discipline is the most visible symptom of meeting health. When meetings consistently start late and run over, it trains the team to ignore time boundaries entirely. This creates a destructive spiral: if the meeting is going to run over anyway, there is no incentive to be concise. If there is no incentive to be concise, the meeting runs even longer. Eventually, the leadership meeting that was supposed to be 60 minutes is routinely consuming 90, and everyone has silently accepted it.

The Time Discipline Score measures three things: Did the meeting start on time? Did it end on time? Did individual agenda items stay within their allocated time?

  • Started 5+ minutes late and ran over: 3 out of 10.
  • Started on time, ran 5 minutes over: 7 out of 10.
  • Started on time and ended on time: 10 out of 10.

Weight: 20% of the total score.

A meeting timer is helpful here, but the real driver is culture. Teams that respect time boundaries are teams that respect each other's time. The score just makes the pattern visible.

3. Decision Velocity — Were Decisions Actually Made?

The purpose of a leadership meeting is to make decisions. Not to share updates — that can be done asynchronously. Not to brainstorm — that deserves its own format. The weekly leadership meeting exists so that the people with authority can hear the relevant information, discuss the tradeoffs, and decide.

Decision Velocity measures the ratio of agenda items that produced a clear decision versus items that were deferred, tabled, or left unresolved.

If your meeting had eight discussion items and five resulted in clear decisions with owners and deadlines, your ratio is 63%. If all eight were resolved, it is 100%.

  • Fewer than 40% of items decided: 3 out of 10.
  • 40 to 60% decided: 5 out of 10.
  • 60 to 80% decided: 7 out of 10.
  • 80 to 100% decided: 9 to 10 out of 10.

Weight: 20% of the total score.

A common objection is that some decisions genuinely need more information or time. That is true. But when you track Decision Velocity over time, you quickly see whether deferrals are the exception or the pattern. Teams that habitually defer decisions are teams that are avoiding conflict, lacking information, or unclear on who has authority. The score surfaces the behavior so you can address the root cause.
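The banded rubric above is simple enough to automate. Here is a minimal sketch — the function name is my own, and the linear interpolation in the top band is one reading of the "9 to 10" range in the rubric:

```python
def decision_velocity_score(decided: int, total: int) -> float:
    """Map the decided-to-discussed ratio onto the 1-10 band rubric."""
    if total == 0:
        return 0.0
    ratio = decided / total
    if ratio < 0.40:
        return 3.0
    if ratio < 0.60:
        return 5.0
    if ratio < 0.80:
        return 7.0
    # 80-100% decided: interpolate linearly from 9 to 10
    return 9.0 + (ratio - 0.80) / 0.20

# 5 of 8 items decided (62.5%) falls in the 60-80% band
print(decision_velocity_score(5, 8))  # 7.0
```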

4. Action Item Completion Rate — Did Last Week's Items Get Done?

This is the most heavily weighted component, at 25% of the total score, because it is the ultimate measure of whether your meetings create real-world results.

Every effective leadership meeting produces action items: specific commitments with clear owners and deadlines. The question is whether those commitments actually get fulfilled by the next meeting. If they do, the meeting is a machine that converts discussion into progress. If they do not, the meeting is theater — it feels productive in the moment but changes nothing.

The scoring is straightforward:

  • 90% or higher completion: 10 out of 10.
  • 80 to 89%: 8 out of 10.
  • 70 to 79%: 6 out of 10.
  • Below 70%: Proportional to the percentage.

Weight: 25% of the total score.

Low action item completion is the most damaging pattern a leadership team can develop. When commitments are routinely broken, the team learns that words in the meeting do not mean anything. Cynicism takes root. The best people stop volunteering for action items because they know nobody will be held accountable anyway. An action item tracker that carries incomplete items forward and makes the pattern visible is one of the highest-leverage tools a leadership team can adopt.
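The completion rubric can be computed mechanically each week. A sketch — note that the exact reading of "proportional to the percentage" below 70% is an assumption here (percentage divided by ten), as is treating a week with no assigned items as fully complete:

```python
def action_completion_score(completed: int, assigned: int) -> float:
    """Action Item Completion rubric: 90%+ -> 10, 80-89% -> 8, 70-79% -> 6;
    below 70%, 'proportional to the percentage' is read here as pct / 10."""
    if assigned == 0:
        return 10.0  # assumption: nothing assigned counts as fully complete
    pct = 100 * completed / assigned
    if pct >= 90:
        return 10.0
    if pct >= 80:
        return 8.0
    if pct >= 70:
        return 6.0
    return pct / 10

# 12 of 13 items completed is about 92%
print(action_completion_score(12, 13))  # 10.0
```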

5. Participant Rating — How Did the Team Feel?

The final component is the simplest: a quick post-meeting poll where each participant rates the meeting on a scale of 1 to 10. This takes ten seconds and captures the subjective experience that the objective metrics cannot.

A meeting can score well on preparation, time, decisions, and completion — and still feel draining if the tone is adversarial, the same person dominates every discussion, or the team never gets to the strategic topics because the agenda is consumed by operational firefighting.

The Participant Rating captures engagement, energy, psychological safety, and the team's sense of forward progress. It is the component most likely to surface issues that the other four miss.

Weight: 15% of the total score.

The lower weight reflects that subjective ratings are inherently noisier than objective measures. But they matter. A team that consistently rates meetings below 6 is a team that is losing engagement, regardless of what the other numbers say.

How to Calculate It

The Meeting Health Score is a weighted average of the five components:

Meeting Health Score = (Preparation x 0.20) + (Time Discipline x 0.20) + (Decision Velocity x 0.20) + (Action Item Completion x 0.25) + (Participant Rating x 0.15)

Here is a concrete example. Suppose your last leadership meeting looked like this:

  • Preparation Score: 8 (agenda was shared in advance, a few team members added items)
  • Time Discipline: 7 (started on time, ran about 5 minutes over)
  • Decision Velocity: 6 (four of seven items decided, three deferred)
  • Action Item Completion: 9 (12 of 13 items from last week completed)
  • Participant Rating: 8 (team felt it was a good meeting)

Score = (8 x 0.20) + (7 x 0.20) + (6 x 0.20) + (9 x 0.25) + (8 x 0.15) = 1.6 + 1.4 + 1.2 + 2.25 + 1.2 = 7.65

A 7.65 is a solid score. Not perfect, but well above average. The clear area for improvement is Decision Velocity — three deferred items suggests the team may be avoiding hard calls or missing the information they need to decide.
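The same calculation in code, using the worked example's component scores. This is a minimal sketch; the dictionary keys are illustrative names, not part of the framework:

```python
# Component weights from the framework above
WEIGHTS = {
    "preparation": 0.20,
    "time_discipline": 0.20,
    "decision_velocity": 0.20,
    "action_completion": 0.25,
    "participant_rating": 0.15,
}

def meeting_health_score(components: dict) -> float:
    """Weighted average of the five component scores, rounded to 2 places."""
    return round(sum(components[k] * w for k, w in WEIGHTS.items()), 2)

# The worked example from the article
example = {
    "preparation": 8,
    "time_discipline": 7,
    "decision_velocity": 6,
    "action_completion": 9,
    "participant_rating": 8,
}
print(meeting_health_score(example))  # 7.65
```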

What the Numbers Mean

Over time, we have found that Meeting Health Scores fall into four meaningful ranges:

  • 8.0 to 10.0 — Excellent. Your meetings are a competitive advantage. The team is well-prepared, disciplined with time, decisive, and accountable. Protect this culture fiercely.
  • 6.5 to 7.9 — Good. Solid foundation with specific areas to improve. Most teams that are intentional about meeting quality land here. Look at which component is dragging the average down and focus there.
  • 5.0 to 6.4 — Needs work. Your team probably dreads these meetings, even if nobody says it out loud. There are likely multiple components scoring below 6. Pick the lowest-scoring component and focus on that one thing for four weeks.
  • Below 5.0 — Broken. Before optimizing, ask a harder question: should this meeting exist at all? A meeting scoring below 5.0 may need to be completely redesigned — new format, new cadence, or new attendee list — rather than incrementally improved.

A single Meeting Health Score is useful. A trend line is transformative.

Consider a team that starts measuring and sees this progression:

  • Week 1: 5.2
  • Week 4: 6.1
  • Week 8: 7.3
  • Week 12: 7.8

That team can see their meetings improving. The improvement is not a feeling — it is a number. This creates a positive feedback loop. When the team sees the score rising, they invest more effort in preparation, time management, and follow-through, which pushes the score higher still.

The trend also works as an early warning system. If a team that has been steady at 7.5 suddenly drops to 6.8, that is a signal. Something changed. Maybe a key team member left. Maybe the agenda stopped being circulated in advance. Maybe accountability slipped because the team is distracted by a crisis. Whatever the cause, the score drop makes the invisible visible before it becomes a permanent decline.

Common Patterns and What They Mean

In our work with leadership teams on meeting effectiveness, certain patterns keep appearing:

High preparation, low completion. The team is having great discussions and making decisions — but nobody follows through. This is an accountability problem, not a meeting problem. Focus on building an action item tracking system that carries items forward and makes completion visible.

High completion, low ratings. The meeting is objectively productive — items get done, decisions get made — but the team does not enjoy it. This often means the meeting has become purely operational: status updates and checkbox reviews with no space for strategic discussion, relationship building, or creative problem-solving. Consider reserving the last 10 minutes for a topic that energizes the team.

Scores plateau at 7. This is the most common pattern. Getting from 5 to 7 is relatively straightforward — add an agenda, start on time, track action items. Getting from 7 to 8+ requires addressing harder issues: candor in discussions, willingness to engage in healthy conflict, and the quality of decisions being made, not just the quantity.

Scores drop after someone joins or leaves. Team dynamics shift when the roster changes. A new member may not understand the meeting norms. A departed member may have been the one driving accountability. When this happens, it is worth explicitly resetting meeting expectations rather than assuming the new configuration will find its own rhythm.

How to Start Measuring Today

You do not need software to start. Here is how to calculate your first Meeting Health Score after your next leadership meeting:

  1. After the meeting, have everyone rate it 1 to 10. Text, Slack message, or a show of hands. Takes ten seconds.
  2. Score your preparation. Did you have an agenda? Was it shared in advance? Use the rubric above.
  3. Score your time discipline. Did you start on time? End on time?
  4. Count decisions made versus items deferred. Calculate the ratio.
  5. Next week, check how many of this week's action items were completed. Now you have all five components.
  6. Calculate the weighted average. Write it on a whiteboard. Track it every week.

Within four weeks, you will have a trend line. Within eight weeks, you will know exactly which dimension of your meetings needs attention. Within twelve weeks, your meetings will be measurably better — not because of any single intervention, but because you are paying attention.

Or you can skip the manual work entirely. MeetingTango calculates your Meeting Health Score automatically — tracking agendas, time, decisions, action items, and team ratings without anyone touching a spreadsheet. It gives you the trend line from day one and flags the specific component dragging your score down each week.

Your meetings consume thousands of hours and hundreds of thousands of dollars every year. It is time they had a score.

Start measuring for free at MeetingTango.

