The metric that actually tells you whether your security awareness programme is working.

If you run phishing simulations in your organisation, there is a good chance you are measuring success primarily by click rates. How many people clicked the link? How does that compare to last quarter? Are we trending downwards? These are the numbers that get reported to leadership, celebrated when they fall, and worried about when they rise.

Click rates are not a bad metric. They provide a useful data point about how the workforce responds to simulated threats at a given moment in time. But if click rates are the only metric being measured, something far more important is being missed: reporting culture.

What Reporting Culture Actually Means

Reporting culture is the degree to which people in an organisation feel able, willing, and motivated to report suspicious activity, potential security incidents, and their own mistakes. It is the difference between an employee who clicks a phishing link and immediately tells the security team, and an employee who clicks the same link and says nothing because they are afraid of what will happen.

Both employees clicked. By the click rate metric alone, they look identical. But from a security perspective, they could not be more different. The first employee has given the security team the information they need to investigate, contain, and respond. The second has left a potential compromise undetected, possibly for hours, days, or weeks.

This is why reporting culture matters more than click rates. In a real incident, the speed and completeness of the response depends entirely on whether people tell you what has happened.

Why People Do Not Report

In most organisations, underreporting is not caused by apathy or ignorance. It is caused by fear. People do not report because they are afraid of being blamed, disciplined, or embarrassed. They do not report because previous experience has taught them that raising a concern leads to uncomfortable conversations, public call-outs in team meetings, or mandatory remedial training that feels like a punishment.

Every time an organisation responds to a phishing simulation failure with punitive action, it sends a clear message to the entire workforce: if you make a mistake, you will be punished. And the rational response to that message is not to stop making mistakes. It is to stop admitting to them.

This is the hidden cost of fear-based security awareness programmes. They may drive click rates down in the short term, but they simultaneously drive reporting rates down as well. And unreported incidents are far more dangerous than clicked links.

Building a Culture Where People Speak Up

A healthy reporting culture does not happen by accident. It has to be designed, nurtured, and protected. Here is what that looks like in practice.

Make reporting easy. If reporting a suspicious email requires navigating three menus and filling in a form, people will not do it. Give people a one-click reporting button in their email client and make sure they know it is there. The fewer barriers between noticing something suspicious and reporting it, the more reports an organisation will receive.

Respond with gratitude, not judgement. When someone reports a suspicious email, whether it turns out to be genuine or not, thank them. Publicly recognise reporting behaviour as a positive contribution to the organisation’s security. If someone reports that they clicked a phishing link, thank them for telling you. The fact that they told you is the most important part.

Never punish reporting. This should be non-negotiable. If people believe that reporting a mistake will lead to consequences, they will stop reporting. It does not matter how many times an organisation tells people it is safe to speak up if its actions tell a different story. Consistency between words and behaviour is everything.

Separate reporting from consequences. If an organisation has a consequence management framework for repeated or negligent security failures, that framework should never be triggered by the act of reporting itself. Consequences, where they exist, should be proportionate, fair, and focused on support rather than punishment. They should never discourage the behaviour that matters most: honest, timely reporting.

Measure and celebrate reporting rates. Start tracking how many suspicious emails people report each month. Share the numbers with the organisation. Celebrate improvements. When leadership sees reporting rates going up, help them understand that this is a sign of a healthy security culture, not a sign that more threats are getting through.

The Metrics That Actually Matter

Click rates tell you how the workforce responds to a simulated attack at a single point in time. Reporting rates tell you something far more valuable: whether people trust the security team enough to be honest with them.

An organisation with a 15% click rate and a 5% reporting rate has a serious problem, even though the click rate looks reasonable. An organisation with a 20% click rate and a 60% reporting rate is in a much stronger position, because the security team has visibility of what is happening and can respond accordingly.
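The comparison above is simple arithmetic, but it is worth making concrete. The sketch below is a minimal illustration only; the headcounts and field names are assumptions for the example, and real figures would come from whatever export your simulation platform provides.

```python
# Minimal sketch of the two metrics discussed above, using the
# hypothetical organisations from the text. The numbers here are
# illustrative, not drawn from any real programme.

def click_rate(clicked: int, recipients: int) -> float:
    """Fraction of simulation recipients who clicked the link."""
    return clicked / recipients

def reporting_rate(reported: int, recipients: int) -> float:
    """Fraction of simulation recipients who reported the email."""
    return reported / recipients

# Organisation A: looks fine on clicks alone, but almost nobody reports.
org_a = {"recipients": 1000, "clicked": 150, "reported": 50}
# Organisation B: worse click rate, but the security team has visibility.
org_b = {"recipients": 1000, "clicked": 200, "reported": 600}

for name, org in (("A", org_a), ("B", org_b)):
    cr = click_rate(org["clicked"], org["recipients"])
    rr = reporting_rate(org["reported"], org["recipients"])
    print(f"Org {name}: click rate {cr:.0%}, reporting rate {rr:.0%}")
```

Tracking both numbers side by side, per campaign and per month, is what lets leadership see that a rising reporting rate is a sign of health rather than of more threats getting through.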

The goal of a security awareness programme should not be to eliminate human error. That is not a realistic goal. The goal should be to create an environment where, when mistakes happen, they are caught quickly, reported honestly, and resolved effectively. That is what resilience looks like. And it starts with building a culture where people feel safe to speak up.

A Final Thought

The next time you review your phishing simulation results, look beyond the click rate. Ask how many people reported the simulation. Ask how quickly they reported it. Ask whether the trend is going up or down. And if you do not have that data, make getting it your next priority.

Because the organisations that will weather a real cyber incident are not the ones with the lowest click rates. They are the ones where every employee knows that when something goes wrong, the right thing to do is pick up the phone, hit the report button, and tell someone. No fear. No blame. Just action.

Ready to build genuine cyber resilience through your people? At Unity Group Solutions, we design and deliver empowerment-led security awareness programmes that create lasting behavioural change across your organisation. Contact us today via hello@unitysolutions.org.uk.