Background & Challenge:
An international medical society had measured member satisfaction since 1999, and over that period satisfaction had risen significantly, from 58% in 1999 to 76% in 2013. However, while most members were seemingly “happy” (retention rates remained around 98%), the organization did not understand how engaged its membership truly was.
Figure 1: Member Satisfaction does not always translate into Member Engagement
- Annual meeting attendance had remained flat relative to membership growth, even as pressure mounted to increase non-dues revenue.
- Grassroots advocacy efforts were almost non-existent, carried out by only a small cadre of dedicated members.
- The average member age had reached 52+, with a majority of members in the Baby Boomer and Traditionalist/GI generational cohorts. With most expected to reach retirement age within the next 10 years, the organization had struggled to backfill not only its membership but also its leadership with younger and newer members.
- Satisfied members behaved no differently from dissatisfied members, so resources invested in improving satisfaction were misdirected and carried an ill-defined “return on investment.”
It was clear that a new, more robust metric was needed to serve as a true guide for strategies to grow the Association and retain its vibrancy. In early discussions with The Loyalty Research Center (LRC), the senior leadership team of the organization was wary of making such a major shift in research and strategy, as it would mean abandoning all of the historical data on the membership. The struggle was in balancing that loss of data – and the significant investment made in it over the years – with capturing more powerful and meaningful data. The question was – is data worth “hoarding” and “propagating” if it’s not usable, not actionable, and not meaningful?
Adopting a New Metric:
After years of scrambling to react to member needs with little to no impact on the member relationship, the organization decided it was time for a change. First, they took it upon themselves to create a composite engagement score to baseline and track member engagement over time.
Through brainstorming and trial and error, they identified five measures that they felt represented meaningful engagement “goals”:
- Volunteerism in the organization
- Membership tenure
- Annual meeting attendance (over the most recent five-year period)
- Advocacy effort participation
- Non-dues revenue
The next step was to assign an individual score to each member. The entire process took nearly a year to complete, but it began to give a better and clearer understanding of the differences in levels of member engagement.
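A composite score of this kind can be sketched as a weighted sum of normalized behavioral measures across the five engagement goals listed above. The weights, scales, and caps below are illustrative assumptions, not the organization’s actual formula:

```python
# Illustrative sketch of a composite engagement score built from the five
# behavioral measures described above. All field names, weights, and caps
# are hypothetical assumptions, not the organization's actual formula.

def composite_engagement_score(member, weights=None):
    """Return a 0-100 engagement score from normalized behavioral measures."""
    if weights is None:
        # Equal weighting across the five measures (an assumption).
        weights = {
            "volunteer_roles": 0.2,
            "tenure_years": 0.2,
            "meetings_attended_5yr": 0.2,
            "advocacy_actions": 0.2,
            "non_dues_spend": 0.2,
        }
    # Hypothetical caps used to normalize each raw measure onto a 0-1 scale.
    caps = {
        "volunteer_roles": 3,        # committee seats, task forces, etc.
        "tenure_years": 25,
        "meetings_attended_5yr": 5,  # at most one annual meeting per year
        "advocacy_actions": 10,
        "non_dues_spend": 2000,      # dollars spent on courses, journals, etc.
    }
    score = 0.0
    for measure, weight in weights.items():
        normalized = min(member[measure] / caps[measure], 1.0)
        score += weight * normalized
    return round(score * 100, 1)

member = {
    "volunteer_roles": 1,
    "tenure_years": 10,
    "meetings_attended_5yr": 3,
    "advocacy_actions": 0,
    "non_dues_spend": 500,
}
print(composite_engagement_score(member))  # a single 0-100 score per member
```

In practice the choice of weights and caps is itself an exercise in trial and error, which is consistent with the year-long process the organization went through.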
The organization then partnered with LRC to transition its “voice of the member” research away from satisfaction and towards engagement. This would allow it to marry attitudinal measures captured through a survey instrument with the composite engagement score (which is more of a behavioral measure), creating a 360-degree view of member engagement.
The survey explored the aspects of the membership that have a significant impact on the value of membership and member engagement – areas that, if improved, would have a high probability of strengthening engagement.
Attitudinally, while member satisfaction was significantly stronger than what LRC typically finds even in “best-in-class” associations, it was weakly correlated with engagement. That is, some members who were very satisfied with membership were not at all engaged in the organization, and some who were dissatisfied were nonetheless deeply engaged and cared about the future of the organization.
Behaviorally, volunteerism, advocacy, and non-dues revenue were moderately correlated with the LRC engagement measure. Tenure and meeting attendance were weakly correlated: strongly engaged members had tenures similar to those of weakly engaged members, and attended the annual meeting with the same frequency and regularity.
When combined, however, the composite engagement score was strongly correlated with LRC’s engagement metric, indicating that the sum of measures was better than the individual parts.
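The effect described here – individual behaviors correlating weakly with the survey measure while their combination correlates strongly – can be illustrated with a small Pearson-correlation sketch. All of the data below are made up for illustration and deliberately constructed to show the pattern:

```python
# Illustrative, made-up data showing how a composite of behavioral measures
# can track an attitudinal (survey-based) engagement measure more closely
# than any single behavior does. Numbers are constructed for illustration.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

tenure   = [2, 8, 3, 9, 1, 7, 4, 6]       # years of membership (hypothetical)
advocacy = [7, 2, 9, 3, 6, 5, 8, 1]       # advocacy actions taken (hypothetical)
survey   = [9, 11, 12, 11, 8, 12, 12, 7]  # survey engagement score (hypothetical)

# A simple composite: the sum of the two behavioral measures per member.
composite = [t + a for t, a in zip(tenure, advocacy)]

print(f"tenure vs survey:    {pearson(tenure, survey):.2f}")     # weak
print(f"advocacy vs survey:  {pearson(advocacy, survey):.2f}")   # weak
print(f"composite vs survey: {pearson(composite, survey):.2f}")  # strong
```

Each behavior alone captures only part of the engagement picture, so its correlation with the survey measure is modest; summing them recovers the shared signal, which mirrors the finding that the sum of measures outperformed the individual parts.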
Now, as improvements are made in the areas identified by the member research as critical member needs, the organization can track how member engagement is changing through its own composite score. There is now a clearly defined “return on investment” to applying staff time and financial resources towards member engagement.
Finally, the organization is smart in recognizing that the engagement goals that factor into the composite engagement score are likely to evolve over time. This means the score must be dynamic rather than static, and an ongoing “voice of the member” study would refresh the correlations between the score and the attitudes of the membership.
The longer an organization collects specific measures on its customers, members or other constituents, the easier it becomes to accept status quo.
Just as an organization adjusts its strategic plan at various intervals, it should also be examining the performance metrics to which it is accountable. As this association discovered, it may very well bring a much-needed, refreshing perspective to the strategic planning process.