Published in Growth

Four common data pitfalls to avoid when marketing social services.

The social services sector is significant and sprawling.

So much so that it can even be difficult to succinctly define. Spanning education, healthcare and disability support services, social security, and housing—in addition to a range of other specialty services that are often unique to providers—the social services sector is variously referred to as the ‘social purpose market’, the ‘community services industry’, the ‘human services industry’, and a range of other names.

However you refer to it, it is substantial and growing. The market size, measured by revenue, was $72.3 billion in 2022. This represents an increase of 4.1% on the previous year, with an average of 4.0% year-on-year growth between 2017 and 2022.

Beyond this sustained growth, many industries within this broader sector have been—or are being—redesigned to provide consumers and service seekers with more choice, driving increased competition between providers.

With sustained sector growth and increasingly empowered customers, effective marketing has never been more important in social services.

The role of data in forecasting and assessing your activity is equally crucial. In my experience, most marketers know this. Still, far too many run into common pitfalls.

Here are four of the most common ones to watch out for.

1. Being distracted by things that don’t actually matter.

It’s very easy to fixate on what are often referred to as ‘vanity metrics’: numbers that are easy to measure and understand, but that offer no tangible value to your organisation and don’t map closely to your objectives.

The most common one I see in working with marketers in social service organisations is an ongoing emphasis on site traffic.

An increase in site traffic is not automatically good. A decrease in traffic is not automatically bad.

Changes in site traffic are not a meaningful data point in isolation. Rather, they’re a means of understanding what’s occurring at one end of your marketing conversion funnel.

‘Traffic to this service is up (or, equally, down)’ is not a helpful statement without any meaningful context. This brings me to the second pitfall.

2. Jumping into data and reports without a clearly articulated question in mind.

Whether you conduct analysis yourself or have others analyse marketing performance data on your behalf, it’s highly unlikely you’ll simply stumble onto valuable insights.

The situation is only made worse by the design of most modern analytics software platforms. Every time you log in, you’re immediately presented with hundreds of options for visualising, segmenting, and comparing different data sets.

If you don’t have a good question to ground and contextualise the information you’re looking for, you’ll get lost immediately.

It’s not enough for your analysis team to provide mechanical updates on the state of various metrics. You need insights: a viewpoint on performance and what the data is actually saying.

By framing the data you need with a compelling question, you’re infinitely more likely to get information you can use to inform meaningful activity.

Far too often, marketing teams will assess their performance in the context of a broad or open-ended question. ‘How are our digital marketing channels performing this month?’

A more specific overarching question—and one which is likely to drive insights that can help you better connect with people seeking your services—is something like: ‘Which services pages on our website have seen a positive shift in organic search traffic from the same period last year?’

With this information you can:

  • Replicate any marketing activity which may have contributed to the improvement and apply the same effort to new or underperforming services, or
  • Experiment with new marketing initiatives to promote further growth on those services which have not seen any organic improvement.
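As a rough sketch of what answering that question might look like in practice—assuming you’ve exported organic search sessions per service page from your analytics tool, and using entirely made-up page names and numbers—you could flag the pages with a positive year-on-year shift like this:

```python
# Hypothetical export: organic search sessions per service page,
# for the same period this year vs last year.
organic_sessions = {
    "/services/disability-support": {"last_year": 820, "this_year": 1040},
    "/services/housing": {"last_year": 1500, "this_year": 1310},
    "/services/aged-care": {"last_year": 600, "this_year": 655},
}

def yoy_change(last_year: int, this_year: int) -> float:
    """Year-on-year change as a fraction, e.g. 0.10 means +10%."""
    return (this_year - last_year) / last_year

# Pages with a positive shift: candidates for replicating what worked.
improved = {
    page: round(yoy_change(d["last_year"], d["this_year"]), 3)
    for page, d in organic_sessions.items()
    if d["this_year"] > d["last_year"]
}

for page, change in sorted(improved.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {change:+.1%} organic sessions year-on-year")
```

The pages that don’t appear in the output are just as informative: they’re the ones where you’d experiment with new marketing initiatives.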

3. Setting campaign goals without setting learning goals.

The most effective marketing teams design their activity so that learning iteratively informs what they do. They learn through doing, and they use what they learn to pivot and improve—ideally within the same campaign or marketing initiative.

So often, a team will only pause to ask, ‘what did we learn from this?’, once a marketing initiative is complete. In this approach, learning is coincidental rather than intentional.

A superior approach is to set off with a theory at the outset—or a series of variables to understand in greater detail—and run marketing with a test-and-learn mindset.

Control the variables and know what you want to learn. For example:

  • Don’t: run a series of ads promoting a support service and then ask, ‘which ad performed best, and why do we think that is?’
  • Do: run a series of different ads, each with a different image. Better yet, develop a theory behind each: perhaps smiling faces will draw more clicks than serious expressions, or perhaps images featuring people from diverse cultural and linguistic backgrounds will perform better than images containing a single person.

This way, you can qualify and assess the impact of different activity in real time and realise the value of learning as you go, rather than waiting until the campaign is complete.
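To make that test-and-learn check concrete, here’s a minimal sketch—with made-up click numbers, and a standard two-proportion z-test rather than anything specific to an ad platform—of how you might judge whether one ad image genuinely out-performed another, or whether the gap could just be noise:

```python
import math

def two_proportion_z(clicks_a, views_a, clicks_b, views_b):
    """z-statistic for the difference between two click-through rates."""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled click-through rate under the assumption of no real difference.
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    return (p_a - p_b) / se

# Made-up results: ad A (smiling faces) vs ad B (serious expressions).
z = two_proportion_z(clicks_a=120, views_a=4000, clicks_b=85, views_b=4000)

# |z| > 1.96 roughly corresponds to p < 0.05 (two-tailed).
verdict = "likely a real difference" if abs(z) > 1.96 else "could just be noise"
print(f"z = {z:.2f} -> {verdict}")
```

The exact test matters less than the habit: deciding in advance what would count as a meaningful difference, so you’re not reading patterns into randomness after the fact.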

4. Believing that every single decision needs to be ‘data-driven’.

As marketers, we commonly hear about the importance of being data-driven in decision making.

Many people hold the idea of objectivity in decision-making as an ideal. I have a different viewpoint; data doesn’t make decisions. Or at least, it doesn’t have to.

As a senior marketer, you do. Particularly in social services organisations, where a quick, well-executed marketing initiative can help you to capitalise on, or better support:

  • Rapidly changing needs in the community.
  • A timely event or circumstance.
  • Legislation or policy changes.

Most of the time, you’ve likely already made a decision with your gut or your heart—or at least have a preferred pathway—before you contend with any of the data. This is because decisions are most typically made on an intuitive and emotional level. That’s human nature. Any data used to inform the decision is processed at a subconscious level.

My theory here is that data is often, in reality, applied retrospectively. It acts as a useful validation on the pathway you’ve chosen. Or, perhaps even more importantly, as a control that stops you doing something inefficient, ineffective, or… ill-advised.

Rather than think of data as a necessary driver for all decisions—and potentially one that can create a blocker for your ability to make progress, learn, or test different theories—think of data like traffic lights on a street.

Like traffic lights, data should be used to tell us to move forward, stay where we are, or proceed with caution.

Keen to do more with your data and better promote your social services?

This barely scratches the surface, but it’s a helpful start. There’s so much more that goes into using data effectively: from configuration and tooling to processes and cultural practices.

Without these things in place, you’re leaving value on the table and operating inefficiently as a marketing team.

If you’re keen to start working more effectively towards your marketing objectives—and get the very best out of your team—drop me a line. I’m confident we can help you avoid these pitfalls (along with many more) and get you firing on all cylinders in no time.