Promotion Evaluation

This section covers:

  • Assessing whether your promotional efforts are having an effect
  • Determining which actions or collateral are worth continuing, and which need rethinking

Many libraries run promotional campaigns without any real evidence of whether they are achieving their aims, assuming that any promotion must have some impact. To get full value from your efforts, evaluation is required. Even a modest review can provide information, rather than assumptions, on how best to develop your approach and collateral further. And collateral or initiatives that prove to have little impact can be dropped with justification rather than on mere suspicion.

If you are using usage statistics to measure objectives, they should be assessed in the context of other constraining factors. If there are known access constraints, for example, then uptake will be limited no matter how strong your promotion is. Be sure you have clear baseline figures from which to assess any change over a period. Also consider whether the eResource statistics you currently collect accurately reflect successful usage of the eResource. A service may have had a reasonable number of “hits” last month, but was it of any use? Does the eResource report full-text downloads? If so, those are a better measure of successful use than raw hits; successfully completed searches are another useful signal.
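
As a minimal sketch of the baseline comparison described above (written in Python; the month labels, figures and the choice of full-text downloads as the measure are all illustrative, not drawn from any particular platform):

    # Compare average eResource usage during a campaign against a
    # pre-campaign baseline. All figures are hypothetical; substitute
    # your own platform's statistics (e.g. monthly full-text downloads).
    baseline = {"Jan": 410, "Feb": 395, "Mar": 402}   # pre-campaign months
    campaign = {"Apr": 455, "May": 470, "Jun": 480}   # campaign months

    baseline_avg = sum(baseline.values()) / len(baseline)
    campaign_avg = sum(campaign.values()) / len(campaign)
    change_pct = (campaign_avg - baseline_avg) / baseline_avg * 100

    print(f"Baseline average: {baseline_avg:.0f} downloads/month")
    print(f"Campaign average: {campaign_avg:.0f} downloads/month")
    print(f"Change: {change_pct:+.1f}%")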

Action:

If you have drawn up a marketing strategy and targeted segments as discussed earlier, then it will be straightforward to determine concrete objectives (e.g. increase use by existing users, increase visits to eResources web pages, increase requests for eResources articles, or increase instruction in using eResources). All such objectives are directly measurable and can be used to assess the impact of your promotion; a worked sketch follows the list below. These objectives are the KPIs (Key Performance Indicators) of your marketing plan. But are they SMART objectives?

SMART objectives are:

  • Specific, clear and well defined
  • Measurable
  • Achievable and action-orientated
  • Realistic, given the resources at hand
  • Timely, with a realistic time-frame for reaching the objective
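
To make the KPI idea concrete, here is a minimal Python sketch of recording one SMART objective and checking it against a measured figure (the objective, dates and numbers are hypothetical examples, not recommendations):

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class SmartObjective:
        description: str   # Specific: what exactly will change
        baseline: int      # Measurable: the starting figure
        target: int        # Measurable: the figure to reach
        deadline: date     # Timely: when it should be reached

        def met(self, measured: int, on: date) -> bool:
            # Achievable and Realistic are judged when the target is set;
            # this check covers only the measurable, time-bound part.
            return measured >= self.target and on <= self.deadline

    # Hypothetical objective: lift monthly full-text downloads by 10%.
    objective = SmartObjective(
        description="Increase monthly full-text downloads from 400 to 440",
        baseline=400,
        target=440,
        deadline=date(2025, 6, 30),
    )
    print(objective.met(measured=455, on=date(2025, 6, 1)))  # True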

Other, more qualitative measures can be derived from:

Focus groups of staff and users - ask small groups whether they have seen the messages. What did they take from them? Did the messages encourage them to take any action? What would they prefer to see?

Critical incident reporting - encourage staff to forward any reports of user comments on awareness of eResources or the promotional material, along with what the users took them to mean. While not statistically valid, such feedback can indicate weaknesses in the medium, vehicle or message.

v1.2