
6 Essential Features of a Program Audit Survey

Ardent Learning
8/23/2021

Your L&D program has been designed, developed, launched, and rolled out to your teams. Great! Learners are absorbing new information, sales teams are meeting their goals, and senior leaders are thrilled with the training on offer. Everyone is happy. Life is good.

But could it be better? Even the best L&D programs can be improved. Whether your program content needs an update to close learning gaps or you want to incorporate new technology to create a more engaging experience, a program audit will help you gather the feedback you need to improve.

Program audits start with defining key performance indicators and other internal measures of success. The next step is writing a survey with a variety of questions to gauge how well your managed learning program is meeting those KPIs and to gather learners' feedback on the content, including what they believe could be improved.

For a survey to elicit the most accurate responses, it needs several key features. Here are six must-have features to include in your program audit survey.

Clear, Concise Language

Writing your survey is tricky. You need to ensure your text is jargon-free, appropriate for your entire audience of learners, and unbiased. Use familiar vocabulary and a comfortable, friendly tone. We recommend staying away from absolute phrases – never, always, all – because participants may hesitate to commit to an all-or-nothing answer. Keep every question straightforward and use short, concise sentences to minimize confusion or miscommunication.

Open-Ended Questions

Open-ended questions are those without a standard answer. Typically, open-ended questions provide a text box or area where your learners can add their own statements. Use open-ended questions to uncover opportunities for personalized solutions and to gain insight into which blend of learning methods works best for your learners.

Example: What methods of learning help you retain information?

Closed-Ended Questions

Closed-ended questions are those with a short, specific answer. These questions can appear as a drop-down menu, radio buttons, or multiple-select options. Closed-ended questions collect precise details, limiting participants to a set of predetermined answers so you can tabulate the data and report on trends.

Example: Which office location do you report to? 

Scaled Questions

Scaled questions ask participants to place themselves along a range, typically a numbered scale measuring agreement, frequency, importance, likelihood, or quality. Because scaled questions provide quantitative measurements of learner sentiment, they're important to include.

Example: On a scale from one to five, respond to the following question...

Questions That Align

Each question on your program audit survey should align with the KPIs you established. While it may be tempting to add unrelated questions to expand your results, doing so increases your learners' time commitment, which leads to frustration and unfinished surveys.

Time Commitment

If you’ve ever started a survey that goes on for pages and pages with no idea how long it will take, you understand why adding an estimated completion time to the start of your survey is essential for keeping participants engaged. Listing an estimated time lets participants choose a moment when they can give you their full attention and their most accurate answers, improving your overall survey data.

With these six features, you’ll be well on your way to creating a program audit survey that makes your managed learning program the best it can be.

Need more help? Download our white paper, Seizing Multiple Opportunities with a Program Audit – a step-by-step guide to developing and launching a program audit. 
