Develop your conclusions

This step will help you consider your evidence and draw conclusions. Gathering evidence isn’t enough on its own: you need to consider the evidence as a whole. This will help you identify key patterns, surprises or details that are important. This information will form your ‘findings’, or conclusions, and answer your evaluation question(s).

Following the process we’ve outlined below will help you avoid bias. If your conclusions look biased, others probably won’t find them credible.

By the end of this step you will have:

  • Located all your evidence in one place.
  • Brought what you’ve learned together to form sound conclusions.
  • Learned to avoid jumping to unfounded conclusions.

Locate all your evidence in one place

This is crunch time, where you get to step back, pull your evidence together and decide what it means.

First, check you have all the evidence you need. Then return to your evaluation framework and check your measures. For each measure, try to answer the following questions:

  • What evidence have you gathered?
  • What does the evidence tell you about the measure? What has happened, and what does that say about the outcomes you are trying to measure?
  • What are the limitations of the evidence you have gathered? Do you have enough of it to draw a clear conclusion? It’s better to be honest and offer a cautious conclusion if you are not sure your evidence is strong enough.

Use the Template: Bring your evidence together to organise your evidence by measure, and to make notes about the questions above.

Bring your evidence together to form conclusions about the initiative

After you think about each measure, try to identify patterns or relationships across the evidence that can tell a story about your initiative. Some useful questions to ask yourself are:

  • Does the evidence you’ve collected tell you if the initiative is doing what it was set up to do?
    • What information do you have that supports your answer?
    • Is your information limited in what it can say?
  • Are there any other patterns or relationships across the evidence that are useful to know? For example, an initiative might fulfil its objective of engaging students through work experience, but the evidence might tell you these students would probably have engaged without the initiative.
  • Was there anything that really surprised you? For example, any unintended benefits or consequences.

Don’t jump to conclusions

It’s rare to get a full picture of how an initiative has affected everyone it was intended to reach. For example, not all teachers who take part in a professional learning program will complete a survey. This has consequences for how you talk about your conclusions and the decisions you make based on them.

To be cautious and transparent, there are some things you need to think about when developing your conclusions:

  • Assumptions: Keep track of any assumptions that underpin your conclusions, as you’ll need to be transparent about them when you share your findings with others.
  • Confirmation bias: As well as the other biases listed on the Gather the evidence you need page, think about confirmation bias. You might form a view about something at the start of, or before, an initiative. It can be tempting to look for evidence that confirms this initial view. Instead, try to test your views impartially, seeking evidence that could disprove them as well as support them. Be especially careful if you were involved in the initiative: without realising it, you might seek to say positive things about the initiative to justify the time and effort spent on it. A good way to manage this risk is to think about the independence of the evaluation upfront. See the Pick who needs to be involved and how page for further information.
  • Causality: One of the biggest challenges in STEM education evaluations is understanding what has changed as a result of the STEM initiative in question, rather than for other reasons. Education initiatives don’t take place in a laboratory where you can change one thing at a time, so you may never know for sure which factors influenced change. You can manage this through your language: say results ‘suggest’ or ‘indicate’ change, and avoid terms such as ‘prove’. You might also feel more confident about cause and effect if people say the initiative made a change and another source of hard evidence confirms this (such as test scores or attendance rates). For example, if students and teachers both said an initiative made a difference to how they approached problem-solving, and their maths test scores went up, it would be more reasonable to suggest cause and effect than if the only information you had was test scores.