Program Evaluation

When your kids write the Diploma or Achievement Tests, the Department sends out a printout of how your class did compared to everybody else in the province

Three types of report:

  1. ASSESSMENT HIGHLIGHTS (pamphlet)

    • how are kids doing today in terms of meeting the standards?
    • how are they doing compared to four years ago? eight years ago?
      (monitor over time)

  2. PROVINCIAL REPORT

    • format keeps changing --> some years all tests in one book to save on paper and mailing costs; other years each exam gets its own report

    • tons of technical information (gender breakdowns, etc.)

  3. JURISDICTION & SCHOOL REPORTS

      (up to the superintendent what happens to these after that
      --> can publish them in the newspaper, keep them at central office only, etc.)

    • get your hands on these reports and interpret them
    • either you do it or someone else will do it for/to you
    • better that teachers take responsibility than have it imposed top-down
    • the new table formats are so easy to interpret there's no reason not to

    • this means you can compare your students' responses to the responses of 30,000 students across the province

      • will help you calibrate your expectations for this class

      • is your particular class a high one or a low one?

      • have you set your standards too high or too low?
      • giving everyone C's because you think they ought to do better than this, but they all ace the provincial tests?

    [Who knows where this is?] OVERHEAD: SCHOOL TABLE 2 (June 92 Grade 9 Math Achievement)

    • check table 2 for meeting standard of excellence
    • Standards set by elaborate committee structure

    This example (overhead): Your class had 17 students

    Total test out of 49 (means a test of 50, but one item was dropped after item analysis)

      standard-setting procedures decided that 42/49 is the standard of EXCELLENCE for grade 9s in Alberta;
      next column shows they expect 15% to reach this standard

      standard-setting procedures decided that 23 out of 49 is the Acceptable
      standard; next column says they expect 85% to reach that standard

      columns at end of table show that actually, only 8.9% made standard of excellence, and only 67.4% made acceptable standard

      (bad news!)

      but looking at YOUR class, 5.9% (almost 6%) made the standard of excellence (so fewer than the province as a whole), but on the other hand, 76.5% met the acceptable standard.

    Need the comparison -- otherwise, the fact that you only got 6% to excellence might sound bad...

      Interpretation: either you only have one excellent math student in your class, or you are teaching to the acceptable standard but not encouraging excellence? (the sketch below works through the class arithmetic)
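
    A minimal sketch of that arithmetic (Python; the 42 and 23 cut scores
    are the ones on the overhead, but the class scores and the helper name
    are invented, chosen only to reproduce the 5.9% and 76.5% figures for
    a class of 17):

      EXCELLENCE_CUT = 42   # out of 49, set by the standard-setting committee
      ACCEPTABLE_CUT = 23   # out of 49

      def pct_meeting(scores, cut):
          """Percent of students scoring at or above the cut."""
          return 100 * sum(s >= cut for s in scores) / len(scores)

      # Hypothetical class of 17: 1 student at/above 42, 13 at/above 23.
      class_scores = [45] + [30] * 12 + [15] * 4
      print(round(pct_meeting(class_scores, EXCELLENCE_CUT), 1))  # 5.9
      print(round(pct_meeting(class_scores, ACCEPTABLE_CUT), 1))  # 76.5

    Note the class size: with 17 students, one student is already 5.9%,
    so a small class will always swing more than the provincial figures.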

    BUT can use the tables to look deeper:

    • use tables to identify strengths and weaknesses in student learning

    • and therefore identify your own strengths and weaknesses

    Problem solving & knowledge/skills broken down --> by table-of-specifications topics

    Interestingly, though, your class is above the provincial percentage on problem solving at the excellence level...

    ASK: how do you explain the % meeting the standard on knowledge and the % meeting it on problem solving both being higher than the % meeting the standard on the whole test?

    Answer: low correlation between performance on the two types of questions
    (i.e., those who met standard on the one often did not on the other)

      which means (a) you can't assume that easy/hard maps onto Bloom's taxonomy

      and (b) that you have to give students both kinds of questions on your test or you are being unfair to the group that is better at the other kind (see the sketch below)
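
    A minimal sketch of why that happens (Python 3.10+ for
    statistics.correlation; the sub-scores and cuts are invented for
    illustration, not taken from any report):

      from statistics import correlation

      # Invented (knowledge, problem-solving) pairs for 10 students,
      # each part out of 10 with a cut of 6; whole-test cut of 12.
      students = [(6, 2)] * 3 + [(2, 6)] * 3 + [(7, 7)] * 4

      met_know  = sum(k >= 6 for k, p in students)       # 7 of 10 -> 70%
      met_prob  = sum(p >= 6 for k, p in students)       # 7 of 10 -> 70%
      met_total = sum(k + p >= 12 for k, p in students)  # 4 of 10 -> 40%

      r = correlation([k for k, _ in students], [p for _, p in students])
      print(met_know, met_prob, met_total, round(r, 2))  # 7 7 4 -0.05

    The students just above the cut on one part but far below it on the
    other meet that part's standard yet miss the whole-test standard;
    that pattern only shows up when the two parts correlate weakly.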

    [Don't know where this is] OVERHEAD: SCHOOL TABLE 5.1 (Grade 9 Math, June 92)

    • check tables 5.1 to 5.6 for particular areas of strengths and weaknesses

    • look for every question where the percentage of students in your school choosing the keyed answer differed by 5% or more from the provincial percentage (a sketch of the rule follows this list)

      • if 5% or more higher, it's a particular strength of your program
      • if 5% or more lower, it's a particular weakness
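
    A minimal sketch of that 5-percentage-point flag rule (Python; the
    item numbers and percentages are invented, the real ones come from
    Tables 5.1 to 5.6):

      # % choosing the keyed answer, by item: your school vs. the province.
      school_pct = {1: 72.0, 2: 55.0, 3: 81.0, 4: 40.0}
      prov_pct   = {1: 65.0, 2: 58.0, 3: 80.0, 4: 49.0}

      for item in school_pct:
          diff = school_pct[item] - prov_pct[item]
          if diff >= 5:
              print(f"item {item}: +{diff:.1f} pts -> particular strength")
          elif diff <= -5:
              print(f"item {item}: {diff:.1f} pts -> particular weakness")

    Items within 5 points in either direction stay unflagged.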