2024-2025 AY Instructions & Academic Program Assessment Report Template

Download the template. The template includes full instructions at the beginning. Please delete the instruction pages from the final report.

Academic Program Assessment Report 

Annual (undergraduate) and Biennial (Graduate) Assessment Reports are to be submitted by October 15th every year.  If more time is needed, please contact Assistant Provost Deb Blanchard for support (deborahblanchard@montana.edu).

Academic Year Assessed:
College:
Department:
Submitted by:

 

Program(s) Assessed:       
List all majors (including each option), minors, and certificates that are included in this assessment:

 

 


The Assessment Report should contain the following elements, which are outlined in the template. Detailed instructions can be found in the downloadable template. 

Additional instructions and information should be deleted from final reports.

  1. Past Assessment Summary.
  2. Institutional Learning Outcome Assessment Data Request.    *NOTE: NEW ITEM FOR 24-25AY*
  3. Actionable Research Question for Your Assessment.
  4. Assessment Plan, Schedule, and Data Source(s).
  5. What Was Done.
  6. What Was Learned.
  7. How We Responded.
  8. Closing The Loop(s).

1.    Past Assessment Summary.

Briefly summarize the findings from the last assessment report related to the PLOs being assessed this year. Include any findings that influenced this cycle's assessment approach. Alternatively, reflect on the program assessment conducted last year and explain how it informed any changes made to this cycle's assessment plan.

2.    Institutional Learning Outcome Assessment Data Request.

MSU values the skills that students acquire as graduates of this university. The Core Quality Learning Outcomes state that MSU Graduates will be Effective Communicators, Thinkers & Problem Solvers, and Local & Global Citizens. The word "Graduates" implies that all programs address these learning outcomes at some point in their curriculum, not just in Core-designated courses. Please review the definitions of the MSU Core Qualities. The Core Curriculum Committee uses a set of Core Assessment Rubrics to assess the Core program at the "Beginning" to "Developing" levels. Please review these rubrics, paying attention to the "Developing" and "Proficient" criteria, and include the following chart in your report this year. (Note: There are no right or wrong answers; this data is requested for use in future institutional assessment endeavors.)

 

Core Quality LO / ILO (mark X if the program has at least one PLO that overlaps with this ILO) | Beginning Level (e.g., Core courses: US, W, Q, IN, CS, IA, IH, IS, D) | Developing Level (e.g., list one 200- or 300-level course) | Proficient Level (e.g., list one 300- or 400-level course) | N/A (no course exists in our program that addresses this Core Quality/ILO)
Thinkers & Problem Solvers |  |  |  |
Effective Communicators |  |  |  |
Local & Global Citizens |  |  |  |

Note: Core classes are designed to address the introductory, foundational level of the Core Qualities. Some may overlap with the Developing level, but most Developing- through Proficient/Mastery-level courses will exist within the major programs.

 

3.    Actionable Research Question for Your Assessment.

            What question are you seeking to answer in this cycle’s assessment?

Note: Research questions should focus on an area you need to know the answer to, be tied to program goals, and be measurable. Focus on: What will we be able to improve if we answer this question?

Example of a macro-level question: Can students apply problem-solving steps? Or: How well do students apply problem-solving steps?

Example of a micro-level question: How have student outcomes on applying problem-solving steps changed over the last three years?

 

4.    Assessment Plan, Schedule, and Data Source(s).

a)  Did you change the previously established Assessment Plan Schedule? If yes, how was it changed?

 

b)  Please provide a multi-year assessment schedule that will show when all program learning outcomes (PLOs) will be assessed, and by what criteria (data). List your PLOs in full for reference.  Add rows as necessary.

 

Note: This schedule can be adjusted as needed. Attempt to assess all PLOs every three years. 

ASSESSMENT PLANNING SCHEDULE CHART
PLO# | PROGRAM LEARNING OUTCOME | 2023-2024 | 2024-2025 | 2025-2026 | 2026-2027 | DATA SOURCE(S)

 

c)  What are the threshold values for which your program demonstrates student achievement? Provide a rationale for your threshold values.

 

Note: The example provided in the table should be deleted before submission. Add rows as needed. (A brief illustrative calculation of how a threshold value might be checked follows the table.)

Threshold Values

PROGRAM LEARNING OUTCOME | THRESHOLD VALUE | DATA SOURCE
Example: 6) Communicate in written form about fundamental and modern microbiological concepts | 75% of assessed students score above 2 on a 1-4 scoring rubric | Randomly selected student essays
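If a program wants to verify this kind of threshold quantitatively, the minimal sketch below shows one way to do so. It is written in Python with made-up rubric scores; the 75%-above-2 threshold mirrors the example row, and nothing about the data or the calculation is prescribed by this template.

    # Illustrative only: hypothetical rubric scores (1-4 scale) for the assessed students.
    scores = [4, 3, 2, 3, 4, 2, 1, 3, 4, 3, 2, 4]

    # Threshold from the example row: 75% of assessed students score above 2.
    threshold_pct = 75.0

    above = sum(1 for s in scores if s > 2)   # number of students scoring above 2
    pct_above = 100.0 * above / len(scores)   # percentage of the assessed sample

    print(f"{above} of {len(scores)} students ({pct_above:.1f}%) scored above 2")
    print("Threshold met" if pct_above >= threshold_pct else "Threshold not met")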


Examples of direct evidence of student learning: specifically designed exam questions, written work, performances, presentations, projects (using a program-specific rubric – not a course grading rubric); scores and pass rates on licensure exams that assess key learning goals; observations of student skill or behavior; summaries from classroom response systems; student reflections.

Indirect evidence of student learning includes: course grades, grade distributions, assignment grades, retention and graduation rates, alumni perceptions, and questions on end-of-course evaluation forms related to the course rather than the instructor. These may help identify areas of learning that need more direct assessment but should NOT be used as primary sources of direct evidence of student learning.

5.  What Was Done.

a)    Self-reporting Metric (required answer): Was the completed assessment consistent with the program’s assessment plan? If not, please explain the adjustments that were made.

                                     Yes      ☐                               No        ☐

b)    How were data collected and analyzed, and by whom? Please include the method of collection and sample size.

c)    Please provide a rubric that demonstrates how your data were evaluated.

Note: Rubrics are program-specific, NOT course grading rubrics. The example provided below should be deleted before submission. Your rubric may look very different from this example; it just needs to explain the criteria used to evaluate the student artifacts as they relate to the PLOs being assessed.

Indicators | Beginning - 1 | Developing - 2 | Competent - 3 | Accomplished - 4
Analysis of Information, Ideas, or Concepts | Identifies problem types | Focuses on difficult problems with persistence | Understands complexity of a problem | Provides logical interpretations of data
Application of Information, Ideas, or Concepts | Uses standard solution methods | Provides a logical interpretation of the data | Employs creativity in search of a solution | Achieves clear, unambiguous conclusions from the data
Synthesis | Identifies intermediate steps required that connect previous material | Recognizes and values alternative problem-solving methods | Connects ideas or develops solutions in a clear, coherent order | Develops multiple solutions, positions, or perspectives
Evaluation | Checks solutions against the issue | Identifies what the final solution should determine | Recognizes hidden assumptions and implied premises | Evaluates premises, relevance to a conclusion, and adequacy of support for the conclusion

 

Rubrics can be created to assess learning at any course level, with evaluation scores and threshold percentages adjusted accordingly. Some rubrics are designed for use across multiple course levels—for example, to assess outcomes in both lower- and upper-division courses—depending on how the assessment is structured. If you’re evaluating more foundational learning outcomes, it’s appropriate to focus on lower-division coursework where those outcomes are typically introduced and expected to be achieved earlier in a student’s academic progression. 
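As a companion to the example rubric above, the short sketch below shows how rubric scores might be summarized per indicator. It is a minimal illustration with hypothetical data: the indicator names match the example rubric, but the scores and the "Developing or above" cut-off are assumptions, not requirements of this template.

    # Illustrative only: hypothetical per-student scores (1-4) for each rubric indicator.
    scores_by_indicator = {
        "Analysis":    [3, 2, 4, 3, 2, 3],
        "Application": [2, 2, 3, 4, 3, 2],
        "Synthesis":   [1, 2, 3, 2, 2, 3],
        "Evaluation":  [2, 3, 3, 2, 4, 3],
    }

    for indicator, scores in scores_by_indicator.items():
        mean = sum(scores) / len(scores)
        # Share of students scoring at "Developing" (2) or above -- an assumed cut-off.
        at_developing = 100.0 * sum(1 for s in scores if s >= 2) / len(scores)
        print(f"{indicator:<12} mean = {mean:.2f}; at Developing or above: {at_developing:.0f}%")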

 

Student names must NOT be included in data collection. Reporting on successful completions or manner of assessment (publications, thesis/dissertation, or qualifying exam) may be presented in table format if they apply to learning outcomes. In programs where numbers are very small and individuals could be identified, focus should be on programmatic improvements rather than individual student success. Data should be collected throughout the year, every year – this is especially helpful for biennial reporting. Proprietary program information (e.g., exam questions and examples) must not be included in the report if the program does not want that information to appear in any public-facing materials.

 

6. What Was Learned.

a)    Based on the analysis of the data, and compared to the threshold values established, what was learned from the assessment?

b)    What areas of strength in the program were identified from this assessment process?

c)    What areas were identified that either need improvement or could be improved in a different way from this assessment process?

 

7. How We Responded.

a)    Describe how “What Was Learned” was communicated to the department or program faculty. How did faculty discussions re-imagine ways that program assessment might contribute to program growth, improvement, or innovation beyond the bare minimum of achieving program learning objectives through assessment activities conducted at the course level?

b)    How are the results of this assessment informing changes to enhance student learning in the program? 

c)    If information outside of this assessment is informing programmatic change, please describe that. 

d)    What support and resources (e.g. workshops, training, etc.) might you need to make these adjustments?

 

8. Closing The Loop(s).

Reflect on the program learning outcomes, how they were assessed in the previous cycle (refer to #1 of the report), and what was learned in this cycle about any actions stemming from the previous cycle.  

a)    Self-Reporting Metric (required answer): Based on the findings and/or faculty input, will there be any  changes made (such as plans for measurable improvements, realignment of learning outcomes, curricular changes, etc.) in preparation for upcoming assessments?

Yes      ☐                               No        ☐


b)     In reviewing the last report that assessed the PLO(s) in this assessment cycle, which proposed changes were implemented and will be measured in future assessment reports? What action will be taken to improve student learning going forward?

c)     Have you seen a change in student learning based on other program adjustments made in the past? Please describe the adjustments made and subsequent changes in student learning.

d)     If the program sees anything emerging from this assessment cycle that it anticipates would be a factor, or an item of discussion in its 7-year program review cycle, please use this space to document that for future reference.

 

Submit report to programassessment@montana.edu

Update Department program assessment report website.

Update PLO language in CIM if needed (Map Program Learning Outcomes to Course Learning Outcomes in CIM)