Glossary of Terms

To ensure shared understanding and consistency, the following glossary defines key terms.


ANALYSIS: The process by which collected data are transformed into information that can be shared and used. (ASK STANDARDS, ACPA, 2006)

ASSESSMENT: Any effort to gather, analyze, and interpret evidence which describes institutional, departmental, divisional, or agency effectiveness. (Upcraft & Schuh, 1996, p. 18)

ASSESSMENT CYCLE: The full sequence of assessment activities, including identifying outcomes, determining methods, planning assessment, gathering evidence, analyzing and interpreting evidence, sharing results, and implementing change. (ASK STANDARDS, ACPA, 2006)

CAS: Council for the Advancement of Standards in Higher Education (ASK STANDARDS, ACPA, 2006)

CAS STANDARDS: Standards that help professionals create high-quality programs and services. (ASK STANDARDS, ACPA, 2006)

CLOSING THE LOOP: The process of utilizing data for the improvement or modification of a program, service, or department. (ASK STANDARDS, ACPA, 2006)

Three approaches to assessment (Barham & Scott, 2006, p. 214):
  • SERVICE: measures customer satisfaction, tracking usage, demographics, and needs assessments
  • LEARNING: measures what students learn from participating in an experience
  • DEVELOPMENT: measures student growth or outcomes

Regularly using assessment results to improve programs and services. (Upcraft & Schuh, 1996)

ACCOUNTABILITY: The process of providing information that is valid and credible to the larger audience. (ASK STANDARDS, ACPA, 2006)

DATA: Information gathered for the purpose of research, assessment, or evaluation. (ASK STANDARDS, ACPA, 2006)

DIRECT EVIDENCE: Tangible, visible, self-explanatory, and compelling evidence of exactly what students have and have not learned. It includes both objective exams and performance measures, such as evaluations of demonstrations, internships, and portfolios, that are evaluated by individuals other than the instructor. (ASK STANDARDS, ACPA, 2006)

EVALUATION: Any effort to use assessment evidence to improve institutional, departmental, divisional, or agency effectiveness. (Upcraft & Schuh, 1996, p. 19)

FOCUS GROUPS: Group discussions intentionally designed to generate in-depth discussion around a specific topic. These groups are typically led by trained moderators using questions developed prior to the session. The intent of focus groups is to examine feelings, perceptions, attitudes, and ideas. (ASK STANDARDS, ACPA, 2006)

GENERALIZABLE: Applicable to a larger population. (ASK STANDARDS, ACPA, 2006)

GOALS: Set for approximately 3-5 years; describe what we aspire to.
  • Clearly aligned with institutional and divisional mission and vision
  • Challenging but realistic and achievable, and reflect positive change
  • Depict the core of what the unit does: the unit's primary concerns and strategic direction, NOT a comprehensive listing of operational, day-to-day activities
  • Have some degree of measurability; it should be possible to gauge whether or not progress toward achievement is being made


INDIRECT EVIDENCE: Evidence that consists of proxy signs that students are probably learning; it is less clear than direct evidence. (ASK STANDARDS, ACPA, 2006)

LEARNING OUTCOME: A concrete action that a student demonstrates as a result of learning. A learning outcome can be a demonstration of knowledge, a skill, or a value. (ASK STANDARDS, ACPA, 2006)

METHODOLOGY: The approach taken for data collection: qualitative, quantitative, or mixed design. (ASK STANDARDS, ACPA, 2006)


METHOD: The approach to how data will be gathered (survey, experimental, correlational, biography, narrative, case study, phenomenology, etc.). (ASK STANDARDS, ACPA, 2006; Henning & Roberts, 2016)

MISSION: The central purpose that provides focus, direction, and destination for our work; describes the purpose of our organization, who we serve, and our hopes.

OBJECTIVES: What do we want to accomplish in the short term? Objectives are SMART (Specific, Measurable, Attainable, Realistic, Time-bound) and are set annually.


OUTCOME: The desired effect of a service or intervention, but much more specific than a goal. Outcomes are participant focused: what students should be able to demonstrate after their participation (also defined as learning or developmental outcomes). Outcomes can also be programmatic, such as an increase in usage of a particular resource or service.

QUALITATIVE ANALYSIS: Analysis used to tell a story or demonstrate key themes; detailed descriptions of people, events, situations, interactions, and observed behaviors. (ASK STANDARDS, ACPA, 2006)

QUANTITATIVE DATA: Data collection that assigns numbers to objects, events, or observations according to some rule; generally analyzed using descriptive and inferential statistics. (ASK STANDARDS, ACPA, 2006)
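As a minimal illustrative sketch (the survey items and rating values below are hypothetical, not from any cited instrument), descriptive statistics for quantitative data can be computed with Python's standard library:

```python
import statistics

# Hypothetical 1-5 satisfaction ratings collected from a program survey
ratings = [4, 5, 3, 4, 5, 4, 2, 5, 4, 4]

# Descriptive statistics summarize the data set itself
mean_rating = statistics.mean(ratings)      # central tendency
median_rating = statistics.median(ratings)  # middle value
spread = statistics.stdev(ratings)          # sample standard deviation

print(f"mean={mean_rating:.1f}, median={median_rating}, sd={spread:.2f}")
```

Inferential statistics would go a step further, using such summaries to draw conclusions about a larger population than the respondents themselves.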

RESEARCH: Scholarly activity that typically has broader implications for student affairs and higher education. (Upcraft & Schuh, 1996)

RELIABILITY: The consistency of a set of measurements; the extent to which they measure the same thing over repeated administrations. (ASK STANDARDS, ACPA, 2006)

RUBRIC: An established set of criteria by which information is measured, categorized, or evaluated. (ASK STANDARDS, ACPA, 2006)

SAMPLING: The manner in which participants are selected. There are two broad types: probability sampling, which allows you to make inferences about a population, and non-probability sampling, which does not allow you to make inferences to a larger population. (ASK STANDARDS, ACPA, 2006)
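The distinction can be sketched in a few lines of Python (the sampling frame of 500 student IDs and the sample size are hypothetical):

```python
import random

# Hypothetical sampling frame: 500 student ID numbers
population = list(range(1, 501))

random.seed(2006)  # fixed seed so the sketch is reproducible

# Probability sampling (here, simple random sampling): every member has
# a known, equal chance of selection, so inferences to the population
# are justified; random.sample draws without replacement
probability_sample = random.sample(population, k=50)

# Non-probability (convenience) sampling: e.g., the first 50 students
# who happen to respond; inferences to the larger population are not
# justified because selection chances are unknown
convenience_sample = population[:50]

print(len(probability_sample), len(convenience_sample))
```

Stratified, cluster, and systematic sampling are other probability designs; quota and snowball sampling are other non-probability designs.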

An internal assessment used to evaluate programs, including their quality and effectiveness, in reference to established criteria. (ASK STANDARDS, ACPA, 2006)

SURVEY: A method of collecting information from people about their characteristics, behaviors, attitudes, or perceptions. Most often surveys are questionnaires or structured interviews with a specific set of questions. (ASK STANDARDS, ACPA, 2006)

TYPES OF ASSESSMENT                                                                      

  • Tracking: keeping track of who uses services, programs, and facilities
  • Needs: assessing student and other stakeholder needs
  • Satisfaction: assessing student and other stakeholder satisfaction
  • Campus climate
  • Outcomes: end result of an experience
    • Learning: what students know or are able to do as a result of a program or experience
    • Program: what the program will accomplish (not levels of learning)
  • Program review: comparing functional areas against a set of standards, either national (such as CAS) or locally developed
  • Benchmarking: comparing programs or services to those of peer institutions
  • Resource effectiveness: cost-benefit analysis of programs and services, and how programs promote retention, academic success, and time to graduation
  • Accreditation: contributions to institutional effectiveness; using data for continuous improvement (closing the loop)
  • National instruments: large-scale assessments on various topics such as student health, leadership, and living on campus (Henning & Roberts, 2016)

VALIDITY: Determines whether the instrument measures what it is supposed to measure; includes construct, criterion, and content validity. (ASK STANDARDS, ACPA, 2006)

VISION: "A compelling and futuristic statement of a desirable state of reality made possible by accomplishing the mission in a way that is consistent with values" (Haines, 2010). Describes the unit's potential.
  • How does the department look if we "do" our mission extraordinarily well?
  • Should be idealistic, authentic, extraordinary, and appealing
  • Guided by the institutional and divisional strategic plan

VALUES: Those qualities and behaviors that are most highly regarded by members; our value/belief system. (How will we do our work?)