School of Education

Standard 2


The unit has an assessment system that collects and analyzes data on applicant qualifications, candidate and graduate performance, and unit operations to evaluate and improve the unit and its programs.

2.1 Assessment System and Unit Evaluation

How does the unit use its assessment system to improve candidate performance, program quality and unit operations?

2.2.b Continuous Improvement

  • Summarize activities and changes based on data that have led to continuous improvement of candidate performance and program quality.
  • Discuss plans for sustaining and enhancing performance through continuous improvement as articulated in this standard.

2.3 Areas for Improvement Cited in the Action Report from the Previous Accreditation Review

Summarize activities, processes, and outcomes in addressing each of the AFIs cited for the initial and/or advanced program levels under this standard.


2.1 Assessment System and Unit Evaluation

The School of Education uses a continuous improvement cycle in which candidate data are integrated with other assessment protocols for field experiences, faculty evaluation, and unit resources to create a comprehensive, multi-faceted unit assessment system. The system is designed to offer evidence to guide improvement in three areas:

1. Candidate Performance - Feedback to individual candidates at required checkpoints (e.g., admission; degree candidacy or entrance to student teaching or internship; and graduation or program completion) is currently provided through the electronic data management system, Tk20, a comprehensive system for outcomes-based assessment, accountability, and reporting.

2. Program Quality - Aggregated analyses of candidate performance on key assessments at required checkpoints and post-graduation, provided to SOE stakeholders, support continuous, data-driven program improvement. Data collection through Tk20 provides the mechanism for sharing the data.

3. Unit Quality - Data analyses provided to SOE faculty and administrators address overall unit program quality, using the Conceptual Framework and Professional Dispositions to organize aggregated program-level data across the unit; analyses of unit operations, governance, resources, faculty, and candidate support sustain improved program delivery to candidates through a combination of institutional and SOE data sources.

The candidate and program assessment process in the School of Education is guided by the following principles:

  • Assessment is continuous and systematic.
  • Assessment is formative and summative.
  • Summative assessment occurs at multiple checkpoints.
  • Assessments are aligned with applicable state and national professional performance standards and the School of Education's Conceptual Framework.
  • Assessment is fair, consistent, and accurate.
  • Candidate assessment is based on multiple measures of performance over time.
  • Program assessment is based on aggregating data from key candidate assessments at or near program completion.
  • Program and unit-level assessment data are aggregated, analyzed, summarized, and shared on a regular basis with SOE stakeholders to guide improvement efforts.

The School of Education's Unit Assessment System begins with identification of key assessment instruments (assignments and rubrics) that are mapped to professional standards in the Tk20 data management system. Data reports, which are or can be mapped to professional organization standards, accreditation agency standards, NYSED certification requirements, and the SOE Conceptual Framework and Professional Dispositions, are generated for compliance, analysis, and program improvement. Graphic depictions of the unit assessment system are shown in Exhibit 2.4.a.

The assessment system is designed to collect data that provide multiple measures of candidate, program, and unit quality including: content knowledge, pedagogical knowledge, skills and dispositions; impact on P-12 student learning; governance, resources, faculty characteristics, program delivery, and candidate support services.

Key information collected by the Assessment System includes:

  • Quantitative measures, such as GPA requirements, state licensure content and pedagogy test scores, ratings of candidate performance by clinical faculty, opinion surveys of candidates/alumni on program quality, opinion surveys of employers on performance of alumni and program quality, budget and enrollment trends over time, and faculty workload.
  • Qualitative measures, such as evaluations of admission essays or interviews, culminating or comprehensive examinations, and curriculum units.
  • Validation measures, such as completion of required courses or workshops, completion of prerequisite degrees or certifications, and employment or graduate school status.
  • Descriptive information, such as office and instructional space availability and instructional technology availability.

The key quantitative, qualitative, validation, and descriptive information collected in the Tk20 data management system is reported in subsequent pages of this report. Program-level key assessments, aligned with the Conceptual Framework, Professional Dispositions, and relevant professional organization standards, are also administered and coordinated at the program level and entered into Tk20.

The assessment system addresses candidate performance using measures aligned with the Conceptual Framework; faculty performance using teaching, scholarship, and service as determined by the institution's personnel policies; and, procedural operations in terms of resources, functionality, and productivity.

Candidate performance assessments are embedded in program coursework and field experiences that are aligned with the professional standards for each program. Programs leading to initial certification have developed assessments organized around three to four transition points: admission to the program, admission to student teaching/internship, exit from student teaching/internship, and program completion. Each teacher certification and school professional program clearly identifies the minimum requirements for competency. SPA reports in AIMS provide detailed key assessment data by program.

Admissions criteria vary by program and are driven not by program key assessments but by institutional criteria. For programs that require evidence of scholarship, such as GPA and required coursework, information can be found in Exhibit I.5.a - college catalog pages. Admissions assessments are typically measures of basic skills and content preparation, e.g., minimum GPAs, GRE/MAT scores, completion of degrees, or minimum grades in required coursework. College policies determine undergraduate admission to the institution and include a minimum high school GPA and SAT/ACT score (Exhibit 2.4.b). Assessments that permit continuation in and completion of SOE programs include attaining and maintaining a minimum GPA (2.5 for undergraduate programs, 3.0 for advanced programs); satisfactory performance in required courses and on all program assessments; and professional behavior in courses and field placements.

Candidates who are unsuccessful in major or cognate courses or who do not meet minimum GPA requirements may not continue professional education coursework until deficiencies are corrected; advisors work with candidates to identify and remedy these problems. Documented violations of institution or SOE policies may result in termination from a program, as described in detail in the Fair Process Policy & Procedures (Exhibit 2.4.c) and the Candidate Consultation Policy & Procedures (Exhibit 2.4.e) in the SOE Policy Handbook. Records of candidate complaints and the unit's responses and resolutions are maintained in the SOE Dean's office. Additional procedures for judicial-related complaints are presented in the Oswego Student Handbook (Exhibit 2.4.e) and typically begin at the department level.

Critical performance tasks completed by candidates reflect the Conceptual Framework's commitment to social justice. Procedures to ensure fairness, accuracy, consistency, and elimination of bias include: multiple measures at checkpoints in all programs; measurement tools with known reliability and validity characteristics (e.g., the Teacher Work Sample); use of rubrics for rating and scoring measures; and use of two raters for high-stakes decisions on candidate performance (e.g., the college supervisor and cooperating teacher or site supervisor during clinical experiences).

Policies and practices that ensure data are compiled, aggregated, summarized, and used for continuous improvement are reported in the SOE Assessment Handbook (Exhibit 2.4.d).

The second area of assessment, Program Quality, is based on professional standards, NYSED requirements, and/or the Conceptual Framework. Data collection on evaluation of critical performance tasks identified for each program is accomplished each semester by professional and clinical faculty with appropriate qualifications and experience. Completion of data entry in the Tk20 system is a departmental responsibility supervised by the chair with support from the SOE's Technology Support Professional and Associate Dean of Assessment and Accreditation. Analyses of program assessments conducted on a regular basis are integrated with institutional data to draw conclusions about candidate content knowledge, professional/pedagogical knowledge, skills and dispositions; to draw conclusions about impact on P-12 student learning; and, to be shared with professional and clinical faculty as appropriate.

Procedural operations, the third assessment area, include the effectiveness of the data management system, Tk20, which has improved collection, analysis, and reporting for all programs, both undergraduate and advanced. Resources allow for a full-time Technology Support Professional to provide professional development to faculty, staff, and candidates as well as to input data and create reports using the Tk20 system. This position is based in the Dean's office and is available to assist faculty with data collection and with analyses and reporting of data.

The unit regularly and systematically uses data to evaluate the value of its courses, programs, and clinical experiences. Programs have developed course evaluations that are completed by both undergraduate and advanced program candidates. Data are used most directly by faculty to make course-based changes in instructional strategies. Department chairs and undergraduate and graduate curriculum coordinators work with program faculty to ensure that content is well-organized, effective, and reflective of current initiatives. An example of course changes and the process for changes in response to data is offered in Exhibit 2.4.g.

2.2.b Continuous Improvement

Activities and changes based on data that have led to continuous improvement of candidate performance and program quality are as follows:

  • Appointing a full-time Technology Support Professional and a full-time Associate Dean for Assessment and Accreditation has led to a more effective and efficient assessment system.
  • Giving clinical faculty access to assessment tools through Tk20 allows supervisors and cooperating teachers to enter data directly, rather than completing paper copies of assessment tools to be loaded into the online system after the fact.
  • Providing candidates access to the Tk20 system for 10 years allows them to create presentation and/or teaching portfolios. With support from the Provost, senior administrators, and SUNY, the $100 fee for Tk20 is charged at first enrollment in the School of Education. This fee can be covered through financial aid.
  • Using basic criteria for entry and retention in programs, such as the required 2.5 GPA, is more rigorous than the 2.0 GPA required by other units in the institution. Candidates must maintain the GPA as well as earn a C- or better in the major core and cognate requirements, which further demonstrates rigor.
  • Collaborating with Career Services, C & I faculty worked to equip candidates to use the Tk20 assessment process and Optimal Résumé to better market themselves following graduation.
  • Faculty implemented Tk20 for key assessments, and newly written rubrics were deployed in Tk20.
  • The Technology department rewrote all existing courses and added several new elective courses in response to criteria from the NYSED Content Specialty Test, advice from in-service teachers, and professional literature from the field in order to improve program quality.

Plans for sustaining and enhancing performance through continuous improvement, in the unit as a whole and in its parts, include the following:

  • Corroborative reviews of the multiple data collection points will identify performance indicators that confirm the data collected at transition points are adequate indicators of candidate success.
  • The Associate Dean of Assessment and Accreditation will coordinate additional collaboration efforts among SOE committees and programs to strengthen the assessment system and increase the coordination of data sharing across the unit.
  • We will continue to align programs' key assessments with new professional association (SPA) standards, leading to unit-wide review of new measures for consistency, reliability, validity, and effectiveness.
  • Discussions with EAD PAG members on the possible need for an additional platform statement requirement that would address anti-bullying leadership strategies and methods to monitor and prevent incidents of cyber-bullying in schools and across districts.
  • Work with practicing attorneys who specialize in education law to provide current and specific information about the highlights of and compliance with the Dignity for All Students Act (DASA) for our school leader internship candidates.
  • Continued work on dispositions assessment and data collection begun in spring 2013. Two mandatory checkpoints for all programs have been identified, with encouragement to use the instrument for observing and assessing dispositions within courses. The first checkpoint is suggested at entry into the program, as an advisement session to review the dispositions and make candidates aware of expectations; the second is required at completion of the program.
  • Data sharing across committees through Tk20. For example, a survey used by the Educational Technology committee regarding student teachers' and interns' use of technology can inform Field Placement committee deliberations and lead to strengthening candidates' use of educational technologies in school and clinical placements.
  • Use of information technologies to convert documents or data to a digital format to expedite document sharing.
  • Assurance that new faculty and adjunct faculty receive training in all assessment procedures and policies.
  • Updating the FPO so that field experience, student teaching, and internship placements are uploaded into Tk20. Additionally, ensuring that department personnel upload field experience, student teaching, internship assessment, and key assessment data into Tk20 will support continuous improvement.
  • Continued efforts to obtain valid data from alumni and employer surveys at both the initial and advanced levels. The institution is creating a position for fall 2013 that will assist with this process: the Graduate Outreach Specialist will develop and implement a comprehensive strategic approach for the collection of graduate exit data. The Graduate Outreach Specialist will work closely with the alumni office to ensure effective communication flow, collaborate with faculty and staff to systematize and centralize the collection of data across campus, supervise a team of students conducting a calling campaign and social media research, and perform other forms of outreach.
  • Preparing candidates for NYSED's new certification exams beginning in 2014. These tests include: Teacher Performance Exam (edTPA), Educating All Students (EAS), Academic Literacy Skills Test (ALST), revised Content Specialty Test (CST), and revised School Building Leader Assessment (SBL). The SOE is working to cross-reference the Teacher Work Sample (TWS) to the edTPA. These tests will replace the certification exams currently used as key assessments across the unit and will provide more data that support candidate qualifications.

2.3 Areas for Improvement Cited in the Previous Accreditation Review

Each area for improvement is enumerated and addressed below.

1. The unit has not taken effective steps to ensure that assessment procedures are fair, accurate, and consistent across the unit.

The data management system Tk20 has allowed systematic, consistent collection of data on key assessments across the unit. Additionally, two unit-wide assessments, the Professional Dispositions and Graduate Exit Surveys, are completed through the system, with data collection occurring immediately. Reports from data collected in the system are created for chairs, committees, and individual faculty members. The Technology Support Professional, as the point person for data collection assistance, runs reports, and faculty and staff have opportunities for professional development and orientation to the full capabilities of the Tk20 system conducted by both the Technology Support Professional and the Associate Dean for Assessment and Accreditation. To ensure fairness, accuracy, and consistency, clinical practice is evaluated by two reviewers - the college supervisor and the cooperating teacher or site supervisor. Multiple checkpoints for data collection are consistent across the unit - admission to program, entry to clinical practice, and program completion. Some programs include a fourth checkpoint distinguishing between completion of clinical practice and completion of program.

2. Data are not regularly and systematically tabulated, summarized, and analyzed in MSEd program in Technology Education, Curriculum and Instruction, and CTE.

The Technology Education MSEd program, an advanced program fulfilling professional certification only, utilizes a two-part assessment plan. The first part consists of collection of candidate data from a series of three candidacy exams written by faculty, which candidates sit for midway through the program. Exam contents relate directly to curricular core courses, and the exam rubric provides a summative evaluation based on scoring of relevant criteria, including assessment of candidate knowledge of standards, theories, and current best practices in technology education. The second part takes a more formative approach, assessing each candidate through the required compilation of a graduate portfolio that encompasses the entire graduate program. The portfolio rubric is based on a set of competencies developed by the graduate faculty, which align with departmental expectations of graduates of the program. Data were evaluated in 2011 and are on a three-year cycle, to be evaluated again in 2014. Program improvement has included review of the candidacy exam questions and inclusion of additional technical courses based on changes in the field.

The MSEd in Curriculum and Instruction assesses candidate progress by tracking courses completed, in progress, and planned. The MSEd program assessments meet the NBPTS Propositions. See response 2 to the AFIs in Standard 1.

Additionally, candidates in the MSEd Career and Technical Education program participate in a required comprehensive examination, written and evaluated by program faculty prior to program completion.

3. The collection, aggregation, and sharing of data related to advanced programs is inconsistent across the unit.

Rubrics and data for evaluating candidate performance are uploaded into Tk20, allowing systematic analysis of data to improve unit operations. Systematic collection of data across checkpoints in programs provides opportunities for sharing data related to similar assessments across the unit. Data collected through institutional resources related to all advanced programs are also stored in the Tk20 system, and comparative reports are created by the SOE Technology Support Professional. Each advanced program has its own unique assessments, so the advanced programs yield similar rather than identical data sets. However, with the Tk20 system, all data are collected, aggregated, and shared with program chairs and directors.