Accreditation Proposals

When Are Proposals Due?

You can submit a proposal at any time. AAQEP staff match proposals with peer reviewers at three points during the academic year: August 15, November 15, and March 15. You will receive feedback approximately three months after the match date.

Send proposals (and any questions) to Karen Lowenstein, Director of Member Services.

AAQEP offers many resources to support you in writing your Accreditation Proposal! Follow the links on this page to register for an event or download a file.

Provider Templates

Downloadable templates and instructions are available for two of the elements that need to be included in every proposal: the Program Specification Table and the Aspect-Evidence Tables.

  • Program Specification Table (PDF)

    This table is an official record of the programs included in the scope of your AAQEP review. It identifies the degrees or programs being accredited by AAQEP, any associated state credentials, enrollment and completer numbers, and other information. Download this Word version to use as your template.

  • Aspect-Evidence Tables (PDF)

    These tables, which replace the "Aspect-Evidence Index" introduced in the 2021 Guide to AAQEP Accreditation, provide an organizer for your report readers by identifying evidence sources, their alignment to the standards and aspects, and which program(s) they support. Download this Word version to use as your template.

Proposal Samples

AAQEP members often request samples of other programs’ Accreditation Proposals to help guide their own writing. The excerpts below are grouped by proposal section, introduced by a brief summary of that section’s purpose and possible ways to present it (see also the guidelines above). Thanks to the member faculty and programs who have agreed to share these samples!

To help guide your use of these samples, staff have provided brief annotations of elements that may be helpful to notice in each sample. These annotations highlight various strategies used by proposal authors to provide a concise snapshot of their work.

Note that these samples represent options for approaching the Accreditation Proposal and are not intended as templates. Downloads are available only to logged-in AAQEP members.

What’s the point of Section 1?

The introduction identifies the programs seeking accreditation and presents a high-level overview of the provider's context. Although brief (generally two to four pages), it should include the details reviewers need to understand the programs' design, candidate population, geographic factors, mission or other commitments, and relevant state requirements.

How do authors present this section?

Section 1 may be successfully presented in a variety of ways. Some introductions are pure narrative, while others organize content under separate headings or include graphics, such as organizational charts, to illustrate hierarchies and relationships. Additionally, some authors helpfully include a glossary to introduce reviewers to terms and acronyms associated with their programs and assessments.

Examples (click title to open PDF—must be logged in)

Section 1 Example A

  • The provider has explicitly included stakeholder involvement as part of the overview.
  • The Program Specification Table presents disaggregated candidate and completer data by credential within programs, offering a clear, concise, and complete picture of the provider's numbers by field, degree, and program.

Section 1 Example B

  • The brief description of the university’s mission is helpful because it shows how the work of preservice teacher education is integral to the larger institution.
  • The provider presents an overview in the form of a table of all of the measures for the aspects of Standard 1 before presenting individual tables of measures for each aspect of Standard 1. This helps orient readers to the set of overall measures and how they are used across aspects.

Section 1 Example C

  • The narrative descriptions of each program provide a brief history of recent, relevant program updates and the rationale for those updates. The rationales also cite recommendations from school partners, serving as evidence of relationships with the field.

Section 1 Example D

  • The Program Specification Table presents a clear picture of initial and advanced programs with disaggregated data on credentials and degrees as well as numbers of unique candidates in programs.
  • Two key advisory boards central to the work of the provider are featured up front in the overview.
  • The proposal describes the measures common across initial and advanced programs followed by descriptions of the tools specific to each set of programs.

What’s the point of Section 2?

Section 2 describes the evidence sources the author intends to use to explicitly address each aspect of Standards 1 and 2. Reviewers also look for a clear articulation of the alignment of assessments used throughout the program, the aspects they address, and the perspectives they represent (which must include program faculty, P-12 partners, program completers, and completers’ employers, although not all perspectives must be represented for each aspect).

How do authors present this section?

To organize and document these essential components of the evidence set, authors employ a variety of tables. Some proposals organize the evidence by aspect, while others do so by measure. The tables may also show the frequency of a measure’s use over time and denote whether the data source is currently in use or planned, and/or whether it is a direct or indirect measure.

Examples (click title to open PDF—must be logged in)

Section 2 Example A

  • This table is a helpful map for the evidence of Standard 1, showing at a glance which measures are common across two programs and which measures are distinct. It also shows the perspectives represented by the measures.

Section 2 Example B

  • The first table (Table 2) serves as a helpful map of specific kinds of measures (e.g., exit surveys, field rubrics) that each program has.
  • The second table (Overview of Measures) indicates when measures are implemented and includes their criteria for success. There are even response rate goals for surveys.
  • Rather than referring to an entire measure, the third and fourth tables (Mapping of Measures to AAQEP Standards: Exit Survey and Alumni Survey) align specific rubric or survey items to specific aspects and include the actual questions to demonstrate alignment.

Section 2 Example C

  • The tables of measures by aspect explicitly indicate whether each measure is direct or indirect. They also indicate whether a measure is currently in use or planned.
  • Table 4 provides a snapshot of all of the measures presented in the proposal and the aspects each addresses.

Section 2 Example D

  • The tables demonstrate evidence for Aspects 1a and 1b, including stakeholder perspectives, when the measure is used, and whether the measure is in use or planned.

Section 2 Example E

  • These tables show that a measure can be used for multiple aspects. The tables also align specific items or parts of a measure to an aspect.

Section 2 Example F

  • The tables show measures by aspect, including the perspectives represented, whether each measure is in use or planned, and when it is used.

What's the point of Section 3?

This section describes how the provider is examining each proposed measure from a data-quality standpoint. In particular, it’s essential to document how locally developed measures have been (or will be) evaluated for validity, reliability, trustworthiness, and fairness. For more widely available measures, links to related data-quality studies are also helpful, but more pertinent may be an explanation of how such measures accurately reflect candidate success (e.g., consideration of curriculum alignment and fidelity of implementation). It’s helpful to append instruments to the proposal or include links to sample measures.

How do authors present this section?

This section can benefit from a tabular presentation that lists and describes each measure and the stakeholders involved in determining data quality. Some authors instead opt for narrative explanations of how they have addressed, or plan to address, each data-quality concept.

Examples (click title to open PDF—must be logged in)

Section 3 Example A

  • Each measure is explored with data quality questions and descriptions of the provider’s work to obtain quality data. There is also mention of how the provider shares revisions with stakeholders.

Section 3 Example B

  • A brief history including attention to revisions of measures and stakeholder involvement is presented for each measure.

Section 3 Example C

  • For instruments that have undergone validity studies, links or sources are provided.
  • Specific training for interrater reliability is described.
  • There is a plan to evaluate fairness by evaluating whether there are disparate outcomes for specific groups of candidates.

Section 3 Example D

  • The provider situates its data quality work in a description of its current and planned internal processes.

What’s the point of Section 4?

Section 4 identifies new or emerging features of the programs being reviewed, how the provider plans to monitor these changes, and what markers will guide and evaluate them. Proposal authors should flag any measure used to secure this evidence that is not discussed elsewhere in the proposal.

How do authors present this section?

Authors generally use a narrative format to tell this story. To help reviewers understand the program’s rationale for proposed changes, authors identify contextual challenges and describe how they have spurred innovations.

Examples (click title to open PDF—must be logged in)

Section 4 Example A

  • Each challenge or innovation is named and succinctly described. The provider also includes concise information on impact and next steps.

Section 4 Example B

  • The provider prioritizes two contextual developments: the first concerns the context and identity of the institution and its impact on accreditation work; the second concerns the concrete work of rubric revisions.

Section 4 Example C

  • Innovations are organized by program in tables that include a brief description of each innovation's substance and how it will be evaluated or monitored.

Section 4 Example D

  • The provider situates challenges and planned innovations in the larger context of the institutional commitment to diversity.