National Institute for Learning Outcomes Assessment
October 2013

Occasional Paper #19

All-in-One: Combining Grading, Course, Program, and General Education Outcomes Assessment

W. Allen Richman
Laura Ariovich

www.learningoutcomesassessment.org

Contents

Executive Summary
All-in-One: Combining Grading, Course, Program, and General Education Outcomes Assessment
  The All-in-One Process
  Finding and creating interconnections
  Identifying key assignments
  Using rubrics for grading and assessment
  Inherently connected to the curriculum
  Efficient use of resources
  Faculty involvement and support
  Consistent feedback for students and faculty
  Robust data set
  Same student, multiple measures
  Skill development
  Implementation issues and lessons learned
  Conclusion
References
Appendices
  Appendix A: Example Rubric A
  Appendix B: Example Rubric B
  Appendix C: Example Rubric C
NILOA National Advisory Panel
About NILOA
NILOA Staff
NILOA Sponsors

About the Authors

W. Allen Richman

Dr. W. Allen Richman is Interim Dean of the Office of Planning, Assessment, and Institutional Research and the Director of Outcomes Assessment & Institutional Effectiveness at Prince George’s Community College in Largo, Maryland. He is responsible for all aspects of institutional effectiveness and assessment for all units on campus, including oversight of the collection, statistical analysis, summarization, and dissemination of data pertaining to nearly every aspect of the institution. Dr. Richman’s published work has appeared in Peer Review, the Proceedings of the NEAIR 39th Annual Conference, the Journal of Special Education Technology, the Journal of Research in Childhood Education, and others. He is a frequent conference presenter on topics related to assessment, accreditation, institutional effectiveness, and technology. His presentations include those at the Annual Dream Conference, the Middle States Commission on Higher Education, the Assessment Institute in Indianapolis, the Association for Institutional Research, and the American Association of Colleges for Teacher Education Annual Meeting. Dr. Richman earned his B.A. in English from the University of Texas at Austin and his M.A. and Ph.D., both in Developmental Psychology, from the University of Kansas.

Laura Ariovich

Dr. Laura Ariovich is currently a Research and Planning Analyst at Prince George’s Community College (PGCC), in Largo, Maryland. At PGCC, she applies her extensive experience in qualitative and mixed-methods research to conduct multi-layered assessment of curricular and co-curricular activities, including observation, surveys, and focus groups to capture students’ views, voices, and experiences. As a member of the College’s Assessment Team, she contributed to implementing All-in-One, the college-wide assessment of course, program, and general education learning outcomes. She has also conducted numerous workshops and presentations to engage the College community in the discussion of research and assessment data. Prior to working at PGCC, Laura conducted survey research on work and family trajectories in Argentina and ethnographic research on union organizing and member activism in the U.S. She was a recipient of a Fulbright Foreign Scholarship, has published two books, one as sole author and one as co-author, and holds a Ph.D. in Sociology from Northwestern University.

Executive Summary

Pressed by the ongoing drive for accountability, most U.S. colleges and universities are conducting some form of student learning outcomes assessment. Yet because their efforts are fragmented, many institutions are stymied in their attempts to fully engage with assessment findings to improve student learning. Facing multiple and competing demands for evidence of student learning, institutions typically have created separate, coexisting assessment practices, handling the assessment of courses, programs, and general education as isolated pieces rather than as interconnected components of the evaluation of students’ knowledge and skills. This fragmentation has made it hard to translate assessment findings into meaningful recommendations for faculty and students.

New software solutions, however, have opened up opportunities for comprehensive assessment that covers multiple levels and serves multiple purposes without disconnected processes or duplicated efforts. More centralized and technologically advanced assessment systems can now help institutions with the storage, management, and dissemination of data, while still leaving room for valid assessment rooted in “authentic measures” of student work (Banta, Griffin, Flateby, & Kahn, 2009; Ewell, 2009). Several institutions have begun to incorporate new technology and to create more centralized and interconnected assessment systems. Yet combining the assessment of course, program, and general education into a coordinated, coherent, single process has not been a common approach.

Traditionally, assessment scholars have seen the separation between assessment and grading as a safeguard to ensure objective measurement of student learning. More recently, however, the “firewall” between assessment and grading has been challenged for reasons of efficiency as well as on pedagogical grounds. Connecting assessment and grading can save time and resources by avoiding duplicated efforts (Salisbury, 2012).
Moreover, if grades are based on the achievement of learning outcomes, students will be more likely to work on mastering those outcomes (McClendon & Eckert, 2007). While a few institutions have started to experiment with systems that combine outcomes assessment and grading, these initiatives are still in a pilot or early implementation stage.

This paper describes the system developed and implemented by one institution, Prince George’s Community College (PGCC), in Largo, Maryland, to integrate assessment of course, program, and general education and to connect outcomes assessment with grading. PGCC’s assessment system—called “All-in-One”—allows faculty to capture students’ discrete skills by using rubrics to assess and grade key projects across program curricula and then entering the data into a centralized database. All-in-One requires a technology platform that incorporates rubrics for grading student work. It also needs careful, ongoing review of curricula to maintain connections among course, program, and general education learning outcomes. Crucially, faculty collaborate in All-in-One in developing and using common embedded assessments to ensure that all sections of a course are evaluated with a common rubric.

All-in-One returns assessment to its primary goal, namely, improving student learning and faculty teaching. It measures performance at the level of the individual student and, thus, students’ skills can be tracked over time and compared simultaneously across courses. Since spring 2012, when it was first fully implemented, All-in-One has collected between 2,500 and 4,000 scored rubrics each semester. All components of the assessment process, including grading and the evaluation of course, program, and general education learning outcomes, are captured in a single assessment system—All-in-One.

All-in-One: Combining Grading, Course, Program, and General Education Outcomes Assessment

W. Allen Richman & Laura Ariovich

Pressed by the ongoing drive for greater accountability, most U.S. colleges and universities are conducting some form of student learning outcomes assessment. Yet when considering the returns of such widespread efforts in improving student learning, the impact is difficult to gauge (Angelo, 1999, 2002). Many institutions find themselves caught in a reactive mode, merely assessing courses, programs, and general education to report the results to accreditors, without fully engaging with those results in a way that would actually benefit students. Summarizing the findings of NILOA’s survey of Chief Academic Officers at regionally accredited undergraduate institutions, Kuh and Ikenberry (2009) confirm that most institutions have gone through the process of defining common learning outcomes for all students, and most have taken steps to assess those outcomes. The institutions’ primary purpose for doing so, however, has been to meet accreditation requirements. As the authors point out, “assessing student learning outcomes just to post a score on the institution’s website is of little value to campuses, students, parents, or policy makers” (Kuh & Ikenberry, 2009, p. 4).

Administrators and faculty do aspire to use assessment results to improve learning, but their best intentions often get thwarted by the fragmentation of assessment efforts. Facing multiple and competing demands for evidence of student learning, institutions typically have created separate, coexisting assessment practices, handling the assessment of courses, programs, and general education as isolated pieces rather than as interconnected components of the evaluation of students’ knowledge and skills.
This fragmentation has made it hard to translate assessment findings into meaningful recommendations for faculty and students.

New software solutions, however, have opened up opportunities for comprehensive assessment that covers multiple levels and serves multiple purposes without disconnected processes or duplicated efforts. More centralized and technologically advanced assessment systems can now help institutions with the storage, management, and dissemination of data, while still leaving room for valid assessment rooted in “authentic measures” of student work (Banta, Griffin, Flateby, & Kahn, 2009; Ewell, 2009; see also Provezis, 2012). Several institutions have begun to incorporate new technology and to create more centralized, interconnected assessment systems (Hutchings, 2009). Daytona State College, for example, has adopted a learning and assessment management system for tracking course, program, and institutional learning outcomes college wide (Hamby & Saum, 2013). Nevertheless, a systematic approach for combining the assessment of course, program, and general education into a single process is still the exception rather than the rule.

Traditionally, assessment scholars have seen the separation between assessment and grading as a safeguard to ensure objective measurement of student learning. More recently, however, the “firewall” between assessment and grading has been challenged for reasons of efficiency as well as on pedagogical grounds. Connecting assessment and grading can save time and resources by avoiding duplicated efforts (Salisbury, 2012). Moreover, if grades are based on the achievement of learning outcomes, students will be more likely to work on mastering those outcomes

(McClendon & Eckert, 2007). While a few institutions, such as California State University, Fresno, have started to experiment with systems that combine outcomes assessment and grading (Bengiamin & Leimer, 2012), these initiatives are still in a pilot or early implementation stage.

This paper describes the system developed and implemented by one institution, Prince George’s Community College (PGCC), in Largo, Maryland, to integrate assessment of course, program, and general education and to connect outcomes assessment with grading. PGCC’s assessment system—called “All-in-One”—allows faculty to capture students’ discrete skills by using rubrics to assess and grade key projects across program curricula and then entering the data into a centralized database.[1] All-in-One requires a technology platform that incorporates rubrics for grading student work. It also needs careful, ongoing review of curricula to maintain connections among course, program, and general education learning outcomes. Crucially, faculty collaborate in All-in-One in developing embedded assessments to ensure that all sections of a course are evaluated with a common rubric.

All-in-One capitalizes on new technology to achieve the large-scale adoption of best assessment practice. Following the principles of best practice, an effective assessment system, first, should demonstrate student achievement of learning outcomes at the course, program, and general education level in a coherent, integrated, and cost-effective structure (Huba & Freed, 2000, p. 82; Walvoord, 2010, p. 26). Second, an effective assessment system should ensure that feedback for faculty and students remains within the teaching context (Banta et al., 2009, p. 17).
Third, an effective assessment system should include different ways of measuring the same skill, it should measure the same student on different skills, and it should track the same student on the same skill over time (American Association for Higher Education [1992], second principle). In what follows, we discuss how All-in-One meets these three requirements at PGCC.

The All-in-One Process

Every piece of work a student completes requires the amalgamation of a broad set of knowledge, skills, and values that the student has built over time. Every research paper, for example, requires the student to demonstrate skills in writing, formatting, technology, information literacy, critical thinking, and knowledge of the content. The mastery of a specific skill, furthermore, may be “conditioned” by the acquisition of another skill (Ewell, 2013, p. 7).

Thus, when faculty assign a grade to a project, they necessarily take into consideration a broad set of student capabilities. However, many faculty neither measure these skills nor provide feedback in a discrete manner. Since grading is generally dissociated from the assessment of learning outcomes, faculty tend to approach grading as an overall evaluation of student work, while providing students with a multitude of comments and feedback in writing. Even faculty using rubrics in their classroom usually do not share the data with students. Thus, faculty are spending a significant amount of time and energy grading student work without reaping all of the potential benefits. With All-in-One at PGCC, rubrics developed by faculty are used to evaluate students’ performance on different skills and to provide the faculty member and the student with clear feedback on individual strengths and weaknesses.

[1] http://www.pgcc.edu/About_PGCC/opair/Assessment_Data.aspx

By putting the rubric data into an electronic storage system, the faculty member can quickly obtain aggregate performance results on the rubric for his or her course and, thus, easily observe strengths and weaknesses on discrete knowledge, skills, and values for all students in the class. From there, the aggregation possibilities expand, with deans and chairs being able to track the performance of all students within a course, program, or department.

All-in-One allows faculty to measure students’ discrete skills by creating and using rubrics to grade key projects across program curricula and then entering the data into a centralized database. Measuring students’ discrete skills at PGCC began by identifying high-enrollment courses and general education courses as well as by working with departments to identify additional courses in which students can best demonstrate achievement of the program’s specific outcomes (e.g., capstones and other courses in the major).

Finding and creating interconnections

The All-in-One approach works only when there are clear, tight connections of learning outcomes across the curriculum. Every course learning outcome must be examined in the context of the programs it serves to ensure that the learning outcomes of the course are indeed leading to the program outcomes. PGCC faculty spent a long time working to connect course learning outcomes to program outcomes and to integrate general education skills throughout the curriculum. The end product has been a preferred sequence of courses for every program, identifying the preferred courses for general education and the order in which the student should take his or her major-specific courses.
While not required, the preferred course sequence identifies courses that build skills for later courses so that the learning outcomes of each course create a scaffold of experiences for students culminating in the program’s learning outcomes.

While every course identified as general education must address general education learning outcomes, these outcomes are also addressed throughout all other courses. As such, not only in general education courses but in other courses as well, students are honing their understanding of general education learning outcomes. Indeed, we want students to further develop their writing, critical thinking, and other general education skills within their discipline. In every course a student takes, therefore, the student should demonstrate some further development of these general education learning outcomes. According to this philosophy, every course learning outcome at the institution should lead to outcomes at another level: to the learning outcomes of the next course in the sequence of courses, to the program learning outcomes, or to a general education learning outcome.

Through these interconnections, collecting data on a single course learning outcome not only provides data about that course outcome but also builds information about the general education and program learning outcomes. This web of curriculum interconnections, represented in the assessment database, allows for the aggregation of skills across the curriculum.

Identifying key assignments

At the course level, creating the interconnections begins with the identification of key assignments. As Ewell (2013) emphasized when discussing the Degree Qualifications Profile competencies, “The primary vehicle or mechanism for determining whether or not students have mastered the competency, therefore, is a course assignment of some kind” (p. 13). At PGCC, if possible, the identified assignment is used to demonstrate all course outcomes. If covering all learning outcomes in a single assignment is not possible, students are evaluated with two or three assignments (e.g., a midterm project and a final project).

These assignments are designed as culminating demonstrations of the knowledge, skills, and values that the student is expected to gain from the course. Because faculty who teach the course collaborate to create the assignment(s) and the rubric(s) to assess student work, they retain control in identifying the best assignment and how to evaluate it.

Once the assignment is identified, all sections of a course administer the same assignment. For example, if faculty identify a research paper as the best way to demonstrate the course learning outcomes, then students in all sections of that course complete a research paper. Within each section, faculty have some range of variation in their assignment. For the research paper, for instance, faculty can select specific topics or mold the research paper to fit their individual style of instruction and content focus. However, all assignments are graded using the same rubric designed by faculty and built in the assessment database.

In some cases, faculty may choose to administer a common multiple choice exam across course sections rather than a common assignment evaluated with a rubric. However, the preferred approach at PGCC has been authentic assessment of student work through rubrics and, hence, that is the primary focus of this paper.

Using rubrics for grading and assessment

Faculty design each rubric with the aid of a template in the software package (see Table 1). Using this template, they identify a set of assignment domains and performance levels, typically five, ranging from “Excellent” to “Unsatisfactory.” In addition, they assign a point value to each cell of the rubric for grading. Faculty complete the template in a Word document and then the assessment staff uploads the rubric to the assessment database. In this way, faculty determine what goes into the rubric and how student work is evaluated.
Table 1: Example Rubric in Development

Domain            Course     Excellent        Good          Average         Below Average   Unsatisfactory
                  Outcomes
Intro Paragraph   1          Points: 8        Points: 7;6   Points: 5;4;3   Points: 2       Points: 1;0
Content           A          Points: 11;10;9  Points: 8;7   Points: 6;5;4   Points: 3       Points: 2;1;0

(Descriptions of the expected performance at each level are entered in each cell of the rubric.)

Faculty give the assignment and instructions to their students and then evaluate the students’ work using the rubric. For every student, the rubric is completed online by selecting the points the student earns in each domain. As an example, Table 1 shows that “Intro Paragraph” can receive from 0 to 8 points based on the student’s performance. To decide how many points to assign, faculty first use the description given in the rubric for each cell (e.g., average). The description in this cell identifies the specific characteristics of an “average” introductory paragraph. To maximize the flexibility of grading, faculty can then decide, within “average,” whether the student’s introductory paragraph is worth 5 points, 4 points, or 3 points based on the quality of the paragraph elements present (Table 1). In this way, faculty do not get “boxed in” by the five categories of the rubric. Instead, they can use the rubric to guide the grading process but still grade on a continuum. As the faculty member

moves through the rubric assigning points, the software calculates a sum of the scores and a percentage (points obtained over points possible). Either of these can be used as the final grade for the student, depending on whether the faculty member is using a point-based or a percentage-based grading method.

Once the semester is over and all faculty members have graded their students’ work, the rubrics are used in a slightly different manner. For assessment purposes, we are not interested in the number of points each student received but instead in the performance level (e.g., “Excellent,” “Good,” “Average,” etc.) that students achieved on the rubric for each domain. The focus of the assessment analysis is on the percentage of students who fall within each performance level. In general, students who fall within the “Excellent,” “Good,” and “Average” levels are considered to be meeting the expectations for the defined skill (e.g., writing), while students scoring within the below average and unsatisfactory levels are considered to be falling behind the performance expectations for that skill.

In the All-in-One database, each domain or row of the rubric is connected to a course learning outcome, and that course outcome, in turn, is connected to other learning outcomes, e.g., program and general education outcomes. Figure 1 displays how the rubric feeds data into specific knowledge, skills, and values. Data from each rubric feed into the course outcomes and are then connected to program and/or general education skills.
Furthermore, since the same rubric is used across all sections of the same course, the end result is performance data on the skills demonstrated in the assignment for a large sample of students. Thus, from a single assignment, faculty have graded the students while course learning outcomes have been assessed, and data have been collected on both program and general education learning outcomes.

Figure 1: Example of Interconnections between Rubrics and Learning Outcomes
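The two uses of a scored rubric described above (a point-based grade for the individual student, and performance-level percentages for assessment) can be sketched in a few lines of Python. This is a minimal illustration only: the rubric is modeled loosely on Table 1, and the point values, scores, and function names are invented, not drawn from PGCC's actual software.

```python
from collections import Counter

# Hypothetical rubric modeled on Table 1: each domain maps performance
# levels to the point values faculty may award within that level.
RUBRIC = {
    "Intro Paragraph": {"Excellent": [8], "Good": [7, 6], "Average": [5, 4, 3],
                        "Below Average": [2], "Unsatisfactory": [1, 0]},
    "Content": {"Excellent": [11, 10, 9], "Good": [8, 7], "Average": [6, 5, 4],
                "Below Average": [3], "Unsatisfactory": [2, 1, 0]},
}

def level_for(domain, points):
    """Translate an awarded point value back to its performance level."""
    for level, values in RUBRIC[domain].items():
        if points in values:
            return level
    raise ValueError(f"{points} is not a valid score for {domain}")

def grade(scores):
    """Grading use: sum of points, and percentage of points possible."""
    total = sum(scores.values())
    possible = sum(max(max(v) for v in RUBRIC[d].values()) for d in scores)
    return total, round(100 * total / possible, 1)

def level_distribution(class_scores, domain):
    """Assessment use: percentage of students at each level for one domain."""
    counts = Counter(level_for(domain, s[domain]) for s in class_scores)
    n = len(class_scores)
    return {level: round(100 * c / n, 1) for level, c in counts.items()}

# One student's scored rubric...
student = {"Intro Paragraph": 5, "Content": 8}
print(grade(student))  # → (13, 68.4)

# ...and a small class section for the assessment view.
section = [student,
           {"Intro Paragraph": 8, "Content": 10},
           {"Intro Paragraph": 2, "Content": 3}]
print(level_distribution(section, "Intro Paragraph"))
```

In All-in-One, these same per-domain scores would additionally roll up, through the outcome mappings stored in the database, to program and general education reporting; nothing extra is entered by the faculty member.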

Inherently connected to the curriculum

Because the All-in-One approach relies on tight connections among desired learning outcomes across the curriculum, it is imperative that courses and programs be examined holistically to ensure that the knowledge and skills obtained in a 100-level course are the knowledge and skills necessary to be ready for the 200-level courses. Furthermore, it must be ensured that the courses students take lead directly toward the attainment of the program’s learning outcomes and the general education learning outcomes. The All-in-One approach is inherently and deeply connected to the curriculum; it works only when connections across the curriculum are clearly defined. Indeed, when implemented, All-in-One provides data on whether courses are sufficiently interconnected to foster student achievement of the program and general education learning outcomes.

Achieving this level of interconnectedness takes time, and it is not a “once-and-done” activity. The PGCC faculty worked very hard to establish an initial set of interconnections, but a number of changes have been made since then and evidence-based changes continue to be made. As such, this assessment process is not simply about achieving an end-goal. It is also about undertaking the journey. The alignment of courses, programs, and general education learning outcomes is continuously being refined through the assessment process. As faculty collaborate to create assignments and rubrics, they engage in conversations about their courses and the connections from assignment to course, program, and general education learning outcomes. In a cycle of continuous engagement, faculty regularly discuss the purpose of their courses and the role of their assignments in building the skills necessary for the next course.
While the benefits of such collaboration are not easily quantified, the collaborative process clearly makes assessment meaningful for faculty and improves their understanding of how their courses and assignments fit into the whole curriculum. One of the adjustments required in the implementation process has been allowing opportunities for departments and faculty to grow at their own pace. As faculty learn more about the assessment model and assessment in general, their efforts will produce better assignments, tighter alignments between the curriculum and expected learning outcomes, and a stronger curriculum, leading to increased student learning and success.

Efficient use of resources

As noted above, the knowledge, skills, and values that students gain at the course, program, and general education levels are all interconnected. At the level of the individual student, learning occurs as a holistic process in which different sets of skills may be acquired concurrently over an extended period of time. Furthermore, all artifacts produced by a student represent multiple skills and knowledge working in concert. Managing the measurement of these abilities through separate processes, therefore, does not make sense. Rather than running three separate measurement systems at the institution, All-in-One reduces the workload by operating within the existing workflow. In the most basic form of All-in-One, faculty grade student work and, at the same time, enter data on discrete student skills into an electronic platform. These data, based on evaluations of students’ culminating assignments, are all that is needed to provide information on course, program, and general education learning across the institution. This means there is no need for other processes that re-evaluate student work through the lens of program learning outcomes and even a third time through the lens of general education learning outcomes.
All of the necessary data are produced by a process in which faculty are already engaged throughout the semester: the evaluation of student work.
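The "enter once, report at every level" logic described in this section can be sketched as follows: faculty record one performance level per student per rubric domain, and course, program, and general education reports are all derived from those same records through the stored outcome mappings. All outcome names and mappings below are invented for illustration; PGCC's actual database and curriculum differ.

```python
from collections import defaultdict

# One-time curriculum mappings (hypothetical names): each rubric domain
# feeds a course outcome, which feeds higher-level outcomes.
DOMAIN_TO_COURSE_OUTCOME = {
    "Intro Paragraph": "ENG101 Outcome 1",
    "Content": "ENG101 Outcome 2",
}
COURSE_TO_HIGHER = {
    "ENG101 Outcome 1": ["Gen Ed: Written Communication"],
    "ENG101 Outcome 2": ["Program: English Outcome 1"],
}

# The only data faculty enter while grading: one performance level
# per student per rubric domain.
records = [
    ("S1", "Intro Paragraph", "Good"),
    ("S1", "Content", "Average"),
    ("S2", "Intro Paragraph", "Below Average"),
    ("S2", "Content", "Excellent"),
]

MEETING = {"Excellent", "Good", "Average"}  # levels counted as meeting expectations

def percent_meeting_by_outcome(records):
    """Derive every reporting level from the same records; nothing is re-scored."""
    tallies = defaultdict(lambda: [0, 0])  # outcome -> [meeting, total]
    for _, domain, level in records:
        course_outcome = DOMAIN_TO_COURSE_OUTCOME[domain]
        for outcome in [course_outcome, *COURSE_TO_HIGHER[course_outcome]]:
            tallies[outcome][1] += 1
            tallies[outcome][0] += level in MEETING
    return {o: round(100 * m / t) for o, (m, t) in tallies.items()}

for outcome, pct in percent_meeting_by_outcome(records).items():
    print(f"{outcome}: {pct}% meeting expectations")
```

Because nothing is re-scored, extending the reporting (say, mapping a course outcome to an additional program outcome) changes only the mapping table, never the grading workflow.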

Faculty involvement and support

Because All-in-One is tightly coupled with promising pedagogical practices, its implementation requires good curricular design, marked by a logical progression in the acquisition of knowledge and skills as called for in the Degree Qualifications Profile (Ewell, 2013; Lumina Foundation, 2011). Thus, faculty conversations center on the “purpose” of their courses, not just on how to assess them. The process becomes more meaningful to faculty, as a result, because it focuses equally on learning in the classroom and measuring that learning.

Furthermore, the success of the All-in-One assessment process depends on faculty participation and control. Faculty need to engage in the assessment process and, at PGCC, they create all course, program, and general education learning outcomes. They also identify the alignments from course learning outcomes to program and general education learning outcomes. In addition, they select the courses to be assessed, create the assignments for the assessment process, and design the rubrics to evaluate the assignments. Thus, the locus of control resides squarely with faculty, and this helps generate faculty support for the process.

In the All-in-One approach, faculty are more likely to support assessment because they decide the criteria for assessing students’ skills, not just at the course level but also at the program and general education level. For example, faculty who teach Psychology at PGCC are expected to evaluate general education skills, including student writing. To this end, Psychology faculty created a rubric reflecting their own criteria for what constitutes good writing, rather than using a general, all-purpose writing rubric.
This approach is attractive to faculty because it is not driven by a vague, universal "gold standard" of writing; instead, it involves a grassroots discussion about discipline-specific conventions and expectations about writing and the challenges students face when moving from one writing environment to another.

The high level of faculty involvement in All-in-One also entails a significant amount of work. At PGCC, we continue to refine our system to provide adequate support to faculty participants. The current structure involves three faculty members from each department, collectively identified as the Departmental Assessment Team (DAT). These faculty members are responsible for shepherding the assessment process within their department and are the first point of contact for questions about assessment. Each academic division has two DAT members who serve as the divisional representatives to the institution-wide Assessment Committee. This offers a second line of support, with faculty in the same larger disciplinary area (e.g., STEM) who can assist other DAT members and also keep their division apprised of changes in procedures. The current All-in-One structure also includes two faculty members called "Assessment Coaches," who receive release time. Providing a third level of support, each Assessment Coach is responsible for up to three academic divisions. The Assessment Coaches are more seasoned assessors who help with everything from improving the wording of a learning outcome to fine-tuning rubrics and using the assessment software. Finally, overseeing the entire process is an administrator, the Director of Outcomes Assessment and Institutional Effectiveness.
While these layers of support require a significant number of individuals, it is important to remember that All-in-One addresses all the outcomes assessment requirements for the entire institution and that the same group of individuals is also charged with continually reviewing the curriculum to ensure its alignment with expected learning outcomes.

Consistent feedback for students and faculty

In the All-in-One approach, faculty feedback to students includes the same data collected in evaluating the student's performance in the course, program, and general education learning outcomes. In other assessment models, in contrast, samples of student work commonly are selected and re-assessed with a different measurement instrument and often by different faculty at a different time. Thus, there is no connection between the feedback received by the student in the course and the evaluation of the student's progress toward the learning outcomes. The problem with such assessment methodologies is that they are so far removed from the classroom that they have little benefit for current students. When different faculty evaluate assignments with a different assessment tool after the end of the semester, the assessment does not allow for intervention or engagement with students currently enrolled, faculty do not learn better grading techniques, and students do not receive improved, real-time feedback.

Unlike other assessment models, the All-in-One approach requires participation of the faculty at large in every stage of the assessment process. As explained above, faculty build the instruments, discuss the learning outcomes, define the connections between their courses and other courses, and draw the links from course outcomes to program and general education outcomes. Furthermore, faculty are engaged in examining the assessment data. All-in-One data are more meaningful to faculty because these data directly reflect what they do in the classroom and their own students' performance. Finally, All-in-One helps develop faculty's assessment expertise as they work on improving their assessment instruments and their classroom teaching based on the data they helped generate.
Robust data set

The data generated by All-in-One can be examined at multiple levels: at the section level, the course level, and across the curriculum. At the section level, faculty can see the mean, median, and mode for the points representing proficiencies their own students have demonstrated through completed assignments. In addition, they can produce reports showing the count and percentage of students scoring in each domain and at every performance level on the rubric.

Data can also be examined at the course level; data from multiple sections of the same course can be aggregated into a single report. These data focus on the count and percentage of students scoring within each performance level. With these aggregate data, a dean or a chair can see the percentage of students who score "Average" or "Above average" and compare it to the percentage of students who score "Below average" or "Unsatisfactory."

The final level of analysis aggregates performance data across the curriculum by specific skills. In these reports, a single skill (e.g., writing) is selected and all rubric domains (rows) measuring this skill within a chosen timeframe are pulled. These data primarily focus on the percentage of students at each performance level. Additionally, these reports show the number of assessments as well as the number of domains that measured writing. Finally, it is possible to see the number of individual students measured as well as the number of individuals measured more than once.

An additional type of report that will be available is a record of individual student performance, showing each individual's performance on specific skills (e.g., writing skill is "Excellent" and critical thinking is "Average"). Although not yet created, this type of report is likely to be a powerful tool for future analyses as well as a useful device to inform students about their progress in the development and mastery of specific skills.
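The section-level reporting described above (central tendency of rubric points, plus counts and percentages by performance level) can be sketched in a few lines. This is a minimal illustration, not PGCC's actual system: the record layout, domain names, performance levels, and point values are all invented for the example.

```python
from statistics import mean, median, mode
from collections import Counter

# Hypothetical layout: one record per scored rubric domain,
# (student_id, domain, performance_level, points).
section_scores = [
    ("s1", "Writing", "Excellent", 10),
    ("s2", "Writing", "Average", 7.5),
    ("s3", "Writing", "Below Average", 6),
    ("s1", "Critical Reasoning", "Good", 8),
    ("s2", "Critical Reasoning", "Average", 7.5),
    ("s3", "Critical Reasoning", "Unsatisfactory", 4),
]

def section_report(scores):
    """Section-level report: central tendency of the points earned, plus the
    percentage of scores at each performance level within each rubric domain."""
    points = [p for (_, _, _, p) in scores]
    central = {
        "mean": round(mean(points), 2),
        "median": median(points),
        "mode": mode(points),
    }
    by_domain = {}
    for _, domain, level, _ in scores:
        by_domain.setdefault(domain, Counter())[level] += 1
    pct = {
        domain: {level: round(100 * n / sum(counts.values()), 1)
                 for level, n in counts.items()}
        for domain, counts in by_domain.items()
    }
    return central, pct

central, pct = section_report(section_scores)
```

The same shape of aggregation extends upward: pooling records from multiple sections of a course gives the course-level report, and filtering on a single skill across all courses gives the curriculum-level report.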
In sum, All-in-One is capable of generating a robust data set, one in which data can be examined at different levels of aggregation, overcoming critical deficits commonly found in other assessment models.

Same student, multiple measures

A major deficit in most assessment systems is the use of a single measurement point for program or general education learning outcomes. Without going into a long discussion of learning and measurement, we briefly note that it is well established that no single measurement of any skill is ever adequate. Indeed, the best measure of a general skill, like writing, comes from measuring the same individual multiple times with different instruments. Thus, by measuring a student's writing in her English courses, psychology course, and major courses, we are more likely to attain an accurate evaluation of her writing abilities. Moreover, we can observe the reliability of measurements across multiple faculty members and across multiple types of writing.

Thus, for example, the focus on MLA writing style in this student's English courses could mean that she is ill prepared for the more technical APA style required in her psychology course. In a case like this, All-in-One is sensitive enough to detect the discrepancies between the two courses. With other measurement systems, an institution may declare its students strong writers based on their performance in English courses alone, missing the fact that the same students are not strong writers within their discipline. Neither group of faculty is incorrect in their evaluation of the student, but the style of writing expected varies from one discipline to another.
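One way to picture this sensitivity is a simple discrepancy check across courses. The sketch below is illustrative only: the course names, 10-point scale, and gap threshold are assumptions, not anything PGCC prescribes.

```python
# Flag students whose scores on one skill diverge widely across courses
# (e.g., MLA-style writing in English vs. APA-style writing in Psychology).
writing_scores = {
    "s1": {"ENG-101": 9, "PSY-203": 9},
    "s2": {"ENG-101": 10, "PSY-203": 5},  # strong in English, weak in APA style
    "s3": {"ENG-101": 7, "PSY-203": 8},
}

def discrepant_students(scores_by_student, gap=3):
    """Return students whose best and worst scores on the skill differ by >= gap."""
    return {
        student: by_course
        for student, by_course in scores_by_student.items()
        if max(by_course.values()) - min(by_course.values()) >= gap
    }

flagged = discrepant_students(writing_scores)
```

A single-measure system that sampled only ENG-101 papers would rate all three of these hypothetical students as strong writers; the cross-course view surfaces the second student's discipline-specific gap.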
What matters most is how identifying and addressing these discrepancies might contribute to strengthening students' ability to make progress and complete a degree. The data created by All-in-One allow for a more fine-grained evaluation of student performance and a deeper understanding of student learning across the curriculum. All-in-One data are sensitive to gaps between what is expected of students and what has been taught in prior coursework. The clear expectation here is that through removal or amelioration of these gaps we can have a strong impact on student graduation and success.

Skill development

Another common deficit in many other assessment systems is that when students are not meeting performance expectations, going back and identifying the root causes of poor performance is very difficult. Assessment systems that establish single points of measurement to evaluate the achievement of specific performance criteria, without tracking skill development over time, do not provide much data on how to intervene or where student performance started to falter. All-in-One is used to measure all sections of a course periodically and, thus, the history of knowledge and skill development for large numbers of students becomes part of the All-in-One data set. Therefore, it is possible to look at students who progress from 100- to 200-level coursework and beyond to ascertain where performance began to falter. This type of analysis makes it possible to identify the significance of learning certain skills early and the impact of specific skill gaps on later course performance. The hope is that these developmental trajectory data will provide significant insight into the skill deficits that impede student performance in the classroom and inform strategies to increase student retention, progress, and graduation.
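The trajectory analysis described above can be sketched as a query over per-level scores: order a student's scores on one skill by course level and report where performance first declines. The data, course levels, and point scale below are hypothetical.

```python
# Sketch of a developmental-trajectory check over a longitudinal data set.
trajectories = {
    "s1": {100: 8, 200: 8.5, 300: 9},   # steady improvement
    "s2": {100: 9, 200: 6, 300: 5.5},   # falters at the 200 level
}

def first_falter(scores_by_level):
    """Return the course level at which the score first drops, or None."""
    levels = sorted(scores_by_level)
    for prev, curr in zip(levels, levels[1:]):
        if scores_by_level[curr] < scores_by_level[prev]:
            return curr
    return None

falter_points = {student: first_falter(t) for student, t in trajectories.items()}
```

Run over many students, a tally of falter points by course level would suggest where in the sequence a skill gap first opens and where an intervention might pay off.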

Implementation issues and lessons learned

In creating All-in-One, PGCC has remained committed to the overall theory of assessment behind the model. However, putting theory into practice has meant a range of adjustments and lessons learned along the way. For All-in-One to work, everyone must participate. This assessment model is based on connections from course learning outcomes to program and general education learning outcomes; thus, the data on program and general education outcomes are collected at the course level. If certain departments or divisions opt out of the process, then the institution will not have a complete picture of student learning. The model requires strong administrative support to ensure that all divisions and departments are actively engaged in the assessment process. This is true of any assessment process, but because what we are doing entails such a strong degree of integration, we continue to revisit and refine our strategies at PGCC for college-wide involvement. With each passing year, the assessment process becomes better understood and more automatic for faculty, but the model still necessitates the presence of individuals dedicated to moving assessment forward.

Another lesson learned has to do with developing the faculty's assessment expertise. As at most institutions, the faculty at PGCC are strong content experts, but they have a broad range of understandings of teaching, learning, and assessment. Since All-in-One relies heavily on faculty to create assessments and make connections between assessments and learning outcomes, a large percentage of PGCC faculty are now regularly engaged in the assessment process. Although the first rounds of assessments have had some obvious shortcomings, the assessment materials keep improving each term.
As more faculty members understand the process and the connections between assessments and curriculum, they bring better questions to the Assessment Coaches, DAT members, and division representatives who support them. The lesson learned is to begin moving forward with the model without dwelling too much on the overall accuracy of the assessment instruments. Professional development for All-in-One implementation needs to be ongoing, while faculty are participating in assessment. While frustrating for some, having faculty actually do assessment has proven the best means of strengthening their assessment expertise.

The most important lesson learned from implementing All-in-One at PGCC is that the first step is the hardest. Before All-in-One, PGCC did not have a comprehensive assessment process and, therefore, not all the faculty were involved in assessment. Thus, prior to the implementation of All-in-One, a range of reasons were given as to why it wouldn't work, why it wasn't meaningful, or why it was too much of a burden. However, once departments began to engage in the process, most became more open to the new approach.

For PGCC, completing the entire All-in-One process for a four-year cycle amounts to all 20 departments on campus collecting data from two to four courses each semester. As part of this work, all departments have to create a new assignment for most courses assessed as well as a new rubric for assessing the assignment. When the four-year schedule of courses to be assessed repeats, the burden on faculty is less because faculty do not have to start from scratch to identify an assignment and create a rubric. Although there is still some resistance at PGCC to the assessment process, each new semester has shown broader acceptance, greater interest in the assessment data, improved assessments and curricula, and, most important of all, improvements in student performance.

Conclusion

The objective at the heart of every institution's mission statement is student success. To achieve this mission, institutions need to develop methods to ascertain not only if students are learning but also how they are developing their skills over time. In the current accountability climate in higher education, moreover, institutions need to develop significantly more advanced means of evaluating and tracking student progress and success. Additionally challenging, they must do all this without overwhelming the institution's faculty or staff.

The All-in-One assessment approach responds to these needs. It tightly connects curriculum with assessment and integrates the measurement of course, program, and general education learning into a single process. Moreover, because it accomplishes this through faculty grading papers in their own courses, it is significantly more streamlined than alternative approaches that assess small numbers of students through a disconnected set of measures.

The outcomes from using All-in-One are multilayered. First, because it is so intricately interwoven with the curriculum, faculty have to regularly examine how a course fits into the program and how the culminating assessment for the course directly demonstrates student knowledge and skills. Second, All-in-One ensures direct correlation between faculty feedback to students via grading and the evaluation of students' capabilities via the assessment process. As a result, faculty become more knowledgeable about assessment, their pedagogy and assessment instruments improve, and the overall assessment process becomes more effective. Finally, the All-in-One methodology creates a robust data set that is potentially one of the richest repositories of data collected on students.
These data are invaluable to the institution, as they demonstrate the value added of each course assessed, they identify "holes" in student learning that can be quickly filled through changes in the classroom, and they track skill development over time so that later student struggles can be minimized by improving coursework earlier in the student's curriculum.

Like all assessment processes, All-in-One must overcome the "Do-we-really-have-to-do-this?" mindset. It is by no means the silver bullet leading all faculty to embrace assessment while solving every assessment problem. Once faculty begin to engage in it, however, All-in-One seems to make sense to most of them and, indeed, most faculty find it useful. They also find it highly efficient because, rather than calling for their involvement with a range of duplicate measurement methods, All-in-One has them use a single measure to gather evidence of course, program, and general education learning. Finally, because the assessment of general education skills has direct and real connections to the skills that faculty address and evaluate in the classroom, faculty and administrators can better understand the data and identify means to improve student performance.

Since its inaugural semester of full implementation, in Spring 2012, All-in-One has collected 2,500 to 4,000 scored rubrics each semester at Prince George's Community College. Every term, the data set grows more robust, as more individual students are assessed more than once in different courses and at different times in their academic career. The entirety of this process, from early performance to later performance, and the evaluation of individual course, program, and general education learning outcomes are captured in a single assessment system, All-in-One.

References

American Association for Higher Education. (1992). Principles of good practice for assessing student learning (AAHE Assessment Forum). Retrieved from http://www.learningoutcomesassessment.org/NILOAarchive.html

Angelo, T. A. (1999). Doing assessment as if learning matters most. AAHE Bulletin, 51(9), 3-6.

Angelo, T. A. (2002). Engaging and supporting faculty in the scholarship of assessment: Guidelines from research and best practice. In T. W. Banta & Associates (Eds.), Building a scholarship of assessment. San Francisco, CA: Jossey-Bass.

Banta, T. W., Griffin, M., Flateby, T. L., & Kahn, S. (2009). Three promising alternatives for assessing college students' knowledge and skills (NILOA Occasional Paper No. 2). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Bengiamin, N. N., & Leimer, C. (2012). SLO-based grading makes assessment an integral part of teaching. Assessment Update, 24(5). doi:10.1002/au.245

Ewell, P. T. (2009). Assessment, accountability, and improvement: Revisiting the tension (NILOA Occasional Paper No. 1). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Ewell, P. T. (2013). The Lumina Degree Qualifications Profile (DQP): Implications for assessment (NILOA Occasional Paper No. 16). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Hamby, E., & Saum, R. (2013). Institutional change: Curriculum alignment with course, program, and institutional outcomes via a LMS [PowerPoint slides]. Retrieved from The Community College Conference on Learning Assessment website at http://wp.valenciacollege.edu/learningassessment/conference-materials/

Huba, M. E., & Freed, J. E. (2000). Learner-centered assessment on college campuses: Shifting the focus from teaching to learning. Needham Heights, MA: Allyn & Bacon.

Hutchings, P. (2009). The new guys in assessment town. Change, 41(3), 26-33.

Kuh, G., & Ikenberry, S. (2009). More than you think, less than we need: Learning outcomes assessment in American higher education. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Lumina Foundation. (2011). The degree qualifications profile. Indianapolis, IN: Author.

McClendon, K., & Eckert, E. (2007). Grades as documentation of SLO achievement: Constructing an outcomes-based grading system [PowerPoint slides]. Retrieved from the California Association for Institutional Research website at http://www.cair.org/conferences/CAIR2007/pres/McClendon-Eckert.pdf

Provezis, S. (2012, June). LaGuardia Community College: Weaving assessment into the institutional fabric (NILOA Examples of Good Assessment Practice). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Salisbury, M. (2012, Dec. 4). Linking outcomes assessment to grading could build faculty support. Inside Higher Ed.

Walvoord, B. E. (2010). Assessment clear and simple: A practical guide for institutions, departments, and general education. San Francisco, CA: Jossey-Bass.

Appendix A

Combining Grading, Course, Program, and General Education Assessment
Example Rubric: RAD 1540, Clinical Radiography II
By Angela Anderson and Jo Beth Linzy

The assignment

The Radiography Program follows a competency-based education plan. Students are required to complete courses in a prescribed order; each course provides a building block for subsequent courses. RAD 1540, Clinical Radiography II, is in the second semester of the program curriculum. It is the second of five clinical education courses in the curriculum. In order to successfully complete each clinical education course, students are required to achieve and maintain clinical competency on a set number of radiographic examinations. Each radiographic examination consists of multiple radiographic projections.

The "Category Competency Evaluation" is considered to be the final examination for RAD 1540. This assessment ensures retention of knowledge and clinical skills. The clinical instructor randomly picks five radiographic projections, corresponding to competencies completed during the semester, for each student to simulate. Students must perform each projection with a minimum score of 90%. Scores below the minimum acceptable percentage disqualify the corresponding competency evaluations. Students must have a minimum overall grade of 90% on this evaluation to pass the course.

About the students

The students in this course are in their second semester of the Radiography Program. Although this is technically the second clinical course in the curriculum, it is the first clinical course in which they spend two days a week for the entire semester in the hospital setting interacting directly with patients. Prior to program admission, they have completed the prerequisite general education courses of English Composition I, Finite Mathematics, and two semesters of Anatomy and Physiology.
Many have also completed the three additional general education courses in the program curriculum: English Composition II, Speech, and General Psychology. Students are admitted to the program in cohorts. Since they see each other every day in class for four semesters, they tend to form very strong bonds with each other and with the program faculty. This bond provides the students with a very strong support system that enables them to help each other succeed in this rigorous program of study. Many of the students remain in contact with each other and the program faculty after graduating.

Faculty feedback about the assessment

When institutional assessment started at Prince George's Community College, the Radiography Program faculty felt relatively comfortable with the process. Teaching in a JRCERT-accredited program, the faculty were familiar with program outcome assessment. They had been using rubrics to assess students in clinical education courses for many years. Still, when the first clinical course was scheduled to be assessed as part of the program's four-year assessment plan, they realized that they still had a lot to learn.

The rubric has evolved since it was originally designed. In the spring of 2013, when the course came up for assessment, it was determined that the scoring criteria on the original rubric were too subjective. The rubric is now designed to be very succinct in the criteria needed to produce an optimal, diagnostic radiograph, providing faculty with a more objective assessment tool. The rubric captures minute errors so that students can refine their skills in a simulation mode to reinforce their learning. Since students do not know ahead of time what projections they will be asked to simulate, there is no preparation per se. This challenges students to be prepared to do any of the projections in a particular exam and encourages them to maintain their competency skills.
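The scoring rule described in the assignment (five simulated projections, each requiring at least 90%, with a 90% overall minimum to pass the course) can be sketched as simple arithmetic. This is an illustrative sketch only: the sample scores are invented, and how a disqualified projection affects the student's competency record is program policy, so it is merely reported here.

```python
# Minimal sketch of the Category Competency Evaluation scoring rule.
PASSING_PCT = 90.0

def evaluate(projection_pcts):
    """projection_pcts: percentage score earned on each simulated projection.

    Returns the overall grade, the indices of projections scoring below the
    minimum (whose corresponding competency evaluations are disqualified),
    and whether the overall 90% threshold was met."""
    disqualified = [i for i, p in enumerate(projection_pcts) if p < PASSING_PCT]
    overall = sum(projection_pcts) / len(projection_pcts)
    passed = overall >= PASSING_PCT
    return overall, disqualified, passed

overall, disqualified, passed = evaluate([95, 92, 90, 96, 91])
```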
The biggest challenge in redesigning the rubric was conforming to the college's template and explaining the program's grading policies to members of the assessment committee. The grading scales used in the Allied Health and Nursing Programs are higher than in other disciplines. The lowest overall percentage grade a student may receive and be assigned a grade of "C" in

17 the Radiography Program is 75%. In addition, the required passing grade on specific clinical assessments, such as the “Clinical Competency Evaluation” is typically in the 90-100% range. This assessment is weighted higher because of the critical nature radiographs play in diagnosing disease processes. An average radiographic image may be classified as diagnostic, but it may be lacking pertinent information which a physician may need to make an accurate diagnosis. The rubric is helpful for the instructor as well as the student. It provides feedback on student performance that prompts dialog between faculty and students about obtaining an optimum radiograph and whether there needs to be remedial instruction on a particular projection/exam. Students do not know in advance which projections they will be asked to simulate. The student must maintain competency on all the examinations covered since the beginning of their education, ensuring that they possess the skills required to graduate from the program and enter the workforce as a competent, entry-level practitioner. Compiled data from the assessment are reviewed by the faculty and used to assess the program’s student learning outcomes and develop an action plan that becomes part of the program’s assessment plan. The rubric Core Excellent Unsatisfactory Course Outcomes Good Average Below Average Competencies Clinical Clinical Clinical Clinical Clinical performance performance performance is performance performance is safe and demonstrates is unsafe and is safe and minimally safe adequately inadequate inadequately adequately with inconsistent demonstrates demonstrates demonstrates knowledge, application of the processes knowledge, application of skills and/or application of skills and/or the processes abilities needed appropriate to the processes appropriate to abilities needed by an entry-level an entry-level appropriate to an entry-level radiographer. an entry-level radiographer. by an entry-level radiographer. 
radiographer. radiographer. Domain Points: 4 Communication 2. Display a 1. Patient Points: 1, 0 Points: 2 Points: 5 Points: 3 Instructions professional Patient Instructions were Patient Patient Patient demeanor not accurate; instructions were instructions were instructions were instructions 1-5 Pts. were accurate, no instructions not quite accurate clear, concise and accurate, but not 5.Communicate accurate. or not clear and but not clear provided. quite clear; able effectively, using and concise; to answer patient concise; difficulty proper medical questions. answering patient able to answer terminology when Pt. could not most patient questions. Pt. was necessary questions. Pt. understand totally unaware did not hear of breathing breathing inst. breathing inst. due to student instructions. explanation. due to student’s voice not loud enough. N/A 2. Proper 1. Perform Critical Points: 5 Points: 0 N/A N/A Reasoning marker radiographic Anatomical N/A Anatomical N/A N/A examinations of the placement markers were markers were not chest, abdomen, placed correctly. used or placement upper and lower 0 or 5 Pts. was inaccurate. extremities 6. Demonstrate proper use of radiation protection devices 17 | National Institute for Learning Outcomes Assessment

18 Core Good Unsatisfactory Average Course Outcomes Below Average Excellent Competencies N/A Ethics N/A Points: 1, 0 1. Perform 3. Radiation Points: 5 N/A radiographic Protection N/A No shielding N/A The patient N/A examinations of the was shielded was used or chest, abdomen, 0 or 5 Pts. correctly. the shielding upper and lower placement was extremities not correct. 6. Demonstrate proper use of radiation protection devices Points: 1, 0 4. Positioning Points: 5 Points: 4 Points: 3 Points: 2 Scientific and 1. Perform aids / radiographic Quantitative Didn’t use two Didn’t use Student Appropriate Positioning aids/ examinations of the accessories Reasoning positioning aids/ attempted to use one necessary positioning accessories were chest, abdomen, not used correctly the aids but they positioning accessories were aids that were upper and lower 1-5 Pts. used correctly. required. or not used. aid that was didn’t stay in extremities place as needed. required. 7. Utilize radiographic equipment and accessories Scientific and 1. Perform Points: 10, 9 Points: 8 Points: 7.5 5. Part Points: 4, 2, 0 Points: 6 Position radiographic Quantitative Three or more No errors in One minor Two errors in Positioning is Reasoning examinations of the not appropriate; error in part part position; part positioning; errors in part chest, abdomen, 0-10 Pts. positioning; positioning; resulting image resulting image resulting image upper and lower would be would be would be non- resulting resulting extremities image would optimal. diagnostic. image would diagnostic. be optimal/ be borderline 6. Demonstrate diagnostic. diagnostic. proper use of radiation protection devices 6. SID/Angle Scientific and Points: 10, 9 Points: 8 Points: 7.5 Points: 6 Points: 4, 2, 0 1. Perform radiographic Quantitative SID was off by The correct SID The correct SID The correct SID SID and angle examinations of the Reasoning 0-10 Pts. 
and angle was were incorrect by and angle was and angle was 5-6 inches and chest, abdomen, used. used; within being off over 6” angle was more used; within upper and lower 3-4” of correct 1-2” of correct on SID and 15 than 10 degrees extremities degrees on angle. SID and 3-5 off. SID and 1-2 degrees of the degrees of the 6. Demonstrate angle. angle. proper use of radiation protection devices Scientific and Points: 8 1. Perform 7. Central Ray Points: 6 Points: 10, 9 Points: 4, 2, 0 Points: 7.5 radiographic Quantitative The central ray The central The central ray The central ray The central ray examinations of the Reasoning 0-10 Pts. was slightly ray was off by location was not was directed was off by more chest, abdomen, off by 1-2 cm; than 3cm; no appropriate; no 2-3 cm; no to the correct upper and lower location with consideration consideration of consideration of consideration of extremities body habitus. body habitus. body habitus. consideration given to body habitus. given to body 6. Demonstrate habitus. proper use of radiation protection devices 18 | National Institute for Learning Outcomes Assessment

8. Alignment (0-10 Pts.)
   Course outcomes: 1. Perform radiographic examinations of the chest, abdomen, upper and lower extremities. 6. Demonstrate proper use of radiation protection devices.
   Core competency: Critical Reasoning.
   Excellent (10, 9): The part/CR/IR alignment was accurate; resulting image would be optimal.
   Good (8): There were minor errors in part/CR/IR alignment; resulting image would be optimal/diagnostic. Off by ¼-½".
   Average (7.5): Errors in part/CR/IR alignment; resulting image would be diagnostic. Off by ½-¾".
   Below Average (6): Major errors in part/CR/IR alignment; resulting image would be borderline diagnostic. Off by 1 inch.
   Unsatisfactory (4, 2, 0): Major errors in part/CR/IR alignment; resulting image would be non-diagnostic. Off by more than 1 inch.

9. Image Receptor (0-10 Pts.)
   Course outcomes: 1. Perform radiographic examinations of the chest, abdomen, upper and lower extremities. 6. Demonstrate proper use of radiation protection devices.
   Core competency: Critical Reasoning.
   Excellent (10, 9): The ideal image receptor used.
   Good (8): The image receptor chosen was acceptable but should have used a smaller/larger cassette for optimal image.
   Average (7.5): The image receptor size was not correct but the final image was diagnostic.
   Below Average (6): The image receptor chosen was incorrect, and the final image was not diagnostic.
   Unsatisfactory (4, 2, 0): The image receptor chosen was inappropriate and the student did not know which size to use. Had to have instructor intervene.

10. Collimation (0-10 Pts.)
   Course outcomes: 1. Perform radiographic examinations of the chest, abdomen, upper and lower extremities. 6. Demonstrate proper use of radiation protection devices.
   Core competencies: Critical Reasoning; Ethics.
   Excellent (10, 9): The maximum amount of collimation was used.
   Good (8): Collimation was good, but field size could have been restricted further. (¼-½ cm more on all 4 sides)
   Average (7.5): Collimation was acceptable, but field size could have been restricted slightly further. (¾-1 cm more on all 4 sides)
   Below Average (6): Some evidence of collimation; not sufficient. Could have done more than 1-2 cm on all four sides.
   Unsatisfactory (4, 2, 0): Little or no evidence of collimation.

11. Anatomy (0-10 Pts.)
   Course outcomes: 3. Select radiographic exposure factors. 4. Critique radiographic images for positioning and image quality.
   Core competency: Scientific and Quantitative Reasoning.
   Excellent (10, 9): 9-10 anatomical parts identified. Correct exposure factors used; "S" number within ideal range.
   Good (8): 8 anatomical parts identified. Correct exposure factors used; "S" number within acceptable range. Off by 5-10%.
   Average (7.5): 6-7 anatomical parts identified. Correct exposure factors used; "S" number within acceptable range, but improvement is needed. Off by 10-25%.
   Below Average (6): 5 anatomical parts identified. Incorrect exposure factors used; "S" number outside of acceptable range. Off by 30-50%.
   Unsatisfactory (4, 2, 0): Less than 5 anatomical parts identified. Incorrect exposure factors used; "S" number grossly outside of acceptable range. Off by more than 50%.
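Each criterion in this clinical rubric carries both a point value toward the course grade and a mapping to a core competency, so a single score sheet can feed two reports at once. The sketch below is a hypothetical illustration, not the program's actual system: the criterion names and competency mapping follow the rubric above, while the student scores and function names are invented.

```python
# Hypothetical sketch: one rubric score sheet drives both grading and
# outcomes assessment. The criterion-to-competency mapping follows the
# rubric above; the student data are invented.

# Each criterion is worth 10 points and maps to one core competency.
CRITERIA = {
    "Central Ray": "Scientific and Quantitative Reasoning",
    "Alignment": "Critical Reasoning",
    "Image Receptor": "Critical Reasoning",
    "Collimation": "Critical Reasoning",
    "Anatomy": "Scientific and Quantitative Reasoning",
}

def grade_percent(scores):
    """Grading view: points earned over points possible, as a percentage."""
    return 100.0 * sum(scores.values()) / (10 * len(scores))

def competency_means(all_students):
    """Assessment view: mean criterion score per core competency."""
    totals, counts = {}, {}
    for scores in all_students:
        for criterion, pts in scores.items():
            comp = CRITERIA[criterion]
            totals[comp] = totals.get(comp, 0) + pts
            counts[comp] = counts.get(comp, 0) + 1
    return {comp: totals[comp] / counts[comp] for comp in totals}

student = {"Central Ray": 9, "Alignment": 8, "Image Receptor": 10,
           "Collimation": 7.5, "Anatomy": 8}
print(grade_percent(student))        # grade for one student: 85.0
print(competency_means([student]))   # mean score per competency
```

The same per-criterion scores that produce an individual grade also aggregate, across students and sections, into competency-level evidence, which is the sense in which one scoring pass serves both purposes.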

Appendix B
Combining Grading, Course, Program, and General Education Assessment
Example Rubric: English 2140 African American Literature
By Anne Showalter

The assignment

To maintain classroom autonomy, the faculty assess English 2140 through either a course-embedded analytical research paper or a final essay exam encompassing all course learning outcomes. The paper requires students to conduct research and analyze various historical periods, themes, and/or literary devices in relation to cultural moments in African American literature. Students are expected to apply at least one critical lens to read and analyze a text. A common rubric (see below) is used to evaluate the paper.

For students

Students are provided with the following instructions:

1. Compare and contrast the interracial relationships or encounters in at least two of the following texts: Butler’s Kindred, Hansberry’s A Raisin in the Sun, Baraka’s Dutchman, Morrison’s “Recitatif”
   a. What are these relationships/encounters characterized by? Fear? Understanding? Sympathy? Love? Hatred? A combination of several of the above? Use specific examples from the texts and be sure to discuss how they speak to the social and cultural climate of their times.

2. Henry Louis Gates, Jr. has repeatedly declared that “if there are 40 million black Americans, then there are 40 million ways to be black.” Explain the ways in which Gates’ statement is in line with the trends in African American Literature since 1975. Then consider how leaders of the Black Arts Movement might respond to this statement.
   a. In your response, be sure to speak to how the cultural climates of the Black Arts Movement and Literature since 1975, respectively, might influence the ways in which the movement/time period frames black identity and representation.

3. In this course, we have examined African American literature from the early 1900s to present.
Imagine that you have been asked to teach a one-day workshop on this period of African American literature to a group of local middle school students. Due to the format of the workshop, you are told that you need to cover exactly three literary works. In a response of at least three paragraphs, identify which three literary works from our course you would teach the students and why. In justifying your selections – in answering the “why” – please include the following for each work that you have selected:
   a. the work’s title and author
   b. a brief explanation of the work’s literary movement/tradition/time period (no more than three sentences)
   c. a specific reason or specific reasons why you consider this work particularly worthy of inclusion in the workshop
   d. a specific Discussion Question on the work that you would pose to the students (Note: A Discussion Question, as the name implies, is an open-ended question that merits a conversation/debate and does not have a “correct” or easily identifiable answer)
   e. what you hope the students would take away from the work

About the students

The students who take this course have completed Composition I and Composition II, as these are pre-requisites for all 2000-level literature. Consequently, most students are nearing the end of their associate’s degree when they register for the course. On more than one occasion, students have shared with me that EGL 2140 is the last class they need to take before graduation/transfer. Because of the pre-requisites, the very earliest a student could take the course would be in their third semester. Most of the students who take EGL 2140 are non-majors, though there may be a few majors in any given class. It is not unusual for students to come into the class having taken an African American Studies or history course.

Faculty feedback about the assessment

Overall, I’ve been pleased with the assignment. I think, with the combination of the three very different types of questions, the assignment is an effective barometer of how successfully the student has grasped the major concepts of the course.

The rubric

1. Constructs and interprets connections between 20th century African-American writers and 18th and 19th century writers. (0-4 Pts.)
   Course outcomes: 1. Construct and interpret the relationship between African-American texts, writers, literary movements, and time periods. 2. Explain how the social and intellectual climate has influenced the themes of recent African-American literature.
   Core competencies: Communication; Critical Reasoning; Culture.
   Excellent (4): Thorough and direct connections between the text and the cultural movement, theme, form, and/or period are made and logically supported.
   Good (3.5): Accurate connections between the text and the cultural movement, theme, form, and/or period are logically made, though support may be thin in some places.
   Average (3): Limited yet logical connections between the text and the cultural movement, theme, form, and/or period are made.
   Below Average (2): Illogical connections between the text and the cultural movement, theme, form, and/or period are made.
   Unsatisfactory (1): Illogical and false connections between the text and the cultural movement, theme, form, and/or period are made.

2. Demonstrates an understanding of how the social and intellectual climates of the 1920s to the present have influenced the African-American literary movement. (0-4 Pts.)
   Course outcome: 2. Explain how the social and intellectual climate has influenced the themes of recent African-American literature.
   Core competencies: Communication; Critical Reasoning; Culture.
   Excellent (4): Thorough understanding of the literary movement or theme and its significance are demonstrated through multiple, significant examples.
   Good (3.5): Accurate understanding of the literary movement or theme and its significance are demonstrated through one significant example.
   Average (3): Limited understanding of the literary movement or theme and its significance are demonstrated; an example is offered, although the significance may be thin.
   Below Average (2): Limited understanding of the literary movement or theme and its significance are demonstrated; one or more examples are offered but irrelevant or mischaracterized.
   Unsatisfactory (1): No understanding of the literary movement or theme or its significance is demonstrated; no examples are offered.

3. Understands and evaluates rhetorical techniques and concepts, demonstrating movement-based critical understanding and appreciation of African-American literature. (0-4 Pts.)
   Course outcome: 3. Apply literary and critical terminology.
   Core competencies: Communication; Critical Reasoning; Culture.
   Excellent (4): Demonstrates a thorough knowledge of rhetorical techniques and critical terminology based on examples given.
   Good (3.5): Demonstrates accurate knowledge of rhetorical techniques and critical terminology based on examples given.
   Average (3): Demonstrates limited yet accurate understanding of rhetorical techniques and critical terminology based on examples given.
   Below Average (2): Demonstrates limited and inaccurate understanding of rhetorical techniques and critical terminology based on examples given.
   Unsatisfactory (1): No references to rhetorical techniques and critical terminology.

4. Identifies, synthesizes, and critiques varied texts. (0-4 Pts.)
   Course outcome: 4. Write analysis-driven essays that critique varied texts and literary movements within analysis-driven language, and a research-style paper that utilizes additional documented sources.
   Core competencies: Communication; Critical Reasoning; Culture; Information Literacy; Ethics.
   Excellent (4): Demonstrates a thorough analysis that identifies, synthesizes, and critiques the discussed literary movements and utilizes logical support.
   Good (3.5): Demonstrates accurate analysis that identifies, synthesizes, and critiques the discussed literary movements, though support may be thin in some places.
   Average (3): Demonstrates a limited yet logical analysis that identifies, synthesizes, and critiques the discussed literary movements; little support is provided.
   Below Average (2): Demonstrates a limited analysis that illogically identifies, synthesizes, and critiques the discussed literary movements; no support is provided.
   Unsatisfactory (1): Demonstrates an analysis that fails to identify, synthesize, and critique the discussed literary movements.

5. Utilizes documented sources responsibly through use of quotations, summaries, and paraphrases. (0-4 Pts.)
   Course outcome: 4. Write analysis-driven essays that critique varied texts and literary movements within analysis-driven language, and a research-style paper that utilizes additional documented sources.
   Core competencies: Communication; Critical Reasoning; Culture; Information Literacy; Ethics.
   Excellent (4): Thoroughly incorporates documented sources through responsible use of quotations, summaries, and paraphrases.
   Good (3.5): Accurately incorporates documented sources through responsible use of quotations, summaries, and paraphrases, despite minor errors in execution.
   Average (3): Incorporates documented sources through responsible use of quotations, summaries, and paraphrases, though there are some errors in approach and execution.
   Below Average (2): Incorporates documented sources through inaccurate use of quotations, summaries, and paraphrases.
   Unsatisfactory (1): Irresponsibly incorporates quotes, summaries, and/or paraphrases.

Totals: 20 = 100%; 17.5 = 87.5%; 15 = 75%; 10 = 50%; 5 = 25%.
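The conversion row follows directly from the rubric's structure: five criteria at 4 points each give a 20-point maximum, so a rubric total converts to a percentage as points divided by 20. A minimal illustration of that arithmetic (the function name is ours, not part of the rubric):

```python
# The rubric's five criteria are worth 4 points each (20 points total),
# so the conversion table above is simply (points / 20) * 100.
def to_percent(points, max_points=20):
    return 100.0 * points / max_points

for pts in (20, 17.5, 15, 10, 5):
    print(pts, "->", to_percent(pts))  # 100.0, 87.5, 75.0, 50.0, 25.0
```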

Appendix C
Combining Grading, Course, Program, and General Education Assessment
Example Rubric: Art 1010, Introduction to Art
By Sarah Wegner, Ken Conley, and John Anderson

The assignment

This assignment is a paper based on direct observation of a work of art in a museum or art gallery. The paper is assigned in the latter half of the semester. Students are required to turn in a rough draft for feedback before the final version of the paper is due. The instructions take the student step by step through the points they need to cover. Below is a summary of the instructions.

For students

Through direct observation of a master work of art, write a paper that addresses the following:

1. The first paragraph should introduce the work of art and state the thesis of the paper.
2. The second paragraph should identify the media, technique and style of the work and briefly explain the historical and cultural context in which the work was created.
3. The third paragraph should provide a detailed description of the work of art.
4. The following paragraphs should identify the elements of art and the principles of design and analyze how they are used in the work.
5. Following the analysis of formal issues, discuss the pictorial content (meaning) of the work.
6. The final paragraph should summarize the main points of the paper and explain how these points have proven or modified the thesis statement given in the first paragraph.

If outside sources are used, appropriate citation and bibliographical references must be included. The instructor will make the student aware of these requirements and how to cite them. The paper should be no fewer than 3 and no greater than 5 pages in length. The format should be: 12 pt font, double spaced, 1” margins. A color reproduction, on a separate page, must accompany the paper and must be identified according to the standards of the instructor.

About the students

This course fulfills a Humanities credit for non-art majors. Students can take this course at any point in their program. Most students taking the course have had little or no exposure to the visual arts. They are faced with learning an entirely new vocabulary and ways of seeing and interpreting visual information. Many students come to this course with low reading and writing proficiency. The museum paper is often the first college-level writing assignment required of these students. Additionally, many students lack solid time management strategies. They do not know when or how to take notes in a lecture class and are shy about asking questions and engaging in the course. As a result, students have difficulty learning the vocabulary of art, reading and comprehending textbook material and assignment directions, and clearly articulating, verbally or in writing, what they observe in a work of art.

Faculty feedback about the assessment

Faculty agree that the museum paper is the most difficult assignment for students in Introduction to Art. This paper has always been part of the course. However, when it was included in the assessment plan, the museum paper became much more structured. Faculty are now required to use the same assignment prompt and grading rubric.

With the new assessment format, the first semester’s results were very disappointing. Students demonstrated an inability to structure their paper, address the required points in coherent paragraphs, and draw conclusions based on their observations. The results of the museum paper assessment for the following semester showed a significant improvement. Instructors found that requiring rough drafts, offering detailed feedback on the rough drafts, and familiarizing students with the step-by-step directions and the grading rubric when the paper was assigned resulted in greater overall success. The most significant factor in increasing student success was the amount of time faculty spent helping the students articulate their ideas on their rough drafts. Faculty also started providing students with strong examples of museum papers in order to deepen their understanding of what was expected.

Faculty also helped students build their writing skills throughout the semester by giving essay questions on tests. Midway through the semester, students were required to create and analyze artwork of their own making. This creative and analytical process increased students’ understanding of the artistic process and helped them relate the vocabulary and concepts of art to personal experience. Faculty found that, for some students, engagement in the course and ability to articulate their observations improved with this creative, hands-on project.

One aspect that remains puzzling is the small but significant number of students who disregard the directions for the paper.
In spite of faculty feedback on the rough draft, some students turned in a final paper that did not address most or all of the required points. Helping these students remains a challenge. One-on-one conferences with these students to go over their rough drafts and requiring a second rough draft are possible solutions being explored.

The rubric

Ethics: if the paper contains plagiarized material, it will receive a score of “0” points.

Domain A: Introductory paragraph identifies the artwork (title, artist, date, medium, country of origin) and gives a thesis statement.
   Course outcomes: 1. Use art terminology...; 4. Observe a masterwork of visual art...
   Core competencies: Communication; Information Literacy.
   Excellent (10): Artwork is fully identified. Thesis statement is present, relates to the content of the paper, and exceeds the scope of course requirements.
   Good (8): Artwork fully identified. Thesis statement is present and relates to the content of the paper.
   Average (7): Artwork fully identified. Thesis statement present but may not fully relate to the content of the paper.
   Below Average (6): Artwork incompletely identified and/or thesis statement attempted but does not relate to the content of the paper.
   Unsatisfactory (5, 3, 0): Artwork not identified or incompletely identified and/or no thesis statement.

Domain B: Identify the prevailing media, technique, and style demonstrated in the work. Give a brief description of the historical and cultural context in which the work was created.
   Course outcomes: 1. Use art terminology...; 2. Describe the historical roles of the visual arts.
   Core competencies: Communication; Information Literacy; Culture.
   Excellent (10): Media, technique, and style demonstrated in the work are identified. 2 or more statements about the historical and cultural context are made and related to the work of art.
   Good (8): Media, technique, and style demonstrated in the work are identified. 1 statement about the historical and cultural context is made and related to the work.
   Average (7): Media, technique, and style demonstrated in the work are identified. 1 statement about the historical and cultural context is made but may or may not be directly related to the artwork.
   Below Average (6): Two out of the three prevailing aspects (media, technique, and style) demonstrated in the work are identified. No statements about the historical and cultural context are made.
   Unsatisfactory (5, 3, 0): Media, technique, and style not identified, or only one is identified. Historical and cultural context not mentioned.

Domain C: Describe what is depicted in the artwork.
   Course outcomes: 1. Use art terminology...; 4. Observe a masterwork of visual art...
   Core competencies: Communication; Information Literacy.
   Excellent (10): 5 or more descriptive statements (“There is a cow.” “There is a blue line.”). No interpretation in place of description.
   Good (8): 4 descriptive statements are present. No interpretation.
   Average (7): 3 descriptive statements. No interpretation.
   Below Average (6): 2 general descriptive statements. Some interpretation, such as “the girl is sad,” may be present.
   Unsatisfactory (5, 3, 0): No descriptive statements, (or) interpretation in place of description is given: “The work shows Aphrodite’s love for Adonis.”

Domain D: Identify the Elements of Art (line, shape, color, texture, value, space) present and analyze how they are used in the work.
   Course outcomes: 1. Use art terminology...; 4. Observe a masterwork of visual art...
   Core competencies: Communication; Information Literacy; Critical Reasoning; Culture.
   Excellent (20, 18): More than 5 elements of Art are addressed and analyzed correctly. Each statement is defended by one or more examples within the context of the artwork.
   Good (17, 16): 4 or 5 elements of Art are addressed and analyzed correctly. Each statement is defended by one example within the context of the artwork.
   Average (15, 14): 3 elements of Art are addressed; the analysis is correct. Each statement is defended by one example within the context of the artwork.
   Below Average (13, 12): 2 or 3 elements of Art discussed; the analysis is incomplete or incorrect in some areas. Few or no examples from the work are used to support statements.
   Unsatisfactory (10, 5, 0): Fewer than 2 elements of Art are mentioned. No attempt at analysis, or analysis is incorrect. Few or no examples from the work are used to support statements.

Domain E: Identify the Principles of Design (unity, variety, emphasis, proportion/scale, balance, and movement) present and analyze how they are used in the work.
   Course outcomes: 1. Use art terminology...; 4. Observe a masterwork of visual art...
   Core competencies: Communication; Information Literacy; Critical Reasoning; Culture.
   Excellent (20, 18): More than 4 Principles of Design are identified and their use is correctly analyzed. Each statement is defended by one or more examples within the context of the artwork.
   Good (17, 16): 4 Principles of Design are identified; analysis of their use is correct. Each statement is defended by one example within the context of the artwork.
   Average (15, 14): 3 Principles of Design are identified; analysis of their use is incomplete or parts are incorrect. Each statement is defended by one example within the context of the artwork.
   Below Average (13, 12): Fewer than 3 Principles of Design are identified; incorrect analysis. Few or no examples from the work are used to support statements.
   Unsatisfactory (10, 5, 0): Principles of Design are mentioned with no attempt at analysis, (or) no attempt to identify and analyze the Principles of Design.

Domain F: Discuss the pictorial content of the work.
   Course outcomes: 1. Use art terminology...; 4. Observe a masterwork of visual art...
   Core competencies: Communication; Critical Reasoning; Culture.
   Excellent (10): Content is identified and analyzed in 3 or more statements. At least one example used to defend each statement. Statements exceed the requirements of the writing assignment.
   Good (8): Content is identified and analyzed in 2 statements. At least one example used to defend each statement.
   Average (7): Content is identified and analyzed in 1 statement. At least one example used to defend the statement.
   Below Average (6): Content is mentioned in 1 or more statements, but no examples are offered to support statements.
   Unsatisfactory (5, 3, 0): Content is never mentioned, or statements about content are attempted but do not make sense. No examples are offered to support statements.

Domain G: Conclusion paragraph.
   Course outcome: 4. Observe a masterwork of visual art...
   Core competencies: Communication; Critical Reasoning.
   Excellent (10): Conclusion is present and relates to the thesis. 4 or more main points of the paper are summarized. Exceeds the scope of course requirements.
   Good (8): Conclusion is present and relates to the thesis. 3 main points are summarized.
   Average (7): Conclusion is present and relates to the thesis. 2 main points of the paper are summarized.
   Below Average (6): Conclusion mostly restates the opening paragraph.
   Unsatisfactory (5, 3, 0): No conclusion; the paper just ends.

Domain H: The paper as a whole: use of vocabulary introduced in Art 1010.
   Course outcome: 1. Use art terminology...
   Core competency: Communication.
   Excellent (5): 15 or more art vocabulary words are used with understanding of their meaning.
   Good (4): 11 to 14 art vocabulary words used with understanding of their meaning.
   Average (3): 7 to 10 art vocabulary words used with understanding of their meaning.
   Below Average (2): 4 to 6 art vocabulary words used with little understanding of their meaning.
   Unsatisfactory (1, 0): Fewer than 4 art vocabulary words are used, with little or no understanding of their meaning.

Domain I: The paper as a whole: grammar, spelling, sentence structure, documentation of sources (if applicable) using the standard required by the instructor, color reproductions of artwork provided using the standard required by the instructor, length and format of paper.
   Course outcomes: 1. Use art terminology...; 4. Observe a masterwork of visual art...
   Core competency: Communication.
   Excellent (5): Rules of grammar, usage, and punctuation are followed; spelling is correct. Language is clear and precise. Documentation of sources present (if applicable) using the standard required by the instructor. Color reproductions provided using the standard required by the instructor. Paper meets specified length and format requirements.
   Good (4): Paper contains few grammatical, punctuation, and spelling errors. Language is clear. Documentation of sources present (if applicable) using the standard required by the instructor. Color reproductions provided using the standard required by the instructor. Paper meets specified length and format requirements.
   Average (3): Rules of grammar, usage, and punctuation are followed; spelling is correct. Language lacks clarity or includes the use of some jargon or conversational tone. Sources documented (if applicable), but the standard required by the instructor is not used. Color reproductions provided. Paper meets specified length and format requirements.
   Below Average (2): Paper contains numerous grammatical, punctuation, and spelling errors. Language uses jargon or conversational tone. Sources not properly documented (if applicable). And/or paper does not meet specified length and/or format requirements. And/or color reproductions not provided.
   Unsatisfactory (1, 0): Student was unable to demonstrate enough knowledge to receive credit for the assignment.
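Taken together, the domain maxima sum to 100 points (Domains A, B, C, F, and G are worth 10 each; D and E are worth 20 each; H and I are worth 5 each), so a paper's rubric total is already a percentage, and the Ethics rule above zeroes a plagiarized paper regardless of domain scores. A small sketch of that scoring logic, with invented sample scores:

```python
# Hypothetical sketch of scoring the Art 1010 museum paper. Domain
# maxima follow the rubric above (A-C, F, G: 10 pts; D, E: 20 pts;
# H, I: 5 pts; 100 pts total); the sample scores are invented.
DOMAIN_MAX = {"A": 10, "B": 10, "C": 10, "D": 20, "E": 20,
              "F": 10, "G": 10, "H": 5, "I": 5}

def paper_score(domain_scores, plagiarized=False):
    """Return the paper's percentage score; plagiarism scores 0."""
    if plagiarized:
        return 0.0
    for domain, pts in domain_scores.items():
        if pts > DOMAIN_MAX[domain]:
            raise ValueError(f"Domain {domain} exceeds its {DOMAIN_MAX[domain]}-point maximum")
    return 100.0 * sum(domain_scores.values()) / sum(DOMAIN_MAX.values())

scores = {"A": 8, "B": 7, "C": 10, "D": 16, "E": 14,
          "F": 8, "G": 7, "H": 4, "I": 4}
print(paper_score(scores))  # 78.0
```

Because each domain also maps to course outcomes and core competencies, the same domain-level scores can be rolled up for program and general education assessment in the same way as the rubrics in the earlier appendices.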

NILOA Mission

NILOA’s primary objective is to discover and disseminate ways that academic programs and institutions can productively use assessment data internally to inform and strengthen undergraduate education, and externally to communicate with policy makers, families and other stakeholders.

NILOA Occasional Paper Series

NILOA Occasional Papers are commissioned to examine contemporary issues that will inform the academic community of the current state-of-the-art of assessing learning outcomes in American higher education. The authors are asked to write for a general audience in order to provide comprehensive, accurate information about how institutions and other organizations can become more proficient at assessing and reporting student learning outcomes for the purposes of improving student learning and responsibly fulfilling expectations for transparency and accountability to policy makers and other external audiences.

Comments and questions about this paper should be sent to [email protected].

NILOA National Advisory Panel

Joseph Alutto, Provost, The Ohio State University
Trudy W. Banta, Professor, Indiana University-Purdue University Indianapolis
Wallace Boston, President and CEO, American Public University System
Molly Corbett Broad, President, American Council on Education
Judith Eaton, President, Council for Higher Education Accreditation
Richard Ekman, President, Council of Independent Colleges
Mildred Garcia, President, California State University - Fullerton
Susan Johnston, Executive Vice President, Association of Governing Boards
Stephen Jordan, President, Metropolitan State University - Denver
Mary Kalantzis, Dean, College of Education, University of Illinois Urbana-Champaign
Paul Lingenfelter, President, State Higher Education Executive Officers
George Mehaffy, Vice President, Academic Leadership and Change, American Association of State Colleges and Universities
Charlene Nunley, Program Director, Doctoral Program in Community College Policy and Administration, University of Maryland University College
Kent Phillippe, Associate Vice President, Research and Student Success, American Association of Community Colleges
Carol Geary Schneider, President, Association of American Colleges and Universities
Randy Swing, Executive Director, Association for Institutional Research
Michael Tanner, Chief Academic Officer/Vice President, Association of Public and Land-grant Universities
Belle Wheelan, President, Southern Association of Colleges and Schools
Ralph Wolff, President, Western Association of Schools and Colleges

Ex-Officio Members

Timothy Reese Cain, Associate Professor, University of Georgia
Peter Ewell, Vice President, National Center for Higher Education Management Systems
Stanley Ikenberry, President Emeritus and Regent Professor, University of Illinois
George Kuh, Director, National Institute for Learning Outcomes Assessment; Adjunct Professor, University of Illinois Urbana-Champaign; Chancellor’s Professor Emeritus, Indiana University
Jillian Kinzie, Senior Scholar, NILOA; Associate Director, Indiana University

About NILOA

• The National Institute for Learning Outcomes Assessment (NILOA) was established in December 2008.
• NILOA is co-located at the University of Illinois and Indiana University.
• The NILOA website contains free assessment resources and can be found at http://www.learningoutcomesassessment.org/.
• The NILOA research team has scanned institutional websites, surveyed chief academic officers, and commissioned a series of occasional papers.
• One of the co-principal NILOA investigators, George Kuh, founded the National Survey of Student Engagement (NSSE).
• The other co-principal investigator for NILOA, Stanley Ikenberry, was president of the University of Illinois from 1979 to 1995 and of the American Council of Education from 1996 to 2001.
• Peter Ewell joined NILOA as a senior scholar in November 2009.

NILOA Staff

Stanley Ikenberry, Co-Principal Investigator
George Kuh, Co-Principal Investigator and Director
Peter Ewell, Senior Scholar
Jillian Kinzie, Senior Scholar; Indiana University Associate Director
Pat Hutchings, Senior Scholar
Timothy Reese Cain, Senior Scholar
Natasha Jankowski, Assistant Director and Research Analyst
Robert Dumas, Research Analyst
Katie Schultz, Research Analyst
Carrie Allen, Research Analyst
Jelena Pokimica, Research Analyst

NILOA Sponsors

Lumina Foundation for Education
The Teagle Foundation
University of Illinois, College of Education

Produced by Creative Services | Public Affairs at the University of Illinois for NILOA. 10.032

For more information, please contact:

National Institute for Learning Outcomes Assessment (NILOA)
University of Illinois at Urbana-Champaign
340 Education Building
Champaign, IL 61820
learningoutcomesassessment.org
[email protected]
Phone: 217.244.2155
Fax: 217.244.5632
