Effective institutional assessment holds the institution accountable for its activities and provides a basis for improvement. The increasing interest in assessment in higher education has led to a proliferation of models intended to enhance teaching and learning. The most effective models and conceptual frameworks depend on interrelationships, focusing on partnerships between faculty and administrators; these collaborations are based on shared goals and priorities and provide mechanisms for devising and implementing coherent institution-wide assessment practices.
During the past ten years, the College has increased its use of assessment, developing written assessment plans prior to every major strategic initiative. The current Assessment Plan, like past plans, provides a thorough and detailed narrative of efforts to determine institutional effectiveness at the College. Likewise, in its Strategic Plan, the College has underscored the importance of assessment; Strategic Principle IV in particular described a commitment to documented quality assurance. In addition, the College has for many years engaged in developing and refining appropriate institutional assessments through the activities of the Office of Institutional Research. At present, there is much evidence of productive collaboration between that Office and many College departments and programs.
Additionally, other examples of assessment activities at the College make clear that educational quality improves when interdisciplinary teams of faculty work with administrators to apply what can be gleaned from institutional research. Furthermore, at various levels within the institution, many units have developed mission statements and set annual goals.
The charges to the Committee studying Institutional Assessment focused on examining the methodologies the College has developed for assessment with respect to their effectiveness, scope, and process of creation, and on analyzing the utilization of data regarding student outcomes. Additional issues considered include an evaluation of the College’s plan for program and faculty evaluation and an analysis of the overall assessment process with respect to its flexibility in responding to and communicating with external constituencies. Strategic Principle IV, which called for the development of an assessment model at the unit level, was also reviewed.
The Committee members conducted interviews with faculty and administrators, studied numerous documents such as the Strategic Plan, the Assessment Plan, and program audits, and worked closely with other Institutional Context Committees.
The Assessment Plan
The Assessment Plan provides an overview of approaches and activities used to understand and describe institutional effectiveness at the College. Key areas for research that are routinely explored through assessment activities include workforce development, student achievement, college transfer, community outreach, and financial and operating effectiveness. Institutional effectiveness is measured within the context of student goals at all levels of the institution: the department, program and classroom levels. Research is also characterized by the use of external benchmarks and longitudinal studies. Wherever possible, existing databases are used for consistency and cross-validation.
Since the last Self Study, approximately 95 formal reports, including the Annual Statistical Compendium, have been published by the Office of Institutional Research.
The principles guiding this institutional research are outlined in the Assessment Plan:
· Effectiveness information is presented in formats that support institutional planning and decision-making efforts.
· Institutional assessment is performed within the context of student goals, abilities and subpopulations, rather than simply applying global performance standards such as graduation and retention rates.
· Effectiveness is measured at all levels, i.e., institution, campus, program, department, and classroom.
· A wide range of internal and external standards is applied to develop an understanding of the College’s position relative to other institutions.
· Effectiveness is viewed longitudinally to detect and track evolving patterns.
· Cross-validation of findings is a high priority.
· Various campus constituents are involved in data collection and interpretation.
· Multiple reporting formats are employed, including web-based formats.
· External scanning is used to continually assess the institution’s information needs.
New technological approaches to enhance the conduct and dissemination of institutional research are continually being devised and implemented. For example, an Institutional Research website has been developed to ensure College-wide access to research findings, and external databases are routinely used to supplement institutional information concerning student outcomes. In addition, new SCT Banner software will provide all staff access to student information from their desktops.
In keeping with the College’s open access policy, assessment research at the College is based on well-developed and accepted student outcome models that elucidate the interaction between the students and the College environment. Assessment research at the College does not treat students as comprising a uniform population, but takes into account the goals and needs of subpopulations to examine the dynamics associated with student learning and other educational outcomes.
One of the priorities of the Office of Institutional Research is to ensure the collaborative nature of data collection. To this end, the activities of the Data Quality Task Force have contributed much to the overall effort at institutional assessment. The 25-member Task Force represents most College divisions and departments. As a preliminary step in the development of the next Strategic Plan, the Task Force participated in analysis of external and internal scanning research compiled by the Office of Institutional Research, and identified key assumptions, current conditions, and institutional concerns that helped to shape strategic planning issues for the College.
In its report on the 1993 Self Study, the Middle States Commission on Higher Education Evaluation Team acknowledged the efforts of the Office of Institutional Research in generating “a large amount of institutional research and outcomes assessment data” (p. 9), and made recommendations for improvement. The College has adopted those recommendations, implementing them in the accomplishments described above.
The Assessment Plan presents a comprehensive overview of the College’s efforts to view student outcomes within the context of institutional effectiveness. Student achievement measures that are routinely assessed include GPA, successful completion of course sequences, certificate and associate degree completion, short-term persistence rates, and student assessment of goal completion and personal growth. Transfer rates, transfer student performance at other institutions, employment outcomes, and student assessment of transfer and career preparation are also examined. Community outreach measures include indicators of responsiveness to community needs, participation rates in the service area, and the economic impact of the College on the City and region. Operational effectiveness includes measures in the areas of cost efficiency and resource usage.
These outcomes are viewed longitudinally by tracking students from their initial enrollment at the College until as long as nine months to a year after departure.
Decentralized assessments of a formative nature add to the picture of institutional effectiveness crafted by the Office of Institutional Research. Large-scale, ongoing student assessment with respect to program goals is found in career programs concerned with external accreditation. In addition, since the last Middle States evaluation, several programs have begun to examine student outcomes data in order to improve program performance. Qualitative methods such as focus groups, along with measures relating to non-cognitive student outcomes at the classroom level, have been used in several developmental programs and elsewhere to guide decision-making at the program level.
The Strategic Plan
During the last strategic planning effort, great attention was paid to assessment; indeed, assessment practices were to be embedded in all facets of College activity. A culture of assessment was envisioned in which training and development would be provided to key personnel and then transmitted gradually throughout the College community, creating the fertile ground from which assessment practices could comprehensively take root. In order to shape coherent attitudes towards assessment throughout all levels of the College, an important aspect of the training was to center on the vital interconnection of mission, planning, and assessment.
To anchor this initiative, systematic unit reviews were projected. The recent review of Facilities Management provides an example of this process. Another response to this planning principle was the development and continued refinement of a set of institutionally-sanctioned effectiveness indicators to demonstrate effectiveness and for use in College planning efforts. Currently, as a direct result of the College’s most recent strategic planning, commendable pockets of assessment may be found throughout the College.
In the past ten years much effort has been focused on devising a workable academic program audit process, resulting in a number of improvements in the College’s program audit procedure. Issues addressed include the effective implementation of a College-wide audit model, and oversight and facilitation of audit activities.
The ability of auditing to revitalize a program is illustrated by the Music program. An audit brought to light several major problems, including drastically reduced enrollment. Through the internal and external scanning processes embedded in the program audit model, department faculty were able to bring their curricular offerings up to date, thereby attracting a new population of student applicants that was largely untapped by other area institutions.
The current program audit model demands a well-thought-out program rationale, and provides pathways for assessing the strengths of faculty, curricula, and resources. It also allows for review of program-specific student outcomes and services, and findings are supported by research data, including surveys of current and former students and occupational outlook information. Budget data is also assessed.
Recently, the Office of Academic Affairs centralized responsibility for writing program audits, with the support and input of department faculty. This represents a restructuring of the previous process, in which faculty took primary responsibility for writing under the guidance of a facilitator, and reports were submitted to Academic Affairs for review and approval. Under the prior procedure, the process frequently took an extended amount of time, duplicated efforts expended for external program accreditation, and left some programs never audited. The new procedure brings the audit process under the responsibility of the Assistant to the Vice President for Academic Affairs and includes a tracking database to ensure that audits are completed in a timely manner. Additionally, the Academic Program Effectiveness Report contains Performance Measures and provides assessment information for the over 80 academic programs at the College. These program-level Performance Measures are being expanded by the Office of Institutional Research.
In addition to audits, the Curriculum Facilitation Team (CFT) offers College programs thorough and collaborative development and revision processes. Consistency with the College Mission and a clearly articulated need for the program within the College are by-products of these processes. Courses within programs are also documented and revised through the CFT, with the aim of fostering curricular consistency and continuing refinement of practice over time.
Ongoing assessment for renewal and improvement is also needed, and may be provided by supplementing audits with a research and development committee model. The effectiveness of this model is illustrated by the College Achievement Partnership (CAP), an extensive combination of courses and support services for students who need to strengthen their skills in English and Mathematics. After a Middle States Self Study recommendation in 1990 to improve assessment activities, the CAP program began a process of self study and data analysis (see IR Reports #84, 93, 95, 98, and 103). These assessment efforts were designed to evaluate the program’s mission effectiveness. This ongoing effort has recently been given a strong focus through the activities of the Developmental Education Research and Development Committee, an interdisciplinary group whose collaborative work with the Office of Institutional Research has produced a number of important recommendations for improving services within the CAP program. In conclusion, the program audit system at the College has improved dramatically in the past ten years, and other avenues for ongoing assessment are utilized to augment the picture of institutional effectiveness.
Currently there are several efforts underway at the College to make faculty evaluation more accessible, predictable, consistent, and meaningful. Departments are devising and implementing plans for annual evaluation of both full- and part-time faculty, utilizing guidelines to ensure that the plans are consistent and clear. Many of these plans include an intensive process for pre-tenured faculty, which may include interviews, peer observations, and written self-evaluation. Plan development also includes processes for the assessment of tenured faculty. Each semester, all faculty are also required to administer the College-wide Student Evaluation of Faculty Teaching Surveys.
The faculty promotion process requires the preparation of a dossier which includes self-evaluation as well as peer evaluation. Recent incentives to the promotion process, including increased remuneration, may encourage more faculty to apply for promotion and thereby undertake extensive self-assessment.
Practices such as Teaching Circles, mentoring, and practitioner research are in use throughout the College; responses to the Faculty/Staff Survey conducted for this Self Study indicate that more than half the teachers who responded reevaluated their teaching based on information derived from direct student feedback and/or faculty-led workshops. The following examples of formative evaluation are currently taking place within various departments:
· Seventeen Teaching Circles are presently active in the English Department, bringing together part-time, pre-tenured, and tenured faculty in collaborations dedicated to ongoing self and peer evaluation.
· A formal mentoring program was recently implemented in the English Department, pairing full-time faculty with newly hired part-time faculty.
· While not pervasive, practitioner and action research has been ongoing across the College for many years; evidence that faculty engage in course improvement by constructing their own instruments for student evaluation of teaching can be found in many of the course documents facilitated by the CFT, as well as in the Faculty/Staff Survey conducted for this Self Study.
The College established a Professional Development Plan and timetable for 2002-2003 which spells out the organizational responsibilities for professional development, and establishes a Professional Development Advisory Committee. The objectives of the Plan include an examination of existing policies and procedures related to professional development, a review of professional development needs at the College, and a study of best practices. In April 2003, the Committee created a website that posts ongoing College-wide professional development activities. Additionally, the College has expanded its professional development activities, which now encompass two weeks, one at the beginning of each of the Fall and Spring semesters, as well as two days of activities during the academic year. These activities will be evaluated through the Professional Development Plan, which will be reviewed on an annual basis, and a report on professional development activities will be available annually to the College community on the web.
In the area of staff development, a formal process has recently been put into place to provide career ladders for advancement. Partially in response to new language in the Collective Bargaining Agreement, the project creates opportunities for advancement for classified staff. The project, which was piloted in Student Affairs and Facilities Management and is now College-wide, focuses on movement in grades and on the development of skill sets tied to a training and development program that an eligible employee would pursue.
The College’s external constituencies consist of numerous people and agencies throughout the city and region, including city, state, and federal elected officials, city administration, city ward leaders, churches/clergy, Regional Center advisory boards, Philadelphia Chamber of Commerce, news media, community activists, community clubs, community groups, community development corporations, program advisory committees, and the business community at large. This listing is only a short sample of external constituencies that play a part in the College.
The College’s assessment data and activities are disseminated in several ways to external constituencies. In addition to extensive website postings, the annual President’s Report serves to communicate assessment findings and showcase important College initiatives. Transcripts, the College’s newsletter, is another source for information, regularly covering campus events. In addition, Education That Works! is a newsletter distributed to community leaders and includes information about the College’s academic achievements and new programs. The College Report Card, The Impact of the Allied Health Programs on the Philadelphia Region, The Economic Impact of Community College of Philadelphia, and many College websites are also important vehicles for communicating with external constituencies.
Much activity within the College relies on partnerships with external agencies and institutions and evidences effective exchange between the College and external constituencies on issues related to institutional assessment. Major examples of this include ongoing relationships with state and local funders. In addition, program advisory committees are influential in shaping the direction of career programs. Outreach to public schools brings college and high school teachers into a dialogue about shared goals, practices, and values. Articulation agreements with baccalaureate institutions play an important part in shaping both curricula and courses within them. Relationships with business and industry have direct impact on the way the College functions.
The College engages in numerous systematic, effective and ongoing institutional assessment activities, and its assessment efforts strive to focus on multiple dimensions of institutional effectiveness, including student learning outcomes. The results of its assessment activities have been effective in stimulating renewal in several significant ways. Although the overall picture regarding institutional assessment is strong, the College should extend its efforts to ensure that assessment activities are pervasive throughout the institution.
For example, while the past ten years have evidenced considerable progress in the development and utilization of institutional assessment methods, some aspects of the College’s approach to assessment could improve further. One area of concern is the way assessment is linked to mission statements and planning. Goal setting, particularly at the unit level, sometimes takes place apart from area mission statements. Additionally, not all units have developed mission statements, and the relationship between College and unit mission statements is not always clear.
Likewise, the current efforts regarding student outcome assessments could be more thoroughly and consistently adopted throughout the institution. Ongoing assessment of student outcomes at the curricular and classroom level is not a universal feature of the assessment picture. Assessment methods and measures employed in evaluating career programs are less likely to be used routinely in transfer programs such as Liberal Arts or Culture, Science, and Technology. Classroom-based research usually remains within the specific confines of a particular instructor’s course; at best, it may be used to guide the practices of like-minded faculty, and could be effectively folded into a larger context. Although written course goals and course competencies provide a thoughtful approach to the articulation of goals and may dovetail with the larger purposes of a particular program, this process should be taken further to ensure that actual practice conforms to the guidelines set forth in course documents.
The most significant issue with respect to student outcomes assessment lies in the area of general education. Assessment of student outcomes is adversely affected by the College’s current lack of clear General Education guidelines. In a recent survey of faculty and staff conducted for this Self Study, over half the respondents felt that the current Standing Committee structure provided inadequate oversight of General Education. Regardless of the core issue here (see discussion in the section on Standard 12: General Education), it is fair to say that assessment of the General Education experience for all students at the College has been hampered by a lack of clear-cut implementation of goals.
With respect to the delivery of program audits, implementation has become far more systematic than in 1993, but some problems remain. Career programs receive the most regular scrutiny, both through internal audits and accountability to outside agencies, but disciplines are currently not included in the formal audit process. Assessment processes overseen by Deans and Department Heads may be as appropriate as formal audits in these cases, but a considered look at the adequacy and consistency of such activities should be undertaken.
Likewise, although an array of professional development initiatives exists at the College, interviews conducted with members of the College community revealed little evidence of sustained formal assessment of professional development needs or of professional development programs. Efforts to survey the staff and faculty have not resulted in significant response, and efforts to assess the effectiveness of professional development activities, such as the Professional Development Weeks that take place at the start of each semester, had been inconsistent until recent semesters.
Staff and professional development activities have increased in scope and frequency, and continued improvement in this area would result from further coordination and evaluation. In recent years, for example, there have been several reviews of staff and professional development at the College. These reviews found that, while a large number of professional development programs and activities have been available, the efforts seem fragmented and episodic rather than coordinated, planned, and strategic; likewise, technology training at the College has historically been fragmented, inconsistent, and uncoordinated. The reviews concluded that the use of assessment in planning professional development activities has been hampered by the lack of a centralized organizational structure overseeing such activities, and recommended the formation of a centralized office with responsibility for coordinating all staff and faculty professional development. Thoughtful consideration of these recommendations should be undertaken. The involvement of part-time faculty in the life of the College, particularly in professional development activities, also merits focus. Currently there is no College-wide incorporation of part-time faculty into professional development activities. Part-time faculty are invited to attend professional development activities but, pursuant to the Collective Bargaining Agreement, cannot be required to attend such activities without compensation, and budgetary concerns have hampered the College’s ability to make such financial incentives available.
In the area of faculty evaluation, the past ten years have seen considerable progress as departments develop evaluation plans. Concern remains, however, regarding whether these efforts represent a cohesive strategy guided by a shared sense of institutional purpose.
With respect to the communication of its assessment activities to external constituencies, the College undertakes numerous and effective activities. The assessment of the actual partnerships with external constituencies, however, could be improved. The mechanisms for evaluating the appropriateness of these contacts, as well as their viability and usefulness, are often opaque. Day-to-day planning may be propelled by sudden contingencies and market-driven opportunities; in the absence of a genuine marketing plan, it is frequently difficult to evaluate the effectiveness of these partnerships, especially in the long term.
In summary, assessment of College activities and services should provide the College community with a clear view of institutional effectiveness in light of its mission. To that end, the following suggestions are offered:
· Improve use of data in institutional decision-making at all levels.
· Devise a General Education outcomes model and assessment plan which reflects a viable approach to General Education for all students.
· Audit programs that have not yet been evaluated and give them priority in the audit process.
· Revise the Student Evaluation of Faculty/Teaching Survey and develop meaningful methods of assessment to supplant the current instrument.
· Develop a systematic needs assessment and approach for assessing outcomes of partnerships with outside constituencies.
· Continue assessment of efforts in classified staff development. The Career Ladders program should be the beginning of a process that addresses problems of morale and performance noted in the 1993 Institutional Self Study and the 1999 Periodic Review Report.
· Foster an awareness of the relationship between mission statements and goal-setting and better communicate this connection throughout the institution. If units are expected to create effective mission/goals statements that support a College-wide model of outcomes assessment, a faculty/staff dialogue about the model must ensue.
· Promote a collaborative assessment model. Program faculty and unit leaders should be encouraged to work collaboratively, in a sustained and organized manner, with groups of academic peers and administrators in an ongoing effort to analyze outcomes data and make recommendations for program or project improvements.
· Improve communication about assessment activities.
· Continue assessment of professional development activities with a particular focus on determining better ways to address the needs of part-time faculty.
· IR Report #77 – A Review of the Higher Education Literature Related to Models of Student Outcomes (6/94)
· IR Report #95 – An Evaluation of the Achievement of the
· IR Report #103 – Developmental Education Outcomes – Three Years After the Developmental Education Task Force Report (4/99)
· IR Report #104 – Highlight of Institutional Research Findings from the Last Five Years (5/99)
· IR Report #111 – The Economic Impact of Community College of Philadelphia
· IR Report #116 – Student Preferences for Alternative Course Delivery Options (11/00)
· IR Report #117 – Impact of
· IR Report #119 – Institutional Effectiveness 2000: A College Report Card (1/01)
· IR Report #120 – Student Attrition at CCP – When Students Leave, Why They Leave, and Their Academic Success at Departure (6/01)
· IR Report #123 – Distance Education at the
· IR Report #125 – Institutional Effectiveness 2001 – A College Report Card (3/02)
· IR Report #126 – Career Outcomes for 2001: Career Program Graduates (10/02)
· IR Report #128 – The Progress of 2001 Graduates of
· IR Report #129 – Institutional Effectiveness 2002 - A College Report Card (1/03)
· IR Report #130A – Responses to Middle States Self Study Current Student Questionnaire (4/03)
· IR Report #130B – Responses to Middle States Self Study Faculty/Staff Questionnaire (4/03)
A. Institutional Self Study (10/93)
B. Report to the Faculty, Administration, Trustees, and
C. Self Study Design (
Office of the Vice President for Academic Affairs, Community College of Philadelphia
D. Office of the Vice President for Planning and Finance, 2000-2004 Strategic Plan and Progress Reports
E. Mathematics, Science and Health Careers, Memorandum, Programs with Specialized Accreditation and Other External Recognition,
F. Office of the Vice President for Planning and Finance, Assessment Plan: An Overview of Efforts to Understand Institutional Effectiveness at the College
G. Office of the Vice President for Academic Affairs, Program Review (Audit) Model (2002)
H. Office of the Vice President for Planning and Finance, Statistical Compendium (2000-2001)
I. Office of the President, Self Study Design (10/02)
J. Office of the Vice President for Academic Affairs, Curriculum Development Services: Format for Program Revision (2001)
K. Office of the Vice President for Academic Affairs, Curriculum Facilitating Team, Guidelines for Course Development and Revision at Community College of Philadelphia (11/6/02)
L. Office of the Vice President for Academic Affairs, Curriculum Development Services: Format for Program Development (1999)
M. Middle States Commission on Higher Education. Documents distributed at the Annual Conference of the MSCHE (12/02)
N. Office of Human Resources, Classified Employees (Bargaining Unit) Proposal (
O. Office of Human Resources, Internal Classified Employee Job Posting – Sample (
P. Department of Office Administration, 2002-2003 Faculty Evaluation Plan
Q. Office of the Vice President for Academic Affairs, 2002-2003 Academic Affairs Goals (12/02)
R. Office of the Vice President for Academic Affairs, Department Head Evaluation Revision (2001)
S. Office of the Vice President for Academic Affairs, Memorandum, Timeline for Evaluation of First, Second & Third-Year Untenured Faculty, Visiting Lecturers and Adjunct Faculty (9/02)
T. Department of Mathematics, Self Evaluation Plan – DRAFT
U. Department of Mathematics, Summary Credit Course Evaluation
· MATH 171
· MATH 172
· MATH 271
V. Library Services Department, Departmental Criteria for Faculty Evaluation
W. Department of History & Philosophy, Faculty Evaluation System
X. Office of the Vice President for Academic Affairs, 2002-2003 Professional Development Plan (11/02)
Z. Office of the Vice President for Academic Affairs, Music Curriculum Audit (Spring 1995)
AA. Office of the Vice President for Academic Affairs, Academic Programs Audit Schedule (10/02)
BB. Office of the President, 2002 President’s Report; memorandum from the Office of Communications dated
CC. Office of Planning and Finance, Information Technology Services, Lists of members who serve on various curriculum advisory committees (11/02)
DD. Office of the Vice President for Planning and Finance, Information Technology Services, SCT Banner Request for Proposal #9121 (
· New Student Goal Statement
· ESL Student Personal Data and Goal Statement Form
EE. Regional Centers Administration, Guidelines
FF. Office of Human Resources, Career Ladder Position Description – Sample (
GG. Curriculum Development Services, Curriculum Facilitation Team – Mission Statement (
HH. ESL Curriculum Committee, Proposed Program Revision – English as a Second Language (
JJ. Office of the Vice President for Academic Affairs. See the following materials from Spring 2003 Professional Development Week (1/03):
· Division of Educational Support Services, Intimations of Assessment, a presentation by Tom Ott
· “Minute Paper” Comments, presentation by Linda Suskie
· Questionnaire, Questions for Roundtable Discussions
· Some Useful References on Assessment by Linda Suskie
· Questionnaire, Reflections on Assessment
· Enhancing Teaching and Learning Through Assessment by Linda Suskie
KK. Office of the Vice President for Planning and Finance, Overview of Current Institutional Research Activities and Effort to Track Student Outcomes and Institutional Effectiveness
LL. Department of Mathematics, Memorandum, Advertisement for PT Faculty (
MM. Office of the Vice President for Academic Affairs, Memorandum, Student Evaluation of Faculty Teaching (FL 01)
NN. Educational Support Services, Memorandum, Plans for C-level C.A.P. (