STANDARD 14: ASSESSMENT OF STUDENT LEARNING

Standard for Accreditation

Assessment of student learning demonstrates that the institution’s students have knowledge, skills, and competencies consistent with institutional goals and that students at graduation have achieved appropriate higher education goals.

Introduction

            To demonstrate accountability to its various constituents, a quality educational institution must be committed to assessing the outcomes of student learning and to using the results of that assessment to improve the educational experiences of its students.  As increasing emphasis is placed on outcomes assessment, educational institutions are struggling to identify the most meaningful ways to study student learning, document the results of their efforts, and use this information for continuous improvement.

The Community College of Philadelphia has a long history of maintaining rich databases to support institutional inquiry into its effectiveness.  Since the last Self Study, a significant amount of research has focused on assessing student outcomes, and the results of this research are increasingly accessible to faculty and staff.  Annual reports, such as Institutional Effectiveness: A College Report Card, and the Statistical Compendium also provide valuable information on the College’s strengths and identify areas where improvement is needed.

Methodology

The charge to the Committee studying the Assessment of Student Learning focused on an analysis of the following four issues: the process of defining expectations for student learning; the congruence of student learning outcomes at the course, program and institutional levels; the effectiveness of methods and/or procedures used for collecting student learning outcomes information; and the extent to which student learning outcomes influence the improvement of programs and courses.

A number of documents pertinent to student outcomes, including the College’s Assessment Plan, Statement of Mission, 2000-2004 Strategic Plan and Institutional Research reports, helped inform the Committee’s study.  Several programs from the College’s array of career and transfer program offerings were studied: Art, Automotive Technology, Dental Assisting Hygiene, Electronic Engineering Technology, Music, and Paralegal Studies.  Syllabi, course and program documents, as well as recently completed academic audits, were examined and provided useful information.  Three disciplines were also selected for review because large numbers of students from across the institution enroll in their courses to help meet General Education requirements.  Syllabi and course documents from English, mathematics and psychology were examined.  To understand perspectives beyond the programs and disciplines listed, the Committee interviewed, individually or in focus groups, academic Department Heads, Curriculum Coordinators, faculty, administrative personnel and students.  Results of the Faculty/Staff Survey and the 2003 Middle States Current Student Survey, which were conducted specifically for the present Middle States Self Study, provided additional data for consideration.

Strengths

Course Level

The Community College of Philadelphia has a well-defined process for articulating expectations for student learning.  An innovative approach to curriculum development was initiated in 1994.  Under the leadership of the Coordinator of Curriculum Development (a faculty member with expertise in curriculum theory), a team of faculty members knowledgeable about curricular issues and experienced in curriculum development, the Curriculum Facilitation Team, serves as peer facilitators for colleagues as they develop or revise courses and programs.

While faculty in the various departments are ultimately responsible for defining course objectives and design, the curriculum facilitation process offers a supportive environment for this activity.  Currently, the College has two approved models for course development and revision.  The activities-centered model focuses on the intellectual environment to be developed in a course and the intellectual processes that characterize it; emphasis is placed on coherence between the course rationale and its classroom activities.  Guiding questions such as “How does [the activity] encourage students to consider multiple perspectives?” and “How does the activity help students frame and solve problems?” help faculty define the expected learning outcomes.  A second model, the objectives-centered model, may be used when faculty prefer to organize a large body of knowledge and define expected learning outcomes by stated objectives.  Both models provide significant guidance to course writers and require that they supply detailed descriptions of the need for the course and its potential placement within a curriculum – thus pointing out the need for congruence of learning outcomes at the course and program levels.

While the vast majority of the College’s course offerings were well substantiated, either through the process described above or by certification through the Dimensions process described in the previous section on General Education, formal documentation was needed for the remainder.  Regulations promulgated by the Commonwealth Board of Education (Pennsylvania Code, Title 22. Education, Part XVI. Standards, Chapter 335. Community College Courses; adopted June 20, 1997) define expectations for courses and programs offered by community colleges.  These regulations specify the need for each course to have stated learning goals, a planned sequence of topics or learning activities designed to help students achieve the learning outcomes, and consistency with the College’s Mission.  In addition, a course must be developed, approved and offered in accordance with established institutional policies and procedures.  Any course for which these requirements could not be substantiated underwent formal documentation using the approved curriculum development process.  During the phase-in period from 1998-2002, 168 existing courses were documented and approved in accordance with College policy.  This review process helped many faculty identify courses that needed revision in order to remain relevant and enhance quality.  At present, 100% of the courses being offered have been documented.  The documents are viewed as useful; many Department Heads use them in orienting new faculty to departmental expectations for student learning.  During the past year more than 300 copies of these documents were distributed throughout the institution.

As described in the Assessment Plan, course assessment is built into ongoing processes.  Both course development models, activities-centered and objectives-centered, explicitly direct course writers to discuss how student learning outcomes information will be used to determine the effectiveness of the course design.  Faculty are encouraged to include strategies for using direct measures of student learning, such as tests and assignments, to gauge a course’s impact.  The models suggest a list of possible tools (besides those used to determine grades) that faculty can use to evaluate student learning; these include portfolios, student course evaluations, pass/fail rates, employer feedback and advisory committee feedback based on review of course materials.

A review of course documents and syllabi in the various programs and disciplines listed above revealed that a variety of assessment methods are used.  Interviews with Department Heads, Curriculum Coordinators and faculty also revealed that direct methods of assessing student learning (e.g., grades, examinations, standardized tests, term papers, capstone projects, oral presentations, and portfolios) are used for the purpose of improving student learning outcomes.  It is also clear from these interviews that the use of student learning outcomes assessment to evaluate course effectiveness is ongoing and supported by various formal and informal professional discussions, though it is largely undocumented and unsystematic.

Interviews with various faculty also indicated that indirect methods of assessing student learning, specifically student surveys, are helpful tools for improving their courses; the Behavioral Sciences Department, for example, implements its own student survey and uses the collected data to improve classroom instruction.  The English Department recently developed a pilot study using a student survey for English 102 (English Composition II).

The Faculty/Staff Survey indicates that most faculty constantly reexamine their classroom strategies.  Furthermore, the Survey indicated that the dominant reason faculty make changes to their courses is the assessment of unsatisfactory outcomes.  In addition to the course-level assessment that is initiated by faculty in their classrooms, faculty collaborate with the Office of Institutional Research to explore course-based issues.  For example, English faculty have asked the Office of Institutional Research to help determine whether students performed adequately in English 102 following the completion of English 101.  Mathematics faculty have examined the sequence of math courses and the adequacy of placement test methods.  The Director of Developmental Education has tracked developmental students to look at their success in subsequent developmental courses and college-level courses (see IR Reports #84, 93, 95, 98, and 103).  Additionally, comparisons of student performance have been made between 10-week and 14-week courses, and student preferences for alternative course delivery options have been explored (see IR Report #116).  An ongoing assessment strategy for distance education courses has been in place for the last several years as this type of course offering has increased (see Effectiveness Indicators #40-43 in IR Report #129).

In compliance with the State regulations described earlier, every course must include a plan for evaluation and must be evaluated at least every five years.  Department Heads use data from individual instructor evaluations of the course to compile a five-year summary evaluation.  This evaluation asks questions such as:

·        Is the course consistent with the College Mission?

·        Are assigned credits based on nationally or regionally accepted practices or guidelines?

·        Are the course’s stated learning outcomes necessary to enable students to attain the essential knowledge and skills embodied in the program’s educational objectives?

·        Do the course materials reflect knowledge in the program’s field of study?

·        Is the course comparable to similar courses that are generally accepted for transfer to accredited baccalaureate institutions (if designed for transfer)?

Each of the evaluations must be reviewed by the appropriate Dean and the Vice President for Academic Affairs.  The Coordinator of Curriculum Development also reviews completion of evaluations and documents.  To date, evaluations of more than 80 percent of the courses requiring five-year summaries have been completed, and progress in this area continues.  Existing courses that are specifically part of a program are also evaluated in the program auditing process (see below for additional details).

Program Level

At the programmatic level, approved models for development and revision guide faculty writers to provide a strong rationale for proposed programs.  To do so, writers analyze changes in academic disciplines, requirements of accrediting agencies where appropriate, and developments along the educational continuum – at secondary schools or baccalaureate institutions.  The models also direct writers to explain how the program is internally congruent: how students grow academically, how advanced courses build on foundational courses, and how courses deal with issues of depth and comprehensiveness.  The models further direct writers to consider institutional congruence – how the program fits with the College’s Mission and supports the Strategic Plan.

Faculty members and Curriculum Coordinators indicated during interviews that they consider articulation agreements with transfer institutions, the standards of external accrediting agencies, and current developments within their disciplines when defining expectations for student learning, thus confirming that practice conforms to the defined expectations.  For example, faculty in the Art and Education curricula work closely with transfer institutions to design programs that allow students to transfer seamlessly.  Currently, 91 articulation agreements with 35 different institutions of higher education are in place.

The Allied Health and Nursing Programs, as well as the Paralegal Studies and Automotive Technology Programs, are currently accredited or approved by the appropriate external accrediting bodies.  To achieve this status, peer reviewers validate that the curriculum allows students to achieve the standards of the relevant discipline.  Career programs not specifically accountable to external agencies also identify goals for student learning that are congruent with expectations for professional practice.  For example, the Behavioral Health/Human Services curriculum considered the requirements necessary to become a Certified Addictions Counselor when recently revising courses.  Career curricula advisory committees often provide important guidance in the areas of curriculum and the needs of the work force.  As an expression of its confidence in student achievement, the College recently instituted a Career Program Guarantee that allows a graduate the opportunity to enroll in up to fifteen additional credits of course work tuition free if an employer verifies that the individual is lacking job-related competencies and skills specific to the career program.

Many departments and curricula use capstone projects and/or collaborative research projects to assess program outcomes following the completion of course requirements, although their use is sporadic and discipline-specific.  In the Art, Design Technologies and Photographic Imaging curricula, for example, each student’s work is displayed and presented to the entire faculty for critique.  Faculty have the opportunity to evaluate both the students’ performance and the assignment itself.

As with course evaluation, State regulations dictate that each career program be audited every five years.  The Assessment Plan describes an approach to evaluating the effectiveness of programs.  A core set of Performance Measures has been developed and is maintained in support of the academic audit process.  These data include: 1) enrollment trends and the diversity of the student population; 2) persistence and retention rates; 3) graduation, transfer and employment rates; and 4) student academic performance.  These core performance indicators, which are available for all 80 academic programs at the College, are provided over a five-year time frame, enabling each program to compare its performance over time or against the performance of a peer program or College-wide figures.

The audit process ensures a consistent collection of data and uses three kinds of tools to assess program effectiveness: surveys of graduates, surveys of former students and surveys of current students.  These surveys provide feedback on student satisfaction with their program, their perceptions of its strengths and weaknesses, and the preparation they received for the job market and/or transfer.  The audit process also relies on quantifiable assessment tools: transfer rates, student attrition/persistence, certifying exam results and employment data.  Furthermore, the audit model for programs with and without external accreditation guides audit writers to summarize current program evaluation efforts and to use external sources (e.g., advisory committee and employer feedback) to validate curriculum.

All audits reviewed include a section in which areas of concern are addressed and a specific plan for improvement is presented.  Findings and recommendations presented in the audit serve as a foundation for program revision, including modification of learning goals and objectives at the program (and course) level, when appropriate.

Interviews with Department Heads and Curriculum Coordinators revealed that audit data were used to guide planning and revise courses.  In the Early Childhood Education (ECE) Curriculum, for example, audit data were used to determine whether an appropriate balance existed between academic content and hands-on experience.  Faculty involved in the audit found that the ECE field had changed over the past ten years and that it was time to update courses.  Updates included adding more hands-on activities and developing community advocacy projects such as work in shelters.  Questionnaires sent to students as part of the audit process confirmed the need to incorporate these projects into the curriculum.  The music industry has also experienced significant changes, which led to the development of new courses and the creation of a computerized music studio in the Music Department.  Additionally, as a result of the audit process, the Music Curriculum was revised and a new non-performance option was developed.  Other Curriculum Coordinators have used audit data to develop retention strategies for their programs.  The current audit procedures provide a vehicle for systematic assessment of student learning outcomes, although faculty use them inconsistently for planning.

Timeliness of program audits has been a concern in the past, and changes in procedure have recently been initiated.  Currently, academic audits are completed by the Assistant to the Vice President for Academic Affairs in conjunction with Department Heads and faculty.  Academic Deans participate in the process to ensure that institutional as well as programmatic perspectives are maintained and that the final product reflects a realistic assessment of strengths and needs.  Programs with external accrediting agencies may accelerate the process by drawing substantial data from recent self studies to incorporate into the academic audits.  Eight programs have been scheduled for audit during the 2003-2004 academic year; two have been completed and approved by the Board of Trustees.

Institutional Level   

            A major focus of ongoing Institutional Research is assessing student learning and providing outcomes data to faculty and staff for planning purposes.  An extensive description of the theoretical framework and principles of institutional assessment may be found in the Assessment Plan.  Key institutional documents, such as the Strategic Plan, Mission Statement, and President’s Vision Statement, shape the research agenda for the Office of Institutional Research.  The expectations and needs of external constituents, such as Middle States, State and national Departments of Education, specialized accreditors and funding sources, also provide direction concerning research priorities for the Office.

In order to respond efficiently to assessment requests, the Office of Institutional Research maintains a generalized assessment database.  The file structure is longitudinal: it contains records that track students through their enrollment at the College.  These records are supplemented with additional assessment information from internal and external databases to create a student record that can follow a student from entry to the College to one year after departure.  In addition to enabling institution-wide assessments, the file has been adapted to meet the information needs of program assessments, such as the academic audits.  The extensive historical information available in this longitudinal database provides reference points for assessing student change over time.
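
Although the Assessment Plan does not publish the database’s actual layout, a minimal sketch may make the idea of a longitudinal student record concrete.  The representation below is purely illustrative: the field names, types, and the choice of Python are assumptions made for exposition, not the Office of Institutional Research’s actual design.

# Illustrative sketch only: one possible shape for a longitudinal student
# record of the kind described above. All names are hypothetical.
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class TermRecord:
    term: str                 # e.g., "Fall 1999"
    credits_attempted: int
    credits_earned: int
    gpa: float

@dataclass
class LongitudinalStudentRecord:
    student_id: str
    entry_term: str
    program_code: str                                      # hypothetical program identifier
    placement_scores: Dict[str, int] = field(default_factory=dict)
    terms: List[TermRecord] = field(default_factory=list)  # one entry per enrolled term
    graduated: bool = False
    departure_term: Optional[str] = None
    followup_outcome: Optional[str] = None                 # e.g., "transferred" or "employed",
                                                           # merged in from follow-up surveys

The essential design point is that each student has a single record whose term-by-term entries accumulate during enrollment and can later be joined with survey, transfer, or employment information gathered up to one year after departure.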

Surveys of both graduates and former students (students who leave the College prior to graduation) elicit information concerning students’ short-term transfer and career experiences, their assessments of the efficacy of their educational experiences at the College in helping them to achieve their educational goals, and their development of a core set of learning outcomes related to General Education and affective attributes.  In order to make it possible to assess change over time, similar methods have been used to gather survey information and a core set of questions has been asked consistently.  The survey process is flexible and has been adapted to address newly emerging critical issues.

The following is a recent sample of research documents/reports that focus on learning outcomes:

·        Career Outcomes (IR Reports #107, 111, 115, 117, 121, 126, 133, and 135)

·        Transfer Outcomes (IR Reports #106, 112, 118, 127, 132, 134 and In-Briefs #90 and 91)

·        Attrition and Persistence (IR Reports #105, 108, and 120)

·        Surveys of Graduates (1999, 2000, 2001, and 2002)

·        Surveys of Former Students (1997 and 2002)

·        The Progress of 2001 Graduates of Community College of Philadelphia in Development of General Education Skills and Affective Attributes (IR Report #128).   

The annual report, Institutional Effectiveness: A College Report Card, describes indicators that represent areas of institutional effectiveness, including three related to student learning outcomes: 1) workforce development; 2) transfer preparation; and 3) student persistence, goal attainment, and assessment of collegiate experiences.  These data serve as a reference point for assessing changes in the College’s student outcomes over time.  Available to all faculty and staff on the Institutional Research Home Page, these data provide student performance information on a College-wide scale.  In addition, they correlate strongly with goals explicit in the Mission Statement.  A review of the most recent College Report Card (2002) shows that:

·        Two-thirds of career program graduates secured employment related to their academic program, and 81% overall were employed.

·        Between 1996 and 2001, the average salary earned by graduates increased by 37.7%, outpacing the increase in the Consumer Price Index over the same period.

·        Most graduates of the College remain in the City as active, contributing members of the local economy.  Three-quarters of the 2001 graduates were working in Philadelphia; when the scope is expanded to the metropolitan area, 90% of the 2001 graduates were working in regional businesses.

·        The overall satisfaction level for 2001 graduates improved significantly.  Nearly 86% of 2001 working graduates reported their preparation for employment was either excellent or good.

·        The pass rates for College graduates on certification exams in the health care professions have consistently been higher than the national averages.  For example, 92% of Respiratory Care Technology graduates passed in 2000 and 100% in 2001, compared with national averages of 53% and 69%, respectively.  One hundred percent of Diagnostic Medical Imaging graduates passed in both 2000 and 2001, compared with national averages of 86% and 76% in the same time frame.

·        The outcomes of transfer students at their transfer institutions are improving.  Most CCP transfer students at State System of Higher Education universities in recent semesters have earned GPAs of 2.0 or higher.

·        Sixty-eight percent of transfer program graduates in 2001 were taking courses elsewhere shortly after graduating from the College.

Concerns

The College engages in many ongoing processes that, either directly or indirectly, use student learning outcomes assessment to track, evaluate, and review aspects of the College’s educational offerings.  Various dimensions of student learning outcomes are measured and assessed within the classroom and at the program and institutional levels.  However, the lack of consistency in the conception and implementation of a core set of General Education requirements across the College has impeded the development of an overarching assessment model that connects these various processes.  At present, General Education seems to be defined within programs, not across the College.  This has limited effective assessment of these learning outcomes to the course and program levels.

Because a core set of institution-wide learning outcomes has not been developed and implemented, faculty teaching in the disciplines may have to create their own mechanisms for determining congruence between learning goals and objectives at the course and program levels.  Group grading, done routinely in the English Department, for example, is one way such consistency is fostered.

Additionally, there is no specific outcomes assessment model for General Education and no assessment structure comparable to the program audit for disciplines at the College.  Consequently, there is at this time no mechanism for systematic self-assessment to evaluate congruence of learning goals and outcome measures within these disciplines.  While many faculty engage in effective ongoing assessment of student learning outcomes to improve their courses, outside of the career programs these improvements go largely undocumented, and individual faculty successes do not necessarily spread to other sections within the disciplines.  This is not to say, however, that there is not a rich exchange of ideas occurring in many parts of the College, an exchange motivated by a desire to improve students’ educational experience.  Faculty efforts and innovations are often shared with colleagues in forums such as Teaching Circles (a pilot program recently established in the English Department, for example) and presentations such as those that routinely occur in the Teaching Center.  In such forums, a variety of classroom practices from across the disciplines are discussed, leading, one assumes, to improved instruction.

There are few obvious methods to ensure that aspects of the Mission not directly related to transfer and job opportunities are addressed within courses and academic programs.  For example, while the Mission speaks to the development of civic involvement, there is no structured process in place to assess where this dimension of the Mission is being addressed within the College.  Despite this, attempts have been made to collect outcomes information related to citizenship and community membership.  In Institutional Research Report #130A, respondents to the 2003 Middle States Self Study Current Student Survey indicated that student growth was limited in the areas of “preparation for active participation in my community” and “development into a more informed and active citizen.”

The College has used a wide range of benchmarking strategies to understand institutional effectiveness relative to peer institutions.  Examples can be found in The College Report Card; IR Report #112, titled A Comparison of Community College of Philadelphia Student Outcomes with Those of Other Pennsylvania Community College Students; and IR Report #110, titled Student Satisfaction with Student Services, Academic Services and Campus Climate 1996-1999 (January 2000).  Additionally, the use of well-designed longitudinal studies has allowed the College to assess the impact of programmatic and service-delivery changes over time and to monitor evolving patterns in institutional effectiveness with respect to the many different subpopulations served by the College.  One dimension of benchmarking that has not been used extensively at the College is the setting of internally driven standards that define appropriate institutional performance levels.  For example, during a recent Professional Development activity, staff were asked to define an acceptable five-year graduation rate for full-time, first-time CCP students.  The acceptable graduation rates proposed by the College community ranged from 5% to 85%.

RECOMMENDATIONS    

The College should develop a comprehensive student outcomes assessment plan, building on current successful practices such as the audits and the current course development models, to systematically use data and outcomes to effect change at the institutional, programmatic, and course levels.  The plan should be designed to assist administrators, Department Heads, Curriculum Coordinators, and faculty in using assessment data consistently.  It should encourage creativity in the classroom, foster improvement in student learning outcomes, and be flexible enough to encourage the full involvement of a traditionally independent faculty.  This overarching plan should be accomplished through the following recommendations:

·        Implement a core set of General Education requirements across the College.

·        Connect student learning goals with the College’s Mission Statement more explicitly at the institutional, programmatic, and course levels.

·        Encourage wider use of indirect assessment methods such as written student surveys and self-reflective questions in order to evaluate congruence between student outcomes and the Mission Statement.  Additionally, rubrics and portfolios may be used to increase parity of outcome measures for assignments between course sections, in programs, and across disciplines.

·        Assure routine collection of data on student learning outcomes, including data from employers, and use findings as the basis for course and program revision.

·        Develop internal benchmarks at the institutional, programmatic, and course levels to measure effectiveness and improvement in student learning outcomes.


RESOURCE LIST

A.                  Institutional Research Reports Related to Standard 14:

 

·                     IR Report #69 – Middle States Self Study Survey Results – A Summary of Responses (5/93)

·                     IR Report #77 – A Review of the Higher Education Literature Related to Models of Student Outcomes (6/94)

·                     IR Report #84 - The ACT NOW Program - A Description and Evaluation (6/95)

·                     IR Report #93 – Beating the Odds:  Reasons for At-Risk Student Success at Community College of Philadelphia (9/97)

·                     IR Report #95 – An Evaluation of the Achievement of the Developmental Education Mission (11/97)

·                     IR Report #98 – An Evaluation of the Achievement of the Developmental Education Mission – An Update (1/98)

·                     IR Report #103 – Developmental Education Outcomes – Three Years After the Developmental Education Task Force Report (4/99)

·                     IR Report #105 – Barriers to the Persistence of Students with Freshman and Sophomore Status (7/99)

·                     IR Report #106 – Transfer Outcomes of 1997 Graduates and Former Students (9/99)

·                     IR Report #107 – Career Outcomes of 1997 Graduates and Former Students (9/99)

·                     IR Report #108 – Why Do Students Drop Out of Community College of Philadelphia?  Reasons for the Attrition of Black and White Students (9/99)

·                     IR Report #110 – Student Satisfaction with Student Services, Academic Services and Campus Climate 1996-1999 (1/00)

·                     IR Report #111 – The Economic Impact of Community College of Philadelphia (2/00)

·                     IR Report #112 - A Comparison of Community College of Philadelphia Student Outcomes with Those of Other Pennsylvania Community College Students (8/00)

·                     IR Report #113 – Profiles of Students Who Enroll at Single and Multiple Community College of Philadelphia Sites (8/00)

·                     IR Report #115 – Career Outcomes for 1999 Career Program Graduates (11/00)

·                     IR Report #116 – Student Preferences for Alternative Course Delivery Options (11/00)

·                     IR Report #117 – Impact of Community College of Philadelphia Allied Health Programs on the Philadelphia Region (11/00)

·                     IR Report #118 – Temple University Persistence Rates for Community College of Philadelphia Transfer Students (12/00)

·                     IR Report #119 – Institutional Effectiveness 2000: A College Report Card (1/01)

·                     IR Report #120 – Student Attrition at CCP – When Students Leave, Why They Leave, and Their Academic Success at Departure (6/01)

·                     IR Report #121 – Career Outcomes for 2000 Career Program Graduates (11/01)

·                     IR Report #122 – Short-Term Transfer and Career Outcomes of Community College of Philadelphia’s Graduating Class of 2000 (1/02)

·                     IR Report #123 – Distance Education at the Community College of Philadelphia: Fall 1998 through Spring 2001 (1/02)

·                     IR Report #124 – Transfer Outcomes of Graduates in 1999 and 2000 (1/02)

·                     IR Report #125 – Institutional Effectiveness 2001 – A College Report Card (3/02)

·                     IR Report #126 – Career Outcomes for 2001: Career Program Graduates (10/02)

·                     IR Report #127 – Transfer Outcomes of Graduates in 2001 (11/02)

·                     IR Report #128 – The Progress of 2001 Graduates of Community College of Philadelphia in Development of General Education Skills and Affective Attributes (12/02)

·                     IR Report #129 – Institutional Effectiveness 2002 – A College Report Card (1/03)

·                     IR Report #130A – Responses to Middle States Self Study Current Student Questionnaire (4/03)

·                     IR Report #130B – Responses to Middle States Self Study Faculty/Staff Questionnaire (4/03)

·                     IR Report #132 – Transfer Outcomes of Graduates in 2002 (10/03)

·                     IR Report #133 – Career Outcomes for 2002 Career Program Graduates (10/03)

·                     IR Report #134 – Transfer Outcomes of 2002 Graduate and Non-Graduate Former Students (12/03)

·                     IR Report #135 – Career Outcomes of 2002 Graduate and Non-Graduate Former Students (12/03)

 

B.                   Institutional Research In-Briefs Related to Standard 14:

 

·                     IR In-Brief #90 – West Chester Acceptance Achievement and Persistence Outcomes Associated with Former CCP Students Who Enrolled at West Chester University in 1991 and 2001 (12/01)

·                     IR In-Brief #91 – Acceptance Outcomes of Former CCP Students Who Applied to Thomas Jefferson University (1/02)

 

C.                   Office of Institutional Research:

 

·                     Performance Measure Definitions (5/03)

·                     Performance Measures (5/03)

·                     Surveys of Graduates (1999, 2000, 2001, and 2002)

·                     Surveys of Former Students (1997 and 2002)

 

D.                  Office of the Vice President for Academic Affairs. See the following Curriculum Revision Proposals:

 

·                     Associate in Arts: Art Curriculum - Minor Revision (11/30/00)

·                     Automotive Technology Curriculum Revision Proposal (10/10/95)

·                     Program Revision Proposal: Electronics Engineering Technology Associate in Applied Science Degree Program (11/95)

·                     Proposal for Minor Program Revision for Electronics Engineering Technology (1/30/02)

·                     Proposal for Program Revision: Music Curriculum (12/11/97)

·                     Proposal for Revision of Paralegal Studies Curriculum (10/19/93)


E.                   Office of the Vice President for Academic Affairs. See the following Dimensional Course Approvals:

 

·                     MATH 171-Calculus I (4/95)

·                     PSYC 101-Introduction to Psychology (4/15/97)

·                     PSYC 115-Introduction to Parenting

·                     PSYC 215-Developmental Psychology (3/97)

 

F.                   Office of the Vice President for Academic Affairs. See the following Course Documents:

 

·                     ART 110-Ceramics II (4/26/01)

·                     ART 125-Design I (11/9/01)

·                     ART 290-Portfolio Preparation (12/3/01)

·                     AT 111-Automotive Suspension and Steering Systems (9/98)

·                     AT 121-Principles of Automotive Electricity and Electronics (12/96)

·                     AT 181-Automotive Engine Mechanical Repair (9/92)

·                     AT 261-Engine Performance and Diagnosis (4/97)

·                     AT 271-Air Conditioning and Heating Systems (10/97)

·                     DAH 101-Basic Dental Sciences (1/14/02)

·                     DAH 105-Practice Administration (1/12/01)

·                     DAH 135 – Course Radiology (5/01)

·                     DAH 221-Oral Histology and Embryology (6/11/01)

·                     DAH 291-Dental Hygiene Clinic I (4/98)

·                     ELEC 106-Introduction to Electricity (12/01)

·                     ELEC 126-Principles of Electronics (2/02)

·                     ELEC 227-Electronic Circuits I (1/02)

·                     ENGL 098-Fundamentals of Writing (12/4/00)

·                     ENGL 101-General Studies

·                     ENGL 208-Introduction to Literature: Prose (10/16/00)

·                     ENGL 210-Advanced Creative Writing (9/98)

·                     MATH 017-Elementary Algebra (6/94)

·                     MATH 118-Intermediate Algebra (9/91)

·                     MUS 101-Piano I (1/23/02)

·                     MUS 103-Music Appreciation (4/92)

·                     MUS 214-Chromatic Harmony (1/7/02)

·                     PLS 101-Introduction to Paralegal Studies (5/14/96)

·                     PLS 295-Legal Internship (7/8/93)

 

G.                   Office of the Vice President for Academic Affairs, Curriculum Development Services: Format for Program Revision (2001)

 

H.                  Office of the Vice President for Academic Affairs, Curriculum Facilitating Team, Guidelines for Course Development and Revision at Community College of Philadelphia (11/6/02)

 

I.                     Office of the Vice President for Academic Affairs, Career Degree Guarantee: Statement of Philosophy

 

J.                    Office of the Vice President for Academic Affairs. See the following Program Audits:

 

·                     Art Curriculum Audit (10/98)

·                     Automotive Technology Program Audit (8/13/01)

·                     Dental Assisting Hygiene Program Audit (12/22/00)

·                     Electronics Engineering Technology Audit (3/3/99)

·                     Management Degree and Certificate Program Audit – DRAFT (5/6/96)

·                     Music Curriculum Audit (SP 95)

·                     Paralegal Studies Curriculum Audit (1/01)

 

K.                  Office of the Vice President for Academic Affairs, Community College of Philadelphia. See the following documents relating to the 1993 Institutional Self Study:

 

·                     Major Issues -Where Significant Progress Has Been Made

·                     Document I – Chapter XI: Conclusion Chapter from 1993 MSA Self Study Report

·                     Document II-Summary of Major Recommendations Made By MSA Evaluation team

·                     Document III-Extract from Periodic Review Report dated July 1, 1999: Specific Responses to Middle States Evaluation Team Report

 

L.                   Office of the Vice President for Academic Affairs, Academic Programs Audit Schedule (10/02)

 

M.                 Office of the Vice President for Academic Affairs, Curriculum Development Services: Format for Program Development (12/01)

 

N.                  Office of the Vice President for Planning and Finance, Statistical Compendiums (1999-2000 and 2000-2001)

 

O.                  Office of the Vice President for Planning and Finance, Assessment Plan: An Overview of Efforts to Understand Institutional Effectiveness at the Community College of Philadelphia

 
