Acreditación de programas en la Universidad de Puerto Rico: Análisis de las políticas para evidenciar un proceso de cambio exitoso, 2003-2014
Celeste E. Freytes-González, Ed.D.
Consuelo Figueras-Álvarez, Ph.D.
Universidad de Puerto Rico
Recinto de Río Piedras
Resumen
Una universidad de primera debe demostrar la excelencia y calidad de sus programas, incluyendo cómo estos atienden cambios continuos y complejos, entre ellos, los adelantos tecnológicos, los cambios demográficos, las políticas fiscales y los nuevos contenidos en la oferta académica. En 2003, la Universidad de Puerto Rico inició un proceso para establecer una cultura de evaluación y escrutinio externo que asegurara que sus programas fueran de excelencia y capaces de enfrentar estos desafíos. Para ello, requirió la acreditación de todos los programas susceptibles de acreditación voluntaria. El proyecto fue muy exitoso: el 84 por ciento de esos programas fue acreditado por las agencias correspondientes. Tomando como referencia el modelo de John P. Kotter, especialista en investigaciones sobre las transformaciones y cambios a gran escala, este trabajo describe las etapas que siguieron los programas para acreditarse. Además, confirma que el modelo de Kotter es también aplicable a la educación superior.
Palabras clave: acreditación, evaluación de programas, Kotter, cultura de evaluación
Abstract
A first-rate university should demonstrate the excellence and quality of its programs, including how they meet continuous and complex changes in areas such as technological advances, demographic changes, fiscal policies, and new content in the academic offerings. In 2003, the University of Puerto Rico initiated a process to establish a culture of external scrutiny and evaluation of its programs to ensure that they could respond to the challenge of working with these complex topics. As part of said process, it required that all the programs with voluntary accreditation should be accredited. The project was a success, and 84 percent of those programs were accredited by the corresponding agencies. Taking as reference the model by John P. Kotter, a research specialist on large-scale transformation and change, this article describes the different stages that the programs went through to achieve accreditation. At the same time, it also confirms that the Kotter model is applicable to higher education.
Keywords: accreditation, program evaluation, Kotter, culture of evaluation
Recibido: 6 jun. 2016; Aceptación inicial: 19 ene. 2018; Aceptado: 31 mar. 2018
Cómo citar este artículo (estilo APA)/Citing this article (APA style):
- Impreso/Print: Freytes-González, C. & Figueras-Álvarez, C. (2017). Program accreditation at the University of Puerto Rico: Policy analysis to evidence a successful process of change, 2003-2014. Cuaderno de Investigación en la Educación, 32, 126-173.
- Digital: Freytes-González, C. & Figueras-Álvarez, C. (2017). Program accreditation at the University of Puerto Rico: Policy analysis to evidence a successful process of change, 2003-2014. Cuaderno de Investigación en la Educación, 32, 126-173. Recuperado de https://cie.uprrp.edu/cuaderno/2019/01/10/program-accreditation-upr-2003-2014/
Many major change initiatives are destined to fail.
John Kotter, 2006
Change is a constant feature of our global society. The pace and complexity of change are driven by technology, new communication channels, competition, and the speed of accessing and sharing information. “Managers and enterprises, be they public or private, service or manufacturing, will continue to be judged upon their ability to effectively and efficiently manage change” (Paton & McCalman, 2000, p. 5).
Although change is continuous, research indicates that it has been well planned and carefully implemented only in times of crisis (Morley & Eadie, 2001). One of the main reasons is that managers are not fully aware of their responsibility for facilitating and implementing change. In fact, even the smallest transformation requires an effective management plan (Hiatt & Creasey, 2003) that helps address the most significant barriers in a timely manner. These include the “normal human resistance, the pressure of day-to-day events, scarce resources, inadequate planning process, and incomplete information and changing circumstances” (Morley & Eadie, 2001).
A good conceptual framework can provide a clear agenda for change to be effective. Further, larger organizations or those with complex systems will benefit from a process framed within a model for change (Mento, Jones, & Dirndorfer, 2002).
John Kotter has carefully researched change in business organizations and enterprises. To succeed, change initiatives must be viewed as a transformation elaborated in phases that build on each other. In Leading Change (1996), he proposes that the following eight phases be completed in sequence:
- Establishing a sense of urgency,
- Forming a powerful guiding coalition,
- Creating a vision,
- Communicating the vision,
- Empowering others to act on the vision,
- Planning for and creating short-term wins,
- Consolidating improvements and producing more change, and
- Institutionalizing new approaches.
It takes years to successfully implement change, which is fundamentally resisted by the people it affects. It follows that any amendment or modification is influenced by:
…the typical human response to change: to question it, to challenge it, and to slow it down… The drivers for change —technology, competition, recognition, economics— are ones that most members of higher education institutions will agree on, but the process for implementation, reorganization and redesign has been less frankly discussed. (Wedge, 2006, p. 10)
Promoting change and innovation in higher education is as complex as in any other organization; more so in public universities where the participation of the academic community is part of the institution’s historical milieu. In fact, the tradition of active democratic participation of different constituencies in public higher education decision-making adds another set of complexities for administrators.
Overall, the main elements promoting change in institutions of higher education are external forces. A pressing example is the question of perpetual accountability raised by governments about the quality and relevance of such a system (El-Khawas, 2001) in relation to its increasing cost. Another example is the inquiry into institutional processes and services by accrediting agencies, such as the Middle States Commission on Higher Education.
By the 1990s, the international mobility of students in our global society required that higher education credentials be recognized across borders. Higher education needed a system to assure the international equivalence of degrees, and accreditation was a great option to establish common criteria for all. For example,
…in Europe this reform is known as the Bologna Process, which has been imitated in Latin America, North Africa, and Australia. … Bologna approaches [… emphasize] accountability, access, quality assurance, credits and transfer, and, most notably, learning outcomes in the context of the disciplines. (Hiatt & Creasey, 2003, p. viii)
Some countries have adopted professional accreditation of their university programs as part of a common strategy to establish communication with other institutions. In the case of the University of Puerto Rico, we were convinced that external and continuous evaluation would benefit our programs and improve student outcomes assessment data (CHEA, 2006). It is in this context that accreditation was viewed as a beneficial option.
The purpose of this document is to share the process, reflections, strengths, and needs that were instrumental in the success of the Project for the Professional Accreditation of Programs and Services at the University of Puerto Rico. The model developed by John Kotter (1996) will help explain the managerial actions used in leading the process for change. These actions will be compared to those recommended in the model. Most of Kotter’s research was developed by observing different business models, and its application to the higher education environment is uncommon. This paper will address and expand on many observations in this area.
Although our emphasis is related to program accreditation, the change process is part of a much larger one. In fact, many changes that are still evident to this day are framed within our objective to strengthen, evaluate and update all processes related to programs. This includes the creation of new programs and the evaluation (external and internal) of existing ones.
The University of Puerto Rico (UPR)
With a long-standing history of academic excellence, the University of Puerto Rico is the oldest system of higher education on the island. At the time this project was conceived, UPR had 5,054 professors and researchers, 61,967 students, and thousands of alumni who honor this institution with their intellectual and professional contributions at the local and international level. It has eleven campuses located throughout the island offering 236 associate and bachelor’s degrees, 127 master’s degrees, 27 doctoral degrees, and many other courses and programs that are part of different continuing education offerings. The organizational structure is complex and at the system level includes a Board of Governors, a University Board, a President, a Vice President for Academic Affairs, a Vice President for Student Affairs, and a Vice President for Technology and Research. Each campus has a Chancellor, a Dean of Academic Affairs, a Dean of Students, and a Dean of Administration.
All units have institutional accreditation (Middle States Commission on Higher Education, or MSCHE) as well as program accreditations necessary for students to continue graduate studies in the United States and other countries. UPR has an ample research agenda and professional exchange programs with over 120 institutions of higher education at the international level. This provides students with unique research and academic experiences.

The expansion of the system from one campus in 1903 to the current eleven campuses throughout the island was not the result of a carefully studied strategic growth plan. As was the case with most institutions at the time, these campuses were created to address specific needs or issues, including the partisan political pressure common to public universities. As a result, today there is a need to revisit who we are as a system. It is in this context that voluntary program accreditation was viewed as an important academic change to unify our curriculum, especially in those content areas offered on more than one campus.
In 2002, a new President was appointed with the endorsement of the academic community. His proposed plan included the need to strengthen existing programs and to adopt accreditation as the primary tool for continual assessment. In fact, the objective was to obtain and maintain professional accreditation for all programs of study in which such accreditation is granted. As a result, the Office of the Vice President for Academic Affairs (VPAA), with the support and endorsement of the President, discussed with the university community the relevance of program accreditation as an option to “foster an academic culture in which programs, departments, schools, and colleges will adapt their curricular offerings… to the best developments in their respective disciplines or field of knowledge.” This, in turn, would serve to “provide flexible protocols for evaluation, renewal, and academic assessment” (University of Puerto Rico, 2006, p. 11).
At the time, the Office of the VPAA viewed the voluntary professional accreditation of programs as a change option to update and strengthen the curricular contents of offerings at UPR. It proved to be a good strategic decision that encouraged contact with other specialists in the field at the national and international level, and nurtured professional diversity.
A brief review of UPR programs confirmed the need to initiate a systemic intervention to ensure that all were relevant and updated. Within such a complex institution we were well aware of the importance of establishing specific criteria to help units complete a more purposeful analysis of each program. Since all accrediting agencies have a blueprint of specific minimum standards, initial evaluation along those lines was a good place to start. Hence, in 2003-2004 the Office of the VPAA designed the Project for the Professional Accreditation of Programs and Services to strengthen the institutional culture of evaluation and pursue program quality assurance. Its vision established the accreditation of all programs as a priority.
At the beginning of the Project, the UPR had a total of 491 programs. Of these, 274 (56%) fell under the scope of a professional accrediting agency; 217 (44%) did not. Of those with accrediting bodies, 122 (24%) were already accredited, since accreditation is a legal requirement for the practice of the profession (such as law, architecture, engineering, and the health-related professions, among others). The remaining 144 programs (31%) were incorporated into the Project; most of these were offered on more than one campus, while some were unique to one unit.
The programs that could be accredited included Teacher Preparation, Business Administration, Office System Administration, Chemistry, Information Systems, Computer Science, and Engineering Technology and Communication, all of which are offered at more than one campus. Other programs are unique to one campus, such as the School of Hospitality at UPR-Carolina, which is accredited by ACPHA. These one-of-a-kind programs were added to the list and individually discussed with the campus Dean of Academic Affairs.
While looking into professional accreditation agencies, it became evident that many services also had specialized accrediting bodies. We then identified 98 services offered at UPR, none of which were accredited or externally evaluated, including counseling centers, libraries, preschool programs, museums, and the numerous journals published by the institution. The libraries were the only group that had some experience with an external evaluation process, as part of the requirements of the Middle States Commission on Higher Education.
Initially, the decision to incorporate these services into the Project was viewed as an additional and unnecessary burden or, at best, as a parallel initiative. However, after an extensive discussion it became clear that programs and services had to be viewed as a whole. Moreover, the evaluation of the latter facilitates and enriches the process of professional program accreditation. For example, libraries are an integral part of the students’ learning experience, and most agencies request assessment of their collections and services as part of their accreditation standards. Thus, external accreditation and the development of improvement plans for libraries would contribute to the quality assurance of program accreditation. The results of their self-study could be used to support program accreditation, consequently strengthening the entire institutional process of assessment and improvement.
The goal was then set that by 2012-2013 all programs and services susceptible to accreditation should formally initiate steps toward that end. This meant each program must have identified its accrediting agency, studied its requirements, used its standards for self-evaluation, and begun implementing the changes required for accreditation. In addition, the appropriate accrediting agency had to officially acknowledge or certify that the program had advanced in this initial task. Once these preconditions were met, most agencies certified the program as an “official candidate.” Much to our surprise, by 2009 all programs and services were official candidates, as certified by their agencies, and had started their accreditation process. This went far beyond our initial expectations and served to encourage others to also reach the final goal.
The following sections explain the implementation process during the project’s first six-year period. For each of the phases proposed by Kotter, we shall illustrate how it applied to our initiative. Although his model was developed for private companies, we shall demonstrate how it can also be applied to institutions of higher education.
1. Establish a sense of urgency
At this stage, Kotter proposes a hard look at the company’s competitive situation, including market position, technological trends, and financial performance.
At UPR, the focus on program superiority and accreditation first gained momentum when the new President identified professional program accreditation as a priority in 2002. Although the strained financial situation at that time was a crucial factor that required a strong and dispassionate view of the institution, we were convinced that external evaluation could demonstrate that our programs were up-to-date and offered our students outstanding options.
At first glance, it was evident that many of our programs fulfilled the minimum evaluation requirements. However, they needed extra motivation and encouragement to aspire to a higher level. In other words, we realized they were good, but still there were aspects that could be improved when compared to other first-rate programs. The first task was to define what exactly qualified as such and how we could demonstrate our programs complied with the minimum characteristics of excellence.
A major drawback was the fact that most program chairs were already convinced that they had a first-rate program and that further review was unnecessary. To support this view, they constantly referred to data showing that UPR already admitted the best students on the island. For example, during the past five academic years the average high school GPA of the new incoming class at UPR, which comprised about 13,000 students, was 3.50. Although this may be a source of pride for any higher education institution, as it is for us, it might also limit growth and development. In other words, the general consensus among administrators was that because UPR maintains such admittance standards, it offers the best programs; therefore, establishing a new, unfamiliar agenda seemed uncalled for. Indeed, some campuses used this information to justify maintaining the status quo.
Initial discussion about program evaluation required a comprehensive description of all our academic offerings. The list of programs had not been updated in thirty years, and only vague descriptions were available for some. In a massive, time-consuming effort, the VPAA, through monthly meetings with the campus Deans of Academic Affairs (DAA), completed a brief profile of all offerings, which was then endorsed by the Board of Trustees. This was an essential step to establish a sense of clarity and unity throughout the university system. It also provided an environment of collaboration and teamwork characterized by constant communication and support. Conversations about how accredited programs could showcase their strengths started in these meetings.
The first hard look at program evaluation began when the then Board of Trustees activated the policy established in 1993 (Certification No. 113, 1992-93), requiring all units to submit to the Board that year’s annual evaluation report. A total of 80 reports were received. After each one was reviewed, the following became evident:
- Even those programs that had submitted evaluation reports every five years did not include all the official requirements as outlined in the 1993 Board of Trustees Certification 93-113 regarding the creation of new programs and the evaluation of existing ones.
- The reports followed different formats.
- Most of the data referenced was not up-to-date.
- Few reports included data of the program’s effectiveness or whether the curricular content was updated with new developments in the field.
- Some important concepts were not included, such as assessment and use of technology.
- The requirements under this certification did not provide for continuous evaluation or for a specific system to review all the information received and provide feedback to the programs.
- Our main concern —rigorous student outcome assessments— was not even included in the reports because it was not required by Certification 93-113.
Furthermore, when the Board of Trustees questioned the practicality of directly receiving these reports —as it did not have the administrative structure to monitor compliance with student outcome assessments and accreditation standards— we realized an effective intervention for change would require updating all stages of the review process: namely, program creation, evaluation, moratorium and closure, as well as the organizational structure at the central level. As a result, all matters related to program review were referred to the VPAA. This decision helped centralize all processes.
It took close to eight months for the VPAA, together with specialists in the field, to read and react to each of the 80 reports. When comments were completed, the VPAA met individually with each Dean of Academic Affairs and the campus teams to discuss the findings. This personal experience later provided for an animated discussion among the Deans of Academic Affairs about how to demonstrate that the programs were indeed superior. It was at this time that the urgency to develop an action plan to strengthen the programs was born.
At this stage, the fact that the President actively recognized the importance of program review and included this topic in his monthly meetings with chancellors, the University Board, and the Board of Trustees, was a critical strategy. The VPAA also offered follow-up presentations at the University Board’s monthly meetings and worked closely with its Committee on Academic Affairs to discuss the subject. It was quite a cumbersome process that took several years; however, the full endorsement of all these groups increased knowledge, awareness and enthusiasm regarding the need to establish a procedure that would lead to program development and improvement.
Nevertheless, the development of a systemic policy did not automatically ensure an enthusiastic implementation. Quite the contrary: some professors and other members of the university community regarded it as an inappropriate interference with program content. To quell these reactions, specific answers and examples were presented to show how important external evaluation, particularly voluntary accreditation, was for our programs. In these conversations, the support available to assist the units with this task was emphasized.
These policies, approved by the Board of Trustees and described below, established professional accreditation as a requirement for all programs with an accrediting agency. Figure 2 presents the process for improving and strengthening institutional programs.
[Figure 2. Process for improving and strengthening institutional programs]
This process is divided into three stages. Stage I comprises the two minimum institutional requirements necessary before requesting professional accreditation. It includes the appropriate license to grant degrees and institutional accreditation. For UPR, these are granted by the Puerto Rico Education Council (CEPR, for its Spanish acronym) and Middle States Commission on Higher Education (MSCHE), respectively.
Once these two requirements are met, in Stage II the unit can design its academic programs following one of two options: existing program evaluation or creation of a new program. Therefore, it was important to view our programs in light of these two components, keeping in mind that the evaluation would be further subdivided into two categories: internal and external.
The implementation of Stage II required all new program proposals to be in line with accreditation standards. It was helpful that the Board of Trustees, with the endorsement of the University Board, approved a certification for program creation to “Respond to the institutional mission of guaranteeing academic offerings of the highest quality” (Certification No. 80, 2005-2006). In this certification, accreditation is introduced as a requirement for the creation of new programs, thus sending a strong message regarding continuous internal and external assessment (Board of Trustees, 2006).
A second systemic policy was approved regarding the continuous five-year evaluation cycle for all programs, which laid down the Norms for the Periodic Evaluation of Academic Programs at UPR (Certification No. 43, 2006-2007). Once more, the Board of Trustees emphasized the institutional mission of “guaranteeing high quality evaluation of academic programs on a continuous basis,” which clearly communicated the importance of constant and rigorous evaluation (Board of Trustees, 2007).
A process was established both for programs susceptible to accreditation and for those without an accrediting agency in the field. For the former, the creation and evaluation of new programs had to follow the standards of the appropriate agency; the latter had to complete an internal evaluation process. For the 217 programs without an accrediting agency, two five-year cycles were approved, and the Board endorsed a 10-year evaluation cycle. Such was the case of various programs in the Humanities.
Stage III is a decision-making process. After professors, administrators, and students complete a critical review of the program, various options can be considered: initiate a curricular revision, continue compliance with sustained accreditation requirements (if applicable), develop new areas of emphasis, recommend a moratorium, or consider closing the program.
The process outlined in Figure 2 positively influenced this accreditation initiative and helped strengthen our institutional profile. Frequent discussions about program creation and evaluation at the University Board meetings trickled down and stimulated discussions at the campus level. Similarly, the Academic Senate faculty representative at the University Board discussed the topic at the Senate meetings. The Deans of Academic Affairs (DAA) were also instrumental at this stage, as they addressed concerns and discussed the results of each program’s evaluation, both individually with program directors and in the monthly meetings with the VPAA. They showed exemplary leadership and were highly respected by their peers at all levels. It is also important to note that the DAAs are members of the Academic Senate at their respective units and were available to elaborate on any information needed.
The monthly reports that the VPAA prepared for the University Board and the Board of Trustees on topics related to the accreditation process were another excellent source of communication instituted by the President. These reports included information on the number of programs in different disciplines and the demand for such programs on the island. This information facilitated the hard look required to analyze the competitive needs of our programs.
For a long time after its foundation, UPR was the only local option for higher education. However, the past decades have seen the flourishing of other institutions, as well as the establishment of new units under the UPR system. Presently, there are over 34 centers of higher education on an island that covers an area of 3,500 square miles. Some of them are as complex and widespread as UPR. As a result, various colleges offer the same programs. For example, Business Administration, Education, and the clinical health programs (mostly at the Medical Sciences Campus) represent 46% of UPR offerings. In all, Business Administration has 88 programs in ten units; for each program that UPR offers in this area, there are five at other institutions of higher education in Puerto Rico. The same can be said for Education, with six programs for each one offered at UPR. This information was also essential for analyzing our market position.
The reports presented by the VPAA emphasized the importance of assuring strong programs that could set UPR apart from other institutions. Moreover, it was important to highlight that the initiative had more to do with demonstrating the strengths of the curriculum than with the need to eliminate programs with limited effectiveness, a perception initially shared by many professors.
In sum, UPR governing bodies, with concrete data, advanced the sense of urgency by using an external evaluation system: the accreditation process. The fact that the President, the University Board, and the Board of Trustees endorsed this view and approved systemic creation and evaluation policies set forth a common agenda for all programs.
2. Form a powerful guiding coalition
At this stage, Kotter recognizes the need to “assemble a group with shared commitment and enough power to lead the change effort.” Such teamwork, operating beyond the normal hierarchy, develops a shared commitment to excellent performance among all ranks.
In our case, the work was carried out by three distinct clusters at different levels: the leading group, the coalition group, and the campus committees. Figure 3 presents the different divisions used in this process.
[Figure 3. The three groups established to advance the accreditation project: the leading group, the coalition group, and the campus committees]
As seen in Figure 3, three groups were established to facilitate close collaboration and communication to advance the project:
- The Leading Group included various bodies. At Central Administration, there are two governance groups, the University Board and the Board of Trustees, which received continuous information about this initiative in their monthly meetings. At the administrative level, the UPR President and campus chancellors advanced discussion of the project every month in a systemic way. The third group, comprising the Deans of Academic Affairs of all campuses, worked on and implemented the project with strong leadership. Furthermore, the continuous exchange of information at the deans’ monthly meetings with the VPAA increased their specialized knowledge and helped them consider specific strategies for advancing and adapting this initiative at their campuses. The VPAA coordinated and facilitated direct communication between these groups.
- The Coalition Group included a carefully selected faculty member for each of the ten program accreditation areas. As part of their daily activities, these professors worked with the chairs of the specific program in each campus. They also communicated directly with their Dean of Academic Affairs and the Office of the VPAA. More importantly, they met with their chairs and faculty to advance the accreditation process and recommend the implementation phases for each program or service. In many ways, this allowed them to work as a team outside the normal hierarchy, with the authority and flexibility to advance the project. Members of the coalition group were required to have a full understanding of professional accreditation standards and to establish direct communication with a contact person at the agency. The VPAA encouraged these leaders to attend annual professional conferences to further advance their knowledge of the discipline and create a network of resources and specialized professionals to assist others. The knowledge they acquired transformed them into excellent resources, with the ability to explain and adapt this know-how to specific campus contexts. In fact, many of them were invited by different agencies to participate in the external evaluation of other institutions of higher education. The VPAA also communicated directly with the agency representatives to identify the precise information required to advance the program evaluation.
- Campus Committees, the third general group, included the participation of College Deans, Program Chairs, Directors of the academic assessment office, Budget Directors, Professors, Program Coordinators, and students of each campus. They worked closely with the coalition group representative and established a working relationship with their Dean of Academic Affairs.
Early on, the institution’s established administrative-organizational structure was used to facilitate the foundation for change. In each campus, the chancellor, deans, chairs, and directors maintained continuous communication with Central Administration, particularly through monthly meetings between the Deans of Academic Affairs and the VPAA; the budget directors, with their counterpart in Central Administration; and the President, with the chancellors. As part of this effort, the central budget director and the campus budget directors were instrumental in identifying and distributing funds so as to meet program needs and make available necessary resources. Accreditation was not defined as a new task: it was incorporated into the ordinary responsibilities of these positions. Furthermore, when the President interviewed the chancellors as part of the process of recommending their designation to the Board of Trustees, program accreditation became an essential element for consideration.
The initial communication process to integrate the coalition group with professors and administrators was important to explain the desired change. For this effort to be successful, the position of Assistant Vice President in charge of Accreditation (VPAA-AT) was created. At first, it took the VPAA-AT an enormous amount of time to consistently communicate the vision and change process. UPR had the advantage that most of the programs that could be accredited already complied with the minimum characteristics of a good program, and program chairs agreed that they could aspire to meet a higher level of achievement.
In sum, UPR assembled a special and diverse group of professionals dedicated to excellence who shared commitment and enough power to lead the change effort at the institution. It was a large systemic, multitask workforce that strengthened and supported the accreditation process.
3. Create a vision
At this stage Kotter recommends the creation of a vision to direct the change effort and the development of strategies for accomplishing it.
It was at the first stage of our process that we developed a clear idea of how to express our vision of strengthening the programs and services: “Any program or service for which a widely recognized accrediting agency exists should be accredited.” Once the vision was established, different meetings and group discussions helped clarify the concept of external evaluation and accreditation. We were not as concerned with the programs that did not have an accrediting agency, since they were to follow the internal procedure that had been previously established. In this regard, system-wide policies approved by the Board of Trustees in this area were valuable and influential.
Two main issues emerged during the discussion of the vision that helped us prepare to provide clear and specific answers to the academic community. Although the word accreditation offered a distinct view of what needed to be done, it was also associated with the experiences units had with institutional accreditation (all units are accredited by MSCHE). This association was not necessarily beneficial when working with professional program accreditation because the initial response to the content required was more focused on general institutional outcomes or processes, such as assessment. In fact, some units did not have any experience whatsoever with program accreditation, much less with student outcomes assessments in that context.
During this initial stage, orientation meetings resulted in frequent and focused discussions between the VPAA-AT and different campus leaders (such as deans, department directors, and others) about the differences between both types of accreditation. It was pointed out that program evaluation would use discipline standards developed by peers who are cognizant of the best practices. In these orientations, it was underscored that the evaluators were specialists in the field and should be regarded as colleagues. In this respect, the idea was to address the perception expressed by some faculty members that external evaluators would “tell you what to do.” It was important to clarify this common view shared by the university community early on.
Parallel to this process the VPAA initiated contact with the different accrediting agencies and gained more knowledge of the standards required from each organization. Consequently, various system-wide strategies were discussed and implemented to achieve the vision. The VPAA developed a Plan of Accreditation at UPR: Reaffirming the Culture of Evaluation, 2004, which included general guidelines for each of the program accreditation processes and a timeline for each phase. It also provided a chronology of actions to be implemented by chancellors, DAAs, and department chairs.
The sheer quantity of programs and services was daunting. Numerous assemblies were coordinated each week, and the Office of the VPAA held daily meetings and visits with professors and administrators from the different campuses. Groups meeting for the first time needed some time to learn how other programs worked. Professors and administrators had specific questions about how the standards applied to their programs, while simultaneously studying new ways to explain those changes to their units. Once administrators understood they were going to be in charge of this “new way of thinking,” it was clear they needed to read the new information carefully. At the same time, and in response to all these academic activities, the VPAA and VPAA-AT held continuous, frequent, and extensive meetings to review the effectiveness of the strategies and plan other options. At times it was a complex day-to-day challenge. Throughout the first year, the possibility of a crisis was never far from the minds of those in charge.
Running parallel to this learning curve, questions related to the budget were always on the table. In fact, there was a persistent need to frame the “change” in terms of the available fiscal resources.
In response to the complexity of the institutional structure, the different groups were encouraged to identify the best options to advance the vision. Once top-level management achieved greater precision in explaining how to improve performance by working with accreditation, they were better prepared to recommend effective strategies. An additional step that established the vision as an important objective and an integral part of the institutional plan was the Board of Trustees’ approval of the institutional policy regarding accreditation (Certification No. 138, 2003-2004, Institutional policy on accreditations).
4. Communicate the vision
In this phase, Kotter indicates a need to establish a link between the new vision and strategies for achieving it, as well as a process to teach new behaviors “by the example of the guiding coalition.”
As previously mentioned, the guiding coalition group was composed of professionals who understood the vision, established strong communication with each other, developed the required expertise, and could influence and lead the change process. During this stage, it developed some of the best strategies for effectively communicating the vision and acquired the capacity to break it down as it applied to each specific program. When the different working groups received practical and useful information about how to adapt the concept of accreditation to their program, they were convinced of how important it was for them to participate.
It was immediately evident that the large number of programs and their individual complexities required the project to be conducted in stages. Aside from the project’s system-level plan, a timeline for each program was prepared. This was very helpful for programs or services available in various units, as is the case for the teacher preparation and business administration programs, which are offered in eight units. In fact, work with the “common groups” of programs started during the first year of the accreditation project, and other programs were added on a yearly basis. The same situation was evident for the services: the libraries in each unit were the first service group to initiate the evaluation process.
The project was presented as a transformation in how to “think” about our programs, instead of a bureaucratic process carried out by a new campus office. In fact, it was clearly stated that program chairs had to assume a special leadership role; the Office of the VPAA would offer assistance so that they could acquire the necessary knowledge. Direct communication with a representative of each accrediting agency served to identify specialists who could provide immediate information to the different groups. As these groups moved toward newer and more specific topics, such as how student assessment information is relevant for program evaluation, their questions were immediately answered.
The vision of professional accreditation for all programs was communicated to different groups, including deans, program directors and committees working with the standards or with the assessment processes. Most of the meetings were coordinated, sponsored and conducted by the Office of the VPAA. As the vision was shared and communicated, the groups were presented with the support system available to adapt the process to each unique program.
Engaging the agencies and establishing a closer relationship was an important step. For this reason, the President or Executive Director of each accrediting agency was invited to visit the UPR. Their first meeting was always with the UPR President and the chancellors to discuss the agency’s requirements and the strategies for meeting them. It was an excellent opportunity for them to get an overview of the most important aspects that the agency would review and to ask specific questions about the procedures used.
The agency representatives then met with the deans and other working groups to explain their agency’s specific framework. During these initial meetings, they continuously noted how significant it was that the President and chancellors endorsed the project. Actually, they shared how different their experience had been at other institutions, where they needed to convince the President and chancellors of the importance of this initiative. Not only did these meetings provide the highest administrative level with a common language and better insight about the accreditation process, but they also strengthened the institutional vision at the campus and department levels. The fact that the information came directly from the agency representative sent a powerful message.
During these visits, several group meetings were also coordinated between program chairs, area coordinators, and specialists from the agencies. The VPAA and the VPAA-AT were always present, especially during the initial discussions to communicate the vision. It was a time to convey their support to the project.
To further demonstrate commitment to this vision and as part of the planning priorities, the President allocated a specific budget for accreditation. This decision helped stimulate, support, and motivate the university community to give serious consideration to this project. It was a time of economic constraints, yet it was encouraging to know that some help would be available to advance the ideas and recommendations they had identified. This is not to say that the university community was immediately taken with the idea of program accreditation, but frequent meetings and discussions between the Office of the VPAA and the chancellors and deans changed the mood surrounding the project as they became aware they would receive the needed assistance.
Another innovative strategy to communicate the vision was an online professional accreditation community of practice. The VPAA set up the structure and posted documents, such as Board of Trustees certifications endorsing the vision for the project, the timelines for the accreditation process for each program, the standards established by each agency, and the documents recommended by the different groups.
Communicating the vision and the strategies for achieving it was one of the most time-consuming phases of the project. For example, it was crucial that the information was constantly repeated to different groups at varying times, but it was also necessary to elaborate, with specific examples, how the final result would be superior to what was available. In order to effectively direct the project toward its goal, it was important to gain at least basic knowledge of the requirements and course of action of different agencies. At times, it was a laborious and lengthy effort, yet a worthwhile investment.
The vision for the project and the implementation strategies had to be clear for everyone on campus. It was also important for directors, faculty members and students working toward accreditation to receive the endorsement from chancellors, deans and the coalition group. In fact, the leading and coalition groups continuously showed their support and helped advance the agenda. They visited each of the units and met with groups of professors or committees working with the curricular revision. They constantly communicated the vision and established a better understanding of how to work with and measure our institutional effectiveness.
5. Empower others to act on the vision
In this phase, Kotter highlights the importance of removing or altering systems or structures undermining the vision. He mentions it is also crucial to encourage risk-taking and nontraditional ideas, activities, and actions.
The notion of implementing nontraditional ideas first came up during the frequent meetings between each accreditation coordinator and the campus committees, including individual and group meetings with department coordinators, faculty, and other committees on different campuses. The meetings encouraged the groups to examine their programs against the standards and to elaborate the change options required to meet the criteria. Aside from the opportunity to share, it was also a chance to propose common activities and collaborate on the task. In this process, the units gained an important understanding of different ways to work with the requirements.
Limited resources also played an important role in the change process. During the initial years, the programs typically commented that unless a specific budget was assigned, it would not be possible to achieve accreditation. They also recommended establishing a new administrative office for this purpose. In academia this is perhaps a natural reaction to a new topic. Although it was evident that extra funds were needed, framing accreditation as a matter of new offices or personnel could strongly interfere with the change progression. It was important to convince the groups that a change in thinking was not totally contingent on a budget and to encourage them to view their programs with a different mindset.
Subsequently, a specific budget was allocated for accreditation. As soon as the community was aware of this, the VPAA received many proposals justifying how accreditation would be easily accomplished with the assigned funds. Letters from the units frequently mentioned that the effort could easily fail if the monies were not assigned. It was interesting to note that most of the requests were based on the many years of experience of the program director or professor working at the institution, a clear sign of status-quo thinking. That is to say, the budget requests were mostly based on the personal or professional opinion of the professors or directors. Hence, it was a priority for the institution to provide another frame of reference for identifying budget requests and relating them to accreditation.
The next step was to establish, through the Office of the VPAA, close communication with professionals from each accrediting agency to understand how to work with the requirements. For example, in cases for which a specific academic content or student skill was required, the immediate response of the units was to develop a course and, if need be, recruit a professor with the missing expertise. In conversations with the agency, we learned that the content to be included did not necessarily require a three-credit course, but could be incorporated into an existing course, while the professor in charge could take some extra courses to be certified in the area. In fact, another option was to identify professors from other campuses to teach the specific skill. Thus, it was necessary to think outside the box and consider nontraditional ideas.
The continuous communication the VPAA sustained with the different agencies was very helpful in trying to propose new ways of discussing and addressing the issues at hand. In return, this close connection between the agencies, the different coordinators of the coalition group and the VPAA-AT encouraged the programs to create new and innovative ways of working. They felt empowered to propose different ways to be successful with the agenda.
In this regard, Kotter notes that different systems or structures should be developed to ensure that the vision is a priority. The need for a specific structure became evident in response to the first end-of-semester reports submitted by the programs. These suggested that, although many units were working intensively toward accreditation, the activities summarized did not necessarily respond to the standards or processes required by the accrediting agency. The programs reported many meetings and productive discussions on the importance of accreditation, yet there was no clear path toward the requirements. At this stage, the VPAA-AT, together with the coalition group, developed a monitoring structure to ensure that all programs were advancing according to their agency’s requirements.
Table 1 is an example of the monitoring phases outlined for the programs that were accredited by the Accreditation Council for Business Schools and Programs (ACBSP) as compared to their standards. The highest score (100%) was obtained when the unit received the official letter granting accreditation. The format also provided guidance to measure improvements, and the information was used for the monthly reports to the University Board and the Board of Trustees.
Table 1
Example of the monitoring phases for programs accredited by the ACBSP

| Phase | Percent completed |
|---|---|
| Review standards and compare to program | 10% |
| Send intention letter | 20% |
| Preliminary self-study visit questionnaire | 30% |
| Designation of a mentor by each agency | 40% |
| Develop assessment plan | 50% |
| Prepare data for two assessment cycles | 60% |
| Ensure approval of agency to submit self-study | 70% |
| Identify evaluation visit date | 80% |
| Coordinate the evaluation visit of the agency | 90% |
| Accreditation recommended by agency | 100% |
As soon as the monitoring phases were developed for each of the agencies, the units had a better idea of what was required at each step of the way. Reports submitted by the programs were more specific and the budget was carefully assigned according to how each one advanced through the phases.
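Read as a rubric, the Table 1 phases map each completed milestone to a fixed percent score, which made progress easy to compare across units and aggregate for the monthly reports. As a minimal sketch of how such a rubric could be operationalized, the snippet below encodes the ten ACBSP phases and computes each program’s percent completed; the unit names and phase records are hypothetical illustrations, and nothing in the original project implies that any particular software was used.

```python
# Minimal sketch of a progress tracker based on the Table 1 rubric.
# The unit names and phase records below are hypothetical, not UPR data.

ACBSP_PHASES = [  # ordered as in Table 1; index i maps to (i + 1) * 10 percent
    "Review standards and compare to program",
    "Send intention letter",
    "Preliminary self-study visit questionnaire",
    "Designation of a mentor by each agency",
    "Develop assessment plan",
    "Prepare data for two assessment cycles",
    "Ensure approval of agency to submit self-study",
    "Identify evaluation visit date",
    "Coordinate the evaluation visit of the agency",
    "Accreditation recommended by agency",
]

def percent_completed(latest_phase: str) -> int:
    """Map a program's latest completed phase to its Table 1 score (10-100%)."""
    return (ACBSP_PHASES.index(latest_phase) + 1) * 10

# Hypothetical monthly snapshot for two business administration programs.
snapshot = {
    "Unit A": "Develop assessment plan",         # 50%
    "Unit B": "Identify evaluation visit date",  # 80%
}
for unit, phase in sorted(snapshot.items()):
    print(f"{unit}: {percent_completed(phase)}% completed")
```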
Figures 4 and 5 show the progression of the teacher preparation and business administration programs up to the final date of accreditation. They show the time frame in which the programs achieved each phase of the accreditation process described in Table 1. This format was also used for the monthly reports to the University Board and the Board of Trustees.
At that point, the only accreditation still pending was that of the business administration program at the RUM campus.
[Figure 4. Progression of the teacher preparation programs through the accreditation phases]
[Figure 5. Progression of the business administration programs through the accreditation phases]
At the time this article was submitted for publication, 100 percent of the Business Administration programs were accredited.
The chancellors were a first-rate resource, for they were very committed to achieving program accreditation. They met with the different groups and frequently reviewed the status of each program. In addition, every month they shared this information with the Academic Senate. The DAAs were also instrumental in ensuring that the administrative structure of the campuses was integrated into this initiative. They communicated the vision, phases, and new directions to the campus community.
As Figure 3 shows, each unit designated a campus committee, which included the chancellor, the DAA, and the area coordinator for each program or service to be accredited. The DAA chaired this group and conducted monthly meetings, holding weekly meetings with those needing more help.
The VPAA sponsored at least one monthly meeting for programs offered on various campuses. For example, the chairs of the eight teacher preparation programs would meet with the accreditation coordinator. These meetings created a sense of unity among the participants, and the level of commitment of professors was extremely high. The ideas and documents shared helped clarify doubts and develop new strategies. As a result, an authentic community of practice was formed.
Very early on, these groups needed direct contact with the accrediting agencies to strengthen their understanding of the standards and the content of the conceptual framework. Therefore, during the first two years of the project, professors and program chairs, together with the coalition group coordinator, were encouraged to attend the yearly professional conferences or conventions. Upon their return, special meetings were organized to update and exchange information with members of their campus committees and the coalition group.
To empower others to act on the vision, they needed to feel comfortable and clearly understand the new concept and content of program evaluation. At the monthly meetings, the VPAA sponsored frequent work sessions for each coalition group and invited skilled and knowledgeable specialists from outside the institution to discuss issues and concerns. These specialists were carefully selected with the recommendations of the accrediting agencies. They had to be risk takers and innovators from similar programs who had achieved professional accreditation at their institutions. The nontraditional ideas and actions shared by them motivated the groups to innovate and consider new ways to achieve their goals.
The format for these work sessions, which at times lasted more than one day, included very few general lectures and more emphasis on specific tasks. We also made sure that the groups had an assignment, identified by the specialist, before each meeting to ensure they would bring detailed questions and innovative strategies. As new ideas were born, they were published in the online community of practice and shared with the entire university community. This also encouraged continuous discussion and comparison among programs. These work sessions also worked well for addressing conceptual questions, not only for learning specific techniques.
However, during the working sessions, some comments were made that undermined the process, especially during the first group meetings. To address these concerns, it was important that participants felt they received all the information needed in response to their questions and were able to share their concerns and comments with the group of DAAs. These comments helped all the DAAs elaborate ways to discuss similar issues within their units. As time progressed, such comments dwindled to a minimum.
Another source of resistance was the use of English to communicate results and prepare reports. The activities with external specialists were conducted in that language, and initially all went well. However, as the project advanced, it came to the attention of the Office of the VPAA that some units favored writing the draft in Spanish, the main language spoken in Puerto Rico. Hence, the VPAA recruited English translators to assist them, a strategy that proved to be an excellent option to energize the groups and eliminate hesitation regarding the project.
Finally, at the system level, the procedure for approving curricular revisions was simplified. The new certification for program creation and evaluation included a detailed guide to complete the process in less time. A curricular revision is critical for accreditation and is perhaps the most difficult task to complete. It entails a total assessment of the student profile and outcomes, including the sequence and content of the courses that the students take. It concludes with the endorsement of all program professors.
To empower is to encourage, and to do this the working groups needed the tools to be successful. It was of the utmost importance to keep the groups engaged, so they would view the Office of the VPAA as a support system, ready to collaborate and provide what was needed to complete the goal. The purpose of the VPAA was to ensure that each program or service was motivated to achieve the highest level of performance. It was not an easy task, because any subtle change would affect the complete system being built. In addition, most of the groups, as well as the chancellors and the President, trusted the Office of the VPAA to make the right decisions. At times, some decisions were extremely complex and split-second intuition was essential to make the right choice.
Looking back, it was helpful that the project director had previous experience working with other accreditation initiatives and with many other projects at the campus level. Also, most of the director's university experience involved working on a large and complex campus with many multidimensional offerings and activities.
6. Plan for and create short-term wins
At this stage the project should “engineer visible performance improvements, and recognize and reward employees contributing to those improvements.”
As the programs continued to comply with the standards, they received public recognition. They were congratulated when they submitted the letter of intent or the initial report to the accrediting agency, when they reviewed their mission or completed a curricular revision, and when they gathered data for the first cycle of the student outcomes assessment plans. These “small wins” were announced and celebrated at group meetings and in reports submitted to the President and the Board of Trustees. In many instances, the groups also received a congratulatory letter from the Chancellor or the DAA, with a copy sent to the program chair. Public professional recognition brought enthusiasm, and a healthy environment of positive competition grew among the groups.
In some cases, the reward for these achievements was to sponsor faculty participation in accreditation conferences or specialized meetings. In other cases, funds were awarded to meet program needs in accordance with the success of their assessment process. For example, if a program identified a need to update technology as a result of their self-study, funds were assigned for this purpose.
A new mindset emerged. Programs had to have a working plan in place and in-depth knowledge of what they needed. They were encouraged to work on new concepts and apply them innovatively to improve themselves. Aside from the fact that it was an opportunity to share their knowledge with other programs or services at the campus and system level, their greatest reward was the recognition they obtained through this process.
When the process was completed and accreditation was awarded, the President, the VPAA, and the Chancellor met to congratulate the groups and discuss new topics, issues, reflections, and projections. As soon as the accreditation letter arrived, the President would send two letters to celebrate the achievement: one directed to the group with a copy to the campus community and another to the university community at large (all campuses). This “big win” was duly acknowledged when the Board of Trustees issued a special certification to the campus and the general community congratulating the program for their efforts and explaining what their achievement meant for our institution.
Press releases were issued to keep the general public abreast of their achievements. It is interesting how this information also helped units to establish a point of comparison. For example, some of the largest and oldest units were surprised to discover that the smaller units had advanced in their accreditation, which injected a good dose of healthy competition.
As may be expected, the Office of the VPAA also changed, developing close contact with each agency regarding the specific information needed to advance the accreditation process. Programs were well aware of this active interest in learning what each agency expected. The role of the VPAA in communicating and exchanging this information with the chancellors and the DAAs placed everybody on the same page. As soon as the final accreditation visit was recommended, the VPAA informed the President, the Board of Trustees, the Administrative Board, the Chancellors, and the DAAs.
From an administrative perspective, the Office of the VPAA was careful to define and discuss its role vis-à-vis the chancellors and DAAs. The VPAA reports to the President and the Boards, but in working directly with the programs, it was important to likewise work closely with the chancellors and deans. In essence, the delicate question of who was in charge of the programs was automatically answered when the campuses started to see the benefits of working together. The Office of the VPAA defined the best working relationship for the success of the programs, and the project was fortunate to have competent chancellors who strongly endorsed the initiative.
Tables 2 and 3 present the number of programs and services that were accredited for the first time, from 2004 to 2014, by campus and accrediting agency. A total of 117 programs were accredited; by 2014, 81 percent of the 144 participating programs had received accreditation for the first time. This total increased to 89 percent when the Business Administration Program of the Mayagüez campus (RUM) was accredited by ACBSP after 2014.
This was an extraordinary success for the UPR because it sent a strong message that the institution could improve its profile and attain higher levels of performance.
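As a quick arithmetic check, and assuming the 144-program baseline identified at the outset of the project, the 81 percent figure follows directly from the totals reported in Table 2:

\[ \frac{117}{144} \approx 0.8125 \approx 81\% \]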
Table 2
Total of UPR programs accredited, by campus and by accrediting agency, from 2004 to 2014

| Unit | Total programs accredited |
| Río Piedras | 23 |
| Mayagüez | 12 [2] |
| Humacao | 10 |
| Cayey | 16 |
| Arecibo | 10 |
| Bayamón | 13 |
| Ponce | 11 |
| Aguadilla | 9 |
| Carolina | 5 |
| Utuado | 4 |
| RCM | 4 |
| Total | 117 |

| Accrediting agency | Programs accredited |
| AACSB (ADEM) | 11 |
| ACBSP (ADEM) | 27 |
| ACBSP (SOFI) | 11 |
| ACEJMC (Comu) | 4 |
| ACS (Chem) [1] | 4 |
| ABET-CAC / ABET-ETAC (Comp / Tec. Ing.) | 12 |
| NCATE (Educ.) | 36 |
| Unique (New) | 12 |
| Total | 117 |

[1] The accreditation process for the Chemistry programs was initiated. Of the four UPR programs, two are accredited (RRP and RUM). The other two have submitted the pre-condition documents, which are under review by the agency, and are actively working on new facilities, equipment, and space. As part of its definition of academic excellence, ACS requires a high level of technology and adequate facilities before a program may apply.
[2] Nine additional programs were accredited by ACBSP after 2014 and are not included in this total.
As shown in Table 3, by 2014, 60 percent of the 102 services participating in the project had received accreditation or external evaluation for the first time. This was a remarkable paradigm shift in how the services viewed themselves within a higher education institution. It was an opportunity for them to demonstrate their excellence on the same footing as the academic programs they serve or support.
Table 3
Total of UPR services accredited or externally evaluated, by service type and agency, from 2004 to 2014

| Service (accrediting agency) | Susceptible to accreditation or external evaluation, 2004-2005 | Accredited or certified as part of this initiative | Percentage |
| Libraries, ACRL (ALA) | 14 | 14 | 100% |
| Periodicals (LATINDEX) | 62 | 26 in Catalog (36 in Directory) | 58% |
| Counseling centers (IACS) | 11 | 7 | 63% |
| Museums (AAM) | 7 | 1 | 14% |
| Preschool centers (NAEYC) | 7 | 2 | 29% |
| Maternal Day Care Center | 1 | 1 | 100% |
| Total | 102 | 61 | 60% |
The UPR has four Chemistry programs, two of which are accredited. The other two have initiated the process and are constructing new facilities and purchasing the required equipment to meet ACS guidelines; both have received the initial endorsement of the accrediting agency. ACS is one of the few agencies that, given the nature of the discipline, requires a high level of technology and facilities before accreditation.
In 2009, the Board of Trustees appointed a new president. For a moment, there was uncertainty about how the project would continue, or whether it would continue at all. However, the new administration endorsed it and kept working with the different groups, probably because by then the groups were convinced that the vision was relevant to their purposes. This administration was in charge until 2013, when a new president was designated. It may be a recurring theme, but the continual change (every four years) of the administration at an institution of higher education can create uncertainty. Nevertheless, this was one of the few system-wide projects that continued throughout this timeframe; hence, many programs and services are still receiving accreditation.
7. Consolidate improvements and produce more change
As recommended by Kotter, at this stage the organization should use the “increased credibility from early wins to change systems, structures and policies undermining the vision.” This includes the process to “hire, promote, and develop employees who can implement the vision and reinvigorate the change process with new projects and change agents.”
To assure the continuous advancement of the vision, various new strategies were implemented to motivate the participants to work together to create and maintain change and support a “culture of assessment and institutional evaluation” as set forth by the UPR 2006 Agenda for Planning, the institutional strategic plan.
The position of VPAA-AT was critical to ensure that the units always had direct contact with someone who could assist them, because units frequently needed a strong and permanent contact to deal with day-to-day situations and questions. In addition, recurring monthly conversations with all groups served to constantly monitor their specific needs. In fact, frequent contact among all groups was an important factor at all levels. For example, the almost daily meetings between the VPAA-AT and the VPAA helped integrate the accreditation process into the program creation and evaluation continuum.
Overall, the institutional culture of each program changed according to the requirements established by its accrediting agency and the results of the evaluation visits. Program and student assessment, common topics that impacted all programs, improved, and many of the individual changes required for re-accreditation remain in place. Furthermore, this work has helped with the institutional accreditation process for each campus. Administrators, professors, and students have shared the unique satisfaction of attaining accreditation and establishing this priority as something to sustain.
The initiative is very much alive today. All the certifications related to program creation, accreditation, and evaluation are still in place, and many programs continue to receive the required support to sustain their accreditation. To minimize the continuous revision of different certifications by the Board of Trustees, which requires a tedious, long, and complicated process, the VPAA recommended an addendum in the form of a guide. This guide is meant to be updated regularly and includes new content that the programs should take into consideration.
The monthly accreditation meetings of the common groups pursuing accreditation by the same agency established strong working relationships. They continuously discussed the mission and vision of the programs and the connection between the curricular sequence and the required standards. After one of these monthly meetings, it became clear that some programs had progressed more than others and were ready to request the official visit. As a result, the Office of the VPAA individualized the group meetings and modified the timeline to allow these programs to continue at their own pace.
It is interesting to note that group cohesion was so strong that individual achievement or change took some time to be accepted. It was a concern to see that programs achieving the standard requirements earlier were willing to delay their progress so as to keep the group together. The VPAA carefully insisted that when they received their accreditation they would return to the group to facilitate and assist the others. In fact, as programs or services advanced in the accreditation process or were accredited, professors from each group were selected to work with the programs needing special assistance on a one-to-one basis. These professors reinvigorated the change process and to this day continue to direct the vision and change effort.
To ensure a “good win”, an innovative capstone exercise was implemented for programs close to the official accreditation visit. First, they would participate in a simulated or mock visit, which included only personnel from the institution: for example, directors of other programs or special evaluators with specific knowledge. As a second step, an external evaluator from the agency would visit the program and make final recommendations before the official evaluation visit. The opinion of this external evaluator did not necessarily mean the program would receive accreditation, but it provided useful information to help decide whether the program was ready.
The mock visits included an external committee that would evaluate the program according to the required standards and procedures of each agency. At the end of the visit, this group would meet to discuss their impressions. In the external simulated visit, the committee was composed of carefully selected faculty members from similar UPR programs, members of the VPAA accreditation team, and one or two external consultants from the accrediting agency.
The external consultants assigned to an evaluation team waived their right to participate in other official accreditation visits to our institution, but they were very enthusiastic about the opportunity to collaborate at this stage. Each was designated chair of the visiting committee and had experience as a visiting-team evaluator. The fact that the evaluation committee chair was English-speaking gave the committee members an opportunity to understand and practice the technical language related to the standards.
Local mock visits were conducted in a similar manner but included only persons from the island; in other words, without external consultants from the accrediting agency. This exercise was used when an appointment with the external-agency specialist could not be scheduled, or when the first mock visit was completed but we wanted to be sure its recommendations were implemented before the official accreditation visit. At times the process was nerve-racking for the first programs that were ready to receive the evaluation. This was also the case for the VPAA, since it was evident that the hard work of many years could be instantly lost if the first programs were not accredited. The units were equally under pressure to organize their self-study report and presentation to ensure accreditation, since none of the programs wanted to be the first to fail the accreditation visit.
Both external and local mock visits proved to be extremely helpful in many ways. They provided the groups with a proper understanding of the priorities to consider and a sense of empowerment, which strongly invigorated the change process and produced more change. In fact, this was the starting point for the formation of the community of practice that some groups still maintain. These groups shared their commitment, compared their data, and identified innovative ways to examine and ensure common strengths.
The mock visits helped the programs feel more confident about the official evaluation visit. In fact, thanks to this strategy, the first programs to request the official visit received accreditation from their agencies. Although the requests for the official visit came from the programs, the Office of the VPAA endorsed these requests after the local or external mock visit was completed.
Mock visits also gave the Office of the VPAA the chance to help and assist in specific areas. For example, for one program that had worked intensively for two years, the mock visit concluded that the assessment process was still very fragile. Initially, some members were not concerned and indicated that the program would be accredited because it complied with the rest of the standards. Nevertheless, an external evaluator with expertise in designing and implementing an assessment process, as required by the agency, was invited to strengthen the area. During a three-day visit, this evaluator, together with the professors, incorporated new ways to ensure that the program had strong data to present. Programs from other campuses also participated in these visits and had a chance to review their assessment plans. The strategy worked well for all.
Another change promoted at the central level was the role of the Campus Institutional Planning Offices in gathering evidence for the assessment process. Each campus has a Planning Office in charge of institutional assessment, but the only data available were for campus-wide evaluation and for some of the programs already accredited. Thus, the Office of the VPAA recommended that this staff gather basic program information that could be useful for professional accreditation. The fact that in most campuses this unit responds to the DAA or is part of the Office of the Chancellor encouraged discussion regarding different ways to assist the programs. These Planning Offices were instrumental in helping design and develop different formats for data gathering.
Data are essential for demonstrating strengths, and the institution needed to reinforce this concept. In line with this view, the Board of Trustees issued Certification No. 136, requiring programs to gather and actively use assessment data as part of their continuous evaluation (Board of Trustees, 2003-2004, Policy on Institutional Research).
This integrated attitude reminds us how important it is for institutions of higher education to manage change as a complex process. Duck (1998) observes that companies:
Keep breaking change into small pieces and then managing the pieces… But with change, the task is to manage the dynamics, not the pieces. The challenge is to innovate mental work, not to replicate physical work. The goal is to teach thousands of people how to think strategically, recognize patterns, and anticipate problems and opportunities before they occur. (p. 57)
It was important to encourage active participation of prospective employers and alumni in different committees. To obtain their input, various advisory boards were created at the campus level. As a result, challenges and future developments in the fields were identified faster and with greater precision. This was a significant step that helped shape program requirements and develop new and relevant goals.
Prospective employers and alumni appreciated the fact that they were asked to contribute to UPR’s development. More importantly, a strong message was sent to the programs to consider and analyze how they could meet the particular needs of the community and Puerto Rico at large within the context of a comprehensive approach. It was a chance to take a different outlook on the institution, for at that moment the views of the individual campuses or units were secondary to a more global perspective.
Student outcomes assessment data from the various programs were contrasted with the rigorous standards established by accrediting agencies, and a new project developed from this analysis. As the units gathered data to demonstrate their effectiveness, they started to compare their results with those of other programs in Puerto Rico and the United States. The Office of the VPAA decided to sponsor the administration of standardized tests prepared by external agencies for some of the programs. For example, the College Board provided an assessment process for our teacher preparation programs, which helped gather information to compare UPR students with others in a national and global context. Similarly, the Office of the VPAA sponsored and endorsed any idea that would help programs evidence their commitment to improvement.
However, for some units, comparison with other programs at the national level initially had the opposite effect. For example, the chemistry professors were one of the few groups that, even before the project started, met every semester to consider ways of improving their curriculum. In fact, curricular revision was a regular topic of conversation and student outcomes assessment was a priority, yet they were initially apprehensive about including or comparing their test results with scores at the national level. After much conversation and reflection, they decided to participate in this benchmarking because it would be favorable for the programs and the students in the long run.
As the programs analyzed their assessment results, they realized they could participate in national and international discussions with the groups that set the agenda and standards for them. By then, they had mastered the precise language, developed the expertise, and gained the added advantage of applying it to different cultural and academic environments. They could now share this experience with their peers at other institutions.
In sum, a complete cycle was achieved for some. Such is the case for teacher preparation programs, which were accredited for the first time and continued to improve their assessment plan to ensure re-accreditation. At this point they have identified the importance of instituting a yearly process of gathering and discussing relevant information about their strengths and needs, as well as student outcomes, which is vital for the next accreditation cycle. In this regard, their programs have improved tremendously.
The credibility of this project came from the initial accreditation of many programs. Overall, it provided the opportunity to think about the institution differently. Many chairs and professors, as well as administrators, learned new ways to view the strengths of their programs and were engaged in the change process.
This was not an overnight achievement. It took many years to complete the required curriculum revisions in order for programs to change and achieve accreditation.
8. Institutionalize new approaches
In this final stage, it is important to articulate connections between new behaviors and corporate success and establish a projected vision consistent with the new approaches. According to Kotter, there are two important factors for institutionalizing change. One of them is: “the conscious attempt to show people how the new approaches, behaviors, and attitudes have helped people improve performance.”
The assessment process mentioned above gave way to new projects. Several programs worked on major curriculum revisions that needed the approval of the Board of Trustees, a process that took close to five years from the time a unit submitted the review to its campus Academic Senate. To ease and accelerate the process and consolidate change, the VPAA designed a new approval structure. The VPAA-AT monitored the required curricular revisions and discussed with the VPAA what each entailed according to the agency’s standards. Since this information was then available at the Office of the VPAA, the revision approvals advanced at a faster pace. Throughout the accreditation process, and as a result of the internal and external evaluations, over 470 minor curricular changes were approved for numerous programs.
The continuous orientations and workshops offered to the faculty emphasized the content needed for the accreditation of programs or services. Thus, a considerable number of professors at each campus learned exciting ways of evaluating their programs. Linking the content required for accreditation to each program’s profile brought changes to courses and assessment data. For example, one of the accreditation agencies proposed a curriculum with 60 percent of the content in general education courses. The curriculum at the time had a much lower proportion, which compelled professors to meet and discuss the drastic change that was expected. Most program groups that started to discuss these changes and the ways to achieve them are now accredited.
As time went by and more programs and services completed important stages toward accreditation or received notice of accreditation, system-wide program meetings were held to reinforce the fact that reaccreditation was an ongoing institutional process linked to the new UPR culture of evaluation. Groups from all eleven UPR campuses and from similar programs or services have continued to work and communicate with each other. Some of these groups have established permanent electronic channels to share different professional topics. These learning communities have built ongoing friendships and professional exchanges among local and international peers that are still active today.
In May 2009, the UPR celebrated the Achievements of the System-wide Program and Service Initiative Conference. Professors and employees from programs and services of the eleven campuses that participated in the project met with other professors, academic deans, and chancellors to discuss the influence and effects of this undertaking. The accreditation committees shared their reflections on many topics, including how they worked with the accreditation process, the importance of faculty involvement and student participation, student assessment, and effective strategies to complete curricular revisions. They emphasized the achievements obtained in terms of the quality of the services or programs and their graduates. The UPR President also presented the effort as an on-going process of institutional self-improvement and one of the best academic goals any institution of higher education could have. In addition, UPR implemented a media tour to showcase the project and its results to the general public. Local newspapers and television channels interviewed the UPR President, the VPAA, the VPAA-AT and the presidents of some accreditation agencies.
As more programs and services were accredited, the need to start working on the reaccreditation cycle was evident. After the formal visit, programs received recommendations that required continuous work and a systemic overview. This also reinforced the fact that accreditation is an ongoing process of improvement. New behaviors and updated contents were instrumental for the program’s future success.
As the project advanced, some accrediting agencies began to invite UPR specialists to participate in US accreditation teams. For example, NCATE invited two UPR professors (a first for Puerto Rico) to be part of the Board of Examiners. After initial training, these professors continue to take part in the evaluation of teacher preparation programs in the US. This is the best example of a win-win situation for all: UPR participates in the national and international educational agenda, and the agency can readily show how it values diversity.
Furthermore, another system-wide change resulting from the accreditation agenda was a new faculty recruitment policy. After the Board of Trustees issued Certification No. 15, 2006-2007, on the necessary conditions to hold a teaching position at UPR, recruitment now considers specific qualifications aligned with accreditation standards, such as student outcomes assessment, research, and community service. Moreover, the Board of Trustees amended the UPR General Bylaws to require a doctorate or terminal degree for new professors or researchers. As a result, 92.5 percent of the faculty recruited during the 2008-2009 academic year held a doctoral degree, which raised to 63.5 percent the overall proportion of professors and researchers holding such a degree in their discipline. This certification is still in effect.
Transfer programs were another area that improved as a result of the program consistency among campus units that went hand-in-hand with accreditation. Students in these programs begin their studies at one campus and, after completing the required credits, may register at the campus that offers the complete BA program to finish the requirements and graduate. In 2003-2004 there were 92 transfer programs at UPR, but only three of them (just 3 percent) were articulated. As part of the accreditation process, the DAAs worked to ensure a smooth transfer for all students within the university system. As a result, 100 percent of the programs have been articulated.
The libraries are also an excellent example of one of the most profound changes at the institution. They completed their unit evaluation and showed great leadership by requesting that the Board of Trustees approve a policy requiring continuous external evaluation every five years by the American Library Association (ALA). The result was the approval of Certification No. 38, 2009-2010, a policy for the institutionalization of the external evaluation of UPR libraries “with the quality standards established by the profession.” The Board recognized this was a unique request that would strengthen the culture of planning, evaluation, and assessment for UPR libraries and have a positive impact on programs in general.
The best evidence of the importance of this change is the fact that it is still active today. It is quite common at UPR, and probably at other public institutions of higher education, to disregard previous initiatives when a new president is designated. Yet the subsequent administration recognized the relevance of this initiative, and although funds were significantly reduced due to economic hardships, each campus program and service took the internal measures needed to continue the process. This also demonstrates faculty commitment to the project. After 2009, programs and services continued to receive accreditation and support the change.
Conclusions and observations: Implementing change
The purpose of this paper is to share the process, reflections, strengths, and challenges that were instrumental in the success of the Project for the Professional Accreditation of Programs and Services. The model developed by Kotter in Leading Change (1996) was used as a conceptual framework to view our change process. As is evident, the transformation process at UPR was effective and resulted in the professional accreditation of most programs and services. Yet the purpose of the project was to work with accreditation as a means to implement a different way of viewing programs and services. The institution now has clear and strong evidence demonstrating that our programs have the minimum characteristics of excellence needed to compete with first-rate programs at other institutions of higher education.
For six years we worked intensely with program and service accreditation to change and improve the culture of evaluation at our institution. Through orientations and workshops, faculty members and middle- and upper-level administrators developed the behaviors needed to succeed in this academic endeavor. Curricula and student outcomes also improved. Specific strategies that advanced the project included gathering data for assessment, speeding up curriculum revisions according to accreditation standards, and aligning faculty qualifications with those standards.
The Project was also a chance for the accrediting agencies to critically analyze how their standards would apply in a complex public higher education system with another language and culture. For many of them, it was the first time they participated in such an undertaking, with simultaneous accreditation of all programs. They were instrumental in identifying excellent external evaluators to work with the various groups. Intensive discussions were also held with agency representatives on how to interpret their standards, including the topic of diversity, which took on a new perspective in our academic and cultural context. For example, many agencies have expressed satisfaction in receiving more international requests for accreditation, yet the basic language for all processes is still English only, and for some, training second-language evaluators has been slow. Also, although we are aware that some cultural differences might exist, the agencies have not elaborated on how to weigh this heterogeneity against their standards.
As proposed in Kotter’s first stage, the UPR identified professional accreditation as a requirement to compete effectively in the international educational arena. Program accreditation was a great way to respond to the challenge of maintaining an academic culture that continually renovates and updates itself to better serve the community and the student population. It was a process that facilitated the discussion of hard data and the active use of specific plans and interventions to guide informed decisions, thus ensuring the necessary transformations.
The powerful guiding coalition that was established proved instrumental in the change process. The UPR assembled system-wide, multitasking, multilevel groups to strengthen and support the culture of evaluation in light of the accreditation process. These groups were committed to excellence and led the change effort. Chancellors and DAAs showed great leadership at the campus level by endlessly answering questions, creatively solving issues as they arose, and endorsing the initiative as part of their academic agendas.
There is an important difference between our change process and Kotter’s model. In our case, the vision was created, stated, and endorsed, as one process, at the beginning of the project. We believe that this integration was a major asset for establishing a clear direction and helped move the guiding coalition at a faster pace. From the campus chancellors to the faculty members of the programs susceptible to accreditation, the project’s vision and the strategies for its implementation were clear. This clarity helped them complete the specific tasks they were assigned.
Upon reflection, we understand that the intercampus meetings of similar programs, which we actively promoted and facilitated, provided a great opportunity to establish professional contact among professors, students, and program directors. These meetings encouraged participants to share new ideas, best practices, innovations, and documents to ease the accreditation process. As a result, a community of practice was formed and continues working to this day.
The criteria to select group leaders were another decisive factor for success. One of the essential qualities was a natural ability to work with others and prior experience in advancing a complex agenda at a public institution of higher education. The leaders selected to work on the project at the campus level attracted and motivated faculty members to get involved in the accreditation process and empowered them with innovative ideas and actions.
A healthy competitiveness took place among the programs and services participating in the accreditation process. All groups were constantly interested in information about the different stages achieved by other campus programs. This was probably related to the fact that each accreditation stage was clearly identified and that every available opportunity and strategy to communicate success was used. These included official letters to participants and the academic community, presentations at periodic meetings, and monthly reports to the University Board and the Board of Trustees.
The recognition and reward system proposed by Kotter and implemented in this project contributed to unexpected accomplishments. The first goal stated that by 2012-2013 all programs and services would have started the accreditation process. This was achieved in 2009. Moreover, by 2014, 81 percent of the 144 participating programs received accreditation for the first time. During the same period, 60 percent of the 102 participating services obtained accreditation or external evaluation.
To say a program or service has been accredited does not account for the effort and energy it took to accomplish it. Accreditation is not an individual process. In our case, it was a collaborative decision-making effort that for some programs required endless hours of hard work: moving from one campus to another for meetings, reviewing every decision, sharing decisions with faculty and students, and addressing very complex and difficult questions. We are in awe of all the challenges that the programs addressed. Surely their commitment to improving academic life at the institution is a fundamental reason for the accreditation achievements.
The project for the professional accreditation of programs and services and the external evaluation initiative of all programs impacted our institutional profile. UPR now has the tools and the know-how to objectively compare itself with some of the best institutions of higher education.
References
Board of Trustees of the University of Puerto Rico. (1993). Norms and guide to the creation and revision of programs at the University of Puerto Rico (Certification No. 113, 1992-1993). Río Piedras, PR: University of Puerto Rico.
Board of Trustees of the University of Puerto Rico. (2004). Institutional policy on accreditations (Certification No. 138, 2003-2004). Río Piedras, PR: University of Puerto Rico.
Board of Trustees of the University of Puerto Rico. (2004). Policy on institutional research (Certification No. 136, 2003-2004). Río Piedras, PR: University of Puerto Rico.
Board of Trustees of the University of Puerto Rico. (2006). Certification to amend the necessary conditions to hold a teaching position at UPR (Certification No. 15, 2006-2007). Río Piedras, PR: University of Puerto Rico.
Board of Trustees of the University of Puerto Rico. (2006). Norms of the creation of academic programs at the University of Puerto Rico (Certification No. 80, 2005-2006). Río Piedras, PR: University of Puerto Rico.
Board of Trustees of the University of Puerto Rico. (2007). Norms for the periodic evaluation of academic programs at the University of Puerto Rico (Certification No. 43, 2006-2007). Río Piedras, PR: University of Puerto Rico.
Board of Trustees of the University of Puerto Rico. (2009). Policy for the institutionalization of external evaluation of UPR libraries (Certification No. 38, 2009-2010). Río Piedras, PR: University of Puerto Rico.
Council for Higher Education Accreditation (CHEA). (2006).
Duck, J. D. (1998). Managing change: The art of balancing. In Harvard Business Review on Change. Boston, MA: Harvard Business School Press.
El-Khawas, E. (2001). Accreditation in the USA: Origins, developments and future prospects. Paris, France: International Institute for Educational Planning, UNESCO. Retrieved from http://unesdoc.unesco.org/images/0012/001292/129295e.pdf
Hiatt, J. M. & Creasey, T. J. (2003). Change management: The people side of change. Loveland, CO: Prosci.
Kotter, J. P. (1996). Leading change. Boston, MA: Harvard Business School Press.
Kotter, J. P. (2008). A sense of urgency. Boston, MA: Harvard Business Press.
Mento, A. J., Jones, R. M., & Dirndorfer, W. (2002). A change management process: Grounded in both theory and practice. Journal of Change Management, 3(1), 45-59.
Morley, J. & Eadie, D. (2001). Leading change. In J. Morley & D. Eadie, The Extraordinary Higher Education Leader. Washington, DC: National Association of College and University Business Officers. Reprinted with permission by the American Council on Education Department Leadership Project. Retrieved from http://www2.acenet.edu/resources/chairs/docs/Morley_LeadingFMT.pdf
Paton, R. A. & McCalman, J. (2000). Change management: A guide to effective implementation. Thousand Oaks, CA: Sage Publications.
University of Puerto Rico. (2006). Ten Challenges for 2006-2016: An agenda for planning. Río Piedras, PR: Office of the President.
Wedge, C. (2006, Nov.-Dec.). Leading change: An exploratory process. EDUCAUSE Review, 41(6), 10–11.
List of accrediting or certifying agencies:
- ACBSP: Accreditation Council for Business Schools and Programs
- AACSB: Association to Advance Collegiate Schools of Business
- ACEJMC: Accrediting Council for Education in Journalism and Mass Communication
- ACS: American Chemical Society
- ABET-CAC: ABET Computing Accreditation Commission
- ABET-ETAC: ABET Engineering Technology Accreditation Commission
- NCATE: National Council for Accreditation of Teacher Education
- ACRL-ALA: Association of College and Research Libraries-American Library Association
- LATINDEX: Regional Cooperative Online Information System for Scholarly Journals from Latin America, the Caribbean, Spain and Portugal
- IACS: International Association of Counseling Services
- AAM: American Alliance of Museums
- NAEYC: National Association for the Education of Young Children