Integrated planning, strategic alignment and metrics


By Dr Kevin Downing
Director Knowledge, Enterprise and Analysis
City University of Hong Kong, Hong Kong

Developing and writing strategy for higher education institutions (HEIs) involves an increasingly challenging set of tasks which requires careful external and internal environment scanning, and substantial consultation with stakeholders (at all levels) to achieve consensus. However, many strategic plans still manage to fail, irrespective of how carefully they are conceived and written or how extensive the stakeholder consultation process. The author argues that this is largely a failure of “implementation”, a term that is not generally well understood in many organisations, and one which is inextricably linked with the proper use of appropriate outcomes, management “metrics” and performance indicators. The solution involves adopting an integrated and evidence-based approach to planning. Drawing on experience from a range of institutions, this article outlines the whole process step by step and provides some in-depth advice together with unusually detailed examples of how “metrics” can be used to promote integrated planning and maximise the chances of successful strategic implementation.

The planning process

In order to be effective, and to ensure the campus and the various stakeholders are focused on the task, the planning process must be clearly outlined from the outset, and should identify what choices are available given the information gathered from any benchmarking exercises and the external and internal environment scans. Choices can then be made in a more informed way, taking into account considerations relating to cost, and encouraging a shared commitment to allocate the resources necessary to achieve the identified strategic goals or outcomes. This latter point is particularly important because inevitably some areas will be prioritised whilst others will not.

Most strategic planning processes in higher education institutions will contain a number of fairly simple steps, and these can form the basis for putting together a more detailed planning outline. They will typically include forming a central strategic planning group, undertaking benchmarking against selected peers (the various rankings can be a useful starting point here), identifying key strategic areas or themes, scheduling meetings for broad consultation, engaging in external and internal environment scanning, performing gap analyses, setting measurable strategic outcomes or goals, drafting the plan and undertaking further consultation. The strategic plan is then typically finalised, published and actively disseminated amongst some stakeholders, whilst others are told where the plan might be accessed. Colleges, schools and departments are then typically asked to provide their own strategic plans which align with the identified institutional goals. Sadly, for some HEIs this is where the process falters because of a lack of clarity about how the plan will be implemented and monitored.

An evidence-based approach to strategic planning

A major component in identifying the right areas for strategic focus, and ensuring maximum impact on future performance improvement, is the possession of accurate information, comprehensive internal datasets and appropriate benchmarking data. It is then possible to identify which specific areas of the organisation or operation are in need of particular strategic attention. Consequently, it is essential that institutions have adopted, or decide to adopt, an evidence-based approach to strategic decision making and planning. The existence of “hard” data also makes it much easier to convince colleagues and stakeholders of the value of particular focus areas. If an evidence-based approach to planning has not hitherto been adopted, external environment scanning is a good place to start.

External environment scanning

External environment scans provide information about what is happening outside the institution which might significantly impinge on its operation now or in the future. They typically contain both quantitative and qualitative data which help management develop a clearer understanding of the context in which they currently operate, and of what is likely to change in the next few months and years.

An example of qualitative external environment scanning data would be an impending change in government policy, perhaps related to important arts-related or engineering infrastructure projects, which earmarks more funding for certain growth areas in research or teaching. A useful mnemonic for common headings to consider when engaging in external environment scans is TEMPLES, where T=technology, E=economy, M=markets, P=politics, L=law, E=environment and S=society. An example of quantitative data relating to markets would be the recent significant rise in annual fees for UK universities and the potential impact this might have on international and local student recruitment during a time of global financial constraint. This information is useful to universities conducting external environment scans outside the United Kingdom (as well as those within) because rapid and significant increases in fees in one country can mean a new opportunity to recruit students from that country as international students. Clearly, external environment scans should be restricted to those areas most likely to impact on the institution during the period covered by the plan, usually the next three to five years. In many ways, an external environment scan is an external risk assessment which assists planners in identifying threats and opportunities, matching these to their HEI’s particular strengths and weaknesses, and then planning appropriately to maximise the benefits and reduce the risks to the institution.

Internal environment scanning

Similarly, internal environment scanning is a frank, in-depth look at where the institution currently stands in terms of its core business, infrastructure, finances, and position relative to peers and any aspirational benchmark HEIs it might wish to identify. Access to robust, verifiable and relevant data is a pre-requisite for success, and those institutions with an established and trusted institutional research or analysis office have a head start over the competition. Identifying appropriate benchmarks and engaging in thorough “gap analyses”, together with scrutiny of benchmarks’ strategic plans, is a good place to start. A gap analysis involves comparing data from benchmark institutions, in certain selected key areas of interest, with similar datasets from the home institution. Gaps in performance can then be identified and rectified by looking at how the benchmark institution organises that task or area of core business. This can be done at the level of something relatively simple like rankings criteria, or can involve a more complicated process of reviewing policies relating to staff appraisal or quality assurance.
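To make the idea concrete, the short Python sketch below shows one way a simple gap analysis might be computed once benchmark data have been collected. The indicators and figures are entirely hypothetical and chosen for illustration only.

```python
# A minimal gap analysis sketch: compare the home institution with a
# benchmark institution on a few selected indicators.
# All indicator names and figures are hypothetical.

home = {
    "citations_per_fte": 3.2,
    "phd_completions_per_fte": 0.4,
    "international_students_pct": 9.0,
}

benchmark = {
    "citations_per_fte": 4.1,
    "phd_completions_per_fte": 0.6,
    "international_students_pct": 14.0,
}

def gap_analysis(home: dict, benchmark: dict) -> dict:
    """Return the absolute and relative gap for each shared indicator."""
    gaps = {}
    for indicator in home.keys() & benchmark.keys():
        absolute = benchmark[indicator] - home[indicator]
        relative = absolute / benchmark[indicator]
        gaps[indicator] = (round(absolute, 2), f"{relative:.0%}")
    return gaps

for indicator, (absolute, relative) in gap_analysis(home, benchmark).items():
    print(f"{indicator}: gap = {absolute} ({relative} behind the benchmark)")
```

The output simply shows where, and by how much, the home institution trails its benchmark; in practice each gap would prompt a closer look at how the benchmark institution organises that area.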

“Buzz” groups can then be organised involving deans and heads, staff, students and other stakeholders, because these provide for lively consultation on any identified working themes and help ensure a thorough 360-degree appraisal of the current strengths, weaknesses, opportunities and threats impinging upon the organisation and its operation. They also increase the likelihood of later stakeholder support for the final strategic plan because those consulted feel more ownership of the eventual goals and outcomes. In other words, the process usually leads to more “buy-in”. Working themes or strategic areas will probably be proposed at the outset, as will a set of core values guiding the process, but these are often modified or rejected as a result of the buzz-group-style consultation process. The more extensive this consultation is, and the more stakeholders that are involved, the more likely the appraisal or internal environment scan will yield accurate, triangulated quantitative and qualitative data to inform the planning process. The following diagram (figure 1) illustrates the typical strategic planning process outlined above.

Figure 1: Diagrammatic summary of strategic planning process

During the whole of the process outlined in figure 1, the value of evidence-based planning is emphasised, together with the central place that the academic vision and mission holds in focusing and driving campus decisions. In fact, the beginning of the strategic planning process is a good time to review vision, mission and core values to bring them into line with the desired direction of the institution and any requirements of the funding body or government. These can then be re-visited during the consultation process to ensure the identified strategic outcome areas are aligned.

From planning to action: implementation and alignment through integrated planning

Most strategic plans fail, not because they are poorly conceived or written, but because the planning process does not consider from the outset how the plan will be implemented and monitored, and whether the institution has the necessary administrative structures to monitor and drive the alignment and implementation process. In other words, the planning and implementation are not integrated. Integrated planning involves change management, and this can only be demonstrably successful when measures or “metrics” are in place to help determine progress towards the stated strategic goals, and when those goals are set bearing in mind the need to monitor them. Therefore, they are best stated as “outcomes” which can be directly monitored and assessed. Units within the HEI can then model the strategic outcomes they seek so that they “constructively align” with the centrally determined outcomes. Performance indicators can then be designed to align with institutional outcomes so that progress in terms of strategic implementation can be directly monitored, staff incentivised and potential problems identified early. This involves significant additional effort and upheaval because an effective planning process, by definition, focuses on moving the institution forward, and this necessarily involves broad-scale changes on any given campus.

The macro-level: strategic integration and motivation

In summary, for a strategic plan to be truly integrated, and to stand any realistic chance of being implemented, it must not only complete the planning process along the lines outlined so far, but also ensure that consideration is given to how progress will be monitored, and who specifically will be held accountable for various aspects of that progress. It must also provide a means of motivating staff who probably already feel their working lives are busy enough. At the macro level, this involves putting in place a process for monitoring success and for ensuring that all staff members are motivated to engage with the strategic outcomes and goals.

One way to do this is to link progress towards the institutional strategic outcomes with the annual budget allocation cycle via a top-slicing system. At first the top-slice can be very small to allow time for staff to get used to a competitive system. For example, each college or school might have a small percentage (perhaps 5% in year one, rising to 20% by year five) of the previous year’s budget top-sliced and placed in a central pool for competitive bidding at annual budget hearings. Each year, data from every department within each college or school are analysed centrally and made available to the central budget committee. Specific examples of such data are provided later in this article to demonstrate how alignment with strategic outcomes is achieved. Data from a number of areas, such as those indicated in figure 2 below, can then be drawn upon at budget hearings, during which the deans and heads of the various colleges and schools have an opportunity to bid for the return of their share of the top-slice or for additional funds.
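As a simple illustration of the arithmetic involved, the sketch below models a top-slice rising from 5% in year one to 20% in year five. A straight-line increase and the budget figures are assumptions made purely for illustration; institutions may phase the increase quite differently.

```python
# Hypothetical top-slicing schedule: the slice rises from 5% in year one
# to 20% by year five. A linear increase is assumed; budgets are invented.

unit_budgets = {"College A": 12_000_000, "College B": 8_500_000}

def top_slice_schedule(start=0.05, end=0.20, years=5):
    """Increase the top-slice rate from `start` to `end` in equal steps."""
    step = (end - start) / (years - 1)
    return [start + step * i for i in range(years)]

for year, rate in enumerate(top_slice_schedule(), start=1):
    # The central pool available for competitive bidding in that year.
    pool = sum(budget * rate for budget in unit_budgets.values())
    print(f"Year {year}: top-slice {rate:.2%}, central pool = {pool:,.0f}")
```

Each unit then bids at the annual budget hearings for the return of its share of this pool, or for more, on the strength of the data described below.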

Figure 2: Aligning metrics for integrated planning

Quantitative datasets related to quality assurance, finance, research and knowledge transfer, as well as other core areas of operation, can be used to form a basis for any competitive bidding process. Qualitative data can also be made available and presented in support of each unit’s case (where a “unit” is a college or school containing a number of departments). For example, failure to recruit a key member of staff due to circumstances beyond a head of department’s control might be seen as qualitative information that is relevant to the quantitative data presented. Many governments around the globe (e.g. the Hong Kong SAR) now have some sort of competitive bidding process in place when they determine funding for universities, and this practice will grow as competition for funds increases and taxpayers demand greater value for the considerable sums of money spent on higher education.

The micro-level: aligning and monitoring strategic outcomes

In order to illustrate how this can be achieved, it is necessary to use some examples which include a specific desired strategic outcome, the associated performance indicator(s), and possible samples of how datasets related to these might look when presented to a budget hearing committee. These examples are created for illustrative purposes only and, of course, do not relate to any specific HEI.

Example learning and teaching performance indicators

Learning and teaching forms a core area of business for all universities and so the first example relates to this area and is illustrated in figure 3 below.

Figure 3: Example performance metrics for learning and teaching.

Taking figure 3 as an example of a dataset, it is possible to see how valuable “growth charts” might be constructed for a number of possible strategic outcomes (see figures 4 and 5 below). For example, if an institution has a stated strategic outcome to increase the number of international students over a five-year period to, say, 15% of the total student population (an input goal), then the performance indicator (PI = percentage of international students) can be presented as a measure of progress towards this goal or outcome. However, achievement of this outcome might be influenced by an unforeseen deterioration in the department’s staff/student ratio over the year because key staff members have become ill or left the institution. In these circumstances, a budget committee might also take this indicator into account when assessing and comparing the relative performance of a particular department. Progress towards other possible strategic outcomes can also be assessed, using the example set of “starter” indicators above to produce performance charts on an annual basis, as illustrated in figures 4 and 5 below.
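A minimal sketch of how progress towards such an input goal might be tracked is given below; the years and enrolment figures are invented purely for illustration.

```python
# Hypothetical tracking of one input goal: grow international students
# to 15% of the total student population over five years.

TARGET_PCT = 15.0

# year -> (international headcount, total headcount); figures invented
enrolment = {
    2019: (800, 10_000),
    2020: (950, 10_200),
    2021: (1_150, 10_500),
}

for year, (intl, total) in enrolment.items():
    pi = 100 * intl / total        # PI = percentage of international students
    progress = pi / TARGET_PCT     # share of the 15% target achieved so far
    print(f"{year}: PI = {pi:.1f}% ({progress:.0%} of target)")
```

Plotting the PI year on year yields exactly the kind of growth chart shown in figures 4 and 5.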

Figure 4: Example Growth Chart (Department A)

Figure 5: Example Growth Chart (Department B)

Relative performance for departments A and B on the example set of indicators provided can then be easily compared. More important (from a strategic viewpoint) than cross-sectional comparison between departments is the longitudinal comparison of a single department’s performance over a period of two or more years, because this becomes a significant measure of progress towards implementation of the institutional strategic goals. College or school averages can then be examined year on year to reinforce positive trends and correct negative ones in a timely and evidence-based manner.

Example research, grant and supervision performance indicators

Examples of performance indicators for research, grant income and supervision are treated in the same way, except that it is essential, given known disciplinary differences in the length of “typical” articles, average time to publication and citation, the number of available SCOPUS-listed journals, and types of output (particularly in the arts disciplines), to use some sort of college/school average as the benchmark rather than comparing with departments from other discipline groupings. Three indices can be used as a working example of some “starter” potential performance indicators (PIs) for research and associated work:

  • The staff performance index
  • The grant performance index
  • The postgraduate supervision performance index

The staff performance index contains two PIs, namely the number of publications per full-time equivalent academic staff member (FTE) and the number of citations per FTE. The grant performance index contains three PIs, namely the percentage of FTE holding external grants, the number of grants per FTE and grant income per FTE. The postgraduate supervision performance index contains two PIs, namely the number of PhD students per FTE and the percentage of PhD completions per FTE. Using this example, there are six broad categories of output included in publication breakdown slides (see figure 6 below); a sketch of how the three indices might be computed follows the list of output types.

(Total = 27 types of output, all given the same weighting when calculating publication performance.)

1. Scholarly books, monographs and chapters
1.1 Research book or monograph (author)
1.2 Chapter in an edited book (author)
1.3 Textbook (author)
1.4 Edited book (editor)
2. Journal publications
2.1 Publication in refereed journal
2.2 Publication in policy or professional journal
3. Conference papers
3.1A Invited conference paper (refereed items)
3.1B Invited conference paper (non-refereed items)
3.2 Refereed conference paper
3.3 Other conference paper
4. Creative and literary works, consulting reports and case studies
4.1 Authored play, poem, novel, story
4.2 Painting, sculpture, drawing, photograph
4.3 Film, video
4.4 Performance and participation in exhibits
4.5 Translation of others’ work
4.6 Engineering, architectural, graphic designs
4.7 Computer software or system
4.8 Consulting or contract research report
4.9 Written teaching case study or extensive note
5. Patents, agreements, assignments, and companies
5.1 Patents granted
5.2 Licensing agreements
5.3 Assignments of intellectual property rights
5.4 Companies
6. All other outputs
6.1 Journal editor
6.2 Review of books or of software
6.3 Postgraduate research theses
6.4 Other outputs (prizes and awards etc)
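To make the arithmetic behind these indices concrete, the sketch below computes all three for a single hypothetical department. Field names and figures are illustrative assumptions, and the “percentage of PhD completions per FTE” is simplified here to raw completions per FTE.

```python
# Hypothetical computation of the three research indices described above.
# All departmental figures and field names are invented for illustration.

dept = {
    "fte": 25.0,                       # full-time equivalent academic staff
    "publications": 60,                # all 27 output types weighted equally
    "citations": 180,
    "staff_with_external_grants": 10,
    "grants": 14,
    "grant_income": 3_200_000,
    "phd_students": 30,
    "phd_completions": 8,              # simplification of the "percentage of
                                       # PhD completions per FTE" in the text
}

fte = dept["fte"]

staff_performance = {
    "publications_per_fte": dept["publications"] / fte,
    "citations_per_fte": dept["citations"] / fte,
}
grant_performance = {
    "pct_fte_holding_grants": 100 * dept["staff_with_external_grants"] / fte,
    "grants_per_fte": dept["grants"] / fte,
    "grant_income_per_fte": dept["grant_income"] / fte,
}
supervision_performance = {
    "phd_students_per_fte": dept["phd_students"] / fte,
    "phd_completions_per_fte": dept["phd_completions"] / fte,
}

for name, index in [("Staff", staff_performance),
                    ("Grant", grant_performance),
                    ("Supervision", supervision_performance)]:
    print(f"{name} performance index: {index}")
```

Comparable figures for each department, normalised by FTE and benchmarked against the college or school average, can then populate the annual performance summaries illustrated in figures 6 and 7.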

Figure 6: Example Departmental Overall Annual Performance (E.g. 2009-2010)

 

Figure 7: Example Departmental Overall Annual Performance (E.g. 2010-2011)

These categories can then be presented as metrics (see figures 6 and 7) in order to provide summarised longitudinal records of performance towards the stated strategic outcomes, which can be made available to heads, deans and the annual budget committee for management and monitoring purposes.

More detail can be provided as required, including a breakdown of the specific types of output. This simple process allows effective management monitoring of strategic implementation, and can help motivate staff to adhere to the agreed institutional strategic goals. In other words, the planning process is integrated and constructively aligned across and throughout the institution to maximise the chances of successful implementation of the strategic plan.

Conclusion

Strategic plans are statements of intent which must involve and inform stakeholders, thereby helping to engage the wider community upon whom we all depend. As competition for the best staff and students becomes more global and more intense, the challenge of developing and implementing strategy is becoming ever more important to HEIs around the world. Recognising and acting upon changes and perceived future developments at local, regional and global levels helps HEIs better represent the regions in which they operate and leaves them less vulnerable to increasingly competitive peer institutions. Consequently, senior managers of HEIs have a responsibility to their stakeholders to ensure delivery of the highest quality learning, teaching, research, knowledge transfer and administrative services in the most relevant, efficient and cost-effective way.

In summary, in the context of today’s globally competitive HE sector, managers have a responsibility to demonstrate and evidence the outcomes of their strategic and management decisions. Integrated planning, constructive alignment and the intelligent use of performance management metrics, as outlined in this article, are the essential tools for achieving demonstrable strategic success.

Dr Downing is secretary to Council and Court and director of the Institutional Research Office at City University of Hong Kong. In addition to his work as a member of the QS WUR rankings advisory board, Dr Downing is chair of the QS-MAPLE (Middle East and North Africa Professional Leaders in Education) International Academic Advisory Committee and a member of the QS APPLE (Asia-Pacific Professional Leaders in Education) International Academic Advisory Committee. Dr Downing is a chartered psychologist and chartered scientist with a current licence to practise, and an associate fellow of the British Psychological Society, with wide international experience including senior academic and administrative posts in Europe and Asia. He is editor-in-chief of the prestigious scholarly journal Educational Studies, which is listed in SCOPUS. His substantial published work centres on psychology, rankings, education management and metrics, and metacognitive development.