The Canadian Journal of Higher Education, Vol. XVI-1, 1986
La revue canadienne d'enseignement supérieur, Vol. XVI-1, 1986

Measuring the Effectiveness of Research Grant Getting*

CHARLES H. BÉLANGER** and ROBERT LACROIX†

ABSTRACT

To a very large extent, the national and international reputation of major research universities depends upon their research performance. That explains why competition is so fierce among them to get as much as they can from the three major Canadian government granting agencies. This study demonstrates how performance indicators were developed to measure the effectiveness of research grant getting among eleven Canadian universities. It shows how the amount of money received, the size of the teaching staff, and disciplinary characteristics were standardized to yield objective disciplinary and institutional rankings.

RÉSUMÉ

La réputation nationale et internationale des principales universités où il se fait beaucoup de recherche dépend dans une très large mesure de leur rendement dans ce domaine. C'est pourquoi ces institutions mènent une lutte serrée pour obtenir le maximum de subventions des trois principaux organismes subventionnaires du gouvernement canadien. L'article porte sur l'approche utilisée pour mesurer, à l'aide des indices de rendement, l'efficacité des subventions de recherche réparties entre onze universités canadiennes. Il montre que pour en arriver à un classement objectif des disciplines et des institutions, il a fallu standardiser les sommes versées, l'importance numérique du corps professoral et les caractéristiques disciplinaires.

* Revised version of a paper presented at the European Association for Institutional Research, University of Copenhagen, Denmark, August 1985. The authors gratefully acknowledge the contribution of Mrs. Claude Parizeau, Office of Institutional Research, Université de Montréal, for her assistance in the data collection.
** Vice-president Academic and Professor of Administration, Laurentian University/Université Laurentienne. He was Director of Institutional Research at the Université de Montréal at the time this research was conducted.
† Director of Centre de recherche et développement en économique and Professor of Economics, Université de Montréal.

In 1984-1985, the Government of Canada invested $0.7 billion in academic research through grants. That money was seen as a direct means to develop and train hundreds of young people, to contribute to the attainment of national economic growth, and to promote the production of new knowledge. Although this money constitutes a substantial capital outlay from the grantor's vantage point, it plays an even more important role in the organization of academe (Jencks and Riesman, 1969; Light, 1974). Universities expect research activities from their faculty members in order to complete the triangle of their three major functions: teaching, research, and service. The percentage of faculty workload devoted to research was found to vary between 14% and 25%, depending on the type of institution (Ladd & Lipsett, 1972, 1974, 1976; Baldridge et al., 1978; Berkeley, 1978). Notwithstanding the existence of other forms of research output, publications are almost universally recognized by academics as the test of competence and performance.
To make that point, scores of authors have dealt with the evaluation of university professors' research productivity and performance (Jauch and Glueck, 1975; Rushton and Meltzer, 1981; Ingalls, 1982; Université de Montréal, 1985). Most were concerned with multiple measures of research output and impact, and with sophisticated weightings of various kinds of publications for inter-institutional disciplinary comparisons and quality rankings. Due mainly to the construction of the Citation Indices in the Sciences (SCI), the Social Sciences (SSCI), and the Arts and Humanities (A&HCI), the list of articles dealing with bibliometric measures is almost endless. The pros and cons of using bibliometric data were competently summarized by Moed et al. (1984). In contrast to those finely tuned techniques, Jauch and Glueck (1975) studied eighty-six (86) professors in twenty-three (23) departments in the natural, mathematical, medical and biological sciences who had been involved in significant research over a five-year period; they came to the conclusion that effectiveness could be measured by a simple count of the number of publications in respectable journals.

Getting a grant may not only facilitate publication productivity, but may also depend on it. Therefore, it is no great surprise to observe fierce competition among universities and individual scholars to get as much as they can from the national pie. Aside from the sheer money involved, grant-supported research attracts high-caliber graduate students, helps to build disciplinary empires, buys modern equipment, promotes publications, and provides travel money for scientific conferences. National and international reputations of universities as well as of scholars are built on research performance and grantsmanship capabilities. This paper is an attempt to develop research funding performance indicators that will measure the degree of grantsmanship effectiveness across institutions and within disciplinary fields. Should widespread performance discrepancies be found, institutions would undoubtedly be interested in identifying and explaining the factors which give them the edge or put them in an unfavorable position.

The Meritocratic Competition for Grants

The distribution of grants is selective because there exists some scarcity of funds. Given the disparity between demand and supply, grants of any size become important precisely because they are allocated on a competitive and meritocratic basis. Obviously, agencies differ in the competitiveness of their grants, or in the productivity of their recipients, in a rank order roughly similar to an intuitive ranking by cosmopolitanism (Liebert, 1977). The least competitive grants are those allocated at the intramural, provincial and local government, and industrial levels. They aim at specific objectives, often related to regional problem-solving activities and development. At the other end of the spectrum are the main federal granting agencies. In Canada, the agencies which can be considered to be in the major leagues are the Natural Sciences and Engineering Research Council (NSERC), the Medical Research Council (MRC), and the Social Sciences and Humanities Research Council (SSHRC). These three national councils distributed $0.5 billion (Can.) in 1984-1985 through a highly selective peer review process. As was mentioned earlier, publications feed the communication system and identify the most productive and authoritative researchers in various specialties.
Successful recipients are those with substantial track records of knowledge productivity. They are scholars who either publish a lot, or publish significant work, or do both, with the implication that quantity over a career span implies quality. Skeptics may argue that, aside from interfield differences, particular institutional circumstances and personal assets make the competition-on-merit principle more ambiguous. This line of reasoning might have some value with the lower-ranked granting agencies but is not substantiated by grantsmanship research findings when highly competitive grantors are considered. After analyzing factors such as institutional wealth, enrollment selectivity, library facilities, regional location, career age, salary, consulting activities, and other institutional and individual characteristics, Liebert (1977) and Bayer (1973) concluded that grant-supported research was "virtually unrelated to institutional and personal status characteristics". What is important is "individual productivity." Until the grant-getting process can be proven biased, it would appear that the correlation between publication productivity and grant recipientship is very high and credible.

Interfield Differences

A measure of caution must be exercised when dealing with different fields. First, not all fields need research grants to conduct research. Such is the case with a few disciplines in the humanities and letters, where an excellent library and a competent mind are the two most essential elements to generate research and knowledge. Second, a number of disciplines must receive a certain level of grant support if they are to be research productive. However, the size of grants received can be kept relatively small because little or no equipment is involved. These disciplines could be described first and foremost as labour intensive.

TABLE 1
DISCIPLINARY SECTORS

NUMBER  IDENTIFICATION                   EXAMPLE
01      Peri-medical Sciences            Dentistry
02      Para-medical Sciences            Nursing
03      Pure Sciences                    Geology
04      Applied Sciences                 Engineering
05      Humanities and Social Sciences   History
06      Education                        Educational Technology
07      Administrative Sciences          Health Administration
08      Arts                             Music
09      Letters                          Linguistics
10      Law                              Law
11      Medicine and Specialties         Surgery

NOTE: The complete breakdown of each disciplinary sector can be obtained from the author or from the Quebec Ministry of Higher Education and Technology as indicated in the reference section.

Third, there are the medical, natural, and engineering sciences, where large amounts of money are crucial. The larger the grants get, the more the money goes to support a facility or an organized research team rather than merely a principal investigator. In many instances, the decision to give grants to particular individuals is based not only on the track records of these researchers but also on the facilities and equipment already at the researchers' disposal. Critics might suggest that some fields attract more grants than others because their products and effects are deemed to have greater social value. Whatever the case may be, scholars of all fields are involved in the politics of priority setting to secure as much money as possible for their respective fields. Despite that caution, it is interesting to point out that in an extensive 1985 study conducted at the Université de Montréal, 52% of all faculty members received grants in 1983-1984.
One might believe that this high percentage was the result of a high degree of success observed primarily in the medical, natural, and engineering sciences. That was not necessarily the case, since in the same year grants were received by 54% of humanities professors, 51% of education professors, and 47% and 41%, respectively, of philosophy and letters faculty members. Although the number of grants might be on an equal footing across disciplines, numbers alone would not recognize the distinctiveness of the size of grants among various fields. This is a sufficient reason to regroup similar fields together and not to make comparisons across different disciplinary groupings or sectors. This methodological precaution must be secured if one is to make reasonable comparisons of similar sectors across institutions.

Developing Grantsmanship Performance Indicators

The computation of absolute dollar figures for a single institution from year to year is a necessary exercise to monitor trends, but one that fails to capture the degree of success and/or competitiveness vis-à-vis comparable universities. In the development of a methodology capable of assessing the competition, three factors must be accounted for: disciplinary groupings or sectors, teaching staff, and the actual amount of money received from granting agencies.

Disciplinary Groupings - Ideally, each field or discipline should be kept separate and analyzed separately. From a methodological point of view, this is easy to achieve assuming that an adequately detailed database is already in place. Oftentimes, given the multiplicity of academic units in large institutions, management is primarily interested in identifying the strengths and weaknesses of broader disciplinary areas as a whole for strategic planning purposes. This identification keeps the number of disciplinary sectors down to a manageable size and does not preclude further probing into single disciplines if such probing is called for. This study broke fields down into eleven (11) disciplinary sectors, the same ones used by the Quebec Ministry of Higher Education and Technology to finance additional student enrollment (Ministère de l'éducation, 1983). The number of fields regrouped in each of the sectors presented in Table 1 is variable: sector 10 includes only law, while sectors 05 and 11 aggregate many disciplines on the basis of established commonalities.

Grant Dollars Received - For the purpose of this article, only NSERC, MRC, and SSHRC, the three most competitive federal granting agencies, were retained. All grants awarded to each recipient in all universities are recorded by the Canadian Institute for Scientific and Technical Information (National Research Council of Canada) in an annual publication called the Directory of Federally Supported Research in Universities. The exploitation of that information for specific management objectives unfortunately must be conducted by hand, since the Can-Ole system used by that agency is more amenable to bibliographic manipulations than to statistical and managerial tabulations. Despite the very cumbersome sorting-out process involved, grants can be classified in any field and disciplinary sector chosen.
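To make the classification step concrete, the following short Python sketch shows one way individual grant records could be rolled up into the eleven disciplinary sectors of Table 1. It is only an illustration: the record layout, field names, and the partial discipline-to-sector mapping are hypothetical, whereas the study itself carried out this classification by hand from the Directory.

    from collections import defaultdict

    # Hypothetical, partial mapping from disciplines to the sector codes of Table 1.
    SECTOR_OF = {
        "geology": "03",       # Pure Sciences
        "engineering": "04",   # Applied Sciences
        "history": "05",       # Humanities and Social Sciences
        "surgery": "11",       # Medicine and Specialties
    }

    def grants_by_sector(records):
        """Aggregate grant dollars by (university, sector).

        records: iterable of (university, discipline, dollars) tuples,
        a hypothetical layout standing in for the Directory entries.
        """
        totals = defaultdict(float)
        for university, discipline, dollars in records:
            sector = SECTOR_OF.get(discipline.lower())
            if sector is not None:          # unmapped disciplines would need manual review
                totals[(university, sector)] += dollars
        return dict(totals)

    # Illustrative call with two hypothetical records:
    print(grants_by_sector([("Univ A", "Geology", 120000.0),
                            ("Univ A", "History", 30000.0)]))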
Teaching Staff - The previously explained distinctiveness of interfield differences in terms of grant-supported research funding volume makes it equally important to have the teaching staff categorized in the proper disciplinary sectors. Given the grant size variability among disciplinary groupings, one can readily assess how much distortion would be built into any comparison attempt if relative institutional disciplinary emphases were not accounted for. Statistical readings of university teaching staff data should be a fairly straightforward affair. Nevertheless, they have generated much internal and external debate, mainly because there exists more than one statistic per institution.

TABLE 2
RESEARCH GRANTS (NSERC, MRC, SSHRC) AND TEACHING STAFF FOR 11 SELECTED MULTIVERSITIES AND TOTAL CANADA, 1982-1983, 1976-1977, 1972-1973
($ = absolute dollars; F = number of faculty members; %$ = $ multiversities / $ total Canada; %F = F multiversities / F total Canada; P = %$/%F)

This situation is a consequence of our decentralized Canadian education system, of university autonomy in defining internal parameters, and of the many different definitions used by the agencies to which institutions are requested to report data.
Statistical readings of staff are therefore difficult but should not be looked upon as totally atypical and insurmountable. In this study, a special computer run of the Universities and Colleges Academic Staff (UCAS) file was done by the Education Division of Statistics Canada. The exploitation of that file yielded all full-time teaching faculty members, excluding deans, librarians, research personnel with rank, central administration personnel, and clinicians. It must be noted that UCAS classifies each faculty member on the basis of the subject taught, and not according to the hiring unit or the specialization of the highest degree received. Hence, a faculty member with a Ph.D. in mathematics, hired by a business school and teaching computer science, is classified in computer science (Sector 4). There remains some ambiguity as to whether his/her research activities and grants are related to business or to computer science. Other sorting-out criteria also have their shortcomings, including the reliability and comparability of the database. In any case, the Statistics Canada file was judged to be the best available and apparently the most reliable, since figures are forwarded by the institutions themselves. At the time this study was being conducted, the last complete year on file was 1981-1982.

Performance Indicator - The development of this research grant getting performance indicator was based on the assumption that, if all faculty members of each disciplinary sector in each university had the same motivation, competence, and productivity, a perfect correlation of 1.0 should be found between the grant money received by a disciplinary sector as a percentage of the total national dollar amount awarded to that sector, and the teaching staff classified in that same disciplinary sector as a percentage of total faculty members in the same sector across Canada. The mathematical expression of the performance indicator was as follows:

    P(ijt) = G(ijt) / F(ijt)

where
    P = Performance indicator
    G = Grant money ($) received as a percentage of the total national (or of a more limited pool) dollar amount awarded
    F = Faculty members as a percentage of the national (or a more limited pool) teaching staff
    i = Specific disciplinary sector
    j = Specific university
    t = Year surveyed

Given the premises of that indicator, each university, either within each disciplinary sector or as a whole, can be assigned a performance ranking. The higher the ratio, the better the performance, and vice versa.

Presentation and Analysis of Results

As spelled out in the mathematical expression of the performance indicator, this methodology has the capability of yielding results at the macro or micro level. The initial incentive to generate this study came from the Planning Committee of the University of Montreal, which was interested in having a better grasp of how effective or competitive the University was at getting grants when matched with similar institutions. Hence, a selection was made of universities offering a wide coverage of academic programs, including medical education, and known for their excellence in graduate studies. Other criteria such as region, size, and operating budget were considered in arriving at the final selection. On that basis, eleven major research universities, referred to in this study as multiversities, were compared. There is no doubt that other institutions could have been included because of their excellence in specific disciplines and disciplinary sectors.
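As a purely illustrative companion to the formula, here is a short Python sketch that computes G, F, and P for a single disciplinary sector from raw grant dollars and faculty counts. The institution names, figures, and function name are hypothetical; the study's actual values appear in Tables 2 to 5.

    def performance_indicators(grant_dollars, faculty_counts):
        """Return the performance indicator P for each institution in one sector.

        grant_dollars  : dict of institution -> grant money ($) received in the sector
        faculty_counts : dict of institution -> full-time teaching staff in the sector
        Percentages are taken over the pool supplied (the national total, or a more
        limited pool such as the eleven multiversities).
        """
        total_g = sum(grant_dollars.values())
        total_f = sum(faculty_counts.values())
        indicators = {}
        for institution in grant_dollars:
            g = 100.0 * grant_dollars[institution] / total_g    # G: share of the pool's grant dollars
            f = 100.0 * faculty_counts[institution] / total_f   # F: share of the pool's teaching staff
            indicators[institution] = g / f                     # P > 1.0 means above the pool norm
        return indicators

    # Hypothetical three-institution pool for one sector:
    print(performance_indicators(
        {"Univ A": 6_000_000, "Univ B": 3_000_000, "Univ C": 1_000_000},
        {"Univ A": 300, "Univ B": 200, "Univ C": 100},
    ))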
It was felt necessary that each multiversity be represented in each of the disciplinary groupings. With the exception of McMaster University, which does not have a Law School (Sector 10), that objective was achieved. Table 2 gives readers an overview of the relative importance of these eleven multiversities from a research grant and teaching staff point of view. First, it must be noted that three reference years were used. When this study was initiated in the Fall of 1984, the National Research Council of Canada had not completed its 1983-1984 Directory edition of research grants, and Statistics Canada did not have a complete file on teaching staff for 1982-1983. The 1982-1983 reference year is therefore composed of 1982-1983 grant figures and 1981-1982 teaching staff data. The second feature of Table 2 can be readily seen by reading the Total Sectors row at the very bottom. The research grants received by the selected universities ranged from 71.1% in 1972-1973 to 66.6% in 1982-1983 of the total grant dollars awarded to the more than fifty (50) Canadian universities, while their teaching staff accounted for approximately 50% of the Canadian pool. As a consequence, their overall performance, as indicated by the performance indicators (P), was very strong.

Identification of the Best Performers - Tables 3 and 4 are intended to give a step-by-step approach to the mechanics of the performance indicator and to present the database used for each university. To the extent that the data provided by the two national data-gathering agencies are exact, Table 3 shows the actual grant dollar amounts and teaching staff numbers for each multiversity per disciplinary grouping. Table 4 is a conversion of the absolute numbers of Table 3 into percentages. The very first line of the peri-medical (01) sector indicates that in 1982-1983 the University of Alberta, with a teaching staff that represented only 9.6% of the multiversities' pool in that sector, was receiving 12.7% of the research grant money allocated to the eleven multiversities in that same sector. The overall percentage comparison between grants and teaching staff for the University of Alberta can also be seen in the last three lines of Table 4. Across disciplinary sectors, its percentage of teaching staff is somewhat higher than its percentage of research grants. That being the case, one should expect the overall performance of that institution to be somewhat below the established norm of 1.0. The same rationale applies to other institutions throughout.

TABLE 3
RESEARCH GRANTS (NSERC, MRC, SSHRC) AND TEACHING STAFF FOR EACH OF THE 11 SELECTED MULTIVERSITIES PER DISCIPLINARY SECTOR, 1982-1983, 1976-1977, 1972-1973
($ = dollar figures rounded to the nearest thousand; F = number of faculty members)

TABLE 4
RESEARCH GRANTS (NSERC, MRC, SSHRC) AND TEACHING STAFF FOR EACH OF THE 11 SELECTED MULTIVERSITIES AS A PERCENTAGE (%) OF THE TOTAL MULTIVERSITIES PER DISCIPLINARY SECTOR, 1982-1983, 1976-1977, 1972-1973
(%$ = $ specific multiversity / $ total multiversities; %F = F specific multiversity / F total multiversities; the last rows give $ and F total multiversities as a percentage of total Canada)
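Restated numerically, the University of Alberta example just cited reduces to a one-line application of the indicator; the snippet below assumes nothing beyond the two percentages quoted from Table 4.

    # University of Alberta, peri-medical sector (01), 1982-1983:
    pct_grants = 12.7    # % of the multiversities' grant dollars in the sector
    pct_faculty = 9.6    # % of the multiversities' teaching staff in the sector
    print(round(pct_grants / pct_faculty, 2))   # P is about 1.32, above the norm of 1.0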
When the percentages were worked into the performance indicator formula (Table 5), each institution received its performance grades within and across disciplinary sectors. The order of presentation of the institutions in Table 5 is based on the 1982-1983 overall performance; that explains why McGill University and Laval University appear first and last respectively. The three top-ranked universities perform well above the established norm in most disciplinary sectors. McGill has kept its number one position in medicine; McMaster did the same in the pure sciences; and Toronto has had a strong showing in the peri-medical sector. As for the eight remaining institutions, one can observe wide variations within and across sectors, although some areas of strength are also noticeable. For illustrative purposes, let us pinpoint a few examples. Dalhousie has been a top performer in the applied sciences and shows an excellent track record in the humanities and social sciences. Laval has firmed up its competitive edge in the para-medical sector, along with British Columbia. Finally, Montreal, a middle-of-the-pack performer overall, does very well in letters and medicine. As a general observation, fluctuations are likely to be more frequent and wider in traditionally low-funded research sectors. While the level of funding is a disciplinary characteristic, the fluctuations can be explained mostly by the coming into play of small numbers.

Implications and Conclusions

Grantsmanship performance indicators can be a useful monitoring device to complement bibliometric data. In fields such as the natural, mathematical and life sciences, where there is a close correlation between grantsmanship effectiveness and research productivity, the results of such indicators constitute rather convincing evidence with which to assess the degree of excellence and competitiveness of a faculty and/or an institution. In areas where grants are less built into the tradition and the basic requirements of disciplinary knowledge production, one might sensibly argue that such information is scarcely necessary or not necessary at all. To counter that argument, we might reply that even in those disciplines there is a definite pecking order or track record whereby a faculty or an institution has perennially demonstrated strengths; therefore, they must be doing something right. Results of performance indicators enable university research policy-makers to reinforce successes and to dispel quickly incorrect claims of strong performance.
Such vital information is a sine qua non of sound policies for academic staff management. First, provided that similar institutions and disciplines are compared, such indicators constitute a means to quantify the quality of a faculty and/or institution. Second, they serve as a gauge to determine the degree of exposure to and association with the international academic community. Third, they keep reminding universities to develop and apply high quality standards in their recruitment, promotion, and reward policies if those same universities wish to acquire, improve, or maintain an international or even a national reputation. Fourth, universities must create the appropriate environment to maximize output. Two essential means of doing so are differentiated teaching loads and multiform incentives: the former produces greater equity, whereas the latter motivates people. That seems to be the key to success at the most successful universities in Canada.

In the final analysis, there is no doubt that the production of performance indicators for eleven disciplinary sectors is a considerable improvement over the simple division of all grant money by all teaching staff. It is also a further refinement of a University of Western Ontario in-house study (1984) which produced similar indicators by matching each of the three largest federal granting agencies with its respective potential recipients. Ideally, each separate field, discipline, or profession should be compared across institutions and ranked. To realize that objective, which does not seem too distant or formidable, both federal data-gathering agencies will have to make adjustments. The National Research Council of Canada will have to facilitate database access through electronic means, and Statistics Canada will have to refine the notion of teaching staff. As it currently stands, the UCAS file includes lecturers and visiting academic staff and excludes academic staff who have been hired as researchers rather than teachers. Hopefully, this paper will encourage the above agencies and the universities to pursue common approaches that will assist all parties in their assessment and management efforts.

REFERENCES

Baldridge, J.V. et al. Policy-making and effective leadership. San Francisco: Jossey-Bass Publishers, 1978.

Bayer, A.E. Teaching faculty in academe: 1972-1973. American Council on Education, Report no. 8:2, 1973.

Ingalls, W.B. Increasing research productivity in small universities: a case study. The Canadian Journal of Higher Education, 1982, 12(3), 59-64.

Jauch, L.R. & Glueck, W.F. Evaluation of university professors' research performance. Management Science, 1975, 22(1), 66-75.

Jencks, C. & Riesman, D. The academic revolution. New York: Anchor, 1969.

Ladd, E.C., Jr. & Lipsett, S.M. How professors spend their time. The Chronicle of Higher Education, 1972, 1974, 1976.

Liebert, R.J. Research-grant getting and productivity among scholars. Journal of Higher Education, 1977, 48(2), 164-192.

Light, D., Jr. Introduction: the structure of the academic professions. Sociology of Education, 1974, 47 (Winter), 2-28.

Ministère de l'éducation. Méthode d'évaluation des coûts unitaires par secteur disciplinaire des universités du Québec utilisée pour le financement des effectifs étudiants supplémentaire en 1982-1983. Rapport méthodologique, Gouvernement du Québec, MEQ - DGERU, 1983.
Moed, H.F. et al. The use of bibliometric data as tools for university research policy. In Bélanger, C.H. (ed.), Beyond Retrenchment: Planning for Quality and Efficiency. Proceedings of the Sixth European AIR Forum, 1984.

National Research Council Canada. Directory of federally supported research in universities. 1972-1973, 1976-1977, 1982-1983.

Rushton, J.P. & Meltzer, S. Research productivity, university revenue, and scholarly impact (citations) of 169 British, Canadian and United States universities (1977). Scientometrics, 1981, 3(4), 275-303.

Université de Montréal. La poursuite de l'excellence. Rapport du groupe de travail sur les priorités présenté au comité de la planification de l'Université de Montréal, 1985.

University of Berkeley. University of California faculty time-use study. Berkeley: Institute for Research in Social Behavior, 1978.

University of Western Ontario. Research funding indicators based on federal council grants. University research office, 1984.