Mai Trong Nhuan, Truong Xuan Cu, Nguyen Thi Hoang Ha, Tran Dang Quy, Pham Thuy Linh, Nguyen Tai Tue, Luu Viet Dung


Abstract

The present research proposes an index for assessing the effectiveness of science and technology projects in Vietnam. Project effectiveness was measured through 20 variables grouped into three indicators: (1) the effectiveness of science and technology values (12 variables); (2) the effectiveness of human resources (4 variables); and (3) the effectiveness of education and training (4 variables). Each variable was evaluated on a 0-1 scale, where a value of zero indicates an ineffective project and vice versa. The variables were measured by counting the results and products delivered by each selected project and comparing them with the values committed in the project contract. A total of eight projects (spanning the natural sciences, social sciences, and technology) were selected from the 58 projects of the National Science and Technology Program for Sustainable Development of North West Vietnam (NSTP-SDNW) to test the proposed index. The results show that the numbers of results and products of all projects met or exceeded the requirements in their contracts. The assessment values of the social science projects (codes 06X, 07X, 17X) ranged from 0.55 to 0.75, whereas those of the natural science and technology projects ranged from 0.55 to 0.72. All selected projects were therefore assessed as highly effective, with the highest effectiveness observed in projects No. 06X and No. 02C.
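The abstract does not give the aggregation formula, so the following is only a minimal sketch of how such an index could be computed, assuming (these are assumptions, not the authors' stated method) that each variable is scored as the ratio of delivered to contracted outputs capped at 1, and that indicator and overall scores are unweighted means. All function names and example figures are hypothetical.

```python
# Hedged sketch of a 0-1 project-effectiveness index with three indicators.
# Assumption: variable score = min(delivered / contracted, 1);
# indicator score = mean of its variables; overall = mean of indicators.

def variable_score(delivered: int, contracted: int) -> float:
    """Score one variable on the 0-1 scale: 0 = ineffective, 1 = fully met."""
    if contracted <= 0:
        return 0.0
    return min(delivered / contracted, 1.0)

def indicator_score(pairs: list[tuple[int, int]]) -> float:
    """Mean score of the (delivered, contracted) variables in one indicator."""
    return sum(variable_score(d, c) for d, c in pairs) / len(pairs)

def project_effectiveness(indicators: list[list[tuple[int, int]]]) -> float:
    """Overall assessment value: mean of the indicator scores."""
    return sum(indicator_score(p) for p in indicators) / len(indicators)

# Hypothetical project: 12 S&T-value variables, 4 human-resource variables,
# and 4 education-and-training variables, as in the proposed index.
st_values = [(3, 3), (2, 2), (1, 2)] + [(1, 1)] * 9   # 12 variables
human_res = [(2, 2), (1, 1), (0, 1), (1, 1)]           # 4 variables
education = [(1, 1), (1, 1), (1, 2), (2, 2)]           # 4 variables

print(round(project_effectiveness([st_values, human_res, education]), 2))
```

With these illustrative figures the sketch prints 0.86; actual projects would plug in the counts of results and products from their contracts.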

Keywords: Assessment, effectiveness, science and technology projects, science and technology products, indicators.
