Coproducing Quality Performance Information Through Institutional Design: Proposal for a Data Exchange Structure

Asian Journal of Innovation and Policy, P-ISSN 2287-1608; E-ISSN 2287-1616
2020, v.9 no.1, pp.12-35
https://doi.org/10.7545/ajip.2020.9.1.012
Yun-Hsiang Hsu (National Central University)
Haena Kim
Jack Y. J. Lee (National Open University)

Abstract

Quality performance information has been regarded as a significant step toward managing public performance. Although a correlation between the quality of information and its actual use among managers in high-accountability policy areas has been found, quality performance information has not been properly provided to practitioners. This study takes an Institutional Analysis and Development approach to assess an institutional framework that enables state agencies and academics to coproduce this information. Based on a conceptual framework, we analyze a public information system of the Workforce Data Quality Initiative in Ohio and carry out a content analysis with NVIVO. We find that arrangements that manage the incentive dynamics in this process may help to align heterogeneous stakeholders in a mutually supportive fashion. Moreover, the resulting research agenda and information were coproduced for management and academic purposes simultaneously. This use of administrative data sheds light on how quality performance information can be coproduced under an appropriate institutional arrangement between the administrative and research communities. It is suggested that accessibility of the information system for various stakeholders should be improved.

Keywords
Performance information, administrative data, Institutional Analysis and Development framework, Institutional arrangement, Ohio Longitudinal Data Archive
