Conclusions | Appendices A–E
Main Takeaways
Through analyzing four different perspectives (research output and impact, research focus, research inputs, and knowledge transfer and collaboration), this report outlines a process that states can undertake to identify and showcase their research strengths: those areas in which they have a comparative research advantage.
Research Output and Impact
Research Focus
Research Inputs and Efficiency
Knowledge Transfer and Collaboration
Appendix A - State Abbreviations and Region Mappings
Appendix B - Additional Notes about Methodology
Appendix C - Glossary of Terms
Appendix D - Field Classification Systems
Appendix E - Summary Tables
Endnotes
1. Weinberg, B. A., Owen-Smith, J., Rosen, R. F., Schwarz, L., Allen, B. M., Weiss, R. E., & Lane, J. (2014). Science Funding and Short-Term Economic Activity. Science, 344(6179), 41–43. doi:10.1126/science.1250055; Lane, J., & Bertuzzi, S. (2011). Research funding: Measuring the results of science investments. Science, 331(6018), 678–680. doi:10.1126/science.1201865
2. Florida, R. (2014). The Rise of the Creative Class, Revisited: Revised and Expanded. Basic Books.
3. Roberts, E., & Eesley, C. E. (2011). Entrepreneurial Impact: The Role of MIT. Foundations and Trends in Entrepreneurship, 7(1-2), 1–149. Retrieved from http://www.nowpublishers.com/articles/foundations-and-trends-in-entrepreneurship/ENT-030
4. Eesley, C. E., & Miller, W. F. (2013). Impact: Stanford University’s Economic Impact via Innovation and Entrepreneurship. SSRN Electronic Journal. doi:10.2139/ssrn.2227460
5. Mazzucato, M. (2013). The Entrepreneurial State: Debunking Public vs. Private Sector Myths. Anthem Press.
6. Moretti, E. (2012). The New Geography of Jobs. Houghton Mifflin Harcourt; Delgado, M., Porter, M. E., & Stern, S. (2014). Clusters, convergence, and economic performance. Research Policy, 43(10), 1785–1799. doi:10.1016/j.respol.2014.05.007
7. For more information on what types of records are included in our definition of peer-reviewed publications, please see Appendix C: Glossary.
8. Thus, the percentile of 1.0 associated with California indicates that the state produced more publications than all other states; a percentile of 0.5 (Alabama) indicates that the state produced the median number of publications among all states; and a percentile of 0.0 (South Dakota, not displayed in the following scatterplot) indicates that the state produced fewer publications than all other states.
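A minimal Python sketch of this ranking, using hypothetical publication counts rather than the report's data:

```python
def percentile_ranks(pub_counts):
    """Return each state's percentile: the share of other states it out-produces."""
    n = len(pub_counts)
    return {
        state: sum(1 for other, c in pub_counts.items()
                   if other != state and c < count) / (n - 1)
        for state, count in pub_counts.items()
    }

# Hypothetical counts for illustration only:
counts = {"California": 700_000, "Alabama": 60_000, "South Dakota": 8_000}
print(percentile_ranks(counts))
# {'California': 1.0, 'Alabama': 0.5, 'South Dakota': 0.0}
```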
9. For more details on how institutions were categorized into the different sectors, please see Appendix C: Glossary.
10. Peer-reviewed publications in the Scopus database can be categorized as one of three different types – journal articles, review papers, and conference proceedings. See Appendix B: Document Types for more details. For more theoretical considerations of citation-based indicators, see, e.g., Waltman, L., van Eck, N. J., van Leeuwen, T. N., Visser, M. S., & van Raan, A. F. J. (2011). Towards a new crown indicator: an empirical analysis. Scientometrics, 87(3), 467–481. doi:10.1007/s11192-011-0354-5
11. For a full list of research fields and how research publications are categorized into fields, please see Appendix D: Field Classification Systems.
12. See Appendix C: Glossary for a more technical description of how this is calculated.
13. This organization is based on Klavans and Boyack’s uni-dimensional map of science. As a general rule, adjacent subject areas (such as computer science and mathematics, or energy and environmental science) are “closer” to one another in terms of cross-disciplinary collaboration and epistemic overlap. See Klavans, R., & Boyack, K. W. (2009). Toward a Consensus Map of Science. Journal of the American Society for Information Science and Technology, 60(3), 455–476. doi:10.1002/asi.20991
14. In this and subsequent figures, names of fields are shortened for ease of viewing. For the full list of field names, please see Appendix D: Scopus 27 Subject Classification.
15. Katz, B., & Bradley, J. (2013). The Metropolitan Revolution: How Cities and Metros Are Fixing Our Broken Politics and Fragile Economy. Washington, DC: Brookings Institution Press.
16. According to the US Bureau of Labor Statistics’ Occupational Employment Statistics (OES) survey (http://www.bls.gov/oes/current/oes_ar.htm), the location quotient for farming, fishing, and forestry occupations in Arkansas in 2013 was 1.44. This means that the relative percentage of workers in those occupations in Arkansas was 44% higher than the relative percentage of such workers throughout the US. For more information, see http://www.bls.gov/help/def/lq.htm.
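A minimal Python sketch of the location quotient calculation, using hypothetical employment figures chosen only to reproduce an LQ of 1.44:

```python
def location_quotient(state_occ, state_total, us_occ, us_total):
    """LQ = (occupation's share of state employment) / (its share of US employment)."""
    return (state_occ / state_total) / (us_occ / us_total)

# Hypothetical figures, not BLS data:
lq = location_quotient(state_occ=7_200, state_total=1_000_000,
                       us_occ=500_000, us_total=100_000_000)
print(round(lq, 2))  # 1.44 -> the occupation's state share is 44% above its national share
```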
17. Britt, R. (2015). Higher Education R&D Expenditures Resume Slow Growth in FY 2013. Arlington, VA: National Science Foundation, National Center for Science and Engineering Statistics. Retrieved from http://www.nsf.gov/statistics/2015/nsf15314/#data.
18. See http://report.nih.gov/nihdatabook/index.aspx.
19. Basken, P. (2015). Team science: Research cooperation grows as federal money tightens. The Chronicle of Higher Education. Retrieved from http://chronicle.com/article/Team-science-Research/228175/
20. This is a high-level proxy for research efficiency. For a longer discussion on the limitations of this approach, please see Appendix B: Measuring Research Efficiency and Productivity.
21. Since HERD covers only R&D spending by higher education institutions, we restrict our normalization to research output from the academic sector. If we instead normalized total research output (across all sectors) by total HERD, Tennessee would still be the most efficient state, producing 23.8 publications per million dollars of R&D expenditure.
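The normalization itself is a simple ratio; a sketch in Python with hypothetical figures (not Tennessee's actual data) that happen to reproduce the 23.8 publications-per-million-dollars value:

```python
def pubs_per_million_rd(publications, herd_dollars):
    """Publications per million dollars of higher-education R&D (HERD) expenditure."""
    return publications / (herd_dollars / 1_000_000)

# Hypothetical: 23,800 academic publications on $1.0 billion of HERD spending.
print(pubs_per_million_rd(23_800, 1_000_000_000))  # 23.8
```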
22. It is important to note, however, that different fields of research have different space requirements – for example, research in the biological and biomedical sciences tends to require much more space than research in the social sciences. More in-depth studies of research space usage and efficiency can normalize for these distortions. See Appendix B: Measuring Research Efficiency and Productivity for more details.
23. Maine technically had the highest number of medical publications from universities per 1,000 NASF of medical school research space (58.6), but based on the 2011 survey, it had only 9,504 NASF of research space in medical schools, the sixth lowest among all states. By comparison, Massachusetts had 1,693,485 NASF of research space in its medical schools.
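A back-of-the-envelope check, using only the figures in this note, of why a small denominator distorts the measure:

```python
maine_rate = 58.6      # medical publications per 1,000 NASF
maine_nasf = 9_504     # NASF of medical school research space (2011 survey)
implied_pubs = maine_rate * (maine_nasf / 1_000)
print(round(implied_pubs))  # ~557 publications in total, despite the top rate
```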
24. Please see Appendix B: Measuring Research Efficiency and Productivity for a longer discussion of the limitations of this measure.
25. Saxenian, A. L. (1996). Regional Advantage: Culture and Competition in Silicon Valley and Route 128. Cambridge, MA: Harvard University Press.
26. Leydesdorff, L., & Meyer, M. (2003). The Triple Helix of university–industry–government relations. Scientometrics, 58(2), 191–203; Mowery, D. C., & Ziedonis, A. A. (2015). Markets versus spillovers in outflows of university research. Research Policy, 44(1), 50–66. doi:10.1016/j.respol.2014.07.019
27. D’Este, P., Guy, F., & Iammarino, S. (2012). Shaping the formation of university–industry research collaborations: what type of proximity does really matter? Journal of Economic Geography, 13(4), 537–558. doi:10.1093/jeg/lbs010; Hewitt-Dundas, N. (2011). The role of proximity in university-business cooperation for innovation. The Journal of Technology Transfer, 38(2), 93–115. doi:10.1007/s10961-011-9229-4; Bishop, K., D’Este, P., & Neely, A. (2011). Gaining from interactions with universities: Multiple methods for nurturing absorptive capacity. Research Policy, 40(1), 30–40. doi:10.1016/j.respol.2010.09.009
28. For more information on how we classify research output into the corporate, government, medical, and other sectors, please see Appendix C: Glossary.
29. Kurtz, M. J., & Bollen, J. (2010). Usage Bibliometrics. Annual Review of Information Science and Technology, 44(1). Retrieved from http://onlinelibrary.wiley.com/doi/10.1002/aris.2010.1440440108/pdf
30. Moed, H. F. (2005). Statistical relationships between downloads and citations at the level of individual documents within a single journal. Journal of the American Society for Information Science and Technology, 56(10), 1088–1097. doi:10.1002/asi.20200; Schlögl, C., Gorraiz, J., Gumpenberger, C., Jack, K., & Kraker, P. (2014). Comparison of downloads, citations and readership data for two information systems journals. Scientometrics, 101(2), 1113–1128. doi:10.1007/s11192-014-1365-9; Schloegl, C., & Gorraiz, J. (2010). Comparison of citation and usage indicators: the case of oncology journals. Scientometrics, 82(3), 567–580. doi:10.1007/s11192-010-0172-1; Schloegl, C., & Gorraiz, J. (2011). Global usage versus global citation metrics: The case of pharmacology journals. Journal of the American Society for Information Science and Technology, 62(1), 161–170. doi:10.1002/asi.21420; Wang, X., Wang, Z., & Xu, S. (2012). Tracing scientist’s research trends realtimely. Scientometrics, 95(2), 717–729. doi:10.1007/s11192-012-0884-5
31. Bornmann, L. (2013). What is societal impact of research and how can it be assessed? A literature survey. Journal of the American Society for Information Science and Technology, 64(2), 217–233. doi:10.1002/asi.22803; Tijssen, R. J. W. (2001). Global and domestic utilization of industrial relevant science: patent citation analysis of science–technology interactions and knowledge flows. Research Policy, 30(1), 35–54. doi:10.1016/S0048-7333(99)00080-3
32. For a more precise definition of what counts as a “download,” please see Appendix C: Glossary.
33. For more information about the data sources for patent data, please see Appendix B: Data Sources.
34. D’Este, P., & Patel, P. (2007). University–industry linkages in the UK: What are the factors underlying the variety of interactions with industry? Research Policy, 36(9), 1295–1313. doi:10.1016/j.respol.2007.05.002; Schartinger, D., Rammer, C., Fischer, M. M., & Fröhlich, J. (2002). Knowledge interactions between universities and industry in Austria: sectoral patterns and determinants. Research Policy, 31, 303–328.
35. Similar to research usage (downloads), we normalize a state’s patent citations to its research output (either overall or by subject) relative to its national publication share, since states with higher levels of research output have a higher probability of having any of their research cited in patents.
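One plausible reading of this share-of-shares normalization, sketched in Python with hypothetical counts (the report's exact formula may differ):

```python
def relative_patent_citation_index(state_patent_cites, us_patent_cites,
                                   state_pubs, us_pubs):
    """Ratio of a state's share of patent citations to its share of publications.
    Values above 1.0 suggest the state's research is cited in patents more often
    than its publication volume alone would predict."""
    return (state_patent_cites / us_patent_cites) / (state_pubs / us_pubs)

# Hypothetical: 3% of patent citations on 2% of publications -> index of 1.5.
print(relative_patent_citation_index(3_000, 100_000, 20_000, 1_000_000))  # 1.5
```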
36. Wuchty, S., Jones, B. F., & Uzzi, B. (2007). The Increasing Dominance of Teams in Production of Knowledge. Science, 316(5827), 1036–1039. doi:10.1126/science.1136099
37. In 2013, Elsevier and Science Europe collaborated on a joint report that analyzed and comparatively benchmarked collaboration rates amongst European countries and US states. For more information, see http://www.elsevier.com/about/press-releases/science-and-technology/european-and-us-research-collaboration-and-mobility-patterns-science-europe-and-elsevier-release-comprehensive-study.
38. For more information, see Appendix C: Glossary.
39. The average Salton index between any two U.S. states is 0.016.
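The Salton index follows the standard cosine form described in the references cited in notes 48 and 49; a minimal Python sketch with hypothetical counts that reproduce the 0.016 average:

```python
import math

def salton_index(copubs, pubs_a, pubs_b):
    """Salton's cosine measure: co-authored publications between two states,
    normalized by the geometric mean of their total publication counts."""
    return copubs / math.sqrt(pubs_a * pubs_b)

# Hypothetical: 1,600 co-authored papers between two states that each
# produced 100,000 papers.
print(salton_index(1_600, 100_000, 100_000))  # 0.016
```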
40. Moed, H. F., Glänzel, W., & Schmoch, U. (Eds.). (2005). Handbook of Quantitative Science and Technology Research. Dordrecht: Kluwer Academic Publishers. doi:10.1007/1-4020-2755-9
41. Price, D. J. de S. (1977). Foreword. In Essays of an Information Scientist (pp. v–ix).
42. Garfield, E. (1979). Is citation analysis a legitimate evaluation tool? Scientometrics, 1(4), 359–375. doi:10.1007/BF02019306
43. Pinski, G., & Narin, F. (1976). Citation influence for journal aggregates of scientific publications: Theory, with application to the literature of physics. Information Processing & Management, 12(5), 297–312. doi:10.1016/0306-4573(76)90048-0.
44. Irvine, J., Martin, B. R., Abraham, J., & Peacock, T. (1987). Assessing basic research: Reappraisal and update of an evaluation of four radio astronomy observatories. Research Policy, 16(2-4), 213–227. doi:10.1016/0048-7333(87)90031-X
45. ftp://ftp.cordis.europa.eu/pub/indicators/docs/3rd_report_biblio_ext_methodology.pdf.
46. Van Eck, N. J., Waltman, L., van Raan, A. F. J., Klautz, R. J. M., & Peul, W. C. (2013). Citation analysis may severely underestimate the impact of clinical research as compared to basic research. PloS One, 8(4), e62395. doi:10.1371/journal.pone.0062395
47. http://projectcounter.org/, http://usagereports.elsevier.com/asp/main.aspx
48. Glänzel, W. (2001). National characteristics in international scientific co-authorship relations. Scientometrics, 51(1), 69–115.
49. Leydesdorff, L. (2008). On the normalization and visualization of author co-citation data: Salton’s cosine versus the Jaccard index. Journal of the American Society for Information Science and Technology, 59(1), 77–85.