The introduction of impact assessments with the requirement to collate evidence retrospectively poses difficulties because evidence, measurements, and baselines have, in many cases, not been collected and may no longer be available. A very different approach, known as Social Impact Assessment Methods for research and funding instruments through the study of Productive Interactions (SIAMPI), was developed from the Dutch project Evaluating Research in Context and has as its central theme the capture of productive interactions between researchers and stakeholders by analysing the networks that evolve during research programmes (Spaapen and van Drooge 2011; Spaapen et al. 2011). The RQF pioneered the case study approach to assessing research impact; however, with a change in government in 2007, this framework was never implemented in Australia, although it has since been taken up and adapted for the UK REF. Such a case study might describe support for and development of research with end users, public engagement and evidence of knowledge exchange, or a demonstration of change in public opinion as a result of research.
In developing the UK REF, HEFCE commissioned a report from RAND in 2009 to review international practice for assessing research impact and provide recommendations to inform the development of the REF. Recommendations from the REF pilot were that the panel should be able to extend the time frame where appropriate; this, however, poses difficult decisions when submitting a case study to the REF as to what the view of the panel will be and whether, if the time frame is deemed inappropriate, the case study will be rendered unclassified. Perhaps it is time for a generic guide based on types of impact rather than research discipline? In the Brunel model, depth refers to the degree to which the research has influenced or caused change, whereas spread refers to the extent to which the change has occurred and influenced end users. Traditional bibliometric techniques can be regarded as giving only a partial picture of full impact (Bornmann and Marx 2013), with no link to causality. Systems need to be able to capture links between, and evidence of, the full pathway from research to impact, including knowledge exchange, outputs, outcomes, and interim impacts, to allow the route to impact to be traced.
There are areas of basic research where the impacts are so far removed from the research, or are impractical to demonstrate, that it might be prudent to accept the limitations of impact assessment and provide the potential for exclusion in appropriate circumstances. Impact is derived not only from targeted research but from serendipitous findings, good fortune, and complex networks interacting and translating knowledge and research. Productive interactions, which can perhaps be viewed as instances of knowledge exchange, are widely valued and supported internationally as mechanisms for enabling impact; they are often supported financially, for example by Canada's Social Sciences and Humanities Research Council, which funds knowledge exchange with a view to enabling long-term impact. To evaluate impact under the RQF, case studies were interrogated and verifiable indicators assessed to determine whether research had led to reciprocal engagement, adoption of research findings, or public value.
In this article, we draw on a broad range of examples with a focus on methods of evaluation for research impact within Higher Education Institutions (HEIs). As part of this review, we aim to explore the following question: what are the reasons behind trying to understand and evaluate research impact? The Social Return on Investment (SROI) guide (The SROI Network 2012) suggests that 'the language varies ("impact", "returns", "benefits", "value") but the questions around what sort of difference and how much of a difference we are making are the same'. To be considered for inclusion within the REF, impact must be underpinned by research that took place between 1 January 1993 and 31 December 2013, with impact occurring during an assessment window from 1 January 2008 to 31 July 2013. The REF will therefore assess three aspects of research: outputs, impact, and environment. Research impact is assessed in two formats: first, through an impact template that describes the approach to enabling impact within a unit of assessment, and second, using impact case studies that describe the impact taking place following excellent research within a unit of assessment (REF2014 2011a). A key concern here is that we could find that universities which can afford to employ either consultants or impact administrators will generate the best case studies. Quantitative metrics may be used in the UK to understand the benefits of research within academia and are often incorporated into the broader perspective of impact seen internationally, for example within Excellence in Research for Australia and STAR METRICS in the USA, where measures such as publications, citations, and research income are used to assess impact. A discussion on the benefits and drawbacks of a range of evaluation tools (bibliometrics, economic rate of return, peer review, case study, logic modelling, and benchmarking) can be found in the article by Grant (2006). Standard approaches actively used in programme evaluation, such as surveys, case studies, bibliometrics, econometrics and statistical analyses, content analysis, and expert judgment, are each considered by some (Vonortas and Link 2012) to have shortcomings when used to measure impacts. Every piece of research results in a unique tapestry of impact, and despite the MICE taxonomy having more than 100 indicators, it was found that these did not suffice. It is now possible to use data-mining tools to extract specific data from narratives or unstructured data (Mugabushaka and Papazoglou 2012). This is already being done for the collation of academic impact and outputs, for example by the Research Portfolio Online Reporting Tools, which use PubMed and text mining to cluster research projects, and by STAR METRICS in the US, which uses administrative records and research outputs; a similar approach is being implemented by the ERC using data in the public domain.
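The cited work does not publish a pipeline that can be reproduced here, but the general idea of mining candidate impact indicators from narrative text can be illustrated. The following Python sketch is a minimal, hypothetical example: the indicator categories, trigger phrases, and function name are illustrative assumptions, not a published taxonomy or tool.

```python
import re
from collections import defaultdict

# Hypothetical indicator lexicon mapping category -> trigger phrases.
# Categories and phrases are illustrative only.
INDICATOR_PATTERNS = {
    "policy": [r"\bpolicy\b", r"\bguideline(s)?\b", r"\blegislation\b"],
    "economic": [r"\bspin[- ]?out\b", r"\blicen[cs]e(d|s)?\b", r"\brevenue\b"],
    "health": [r"\bclinical trial(s)?\b", r"\bpatient(s)?\b", r"\btreatment(s)?\b"],
    "public_engagement": [r"\bworkshop(s)?\b", r"\bexhibition(s)?\b", r"\bmedia\b"],
}

def extract_indicators(narrative: str) -> dict[str, list[str]]:
    """Return sentences from a case-study narrative that mention a
    candidate impact indicator, grouped by indicator category."""
    hits = defaultdict(list)
    # Naive sentence split; a real system would use a proper NLP library.
    sentences = re.split(r"(?<=[.!?])\s+", narrative)
    for sentence in sentences:
        for category, patterns in INDICATOR_PATTERNS.items():
            if any(re.search(p, sentence, re.IGNORECASE) for p in patterns):
                hits[category].append(sentence.strip())
    return dict(hits)

if __name__ == "__main__":
    text = ("The findings informed national screening guidelines. "
            "A spin-out company was formed to license the assay. "
            "Researchers ran public workshops in local schools.")
    for category, sentences in extract_indicators(text).items():
        print(f"{category}: {sentences}")
```

A production system would add a curated lexicon, better sentence segmentation, and links from each hit back to the source case study so that evidence remains traceable.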
The Payback Framework systematically links research with the associated benefits (Scoble et al. n.d.); it enables health and medical research and impact to be linked and the process by which impact occurs to be traced. Wooding et al. (2007) adapted the terminology of the Payback Framework, developed for the health and biomedical sciences, from 'benefit' to 'impact' when modifying the framework for the social sciences, arguing that the positive or negative nature of a change is subjective and can also change with time. This has commonly been highlighted with the drug thalidomide, which was introduced in the 1950s to help with, among other things, morning sickness but was withdrawn in the early 1960s after its teratogenic effects resulted in birth defects. It is perhaps assumed that a positive or beneficial effect will be considered as an impact, but what about changes that are perceived to be negative? The time lag between research and impact varies enormously. For example, the development of a spin-out can take place in a very short period, whereas it took around 30 years from the discovery of DNA before technology was developed to enable DNA fingerprinting. Even where we can evidence changes and benefits linked to our research, understanding the causal relationship may be difficult. Understanding what impact looks like across the various strands of research, and the variety of indicators and proxies used to evidence impact, will be important to developing a meaningful assessment. It is worth considering the degree to which indicators are defined, and providing broader definitions may allow greater flexibility. While aspects of impact can be adequately interpreted using metrics, narratives, and other evidence, the mixed-method case study approach is an excellent means of pulling all available information, data, and evidence together, allowing a comprehensive summary of the impact within context. It is therefore in an institution's interest to have a process by which all the necessary information is captured, to enable a story to be developed in the absence of a researcher who may have left the employment of the institution. The types of information that such systems need to capture and link include the following (a sketch of one possible data model is given after the list):

- Research findings, including outputs (e.g., presentations and publications)
- Communications and interactions with stakeholders and the wider public (emails, visits, workshops, media publicity, etc.)
- Feedback from stakeholders and communication summaries (e.g., testimonials and altmetrics)
- Research developments (based on stakeholder input and discussions)
- Outcomes (e.g., commercial and cultural outcomes, citations)
- Impacts (changes, e.g., behavioural and economic)
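To make the linking requirement concrete, here is a minimal Python sketch of a store that records these item types and traces the chain from a research output through interactions to impacts. The record kinds, field names, and ImpactStore class are illustrative assumptions, not the schema of any system mentioned in this article (such as the Research Outcomes System or Researchfish).

```python
from dataclasses import dataclass, field

# Hypothetical record mirroring the list above; fields are illustrative only.
@dataclass
class Record:
    id: str
    kind: str          # e.g. "output", "interaction", "feedback", "outcome", "impact"
    description: str
    date: str          # ISO date kept as text for simplicity
    links: list[str] = field(default_factory=list)  # ids of downstream records

class ImpactStore:
    """Store records and trace the pathway from research to impact."""
    def __init__(self):
        self.records: dict[str, Record] = {}

    def add(self, record: Record) -> None:
        self.records[record.id] = record

    def trace(self, record_id: str, seen=None) -> list[Record]:
        """Follow links forward from a record, returning the chain of
        connected records; 'seen' guards against cycles in the network."""
        seen = seen or set()
        chain = []
        for linked_id in self.records[record_id].links:
            if linked_id in seen:
                continue
            seen.add(linked_id)
            chain.append(self.records[linked_id])
            chain.extend(self.trace(linked_id, seen))
        return chain

store = ImpactStore()
store.add(Record("r1", "output", "Paper on assay method", "2009-05-01", links=["k1"]))
store.add(Record("k1", "interaction", "Workshop with clinicians", "2010-03-12", links=["i1"]))
store.add(Record("i1", "impact", "Assay adopted in screening service", "2012-09-30"))
print([r.description for r in store.trace("r1")])
```

The point of the design is that interactions and interim impacts are first-class records rather than free-text notes, so the route from research to impact can be reconstructed even after staff have moved on.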
The Payback Framework incorporates both academic outputs and wider societal benefits (Donovan and Hanney 2011) to assess outcomes of health sciences research. It can be seen from the panel guidance produced by HEFCE to illustrate impacts and evidence that impact and evidence are expected to vary according to discipline (REF2014 2012).
Figure 1, replicated from Hughes and Martin (2012), illustrates how the ease with which impact can be attributed decreases with time, whereas the impact, or effect of complementary assets, increases. It may take a considerable amount of time for the full impact of a piece of research to develop, and because of this time lag, and the increasing complexity of the networks involved in translating the research and interim impacts, it becomes more difficult to attribute and link impact back to a contributing piece of research. We suggest that developing systems that focus on recording impact information alone will not provide all that is required to link research to ensuing events and impacts; systems require the capacity to capture any interactions between researchers, the institution, and external stakeholders, and to link these with research findings and outputs or interim impacts to provide a network of data. It is acknowledged that one of the outcomes of developing new knowledge through research can be 'knowledge creep', where new data or information becomes accepted and is absorbed over time. It has also been acknowledged that outstanding leaps forward in knowledge and understanding come from immersion in a background of intellectual thinking: one is able to see further by standing on the shoulders of giants. Impact has become the term of choice in the UK for research influence beyond academia.
Concerns over how to attribute impacts have been raised many times (The Allen Consulting Group 2005; Duryea et al. 2007). It is possible to incorporate both metrics and narratives within systems, for example within the Research Outcomes System and Researchfish, currently used by several of the UK research councils to allow impacts to be recorded; although recording narratives has the advantage of allowing some context to be documented, it may make the evidence less flexible for use by different stakeholder groups (which include government, funding bodies, research assessment agencies, research providers, and user communities) for whom the purpose of analysis may vary (Davies et al. 2005).
The Payback Framework has been adopted internationally, largely within the health sector, by organizations such as the Canadian Institutes of Health Research, the Dutch Public Health Authority, the Australian National Health and Medical Research Council, and the Welfare Bureau in Hong Kong (Bernstein et al. 2006; Nason et al. 2008; CAHS 2009; Spaapen et al. 2011); examples of its application to the impact arising from health research include Wooding et al. (2005), Wooding et al. (2008), and Hanney and González-Block (2011). Again, the objective and perspective of the individuals and organizations assessing impact will be key to understanding how temporary and dissipated impact will be valued in comparison with longer-term impact.
When considering the impact that is generated as a result of research, a number of authors and government recommendations have advised that a clear definition of impact is required (Duryea, Hochman, and Parfitt 2007; Grant et al. 2009; Russell Group 2009).
Impact can be temporary or long-lasting. In the UK, evaluation of academic and broader socio-economic impact takes place separately.
Impact is assessed alongside research outputs and environment to provide an evaluation of research taking place within an institution. The criteria for assessment were also supported by a model developed by Brunel for the measurement of impact that used similar measures, defined as depth and spread. While defining the terminology used to understand impact and indicators will enable comparable data to be stored and shared between organizations, we would recommend that any categorization of impacts be flexible, such that impacts arising from non-standard routes can be placed. In the UK, the Department for Business, Innovation and Skills provided funding of £150 million for knowledge exchange in 2011-12 to help universities and colleges support the economic recovery and growth, and contribute to wider society (Department for Business, Innovation and Skills 2012). If knowledge exchange events could be captured, for example electronically as they occur, or automatically if flagged from an electronic calendar or a diary, then far more of these events could be recorded with relative ease.
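As an illustration of how flagged calendar entries might be harvested, the Python sketch below assumes researchers tag knowledge exchange meetings with a "[KE]" marker in the event title; the flag, event shape, and function are hypothetical, and no real calendar API is involved.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical convention: researchers prefix knowledge exchange meetings
# with "[KE]" in their calendar so the events can be harvested automatically.
KE_FLAG = "[KE]"

@dataclass
class CalendarEvent:
    title: str
    when: date
    attendees: list[str]

def harvest_ke_events(events: list[CalendarEvent]) -> list[dict]:
    """Pull flagged events out of a calendar feed and normalise them into
    knowledge-exchange records for an impact-tracking system."""
    records = []
    for event in events:
        if KE_FLAG in event.title:
            records.append({
                "description": event.title.replace(KE_FLAG, "").strip(),
                "date": event.when.isoformat(),
                "stakeholders": event.attendees,
            })
    return records

if __name__ == "__main__":
    feed = [
        CalendarEvent("[KE] Briefing with city council", date(2013, 2, 5), ["j.smith@council.gov.uk"]),
        CalendarEvent("Lab meeting", date(2013, 2, 6), ["team"]),
    ]
    print(harvest_ke_events(feed))
```

Records produced this way could then be linked to the relevant research outputs in the kind of store sketched earlier, so that routine interactions accumulate into evidence of the pathway to impact.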