"CSOs sometimes need to summarise or aggregate information across multiple interventions. This can be a difficult and challenging task, especially for large non-governmental organisations (NGOs) working in many different countries and/or sectors. Summarisation and aggregation can be achieved through
...
a variety of methods. However, all come with associated costs." (Introduction)
"Beginning in 2002, working closely with co-evaluators and commissioners of evaluations, the author developed Outcome Harvesting to enable evaluators, grant makers, and managers to identify, formulate, verify, and make sense of changes that interventions have influenced in a broad range of cuttingâ
...
€“edge innovation and development projects and programs around the world. Over these years, he led Outcome Harvesting evaluative exercises involving almost 500 non-governmental organizations, networks, government agencies, funding agencies, community-based organizations, research institutes and university programs. In over fifty evaluations, with forty co-evaluators he has harvested thousands of outcomes on six continents. Outcome Harvesting has proven useful in evaluations of a great diversity of initiatives: human rights advocacy, political, economic and environmental advocacy, arts and culture, health systems, information and communication technology, conflict and peace, water and sanitation, taxonomy for development, violence against women, rural development, organic agriculture, participatory democracy, waste management, public sector reform, good governance, eLearning, social accountability, and business competition, amongst others. In this book, the author explains the steps of Outcome Harvesting and how to customize them according to the nine underlying principles. He shares his experience and gives practical advice on how to work with Outcome Harvesting and remain true to its essential features." (Back cover)
"Systemic evaluation design continuously addresses six questions: 1. What are the intervention purposes? 2. What is the scope and focus of the intervention and evaluation? 3. What ought to be the consequences of the evaluation and what evaluation purposes promote those consequences? 4. What are the
...
criteria (or values) that should underpin the judgment of merit, worth and significance? 5. What questions inform the collection of data so that judgments can be made using those criteria? 6. How can the evaluation be feasible? A systemic approach can be summarised as: understanding inter-relationships, engaging with multiple perspectives and reflecting on boundary choices. Systemic evaluation design flows from three key principles: 1. Systemic evaluation design is a process and a product. Evaluation design is a process that occurs throughout the evaluation, not just a product of the first stage of an evaluation. 2. Systemic evaluation design focuses on consequences. And by that I mean the consequences of the evaluation. Some might call it outcomes. 3. Systemic evaluation design emphasises what to leave out rather than what to put in. It is not possible to include everything that happened in the intervention, nor possible to include every single perspective or viewpoint or framing of the intervention." (Page ii)
"Diana Ingenhoff und Alexander Buhmann führen in den aktuellen Forschungs- und Wissensstand zu Public Diplomacy und insbesondere zu Landesimages ein. Sie reflektieren dabei Fragen der Messung, Entstehung und Gestaltung von Landesimages und geben Antworten auf die folgenden Leitfragen: Welche Aspekt
...
e/Dimensionen eines Landes sind wichtig für sein Image und wie entsteht es? Welches sind die für die Imagebildung wirksamen Kanäle? Welche Handlungsrelevanz und Wirksamkeit hat das Landesimage? Wie lässt sich die Wirksamkeit von Public Diplomacy und Landeskommunikation messen und evaluieren? Das Buch dokumentiert und diskutiert die facettenreiche Literatur zu Landesimages und Public Diplomacy. Es enthält zahlreiche Abbildungen, ein Glossar und ein Register und fördert damit den Dialog zwischen Forschung und Praxis." (Verlagsbeschreibung)
"Process-tracing in social science is a method for studying causal mechanisms linking causes with outcomes. This enables the researcher to make strong inferences about how a cause (or set of causes) contributes to producing an outcome. In this extensively revised and updated edition, Derek Beach and
...
Rasmus Brun Pedersen introduce a refined definition of process-tracing, differentiating it into four distinct variants and explaining the applications and limitations of each. The authors develop the underlying logic of process-tracing, including how one should understand causal mechanisms and how Bayesian logic enables strong within-case inferences. They provide instructions for identifying the variant of process-tracing most appropriate for the research question at hand and a set of guidelines for each stage of the research process." (Publisher description)
"Evaluation: A Systematic Approach is the best-selling comprehensive introduction to the field of program evaluation, covering the range of evaluation research activities used in appraising the design, implementation, effectiveness, and efficiency of social programs. Evaluation domains are presented
...
in a coherent framework that not only explores each, but recognizes their interrelationships, their role in improving social programs and the outcomes they are designed to affect, and their embeddedness in social and political context. Relied on as the “gold standard” by professors, students, and practitioners for 40 years, the new Eighth Edition includes a new practical chapter on planning an evaluation, entirely new examples throughout, and a major re-organization of the book’s content to better serve the needs of program evaluation courses." (Publisher description)
"Demystifying the evaluation journey, this is the first evaluation mentoring book that addresses the choices, roles, and challenges that evaluators must navigate in the real world. Experienced evaluator and trainer Donna R. Podems covers both conceptual and technical aspects of practice in a friendl
...
y, conversational style. She focuses not just on how to do evaluations but how to think like an evaluator, fostering reflective, ethical, and culturally sensitive practice. Extensive case examples illustrate the process of conceptualizing and implementing an evaluation--clarifying interventions, identifying beneficiaries, gathering data, discussing results, valuing, and developing recommendations. The differences (and connections) between research, evaluation, and monitoring are explored. Handy icons identify instructive features including self-study exercises, group activities, clarifying questions, facilitation and negotiation techniques, insider tips, advice, and resources." (Publisher description)
"This text provides a solid foundation in program evaluation, covering the main components of evaluating agencies and their programs, how best to address those components, and the procedures to follow when conducting evaluations. Different models and approaches are paired with practical techniques,
...
such as how to plan an interview to collect qualitative data and how to use statistical analyses to report results. In every chapter, case studies provide real world examples of evaluations broken down into the main elements of program evaluation: the needs that led to the program, the implementation of program plans, the people connected to the program, unexpected side effects, the role of evaluators in improving programs, the results, and the factors behind the results. In addition, the story of one of the evaluators involved in each case study is presented to show the human side of evaluation." (Publisher description)
"Within the development field, project evaluations and impact assessments are essential. Donors are increasingly requiring rigorous evaluations in order to (1) ensure that aid dollars are spent on projects that are having positive impacts and not being wasted on projects that are ineffective and (2)
...
promote “evidence-based policy making” in which evaluations contribute to understanding best practices for development aid. These two goals are frequently referred to by the world’s major donors as promoting “accountability” and “learning,” respectively. However, current conceptions of learning and accountability are problematic – at times even counterproductive. This chapter provides an overview of the role of evaluations in the CDS field and the concepts of accountability and learning and then describes the problems, contradictions, and ethical dilemmas that arise in the field because of them. The chapter ends with suggestions for how the field might fine tune the concepts of learning and accountability in a way that would better serve both donors and aid recipients." (Abstract)
"Robert G. Picard describes the evolvement of UNESCO's media development indicators. The chapter describes a growing focus on economic, financial and managerial dimensions, since, it argues, they pave the fundament to any sustainable, commercial or non-commercial journalistic venture. What Picard cr
...
itically argues is that there is no universal quick fix for sustainable journalism. Any normative effort to define and measure media development or sustainable journalism also needs to take into account the local contingencies, where sustainability may look quite different depending on its temporal, geographic, economic and cultural context." (Page xxxi)
"Chapters feature: A review of 30 frameworks and models that inform processes for evaluation in advertising, public relations, health communication and promotion, government communication and other specialist fields including the latest recommendations of industry bodies, evaluation councils, and re
...
search institutes in several countries; Recommendations for standards based on contemporary social science research and industry initiatives such as the Task Force for Standardization of Communication Planning and Evaluation Models and the Coalition for Public Relations Research Standards; A comprehensive review of metrics that can inform evaluation including digital and social media metrics, 10 informal research methods, and more than 30 formal research methods for evaluating public communication; evaluation of public communication campaigns and projects in 12 contemporary case studies." (Back cover)
"This report puts forth a Walton Family Foundation (WFF) Media Impact Framework and identifies key indicators and research methods that can be used to assess journalism investments across program areas, based on best practices in the field of media strategy and research. The companion WFF Media Impa
...
ct Toolkit provides a set of steps for grantmakers to take to get from the decision to invest in media all the way to identifying the appropriate indicators, targets and baselines. The information that follows is customized to reflect WFF goals, priorities and processes, but it is our hope that other partners can utilize the materials to become more effective communicators and grantmakers. This report also provides background information about the journalistic profession, noting in particular its differences from social change and advocacy organizations. This context is especially important for framing what can be difficult conversations with journalism organizations, given their commitment to neutrality and the recent— and ongoing — upheaval in the industry." (Summary, page 3)
"Dieser Beitrag widmet sich dem medienvermittelten Vertrauen in NGOs. Ein auf Basis der Theorie des öffentlichen Vertrauens (Bentele 1994) entwickeltes, inhaltsanalytisches Messinstrument, der NGO-Trust Index (NGO-TI) und die Ergebnis-Visualisierung in der NGO-Trust Map (Wohlgemuth et al. 2013), we
...
rden vorgestellt, getestet und diskutiert. Der Methodentest erfolgt anhand einer vergleichenden Fallstudie zur deutschen Sektion des Kinderhilfswerks der Vereinten Nationen (Unicef Deutschland) und der deutschen Umweltstiftung des World Wide Fund for Nature." (Zusammenfassung)
"Zunächst werden theoretische Konzepte und Modelle sowie die empirische Forschung zum Einsatz von Evaluation und Controlling für die NGO-Kommunikation gesichtet. Dabei werden auch Konzepte für die Evaluation von Organisationen sowie für die Kommunikation von Nonprofit-Organisationen (NPO) als al
...
lgemeineren Formen der Evaluation herangezogen. Anschließend werden das Wirkungsstufen-Modell der Deutschen Public Relations Gesellschaft (DPRG) und des Internationalen Controller Vereins (ICV), das Logic Model für die Evaluation von Nonprofit-Organisationen sowie das Meinungsklima-/Koorientierungsmodell (MKM) für die Messung, Bewertung und Steuerung der Kommunikation von Organisationen miteinander verknüpft und zu einem generischen Modell für die NGO-Kommunikation weiterentwickelt." (Zusammenfassung)
"The idea of using mixed methods has a long tradition in social research. But it is also recognized that mixed methods are often poorly applied. In quantitative analysis, the qualitative component, if any, is often poorly designed, integrated or reported. These guidelines are to support the design,
...
conduct and reporting of mixed-methods in quantitative impact evaluations. That is, impact studies using a large statistical design with a qualitative component. The guidelines are based largely on the CEDIL inception paper by Jimenez et al. (2018), ‘Mixing and matching: using qualitative methods to improve quantitative impact evaluations (IEs) and systematic reviews (SRs) of development outcomes’, supplemented by a review of other relevant guidelines e.g. the CONSORT extension for development effectiveness (Bose, 2010)." (Page 1)
"Based on over 40 interviews with practitioners, this report identifies “civic media practice” as media and technology used to facilitate democratic process. It focuses specifically on those practitioners using media tools to form relationships and build trust - a practice that sometimes runs co
...
unter to the apparent needs of organizations to enhance efficiency through technology. This report identifies civic media practice as a direct response to the crisis of distrust and describes the negotiation of values that takes place as media is designed and deployed in organizations.
The process of identification and evaluation of civic media practice is described in detail. The report presents a method of process evaluation that allows practitioners to measure their progress along two central axes: social infrastructure and objective. Civic media practice is always striving towards strong social infrastructure and longevity. As a means of measuring progress along these axes, we identify four activities that can be tracked. They include: 1) Network Building; 2) Holding Space for Discussion; 3) Distributing Ownership; 4) Persistent Input. We present reflective questions that can be asked throughout a civic media project to track progress in these areas.
Finally, we provide recommendations for practitioners and funders as they create and support civic media practice. The institution of civic media is nascent. This report is meant to solidify common principles and provide direction for those invested in transforming civic life through media practice." (Executive summary, page 5)