The Advancements and Innovations in Evaluation session will be held from 4:15 to 4:55 PM on Tuesday, May 11, 2021.
Pre-selection is required. Speaker bios can be found here. This session includes the following three concurrent classes:
Co-Creating Impact with Nonprofits: United Way Halton & Hamilton Investment Framework Case Study
Presenters: Alecia Korkowski and Vivien Underdown
In this session, participants will learn about the changing nature and identity of the United Way movement from the local chapter, United Way Halton & Hamilton, including how evaluation and program assessment have changed over the past decade. Hear about the shift from “funder” to “impact organization” and what this means for the way in which we invest in community, measure impact, and grapple with the concept of “success.”
Learning Objectives:
1. Participants will understand how funding bodies can respond to the multiple human and natural contexts within which a program is embedded.
2. Participants will explore how funding bodies can be self-aware and incorporate reflective thinking to continually improve activities.
Credentialed Evaluator (CE) Competencies:
1.7 Uses self-awareness and reflective thinking to continually improve practice.
3.1 Examines and responds to the multiple human and natural contexts within which the program is embedded.
3.3 Respects all stakeholders and strives to build and maintain trusting relationships.
3.5 Identifies and responds to changes in the context of the program and considers potential positive and negative impacts of the evaluation.
4.3 Identifies and effectively uses required human, financial, and technical resources.
5.5 Builds partnerships within the evaluation context.
Ontario’s Data Strategy: Questions and Opportunities for Evaluators
Presenters: John Roberts and Vanessa Parlette
The stories we’re able to tell about program impact are only as good as the data that inform them. How might government data be better structured and accessed to enable evaluators (and the organizations with which they work) to better understand their communities and the impact of specific programs or policies? There are many changes underway with the potential to transform the way data are accessed, used, and shared across sectors. In 2019, the Ontario Government launched a new Ontario Data Strategy and created a Digital and Data Task Force. Changes to the Freedom of Information and Protection of Privacy Act (FIPPA) have been enacted to enable data integration across ministries, and in 2020 the province carried out consultation on privacy reform. A new Simpler, Faster, Better Services Act applies to the public sector and some nonprofits.
Join our Question & Answer session with Chief Privacy Officer, John Roberts, and Program Director of the Data Policy Coalition, Vanessa Parlette, to learn more about recent changes and current plans around the Ontario Data Strategy, including data legislation and data integration. What has changed in the context of COVID? What does this mean for the nonprofit evaluation sector and the communities we represent? How might nonprofits, evaluators, and government work together in strengthening ethical access and use of data for public benefit?
Learning Objectives:
1. Participants will discuss how data-sharing can help rebuild more equitable systems through transition and recovery.
2. Participants will learn how to engage provincial and federal government leaders in identifying opportunities for collaboration around data infrastructure and data-sharing with the nonprofit evaluation sector.
3. Participants will explore potential partnerships between government and nonprofit leaders/evaluators to co-design solutions to shared policy or social challenges through new applications of data-sharing.
Credentialed Evaluator (CE) Competencies:
1.1 Knows evaluation theories, models, methods and tools and stays informed about new thinking and best practices.
1.4 Considers the well-being of human and natural systems in evaluation practice.
2.7 Identifies data requirements, sources, sampling, and data collection tools.
3.1 Examines and responds to the multiple human and natural contexts within which the program is embedded.
3.2 Identifies stakeholders’ needs and their capacity to participate, while recognizing, respecting, and responding to aspects of diversity.
3.5 Identifies and responds to changes in the context of the program and considers potential positive and negative impacts of the evaluation.
3.6 Engages in reciprocal processes in which evaluation knowledge and expertise are shared between the evaluator and stakeholders to enhance evaluation capacity for all.
5.4 Uses a variety of processes that result in mutually negotiated agreements, shared understandings and consensus building.
5.5 Builds partnerships within the evaluation context.
Leading in Times of Uncertainty & Complexity: How Evaluators Can Use Their Skill Sets to Support Innovation, Rapid Adaptation and Emergent Strategy
Presenter: Tanya Darisi
The global pandemic and resulting shutdowns have put social purpose and philanthropic organizations in a difficult position. Programs cannot run as intended, there are limits on how and where services can be delivered, and remaining staff are burning out. Many nonprofits are scaling back, putting initiatives and commitments on hold. In this session, Tanya Darisi, co-founder of Openly, will lead a conversation designed to inspire evaluators in leveraging their skill sets and approaches to better meet the realities and needs of the times. Drawing on lessons and principles from the entrepreneurial mindset of Lean Startup, Tanya will share how evaluation can itself pivot to support greater innovation, strategy, and most importantly, meaningful social impact. These lessons will be grounded with practical examples from recent client work.
Learning Objectives:
1. Participants will be able to apply principles of emergence and lean approaches to evaluation design.
2. Participants will be able to provide more strategic advice about evaluation priorities and methods, to better serve the needs of organizations and institutions.
Credentialed Evaluator (CE) Competencies:
3.1 Examines and responds to the multiple human and natural contexts within which the program is embedded.
3.4 Promotes and facilitates usefulness of the evaluation process and results.
3.5 Identifies and responds to changes in the context of the program and considers potential positive and negative impacts of the evaluation.