- Social Media
- Active Citizenship
- Good Giving
- Corporate Responsibility
- Be Fearless
About the Process
Make It Your Own Program Goals
The program had five goals:
- To support and “lift up” citizen-centered work in communities across the country.
- To raise awareness of multiple pathways to active engagement and reach new and diverse audiences.
- To empower people to take matters into their own hands with tools such as widgets, web pages, and personal blogs.
- To collect and disseminate compelling stories about new ways people are breaking down barriers to participation and taking back their communities.
- To test a “citizen-centered” approach to philanthropy that would involve real people in every step of the grantmaking process, from input on guidelines and applicant review to deciding on finalists and winners. It would also reflect the Foundation’s commitment to “walking the talk” of citizen participation outlined in Citizens at the Center.
The Process: Components
The MIYO grantmaking process involved several steps:
In conjunction with partner organizations, Case Foundation staff convened meetings in Boston, Minneapolis, and Washington, D.C. to vet draft program guidelines with a diverse group of scholars and practitioners in the civic engagement field. Many of the suggestions raised were incorporated into the final guidelines.
The Case Foundation developed a five-question online application that was available to any individual who wanted to apply. This process allowed the Foundation to combine the best of traditional grantmaking, which often uses a letter of intent or proposal to determine funding potential, with new approaches and tools that capitalized on technology’s efficiency, reach, and marketing/fundraising capacities.
Working with numerous nonprofit partner organizations and using a comprehensive marketing strategy, the Foundation publicized the program and encouraged applications through numerous venues to ensure broad and diverse representation. Nearly 5,000 people filled out the online application, all of whom received a personalized web page and a widget from the Foundation to help them develop and publicize their idea/project to a wider audience. The Foundation also collected applicant demographics and other personal data.
The Foundation also provided several “information sessions”—webinars that provided opportunities for potential applicants to learn more about the process prior to submitting an application to the program. These sessions, held for constituents of partner organizations as well as members of the general public, provided information about the citizen-centered concept, guidelines, and program structure.
The Case Foundation prepared a “job description” to encourage practitioners and ordinary people to apply as citizen reviewers for the applications. The description was disseminated widely, and the Foundation received approximately 200 resumes. Foundation staff reviewed these and initially selected 30 reviewers. When the number of applications received approached 5,000, staff recruited an additional 63 reviewers, drawing on all the alternates and reaching out to practitioners and scholars in the field with whom the Foundation had existing relationships. The total number of external reviewers thus increased to 93.
External reviewer process.
Foundation staff required all external reviewers to participate in an online training the Foundation prepared. An important part of that training was reviewing and understanding a detailed scoring rubric that broke down and codified each section of the application. Using this rubric, reviewers rated each assigned application on an online scoring template. Two reviewers assessed each application; if there was a major discrepancy between their two scores, a third reviewer was asked to score the application as well.
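The two-score rule above can be sketched in a few lines of Python. This is a minimal illustration, not the Foundation’s actual system; the function name and the numeric threshold are assumptions, since the report does not define what counted as a “major discrepancy.”

```python
def needs_third_review(score_a, score_b, threshold=50):
    """Flag an application for a third reviewer when the first two
    scores diverge by more than the threshold. The threshold value
    is hypothetical; the report does not specify one."""
    return abs(score_a - score_b) > threshold

# Two close scores stand; widely divergent scores trigger a third review.
needs_third_review(100, 120)  # False
needs_third_review(50, 210)   # True
```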
Getting to the Top 100 with external reviewers.
Case Foundation staff compiled a list of the Top 100 applicants based on the combined scores assigned by the external reviewers, who were charged with this part of the process. A small committee of Foundation staff and consultants reviewed this list with an eye toward ensuring representative geographic, racial/ethnic, age, and gender diversity.
While reviewing this list, staff discovered that some projects that had initially received lower marks from their first two reviewers had risen on the list, while some that had scored higher had dropped. The reason: the third reviewer’s score was averaged in with the first two, pulling the mean toward it (for example, a project that had received a “50” and a “210” from the first two reviewers may have then received a “200” from the third, raising its average).
To remedy this, staff re-averaged the applications that had a third score, using only the two scores that agreed: if two of the three marks were high, only those two were averaged, and if two were low, only those two were averaged. That led to a slight shifting of the top scores. Once the revised list had been compiled, staff re-reviewed it to ensure representative diversity.
An important lesson of this evaluation was that if this process is undertaken again, four reviewers should be assigned to each proposal, with the two middle scores averaged and the high and low scores thrown out.
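The four-reviewer rule recommended here amounts to a trimmed mean: sort the scores, drop the highest and lowest, and average what remains. A minimal sketch (the function name and structure are illustrative, not the Foundation’s code):

```python
def trimmed_mean(scores):
    """Drop the single highest and single lowest score, then average
    the rest: with four reviewers, the two middle scores."""
    if len(scores) < 3:
        raise ValueError("need at least three scores to trim")
    middle = sorted(scores)[1:-1]
    return sum(middle) / len(middle)

# An outlier low score of 50 no longer drags the project down:
trimmed_mean([50, 190, 200, 210])  # (190 + 200) / 2 = 195.0
```

With three scores this reduces to taking the median, which is why a third tie-breaking reviewer also fits this scheme.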
The Foundation also disqualified about 40 applicants who did not submit complete applications or meet basic eligibility criteria, which included being 14 years of age or older and residing in, and submitting a project focused on, one of the 50 states, D.C., or Puerto Rico.
Once the final list had been set, the Foundation conducted background checks on these applicants and their nonprofit partners to ensure that they were in compliance with the law and in a financial position to accept and manage the grant.
Top 100 applications.
The Top 100 were invited to submit full proposals (also completed online). Scoring rubrics were again created to help operationalize the sections of the proposal. Ninety-six of the Top 100 submitted completed proposals.
The Top 100 were also provided with customized, advanced widgets and web pages on the Case Foundation website to help them with their fundraising and outreach.
In addition, each of the Top 100 was offered a proposal coach to help them craft a compelling and distinctive proposal that would enhance their chances of winning a grant. These coaches were paid, short-term contractors with substantial experience in fundraising, grantmaking, community development, and/or citizen-centered work, recruited from the Foundation’s pool of external reviewers and other networks.
Each coach committed to spending up to two hours with each of up to ten applicants. Coaches also were asked to participate in a brief online training that provided context about the MIYO program and process, as well as suggestions for the kinds of assistance applicants might be looking for. The coaches served in an advisory capacity and did not write proposals on behalf of applicants.
Final 30 to Top 20: Bringing in the experts.
Staff compiled a list of the top 30 proposals (20 top scorers and 10 alternates) and gave it to a small group of expert judges—people in the field with significant knowledge and experience in citizen-centered work, as well as Case Foundation senior staff members and consultants—to score using an online scoring process. Judges were also required to attend an online training orientation.
The inclusion of experts in a process whose primary focus is real people may appear somewhat contradictory, but it was a deliberate decision that, in fact, underscores one of the key tenets of citizen-centered work: it is neither bottom-up nor top-down, and it includes both experts and non-experts in decision-making processes that affect both groups. After considerable effort to let non-experts decide the Top 100 and then the top 30 applicants, the Foundation believed that winnowing this list to the Top 20 could and should be the domain of individuals with deep experience in this work.
Each proposal was reviewed by two expert reviewers; again, if there was a discrepancy, a third reviewer added a score. Scores were compiled, and the final list was given to judges at a half-day, in-person meeting at the Case Foundation, during which participants discussed each finalist to come to consensus on the Top 20. Considerable time and discussion were spent on ensuring geographic representation, as well as other factors such as the project’s focus, ethnic/racial diversity, the age of applicants, and gender.
Top 20 finalists were asked to select a nonprofit partner that would serve as fiduciary agent and implementation partner. In many cases, finalists were employed by or had a close relationship with their partner. Where the Case Foundation determined that a selected partner did not have the capacity or a proven track record to manage the grant, the Foundation suggested a couple of national partners the finalist could choose to work with.
Finalists and fiduciary partners co-signed grant letters together, with the monies going to the nonprofit and the finalist listed as the project director. In one case, the Foundation issued an “expenditure responsibility” grant to a finalist that operated a small for-profit social enterprise.
Public online voting.
The Case Foundation asked the Top 20 projects to provide photos and a brief description of their project to post on the Case Foundation website as information for public voting. Top 20 participants were also provided with an outreach ambassador to assist in developing their personal voting campaigns, and their fundraising widgets were converted into “vote for me” widgets during this phase of the program. In addition, each received a “candidate kit”—a customized mini-marketing plan that came with press releases, flyers, bumper stickers, and more.
Like the proposal coaches, outreach ambassadors were external experts, in this case supporting finalists in mobilizing supporters. The ambassadors served in an advisory capacity and did not conduct outreach on behalf of, or advocate for, their assigned applicants.
To elicit public participation in the voting process, the Case Foundation engaged in several extensive local and national outreach and marketing endeavors, including garnering agreements from more than 100 partner organizations to help publicize the MIYO program and encourage participation in it among their networks and members. The Foundation also released a viral video about MIYO, engaged in a national media campaign, and pitched all Top 20 stories to local media.
Getting to the Final Four and results.
Voting was open for four weeks, during which time 15,232 people voted to select the Final Four. Voters were asked to select four projects, however, not just a single personal favorite—a tactic deliberately instituted to encourage voters to read the stories and make informed decisions.
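The select-exactly-four rule can be expressed as a simple ballot check. This is a hypothetical sketch; the report does not describe the Foundation’s actual validation code, and the names and numbering are illustrative.

```python
def is_valid_ballot(choices, required=4, field_size=20):
    """A ballot is valid only if it names exactly four distinct
    projects drawn from the Top 20 (numbered 1-20 here)."""
    return (len(choices) == required
            and len(set(choices)) == required
            and all(1 <= c <= field_size for c in choices))

is_valid_ballot([3, 7, 11, 18])  # True: four distinct Top 20 projects
is_valid_ballot([3, 3, 7, 11])   # False: a duplicate choice
```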
Additionally, the Foundation programmed the ballot so that each time voters loaded or refreshed the page, the order in which the Top 20 appeared was randomized, which leveled the playing field and made voters less likely to favor projects simply because they appeared at the top of the list. For the same reason, the Foundation chose not to employ a “leader board” and did not reveal participants’ standing to them during the voting process.
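Randomizing the ballot order on each page load amounts to a fresh shuffle per request. A minimal sketch, assuming a simple server-side shuffle (the function and project names are illustrative):

```python
import random

def randomized_ballot(projects):
    """Return the Top 20 in a fresh random order, as on each page
    load or refresh, so no project benefits from a fixed top slot."""
    return random.sample(projects, k=len(projects))

# Every call returns all 20 projects, in a new order each time.
ballot = randomized_ballot([f"Project {i}" for i in range(1, 21)])
```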
To prevent “robot voting,” the Foundation employed CAPTCHAs and email confirmations to stop voters from building systems that voted repeatedly. It also commissioned an outside vendor with deep experience in online elections to verify all the votes. As an additional incentive, the Foundation offered $2,500 to each of the first 10 people who picked the exact four winners determined by the public vote—money that could be given to any charity of the person’s choice.
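Email confirmation effectively keys each vote to a single address, so de-duplication can be sketched as keeping only the first confirmed ballot per address. This is a simplified illustration of the idea, not the vendor’s verification process, which the report does not detail.

```python
def count_valid_votes(votes):
    """Keep the first confirmed ballot per email address, discarding
    unconfirmed and repeat submissions.

    votes: iterable of (email, confirmed, ballot) tuples."""
    seen = set()
    valid = []
    for email, confirmed, ballot in votes:
        if confirmed and email not in seen:
            seen.add(email)
            valid.append(ballot)
    return valid
```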
Ongoing technical assistance and training.
During and following the MIYO grant process, the Case Foundation provided a series of webinars and other online resources for applicants and, later, the Top 100 and winners, including information about marketing, social media, and fundraising.
In June 2008, the Foundation made initial grants to the 20 winners, all of whom were required to submit progress and final reports. In partnership with Everyday Democracy, the Case Foundation also convened the entire group at a special event in Baltimore, Maryland, where winners could meet each other, celebrate their efforts, and learn more about citizen-centered, community-based work from leaders in the field.