Foundation, Mission, Values
Bridge Evidence Group (BE Group) is a nonprofit organization based in Caledonia, Michigan. BE Group was established in 2009 to create opportunities to evidence a bridge between the broader LGBT and Christian communities. BE Group sees Christ as the bridge between people and God, and between people and other people. They also believe that Christians can be the manifestation of this connection between both like-minded and diverse communities. The mission of BE Group is to work with others to identify areas where transformation is needed to bridge the divide between Christian groups and the LGBT community. The organization also has a set of core values, to which all board members, staff, and volunteers must adhere in order to successfully achieve the mission of the organization (Bridge Evidence Group Archives, 2009). These core values include the following statements:
- We are about creating environments to help people passionately engage in relationship with the LORD.
- We seek to walk with people regardless of where their journey takes them, existing with them as JESUS would.
- We are about creating space for openness to conviction through the SPIRIT.
- We are more concerned about living in an attitude of servanthood born out of CHRIST than living in an attitude of rightness born out of the law.
- We desire unity in CHRIST before all things, recognizing that all things are held together in CHRIST.
- We see any person's relationship with CHRIST as transcending Jew or Gentile, slave or free, male or female, and we desire to see this reality lived out (Bridge Evidence Group, Agreement).
Major Sources of Support
Bridge Evidence Group obtains its financial resources from a small pool of donors, including some religious groups and churches. From August through December 2009 they project an income of $2,000 through donations. Projections for their first full fiscal year in 2010 reveal they expect to take in approximately $6,300 in donations. This modest income shows that the real asset of Bridge Evidence Group is its board, staff, and volunteers. Greg Gough and Diane Hartig are both board members as well as part-time staff members, and they provide the primary administrative support for the organization. Mr. Gough is the Director, and Ms. Hartig is the Assistant Director and acting treasurer (Bridge Evidence Group, Form 1023, p. 10).
Programs and Services
The primary programming of Bridge Evidence Group is focused on mentoring. BE Group works to help these communities and the organizations and individuals within them discover and develop a foundation for mutual understanding and connection, while sorting through the difficult concepts and realities of life. Individuals usually find out about the group either on their own, through organizational outreach, or through referral from local churches, religious organizations, and mental health professionals.
The Group chooses to deliver its services in this manner because its nuanced mission is not liberal enough for some and not conservative enough for others. In addition, the widespread politicizing and extreme ideology concerning issues of sexual orientation and gender identity mean that large programming or large group work could often result in over-heated conversations and fights rather than the building of relationships. One-on-one interaction allows for intimate and honest conversations and relationships based on the building of trust, which allows an authentic search for whatever each individual chooses for themselves in regard to living out their sexual orientation and/or gender identity in conjunction with their faith (Greg Gough, Personal Interview, November 6, 2009).
Organizational Evaluative Assessment
Leadership, Organizational Culture, and Learning
Overall, Bridge Evidence Group is a prime candidate for evaluation. While the leaders are not professional nonprofit administrators and were not familiar with the terms Program Evaluation and Evaluation Capacity Building, they were immediately interested in the offer to design a program evaluation plan, as well as in methods for institutionalizing evaluation in their organization, which will ultimately generate organizational learning (Kim & Cory, 2006, p. 7). Because the organization is so small, the actions and opinions of those in leadership are the culture of the organization. The leadership is open to change because they realize they are not experts in nonprofit management. In addition, as a new organization it is far more likely to implement new ideas into its operating infrastructure, because all of its policies and procedures are currently new anyway.
Built into the values of BE Group is the idea that people are profoundly complex beings who may interpret life and reality differently than BE Group or any of its associated personnel. BE Group has taken the stand that it is concerned with the well-being of other people rather than with imposing ideology onto them. Therefore, BE Group seeks the total development of its own staff, because this will allow them to see to the holistic needs of their clients as well (Senge, 1990, p. 131-135). Openness to evaluation reveals interest in holistic development and personal mastery, and it demonstrates that organizational improvement is on the minds of the leadership and part of the culture. BE Group also seems interested in institutionalizing what current leaders have learned into the infrastructure of the organization as it grows once the evaluation process officially begins, thus ensuring organizational learning (Greg Gough, Personal Interview, November 6, 2009; Preskill & Boyle, 2008, p. 445, 453, 455).
The political environment surrounding any LGBT issue is typically highly charged, split between those enthusiastically for LGBT individuality and rights and those dead set against what they deem an immoral lifestyle choice. This could affect the organization's ability to evaluate its programs, because any survey of clients or potential clients might yield data that is split along ideological lines. Despite a mission and values that proactively avoid taking a stance on the morality or ethics of being an LGBT person, evaluation in the current political environment might cause mission drift due to data issues. One way to mitigate the effects of the political environment on evaluation is to ask respondents to identify their ideological stance in a survey question, if a survey is used as suggested below. This allows the other data gained from the survey to be associated with particular ideological stances, so that those analyzing the data can view the various groups' results both together and apart, producing a less skewed picture. Another mitigation might be to create a heterogeneous group in which all views are fairly represented in a balanced way, so that holistic feedback can be obtained and everyone feels they have an equal voice, which generally produces more honest feedback (Abma, 2006, p. 10-13).
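The stance-segmentation idea above can be sketched in a few lines of code. This is a minimal illustration only: the `stance` field, the 1-5 `rating` scale, and the sample records are all hypothetical stand-ins, not actual BE Group survey items.

```python
from collections import defaultdict

# Hypothetical survey records: each carries a self-reported ideological
# stance plus a 1-5 program rating (field names and values are invented
# for demonstration, not BE Group data).
responses = [
    {"stance": "affirming", "rating": 5},
    {"stance": "affirming", "rating": 4},
    {"stance": "traditional", "rating": 2},
    {"stance": "traditional", "rating": 3},
    {"stance": "moderate", "rating": 4},
]

def ratings_by_stance(records):
    """Average the program rating overall and within each stance group,
    so skew introduced by any one ideological bloc is visible."""
    groups = defaultdict(list)
    for r in records:
        groups[r["stance"]].append(r["rating"])
    by_group = {s: sum(v) / len(v) for s, v in groups.items()}
    overall = sum(r["rating"] for r in records) / len(records)
    return overall, by_group

overall, by_group = ratings_by_stance(responses)
```

Reporting the overall average alongside the per-stance averages makes any ideological skew in the combined number immediately visible, which is precisely the "together and apart" view suggested above.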
Furthermore, the political environment also affects BE Group's ability to raise funds. The organization is too liberal for many churches to sponsor because it will not condemn LGBT people as sinful, yet it is considered too conservative by many LGBT organizations because it will not tell an LGBT person who is trying to live celibately that they are doing the wrong thing. Therefore, the key fundraising demographic is a small group of moderate people and organizations, which is limiting for the organization and is easily reflected in its small budget.
As noted above, BE Group has very few monetary resources, and the two staff members in charge of administration are paid very little, working part time for BE Group and full time at other jobs and careers. Therefore, the group has no money to hire an outside facilitator to carry out evaluation, and internal staff have little time to carry it out until more money can be raised and more time devoted to management of the organization by either the Director or Assistant Director. One benefit of this lack of resources is that few organizational structures have been put in place, which means there are no structural barriers to overcome in order to do evaluation; there is a great opportunity to write evaluation into the structures as they are built over time. Nonetheless, despite the organization's great interest in evaluation noted above, time and resources will be a significant hurdle to successfully implementing ongoing program evaluation.
Bridge Evidence Group was founded and incorporated in mid-2009; therefore, no program evaluation has yet taken place. Through the design and implementation of this evaluation report, BE Group hopes to begin evaluation in January of 2010.
The Readiness for Organizational Learning and Evaluation Instrument (ROLE)
Using the ROLE tool revealed that BE Group scored 4.13 in the culture category, 3.58 in leadership, 3.88 in systems and structures, 2.9 in communication, 4.125 in teams, and 4.625 in evaluation. Given that BE Group scored below 3.5 in only one category, the tool's guidelines suggest it has a strong ability to learn from evaluation and to make efforts to become a learning organization. In addition, the low score in the communication category can be linked to the fact that communication infrastructure has not yet been built, since the organization was only recently incorporated. It can be assumed this situation will improve in the near future, especially once brought to light by this report. Please see Attachment One for more information about this data.
Program Evaluation - Mentorship Program
The mentorship program's intent is the creation of relationships between individuals, and between individuals and faith-based institutions, with the hope that bridges of understanding and mutual respect between the Christian and LGBT communities can then form upon these relationships. It is suggested that the first step of program design and theory should be to take into account all the populations involved, the goal to be achieved, the activities of the program, and the short- and long-term outcomes that result from those activities (Baker, 2000, p. 15-22). The following information breaks down these categories for the Bridge Evidence Group Mentorship Program.
- Program Participants - LGBT individuals, Christian individuals, Christian Organizations
- Activity - The primary activity of this program is the creation of two different types of bridges. The first bridge is the relationship between Christian organizations and BE Group, built in order to advertise services and begin the bridge-building between the Christian and LGBT communities that BE Group seeks to facilitate. The second bridge is the relationship between Christian people and organizations and LGBT individuals themselves. BE Group will build these bridges through one-on-one interaction, educational presentations, and social activities.
- Short Term Output - Relationships where individuality is accepted and nurtured in a culture of mutual respect.
- Long Term Output - Bridges of understanding and mutual respect between the Christian and LGBT communities, resulting in less cultural tension and a more Christ-like community.
As shown below, the logic model assumes that a particular type of Christian (or Christian organization) and a particular type of LGBT person will be willing, open, and honest enough to start building a relationship with someone who might hold a very different opinion than their own. The rest of the model flows logically from one segment to another. Another possible flaw in the model is the assumption that respect and openness to others will be maintained even when very different ideological views are being shared. However, given the benefit of the doubt that only people who are very interested in building these types of bridges will take part, it is safe to conclude that bridges will be built if the process is followed. These models explain the two bridges of the mentorship program, which involve building relationships between BE Group and Christian groups and between self-accepting LGBT people and Christian groups. It should be noted that a key part of the program theory not yet addressed is the power of participants' faiths to create an internal transformation through the building of relationships in this mentorship program. It is assumed that, with honest and authentic desire and effort, Jesus will facilitate the growth of understanding. Bridge Evidence Group believes Jesus' law of compassion is evident in his teachings and that their efforts through this program will bring about more of Jesus' kingdom on earth.
Evaluation Framework
Despite BE Group being a very evaluation-friendly organization that is just beginning to form its own policies and procedures, actually designing the evaluation process is particularly difficult because the mission of the organization and the goals of the evaluated program are not concrete and can be hard to demonstrate. The following framework suggests possible relationship signifiers that might be used to measure whether or not the program is successful and, if not, what types of secondary programming pieces should run alongside the mentorship program to further facilitate the growth of relationships and understanding.
Data Collection Tools and Methods
The first method for evaluating this program should be participant surveys. With a clearly defined population, the most logical way to obtain data is through a survey administered when a client joins the program and thereafter as stipulated in the guidelines, or as otherwise suggested by program staff. It would be best to use a mixture of open-ended questions and questions that can provide definitive answers and values about the program. Because of the sensitive political nature of the program, the specifics of question design should be worked out with program staff; those working in the field should be highly integrated into the exact survey design. Another benefit of using a survey in this case is that no sample proportion needs to be decided: the small size of the program and its intimate nature virtually guarantee a 100% return rate, which will provide excellent and accurate data and feedback on the effectiveness of the program (Newcomer & Triplett, p. 257-291).
Another evaluation method especially suitable to this circumstance is the focus group. Due to the sensitive and personal nature of this program, providing people with only what might seem like a sterile survey might be detrimental to effective feedback. Providing a more open-ended environment for discussion through a focus group will help improve the program, because the facilitator will receive comments and suggestions without soliciting them in any highly defined way. Focus groups are also cheap and easy to implement, requiring just one or two moderators who can remain unbiased throughout the session (Goldenkoff, p. 341-355).
Below is the suggested course of action for implementing the surveys and focus groups; as mentioned above, this timeline might change after more input from program staff and clients. It is important to note that evaluation is an ongoing process, and as long as the program continues, evaluation should continue to take place. This will allow changes in culture or in those involved to be continually taken into account and the program adjusted as needed according to the data collected. The following chart is a suggested protocol for the first twelve months of the program, beginning January 2010.
A final suggestion, not listed on the preliminary evaluation schedule above, is reaching beyond those directly involved with the program and gathering information from others through surveys and focus groups as well. This population might include those who belong to a Christian organization with individuals taking part in the program but who are not part of the program themselves. Widespread stakeholder involvement in the evaluation process will make the process more legitimate, not only in the eyes of those involved but also in the eyes of the general public. However, some experts note that different types of stakeholders should participate by different means. Therefore, key stakeholders, such as those involved in the actual programming, might take surveys for a year and have several focus groups, whereas those who have secondary or tertiary contact with the program might have one or two focus groups available to them to add their input. It is key to balance the input of all stakeholders according to their stake in the program (Mays, 2004, p. 13-28).
Given the small staff and budget of Bridge Evidence Group, the best way to analyze the data is through simple spreadsheet formulas, using the same software to generate charts and graphs that display the information. For the multiple-choice segment of the surveys, basic thoughts, feelings, and opinions about the programming and the topics surrounding it should be easy to tabulate by creating categories and displaying what percentage of people fall into each. However, the most revealing data in a program like this is probably the qualitative data, which can capture the essence of program success where program outcomes are not tangible (Mohr, 1999, p. 69-73). Qualitative data gained through open-ended survey questions and/or focus groups should be coded according to key words or ideas found in the responses. Another option is to categorize responses by the organization the respondents belong to or the ideological framework they subscribe to. Analyzing the information in these various ways will provide new insights and might also guide future program specialization geared towards specific groups, rather than the general programming that currently exists for all participants. Basic use of the data in this way allows for evaluation capacity building, because staff or volunteers can analyze the data by this method without advanced training or extensive past experience. The data can then be used to decide whether additional programming needs to be added or whether the existing program needs modification (Caudle, p. 419, 423).
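The keyword-coding approach described above can be sketched as follows. The category labels and keyword lists here are purely illustrative assumptions; in practice, program staff would derive codes from the actual responses.

```python
# Illustrative keyword coding of open-ended responses: each response is
# tagged with every category whose keywords appear in it, then category
# frequencies are reported as percentages. Codes and keywords below are
# invented for demonstration, not taken from BE Group materials.
CODES = {
    "relationship": ["friend", "relationship", "connect"],
    "faith": ["god", "jesus", "faith", "church"],
    "respect": ["respect", "listen", "understand"],
}

def code_responses(responses):
    """Count how many responses mention each category, as a percentage."""
    counts = {code: 0 for code in CODES}
    for text in responses:
        lowered = text.lower()
        for code, keywords in CODES.items():
            # A response can match several categories at once.
            if any(k in lowered for k in keywords):
                counts[code] += 1
    total = len(responses)
    return {code: 100 * n / total for code, n in counts.items()}

sample = [
    "My mentor helped me connect my faith and identity.",
    "I felt respected and listened to.",
    "The church members were willing to understand me.",
]
percentages = code_responses(sample)
```

The same tallies could just as easily be produced with spreadsheet formulas; the point is only the mechanic, namely that each response is tagged with every category it mentions and category frequencies are then reported as percentages.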
Conclusions and Recommendations
First, it is appropriate to congratulate Bridge Evidence Group for its progressive and forward-thinking attitude towards implementing program evaluation and Evaluation Capacity Building into its governance and management structures from the beginning of the organization's history. The conclusions of this report show that Bridge Evidence Group as a whole, as well as its main program piece, is highly evaluable.
It is recommended that BE Group continue to seek ways to further build evaluation into its programming, because this will create organizational learning: no matter who is administering the program, successful program practices and evaluation methods will be an institutional asset written into the paperwork and archives of BE Group. It is also suggested that the data obtained from the surveys and focus groups suggested in the evaluation plan above be stored long term in order to identify trends in the populations involved in the program. This data might eventually be used to show not only trends in participants' ideas about the program and about LGBT identity and spirituality, but also statistical relationships between various categories of people and their opinions. While BE Group may not have the resources or skills to do this kind of analysis now, a real commitment to program evaluation and improvement will begin building the resources and data needed for a more intensive evaluation method in the future.
It is further suggested that the two leaders at BE Group who have been highly involved in creating this evaluation proposal also disseminate what they have learned, and the value they have begun to see in evaluation and evaluation capacity building, to the rest of the board members. If all board members are highly involved, the organization will gain more momentum towards evaluation, evaluation capacity building, and organizational learning. If this is achieved, the organization will ultimately become dynamic and progressive in the ways it identifies problems and implements solutions in its programming. Greater board involvement also ensures greater accountability and fewer chances for fraud in regard to programming, fundraising, and budgeting.
During the creation of this report it was also revealed that two board members of BE Group are acting as part-time staff members. While it is understandably difficult to find qualified people to serve as staff at low wages, or to find board members to replace those currently acting as part-time staff given the unique nature of the group's mission, it is recommended that, as soon as the proper assets or people are found, all paid staff be removed from board positions. This will ensure that the Board has not only unbiased oversight over finances but also unbiased views on evaluation outcomes. Given that two board members are currently in charge of programming and finances as compensated staff members, their views could unfairly tilt the votes of the rest of the Board, who may feel these members have more expertise or hands-on experience than they do. This could be used in a way that benefits the two board/staff members rather than the organization and its programming. While this is by no means meant to suggest that fraud or inappropriate behavior is currently going on, it is a suggestion that will prevent either from happening in the future as the organization becomes larger and better financed.
Furthermore, it would be an excellent idea to start issuing annual reports that could be sent to donors and other stakeholders and shared with the general public by posting them on BE Group's website. The benefits of an annual report include the following:
- It is proactive, informing all stakeholders about activities and the use of assets even when the organization is not under scrutiny.
- It creates a clear use for data collected during the evaluation process beyond direct program improvement.
- It streamlines development and fundraising processes by having materials ready to provide to donors.
- It can create a professional, put-together image for the organization, which can also increase donations.
- It can serve as an overall evaluation piece for the whole organization and its programs by revealing whether, over the course of the past twelve months, the services and activities provided were in line with the mission and goals of the organization and whether those goals were achieved or worked toward.
Finally, the most important thing to remember with evaluation is to make it an ongoing part of the organization and to build its value and benefits into organizational culture. This will create an open environment and a learning organization that can continually improve itself to better fulfill its mission.
- Abma, T. (2006). The practice and politics of responsive evaluation. American Journal of Evaluation, 27(1), 31-43. Retrieved November 24, 2009.
- Baker, Q. E., Davis, D. A., Gallerani, R., Sanchez, V., & Viadro, C. (2000). An Evaluation Framework for Community Health Programs. Durham, NC: The Center for the Advancement of Community Based Public Health. Retrieved November 24, 2009.
- Caudle, S. (2004). Qualitative data analysis. In J. S. Wholey, H. P. Hatry, & K. E. Newcomer (Eds.), Handbook of Practical Program Evaluation (2nd ed.). San Francisco: Jossey-Bass.
- Goldenkoff, R. (2004). Using focus groups. In J. S. Wholey, H. P. Hatry, & K. E. Newcomer (Eds.), Handbook of Practical Program Evaluation (2nd ed.). San Francisco: Jossey-Bass.
- Kim, D., & Cory, D. (2006). It Begins Here. Singapore: Cobee Trading Company.
- Mays, C. (2004). Stakeholder Involvement Techniques. Nuclear Energy Agency. Retrieved November 15, 2009.
- Mohr, L. B. (1999). The qualitative method of impact analysis. American Journal of Evaluation, 20(1), 69-78. Retrieved December 2, 2009.
- Newcomer, K. E., & Triplett, T. (2004). Using surveys. In J. S. Wholey, H. P. Hatry, & K. E. Newcomer (Eds.), Handbook of Practical Program Evaluation (2nd ed.). San Francisco: Jossey-Bass.
- Preskill, H., & Boyle, S. (2008). A multidisciplinary model of evaluation capacity building. American Journal of Evaluation, 29(4), 443-459. Retrieved December 3, 2009.
- Senge, P. M. (2006). The Fifth Discipline (2nd ed.). New York: Doubleday.