January 29th, 2010
A couple of weeks ago, I posted a blog about essential skills found in highly skilled evaluators. Although having the skills I described is undeniably important to evaluation, one must also have certain personal qualities to be a dynamic evaluator.
- Evaluators must be good conversationalists. They are active listeners, engage in dialogue, give others time to ask questions before moving on to another topic, and don’t interrupt people. Being a good conversationalist allows evaluators to build rapport with their clients. They understand and appreciate the different perspectives, backgrounds, experiences, languages, and cultural identities that make each person unique. Evaluators also know how to ask relevant questions. When they go into a meeting, evaluators actively participate and understand how to ask the right questions so that they quickly get the information necessary to provide top-notch research and evaluation.
- Each organization has its own culture, way of thinking, and way of doing business, and highly skilled evaluators pick up that information quickly. They also adapt quickly to agency protocols. Doing so helps the evaluator suggest techniques, strategies, and methods most appropriate for an organization, which can aid in the success of any evaluation program.
- A skilled, dynamic evaluator sees the big picture and the minute details. In doing so, an evaluator can provide the best evaluation and research tools to enhance the effectiveness of a client’s program. “Big picture thinking” is an essential skill because the evaluator can then ensure that tasks are completed on time and that deliverables will be ready when the client needs them. Paying attention to minute details is also an essential skill because the evaluator can assure clients that deliverables have been completed accurately and that small, but important, elements of the deliverable have not been overlooked.
- Last, and this may seem self-evident, a skilled evaluator is a team player. Skilled, dynamic evaluators are very good at working with a diverse group of stakeholders to craft a successful evaluation plan. This is important at all stages of an evaluation: Skilled evaluators participate in a team to design, carry out, and report on the evaluation.
Hopefully these highlights have given you an idea of what makes a successful evaluator. If you have any questions or thoughts, please feel free to share them!
January 22nd, 2010
Last August, Heather and Kirsten offered a free workshop on survey design, which covered how to write useful survey items and common pitfalls to avoid. The feedback on the workshop was very positive, and all attendees seemed to benefit from the material. As a follow-up to the event, ACET also offered a complimentary review of each attendee’s existing survey.
We are pleased to announce that our second workshop has been scheduled! “The Results are In: Analyzing and Reporting Survey Data for Stakeholders” will cover common options for analyzing your survey data, guide you through the analysis process, and offer suggestions for sharing your findings with stakeholders in a way that is useful and meaningful.
“The Results are In: Analyzing and Reporting Survey Data for Stakeholders” will be held on:
Tuesday, February 23rd, 2010 from 1:00 to 3:00 pm at the
Neighborhood House Wellstone Center
179 Robie Street East, Saint Paul, MN 55107
Wellstone Center phone: (651) 789-2542
Seats are limited for this FREE event, so please RSVP to Heather Scholz via email at firstname.lastname@example.org or by phone at 952-922-1811 by Thursday, February 18th, 2010. For more information about ACET, please visit our website at www.acetinc.com or contact our office at 952-922-1811.
January 21st, 2010
According to a Philanthropy News Digest article, some Minnesota grantmakers appear to be less pessimistic about funding in 2010.
Based on a survey administered by the Minnesota Council on Foundations, overall giving by Minnesota funders is expected to decline by about 1% in 2010. And while 30% of funders anticipate distributing fewer funds this year, another 25% expect to give more in 2010 than they did in 2009.
To learn more about the survey results, click here: Minnesota Grantmakers Less Pessimistic About 2010, Report Finds
January 20th, 2010
One of my favorite online resources is Andy Goodman’s “free range thinking.” Andy is a nationally recognized communications specialist who is passionate about helping people better communicate with their audience(s). His monthly newsletter reflects that passion.
What I find most appealing about free range thinking is that each newsletter is short, makes one (and only one) major point, and offers information that can be put to use immediately. For example, in the October 2008 newsletter, Andy and guest contributor Eric Swartz discuss how to create a unique tagline. And in the April 2009 newsletter, Andy describes how to develop a story bank, a collection of stories about an agency that helps spread its message.
Anyone can subscribe to free range thinking and receive the newsletter for free.
I hope you find this resource helpful!
January 19th, 2010
Twice a month we receive an electronic newsletter from the National Criminal Justice Reference Service (NCJRS). Here are some of the highlights from this week’s edition:
- January is National Mentoring Month! The Office of Juvenile Justice and Delinquency Prevention (OJJDP) states, “Mentoring provides the perfect opportunity to consider what we could do to change a child’s life.” OJJDP awarded more than $177 million to support mentoring in fiscal year 2009 alone.
- NCJRS Offers Feature on Mentoring. In recognition of National Mentoring Month (January 2010), NCJRS invites you to view the Mentoring Special Feature. This online resource contains links to publications, funding resources, and Web sites that focus on the importance of mentoring and the involvement of adults in young people’s lives. (NCJRS)
- Report on 8th-Grade Youth Available. As reported in a recent issue of the Center for Substance Abuse Research’s CESAR FAX, which highlights results from the 2009 Monitoring the Future survey, the perceived risk of using ecstasy, inhalants, and LSD continues to decline among eighth-grade students. (OJJDP)
This issue also features grant (funding) opportunities within the justice system, online resources, conference and training announcements, and other reports and publications related to justice.
If you are interested in learning more about NCJRS or subscribing to their electronic newsletter, visit: http://www.ncjrs.gov/.
January 8th, 2010
Often, people new to evaluation, students, and some clients have questions about what an evaluator does or the skills an evaluator needs. Here are some of the skills I have found to be common among highly skilled evaluators:
- Evaluators need to be very good at data analysis, whether the data is quantitative or qualitative. For quantitative evaluators, it is essential that they be familiar with and use a statistical software application like SPSS or SAS. In addition, quantitative evaluators should be well versed in general linear models and statistical modeling. On the other hand, qualitative evaluators should be very skilled at performing thematic analysis of data, whether that analysis is of focus group discussions or open-ended survey questions. And all evaluators, whether they are more comfortable with quantitative or qualitative data, should be able to compute and explain descriptive statistics.
- Believe it or not, evaluators spend a lot of time considering how to present data to clients so that the figure, table, or chart is user-friendly. It does an evaluator no good to spend hours and hours analyzing survey data only to produce a presentation so complex that the client can’t understand what the survey results are! As a result, it is important that evaluators be fluent in a wide range of computer applications for creating user-friendly tables, graphs, and flow charts. It is also essential that evaluators know some desktop publishing in order to create documents with an easy-to-use layout. This may sound trivial, but a poorly formatted survey or document often discourages people from filling it out or even trying to understand the information.
- A skilled evaluator knows the audience they are writing for and communicates the results of an evaluation so that all of the recipients understand the results. And it doesn’t matter if the results are in a formal report, research report, or executive summary – skilled evaluators communicate those results clearly, succinctly, and in a way that fosters client understanding of the material.
- Skilled evaluators are also skilled researchers. You may be thinking, “Of course, you evaluate so you should be able to perform research!” But in reality, evaluation and research aren’t always the same thing. Skilled evaluators know how to formulate research questions, investigate theory, investigate the existing literature, and frame research methods to carry out the research.
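To make the data-analysis point above a little more concrete: the descriptive statistics every evaluator should be able to compute and explain can be produced in just a few lines of code. Here is a minimal Python sketch using the standard library’s statistics module; the survey ratings and variable names are made up purely for illustration, not drawn from any real evaluation.

```python
import statistics

# Hypothetical responses to a single survey item on a 1-5 satisfaction scale
# (illustrative data only)
ratings = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]

# Core descriptive statistics an evaluator might report to a client
mean = statistics.mean(ratings)      # average rating
median = statistics.median(ratings)  # middle value when ratings are sorted
mode = statistics.mode(ratings)      # most frequently chosen rating
stdev = statistics.stdev(ratings)    # sample standard deviation (spread)

print(f"n = {len(ratings)}")
print(f"Mean = {mean:.2f}, Median = {median}, Mode = {mode}, SD = {stdev:.2f}")
```

Reporting the mean together with the standard deviation (rather than the mean alone) is one small example of the kind of choice that helps a client understand not just the typical response, but how much responses varied.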
Stay tuned – in my next blog I’ll describe some of the personal qualities that can really make the difference between an evaluator and a dynamic evaluator.
January 7th, 2010
Need a coffee break?
The American Evaluation Association (AEA) has announced a webinar-based demonstration series which may be a valuable resource for education, nonprofit, and small business leaders. Each “Coffee Break Demonstration” session will be short (only 20 minutes). You need only a computer with audio (or a computer and telephone) to participate.
The demonstrations are free for AEA members. If you’re not a member of AEA, you can purchase a demonstration pass for $80, which covers one year’s worth of demonstrations and includes AEA membership along with the pass. Students can obtain a demonstration pass for $30, which also includes AEA membership.
Below is information for the next demonstration.
Tuesday, January 12, 2:00 PM – 2:20 PM EST: Using the Fantastic Five Checklist to Write Better Survey Questions and Improve Survey Reliability – Amy Germuth
Amy Germuth, founder and President of EvalWorks, LLC, an independent evaluation and survey research firm, will demonstrate how to use five key questions to improve the reliability, validity, and value of the responses you get from individual survey questions. Appropriate for the beginner, this presentation will help you get the most from your surveys and also presents content that is appropriate as a teaching tool or for use with stakeholders during collaborative instrument development.
To learn more about AEA’s demonstrations, including the schedule and a brief description of topics, please visit: http://www.eval.org/demos.asp