on the corner ‘Challenging the status quo’
By Robert Docter, Editor-In-Chief
The death of Steve Jobs saddens me beyond measure. I hope there is some precocious 20-something person exploring ideas in a family garage, somewhere. I hope that person is wondering if something could be made better, improved, developed, studied, created—working into the night with a brilliance that illuminates the world. Steve Jobs changed the way we think, plan and work. Everything he touched made the world better. What a man. What a leader. What a change-agent.
I saw a picture in a newspaper this morning of a barefooted Steve Jobs sitting on the floor of Bill Gates’ house. The two very young men seemed to be talking, planning, dreaming. Both of them looked to be in their early twenties. I think they must have been exploring how to create tools and processes that would accomplish things more effectively.
We live in a period of extremely rapid change. People over 60 are accustomed to rapid change, but nothing like what we’re seeing today. Just trying to keep up wears me out.
I hope that young person is wearing an Army T-shirt while working in the family garage. I hope he/she has a great love for people, for God, and for the Army. Maybe that person is exploring how to make the Army better. Maybe he or she is wondering if the Army’s current organizational/decision-making model will keep it effective through the 21st century. Maybe that person ponders some process questions concerning how the organization works—questions like:
• Does The Salvation Army want to have sufficient data and information for long-range planning?
• Does it have the knowledge and skills to evaluate what it is currently doing?
• Are we using our resources effectively?
• What do those who participate in our programs believe to be our strengths—our weaknesses?
• What parts of this Army must be preserved for us to remain what we are?
• Are we organized to engage in profound change of those parts that we determine to be vulnerable?
• What might we gain or lose if we do?
Somewhere, I envision a knowledgeable, committed, consecrated group of people exploring these questions. Maybe it’s happening now. If so, I don’t know where it is. I’ve engaged in many discussions like this—usually over coffee after a Sunday at the corps. I even participated in such a discussion in a formal manner about 20 years ago when some of those same questions were raised. Some significant change took place—but then it died. I never learned why.
I wonder why we don’t have our own “R&D” (research and development) organization to help us with some of these questions—the big questions, like those above, and the little questions, like why does this program succeed and that one doesn’t—and what is “success”?
I don’t think we’re doing too well even on the little questions. To my knowledge, there are only a very few employees and even fewer officers in the Army charged with primary responsibility either to engage in the process of evaluation or to assist others in engaging in that process.
Oh, I know we have boards and councils at the divisional and territorial levels that are required to review and approve programs, but for the life of me, I don’t know what criteria they use to make their judgments. Too often, I fear, they are limited to criteria pertaining to a “goal-based” approach to evaluation. That approach doesn’t give us much help in understanding how a program really works, its strengths and weaknesses, or the degree to which the program benefits the clients served.
“Mission” and “available resources” are important criteria. The organization’s “mission” is critical to the evaluation process. Our goals should grow from that mission. It’s a good mission statement. It’s short but, at the same time, very broad. Equally true is the importance of an examination of available resources. If, however, these are viewed alone, decision makers could be led astray. We don’t do decision makers any favor by imposing this responsibility on them in the absence of a sufficient database designed specifically to generate a comprehensive evaluation of the program being proposed or under review.
Many of our social service programs are required by funding sources to report on progress toward their goals and whether or not they meet specific criteria. Additionally, the Adult Rehabilitation Centers command engages in an excellent program review. Moreover, I am confident we are achieving facets of our mission in many other areas of endeavor. Often, however, I believe we do not know exactly what we are doing well or what we are doing that inhibits our effectiveness.
We use an evaluation tool for congregations prior to a corps review. I wonder if we are getting the maximum potential from this goal-based tool. Corps members, for instance, might explore many of the questions from a process-based or an outcome-based perspective that might lead toward modification of some of the objectives. I wonder who might be available to teach members of a corps how to engage in these activities. I wonder if such an exercise would be helpful. In the absence of data—how will we ever know?
We need to expand and improve our program evaluation. It’s not enough to evaluate the quality of a program on the basis of a hunch. I think, all too often, we use this technique and then make decisions about the program based on reputation or personal criteria. Somehow, it’s decided to support it, fund it, expand it, or close it. I don’t think this is in the best long-range interests of the Army or of the people we seek to serve.
So—where do you agree—disagree? Let me know.