Last year, I did a postgraduate diploma in monitoring and evaluation from the University of Stellenbosch. Since I had been working as an M&E practitioner for several years, I was quite excited about the course. My work environment really had the feeling of a social startup - a very dynamic and creative environment. While it was great to feel on the cutting edge of innovation in M&E, I was also familiar with the less exciting donor log frames, and wanted to get a sense of what more traditional M&E work looks like. The course came well recommended by some colleagues who had taken it a few years ago. It seems to be nearly the only offering of its kind in South Africa, so I jumped right in. The demand for the course is clearly high, with nearly 100 of us enrolled, ranging from new graduates to mid-career professionals, many sent by government or their employers to learn a much-needed skill. It's a shame the course was largely by distance, since networking with other students would have been one of its more valuable aspects.
While I did learn some monitoring and evaluation basics over the course of the year, I was tremendously disappointed by the quality of the course as a whole. The syllabus and readings were solid and thoughtfully selected, but that's pretty much the end of the good news. There were three contact sessions throughout the year. I went to the first two, but by the time the third one came around, I was so disillusioned that I gave it a miss. The lecturers were totally unresponsive, and we were graded more or less exclusively on how many references to various readings we used in each sub-section of our assignments. While I'm all for objective grading criteria, this smacks of the demise of the American primary education system! At the postgraduate level, such discouragement of original thought is ridiculous.
Instead of spending the year learning how to conduct a good evaluation, I spent most of the year trying to figure out what the lecturers wanted. After some very poor performances on assignments, I realised the quality of my argument was irrelevant, and it was better to submit something long, pedantic, repetitive, and obvious. This strategy served me increasingly well, and in the end I wrote a very, very long final evaluation report. It stuck religiously to the guidelines of the assignment and the best examples from previous years - and beyond that, I put very little thought into it. It had about ten times the organizational background and one tenth the analysis I would expect of any professional evaluation. The final product was so unhelpful that I rewrote it completely to be of use to the organization that had volunteered to be evaluated. It was picked as one of the 'best examples' from the course, and will be passed on to next year's students. This was one of the few times I was horrified to have done so well on an assignment - I would have much preferred to fail! Instead, I lost any remaining faith I had in the value of the course.
This course exists for people who will be working on M&E in our country - an absolutely crucial skill with service delivery crises at every turn, and with a newly created ministry of M&E. I feel like this new cadre of professionals is being sent the message that M&E is a tick-box exercise - that it's more important to cover your back by nitpicking details than to step back and think critically.
I'm sure the explanations for my experience of the course are many. The course grew at a rapid rate (several years ago, when the colleagues who recommended it were studying, there were about 30 students). The professors were overloaded, trying to balance teaching with consultancy work. And since a large contingent of the students were foreign and most of the teaching was at a distance, the course may have been seen more as a money-making tool for the university than a serious academic endeavor. Whatever the reasons, for the future of the profession in this country, I hope the university steps up its game in future years!