Outcome-Network.org

An International Database and eJournal for Outcome-Evaluation and Research

Paper

Italian and Australian evaluation experience: Building evidence through effective collaboration

Abstract

Background. Effective research partnerships are an essential precursor to best practice with vulnerable young people and their families. For social work researchers, however, the engagement of managers and practitioners in evaluation has always proved complex. In 2006, the Fondazione Emanuela Zancan of Padua, Italy, hosted an international seminar on "Engaging Social Workers in Research in Italy and Australia" to address some of these challenges. The gathering was an activity of the International Association for Outcome Based Evaluation and Research in Family and Children's Services (iaOBERfcs). The seminar facilitated a comparison of the challenges faced and the opportunities available for mounting collaborative research evaluations in Italy and Australia. Discussions at that seminar gave rise to this paper.

Purpose. This paper explores the Italian and Australian experience of evaluation as a core collaborative activity. What is needed for consumers, practitioners, managers and researchers to jointly embrace evaluation? This exploration identifies positive contemporary developments in each country, along with a number of ongoing challenges.

Key themes. In both Italy and Australia, the demands upon the social work profession to undertake evaluations frequently emanate from outside the profession. Usually there is limited time, money and personnel available for such ventures. There is, however, a need for greater sensitization of social workers to the usefulness of evaluation. This applies whether the evaluation is requested by the social work profession itself or by other professions, such as those within the health sector. A core element of the international social work mission has always been the 'celebration of difference' in all its domains. For the practitioner, it is vital that the service user or client is viewed as unique and special. Diversity of race, ethnicity, culture, class, gender, economic and geographical situation must always be acknowledged in any social work response. For many practitioners, structured instruments present the alarming prospect of 'collapsing difference'. For social workers, compromising uniqueness in any way is often strongly associated with denial of the integrity of the individual and her/his lived situation. Such compromise is vigorously challenged by many practitioners, and indeed by some academics. A recent challenge to the contemporary push for evidence-based practice, associated as it is with the proliferation of structured instrumentation, comes from the Australians Gray and McDonald (2006). Such critique must be addressed. Evaluation methods which:

  • give the consumer "voice",
  • have an empowerment orientation,
  • offer an ecological and developmental perspective,
  • incorporate attention to social structures,
  • have a change dimension, and
  • can be completed quickly and easily

would seem most engaging of practitioners and managers alike.

Social work education at both undergraduate and postgraduate levels has an important role to play in the development of such methods. Research in Italy suggests that, of 100 social work programs across 44 faculties, only half have any content related to evaluation. This is generally offered in the second or third year of the course (Bressan and Neve, 2006). Regrettably, such methodology courses place little focus on teaching evaluation research methods. Social work education in Australian universities has the advantage of an almost seventy-year history. Evaluation and research methods have long been essential elements of the Australian Association of Social Workers' accreditation requirements for courses of training in social work (www.aasw.org.au). Australian social work students are taught research methods from early in their undergraduate training. There is, however, a general concern that new Australian graduates remain underprepared for serious research activity and are, as a result, under-confident in this area.

Implications and recommendations. Researchers must address the charge that evaluation instruments have poor 'goodness of fit' with the realities of the field; research collaborations should privilege optimum synergy between the realities of everyday practice and evaluative rigor. Consumers, practitioners, managers and researchers need to work together in a focussed manner to produce qualitative and quantitative tools - tools that both "collapse" and "celebrate" difference. Workers and service users have not had equal voice with academics and researchers in the development of research proposals; they must be directly engaged in this and in the dissemination and implementation of findings. Forums for democratic dialogue are urgently needed; in such situations practitioners and managers from the field might be "listened to" with greater intent, and "heard" with greater impact than has been the case (Munford and Sanders, 2003). More extensive training in the use of instruments as part of undergraduate education and postgraduate professional development is another immediate imperative; it would seem that collaborative training ventures between the field and universities are likely to achieve best practice in this arena. Evaluations should be economical, accountable, empowering and oriented to achieving real change within realistic time frames. Cross-national comparisons that tap the rich reservoir of evaluation knowledge and experience from around the world are a priority; this evidence is often neither linked nor communicated effectively. The International Association for Outcome Based Evaluation and Research in Family and Children's Services has actively addressed these deficits over recent years (www.outcome-evaluation.org).

Key references

Bressan, F., & Neve, E. (in press). I corsi dell'area professionale nelle università italiane. Rassegna di servizio sociale.

Gray, M., & McDonald, C. (2006). Pursuing good practice. Journal of Social Work, 6(1), 7-20.

Munford, R., & Sanders, J. (2003). Making a difference in families: Research that creates change. Sydney, Australia: Allen & Unwin.

Contacts: Patricia McNamara, School of Social Work and Social Policy, La Trobe University, Bundoora, Victoria, 3068, Australia. E-mail: p.mcnamara@latrobe.edu.au, Phone +61 3 94795681, Fax +61 3 94793690.

