Partnerships between criminal justice agencies and researchers take many forms, but the objectives are, we believe, the same regardless of the form: using social science methods and findings to better inform the development of strategies, programs, organizational structures, and managerial practices in criminal justice agencies or criminal justice task forces. The Finn Institute has served as the research partner to several localities in upstate New York. We assist with the analysis of crime and other public safety problems, and with the formulation (and refinement) of data-driven strategies for crime reduction; we also evaluate the effectiveness of interventions, and assist with the development of systems for crime and intelligence analysis, performance measurement systems, and related programming.
The federal Strategic Approaches to Community Safety Initiative (SACSI) sought to replicate in ten cities the problem-solving process used successfully in the Boston Gun Project. As Erin Dalton describes it, the SACSI model consists of six steps, three groups of two steps each:
- Develop a strategic partnership
- Select a target problem
- Use research and information to assess the specific nature and dynamics of the targeted problem
- Describe the problem in a way that points to an effective intervention
- Design an intervention to have a substantial near-term effect on the targeted crime problem
- Implement, evaluate, and modify the intervention.
We believe that the role of the research partner is especially important with respect to the last two pairs of steps, steps through which one might loop several times in addressing any public safety problem. The research partner’s role in this process includes the following functions:
- framing strategic issues as analytical questions;
- formulating analytical approaches to understanding crime problems for strategic purposes;
- identifying information sources;
- interpreting analytical results and drawing implications for strategic initiatives;
- assessing proposed strategic interventions as they are formulated, including the specification of the logic model on which the interventions are based.
To this description of the strategic problem-solving process, we would add the researchers’ value in providing guidance in fashioning local versions of programmatic or strategic interventions that are evidence-based and that appear to fit the local problem. Agency and other partners are often unfamiliar with evidence-based models, and it is seldom obvious how, in any particular setting, such models can be established and implemented within the existing organizational and community infrastructure. Research partners can help by using their knowledge of the models and of the local organizational structures, histories, and practices to form, monitor, and sustain local adaptations of evidence-based practices.
As the national evaluation of the SACSI program points out, the SACSI research partners were envisioned as integral members of the SACSI working groups, and not “solely external evaluators, nor were they to be merely the number-crunchers or data collectors or methodological advisors typically found in problem-solving efforts.” They played a dual role, involved in both strategy development and strategy assessment, and while the potential conflict of interest arising from these dual responsibilities (evaluating the same strategies that they helped to formulate) was cause for some concern at the outset, the SACSI research partners “performed their dual role admirably.”
Be that as it may, assessments of interventions can vary in their breadth and depth, and we would reserve the term evaluation research for assessments that are broader and/or deeper. Every intervention should be assessed, and every assessment should conform as closely as possible to the best evaluation design that is practical. But not every intervention about which researcher-practitioner partnerships would want an assessment will warrant the monetary and non-monetary costs of a randomized experiment with a full range of outcome measures, even if such an assessment were feasible in principle. Some interventions, by virtue of their scale, cost, or potential for adverse consequences, warrant more intensive assessment, and for these, research partners should perform evaluation research. Such evaluations should be formative in nature, that is, providing direction for modifying an intervention to improve its effectiveness, and not only a summary judgment about whether the intervention “worked.”
Organizational development (OD) is a planned, systematic process. One feature of OD is the use of action research and/or field research to complete a looped diagnostic cycle. The phases of the process include:
- data gathering (quantitative or qualitative and focused on organizational functioning);
- analysis (to support a data-driven understanding of organizational issues);
- planning (including developing data-driven strategies to improve identified strengths and weaknesses);
- implementation (putting the developed strategies into effect); and
- feedback (gathering information about the results of implementation and the need for midcourse adjustments).
Law enforcement agencies typically lack some of the tools that are useful in assessing organizational structures and processes. Police agencies’ research and planning units tend to go light on the former, research, and heavy on the latter, planning. Research partnerships can overcome these limitations, working with agency partners to frame research questions and to design and execute research that serves agency needs.
Like most organizations, law enforcement agencies collect and store far more information than they use. Information is captured for case-by-case look-up as needed at a later date, as when an agency must defend itself in litigation, but its analytical potential often goes untapped, leaving it an under-exploited resource for strategic and operational applications. Furthermore, when information is collected but not used, it is liable to be less complete and reliable than it could be, because shortcomings come to light only as the information is put to use. Because researchers are attuned to the analytical value of information, and to the utility of more complete and reliable information, one important function that research partners can perform is to facilitate the development of organizational structures that better exploit the analytical potential of the information that is collected and stored, and that better apply the results of such analysis for strategic and operational purposes.
Such structures, mainly procedures for data collection, data entry, and quality control, can be developed collaboratively to better meet the needs of analysis that supports strategic problem-solving. Systematic omissions on reports, or long-standing recording practices that are incompatible with the demands of analysis (even if they met previous standards for reporting), must be identified, and procedures established for more complete and reliable data collection.
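The kind of quality-control check described here lends itself to simple automation. As an illustrative sketch only (the field names and records below are hypothetical, not drawn from any agency’s actual data systems), a routine might tally how often required fields on incident reports are left blank, so that systematic omissions can be identified and addressed:

```python
from collections import Counter

# Hypothetical required fields on an incident report; illustrative only.
REQUIRED_FIELDS = ["incident_id", "date", "location", "offense_type"]

def audit_omissions(records):
    """Count how often each required field is missing or blank,
    so systematic gaps in reporting can be spotted."""
    missing = Counter()
    for rec in records:
        for field in REQUIRED_FIELDS:
            value = rec.get(field)
            if value is None or str(value).strip() == "":
                missing[field] += 1
    return missing

# Hypothetical sample records with typical gaps.
records = [
    {"incident_id": 1, "date": "2014-03-01", "location": "", "offense_type": "burglary"},
    {"incident_id": 2, "date": "2014-03-02", "location": "Main St", "offense_type": None},
    {"incident_id": 3, "date": "2014-03-02", "location": "", "offense_type": "assault"},
]

print(audit_omissions(records))
```

Run against a full year of records, a tally like this would reveal, for instance, that location is systematically omitted, pointing to a reporting practice in need of revision rather than a one-off data-entry error.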
Many agencies have individual personnel or units that perform crime and/or intelligence analysis, but research suggests that the analysis they produce tends to serve narrowly defined needs and falls far short of the kinds of analysis that could usefully be performed, especially for strategic purposes. Research partnerships can address these limitations, working with agency partners to identify new analytical products, and to develop new analytical protocols, that would better meet agency needs. This involves work on both the demand and supply sides: inculcating among operational commanders a demand for strategic analysis, and nurturing in analytical units the capacity to meet that demand.
Compstat is a widely heralded organizational mechanism that connects information about crime, disorder, and enforcement with operational decision-making, and research partners can assist their agency partners in developing the organization’s capacity to use data and research by working with them to institute or enhance a Compstat-like mechanism. Research shows that, as popular as Compstat has become in police circles, it is one thing to have an administrative structure that resembles Compstat and quite another to have a structure that stimulates innovative, data-driven problem-solving by operational commanders. The feedback and guidance of a trusted research partner can be invaluable in shaping a Compstat mechanism that is more likely to yield the desired organizational benefits.
Reports and Publications
Robert E. Worden, Sarah J. McLean, and Heidi S. Bonner (2014). “Research Partners in Criminal Justice: Notes from Syracuse.” Criminal Justice Studies 27(3): 278–293. doi: 10.1080/1478601X.2014.947812.