PROCESS

The term 'audit' is borrowed from the financial sector to convey the notion of an 'independent inspection' of a country's justice system - and of its administration of courts and prisons.

The visualisation of data allows the viewer to see how the system conforms to - or falls short of - general principles.

It introduces an approach to justice reform which avoids 'ranking' or 'scoring' a country. Instead, it examines justice data against the aims and objectives of the system or institution, measuring the results against internal benchmarks and data averages rather than against any external indicator.

Each audit is conducted independently of the government or institution concerned, but with its consent and active participation.

Rather than 'pointing fingers', the purpose is to encourage greater investment in data, so as to identify the causes of system dysfunction and enable remedial action, or reform.

Data? What Data?

Data exist: the numbers of police officers, prisons, prisoners and court cases are known; the state of the infrastructure and the material resources available are observable; and governance arrangements concerning appointment, pay and oversight are set down. But these data tend to be siloed, or scattered.

We bring these data together by the dimensions described above and by location, and set them down as the baseline data.

…But who will read this?

And how accurate are the numbers produced by the institutions?
No data are 100% accurate.
In many countries, the data sought are unavailable or obviously wrong.

So…

In approaching the data…

…we explore data provenance and data integrity - how data are generated by each institution, captured and communicated up the reporting chain.

In collecting the data…

…we emphasise:

  • granularity - data are collected down to the lowest geographical tier/administrative unit;

  • disaggregation - all data are disaggregated by age, disability and gender; where informative, they are further disaggregated, e.g. by case type.
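As a minimal sketch of what disaggregation means in practice (the field names, districts and records here are hypothetical, not drawn from any audit), caseload data collected at the lowest administrative tier might be broken down like this:

```python
from collections import Counter

# Hypothetical case records at the lowest administrative tier.
cases = [
    {"district": "North", "gender": "F", "age_band": "18-25", "case_type": "civil"},
    {"district": "North", "gender": "M", "age_band": "26-40", "case_type": "criminal"},
    {"district": "South", "gender": "F", "age_band": "26-40", "case_type": "criminal"},
    {"district": "South", "gender": "M", "age_band": "18-25", "case_type": "civil"},
    {"district": "South", "gender": "F", "age_band": "18-25", "case_type": "criminal"},
]

# Disaggregate: count cases by district and gender...
by_district_gender = Counter((c["district"], c["gender"]) for c in cases)
# ...and, where informative, further by case type.
by_case_type = Counter((c["district"], c["case_type"]) for c in cases)

print(by_district_gender[("South", "F")])       # 2
print(by_case_type[("South", "criminal")])      # 2
```

The same pattern extends to any combination of dimensions (age band, disability status, location), so the granularity of the published tables is limited only by what each institution records.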

In interrogating the data…

…we ensure:

  • transparency in the way data are sourced, collected, cleaned and organised;

  • triangulation of institutional data with survey and observational data for analysis and ground-truthing;

  • iteration to check data in the cleaning process;

  • visualisation of the data in the designing phase; and

  • validation of the data (shown) by each institution.
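A hedged sketch of the triangulation step (the tolerance and figures are illustrative assumptions, not taken from any audit): an institutional count is compared with a survey-based or observational estimate, and a large divergence is flagged for ground-truthing.

```python
def triangulate(institutional: float, survey_estimate: float,
                tolerance: float = 0.2) -> bool:
    """Return True if the two sources agree within the given relative tolerance."""
    if survey_estimate == 0:
        return institutional == 0
    return abs(institutional - survey_estimate) / survey_estimate <= tolerance

# Illustrative figures: a reported prison population vs an observational estimate.
print(triangulate(5200, 5000))   # True  (4% apart)
print(triangulate(5200, 3500))   # False (about 49% apart -> flag for ground-truthing)
```

The choice of tolerance is a judgment call for the audit team; the point is that the comparison is explicit and repeatable rather than impressionistic.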

Where data are incomplete, outliers or plainly wrong (i.e. they do not add up), we flag them with Data Notes and in the Commentary, so that viewers know what weight to give the data shown.
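The flagging described above could work roughly as follows (a sketch under assumed field names; real Data Notes are written by the audit team, not generated automatically):

```python
def data_notes(record: dict, required: list) -> list:
    """Collect notes for missing fields and figures that do not add up."""
    notes = [f"missing: {field}" for field in required if record.get(field) is None]
    # Consistency check: sub-totals should sum to the reported total.
    parts = (record.get("male"), record.get("female"))
    total = record.get("total")
    if None not in parts and total is not None and sum(parts) != total:
        notes.append(f"does not add up: male + female = {sum(parts)}, total = {total}")
    return notes

# Hypothetical prison return whose sub-totals disagree with the headline figure.
record = {"prison": "Central", "male": 480, "female": 35, "total": 520}
print(data_notes(record, ["prison", "male", "female", "total"]))
# ['does not add up: male + female = 515, total = 520']
```

Each note travels with the figure it concerns, so a chart can display the caveat alongside the data rather than burying it in an appendix.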

In designing the look of the data…

…we focus on:

  • organisation by service provider/group surveyed; and

  • visualisation enabling the viewer to access complex data sets.

By validating the data…

…each institution takes responsibility for its own data and allows the data to be published, so that they are in the public domain.