August 25, 2017 Dialogue

Leveraging Grantee Data for Program Planning and Research

By Carlos A. Manjarrez


In 2016 a new office was established at the Legal Services Corporation (LSC) with the charge of improving data quality, reliability and access, not only for LSC, but for the Access to Justice community at large. As one of the largest grant makers for legal services in the country, LSC collects a tremendous amount of information about the characteristics of grant recipients and the services they provide. This "administrative data" is gathered at many different stages of the grant process. Details about organizational capacity and planned services are gathered during the application stage, additional information about programs is gathered through site visits and, at the end of the year, data on program funding, staffing, and client services are collected in a report known as the Grantee Activity Report.

The Grantee Activity Report, commonly referred to as the GAR, gathers data from recipients of LSC’s Basic Field Grants and is one of the most comprehensive data collection efforts undertaken by LSC. The GAR provides a calendar year summary of each legal service program’s revenue, expenditures, cases handled during the year, outreach services, private attorney involvement, client demographics and staffing details for individual offices. It’s also one of LSC’s oldest data collection efforts, dating back to 1979.

The primary purpose for the GAR data is to monitor the conduct of LSC grantees. But it also serves the broader purpose of informing the Administration, Congress, and the public about the nature of civil legal service delivery throughout the United States and U.S. territories. Each year the GAR data is summarized in a public report called By the Numbers, which is available on LSC's website. 

The Uses of Administrative Data

In recent years, considerable attention has been paid to the uses of administrative data for statistical reporting. Now that many administrative records are in digital format, researchers see several advantages to using this type of data over one-time data collection efforts, like interviews or surveys.

Researching Trends in Legal Service Delivery

Because the data is collected on an ongoing basis, it is particularly helpful for examining trends over time. Administrative data collection efforts are often very detailed and tend to provide accurate measures of program characteristics and outcomes. Detailed collections like the Grantee Activity Report allow researchers to look at the changing characteristics of legal service programs themselves. While there are only 133 Basic Field program grantees at the current time, most of them have been in operation for many years, which means that we have longitudinal data on the organizations, the types of services they provide, and the demographic makeup of the clients they serve.

Because the Basic Field Grants are national in scope, the data can be used to understand legal service delivery by LSC grantees across the country or to make more fine-grained, state-level comparisons. This information cannot be obtained in any other way. And given the fact that the GAR data collection effort is an established and ongoing process, providing broader access to the data means that researchers interested in examining this unique service needn’t incur the additional costs associated with new data collection efforts. This in turn means less burden for grantees who might otherwise be asked to complete new surveys.

Grantee Program Planning

Although some people see a reporting process like the GAR as a limited accountability exercise, there are a variety of ways that the grantees themselves could use the information for program planning purposes. One of the principal uses of the data for program managers is for benchmarking their work.

Staff in legal service programs are motivated people who are continually looking for ways to improve services to the public. Monitoring caseloads and measuring program output can serve as a motivator, allowing program managers to assess progress relative to their past performance or by comparing their work to peer organizations. Improving on a prior year’s accomplishments is a way to monitor and continually update specific program goals. In that way, trend measurement can play a performance-driving role by allowing past performance to function as a de facto goal.

Peer comparisons are an important gauge of performance, but they can also present some challenges. It is important to identify peers based on a range of criteria rather than a single measure such as budget or staff size. Failure to identify appropriate peers can produce comparisons that systematically place an organization in a weaker or stronger position, skewing the results. Therefore, great care must be exercised when using comparative measurement to monitor performance.
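One way to select peers on several criteria at once is to scale each measure to a common range and then match on the combined profile. The sketch below illustrates the idea in Python; the program names, figures, and criteria (budget, attorneys, counties served) are invented for illustration and are not drawn from GAR data.

```python
from math import sqrt

# Hypothetical program profiles: (budget in $M, attorneys, counties served).
# Matching on several criteria avoids the skew that can come from matching
# on a single measure such as budget alone.
programs = {
    "A": (5.0, 40, 10),
    "B": (5.2, 12, 2),    # similar budget, very different staffing and reach
    "C": (4.8, 38, 9),
    "D": (12.0, 90, 30),
}

def normalized(values):
    """Scale a list of values to the 0..1 range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# Rescale each criterion so that no single measure dominates the distance.
cols = list(zip(*programs.values()))
scaled = list(zip(*[normalized(c) for c in cols]))
profiles = dict(zip(programs, scaled))

def nearest_peer(target):
    """Return the program closest to `target` across all scaled criteria."""
    t = profiles[target]
    return min(
        (k for k in profiles if k != target),
        key=lambda k: sqrt(sum((a - b) ** 2 for a, b in zip(t, profiles[k]))),
    )
```

With these invented figures, program A's nearest peer is C, not B, even though A and B have nearly identical budgets, which is exactly the distortion that single-measure matching invites.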

Of course, this type of benchmarking is not possible if the data is not accessible to the people who are doing the work on the ground. Below I discuss efforts that LSC is making to improve the quality, relevance and accessibility of the data gathered from its grantees.

Cleaning it Up…

Creating a Comprehensive Database

One of the first things we did to improve data quality was to build large, multi-year research data files of all GAR data dating back over a decade. This seems like a relatively simple task, but in truth it was a time-consuming exercise because there had never been a comprehensive research file in the organization before. While the data had been stored in the grants management system, it was not in a format that could be easily processed using industry standard statistical software. 

We exported data from many different tables to create a comprehensive file for summary analysis. This allowed us to identify trends, outliers and correlates in the data. The file made it much easier to identify data that fell outside the typical reporting bounds for most programs. In short, it gave us a much better handle on the underlying distribution of the data. Of course, a reporting outlier is not dispositive evidence of poor data quality. But flagging outliers is an important first step toward understanding the characteristics of the data.
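A simple form of this screening is to flag any program whose reported value falls far from the distribution across all programs. The sketch below, in Python with only the standard library, uses a two-standard-deviation rule on invented figures; the grantee IDs, values, and threshold are illustrative, not LSC's actual screening rules.

```python
from statistics import mean, stdev

# Hypothetical reported values of a single GAR measure (e.g. cases closed),
# keyed by grantee ID. The real research file holds many measures per year.
reports = {
    "G01": 1200, "G02": 1350, "G03": 1100, "G04": 1280,
    "G05": 1190, "G06": 6400,  # G06 sits far outside the typical range
}

values = list(reports.values())
mu, sigma = mean(values), stdev(values)

# Flag any program whose report falls outside mean +/- 2 standard deviations.
# An outlier is not proof of bad data, only a prompt for closer review.
outliers = {g: v for g, v in reports.items() if abs(v - mu) > 2 * sigma}
```

Run against these figures, only G06 is flagged; the remaining programs all fall well within the bounds.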

Clarifying Grantee Reporting Instructions

In addition to getting better acquainted with the data, we focused on another important, and often overlooked, element of data quality improvement: the reporting instructions. In the case of the GAR, the reporting instructions had never been produced in a single handbook that was readily accessible to grantees. Instead, the instructions for reporting were either on different pages of LSC's website or embedded in the reporting forms themselves. For several items in the GAR report, no instructions or definitions were provided at all, which meant that respondents had to rely on their own interpretation of certain terms and questions. To reduce ambiguity for our data providers and improve consistency in the GAR reports, we built a comprehensive reporting handbook that provides clear instructions for every report form in the GAR.

Verifying and Validating Data with Grantees

Understanding the data and improving the reporting instructions are important steps to take before data collection begins anew. But there were also steps that needed to be taken after grantees submitted their data to LSC. After receiving data from our grantees, we validated the information by measuring how close their current responses were to their organization's three-year average. When grantees reported significantly higher or lower revenue, expenditures or case services, for example, we wrote back to them within a week of their submission to ask them to verify the information provided for each item. In total, the verification process identified over 2,000 outliers that required validation across all programs. The short turnaround time for validation proved to be a very important part of the process. The further people get from their report submission date, the more challenging it can be to retrace their steps and validate the information.
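The core of this check is straightforward: compare the new submission to the organization's own three-year average and flag large deviations for follow-up. The Python sketch below uses invented revenue figures and an illustrative 25% threshold; the article does not specify the threshold LSC actually applies.

```python
# Hypothetical figures: a grantee's prior three years of reported revenue
# and its current submission. The numbers and threshold are illustrative.
history = [2_400_000, 2_500_000, 2_600_000]   # prior three calendar years
current = 3_900_000                            # newly submitted figure

baseline = sum(history) / len(history)         # three-year average
deviation = abs(current - baseline) / baseline # relative change from baseline

THRESHOLD = 0.25   # flag changes greater than 25% from the 3-year average
needs_verification = deviation > THRESHOLD
```

Here the baseline is $2.5M and the new figure is 56% above it, so the item would be added to the grantee's verification list, along with a note explaining why it was selected.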

We were pleased to find that many grantees welcomed the opportunity to review and either validate or revise their reports, particularly when they were given very specific items to review. Asking a data provider to review all the information they provided in a very large report can be a daunting task. But since we provided them with a detailed, item-by-item list, and justified our selection of each item based on their organization’s work reporting history, it was much easier for program staff to dive back into their records and address the issue.

...and Getting it Out

For the information to be useful it is also important to make the data accessible. While fact books and summary reports are important, current standards for open data in publicly funded organizations call for much more. Summary reports do not allow users to access and manipulate the actual data. In that sense, they narrow a user’s ability to investigate their own questions and they limit discovery.

To make the Grantee Activity Report data more accessible to grantees themselves, researchers, and the public, LSC has taken two important steps. First, we developed an entirely new access point for grantees to view their data in a user-friendly graphical format. We used free data visualization software to develop data displays that are much more accessible than a traditional table of numbers. We grouped the analysis by subject area so that users who are particularly interested in examining revenue trends, for example, can jump to the revenue page and dig into the data. We also made the data filterable so that the exact same trends can be viewed at the national level, the state level, or even for an individual organization. That way grantees can compare their data to other programs or the national average.


The user interface of the new LSC Grantee Data page makes it easy to access detailed information on a wide variety of subjects with just a few clicks of the mouse. For users who want to roll up their sleeves and access raw data files to perform their own analysis, we have posted the trend file of GAR data on LSC's DATA.GOV instance. This large trend file allows users to manipulate the data using the statistical software of their choice.

Making data accessible to more users, from busy program managers to academic researchers, is not only an important way to promote transparency and openness; it transforms the information into something that is useful for the data providers themselves and improves our understanding of the character of legal services in the United States. We invite you to dive in and explore this data yourself.

Carlos A. Manjarrez

Director of the Office of Data Governance and Analysis, Legal Services Corporation

Carlos A. Manjarrez, the Director of the Office of Data Governance and Analysis at the Legal Services Corporation, is a research manager with more than 20 years of policy research experience.