Too Much Information: After years of collecting every possible scrap of data, Year Up finds that some measures don’t add up

Year Up has a problem that many nonprofits can’t begin to imagine: It collects too much data about its program. “Predictive analytics really start to stink it up when you put too much in,” says Garrett Yursza Warfield, the group’s director of evaluation.

What Mr. Warfield describes as the “everything and the kitchen sink” problem started soon after Year Up began gathering data. The group, which fights poverty by helping low-income young adults land entry-level professional jobs, first got serious about measuring its work nearly a decade ago. Though challenged at first to round up even basic information, the group over time began tracking virtually everything it could: the percentage of young people who finish the program, their satisfaction, their paths after graduation through college or work, and much more.

Now the nonprofit is diving deeper into its data to figure out which measures can predict whether a young person is likely to succeed in the program. Halfway through this review, it has already identified and eliminated measures that matter little. A small example: Surveys of participants early in the program asked them to rate their proficiency at various office skills. Those self-evaluations, Mr. Warfield’s team concluded, were meaningless: How can novice professionals accurately judge their Excel spreadsheet skills until they’re out in the working world?

The review also has forced the charity to rethink at least one long-held assumption. Program participants go through intensive training to learn technical and professional skills, then spend six months in an internship with one of the nonprofit’s corporate partners. In exchange for this support, participants sign a contract that outlines a code of conduct that includes promptness, a professional demeanor, and the like. Year Up has a point system to track how well students follow the code. Young people start out with 200 contract points. They can earn more by exceeding expectations or lose points by violating the contract. Get down to zero and you’re out of the program.
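The mechanics of the point system are simple enough to sketch in a few lines of code. The sketch below is purely illustrative: the 200-point start and zero-point cutoff come from the article, but the class name, point values, and method signatures are assumptions, not Year Up’s actual system.

```python
# Hypothetical sketch of the contract-point tracking described above:
# students start at 200 points, gain or lose points for conduct, and
# are out of the program at zero. Names and values are illustrative.

STARTING_POINTS = 200  # from the article


class ContractTracker:
    def __init__(self):
        self.points = STARTING_POINTS

    def record(self, delta, reason):
        """Apply a point gain (exceeding expectations) or loss (a violation)."""
        self.points += delta
        print(f"{reason}: {delta:+d} -> {self.points} points")

    @property
    def dismissed(self):
        # "Get down to zero and you're out of the program."
        return self.points <= 0


tracker = ContractTracker()
tracker.record(-25, "Late arrival")              # hypothetical penalty
tracker.record(+10, "Exceeded expectations")     # hypothetical bonus
if tracker.dismissed:
    print("Participant dismissed from program")
```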

The organization long believed that the point system was a critical indicator of who would succeed. But analysis of four years of data showed that contract points, while important, were not as strong a gauge as employees had assumed. A stronger signal: the participant’s satisfaction two months into the internship.
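The kind of head-to-head comparison that surfaced this finding can also be sketched. The code below is not Year Up’s analysis: the columns, the synthetic data, and the choice of logistic regression scored by cross-validated AUC are all assumptions used to illustrate how one measure can be tested against another as a predictor of success.

```python
# Illustrative sketch: compare two candidate predictors of program success
# (1 = succeeded, 0 = did not) by single-feature predictive strength.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict


def predictor_strength(feature, y):
    """Cross-validated AUC of one feature predicting the success label."""
    X = feature.reshape(-1, 1)
    probs = cross_val_predict(
        LogisticRegression(), X, y, cv=5, method="predict_proba"
    )[:, 1]
    return roc_auc_score(y, probs)


# Fake data standing in for four years of program records.
rng = np.random.default_rng(0)
n = 500
satisfaction = rng.uniform(1, 5, n)   # satisfaction two months into internship
points = rng.normal(180, 40, n)       # contract-point balance

# Synthetic outcome driven by satisfaction, so the comparison mirrors
# the article's finding in this made-up dataset.
y = (rng.uniform(0, 1, n) < 1 / (1 + np.exp(-(satisfaction - 3)))).astype(int)

print("contract points AUC:", predictor_strength(points, y))
print("satisfaction AUC:   ", predictor_strength(satisfaction, y))
```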

The finding galvanized Year Up into action. It is now even more aggressive about stepping in when participants are unhappy in internships and helping devise a plan to improve the situation, says Mr. Warfield. The points analysis also reinforced the need for a review of the internship-matching process, which was already under way.

Mr. Warfield says Year Up has had to adjust after the data analysis proved the shortcomings of a number once thought critical. “People put a lot of stock in the contract points,” he says. Still, he’s excited to see what else his team’s data review uncovers — and what measures get the ax.

“I wish we could skip the step of ‘Collect everything, collect everything’ and get down to ‘Let’s collect this limited set of variables and add on only as needed,’” he says. “But that takes an awful lot of planning.”
