Case Example

Data Routines for Improving Digital Learning Programs

Rafi Santo - New York University

Data, and the routines built around them, can play a key role in improving youth digital learning programs, supporting students, and building the capacity of educators. In this case example, we highlight what this looks like during one digital learning organization’s staff meetings.



In this case example, we share an account from one of our research team members observing a bi-weekly data sharing meeting at The Knowledge House (TKH), a nonprofit technology education organization based in the South Bronx focused on economic opportunity and workforce development for low-income youth. Through TKH’s commitment not only to collecting extensive program data, but also to routines that bring staff together to examine and discuss the implications of emerging data, we show how technology-focused youth development organizations might use data to better serve students and improve programs.

Opening and Setting

I exit the 6 train at Prospect Place in the Mott Haven neighborhood of the South Bronx and head towards Lafayette Avenue, where the Bronx Digital Sunshine Incubator is housed. The incubator was co-founded by Joe, one of the co-founders of The Knowledge House, and TKH’s offices are based in the back of the incubator. The full staff, now just under twenty people, gathers regularly in the back of the incubator for a meeting where they review program progress and data, troubleshoot issues as they come up, share updates about organizational changes and activities, and develop strategic plans.

When I show up, Jerelyn Rodriguez and Joe Carrano, the co-founders, are having a meeting in one of the glass-walled conference rooms right next to the front desk of the incubator. I wave, and Jerelyn pops out to say that I can just head to the back where their desks are and that she’ll join in once she’s done with her meeting. I head back, where I see Stephany Garcia, another TKH staffer who has sat in on some of the interviews we’ve been doing as part of our research. She lets me know that they’re just starting to set up the back area. Not wanting to just sit around, I begin to help one of the staffers who’s started to wheel out tables and arrange them. His name is Elvis; he’s an instructor at TKH and, I learn later, Stephany’s brother. I ask him about his work and learn that he runs programs at three high school sites that TKH works in, including one I’m familiar with, Fannie Lou Hamer High School, where he conducts professional development for teachers around one of TKH’s course offerings.

We continue to set up with some of the other staffers joining in, and by the time 2 pm rolls around about fifteen of us have settled into a U-shape configuration, with a computer projecting onto a screen at the front of the U. The group is a mix of instructors, program coordinators, program managers, various specialist roles like HR, outreach coordinators, talent managers, and, of course, the co-founders, Joe and Jerelyn. I’m the only white person there – everyone is either black or Latinx – and at 35 I’m pretty sure that I might be the oldest person there as well. The group feels young, energetic, earnest. A lot of them feel like they could’ve been my former students from when I was running youth media activism programs in Brooklyn. I learn that a good number of them are former TKH program participants themselves.

As we settle down, I’m chatting with the staffers around me. One, Nohely, I met when the TKH City Saturday program visited my colleague and me at our offices at NYU’s Media and Games Network. She has her laptop in front of her, with spreadsheets upon spreadsheets open to what looks like some form of student data. I ask her about it, and she says that she’s basically just cleaning up all the student data to get it ready, as they’ll be moving from a purely spreadsheet-based system to Salesforce, and they need to make sure there aren’t any errors so that the migration goes smoothly.
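The kind of pre-migration cleanup Nohely describes can be sketched as a simple validation pass over exported spreadsheet rows, flagging records that would cause errors on import. This is an illustrative assumption, not TKH’s actual schema: the field names (`student_id`, `email`, etc.) and rules here are hypothetical.

```python
# Hypothetical sketch of pre-migration data cleanup: validating spreadsheet
# rows (as dictionaries) before importing student records into a CRM.
# Field names and validation rules are illustrative assumptions.

REQUIRED_FIELDS = ["student_id", "name", "email", "program"]

def validate_row(row):
    """Return a list of problems found in one student record."""
    problems = []
    for field in REQUIRED_FIELDS:
        if not row.get(field, "").strip():
            problems.append(f"missing {field}")
    email = row.get("email", "")
    if email and "@" not in email:
        problems.append(f"malformed email: {email}")
    return problems

def clean_export(rows):
    """Split rows into clean records and records needing manual review."""
    clean, needs_review = [], []
    for row in rows:
        problems = validate_row(row)
        (needs_review if problems else clean).append((row, problems))
    return clean, needs_review
```

Splitting out a "needs manual review" pile, rather than silently dropping bad rows, matches the goal Nohely states: catching errors before the move so the migration goes smoothly.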

Start of Program Data Share-Out

The meeting opens with an icebreaker, where folks go around sharing ‘bright spots,’ or little highlights and successes from the week. People share about successes in helping struggling students, looking forward to program ‘graduations’ since it’s the end of semester, and upcoming ‘demo days’ where students will be sharing out final website projects they’ve been working on over the course of the semester. Some of the higher-ups share bright spots related to organizational development, with Joe sharing that he’d been working on a new structure for team meetings that he thinks will help streamline things as the organization grows, and Jerelyn sharing that she just heard from their auditor that they’ve been approved to now receive larger grants.

At about a half hour into the meeting, the group moves into what seems like the core agenda: reporting out and discussing data coming out of the different active programs. On the projector, a ‘rolling’ Google Doc for the regular weekly staff meeting shows data about programs that, based on what I can tell, program coordinators and managers have filled in ahead of the meeting. It looks like most of the staff are also in the document as it’s being projected, indicated by the many multi-colored ‘avatars’ present at the top of the Google Doc.

Madeline, a program manager, begins by sharing out about a number of programs. There’s a common set of metrics shared about each – the number enrolled, the average attendance rate, and the ‘top’ two and ‘bottom’ two students in the course, based on TKH’s ‘leaderboard’ scores, the organization’s internal ‘gamified’ system of assigning points based on attendance, assignment submissions, quizzes, participation in extra events, and a number of other factors. As she mentions the name of one of the bottom students in one of the courses, Joe notes that “it’s because he doesn’t do homework, that’s why…” She goes on with one of the program share-outs, saying that with week 9 of the program wrapping up, some of the bottom students will “be contacted to talk about next steps, including probation, case management, or dismissal from the class.” She moves on to the next program without comment.
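The leaderboard scoring described above can be sketched as a weighted point system. The categories (attendance, assignment submissions, quizzes, extra events) come from the account; the point weights and data shapes below are assumptions for illustration only.

```python
# Illustrative sketch of a 'leaderboard' score: points accrue from tracked
# activity categories. Weights here are assumptions, not TKH's actual values.

WEIGHTS = {
    "attendance": 10,      # points per session attended
    "assignments": 25,     # points per assignment submitted
    "quizzes": 15,         # points per quiz completed
    "extra_events": 20,    # points per extra event attended
}

def leaderboard_score(activity):
    """Sum weighted points across all tracked activity categories."""
    return sum(WEIGHTS[cat] * count for cat, count in activity.items())

def top_and_bottom(scores, n=2):
    """Return the n highest- and n lowest-scoring students."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return ranked[:n], ranked[-n:]
```

A system like this makes the ‘top’ and ‘bottom’ students that Madeline reports a direct query over the score table, which is what lets the share-out surface cases like the student flagged for not doing homework.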

Data as a Prompt for Troubleshooting: “What are we doing about Middlespring?”

The report out from Madeline seems to proceed without incident, until she reports out the numbers for a program co-facilitated with Middlespring Community College, a local partner. It’s a context where they have both weekday and weekend programs in coding (React.js, a JavaScript library) and design (User Experience/User Interface). She reports that the weekday attendance rate is 66.82%, and the weekend is 58.3%. Joe asks, “Is that consistent with the experience so far?” Madeline responds, “Yeah, it’s pretty consistent…I think it’s a 2% decrease from the week before, but it’s pretty consistent…” She goes on to report on some other program numbers, ones that are notably higher in their attendance rates, and as she finishes she sums things up by noting that overall attendance for the high school programs she’s overseeing is over 80%, exceeding their target goal of 80%.
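The attendance metric the staff review can be sketched as attended sessions over expected sessions, checked against a target. The 80% target comes from the meeting; the data structure and program names below are illustrative assumptions.

```python
# Minimal sketch of per-program attendance rates flagged against a target.
# The 80% target is from the meeting; everything else is an assumption.

TARGET = 0.80

def attendance_rate(attended, expected):
    """Fraction of expected student-sessions actually attended."""
    return attended / expected if expected else 0.0

def flag_programs(programs):
    """Return names of programs whose attendance falls below target.

    `programs` maps program name -> (attended, expected) session counts.
    """
    return [name for name, (attended, expected) in programs.items()
            if attendance_rate(attended, expected) < TARGET]
```

A flagging step like this is essentially what happens socially in the meeting: the Middlespring numbers fall below the bar the other programs clear, and that contrast is what triggers Joe’s follow-up question.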

Joe jumps in, asking Dre (Andre), who also works on that program, “So, Dre, what are we doing about Middlespring? The attendance?”

Data as Prompt for Context Sharing and Ideation on Partner Coordination

Jerelyn pretty quickly jumps in, sharing that she’d actually connected with Tom, their contact at Middlespring Community College, after this question came up last week. She said there were apparently some mixed messages going out to students related to what to prioritize within their participation. “We have a challenge with both React and with UX/UI,” Jerelyn starts, “because he’s telling me that students are confused about what their number one priority is. Is it going to bootcamp, because they’re in WDC (a workforce development program), or, doing their coursework in either React or UX/UI? I think we’re probably not all on the same page about the messaging. From my understanding, Tom is telling them that they should be focusing on their coding bootcamp applications, so they are trying to get excused from coursework…So I’m not really sure what you all think about that.” The data share-out highlighted an issue that was actionable. Now, it was time to do some coordinated problem-solving with a key partner around how priorities were communicated to students.

Data ‘Trouble Spots’ as Prompts for New Data Strategies

Jerelyn goes on to ask whether there’s a way to know what the overlap is between the UX/UI and React students. Stephany shares that she connected with Tom at the college earlier today, and that he had expressed the same concern. She had advised him to make a separate spreadsheet with the overlap between the WDC students and the others, so that they could be especially mindful of those overlapping students since they might be getting mixed messages. She also advised him to ultimately prioritize bootcamp placement, since that’s the college’s number one priority. Stephany shared that they also need to come up with a strategy to deal with the tension of students completing their coursework while simultaneously putting together strong bootcamp applications. This meant that students should reach out to TKH staff for help, schedule office hours, and be able to make up work. In this case, the ‘trouble spot’ in the data highlighted a need to potentially create a new data stream that focused on particular students.
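The overlap spreadsheet Stephany suggests is, at its core, a set intersection between the WDC roster and each course roster. A minimal sketch, assuming hypothetical roster data:

```python
# Sketch of the overlap check: intersect the WDC roster with each course
# roster to find students at risk of getting mixed messages about priorities.
# Roster contents are hypothetical.

def overlap(wdc_roster, course_rosters):
    """Map each course name to the set of its students also enrolled in WDC."""
    wdc = set(wdc_roster)
    return {course: wdc & set(students)
            for course, students in course_rosters.items()}
```

The resulting per-course sets are exactly the “new data stream” the passage describes: a list of particular students to watch, rather than a new aggregate metric.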

Data as Prompt for Considering New Pedagogical Strategies

After Stephany shares about this interaction with Tom, Sam, a Talent Manager who focuses on workforce placements for TKH students, shares that she’d also been attuned to this issue with the bootcamp applications. She says that once she realized it, she went ahead and actually tried to fill out the bootcamp application herself to see how long it would take her and to understand “how grueling the process really is.” “Frankly, I don’t know if I’m being a little bit unforgiving, but I think we could just hold a 3-4 hour workshop and make sure that every single student has a strong draft,” Sam said. “Leaving them to their own devices isn’t getting the kind of results that we need, and it’s leaving room for them to make excuses.” Jerelyn then shares that Tom agrees that there is a need to integrate the application into the program structure. In this case, the data sharing prompted a discussion about program structure and pedagogy and whether a new program element needed to be added to help reach their goals.

Data as a Prompt for Reviewing Student Progress, Highlighting Program Design Tensions, and Discussing Shifts in Program Expectations

Following this, Jerelyn raises a new issue, pointing out that one of the low performing students, José, is actually one of TKH’s most successful alumni. In fact, he’s already been placed in a job. She wonders whether José’s case highlights some tensions in how they communicate what successful participation looks like. She reflects that José probably is not performing well in the TKH course or going to the career events that count towards the leaderboard score because he already has a job. Joe even notes that José is doing the course assignments; he just isn’t turning them in. He also helps out during the classes. Stephany shares that students like José, who have “already met the definition of TKH success” and therefore do not need this course, should possibly consider dropping it. This leads into a larger conversation about whether TKH should institute some form of course auditing for cases like José’s. The group debates this for a couple of minutes; it’s clear that this is a challenging area. Jerelyn notes that it’s come up before. In order to solve it, the first thing the group needs to do is gain more clarity on the types of exceptions or new policies that might be put into place. Joe suggests that students who are already placed in jobs paying more than $50,000, TKH’s key metric of student success, should be able to audit classes. Jerelyn concurs.

Prompts for Further Inquiry: Query about Causes of Low Attendance

As the conversation moves on, Jerelyn asks whether participation in WDC, which I’ve come to associate with the bootcamp application issue, is the cause of low attendance scores in the UX/UI course at Middlespring Community College. Dre shares that they might be low because students are missing career day events. This prompts a larger conversation about why they’re not attending these days. Stephany shares, “My high level assumptions of why students are missing out on career days is that they’ve already been placed and they already have this career training and they don’t need to go, but that’s not an excuse for not showing up.”

They go into a larger conversation about how some students need career development activities, like working on résumés or interview skills, while others could be exempted from them. Some suggest assessing students at the beginning of a course to determine what career development skills they need. From there, they can modify what each student is required to complete in the course. Again, I’m seeing questions about how program requirements, associated data, and students’ actual daily lives intersect.

The conversation moves to questions about the current pedagogical design of career days. Cassandra, who works on outreach, asks, “How different are career days for developers and designers? From what I’ve seen they’re not that different.” She continues, “I think they should be different. Developers and designers are different. They need to market themselves differently.” This leads into questions of standardization versus customization of career day activities within the overall TKH model.

“They’re Dealing with Outside Issues” – Ideation around Addressing Root Causes

The conversation about career day models and requirements comes to an end, and it seems like they may be about to move on to further program share-outs. It’s been about twenty minutes or so since the group began this conversation about low attendance rates at the community college course. As the conversation settles, Jerelyn asks the group, “anything else?” There’s a pause, and then Sam jumps in.

Regarding job placement interfering with the students’ applications, I honestly think there’s something else and that’s just the students’ [difficulties]. They had difficulties coming in. I think it’s more so that they didn’t get support services that they were told to get, so it’s just carrying over to the next class. They’re dealing with outside issues. We are thinking it is just the fellowship, but truly it is that plus a lot of other issues from the outside. That’s why something that should take less than a week is taking longer. If they don’t know where they’re getting their next meal, a student might struggle to complete something… We can see the top level issue, but we really have to deal with the root problem, which is accessing supportive services that are literally right next to them. Sometimes the person from HRA (a social service agency) will be there at a class, and will literally be like ‘I’m here to help you, come talk to me.’

This prompts a larger conversation about what kinds of communication are happening within the organization regarding student needs and why students are not taking advantage of services that are already offered. “So let’s follow up on that,” Jerelyn says. “This sounds like a case management issue, and we might need to change our case management policy. How do we find more services for our students, or ones that are better fits for them? My question is why aren’t students taking advantage of the ones that we already have in place, like Workforce One.” Cassandra responds, “They might be reaching out to them. We need to have a feedback loop from Workforce One.” Jerelyn responds, “We should be tracking that, logging that somewhere.” The conversation moves back and forth over tactics already in place to track social service uptake among students, raising examples of how other partners track both qualitative and quantitative data around these sorts of barriers and root-cause issues, and how those approaches could have implications for both reporting metrics and case management of students.
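The referral tracking Jerelyn asks for (“we should be tracking that, logging that somewhere”) can be sketched as a small log of referrals and follow-ups, from which an uptake rate falls out. The fields and service names below are illustrative assumptions, not a description of any system TKH uses.

```python
# Hedged sketch of a referral log closing the 'feedback loop': record each
# referral to a support service and whether the student connected with it.
# Fields and service names are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Referral:
    student: str
    service: str
    followed_up: bool = False

@dataclass
class ReferralLog:
    entries: list = field(default_factory=list)

    def refer(self, student, service):
        self.entries.append(Referral(student, service))

    def mark_followed_up(self, student, service):
        for r in self.entries:
            if r.student == student and r.service == service:
                r.followed_up = True

    def uptake_rate(self):
        """Share of referrals where the student actually connected."""
        if not self.entries:
            return 0.0
        return sum(r.followed_up for r in self.entries) / len(self.entries)
```

An uptake rate like this would give the staff both a reporting metric and a case-management signal: a low rate points at exactly the root-cause question Sam raises, why students are not reaching services that are “literally right next to them.”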

Closing Out the Program Report

The meeting moves on from the conversation about Middlespring, with Cassandra sharing out data from some other programs. The data includes familiar metrics, such as attendance and ‘top’ and ‘bottom’ students, but there are also program-specific data points. For example, one program tracks the number of mock interviews completed, standout fellows, and students flagged as ‘of concern.’ Cassandra shares some positive stories about some of the stand-out fellows. “Michelle and Caroline, despite their challenges in coding, have been very committed to office hours and are trying to figure things out that are hard for them in terms of coding. Michelle told me that she was downtown one night until 2am so that she could access wifi. It showed me her perseverance, and that she’s not going to let anything stop her.” After some additional program reports, the data report-out portion of the meeting wraps up.