
The Faculty Piece of the Student Success Puzzle

Writer: Ross Markle


One of my first experiences as an assessment professional was working with the University Housing Office. They were so excited to share their work, starting with their impressive set of learning outcomes. Addressing everything from academic success to civic engagement to leadership, their passion for impacting their residents was clear.


There was just one small problem: they had no vehicle for that impact. The Housing Office really had only one intervention at its disposal: the programming put on by RAs. These programs were about an hour in length, twice each semester, and optional. Of course, there were other ways that Housing could impact students - through the design of the residence halls, training of staff, etc. - but there weren't many student-level interventions that could, for example, foster leadership skills.


In talking with the Housing department, I used the term "intervention capital." The Housing folks only had about 120 minutes of intervention per student each semester. They also had little control over the nature and quality of that intervention and couldn't even ensure that students would participate.


The concept of intervention capital has helped me throughout my career to design interventions, craft programs and resources, and guide organizations looking to have an impact on their constituents. When we're talking about student success in higher education, we have lots of different kinds of intervention capital at our disposal. We can redesign the structure of how we engage with students (e.g., Guided Pathways). We can provide external resources to students. Most often, we design programs, offices, or other institutional mechanisms specifically designed to help students succeed.


In most cases, however, these are limited. Think about the co-curricular resources on your campus. What does their intervention capital look like? How often do they engage with students? What does that engagement look like? How long does it last?


In the end, if we want to better support, engage with, and improve the success of students, there is likely no more intensive intervention than faculty. Whereas our housing example had about 120 minutes of low-intensity, low-control, optional intervention, faculty have (assuming three hours of class each week for a 15-week semester) 45 hours of direct intervention, plus whatever assignments and out-of-class work the course entails.


Before I go any further, let me say a few things. By no means should we abandon or diminish the importance of the critical work that happens in advising, student affairs, and other areas of student success outside the instructional arena. Moreover, I'm not saying that bringing faculty into student success work is simple or easy. What I am saying is that if you're not considering faculty in efforts to improve student success, you're missing a rather large piece of the puzzle.


A Bit of Background

There are two important concepts that I need to introduce, both as a means of citing the proper sources and providing a foundation for my premise: implementation fidelity and program theory.


Program theory refers to the theoretical foundation and intentional design of our interventions. The study by Pope, Finney, and Bare is a great resource for educational practitioners, as it emphasizes that our efforts to improve student success should be thoughtfully planned and, when possible, rooted in evidence of what works.


Implementation fidelity refers to the measurement of intervention strength through indices such as duration, intensity, and engagement. It's basically a framework for assessing whether people receive the intervention in the way we intended. It's also helpful for assessing a control group, as its members might be exposed to some of the same treatment elements through other means.


I could go down many rabbit holes around these topics - they're vitally important for well-designed interventions. Overall, these concepts emphasize a critical yet often overlooked question: If we want to see change in students, our outcomes, etc., how do we expect that to happen? Do we think that the efforts we're proposing are sufficient?


To what end?

In preparing for a recent presentation, I came across a study from the Bureau of Labor Statistics stating that the average college student spends 3.1 hours each weekday on educational activities - roughly 15.5 hours a week. If most students are spending 12-15 hours a week in class, then the vast majority of that time is spent with faculty. If we are making an effort to improve student success, how much intervention capital do we have beyond those hours?


Certainly, for some students, there is more time spent on academic activities. For others, a greater portion of time is spent engaging with co-curricular resources. I'm not willing to play the averages out in full and claim there are only about 30 minutes a week left over, but I will return to my central point: ignoring faculty means missing a massive chunk of our intervention capital.


One challenge is determining just how we want to use that capital. There are a few assumptions that I think are untenable from the start. First, we can't add things to the curriculum. It's already full, and assuming we can pile on to what faculty are teaching is a difficult conversation starter. Second, it's difficult to bring faculty into our co-curricular efforts by asking them to perform additional advising, coaching, or counseling work (i.e., one-on-one conversations with students) because, again, we'd be adding to already full workloads.


I often say that if we want our student success rates to change, we either need to do something extra or something different. Otherwise, we're living out that old adage about insanity: doing the same thing and expecting different results. This has been incredibly helpful as I've thought about our approach to faculty development in the service of student success. It's not about doing something extra; it's about doing something different.


When we train faculty (we call this "holistic pedagogy"), it's all about developing classroom strategies that foster key areas of student development relevant to success. For the most part, these strategies are agnostic to content area or topic. Our primary focus at the moment is on four popular topics: engagement, sense of belonging, self-efficacy, and growth mindset. The goal is to help faculty understand these factors and the ways their interactions with students can either foster or inhibit them.


And for whom?

The concept of holistic pedagogy makes sense to a lot of our partners, but there is the inevitable question of how to get faculty to do it. It's a difficult one for sure, as there are few mechanisms that can mandate what training faculty receive. But here are a few approaches that have resonated with some of our institutional partners:

  • Faculty who teach student success courses are typically on board, and there is usually a central training process that makes a great home for this work.

  • Remembering that very few faculty are actually trained in teaching, new faculty are a great audience. Much like new students, they're less set in their ways and can be more willing to learn.

  • Speaking of willing, any faculty who voluntarily participate are a wonderful audience. Admittedly, they often already do many of the things we aim to instill, but the same love of students that brought them to the training tends to make them grateful for whatever new knowledge or skills they can acquire.

  • If I were a provost, targeting faculty with high DFW rates would be an absolute necessity. It's a difficult topic, and there are many questions about carrots and sticks, but just from an economic perspective, it makes a lot of sense.


Gradual Improvement

Any time we're talking about noncognitive skills or holistic student success, we are challenging existing culture, assumptions, policies, and practices. It's rarely easy to come in - even in the most dire circumstances - and suggest that people need to change the way they're doing things. I often recommend starting with an audience of the willing. They will help you understand what's working, what's tenable, and what's difficult to achieve.


As they gain excitement for the work, they can help persuade the uncertain. And by the time that happens, the opposition will either be too tired or too outnumbered. They may have even seen the change coming and opted to find greener pastures. My sincere hope is that the growing critical mass will help them see the value of this, or any, innovative approach.

