Content area
Purpose
This study aims to evaluate the outcomes of participating in a brief three-part mental health literacy (MHL) training, the Mental Health Awareness and Advocacy (MHAA) training, among employees in Cooperative Extension.
Design/methodology/approach
The authors used a case study research approach for program evaluation to understand the outcomes of participating in a brief three-part mental health literacy training, the Mental Health Awareness and Advocacy training. Participants were eight individuals who completed all three parts of the training and consented to participate in semi-structured interviews. The authors used knowledge, self-efficacy and behavior as the theory-informed analytic framework; they also engaged in inductive coding to examine other salient areas as discussed by participants.
Findings
All participants reported meaningful increases in knowledge, self-efficacy and covert (non-observable) behavior changes, while some reported overt (observable) behavior changes.
Originality/value
The current study used a case study research approach for program evaluation to understand the outcomes of participating in a brief three-part mental health literacy (MHL) training, the Mental Health Awareness and Advocacy (MHAA) training. While previous quantitative research on MHAA programming has documented significant increases in knowledge and self-efficacy, quantitative analyses have largely failed to identify significant behavioral changes. The qualitative methods used in the current study were valuable in identifying behavioral indicators of MHL that may help refine evaluation of behavioral changes following MHL programming.
Introduction
Mental health concerns (e.g. depression, anxiety, bipolar disorder) are among the top five leading causes of disability across the world (WHO, 2021). Evidence-based treatments, primarily behavioral therapies (e.g. cognitive behavior therapy, behavioral activation, acceptance and commitment therapy), have garnered considerable research support for improving outcomes for individuals experiencing mental health concerns. Despite this, a majority of individuals who experience mental health concerns go untreated, negatively affecting their overall quality of life (Alonso et al., 2018). It is evident that reducing the burden of mental health concerns will require a multi-pronged approach that includes both intervention and prevention programming to help individuals access care (WHO, 2021). A common framework for prevention programming is based on increasing individuals’ mental health literacy (MHL) to help increase access to effective care for self and others (Morgan et al., 2018; Seedaket et al., 2020).
MHL is defined as “knowledge and beliefs about mental disorders which aid their recognition, management, and prevention” (Jorm et al., 1997, p. 182). Research suggests that the general public typically report low levels of MHL, which could interfere with their ability to recognize and respond to mental health concerns (Furnham and Swami, 2018). A common barrier to seeking treatment is lack of recognition that the mental health concern is serious enough to warrant treatment (Dunley and Papadopoulos, 2019). As a result, multiple training programs (e.g. Mental Health First Aid; Question Persuade Refer; Youth Aware of Mental Health Intervention) have been created to increase MHL and, as a byproduct, treatment-seeking behaviors (e.g. screening for mental health concerns, seeking treatment; Wang et al., 2021).
Since their creation, these programs have proliferated across the USA via funding from the Substance Abuse and Mental Health Services Administration (El-Amin et al., 2018), with results generally showing increases in knowledge and self-efficacy, but mixed evidence of changes in behaviors related to MHL (Maslowski et al., 2019; Morgan et al., 2018). Additionally, results seem to suggest that online and in-person delivery of these MHL programs are equally effective, with similarly mixed evidence on changes in behaviors (Aller et al., 2022; Brijnath et al., 2016). It is important to gain a richer understanding of the specific behaviors that participants engage in, both overtly and covertly, following MHL trainings to determine whether the mixed behavior change findings are due to methodological concerns (e.g. measurement issues specific to MHL behaviors, recency issues) or whether changing MHL behaviors is simply more difficult via these MHL intervention approaches. Accordingly, the current study is a qualitative program evaluation of one MHL training, the Mental Health Awareness and Advocacy curriculum ([the authors], 2022), that was adapted and delivered in a three-part Zoom-based webinar series.
Literature review
Mental health literacy
Jorm et al. (1997) divide MHL into five key components:
“(a) knowledge of how to prevent mental disorders, (b) recognition of when a disorder is developing, (c) knowledge of help-seeking options and treatments available, (d) knowledge of effective self-help strategies for milder problems, and (e) first aid skills to support others who are developing a mental disorder or are in a mental health crisis” (p. 182).
Accordingly, research on MHL outcomes can be organized into three categories: knowledge, self-efficacy and behaviors.
Knowledge
MHL knowledge includes declarative knowledge about symptoms of mental health concerns, treatment options and effective responses to crisis. There is a growing body of evidence from systematic reviews (Lo et al., 2018), scoping reviews (Wang et al., 2021) and meta-analyses (Maslowski et al., 2019; Morgan et al., 2018) that MHL training significantly improves knowledge about mental health and attitudes regarding those experiencing mental health concerns.
Self-efficacy
MHL self-efficacy incorporates confidence about one’s capability to perform MHL skills, such as in recognizing symptoms of mental health concerns. Self-efficacy is an important component of MHL training because it has been shown to predict behavior and is therefore often treated as a proxy for actual helping behaviors (Maslowski et al., 2019). Meta-analyses of MHL trainings show significant small to moderate effect sizes for improvements in MHL self-efficacy (Maslowski et al., 2019; Morgan et al., 2018).
Behaviors
MHL behaviors include both overt (observable) behaviors, such as asking someone about suicidal ideation, and covert (not observable) behaviors, such as viewing those who experience mental health concerns with more compassion. Evidence is mixed for behavior change after participating in MHL training. In their meta-analysis, Morgan et al. (2018) found that there was no increase in the frequency of MHL-oriented supportive behaviors at post-test, but a small, statistically significant increase (d = 0.23) in frequency at six-month follow-up. In another meta-analysis, Maslowski et al. (2019) found a small, statistically significant increase in frequency of helping behavior (Hedges’ g = 0.29) after MHL training. Contrary to expectations, rather than increasing with time as participants acted on the trainings, this effect weakened from post-test to follow-up, although it was still significantly improved from baseline.
Multiple factors could influence the mixed findings on behavior change following MHL training, including restrictive definitions of behavior change (e.g. only assessing whether participants referred someone to a mental health provider), using intention to act or confidence as proxies for behaviors, and having too brief of a follow-up period (Lee and Tokmic, 2019; Maslowski et al., 2019). Supporting the role of follow-up length, several studies with long follow-up periods (one year, McCabe et al., 2014; or two years, Reavley et al., 2021) found evidence of behavior change. While sufficient elapsed time may be necessary to assess changes in behaviors, Maslowski et al. (2019) reported that the effect of MHL training on behavior reduced at follow-up compared to post-test. Additionally, too much time can introduce outside factors that can confound results.
In sum, the mixed results on behavior change following MHL training could be due to a number of design or measurement factors (Mansfield et al., 2020). It is important to develop a deeper understanding of the lived experience of participants completing MHL trainings to determine the extent to which their behaviors change to help inform researchers on why behavior change could be hard to detect. To effectively evaluate behavior change, researchers should assess both overt and covert behavior, evaluate actual behavior and confidence separately and have a sufficiently long follow-up period to allow behavior change to occur.
The current study
The purpose of this program evaluation study was to better understand whether helping behaviors changed after participating in MHL training and why any change occurred. This understanding is useful for helping researchers better study behavior change or for modifying programs if behavior is not changing. The primary research question was, “How do participants describe changes in their knowledge, self-efficacy and overt and covert behaviors related to MHL skills?” The secondary research question was, “How do participants describe the influence of knowledge and self-efficacy on their overt and covert behaviors related to MHL skills?”
Mental health awareness and advocacy curriculum
The Mental Health Awareness and Advocacy (MHAA) training was originally established as a 16-week, evidence-based college curriculum, delivered in both online and in-person formats ([the authors]). In quantitative analyses of a quasi-experimental treatment-control study, the college curriculum demonstrated effectiveness in increasing participants’ knowledge and self-efficacy across both modalities of delivery, but did not directly impact behaviors ([the authors]). The MHAA curriculum is strongly informed by the health belief model (Becker, 1974; Rosenstock et al., 1988) and social cognitive theory (Bandura, 2005; Spike and Hammer, 2019). In line with theory, the hypothesized process of behavior change in the MHAA curriculum is that knowledge leads to self-efficacy, which leads to behavior change (see Figure 1). We recognize that in real-world learning the process may not be this linear and may be more iterative between the three components; however, we present this process sequentially for parsimony.
Program overview
The Mental Health Awareness and Advocacy training series used in this study was adapted from the original college curriculum ([the authors]; [the authors]) and reduced to three one-hour, skills-based Zoom presentations on the main components of mental health literacy: identifying mental health concerns, including common symptoms of various mental health conditions; locating evidence-based resources, including the definition of evidence-based treatment, criteria for selecting providers and building skills in searching for local providers; and responding to mental health concerns, including how to respond to those experiencing a range of mental health concerns and assessing risk for suicide. The course was delivered by both a licensed mental health professional and a non-mental health professional (although the course has also been delivered solely by non-mental health professionals). The depth of content, clinical examples and college-based assessments of learning objectives (quizzes, papers) were eliminated, but this reduced format still followed the core of the MHAA approach. Despite the shortened materials, participants still engaged with activities throughout the training, including mini-exams (multiple choice questions), role-plays and opportunities for questions and answers on the materials. The three training segments had 41, 34 and 32 participants, respectively.
Material and methods
To conduct our program evaluation, we used Yin’s (2014) approach to case study research. This approach was chosen because it provided a systematic, in-depth way to evaluate the program with a relatively small sample size. We conceptualized the case as the three-part MHAA training taught to county Cooperative Extension employees (regional university-community liaisons who consult on issues ranging from agriculture to mental health) in a western state in the USA in June and July of 2021, with individual interview participants serving as the units of analysis. Data for this project were originally part of grant evaluation and later approved for research by our university’s institutional review board (IRB protocol #12473).
Procedures
Three to four months after completing the training, individuals who attended all three training segments of the MHAA training series were emailed an invitation to participate in a 30–45-minute semi-structured interview. Recruitment materials emphasized that the interviewer was not directly involved with the trainings to encourage honest responses. If individuals did not respond to the initial contact email, they received an additional follow-up email. Semi-structured interviews occurred over Zoom.
Interview questions aimed to address perceptions of change in knowledge, self-efficacy and behaviors in the skills of identifying mental health concerns, locating high-quality resources and responding to mental health concerns, as well as aspects of the training that were especially useful or could be improved upon for participants (interview questions can be found in the Appendix). Audio recordings of interviews were transcribed verbatim by one team member and checked against the audio for accuracy by a different team member. No participants skipped an interview question and there were no known adverse events from the interviews.
Participants
Participants included in the evaluation were eight county Cooperative Extension workers from the same western state (see Table 1). Participants’ mean age was 36.38 (SD = 5.85). Participants were White (n = 5), Latina (n = 2) and Asian and White (n = 1). Seven participants were female, and one was male. Participants were college educated: associate’s degree (n = 1), bachelor’s degree (n = 5), master’s degree (n = 2). Four participants (50%) reported having been personally diagnosed with a mental health concern and six participants (75%) reported they knew someone who had been diagnosed with a mental health concern. To preserve anonymity, we use gender neutral pronouns, do not define specific employed positions and present demographics in aggregate, rather than the typical case study convention of presenting demographics for each participant.
Data analysis
Theoretical framework
Yin (2014) suggests using an initial theory-informed framework to guide study design and analysis. We used the hypothesized process of behavior change, that knowledge and self-efficacy would lead to behavior change, as our analytic framework (see Figure 1; [the authors]). We defined knowledge as information about the prevalence or symptoms of mental health concerns, awareness of available resources, and understanding of severity and how to respond to mental health concerns. We defined self-efficacy as confidence in one’s own mental health literacy. We defined behavior change as overt (observable) and covert (non-observable) behaviors related to mental health that were initiated or altered after participating in the training. While we used knowledge, self-efficacy, and behavior as our theory-informed analytic framework, we also engaged in inductive coding to examine other salient areas as discussed by participants (Gilgun, 2019). The analysis team consisted of three coders: the interviewer, one of the trainers for the program, and a research assistant. At least two coders coded each interview.
Coding
Initial coding consisted of data familiarization and labeling and organizing data segments according to positive and negative feedback about each training module. We then examined each data segment for evidence about knowledge, self-efficacy, behavior or another category. We coded using gerunds to capture actions and occasionally used in vivo codes to capture a participant’s particular experience. Throughout coding, we memoed (Saldaña, 2016; Yin, 2014) about analytic insights and noted aspects of the training that were helpful or could be improved. After coding each transcript, the analysis team met to discuss key findings and to examine connections across interviews.
Following initial coding of all interviews, we created a codebook to organize participant quotes for each training segment into themes (Yin, 2014). To facilitate cross-case comparisons, we compiled all participants’ relevant responses into one document. Each of the three coders reviewed the comprehensive document and examined evidence for the themes. Through regular team meetings, we determined which findings were common across participants and refined our results according to the guiding theory and inductive evidence.
As Gilgun (2019) suggests, to prevent a shallow analysis and to encourage refinements and improvement, we intentionally sought elements within the data that supported, contradicted, refined or expanded the hypothesized process of behavior change. This helped to prevent confirmation bias while still allowing us to be informed by theory. We also examined rival explanations for our results, such as previous experience with mental health (Yin, 2014). Our results include both theory-informed and inductively derived themes based on our analysis.
Trustworthiness and reflexivity
Trustworthiness of analysis is supported by using an analysis team, engaging in structured reflexivity activities (Hall et al., 2005) and via detailed analytic memos (Saldaña, 2016). Specifically, the three members of the analysis team had varying experience with mental health, familiarity with the training and qualitative research experience. This diversity allowed us to explore different perspectives on the data and to challenge each other’s assumptions. During the active analysis phase, the analysis team met weekly, allowing for continuity of analysis and affording ample time to express different views and come to agreement on the meaning and salience of different passages. Finally, by directly examining alternative explanations for our results (i.e. previous experience with mental health rather than the training itself), we were able to explore rival hypotheses and improve trustworthiness (Yin, 2014).
Results
Research question 1: changes in knowledge and self-efficacy
We organize results based on research question. Based on the hypothesized model of change that increased knowledge would lead to increased self-efficacy, which would lead to behavior change, our first research question was whether participants’ knowledge, self-efficacy, and behaviors changed after participating in the training (see Figure 1). We organized findings for this question into three themes – prior experience, knowledge and self-efficacy. Although behaviors could fit within this section of the results, we present them under the heading of Research Question 2 to promote comprehension and limit repetition of quotes.
Prior experience with mental health
Considering that a high proportion of our sample reported being personally diagnosed with a mental health concern (50%) or knowing someone with a diagnosis (75%), it is possible that participants’ knowledge and self-efficacy were partly based on their previous experiences. To help ensure our analysis was targeting the impact of the training, we focused on participants’ descriptions of changes in their knowledge and self-efficacy since participating in the training. That being said, prior context of mental health understanding and awareness was important. Several participants who reported experiencing a diagnosed mental health concern emphasized that this had given them awareness of and compassion for others who struggle. Participant 3 explained, “[I] have more empathy about it, or feel more connected, when somebody is struggling with their mental health.” They continued, “Of course, all the trainings and everything, that is a big help and support. But in me, personally, it’s because I’ve been through [mental health challenges.]”
While all participants identified positive aspects of the training, two participants who had previous exposure to mental health literacy training and direct experience helping others with mental health reported that it was helpful for beginners but not useful for them personally. They reported little change in their knowledge, self-efficacy, or behaviors, although the training had helped them to be more intentional. Participant 7 explained, “[the training] made me think a little bit more about what I’m doing. But as opposed to like increasing knowledge, I don’t know, I feel like I had a pretty solid knowledge beforehand.”
Knowledge
Most participants referenced increased knowledge about mental health as one of the primary outcomes of the trainings. Examples of increased knowledge included symptoms of mental illness, available resources and different severities of mental illness.
For some participants, the information in the trainings was new and so they reported large gains in knowledge. Participant 8 explained, “I learned a lot of new things that I never knew … [before,] I might have not known exactly what depression was defined as, but now I have a better definition or just like that understanding.” Several participants expressed that they had never heard of some of the resources discussed in the locating training, like Participant 3 who remarked, “I didn’t know about all that. Not at all. About what can you do, for example, if you’re dealing with domestic violence.”
Multiple participants reported increased knowledge on resources and referrals from the training. Participant 1 reported, “We got on Psychology Today and found what was in our area. And so, since then I just, like, I know the different counseling centers, I know the doctors, I know who’s a psychologist or psychiatrist.”
Some participants reported that the training had expanded their awareness of mental health and different severities of mental health concerns. Participant 3 explained, “[Because of] the training, like you see kind of a more bigger picture about really what is a mental health issue, and what can you do if somebody comes to you and asks you for help, or if you’re just looking [for] some signs.” Participant 6 echoed this sentiment, saying, “now I can say, ‘okay this is depression, but what severity is this, is this just like a temporary down or low in their life, or is this like much deeper, are they considering hurting themselves or others?’”
Self-efficacy
Most participants reported that the training increased their confidence in their ability to help others. When asked to scale their confidence in identifying mental health concerns, Participant 6 expressed, “I’d say like 75%… I can see what’s happening or what the build-up [of stressors is] way better than before.” They attributed their increased confidence to their increased awareness of signs of mental health distress.
Multiple participants referenced that confidence had increased because they now knew a “direction” for resources and referrals. Participant 1 reported feeling “very confident” in responding to others now “because I know the direction or maybe the resources of where to send them … using like the Psychology Today, finding people who can help them in town.” Participants also reported increased confidence in directly asking others about mental health concerns. Participant 8 explained, “I also have the confidence that I’d be able to have a conversation with them [about mental health] where I would know a little bit what I was talking about and … some resources that I knew that I could send them to.”
Several participants reported that group discussion and the instructors’ natural way of talking about mental health concerns were key to increasing their confidence. Participant 8 explained, “The instructor just made it feel like a really natural thing” to talk about mental illness, rather than “a forbidden topic or a forbidden thing to really assess.” As a result, they reported, “now I feel like it’s something where people are actually open to talk about it.”
Multiple participants expressed that their confidence would have been further enhanced by having more opportunities to practice during the training. Participant 7 rated the overall training as 5/10, explaining, “It was good information, but because it didn’t necessarily increase my knowledge and abilities.” While the training did involve group discussion of several case studies, participants expressed a desire for more active skill-building, such as role-plays. Participant 8 reported that role-plays would have helped them to overcome being “nervous” and “tentative” with responding to someone experiencing a mental health concern.
Research question 2: changes in behaviors through knowledge and self-efficacy
Once we determined that there was some evidence of improved self-reported knowledge and self-efficacy for most participants, our second research question explored the relationship between increased knowledge and self-efficacy and behavior change. Themes for this question include covert and overt behaviors, as well as barriers to acting on the training.
Behaviors
Participants described multiple changes in their overt and covert behavior since the training, in part due to increased knowledge and self-efficacy. Covert behavior changes included interpreting others’ behavior through the lens of mental illness and viewing those who struggle with mental health with greater compassion. After reporting what they had learned about mental illness (knowledge), Participant 1 shared how they had initially been “judging” of a relative but that, “since this training … I’ve looked back at that, and I’ve wondered what kind of mental state she was in—overwhelmed, anxiety, depression, just frustrated.” They continued, “I probably wasn’t the most compassionate [before], I was more like judging. But now I am very like empathetic towards people’s situations.”
Participant 6 credited increased understanding of mental health (knowledge) with changing their overall approach to those who experience mental illness. They compared mental illness to a hole and explained that before the training, they thought, “I’m up here, I’m going to pull you up and out.” After the training, they reported, “My understanding now is more like, ‘I’m going to climb down and walk with you, you know, or sit with you or whatever it is.’ … my role is more to listen, and to be there.”
In addition to covert behavior changes, participants reported engaging in several overt behavior changes, with plans to engage in more as opportunities presented themselves. Two commonly reported overt behavior changes were asking direct questions about mental health and treating others with more empathy and compassion. Both skills were directly addressed during the training. Participant 5 described a situation where they had intervened with someone experiencing a mental health crisis. They reported, “[I] was able to ask, you know, ‘have you thought about hurting yourself, or causing any harm to yourself?’” They continued, “[the training] definitely helped as far as being brave enough to ask those questions” (self-efficacy). Participant 3 explained that after learning the strategies for intervention taught in the training (knowledge), they realized people experiencing mental health concerns “just want to feel that somebody is there for them… [like,] ‘I’m sorry you’re going through this … you’re not alone.’”
An additional overt behavior change for some participants was taking proactive steps to promote their own mental health. Participant 2 talked about learning common symptoms of mental illness during the training (knowledge) and reported that since the training, “I’ll be like, ‘Okay, this [symptom] is telling me that I need to take care of myself.’” They reported that when they addressed their own mental health needs, they could “then in turn recognize it in others.”
Barriers to action
While most participants reported acting on the information from the training, all participants reported encountering one or more barriers in efforts to help others. These barriers included lack of confidence in their knowledge, lack of a personal relationship with the person and stigma. While most participants reported increased knowledge, several reported that they lacked the confidence to act on this information. Participant 3 reported worrying, “They might have this [mental health concern] – but what if I’m wrong?” They continued, “What if, even though I might be right, the other person doesn’t want to admit it?” This lack of confidence stopped them from intervening.
Participant 2’s experience shows the interplay of stigma, lacking confidence, and not having a personal relationship. They shared a time since the training when they felt like a stranger may have been experiencing a mental health concern but they did not do anything. They reported, “I walked past that person that day thinking, ‘Oh, don’t bother them, they don’t want to be bothered,’ and I talked myself out of it. So maybe the barrier is just doing it. And having the confidence.”
As Participant 2’s quote exemplifies, some participants viewed their need and ability to intervene as dependent on their relationship with the other person. When they had a strong personal relationship, they felt more comfortable talking about mental health. Participant 7 reported that they felt comfortable supporting their friends with their mental health because, “With my friends I feel like I got more of a role to be doing that … I feel like we have a deeper connection and so it feels more comfortable.” However, when participants lacked a personal relationship, they were less likely to intervene, even if they were concerned. Participant 5 explained, “if I were to maybe see warning signs of someone that I’m not as close to, I would maybe go to someone that has a better relationship with that person and express my concern.” They continued, “There needs to be that relationship and that trust [for someone to be willing to listen].” Our participants perceived this relationship to be essential to effectively helping someone with their mental health and felt less capable of helping those they lacked a personal relationship with.
Relatedly, multiple participants referenced a “stigma” around mental health discussions. As a result, they hesitated to act on the information from the training. Participant 7 attempted to circumvent this stigma by focusing on “the positive side” of mental health instead of telling others, “You’ve got a problem.” Two participants referenced Latinx culture as being less open to discussing or recognizing mental health due to cultural stigma. One participant explained, “there is a stigma for us [as Latinx]. And maybe sometimes we don’t want to hear that I have a problem with my mind.” Another participant reported, “from my culture … being depressed or talking about mental health, you automatically think, … ‘Oh, that doesn’t exist.’”
Discussion
This qualitative program evaluation study evaluated participant outcomes of the Zoom-delivered, three-part brief Mental Health Awareness and Advocacy (MHAA) training for employees in Extension. The primary research question was the extent to which participants reported improved knowledge, self-efficacy and behaviors. We found that most participants reported increased MHL knowledge and self-efficacy, particularly when they had not had prior training in this area. This is in line with other research suggesting that MHL trainings can increase participants’ knowledge and self-efficacy (Lo et al., 2018; Maslowski et al., 2019; Morgan et al., 2018).
At three months follow-up, all participants reported covert behavior changes, while only some participants reported overt changes. This supports the importance of allowing sufficient elapsed time for accurately measuring changes in helping behaviors (Morgan et al., 2018; Reavley et al., 2021), as well as expanding assessments of behavior to include both overt and covert behaviors. Nevertheless, the finding that only some participants reported overt changes aligns with previous research suggesting that MHL trainings may not increase participants’ helping behaviors (Maslowski et al., 2019; Morgan et al., 2018).
The secondary research question was the degree to which participants connected aspects of their MHL behaviors with their increased knowledge and self-efficacy in this area, specifically from these trainings. We found evidence for the importance of both increased knowledge and increased self-efficacy in facilitating overt and covert MHL behaviors. Further supporting this, our participants identified lack of knowledge or self-efficacy as barriers to acting on MHL training.
Although not conclusive, some participants’ responses suggested that increased knowledge led to changes in covert behaviors, whereas increased self-efficacy led to changes in overt behaviors. Future research should examine this distinction quantitatively and longitudinally, using qualitative responses to add depth and clarity to understanding these processes. Additionally, several participants did voice a need for continued case examples and more role-play scenarios to further develop their skills. This is likely due to the shortened training format, in which most of the time saved came from cutting these time-intensive activities. That being said, it would be important to better understand whether these observations stem from the shortened training or from other factors (e.g. online delivery).
Participants’ explanations for limited behavior change centered around lack of personal relationship and stigma. The perception that a personal relationship is essential to intervening may be an artifact of an individualistic culture. Research suggests that in collectivistic societies, task-sharing by professionals and bachelor’s educated individuals is effective in broadening access to care (Magidson et al., 2019). The belief in the necessity of a personal relationship is an important barrier to overcome to promote helping behaviors at a community-wide level.
Lessons learned
First, our findings suggest that other mental health literacy training programs could consider using more holistic assessments of behavior change to capture more depth regarding the impact of MHL training on individual outcomes and learning processes. While the primary outcomes of interest remain the frequency and quality of helping behaviors, changes in covert behaviors may be important preparatory shifts that increase the likelihood of overt helping behaviors. In particular, we recommend that future MHL training program evaluations consider integrating mixed methods to holistically examine overt and covert behavior changes. We hoped to accomplish this in our study, but were unable to recruit sufficient participants to run adequately powered quantitative statistical analyses following our trainings.
Second, including analysts with different levels of involvement in the training promoted both an emic (insider) and an etic (outsider) perspective that enhanced analysis. Using an interviewer unaffiliated with the training promoted candid responses about both positive and negative aspects of the training. At the same time, including one of the trainers for the program as a coder enabled a deeper understanding of components of the training referenced by participants. We recommend that future program evaluations similarly use a combination of coders with differing levels of engagement with the program and that interviews be conducted by someone not directly associated with the training.
Limitations
The primary limitation of our results is that the data all came from participants who completed the same series of trainings. Our design would have been strengthened by including participants from multiple training series to better assess whether the findings are applicable to a variety of participants. A second limitation was that interviews were conducted three to four months after the training, which introduces potential confounding factors. We attempted to account for this by assessing participants’ engagement with other MHL trainings during the past six months; only one participant had engaged in any additional MHL training. Additionally, there was no particular justification for the three-to-four-month measurement period; it was based more on participants’ schedules and needs. A third limitation was that while we assessed for changes in behavior, we did not assess the frequency of these behaviors. A fourth limitation is that despite efforts to assure participants that the focus was on improving the program, and thus negative feedback was welcome, there remains the possibility of response bias or social desirability bias. Finally, the study was conducted during the COVID-19 pandemic, potentially affecting the results of the study.
Conclusion
This qualitative program evaluation assessed outcomes from Extension employees participating in the Zoom-delivered, three-part brief MHAA training. Participants reported increased knowledge and self-efficacy, and, consistent with theory, connected changes in their overt and covert behaviors with their knowledge and self-efficacy. Future program evaluations should incorporate qualitative or mixed methods analysis to more holistically study behavior change following MHL training.
The authors express appreciation to [anonymized] for their assistance with an early draft of this manuscript. The authors express appreciation to our participants for their thoughtful responses and willingness to share their experiences.
Funding: This research was supported by funding from an Interagency Outreach Training Initiative grant.
Declarations of conflicts of interest: The authors do not report any conflicts of interest.
Declaration of interest: To complete the study, the authors received funds from the Interagency Outreach Training Initiative (IOTI). The purpose of this funding is to support training that responds to needs identified through a statewide needs assessment conducted by the USU Institute for Disability Research, Policy & Practice.
Figure 1. Process-based model of the Mental Health Awareness and Advocacy curriculum
Table 1
Participant demographics
| Variable | Count (%) or mean (SD) |
|---|---|
| Age | 36.38 (5.85) |
| Gender: Female | 7 (87.5%) |
| Gender: Male | 1 (12.5%) |
| Ethnicity: White | 5 (62.5%) |
| Ethnicity: Latina | 2 (25.0%) |
| Ethnicity: Asian and White | 1 (12.5%) |
| Education: Associate’s degree | 1 (12.5%) |
| Education: Bachelor’s degree | 5 (62.5%) |
| Education: Master’s degree | 2 (25.0%) |
| Diagnosed with MHC | 4 (50.0%) |
| Know someone diagnosed with MHC | 6 (75.0%) |

Note: MHC = Mental Health Concern
Source: Table by author
