The School Counseling Program Component Scale (SCPCS) was developed to measure beliefs about program components of the American School Counselor Association (ASCA) National Model. In 2002, results from the SCPCS indicated that “using data” was ranked as less important than other program components. The current follow-up study was conducted in 2009. Results from this study indicated that the component subscales are consistent and reliable. Notable shifts were found in “use of data for program planning” and “use of data for accountability” subscales, suggesting that school counselors may have begun to prioritize the value of using data.
- education theory and practice
- social sciences
- educational administration
- leadership and policy
- educational measurement and assessment
- educational psychology and counseling
- reliability and validity
- research methods
In the late 1990s, The Education Trust, Inc. introduced the National Transforming School Counseling Initiative (TSCI; House & Martin, 1998). Their work paved the way for the future of school counseling to include leadership, advocacy, and systemic change through the use of data. In 2003, the American School Counselor Association (ASCA) incorporated many of these Education Trust initiatives to develop the ASCA National Model: A Framework for School Counseling Programs.
The ASCA National Model was created as a tool for building comprehensive school counseling programs that focused on foundation, delivery, management, and accountability (ASCA, 2003). In it, ASCA recommended school counselors engage in designing standards-based programs focusing on increasing student expectations while supporting students in the academic, career, and personal/social domains. The framework was designed to assist school counselors and school counseling program teams to utilize various data and accountability tools as they design, coordinate, implement, manage, and evaluate the school counseling program’s efficiency and effectiveness in regard to student success (ASCA, 2003). ASCA indicated that the primary role of the school counselor is to enhance and promote student achievement and identified the school counselor as a leader, an advocate, and a catalyst for systemic change. The framework was intended to unify the profession, providing a blueprint for program development along with the flexibility to fit every school counseling program, and was designed to be incorporated in conjunction with the district and school’s mission and goals.
Since the ASCA National Model was released in 2003, hundreds of research studies and/or dissertations specifically related to the implementation of the ASCA National Model have been conducted. These articles include studies about school counselor job satisfaction (Pyne, 2011), models of data-based decision making (Poynton & Carey, 2006; Stone & Dahir, 2007), school counselor accountability practices (Topdemir, 2010), school counselors’ readiness to deliver school counseling programs (Dahir, Burnham, & Stone, 2009), obstacles and successes in implementation (Studer, Diambra, Breckner, & Heidel, 2011), current state of school counseling models (Martin, Carey, & DeCoster, 2009), and educating future principals (Bringman, Mueller, & Lee, 2010; Light, 2005). These studies reflect the amount of attention the model has garnered over the years, and the potential the model holds to influence the direction of the school counseling profession.
In addition, the introduction of the ASCA National Model was followed by the creation of ASCA’s Recognized ASCA Model Program (RAMP) designation. RAMP components reflect the implementation of a comprehensive school counseling program based on the ASCA National Model. Since 2003, more than 500 schools throughout the country have been awarded this designation (ASCA, 2015a). Although there is little research to date regarding the impact on students’ achievement in RAMP schools, ASCA National Model programs at the elementary level have been shown to significantly affect students’ attendance rates and reading achievement (Ward, 2009). Wilkerson, Pérusse, and Hughes (2013) found that schoolwide proficiency rates in English/Language Arts and Math were significantly higher in RAMP-designated schools compared with those that were not. These studies suggest that a fully implemented ASCA National Model program may be viewed as one factor of the entire school system that positively affects student success.
Zagelbaum, Kruczek, Alexander, and Crethar (2014) analyzed the content of 413 articles published in the Professional School Counseling journal. They investigated whether the literature had shifted toward the mission and content found in the Education Trust Initiative as well as the ASCA National Model. After reviewing the articles, they concluded that researchers were shifting toward more articles based around data-driven practices and accountability, which are consistent with both the ASCA National Model and the Education Trust Initiatives. However, concerns remained about school counselors’ daily workload and their ability to incorporate the ASCA National Model into their programs.
In 2008, Hatch and Chen-Hayes published an article about school counselors’ beliefs with regard to the ASCA National Model School Counseling Program Components using the School Counseling Program Component Scale (SCPCS). In that article, the authors presented results of a survey administered in 2002 to more than 1,200 ASCA members, to establish the psychometric properties of the SCPCS and collect national baseline data on school counselor beliefs about certain program components in the ASCA National Model prior to its release in 2003.
The results of the first SCPCS survey administered in 2002 revealed that when the ASCA National Model was first released, participants reported that program component activities involving data and accountability were less important than other components, such as mission, goals and competencies, and administrator support (Hatch & Chen-Hayes, 2008). Respondents also ranked the use of data for the purposes of identifying achievement gaps and monitoring student progress as the least important in relationship to other activities (Hatch & Chen-Hayes, 2008). The authors suggested the low rankings for accountability measures may be a result of lack of training in how to collect, measure, analyze, and interpret data. Hatch and Chen-Hayes (2008) recommended that successful implementation of the ASCA National Model would begin when school counselors “understand the importance of developing data skills and then using data in both program management and accountability” (p. 40).
Dahir et al. (2009) identified the “gaps in the school counselors’ ability to embrace and implement the new vision of comprehensive school counseling during the initial stages of implementation” (p. 182). Their research focused on school counselors’ “readiness to deliver comprehensive programs by assessing their attitudes, beliefs, and priorities for key program elements affirmed in the ASCA National Model” (p. 182). They discussed the impact the ASCA National Model has had on the school counseling profession, citing studies revealing how students benefit from participating in comprehensive school counseling programs, and noted the lack of training opportunities on how to implement a comprehensive program. Dahir and Stone (2009) also expressed concern that, despite the development of the ASCA National Model and increased political attention to eliminating the achievement gap, many school counselors continue to adhere to ineffective methods of accountability.
Hatch and Chen-Hayes (2008) recommended future research be conducted to document whether professional beliefs and attitudes regarding the ASCA National Model change over time. This study was designed to measure and document any shifts in school counselor beliefs about ASCA National Model School Counseling Program Components between 2002 and 2009, 6 years after the release of the ASCA National Model in 2003.
The primary focus of this study was to examine how the introduction of the ASCA National Model may have influenced school counselors’ beliefs. This was accomplished by comparing data collected in 2009 with data collected in 2002 as reported by Hatch and Chen-Hayes (2008). Specifically, we sought to answer the following research questions:
Research Question 1: Does the SCPCS evidence similar factor structure, reliability, and validity characteristics as previously reported research?
Research Question 2: To what extent have school counselor beliefs shifted on the SCPCS items since the introduction of the ASCA National Model?
Information about the participants and procedures in the 2002 study can be found by consulting the original manuscript (Hatch & Chen-Hayes, 2008). The participants and procedures below describe the current study, with data collected in 2009.
The sample included 12,953 ASCA members who indicated they were school counselors or administrators working in K-12 schools, who had also opted to make their email address publicly available to other ASCA members in the membership database. Of the 1,021 respondents who started the online survey by answering at least one question, 617 provided usable data, yielding an overall response rate of 4.8%.
Of the 617 participants who provided usable surveys, 27% (n = 165) were elementary school counselors, 21% (n = 130) were middle or junior high school counselors, 32% (n = 196) were high school counselors, 12% (n = 72) worked at multiple levels, and 4% (n = 27) were school counseling program supervisors. Twenty-seven respondents (4%) did not answer this question. Eighty-three percent of the respondents were female (n = 512), 15% (n = 95) were male, and 2% (n = 10) did not answer this question. Eighty-six percent of the respondents identified themselves as White (n = 512), 4% as Black (n = 23), 4% as Hispanic (n = 24), 2% as Asian or Pacific Islander (n = 12), 2% as multiracial/multiethnic (n = 13), 1% some other race or ethnicity (n = 6), and 10 respondents (2%) did not answer this question. Eighty-four percent of the respondents had a master’s degree (n = 516), 9% had an educational specialist degree (n = 57), 5% had a doctorate, 1 respondent had a bachelor’s degree with more than 30 credits of graduate work completed, and 10 respondents (2%) did not answer this question. Forty-six percent of the respondents worked in suburban settings (n = 286), 28% in rural settings (n = 171), 24% in urban settings (n = 147), and 13 respondents (2%) did not answer this question. The average amount of experience working as school counselors reported by respondents was 8.1 years (SD = 7.1 years).
An invitation to participate in the study was sent via email to the 12,953 ASCA members. The email included information about how to participate, described informed consent, and provided a link to the online survey. A reminder email was sent 2 weeks later. The initial email invitation yielded 285 usable surveys, whereas the remaining 332 participants completed the survey after the reminder email was sent. The online survey was a multipage survey, which contained a page with informed consent, a page with a modified version of the SCPCS, and a page with demographic questions.
The SCPCS was modified slightly for use in the 2009 study, based on feedback provided by Professional School Counseling reviewers (Hatch & Chen-Hayes, 2008). The original five-point scale, which ranged from 1 to 5 with named anchors at 1 (very important), 3 (moderately important), and 5 (not important), was modified to a three-point scale using only the named anchors of the original scale.
To assess the possible impact of changing the scale on the structure, reliability, and validity of the SCPCS, several analyses were performed in a manner identical to those reported by Hatch and Chen-Hayes (2008). Specifically, replication analysis (Osborne & Fitzpatrick, 2012) of the principal components analysis (PCA) and internal consistency estimates were calculated in a manner identical to the initial SCPCS study (Hatch & Chen-Hayes, 2008). To facilitate analysis of changes in the perceived importance of the items over time, individual items were rank-ordered based on observed mean scores. The use of rank-ordered lists of the 2002 and 2009 data facilitates comparison between the two administrations in spite of the change in the scale, and allows conclusions to be drawn about the perceived importance of each item relative to all other SCPCS items.
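The rank-ordering procedure described above can be sketched as follows. This is a hypothetical illustration, not the study's actual analysis code; the item labels and mean scores are invented. On the SCPCS coding, 1 = very important, so lower means rank higher (rank 1 = most important).

```python
# Sketch of the rank-order comparison: items are ranked by mean perceived
# importance within each administration, then ranks are compared across years.
# Item labels and means are hypothetical.

def rank_by_mean(item_means):
    """Return {item: rank} with rank 1 for the lowest (most important) mean."""
    ordered = sorted(item_means, key=item_means.get)
    return {item: rank for rank, item in enumerate(ordered, start=1)}

def rank_shifts(means_earlier, means_later):
    """Positive values = item moved toward 'very important' between surveys."""
    r1, r2 = rank_by_mean(means_earlier), rank_by_mean(means_later)
    return {item: r1[item] - r2[item] for item in r1}
```

Because only ranks are compared, the procedure is insensitive to the change from a five-point to a three-point response scale between administrations.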
To compare the component structure derived from the 2002 SCPCS data with the 2009 SCPCS data and the relationship of subscales to the instrument as a whole, a replication PCA was performed. A forced four-component PCA with a Promax rotation and Kaiser normalization revealed that 18 of the 19 SCPCS items loaded on the same factors in 2009 as in 2002. The exception was the item not retained in the PCA of the 2002 data (Item 19), which loaded on the Administrator Support component. Squared differences in the factor loadings ranged from <.001 to .031. The similarity of the factor structure and small differences in item factor loadings indicate that the original PCA was replicated with our sample (Osborne & Fitzpatrick, 2012). Factor loadings for the 2009 data and squared differences of the factor loadings from 2002 are provided in Table 1. The observed component saturation and sample size exceed the recommendations of Guadagnoli and Velicer (1988) and indicate that the component patterns in the sample are stable with respect to the parameters of the larger population.
The eigenvalues of three of the four factors in the 2009 data were greater than 1.00, and the amount of variance each factor explained differed from the 2002 data. The eigenvalues for each factor and the amount of variance explained were 9.94 (Use of Data for Accountability, 52.60%), 1.35 (Use of Data for Program Planning, 7.10%), 1.14 (Administrator Support, 5.99%), and 0.89 (Mission, Goals, and Competencies, 4.71%). The majority of the variance in the 2002 data was explained by the Use of Data for Program Planning factor (43.50%), whereas the majority of the variance in the 2009 data was explained by the Use of Data for Accountability factor (52.60%).
Internal consistency estimates were calculated for each of the four subscales using Cronbach’s (1951) alpha, and are presented with the internal consistency estimates obtained from the 2002 data in parentheses: Use of Data for Program Planning = .87 (.82), Use of Data for Accountability = .91 (.80), Administrator Support = .85 (.78), and Mission, Goals, and Competencies = .84 (.86). All subscales evidenced acceptable reliability characteristics using the commonly accepted .7 criterion provided by Nunnally (1978).
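Cronbach's (1951) alpha, the internal consistency estimate reported above, can be computed from item-level responses as in the sketch below. The data and function name are illustrative assumptions, not the study's analysis; the formula itself is the standard one.

```python
# Cronbach's alpha for a subscale: rows = items, columns = respondents.
# Alpha = (k / (k - 1)) * (1 - sum of item variances / variance of totals).

def cronbach_alpha(items):
    """items: list of equal-length lists, one per scale item.
    Returns Cronbach's alpha for the subscale."""
    k = len(items)        # number of items in the subscale
    n = len(items[0])     # number of respondents

    def variance(xs):     # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(variance(item) for item in items)
    totals = [sum(item[j] for item in items) for j in range(n)]
    return (k / (k - 1)) * (1 - item_vars / variance(totals))
```

When all items covary perfectly, alpha reaches 1.0; values above the .7 criterion (Nunnally, 1978) are conventionally treated as acceptable.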
To assess the perceived importance of each individual item in a comprehensive school counseling program, descriptive statistics (mean and standard deviation) were calculated. The SCPCS item means and standard deviations are presented in rank order in Table 2, along with the rank of the items based on the 2002 data. The three items revealing the greatest rank order shifts in a positive direction (toward very important) were all items about the use of data. The item use of data to measure the outcome results of the school counseling program, which ranked 13th in 2002, was ranked 6th on the 2009 survey. The item utilize various student data to identify gaps was ranked 17th in 2002, and was ranked 10th in 2009. The item use of data to demonstrate the impact of the school counseling program on student success in school was ranked 10th in 2002 and 5th in 2009.
Subscale means were computed and subjected to a repeated measures ANOVA, which was statistically significant, F(3, 606) = 27.78, p < .001, η2 = .12, a small effect size. Post hoc tests with a Bonferroni correction revealed that the observed differences among all possible pairwise comparisons of the Mission, Goals, and Competencies (M = 1.32, SD = .41), Use of Data for Program Planning (M = 1.45, SD = .46), and Use of Data for Accountability (M = 1.37, SD = .46) subscales were significant at the p < .001 level. The observed difference between the Administrator Support (M = 1.41, SD = .44) and Use of Data for Program Planning subscales was significant (p = .033), but the difference between Administrator Support and Use of Data for Accountability did not reach statistical significance (p = .074).
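The Bonferroni logic behind these post hoc tests can be illustrated as follows. With four subscales there are six pairwise comparisons, so the nominal alpha is divided by the number of tests (equivalently, many statistics packages multiply the raw p values by the number of tests). The p values in the test case are hypothetical, not those reported in the study.

```python
# A minimal sketch of the Bonferroni correction for pairwise post hoc tests:
# the family-wise alpha is divided by the number of comparisons performed.

from itertools import combinations

def bonferroni_significant(p_values, alpha=0.05):
    """p_values: {comparison label: raw p value}. Returns the labels that
    remain significant after dividing alpha by the number of comparisons."""
    adjusted_alpha = alpha / len(p_values)
    return {label for label, p in p_values.items() if p < adjusted_alpha}

# Four subscales yield six pairwise comparisons
subscales = ["Mission/Goals", "Program Planning", "Accountability", "Admin Support"]
pairs = list(combinations(subscales, 2))
```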
Although Hatch and Chen-Hayes (2008) noted statistically significant differences in subscale scores between groups based on level of practice (e.g., elementary, middle), the effect sizes were very small; all observed η2 values were less than .01 and do not hold practical significance (Sink & Stroh, 2006). Therefore, reporting of the means and standard deviations by level is not provided for this study.
The first purpose of this study was to reassess the factor structure, reliability, and validity of the SCPCS (Hatch & Chen-Hayes, 2008) instrument. Overall, despite the differences between the two studies (the change in response scale and the shift to online administration), our analyses revealed four distinct and internally consistent factors evidencing construct validity. The factor structure of the SCPCS remained intact in the second study. Although the amount of variance each factor explained differed from the 2002 data, differences in factor loadings and internal consistency estimates exceeded acceptable requirements for the scale and each of the four subscales. Thus, the four subscales were found to be consistent and reliable.
Of particular interest is the notable shift from the Use of Data for Program Planning factor accounting for the majority of the variance in 2002 to the Use of Data for Accountability factor accounting for the majority in 2009. Administrator Support and Mission, Goals, and Competencies loaded in the same order (third and fourth, respectively) in both years. Consistent with the 2002 findings, the Use of Data for Program Planning factor “included five items related to using data to target interventions and identify program foci” (Hatch & Chen-Hayes, 2008, p. 38). The Use of Data for Accountability factor included items relating to monitoring program implementation and measuring and reporting outcomes. One explanation for these findings may be that the introduction of the ASCA National Model, with its focus on accountability, contributed to a shift in school counselor beliefs from prioritizing the act of reviewing data as a means to design the program to prioritizing the importance of being held accountable for measuring the impact of program activities and communicating results to school counseling program stakeholders. This shift aligns with persistent requests for school counselors to do more than use data to design and prepare program activities (Dahir & Stone, 2009; Gysbers, 2004; Sink, 2009). Progressive and evolutionary behaviors for school counselors have been encouraged through calls for purposeful program improvement models designed to improve efficiency and effectiveness and to promote school counselors’ value as worth the cost and resources (Dahir & Stone, 2003; Dimmitt, Carey, & Hatch, 2007; Hatch, 2008). Future research utilizing a confirmatory factor analysis approach is needed to further assess the SCPCS factor structure and to determine whether wording, order of items on the survey, or other factors influence it.
Rank Order Findings
The second purpose of this study was to compare the baseline data, in rank order form, on school counselor beliefs about specific components of the ASCA National Model collected in 2002, before its release (Hatch & Chen-Hayes, 2008), with data collected 6 years after its release (in 2009), to determine the extent to which school counselor beliefs have shifted since the introduction of the ASCA National Model. The discussion begins with top rank order findings, followed by the largest positive rank shifts, the largest negative rank shifts, and finally an item that did not shift.
The top ranking item on both the 2002 and 2009 surveys was developing goals for the program. This finding aligns with calls in the last few decades for business, education, and the school counseling profession to prioritize goal setting (Campbell & Dahir, 1997; Dahir, Sheldon, & Valiga, 1998; Doran, 1981; Haycock, 2001; Marzano, 2010; Meyer, 2003). Setting goals within the school counseling program is a required component of the RAMP (ASCA, 2015b), and SMART goals are included in ASCA’s Making Data Work (Young & Kaffenberger, 2009). Finally, setting program goals has now been added as a new topic in the foundation section of the ASCA (2012a) National Model. “Although the second edition of the ASCA National Model encouraged goal setting through action plans, the third edition increases the focus of goal setting through the use of data” (ASCA, 2012a, p. 129). The item writing a mission statement or philosophy, also a topic in the foundation section, ranked second in 2009 (up four from sixth position in 2002). Results of the 2009 survey reveal that the top six ranked items are markedly consistent with the main tenets of an ASCA (2012a) National Model School Counseling Program.
Develop goals for the counseling program
Write a mission statement or philosophy
Utilize schoolwide and student data to design new counseling activities
Identify specific student competencies to which the school counseling program, curriculum, or activities contribute or align
Use data to demonstrate the impact of the school counseling program on student success in school
Use data to measure the outcome results of the school counseling program
Greatest Positive Rank Order Shifts
The findings in this study suggest school counselors have begun to prioritize the value, importance, and necessity of using data in their school counseling programs. All three items with the greatest rank order shifts in the positive direction (toward very important) were items focusing on the use of data. Two items, utilize various student data to identify gaps and use data to measure the outcome results of the school counseling program, each moved up seven rankings. The item use data to demonstrate the impact of the school counseling program on student success in school moved up five rankings. These results provide positive feedback for ASCA and national leaders in school counseling who have lamented the paucity of outcome research and the lack of school counselor engagement in data-driven activities and accountability (Dimmitt et al., 2007; Poynton & Carey, 2006; Stone & Dahir, 2006; Whiston & Sexton, 1998).
Encouraging counselor educators to teach using data to identify and eliminate achievement gaps and to measure the impact of interventions has been the focus of the TSCI within the Education Trust Foundation since 1996 (House & Martin, 1998). Promoting skills of using data, teamwork, collaboration, technology, advocacy, and leadership were essential in the Education Trust’s recommendations for preparing school counselors (House & Martin, 1998). The ASCA National Model incorporated these same essential tenets in its text, themes, and in the Closing the Gap Action Plan (ASCA, 2003, 2005, 2012a).
Current and former national reform efforts by The College Board’s National Office for School Counselor Advocacy (NOSCA), TSCI, and the National Association of College Admissions Counselors (NACAC) promote the urgency and necessity of school counselors using data to ensure all students are college and career ready upon graduating from high school (Hatch, 2012; Hatch & Bardwell, 2012; Hines & Lemons, 2011). The newest edition of the ASCA National Model contains its strongest language to date regarding the use of data, which is mentioned 124 times in the 161-page document (ASCA, 2012a). The newly revised model incorporates language from the most recent revision of the ethical guidelines (ASCA, 2010) stressing the responsibility of school counselors to use equity-based data to identify, address, and resolve attainment, achievement, and opportunity gaps. The term “gap[s]” is mentioned 60 times. Finally, the ASCA Model’s focus on results (124 mentions) and accountability (48 mentions) is irrefutable. School counselors who responded to the current survey are members of ASCA who have received a firm directive from the professional association to answer the following question: How are students different as a result of what school counselors do? Future research that includes non-ASCA members is needed to understand more about the beliefs non-ASCA member school counselors may hold regarding data for planning and accountability.
Greatest Negative Rank Order Shifts
The two largest rank order shifts in the negative direction (toward not important) were for items related to non-counseling activities. In the findings from the survey administered in 2002, delineating specific counseling and non-counseling activities ranked 8th, while adhering to non-counseling activities ranked 12th (Hatch & Chen-Hayes, 2008). Results of the survey administered in 2009 reveal these two items’ rankings moved down to 14th and 18th, respectively. It is evident that school counselors surveyed in 2009 prioritized the delineation of, and adherence to, non-school counselor activities far less than those surveyed in 2002. As school counselors have shifted their beliefs from prioritizing the use of data for planning to the use of data for accountability, they are also reporting less concern about the need for delineation and adherence to non-counseling duties. There may be a relationship between school counselor beliefs and behaviors regarding accountability and non-school counselor activities. Future research is needed to discover the relationship between beliefs and behaviors and to determine whether school counselors who produce student and program results are less likely to be asked to perform non-school counselor activities (Hatch, 2008).
In the 2002 survey, the item consulting more with administration regarding improvement of the school counseling program ranked third (Hatch & Chen-Hayes, 2008). In the 2009 survey, this item shifted down four rankings to seventh. A possible reason for this shift may be that school counselors are more competent and confident with regard to improving their school counseling programs and thus may believe there is less need to consult with administrators. Or, perhaps school counselors have already begun to consult more regularly with administrators, and thus do not necessarily need to consult even “more” (a possible and unintentional interpretation of the term “more” on the part of the respondents). NOSCA, the National Association of Secondary School Principals (NASSP), and ASCA have worked together to promote the importance of the counselor–principal relationship. This collaboration has resulted in the development of a toolkit designed to improve the relationship between school counselors and administrators and has underscored the belief that an effective relationship is necessary in promoting student achievement (The College Board, 2011). Future research is needed to better understand the differences between school counselors who prioritize accountability and those who prioritize the importance of principal–counselor consultation and collaboration.
Minimal Rank Order Shift
One item that experienced only a minimal shift from the previous survey, presenting results of the school counseling program to a formal body, committee, or advisory board, warrants attention due to its consistently low ranking on both surveys in relationship to other items (16th in 2002 and 15th in 2009). Since the introduction of the ASCA National Standards (Campbell & Dahir, 1997) and continuing in the ASCA National Model (ASCA, 2003, 2005, 2012a), strong and consistent messages have been sent for school counselors to share their results with school counseling program stakeholders (Dimmitt et al., 2007; Hatch, 2008). School districts make decisions on the employment of school counselors based on the results they produce (Fladager, 2012). Further research is needed to understand factors contributing to the lack of perceived importance given to this item in relationship to other items.
Finally, little research exists on the demographics of ASCA members. Therefore, it warrants discussion that 83% of the respondents in both surveys identified as female. The sample in 2009 was slightly more diverse (86% White compared with 92% in 2002). The largest increase in respondents identifying as non-White was the percentage of Hispanic/Latino respondents, which doubled from 2% in 2002 to 4% in 2009. African American respondents increased minimally from 3% to 4%. In 2012, the College Board released the results of a national survey containing a “Profile of America’s School Counselors” in which the respondents identified themselves as 78% female, 10% African American, and 15% Hispanic or Latino (Bruce & Bridgeland, 2012, p. 79). The College Board study surveyed only secondary school counselors, whereas our study included counselors from all grade levels, which may be one possible explanation of the observed race/ethnicity and gender differences between the samples. Given this discrepancy, future research is needed to compare school counselors’ race/ethnicity and gender with membership in professional associations, as associations frequently provide information through email, newsletters, and conferences that is likely to influence school counselor beliefs.
Implications and Recommendations
Rank ordered shifts found in this study and recent research suggest school counselors are increasingly willing to collect data to create comprehensive school counseling programs (Wilkerson et al., 2013). Professional school counselors are encouraged to participate in collegial discussions at their school sites and within their professional community on the importance and value of analyzing and presenting results data to strengthen their school counseling program. Counselor educators are encouraged to revise their pre-service training to ensure graduates are prepared to use data in schools and to continue researching the impact of implementing data-driven school counseling programs, which continue to yield positive results on student achievement (Wilkerson et al., 2013).
Multiple publications are available to teach school counselors how to collect and analyze data, create measurable action plans, and analyze and utilize results for program improvement (e.g., Hatch, 2014; Young & Kaffenberger, 2013). All of this work is central to pre-service training programs and the professional development of today’s school counselor. Sharing program results with school administrators, school district officials, and other school stakeholders is the essential next step to garnering the political and organizational support necessary for sustaining and promoting school counseling programs (Hatch, 2008; Sink, 2009).
Limitations and Conclusion
Although a focus of the current study was on the comparison of findings from data obtained in 2009 with data obtained in 2002, the data are not longitudinal, as the samples were not related to each other. Similar limitations to those reported by Hatch and Chen-Hayes (2008) regarding the data collected in 2002 also apply to the data collected in 2009. Namely, the SCPCS assesses self-reported beliefs, not actual behaviors; and, the sample is likely representative of only ASCA members, not school counselors in general.
The method and SCPCS instrument used to collect data in 2009 differed from those used in 2002. Specifically, email was used to recruit participants and facilitate participation in the study in 2009, whereas recruitment and participation in 2002 were completed using postal mail and paper surveys. The online tools utilized for recruitment and participation yielded a relatively low response rate and did not account for invalid or undeliverable email addresses. If events leading to failed email delivery had been effectively documented, the response rate would likely have improved slightly; more important, it would also have been more accurate. A problem unique to studies using email-based recruitment is that there currently is no effective way to determine how many invitations are actually received by potential respondents. For example, because the emailed message was from an unfamiliar source and contained a hyperlink to the online survey, it may have been construed as “junk email” by the school district’s server or the individual’s email service provider.
The generalizability of our findings is inherently affected by the limitations noted above. Although our sample was similar to the sample in the original study, it consisted of only ASCA members. Our response rate was quite low; although we believe the reported response rate underestimates the actual response rate among those who received our invitation to participate in this study, self-selection bias likely still exists and influenced our findings. Finally, our sample was lacking in gender and racial diversity. Although our sample evidenced more racial diversity than the original 2002 study, we are unsure of how, specifically, our sample compares with the demographic makeup of ASCA members and the larger school counseling community.
The results of this study provide significant and valuable feedback on the shifts in school counselors’ beliefs since the 2003 introduction of the ASCA National Model. The Third Edition of the ASCA (2012a) National Model contains additional important changes and a stronger focus on the use of data. Further research is needed to measure the impact of the new edition on the beliefs and behaviors of practicing school counselors.
Declaration of Conflicting Interests The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding The author(s) received no financial support for the research and/or authorship of this article.
© The Author(s) 2015
This article is distributed under the terms of the Creative Commons Attribution 3.0 License (http://www.creativecommons.org/licenses/by/3.0/) which permits any use, reproduction and distribution of the work without further permission provided the original work is attributed as specified on the SAGE and Open Access page (http://www.uk.sagepub.com/aboutus/openaccess.htm).
Trish Hatch is the director of the Center for Excellence in School Counseling and Leadership (CESCaL), and associate professor and the director of the School Counseling Program at San Diego State University. She is the author of The Use of Data in School Counseling: Hatching Results for Students, Programs and the Profession (2014); co-author of the ASCA National Model: A Framework for School Counseling Programs (ASCA, 2003, 2005); and co-author of Evidence-Based Practice in School Counseling: Making a Difference With Data-Driven Practices (Dimmit, Carey, & Hatch, 2007). She is a national consultant and trainer on the use of data in school counseling to create efficient and effective evidence-based school counseling programs that align with the ASCA National Model.
Timothy A. Poynton is an associate professor and graduate program director of the School Counseling Program at the University of Massachusetts Boston. His research interests are in the realm of issues affecting K-12 students and the school counselors who serve them, most recently by examining the postsecondary transitions of graduating high school seniors focusing on how in-school experiences and attitudes relate to postsecondary outcomes. Additional research and professional interests include examining instruments designed to measure the attitudes and beliefs of school counselors, technology applications, and issues around the use of online surveys.
Rachelle Pérusse is an associate professor in the Counseling Program in the Department of Educational Psychology at the University of Connecticut. Before becoming a school counselor educator, she worked as a high school counselor. As a MetLife Fellow for the Transforming School Counseling Initiative with The Education Trust, Inc., she conducts consultations with school districts about the role of school counselors and school administrators in relation to closing the achievement gap and advocating for students of color and students from low-income families. She has published several articles about national trends in school counselor education, and has co-edited two books: Leadership, Advocacy, and Direct Service Strategies for Professional School Counselors and Critical Incidents in Group Counseling.