Brief Report

Availability of Assertive Community Treatment in the United States: 2010 to 2016

Published Online: https://doi.org/10.1176/appi.ps.201900032

Abstract

Objective:

The study examined change in availability of assertive community treatment (ACT) and associated services over time.

Methods:

Change over time in the availability of facilities in the United States offering ACT and its associated services was examined by using 2010 and 2016 data from the National Mental Health Services Survey.

Results:

The proportion of facilities that self-reported provision of ACT and its associated services declined between 2010 and 2016 (odds ratio [OR]=0.73, 95% confidence interval [CI]=0.63–0.86, p<0.001). Although a higher proportion of facilities that provided ACT reported offering all the required services in 2016 (OR=1.31, 95% CI=1.04–1.66, p=0.026) compared with 2010, this proportion accounted for less than 20% of the programs. Compared with 2010, in 2016 increases were observed in peer (OR=1.72, 95% CI=1.38–2.13, p<0.001) and co-occurring disorders services (OR=1.23, 95% CI=1.08–1.42, p=0.004) as well as in secondary services, such as tobacco cessation (OR=4.53, 95% CI=3.51–5.84, p<0.001) and telemedicine (OR=2.08, 95% CI=1.67–2.57, p<0.001). Continuous education for staff was required at more facilities with ACT in 2016 compared with 2010.

Conclusions:

Although the proportion of facilities with ACT that offer all the required core services has increased in recent years, such programs remain a minority, and the overall number of facilities with ACT has declined.

HIGHLIGHTS

  • From 2010 to 2016, the availability of assertive community treatment (ACT) programs decreased in the United States.

  • Compared with 2010, more ACT programs offered all the required core services in 2016.

  • State fidelity monitoring in 2015 accounted for only about 5% of the temporal variance in provision of the required core ACT services.

Assertive community treatment (ACT), an evidence-based treatment model for individuals diagnosed as having severe mental disorders (1, 2), has been linked to shorter hospital stays as well as improved quality of life, medication adherence, treatment retention, and patient satisfaction (1, 2). ACT programs with higher fidelity to the ACT model, typically measured with the Dartmouth ACT Scale (DACTS) (1, 3), have been shown to be more effective and to have better patient outcomes (4).

While researchers have examined the clinical efficacy and outcomes of ACT programs (2) and have studied the national distribution and characteristics of facilities with ACT (5), less is known about temporal trends in ACT dissemination, fidelity to the model, and service offerings. To address this research gap, we analyzed results from the 2010 and 2016 waves of the National Mental Health Services Survey (N-MHSS). We explored changes in prevalence of ACT programs over time and changes in provision of core ACT services thought to be critical to overall program fidelity as well as secondary services important in addressing the unique needs of individuals with severe mental illness. We also examined the association of state fidelity monitoring with changes in provision of ACT services and in core services.

Methods

Data on facilities offering ACT and the services these programs provided were collected by the Substance Abuse and Mental Health Services Administration (SAMHSA) as part of the N-MHSS, a national survey of all known mental health treatment facilities in the United States, including the District of Columbia (6). Of the 12,186 eligible facilities in 2010, 11,118 (91.2%) completed the survey and, of these, 9,041 (81% of facilities that completed the survey) reported on provision of ACT. In 2016, of the 13,983 eligible facilities, 12,745 (91.1%) completed the survey and, of these, 12,169 (95.5% of facilities that completed the survey) reported on provision of ACT. For 2016, missing values were imputed by SAMHSA investigators using 2015 data. Because no N-MHSS data existed prior to 2010 that could be used in the same way, we used multiple imputation (11), generating 20 imputed data sets, to replace the remaining missing data for this study. Complete case analysis did not differ substantially from the multiple-imputation analysis reported here.
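For readers who want to see the imputation step concretely, the sketch below illustrates multiple imputation with 20 completed data sets and Rubin's-rules pooling. It is a minimal Python/statsmodels sketch, not the authors' SPSS workflow; the column names ("offers_act", "year_2016") are hypothetical stand-ins for the N-MHSS variables, and for brevity it pools an ordinary logistic regression rather than the clustering-adjusted GEE models sketched later in the Methods.

```python
# Illustrative sketch only (not the authors' SPSS workflow): impute missing
# facility responses m=20 times and pool the 2016-vs-2010 effect with Rubin's rules.
# Columns "offers_act" (0/1) and "year_2016" (0=2010, 1=2016) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.imputation.mice import MICEData

def pooled_year_or(df: pd.DataFrame, m: int = 20):
    imp = MICEData(df)                        # chained equations; PMM keeps 0/1 values binary
    log_ors, variances = [], []
    for _ in range(m):
        imp.update_all()                      # one full imputation cycle
        completed = imp.data                  # a completed (imputed) data set
        X = sm.add_constant(completed[["year_2016"]])
        fit = sm.GLM(completed["offers_act"], X,
                     family=sm.families.Binomial()).fit()
        log_ors.append(fit.params["year_2016"])
        variances.append(fit.bse["year_2016"] ** 2)
    q = np.mean(log_ors)                      # pooled log odds ratio
    # Rubin's rules: within-imputation variance plus inflated between-imputation variance
    t = np.mean(variances) + (1 + 1 / m) * np.var(log_ors, ddof=1)
    half = 1.96 * np.sqrt(t)
    return np.exp(q), (np.exp(q - half), np.exp(q + half))
```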

In the N-MHSS survey, ACT was defined as “a multi-disciplinary clinical team approach, [which] helps those with serious mental illness live in the community by providing 24-hour intensive community services in the individual's natural setting” (6). As in a previous study (5), all facilities that reported offering ACT were included. Services examined at facilities reporting provision of ACT included core services approximating those in the DACTS and secondary services addressing the complex needs of individuals with serious mental illness (7–9); these services were defined as in prior research (5). Newly included were facility quality improvement initiatives, continuous education requirements for staff, and discharge outcomes follow-up, which were used to assess commitment to quality among facilities reporting provision of ACT.
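As one concrete illustration, the "all core services" indicator reported in Table 1 corresponds to a facility offering every core service simultaneously. The sketch below shows how such a flag could be derived, assuming hypothetical 0/1 columns named after the core services grouped in Table 1; the actual N-MHSS variable names differ.

```python
# Sketch: derive an "all core services" flag from facility-level service indicators.
# The column names are hypothetical stand-ins for the items listed under
# "Core services" in Table 1.
import pandas as pd

CORE_SERVICES = [
    "peer_services", "case_management", "employment", "housing",
    "co_occurring_disorders", "group_therapy", "medication",
    "individual_therapy",
]

def flag_all_core_services(act_facilities: pd.DataFrame) -> pd.Series:
    """Return 1 if a facility with ACT reports offering every core service, else 0."""
    return act_facilities[CORE_SERVICES].all(axis=1).astype(int)
```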

Data on state fidelity monitoring were collected as part of SAMHSA’s Uniform Reporting System (URS), which includes data on evidence-based practices and fidelity measurements (9).

Because both the N-MHSS and the URS are public datasets that do not identify individuals, no institutional review board approval was required for the current study.

We conducted the analyses in two stages, using generalized estimating equations (10) and linking data by state to account for possible clustering of facilities within states. Only services included in both the 2010 and 2016 survey data were used.

First, facilities self-reporting ACT were compared between 2010 and 2016 to assess the number of programs as well as core and secondary service offerings.

Second, we added data from SAMHSA’s URS to our models to examine the association of state fidelity monitoring with changes in the proportion of facilities offering ACT and the proportion of facilities with ACT offering core and secondary services.
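A minimal sketch of these two stages is shown below, assuming a facility-level data set with hypothetical columns "offers_act", "all_core_services", "year_2016", and "state", plus a state-level URS indicator "fidelity_monitoring_2015" merged in. The specification, including the year-by-monitoring interaction in stage 2, is our reading of the description above rather than the authors' SPSS syntax.

```python
# Hedged sketch of the two-stage GEE analysis, using an exchangeable working
# correlation to account for clustering of facilities within states.
# All column names are hypothetical.
import numpy as np
import statsmodels.api as sm
import statsmodels.formula.api as smf

def stage_one(facilities):
    """Change from 2010 to 2016 in the odds of a facility reporting ACT."""
    res = smf.gee("offers_act ~ year_2016", groups="state", data=facilities,
                  family=sm.families.Binomial(),
                  cov_struct=sm.cov_struct.Exchangeable()).fit()
    return np.exp(res.params), np.exp(res.conf_int())   # ORs and 95% CIs

def stage_two(act_facilities):
    """Among facilities with ACT: is state fidelity monitoring associated with
    the change over time in offering all core services? (One plausible spec.)"""
    res = smf.gee("all_core_services ~ year_2016 * fidelity_monitoring_2015",
                  groups="state", data=act_facilities,
                  family=sm.families.Binomial(),
                  cov_struct=sm.cov_struct.Exchangeable()).fit()
    return np.exp(res.params), res.pvalues
```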

We conducted all analyses by using SPSS, version 23, software.

Results

ACT was offered by a smaller proportion of facilities in 2016 than in 2010 (OR=0.73, 95% CI=0.63–0.86, p<0.001) (Table 1); however, in 2016 a higher proportion of facilities with ACT reported offering all the ACT core services (OR=1.31, p=0.026). The increase was especially pronounced for peer services (OR=1.72, p<0.001) and for integrated co-occurring disorders services (OR=1.23, p=0.004) (Table 1). Fewer facilities offered case management services in 2016 than in 2010 (OR=0.44, p<0.001).

TABLE 1. Change over time in the proportion of mental health facilities reporting ACT services^a

| Service | 2010 (N=10,341): N | % | 2016 (N=12,072): N | % | OR | 95% CI | p |
| --- | --- | --- | --- | --- | --- | --- | --- |
| ACT | 1,847 | 17.9 | 1,657 | 13.7 | .73 | .63–.86 | <.001 |
| Core services^b | | | | | | | |
| All core services | 260 | 14.1 | 293 | 17.7 | 1.31 | 1.04–1.66 | .026 |
| Peer services | 642 | 34.7 | 791 | 47.7 | 1.72 | 1.38–2.13 | <.001 |
| Case management | 1,687 | 91.3 | 1,365 | 82.4 | .44 | .33–.58 | <.001 |
| Employment | 926 | 50.1 | 889 | 53.7 | 1.15 | .95–1.40 | .153 |
| Housing | 786 | 42.6 | 689 | 41.6 | .94 | .77–1.15 | .547 |
| Co-occurring disorders | 1,308 | 70.8 | 1,243 | 75.0 | 1.23 | 1.08–1.42 | .004 |
| Group therapy | 1,632 | 88.4 | 1,469 | 88.7 | 1.03 | .83–1.28 | .762 |
| Medication | 1,669 | 90.4 | 1,489 | 90.0 | .95 | .64–1.41 | .793 |
| Individual therapy | 1,749 | 94.7 | 1,596 | 96.3 | 1.46 | .88–2.43 | .144 |
| Secondary services | | | | | | | |
| Suicide prevention | 1,350 | 73.1 | 1,173 | 70.8 | .89 | .73–1.09 | .264 |
| Tobacco cessation | 556 | 30.1 | 1,096 | 66.1 | 4.53 | 3.51–5.84 | <.001 |
| Family education | 1,310 | 71.0 | 1,132 | 68.3 | .90 | .77–1.05 | .208 |
| Education | 1,007 | 54.5 | 657 | 39.6 | .55 | .46–.65 | <.001 |
| Illness management and recovery | 933 | 50.5 | 779 | 47.0 | .87 | .64–1.20 | .393 |
| Telemedicine | 487 | 26.4 | 709 | 42.8 | 2.08 | 1.67–2.57 | <.001 |
| Quality review | 1,810 | 98.0 | 1,628 | 98.2 | 1.14 | .68–1.90 | .610 |
| Continuous education required | 1,678 | 91.0 | 1,588 | 95.8 | 2.32 | 1.42–3.78 | .001 |
| Discharge outcome | 1,148 | 62.2 | 1,060 | 64.0 | 1.08 | .87–1.34 | .468 |

^a Source: National Mental Health Services Survey. 2010 is the reference group. Core and secondary services are reported only for facilities that reported providing ACT.

^b As defined in the Dartmouth Assertive Community Treatment Fidelity Scale and approximated by using only services present in both the 2010 and 2016 waves of data.
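As a point of reference, the unadjusted odds ratio in the ACT row of Table 1 can be reproduced directly from the cell counts, as in the short sketch below. The crude Wald interval it yields (roughly 0.68–0.79) is narrower than the published 0.63–0.86 because the reported estimates come from GEE models fit to imputed data that account for clustering of facilities within states.

```python
# Reproducing the unadjusted OR for the ACT row of Table 1 from its counts.
import math

act_2010, total_2010 = 1847, 10341
act_2016, total_2016 = 1657, 12072

odds_2010 = act_2010 / (total_2010 - act_2010)
odds_2016 = act_2016 / (total_2016 - act_2016)
or_unadjusted = odds_2016 / odds_2010          # ~0.73, matching Table 1

# Crude Wald 95% CI on the log scale (no clustering or imputation adjustment).
se_log_or = math.sqrt(1 / act_2016 + 1 / (total_2016 - act_2016)
                      + 1 / act_2010 + 1 / (total_2010 - act_2010))
low = math.exp(math.log(or_unadjusted) - 1.96 * se_log_or)   # ~0.68
high = math.exp(math.log(or_unadjusted) + 1.96 * se_log_or)  # ~0.79
print(round(or_unadjusted, 2), round(low, 2), round(high, 2))
```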


Regarding secondary services, tobacco cessation (OR=4.53, p<0.001) and telemedicine (OR=2.08, p<0.001) services were more commonly offered in 2016 than in 2010, as was requiring ongoing training for staff (OR=2.32, p=0.001); education services for patients were offered less commonly in 2016 (OR=0.55, p<0.001).

We also tested the interaction terms for ACT and year in the analyses of individual services. The interaction term was significant for provision of integrated co-occurring disorders services and was significant at a trend level for peer services, suggesting a larger increase over time in these services among facilities offering ACT compared with facilities not offering ACT. Interaction terms in analyses of provision of tobacco cessation services, telemedicine, and staff training were not significant, suggesting that the increase in these services at facilities offering ACT was in line with increases in these services at mental health facilities not offering ACT.
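The sketch below shows one way such an interaction model could be specified across all facilities (with and without ACT). It is a hedged illustration with hypothetical column names, not the authors' SPSS syntax.

```python
# Testing whether the 2010-to-2016 change in offering a given service differs
# between facilities with and without ACT (year x ACT interaction).
import statsmodels.api as sm
import statsmodels.formula.api as smf

def fit_interaction(facilities, service_col="co_occurring_disorders"):
    model = smf.gee(f"{service_col} ~ year_2016 * offers_act",
                    groups="state", data=facilities,
                    family=sm.families.Binomial(),
                    cov_struct=sm.cov_struct.Exchangeable())
    result = model.fit()
    # The coefficient of interest is the interaction term "year_2016:offers_act".
    return result.params, result.pvalues
```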

The proportion of states that monitored fidelity to the ACT model was 51% (N=26 of 51) in 2010 and 59% (N=30) in 2015. Fidelity monitoring in 2015 was associated with the increase in the proportion of facilities with ACT offering all the core services in 2016 compared with 2010 (adjusted odds ratio [AOR]=1.63, 95% CI=1.29–2.07, p<0.001). Only 5.2% of the temporal variance in the proportion of facilities with ACT offering all the core services could be accounted for by state fidelity monitoring in 2015.

Discussion

We found a significant decrease in the proportion of facilities offering ACT from 2010 to 2016. This finding is of concern given previously described findings (5) demonstrating that in 2015 the national capacity to serve ACT-eligible individuals was just over 40%.

There are likely numerous underlying causes for this decrease. Maintaining funding, which has been linked to ACT dissemination, may be a challenge (12). Grants, a significant source of ACT funding (5), tend to run out, affecting long-term sustainability of the programs. Providing sliding scale or free services can also negatively affect program viability in the long term. Maintaining staffing may be difficult for ACT programs (13). Although there is some evidence that even ACT teams without full staffing may be effective (14), high staff turnover and prolonged vacancies in key positions inhibit successful team function and patient care (13). Staffing issues can directly affect program viability in states that link service provision to reimbursement.

Just as important as availability of ACT is program fidelity, which has been linked to improved outcomes (4). Using self-reported program parameters to approximate the DACTS, we found that the likelihood of offering all the required core ACT services was higher in 2016 compared with 2010. Despite this increase, less than a fifth of facilities self-reporting provision of ACT offered all the required core services. It is plausible, however, that some ACT programs provide services in a way that is not easily measured using the DACTS, for example through the use of telepsychiatry or creative partnerships.

A few explanations are possible for the increase in core services provision. Independent auditing and feedback have been shown to improve fidelity to the ACT model (15), which is consistent with our findings that state fidelity monitoring explained some of the temporal variance in provision of core services. Additionally, there may be selective attrition favoring programs that provide all the core services in states that link fidelity and funding.

Only a small proportion of the changes in provision of required services could be explained by state fidelity monitoring. Provision of the required ACT services may have been promoted by other means. For example, Veterans Affairs and military facilities are more likely to operate or provide funding for programs that provide all the required ACT services (5). Such facilities may have unique oversight requirements that promote higher fidelity to ACT.

Among the core services offered, there was a significant increase from 2010 to 2016 in the proportion of ACT programs providing peer services. Integrated co-occurring disorders services were more commonly offered by facilities with ACT in 2016 compared with 2010. The increases in these services were significantly more pronounced among facilities providing ACT compared with facilities not providing ACT.

Patients with severe mental illness often have unique and complex needs (8, 9) and may benefit from secondary services not captured by the DACTS. An example would be tobacco cessation services. The odds of tobacco cessation services being offered by facilities with ACT in 2016 were more than four times higher compared with the odds in 2010. Telemedicine, an alternative way to disseminate the specialized expertise of ACT programs, was also more commonly offered in 2016. Although the findings regarding secondary services generally reflected trends among all facilities offering mental health services, they may be particularly relevant to ACT clients.

We used N-MHSS data from 2010 and 2016 to examine quality improvement initiatives unrelated to ACT fidelity. Our results show that the likelihood of facilities with ACT requiring continuous education for staff in 2016 was more than double that of 2010, mirroring a trend among all mental health treatment facilities. Although the proportion of facilities with ACT that conducted quality review of cases did not change between the two time points, more than 98% of facilities with ACT at both time points implemented such practices. Additionally, more than 60% of facilities providing ACT conducted some form of outcome follow-up at both time points, with no significant change noted over time.

This study’s findings are limited by the following factors. First, the unit of analysis was the facility, allowing only indirect examination of individual programs. Some facilities may have housed other programs along with ACT, so the true fidelity of ACT programs may be lower than reported here. Second, because DACTS scores were not available, we approximated fidelity by using the services that facilities with ACT reported offering. Finally, the definition of ACT in the N-MHSS was broader than the classical definition of ACT, possibly limiting the generalizability of our findings to classical ACT programs and overestimating the number of facilities offering ACT.

Conclusions

By examining data from facilities with ACT from 2010 to 2016, this study demonstrated that the national availability of self-identified ACT programs has declined, likely decreasing national capacity to provide ACT-eligible individuals with this evidence-based service. The results show an increase in the proportion of facilities with ACT that offer the full array of required services; the overall proportion of facilities offering all the core services, however, remains low. State fidelity monitoring has contributed to increased fidelity, although it explained only a small proportion of the temporal variance in fidelity, suggesting that other factors are also at work. Facilities offering ACT continue to provide important secondary services to their clients and have maintained or increased their focus on measurable quality markers not directly tied to program fidelity. Our findings call for greater attention to ACT dissemination to prevent further decline in the national capacity to serve a vulnerable patient population, as well as for continued efforts to improve program fidelity to the ACT model. Further examination of ACT programs is warranted to better understand trends in the provision of this care.

Johns Hopkins University School of Medicine (Spivak, Cullen, Mojtabai); Johns Hopkins University Bloomberg School of Public Health (Cullen, Mojtabai); Johns Hopkins Medical Systems (Green, Firth, Sater).
Send correspondence to Dr. Spivak ().

The authors report no financial relationships with commercial interests.

References

1 Rosen A, Mueser KT, Teesson M: Assertive community treatment—issues from scientific and clinical literature with implications for practice. J Rehabil Res Dev 2007; 44:813–825

2 Dixon L: Assertive community treatment: twenty-five years of gold. Psychiatr Serv 2000; 51:759–765

3 Assertive Community Treatment: Evaluating Your Program. DHHS pub no SMA–08–4344. Rockville, MD, US Department of Health and Human Services, Substance Abuse and Mental Health Services Administration, Center for Mental Health Services, 2008

4 McGrew JH, Bond GR, Dietzen L, et al.: Measuring the fidelity of implementation of a mental health program model. J Consult Clin Psychol 1994; 62:670–678

5 Spivak S, Mojtabai R, Green CE, et al.: Distribution and correlates of assertive community treatment (ACT) and ACT-like programs. Psychiatr Serv 2019; 70:271–278

6 National Mental Health Services Survey (N-MHSS): 2014 Data on Mental Health Treatment Facilities. BHSIS Series S–87, HHS pub no (SMA) 16–5000. Rockville, MD, Substance Abuse and Mental Health Services Administration, 2016

7 Mojtabai R: National trends in mental health disability, 1997–2009. Am J Public Health 2011; 101:2156–2163

8 Vanderlip ER, Henwood BF, Hrouda DR, et al.: Systematic literature review of general health care interventions within programs of assertive community treatment. Psychiatr Serv 2017; 68:218–224

9 2016 Uniform Reporting System Output Tables. Rockville, MD, Substance Abuse and Mental Health Services Administration, 2016. https://www.samhsa.gov/data/report/2016-uniform-reporting-system-urs-output-tables

10 Zeger SL, Liang KY: Longitudinal data analysis for discrete and continuous outcomes. Biometrics 1986; 42:121–130

11 Rubin DB, Schenker N: Multiple imputation in health-care databases: an overview and some applications. Stat Med 1991; 10:585–598

12 Woltmann EM, Whitley R, McHugo GJ, et al.: The role of staff turnover in the implementation of evidence-based practices in mental health care. Psychiatr Serv 2008; 59:732–737

13 Burns T, Catty J, Wright C: Deconstructing home-based care for mental illness: can one identify the effective ingredients? Acta Psychiatr Scand 2006; 113:33–35

14 Teague GB, Boaz T, Kuhns M, et al.: Large-scale implementation of assertive community treatment: progress and lessons. Presented at the Annual Conference on State Mental Health Agency Services Research, Program Evaluation, and Policy, National Association of State Mental Health Program Directors, Baltimore, Feb 2002

15 Mancini AD, Moser LL, Whitley R, et al.: Assertive community treatment: facilitators and barriers to implementation in routine mental health settings. Psychiatr Serv 2009; 60:189–195