The Foreign Service Journal, May-June 2026

Planned to run for one full year, from October 2024 to September 2025, the pilot was cut short after just four months with the abrupt dismantling of USAID and the pending loss of almost all positions at the agency. Though this prevented us from evaluating the project's long-term effects, we came away with useful observations and lessons learned.

What We Built

The Legislative and Public Affairs DOC Team at USAID addressed the agency's most complex communications challenges by training and connecting DOC specialists worldwide, advising missions on communications structure, championing priorities with Washington stakeholders, and protecting USAID's brand equity.

In September 2024, in response to requests for support and tools to assist with their heavy workload, we used remaining, expiring funds to acquire a limited number of yearlong ChatGPT Team subscriptions to share with the global DOC network on a pilot basis. We notified the entire network, shared preliminary ideas for how ChatGPT could help, and asked those interested to fill out a simple online form. We provided clear rules against using the tool for anything sensitive or not cleared for public consumption, and users had to agree to those rules before receiving a seat.

A couple of months after the 82 seats were assigned, we scheduled a community of practice call to share what we had learned. The senior DOC specialist with USAID/Benin, who had demonstrated command of the tool and a strategic eye for implementation, led the discussion. She described how she had begun by uploading USAID's style guide, her country's Country Development Cooperation Strategy (a non-SBU version), and USAID's branding and marking manual into her workspace. She then "taught the system" about the role of the DOC specialist.
Finally, she shared examples of how the tool had helped her review partner-generated content submissions, synthesize talking points, simplify complex language into easy-to-understand phrases, and more.

Others shared how AI had helped them refine language to fit within character counts for social media platforms. The tool proved useful for brainstorming ideas or weighing options when facing an unfamiliar topic or task. It provided rough translations that could then be verified by human translators. Media monitoring, editing, and text refinement were other common applications. Participants saw possibilities for minimizing time-consuming tasks and creating an AI personal assistant.

A Cautious Approach

At the same time, some practitioners were vocal about their ethical and environmental concerns, a tension that ran throughout the pilot. Overall, they were cautious in their use of AI chatbots in late 2024. The tools felt too new to trust fully; comfort typically builds with use over time. Some of the caution can be attributed to the fact that most participants had not been trained in how to use these tools or in what their capabilities were.

Moreover, these tools are trained primarily on English-language internet text, which means they inherit the assumptions, cultural frameworks, and blind spots of the communities most represented in that data: communities that are predominantly Western, English-speaking, and relatively affluent. For foreign affairs professionals working across diverse cultural contexts, this is a significant limitation.

In addition, environmental concerns surfaced frequently. USAID supported projects focused on protecting the environment and on sustainable development, so we pondered how staff should handle the ethical dilemma of participating in a system that profoundly affects freshwater and energy supplies. On the one hand, USAID missions had limited resources that AI could directly and positively affect.
If staff could spend less time on routine administrative tasks that consume disproportionate time relative to their impact on mission goals, they would have more time for partnership and positive change in the communities they served. On the other hand, reports on AI's negative environmental impact were, and continue to be, stark.

Over the course of the pilot, there were also a couple of instances in which teams developed AI-generated images that were not in line with branding and marking guidelines. They were ultimately allowed to use those images, but for internal materials only. We hypothesize that constrained budgets might have pushed teams toward increased use of AI-generated imagery over time. Editing photography is time-consuming after
