The Biggest Risk Arts Nonprofits Face with A.i? Ignoring It.

I recently had a conversation with an executive of an arts nonprofit about artificial intelligence (A.i). When I asked about their organization’s approach to A.i, they admitted they didn’t have one. They knew that some members of their staff used A.i tools to help with writing and brainstorming, but there was no formal policy in place—just an unspoken assumption that if A.i was useful, people would use it on their own.

Quick note: To avoid confusion between artificial intelligence and our partner Al (who is definitely not a robot—at least as far as we know), I use “A.i” when writing about artificial intelligence.

That conversation set off alarm bells. Treating A.i as something individual staff can experiment with—without oversight or strategy—is a major organizational blind spot. A.i isn’t just a tool people should be playing around with in the background. It’s already changing how we work, how we communicate, and how we make decisions. And yet, many organizations are treating it as a novelty rather than a structural shift that demands attention. A recent survey by TechSoup and Tapp Network found that while 85.6% of nonprofits are experimenting with A.i tools, only 24% have a formal strategy in place. That gap mirrors exactly what I’m hearing across the field: high curiosity, but little organizational commitment.

Here’s the issue: just because leadership isn’t thinking about A.i doesn’t mean the workforce isn’t using it. In many arts nonprofits, A.i is already part of the workflow—just not in any official capacity. A development associate might use ChatGPT to draft donor acknowledgments. A marketing manager might plug event details into an image generator to create a quick social media post. Program teams are testing A.i tools to outline workshop descriptions or lesson plans. These experiments are happening quietly, without structure, training, or oversight. And that’s the concern: if organizations aren’t actively shaping how A.i is used, it doesn’t mean A.i isn’t shaping them. It just means it’s happening without intention.

Nonprofits are no strangers to resource constraints. We never have enough time, enough money, or enough people to do everything we need to do. That’s exactly why A.i should be on our radar. If implemented intentionally, it has the potential to close the ever-widening gap between our ambitions and our capacity. If we can apply A.i to our administrative and operational functions—grant writing, donor engagement, audience development, internal communication, data analysis, for example—we could free up time and resources for the things that truly require human ingenuity.

But for this to happen, arts nonprofits need to do more than let individual staff members experiment with A.i in isolation. As David De Cremer says in The AI-Savvy Leader, executives “can’t delegate digital transformation for [their] company… Executives need to engage, embrace, and adopt new ways of working with the latest and emerging technologies.” Leaders need to be making intentional decisions about where and how A.i fits into organizational structures. These decisions don’t have to be sweeping or overly technical—but they do have to be deliberate. Leaders can start by asking: Where in our operations could A.i expand our bandwidth? What data are we comfortable exposing to these tools, and what needs stronger safeguards? Do we need internal guidelines for how staff use A.i in development, marketing, or education? Just beginning to surface these questions can help shift A.i from a casual experiment to a conscious strategy.

Many organizations don’t fully understand how A.i models handle data. When a marketing team member uploads sales data into ChatGPT for analysis, where does that data go? If a development team uses A.i to generate donor emails, how much of that proprietary language is being absorbed into the model for others to access? Many of the most widely used A.i tools, particularly the free versions, retain input data in ways that aren’t immediately obvious to users. Remember: if you’re not paying for the tool, your data is probably the price you’re paying. The absence of an A.i policy means that staff could unknowingly be exposing sensitive organizational data without any oversight. Just as we had to rethink our approach to digital security when online fundraising and cloud-based donor management became the norm, we now have to develop clear guidelines for A.i use within our organizations. Without that structure in place, we risk giving away valuable institutional knowledge without even realizing it.

At the same time, there’s a financial reality that can’t be ignored. When I talk to nonprofit arts leaders about A.i, one of the most common responses I hear is, “We just don’t have the budget for that.” And I get it—every dollar in a nonprofit is already accounted for. But the real question isn’t whether we can afford A.i—it’s whether we can afford to keep operating inefficiently while others adopt tools that make their organizations more agile and sustainable. A.i isn’t coming someday in the future—it’s here now, reshaping industries in real time. I don’t want to see arts organizations repeat the mistakes we made with digital media and online engagement—where we waited too long, invested too little, and are now perpetually behind the curve. A.i is moving faster than any technological shift we’ve seen before, and the organizations that start adapting now will be the ones best positioned to thrive.

If we get this right, we have a chance to create something more sustainable, more effective, and more aligned with the values that brought us to this work in the first place. But that won’t happen by accident. It has to be a choice.

That choice starts with asking the right questions: Where could A.i actually create capacity, not just novelty? What values should guide how we engage with these tools? Beginning to ask—out loud and together—is the first step toward an intentional strategy. In my next piece, I’ll explore what responsible A.i engagement could look like for arts nonprofits, and how leaders can begin moving from passive experimentation to active, values-driven use.
