The AI Policy Your Church Already Has
73% of churches have nothing written down. Here’s what your unwritten policy actually says, and what to put in its place.
Your church has an AI policy. You may not have written it. The elders may not have approved it. You may not even know what it is. But it is in force right now, in your sanctuary, in your office, in your counseling room, and on the phones of your members. It is shaping the spiritual life of your congregation as you read this.
That is the part of the conversation that is missing.
The 73% figure comes from the 2026 State of AI in the Church survey. Barna’s better-resourced data is sharper still: only about 1 in 20 churches has any AI guidelines at all. The number you have probably heard is conservative. The actual gap is wider.
But the framing points in the wrong direction. It suggests a void. There is no void. There is a policy. Every church has one.
The unwritten policy in most churches right now reads something like this:
Whatever staff are doing privately, plus whatever members are doing on their phones, with nobody talking about any of it.
That is the default. It has been adopted by your church. It is operating today. And like every default policy in every institution, it drifts toward the path of least resistance and the loudest adopter. Whoever uses AI most enthusiastically and least cautiously is, by sheer gravity, writing the policy that everyone else inherits.
The stakes are slower and higher
A law firm without an AI policy will eventually get sued. The stakes are visible. Lawyers know what their malpractice insurance will not cover. The market enforces a kind of discipline.
A church without an AI policy will not get sued. There is no malpractice insurance for forming people poorly. The stakes are slower, quieter, and harder to see. They are also incalculably higher.
What gets formed in your congregation, when AI is shaping prayer prompts and devotional habits and sermon prep and counseling notes, is formation under the influence of whoever trained the model. The models were built by people. Those people held convictions. The companies that funded the work held priorities. The training data ran from the Westminster Confession to TikTok hot takes about Bible interpretation, with the proportions of each weighted in ways no one outside the lab can see. All of that ends up shaping the words your members hear when they ask the model a question about prayer or grief or whether their marriage can survive. Your congregation is being formed by it, through the unwritten policy your church has already adopted.
The elder who uses AI to write his sermon and tells nobody is making a discipleship decision for his congregation. The decision may be defensible. It may even be wise. But the deciding has happened, and the decision has been made for the flock without the flock or his fellow elders being part of it. That is the part the unwritten policy hides.
Two elders are already in your church
Both of them are using AI. Or, if not yet, both of them are about to.
The first one has three jobs and a sick child and a marriage that is hanging together by grace. He finished his sermon at eleven o’clock last night because he asked Claude to outline 1 Peter 4 with him, and the outline gave him the structure he needed to actually get the text into his bones before Sunday morning. He has been faithful in his study for fifteen years. He preached a good sermon. The tool helped him do it. He feels relief, and he should.
The second one is convinced the church has been waiting for this moment for a thousand years. He is running counseling notes through ChatGPT to identify themes. He is asking Gemini to draft his weekly congregational email. He is using a custom GPT to generate prayer prompts for the prayer team. He thinks the elders are slow. He thinks anyone hesitant is afraid of progress. He has not stopped to ask whether what he is doing is faithful. He has only asked whether it is impressive.
Both of these elders need a framework. Neither one needs a sermon about how AI is the beast or the antichrist or the spirit of the age. The first one does not need to be made to feel ashamed of the help he received. The second one does not need to be allowed to disciple your congregation by accident. They need their fellow elders to think with them, slowly, in writing, about what is faithful and what is not.
Where most attempts at a church AI policy go wrong
They begin with a corporate template. They borrow the posture of an HR document. They start telling members what they may and may not do with AI in their personal lives. This happens because the templates floating around the internet were written for businesses, and businesses do have authority over what their employees do with company tools. The template gets imported wholesale, and the elders end up signing off on something that sounds reasonable until you ask the simple question of where their authority to bind any of this comes from.
The elders of a local church are shepherds. They are not managers. The Scripture that defines their work is not Harvard Business Review. It is Hebrews 13:17, where they are told they will give an account for the souls under their care. It is 1 Peter 5, where they are told to shepherd the flock of God among them, not domineering over those in their charge but being examples to the flock. It is Acts 20:28, where Paul tells the Ephesian elders to pay careful attention to themselves and to all the flock, and warns that fierce wolves will come.
There is a related problem worth naming before going further. In many churches, when an AI policy finally does get written, it does not come from the elders. It comes from a staff member who happens to be tech-fluent. It comes from a deacon in IT who volunteered. It comes from a committee that reports to the elders rather than from the elders themselves. The mistake is structural and theological. Writing the policy is itself a shepherding act. It is the work of those who will give an account for the souls entrusted to them. It cannot be delegated outward to staff or members, however technically qualified they may be, because they do not bear that weight. Staff and members can advise. They can flag the technical realities the elders need to understand. They can be invited to ask questions and offer counsel. They cannot write the policy that defines how this church’s ministry is conducted. That authority belongs to the shepherds because the account belongs to the shepherds.
Those texts authorize a great deal. They also leave a great deal alone.
What the elders have authority to bind is the conduct of their own ministry, what gets preached, taught, and published under the church’s name, the tools and data handled by anyone serving in pastoral roles, the way the church communicates with the body, the doctrine the congregation receives, and the disclosure they make to the flock about how the church’s ministry is being done.
What the elders do not have authority to bind is what members do with AI in their personal Bible study, devotional life, work, or homes. They cannot bind what tools members install on their phones. They cannot bind the internal conscience of members on matters Scripture has not addressed. Romans 14 is not an abstraction. Neither is 1 Corinthians 8 through 10. They are the texts that mark the line. The elders teach. They exhort. They warn. They model. They do not write rules where God has not written rules.
This means the policy your elders adopt is not a policy for your members. It is a covenant the elders make before God about how the elders will conduct the church’s ministry in the age of artificial intelligence. The discipling of your members around AI in their personal lives is a different work. It happens through preaching. It happens through teaching. It happens through the slow work of pastoral conversation. It does not belong in the policy.
What the policy actually has to do
With that line drawn, the policy has three jobs.
The first is to make the implicit explicit. The unwritten policy is operating because no one has said anything. The written policy operates because the elders have said something, in writing, that they will be held to. The point of writing it down is not bureaucracy. The point is so the congregation knows what is being done in their name, and so the elders know what they have committed to.
The second is to protect what cannot be delegated. There are things in pastoral work that do not belong to a tool because they do not belong to the pastor either. Counseling content. Member confidentiality. The act of being present to someone in suffering. These cannot be outsourced to a model that may train on inputs, and they cannot be substituted by the model’s output, because the work itself is the point. A tool that lightens the load on Greek word studies is one thing. A tool that pretends to listen to a grieving widow is something else entirely.
The third is to distinguish a tool from a substitute. Most arguments about AI in the church collapse this distinction, in both directions. Some treat every use of AI as a substitute for pastoral work, and so they condemn the calculator as if it were a ghostwriter. Others treat every substitute as a tool, and so they let the ghostwriter through the back door because they liked using the calculator. The policy has to draw that line in writing, in specific cases, with the elders together signing their names to it.
A note before the model policy
What follows is a model. It is not a finished document for your church. The elders together will need to sit with it, adapt it, argue with parts of it, and shape it for the flock you have been given to shepherd. It also has to be revisited every year, with the date set in the policy itself, because the tools are changing fast enough that any policy written today will need to be reworked twelve months from now.
Use it as a starting place. Take what is faithful. Push back where you see something missing or wrong.
Model AI Policy for Local Church Use
Adopted by the elders of [Church Name]
Effective: [Date]
Next review: [Date one year from effective date]
1. Statement of purpose
We, the elders of this church, recognize that artificial intelligence tools have entered our work, our communications, and the lives of our members. We will not pretend otherwise. Nor will we use these tools without thinking about what their use means for the souls we have been called to shepherd.
This policy describes how we, the elders, will use these tools in the conduct of this church’s ministry. It does not bind the personal use of AI by members of this congregation. The discipling of members around AI in their own lives belongs to teaching and exhortation, not to policy. We will do that teaching faithfully. We will not write rules where Scripture has not written them.
2. Pulpit ministry
The preaching of God’s Word is the central act of the church’s worship. The faithful exposition of Scripture cannot be delegated to a machine.
The elders will not allow AI tools to write sermons that are then preached as if the preacher had written them. AI may be used as a study aid in sermon preparation: for outlining a passage, identifying themes across a book, surfacing exegetical questions, or checking historical and linguistic background. The work of grasping the text, framing the message, and bringing the Word to this congregation is the preacher’s own work, and remains his responsibility before God.
3. Pastoral care and counseling
The work of caring for souls in private cannot be delegated to a tool that may train on inputs, and cannot be replaced by output that pretends to listen.
No identifying information about a counselee, no description of their situation, no sins confessed, and no personal details from a counseling encounter will ever be entered into any AI tool, public or private, by any elder or pastoral staff member, for any reason. This is categorical and admits no exceptions.
AI may be used in general preparation for counseling, such as reviewing biblical passages relevant to a category of issue, surveying counseling literature, or thinking through how to approach a topic in principle. It may not be used to process the specifics of any actual counselee’s life.
4. Resources produced for congregational use
Curriculum, study guides, devotionals, and other written resources produced under the church’s name shape the formation of the body. They are an extension of the elders’ teaching ministry.
Where AI is used substantively in the production of any resource that will be distributed to the congregation under the church’s name, that use will be acknowledged in the resource itself, and an elder will review the resource for theological soundness before it is released. Resources for which AI was used only for proofreading, formatting, or minor editorial assistance do not require specific disclosure beyond what we say in our standing statement.
5. Communications
Most weekly communications from the church are administrative in nature: announcements, scheduling, building updates, vendor correspondence, basic logistical emails. AI use in producing them is permitted and need not be disclosed.
A subset of communications, however, carries pastoral weight. These include letters to the congregation about church discipline, statements addressing public crises, condolence letters to grieving families, written responses to serious accusations, and other communications in which the body or an individual member receives pastoral counsel from the elders.
Pastoral communications, as defined above, are subject to the same disclosure principle as preaching. Where AI is used substantively in their production, the elders will note that use, either in the communication itself or in a manner the recipient is plainly aware of. AI will not be used to substitute for pastoral judgment in such communications. The judgment is the elders’ own.
6. Administrative use
Operations, finance, scheduling, vendor correspondence, building maintenance, and similar work that does not touch directly on the spiritual formation of members or the pastoral relationship may use AI tools as is helpful. No specific disclosure is required.
Common-sense limits apply. No member’s personally identifying information, financial information, or other confidential data will be entered into AI tools that may train on inputs. The church will use only tools whose data handling has been reviewed and approved by the elders.
7. Member confidentiality and data
The trust members place in the church is itself a stewardship.
No personally identifying information about any member, including but not limited to names paired with sins, struggles, family situations, financial circumstances, or pastoral concerns, will be entered into any AI tool. This applies to all elders, pastoral staff, ministry leaders, and any volunteer with access to confidential information. The single exception is the use of approved tools whose data handling agreements explicitly prohibit training on inputs and have been reviewed by the elders. This exception does not reach the counseling prohibition in Section 3, which remains categorical.
8. Disclosure to the congregation
The body deserves to know how its ministry is being conducted.
The elders will publish a standing statement to the congregation, reviewed annually, describing how AI is and is not used in this church’s ministry. This statement will be plainly available to members, and will be referenced from the pulpit at least once a year.
In addition, where AI has been used substantively in any sermon, pastoral communication, or counseling preparation that affected the body or an individual member, that use will be acknowledged in a manner appropriate to the context. The elders will not issue per-sermon footnotes that turn disclosure into theater. They will instead practice the kind of plain honesty that allows the congregation to trust them.
9. Decision-making
Decisions about which AI tools the church will use, which uses are appropriate, and how this policy will be interpreted in particular cases belong to the elders together, not to any individual staff member acting alone.
Adoption of new AI tools for any of the categories described above requires the approval of the elders. Individual staff members do not adopt tools for pastoral use on their own initiative. Where a question arises that this policy does not clearly address, the matter is brought to the elders for decision. The drafting and revision of this policy itself remains the work of the elders, who may seek counsel from staff and members but do not delegate that work to them.
10. Annual review
This policy will be reviewed by the elders no later than [date one year from effective date]. The tools, capabilities, and risks involved in AI are changing rapidly. Any policy written today will need to be reworked. The elders commit to that work, and to keeping the congregation informed of any substantive change.
The unwritten policy is already operating in your church. The question facing the elders is not whether to have a policy. The question is whether the one already in force is the one they would have adopted if they had read it first.
Read your church’s unwritten AI policy. Then write a better one.