The traditional due diligence process has not changed nearly as much as the surrounding technology. Analysts still review files, extract data into working papers, compare versions, and rebuild the same story for different audiences. The main variables have usually been staffing levels and time.
AI changes that pattern, but not in the way the loudest marketing language suggests. The value is not that a model can generate a clever summary. The value is that a better system can ingest large file sets, extract recurring information, surface anomalies, and shorten the translation layer between source evidence and reviewer judgment.
McKinsey reported in January 2026 that respondents using gen AI in M&A observed roughly 20 percent lower costs on average, while 40 percent reported 30 to 50 percent faster deal cycles [McKinsey & Company, "Gen AI in M&A: From theory to practice to high performance", January 2026]. Deloitte also found that 86 percent of surveyed organizations have integrated GenAI into M&A workflows [Deloitte, "2025 GenAI in M&A Study", 2025]. The practical question is no longer whether AI matters. It is how AI changes due diligence when the workflow is designed well enough to benefit from it.
What AI is actually changing in due diligence
AI does not remove the need for diligence. It changes where human effort is spent.
In a manual process, large amounts of time go into:
- sorting documents by type and relevance
- pulling recurring fields out of contracts, statements, and returns
- comparing multiple versions of the same answer
- identifying which issues deserve immediate escalation
- translating specialist findings into executive review language
Those steps are all necessary, but they are not where the highest-value judgment lives. AI is most useful when it compresses that first-pass operating work so the human team can spend more time on materiality, negotiation posture, and deal consequences.
McKinsey's 2024 work on gen AI in M&A frames the opportunity broadly: better sourcing, faster diligence and negotiation, stronger execution of integrations or separations, and improved in-house M&A capability building [McKinsey & Company, "Gen AI: Opportunities in M&A", May 21, 2024]. In due diligence specifically, that means AI matters most when it reduces process friction around evidence, not when it tries to replace the reviewer.
How AI due diligence actually works
The clearest way to understand AI in due diligence is as a workflow that moves through a few predictable layers.
Step 1: Document ingestion and classification
The first step is getting the data room or working set into a structure the team can actually use. AI can classify documents by type, identify duplicates, and group files by likely relevance.
That matters because the manual alternative is expensive:
- contracts arrive in mixed folders
- financial schedules use inconsistent naming
- tax files span multiple periods and jurisdictions
- revised versions appear without clean version control
An AI-assisted system can reduce that noise by organizing the file flow faster, but only if the product keeps the file classification connected to the rest of the review process.
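As a sketch of what this first-pass step might look like in practice, the snippet below classifies files by keyword rules and groups exact duplicates by content hash. The rule table, function names, and keywords are all illustrative assumptions, not a real product's logic; a production system would use a trained classifier and near-duplicate detection rather than filename matching.

```python
import hashlib

# Hypothetical keyword rules for first-pass classification.
# A real system would classify on document content, not filenames.
TYPE_RULES = {
    "contract": ("agreement", "amendment", "nda", "msa"),
    "financial": ("trial balance", "p&l", "balance sheet", "ledger"),
    "tax": ("return", "vat", "withholding", "assessment"),
}

def classify(filename: str) -> str:
    """Assign a document type from filename keywords; default to 'other'."""
    name = filename.lower()
    for doc_type, keywords in TYPE_RULES.items():
        if any(k in name for k in keywords):
            return doc_type
    return "other"

def dedupe(files):
    """Group (name, raw_bytes) pairs by content hash so exact duplicates surface."""
    groups = {}
    for name, data in files:
        digest = hashlib.sha256(data).hexdigest()
        groups.setdefault(digest, []).append(name)
    return groups
```

The point of the sketch is the output shape: every file gets a type and a duplicate group before any reviewer opens it, which is what makes the later steps tractable.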
Step 2: Structured extraction
Once the file set is organized, AI can extract recurring fields from different document types.
Typical examples include:
- revenue, margin, EBITDA, working capital, and debt-like items from financial schedules
- counterparties, dates, termination rights, and change-of-control clauses from contracts
- jurisdictional information, attributes, and filing patterns from tax materials
- reviewer-relevant metadata from minutes, schedules, and side letters
This is not the same as reaching a conclusion. It is the step that makes the raw material usable faster.
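A minimal sketch of what "structured extraction" means in data terms: each document becomes a record of recurring fields plus pointers back to the source text. The field names, regex patterns, and sample phrasing below are hypothetical simplifications; real extraction would run on parsed document text with a model, not toy regexes.

```python
import re
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class ContractFields:
    """Recurring fields a first-pass extractor might pull from one contract."""
    counterparty: Optional[str] = None
    termination_notice_days: Optional[int] = None
    change_of_control: bool = False
    source_spans: List[Tuple[int, int]] = field(default_factory=list)  # evidence pointers

def extract(text: str) -> ContractFields:
    """Toy extractor: fill fields and record where in the text each came from."""
    out = ContractFields()
    m = re.search(r"entered into with ([A-Z][A-Za-z ]+)", text)
    if m:
        out.counterparty = m.group(1).strip()
        out.source_spans.append(m.span())
    m = re.search(r"(\d+) days['’]?\s+(?:written\s+)?notice", text)
    if m:
        out.termination_notice_days = int(m.group(1))
        out.source_spans.append(m.span())
    if re.search(r"change of control", text, re.IGNORECASE):
        out.change_of_control = True
    return out
```

The detail worth noticing is `source_spans`: keeping an evidence pointer on every extracted field is what later lets a reviewer jump from a finding back to the underlying document.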
Step 3: Cross-document linkage
This is where the workflow becomes more powerful. AI can help connect information that humans often review in separate passes.
For example:
- a major customer named in a contract can be linked to revenue concentration questions
- a clause risk can be compared to the financial consequence it might create
- a filing inconsistency can be traced across multiple supporting documents
This cross-document work is one reason AI matters in due diligence more than in simple document search. The process is not just about reading one file faster. It is about keeping related facts visible together.
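To make the first example above concrete, here is a hedged sketch of the join it implies: contract-level extractions matched against a revenue-by-customer schedule, so a change-of-control clause on a concentrated customer surfaces as one finding instead of two disconnected facts. The data shapes, field names, and the 10 percent threshold are illustrative assumptions.

```python
def link_concentration(contracts, revenue_by_customer, threshold=0.10):
    """Join contract counterparties to revenue share; flag risky overlaps.

    contracts: list of dicts with 'counterparty' and 'has_change_of_control'
    revenue_by_customer: dict mapping customer name -> revenue
    """
    total = sum(revenue_by_customer.values())
    findings = []
    for c in contracts:
        share = revenue_by_customer.get(c["counterparty"], 0) / total
        if share >= threshold and c.get("has_change_of_control"):
            findings.append({
                "counterparty": c["counterparty"],
                "revenue_share": round(share, 3),
                "issue": "change-of-control clause on a concentrated customer",
            })
    return findings
```

In a manual process these two facts typically live in separate workstreams (legal and financial); the value of the linkage layer is that the combination is flagged automatically.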
Step 4: Issue detection and first-pass triage
Once the system can read, extract, and compare, it can start helping the team identify likely issues:
- unusual concentration
- missing support
- clause conflicts
- version inconsistencies
- patterns that deserve escalation
This is the stage where AI becomes operationally useful, because it helps the team separate “interesting” from “decision-relevant” more quickly.
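The "interesting versus decision-relevant" split can be sketched as a small rule-based triage pass over detected issues. The rules, weights, and escalation cutoff below are invented for illustration; the structural point is that triage is an explicit, inspectable scoring step, not a black box.

```python
# Hypothetical triage rules: (flag name, predicate, weight).
# Weights are illustrative, not calibrated to any real engagement.
RULES = [
    ("missing_support", lambda i: i.get("evidence_count", 0) == 0, 3),
    ("concentration", lambda i: i.get("revenue_share", 0) >= 0.25, 3),
    ("version_conflict", lambda i: i.get("versions", 1) > 1 and not i.get("reconciled"), 2),
]

def triage(issue, escalate_at=3):
    """Score an issue against the rules and route it to a review queue."""
    hits = [(name, w) for name, test, w in RULES if test(issue)]
    score = sum(w for _, w in hits)
    return {
        "flags": [name for name, _ in hits],
        "score": score,
        "queue": "escalate" if score >= escalate_at else "review",
    }
```

Keeping the rules as a readable table rather than buried logic is a deliberate choice here: reviewers can see exactly why an issue was escalated, which matters when the triage output is challenged later.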
Step 5: Human review and escalation
This is the point many product demos understate. AI is not the final reviewer. It is the acceleration layer around the reviewer.
The human team still has to decide:
- whether the issue is real
- how material it is
- whether the evidence is strong enough
- what the commercial consequence should be
- whether the finding changes price, structure, or willingness to proceed
If the workflow does not preserve that human review layer clearly, the tool may create faster outputs without creating faster decisions.
Step 6: Evidence-anchored synthesis
The best AI workflows do not stop at extraction. They make synthesis easier by keeping the issue, the evidence, the reviewer history, and the current conclusion connected.
That matters because senior review is where many diligence processes slow down. The team may already know the answer, but if the answer cannot be explained and defended quickly, the process still stalls.
Where AI creates the most real value
AI creates the most leverage where the process is high-volume, repetitive, and traceability-sensitive.
In practice, that usually means three areas.
Financial diligence
AI is useful in financial diligence when it helps teams organize the path from uploaded schedules to a readable view of recurring earnings, working capital, and linked issues.
It can accelerate:
- first-pass extraction of financial data
- organization of adjustment candidates
- comparison across periods and versions
- early identification of areas that need analyst attention
It does not replace the judgment required for QoE, valuation support, or a decision-ready recommendation.
Legal and contract review
AI is particularly useful when large contract populations need first-pass organization. It can help surface:
- change-of-control language
- assignment restrictions
- termination rights
- clauses that deserve targeted legal review
The benefit is not that the legal answer becomes automatic. It is that the reviewer reaches the right subset of contracts and issues faster.
Cross-workstream synthesis
This is the highest-leverage use case for many deal teams. If AI only improves one specialist lane, the process may still fragment. If AI helps preserve continuity across financial, tax, legal, and pre-LOI review, the team spends less time rebuilding context.
That is where due diligence starts to feel structurally different rather than merely faster at one task.
What AI should not do alone
The limits are as important as the strengths.
McKinsey's January 2026 private-markets work found that in seven out of ten industries analyzed, GenAI deep-research reports presented a more optimistic view than expert-interview-based reports, and about 40 percent of important data points uncovered in expert interviews were absent from corresponding LLM answers [McKinsey & Company, "Harnessing the power of gen AI in private equity", January 5, 2026]. That is a serious warning for diligence teams.
It means AI should not be trusted alone for:
- final valuation judgment
- final legal interpretation
- unsupported pass/fail recommendations
- unresolved conflicts across workstreams
- executive conclusions that are detached from source evidence
The right model is not “AI replaces diligence.” It is “AI shortens the manual path to a reviewable answer.”
The controls that matter most
Deloitte found that 67 percent of surveyed organizations cite data security as a leading concern in GenAI adoption for M&A [Deloitte, "2025 GenAI in M&A Study", 2025]. That concern is rational. Due diligence often involves the most sensitive information in the whole transaction process.
For that reason, good AI diligence implementation needs more than model capability. It needs operating controls:
- clear permissions
- evidence linkage
- reviewer history
- defensible exports and reporting
- data boundaries the buyer can understand
If the product is fast but weak on those controls, the team may simply shift its bottleneck from review speed to trust and governance.
How buyers should evaluate AI in diligence
The best evaluation is not a polished sandbox demo. It is a test against real process friction.
A buyer should ask:
- Does the tool reduce time spent organizing and re-organizing files?
- Can it keep findings tied to source support?
- Does it improve how issues are escalated across workstreams?
- Can leadership see the current state without asking the team to rebuild the story?
- Is the human review layer explicit, or is the system pretending to be more autonomous than it should be?
These questions matter more than whether the first summary looks impressive. A diligence tool is valuable when it changes the operating model, not when it generates the prettiest draft output.
What changes for the deal team
When AI is used well, the deal team's work changes in a useful way:
- less time spent on repetitive extraction
- less manual searching across large file sets
- earlier visibility into likely issues
- faster movement from working-level review to executive discussion
- more time available for judgment-intensive decisions
That is why McKinsey's reported cost and cycle-time improvements matter. They should not be read as a promise that every deployment will produce the same result. They should be read as evidence that workflow redesign around AI can create measurable benefits when adopted well.
Where Sorai fits
Sorai is built around the idea that AI in due diligence should improve the operating record, not only the first-pass summary. The point is to keep financial, tax, legal, and pre-LOI findings connected to evidence, reviewer status, and decision context in one workflow.
That is the difference between using AI as a shortcut and using AI as part of a more reviewable diligence system.
The bottom line
AI in M&A due diligence matters because it can compress the manual work around ingestion, extraction, triage, and synthesis. It does not remove the need for human judgment. It makes that judgment easier to apply to the right issues faster.
The teams that get real value from AI in diligence are not the ones chasing the most impressive demo. They are the ones using AI to create a workflow that stays evidence-linked, reviewable, and decision-ready under deal pressure.