The AI that actually read the file
Here’s what most “AI for lawyers” tools do: you paste some text, the AI generates a response based on its training data, and you spend the next 20 minutes fact-checking whether it hallucinated a case citation. Sound familiar?
Aquiles works differently. When you ask a question, the AI searches your documents — the contracts you uploaded, the depositions you indexed, the case logs you wrote. Every answer comes with the receipts.
“Does the defendant’s deposition testimony about the December meeting contradict the dates in the signed agreement?”
How semantic search changes everything
Traditional document search is keyword matching: you type “breach of fiduciary duty” and hope the opposing counsel used those exact words. Miss by a synonym, and you miss the evidence.
Aquiles uses semantic vector search. Every document in your workspace is broken into passages, converted into mathematical representations of meaning, and stored in a local vector database. When you search, you’re matching on concepts — not strings.
This means:
- Searching for “breach of fiduciary duty” also surfaces passages about “violated trust obligations” or “failed to act in the client’s best interest”
- You find relevant evidence you didn’t know existed in your own files
- The AI can connect dots across documents that you haven’t read side-by-side
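The retrieval idea behind those bullets can be sketched in a few lines. This is a toy illustration with hand-made three-dimensional vectors; a real system uses a sentence-embedding model that produces high-dimensional vectors and a vector database that handles nearest-neighbor search.

```python
import math

def cosine(a, b):
    """Similarity of two embedding vectors: closer to 1.0 means closer in meaning."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy embeddings; a real encoder maps text to hundreds of dimensions.
passages = {
    "violated trust obligations owed to the client": [0.90, 0.10, 0.20],
    "lease renewal terms for the parking facility":  [0.10, 0.80, 0.30],
}
query = [0.85, 0.15, 0.25]  # pretend embedding of "breach of fiduciary duty"

best = max(passages, key=lambda text: cosine(query, passages[text]))
print(best)  # the fiduciary-duty passage wins despite sharing no keywords
```

Because the match happens in vector space, the fiduciary-duty query finds the “violated trust obligations” passage even though the two share no words.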
Scorecard AI: validated before you see it
The hardest problem in AI-assisted legal work isn’t generating text — it’s trusting it. Aquiles addresses this with a multi-pass scorecard validation system that evaluates AI responses before they reach you.
After the initial response is generated, a second analysis pass reviews the output against your source documents. It checks for unsupported claims, verifies that cited passages actually exist in your files, flags logical inconsistencies, and scores the response on accuracy and completeness.
The result: what you see has already been pressure-tested. Unsupported assertions are caught early. Fabricated references are flagged before they make it into your analysis. The first draft you receive is materially better than a single-pass response — closer to what a careful associate would produce after a round of self-review.
This doesn’t replace your judgment. It raises the floor. The AI’s first response is already its second draft.
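One slice of that second pass, the citation check, can be sketched as follows. The function name and the verbatim-quote test are illustrative assumptions; the actual scorecard evaluates more than literal string presence.

```python
def score_response(cited_quotes, source_documents):
    """Hypothetical second pass: flag cited passages that do not
    appear verbatim in any source document, and score the rest."""
    flags = [q for q in cited_quotes
             if not any(q in doc for doc in source_documents)]
    supported = len(cited_quotes) - len(flags)
    score = supported / len(cited_quotes) if cited_quotes else 1.0
    return score, flags

sources = ["The meeting took place on December 4, 2023."]
quotes = [
    "The meeting took place on December 4, 2023.",   # genuine
    "The contract was signed on December 9, 2023.",  # fabricated
]
score, flags = score_response(quotes, sources)
print(score, flags)  # the fabricated quote is flagged before you see it
```

A response that fails this check can be regenerated or annotated before it ever reaches the reviewer.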
AI outputs become reusable references
When the AI generates analysis — a research memo, a fact summary, a contradictions report — that output is saved as a workspace reference, not just a chat message that scrolls away.
This changes the workflow entirely:
- Searchable. Every AI-generated reference is indexed and queryable by semantic search, just like your uploaded documents. Six months from now, you can search “personal jurisdiction analysis” and find the memo the AI wrote in week one.
- Citable. AI references appear in the References tab alongside your uploaded exhibits, contracts, and correspondence. They’re first-class work product.
- Composable. Future AI tasks can draw on previous AI outputs. The research memo you generated in March becomes context for the motion you’re drafting in July.
- Editable. Review the AI output, refine it, and the edited version becomes the reference. Your corrections are preserved.
The AI isn’t a chat window — it’s a research engine that produces artifacts you build on.
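One way to picture the artifact that workflow produces (field names here are illustrative, not Aquiles’s actual schema):

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class WorkspaceReference:
    """Illustrative shape of a saved AI output in a workspace."""
    title: str
    body: str              # the memo or summary text, editable
    source_ids: list[str]  # documents the analysis drew on
    created: date = field(default_factory=date.today)
    edited: bool = False   # True once a human has refined it

memo = WorkspaceReference(
    title="Personal jurisdiction analysis",
    body="Draft analysis…",
    source_ids=["exhibit-c.pdf", "smith-deposition.pdf"],
)
memo.body = "Revised analysis…"  # your corrections become the reference
memo.edited = True
```

Because the object carries a title, body, and source attribution, it can be indexed for semantic search and attached as context to later tasks, just like an uploaded document.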
What you can do with it
Ask open-ended questions. “What are the strongest facts supporting our position?” The AI reviews your entire workspace and synthesizes an answer grounded in your actual evidence.
Identify contradictions. “Does the defendant’s timeline in the deposition match the dates in Exhibit C?” Aquiles cross-references documents you might not have compared manually.
Draft with context. When you use inline AI assistance, it’s not writing in a vacuum. It knows your case — the parties, the jurisdiction, the key facts — and drafts accordingly.
Build structured references. Run an AI task against attached documents, and the output — complete with extracted facts, dates, and key details — is saved as a workspace reference with proper metadata and source attribution.
The privacy equation
All search indexing happens entirely on your machine — your case data never leaves your device for embedding or retrieval. When AI analysis requires cloud processing, only the minimum necessary context is sent over encrypted connections, processed in real time with zero retention, and never used to train models. Your data in, your answer out, nothing stored.
“I asked it to find inconsistencies in the plaintiff’s account. It surfaced a paragraph from page 47 of a deposition I uploaded two weeks ago. I’d never have found that on my own.” — Early access user
Not magic. Just good engineering.
Aquiles is engineered not to make things up. It retrieves, it cites, it validates, it lets you verify. The scorecard system catches errors before you see them. The reference system ensures nothing gets lost. And every answer traces back to your actual files — not a confident guess with a law degree it never earned.