
I Built an AI Retro Tool. Then I Audited It. Send Help.

By Dhanush Kandhan

They say every product tells a story. Mine started with excitement, evolved into stress, and ended with an audit I didn’t ask for, sponsored by an investor who “just wanted to make sure everything looks scalable.”

It began with an innocent email:

“We’d like to sponsor a full technical and compliance audit. Shouldn’t take long.”

So there I was, knee‑deep in an audit of my own creation: LetRetro, an AI‑powered retrospective platform for engineering teams and scrum masters. On the outside, everything looked neat: a beautiful UI, a solid backend, clean infrastructure.

Under the hood, though? Things were… less elegant.

[Meme about SaaS technical audits. Image credit: Supermeme.ai]

The Calm Before the Audit

Before the investors announced the audit, we were in full product rhythm: shipping features, improving AI summaries, polishing onboarding flows. I’d just rolled out a new AI sentiment feature that could detect team morale trends across sprints, and it worked like a charm.

Two days later, the audit email hit my inbox:

“As part of our due diligence, we’d like to conduct a technical and compliance audit. Shouldn’t take long.”

I laughed. Then cried. Then opened my terminal.

Day One: The Discovery

The first report came back in hours. A friendly message from the external auditors:

“Do you realize your GCP region us‑central1 has six active instances but only three in use?”

Busted.

Turns out, every “test environment” we spun up for internal experiments had become a hidden cost center. We were running forgotten APIs, extra replicas of open-source backend services, and one ghost VM that had been alive since our MVP days.

That was the first slap of reality: infra debt sneaks up quietly, camouflaged as innovation.

Day Two: Diving into the Database

Our database was another story entirely. The stack itself is fantastic, Postgres at its heart with a developer‑friendly UI on top. But ours had become a digital attic.

Old test accounts, dummy data from demos, and temporary tokens sitting quietly like squatters refusing to move out. The audit flagged it instantly:

“There are at least 2,300 unused rows in key user-related tables.”

So began what we soon called “Operation DB Detox.”

We spent nights cleaning every table: deleting junk test users, anonymizing anything even remotely human-readable, and (this part was painful) restructuring relationships that made zero logical sense months later.

Then came query optimization.

We logged every major query’s execution time, cost, and return size. Some seemed to live forever; one dashboard query took 4.3 seconds to return data because of an unindexed column. We indexed aggressively, added caching layers, and introduced query‑performance logging right inside Node.

The outcome? DB size dropped 30%. Query time halved. And database backups that used to drag through the night now blink past in minutes.
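For the curious, the query‑performance logging we wired into Node can be sketched roughly like this. It's a minimal sketch under assumptions: the `QueryFn` shape, the 500 ms default threshold, and the names are illustrative, not our production code.

```typescript
// Wraps any async query function, times each call, and records the
// ones that cross a latency threshold. The QueryFn shape and the
// 500 ms default are illustrative assumptions.

type QueryFn<T> = (sql: string, params?: unknown[]) => Promise<T>;

interface SlowQuery {
  sql: string;
  durationMs: number;
}

export const slowQueries: SlowQuery[] = [];

export function withTiming<T>(runQuery: QueryFn<T>, thresholdMs = 500): QueryFn<T> {
  return async (sql, params) => {
    const start = Date.now();
    try {
      return await runQuery(sql, params);
    } finally {
      const durationMs = Date.now() - start;
      if (durationMs >= thresholdMs) {
        slowQueries.push({ sql, durationMs });
        console.warn(`slow query (${durationMs} ms): ${sql}`);
      }
    }
  };
}
```

The nice part of wrapping the client once is that every slow query surfaces in logs without touching a single call site.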

Day Three: The AI Grey Area

Next came the AI audit.

The auditors went straight for the hard question: “How exactly are you handling prompt data, user context, and retention?”

Never a pleasant question.

LetRetro’s AI was built to summarize agile retrospectives: it reads anonymized notes, finds themes, and generates key actions. But, and here’s the part I hate admitting, some old debug logs still stored partial text snippets for “testing improvements.”

It wasn’t personal data, but still a violation of the principle we’d sworn by: privacy by default.

We quickly redesigned our entire AI pipeline:

  • Introduced a sanitization middleware before any text hits the model.
  • Added temporary in-memory storage for summaries that expire after processing.
  • Updated the privacy policy to include AI data handling and retention periods.

It looked minor to outsiders, but for us, it was a turning point. The line between useful data and sensitive data isn’t always clear in AI products, but crossing it even once can break user trust.
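The sanitization middleware is the piece worth showing. Here's a stripped-down sketch; the regexes, field names, and Express-style shape are illustrative assumptions, and real PII detection needs far more than two patterns.

```typescript
// Sketch of a sanitization pass run before any retro note reaches the
// model. The two regexes and the `note` field name are illustrative;
// production rules would cover many more PII shapes.

const EMAIL_RE = /[\w.+-]+@[\w-]+\.[\w.]+/g;
const PHONE_RE = /\+?\d[\d\s().-]{7,}\d/g;

export function sanitize(text: string): string {
  return text.replace(EMAIL_RE, "[email]").replace(PHONE_RE, "[phone]");
}

// Express-style middleware shape, typed loosely to stay dependency-free.
export function sanitizeBody(
  req: { body: { note?: string } },
  _res: unknown,
  next: () => void
): void {
  if (req.body.note) req.body.note = sanitize(req.body.note);
  next();
}
```

Because it runs as middleware, no handler downstream can forget to call it, which was exactly the failure mode our debug logs exposed.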

Day Four: Code Quality, Facing the Mirror

Next came the part I secretly dreaded, the code audit.

Picture this:
You’ve been building for months, racing shipping deadlines, merging PRs past midnight. Then someone opens your repo and says, “Do we really need all these packages?”

They were right.

Our package.json had grown into a dependency jungle — libraries long removed from the code, unused analytics SDKs, redundant UI utilities.
Deleting them wasn’t just about tidiness — unused packages are often hidden security gaps.

We wrote a cleanup script that scanned imports and automatically flagged dependencies not in use.
Removed more than 20% of our node_modules.
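The heart of that cleanup script fits in a dozen lines. A simplified sketch, with the caveat that this regex only catches plain import/require forms; dedicated tools like depcheck handle the messy cases:

```typescript
// Flags declared dependencies that never appear in an import/require
// statement. Only handles top-level `import ... from "x"` and
// `require("x")` forms; dynamic imports would need real parsing.

const IMPORT_RE = /(?:from\s+|require\()\s*["']([^"'./][^"']*)["']/g;

export function findUnusedDeps(dependencies: string[], sources: string[]): string[] {
  const used = new Set<string>();
  for (const src of sources) {
    for (const match of src.matchAll(IMPORT_RE)) {
      // "lodash/merge" counts as using "lodash";
      // "@scope/pkg/x" counts as using "@scope/pkg".
      const spec = match[1];
      const parts = spec.split("/");
      used.add(spec.startsWith("@") ? parts.slice(0, 2).join("/") : parts[0]);
    }
  }
  return dependencies.filter((dep) => !used.has(dep));
}
```

We fed it every file under `src/` plus the `dependencies` block of package.json, then reviewed the flagged list by hand before deleting anything.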

Then the auditors looked at code structure — inconsistent linting, old patterns, over‑detailed API routes.
We implemented a new baseline:

  • Strict TypeScript linting with eslint‑plugin‑security.
  • Centralized logging with redaction filters.
  • Modularized API handlers with edge middleware for common checks.
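Of the three, the redaction filter in the logging layer is the one I'd recommend to anyone. A minimal sketch, with the caveat that the key list is an assumption; ours was longer:

```typescript
// Centralized log helper that scrubs sensitive fields before anything
// is written. The REDACTED_KEYS list is an illustrative assumption.

const REDACTED_KEYS = new Set(["password", "token", "email", "apiKey"]);

export function redact(value: unknown): unknown {
  if (Array.isArray(value)) return value.map(redact);
  if (value !== null && typeof value === "object") {
    const out: Record<string, unknown> = {};
    for (const [k, v] of Object.entries(value as Record<string, unknown>)) {
      out[k] = REDACTED_KEYS.has(k) ? "[redacted]" : redact(v);
    }
    return out;
  }
  return value;
}

export function logEvent(event: string, payload: Record<string, unknown>): void {
  console.log(JSON.stringify({ event, payload: redact(payload) }));
}
```

Routing every log line through one helper meant a leaked token in some forgotten debug statement stopped being a possibility.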

After weeks of cleanup, the repo felt light — predictable, readable, and genuinely modern.

And when the new build deployed? Latency dropped by about 300 ms across routes. Turns out, clean code doesn’t just look better — it runs better.

Day Five: Processing and Performance, the Hidden Thieves of Experience

During load testing, we learned our AI response pipeline had a quirk: multiple async processes waiting on the same DB call.
Under heavy usage, it created micro‑delays that users felt as “lag.”

We re‑architected inference requests to batch operations, introduced connection pooling, and monitored database query times in real time.

That small optimization cut average AI processing delay from 2.3 s to 1.1 s.
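What we actually shipped combined batching and connection pooling, but the core idea, coalescing duplicate in-flight calls, can be sketched in a few lines (names here are illustrative, not our real code):

```typescript
// Coalesces concurrent calls with the same key into one in-flight
// promise, so N simultaneous requests trigger a single DB round-trip.
// `fetchFn` stands in for the real query function.

const inFlight = new Map<string, Promise<unknown>>();

export function coalesce<T>(key: string, fetchFn: () => Promise<T>): Promise<T> {
  const existing = inFlight.get(key);
  if (existing) return existing as Promise<T>;
  const p = fetchFn().finally(() => inFlight.delete(key));
  inFlight.set(key, p);
  return p;
}
```

Ten dashboard widgets asking for the same sprint summary now cost one query instead of ten, which is exactly where the “lag” was coming from.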

Lesson learned: performance isn’t about hero caching; it’s the sum of small, disciplined optimizations repeated daily.

Day Six: The Compliance Curveball

Just when I thought we were through, compliance came knocking.

The auditors wanted to see our privacy policy, data retention policy, and records of user consent tracking. We had them, technically, just not updated since our beta.

That’s when I learned the difference between legal compliance and practical preparedness.

We rolled out consent tracking using cookies and database-level security policies, mapped every type of data collected in a “data flow diagram,” and introduced log-expiry automation, purging logs after 14 days unless explicitly extended.
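The expiry rule itself is tiny. A sketch of the retention check (the `extended` flag is an illustrative assumption standing in for our “explicitly extended” escape hatch):

```typescript
// Keeps a log entry only while it is younger than the retention
// window, unless it was explicitly marked for extension. Callable
// from any scheduler (cron, Cloud Scheduler, etc.).

interface LogEntry {
  createdAt: Date;
  extended?: boolean; // explicit opt-out from automatic purging
}

const RETENTION_DAYS = 14;
const DAY_MS = 24 * 60 * 60 * 1000;

export function shouldKeep(entry: LogEntry, now: Date = new Date()): boolean {
  if (entry.extended) return true;
  return now.getTime() - entry.createdAt.getTime() < RETENTION_DAYS * DAY_MS;
}

export function purgeExpired(entries: LogEntry[], now: Date = new Date()): LogEntry[] {
  return entries.filter((e) => shouldKeep(e, now));
}
```

The point of writing the rule as a pure function is that the scheduled job becomes trivial, and the policy is testable without touching the database.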

By the end of it, the investor’s compliance team sent a short email:

“Much improved. Clean work.”
Which, in audit-speak, is basically a warm hug.

The Post‑Audit Silence

After the chaos came a strange calm. The app felt lighter: cleaner code, a structured database, efficient infra. AI modules behaved predictably. Monitoring dashboards made sense.

It wasn’t just about passing an investor’s audit. It was about rediscovering discipline, the kind of quiet craft that separates builders from operators.

For the first time in months, we could trace exactly what every part of the system was doing. Every query logged, every endpoint monitored, every cost justified.

The Retro on LetRetro :)

Irony wasn’t lost on me — auditing a retrospective platform is the ultimate meta lesson.
The whole experience was a retrospective:

  • What went well: Great tech, fast delivery, users happy.
  • What went wrong: Ignored infrastructure, fuzzy data handling, weak compliance hygiene.
  • What to improve: Everything we touched.

What the Audit Really Taught Me

Here’s what I walked away with (with some pain and a lot of learning):

  1. Speed hides sins. Move fast for sure, but schedule housekeeping before you scale yourself into trouble.
  2. Infra debt is silent. It never shouts until the bill arrives.
  3. Your DB tells your story. If it’s a mess, so is your product.
  4. AI data must be respected. Once it leaves your control, there’s no undo button.
  5. Compliance is culture, not paperwork. Make it part of the build, not an afterthought.

And, That One Email

Two weeks after the audit, one of our investors replied to our final report with a smiley emoji and said,

“Now you’re building like a company, not just a product.”

That line hit differently.

The audit was painful, yes. But it forced me to mature as a builder, to see beyond code and into the craft of sustainable software.

So yes, I built an AI retro tool. Then I audited it. It broke me for a week.
But if you ask whether it was worth it, the answer’s simple.

Absolutely. And a genuine thank-you to our investor for sponsoring the audit; it saved us from a lot of future fears about our SaaS.

Now please, send coffee, not help.
