Jagoda Kuczkowska
2026 · Frontend Developer

Animal Helper — Operator Dashboard

A React Native operator dashboard for a non-profit animal rescue organization, built in 24 hours at the idea2impact hackathon. It features AI-powered report intake with Speech-to-Text, automatic duplicate detection, and an integrated case communication cockpit.


Overview

Animal Helper is a non-profit organization that shortens the path from a citizen's report to an actual rescue intervention. Their operators face a relentless stream of incoming cases — stressed callers, incomplete information, and around 5,000 reports per month — all under time pressure where every minute can matter for the animals involved.

At the idea2impact hackathon (March 7–8, 2026), a team of five of us had 24 hours to understand the problem, design a solution, and ship a working product. We placed 5th.

I took on two roles: leading the frontend development in React Native, and driving the stakeholder discovery — sitting down with Animal Helper representatives early in the hackathon to map their daily workflow before writing a single line of code. Those conversations shaped every feature we built.

Our goal was clear: reduce the number of repetitive, manual steps an operator has to take for every single case.


The Architecture & Tech Stack

React Native + Expo (Frontend)

The entire operator-facing UI is built with React Native and Expo, making it deployable as both a mobile app and a web interface. This was a deliberate choice — Animal Helper already had an existing technical ecosystem, and React Native gave us the cleanest path to integration without asking them to adopt a completely new stack.

Python (Backend)

The backend is written in Python, handling the AI model pipeline, report processing, and the REST API consumed by the frontend. Python was chosen for its strong ecosystem around NLP and machine learning, and for its fit with the organization's existing infrastructure.

MongoDB

Reports, communication history, and case relationships are stored in MongoDB. The schema-flexible model suited the nature of incoming reports — which are inherently messy, variable, and often incomplete by design.
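To make the schema flexibility concrete, here is a sketch of what a report document might look like. Field names and the status values are my illustrative assumptions, not the actual production schema; the optional fields reflect that incoming reports are often incomplete by design.

```typescript
// Illustrative report document shape (assumed field names, not the real schema).
interface ReportDoc {
  _id: string;
  createdAt: string;          // ISO timestamp
  status: "new" | "in_progress" | "closed";
  description: string;
  location?: { lat: number; lng: number };   // often missing at intake
  species?: string;
  incidentType?: string;
  priority?: "low" | "medium" | "high";
  animalCount?: number;
  linkedCaseIds: string[];    // duplicates linked by the operator
  messages: { to: string; body: string; sentAt: string }[]; // case history
}

const example: ReportDoc = {
  _id: "r-001",
  createdAt: "2026-03-07T10:15:00Z",
  status: "new",
  description: "Injured swan near the river bridge",
  species: "swan",
  linkedCaseIds: [],
  messages: [],
};
```

A document store tolerates this variability directly: a half-complete phone report and a fully detailed one live in the same collection without schema migrations.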

AI Models + Speech-to-Text

Two internally trained models drive the intelligence layer:

  • An STT (Speech-to-Text) engine that transcribes phone calls in real time
  • An AI analysis model that parses the transcript, suggests form field values, flags urgency, and identifies missing information — each suggestion accompanied by a confidence score
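The per-field suggestion payload the frontend consumes might be shaped roughly like this. This is a sketch under my own naming assumptions (the real API shape was hackathon-internal); the helper shows the idea of flagging low-confidence or missing fields for manual review rather than silently pre-filling them.

```typescript
// Assumed shape of the analysis result; field and type names are illustrative.
interface Suggestion<T> {
  value: T | null;        // null when the model could not extract the field
  confidence: number;     // 0..1, shown to the operator next to the field
}

interface AnalysisResult {
  description: Suggestion<string>;
  location: Suggestion<string>;
  species: Suggestion<string>;
  incidentType: Suggestion<string>;
  priority: Suggestion<"low" | "medium" | "high">;
  animalCount: Suggestion<number>;
  missingFields: string[];        // fields the model flags as unanswered
  clarifyingQuestions: string[];  // questions it suggests the operator ask
}

// Fields below the threshold get highlighted for review instead of being
// treated as settled answers.
function fieldsNeedingReview(result: AnalysisResult, threshold = 0.7): string[] {
  const entries = Object.entries(result).filter(
    ([, v]) => v && typeof v === "object" && "confidence" in v
  ) as [string, Suggestion<unknown>][];
  return entries
    .filter(([, s]) => s.value === null || s.confidence < threshold)
    .map(([name]) => name);
}
```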

Key Features

AI-Powered Report Intake via Speech-to-Text

Operators traditionally had to listen to a caller, type simultaneously, and re-ask for missing details — all under time pressure. With the STT pipeline, the call is transcribed in real time and the AI model immediately proposes values for every key field: event description, location, species, incident type, priority level, and number of animals.

The model also surfaces what's missing and suggests clarifying questions the operator can ask. Critically, every suggestion comes with a confidence level — the operator sees how certain the model is and can accept or override any field individually. No black box, no blind trust.

Duplicate Detection

With 5,000 reports per month, the same incident being reported by multiple witnesses is routine. The system automatically compares each new report against recent submissions and flags likely duplicates before the case is even saved.

When a potential match is found, the operator gets a clear notification and decides whether to link the cases. If they do, case statuses are synchronized — every citizen who filed a report about the same incident receives the same updates, without the operator manually sending separate emails to each one.
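The flag-before-save flow can be sketched with a simple heuristic: same species, reported close together in space and time. The actual matching was richer than this; the thresholds, field names, and distance approximation below are purely illustrative.

```typescript
// Minimal duplicate heuristic (illustrative only, not the production matcher).
interface IncomingReport {
  species: string;
  lat: number;
  lng: number;
  reportedAt: number; // epoch ms
}

function likelyDuplicates(
  incoming: IncomingReport,
  recent: (IncomingReport & { id: string })[],
  maxKm = 1,
  maxHours = 6
): string[] {
  const kmPerDegLat = 111; // rough degrees-to-km conversion
  return recent
    .filter((r) => r.species === incoming.species)
    .filter(
      (r) => Math.abs(r.reportedAt - incoming.reportedAt) <= maxHours * 3_600_000
    )
    .filter((r) => {
      // Planar approximation is fine at sub-kilometer scales.
      const dLat = (r.lat - incoming.lat) * kmPerDegLat;
      const dLng =
        (r.lng - incoming.lng) *
        kmPerDegLat *
        Math.cos((incoming.lat * Math.PI) / 180);
      return Math.hypot(dLat, dLng) <= maxKm;
    })
    .map((r) => r.id);
}
```

The key UX decision is unchanged regardless of the matching logic: the system only flags candidates, and the operator makes the final linking decision.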

Integrated Case Communication Cockpit

During our interviews, a fragmented communication problem emerged: part of a case lived in the internal system, part in personal email inboxes, part in messages that could never be tied back to a case. Operators were constantly leaving the app to handle email and losing context when they came back.

We built an email panel directly inside the case view. Operators can compose and send messages to both the reporting citizen and the relevant services without switching applications. The full message history is attached to the case record — any operator picking up a case has complete context immediately. Ready-to-send templates for common scenarios (e.g. unfounded reports) are available in one click.
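The one-click templates boil down to placeholder substitution from the case record before the operator reviews and sends. A minimal sketch, with the placeholder syntax and field names as my assumptions:

```typescript
// Hypothetical template helper: {{placeholders}} are filled from case fields;
// unknown placeholders stay visible so the operator notices missing data.
function fillTemplate(template: string, fields: Record<string, string>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    key in fields ? fields[key] : match
  );
}

const unfoundedTemplate =
  "Dear {{reporterName}}, after review, case {{caseId}} was closed as unfounded.";

const body = fillTemplate(unfoundedTemplate, {
  reporterName: "Anna",
  caseId: "AH-1042",
});
```

Leaving unresolved placeholders in place, rather than silently dropping them, is a small safeguard: a half-filled email is obvious at a glance during review.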

Supporting Improvements

Beyond the three core features, we also shipped:

  • Stale case reminders — cases with no update for a defined period surface automatically so nothing falls through the cracks
  • Next-step suggestions — the AI recommends follow-up actions based on case type and current status
  • Incomplete report filters — a dedicated view lets operators batch-process incomplete cases during quieter periods, rather than hunting for them in the main list
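The stale-case reminder is the simplest of these to sketch: filter open cases whose last update exceeds an idle threshold. The seven-day default below is an assumption; in practice the period would be configured to the organization's workflow.

```typescript
// Sketch of the stale-case check (threshold value is an assumption).
interface CaseSummary {
  id: string;
  status: "open" | "closed";
  lastUpdatedAt: number; // epoch ms
}

function staleCases(
  cases: CaseSummary[],
  now: number,
  maxIdleDays = 7
): CaseSummary[] {
  const maxIdleMs = maxIdleDays * 24 * 3_600_000;
  return cases.filter(
    (c) => c.status === "open" && now - c.lastUpdatedAt > maxIdleMs
  );
}
```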

The Challenge & Solution

24 Hours, 5 People, a Real Client

The hardest constraint wasn't technical — it was time. One day to go from "here's the problem" to "here's a working solution," with a real organization watching.

My first move was to resist the urge to start building immediately. I spent the first two hours of the hackathon running structured interviews with the Animal Helper team, mapping the operator workflow end to end: what does a typical call look like? Where does time get lost? What happens to an email that never gets linked to a case? What does "incomplete report" actually mean in practice?

That research directly shaped our priorities. The duplicate detection feature, for example, came from one sentence in those conversations: "Sometimes the same incident gets reported five times and we have no way of knowing." Without talking to the client first, we would have built something reasonable — but probably not that.

Splitting Work Without Losing Coherence

With five people and one day, parallel work was essential, but so was keeping the frontend and backend speaking the same language. I owned the frontend architecture and defined the API contract early; clear interface specs let the backend team build independently while I built the UI components against stubs.
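Here is the kind of contract we agreed on: fixed request/response shapes that the frontend can be built against with a stub long before the real endpoint exists. Endpoint semantics and field names below are illustrative, not the actual spec.

```typescript
// Illustrative contract for report creation (names are assumptions).
interface CreateReportRequest {
  transcript: string;                         // raw STT output
  operatorOverrides: Record<string, unknown>; // fields the operator corrected
}

interface CreateReportResponse {
  caseId: string;
  duplicateCandidates: string[]; // case IDs flagged before save
  missingFields: string[];       // fields still needing a value
}

// With the shapes fixed, the frontend runs against a stub until the
// backend endpoint lands; swapping in the real API changes no UI code.
function stubCreateReport(req: CreateReportRequest): CreateReportResponse {
  return {
    caseId: "stub-1",
    duplicateCandidates: [],
    missingFields: req.transcript.trim() ? [] : ["description"],
  };
}
```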

We shipped a functional end-to-end demo covering all three core features within the time limit.


Lessons Learned

Talk to the client before building. The two hours we spent on interviews were the highest-leverage hours of the entire hackathon. The best features we built came from things we heard, not things we assumed.

In a time-limited team sprint, interface contracts are everything. Agreeing on request/response shapes upfront meant backend and frontend could move in parallel without blocking each other — critical when you only have 24 hours.

Confidence scores make AI suggestions feel safe. Showing the operator how certain the model is — rather than just silently pre-filling the form — is the difference between a tool that helps and one that feels risky. Operators need to stay in control.

Scope ruthlessly. After the interviews we had a list of a dozen possible features. We shipped three well-executed ones instead of six half-finished ones. That focus contributed directly to our 5th place finish.

Tech Stack

React Native · Expo · TypeScript · Python · MongoDB · Speech-to-Text · AI/ML