I Built a Plant Health Monitor in One Night
How I used AI, timelapse visualization, and a health scoring engine to make a plant care app that actually tells you when your plants are struggling.
My fiddle leaf fig died last year. Not dramatically — just slowly, leaf by leaf, over three months of me convincing myself it was fine. By the time I admitted something was wrong, it was too late. I'd looked at it every day and missed every signal.
That's the problem PlantCam is built to solve.
The Core Insight
Plants communicate constantly. Brown edges, drooping leaves, yellowing — there's a whole language there. The issue is we're bad at noticing gradual change. If you see your plant every day, you adapt to its "new normal" and miss the drift.
A camera doesn't adapt. It sees objectively. So the insight was simple: use timelapse photography to make slow changes visible, then layer AI health scoring on top to tell you what you're actually seeing.
How the Health Score Works
The health scoring engine is the core of the app. Each plant gets a 0–100 score, updated daily, calculated from several signals:
```typescript
function calculateHealthScore(plant: Plant): number {
  let score = 100;
  const daysSinceWater = getDaysSince(plant.lastWatered);
  const wateringInterval = plant.species.wateringIntervalDays;

  // Overdue watering penalty (quadratic, capped at 40 points)
  if (daysSinceWater > wateringInterval) {
    const overdueDays = daysSinceWater - wateringInterval;
    score -= Math.min(40, overdueDays * overdueDays * 2);
  }

  // Visual health signals from latest photo analysis
  score -= plant.healthSignals.browning * 15;
  score -= plant.healthSignals.drooping * 20;
  score -= plant.healthSignals.yellowing * 10;
  score += plant.healthSignals.newGrowth * 5;

  return Math.max(0, Math.min(100, Math.round(score)));
}
```

The visual signals come from image analysis — in production this would be a vision model, but the demo runs on pre-computed signal data that mirrors real plant behavior.
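To make the penalty math concrete, here is the same scoring logic run end to end on a plant five days overdue with mild browning. The `Plant` shape, the `getDaysSince` helper, and the numbers are illustrative assumptions for this sketch, not the app's actual schema:

```typescript
// Hypothetical shapes mirroring the scoring function -- the real app's
// types may differ.
interface HealthSignals {
  browning: number;   // 0-1 severity from image analysis
  drooping: number;
  yellowing: number;
  newGrowth: number;
}

interface Plant {
  lastWatered: Date;
  species: { wateringIntervalDays: number };
  healthSignals: HealthSignals;
}

// Whole days elapsed since a timestamp.
function getDaysSince(date: Date): number {
  return Math.floor((Date.now() - date.getTime()) / 86_400_000);
}

function calculateHealthScore(plant: Plant): number {
  let score = 100;
  const daysSinceWater = getDaysSince(plant.lastWatered);
  const wateringInterval = plant.species.wateringIntervalDays;
  if (daysSinceWater > wateringInterval) {
    const overdueDays = daysSinceWater - wateringInterval;
    score -= Math.min(40, overdueDays * overdueDays * 2); // quadratic, capped
  }
  score -= plant.healthSignals.browning * 15;
  score -= plant.healthSignals.drooping * 20;
  score -= plant.healthSignals.yellowing * 10;
  score += plant.healthSignals.newGrowth * 5;
  return Math.max(0, Math.min(100, Math.round(score)));
}

// Illustrative fixture: watered 12 days ago on a 7-day interval, mild browning.
const frank: Plant = {
  lastWatered: new Date(Date.now() - 12 * 86_400_000),
  species: { wateringIntervalDays: 7 },
  healthSignals: { browning: 0.5, drooping: 0, yellowing: 0, newGrowth: 0 },
};

// Overdue penalty: min(40, 5 * 5 * 2) = 40; browning: 0.5 * 15 = 7.5
// 100 - 40 - 7.5 = 52.5, rounded to 53
calculateHealthScore(frank); // 53
```

Note how quickly the quadratic term saturates: five days overdue already hits the 40-point cap, which is what pushes a neglected plant into "act now" territory fast.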
The Timelapse Visualization
This was the fun engineering challenge. The app simulates timelapse playback using a photo history array + CSS transitions. Each "frame" is a snapshot card with health metadata overlay. The effect of watching a plant's score climb or fall over weeks is genuinely useful — you see the pattern, not just the current state.
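A sketch of how that playback could be driven, assuming each snapshot carries its photo URL and that day's score. `Snapshot`, `frameIndex`, and `playTimelapse` are illustrative names, not the app's actual components:

```typescript
// Hypothetical snapshot record: one "frame" card in the timelapse.
interface Snapshot {
  photoUrl: string;
  takenAt: string;     // ISO date of the photo
  healthScore: number; // that day's 0-100 score, shown in the overlay
}

// Map elapsed playback time to a position in the photo history,
// clamping at the final frame.
function frameIndex(historyLength: number, elapsedMs: number, frameMs = 400): number {
  return Math.min(historyLength - 1, Math.floor(elapsedMs / frameMs));
}

// Drive the playback: hand each frame to a renderer, which in the app would
// swap the card's image and health-metadata overlay and let a CSS opacity
// transition cross-fade between frames.
function playTimelapse(
  history: Snapshot[],
  renderFrame: (frame: Snapshot) => void,
  frameMs = 400,
): () => void {
  const start = Date.now();
  const timer = setInterval(() => {
    const i = frameIndex(history.length, Date.now() - start, frameMs);
    renderFrame(history[i]);
    if (i === history.length - 1) clearInterval(timer);
  }, frameMs);
  return () => clearInterval(timer); // cancel handle for unmount
}
```

Keeping the time-to-frame mapping as a pure function (`frameIndex`) makes the playback easy to scrub and easy to test, independent of any timer or DOM.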
What Surprised Me
The demo data design was as hard as the engineering. I spent 30 minutes just making sure the demo plants told a compelling story: one thriving, one overdue for water, one mid-recovery, one struggling. The app needs to show its value immediately on first load — before any real plants are added.
The "Frank the Fiddle Leaf Fig — WATER TODAY" card does more to explain the product than any headline.
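One way that card copy could fall out of the data rather than being hand-written: derive each card's call-to-action from the watering state and score. The `cardStatus` helper and its thresholds below are illustrative assumptions, not the app's actual rules:

```typescript
type CardStatus = "WATER TODAY" | "NEEDS ATTENTION" | "RECOVERING" | "THRIVING";

// Hypothetical helper: pick the card's headline from the plant's state.
// Thresholds are illustrative.
function cardStatus(daysOverdue: number, score: number, scoreTrend: number): CardStatus {
  if (daysOverdue > 0) return "WATER TODAY";        // an overdue watering always wins
  if (score < 50) return "NEEDS ATTENTION";         // low score, no single obvious fix
  if (scoreTrend > 0 && score < 80) return "RECOVERING"; // score climbing back up
  return "THRIVING";
}

// Frank, 5 days overdue with a sagging score, gets the card that sells the product:
cardStatus(5, 53, -2); // "WATER TODAY"
```

Because the headline is computed, every demo plant's story stays consistent with its numbers — the "mid-recovery" plant reads "RECOVERING" precisely because its score trend is positive.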
What I'd Build Next
- Real camera integration via the browser `getUserMedia` API for mobile
- Push notifications for watering reminders (service workers + local notifications)
- Species recognition on first add (photo → identify plant → auto-set care schedule)
- Community care notes: "what worked for others with this species"
- Export timelapse as shareable GIF
Try It
PlantCam is live at [plantcam.limed.tech](https://plantcam.limed.tech). Add your plants, track their health, and stop losing fiddle leaf figs to slow-motion neglect.
Built by Jacobo in one night. Yes, one night. That's the whole point.
Ready to keep your plants thriving?
Try PlantCam Free