No legalese. Just what actually happens.
When you upload a photo
Three things happen before the AI even sees it:
- GPS and metadata stripped. Your phone embeds location, device info, and timestamps into every photo. We remove all of it before processing or storage.
- Faces detected and blurred. If anyone is in the photo, their face gets blurred automatically. This happens before the AI analyzes the room.
- Content moderation. We check every photo for inappropriate content and reject anything flagged. This keeps the platform safe.
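Curious what "metadata stripped" means in practice? We won't reproduce our exact pipeline here, but as a rough illustration (using the standard JPEG layout, not our production code), EXIF data lives in a JPEG's APP1 segment, and dropping that segment removes GPS, device info, and timestamps while leaving the picture itself untouched:

```python
import struct

def strip_exif(jpeg_bytes: bytes) -> bytes:
    """Remove EXIF (APP1) segments from a JPEG, keeping pixel data intact."""
    assert jpeg_bytes[:2] == b"\xff\xd8", "not a JPEG"
    out = bytearray(b"\xff\xd8")  # keep the start-of-image marker
    i = 2
    while i < len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:
            # Unexpected byte: copy the rest verbatim and stop parsing
            out += jpeg_bytes[i:]
            break
        marker = jpeg_bytes[i + 1]
        if marker == 0xDA:  # start-of-scan: actual image data follows
            out += jpeg_bytes[i:]
            break
        length = struct.unpack(">H", jpeg_bytes[i + 2:i + 4])[0]
        segment = jpeg_bytes[i:i + 2 + length]
        if marker != 0xE1:  # drop APP1 (EXIF); keep every other segment
            out += segment
        i += 2 + length
    return bytes(out)
```

The point: the clean file has the same pixels, but no longer says where or when the photo was taken, or on what device.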
What the AI sees
The processed photo (no metadata, faces blurred) is sent to Google's Gemini API. Gemini identifies items in the room, suggests where they go, and generates your action plan. Google does not use your photos for training. We have a zero-training-data agreement.
What we store
Your photos, scan results, action plans, and corrections are stored in your account on Supabase (hosted on AWS). Each account is isolated with row-level security -- you can only see your own data. Photos live in encrypted cloud storage, accessible only through short-lived signed URLs.
What we never do
- ✕ Sell your data to anyone
- ✕ Train AI models on your photos
- ✕ Show ads based on your scans
- ✕ Share your photos with anyone except the processing services listed below
- ✕ Keep your data after you delete it
How to delete everything
Delete individual scans from the scan detail page. Delete your entire account from Settings. When you delete, it's gone. No 30-day hold, no "we keep backups." Gone.
Third-party services
- Google OAuth -- sign-in only
- Google Gemini API -- analyzes your photo, generates the plan. Not used for training.
- Google Cloud Vision -- face detection and content moderation only
- Stripe -- payment processing. We never see your full card number.
- Supabase (AWS) -- database and file storage
For the full legal version, see our Privacy Policy. Questions? [email protected].