If someone breaches a fitness app, they learn your step count. If someone breaches a kink app, they learn your punishment log, your pet name, what you were told to do last Tuesday, and exactly how you responded.
That’s life-altering. Outing. Relationship damage. Professional consequences. In parts of the world, it’s a safety issue.
The community has noticed
The kink community didn’t need a security researcher to tell them this was a problem. There’s an entire OPSEC culture that predates the app era: pseudonyms, burner emails, avoiding identifiable photos, never linking a kink profile to a real name. People in the scene know intuitively that their data is a liability if it leaks, and they’ve been acting accordingly for years.
What’s changed is that more of the dynamic is moving into apps. Task assignments, completion notes, intimate feedback, punishment details. Things that used to live in a DM or a conversation are now stored in someone’s database. The attack surface has grown, and the community’s wariness is justified.
What’s actually happened
This isn’t theoretical. Here’s a partial history of what happens when intimate apps fail their users.
1.5 million private photos exposed from kink and LGBTQ+ dating apps (2025). A group of five kink and LGBTQ+ dating apps, all built by the same developer, left user photos sitting in unprotected cloud storage buckets. Over 541,000 images came from a single BDSM dating app alone. The exposed photos included profile pictures, private DM images, verification photos, and even images that had been removed by moderators. The root cause: API keys hardcoded in plaintext in the app code, leading researchers straight to the unsecured storage. The developer was notified in January 2025 and didn’t act until the BBC contacted them months later.
Ashley Madison (2015). 36 million accounts leaked. At least two suicides were linked to the exposure. Extortion campaigns targeting exposed users continue to this day: nearly a decade after the breach, people are still receiving blackmail emails.
Adult FriendFinder (2016). 412 million accounts exposed, including 15 million “deleted” accounts that were never actually purged from the database. Passwords were stored in plaintext or hashed with unsalted SHA-1, which is trivially crackable.
Grindr (2017–present). User location data was sold to data brokers, then used to track and out a Catholic priest, Monsignor Jeffrey Burrill, forcing his resignation in 2021. Norway’s data protection authority imposed a fine of approximately 65 million NOK (~6.5 million EUR) for sharing user data with advertisers without valid consent.
Feeld (2024). Security researchers at Fortbridge found broken access controls that allowed unauthenticated API requests to access private messages, photos, and sensitive user data. The vulnerabilities were reported in March 2024 and fixed by May, but a communication lapse meant the public disclosure happened before most users learned about the issue.
The most popular BDSM task app. This one hits closest to home. The market leader in our space (we compared it and others in our best BDSM apps for couples roundup), the app with thousands of reviews and the biggest install base, has a privacy policy that explicitly states user data is “not end-to-end encrypted.” The policy recommends switching to Signal for sensitive conversations. They use Firebase Analytics and Crashlytics, third-party services that touch user data. Apple’s App Store privacy label shows they collect “Sensitive Info” and link it to user identity. There’s no published security documentation beyond the privacy policy itself. This is the app most people in the D/s community are using, and that’s the security posture behind it.
Mozilla’s 2024 dating app review. Mozilla’s Privacy Not Included project studied 25 dating apps and gave 22 of them their worst privacy rating. 80% of the apps may share or sell user data for advertising. The report concluded that dating apps have gotten worse for privacy since their 2021 review, not better.
Why most apps fall short
When an app says “your data is encrypted,” they almost always mean encrypted at rest, which is disk-level encryption provided by the cloud provider (AWS, Google Cloud, Azure). It’s on by default. It costs nothing. And it protects against exactly one scenario: someone physically stealing the hard drive from the data center.
It does not protect against:
- A database breach. An attacker who gets a database dump sees all content in plaintext.
- An insider with database access. A developer, contractor, or anyone with credentials.
- A misconfigured backup. Database snapshots exported or restored without access controls.
- A subpoena for database contents. A court order returns readable data.
This is the encryption that the market leader in BDSM task apps uses, and it’s what most apps mean when they say “encrypted.” Technically true. Practically meaningless for the threats that actually matter to kink users.
The difference is between disk encryption (the default) and field-level encryption, where each piece of user-generated content is individually encrypted before it hits the database. With field-level encryption, a database dump yields ciphertext. Not your punishment log. Not your pet name. Ciphertext.
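To make the distinction concrete, here is a minimal sketch of the field-level pattern in Go: each value is encrypted with AES-256-GCM before the write, so a database dump contains only ciphertext. This is an illustration of the general technique, not SubTasks' actual code; the example value is made up, and holding the key in a local variable (rather than a KMS) is a simplification.

```go
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"encoding/hex"
	"errors"
	"fmt"
)

// encryptField encrypts one value with AES-256-GCM before it is written
// to the database. The stored bytes are nonce || ciphertext.
func encryptField(key, plaintext []byte) ([]byte, error) {
	block, err := aes.NewCipher(key) // a 32-byte key selects AES-256
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	nonce := make([]byte, gcm.NonceSize())
	if _, err := rand.Read(nonce); err != nil {
		return nil, err
	}
	return gcm.Seal(nonce, nonce, plaintext, nil), nil
}

// decryptField reverses encryptField for an authorized reader.
func decryptField(key, stored []byte) ([]byte, error) {
	block, err := aes.NewCipher(key)
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	if len(stored) < gcm.NonceSize() {
		return nil, errors.New("stored value too short")
	}
	nonce, ct := stored[:gcm.NonceSize()], stored[gcm.NonceSize():]
	return gcm.Open(nil, nonce, ct, nil)
}

func main() {
	key := make([]byte, 32) // in production this lives in a KMS, never in code
	rand.Read(key)

	stored, _ := encryptField(key, []byte("pet name: sparrow"))
	fmt.Println("what a database dump shows:", hex.EncodeToString(stored))

	plain, _ := decryptField(key, stored)
	fmt.Println("what an authorized read shows:", string(plain))
}
```

The point of the dump line is the threat model from the list above: an attacker with a database export, or an insider with credentials, sees only the hex-encoded ciphertext unless they also hold the key.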
Here’s how the market leader in BDSM task apps compares to what we’ve built:
| | Market Leader | SubTasks |
|---|---|---|
| Field-level encryption | No | Yes (AES-256, 14 fields, 7 tables) |
| DB dump = plaintext? | Yes | No (ciphertext only) |
| Key rotation | N/A (no field keys to rotate) | Automatic annual rotation |
| Developer can read your data? | Unknown (not addressed) | Explicit deny on decryption |
| UGC in logs | Unknown | Audited and stripped |
| Third-party analytics on content | Firebase, Crashlytics | None |
| Regular security audits | No | Yes |
| Published security documentation | None (privacy policy only) | Yes |
What SubTasks does
We’re building a D/s task app. Our users type things into our database that they wouldn’t say out loud in most rooms. We take that seriously, and we think you should know exactly how seriously.
Here’s what we’ve built, in plain language.
Field-level AES-256 encryption on every piece of user-generated content. Task titles, descriptions, completion notes, punishment details, reward names, pet names. 14 fields across 7 tables. Every value is encrypted before it’s written to the database. The encryption keys rotate automatically on an annual cycle, and our own developer credentials have an explicit Deny on decryption. We tested it. Access denied. That’s by design.
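An “explicit Deny” like this is normally enforced in the key policy itself rather than in application code. In AWS KMS terms it looks roughly like the statement below; this is purely an illustration of the mechanism (the account ID and role name are hypothetical, and we're not stating which cloud provider SubTasks runs on).

```json
{
  "Sid": "DenyDeveloperDecrypt",
  "Effect": "Deny",
  "Principal": { "AWS": "arn:aws:iam::111122223333:role/developer" },
  "Action": "kms:Decrypt",
  "Resource": "*"
}
```

In IAM policy evaluation, an explicit Deny overrides any Allow, which is why the “access denied” result we tested for holds even if a developer role is granted broad permissions elsewhere.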
No user-generated content in logs. We audited every log statement in our backend. Task titles, descriptions, intimate details: none of it appears in our application logs or error tracking. Our crash reporter is configured to never collect personally identifiable information. Logs contain IDs, counts, and system events. Not your content.
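The pattern behind that audit is simple to sketch: every event passes through a sanitizer before it reaches the logger, and any field known to carry user-generated content is redacted. A toy version in Go (the field names here are hypothetical, not our actual schema):

```go
package main

import "fmt"

// ugcFields lists event keys that carry user-generated content.
// Everything else (IDs, counts, system events) passes through.
var ugcFields = map[string]bool{
	"title":       true,
	"description": true,
	"note":        true,
}

// sanitizeForLog returns a copy of an event with all UGC redacted,
// so application logs and error tracking never see content.
func sanitizeForLog(event map[string]string) map[string]string {
	clean := make(map[string]string, len(event))
	for k, v := range event {
		if ugcFields[k] {
			clean[k] = "[redacted]"
			continue
		}
		clean[k] = v
	}
	return clean
}

func main() {
	event := map[string]string{
		"task_id": "t_1042",
		"status":  "completed",
		"title":   "something you would not say out loud",
	}
	fmt.Println(sanitizeForLog(event))
}
```

A deny-by-default variant (redact everything not on an allowlist) is stricter still; the sketch above shows the minimum shape of the idea.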
No third-party analytics tracking your content. We don’t use any analytics service that processes your content. Our only third-party integration is crash reporting, and it collects no PII. That’s it.
Account deletion means deletion. Delete your account and your data is permanently removed. Not soft-deleted. Not retained for 90 days. Gone.
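The gap between “deleted” and deleted is exactly what bit Adult FriendFinder's users: a soft-delete flag keeps every byte of data in the table. A toy contrast in Go, using an in-memory map as a stand-in for a database (illustrative only, not our storage code):

```go
package main

import "fmt"

// account is a toy record in an in-memory store.
type account struct {
	ID      string
	Deleted bool // soft-delete flag; the data is still there
}

// softDelete marks the row but retains all of its data,
// so a later breach or dump can still expose it.
func softDelete(db map[string]*account, id string) {
	if a, ok := db[id]; ok {
		a.Deleted = true
	}
}

// hardDelete removes the row entirely; a later dump has nothing to leak.
func hardDelete(db map[string]*account, id string) {
	delete(db, id)
}

func main() {
	db := map[string]*account{"u1": {ID: "u1"}}
	softDelete(db, "u1")
	fmt.Println("rows after soft delete:", len(db)) // still present
	hardDelete(db, "u1")
	fmt.Println("rows after hard delete:", len(db)) // gone
}
```

In a real system “gone” also has to cover backups and replicas, which is why purge behavior belongs in writing, not just in the delete handler.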
Regular security audits. We run periodic security audits of our own infrastructure. Not because a regulator told us to, but because we think it’s the minimum for handling this kind of data. No competitor in this space has published anything similar.
The honest limits
We’re not going to pretend we’ve solved every problem.
We’re not end-to-end encrypted. Your partner needs to see your tasks, your completions, your notes. That’s the entire point of the app. True E2E encryption, where keys live exclusively on your devices and the server never sees plaintext, is a different architecture. It’s on our roadmap, not in production.
We’re a small team. That means fewer eyes on the code. It also means a smaller attack surface, no data monetization incentive, and no third-party contractors with database access. Trade-offs.
We’re still building. GDPR data export is on the roadmap. We don’t have a third-party security audit yet. It’s something we’re working toward. We’d rather be honest about where we are than pretend we’re further along.
What to look for in any app
If you’re evaluating any app that handles intimate data (not just ours), here’s a short checklist:
- Does the app explain how your data is encrypted? “We use encryption” means nothing. Disk encryption is the default. Ask whether content is encrypted at the field or application level, or just at rest on the provider’s infrastructure.
- Can you delete your account and all your data? Not deactivate. Delete. And does the company actually purge it? (Adult FriendFinder kept 15 million “deleted” accounts.)
- Does the app use third-party analytics that touch your content? Firebase Analytics, Google Analytics, and similar services process data on third-party infrastructure. Know what’s being sent.
- Has the developer published anything about their security practices beyond a boilerplate privacy policy? A privacy policy is a legal document. It tells you what the company is allowed to do. Security documentation tells you what they’ve actually built.
- Does the privacy policy explicitly say what they can and can’t see? If it doesn’t address developer access to your data, assume they have it.
If you’re coming from the most popular BDSM task app, we also wrote about free alternatives to Obedience that take privacy more seriously.
We built SubTasks because we needed it ourselves, and we built the security architecture because we’d want it if we were the ones typing our punishment log into someone else’s app. If you’re looking for a D/s task app that takes this seriously, that’s what we’re building.