Artificial Intelligence Privacy

What Momentum does to secure your data relating to the use of AI



This guide outlines our approach to ensuring data privacy while leveraging AI capabilities through the "Anonymous In, Personalized Out" process.

While data security measures guard against threats and malicious actors, our primary focus here is on safeguarding data ownership and preventing its use in external AI algorithms.

Handling Personally Identifiable Information (PII) with the "Anonymous In, Personalized Out" Approach

Personally Identifiable Information (PII) refers to data elements that can uniquely identify an individual, such as names, addresses, and phone numbers. In discussions about data privacy, PII is the guiding concern when balancing personalized experiences against confidential data handling.

Any data that qualifies as PII is stripped from database records before anything is sent to the AI, so only non-PII data reaches the models. Each PII value is replaced with a randomized hash, which keeps records distinct without making them identifiable.

The anonymized donor data is then fed into the AI models, and once outputs are finalized, each hash is replaced with the original identifiable data points.

No PII is ever passed into any AI models.

Strategic Data Processing in the "Anonymous In, Personalized Out" Workflow

The workflow of the "Anonymous In, Personalized Out" process follows a systematic approach:

  1. Anonymization: PII is removed from each record, leaving only the contextual data essential for analysis.

  2. AI Model Processing: The anonymized data is processed through our AI models, which operate solely on the contextual data to generate accurate outcomes.

  3. Personalization with Privacy: The AI-generated outputs are re-personalized by matching outputs to original (identifiable) donor data. Only at this point, once the AI models have finished processing, do we reintroduce personalized elements while upholding data privacy.
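To make the three steps above concrete, here is a minimal sketch of the round trip in Python. The field names, record shape, and the simulated "model" step are illustrative assumptions, not Momentum's actual implementation; the point is the pattern of tokenizing PII on the way in and restoring it on the way out.

```python
import secrets

# Hypothetical PII field names, chosen for illustration only.
PII_FIELDS = {"name", "address", "phone", "email"}

def anonymize(record):
    """Step 1: replace each PII value with a random token; keep a local mapping."""
    mapping = {}
    anonymized = {}
    for key, value in record.items():
        if key in PII_FIELDS:
            token = secrets.token_hex(8)  # randomized stand-in, unique per value
            mapping[token] = value
            anonymized[key] = token
        else:
            anonymized[key] = value  # contextual, non-PII data passes through
    return anonymized, mapping

def repersonalize(text, mapping):
    """Step 3: swap each token in the AI output back to the original value."""
    for token, original in mapping.items():
        text = text.replace(token, original)
    return text

donor = {"name": "Jane Donor", "email": "jane@example.org", "last_gift": 250}
anon, mapping = anonymize(donor)

# Step 2 (AI model processing) is simulated: the model only ever sees tokens.
draft = f"Dear {anon['name']}, thank you for your recent gift of $250."

personalized = repersonalize(draft, mapping)
# The final draft names the donor, but the model never saw their PII.
```

The mapping from tokens back to identifiable values never leaves the trusted side of the boundary; only the anonymized record crosses into the model.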

A demonstration of our “Anonymous In, Personalized Out” approach to preserving privacy using AI models

Conclusion

In conclusion, our “Anonymous In, Personalized Out” approach preserves user and organizational data integrity and ensures that no PII is ever used in an artificial intelligence algorithm or model.

We encourage you to read all of our documentation on security and privacy. If you have any specific issues or requests, please contact your customer success manager.