Context Masking
A privacy technique that pseudonymizes sensitive entities in data before sending it to an external AI model for processing.
When BasaltHQ agents need to leverage powerful external LLMs for complex reasoning, Context Masking ensures no proprietary data leaks. Before a prompt leaves the BasaltHQ perimeter, an automated masking layer replaces all sensitive entities: company names become "Entity A," dollar amounts become "Value X," and employee names become "Person 1." The external LLM performs its reasoning on this sanitized data. When the response returns, BasaltHQ's reconstruction layer maps the pseudonyms back to the real values inside your secure environment. The external model never sees your actual data.
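The masking/reconstruction round trip described above can be sketched as follows. This is a minimal illustration, not BasaltHQ's actual implementation: the entity registry, pseudonym scheme ("Entity A," "Person 1," "Value 1"), and the dollar-amount regex are all assumptions for the example.

```python
import re
from itertools import count

def mask(text: str, known_entities: dict[str, str]) -> tuple[str, dict[str, str]]:
    """Replace sensitive entities with pseudonyms before the prompt
    leaves the trusted perimeter. Returns (masked_text, mapping)."""
    mapping: dict[str, str] = {}
    masked = text
    # Known entities (e.g. company and employee names) come from a
    # hypothetical internal registry mapping real value -> pseudonym.
    for real, pseudonym in known_entities.items():
        if real in masked:
            masked = masked.replace(real, pseudonym)
            mapping[pseudonym] = real
    # Dollar amounts are detected by pattern and numbered sequentially.
    counter = count(1)
    def replace_amount(match: re.Match) -> str:
        pseudonym = f"Value {next(counter)}"
        mapping[pseudonym] = match.group(0)
        return pseudonym
    masked = re.sub(r"\$[\d,]+(?:\.\d+)?", replace_amount, masked)
    return masked, mapping

def unmask(text: str, mapping: dict[str, str]) -> str:
    """Map pseudonyms in the model's response back to the real values,
    inside the secure environment."""
    for pseudonym, real in mapping.items():
        text = text.replace(pseudonym, real)
    return text

# Example round trip:
prompt = "Acme Corp paid Jane Doe $45,000 in Q3."
masked, mapping = mask(prompt, {"Acme Corp": "Entity A", "Jane Doe": "Person 1"})
# masked == "Entity A paid Person 1 Value 1 in Q3."
restored = unmask(masked, mapping)
# restored == prompt
```

Only the masked text is sent to the external LLM; the mapping never leaves the secure environment, which is the property the technique depends on.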
Related Concepts
See also:
Zero-Trust Architecture
A security model that requires strict identity verification for every person and device attempting to access resources, regardless of their network location.
Data Sovereignty
The principle that data is subject to the laws and governance structures of the nation or organization where it is collected or stored.
Homomorphic Encryption
A form of encryption that permits computations to be performed on ciphertext, producing an encrypted result that matches the result of operations performed on plaintext.