Large Blob Storage - FeitianTech/postquantum-webauthn-platform GitHub Wiki
- Introduction
- Architecture Overview
- Core Components
- Chunked Transfer Mechanism
- Encryption and Security
- Integration with Extensions Framework
- PIN Authentication and Access Control
- Test Examples and Usage Patterns
- Security Considerations
- Performance Implications
- Troubleshooting Guide
- Conclusion
The Large Blob Storage extension is a CTAP 2.1 feature that enables WebAuthn authenticators to store and retrieve large amounts of data associated with individual credentials. It addresses the size limits of traditional WebAuthn credential storage by providing a mechanism to store substantial data (typically up to several kilobytes) alongside each credential, enhancing the functionality of resident credentials while maintaining security and privacy.
The extension operates through a chunked transfer mechanism that divides large data into manageable fragments, employs AES-256-GCM encryption for data confidentiality and integrity, and integrates with the WebAuthn Extensions framework for credential-specific data storage.
The Large Blob Storage system consists of several interconnected components that work together to provide secure, efficient storage and retrieval of large credential-associated data.
```mermaid
graph TB
    subgraph "Client Layer"
        Client[Web Application]
        Extensions[Extensions Framework]
        LargeBlobs[LargeBlobs Class]
    end
    subgraph "Authentication Layer"
        CTAP2[CTAP2 Protocol]
        PinProtocol[PIN/UV Protocol]
        ECDH[ECDH Key Exchange]
    end
    subgraph "Storage Layer"
        BlobArray[Blob Array Storage]
        Encryption[AES-GCM Encryption]
        Compression[ZLIB Compression]
    end
    Client --> Extensions
    Extensions --> LargeBlobs
    LargeBlobs --> CTAP2
    CTAP2 --> PinProtocol
    PinProtocol --> ECDH
    CTAP2 --> BlobArray
    BlobArray --> Encryption
    Encryption --> Compression
```
Diagram sources
- fido2/ctap2/blob.py
- fido2/ctap2/extensions.py
- fido2/ctap2/pin.py
The LargeBlobs class serves as the primary interface for interacting with the large blob storage functionality. It encapsulates the chunked transfer mechanism and encryption operations, and provides a clean API for read, write, and control operations.
```mermaid
classDiagram
    class LargeBlobs {
        +Ctap2 ctap
        +int max_fragment_length
        +_PinUv pin_uv
        +is_supported(info) bool
        +read_blob_array() Sequence~Mapping~
        +write_blob_array(blob_array) None
        +get_blob(large_blob_key) bytes
        +put_blob(large_blob_key, data) None
        +delete_blob(large_blob_key) None
    }
    class PinProtocol {
        <<abstract>>
        +VERSION int
        +encapsulate(peer_cose_key) tuple
        +encrypt(key, plaintext) bytes
        +decrypt(key, ciphertext) bytes
        +authenticate(key, message) bytes
        +validate_token(token) bytes
    }
    class PinProtocolV1 {
        +VERSION 1
        +kdf(z) bytes
        +IV bytes
    }
    class PinProtocolV2 {
        +VERSION 2
        +HKDF_SALT bytes
        +HKDF_INFO_HMAC bytes
        +HKDF_INFO_AES bytes
    }
    LargeBlobs --> PinProtocol : uses
    PinProtocol <|-- PinProtocolV1
    PinProtocol <|-- PinProtocolV2
```
Diagram sources
- fido2/ctap2/blob.py
- fido2/ctap2/pin.py
The Large Blob extension integrates with the WebAuthn Extensions framework through the LargeBlobExtension class, which handles the registration and authentication phases of the extension lifecycle.
```mermaid
classDiagram
    class LargeBlobExtension {
        +NAME "largeBlobKey"
        +is_supported(ctap) bool
        +make_credential(ctap, options, pin_protocol) Processor
        +get_assertion(ctap, options, pin_protocol) Processor
    }
    class RegistrationExtensionProcessor {
        +prepare_inputs(pin_token) dict
        +prepare_outputs(response, pin_token) dict
    }
    class AuthenticationExtensionProcessor {
        +prepare_inputs(selected, pin_token) dict
        +prepare_outputs(response, pin_token) dict
    }
    LargeBlobExtension --> RegistrationExtensionProcessor
    LargeBlobExtension --> AuthenticationExtensionProcessor
```
Diagram sources
- fido2/ctap2/extensions.py
Section sources
- fido2/ctap2/blob.py
- fido2/ctap2/extensions.py
The Large Blob Storage extension implements a sophisticated chunked transfer mechanism to handle large data efficiently while respecting authenticator message size limitations.
The system divides large data into fragments using a calculated maximum fragment length:
```mermaid
flowchart TD
    Start([Data Input]) --> CalcSize["Calculate Total Size<br/>size = len(data)"]
    CalcSize --> InitOffset["Initialize Offset = 0"]
    InitOffset --> CheckOffset{"offset < size?"}
    CheckOffset --> |No| Complete([Transfer Complete])
    CheckOffset --> |Yes| CalcFragment["Calculate Fragment Length<br/>ln = min(size - offset, max_fragment_length)"]
    CalcFragment --> ExtractData["Extract Data Fragment<br/>_set = data[offset:offset+ln]"]
    ExtractData --> CheckPin{"PIN/UV Required?"}
    CheckPin --> |Yes| PrepareAuth["Prepare PIN/UV Authentication<br/>msg = ff*32 + 0c00 + offset + sha256(_set)"]
    CheckPin --> |No| NoAuth["Set pin_uv_param = None"]
    PrepareAuth --> AuthToken["Generate Authentication Token<br/>pin_uv_param = protocol.authenticate(token, msg)"]
    NoAuth --> SendFragment["Send Fragment via CTAP2.large_blobs()<br/>offset, set=_set, length=size if offset==0"]
    AuthToken --> SendFragment
    SendFragment --> UpdateOffset["offset += ln"]
    UpdateOffset --> CheckOffset
```
Diagram sources
- fido2/ctap2/blob.py
Each fragment is identified by its offset position within the complete data stream, enabling precise reconstruction during read operations. The sequence numbering ensures data integrity and allows for partial reads and writes.
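The fragmentation loop and per-fragment authentication message above can be sketched in Python. This is a minimal illustration rather than the library's code: the `0xff` padding, `0x0c00` command prefix, and SHA-256-of-fragment layout come from the flowchart, and the 4-byte little-endian offset encoding follows CTAP 2.1.

```python
import hashlib
import struct

def iter_fragments(data: bytes, max_fragment_length: int):
    """Yield (offset, fragment) pairs covering the whole payload."""
    size = len(data)
    offset = 0
    while offset < size:
        ln = min(size - offset, max_fragment_length)
        yield offset, data[offset:offset + ln]
        offset += ln

def write_auth_message(offset: int, fragment: bytes) -> bytes:
    """Build the message that is MAC'd with the PIN/UV token per fragment."""
    return (b"\xff" * 32                       # 32 bytes of 0xff
            + b"\x0c\x00"                      # largeBlobs command prefix
            + struct.pack("<I", offset)        # 4-byte little-endian offset
            + hashlib.sha256(fragment).digest())
```

Each yielded fragment would then be sent via `CTAP2.large_blobs()`, with the total length included only on the first (offset 0) call.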
The read mechanism employs a streaming approach to reconstruct complete data from fragments:
```mermaid
sequenceDiagram
    participant Client as Client Application
    participant LargeBlobs as LargeBlobs Class
    participant CTAP2 as CTAP2 Protocol
    participant Authenticator as Authenticator
    Client->>LargeBlobs : read_blob_array()
    LargeBlobs->>LargeBlobs : Initialize offset = 0, buffer = b""
    loop Until Complete
        LargeBlobs->>CTAP2 : large_blobs(offset, get=max_fragment_length)
        CTAP2->>Authenticator : Request fragment
        Authenticator-->>CTAP2 : Return fragment
        CTAP2-->>LargeBlobs : Fragment data
        LargeBlobs->>LargeBlobs : Append fragment to buffer
        LargeBlobs->>LargeBlobs : Check if fragment < max_fragment_length
        alt More fragments needed
            LargeBlobs->>LargeBlobs : offset += max_fragment_length
        else Last fragment received
            LargeBlobs->>LargeBlobs : Extract data and checksum
            LargeBlobs->>LargeBlobs : Verify checksum
            LargeBlobs-->>Client : Return CBOR-decoded array
        end
    end
```
Diagram sources
- fido2/ctap2/blob.py
Section sources
- fido2/ctap2/blob.py
The Large Blob Storage extension employs AES-256-GCM (Galois/Counter Mode) authenticated encryption for data confidentiality and integrity protection. Each blob entry is encrypted independently under the largeBlobKey of the credential it belongs to.
```mermaid
flowchart TD
    Data[Raw Data] --> Compress["ZLIB Compression<br/>_compress(data)"]
    Compress --> GetNonce["Generate Random Nonce<br/>os.urandom(12)"]
    GetNonce --> AD["Build Additional Data:<br/>b'blob' + struct.pack('<Q', orig_size)"]
    AD --> AESEncrypt["AES-GCM Encryption<br/>aesgcm.encrypt(nonce, compressed, ad)"]
    AESEncrypt --> PackEntry["Pack Entry:<br/>{1: ciphertext, 2: nonce, 3: orig_size}"]
    PackEntry --> Storage[Store in Blob Array]
```
Diagram sources
- fido2/ctap2/blob.py
The encryption key is the credential's largeBlobKey, a 32-byte key obtained through the largeBlobKey extension during credential creation. This ensures that only parties holding a credential's key can decrypt the data associated with it.
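A sketch of how a single entry is packed and unpacked under these rules, using the third-party `cryptography` package (which python-fido2 itself depends on); the map keys 1/2/3 and the `b'blob' + origSize` associated data follow the format shown in the diagram above:

```python
import os
import struct
import zlib

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def pack_blob_entry(large_blob_key: bytes, data: bytes) -> dict:
    """Compress, then encrypt with AES-256-GCM under the credential's key."""
    orig_size = len(data)
    compressed = zlib.compress(data)
    nonce = os.urandom(12)
    ad = b"blob" + struct.pack("<Q", orig_size)  # binds origSize to the entry
    ciphertext = AESGCM(large_blob_key).encrypt(nonce, compressed, ad)
    return {1: ciphertext, 2: nonce, 3: orig_size}

def unpack_blob_entry(large_blob_key: bytes, entry: dict) -> bytes:
    """Reverse of pack: authenticate and decrypt, then decompress."""
    ad = b"blob" + struct.pack("<Q", entry[3])
    compressed = AESGCM(large_blob_key).decrypt(entry[2], entry[1], ad)
    return zlib.decompress(compressed)
```

Decryption with the wrong key raises an authentication error, which is how a reader skips entries belonging to other credentials.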
Data integrity is protected through multiple mechanisms:
- Checksum Verification: Each serialized blob array ends with a truncated SHA-256 checksum (its first 16 bytes) covering the CBOR-encoded data
- Authentication Tags: AES-GCM provides a built-in authentication tag for each encrypted entry
- Sequence Validation: Fragment offsets ensure proper ordering and detect missing or corrupted data
Section sources
- fido2/ctap2/blob.py
The Large Blob extension integrates with the WebAuthn Extensions framework through standardized input and output structures:
```mermaid
classDiagram
    class AuthenticatorExtensionsLargeBlobInputs {
        +str support
        +bool read
        +bytes write
    }
    class AuthenticatorExtensionsLargeBlobOutputs {
        +bool supported
        +bytes blob
        +bool written
    }
    class LargeBlobExtension {
        +NAME "largeBlobKey"
        +is_supported(ctap) bool
        +make_credential(ctap, options, pin_protocol) Processor
        +get_assertion(ctap, options, pin_protocol) Processor
    }
    LargeBlobExtension --> AuthenticatorExtensionsLargeBlobInputs
    LargeBlobExtension --> AuthenticatorExtensionsLargeBlobOutputs
```
Diagram sources
- fido2/ctap2/extensions.py
During registration, the extension validates support requirements and prepares appropriate inputs. During authentication, it handles both read and write operations based on the provided extension parameters.
Section sources
- fido2/ctap2/extensions.py
The system establishes a secure communication channel using Elliptic Curve Diffie-Hellman (ECDH) key exchange:
```mermaid
sequenceDiagram
    participant Client as Client
    participant Authenticator as Authenticator
    participant ECDH as ECDH Protocol
    Client->>Authenticator : GET_KEY_AGREEMENT
    Authenticator-->>Client : Public Key (peer_cose_key)
    Client->>ECDH : encapsulate(peer_cose_key)
    ECDH->>ECDH : Generate ephemeral key pair
    ECDH->>ECDH : Perform ECDH exchange
    ECDH->>ECDH : Derive shared secret
    ECDH-->>Client : Key agreement message + shared_secret
    Client->>Client : Encrypt sensitive data with shared_secret
    Client->>Authenticator : Encrypted request with authentication
```
Diagram sources
- fido2/ctap2/pin.py
The shared secret derivation process varies between PIN protocol versions:
- PIN Protocol V1: Uses SHA-256 of the ECDH shared secret
- PIN Protocol V2: Uses HKDF with separate HMAC and AES keys
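Both derivations can be sketched with the standard library; the all-zero salt and the info strings `CTAP2 HMAC key` / `CTAP2 AES key` are the constants CTAP 2.1 defines for PIN protocol 2:

```python
import hashlib
import hmac

def kdf_v1(z: bytes) -> bytes:
    """PIN protocol 1: the shared secret is simply SHA-256(Z)."""
    return hashlib.sha256(z).digest()

def hkdf_sha256(salt: bytes, ikm: bytes, info: bytes, length: int = 32) -> bytes:
    """RFC 5869 HKDF (extract-then-expand) over SHA-256."""
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()
    okm, t, counter = b"", b"", 1
    while len(okm) < length:
        t = hmac.new(prk, t + info + bytes([counter]), hashlib.sha256).digest()
        okm += t
        counter += 1
    return okm[:length]

def kdf_v2(z: bytes) -> tuple:
    """PIN protocol 2: derive separate HMAC and AES keys via HKDF."""
    salt = b"\x00" * 32
    return (hkdf_sha256(salt, z, b"CTAP2 HMAC key"),
            hkdf_sha256(salt, z, b"CTAP2 AES key"))
```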
Access to large blob operations is controlled through PIN/UV tokens with specific permissions:
| Permission | Description |
|---|---|
| `LARGE_BLOB_WRITE` | Allows writing and deleting large blobs |
| `MAKE_CREDENTIAL` | Enables credential creation with large blob support |
| `GET_ASSERTION` | Allows authentication with large blob read/write |
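These permissions are bit flags in the CTAP 2.1 pinUvAuthToken request; a sketch with `IntFlag`, using the bit values from the spec's permissions definition:

```python
from enum import IntFlag

class Permission(IntFlag):
    # Bit assignments from the CTAP 2.1 permissions table
    MAKE_CREDENTIAL = 0x01
    GET_ASSERTION = 0x02
    CREDENTIAL_MGMT = 0x04
    BIO_ENROLL = 0x08
    LARGE_BLOB_WRITE = 0x10
    AUTHENTICATOR_CFG = 0x20

# A token for blob maintenance typically combines write with assertion rights:
blob_permissions = Permission.LARGE_BLOB_WRITE | Permission.GET_ASSERTION
```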
Section sources
- fido2/ctap2/pin.py
The test suite demonstrates fundamental operations for managing large blobs:
```mermaid
sequenceDiagram
    participant Test as Test Suite
    participant LB as LargeBlobs
    participant CTAP2 as CTAP2 Protocol
    Test->>LB : get_lb(ctap2, pin_protocol)
    LB->>CTAP2 : Get PIN token with LARGE_BLOB_WRITE permission
    Test->>LB : read_blob_array()
    LB-->>Test : Empty array []
    Test->>LB : put_blob(key1, data1)
    LB->>LB : Encrypt and store blob
    Test->>LB : get_blob(key1)
    LB-->>Test : data1
    Test->>LB : put_blob(key2, data2)
    Test->>LB : delete_blob(key1)
    Test->>LB : read_blob_array()
    LB-->>Test : [entry2]
```
Diagram sources
- tests/device/test_largeblobs.py
The implementation includes comprehensive error handling for various failure conditions:
- Integrity Failure: Invalid checksum detection prevents corrupted data access
- Storage Full: Size limit enforcement prevents overflow conditions
- Permission Errors: Proper PIN/UV authentication requirements
- Unsupported Features: Graceful degradation when authenticator lacks support
The extension integrates with WebAuthn through the LargeBlobKey extension, enabling applications to request large blob support during credential creation and access stored data during authentication.
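At the WebAuthn layer these inputs and outputs are plain dictionaries; the shapes below follow the `largeBlob` client extension as specified in WebAuthn Level 2 (the payload values are illustrative):

```python
# Registration: ask the authenticator to reserve large-blob support
create_inputs = {"largeBlob": {"support": "required"}}
create_outputs = {"largeBlob": {"supported": True}}

# Authentication: either read the stored blob...
read_inputs = {"largeBlob": {"read": True}}
read_outputs = {"largeBlob": {"blob": b"previously stored data"}}

# ...or write a new one (read and write are mutually exclusive)
write_inputs = {"largeBlob": {"write": b"data to store"}}
write_outputs = {"largeBlob": {"written": True}}
```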
Section sources
- tests/device/test_largeblobs.py
The extension ensures data confidentiality through:
- AES-GCM Encryption: Provides both confidentiality and authenticity
- Unique Keys: Each credential has its own largeBlobKey, so one credential's entries cannot be decrypted with another's key
- Secure Key Exchange: ECDH-derived shared secrets protect the PIN/UV token exchange from interception
Multiple layers of integrity protection ensure data reliability:
- Entry Authentication: Each encrypted blob entry carries an AES-GCM authentication tag
- Array Checksum: The serialized blob array ends with a truncated SHA-256 checksum
- Sequence Validation: Fragment offsets prevent out-of-order or missing data from going undetected
Access control mechanisms prevent unauthorized access:
- PIN/UV Authentication: Requires proper authentication for write operations
- Permission-Based Access: Different operations require different permissions
- Credential Isolation: Each credential's blobs are isolated from others
Privacy considerations include:
- Local Storage: Data remains on the authenticator and is not transmitted to servers
- Selective Access: Applications can only access blobs associated with their credentials
- Deletion Capability: Users can remove unwanted data
The implementation considers authenticator memory limitations:
- Fragment Size Calculation: Respects `max_msg_size` minus a fixed overhead (64 bytes)
- Streaming Operations: Processes data in chunks to minimize memory usage
- Compression: Reduces storage requirements through ZLIB compression
Optimizations for efficient data transfer:
- Adaptive Fragmentation: Adjusts fragment size based on authenticator capabilities
- Minimal Overhead: Uses compact CBOR encoding for data representation
- Parallel Operations: Supports concurrent read/write operations where possible
The system scales effectively for typical use cases:
- Maximum Size Limits: Enforces reasonable limits to prevent abuse
- Incremental Storage: Supports adding/removing individual blobs
- Efficient Lookup: Uses key-based access for fast blob retrieval
| Issue | Symptoms | Solution |
|---|---|---|
| `LARGE_BLOB_STORAGE_FULL` | Write operation fails | Reduce data size or clear existing blobs |
| `INTEGRITY_FAILURE` | Data corruption detected | Verify data integrity and retry |
| `PUAT_REQUIRED` | Missing PIN/UV token | Obtain appropriate PIN/UV token |
| `PIN_AUTH_INVALID` | Incorrect permissions | Use correct permission level |
| Authenticator does not support LargeBlobs | Feature unavailable | Check authenticator capabilities |
- Verify Authenticator Support: Check `ctap.info.options.get("largeBlobs")`
- Validate PIN/UV Setup: Ensure proper PIN/UV authentication is configured
- Monitor Fragment Sizes: Verify fragment sizes respect authenticator limits
- Check Data Integrity: Validate checksums and encryption keys
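The first checklist item reduces to a single lookup in the `authenticatorGetInfo` options map (a plain dict stands in for `ctap.info.options` here):

```python
def large_blobs_supported(info_options: dict) -> bool:
    """True when the options map explicitly advertises largeBlobs support."""
    return info_options.get("largeBlobs") is True
```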
- Batch Operations: Group related operations to reduce overhead
- Compression Benefits: Leverage ZLIB compression for text-heavy data
- Memory Management: Monitor memory usage during large transfers
The Large Blob Storage extension represents a significant advancement in WebAuthn credential management, providing secure and efficient storage of substantial data alongside individual credentials. Through its chunked transfer mechanism, robust AES-GCM encryption, and seamless integration with the Extensions framework, it enables modern web applications to enhance user experiences while maintaining strong security guarantees.
The implementation demonstrates careful consideration of security, performance, and usability requirements, making it suitable for production environments where large credential-associated data needs to be managed securely. The comprehensive test coverage and error handling ensure reliable operation across diverse authenticator implementations.
Future enhancements could include support for larger storage quotas, improved compression algorithms, and additional access control mechanisms to further expand the capabilities of this essential WebAuthn extension.