Docs: Token - rollthecloudinc/quell GitHub Wiki
- Introduction
- Installation
- Overview
- Key Concepts
  - Tokens and Token Generation
  - Token Replacement
  - Token Discovery and Matching
- Core Components
  - TokenModule
- Key Services
  - TokenizerService
- Usage
  - Generating Tokens for Attributes
  - Managing Generic Tokens
  - Replacing and Matching Tokens in Strings
- API Reference
- Examples
- Testing
## Introduction

The Angular Token Library provides utilities to dynamically extract, generate, replace, and manage tokens from attributes, objects, and strings. Tokens are a powerful mechanism for resolving dynamic placeholders within your application's data, configuration, or content.
## Installation

To install the Angular Token Library and its dependencies, run:

```sh
npm install @rollthecloudinc/token @rollthecloudinc/attributes
```
## Overview

The library is centered around the concept of tokens, which are placeholders or dynamic variables used to resolve and transform data structures. It provides efficient methods for:
- Extracting tokens from complex attribute structures or generic objects.
- Replacing tokens within strings dynamically.
- Discovering and matching tokens from textual input.
- Token Generation: Create tokens based on attributes or objects.
- Token Replacement: Replace placeholders (e.g., `[token]`) in strings with resolved values.
- Token Matching: Find tokens in strings and return matches.
- Token Discovery: Extract unique tokens from text and distinguish token patterns from the input.
## Key Concepts

### Tokens and Token Generation

Tokens are key-value mappings that represent dynamic placeholders or data variables. The library can generate tokens from attribute-like settings or objects using a hierarchical and recursive approach.
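The hierarchical, recursive idea can be sketched in plain TypeScript. This is an illustrative stand-in, not the library's actual implementation; `flattenToTokens` is a hypothetical name:

```typescript
// Illustrative sketch only: recursively flatten a nested object into a
// token map with dot-separated keys, similar in spirit to how the
// library derives tokens from hierarchical data.
function flattenToTokens(obj: Record<string, unknown>, prefix = ''): Map<string, unknown> {
  const tokens = new Map<string, unknown>();
  for (const [key, value] of Object.entries(obj)) {
    const path = prefix ? `${prefix}.${key}` : key;
    if (value !== null && typeof value === 'object' && !Array.isArray(value)) {
      // Recurse into nested objects, extending the key path.
      for (const [k, v] of flattenToTokens(value as Record<string, unknown>, path)) {
        tokens.set(k, v);
      }
    } else {
      // Leaf values (primitives and arrays) become token entries.
      tokens.set(path, value);
    }
  }
  return tokens;
}
```

The dot-separated key convention matches the `metadata.author`-style tokens shown in the usage examples below.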
### Token Replacement

Token replacement occurs by evaluating a string and substituting token placeholders (e.g., `[token]`) with their corresponding values. This operation is useful for resolving dynamic text or configurations based on token mappings.
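As a rough sketch of the substitution step (assuming the `[token]` bracket syntax described above; `substitute` is a hypothetical helper, not the library's API):

```typescript
// Illustrative sketch only: replace [token] placeholders with values
// from a token map, leaving unknown placeholders untouched.
function substitute(template: string, tokens: Map<string, unknown>): string {
  return template.replace(/\[([^\]]+)\]/g, (match, name) =>
    tokens.has(name) ? String(tokens.get(name)) : match
  );
}
```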
### Token Discovery and Matching

Token discovery provides an efficient mechanism to extract tokens embedded within a string, ensuring that unique tokens can be identified. Matching tokens is the process of evaluating whether certain tokens exist within a given string.
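Discovery and matching can be sketched the same way (again assuming the `[token]` syntax; `findTokens` and `whichMatch` are hypothetical names, not the library's API):

```typescript
// Illustrative sketch only: extract the unique token names appearing
// in a string, and check which of a given set occur in it.
function findTokens(value: string): string[] {
  const found = value.match(/\[([^\]]+)\]/g) ?? [];
  // Strip the surrounding brackets and de-duplicate.
  return [...new Set(found.map(m => m.slice(1, -1)))];
}

function whichMatch(value: string, tokens: string[]): string[] {
  const present = new Set(findTokens(value));
  return tokens.filter(t => present.has(t));
}
```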
## Core Components

### TokenModule

The `TokenModule` is the entry point for this library. It encapsulates the services and classes required for token generation and management.

Example:

```ts
import { NgModule } from '@angular/core';
import { TokenModule } from '@rollthecloudinc/token';

@NgModule({
  imports: [TokenModule],
})
export class AppModule {}
```
## Key Services

### TokenizerService

The `TokenizerService` is the core service provided by the library for token generation, replacement, discovery, and matching.

Key responsibilities:

- Generate tokens from `AttributeValue[]` arrays or generic objects.
- Replace tokens in strings with dynamic values from a token map.
- Discover tokens embedded in text based on specific patterns (e.g., `[token]`).
- Match tokens to check whether placeholders exist within strings.
## Usage

### Generating Tokens for Attributes

Use the `generateTokens` method to create a map of tokens from an array of `AttributeValue` objects. Each token is resolved based on its properties.

Example:

```ts
import { AttributeValue } from '@rollthecloudinc/attributes';
import { TokenizerService } from './services/tokenizer.service';

// tokenizerService is an injected TokenizerService instance.
const attributeValues = [
  { name: 'title', type: 'string', value: 'Angular Token Library' } as AttributeValue,
  { name: 'description', type: 'string', value: 'A dynamic token management library' } as AttributeValue,
];

const tokens = tokenizerService.generateTokens(attributeValues);
console.log(tokens);
// Output: Map { 'title' => 'Angular Token Library', 'description' => 'A dynamic token management library' }
```
### Managing Generic Tokens

Use the `generateGenericTokens` method to generate tokens for any arbitrary object. This method traverses the object recursively and generates tokens for non-object properties and arrays.
Example:

```ts
const sampleObject = {
  title: 'A Sample Object',
  metadata: {
    author: 'John Doe',
    date: '2023-10-01',
  },
};

const genericTokens = tokenizerService.generateGenericTokens(sampleObject);
console.log(genericTokens);
// Output: Map {
//   'title' => 'A Sample Object',
//   'metadata.author' => 'John Doe',
//   'metadata.date' => '2023-10-01'
// }
```
### Replacing and Matching Tokens in Strings

To resolve dynamic placeholders in a string, use the `replaceTokens` method. It substitutes token patterns (e.g., `[placeholder]`) with their respective values from the token map.

Example:

```ts
const stringTemplate = 'Welcome to [title]! Authored by [metadata.author].';
const resolvedString = tokenizerService.replaceTokens(stringTemplate, genericTokens);
console.log(resolvedString);
// Output: 'Welcome to A Sample Object! Authored by John Doe.'
```
Use `discoverTokens` to identify all token patterns in a given string and `matchTokens` to determine which of a given set of tokens exist within the string.

Example:

```ts
const stringTemplate = 'Welcome to [title]! Authored by [metadata.author].';

const discoveredTokens = tokenizerService.discoverTokens(stringTemplate);
console.log(discoveredTokens);
// Output: [ 'title', 'metadata.author' ]

const matchedTokens = tokenizerService.matchTokens(stringTemplate, ['title', 'metadata.date']);
console.log(matchedTokens);
// Output: [ 'title' ]
```
## API Reference

- `generateTokens(settings: Array<AttributeValue>): Map<string, any>`
  - Generates tokens from an array of `AttributeValue` objects.
- `generateGenericTokens(obj: any, prefix?: string): Map<string, any>`
  - Creates tokens based on the properties of a generic object.
- `replaceTokens(value: string, tokens: Map<string, any>): string`
  - Substitutes token placeholders in a string with corresponding values.
- `discoverTokens(value: string, full?: boolean): Array<string>`
  - Extracts unique tokens from a string based on specific patterns.
- `matchTokens(value: string, tokens: Array<string>): Array<string>`
  - Determines which tokens from the input array match placeholders in the string.
## Examples

```ts
const attributeValues = [
  { name: 'username', type: 'string', value: 'JaneDoe' } as AttributeValue,
  { name: 'greeting', type: 'string', value: 'Hello, [username]!' } as AttributeValue,
];

const tokens = tokenizerService.generateTokens(attributeValues);
const personalizedMessage = tokenizerService.replaceTokens(tokens.get('greeting'), tokens);
console.log(personalizedMessage);
// Output: 'Hello, JaneDoe!'
```
## Testing

```ts
import { AttributeValue } from '@rollthecloudinc/attributes';
import { TokenizerService } from './services/tokenizer.service';

describe('TokenizerService', () => {
  let tokenizerService: TokenizerService;

  beforeEach(() => {
    tokenizerService = new TokenizerService();
  });

  it('should generate tokens from attributes', () => {
    const attributes = [{ name: 'key', type: 'string', value: 'value' } as AttributeValue];
    const tokens = tokenizerService.generateTokens(attributes);
    expect(tokens.get('key')).toBe('value');
  });

  it('should replace tokens in a string', () => {
    const tokens = new Map([['name', 'John']]);
    const resolved = tokenizerService.replaceTokens('Hello [name]!', tokens);
    expect(resolved).toBe('Hello John!');
  });

  it('should discover tokens', () => {
    const template = 'Welcome to [title], created by [author].';
    const discovered = tokenizerService.discoverTokens(template);
    expect(discovered).toEqual(['title', 'author']);
  });
});
```
The Angular Token Library facilitates dynamic token generation, replacement, and discovery. It is ideal for applications requiring adaptable data resolution and inline placeholders. For contributions or bug reports, feel free to reach out!