Docs: Token - rollthecloudinc/quell GitHub Wiki

Documentation for Angular Token Library


Table of Contents

  1. Introduction
  2. Installation
  3. Overview
  4. Key Concepts
    • Tokens and Token Generation
    • Token Replacement
    • Token Discovery and Matching
  5. Core Components
    • TokenModule
  6. Key Services
    • TokenizerService
  7. Usage
    • Generating Tokens for Attributes
    • Managing Generic Tokens
    • Replacing Tokens in Strings
    • Discovering and Matching Tokens
  8. API Reference
  9. Examples
  10. Testing

1. Introduction

The Angular Token Library provides utilities to dynamically extract, generate, replace, and manage tokens from attributes, objects, and strings. Tokens are a powerful mechanism for resolving dynamic placeholders within your application's data, configuration, or content.


2. Installation

To install the Angular Token Library and its dependencies, run:

npm install @rollthecloudinc/token @rollthecloudinc/attributes

3. Overview

The library is centered around the concept of tokens, which are placeholders or dynamic variables used to resolve and transform data structures. It provides efficient methods for:

  • Extracting tokens from complex attribute structures or generic objects.
  • Replacing tokens within strings dynamically.
  • Discovering and matching tokens from textual input.

Features:

  • Token Generation: Create tokens based on attributes or objects.
  • Token Replacement: Replace placeholders (e.g., [token]) in strings with resolved values.
  • Token Matching: Find tokens in strings and return matches.
  • Token Discovery: Extract unique tokens from text and distinguish token patterns from the input.

4. Key Concepts

4.1 Tokens and Token Generation

Tokens are key-value mappings that represent dynamic placeholders or data variables. The library can generate tokens from attribute-like settings or objects using a hierarchical and recursive approach.

4.2 Token Replacement

Token replacement occurs by evaluating a string and substituting token placeholders (e.g., [token]) with their corresponding values. This operation is useful for resolving dynamic text or configurations based on token mappings.
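As a minimal sketch of the idea (not the actual TokenizerService implementation, which may differ), replacement can be modeled as a single regex pass over the string, looking up each `[token]` name in a map:

```typescript
// Illustrative sketch only: substitute every [token] placeholder with its
// value from a Map; placeholders with no mapping are left untouched.
function replaceTokensSketch(value: string, tokens: Map<string, unknown>): string {
  return value.replace(/\[([^\[\]]+)\]/g, (match, name) =>
    tokens.has(name) ? String(tokens.get(name)) : match
  );
}

const greeting = replaceTokensSketch('Hello [name]!', new Map([['name', 'Jane']]));
// greeting === 'Hello Jane!'
```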

4.3 Token Discovery and Matching

Token discovery provides an efficient mechanism to extract tokens embedded within a string, ensuring that unique tokens can be identified. Matching tokens is the process of evaluating whether certain tokens exist within a given string.
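Conceptually, discovery amounts to collecting every `[token]` pattern in the input and de-duplicating the names. The sketch below illustrates the idea; the library's actual implementation may differ:

```typescript
// Illustrative sketch only: extract the unique token names that appear as
// [token] placeholders, preserving first-seen order.
function discoverTokensSketch(value: string): string[] {
  const names = Array.from(value.matchAll(/\[([^\[\]]+)\]/g), m => m[1]);
  return Array.from(new Set(names)); // de-duplicate
}

const found = discoverTokensSketch('Hi [user], your code is [code]. Bye [user].');
// found === ['user', 'code']
```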


5. Core Components

TokenModule

The TokenModule is the entry point for this library. It encapsulates essential functionality such as services and classes required for token generation and management.

Example:

import { TokenModule } from '@rollthecloudinc/token';

@NgModule({
  imports: [TokenModule],
})
export class AppModule {}

6. Key Services

TokenizerService

The TokenizerService is the core service provided by the library to facilitate token generation, replacement, discovery, and matching.

Key Responsibilities:

  • Generate tokens from AttributeValue[] or generic objects.
  • Replace tokens in strings with dynamic values from a token map.
  • Discover tokens embedded in text based on specific patterns (e.g., [token]).
  • Match tokens to check if placeholders exist within strings.

7. Usage

7.1 Generating Tokens for Attributes

Use the generateTokens method to create a map of tokens from an array of AttributeValue objects. Each token is resolved based on its properties.

Example:

import { TokenizerService } from './services/tokenizer.service';

const tokenizerService = new TokenizerService(); // or inject via Angular DI

const attributeValues = [
  { name: 'title', type: 'string', value: 'Angular Token Library' } as AttributeValue,
  { name: 'description', type: 'string', value: 'A dynamic token management library' } as AttributeValue,
];

const tokens = tokenizerService.generateTokens(attributeValues);

console.log(tokens); 
// Output: Map { 'title' => 'Angular Token Library', 'description' => 'A dynamic token management library' }

7.2 Managing Generic Tokens

Use the generateGenericTokens method to generate tokens from any arbitrary object. The method traverses the object recursively, emitting a token for each leaf value (non-object properties and arrays) and joining nested keys with dots (e.g., metadata.author).

Example:

const sampleObject = {
  title: 'A Sample Object',
  metadata: {
    author: 'John Doe',
    date: '2023-10-01',
  },
};

const genericTokens = tokenizerService.generateGenericTokens(sampleObject);

console.log(genericTokens);
// Output: Map { 
//   'title' => 'A Sample Object', 
//   'metadata.author' => 'John Doe', 
//   'metadata.date' => '2023-10-01' 
// }
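The recursive flattening that produces those dotted-path keys can be sketched as follows. This is an illustration of the technique, not the library's exact code:

```typescript
// Illustrative sketch only: flatten nested object properties into
// dotted-path tokens, e.g. { metadata: { author } } -> 'metadata.author'.
function flattenToTokens(
  obj: Record<string, unknown>,
  prefix = '',
  out = new Map<string, unknown>()
): Map<string, unknown> {
  for (const [key, value] of Object.entries(obj)) {
    const path = prefix ? `${prefix}.${key}` : key;
    if (value !== null && typeof value === 'object' && !Array.isArray(value)) {
      flattenToTokens(value as Record<string, unknown>, path, out); // recurse
    } else {
      out.set(path, value); // leaf values (including arrays) become tokens
    }
  }
  return out;
}

const flat = flattenToTokens({ title: 'A Sample Object', metadata: { author: 'John Doe' } });
// flat.get('metadata.author') === 'John Doe'
```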

7.3 Replacing Tokens in Strings

To resolve dynamic placeholders in a string, use the replaceTokens method. The function substitutes token patterns (e.g., [placeholder]) with their respective values from the token map.

Example:

const stringTemplate = 'Welcome to [title]! Authored by [metadata.author].';
const resolvedString = tokenizerService.replaceTokens(stringTemplate, genericTokens);

console.log(resolvedString); 
// Output: 'Welcome to A Sample Object! Authored by John Doe.'

7.4 Discovering and Matching Tokens

Use discoverTokens to identify all token patterns in a given string and matchTokens to validate which of the defined tokens exist within the string.

Example:

const stringTemplate = 'Welcome to [title]! Authored by [metadata.author].';

const discoveredTokens = tokenizerService.discoverTokens(stringTemplate);
console.log(discoveredTokens); 
// Output: [ 'title', 'metadata.author' ]

const matchedTokens = tokenizerService.matchTokens(stringTemplate, ['title', 'metadata.date']);
console.log(matchedTokens); 
// Output: [ 'title' ]

8. API Reference

Methods in TokenizerService

  1. generateTokens(settings: Array<AttributeValue>): Map<string, any>

    • Generates tokens from an array of AttributeValue objects.
  2. generateGenericTokens(obj: any, prefix?: string): Map<string, any>

    • Creates tokens based on the properties of a generic object.
  3. replaceTokens(value: string, tokens: Map<string, any>): string

    • Substitutes token placeholders in a string with corresponding values.
  4. discoverTokens(value: string, full?: boolean): Array<string>

    • Extracts unique tokens from a string based on specific patterns.
  5. matchTokens(value: string, tokens: Array<string>): Array<string>

    • Determines which tokens from the input array match placeholders in the string.
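matchTokens, for instance, can be approximated as a filter over the candidate names. This is a hedged sketch of the behavior described above, not the service's actual code:

```typescript
// Illustrative sketch only: return the subset of candidate token names that
// actually appear as [token] placeholders in the string.
function matchTokensSketch(value: string, tokens: string[]): string[] {
  return tokens.filter(name => value.includes(`[${name}]`));
}

const matched = matchTokensSketch('Welcome to [title]!', ['title', 'author']);
// matched === ['title']
```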

9. Examples

Example: Dynamic Token Resolution

const attributeValues = [
  { name: 'username', type: 'string', value: 'JaneDoe' } as AttributeValue,
  { name: 'greeting', type: 'string', value: 'Hello, [username]!' } as AttributeValue,
];

const tokens = tokenizerService.generateTokens(attributeValues);

const personalizedMessage = tokenizerService.replaceTokens(tokens.get('greeting'), tokens);

console.log(personalizedMessage); 
// Output: 'Hello, JaneDoe!'

10. Testing

Example: Testing TokenizerService

import { TokenizerService } from './services/tokenizer.service';

describe('TokenizerService', () => {
  let tokenizerService: TokenizerService;

  beforeEach(() => {
    tokenizerService = new TokenizerService();
  });

  it('should generate tokens from attributes', () => {
    const attributes = [{ name: 'key', type: 'string', value: 'value' } as AttributeValue];
    const tokens = tokenizerService.generateTokens(attributes);
    expect(tokens.get('key')).toBe('value');
  });

  it('should replace tokens in a string', () => {
    const tokens = new Map([['name', 'John']]);
    const resolved = tokenizerService.replaceTokens('Hello [name]!', tokens);
    expect(resolved).toBe('Hello John!');
  });

  it('should discover tokens', () => {
    const string = 'Welcome to [title], created by [author].';
    const discovered = tokenizerService.discoverTokens(string);
    expect(discovered).toEqual(['title', 'author']);
  });
});

Conclusion

The Angular Token Library facilitates dynamic token generation, replacement, and discovery. It is ideal for applications requiring adaptable data resolution and inline placeholders. For contributions or bug reports, feel free to reach out!
