Salesforce

Salesforce Tools

| Tool | Primary Function | Typical Use Case | Cost |
|---|---|---|---|
| Data Loader | Bulk data import/export | Insert, update, delete, and export records via CSV | Free, included with Salesforce |
| Field Dumper | Metadata export | Dump object/field definitions into Excel for documentation | Free, AppExchange utility |
| Workbench | Web-based admin tool | Query data, manage metadata, run REST/SOAP API calls | Free, community-supported |
| Salesforce Inspector | Data inspection | Quick view/export of object data directly from the Salesforce UI | Free |
| Force.com CLI (sfdx) | Command-line interface | Scripted data and metadata operations | Free, open-source |
| Apache Ant | Build automation | General build tool used with Salesforce metadata tasks | Free, open-source |
| Salesforce Migration Tool | Metadata migration | Deploy/retrieve metadata between Salesforce orgs using Ant scripts (ant-salesforce.jar) | Free, included with Salesforce |

Salesforce Protocols

1. Salesforce CLI (sf CLI)

Unique capabilities

  • Unified DevOps command surface for metadata deploy/retrieve, org creation, scratch org lifecycle, and packaging.
  • Source-tracking for scratch orgs (detects changed metadata automatically).
  • Plugin ecosystem (official and community) enabling automation beyond metadata (Apex tests, data commands, org snapshots).
  • CI/CD-friendly: designed for headless execution in pipelines.
  • Local project model (SFDX project structure) that no API provides.

What only sf CLI can do

  • Create, delete, or clone scratch orgs.
  • Manage DevHub, second-generation packaging, and unlocked packages.
  • Run Apex tests and retrieve code coverage locally.

2. Salesforce Ant Migration Tool (ant-salesforce.jar)

Unique capabilities

  • Legacy metadata deploy/retrieve using the Metadata API via XML manifests.
  • Deterministic, file-based deployments ideal for older orgs that are not using SFDX.
  • Supports metadata types that are not yet supported by sf CLI (because it directly wraps Metadata API).
  • No project structure is required and it works with any folder layout.

What only ANT can do

  • Run massive metadata deployments in environments where SFDX is not allowed.
  • Operate in air-gapped or restricted enterprise build systems that only support ANT.

3. Salesforce SOAP API

Unique capabilities

  • Strongly typed WSDL contract (Enterprise WSDL) that generates classes in Java or .NET.
  • Full CRUD and describe metadata in a single contract.
  • Transaction-style operations (create, update, or delete multiple records in one call).
  • Field-level metadata introspection (describeSObjects) that is not available in Bulk APIs.

What only SOAP API can do

  • Provide compile-time type safety for enterprise systems.
  • Offer deep metadata descriptions (fields, picklists, relationships) in one call.

4. Salesforce SFTP API (a misnomer: these are really SFTP-based file integrations)

Unique capabilities

  • File-based ingestion via SFTP servers (Salesforce does not provide native SFTP; partners or middleware do).
  • Supports very large files (GB-scale) before loading into Salesforce.
  • Batch-friendly for nightly or weekly loads.
  • Works with non-API-capable legacy systems that can only drop files.

What only SFTP workflows can do

  • Handle massive flat-file ingestion without API limits.
  • Integrate with mainframes, ERPs, and HRIS systems that cannot call APIs.
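
To make the file-drop leg concrete, here is a minimal Python sketch using paramiko. The host, credentials, and paths are hypothetical placeholders; the receiving SFTP server would be partner- or middleware-hosted, since Salesforce itself exposes no SFTP endpoint.

# Minimal sketch of uploading a flat-file extract to a partner SFTP server.
# Host, credentials, and paths below are placeholders, not real endpoints.
import paramiko

def upload_extract(local_path: str, remote_path: str) -> None:
    transport = paramiko.Transport(("sftp.example-partner.com", 22))
    try:
        transport.connect(username="etl_user", password="change-me")
        sftp = paramiko.SFTPClient.from_transport(transport)
        sftp.put(local_path, remote_path)  # drop the flat file for later ingestion
        sftp.close()
    finally:
        transport.close()

if __name__ == "__main__":
    upload_extract("accounts_2025-01-01.csv", "/inbound/accounts_2025-01-01.csv")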

5. Bulk API v1

Unique capabilities

  • Batch-oriented CSV upload with explicit batch creation and chunking.
  • Parallel versus serial mode control (serial mode avoids locking).
  • Hard delete support (v2 supports it now, but v1 was first).
  • PK chunking for large queries.

What only Bulk API v1 can do

  • Fine-grained control over batch size and parallelism.
  • Client-controlled PK chunking for SOQL queries (v2 chunks automatically and does not expose PK chunking control).
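
As an illustration, here is a minimal Python sketch of creating a Bulk API v1 query job with PK chunking enabled via the Sforce-Enable-PKChunking header. The instance URL, session ID, and chunk size are placeholder assumptions supplied by whatever auth flow you use.

# Minimal sketch: create a Bulk API v1 (XML-based) query job with PK chunking.
# instance_url and session_id are placeholders from your login flow.
import requests

instance_url = "https://yourInstance.salesforce.com"  # placeholder
session_id = "00D...SESSION"                          # placeholder
version = "61.0"                                      # any supported API version

job_xml = """<?xml version="1.0" encoding="UTF-8"?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
  <operation>query</operation>
  <object>Account</object>
  <contentType>CSV</contentType>
</jobInfo>"""

resp = requests.post(
    f"{instance_url}/services/async/{version}/job",
    headers={
        "X-SFDC-Session": session_id,
        "Content-Type": "application/xml; charset=UTF-8",
        # The v1-only knob: the server splits the query into PK-ranged batches.
        "Sforce-Enable-PKChunking": "chunkSize=100000",
    },
    data=job_xml,
)
resp.raise_for_status()
print(resp.text)  # jobInfo XML, including the job id for batch polling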

6. Bulk API v2

Unique capabilities

  • True RESTful interface (no batch management).
  • Automatic chunking because Salesforce handles splitting and parallelization.
  • Simplified job lifecycle (create job, upload data, and then mark it done).
  • Better error reporting with structured JSON results.
  • Higher throughput for ingest jobs.

What only Bulk API v2 can do

  • Provide server-managed chunking with no client logic.
  • Support multipart uploads for large files via REST.
  • Offer cleaner, modern REST semantics for bulk ingest.
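
A minimal Python sketch of the v2 job lifecycle described above (create the job, upload raw CSV, mark it complete). The instance URL and access token are placeholders from whatever OAuth flow you use; Salesforce handles chunking and parallelization server-side.

# Minimal sketch of the Bulk API v2 ingest lifecycle via plain REST.
import requests

instance_url = "https://yourInstance.salesforce.com"  # placeholder
access_token = "00D...TOKEN"                          # placeholder
api = f"{instance_url}/services/data/v61.0/jobs/ingest"
auth = {"Authorization": f"Bearer {access_token}"}

# 1. Create the job (no batch management in v2).
job = requests.post(
    api,
    headers={**auth, "Content-Type": "application/json"},
    json={"object": "Account", "operation": "insert"},
).json()

# 2. Upload the raw CSV in one PUT; the server splits it into chunks.
csv_body = "Name\nAcme\nGlobex\n"
requests.put(
    f"{api}/{job['id']}/batches",
    headers={**auth, "Content-Type": "text/csv"},
    data=csv_body,
).raise_for_status()

# 3. Mark the upload complete so Salesforce starts processing.
requests.patch(
    f"{api}/{job['id']}",
    headers={**auth, "Content-Type": "application/json"},
    json={"state": "UploadComplete"},
).raise_for_status()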

7. Salesforce External Client API

(This refers to the new External Client Credentials Flow and External Client API introduced in 2023 and 2024.)

Unique capabilities

  • Client Credentials OAuth flow that is purpose-built for server-to-server integrations.
  • Allows non-Salesforce systems to obtain access tokens without a user.
  • Fine-grained permission scopes (API scopes and object scopes).
  • No Connected App user mapping is required.
  • Modern OAuth 2.0 alignment (RFC 6749 client credentials grant).

What only External Client API can do

  • Provide true client-credentials authentication without JWT Bearer or user context.
  • Issue scoped access tokens that restrict API access by object or operation.
  • Enable external clients to authenticate without storing certificates (if using symmetric secrets).
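
A minimal Python sketch of the client credentials token request against a My Domain token endpoint. The domain, consumer key, and consumer secret are placeholders, and the flow assumes the client credentials grant has been enabled on the Connected App (or External Client App) in your org.

# Minimal sketch: OAuth 2.0 client credentials grant, no user context.
import requests

resp = requests.post(
    "https://yourDomain.my.salesforce.com/services/oauth2/token",  # placeholder
    data={
        "grant_type": "client_credentials",
        "client_id": "YOUR_CONSUMER_KEY",         # placeholder
        "client_secret": "YOUR_CONSUMER_SECRET",  # placeholder
    },
)
resp.raise_for_status()
token = resp.json()["access_token"]
# Use the token on subsequent calls: Authorization: Bearer <token>
print(token[:12] + "...")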

Salesforce Quick Starts

1. Connect to a Salesforce org via the sf CLI

Steps

  1. Install the Salesforce CLI on your workstation.
  2. Open a terminal and run the command: sf org login web --alias [email protected]
  3. A browser window opens. Log in with your Salesforce credentials.
  4. Approve the OAuth request to allow the CLI to access the org.
  5. Return to the terminal and confirm the connection with: sf org list
  6. (Optional) Set the org as the default for future commands: sf config set target-org=myOrgAlias

Estimated time

  • Installation: 2 to 5 minutes
  • Authentication: 30 to 60 seconds
  • Verification: 10 seconds
  • Total: 3 to 7 minutes

2. Read all objects via CLI commands

Steps

  1. Ensure you are authenticated to the target org.
  2. Run the command to list all objects: sf data query --query "SELECT QualifiedApiName FROM EntityDefinition" --target-org [email protected]
  3. Review the output in the terminal.
  4. (Optional) Export the results to a JSON or CSV file using shell redirection: sf data query --query "SELECT QualifiedApiName FROM EntityDefinition WHERE IsCustomSetting = false AND IsCustomizable = true AND NamespacePrefix = null" --json > custom_objects.json

Estimated time

  • Query execution: 1 to 3 seconds
  • Review and export: 10 to 20 seconds
  • Total: 15 to 30 seconds

3. Export metadata of all objects to CSV

Steps

  1. Retrieve all object metadata using the Metadata API via the CLI: sf project retrieve start --metadata "CustomObject"
  2. After retrieval, navigate to the force-app or metadata directory where the object XML files were downloaded.
  3. Use a script or command-line tool to parse the XML files and extract the fields you want into CSV format. For example, using a simple shell pipeline: grep -R "<fullName>" -n force-app/main/default/objects > object_metadata_raw.txt
  4. Convert the raw output into CSV using a script in Python, Node.js, or a shell tool such as awk or sed (a Python sketch follows these steps).
  5. Save the final CSV file to your project directory.
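
One possible shape for step 4, assuming grep's path:line:content output format. The input and output file names match the example pipeline above; the regex is a simplifying assumption about the <fullName> lines grep emits.

# Turn the raw grep output into a two-column CSV of (source file, fullName).
import csv
import re

pattern = re.compile(r"^(?P<path>[^:]+):\d+:\s*<fullName>(?P<name>[^<]+)</fullName>")

with open("object_metadata_raw.txt") as raw, \
     open("object_metadata.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["file", "fullName"])
    for line in raw:
        match = pattern.match(line)
        if match:
            writer.writerow([match.group("path"), match.group("name")])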

4. Get all User objects

Steps

  1. Retrieve all Users using the SOQL API via the CLI: sf data query -q "SELECT Id, Name, Email FROM User" --target-org [email protected]

Estimated time

  • User data retrieval: 10 to 20 seconds depending on org size
  • Total: 30 seconds to 2 minutes

Salesforce Auditing

| Category | What Triggers the Limit | Salesforce Default Quota | HubSpot Default Quota | Soft or Hard Cap | Example User Story |
|---|---|---|---|---|---|
| API Calls | Exceeding daily org API allocation | ~15,000 calls/day baseline plus license multipliers | 100,000 calls/day per app | Hard cap (requests blocked once exceeded) | A marketing team’s nightly ETL job fails when API calls exceed the daily quota, forcing them to reschedule data syncs. |
| Concurrent Requests | Too many long-running requests | 25 concurrent long-running requests | HubSpot does not enforce concurrency caps | Hard cap (excess requests rejected) | A developer triggers multiple heavy SOQL queries at once; the 26th request is rejected until earlier ones finish. |
| Apex Governor Limits | CPU, heap, SOQL, DML, callouts | CPU 10,000 ms; SOQL 100; DML 150; callouts 100 | Not applicable | Hard cap (uncatchable exceptions) | A custom Apex trigger loops through too many records, hitting the SOQL query limit and throwing a System.LimitException. |
| Async Apex | Too many queued jobs | 250,000 AsyncApexExecutions per 24 h | Not applicable | Hard cap (jobs fail once exceeded) | A data migration queues thousands of batch jobs; once the async job limit is hit, new jobs are rejected. |
| Bulk API | Too many batches/day or oversized batches | 15,000 batches/day; 10,000 records/batch | 10,000 records per batch; 100 requests/10 s | Hard cap (batch rejected) | An integration tries to upload 20,000 records in one batch; Salesforce rejects the batch for exceeding the record limit. |
| Platform Events | Exceeding publish or delivery caps | 50,000 event publishes/day | No equivalent | Hard cap (publishes blocked) | An IoT app publishes sensor events continuously; once the daily publish cap is reached, further events are blocked. |
| Change Data Capture | Too many CDC events | 100,000 events/day | Webhooks: 10,000 deliveries/10 s per app | Soft cap (throttling/delays applied) | A sales org enables CDC on multiple objects; when event volume spikes, delivery is throttled and delayed until capacity frees. |

Salesforce Idempotency

| No-Code Write-Path | GoF Pattern | Idempotency | Determinism | Open-Source Specifications Used | Salesforce UI Navigation and Form Field Submission Checklist | Example Transaction (Step Function -> Salesforce -> Email) | Salesforce Request Equivalent | Salesforce Response Equivalent | Trailhead Courses (Count) | Replayability (Impact of Duplicates) | Salesforce Custom Code Required (Comparison) | External Custom Code and Endpoint Checklist | Platform Events | Ordering Guarantees | Retry Semantics |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Flow + Platform Events | Observer | Idempotency Key = hash(foo, bar). Flow must check if a record with this ID already exists before creating a new one. | Not fully deterministic. Event ordering is not guaranteed; duplicates may arrive; branching adds nondeterminism. | Open-Source Specs: JSON (RFC 8259), REST, SHA-2 or SHA-3 hashing (conceptual), Event-driven patterns | Salesforce UI Steps: 1. Setup -> Platform Events -> Create baz__e. 2. Flow -> Trigger: Platform Event baz__e. 3. Assignment: compute ID. 4. Create Records. 5. Send Email. 6. Activate. | Step Function publishes baz__e -> Flow subscribes -> Flow computes ID -> Flow creates "bar" -> Flow emails [email protected]. | POST /services/data/vXX.X/sobjects/baz__e/ | 201 Created with replay ID | 3 | Replay-dangerous. Duplicate events always re-trigger the Flow and create duplicate records and duplicate emails unless deduped. | Least custom code. No Apex, no OpenAPI, no external endpoints. | None required. No external API. No custom server. No OpenAPI. No external hosting. Step Function only needs permission to POST the Platform Event. | Uses Platform Events. Risks: duplicate delivery, out-of-order events, replay issues, no guaranteed ordering, requires dedupe logic. | No ordering guarantees. Platform Events may arrive out of order, may be delayed, and may be replayed unpredictably. | Retry is unsafe. Retrying the publish creates a new Platform Event, which always re-triggers the Flow. Requires explicit dedupe logic to avoid duplicate records and emails. |
| Flow + External Services | Adapter | Idempotency Key = hash(foo, bar). Upsert makes the operation naturally idempotent. | Deterministic. OpenAPI schema enforces strict validation; identical payloads behave identically. | Open-Source Specs: OpenAPI 3.x, JSON Schema, REST, SHA-2 or SHA-3 hashing | Salesforce UI Steps: 1. External Services -> Register OpenAPI. 2. Flow -> Add External Service Action. 3. Map fields. 4. Assignment: compute ID. 5. Upsert Records. 6. Send Email. 7. Activate. | Step Function calls External Service -> Flow maps fields -> Flow computes ID -> Flow upserts "bar" -> Flow emails [email protected]. | External Service action defined by OpenAPI schema | REST response mapped into Flow | 2 | Replay-safe for data. Upsert prevents duplicate records. Duplicate emails still possible unless suppressed. | Moderate custom code. Requires writing and maintaining an OpenAPI spec. | External code required. Must host a REST API endpoint that matches the OpenAPI spec. Must implement request and response bodies exactly as defined. Must handle auth. Must handle validation and idempotency. Must be reachable by Salesforce over HTTPS. | Does not use Platform Events. Advantage: no duplicate event risk, no replay issues, no ordering problems, no event bus latency. | Strong ordering guarantees. External REST calls execute synchronously and deterministically. Ordering is controlled by the caller. | Retry is safe for data. Upsert ensures the same ID overwrites the same record. External API may retry internally depending on its design. Duplicate emails still possible unless Flow suppresses them. |
| Flow + Named Credentials (HTTP Callout) | Proxy | Idempotency Key = hash(foo, bar). Flow must dedupe before the no-op callout. | Fully deterministic. Platform Event trigger plus no-op callout means deterministic behavior. | Open-Source Specs: HTTP/1.1 (RFC 7230-7235), REST, JSON, SHA-2 or SHA-3 hashing, Event-driven patterns | Salesforce UI Steps: 1. Platform Events -> Create baz__e. 2. Named Credentials -> New (no-op endpoint). 3. Flow -> Trigger: Platform Event baz__e. 4. Assignment: compute ID. 5. Action: HTTP Callout (no-op). 6. Send Email. 7. Activate. | Step Function publishes baz__e -> Flow fires -> Flow computes ID -> Flow performs no-op callout -> Flow emails [email protected]. | POST /services/data/vXX.X/sobjects/baz__e/ plus Named Credential callout (no-op) | Upstream HTTP response irrelevant | 4 | Replay-dangerous. Duplicate Platform Events always re-trigger the Flow and send duplicate emails unless deduped. | Most custom code (still low-code). Requires configuring Named Credentials and optional custom headers. | Minimal external code. Endpoint must exist but can return static JSON. No logic required. No authentication required if public. Can be a mock server. Must be reachable over HTTPS. | Uses Platform Events. Risks: duplicate triggers, replay issues, event ordering problems, requires dedupe logic to avoid duplicate emails. | No ordering guarantees. Platform Events may arrive out of order, and the Flow will process them in the order received, not the order sent. | Retry is unsafe. Retrying the publish creates a new Platform Event, which re-triggers the Flow. No-op callout does not mitigate duplicates. Requires dedupe logic to avoid duplicate emails. |
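
To make the "Idempotency Key = hash(foo, bar)" convention from the table concrete, here is a minimal Python sketch as the publishing side (for example, a Step Function task) might compute it. The field names and separator are assumptions; what matters is that identical inputs always produce the same key, so the subscriber can dedupe on it.

# Deterministic idempotency key: same (foo, bar) always yields the same key.
import hashlib

def idempotency_key(foo: str, bar: str) -> str:
    payload = f"{foo}|{bar}".encode("utf-8")  # '|' separator is an assumption
    return hashlib.sha256(payload).hexdigest()

# Same inputs, same key: safe to dedupe against on the Flow side.
assert idempotency_key("a", "b") == idempotency_key("a", "b")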

Salesforce SSIS Integration

| Setup Method | GoF Pattern | Idempotency | Determinism | Open-Source Specifications Used | AWS Console Navigation and Setup Checklist | Example Transaction (User -> AWS -> Windows Server) | AWS Request Equivalent | AWS Response Equivalent | AWS Training Resources (Count) | Replayability (Impact of Duplicates) | AWS Custom Code Required (Comparison) | External Custom Code and Endpoint Checklist | Platform Events | Ordering Guarantees | Retry Semantics |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| AWS Marketplace AMI with Visual Studio preinstalled | Factory Method | Idempotent if the same AMI ID is used. Launching multiple instances creates identical environments. | Fully deterministic. AMI contents are fixed and identical across launches. | Open-Source Specs: RDP (RFC 6143 conceptual), PowerShell, HTTP/HTTPS, JSON for AWS APIs | AWS UI Steps: 1. EC2 -> Launch Instance. 2. Choose Marketplace. 3. Search "Visual Studio". 4. Select Windows Server + Visual Studio AMI. 5. Choose instance type. 6. Configure networking. 7. Launch. | User launches instance -> AWS provisions AMI -> Windows boots with Visual Studio preinstalled -> User connects via RDP. | RunInstances API call with Marketplace AMI ID | EC2 returns InstanceId and state transitions | 2 | Replay-safe. Re-launching creates new identical instances. No risk of partial installs or duplicates. | Least custom code. No scripts, no installers, no configuration logic. | None required. No external endpoints. No hosting. No bootstrap scripts. | Does not use Platform Events. Advantage: no event ordering risk, no replay issues, no asynchronous triggers. | Strong ordering guarantees. AMI provisioning is synchronous and deterministic. | Retry is safe. Retrying RunInstances simply creates another identical instance. |
| Manual installation on vanilla Windows Server EC2 | Builder | Idempotency depends on user discipline. Manual steps can drift or diverge. | Not deterministic. Human-driven installation varies by timing, patch level, and manual choices. | Open-Source Specs: RDP, PowerShell, HTTP/HTTPS, MSI installer standards | AWS UI Steps: 1. EC2 -> Launch Instance. 2. Choose Windows Server base AMI. 3. Launch instance. 4. RDP into server. 5. Download Visual Studio installer. 6. Run installer manually. 7. Configure workloads. | User launches instance -> RDP -> manually installs Visual Studio -> config varies by user actions. | RunInstances API call with Windows Server AMI ID | EC2 returns InstanceId; installation is manual and not reflected in AWS API | 1 | Replay-dangerous. Manual installs repeated multiple times produce inconsistent environments and possible misconfigurations. | Most custom code (human code). Manual steps, custom settings, ad hoc scripts. | External code optional. May require downloading installers, extensions, or dependencies from external endpoints. | Does not use Platform Events. Advantage: no event bus risk, but manual steps introduce human error. | No ordering guarantees. Manual steps can be performed in any order, leading to inconsistent results. | Retry is unsafe. Retrying manual installation can produce divergent configurations and inconsistent environments. |
| Automated installation using EC2 Launch Template + User Data (PowerShell bootstrap) | Template Method | Idempotent if the same script and template are used. Script produces identical results across instances. | Deterministic if script is deterministic. External downloads may introduce variability. | Open-Source Specs: PowerShell, HTTP/HTTPS, JSON for AWS APIs, EC2 User Data standards | AWS UI Steps: 1. EC2 -> Launch Templates -> Create Template. 2. Add User Data with PowerShell script to install Visual Studio. 3. Launch instance from template. 4. Instance boots and runs script automatically. | User launches instance -> EC2 injects User Data -> Windows runs PowerShell -> Visual Studio installs automatically -> User connects via RDP. | RunInstances with LaunchTemplateId and UserData | EC2 returns InstanceId; bootstrap logs available via EC2 console | 3 | Replay-safe for data. Re-running template produces identical instances unless external URLs change. | Moderate custom code. Requires writing PowerShell bootstrap script. | External code required. Script must download Visual Studio installer from Microsoft endpoints. Must handle retries, checksums, and dependencies. | Does not use Platform Events. Advantage: no event ordering risk; automation is synchronous at boot. | Strong ordering guarantees. User Data executes in a fixed sequence at first boot. | Retry is mostly safe. Re-running template creates new identical instances. Script retries must be handled manually. |
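
To make the third row concrete, here is a minimal boto3 sketch that launches a Windows instance with a PowerShell bootstrap in user data. The AMI ID, instance type, region, and installer URL are placeholders; boto3 base64-encodes the user data automatically and the script runs once at first boot.

# Minimal sketch: RunInstances with a PowerShell user-data bootstrap.
import boto3

user_data = """<powershell>
# Hypothetical bootstrap: download and silently install Visual Studio.
Invoke-WebRequest -Uri "https://aka.ms/vs/17/release/vs_community.exe" `
  -OutFile "C:\\vs_installer.exe"
Start-Process "C:\\vs_installer.exe" -ArgumentList "--quiet --wait" -Wait
</powershell>"""

ec2 = boto3.client("ec2", region_name="us-east-1")  # placeholder region
resp = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder Windows Server AMI
    InstanceType="m5.xlarge",
    MinCount=1,
    MaxCount=1,
    UserData=user_data,
)
print(resp["Instances"][0]["InstanceId"])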

Simplicity

Apex REST

Test Code

@IsTest
private class QuirkExamplesDataDrivenTest {

    // Simple descriptor for each quirk test
    private class QuirkCase {
        String name;
        String body;
        Boolean expectException;
        String expectedMessageFragment;

        QuirkCase(String name, String body, Boolean expectException, String expectedMessageFragment) {
            this.name = name;
            this.body = body;
            this.expectException = expectException;
            this.expectedMessageFragment = expectedMessageFragment;
        }
    }

    // Build all test cases in one place
    private static List<QuirkCase> buildCases() {
        return new List<QuirkCase>{
            new QuirkCase(
                'case-sensitive-json-keys',
                '{"firstname":"Ada"}',
                false,
                null
            ),
            new QuirkCase(
                'reserved-keyword-field-names',
                '{"type":"example"}',
                false,
                null
            ),
            new QuirkCase(
                'uncatchable-json-parse-errors',
                '{"name":"Ada"',
                true,
                'Unexpected end of JSON'
            ),
            new QuirkCase(
                'strict-content-type-requirements',
                '{"x":"y"}',
                false,
                null
            ),
            new QuirkCase(
                'no-array-object-coercion',
                '{"name":"Ada"}',
                true,
                'Cannot deserialize instance'
            ),
            new QuirkCase(
                'rigid-datetime-formatting',
                '{"when":"2025-01-01 12:00:00"}',
                true,
                'Invalid date'
            ),
            new QuirkCase(
                'heap-limit-on-large-payloads',
                '{"big":"' + ''.padLeft(500000, 'x') + '"}',
                false,
                null
            ),
            new QuirkCase(
                'enum-string-mismatch-failures',
                '{"state":"wrongvalue"}',
                true,
                'Invalid enum'
            ),
            new QuirkCase(
                'nested-dynamic-map-casting',
                '{"foo":{"bar":{"baz":"bop"}}}',
                false,
                null
            )
        };
    }

    // Dispatcher that calls the correct handler based on the quirk name
    private static void dispatch(QuirkCase qc) {
        RestRequest req = new RestRequest();
        req.httpMethod = 'POST';
        req.requestBody = Blob.valueOf(qc.body);

        // Only strict-content-type test needs a wrong header
        if (qc.name == 'strict-content-type-requirements') {
            req.addHeader('Content-Type', 'text/plain');
        }

        RestContext.request = req;
        RestContext.response = new RestResponse();

        if (qc.name == 'case-sensitive-json-keys') {
            CaseSensitiveDemo.run();
            return;
        }
        if (qc.name == 'reserved-keyword-field-names') {
            ReservedKeywordDemo.run();
            return;
        }
        if (qc.name == 'uncatchable-json-parse-errors') {
            ParseErrorDemo.run();
            return;
        }
        if (qc.name == 'strict-content-type-requirements') {
            ContentTypeDemo.run();
            return;
        }
        if (qc.name == 'no-array-object-coercion') {
            ArrayCoercionDemo.run();
            return;
        }
        if (qc.name == 'rigid-datetime-formatting') {
            DateTimeDemo.run();
            return;
        }
        if (qc.name == 'heap-limit-on-large-payloads') {
            HeapLimitDemo.run();
            return;
        }
        if (qc.name == 'enum-string-mismatch-failures') {
            EnumDemo.run();
            return;
        }
        if (qc.name == 'nested-dynamic-map-casting') {
            NestedMapDemo.run();
            return;
        }
    }

    @IsTest
    static void runAllQuirkTests() {
        for (QuirkCase qc : buildCases()) {
            Boolean threw = false;
            String msg = null;

            try {
                dispatch(qc);
            } catch (Exception e) {
                threw = true;
                msg = e.getMessage();
            }

            if (qc.expectException) {
                System.assert(threw, 'Expected exception for ' + qc.name);
                System.assert(msg.contains(qc.expectedMessageFragment),
                    'Expected message fragment for ' + qc.name);
            } else {
                System.assert(!threw, 'Did not expect exception for ' + qc.name);
            }
        }
    }
}

OAuth

project: Low-Code Salesforce REST API Endpoint with OAuth

description: >
  Synopsis of how to build and secure a low-code REST API endpoint in Salesforce
  using OAuth 2.0. Emphasizes environment discipline, minimal Apex scaffolding,
  and provenance tagging for deployments.

prerequisites:
  - Salesforce org (Developer, Sandbox, or Production)
  - Salesforce CLI (sfdx) installed and authenticated
  - Basic Apex knowledge
  - Connected App configured for OAuth

steps:
  - step: Create Connected App
    details:
      - Navigate to Setup -> App Manager -> New Connected App
      - Enable OAuth Settings
      - Callback URL: https://login.salesforce.com/services/oauth2/callback
      - Scopes: api, refresh_token
      - Save and note Client ID and Client Secret

  - step: Define REST Endpoint
    apex_class: |
      @RestResource(urlMapping='/MyService/*')
      global with sharing class MyService {
          @HttpGet
          global static String doGet() {
              return 'Hello from Salesforce!';
          }
      }
    notes:
      - @RestResource maps endpoint to /services/apexrest/MyService
      - Methods (@HttpGet, @HttpPost, etc.) define supported HTTP verbs

  - step: Secure with OAuth
    flows:
      - Web Server Flow: Redirect users to Salesforce login
      - JWT Bearer Flow: Use certificates for server-to-server auth
    example_call: |
      curl https://yourInstance.salesforce.com/services/apexrest/MyService \
        -H "Authorization: Bearer <access_token>"

  - step: Test Endpoint
    tools:
      - Postman
      - curl
    notes:
      - Verify responses
      - Ensure proper scope enforcement

  - step: Deploy Across Orgs
    commands:
      - Add Apex class to force-app source
      - sfdx force:source:deploy -p force-app -u QAOrg
    notes:
      - Maintain provenance tags (alias + timestamp) for traceability

operational_distinctions:
  - step: Connected App
    purpose: OAuth client registration
    characteristic: Point-and-click setup
  - step: Apex REST Class
    purpose: Define endpoint
    characteristic: Minimal Apex code
  - step: OAuth Flow
    purpose: Secure access
    characteristic: Standard Salesforce OAuth
  - step: Testing
    purpose: Validate endpoint
    characteristic: Simple token + REST call
  - step: Deployment
    purpose: Move across orgs
    characteristic: CLI alias discipline

commentary:
  - Low-code aspect: Salesforce handles OAuth and REST scaffolding; minimal Apex required
  - OAuth scopes: Define client permissions (e.g., read vs. write)
  - Environment discipline: Test in Dev/QA before exposing in Production
  - Provenance tagging: Document deployments with org alias + timestamp

references:
  - Salesforce Apex REST Documentation: https://developer.salesforce.com/docs/atlas.en-us.apexcode.meta/apexcode/apex_rest.htm
  - Salesforce OAuth 2.0 Guide: https://help.salesforce.com/s/articleView?id=sf.remoteaccess_oauth_flows.htm

Bad Ideas

1. Messy Data

  • What happens:

    • Old data is full of mistakes
    • Some records are copied twice
    • Some info is missing or wrong
  • Why it's bad:

    • Makes reports confusing
    • People stop trusting the system

2. Trying to Match Everything Exactly

  • What happens:

    • You try to copy your old setup into Salesforce
    • You expect every field to match perfectly
  • Why it's bad:

    • Salesforce works differently
    • Some things don't fit the same way

3. Hardcoding Rules

  • What happens:

    • You write rules that can't change easily
    • You don't plan for updates
  • Why it's bad:

    • If something changes, everything breaks
    • Fixing it takes a long time

4. No Middle Step

  • What happens:

    • You move everything all at once
    • You skip checking if things match
  • Why it's bad:

    • Relationships between records get lost
    • Mistakes are hard to find later

5. Forgetting About Limits

  • What happens:

    • You send too much data at once
    • You don't split it into smaller parts
  • Why it's bad:

    • Salesforce says 'too much!' and stops
    • Your move fails halfway through

6. Not Testing Enough

  • What happens:

    • You don't check if the data is right
    • You skip practice runs
  • Why it's bad:

    • Mistakes sneak in
    • You don't notice until it's too late

7. Ignoring How Salesforce Works

  • What happens:

    • You keep using your old way of thinking
    • You don't learn Salesforce's tools
  • Why it's bad:

    • Things don't work the way you expect
    • You miss out on helpful features

Credentials

  • BS or MS in Computer Science or related technical field
  • 8+ years of engineering experience in global software development and deployment
  • 4+ years of experience with the SFDC platform and partner ecosystem
  • Served as technical lead on 1-2 SFDC projects
  • Thorough knowledge of SFDC; well versed in configuration, customization, and integration
  • Must be a Salesforce Certified Developer or Salesforce Certified Architect
  • Knowledge of Heroku/AWS
  • Experience working with large-scale enterprise organizations with front/back office operations
  • Experience with Git
  • Interaction with clients’ key business and technical stakeholders for solution design and architecture
  • Proven ability to manage and resolve complex and ambiguous technical issues
  • Creativity and critical thinking
  • Excellent verbal and written communication skills

Job

API Calls

Architecture

url:servlet ext:pdf site:salesforce.com

Partner WSDL

DTD

Document Library

Flow

Admin

Auditing Tools / Heroku Apps

Syntax

REPL

https://help.salesforce.com/articleView?id=code_dev_console_checkpoints_overlaying_soql.htm&type=5
https://trailhead.salesforce.com/en/content/learn/projects/quick-start-apex-coding-for-admins/instantiate-and-invoke
https://help.salesforce.com/articleView?id=collab_files_connect_share.htm&type=5

OAuth/SAML

MIME Types (vCard, Calendar, Attachments)

Metadata Enumeration/Serialization

Omnichannel (OpenSearch, JSON-LD, MHTML, RFC822, RFC6761)

Type Checking

Best Practices

Anti-Patterns

Feature Parity

Math

Best Practices

How-tos

Specifications

Team Building

Preview

Types

Process Builder + Webhooks

Defects

Risks

Certification

My Domain

Newsletter

Email / SMS

Attachments

Feeds

IoT

Samples

Tutorials

Documentation

"Hands-on Challenge" 500 site:trailhead.salesforce.com

force.com canvas

Cookies

Single Sign On

DITA / SFDOC

SOAP/REST

Sales Cloud

License Management

Commerce Cloud

Marketing Cloud

GDPR Compliance / Deletion

Customer Support / Desk.com / Service Cloud

Knowledge Base

Community

Pardot

DX

SFTP

SFDX/SF CLI

Heroku Postgres

Heroku CI

Heroku Connect

U2F / 2-Factor Auth

Platform Events

API Documentation

External Services (Swagger/RAML)

MuleSoft/AWS Proxy Setup

Mulesoft Open Source Model

mulesoft_components:
  open_source:
    - Mule_ESB: "The core integration platform is open source, allowing developers to access, modify, and extend its functionality."
    - Core_Components: "These include basic building blocks like message processors, transformers, and routers."
    - Connectors: "Some connectors, like the HTTP and FTP connectors, are open source and can be used to integrate with various systems."
  proprietary:
    - Anypoint_Platform: "The broader suite of tools, including Anypoint Studio, API Manager, and CloudHub, are proprietary and require a subscription."
    - Advanced_Connectors: "While some connectors are open source, others, especially those for specific enterprise applications (e.g., SAP, Salesforce), are proprietary."
    - Modules: "Certain modules that provide advanced functionality, such as data transformation and aggregation, are proprietary."

Kafka Integration

ServiceNow/Workday Integration

Change Set Deployment

IDE

My Domain

Continuous Integration

Chrome Extensions

AppExchange API

AppExchange Prospects

Salesforce Labs

ISV

EEO

Data Quality and Premium Content

Partner Relationship Management

Subcontractor Market

Recovery

GDPR

Live Agent API

S-Controls

JSForce

ESAPI

Workbench

DML Data Deduplication

Remoting

VisualForce

Standard Objects

Custom Objects and Custom Settings

Custom Metadata

Permissions

SOQL

Code

public class SalesforceData {
    public Boolean is_active;
    public Object metadata;
    public String customer_name;
}

Query

List<salesforce_data__c> records = [
    SELECT Id, IsActive__c, Metadata__c, CustomerName__c
    FROM salesforce_data__c
    WHERE IsActive__c = true
];

Apex

List<SalesforceData> results = new List<SalesforceData>();
for (salesforce_data__c record : records) {
    // Apex classes cannot be inserted with DML; map each queried row
    // into the SalesforceData wrapper class defined above instead.
    SalesforceData item = new SalesforceData();
    item.is_active = record.IsActive__c;
    item.metadata = record.Metadata__c;
    item.customer_name = record.CustomerName__c;
    results.add(item);
}

Tables

  • Schema
  • account
  • profile
  • dashboard
  • objectpermissions
  • permissionset
  • organization
  • platformaction
  • processnode
  • scratchorginfo
  • site
  • user
  • activityhistory
  • oauthToken

References

SQL

CSV

DML and Triggers


### Trailhead Courses Specific to Databases

| Course Name                                     | Description                                                                                             | Link                                                                                          |
|------------------------------------------------|---------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------|
| Apex Basics & Database                          | Learn how to add business logic and manipulate data using Salesforce Apex programming language.         | [Apex Basics & Database](https://trailhead.salesforce.com/content/learn/modules/apex_database)|
| Learn Apex Programming Basics                   | Get started with Apex and learn how to interact with the database using Apex.                           | [Learn Apex Programming Basics](https://trailhead.salesforce.com/content/learn/modules/apex_database/apex_database_intro)|
| Data Cloud Trail                                | Dive into Data Cloud with technical learning on unifying enterprise data and driving AI results.        | [Data Cloud Trail](https://trailhead.salesforce.com/data-cloud-trail/)|
| SOQL and SOSL Basics                            | Learn how to write SOQL queries to retrieve data and SOSL queries to search across multiple objects.    | [SOQL and SOSL Basics](https://trailhead.salesforce.com/content/learn/modules/apex_database/salesforce_object_query_language)|
| DML Operations                                  | Understand how to perform DML operations such as insert, update, delete, and upsert in Apex.            | [DML Operations](https://trailhead.salesforce.com/content/learn/modules/apex_database/dml_operations)|
| Apex Triggers                                   | Learn how to create and use Apex triggers to automate processes and enforce business rules.             | [Apex Triggers](https://trailhead.salesforce.com/content/learn/modules/apex_database/apex_triggers)|


Scheduled Apex

anti_patterns:
  - Spaghetti Sharing Model: "Occurs when the sharing model becomes overly complex and difficult to manage."
  - Bulk Updates Without Limits: "Scheduling classes from triggers without ensuring they don't exceed the limit."
  - Updating Classes with Active Jobs: "Attempting to update a class with active scheduled jobs through the Salesforce UI."

quotas:
  - Scheduled Apex Jobs: "Maximum of 100 scheduled Apex jobs at one time."
  - Scheduled Apex Executions: "Maximum of 250,000 executions per 24-hour period or number of user licenses in your org multiplied by 200, whichever is greater."

rate_limits:
  - Synchronous Apex Limits: "Applies to scheduled Apex jobs."
  - Queueable Apex Jobs: "Maximum number of Apex jobs added to the queue with System.enqueueJob in a single transaction is 50."
  - Callouts: "Maximum of 100 callouts (HTTP requests or web services calls) in a transaction."
  - CPU Time: "Maximum CPU time on Salesforce servers is 10,000 milliseconds for synchronous Apex and 60,000 milliseconds for asynchronous Apex."

Routing

Lightning Out

Bug Bounty

Lightning

Lightning Web Components

Generate

sf lightning generate component --type lwc --name myLightningWebComponent --output-dir force-app/main/default/lwc

Deploy

sf project deploy start --source-dir force-app/main/default/lwc/myLightningWebComponent

Zip

zip -r myLightningWebComponent.zip force-app/main/default/lwc/myLightningWebComponent

Background Utilities

Accessibility

CPQ

Internationalization / Localization

Beta/Pilot Programs

Automation

Object Manager

Validation Rules

Date Fields

Custom Fields

Formula Fields

Auditing

Blogs

Investor Timeline

Comparison

Audio/Video

Known Issues

Format:2xx0xx000

Gov Cloud

Release Notes

Deprecation

Exporting

GraphQL API

No Code

Comparison of External Services vs. Flow HTTP Callout

Focus: Query‑String Correlation, Log Granularity, and HTTP Verb Semantics

This document compares Salesforce External Services and Flow HTTP Callout specifically for use cases where the query string is used for correlation, idempotency, or trace propagation. It also evaluates how each feature behaves in terms of Salesforce logging and supported HTTP verbs.


Query‑String Correlation Behavior

Flow HTTP Callout

Flow HTTP Callout preserves query parameters exactly as written in the URL template.

  • Query parameters can be dynamic, using Flow variables.
  • Correlation keys can be generated at runtime and inserted directly into the query string.
  • Salesforce does not reorder, strip, or normalize the parameters.
  • The final URL is visible in Flow logs and Event Monitoring.

This makes Flow HTTP Callout the stronger option for correlation‑driven integrations.

External Services

External Services preserves query parameters only if they are defined in the OpenAPI specification.

  • Query parameters are static and cannot be added or removed dynamically.
  • Correlation keys cannot be generated at runtime unless the OpenAPI spec explicitly defines a parameter for them.
  • The final URL is abstracted behind the Named Credential and may not appear in logs.

This makes External Services less suitable for correlation or idempotency keys.


Salesforce Log Granularity

Flow HTTP Callout

Flow provides high‑visibility logging:

  • Flow debug logs show the fully resolved URL, including query parameters.
  • Event Monitoring (API Callout events) logs the full outbound URL.
  • Flow fault logs include the URL and parameter values.
  • Additional logging steps can be inserted before or after the callout.

This level of detail makes troubleshooting and correlation straightforward.

External Services

External Services provides lower‑visibility logging:

  • Debug logs show the operation name, not the full URL.
  • Query parameters may not appear unless explicitly included in the OpenAPI spec.
  • Event Monitoring logs the callout but may show a masked or abstracted URL.
  • No ability to insert logging inside the External Services call.

This makes correlation more difficult, especially when relying on query‑string keys.

HTTP Verb Semantics (GET/POST/PATCH/DELETE)

Flow HTTP Callout

Flow supports all major HTTP verbs:

  • GET
  • POST
  • PUT
  • PATCH
  • DELETE

Flow also supports:

  • request bodies for POST/PUT/PATCH,
  • dynamic query parameters for any verb,
  • conditional logic around which verb to use.

This makes Flow the most flexible declarative callout mechanism.

External Services

External Services supports verbs only if defined in the OpenAPI spec:

  • GET and POST are well supported.
  • PATCH and DELETE support depends on schema complexity.
  • Query parameters must be declared in the spec and cannot be dynamic.

This makes External Services more rigid and schema‑driven.


Summary Table

| Category | Flow HTTP Callout | External Services |
|---|---|---|
| Query-string preservation | Fully preserved; dynamic; Flow variables allowed | Preserved only if defined in OpenAPI; static |
| Dynamic correlation keys | Supported | Not supported |
| Log granularity | High: full URL visible in logs | Medium: URL may be abstracted |
| HTTP verb support | Full support for GET/POST/PATCH/DELETE | Limited to verbs defined in OpenAPI |
| Runtime flexibility | High | Low |
| Best use case | Correlation, idempotency, dynamic routing | Stable, schema-driven integrations |

Practical Implications

Flow HTTP Callout is the better choice when:

  • correlation keys must be generated at runtime,
  • query parameters must be preserved exactly,
  • logs must show the full URL for troubleshooting,
  • the integration requires flexible HTTP verbs.

External Services is better when:

  • the integration is schema‑driven and stable,
  • the OpenAPI contract is fixed,
  • dynamic correlation is not required.

REST API

User Interface API

Static Content

Porting

Slack

Einstein

Analytics

Chatter

SalesforceIQ

Content Deliveries / Google Suite

Live Agent

Chatbot

Field Validation

Tooling

Mobile SDK

MobilePush

Watson

Alexa

API Integration

XSS

Configuration

Development Edition / Scratch Org Sync

Cost Structure

Messaging / Chaining

Websockets

Pub/Sub API (Protocol Buffers)

Streaming API (cometd/bayeux)

DOM

StAX

JSONP

Web Components

Edition Licensing Comparison

Big Data

Concurrency

Workflows

Logging/Debugging/Troubleshooting

Developer Console

Browser Extensions

OpenSearch Federation

Mocking

WSDL

OData

Test Data

Testing

XSD/WSDL Transformation

Connected Apps

Salesforce Scheduled Push Sequence (Long‑Polling Pull Model)

1. Change‑Triggered Detection

  • A Record‑Triggered Flow fires when a record is created or updated.
  • The Flow writes a row into a queue object (e.g., Sync_Queue__c).
  • Each row represents a unit of work that the external system will later pull.

Minimum and Maximum Values for X (Polling Interval)

  • Minimum: As low as your external system allows (e.g., 5 seconds)
  • Maximum: Any interval (e.g., 24 hours), depending on long‑poll timeout strategy

2. Queue Accumulation

  • The queue object stores all pending changes.
  • Each row includes:
    • Record ID
    • Operation type
    • Timestamp
    • Processed flag

3. External System Long‑Polling

  • The external system opens a long‑polling connection to Salesforce.
  • It repeatedly queries the queue object for:
    • New rows
    • Unprocessed rows
    • Rows newer than the last cursor timestamp
  • The long‑poll request blocks until:
    • New queue rows appear, or
    • The timeout expires

Prerequisites for “Opens a Long‑Polling Connection”

  • A Connected App with OAuth 2.0
    • Must support JWT, Web Server, or Username/Password flow.
  • API Enabled permission
    • Required for REST access.
  • Read access to the queue object
    • Object‑level and field‑level.
  • A stable cursor strategy
    • Timestamp, record ID, or custom token.
  • A long‑polling–capable HTTP client
    • Supports long timeouts, retries, and backoff.
  • Salesforce REST API access
    • Uses /services/data/vXX.X/query or /queryAll.
  • API limit planning
    • Long‑polling consumes API calls.
  • A dedicated integration user
    • Ensures stable permissions and predictable limits.

4. External System Pulls and Processes the Batch

  • When the long‑poll returns data:
    • The external system retrieves the queued rows.
    • It processes the records.
    • It marks the queue rows as processed using REST PATCH/UPDATE.
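
A minimal Python sketch of this puller, under stated assumptions: the instance URL, token, cursor seed, and object/field names follow the examples in this section, and a simple sleep stands in for the long-poll interval X (a true long poll would hold the request open instead). Returned timestamps may need normalization before reuse in SOQL.

# Poll Sync_Queue__c for unprocessed rows, process them, mark them done.
import time
import requests

INSTANCE = "https://yourDomain.my.salesforce.com"  # placeholder
TOKEN = "00D...TOKEN"                              # placeholder
AUTH = {"Authorization": f"Bearer {TOKEN}"}
API = f"{INSTANCE}/services/data/v61.0"

def handle(row: dict) -> None:
    # Hand off to the external system (stubbed for the sketch).
    print("processing", row["RecordId__c"], row["Operation__c"])

def poll_once(cursor: str) -> str:
    soql = (
        "SELECT Id, RecordId__c, Operation__c, Timestamp__c "
        "FROM Sync_Queue__c "
        f"WHERE Processed__c = false AND Timestamp__c > {cursor} "
        "ORDER BY Timestamp__c ASC LIMIT 200"
    )
    rows = requests.get(f"{API}/query", headers=AUTH, params={"q": soql}).json()
    for row in rows.get("records", []):
        handle(row)
        # Mark the queue row processed via REST PATCH.
        requests.patch(
            f"{API}/sobjects/Sync_Queue__c/{row['Id']}",
            headers={**AUTH, "Content-Type": "application/json"},
            json={"Processed__c": True},
        ).raise_for_status()
        cursor = row["Timestamp__c"]  # advance the cursor
    return cursor

if __name__ == "__main__":
    cursor = "2025-01-01T00:00:00Z"
    while True:
        cursor = poll_once(cursor)
        time.sleep(15)  # stand-in for the polling interval X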

Steps to Create a Flow (Assuming 4 Objects, Each With 3 Fields)

Objects

  • Object_A__c
  • Object_B__c
  • Object_C__c
  • Object_D__c

Each object has:

  • Field_1__c
  • Field_2__c
  • Field_3__c

Steps

  1. Create a Record‑Triggered Flow for each object.
  2. Set the trigger to When a record is created or updated.
  3. Add a Get Records element to check for an existing queue entry.
  4. Add a Create Records element to insert a new queue row if none exists.
  5. Map the 3 fields into the queue row’s structure.
  6. Save and activate the Flow.
  7. Repeat for all 4 objects.

Steps to Create a Queue Object

Object: Sync_Queue__c

  1. Create a new Custom Object named Sync_Queue__c.
  2. Add fields:
    • RecordId__c (Lookup or Text)
    • Operation__c (Picklist: Create, Update, Delete)
    • Timestamp__c (DateTime)
    • Processed__c (Checkbox)
    • BatchId__c (Text, optional)
  3. Grant Flow read/write permissions.
  4. Add a list view for “Pending Items.”
  5. Deploy to production.

API Version Definition and Maintenance

Steps Required to Define the API Version

  • Choose a supported API version
    • Must exist in your org and support your integration needs.
  • Verify the version is available
    • Call GET /services/data/ to list supported versions.
  • Set the version in your integration configuration
    • Store as a configurable value (see sublist below).
  • Construct the base REST URL
    • https://<instance>.salesforce.com/services/data/vXX.X/query
  • Use the version consistently
    • Across /query, /queryAll, /sobjects, /composite, etc.
  • Document the version
    • Include rationale and update instructions.
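
A minimal Python sketch of verifying and pinning the version, using the env-var name from the storage list below. GET /services/data/ is unauthenticated and lists the versions the org supports; the instance URL is a placeholder.

# Verify the pinned API version against what the org actually offers.
import os
import requests

INSTANCE = "https://yourInstance.salesforce.com"  # placeholder
pinned = os.environ.get("SALESFORCE_API_VERSION", "61.0")

versions = requests.get(f"{INSTANCE}/services/data/").json()
available = {v["version"] for v in versions}
if pinned not in available:
    raise RuntimeError(f"API v{pinned} not offered by this org: {sorted(available)}")

BASE = f"{INSTANCE}/services/data/v{pinned}"
print("using", BASE + "/query")  # reuse BASE for /queryAll, /sobjects, /composite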

Where to Find the API Version Selection GUI (Browser Navigation)

  • Setup → Integrations → API → API Versions
    • Shows supported and deprecated versions.
  • Setup → Apex API Versions
    • Shows version usage across metadata (informational).
  • Setup → Connected Apps →
    • Displays OAuth scopes and compatibility notes.
  • Workbench → REST Explorer
    • Lists all available versions under /services/data/.

Where to Store the Version as a Configurable Value

  • Environment Variables
    • SALESFORCE_API_VERSION=61.0
  • Application Config Files
    • config.yaml, settings.json, .env
  • Database Configuration Table
    • integration_settings.api_version
  • Kubernetes Secrets or ConfigMaps
    • apiVersion: "v61.0"
  • CI/CD Pipeline Variables
    • Version injected at deploy time.

Steps Required Once an API Version Is Deprecated

  • Monitor Salesforce release notes
    • Deprecations announced ~1 year in advance.
  • Identify the next supported version
    • Use GET /services/data/.
  • Update integration configuration
    • Change version in env vars, config files, or DB settings.
  • Regression test all SOQL queries
    • Validate cursor logic, timestamp filters, and query behavior.
  • Validate long‑polling behavior
    • Ensure timeouts, 204/408 responses, and rate limits behave the same.
  • Update documentation
    • README, diagrams, runbooks.
  • Deploy the version update
    • Through Dev → QA → UAT → Prod.
  • Remove references to deprecated versions
    • Clean up old configs and tests.
  • Monitor after cutover
    • Watch for 404, 400, 500, and 429 errors.

Q&A

What prerequisite permissions are needed?

  • API Enabled
  • Read on all 4 source objects
  • Read/Write on the queue object
  • View All Data (optional but helpful)
  • Manage Flows
  • Customize Application

What sandbox vs real org limitations are a potential showstopper?

  • Lower API limits in sandboxes
  • Sparse sandbox data affecting long‑poll realism
  • OAuth domain differences
  • External endpoints blocking sandbox traffic
  • Flow behavior differences with incomplete data

What prerequisite data quality issues are a potential showstopper?

  • Missing or invalid record IDs
  • Duplicate records
  • Null or malformed field values
  • Inconsistent picklists
  • Orphaned queue rows
  • Incorrect timestamps
  • Missing external identifiers for idempotency

REST API

| Integration Type | Cost Model | Limits | Rate Limits | Tier Availability | Introduced In | Release Date | Reference URL |
|---|---|---|---|---|---|---|---|
| Outbound Callouts (Apex HTTP) | Free (license-based) | 10 min timeout, 25 concurrent callouts | 25 concurrent long-running callouts | Enterprise, Performance, Unlimited, Developer | API v20.0 | Spring '11 (Feb 2011) | Apex Callouts Guide |
| Inbound REST API | License-based (daily quota) | 15K–100K+ calls/day depending on edition | 100K base + per-license allocation; 1K–5K per license | Professional (with API access), Enterprise, Performance, Unlimited, Developer | API v20.0 | Spring '11 (Feb 2011) | REST API Developer Guide |
| Webhooks via Apex REST + Sites | Free (self-hosted endpoint) | Apex governor limits, guest user access | No formal rate limit; governed by Apex execution and guest user throughput | Enterprise, Performance, Unlimited, Developer (via Sites or Experience Cloud) | API v22.0 | Winter '12 (Oct 2011) | Salesforce Webhooks Guide |

Authentication

Packaging

Server-side JavaScript

Open CTI (Telephony)

Reporting

Migration

NPM

Trailhead

Non-Profit / Education

Nonprofit Cloud

NPC Fundraising Objects

What is DonorGiftConcept used for?

It represents an early-stage idea or proposal for a gift before it becomes a formal commitment.
Use case: A major gift officer records a donor’s verbal interest in funding a scholarship.

What is DonorGiftConceptOpportunity used for?

It links a DonorGiftConcept to an Opportunity when the idea becomes a real fundraising effort.
Use case: A conceptual gift idea is converted into an Opportunity and must remain traceable.

What is DonorGiftSummary used for?

It provides consolidated giving information for a donor.
Use case: A gift officer reviews lifetime giving and commitments before a donor meeting.

What is GiftAgreement used for?

It formalizes the terms of a donor’s gift, including restrictions and schedules.
Use case: A donor signs a multi-year pledge agreement.

What is GiftStewardship used for?

It tracks who is responsible for stewarding a gift.
Use case: A stewardship manager assigns a staff member to steward a major gift.

What is GiftStewardshipActivity used for?

It records specific stewardship actions taken for a gift.
Use case: Logging the sending of an annual impact report.

What is GiftCommitment used for?

It represents a donor’s pledge or long-term commitment to give.
Use case: A donor pledges $50,000 over five years.

What is GiftCommitmentSchedule used for?

It defines the installment schedule for fulfilling a pledge.
Use case: Annual installments for a multi-year pledge.

What is GiftCmtChangeAttrLog used for?

It maintains an audit trail of changes to a GiftCommitment.
Use case: Tracking updates to pledge amounts for compliance.

What is GiftEntry used for?

It represents gifts entered individually or in batches before processing.
Use case: A gift processor enters mailed checks before posting them.

What is GiftBatch used for?

It groups multiple GiftEntry records into a batch for review and posting.
Use case: Daily batches of online donations.

What is GiftDesignation used for?

It specifies how a gift should be allocated, such as to a fund or program.
Use case: A donor restricts a gift to the STEM Scholarship Fund.

What is GiftDefaultDesignation used for?

It automatically assigns default designations based on context.
Use case: All gifts from a specific campaign default to the General Fund.

What is GiftDefaultSoftCredit used for?

It automatically assigns soft credits for recurring or automated transactions.
Use case: A recurring foundation gift soft-credits individual family members.

What is GiftActuarialEntry used for?

It stores actuarial calculations for planned gifts.
Use case: Recording life expectancy and discount rate calculations for a charitable remainder trust.

What is DocGenerationQueryResult used for?

It stores metadata about document generation jobs.
Use case: Logging details of a generated gift agreement PDF.

NPC Non-Fundraising Objects With No Standard Salesforce Equivalent

These objects have no native equivalent in standard Salesforce and would require custom objects to replicate.

Program Management Objects (No Standard Equivalent)

What is ProgramCohort used for?

It represents a cohort or group within a program.
Use case: “Job Training Spring 2026 Cohort.”

What is ProgramOutcome used for?

It represents an intended outcome for a program.
Use case: “Client obtains full-time employment.”

What is ProgramOutcomeResult used for?

It represents a participant’s actual outcome.
Use case: A participant achieves the employment outcome.

Service Delivery Objects (No Standard Equivalent)

What is Service used for?

It represents a service offering within a program.
Use case: “Resume Workshop” or “Financial Coaching Session.”

What is ServiceDelivery used for?

It represents the delivery of a service to a participant.
Use case: A client attends a resume workshop on April 10.

What is ServiceSchedule used for?

It represents a recurring or scheduled service.
Use case: Weekly financial coaching sessions every Tuesday.

Case Management Objects (No Standard Equivalent)

What is CasePlan used for?

It represents a structured plan for a client receiving services.
Use case: A case manager creates a plan with goals for a client.

What is CasePlanGoal used for?

It represents a goal within a case plan.
Use case: “Obtain stable housing.”

What is CasePlanTask used for?

It represents a task required to achieve a goal.
Use case: “Submit housing assistance application.”

Intake and Assessment Objects (No Standard Equivalent)

What is Assessment used for?

It represents a structured assessment or questionnaire.
Use case: A housing stability assessment.

What is AssessmentQuestion used for?

It represents a question within an assessment.
Use case: “How many nights have you slept in a shelter this month?”

What is AssessmentResponse used for?

It represents a participant’s response to an assessment question.
Use case: The client answers “10 nights.”

Eligibility and Referral Objects (No Standard Equivalent)

What is EligibilityRule used for?

It represents a rule determining whether a participant qualifies for a program.
Use case: Income must be below a threshold.

What is Referral used for?

It represents a referral into a program or service.
Use case: A partner agency refers a client to a food assistance program.

Limits

| Feature | Limit |
|---|---|
| API Calls | 100,000 per 24-hour period |
| Storage | 10 GB per organization + 20 MB per user |
| Workflow Rules | 500 active rules per organization |
| Validation Rules | 100 per object |
| Custom Fields | 500 per object |
| Custom Objects | 2,000 per organization |
| Reports | 2,000 per organization |
| Dashboards | 500 per organization |
| Scheduled Jobs | 100 active jobs per organization |
| Attachments | 25 MB per file, 2 GB for feed attachments |
| Content Deliveries | 50 GB bandwidth per rolling 24-hour window |
| Category Groups | 5 active groups at a time |
| Custom Apps | 260 per organization |
| Big Objects | 100 per organization |

Developer Relations

Change Management

https://www.salesforce.com/content/dam/web/en_us/www/documents/datasheets/ds-uas.pdf
https://help.salesforce.com/servlet/servlet.FileDownload?file=015300000037bACAAY
https://developer.salesforce.com/blogs/developer-relations/2014/12/salesforce1-enterprise-environment-management.html
https://help.salesforce.com/articleView?id=push_scheduling_upgrades.htm&type=5
https://engineering.salesforce.com/how-the-salesforce-technology-products-organization-runs-on-salesforce-725b11a0a638
https://www.salesforce.com/blog/2018/04/tips-for-launching-implementations.html
https://www.salesforce.com/blog/2017/05/how-to-successfully-transform-business.html
https://www.salesforce.com/blog/2017/05/confronting-the-danger-of-legacy-attitudes.html
https://www.salesforce.com/blog/2018/03/citizen-development-untold-perks-damian-ofarrill.html
https://engineering.salesforce.com/autodesks-best-practices-for-continuous-innovation-with-salesforce-4f2971715a5e
https://www.salesforce.com/au/blog/2018/10/technology-change-management--5-steps-to-success.html
https://www.salesforce.com/blog/2017/07/scaling-sales-team-3-to-300.html

Hubspot/Spreadsheet Integration

Adobe

Headless Architecture

IoT

https://www.cc.gatech.edu/~isbell/papers/isbell-discovery-puc-2006.pdf
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.141.5734&rep=rep1&type=pdf
https://www.umbc.edu/rssipl/people/aplaza/Papers/Journals/2011.JSTARS.Review.pdf

Web Component Interop

https://sebastiandedeyne.com/react-for-vue-developers/
https://marmelab.com/blog/2019/03/13/react-dependency-injection.html
https://vuejs.org/v2/guide/components-dynamic-async.html

Business Model Psychology

https://www.apnews.com/350bb628d1f7468489be90f4f7539181

Bulk Upload of SQL Data

+-----------------------------------------------------------------------+
|                            1. Curated Data                            |
|-----------------------------------------------------------------------|
| - Curated datasets (CSV, parquet, JSON, etc.)                         |
| - Validated for completeness and referential integrity                |
| - Approved, non-synthetic source data                                 |
+-----------------------------------+-----------------------------------+
                                    |
                                    |  (curated inserts)
                                    v
+-----------------------------------+-----------------------------------+
|                       2. Seeder Loads Postgres                        |
|-----------------------------------------------------------------------|
| - Connects to Postgres                                                |
| - Inserts curated records into domain tables                          |
| - Preserves parent/child relationships                                |
| - Commits curated data as authoritative source                        |
+-----------------------------------+-----------------------------------+
                                    |
                                    |  SQL (psycopg2 / SQLAlchemy)
                                    v
+-----------------------------------+-----------------------------------+
|                           Postgres Database                           |
|-----------------------------------------------------------------------|
| - Stores curated domain data                                          |
| - Stores Bulk API job logs                                            |
| - Stores success/error responses                                      |
+-----------------------------------+-----------------------------------+
                                    ^
                                    |
                                    |  SQL queries (read curated data)
                                    |
+-----------------------------------+-----------------------------------+
|                     3. Python Bulk Loader Service                     |
|-----------------------------------------------------------------------|
| - Reads curated data from Postgres                                    |
| - Transforms records into Bulk API CSV/JSON                           |
| - Creates Salesforce Bulk API jobs                                    |
| - Uploads batches and polls job status                                |
| - Normalizes success/error results                                    |
| - Writes logs back into Postgres                                      |
+---------------------+----------------------------+--------------------+
                      |                            ^
                      |  HTTPS (Bulk API)          |
                      v                            |
+---------------------+----------------------------+--------------------+
|                    4. Salesforce Bulk API (v1/v2)                     |
|-----------------------------------------------------------------------|
| - Receives batched CSV/JSON payloads                                  |
| - Inserts/updates Salesforce records                                  |
| - Returns per-row success/error results                               |
+-----------------------------------+-----------------------------------+
                                    |
                                    |  (results returned to Python)
                                    v
+-----------------------------------+-----------------------------------+
|                  5. Python Logs Results to Postgres                   |
|-----------------------------------------------------------------------|
| - Job IDs, batch IDs, timestamps                                      |
| - Success rows                                                        |
| - Error rows + error messages                                         |
| - Retry metadata                                                      |
| - Correlation IDs                                                     |
+-----------------------------------------------------------------------+

            (separate inbound path: Salesforce calls the Python service)

+-----------------------------------------------------------------------+
|              6. Salesforce External Services (OpenAPI)                |
|-----------------------------------------------------------------------|
| - Imports OpenAPI schema for Python service                           |
| - Auto-generates invocable actions for Flows/Apex                     |
| - Calls Python service for real-time operations                       |
| - Python logs these calls into Postgres as well                       |
+-----------------------------------------------------------------------+
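
A minimal Python sketch of boxes 2-5 under stated assumptions: curated rows are read from Postgres with psycopg2, pushed through a Bulk API v2 ingest job, and the job id is logged back to Postgres. The DSN, table names, instance URL, and token are placeholders; error-row retrieval and retries are omitted.

# Read curated rows, create a Bulk API v2 ingest job, log the job id.
import csv
import io
import psycopg2
import requests

INSTANCE = "https://yourDomain.my.salesforce.com"  # placeholder
TOKEN = "00D...TOKEN"                              # placeholder
AUTH = {"Authorization": f"Bearer {TOKEN}"}
JOBS = f"{INSTANCE}/services/data/v61.0/jobs/ingest"

conn = psycopg2.connect("dbname=curated user=etl")  # placeholder DSN

# 1. Read curated rows and render them as Bulk API CSV.
with conn.cursor() as cur:
    cur.execute("SELECT name, website FROM curated_accounts")  # placeholder table
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["Name", "Website"])
    writer.writerows(cur.fetchall())

# 2. Create the ingest job, upload the CSV, and close the job.
job = requests.post(
    JOBS,
    headers={**AUTH, "Content-Type": "application/json"},
    json={"object": "Account", "operation": "insert"},
).json()
requests.put(
    f"{JOBS}/{job['id']}/batches",
    headers={**AUTH, "Content-Type": "text/csv"},
    data=buf.getvalue(),
).raise_for_status()
requests.patch(
    f"{JOBS}/{job['id']}",
    headers={**AUTH, "Content-Type": "application/json"},
    json={"state": "UploadComplete"},
).raise_for_status()

# 3. Log the job id so box 5's audit trail starts in Postgres.
with conn.cursor() as cur:
    cur.execute(
        "INSERT INTO bulk_job_log (job_id, state) VALUES (%s, %s)",
        (job["id"], "UploadComplete"),
    )
conn.commit()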