ScraperOptions - itsManeka/amazing-scraper GitHub Wiki
Defined in: src/index.ts:36
Configuration options for the scraper factory.
`optional` **delayMs**: `DelayConfig`
Defined in: src/index.ts:38
Random delay range between requests (default: { min: 1000, max: 2000 })
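For example, to slow the scraper down, pass a wider range. This is a sketch: the `min`/`max` field names follow the default shown above, and the package is assumed to be importable as `amazing-scraper`.

```ts
import { createScraper } from 'amazing-scraper';

// Wait a random 2–5 seconds between requests instead of the default 1–2
const scraper = createScraper({
  delayMs: { min: 2000, max: 5000 },
});
```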
`optional` **httpClient**: `HttpClient`
Defined in: src/index.ts:50
Custom HTTP client (default: AxiosHttpClient with cookie jar)
`optional` **logger**: `Logger`
Defined in: src/index.ts:40
Custom logger implementation (default: ConsoleLogger with JSON output)
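A minimal sketch of swapping in a custom logger. It assumes `Logger` exposes the usual leveled methods (`info`, `warn`, `error`); check the interface definition for the exact shape.

```ts
import { createScraper, Logger } from 'amazing-scraper';

// Hypothetical plain-text logger replacing the default JSON ConsoleLogger
const plainLogger: Logger = {
  info: (msg: string) => console.log(`[INFO] ${msg}`),
  warn: (msg: string) => console.warn(`[WARN] ${msg}`),
  error: (msg: string) => console.error(`[ERROR] ${msg}`),
};

const scraper = createScraper({ logger: plainLogger });
```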
`optional` **onBlocked**: `(error) => Promise<void>`
Defined in: src/index.ts:48
Callback invoked before throwing on block/CAPTCHA/session errors.
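For instance, the callback can fire an alert before the error propagates. A sketch; `sendAlert` is a hypothetical helper, not part of the library.

```ts
import { createScraper } from 'amazing-scraper';

const scraper = createScraper({
  // Runs before the block/CAPTCHA/session error is thrown
  onBlocked: async (error) => {
    await sendAlert(`Scraper blocked: ${error.message}`); // sendAlert is hypothetical
  },
});
```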
`optional` **paginationLimits**: `PaginationLimits`
Defined in: src/index.ts:42
Pagination safety limits to prevent runaway extraction
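A sketch of tightening the limits. The field name below is illustrative only; consult the `PaginationLimits` definition for the real shape.

```ts
import { createScraper } from 'amazing-scraper';

const scraper = createScraper({
  // maxPages is a hypothetical field name, shown for illustration
  paginationLimits: { maxPages: 20 },
});
```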
`optional` **retryPolicy**: `RetryPolicy`
Defined in: src/index.ts:46
Custom retry policy (default: ExponentialBackoffRetry with 3 retries)
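A sketch of raising the retry budget. It assumes the `ExponentialBackoffRetry` constructor accepts a maximum retry count; verify the actual signature before using.

```ts
import { createScraper, ExponentialBackoffRetry } from 'amazing-scraper';

// Assumption: the constructor takes the maximum number of retries
const scraper = createScraper({
  retryPolicy: new ExponentialBackoffRetry(5),
});
```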
`optional` **sessionRecycle**: `object`
Defined in: src/index.ts:71
Session recycling configuration for preventing session degradation over multiple requests. Default behavior includes preventive recycling (every 5 requests) and reactive degrade detection.
`optional` **afterRequests**: `number`
Recycle session (reset cookies, create new jar) after this many successful requests. Default: 5. Set to 0 to disable preventive recycling.
`optional` **reactive**: `boolean`
Enable reactive degrade detection on FetchProduct. When enabled, if a product page appears degraded (200 OK but missing critical data), the session is automatically reset and the request retried once. Default: true. Set to false to disable.
```ts
// Enable with defaults (preventive: every 5 requests, reactive: true)
const scraper = createScraper({ sessionRecycle: {} });

// Custom preventive interval (recycle every 10 requests)
const scraper = createScraper({ sessionRecycle: { afterRequests: 10 } });

// Disable reactive detection, keep preventive recycling
const scraper = createScraper({ sessionRecycle: { reactive: false } });

// Disable all recycling (legacy mode)
const scraper = createScraper({ sessionRecycle: { afterRequests: 0, reactive: false } });

// Or omit sessionRecycle entirely for defaults (recommended)
const scraper = createScraper();
```
`optional` **userAgentProvider**: `UserAgentProvider`
Defined in: src/index.ts:44
Custom User-Agent provider (default: RotatingUserAgentProvider)
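A sketch of pinning a single User-Agent instead of rotating. It assumes `UserAgentProvider` exposes a `getUserAgent()` method; the method name is a guess, so check the interface.

```ts
import { createScraper, UserAgentProvider } from 'amazing-scraper';

// Hypothetical: assumes the provider interface has getUserAgent()
const fixedProvider: UserAgentProvider = {
  getUserAgent: () => 'Mozilla/5.0 (X11; Linux x86_64) MyScraper/1.0',
};

const scraper = createScraper({ userAgentProvider: fixedProvider });
```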