Multi-threading & Concurrency Strategies
1. async / await — Non-blocking API Calls
Protocol: async/await
Where: All use cases, repositories, notifiers, and widget event handlers.
Reason: Every network request suspends the current function at each await, returning control to the event loop so the UI keeps rendering. Without it, a slow API call would freeze the UI, dropping frames and showing a stuck loading indicator.
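A minimal sketch of the pattern, assuming the http package; the class, endpoint, and field names here are illustrative, not taken from the codebase:

```dart
import 'dart:convert';
import 'package:http/http.dart' as http;

// Hypothetical repository — names are for illustration only.
class MovieRepository {
  final http.Client client;
  MovieRepository(this.client);

  Future<List<String>> fetchTitles() async {
    // Suspends here; control returns to the event loop,
    // so the UI keeps rendering frames while the request is in flight.
    final response = await client.get(Uri.parse('https://example.com/movies'));
    final data = jsonDecode(response.body) as List<dynamic>;
    return [for (final item in data) item['title'] as String];
  }
}
```

Callers simply `await repository.fetchTitles()` inside their own async function; no thread management is involved because Dart's event loop handles the suspension.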
2. compute() — One-shot Background Isolate
Protocol: compute(function, argument)
Where: Room availability parsing in the rooms data layer.
Reason: JSON parsing of structured objects (slots, time ranges, availability state) is CPU-bound work. Running it on the main isolate blocks rendering for the duration of the parse. compute() sends the raw JSON to a background isolate, parses there, and returns the domain entity to the main isolate — main thread stays free throughout.
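A sketch of the off-isolate parse, with a simplified stand-in for the real availability entity (the actual one carries slots and time ranges):

```dart
import 'dart:convert';
import 'package:flutter/foundation.dart' show compute;

// Illustrative domain entity — simplified for the example.
class RoomAvailability {
  final String roomId;
  final bool available;
  RoomAvailability.fromJson(Map<String, dynamic> json)
      : roomId = json['roomId'] as String,
        available = json['available'] as bool;
}

// Must be a top-level (or static) function so compute() can
// send it as the entry point of the background isolate.
RoomAvailability _parseAvailability(String rawJson) =>
    RoomAvailability.fromJson(jsonDecode(rawJson) as Map<String, dynamic>);

Future<RoomAvailability> parseOffMainIsolate(String rawJson) =>
    // Raw JSON is copied to a background isolate, parsed there,
    // and only the finished entity returns to the main isolate.
    compute(_parseAvailability, rawJson);
```

Note the restriction: the callback handed to compute() cannot be a closure over instance state, which is why the parser is a top-level function.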
3. Future.microtask() — Deferred Async Init from Sync Context
Protocol: Future.microtask(callback)
Where: PathNotifier.build(), bookings notifier load.
Reason: Some screens need to load data from local storage or the network as soon as they open, but the screen setup runs synchronously — it cannot wait for async work before showing the UI. Scheduling the load as a microtask lets the screen render instantly with an empty or loading state, then update automatically the moment the data is ready, without blocking the first frame.
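A sketch of the pattern, assuming Riverpod's Notifier API (implied by PathNotifier.build()); the provider and the in-memory cache below are hypothetical stand-ins for the real storage layer:

```dart
import 'dart:async';
import 'package:flutter_riverpod/flutter_riverpod.dart';

// Hypothetical cache — stands in for the real local-storage read.
final pathCacheProvider = Provider((ref) => () async => <String>['/home']);

class PathNotifier extends Notifier<List<String>> {
  @override
  List<String> build() {
    // build() is synchronous and cannot await. Scheduling the restore
    // as a microtask lets the first frame render immediately with an
    // empty path; state updates as soon as the cache read completes.
    Future.microtask(_restore);
    return const [];
  }

  Future<void> _restore() async {
    state = await ref.read(pathCacheProvider)();
  }
}

final pathProvider =
    NotifierProvider<PathNotifier, List<String>>(PathNotifier.new);
```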
4. Two-phase Async Load — Cache Then Network
Protocol: Sequential await calls with independent try/catch per phase.
Where: Bookings list, navigation path cache restore on screen open.
Reason: Two operations run in sequence — first read local SQLite (instant), emit state so the user sees data immediately; then fetch from network, emit updated state. Each phase is independently non-fatal.
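The two phases can be sketched as below; readCache, fetchRemote, and emit are hypothetical hooks standing in for the SQLite read, the REST call, and the state emission:

```dart
// Sketch of the two-phase load; the three callbacks are illustrative.
class BookingsLoader {
  final Future<List<String>> Function() readCache;   // e.g. SQLite read
  final Future<List<String>> Function() fetchRemote; // e.g. REST call
  final void Function(List<String>) emit;            // pushes state to the UI

  BookingsLoader(this.readCache, this.fetchRemote, this.emit);

  Future<void> load() async {
    // Phase 1: local cache — failure is non-fatal, just skip the instant paint.
    try {
      emit(await readCache());
    } catch (_) {/* no cache yet; continue to the network phase */}

    // Phase 2: network refresh — independently non-fatal; on failure the
    // cached state emitted above stays on screen.
    try {
      emit(await fetchRemote());
    } catch (_) {/* offline; keep showing cached data */}
  }
}
```

The independent try/catch per phase is the key design choice: a missing cache must not cancel the network fetch, and a failed fetch must not wipe the cached view.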
5. Stream.periodic + asyncMap — Continuous Background Polling
Protocol: Stream.periodic(interval).asyncMap(asyncCallback)
Where: ConnectivityRecoveryService — fires DNS lookup every 3 seconds.
Reason: Platform connectivity APIs report network interface state, not real internet reachability. A periodic DNS lookup to a known hostname confirms actual reachability. asyncMap ensures ticks do not stack if a lookup takes longer than the interval. The stream runs for the app lifetime with no manual polling loop.
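A sketch of the polling stream; the hostname and timeouts are placeholders, not the service's actual configuration:

```dart
import 'dart:async';
import 'dart:io';

/// Emits a reachability verdict on each tick. asyncMap pauses the
/// periodic source while a lookup is in flight, so a slow lookup
/// delays the next tick instead of stacking concurrent lookups.
Stream<bool> reachability({Duration interval = const Duration(seconds: 3)}) {
  return Stream.periodic(interval).asyncMap((_) async {
    try {
      // A DNS lookup proves real reachability, not just interface state.
      final result = await InternetAddress.lookup('example.com')
          .timeout(const Duration(seconds: 2));
      return result.isNotEmpty;
    } on SocketException {
      return false;
    } on TimeoutException {
      return false;
    }
  });
}
```

Consumers typically apply `.distinct()` on top so they only react to offline→online transitions rather than every tick.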
6. Sequential Async Queue Flush — Eventual Connectivity
Protocol: while loop with await per action, subscribed to recovery Stream.
Where: ConnectivityQueueService, triggered on connectivity recovery.
Reason: On recovery, each pending action executes sequentially. Sequential (not parallel) execution prevents race conditions between related actions. Connectivity failure stops the flush and re-enqueues at the front for the next recovery. Server errors are discarded and reported to the user via event stream.
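The flush logic can be sketched as follows; the action signature, error taxonomy (SocketException for connectivity loss vs. anything else for server rejection), and stream names are illustrative assumptions:

```dart
import 'dart:async';
import 'dart:collection';
import 'dart:io';

// Sketch of the sequential flush; types and names are illustrative.
class ConnectivityQueueService {
  final Queue<Future<void> Function()> _pending = Queue();
  final _failures = StreamController<Object>.broadcast();

  /// Server-side failures reported here for user-facing messaging.
  Stream<Object> get failures => _failures.stream;

  ConnectivityQueueService(Stream<void> onRecovered) {
    onRecovered.listen((_) => _flush());
  }

  void enqueue(Future<void> Function() action) => _pending.add(action);

  Future<void> _flush() async {
    while (_pending.isNotEmpty) {
      final action = _pending.removeFirst();
      try {
        // One at a time: preserves ordering between related actions.
        await action();
      } on SocketException {
        // Connectivity dropped mid-flush: put the action back at the
        // front and stop; the next recovery event resumes from here.
        _pending.addFirst(action);
        return;
      } catch (e) {
        // Server rejected the action: discard it and report to the user.
        _failures.add(e);
      }
    }
  }
}
```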
Summary
| Strategy | Protocol | Purpose |
|---|---|---|
| Non-blocking API calls | async/await | Suspend function without blocking UI thread |
| Off-thread JSON parsing | compute() | CPU-bound work off main isolate |
| Async init from sync context | Future.microtask() | Kick off async work from synchronous build() |
| Two-phase cache + network | Sequential await with isolated try/catch | Instant cached UI then background refresh |
| Connectivity polling | Stream.periodic + asyncMap | Non-stacking continuous reachability check |
| Offline queue flush | await loop on recovery stream | Order-preserving sequential action replay |