The Optimization Process - danielep71/VBA-PERFORMANCE GitHub Wiki

[Home](/danielep71/VBA-PERFORMANCE/wiki/Home) | Previous | Next


3. The optimization process

3.1. Measure performance

To measure code performance and pinpoint time wasters, I have prepared a dedicated class here.
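The dedicated class linked above is the recommended tool; purely as a minimal illustration (not the class itself), a stopwatch can be built around VBA's built-in `Timer` function:

```vba
' Minimal timing sketch (illustrative only; the dedicated class above
' is more precise). Timer returns seconds elapsed since midnight,
' with roughly centisecond resolution on Windows.
Sub TimeMyProcedure()
    Dim t0 As Single
    t0 = Timer
    ' ... code under measurement ...
    Debug.Print "Elapsed: " & Format$(Timer - t0, "0.000") & " s"
End Sub
```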

3.2. Restructure (macro-optimization)

Macro-optimization focuses on the overall structure of a procedure: how the work is organized, which algorithms are used, and how much computation is performed in total. Unlike micro-optimizations (small, localized tweaks), macro-optimization targets the “shape” of the solution: it asks whether the procedure is doing the minimum necessary work, in the most efficient way. This is typically where the largest performance gains are found, because changing the approach can reduce runtime by orders of magnitude rather than a few percentage points.

Removing bottlenecks at this level often means finding a different way to achieve the same outcome, one that avoids expensive steps altogether or dramatically reduces how often they occur. In practice, this can take two forms:

  • Elimination: Re-think the process so the slow operation is no longer required (e.g., precomputing results, caching, restructuring inputs, or changing the workflow so the computation is unnecessary).

  • Reduction: Keep the operation, but redesign how it’s executed so it completes faster (e.g., switching to a better algorithm, reducing the number of passes over data, or using bulk operations rather than repeated per-item calls).
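As a sketch of the “reduction” idea, the bulk-write pattern below replaces thousands of per-cell writes with a single assignment (the sheet and range names are illustrative, not prescribed by the text above):

```vba
' Reduction sketch: compute in memory, then perform one bulk write
' instead of N individual Range writes. "Sheet1" is illustrative.
Sub BulkWriteExample()
    Const N As Long = 10000
    Dim v() As Double, i As Long
    ReDim v(1 To N, 1 To 1)
    For i = 1 To N
        v(i, 1) = i * i              ' all computation stays in memory
    Next i
    ' One object-model call replaces N separate ones
    Worksheets("Sheet1").Range("A1").Resize(N, 1).Value = v
End Sub
```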

Macro-optimization is as much a design exercise as it is an analytical one. Sometimes the direct, “obvious” implementation is slow simply because it mirrors how humans think about the task, not how computers execute it efficiently. In those cases, performance improvements come from reframing the problem: using imagination and engineering judgment to arrive at a solution that produces the same result with less processing, fewer dependencies, and fewer repeated actions. The goal is not to make the existing steps marginally faster, but to change what gets done and how often.

In most real-world procedures, the slowest components reliably fall into a small number of categories:

  • External data retrieval (I/O-bound work): Calls to databases, filesystems, network services, APIs, spreadsheets, or other applications are typically orders of magnitude slower than in-memory operations. Macro-optimization here often means batching requests, reducing round-trips, caching results, filtering earlier, or moving computation closer to the data source.

  • Repeated iteration over datasets (CPU-bound work): Tight loops over large ranges, records, or arrays are a common performance sink. Large loops are almost always an opportunity for optimization because cost scales with the number of iterations. Even a small improvement inside a loop (removing one unnecessary lookup, avoiding repeated conversions, or hoisting invariant computations outside the loop) can produce an outsized impact when multiplied by thousands or millions of times.
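A sketch combining both categories, under the assumption that the data sits in a worksheet: one bulk read replaces per-cell round-trips (I/O-bound), and the invariant threshold lookup is hoisted out of the loop (CPU-bound). All names are illustrative.

```vba
' Batching + hoisting sketch. "Sheet1", "A1:A100000" and "C1"
' are illustrative names.
Sub SumAboveThreshold()
    Dim data As Variant
    Dim total As Double, threshold As Double
    Dim i As Long

    With Worksheets("Sheet1")
        data = .Range("A1:A100000").Value   ' one round-trip, not 100,000
        threshold = .Range("C1").Value      ' hoisted: read once, not per pass
    End With

    For i = LBound(data, 1) To UBound(data, 1)
        If data(i, 1) > threshold Then total = total + data(i, 1)
    Next i
    Debug.Print "Total above threshold: " & total
End Sub
```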

In short, macro-optimization concentrates on reducing total work: fewer passes, fewer calls, fewer iterations, and more efficient algorithms. When done well, it replaces “doing the same thing faster” with “doing less—and doing it smarter”—while preserving correctness and maintainability.

3.3. Micro-optimization

Need improvements? Use micro-optimization and coding best practices.

Micro-optimization focuses on the fine-grained efficiency of the code inside a given design. Once the overall approach is sound (i.e., after macro-optimization), micro-optimization seeks to reduce overhead at the statement and data-type level: choosing faster VBA constructs, minimizing implicit conversions, and using the most appropriate types and access patterns. In other words, it does not change what the procedure does; it improves how each step is executed.

These refinements typically deliver the “last few percentage points” of performance. They matter most when a code path is executed extremely frequently, most commonly inside loops that run thousands or millions of times, because even tiny per-iteration savings compound into noticeable runtime reductions. Outside of hot paths, micro-optimizations often produce little measurable benefit and can distract from higher-impact improvements.

In both pure VBA and Excel automation, there are usually multiple ways to achieve the same outcome, but their performance characteristics can differ substantially. For example, two expressions that look equivalent may behave very differently due to:

  • Call overhead and object model costs (especially in Excel automation, where every property or method call can incur cross-boundary overhead).

  • Implicit type coercion (Variant conversions, String/Double coercions, and default property resolution).

  • Memory allocation behavior (particularly with Strings and dynamic arrays).

  • The relative cost of language constructs (For…Next vs. For Each, With blocks, Select Case vs. nested Ifs, etc.), which can vary with context.
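As an illustration of the first two points, the sketch below uses explicit `Long`/`Double` declarations (avoiding `Variant` coercion) and a `With` block that resolves the object reference once. Timings vary with context; measure in your own workbook before and after.

```vba
' Micro-level sketch: explicit types and a cached object reference.
Sub MicroExamples()
    Dim i As Long, total As Double   ' typed, not Variant: no per-step coercion
    For i = 1 To 1000000
        total = total + i
    Next i

    With Worksheets("Sheet1").Range("B2")   ' object resolved once...
        .Value = total
        .NumberFormat = "#,##0"             ' ...then reused for each member call
    End With
End Sub
```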

Effective micro-optimization, therefore, requires a solid understanding of the toolchain's execution model—VBA itself and, where relevant, Excel’s object model. Many micro-level choices are non-obvious: a construct that is more “elegant” or idiomatic may be slower than a more mechanical alternative, and what is optimal can differ depending on data type and usage patterns (for example, operations on Long values behave very differently from equivalent operations on String or Variant values).

The main trade-off is that micro-optimizations can increase complexity. They can introduce more verbose code, reduce readability, and make maintenance harder—especially when the optimization relies on subtle language behaviors. For that reason, micro-optimization is best applied selectively:

  • Concentrate it in proven hot spots (identified via profiling or timing).

  • Favor changes that remain clear and well-commented.

  • Avoid “clever” optimizations that save little but obscure intent.

In summary, micro-optimization is a precision tool: valuable for squeezing extra performance from critical loops and heavily used routines, but most effective only after the algorithm and structure are already efficient—and only when the benefit justifies the added complexity.

For micro-optimization, I have prepared a selection of tips and tricks or coding "best practices" here.

3.4. Improve the user experience

Can't optimize any further? Improve the user experience.

Improving a small number of the slowest procedures often yields a disproportionate benefit to perceived performance. Users do not experience an application as an average of all runtimes; they remember the moments when it “hangs,” hesitates, or provides no feedback. Addressing the worst bottlenecks—especially those that run frequently or block the UI—can make the entire solution feel significantly more responsive, even if total CPU time across all operations changes only modestly. In practice, perceived responsiveness is a combination of actual speed and timely, credible feedback.

Beyond raw optimization, you can materially improve the user experience by managing expectations during unavoidable delays:

  • Show a splash screen at startup. A splash screen reassures the user that the application is loading rather than stalled. It can briefly communicate what is happening (initialization, connecting to data, loading configuration) and, when appropriate, display progress (e.g., “Loading modules…”, “Preparing templates…”, “Connecting…”) via a progress indicator. The splash screen should disappear as soon as the main window is ready. While splash screens can be visually polished—branding, graphics, subtle animations—their primary value is functional: immediate feedback that the program is working.

  • If a procedure takes more than ~1 second, change the cursor to an hourglass (busy cursor) at the start and restore it at the end. This is a simple, effective signal that the application is actively processing. It reduces user uncertainty and prevents repeated clicks or inputs that may trigger re-entrancy or duplicate work. Psychologically, it also sets an expectation of a short wait; if the operation completes quickly, users perceive it as “snappy” rather than “laggy.” The key is discipline: always restore the cursor reliably (including in error paths), otherwise users interpret the application as broken.

  • If a procedure takes more than ~5 seconds, display a progress bar on the screen or on the status bar. For longer operations, “busy” is not enough. Users need evidence of forward motion. A progress bar (or at minimum a step counter and message) reduces frustration by:

    • confirming the process is alive and not frozen,

    • giving a sense of advancement,

    • providing a rough estimate of remaining time (when feasible),

    • and offering something to focus on other than the delay itself.

    A well-designed progress indicator can also improve perceived speed. If the bar moves early and steadily—especially reaching high percentages quickly—it creates the impression of momentum. Conversely, a bar that sits at 0% for a long time or advances in irregular jumps can increase anxiety. When precise measurement is difficult, a “staged” progress model (e.g., discrete steps with clear labels) is often more credible than a falsely precise estimate.
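The cursor and progress advice above can be sketched as follows. This is a minimal Excel VBA illustration; the thresholds, messages, and update interval are illustrative, and the cleanup label guarantees the cursor and status bar are restored even on the error path.

```vba
' Busy cursor + status-bar progress, with guaranteed cleanup.
Sub LongRunningTask()
    Const N As Long = 50000
    Dim i As Long
    On Error GoTo CleanUp

    Application.Cursor = xlWait          ' signal "busy" immediately
    For i = 1 To N
        ' ... per-item work here ...
        ' Update sparingly: reporting on every pass would itself slow the loop
        If i Mod 500 = 0 Then
            Application.StatusBar = "Processing " & i & " of " & N & _
                " (" & Format$(i / N, "0%") & ")"
            DoEvents                     ' let the UI repaint
        End If
    Next i

CleanUp:
    Application.Cursor = xlDefault       ' always restored, even on error
    Application.StatusBar = False        ' return the status bar to Excel
    If Err.Number <> 0 Then MsgBox "Error: " & Err.Description
End Sub
```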

Operationally, these techniques do not replace performance work; they complement it. The best results come from combining (1) targeted optimization of the worst bottlenecks with (2) clear user feedback for any remaining unavoidable waits, so the user never has to guess whether the application is still functioning.


[Home](/danielep71/VBA-PERFORMANCE/wiki/Home) | Previous | Next