Array Stack - WilfullMurder/DataStructures-Java GitHub Wiki
An ArrayStack implements the List interface using an array a, called the backing array. The list element with index i is stored in a[i]. Most of the time, a is larger than necessary, so an integer n is used to keep track of the number of elements actually stored in a. In this way, the list elements are stored in a[0],...,a[n-1] and, at all times, a.length ≥ n.
Accessing and modifying the elements of an ArrayStack using get(i) and set(i, x) is trivial. After performing any necessary bounds-checking we simply return or set, respectively, a[i].
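As a sketch (class and member names here are illustrative, not necessarily those used in this repository), get(i) and set(i,x) reduce to a bounds check followed by a single array access:

```java
// Illustrative sketch of get/set on a backing array (hypothetical names).
public class GetSetDemo {
    Object[] a = new Object[4]; // backing array
    int n = 0;                  // number of elements actually stored

    void checkIndex(int i) {
        if (i < 0 || i >= n) throw new IndexOutOfBoundsException("i=" + i);
    }

    Object get(int i) {
        checkIndex(i);
        return a[i];
    }

    Object set(int i, Object x) {
        checkIndex(i);
        Object y = a[i]; // return the value being replaced
        a[i] = x;
        return y;
    }
}
```

Both operations touch a constant number of array cells, which is where the O(1) bounds in the summary below come from.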
The operations of adding or removing elements from an ArrayStack are illustrated in Fig.1. To implement the add(i,x) operation, we first check if a is already full. If so, we call the method resize() to increase the size of a. How resize() is implemented is covered later. For now, it is sufficient to know that, after a call to resize(), we can be sure that a.length > n. With this out of the way, we now shift the elements a[i],...,a[n-1] right by one position to make room for x, set a[i] equal to x, and increment n.
If we ignore the cost of the potential call to resize(), then the cost of the add(i,x) operation is proportional to the number of elements we have to shift to make room for x. Therefore, the cost of this operation (ignoring the cost of resizing a) is O(n-i).
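The grow-shift-insert steps can be sketched as follows (a simplified, hypothetical version; the repository's own add(i,x) may differ in details):

```java
// Sketch of add(i,x): grow the backing array if full, shift right, insert.
public class AddDemo {
    Object[] a = new Object[1];
    int n = 0;

    void resize() { // double the capacity (at least 1)
        Object[] b = new Object[Math.max(2 * n, 1)];
        for (int j = 0; j < n; j++) b[j] = a[j];
        a = b;
    }

    void add(int i, Object x) {
        if (i < 0 || i > n) throw new IndexOutOfBoundsException();
        if (n + 1 > a.length) resize();  // ensure a.length > n
        for (int j = n; j > i; j--)      // shift a[i],...,a[n-1] right by one
            a[j] = a[j - 1];
        a[i] = x;
        n++;
    }
}
```

The shifting loop runs n-i times, which matches the O(n-i) cost stated above.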
Implementing the remove(i) operation is similar. We shift elements a[i+1],...,a[n-1] left by one position (overwriting a[i]) and decrement n. After doing this, we check if n is getting much smaller than a.length by checking if a.length ≥ 3n. If so, then we call resize() to reduce the size of a.
If we ignore the cost of the resize() method, then the cost of a remove(i) operation is proportional to the number of elements we shift, which is O(n-i).
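A corresponding sketch of remove(i) (again with hypothetical names, and a resize() that reallocates to size 2n as described in the next section):

```java
// Sketch of remove(i): shift left over a[i], shrink when mostly empty.
public class RemoveDemo {
    Object[] a = new Object[]{"a", "b", "c", null, null, null};
    int n = 3;

    void resize() { // reallocate to size 2n (at least 1)
        Object[] b = new Object[Math.max(2 * n, 1)];
        for (int j = 0; j < n; j++) b[j] = a[j];
        a = b;
    }

    Object remove(int i) {
        if (i < 0 || i >= n) throw new IndexOutOfBoundsException();
        Object x = a[i];
        for (int j = i; j < n - 1; j++)  // shift a[i+1],...,a[n-1] left
            a[j] = a[j + 1];
        n--;
        if (a.length >= 3 * n) resize(); // keep a from staying too large
        return x;
    }
}
```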
The resize() method is fairly straightforward: it allocates a new array b of size 2n, copies the n elements of a into the first n positions of b, and then sets a to b. Thus, after a call to resize(), a.length = 2n.
Analysing the actual cost of the resize() operation is easy. It allocates an array b whose size is 2n and copies the n elements of a into b. This takes O(n) time.
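A minimal sketch of such a resize() (hypothetical code, not necessarily the repository's exact implementation):

```java
// Sketch of resize(): allocate b of size 2n, copy n elements, replace a.
public class ResizeDemo {
    Object[] a = new Object[1];
    int n = 0;

    void resize() {
        // max(..., 1) handles the corner case n = 0, where new Object[0]
        // would leave no room for the next add
        Object[] b = new Object[Math.max(2 * n, 1)];
        for (int i = 0; i < n; i++) // O(n) copying dominates the cost
            b[i] = a[i];
        a = b;
    }
}
```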
The running time analysis from the previous section ignored the cost of calls to resize(). In this section we analyse the cost using a technique known as amortized analysis. This technique does not try to determine the cost of resizing during each individual add(i,x) and remove(i) operation. Instead, it considers the cost of all calls to resize() during a sequence of m calls to add(i,x) or remove(i). In particular we can show that:
If an empty ArrayStack is created and any sequence of m≥1 calls to add(i,x) and remove(i) are performed, then the total time spent during all calls to resize() is O(m).
We can show that any time resize() is called, the number of calls to add or remove since the last call to resize() is at least n𝒾/2-1. Therefore, if n𝒾 denotes the value of n during the ith call to resize() and r denotes the number of calls to resize(), then the total number of calls to add(i,x) or remove(i) is at least

m ≥ (n₁/2-1) + (n₂/2-1) + ... + (nᵣ/2-1)

which is equivalent to

n₁ + n₂ + ... + nᵣ ≤ 2m + 2r

On the other hand, the total time spent during all calls to resize() is

O(n₁ + n₂ + ... + nᵣ) = O(m + r) = O(m)

since r is not more than m. All that remains is to show that the number of calls to add(i,x) or remove(i) between the (𝒾-1)th and the 𝒾th call to resize() is at least n𝒾/2-1.
There are two cases to consider. In the first case, resize() is being called by add(i, x) because the backing array is full, i.e., a.length=n=n𝒾. Consider the previous call to resize(): after this previous call, the size of a was a.length, but the number of elements stored in a was at most a.length/2=n𝒾/2. But now the number of elements stored in a is n𝒾=a.length, so there must have been at least n𝒾/2 calls to add(i,x) since the previous call to resize().
The second case occurs when resize() is being called by remove(i) because a.length≥3n = 3n𝒾. Again, after the previous call to resize() the number of elements stored in a was at least a.length/2-1 (The -1 in this formula accounts for the special case that occurs when n=0 and a.length=1). Now there are n𝒾≤a.length/3 elements stored in a. Therefore, the number of remove(i) operations since the last call to resize() is at least
𝑅≥a.length/2-1-a.length/3
= a.length/6-1
= (a.length/3)/2-1
≥ n𝒾/2-1
In either case, the number of calls to add(i,x) or remove(i) that occur between the (𝒾-1)th call to resize() and the 𝒾th call to resize() is at least n𝒾/2-1, as required to complete the proof.
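The bound can also be checked empirically. The sketch below (hypothetical code) counts the element copies performed inside resize() over a long sequence of add-at-the-end operations; the total stays below 2m, i.e., O(m) for the whole sequence:

```java
// Count the copying work done by resize() over m add-at-the-end operations.
public class AmortizedDemo {
    Object[] a = new Object[1];
    int n = 0;
    long copies = 0; // elements copied by all calls to resize() combined

    void resize() {
        Object[] b = new Object[Math.max(2 * n, 1)];
        for (int i = 0; i < n; i++) { b[i] = a[i]; copies++; }
        a = b;
    }

    void addLast(Object x) { // add(n, x): no shifting needed
        if (n + 1 > a.length) resize();
        a[n++] = x;
    }
}
```

With doubling, the copies form a geometric series (1 + 2 + 4 + ... < 2m), which is the intuition behind the amortized argument above.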
The following theorem summarizes the performance of an ArrayStack:
An ArrayStack implements the List interface. Ignoring the cost of calls to resize(), an ArrayStack supports the operations
- get(i) and set(i,x) in O(1) time per operation
- add(i,x) and remove(i) in O(1 + n-i) time per operation
Furthermore, beginning with an empty ArrayStack and performing any sequence of m add(i,x) and remove(i) operations results in a total of O(m) time spent during all calls to resize().
The ArrayStack is an efficient way to implement a Stack. In particular we can implement push(x) as add(n,x) and pop() as remove(n-1), in which case these operations will run in O(1) amortized time.
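A sketch of that Stack view (hypothetical names): push adds at index n and pop removes at index n-1, so the shifting loops never move any elements.

```java
// Stack operations on a backing array: push(x) = add(n,x), pop() = remove(n-1).
public class StackDemo {
    Object[] a = new Object[1];
    int n = 0;

    void resize() {
        Object[] b = new Object[Math.max(2 * n, 1)];
        for (int i = 0; i < n; i++) b[i] = a[i];
        a = b;
    }

    void push(Object x) { // add(n, x): O(1 + n - n) = O(1) amortized
        if (n + 1 > a.length) resize();
        a[n++] = x;
    }

    Object pop() {        // remove(n - 1): no shifting either
        Object x = a[--n];
        a[n] = null;      // clear the slot so the GC can reclaim x
        if (a.length >= 3 * n) resize();
        return x;
    }
}
```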