Pointer attention is a simple application of the attention mechanism: instead of producing a weighted context vector, it outputs a pointer to the most focused input element.
Here is my simple implementation of a pointer attention network, using a bidirectional GRU to encode the input sequence and a unidirectional GRU to encode the previous outputs. The network is evaluated on the task of sorting lists of random integers.
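The core of the network is the attention module that scores every input position against the current decoder state and points at the highest-scoring one. Below is a minimal PyTorch sketch of that module using additive (Bahdanau-style) attention; the module name `PointerAttention` and the hidden size are my own choices, and the full GRU encoders and training loop are omitted.

```python
import torch
import torch.nn as nn

class PointerAttention(nn.Module):
    """Additive attention that returns a distribution over input
    positions; the pointer is the argmax of that distribution.
    A minimal sketch, not the complete pointer network."""

    def __init__(self, hidden_dim):
        super().__init__()
        self.W_enc = nn.Linear(hidden_dim, hidden_dim, bias=False)  # projects encoder states
        self.W_dec = nn.Linear(hidden_dim, hidden_dim, bias=False)  # projects decoder state
        self.v = nn.Linear(hidden_dim, 1, bias=False)               # scores each position

    def forward(self, enc_outputs, dec_hidden):
        # enc_outputs: (batch, seq_len, hidden_dim)
        # dec_hidden:  (batch, hidden_dim)
        scores = self.v(torch.tanh(
            self.W_enc(enc_outputs) + self.W_dec(dec_hidden).unsqueeze(1)
        )).squeeze(-1)                       # (batch, seq_len)
        probs = torch.softmax(scores, dim=-1)
        pointer = probs.argmax(dim=-1)       # index of the most-attended element
        return probs, pointer

# Tiny demo: point into a batch of two 5-step sequences
torch.manual_seed(0)
attn = PointerAttention(hidden_dim=8)
enc = torch.randn(2, 5, 8)   # stand-in for bi-GRU encoder outputs
dec = torch.randn(2, 8)      # stand-in for the previous-output GRU state
probs, pointer = attn(enc, dec)
print(probs.shape, pointer.shape)
```

At each decoding step the decoder state is fed to this module, and the pointed-to input element (here, an integer from the unsorted list) becomes the next output.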