Although Global Navigation Satellite System (GNSS) receivers currently achieve high positioning accuracy under line-of-sight (LOS) conditions, multipath interference and noise degrade this accuracy considerably. To mitigate multipath interference, receivers based on multiple antennas have become a focus of research and technological development. In this context, tensor-based approaches built on Parallel Factor Analysis (PARAFAC) models have been proposed in the literature, providing optimum performance. However, state-of-the-art techniques for antenna-array-based GNSS receivers compute a singular value decomposition (SVD) for each new sample, which implies a high computational complexity and is therefore prohibitive for real-time applications. Hence, subspace tracking algorithms are essential to reduce the computational cost of the parameter estimation. In this work, we propose a tensor-based subspace tracking framework that reduces the overall computational complexity of the highly accurate tensor-based time-delay estimation process.
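To illustrate the complexity argument, the following is a minimal sketch of a classical subspace tracker, the PAST algorithm (projection approximation subspace tracking), which updates a subspace estimate recursively at O(nr) cost per snapshot instead of recomputing an SVD. This is a generic illustration, not the specific tensor-based framework proposed in this work; the function name, dimensions, and toy data are assumptions for demonstration only.

```python
import numpy as np

def past_update(W, P, x, beta=0.97):
    """One step of the PAST subspace tracker.

    W    : (n, r) current subspace estimate
    P    : (r, r) inverse correlation matrix of the compressed data
    x    : (n,)   new data snapshot
    beta : forgetting factor in (0, 1]
    """
    y = W.conj().T @ x               # project snapshot onto current subspace
    h = P @ y
    g = h / (beta + np.vdot(y, h))   # gain vector
    P = (P - np.outer(g, h.conj())) / beta
    e = x - W @ y                    # residual outside the tracked subspace
    W = W + np.outer(e, g.conj())    # rank-one subspace correction
    return W, P

# toy demo (hypothetical data): track a rank-2 subspace from noisy snapshots
rng = np.random.default_rng(0)
n, r = 8, 2
U_true, _ = np.linalg.qr(rng.standard_normal((n, r)))  # ground-truth subspace
W = np.linalg.qr(rng.standard_normal((n, r)))[0]       # random initial estimate
P = np.eye(r)
for _ in range(500):
    x = U_true @ rng.standard_normal(r) + 0.01 * rng.standard_normal(n)
    W, P = past_update(W, P, x)

# smallest singular value of U_true^H Q approaches 1 when subspaces coincide
Q, _ = np.linalg.qr(W)
align = np.linalg.svd(U_true.conj().T @ Q, compute_uv=False).min()
```

Each update costs only a few matrix-vector products of size n-by-r, which is the kind of per-sample saving, relative to a full SVD, that motivates subspace tracking in real-time receivers.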