This paper proposes a novel sequential neural network that integrates a collaborative context-learning module to learn from sparse and lengthy POI data for top-N recommendation tasks. Our proposed model exploits both time and distance irregularities in check-in sequences and incorporates them into the learning process by decomposing the memory cell into short- and long-term interests. The short-term component is scaled by a learned weight and combined with the long-term component before further processing by a light version of LSTM, in which the forget and input gates are merged into a single decision unit. We further improve the model by training multiple contexts collaboratively under the gating mechanism of recurrent neural networks. Extensive experiments on two public datasets demonstrate the superiority of our model over existing baselines and state-of-the-art sequence-based models.
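The cell update described above can be sketched as follows. This is a minimal, hedged illustration only: the variable names (`alpha`, `c_short`, `c_long`) and the coupled-gate form (input gate tied to the forget gate as `i = 1 - f`) are assumptions drawn from the abstract's description, not the paper's exact equations.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def coupled_lstm_step(x, h_prev, c_short, c_long, params, alpha=0.5):
    """One step of a light LSTM cell whose memory is decomposed into
    short- and long-term components, with forget and input gates
    merged into a single decision unit (i = 1 - f).

    Names and the scalar weight `alpha` are illustrative assumptions.
    """
    W_f, W_c, W_o, b_f, b_c, b_o = params
    z = np.concatenate([h_prev, x])

    # Recombine the decomposed memory: the short-term interest is
    # adjusted by a weight before merging with the long-term part.
    c_prev = alpha * c_short + c_long

    # Coupled gates: a single forget gate f also decides the input
    # contribution via (1 - f), uniting the two gates.
    f = sigmoid(W_f @ z + b_f)
    c_tilde = np.tanh(W_c @ z + b_c)
    c_new = f * c_prev + (1.0 - f) * c_tilde

    # Standard output gate producing the new hidden state.
    o = sigmoid(W_o @ z + b_o)
    h_new = o * np.tanh(c_new)
    return h_new, c_new
```

Coupling the gates this way halves the gate parameters relative to a full LSTM, which can help when check-in data are sparse.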