Learning continually, accumulating knowledge, and using that knowledge to learn new tasks are the defining characteristics of lifelong learning. Lifelong learning, also known as continual learning, leverages knowledge acquired on a previous task to solve a new task; this scheme works by applying a suitable regularization between the two. The elastic weight consolidation (EWC) method, proposed by Google DeepMind, estimates the importance of each weight to the previously acquired knowledge and selectively adjusts its plasticity so that important weights are preserved. In this paper, EWC is applied to a sequence of churn-prediction tasks built from two distinct telecom datasets. The experimental results show that EWC improves model performance in sequential training. Lifelong learning thus offers a more flexible way of learning and a basis for further research in dynamic learning.
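For reference, the importance-weighted regularization mentioned above corresponds, in the standard formulation of EWC (Kirkpatrick et al., 2017), to augmenting the new-task loss with a quadratic penalty; the symbols below follow that formulation and are not defined in this abstract:

\[
\mathcal{L}(\theta) \;=\; \mathcal{L}_B(\theta) \;+\; \sum_i \frac{\lambda}{2}\, F_i \left(\theta_i - \theta^{*}_{A,i}\right)^2
\]

where \(\mathcal{L}_B\) is the loss on the new task \(B\), \(\theta^{*}_{A,i}\) are the parameters learned on the previous task \(A\), \(F_i\) is the corresponding diagonal entry of the Fisher information matrix estimated on task \(A\) (the weight-importance measure), and \(\lambda\) controls how strongly the old knowledge is preserved.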