📄️ Long Short-Term Memory
Long Short-Term Memory (LSTM) is a type of recurrent neural network (RNN) designed specifically to learn and retain patterns in sequential data over long time horizons. Traditional RNNs struggle with the "vanishing gradient" problem, which limits their ability to learn dependencies across long sequences. LSTMs address this with a distinctive architecture built around memory cells that can store, update, or forget information over time. Each cell is controlled by three gates: the input, forget, and output gates, which regulate the flow of information into, out of, and within the cell, respectively.
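To make the gating mechanism concrete, the sketch below implements a single step of the standard LSTM cell in plain NumPy. The equations are the textbook formulation; the dictionary-based parameter layout (`W`, `U`, `b` keyed by gate) is purely illustrative and not taken from any particular implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM time step. W, U, b are dicts of weight matrices and biases
    for the forget (f), input (i), output (o), and candidate (g) transforms."""
    f = sigmoid(W["f"] @ x_t + U["f"] @ h_prev + b["f"])  # forget gate: fraction of old memory to keep
    i = sigmoid(W["i"] @ x_t + U["i"] @ h_prev + b["i"])  # input gate: how much new information to write
    o = sigmoid(W["o"] @ x_t + U["o"] @ h_prev + b["o"])  # output gate: how much of the cell state to expose
    g = np.tanh(W["g"] @ x_t + U["g"] @ h_prev + b["g"])  # candidate values for the cell state
    c_t = f * c_prev + i * g   # new cell state: partly forget the old memory, partly write the new
    h_t = o * np.tanh(c_t)     # new hidden state: a filtered view of the memory
    return h_t, c_t
```

Because the forget gate multiplies the previous cell state directly, gradients can flow through `c_t` across many steps without repeatedly passing through squashing nonlinearities, which is what mitigates the vanishing-gradient issue described above.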
📄️ Data and Preprocessing
The data used comprises all of the previously mentioned and replicated datasets. They are updated weekly, and no additional preprocessing is applied beyond rebasing the raw data to 100, as outlined in the section detailing the replication of certain variables.
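For reference, rebasing to 100 is a one-liner in pandas. This minimal sketch assumes that "rebasing to 100" means scaling each series so that its first observation equals 100; the DataFrame and column names are illustrative only.

```python
import pandas as pd

def rebase_to_100(df: pd.DataFrame) -> pd.DataFrame:
    """Rescale each column so its first observation equals 100."""
    return df / df.iloc[0] * 100

# Example: two weekly series on very different scales.
raw = pd.DataFrame({"btc": [27000.0, 28350.0, 27810.0],
                    "hashrate": [450.0, 459.0, 472.5]})
print(rebase_to_100(raw))  # both columns now start at 100.0
```

Rebasing puts heterogeneous series on a common scale, which makes them directly comparable as model inputs without otherwise altering their shape.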
📄️ Structure and Parameterization
The "Eclipse" LSTM model is designed to predict cyclical patterns in Bitcoin markets using time-series data. The model architecture and its parameters have been carefully configured to ensure robustness and reliability in capturing temporal dependencies and nonlinear relationships. Below is a detailed explanation of its structure and parameterization: