Integrating LSTM, Transformer, and LightGBM for Enhanced Predictive Modeling: A Mechanistic Approach

Authors

  • Laila A. Wahab Abdullah Naji, University of Aden, Faculty of Aden, Yemen
  • Ibrahim Khider Eltahir, Sudan University of Science and Technology, Khartoum, Sudan
  • Hadeil Haydar Ahmed Elsheikh, Sudan University of Science and Technology, Khartoum, Sudan

Abstract

In predictive analytics, the ability to process and analyze diverse data types efficiently is crucial for decision-making across domains. This paper introduces a mechanism that integrates Long Short-Term Memory (LSTM) networks, Transformer models, and the Light Gradient Boosting Machine (LightGBM) to address the challenges of analyzing sequential, time-series, and tabular data. The mechanism leverages the complementary strengths of each component: LSTM networks for modeling sequential dependencies, Transformer models for capturing long-range interactions through self-attention, and LightGBM for efficient predictive modeling on tabular data. Our methodology defines an integration strategy in which the three models interact seamlessly and complement one another's capabilities. Experiments on diverse datasets show that the integrated model improves predictive accuracy and efficiency over traditional approaches and standalone models. These findings indicate that combining LSTM, Transformer, and LightGBM is a robust approach to complex predictive analytics tasks and opens new avenues for research and application.
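
To make the integration concrete, the sketch below illustrates one common way to fuse the three models; it is our illustration, not the paper's published pipeline. An LSTM and a Transformer encoder each embed a sequence, the pooled embeddings are concatenated with static tabular features, and LightGBM is fit on the fused matrix. The SequenceEncoder class, all shapes, and all hyperparameters are hypothetical placeholders.

    # Minimal sketch (assumed architecture, not the authors' exact method):
    # LSTM and Transformer embeddings of a sequence are concatenated with
    # tabular features and passed to LightGBM.
    import numpy as np
    import torch
    import torch.nn as nn
    import lightgbm as lgb

    class SequenceEncoder(nn.Module):
        """Encodes a (batch, seq_len, n_feat) sequence two ways and pools each."""
        def __init__(self, n_feat: int, d_model: int = 32):
            super().__init__()
            self.lstm = nn.LSTM(n_feat, d_model, batch_first=True)
            self.proj = nn.Linear(n_feat, d_model)  # match Transformer width
            layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
            self.transformer = nn.TransformerEncoder(layer, num_layers=2)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            _, (h, _) = self.lstm(x)                 # h: (1, batch, d_model)
            lstm_emb = h[-1]                         # final hidden state per sequence
            trans_emb = self.transformer(self.proj(x)).mean(dim=1)  # mean-pool over time
            return torch.cat([lstm_emb, trans_emb], dim=1)

    # Synthetic stand-ins for the paper's datasets (hypothetical shapes).
    rng = np.random.default_rng(0)
    seqs = torch.randn(256, 20, 8)       # 256 sequences, 20 steps, 8 features
    tabular = rng.normal(size=(256, 5))  # 5 static tabular features
    y = rng.integers(0, 2, size=256)     # binary target

    # The encoder is untrained here; in practice both sequence models would be
    # trained (or fine-tuned) before their embeddings are handed to LightGBM.
    encoder = SequenceEncoder(n_feat=8).eval()
    with torch.no_grad():
        seq_emb = encoder(seqs).numpy()  # (256, 64) fused sequence embeddings

    # LightGBM consumes the concatenated embedding + tabular feature matrix.
    X = np.hstack([seq_emb, tabular])
    model = lgb.LGBMClassifier(n_estimators=100)
    model.fit(X, y)
    print("train accuracy:", model.score(X, y))

This late-fusion design keeps each model in its comfort zone: the neural encoders handle temporal structure, while LightGBM handles heterogeneous tabular features and the final decision boundary. Other fusion strategies (e.g., feeding LightGBM outputs back into the neural models) are possible and would change this sketch.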

Published

2024-04-02