# GA-FL-Transformer: An Intelligent Optimization Framework for Corporate Financial Management

## Description

This repository contains the implementation code and dataset preprocessing scripts for the research paper "Design of an Intelligent Optimization Framework for Corporate Financial Management Based on GA-FL-Transformer."

The GA-FL-Transformer is a hybrid optimization model that integrates a Genetic Algorithm (GA), Fuzzy Logic (FL), and a Transformer architecture to address challenges in enterprise financial management. It aims to enhance financial decision-making through multi-source data perception, adaptive optimization, and interpretable reasoning in dynamic business environments. The framework performs unified encoding of heterogeneous financial data, attention-guided genetic-fuzzy optimization, and interpretable rule-based reasoning, providing a closed-loop decision process from perception to optimization.

## Dataset Information

Two publicly available financial datasets were employed in this study:

1. **Compustat Dataset**
   - Source: Zenodo Repository (https://zenodo.org/records/15369177)
   - DOI: 10.5281/zenodo.15369177
   - Description: Structured financial statements (balance sheet, income statement, cash flow) for North American listed firms.
2. **CRSP Dataset (CRISPR Therapeutics AG Stock Performance)**
   - Source: Zenodo Repository (https://zenodo.org/records/12555376)
   - DOI: 10.5281/zenodo.12555376
   - Description: Market behavior data such as stock prices, returns, and trading volumes for financial performance and risk analysis.

Preprocessing steps:

- Z-score normalization of numerical features
- TF-IDF and Word2Vec embeddings for textual data
- Sliding-window smoothing for temporal sequences
- Gaussian imputation for missing values
- Unified embedding concatenation for Transformer input (see Eq. (2) in the paper)

## Code Information

The repository contains the following main modules:

- `data_preprocessing.py` – Prepares and encodes multi-source data (structured, text, time series).
- `transformer_encoder.py` – Implements multi-head self-attention for feature perception.
- `genetic_fuzzy_module.py` – Performs genetic optimization and fuzzy rule adaptation.
- `model_training.py` – Handles joint training, validation, and optimization of the GA-FL-Transformer.
- `evaluation_metrics.py` – Calculates Precision, Recall, F1-Score, and Error Rate.
- `ablation_analysis.py` – Conducts ablation experiments to verify each model component.
- `config.yaml` – Hyperparameter settings (learning rate, batch size, epochs, crossover/mutation rates).

## Usage Instructions

### 1. Installation

```bash
git clone https://github.com//GA-FL-Transformer.git
cd GA-FL-Transformer
pip install -r requirements.txt
```

### 2. Dataset Preparation

Download the datasets from the Zenodo links above and place them in the `./data/` directory:

```
data/
├── compustat.csv
└── crsp.csv
```

### 3. Model Training

```bash
python model_training.py --dataset compustat --epochs 200 --batch_size 64
```

### 4. Evaluation and Visualization

```bash
python evaluation_metrics.py --dataset crsp
python ablation_analysis.py
```

## Requirements

- OS: Windows 10/11 or Ubuntu 22.04
- Python ≥ 3.10
- Libraries:
  - torch >= 2.2.0
  - numpy >= 1.25.0
  - pandas >= 2.1.0
  - scikit-learn >= 1.3.0
  - gensim >= 4.3.0
  - matplotlib >= 3.7.0
  - tqdm >= 4.66.0
- GPU: CUDA 12.1 + cuDNN 8.9 (optional)

## Methodology

The GA-FL-Transformer framework comprises three core layers:

1. **Perception Layer**: A Transformer encoder extracts multi-source financial features via multi-head attention.
2. **Optimization Layer**: A Genetic Algorithm evolves fuzzy rules and membership functions, guided by attention weights.
3. **Decision Layer**: Fuzzy Logic performs interpretable reasoning and outputs financial decisions.

Training alternates between Transformer feature updates and GA-FL optimization to achieve closed-loop convergence.
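As a rough illustration of the Optimization Layer, the sketch below evolves a vector of fuzzy membership centers toward attention-derived feature importances using a plain genetic algorithm (rank-based selection, uniform crossover, Gaussian mutation). The function names and the toy fitness objective are illustrative assumptions, not the paper's actual implementation; see `genetic_fuzzy_module.py` for the real one.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(individual, attn):
    # Toy objective (assumption): fuzzy membership centers should track
    # attention-derived feature importances; higher is better, max is 0.
    return -float(np.sum(attn * (individual - attn) ** 2))

def evolve(attn, pop_size=20, generations=40,
           crossover_rate=0.8, mutation_rate=0.1):
    """Evolve a population of candidate membership-center vectors."""
    dim = len(attn)
    pop = rng.random((pop_size, dim))
    for _ in range(generations):
        scores = np.array([fitness(ind, attn) for ind in pop])
        # Rank-based selection: keep the better half as parents.
        parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]
        children = []
        while len(children) < pop_size:
            a, b = parents[rng.integers(len(parents), size=2)]
            # Uniform crossover: each gene comes from parent a or b.
            child = np.where(rng.random(dim) < crossover_rate, a, b)
            # Gaussian mutation on a random subset of genes.
            mask = rng.random(dim) < mutation_rate
            child = child + mask * rng.normal(0.0, 0.1, dim)
            children.append(child)
        pop = np.array(children)
    scores = np.array([fitness(ind, attn) for ind in pop])
    return pop[np.argmax(scores)], float(scores.max())
```

In the full framework this inner loop would alternate with Transformer feature updates, with `attn` refreshed from the encoder's attention weights each round.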
## Citation

If you use this code or dataset in your research, please cite:

> Zhu F., Liu S., Yuan F., & Arshad M. (2025). Design of an Intelligent Optimization Framework for Corporate Financial Management Based on GA-FL-Transformer. *Journal of Computational Methods in Sciences and Engineering*.

## License & Contribution Guidelines

- **License**: MIT License – free for academic and research use with proper attribution.
- **Contributions**: Contributions, pull requests, and extensions (e.g., alternative optimizers or Transformer variants) are welcome. Please open an issue before submitting major changes.
- **Contact**: For questions, email jenny2025226@163.com (corresponding author).