# Natural Language Inference Transformer Models

## Overview

This code implements transformer-based models (BERT, GPT, RoBERTa, and XLNet) for Natural Language Inference (NLI) using several pooling strategies (max, min, and mean) and norm calculations (L1, L2, and L-infinity).

## Prerequisites

The code requires Python 3 and the following libraries:

- Hugging Face Transformers
- PyTorch
- NumPy
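
## Example

As a rough illustration of the pooling and norm operations named above, the sketch below pools BERT token embeddings for a premise/hypothesis pair and computes the three norms. It is a minimal sketch, not the repository's own code: the checkpoint name (`bert-base-uncased`) and the example sentences are illustrative assumptions.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Illustrative checkpoint choice; the repo also supports GPT, RoBERTa, and XLNet.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Hypothetical NLI sentence pair.
premise = "A man is playing a guitar."
hypothesis = "A person is making music."
inputs = tokenizer(premise, hypothesis, return_tensors="pt")

with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # shape: (1, seq_len, hidden_dim)

# Pooling strategies over the token dimension.
# Note: with batched, padded inputs you would mask padding tokens first;
# this single unpadded pair keeps the sketch simple.
max_pooled = hidden.max(dim=1).values
min_pooled = hidden.min(dim=1).values
mean_pooled = hidden.mean(dim=1)

# Norm calculations on a pooled sentence-pair representation.
for name, p in [("L1", 1), ("L2", 2), ("L-inf", float("inf"))]:
    print(name, torch.linalg.norm(mean_pooled, ord=p, dim=1).item())
```

The same pooling and norm calls apply unchanged to the other model families, since each returns per-token hidden states of shape `(batch, seq_len, hidden_dim)`.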