runtime error
Exit code: 1. Reason:
Traceback (most recent call last):
  File "/home/user/app/app.py", line 2, in <module>
    from llmtuner import create_ui
  File "/usr/local/lib/python3.10/site-packages/llmtuner/__init__.py", line 3, in <module>
    from .api import create_app
  File "/usr/local/lib/python3.10/site-packages/llmtuner/api/__init__.py", line 1, in <module>
    from .app import create_app
  File "/usr/local/lib/python3.10/site-packages/llmtuner/api/app.py", line 9, in <module>
    from ..chat import ChatModel
  File "/usr/local/lib/python3.10/site-packages/llmtuner/chat/__init__.py", line 1, in <module>
    from .chat_model import ChatModel
  File "/usr/local/lib/python3.10/site-packages/llmtuner/chat/chat_model.py", line 11, in <module>
    from ..model import dispatch_model, load_model_and_tokenizer
  File "/usr/local/lib/python3.10/site-packages/llmtuner/model/__init__.py", line 1, in <module>
    from .loader import load_model_and_tokenizer
  File "/usr/local/lib/python3.10/site-packages/llmtuner/model/loader.py", line 11, in <module>
    from .patcher import patch_config, patch_model, patch_tokenizer, patch_valuehead_model
  File "/usr/local/lib/python3.10/site-packages/llmtuner/model/patcher.py", line 18, in <module>
    from ..extras.patches.llama_patch import apply_llama_patch
  File "/usr/local/lib/python3.10/site-packages/llmtuner/extras/patches/llama_patch.py", line 6, in <module>
    from transformers.models.llama.modeling_llama import (
ImportError: cannot import name 'LlamaFlashAttention2' from 'transformers.models.llama.modeling_llama' (/usr/local/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py)
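
The crash is a version mismatch between llmtuner and transformers: llmtuner's llama_patch imports LlamaFlashAttention2, a class that only exists in a window of transformers releases (it was added around 4.34 and later removed when the attention implementations were refactored), so the transformers build installed in this container does not export it. Below is a minimal diagnostic sketch; the exact version window and the 4.36.2 pin are assumptions based on transformers release history, not something this log confirms.

# check_transformers_compat.py -- diagnostic sketch, not part of llmtuner.
# The suggested pin (transformers==4.36.2) is an assumed-compatible release.
import importlib.metadata

installed = importlib.metadata.version("transformers")
print(f"transformers {installed} is installed")

try:
    # This is the exact import that fails in llmtuner's llama_patch.py.
    from transformers.models.llama.modeling_llama import LlamaFlashAttention2  # noqa: F401
    print("LlamaFlashAttention2 is importable; llmtuner's llama_patch should load")
except ImportError:
    print("LlamaFlashAttention2 is missing from this transformers build.")
    print("Pin transformers to a release that still ships it, e.g.:")
    print('    pip install "transformers==4.36.2"')
    print("or upgrade to a newer llmtuner/LLaMA-Factory release that no longer")
    print("imports this class.")

In a Hugging Face Space, the pin would go in the repository's requirements.txt so the container rebuilds with a compatible transformers; the alternative (upgrading llmtuner, which the LLaMA-Factory project has since superseded) avoids pinning but may require adapting app.py to the newer API.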