Runtime error

Exit code: 1. Reason:

config.json: 100%|██████████| 1.22k/1.22k [00:00<00:00, 6.81MB/s]
tokenizer_config.json: 100%|██████████| 361/361 [00:00<00:00, 1.76MB/s]
merges.txt: 100%|██████████| 456k/456k [00:00<00:00, 73.2MB/s]
special_tokens_map.json: 100%|██████████| 239/239 [00:00<00:00, 1.33MB/s]

Traceback (most recent call last):
  File "/home/user/app/app.py", line 8, in <module>
    model = LingMessCoref()
  File "/home/user/.local/lib/python3.10/site-packages/fastcoref/f_coref.py", line 223, in __init__
    super().__init__('LingMessCoref', device)
  File "/home/user/.local/lib/python3.10/site-packages/fastcoref/f_coref.py", line 110, in __init__
    self.tokenizer = AutoTokenizer.from_pretrained(
  File "/home/user/.local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 787, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/home/user/.local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2028, in from_pretrained
    return cls._from_pretrained(
  File "/home/user/.local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2060, in _from_pretrained
    slow_tokenizer = (cls.slow_tokenizer_class)._from_pretrained(
  File "/home/user/.local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2260, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/home/user/.local/lib/python3.10/site-packages/transformers/models/longformer/tokenization_longformer.py", line 230, in __init__
    with open(vocab_file, encoding="utf-8") as vocab_handle:
TypeError: expected str, bytes or os.PathLike object, not NoneType
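For context on the final frame: the traceback shows transformers falling back to the slow LongformerTokenizer, whose `vocab_file` argument ended up as `None` (note that the download log above lists `merges.txt` but no vocab file), so the `open(vocab_file, ...)` call fails. The attribution of `vocab_file` being `None` is inferred from the error message, not confirmed from the library internals. A minimal sketch of just that failure mode:

```python
# Minimal reproduction of the last frame of the traceback:
# open() with a None path raises exactly this TypeError.
try:
    open(None, encoding="utf-8")
except TypeError as e:
    print(e)  # expected str, bytes or os.PathLike object, not NoneType
```

This only demonstrates why a missing (unresolved) vocab file surfaces as a `TypeError` rather than a clearer "file not found" error; it does not by itself identify why the tokenizer files failed to resolve for this model.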
