Failed to run on MacBook: requiring flash_attn

#72
by jamesbraza - opened

Running the below Python code on my MacBook Pro with an M1 chip, macOS Ventura 13.5.2:

# With Python 3.11.7, transformers==4.36.2
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("microsoft/phi-1_5", trust_remote_code=True)

I get this error:

  File "/Users/user/code/project/play.py", line 6, in <module>
    model = AutoModelForCausalLM.from_pretrained(MODEL_NAME, trust_remote_code=True)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/user/code/project/venv/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 553, in from_pretrained
    model_class = get_class_from_dynamic_module(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/user/code/project/venv/lib/python3.11/site-packages/transformers/dynamic_module_utils.py", line 488, in get_class_from_dynamic_module
    final_module = get_cached_module_file(
                   ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/user/code/project/venv/lib/python3.11/site-packages/transformers/dynamic_module_utils.py", line 315, in get_cached_module_file
    modules_needed = check_imports(resolved_module_file)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/user/code/project/venv/lib/python3.11/site-packages/transformers/dynamic_module_utils.py", line 180, in check_imports
    raise ImportError(
ImportError: This modeling file requires the following packages that were not found in your environment: flash_attn. Run `pip install flash_attn`
python-BaseException

This clearly states that flash_attn is required. However, per this issue, flash-attn cannot be installed on my MacBook because it doesn't have an Nvidia GPU.

Is there anything I can do here to bypass this flash_attn dependency?

It looks like https://github.com/huggingface/transformers/blob/v4.36.2/src/transformers/models/phi/modeling_phi.py#L50-L52 is getting picked up by https://github.com/huggingface/transformers/blob/v4.36.2/src/transformers/dynamic_module_utils.py#L154 as a requirement, when it actually isn't.
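For context, the code at the linked lines looks roughly like this (paraphrased, not an exact excerpt): the flash_attn import only ever runs when flash-attn 2 is actually available, yet the import scan still reports flash_attn as a hard requirement.

# Paraphrased from the linked modeling_phi.py lines (transformers v4.36.2):
# the import is guarded, but check_imports still flags flash_attn as required.
from transformers.utils import is_flash_attn_2_available

if is_flash_attn_2_available():
    from flash_attn import flash_attn_func, flash_attn_varlen_func
    from flash_attn.bert_padding import index_first_axis, pad_input, unpad_input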

Using unittest.mock.patch, we can work around this:

# With Python 3.11.7, transformers==4.36.2
import os
from unittest.mock import patch

from transformers import AutoModelForCausalLM
from transformers.dynamic_module_utils import get_imports


def fixed_get_imports(filename: str | os.PathLike) -> list[str]:
    """Work around for https://huggingface.co/microsoft/phi-1_5/discussions/72."""
    if not str(filename).endswith("/modeling_phi.py"):
        return get_imports(filename)
    imports = get_imports(filename)
    imports.remove("flash_attn")
    return imports


with patch("transformers.dynamic_module_utils.get_imports", fixed_get_imports):
    model = AutoModelForCausalLM.from_pretrained("microsoft/phi-1_5", trust_remote_code=True)

I think this is actually a bug in transformers.dynamic_module_utils.get_imports: it needs to comprehend conditional imports. What do you think?
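As a rough illustration of what "comprehending conditional imports" could mean (this is only a sketch with a hypothetical helper, not the actual transformers implementation), the scan could consider only top-level imports, so anything guarded by try/except or an if-block is skipped:

# Sketch only: collect module names from top-level imports, skipping anything
# nested inside try/except or if-blocks (hypothetical helper, not transformers code).
import ast


def get_unguarded_imports(filename: str) -> list[str]:
    with open(filename, encoding="utf-8") as f:
        tree = ast.parse(f.read())
    modules: set[str] = set()
    for node in tree.body:  # only top-level statements, so guarded imports are ignored
        if isinstance(node, ast.Import):
            modules.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module and node.level == 0:
            modules.add(node.module.split(".")[0])
    return sorted(modules)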

I opened https://github.com/huggingface/transformers/issues/28459 in transformers to surface this there.

Microsoft org

Hello @jamesbraza !

Thanks a lot for your debugging and raising an issue in transformers.

Something is definitely off with the dynamic module loading. I suspect this is a combined problem with trust_remote_code=True. It could also be related to how things are imported, since we are externally importing transformers.utils in the remote file.

Whenever you have some time, could you please clone transformers, install it with "pip install -e ." and test the model without trust_remote_code=True?

You will be in version 4.37.0.dev, which has support for the Phi model uploaded on this repository.
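For reference, the suggested check boils down to something like this (the shell commands are shown as comments and the exact steps are illustrative):

# In a shell first (illustrative):
#   git clone https://github.com/huggingface/transformers.git
#   cd transformers && pip install -e .
from transformers import AutoModelForCausalLM

# With the 4.37.0.dev editable install, Phi is supported natively,
# so trust_remote_code should not be needed.
model = AutoModelForCausalLM.from_pretrained("microsoft/phi-1_5")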

If it works, we will be able to isolate even more where the issue is located.

Thanks and regards,
Gustavo.

Microsoft org

I confirm that it only happens with trust_remote_code=True. It must be something behind the scenes in how transformers processes these files.

Nevertheless, I deployed a fix that uses try/except, and it seems to be working.

Yeah, it looks like until the transformers automatic import detection is generalized, try-except is the way to go:

# is_flash_attn_2_available comes from transformers.utils, which the remote
# modeling file already imports.
from transformers.utils import is_flash_attn_2_available

try:  # noqa: SIM105
    if is_flash_attn_2_available():
        from flash_attn import flash_attn_func, flash_attn_varlen_func
        from flash_attn.bert_padding import index_first_axis, pad_input, unpad_input
except ImportError:
    # Workaround for https://github.com/huggingface/transformers/issues/28459,
    # don't move to contextlib.suppress(ImportError)
    pass

Feel free to tag me on a PR if needed

Microsoft org

@jamesbraza I prefer your fix over mine lol

Could you please open a PR and update it here? Feel free to do the same on Phi-2 as well.

I tested it on Phi-1 and it works beautifully.

gugarosa changed discussion status to closed

Hello @gugarosa, it looks like you implemented this in 426ea90. Thanks for being responsive and improving the source code here :)

Hi, could you please tell me where to add the try-except? It seems it's not in the transformers.dynamic_module_utils file...
My transformers version is 4.44.2.
