Why am I getting this error?
TypeError: load_model() missing 1 required positional argument: 'ckpt_path'
Please install the package directly from the repository to avoid this error:
pip install git+https://github.com/ai4bharat/IndicF5.git
Subject: Issue with Loading AI4Bharat Model - Meta Tensor Error in Colab/Kaggle
Dear AI4Bharat Team,
I am trying to use your model (specifically ai4bharat/IndicF5) in Google Colab and Kaggle environments, but I repeatedly encounter the following error during model loading:
NotImplementedError: Cannot copy out of meta tensor; no data!
Please use torch.nn.Module.to_empty() instead of torch.nn.Module.to() when moving module from meta to a different device.
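For reference, the situation the traceback describes can be reproduced in isolation with a toy module (this is a minimal sketch, not the IndicF5 loading code): parameters created on the meta device have shape but no storage, so .to() fails, while .to_empty() allocates uninitialized storage on the target device, after which real weights must still be loaded.

```python
import torch
import torch.nn as nn

# Build a module on the meta device: parameters have shapes but no data.
with torch.device("meta"):
    layer = nn.Linear(4, 4)

# .to() tries to copy tensor data, which meta tensors do not have.
try:
    layer.to("cpu")
except NotImplementedError as e:
    print("to() failed:", e)

# .to_empty() allocates (uninitialized!) storage on the target device instead.
# A real state_dict must still be loaded afterwards to get meaningful weights.
layer = layer.to_empty(device="cpu")
print(layer.weight.device)  # now a CPU tensor, but with garbage values
```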
I have already tried several fixes:
- Downgrading/upgrading PyTorch versions (1.13, 2.0, 2.1, etc.)
- Trying different Transformers versions (v4.28 to v4.40+)
- Using both device_map="auto" and low_cpu_mem_usage=False
- Downloading the model locally and loading manually
- Running on both CPU and GPU environments
Even after all of these attempts, the model fails to load and raises the same meta tensor error. This suggests the model's parameters are never materialized and remain on the meta device, so .to() has no data to copy.
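One way to confirm this diagnosis is to list which parameters are still on the meta device after loading. The helper below is a generic sketch (the toy Sequential merely stands in for the actual model object):

```python
import torch
import torch.nn as nn

def find_meta_params(model: nn.Module) -> list[str]:
    """Return the names of parameters that are still on the meta device."""
    return [name for name, p in model.named_parameters()
            if p.device.type == "meta"]

# Demo with a toy module built on the meta device (stand-in for the real model):
with torch.device("meta"):
    toy = nn.Sequential(nn.Linear(8, 8), nn.ReLU(), nn.Linear(8, 2))

print(find_meta_params(toy))  # every weight and bias is still on meta
```

If this list is non-empty for the real model, the loader never materialized those weights, which matches the error above.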
Earlier this same model was working perfectly fine. Could you please confirm:
- Is there any recent update in the model architecture or Hugging Face config?
- Is there a specific version of PyTorch or Transformers needed now?
- Is there any patch we can apply to make it work again?
Attached is a screenshot of the full error trace.
Thank you for your time and support. Looking forward to your response.
Best regards,
Vikas