Remove development branch of transformers

#11

The development branch of transformers for Gemma 3 (v4.49.0-Gemma-3) is no longer required and is, in fact, quite buggy. I suggest removing the branch requirement.

In fact, it took me a long time to weed through the bugs before realizing that the main branch of transformers already has Gemma 3 in working condition.

How were you able to use Gemma 3 with the main branch of transformers?

Actually, maybe not completely. I'm not interested in the image modality, so I haven't explored that part, but for text I could get both the 1B and the 4B models working.
Even the text part causes trouble for me on the Gemma 3 branch (not to mention the bugs it causes for other models).

This is a bit hacky because (1) the tokenizer was not being loaded and (2) I had trouble with the image-text-to-text pipeline, but the normal text-generation pipeline works:

```python
import torch
import transformers

use_4b_model = True

# Load the 1B text-only model; attach the tokenizer manually because
# the pipeline was not loading it on its own.
pipe = transformers.pipeline(
    "text-generation",
    model="google/gemma-3-1b-it",
    device="cuda",
    torch_dtype=torch.bfloat16,
)
pipe.tokenizer = transformers.AutoTokenizer.from_pretrained("google/gemma-3-1b-it")

if use_4b_model:
    # The 4B checkpoint is multimodal; swap in only its language model
    # so the plain text-generation pipeline can drive it.
    model = transformers.AutoModelForPreTraining.from_pretrained(
        "google/gemma-3-4b-it", device_map="cuda", torch_dtype=torch.bfloat16
    )
    pipe.model = model.language_model

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Write a poem on Hugging Face, the company"},
]

output = pipe(messages, max_new_tokens=50)
print(output[0]["generated_text"][-1]["content"])
# outputs:
# Okay, here's a poem about Hugging Face, aiming to capture its spirit and impact:
#
# **The Open Embrace**
#
# In realms of code, a vibrant hue,
# Hugging Face emerges, fresh and new.
# Not just a
```
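For reference, the indexing in the final `print` works because a chat-style text-generation pipeline returns, for each input, a dict whose `generated_text` field holds the full conversation as a message list; the model's reply is the last message. A minimal sketch with a mocked output (the reply text here is illustrative, not real model output):

```python
# Mocked pipeline result: one dict per input, with "generated_text"
# holding the whole conversation including the assistant's reply.
# The reply string below is a placeholder, not actual Gemma 3 output.
output = [
    {
        "generated_text": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Write a poem on Hugging Face, the company"},
            {"role": "assistant", "content": "Okay, here's a poem about Hugging Face..."},
        ]
    }
]

# Last message of the first (and only) result is the model's reply.
reply = output[0]["generated_text"][-1]["content"]
print(reply)
```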