Gemma-3-27B-Glitter

Creative Model

Performance Metrics

| Metric | Value |
| --- | --- |
| Avg. Total Time | 49.90 s |
| Avg. TTFT | 20.28 s |
| Avg. Prefill TPS | 754.64 |
| Avg. Gen TPS | 50.95 |

Model Information

| Field | Value |
| --- | --- |
| Context Size | 32768 |
| Quantization | r64 |
| Engine | aphrodite |
| Creation Method | Unknown |
| Model Type | Gemma27B |
| Chat Template | Gemma 2 |
| Reasoning | No |
| Vision | Yes |
| Parameters | 27B |
| Added At | 6/24/2025 |


---
base_model:
  - google/gemma-3-27b-it
  - google/gemma-3-27b-pt
  - Columbidae/gemma-3-27b-half
library_name: transformers
tags:
  - mergekit
  - merge
---

✨G3 Glitter 27B✨

A creative writing model based on Gemma 3 27B.

Columbidae/gemma-3-27b-half, a 50/50 merge of 27B IT and 27B PT, was used as the base model. (This was done because of the success of Starshine, a 50/50 IT and PT merge.)
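The card does not include the recipe used for that base merge. A 50/50 merge of the IT and PT checkpoints could be expressed as a linear merge in mergekit roughly like this (the weights and dtype here are assumptions, not the card's actual config):

```yaml
# Hypothetical mergekit config for a 50/50 IT + PT base merge
models:
  - model: google/gemma-3-27b-it
    parameters:
      weight: 0.5
  - model: google/gemma-3-27b-pt
    parameters:
      weight: 0.5
merge_method: linear
dtype: bfloat16
```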

The inclusion of the PT model does weaken instruction-following, but it also weakens the censorship and hesitancy to participate in certain fictional stories. The prose also becomes more natural with less of the IT model included.

This model does better with short, to-the-point prompts; long, detailed system prompts will often confuse it. (Tested: 1,000-2,000-token system prompts gave lackluster results compared to 100-500-token prompts.)

Instruct Format

Uses Gemma 2/3 instruct formatting and context. Like Glitter 12B, this works well with temp = 1, top-nsigma = 1.5.
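Top-nsigma sampling keeps only the tokens whose logits fall within n standard deviations of the maximum logit, discarding the low-probability tail before sampling. A minimal sketch of the filter (pure Python; the function name and example logits are illustrative, not from the card):

```python
import math

def top_nsigma_filter(logits, n=1.5):
    """Mask logits more than n standard deviations below the max.

    Only tokens with logit >= max(logits) - n * stddev(logits) survive;
    everything else is set to -inf so it can never be sampled.
    """
    mean = sum(logits) / len(logits)
    sigma = math.sqrt(sum((x - mean) ** 2 for x in logits) / len(logits))
    threshold = max(logits) - n * sigma
    return [x if x >= threshold else float("-inf") for x in logits]

# With n = 1.5 (the setting suggested above), clearly-low logits are culled.
filtered = top_nsigma_filter([10.0, 9.5, 0.0, -5.0], n=1.5)
```

At temp = 1 this filter does the heavy lifting: the distribution is left unsharpened, and quality is maintained by cutting the tail instead.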

```
<start_of_turn>user
{User messages; can also put sysprompt here to use the built-in g3 training}<end_of_turn>
<start_of_turn>model
{model response}<end_of_turn>
```
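The turn structure above can be rendered programmatically. A small helper, assuming OpenAI-style message dicts (the function name is hypothetical, not part of any library):

```python
def format_gemma_chat(messages):
    """Render messages in the Gemma 2/3 turn format shown above.

    `messages` is a list of {"role": ..., "content": ...} dicts; Gemma
    uses "model" rather than "assistant" for the responding role, and
    any system prompt is simply folded into a user turn.
    """
    parts = []
    for msg in messages:
        role = "model" if msg["role"] == "assistant" else msg["role"]
        parts.append(f"<start_of_turn>{role}\n{msg['content']}<end_of_turn>\n")
    parts.append("<start_of_turn>model\n")  # cue the model to respond
    return "".join(parts)

prompt = format_gemma_chat([{"role": "user", "content": "Hi"}])
```

In practice, `tokenizer.apply_chat_template` with the repo's bundled Gemma 2 template should produce the same layout; the helper just makes the wire format explicit.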