Recommended Settings

#73
by Lef00 - opened

What are the recommended settings?
Temperature, top_p etc?

Google org

Hi @Lef00 It usually depends on what you're building and your specific use case. The standard recommendation is a temperature of 1.0 to allow moderate randomness and creativity, a top_p of 0.95 so the model considers a broad but not overly sparse set of probable tokens, and a top_k of 64 to limit sampling to the 64 most likely tokens at each step. These parameters are designed to produce high-quality, diverse, and contextually relevant responses; for more deterministic or fact-intensive tasks, you might lower the temperature towards 0.2 and top_p towards 0.8 to increase predictability.
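To make the interplay of the three parameters concrete, here is a minimal, dependency-free sketch of how temperature, top_k, and top_p combine into a single sampling step. This is an illustration of the general technique, not Gemma's actual decoding code; the function name and list-based logits are my own.

```python
import math
import random

def sample_next_token(logits, temperature=1.0, top_k=64, top_p=0.95, rng=None):
    """Illustrative temperature + top-k + top-p (nucleus) sampling.

    `logits` is a plain list of floats, one score per vocabulary token.
    """
    rng = rng or random.Random()
    # Temperature: scale logits before softmax (lower => sharper, more deterministic).
    scaled = [l / temperature for l in logits]
    # Softmax (with max-subtraction for numerical stability).
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Top-k: keep only the k most probable token indices.
    ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:top_k]
    # Top-p: among those, keep the smallest prefix whose cumulative mass reaches top_p.
    kept, mass = [], 0.0
    for i in ranked:
        kept.append(i)
        mass += probs[i]
        if mass >= top_p:
            break
    # Sample from the surviving tokens, weighted by their probabilities.
    return rng.choices(kept, weights=[probs[i] for i in kept], k=1)[0]
```

Note the order: temperature reshapes the distribution first, then top_k and top_p prune it, which is why lowering temperature alone already makes the top_p cutoff bite sooner.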
Thank you

I used Gemma 2 2B it and Gemma 7B it for a question decomposition task, and I was wondering what the recommended settings are for each model (e.g., the default ones) to ensure a fair comparison. Are temperature = 1.0, top_p = 0.95, and top_k = 64 considered in general the optimal or standard settings for these models?

Thank you in advance!

Google org

I believe you can achieve consistent and well-structured question decomposition with a temperature between 0.3 and 0.5. The lower temperature range promotes the logical reasoning the task needs over creativity. Simultaneously, a top_p of 0.9 and a top_k around 40 introduce a controlled level of vocabulary diversity, helping the model handle various question formats while keeping the decomposition patterns relevant. These settings are a solid foundation for experimentation and can be fine-tuned based on your specific output requirements. Also make sure both Gemma 2B and Gemma 7B use exactly the same settings for a fair comparison.
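One simple way to guarantee both models decode identically is to define the sampling settings once and hand the same copy to each run. A hypothetical sketch (the dict name and helper are my own, not part of any Gemma API):

```python
# Shared decoding settings so a 2B-vs-7B comparison measures model quality,
# not differences in sampling configuration. Values follow the suggestion
# above: temperature in the 0.3-0.5 range, top_p 0.9, top_k 40.
DECODE_SETTINGS = {
    "do_sample": True,
    "temperature": 0.4,  # midpoint of the suggested 0.3-0.5 range
    "top_p": 0.9,
    "top_k": 40,
}

def decode_kwargs():
    # Return a fresh copy so one model's run cannot mutate the settings
    # seen by the other (keeping the comparison fair).
    return dict(DECODE_SETTINGS)
```

These keys match the keyword arguments accepted by Hugging Face `generate()`, so the same dict can be splatted into both models' generation calls, e.g. `model.generate(**inputs, **decode_kwargs())`.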
