  1. LLM Temperature: How It Works and When You Should Use It

    What is LLM Temperature? The temperature parameter controls the randomness of the generated text. Adjusting the temperature changes how the model selects the next word in a sequence, influencing the creativity and predictability of the output.

  2. LLM Parameters Explained: A Practical Guide with Examples for …

    Jan 16, 2025 · Master LLM parameters like temperature, top-p, penalties, and stop sequences to optimize text generation for creative, technical, or structured tasks.

  3. Ollama Python Library – Tutorial with Examples - Developers Blog

    Jan 29, 2024 · In the following example, we call the Mistral LLM model for the first time using the Ollama Python library. To do this, we need to structure our question as a message object. A message object contains the message text in a content field and an additional role field. The role field can be system, user, or assistant.
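    A minimal sketch of the message structure this snippet describes (my own illustration, not the tutorial's code; the actual chat call is shown only in a comment so the sketch runs without Ollama installed):

    ```python
    # Ollama-style message objects: a list of dicts, each with a "role"
    # (system, user, or assistant) and a "content" field holding the text.
    messages = [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "What does the temperature parameter do?"},
    ]

    # With the ollama package installed and the mistral model pulled locally,
    # the messages would be passed roughly like this:
    #   import ollama
    #   response = ollama.chat(model="mistral", messages=messages)
    #   print(response["message"]["content"])

    # Every message must use one of the three roles named in the snippet.
    assert all(m["role"] in {"system", "user", "assistant"} for m in messages)
    print(len(messages))
    ```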

  4. Setting Top-K, Top-P and Temperature in LLMs - Medium

    Apr 3, 2024 · We can control an LLM's output using Top-K, Top-P, and Temperature. In this article, we’ll dive into what these settings do and how these parameters work together. Let’s take an example of making...
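    As a rough sketch of the two filters this snippet names (my own toy illustration over a hypothetical next-token distribution, not code from the article):

    ```python
    # Top-k keeps only the k most probable tokens; top-p (nucleus sampling)
    # keeps the smallest set of tokens whose cumulative probability reaches p.
    # Both renormalize the surviving probabilities before sampling.

    def top_k_filter(probs, k):
        """Keep the k most probable tokens, then renormalize."""
        kept = dict(sorted(probs.items(), key=lambda kv: kv[1], reverse=True)[:k])
        total = sum(kept.values())
        return {tok: p / total for tok, p in kept.items()}

    def top_p_filter(probs, p):
        """Keep the smallest high-probability set covering mass >= p, renormalize."""
        kept, cum = {}, 0.0
        for tok, prob in sorted(probs.items(), key=lambda kv: kv[1], reverse=True):
            kept[tok] = prob
            cum += prob
            if cum >= p:
                break
        total = sum(kept.values())
        return {tok: pr / total for tok, pr in kept.items()}

    probs = {"cat": 0.5, "dog": 0.3, "fish": 0.15, "axolotl": 0.05}
    print(top_k_filter(probs, 2))    # only "cat" and "dog" survive
    print(top_p_filter(probs, 0.8))  # smallest nucleus covering >= 80% of mass
    ```

    In real decoders these filters run over logits for the whole vocabulary, usually combined with temperature scaling before sampling.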

  5. Understanding Temperature in Language Models (LLMs)

    Jan 15, 2025 · Below is a simple Python example that uses the Groq API to access the Llama 3.3 70B model.

  6. Nigel Gebodh - Why Does My LLM Have A Temperature?

    What exactly is an LLM’s Temperature? LLM Temperature is a parameter that controls the probability distribution of the predicted next words of an LLM. It adds some randomness or variety to the LLM’s outputs by changing the likelihood of the next words being selected.

  7. Maximize Your LlamaIndex Performance with Temperature Settings

    Aug 26, 2024 · Here’s how you can set the LLM's temperature:

    ```python
    from llama_index.llms.openai import OpenAI
    from llama_index.core import Settings

    Settings.llm = OpenAI(model="gpt-3.5-turbo", temperature=0.3)
    ```
    …

  8. How to Control the Creativity of LLMs Using the Temperature

    Nov 16, 2024 · Temperature scaling fundamentally alters the probability distribution used for selecting the next token in the sequence. Here's how it works, demonstrated with Python code: In this example, the temperature T affects the …
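    A minimal sketch of that scaling (my own illustration, not the article's code): dividing the logits by T before the softmax flattens the distribution when T > 1 and sharpens it when T < 1, since p_i ∝ exp(logit_i / T).

    ```python
    import math

    def softmax_with_temperature(logits, T):
        """Apply temperature T to logits, then softmax: p_i ∝ exp(logit_i / T)."""
        scaled = [x / T for x in logits]
        m = max(scaled)                        # subtract the max for numerical stability
        exps = [math.exp(x - m) for x in scaled]
        total = sum(exps)
        return [e / total for e in exps]

    logits = [2.0, 1.0, 0.1]
    cold = softmax_with_temperature(logits, 0.5)  # sharper: top token dominates
    hot = softmax_with_temperature(logits, 2.0)   # flatter: sampling is more random
    print(cold[0] > hot[0])  # low T concentrates probability on the top token
    ```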

  9. LLM Chain OpenAI Python Example - Analytics Yogi

    Aug 29, 2023 · The temperature argument is used to specify how creative the LLM can get while producing the output. The value of temperature varies from 0 to 1. A value near 1 signals the LLM to be as creative as possible.

  10. How to set temperature when using the python module? #255 - GitHub

    I saw in the docs that you can set the temperature option through the CLI. Is there a way to do this in Python?
