A chat response may get cut off for several reasons:

  • Internet connectivity
    • Check that your internet connection is stable, then retry the request.
  • Token limit reached
    • Simplify the prompt, or break it into smaller chunks (a rough chunking sketch follows this list).
    • Try asking for a “summary” first to capture the key points before expanding further. Working on smaller chunks at a time makes it less likely that you will hit the maximum token limit.
    • If the steps above do not work, contact your administrator. They may be able to adjust the maximum token limit or other relevant settings, and can advise you on which LLM is best suited to your query while avoiding token limit issues.
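
If you are comfortable with a bit of scripting, the sketch below illustrates the “smaller chunks” idea: it splits a long prompt on paragraph breaks so each piece stays under a rough size budget. The word-count heuristic, the 1,500-word budget, and the long_prompt.txt input file are illustrative assumptions, not settings from this guide; actual token counts depend on the model's tokenizer.

```python
# Rough illustration: split a long prompt into smaller chunks so each request
# stays well under a model's token limit. Word counts are only a rough proxy
# for tokens (the real count depends on the model's tokenizer), and the
# 1,500-word budget is an arbitrary example, not a value from this guide.

def split_prompt(text: str, max_words: int = 1500) -> list[str]:
    """Split text on paragraph breaks, packing paragraphs into chunks
    that each stay under max_words."""
    chunks: list[str] = []
    current: list[str] = []
    count = 0
    for paragraph in text.split("\n\n"):
        words = len(paragraph.split())
        # Start a new chunk if adding this paragraph would exceed the budget.
        if current and count + words > max_words:
            chunks.append("\n\n".join(current))
            current, count = [], 0
        current.append(paragraph)
        count += words
    if current:
        chunks.append("\n\n".join(current))
    return chunks


if __name__ == "__main__":
    long_prompt = open("long_prompt.txt").read()  # hypothetical input file
    for i, chunk in enumerate(split_prompt(long_prompt), start=1):
        print(f"--- Chunk {i} ({len(chunk.split())} words) ---")
        # Send each chunk as its own message, e.g. "Summarize this section: " + chunk
```

Each chunk can then be sent as its own message, for example prefixed with a request such as "Summarize this section:", which keeps every individual request well below the limit.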