Groq Component

Drag & Drop LLM Component

Overview

A drag-and-drop component for integrating Groq's high-performance LLM inference into your workflow. Configure model parameters and connect inputs/outputs to other components.

Component Configuration

Basic Parameters

  • Input: Text input for the model
  • System Message: System prompt to guide model behavior
  • Stream: Toggle for streaming responses
  • Model: e.g., llama-3.1-8b-instant
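
The basic parameters above map onto a chat-style request: the System Message becomes a "system" role entry and the Input becomes a "user" role entry. A minimal sketch (the `build_messages` helper is illustrative, not part of the component):

```python
def build_messages(system_message: str, user_input: str) -> list[dict]:
    """Assemble the message list sent to the model.

    An empty system message is simply omitted rather than sent as a
    blank "system" entry.
    """
    messages = []
    if system_message:
        messages.append({"role": "system", "content": system_message})
    messages.append({"role": "user", "content": user_input})
    return messages

msgs = build_messages("You are a concise assistant.", "Summarize Groq in one line.")
```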

API Configuration

  • Groq API Key: Your API authentication key
  • Groq API Base: https://api.groq.com
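
To keep the key out of the workflow definition itself, read it from an environment variable at runtime. A sketch (the variable name `GROQ_API_KEY` follows common convention; confirm the name your deployment expects):

```python
import os

def groq_config(base_url: str = "https://api.groq.com") -> dict:
    """Build the API configuration, pulling the key from the environment.

    Fails loudly rather than silently sending an empty key.
    """
    api_key = os.getenv("GROQ_API_KEY")
    if not api_key:
        raise RuntimeError("GROQ_API_KEY is not set")
    return {"api_key": api_key, "base_url": base_url}

os.environ["GROQ_API_KEY"] = "gsk-example"  # for illustration only
cfg = groq_config()
```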

Model Parameters

  • Max Output Tokens: Maximum number of tokens to generate
  • Temperature: Controls randomness/creativity (0.1 default)
  • N: Number of completions to generate
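
These three parameters travel together in the request body. A hedged sketch of packing and sanity-checking them (field names follow the OpenAI-compatible convention; verify against the current Groq API reference):

```python
def model_params(max_tokens: int = 1024, temperature: float = 0.1, n: int = 1) -> dict:
    """Validate and pack the model parameters for a completion request."""
    if not 0.0 <= temperature <= 2.0:
        raise ValueError("temperature should be between 0 and 2")
    if n < 1:
        raise ValueError("n must be at least 1")
    return {"max_tokens": max_tokens, "temperature": temperature, "n": n}
```

Keeping the defaults (low temperature, a single completion) matches the usage tips below: predictable output first, experimentation later.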

Output Connections

  • Text: Generated text output
  • Language Model: Model information and metadata

Usage Tips

  • Use environment variables for API key
  • Enable streaming for lower perceived latency
  • Start with default temperature (0.1)
  • Test with single completions first
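
Streaming delivers the completion as incremental text deltas that the consumer concatenates as they arrive, which is why it feels faster even though total generation time is unchanged. A minimal sketch with a fake chunk source standing in for the API stream (`fake_stream` is illustrative only):

```python
from typing import Iterable, Iterator

def collect_stream(chunks: Iterable[str]) -> Iterator[str]:
    """Yield the running text after each delta arrives, mimicking how
    a UI renders a streamed response incrementally."""
    text = ""
    for delta in chunks:
        text += delta
        yield text

fake_stream = ["Groq ", "is ", "fast."]  # stand-in for API deltas
final = list(collect_stream(fake_stream))[-1]
```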

Best Practices

  • Secure API key handling
  • Monitor token usage
  • Implement error handling
  • Set appropriate token limits
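
"Implement error handling" in practice usually means retrying transient failures (rate limits, timeouts) with exponential backoff. A sketch under those assumptions (the `call` argument stands in for the actual API request; the delay is kept tiny for illustration):

```python
import time

def with_retries(call, attempts: int = 3, base_delay: float = 0.01):
    """Retry a flaky call with exponential backoff; re-raise on final failure."""
    for attempt in range(attempts):
        try:
            return call()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Illustration: a call that fails twice, then succeeds.
failures = {"left": 2}
def flaky():
    if failures["left"] > 0:
        failures["left"] -= 1
        raise ConnectionError("transient")
    return "ok"
```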