LLM
LLM transform plugin
Leverage the power of a large language model (LLM) to process data by sending it to the LLM and receiving the generated results. Utilize the LLM's capabilities to label, clean, enrich data, perform data inference, and more.
| name             | type   | required | default value |
|------------------|--------|----------|---------------|
| model_provider   | enum   | yes      | -             |
| output_data_type | enum   | no       | String        |
| prompt           | string | yes      | -             |
| model            | string | yes      | -             |
| api_key          | string | yes      | -             |
| openai.api_path  | string | no       | -             |
The model provider to use. The available options are: OPENAI.
The data type of the output data. The available options are: STRING, INT, BIGINT, DOUBLE, BOOLEAN. The default value is STRING.
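As an illustration, `output_data_type` could be set alongside the other options in a transform block. This is a minimal sketch: the API key is a placeholder, and the prompt wording is an assumption, not part of this reference.

```hocon
transform {
  LLM {
    model_provider = OPENAI
    model = "gpt-4o-mini"
    api_key = "sk-your-key"
    # Parse the LLM's answer as an integer instead of the default String
    output_data_type = INT
    prompt = "Return only the person's age as a number"
  }
}
```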
The prompt to send to the LLM. This parameter defines how the LLM will process and return data. For example:

The data read from the source is a table like this:

| name          | age |
|---------------|-----|
| Jia Fan       | 20  |
| Hailin Wang   | 20  |
| Eric          | 20  |
| Guangdong Liu | 20  |

The prompt can be something like: `Determine whether someone is Chinese or American by their name`

The result will be:

| name          | age | llm_output |
|---------------|-----|------------|
| Jia Fan       | 20  | Chinese    |
| Hailin Wang   | 20  | Chinese    |
| Eric          | 20  | American   |
| Guangdong Liu | 20  | Chinese    |
The API path to use for the OpenAI model provider. In most cases you do not need to change this configuration. If you are using a proxy or agent service for the API, you may need to set it to that service's API address.
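When requests go through a proxy, the path could be overridden as in the following sketch (the proxy host and port are hypothetical, and the API key is a placeholder):

```hocon
transform {
  LLM {
    model_provider = OPENAI
    model = "gpt-4o-mini"
    api_key = "sk-your-key"
    prompt = "Determine whether someone is Chinese or American by their name"
    # Point the plugin at the proxy instead of the default OpenAI endpoint
    openai.api_path = "http://your-proxy-host:8080/v1/chat/completions"
  }
}
```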
Determine the user's country through an LLM.
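An end-to-end job for this task might look like the following sketch. The `FakeSource` and `Console` plugins, the schema fields, and the prompt wording are assumptions chosen to mirror the name/age example above; the API key is a placeholder.

```hocon
env {
  job.mode = "BATCH"
}

source {
  FakeSource {
    row.num = 4
    schema = {
      fields {
        name = "string"
        age = "int"
      }
    }
  }
}

transform {
  LLM {
    model_provider = OPENAI
    model = "gpt-4o-mini"
    api_key = "sk-your-key"
    prompt = "Determine whether someone is Chinese or American by their name"
  }
}

sink {
  Console {
  }
}
```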
The model to use. Different model providers offer different models; for example, the OpenAI provider can use gpt-4o-mini. If you use an OpenAI model, please refer to the documentation of the /v1/chat/completions endpoint.
The API key to use for the model provider. If you use an OpenAI model, please refer to the OpenAI documentation for how to get an API key.
For transform plugin common parameters, please refer to the transform plugin common options for details.