The llm.generateText() method in SuiteScript 2.1 lets server scripts interact with large language models (LLMs) by generating text responses from a provided prompt. It is part of the N/llm module, introduced in NetSuite 2024.1.
Key Features
- Response Type: Returns an llm.Response object containing the generated text.
- Governance: Consumes 100 governance units per execution.
- Supported Script Types: Available in server scripts only.
Parameters
- options.prompt (string, required): Defines the prompt for the LLM.
- options.chatHistory (optional): An array of previous chat messages (each with a role and text) that gives the model conversation context.
- options.modelFamily (optional): Specifies the LLM to be used (defaults to Cohere Command R).
- options.modelParameters (optional): Customizes the model's behavior:
  - maxTokens: Limits the length of the generated output.
  - temperature: Controls randomness; lower values produce more deterministic output, higher values more creative output.
  - topK/topP: Restrict the pool of candidate tokens the model samples from.
  - frequencyPenalty and presencePenalty: Discourage repetition in outputs.
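To make the sampling parameters concrete, here is a plain-JavaScript sketch (illustration only, not part of N/llm) of how topK and topP narrow the set of candidate tokens a model may sample from. The token probabilities are hypothetical.

```javascript
// Toy illustration: filter a candidate-token distribution the way topK and
// topP do. Probabilities here are made up for demonstration purposes.
function filterCandidates(probs, topK, topP) {
    // Sort candidates by probability, descending
    const sorted = Object.entries(probs).sort((a, b) => b[1] - a[1]);
    // topK: keep only the K most likely tokens
    const kFiltered = sorted.slice(0, topK);
    // topP: keep the smallest prefix whose cumulative probability reaches topP
    const kept = [];
    let cumulative = 0;
    for (const [token, p] of kFiltered) {
        if (cumulative >= topP) break;
        kept.push(token);
        cumulative += p;
    }
    return kept;
}

const demo = { the: 0.5, a: 0.3, an: 0.15, this: 0.05 };
console.log(filterCandidates(demo, 3, 0.7)); // [ 'the', 'a' ]
```

With topK = 3 the token "this" is dropped outright; topP = 0.7 then stops after "the" and "a", whose cumulative probability (0.8) crosses the threshold. Lowering either value makes the output more focused; raising them allows more variety.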
- options.ociConfig (optional): Configuration for accessing the OCI Generative AI service through your own Oracle Cloud Infrastructure account, including fields such as userId, tenancyId, and security credentials.
- options.preamble (optional): Provides initial context for responses (supported only by Cohere Command R).
- options.timeout (optional): Sets a custom timeout (default is 30,000 ms).
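As a plain-JavaScript sketch of the chatHistory shape (the string role values here stand in for the llm.ChatRole enum and are an assumption of this example), a script can maintain a running history array and pass it as options.chatHistory on the next call:

```javascript
// Hypothetical stand-ins for llm.ChatRole.USER / llm.ChatRole.CHATBOT
const ChatRole = { USER: 'USER', CHATBOT: 'CHATBOT' };

// Append one user turn and the model's reply to a running history array,
// so a later generateText() call can pass it as options.chatHistory.
function recordTurn(history, userText, botText) {
    history.push({ role: ChatRole.USER, text: userText });
    history.push({ role: ChatRole.CHATBOT, text: botText });
    return history;
}

const history = recordTurn([], 'What is SuiteScript?', 'A NetSuite scripting platform.');
console.log(history.length); // 2
```

Keeping the history short (for example, the last few turns) helps stay within the model's maxTokens budget.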
Sample Usage
```javascript
require(['N/llm'], (llm) => {
    const response = llm.generateText({
        prompt: 'Hello World!',
        modelFamily: llm.ModelFamily.COHERE_COMMAND_R,
        modelParameters: {
            maxTokens: 1000,       // cap the length of the generated text
            temperature: 0.2,      // low value: mostly deterministic output
            topK: 3,
            topP: 0.7,
            frequencyPenalty: 0.4,
            presencePenalty: 0
        },
        // Needed only when using your own OCI Generative AI account;
        // fingerprint and privateKey reference API secrets, not raw values
        ociConfig: {
            userId: 'ocid1.user.oc1..exampleuserid',
            tenancyId: 'ocid1.tenancy.oc1..exampletenancyid',
            compartmentId: 'ocid1.compartment.oc1..examplecompartmentid',
            fingerprint: 'custsecret_oci_fingerprint',
            privateKey: 'custsecret_oci_private_key'
        }
    });
    log.debug('LLM response', response.text); // the generated text
});
```
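Because each call costs 100 governance units, a defensive pattern is to check the script's remaining usage before calling generateText(). The sketch below is plain JavaScript with stubs standing in for the N/llm and N/runtime modules so it can run outside NetSuite; in a real script, N/runtime's Script.getRemainingUsage() supplies the usage figure.

```javascript
// Plain-JS sketch: only call generateText() when enough governance remains.
// llmStub and runtimeStub are stand-ins for the real N/llm and N/runtime.
const GOVERNANCE_COST = 100; // units consumed per generateText() execution

const runtimeStub = { getCurrentScript: () => ({ getRemainingUsage: () => 250 }) };
const llmStub = { generateText: (options) => ({ text: 'Echo: ' + options.prompt }) };

function safeGenerate(llm, runtime, options) {
    if (runtime.getCurrentScript().getRemainingUsage() < GOVERNANCE_COST) {
        return null; // not enough governance left; skip the call
    }
    return llm.generateText(options);
}

const response = safeGenerate(llmStub, runtimeStub, { prompt: 'Hello World!' });
console.log(response.text); // "Echo: Hello World!"
```

Returning null (or queueing the work for a later execution) when the budget is low avoids a governance-limit error mid-script.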
Conclusion
llm.generateText() offers flexible text generation capabilities in SuiteScript, with extensive options for fine-tuning model behavior and integrating with Oracle Cloud Infrastructure when needed.