Class OpenAIModerationChain

Class representing a chain for moderating text using the OpenAI Moderation API. It extends the BaseChain class and implements the OpenAIModerationChainInput interface.
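
Example: a minimal usage sketch (not part of the generated reference), assuming the "langchain/chains" entrypoint and an OPENAI_API_KEY environment variable:

```typescript
import { OpenAIModerationChain } from "langchain/chains";

// Minimal sketch: run a piece of user-provided text through the
// OpenAI Moderation endpoint before using it elsewhere.
async function moderate(text: string): Promise<string> {
  const moderation = new OpenAIModerationChain();

  // The chain reads the text from its inputKey ("input" by default)
  // and returns the result under its outputKey ("output" by default).
  const { output } = await moderation.call({ input: text });
  return output;
}
```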

Hierarchy

  • BaseChain
      ↳ OpenAIModerationChain

Implements

  • OpenAIModerationChainInput

Constructors

Properties

caller: AsyncCaller
client: OpenAI
clientConfig: ClientOptions
inputKey: string = "input"
outputKey: string = "output"
throwError: boolean
verbose: boolean

Whether to print out response text.

callbacks?: Callbacks
memory?: BaseMemory
metadata?: Record<string, unknown>
openAIApiKey?: string
openAIOrganization?: string
tags?: string[]
lc_runnable: boolean = true
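
Example: a hedged configuration sketch showing how several of the properties above can be set (exact constructor fields may vary between versions; the values are placeholders):

```typescript
import { OpenAIModerationChain } from "langchain/chains";

// Sketch only: the field names mirror the properties listed above.
const moderation = new OpenAIModerationChain({
  openAIApiKey: process.env.OPENAI_API_KEY, // otherwise read from the environment
  throwError: true, // throw instead of returning a policy-violation message
  verbose: true,    // print out response text
});

// inputKey and outputKey default to "input" and "output"; as public
// properties they can be reassigned after construction.
moderation.inputKey = "text";
moderation.outputKey = "moderated";
```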

Accessors

Methods

  • streamLog

    Stream all output from a runnable, as reported to the callback system. This includes all inner runs of LLMs, Retrievers, Tools, etc. Output is streamed as Log objects, which include a list of jsonpatch ops that describe how the state of the run has changed in each step, and the final state of the run. The jsonpatch ops can be applied in order to construct state.

    Parameters

    Returns AsyncGenerator<RunLogPatch, any, unknown>
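
Example: a hedged sketch of consuming streamLog from a chain instance (assumes `moderation` is an OpenAIModerationChain):

```typescript
// Each RunLogPatch carries jsonpatch ops describing how the run state
// changed in that step; applying them in order reconstructs the run state.
for await (const patch of moderation.streamLog({ input: "text to check" })) {
  console.log(patch.ops);
}
```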

  • _transformStreamWithConfig

    Helper method to transform an Iterator of Input values into an Iterator of Output values, with callbacks. Use this to implement stream() or transform() in Runnable subclasses.

    Type Parameters

    • I
    • O

    Parameters

    • inputGenerator: AsyncGenerator<I, any, unknown>
    • transformer: ((generator, runManager?, options?) => AsyncGenerator<O, any, unknown>)

    • Optional options: BaseCallbackConfig & {
          runType?: string;
      }

    Returns AsyncGenerator<O, any, unknown>
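
Example: a hedged sketch of how a Runnable subclass might use this helper to implement transform(). The class name, import path, and transformation logic are assumptions for illustration, not part of this API:

```typescript
import { Runnable, type RunnableConfig } from "@langchain/core/runnables";

// Illustrative subclass: streams uppercase versions of incoming string chunks.
class UppercaseRunnable extends Runnable<string, string> {
  lc_namespace = ["example", "uppercase"];

  async invoke(input: string, _options?: Partial<RunnableConfig>): Promise<string> {
    return input.toUpperCase();
  }

  // The transformer generator receives the input iterator (plus an optional
  // run manager and options) and yields output chunks.
  private async *uppercaseGenerator(
    generator: AsyncGenerator<string>
  ): AsyncGenerator<string> {
    for await (const chunk of generator) {
      yield chunk.toUpperCase();
    }
  }

  // transform() delegates to _transformStreamWithConfig so that callbacks
  // and run tracking are wired up around the streaming transformation.
  transform(
    generator: AsyncGenerator<string>,
    options?: Partial<RunnableConfig>
  ): AsyncGenerator<string> {
    return this._transformStreamWithConfig(
      generator,
      this.uppercaseGenerator.bind(this),
      options
    );
  }
}
```

Calling stream() or transform() on such a runnable then yields transformed chunks while still emitting callback events for the run.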
