public static final class GetModelOutputContentDetectResultResponseBody.TraceInfo.Builder extends Object
| Modifier and Type | Method and Description |
|---|---|
| `GetModelOutputContentDetectResultResponseBody.TraceInfo.Builder` | `blockWord(GetModelOutputContentDetectResultResponseBody.BlockWord blockWord)` Sets the detected keywords result. |
| `GetModelOutputContentDetectResultResponseBody.TraceInfo` | `build()` Builds and returns the `TraceInfo` instance. |
| `GetModelOutputContentDetectResultResponseBody.TraceInfo.Builder` | `denyTopics(GetModelOutputContentDetectResultResponseBody.DenyTopics denyTopics)` Sets the sensitive topic object list. |
| `GetModelOutputContentDetectResultResponseBody.TraceInfo.Builder` | `harmfulCategories(GetModelOutputContentDetectResultResponseBody.HarmfulCategories harmfulCategories)` Sets the list of harmful category result objects. |
| `GetModelOutputContentDetectResultResponseBody.TraceInfo.Builder` | `promptAttack(GetModelOutputContentDetectResultResponseBody.PromptAttack promptAttack)` Sets the prompt attack detection result. |
public GetModelOutputContentDetectResultResponseBody.TraceInfo.Builder blockWord(GetModelOutputContentDetectResultResponseBody.BlockWord blockWord)

Sets the detected keywords result.

public GetModelOutputContentDetectResultResponseBody.TraceInfo.Builder denyTopics(GetModelOutputContentDetectResultResponseBody.DenyTopics denyTopics)

Sets the sensitive topic object list.

public GetModelOutputContentDetectResultResponseBody.TraceInfo.Builder harmfulCategories(GetModelOutputContentDetectResultResponseBody.HarmfulCategories harmfulCategories)

Sets the list of harmful category result objects.

public GetModelOutputContentDetectResultResponseBody.TraceInfo.Builder promptAttack(GetModelOutputContentDetectResultResponseBody.PromptAttack promptAttack)

Sets the prompt attack detection result.

public GetModelOutputContentDetectResultResponseBody.TraceInfo build()

Builds and returns the TraceInfo instance.
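The class above is a standard fluent builder: each setter returns the `Builder` itself, and `build()` produces the immutable `TraceInfo`. A minimal usage sketch, assuming a static `builder()` factory method on `TraceInfo` (common in Alibaba Cloud V2 generated models, but not shown on this page) and pre-built `blockWord`, `denyTopics`, `harmfulCategories`, and `promptAttack` values of the listed nested types:

```java
// Sketch only: verify the builder() entry point and nested types against the SDK.
GetModelOutputContentDetectResultResponseBody.TraceInfo traceInfo =
    GetModelOutputContentDetectResultResponseBody.TraceInfo.builder()
        .blockWord(blockWord)                 // detected keywords result
        .denyTopics(denyTopics)               // sensitive topic object list
        .harmfulCategories(harmfulCategories) // harmful category results
        .promptAttack(promptAttack)           // prompt attack detection result
        .build();
```

All setters are optional; chain only the trace fields your detection result actually contains before calling `build()`.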
Copyright © 2025. All rights reserved.