Index
A
- applyDiff(List<String>, String) - Static method in class io.codemodder.plugins.llm.LLMDiffs
-
Applies a diff in unified format to target.
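For illustration, a minimal sketch of applying an LLM-produced unified diff to a file; it assumes applyDiff returns the patched lines as a List<String>, which this index does not show, and declares throws Exception defensively since any checked exceptions are likewise not shown:

    import io.codemodder.plugins.llm.LLMDiffs;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.List;

    class ApplyDiffSketch {
      // Applies an LLM-produced unified diff to the target file's lines.
      // Assumption: applyDiff returns the patched lines as a List<String>.
      static void patch(Path target, String unifiedDiff) throws Exception {
        List<String> original = Files.readAllLines(target);
        List<String> patched = LLMDiffs.applyDiff(original, unifiedDiff);
        Files.write(target, patched);
      }
    }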
B
- BinaryThreatRisk - Enum Class in io.codemodder.plugins.llm
-
The possible outcomes of a binary risk analysis -- "high" or "low".
C
- CodeChangingLLMRemediationOutcome - Record Class in io.codemodder.plugins.llm
-
Models the parameters for a remediation analysis and the actual direction for changing the code.
- CodeChangingLLMRemediationOutcome(String, String, String) - Constructor for record class io.codemodder.plugins.llm.CodeChangingLLMRemediationOutcome
-
Creates an instance of a CodeChangingLLMRemediationOutcome record class.
- configure() - Method in class io.codemodder.plugins.llm.LLMServiceModule
- contextWindow() - Method in interface io.codemodder.plugins.llm.Model
- contextWindow() - Method in enum class io.codemodder.plugins.llm.StandardModel
- countTokens(List<String>, int, EncodingType) - Static method in class io.codemodder.plugins.llm.Tokens
-
Estimates the number of tokens the messages will consume.
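A hedged sketch of estimating prompt size with Tokens.countTokens; it assumes the int argument is a per-message padding value, that the method returns an int count, and that EncodingType is the jtokkit enum:

    import com.knuddels.jtokkit.api.EncodingType;
    import io.codemodder.plugins.llm.Tokens;
    import java.util.List;

    class TokenEstimateSketch {
      // Rough token estimate for a prompt before sending it to a model.
      // Assumptions: the int argument is per-message padding, the return value is an
      // int count, and EncodingType comes from the jtokkit library.
      static int estimate(List<String> messages) {
        return Tokens.countTokens(messages, 3, EncodingType.CL100K_BASE);
      }
    }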
- createCodemodChange(Result, int, String) - Method in class io.codemodder.plugins.llm.SarifToLLMForMultiOutcomeCodemod
-
Creates a CodemodChange from the given code change data.
D
- description() - Method in record class io.codemodder.plugins.llm.CodeChangingLLMRemediationOutcome
-
Returns the value of the description record component.
- description() - Method in interface io.codemodder.plugins.llm.LLMRemediationOutcome
-
A description of the code that the LLM will attempt to match.
- description() - Method in record class io.codemodder.plugins.llm.NoActionLLMRemediationOutcome
-
Returns the value of the description record component.
E
- equals(Object) - Method in record class io.codemodder.plugins.llm.CodeChangingLLMRemediationOutcome
-
Indicates whether some other object is "equal to" this one.
- equals(Object) - Method in record class io.codemodder.plugins.llm.NoActionLLMRemediationOutcome
-
Indicates whether some other object is "equal to" this one.
F
- FileDescription - Interface in io.codemodder.plugins.llm
-
Provides a set of convenience methods for working with files.
- fix() - Method in record class io.codemodder.plugins.llm.CodeChangingLLMRemediationOutcome
-
Returns the value of the fix record component.
- fix() - Method in interface io.codemodder.plugins.llm.LLMRemediationOutcome
-
A description of the fix for cases that match this description.
- fix() - Method in record class io.codemodder.plugins.llm.NoActionLLMRemediationOutcome
- formatLinesWithLineNumbers() - Method in interface io.codemodder.plugins.llm.FileDescription
-
Returns the file as a single string, with each line prefixed by its line number, starting at 1.
- from(Path) - Static method in interface io.codemodder.plugins.llm.FileDescription
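For illustration, a sketch of rendering a file for a line-oriented prompt using FileDescription; whether from(Path) declares checked exceptions is not shown in this index, so the method below declares throws Exception defensively:

    import io.codemodder.plugins.llm.FileDescription;
    import java.nio.file.Path;

    class NumberedFileSketch {
      // Renders a file with 1-based line numbers, a convenient shape for prompts
      // that ask the LLM to reference specific lines.
      static String numbered(Path file) throws Exception {
        FileDescription description = FileDescription.from(file);
        return description.formatLinesWithLineNumbers();
      }
    }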
- fromAzureOpenAI(String, String) - Static method in class io.codemodder.plugins.llm.OpenAIService
-
Creates a new OpenAIService instance with the given Azure OpenAI token and endpoint.
- fromOpenAI(String) - Static method in class io.codemodder.plugins.llm.OpenAIService
-
Creates a new OpenAIService instance with the given OpenAI token.
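For illustration, a sketch of obtaining an OpenAIService via these factory methods; the environment variable names are hypothetical, and the fallback to noServiceAvailable() assumes that method returns an instance whose isServiceAvailable() reports false:

    import io.codemodder.plugins.llm.OpenAIService;

    class OpenAIServiceSketch {
      // Chooses a factory method based on which credentials are present.
      // The environment variable names below are hypothetical.
      static OpenAIService create() {
        String openAiKey = System.getenv("OPENAI_API_KEY");
        if (openAiKey != null) {
          return OpenAIService.fromOpenAI(openAiKey);
        }
        String azureKey = System.getenv("AZURE_OPENAI_API_KEY");
        String azureEndpoint = System.getenv("AZURE_OPENAI_ENDPOINT");
        if (azureKey != null && azureEndpoint != null) {
          return OpenAIService.fromAzureOpenAI(azureKey, azureEndpoint);
        }
        // Assumption: this returns an instance for which isServiceAvailable() is false.
        return OpenAIService.noServiceAvailable();
      }
    }

Callers can then check isServiceAvailable() before issuing requests.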
G
- getCharset() - Method in interface io.codemodder.plugins.llm.FileDescription
-
Returns the file's charset.
- getFileName() - Method in interface io.codemodder.plugins.llm.FileDescription
-
Returns the file name.
- getFixPrompt() - Method in class io.codemodder.plugins.llm.SarifToLLMForBinaryVerificationAndFixingCodemod
-
Instructs the LLM on how to fix the threat.
- getJSONCompletion(List<ChatRequestMessage>, Model) - Method in class io.codemodder.plugins.llm.OpenAIService
-
Gets the completion for the given messages.
- getLines() - Method in interface io.codemodder.plugins.llm.FileDescription
-
Returns the file as a list of lines.
- getLineSeparator() - Method in interface io.codemodder.plugins.llm.FileDescription
-
Returns the file's preferred line separator by locating the first line separator and assuming "\n" if none are found.
- getModules(Path, List<Path>, List<String>, List<String>, List<Class<? extends CodeChanger>>, List<RuleSarif>, List<Path>, List<Path>, Path, Path) - Method in class io.codemodder.plugins.llm.LLMProvider
- getResponseForPrompt(List<ChatRequestMessage>, Model, Class<T>) - Method in class io.codemodder.plugins.llm.OpenAIService
-
Returns an object of the given type based on the completion for the given messages.
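A minimal sketch of binding a completion to a type with getResponseForPrompt; the ChatRequestMessage types are assumed to come from the Azure OpenAI SDK, the RiskAssessment record, prompt wording, and model choice are hypothetical, and throws Exception is declared defensively:

    import com.azure.ai.openai.models.ChatRequestMessage;
    import com.azure.ai.openai.models.ChatRequestSystemMessage;
    import com.azure.ai.openai.models.ChatRequestUserMessage;
    import io.codemodder.plugins.llm.OpenAIService;
    import io.codemodder.plugins.llm.StandardModel;
    import java.util.List;

    class TypedCompletionSketch {
      // Hypothetical type the completion is deserialized into.
      record RiskAssessment(String risk, String explanation) {}

      // Assumption: the service parses the model's JSON completion into the given class.
      static RiskAssessment assess(OpenAIService openAI, String code) throws Exception {
        List<ChatRequestMessage> messages =
            List.of(
                new ChatRequestSystemMessage("Assess the injection risk of the given code."),
                new ChatRequestUserMessage(code));
        return openAI.getResponseForPrompt(
            messages, StandardModel.GPT_4O_2024_05_13, RiskAssessment.class);
      }
    }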
- getThreatPrompt() - Method in class io.codemodder.plugins.llm.SarifToLLMForMultiOutcomeCodemod
-
Instructs the LLM on how to assess the risk of the threat.
- getThreatPrompt(CodemodInvocationContext, List<Result>) - Method in class io.codemodder.plugins.llm.SarifToLLMForBinaryVerificationAndFixingCodemod
-
Instructs the LLM on how to assess the risk of the threat.
- GPT_4_TURBO_2024_04_09 - Enum constant in enum class io.codemodder.plugins.llm.StandardModel
- GPT_4O_2024_05_13 - Enum constant in enum class io.codemodder.plugins.llm.StandardModel
H
- hashCode() - Method in record class io.codemodder.plugins.llm.CodeChangingLLMRemediationOutcome
-
Returns a hash code value for this object.
- hashCode() - Method in record class io.codemodder.plugins.llm.NoActionLLMRemediationOutcome
-
Returns a hash code value for this object.
- HIGH - Enum constant in enum class io.codemodder.plugins.llm.BinaryThreatRisk
I
- id() - Method in interface io.codemodder.plugins.llm.Model
- id() - Method in enum class io.codemodder.plugins.llm.StandardModel
- io.codemodder.plugins.llm - package io.codemodder.plugins.llm
- isPatchExpected(Patch<String>) - Method in class io.codemodder.plugins.llm.SarifToLLMForBinaryVerificationAndFixingCodemod
-
Returns whether the patch returned by the LLM is within the expectations of this codemod.
- isServiceAvailable() - Method in class io.codemodder.plugins.llm.OpenAIService
-
Returns whether the service is available.
K
- key() - Method in record class io.codemodder.plugins.llm.CodeChangingLLMRemediationOutcome
-
Returns the value of the key record component.
- key() - Method in interface io.codemodder.plugins.llm.LLMRemediationOutcome
-
A small, unique key that identifies this outcome.
- key() - Method in record class io.codemodder.plugins.llm.NoActionLLMRemediationOutcome
-
Returns the value of the key record component.
L
- LLMDiffs - Class in io.codemodder.plugins.llm
-
Utilities for working with diff patches returned by an LLM.
- LLMProvider - Class in io.codemodder.plugins.llm
-
Provides LLM-related functionality to codemods.
- LLMProvider() - Constructor for class io.codemodder.plugins.llm.LLMProvider
- LLMRemediationOutcome - Interface in io.codemodder.plugins.llm
-
Describes a possible remediation outcome.
- LLMServiceModule - Class in io.codemodder.plugins.llm
-
Provides configured LLM services.
- LLMServiceModule() - Constructor for class io.codemodder.plugins.llm.LLMServiceModule
- LOW - Enum constant in enum class io.codemodder.plugins.llm.BinaryThreatRisk
M
- Model - Interface in io.codemodder.plugins.llm
-
Internal model for a GPT language model.
N
- NoActionLLMRemediationOutcome - Record Class in io.codemodder.plugins.llm
-
Models the parameters for a remediation analysis that results in no code changes.
- NoActionLLMRemediationOutcome(String, String) - Constructor for record class io.codemodder.plugins.llm.NoActionLLMRemediationOutcome
-
Creates an instance of a NoActionLLMRemediationOutcome record class.
- noServiceAvailable() - Static method in class io.codemodder.plugins.llm.OpenAIService
O
- onFileFound(CodemodInvocationContext, List<Result>) - Method in class io.codemodder.plugins.llm.SarifToLLMForBinaryVerificationAndFixingCodemod
- onFileFound(CodemodInvocationContext, List<Result>) - Method in class io.codemodder.plugins.llm.SarifToLLMForMultiOutcomeCodemod
- openAI - Variable in class io.codemodder.plugins.llm.SarifPluginLLMCodemod
- OpenAIService - Class in io.codemodder.plugins.llm
-
A custom service class to wrap the OpenAIClient.
P
- providerName() - Method in class io.codemodder.plugins.llm.OpenAIService
S
- SarifPluginLLMCodemod - Class in io.codemodder.plugins.llm
-
A base class for LLM codemods that process SARIF and use the OpenAI service.
- SarifPluginLLMCodemod(RuleSarif, OpenAIService) - Constructor for class io.codemodder.plugins.llm.SarifPluginLLMCodemod
- SarifToLLMForBinaryVerificationAndFixingCodemod - Class in io.codemodder.plugins.llm
-
An extension of SarifPluginRawFileChanger that uses large language models (LLMs) to more deeply analyze and then fix the files found by the static analysis tool.
- SarifToLLMForBinaryVerificationAndFixingCodemod(RuleSarif, OpenAIService) - Constructor for class io.codemodder.plugins.llm.SarifToLLMForBinaryVerificationAndFixingCodemod
-
For backwards compatibility with a previous version of this API, uses a GPT 3.5 Turbo model.
- SarifToLLMForBinaryVerificationAndFixingCodemod(RuleSarif, OpenAIService, Model) - Constructor for class io.codemodder.plugins.llm.SarifToLLMForBinaryVerificationAndFixingCodemod
- SarifToLLMForMultiOutcomeCodemod - Class in io.codemodder.plugins.llm
-
An extension of SarifPluginRawFileChanger that uses large language models (LLMs) to perform some analysis and categorize what's found to drive different potential code changes.
- SarifToLLMForMultiOutcomeCodemod(RuleSarif, OpenAIService, List<LLMRemediationOutcome>) - Constructor for class io.codemodder.plugins.llm.SarifToLLMForMultiOutcomeCodemod
- SarifToLLMForMultiOutcomeCodemod(RuleSarif, OpenAIService, List<LLMRemediationOutcome>, Model, Model) - Constructor for class io.codemodder.plugins.llm.SarifToLLMForMultiOutcomeCodemod
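For illustration, a sketch of the outcome list such a codemod might be constructed with; the keys and wording are invented, the (key, description, fix) component order is assumed from the record accessors listed in this index, and both record classes are assumed to implement LLMRemediationOutcome:

    import io.codemodder.plugins.llm.CodeChangingLLMRemediationOutcome;
    import io.codemodder.plugins.llm.LLMRemediationOutcome;
    import io.codemodder.plugins.llm.NoActionLLMRemediationOutcome;
    import java.util.List;

    class RemediationOutcomesSketch {
      // Hypothetical outcomes: one that should trigger a code change and one that should not.
      static List<LLMRemediationOutcome> outcomes() {
        return List.of(
            new CodeChangingLLMRemediationOutcome(
                "unsafe-deserialization",                          // key (assumed first component)
                "The flagged code deserializes untrusted input.",  // description
                "Replace the call with a hardened deserializer."), // fix
            new NoActionLLMRemediationOutcome(
                "already-safe",
                "The input is validated before it reaches the deserializer."));
      }
    }

A list like this is what the SarifToLLMForMultiOutcomeCodemod(RuleSarif, OpenAIService, List<LLMRemediationOutcome>) constructor accepts from a concrete codemod.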
- shouldApplyCodeChanges() - Method in record class io.codemodder.plugins.llm.CodeChangingLLMRemediationOutcome
- shouldApplyCodeChanges() - Method in interface io.codemodder.plugins.llm.LLMRemediationOutcome
-
Whether this outcome should lead to a code change.
- shouldApplyCodeChanges() - Method in record class io.codemodder.plugins.llm.NoActionLLMRemediationOutcome
- shouldRun() - Method in class io.codemodder.plugins.llm.SarifPluginLLMCodemod
-
Indicates whether the codemod should run.
- StandardModel - Enum Class in io.codemodder.plugins.llm
-
Well-known GPT models used in Codemod development.
T
- tokens(List<String>) - Method in interface io.codemodder.plugins.llm.Model
-
Estimates the number of tokens the messages will consume when passed to this model.
- Tokens - Class in io.codemodder.plugins.llm
-
A set of utilities around LLM tokens.
- toString() - Method in record class io.codemodder.plugins.llm.CodeChangingLLMRemediationOutcome
-
Returns a string representation of this record class.
- toString() - Method in record class io.codemodder.plugins.llm.NoActionLLMRemediationOutcome
-
Returns a string representation of this record class.
V
- valueOf(String) - Static method in enum class io.codemodder.plugins.llm.BinaryThreatRisk
-
Returns the enum constant of this class with the specified name.
- valueOf(String) - Static method in enum class io.codemodder.plugins.llm.StandardModel
-
Returns the enum constant of this class with the specified name.
- values() - Static method in enum class io.codemodder.plugins.llm.BinaryThreatRisk
-
Returns an array containing the constants of this enum class, in the order they are declared.
- values() - Static method in enum class io.codemodder.plugins.llm.StandardModel
-
Returns an array containing the constants of this enum class, in the order they are declared.