To use this model you need to have the `@mlc-ai/web-llm` module installed, which you can do with `npm install -S @mlc-ai/web-llm`. You can see a list of available model records here: https://github.com/mlc-ai/web-llm/blob/eaaff6a7730b6403810bb4fd2bbc4af113c36050/examples/simple-chat/src/gh-config.js
Example

```typescript
import { HumanMessage } from "@langchain/core/messages";
import { ChatWebLLM } from "@langchain/community/chat_models/webllm";

// Initialize the ChatWebLLM model with the model record.
const model = new ChatWebLLM({
  model: "Phi2-q4f32_1",
  chatOptions: {
    temperature: 0.5,
  },
});

// Call the model with a message and await the response.
const response = await model.invoke([
  new HumanMessage({ content: "My name is John." }),
]);
```
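Like other LangChain chat models, `ChatWebLLM` also supports token-by-token streaming through the standard `.stream()` method. The following is a minimal sketch assuming the same model setup as above; the prompt string is illustrative:

```typescript
// A minimal streaming sketch, assuming `model` is the ChatWebLLM
// instance constructed in the example above.
const stream = await model.stream([
  new HumanMessage({ content: "Tell me about your day." }),
]);

for await (const chunk of stream) {
  // Each chunk contains a piece of the generated response.
  console.log(chunk.content);
}
```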