DuckDuckGoSearch
This notebook provides a quick overview for getting started with DuckDuckGoSearch. For detailed documentation of all DuckDuckGoSearch features and configurations, head to the API reference.
DuckDuckGoSearch offers a privacy-focused search API designed for LLM Agents. It provides seamless integration with a wide range of data sources, prioritizing user privacy and relevant search results.
Overview
Integration details
Class | Package | PY support | Package latest
--- | --- | --- | ---
DuckDuckGoSearch | @langchain/community | ✅ | -
Setup
The integration lives in the `@langchain/community` package, along with the `duck-duck-scrape` dependency:
- npm: `npm i @langchain/community duck-duck-scrape`
- yarn: `yarn add @langchain/community duck-duck-scrape`
- pnpm: `pnpm add @langchain/community duck-duck-scrape`
Credentials
It's also helpful (but not needed) to set up LangSmith for best-in-class observability:
process.env.LANGCHAIN_TRACING_V2 = "true";
process.env.LANGCHAIN_API_KEY = "your-api-key";
Instantiation
You can instantiate the DuckDuckGoSearch tool like this:
import { DuckDuckGoSearch } from "@langchain/community/tools/duckduckgo_search";
const tool = new DuckDuckGoSearch({ maxResults: 1 });
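The constructor also accepts a `searchOptions` object that is forwarded to the underlying `duck-duck-scrape` search call. The exact fields come from `duck-duck-scrape`'s `SearchOptions` type and may vary by version, so treat the following as a sketch and check the API reference for details:

```typescript
import { DuckDuckGoSearch } from "@langchain/community/tools/duckduckgo_search";
// SafeSearchType is exported by duck-duck-scrape; the options available may vary by version.
import { SafeSearchType } from "duck-duck-scrape";

// Sketch: limit results and tighten safe search via options forwarded to duck-duck-scrape.
const customTool = new DuckDuckGoSearch({
  maxResults: 3,
  searchOptions: {
    safeSearch: SafeSearchType.STRICT,
  },
});
```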
Invocation
Invoke directly with args
await tool.invoke("what is the current weather in sf?");
[{"title":"San Francisco, CA Current Weather | AccuWeather","link":"https://www.accuweather.com/en/us/san-francisco/94103/current-weather/347629","snippet":"<b>Current</b> <b>weather</b> <b>in</b> San Francisco, CA. Check <b>current</b> conditions in San Francisco, CA with radar, hourly, and more."}]
Invoke with ToolCall
We can also invoke the tool with a model-generated ToolCall, in which case a ToolMessage will be returned:
// This is usually generated by a model, but we'll create a tool call directly for demo purposes.
const modelGeneratedToolCall = {
args: {
input: "what is the current weather in sf?",
},
id: "tool_call_id",
name: tool.name,
type: "tool_call",
};
await tool.invoke(modelGeneratedToolCall);
ToolMessage {
"content": "[{\"title\":\"San Francisco, CA Weather Conditions | Weather Underground\",\"link\":\"https://www.wunderground.com/weather/us/ca/san-francisco\",\"snippet\":\"San Francisco <b>Weather</b> Forecasts. <b>Weather</b> Underground provides local & long-range <b>weather</b> forecasts, weatherreports, maps & tropical <b>weather</b> conditions for the San Francisco area.\"}]",
"name": "duckduckgo-search",
"additional_kwargs": {},
"response_metadata": {},
"tool_call_id": "tool_call_id"
}
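In practice the ToolCall is produced by a tool-calling chat model rather than written by hand. A minimal sketch, assuming a tool-calling chat model `llm` like the ones instantiated in the Chaining section below:

```typescript
// `llm` is any tool-calling chat model (see the Chaining section below for setup).
const llmWithTools = llm.bindTools([tool]);

// The model decides whether to call the tool and with which arguments.
const aiMsg = await llmWithTools.invoke("what is the current weather in sf?");

// Each generated tool call can be passed straight to the tool,
// which returns a ToolMessage like the one shown above.
const toolMsgs = await tool.batch(aiMsg.tool_calls ?? []);
```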
Chaining
We can use our tool in a chain by first binding it to a tool-calling model and then calling it:
Pick your chat model:
- OpenAI
- Anthropic
- FireworksAI
- MistralAI
- Groq
- VertexAI
Install dependencies
- npm: `npm i @langchain/openai`
- yarn: `yarn add @langchain/openai`
- pnpm: `pnpm add @langchain/openai`
Add environment variables
OPENAI_API_KEY=your-api-key
Instantiate the model
import { ChatOpenAI } from "@langchain/openai";
const llm = new ChatOpenAI({
model: "gpt-4o-mini",
temperature: 0
});
Install dependencies
- npm: `npm i @langchain/anthropic`
- yarn: `yarn add @langchain/anthropic`
- pnpm: `pnpm add @langchain/anthropic`
Add environment variables
ANTHROPIC_API_KEY=your-api-key
Instantiate the model
import { ChatAnthropic } from "@langchain/anthropic";
const llm = new ChatAnthropic({
model: "claude-3-5-sonnet-20240620",
temperature: 0
});
Install dependencies
- npm: `npm i @langchain/community`
- yarn: `yarn add @langchain/community`
- pnpm: `pnpm add @langchain/community`
Add environment variables
FIREWORKS_API_KEY=your-api-key
Instantiate the model
import { ChatFireworks } from "@langchain/community/chat_models/fireworks";
const llm = new ChatFireworks({
model: "accounts/fireworks/models/llama-v3p1-70b-instruct",
temperature: 0
});
Install dependencies
- npm: `npm i @langchain/mistralai`
- yarn: `yarn add @langchain/mistralai`
- pnpm: `pnpm add @langchain/mistralai`
Add environment variables
MISTRAL_API_KEY=your-api-key
Instantiate the model
import { ChatMistralAI } from "@langchain/mistralai";
const llm = new ChatMistralAI({
model: "mistral-large-latest",
temperature: 0
});
Install dependencies
- npm: `npm i @langchain/groq`
- yarn: `yarn add @langchain/groq`
- pnpm: `pnpm add @langchain/groq`
Add environment variables
GROQ_API_KEY=your-api-key
Instantiate the model
import { ChatGroq } from "@langchain/groq";
const llm = new ChatGroq({
model: "mixtral-8x7b-32768",
temperature: 0
});
Install dependencies
- npm: `npm i @langchain/google-vertexai`
- yarn: `yarn add @langchain/google-vertexai`
- pnpm: `pnpm add @langchain/google-vertexai`
Add environment variables
GOOGLE_APPLICATION_CREDENTIALS=credentials.json
Instantiate the model
import { ChatVertexAI } from "@langchain/google-vertexai";
const llm = new ChatVertexAI({
model: "gemini-1.5-flash",
temperature: 0
});
import { HumanMessage } from "@langchain/core/messages";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { RunnableLambda } from "@langchain/core/runnables";
const prompt = ChatPromptTemplate.fromMessages([
["system", "You are a helpful assistant."],
["placeholder", "{messages}"],
]);
const llmWithTools = llm.bindTools([tool]);
const chain = prompt.pipe(llmWithTools);
const toolChain = RunnableLambda.from(async (userInput: string, config) => {
const humanMessage = new HumanMessage(userInput);
const aiMsg = await chain.invoke(
{
messages: [humanMessage],
},
config
);
const toolMsgs = await tool.batch(aiMsg.tool_calls, config);
return chain.invoke(
{
messages: [humanMessage, aiMsg, ...toolMsgs],
},
config
);
});
const toolChainResult = await toolChain.invoke(
"how many people have climbed mount everest?"
);
const { tool_calls, content } = toolChainResult;
console.log(
"AIMessage",
JSON.stringify(
{
tool_calls,
content,
},
null,
2
)
);
AIMessage {
"tool_calls": [],
"content": "As of December 2023, a total of 6,664 different people have reached the summit of Mount Everest."
}
Agents
For guides on how to use LangChain tools in agents, see the LangGraph.js docs.
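As a quick illustration, the tool can be passed to the prebuilt ReAct agent from LangGraph.js. A minimal sketch, assuming `@langchain/langgraph` is installed and `llm` is a tool-calling chat model as instantiated above:

```typescript
import { createReactAgent } from "@langchain/langgraph/prebuilt";

// `llm` is a tool-calling chat model and `tool` is the DuckDuckGoSearch instance from above.
const agent = createReactAgent({ llm, tools: [tool] });

const result = await agent.invoke({
  messages: [{ role: "user", content: "how many people have climbed mount everest?" }],
});

// The final message in the agent's state contains its answer.
console.log(result.messages.at(-1)?.content);
```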
API reference
For detailed documentation of all DuckDuckGoSearch features and configurations, head to the API reference.
Related
- Tool conceptual guide
- Tool how-to guides