
Using Anthropic's API

Following Anthropic's getting started documentation, let's write a simple program that calls the API. This will be the basis for an agent we can build later.

Let's do this in Node.js. If Node is installed on your system, the node --version command will show the installed version.

❯ node --version
v22.16.0

Create a folder and, inside it, run the command below. This installs the SDK (and its dependencies) into the node_modules folder.

❯ npm install @anthropic-ai/sdk

Checking.

❯ tree -L 1
.
β”œβ”€β”€ node_modules
β”œβ”€β”€ package-lock.json
β”œβ”€β”€ package.json

Now we need a main file that calls this library. Create a main.js file with the following content. Since it uses import and top-level await, make sure your package.json contains "type": "module" (or name the file main.mjs).

import Anthropic from "@anthropic-ai/sdk"; // Import the library

// Initialize the client.
// By default the SDK reads the API key from the ANTHROPIC_API_KEY
// environment variable, which must be set in your terminal.
const anthropic = new Anthropic();

// Under the hood this call does little more than build the POST request
// that is sent to the API.
const msg = await anthropic.messages.create({
  model: "claude-opus-4-20250514",
  max_tokens: 1000,
  temperature: 1,
  // Here we declare the system prompt.
  system: "Respond objectively and with short sentences",
  messages: [
    {
      role: "user",
      content: [
        {
          type: "text",
          // Here we declare the user's text.
          text: "Why is the ocean salty?"
        }
      ]
    }
  ]
});
console.log(msg);
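As the comment says, the SDK is essentially a convenience layer over an HTTP POST. As a rough sketch (endpoint and header names taken from Anthropic's HTTP API documentation; the request is only built here, not sent), the same call looks like this:

```javascript
// Build (but don't send) the raw HTTP request the SDK produces.
const body = {
  model: "claude-opus-4-20250514",
  max_tokens: 1000,
  system: "Respond objectively and with short sentences",
  // The API also accepts a plain string as content.
  messages: [{ role: "user", content: "Why is the ocean salty?" }],
};

const req = new Request("https://api.anthropic.com/v1/messages", {
  method: "POST",
  headers: {
    "x-api-key": process.env.ANTHROPIC_API_KEY ?? "",
    "anthropic-version": "2023-06-01",
    "content-type": "application/json",
  },
  body: JSON.stringify(body),
});

console.log(req.method, req.url);
// To actually send it: const res = await fetch(req); console.log(await res.json());
```

Node 18+ ships fetch and Request globally, so no extra dependency is needed.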
Checking again.

❯ tree -L 1
.
β”œβ”€β”€ main.js
β”œβ”€β”€ node_modules
β”œβ”€β”€ package-lock.json
β”œβ”€β”€ package.json

To make the calls you'll need to buy some credits from Anthropic and generate an API key.

Export it as the ANTHROPIC_API_KEY environment variable and run main.js.

export ANTHROPIC_API_KEY=xxxxxxxxxx

❯ node main.js
# Our response:
{
  id: 'msg_01SHqZPvKhbhLHzqqAbH2kJw',
  type: 'message',
  role: 'assistant',
  model: 'claude-opus-4-20250514',
  content: [
    {
      type: 'text',
      text: 'The ocean is salty because:\n' +
        '\n' +
        '1. **Rivers carry salts**: Rainwater dissolves minerals from rocks and rivers carry these salts to the sea\n' +
        '\n' +
        '2. **Evaporation concentrates salt**: Water evaporates, but salt remains in the ocean, increasing its concentration over millions of years\n' +
        '\n' +
        '3. **Volcanic activity**: Submarine volcanoes release minerals and salts directly into the water\n' +
        '\n' +
        '4. **Geological time**: This process has been happening for billions of years, accumulating more and more salt\n' +
        '\n' +
        'The main salt is sodium chloride (table salt), but there are other dissolved minerals too.'
    }
  ],
  stop_reason: 'end_turn',
  stop_sequence: null,
  usage: {
    input_tokens: 30,
    cache_creation_input_tokens: 0,
    cache_read_input_tokens: 0,
    output_tokens: 182,
    service_tier: 'standard'
  }
}

Notice how the response text comes back as markdown inside a string, and that the request used 30 input tokens and 182 output tokens.
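Since the response is a plain object, pulling out just the answer text (and the token usage) is a matter of indexing into content. A minimal sketch, using a hard-coded object shaped like the response above instead of a live API call:

```javascript
// An object shaped like the response above (text truncated for brevity).
const msg = {
  content: [{ type: "text", text: "The ocean is salty because..." }],
  usage: { input_tokens: 30, output_tokens: 182 },
};

// Join all text blocks; a response can contain more than one content block.
const answer = msg.content
  .filter((block) => block.type === "text")
  .map((block) => block.text)
  .join("\n");

console.log(answer);
console.log(`tokens: ${msg.usage.input_tokens} in, ${msg.usage.output_tokens} out`);
```

With the real msg from main.js, the same two lines print just the answer and the usage instead of the whole object.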

If this text were rendered as markdown, it would look like this.

The ocean is salty because:

  1. Rivers carry salts: Rainwater dissolves minerals from rocks and rivers carry these salts to the sea
  2. Evaporation concentrates salt: Water evaporates, but salt remains in the ocean, increasing its concentration over millions of years
  3. Volcanic activity: Submarine volcanoes release minerals and salts directly into the water
  4. Geological time: This process has been happening for billions of years, accumulating more and more salt

The main salt is sodium chloride (table salt), but there are other dissolved minerals too.

The same request we're building here can be made directly in Anthropic's console at https://console.anthropic.com by clicking Create Prompt.

We can predict the token count with the token counter before sending the request for processing. It returns an estimate, so there can be very small differences from the final count. Let's put it in a new file, counttoken.js.

import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic();

const response = await client.messages.countTokens({
  model: 'claude-opus-4-20250514',
  system: "Respond objectively and with short sentences",
  messages: [
    {
      role: "user",
      content: [
        {
          type: "text",
          // Here we declare the user's text.
          text: "Why is the ocean salty?"
        }
      ]
    }
  ]
});

console.log(response);

Running it, we get the same value we saw before.

❯ node counttoken.js
{ input_tokens: 30 }
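Since countTokens() and create() accept the same model/system/messages fields, one way to keep the two files in sync is to share a single request object and spread in the extra fields only create() needs. A sketch (the API calls themselves are commented out, since they require an API key and credits):

```javascript
// Shared request body used by both countTokens() and create().
const request = {
  model: "claude-opus-4-20250514",
  system: "Respond objectively and with short sentences",
  messages: [
    { role: "user", content: [{ type: "text", text: "Why is the ocean salty?" }] },
  ],
};

// const estimate = await client.messages.countTokens(request);
// const msg = await client.messages.create({ ...request, max_tokens: 1000 });

console.log(request.messages[0].content[0].text);
```

This way, the token estimate always refers to exactly the prompt that will be sent.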