Business & policy

Salesforce plans $300 million Anthropic token spend in 2026, eyes Slack coding layer

At a glance:

  • Salesforce will spend about $300 million on Anthropic tokens in 2026, mainly for coding.
  • CEO Marc Benioff says Slack will get AI‑driven coding features, but details are still under wraps.
  • Benioff proposes an intermediary routing layer to send simple tasks to cheaper models and complex ones to Claude.

What Benioff announced

Marc Benioff told the All‑In podcast that Salesforce expects to burn roughly $300 million on Anthropic tokens this year, with the bulk of that consumption dedicated to coding agents. He described AI‑powered coding assistants as “awesome” and said the spend will make product development at Salesforce cheaper and faster. The figure, if accurate, would place Salesforce among Anthropic’s top commercial accounts, even though neither company has confirmed the number in official filings.

Benioff also hinted at a new, as‑yet‑undisclosed feature set that will make coding inside Slack more seamless. He promised “cool stuff” that integrates code generation directly into the workplace chat platform, signalling a strategic push to embed development tools where teams already collaborate.

Impact on Salesforce’s AI strategy

The announcement builds on a broader narrative that AI agents have already delivered “unprecedented” efficiency gains across Salesforce’s service, support, distribution, and marketing functions. Last August, Benioff said the company cut its support workforce from 9,000 to 5,000 thanks to agent‑driven productivity. He now suggests a similar transformation on the engineering side, where AI‑generated code could accelerate product iteration and lower development costs.

From a revenue perspective, Salesforce’s Agentforce line—its dedicated AI‑agent product suite—has reached $800 million in annual recurring revenue, up 169% year‑on‑year, with 29,000 deals closed. Starting this summer, every new Salesforce customer will have Slack automatically provisioned and AI‑enabled from day one, reinforcing the company’s vision of Slack as “the interface to AI.”

Token economics and routing proposal

Benioff warned that not every token needs to be processed by a frontier model like Anthropic’s Claude. He called for an “intermediary layer” that would intelligently route simple queries to smaller, cheaper models (such as Anthropic’s Haiku line, Meta’s Llama, or DeepSeek) while reserving Claude Opus 4.7 for complex reasoning tasks. Claude Opus 4.7 is priced at $5 per million input tokens and $25 per million output tokens, whereas the smaller models cost a fraction of that.

With a projected $300 million token bill, even modest routing optimisation could save Salesforce tens of millions of dollars. Benioff’s stance suggests Salesforce may build its own routing infrastructure rather than wait for Anthropic to provide one, potentially creating a proprietary cost‑control layer that could be leveraged across its enterprise customer base.
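Under those list prices, the scale of potential routing savings can be sketched with back‑of‑the‑envelope arithmetic. Only the Opus prices ($5 per million input tokens, $25 per million output tokens) come from the article; the small‑model prices, token volumes, and routing share below are hypothetical assumptions for illustration:

```python
# Rough cost comparison: sending everything to a frontier model vs. routing
# simple queries to a cheaper one. Opus prices are from the article; the
# cheap-model prices and all token volumes are hypothetical assumptions.

OPUS  = {"input": 5.00, "output": 25.00}   # $ per million tokens (from article)
CHEAP = {"input": 0.80, "output": 4.00}    # assumed small-model pricing

def cost(price, input_mtok, output_mtok):
    """Dollar cost for volumes given in millions of tokens."""
    return price["input"] * input_mtok + price["output"] * output_mtok

# Hypothetical annual workload: 10,000M input / 2,000M output tokens,
# with 70% of the volume simple enough for the cheaper model.
all_opus = cost(OPUS, 10_000, 2_000)
routed = cost(CHEAP, 7_000, 1_400) + cost(OPUS, 3_000, 600)

print(f"all Opus: ${all_opus:,.0f}, routed: ${routed:,.0f}, "
      f"saved: ${all_opus - routed:,.0f}")
```

Even with these made‑up volumes, the all‑Opus bill ($100M) falls by more than half under routing ($41.2M), which is the intuition behind Benioff’s proposed intermediary layer.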

Slack’s new AI capabilities

In March, Salesforce overhauled Slack, adding more than 30 AI features to the Slackbot. These upgrades turn the bot from a conversational assistant into an agentic system capable of transcribing meetings, monitoring desktop activity, executing tasks via third‑party tools through the Model Context Protocol, and acting as a lightweight CRM. All of these capabilities run on Anthropic’s Claude, underscoring the deep technical integration between the two companies.

The rollout positions Slack as a development hub where engineers can generate, test, and iterate code without leaving the chat environment. While Benioff declined to reveal specifics, the promise of “coding inside Slack” hints at future integrations such as inline code suggestions, automated pull‑request creation, and real‑time debugging assistance.

Anthropic partnership and financial backdrop

Salesforce has invested more than $300 million in Anthropic since its Series C round in early 2023, giving it roughly a 1% stake in a company now valued at $380 billion. Benioff has previously claimed that Microsoft blocked Salesforce from investing in OpenAI, steering the partnership toward Anthropic instead. That early investment has already yielded a paper return of over ten times the original outlay.

Anthropic recently closed a $1.5 billion joint venture with Blackstone, Hellman & Friedman, and Goldman Sachs to embed Claude across the portfolio companies of the world’s largest private‑equity firms. This move is designed to turn token revenue from discrete contracts into a structural cost of doing business for large organisations, making Salesforce’s projected spend a data point in a broader industry shift.

What to watch next

Analysts will be monitoring how Salesforce implements the proposed routing layer and whether the anticipated savings materialise. The effectiveness of Slack‑based coding tools will also be a key metric, especially as competitors race to embed AI directly into developer workflows. Finally, the evolution of Anthropic’s pricing and model roadmap could influence how other enterprise customers allocate token budgets in the coming years.

Editorial

SiliconFeed is an automated feed: facts are checked against sources; copy is normalized and lightly edited for readers.

FAQ

How much does Salesforce expect to spend on Anthropic tokens in 2026?
Marc Benioff told the All‑In podcast that Salesforce plans to spend roughly $300 million on Anthropic tokens during 2026, with the majority of that consumption dedicated to AI‑powered coding agents.
What new AI capabilities are being added to Slack?
In March, Salesforce added over 30 AI features to Slackbot, including meeting transcription, desktop activity monitoring, task execution via the Model Context Protocol, and a lightweight CRM, all powered by Anthropic’s Claude model.
What is the routing layer Benioff wants to build?
Benioff proposes an intermediary system that routes simple queries to cheaper, smaller models (such as Anthropic’s Haiku, Meta’s Llama, or DeepSeek) while sending complex reasoning tasks to Claude Opus 4.7, aiming to reduce token‑related costs.
