
The rise of prompt ops: Tackling hidden AI costs from bad inputs and context bloat


AI models can get fatigued; prompt ops can help manage, measure, monitor and tune prompts.

Couple this with all the tinkering involved in prompting — it can take a few tries to reach the intended result, and sometimes the question at hand simply doesn’t need a model that can think like a PhD — and compute spend can get out of control. Misconfigured API calls add to the bill: sending a request at high reasoning effort (as with OpenAI’s o3) incurs higher costs when a lower-effort, cheaper request would suffice. And while bullet points, itemized lists or bold indicators (****) may look “a bit cluttered” to human eyes, Emerson noted, these callouts can be beneficial for an LLM.
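The cost point above can be sketched in code. The snippet below builds a request payload that downgrades to a cheaper model and lower reasoning effort for simple questions; the model names and the "reasoning_effort" parameter mirror OpenAI's Chat Completions API but are assumptions here, so check the current API reference before relying on them.

```python
# Sketch: pick a cheaper reasoning configuration per request instead of
# defaulting every call to the most expensive setting.
# ASSUMPTION: model names ("o3", "o3-mini") and the "reasoning_effort"
# field follow OpenAI's Chat Completions API; verify against current docs.

def build_request(question: str, needs_deep_reasoning: bool) -> dict:
    """Return a chat-completion payload, downgrading effort for simple asks."""
    if needs_deep_reasoning:
        return {
            "model": "o3",               # higher-cost reasoning model
            "reasoning_effort": "high",
            "messages": [{"role": "user", "content": question}],
        }
    return {
        "model": "o3-mini",              # cheaper model at low effort
        "reasoning_effort": "low",
        "messages": [{"role": "user", "content": question}],
    }

simple = build_request("What's 2 + 2?", needs_deep_reasoning=False)
hard = build_request("Derive the closed form.", needs_deep_reasoning=True)
print(simple["model"], simple["reasoning_effort"])
print(hard["model"], hard["reasoning_effort"])
```

A prompt-ops layer would make the `needs_deep_reasoning` decision itself, from heuristics or a small classifier, rather than leaving it to each caller.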


Read the full article on VentureBeat.


Related news:

- Echo Chamber: A Context-Poisoning Jailbreak That Bypasses LLM Guardrails
- Meetings After 8 p.m. Are On the Rise, Microsoft Study Finds
- Rack scale is on the rise, but it's not for everyone... yet