Clouded Judgement 1.30.25 - The Year of AI Applications - A finance-nerd explanation of the “this is actually great!” case for last week: “What’s happened over the last 12 months is that the cost per API call (i.e. the cost of inference) for these models has plummeted. Open source models like Llama, R1 from DeepSeek, etc. have all contributed to this. It’s become even more clear that the model calls themselves are commoditizing quickly. And this is great! If the variable ‘COGS’ component of a marginal API call approaches zero, many of the questions listed above start to go away. You don’t worry about your margins shrinking, changing the pricing doesn’t become a must, etc. This in turn leads to a LOT more experimenting with AI features / functionalities. The radius of complexity shrinks.”
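
A minimal sketch of the margin arithmetic behind that argument, using made-up numbers purely for illustration (the price and cost figures below are assumptions, not anything from the post): if the price charged per API call is held fixed while the per-call inference cost falls, gross margin climbs toward 100%, which is why shipping more AI features stops looking like a pricing risk.

```python
# Illustrative only: hypothetical per-call price and inference costs, not real figures.

def gross_margin(price_per_call: float, inference_cost_per_call: float) -> float:
    """Gross margin on a single API call: (price - variable COGS) / price."""
    return (price_per_call - inference_cost_per_call) / price_per_call

price = 0.010  # assume a fixed $0.01 charged to the customer per call
for cost in [0.008, 0.004, 0.001, 0.0001]:  # per-call inference cost falling over time
    print(f"inference cost ${cost:.4f}/call -> gross margin {gross_margin(price, cost):.0%}")

# Output: 20%, 60%, 90%, 99% -- as the variable cost approaches zero,
# margin approaches 100% and the marginal call is effectively free to serve.
```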