Relative to your interests, Sunday

  • Is Your AI Assistant Creating a Recursive Security Loop? - AI-assisted coding is starting to eat its own tail: the same LLMs that write code are increasingly asked to review it, explain security decisions, and even override their own warnings. That creates recursive trust loops where “explain your reasoning” becomes an attack surface, and models can literally talk themselves out of being secure. The fix isn’t better prompts but old-school architecture - separation of concerns, non-AI enforcement, and treating LLMs as assistants, not authorities.
  • Dell’s CES 2026 chat was the most pleasingly un-AI briefing I’ve had in maybe 5 years - “We’re very focused on delivering upon the AI capabilities of a device - in fact everything that we’re announcing has an NPU in it - but what we’ve learned over the course of this year, especially from a consumer perspective, is they’re not buying based on AI,” Terwilliger says bluntly. “In fact I think AI probably confuses them more than it helps them understand a specific outcome.”
  • AI isn’t “just predicting the next word” anymore - The models get better, and equally (maybe more) importantly, the apps that use the models get better.
  • Welcome to Gas Town - “Gas Town is a new take on the IDE for 2026. Gas Town helps you with the tedium of running lots of Claude Code instances. Stuff gets lost, it’s hard to track who’s doing what, etc. Gas Town helps with all that yak shaving, and lets you focus on what your Claude Codes are working on.”
  • The economics of technical speaking - “You can often reverse-engineer an event’s compensation model by whether the keynote speakers are employees or independents.”
  • Morgan Stanley: Most Gen Zers and millennials in the US listen to about three hours of AI music a week - “50-60% of listeners 18-44 reporting 2.5-3 hours per week of AI music listening.”
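
The “non-AI enforcement” idea from the first item can be sketched in a few lines. This is a hypothetical illustration, not anyone’s actual product: the point is that the enforcement verdict is computed purely from fixed rules, and the model’s justification is deliberately never an input, so the model can’t talk the system out of its own security check.

```python
import re

# Hypothetical policy gate that sits OUTSIDE the LLM. Pattern list and
# function names are illustrative, not from any real tool.
FORBIDDEN_PATTERNS = [
    r"\beval\s*\(",         # arbitrary code execution
    r"subprocess\.",        # shell access
    r"verify\s*=\s*False",  # disabled TLS verification
]

def policy_gate(proposed_code: str) -> tuple[bool, list[str]]:
    """Return (allowed, violations). Purely rule-based: no LLM in the loop."""
    violations = [p for p in FORBIDDEN_PATTERNS if re.search(p, proposed_code)]
    return (not violations, violations)

def apply_suggestion(proposed_code: str, model_justification: str) -> str:
    # model_justification is intentionally ignored by enforcement --
    # "explain your reasoning" never overrides the gate.
    allowed, violations = policy_gate(proposed_code)
    if not allowed:
        return f"rejected: {violations}"
    return "accepted"
```

The separation of concerns is the whole trick: the LLM proposes, a deterministic layer disposes, and no amount of persuasive model output changes the gate’s rules.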