Welcome to my public knowledge archive, where I document insights from articles, research, and ideas worth remembering.

Author: Bloomberg

Note: ai and business process automation

More than three quarters (77%) of companies’ usage of Anthropic’s Claude AI software involved automation patterns, often including “full task delegation,” according to a research report the startup released on Monday. The finding was based on an analysis of traffic from Anthropic’s application programming interface, which is used by developers and businesses.

I think the critical thought here is that users may only be leaning into AI where it makes sense right now. So, two years in, I would expect more of Anthropic's usage patterns to reflect actual work instead of exploratory work.

Author: Armin Ronacher

Note: AI can type faster, but it can't program faster

Do I program any faster? Not really. But it feels like I’ve gained 30% more time in my day because the machine is doing the work. I alternate between giving it instructions, reading a book, and reviewing the changes.

I think this is the way, and it matches what I’ve experienced. Sure, you can spool code from the CLI to your editor and review via git diffs. But are you accomplishing more throughput, or just offloading syntactic structure? I think of AI more as a mega-linter: able to apply architectural patterns, not just whitespace fixes.

Author: Rahul Ramesh

Note: ai code must pass through the human mind - thus a new bottleneck emerges

Code review, however, emerged as the most significant challenge. Reviewing the generated code line by line across all changes took me approximately 20 minutes. Unlike pairing with a human developer—where iterative discussions occur at a manageable pace—working with an AI system capable of generating entire modules within seconds creates a bottleneck on the human side, especially when attempting line-by-line scrutiny. This isn’t a limitation of the AI itself, but rather a reflection of human review capacity.

Author: Kenneth

Note: watch the hands of AI not their mouths

Seven years from GPT-1 to the plateau. How many more until we stop trying to build intelligence and start trying to understand what we’ve already built? That’s the real work now - not training the next model, but figuring out what to do with the ones we have. Turns out the singularity looks less like transcendence and more like integration work. Endless, necessary integration work.

I think this is it. The last five years, and especially the last three, have been amazing to watch. From GPT-2 to today, AI has only gotten better at helping me code basic things. But it still tries to run nonsensical commands, and just recently it added a “Utilties-2” folder to my project because I already had one?

Author: Scott A. Wolla

Note: lumpy labor fallacy - don't be a loser, I guess

Automation affects workers in different ways. In some cases, technology acts as a complement to human labor, and in other cases as a substitute for human labor. Over the long run, technological advance creates new goods and services, raises national income, and increases the demand for labor throughout the economy. However, it is important to note that these changes can create winners and losers—some workers will lack the skills to transition to new jobs. Recent technological advance has increased the demand for highly skilled workers, whose labor is a complement to the new technology, but the new technology has replaced the labor of some less-skilled workers. Therefore, it’s important that workers invest in their human capital and continue to improve their skills throughout their working years.

Author: Klint Finley

Note: all the way around to static content site generators

Harris took a different approach. Svelte performs its middle-layer work before a developer uploads code to a web server, well before a user ever downloads it. This makes it possible to remove unnecessary features, shrinking the resulting app. It also reduces the number of moving parts when a user runs the app, which can make Svelte apps faster and more efficient. Wang says he likes to use Svelte for web pages, but he still uses React for larger applications, including his professional work. For one thing, the larger an app, the more likely a developer will use all of React’s features.
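The compile-time versus runtime distinction is easier to see in code. Below is a rough conceptual sketch (not Svelte's actual compiler output, and the `createCounter` example is made up for illustration) of the kind of direct, targeted DOM update a compiler can emit at build time, instead of shipping a virtual-DOM diffing runtime to the browser:

```typescript
// Conceptual sketch only -- not Svelte's real output.
// A runtime framework (React-style) ships a diffing engine and re-renders a
// virtual tree on every state change. A compiler (Svelte-style) can figure
// out at build time exactly which DOM nodes depend on which state, and emit
// direct updates, so no diffing library travels over the wire.

function createCounter(target: HTMLElement) {
  let count = 0;

  const button = document.createElement("button");
  const label = document.createTextNode(`Clicked ${count} times`);
  button.appendChild(label);

  button.addEventListener("click", () => {
    count += 1;
    // Only the text node that depends on `count` is touched.
    label.data = `Clicked ${count} times`;
  });

  target.appendChild(button);
}

createCounter(document.body);
```

Because the "framework" work happens before upload, the shipped bundle contains only code like the above, which is why small Svelte apps tend to be lighter than their React equivalents.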

Author: Lisa Dziuba

Note: great real world notes on AI-coding

Some of us lean on AI coding to push side projects faster into the delivery pipeline. These are not core product features but experiments and MVP-style initiatives. For bringing that kind of work to its first version, the speed-up is real. … output quality gets worse the more context you add. The model starts pulling in irrelevant details from earlier prompts, and accuracy drops. … AI can get you 70% of the way, but the last 30% is the hard part. The assistant scaffolds a feature, but production readiness means edge cases, architecture fixes, tests, and cleanup

Author: Chantal Kapani

Note: writing code was never the bottleneck

“We need to stop talking about AI as a magic fix and instead focus on the specifics: where are the biggest points of friction for developers, how can AI help alleviate that friction, and specifically how should developers use AI tools to overcome that friction and move faster?” Laura Tacho, CTO at DX told LeadDev earlier this year.

Not sure if I captured this before, but writing code was never the bottleneck in software development. And no engineer drives a single feature to completion alone; stakeholders are always involved.

Author: Ed Zitron's Where's Your Ed At

Note: growth at all costs burns everything

As my friend Kasey put it in a recent conversation, growth is a fire. If you build a nice, sustainable fire, it’ll keep you warm, cook food and sustain life. And if the only thing you care about is how big your fire is, then it’ll set fire to everything around it, and the more you throw into it, the more it’ll burn. Eventually, you’ll have nothing left, but if you desperately desire that fire, you will constantly have to find new things to burn at any cost.

Author: Ed Zitron's Where's Your Ed At

Note: Nvidia's GPUs are propping up the mag 7 and the whole market

NVIDIA’s earnings are, effectively, the US stock market’s confidence, and everything rides on five companies — and if we’re honest, really four companies — buying GPUs for generative AI services or to train generative AI models. Worse still, these services, while losing these companies massive amounts of money, don’t produce much revenue, meaning that the AI trade is not driving any real, meaningful revenue growth.

We’re three years in, and generative AI’s highest-grossing companies — outside OpenAI ($10 billion annualized as of early June) and Anthropic ($4 billion annualized as of July), and both lose billions a year after revenue — have three major problems: