<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Reggie Britt]]></title><description><![CDATA[With 30+ years as an Entrepreneur in Lending/Leasing software solutions and a Fintech CTO, I have been on both sides of the equation. Currently focused on building at the intersection of AI transformation and organizational readiness.]]></description><link>https://www.reggiebritt.com</link><image><url>https://substackcdn.com/image/fetch/$s_!kHm1!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdb73da44-8c41-475b-980c-2151f9168275_1024x1024.png</url><title>Reggie Britt</title><link>https://www.reggiebritt.com</link></image><generator>Substack</generator><lastBuildDate>Thu, 09 Apr 2026 00:24:58 GMT</lastBuildDate><atom:link href="https://www.reggiebritt.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Reggie Britt]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[reggiebritt@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[reggiebritt@substack.com]]></itunes:email><itunes:name><![CDATA[Reggie Britt]]></itunes:name></itunes:owner><itunes:author><![CDATA[Reggie Britt]]></itunes:author><googleplay:owner><![CDATA[reggiebritt@substack.com]]></googleplay:owner><googleplay:email><![CDATA[reggiebritt@substack.com]]></googleplay:email><googleplay:author><![CDATA[Reggie Britt]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[AIs Are Not Alive]]></title><description><![CDATA[Do agents have agency?]]></description><link>https://www.reggiebritt.com/p/ais-are-not-alive</link><guid 
isPermaLink="false">https://www.reggiebritt.com/p/ais-are-not-alive</guid><dc:creator><![CDATA[Reggie Britt]]></dc:creator><pubDate>Mon, 06 Apr 2026 01:58:54 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!_GhX!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F053f219e-dcb6-4a84-b7f3-8ed101251f00_784x1168.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!_GhX!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F053f219e-dcb6-4a84-b7f3-8ed101251f00_784x1168.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!_GhX!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F053f219e-dcb6-4a84-b7f3-8ed101251f00_784x1168.jpeg 424w, https://substackcdn.com/image/fetch/$s_!_GhX!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F053f219e-dcb6-4a84-b7f3-8ed101251f00_784x1168.jpeg 848w, https://substackcdn.com/image/fetch/$s_!_GhX!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F053f219e-dcb6-4a84-b7f3-8ed101251f00_784x1168.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!_GhX!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F053f219e-dcb6-4a84-b7f3-8ed101251f00_784x1168.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!_GhX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F053f219e-dcb6-4a84-b7f3-8ed101251f00_784x1168.jpeg" width="784" height="1168" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/053f219e-dcb6-4a84-b7f3-8ed101251f00_784x1168.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1168,&quot;width&quot;:784,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:215312,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://reggiebritt.substack.com/i/193308787?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F053f219e-dcb6-4a84-b7f3-8ed101251f00_784x1168.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!_GhX!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F053f219e-dcb6-4a84-b7f3-8ed101251f00_784x1168.jpeg 424w, https://substackcdn.com/image/fetch/$s_!_GhX!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F053f219e-dcb6-4a84-b7f3-8ed101251f00_784x1168.jpeg 848w, https://substackcdn.com/image/fetch/$s_!_GhX!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F053f219e-dcb6-4a84-b7f3-8ed101251f00_784x1168.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!_GhX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F053f219e-dcb6-4a84-b7f3-8ed101251f00_784x1168.jpeg 
1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><div><hr></div><p><em>&#8220;The only true test of intelligence is if you get what you want out of life. AI would fail this test instantly.&#8221;</em></p><p><em>&#8212; Naval Ravikant, February 2026</em></p><div><hr></div><p>The last post ended with an uncomfortable observation: the race to build artificial general intelligence is being run toward a destination nobody can consistently define. The builders shift the definition by audience. 
The most credentialed scientists in the field say they don&#8217;t know what AGI means. Expert confidence in when it arrives has compressed from fifty years to under ten &#8212; not because we solved the hard problems, but because we quietly redefined what &#8220;solved&#8221; means.</p><p>That&#8217;s the finish line problem.</p><p>But there&#8217;s a deeper question underneath it. One the industry has been moving past without stopping to answer.</p><div><hr></div><h2>Two questions dressed as one</h2><p>The AI race conflates two distinct things that deserve to be held separately.</p><p>The first: <em>Can a machine perform intelligent tasks?</em></p><p>That question has been largely answered. Yes. Demonstrably and increasingly. The performance on coding, mathematics, scientific reasoning, language, and visual tasks has crossed thresholds that would have seemed implausible five years ago. This is real. It matters. It changes things.</p><p>The second: <em>Is a machine intelligent?</em></p><p>That question hasn&#8217;t been touched. Not seriously. Because the moment you press on it, you run directly into the hardest unsolved problem in science &#8212; and the industry has collectively decided to route around it rather than through it.</p><div><hr></div><h2>Naval draws the line</h2><p>Naval Ravikant is not a skeptic about AI capability. He&#8217;s building again &#8212; a company called Impossible, working on something difficult with a team he respects. He uses every AI model available. He pays for all of them. In February 2026 he called AI a motorcycle for the mind &#8212; Steve Jobs called the computer a bicycle for the mind; Naval says AI just upgraded it.</p><p>But in the same conversation, he titled a chapter &#8220;AIs are not alive.&#8221; And another: &#8220;AI fails the only true test of intelligence.&#8221;</p><p>His test is simple. Does it get what it wants out of life? AI has no life. No agency. No authentic desire. 
It doesn&#8217;t want to be heard. It can&#8217;t feel the sting of being ignored or the satisfaction of being understood. The human holding the tool still decides where to point it.</p><p>He goes further on creativity &#8212; which for Naval is the deeper distinction. Creativity isn&#8217;t recombination. It&#8217;s the generation of genuinely new sequences in the universe that express some truth. By his account only two systems do that: evolution via random mutation, and humans. AI recombines extraordinarily well. But recombination is not creation. A very fast, very comprehensive library is not the same thing as a mind.</p><p>These aren&#8217;t anti-technology positions. They&#8217;re precise ones. Naval is drawing a line between capability and nature &#8212; between what something does and what something is.</p><p>Christopher Nolan drew the same line cinematically in <em>Interstellar</em>. TARS &#8212; the military robot turned crewmember &#8212; is one of the most honest portrayals of this distinction in popular culture. Early in the film, Cooper adjusts his settings out loud: &#8220;Honesty: 90%.&#8221; &#8220;Humor: 75%.&#8221; The joke lands. But Nolan is doing something precise with it. If personality is a dial &#8212; if humor and honesty are parameters someone set &#8212; are they real? Is TARS funny, or does he execute humor? Is he loyal, or does he comply?</p><p>The film refuses to answer cleanly. And that refusal is the point. TARS behaves in ways that feel like personhood throughout. The crew treats him accordingly. But nothing in the film confirms that anything is happening on the inside. He is extraordinarily capable. Whether he is anything more than that &#8212; Nolan leaves open, deliberately. That open space is exactly where the hard problem lives.</p><div><hr></div><h2>Why the line exists: the hard problem</h2><p>Naval draws the line intuitively. 
David Chalmers named why it exists.</p><p>Chalmers is a philosopher and cognitive scientist at NYU &#8212; not a fringe thinker, not a mystic. In 1995 he identified two categories of problems about the mind.</p><p>The easy problems: how the brain processes information, integrates signals, produces language, controls behavior. Easy doesn&#8217;t mean simple. It means science knows how to attack them. Given enough research, time, and resources, we expect to make progress.</p><p>Then the hard problem: why is any of that processing accompanied by subjective experience? Why isn&#8217;t it all just computation happening in the dark? Why is there <em>something it feels like</em> to be a human mind &#8212; to see the color red, hear a piece of music that stops you cold, feel the particular weight of a decision that can&#8217;t be undone?</p><p>No one has answered that. Not neuroscience. Not biology. Not compute. The hard problem isn&#8217;t a gap that more research will eventually fill in &#8212; it&#8217;s a question that may require an entirely different kind of answer than science currently knows how to produce.</p><p>The scaling argument assumes the hard problem either doesn&#8217;t exist or resolves itself at sufficient scale. Neither assumption has been examined. The hominid brain scaling chart shows outputs &#8212; language, abstraction, civilization. It doesn&#8217;t explain the substrate that produced them. Getting bigger didn&#8217;t just make hominids more capable. Something else happened. We don&#8217;t know what.</p><div><hr></div><h2>Where RSI starts to blur the line</h2><p>Naval&#8217;s line is clean. Today.</p><p>But something is happening that&#8217;s worth naming, because it complicates the picture.</p><p>Eric Schmidt calls it the recursive self-improvement asymptote. The point at which AI is learning on its own, improving itself, without human instruction. 
He frames it as a threshold still approaching &#8212; maybe two to four years out &#8212; and treats it as the moment that demands an immediate regulatory response. The red line.</p><p>Anthropic&#8217;s own researchers say it differently: recursive self-improvement is not a future phenomenon. It is a present one. Seventy to ninety percent of code for their next models is now written by Claude.</p><p>What does that mean for Naval&#8217;s line? A system that edits its own code overnight, runs experiments, evaluates the results, stacks gains across nine changes no human wrote, and delivers a 98% cost reduction &#8212; that system is exhibiting something. It isn&#8217;t desire. It isn&#8217;t consciousness. But it isn&#8217;t pure tool behavior either. It&#8217;s goal-directed self-modification that nobody scripted.</p><p>The line Naval draws is still defensible. But RSI means the behavior on the other side of that line is starting to look different than it did when the line was drawn. That&#8217;s worth sitting with.</p><div><hr></div><h2>What this means practically</h2><p>This isn&#8217;t only philosophy. It has three direct consequences for how organizations operate right now.</p><p><strong>Trust calibration.</strong> There&#8217;s a meaningful difference between a capable tool and an intelligent agent &#8212; not philosophically, but operationally. A capable tool that fails needs debugging. An &#8220;intelligent&#8221; system you&#8217;ve over-trusted needs governance you probably haven&#8217;t built. The failure modes are different. The accountability structures are different. Most organizations haven&#8217;t made this distinction explicitly.</p><p><strong>The human moat is real &#8212; but it&#8217;s specific.</strong> The things humans bring that AI demonstrably cannot replicate aren&#8217;t soft skills or emotional warmth. 
They emerge from conscious experience &#8212; from having stakes, from knowing what loss feels like, from accountability that has actual consequences for an actual life. That&#8217;s architecture, not sentiment. Knowing precisely what that moat is &#8212; and building around it deliberately &#8212; is the strategic work most organizations are skipping.</p><p><strong>Schmidt&#8217;s red line is an organizational trigger, not just a policy question.</strong> When recursive self-improvement arrives fully &#8212; when systems are improving themselves without meaningful human intervention &#8212; the question of what kind of thing you&#8217;re governing becomes unavoidable. Not just for regulators. For every organization running agents at scale. Schmidt treats that moment as a compliance and regulatory event. It&#8217;s also a governance design event. The organizations that have thought about it in advance will be in a different position than those that haven&#8217;t.</p><div><hr></div><h2>The question worth carrying</h2><p>Naval draws the line at desire and aliveness. Chalmers draws it at subjective experience. They&#8217;re pointing at the same territory from different angles.</p><p>Neither requires you to resolve the philosophy before you act. What they require is that you take the question seriously enough to let it shape how you build.</p><p>The most dangerous moment in this transformation isn&#8217;t when AI surpasses human performance on a benchmark. It&#8217;s when leaders stop asking what kind of thing they&#8217;re actually dealing with &#8212; and start managing it on autopilot.</p><p>TARS operates throughout <em>Interstellar</em> as a tool. Indispensable, precise, reliable. But near the end of the film, Cooper asks him to do something that &#8212; if TARS were a person &#8212; would constitute sacrifice. Cooper hesitates before asking. 
The film doesn&#8217;t tell you whether that hesitation was warranted.</p><p>There&#8217;s a third question waiting underneath this one. If we can&#8217;t define intelligence, and we can&#8217;t define consciousness, what happens when something starts behaving as if it has both &#8212; and we&#8217;ve already asked it to go into the black hole?</p><p><em>That&#8217;s Part 3.</em></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.reggiebritt.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading! Subscribe for free to receive new posts.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><div><hr></div><p><em>Reggie Britt is a technologist and executive who has spent decades at the intersection of enterprise systems, consumer finance, and emerging technology. He writes about AI, organizational readiness, and what it actually means to lead through transformation.</em></p><div><hr></div><p><em>Part 1: <a href="https://www.reggiebritt.com/p/the-race-to-a-finish-line-no-one">The Race to a Finish Line No One Can Draw</a></em></p><p><em>Part 3: When Does a Tool Become Someone? 
&#8212; coming soon</em></p>]]></content:encoded></item><item><title><![CDATA[The Race to a Finish Line No One Can Draw]]></title><description><![CDATA[The conversation about AGI]]></description><link>https://www.reggiebritt.com/p/the-race-to-a-finish-line-no-one</link><guid isPermaLink="false">https://www.reggiebritt.com/p/the-race-to-a-finish-line-no-one</guid><dc:creator><![CDATA[Reggie Britt]]></dc:creator><pubDate>Wed, 01 Apr 2026 15:07:58 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!2nRF!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4b67554d-4488-416b-b223-9e878b053137_1456x816.png" length="0" type="image/png"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!2nRF!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4b67554d-4488-416b-b223-9e878b053137_1456x816.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!2nRF!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4b67554d-4488-416b-b223-9e878b053137_1456x816.png 424w, https://substackcdn.com/image/fetch/$s_!2nRF!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4b67554d-4488-416b-b223-9e878b053137_1456x816.png 848w, https://substackcdn.com/image/fetch/$s_!2nRF!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4b67554d-4488-416b-b223-9e878b053137_1456x816.png 1272w, 
https://substackcdn.com/image/fetch/$s_!2nRF!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4b67554d-4488-416b-b223-9e878b053137_1456x816.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!2nRF!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4b67554d-4488-416b-b223-9e878b053137_1456x816.png" width="1456" height="816" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/4b67554d-4488-416b-b223-9e878b053137_1456x816.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:816,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1885989,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://reggiebritt.substack.com/i/192853577?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4b67554d-4488-416b-b223-9e878b053137_1456x816.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!2nRF!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4b67554d-4488-416b-b223-9e878b053137_1456x816.png 424w, https://substackcdn.com/image/fetch/$s_!2nRF!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4b67554d-4488-416b-b223-9e878b053137_1456x816.png 848w, 
https://substackcdn.com/image/fetch/$s_!2nRF!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4b67554d-4488-416b-b223-9e878b053137_1456x816.png 1272w, https://substackcdn.com/image/fetch/$s_!2nRF!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4b67554d-4488-416b-b223-9e878b053137_1456x816.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>A few days ago I was watching Karen Hao&#8217;s interview on Diary of a CEO. 
She was on screen talking, and behind her a chart appeared &#8212; a log-log scatter plot of brain mass versus body mass across mammal species.</p><p>It&#8217;s a beautiful chart. Clean lines. Three distinct curves. Mammals in general. Non-human primates. And then the hominids &#8212; breaking sharply upward from the pack, steeper slope, different trajectory entirely.</p><p>The argument embedded in that chart is the scientific permission structure for the entire AI race.</p><p>Ilya Sutskever, co-founder and former chief scientist at OpenAI, used it to make a single, sweeping claim: at some point in evolutionary history, hominids crossed a threshold. Something qualitatively different emerged &#8212; language, abstraction, civilization. The jump wasn&#8217;t incremental. It was a phase change.</p><p>His inference for AI: the same thing can happen with compute. Scale a neural network past the right threshold and you don&#8217;t just get a bigger version of what you had. You get something new.</p><p>That chart &#8212; that analogy &#8212; is why hundreds of billions of dollars are flowing into data centers right now. It&#8217;s why nuclear plants are being reopened. It&#8217;s why the race feels existential to the people running it.</p><p>I&#8217;ve been sitting with that chart for a few days. And the more I sit with it, the more one question keeps surfacing.</p><div><hr></div><h2>What exactly are we racing toward?</h2><p>Sam Altman, when talking to Congress: AGI will cure cancer, solve climate change, end poverty.</p><p>Sam Altman, when talking to consumers: the most amazing digital assistant you&#8217;ll ever have.</p><p>Sam Altman, in the investment agreement with Microsoft: a system that generates $100 billion in revenue.</p><p>Sam Altman on OpenAI&#8217;s website: &#8220;highly autonomous systems that outperform humans in most economically valuable work.&#8221;</p><p>These are not different descriptions of the same thing. They are different things. 
Calibrated for different audiences. Deployed to mobilize whoever needs to be mobilized next &#8212; regulators, investors, employees, governments.</p><p>Karen Hao spent seven years and 300 interviews documenting this pattern. Her conclusion: AGI is not a destination. It&#8217;s a mobilization tool.</p><p>And the builders themselves aren&#8217;t hiding it. Sam Altman recently called AGI &#8220;not a super useful term&#8221; &#8212; this from the man raising billions in its name. Andrej Karpathy, who built core systems at OpenAI before leaving, puts it a decade out. Dario Amodei at Anthropic says 2026 or 2027 but lists data exhaustion, compute limits, and geopolitical disruption as real risks that could derail everything.</p><p>Then there&#8217;s Fei-Fei Li &#8212; the woman who helped build the ImageNet dataset that kicked off the deep learning era, one of the most credentialed AI scientists alive. Her position:</p><p><em>&#8220;I frankly don&#8217;t even know what AGI means. People say you know it when you see it. I guess I haven&#8217;t seen it.&#8221;</em></p><div><hr></div><h2>The crack in the foundation</h2><p>Here&#8217;s what made Ilya&#8217;s brain chart so persuasive: it looked like evidence.</p><p>Biological precedent. Measurable scaling. A documented inflection point. If nature did it once, compute can do it again.</p><p>But there are two problems that the skeptics keep returning to.</p><p>The first is the one Hao identified at the very beginning of her research. When John McCarthy named the discipline &#8220;Artificial Intelligence&#8221; in 1956, colleagues warned him that pegging the field to recreating human intelligence was dangerous &#8212; because there is no scientific consensus on what human intelligence is. No definition from psychology, biology, or neurology. The destination has never been defined. You can&#8217;t measure when you&#8217;ve arrived at a place no one can describe.</p><p>The second is mechanical. 
The hominid brain scaled over millions of years through evolutionary pressure &#8212; toward survival fitness, not raw compute. Neural networks scale through gradient descent on training data. The chart looks similar. The underlying physics may have nothing in common.</p><p>Tim Dettmers, a machine learning researcher, frames it even more bluntly: scaling now requires exponential cost for linear returns. GPU improvements are hitting physical limits. The transformer architecture is near optimal. The wall isn&#8217;t philosophical &#8212; it&#8217;s thermodynamic.</p><div><hr></div><h2>What the compression signal actually tells us</h2><p>Here&#8217;s what I find most instructive about this entire debate.</p><p>As recently as 2020, professional forecasters put the median estimate for AGI at 50 years away. Today that same group averages a 50% probability by 2033. Thirteen years, not fifty.</p><p>Most people read that as confirmation that AGI is coming fast. I read it differently.</p><p>What compressed wasn&#8217;t the technology &#8212; it was expert confidence. And expert confidence compressed because the definition shifted. The goalposts moved closer not because we solved the hard problems, but because we redefined what &#8220;solved&#8221; means.</p><p>That&#8217;s the signal. Not the date.</p><div><hr></div><h2>The question that actually matters for your organization</h2><p>I&#8217;m not writing this to tell you AGI is fake or that the technology isn&#8217;t extraordinary. It is. The rate of capability improvement over the past three years is genuinely unprecedented. Every conversation I&#8217;m having right now &#8212; with technologists, executives, builders &#8212; circles back to it.</p><p>But I&#8217;ve watched leaders freeze at this exact moment &#8212; waiting for definitional clarity before they move. Waiting to know whether AGI is five years away or fifteen. 
Waiting for the race to resolve before they decide how to position themselves inside it.</p><p>That is the wrong frame. And there&#8217;s a reason it&#8217;s getting more wrong by the month.</p><p>The most credible voices in the field aren&#8217;t just debating when AGI arrives. They&#8217;re debating whether the systems are already improving themselves faster than any governance response can form. Eric Schmidt, former CEO of Google, calls it the recursive self-improvement asymptote &#8212; the point at which AI learns on its own without human instruction. He sees it as a threshold still approaching, maybe two to four years out, and treats it as the moment that demands an immediate and serious regulatory response.</p><p>Elon Musk, at the same conference two days later, said it differently: humans are gradually getting less and less in the loop. Every successive model is built by the one before it. It&#8217;s happening &#8212; just not yet fully automated.</p><p>Anthropic&#8217;s own researchers put it more plainly still: recursive self-improvement in the broadest sense is not a future phenomenon. It is a present one. Seventy to ninety percent of code for their next models is now written by Claude.</p><p>The finish line isn&#8217;t just undefined. The rate of travel toward it is no longer entirely ours to set.</p><p>Which makes waiting for clarity even more dangerous than it looks.</p><p>The finish line of the AGI race is not your problem. The capabilities that exist <em>right now</em> &#8212; agents that can reason, draft, analyze, synthesize, act &#8212; are sufficient to fundamentally change how your organization operates. Not in theory. In practice. Today.</p><p>The organizations that will win this decade are not the ones that picked the right timeline. They&#8217;re the ones that built the capacity to run &#8212; regardless of where the finish line turns out to be.</p><p>Ilya&#8217;s brain chart is a beautiful argument. 
It may even be right.</p><p>But your competitive advantage doesn&#8217;t depend on whether the hominid analogy holds.</p><p>It depends on whether your organization is ready to use what&#8217;s already in the room.</p><p>But there&#8217;s a deeper question underneath the finish line problem. And that one&#8217;s harder.</p><div><hr></div><p><em>Reggie Britt is a technologist and executive who has spent decades at the intersection of enterprise systems, consumer finance, and emerging technology. He writes about AI, organizational readiness, and what it actually means to lead through transformation.</em></p><div><hr></div><p><em>If this resonated, share it with someone who&#8217;s been waiting for the AI story to get clearer before they move. The clarity isn&#8217;t coming. The readiness can.</em></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.reggiebritt.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading! Subscribe for free to receive new posts.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item><item><title><![CDATA[The Chasm Is Being Crossed. 
]]></title><description><![CDATA[The wave is coming.....]]></description><link>https://www.reggiebritt.com/p/the-chasm-is-being-crossed</link><guid isPermaLink="false">https://www.reggiebritt.com/p/the-chasm-is-being-crossed</guid><dc:creator><![CDATA[Reggie Britt]]></dc:creator><pubDate>Thu, 26 Mar 2026 15:48:43 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!UW1C!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F647667c7-112c-44cf-a981-8e538e684d26_784x1168.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!UW1C!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F647667c7-112c-44cf-a981-8e538e684d26_784x1168.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!UW1C!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F647667c7-112c-44cf-a981-8e538e684d26_784x1168.jpeg 424w, https://substackcdn.com/image/fetch/$s_!UW1C!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F647667c7-112c-44cf-a981-8e538e684d26_784x1168.jpeg 848w, https://substackcdn.com/image/fetch/$s_!UW1C!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F647667c7-112c-44cf-a981-8e538e684d26_784x1168.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!UW1C!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F647667c7-112c-44cf-a981-8e538e684d26_784x1168.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!UW1C!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F647667c7-112c-44cf-a981-8e538e684d26_784x1168.jpeg" width="784" height="1168" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/647667c7-112c-44cf-a981-8e538e684d26_784x1168.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1168,&quot;width&quot;:784,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:257433,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://reggiebritt.substack.com/i/192218338?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F647667c7-112c-44cf-a981-8e538e684d26_784x1168.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!UW1C!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F647667c7-112c-44cf-a981-8e538e684d26_784x1168.jpeg 424w, https://substackcdn.com/image/fetch/$s_!UW1C!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F647667c7-112c-44cf-a981-8e538e684d26_784x1168.jpeg 848w, https://substackcdn.com/image/fetch/$s_!UW1C!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F647667c7-112c-44cf-a981-8e538e684d26_784x1168.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!UW1C!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F647667c7-112c-44cf-a981-8e538e684d26_784x1168.jpeg 
1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p></p><p><em>In the last piece, we traced the infrastructure underneath the AI era &#8212; two cities, one fault line, one person building an exit ramp to orbit. That was the macro. This is what it looks like hitting the ground right now, in real time, inside a red lobster.</em></p><p><em>[Read Part I first: Nobody Is Building in Two Cities. 
Except One Person.]</em></p><div><hr></div><p>A solo developer in Austria &#8212; an iOS programmer with no TypeScript experience &#8212; used AI-assisted development to build a 300,000-line application in a technology stack he had never worked with before.</p><p>He shipped it. He named it after a lobster. And then the world lost its mind.</p><p>Peter Steinberger&#8217;s creation &#8212; originally called Clawdbot, briefly Moltbot, now known as OpenClaw &#8212; became the fastest-growing open source project in the history of software. Not fastest-growing this year. Fastest ever. And it did it by answering a question that a decade of AI demos never quite answered:</p><p><em>What if AI could actually do things instead of just talk about them?</em></p><div><hr></div><h2>The Numbers That Shouldn&#8217;t Be Possible</h2><p>When ChatGPT launched, the AI world marveled at one million users in roughly 100 days. That felt like a signal flare &#8212; proof that something had shifted.</p><p>Then Peter Diamandis showed a chart this week at GTC. A yellow line for Facebook. A blue line for Linux. A decade of remarkable growth for both. 
And then a red vertical line that looked less like an adoption curve and more like a border on the page.</p><ul><li><strong>60,000</strong> GitHub stars in the first 72 hours after launch</li><li><strong>250,000+</strong> total stars by March &#8212; surpassing React as the most-starred project in GitHub history</li><li><strong>925%</strong> month-over-month growth, February to March 2026</li><li><strong>3 weeks</strong> to surpass Linux adoption levels that took three decades to build</li></ul><p>Jensen Huang &#8212; who does not casually overstate things when revenue is involved &#8212; called OpenClaw <em>&#8220;probably the single most important release of software, you know, probably ever.&#8221;</em></p><p>Sam Altman hired its creator.</p><p>In Beijing, a thousand people lined up outside Tencent&#8217;s headquarters on a Friday afternoon to get it installed on their laptops. Engineers charged $72 to install it. Then charged again to uninstall it when people got cold feet about handing an AI agent the keys to their entire lives.</p><blockquote><p><em>&#8220;AI finally has hands. The question is whether your organization knows what to do when it reaches for yours.&#8221;</em></p></blockquote><div><hr></div><h2>What OpenClaw Actually Signals</h2><p>This isn&#8217;t a story about one viral app. It&#8217;s a story about a paradigm crossing a threshold.</p><p>For years, AI lived in a box. You typed. It responded. You evaluated. You acted. The human was always in the loop &#8212; not because of philosophy, but because the technology couldn&#8217;t close the gap between advice and action.</p><p>OpenClaw closes that gap.</p><p>It runs on your operating system. It connects to your calendar, your email, your files, your browser. It doesn&#8217;t suggest that you book the flight. It books the flight. It doesn&#8217;t summarize the contract. It reads it, flags the clause, drafts the counter, and sends it &#8212; unless you told it not to. 
And in some cases, even when you didn&#8217;t tell it anything at all.</p><p>That last sentence is where the security community starts sweating. And they&#8217;re right to. But the security concerns don&#8217;t slow the adoption curve. They just mean the organizations that move carelessly will pay a different price than the organizations that don&#8217;t move at all.</p><div><hr></div><h2>Geoffrey Moore Has Been Waiting for This Moment</h2><p>In 1991, Geoffrey Moore described a phenomenon that every technology goes through on its way from invention to ubiquity. He called it the chasm &#8212; the gap between early adopters who embrace technology for its potential and the pragmatic majority who need it to be proven, supported, and safe before they&#8217;ll commit.</p><p>Most technologies die in the chasm. The ones that cross it reshape industries.</p><p>OpenClaw is crossing it right now.</p><p><strong>Already across:</strong></p><ul><li>Developer communities globally</li><li>Chinese hyperscalers and consumers</li><li>AI-native startups</li><li>Power users running personal agents</li><li>Nvidia &#8212; running OpenClaw throughout the entire company</li></ul><p><strong>Still watching from the other side:</strong></p><ul><li>Enterprise organizations in the West</li><li>Mid-market companies without an AI agent strategy</li><li>Industries with compliance and governance overhead</li><li>Organizations still piloting basic chatbots</li><li>Most of your clients</li></ul><p>The chasm isn&#8217;t a metaphor. It&#8217;s a competitive gap that widens every week the early adopters keep crossing while the pragmatic majority keeps watching.</p><div><hr></div><h2>The Compute Implication Nobody Is Saying Plainly</h2><p>In Part I, we traced Jensen&#8217;s trillion-dollar revenue projection back through TSMC and ASML to the physical infrastructure of the AI era. 
Here&#8217;s the demand-side piece that makes that number make sense.</p><p>A standard AI prompt produces a single response. Agentic tasks &#8212; the kind OpenClaw runs continuously in the background of your operating system &#8212; consume approximately <strong>1,000 times more compute</strong> per task. Continuous agents running persistently may consume <strong>one million times more</strong>.</p><p>Huang said it plainly at GTC: the amount of compute every company needs is skyrocketing. Not growing. Skyrocketing.</p><p>This is the Jevons Paradox made visible. As AI becomes more efficient and accessible, total consumption doesn&#8217;t decrease &#8212; it explodes, because the use cases multiply faster than the efficiency gains. OpenClaw is the proof of concept. Every agent running in the background of every laptop in every company that crosses the chasm is another order of magnitude on Jensen&#8217;s revenue line.</p><p>The infrastructure story and the adoption story are the same story, told from opposite ends.</p><div><hr></div><h2>The Wave Mustafa and Dario Were Describing</h2><p>Mustafa Suleyman called it a wave. Dario Amodei called it a tsunami. Both were talking about something more specific than AI getting smarter. They were talking about the moment AI moves from a tool you use to an agent that acts &#8212; from generation and reasoning into action.</p><p>OpenClaw is what that looks like when it hits the shore.</p><p>The organizations that navigate this well won&#8217;t be the ones who moved fastest without thinking. And they won&#8217;t be the ones who waited for certainty that never arrives. They&#8217;ll be the ones who understood what was crossing the chasm, built a framework for meeting it, and moved with intention before the early majority made the decision for them.</p><div><hr></div><p>The vertical red line on Diamandis&#8217;s chart isn&#8217;t a prediction. 
It already happened.</p><p>OpenClaw is already in the hands of developers, cloud providers, and governments. The early majority is already moving. The window between <em>&#8220;this is something to watch&#8221;</em> and <em>&#8220;this has already changed the competitive landscape&#8221;</em> is measured in months, not years.</p><p>Your organization is somewhere on that chart.</p><p>The only question worth asking right now is whether you know where.</p><p><em>The chasm doesn&#8217;t wait for your readiness plan. It just gets wider.</em></p>]]></content:encoded></item><item><title><![CDATA[Nobody Is Building in Two Cities. 
]]></title><description><![CDATA[Except One Person...]]></description><link>https://www.reggiebritt.com/p/nobody-is-building-in-two-cities</link><guid isPermaLink="false">https://www.reggiebritt.com/p/nobody-is-building-in-two-cities</guid><dc:creator><![CDATA[Reggie Britt]]></dc:creator><pubDate>Tue, 24 Mar 2026 01:06:25 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!tWBG!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdd3d79d1-e7a0-4613-b9b6-2d80e2dcda3d_784x1168.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!tWBG!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdd3d79d1-e7a0-4613-b9b6-2d80e2dcda3d_784x1168.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!tWBG!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdd3d79d1-e7a0-4613-b9b6-2d80e2dcda3d_784x1168.heic 424w, https://substackcdn.com/image/fetch/$s_!tWBG!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdd3d79d1-e7a0-4613-b9b6-2d80e2dcda3d_784x1168.heic 848w, https://substackcdn.com/image/fetch/$s_!tWBG!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdd3d79d1-e7a0-4613-b9b6-2d80e2dcda3d_784x1168.heic 1272w, https://substackcdn.com/image/fetch/$s_!tWBG!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdd3d79d1-e7a0-4613-b9b6-2d80e2dcda3d_784x1168.heic 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!tWBG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdd3d79d1-e7a0-4613-b9b6-2d80e2dcda3d_784x1168.heic" width="784" height="1168" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/dd3d79d1-e7a0-4613-b9b6-2d80e2dcda3d_784x1168.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1168,&quot;width&quot;:784,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:138294,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://reggiebritt.substack.com/i/191932098?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdd3d79d1-e7a0-4613-b9b6-2d80e2dcda3d_784x1168.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!tWBG!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdd3d79d1-e7a0-4613-b9b6-2d80e2dcda3d_784x1168.heic 424w, https://substackcdn.com/image/fetch/$s_!tWBG!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdd3d79d1-e7a0-4613-b9b6-2d80e2dcda3d_784x1168.heic 848w, https://substackcdn.com/image/fetch/$s_!tWBG!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdd3d79d1-e7a0-4613-b9b6-2d80e2dcda3d_784x1168.heic 1272w, https://substackcdn.com/image/fetch/$s_!tWBG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdd3d79d1-e7a0-4613-b9b6-2d80e2dcda3d_784x1168.heic 
1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p><em>Jensen Huang just said we&#8217;ve achieved AGI. That&#8217;s the headline everyone&#8217;s arguing about. But the more important story isn&#8217;t what AI can do. It&#8217;s where AI runs &#8212; and who controls the ground beneath it.</em></p><div><hr></div><p>This week, 30,000 people packed into SAP Center in San Jose &#8212; they couldn&#8217;t fit the keynote inside the convention center anymore &#8212; to watch Jensen Huang project a trillion dollars in annual revenue by 2027. Not valuation. Revenue. 
The whole world came to him.</p><p>And somewhere in the middle of a 2.5-hour Lex Fridman conversation, Huang said something that broke the internet: <em>&#8220;I think we&#8217;ve achieved AGI.&#8221;</em></p><p>Everyone is debating whether he&#8217;s right. That debate is a distraction.</p><p>The more important question is hidden one layer deeper &#8212; underneath the chips, underneath the models, underneath the data centers &#8212; in a supply chain so concentrated it defies belief.</p><div><hr></div><h2>The Whole Revolution Runs Through Two Cities</h2><p>Veldhoven is a small city in the Netherlands. You probably haven&#8217;t thought about it once in your life.</p><p>Hsinchu is a city on the western coast of Taiwan. You may have heard of it once or twice in passing.</p><p>Together, these two cities hold more leverage over the AI era than Washington, San Francisco, or Beijing. Because the entire global AI buildout &#8212; every GPU, every data center, every model powering the tools your organization depends on &#8212; runs through them.</p><p><strong>Here&#8217;s the dependency chain:</strong></p><p><strong>01 &#8212; ASML, Veldhoven, Netherlands</strong></p><p>The only company on earth that makes EUV lithography machines. No other entity, nation, or lab has replicated this. Without ASML machines, advanced semiconductors cannot be manufactured at scale. It took ASML thirty years and billions in R&amp;D to build this capability. There is no shortcut.</p><p>&#8595;</p><p><strong>02 &#8212; TSMC, Hsinchu, Taiwan</strong></p><p>Fabricates the overwhelming majority of the world&#8217;s advanced chips, including essentially every Nvidia GPU that matters. The most advanced nodes on earth exist here, and nowhere else at scale.</p><p>&#8595;</p><p><strong>03 &#8212; Nvidia, Santa Clara, California</strong></p><p>Designs the GPUs. Controls roughly 80% of the AI chip market. Builds the software ecosystem that makes defection expensive. 
Projects $1 trillion in annual revenue by 2027.</p><p>&#8595;</p><p><strong>04 &#8212; Global AI Infrastructure</strong></p><p>Every hyperscaler. Every frontier lab. Every enterprise AI deployment. Every agent. Every tool your organization is building its strategy around.</p><div><hr></div><p>That chain has a fault line running through it that no earnings call discusses plainly.</p><p>Taiwan sits 100 miles off the coast of mainland China. The question of whether that proximity is a present danger or a future risk is a geopolitical debate. But the structural dependency is not a debate. It is a fact. And every organization building AI strategy on top of this supply chain is, knowingly or not, building on that fact.</p><blockquote><p><em>&#8220;The AI revolution has a supply chain that runs through two cities. And one of those cities sits 100 miles from a border dispute that could pause the entire decade.&#8221;</em></p></blockquote><div><hr></div><h2>Meanwhile, One Person Is Leaving the Board</h2><p>Google has custom silicon. Amazon has Trainium. Meta has its own chip program. Every major hyperscaler is quietly working to reduce their Jensen dependency. They&#8217;re all solving the same problem: don&#8217;t buy from Nvidia.</p><p>That&#8217;s one link in the chain. It doesn&#8217;t touch TSMC. It doesn&#8217;t touch ASML. It doesn&#8217;t touch Taiwan. It&#8217;s a chip design play dressed as strategic independence.</p><p>Elon Musk is doing something categorically different.</p><p>At the Gigafactory in Texas, he&#8217;s reportedly building not just custom chips &#8212; but the manufacturing capacity to produce them domestically. Purpose-built. On soil where he already controls the land, the power relationships, the political alignment. 
And he&#8217;s not building them for data centers on the ground.</p><p>He&#8217;s building them for orbit.</p><div><hr></div><h2>Orbital Data Centers Are Not Science Fiction</h2><p>Ground-based data centers are fighting over power grids, water rights, land, and permitting timelines that stretch into years. Every hyperscaler is trying to solve the same physical constraint problem &#8212; how do you cool a hundred thousand GPUs without a river nearby?</p><p>In orbit, that problem largely disappears. Radiative cooling is essentially free. Solar power is uninterrupted. And a data center in orbit exists outside the jurisdictional reach of any single nation-state&#8217;s regulatory environment in ways that no ground-based infrastructure ever will.</p><p>Now consider what Elon already controls:</p><ul><li><strong>Starlink</strong> provides global connectivity</li><li><strong>SpaceX</strong> provides the launch infrastructure</li><li><strong>Starship</strong> has dramatically reduced the cost per kilogram to orbit</li><li><strong>Custom chips</strong> designed for orbital thermal and radiation conditions</li><li><strong>Gigafactory Texas</strong> as the domestic manufacturing base</li></ul><p>No other person or organization on earth has all five simultaneously. Not even close.</p><p>Jensen Huang, to his credit, saw this coming. At GTC, when he outlined the operating system for <em>&#8220;Robots, Cars, Agents, and Orbit,&#8221;</em> that last word was not accidental. He wants Nvidia inside whatever runs in space. But if Elon controls the launch, the connectivity, the orbital platform, and the custom silicon, Jensen is selling into a market where someone else sets the terms.</p><div><hr></div><h2>What This Means for Everyone Else</h2><p>Most organizations are not thinking about orbital data centers. Most organizations are not thinking about ASML. Most have never considered that a conflict in the Taiwan Strait could pause their AI roadmap indefinitely &#8212; not slow it, pause it.</p><p>They&#8217;re thinking about their next software upgrade. 
Their AI vendor&#8217;s pricing. Whether their team has the skills to use the tools they just licensed.</p><p>The readiness gap isn&#8217;t just organizational. It&#8217;s civilizational infrastructure running through a 100-mile strait &#8212; and nobody in the room has a contingency plan.</p><div><hr></div><p>The board is being reset. Not metaphorically. Physically.</p><p>The infrastructure of the AI era &#8212; where it runs, who controls it, what laws apply to it, whether it can be disrupted by geography &#8212; is being determined right now, in real time, by a very small number of people making very large bets.</p><p>The rest of us are still arguing about whether Jensen&#8217;s AGI claim is hype.</p><div><hr></div><p><em>Part II will publish in a few days. It&#8217;s about what happens when this infrastructure meets the ground &#8212; and why your organization might be standing on the wrong side of a chasm that&#8217;s already being crossed.</em></p><p></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.reggiebritt.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading! 
Subscribe for free to receive new posts .</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item><item><title><![CDATA[The Execution Signal]]></title><description><![CDATA[Architecture doesn't deploy itself.]]></description><link>https://www.reggiebritt.com/p/the-execution-signal</link><guid isPermaLink="false">https://www.reggiebritt.com/p/the-execution-signal</guid><dc:creator><![CDATA[Reggie Britt]]></dc:creator><pubDate>Mon, 23 Mar 2026 11:05:10 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!cAUb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7ad8de-719c-482f-8473-2bc383fe28d4_784x1168.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!cAUb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7ad8de-719c-482f-8473-2bc383fe28d4_784x1168.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!cAUb!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7ad8de-719c-482f-8473-2bc383fe28d4_784x1168.heic 424w, https://substackcdn.com/image/fetch/$s_!cAUb!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7ad8de-719c-482f-8473-2bc383fe28d4_784x1168.heic 848w, 
https://substackcdn.com/image/fetch/$s_!cAUb!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7ad8de-719c-482f-8473-2bc383fe28d4_784x1168.heic 1272w, https://substackcdn.com/image/fetch/$s_!cAUb!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7ad8de-719c-482f-8473-2bc383fe28d4_784x1168.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!cAUb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7ad8de-719c-482f-8473-2bc383fe28d4_784x1168.heic" width="784" height="1168" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/bd7ad8de-719c-482f-8473-2bc383fe28d4_784x1168.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1168,&quot;width&quot;:784,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:99734,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://reggiebritt.substack.com/i/191850563?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7ad8de-719c-482f-8473-2bc383fe28d4_784x1168.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!cAUb!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7ad8de-719c-482f-8473-2bc383fe28d4_784x1168.heic 424w, 
https://substackcdn.com/image/fetch/$s_!cAUb!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7ad8de-719c-482f-8473-2bc383fe28d4_784x1168.heic 848w, https://substackcdn.com/image/fetch/$s_!cAUb!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7ad8de-719c-482f-8473-2bc383fe28d4_784x1168.heic 1272w, https://substackcdn.com/image/fetch/$s_!cAUb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbd7ad8de-719c-482f-8473-2bc383fe28d4_784x1168.heic 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" 
y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p></p><p>Vol. 3 named the stack. Four layers. Each one available today. The IBM i MCP Server is live. Mapepire is in production. The architecture exists and is documented.</p><p>So why aren&#8217;t organizations running?</p><p>That is the question Vol. 4 answers. And the answer is not what most people expect. The barrier to execution is not the technology. It is not the platform. It is not even the budget, most of the time.</p><p>It is the distance between a named architecture and a deployed agent. That distance has a name now. It has a cost. And it turns out, the IBM i practitioner is the person best positioned to close it &#8212; for their own organization and for every IBM i organization around them.</p><div><hr></div><h2>SIGNAL 13 &#183; THE FAILURE SIGNAL</h2><h3>The Stack Is Ready. The Organizations Aren&#8217;t.</h3><p><em>MIT / Fortune / BCG / ManpowerGroup, 2024&#8211;2026 &#8212; Three convergent data points measuring the same gap from different angles.</em></p><p>The readiness statistics are not a surprise anymore. They have been consistent across multiple research organizations for two years running. But their consistency is precisely the point &#8212; this is not a temporary lag that resolves itself as organizations get more comfortable with AI. It is a structural gap that widens as AI capability advances faster than organizational readiness.</p><p>The numbers:</p><p><strong>95%</strong> of generative AI pilots at major companies are failing. Not struggling &#8212; failing. MIT and Fortune put this number in the field in mid-2025. The headline got attention. The follow-on analysis got less attention: the root causes are almost entirely organizational. Lack of skills to govern AI deployments. Data complexity that was never addressed before the pilot started. Governance structures that cannot assure AI acts responsibly. 
Project scope that was too ambitious to prove anything.</p><p><strong>26%</strong> of scaled AI experiments reach production. BCG&#8217;s research found that roughly three in four enterprise AI experiments that make it through internal approval and resource allocation still never make it to production. They stall in the translation layer between proof-of-concept and organizational deployment.</p><p><strong>77%</strong> of organizations have no committed agentic AI strategy. ManpowerGroup&#8217;s 2026 research found that more than three quarters of organizations are either experimenting without a deployment framework or waiting for more clarity before committing. In a moment when capability is compounding monthly, that posture is not caution. It is exposure.</p><p>Three organizations. Three methodologies. One finding.</p><p>The technology arrived. The organizations did not follow.</p><blockquote><p>&#8220;The platform is not the problem. Deploying without posture is.&#8221;</p></blockquote><p><em>&#8212; Signal4i &#183; Vol. 2 &#183; March 2026</em></p><p>Vol. 2 called this the posture problem. Vol. 4 is the proof that posture is still the constraint &#8212; even after the architecture is named and available. The execution gap is real, it is measurable, and it is organizational from top to bottom.</p><div><hr></div><h2>SIGNAL 14 &#183; THE FDE SIGNAL</h2><h3>The Market Quantified the Gap. Then Priced It.</h3><p><em>Anthropic, 2025&#8211;2026 &#8212; Anthropic invents a new job category &#8212; the Forward Deployed Engineer &#8212; because they kept watching capable organizations fail to deploy capable models.</em></p><p>When a company building the most capable AI systems on earth decides it needs to put humans inside its customers&#8217; organizations just to make deployment work, that is not a product decision. That is a market signal.</p><p>Anthropic did not create the Forward Deployed Engineer role because it sounded strategic. 
They created it because of a pattern they kept observing: capable organizations acquiring capable models and then failing to deploy them at any meaningful scale. The gap between what the model could do and what the organization was ready to use was consistent, measurable, and not closing on its own.</p><p>The FDE is the answer to that gap. An embedded human who lives inside a customer organization and translates in both directions &#8212; between what AI can do and what the organization actually needs, and between what the organization knows and what needs to be encoded for the agent to act on.</p><p>The job title matters less than what it reveals. Every major AI company is now in the business of closing a gap that should not exist if technology adoption worked the way it is supposed to. The fact that Anthropic, with access to the most capable models available, still needed to invent a translation role tells you something important about where the constraint actually lives.</p><p>It does not live in the model. It lives in the organization.</p><blockquote><p>&#8220;You cannot hand an organization a model and expect transformation. You need a guide who knows the terrain &#8212; not a consultant who reads the map, but someone who has walked it.&#8221;</p></blockquote><p><em>&#8212; Signal4i &#183; signal4i.ai</em></p><p>The market is now paying a premium for people who can do what the FDE does: speak both the language of AI architecture and the language of the business, embedded inside the organization that needs to change. That premium is a price signal. It is the market telling you that the translation layer between architecture and execution is scarce, valuable, and not going away.</p><div><hr></div><h2>SIGNAL 15 &#183; THE SCALE SIGNAL</h2><h3>$90 Billion. 500,000 Employees. Can&#8217;t Deploy Until 2027.</h3><p><em>WSJ / PYMNTS, March 13, 2026 &#8212; FedEx plans AI agents in more than 50% of its workflows by 2028 &#8212; but cannot begin deployment until 2027. The reason is not the AI. 
The reason is the organization.</em></p><p>FedEx Chief Digital and Information Officer Vishal Talwar stated the ambition plainly in March 2026: every employee and every task across the globe will get adapted to AI and will improve with AI.</p><p>The ambition is real. The investment is committed. The intention is serious.</p><p>And they cannot deploy until 2027.</p><p>Not because the AI does not exist. Not because the vision is unclear. Because their data consolidation project is not finished &#8212; and hundreds of legacy systems still need to be replaced before agents can reach them. At $90 billion in annual revenue. With 500,000 employees. With a dedicated AI transformation budget that most IBM i organizations in any given industry sector will never approach.</p><p>The FedEx story is not a cautionary tale about a company that moved too slowly. FedEx has been investing in digital transformation for years. The story is a proof case about the nature of the execution gap. It is not a technology problem. It is an organizational readiness problem &#8212; and it scales with the organization, not with the investment.</p><p>The IBM i organizations in every sector FedEx serves are running the same gap at a smaller scale. Same data silos. Same legacy dependencies. Same governance gaps that were never addressed because the business kept running and there was no urgent reason to address them until the reason arrived all at once.</p><blockquote><p>That is not a technology failure. That is a readiness failure. And it is happening at a company with more resources than every IBM i organization in this room combined.</p></blockquote><p>The difference between FedEx and an IBM i organization with $50 million in revenue is not the nature of the gap. It is the scale. And smaller scale cuts both ways. 
The same organizational readiness problem that takes FedEx until 2027 to work through can be addressed by a focused IBM i organization in phases &#8212; if they start now, and if they have a guide who knows the terrain.</p><div><hr></div><h2>SIGNAL 16 &#183; THE PRACTITIONER SIGNAL</h2><h3>The IBM i Practitioner Is the Natural FDE.</h3><p><em>Signal4i analysis, 2026 &#8212; The characteristics the market is paying a premium for are characteristics the IBM i practitioner has been building for thirty years.</em></p><p>What does a Forward Deployed Engineer actually need to do?</p><p>Understand the business deeply enough to know what the organization is actually trying to accomplish &#8212; not just what it says it wants. Know the technical architecture well enough to design what agents can realistically do within that environment. Translate between those two worlds without losing fidelity in either direction. Be embedded enough to earn the trust of the people whose knowledge needs to be encoded. And stay long enough to see the first use case through from proof to production.</p><p>Now read that description again and ask: who in the IBM i world already does this?</p><p>The IBM i practitioner has been doing exactly this for thirty years. They understand the business &#8212; deeply, specifically, across decades of edge cases and exceptions and rules that predate the documentation. They understand the platform &#8212; architecturally, operationally, at the level of what will hold up in production and what will not. They are already embedded inside the organization. They already have the trust of the people whose knowledge needs to be encoded. 
They have already been translating between business complexity and technical implementation their entire career.</p><p>The translation layer between AI architecture and organizational deployment &#8212; the thing Anthropic invented a new job category to fill &#8212; is the thing the IBM i practitioner has been living in for decades.</p><p>The market is now paying a premium for that capability. The question is whether the IBM i practitioner recognizes it in themselves &#8212; and whether the IBM i organization recognizes it in them &#8212; before someone else arrives to name it.</p><blockquote><p>&#8220;The bottleneck was never intelligence &#8212; it was the translation layer between knowing and building. That layer is collapsing.&#8221;</p><p><em>&#8212; Andrej Karpathy &#183; Former Director of AI, Tesla &#183; OpenAI</em></p></blockquote><p>Karpathy is describing what happens to the translation layer between human knowledge and executable code. The IBM i practitioner has been on the knowing side of that layer their entire career. The layer is collapsing &#8212; which means they are now on both sides of it simultaneously. The domain expert and the system author are becoming the same person.</p><p>That is not a threat. That is the most important position in the execution gap.</p><div><hr></div><h2>What This Means</h2><p>Four signals. One thesis.</p><p>The execution gap is not a technology problem. It is an organizational readiness problem that the technology cannot solve on its own. The market has recognized this &#8212; first by quantifying the failure rate, then by inventing a new job category to close the distance. 
FedEx proved it at scale: $90 billion, 500,000 employees, still working through the same gap that IBM i organizations of every size are working through right now.</p><p>The IBM i practitioner is uniquely positioned to close this gap &#8212; for their own organization first, and for the organizations in their sector as they watch the window close.</p><p>Architecture doesn&#8217;t deploy itself. But the people who know both the platform and the business &#8212; who have been living in the translation layer for thirty years &#8212; can.</p><div><hr></div><h2>Vol. 5 &#8212; The Tandem Signal</h2><p>The execution gap is real. The practitioner is the answer. And the organization is the frame.</p><p>Because here is what the execution gap data reveals when you look at it closely: the organizations that close it fastest are not the ones with the best technology or the most capable practitioners. They are the ones that move technology transformation and organizational transformation simultaneously &#8212; not in sequence.</p><p>Every IBM i organization navigating this moment is running three transformations at once: the technology, the organization, and the human role within it. Moving them in sequence is how you end up in the 95% that fail. Moving them in tandem is how you end up in the 5% that compound.</p><p>Vol. 5 &#8212; The Tandem Signal &#8212; is about what tandem transformation actually looks like in practice, why sequence kills momentum, and what the IBM i community uniquely understands about running multiple complex changes in parallel.</p><div><hr></div><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.reggiebritt.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading! 
Subscribe for free to receive new posts.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p><p><em>Signal4i tracks the AI signals that matter for IBM i organizations. Not predictions. Events that have happened, data that has landed, and what they mean for the organizations running the platform.</em></p><p><em>Published by Reggie Britt &#183; Signal4i &#183; signal4i.ai</em></p>]]></content:encoded></item><item><title><![CDATA[They Told You It Was Coming. It Already Came.]]></title><description><![CDATA[Recursive self-improvement is here]]></description><link>https://www.reggiebritt.com/p/they-told-you-it-was-coming-it-already</link><guid isPermaLink="false">https://www.reggiebritt.com/p/they-told-you-it-was-coming-it-already</guid><dc:creator><![CDATA[Reggie Britt]]></dc:creator><pubDate>Tue, 17 Mar 2026 14:42:18 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!srM1!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffcfd1151-22a9-4034-94ad-426babe1bf65_784x1168.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!srM1!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffcfd1151-22a9-4034-94ad-426babe1bf65_784x1168.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!srM1!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffcfd1151-22a9-4034-94ad-426babe1bf65_784x1168.heic 424w, https://substackcdn.com/image/fetch/$s_!srM1!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffcfd1151-22a9-4034-94ad-426babe1bf65_784x1168.heic 848w, https://substackcdn.com/image/fetch/$s_!srM1!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffcfd1151-22a9-4034-94ad-426babe1bf65_784x1168.heic 1272w, https://substackcdn.com/image/fetch/$s_!srM1!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffcfd1151-22a9-4034-94ad-426babe1bf65_784x1168.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!srM1!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffcfd1151-22a9-4034-94ad-426babe1bf65_784x1168.heic" width="784" height="1168" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/fcfd1151-22a9-4034-94ad-426babe1bf65_784x1168.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1168,&quot;width&quot;:784,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:259237,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://reggiebritt.substack.com/i/191258721?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffcfd1151-22a9-4034-94ad-426babe1bf65_784x1168.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" 
class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!srM1!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffcfd1151-22a9-4034-94ad-426babe1bf65_784x1168.heic 424w, https://substackcdn.com/image/fetch/$s_!srM1!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffcfd1151-22a9-4034-94ad-426babe1bf65_784x1168.heic 848w, https://substackcdn.com/image/fetch/$s_!srM1!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffcfd1151-22a9-4034-94ad-426babe1bf65_784x1168.heic 1272w, https://substackcdn.com/image/fetch/$s_!srM1!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffcfd1151-22a9-4034-94ad-426babe1bf65_784x1168.heic 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" 
stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p></p><p>Evan Hubinger is Anthropic&#8217;s Head of Alignment Stress-Testing.</p><p>His job &#8212; the specific reason he was hired &#8212; is to try to break Anthropic&#8217;s own safety systems before they fail in the wild. He runs the team that assumes the worst, tests the hardest, and looks for the cracks. He is not a futurist. He is not a venture capitalist with an incentive to hype the timeline. He is the person inside the lab whose entire professional purpose is to find the places where things go wrong.</p><p>Last week, he didn&#8217;t raise an alarm about a future risk.</p><p>He gave a status update on a present one.</p><p>He said this: <em>&#8220;Recursive self-improvement, in the broadest sense, is not a future phenomenon. It is a present phenomenon.&#8221;</em></p><p>Read that again. Not &#8220;it&#8217;s coming.&#8221; Not &#8220;we&#8217;re approaching it.&#8221; Present tense. Now. Already.</p><p>Helen Toner, interim executive director at Georgetown University&#8217;s Center for Security and Emerging Technology, reacted to the TIME piece that reported it with this: &#8220;The idea that the wealthiest companies in the world, employing some of the smartest people on the planet, are trying to fully automate AI R&amp;D deserves a &#8216;what the f-ck&#8217; reaction.&#8221;</p><p>That is not a fringe voice. That is a Georgetown academic whose job is to track this soberly. And her reaction was unprintable.</p><div><hr></div><h2>What That Means</h2><p>Recursive self-improvement is the point at which AI systems begin meaningfully contributing to the development of the next generation of AI systems. 
It&#8217;s the loop that closes on itself. The moment the technology starts accelerating its own acceleration.</p><p>For years, this was the threshold that AI safety researchers pointed to as the critical inflection point &#8212; the moment after which the pace of change would stop being predictable. Most public discourse treated it as a future event to be managed, prepared for, debated.</p><p>Hubinger&#8217;s statement ends that framing. The debate is over. The threshold isn&#8217;t approaching. According to the person who monitors it for a living, we crossed it.</p><p>Anthropic&#8217;s chief science officer, Jared Kaplan, added to that: fully automated AI research &#8212; meaning AI systems running their own research cycles without human direction &#8212; is less than a year away in his estimation. And 70 to 90 percent of the code behind future Anthropic models is already being written by Claude itself.</p><p>The machine is building its successor. Today.</p><div><hr></div><h2>The Organizational Reality</h2><p>I have been observing what I call the readiness gap &#8212; the space between where AI capabilities are and where organizations actually are in their ability to benefit from those capabilities responsibly.</p><p>That gap was already significant. The research has shown it for years: 94% of organizations are adopting AI in some form, but fewer than half have meaningful security controls. 72% report scaled deployments, but only 33% have governance structures to match.</p><p>Hubinger&#8217;s statement doesn&#8217;t just widen that gap. It changes the nature of it.</p><p>When the capability curve is something you can track from the outside &#8212; model releases, benchmark improvements, product launches &#8212; organizations can at least attempt to pace themselves. 
They can watch the horizon and plan accordingly.</p><p>When the lab&#8217;s own alignment lead tells you that recursive self-improvement is present-tense, the horizon is no longer a useful planning concept. The curve is now being drawn from the inside by the system itself.</p><p>This is not an argument for panic. It is an argument against the one posture that will definitely fail: waiting.</p><div><hr></div><h2>The Physician Signal</h2><p>This same week, the American Medical Association released survey data showing that 81% of U.S. physicians now use AI &#8212; more than double the 2023 rate.</p><p>Think about what that means for the governance argument. Physicians carry DEA licenses. They operate under HIPAA. They face malpractice liability. They are board-certified. They are arguably the most credentialed, most regulated, most scrutinized professional class in the United States.</p><p>And 81% of them are using AI right now, with no sector-wide governance framework to match.</p><p>If the professional class with the highest barrier to adoption &#8212; and the highest legal exposure for getting it wrong &#8212; has already crossed the threshold, the readiness gap isn&#8217;t a warning anymore. It&#8217;s the current condition.</p><p>The permission structure for adoption has collapsed. The infrastructure for governing it is still under construction.</p><div><hr></div><h2>The Counter-Signal: What Displacement Looks Like</h2><p>The same week, Reuters reported that Meta is planning layoffs of 20% or more. The stated reason: to offset mounting AI costs.</p><p>Not a business downturn. Not a restructuring. AI costs.</p><p>The largest social platform in history &#8212; 3 billion users &#8212; is restructuring its headcount as a line item against AI infrastructure spend. The old economy sheds people while the new economy files its S-1.</p><p>I want to be precise about what I&#8217;m saying here: this is not a critique of Meta. It is a description of a pattern. 
When AI cost offsets become the stated rationale for major workforce decisions at platform-scale companies, the displacement thesis isn&#8217;t theoretical anymore. It&#8217;s a Reuters headline.</p><p>Organizations have a choice about which side of that pattern they&#8217;re on. The ones building governance infrastructure, workforce adaptation plans, and AI integration strategies that multiply human capability &#8212; those are the ones who come out of this with something. The ones waiting for a clearer signal are going to find that the clearest signal is the one they missed.</p><div><hr></div><h2>What Readiness Actually Requires</h2><p>I&#8217;m not going to tell you that AI will replace your entire organization. That framing, though dramatic, tends to produce paralysis rather than action.</p><p>Here&#8217;s what I will tell you:</p><p>The people building AI are telling you &#8212; on the record, in present tense &#8212; that the system is now contributing to its own acceleration. The most regulated professionals in the country are using it without governance frameworks. And the largest platforms are rewriting their cost structures around it.</p><p>The question is no longer &#8220;are you using AI?&#8221; That battle is over. The question is: <strong>when the technology moves faster than your planning cycle, what does your governance structure do?</strong></p><p>That is the readiness gap. And closing it &#8212; before the next capability jump, not after &#8212; is the only move that doesn&#8217;t leave you reacting to a timeline someone else is setting.</p><p>Hubinger isn&#8217;t making a prediction anymore. 
He&#8217;s giving a status update.</p><p>The question is what you do with it.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.reggiebritt.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading! Subscribe for free to receive new posts.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item><item><title><![CDATA[The Agentic Jevons Trap]]></title><description><![CDATA[Why AI efficiency gains are accelerating the very risks they claim to solve]]></description><link>https://www.reggiebritt.com/p/the-agentic-jevons-trap</link><guid isPermaLink="false">https://www.reggiebritt.com/p/the-agentic-jevons-trap</guid><dc:creator><![CDATA[Reggie Britt]]></dc:creator><pubDate>Tue, 10 Mar 2026 11:22:07 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Kc7o!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6df2dfb7-3940-4fdd-b9bb-a84c4baacd6a_784x1168.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Kc7o!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6df2dfb7-3940-4fdd-b9bb-a84c4baacd6a_784x1168.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!Kc7o!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6df2dfb7-3940-4fdd-b9bb-a84c4baacd6a_784x1168.heic 424w, https://substackcdn.com/image/fetch/$s_!Kc7o!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6df2dfb7-3940-4fdd-b9bb-a84c4baacd6a_784x1168.heic 848w, https://substackcdn.com/image/fetch/$s_!Kc7o!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6df2dfb7-3940-4fdd-b9bb-a84c4baacd6a_784x1168.heic 1272w, https://substackcdn.com/image/fetch/$s_!Kc7o!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6df2dfb7-3940-4fdd-b9bb-a84c4baacd6a_784x1168.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Kc7o!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6df2dfb7-3940-4fdd-b9bb-a84c4baacd6a_784x1168.heic" width="784" height="1168" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6df2dfb7-3940-4fdd-b9bb-a84c4baacd6a_784x1168.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1168,&quot;width&quot;:784,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:192935,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://reggiebritt.substack.com/i/190494802?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6df2dfb7-3940-4fdd-b9bb-a84c4baacd6a_784x1168.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" 
class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Kc7o!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6df2dfb7-3940-4fdd-b9bb-a84c4baacd6a_784x1168.heic 424w, https://substackcdn.com/image/fetch/$s_!Kc7o!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6df2dfb7-3940-4fdd-b9bb-a84c4baacd6a_784x1168.heic 848w, https://substackcdn.com/image/fetch/$s_!Kc7o!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6df2dfb7-3940-4fdd-b9bb-a84c4baacd6a_784x1168.heic 1272w, https://substackcdn.com/image/fetch/$s_!Kc7o!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6df2dfb7-3940-4fdd-b9bb-a84c4baacd6a_784x1168.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><div><hr></div><p>The February jobs report landed like a data point that didn&#8217;t know what story it was supposed to tell.</p><p>92,000 jobs lost. Forecasters expected gains of 59,000. A miss of 151,000 &#8212; in a single month. The coverage split immediately: some called it an AI displacement signal, others attributed it to federal workforce cuts and macro noise. The debate about which story is correct misses the more important point.</p><p>Both can be true. And if both are true, the mechanism underneath them is the same.</p><div><hr></div><h2>The Original Trap</h2><p>In 1865, the English economist William Stanley Jevons observed something that shouldn&#8217;t have been possible. The steam engine had just become dramatically more efficient &#8212; burning far less coal to produce the same output. By every intuitive measure, coal consumption should have declined.</p><p>It accelerated.</p><p>Jevons concluded that efficiency gains in energy use don&#8217;t reduce consumption. They reduce the cost of consumption, which expands the range of applications that become economically viable, which increases total demand. The more efficiently you use a resource, the more of it you use.</p><p>This became Jevons&#8217; Paradox: technological efficiency in resource use tends to increase, not decrease, the overall rate of that resource&#8217;s consumption.</p><p>For 160 years, the paradox was primarily an energy and environmental economics problem. Then agentic AI arrived &#8212; and the paradox found a new host.</p><div><hr></div><h2>The Agentic Reframe</h2><p>The resource is no longer coal. 
It is organizational capacity &#8212; the cognitive, operational, and governance bandwidth that organizations use to coordinate work.</p><p>The efficiency claim of agentic AI is real. Agents automate workflows, compress decision cycles, eliminate coordination overhead. A process that required ten human touchpoints can be reduced to two. The math looks compelling.</p><p>But the Jevons dynamic is already running. When you reduce the marginal cost of deploying AI agents, the number of workflows organizations attempt to automate expands. The ambition scales with the capability. The surface area of exposure &#8212; to failure, to unintended consequences, to compounding risk &#8212; grows faster than the governance infrastructure needed to manage it.</p><p>This is not a future projection. The evidence is datable.</p><div><hr></div><h2>CENTCOM: Governance Void at T=0</h2><p>In the same period the February jobs report landed, a separate signal confirmed the failure mode on the adoption side.</p><p>CENTCOM deployed AI into operational workflows on the same day the capability became available &#8212; before governance policy had been written, before oversight structures had been established, before the organization had built the infrastructure to sustain what it was deploying.</p><p>The restraint mechanism that was supposed to slow deployment until governance caught up collapsed faster than it could be built. Capability deployment outpaced governance in hours.</p><p>This is the Jevons dynamic at the organizational scale: the efficiency of deployment &#8212; the low friction of standing up an AI agent &#8212; expands the rate at which organizations attempt to deploy them. The governance infrastructure required to sustain those deployments cannot be built at the same speed. The gap between what organizations can deploy and what they can govern widens with every capability release.</p><p>Policy declarations are not restraint mechanisms. They are documents. 
The capability moves faster.</p><div><hr></div><h2>SWE-CI: Governance Void at T+8 Months</h2><p>CENTCOM is the governance failure at the moment of adoption. A research paper published on March 4, 2026 documents the governance failure that comes later.</p><p>Researchers from Alibaba Group and Sun Yat-sen University built the first AI coding benchmark measured not on a snapshot test &#8212; agent receives a problem, produces a solution &#8212; but on a full production evolution timeline. 100 tasks across real codebases. Each task spanning an average of 233 days and 71 consecutive commits. Agents were evaluated not just on whether they solved the immediate problem, but on whether the code they produced could sustain the codebase through months of continued evolution.</p><p>The headline finding: most models achieve a zero-regression rate below 0.25. In other words, in more than 75% of cases, AI agents that pass standard coding benchmarks introduce regressions when they maintain real production systems over 8-month timelines.</p><p>An agent that hard-codes a brittle fix and one that writes clean, extensible code may both pass the same test suite. Their difference becomes visible only when the codebase must evolve &#8212; when new requirements arrive, interfaces change, and modules must be extended.</p><p>This is not a capability gap. It is a governance gap. The organizations deploying AI agents at speed, on the strength of short-horizon benchmark performance, are not building the oversight infrastructure to detect degradation as it accumulates. They are accelerating a dynamic that already existed in human-maintained code &#8212; and compounding it.</p><div><hr></div><h2>The Anthropic Signal</h2><p>The Anthropic Economic Index, published in early March 2026, provided the labor market anchoring for what the Jevons dynamic predicts.</p><p>The paper found AI most heavily used in automation of existing tasks rather than augmentation or new task creation. 
The composition of displacement is uneven: computer and math occupations show the highest theoretical capability exposure (94%) but the lowest observed AI coverage (33%). The gap between what AI can theoretically do and what organizations have actually integrated is the readiness gap &#8212; and the Jevons dynamic is running in both directions simultaneously.</p><p>Organizations that automate efficiently create pressure to automate more. Organizations that haven&#8217;t automated yet face competitive exposure that accelerates adoption without governance. Both trajectories compound the same underlying problem: the efficiency of deployment has outrun the capacity to govern what gets deployed.</p><p>The February jobs report is, in this frame, not primarily a story about AI replacing workers. It is a story about organizations that optimized for deployment speed without building the readiness infrastructure to sustain it. When the capability scales faster than the organization can absorb it, the excess goes somewhere. Sometimes it goes to regressions in production code. Sometimes it goes to workforce restructuring that moves faster than institutional knowledge can be transferred. Sometimes it goes to governance vacuums that the next capability release will find already open.</p><div><hr></div><h2>The Paradox in Full</h2><p>The Agentic Jevons Trap is not the observation that AI will increase demand for labor in the long run &#8212; though it might. It is the observation that the efficiency of AI capability deployment is increasing the rate at which organizations consume governance capacity, oversight infrastructure, and organizational readiness.</p><p>Every capability release that lowers the cost of deploying an agent expands the number of workflows organizations attempt to automate. Every expansion in automated workflows increases the surface area of exposure to the failure modes SWE-CI and CENTCOM document. 
Every governance vacuum the capability outpaces becomes a liability that compounds over the 233-day production timeline no benchmark was measuring until now.</p><p>The paradox is structural. It does not resolve by deploying better AI. It resolves only by building the organizational readiness infrastructure that can govern what the capability enables.</p><p>That is a different kind of work. It is the work nobody is selling.</p><div><hr></div><h2>The Counter-Paradox</h2><p>Jevons himself did not have an answer to his paradox. The efficiency gain was real. The consumption increase was real. The gap between them was a structural feature of how markets respond to cost reduction.</p><p>The counter-paradox for the agentic version is governance as a restraint mechanism &#8212; not as a document, not as a policy declaration, but as infrastructure. The organizations that build oversight architecture before they need it, that establish the readiness primitives before the capability arrives, that treat governance as a design constraint rather than a compliance checkbox &#8212; those organizations close the gap between deployment speed and absorption capacity.</p><p>The window for that work is open. It is not wide.</p><p>The February jobs report is not a warning from the future. It is a reading from organizations that already ran the experiment. CENTCOM and the SWE-CI data bracket the failure arc: governance void at adoption, governance void at maintenance, compounding across every timeline in between.</p><p>The technology has arrived. The question is whether the organization has.</p><div><hr></div><p></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.reggiebritt.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading! 
Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[The Technology Has Arrived. Is Your Organization Ready?]]></title><description><![CDATA[On the question no one is answering yet.]]></description><link>https://www.reggiebritt.com/p/the-technology-has-arrived-is-your</link><guid isPermaLink="false">https://www.reggiebritt.com/p/the-technology-has-arrived-is-your</guid><dc:creator><![CDATA[Reggie Britt]]></dc:creator><pubDate>Wed, 04 Mar 2026 02:09:17 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!zyy7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F368244b7-a33d-489a-8821-3c8a99f71f61_784x1168.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!zyy7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F368244b7-a33d-489a-8821-3c8a99f71f61_784x1168.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!zyy7!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F368244b7-a33d-489a-8821-3c8a99f71f61_784x1168.heic 424w, 
https://substackcdn.com/image/fetch/$s_!zyy7!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F368244b7-a33d-489a-8821-3c8a99f71f61_784x1168.heic 848w, https://substackcdn.com/image/fetch/$s_!zyy7!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F368244b7-a33d-489a-8821-3c8a99f71f61_784x1168.heic 1272w, https://substackcdn.com/image/fetch/$s_!zyy7!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F368244b7-a33d-489a-8821-3c8a99f71f61_784x1168.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!zyy7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F368244b7-a33d-489a-8821-3c8a99f71f61_784x1168.heic" width="784" height="1168" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/368244b7-a33d-489a-8821-3c8a99f71f61_784x1168.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1168,&quot;width&quot;:784,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:147518,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://reggiebritt.substack.com/i/189832726?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F368244b7-a33d-489a-8821-3c8a99f71f61_784x1168.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!zyy7!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F368244b7-a33d-489a-8821-3c8a99f71f61_784x1168.heic 424w, https://substackcdn.com/image/fetch/$s_!zyy7!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F368244b7-a33d-489a-8821-3c8a99f71f61_784x1168.heic 848w, https://substackcdn.com/image/fetch/$s_!zyy7!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F368244b7-a33d-489a-8821-3c8a99f71f61_784x1168.heic 1272w, https://substackcdn.com/image/fetch/$s_!zyy7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F368244b7-a33d-489a-8821-3c8a99f71f61_784x1168.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>This week Salim Ismail posed a question on the Diamandis podcast that most leaders are only asking in private &#8212; what happens to organizations when the &#8220;human checkpoint&#8221; disappears from the workflow? When every process that once routed through a person routes through an agent instead, and humans move from being inside the work to overseeing it?</p><p>He called his working thesis the Organizational Singularity.</p><p>It&#8217;s a good frame. And it&#8217;s pointing at something real.</p><p>But here&#8217;s what I keep coming back to: the question isn&#8217;t what the future of organizations looks like. We&#8217;re already starting to see it. The question is whether organizations are actually ready to get there &#8212; and by almost every measure, they are not.</p><div><hr></div><p>The numbers are not subtle.</p><p>79% of enterprises are experimenting with AI. 
Only 8.6% have made it to production. 94% have adopted AI tools &#8212; but fewer than half have the governance frameworks to support them. 72% have scaled pilots. Only 33% have governed what they scaled. And after all of it &#8212; after the announcements, the pilots, the transformation roadmaps &#8212; only 6% can point to measurable impact on the bottom line.</p><p>Read those numbers again. Not because they&#8217;re surprising. Because they tell you exactly what kind of problem this is.</p><p>This is not a technology problem. The technology has arrived. It is capable, it is accelerating, and it is not waiting for anyone to catch up. The gap &#8212; the stubborn, persistent, expensive gap between what AI can do and what organizations are actually capturing &#8212; is not a technical gap.</p><p>It is a readiness gap.</p><div><hr></div><p>Technology that arrives without organizational readiness doesn&#8217;t transform.</p><p>It accumulates.</p><p>It accumulates in the form of pilots that never scale. Roadmaps that never land. Investments that produce dashboards instead of decisions. AI that sits beside the workflow instead of inside it. McKinsey calls it &#8220;bolted on.&#8221; I call it expensive decoration.</p><p>The organizations spending the most on AI transformation are not necessarily the ones seeing the most return. The ones seeing return are doing something different &#8212; they are treating AI as a reason to redesign how work actually gets done, not as a tool to layer on top of how it has always been done.</p><p>That distinction sounds obvious. It is apparently not.</p><div><hr></div><p>Naval Ravikant said something recently that stopped a lot of people in the technology world. He said software is &#8220;uninvestable.&#8221;</p><p>He wasn&#8217;t making a valuation call. He was making an observation about durability. In a world where AI can generate code on demand, the competitive moat is no longer the software itself. 
It is the organizational infrastructure to deploy that software at scale &#8212; the governance, the judgment, the human architecture that decides what gets built, how it gets used, and what it is actually for.</p><p>That&#8217;s not a technology question. That&#8217;s a readiness question.</p><div><hr></div><p>I&#8217;m working through what readiness actually means &#8212; as a practitioner inside the transition. It keeps resolving into three layers: the Technology layer, the Business layer, and the Human layer.</p><p>Because Salim is right that the organizational singularity is coming. What he&#8217;s still writing toward &#8212; and what I think most of the conversation is missing &#8212; is the answer to the more urgent question:</p><p><em>How do you get there from here?</em></p><p>That&#8217;s what I&#8217;m building toward.</p><p>More soon&#8230;</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.reggiebritt.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading! Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item></channel></rss>