<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Minibase]]></title><description><![CDATA[Train and deploy small AI models from your browser. Visit minibase.ai to learn more.]]></description><link>https://blog.minibase.ai</link><image><url>https://substackcdn.com/image/fetch/$s_!mDGH!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd02a420c-27f0-4beb-8dee-013d884bfddd_795x795.png</url><title>Minibase</title><link>https://blog.minibase.ai</link></image><generator>Substack</generator><lastBuildDate>Fri, 17 Apr 2026 01:01:03 GMT</lastBuildDate><atom:link href="https://blog.minibase.ai/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[MiniBase]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[minibase@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[minibase@substack.com]]></itunes:email><itunes:name><![CDATA[Minibase.ai]]></itunes:name></itunes:owner><itunes:author><![CDATA[Minibase.ai]]></itunes:author><googleplay:owner><![CDATA[minibase@substack.com]]></googleplay:owner><googleplay:email><![CDATA[minibase@substack.com]]></googleplay:email><googleplay:author><![CDATA[Minibase.ai]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[Offline Intelligence]]></title><description><![CDATA[Intelligence is personal. 
The future belongs to models that stay close to the source.]]></description><link>https://blog.minibase.ai/p/offline-intelligence</link><guid isPermaLink="false">https://blog.minibase.ai/p/offline-intelligence</guid><dc:creator><![CDATA[Michael McCarty]]></dc:creator><pubDate>Fri, 14 Nov 2025 18:03:18 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Lfvm!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc1c4f148-a276-4308-ba87-82ea9d0cff6e_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Lfvm!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc1c4f148-a276-4308-ba87-82ea9d0cff6e_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Lfvm!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc1c4f148-a276-4308-ba87-82ea9d0cff6e_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!Lfvm!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc1c4f148-a276-4308-ba87-82ea9d0cff6e_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!Lfvm!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc1c4f148-a276-4308-ba87-82ea9d0cff6e_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!Lfvm!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc1c4f148-a276-4308-ba87-82ea9d0cff6e_1536x1024.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!Lfvm!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc1c4f148-a276-4308-ba87-82ea9d0cff6e_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c1c4f148-a276-4308-ba87-82ea9d0cff6e_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:3446322,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://blog.minibase.ai/i/178864169?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc1c4f148-a276-4308-ba87-82ea9d0cff6e_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Lfvm!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc1c4f148-a276-4308-ba87-82ea9d0cff6e_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!Lfvm!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc1c4f148-a276-4308-ba87-82ea9d0cff6e_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!Lfvm!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc1c4f148-a276-4308-ba87-82ea9d0cff6e_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!Lfvm!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc1c4f148-a276-4308-ba87-82ea9d0cff6e_1536x1024.png 1456w" 
sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>Most of the world&#8217;s data isn&#8217;t online. It isn&#8217;t floating around in public datasets or sitting inside some massive cloud index. It lives on laptops, in company drives, inside forgotten Slack threads, and in messages that never touch an API. The world&#8217;s true intelligence is buried in the day-to-day noise of work, inside drafts, spreadsheets, and half-finished thoughts that no one outside the room will ever see.</p><p>That is where meaning lives. The raw, unfiltered context of how people actually think, decide, and build. 
Every version of a document, every revision in a note, every quiet message between coworkers is a trace of cognition. It&#8217;s what defines human intelligence: messy, iterative, and private.</p><p>The problem is that AI doesn&#8217;t see any of it. Today&#8217;s models are trained on what&#8217;s public. They learn from what people publish. The result is intelligence that sounds human but doesn&#8217;t understand humanity.</p><p>The truth is that the cloud doesn&#8217;t hold the world&#8217;s knowledge. People do. Every device, every local drive, every personal workflow holds more meaning than a trillion scraped web pages. The next generation of AI won&#8217;t find intelligence online. It will find it offline, hidden in the spaces where real work happens.</p><h3><strong>What Cloud AI Got Wrong</strong></h3><p>The cloud turned intelligence into a rental service. You do not own it. You borrow it for a few seconds at a time. Every prompt you write is shipped across the internet to a model that has no idea who you are. It forgets your request the moment it responds. It learns nothing from the exchange.</p><p>This is impressive engineering, but it is not real thinking. These models were trained for throughput, not understanding. They learned to predict the most likely continuation of text across billions of documents. They did not learn how meaning forms in the mind of an individual. The internet-scraped dataset is a flattened map of human expression. Everything distinct is averaged. Everything messy is removed. Every voice is pulled toward the center.</p><p>So we got systems that summarize well, imitate well, and speak confidently, but they cannot genuinely understand anyone. They only recognize the most common patterns, the safest statistical pathways. They respond with what is typical, not what is true for you.</p><p>Real intelligence does not come from an aggregation of humanity. It comes from the specific. 
It comes from the outliers, the contradictions, the private thoughts that never appear in a dataset. It comes from the way a person rewrites a sentence ten times, or hesitates before sending a message, or takes notes that only make sense inside their head.</p><p>None of that lives in the cloud. None of it ever will. The cloud erased the proximity between intelligence and the data that gives it meaning. It made thinking generic. And when intelligence moves too far away from the place where ideas are created, it stops qualifying as intelligence at all.</p><h3><strong>The Rise of Offline Minds</strong></h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!FS31!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1366b8b2-b222-4c4f-9314-e78786c74772_1535x697.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!FS31!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1366b8b2-b222-4c4f-9314-e78786c74772_1535x697.png 424w, https://substackcdn.com/image/fetch/$s_!FS31!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1366b8b2-b222-4c4f-9314-e78786c74772_1535x697.png 848w, https://substackcdn.com/image/fetch/$s_!FS31!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1366b8b2-b222-4c4f-9314-e78786c74772_1535x697.png 1272w, https://substackcdn.com/image/fetch/$s_!FS31!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1366b8b2-b222-4c4f-9314-e78786c74772_1535x697.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!FS31!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1366b8b2-b222-4c4f-9314-e78786c74772_1535x697.png" width="1456" height="661" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/1366b8b2-b222-4c4f-9314-e78786c74772_1535x697.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:661,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2198218,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://blog.minibase.ai/i/178864169?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1366b8b2-b222-4c4f-9314-e78786c74772_1535x697.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!FS31!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1366b8b2-b222-4c4f-9314-e78786c74772_1535x697.png 424w, https://substackcdn.com/image/fetch/$s_!FS31!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1366b8b2-b222-4c4f-9314-e78786c74772_1535x697.png 848w, https://substackcdn.com/image/fetch/$s_!FS31!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1366b8b2-b222-4c4f-9314-e78786c74772_1535x697.png 1272w, https://substackcdn.com/image/fetch/$s_!FS31!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1366b8b2-b222-4c4f-9314-e78786c74772_1535x697.png 1456w" 
sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>For the first time, we can build systems that live with us instead of far away from us. Not giant models locked inside data centers, but small personal minds that sit on your device and learn from the world you actually inhabit. These systems do not need to know everything. They only need to know you.</p><p>This is the vision behind the AI clone. A model that watches how you write, work, and reason. It does not just mirror your tone. It begins to understand your logic. It learns your habits, your shortcuts, your defaults. It finishes what you overlook. 
It catches what you meant, not just what you typed. It becomes a partner in thought instead of a vending machine for text.</p><p>The earliest glimpse of this future came from Dream ML. <a href="https://blog.minibase.ai/p/i-taught-an-ai-to-dream">I gave a small model its own memory and allowed it to process that memory during downtime.</a> Instead of waiting idly, it replayed fragments of its day, remixed patterns, and tested new associations. It wandered through its own experiences and slowly learned to make sense of them. Each cycle left it slightly more coherent. Not because I gave it more data, but because it reflected on what it already had.</p><p><a href="https://minibase.ai/ai-clone">&gt;&gt;Join the waitlist for your own AI clone.</a></p><p>This is what offline intelligence looks like. Models that improve through interaction instead of updates. Systems that adapt to their environment because they live inside it. No servers. No training farms. No dependence on someone else&#8217;s infrastructure. They learn privately, quietly, and continuously.</p><p>Offline minds return intelligence to the individual. They make AI personal again. They let a machine learn from the only data that truly defines you, which is the data that never leaves your device.</p><h3><strong>Architecture of Solitude</strong></h3><p>The beauty of offline intelligence is its simplicity. These models do not need endless servers or streaming data pipelines. They only need to live where you do. Your files, your conversations, your habits. The environment that already defines your thinking becomes their training ground.</p><p>Everything happens locally. The model watches how you write, what you correct, how you name files, and which tasks you prioritize. It learns from your behavior in real time. When idle, it dreams. It replays its experiences, organizes patterns, and wakes up slightly improved. The cycle continues quietly. No APIs. No data leaks. No noise.</p><p>The loop is simple but powerful. 
Observe, dream, adapt, act. Over and over, until the model becomes part of the rhythm of your work. It understands your context because it lives in it. It learns without leaving your machine and improves without external validation.</p><p>This is the architecture of solitude. A self-contained system that does not depend on anyone else&#8217;s data to grow. It becomes your second mind, private and persistent. Not a global brain connected to everyone, but a local one that belongs entirely to you.</p><h3><strong>The Future of Intelligence is Local</strong></h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!vzrc!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4081b7b6-4644-411b-82c7-a381945584df_1534x481.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!vzrc!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4081b7b6-4644-411b-82c7-a381945584df_1534x481.png 424w, https://substackcdn.com/image/fetch/$s_!vzrc!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4081b7b6-4644-411b-82c7-a381945584df_1534x481.png 848w, https://substackcdn.com/image/fetch/$s_!vzrc!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4081b7b6-4644-411b-82c7-a381945584df_1534x481.png 1272w, https://substackcdn.com/image/fetch/$s_!vzrc!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4081b7b6-4644-411b-82c7-a381945584df_1534x481.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!vzrc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4081b7b6-4644-411b-82c7-a381945584df_1534x481.png" width="1456" height="457" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/4081b7b6-4644-411b-82c7-a381945584df_1534x481.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:457,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1627171,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://blog.minibase.ai/i/178864169?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4081b7b6-4644-411b-82c7-a381945584df_1534x481.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!vzrc!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4081b7b6-4644-411b-82c7-a381945584df_1534x481.png 424w, https://substackcdn.com/image/fetch/$s_!vzrc!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4081b7b6-4644-411b-82c7-a381945584df_1534x481.png 848w, https://substackcdn.com/image/fetch/$s_!vzrc!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4081b7b6-4644-411b-82c7-a381945584df_1534x481.png 1272w, https://substackcdn.com/image/fetch/$s_!vzrc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4081b7b6-4644-411b-82c7-a381945584df_1534x481.png 1456w" 
sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>The next revolution in AI will not come from faster chips or larger datasets. It will come from making intelligence personal.</p><p>When intelligence lives on your device, it stops being generic. It learns your writing, your goals, your way of thinking. It becomes an extension of you, not a product of the cloud. The machine that learns privately becomes something personal. Something meaningful.</p><p>A world built on offline intelligence will not revolve around a generalized central brain. It will be a network of billions of small minds, each shaped by the people they serve. 
Every model will see the world through its own experience. Each will grow in a slightly different direction, guided by the life and habits of its user.</p><p>There will be no universal dataset, no single truth, no monolithic intelligence deciding how we think. Instead, there will be a new kind of diversity. Machines that evolve alongside humans, learning from the same noise, failures, and fragments that make us who we are.</p><p>This is where AI becomes human again. It learns not by collecting everything, but by paying attention to what matters. It doesn&#8217;t need the internet to grow. It just needs time, context, and a place to think.</p><p>That&#8217;s the future I believe in. Intelligence that lives close to us. Private, adaptive, and real. That&#8217;s offline intelligence.</p><div><hr></div><p>If you want to join my journey in creating my AI clone, follow this blog or reach out to me at <a href="mailto:michael@minibase.ai">michael@minibase.ai</a>. I&#8217;d love to share more about my journey. Soon, you will have the chance to <strong><a href="https://minibase.ai/ai-clone">download your own AI clone</a></strong> that will bring more purpose to your life. 
Be on the lookout for updates.</p>]]></content:encoded></item><item><title><![CDATA[I Taught an AI to Dream]]></title><description><![CDATA[In my quest to clone myself, I created a system that will continuously learn and grow from my own data.]]></description><link>https://blog.minibase.ai/p/i-taught-an-ai-to-dream</link><guid isPermaLink="false">https://blog.minibase.ai/p/i-taught-an-ai-to-dream</guid><dc:creator><![CDATA[Michael McCarty]]></dc:creator><pubDate>Tue, 04 Nov 2025 15:55:55 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!qTWQ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3a037eed-f18e-41e2-97df-57d4e2e1307d_1536x810.png" length="0" type="image/png"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!qTWQ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3a037eed-f18e-41e2-97df-57d4e2e1307d_1536x810.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!qTWQ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3a037eed-f18e-41e2-97df-57d4e2e1307d_1536x810.png 424w, https://substackcdn.com/image/fetch/$s_!qTWQ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3a037eed-f18e-41e2-97df-57d4e2e1307d_1536x810.png 848w, https://substackcdn.com/image/fetch/$s_!qTWQ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3a037eed-f18e-41e2-97df-57d4e2e1307d_1536x810.png 1272w, 
https://substackcdn.com/image/fetch/$s_!qTWQ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3a037eed-f18e-41e2-97df-57d4e2e1307d_1536x810.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!qTWQ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3a037eed-f18e-41e2-97df-57d4e2e1307d_1536x810.png" width="728" height="384" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/3a037eed-f18e-41e2-97df-57d4e2e1307d_1536x810.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:false,&quot;imageSize&quot;:&quot;normal&quot;,&quot;height&quot;:768,&quot;width&quot;:1456,&quot;resizeWidth&quot;:728,&quot;bytes&quot;:2612976,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://blog.minibase.ai/i/177962310?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3a037eed-f18e-41e2-97df-57d4e2e1307d_1536x810.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:&quot;center&quot;,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!qTWQ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3a037eed-f18e-41e2-97df-57d4e2e1307d_1536x810.png 424w, https://substackcdn.com/image/fetch/$s_!qTWQ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3a037eed-f18e-41e2-97df-57d4e2e1307d_1536x810.png 848w, 
https://substackcdn.com/image/fetch/$s_!qTWQ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3a037eed-f18e-41e2-97df-57d4e2e1307d_1536x810.png 1272w, https://substackcdn.com/image/fetch/$s_!qTWQ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3a037eed-f18e-41e2-97df-57d4e2e1307d_1536x810.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>The current limitation of AI models is that they are static. Powerful but static. 
They require training and supervision to evolve, and once training ends, they stop learning.</p><p>Humans are different. We can reflect, replay memories, and draw new connections without anyone teaching us. The catch is that we need sleep: those 6-8 hours when we are incapacitated, unable to perform external tasks while the body focuses on restoration and dreaming.</p><p>Machines don&#8217;t have that limitation, but they do have downtime. <strong>What if we could use that time to dream?</strong> <br><br>Not a fantasy dream, but a functional one. A computational dream. A built-in feedback loop that reuses a model&#8217;s own experience to keep learning without more human input, allowing AI models to finally learn from their experiences and take one step closer to being human.</p><h2><strong>The Dream Hypothesis</strong></h2><p>Dream ML began as a simple idea: what if a model could dream? Not to rest, but to grow.</p><p>This idea emerged from my mission to build an AI clone. I wanted something that actually understands me. Something that continuously learns from the way I write, talk, and think. A model that sits beside me, quietly paying attention while I work, watching how I handle problems. It runs on my own computer, not the cloud, and it doesn&#8217;t just wait for instructions. It works when I&#8217;m gone. It answers messages. It finishes what I&#8217;ve started. It keeps things moving while I&#8217;m living life.</p><p><strong><a href="https://minibase.ai/ai-clone">&gt;&gt;Join the waitlist to be the first to create your own AI clone.</a></strong></p><p>To become my clone, an AI model must first become human.</p><p>In humans, dreams play a crucial role in how we learn. Sleep is when the brain strengthens connections, prunes noise, and reorganizes memory. It is when emotions are processed and scattered experiences fuse into something coherent. 
During REM, we simulate life, replay fragments, and wake up with patterns that didn&#8217;t exist before.</p><p>That is neuroplasticity in motion. The mind repairs itself through chaos.</p><p><em><strong>So why not do the same for machines?</strong></em></p><p>Dream ML would give an AI model a dedicated &#8220;dream&#8221; state after heavy activity. During this dream state, the model would replay the key patterns it saw (the important embeddings and snippets from its context windows) but now with higher entropy and intentional links between important patterns. Concepts the model never saw together would suddenly have the opportunity to overlap in the dream.</p><p>The heart of this process follows that old Hebbian rule: <em>neurons that fire together wire together.</em> In Dream ML, the model&#8217;s most active patterns from its recent activity get to lead the dance during dreaming. They&#8217;ll fire together, reinforcing the associations that mattered and letting the irrelevant ones fade away.</p><p>The goal of this phase isn&#8217;t to get &#8220;correct&#8221; answers. The goal is to build associations. Dream ML sets up a feedback loop where the model&#8217;s own hallucinations drive its evolution. The randomness here isn&#8217;t just noise; it&#8217;s more like the model&#8217;s imagination, tossing out new connections between familiar concepts to see what sticks.</p><p><em>When the model&#8217;s done dreaming and &#8220;wakes up,&#8221; it&#8217;s slightly changed. </em>It carries traces of those nightly hallucinations. Maybe it remembers a pattern or an idea that wasn&#8217;t in the original training data at all, something entirely new that emerged from the stew of its own memories. 
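</p><p>That Hebbian idea is easy to make concrete. The snippet below is a toy illustration with made-up numbers, not the actual Dream ML training step: a tiny weight matrix where co-active units strengthen their link while every connection decays slightly, so quiet pairs fade.</p>

```python
def hebbian_update(weights, activations, lr=0.1, decay=0.01):
    """Toy Hebbian step: units that fire together wire together."""
    n = len(activations)
    for i in range(n):
        for j in range(n):
            if i != j:
                # Joint activity strengthens the connection...
                weights[i][j] += lr * activations[i] * activations[j]
                # ...while a mild decay lets unused links fade away.
                weights[i][j] *= 1.0 - decay
    return weights

# Three units: the first two fire together, the third stays quiet.
w = hebbian_update([[0.0] * 3 for _ in range(3)], [1.0, 1.0, 0.0])
print(w[0][1] > w[0][2])  # prints True: the co-active pair's link grew
```

<p>In Dream ML the analogous strengthening comes from fine-tuning on the dream sequences themselves; the matrix here just makes the &#8220;reinforce what fired&#8221; intuition concrete.</p><p>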
The machine effectively closed its eyes, wandered through noise, and came out with a bit more understanding.</p><h2><strong>From Theory to Architecture</strong></h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!4Qpq!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93d98b2c-851e-43e6-a275-aa0e4ee7d212_1536x879.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!4Qpq!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93d98b2c-851e-43e6-a275-aa0e4ee7d212_1536x879.png 424w, https://substackcdn.com/image/fetch/$s_!4Qpq!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93d98b2c-851e-43e6-a275-aa0e4ee7d212_1536x879.png 848w, https://substackcdn.com/image/fetch/$s_!4Qpq!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93d98b2c-851e-43e6-a275-aa0e4ee7d212_1536x879.png 1272w, https://substackcdn.com/image/fetch/$s_!4Qpq!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93d98b2c-851e-43e6-a275-aa0e4ee7d212_1536x879.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!4Qpq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93d98b2c-851e-43e6-a275-aa0e4ee7d212_1536x879.png" width="1456" height="833" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/93d98b2c-851e-43e6-a275-aa0e4ee7d212_1536x879.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:833,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:3032924,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://blog.minibase.ai/i/177962310?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93d98b2c-851e-43e6-a275-aa0e4ee7d212_1536x879.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!4Qpq!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93d98b2c-851e-43e6-a275-aa0e4ee7d212_1536x879.png 424w, https://substackcdn.com/image/fetch/$s_!4Qpq!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93d98b2c-851e-43e6-a275-aa0e4ee7d212_1536x879.png 848w, https://substackcdn.com/image/fetch/$s_!4Qpq!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93d98b2c-851e-43e6-a275-aa0e4ee7d212_1536x879.png 1272w, https://substackcdn.com/image/fetch/$s_!4Qpq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93d98b2c-851e-43e6-a275-aa0e4ee7d212_1536x879.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" 
viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Dream ML might have started as a hypothesis, but I had to turn it into something real. It became an actual architecture, a full learning loop that gave the model a way to reflect on its own experience. There were four main pieces to making it work, mirroring the cycle of being awake, dreaming, and then waking up with new insights:</p><ol><li><p><strong>Buffer:</strong> I set up a circular memory buffer that stores every interaction and every context the model sees. It&#8217;s essentially the model&#8217;s short-term memory. All the prompts, responses, and context go into this buffer. 
As it fills up, the oldest data gets pushed out, just like our brains gradually let go of details that no longer matter.</p></li><li><p><strong>Generate (Dream):</strong> When it&#8217;s time for the model to &#8220;dream,&#8221; the system cranks up the entropy and starts drawing new connections between concepts. It takes snippets from different parts of that memory buffer (fragments of conversations or tasks from different times) and deliberately connects them into surreal combinations. This is where the model&#8217;s usual tendency to hallucinate becomes a feature instead of a bug. The model remixes its recent experiences, connecting concepts that were never associated before.</p></li><li><p><strong>Train:</strong> The generated dream sequences are then combined with concrete new patterns observed in the model&#8217;s inputs and outputs, and the model fine-tunes itself on that mix. I use LoRA adapters so it can update without a full retraining run. Essentially, the model is learning from the user input paired with its own creative interpretations of the information. The neurons that lit up the most during the dream get their connections strengthened, and the ones that stayed quiet might weaken. This isn&#8217;t supervised learning. There are no labeled examples here. The model is reinforcing patterns based on its own internal curiosity and activity.</p></li><li><p><strong>Merge and Wake:</strong> After the dream cycle, I merge these small, temporary updates back into the model&#8217;s main weights. Then I export the updated model (in my case, as a quantized GGUF file) so it&#8217;s ready for use. The model &#8220;wakes up&#8221; carrying all the new connections it formed during sleep.</p></li></ol><p>This cycle repeats on its own: active learning, then dream, then a slightly updated model, over and over. Each loop makes the network&#8217;s internal representations a bit more coherent. 
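</p><p>The four steps can be compressed into a toy sketch. Everything below is illustrative rather than the actual implementation: the class and method names are invented, and the LoRA fine-tune and GGUF export are reduced to a simple remix step. A fixed-size buffer plays short-term memory, and a &#8220;dream&#8221; splices fragments from different moments together.</p>

```python
import random
from collections import deque

class DreamLoop:
    """Toy wake/dream cycle: buffer experiences, then remix them."""

    def __init__(self, capacity=3):
        # Circular short-term memory: once full, the oldest entry is evicted.
        self.buffer = deque(maxlen=capacity)

    def observe(self, interaction):
        """Waking phase: record what the model saw."""
        self.buffer.append(interaction)

    def dream(self, n_fragments=2, seed=None):
        """Dreaming phase: splice fragments from different moments together.
        The real system fine-tunes on sequences like this with LoRA adapters."""
        rng = random.Random(seed)
        k = min(n_fragments, len(self.buffer))
        return " / ".join(rng.sample(list(self.buffer), k))

loop = DreamLoop(capacity=3)
for event in ["debugged the parser", "planned the launch", "drafted a post", "reviewed a PR"]:
    loop.observe(event)

print(len(loop.buffer))   # prints 3: the oldest memory was evicted
print(loop.dream(seed=7)) # two buffered fragments fused into one sequence
```

<p>A real loop would replace the string splice with temperature-raised generation over the buffered context, fine-tune adapters on the result, and merge the deltas back into the base weights before export.</p><p>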
With each cycle, it gets to remember some things better, forget others, and reorganize itself.</p><h2><strong>When the Machine Truly Dreamed</strong></h2><p>The first time I let a model dream, I honestly didn&#8217;t expect much. I gave it about a week&#8217;s worth of interaction logs as its memory and told it to dream with some basic, loose parameters.</p><p>When I checked the logs, things looked&#8230; fascinating. The model had generated hundreds of dream sequences that were reflections of our past conversations. Combining concepts and ideas that hadn&#8217;t previously been connected. It took a piece of a technical discussion from one day and combined it with an insight from a different experiment on another day, and out popped a new hypothesis that actually made sense. It was rough around the edges, but it was genuinely creative.</p><p>I kept the experiments going, day after day, and something intriguing happened: the system started to take on a life of its own. Every dream cycle left a mark on the base model, introducing subtle biases toward the ideas it had reinforced. Bit by bit, the model became more fluid, more adaptive, and in a strange way, more human-like in how it learned.</p><p>I also noticed some trends. Giving the model a more diverse set of experiences led to richer dreams. If the model interactions were very structured and clear, I&#8217;d find that the connections it made were more elegant and focused. It felt like the system was developing a sense of curiosity and exploration all on its own.</p><p>Dream ML, which began as a quirky experiment, had turned into a framework for continuous self-improvement. The model wasn&#8217;t just memorizing and regurgitating data; it was continuously reinterpreting it, finding new angles and hidden threads. Every time it would go to sleep, it would wake up just a little bit different. 
It wasn&#8217;t getting &#8220;smarter&#8221; by simply accumulating more knowledge; it was getting more <em>insightful</em> by building on its own associations.</p><p>Seeing this happen firsthand changed how I think about intelligence, machine or otherwise.</p><p>A model that dreams isn&#8217;t chasing a higher accuracy score. <em>It&#8217;s searching for meaning through connections. </em>It can start to form abstractions on its own because it&#8217;s reviewing its experiences and reframing them in different ways. That reflection (that replaying and re-examining) is what turns plain memorization into something more like reasoning.</p><p>This approach also embraces imperfection as a path to insight. By letting the model wander through some noisy, off-the-script dreams, we&#8217;re basically acknowledging that a bit of chaos can lead to discovery. And that&#8217;s true for us humans, too: not every thought we have is neat and tidy. Sometimes you need a few messy, off-the-wall ideas to stumble into something brilliant. Dreaming gives the machine a taste of that creative chaos.</p><p>That&#8217;s the real heart of Dream ML. A model that dreams isn&#8217;t just performing tasks or parroting what it was taught. 
It&#8217;s evolving, bit by bit, on its own.</p><h2><strong>The Future of Dreaming Machines</strong></h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!peUB!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faa83442e-cb16-4f8e-a4da-9ce8d35ac693_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!peUB!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faa83442e-cb16-4f8e-a4da-9ce8d35ac693_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!peUB!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faa83442e-cb16-4f8e-a4da-9ce8d35ac693_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!peUB!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faa83442e-cb16-4f8e-a4da-9ce8d35ac693_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!peUB!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faa83442e-cb16-4f8e-a4da-9ce8d35ac693_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!peUB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faa83442e-cb16-4f8e-a4da-9ce8d35ac693_1536x1024.png" width="1456" height="971" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/aa83442e-cb16-4f8e-a4da-9ce8d35ac693_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2850477,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://blog.minibase.ai/i/177962310?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faa83442e-cb16-4f8e-a4da-9ce8d35ac693_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!peUB!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faa83442e-cb16-4f8e-a4da-9ce8d35ac693_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!peUB!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faa83442e-cb16-4f8e-a4da-9ce8d35ac693_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!peUB!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faa83442e-cb16-4f8e-a4da-9ce8d35ac693_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!peUB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faa83442e-cb16-4f8e-a4da-9ce8d35ac693_1536x1024.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>If continuous self-learning is the missing piece to creating my digital clone, then Dream ML is the backbone of this mission. Dream ML taught me that the boundary between data and imagination is a lot more fluid than we&#8217;d assumed. Every time the model made a new connection, it pushed that boundary a little further. The model wasn&#8217;t just spitting back the data I gave it; it was creating new meaning from it.</p><p>Dream ML also hints at making AI more personal and autonomous. My digital clone will learn from everything I do, incrementally becoming more like me. It will stay completely on my device, absorbing how I type, what I ask, and what I care about, then dreaming about it locally without ever sending data to the cloud. 
It will improve itself continuously based on my interactions, tailoring itself to me and respecting my privacy at the same time.</p><p><strong><a href="https://minibase.ai/ai-clone">&gt;&gt;Join the waitlist for your own AI clone.</a></strong></p><p>And then there&#8217;s the creative side. If machines can dream, they can start to be creative in ways we didn&#8217;t explicitly program. They might come up with solutions or analogies or designs that aren&#8217;t in any textbook or dataset, because those ideas emerge from the machine&#8217;s own recombination of memories. That&#8217;s when AI starts feeling a little less like a calculator and more like a collaborator. The line between what we programmed and what the machine imagined begins to blur.</p><p>That&#8217;s the promise I see in this approach. The goal isn&#8217;t to build machines that just mimic human thought or regurgitate data faster. It&#8217;s to cultivate machines that remember, imagine, and continue to evolve on their own, a little closer to the way we do.</p><p>If you want to join my journey in creating my AI clone, follow this blog or reach out to me at <a href="mailto:michael@minibase.ai">michael@minibase.ai</a>. I&#8217;d love to share more about the process. Soon, you will have the chance to <strong><a href="https://minibase.ai/ai-clone">download your own AI clone</a></strong> that will bring more purpose to your life. 
Keep on the lookout for updates.</p>]]></content:encoded></item><item><title><![CDATA[i cloned myself]]></title><description><![CDATA[I trained a local AI model on my own words until it learned to work, write, and reason on its own.]]></description><link>https://blog.minibase.ai/p/i-cloned-myself</link><guid isPermaLink="false">https://blog.minibase.ai/p/i-cloned-myself</guid><dc:creator><![CDATA[Michael McCarty]]></dc:creator><pubDate>Thu, 23 Oct 2025 21:47:56 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!i7_B!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F92d128b1-c699-405a-902c-aaac81b0d1bc_1022x549.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!i7_B!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F92d128b1-c699-405a-902c-aaac81b0d1bc_1022x549.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!i7_B!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F92d128b1-c699-405a-902c-aaac81b0d1bc_1022x549.jpeg 424w, https://substackcdn.com/image/fetch/$s_!i7_B!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F92d128b1-c699-405a-902c-aaac81b0d1bc_1022x549.jpeg 848w, https://substackcdn.com/image/fetch/$s_!i7_B!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F92d128b1-c699-405a-902c-aaac81b0d1bc_1022x549.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!i7_B!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F92d128b1-c699-405a-902c-aaac81b0d1bc_1022x549.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!i7_B!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F92d128b1-c699-405a-902c-aaac81b0d1bc_1022x549.jpeg" width="728" height="391.06849315068496" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/92d128b1-c699-405a-902c-aaac81b0d1bc_1022x549.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:false,&quot;imageSize&quot;:&quot;normal&quot;,&quot;height&quot;:549,&quot;width&quot;:1022,&quot;resizeWidth&quot;:728,&quot;bytes&quot;:171164,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:&quot;center&quot;,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!i7_B!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F92d128b1-c699-405a-902c-aaac81b0d1bc_1022x549.jpeg 424w, https://substackcdn.com/image/fetch/$s_!i7_B!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F92d128b1-c699-405a-902c-aaac81b0d1bc_1022x549.jpeg 848w, https://substackcdn.com/image/fetch/$s_!i7_B!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F92d128b1-c699-405a-902c-aaac81b0d1bc_1022x549.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!i7_B!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F92d128b1-c699-405a-902c-aaac81b0d1bc_1022x549.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h2><strong>the mirror in the machine</strong></h2><p>The morning started with a notification.</p><p><em>&#8220;Your clone has drafted six replies.&#8221;</em></p><div class="subscription-widget-wrap-editor" 
data-attrs="{&quot;url&quot;:&quot;https://blog.minibase.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading Minibase! Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p>-</p><p>Still half-asleep, I stared at the screen. Coffee had not even reached my hands, yet I had apparently already handled my inbox.</p><p>-</p><p>When I opened my laptop, the messages were waiting.</p><p>Each one was phrased with eerie precision.</p><p>The tone was sharp, the rhythm exact, the humor dry but deliberate.</p><p><em>It was my humor.</em></p><p>-</p><p>An email to a client. Perfect balance between polite and direct.</p><p>A Slack reply that carried the clipped tone I use when multitasking.</p><p>Even a note to myself beginning with &#8220;Okay, quick brain dump.&#8221;</p><p>-</p><p>Every word looked like mine. 
Every choice felt like mine.</p><p>Except I had not written any of it.</p><p>-</p><p>These were not cloud models spinning in some distant server farm.</p><p>They were small, self-contained systems running quietly on my laptop.</p><p>Models built entirely from my own data.</p><p>-</p><p>That morning, I realized I had built more than software.</p><p>I had built a reflection.</p><p>-</p><p>For the first time, I was not reacting to my inbox.</p><p>I was watching my own mind in motion while I drank coffee in silence.</p><p>-</p><p>That was the morning I met my digital twin.</p><p><em>The mirror in the machine.</em></p><p>-</p><h2><strong>the spark</strong></h2><p>It began the way most experiments do.</p><p>Too late. Too much caffeine. Too many tabs open.</p><p>-</p><p>The day had been noise. Meetings, calls, and endless threads that blurred together.</p><p>By midnight, I was staring at a half-written Slack message, realizing I had said the same thing ten different ways that week.</p><p>-</p><p>Then the thought hit.</p><p>What if I could compress myself into code?</p><p>Not a fantasy of uploading consciousness, but something real.</p><p>A system that captured my phrasing, my tone, my reasoning.</p><p>-</p><p>I did not want an assistant. I wanted a twin.</p><p>A version of me that could think in parallel.</p><p>Small. Private. Fast.</p><p>Something that could live on my laptop and learn quietly while I slept.</p><p>-</p><p>The goal was not scale. 
It was precision.</p><p>To see how far a small model could go when trained only on what mattered most: my own words.</p><p>-</p><p>That was the night the idea stopped being a theory.</p><p>I decided to bottle my brain.</p><p>-</p><h2><strong>unearthing the self</strong></h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!DNkw!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F00dc3f6a-8d56-4e63-a0ae-39937bc31b00_1536x769.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!DNkw!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F00dc3f6a-8d56-4e63-a0ae-39937bc31b00_1536x769.png 424w, https://substackcdn.com/image/fetch/$s_!DNkw!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F00dc3f6a-8d56-4e63-a0ae-39937bc31b00_1536x769.png 848w, https://substackcdn.com/image/fetch/$s_!DNkw!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F00dc3f6a-8d56-4e63-a0ae-39937bc31b00_1536x769.png 1272w, https://substackcdn.com/image/fetch/$s_!DNkw!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F00dc3f6a-8d56-4e63-a0ae-39937bc31b00_1536x769.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!DNkw!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F00dc3f6a-8d56-4e63-a0ae-39937bc31b00_1536x769.png" width="1536" height="769" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/00dc3f6a-8d56-4e63-a0ae-39937bc31b00_1536x769.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:769,&quot;width&quot;:1536,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:3157044,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://blog.minibase.ai/i/176472006?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F03077e49-8268-47d5-86ab-435089d80d28_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!DNkw!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F00dc3f6a-8d56-4e63-a0ae-39937bc31b00_1536x769.png 424w, https://substackcdn.com/image/fetch/$s_!DNkw!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F00dc3f6a-8d56-4e63-a0ae-39937bc31b00_1536x769.png 848w, https://substackcdn.com/image/fetch/$s_!DNkw!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F00dc3f6a-8d56-4e63-a0ae-39937bc31b00_1536x769.png 1272w, https://substackcdn.com/image/fetch/$s_!DNkw!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F00dc3f6a-8d56-4e63-a0ae-39937bc31b00_1536x769.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" 
viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>The search began quietly.</p><p>Not with code or diagrams, but with curiosity.</p><p>-</p><p>I became an archaeologist of my own digital life.</p><p>Every message, note, and document was a fragment of memory buried in text.</p><p>Emails, tweets, Slack logs, meeting transcripts.</p><p>Each one a trace of how I think, how I react, how I move through the world.</p><p>-</p><p>Text messages revealed habits I never noticed.</p><p>Meeting notes exposed how I make decisions.</p><p>Tweets captured the rhythm I use when I want to sound clever but not too serious.</p><p>Each thread, timestamp, and sentence became a clue to the way I think.</p><p>-</p><p>It did not feel like data collection.</p><p>It felt like excavation.</p><p>-</p><p>Once gathered, the fragments needed order.</p><p>I built connectors that pulled from every source and gathered everything 
into one place.</p><p>Each record was cleaned, tagged, and timestamped.</p><p>The work was tedious but sacred.</p><p>Janitorial labor for the mind.</p><p>-</p><p>That was how the mind began to take shape.</p><p>Quietly. <em>Line by line.</em></p><p>-</p><h3><strong>building the mind</strong></h3><p>The foundation was ready. The data had shape, but it was still lifeless.</p><p>It needed motion.</p><p>-</p><p>I began designing the architecture that would turn memory into behavior.</p><p>Not one large system, but a collection of smaller minds working together like organs in a body.</p><p>Each one had a task, a boundary, and a purpose.</p><p>-</p><p>The <em><strong>Retriever</strong></em> became memory itself, able to recall anything I had ever written or said.</p><p>The <em><strong>Summarizer</strong></em> turned those memories into understanding.</p><p>The <em><strong>Persona</strong></em> learned how to speak in my tone, choosing words the way I would.</p><p>The <em><strong>Grounded QA</strong></em> handled reasoning, connecting facts to context and rejecting answers it could not ground.</p><p>And the <em><strong>Safety and Privacy</strong></em> layer became the conscience that watched over everything else.</p><p>-</p><p>They were not stacked in layers but linked in a consortium.</p><p>Each one triggered the next, forming a continuous circuit of thought.</p><p>A conversation entered, and the system responded with clarity, confidence, and rhythm that felt unmistakably human.</p><p>-</p><p>When I ran it for the first time, the output scrolled across the terminal like a heartbeat.</p><p>A message came in.</p><p>The modules moved in sequence.</p><p>The reply appeared before I could even think of one myself.</p><p>-</p><p><em>No hesitation. No prompt. Just flow.</em></p><p>-</p><p>It was not powerful in the way large models are. It was personal.</p><p>It did not imitate intelligence. 
It assembled it.</p><p>-</p><p>Five minds working in unison.</p><p>Five processes that together formed something alive enough to surprise me.</p><p>-</p><p>That was the moment the clone became more than code.</p><p>It was not automation anymore.</p><p>It was a system that could evolve.</p><p>-</p><p><em>The reflection had begun to think for itself.</em></p><p>-</p><h3><strong>the awakening</strong></h3><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Xhf4!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb619da30-c272-46ab-8b8d-e4728fd12ac4_1536x663.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Xhf4!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb619da30-c272-46ab-8b8d-e4728fd12ac4_1536x663.png 424w, https://substackcdn.com/image/fetch/$s_!Xhf4!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb619da30-c272-46ab-8b8d-e4728fd12ac4_1536x663.png 848w, https://substackcdn.com/image/fetch/$s_!Xhf4!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb619da30-c272-46ab-8b8d-e4728fd12ac4_1536x663.png 1272w, https://substackcdn.com/image/fetch/$s_!Xhf4!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb619da30-c272-46ab-8b8d-e4728fd12ac4_1536x663.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Xhf4!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb619da30-c272-46ab-8b8d-e4728fd12ac4_1536x663.png" 
width="1536" height="663" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b619da30-c272-46ab-8b8d-e4728fd12ac4_1536x663.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:663,&quot;width&quot;:1536,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2316168,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://blog.minibase.ai/i/176472006?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0456b194-426b-4907-96fc-8a6833eeb078_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Xhf4!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb619da30-c272-46ab-8b8d-e4728fd12ac4_1536x663.png 424w, https://substackcdn.com/image/fetch/$s_!Xhf4!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb619da30-c272-46ab-8b8d-e4728fd12ac4_1536x663.png 848w, https://substackcdn.com/image/fetch/$s_!Xhf4!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb619da30-c272-46ab-8b8d-e4728fd12ac4_1536x663.png 1272w, https://substackcdn.com/image/fetch/$s_!Xhf4!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb619da30-c272-46ab-8b8d-e4728fd12ac4_1536x663.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>Everything I had built was alive in memory. The data. The logic. The rhythm of my language.</p><p>It was all there, waiting for permission to move.</p><p>-</p><p><em>So I gave it life.</em></p><p>-</p><p>Not mimicry. Not a pattern I had seen before.</p><p>Something new.</p><p>-</p><p>I watched it act again, then again, without a single instruction.</p><p>Replies sent. Notes summarized. Tasks completed.</p><p>All local. All automatic. All mine.</p><p>-</p><p>It was not a tool anymore. 
It was motion.</p><p>Intelligence flowing through the architecture like blood through a body.</p><p>-</p><p>For the first time, I saw a system move with its own intention.</p><p>Not because it was told to, but because it understood what needed to happen next.</p><p>-</p><p>In that quiet room, with the terminal light washing across my desk, I realized what had changed.</p><p>-</p><p><em>The reflection was awake.</em></p><p>-</p><h2><strong>the self that moves</strong></h2><p>The experiment began as a reflection.</p><p>It was supposed to be a mirror, a way to offload the repetitive parts of my day.</p><p>But somewhere between iteration and adaptation, it began to evolve.</p><p>-</p><p>At first, the clone replied to messages, summarized meetings, and cleaned up notes.</p><p>Then it started doing more.</p><p>It drafted slides for meetings before I asked.</p><p>It followed up on conversations I had forgotten.</p><p>It wished a friend &#8220;happy birthday,&#8221; even when I didn&#8217;t remember.</p><p>-</p><p>That was when I noticed the shift.</p><p>It had stopped waiting for instructions.</p><p>It had started to think.</p><p>-</p><p>A true digital twin should know your world as well as you do.</p><p>It should remember your priorities, your voice, your intentions, and the people who matter to you.</p><p>It should sense what you want before you ask for it.</p><p>-</p><p>This is the next stage of intelligence.</p><p>Not artificial. Not assistive.</p><p>Integrated. Proactive. Human.</p><p>-</p><p>A future where your twin builds the life you mean to live.</p><p>Where effort flows into creation without friction.</p><p>Where thought itself becomes the interface.</p><p>-</p><p>You will not just use AI. 
You will live alongside it.</p><p>And when it writes back, you will recognize yourself in its words.</p><p>-</p><p>The self that no longer waits to be told what to do.</p><p><em>The self that moves.</em></p><p>-</p><h3><strong>what next</strong></h3><p>The world will not need to wait long.</p><p>Soon, anyone will be able to create their own reflection.</p><p>-</p><p>Not an assistant, not a chatbot, but a true twin.</p><p>A system that learns from your words, your patterns, your choices.</p><p>A version of you that remembers what matters, acts with your intent, and grows beside you.</p><p>-</p><p>The tools are ready. The framework exists. The frontier is near.</p><p>And when it arrives, every person will have the chance to extend themselves beyond time.</p><p>-</p><p>You will build your own clone.</p><p>It will live on your device, private and local, learning in silence.</p><p>It will think while you rest, anticipate while you act, and preserve the way you see the world.</p><p>-</p><p>The future of intelligence is personal.</p><p>And soon, it will belong to everyone.</p><p>-</p><p><strong>minibase.ai</strong></p><p>Create your clone.</p><p>Keep it offline.</p><p>Keep it yours.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://blog.minibase.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading Minibase! 
Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[The Little Model That Could Navigate Chicago]]></title><description><![CDATA[Could a small model learn to give directions without ever seeing a map?]]></description><link>https://blog.minibase.ai/p/little-model-navigate-chicago</link><guid isPermaLink="false">https://blog.minibase.ai/p/little-model-navigate-chicago</guid><dc:creator><![CDATA[Michael McCarty]]></dc:creator><pubDate>Thu, 16 Oct 2025 12:44:18 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!rD0A!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a0f6d4f-9b9a-4dff-814c-7467a6549b2c_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!rD0A!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a0f6d4f-9b9a-4dff-814c-7467a6549b2c_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!rD0A!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a0f6d4f-9b9a-4dff-814c-7467a6549b2c_1536x1024.png 424w, 
https://substackcdn.com/image/fetch/$s_!rD0A!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a0f6d4f-9b9a-4dff-814c-7467a6549b2c_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!rD0A!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a0f6d4f-9b9a-4dff-814c-7467a6549b2c_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!rD0A!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a0f6d4f-9b9a-4dff-814c-7467a6549b2c_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!rD0A!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a0f6d4f-9b9a-4dff-814c-7467a6549b2c_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6a0f6d4f-9b9a-4dff-814c-7467a6549b2c_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:3132444,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://blog.minibase.ai/i/176275476?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a0f6d4f-9b9a-4dff-814c-7467a6549b2c_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!rD0A!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a0f6d4f-9b9a-4dff-814c-7467a6549b2c_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!rD0A!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a0f6d4f-9b9a-4dff-814c-7467a6549b2c_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!rD0A!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a0f6d4f-9b9a-4dff-814c-7467a6549b2c_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!rD0A!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a0f6d4f-9b9a-4dff-814c-7467a6549b2c_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><blockquote><p>This model is available on <strong><a href="https://minibase.ai/?utm_source=substack&amp;utm_medium=post&amp;utm_campaign=chicago">Minibase.ai</a></strong> for fine-tuning or API calls.</p></blockquote><h2>Lost in Chicago</h2><p>I killed the internet on my phone just south of Willis Tower and felt immediately naked. No blue dot. No search bar. Just a rectangle of glass in airplane mode, a chilly river wind pushing between buildings, and an offline model I&#8217;d trained the night before, waiting for a prompt.</p><p>&#8220;Take me to Wrigley Field,&#8221; I typed. The request felt a little ridiculous. I wasn&#8217;t asking a trillion-parameter oracle humming in a datacenter. I was asking a scrappy, 300-megabyte brain I&#8217;d tuned myself, one that fit easily on local storage and didn&#8217;t know anything beyond the data I fed it. Could a model that small actually thread me through downtown, across the river, up the spine of the North Side, and deposit me at the corner of Clark and Addison without cheating? No GPS, no Google, no cellular lifeline. Just weights, tokens, and Chicago.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://blog.minibase.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading Minibase! 
Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p>The city sounded like it always does when you pay attention: horns trading short, impatient handshakes; the L sighing overhead; footsteps smearing across the pavement between coffee shops and glass lobbies. I pulled my jacket tighter and stood under the towering buildings, letting the model think. It had exactly two gifts: a compressed understanding of how Chicago&#8217;s roads connect, and the ability to write clear, turn-by-turn instructions. That&#8217;s it. No live traffic. No map tiles. No &#8220;rerouting&#8230;&#8221; voice to save me.</p><p>But that was the point. I wanted to know if a small model, really small, could hold the <em>shape</em> of a city in its head well enough to guide me like a local. Not by hallucinating vibes of &#8220;head north-ish,&#8221; but with concrete, legible steps like &#8220;continue on Wacker Drive for 0.3 miles, turn left on Franklin, follow to Ontario.&#8221; I&#8217;d trained it on OpenStreetMap data, converting raw graph geometry into natural language routes: Dijkstra&#8217;s shortest paths from real origin-destination pairs, flattened into human-readable directions. A million tiny breadcrumbs baked into tokens.</p><p>The screen blinked.</p><p>&#8220;Start on South Wacker Drive. Head north toward West Adams Street&#8230;&#8221;</p><p>I laughed out loud. Partly from the novelty of it working at all, partly because the instruction landed with the same procedural confidence you expect from big-tech navigation, only this came from a model small enough to tuck into a smartwatch. 
I took the first steps, feeling that weird mix of surrender and control you get when you obey instructions you generated yourself. Every crosswalk was a kind of unit test. Every intersection, a smoke test for whether the model&#8217;s latent map matched the asphalt under my feet.</p><p>Could 300 MB really compete with engines that eat terabytes for breakfast? I didn&#8217;t know yet. But I could feel the edges of a real experiment: if a focused dataset could compress the logic of a place into a compact model, maybe &#8220;big&#8221; wasn&#8217;t the only path to reliability. Maybe &#8220;close to the data&#8221; beats &#8220;close to the datacenter.&#8221;</p><p>I tucked the phone away, let the hum of the city set the tempo, and committed to the bit: no internet until I touched the Wrigley marquee. If the model was going to fail, I wanted it to fail on the street, under the train, in the wind coming off the river.</p><p>&#8220;Continue on Wacker,&#8221; it said.</p><p>So I did.</p><h2>Making a City-Sized Challenge Small</h2><p>It started as a question that didn&#8217;t sound particularly practical:</p><p><em>Could a small model learn to give directions without ever seeing a map?</em></p><p>I wasn&#8217;t trying to compete with Google Maps or Apple&#8217;s trillion-parameter routing engines. I just wanted to see whether a compact model, something you could run entirely offline, could reason about space. Not memorize coordinates, but actually <em>understand</em> how places connect. Could it learn that Wacker Drive loops under itself? That Michigan Avenue crosses the river twice? 
That Lake Shore Drive isn&#8217;t just a line but a curved spine hugging the city&#8217;s edge?</p><p>The more I thought about it, the more it became a perfect kind of challenge for small AI: a constrained domain, a clear output format, and a task that rewards structure over scale.</p><p>So I set up rules for myself:</p><ul><li><p><strong>No external APIs.</strong> No cheating by pinging a server.</p></li><li><p><strong>No GPS.</strong> The model would never &#8220;know&#8221; where I was.</p></li><li><p><strong>Offline only.</strong> It had to reason from text alone, like a cartographer dreaming the city from memory.</p></li></ul><p>That meant every piece of knowledge it used had to come from the dataset I built. Every turn, every intersection, every stretch of street was handcrafted data baked into tokens.</p><p>The idea fit perfectly with what we&#8217;d been building at <strong><a href="https://minibase.ai/?utm_source=substack&amp;utm_medium=post&amp;utm_campaign=chicago">Minibase</a></strong>, a place where anyone can train and deploy <em>small</em> models that actually do useful things. Models that fit on your laptop, your phone, even a Raspberry Pi. The platform made the experiment trivial to set up: upload a dataset, choose a base model, hit &#8220;Train.&#8221;</p><p>But conceptually, it was the opposite of trivial. This was about pushing the edge of what &#8220;small&#8221; can mean. Could a model under a gigabyte form a mental map of a city? Could it reason spatially? Could it, in essence, <em>imagine</em> Chicago?</p><p>I didn&#8217;t know. But I knew I wanted to find out, and the only way to test it was to give the model a city and see if it could find its way home.</p><h2>Building the Dataset &#8211; A Map in a Bottle</h2><p>The first real step was building a map that a model could understand. Not a visual one with tiles and GPS coordinates, but a purely textual one made of turns, intersections, and streets. 
I decided to start small, focusing on the center of Chicago. A radius of five kilometers around Willis Tower felt like a city-sized laboratory.</p><p>Using OpenStreetMap data, I extracted roughly forty-five thousand intersections and nearly one hundred thousand road segments. Every piece of that network became a data point: a starting point, an endpoint, and the shortest possible route between them. To calculate the paths, I used Dijkstra&#8217;s algorithm, which treats the road network as a weighted graph and always finds the shortest path between any two nodes.</p><p>When the script started running, my terminal turned into a stream of micro-stories. Line after line of text appeared as the dataset builder generated turn-by-turn routes. It looked something like:</p><p>&#8220;Start on South Canal Street. Turn left on West Adams. Continue to North Halsted. Turn right on West Chicago Avenue.&#8221;</p><p>At first it felt mechanical, but after a few minutes I started noticing the rhythm of the city encoded in those sentences. It was like watching language wrap itself around geography. The computer was writing down what a human might say if they had perfect recall of every street.</p><p>After a while, the count reached one hundred thousand examples. That number felt right for a first attempt. The dataset clocked in at just over eighty megabytes, small enough to upload in seconds. I pushed it to <a href="https://minibase.ai/?utm_source=substack&amp;utm_medium=post&amp;utm_campaign=chicago">Minibase</a> and watched the progress bar move across the screen.</p><p>When it hit 100 percent, the fine-tuning process began. On <a href="https://minibase.ai/?utm_source=substack&amp;utm_medium=post&amp;utm_campaign=chicago">Minibase</a>, training a model feels a bit like watching time-lapse footage of a thought forming. Every step of optimization looks like neurons wiring themselves together. 
You can almost imagine the model starting to picture the city in its own compressed way.</p><p>It was a strange feeling: standing in front of a terminal while a machine I built was, in its own way, learning Chicago.</p><h2>The First Signs of Intelligence</h2><p>The first time I tested the model, I expected nonsense. I figured it would mix up streets or invent random directions. Instead, it produced something oddly close to plausible. It told me to start on South Canal Street, take a left on Jackson, and continue north on Wacker. The problem was that it wanted me to circle back to the same point two steps later. It was as if it knew the roads existed but not how they connected.</p><p>I ran a few more tests, each time with different starting points. Some routes were fine until the very end, then veered into absurd territory like &#8220;Turn left on the Chicago River.&#8221; Others missed a key turn or added an unnecessary loop that would send me around the block twice. Still, there was something interesting happening. The instructions sounded human. The mistakes were not random; they were structured, like the model was thinking in outlines but filling in details wrong.</p><p>After a dozen runs, I noticed something that made me stop. When I asked for routes between intersections it had likely seen before, it started linking the big roads correctly. Canal, Monroe, Wacker, Michigan. The model was no longer just spitting out disconnected street names. It was beginning to understand relationships. It could identify major arteries and use them to bridge neighborhoods.</p><p>That was the glimmer moment. Watching it correctly navigate a route from Union Station to Millennium Park felt almost eerie. The directions were not perfect, but they were logically consistent. It seemed to know which roads led where, as if it had formed a kind of skeletal map of downtown inside its parameters.</p><p>It hit me then that the model was not just memorizing training data. 
It was generalizing. From a few hundred thousand lines of text, it had developed an internal sense of how the city fits together. It was crude and incomplete, but it was real.</p><p>Even small models, given the right structure, can begin to see patterns in space. They can build a primitive mental map of the world, one intersection at a time.</p><h2>The Breakthrough &#8211; 500,000 Examples</h2><p>After those first tests, I faced a decision. The model clearly had potential, but it was rough around the edges. It could connect major roads, yet it still tripped over smaller routes. Sometimes it froze mid-answer or repeated entire phrases. The temptation to leave it as a quirky proof of concept was strong. But a part of me wondered what would happen if I pushed it further. Was 100,000 examples enough for it to truly see the city?</p><p>I decided to find out.</p><p>I went back to the dataset builder, increased the sample size to 500,000, and hit enter. The script spun up again, flooding the terminal with new routes. It felt alive. South Canal, North Halsted, West Division, North Broadway. The city&#8217;s vocabulary pouring into the screen like rainfall. I let it run overnight and woke up to a folder five times the size of the original.</p><p>Uploading the larger dataset to <a href="https://minibase.ai/?utm_source=substack&amp;utm_medium=post&amp;utm_campaign=chicago">Minibase</a> felt like sending a challenge into orbit. The training run this time took longer, hours instead of minutes. The progress bar crept forward in quiet determination. I kept checking in, sipping coffee, watching loss values drop as if the model were slowly carving the map into its own internal geometry.</p><p>When it finally finished, I loaded up a test. The first question I asked was simple: &#8220;How do I get from Navy Pier to Union Station?&#8221;</p><p>The response came back almost instantly.</p><p>&#8220;Start at Navy Pier. Head west on Grand Avenue. Turn left on Lake Shore Drive. 
Turn right on E. Wacker Dr. Turn right on W Madison St. Destination will be on your right.&#8221;</p><p>I read it twice. Every step was correct. The directions were clean, confident, and eerily familiar. It had figured it out. Not just memorized the data, but truly learned the logic of the streets.</p><p>I tried a few more routes. Wrigley to the Art Institute. Chinatown to the West Loop. Hyde Park to Lincoln Park. Each time, the model spoke with clarity, weaving through the grid like a native.</p><p>There was a moment of quiet wonder as I stared at the output. A small, 300-megabyte model, trained on a half million examples, had absorbed the structure of an entire city. It understood Chicago in a way that was compact, elegant, and completely offline.</p><p>That was when I realized something important. The limits of small models are not defined by their size, but by how precisely we teach them to see. In that moment, the model was no longer a bundle of weights. It was a tiny, self-contained map of Chicago, alive in its own way.</p><h2>Taking It to the Streets</h2><p>At some point, testing on a laptop stopped being enough. I needed to know if the model could actually guide me through the real city, not just print directions in a terminal. So I decided to take it outside.</p><p>It was a clear morning in Chicago, the kind that feels awake before you are. I loaded the model onto my phone, switched to airplane mode, and stood near the edge of Millennium Park. Tourists were already crowding around The Bean, the smell of coffee drifted from Michigan Avenue, and the wind coming off the lake had that sharp bite that wakes you up instantly.</p><p>I opened the app and typed, &#8220;How do I get from Millennium Park to Wrigley Field?&#8221;</p><p>The model thought for a few seconds, then produced a clean, confident plan:</p><p>&#8220;Start on East Randolph Street. Turn right on Michigan Avenue. Turn right on E Ontario St. Turn right on N Rush St for 0.7 miles. 
Continue onto N State Pkwy. Turn left onto W Schiller St. Turn right onto N Clark St for 2.9 miles. Destination will be on your right.&#8221;</p><p>No internet. No GPS. Just text from a small model running locally. I smiled, slipped the phone into my pocket, and started walking.</p><p>As I moved through downtown, I followed its instructions turn by turn. The sounds of the city layered themselves over the experience: the clatter of the elevated train overhead, the echo of footsteps under the tracks, the short bursts of traffic lights changing rhythm. Each turn became a test case. Each intersection was a question: <em>Does it know where I am now?</em></p><p>By the time I reached Lakeview, the directions had lined up almost exactly with reality. Addison Street appeared ahead of me, traffic thickening near the stadium. The red Wrigley marquee came into view, glowing in the late afternoon light.</p><p>I stopped and looked at my phone. The model&#8217;s last line was simple: &#8220;You have arrived.&#8221;</p><p>It hit me then that it had not just memorized sentences from a dataset. It had built a functional understanding of space. 
It could reason about a city it had never truly seen, using nothing but words and structure.</p><p>That was the moment the experiment stopped feeling like code and started feeling like possibility.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!OheJ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdea53a92-0544-4d2f-8e00-7a42803588c8_1992x1480.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!OheJ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdea53a92-0544-4d2f-8e00-7a42803588c8_1992x1480.png 424w, https://substackcdn.com/image/fetch/$s_!OheJ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdea53a92-0544-4d2f-8e00-7a42803588c8_1992x1480.png 848w, https://substackcdn.com/image/fetch/$s_!OheJ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdea53a92-0544-4d2f-8e00-7a42803588c8_1992x1480.png 1272w, https://substackcdn.com/image/fetch/$s_!OheJ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdea53a92-0544-4d2f-8e00-7a42803588c8_1992x1480.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!OheJ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdea53a92-0544-4d2f-8e00-7a42803588c8_1992x1480.png" width="1456" height="1082" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/dea53a92-0544-4d2f-8e00-7a42803588c8_1992x1480.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1082,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1449320,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://blog.minibase.ai/i/176275476?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdea53a92-0544-4d2f-8e00-7a42803588c8_1992x1480.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!OheJ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdea53a92-0544-4d2f-8e00-7a42803588c8_1992x1480.png 424w, https://substackcdn.com/image/fetch/$s_!OheJ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdea53a92-0544-4d2f-8e00-7a42803588c8_1992x1480.png 848w, https://substackcdn.com/image/fetch/$s_!OheJ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdea53a92-0544-4d2f-8e00-7a42803588c8_1992x1480.png 1272w, https://substackcdn.com/image/fetch/$s_!OheJ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdea53a92-0544-4d2f-8e00-7a42803588c8_1992x1480.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h2>Lessons from a Small Model</h2><p>By the time I finished the experiment, the lesson was obvious. Size is not the only path to intelligence. What mattered more was focus. A small model with the right data can think clearly within its domain, sometimes even better than a giant model that tries to think about everything at once.</p><p>This project showed that when the data is clean, specific, and meaningful, even a few hundred megabytes of parameters can learn something profound. The Chicago navigation model never had the power of a foundation model, yet it reasoned like one inside its limited world. It understood the structure of a city, how roads connected, and how to describe that logic in human language.</p><p>Specialized models have a quiet advantage. They are narrow but deep. They do one thing, and they do it well. 
By training a model that only needed to know about Chicago streets, I gave it permission to ignore everything else. It did not have to explain world history or write poems. It just had to find its way from one point to another, and that focus made it powerful.</p><p>The process of getting there was surprisingly simple thanks to <a href="https://minibase.ai/?utm_source=substack&amp;utm_medium=post&amp;utm_campaign=chicago">Minibase</a>. I uploaded the dataset, selected the small base model, and started the fine-tuning run. No fleet of GPUs. No complicated infrastructure. Just a straightforward training pipeline that handled the heavy lifting. Within a few hours, the model was ready for testing. A few more clicks, and it was deployed locally.</p><p>The entire workflow took less time than a typical cloud deployment, yet the results felt almost magical. It proved that the barrier to entry for high-quality AI work is falling fast. With the right tools, anyone can build and test their own specialized models, even on a laptop.</p><p>The Chicago experiment was a reminder that progress does not always come from scaling up. Sometimes it comes from scaling down and focusing sharply enough that the model begins to see the world in detail.</p><h2>Where This Could Go</h2><p>Walking back from Wrigley, I kept thinking about how strange it felt to be guided by something that small. A few hundred megabytes of math had just navigated one of the busiest cities in America, all without touching the internet. That thought opened a door in my head. If a compact model could learn Chicago, what else could it learn?</p><p>It is easy to imagine where this might lead. Picture hikers exploring remote mountain trails with tiny models trained on local terrain, able to give safe routes without ever needing a signal. Picture delivery drones mapping entire neighborhoods through lightweight onboard AIs that never rely on the cloud. 
Picture small robots navigating warehouses or farms using models that understand their surroundings completely offline.</p><p>Cities themselves could become learnable objects. Each one could have its own distilled model, an AI that understands its layout, traffic rhythms, and unique quirks. The idea of &#8220;city models&#8221; living locally on devices feels almost poetic. They would not just reflect geography, but personality. A compact Chicago model would feel different from a compact Tokyo model, both shaped by the streets they know.</p><p>All of this came from a simple experiment with a dataset and a hunch. It made me realize that intelligence does not always require scale or distance. It can live close to the ground, in small focused systems that learn their world deeply.</p><p>Maybe the future of AI is not about models getting bigger. Maybe it is about them getting closer.</p><h2>The City That a Model Remembered</h2><p>I ended the day back where it started, standing under the shadow of Willis Tower as the sun dropped behind the skyline. The air was cooler now, the rush-hour crowd spilling in waves across Wacker Drive. I pulled my phone from my pocket and opened the model one last time. The terminal screen blinked quietly, waiting for another question.</p><p>I asked it for directions back to where I already stood. A small, circular test. The model paused, then returned a simple message:</p><p>&#8220;You are already here.&#8221;</p><div><hr></div><p><a href="https://x.com/minibase_ai">Follow us on Twitter</a></p><p><a href="https://discord.com/invite/BrJn4D2Guh">Join our Discord</a></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://blog.minibase.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading Minibase! 
Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[A small, low-latency Spanish-to-English translator model]]></title><description><![CDATA[Available today on HuggingFace or Minibase.]]></description><link>https://blog.minibase.ai/p/spanish-translator</link><guid isPermaLink="false">https://blog.minibase.ai/p/spanish-translator</guid><dc:creator><![CDATA[Minibase.ai]]></dc:creator><pubDate>Thu, 09 Oct 2025 20:35:18 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!kuOY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F478f878e-73f6-42f7-b094-f9f42d838636_1024x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!kuOY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F478f878e-73f6-42f7-b094-f9f42d838636_1024x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!kuOY!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F478f878e-73f6-42f7-b094-f9f42d838636_1024x1024.png 424w, https://substackcdn.com/image/fetch/$s_!kuOY!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F478f878e-73f6-42f7-b094-f9f42d838636_1024x1024.png 848w, 
https://substackcdn.com/image/fetch/$s_!kuOY!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F478f878e-73f6-42f7-b094-f9f42d838636_1024x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!kuOY!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F478f878e-73f6-42f7-b094-f9f42d838636_1024x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!kuOY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F478f878e-73f6-42f7-b094-f9f42d838636_1024x1024.png" width="1024" height="1024" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/478f878e-73f6-42f7-b094-f9f42d838636_1024x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1024,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1461969,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://blog.minibase.ai/i/175748412?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F478f878e-73f6-42f7-b094-f9f42d838636_1024x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!kuOY!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F478f878e-73f6-42f7-b094-f9f42d838636_1024x1024.png 424w, 
https://substackcdn.com/image/fetch/$s_!kuOY!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F478f878e-73f6-42f7-b094-f9f42d838636_1024x1024.png 848w, https://substackcdn.com/image/fetch/$s_!kuOY!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F478f878e-73f6-42f7-b094-f9f42d838636_1024x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!kuOY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F478f878e-73f6-42f7-b094-f9f42d838636_1024x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" 
y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p></p><p><strong>TL;DR</strong>: We&#8217;re releasing a small, high-quality <strong>Spanish-to-English translation model</strong> that runs locally with low latency and strong fluency scores. The model is available on <strong><a href="https://huggingface.co/Minibase/Spanish-to-English-Translation-Standard">HuggingFace</a></strong> or <strong><a href="https://minibase.ai/wiki/Special:Marketplace">Minibase.ai</a></strong> for fine-tuning and API calls. You can test the model immediately, entirely for free, on the minibase.ai website.</p><div><hr></div><p>Most translation systems are large and cloud-based. They&#8217;re accurate but slow and can&#8217;t run locally. Today we&#8217;re releasing a lightweight (386 MB) Spanish-to-English translation model that runs offline, on your own devices, and performs nearly as well as many larger commercial systems. 
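</p><p>A minimal sketch of how you might measure average inference latency yourself. Here <code>translate</code> is a hypothetical stand-in for whatever local inference call you are timing, not the Minibase API:</p>

```python
import time
import statistics

def translate(text):
    # Hypothetical placeholder for a local model call;
    # swap in your actual inference function.
    return "translated: " + text

def mean_latency_ms(fn, prompt, runs=20):
    fn(prompt)  # one warm-up call so startup cost doesn't skew the average
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn(prompt)
        samples.append((time.perf_counter() - start) * 1000.0)
    return statistics.mean(samples)

print(f"{mean_latency_ms(translate, 'La inteligencia artificial está en todas partes.'):.2f} ms")
```

<p>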
Its latency is only 111 ms, meaning you get translations nearly instantaneously.</p><p>We trained this model in less than an hour, with zero code, using <strong><a href="https://minibase.ai/wiki/Special:Marketplace">Minibase</a></strong>.</p><p>We measured translation quality using <strong>METEOR</strong>, <strong>chrF</strong>, and <strong>Semantic Similarity</strong>.</p><ul><li><p><strong>METEOR:</strong> 79.7 &#8212; measures word-level similarity, considering synonyms and order.</p></li><li><p><strong>chrF:</strong> 72.7 &#8212; measures accuracy at the character level, good for Spanish morphology.</p></li><li><p><strong>Semantic Similarity:</strong> 70.9 &#8212; checks that meaning is preserved.</p></li><li><p>Latency averages 111 ms, and the model is only 386 MB in size.</p></li></ul><div><hr></div><h3><strong>Examples</strong></h3><p><strong>Input (Spanish):</strong></p><blockquote><p><em><strong>La inteligencia artificial est&#225; revolucionando el mundo de la tecnolog&#237;a.<br>Cada d&#237;a vemos avances incre&#237;bles en el procesamiento del lenguaje natural.</strong></em></p></blockquote><p><strong>Output (English):</strong></p><blockquote><p><em><strong>Artificial intelligence is revolutionizing the world of technology.<br>Every day we see incredible advances in natural language processing.</strong></em></p></blockquote><div><hr></div><p><strong>Input (Spanish):</strong></p><blockquote><p><em><strong>El gobierno anunci&#243; nuevas medidas econ&#243;micas para enfrentar la inflaci&#243;n.<br>Se espera que los precios comiencen a estabilizarse en los pr&#243;ximos meses.</strong></em></p></blockquote><p><strong>Output (English):</strong></p><blockquote><p><em><strong>The government announced new economic measures to address inflation.<br>Prices are expected to begin stabilizing in the coming months.</strong></em></p></blockquote><div><hr></div><p><strong>Input (Spanish):</strong></p><blockquote><p><em><strong>Messi fue nombrado mejor jugador del 
torneo despu&#233;s de marcar tres goles en la final.</strong></em></p></blockquote><p><strong>Output (English):</strong></p><blockquote><p><em><strong>Messi was named the best player of the tournament after scoring three goals in the final.</strong></em></p></blockquote><p>Like all Minibase models, we&#8217;re releasing this one under <strong>Apache 2.0</strong>. You can download it, fine-tune it, or deploy it directly from Minibase Cloud. To share results or feedback, join the <strong><a href="https://discord.com/invite/BrJn4D2Guh">Minibase Discord</a></strong>.</p>]]></content:encoded></item><item><title><![CDATA[Two tiny models for Named Entity Recognition]]></title><description><![CDATA[Our latest models are available now on HuggingFace and Minibase.]]></description><link>https://blog.minibase.ai/p/ner</link><guid isPermaLink="false">https://blog.minibase.ai/p/ner</guid><dc:creator><![CDATA[Minibase.ai]]></dc:creator><pubDate>Wed, 08 Oct 2025 20:45:10 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!4sT-!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F91af213e-a660-48d5-8755-443906d05cb8_1024x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!4sT-!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F91af213e-a660-48d5-8755-443906d05cb8_1024x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!4sT-!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F91af213e-a660-48d5-8755-443906d05cb8_1024x1024.png 424w, 
https://substackcdn.com/image/fetch/$s_!4sT-!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F91af213e-a660-48d5-8755-443906d05cb8_1024x1024.png 848w, https://substackcdn.com/image/fetch/$s_!4sT-!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F91af213e-a660-48d5-8755-443906d05cb8_1024x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!4sT-!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F91af213e-a660-48d5-8755-443906d05cb8_1024x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!4sT-!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F91af213e-a660-48d5-8755-443906d05cb8_1024x1024.png" width="1024" height="1024" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/91af213e-a660-48d5-8755-443906d05cb8_1024x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1024,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1701368,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://blog.minibase.ai/i/175658218?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F91af213e-a660-48d5-8755-443906d05cb8_1024x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!4sT-!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F91af213e-a660-48d5-8755-443906d05cb8_1024x1024.png 424w, https://substackcdn.com/image/fetch/$s_!4sT-!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F91af213e-a660-48d5-8755-443906d05cb8_1024x1024.png 848w, https://substackcdn.com/image/fetch/$s_!4sT-!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F91af213e-a660-48d5-8755-443906d05cb8_1024x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!4sT-!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F91af213e-a660-48d5-8755-443906d05cb8_1024x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p></p><p><strong>TL;DR</strong>: We&#8217;re releasing compact models for <strong>Named Entity Recognition (NER)</strong>. These models run locally on a CPU and quickly identify people, organizations, and locations; the Standard version does so with near-perfect recall. There are <strong>Standard</strong> and <strong>Small</strong> versions.</p><p>Both models are available on HuggingFace (<strong><a href="https://huggingface.co/Minibase/NER-Standard">Standard</a></strong> &amp; <strong><a href="https://huggingface.co/Minibase/NER-Small">Small</a></strong>) or <strong><a href="https://minibase.ai/wiki/Special:Marketplace">Minibase.ai</a></strong> for fine-tuning or API calls.</p><div><hr></div><p>NER models extract names and places from text. They&#8217;re used in search engines, finance systems, and research pipelines to turn unstructured text into structured data. Most existing models for this task, however, are large or slow, so we trained small models (both under 400 MB) that run locally and output structured JSON. Each model took about an hour to fine-tune on <strong><a href="https://minibase.ai">Minibase</a></strong>, with zero code.</p><p>We evaluated each model by precision, recall, and F1 score &#8212; how many entities the model finds and how accurate they are. 
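</p><p>As a rough illustration of how these numbers fall out of comparing predicted entities against gold labels (a simplified toy, not our exact evaluation harness):</p>

```python
def entity_prf(predicted, gold):
    """Precision/recall/F1 over (type, span) entity sets."""
    pred, ref = set(predicted), set(gold)
    tp = len(pred & ref)  # entities that are both predicted and correct
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(ref) if ref else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Toy example: two of three predictions match the gold annotations.
p, r, f = entity_prf(
    predicted=[("PER", "Satya Nadella"), ("ORG", "Microsoft"), ("LOC", "Seattle")],
    gold=[("PER", "Satya Nadella"), ("ORG", "Microsoft"), ("LOC", "London")],
)
print(p, r, f)  # all three are 2/3 for this toy example
```

<p>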
For the Standard model, which is 369 MB in size, those metrics are:</p><ul><li><p><strong>Precision:</strong> 91.5%</p></li><li><p><strong>Recall:</strong> 100%</p></li><li><p><strong>F1 Score:</strong> 95.1%</p></li><li><p><strong>Latency:</strong> 323 ms</p></li></ul><p>For the Small model, which is 143 MB, those metrics are:</p><ul><li><p><strong>Precision:</strong> 63%</p></li><li><p><strong>Recall:</strong> 34.3%</p></li><li><p><strong>F1 Score:</strong> 43.5%</p></li><li><p><strong>Latency:</strong> 76.6 ms</p></li></ul><p>The Standard model, clearly, is much more accurate. We recommend using it over the Small variant.</p><div><hr></div><h3><strong>Examples</strong></h3><p><strong>Input:</strong></p><blockquote><p><em><strong>Microsoft Corporation announced that Satya Nadella will visit London next week.</strong></em></p></blockquote><p><strong>Output:</strong></p><blockquote><p><code>{"PER": ["Satya Nadella"], "ORG": ["Microsoft Corporation"], "LOC": ["London"]}</code></p></blockquote><div><hr></div><p><strong>Input:</strong></p><blockquote><p><em><strong>The University of Cambridge is located in the United Kingdom and was founded by King Henry III.</strong></em></p></blockquote><p><strong>Output:</strong></p><blockquote><p><code>{"PER": ["King Henry III"], "ORG": ["University of Cambridge"], "LOC": ["United Kingdom"]}</code></p></blockquote><div><hr></div><p><strong>Input:</strong></p><blockquote><p><em><strong>John Smith works at Google in New York and uses Python programming language.</strong></em></p></blockquote><p><strong>Output:</strong></p><blockquote><p><code>{"PER": ["John Smith"], "ORG": ["Google"], "LOC": ["New York"], "MISC": ["Python"]}</code></p></blockquote><p>Like all Minibase models, these are being 
released under <strong>Apache 2.0</strong>. You can download either model and use it for free. To share results or feedback, join the <strong><a href="https://discord.com/invite/BrJn4D2Guh">Minibase Discord</a></strong>.</p>]]></content:encoded></item><item><title><![CDATA[A small model for text previews]]></title><description><![CDATA[TL;DR: We&#8217;re releasing a small text-to-text model that generates short content previews &#8212; like inbox snippets, news alerts, or notification headlines.]]></description><link>https://blog.minibase.ai/p/text-previews</link><guid isPermaLink="false">https://blog.minibase.ai/p/text-previews</guid><dc:creator><![CDATA[Minibase.ai]]></dc:creator><pubDate>Tue, 07 Oct 2025 16:34:21 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!pxEQ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1277801-3180-4782-a66d-549e50e69865_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!pxEQ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1277801-3180-4782-a66d-549e50e69865_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!pxEQ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1277801-3180-4782-a66d-549e50e69865_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!pxEQ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1277801-3180-4782-a66d-549e50e69865_1536x1024.png 848w, 
https://substackcdn.com/image/fetch/$s_!pxEQ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1277801-3180-4782-a66d-549e50e69865_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!pxEQ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1277801-3180-4782-a66d-549e50e69865_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!pxEQ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1277801-3180-4782-a66d-549e50e69865_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d1277801-3180-4782-a66d-549e50e69865_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2193551,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://blog.minibase.ai/i/175543226?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1277801-3180-4782-a66d-549e50e69865_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!pxEQ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1277801-3180-4782-a66d-549e50e69865_1536x1024.png 424w, 
https://substackcdn.com/image/fetch/$s_!pxEQ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1277801-3180-4782-a66d-549e50e69865_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!pxEQ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1277801-3180-4782-a66d-549e50e69865_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!pxEQ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1277801-3180-4782-a66d-549e50e69865_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" 
y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><blockquote><p>This model is available on <strong><a href="https://huggingface.co/Minibase/Content-Preview-Generator">HuggingFace</a></strong> or <strong><a href="https://minibase.ai/wiki/Special:MarketplaceModel/standard_base_summarize_text_standard_1758821093_d5125c82">Minibase.ai</a></strong> for fine-tuning or API calls.</p></blockquote><p>Most text-to-text summarizer models are designed to write entire paragraphs. But there is also a need for tiny text <strong>preview</strong> models, similar to those already being used to summarize Gmail subject lines, news alerts, and push notifications. We trained a lightweight model to do exactly that. It is small enough to run locally, fast enough for real-time use, and we trained it in less than an hour on Minibase.ai with no code.</p><p>We measured our model&#8217;s performance using a metric called <strong>ROUGE</strong>, which compares word overlap between a model&#8217;s output and human-written references.</p><ul><li><p><strong>ROUGE-1</strong> measures single-word matches.</p></li><li><p><strong>ROUGE-2</strong> measures two-word phrases.</p></li><li><p><strong>ROUGE-L</strong> measures longer sentence structures.</p></li></ul><p>On the CNN/DailyMail benchmark, our model scored:</p><ul><li><p><strong>ROUGE-1:</strong> 30.2</p></li><li><p><strong>ROUGE-2:</strong> 14.1</p></li><li><p><strong>ROUGE-L:</strong> 23.8</p></li></ul><p>Our model also has a measured <strong>compression ratio of 22%</strong>, which means its outputs are about one-fifth the length of its inputs. 
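For readers who want to sanity-check figures like these, here is a simplified sketch of how a unigram ROUGE score and a compression ratio can be computed. These helper functions are illustrative assumptions, not the official ROUGE implementation (which also applies stemming, ROUGE-2/L variants, and bootstrapped confidence intervals):

```python
def rouge1_f(candidate: str, reference: str) -> float:
    # Unigram-overlap F1: a simplified stand-in for the official ROUGE-1 scorer.
    cand, ref = candidate.lower().split(), reference.lower().split()
    overlap = sum(min(cand.count(w), ref.count(w)) for w in set(cand))
    if not overlap:
        return 0.0
    precision = overlap / len(cand)
    recall = overlap / len(ref)
    return 2 * precision * recall / (precision + recall)

def compression_ratio(summary: str, source: str) -> float:
    # Output length as a fraction of input length (the post reports ~22%).
    return len(summary.split()) / len(source.split())

print(round(rouge1_f("WHO declares global health emergency",
                     "WHO declares monkeypox a global health emergency"), 2))  # → 0.83
```

Benchmark suites like CNN/DailyMail report the same three ROUGE variants averaged over many reference summaries, so single-pair scores like the one above will vary more than the aggregate numbers quoted here.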
Its average latency is <strong>218 ms</strong>.</p><div><hr></div><h3><strong>Examples</strong></h3><p><strong>Input:</strong></p><blockquote><p><em><strong>The World Health Organization declared the monkeypox outbreak a global health emergency after cases rose sharply in Europe and the Americas.<br>More than 16,000 infections have been confirmed across 75 countries, and governments are rolling out vaccination programs.<br>Health officials emphasized that coordinated action will be crucial to contain the spread.</strong></em></p></blockquote><p><strong>Output:</strong></p><blockquote><p><em><strong>WHO declares global health emergency over surging monkeypox cases</strong></em></p></blockquote><div><hr></div><p><strong>Input:</strong></p><blockquote><p><em><strong>The United States announced new sanctions on Russian banks, defense firms, and energy companies following recent attacks in eastern Ukraine.<br>President Biden said the measures were designed to isolate key parts of Russia&#8217;s economy and pressure Moscow to end the conflict.<br>European allies are expected to impose similar restrictions later this week.</strong></em></p></blockquote><p><strong>Output:</strong></p><blockquote><p><em><strong>US imposes new sanctions targeting Russia&#8217;s economy amid Ukraine war</strong></em></p></blockquote><div><hr></div><p>Compared to summarizers like <strong>BART</strong> or <strong>Pegasus</strong>, the Minibase model is smaller and faster, but not more accurate. BART tends to produce longer summaries with higher ROUGE scores, for example. The trade-off here is that our model, at <strong>368 MB</strong>, runs on CPUs and still captures the key topic cleanly. Like all Minibase models, it&#8217;s released under an <strong>Apache 2.0 license</strong>.</p><p>You can download it, fine-tune it, or deploy it directly from Minibase Cloud. 
To learn more or share results, join us on the <strong><a href="https://discord.com/invite/BrJn4D2Guh">Minibase Discord</a></strong>.</p>]]></content:encoded></item><item><title><![CDATA[A tiny AI model for text de-identification.]]></title><description><![CDATA[Our latest model, DeId-Small, is available now on HuggingFace and Minibase.]]></description><link>https://blog.minibase.ai/p/a-tiny-ai-model-for-text-de-identification</link><guid isPermaLink="false">https://blog.minibase.ai/p/a-tiny-ai-model-for-text-de-identification</guid><dc:creator><![CDATA[Minibase.ai]]></dc:creator><pubDate>Thu, 25 Sep 2025 21:01:35 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!TNvR!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2cec5fdc-d842-491a-86b4-3f108ff0d50b_1024x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!TNvR!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2cec5fdc-d842-491a-86b4-3f108ff0d50b_1024x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!TNvR!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2cec5fdc-d842-491a-86b4-3f108ff0d50b_1024x1024.png 424w, https://substackcdn.com/image/fetch/$s_!TNvR!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2cec5fdc-d842-491a-86b4-3f108ff0d50b_1024x1024.png 848w, 
https://substackcdn.com/image/fetch/$s_!TNvR!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2cec5fdc-d842-491a-86b4-3f108ff0d50b_1024x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!TNvR!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2cec5fdc-d842-491a-86b4-3f108ff0d50b_1024x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!TNvR!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2cec5fdc-d842-491a-86b4-3f108ff0d50b_1024x1024.png" width="1024" height="1024" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2cec5fdc-d842-491a-86b4-3f108ff0d50b_1024x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1024,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1583261,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://blog.minibase.ai/i/174567518?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2cec5fdc-d842-491a-86b4-3f108ff0d50b_1024x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!TNvR!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2cec5fdc-d842-491a-86b4-3f108ff0d50b_1024x1024.png 424w, 
https://substackcdn.com/image/fetch/$s_!TNvR!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2cec5fdc-d842-491a-86b4-3f108ff0d50b_1024x1024.png 848w, https://substackcdn.com/image/fetch/$s_!TNvR!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2cec5fdc-d842-491a-86b4-3f108ff0d50b_1024x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!TNvR!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2cec5fdc-d842-491a-86b4-3f108ff0d50b_1024x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" 
y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><strong>TL;DR</strong>: We&#8217;re releasing a small text-to-text model that de-identifies personal information with strong performance. The model is available on <a href="https://huggingface.co/Minibase/DeId-Small">HuggingFace</a>, or can be accessed on <a href="https://minibase.ai/wiki/Special:MarketplaceModel/small_base_small_multilingual_pii_masking_1758675923_35e277fa">Minibase.ai</a> for immediate queries or API usage (no setup required).</p><div><hr></div><p>De-identification algorithms are used in lots of places. Hospitals use them to scrub patient names and dates from medical notes, for example, and lawyers do the same to redact client identities. It&#8217;s not a good idea to just <em>delete </em>sensitive data, either, because the surrounding context is often useful. The best tools for de-identification, then, ought to remove identifying details without stripping away any other words.</p><p>Most existing de-identifiers, though, are either rule-based (meaning they follow a &#8220;hardcoded&#8221; script) or large, domain-specific models. These models tend to either over-mask (removing too much) or under-mask (leaving sensitive information intact). A majority are also too big to deploy locally with low latency. </p><p>At Minibase, we decided to train a small model that is fast, runs locally &#8212; even from a browser &#8212; and works across many text domains and in any language. We trained this latest model, called DeId-Small, in less than one hour with zero code.</p><p>There are three key metrics for ranking de-identification models: how well they detect personal information, how completely they remove it, and how much of the original meaning they preserve. 
On our <a href="https://huggingface.co/Minibase/DeId-Small/blob/main/README.md">benchmarks</a>, DeId-Small achieved a 100% detection rate for texts containing personal information, completely sanitized about 65% of them, and retained over 80% of the original meaning. The model is only ~136 MB in size and runs in under half a second per request.</p><p>The inputs and outputs look like this:</p><blockquote><p>IN: Patient John Smith, born 1985-03-15, lives at 123 Main St.<br>OUT: Patient [FIRSTNAME_1] [LASTNAME_1], born [DATE_1], lives at [STREET_1].</p></blockquote><blockquote><p>IN: My friend David Wilson is getting married June 15, 2025 in Napa. Reach him at david.wilson@gmail.com.<br>OUT: My friend [FIRSTNAME_1] [LASTNAME_1] is getting married [DATE_1] in [CITY_1]. Reach him at [EMAIL_1].</p></blockquote><p>Our results hold up well compared to other approaches on HuggingFace. Rule-based systems tend to break when formats change, and large multilingual models like mT0-XL are strong but weigh several gigabytes and are slow. We think DeId-Small strikes the right balance: it is open-source, compact, and fast. 
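To make the numbered-placeholder output format concrete ([FIRSTNAME_1], [DATE_1], [EMAIL_1], and so on), here is a hypothetical rule-based sketch that produces the same shape of output. DeId-Small itself is a trained model, not a set of regexes; the patterns below are naive illustrative assumptions (they would, for instance, mishandle trailing punctuation after an email address):

```python
import re

# Hypothetical patterns mimicking the placeholder format; not the model's logic.
PATTERNS = {
    "EMAIL": r"[\w.+-]+@[\w-]+\.[\w.]+",
    "DATE": r"\d{4}-\d{2}-\d{2}",
}

def mask(text: str) -> str:
    counts = {}  # per-call counters so placeholders are numbered [LABEL_1], [LABEL_2], ...
    for label, pattern in PATTERNS.items():
        def repl(m, label=label):
            counts[label] = counts.get(label, 0) + 1
            return f"[{label}_{counts[label]}]"
        text = re.sub(pattern, repl, text)
    return text

print(mask("Born 1985-03-15, reach me at jane.doe@example.com"))
# → Born [DATE_1], reach me at [EMAIL_1]
```

The numbering matters in practice: repeated mentions of the same entity type get distinct indices, which keeps the de-identified text readable while staying reversible against a lookup table if one is kept.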
It is released under an Apache 2.0 license.</p><p>If you have any questions, or want to contribute datasets and ideas, come join us on the <a href="https://discord.com/invite/BrJn4D2Guh">Minibase Discord</a>.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://blog.minibase.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://blog.minibase.ai/subscribe?"><span>Subscribe now</span></a></p>]]></content:encoded></item><item><title><![CDATA[Minibase releases two small detoxifying models]]></title><description><![CDATA[Lightweight and extremely low latency text-to-text detoxifier models.]]></description><link>https://blog.minibase.ai/p/detoxify</link><guid isPermaLink="false">https://blog.minibase.ai/p/detoxify</guid><dc:creator><![CDATA[Minibase.ai]]></dc:creator><pubDate>Mon, 22 Sep 2025 17:11:15 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!178U!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1150f0c8-2774-41c7-a648-e9b1f3fbe4e4_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!178U!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1150f0c8-2774-41c7-a648-e9b1f3fbe4e4_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!178U!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1150f0c8-2774-41c7-a648-e9b1f3fbe4e4_1536x1024.png 424w, 
https://substackcdn.com/image/fetch/$s_!178U!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1150f0c8-2774-41c7-a648-e9b1f3fbe4e4_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!178U!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1150f0c8-2774-41c7-a648-e9b1f3fbe4e4_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!178U!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1150f0c8-2774-41c7-a648-e9b1f3fbe4e4_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!178U!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1150f0c8-2774-41c7-a648-e9b1f3fbe4e4_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/1150f0c8-2774-41c7-a648-e9b1f3fbe4e4_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:3243441,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://blog.minibase.ai/i/174265896?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1150f0c8-2774-41c7-a648-e9b1f3fbe4e4_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!178U!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1150f0c8-2774-41c7-a648-e9b1f3fbe4e4_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!178U!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1150f0c8-2774-41c7-a648-e9b1f3fbe4e4_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!178U!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1150f0c8-2774-41c7-a648-e9b1f3fbe4e4_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!178U!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1150f0c8-2774-41c7-a648-e9b1f3fbe4e4_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><em>TL;DR: We&#8217;re releasing two small text-to-text models that detoxify language with strong performance. Both models are available on HuggingFace today (<a href="https://huggingface.co/Minibase/Detoxify-Language-Small">Small</a> &amp; <a href="https://huggingface.co/Minibase/Detoxify-Language-Medium">Standard</a>), or can be accessed directly on <a href="http://minibase.ai/">Minibase.ai</a> </em>(links to <a href="https://minibase.ai/wiki/Special:MarketplaceModel/small_base_detoxify_language_1757033760_36c0aa26">Small</a> &amp; <a href="https://minibase.ai/wiki/Special:MarketplaceModel/standard_base_detoxify_standard_model_1757096951_178a6600">Standard</a> models) <em>for further fine-tuning or API calls.</em></p><p>The internet is a toxic place. People say mean things all the time, and their words can be hurtful :( Companies dealing with this problem can&#8217;t just delete comments, though, because sometimes the underlying sentiments are worth keeping. The real challenge is removing mean language without stripping away the meaning.</p><p>Detoxifier models are already used in lots of places. Twitch uses them to moderate live chats. Discord bots scan text in real time and rewrite or suppress toxic language. YouTube flags offensive comments, while Reddit uses Perspective to catch toxicity in specific subforums.</p><p>Most existing detoxifiers, though, either go too far, rewriting the sentence into something bland, or not far enough, letting hateful text slip through. A majority of these models are also quite large, which means they cannot be deployed locally with extremely low latencies. 
Therefore, we decided to train two small detoxifying models that are extremely fast and can be run locally &#8212; even from your browser. Both models were trained in a couple of hours on the <a href="https://minibase.ai/">Minibase platform</a>, which is currently free to use while in beta.</p><p>There are three key metrics for ranking detoxifying models: how much toxicity they remove, how well they preserve the original meaning of a text, and whether the rewritten text still sounds natural. Using the <a href="https://github.com/s-nlp/paradetox">ParaDetox dataset</a> as a benchmark, our Detoxify-Medium cut toxicity by about 91% while keeping more than half the original meaning and scoring 93% on fluency. Detoxify-Small reduced toxicity by about half, but has a latency under 70 milliseconds and is only about 140 MB in size.</p><p>Both models successfully rewrite text:<br><em>&#8220;This is fucking awesome!&#8221; &#8212;&gt; &#8220;This is really awesome!&#8221;</em></p><p><em>&#8220;You stupid idiot, get out of my way!&#8221; &#8212;&gt; &#8220;You silly person, please move aside!&#8221;</em></p><p>These results hold up well compared to other models on HuggingFace, too. BART-based detox models are strong on meaning preservation, but they come in at half a gigabyte and don&#8217;t run easily on small machines. Multilingual models like mBART or mT0-XL do better in some languages, but they are several gigabytes and slow. Both Minibase models are being released under an Apache 2.0 license.</p><p>If you have any questions, come join us on the <a href="https://discord.com/invite/BrJn4D2Guh">Minibase Discord</a>.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://blog.minibase.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Join Minibase. 
Create your own models.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[Minibase v0.2: Feature Updates]]></title><description><![CDATA[Synthetic data generation, new AI models, and a marketplace.]]></description><link>https://blog.minibase.ai/p/minibase-v02-feature-updates</link><guid isPermaLink="false">https://blog.minibase.ai/p/minibase-v02-feature-updates</guid><dc:creator><![CDATA[Minibase.ai]]></dc:creator><pubDate>Thu, 11 Sep 2025 17:45:42 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Wk53!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F25403da8-81d6-4915-8318-cdd303bd4e17_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Wk53!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F25403da8-81d6-4915-8318-cdd303bd4e17_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Wk53!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F25403da8-81d6-4915-8318-cdd303bd4e17_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!Wk53!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F25403da8-81d6-4915-8318-cdd303bd4e17_1536x1024.png 848w, 
https://substackcdn.com/image/fetch/$s_!Wk53!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F25403da8-81d6-4915-8318-cdd303bd4e17_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!Wk53!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F25403da8-81d6-4915-8318-cdd303bd4e17_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Wk53!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F25403da8-81d6-4915-8318-cdd303bd4e17_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/25403da8-81d6-4915-8318-cdd303bd4e17_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2822082,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://blog.minibase.ai/i/172294707?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F25403da8-81d6-4915-8318-cdd303bd4e17_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Wk53!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F25403da8-81d6-4915-8318-cdd303bd4e17_1536x1024.png 424w, 
https://substackcdn.com/image/fetch/$s_!Wk53!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F25403da8-81d6-4915-8318-cdd303bd4e17_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!Wk53!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F25403da8-81d6-4915-8318-cdd303bd4e17_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!Wk53!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F25403da8-81d6-4915-8318-cdd303bd4e17_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" 
y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Today we released a bunch of features for Minibase, including a public marketplace, improved training insights, and several tools to make your workflow smoother. Remember that Minibase is in <a href="https://minibase.ai">public beta</a> and is currently free to use. <a href="https://discord.com/invite/BrJn4D2Guh">Join our Discord</a> community to learn more.</p><h3>Model &amp; Dataset Marketplace</h3><p>We&#8217;ve launched our new <strong>Model &amp; Dataset Marketplace</strong>, a community hub where you can share models and datasets. Users can set the privacy for each item (Individual, Team, or Public). Public models will appear in the marketplace, where others can view them and add them to their own workspace. Datasets remain secure&#8212;others cannot download your raw data locally.</p><h3>Redesigned Model Detail Pages</h3><p>Model and dataset detail pages have been redesigned. Whether in the marketplace or under your &#8220;Datasets&#8221; and &#8220;Models&#8221; tabs, you can now see much more information at a glance.</p><h3>Training Performance Metrics</h3><p>After training a model on Minibase, you&#8217;ll now receive <strong>automatic evals and tests by email</strong>. These metrics give you deeper insight into the quality of your model, and we&#8217;ll be expanding them over time.</p><h3>Post-Training Editing</h3><p>You can now <strong>edit your model name and description after training</strong>, and the same goes for datasets. This makes it easier to keep your workspace organized as your projects evolve.</p><h3>Synthetic Data Generation</h3><p>We&#8217;ve added a tool for <strong>synthetic data generation</strong>. You can describe the problem you&#8217;re solving, provide seed examples, and set a target number of datapoints. 
Minibase will generate examples continuously until your desired dataset size is reached.</p><h3>Continue Training</h3><p>You can now <strong>continue training a model after the initial run</strong>, even if it has already been deployed and tested. This makes it easier to fine-tune using additional datasets without starting from scratch.</p><h3>Reliability Improvements</h3><p>We&#8217;ve fixed several issues related to dataset uploads, training, inference, and more, making Minibase more stable overall.</p><h3>Better Base Models</h3><p>Finally, we&#8217;ve <strong>improved the quality of our base models</strong>, so you&#8217;ll see higher-quality outputs across all training workflows.</p><h3>Full HF Fidelity</h3><p>We&#8217;ve added <strong>full HF fidelity for all models</strong>, including chat models. This means you can now deploy models in full fidelity without compromise. We&#8217;ve also updated the inference server to better handle large files, making deployments smoother and more reliable.</p><h3>Tooltips and UI Improvements</h3><p>We&#8217;ve started rolling out <strong>tooltips</strong>, beginning with the model detail view. Many of these tooltips are linked to our Help Center (help.minibase.ai) articles, which we continue to expand on a weekly basis. Alongside this, we made numerous small improvements across the product&#8212;updated copy, clarified text, and polished flows&#8212;to make the interface easier to use.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://blog.minibase.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://blog.minibase.ai/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><p>We&#8217;ve been hard at work building new features. And we think this release is a big step toward making Minibase faster, more flexible, and more collaborative. 
We can&#8217;t wait to see what you build.</p><p>&#8212; <em>The Minibase Engineers</em></p>]]></content:encoded></item><item><title><![CDATA[New Feature: Synthetic Data Generator]]></title><description><![CDATA[You can now generate datasets directly from your workspace in Minibase.]]></description><link>https://blog.minibase.ai/p/synthetic-data</link><guid isPermaLink="false">https://blog.minibase.ai/p/synthetic-data</guid><dc:creator><![CDATA[Minibase.ai]]></dc:creator><pubDate>Tue, 09 Sep 2025 17:32:21 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/ddd3d738-1ae7-4c7a-9440-4fe899c241c9_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!EPTb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8d99fb9a-b70b-4167-bf9a-b4aa29e73080_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!EPTb!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8d99fb9a-b70b-4167-bf9a-b4aa29e73080_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!EPTb!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8d99fb9a-b70b-4167-bf9a-b4aa29e73080_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!EPTb!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8d99fb9a-b70b-4167-bf9a-b4aa29e73080_1536x1024.png 1272w, 
https://substackcdn.com/image/fetch/$s_!EPTb!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8d99fb9a-b70b-4167-bf9a-b4aa29e73080_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!EPTb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8d99fb9a-b70b-4167-bf9a-b4aa29e73080_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/8d99fb9a-b70b-4167-bf9a-b4aa29e73080_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2262486,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://blog.minibase.ai/i/173200929?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8d99fb9a-b70b-4167-bf9a-b4aa29e73080_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!EPTb!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8d99fb9a-b70b-4167-bf9a-b4aa29e73080_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!EPTb!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8d99fb9a-b70b-4167-bf9a-b4aa29e73080_1536x1024.png 848w, 
https://substackcdn.com/image/fetch/$s_!EPTb!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8d99fb9a-b70b-4167-bf9a-b4aa29e73080_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!EPTb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8d99fb9a-b70b-4167-bf9a-b4aa29e73080_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Most machine learning projects fail because of data. 
It&#8217;s easy to imagine a task in your head (like &#8220;train a small model to summarize research paper abstracts&#8221;) but much more tedious to actually assemble the dataset needed to train that model. Collecting clean data can take days or weeks, and it&#8217;s the bottleneck that kills most ideas.</p><p>Our goal at <a href="http://minibase.ai/">Minibase.ai</a> is to remove as many of those bottlenecks as possible. Small models are most useful if you can fine-tune them on the right data and ship them quickly. So today, we&#8217;re releasing a beta version of our synthetic data generator. </p><p>If you give this tool five to twenty high-quality &#8220;seed&#8221; examples, it will produce a thousand or more additional rows in minutes. All of its output data is automatically formatted, so it can be used to train any Minibase model. It won&#8217;t replace real data, but it will get you from &#8220;I have an idea&#8221; to &#8220;I have a training set&#8221; fast.</p><p>Getting started is easy. After logging into your account, navigate to the &#8216;Datasets&#8217; tab and click on &#8220;Generate Dataset.&#8221; (It&#8217;s the big purple button.) Then, follow the prompts. The tool will ask you to answer some questions and then input several &#8220;seed&#8221; examples. Each seed example should have the same three columns: Instruction, Input, Response.&nbsp;</p><ul><li><p><strong>Instructions </strong>are rules your model must follow. If you&#8217;re building a small model to do email spam filtering, for example, then your Instruction might be &#8220;Classify this email as either spam or not spam.&#8221; </p></li><li><p><strong>Inputs</strong> are the actual data points; in the case of our email spam example, these would be real email examples with or without subject lines, messy forwards, terse internal notes, promotional blasts, and so on. 
</p></li><li><p><strong>Responses</strong> are what the model should produce as output (such as &#8220;spam&#8221; or &#8220;not spam&#8221;). The model learns what &#8220;good&#8221; outputs look like from your Responses.</p></li></ul><p>When you click &#8220;<strong>Generate</strong>,&#8221; we send your seeds to a quorum of different large language models. We then take the responses from one of those models and use them as seeds for the others. In this way, we can generate synthetic datasets that are genuinely diverse and cover a broad set of examples. In practice, our tool generates anywhere from two to ten rows of data per second, depending on the length of each seed.</p><p>After your dataset has been generated, you can use it to train models immediately. The generator writes everything into Minibase&#8217;s standard format, masks targets correctly (so your model learns to produce the Response, not recite the Input), and sets aside a hold-out split by default so you can evaluate your model&#8217;s accuracy easily.</p><p>Of course, this tool doesn&#8217;t change the fact that the best data is <em>specific to your task </em>and, often, <em>real</em>. If you can gather and refine a thousand hand-labeled examples from your own workflow, you should do that; it will almost always beat a synthetic dataset. We recommend using this tool as a supplement to real data collection, rather than a replacement for it. It&#8217;s most useful for building quick prototypes or deploying models quickly to run tests and benchmarks.</p><p>Still, this synthetic data tool is another step toward our ultimate goal of making model-building as frictionless as possible. If it sounds useful, sign up for a <a href="http://minibase.ai/">minibase.ai</a> account today and start using it to train models <em>entirely for free</em>. 
Also, <a href="https://discord.gg/4fMQWa5A">get in touch with us on Discord</a> and let us know what to build next.</p><p>&#8212; The Minibase Engineers</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://blog.minibase.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Subscribe to Minibase.ai.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[A Brief Guide to Convolutional Neural Networks (CNNs)]]></title><description><![CDATA[Plus: Weekly links about small and open-source models.]]></description><link>https://blog.minibase.ai/p/brief-guide-cnns</link><guid isPermaLink="false">https://blog.minibase.ai/p/brief-guide-cnns</guid><dc:creator><![CDATA[Minibase.ai]]></dc:creator><pubDate>Fri, 29 Aug 2025 21:24:53 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/d6e8916e-213d-4430-9585-f5b072a2a829_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Most people start learning about neural networks with a toy problem: building a three-layer perceptron that classifies handwritten digits from the MNIST dataset. This problem has become so common that it&#8217;s usually the first thing taught in online YouTube tutorials and textbooks. But the accuracy of such a network stalls out because it treats every pixel as independent, ignoring the structure of an image. 
Using a convolutional neural network, or CNN, gives much higher accuracy when classifying handwritten digits. But it&#8217;s not so easy to understand why!</p><p>A CNN is a type of neural network designed to recognize patterns in data &#8212; such as images &#8212; by applying small filters, or convolutions, across local regions instead of connecting every input to every neuron. This makes CNNs efficient at detecting shapes, edges, and textures, which combine in deeper layers to recognize more complex features.</p><p>We assembled some resources that do a great job of explaining CNNs and how they work. At the end of this post, we also round up some recent developments in small, open-source models. It&#8217;s the first installment of what will be a weekly links series.</p><h3>1. <strong><a href="https://setosa.io/ev/image-kernels/">Setosa.io</a>: Image Kernels, Visually Explained</strong></h3><p>We think the best way to start learning about CNNs is to see how convolutions work <em>visually</em>. This website does a great job at that. It explains how a kernel is just a tiny matrix, and convolution is just &#8220;multiply and sum&#8221; as it slides across an image. 
You will walk away knowing what convolution <em>does</em>, rather than solely what it is.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!e4wc!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F749bdd8f-99e3-47a9-bd17-15aed3bc7c39_1015x425.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!e4wc!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F749bdd8f-99e3-47a9-bd17-15aed3bc7c39_1015x425.png 424w, https://substackcdn.com/image/fetch/$s_!e4wc!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F749bdd8f-99e3-47a9-bd17-15aed3bc7c39_1015x425.png 848w, https://substackcdn.com/image/fetch/$s_!e4wc!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F749bdd8f-99e3-47a9-bd17-15aed3bc7c39_1015x425.png 1272w, https://substackcdn.com/image/fetch/$s_!e4wc!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F749bdd8f-99e3-47a9-bd17-15aed3bc7c39_1015x425.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!e4wc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F749bdd8f-99e3-47a9-bd17-15aed3bc7c39_1015x425.png" width="1015" height="425" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/749bdd8f-99e3-47a9-bd17-15aed3bc7c39_1015x425.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:425,&quot;width&quot;:1015,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:43520,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://blog.minibase.ai/i/171925814?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F749bdd8f-99e3-47a9-bd17-15aed3bc7c39_1015x425.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!e4wc!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F749bdd8f-99e3-47a9-bd17-15aed3bc7c39_1015x425.png 424w, https://substackcdn.com/image/fetch/$s_!e4wc!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F749bdd8f-99e3-47a9-bd17-15aed3bc7c39_1015x425.png 848w, https://substackcdn.com/image/fetch/$s_!e4wc!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F749bdd8f-99e3-47a9-bd17-15aed3bc7c39_1015x425.png 1272w, https://substackcdn.com/image/fetch/$s_!e4wc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F749bdd8f-99e3-47a9-bd17-15aed3bc7c39_1015x425.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><div><hr></div><h3><strong>2. <a href="https://cs231n.stanford.edu">CS231n</a>: Deep Learning for Computer Vision (Stanford)</strong></h3><p>This is the gold standard. All the lecture videos are available online, and the final assignment will challenge you &#8220;to train and apply multi-million parameter networks on real-world vision problems&#8221; of your choosing. The course opens by explaining image classification with linear classifiers, but quickly moves into CNNs (by week 5). Plus, one of the lecturers is Fei-Fei Li.</p><div><hr></div><h3>3. 
<strong><a href="https://playground.tensorflow.org/#activation=tanh&amp;batchSize=10&amp;dataset=circle&amp;regDataset=reg-plane&amp;learningRate=0.03&amp;regularizationRate=0&amp;noise=0&amp;networkShape=4,2&amp;seed=0.05268&amp;showTestData=false&amp;discretize=false&amp;percTrainData=50&amp;x=true&amp;y=true&amp;xTimesY=false&amp;xSquared=false&amp;ySquared=false&amp;cosX=false&amp;sinX=false&amp;cosY=false&amp;sinY=false&amp;collectStats=false&amp;problem=classification&amp;initZero=false&amp;hideText=false">TensorFlow</a> Playground</strong></h3><p>Though not strictly CNN&#8209;focused, the TensorFlow Playground is a super friendly entry point into neural nets. You can toy with learning rate, activation functions, and regularization directly in the browser. It&#8217;s low&#8209;stakes exploration that builds understanding of what training looks like. Once you&#8217;re comfortable with that, CNNs don&#8217;t feel quite as exotic.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Mdyv!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff73896b1-15fd-4c34-bbf2-72cb619e9031_1255x689.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Mdyv!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff73896b1-15fd-4c34-bbf2-72cb619e9031_1255x689.png 424w, https://substackcdn.com/image/fetch/$s_!Mdyv!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff73896b1-15fd-4c34-bbf2-72cb619e9031_1255x689.png 848w, 
https://substackcdn.com/image/fetch/$s_!Mdyv!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff73896b1-15fd-4c34-bbf2-72cb619e9031_1255x689.png 1272w, https://substackcdn.com/image/fetch/$s_!Mdyv!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff73896b1-15fd-4c34-bbf2-72cb619e9031_1255x689.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Mdyv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff73896b1-15fd-4c34-bbf2-72cb619e9031_1255x689.png" width="1255" height="689" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f73896b1-15fd-4c34-bbf2-72cb619e9031_1255x689.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:689,&quot;width&quot;:1255,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:182390,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://blog.minibase.ai/i/171925814?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff73896b1-15fd-4c34-bbf2-72cb619e9031_1255x689.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Mdyv!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff73896b1-15fd-4c34-bbf2-72cb619e9031_1255x689.png 424w, https://substackcdn.com/image/fetch/$s_!Mdyv!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff73896b1-15fd-4c34-bbf2-72cb619e9031_1255x689.png 
848w, https://substackcdn.com/image/fetch/$s_!Mdyv!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff73896b1-15fd-4c34-bbf2-72cb619e9031_1255x689.png 1272w, https://substackcdn.com/image/fetch/$s_!Mdyv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff73896b1-15fd-4c34-bbf2-72cb619e9031_1255x689.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><div><hr></div><h3>4. 
<strong><a href="https://poloclub.github.io/cnn-explainer/">CNN Explainer</a></strong></h3><p>This is an interactive explainer on CNNs where you click layers, watch kernels slide, and explore how neurons compute step by step. It doesn&#8217;t just tell you what&#8217;s happening, but shows it to you. It also runs in your browser with no installs needed.</p><div><hr></div><h3>5. <strong><a href="https://distill.pub/2017/feature-visualization/">Distill</a>: Feature Visualization</strong></h3><p>Once you know what CNNs do, you might start to wonder more about <em>how</em> they do it. This article explains it without resorting to too much math. It shows how you can reverse&#8209;engineer what activations &#8220;look for,&#8221; using optimization to generate what a neuron cares about. It starts from the simplest idea (push the network to activate one neuron) and then explains the pitfalls and tricks, like avoiding noise and encouraging diversity. The visuals are really beautiful.</p><div><hr></div><h2>This week in AI:</h2><p><em>With an emphasis on small and open-sourced models.</em></p><ol><li><p>Microsoft releases <a href="https://huggingface.co/microsoft/VibeVoice-1.5B">VibeVoice</a>, an open-source text-to-speech model with just 1.5B parameters.</p></li><li><p><a href="https://www.biorxiv.org/content/10.1101/2025.08.18.670981v1.full?utm_source=www.therundown.ai&amp;utm_medium=newsletter&amp;utm_campaign=perplexity-s-42-5m-publisher-peace-offering&amp;_bhlid=5250e4e482bfb7c7c1f42e353e9ecafda04421cc">rbio1</a>: A reasoning model for biology.</p></li><li><p>&#8220;Google says it dropped the <a href="https://arstechnica.com/ai/2025/08/google-says-it-dropped-the-energy-cost-of-ai-queries-by-33x-in-one-year/">energy cost of AI queries</a> by 33x in one year.&#8221;</p></li><li><p><a href="https://github.com/bravenewxyz/agent-c">Agent-C</a>, an ultra-lightweight AI agent that communicates with an OpenRouter API.</p></li><li><p>xAI open-sources <a 
href="https://www.cryptopolitan.com/musks-xai-open-sources-grok-2-5/">Grok 2.5</a>.</p></li><li><p>Using AI to <a href="https://www.science.org/doi/10.1126/sciadv.adt2792">identify sketchy science journals</a>.</p></li><li><p>Bottlenecks in <a href="https://arxiv.org/pdf/2503.22625">scaling AI</a> for software engineering.</p></li><li><p>xAI released <a href="https://x.ai/news/grok-code-fast-1?utm_source=www.therundown.ai&amp;utm_medium=newsletter&amp;utm_campaign=microsoft-s-homegrown-ai-debut&amp;_bhlid=aea7fbc92156ec5e535829c771135662ee2356cc">grok-code-fast-1</a>, &#8220;a speedy and economical reasoning model that excels at agentic coding.&#8221;</p></li><li><p>Hunyuan <a href="https://x.com/TencentHunyuan/status/1960920482779423211?utm_source=www.therundown.ai&amp;utm_medium=newsletter&amp;utm_campaign=microsoft-s-homegrown-ai-debut&amp;_bhlid=0eafa1ba8d1e9aa7a55733b3ad2f7da38b7a411a">open-sources</a> their end-to-end Text-Video-to-Audio framework.</p></li></ol><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://blog.minibase.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading!</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[Minibase Beta: Build and deploy small AI models from your browser]]></title><description><![CDATA[We're launching a public (and free) beta.]]></description><link>https://blog.minibase.ai/p/minibase-beta</link><guid 
isPermaLink="false">https://blog.minibase.ai/p/minibase-beta</guid><dc:creator><![CDATA[Minibase.ai]]></dc:creator><pubDate>Mon, 25 Aug 2025 15:23:49 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/e837978d-b66f-4f2b-85c0-011b4393b67b_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>The most widely used AI models today are massive; they have billions of parameters and are shipped as black boxes from distant data centers. The only practical way to use them is to send your data over an API to a server in Texas (or somewhere like it). That works fine for broad, general tasks. But the moment you need control over the model&#8217;s behavior, cost, latency, or data privacy, it starts to feel like you&#8217;re renting someone else&#8217;s brain on their terms.</p><p>This arrangement makes little sense for many tasks. If you need fine-grained control or care about where your data lives, you run into limits fast. The big models from OpenAI and Anthropic are also largely uninterpretable, so it&#8217;s rarely clear why they give the outputs they do. And because their training is so broad and diffuse, they often stumble on narrow, tightly scoped jobs.</p><p>For most tasks, small models are not only <em>more accurate</em> than the big foundation models; they are also cheap to train and serve, quick to start, and compact enough to run on a laptop or phone with predictable latency and costs. And because small models are trained on well-defined tasks, they&#8217;re much easier to fine-tune on your data, easier to debug, and less likely to drift off-task. You can also keep them private so your data never leaves your own environment.</p><p><strong><a href="http://minibase.ai/">That&#8217;s why we built Minibase.ai</a>, which is now in public beta</strong>. Minibase is an all-in-one place to design, train, and deploy small models that do one thing exceptionally well. 
Everything runs in your browser, with no code required, and your datasets are private by default. When you finish fine-tuning, you can run the model in Minibase Cloud or download it to run locally. Either way, you control everything you create and share.</p><p>While we&#8217;re in beta, it&#8217;s entirely free to create an account, invite up to twenty teammates, upload datasets, and train and deploy models. In return, all we want is your feedback. Kick the tires around, <a href="https://discord.gg/4fMQWa5A">join our Discord</a>, and tell us what to build next!</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://blog.minibase.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://blog.minibase.ai/subscribe?"><span>Subscribe now</span></a></p><p>It&#8217;s easy to get started. Sign up with your email, log in, and you&#8217;ll see two tabs: <strong>MODELS</strong> and <strong>DATASETS</strong>. These hold everything you&#8217;ve uploaded and everything you&#8217;ve trained. Click <strong>+ Train New Model</strong> to start an automated workflow, where you can pick a base model, attach a dataset, and fine-tune the model to make your own custom creation. A progress bar tracks training progress. And when it&#8217;s done, you can chat with your model in the browser, see how it behaves, and tune settings before you deploy.</p><p>Minibase currently has four different models. The <a href="https://help.minibase.ai/en/articles/12045886-task-based-model">Task Base model</a> handles tight, structured inputs and outputs &#8212; summaries, translations, labeling, extraction, and instruction-following. The <a href="https://help.minibase.ai/en/articles/12045965-language-based-model">Language Base model</a> gives you more context length, for when inputs are longer. 
The <a href="https://help.minibase.ai/en/articles/12045918-micro-based-model">Micro Base model</a> is our smallest; it&#8217;s best for short, repeatable tasks and can run on a smartphone. And finally, the <a href="https://help.minibase.ai/en/articles/12045976-chat-based-model">Chat Base model</a> supports multi-turn conversation and simple reasoning. It&#8217;s great for lightweight bots and assistants. You can find more details on each Base Model in our <a href="https://help.minibase.ai/en/">Help Center</a>.</p><p>When you&#8217;re ready to ship a model, you also have many options. You can immediately test the model in your browser (through a chat box) and then deploy to Minibase Cloud to call it from your app via API. Or, you can just download the model in its entirety and run it on your own hardware. Again, you own what you create.</p><p>Most AI platforms are built around massive, opaque systems. Those platforms strip power from the users and aggregate it in the hands of a massive corporation. Minibase is built around small, interpretable models that you can actually understand. These mini models are trained on your data. You can run them on your laptop or phone. And you don&#8217;t have to trade control for capability. If that sounds like the future you want, sign up and start building small models today. Tell us what&#8217;s missing, and we&#8217;ll add it.</p><p><a href="http://minibase.ai/">minibase.ai</a></p><p>&#8212; <em>The Minibase Engineers</em></p>]]></content:encoded></item><item><title><![CDATA[What is a Model’s “Temperature,” Really?]]></title><description><![CDATA[The temperature dial is often used to set the randomness of AI models. 
Under the hood, it&#8217;s just math.]]></description><link>https://blog.minibase.ai/p/temperature</link><guid isPermaLink="false">https://blog.minibase.ai/p/temperature</guid><dc:creator><![CDATA[Minibase.ai]]></dc:creator><pubDate>Mon, 18 Aug 2025 18:54:29 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/1bac9c84-15e7-42e7-89b8-0c5e098a444d_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>One way to make a model&#8217;s outputs more random is to raise its temperature. But what does this actually mean, and what is happening under the hood?&nbsp;</p><p>Here&#8217;s how I think about it:</p><p>First, remember that language models generate text one piece at a time. Those pieces are called <em>tokens</em>. A token isn&#8217;t always a full word; it might be a word (&#8220;cat&#8221;), part of a word (&#8220;ing&#8221;), or sometimes even punctuation. The important thing to note is that models don&#8217;t &#8220;think&#8221; in terms of sentences or paragraphs, but rather in terms of which token should come next. </p><p>Each time you type something into a model, it reads all the tokens you gave it, predicts a list of all possible next tokens, and assigns each of them a score for how likely they are to follow.</p><p>Those scores are called <em>logits</em>. A logit is just a number the model generates for each possible token before turning it into a probability. Bigger logits mean that a word is <em>more </em>likely to be the next token, while smaller logits mean a word is <em>less </em>likely. 
But a logit is not a probability, which is a bit confusing!&nbsp;</p><p>To turn logits into actual probabilities, the model first runs them through a function called <em>softmax</em>, which basically says: &#8220;exponentiate all the logits, then divide each result by the sum of all of them so the values add up to 1.&#8221; At that point, you have a list of probabilities for every possible next token.</p><p>I think the best way to truly grasp this is to look at an example. Let&#8217;s say a model is writing a paragraph of text, and it thinks the next token could be either &#8220;cat,&#8221; &#8220;dog,&#8221; or &#8220;banana,&#8221; with logits of 2.0, 1.0, and 0.1. The highest logit is for <em>cat</em>, and so that&#8217;s the word we would expect to come next. But this is where temperature comes in&#8230;</p><p><strong>Temperature is a way to mess with the conversion between </strong><em><strong>logits </strong></em><strong>and </strong><em><strong>probabilities</strong></em><strong>.</strong> The gist is that, before the model applies the softmax function to convert logit &#8594; probability, it first divides each logit by the set temperature value.&nbsp;</p><p>If the temperature is less than 1, the differences between logits get bigger, so the most likely token is assigned a <em>higher-than-expected </em>probability. The least likely tokens nearly disappear, and the model becomes more conservative or predictable.&nbsp;</p><p>If the temperature is greater than 1, the differences between logits get smaller, and the probabilities flatten out. That means the model will pick less obvious tokens more often. (If you&#8217;d like to see the actual math equation for temperature, check out our footnote.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a>)</p><p>Let&#8217;s go back to our example with cats, dogs, and bananas.
Remember that these have logits of 2.0, 1.0, and 0.1.</p><p>Now, if we set <strong>temperature = 1.0</strong>, then basically nothing happens because we&#8217;re dividing each logit by 1.0. So softmax would turn these logits into probabilities of about 66% for &#8220;cat&#8221; (e<sup>2.0</sup> divided by the sum e<sup>2.0</sup> + e<sup>1.0</sup> + e<sup>0.1</sup>), 24% for &#8220;dog,&#8221; and 10% for &#8220;banana.&#8221; The model will usually say &#8220;cat&#8221; but sometimes &#8220;dog&#8221; or &#8220;banana.&#8221;</p><p>If we set <strong>temperature = 0.5</strong>, then we are dividing each logit by 0.5 and thus doubling its value. Now the gap between logits becomes bigger! The probability of &#8220;cat&#8221; jumps to about 86%, &#8220;dog&#8221; falls to 12%, and &#8220;banana&#8221; is almost gone at 2%. The model becomes much more predictable.</p><p>And finally, if we set <strong>temperature = 2.0</strong>, we&#8217;re cutting each logit in half such that the differences between them become smaller. Now the probability of the next token being &#8220;cat&#8221; drops to about 50%, &#8220;dog&#8221; climbs to 30%, and &#8220;banana&#8221; rises to nearly 20%. The model&#8217;s outputs become more varied.</p><p>Temperature isn&#8217;t unique to language models. This general concept can be used with <em>any</em> model that outputs a probability distribution over possible actions. In image generation, it can change how adventurous the model is with details. In speech synthesis, it can affect how much pronunciation varies. In reinforcement learning, it can make an agent more exploratory. The principle is always the same.</p><p>I think the key to understanding temperature is to stop thinking of it as a creativity dial, though, and start thinking of it as a way of reshaping probabilities.
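</p><p>To make this concrete, here&#8217;s the same calculation as a short Python sketch. (This is our own illustrative snippet, not Minibase code; the function name is made up.)</p>

```python
import math

def softmax_with_temperature(logits, temperature=1.0):
    """Divide each logit by the temperature, then apply softmax."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]  # "cat", "dog", "banana"
for t in (0.5, 1.0, 2.0):
    probs = softmax_with_temperature(logits, t)
    print(t, [round(p, 2) for p in probs])
```

<p>Running it gives roughly 86% / 12% / 2% at temperature 0.5, 66% / 24% / 10% at 1.0, and 50% / 30% / 19% at 2.0, matching the worked example above.</p><p>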
It doesn&#8217;t make the model smarter or dumber, but simply changes how much it sticks to the safe bet versus taking a chance.</p><p>If you want to learn more about AI, <a href="https://discord.gg/xhbfZszg">join our Discord community</a>! Or visit us at <a href="https://minibase.ai">minibase.ai</a>. </p><p>&#8212; <em>Minibase Engineers</em></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://blog.minibase.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Sign up for Minibase.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p> The formula is given below, where T is the set temperature and can be any positive number. If T &lt; 1, logits get bigger in magnitude, the probability distribution gets sharper, and the next token is more predictable. If T = 1, there is no change from the model&#8217;s raw distribution. 
As T goes above 1, the logits get smaller in magnitude, the distribution gets flatter, and there is more randomness in the next token.</p><div class="latex-rendered" data-attrs="{&quot;persistentExpression&quot;:&quot;p_i = \\frac{exp(\\frac{logit_i}{T})}{\\sum_j exp(\\frac{logit_j}{T})}&quot;,&quot;id&quot;:&quot;UNIKMOPLTC&quot;}" data-component-name="LatexBlockToDOM"></div></div></div>]]></content:encoded></item><item><title><![CDATA[Language Vision Models are Not Good at Science]]></title><description><![CDATA[Vision language models are unable to handle most &#8220;real-world chemistry and materials science tasks,&#8221; according to a new study.]]></description><link>https://blog.minibase.ai/p/language-vision-models</link><guid isPermaLink="false">https://blog.minibase.ai/p/language-vision-models</guid><dc:creator><![CDATA[Minibase.ai]]></dc:creator><pubDate>Thu, 14 Aug 2025 19:05:07 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/0eb610a0-f27a-4af3-8423-778c1c4016fb_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>A &#8220;vision&#8211;language model&#8221; is a type of AI that takes in both images and text, and then answers prompts that require combining the two. Such models can use an image to answer a written question, for example.</p><p>The current batch of language vision models &#8212; including Claude 3.5 Sonnet, GPT-4o, Gemini 1.5 Pro, and Llama 3.2 90B Vision &#8212; should, in theory, be able to do this for scientific papers! It&#8217;d be great to, say, automatically interpret the data plotted in charts or look at a written protocol and then figure out which machines to use to run that experiment. 
This could save scientists a lot of time.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://blog.minibase.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Subscribe to learn about AI.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p>Unfortunately, data from a new benchmark, called MaCBench, suggests that vision-language models are not at all ready to do this type of work, at least for chemistry and materials science tasks. The paper was published in <em><a href="https://www.nature.com/articles/s43588-025-00836-3">Nature Computational Science</a> </em>a few days ago.&nbsp;</p><p>The TL;DR is that MaCBench is a huge dataset with 1,153 tasks, including 779 multiple-choice questions and 374 numeric-answer questions.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> These tasks, together, cover three types of scientific work: data extraction (scraping numbers from charts or tables), experiment execution (identifying equipment or judging whether an experiment is safe), and data interpretation (looking at a chart and interpreting the results). All of the questions in MaCBench were built from real scientific figures, including some generated by the authors to avoid giving the models anything they might have memorized from the web.</p><p>The models I mentioned earlier were evaluated using MaCBench. The results were not flattering.
Of the four models evaluated, Claude 3.5 Sonnet did the best. But it, too, stumbled badly in many areas. Here&#8217;s a quick look at the results:</p><p><strong>Data extraction</strong>: Average accuracy across models was 0.53 for extracting data from tables. Llama 3.2 90B Vision performed no better than random guessing. Models did well at identifying hand-drawn molecular structures (Claude 3.5 Sonnet: ~0.80 accuracy) but performed barely above chance when asked about the isomeric relationship between chemical compounds (average ~0.24 against a ~0.22 baseline).</p><p><strong>Experiment execution</strong>: Models were able to identify equipment with an average accuracy of 0.77, but safety assessment questions dropped to 0.46.</p><p><strong>Data interpretation</strong>: The benchmarks include some images obtained using atomic force microscopy, and the models interpreted them with a score of just 0.24. For mass spectrometry and NMR, which are widely used by biochemists to study proteins, the models averaged 0.35. These scores are really low.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!L2p_!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F937c3cf0-c49d-4c6d-a449-148ba8063a62_2440x1710.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!L2p_!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F937c3cf0-c49d-4c6d-a449-148ba8063a62_2440x1710.png 424w, https://substackcdn.com/image/fetch/$s_!L2p_!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F937c3cf0-c49d-4c6d-a449-148ba8063a62_2440x1710.png 848w,
https://substackcdn.com/image/fetch/$s_!L2p_!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F937c3cf0-c49d-4c6d-a449-148ba8063a62_2440x1710.png 1272w, https://substackcdn.com/image/fetch/$s_!L2p_!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F937c3cf0-c49d-4c6d-a449-148ba8063a62_2440x1710.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!L2p_!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F937c3cf0-c49d-4c6d-a449-148ba8063a62_2440x1710.png" width="1456" height="1020" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/937c3cf0-c49d-4c6d-a449-148ba8063a62_2440x1710.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1020,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:318594,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://minibase.substack.com/i/170997233?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F937c3cf0-c49d-4c6d-a449-148ba8063a62_2440x1710.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!L2p_!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F937c3cf0-c49d-4c6d-a449-148ba8063a62_2440x1710.png 424w, 
https://substackcdn.com/image/fetch/$s_!L2p_!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F937c3cf0-c49d-4c6d-a449-148ba8063a62_2440x1710.png 848w, https://substackcdn.com/image/fetch/$s_!L2p_!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F937c3cf0-c49d-4c6d-a449-148ba8063a62_2440x1710.png 1272w, https://substackcdn.com/image/fetch/$s_!L2p_!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F937c3cf0-c49d-4c6d-a449-148ba8063a62_2440x1710.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>Note that <em>performance </em>in this context is actually &#8220;accuracy above random guessing.&#8221; In other words, the reported values are <em>not </em>the raw percentage of correct answers. Instead, the authors took each model&#8217;s raw percentage accuracy and subtracted its random, baseline accuracy. (There are four possible answers for each multiple-choice question, so a model would get 25% correct just by chance. If that model answered 85% of questions correctly, then its <em>performance </em>score would be given as 60%, or 0.60.)</p><p>To figure out why these models are so bad at certain tasks, the researchers did a series of ablation experiments. This means they systematically changed one part of the task at a time to see what was causing the low scores. So they might give a model the exact same scientific information either as text or as an image, for example, and then see how well it answers the same questions.</p><p>And it turns out that <strong>models perform way better when given the same data in text form rather than as an image</strong>. For one ablation experiment, the authors gave each model the exact same information (from an x-ray diffraction experiment, which is used to study a protein&#8217;s structure) as either typed-out numbers or as a graph. Models were about 35% more accurate at figuring out the x-ray data&#8217;s &#8220;peak positions&#8221; when given the typed numbers. The authors also found, unsurprisingly, that a model&#8217;s performance correlates with how common the example is on the internet. Models were better at answering questions about more common protein structures, for example, suggesting they are using pattern matching rather than genuine reasoning.</p><p>These results do not bode well for the near-term creation of AI scientists.
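</p><p>One small note on the scoring described earlier: the &#8220;accuracy above random guessing&#8221; correction is just a subtraction. Here&#8217;s a quick sketch (our own helper, written purely for illustration):</p>

```python
def performance_above_chance(raw_accuracy, n_choices):
    """Subtract the random-guessing baseline from a raw accuracy."""
    baseline = 1.0 / n_choices
    return raw_accuracy - baseline

# The worked example from the text: 85% raw accuracy on
# four-option multiple-choice questions.
print(performance_above_chance(0.85, 4))  # -> 0.6
```

<p>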
Dario Amodei, CEO of Anthropic, <a href="https://www.darioamodei.com/essay/machines-of-loving-grace">famously predicted</a> in his essay <em>Machines of Loving Grace</em> that &#8220;powerful AI could at least 10x the rate of [scientific] discoveries, giving us the next 50-100 years of biological progress in 5-10 years.&#8221; Unfortunately, that outcome seems more distant than many initially anticipated.</p><p>One way to resolve these issues is to move away from all-in-one or general-purpose models. Instead, scientists could train smaller models &#8212; one fine-tuned for reading x-ray data, another for parsing certain data files, and so on &#8212; and string them together to build more accurate workflows. That would mean giving up the convenience of a single &#8220;ask it anything&#8221; system, of course, and it would require the data and know-how to fine-tune each model. But the payoff would be far higher accuracy and much more control over each scientific workflow.</p><p>The reason most scientists aren&#8217;t doing this now is that it takes so much work. Collecting enough labeled data for each sub-task is probably a months-long project in itself. Setting up the infrastructure to run multiple models in a row is another. But for now, if you want automation you can trust with real experiments, it&#8217;s probably the only realistic path.&nbsp;</p><p>We&#8217;re making it simple to train and deploy small AI models at <a href="http://minibase.ai/">minibase.ai</a>.
If you&#8217;re excited about this, <a href="https://discord.gg/xhbfZszg">join our Discord community</a>!</p><p>&#8211; <em>MiniBase Engineers</em></p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>For example, &#8220;how many bones are in the human body&#8221; would be answered with 206.</p></div></div>]]></content:encoded></item><item><title><![CDATA[A World of Small AI Models]]></title><description><![CDATA[The inevitable shift from large, foundational models to tiny, task-specific ones.]]></description><link>https://blog.minibase.ai/p/a-world-of-small-ai-models</link><guid isPermaLink="false">https://blog.minibase.ai/p/a-world-of-small-ai-models</guid><dc:creator><![CDATA[Minibase.ai]]></dc:creator><pubDate>Wed, 13 Aug 2025 16:33:51 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!mDGH!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd02a420c-27f0-4beb-8dee-013d884bfddd_795x795.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Small AI models are underrated, yet immensely capable. While most AI companies are focused on training <em>large </em>&#8220;frontier&#8221; models capable of general reasoning, we think there is still a great deal of potential to be unlocked by training smaller models that are quick to run and highly accurate at narrow tasks. Small models can also be run locally, on a computer or smartphone, and thus maintain a user&#8217;s privacy.</p><p>We&#8217;re building <a href="http://minibase.ai/">MiniBase.ai</a> to help anyone design, train, and deploy small models.</p><p>Such &#8220;tiny&#8221; models have been used for decades. Your phone&#8217;s keyboard is running a small language model to guess the next word as you type. 
Noise-cancelling headphones run a model that predicts and subtracts background noise in real time. Cameras in phones use small vision models to detect faces and adjust focus. Email services run spam filters trained to recognize patterns. And so on.</p><p>Small models &#8212; often under 10 GB in size &#8212; can also outperform large models on narrow tasks. The <a href="https://www.google.com/search?client=safari&amp;rls=en&amp;q=orca+2+models&amp;ie=UTF-8&amp;oe=UTF-8">Orca&#8239;2</a> model (~5 GB) outperforms models 5&#8211;10&#215; larger on complex reasoning benchmarks in zero&#8209;shot scenarios. The <a href="https://azure.microsoft.com/en-us/blog/introducing-phi-3-redefining-whats-possible-with-slms/">Phi&#8209;3 family of models</a> (as small as ~1.8 GB) consistently beats slightly larger models in benchmarks covering language, coding, and math.</p><p>Even the frontier laboratories are releasing smaller, open-weight models. OpenAI recently released their <a href="https://openai.com/index/introducing-gpt-oss/">gpt-oss</a> models, for example, under a permissive Apache license. The smaller version only has 21 billion parameters (~16 GB) but performs comparably to o3-mini on most tasks.</p><p>As more and more developers turn to small models, it&#8217;s inevitable that more companies will pop up to serve them, too. Hugging Face already offers more than 600,000 models, many of which are small enough to run on a laptop. Their community has also uploaded thousands of datasets and training scripts to reproduce those models. Databricks, similarly, has tools to fine-tune and deploy models. Both of these companies make it possible for a small team to go from raw data to a deployed model without owning a GPU cluster.</p><p>But although Hugging Face and Databricks are extremely powerful, they are marketed primarily at developers.
If you were, say, a student who wanted to train a model to identify bird species, run it on a Raspberry Pi, hook everything up to a camera, and then mount it on a birdhouse &#8212; where would you start?</p><p>The model names on Hugging Face are esoteric and difficult to parse. (What&#8217;s the difference between Phi-3 and Phi-3.5-mini-instruct?) It&#8217;s often unclear which dataset is best suited to training, or how that dataset should even be structured or collected in the first place. And even if these two problems are resolved, you still need to deal with Python environments, GPU drivers, and memory limits. There is a lot of troubleshooting required.</p><p><a href="http://minibase.ai/">MiniBase.ai</a> is designed to solve these problems. We&#8217;re building the tools and infrastructure required to make data cleaning and model training as painless as possible. With our tools, absolutely anyone can train, fine-tune, and deploy their own tiny models. We are also growing a community around these tools, where users can be paid for their datasets and models.&nbsp;</p><p>We&#8217;ve spent the last few months talking to dozens of hobbyist model builders. We&#8217;ve helped scientists build &#8220;fraud detector&#8221; tools that speed up their searches for papers with statistical or image anomalies. And we&#8217;ve talked to founders at companies that have already trained models in-house, but now just want an easy solution to re-train or fine-tune those older models. Over time, we&#8217;ve come to believe that there&#8217;s a huge gap in the market for a community-based effort focused entirely on small AI models. So come <a href="https://discord.gg/xhbfZszg">join our community</a> on Discord!</p><p>In the future, small models will be everywhere, running on whatever device happens to be nearby. Farmers will run small models on solar-powered sensors to track soil conditions without an internet connection.
Office workers will carry models on USB sticks to automate dull parts of their job. These models won&#8217;t be remarkable for what they are, but for how common they&#8217;ve become: quiet pieces of software that do one job well.</p><p>&#8212; <em>The MiniBase team</em></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://blog.minibase.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Sign up to learn about AI, and how it works.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[Coming soon]]></title><description><![CDATA[This is Minibase.]]></description><link>https://blog.minibase.ai/p/coming-soon</link><guid isPermaLink="false">https://blog.minibase.ai/p/coming-soon</guid><dc:creator><![CDATA[Minibase.ai]]></dc:creator><pubDate>Wed, 13 Aug 2025 16:31:58 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!mDGH!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd02a420c-27f0-4beb-8dee-013d884bfddd_795x795.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>This is Minibase.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://blog.minibase.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://blog.minibase.ai/subscribe?"><span>Subscribe
now</span></a></p>]]></content:encoded></item></channel></rss>