<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Cycls]]></title><description><![CDATA[Blog by Cycls]]></description><link>https://blog.cycls.com</link><image><url>https://substackcdn.com/image/fetch/$s_!OmIv!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5105778a-3544-4613-aeba-4af1c7bc4c08_1024x1024.png</url><title>Cycls</title><link>https://blog.cycls.com</link></image><generator>Substack</generator><lastBuildDate>Wed, 29 Apr 2026 12:05:04 GMT</lastBuildDate><atom:link href="https://blog.cycls.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Cycls]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[cycls@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[cycls@substack.com]]></itunes:email><itunes:name><![CDATA[Mohammed Faisal]]></itunes:name></itunes:owner><itunes:author><![CDATA[Mohammed Faisal]]></itunes:author><googleplay:owner><![CDATA[cycls@substack.com]]></googleplay:owner><googleplay:email><![CDATA[cycls@substack.com]]></googleplay:email><googleplay:author><![CDATA[Mohammed Faisal]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[Unfreezing LLMs with Wiki Power]]></title><description><![CDATA[Wiki RAG]]></description><link>https://blog.cycls.com/p/unfreezing-llms-with-wiki-power</link><guid isPermaLink="false">https://blog.cycls.com/p/unfreezing-llms-with-wiki-power</guid><dc:creator><![CDATA[Mohammed Alrujayi]]></dc:creator><pubDate>Thu, 25 Jul 2024 18:17:19 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/$s_!9QRE!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb46ae884-a6d6-4264-b55c-e237e24c5830_1920x1080.webp" length="0" type="image/webp"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!9QRE!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb46ae884-a6d6-4264-b55c-e237e24c5830_1920x1080.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!9QRE!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb46ae884-a6d6-4264-b55c-e237e24c5830_1920x1080.webp 424w, https://substackcdn.com/image/fetch/$s_!9QRE!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb46ae884-a6d6-4264-b55c-e237e24c5830_1920x1080.webp 848w, https://substackcdn.com/image/fetch/$s_!9QRE!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb46ae884-a6d6-4264-b55c-e237e24c5830_1920x1080.webp 1272w, https://substackcdn.com/image/fetch/$s_!9QRE!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb46ae884-a6d6-4264-b55c-e237e24c5830_1920x1080.webp 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!9QRE!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb46ae884-a6d6-4264-b55c-e237e24c5830_1920x1080.webp" width="656" height="369" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b46ae884-a6d6-4264-b55c-e237e24c5830_1920x1080.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:656,&quot;bytes&quot;:93644,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/webp&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!9QRE!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb46ae884-a6d6-4264-b55c-e237e24c5830_1920x1080.webp 424w, https://substackcdn.com/image/fetch/$s_!9QRE!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb46ae884-a6d6-4264-b55c-e237e24c5830_1920x1080.webp 848w, https://substackcdn.com/image/fetch/$s_!9QRE!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb46ae884-a6d6-4264-b55c-e237e24c5830_1920x1080.webp 1272w, https://substackcdn.com/image/fetch/$s_!9QRE!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb46ae884-a6d6-4264-b55c-e237e24c5830_1920x1080.webp 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Image source: Wired (<strong><a href="http://www.wired.com/">wired.com</a></strong>).</figcaption></figure></div><p><a href="https://en.wikipedia.org/wiki/Large_language_model">Large Language Models</a> (LLMs) are incredibly powerful, and continue to amaze us every day. Yet beneath their power lies a subtle but significant weakness.</p><p><strong>They are frozen in time.</strong></p><p>Once deployed, LLMs cannot update their knowledge or adapt to new information. Their understanding of the world remains static. These models are trained on data and then used for inference. They <strong>can't learn</strong> anything once they're in action.</p><p>Which means LLMs <strong>can't keep up</strong> with our world, where information is constantly being generated.</p><p>However, this limitation, while challenging, isn't a deal breaker. The key lies in augmenting these powerful models, enabling them to stay current. By successfully implementing such augmentation, we can extend LLMs' capabilities beyond their training cutoff, keeping them always <strong>'fresh'</strong> and relevant.</p><h1>RAGs</h1><p>Vector embeddings are a key part of how LLMs work. They're simply a way to turn words into numbers that models can process. In LLMs, embeddings convert text into long lists of numbers (vectors), allowing the models to understand and manipulate language.</p><p>Now, imagine we took a book published after the LLM's training cutoff date. What if we broke it down into pages and used this same 'vector embedding' technique on each page? We could then store this vectorized book in an <strong>external</strong> database.</p><p>Here's where it gets interesting: when you ask the LLM about something in this book, it doesn't require prior training on its content. 
Instead, it can just <strong>search</strong> the vectorized pages, find the most relevant one, and use it as reference material to answer your question.</p><p>This method is known as <a href="https://en.wikipedia.org/wiki/Retrieval-augmented_generation">Retrieval-Augmented Generation</a> (RAG). By storing embeddings in an external, updatable database rather than within the LLM itself, RAG allows the model to stay current. When the LLM needs to answer a question, it searches this up-to-date database, retrieves relevant information, and uses it to generate a response.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!JsXD!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8d9c1884-5cc8-4016-912f-e55d44eaf387_2880x1244.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!JsXD!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8d9c1884-5cc8-4016-912f-e55d44eaf387_2880x1244.png 424w, https://substackcdn.com/image/fetch/$s_!JsXD!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8d9c1884-5cc8-4016-912f-e55d44eaf387_2880x1244.png 848w, https://substackcdn.com/image/fetch/$s_!JsXD!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8d9c1884-5cc8-4016-912f-e55d44eaf387_2880x1244.png 1272w, https://substackcdn.com/image/fetch/$s_!JsXD!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8d9c1884-5cc8-4016-912f-e55d44eaf387_2880x1244.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!JsXD!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8d9c1884-5cc8-4016-912f-e55d44eaf387_2880x1244.png" width="606" height="261.7953296703297" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/8d9c1884-5cc8-4016-912f-e55d44eaf387_2880x1244.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:629,&quot;width&quot;:1456,&quot;resizeWidth&quot;:606,&quot;bytes&quot;:1597941,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!JsXD!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8d9c1884-5cc8-4016-912f-e55d44eaf387_2880x1244.png 424w, https://substackcdn.com/image/fetch/$s_!JsXD!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8d9c1884-5cc8-4016-912f-e55d44eaf387_2880x1244.png 848w, https://substackcdn.com/image/fetch/$s_!JsXD!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8d9c1884-5cc8-4016-912f-e55d44eaf387_2880x1244.png 1272w, https://substackcdn.com/image/fetch/$s_!JsXD!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8d9c1884-5cc8-4016-912f-e55d44eaf387_2880x1244.png 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption">Left: an LLM frozen in time. Right: an LLM augmented by a vector database (knowledge base) to stay <strong>fresh</strong>. Image source: Pinecone (<strong><a href="http://www.pinecone.io/">pinecone.io</a></strong>).</figcaption></figure></div><p>RAGs augment LLMs with real-time, domain-specific knowledge. This combination quickly became popular for building <strong>AI apps</strong> across many fields.</p><h1>RAGs in practice</h1><p>Creating a RAG is simple: start with any dataset you have. Let's consider a hospital using patient records to create an AI assistant for doctors.</p><p>The hospital turns its patient database into a RAG system. 
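</p><p>To make the flow concrete, here is a minimal sketch of the retrieval step in Python. Everything in it is illustrative: the <code>embed</code> function is a toy stand-in for a real embedding model, and the records are invented placeholders, not real patient data.</p>

```python
import math
from collections import Counter

def embed(text):
    """Toy embedding: a bag of character trigrams.
    A real RAG system would call an embedding model here instead."""
    text = text.lower()
    return Counter(text[i:i + 3] for i in range(len(text) - 2))

def cosine(a, b):
    """Cosine similarity between two sparse vectors (Counters)."""
    dot = sum(a[k] * b[k] for k in a if k in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Invented placeholder records -- stand-ins for the hospital's data.
records = [
    "Patient A: allergic to penicillin, prescribed amoxicillin alternative.",
    "Patient B: type 2 diabetes, metformin 500mg twice daily.",
    "Patient C: hypertension, follow-up scheduled next month.",
]
index = [(embed(r), r) for r in records]  # the "vector database"

def retrieve(question, top_k=1):
    """Embed the question, rank stored vectors by similarity, return the best."""
    q = embed(question)
    ranked = sorted(index, key=lambda pair: cosine(q, pair[0]), reverse=True)
    return [text for _, text in ranked[:top_k]]

# The retrieved context would be prepended to the LLM prompt.
context = retrieve("Which patient is on metformin?")
```

<p>In a production system, <code>embed</code> would call an actual embedding model and <code>index</code> would live in a vector database, but the shape of the logic is the same: embed the query, find the nearest stored vectors, and hand the top matches to the LLM as context.</p><p>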
When a doctor asks a question, the AI searches these records and uses an LLM to craft a response, combining language skills with specific patient data.</p><p>This approach creates a <strong>specialized</strong>, <strong>up-to-date</strong> tool and addresses two additional LLM weaknesses:</p><ol><li><p>It reduces <a href="https://en.wikipedia.org/wiki/Hallucination_(artificial_intelligence)">hallucination</a> by grounding responses in real data</p></li><li><p>It gives developers control over the AI's behavior through the data they curate</p></li></ol><p>As records update, the hospital adds new information to the database, keeping the AI current without retraining.</p><p>This turns patient data into a powerful AI tool, helping doctors quickly access and understand patient history while maintaining accuracy and control.</p><h1>Heresy</h1><p>But what if we start with an empty database?</p><p>This might sound heretical in the context of RAGs. After all, the whole point of a RAG is to ground an LLM in specific, relevant data. But let's entertain this thought for a moment.</p><p>Starting with an empty database could actually be a powerful way to build a knowledge base from scratch.</p><p>In this approach, users not only query the RAG but also <strong>'write'</strong> back to it. This is the key distinction: users become both consumers and producers of knowledge.</p><p>Here's how it might work:</p><ol><li><p>Start with an empty database</p></li><li><p>Enrich through user submissions</p></li><li><p>Grow via <strong>consume-contribute</strong> cycles</p></li></ol><p>This approach enables the system to learn and specialize based on <strong>actual usage</strong>. It grows organically, capturing nuances and knowledge that fixed datasets simply can't.</p><p>Organic growth is crucial, as it means the system evolves naturally in response to real user needs and interests, rather than following a predetermined path. 
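</p><p>The consume-contribute loop described above can be sketched as a tiny read-write store. This is only a skeleton under stated assumptions: the contributed entries are invented, and the naive word-overlap scoring stands in for the embedding search a real RAG would use.</p>

```python
class KnowledgeBase:
    """Sketch of a consume-contribute store: starts empty, grows with use."""

    def __init__(self):
        self.entries = []  # starts empty -- the "heretical" part

    def contribute(self, text):
        """Users write knowledge back into the store."""
        self.entries.append(text)

    def ask(self, question, top_k=1):
        """Users consume knowledge: rank entries by shared words.
        A real system would rank by embedding similarity instead."""
        q = set(question.lower().split())
        scored = sorted(
            self.entries,
            key=lambda e: len(q & set(e.lower().split())),
            reverse=True,
        )
        return scored[:top_k]  # the context an LLM would answer from

# Invented example contributions, e.g. from a team sharing internal know-how.
kb = KnowledgeBase()
kb.contribute("The v2 API requires an auth token in the X-Token header.")
kb.contribute("Deploys run automatically every Friday at noon.")
print(kb.ask("How do I authenticate with the v2 API?"))
```

<p>Note that <code>ask</code> and <code>contribute</code> share the same store: every answer can prompt a correction or a new submission, and each submission improves future answers. That feedback loop is the consume-contribute cycle.</p><p>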
It's a dynamic, living knowledge base that adapts and expands with each interaction.</p><p>Of course, this approach raises intriguing questions. How quickly would the system become useful? How would we manage potential misinformation? Could this lead to RAGs with vastly different knowledge bases depending on their user communities? And so on.</p><h1>Analogy</h1><p>A good analogy for this RAG-based collaborative knowledge building is the immensely successful <a href="https://www.wikipedia.org">Wikipedia</a>. Both models rely on user contributions to grow.</p><p>The RAG approach, however, adds a conversational layer powered by LLMs. This creates a more casual interface for knowledge sharing, lowering the barrier for contribution and encouraging spontaneous input.</p><p>Users can easily ask questions, offer insights, or make connections, potentially leading to unexpected <strong>'happy accidents'</strong> in knowledge creation. This fluid approach to collaborative learning could make knowledge-building more accessible and dynamic than ever before.</p><h1>Next</h1><p>Collaborative knowledge is an exciting topic. As an <a href="http://Cycls.com">AI startup founder</a>, I'm eager to explore its practical implementation.</p><p>We'll be experimenting with prototypes in the coming weeks. <a href="https://x.com/cycls_">Stay tuned for updates</a> on this mix of AI, knowledge management, and community collaboration.</p><div><hr></div><p>Thanks for reading! Follow me on <a href="https://x.com/aburjg">Twitter</a> for more AI insights and updates. For company updates, check out <a href="https://x.com/cycls_">Cycls</a>.</p>]]></content:encoded></item></channel></rss>