<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"><channel><title>Sid Decodes</title><description>Siddhant Minocha&apos;s blog on building, AI, genetics, and preventive healthcare.</description><link>https://sidmin.blog/</link><item><title>MaxToki: The AI That Predicts How Your Cells Age</title><link>https://sidmin.blog/blog/maxtoki-cell-aging-ai/</link><guid isPermaLink="true">https://sidmin.blog/blog/maxtoki-cell-aging-ai/</guid><description>A 1B parameter temporal model trained on 22 million age-annotated cells predicts how gene activity shifts across your lifespan.</description><pubDate>Wed, 08 Apr 2026 00:00:00 GMT</pubDate></item><item><title>AI Embedding Models Explained: Google, Perplexity &amp; OpenAI Compared</title><link>https://sidmin.blog/blog/embedding-models-compared/</link><guid isPermaLink="true">https://sidmin.blog/blog/embedding-models-compared/</guid><description>Google released Gemini Embedding 2 and Perplexity released pplx-embed in the same week. Here&apos;s the science behind embeddings and how these models compare.</description><pubDate>Thu, 12 Mar 2026 00:00:00 GMT</pubDate></item><item><title>Protein Language Models: Treating Amino Acid Sequences Like Sentences</title><link>https://sidmin.blog/blog/protein-language-models/</link><guid isPermaLink="true">https://sidmin.blog/blog/protein-language-models/</guid><description>DNA is a language. Proteins are its stories. Here&apos;s how AI models trained on amino acid sequences are learning to read, write, and design biology from scratch.</description><pubDate>Sat, 07 Mar 2026 00:00:00 GMT</pubDate></item><item><title>KV Caching Explained: Why LLM Inference Is Memory-Bound</title><link>https://sidmin.blog/blog/kv-caching-explained/</link><guid isPermaLink="true">https://sidmin.blog/blog/kv-caching-explained/</guid><description>LLM inference isn&apos;t bottlenecked by compute. It&apos;s bottlenecked by memory. Here&apos;s how KV caching works, why it matters, and the math behind it.</description><pubDate>Tue, 03 Mar 2026 00:00:00 GMT</pubDate></item><item><title>The Math Behind LLMs</title><link>https://sidmin.blog/blog/math-behind-llms/</link><guid isPermaLink="true">https://sidmin.blog/blog/math-behind-llms/</guid><description>A single equation drives every large language model. Here&apos;s the math that keeps the world moving — explained without the hand-waving.</description><pubDate>Sun, 01 Mar 2026 00:00:00 GMT</pubDate></item></channel></rss>