GIT_FEED

garrytan/gbrain

Garry's Opinionated OpenClaw/Hermes Agent Brain

View on GitHub

What it does

GBrain gives AI agents a persistent, self-organizing memory system — so instead of forgetting everything between conversations, your agent builds up a growing knowledge base of people, companies, meetings, and ideas that gets smarter over time. Built by the head of Y Combinator to power his own personal agents, it connects information in a web of relationships (like 'Bob invested in Acme') so you can ask complex questions and get accurate answers that simple keyword or similarity search would miss.
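The relationship-web idea above can be sketched as a tiny triple store. This is a hypothetical illustration in TypeScript (gbrain's language), not gbrain's actual API: the `MemoryGraph` class, its method names, and the sample facts are all invented here to show why edge traversal answers questions that keyword matching cannot.

```typescript
// Hypothetical sketch of a relationship-graph memory (not gbrain's real API).
// Each fact is a (subject, predicate, object) triple, so queries follow
// edges instead of matching keywords.
type Triple = { subject: string; predicate: string; object: string };

class MemoryGraph {
  private triples: Triple[] = [];

  add(subject: string, predicate: string, object: string): void {
    this.triples.push({ subject, predicate, object });
  }

  // One hop: every object linked to `subject` by `predicate`.
  query(subject: string, predicate: string): string[] {
    return this.triples
      .filter((t) => t.subject === subject && t.predicate === predicate)
      .map((t) => t.object);
  }

  // Two hops: e.g. "people hired by companies Bob invested in" --
  // a question keyword search over raw notes would likely miss.
  queryPath(start: string, p1: string, p2: string): string[] {
    return this.query(start, p1).flatMap((mid) => this.query(mid, p2));
  }
}

const g = new MemoryGraph();
g.add("Bob", "invested_in", "Acme");
g.add("Acme", "hired", "Carol");

console.log(g.query("Bob", "invested_in"));              // → ["Acme"]
console.log(g.queryPath("Bob", "invested_in", "hired")); // → ["Carol"]
```

The point of the sketch is the second query: "Carol" shares no keywords with "Bob", so only the stored relationship chain connects them.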

Why it matters

As AI agents move from demos to real business workflows, memory and context are the core unsolved problem — and this is a production-proven solution from one of tech's most influential operators, not a prototype. For founders building agent-powered products, this signals that persistent, self-improving knowledge graphs are becoming the standard architecture, and that the window to differentiate on this layer is closing fast.

Why it's trending

Built by Garry Tan, the current head of Y Combinator, this project grabbed attention fast — pulling in over 6,500 stars in a single week, more than 60% of its entire star count arriving almost overnight. The timing makes sense: builders are hitting a real wall with AI agents that reset every conversation, and gbrain offers a concrete solution with a working knowledge graph that connects relationships the way a human would, rather than just matching keywords. That said, the near-zero contributor ratio and the manipulation penalty flag suggest the spike may be partly driven by social reach rather than organic discovery, so treat it as a project worth watching closely rather than one already proven by a broad community.

Score: 28 · Active

On the radar — signal detected

Stars: 10.4k
Forks: 1.2k
Contributors: 0
Language: TypeScript
Downloads (7d): 1.0k

npm/gbrain

Score updated Apr 23, 2026

Related projects

This is Google's official collection of tutorials, code examples, and ready-to-run notebooks showing builders how to create AI-powered applications using Google's Gemini models on its cloud platform. It covers everything from basic AI conversations to complex multi-step AI agents that can reason and take actions autonomously.

// why it matters With over 15,000 stars and nearly 300 contributors, this repository signals where serious enterprise AI development is heading — Google's cloud ecosystem is positioning itself as a primary destination for teams building production AI products. For founders and PMs evaluating AI infrastructure, this gives a clear picture of Google's capabilities and provides a fast track to building on the same models powering consumer Google products.

Jupyter Notebook · 16.7k stars · 4.2k forks · 292 contrib

Hermes Agent is an AI assistant that gets smarter the more you use it — it remembers past conversations, learns new skills from experience, and builds a profile of who you are over time, all without being tied to any single AI provider or device. It runs in the cloud and connects to messaging apps like Telegram, Slack, and WhatsApp, so you can interact with it anywhere while it handles complex tasks in the background.

// why it matters As AI assistants become a core part of how teams and products operate, the ability to avoid vendor lock-in while building a continuously improving, memory-rich agent is a significant competitive advantage — this is the kind of infrastructure layer that could sit underneath entire products or workflows. With nearly 9,000 stars and over 100 contributors, it signals strong developer demand for agents that persist, learn, and work autonomously rather than resetting with every session.

Python · 110.8k stars · 16.1k forks · 458 contrib

AITER is AMD's open-source library of high-performance building blocks that make AI models run faster on AMD hardware, supporting everything from basic AI operations to complex training and multi-GPU coordination. Think of it as a toolbox that lets AI software teams tap into AMD's chip capabilities without having to write low-level hardware code themselves.

// why it matters As AI infrastructure costs soar, builders are actively exploring alternatives to Nvidia's dominant GPU ecosystem, and AMD is positioning AITER as the key compatibility layer that makes switching or diversifying hardware more practical. For founders and PMs building AI products, this means AMD GPUs become a more credible option for cost reduction or supply chain diversification — especially relevant as demand for AI compute continues to outpace supply.

Python · 412 stars · 289 forks · 200 contrib

Last30Days is a plug-in skill for the Claude AI coding assistant that automatically researches any topic across Reddit, X, YouTube, Hacker News, Polymarket, and Bluesky, then produces a cited summary of what people are actually talking about right now. Think of it as a one-command briefing tool that scans the social web for the past 30 days and distills the signal into a readable report, saved automatically to your computer.

// why it matters As AI tools and markets shift weekly, founders and product teams who can quickly validate what's gaining traction — before it becomes mainstream knowledge — have a real edge in prioritization and positioning. The 23,000+ stars suggest strong demand for ambient, automated trend intelligence baked directly into developer workflows rather than requiring separate research tools.

Python · 23.4k stars · 1.9k forks · 16 contrib
// SUBSCRIBE

The repos that moved this week, why they matter, and what to watch next. One email. No noise.