GIT_FEED

Lordog/dive-into-llms

"Dive into LLMs" (《动手学大模型》): a series of hands-on programming tutorials

View on GitHub

What it does

Dive into LLMs is a free, hands-on tutorial series from Shanghai Jiao Tong University that teaches people how to build with large language models (LLMs) through practical exercises and code notebooks. It covers topics ranging from fine-tuning models for specific tasks, to prompt engineering, knowledge editing, mathematical reasoning, and AI safety — all designed to get beginners up and running quickly.

Why it matters

With over 30,000 stars, this project signals massive demand in the Chinese-speaking developer community for practical, accessible AI education — a strong indicator of where the next wave of AI builders and startups will emerge. For founders and investors, this represents both a talent pipeline and a window into which AI capabilities (reasoning, agents, alignment) practitioners are prioritizing as they build real products.

Why it's trending

A free, university-backed course teaching hands-on LLM development is pulling in nearly 4,800 new stars this week — a 26% jump over last week's already-strong pace, suggesting word is spreading fast through developer communities in China and beyond. The project's practical focus on fine-tuning, prompt engineering, and AI safety fills a real gap for builders who want structured, code-first guidance rather than theory. That said, the star velocity here is sharp enough to have triggered a manipulation flag, so treat the raw numbers with some skepticism — the underlying content from Shanghai Jiao Tong University is legitimate, but the growth pattern warrants a closer look before drawing conclusions about organic demand.

Score: 10 · Active

On the radar — signal detected

Stars: 33.7k
Forks: 4.1k
Contributors: 14
Language: Jupyter Notebook

Score updated Apr 23, 2026

Related projects

This is Google's official collection of tutorials, code examples, and ready-to-run notebooks showing builders how to create AI-powered applications using Google's Gemini models on its cloud platform. It covers everything from basic AI conversations to complex multi-step AI agents that can reason and take actions autonomously.

// why it matters With over 15,000 stars and nearly 300 contributors, this repository signals where serious enterprise AI development is heading — Google's cloud ecosystem is positioning itself as a primary destination for teams building production AI products. For founders and PMs evaluating AI infrastructure, this gives a clear picture of Google's capabilities and provides a fast track to building on the same models powering consumer Google products.

Jupyter Notebook · 16.7k stars · 4.2k forks · 292 contributors

Hermes Agent is an AI assistant that gets smarter the more you use it: it remembers past conversations, learns new skills from experience, and builds a profile of who you are over time, all without being tied to any single AI provider or device. It runs in the cloud and connects to messaging apps like Telegram, Slack, and WhatsApp, so you can interact with it anywhere while it handles complex tasks in the background.

// why it matters As AI assistants become a core part of how teams and products operate, the ability to avoid vendor lock-in while building a continuously improving, memory-rich agent is a significant competitive advantage — this is the kind of infrastructure layer that could sit underneath entire products or workflows. With nearly 9,000 stars and over 100 contributors, it signals strong developer demand for agents that persist, learn, and work autonomously rather than resetting with every session.

Python · 110.8k stars · 16.1k forks · 458 contributors

AITER is AMD's open-source library of high-performance building blocks that make AI models run faster on AMD hardware, supporting everything from basic AI operations to complex training and multi-GPU coordination. Think of it as a toolbox that lets AI software teams tap into AMD's chip capabilities without having to write low-level hardware code themselves.

// why it matters As AI infrastructure costs soar, builders are actively exploring alternatives to Nvidia's dominant GPU ecosystem, and AMD is positioning AITER as the key compatibility layer that makes switching or diversifying hardware more practical. For founders and PMs building AI products, this means AMD GPUs become a more credible option for cost reduction or supply chain diversification — especially relevant as demand for AI compute continues to outpace supply.

Python · 412 stars · 289 forks · 200 contributors

Last30Days is a plug-in skill for the Claude AI coding assistant that automatically researches any topic across Reddit, X, YouTube, Hacker News, Polymarket, and Bluesky, then produces a cited summary of what people are actually talking about right now. Think of it as a one-command briefing tool that scans the social web for the past 30 days and distills the signal into a readable report, saved automatically to your computer.

// why it matters As AI tools and markets shift weekly, founders and product teams who can quickly validate what's gaining traction — before it becomes mainstream knowledge — have a real edge in prioritization and positioning. The star count suggests strong demand for ambient, automated trend intelligence baked directly into developer workflows rather than requiring separate research tools.

Python · 23.4k stars · 1.9k forks · 16 contributors
// SUBSCRIBE

The repos that moved this week, why they matter, and what to watch next. One email. No noise.