
Meta's Muse, HeyGen's clone machine, and Gemini finally gets folders

Today in AI: three launches that quietly change how we work, plus a Claude trick hiding in plain sight.

👋 Hello hello,

Big week. Meta dropped a reasoning model nobody saw coming, HeyGen says it cracked the one thing every AI avatar tool has been faking, and Gemini finally, finally gave us folders. (Yes, in 2026. Let that sit.)

I spent the morning poking at all three so you don't have to scroll twelve threads to figure out what's worth your time.

Here's the quick scoop.

💬 Quick note: We’re building something to help teams actually get good at AI (not just use it). → Get early access here

🔥🔥🔥 Three Highly Curated AI Stories

Meta's new lab just shipped its first model, and it's not a quiet release. Muse Spark is a natively multimodal reasoning model with tool-use, visual chain-of-thought, and multi-agent orchestration baked in from day one.

Translation: it can look at things, think through them step by step while showing its work, call tools, and coordinate with other agents on longer tasks. That's the full stack most labs are still stitching together.

It's live now on meta.ai and the Meta AI app, with a private API preview for select partners. Meta is also hinting at open-sourcing future versions, which would be a big deal for anyone building on top.

Character consistency has been the uncanny valley of AI video: your avatar looks like you in one clip and like your cousin in the next. HeyGen claims it has finally fixed that.

Avatar V captures you in 15 seconds and then holds your identity across every video you generate. Swap outfits, change the setting, rewrite the script, and the person on screen still looks like you.

💡We're going to put this head-to-head with Gemini's Veo and Copilot's video tools on our channels over the next week and report back with the honest differences. If you've tested it already, reply and tell us what broke.

Gemini now has project organization. You can spin up a notebook, pull in past chats, attach relevant files as sources, and keep everything for one project in one place instead of scrolling through an endless sidebar.

Hit "New notebook" in the side panel to start one. It's the kind of feature that sounds boring until you've lost a conversation thread for the third time that week. For anyone juggling multiple clients or research threads, this is the quiet upgrade that actually changes your daily flow.

🔥🔥 Two Pro AI Tips

1. 🤖 Claude Code subagents

Claude Code now lets you spin up subagents: smaller, task-specific assistants that handle a piece of a larger job while the main agent keeps the plot. Think of it as delegating. One subagent reviews code, another writes tests, another handles docs, and none of them clutter the main context window.

Best for developers running bigger projects where context bloat has been slowing Claude down.
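If you want to try it, Claude Code reads custom subagent definitions from markdown files with YAML frontmatter in your project's `.claude/agents/` folder (or `~/.claude/agents/` for all projects). A minimal sketch — the `name`, `description`, and system prompt below are made-up examples, and the optional `tools` list should be checked against the current docs:

```markdown
---
name: code-reviewer
description: Reviews recent code changes for bugs and style issues.
tools: Read, Grep, Glob
---

You are a careful code reviewer. When invoked, inspect the recent
changes and report bugs, style problems, and missing tests. Do not
edit files yourself; summarize findings for the main agent.
---
```

Claude Code can then delegate matching tasks to this subagent automatically, or you can ask for it by name.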

2. 💬 Claude's reply-to-selection feature

Finally in Claude chat: highlight any chunk of text in Claude's response and you get a "Reply" button that lets you respond just to that piece. The selected text gets quoted in your next message, so Claude knows exactly what you're reacting to.

Best for long responses where you want to push back on one point without re-explaining the whole thread. Small feature, surprisingly good upgrade.

Is this you? Your team is using AI. But they’re not getting better results.
We’re fixing that. Join the waitlist to find out how.

🔥 Things You Should Know About AI

You've seen "MCP" in every other AI post this month. Most people nod along. Almost nobody can explain what it actually does.

MCP is the open standard that lets AI models plug into your tools, data, and apps in a consistent way. Instead of every app inventing its own integration, MCP gives models one common language to talk to Gmail, Notion, GitHub, your database, whatever. It's why Claude can suddenly "use" your stuff without custom code per tool.

Here's a simple way to start using it today:

  1. Pick one tool you use daily (Notion, Gmail, GitHub, a local folder).

  2. Install the matching MCP server from the public directory.

  3. Connect it to Claude Desktop or another MCP-compatible client.

  4. Ask the model to fetch, summarize, or act on something from that tool in plain English.

  5. Chain a second MCP server in and let the model move data between them.
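For step 3, Claude Desktop picks up MCP servers from its `claude_desktop_config.json` file. Here's a minimal sketch wiring in the reference filesystem server — the folder path is a placeholder you'd swap for your own, and the package name is the reference implementation published by the MCP project:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/Users/you/Documents"
      ]
    }
  }
}
```

Restart Claude Desktop after saving, and the model can read from that folder when you ask it to in plain English. Adding a second entry under `mcpServers` is all step 5 takes.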

We broke down the full protocol, how it actually works, and the workflows that matter. If you want the deep version with the working examples, check this out.

💬 Quick poll: What’s one AI tool you’ve tried recently that actually stuck?

Did you learn something new?


Until next time,
Team @PracticalyAI
