Meta launches Vibes—an all-AI video feed—just as rivals crack down on synthetic content. Users aren't buying it, partnerships reveal internal gaps, and the timing suggests competitive desperation over consumer demand.
👉 Meta launched Vibes Thursday—a TikTok-style feed populated entirely by AI-generated videos that users can create, remix and share.
📊 User reaction was brutal, with top comments calling it "AI slop" and questioning why anyone wants synthetic video feeds.
🏭 Meta relies on partners Midjourney and Black Forest Labs for core functionality, suggesting internal video models aren't production-ready.
⚖️ The launch contradicts Meta's own recent guidance advising creators toward "authentic storytelling" over low-value content.
🌍 Timing looks defensive—YouTube and other platforms are cracking down on AI content while Meta doubles down on synthetic feeds.
🚀 The move tests whether consumer appetite for synthetic content outweighs the preference for authenticity in social media's next phase.
Meta is rolling out Vibes, a dedicated feed of AI-generated short videos inside the Meta AI app and on meta.ai—a move the company frames as creative empowerment even as other platforms clamp down on synthetic filler. See the official details in Meta’s Vibes announcement. As a product bet, it’s audacious. As a timing signal, it looks defensive.
Vibes arrives with a simple premise: a TikTok-style, full-screen feed populated entirely by AI. You can generate a clip from scratch, remix what others made, add visuals or music, and then share within the Vibes feed or cross-post to Instagram and Facebook. Personalization rides Meta’s existing recommendation systems, so the stream should sharpen as you scroll.
The gap between pitch and reception
The early reaction wasn’t subtle. Under Mark Zuckerberg’s launch post, the most-liked replies dismissed the feature as more “AI slop.” That response tracks broader creator fatigue with uncanny, near-but-not-human content flooding social feeds. TechCrunch’s bluntly headlined launch coverage captured the tone of the day.
What’s actually new
This isn’t Meta’s first step into AI video. In June, the company pushed out generative video editing across the Meta AI app and meta.ai; Vibes elevates those tools into a standalone, remix-friendly feed. The company describes this week’s release as an “early preview,” emphasizing iteration over completeness. The mechanics—remix, re-style, cross-post—are straightforward; the bet is that a dedicated feed will stoke creation loops the main apps couldn’t sustain.
A strategic contradiction
Meta has spent the summer telling creators it would punish “unoriginal” reposts and low-effort filler on Facebook to boost authentic storytelling. Now it’s introducing a product that by design amplifies synthetic, de-personalized clips. That’s not necessarily hypocrisy—the company can want less plagiarism while testing new formats—but it does set up mixed incentives: the main apps nudge toward “real,” while Vibes rewards infinite remix. Creators will notice the whiplash.
Partnerships expose a timeline
Meta says it’s “continuing to develop our own models,” but reporting indicates the early Vibes stack leans on partners including Midjourney and Black Forest Labs. That’s a pragmatic shortcut—and a tell. If true, the dependency buys Meta speed while conceding that its in-house video generation isn’t yet ready for primetime. It also creates a leverage point for partners on pricing and access until Meta closes the capability gap.
Competitive context: swimming against the current
Across the industry, moderation is moving one way while Vibes sails another. YouTube is tightening monetization rules to limit mass-produced, low-value AI content, clarifying that “inauthentic” spam won’t get paid. Meta itself has talked up demoting repetitive, derivative posts. Vibes—an all-AI river of clips—runs against that tide and will test whether a fenced-off feed can contain the downside while showcasing the upside.
Product logic—and the risk surface
There is a coherent product thesis here. Vibes funnels users into the Meta AI app, which is also the command center for Ray-Ban smart glasses and other camera-adjacent features. If Vibes sticks, it becomes a showcase for multimodal assistants and a distribution spine for future models. The flip side is a moderation and trust headache. Synthetic video at scale expands the attack surface for scams, harassment, political manipulation, and run-of-the-mill junk. Unless guardrails and provenance signals are visible and enforced, the feed could become a magnet for what users are already rejecting elsewhere. (Meta’s post doesn’t detail firm guardrails yet, beyond “preview” framing.)
The business read
Meta booked nearly $165 billion in revenue last year. But on AI mindshare, it trails OpenAI, Google, and Anthropic. Vibes is a public demo track for Meta’s creative-media ambitions and its reorganized AI push under the “Superintelligence Labs” banner. Launching a consumer-facing, model-hungry feed may also help justify the company’s escalating AI spend by tying research directly to visible, daily use. The open question is whether users want an all-synthetic stream—or if the novelty decays into numbness.
Bottom line
Vibes is both a product and a provocation. If it lands, Meta reframes “AI slop” as a creative playground with social gravity. If it misses, it becomes another reminder that scale alone can’t will an audience into caring about content that feels a little off.
Why this matters
Platform direction: If Vibes finds traction, “authenticity” may stop being the default value prop of social media, with recommendation engines optimizing for remixable style over lived experience.
AI execution: Reports of external model partners suggest Meta’s in-house video generation still isn’t production-ready, a gap that shapes timelines, costs, and investor expectations.
❓ Frequently Asked Questions
Q: What exactly is Meta's "Superintelligence Labs" that's behind Vibes?
A: Meta created this AI division in June 2025 after key staff departures and Llama 4 setbacks. It coordinates AI development across all Meta products, from the core apps of a company that booked nearly $165 billion in revenue last year to Ray-Ban smart glasses. Vibes represents the division's first major consumer test.
Q: Why does Meta need partners like Midjourney for video generation?
A: The partnerships reveal Meta's internal video models aren't production-ready yet. While Meta develops proprietary alternatives, it's white-labeling competitors' technology for consumer launch. This creates dependency risks—partners could adjust pricing or restrict access during Meta's catch-up period.
Q: How does Vibes connect to Meta's other AI products?
A: Vibes sits inside the Meta AI app, which also manages Ray-Ban smart glasses and photo editing features. The strategy creates a testing ground for multimodal AI capabilities and drives user adoption of Meta's hardware ecosystem through content creation tools.
Q: What specific moderation challenges does AI video create that text doesn't?
A: AI video enables convincing deepfakes, false documentation of events, and synthetic harassment at scale. Unlike text-based misinformation, AI video can create "evidence" of things that never happened, making verification exponentially more complex for content moderators.
Q: Why launch Vibes when users clearly don't want more AI content?
A: Meta trails OpenAI, Google, and Anthropic in AI mindshare despite generating nearly $165 billion in revenue last year. Vibes serves as a public demonstration of AI capabilities and justifies escalating AI infrastructure spending by connecting research to daily consumer use.
Tech translator with German roots who fled to Silicon Valley chaos. Decodes startup noise from San Francisco. Launched implicator.ai to slice through AI's daily madness—crisp, clear, with Teutonic precision and sarcasm.
E-Mail: marcus@implicator.ai
Anthropic hits $5B revenue run-rate and 300K enterprise customers, racing to build global infrastructure as Microsoft, Google embed AI everywhere. The pure-play bet faces a closing window: can independent AI survive when every cloud platform becomes a competitor?
OpenAI's new benchmark claims GPT-5 matches human experts 40% of the time on real work tasks—a 3x jump in 15 months. But the test reveals as much about what AI can't measure as what it can. The productivity promise meets measurement reality.
Microsoft integrates Claude AI into Office alongside OpenAI, marking first major diversification from its $13B partner. The move signals model commoditization while preserving strategic alliances—enterprises gain choice, vendors hedge bets.
Alibaba promises to exceed its $53B AI spending target while burning $2.6B quarterly—and investors love it. The launch of trillion-parameter Qwen3-Max signals China's bid to control the entire AI stack, profits be damned.