DeepProtein: A One-Stop Shop for AI-Powered Protein Research
Researchers from ETH Zurich and Nanjing University have created DeepProtein, a new deep learning library that makes complex protein analysis as simple as ordering takeout.
The new tool brings together cutting-edge AI models under one roof, saving scientists precious time they'd otherwise spend wrestling with code. DeepProtein tackles everything from predicting how proteins fold to mapping their interactions with other molecules. It's built for both AI experts and biologists who just want their protein analysis to work without a PhD in computer science.
The team didn't just build a tool – they put it through its paces. They tested eight different types of AI architectures across multiple protein analysis tasks. These ranged from basic classification problems to the more complex challenge of predicting protein structures in 3D space.
Credit: Jiaqing Xie, Department of Computer Science ETH Zurich & Tianfan Fu, National Key Laboratory for Novel Software Technology, School of Computer Science Nanjing University
The star of the show is their new model family, DeepProt-T5. Based on the powerful Prot-T5 architecture, these fine-tuned models achieved top scores on four benchmark tasks and strong results on six others. Think of it as a straight-A student who also plays varsity sports.
What sets DeepProtein apart is its user-friendly approach. Previous tools often required researchers to understand both complex biology and deep learning. DeepProtein strips away this complexity with a simple command-line interface. It's like having an AI research assistant who speaks plain English.
The library builds on DeepPurpose, a widely used tool for drug discovery. This heritage means researchers can easily integrate DeepProtein with existing workflows and databases. The team also provides detailed documentation and tutorials, ensuring scientists don't get stuck in implementation details.
DeepProtein fills several gaps in the protein research toolkit. While previous benchmarks like PEER focused mainly on sequence-based methods, DeepProtein adds structure-based approaches and pre-trained language models to the mix. It's the difference between having just a hammer and owning a complete toolbox.
The timing couldn't be better. The success of tools like AlphaFold 2 has sparked renewed interest in applying machine learning to protein research. DeepProtein rides this wave by making advanced AI techniques accessible to more researchers.
For the technically minded, the library supports various neural network architectures: CNNs, CNN-RNNs, RNNs, transformers, graph neural networks, graph transformers, pre-trained protein language models, and large language models. Each brings its own strengths to different protein analysis tasks.
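To make the sequence-model side of that list concrete, here is a minimal sketch (not DeepProtein's actual code) of one-hot encoding a protein sequence into the fixed-size matrix a CNN-style model typically consumes; the sequence, alphabet, and length cap are illustrative assumptions.

```python
import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard residues
AA_INDEX = {aa: i for i, aa in enumerate(AMINO_ACIDS)}

def one_hot_encode(seq: str, max_len: int = 64) -> np.ndarray:
    """Return a (max_len, 20) one-hot matrix; non-standard residues stay all-zero."""
    mat = np.zeros((max_len, len(AMINO_ACIDS)), dtype=np.float32)
    for pos, aa in enumerate(seq[:max_len]):
        idx = AA_INDEX.get(aa)
        if idx is not None:
            mat[pos, idx] = 1.0
    return mat

x = one_hot_encode("MKTAYIAKQR")  # a made-up 10-residue sequence
print(x.shape)       # (64, 20)
print(int(x.sum()))  # 10 — one active channel per encoded residue
```

Graph-based models start from a different representation (residues as nodes, spatial contacts as edges), which is why benchmarking across architectures matters: the input encoding itself shapes what each model can learn.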
The team has made everything open source and available on GitHub. Their pre-trained models live on HuggingFace, ready for researchers to download and use. They've eliminated the need for redundant training, making model deployment faster and more efficient.
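As a hedged sketch of what using those HuggingFace models might look like: the preprocessing below (spaces between residues, rare residues mapped to X) follows the published Prot-T5 convention, but the exact DeepProt-T5 repository names are assumptions you would look up on the project's HuggingFace page.

```python
import re

def preprocess(seq: str) -> str:
    """Upper-case, map rare residues (U, Z, O, B) to X, and space-separate,
    as the Prot-T5 tokenizer expects."""
    seq = re.sub(r"[UZOB]", "X", seq.upper())
    return " ".join(seq)

print(preprocess("mktuAYB"))  # "M K T X A Y X"

# Loading an encoder would then look roughly like this (requires the
# `transformers` package and network access; the base Prot-T5 checkpoint
# is shown, not a DeepProt-T5 fine-tune):
#
# from transformers import T5Tokenizer, T5EncoderModel
# name = "Rostlab/prot_t5_xl_half_uniref50-enc"
# tokenizer = T5Tokenizer.from_pretrained(name)
# model = T5EncoderModel.from_pretrained(name)
# ids = tokenizer(preprocess("MKTAYIAKQR"), return_tensors="pt")
# embeddings = model(**ids).last_hidden_state
```

Because the fine-tuned weights are downloadable, researchers can skip training entirely and go straight to inference on their own sequences.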
Why this matters:
- DeepProtein democratizes AI-powered protein research. What once required expertise in both biology and deep learning now needs just a basic understanding of command-line interfaces.
- The comprehensive benchmarking across different AI architectures gives researchers clear guidance on which tools work best for specific protein analysis tasks. No more guessing games or trial and error.