News

A new collective-communication system, OptiReduce, speeds up AI and machine learning training across multiple cloud servers by setting time boundaries for communication rather than waiting for every server to catch up, ...
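The idea of bounding communication time rather than waiting on stragglers can be sketched as below. This is a hypothetical illustration, not OptiReduce's actual implementation; the worker simulation, function names, and deadline parameter are all invented for the example.

```python
import concurrent.futures as cf
import random
import time


def worker_gradient(worker_id: int) -> list[float]:
    # Simulate a worker computing a gradient; some workers straggle.
    time.sleep(random.uniform(0.01, 0.2))
    return [float(worker_id)] * 4


def bounded_allreduce(num_workers: int = 8, deadline_s: float = 0.1) -> list[float]:
    """Average whichever gradients arrive before the deadline,
    instead of blocking on the slowest worker."""
    pool = cf.ThreadPoolExecutor(max_workers=num_workers)
    futures = [pool.submit(worker_gradient, i) for i in range(num_workers)]
    done, _ = cf.wait(futures, timeout=deadline_s)
    if not done:
        # Fall back: wait for at least one gradient so training can proceed.
        done, _ = cf.wait(futures, return_when=cf.FIRST_COMPLETED)
    grads = [f.result() for f in done]
    pool.shutdown(wait=False, cancel_futures=True)  # Python 3.9+
    n = len(grads)
    return [sum(g[j] for g in grads) / n for j in range(len(grads[0]))]
```

With a generous deadline every worker contributes; with a tight one, the average is taken over the subset that made it in time, trading a little gradient fidelity for bounded step latency.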
B-v2 is an open-source hybrid Transformer-SSM model trained on 3T tokens, claiming faster inference than comparable LLMs.
Qwen3’s open-weight release under an accessible license marks an important milestone, lowering barriers for developers and organizations.
Researchers from UCLA and Meta AI have introduced d1, a novel framework using reinforcement learning (RL) to significantly enhance the reasoning capabilities of diffusion-based large language models ...
That’s where pretrained models come in. These ready-to-use tools are making AI faster, cheaper, and more scalable. The rise of ...
Palmyra X5, developed to power multi-step agents efficiently, is available exclusively via Writer and Amazon Bedrock as a ...
Researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed a novel artificial ...
A new “periodic table for machine learning” is reshaping how researchers explore AI, unlocking fresh pathways for discovery.