News

The Mu small language model enables an AI agent to take action on hundreds of system settings. It’s now in preview for some ...
The 330-million-parameter model was trained on Azure A100 GPUs and fine-tuned through a multi-phase process.
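The item gives no details of that multi-phase process, so the following is only a generic sketch of how sequential fine-tuning phases are typically run; the stand-in model, phase names, learning rates, and random batches are hypothetical placeholders, not Microsoft's pipeline.

    # Generic multi-phase fine-tuning loop. The tiny stand-in model, phase names,
    # learning rates, epoch counts, and random batches are placeholders; a real
    # pipeline would load the pretrained SLM and phase-specific datasets.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    model = nn.Linear(16, 4)                      # stand-in for the pretrained model
    phases = [
        ("supervised fine-tuning", 3e-5, 2),      # (phase name, learning rate, epochs)
        ("task-specific tuning",   1e-5, 1),
    ]
    for name, lr, epochs in phases:
        optimizer = torch.optim.AdamW(model.parameters(), lr=lr)
        for _ in range(epochs):
            x, y = torch.randn(32, 16), torch.randn(32, 4)   # placeholder batch
            loss = F.mse_loss(model(x), y)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
        print(f"finished phase: {name} (last loss {loss.item():.3f})")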
There is a new paper by Google and Waymo ("Scaling Laws of Motion Forecasting and Planning: A Technical Report") that confirmed ...
Mu is built on a transformer-based encoder-decoder architecture featuring 330 million parameters, making the SLM a good ...
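For a rough sense of scale, the sketch below shows how an encoder-decoder transformer adds up to roughly 330 million parameters. The hidden size, head count, encoder/decoder layer split, and vocabulary size are illustrative assumptions, not Mu's published configuration; they are chosen only so the total lands near that figure.

    # Illustrative encoder-decoder transformer sized to roughly 330M parameters.
    # Every dimension below is an assumption for this sketch, not Mu's actual config.
    import torch.nn as nn

    d_model = 1024                      # assumed hidden size
    vocab_size = 32_000                 # assumed vocabulary size
    model = nn.Transformer(
        d_model=d_model,
        nhead=16,                       # assumed attention heads
        num_encoder_layers=16,          # assumed encoder-heavy split
        num_decoder_layers=6,
        dim_feedforward=4 * d_model,
        batch_first=True,
    )
    embedding = nn.Embedding(vocab_size, d_model)   # shared input/output embedding table

    total = sum(p.numel() for p in model.parameters())
    total += sum(p.numel() for p in embedding.parameters())
    print(f"approx. parameter count: {total / 1e6:.0f}M")   # ~335M with these numbers

Run as-is, this reports roughly 335 million parameters, the same order as the figure quoted for Mu; the point is only to show how an encoder-heavy split reaches that size, not to reproduce the model.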
The company will also preview its Pro Convert Xmit AIO, the first member of the Xmit family, at InfoComm. The streaming encoder/decoder is a standalone device that converts one channel of SDI or HDMI ...