Posted December 20, 2023

Today, Databricks is excited to announce support for Mixtral 8x7B in Model Serving. Mixtral 8x7B is a sparse Mixture of Experts (MoE) open...
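Since the announcement is about serving Mixtral 8x7B behind a Model Serving endpoint, here is a minimal sketch of what querying such an endpoint could look like. The endpoint name `databricks-mixtral-8x7b-instruct`, the `/serving-endpoints/<name>/invocations` route, and the chat-style payload are assumptions based on Databricks' usual serving conventions, not details taken from this post; check the full article for the actual names.

```python
# Minimal sketch of querying a Mixtral 8x7B serving endpoint.
# Assumptions (not confirmed by the post above): the endpoint is named
# "databricks-mixtral-8x7b-instruct" and the workspace exposes the
# standard /serving-endpoints/<name>/invocations REST route.
import os
import requests

WORKSPACE_URL = os.environ["DATABRICKS_HOST"]  # e.g. https://<workspace>.cloud.databricks.com
TOKEN = os.environ["DATABRICKS_TOKEN"]         # workspace personal access token

endpoint = "databricks-mixtral-8x7b-instruct"  # assumed endpoint name
url = f"{WORKSPACE_URL}/serving-endpoints/{endpoint}/invocations"

# Chat-style request body; field names follow the common
# messages/max_tokens convention and may differ in practice.
payload = {
    "messages": [
        {"role": "user", "content": "Summarize what a sparse Mixture of Experts model is."}
    ],
    "max_tokens": 256,
}

resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json())
```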