On April 23 local time, Microsoft launched the open-source lightweight Phi-3 series of AI models, claiming it is the most capable and cost-effective family of small language models currently on the market. The smallest version, Phi-3-mini, has only 3.8 billion parameters, yet it performs on par with models more than twice its size and outperforms Meta's Llama 3 8B in multiple benchmark tests. The larger Phi-3-small and Phi-3-medium versions can even surpass GPT-3.5 Turbo. Even more noteworthy, Phi-3-mini has a very small memory footprint and can generate 12 tokens per second on the A16 Bionic chip of an iPhone 14, meaning the model can run directly on a phone without an internet connection. It has also been reported that Phi-3 may cost only one tenth as much as models of equivalent performance.
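To make the on-device claim concrete, below is a minimal sketch of running Phi-3-mini locally with the Hugging Face transformers library. The model id microsoft/Phi-3-mini-4k-instruct, the prompt, and the generation settings are assumptions for illustration; actual phone deployment would typically use a quantized runtime (such as ONNX Runtime or llama.cpp) rather than this desktop-style code.

# Minimal local-inference sketch for Phi-3-mini (assumed model id, not from the article)
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed Hugging Face model id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # half precision to reduce memory use
    device_map="auto",           # place layers on GPU/CPU automatically (requires accelerate)
    trust_remote_code=True,      # Phi-3 shipped custom model code at launch
)

# Phi-3 is an instruction-tuned chat model; apply_chat_template builds the prompt tokens.
messages = [{"role": "user", "content": "Explain what a small language model is."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128, do_sample=False)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))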