
A better Phi-3 Family is coming – multi-language support, better vision, intelligent MoEs



In addition to computing power, model size is a key factor in improving model performance. Under a fixed computing budget, training a larger model for fewer training steps is often better than training a smaller model for more steps.

The MoE model has the following characteristics:

  • Pre-training is faster than for dense models.
  • Inference is faster than for a dense model with the same total number of parameters, since only a few experts are active per token (see the routing sketch after this list).
  • All experts must be loaded into memory, so GPU memory requirements are high.
  • Fine-tuning poses many challenges, but recent research suggests that instruction tuning of mixture-of-experts models holds great potential.
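
To make the routing idea concrete, below is a minimal sketch of a top-k mixture-of-experts layer in PyTorch. The class name SimpleMoELayer, the layer sizes, and the choice of 8 experts with 2 active per token are illustrative assumptions for exposition, not the actual Phi-3 MoE implementation (Phi-3.5-MoE uses 16 experts with 2 active per token).


import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleMoELayer(nn.Module):
    """Minimal top-k mixture-of-experts layer: a router picks k experts per
    token and the output is the gate-weighted sum of those experts."""
    def __init__(self, d_model=64, d_hidden=128, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(), nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):                          # x: (tokens, d_model)
        gate_logits = self.router(x)               # (tokens, num_experts)
        weights, chosen = torch.topk(gate_logits, self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)       # normalize over the chosen experts only
        out = torch.zeros_like(x)
        for slot in range(self.top_k):             # each token runs only top_k experts
            for e, expert in enumerate(self.experts):
                mask = chosen[:, slot] == e        # tokens that routed this slot to expert e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

tokens = torch.randn(4, 64)                        # 4 tokens, model width 64
print(SimpleMoELayer()(tokens).shape)              # torch.Size([4, 64])


Because only top_k of the experts run per token, the compute per token stays close to that of a much smaller dense model, while all experts still have to sit in GPU memory.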

Now that AI agent applications are widespread, MoE models can be used to power agents; in multi-task scenarios they respond faster.

We can explore a simple scenario in which we use AI to write a tweet from some source content, translate it into Chinese, and post it on a social network. Phi-3 MoE can handle this end to end: we use a prompt to set up and organize the tasks, such as drafting the post, translating it, and assembling the final answer.


"""

sys_msg = """You are a helpful AI assistant, you are an agent capable of using a variety of tools to answer a question. Here are a few of the tools available to you:

- Blog: This tool helps you explain a given knowledge point and write it up as Twitter- or Facebook-style content
- Translate: This tool translates content into any language, using plain language as required
- Final Answer: the final answer tool must be used to respond to the user. You must use this when you have decided on an answer.

To use these tools you must always respond in JSON format containing `"tool_name"` and `"input"` key-value pairs. For example, to answer the question "Build Multi Agents with MOE models" you must use the Blog tool like so:


{
    "tool_name": "Blog",
    "input": "Build Muliti Agents with MOE models"
}


Or to translate the question "can you introduce yourself in Chinese" you must respond:


{
    "tool_name": "Search",
    "input": "can you introduce yourself in Chinese"
}


Remember to output just the final result, in JSON format containing `"agentid"`, `"tool_name"`, `"input"` and `"output"` key-value pairs:


[
    {   "agentid": "step1",
        "tool_name": "Blog",
        "input": "Build Muliti Agents with MOE models",
        "output": "........."
    },

    {   "agentid": "step2",
        "tool_name": "Search",
        "input": "can you introduce yourself in Chinese",
        "output": "........."
    },
    {
        "agentid": "final"
        "tool_name": "Result",
        "output": "........."
    }
]


The user's request is as follows.


"""


Given the skills and task arrangement defined in this prompt, Phi-3 MoE can break a user request into tasks and complete each one in turn.
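
As a minimal sketch of how the prompt above could be driven end to end, the following Python snippet loads the microsoft/Phi-3.5-MoE-instruct checkpoint with Hugging Face transformers, sends the system prompt plus a user request, and dispatches the returned JSON plan to local tool functions. The user request, the blog/translate helper functions, and the generation settings are assumptions made for illustration.


import json
from transformers import pipeline

# Load the instruct MoE checkpoint; this needs a GPU setup with enough memory.
generator = pipeline(
    "text-generation",
    model="microsoft/Phi-3.5-MoE-instruct",
    device_map="auto",
    torch_dtype="auto",
    trust_remote_code=True,
)

user_request = "Build Multi Agents with MOE models, then introduce the result in Chinese"
messages = [
    {"role": "system", "content": sys_msg},    # the prompt defined above
    {"role": "user", "content": user_request},
]

# Ask the model for the JSON plan described in the prompt.
reply = generator(messages, max_new_tokens=1024, return_full_text=False)
plan = json.loads(reply[0]["generated_text"])  # may need stripping if extra text is returned

# Hypothetical local implementations of the tools named in the prompt.
def blog(text):      return f"[tweet-style post about: {text}]"
def translate(text): return f"[Chinese translation of: {text}]"
tools = {"Blog": blog, "Translate": translate}

for step in plan:
    handler = tools.get(step["tool_name"])
    if handler:                                # run the Blog / Translate steps locally
        step["output"] = handler(step["input"])
    else:                                      # the final answer step
        print("Final answer:", step.get("output"))


In a real agent, the Blog and Translate handlers would call a model or a translation API, and the output of the final step would be what gets posted to the social network.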




