A better Phi-3 Family is coming – multi-language support, better vision, intelligent MoEs

by info.odysseyx@gmail.com, August 20, 2024

Besides computing power, model size is a key factor in improving model performance. Under a limited computing budget, training a larger model for fewer steps is often better than training a smaller model for more steps. Mixture-of-experts (MoE) models have the following characteristics:

- Pre-training is faster than for dense models.
- Inference is faster than for a dense model with the same total number of parameters.
- All experts must be loaded into memory, so a lot of video memory is required.
- Fine-tuning poses many challenges, although recent research suggests that instruction tuning of mixture-of-experts models holds great potential.

Now that AI agent applications are widespread, MoE models can be used to enhance AI agents, responding faster in multi-task scenarios. As a simple scenario, we can use AI to create a tweet from some content, translate it into Chinese, and post it to a social network, and we can combine Phi-3 MoE to complete this. We can use a prompt to set up and organize tasks such as writing the blog content, translating it, and producing the final answer:

```python
sys_msg = """You are a helpful AI assistant. You are an agent capable of using a variety of tools to answer a question. Here are a few of the tools available to you:

- Blog: this tool helps you describe a certain knowledge point and content, and finally write it up as Twitter- or Facebook-style content
- Translate: this is a tool that helps you translate into any language, using plain language as required
- Final Answer: the final answer tool must be used to respond to the user. You must use this when you have decided on an answer.

To use these tools you must always respond in JSON format containing `"tool_name"` and `"input"` key-value pairs.
"""
```
```python
sys_msg += """
For example, to answer the question "Build Multi Agents with MoE models" you must use the Blog tool like so:

{
    "tool_name": "Blog",
    "input": "Build Multi Agents with MoE models"
}

Or to translate the question "can you introduce yourself in Chinese" you must respond:

{
    "tool_name": "Translate",
    "input": "can you introduce yourself in Chinese"
}

Remember to output just the final result, in JSON format containing `"agentid"`, `"tool_name"`, `"input"` and `"output"` key-value pairs:

[
    {
        "agentid": "step1",
        "tool_name": "Blog",
        "input": "Build Multi Agents with MoE models",
        "output": "........."
    },
    {
        "agentid": "step2",
        "tool_name": "Translate",
        "input": "can you introduce yourself in Chinese",
        "output": "........."
    },
    {
        "agentid": "final",
        "tool_name": "Final Answer",
        "output": "........."
    }
]

The user's question is as follows.
"""
```

Given the skills and task arrangement described in this prompt, Phi-3 MoE can assign each task to the appropriate tool and complete the associated work.
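To make the plan format above concrete, here is a minimal sketch of the consumer side: parsing the JSON plan the model emits and dispatching each step to a tool. The tool functions (`blog_tool`, `translate_tool`) are hypothetical stand-ins, not part of Phi-3 MoE; in a real agent they would call the model or a translation service.

```python
import json

# Hypothetical tool implementations -- placeholders for real model calls.
def blog_tool(text):
    return f"[Blog draft about: {text}]"

def translate_tool(text):
    return f"[Chinese translation of: {text}]"

TOOLS = {"Blog": blog_tool, "Translate": translate_tool}

def run_plan(plan_json):
    """Execute each step of the JSON plan emitted by the model."""
    results = {}
    for step in json.loads(plan_json):
        if step.get("tool_name") in TOOLS:
            # Regular tool step: run the named tool on its input.
            results[step["agentid"]] = TOOLS[step["tool_name"]](step["input"])
        elif step.get("agentid") == "final":
            # Final step: the model has already filled in the answer.
            results["final"] = step.get("output", "")
    return results

# Example plan in the format requested by sys_msg.
plan = json.dumps([
    {"agentid": "step1", "tool_name": "Blog",
     "input": "Build Multi Agents with MoE models", "output": ""},
    {"agentid": "final", "tool_name": "Final Answer", "output": "done"},
])
print(run_plan(plan))
```

Because the prompt pins the schema to `"agentid"`/`"tool_name"`/`"input"`/`"output"` keys, the dispatcher stays a simple dictionary lookup.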
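Returning to the MoE characteristics listed earlier, the trade-off (all experts resident in memory, but only a few active per token) comes from sparse gating. The following is a toy sketch of top-k routing, not Phi-3 MoE's actual architecture: every "expert" here is just a scalar function, and the gate is an arbitrary scoring function.

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def moe_layer(token, experts, gates, k=2):
    """Sparse MoE: score all experts, but run only the top-k.

    Every expert's parameters must stay in memory (the video-memory
    cost), yet per-token compute touches only k experts, which is why
    inference beats a dense model of the same total parameter count.
    """
    scores = softmax([g(token) for g in gates])
    topk = sorted(range(len(experts)), key=lambda i: -scores[i])[:k]
    norm = sum(scores[i] for i in topk)
    # Weighted combination of the selected experts' outputs only.
    return sum(scores[i] / norm * experts[i](token) for i in topk)

# Toy example: 8 scalar "experts" with made-up gate scores.
experts = [lambda x, i=i: (i + 1) * x for i in range(8)]
gates = [lambda x, i=i: math.sin(i * x) for i in range(8)]
print(moe_layer(0.5, experts, gates, k=2))
```

With k=2 of 8 experts active, roughly a quarter of the expert compute runs per token, mirroring the "faster inference than an equally sized dense model" property above.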