Hugging Face has recently introduced Transformers Agent, which lets users control more than 100,000 Hugging Face models through the Transformers and Diffusers interfaces. The new API of tools and agents can combine these models to tackle complex, multimodal tasks.
Transformers Agent is a natural language API built on top of transformers: it includes a set of curated tools and an agent designed to interpret natural language and use those tools. The system is extensible by design, making it easy to add relevant community-built tools.
Users must first instantiate an agent, which is an LLM, before they can use the run functionality. The system supports both OpenAI models and open-source alternatives such as BigCode and OpenAssistant, and Hugging Face provides free access to the BigCode and OpenAssistant APIs.
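Agent setup might look like the following sketch, based on the class names and endpoint shown in Hugging Face's announcement (transformers around v4.29 is assumed; later versions reworked this API). Building an agent needs network access and, for the OpenAI backend, an API key, so the constructors are shown inside a helper but not invoked here.

```python
# Endpoint for BigCode's StarCoder on the free Hugging Face Inference API,
# as given in the announcement.
BIGCODE_ENDPOINT = "https://api-inference.huggingface.co/models/bigcode/starcoder"


def build_agent(backend, openai_key=None):
    """Return an agent for the chosen backend (sketch; not executed here)."""
    if backend == "openai":
        # OpenAI-backed agent; requires your own OpenAI API key.
        from transformers import OpenAiAgent
        return OpenAiAgent(model="text-davinci-003", api_key=openai_key)
    # Free, open-source alternative hosted on the HF Inference API.
    from transformers import HfAgent
    return HfAgent(BIGCODE_ENDPOINT)
```

The choice of backend only changes which LLM plans the tool calls; the tools themselves stay the same.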
Each tool consists of a single function with a specific name and description, which are used to prompt the agent to perform a specific task. The agent learns to use these tools through a prompt that demonstrates how they can be applied to answer the user's query. While pipelines often combine several tasks into a single operation, tools are designed to focus on one simple, well-defined action.
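The tool abstraction can be boiled down to plain Python, as in this stand-in (not the real transformers `Tool` class): each tool is one function plus a name and a description, and the agent's prompt is assembled from those descriptions so the LLM can pick the right tool.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class Tool:
    """A single-purpose tool: one function, a name, and a description."""
    name: str
    description: str
    fn: Callable[[str], str]

    def __call__(self, text: str) -> str:
        return self.fn(text)


# Toy tool that performs exactly one simple action.
summarizer = Tool(
    name="summarizer",
    description="Summarizes a long text into a shorter one.",
    fn=lambda text: text.split(".")[0] + ".",
)


def build_prompt(tools, task):
    # The agent is primed with each tool's name and description so the
    # LLM can decide which tool fits the requested task.
    tool_lines = "\n".join(f"- {t.name}: {t.description}" for t in tools)
    return f"You have access to these tools:\n{tool_lines}\nTask: {task}"
```

In the real library the prompt also shows worked examples of tool use, but the principle is the same: the description, not the implementation, is what the agent reasons over.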
Two APIs are available: single execution (run) and chat-based execution (chat). The agent's run() method handles one instruction at a time, automatically identifying and executing the tool or tools relevant to the task. The chat() method drives the agent's conversational mode, which is especially useful when state must be maintained across multiple instructions.
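The difference between the two modes can be sketched with a minimal stand-in (the real agent forwards these instructions to an LLM): run() handles each instruction in isolation, while chat() keeps conversation state across calls.

```python
class ToyAgent:
    """Stand-in agent illustrating stateless run() vs stateful chat()."""

    def __init__(self):
        self.history = []  # conversation state, used only by chat()

    def run(self, instruction):
        # Single execution: no memory of earlier instructions.
        return f"executed: {instruction}"

    def chat(self, instruction):
        # Chat-based execution: prior turns remain available as context.
        self.history.append(instruction)
        context = " | ".join(self.history)
        return f"executed with context: {context}"


agent = ToyAgent()
agent.chat("generate an image of rivers and lakes")
reply = agent.chat("now add a boat to it")
```

After the second chat() call, the reply still carries the first instruction as context, whereas repeated run() calls would each start from scratch.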