Multi-Modal Transformer Agents, Controlled by StarCoder (w/o LangChain)

The new Transformer Agents, controlled by a central intelligence (StarCoder), connect the transformer applications on the HuggingFace Hub. The combinatorial set of transformer actions is remarkable: from audio to visual to text and back to audio, or any other modality.

A coding LLM (such as StarCoder, a state-of-the-art LLM for code) serves as the main switching intelligence for coding tasks; alternatively, we can choose a) OpenAssistant or b) OpenAI's "text-davinci-003" model for instructional tasks.
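As a minimal sketch, the choice of controller LLM comes down to which agent class you instantiate in transformers (v4.29+). The endpoint URLs below follow HuggingFace's agent documentation of the time and should be treated as assumptions; the network-dependent demo is wrapped in a function so nothing runs on import.

```python
# Sketch: choosing the controller LLM for a Transformer Agent.
# Endpoints below are assumptions based on HF's v4.29-era agent docs.

# Remote inference endpoints for the two open-source controller options.
CONTROLLER_ENDPOINTS = {
    "starcoder": "https://api-inference.huggingface.co/models/bigcode/starcoder",
    "openassistant": (
        "https://api-inference.huggingface.co/models/"
        "OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5"
    ),
}


def build_agent(controller="starcoder", openai_api_key=None):
    """Return a Transformer Agent driven by the chosen controller LLM.

    Requires network access (and an API key for the OpenAI option),
    so this is defined but not called at import time.
    """
    from transformers import HfAgent, OpenAiAgent  # heavy import kept local

    if controller == "openai":
        # OpenAI's "text-davinci-003" as the switching intelligence
        return OpenAiAgent(model="text-davinci-003", api_key=openai_api_key)
    # StarCoder or OpenAssistant served via the HF Inference API
    return HfAgent(CONTROLLER_ENDPOINTS[controller])


def demo():
    agent = build_agent("starcoder")
    # The controller turns the instruction into code that chains Hub
    # tools (image generation, captioning, TTS, ...) and executes it.
    # return_code=True shows the generated tool-calling code instead.
    print(agent.run("Caption this image, then read the caption out loud.",
                    return_code=True))
```

Swapping controllers is a one-line change: only the endpoint (or agent class) differs, while the tool-calling interface stays the same.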

With Gradio tools, we extend the classical multimodal toolbox of Transformer Agents on the HF Hub and can integrate them into a LangChain network, which in turn connects to external APIs (professional service providers).

Transformer Agents primarily link the different multimodal transformers on the HuggingFace platform, rewiring them according to the task, while LangChain primarily connects to external service providers' APIs.
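This division of labour can be sketched in code: Gradio apps become LangChain tools, and LangChain handles the external-API side. The snippet assumes the `gradio_tools` and `langchain` packages (mid-2023 APIs); the tool classes follow the gradio-tools README and are illustrative, and the builder function is not called here because it needs network access and an OpenAI key.

```python
# Rough split of responsibilities between the two ecosystems:
ROLES = {
    "transformer_agents": "chain multimodal models hosted on the HF Hub",
    "langchain": "connect the agent to external service-provider APIs",
}


def build_langchain_agent(openai_api_key):
    """Build a LangChain agent whose tools are hosted Gradio apps.

    Defined but not called: constructing the tools connects to remote
    Gradio Spaces, and the LLM needs an OpenAI API key.
    """
    from gradio_tools import StableDiffusionTool, TextToVideoTool
    from langchain.agents import initialize_agent
    from langchain.llms import OpenAI
    from langchain.memory import ConversationBufferMemory

    llm = OpenAI(temperature=0, openai_api_key=openai_api_key)
    memory = ConversationBufferMemory(memory_key="chat_history")

    # `.langchain` converts each Gradio tool into a LangChain tool
    tools = [StableDiffusionTool().langchain, TextToVideoTool().langchain]

    return initialize_agent(
        tools, llm, memory=memory,
        agent="conversational-react-description", verbose=True,
    )
```

In this setup the Transformer Agents' Hub tools and the Gradio tools sit side by side in the same agent loop, with LangChain as the glue to external providers.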

A Colab notebook with real-time coding shows how to integrate the new Transformer Agents, a central switching intelligence (LLM), and a prompt template for a variety of tasks, detailing the ordered flow of transformer execution.
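The ordered flow can be sketched with the agent's chat mode, which keeps state between instructions so later steps can reuse earlier results (transformers v4.29+). The concrete task sequence below is illustrative, not taken from the notebook, and the runner function is not invoked because it requires remote inference.

```python
# Sketch: an ordered multi-step flow of transformer execution.
# Tasks are hypothetical examples of chained multimodal steps.

TASK_SEQUENCE = [
    "Generate an image of a boat on a lake.",
    "Write a one-sentence caption for the generated image.",
    "Read the caption out loud.",
]


def run_sequence():
    """Execute the task sequence in order via the agent's chat mode.

    Not called here: each step sends a request to the remote
    StarCoder endpoint and downloads the tools it needs.
    """
    from transformers import HfAgent

    agent = HfAgent("https://api-inference.huggingface.co/models/bigcode/starcoder")
    for task in TASK_SEQUENCE:
        # agent.chat() keeps earlier outputs available to later steps,
        # unlike agent.run(), which starts fresh on every call.
        result = agent.chat(task)
        print(type(result).__name__)
```

The key design point is `chat` versus `run`: `run` resolves each instruction in isolation, while `chat` lets the controller thread intermediate artifacts (image, caption, audio) through the ordered sequence.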

A great step forward, combining the possibilities of individual transformers on the HF platform with a controlling and switching AI/LLM.

Further info on Transformer Agents (HuggingFace):

HuggingFace Transformer Agents: official HF Colab notebook:
——————————————————————————————————
(updated for transformers v4.29.1)

#ai
#chatgpt
#huggingface
#agents
#transformers

