Gradio Local Editor cookbook #826
The HF prompt schema has been updated in #850 to support a "model" property, which can specify the model to use for a given HF task. This change updates the model parsers to use this setting.

Test Plan: Tested with the following prompt:

```
{
  "name": "image_caption",
  "input": {
    "attachments": [
      {
        "data": "https://s3.amazonaws.com/lastmileai.aiconfig.public/uploads/2024_1_10_18_41_31/1742/bear_with_honey.png",
        "mime_type": "image/png"
      },
      {
        "data": "https://s3.amazonaws.com/lastmileai.aiconfig.public/uploads/2024_1_10_18_41_31/7275/fox_in_forest.png",
        "mime_type": "image/png"
      }
    ]
  },
  "metadata": {
    "model": {
      "name": "HuggingFaceImage2TextTransformer",
      "settings": {
        "max_new_tokens": 10,
        "model": "Salesforce/blip-image-captioning-base"
      }
    },
    "parameters": {}
  },
  "outputs": [
    {
      "output_type": "execute_result",
      "execution_count": 0,
      "data": "a bear sitting on a rock eating honey",
      "metadata": {}
    },
    {
      "output_type": "execute_result",
      "execution_count": 1,
      "data": "a red fox in the woods",
      "metadata": {}
    }
  ]
}
```

Validated that the model setting was respected and worked.
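The idea can be sketched as follows: a parser reads the "model" setting from the prompt's `metadata.model.settings` and falls back to a task default when it is absent. This is a minimal illustration only; the class and constant names (`Prompt`, `DEFAULT_MODELS`, `resolve_model`) are hypothetical, not the actual aiconfig API.

```python
# Hypothetical sketch: resolving the "model" setting from prompt metadata,
# with a per-task default fallback. Names here are illustrative only.
from dataclasses import dataclass, field
from typing import Any, Dict

# Assumed task -> default model mapping (illustrative).
DEFAULT_MODELS = {"image-to-text": "Salesforce/blip-image-captioning-base"}

@dataclass
class Prompt:
    name: str
    metadata: Dict[str, Any] = field(default_factory=dict)

def resolve_model(prompt: Prompt, task: str) -> str:
    """Return the model id from the prompt's settings, or the task default."""
    settings = prompt.metadata.get("model", {}).get("settings", {})
    return settings.get("model") or DEFAULT_MODELS[task]

prompt = Prompt(
    name="image_caption",
    metadata={"model": {"name": "HuggingFaceImage2TextTransformer",
                        "settings": {"model": "Salesforce/blip-image-captioning-base"}}},
)
print(resolve_model(prompt, "image-to-text"))
# -> Salesforce/blip-image-captioning-base
```

A prompt with no "model" setting would fall back to the task default, which matches the pre-#850 behavior.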
## Prompt IDE with LastMile AI
This cookbook is meant to demonstrate the capabilities of the AI Workbook editor. It can run inference against locally hosted or remote models from many inference providers, including Hugging Face, OpenAI and others.
Should "AI Workbook editor" be "local AIConfig Editor"?
It supports text, image and audio model formats, allowing you to easily chain them together in a single notebook session!
With `aiconfig`, it lets you save the state in a single json config file which you can version control or share with others.
nit, "it lets you" --> "you can"
**Instructions**:
- Clone https://github.com/lastmile-ai/aiconfig
- Go back to top-level directory: `cd <aiconfig>`
nit, just "Go", no "back"
## TODO
- Publish new version of aiconfig_extension_hugging_face package
- Update huggingface.aiconfig.json with clean examples
- Add video demo
Can remove TODO
Now that we have non-text input model parsers, we can see some issues in our parameterization:

1. ASR and Image-to-Text model parsers should NOT be `ParameterizedModelParser` instances, since their inputs cannot be parameterized.
2. Parameter resolution logic shouldn't throw errors in the case where it's parsing prompts belonging to regular model parsers.

Test Plan: Ran the translation prompt for this aiconfig:

```
prompts: [
  {
    "name": "image_caption",
    "input": {
      "attachments": [
        {
          "data": "https://s3.amazonaws.com/lastmileai.aiconfig.public/uploads/2024_1_10_18_41_31/1742/bear_with_honey.png",
          "mime_type": "image/png"
        },
        {
          "data": "https://s3.amazonaws.com/lastmileai.aiconfig.public/uploads/2024_1_10_18_41_31/7275/fox_in_forest.png",
          "mime_type": "image/png"
        }
      ]
    },
    "metadata": {
      "model": {
        "name": "HuggingFaceImage2TextTransformer",
        "settings": {
          "max_new_tokens": 10,
          "model": "Salesforce/blip-image-captioning-base"
        }
      },
      "parameters": {}
    },
    "outputs": [
      {
        "output_type": "execute_result",
        "execution_count": 0,
        "data": "a bear sitting on a rock eating honey",
        "metadata": {}
      },
      {
        "output_type": "execute_result",
        "execution_count": 1,
        "data": "a red fox in the woods",
        "metadata": {}
      }
    ]
  },
  {
    "name": "translation",
    "input": "Once upon a time, in a lush and vibrant forest, there lived a magnificent creature known as the Quick Brown Fox. This fox was unlike any other, possessing incredible speed and agility that awed all the animals in the forest. With its fur as golden as the sun and its eyes as sharp as emeralds, the Quick Brown Fox was admired by everyone, from the tiniest hummingbird to the mightiest bear. The fox had a kind heart and would often lend a helping paw to those in need. The Quick Brown Fox had a particular fondness for games and challenges. It loved to test its skills against others, always seeking new adventures to satisfy its boundless curiosity. Its favorite game was called \"The Great Word Hunt,\" where it would embark on a quest to find hidden words scattered across the forest. \n\nOne day it got very old and died",
    "metadata": {
      "model": {
        "name": "HuggingFaceTextTranslationTransformer",
        "settings": {
          "model": "Helsinki-NLP/opus-mt-en-fr",
          "max_length": 100,
          "min_length": 50,
          "num_beams": 1
        }
      },
      "parameters": {}
    }
  }
]
```

Before:

```
File "/opt/homebrew/lib/python3.11/site-packages/aiconfig/util/params.py", line 235, in get_prompt_template
    raise Exception(f"Cannot get prompt template string from prompt: {prompt.input}")
Exception: Cannot get prompt template string from prompt: attachments=[Attachment(data='https://s3.amazonaws.com/lastmileai.aiconfig.public/uploads/2024_1_10_18_41_31/1742/bear_with_honey.png', mime_type='image/png', metadata=None), Attachment(data='https://s3.amazonaws.com/lastmileai.aiconfig.public/uploads/2024_1_10_18_41_31/7275/fox_in_forest.png', mime_type='image/png', metadata=None)] data=None
```

After:

* Beautiful translation
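The shape of the fix can be sketched like this: parameter resolution passes non-string inputs (e.g. attachment lists from ASR or image-to-text prompts) through unchanged instead of raising, and only substitutes `{{param}}` placeholders in string inputs. This is a simplified illustration assuming a `{{name}}` template syntax; `resolve_parameters` is a hypothetical name, not the actual `aiconfig/util/params.py` internals.

```python
# Hypothetical sketch of the parameterization fix: non-text prompt inputs
# (attachment lists) cannot be parameterized, so they are returned as-is
# rather than raising "Cannot get prompt template string from prompt".
import re
from typing import Any, Dict

def resolve_parameters(prompt_input: Any, params: Dict[str, str]) -> Any:
    # Attachments and other non-string inputs pass through untouched.
    if not isinstance(prompt_input, str):
        return prompt_input
    # Substitute {{name}} placeholders; unknown names are left verbatim.
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: params.get(m.group(1), m.group(0)),
                  prompt_input)

# String input: parameters are substituted.
print(resolve_parameters("Translate: {{text}}", {"text": "hello"}))
# -> Translate: hello

# Attachment input: returned unchanged instead of raising an Exception.
attachments = {"attachments": [{"data": "https://example.com/bear.png",
                                "mime_type": "image/png"}]}
print(resolve_parameters(attachments, {}) is attachments)
# -> True
```

Under this sketch, running the translation prompt no longer trips over the preceding image_caption prompt's attachment input during parameter resolution.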
LGTM, just nits
Fix parameterization issues

---
Stack created with [Sapling](https://sapling-scm.com). Best reviewed with [ReviewStack](https://reviewstack.dev/lastmile-ai/aiconfig/pull/866).
* #826
* __->__ #866
* #863
Will clean up readme afterwards. Thanks @rholinshead