Question answering on a table #1356
Comments
Table QA is currently limited by the available models to tables with a relatively small number of rows/columns, and the current workflow is limited to non-SQL table formats (e.g. ASCII tables), depending on the model downloaded from Hugging Face. 😢 There has been some research into SQL-scale table QA: https://arxiv.org/abs/2107.07653. As PostgresML matures, I'd love for this to become a primary use case.
I want to run the table-question-answering task on my table residing in Postgres. I have tried using the pgml.transform() function to achieve this. My query:
SELECT
    vendor_name,
    pgml.transform(
        task => '{"task": "table-question-answering"}'::JSONB,
        inputs => ARRAY['{"question": "How many distinct vendors are present",
                          "table": "vendor_name"}']
        -- inputs => ARRAY[vendor_name]
    )
FROM pgml.vendor_summary
WHERE vendor_name IS NOT NULL
LIMIT 10;
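Based on the error below, the pipeline wants `table` to be an object mapping column names to arrays of cell values, not a bare column name. A hypothetical sketch of a query shaped that way, aggregating the column into a JSON array first (the exact input format `pgml.transform` accepts may differ):

```sql
-- Hypothetical sketch: build the table as {"column": [values...]}
-- and pass it as one JSON string in the inputs array.
SELECT pgml.transform(
    task   => '{"task": "table-question-answering"}'::JSONB,
    inputs => ARRAY[
        jsonb_build_object(
            'question', 'How many distinct vendors are present',
            'table', jsonb_build_object(
                'vendor_name',
                (SELECT jsonb_agg(vendor_name)
                   FROM pgml.vendor_summary
                  WHERE vendor_name IS NOT NULL)
            )
        )::TEXT
    ]
);
```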
Traceback:

ERROR: Traceback (most recent call last):
  File "", line 196, in transform
  File "", line 168, in call
  File "/var/lib/postgresml/pgml-venv/lib/python3.10/site-packages/transformers/pipelines/table_question_answering.py", line 348, in __call__
    pipeline_inputs = self._args_parser(*args, **kwargs)
  File "/var/lib/postgresml/pgml-venv/lib/python3.10/site-packages/transformers/pipelines/table_question_answering.py", line 56, in __call__
    raise ValueError(
ValueError: Keyword argument `table` should be a list of dict, but is <generator object TableQuestionAnsweringArgumentHandler.__call__.. at 0x7f9f511fa3b0>
SQL state: XX000
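The error comes from the Hugging Face transformers pipeline, which expects `table` to be a mapping of column name to a list of cell values (or a pandas DataFrame), not the string `"vendor_name"`. A minimal Python sketch of the payload shape the parser accepts (the vendor values are made-up sample data):

```python
import json

# `table` must map each column name to a list of its cell values;
# a bare column-name string triggers the ValueError above.
payload = {
    "question": "How many distinct vendors are present",
    "table": {"vendor_name": ["Acme Corp", "Globex", "Acme Corp"]},
}

# Each element of the SQL `inputs` array should be a JSON string
# of this shape.
inputs_element = json.dumps(payload)
```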
I wanted to know the correct syntax of the SQL query to do question answering on an existing table using PostgresML.