how to get all model to generate datasets on my device? #100
Comments
I know that I can generate a kernel dataset as described here: https://github.com/microsoft/nn-Meter/blob/ffd51e32c31026896fe2bda198b49fe5d756f184/docs/builder/build_kernel_latency_predictor.md, but this is not what I want. I want to get a dataset like the one here: https://github.com/microsoft/nn-Meter/releases/download/v1.0-data/datasets.zip, but with the latency measured on my own device.
I have the same question. I want to get the models in datasets.zip and run inference on them myself. How can I get the models?
I want to generate my training dataset (including models and their corresponding latencies) on my own device. I have read the article here (https://github.com/microsoft/nn-Meter/blob/ffd51e32c31026896fe2bda198b49fe5d756f184/docs/dataset.md); it mentions that there are no model files, only the structures of these models (in .jsonl files), so I haven't found any way to measure the latency of these models running on my device.
So how can I get these models?
For example, you mention in the article that "it requires hundreds of GB storage to store the full dataset". Is it possible to store this data on a network disk and open it for download, or is there any other way for me to build my own dataset? Then I could get a dataset that is valuable for my use case. Thanks very much!
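For reference, the dataset files discussed above are plain JSON Lines (one JSON record per line describing a model's graph structure, with no weight or executable model files). A minimal sketch for loading and inspecting such a file might look like the following; the file name and record fields used here are hypothetical, not the actual nn-Meter schema:

```python
import json

def load_jsonl_dataset(jsonl_path):
    """Load model records from a .jsonl dataset file.

    Each non-empty line is parsed as one JSON object; the list of
    parsed records is returned in file order.
    """
    models = []
    with open(jsonl_path, "r", encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:  # skip blank lines
                models.append(json.loads(line))
    return models

# Hypothetical usage: inspect how many model structures the file holds.
# models = load_jsonl_dataset("pixel4_cpu.jsonl")
# print(len(models), "model structures loaded")
```

Once the structures are loaded this way, rebuilding them as runnable models (and then benchmarking them on a target device) would still require a converter from the stored graph format to a concrete framework, which is the gap this issue is asking about.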