Yes, I am currently working on this feature. There will be queues called "express lanes" which will be run against a small random dataset. These queues will tell you whether your prediction file is in the correct format.
Created by Thomas Yu
Dear Kyle,
This will be released soon. We are currently still in the dry-run phase. What I will do is write up a script that takes in parameters:
```
--outputDir
--dockerRepo+Sha
--subchallenge
```
The script will download the training data and truth file associated with the subchallenge, run your Docker image, generate predictions, and score them against the truth. However, since the training data files are named differently from the test data files, it will be a bit difficult to completely test your code.
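For reference, a minimal sketch of what such a wrapper script might look like, assuming hypothetical Synapse IDs, file names, and a placeholder scoring command (none of these are the official ones):
```
#!/bin/bash
# Rough dry-run wrapper; every Synapse ID, file name, and path below is a
# placeholder, not the official challenge version.
set -e

# Parse the parameters described above ("+Sha" is passed as part of the
# image reference, e.g. repo@sha256:...).
while [[ $# -gt 0 ]]; do
  case "$1" in
    --outputDir)     OUTPUT_DIR="$2"; shift 2 ;;
    --dockerRepoSha) DOCKER_IMAGE="$2"; shift 2 ;;
    --subchallenge)  SUBCHALLENGE="$2"; shift 2 ;;
    *) echo "Unknown option: $1"; exit 1 ;;
  esac
done

mkdir -p "$OUTPUT_DIR" training_data truth

# Download the training data and truth file for this subchallenge
# (assumes the Synapse command-line client is installed and logged in;
# syn0000001 and syn0000002 are placeholders).
synapse get syn0000001 --downloadLocation training_data/
synapse get syn0000002 --downloadLocation truth/

# Run the participant's image against the training data.
docker run --rm \
    -v "$PWD/training_data":/evaluation_data:ro \
    -v "$(realpath "$OUTPUT_DIR")":/output \
    "$DOCKER_IMAGE" \
    bash /score_"$SUBCHALLENGE".sh

# Score the generated predictions against the truth
# (score.py and both file names are placeholders for the challenge scoring code).
python score.py --predictions "$OUTPUT_DIR"/predictions.csv \
                --truth truth/truth_"$SUBCHALLENGE".csv
```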
Best,
Tom

Has the validation and scoring code been released? Being able to run scoring and validation code on our predictions for the training data sets would go a long way to ensuring that our outputs will be compatible with the leaderboard system.

Dear Kyle,
I could put something together really quickly, but here is the `docker run` command that I would use for subchallenge 2.
```
docker run -d \
    -v /path/to/output/dir:/output \
    -v /path/to/test/data:/evaluation_data:ro \
    --name testsubmission \
    docker.synapse.org/syn1234/examplerepo \
    bash /score_sc2.sh
```
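Since the container is started detached (`-d`), its progress can be followed with standard Docker commands, for example:
```
# Follow the container's log output while it runs.
docker logs -f testsubmission
# After it finishes, check the exit code (0 means the script completed).
docker inspect --format '{{.State.ExitCode}}' testsubmission
```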
Then I zip up the prediction file that gets written to `/path/to/output/dir`, and the zipped file gets stored on Synapse. These files are then validated and scored.
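As a rough illustration of that step, assuming a hypothetical prediction file name, a placeholder Synapse parent ID, and that the Synapse command-line client is installed and logged in:
```
# Wait for the detached container from the previous step to finish.
docker wait testsubmission
# Package whatever prediction file the image wrote to the output directory
# (predictions.csv is a hypothetical file name).
cd /path/to/output/dir && zip predictions_sc2.zip predictions.csv
# Store the zipped predictions on Synapse (syn1234 is a placeholder parent).
synapse store --parentid syn1234 predictions_sc2.zip
```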
Best,
Tom

Would it be possible to get some sort of local sandbox for development testing? Even if it only ran on the training data, it would be very useful for debugging our submissions before we submit them.
(Webinar #1) Is it possible to submit docker projects to a sandbox to see if everything is working?