You may upload at most ten submissions a day. A sample submission with the necessary formatting is available here.
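The sample submission linked above defines the exact layout. Purely as an illustrative sketch, assuming the SuperGLUE-style convention of a zip archive containing one JSON Lines prediction file per task (the file names and label values below are placeholders, not the authoritative format):

    import json
    import zipfile

    # Hypothetical per-task predictions: one dict per test example with its
    # index and predicted label. Follow the sample submission for real names.
    predictions = {
        "DaNetQA.jsonl": [{"idx": 0, "label": "true"}, {"idx": 1, "label": "false"}],
        "TERRa.jsonl": [{"idx": 0, "label": "entailment"}],
    }

    # Write each task's predictions as a JSON Lines file inside the archive.
    with zipfile.ZipFile("submission.zip", "w") as archive:
        for filename, rows in predictions.items():
            lines = "\n".join(json.dumps(row, ensure_ascii=False) for row in rows)
            archive.writestr(filename, lines)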
Submitted systems may use any public or private data when developing their systems, with a few exceptions:
Beyond this, you may submit results from any kind of system capable of producing labels for the six target tasks and the analysis tasks. This includes systems that do not share any components across tasks, as well as systems not based on machine learning.
System submissions are not automatically public. Once you mark a submission as public, a notification goes to the Russian SuperGLUE admins, who will then approve your submission. Once approved, you will receive a notification and your entry will appear on the leaderboard. If you update your entry in the future, it will have to go through this process again, so please make sure that everything is correct when you submit your entry for approval.
We will only make submissions public if they include either a link to a paper or a short text description. In addition, to ensure reasonable credit assignment, and since SuperGLUE builds very directly on prior work, we ask the authors of submitted systems to directly name and cite the specific datasets they use, including the SuperGLUE datasets. We will enforce this as a requirement for papers linked from the leaderboard.
Yes, you can. We currently display names for all submissions, but you may create a Google account with a placeholder name if you prefer. Since other users cannot ask anonymous authors questions, we require every anonymous submission to include a link to a reasonably detailed (anonymized) paper.
The primary SuperGLUE tasks are built on or derived from existing datasets. For the Russian version, we created their equivalents from scratch. All our datasets are published under the MIT License.
If you have just submitted, please wait at least 5 minutes for the grader to run and grade your submission.
First, check in your profile whether your submission is present. If the submission's status is “error”, hover over the error symbol to see the error message.
In other cases, when the submission is not present, check the list below or contact us.
A submission may not be graded if any of the following issues apply:
We calculate a score for each task based on its individual metrics; for tasks with multiple metrics, those metrics are averaged. The per-task scores are then averaged to get the final score. On the leaderboard, only a user's top-scoring submission is shown and ranked by default; other submissions can be seen in the expanded view for each user. Competitors may submit privately, which keeps their results off the public leaderboard. To make your results appear on the leaderboard, please tick the “Public” checkbox.
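As a small sketch of the two-level averaging described above (the task names and numbers are illustrative only; the official grader may weight or round differently):

    # Metrics are averaged within each task, then task scores are averaged
    # into the final score. All values below are made-up examples.
    task_metrics = {
        "RCB": {"accuracy": 0.46, "f1": 0.37},   # task with two metrics
        "TERRa": {"accuracy": 0.65},             # task with a single metric
        "RUSSE": {"accuracy": 0.73},
    }

    task_scores = {
        task: sum(metrics.values()) / len(metrics)
        for task, metrics in task_metrics.items()
    }
    final_score = sum(task_scores.values()) / len(task_scores)
    print(task_scores, round(final_score, 3))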
Yes, you can use a pretrained jiant model. See how to do it in our Jupyter notebook.
Contact us via russiansuperglue@gmail.com.