[Feature] Export the evaluation summary table in csv format to the outputs folder #221
Comments
Hello @Udayraj123!
Hey @Drita-ai, sure. Please share an approach that you'd like to take after going through the resources mentioned. Once you're clear on what needs to be done, I can assign the issue to you.
Hello @Udayraj123.
Hey @Anshu370, sure. Please share an approach that you'd like to take after going through the resources mentioned. Once you're clear on what needs to be done, I can assign the issue to you.
@Udayraj123 I would like to work on this issue. The basic idea is that we can use the pandas lib to manage CSV files and store them in a new folder as per the instructions provided above, but I am unable to understand the codebase and how it works. If there are any readme/instruction files that explain the codebase, they would be very helpful, since understanding the complexity of this code just by reading each program will be a tough task.
Hi @Udayraj123,
Looks like there are multiple folks working on this. Since @Drita-ai shared an approach on this issue first, I will have to assign the issue to @Drita-ai. @tushar-badlani your efforts are appreciated, but please make sure to get yourself assigned on the issue first before working on a PR directly, especially when others have already commented and shown interest in working on it. I'll have to keep the PR on hold for now. @Drita-ai it's up to you now whether you wish to continue working on this issue or pick a new one. Either way, I'll make sure both of your efforts are given due credit.
@offline-keshav the code is better structured in the dev branch. Currently, the variable naming and code comments are the guide for understanding. One good way to start is going through the … There are no separate docs yet (we have an issue created for this).
@Udayraj123 I appreciate the feedback and the guidance. I'll definitely make sure to get assigned to issues before working on them in the future.
If it helps to know, @tushar-badlani, your code may already get picked up by someone who ran into the issue recently. I've asked them to try it out and share their feedback on the Discord channel.
Hi @Udayraj123, it's pretty late to ask this, but after following the Getting Started guidelines, which suggested forking OMRChecker, which creates a …
@Drita-ai at the time you created the fork, you'd work on the …
Assigning @tushar-badlani as well, for working on the dev branch.
Is your feature request related to a problem? Please describe.
Currently, when we use an evaluation.json file with should_explain_scoring set to true, a table is printed in the console that explains how the scoring happened for a particular OMR sheet. This is particularly useful for debugging complex answer keys with features such as section-wise marking, negative marking, bonus questions, etc. (ref for all capabilities).
Right now there's no way to output this table as a separate CSV file per OMR sheet.
Describe the solution you'd like
- An evaluation/ folder inside the outputs directory for each run (using the outputs namespace).
- Use the to_csv util from pandas for exporting the CSV from the evaluation class itself.
Describe alternatives you've considered
N/A
Additional context
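A minimal sketch of the proposed export, assuming the explanation table is available as rows of question-wise verdicts. The function name `export_evaluation_summary`, the row schema, and the `file_id` parameter are all hypothetical placeholders, not names from the OMRChecker codebase; only the `evaluation/` folder layout and the pandas `to_csv` call come from the proposal above.

```python
import os

import pandas as pd


def export_evaluation_summary(explanation_rows, output_dir, file_id):
    """Write the per-sheet scoring explanation table to a CSV file.

    explanation_rows: list of dicts, one per question (hypothetical schema),
        e.g. {"question": "q1", "marked": "B", "answer": "B",
              "verdict": "correct", "delta": 2.0, "score": 2.0}
    output_dir: the run's outputs directory (the outputs namespace).
    file_id: identifier of the OMR sheet, used as the CSV filename.
    """
    # Create the evaluation/ folder inside the outputs directory, as proposed.
    evaluation_dir = os.path.join(output_dir, "evaluation")
    os.makedirs(evaluation_dir, exist_ok=True)

    csv_path = os.path.join(evaluation_dir, f"{file_id}.csv")
    # pandas handles quoting/escaping; index=False keeps the file clean.
    pd.DataFrame(explanation_rows).to_csv(csv_path, index=False)
    return csv_path
```

Since the console table is presumably already built row by row inside the evaluation class, the same rows could be passed here so the CSV stays in sync with what `should_explain_scoring` prints.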