Reduce errors in data by making schema more forgiving #598
Conversation
This reduces errors from 5672 to 2449, and there is probably more we can do. However, we should think about whether this is what we want to do - it makes the data entry form a bit more complicated, as people can now select whether to have null there. And there may be issues for people using the data files directly (although they already have to deal with nulls everywhere). Interesting as an exercise to consider, though - if we can reduce errors until we only have actually useful ones, that would be great.
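For context, making the schema "more forgiving" here presumably means allowing `null` as a valid type for affected fields. A minimal JSON Schema sketch of that change (the field name below is hypothetical, not taken from this PR):

```json
{
  "properties": {
    "contact_email": {
      "description": "Hypothetical field: a schema of this shape would previously declare \"type\": \"string\", so null values failed validation",
      "type": ["string", "null"]
    }
  }
}
```

The trade-off discussed below follows from this: a union type silences the validation errors, but it also surfaces null as an explicit choice in any form generated from the schema.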
Force-pushed from 7f9e937 to 97fcd5f
@jarofgreen - hmm, I see what you mean about the form. I just took a look myself. It makes me question whether to go ahead with the PR. Given that the checks on PRs will now only show errors on edited records (not all records), I think we can probably live with the validation errors resulting from nulls in the dataset. This PR was a bit of a workaround anyway: we know that ultimately we should fix the data, removing the nulls. cc-ing @emmajclegg and @IsabelBirds in case they want to chip in.
I'd agree with this comment @jarofgreen @kd-ods . If this isn't changing anything for end users, I think it's cleaner for us to remove nulls properly as and when we get to reviewing lists. |

