No way to check my work?
Why is there no way to check my own work? That is something I found VERY useful before the indexing system was changed a few years ago. I want to learn from any mistakes I might make, but currently there is no way to know if I made a mistake or not.
I hope you will change the system to include a way to check one's own work. How else are we users to know whether we're doing it correctly?
Answers
My advice is to read through the Project Instructions and the Field Helps and look at the examples before you start a project and as you work through it. You can also check a batch for mistakes before submitting it. There is also this board for asking any questions you may have about batches. We're here to help.
1 -
When there was a way to check your work (arbitration on desktop indexing), there were quite a number of mistakes made by the arbitrators. So you really couldn't be sure whether it was you who made the mistake or the person judging your work against that of another indexer. For instance, on obituary indexing the project instructions clearly stated that if there was no indication of gender on the document, the field should be left blank. And yet the reviewers/arbitrators who had not read the instructions continued to use given names to determine gender and relationship status, assumed surnames for individuals, added maiden names when they shouldn't have, and calculated death dates and birth years, just to name a few errors.
With the new system, an indexer creates the first record. It is then reviewed by another individual who has review rights (the review button appears once a person has indexed 1,000 names). If that reviewer changes more than 20% of your work, it goes to a second reviewer. If the second reviewer disagrees with the first reviewer, it goes to a third reviewer. If the second and third reviewers are in agreement, it goes on to pre-publication. If not, it goes to FamilySearch for a fourth and final review.
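As a rough illustration of that routing, the decision flow boils down to a few checks. This is only a hypothetical sketch of the process as described above; the function name, parameters, and outcomes are illustrative assumptions, not FamilySearch's actual code.

```python
def route_batch(first_review_change_rate,
                second_agrees_with_first=False,
                third_agrees_with_second=False):
    """Sketch of where an indexed batch goes next under the review flow
    described above. Hypothetical names and logic, for illustration only."""
    if first_review_change_rate <= 0.20:
        return "pre-publication"            # first reviewer changed 20% or less
    if second_agrees_with_first:
        return "pre-publication"            # assumption: agreement means the batch moves on
    if third_agrees_with_second:
        return "pre-publication"            # second and third reviewers agree
    return "FamilySearch, 4th and final review"  # still unresolved after three reviewers

# Example: the first reviewer changed 35% of the batch, the second reviewer
# disagreed with the first, but the third sided with the second.
print(route_batch(0.35, second_agrees_with_first=False, third_agrees_with_second=True))
```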
In a nutshell, this way is probably a better route to accuracy in the final product.
2 -
What I remember from the days of desktop indexing (the two-indexers-plus-arbitrator model that Melissa mentioned) were the constant, vain attempts at convincing people that it wasn't an accuracy score, that there was no shame in not getting 100%, that this wasn't grading like in school. I think this was a major factor in getting rid of it, because what else am I supposed to think when a program tells me that I got, say, 88%?
But I agree that the current black hole is not an improvement. People can keep making the same error over and over again, totally oblivious to the fact that it is an error, and if other people can be doing that, I could be, too. So there's this nagging worry: am I doing something wrong without realizing it?
Perhaps what we need is just a comparison, without numbers: this is what I submitted, this is what the reviewer submitted. Maybe we only need it for the ones that went to a second reviewer.
Or heck, never mind the individual results: would it be possible for the project instructions to include some samples of twice- or thrice-reviewed batches, to show the sorts of situations that people disagree about the most?
1 -
There are crowd-sourcing transcription programs that allow one to see every change that is made and to leave notes for future transcribers. Even then, "someone" makes a change that they think is the way it should be. Then it gets changed back to the correct text, and lo and behold, that same "someone" comes back and errs again. The instructions are clear - don't correct misspellings, don't expand abbreviations, etc. - but sometimes folks can't resist. After conversations with the owners of many of these programs, the consensus is "don't let perfection be the enemy of progress". All they want is a searchable index so people can find the information. With proper programming (such as Soundex matching), minor errors don't hinder their search engines.
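To illustrate that last point, here is a minimal sketch of American Soundex, the kind of phonetic coding mentioned above. It is a generic textbook implementation for illustration, not the actual code behind any of these sites' search engines. Names that differ by small transcription slips collapse to the same code, so a search can still find them:

```python
def soundex(name):
    """American Soundex: first letter of the name plus three digits."""
    codes = {**dict.fromkeys("bfpv", "1"),
             **dict.fromkeys("cgjkqsxz", "2"),
             **dict.fromkeys("dt", "3"),
             "l": "4",
             **dict.fromkeys("mn", "5"),
             "r": "6"}
    name = "".join(c for c in name.lower() if c.isalpha())
    if not name:
        return ""
    first = name[0].upper()
    digits = []
    prev = codes.get(name[0], "")
    for ch in name[1:]:
        if ch in "hw":                # h and w do not break a run of equal codes
            continue
        code = codes.get(ch, "")      # vowels map to "" and do break a run
        if code and code != prev:
            digits.append(code)
        prev = code
    return (first + "".join(digits) + "000")[:4]

# Misspelled or variant names still land on the same code, so a
# Soundex-aware index finds them despite small transcription errors.
for a, b in [("Smith", "Smyth"), ("Johnson", "Jonson"), ("Meyer", "Meier")]:
    print(a, soundex(a), "|", b, soundex(b))
```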
FamilySearch did say that they retain all the versions of indexing and that one day all versions might be published. So there is still a chance that in the future folks will see each indexer's variation of the same document.