Identifying Mislabels for Prompting
In this section, we focus on identifying mislabels for text prompting. Given a question, we generate freeform text answers from two models and compare their outputs. The answer pairs are then sorted by similarity, and the least similar responses are flagged as potential mislabels.
Step 1: Zero-Shot Prompting Predictions
To start, we generate zero-shot predictions by prompting the GPT and Claude models. Each model provides a freeform text answer for each instance in the dataset.
ID | Question | Model 1 Answer | Model 2 Answer
---|---|---|---
1 | What is the capital of France? | Paris, the romantic city of love and lights. | Tokyo, the bustling metropolis of Japan. |
2 | What is the chemical formula of water? | H2O, a compound essential for life. | NaCl, the formula for table salt. |
3 | Which planet is closest to the Sun? | Mercury, the scorched planet in our solar system. | Mars, the red planet with its captivating mysteries. |
4 | What is the capital of Japan? | Tokyo, the vibrant capital known for its blend of tradition and modernity. | London, the historic city and capital of the United Kingdom. |
5 | Which ocean is the largest in the world? | Pacific, the vast expanse of water covering one-third of the Earth's surface. | Arctic, the frigid ocean surrounding the North Pole. |
6 | What is the longest river in Africa? | Nile, the legendary river flowing through ancient civilizations. | Amazon, the mighty river teeming with diverse wildlife. |
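As a rough illustration of this step, the sketch below prompts both models with each question. It assumes the openai and anthropic Python clients (v1+ interfaces); the model names and client setup are illustrative assumptions rather than a prescribed configuration.

```python
# A minimal sketch of Step 1, assuming the openai and anthropic Python clients
# (v1+ interfaces). Model names are illustrative; substitute your own.
from openai import OpenAI
from anthropic import Anthropic

openai_client = OpenAI()      # reads OPENAI_API_KEY from the environment
claude_client = Anthropic()   # reads ANTHROPIC_API_KEY from the environment

questions = [
    "What is the capital of France?",
    "What is the chemical formula of water?",
    "Which planet is closest to the Sun?",
]

def ask_gpt(question: str) -> str:
    # Zero-shot: the question is sent as-is, with no examples in the prompt.
    response = openai_client.chat.completions.create(
        model="gpt-4o",  # illustrative model name
        messages=[{"role": "user", "content": question}],
    )
    return response.choices[0].message.content

def ask_claude(question: str) -> str:
    response = claude_client.messages.create(
        model="claude-3-5-sonnet-latest",  # illustrative model name
        max_tokens=256,
        messages=[{"role": "user", "content": question}],
    )
    return response.content[0].text

# One row per instance, mirroring the table above.
predictions = [
    {"id": i + 1, "question": q,
     "model_1_answer": ask_gpt(q), "model_2_answer": ask_claude(q)}
    for i, q in enumerate(questions)
]
```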
Step 2: Calculate Text Similarity
Next, we calculate the similarity between the GPT and Claude responses for each instance. One approach is to embed each answer with a Word2Vec model and compare the embeddings using cosine similarity; a string-based metric such as Levenshtein distance can be used instead. Low-similarity rows are the instances where the two models disagree, which is where mislabels are most likely to occur.
ID | Question | Model 1 Answer | Model 2 Answer | Similarity Score
---|---|---|---|---
1 | What is the capital of France? | Paris, the romantic city of love and lights. | Tokyo, the bustling metropolis of Japan. | 0.70 |
2 | What is the chemical formula of water? | H2O, a compound essential for life. | NaCl, the formula for table salt. | 0.55 |
3 | Which planet is closest to the Sun? | Mercury, the scorched planet in our solar system. | Mars, the red planet with its captivating mysteries. | 0.60 |
4 | What is the capital of Japan? | Tokyo, the vibrant capital known for its blend of tradition and modernity. | London, the historic city and capital of the United Kingdom. | 0.70 |
5 | Which ocean is the largest in the world? | Pacific, the vast expanse of water covering one-third of the Earth's surface. | Arctic, the frigid ocean surrounding the North Pole. | 0.65 |
6 | What is the longest river in Africa? | Nile, the legendary river flowing through ancient civilizations. | Amazon, the mighty river teeming with diverse wildlife. | 0.60 |
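One way to compute these scores, as a sketch: embed each answer by averaging pretrained Word2Vec vectors and compare the two embeddings with cosine similarity. The pretrained vector set used here (gensim's publicly available word2vec-google-news-300) is an assumed choice, and the code continues from the hypothetical predictions list in the previous sketch.

```python
# A sketch of Step 2: averaged Word2Vec embeddings + cosine similarity.
# The pretrained vector set is an assumed choice (large download via gensim).
import numpy as np
import gensim.downloader as api

word_vectors = api.load("word2vec-google-news-300")

def embed(text: str) -> np.ndarray:
    # Average the vectors of in-vocabulary words; skip unknown tokens.
    tokens = [t.strip(".,!?") for t in text.split()]
    vecs = [word_vectors[t] for t in tokens if t in word_vectors]
    return np.mean(vecs, axis=0) if vecs else np.zeros(word_vectors.vector_size)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(np.dot(a, b) / denom) if denom else 0.0

# Score each row from Step 1 with the similarity of its two answers.
for row in predictions:
    row["similarity"] = cosine_similarity(
        embed(row["model_1_answer"]), embed(row["model_2_answer"])
    )
```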
Step 3: Sort by Similarity
We sort the rows by similarity score in ascending order, so the instances with the least similar answers, and therefore the most likely mislabels, appear first.
ID | Question | Model 1 Answer | Model 2 Answer | Similarity Score
---|---|---|---|---
2 | What is the chemical formula of water? | H2O, a compound essential for life. | NaCl, the formula for table salt. | 0.55 |
3 | Which planet is closest to the Sun? | Mercury, the scorched planet in our solar system. | Mars, the red planet with its captivating mysteries. | 0.60 |
6 | What is the longest river in Africa? | Nile, the legendary river flowing through ancient civilizations. | Amazon, the mighty river teeming with diverse wildlife. | 0.60 |
5 | Which ocean is the largest in the world? | Pacific, the vast expanse of water covering one-third of the Earth's surface. | Arctic, the frigid ocean surrounding the North Pole. | 0.65 |
1 | What is the capital of France? | Paris, the romantic city of love and lights. | Tokyo, the bustling metropolis of Japan. | 0.70 |
4 | What is the capital of Japan? | Tokyo, the vibrant capital known for its blend of tradition and modernity. | London, the historic city and capital of the United Kingdom. | 0.70 |
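In code, the sorting shown above is a one-liner over the scored rows from the previous sketch:

```python
# Sort so the least similar answer pairs (the likeliest mislabels) come first.
ranked = sorted(predictions, key=lambda row: row["similarity"])
```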
Step 4: Present Results
The resulting mislabels table contains the rows whose similarity score falls below the specified threshold of 0.66; these are the instances flagged as potential mislabels.
ID | Question | Model 1 Answer | Model 2 Answer | Similarity Score
---|---|---|---|---
2 | What is the chemical formula of water? | H2O, a compound essential for life. | NaCl, the formula for table salt. | 0.55 |
3 | Which planet is closest to the Sun? | Mercury, the scorched planet in our solar system. | Mars, the red planet with its captivating mysteries. | 0.60 |
6 | What is the longest river in Africa? | Nile, the legendary river flowing through ancient civilizations. | Amazon, the mighty river teeming with diverse wildlife. | 0.60 |
5 | Which ocean is the largest in the world? | Pacific, the vast expanse of water covering one-third of the Earth's surface. | Arctic, the frigid ocean surrounding the North Pole. | 0.65 |
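Continuing the same sketch, the final step simply filters the ranked rows against the threshold; the 0.66 value matches the walkthrough above and is a tunable choice rather than a fixed rule.

```python
# Keep only the rows whose similarity falls below the chosen threshold.
SIMILARITY_THRESHOLD = 0.66  # matches the walkthrough above; tune per dataset

potential_mislabels = [
    row for row in ranked if row["similarity"] < SIMILARITY_THRESHOLD
]

for row in potential_mislabels:
    print(f"{row['id']}: {row['question']} (similarity={row['similarity']:.2f})")
```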
Next Steps
One thing to note is that while zero-shot AI models can be useful for data validation, traditional zero-shot approaches have performance limitations. On Anote, we have built proprietary algorithms that perform these data validation steps in real time, actively learning from human feedback to identify rows in datasets where AI models or data annotators may have made errors. Because these algorithms keep learning, they become more accurate over time than zero-shot models, leading to stronger data validation. We hope to release some of these findings to the community soon.