
I know that question-question matching is a text similarity problem.

What about question-answer or question-document matching, as used in information retrieval?

Question-question matching is indeed text similarity, but how do you define the similarity between a question and an answer?

Thank you!

  • Assuming the answer has more sentences than the single-line question, there will usually be some sentence in the answer that is more similar to the question than the rest. You can encode the question into a vector, then encode each sentence of the answer into a vector one by one, compute each sentence's similarity to the question, and take the mean of those scores. This averaged score represents the mean similarity of the answer to the question (a sketch of this idea is given below). Commented Aug 3, 2019 at 5:31
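A minimal sketch of this averaged-similarity idea, assuming the sentence-transformers package and the all-MiniLM-L6-v2 model (the same ones used in the answer below); the period-based sentence split and the example answer text are placeholders, not part of the original comment:

    from sentence_transformers import SentenceTransformer, util

    model = SentenceTransformer('all-MiniLM-L6-v2')

    question = "What is the capital of France?"
    answer = ("Paris is the capital of France. It lies on the river Seine. "
              "It is also the most populous city in the country.")

    # Naive sentence split; a real pipeline would use a proper sentence tokenizer.
    sentences = [s.strip() for s in answer.split('.') if s.strip()]

    # Encode the question and each answer sentence.
    q_emb = model.encode(question, convert_to_tensor=True)
    s_embs = model.encode(sentences, convert_to_tensor=True)

    # Cosine similarity of the question to every answer sentence, then the mean.
    scores = util.cos_sim(q_emb, s_embs)[0]
    print("Per-sentence scores:", [round(float(s), 4) for s in scores])
    print("Mean question-answer similarity:", round(float(scores.mean()), 4))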

2 Answers


You can refer to the paper "A Deep Look into Neural Ranking Models for Information Retrieval" for a broader discussion of these different matching tasks.

    from sentence_transformers import SentenceTransformer, util

    # Load a pre-trained model for embeddings
    model = SentenceTransformer('all-MiniLM-L6-v2')

    # Example question and candidate answers
    question = "What is the capital of France?"
    answers = [
        "Paris is the capital of France.",
        "Berlin is the capital of Germany.",
        "Madrid is the capital of Spain."
    ]

    # Encode question and answers into embeddings
    q_emb = model.encode(question, convert_to_tensor=True)
    a_embs = model.encode(answers, convert_to_tensor=True)

    # Compute cosine similarity between the question and each answer
    cos_scores = util.cos_sim(q_emb, a_embs)

    # Print each answer with its similarity score
    for answer, score in zip(answers, cos_scores[0]):
        print(f"Answer: {answer}\nSimilarity: {score:.4f}\n")

Comments:

  • SentenceTransformer provides semantic embeddings for both questions and answers.
  • cos_sim computes similarity; higher score = more relevant answer.
  • This is a standard approach for question-answer matching in deep-learning NLP; a small ranking follow-up is sketched below.
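As a follow-up, a minimal self-contained sketch of using these cosine scores to rank the candidates and pick the best answer, with the same assumed model and example data as the snippet above:

    from sentence_transformers import SentenceTransformer, util

    model = SentenceTransformer('all-MiniLM-L6-v2')

    question = "What is the capital of France?"
    answers = [
        "Paris is the capital of France.",
        "Berlin is the capital of Germany.",
        "Madrid is the capital of Spain.",
    ]

    # Encode and score, exactly as in the snippet above.
    q_emb = model.encode(question, convert_to_tensor=True)
    a_embs = model.encode(answers, convert_to_tensor=True)
    scores = util.cos_sim(q_emb, a_embs)[0].tolist()

    # Rank the candidates from most to least similar and keep the top one.
    ranked = sorted(zip(answers, scores), key=lambda pair: pair[1], reverse=True)
    best_answer, best_score = ranked[0]
    print(f"Best answer: {best_answer} (score {best_score:.4f})")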
