How to effectively use QnA and LUIS in a chatbot

Date: 2019-07-26 16:59:03

Tags: botframework chatbot luis qnamaker

I am building a chatbot in C# with:

.NET Core 2.1; SDK 4.0

I started by following this tutorial to build a chatbot that integrates QnA Maker and LUIS.

I am confused about how to use QnA and LUIS together effectively. For example, say the bot will act as an FAQ bot and I want it to answer 50 frequently asked questions. In my mind, I would create a knowledge base in QnA Maker containing those 50 questions along with alternative phrasings.

The dispatch file then creates an application and an intent that maps utterances to the QnA Maker knowledge base. Am I done at that point?

I am wondering why I would add intents to LUIS at all, as in the tutorial, where the two intents HomeAutomation and Weather exist only in LUIS. ...Until I map those intents to questions in the QnA Maker KB... do they actually do anything? I am confused about why Microsoft thinks it is necessary to distinguish whether a reply comes from QnA Maker or from an intent matched in LUIS. From my understanding, having an intent in LUIS without a corresponding answer from QnA is useless?

Second, I want the client to be able to maintain the knowledge base and the intents themselves. ...If a new intent or question is added, do I have to refresh the dispatch file every time?

1 Answer:

Answer 0 (score: 0)

Part of the problem is that if you are only using a single QnA KB, rather than several, you are following the wrong guide. The one you want to follow is this:

Use QnA Maker to answer questions

If you add an additional KB, or add a LUIS model, then you would want to add Dispatch. Otherwise, adding it only makes things more complicated.
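For the single-KB FAQ scenario, the whole bot can be roughly as simple as the sketch below: every incoming message is forwarded to the knowledge base and the top answer is sent back. The class name, fallback text, and the way the endpoint is supplied are my own assumptions, not taken from the linked guide.

using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Bot.Builder;
using Microsoft.Bot.Builder.AI.QnA;
using Microsoft.Bot.Schema;

public class FaqBot : ActivityHandler
{
    private readonly QnAMaker _qnaMaker;

    public FaqBot(QnAMakerEndpoint endpoint)
    {
        // KnowledgeBaseId, EndpointKey and Host come from the QnA Maker
        // portal / your bot's configuration (names assumed here).
        _qnaMaker = new QnAMaker(endpoint);
    }

    protected override async Task OnMessageActivityAsync(ITurnContext<IMessageActivity> turnContext, CancellationToken cancellationToken)
    {
        // Send the user's utterance straight to the knowledge base.
        var results = await _qnaMaker.GetAnswersAsync(turnContext);

        if (results != null && results.Any())
        {
            await turnContext.SendActivityAsync(MessageFactory.Text(results.First().Answer), cancellationToken);
        }
        else
        {
            await turnContext.SendActivityAsync(MessageFactory.Text("Sorry, I couldn't find an answer for that."), cancellationToken);
        }
    }
}

With only one KB there is nothing to route between, so there is no dispatch model to retrain when the client edits the knowledge base.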

To address your questions: the NLP-with-Dispatch sample that the tutorial you referenced walks through only shows how to implement Dispatch. There is a section in the dispatch bot's .cs dialog that shows the basics of "what happens once an intent comes back":

private async Task DispatchToTopIntentAsync(ITurnContext<IMessageActivity> turnContext, string intent, RecognizerResult recognizerResult, CancellationToken cancellationToken)
{
    switch (intent)
    {
        case "l_HomeAutomation":
            await ProcessHomeAutomationAsync(turnContext, recognizerResult.Properties["luisResult"] as LuisResult, cancellationToken);
            break;
        case "l_Weather":
            await ProcessWeatherAsync(turnContext, recognizerResult.Properties["luisResult"] as LuisResult, cancellationToken);
            break;
        case "q_sample-qna":
            await ProcessSampleQnAAsync(turnContext, cancellationToken);
            break;
        default:
            _logger.LogInformation($"Dispatch unrecognized intent: {intent}.");
            await turnContext.SendActivityAsync(MessageFactory.Text($"Dispatch unrecognized intent: {intent}."), cancellationToken);
            break;
    }
}
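Each of the Process* handlers then does the actual work for its branch. As an illustration of the QnA branch, a handler along the following lines queries the KB and replies with the top answer. The _botServices wrapper and its SampleQnA property are assumptions based on the sample's naming, so treat this as a sketch rather than the sample's exact code.

private async Task ProcessSampleQnAAsync(ITurnContext<IMessageActivity> turnContext, CancellationToken cancellationToken)
{
    _logger.LogInformation("ProcessSampleQnAAsync");

    // _botServices.SampleQnA is assumed to be a QnAMaker client configured
    // against the knowledge base behind the "q_sample-qna" dispatch intent.
    var results = await _botServices.SampleQnA.GetAnswersAsync(turnContext);
    if (results.Any())
    {
        await turnContext.SendActivityAsync(MessageFactory.Text(results.First().Answer), cancellationToken);
    }
    else
    {
        await turnContext.SendActivityAsync(MessageFactory.Text("Sorry, I could not find an answer in the QnA system."), cancellationToken);
    }
}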

If you do decide to build your NLP around a dispatch model, it is then up to you whether to start a dialog, and so on. For example:

protected override async Task RouteAsync(DialogContext dc, CancellationToken cancellationToken = default(CancellationToken))
{
    // Get cognitive models for locale
    var locale = CultureInfo.CurrentUICulture.TwoLetterISOLanguageName;
    var cognitiveModels = _services.CognitiveModelSets[locale];

    // Check dispatch result
    var dispatchResult = await cognitiveModels.DispatchService.RecognizeAsync<DispatchLuis>(dc.Context, CancellationToken.None);
    var intent = dispatchResult.TopIntent().intent;

    // Identify if the dispatch intent matches any Action within a Skill; if so, pass to the appropriate SkillDialog to hand off
    var identifiedSkill = SkillRouter.IsSkill(_settings.Skills, intent.ToString());

    if (identifiedSkill != null)
    {
        // We have identified a skill, so initialize the skill connection with the target skill
        var result = await dc.BeginDialogAsync(identifiedSkill.Id);

        if (result.Status == DialogTurnStatus.Complete)
        {
            await CompleteAsync(dc);
        }
    }
    else if (intent == DispatchLuis.Intent.l_general)
    {
        // If dispatch result is general luis model
        cognitiveModels.LuisServices.TryGetValue("general", out var luisService);

        if (luisService == null)
        {
            throw new Exception("The general LUIS Model could not be found in your Bot Services configuration.");
        }
        else
        {
            var result = await luisService.RecognizeAsync<GeneralLuis>(dc.Context, CancellationToken.None);

            var generalIntent = result?.TopIntent().intent;

            // switch on general intents
            switch (generalIntent)
            {
                case GeneralLuis.Intent.Escalate:
                {
                    // start escalate dialog
                    await dc.BeginDialogAsync(nameof(EscalateDialog));
                    break;
                }

                case GeneralLuis.Intent.None:
                default:
                {
                    // No intent was identified, send confused message
                    await _responder.ReplyWith(dc.Context, MainResponses.ResponseIds.Confused);
                    break;
                }
            }
        }
    }
    else if (intent == DispatchLuis.Intent.q_faq)
    {
        cognitiveModels.QnAServices.TryGetValue("faq", out var qnaService);

        if (qnaService == null)
        {
            throw new Exception("The specified QnA Maker Service could not be found in your Bot Services configuration.");
        }
        else
        {
            var answers = await qnaService.GetAnswersAsync(dc.Context, null, null);

            if (answers != null && answers.Count() > 0)
            {
                await dc.Context.SendActivityAsync(answers[0].Answer, speak: answers[0].Answer);
            }
            else
            {
                await _responder.ReplyWith(dc.Context, MainResponses.ResponseIds.Confused);
            }
        }
    }
    else if (intent == DispatchLuis.Intent.q_chitchat)
    {
        cognitiveModels.QnAServices.TryGetValue("chitchat", out var qnaService);

        if (qnaService == null)
        {
            throw new Exception("The specified QnA Maker Service could not be found in your Bot Services configuration.");
        }
        else
        {
            var answers = await qnaService.GetAnswersAsync(dc.Context, null, null);

            if (answers != null && answers.Count() > 0)
            {
                await dc.Context.SendActivityAsync(answers[0].Answer, speak: answers[0].Answer);
            }
            else
            {
                await _responder.ReplyWith(dc.Context, MainResponses.ResponseIds.Confused);
            }
        }
    }
    else
    {
        // If dispatch intent does not map to configured models, send "confused" response.
        // Alternatively as a form of backup you can try QnAMaker for anything not understood by dispatch.
        await _responder.ReplyWith(dc.Context, MainResponses.ResponseIds.Confused);
    }
}

This is a robust example of how to use dispatch more effectively, taken from the Virtual Assistant.

If your bot only has one KB, I would avoid dispatch, because yes, every time you update the KB you would also have to refresh the dispatch model (update, train, re-publish, test).