Comments (11)

pseudozach avatar pseudozach commented on May 22, 2024 3

@benbrown I've tried both increasing the tick time from 1 sec to 2 sec and calling convo.next() inside my callback function, but the behavior is still erratic.
I think I'll try the promisify method or some other workaround until I optimize my datastore response time.

@zachdunn you can increase the tick time to 1.5 seconds, but this still does not guarantee your call will be done by the next tick. It also makes the whole conversation flow about 50% slower.


zachdunn avatar zachdunn commented on May 22, 2024 1

We've found some stability improvements by wrapping the handler with Promisify. It's not a perfect solution, but it's one of the ways we've managed to keep things a bit more predictable until it's patched.

Here's a sample gist for a response handler:

'use strict';
const Promise = require('bluebird');

module.exports = function (bot, message) {
  // promisifyAll adds Async-suffixed variants, e.g. bot.startConversationAsync.
  Promise.promisifyAll(bot);
  bot.startConversationAsync(message).then(function (convo) {
    return Promise.fromCallback(function (callback) {
      // convo.ask hands (response, convo) to its handler; pack both into a
      // single array so the promise resolves with one value for .spread().
      return convo.ask('What did you need?', function (a, b) {
        callback(null, [a, b]);
      });
    }).spread(function (response, convo) {
      convo.say(`Let's see what I can do for *"${response.text}"*`);
      convo.next();
      return new Promise((resolve) => {
        convo.ask(`Found something. Do you want it?`, [
          {
            pattern: bot.utterances.yes,
            callback: (response, convo) => {
              convo.say(`Great. It's all yours.`);
              resolve();
            }
          },
          {
            pattern: bot.utterances.no,
            callback: (response, convo) => {
              convo.say('OK, nevermind.');
              resolve();
            }
          },
          {
            default: true,
            callback: (response, convo) => {
              // Repeat the question and keep the conversation moving.
              convo.repeat();
              convo.next();
            }
          }
        ]);
      });
    }).finally(function () {
      // Advance only after the yes/no promise above has settled.
      convo.next();
    });
  }).catch(function (err) {
    bot.reply(message, err.toString());
  });
};

This is not a magic wand, and there's still some kind of race condition in play here.


ChrisCinelli avatar ChrisCinelli commented on May 22, 2024

Are you hitting the 1 message/sec API limit?


zachdunn avatar zachdunn commented on May 22, 2024

I'm not sure that limit would be in play here. You can see that the bot always replies to the message; the problem is that the conversation intermittently closes on the Node side (as the logs show) before the API response is sent.

Of course I may have just missed your point entirely.


pseudozach avatar pseudozach commented on May 22, 2024

I face a similar issue: when I try to fetch some data from a datastore and call convo.ask with that data, in some cases the conversation ends prematurely and in other cases convo.ask succeeds. This has all the indications of a race condition.
What is the time limit for adding a new ask to a conversation before it ends?
How can we set it?
More importantly, how can we make sure the conversation doesn't end prematurely before some API request has completed?


kylemac avatar kylemac commented on May 22, 2024

+1 – I've encountered this unusual behavior as well. Any thoughts on what might be the culprit?


kylemac avatar kylemac commented on May 22, 2024

💋 @zachdunn


benbrown avatar benbrown commented on May 22, 2024

The issue here is that conversations proceed at roughly 1 message per second until there are no messages left in the queue.

If you do something asynchronous while the message queue is processing as normal, it may end prematurely because you have not yet pushed another message onto the queue.

The current best way to handle this is to not call convo.next() until after your asynchronous action has finished.
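
Roughly, that first approach looks like this; fetchFromDatastore here is just a placeholder for whatever async call you are making:

module.exports = function (bot, message) {
  bot.startConversation(message, function (err, convo) {
    convo.ask('What did you need?', function (response, convo) {
      // Kick off the async work, but do NOT call convo.next() yet:
      // the conversation waits on this handler until next() is called.
      fetchFromDatastore(response.text, function (err, result) {
        if (err) {
          convo.say('Something went wrong: ' + err.toString());
        } else {
          convo.say('Here is what I found: ' + result);
        }
        // Only advance once the reply has been queued.
        convo.next();
      });
    });
  });
};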

Another approach would be to do all the asynchronous actions before starting the conversation.


zachdunn avatar zachdunn commented on May 22, 2024

If you do something asynchronous while the message queue is processing as normal, it may end prematurely because you have not yet pushed another message onto the queue.

Suddenly #20 (comment) makes a lot more sense. Is there a way to disable/adjust this timeout on conversations? Seems like there's a parallel to KeepAlive in server configurations.

Or was that achieved by your convo.next() comment?

Reason I ask -- you can imagine a bot where the workflow isn't as predictable, and the ability to specify how quickly a conversation expires (or whether it ever does without an explicit command) is needed:

1. User says something like "Let's search for hats"
2. Conversation starts
3. Bot switches into a context that's primed for hats. "What type of hat did you want?"
4. User says "baseball hats"
5. Bot fetches results from whatever leading hat-based API exists out there
6. Riveting hat conversation continues. User can ask for more information (more API calls)
7. User is done, says "OK we're done"
8. Exit hat context
9. End conversation

Happy to be wildly off base here. Would make life much simpler. 😀


darrenparkinson avatar darrenparkinson commented on May 22, 2024

It's a shame this is closed, as it seems it's still a problem. I'm doing a convo.say, then mapping over an array to make an API request for each item to build a response, and the conversation ends before I get the next response out. This happens even if there is only a single item in the array. I'm only doing this because bot.reply messages aren't guaranteed to be sent in order.


benbrown avatar benbrown commented on May 22, 2024

@darrenparkinson the solution to your issue is to do those asynchronous calls outside of the conversation - by using createConversation, doing the async work, then calling convo.activate(), or by doing it in a convo.ask callback before calling convo.next().
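
Something along these lines; lookupItems (and the item.name field) is just a stand-in for your per-item API calls:

module.exports = function (bot, message) {
  // Build the conversation, but don't start it yet.
  bot.createConversation(message, function (err, convo) {
    if (err) {
      return bot.reply(message, err.toString());
    }
    // Do all the async work up front...
    lookupItems(message.text, function (err, items) {
      if (err) {
        return bot.reply(message, err.toString());
      }
      items.forEach(function (item) {
        convo.say('Found: ' + item.name);
      });
      // ...then activate the conversation once every message is queued.
      convo.activate();
    });
  });
};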

