alan-ai / alan-sdk-ionic

Conversational AI SDK for Ionic to enable text and voice conversations with actions (React, Angular, Vue)

Home Page: https://alan.app

JavaScript 3.98% TypeScript 58.36% HTML 7.42% CSS 13.61% Ruby 0.90% SCSS 15.73%
alan-ionic-sdk alan-studio chatbot voice voice-assistant voice-ai ionic sdk voice-commands voice-control

alan-sdk-ionic's People

Contributors

aermilin, andreyryabov, annamiroshoshnichenko, annmirosh, dvl-es, mikrowelt, okolyachko, snyuryev


alan-sdk-ionic's Issues

Adjusting Alan's listening phase

Is it possible to increase the time that Alan listens, to allow for longer input?

For example, we have an intent to capture a larger chunk of free text. The intent is defined like this:

intent(vSupportScreen, 'My message is $(TICKET* (.+))', async p => {
    p.play(`You've said: ${p.TICKET.value}`);
    ...
});

During testing we have observed that users often make a short pause while talking, such as when ending one sentence and starting another. In that pause Alan will stop listening and start processing the input. So we were wondering if it is possible to increase the amount of time Alan listens before it determines the end of input and starts processing.
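One possible workaround, sketched below in plain JavaScript, is to not depend on a single long listening window at all: collect the long message over several short utterances and finish on an agreed stop word. In an Alan script this could back a free-text intent inside a dedicated context; `makeAccumulator` and the stop word are illustrative names, not Alan SDK APIs.

```javascript
// Hypothetical helper (not an Alan SDK API): accumulate a long message across
// several short utterances, finishing when the user says the stop word.
function makeAccumulator(stopWord = 'done') {
    const parts = [];
    return {
        // Feed one recognized utterance; returns the joined text so far and
        // whether the user signalled the end of the message.
        add(utterance) {
            const t = utterance.trim();
            if (t.toLowerCase() === stopWord) {
                return { done: true, text: parts.join(' ') };
            }
            parts.push(t);
            return { done: false, text: parts.join(' ') };
        }
    };
}
```

Each intent match appends its fragment, so a short pause between sentences only ends one utterance, not the whole message; the user says "done" to submit.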

Tips on handling production releases

Hello.

I have a question regarding handling production releases. What is your "best practice" or advice when releasing new production versions?

For example, we release a mobile app with Alan SDK key xxxx/prod and select the v1 version of the Alan scripts on the production environment. Users can now use the mobile app with the v1 scripts.

Then we continue development and create a v2 of the Alan scripts that is compatible with the next release of our mobile app. Once the new app version is released, we have users on the old version of the app alongside users who have upgraded. If we switch the Alan production environment to the v2 scripts, the new app will work fine, but the old version of the app will now also use the v2 scripts, which can be incompatible. Until all users update their mobile app (possibly never, if we don't force updates), we have to support two versions of the Alan scripts.

How do you recommend tackling this issue? Should it be handled in the scripts/code or with some project setting?
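One common pattern, sketched below, is to have the client report its own app version to the script (for example via the auth data passed when initializing the Alan button) and gate handlers on that version, so a single production script can serve old and new app builds at once. The helpers `versionAtLeast` and `pickHandler` are illustrative, not Alan SDK APIs.

```javascript
// Hypothetical sketch: choose per-version behavior inside one script, assuming
// the app sends its version (e.g. '2.1.0') along with each session.
function parseVersion(v) {
    return v.split('.').map(Number);
}

// Numeric, semver-style "a >= b" comparison.
function versionAtLeast(a, b) {
    const pa = parseVersion(a), pb = parseVersion(b);
    for (let i = 0; i < Math.max(pa.length, pb.length); i++) {
        const x = pa[i] || 0, y = pb[i] || 0;
        if (x !== y) return x > y;
    }
    return true;
}

// handlers: sorted newest-first, e.g.
// [{ minVersion: '2.0.0', fn: v2Handler }, { minVersion: '0.0.0', fn: v1Handler }]
function pickHandler(appVersion, handlers) {
    for (const h of handlers) {
        if (versionAtLeast(appVersion, h.minVersion)) return h.fn;
    }
    return handlers[handlers.length - 1].fn; // fall back to the oldest variant
}
```

This keeps backward compatibility in the script itself instead of in environment settings, at the cost of carrying old handler variants until the old app versions die out.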

Callback for callProjectApi() doesn't get executed on native platforms

When we call callProjectApi() from the Ionic application to execute a command in the Alan scripts, the callback doesn't get executed when running on an Android or iOS device (or simulator). It does work when running in the browser (ionic serve).

For example, we call Alan's greetUser function from Ionic:

this.alanButtonRef.componentOnReady().then(async () => {
	try {
		if (!await this.alanButtonRef.isActive()) {
			console.log('Calling: ' + 'await this.alanButtonRef.activate();');
			await this.alanButtonRef.activate();
		}

		console.log('Calling: ' + 'this.alanButtonRef.callProjectApi(\'greetUser\', {}, ...');
		this.alanButtonRef.callProjectApi('greetUser', {}, (error, result) => {
			console.log(error, 'error');
			console.log(result, 'result');
		});
	} catch (e) {
		console.log('EXCEPTION in sendGreeting()');
		console.log(e);
	}
});

The callback function is never called and those console logs are never printed; no exception is caught either. When testing on an Android device we didn't see anything in the logs that would indicate the problem, but when running on iOS there is a log entry that could point to it:

To Native Cordova ->  alanVoice isActive alanVoice1282791712 ["options": []]
⚡️  [log] - Calling: await this.alanButtonRef.activate();
To Native Cordova ->  alanVoice activate alanVoice1282791713 ["options": []]
⚡️  [log] - Calling: this.alanButtonRef.callProjectApi('greetUser', {}, ...
To Native Cordova ->  alanVoice callProjectApi alanVoice1282791714 ["options": [greetUser, {
}]]
⚡️  [log] - Error in Success callbackId: alanVoice1282791714 : TypeError: second argument to Function.prototype.apply must be an Array-like object (evaluating 'callback.apply(this, ...arguments)')
⚡️  [error] - {}

⚡️  ------ STARTUP JS ERROR ------

⚡️  TypeError: second argument to Function.prototype.apply must be an Array-like object (evaluating 'callback.apply(this, ...arguments)')
⚡️  URL: capacitor://localhost/polyfills-es2015.js
⚡️  polyfills-es2015.js:3737:36

⚡️  See above for help with debugging blank-screen issues
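Until the dropped callback is fixed, a defensive wrapper like the sketch below can at least keep the app from waiting forever: it resolves through the callback when it fires, and fails after a timeout when the callback is silently swallowed on a native platform. `callProjectApiSafe` and the timeout value are our own illustration, not part of the Alan SDK.

```javascript
// Hypothetical defensive wrapper (not an Alan SDK API): promisify the
// callback-style call and add a timeout fallback.
function callProjectApiSafe(button, method, params, timeoutMs = 5000) {
    return new Promise((resolve, reject) => {
        let settled = false;
        // Fail if the callback never arrives (observed on Android/iOS builds).
        const timer = setTimeout(() => {
            if (!settled) {
                settled = true;
                reject(new Error(`${method}: no callback within ${timeoutMs} ms`));
            }
        }, timeoutMs);
        button.callProjectApi(method, params, (error, result) => {
            if (settled) return; // callback arrived after the timeout; ignore it
            settled = true;
            clearTimeout(timer);
            if (error) reject(error); else resolve(result);
        });
    });
}
```

Usage from the snippet above would be `await callProjectApiSafe(this.alanButtonRef, 'greetUser', {})` inside the existing try/catch.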

Strange panel on iOS in landscape mode

We have noticed that when our app is in landscape mode, a strange panel shows up on the left side of the screen. The panel only shows when the Alan button is enabled. I've attached screenshots. The panel can be scrolled a bit and shows an animation of a dog in its top part. This only happens on iOS, both in the simulator and on devices. It's not part of the webview but overlays it.

Could this be part of your SDK or its libraries? I can't find the origin of the panel, but it consistently appears only when the Alan button is visible.

Simulator Screen Shot - iPhone 11 - 2022-03-11 at 16 13 55

Screenshot 2022-03-11 at 16 16 01

Error in Alan Studio: resolve() - globalContext can't be resolved

We have noticed that since today (April 21), clearing a context programmatically no longer works, and an error message is displayed in Alan Studio: resolve() - globalContext can't be resolved.

The code that we use is:

projectAPI.greetUser = async function(p, param, callback) {
    p.play('Would you like me to give you a short overview of this month\'s statistics for your site?');
    
    const answer = await p.then(getYesNo);
    
    if (answer === 'yes') {
        playStatistics(p);
    } else if (answer === 'no') {
        p.play('OK, never mind then.');
    } else {
        // Cancelled context.
    }

    callback(null, 'ok');
};


// Context that locks user into yes/no answer.
const getYesNo = context(() => {
    title('Statistics overview');
    
    intent('(yes|sure|please|yes please|ok)', p => p.resolve('yes'));
    intent('(no|nope|never|cancel)', p => p.resolve('no'));
    fallback('Please say yes or no if you would like to hear a short overview of statistics.');
});

projectAPI.clearContextGlobal = function(p, param, callback) {
    try {
        p.resolve('cancel'); // Cancel yes/no context if it was active.
    } catch (e) {}
    callback(null, 'ok');
}

From the mobile app we call the greetUser function, which locks the user into a yes/no context. But if the user performs some action in the app, we must start another Alan dialog, so we call the clearContextGlobal function to close the yes/no context.

This used to work in the past, but no longer does; now the context stays active. Has there been any change in the way contexts work? I've read through the documentation on contexts, but I can't find anything that must be done differently.

The error message in Alan Studio (resolve() - globalContext can't be resolved) appears at the same time as we call the clearContextGlobal function from the mobile app, so I presume the two are connected.
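One defensive pattern worth trying, sketched below in plain JavaScript, is to track whether the yes/no context is actually active and make the "clear" call a no-op otherwise, so resolve() is never invoked when there is nothing to resolve, which is what the error message seems to describe. The names `dialogState`, `enterYesNo`, and `clearContext` are our own illustration, not Alan script APIs.

```javascript
// Illustrative pattern only: cancel the active context at most once, and skip
// the cancel entirely when no context is active.
const dialogState = { resolver: null };

// Called when the yes/no context becomes active; `resolve` is whatever function
// the surrounding code uses to leave the context (e.g. one wrapping p.resolve).
function enterYesNo(resolve) {
    dialogState.resolver = resolve;
}

// Returns true if an active context was cancelled, false if there was nothing
// to cancel (avoiding a resolve() call on a non-existent context).
function clearContext() {
    if (!dialogState.resolver) return false;
    const leave = dialogState.resolver;
    dialogState.resolver = null;
    leave('cancel');
    return true;
}
```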

Crash on Android device

Sometimes when running the project on an Android device, the app crashes as soon as the Alan button is initialized. It doesn't happen regularly: on the same device it will sometimes work and sometimes crash. This behaviour was observed when running an Ionic debug build from Android Studio on a physical Android device (Nokia 8 TA-1004, running Android 9).

When the crash happens, we have observed two different errors in Android Studio's run log.

  1. Exception with stack trace:
    E/AndroidRuntime: FATAL EXCEPTION: Thread-10
    Process: xx.app, PID: 26930
    java.lang.IllegalThreadStateException
    at java.lang.Thread.start(Thread.java:724)
    at com.alan.alansdk.alanbase.recorder.AlanRecorder.startRecording(AlanRecorder.java:30)
    at com.alan.alansdk.Alan.record(Alan.java:143)
    at com.alan.alansdk.Alan.onDialogStateChanged(Alan.java:408)

crash1-edit.txt

  2. A single error message (not sure if it's the cause of the crash, but it isn't present in the log when the app doesn't crash):
    E/AudioRecord: start() status -38

crash2-edit.txt

If there's any other information we can provide to help you locate the problem, please let us know.

Alan does not recognize/record voice input on specific Android devices

We have encountered an issue where, on specific Android devices, voice input is not recognized. When Alan is listening, nothing happens, as if the voice is not being recorded. No amount of app re-installs or different builds has helped; the issue is always present.

Affected devices are:

  • Samsung Galaxy S20 FE (SM-G780F/DS), OneUI 3.1, Android 11, sec patch 1.nov.2021
  • Samsung Galaxy Note 10 (SM-N970F/DS), OneUI 3.1, Android 11

On the Samsung S20 FE we tested your reference app "Alan Playground" (installed from Google Play) and voice input doesn't work in your app either. We will also test it on the Note 10 device at a later time and will update the issue once we have results.

I am attaching a logcat from running our Ionic app on the Samsung S20 FE. In the log I've marked the approximate location where the user started speaking (line 416). A bit above that, on line 398, is the app's log entry from when the button state callback fired with state LISTEN, so Alan started listening somewhere around there.

samsung_s20_fe_dg1_app_logcat-edit.log

Ionic info:

Ionic:

   Ionic CLI                     : 5.4.16 
   Ionic Framework               : @ionic/angular 5.3.2
   @angular-devkit/build-angular : 0.803.29
   @angular-devkit/schematics    : 8.3.29
   @angular/cli                  : 8.3.29
   @ionic/angular-toolkit        : 2.3.3

Capacitor:

   Capacitor CLI   : 2.4.0
   @capacitor/core : 2.4.0

Utility:

   cordova-res                          : not installed
   native-run (update available: 1.5.0) : 1.3.0

System:

   NodeJS : v10.18.1
   npm    : 6.13.4
   OS     : macOS Big Sur

Is it possible to change the required score to match an intent

Is it possible to adjust at what score the intent is matched?

We have observed cases where multiple intents are available, the user says something completely different, and it still gets matched to some intent with a low score (e.g. 0.6).

As another example, there is a situation in our scripts where we lock the user into a yes/no context (with a fallback). At this point we expect a clear yes or no from the user, and everything else should trigger the fallback message. During testing we observed that saying random words/sentences will sometimes get matched to yes/no with a low score. Could we set a score threshold for specific intents?
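If the platform exposes the match score to the handler, a per-intent threshold can be approximated with a wrapper like the sketch below. Whether and how Alan exposes a score on the handler argument (written here as `p.score`) is an assumption on our part, not a documented API; `withMinScore` is an illustrative name.

```javascript
// Hypothetical sketch: only run the intent handler when the match score clears
// a threshold; otherwise hand off to a re-prompt. Assumes `p.score` exists.
function withMinScore(minScore, handler, onLowScore) {
    return p => {
        if (typeof p.score === 'number' && p.score < minScore) {
            return onLowScore(p); // treat low-confidence matches like a fallback
        }
        return handler(p); // no score available, or score is high enough
    };
}
```

In a script this might wrap the yes/no intents, e.g. `intent('(yes|sure|please|yes please|ok)', withMinScore(0.8, p => p.resolve('yes'), p => p.play('Please say yes or no.')))`, so low-confidence matches re-prompt instead of resolving.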
