
[Gaia] Ep. 3: Talk to Gaia, she can understand you

Posted in My Projects


I know it’s been a while since I’ve posted any updates. I was working on my end-of-study project, and I started my end-of-study internship in February, so honestly I haven’t had much free time. But that doesn’t mean I forgot about my personal projects: I’ve been working on my chatbot app, and I’m going to show you the result!


I built the app with Ionic and connected it to the Dialogflow API, so my agent can receive voice or written messages and respond with pre-defined scenarios. What I still have to do is connect my app to a database to store the user’s data and use it in conversations. I’m working on it, but I’ll tell you more about it in the next post.
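To give you an idea of what talking to Dialogflow looks like, here is a minimal sketch of the payload the app sends to the Dialogflow v1 REST API (the version I’m using). The session id and the `CLIENT_ACCESS_TOKEN` in the comment are placeholders, not my real values:

```typescript
// Shape of a Dialogflow v1 query: the user's sentence, a language code,
// and a session id so the agent can keep track of the conversation.
interface DialogflowQuery {
  query: string;
  lang: string;
  sessionId: string;
}

function buildQuery(userText: string, sessionId: string, lang = 'en'): DialogflowQuery {
  return { query: userText, lang, sessionId };
}

// In the app, the payload is POSTed to the v1 query endpoint, roughly:
// this.http.post('https://api.dialogflow.com/v1/query?v=20150910',
//   buildQuery(text, session),
//   { headers: { Authorization: `Bearer ${CLIENT_ACCESS_TOKEN}` } });
```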


When I first tried to connect my app to Dialogflow, I ran into numerous issues. I literally spent a day trying to get the app working and uninstalled and re-installed at least 5 different things, only to discover that with the current version of Cordova (the framework Ionic uses to deploy apps on mobile platforms), the API simply doesn’t work. More precisely, there is a conflict when you build the app for the Android platform with the plugin installed. So if you want to develop an Ionic application using the Dialogflow API, you need to downgrade to Cordova 6 or 7 (I went with version 6.5.0). But then you also need to install an older version of Ionic. And then you need the proper Android SDKs and Gradle version (if you need more details, ask in the comments). As you can imagine, the hours I spent just trying to get the project to run weren’t fun.
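If you want to try the downgrade yourself, the gist looks like this (the Cordova version is the one that worked for me; the matching Ionic CLI version depends on your project, so I’m not putting a number on it):

```shell
# Remove the current Cordova and pin the old one (6.5.0 worked for me)
npm uninstall -g cordova
npm install -g cordova@6.5.0

# Then install an older Ionic CLI to match, and let Cordova tell you
# which Android SDK / Gradle pieces are missing for your setup:
cordova requirements android
```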


When it finally worked, I was able to send text or voice requests to the API and display the response. Dialogflow only returns text. But since I also wanted my app to talk, I used an Ionic plugin called Text-to-Speech to have a voice read out loud the speech returned by the API.
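Concretely, the spoken reply lives in the `result.fulfillment.speech` field of a Dialogflow v1 response. A small sketch of how I pull it out before displaying it and handing it to the text-to-speech plugin (the plugin call in the comment is illustrative):

```typescript
// Just the part of the Dialogflow v1 response we care about here.
interface DialogflowV1Response {
  result?: {
    fulfillment?: { speech?: string };
  };
}

// Returns the agent's spoken reply, or an empty string if there is none.
function extractSpeech(res: DialogflowV1Response): string {
  return res.result?.fulfillment?.speech ?? '';
}

// In the app, roughly:
// const reply = extractSpeech(apiResponse);
// this.messages.push({ from: 'gaia', text: reply });  // conversation view
// this.tts.speak({ text: reply, locale: 'en-US' });   // Text-to-Speech plugin
```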

If you only want a chatbot that makes conversation, connecting your app to the API is enough to get the responses you want. But as soon as your app should take the variables spotted in the user’s speech and do something with them, the API isn’t sufficient on its own and you need a webhook. You can use Firebase as a webhook for your app. If you want more details on how to do that, you can check out this free Udemy course on building a chatbot with Dialogflow (the audio quality is bad, but it does help you get started with the API).
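To make the webhook idea concrete, here is a hedged sketch of what one does: Dialogflow sends the detected action and parameters to your endpoint, and you send back the reply it should say. The `remind` action and `task` parameter are made-up examples, not my actual intents:

```typescript
// The part of a Dialogflow v1 webhook request we use here.
interface WebhookRequestBody {
  result: {
    action: string;
    parameters: { [key: string]: string };
  };
}

// What the webhook sends back: text to speak and text to display.
interface WebhookResponseBody {
  speech: string;
  displayText: string;
}

function handleWebhook(body: WebhookRequestBody): WebhookResponseBody {
  if (body.result.action === 'remind') {
    // Use the variable Dialogflow spotted in the user's sentence.
    const task = body.result.parameters['task'] || 'something';
    const speech = `Okay, I will remind you to ${task}.`;
    return { speech, displayText: speech };
  }
  const fallback = "Sorry, I can't do that yet.";
  return { speech: fallback, displayText: fallback };
}

// With Firebase Cloud Functions, this gets wrapped in an HTTPS handler:
// exports.dialogflowWebhook = functions.https.onRequest((req, res) => {
//   res.json(handleWebhook(req.body));
// });
```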


Now let’s talk about the UI. I changed my mind several times and I’ll probably want to improve it further, but for now the home page has two tabs – an interaction view and a conversation view – and there are 3 other pages:

  • Family & Friends: where the user’s family and friends are documented
  • Planning: the user’s schedule
  • Reminders: where various reminders for the user are kept

These 3 pages will let the user update their data in a database, and the chatbot (accessible from the home page) will use this data in conversations. As I mentioned, I haven’t connected my app to a database yet, so this part doesn’t work for now.
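As an example of how the chatbot could use that data once the database is connected, here is a sketch of a reminder record and a helper that picks the upcoming ones. The field names are assumptions, not the final schema:

```typescript
// A reminder as the Reminders page might store it.
interface Reminder {
  text: string;
  when: Date;
}

// Keep only reminders still in the future, sorted soonest first, so the
// chatbot can mention the next one in conversation.
function upcoming(reminders: Reminder[], now: Date): Reminder[] {
  return reminders
    .filter(r => r.when.getTime() > now.getTime())
    .sort((a, b) => a.when.getTime() - b.when.getTime());
}
```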


As pictures are better than words, here are a few screenshots of what my app can do for the time being:







When you launch the app, you get the interaction view. You can type text or use your mic to talk to the chatbot, and it will reply vocally and display the answer in the text zone. In the first screenshot, I ask “Who are you?”, and this is the response I get.

For now, I’ve used a photo of myself because I didn’t have anything else to put there, but I’m hoping to create a visual for Gaia and use that instead. That way, the user will have a face to interact with and will probably feel more connected to the chatbot. The image will also be replaced by other images (such as photos of family members or friends) when the situation calls for it.

In the conversation view, you can see the previous messages exchanged with the bot. Here too, you can either use your mic or type to say something. The screenshots are examples of conversations you might have with the bot.


One of the reasons I chose Ionic is that it’s simple to build a UI with its components. For instance, icons, buttons and navbars are built in (and they look different depending on the platform). You build pages in HTML, style them with Sass (a CSS preprocessor), and dynamically display content that comes from your TypeScript file.


So my next step is storing content in a database and displaying it in the other pages. I’ve already started looking into storing content with Firebase, but implementing everything takes time – which is why I haven’t got around to it yet.


Stay tuned,

A girl in tech.
