Building a Chat Bot in under 60 seconds with Karmaloop AIML Bot Server

The Rise of the Bots

We have entered the century of Artificial Intelligence; it is finally here. However, we won’t have super-smart, human-like AI from day one. We will ease into the process with simple bots taking over mundane jobs and automating them. For example, I am already done with manually setting alarms; that job has gone to Siri and Alexa. I am also done setting reminders; they have taken that too. In this article, let us see how we can build a bot in under 60 seconds.

Karmaloop AIML Bot Framework

Brief History

Chat bots have been around since the 1990s. In the late 90s, Dr. Richard Wallace created an XML-based specification called AIML, or Artificial Intelligence Markup Language. It was clean, simple, and easy to customize. I was working through my Computer Science degree in 2005 when I stumbled upon Dr. Richard Wallace’s work with ALICE. ALICE was a super smart chat bot for her time, and her AIML files were skillfully crafted. I was learning C# at the time and decided to build an AIML parser in C# to bring ALICE to life on my computer, and to use her as my final semester project. Long story short, it all went well; I got through my engineering degree and got busy with a job and, later, business. It was not until 2016 that the bug bit me again and I restarted work on the libraries, and in 2018 I finally released the first open-source version of what I called the Karmaloop AIML Bot Server. Today, I will show you how to get started with the solid underlying foundation of the ALICE bot and add your own capabilities on top.

Download and Run Karmaloop AIML Bot Server

Get the binaries from the Releases page on GitHub – https://github.com/KarmaloopAI/karmaloop-aiml-bot-server/releases

If you are on Windows, extract the archive, navigate to the folder, and run KarmaloopAIMLBotServer.exe to start the bot server. If you are on macOS or Linux, install Mono and use it to run the same executable:


mono KarmaloopAIMLBotServer.exe

Awesome! Now you should have ALICE bot running on your machine. Open a web browser window and point it to http://localhost:8880/api/ChatUi/index.html
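If you would rather talk to the bot from code than from the browser, the server can be driven over HTTP as well. Below is a minimal Python sketch; note that the endpoint name ("Chat") and the query parameter ("message") are assumptions for illustration only, since only the Chat UI URL above is confirmed. Inspect the Chat UI's network calls or the server source on GitHub for the real API contract.

```python
# Sketch of talking to the bot server from code instead of the web UI.
# NOTE: the endpoint name ("Chat") and parameter ("message") below are
# assumptions; check the server source for the real API contract.
import urllib.parse
import urllib.request

BASE_URL = "http://localhost:8880/api"

def build_chat_url(message, endpoint="Chat"):
    """Build the request URL for one chat message (hypothetical endpoint)."""
    return f"{BASE_URL}/{endpoint}?{urllib.parse.urlencode({'message': message})}"

def ask_bot(message):
    """Send the message to the locally running bot server and return the raw reply."""
    with urllib.request.urlopen(build_chat_url(message)) as resp:
        return resp.read().decode("utf-8")
```

Once you have confirmed the real endpoint path, a call like ask_bot("Hello") would return whatever the server replies with.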

On Windows, you may need to run the following command to make sure the server can open port 8880 for listening to incoming API requests.


netsh http add urlacl url=http://*:8880/ user=Everyone listen=yes

Now chat and have fun!

Customize and add your own conversations

To explore this topic in full, you may first want to apprise yourself of the AIML 2.0 specification – https://callmom.pandorabots.com/static/reference/

Step 1.

In the folder where you extracted the binaries, there should be a folder called “aiml”. This is where the AIML files are stored by default. Create a new file called magicmirror.aiml and paste the following code into it:


<?xml version="1.0" encoding="UTF-8"?>
<aiml version="1.0">
<category>
  <pattern>MAGIC MIRROR IN MY HAND WHO IS THE FAIREST IN THE LAND</pattern>
  <template>My Queen, you are the fairest in the land.</template>
</category>
</aiml>
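To see why the all-caps pattern above will match your mixed-case, punctuated question, here is a toy sketch of the matching idea in Python. This is not the server's implementation (that is C#), and real AIML engines also support wildcards and <srai> recursion; the sketch only shows the normalize-then-look-up step.

```python
# Toy sketch of AIML matching: input is normalized (uppercased,
# punctuation stripped) and looked up against category patterns.
# Real engines add wildcard and <srai> handling on top of this.
import re

categories = {
    "MAGIC MIRROR IN MY HAND WHO IS THE FAIREST IN THE LAND":
        "My Queen, you are the fairest in the land.",
}

def normalize(text):
    """Uppercase and strip punctuation, the way AIML normalizes input."""
    return re.sub(r"[^A-Z0-9 ]", "", text.upper()).strip()

def respond(user_input):
    return categories.get(normalize(user_input), "I have no answer for that.")

print(respond("Magic mirror in my hand, who is the fairest in the land?"))
```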

Save the file, and restart your bot server. Then at the chat prompt, type the question – Magic mirror in my hand, who is the fairest in the land?

The conversation should look somewhat like below:

As expected! The response is exactly what we coded into the aiml file. That is it. If this took you under 60 seconds, you just built your first chat bot in less than a minute.

OK, but how can I produce dynamic responses?

Hmm… so you are ready to build something more complex and would like to mash external data into your responses… right? Like a weather skill that can tell you the weather in any city in the world. Let us keep this complex bit for a later post, but if your curiosity is insatiable, simply open the file called “zweather.aiml”. Then open the source code (which you can download from GitHub) and look up the weather skill. Set breakpoints and watch how the skill executes. If you are good, you should already see the simplicity and power of this approach, but hey, more on that later!
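To preview the idea without reading the C# source, here is a rough Python sketch of how a "skill" splices live data into a canned response at reply time. This illustrates only the concept; the actual weather skill in the server works through AIML tags and C# handlers, so every name below is hypothetical, and the fetcher is a stub standing in for a real weather API call.

```python
# Conceptual sketch of a dynamic-response "skill": a matched pattern
# triggers a handler that fetches data and fills a response template.
# fetch_weather is a stub; a real skill would call a weather service.
def fetch_weather(city):
    """Stub data source standing in for a real weather API call."""
    return {"condition": "sunny", "temp_c": 24}

def weather_skill(city, fetch=fetch_weather):
    """Fill the response template with data fetched at reply time."""
    data = fetch(city)
    return f"It is {data['condition']} and {data['temp_c']} degrees Celsius in {city}."

print(weather_skill("London"))
```

Passing the fetcher in as a parameter keeps the skill testable: you can swap in a stub while developing, and the real API client in production.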

Where to go from here

Now that you know how to create a basic conversational bot with Karmaloop AIML Bot Server, you may be itching to build your own and handle complex conversations. Maybe you want to automate arguments with your wife (LOL). Whatever your need, you will invariably want to start by reading up on the AIML 2.0 spec, the link to which is shared above. You may also want to visit the GitHub page to see how to compile from source on your platform. If you need any help, do post in the comments section and I will surely help.

Happy Bot Building!

India-China Standoff – Checkmate – Part 1

India and China have been in a military standoff in the Doklam region and the tri-junction area in the Sikkim sector. The disagreement, from China’s angle, is that Indian troops entered undisputed Chinese territory and blocked their road construction. They also claim that the Doklam region is disputed between itself and Bhutan, and that India has no part to play in a bilateral dispute. India’s angle is not very clear to the public, apart from defending the Siliguri corridor, also known as the Chicken’s Neck. Arguments from both sides, in my humble opinion, are foolish, because both governments have strategic objectives. For China, building a road leading nowhere is foolish and serves no purpose other than sending military equipment to the front more quickly and effectively. India, too, is giving unwarranted importance to the Doklam plateau, because in the event of a full-scale military confrontation, China would lose Doklam first due to the clear advantages the Indian military has in the region.

It is imperative we look at what the geography of this region looks like and what may or can happen in case of a full confrontation or a limited tactical confrontation.

India – China border standoff 2017

The interesting bit is that this standoff has two very distinct sides: one public, and the other private yet strategic. We will be exploring the strategic angle in this post series because, in all honesty, that is the angle the government is unable to disclose publicly, yet it is what it has in mind as the primary objective. Being no expert by any means on military strategy, I would still like to make a humble effort at decoding it. Here goes the outline from India’s perspective:

  • A strong protest by India (read Prime Minister Modi) against China’s dual play – play with us or play against us – pick a camp and stick to it
  • We will block any attempts by Chinese at “Salami-Slicing” tactics
  • If we can’t have a full scale confrontation with Pakistan due to Nuclear deterrence issues, neither can China with India – hence – the checkmate
  • Force China to up the ante and explore all confrontation options – come to the realisation it can’t do much – and then get to the negotiation table to re-build the entire partnership
  • Realising a full scale confrontation will destroy India and severely damage China, tame the dragon to play along bilaterally, because China isn’t a Kamikaze state but a mature world power
  • Given the humongous trade deficit between the two countries, let China know it has the lion’s share to lose even in a limited-conflict war theatre
  • Send a message to all countries having disputes with China, especially in the South China Sea, that the dragon has met its match in India, regardless of Chinese rhetoric about how inferior the Indian military is compared to the Chinese military
  • Exploit the negative rhetoric against China in India to push the Indian manufacturing sector
  • Let China be well reminded of how upset (and terrified) India is about CPEC/OBOR, because of the easy troop movement it allows China while encircling India
  • Let the reverse also be known: it is China that will have much to lose if India starts exploring pre-emptive blocking strategies against CPEC/OBOR

I hope to cover each of the strategic objectives in detail in subsequent posts and play out mock confrontation scenarios. Much of this has probably already happened and been discussed behind closed doors on both sides; however, it is not in the open or public domain.

 

And I am back!

Fellas! My blog is back after being dead for more than a year. It took some maintenance and technical fixing to get the website back up and running. I intend to be more active with my blogging from now on and talk about subjects I care about from the bottom of my heart. The latest addition to my loved subjects is Artificial Intelligence (AI) and my work with a new startup called Karmaloop AI (www.karmaloop.ai).

Post your comments!

Linux Mint / Ubuntu – Beats Audio on HP Laptop

I am glad someone figured it out!! I will repost his solution so that all those stuck with crappy sound on their Beats-equipped HP laptops while using Ubuntu get a breather.

Please note – if you are using Ubuntu 13.10 or above, you do not need to install hda-jack-retask separately; it is part of the alsa tools. In that case, install alsa-tools-gui using the standard software manager.

Follow these steps (skip installing hda-jack-retask if you are on Ubuntu 13.10 or higher):

OK! I figured it out! It sounds *awesome*!

Step 1: Install hda-jack-retask from here: https://launchpad.net/~diwic/+archive/hda (ppa:diwic/hda)

Step 2: Open hda-jack-retask

Step 3: Select the IDT 92HD91BXX codec (may be different on other models)

Step 4: Check the “Show unconnected pins” box (the internal speakers do not show as connected)

Step 5: Remap 0x0d (Internal Speaker, Front side) to “Internal speaker”

Step 6: Remap 0x0f (“Not connected” but is the under-display speakers) to “Internal speaker”

Step 7: Remap 0x10 (“Not connected” but is the subwoofer) to “Internal speaker (LFE)”

Step 8: Apply now, then test with your favorite audio program (some may not work due to Pulse reset, so find one that does, verify sound is coming from all speakers).

Step 9: If it works, select “Install boot override” to save the settings to apply at boot time.

Step 10: Reboot. When it comes back, you should have full sound from all speakers. Also test headphones. Plugging in headphones should disable sound from all internal speakers.

 

This worked awesome on my laptop! If you have questions just post in comments here.

Secret of the Romani People

A few years back, when I was big on finding out about the origins of the Indo-European peoples, I learned about the gypsies tracing their roots back to India. Yesterday I stumbled upon the history of the Romani people, who are part of the gypsy tribe and have lived in Europe for centuries without really knowing their place of origin. It will surprise a lot of readers that the Romani people have been genetically proven to be a race from north-central India, yet have lived across Europe for at least 1,000 years. The language they speak is closely related to Hindustani, or the Hindi language of India, which is another startling fact.

What probably remains a big mystery is how the Romanis, or the Gypsies, got to Europe from India. What made them forget the land they came from, and what made them never make an effort to go back? It is a mystery even the Romani people may not know the answer to. Someday their history and their historic connection to India will be lost in the annals of time, and it is unlikely that anyone will be interested in finding out the real reasons.

Do the Romani people consider themselves Indians and would they ever want to come back to India? Hmm interesting question and only a Romani connected to the roots can answer…

Getting your cheap Android Phone/Tablet to get detected for Debugging by Linux (Mint or Ubuntu)

Welcome to a post about another roadblock I recently solved in the Android development saga. I got myself a cheap Android tablet (Byond Mi-1). To use it for Android development on Linux Mint / Ubuntu, I had to go through quite a few steps beyond the usual. Let’s go step by step:

  1. Figure out your tablet’s vendor ID – Use the lsusb command. It will dump the details of all the USB devices connected to your machine. Usually your cheap tablet will not show up with a name in the dump, but in all likelihood it will be the last item on the list. To be sure, copy the output of the lsusb command into a text editor or spreadsheet. Then connect your tablet to the computer and turn on Mass Storage (on the tablet). Run lsusb again, grab the dump, and put it into the text editor or spreadsheet. There should be an extra line pertaining to your device, with an ID in the form ID 1234:5678. 1234 is your vendor ID. Take a note of it.
  2. Run the command:
    sudo gedit /etc/udev/rules.d/51-android.rules
    Copy paste these lines:
    SUBSYSTEM=="usb", ATTR{idVendor}=="1234", MODE="0666", GROUP="plugdev"
    SUBSYSTEM=="usb", ENV{DEVTYPE}=="usb_device", ENV{PRODUCT}=="1234/*", MODE="0666"
    SUBSYSTEM=="usb", SYSFS{idVendor}=="1234", MODE="0666"

    Please change 1234 to your device's vendor ID (from step 1) in each line.

  3. Run the following command to create an adb_usb.ini file in the .android folder in your home directory (no sudo needed, since the file lives in your own home):
    gedit ~/.android/adb_usb.ini
    Simply write your vendor ID in this format:
    0x1234
    Save and exit
  4. Reboot your computer
  5. Unlock your tablet and go to settings. Find Developer Settings and switch on USB debugging. This step will depend on your Android version.
  6. Connect your tablet to the computer
  7. Get to your android sdk’s platform tools folder and run the command:
    ./adb devices
  8. If your device is listed, then woohoo! You’ve got your cheap tablet ready for development.
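The lsusb diff in step 1 can be scripted instead of being done by hand in a spreadsheet. Here is a small Python sketch; the sample lsusb lines in it are made up for illustration.

```python
# Sketch: diff two lsusb dumps (before and after plugging the tablet in)
# and extract the vendor ID from any newly appeared line.
# lsusb lines look like:
#   Bus 001 Device 004: ID 1234:5678 Some Vendor Some Device
import re

def new_vendor_ids(before, after):
    """Return vendor IDs present in `after` but not in `before`."""
    added = set(after.splitlines()) - set(before.splitlines())
    ids = []
    for line in added:
        match = re.search(r"ID ([0-9a-fA-F]{4}):[0-9a-fA-F]{4}", line)
        if match:
            ids.append(match.group(1).lower())
    return ids

# Made-up sample dumps for illustration
before = "Bus 001 Device 002: ID 8087:0024 Intel Corp. Hub\n"
after = before + "Bus 001 Device 004: ID 1234:5678 Unknown Tablet\n"
print(new_vendor_ids(before, after))
```

In practice you would capture the two dumps with `lsusb > before.txt` and `lsusb > after.txt` and feed the file contents to the function.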

Pretty cool eh!?

Laptop LCD screen brightness in Linux Mint 13 or Ubuntu 12.04

I recently set up a Linux workstation, and based on my research into the best distributions available, two came to the fore: Ubuntu 12.04 and Linux Mint 13 (Maya). Ubuntu has always been a fantastic Linux distro, but as I learned that Linux Mint is actually based on Ubuntu and does a better job of being a full-featured OS, I decided to set it up on my desktop. I have been very pleased so far!
One of the issues I faced was the inability to control the screen brightness. I could not do so from the keyboard keys, and neither did system settings work. The fix was easy, as I learned on other forums. Here is the link to the fix:
http://shellboy.com/linux-mint-13-on-dell-xps-15-brightness-keys-not-working.html

Android Development Beginning

I began learning Android development a little while back, and today I made some excellent progress. In order to get all my know-how straight, I planned to create a simple app that would show me a list of people in a group. The list of people will be shown using a ListView control on the Android UI. The App will fetch this information from a RESTful web service written in ASP.NET C# on the Mono platform. I plan to use JSON as my data communication format. Internally, the ASP.NET service will pull this information from a PostgreSQL server.

Here is how I approached the task. I quickly threw a table into my PostgreSQL installation containing the names of the people I want to show. In MonoDevelop, I created an ASP.NET project with a simple ASPX page that dumps out JSON for the list of users. To fetch the data from PostgreSQL, we need a library called npgsql. It’s pretty slick and gets the job done. Using standard mechanisms, I was able to pull data from the database and convert it into readable JSON.
The most important and challenging part (for me) came after this: how to consume this service in my Android application. Coming from a C# Windows Forms background, I am used to doing this in a line or two, and if the call is small enough, I don’t even bother making it async. But Android really wanted me to make the network call on a separate thread, and it did make sense! It enforced a good practice from the very beginning. Alright, so my quick reading led me to the AsyncTask<> class. I found some really nice tutorials on its implementation, especially this one: http://www.vogella.com/articles/AndroidPerformance/article.html
I faced an interesting situation here. I was trying to access my local ASP.NET server via a standard URL like http://localhost:8080/, but that always threw an exception at me saying the connection was refused. Then I tried browsing that URL from the Android emulator, and that too failed. It took a while, but it then struck me that localhost would be a loopback address on the emulator itself, not the host development machine. A quick search showed me that the emulator uses the IP 10.0.2.2 as the loopback for the host machine. Modifying my URL to http://10.0.2.2:8080/ worked in the Android browser. Pretty sure it would work in my code as well, I ran my App, which, to no surprise of mine, threw the same exception.
Digging further over the internet, I realized I had missed a very basic step: asking permission on Android to use the Internet. So all I had to do was put a line in the Manifest file like:

<uses-permission android:name="android.permission.INTERNET" />
The next time I ran the App, voila! It worked! But hey, that didn’t take it all the way to showing it on the UI as a list; it only pulled the JSON string into a variable.
I now had to parse the JSON and get it into a Java-consumable format. Gson by Google came to my rescue: a nice little framework for working with JSON in Java. I added a reference to it in my project and wrote some test code to see how it works. It was simple, clean, and perfect. You create a Gson instance and simply call the fromJson method to convert a JSON string into the desired type.
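The overall fetch-and-parse flow is easier to see in a compact sketch. This one is in Python purely for brevity (the App itself uses Java and Gson); parse_people plays the role that Gson's fromJson plays on the Java side, and the sample payload is made up.

```python
# The flow described above, sketched generically: take the service's
# JSON array and deserialize it into typed objects ready for UI binding.
import json
from dataclasses import dataclass

@dataclass
class Person:
    name: str

def parse_people(payload):
    """Deserialize a JSON array of people into Person objects
    (the role Gson's fromJson plays on the Java side)."""
    return [Person(**item) for item in json.loads(payload)]

# Made-up sample response from the ASP.NET service
payload = '[{"name": "Alice"}, {"name": "Bob"}]'
print([p.name for p in parse_people(payload)])
```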
Now came another roadblock: binding it to the ListView. The way ListView works is that it has an Adapter, which can be of any custom implementation type. I used a standard ArrayAdapter and set it on the ListView. What kept tripping it up was the fact that the ListView would not consume the adapter on a thread other than the UI thread. This is pretty basic in the Windows Forms world, and I should have guessed the same would be true here. The solution would have to be similar as well, and as I found out, there is a very similar method in Android called runOnUiThread, which takes a delegate, much like the Invoke method on controls in Windows Forms. I gave it a final run, and yeah, it was the Eureka moment to see everything working. Even after 6 years of programming, the smallest of achievements in new areas sometimes thrills me; so much so that I spent this long writing about it… though this write-up is just for my own log, in an effort to track the development process of whatever it is that I am developing.
I am following a self-imposed Agile development methodology and am attempting to follow a Sprint cycle. Sprints are short bursts of all-round development that let us understand the end-to-end process from the very beginning. For example, in my really simple project, I learned how to get PostgreSQL installed on my Linux box, how to use it from C# on Mono, and finally how to consume it in an Android App. This allowed me to do an integrated test of all three work environments that will form the major part of my upcoming days of deep development…
I will sign off now and log my next effort!

iPhone water damage causing no sound to come from the handset

Wow, I never thought I would spill water on my precious iPhone 4… until yesterday, when I actually did the forbidden. Like most cell phone companies, Apple doesn’t cover water damage to the phone. The moment I spilled water on it, I reacted in about 3 seconds and wiped my iPhone dry. I pressed the home button, it lit up, and everything seemed to respond OK. What I didn’t notice at the time was that it wasn’t making the usual click sounds while unlocking, and neither did it play tones when I pressed the numeric buttons. It became apparent an hour later, when I tried calling a friend and wasn’t able to hear their voice from the handset. I switched to Loud Speaker mode, and then I could hear my friend. Crap, I thought to myself, believing I had successfully damaged my iPhone’s earpiece.

Alright, so let’s do the problem-solution round:

Problem:
No sound coming from the handset’s earpiece; the loud speaker, however, works fine.

Solution:
The solution came to me from this link:
http://faithtoh.wordpress.com/2010/01/20/how-i-fixed-my-iphone-has-no-sound-problem/

What I did was simply take a thin napkin that would absorb water and wrap it delicately around something like a toothpick. Turn off the phone, then clean the headphone jack with it, making sure the toothpick reaches the very end. Once all the water was out of the headphone jack, turning the phone back on brought my sound back!!! It was like a miracle!!!

You must be wondering what the headphone jack has to do with no sound coming from the handset’s earpiece. Well, my theory is that when water gets inside the headphone jack, it kind of shorts it, making the phone believe a headphone has been connected and hence routing sound away from the handset earpiece.

If this post helps you, please yell yay!