Language Processing

To communicate vocally with Cortana, she first needs to understand what we’re saying. In our case, that means using a service called LUIS (luis.ai), which is described as:

A machine learning-based service to build natural language understanding into apps, bots, and IoT devices.

https://azure.microsoft.com/en-us/services/cognitive-services/language-understanding-intelligent-service/

LUIS allows you to create an application that examines your request, tries to determine your intent and its associated entities, and then tells you what it believes is the correct response. To do this, you start by creating some intents, for example: HomeAutomation.TurnOn or HomeAutomation.TurnOff.

You then create a list of utterances for these intents, which are sentences you would expect someone to speak to perform the task, such as: “Turn the kitchen light on”. Once you have a number of utterances (usually a minimum of five), you can train your application to find the commonalities and patterns in the utterances that associate them with the given intent. If you’ve ever done any machine learning before, you’ll understand what’s going on here. If you haven’t, don’t worry, it’s not essential knowledge for getting this all working.
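
Conceptually, the mapping you’re building up looks something like this (the intent names are the ones from above; the utterances are made-up examples):

// Illustrative only: intents mapped to a handful of example utterances.
const utterances = {
    'HomeAutomation.TurnOn': [
        'Turn the kitchen light on',
        'Switch on the bedroom lamp',
        'Can you turn on the porch light'
    ],
    'HomeAutomation.TurnOff': [
        'Turn the kitchen light off',
        'Switch off the bedroom lamp'
    ]
};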

You can now test your application, if you want, to make sure it can determine your intent from the input. Here you provide a sentence that is not in your list of utterances and see if your application correctly identifies the right intent.

Intents are examined and given a score (a decimal from 0 to 1), which is the application’s confidence that it knows what you are saying. The intent with the highest score is then declared to be the correct intent.

Example:

{ text: 'turn off kitchen light',
  intents:
   { HomeAutomation_TurnOff: { score: 0.772598863 },
     HomeAutomation_Control: { score: 0.0294161271 },
     HomeAutomation_Scene: { score: 0.01918456 },
     HomeAutomation_Status: { score: 0.01667956 },
     HomeAutomation_TurnOn: { score: 0.0116767986 },
     None: { score: 0.00246443041 },
     HomeAutomation_GetLocation: { score: 0.00227342732 } } }

As you can see, in our example of turning off the kitchen light, the TurnOff intent was clearly the winner.

Well, that’s all great, but what is it we’re trying to control? These are known as entities, and they are the context objects in your request. They are added in your LUIS app in a similar way to your intents, but with some additional properties, such as a type: Simple, List, Composite, personName, DateTime, and many more. For lists, you can also add synonyms to your values. For example, an entity named Event of type List may have an item value of “Get Up” with a synonym of “Wake Up”. This helps the language processor know that when you say something like “Turn on the coffee maker when I wake up“, it’s the same as “Turn on the coffee maker when I get up“.
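
Purely as a sketch (this isn’t the exact LUIS authoring format), the Event list entity described above is conceptually something like this:

// Conceptual sketch of the "Event" list entity (illustrative only,
// not the exact LUIS authoring format).
const eventEntity = {
    name: 'Event',
    type: 'List',
    values: [
        { value: 'Get Up', synonyms: ['Wake Up'] }
    ]
};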

Once you have defined your entities, your app will try to put them into context with your utterances automatically, although sometimes you may need to help it if it picks the wrong entity.

The culmination of this, along with training the app at various stages as you add new elements of speech, results in something like this:

As you can see, my request to “Turn off kitchen light” resulted in my app recognizing a device, an operation, and a room. From there I could pass that data to a HomeAssistant script which could then pick the correct action and perform it.
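
As a rough sketch of what that recognition produces (the entity types here match the device/operation/room grouping above; the exact field names and scores are illustrative only):

// Illustrative only: a sketch of the recognized intent plus entities
// for "Turn off kitchen light" (field names and values are not exact).
{ text: 'turn off kitchen light',
  topIntent: 'HomeAutomation_TurnOff',
  entities:
   [ { entity: 'light', type: 'Device' },
     { entity: 'off', type: 'Operation' },
     { entity: 'kitchen', type: 'Room' } ] }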

The testing for this is done in the Bot Framework Emulator, available for all platforms, which we will get to later.

So, for this first step I highly advise not bothering to create a bot or write any code yet. Create a LUIS app, add some basic intents and entities, and build something that understands what you want to do when you ask it in a textual way.

Home Automation

Lately I’ve been consumed with the desire to automate certain things in my home. To be honest, it started when I saw the movie “Electric Dreams” as a teen. Over the last few years, though, home automation has become a reality, as companies and individuals have launched home automation platforms such as HomeAssistant, OpenHAB, and others, bringing the “Electric Dreams” of the past into the now.

I’ve tried a number of these systems, the devices they support, and various device firmware to try to find just what I want…even if I don’t know exactly what that is right now.

I know I want “presence detection”: knowing where someone is, and specifically knowing when someone is home. A good example of this might be to switch off all the lights, lower the temperature in the house, and activate an alarm system when you leave for the day. You can attach conditions to this as well; for example, you wouldn’t want to do those things if someone else was still home after you leave…so again, presence detection to the rescue.

One of the other things I certainly want is voice control and voice recognition. Being able to say “Turn the heating up 2 degrees” or “Dim the lights to 25%” is a must for me. I don’t want to think of my home automation system as just a computer, but more as a presence that’s intuitive and learns patterns about my behavior, and that of others in my home.

That leads me to my current project, which is using the Harman Kardon Invoke, powered by Microsoft Cortana, as the voice interface between me and my home automation system.

Microsoft Cortana is really what’s doing the work here, or at least the detection and initial processing of my voice. That data then gets sent up to the cloud (Microsoft Azure in this case) to a “web bot” that handles the request and formulates a response. Some of my other posts will talk about the actual technology involved, some of the design and architecture, and the actual code. For now I’ll just say: this is going to be fun!

Let’s get started: The Smart Home Roadmap

Quick Tip #2 (Mosquitto ACL)

If you’re setting up a new instance of Mosquitto and the instructions (or someone) tell you to add an ACL file, do yourself a favor and ignore them…at least initially. I just came across this, and an ACL file can cost you days of troubleshooting.

Once you have your services communicating over MQTT, then go ahead and add an ACL file to your Mosquitto instance, but add a single user at a time and restart the services to make sure everything still functions as expected.
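
For reference, a minimal ACL file entry looks like this (the username and topic hierarchy are just examples; substitute whatever your own services use):

# Example acl_file entry: one user, one topic hierarchy (names are examples only)
user homeassistant
topic readwrite home/#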

Quick Tip #1: Please press a key (CLI)

I saw a question on some site asking about running a PHP CLI script until the user presses a key and then quitting (finishing up cleanly, not just exiting).

This should be very simple using STDIN as a file stream.

<?php
// Open STDIN as a readable stream
$cli = fopen('php://stdin', 'r');

function prompt($msg, $cli)
{
    echo $msg . "\n";
    @ob_flush();
    // YOUR CODE HERE
    return fgets($cli); // <--- must be last: blocks until the user presses Enter
}

$key = prompt('Please press a key', $cli);
@fclose($cli);


This essentially runs the ‘prompt’ function once and executes the code inside it until it hits the last line. The ‘fgets($cli)’ call pauses script execution because it starts a blocking read on STDIN, so the script waits there until a line of input arrives and then finishes up.

Binary Search Tree (BST)

I know this has been asked a million times, and there are a million instances of this online, but this is my blog…so this is my version.

This was a question asked in an Amazon coding interview:
Given a BST created from an array of numbers, calculate the distance between two nodes of the tree.

So if the input array was [5, 6, 3, 4, 2, 1], the tree would look like this:
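
In text form, with each value inserted in array order:

        5
       / \
      3   6
     / \
    2   4
   /
  1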

So to calculate the distance between 1 and 4, we would do the following:

  • Find the lowest common ancestor (LCA) for 1 and 4
  • Count the number of hops from the LCA to 1
  • Count the number of hops from the LCA to 4
  • Combine them

…and that’s the distance: so in the case of 1 and 4, the distance is 3.

I’m not someone who writes things like this on a regular basis, preferring to utilize tried and tested libraries, but it was a good coding exercise for sure, and lots of fun.

So here is my implementation in ES6-style JavaScript:

class BinarySearchTree {

    constructor(list) {
        this.tree = {};
        this.branches = [];

        // Create a branch (node) object for every value in the input list
        for (let n in list) {
            this.branches.push(this.branch(list[n]));
        }

        // Insert each branch into the tree; the first one becomes the root
        for (let [key, branch] of Object.entries(this.branches)) {
            if (Object.entries(this.tree).length == 0) {
                this.tree.root = {};
                this.tree.root = Object.assign(branch, this.tree.root);
            } else {
                this.build(branch, this.tree.root);
            }
        }

        return this;
    }

    branch(val) {
        let branch = {
            val: val,
            lft: {},
            rgt: {}
        };

        return branch;
    }

    // Standard BST insert: values smaller than the current node go left,
    // everything else goes right, recursing until an empty slot is found.
    build(branch, node) {
        if (node.val > branch.val) {
            if (Object.entries(node.lft).length == 0) {
                node.lft = Object.assign(branch, node.lft);
            } else {
                this.build(branch, node.lft);
            }
        } else {
            if (Object.entries(node.rgt).length == 0) {
                node.rgt = Object.assign(branch, node.rgt);
            } else {
                this.build(branch, node.rgt);
            }
        }
    }

    // Distance = hops from the LCA down to 'from' plus hops from the LCA down to 'to'
    getDistance(from, to) {
        if (!this.find(this.tree.root, from) || !this.find(this.tree.root, to)) {
            return -1;
        }

        let lca = this.findLowestCommonAncestor(from, to);
        if (lca === false) {
            return -1;
        }

        let node = this.find(this.tree.root, lca);

        // Walk from the LCA down to the target value using the usual BST
        // comparisons (left if the target is smaller, right otherwise),
        // counting the hops along the way.
        let countHops = function(_node, _target) {
            let _count = 0;
            while (_node.val != _target) {
                _count++;
                _node = (_node.val > _target) ? _node.lft : _node.rgt;
            }

            return _count;
        };

        return countHops(node, from) + countHops(node, to);
    }

    // Standard BST lookup: returns the node holding val, or false if it isn't in the tree
    find (node, val) {
        if (node.val == val) {
            return node;
        } else if (node.val > val) {
            return this.find(node.lft, val);
        } else if (node.val < val) {
            return this.find(node.rgt, val);
        }

        return false;
    }

    findLowestCommonAncestor(node1Val, node2Val) {
        let common = false;
        let node1Ancestors = this.findAllAncestors(node1Val);
        let node2Ancestors = this.findAllAncestors(node2Val);

        if (node1Ancestors.length == 0 || node2Ancestors.length == 0) {
            return false;
        }

        // Ancestor lists run from the root down to the node itself, so the
        // lowest common ancestor is the deepest (last) value shared by both.
        for (let i = node1Ancestors.length - 1; i >= 0; i--) {
            if (node2Ancestors.indexOf(node1Ancestors[i]) != -1) {
                common = node1Ancestors[i];
                break;
            }
        }

        return common;
    }

    // Walk from the root towards nodeVal, recording every value visited (root first)
    findAllAncestors(nodeVal) {
        let ancestors = [];
        let iter = this.tree.root;
        ancestors.push(iter.val);

        while (iter.val != nodeVal) {
            if (iter.val > nodeVal) {
                iter = iter.lft;
            } else  if (iter.val <= nodeVal) {
                iter = iter.rgt;
            } else {
                break;
            }

            if (iter != undefined) {
                ancestors.push(iter.val);
            }
        }
        
        return ancestors;
    }

}
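
And a quick usage check against the example above, where the distance between 1 and 4 should come out as 3:

// Build the tree from the example array and measure the distance between 1 and 4
const bst = new BinarySearchTree([5, 6, 3, 4, 2, 1]);
console.log(bst.getDistance(1, 4)); // 3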