PSC Tech Talk – Microsoft Bot Framework

Starting in early 2016, cloud vendors began to promote bots as a cool new feature and a new way for users to interact with their applications within the enterprise. As bots gained general acceptance as a user interface, the push to gain traction in the space began in earnest. Seeing this shift in emphasis in the vendor landscape prompted PSC Labs to create an investigation team for a short-term project.

In this presentation Adam Lepley (@AdamLepley) presented the first of a number of talks (here, here and here) he has given on the MS Bot Framework, how it works, why it was created and how easy it is to use.

What is the Microsoft Bot Framework? 

The Bot Framework is, as the name implies, a framework for building “bots”. For a developer this means that Microsoft has created C# and JavaScript libraries containing methods and functions that simplify the creation of an interactive chat bot. Once the bot is created, the framework also provides the ability to publish it to many chat “channels” such as Facebook, Slack, Teams, Skype, SMS and others.

Chat bots are not a new concept. Various web sites and chat clients have been leveraging various forms of bots for many years, but mostly with the emphasis on consumer-facing applications. Targeting chat applications also comes with the benefit of building on a platform the user is already familiar with. It removes the friction of learning a new application and lessens the burden on developers of creating complex custom UI.

The Plan

Within the Labs we always like to learn about a new technology and then make a plan to better understand it and demonstrate capability in it. The plan was initially to download the examples, install them, learn, and then expand on what we learned to create our own examples with broader applicability to PSC clients.

We looked into:

  • How to create a bot
  • How to deploy a bot to different channels
  • How to add Artificial intelligence (LUIS)
  • How can we build something applicable to our clients / What else can we play with?

What did we find?

Adam discussed and demonstrated how easy it is to create a bot using the framework. He was able to build a hello-world bot in about 10 minutes and publish it to the point where we could actually interact with it during the meeting itself.
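Below is a minimal sketch of what such a hello-world bot looks like using the Bot Builder SDK v3 for C#: a single dialog that echoes back whatever the user types. The class and method names follow the standard Visual Studio bot template rather than the exact demo shown in the talk.

```csharp
// Minimal echo dialog for the Bot Builder SDK v3 (Microsoft.Bot.Builder NuGet package).
// The MessagesController generated by the bot project template dispatches incoming
// activities to this dialog.
using System;
using System.Threading.Tasks;
using Microsoft.Bot.Builder.Dialogs;
using Microsoft.Bot.Connector;

[Serializable]
public class EchoDialog : IDialog<object>
{
    public Task StartAsync(IDialogContext context)
    {
        // Wait for the first message from the user.
        context.Wait(MessageReceivedAsync);
        return Task.CompletedTask;
    }

    private async Task MessageReceivedAsync(IDialogContext context, IAwaitable<IMessageActivity> argument)
    {
        var message = await argument;

        // Echo the user's text back on whatever channel the message arrived from.
        await context.PostAsync($"You said: {message.Text}");

        // Keep the conversation open for the next message.
        context.Wait(MessageReceivedAsync);
    }
}
```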

The investigation team created three bots aimed at demonstrating increased productivity gains and enhanced user experiences:

  • Common data capture – The ability to quickly and easily view and create timesheets from a bot.
  • Predictive Analytics – Using Machine Learning techniques to return projected sales results back to users based on Product information hosted in an external database.
  • Cognitive Services – Using cognitive services and natural language processing to demonstrate free text entry in a bot to create task logging on an external site.

The common assumption is that text is the primary form of interaction when using chat clients. This is mostly true when two humans communicate over chat, but for bots Microsoft’s abstraction provides a variety of richer options.

The bot framework supports text (plain and rich), images (up to 20 MB), video (up to 1 minute), buttons and the following rich content cards…
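As an illustration of the card support, here is a hedged sketch of replying with a HeroCard using the v3 C# SDK types. It assumes it runs inside a dialog method with an IDialogContext available (as in the echo dialog above), and the titles, image URL and button targets are placeholders rather than content from the talk.

```csharp
// Reply with a HeroCard: title, image and buttons rendered natively by each channel.
// Requires Microsoft.Bot.Connector and System.Collections.Generic.
var reply = context.MakeMessage();

var card = new HeroCard
{
    Title = "PSC Labs demo",
    Subtitle = "Rich content cards",
    Text = "Cards can combine text, images and buttons in a single message.",
    Images = new List<CardImage> { new CardImage("https://example.com/logo.png") },        // placeholder image
    Buttons = new List<CardAction>
    {
        new CardAction(ActionTypes.ImBack, "Tell me more", value: "more"),                 // posts text back to the bot
        new CardAction(ActionTypes.OpenUrl, "Open the site", value: "https://example.com") // opens an external link
    }
};

reply.Attachments = new List<Attachment> { card.ToAttachment() };
await context.PostAsync(reply);
```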

In addition to the rich content cards, Microsoft has released a separate service which enables you to build more complex card layouts which can be rendered from data coming from the bot framework. This also exposes more native, platform-specific custom rendering of cards.
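The separate service referred to here appears to be Adaptive Cards, which are described in JSON and rendered natively by each channel. A hedged sketch of attaching one from a v3 C# dialog follows; the card content is hand-written for illustration only.

```csharp
// Build a minimal Adaptive Card as JSON and attach it to the outgoing message.
// Requires Microsoft.Bot.Connector, System.Collections.Generic and Newtonsoft.Json.
var cardJson = @"{
  ""$schema"": ""http://adaptivecards.io/schemas/adaptive-card.json"",
  ""type"": ""AdaptiveCard"",
  ""version"": ""1.0"",
  ""body"": [
    { ""type"": ""TextBlock"", ""text"": ""Timesheet summary"", ""weight"": ""bolder"" },
    { ""type"": ""TextBlock"", ""text"": ""40.0 hours submitted for last week"" }
  ]
}";

var reply = context.MakeMessage();
reply.Attachments = new List<Attachment>
{
    new Attachment
    {
        ContentType = "application/vnd.microsoft.card.adaptive",
        Content = Newtonsoft.Json.JsonConvert.DeserializeObject(cardJson)
    }
};
await context.PostAsync(reply);
```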

Timesheet Bot

We set out to build a bot which would help fill out weekly timesheets for our consultants. Our bot has two main features: displaying and creating a weekly timesheet. For displaying the previous week’s timesheet, we used a carousel layout which can display a collection of cards representing the days of the week. Each card also has a set of buttons which can either link to additional actions within the bot, or to external links on an existing website.
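A sketch of the carousel layout described above, again using the v3 C# SDK: one HeroCard per weekday, sent as a single message. The hours shown and the button targets are placeholders; in the real bot they would come from the timesheet system.

```csharp
// Send one HeroCard per weekday in a single carousel-style message.
// Requires Microsoft.Bot.Connector and System.Collections.Generic.
var reply = context.MakeMessage();
reply.AttachmentLayout = AttachmentLayoutTypes.Carousel;
reply.Attachments = new List<Attachment>();

var days = new[] { "Monday", "Tuesday", "Wednesday", "Thursday", "Friday" };
foreach (var day in days)
{
    var card = new HeroCard
    {
        Title = day,
        Text = "8.0 hours on Project X",   // placeholder; would come from the timesheet data
        Buttons = new List<CardAction>
        {
            new CardAction(ActionTypes.ImBack, "Edit hours", value: $"edit {day}"),
            new CardAction(ActionTypes.OpenUrl, "Open timesheet", value: "https://example.com/timesheets")
        }
    };
    reply.Attachments.Add(card.ToAttachment());
}

await context.PostAsync(reply);
```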

Product information Bot

We created a bot demonstrating the ability to search a product database, which in turn triggered an external API call to an associated Azure Machine Learning service. Users can interact with the bot via a series of questions and answers, e.g. “What product are you searching for? Please select one of the following”. The results are then fed back to the bot in the form of a chart graphic. This bot demonstrates a powerful way to access a variety of on-demand reports right within a chat client.
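The pattern behind this bot is simple enough to sketch: call an external service and send the result back as an image attachment. The URL below is a placeholder standing in for the Azure Machine Learning scoring endpoint, and the service is assumed to return the URL of a rendered chart.

```csharp
// Call an external forecasting API (placeholder URL) and return the chart to the user.
// Requires System.Net.Http, Microsoft.Bot.Connector and System.Collections.Generic.
using (var http = new HttpClient())
{
    // Assumption: the service returns the URL of a chart image for the chosen product.
    var chartUrl = await http.GetStringAsync("https://example.com/api/forecast?product=123");

    var reply = context.MakeMessage();
    reply.Text = "Projected sales for the selected product:";
    reply.Attachments = new List<Attachment>
    {
        new Attachment { ContentType = "image/png", ContentUrl = chartUrl }
    };
    await context.PostAsync(reply);
}
```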

Productivity Bot using Natural Language interpretation

We used the Azure LUIS service (Language Understanding Intelligent Service), which is part of Microsoft’s Cognitive Services and uses machine learning to help derive intent from text. Users can make an unstructured text request such as “create a task”, “create new task” or “I want a new task”, from which the LUIS service derives the intent to be “Create a Task”. Using a secure integration with an external task tracking service (Trello), the bot is then able to ask the user the necessary questions to create a task based on the user’s inputs.
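A hedged sketch of how this intent mapping looks with a LuisDialog in the v3 C# SDK: the LUIS app ID, subscription key and intent name below are placeholders, and the real bot would go on to prompt for the task details and call the Trello API.

```csharp
// Route free-text utterances through LUIS and handle the derived intent.
using System;
using System.Threading.Tasks;
using Microsoft.Bot.Builder.Dialogs;
using Microsoft.Bot.Builder.Luis;
using Microsoft.Bot.Builder.Luis.Models;

[LuisModel("<luis-app-id>", "<luis-subscription-key>")]   // placeholders
[Serializable]
public class TaskDialog : LuisDialog<object>
{
    [LuisIntent("")]
    [LuisIntent("None")]
    public async Task None(IDialogContext context, LuisResult result)
    {
        await context.PostAsync("Sorry, I didn't understand that.");
        context.Wait(MessageReceived);
    }

    [LuisIntent("CreateTask")]
    public async Task CreateTask(IDialogContext context, LuisResult result)
    {
        // "create a task", "create new task" and "I want a new task" all land here.
        await context.PostAsync("Sure, what should the new task be called?");
        context.Wait(MessageReceived);
    }
}
```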

Conclusion

Bots are being used today by startups and some commercial enterprises trying to break into the corporate enterprise space. Our time spent with the Microsoft Bot Framework has convinced us that bots are ready for the enterprise and that there are use cases where they can be effectively implemented today.

 


PSC Tech Talk: How does blockchain work and what is cryptomining?

This week one of the Labs team members, Toby Samples (@tsamples), gave a presentation on how blockchain works and what cryptomining is. We are looking at blockchain in the Labs right now, and with the considerable press around cryptomining, and how a website can even be hacked to do it, we figured it would be good to educate everyone internally and also to come up with some policy around preventing this as part of our delivery excellence to clients.

What is blockchain?

Simply put, it is a distributed digital record which makes it possible to prove that every transaction within the “chain” is correct and has not been tampered with. Most people know blockchain through its association with Bitcoin.

Blockchain works by “hashing” the contents of a transaction and adding them to the “chain”. Once the chain is started, the next link in the chain is created using the hash from the previous link. If the contents of any link are changed, the hashes will no longer match and the chain is broken.
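The chaining idea is easy to see in a few lines of code. Below is a toy illustration (not how Bitcoin is actually implemented): each block stores the hash of the previous block, so changing any earlier block breaks every hash that follows.

```csharp
// Toy hash chain: each block's hash covers its data plus the previous block's hash.
using System;
using System.Security.Cryptography;
using System.Text;

class Block
{
    public string Data;
    public string PreviousHash;
    public string Hash;

    public Block(string data, string previousHash)
    {
        Data = data;
        PreviousHash = previousHash;
        Hash = ComputeHash();
    }

    public string ComputeHash()
    {
        using (var sha256 = SHA256.Create())
        {
            var bytes = sha256.ComputeHash(Encoding.UTF8.GetBytes(PreviousHash + Data));
            return BitConverter.ToString(bytes).Replace("-", "");
        }
    }
}

class ChainDemo
{
    static void Main()
    {
        var first  = new Block("Alice pays Bob 5", "0");
        var second = new Block("Bob pays Carol 2", first.Hash);

        // Tamper with the first block: its stored hash no longer matches its contents,
        // and the second block no longer points at a valid previous hash.
        first.Data = "Alice pays Bob 500";
        Console.WriteLine(first.Hash == first.ComputeHash());          // False
        Console.WriteLine(second.PreviousHash == first.ComputeHash()); // False
    }
}
```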

The implication for Bitcoin transactions on a massive scale is that every transaction is recorded in the chain, which makes the chain large, which in turn makes validating the chain expensive and processor intensive. (One Bitcoin transaction costs as much energy as a house uses in a week.)

In a financial ledger it is critical to the confidence of the company/investor/buyer that bank records are accurate and that no one is faking the numbers for their own personal gain. But there are many other potential usages with less “volume” that are just as valuable.

Bitcoin and other distributed cryptocurrencies allow transactions to happen all over the globe and, more importantly, allow transaction validation to be a distributed process. Transactions are not validated instantaneously.

When a digital transaction is carried out, it is grouped together in a cryptographically protected block with other transactions that have occurred in the last 10 minutes and sent out to the entire network.

Miners (members of the network with high levels of computing power) then compete to validate the transactions by solving complex coded problems. The first miner to solve the problem and validate the block receives a reward. (In the Bitcoin blockchain network, for example, a miner would receive Bitcoins.) This is a really nice article explaining how proof of work works.

Explaining How Proof of Stake, Proof of Work, Hashing and Blockchain Work Together
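In the same toy spirit, proof of work amounts to searching for a nonce that gives the block’s hash a required prefix; real networks tune this difficulty so that blocks arrive roughly every ten minutes. A minimal sketch:

```csharp
// Toy proof of work: increment a nonce until the hash starts with enough zeros.
using System;
using System.Security.Cryptography;
using System.Text;

class ProofOfWorkDemo
{
    static void Main()
    {
        const string blockContents = "previousHash|Alice pays Bob 5|Bob pays Carol 2";
        const string difficulty = "0000";   // required hash prefix; more zeros = harder

        long nonce = 0;
        string hash;
        using (var sha256 = SHA256.Create())
        {
            do
            {
                nonce++;
                var bytes = sha256.ComputeHash(Encoding.UTF8.GetBytes(blockContents + nonce));
                hash = BitConverter.ToString(bytes).Replace("-", "");
            } while (!hash.StartsWith(difficulty));
        }

        Console.WriteLine($"Found nonce {nonce} giving hash {hash}");
    }
}
```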

So what is cryptomining?

Cryptomining is using a computer to do the coin-mining processing. This is generally cost-prohibitive for an individual: unless you have a powerful gaming PC and are making a long-term investment, it is not really a financially viable thing to do. The process is relatively simple: you create an online account to process financial transactions (this is how you get paid), sign up to a service which will give you transactions to process, and install a program to churn through the validations. Once you sign up to a service, the validations are transmitted to your computer for processing.

It becomes illegal (“cryptojacking”) when you commandeer someone else’s machine to do the mining for you: why not have someone else pay for the mining while you reap the profits from the validation?

This becomes especially nefarious when services like Coinhive allow you to make your website visitors do the mining for you. Some people are starting to use this as income from their websites rather than advertising. Coinhive offers a service whereby you add a Coinhive JavaScript file to your website, and anyone who visits that site has a JavaScript payload of coin mining assigned to their computer, which churns away while they are on the page.

This became international news in February 2018 when a remote third-party JavaScript library used by UK and Australian government sites was hacked, and those .gov sites started to behave like Coinhive processing sites. See this great blog post for more details (The JavaScript Supply Chain Paradox: SRI, CSP and Trust in Third Party Libraries).

There are ways and means to prevent your site from becoming a victim of this JavaScript attack, as the article describes. The tale is a cautionary one, and it is important that awareness of this kind of behavior is out there.

Conclusion

Blockchain is not just for financial transactions; there are many other real-world applications for it. Understanding how cryptocurrency works in principle, and the necessity for coin mining it breeds, better prepares us to prevent its illegal usage.

 

PSC Tech Talk: Azure API Management

In this presentation Alex Zakhodin (@AZakhodin) talked about his experience implementing Azure API Management for a large client.

The situation

The client is a globally focused company currently providing certification services to its customers. They wanted to offer a new service so that their customers could access their certification data in real time through a consumable, monetized service.

Client challenges

The client’s main application and multiple data sources are on premises and would not be moved to the cloud, so a hybrid application needed to be created and managed.

The client wanted to be able to securely manage the traffic accessing the APIs. They needed not only to track the number of users calling the API but also to control the amount of access over time.

The payment model proposed for this service also needed a way to track everything at a granular level: the number of hits and the volume of data provided.

PSC solution

PSC implemented a solution using Azure API Management which enabled the client to abstract the data, govern the process, monitor the usage, and retain the flexibility to on-board new services at any time.

The Azure API Management platform creates an API proxy model to facilitate the monitoring of API traffic through a centralized set of endpoints. This allows developers to expose their internally hosted services without the risk of exposing a direct connection. It allows administrators to configure access to the data (down to individual users), limit the amount of data accessible over a period of time, and then create accurate reports on the volume of usage for billing.

The platform provides the ability to track traffic geographically and to determine volumes and accessibility. For a global application, the endpoints and data can be made available via geo-replication.

For developers, the API Management portal provides the ability not only to track usage but also to see how the APIs are performing.

To take advantage of the consumption-based pricing models available in Azure, Azure Functions were used wherever possible. In this way the client is only billed for usage, and the direct cost per transaction means that the cost billed to the end client per transaction is easily manageable and competitive.
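As an illustration of that building block, here is a hedged sketch of an HTTP-triggered Azure Function (C#, Functions v1 style) of the kind API Management could front as one of the proxied endpoints. The function name, route and response payload are assumptions for illustration; the real solution would query the on-premises data sources.

```csharp
// HTTP-triggered Azure Function that API Management can expose through its proxy.
using System.Net;
using System.Net.Http;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;

public static class CertificationLookup
{
    [FunctionName("CertificationLookup")]
    public static HttpResponseMessage Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", Route = "certifications/{id}")] HttpRequestMessage req,
        string id,
        TraceWriter log)
    {
        log.Info($"Certification lookup requested for {id}");

        // Placeholder response; the production function would query the on-premises
        // certification data through the hybrid connection.
        return req.CreateResponse(HttpStatusCode.OK, new { id, status = "Certified" });
    }
}
```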

Conclusion

The Azure API Management platform is a mature, enterprise-ready capability which allows for the creation of a hybrid cloud/on-premises architecture and enables companies to monitor, track and monetize their services in a secure and consistent manner.