PSC Labs 2018 review

PSC Labs was founded in 2015 to provide unbiased, vendor-agnostic technology insights. Our mission is to ensure client
delivery excellence and new solution offerings through the adoption of emerging technologies.

For more information check us out at https://labs.psclistens.com

2018 review

PSC Labs undertook a wide variety of projects in 2018. From Robotic Process Automation to Event-Driven Architecture, seven projects were undertaken to improve our understanding of these technologies and capabilities.

Blockchain

The team looked into how blockchain works and then, on a more practical level, investigated Ethereum specifically and the ability to incorporate Smart Contracts into the chain. We looked at the services provided by various cloud vendors and found that, at the time, the implementation examples were all very large scale.

Blockchain is not difficult to understand technically, but the broad questions about scalability, long-term viability, and adoption are still quite open-ended.

Custom Vision API

The team looked at the newly released Custom Vision image-processing capabilities in Azure Cognitive Services and built a custom app capable of recognizing everyday images. The application, built as a Xamarin iOS mobile app, provides a user with the ability to take a number of pictures of an object from different angles and store them within the application.

The Azure Cognitive Services are used to generate a Machine Learning model which can then be downloaded back to the device. The application is then capable of using the camera to identify objects with a predicted level of accuracy.
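
As a flavor of the service side, here is a hedged sketch of scoring an image against a trained Custom Vision model over REST. The endpoint path varies by API version and region, so the URL, project ID, and key are placeholders; our app actually ran the downloaded model on-device.

```javascript
// Hedged sketch: scoring an image against a Custom Vision model over REST.
// The URL, project ID, and key below are placeholders. (The Labs app instead
// downloaded the model and ran predictions on-device in the Xamarin iOS app.)
const fetch = require("node-fetch");
const fs = require("fs");

async function classifyImage(imagePath) {
  const resp = await fetch(
    "https://<region>.api.cognitive.microsoft.com/customvision/<version>/Prediction/<projectId>/image",
    {
      method: "POST",
      headers: {
        "Prediction-Key": process.env.PREDICTION_KEY,
        "Content-Type": "application/octet-stream",
      },
      body: fs.readFileSync(imagePath),
    }
  );
  const { predictions } = await resp.json();
  // Each prediction pairs a tag with a probability -- the "predicted level
  // of accuracy" mentioned above.
  return predictions;
}
```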

[Screenshots: Grouping Models, Training the Model, Running Locally]

The investigation team successfully demonstrated the ability to build a real-world application around Azure Cognitive Services.

Azure/AWS IoT

The team investigated the IoT services available in Azure and AWS. Building on the previous work the Labs had done with GE’s Predix platform, the investigation teams focused on the cloud vendors’ abstraction services rather than on low-level device/data interaction.

We discovered that both platforms made it very easy to set up data ingestion from devices, and in both cases configuring a device for secure, authenticated transmission of data was simple to understand.
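
For a sense of how simple the device side is, here is a minimal telemetry sketch using Azure’s Node.js device SDK; the device ID, payload fields, and connection string are placeholders.

```javascript
// Minimal device-side telemetry sketch using Azure's Node.js device SDK
// (npm install azure-iot-device azure-iot-device-mqtt). The connection
// string comes from the device's registration in IoT Hub; payload fields
// are illustrative.
const { Client, Message } = require("azure-iot-device");
const { Mqtt } = require("azure-iot-device-mqtt");

const client = Client.fromConnectionString(
  process.env.DEVICE_CONNECTION_STRING, // "HostName=...;DeviceId=...;SharedAccessKey=..."
  Mqtt
);

// Send a reading every 5 seconds; the hub handles the secure, authenticated
// ingestion described above.
setInterval(() => {
  const reading = { deviceId: "sensor-01", temperature: 20 + Math.random() * 5 };
  client.sendEvent(new Message(JSON.stringify(reading)), (err) => {
    if (err) console.error("send failed:", err.message);
  });
}, 5000);
```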

The Azure platform stood out, however, with its Azure IoT Suite and Remote Monitoring. Once data ingestion was set up, the IoT Suite enabled us to create a monitoring dashboard and set controls for performance monitoring. The ability to configure limits for data and automate notifications based on those limits promises considerable potential.

The Azure IoT Suite highlighted how far IoT as a service has come in a short period of time; it is a viable solution for any company seeking to start taking advantage of the burgeoning IoT landscape.

Fly.io

The investigation team looked at the intriguing concept of a programmable CDN and the promise of enhancing website performance without having to change any of the code on the site directly. An example of this capability is adding a watermark to an image: the Fly.io server proxies requests between the image server and the user and programmatically adds the watermark. The watermarked version of the image is then cached for the next user at the CDN edge, closer to the user than the original image on the origin server.

The Fly.io Edge Application runtime is an open-core JavaScript environment built for proxy servers. It gives developers powerful caching, content-modification, and routing tools.

The runtime is based on V8, with a proxy-appropriate set of JavaScript libraries. There are built-in APIs for manipulating HTML and image content, low-level caching, and HTTP requests/responses. Where possible it uses WhatWG standards (fetch, Request, Response, Cache, ReadableStream).
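
As a flavor of what this looks like in practice, here is a minimal sketch of a caching-proxy handler. It assumes the fly.http.respondWith entry point and the WhatWG fetch/Cache APIs named above; treat the exact API surface as illustrative, and the origin URL is a placeholder.

```javascript
// Minimal caching-proxy sketch for a Fly.io Edge Application.
fly.http.respondWith(async (req) => {
  const url = new URL(req.url);
  const originUrl = "https://images.example.com" + url.pathname;

  // Serve from the edge cache when we already hold a copy.
  const cache = await caches.open("images");
  const cached = await cache.match(originUrl);
  if (cached) return cached;

  // Otherwise fetch from the origin. This is the point where a response
  // could be transformed (e.g. a watermark added to an image) before caching.
  const resp = await fetch(originUrl);
  await cache.put(originUrl, resp.clone());
  return resp;
});
```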

The team found that implementing Fly.io as a developer was not complex, and the examples provided were easy to set up and run. Overall, though, the team found that this capability feels more like a solution waiting for a problem.

Event Driven Architecture

Kafka

At the start of 2018, Gartner included “Event-Driven Architectures” among its ten technologies to watch. With this in mind, the Labs team looked specifically into Kafka, although there are others (Azure Event Hubs being one). Kafka was originally a project created by LinkedIn to handle their massive data volumes and was subsequently open-sourced through the Apache Foundation.

The team created a demo application which ingested data from an HR application managing people and their records. Each data input initiated multiple complex processes, executed by the event-driven architecture. The responsiveness of the application, even running locally, was very impressive.
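
The pattern looks roughly like the following sketch, using the kafkajs Node client; the topic, group, and payload names are illustrative rather than taken from the demo.

```javascript
// Minimal event-driven sketch using the kafkajs client (npm install kafkajs).
const { Kafka } = require("kafkajs");

const kafka = new Kafka({ clientId: "hr-app", brokers: ["localhost:9092"] });

// Producer: the HR application publishes an event when a record changes.
async function publishHrEvent(employee) {
  const producer = kafka.producer();
  await producer.connect();
  await producer.send({
    topic: "hr-records",
    messages: [{ key: employee.id, value: JSON.stringify(employee) }],
  });
  await producer.disconnect();
}

// Consumer: each downstream process subscribes and reacts independently,
// which is what makes the architecture event-driven.
async function startPayrollConsumer() {
  const consumer = kafka.consumer({ groupId: "payroll" });
  await consumer.connect();
  await consumer.subscribe({ topic: "hr-records", fromBeginning: false });
  await consumer.run({
    eachMessage: async ({ message }) => {
      const employee = JSON.parse(message.value.toString());
      console.log("payroll processing", employee.id);
    },
  });
}
```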

Robotic Process Automation

UIPath

While Robotic Process Automation (RPA) is not a new technology, it is rapidly coming to the forefront of business. The growing VC funding for major RPA vendors (Automation Anywhere, UiPath) demonstrates the market’s capacity to absorb this technology quickly.

RPA as an industry is all about the automation of repetitive mundane tasks, such as manual data entry into multiple systems. Many companies have long established manual business processes, mainly due to the cost to automate the process. RPA can help address this problem by accurately and repeatedly following the same steps a person would.

We looked at UiPath as an RPA vendor and dug into the more advanced capabilities of the platform. We set up a code check-in process within Azure DevOps to trigger a build chain and instruct the RPA robot to run an automated UI test through a browser. If the robot found a failure, it created a bug within Azure DevOps linked to the failing test.
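
The chain itself can be sketched as a pipeline step that calls UiPath Orchestrator’s StartJobs endpoint; the URL, bearer token, and ReleaseKey below are placeholders, and the exact wiring in our demo differed.

```javascript
// Sketch of a pipeline step asking UiPath Orchestrator to start a robot job
// via its OData StartJobs endpoint. URL, token, and ReleaseKey are placeholders.
const fetch = require("node-fetch");

async function startUiTestRobot() {
  const resp = await fetch(
    "https://orchestrator.example.com/odata/Jobs/UiPath.Server.Configuration.OData.StartJobs",
    {
      method: "POST",
      headers: {
        Authorization: "Bearer <token>",
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        startInfo: { ReleaseKey: "<process-release-key>", Strategy: "All" },
      }),
    }
  );
  if (!resp.ok) throw new Error(`StartJobs failed: ${resp.status}`);
  return resp.json();
}
```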

RPA is mature and already being used across many industries; there is a significant opportunity for cost-effective savings for companies that adopt it.

GraphQL

GraphQL is a technology created by Facebook in response to a problem they faced with a growth model based on a service-based architecture. As Facebook pages grew in complexity and functionality, the number of services being called increased and caused various performance issues. The PSC Labs team set out to investigate whether or not GraphQL would be applicable to the projects we were planning to work on in the future.

The investigation team took an existing mobile application where the load time was in excess of 10 seconds and, using GraphQL, was able to reduce the page load time by over 50%. Where the user was on a high-latency mobile network, loading speed improved by over 65%.

GraphQL offers many advantages for a developer and project team considering a services architecture; from the creation of a single standard endpoint to the reduction in network calls and faster page loads, it proved itself very valuable.
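
The core of the improvement is that one GraphQL query fetches exactly what the page needs in a single round trip. A minimal sketch, with a hypothetical schema and endpoint:

```javascript
// One GraphQL query replacing several REST calls; schema, field names, and
// endpoint are illustrative, not from the actual application.
const fetch = require("node-fetch");

const query = `
  query MobileHomePage($userId: ID!) {
    user(id: $userId) {
      name
      projects { title status }
      notifications(limit: 5) { message }
    }
  }`;

async function loadHomePage(userId) {
  const resp = await fetch("https://api.example.com/graphql", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query, variables: { userId } }),
  });
  const { data } = await resp.json();
  return data; // user, projects, and notifications in one network call
}
```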

 

Conclusion

PSC Labs had another successful year investigating many broad technology innovations. As in previous years, some of the projects show great promise and we will be working on new iterations of them in 2019.

If you want to find out more about PSC Labs, or have an interesting project you would like us to share with you, please contact info@psclistens.com for more information.

 

PSC Tech Talk – Microsoft Bot Framework

In early 2016, cloud vendors began to promote the concept of bots as a cool new feature and a new way for users to interact with their applications within the enterprise. Sensing the general acceptance of a bot as a user interface, the vendors’ push to gain traction in the space began in earnest. Seeing this shift in emphasis in the vendor landscape prompted PSC Labs to create an investigation team for a short-term project.

In this presentation Adam Lepley (@AdamLepley) presented the first of a number of talks (here, here and here) he has given on the MS Bot Framework, how it works, why it was created and how easy it is to use.

What is the Microsoft Bot Framework? 

The Bot Framework is, as the name implies, a framework for building “bots”. What this means for a developer is that Microsoft has created C# and JavaScript libraries containing methods and functions that simplify the creation of an interactive chat bot. Once the bot is created, the framework also provides the ability to publish it to many chat “channels” such as Facebook, Slack, Teams, Skype, SMS, and others.

Chat bots are not a new concept. Various web sites and chat clients have been leveraging various forms of bots for many years, but mostly with an emphasis on consumer-facing applications. Targeting chat applications also comes with the benefit of building on a platform the user is already familiar with. It removes the friction of learning a new application and lessens the burden on developers of creating complex custom UIs.

The Plan

Within the Labs we always like to learn about a new technology and then make a plan to better understand it and demonstrate capability in it. The plan was initially to download the examples, install them, learn, and then expand on what we learned to make our own examples with broader applicability to PSC clients.

We looked into:

  • How to create a bot
  • How to deploy a bot to different channels
  • How to add Artificial intelligence (LUIS)
  • How can we build something applicable to our clients / What else can we play with?

What did we find?

Adam discussed and demonstrated how easy it is to create a bot using the framework. He was able to build a hello world bot in about 10 minutes and publish it to a point where we could actually interact with it in the meeting itself.
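
A “hello world” bot really is only a few lines. A minimal sketch using the Node botbuilder v3 SDK that was current at the time; the app ID and password come from the bot’s channel registration.

```javascript
// Minimal echo bot, botbuilder v3 style (npm install botbuilder restify).
const restify = require("restify");
const builder = require("botbuilder");

const connector = new builder.ChatConnector({
  appId: process.env.MICROSOFT_APP_ID,
  appPassword: process.env.MICROSOFT_APP_PASSWORD,
});

// Echo back whatever the user types.
const bot = new builder.UniversalBot(connector, (session) => {
  session.send("You said: %s", session.message.text);
});

// The Bot Framework channels deliver messages to this endpoint.
const server = restify.createServer();
server.post("/api/messages", connector.listen());
server.listen(process.env.PORT || 3978, () => console.log("bot listening"));
```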

The investigation team created three bots aimed at demonstrating increased productivity gains and enhanced user experiences:

  • Common data capture – The ability to quickly and easily view and create timesheets from a bot.
  • Predictive Analytics – Using Machine Learning techniques to return projected sales results back to users based on Product information hosted in an external database.
  • Cognitive Services – Using cognitive services and natural language processing to demonstrate free text entry in a bot to create task logging on an external site.

The common assumption is that text is the primary form of interaction when using chat clients. This is mostly true when two humans communicate over chat, but for bots, Microsoft’s abstraction provides a variety of richer options.

The Bot Framework supports text (plain and rich), images (up to 20 MB), video (up to 1 minute), buttons, and a set of rich content cards.

In addition to the rich content cards, Microsoft has released a separate service (Adaptive Cards) which enables you to build more complex card layouts that can be rendered from data coming from the Bot Framework. This also exposes more native, platform-specific custom rendering of cards.
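
Assuming the service referred to is Adaptive Cards, a card attachment returned by a bot might look like the following sketch; the content is illustrative.

```javascript
// Sketch of an Adaptive Card attachment in the botbuilder v3 attachment shape.
const timesheetCard = {
  contentType: "application/vnd.microsoft.card.adaptive",
  content: {
    type: "AdaptiveCard",
    version: "1.0",
    body: [
      { type: "TextBlock", text: "Timesheet: week of Mar 5", weight: "bolder" },
      { type: "TextBlock", text: "40.0 hours logged" },
    ],
    actions: [
      { type: "Action.OpenUrl", title: "Open timesheet", url: "https://example.com/timesheets" },
    ],
  },
};
```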

Timesheet Bot

We set out to build a bot which would help our consultants fill out their weekly timesheets. The bot has two main features: displaying and creating a weekly timesheet. For displaying the previous week’s timesheet, we used a carousel, which can display a collection of cards representing the days of the week. Each card also has a set of buttons which can either link to additional actions within the bot or to external links on an existing website.
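
A sketch of that carousel pattern in botbuilder v3, with illustrative titles, hours, and URLs:

```javascript
// One HeroCard per day of the week, each with a bot action and an external link.
const builder = require("botbuilder");

function weeklyTimesheetMessage(session, days) {
  const cards = days.map((day) =>
    new builder.HeroCard(session)
      .title(day.name)
      .subtitle(`${day.hours} hours`)
      .buttons([
        builder.CardAction.imBack(session, `edit ${day.name}`, "Edit"),
        builder.CardAction.openUrl(session, "https://timesheets.example.com", "Open site"),
      ])
  );
  return new builder.Message(session)
    .attachmentLayout(builder.AttachmentLayout.carousel)
    .attachments(cards);
}
```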

Product information Bot

We created a bot demonstrating the ability to search a product database, which in turn triggered an external API call to an associated Azure Machine Learning service. Users interact with the bot via a series of questions and answers, e.g. “What product are you searching for? Please select one of the following”. The results are then fed back to the bot in the form of a chart graphic. This bot demonstrates a powerful way to access a variety of on-demand reports right within a chat client.

Productivity Bot using Natural Language interpretation

We used the Azure LUIS service (Language Understanding Intelligent Service), which is part of Microsoft’s Cognitive Services and uses machine learning to help derive intent from text. Users can make an unstructured text request such as “create a task”, “create new task”, or “I want a new task”, from which the LUIS service derives the intent “Create a Task”. Using a secure integration with an external task-tracking service (Trello), the bot is then able to ask the user the necessary questions to create a task based on their inputs.
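
Wiring this up in botbuilder v3 looks roughly like the sketch below, continuing the bot sketch earlier in this post; the LUIS model URL and intent name are placeholders, and the Trello call is reduced to a comment.

```javascript
// LUIS intent recognition in botbuilder v3 (extends the `bot` defined above).
const recognizer = new builder.LuisRecognizer(process.env.LUIS_MODEL_URL);
bot.recognizer(recognizer);

// "create a task", "create new task", "I want a new task" all resolve here.
bot
  .dialog("CreateTask", [
    (session) => builder.Prompts.text(session, "What should the task say?"),
    (session, results) => {
      // The demo called Trello's API at this point to create the card.
      session.endDialog(`Creating task: "${results.response}"`);
    },
  ])
  .triggerAction({ matches: "CreateTask" });
```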

Conclusion

Bots are being used today by startups and some commercial enterprises trying to break into the corporate enterprise space. Our time spent with the Microsoft Bot Framework has convinced us that bots are ready for the enterprise and that there are use cases for their effective implementation today.

 

PSC Tech Talk: How does blockchain work and what is cryptomining?

This week one of the Labs team members, Toby Samples (@tsamples), gave a presentation on how blockchain works and what cryptomining is. We are looking at blockchain in the Labs right now, and with the considerable press around cryptomining, including how a website can be hacked to do it, we figured it would be good to educate everyone internally and also come up with a policy around preventing it as part of our delivery excellence to clients.

What is blockchain?

Well, simply put, it is a distributed digital record which makes it possible to prove that every transaction within the “chain” is correct and has not been tampered with. Most people know blockchain through its association with Bitcoin.

Blockchain works by “hashing” the contents of a transaction and adding them to the “chain”. Once the chain is started, the next link is created using the hash from the previous link. If the contents of any link are changed, the hashes will no longer match and the chain is broken.
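
A toy sketch of that chaining idea using Node’s built-in crypto module; the transaction data is made up for illustration.

```javascript
// Each block's hash covers its contents plus the previous block's hash, so
// changing any block breaks every hash after it.
const crypto = require("crypto");

function makeBlock(data, prevHash) {
  const hash = crypto
    .createHash("sha256")
    .update(prevHash + JSON.stringify(data))
    .digest("hex");
  return { data, prevHash, hash };
}

const genesis = makeBlock({ from: "alice", to: "bob", amount: 5 }, "0");
const next = makeBlock({ from: "bob", to: "carol", amount: 2 }, genesis.hash);

// Tampering with genesis.data changes its recomputed hash, so next.prevHash
// no longer matches and the break is detectable.
console.log(next.prevHash === genesis.hash); // true until tampered with
```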

The implication for Bitcoin transactions at massive scale is that every transaction is recorded in the chain, which makes the chain large, which in turn makes validating the chain expensive and processor-intensive. (One Bitcoin transaction costs as much energy as a house uses in a week.)

In a financial ledger it is critical to the confidence of the company/investor/buyer that bank records are accurate and that no one is faking the numbers for their own personal gain. But there are many other potential usages with less “volume” but just as much value.

Bitcoin and other distributed cryptocurrencies allow transactions to happen all over the globe and, more importantly, allow transaction validation to be a distributed process. Transactions are not confirmed instantaneously, however.

When a digital transaction is carried out, it is grouped together in a cryptographically protected block with other transactions that have occurred in the last 10 minutes and sent out to the entire network.

Miners (members of the network with high levels of computing power) then compete to validate the transactions by solving complex coded problems. The first miner to solve the problem and validate the block receives a reward (in the Bitcoin blockchain network, for example, a miner would receive Bitcoins). The article below is a really nice explanation of how proof of work works.

Explaining How Proof of Stake, Proof of Work, Hashing and Blockchain Work Together
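
That competition can be toy-modeled as a race to find a nonce that gives a block hash with a required number of leading zeros; real networks use vastly higher difficulty targets.

```javascript
// Toy proof-of-work sketch using Node's crypto module.
const crypto = require("crypto");

function mine(blockData, difficulty) {
  const prefix = "0".repeat(difficulty);
  let nonce = 0;
  for (;;) {
    const hash = crypto
      .createHash("sha256")
      .update(blockData + nonce)
      .digest("hex");
    if (hash.startsWith(prefix)) return { nonce, hash };
    nonce += 1; // competing miners churn through candidate nonces like this
  }
}

console.log(mine("block of transactions", 4));
```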

So what is cryptomining?

Cryptomining is using a computer to do the coin-mining processing. This is generally cost-prohibitive to run as an individual; unless you have a powerful gaming PC and are making a long-term investment, it is not really financially viable. The process is relatively simple: you create an online account to process financial transactions (so you get paid), sign up to a service which will give you transactions to process, and install a program to churn through validations. Once you sign up, the validations are transmitted to your computer for processing.

It becomes illegal (cryptojacking) when you commandeer someone else’s machine to do the mining for you. Why not have someone else pay for the mining while you reap the profits from the validation?

This becomes especially nefarious when services like Coinhive allow you to make your website visitors do the mining for you; some people are starting to use this as website income in place of advertising. Coinhive offers a service whereby you add a Coinhive JS file to your website, and anyone who visits the site gets a JavaScript payload of coin mining assigned to their computer, which churns away while they are on the page.

In February 2018 it became international news when a remote third-party JS library used by UK and Australian government sites was hacked and those .gov sites started to behave like Coinhive processing sites. See this great blog for more details (The JavaScript Supply Chain Paradox: SRI, CSP and Trust in Third Party Libraries).

There are ways and means to prevent your site from becoming a victim of this JavaScript attack, as the article describes. The tale is cautionary, and it is important to spread awareness of this kind of behavior.
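
One of the defenses the article describes is Subresource Integrity (SRI), where the browser refuses to execute a third-party script whose hash no longer matches. A sketch, with a placeholder URL and hash:

```html
<!-- SRI sketch: if the fetched file's hash no longer matches the integrity
     attribute, the browser blocks it, so a tampered third-party library
     never runs. URL and hash are placeholders. -->
<script src="https://cdn.example.com/library.js"
        integrity="sha384-PLACEHOLDERxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
        crossorigin="anonymous"></script>
```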

Conclusion

Blockchain is not just for financial transactions; there are many other real-world applications for it. Understanding how cryptocurrency works in principle, and the coin mining it necessitates, gives us better preparedness to prevent its illegal usage.

 

PSC Tech Talk: Azure API Management

In this presentation Alex Zakhodin (@AZakhodin) talked about his experience implementing Azure API Management for a large client.

The situation

The client is a global company providing certification services. They wanted to offer a new service enabling their customers to access certification data in real time through a consumable, monetized API.

Client challenges

The client’s main application and multiple data sources are on premises and would not be moved to the cloud, so a hybrid application needed to be created and managed.

The client wanted to be able to securely manage traffic accessing the APIs. They needed not only to track the number of users calling the API but also to control the amount of access over time.

The payment model proposed for this service also needed a way to track everything at a granular level: the number of hits and the volume of data provided.

PSC solution

PSC implemented a solution using Azure API Management which enabled the client to abstract the data, govern the process, monitor usage, and retain the flexibility to on-board new services at any time.

The Azure API Management platform creates an API proxy model to facilitate the monitoring of API traffic through a centralized set of endpoints. This allows developers to expose their internally hosted services without the risk of exposing a direct connection. It allows administrators to configure access to the data (down to individual users), impose limits on the amount of data accessible over a period of time, and create accurate usage reports for billing.
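
As an illustration, access limits like these are expressed as inbound policies in API Management; the numbers below are illustrative, not the client’s configuration.

```xml
<!-- Sketch of API Management inbound policies enforcing the controls
     described above. -->
<inbound>
  <base />
  <!-- Throttle each subscription to 100 calls per minute -->
  <rate-limit calls="100" renewal-period="60" />
  <!-- Cap usage per week: 10,000 calls and ~1 GB of data (bandwidth in KB) -->
  <quota calls="10000" bandwidth="1048576" renewal-period="604800" />
</inbound>
```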

The platform provides the ability to track traffic geographically and determine volumes and accessibility. For a global application, the endpoints and data can be made available via geo-replication.

For developers the API management portal provides the ability to not only track usage but also see how the APIs are performing.

To take advantage of the consumption-based pricing models available in Azure, Azure Functions were used wherever possible. In this way the client is billed only for usage. The direct cost per transaction means that the cost billed to the end client per transaction is easily manageable and competitive.

Conclusion

The Azure API Management platform is a mature, enterprise-ready capability which allows for the creation of a hybrid cloud/on-premises architecture, letting companies monitor, track, and monetize their services in a secure and consistent manner.

 

PSC Group Tech Reviews

One of the coolest parts of my job is being an enabler of others. Since the inception of PSC Labs we have given developers, managers, and designers the opportunity to give what we call “Tech Reviews” and share some of the cool things they have worked on in recent months.

The tech review platform serves many purposes (beyond the free pizzas for those attending in person). The reviews enable:

  • Cross sharing of ideas and experience across the company
  • The opportunity for those unaccustomed to giving technical presentations to learn from others in a safe friendly environment
  • Sharing some of the cool stuff we are playing with in PSC Labs, along with thoughts and ideas on the future solutions we believe PSC will be able to provide our customers

In just over two years we have had nearly 30 tech reviews broadly crossing almost every aspect of the work we do at PSC Group. We have also had nearly 20 different speakers which is amazing. I was concerned when we started that it would always be the same few people giving the presentations but I was very happily proven wrong. The most any one consultant has done is three. Topics have varied from PGP Encryption, to Azure API Gateways, to formula management using PLM software, to the process of building a HIPAA compliant network on AWS for a Medical practice.

We use Skype for Business to share the presentation with those who are unable to attend in person. We record the presentation and post it internally to the Office365 Video portal so that anyone can use it for reference at a later date.

In the coming weeks I am going to start to blog about some of the Tech Reviews we have had in the past, as well as new ones as they happen. I really want to share the ideas and concepts and demonstrate the breadth of interesting work we get to do at PSC.


PSC Director of Technology Solutions

I am very humbled and excited to have been given the title of Director of Technology Solutions at PSC Group.

It has been a fascinating and varied 5 1/2 years since I joined PSC. I am very fortunate to have had the opportunity to work in almost every role at PSC and experience how client delivery excellence is achieved at all levels. I get to professionally hang out with some of the most talented people I have ever had the pleasure of working with. It’s so much fun!

I am really excited about what 2018 holds. PSC Labs is going strong and we continue to receive reassuringly positive feedback from clients who truly value our role as trusted technical advisors. In 2017 PSC successfully branched out into a number of new and challenging emerging technologies, and we have plans aplenty to continue to learn, evolve and grow.

I love what I do and I am very grateful to have a job which gives me the opportunity to be the best I can be, and still challenge me to be better.

It’s going to be a fun year, but then it always is… 🙂

Speaking at SharePoint Chicago Dec 2017

I will be speaking at the 2017 SharePointFest Chicago conference December 8th, at McCormick Place, Chicago.

I will be talking about how O365 adoption can be made easier by having an internal emerging technologies team try to solve some simple business problems, leading to a broader adoption of the platform within an organization.

This will be the third time I have spoken about “the Labs team” this year and I am really honored to be able to speak at SharePointFest again this year 🙂

————-

BV 205 – Enabling O365 adoption from within – How an emerging technologies team can make a big difference

Too often when an organization makes a major technology shift, it is not the technology change itself which causes problems; it is the people and how they adapt to change. While we often associate this with end users, the same is just as true for developers. If we want to retain our best development talent then we have to give them a part in the transition: allow them to understand it and own it.

With so many new technologies and capabilities being exposed within Office 365 and Azure, many businesses are frankly overwhelmed by the possibilities and often fall back on the bare minimum of mail, calendaring and SharePoint. This talk will demonstrate cost-effective measures to keep developers engaged while providing benefit to the company in a mutually beneficial manner.

An internal research and development team can create a sustainable balance of creative knowledge growth for the individual, matched with a method to future-proof the overall organization. How changes in technology affect the success of a company needs to be understood and the effects managed. With the unrestrained freedom to explore emerging technologies, without the constraints of today’s corporate development policies, your best talent can achieve great things, stay engaged, and more importantly, stay.

In this presentation Mark will discuss and demonstrate how a creative “labs” team can lead to short, medium and long term benefits for any business willing to invest in people and technology.