Installing SQL Server on Linux

September 29, 2017

SQL Server on Linux is here, and it's a brand new experience for us SQL guys.

I decided to give installing it a shot, and the installation process is certainly different from what we're used to. Since I'm more familiar with Ubuntu than Red Hat, I spun up a VM with Ubuntu Server 16.04 LTS. To make things easier, I also decided to install the Ubuntu desktop environment, using the command

sudo apt-get install ubuntu-desktop

Little did I realize that installing the desktop isn't needed for SQL on Linux. You set up SQL via the command line, and use Management Studio from a Windows machine to connect. So you can skip that step if you don't need a desktop on your Linux VM. Once Ubuntu is up and running (which is fairly quick), the next steps are:

1. Import the Public Repository keys

curl https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -

2. Register the SQL Server Ubuntu repository

sudo add-apt-repository "$(curl https://packages.microsoft.com/config/ubuntu/16.04/mssql-server.list)"

EDIT: The above worked while SQL on Linux was in preview. To install SQL Server 2017 GA, use:
sudo add-apt-repository "$(curl https://packages.microsoft.com/config/ubuntu/16.04/mssql-server-2017.list)"

3. Update your sources and install SQL Server

sudo apt-get update
sudo apt-get install -y mssql-server

That's right. You don't insert a disk (or ISO image) as you would do on Windows. SQL Server is installed like any other Linux package, and the entire thing is downloaded and installed.

After the install was successful, I immediately checked to see if the service was running, and got my first error:

Failed to start Microsoft(R) SQL Server(R) Database Engine.

The reason is that I hadn't run the configuration command! This is a very important step, and this is where you:

  1. Select your Edition (and enter a license key if needed)
  2. Select components
  3. Set the SA password

To start the setup, run the following command (you will need to elevate):

sudo /opt/mssql/bin/mssql-conf setup


Once this process is done, you can check that the service is running with:

sudo systemctl status mssql-server

If all is well, you should see output showing the service as active (running).

Now if you're still getting an error where the service isn't starting, one thing to check is whether you've allocated enough RAM to your VM. SQL Server 2017 needs at least 3.5GB of RAM, and the service won't start if it can't find enough.

Now you're ready to connect! So how do we do this without a GUI? You simply install the new Management Studio on a Windows machine and connect from there.
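If you'd like to verify the connection from the VM itself before involving a Windows machine, Microsoft also ships command-line tools for Linux. A quick sketch, assuming you've registered the Microsoft package repository as above; the SA password is a placeholder:

```shell
# Install the sqlcmd / bcp command-line tools
sudo apt-get install -y mssql-tools unixodbc-dev

# Connect locally and confirm the instance responds
/opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P '<YourSAPassword>' -Q "SELECT @@VERSION"
```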


Once that's working, how do we get data in? You can actually restore a backup from SQL on Windows to SQL on Linux. In terms of getting the backup onto the VM, there are various ways of doing this, but what I've tried recently (and liked) is the Windows Subsystem for Linux on Windows 10, where using bash commands like scp I could easily move files onto the VM.
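For example, from the bash prompt on Windows 10, the Windows drives are mounted under /mnt, so a backup sitting in C:\Backups can be pushed to the VM over SSH. The username, IP address and paths here are hypothetical:

```shell
# Create a backup directory on the VM first
ssh myuser@192.168.1.50 "sudo mkdir -p /var/opt/mssql/backup"

# Copy the backup file from the Windows filesystem to the Linux VM
scp /mnt/c/Backups/MyDatabase.bak myuser@192.168.1.50:/var/opt/mssql/backup/
```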

Once the .bak file is there, I've found that the restore process via Management Studio is quite straightforward as well.
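If you prefer the command line, the same restore can be scripted with sqlcmd. The database name, logical file names and paths below are illustrative; run RESTORE FILELISTONLY against your .bak first to get the real logical names:

```shell
# Restore the backup, relocating the data and log files to the Linux paths
/opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P '<YourSAPassword>' -Q "
RESTORE DATABASE MyDatabase
FROM DISK = '/var/opt/mssql/backup/MyDatabase.bak'
WITH MOVE 'MyDatabase'     TO '/var/opt/mssql/data/MyDatabase.mdf',
     MOVE 'MyDatabase_log' TO '/var/opt/mssql/data/MyDatabase_log.ldf'"
```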

With that, you're good to go. My first area of interest now is determining whether there is any performance advantage on the Linux platform. So far I've run smaller queries and it's pretty much even, but I'll be testing with larger, more complex queries next.

I’d love to hear your thoughts on SQL 2017 on Linux thus far.


Blockchain 3.0 – The Enterprise Blockchain

July 10, 2018

With all the hype around “Blockchain”, I was thinking the other day that people are starting to get Blockchain fatigue. The term itself has been heavily overexposed – and it’s probably not the best time to write ANOTHER blockchain article, right?

The problem is, though, that the overexposure of the Blockchain concept has caused confusion. Everyone is talking about the potential solutions that we’ll see using “blockchain technology”, but we don’t always see practical examples of how these will be delivered. In fact, there are currently serious technical limitations to overcome for feasible solutions to be delivered, which I will explore in this article.

To ultimately overcome these limitations, I believe that we are now seeing the next iteration, Blockchain 3.0, which is designed with the demands of large enterprises in mind. Blockchain 3.0 will be where the bulk of consortium solutions are implemented.

This in no way means that the "public" blockchains (like Ethereum), currently the basis of cryptocurrencies and public DApps, won't be used for fantastic projects in future. It just means that there are certain requirements that Blockchain 3.0 will have to address. So, another article it is…

Blockchain is not just about currencies

Recently, I've been doing some public speaking at various events. When the topic is blockchain, and the exciting business solutions potentially unlocked by it, I normally hear questions about Bitcoin. Many people still associate the two very closely, and with good reason, as Bitcoin gave us the original blockchain.

However, as much as I like Bitcoin, when I’m having an Enterprise Blockchain discussion with customers, I normally start by saying “Forget about Bitcoin for a minute”.

The reason I say this is that many people automatically bring all of the current issues with the public blockchains, the ones designed for cryptocurrencies, into the Enterprise Blockchain discussion. This sometimes seems to "kill" the Enterprise Blockchain discussion before it even starts, with many not getting to see the potential benefits of the Enterprise Blockchain. To explore this concept, let's forget about cryptocurrency and re-examine the use case of blockchain outside of crypto.

The opportunity

Let's go back to the opportunity that we see with the Enterprise Blockchain. You've probably read a million articles on this topic already – "10 Ways Blockchain will change Financial Services, Number 5 is particularly shocking!" – but really, clickbait headlines aside, the possibilities are very exciting.

Blockchain in the Enterprise context is about a data layer that can be shared across organizations in a safe and secure manner. You will see the use cases grow, across industries, day by day, but what you will also see are some patterns emerging.

Firstly, an enterprise blockchain solution regularly involves the transfer of an asset, be it a physical asset, a contract, or some other asset of high value.

Secondly, there is a cross-organizational workflow. The easy example here is a manufacturing process with many actors in the supply chain. A blockchain could be created to identify the state of a particular batch of goods as it is being manufactured, with all actors being given a view into the current state of that batch. Supply chain is indeed a field where blockchain will play a big role going forward.

Thirdly, there is usually an element of audit or reconciliation, often a pain point, that can be relieved by an enterprise blockchain solution.

The potential use cases have already exploded, and every day we see articles describing them. As I mentioned, though, nobody talks about how these will be deployed. You are not going to deploy thousands of these "semi-private", or consortium, solutions on Ethereum, or any similar platform, at least not today. Let's investigate why…

How we got here

As mentioned, Bitcoin gave us the original blockchain, call it Blockchain 1.0. It is a simple ledger that records transactions in sequence, and the entire chain is distributed, with verification happening across multiple nodes to confirm a transaction.

We then saw this evolve to something which could include logic to perform all manner of tasks. The first Blockchain 2.0 implementation was the Ethereum Network, and the added code was called Smart Contracts.

Both of these became massive global networks, with nodes popping up everywhere to perform verification, and miners earning rewards for their verification efforts (in the form of the cryptocurrency itself). Further alternative blockchains were developed, with improvements in security, transaction speed and other areas, but the core pattern was a massive distributed public network on which new types of solutions could be built. It really was the start of something entirely new in the tech world.

The new problem

The “public” blockchains were never meant specifically for the Enterprise world though. Although many of these have been proposed as platforms for Enterprise solutions and could well end up being used for such solutions, especially Business to Consumer, a few deficiencies have come to light.

The public blockchains were designed to work in a tremendously hostile environment. This is the reason that the consensus mechanism implemented by blockchain technology was a big deal in the first place. Furthermore, transactions are "in the clear" for everyone to see, and multiple nodes execute the transaction for verification. This secure verification guaranteed (outside of a 51% attack) that transactions were correct and immutable.

When looking at using blockchain in the Enterprise world, the very safeguards that ensure the integrity of public blockchain networks, some of which are listed above, bring about scalability problems. For instance, the public Ethereum network averages around 20 transactions per second, with a typical transaction latency of 10-20 seconds. By contrast, the Visa credit card processing system averages 2,000 transactions per second. Newer blockchain technologies do address this, with other compromises. This isn't the only issue, though.

All transactions, smart contract code (bytecode), and state are typically in the clear, visible to anyone who joins the network. In the cryptocurrency / public DApps world, that total transparency is desirable. But for a private Enterprise/Consortium solution that utilizes blockchain technology, why would you want everything exposed to the public? Once again, there are solutions being developed to address this, but there's more to consider.

Large enterprises have become accustomed, over the last few decades, to building out secure, reliable high-performance infrastructure that they manage, to run their critical business applications and systems. These often come at significant cost, but this was also very necessary to keep an enterprise running in a highly competitive market environment.

Being used to this, will enterprises now build critical applications on public networks that offer no or unproven guarantees in terms of scale, uptime and security? Any enterprise architect knows that platform choice is critical due to risk mitigation, auditability, fault tolerance and level of service. Who provides these guarantees in a public network where one of the nodes processing your code could be a PC in someone’s bedroom?

You also can't store a lot of information on the public chains, and when you do, it costs you. "Calculating Costs in Ethereum contracts" (link in the addendum) mentions that at an ETH price of $295, it would cost around $5,000 to store 1MB of data on the blockchain. The "gas" concept is there precisely to make you use as few resources as possible, which keeps the network running efficiently.
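As a rough sanity check of that $5,000-per-MB figure, here's a back-of-envelope calculation. My assumptions (not from that article): storing data costs 20,000 gas per 32-byte word via the SSTORE opcode, and a gas price of roughly 26 gwei:

```shell
# 1 MB of data = 1,048,576 bytes = 32,768 words of 32 bytes each
echo $(( 1048576 / 32 ))                # 32768 storage words
echo $(( 32768 * 20000 ))               # 655360000 gas in total

# At ~26 gwei (26e-9 ETH) per unit of gas:
awk 'BEGIN { printf "%.2f ETH\n", 655360000 * 26e-9 }'   # 17.04 ETH

# And at an ETH price of $295:
awk 'BEGIN { printf "$%.0f\n", 17.04 * 295 }'            # $5027 -- right around $5,000
```

The exact dollar figure swings with the gas price and the ETH price, which is part of the problem for anyone budgeting an application on a public chain.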

Then there is the power issue. You may have seen the stats: the power consumption of all the Bitcoin mining nodes combined is significant. On other public blockchains, consensus mechanisms like "Proof of Stake" have been proposed to counter this, but this will not work in the Enterprise world. You cannot rely on actors owning a predetermined amount of a cryptocurrency to keep an Enterprise platform running. What do we do about this? Firstly, Enterprise blockchain platforms will typically run on advanced clouds, where the power draw of compute has been optimized. Secondly, enterprise blockchains will allow you to choose consensus algorithms to determine the level of computational intensity required; being "private", the consensus method need not be the most stringent.

What is the point of a “private” blockchain?

Firstly, these Enterprise blockchains could be Consortium Blockchains, shared between large enterprises, government organizations and regulatory bodies. The consortium members are known and controlled. The actors are mature, with robust, enterprise-grade IT environments and security policies.

Secondly, when organizations share confidential information with other organizations, they don't normally want it to be public. The point of the blockchain is easy sharing of information between members of the consortium. In the past you could have had a single database to do this; the question is where does it sit, and who owns it? The blockchain makes this data "public" only between members of the consortium.

Lastly, enterprises want familiar development and management options, as per the rest of their IT systems. This reduces risk, both in terms of security and time to solution. Enterprise blockchains will allow Smart contract development in a variety of languages and tools like Visual Studio, which are already familiar to enterprise development teams.

Blockchain 3.0 – The COCO framework

An early leader in presenting an Enterprise Blockchain platform is the open-source COCO framework. The COCO framework allows a consortium of enterprises and government bodies to implement a blockchain solution using the ledger technology of their choice (e.g. Ethereum, Corda, Hyperledger), but also implements some technological enhancements that address the shortcomings of the public blockchain solutions.

Some of the benefits of a framework like this include:

  • Faster transaction speed – The COCO framework runs in the Azure cloud, which uses Intel's SGX to create a Trusted Execution Environment. The resulting network of trusted nodes reduces the consensus problem from Byzantine fault tolerance to crash fault tolerance. This means that the consensus algorithm can be simplified in certain applications, ultimately giving you a much faster transaction speed than on a public blockchain (where you would never do this).
  • Flexible confidentiality models – Because COCO uses industry-standard authentication and authorization (like Azure AD), transactions and smart contract code can be processed in the clear yet revealed only to authorized parties. This reduces the need for complicated confidentiality schemes, like zero-knowledge proofs and zkSNARKs, which can become computationally intensive.
  • Reduced energy usage – This would be a major advantage. By reducing computationally intensive consensus algorithms, like Proof-of-Work, and running nodes in an optimized cloud, power usage can be reduced and controlled.
  • Enterprise storage and interoperability – I mentioned the problems with storage limits and "gas" above; with the Coco framework, Microsoft allows you to interact securely with other cloud assets for storage and analytics functionality. An important development here is the invention of the "Cryptlet": Cryptlets are off-chain code modules, written in any language, that can execute within a secure, isolated, trusted container.



With the Coco framework, consensus is still required for transactions and smart contract state. However, compared to a public blockchain, every node fully trusts every other node.

Because of this, there is no need to defend against Byzantine faults. Blockchain updates that do not conflict with existing state maintained by a validating node (VN) can be unequivocally accepted. The end result is that the Coco framework does not require wasteful, compute-intensive proof-of-work algorithms, potentially unfair proof-of-stake algorithms, or latency-inducing time-bound algorithms.

In addition to the above, the Coco Framework is designed to support pluggable consensus algorithms, like Paxos or the Microsoft-developed Caesar. This ability to choose directly affects transaction speed.

When I speak to enthusiasts about Enterprise blockchains and all of the above, some of them immediately start protesting about whether it's decentralized, whether it's a "real" blockchain, etc. This isn't the point.

As Enterprise blockchains develop (and the Coco Framework is one example, probably the most developed offering thus far), you WILL see a deviation from some of the goals of the public blockchains in order to satisfy the needs of private consortium solutions. This is perfectly fine; enterprises building blockchain applications for public consumption can still choose to utilize one of the public chains where required, and those will continue to stay true to the original goals of blockchain (some of them, at least).


With large corporations now scrambling to think about how they will utilize Blockchain, the time for this technology has arrived. The technical community has sold the concept, and the business community has started to embrace it. What would be a disaster now is if some applications were deployed onto public chains and the performance or security let the solution down, causing negativity around the concept.

The evolution of public blockchains will continue though, make no mistake. Look at technologies such as Bitcoin's Lightning Network and Ethereum's Casper protocol and sharding system. But it will take time for the winners to emerge, and even a highly performant public chain may not be desirable for a consortium to utilize, for various reasons.

Therefore Blockchain 3.0 will be the emergence of cloud-driven, enterprise-grade blockchain platforms, like the Coco framework, and the further innovation that those will bring. Once enterprise architects start understanding the differences between the different types of blockchain platforms, I think we will see the anticipated uptick in consortium solutions being delivered.


1) Mark Russinovich announcing the COCO Framework


2) Discussion of the COCO Framework: Microsoft's COCO Framework

3) Calculating Costs on the Ethereum network


My tech predictions for 2018

February 14, 2018

Understanding Ethereum

December 4, 2017


Right now is absolutely the prime time to talk about cryptocurrencies. In November 2017, the value of Bitcoin exploded, surging so quickly that it was the catalyst that brought Bitcoin into the public lexicon. However, Bitcoin is just the best known of the cryptocurrencies; there's a whole world of them out there. Many people who have been investigating cryptocurrencies have come across something called Ethereum. What is Ethereum?

Well, Ethereum is NOT a cryptocurrency. Bitcoin is a cryptocurrency. Ethereum is a platform.

Ethereum is an open-source platform which exists to build and distribute decentralised applications. These applications operate on a network of hundreds of thousands of nodes across the globe. There is no central "server" for these applications; they operate in a peer-to-peer fashion. These decentralised apps are known as "dapps", and many are excited by the potential they offer. The value of the Ethereum platform increases as more apps are written for it.

There are benefits to this approach. Centralized applications could, for example, be at higher risk of attack. In a decentralized architecture, there is no single point of failure.

The Blockchain

One trait shared with Bitcoin is the reliance on "blockchain" technology at the heart of Ethereum. Bitcoin, as a digital currency, was the first and most well-known use of blockchain technology. Ethereum makes blockchain technology available for uses other than cryptocurrency. Technically, you could say that Bitcoin was the first "dapp", with the cryptocurrency being the first widespread application of blockchain technology. The proponents of Ethereum, however, will tell you that the real potential of the blockchain is yet to come. The blockchain nodes can run everywhere, on premises or in the cloud.

Developers writing applications to run on Ethereum need to write something called a "smart contract". This is a series of steps that defines how a transaction is handled. A smart contract can store data, perform logic or interact with other contracts. Developers can use smart contracts for things such as asset registration, land ownership and anything else where keeping a permanent record is essential.

Who is backing it ? 

Back in 2015, Microsoft became the first of the tech giants to really talk seriously about cryptocurrencies and blockchain. Firstly, Microsoft announced that it would support Bitcoin as a currency for purchases on the Microsoft online store. The details of how to go about this can be found on Microsoft's site.

However, when it comes to Ethereum, Microsoft announced that it was betting big on Ethereum as a platform. Microsoft rolled out Coco, a framework designed to facilitate blockchain adoption by adapting existing blockchain protocols or by creating entirely new protocols, and the Azure Blockchain service, a BaaS (blockchain as a service) that enables businesses to quickly and easily configure and deploy a blockchain network.

The main advantage of the Coco framework is its ability to process over 1,600 transactions per second, something which neither the Bitcoin nor the Ethereum blockchain can support at the moment (this would be for your private blockchain).

The Coco framework also uses a technology called a trusted execution environment (TEE). The trusted execution environment hosts the blockchain code in a secure box, using Intel's Software Guard Extensions or Windows' Virtual Secure Mode to validate the environment.

Coco Framework

As you can see, the Coco framework works with a few Blockchains, but Ethereum is probably the most popular so far.

Now what is interesting is that at the AWS re:Invent summit in November 2017, AWS, surprisingly to many, announced nothing around blockchain technology. Time will tell whether this is wise.

So what am I actually buying on my crypto exchange ? 

What you buy, if you're investing in Ethereum, is something called Ether (ETH). This is the cryptocurrency piece of Ethereum. So what is the relationship between the two?

If you write an application on Ethereum, why should someone running a node process your program's transactions? Well, because you pay them to do so. And you pay them in Ether, the currency of Ethereum.

From the Ethereum project's website – "It is a form of payment made by the clients of the platform to the machines executing the requested operations. To put it another way, ether is the incentive ensuring that developers write quality applications (wasteful code costs more), and that the network remains healthy (people are compensated for their contributed resources)."

If Bitcoin is compared to DIGITAL GOLD, an asset that is worth investing in, then Ether can be compared to DIGITAL OIL. Now that is a comparison that many of you would understand.

Some thoughts 

Now, to add my thoughts / concerns to the Ethereum debate.

Firstly, many are actually investing in Ether hoping to see returns similar to Bitcoin's. In other words, they want the price of Ether to go through the roof. Wouldn't this be bad for people running applications on the Ethereum blockchain? If the price of Ether is both highly volatile and quickly inflating, how does that make Ethereum an attractive platform to run applications on (since your application's running costs are denominated in Ether)?

On the other hand, if it's too low, there is no incentive to mine. Perhaps someone can clear this up for me in the comments section.

Secondly, there's the issue of blockchain size. While researching Ethereum, I decided to get into the spirit of things and do some mining; not really to make much profit, but to experience how it all works. After installing Geth, I waited for it to download the blockchain so I could start. After downloading over 20GB on my crappy connection (with no end in sight), I decided that this wasn't a great idea (I then saw that you don't need to download the entire thing). Still, some are saying that the Ethereum blockchain will exceed 1TB very soon. Good thing there is talk of a sharding system.
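For anyone else trying this: you don't need the full chain just to experiment. Geth supports lighter sync modes (flag names as I understand them for geth versions around this time; check `geth help` for your version):

```shell
# "Fast" sync downloads blocks and recent state without replaying every
# historical transaction, which cuts the download dramatically
geth --syncmode "fast" --cache 1024

# "Light" sync keeps only block headers and fetches everything else on demand
geth --syncmode "light"
```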

Thirdly, transaction speed has always been an issue with cryptocurrencies. Bitcoin, believe it or not, can only really process about 5 transactions per second. Looking at other platforms:

• DASH – 10 transactions per second

• Ethereum – 20 transactions per second

• PayPal – 193 transactions per second on average

• Visa – 1,667 transactions per second

• Ripple – 1,000-24,000 transactions per second (claimed; the real number is unknown)

So Ethereum beats Bitcoin on transaction speed, but considering what it wants to do, that is not enough. There has been talk of a project called Raiden, however, which aims to dramatically improve the scaling of Ethereum.

The other thing we've already seen is the blockchain becoming "jammed", with a whole backlog of unprocessed transactions forming. At one point I captured a screenshot showing over 9,000 transactions sitting unprocessed in the blockchain.

Once again, perhaps something like Raiden will address this.

Fourthly, there is the debate about running Ethereum nodes in the cloud: is this counter to decentralisation? One potential benefit I can think of is energy efficiency. There is already a debate about the inefficiency of mining from an electricity perspective, and how this in turn makes cryptocurrency inefficient. If nodes are in the cloud, however, remember that the large data centres are built at tremendous economies of scale, and you will not find better efficiency in smaller-scale DCs, so from that perspective it may help somewhat.

Lastly, in spite of the early issues mentioned, there really is a sense in the community that Ethereum could be the Internet of the 2010s, and that we’re on the verge of something big. At the very least, it provides us with options.

As for Ether, I cannot tell you whether to invest in it or not, but as you can see, you do want to be following it very closely over the next few months.

Let me know your thoughts in the comments section.


Personal A.I

October 26, 2017

Engaging with customers daily can sometimes be fascinating. With so many new ideas and innovations in the tech world every year, the eagerness to talk to customers and share this with them is great. It can be humbling, then, astonishing even, when you go to a customer to talk about a topic like Machine Learning, only to find that not only have they embraced your technology, but they are already pushing the boundaries in ways you haven't seen previously. And when the person giving you a detailed explanation of what they're doing is barely out of college, it does fill you with some excitement and optimism. The gist of the above is that terms like Machine Learning and A.I. have certainly entered the mainstream lexicon.

The cloud has really enabled us both to speed up development in the A.I. space (note how this field has progressed post-cloud) and to bring A.I. technologies to a wider audience through the cloud's consumption-based model. A big advantage of the cloud is gaining access to hardware platforms that you perhaps wouldn't have invested in previously in the on-premises world. Azure, for example, gives you access to GPU-based VMs, on which you can build HPC and AI solutions, scaling quickly to even thousands of GPUs.

In August 2017, Microsoft revealed Project Brainwave, a platform built on FPGAs (Field Programmable Gate Arrays) in partnership with Intel. This is a leap over simply chaining GPUs together; the entire architecture is optimized for deep learning. At the unveiling of Brainwave, the Intel Stratix 10 FPGA, built on a 14nm process, demonstrated sustained performance of 39.5 teraflops.

Project Brainwave

The above developments in the cloud are expected to continue at breakneck pace. A performance of 39 TF may not be so impressive in a year's time. This is an example of what is possible with the cloud.

To complete the picture, however, we need to move from the cloud to the edge…..

The Intelligent Edge

In the world of IoT, a new concept is becoming prevalent: the Intelligent Edge. Edge computing refers to placing data processing power at the edge of a network instead of holding it all in the cloud.

Now this may seem odd to some of you, however the importance of the cloud does not diminish in this scenario. The intelligent edge recognizes that to deliver what businesses require, data processing and intelligence need to be applied at the edge before data is synced into the cloud.

[Image: The Intelligent Edge]

I've spoken previously about the concept of having tremendous processing power on you, enough to power intelligence, as I've always been fascinated by artificial intelligence. In 2010, way before the current wave, I wrote on my blog about the concept of a "Local AI": tremendous computing and analytical power in your pocket (and on your wrist) that changes your daily life. Looking back now, the term seems clumsy, so I am renaming the concept Personal AI. The concept of the Intelligent Edge will extend to the device you carry, not just industrial sensors, and deliver Personal AI to you.

In the present, to build massive neural networks you need the scale of the cloud. To train your models, you need vast amounts of data, as well as massive compute for things like backpropagation. You're not going to do that on a mobile device. However, you could run pre-trained models on your mobile device with the right custom chips. You thought that chips don't matter anymore?

Personal AI

In 2017, we are starting to see Personal AI take form.

For many years it was felt that CPUs were becoming a commodity, and that the real innovation lay elsewhere. As an engineering graduate who loved chips, I was dismayed by that. I still get excited when the latest desktop CPUs are announced, and I was pleased to see the CPU wars stronger than ever in 2017, with AMD's Ryzen launching and then Intel fighting back.

A few years back, one company decided that central to its differentiating strategy would be a return to designing its own custom chips, to give itself a massive advantage; this went against what the market was saying. Can you guess the company?

It was Apple.

Their "Ax" series of custom systems-on-a-chip (SoCs) are, according to Apple, key to the smooth and fast experience on their devices. In September 2017, Apple announced the A11 Bionic, their latest custom SoC, which according to them includes a neural engine.

The Chinese have already responded…

Personal AI is happening, just as I thought it would.

This is the first mainstream example of pushing Machine Learning to the Edge, and already we can see why. Apart from the personal assistant utilizing this capability, there are other use cases: the new iPhone uses facial detection to unlock the device, and that needs to work even when you're offline.

Intel, however, will not readily give up any advantage in an exciting space like this. Also in September 2017, Intel announced a chip that simulates a neural network (neuromorphic computing), called Loihi. Expect this chip to find its way into smart devices that can process more data on the Edge before sending it to the cloud.

Lastly, there is the PCIe card by BrainChip, which plugs into a PC. Why would you need a neural network running there? One application is processing simultaneous video feeds and doing facial recognition on the fly.

Back to Personal AI: there are so many uses for increased intelligence on the Edge that it will soon be taken for granted. By 2020, I see most pocket computers (what you still call mobile phones) having advanced ML capability as standard, to perform a whole host of tasks. The entry point will be the personal assistant, which will rapidly become much smarter. Speech recognition will improve as it is done on-device. Instead of the personal assistant being just a front end to a search engine in the cloud (as it is mostly today), the Personal AI will have real capability to process and understand, using the search engine to reference data.

I see advanced scenarios such as the following:

1) Realtime Health Analysis – currently your smartwatch monitors your heart rate and steps and sends it to your phone, which sends it into the cloud. In the near future, your Personal AI will read this data and analyze it in real-time, with the ability to alert you as early as possible should you be at risk of a heart attack or stroke, for example. The number of complex sensors built into your smartwatch will increase to enable this. Advances in sensor technology will be key to delivering on the capability of Artificial Intelligence – we need more capable sensors, all small enough to fit on a wristwatch.

2) Environmental Analysis – Another scenario that's easy to predict will be using the capability on your pocket computer to analyze the environment around you. Air quality is a problem in many parts of the world – imagine taking out your device, some basic readings being taken, and then machine learning kicking in to advise you whether the air is safe to breathe in that location, and what the risks are. This is especially useful for travelers. Once again, the development of better sensors is critical. Imagine a sensor that could analyze water in a glass and tell you if it is safe to drink – very useful in certain countries.

3) Realtime Language Translation – This is already happening with products like Skype, but could be augmented by the power available on the Edge device. I would imagine a future version of Skype taking advantage of Personal AI to improve realtime translation: speech is translated on the device itself, and the already-translated language is sent to the cloud and all the way to the other end.

4) Custom Apps – Once you have the processing power (and the sensors) at the Edge, you will see all kinds of custom apps built to take advantage of it. Environmental sensors could be used to create an app for workers in dangerous environments, mines for example. Beyond the raw sensor readings, it is the Personal AI engine that would add real value, delivering insight in near real-time. It would also help send more relevant data into a bigger engine in the cloud, with the knock-on effect of helping to train more accurate models.
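To make the health scenario above concrete, here is a minimal sketch of on-device anomaly detection over a heart-rate stream. This is purely illustrative – the windowed z-score approach, the threshold, and the sample readings are all my own assumptions, not how any actual smartwatch works:

```python
from statistics import mean, stdev

def detect_anomalies(readings, window=10, threshold=3.0):
    """Flag readings that deviate sharply from the recent baseline."""
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        # A reading more than `threshold` standard deviations from the
        # rolling mean is treated as a potential alert.
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# A resting heart rate that suddenly spikes
hr = [62, 64, 63, 61, 65, 62, 63, 64, 62, 63, 140]
print(detect_anomalies(hr))  # -> [10]
```

A real Personal AI would of course use a trained model over many sensor channels, but the point is the same: the analysis runs locally, and only alerts (or summarized data) need to travel to the cloud.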

The development of Personal A.I is an area of technological advancement that I believe will have a deeply personal effect on your life. Just as the smartphone and social media did a decade back, the ability to interact with a smarter personal assistant, and to access life-changing services like those listed above, will change our daily lives. Perhaps a more natural interaction with technology will even stop us all from staring at a screen all day – we can only hope.

Categories: AI, Futurism, Tech

Tech Summit Cape Town is happening in Feb 2018 !

October 25, 2017 Leave a comment

Many people still have fond memories of the old Tech-Ed events that were held initially at Sun City, and then for a few years in Durban.

While we have no idea if Tech-Ed will ever return, the exciting news is that Tech Summit, a free 2 day event, will happen in February 2018. This year we had a successful event in Johannesburg, and the good news is that for 2018 we will host Tech Summit in Cape Town !!

This is a free event where you register to enter – first come, first served. Registration opens on 15 November, so make sure to save the link below !!




Categories: SQL Server

Demystifying A.I.

August 21, 2017 Leave a comment



With all the hype around “AI” and Machine Learning, I thought that I’d dabble in unpacking some of the key concepts. I find that most of my reading time now is spent in this area.

Firstly, Artificial Intelligence isn't really something brand new. It has been written about for decades, most famously by Alan Turing in 1950. In his paper "Computing Machinery and Intelligence" he described the "imitation game", which later came to be known as the Turing Test. There was a movie in 2014 called "The Imitation Game" which details Turing's life, and how he cracked the Enigma code in WW2 (starring the usually excellent Benedict Cumberbatch). Highly recommended.

In terms of actually experiencing a simple AI, the easiest way is to play a videogame. Even in the '80s with "Pac-Man", as the player you were trying to outwit the four enemies on screen – the algorithms driving those enemies could be considered an early implementation of simple AI. Modern games have far more complex implementations of "AI" that may surprise you with their intelligence.


Pac Man

Was Blinky an A.I.?


So why is AI becoming more prominent now? Very simply, we have now reached a tipping point where Big Data, software advances and cloud computing can be leveraged together to add value to businesses and society. Even though we are still many years from creating a sentient AI like HAL 9000, we can implement things like Machine Learning today to bring about improvements and efficiencies.

Machine Learning

Machine Learning is a subset of A.I, not really A.I in totality. Way back in 1959, Arthur Samuel defined Machine Learning as "the ability to learn without being explicitly programmed". Basically, this field entails creating algorithms that can find patterns or anomalies in data, and make predictions based on those learnings. Once "trained" on a set of data, these algorithms can very reliably find those patterns and make predictions (if the algorithm is chosen correctly). Using tools like Azure Machine Learning, you can deploy this as a web service to automate the prediction process, and also run these predictions as a batch.
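As a tiny illustration of "finding patterns in data and predicting from them", here is a k-nearest-neighbours classifier in plain Python. The data, labels, and the choice of algorithm are my own toy assumptions, chosen only to show the train-then-predict idea, not anything specific to Azure ML:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among the k nearest training points."""
    nearest = sorted(train, key=lambda item: math.dist(item[0], query))
    votes = Counter(label for _, label in nearest[:k])
    return votes.most_common(1)[0][0]

# Toy "training set": each point is ((feature1, feature2), label)
train = [((1.0, 1.2), "normal"), ((0.9, 1.0), "normal"),
         ((1.1, 0.8), "normal"), ((5.0, 4.8), "risky"),
         ((4.7, 5.1), "risky"), ((5.2, 4.9), "risky")]

print(knn_predict(train, (1.05, 0.95)))  # -> normal
print(knn_predict(train, (4.9, 5.0)))    # -> risky
```

Deploying something like this behind a web service, as Azure ML does for its models, simply means exposing that `knn_predict` call over HTTP so applications can request predictions one at a time or in batches.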

Now I personally became exposed to similar concepts around 2006, when working with SQL 2005. That product release included a lot of "data mining" functionality. Data Mining basically involved using algorithms (many built into SSAS 2005) to find patterns in datasets – a precursor to the Machine Learning of today.

I was really excited by the possibilities of Data Mining, and tried to show it to as many customers as possible, however the market was just not ready. Many customers told me that they just wanted to run their data infrastructure as cheaply as possible and didn't need any of this "fancy stuff". Of course, the tools today are a lot easier to use and we now include support for R and Python, but I think what was missing back then was industry hype. Industry hype, coupled with fear of competitors overtaking them, is possibly forcing some of those old I.T managers to take a look at Machine Learning now, while we also have a new breed of more dynamic I.T management (not to mention business users who need intelligent platforms) adopting this technology.

Machine Learning today has evolved a lot from those Data Mining tools, and the cloud actually makes using these tools very feasible. If you feel unsure about the value Machine Learning will bring to your business, you can simply create some test experiments in the cloud to evaluate without making any investment into tools and infrastructure, and I’m seeing customers today embrace this thinking.

Deep Learning

Deep Learning can be considered a particular type of Machine Learning. The difference is that Deep Learning relies on Neural Networks, a construct inspired by the human brain. We sometimes refer to Deep Learning as Deep Neural Networks, i.e. Neural Networks with many, many layers. The scale of Deep Learning is much greater than that of traditional Machine Learning.

Neural Networks

Neural Networks have been around for decades. As mentioned, this is a construct inspired by the human brain. In the past, we could build neural networks but simply didn't have the processing power to get quick results from them. The rise of GPU computing has given Neural Networks and Deep Learning a boost – there is a fast-widening gap in the number of Floating Point Operations per second (FLOPS) possible on GPUs compared to traditional CPUs. In Azure, you can now spin up GPU-based VMs and build neural networks (where you might not have invested in such resources in the old on-premises world).


CPU vs GPU FLOPs (image courtesy NVIDIA) 

Edit 23/8/2017: A day after publishing this, Microsoft announced a Deep Learning acceleration platform built on FPGA (Field Programmable Gate Array) technology – more here :

While Machine Learning works well with repetitive tasks (e.g. finding a pattern in a set of data), a neural network is better at tasks a human is good at (e.g. recognizing a face within a picture).
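To show what "layers" buy you, here is the smallest possible neural network sketch: two hidden units feeding one output unit, with hand-picked weights that compute XOR – something no single neuron can do on its own. The weights and step activation are deliberately simplistic assumptions for illustration; real networks learn their weights and use smooth activations:

```python
def step(x):
    """Simplest possible activation: fire (1) if the weighted sum is positive."""
    return 1 if x > 0 else 0

def forward(x1, x2):
    # Hidden layer: one unit fires like an OR gate, one like an AND gate
    h_or  = step(1.0 * x1 + 1.0 * x2 - 0.5)
    h_and = step(1.0 * x1 + 1.0 * x2 - 1.5)
    # Output layer: "OR but not AND" is exactly XOR
    return step(1.0 * h_or - 1.0 * h_and - 0.5)

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", forward(a, b))  # -> 0, 1, 1, 0
```

Deep networks are this same idea stacked many layers deep with millions of learned weights – which is why the massively parallel floating-point throughput of GPUs matters so much.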

Narrow AI vs General AI

All of the above would typically fall under Narrow AI (or Weak AI). Narrow AI refers to non-sentient AI designed for a singular purpose (e.g. a Machine Learning model that analyzes various factors and predicts when customers are likely to default on a payment, or when a mechanical failure will occur). Narrow AI can be utilized today in hundreds of scenarios, and with tools like Azure ML, it's very easy to get up and running.

General AI (or Strong AI) refers to a sentient AI that can mimic a human being (like HAL). This is what most people think of when they hear the words “Artificial Intelligence”. We are still many years away from this type of AI, although many feel that we could get there by 2030. If I had to predict how we get there, I would say perhaps a very large scale neural network built on quantum computing, with software breakthroughs being made as well. This is the type of AI that many are fearful of, as it will bypass human intelligence very quickly and there’s no telling what the machine will do.

Why would we need a Strong AI? One obvious use case would be putting it aboard a ship for a long space journey – it is essentially a crew member that does not require food or oxygen and can work 24/7. On Earth, the AI would augment our capabilities and be the catalyst for rapid technological advancement. Consider this: we may not know where the AI will add the most value; however, once we build it, it will tell us where.

The good news is that you don’t need General AI (a HAL 9000) to improve businesses in the world today. We are currently under-utilizing Narrow AI, and there is tremendous opportunity in this space. I encourage you to investigate what’s out there today and you will be amazed at the possibilities.

Image used via Creative Commons license
Categories: AI, Azure, Futurism, Tech