Demystifying A.I.

August 21, 2017

 

TheMatrix

With all the hype around “AI” and Machine Learning, I thought that I’d dabble in unpacking some of the key concepts. I find that most of my reading time now is spent in this area.

Firstly, Artificial Intelligence isn't really something brand new. It has been written about for decades, most famously by Alan Turing in 1950. In his paper "Computing Machinery and Intelligence" he described the "imitation game", which later came to be known as the Turing Test. There was a movie in 2014 called "The Imitation Game" which details Turing's life and how he cracked the Enigma code in WW2 (starring the usually excellent Benedict Cumberbatch). Highly recommended.

In terms of actually experiencing a simple AI, the easiest way to do this is to play a video game. Even in the '80s with "Pac-Man", as the player you were trying to outwit the four enemies on screen – the algorithms driving those enemies could be considered an early implementation of simple AI. Modern games have far more complex implementations of "AI" that may surprise you with their intelligence.

 

Pac Man

Was Blinky an A.I.?

 

So why is AI becoming more prominent now? Very simply, we have now reached a tipping point where Big Data, software advances and cloud computing can be leveraged together to add value to businesses and society. Even though we are still many years from creating a sentient AI like HAL 9000, we can implement things like Machine Learning today to bring about improvements and efficiencies.

Machine Learning

Machine Learning is a subset of A.I., not A.I. in its totality. Way back in 1959, Arthur Samuel defined Machine Learning as the field that gives computers "the ability to learn without being explicitly programmed". Basically, this field entails creating algorithms that can find patterns or anomalies in data and make predictions based on those learnings. Once "trained" on a set of data, these algorithms can (if chosen correctly) find those patterns and make predictions very reliably. Using tools like Azure Machine Learning, you can deploy a trained model as a web service to automate the prediction process, or run the predictions as a batch.
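To make the "train, then predict" idea a bit more concrete, here is a minimal sketch using SQL Server R Services (which comes up later in this blog) rather than Azure ML. The table dbo.MonthlySales and its columns are hypothetical, purely for illustration:

-- Sketch only: assumes SQL 2016 R Services and a hypothetical dbo.MonthlySales table
EXEC sp_execute_external_script
    @language = N'R',
    @script = N'
        # "train" a simple linear model on the input data, then predict from it
        model <- lm(SalesAmount ~ MonthNumber, data = InputDataSet);
        OutputDataSet <- data.frame(
            MonthNumber = InputDataSet$MonthNumber,
            Predicted = predict(model, InputDataSet));',
    @input_data_1 = N'SELECT MonthNumber, SalesAmount FROM dbo.MonthlySales'
WITH RESULT SETS ((MonthNumber INT, Predicted FLOAT));

In a real scenario you would train on historical data and score new data, but the shape of the workflow – data in, model fitted, predictions out – is the same.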

Now I personally became exposed to similar concepts around 2006 when working with SQL 2005. That product release included a lot of “data mining” functionality. Data Mining basically involved using algorithms (many built into SSAS 2005) to find patterns in datasets, a precursor to Machine Learning today.

I was really excited by the possibilities of Data Mining and tried to show it to as many customers as possible; however, the market was just not ready. Many customers told me that they just wanted to run their data infrastructure as cheaply as possible and didn't need any of this "fancy stuff". Of course, the tools today are a lot easier to use and we now include support for R and Python, but I think what was missing back in 2007 was industry hype. That hype, coupled with the fear of competitors overtaking them, is possibly forcing some of those old I.T. managers to take a look at Machine Learning now, while a new breed of more dynamic I.T. management (not to mention business users who need intelligent platforms) is also adopting this technology.

Machine Learning today has evolved a lot from those Data Mining tools, and the cloud makes using these tools very feasible. If you feel unsure about the value Machine Learning will bring to your business, you can simply create some test experiments in the cloud to evaluate it without investing in tools and infrastructure – and I'm seeing customers embrace this thinking today.

Deep Learning

Deep Learning can be considered a particular type of Machine Learning. The difference is that Deep Learning relies on Neural Networks, a construct loosely modelled on the human brain. We sometimes refer to Deep Learning as Deep Neural Networks, i.e. Neural Networks with many, many layers. The scale of Deep Learning – in both the data consumed and the compute required – is much greater than that of traditional Machine Learning.

Neural Networks

Neural Networks have been around for decades. As mentioned, this is a construct that mirrors the human brain. In the past, we could build neural networks but simply didn't have the processing power to get quick results from them. The rise of GPU computing has given Neural Networks and Deep Learning a boost: there is a fast-widening gap between the number of Floating Point Operations per Second (FLOPS) possible on GPUs and on traditional CPUs. In Azure, you can now spin up GPU-based VMs and build neural networks, where you might not have invested in such hardware in the old on-premises world.

NVIDIA

CPU vs GPU FLOPs (image courtesy NVIDIA) 

Edit 23/8/2017: A day after publishing this, Microsoft announced a Deep Learning acceleration platform built on FPGA (Field Programmable Gate Array) technology – more here: https://www.microsoft.com/en-us/research/blog/microsoft-unveils-project-brainwave/

While Machine Learning works well with repetitive tasks (e.g. finding a pattern in a set of data), a neural network is better suited to tasks that a human is good at (e.g. recognizing a face within a picture).

Narrow AI vs General AI

All of the above would typically fall under Narrow AI (or Weak AI). Narrow AI refers to non-sentient AI that is designed for a singular purpose (e.g. a Machine Learning model designed to analyze various factors and predict when customers are likely to default on a payment, or when a mechanical failure will occur). Narrow AI can be utilized today in hundreds of scenarios, and with tools like Azure ML, it's very easy to get up and running.

General AI (or Strong AI) refers to a sentient AI that can mimic a human being (like HAL). This is what most people think of when they hear the words "Artificial Intelligence". We are still many years away from this type of AI, although many feel that we could get there by 2030. If I had to predict how we get there, I would guess a very large-scale neural network built on quantum computing, together with further software breakthroughs. This is the type of AI that many are fearful of, as it could surpass human intelligence very quickly, and there's no telling what the machine would do.

Why would we need a Strong AI? One obvious use case would be putting it on board a ship for a long space journey – it is essentially a crew member that does not require food or oxygen and can work 24/7. On Earth, the AI would augment our capabilities and be the catalyst for rapid technological advancement. Consider this: we may not know where the AI will add the most value; however, once we build it, it will tell us where.

The good news is that you don’t need General AI (a HAL 9000) to improve businesses in the world today. We are currently under-utilizing Narrow AI, and there is tremendous opportunity in this space. I encourage you to investigate what’s out there today and you will be amazed at the possibilities.

Image used via Creative Commons license
Categories: AI, Azure, Futurism, Tech

Announcing SQL Server 2012 Service Pack 4

July 25, 2017

In case you missed it, it looks like SP4 will be the final big release for SQL 2012. You really need to move to SQL 2016 though – the improvements to the engine alone are worth it.

https://blogs.msdn.microsoft.com/sqlreleaseservices/announcing-sql-server-2012-service-pack-4/

 

Categories: Uncategorized

SQL Server was unable to communicate with the LaunchPad service

July 26, 2016

OK, so SQL 2016 is here, and I'm sure we're all already playing with the new features.

I’ve decided to start learning R, and did a fresh installation of SQL 2016 with R Services installed.

While attempting to run my “Hello World” script I encountered the error –

Msg 39011, Level 16, State 1, Line 1
SQL Server was unable to communicate with the LaunchPad service. Please verify the configuration of the service.

OK, so the first thing you have to do is enable SQL Server to run external scripts. Run:

EXEC sp_configure 'external scripts enabled', 1;
RECONFIGURE WITH OVERRIDE;

However, make sure that the new "Launchpad" service is running – I simply had to start this to solve the error on my VM.

LaunchPad
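With the service running, the first script should go through. For reference, a minimal "Hello World" external script looks something like this (just a sketch – the R code simply returns a greeting as a result set):

-- Sketch only: returns a single-row result set from R
EXEC sp_execute_external_script
    @language = N'R',
    @script = N'OutputDataSet <- data.frame(Greeting = "Hello World from R")'
WITH RESULT SETS ((Greeting NVARCHAR(50)));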

Categories: SQL Server, Uncategorized

SQL 2016 requires… Oracle?

August 6, 2015

OK, that's a click-bait headline… but it's not totally untrue.

When I installed SQL 2016 for the first time on a VM, the installation initially failed two of the pre-installation checks. One was a missing update for Windows Server, but the other was an update for Oracle Java (JRE). Now this was puzzling to me – since when does SQL Server require Java?

The answer lies in one of SQL 2016's most interesting features – PolyBase.

PolyBase is Microsoft's "SQL-over-Hadoop" solution – a layer that allows you to write SQL and query both relational and non-relational data. It originally launched with the APS appliance, so its inclusion in SQL 2016 is a milestone. So why Java? Remember that MapReduce jobs are typically written in Java.

SQL 2016 allows you to query data from a connected Hadoop or HDInsight instance – a true sign of the Big Data times that we live in.
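To give you a flavour of what this looks like, here is a rough sketch of the T-SQL involved. The Hadoop address, file layout and object names below are all made up for illustration, and PolyBase must be installed and configured for Hadoop connectivity:

-- Point SQL Server at a Hadoop cluster (hypothetical address)
CREATE EXTERNAL DATA SOURCE MyHadoop
WITH ( TYPE = HADOOP, LOCATION = 'hdfs://10.10.10.10:8020' );

-- Describe the format of the files sitting in HDFS
CREATE EXTERNAL FILE FORMAT PipeDelimited
WITH ( FORMAT_TYPE = DELIMITEDTEXT, FORMAT_OPTIONS ( FIELD_TERMINATOR = '|' ) );

-- Expose an HDFS folder as a table (hypothetical schema)
CREATE EXTERNAL TABLE dbo.WebClicks
( UserId INT, Url NVARCHAR(400), ClickTime DATETIME )
WITH ( LOCATION = '/data/webclicks/', DATA_SOURCE = MyHadoop, FILE_FORMAT = PipeDelimited );

-- Hadoop data can now be queried (and joined to relational tables) in plain T-SQL
SELECT TOP (100) * FROM dbo.WebClicks;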

Categories: Uncategorized

Finally! SQL 2014 certified for SAP

March 12, 2015

After a long wait, we finally have SAP certification for SQL 2014. The details can be found at Juergen Thomas' blog here.
As expected, there is support for the new 2014 ColumnStore feature – particularly useful for SAP BW scenarios.

Amongst the other features, I see that Azure also plays a big role – in fact, Azure DR for SAP NetWeaver seems to be a hot topic. You can find a whitepaper on this here.

Now, there was something missing in the announcement – support for SAP NetWeaver using SQL Server In-Memory OLTP. Apparently, it has to do with the fact that In-Memory OLTP uses the Snapshot transaction isolation level, while SAP NetWeaver normally uses Read Committed.

Disappointing – but I'm sure that we'll have some good news on this soon!

Categories: Uncategorized

SQL 2014 ColumnStore – The best of both worlds!

September 11, 2014

I'm sure that many of you are familiar with the Column Store index feature that was launched with SQL 2012. Using Microsoft's in-memory technology, the Column Store index delivered a massive performance boost to data warehouse queries. However, I've found that many people were put off the feature because the fact table becomes read-only once you implement a Column Store. Well, you've probably heard the big news (I'm writing this post months later than I originally wanted to), and for those of you who haven't, the Column Store feature in SQL 2014 is updateable. That's right, no more read-only fact table. However, that isn't the reason for this post – I have something else here that might interest you. Space, the final frontier…

Firstly, you actually have two options with SQL 2014. You can create the Non-Clustered Column Store as per 2012 (which is NOT updateable). Alternatively, you can create the new Clustered Column Store, which IS updateable. The Clustered Column Store has some other interesting features:

1) It becomes the primary storage mechanism for the table.
2) It eliminates the need for other covering indexes.
3) It dramatically reduces the storage requirements for the table – and that's a fact.

Now it's point 3 that I want to talk more about. In my test DW that I built for demo purposes, I have a fact table with about 20 million rows. Let's take a look at the storage statistics of that table.

01_Normal

This table is currently taking up about 2.3 GB of space. We could apply Page Compression to this table (which I believe has also been improved in 2014), and we would get the following result.

02_Normal with page Comp

Not bad – a reduction of the space used to less than 25% of the original size. However, we haven't built any indexes yet to boost performance. Prior to applying the Page Compression, I created a copy of the table, called FactSales2. Let's apply the Non-Clustered Column Store index to that table, to give us that performance boost, and see what happens to the storage.

03_Non Clustered ColumnStore

The storage space of the table increases, and we can see that there is a 242MB overhead for the index. Now we could implement Page Compression and then the Non-Clustered Column Store index, but the table would still be read-only. In addition, you would probably need to implement more indexes, which take up even more space. In SQL 2014, we have a better solution: implement the Clustered Column Store index. What about the Page Compression?…

04_Clustered ColumnStore

…it simply isn't needed. The Clustered Column Store delivers better compression than Page Compression, reducing the size of the table and the index together to a mere 200MB. Technically, the index is the table. This is astonishingly less than a tenth of the storage space required by the regular table with the Non-Clustered Column Store. Is the performance the same? I've written a typical data warehouse reporting query with joins to the dimensions, and executed it on all four table types. The results are as follows:

Type                                      Avg. Execution Time   Notes
Regular Table                             32 seconds            No indexes
Regular Table with Page Compression       23 seconds            No indexes
Table with Non-Clustered Column Store     5 seconds             No compression (2.3 GB table), read-only
Clustered Column Store Table              5 seconds             Table is 200MB!

The only thing left to say is that, if you had reservations about using the Column Store previously, you would be mad not to use the Clustered Column Store on fact tables in SQL 2014. The benefits with regards to storage and performance are astounding. It remains to be seen if there is any impact on insert performance, but that's a topic for another day.
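For reference, the options compared above boil down to a few statements, each applied to its own copy of the fact table. This is a sketch only – dbo.FactSales and its column names are hypothetical, with FactSales2 being the copy mentioned earlier and FactSales3 a third copy:

-- Page Compression on a regular rowstore table
ALTER TABLE dbo.FactSales REBUILD WITH (DATA_COMPRESSION = PAGE);

-- SQL 2012-style Non-Clustered Column Store (the table becomes read-only)
CREATE NONCLUSTERED COLUMNSTORE INDEX NCCI_FactSales2
ON dbo.FactSales2 (DateKey, ProductKey, StoreKey, Quantity, SalesAmount);

-- SQL 2014 Clustered Column Store: the index becomes the table itself,
-- it is updateable, and no other covering indexes are needed
CREATE CLUSTERED COLUMNSTORE INDEX CCI_FactSales3
ON dbo.FactSales3;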

Categories: SQL Server

Tech Ed South Africa?

August 23, 2014

Lots of people have been asking me about Tech Ed South Africa recently. The official comms was that there will not be an event this year, but Microsoft will be hosting other smaller events throughout the year. I refer you to the local Tech Ed page here:

http://www.teched.co.za/closed.aspx

If I hear of any new events I'll mention them here on my blog.

Thavash

Categories: Uncategorized