Deep Learning for the Masses (… and The Semantic Layer)
Deep learning is everywhere right now: in your watch, in your television, in your phone, and in some way in the platform you are using to read this article. Here I’ll talk about how you can start changing your business using Deep Learning in a very simple way. But first, you need to know about the Semantic Layer.
Scaling the Semantic Layer for your organization
While searching for something that could help you and me implement an end-to-end platform for delivering a true Semantic Layer at enterprise scale, I found a great platform: Anzo, created by a company called Cambridge Semantics.
You can build something called “The Enterprise Knowledge Graph” with Anzo.
The nodes and edges of the graph flexibly capture a high-resolution twin of every data source — structured or unstructured. The graph can help users answer any question quickly and interactively, allowing users to converse with the data to uncover insights.
In addition to making everyday big data analytics problems easy, the graph unlocks new possibilities in areas where graphs are particularly well suited. The graph, based on open standards, is a platform for continuous improvement. Within the graph, sources are quickly linked and harmonized using business rules, text analytics, and even machine learning (this is going to be important soon).
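To make the idea concrete, here is a toy sketch of a knowledge graph stored as (subject, predicate, object) triples, the same basic shape that RDF-based platforms like Anzo build on. The entities and the `query` helper are made up for illustration; a real system handles billions of triples with indexes and a proper query language.

```python
# A toy "knowledge graph": facts stored as (subject, predicate, object) triples.
# Structured and unstructured sources end up linked in the same graph.
triples = [
    ("acme_corp", "has_customer", "jane_doe"),
    ("jane_doe", "purchased", "widget_x"),
    ("widget_x", "category", "hardware"),
    ("jane_doe", "mentioned_in", "support_ticket_42"),  # linked from an unstructured source
]

def query(subject=None, predicate=None, obj=None):
    """Return every triple matching the fields that are not None."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# Everything we know about Jane, across sources, in one lookup:
print(query(subject="jane_doe"))
```

Because every fact has the same triple shape, adding a new source is just adding more triples; no schema migration is needed, which is what makes this model so flexible for harmonizing sources.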
I also loved the idea of a Data Fabric. Then I realized that other people use the same concept. It reminded me of the space-time fabric. So I went ahead and defined the Data Fabric myself (without knowing whether that is what the authors meant, and without reading other definitions).
The concept of the Space-Time Fabric in Physics is a construct created to explain the continuum of space and time, and it’s made of four (or eleven, or twenty-six, depending on the theory you subscribe to) dimensions. Inside this construct, gravity is a manifestation of the warping of the fabric of space-time.
From Ethan Siegel: You can talk about space as a fabric, but if you do, be aware that what you’re doing is implicitly reducing your perspective down to a two-dimensional analogy. Space in our Universe is three dimensional, and when you combine it with time, you get a four dimensional quantity.
So what would a Data Fabric be? If we follow the definition from Physics, we can say that for an organization:
The Data Fabric is the platform that supports all the data in the company: how it’s managed, described, combined, and universally accessed. This platform is built on an Enterprise Knowledge Graph to create a uniform and unified data environment.
And with Anzo this is possible. This is what a Data Fabric with Anzo can look like (it kinda looks like the space-time fabric, awesome!):
The things on top of the data fabric are data layers. These data layers can add capabilities like data cleansing, transformation, linking, and access control, dynamically enhancing the in-memory graph in an iterative manner.
Data Layers in this stacked fashion are very flexible, meaning that you can easily turn layers on or off, and remove, copy and create layers as needed.
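A minimal way to picture this stacking, in plain Python: each layer is just a function over the records, and turning a layer on or off means including it in, or removing it from, the stack. The layer names and records below are invented for the sketch; Anzo's layers operate on the in-memory graph, not on Python lists.

```python
# Data layers as a stack of toggleable transformations.
def cleanse(records):
    # Drop records that are missing an email address.
    return [r for r in records if r.get("email")]

def transform(records):
    # Normalize emails to lowercase.
    return [{**r, "email": r["email"].lower()} for r in records]

def mask_pii(records):
    # An "access control" layer: hide emails from unprivileged viewers.
    return [{**r, "email": "***"} for r in records]

def apply_layers(records, layers):
    for layer in layers:
        records = layer(records)
    return records

records = [{"name": "Ana", "email": "ANA@EXAMPLE.COM"}, {"name": "Bo"}]

# With masking off:
print(apply_layers(records, [cleanse, transform]))
# With the access-control layer switched on:
print(apply_layers(records, [cleanse, transform, mask_pii]))
```

The nice property is that the source `records` are never mutated, so copying, reordering, or disabling a layer is a one-line change to the stack.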
With Anzo you get automatic query generation (yep, that’s a thing), and running those queries against the complex graph makes extracting features easy, and eventually fully automated!
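To give a flavor of what "automatic query generation" means, here is a toy sketch that turns a user's point-and-click filters into a SPARQL query, the standard query language for RDF graphs. The `generate_sparql` helper and its vocabulary are hypothetical; Anzo's actual generator is far more sophisticated.

```python
# Toy query generator: a user picks an entity type and some filters in a UI,
# and the system writes the SPARQL for them.
def generate_sparql(entity_type, filters):
    clauses = [f"?s a :{entity_type} ."]
    for prop, value in filters.items():
        clauses.append(f'?s :{prop} "{value}" .')
    body = "\n  ".join(clauses)
    return f"SELECT ?s WHERE {{\n  {body}\n}}"

# "Show me all customers in EMEA" without writing a line of SPARQL:
query = generate_sparql("Customer", {"region": "EMEA"})
print(query)
```

The point is that the user never touches the query language: the graph's schema is known to the platform, so queries can be assembled from the user's gestures.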
With Anzo’s several components, a user can truly have a conversation with their data, quickly and easily pivoting to take the analysis in new directions based on the answers to questions. Without specialized query knowledge, they can traverse even the most complicated multi-dimensional data on the way to building exploratory charts, filters, tables, and even network views.
And by connecting open source technologies like Spark, Featuretools, and Optimus, you can fully prepare your data and finally make it ready for machine and deep learning.
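As a small taste of that preparation step, here is a pandas sketch of the kind of cleansing and feature aggregation that tools like Featuretools and Optimus automate at scale (the transaction data is invented for the example):

```python
import pandas as pd

# Hypothetical transaction data pulled out of the data fabric.
tx = pd.DataFrame({
    "customer": ["a", "a", "b", "b", "b"],
    "amount": [10.0, 20.0, 5.0, None, 15.0],
})

# Cleanse: fill missing amounts with zero.
tx["amount"] = tx["amount"].fillna(0.0)

# Aggregate raw transactions into per-customer features, ready for a model.
features = tx.groupby("customer")["amount"].agg(["sum", "mean", "count"])
print(features)
```

On a real data fabric you would do the same thing distributed with Spark, but the idea is identical: raw linked records in, a model-ready feature table out.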
I’ll write more about this in the future, but for now, let’s assume we have our data fabric and everything on point, and we want to do machine and deep learning.
Deep Learning for You
Ok, Deep Learning. You want to use it. What are its main applications?
Here you can see some of them:
In the few years since deep learning became the king of the AI world, it has achieved great things. François Chollet lists the following breakthroughs of Deep Learning:
- Near-human level image classification.
- Near-human level speech recognition.
- Near-human level handwriting transcription.
- Improved machine translation.
- Improved text-to-speech conversion.
- Digital assistants such as Google Now or Amazon Alexa.
- Near-human level autonomous driving.
- Improved ad targeting, as used by Google, Baidu, and Bing.
- Improved search results on the web.
- Answering natural language questions.
So there are a lot of things you can do with it. Now, how can you do them?
Sadly, there is a (big) shortage of AI expertise, which creates a significant barrier for organizations ready to adopt AI. Normally we do Deep Learning by programming and learning new APIs, some harder than others; some are really easy and expressive, like Keras.
Right now you can use a more expressive way of creating deep learning models. And that’s using Deep Cognition. I’ve talked about it before:
Their platform, Deep Learning Studio, is available as a cloud solution, a Desktop Solution ( http://deepcognition.ai/desktop/ ) where the software runs on your machine, or an Enterprise Solution (Private Cloud or On-Premise).
You can use pre-trained models as well as built-in assistive features that simplify and accelerate the model development process. You can also import model code and edit the model with the visual interface.
The platform automatically saves each model version as you iterate and tune hyperparameters to improve performance. You can compare performance across versions to find your optimal design.
This system is built on the premise of making AI easy for everyone: you don’t have to be an expert to create these complex models. But my recommendation is that you have an idea of what you are doing: read some of the TensorFlow or Keras documentation, watch some videos, and stay informed. If you are an expert in the subject, great! This will make your life much easier, and you can still apply your expertise when building the models.
You can actually download the code that produced the predictions, and as you will see, it is written in Keras. You can then upload the code and test it with the notebook that the system provides, or use it on your laptop or other platforms.
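For readers who have never seen it, exported Keras code tends to look something like the following minimal sketch. The layer sizes and the random input data here are made up for the example; a real export reflects whatever architecture you built in the visual editor.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# A minimal classifier, similar in shape to the Keras code such tools export.
model = keras.Sequential([
    keras.Input(shape=(4,)),                 # four input features
    layers.Dense(16, activation="relu"),
    layers.Dense(3, activation="softmax"),   # three output classes
])
model.compile(optimizer="adam", loss="categorical_crossentropy")

# Run it on random data just to check the wiring: one probability row per sample.
X = np.random.rand(10, 4).astype("float32")
preds = model.predict(X, verbose=0)
print(preds.shape)
```

Because it is plain Keras, you can keep training it, fine-tune it, or drop it into any Python environment, which is exactly the portability the paragraph above is describing.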
The Semantic Layer and Deep Learning
So by connecting the semantic layer, through a platform like Anzo, with a Deep Learning system like Deep Learning Studio, you can accelerate the use of data and AI in your company. This is the path I imagine can work for almost all organizations:
I went ahead and modified the original picture. I think this, with a touch of Python, Spark, and the like, could be the future of data science and data technologies.
I think that this together with a methodology like the Agile Business-Science Problem Framework (ABSPF) can really bring value to an organization from an end-to-end perspective. More on ABSPF:
Agile Framework For Creating An ROI-Driven Data Science Practice
Data Science is an amazing field of research that is under active development both from the academia and the industry…www.business-science.io
I think we can change the world for the better, improve our lives, and improve the way we work, think, and solve problems. If we channel all the resources we have right now to make these areas of knowledge work together for a greater good, we can make a tremendous positive impact on the world and our lives.
This is the beginning of a longer conversation I want to have with you. I hope it helped you get started in this amazing area, or maybe just discover something new.
If this article helped you, please share it with your friends!
If you have questions just follow me on Twitter:
See you there :)
Bio: Favio Vazquez is a physicist and computer engineer working on Data Science and Computational Cosmology. He has a passion for science, philosophy, programming, and music. He is the creator of Ciencia y Datos, a Data Science publication in Spanish. He loves new challenges, working with a good team, and having interesting problems to solve. He is a contributor to Apache Spark, helping with MLlib, Core, and the documentation. He loves applying his knowledge and expertise in science, data analysis, visualization, and machine learning to help the world become a better place.
Original. Reposted with permission.
- A “Weird” Introduction to Deep Learning
- Why do I Call Myself a Data Scientist?
- DIY Deep Learning Projects