Making Deep Learning User-Friendly, Possible?
- DL frameworks (Theano, Caffe, TensorFlow, MXNet)
- Meta-frameworks or APIs (Keras, ONNX?)
- Open models (ResNet, LeNet, VGG16, SqueezeNet, ...) that can be used for transfer learning (see the sketch after this list)
- Jupyter Notebooks
- No programming -> better, especially for people who don't program much or don't know the specific Deep Learning frameworks
- Intuitive graphical editor to design/modify the Deep Learning architecture -> easier to use and better suited to less advanced users
- better productivity - users are more efficient and creative
- models are more easily understandable and easier to modify/maintain.
- having an efficient development/experiment environment
- integrated environment to perform model inference,
- DL model performance monitoring
- easy deployment of the trained models
- automation in the management of files produced by the DL model construction process.
- students / self-learners / teachers
- SME companies
- Engineers / Researchers who want to master the technology but need a tool that will make them more productive
- Researchers in Deep Learning
- Deep Learning engineers, especially in advanced uses and production contexts
- Deep Learning Studio from DEEP COGNITION
- Neural Network Console from SONY
- Neural Network Modeler by IBM (previous names: IBM Deep Learning IDE and DARVIZ)
- NVIDIA Digits
- Tensorboard (part of the TensorFlow environment)
- SN Neural Network Simulator from Neuristique (http://leon.bottou.org/projects/neuristique), in which Yann LeCun was involved
- SNNS (http://www.ra.cs.uni-tuebingen.de/SNNS/) from the Institute for Parallel and Distributed High Performance Systems (IPVR) of the University of Stuttgart
- and others.
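
Open models such as VGG16 can be reused for transfer learning: the pre-trained convolutional layers are frozen and only a new classification head is retrained on the target data. A minimal sketch in Python with Keras, assuming a 10-class problem and 224x224 RGB inputs (both illustrative):

from tensorflow.keras.applications import VGG16
from tensorflow.keras import layers, models

# Load VGG16 with ImageNet weights, without its original classifier head
base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # freeze the pre-trained convolutional layers

# Add a new, trainable classification head for the target task
model = models.Sequential([
    base,
    layers.Flatten(),
    layers.Dense(256, activation="relu"),
    layers.Dense(10, activation="softmax"),  # 10 target classes (illustrative)
])

model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_images, train_labels, epochs=5)  # then train on the new dataset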


- Make designing/modifying the architecture of DL easier
- Provides all the DL layers of the Keras API, plus a few technical ones (e.g. merge)
- Drag and drop and copy/paste help in building large networks
- Allow for easy configuration of each layer
- Automatic checking of the coherence of the constructed DL network
- Pre-trained Keras layers: SqueezeNet is available! The last fully connected layer can be retrained, making it easy to implement transfer learning (as sketched above)
- The restore-model feature (which actually restores the data preprocessing and hyper-parameters as well) is very useful when making several trials of data preprocessing, architecture and learning hyper-parameters.
- integration of pre-trained Keras models
- AutoML (see further down)
- DATA - to load the datasets and pre-process them
- HYPERPARAMETERS - to edit the training hyper-parameters
- TRAINING - to start/stop and monitor training
- RESULTS - to analyse and compare results from several experiments
- INFERENCE / DEPLOY - to test the model and deploy it
- Notebook - to use/program Jupyter notebooks
- Environments - to manage and use ML environments from the command line
- Deployments - to manage the deployed DL models constructed with DLS
- plus a few more practically oriented sections
- File Browser - to manage the files in the environment (cloud or local)
- Datasets - to preprocess and load datasets so they are available as model inputs
- Forum - to get help from support and other DLS users
- Videos - to access training videos
- Support - to get support from Deep Cognition
- Have several projects training at the same time and still be able to work on other projects
- AutoML
- Jupyter notebook and "command line" programming are provided
- possibility to define custom loss functions (objective functions) (see the sketch after this list)
- the cloud and desktop versions are the "same" (there may be just a few technical differences) and, further to that, the desktop version is available for FREE.
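
Defining a custom loss (objective) function usually just means writing a small function of the true and predicted tensors and passing it to compile(). A minimal sketch in Python with Keras; the asymmetric weighting is purely an illustrative choice, not a DLS feature:

import tensorflow as tf
from tensorflow.keras import layers, models

def weighted_mse(y_true, y_pred):
    # Penalize under-prediction twice as much as over-prediction (illustrative choice)
    error = y_true - y_pred
    weight = tf.where(error > 0, 2.0, 1.0)
    return tf.reduce_mean(weight * tf.square(error))

model = models.Sequential([
    layers.Dense(32, activation="relu", input_shape=(8,)),
    layers.Dense(1),
])
model.compile(optimizer="adam", loss=weighted_mse)  # pass the custom objective directly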





- Trained models saved in the Keras H5 format (a reload sketch follows this list)
- DLS model in .yaml format containing a DLS specific description of the model
- Apache MXNet
- Keras-like API
- possibility to load other pre-trained Keras models -> currently you can't make use of models from model zoos
- export of the model to Python/Keras code
- viewing the performance with a confusion matrix when there is a classification problem
- detailed documentation (not a big problem since the environment is pretty intuitive to use)
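
Because the trained models are saved in the standard Keras H5 format, they can be reloaded outside the tool for inference or further work. A minimal sketch in Python with Keras; the file name model.h5 and the 28x28x1 input shape are assumptions for illustration:

import numpy as np
from tensorflow.keras.models import load_model

# Reload a model exported in the Keras H5 format (file name is illustrative)
model = load_model("model.h5")
model.summary()

# Run inference on a single example with the shape the model expects (assumed 28x28x1 here)
sample = np.random.rand(1, 28, 28, 1).astype("float32")
prediction = model.predict(sample)
print(prediction.argmax(axis=-1))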

- logical processing layers are provided - LogicalAnd, LogicalOr, ...
- loop layers are provided (but no example is given) - useful to build residual networks and recurrent neural networks (see the sketch after this list)
- different data-preprocessing can be done with layers in the model or in the Dataset tab.
- the hyper-parameters can be set in the CONFIG tab
- the Evaluation tab has a useful confusion matrix for classification problems
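
Loop/skip-style layers are what residual networks are built from: a block's input is added back to its output. A minimal sketch of one residual block in Python with the Keras functional API; the filter count and input shape are illustrative and not taken from NNC:

from tensorflow.keras import layers, models

inputs = layers.Input(shape=(32, 32, 64))

# Two convolutions, then add the block input back in (the "skip" connection)
x = layers.Conv2D(64, 3, padding="same", activation="relu")(inputs)
x = layers.Conv2D(64, 3, padding="same")(x)
x = layers.Add()([x, inputs])     # residual connection
x = layers.Activation("relu")(x)

block = models.Model(inputs, x)
block.summary()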

- DASHBOARD - to monitor resource usage
- DATASET - to preprocess and load datasets so they are available as model inputs
- JOB HISTORY - to monitor finished training jobs
- the underlying framework is, for now, pretty "confidential": there is hardly any echo of it being used outside SONY
- the list of layers is restricted and "non-standard" (insofar as the Keras layer API can be considered a standard)
- pre-trained DL models are not available as layers, so transfer learning might not be that easy to set up
- loading trained Keras models is not possible
- no mechanism for easily deploying the DL models is provided (a minimal serving sketch follows this list)
- For a video on NNC see: https://www.youtube.com/watch?v=-lXjnaUSEtM
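
Deploying a trained model often amounts to wrapping it behind a small prediction endpoint. A minimal sketch in Python with Flask and Keras; the file name, route and JSON format are assumptions for illustration, not a mechanism provided by NNC:

import numpy as np
from flask import Flask, request, jsonify
from tensorflow.keras.models import load_model

app = Flask(__name__)
model = load_model("model.h5")  # file name is illustrative

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body like {"inputs": [[...], ...]} matching the model's input shape
    data = np.array(request.get_json()["inputs"], dtype="float32")
    preds = model.predict(data)
    return jsonify({"predictions": preds.tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)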

- automatic optimal architecture search
- different data-preprocessing can be done with layers in the model or in the Dataset tab.
- the hyper-parameters can be optimized by a tool in the Watson Studio suite (see the search sketch after this list)
- integrated into a Data Science / Machine Learning suite, so it could potentially benefit from other functionalities of the suite, such as the deployment mechanism
- only produces the code of the DL model; all the other functionalities are somewhere in the Watson Studio suite (you have to have, and know, the suite)
- only available in the cloud (and maybe soon only available in the Watson Studio suite?)
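
In its simplest form, hyper-parameter optimization is a loop that trains the same architecture with different settings and keeps the best one. A minimal random-search sketch in Python with Keras on synthetic data; the search space and network are illustrative and not the Watson Studio mechanism:

import random
import numpy as np
from tensorflow.keras import layers, models, optimizers

# Synthetic data purely for illustration
x_train, y_train = np.random.rand(200, 20), np.random.randint(0, 2, 200)
x_val, y_val = np.random.rand(50, 20), np.random.randint(0, 2, 50)

def build_model(units, learning_rate):
    model = models.Sequential([
        layers.Dense(units, activation="relu", input_shape=(20,)),
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer=optimizers.Adam(learning_rate=learning_rate),
                  loss="binary_crossentropy", metrics=["accuracy"])
    return model

best_acc, best_config = 0.0, None
for _ in range(10):  # try 10 random configurations
    config = {"units": random.choice([16, 32, 64, 128]),
              "learning_rate": 10 ** random.uniform(-4, -2)}
    model = build_model(**config)
    model.fit(x_train, y_train, epochs=5, verbose=0)
    _, acc = model.evaluate(x_val, y_val, verbose=0)
    if acc > best_acc:
        best_acc, best_config = acc, config

print(best_config, best_acc)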
