It’s been a while since Salesforce announced the arrival of Salesforce Einstein. With the phrase “AI for everyone” they revealed their AI platform to the world.

In the months since, more and more details about the platform have been revealed, and a few modules about it have been published on Trailhead. The current road map is about integrating selected AI features directly into the platform; some features are already generally available (GA). Examples of this are:

  • Recommended follow-ups (Sales Cloud)
  • Predictive scoring (Marketing Cloud)
  • Predictive audiences (Marketing Cloud)
  • Recommended experts (Community Cloud)
  • Automated service escalation (Community Cloud)
  • Smart data discovery (Analytics Cloud)
  • Product recommendations (Commerce Cloud)
  • Predictive email (Commerce Cloud)

And more.

One of the most important features of Salesforce Einstein is the promise of integrating AI through APIs, making it possible to integrate any kind of AI solution into the platform. For this, Salesforce currently offers two paths under the Salesforce Einstein suite:

Prediction Vision Service

This service is used for image recognition and is integrated into Salesforce by installing a few Apex classes from Git. Through MetaMind, a library of training sets is provided that can be used to train the classifier.
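To illustrate what a prediction call could look like, here is a minimal Python sketch that assembles a request for an image classification. The endpoint URL and field names (`modelId`, `sampleLocation`) are assumptions for illustration, not the documented API of the service:

```python
import json

# Hypothetical sketch: endpoint and field names are assumptions,
# not the actual Prediction Vision Service contract.
PREDICT_URL = "https://api.example.com/v1/vision/predict"  # placeholder endpoint

def build_predict_request(model_id, image_url):
    """Assemble the JSON body for a hypothetical image-prediction call."""
    return {
        "modelId": model_id,          # identifier of the trained classifier
        "sampleLocation": image_url,  # URL of the image to classify
    }

body = json.dumps(build_predict_request("GeneralImageClassifier",
                                        "https://example.com/cat.jpg"))
print(body)
```

In a real integration, the Apex classes would perform this same request as an HTTP callout to the service.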


PredictionIO

The idea behind PredictionIO is that it provides a framework in which customizable templates are used to plug in AI for any given scenario. You don't have to focus on the overall architecture; you can dive straight into applying your own logic, either by creating your own implementation or by customizing an existing template. Currently, the PredictionIO template gallery contains templates in, among others, the following categories: recommenders, classification, clustering, and NLP.
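The component-based idea can be sketched as follows. This is an illustrative Python sketch, not PredictionIO's actual Scala-based API; the class and method names are assumptions chosen to mirror the template structure (a data source, an algorithm, and a serving layer):

```python
# Illustrative sketch of pluggable engine components, PredictionIO-style.
# All names here are assumptions for illustration.

class DataSource:
    def read(self):
        # toy training data: (feature, label) pairs
        return [(1.0, "low"), (2.0, "low"), (8.0, "high"), (9.0, "high")]

class Algorithm:
    def train(self, data):
        # naive "model": threshold at the mean feature value
        self.threshold = sum(x for x, _ in data) / len(data)

    def predict(self, query):
        return "high" if query >= self.threshold else "low"

class Serving:
    def serve(self, query, predictions):
        # with several algorithms this could combine results; here, pass through
        return predictions[0]

# wiring the components together, as a template would
algo = Algorithm()
algo.train(DataSource().read())
print(Serving().serve(7.0, [algo.predict(7.0)]))  # prints "high"
```

Swapping in a different algorithm or data source only requires replacing the corresponding component, which is the appeal of the template approach.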

The architecture, as described on their website, looks like this:

PredictionIO Architecture


Salesforce integration

At a high level, both paths integrate in the same way:

Einstein highlevel architecture


This is in essence the same architecture as described here. The generic “AI Server” can run, for example, on Heroku, or you can set up your own server. An Apex service class fulfills the integration by providing the implementation of the required interfaces. This will typically consist of two APIs: one to train and one to predict.
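A minimal sketch of the AI-server side of this train/predict contract, using a toy nearest-centroid classifier. The function names and payload shapes are illustrative assumptions; in a real setup these would be HTTP endpoints called from the Apex service class:

```python
# Sketch of the two operations an "AI Server" exposes in this architecture:
# one to train, one to predict. Names and payloads are illustrative.

MODEL = {}  # in-memory model store, keyed by model id

def train_endpoint(payload):
    """Train a toy nearest-centroid model from labelled points."""
    groups = {}
    for x, label in payload["examples"]:
        groups.setdefault(label, []).append(x)
    MODEL[payload["modelId"]] = {
        label: sum(xs) / len(xs) for label, xs in groups.items()
    }
    return {"status": "trained", "modelId": payload["modelId"]}

def predict_endpoint(payload):
    """Classify a point by its nearest centroid."""
    centroids = MODEL[payload["modelId"]]
    label = min(centroids, key=lambda l: abs(payload["value"] - centroids[l]))
    return {"label": label}

train_endpoint({"modelId": "m1",
                "examples": [(1.0, "low"), (2.0, "low"), (8.0, "high")]})
print(predict_endpoint({"modelId": "m1", "value": 7.5})["label"])  # prints "high"
```

The Apex service class would serialize records into the train payload and forward individual queries to the predict endpoint.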

AI for everyone

With this slogan, Salesforce Einstein was introduced. The slogan is supported by the following:

  • AI functionality directly integrated into the various Salesforce clouds, as described above
  • Easy API integration through MetaMind or PredictionIO

The idea is that with easy API integration, AI becomes available to everyone. However, integration nowadays is typically not an issue anymore. With the rise of various cloud platforms (e.g. AWS, Heroku, or Azure), services of any kind can easily be exposed and called in a secure manner. Given this, AI itself was already available to everyone: you just need to expose it as a service and call it. As described in my other machine learning post, it's just as easy to open up all of Python's SciPy functionality as a service, using an architecture similar to the one described above.

The use of pre-trained models is another step in lowering the barrier to using AI-as-a-service. However, the pre-trained model must be specific enough for the given situation to produce optimal predictions. This can work very well in, for example, image recognition, but for a particular classification case it is very unlikely, which means you have to train your own classifier. While calling the APIs is not difficult (the parameters to use are not necessarily complicated), it is not always straightforward to create the optimal model. You need a good understanding of the requirement you want to solve with AI, come up with the pre-processing steps, try different algorithms or specify the algorithm(s) to use, and come up with a strategy for finding the best settings for training the algorithm. Lacking knowledge in this area will lead to a suboptimal or counterproductive model.
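Such a tuning strategy can be sketched as a simple search over candidate settings evaluated on held-out data. The toy one-feature classifier and its threshold below are purely illustrative:

```python
# Illustrative sketch of a tuning strategy: evaluate a range of candidate
# settings on held-out data and keep the best one. Here the "setting" is a
# single decision threshold for a toy one-feature classifier.

def accuracy(threshold, data):
    correct = sum(1 for x, label in data
                  if ("high" if x >= threshold else "low") == label)
    return correct / len(data)

validation = [(1.0, "low"), (3.0, "low"), (6.0, "high"), (9.0, "high")]
candidates = [2.0, 4.0, 5.0, 7.0]

best = max(candidates, key=lambda t: accuracy(t, validation))
print(best, accuracy(best, validation))
```

Real model selection involves more dimensions (algorithm choice, pre-processing, multiple hyperparameters, cross-validation), but the principle is the same, and it is exactly this search that requires the understanding described above.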

© 2016