HERE Android SDK Developer's Guide

Natural Language Processing (NLP)

The Natural Language Processing (NLP) feature adds a "natural language" interface to the HERE Android SDK. For example, when the end user says "find me a gas station", the NLP interface detects that the user wants to search for gas stations, triggers a Places search through the HERE SDK, and speaks the results to the user.

Important: NLP is currently offered as a beta feature, and it is only available for use in the English language. APIs may change without notice. Do not use this feature in a commercial application.

NLP covers the main features offered by the HERE Android SDK: search, routing, navigation, and traffic. NLP also saves and handles context for the user's requests. For example, it can handle the command "find 3 gas stations", followed by "take me to the third one".

Initialization

You can start using the NLP feature by initializing the Nlp class after you have successfully initialized the MapFragment in your application.

Nlp is a singleton, so first retrieve its instance and then call init() to initialize the engine. The Nlp.init(Context, MapFragment, CollectionProvider, SpeechToTextProvider, OnInitializationListener) method takes the following input parameters, as demonstrated in the next example:
  • The application context
  • The MapFragment object
  • A CollectionProvider (or null if you do not want to implement a collection feature)
  • A SpeechToTextProvider to enable the Speech Recognition functionality
  • An OnInitializationListener
// Create the Nlp object that controls voice operations
m_nlp = Nlp.getInstance();
m_speechToTextProvider = new MyASR(getApplicationContext());

// Pass the Activity as the Context
m_nlp.init(AppActivity.this, mapFragment,
    m_nlpCollectionProvider, m_speechToTextProvider, m_nlpListener);

private OnInitializationListener m_nlpListener = new OnInitializationListener() {
  @Override
  public void onComplete(Error error) {

    if (error == Error.NONE) {

      m_speechToTextProvider.setNlp(m_nlp);

      // Enable talk-back
      m_nlp.setTalkBackEnabled(true);

      // Set speech volume percentage
      m_nlp.setSpeechVolume(25);
    }
  }
};
Note: Nlp can only be used after it is successfully initialized.
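For example, guard calls on the Nlp instance with the isInitialized() check before passing any utterance to it:

// Only call Nlp APIs after initialization has completed successfully
if (m_nlp.isInitialized()) {
  m_nlp.understand("find me a coffee shop");
}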
Note: If you want to support a collection feature in your application via NLP, implement a CollectionProvider. If the CollectionProvider interface is not implemented, all collection-related utterances result in "feature not supported" announcements to the user. Collection-handling use cases include: saving a found place in a collection, creating a collection, renaming a favorite place, and deleting a collection.
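The following is a minimal sketch of such a provider. Note that the callback names used here (onCreateCollection and onSavePlace) are hypothetical placeholders, not the actual CollectionProvider interface; consult the API reference for the real method signatures.

// Hypothetical sketch only: the real CollectionProvider interface defines
// its own callbacks; see the API reference for the actual signatures.
public class MyCollectionProvider implements CollectionProvider {

  // Hypothetical callback for an utterance such as "create a collection"
  public void onCreateCollection(String name) {
    // Create the collection in the application's own storage
  }

  // Hypothetical callback for "save this place to my favorites"
  public void onSavePlace(String collectionName, PlaceLink place) {
    // Persist the found place in the named collection
  }
}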

Speech Recognition

To listen to the user's voice commands, you need to create your own speech recognition class by implementing the SpeechToTextProvider interface. The example below uses the Android SpeechRecognizer API, but your application can use any available Automatic Speech Recognition (ASR) engine. Once the ASR results are received, the recognized text is passed to NLP to be analyzed and understood using the Nlp.understand(String) API.

If the SpeechToTextProvider interface is not implemented, NLP is not able to automatically start listening when it asks the user a question for clarification or confirmation.

It is also recommended to use the Nlp.startListening() API instead of calling your speech recognizer's start() method directly. This allows NLP to automatically stop speaking navigation instructions when the application wants to start listening to the user's speech.
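For example, a microphone button handler might look like the following sketch; m_micButton is a hypothetical button in your layout, and m_nlp is the initialized Nlp instance from the previous section.

// Sketch: start voice input through NLP rather than through the recognizer.
// m_micButton is a hypothetical button in the application layout.
m_micButton.setOnClickListener(new View.OnClickListener() {
  @Override
  public void onClick(View v) {
    if (m_nlp.isInitialized()) {
      // NLP silences spoken navigation instructions before listening starts
      m_nlp.startListening();
    }
  }
});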

import java.util.ArrayList;
import java.util.Locale;

import android.content.Context;
import android.content.Intent;
import android.os.Bundle;
import android.speech.RecognitionListener;
import android.speech.RecognizerIntent;
import android.speech.SpeechRecognizer;

// HERE SDK imports for Nlp and SpeechToTextProvider are omitted;
// see the API reference for the package names.

public class MyASR implements SpeechToTextProvider {

  private Context m_context = null;
  private volatile SpeechRecognizer m_stt = null;
  private Nlp m_nlp = null;

  /**
   * Create the speech recognizer
   */
  MyASR(Context context) {
    m_context = context;

    if (m_stt == null) {
      // Create an Android SpeechRecognizer to listen to the user's utterances
      m_stt = SpeechRecognizer.createSpeechRecognizer(m_context);
      m_stt.setRecognitionListener(m_sttListener);
    }
  }

  /**
   * Store the Nlp instance so that recognized text can be passed to it
   */
  public void setNlp(Nlp nlp) {
    m_nlp = nlp;
  }

  /**
   * Schedule to start listening
   */
  @Override
  public synchronized void start() {

    final Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
    intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, Locale.US.toString());
    intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_PREFERENCE, Locale.US.toString());
    intent.putExtra(RecognizerIntent.EXTRA_MAX_RESULTS, 4);

    try {
      m_stt.startListening(intent);
    } catch (Exception e) {
      destroy();
    }
  }

  /**
   * Release the underlying Android recognizer
   */
  public synchronized void destroy() {
    if (m_stt != null) {
      m_stt.destroy();
      m_stt = null;
    }
  }

  /**
   * Android SpeechRecognizer listener
   */
  private final RecognitionListener m_sttListener = new RecognitionListener() {
    @Override
    public void onResults(final Bundle results) {
      synchronized (MyASR.this) {

        final ArrayList<String> data =
            results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);

        if (data != null && !data.isEmpty() && m_nlp.isInitialized()) {
          m_nlp.understand(data.get(0));  // Pass the utterance to NLP for analysis
        }
      }
    }

    // The remaining RecognitionListener methods must also be implemented;
    // empty implementations are sufficient for this example.
    @Override public void onReadyForSpeech(Bundle params) { }
    @Override public void onBeginningOfSpeech() { }
    @Override public void onRmsChanged(float rmsdB) { }
    @Override public void onBufferReceived(byte[] buffer) { }
    @Override public void onEndOfSpeech() { }
    @Override public void onError(int error) { }
    @Override public void onPartialResults(Bundle partialResults) { }
    @Override public void onEvent(int eventType, Bundle params) { }
  };
}
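Note that the Android SpeechRecognizer requires the RECORD_AUDIO permission, so declare android.permission.RECORD_AUDIO in your manifest and, on Android 6.0 and later, also request it at runtime. The following is a minimal sketch using the standard Android support APIs; REQUEST_RECORD_AUDIO is an arbitrary request code chosen by the application.

// Request the RECORD_AUDIO permission required by SpeechRecognizer
private static final int REQUEST_RECORD_AUDIO = 1;

private void ensureAudioPermission(Activity activity) {
  if (ContextCompat.checkSelfPermission(activity, Manifest.permission.RECORD_AUDIO)
      != PackageManager.PERMISSION_GRANTED) {
    ActivityCompat.requestPermissions(activity,
        new String[] { Manifest.permission.RECORD_AUDIO }, REQUEST_RECORD_AUDIO);
  }
}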

Using the Nlp Class

To receive callbacks when users say different utterances, set listeners after Nlp has been successfully initialized. For example, if you want to know when the user says "search for restaurants", implement the OnSearchListener callback. All available listeners are listed in the Nlp class definition.

The following is an example of how to set an OnSearchListener:

private OnInitializationListener m_nlpListener = new OnInitializationListener() {
  @Override
  public void onComplete(Error error) {

    if (error == Error.NONE) {

      // Register the search listener
      m_nlp.addListener(m_searchListener);
    }
  }
};

private OnSearchListener m_searchListener = new OnSearchListener() {
  @Override
  public void onStart(final String subject, final GeoBoundingBox box) {
    android.util.Log.d(TAG, "onStart: Search STRING start event");
  }

  @Override
  public void onStart(final CategoryFilter filter, final GeoBoundingBox box) {
    android.util.Log.d(TAG, "onStart: Search CATEGORY start event");
  }

  @Override
  public void onStart(final GeoCoordinate center) {
    android.util.Log.d(TAG, "onStart: Search REVERSE start event");
  }

  @Override
  public void onComplete(final Error error,
               final String searchString,
               final String whereString,
               final String nearString,
               List<PlaceLink> placeLinks) {
    android.util.Log.d(TAG, "onComplete: Search results are available");
    if (error == Error.NONE) {
      // Show all found places on the map.
    }
  }
};
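In onComplete(), the returned PlaceLink objects can then be drawn on the map. The following is a minimal sketch, assuming m_map is the Map retrieved from the MapFragment (for example, via mapFragment.getMap()):

// Show each found place on the map with a default marker.
// m_map is assumed to have been retrieved from the MapFragment.
if (error == Error.NONE && placeLinks != null) {
  for (PlaceLink link : placeLinks) {
    m_map.addMapObject(new MapMarker(link.getPosition()));
  }
}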