Delete interactive map layer data

The Data Client Library provides the class LayerUpdater to perform update operations on interactive map layers.

The LayerUpdater provides three methods:

  • updateLayer(catalogHrn, layerId) specifies the catalog and layer to update.
  • option("olp.connector.interactive-map-context", "EXTENSION") restricts the operation to the extension, so that no deletion is performed in the extended layer.
  • delete(queryString) performs the delete operation according to the query string, which is in RSQL format. The delete call is blocking (synchronous); it returns when the delete operation has finished.
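
A minimal sketch of how these calls fit together is shown below; sparkSession, catalogHrn, and layerId are assumed to be an existing SparkSession, the catalog HRN, and the layer ID, as in the full example in the Examples section.

import com.here.platform.data.client.spark.LayerDataFrameReader.SparkSessionExt
import org.apache.spark.sql.SparkSession

// Select the catalog and interactive map layer to update.
val layerUpdater = sparkSession
  .updateLayer(catalogHrn, layerId)

// Optional: run the delete only in the extension, leaving the extended layer untouched.
layerUpdater.option("olp.connector.interactive-map-context", "EXTENSION")

// Blocks until all objects matching the RSQL query have been deleted.
layerUpdater.delete("mt_id=in=(feature-1,feature-2)")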

Project dependencies

If you want to create an application that uses the HERE platform Spark Connector to delete data from an interactive map layer, add the required dependencies to your project as described in the chapter Dependencies for Spark Connector.
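
For orientation only, an sbt dependency declaration could look like the sketch below; the group ID, artifact name, and version shown here are placeholders, and the actual coordinates must be taken from the Dependencies for Spark Connector chapter.

// Illustrative placeholder coordinates: replace with the coordinates and version
// listed in the "Dependencies for Spark Connector" chapter.
libraryDependencies += "com.here.platform.data.client" %% "spark-support" % "<sdk-version>"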

Examples

The following snippet demonstrates how to delete data from an interactive map layer of a catalog.

Scala
import com.here.platform.data.client.spark.LayerDataFrameReader.SparkSessionExt
import org.apache.spark.sql.SparkSession
val layerUpdater = sparkSession
  .updateLayer(catalogHrn, layerId)
layerUpdater.option("olp.connector.interactive-map-context", "EXTENSION")

// Object IDs are deleted in batches, because only small batches of up to ~500 IDs
// can be deleted per call (see the Note section below).
val deleteBatchSize = 500
val queryStart = "mt_id=in=("

val query = new StringBuilder().append(queryStart)
val it = oidList.iterator
var total: Long = 0
var i = 0
while (it.hasNext) {
  query.append(it.next())
  i += 1
  if (i >= deleteBatchSize) {
    // The batch is full: close the IN-list and delete it.
    query.append(")")
    total += deleteBatchSize
    log.info("Removing batch of " + deleteBatchSize + " objects. (" + total + " removed in total)")
    layerUpdater.delete(query.toString())
    // Start a new query string for the next batch.
    query.clear()
    query.append(queryStart)
    i = 0
  } else {
    if (it.hasNext)
      query.append(",")
  }
}
// Delete the remaining objects that did not fill a complete batch.
if (i > 0) {
  query.append(")")
  log.info("Removing batch of " + i + " objects.")
  total += i
  layerUpdater.delete(query.toString())
}

Java
import com.here.hrn.HRN;
import com.here.platform.data.client.spark.LayerUpdater;
import com.here.platform.data.client.spark.javadsl.JavaLayerUpdater;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import org.apache.spark.sql.SparkSession;

LayerUpdater layerUpdater =
    JavaLayerUpdater.create(sparkSession).updateLayer(catalogHrn, layerId);
layerUpdater.option("olp.connector.interactive-map-context", "EXTENSION");

final String queryStart = "mt_id=in=(";

long total = 0;

int i = 0;
Iterator<String> it = oidList.iterator();
StringBuilder query = new StringBuilder(queryStart);
while (it.hasNext()) {
  query.append(it.next());
  i++;
  if (i >= 500) {
    query.append(")");
    layerUpdater.delete(query.toString());
    query = new StringBuilder(queryStart);
    total += i;
    log.info("Removing batch of " + i + " objects. (" + total + " total elapsed)");
    i = 0;
  } else {
    if (it.hasNext()) query.append(",");
  }
}
if (i > 0) {
  query.append(")");
  // Query string: "mt_id=in=(ID1,ID2,...,IDn)"
  total += i;
  layerUpdater.delete(query.toString());
}

Note

  • An interactive map layer currently only supports deleting objects by their ID; no other query types are supported for this layer type. Queries can be either direct-match or IN-type queries, as illustrated below.
  • For information on RSQL, see RSQL.
  • It is currently only possible to delete objects in small batches of up to ~500 objects, depending on the length of the objects' IDs. However, this limitation will be alleviated in the future.
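
For illustration, assuming two existing features with IDs feature-1 and feature-2, the two supported query shapes look like this (the direct-match operator == is an assumption based on standard RSQL syntax; the IN-type form is the one used in the examples above):

// Direct match: delete the single object with the given ID.
layerUpdater.delete("mt_id==feature-1")

// IN-type query: delete every object whose ID appears in the list.
layerUpdater.delete("mt_id=in=(feature-1,feature-2)")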
