
HERE Workspace & Marketplace 2.13 release

Highlights

 

Submit multiple VINs to expedite Neutral Server Consent Management

As a Marketplace consumer subscribed to Neutral Server listings, you can now submit multiple VINs (Vehicle Identification Numbers) when creating a new consent request or editing an existing one. A list of VINs can be submitted programmatically as a single CSV or JSON file, enabling you to process many VINs at once through an external API. The Consent Manager simplifies the integration between HERE and Data Consumers' systems and reduces manual steps when using Neutral Server to request access to PII data.
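For illustration only, a submitted VIN list might look like the sketch below. These release notes do not specify the exact file layout, so the JSON field name and the sample VINs are hypothetical placeholders.

CSV (one VIN per line):

    1HGCM82633A004352
    5YJSA1E26HF000337
    WVWZZZ1JZXW000010

JSON (with a hypothetical "vins" field):

    {
      "vins": [
        "1HGCM82633A004352",
        "5YJSA1E26HF000337",
        "WVWZZZ1JZXW000010"
      ]
    }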

 

Delete catalog versions to manage storage costs of your Versioned layer data

Deleting catalog versions helps you manage storage costs and gives you more granular lifecycle management controls for your Versioned layer data. Remove older versions manually or automatically to control how long your Versioned data is stored. Deleting a catalog version removes data from Versioned layers only, in a way that does not break existing dependencies between different Versioned layers, and it preserves catalog configuration information and therefore the overall data integrity of the Versioned layers within a catalog. You will find this functionality via the "More" menu on the Catalog Details page as long as the catalog contains Versioned layers.

Example of manual deletion controls:

 

Access additional HERE Map Content: Advanced Navigation Attributes via the Optimized Map for Location Library

The "Railroad Crossing" attribute from the Advanced Navigation Attributes layer in the HERE Map Content catalog is now also available in the Optimized Map for Location Library. We will continue to iteratively compile HERE Map Content attributes into the Optimized Map for Location Library, enabling fast, simplified direct access to these attributes via the Location Library. For more detail on the added attribute, see: HERE Map Content - Advanced Navigation Attributes layer

 

Location Referencing: encode/decode locations in the OpenLR format regardless of source or target map format

The popular OpenLR open standard for Dynamic Location Referencing has been added to the Location Library. Events and locations can be encoded and decoded in this format, allowing locations localized on one map to be found on another map to which the data have been transferred, even if the two map formats are completely different.

 

Centralized SDKs, Tools, and examples to improve discoverability

Following the addition of new SDKs to the platform in the last release, the platform web pages have been restructured with this release. platform.here.com/sdk now lists all the platform SDKs, Tools and the CLI.

The Java and Scala examples that were previously distributed with the CLI are now published on GitHub at https://github.com/heremaps/here-workspace-examples-java-scala. Note that openlocation.here.com/resources/code-examples still has Java and Scala examples that are not yet published to this GitHub repository.

The CLI now has its own page: https://platform.in.here.com/sdk/cli.

If you are new to the platform and are a Windows user, a CLI installer is now available at platform.here.com/cli to make setup easier.

 

Get information about scheduled maintenance on the Status Page

The HERE Status page at https://status.here.com/status/status now supports information about scheduled maintenance. Organization Account Owners are notified of maintenance, as are any additional users within an organization who have opted in to receive Change Notifications in the support portal.

 

Changes, Additions and Known Issues

SDK for Java and Scala

For more information about the SDK for Java and Scala, please visit the HERE platform changelog.

 

SDK for C++

For more information about the SDK for C++, please visit the HERE platform changelog.

Added: You can now authenticate via the SignInFederated API and request an authentication token using your own request body.

Added: You can now download metadata for a specified list of partitions. You can also request the additionalFlags you are interested in, for example: data size, compressed data size, CRC, or checksum.

Added: The SDK lets you retrieve data via the GetData API with a TileKey. The metadata for surrounding tiles is downloaded in one request, which enables you to retrieve map tiles faster.

 

SDK for TypeScript

For more information about the SDK for TypeScript, please visit the HERE platform changelog.

Added: The SDK for TypeScript now enables you to read data from Stream layers.

 

SDK for Python

Added: The SDK for Python installation has been updated with simpler steps using Conda. Refer to the SDK for Python Setup Guide for the latest updates.

 

Web & Portal

Issue: The custom run-time configuration for a Pipeline Version has a limit of 64 characters for the property name and 255 characters for the value.
Workaround: For the property name, you can define a shorter name in the configuration and map it to the actual, longer name within the pipeline code, as in the sketch below. For the property value, you must stay within the limit.
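As an illustration of that workaround only (the keys, long names, and class below are hypothetical, not part of any HERE API), a pipeline can translate short configuration keys back into the longer names it uses internally:

    import java.util.Map;

    public class RuntimeConfigMapper {
        // Hypothetical mapping from short keys (within the 64-character limit,
        // as entered in the run-time configuration) to the longer names the
        // pipeline code uses internally.
        private static final Map<String, String> SHORT_TO_LONG = Map.of(
            "out.scheme", "output.catalog.versioned.layer.partitioning.scheme",
            "win.min", "stream.processing.window.duration.minutes"
        );

        // Translate a short property name read from the run-time configuration
        // into the pipeline's internal long name; unknown keys pass through.
        public static String longName(String shortName) {
            return SHORT_TO_LONG.getOrDefault(shortName, shortName);
        }
    }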

Issue: Pipeline Templates can't be deleted from the Portal UI.
Workaround: Use the CLI or API to delete Pipeline Templates.

Issue: In the Portal, new jobs and operations are not automatically added to the list of jobs and operations for a pipeline version while the list is open for viewing.
Workaround: Refresh the Jobs and Operations pages to see the latest job or operation in the list.

 

Projects & Access Management

Issue: A finite number of access tokens (approximately 250) is available for each app or user. Depending on the number of resources included, this number may be smaller.
Workaround: Create a new app or user if you reach the limitation.

Issue: Only a finite number of permissions are allowed for each app or user in the system across all services. This number is reduced depending on the resources included and the types of permissions granted.

Issue: All users and apps in a group are granted permissions to perform all actions on any pipeline associated with that group. There is no support for users or apps with limited permissions. For example, you cannot have a reduced role that can only view pipeline status, but not start and stop a pipeline.
Workaround: Limit the users in a pipeline's group to only those users who should have full control over the pipeline.

Issue: When updating permissions, it can take up to an hour for changes to take effect.

Issue: Projects and all resources in a Project are designed for use only in Workspace and are unavailable for use in Marketplace. For example, a catalog created in a Platform Project can only be used in that Project. It cannot be marked as "Marketplace ready" and cannot be listed in the Marketplace.
Workaround: Do not create catalogs in a Project when they are intended for use in both Workspace and Marketplace.

 

Data

Deprecated: For security reasons, the platform will start validating schema reference changes in layer configurations as of September 30, 2020. Schema validation will check if the user or application trying to make a layer configuration change indeed has at least read access to the existing schema associated with that layer (i.e. a user or application cannot reference or use a schema they do not have access to). If a non-existing or non-accessible schema is associated with any layer after this date, any attempt to update any configurations of that layer will fail. Please ensure all layers refer only to real, existing schemas, or contain no schema reference at all before September 30, 2020. It is possible to use the Config API to remove or altogether change schemas associated with layers to resolve these invalid schema/layer associations. Also, any CI/CD jobs referencing non-existing or non-accessible schemas will need to be updated by this date or they will fail.

Issue: The changes released with 2.9 (RoW) and 2.10 (China) to add OrgID to Catalog HRNs, and with 2.10 (Global) to add OrgID to Schema HRNs, could impact any use case (CI/CD or other) where comparisons are performed between HRNs used by various workflow dependencies. For example, requests to compare the HRNs a pipeline is using against those a Group, User, or App has permissions to will result in errors if the comparison expects results to match the old HRN construct. With this change, Data APIs will return only the new HRN construct, which includes the OrgID (e.g. olp-here…), so a comparison between the old HRN and the new HRN will be unsuccessful.

  • Reading from and writing to Catalogs using old HRNs is not broken and will continue to work until July 31, 2020.
  • Referencing old Schema HRNs is not broken and will work into perpetuity.

Workaround: Update any workflows comparing HRNs to perform the comparison against the new HRN construct, including OrgID, as in the sketch below.
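A minimal sketch of that comparison, assuming the HRN layout visible elsewhere in these notes (hrn:&lt;partition&gt;:&lt;service&gt;:&lt;region&gt;:&lt;OrgID&gt;:&lt;resource&gt;, with an empty OrgID field in the old construct); the helper below is illustrative and not part of any HERE library:

    // Illustrative helper (not a HERE API): normalize an HRN to the new
    // construct by filling in a known OrgID when the field is empty, then
    // compare the normalized strings.
    public final class HrnCompare {

        static String withOrgId(String hrn, String orgId) {
            String[] parts = hrn.split(":", -1);
            if (parts.length >= 6 && parts[4].isEmpty()) {
                parts[4] = orgId; // old construct: OrgID field was empty
            }
            return String.join(":", parts);
        }

        public static void main(String[] args) {
            String orgId = "olp-here"; // your organization's OrgID
            String oldHrn = "hrn:here:data:::olp-traffic-1";
            String newHrn = "hrn:here:data::olp-here:olp-traffic-1";
            // Both sides are normalized to the new construct before comparing.
            System.out.println(withOrgId(oldHrn, orgId).equals(withOrgId(newHrn, orgId))); // true
        }
    }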

Issue: Versions of the Data Client Library prior to 2.9 did not compress or decompress data correctly per the configuration set on Stream layers. We changed this behavior in 2.9 to strictly adhere to the compression setting in the Stream layer configuration, but in doing so we broke backward compatibility: data ingested and consumed via different Data Client Library versions will likely fail. The Data Client Library will throw an exception and, depending on how your application handles this exception, this could lead to an application crash or downstream processing failure. This adverse behavior is due to inconsistent compression and decompression of the data by the different Data Client Library versions. 2.10 introduces more tolerant behavior that correctly detects whether stream data is compressed and handles it accordingly.

Workaround: If you are using compressed Stream layers and streaming messages smaller than 2 MB, use the 2.8 SDK until you have confirmed that all of your customers are using at least the 2.10 SDK, where this Data Client Library issue is resolved; then upgrade to the 2.10 version for the writing side of your workflow.
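The 2.10 behavior described above amounts to checking whether a payload is actually compressed before decompressing it. The sketch below is not the Data Client Library's implementation; it only illustrates that kind of detection, assuming GZIP-compressed payloads.

    import java.io.ByteArrayInputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.util.zip.GZIPInputStream;

    public class TolerantDecompression {
        // GZIP streams start with the two magic bytes 0x1f 0x8b.
        static boolean looksGzipCompressed(byte[] payload) {
            return payload.length >= 2
                    && (payload[0] & 0xff) == 0x1f
                    && (payload[1] & 0xff) == 0x8b;
        }

        // Decompress only when the payload is actually compressed, instead of
        // trusting the layer configuration alone.
        static InputStream open(byte[] payload) throws IOException {
            InputStream raw = new ByteArrayInputStream(payload);
            return looksGzipCompressed(payload) ? new GZIPInputStream(raw) : raw;
        }
    }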

Issue: Searching for a schema in the Portal using the old HRN construct will return only the latest version of the schema. The Portal does not currently show older versions tied to the old HRN.

Workaround: Search for schemas using the new HRN construct, or look up older versions of schemas by the old HRN construct using the OLP CLI.

Issue: Visualization of Index layer data is not yet supported.

 

Pipelines

Issue: A pipeline failure or exception can sometimes take several minutes to be reported.

Issue: Pipelines can still be activated after a catalog is deleted.
Workaround: The pipeline will fail when it starts running and will show an error message about the missing catalog. Re-check the missing catalog or use a different catalog.

Issue: If several pipelines consume data from the same Stream layer and belong to the same Group (pipeline permissions are managed via a Group), then each of those pipelines will only receive a subset of the messages from the stream. This is because, by default, the pipelines share the same Application ID.
Workaround: Use the Data Client Library to configure your pipelines to consume from a single stream. If your pipelines/applications use the Direct Kafka connector, you can specify a Kafka Consumer Group ID per pipeline/application; if the Kafka consumer group IDs are unique, the pipelines/applications will be able to consume all the messages from the stream (see the sketch below).
If your pipelines use the HTTP connector, we recommend creating a new Group for each pipeline/application, each with its own Application ID.
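The sketch below illustrates the underlying Kafka concept with the plain Apache Kafka consumer API, not the Data Client Library's own configuration; the broker address, topic name, and group ID are placeholders. A pipeline/application with its own group.id receives the full stream, while consumers that share a group.id split the partitions between them.

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.ByteArrayDeserializer;

    public class StreamLayerReader {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092");   // placeholder
            // A unique group.id per pipeline/application means each one sees
            // every message; sharing a group.id splits messages between them.
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "pipeline-a-consumer-group");
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class.getName());

            try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("stream-layer-topic"));               // placeholder topic
                ConsumerRecords<byte[], byte[]> records = consumer.poll(Duration.ofSeconds(5));
                for (ConsumerRecord<byte[], byte[]> record : records) {
                    System.out.printf("offset=%d, %d bytes%n", record.offset(), record.value().length);
                }
            }
        }
    }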

Issue: The Pipeline Status Dashboard in Grafana can currently be edited by users. Any changes made by a user will be lost when updates are published in future releases, and in a future release the dashboard will no longer be editable by users.
Workaround: Duplicate the dashboard or create a new dashboard.

Issue: For Stream pipeline versions running in high-availability mode, the selection of the primary Job Manager can fail in rare scenarios.
Workaround: Restart the stream pipeline.

 

HERE Content

Deprecated: Within the Real Time Traffic catalog (https://platform.here.com/data/hrn:here:data::olp-here:olp-traffic-1), the Incidents2.4 layer is being deprecated and has been labeled as such since August 2019. This is a step in the effort to migrate to the new Delta Incidents layer, which is now available in the same catalog. New incidents will no longer be published to the Incidents2.4 layer as of April 30, 2020. Once this happens, the volatile data will contain no incidents after their one-hour time-to-live expires. As this is a schema change, the layer description contains instructions on how to migrate.

 

Location Services

Issue: Usage reporting is lacking for the Location Services released in version 2.10 (Routing, Search, Transit, and Vector Tiles services).
Workaround: Usage is being tracked at the service level. Following the 2.12 release, in which usage reporting is expected to be in place, customers may request usage summaries for usage incurred between the 2.10 and 2.12 releases.

 

Marketplace (Not available in China)

Added: You can attach an optional link or PDF file (e.g. an executed agreement) to a subscription during the activation process. Once the link or PDF file is attached, both you and your subscriber can access it on the subscription details page.

Added: You can add multiple VINs (Vehicle Identification Numbers) when creating or editing a consent request while requesting access to PII data via Neutral Server. The list can be added as a single CSV or JSON file with a maximum size of 2 MB.

Changed: When consuming catalog data or schemas using Java or Scala, the Maven plugin is installed automatically. We recommend that you update to the new Maven wagon plugin version; please refer to the documentation to learn how to do this. The plugin has been open sourced and can be found at https://github.com/heremaps/here-artifact-maven-wagon.

Issue: Users do not receive stream data usage metrics when reading or writing data from Kafka Direct.
Workaround: When writing data into a Stream layer, you must use the Ingest API to receive usage metrics. When reading data, you must use the Data Client Library, configured to use the HTTP connector type, to receive usage metrics and read data from a Stream layer.

Issue: When the Technical Accounting component is busy, the server can lose usage metrics.
Workaround: If you suspect you are losing usage metrics, contact HERE technical support for assistance rerunning queries and validating data.

Issue: Projects and all resources in a Project are designed for use only in Workspace and are unavailable for use in Marketplace. For example, a catalog created in a Platform Project can only be used in that Project. It cannot be marked as "Marketplace ready" and cannot be listed in the Marketplace.
Workaround: Do not create catalogs in a Project when they are intended for use in the Marketplace.

 

Summary of active deprecation notices across all components

 

Jeff Henning
