HERE Platform
Release announcements

HERE Workspace & Marketplace 2.25 release

By Torsten Linz | 17 March 2021

Highlights

Visualize non-HERE-tiled data 

It is now possible to view partitions from generically tiled versioned and volatile layers in Data Inspector. Previously, only HERE-tiled partitions could be visualized. Go to the partition list view and choose "view on the map" in the context menu of the partition. (This requires a valid rendering plugin in the layer's schema, or GeoJSON as the layer's data format.)
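If the layer's data format is GeoJSON, no rendering plugin is needed; each partition is itself a GeoJSON document. A minimal illustration of the kind of payload Data Inspector can render (the coordinates and properties here are made up):

```json
{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "geometry": { "type": "Point", "coordinates": [13.38, 52.52] },
      "properties": { "name": "example-poi" }
    }
  ]
}
```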

 

Manage your organization's pipelines and pipeline templates

Organization admins can now manage all the pipelines and pipeline templates within their Org and perform all the actions that the original pipeline author can perform, including but not limited to:

  • Deactivate, cancel or delete a pipeline created by anybody within the Org
  • Delete a pipeline template created by anybody within the Org
  • Activate a pipeline created by anybody within the Org

 

Data Validation Library removed from Data SDK for Java & Scala

The Data Validation Library (DVL), which was deprecated earlier, has now been removed from the HERE Data SDK for Java & Scala 2.25 package. Its purpose was to enable scalable and efficient testing and validation of versioned catalogs on the platform. While it abstracted the underlying distributed processing on Spark and the Data API, knowledge of the underlying Data Processing Library (DPL) and of data partitioning was still required.

To lower the entry barrier for new platform developers, we followed a more lightweight approach, providing a new data validation module that integrates popular testing frameworks such as Cucumber, JUnit and ScalaTest, allowing test engineers to write tests without any previous knowledge of the DPL, partitioning concepts or map compilation. This is achieved by separating the test-data extraction phase from the test scenarios, which are executed in parallel on each self-contained test-data partition. Only the test-data extraction phase requires knowledge of DPL features and the underlying catalog structure. Once created, the extraction can be reused by test engineers without platform experience to implement numerous tests. The new module computes metrics inside the test scenarios and performs the assessment with a predicate on the aggregated metrics, which satisfies the vast majority of testing use cases.

The data validation module in the DPL covers the functionality of the DVL and shipped with the Data SDK for Java & Scala 2.20 release. If you have data validation use cases, please use this module going forward. See the Data Processing Library documentation for more information about this module.

 

Changes, additions and known issues

SDKs and tools

To see details of all changes to the CLI, Data SDKs for Python, TypeScript, C++, Java and Scala as well as to the Data Inspector Library, visit the HERE platform changelog.

 

Web and portal

Known issue: Pipeline templates can't be deleted from the platform portal UI.
Workaround: Use the CLI or API to delete pipeline templates.

Known issue: In the platform portal, new jobs and operations are not automatically added to the list of jobs and operations for a pipeline version when the list is open for viewing.
Workaround: Refresh the "Jobs" and "Operations" pages to see the latest job or operation in the list.

 

Projects and access management

Known issue: A set number of access tokens (~250) is available for each app or user. Depending on the number of resources included, this number may be smaller.
Workaround: Create a new app or user if you reach the limit.

Known issue: A set number of permissions is allowed for each app or user in the system across all services. This limit may be lower depending on the resources included and the types of permissions granted.

Known issue: All users and apps in a group are granted permissions to perform all actions on any pipeline associated with that group. There's no support for users or apps with limited permissions. For example, you can't have a role that is limited to viewing pipeline statuses, but not starting and stopping a pipeline.
Workaround: Limit the users in a pipeline group only to those who should have full control over the pipeline. 

Known issue: When updating permissions, it can take up to an hour for the changes to take effect.

Known issue: Projects and all resources in a project are designed for use only in HERE Workspace, not the Marketplace. For example, a catalog created in a platform project can only be used in that project. It can't be marked as "Marketplace-ready" nor be listed in the Marketplace.
Workaround: Don't create catalogs in a project if they are intended for use in both Workspace and Marketplace.

 

Data

Known issue: In support of the Object Store layer type, a newer version of the Blob API (blob v2) is available in production. The availability of this newer Blob API version can impact existing workflows if developers use the Lookup API to get a list of all provided endpoints for a given resource but do not select the right baseUrl based on the right API and API version. Because multiple versions of the same API exist, Lookup API responses include specific URLs per API version.

Workaround: Always select the right baseUrl from Lookup API responses based on the API and API version that you intend to work with. To support existing workflows until you can correct your API selection logic, the Lookup API will return multiple Blob API v1 baseUrls in various positions in responses for the next 6 months, starting January 2021. Please see the deprecation summary at the end of these release notes for more information.
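The selection logic in the workaround can be sketched as follows. The response shape, field names and URLs below are illustrative, not actual Lookup API output:

```python
# Sketch: pick the endpoint for a specific API and version from a
# Lookup API response. The list of entries below is a made-up stand-in
# for the real response; field names may differ in the actual API.

def select_base_url(endpoints, api, version):
    """Return the baseUrl matching both the API name and its version."""
    for entry in endpoints:
        if entry["api"] == api and entry["version"] == version:
            return entry["baseUrl"]
    raise LookupError(f"No endpoint found for {api} {version}")

# Example response containing both Blob API versions (hypothetical URLs):
endpoints = [
    {"api": "blob", "version": "v1", "baseUrl": "https://blob.example/v1"},
    {"api": "blob", "version": "v2", "baseUrl": "https://blob.example/v2"},
]

# Matching on the API name alone would be ambiguous; always match the
# version as well.
url = select_base_url(endpoints, "blob", "v1")
```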

Known issue: The "Upload data" button available via "More" within the versioned layer details page is hidden when the "Content encoding" field in the layer is set to "gzip".
Workaround: Files (including .zip files) can still be uploaded and downloaded as long as the "Content encoding" field is set to "Uncompressed".

Known issue: The changes released with 2.9 (RoW) and 2.10 (China) for adding OrgIDs to catalog HRNs, and with 2.10 (Global) for adding OrgIDs to schema HRNs, could impact any use case (CI/CD or other) where comparisons are made between HRNs used by various workflow dependencies. For example, requests to compare the HRNs a pipeline is using with those to which a group, user or app has permissions will result in errors if the comparison expects results to match the old HRN construct. With this change, data APIs return only the new HRN construct, which includes the OrgID, e.g. olp-here…, so a comparison between the old HRN and the new HRN will fail.

  • Reading from and writing to catalogs using old HRNs will continue to work until this functionality is deprecated (see deprecation notice summary).
  • Referencing old schema HRNs will continue to work indefinitely.

Workaround: Update any workflows comparing HRNs to perform the comparison against the new HRN construct, including the OrgID.
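One way to implement this workaround is to normalize old-style HRNs to the new construct before comparing. A minimal sketch; the realm value olp-here is just an example:

```python
# Sketch: upgrade an old-style catalog HRN (empty realm field) to the
# new construct that includes the OrgID, then compare.

def with_org_id(hrn: str, org_id: str) -> str:
    """Insert the OrgID into an old-style HRN if its realm field is empty."""
    parts = hrn.split(":")  # hrn:here:data::<realm>:<catalog-id>
    if len(parts) == 6 and parts[4] == "":
        parts[4] = org_id
    return ":".join(parts)

old_hrn = "hrn:here:data:::my-catalog"
new_hrn = "hrn:here:data::olp-here:my-catalog"

# Comparing the raw strings fails; normalizing first makes them match.
assert old_hrn != new_hrn
assert with_org_id(old_hrn, "olp-here") == new_hrn
```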

Known issue: Searching for a schema in the platform portal using the old HRN construct returns only the latest version of the schema. The portal won't show older versions associated with the old HRN.
Workaround: Search for schemas using the new HRN construct, or look up older versions of schemas using the old HRN construct in the CLI.

Known issue: Visualization of data is not yet supported inside HERE Workspace for the following layer types: index, object store and interactive map.

 

Pipelines

Added: All Pipeline APIs now accept the UUID or HRN of pipelines and pipeline templates as a path parameter.

Added: Pipeline and pipeline template HRNs now include the realm name, so you can quickly identify the realm of a pipeline or pipeline template.

Added: Organization admins can now manage all the pipelines and pipeline templates within their Org and perform all the actions that the original pipeline author can perform. For details, see the Highlights section above.

Deprecation reminder: The batch-2.0.0 environment will soon be removed as its deprecation period has ended. Migrate your batch pipelines to the batch-2.1.0 run-time environment to benefit from the latest functionality and improvements.

Known issue: A pipeline failure or exception can sometimes take several minutes to be reported.

Known issue: Pipelines can still be activated after a catalog is deleted.
Workaround: The pipeline will fail when it starts running and show an error message about the missing catalog. Find the missing catalog or use a different one.

Known issue: If several pipelines are consuming data from the same stream layer and belong to the same group (pipeline permissions are managed through a group), then each pipeline will only receive a subset of the messages from the stream. This is because, by default, the pipelines share the same application ID.
Workaround: Use the Data Client Library to configure your pipelines so they consume from a single stream. If your pipelines/apps use the Direct Kafka connector, you can specify a Kafka consumer group ID per pipeline/application. If the Kafka consumer group IDs are unique, the pipelines/apps can consume all the messages from the stream. If your pipelines use the HTTP connector, create a new group for each pipeline/app, each with its own app ID.
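With the Direct Kafka connector, the fix amounts to giving each pipeline's consumer its own group ID. A sketch in standard Kafka consumer configuration syntax (the group IDs are made up):

```properties
# Pipeline A's Kafka consumer configuration
group.id=pipeline-a-consumers

# Pipeline B's Kafka consumer configuration
group.id=pipeline-b-consumers
```

Because the group IDs differ, Kafka delivers every message in the stream to both consumer groups instead of splitting the partition assignments between them.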

 

Content

Fixed: We fixed the GeoJSON rendering plugin that visualizes HERE Weather data with the Data Inspector. The plugin includes a number of adjustable filters for visibility, precipitation, wind (speed + direction) and temperature, so that you can focus on the weather type that you’re interested in. This plugin is a great example of how you can add simple arrows on a map to indicate a direction. You can explore the weather visualization here: https://platform.here.com/data/hrn:here:data::olp-here:live-weather-eu/latest-data/inspect.

 

Marketplace (Not available in China)

Fixed: Subscription reporting now works (i.e. usage metrics are generated) for object store layers when added to catalogs in Marketplace.

Known issue: When adding an Interactive Map layer to a Marketplace catalog and offer with a subscription option, the subscription reporting doesn't generate usage metrics.
Workaround: None; subscription reporting will support the Interactive Map layer type in future releases.

Known issue: There is no throttling for the beta version of the External Service Gateway. When the system is overloaded, the service slows down for everyone reading from the External Service Gateway.
Workaround: Contact HERE support for help.

Known issue: When the Technical Accounting component is busy, the server can lose usage metrics.
Workaround: If you suspect you're losing usage metrics, contact HERE support to get help with rerunning queries and validating data.

Known issue: Projects and all resources in a project are designed for use only in HERE Workspace and are not available for use in HERE Marketplace. For example, a catalog created in a platform project can only be used in that project. It can't be marked as "Marketplace-ready" nor be listed in the Marketplace.
Workaround: Don't create catalogs in a project if they are intended for use in the Marketplace.

 

Summary of active deprecation notices

This section lists only deprecation notices for APIs that are not part of the HERE platform changelog. For APIs covered by the changelog, filter by 'deprecated' to list all deprecation notices.

1. OrgID added to Catalog HRN (RoW)
Deprecation period announced: release 2.9 (RoW) / 2.10 (China), November 2019
Deprecation period ends: June 30, 2021 (extended)

 

Deprecation Summary:

Catalog HRNs without OrgID will no longer be supported in any way.

  • Referencing catalogs and all other interactions with REST APIs using the old HRN format without OrgID, or by CatalogID, will stop working after this deprecation period.
    • Please ensure all HRN references in your code are updated to use Catalog HRNs with OrgID before this date so your workflows continue to work.
  • HRN duplication to ensure backward compatibility of Catalog version dependencies resolution will no longer be supported after this date.
  • Examples of old and new Catalog HRN formats:
    • Old (without OrgID/realm): hrn:here:data:::my-catalog
    • New (with OrgID/realm): hrn:here:data::OrgID:my-catalog

2. Batch-2.0.0 run-time environment for Pipelines
Deprecation period announced: release 2.12, February 2020
Deprecation period ends: August 19, 2020 (past due)

 

Deprecation Summary:

The deprecation period is over and Batch-2.0.0 will be removed soon. Pipelines still using it will be canceled. Migrate your batch pipelines to the Batch-2.1.0 run-time environment to benefit from the latest functionality and improvements. For more details about migrating a batch pipeline to the new Batch-2.1.0 run-time environment, see Migrate Pipeline to new Run-time Environment.

3. Schema validation to be added
Deprecation period announced: release 2.13, March 2020
Deprecation period ends: June 30, 2021

 

Deprecation Summary:

For security reasons, the platform will start validating schema reference changes in layer configurations after this deprecation period. Schema validation will check if the user or application trying to make a layer configuration change has at least read access to the existing schema associated with that layer (i.e., a user or application cannot reference or use a schema they do not have access to).

 

If the user or application does not have access to a schema associated with any layer after this date, attempts to update configurations of that layer will fail until the schema association or permissions are corrected. Make sure all layers refer only to real, current schemas - or have no schema reference at all - before the deprecation period end. It's possible to use the Config API to remove or change schemas associated with layers to resolve these invalid schema/layer associations. Also, any CI/CD jobs referencing non-existent or inaccessible schemas need to be updated by this date, or they will fail.

4. Stream-2.0.0 run-time environment for Pipelines
Deprecation period announced: release 2.17, July 2020
Deprecation period ends: February 1, 2021 (past due)

 

Deprecation Summary:

Stream-2.0.0 (with Apache Flink 1.7.1) run-time environment is now deprecated. Existing stream pipelines that use the Stream-2.0.0 run-time environment will continue to operate normally until February 1, 2021. During this time, Stream-2.0.0 run-time environment will receive security patches only.

For this period, to continue developing pipelines with the Stream-2.0.0 environment, use platform SDK 2.16 or older. After February 1, 2021, the Stream-2.0.0 run-time environment will be removed and pipelines using it will be canceled. Migrate your stream pipelines to the new Stream-3.0.0 run-time environment to benefit from the latest functionality and improvements. For more details about migrating an existing stream pipeline to the new Stream-3.0.0 run-time environment, see Migrate Pipeline to new Run-time Environment. For general support for Apache Flink, please see Stream Pipelines - Apache Flink Support FAQ.

5. ‘pipeline_jobs_canceled’ metric in Pipeline Status Dashboard
Deprecation period announced: release 2.17, July 2020
Deprecation period ends: February 1, 2021 (past due)

 

Deprecation Summary:

The pipeline_jobs_canceled metric used in the pipeline status dashboard is now deprecated because it was tied to the pause functionality and caused confusion. The metric and its explanation will be available to use until February 1, 2021. Thereafter, the metric will be removed.

6. Stream throughput configuration changes from MB/s to kB/s
Deprecation period announced: release 2.19, September 2020
Deprecation period ends: March 31, 2021

 

Deprecation Summary:

Support for stream layers configured in MB/s is deprecated and will no longer be supported after March 31, 2021.

After March 31, 2021, only kB/s throughput configurations will be supported. This means that the Data Client Library and CLI versions included in SDK 2.18 and earlier can no longer be used to create stream layers, because these versions do not support configuring stream layers in kB/s.
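The unit change itself is a factor of 1000. A hypothetical helper for converting existing configuration values (the function name is made up; the actual configuration field names depend on the tool you use):

```python
def mbps_to_kbps(throughput_mbps: float) -> float:
    """Convert a stream layer throughput value from MB/s to kB/s (1 MB/s = 1000 kB/s)."""
    return throughput_mbps * 1000

# A layer configured for 2 MB/s inbound throughput becomes 2000 kB/s.
assert mbps_to_kbps(2) == 2000
```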

7. Monitoring stability improvements
Deprecation period announced: release 2.20, October 2020
Deprecation period ends: April 30, 2021

 

Deprecation Summary:

The "kubernetes_namespace" metric has been deprecated and will be supported until April 30, 2021. Update all Grafana dashboard queries using this metric to use the "namespace" metric.

The label_values(label) function has been deprecated and will be supported until April 30, 2021. Update all Grafana dashboard queries using this function to use label_values(metric, label).
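For example, migrating a Grafana template-variable query that lists namespaces might look like this (the metric name kube_pod_info is an illustrative choice, not prescribed by the platform):

```
# Deprecated form: label only
label_values(namespace)

# Supported form: metric plus label
label_values(kube_pod_info, namespace)
```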

The datasource "<Realm>-master-prometheus-datasource" has been deprecated and will be supported until April 30, 2021. Update all Grafana dashboards using this datasource to use the Primary datasource.

8. Additional support to help distinguish Blob API versions in Lookup API responses
Deprecation period announced: release 2.22, December 2020
Deprecation period ends: June 30, 2021

 

Deprecation Summary:

Because multiple versions of different Data APIs exist, it's important that your automated workflows that request service endpoints from the Lookup API are updated to select the right baseUrls for the right API and API version you are working with. As some existing customer workflow automation is not yet updated to select the right baseUrls from the Lookup API responses, the Lookup API will return multiple Blob API v1 baseUrls in various positions in responses over the next 6 months, starting January 2021.

To prevent downtime, update your workflow automation during this deprecation period to select the right baseUrl from Lookup API responses based on the right API and API version.