Run an Archiving Application Locally

After you have created an archiving application using the Data Archiving Library, you may want to run the application locally to test it before deploying it as a pipeline on the HERE platform. There are two ways to run the application locally:

  • Run locally with IntelliJ
  • Run locally with a local Flink cluster

The following sections show how to run the SDK example apps using each of these methods.

Run with IntelliJ

  1. Download the HERE Data SDK examples project.
  2. Fill in the necessary information in these two files:

    examples/data-archive/java/avro-example/src/test/resources/application.conf

    examples/data-archive/java/avro-example/src/test/resources/credentials.properties

  3. If you want to use a custom logger, modify the log4j.properties file inside the test folder.
  4. Run this Java application:

    examples/data-archive/java/avro-example/src/test/java/com/here/platform/data/archive/example/AvroExampleRunner.java

    Ensure that either the maven profile add-dependencies-for-IDEA is selected or the checkbox for Include dependencies with "Provided" scope in Edit Configurations for AvroExampleRunner.java is selected before running the application.
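As a command-line alternative to the IDE run configuration, the same entry point can be launched with the Maven Exec plugin; the `test` classpath scope puts both the test sources and Provided-scope dependencies on the classpath. This is a sketch only; whether it runs unchanged depends on the example project's plugin setup:

    ```shell
    cd examples/data-archive/java/avro-example

    # Compile the test sources, then run the example entry point using the
    # test classpath (which includes Provided-scope dependencies).
    mvn test-compile exec:java \
      -Dexec.mainClass=com.here.platform.data.archive.example.AvroExampleRunner \
      -Dexec.classpathScope=test
    ```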

Run with a Local Flink Cluster

  1. Download Flink 1.7.1 and start a local cluster:

    wget https://archive.apache.org/dist/flink/flink-1.7.1/flink-1.7.1-bin-hadoop27-scala_2.11.tgz
    tar -xvf flink-1.7.1-bin-hadoop27-scala_2.11.tgz
    chmod 777 flink-1.7.1
    cd flink-1.7.1/bin
    ./start-cluster.sh
    
  2. Download the HERE Data SDK examples project.

  3. Fill in the necessary information in this file:

    examples/data-archive/java/avro-example/src/main/resources/application.conf

  4. Get a credentials.properties file containing the credentials to allow the example application to access the input and output catalogs and place the file in the ~/.here/ folder. For instructions, see Get Credentials.

  5. Make sure that the credentials you use to generate the credentials.properties file provide read permission to the input stream layer and read/write permission to the index layer. The credentials should match those in the application.conf file.

    Alternatively, you can place the credentials.properties file in the folder:

    examples/data-archive/java/avro-example/src/main/resources/

    Note that the ~/.here/ folder takes priority over the examples/data-archive/java/avro-example/src/main/resources/ folder. The format for the credentials.properties file is:

       here.client.id = <Client Id>
       here.access.key.id = <Access Key Id>
       here.access.key.secret = <Access Key Secret>
       here.token.endpoint.url = <Token Endpoint>
    
  6. Go to your example project root folder (examples/data-archive/java/avro-example) and run this command:

    mvn clean install

    This command builds the JAR file to upload to the local Flink cluster. The output JAR file should be generated in the folder:

    examples/data-archive/java/avro-example/target

  7. Go to your local Flink UI at http://localhost:8081. In the left menu, click Submit new job, then click Add New to upload the JAR file (upload the platform JAR file built in the previous step, not the plain JAR).

  8. To run the application, click the checkbox on the left to select your uploaded JAR file.
  9. Set the Entry class field to com.here.platform.dal.DALMain, then click Submit.
  10. Go to Running Jobs in the left menu to check whether your job is running successfully. You can also open the Logs tab inside each job to see the generated logs. There is a log4j.properties file inside src/test/resources that you can copy to the src/main/resources/ folder to customize logging.
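The credentials.properties format shown in step 5 can be sanity-checked with a short, self-contained Java snippet. The values below are illustrative placeholders, not real credentials; java.util.Properties is the standard loader for this file format:

    ```java
    import java.io.StringReader;
    import java.util.Properties;

    public class CredentialsCheck {
        public static void main(String[] args) throws Exception {
            // Illustrative contents; real values come from Get Credentials.
            String contents = String.join("\n",
                "here.client.id = example-client-id",
                "here.access.key.id = example-key-id",
                "here.access.key.secret = example-secret",
                "here.token.endpoint.url = https://example.com/oauth2/token");

            Properties props = new Properties();
            props.load(new StringReader(contents));

            // All four keys must be present for the archiving application
            // to authenticate against the HERE platform.
            for (String key : new String[] {
                    "here.client.id", "here.access.key.id",
                    "here.access.key.secret", "here.token.endpoint.url"}) {
                if (props.getProperty(key) == null) {
                    throw new IllegalStateException("Missing key: " + key);
                }
            }
            System.out.println("credentials.properties keys OK");
        }
    }
    ```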
