News and Updates

How to integrate Bose AR with HERE Location Services

By HERE Technologies | 13 January 2020


Recently, we announced a strategic partnership with Bose, the brand you know for best-in-class audio, innovation, and noise cancellation. Bose launched Bose AR in 2018, a platform that allows developers to create immersive, multidimensional spatial audio experiences and unique location applications. Now Bose AR developers can easily integrate HERE Location Services into their apps.

We’re also excited to provide sample code and how-tos so that each of the hundreds of thousands of HERE developers can now Bose AR-enable their apps.

In this blog post we’ll show you how to use the Bose IMU to capture direction of travel. Who hasn’t come out of a subway or building and started walking in one direction, only to find out a block or two later that you’re heading the wrong way? With this simple code you’ll be able to take the heading from your Bose Frames or QC/NC headsets and reflect that direction on a map by overriding the HERE map’s position indicator. This is a building block for many solutions in which you can use HERE’s location information to add more context to the Bose device’s sensor data.

This code overrides the position indicator on the HERE Mobile SDK map, replacing the phone’s heading, which can be fickle, with the true heading provided by the Bose IMU:

Another option you have as a HERE developer integrating Bose AR is to keep the phone’s heading for the position indicator but use the IMU’s yaw and tilt to interpolate the user’s field of vision. This is useful when a user is on the move and wants to know more about a point of interest, or wants to use Bose AR’s capabilities to orient themselves (“just to the right of the Chase Bank you’re looking at”).
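To make the field-of-vision idea concrete, here is a minimal, self-contained sketch. The helper names, and the 30-degree half-angle, are ours rather than part of either SDK: it computes the bearing from the user to a point of interest and checks whether that bearing falls within a cone around the head yaw reported by the IMU. In the app, the coordinates would come from NMAGeoCoordinates and the yaw from the rotation handler.

```swift
import Foundation

/// Bearing from one coordinate to another, in degrees clockwise from north.
/// (Hypothetical helper; in the app these values would come from
/// NMAGeoCoordinates.)
func bearing(fromLat lat1: Double, lon lon1: Double,
             toLat lat2: Double, lon lon2: Double) -> Double {
    let p1 = lat1 * .pi / 180
    let p2 = lat2 * .pi / 180
    let dl = (lon2 - lon1) * .pi / 180
    let y = sin(dl) * cos(p2)
    let x = cos(p1) * sin(p2) - sin(p1) * cos(p2) * cos(dl)
    let theta = atan2(y, x) * 180 / .pi
    // Normalize to [0, 360).
    return (theta + 360).truncatingRemainder(dividingBy: 360)
}

/// True when the point of interest lies inside the head-yaw cone.
/// `headYaw` is the IMU yaw in degrees; `halfAngle` is half the assumed
/// field-of-vision width (30 degrees is a guess, tune to taste).
func poiIsInFieldOfVision(userLat: Double, userLon: Double,
                          poiLat: Double, poiLon: Double,
                          headYaw: Double, halfAngle: Double = 30) -> Bool {
    let target = bearing(fromLat: userLat, lon: userLon, toLat: poiLat, lon: poiLon)
    // Wraparound-safe angular difference, in [0, 180].
    let diff = abs((target - headYaw + 540).truncatingRemainder(dividingBy: 360) - 180)
    return diff <= halfAngle
}
```

With the user at the origin and a POI due east, `poiIsInFieldOfVision` returns true for a head yaw of 90 degrees and false for 270.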

You’ll need to work out the logic for switching between the direction-of-travel and field-of-vision states to provide a seamless user experience; most applications you dream up will have elements of both. For example, suppose I’m driving, look at a landmark in the distance, and ask, “What’s that mountain over there?” In that instance the pointer icon would not change from direction of travel; instead, the app would use the IMU data to determine my field of vision and disambiguate the request. Later, when the user parks and exits the vehicle, the directional pointer may switch to being controlled by the IMU. Finally, a developer may want to flip between states based on context, since a user may want heading guidance and then, along the way, make queries or need orientation.
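The state logic described above can be sketched as a small mode switch. This is only an illustration; the names and the walking-speed threshold are ours, not part of either SDK:

```swift
import Foundation

// Sketch of the two pointer states described above.
enum IndicatorMode {
    case directionOfTravel   // pointer follows the GPS course (e.g. driving)
    case fieldOfVision       // pointer follows the Bose IMU head yaw
}

/// Resolve the mode from simple context: an in-flight query ("what's that
/// over there?") always uses the head yaw; otherwise anything faster than a
/// brisk walk keeps the direction of travel.
func resolveMode(speedMetersPerSecond: Double,
                 queryInProgress: Bool) -> IndicatorMode {
    if queryInProgress {
        return .fieldOfVision
    }
    return speedMetersPerSecond > 3.0 ? .directionOfTravel : .fieldOfVision
}
```

A real app would add hysteresis around the speed threshold so the pointer does not flicker between modes at walking pace.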

We’re thrilled about our partnership with Bose and can’t wait to see what you all dream up!


//HERE DevRel


/*
 * Copyright (c) 2011-2019 HERE Europe B.V.
 * All rights reserved.
 */

import UIKit
import NMAKit
import BoseWearable
import simd

class MainViewController: UIViewController {

    class Defaults {
        static let latitude = 49.260327
        static let longitude = -123.115025
        static let zoomLevel: Float = 13.2
    }

    @IBOutlet weak var mapView: NMAMapView!
    var positionIndicator: NMAMapLocalModel?
    var mapFixed = false
    private var boseConnected = false
    private var token: ListenerToken?
    private let sensorDispatch = SensorDispatch(queue: .main)

    var session: WearableDeviceSession! {
        didSet {
            session?.delegate = self
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Create the geo coordinate for the initial map center.
        let geoCoordCenter = NMAGeoCoordinates(latitude: Defaults.latitude,
                                               longitude: Defaults.longitude)
        // Set the map view with the geo center.
        mapView.set(geoCenter: geoCoordCenter, animation: .none)
        // Set the zoom level.
        mapView.zoomLevel = Defaults.zoomLevel
        // Listen for position updates.
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(MainViewController.didUpdatePosition),
                                               name: NSNotification.Name.NMAPositioningManagerDidUpdatePosition,
                                               object: NMAPositioningManager.sharedInstance())
        NMAPositioningManager.sharedInstance().startPositioning()
        // Connect to the Bose device.
        initiateBoseSessions()
    }

    // Listen for position updates and keep the map centered on the current
    // location.
    @objc func didUpdatePosition() {
        guard let position = NMAPositioningManager.sharedInstance().currentPosition,
              let coordinates = position.coordinates else {
            return
        }
        coordinates.altitude = 0
        mapView.set(geoCenter: coordinates, animation: .linear)
        if !mapFixed {
            mapFixed = true
            createCustomPositionIndicator(coordinates: coordinates)
        }
        positionIndicator?.coordinates = coordinates
    }

    /* This method creates a custom NMAMapLocalModel to serve as the
       representation of the position indicator. The advantage of using
       NMAMapLocalModel is that we'll be able to control the yaw of the icon
       as Bose angle data is updated. */
    private func createCustomPositionIndicator(coordinates: NMAGeoCoordinates) {
        let mesh = NMAFloatMesh()
        let size: Float = 5.0
        // Four corners of a flat quad, three coordinates (x, y, z) each.
        let vertices: [Float] = [-size / 2, -size / 2, 0.0,
                                 -size / 2,  size / 2, 0.0,
                                  size / 2, -size / 2, 0.0,
                                  size / 2,  size / 2, 0.0]
        mesh.setVertices(vertices, count: 4)
        let textureCoordinates: [Float] = [0.0, 0.0,
                                           0.0, 1.0,
                                           1.0, 0.0,
                                           1.0, 1.0]
        mesh.setTextureCoordinates(textureCoordinates, count: 4)
        // Two triangles make up the quad.
        let triangles: [Int16] = [2, 1, 0, 1, 2, 3]
        mesh.setTriangles(triangles, count: 2)
        let image = NMAImage(uiImage: UIImage(imageLiteralResourceName: "indicator.jpg"))
        positionIndicator = NMAMapLocalModel(mesh: mesh)
        positionIndicator!.texture = image
        positionIndicator!.autoscaled = true
        positionIndicator!.coordinates = coordinates
        mapView.add(mapObject: positionIndicator!)
    }

    // Initiate the Bose session.
    private func initiateBoseSessions() {
        sensorDispatch.handler = self
        let sensorIntent = SensorIntent(sensors: [.rotation, .accelerometer], samplePeriods: [._20ms])
        let mode: ConnectUIMode
        if let device = MostRecentlyConnectedDevice.get() {
            mode = .reconnect(device: device)
        } else {
            mode = .alwaysShow // fall back to showing the device picker
        }
        // Perform the device search and connect to the selected device. This
        // may present a view controller on a new UIWindow.
        BoseWearable.shared.startConnection(mode: mode, sensorIntent: sensorIntent) { result in
            switch result {
            case .success(let session):
                // A device was selected, and a session was created and opened.
                self.boseConnected = true
                print("Bose session started")
                self.session = session
                self.listenForWearableDeviceEvents()
                self.listenForSensors()
                self.configureGestures()
            case .failure(let error):
                // An error occurred when searching for or connecting to a
                // device.
                print("error: \(error)")
            case .cancelled:
                // The user cancelled the search operation.
                break
            }
        }
    }

    private func listenForWearableDeviceEvents() {
        // Listen for incoming wearable device events. Retain the ListenerToken.
        // When the ListenerToken is deallocated, this object is automatically
        // removed as an event listener.
        token = session.device?.addEventListener(queue: .main) { [weak self] event in
            self?.wearableDeviceEvent(event)
        }
    }

    private func listenForSensors() {
        session.device?.configureSensors { config in
            // config is the current sensor configuration. Begin by turning off
            // all sensors for a "clean slate," then enable the rotation and
            // accelerometer sensors at a 320 ms sample period.
            config.disableAll()
            config.enable(sensor: .rotation, at: ._320ms)
            config.enable(sensor: .accelerometer, at: ._320ms)
        }
    }

    func configureGestures() {
        session.device?.configureGestures { config in
            // First, disable all currently enabled gestures, then configure
            // the gestures we are interested in.
            config.disableAll()
            config.set(gesture: .doubleTap, enabled: true)
        }
    }

    private func wearableDeviceEvent(_ event: WearableDeviceEvent) {
        switch event {
        case .didWriteSensorConfiguration:
            // The sensor configuration change was accepted.
            print("did write sensor configuration")
        case .didFailToWriteSensorConfiguration(let error):
            // The sensor configuration change was rejected.
            print("wearableDeviceEvent error: \(error)")
        case .didWriteGestureConfiguration:
            // The gesture configuration change was accepted.
            print("did write gesture configuration")
        case .didFailToWriteGestureConfiguration(let error):
            // The gesture configuration change was rejected.
            print("wearableDeviceEvent error: \(error)")
        default:
            break
        }
    }
}

extension MainViewController: WearableDeviceSessionDelegate {

    func sessionDidOpen(_ session: WearableDeviceSession) {
        // Session opened successfully. Note that the session passed to the
        // startConnection completion handler is already open. This delegate
        // method is only useful when re-opening a session.
        print("Opened session")
    }

    func session(_ session: WearableDeviceSession, didFailToOpenWithError error: Error?) {
        // The session failed to open. As above, this delegate method is only
        // useful when re-opening a session.
        print("error: \(String(describing: error))")
    }

    func session(_ session: WearableDeviceSession, didCloseWithError error: Error?) {
        // The session closed. If error is nil, this was an expected closure
        // (e.g., the connection was requested to be closed).
        if let error = error {
            print("error: \(error)")
        }
    }
}

extension MainViewController: SensorDispatchHandler {

    func receivedAccelerometer(vector: Vector, accuracy: VectorAccuracy, timestamp: SensorTimestamp) {
        // Handle accelerometer reading.
    }

    func receivedGyroscope(vector: Vector, accuracy: VectorAccuracy, timestamp: SensorTimestamp) {
        // Handle gyroscope reading.
    }

    func receivedGesture(type: GestureType, timestamp: SensorTimestamp) {
        // Handle gesture.
    }

    // On rotation, get the yaw from the Bose hardware, then set the position
    // indicator's yaw accordingly.
    func receivedRotation(quaternion: Quaternion, accuracy: QuaternionAccuracy, timestamp: SensorTimestamp) {
        guard let indicator = positionIndicator else { return }
        let qMap = Quaternion(ix: 1, iy: 0, iz: 0, r: 0)
        let qResult = quaternion * qMap
        // Pitch and roll are available as qResult.xRotation and
        // qResult.yRotation; we only need the yaw here.
        let yaw = Float(-qResult.zRotation)
        // Convert the yaw from radians to degrees.
        var yawDegrees = yaw * 180 / .pi
        // The indicator JPG's default position points south, so add 180
        // degrees to normalize.
        let trueYaw = indicator.yaw + 180
        // Find the positive degree value.
        if yawDegrees < 0 {
            yawDegrees += 360
        }
        // Keep a threshold delta of 2 degrees before reflecting the direction
        // change on the map.
        if abs(trueYaw - yawDegrees) > 2 {
            // Subtract 180 degrees again to compensate for the south-facing
            // indicator JPG.
            indicator.yaw = yawDegrees - 180
        }
    }
}
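One edge case worth noting in the rotation handler above: a plain absolute difference between two headings misbehaves near the 0/360 boundary (359 degrees vs. 1 degree is really a 2-degree change, not 358). A wraparound-safe delta, sketched here as a standalone helper, avoids spurious icon jumps:

```swift
import Foundation

/// Smallest angular difference between two headings, in degrees, in [0, 180].
/// Unlike abs(a - b), this treats 359 and 1 as 2 degrees apart.
func headingDelta(_ a: Float, _ b: Float) -> Float {
    let d = abs(a - b).truncatingRemainder(dividingBy: 360)
    return d > 180 ? 360 - d : d
}
```

The threshold check in the rotation handler can then compare `headingDelta(trueYaw, yawDegrees)` against the 2-degree threshold instead of the raw difference.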