Adding and Interacting with LiveSight Content
This section covers how to add content to be displayed in LiveSight and how to handle user interactions with that content. The classes covered in this section are NMAARObject and NMAARIconObject. Additionally, several NMAARController methods and properties are used:
- addObject:
- removeObject:
- pressObject:
- unpressObject:
- selectObject:
- deselectObject
- focusObject:
- defocusObject
- objectsAtPoint:
- objectsInRect:
The following delegate protocols are also used alongside NMAARController:
- NMAARControllerDelegate
- NMAARControllerGestureDelegate
LiveSight Object Model
A LiveSight object has several visual representations; the representation displayed is determined by the object's state. The state which influences display is the Focus state, which is discussed in detail later.
While in camera-enabled LiveSight view, there are two planes in which an object can be displayed, and the object representation is different in each. The planes are the "Front" plane and the "Back" plane. By default, objects which are geographically closer to the LiveSight center are displayed in the Front plane, and objects which are further away are displayed in the Back plane. Objects can be moved from one plane to the other using the vertical pan gesture.
While in the Front plane, a LiveSight object is represented by its icon and an information view (Info View) which extends from the side of the icon. The Info View is intended as a mechanism for displaying more detailed information about the object. In the Back plane, an object is initially represented by a single icon; an object in the Back plane can display its Info View by being put in focus. The icons for the Front plane and the Back plane can be different, and by default the transition from one plane to the other is animated.
NMAARObject Class
NMAARObject is the base class for all objects which can be added to LiveSight for display. It contains methods common to all LiveSight objects, enabling the following operations:
- Set and retrieve the object's current position
- Set and retrieve the object's front and back icons
- Set and retrieve the object's front and back icon sizes
- Set and retrieve the size, image, and extension state of the Info View
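As a sketch, per-object sizing might look like the following. Note that the frontPlaneIconSize and backPlaneIconSize property names are assumptions here and should be verified against the NMAARObject class reference:
// A minimal sketch of per-object icon sizing.
// NOTE: the property names below are assumptions, not confirmed API.
iconObject.frontPlaneIconSize = CGSizeMake(64.0f, 64.0f); // icon size in the Front plane
iconObject.backPlaneIconSize = CGSizeMake(32.0f, 32.0f);  // icon size in the Back plane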
Note: The Front plane icon size can also be set globally using the setFrontPlaneIconSize: method in NMAARController.
NMAARIconObject Class
The single subclass of NMAARObject is NMAARIconObject. NMAARIconObject represents the object model described in LiveSight Object Model. Because it is the only concrete NMAARObject, most of its functionality resides in NMAARObject.
Adding and Interacting with NMAARObject
Adding an NMAARObject to LiveSight is accomplished with the addObject: method in NMAARController:
NMAGeoCoordinates *objLocation =
[NMAGeoCoordinates geoCoordinatesWithLatitude:49.276744 longitude:-123.112049];
NMAARIconObject *iconObject =
[NMAARIconObject iconObjectWithIcon:[NMAImage imageWithUIImage:image1]
infoImage:[NMAImage imageWithUIImage:image2]
coordinates:objLocation];
[arController addObject:iconObject];
Similarly, NMAARObjects can be removed using the removeObject: method in NMAARController:
[arController removeObject:iconObject];
To facilitate interactivity with NMAARObjects, an NMAARControllerGestureDelegate can be registered via the gestureDelegate property in NMAARController. When a tap event occurs, the NMAARObjects at the tap point are delivered through the arController:shouldProcessTouchUpOnObjects:atPoint: method in NMAARControllerGestureDelegate. If you would like to provide custom handling for this gesture, you can implement the method as follows:
-(BOOL)arController:(NMAARController *)arController
        shouldProcessTouchUpOnObjects:(NSArray *)objects
        atPoint:(CGPoint)point
{
    // Get the object closest to the touch point if more than one was tapped
    NMAARObject *arObject = objects.count > 0 ? [objects objectAtIndex:0] : nil;
    if (arObject) {
        // Perform some custom action, such as focusing on the object
    }
    // Returning NO prevents the default touch-up processing
    return NO;
}
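Registering the delegate is a one-line assignment; here self is assumed to be an object adopting NMAARControllerGestureDelegate:
// Register a gesture delegate to receive tap events
arController.gestureDelegate = self;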
An NMAARObject can be put into focus with the focusObject: method. While in focus, an NMAARObject in the Back plane displays its Info View. Only one NMAARObject may have focus at a time. To defocus an NMAARObject, call focusObject: on another NMAARObject, or call the defocusObject method to remove focus from the currently focused NMAARObject.
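For example, focusing a previously added object and later clearing the focus (iconObject here is the object created earlier):
// Focus the object so its Info View is shown even in the Back plane
[arController focusObject:iconObject];

// Later, remove focus from the currently focused object
[arController defocusObject];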
In addition to event-driven NMAARObject retrieval, the objectsAtPoint: and objectsInRect: methods can be used to programmatically get NMAARObjects at a screen location:
CGPoint point = {50, 50};
NSArray* objectsAtPoint = [arController objectsAtPoint:point];
CGRect viewRect = CGRectMake(50, 50, 25, 25);
NSArray* objectsInViewRect = [arController objectsInRect:viewRect];
Selecting NMAARObjects
After retrieving an NMAARObject, you can select and deselect it by calling the selectObject: and deselectObject methods. Selecting an object causes its Info image to collapse and its Back plane image to replace its Front plane image (if the object is in the Front plane).
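For example, selecting an object retrieved from a screen point and later deselecting it (the point value is arbitrary):
CGPoint point = {50, 50};
NSArray *objects = [arController objectsAtPoint:point];
if (objects.count > 0) {
    // Collapse the Info image and show the Back plane icon
    [arController selectObject:[objects objectAtIndex:0]];
}

// Later, restore the normal representation
[arController deselectObject];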
An NMAARObject cannot be focused and selected simultaneously. However, it is possible to have one NMAARObject focused and another NMAARObject selected at the same time.
Reading the Current Pose
NMAARController provides a convenient way to retrieve the current positional and directional (pose) values of your LiveSight session. Using the poseReading property, you can retrieve an NMAARPoseReading instance, which contains the following values:
- Heading (Yaw)
- Pitch
- Roll
- Location (Latitude, Longitude, Altitude)
- Timestamp
Although NMAARPoseReading values are derived from device sensors, they are interpolated and smoothed by the LiveSight engine.
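A minimal sketch of reading the pose follows; the heading, pitch, and roll property names on NMAARPoseReading are assumptions and should be verified against the class reference:
// Read the interpolated pose of the current LiveSight session
NMAARPoseReading *pose = arController.poseReading;
// NOTE: these property names are assumed, not confirmed API
NSLog(@"heading: %f pitch: %f roll: %f", pose.heading, pose.pitch, pose.roll);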