ARchi VR Behavior

Revision Info: Documentation for ARchi VR Version 3.1 - March 2023

Up to ARchi VR Content Creation

Event-driven Behavior in AR

ARchi VR supports declarative programming to build AR experiences. "declARe" is the declarative programming approach for specifying AR experiences in ARchi VR. The content of an AR scene is a composition of virtual 3D items that are placed into the spatial context of the running AR session.

Imperative Programming versus Reactive Programming

In an imperative approach, dynamic behavior would be realized by calling functions or object methods. This is not the way to go with "declARe". Instead, dynamic behavior is driven by events. You therefore need to define the expected reaction to events that can happen in an AR session. These reactions are expressed as active rules based on Event-Condition-Task triples.

Reactive Programming with Active Rules

Whenever the ARchi VR App should perform a task, an event has to trigger its execution. The occurrence of an event is an observation saying that "something has happened". Events do not automatically trigger a reaction (or side effect): an active rule needs to bind an event type to a specific task execution. In ARchi VR this is realized as an "Active Rule", expressed as an Event-Condition-Task triple. Active rules signal the App that there is specific work to be done which will change the internal state of the system.

Active rules are processed asynchronously to avoid coordination and waiting. In such a reactive programming approach there is no control over time-ordered execution.

Reactive Behavior Diagram

For a compact representation of active rules, a diagram consisting of rule-reaction blocks is used. The first line shows the active rule as an Event-Condition-Task triple. The blockquoted line after the rule depicts the changed state as the reaction:

Event Condition Task

changed state as reaction

The following example of an active rule is triggered by a temporal event (in 20 secs). If no item is found in the current AR session (the condition), the reaction is voice feedback using the internal Text-To-Speech system:

in:20 if:items.@count == 0 do:say

"you may add an item" 🗣

If no condition is defined, it evaluates to true and the diagram shows an immediate execution arrow (→):

Event Task

changed state as reaction

Cascading reactions are presented as indented blockquotes:

at:start do:request GET:JSON

Action ← response •••

on:command do:set

data.val = 0

in:5 do:set

data.val = 5

Event Production Principles

An event is a signal that something has happened. Events are generated by producers and triggered by various circumstances.

Event Categories

Within an Augmented Reality session of ARchi VR the following event types may happen:

| Event Type | Producer | Cause | Time Resolution |
| --- | --- | --- | --- |
| Session Event | AR Session | Change of session state → at:start, at:stable, on:load, on:error, at:stop | in realtime |
| Invocation Event | Command Initiation or Function Call | Invocation of task → on:command, as:once; function call → on:call | in realtime |
| User Event | App User | User interaction → on:tap, on:press, on:drag, on:select, on:dialog, on:poi | in realtime |
| Data-driven Event | Data Observer & Context Dispatcher | Observed change of key-value in data model → as:changed, as:stated, as:steady, as:activated, as:altered, as:always, as:repeated | 200 ms |
| Temporal Event | Time Scheduler | Elapsed time in seconds reached → in:time | 200 ms |
| Response Event | Remote Request | Async response of REST API call → do:request | 20 - 5'000 ms |
| Detection Event | Installed Detector | Discovery of designated entity → on:detect | 100 - 500 ms |
| Notification Event | Subscribed System: Bonjour or SharePlay | Received notification on collaboration → on:enter, on:leave; Broadcast task → as:broadcast | 50 - 250 ms |

Common AR Session Events

Immediate execution of tasks or function calls at invocation is standard behavior and does not need any special handling. Default AR events are common triggers driven by the AR session, such as at:start, on:error, at:stop.
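For example, a rule bound to the session start (as also used in the Timed Reaction Pattern below) speaks a greeting:

```
at:start do:say
    "Here we go." 🗣
```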

Listener Activation Pattern

Listeners are activated and dispatched on events such as as:always, on:change, on:error, ...
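As a sketch, a listener observing a key-value in the data model can be activated by a data-driven event from the table above (data.val is an illustrative key):

```
as:changed if:data.val > 0 do:say
    "value has changed" 🗣
```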

Detector Installation Pattern

To capture a specific entity, a detector is installed with a do:detect task (Detector Activation pattern).
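For example, a feature detector for chairs (the same rule is used in the Detector Reactivation Pattern below) is installed as:

```
on:command do:detect:feature
    chair 👁
```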

Condition Evaluation Principles

Predicates within AR Context

Boolean Functions in Predicate Evaluation

Functional Side Effects

Functions with Spatial Results

Function/Task Equivalence

Tasks and functions form pairs with a naming equivalence: the implicit verb of a task do:name corresponds to the function name used in a function(params) call.
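As an illustrative sketch of this equivalence, voice feedback can be triggered either as a do:say task or via the say function passed to do:execute:op (text values are placeholders):

```
on:command do:say
    "Hello" 🗣

on:command do:execute:op
    function('Hello', 'say') ◀
```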

Content Composition Principles

Unique Identifier Principle

Items are identified by a uid, not a uuid.

Uniform Classification Principle

Items are uniformly classified via type, subtype, and name.

Spatial Model Modification Pattern

The spatial model is modified dynamically, e.g., with do:add tasks or on on:load events.
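For example, an item can be added dynamically to the spatial model at session start (the same rule is used in the Complementary Reactions Pattern below):

```
at:start do:add ahead 0.0 1.0 -0.9 ➕
```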

Component Aggregation

Items are aggregated into a scene graph following the composition pattern, e.g., with the do:addto task.

Visual Representation Modification Pattern

The visual representation can be modified in 3D (e.g., with do:animate) or in the 2D UI (e.g., with overlays, ...).

Behavioral Patterns

| Behavioral Pattern | Description | Example |
| --- | --- | --- |
| Immediate Reaction Pattern | Singular execution of a task triggered directly by an invocation event | Immediate, singular command of a task or function call |
| Timed Reaction Pattern | Temporally executed action | Delayed action, timed action sequence |
| Conditional Reaction Pattern | Execute an action only when a condition is fulfilled after being triggered by an event | State-driven, asynchronous programming logic |
| Indirect Reaction Pattern | Execute an action that is triggered by a changed state caused by a former event | Loosely coupled reactive programming |
| Request Response Pattern | Remote procedure call (RPC) to a server resulting in asynchronously receiving an action | REST API call via a Web URL, e.g., GET:JSON or POST:CONTEXT |
| Chain of Reactions Pattern | Course of events processed as a sequence of indirect reactions | The response of an async request loads updated data, which triggers a visual change in an item as a follow-up |
| Complementary Reactions Pattern | Two reactions with opposite results | Reacting to toggling states with two complementary active rules |
| Detector Reactivation Pattern | Reactivate a detector that reacts only once | Reactivate a detector after its result is no longer visible |
| Publish-Subscribe Notification Pattern | Receive notifications via a message queue from a subscribed system | In a FaceTime/SharePlay call, a Bluetooth connection, or a WebSocket/WebRTC session |
| Continuous Evaluation Pattern | Continuous polling of state change | Existence check, visibility check, proximity check, repeated update checks |

Immediate Reaction Pattern

on:command do:say

"Immediate Reaction Pattern" 🗣

Timed Reaction Pattern

in:5 do:say

"Timed Reaction Pattern" 🗣

An example of a sequence of timed reactions:

at:start do:say

"Here we go." 🗣

in:4 do:say

"Timed Reaction Pattern" 🗣

in:7 do:say

"Good bye" 🗣

in:10 do:exit

Conditional Reaction Pattern

on:change if:items.@count >= 1 do:say

"Conditional Reaction Pattern" 🗣

The built-in state management is observing model as well as data elements and dispatches the processing of reactions.

Indirect Reaction Pattern

Because active rules are processed asynchronously to avoid coordination and waiting, coupling of data and visuals or audio is achieved by observing state changes.

on:command do:assign

data.flag = 1

as:stated if:data.flag == 1 do:say

"Indirect Reaction Pattern" 🗣

Request Response Pattern

Remote request with async response:

on:command do:request GET:JSON

Action ← response •••

on:command do:say

"Async Response Reaction" 🗣

Hint: REST is stateless; therefore, handle state on the client.

Chain of Reactions Pattern

A sequence of several chained reactions.
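A sketch of such a chain, composed of constructs shown above (assuming the response of the request updates the illustrative key data.val in the data model, which in turn triggers the second rule):

```
on:command do:request GET:JSON
    Action ← response •••

as:changed if:data.val > 0 do:say
    "data has been updated" 🗣
```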

Continuous Evaluation Pattern

Continuous query as repeated evaluation on each state change:

as:stated if:function('', 'proximity') < 1.2 do:execute

function('https://___', 'getJSON') ◀

Temporally controlled repetition:

as:repeated each 60 secs do:say

"Another minute." 🗣

Complementary Reactions Pattern

at:start do:add ahead 0.0 1.0 -0.9 ➕

on:altered if:function('', 'visible') == true do:say

"you see box" 🗣

on:altered if:function('', 'visible') == false do:say

"now you don't" 🗣

Publish-Subscribe Notification Pattern

on:enter do:say

"New participant entered session." 🗣

Detector Reactivation Pattern

Some detectors stop after capturing a first occurrence of the designated entity and need to be reactivated by a do:redetect task. The reactivation can be driven by a separate active rule, e.g., with a condition evaluating the visibility of an item added by the detector.

on:command do:detect:feature

chair 👁

on:detect do:execute:op

function('I found a chair', 'say') ◀

on:detect do:add

detected.feature.chair ➕

on:altered if:function('detected.feature.chair', 'visible') == false do:redetect


Augmentation Patterns

| Augmentation Pattern | Description | Example |
| --- | --- | --- |
| Geo-Located Action Pattern | Triggering of an action or of user feedback based on GPS location data (long/lat) or address data (city, street, ...) | Visual or audio feedback in the standard UI about a location-based point of interest |
| Segment Overlay Pattern | Presentation of a 2D overlay on top of an image segment detected in the video stream | Attaching a 2D text description to a detected image segment |
| Captured Twin Pattern | Captured element of the real world added as a 3D model | Captured walls, doors and windows in an indoor AR session |
| Anchored Decorator Pattern | Presentation of 3D content aligned to a detected entity for enhancement | Attaching visual 3D elements to a detected image (marker) or captured object |
| Anchored Substitution Pattern | Presentation of 3D content replacing a detected entity | (Re-)Place a detected image with another one |
| Ahead Staging Pattern | Presentation of 3D content ahead of the user | Placing a 3D item on the floor in front of the spectator |
| Anchored/Staged Progression Pattern | Ordered, linear 3D story: temporal order or interaction flow of 3D presentations | Sequence of 3D presentations with forth and optionally back movements |
| Contextual Plot Pattern | Spatio-temporal setting that reacts to diverse event types to form a non-linear plot | Scenography of dynamic, interactive, and animated AR |

In event-driven augmentation patterns, the user is spectator and actor/performer at the same time. The result is an illusive scenography with no absolute control over time and space (in contrast to film, theater, and 3D game design): the real context serves as the stage, with event-driven rules creating the dynamic behavior.

Geo-Located Action Pattern

An action triggered by geo-location, or a presentation of location-based information, is typically provided only in the 2D UI or as audio feedback and is not placed at the located position in 3D (due to the precision restriction of +/- 20 m of the GPS signal).

Segment Overlay Pattern

Computer vision and machine learning approaches are used for detecting classified landmarks and image segments.

Results of image segmentation may consist of classified landmarks or image segments.

The overlay is positioned pixel-based, relative to the segment in the 2D image, on top of the video stream in the AR session.

Captured Twin Pattern

The captured twin is virtually visualized as a contour or transparent plane in 3D to keep the real object recognizable; alternatively, it has no visible representation at all but is still available in the spatial data model.

Anchored Decorator Pattern

Additional information is aligned to a spatially detected entity: e.g., a 3D model anchored to a detected image, aligned to a door, ...

Anchored Substitution Pattern

Replace a real-world object with a virtual object.

Ahead Staging Pattern

Ahead staging places 3D content aligned to the spectator's position and view direction.
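For example (the same rule as in the Complementary Reactions Pattern above), an item is staged ahead of the spectator at the relative coordinates 0.0 1.0 -0.9:

```
at:start do:add ahead 0.0 1.0 -0.9 ➕
```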

Anchored/Staged Progression Pattern

A linear story: a directional or bidirectional, ordered presentation of 3D content in AR with an explicit start. It might start at an anchored entity or be staged ahead of the user.

Contextual Plot Pattern

The contextual plot is aligned to the spatial context: staging of a plot plus event-driven behavior rules.

Back to ARchi VR Content Creation

Copyright © 2020-2023 Metason - All rights reserved.