ARchi VR Behavior

Revision Info: Documentation for ARchi VR Version 3.3 - April 2024



Event-driven Behavior in Augmented Reality

AR Experiences Driven by Spatial Context

Creating AR experiences poses additional challenges compared to designing virtual reality (VR) and 3D content (e.g., video games). When creating VR/3D scenes, designers are in control of the virtual world they are building (even if it is programmatically generated), thus taking a sort of "god role". AR experiences, in contrast, take place in the uncontrolled real world, where scene understanding algorithms detect the user's spatial context. The AR experience is then driven by elements detected in the real world, whose occurrence and timing cannot be controlled at creation time.

Imperative Programming versus Reactive Programming

ARchi VR supports declarative programming to build AR experiences. "declARe" is the declarative programming approach for specifying AR experiences in ARchi VR. The content of an AR scene is a composition of virtual 3D items that are placed into the spatial context of the running AR session.

In an imperative approach, dynamic behavior would be realized by calling functions or object methods. This is not how "declARe" works. Instead, dynamic behavior is driven by events: you define the expected reactions to events that can happen in an AR session. These reactions are expressed as active rules based on Event-Condition-Action triples.
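For illustration, such an active rule could be written along the following lines. This is a schematic sketch only: the on, if, do, and text keys mirror the event-condition-action notation used in the diagrams below, and the actual declARe JSON schema may use different key names.

```json
{
  "on": "tap",
  "if": "item.name == 'poster'",
  "do": "say",
  "text": "You tapped the poster."
}
```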

Reactive Programming with Active Rules

Whenever the ARchi VR app should perform a task, an event has to trigger its execution. The occurrence of an event is an observation that "something has happened". Events do not automatically trigger a reaction (or side effect): an active rule needs to bind an event type to a specific task execution. In ARchi VR, such an active rule is expressed as an Event-Condition-Action triple. Active rules signal the app that there is specific work to be done, which will change the internal state of the system.

Active rules are processed asynchronously to avoid coordination and waiting. In such a reactive programming approach there is no explicit control over time-ordered execution.

Event-Condition-Action Rules

To design reactive systems, breaking down the system’s behavior into discrete events, conditions, and actions provides a structured and modular approach. An event is a signal that something has occurred, such as the start of an AR session (on:start), a user tapping on an item (on:tap), or the detection of an image marker (on:detect).

Reactive Behavior Diagrams

For a compact representation of active rules, a diagram consisting of rule-reaction blocks is used. In the ARchi Composer, such diagrams can be generated from "declARe" code. The first line shows the active rule as an Event-Condition-Action triple; the blockquoted line after the rule depicts the changed state as the reaction:

Event Condition Action
> changed state as reaction

The following example of an active rule is triggered by a temporal event (in 20 seconds). If no item is found in the current AR session (the condition), the action will be voice feedback via the internal text-to-speech system:

in:20 if:items.@count == 0 do:say
> "you may add an item" 🗣

Immediate execution of tasks or function calls at invocation is the standard behavior and does not need any special condition handling. Default AR events are common triggers driven by the AR session, such as on:start, on:error, and on:stop. If no condition is defined, it evaluates to true and the diagram shows an immediate execution arrow (→):

Event → Task
> changed state as reaction
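Such an immediate rule can omit the condition entirely; a sketch in the same assumed notation:

```json
{
  "on": "start",
  "do": "say",
  "text": "Welcome to the AR session."
}
```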

Cascading reactions are presented as indented blockquotes:

on:start do:request GET:JSON
> Action ← response ••• https://service.metason.net/ar/doit.json
>
> on:command do:set
> > data.val = 0
>
> in:5 do:set
> > data.val = 5
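One way to read this cascade: the GET:JSON request returns a payload that itself contains further active rules, which are installed and may fire later. The following sketch of such a payload keeps the hypothetical notation used above; the real format served at https://service.metason.net/ar/doit.json is not specified here, and the expr key is purely illustrative.

```json
[
  { "on": "command", "do": "set", "expr": "data.val = 0" },
  { "in": 5, "do": "set", "expr": "data.val = 5" }
]
```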

Event Categories

An event is a signal that something has happened. Events are generated by a producer and are triggered by various circumstances. Within an Augmented Reality session of ARchi VR, the following event types may occur:

| Event Type | Producer | Cause | Time Resolution |
| --- | --- | --- | --- |
| Session Event | AR Session | Change of session state → on:start, on:stable, on:load, on:error, on:stop | in realtime |
| Invocation Event | Command Initiation or Function Call | Invocation of task → on:command, as:once; function call → on:call | in realtime |
| Detection Event | Installed Detector | Discovery of designated entity → on:detect | 100 - 500 ms |
| User Event | App User | User interaction → on:tap, on:press, on:drag, on:select, on:dialog, on:poi | in realtime |
| Temporal Event | Time Scheduler | Elapsed time in seconds reached → in:time | 200 ms |
| Data-driven Event | Data Observer & Context Dispatcher | Observed change of key-value in data model → as:changed, as:stated, as:steady, as:activated, as:altered, as:always, as:repeated | 200 ms |
| Response Event | Remote Request | Async response of REST API call → do:request | 20 - 5,000 ms |
| Notification Event | Subscribed System (Bonjour or SharePlay) | Received notification during collaboration → on:enter, on:leave; broadcast task → as:broadcast | 50 - 250 ms |

Session Events

Invocation Events

User Events

Temporal Events

Data-driven Events

By using a state machine, both value changes and state transitions can generate data-driven events, taking previous values into account. This dynamic triggering turns ECA rules into active rules.
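A data-driven rule therefore binds to a key path in the data model rather than to a session or user event. In the same illustrative notation (the as trigger form and the key/text values are assumptions, not the verified schema):

```json
{
  "as": "changed",
  "key": "data.val",
  "if": "data.val > 3",
  "do": "say",
  "text": "The value is now above three."
}
```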

Response Events

Detection Events

Notification Events

AR Patterns

AR patterns serve as a valuable means of communicating proven, reusable solutions to recurring design problems encountered during AR development.

Behavioral Patterns

The dynamic behavior of an AR experience is determined by its ECA rules, which are triggered by events occurring in the actual real-world context. The following table lists common behavioral patterns in AR that result from ECA rules.

| Behavioral Pattern | Description | Example |
| --- | --- | --- |
| Immediate Reaction Pattern | Direct execution of a task triggered by an invocation event | Immediate, singular command of a task or function call |
| Timed Reaction Pattern | Temporally executed action | Delayed action, timed action sequence |
| Conditional Reaction Pattern | Execute an action only when a condition is fulfilled after being triggered by an event | State-driven, asynchronous programming logic |
| Continuous Evaluation Pattern | Continuous polling of state change | Existence check, visibility check, proximity check, repeated update checks |
| Publish-Subscribe Notification Pattern | Receive notifications via a message queue from a subscribed system | In FaceTime/SharePlay call, in Bluetooth connection, in WebSocket/WebRTC session |
| Request-Response Pattern | Remote procedure call resulting in asynchronously received ECA rules or media assets | REST API call via a Web URL to load rules or assets (images, 3D models), e.g., GET:JSON or POST:CONTEXT |
| Chain Reaction Pattern | Course of events processed as a sequence of indirect reactions | Rule changing data that will trigger a rule to update an item's visual as a follow-up |
| Complementary Reactions Pattern | Two reactions with opposite results | Reacting to toggling states with two complementary active rules |
| Detector Reactivation Pattern | Reactivation of a detector whose reaction runs only once | Reactivate a detector after the resulting augmentation no longer exists |
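As an example of the Complementary Reactions Pattern, two active rules can observe the same key and react to its opposite values. The sketch stays in the hypothetical notation used on this page; the show/hide task names and the lightCone identifier are illustrative:

```json
[
  { "as": "changed", "key": "data.lamp", "if": "data.lamp == 1", "do": "show", "id": "lightCone" },
  { "as": "changed", "key": "data.lamp", "if": "data.lamp == 0", "do": "hide", "id": "lightCone" }
]
```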

Augmentation Patterns

While a VR/3D designer places virtual objects using positions in a controlled world coordinate system, an AR content creator primarily specifies object placement intents relative to appearing anchors, which are dynamically produced by detectors. These spatial anchors serve as reference points for pinning objects. In AR Patterns, augmentation intents are generally formulated as ECA rules that are triggered by detector events. When a detector event occurs, the ECA rule's reaction adds augmentation items to the AR scene.
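A typical augmentation intent can thus be read as: when the detector fires, add an item relative to the produced anchor. Sketched in the assumed notation (the add task, the anchor reference, the detection.label condition, and the model URL are illustrative):

```json
{
  "on": "detect",
  "if": "detection.label == 'marker-01'",
  "do": "add",
  "item": { "model": "https://example.com/arrow.usdz", "at": "anchor" }
}
```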

The following table outlines several common placement intents for event-driven augmentation patterns that can be used to stage AR experiences. In AR, the real world serves as the spatial context for the stage, making users both spectators and performers. Their movements and perspectives influence the firing of events, leaving limited control over time and space for AR scenography (in contrast to film, theater, and VR/3D/game design).

| Augmentation Pattern | Description | Example |
| --- | --- | --- |
| Geolocated Remark Pattern | Triggering of action or user feedback based on GPS location data (long/lat) or address data (city, street, ...) | Visual or audio feedback in standard UI about a location-based point of interest |
| Segment Overlay Pattern | Presentation of 2D overlay on top of an image segment detected in the video stream | Attaching a 2D text description to a detected image segment |
| Area Enrichment Pattern | Approximately placing 3D content at the area of an image segment | Presenting balloons in the sky area |
| Captured Twin Pattern | Captured element of the real world added as a 3D model | Captured walls, doors, and windows in an indoor AR session |
| Anchored Supplement Pattern | Presentation of 3D content aligned to a detected entity for enhancement | Attaching visual 3D elements to a detected image (marker) or captured object |
| Superimposition Pattern | Presentation of 3D content replacing a detected entity | (Re-)placing a detected image with another one |
| Tag-along Pattern | Presentation of 3D content within the user's field of view while head-locked | Placing interactive 3D elements that follow the user |
| Hand/Palm Pop-up Pattern | Presentation of 3D content on the palm of the hand while visible | Placing 3D UI elements at the palm of one of the user's hands |
| Ahead Staging Pattern | Presentation of 3D content ahead of the user | Placing a 3D item on the floor in front of the spectator |
| Pass-through Portal Pattern | Presentation of partly hidden 3D content to make the user go through | Placing a 3D scene behind a portal / behind an opening |
| Staged Progression Pattern | Ordered, linear story: temporal order or interaction flow of 3D presentations | Sequence of 3D presentations with forth and optionally back movements |
| Attention Director Pattern | Guide the user's attention to a relevant place | Use animated pointers to direct the user's attention |
| Contextual Plot Pattern | Spatio-temporal setting that aggregates diverse AR patterns to form a non-linear plot | Scenography of dynamic, interactive, and animated AR |

For more details on AR Patterns, see github.com/ARpatterns.