Preserving Immersive Media Knowledge Base

Getting Started


Preserving XR Hardware


Preserving 3D Software


Preserving 360° Video


Licence

All content hosted on the Knowledge Base site is shared under an Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license, unless otherwise stated. A human-readable summary of the full license is provided below. Additional licences may apply to non-hosted content (e.g. links out or embedded media).

You are free to:

  • Share — copy and redistribute the material in any medium or format

  • Adapt — remix, transform, and build upon the material for any purpose, even commercially

  • The licensor cannot revoke these freedoms as long as you follow the license terms.

Under the following terms:

  • Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.

  • ShareAlike — If you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original.

  • No additional restrictions — You may not apply legal terms or technological measures that legally restrict others from doing anything the license permits.

Introduction

XR hardware refers to the physical computer hardware components used to access XR content. Contemporary XR hardware typically consists of a set of interconnected off-the-shelf hardware and software components, which we will refer to as an XR system.

Assessing XR Hardware

  • What kind of computer system is needed to run the software?

    • How flexible is the specification of the hardware?

    • Is a dedicated GPU required? If so, does it require one with a specific feature set?

  • How is sound played back? Is this played back via the HMD, or is additional hardware required (e.g. audio interface, speakers, headphones)?

  • What kind of interactivity is supported? What kind of hardware is needed to support this?

    • Does the experience require the use of a controller or other human interface device (HID)?

    • If so, can this be fulfilled by a generic piece of hardware, or is a specific or custom-made device required?

  • Does the experience involve sound? How is this played back?

    • Is spatial audio utilised and if so how is it played back? e.g. via positional speakers or head-related transfer function (HRTF) through an HMD or headphones.

    • Is subtitling or captioning used, or might they be needed?

  • Are there any technical or conceptual reasons that specific XR hardware (e.g. an HMD) is needed to access the XR content?

    • Dependency relationships with specific hardware can be baked into software when it is created (e.g. by using certain manufacturer-specific plugins).

    • A creator might prefer certain hardware for its characteristics (e.g. technical, visual, conceptual).

  • How many users are supported simultaneously? e.g. single or multiple stations providing access, discrete or shared experience.

  • Does the software require internet access? If so, what for and does this need to be maintained?
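The answers to these prompts can be captured in a simple structured assessment record, which makes works easier to compare across a collection. The sketch below is a hypothetical Python schema; the class and field names are illustrative, not an established standard.

```python
from dataclasses import dataclass, field

@dataclass
class XRHardwareAssessment:
    """Hypothetical record mirroring the assessment prompts above."""
    computer_spec: str                 # kind of computer system needed
    gpu_required: bool                 # is a dedicated GPU required?
    gpu_features: str                  # specific GPU feature set, if any
    audio_playback: str                # e.g. "HMD built-in", "positional speakers"
    controllers: list = field(default_factory=list)  # HIDs the work depends on
    hardware_dependencies: str = ""    # technical/conceptual reasons for specific hardware
    simultaneous_users: int = 1        # single or multi-user access
    requires_internet: bool = False    # and whether access must be maintained

record = XRHardwareAssessment(
    computer_spec="Windows 10 PC",
    gpu_required=True,
    gpu_features="DirectX 11",
    audio_playback="HMD built-in headphones",
    controllers=["Oculus Touch"],
)
print(record.simultaneous_users)  # → 1
```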

Acquisition Checklist

If specific or custom hardware is an important component of the XR material you are caring for, the following measures may help you maintain access and prepare for long-term preservation:

  • Collect multiple examples of exemplar hardware as a means of providing access in the short term.

  • Create disk images of any internal storage media of computers.

  • Gather documentation of the hardware and its components. For custom hardware devices, try to source or create circuit diagrams and a bill of materials.

  • Extract and archive any related software components (e.g. the XR runtime which is required to drive the XR hardware system).
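Disk imaging itself can be done with established tools (e.g. dd, Guymager or FTK Imager). As a minimal illustration of the principle, the Python sketch below copies a source to an image file and records a SHA-256 checksum for later fixity checks. It is demonstrated on a dummy file; in practice the source would be a device node such as /dev/sdX, and handling of read errors and forensic formats such as EWF is out of scope for this sketch.

```python
import hashlib

def image_media(source_path, image_path, block_size=4 * 1024 * 1024):
    """Copy raw media to an image file and return its SHA-256 checksum."""
    sha = hashlib.sha256()
    with open(source_path, "rb") as src, open(image_path, "wb") as dst:
        while True:
            block = src.read(block_size)
            if not block:
                break
            dst.write(block)
            sha.update(block)
    return sha.hexdigest()

# Demonstration with a dummy file standing in for a drive:
with open("dummy-media.bin", "wb") as f:
    f.write(b"\x00" * 65536)
checksum = image_media("dummy-media.bin", "dummy-media.img")
print(checksum)
```

Storing the checksum alongside the image allows the archived copy to be verified at any later date.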


    Preserving Immersive Media Knowledge Base


    This site is a work in progress! Articles may be empty or incomplete, and might move around.


    Can you share your knowledge with the community and help us add to and improve this site? We would welcome your contributions! Read our guide to contributing to get started.

    The Preserving Immersive Media Knowledge Base is a resource created to help share information between members of the digital preservation community who are caring for virtual reality (VR), augmented reality (AR), mixed reality (MR), 360 video, real-time 3D software and other similar materials. This site was born out of Tate's Preserving Immersive Media Project, with additional funding from the Netherlands Institute for Sound & Vision.

    The Knowledge Base is designed as a flexible and collaborative space for sharing information and materials that may be work-in-progress or based on best available knowledge. As such, it is constantly evolving and pages can never be considered final or authoritative. If you are new here, these pages might be useful places to start:

    • Getting started with immersive media preservation.

    • Glossary of frequently encountered immersive media and preservation terminology.

    • Open questions that the community is trying to answer.

    • Guide to contributing to the Knowledge Base.

    • Recordings and notes from Preserving Immersive Media Group events.

    For a longer read, the white paper published as an output of Tate's immersive media research is a useful place to start learning about preserving VR.


    Community

    Preserving Immersive Media Group

    The Preserving Immersive Media Group (PIMG) is a community and email list for those interested in collecting, preserving and stabilising artworks that utilise immersive media, including 360 video, real-time 3D, virtual reality and augmented reality. This group was born out of the ongoing Preserving Immersive Media project at Tate. We encourage all with an interest in these topics to join the PIMG email list on Groups.io. Members are welcome to participate in discussion and to share any relevant information via the list, providing they agree to the PIMG Code of Conduct.

    PIMG also runs regular events, recordings and outputs from which are made available online.

    List of Immersive Media Preservation Projects


    The purpose of this list is to foster exchange and collaboration between projects exploring the preservation of immersive media materials and experiences. This is a growing list maintained by the community. If you are involved in an immersive media preservation project, please consider adding it.

    Project | Host(s) | URL | Contact
    Preserving Immersive Media | Tate, London, UK | https://www.tate.org.uk/about-us/projects/preserving-immersive-media | jack.mcchonchie@tate.org.uk and tom.ensom@tate.org.uk
    The Immersive Archive | Mobile & Environmental Media Lab, USC School of Cinematic Arts, USA | https://immersivearchive.org/ | info@immersivearchive.org
    Collecting and Archiving Immersive Experiences | CRAIC, UKRI and the V&A, UK | https://craic.lboro.ac.uk/essays/collecting-and-archiving-immersive-experiences/ | (not listed)

    List of Immersive Media Artworks

    This is a growing list maintained by the community. If you care for an immersive media artwork, please consider adding it.

    The purpose of this list is to show the variety of artworks and technologies used, but also to foster exchange between conservators. The listed artworks are not necessarily well-documented case studies. The list includes not only VR- or AR-based artworks, but also other interactive artworks produced with a game engine, which hence require similar preservation strategies.

    Artwork | Artists | Artwork date | Game engine | Runtime | Game engine, latest version (year) | Display | Interactivity | Institution owning artwork
    (unknown) | (unknown) | 2012 | (unknown) | (unknown) | (unknown) | Projection, U-shaped, plus flat screen for navigation | Navigation with joystick | HEK (House of Electronic Arts, Basel)
    Passage Park #7: Relocate | Studer / van den Berg | 2017 | DarkBASIC Professional V1.05 | Windows 7 / 10 with DirectX 9 | 2016 | Projection 1920x1080 | Walkthrough (like a 360 video); looking around using a mouse | HEK (House of Electronic Arts, Basel)
    We were looking for ourselves in each other | Mélodie Mousset | 2015 | Unity | Windows 10, Oculus Rift | under development | Oculus Rift headset | Navigation with Xbox controller as part of Oculus Rift | HEK (House of Electronic Arts, Basel)
    Zone*Interdite | Wachter / Jud | 2000, ongoing | Crystal Space | Windows 10 | (unknown) | (unknown) | (unknown) | (unknown)

    Open Questions

    The preservation of immersive media is an emerging topic in digital preservation and presents many new challenges and opportunities for research. This page is a place for tracking questions and prompts that we have arrived at through research, discussion and daydreaming. Do you think you can contribute an answer to any of these, or do you have questions of your own? We'd love your contribution!

    VR

    • What are the most effective approaches to consistently capturing video documentation of VR experiences (including field-of-view capture, video, etc.)?

    • What does the process of adding OpenXR support to an existing VR experience look like? Are there any changes that result and how might these be managed?

    • Diagram and technical specs for underlying technology of headsets

    3D File Formats

    • Which 3D data file formats are most suitable/sustainable for preservation purposes? Are there any which are not?

    Tools

    • Gap analysis of preservation tools - what is available, what could be developed?

    • Calibration tools: comparison with game brightness, etc.

    • Capture tools

    Acquisitions

    • A complexity matrix: a tool to enable advocacy for institutional resources.

    Documentation

    • How can we navigate the relationship between the artwork and the technology, to what extent is the experience determined by an individual’s engagement with the peripheral devices, and to what extent might this be understood and managed over time?

    • What non-technical frameworks exist to help us understand user experience? e.g. oral histories, historical context of technologies

    • What combination of technical tools and non-technical frameworks might we employ to try to create a sustainable catalogue of user experience, and how might this feed into the historical records of an artwork?

    360 Video

    • What are the most appropriate metadata standards and how can they be applied?

    PIM Knowledge Base

    • How do we identify and deal with broken links in the future? GitBook doesn't offer this service, but could we use the GitHub repo (e.g. https://github.com/marketplace/actions/broken-link-check)?

    • How should we run/support the knowledge base in terms of encouraging contributions? How might this play out in the long-term?

    • File format identification: do existing tools support 3D objects and 3D software?

    • If an artwork is so complex as to become unsustainable, how might we interpret its documentation for a new audience?

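In the meantime, exported pages can be scanned for links with a small script. The sketch below is hypothetical (the regex and function names are illustrative, not an existing tool), and actually checking a link requires network access:

```python
import re
from urllib.request import Request, urlopen
from urllib.error import URLError

LINK_RE = re.compile(r"\[([^\]]*)\]\((https?://[^)\s]+)\)")

def extract_links(markdown_text):
    """Return the external URLs in markdown-style [text](url) links."""
    return [m.group(2) for m in LINK_RE.finditer(markdown_text)]

def link_is_alive(url, timeout=10):
    """True if the URL answers without an HTTP error (needs network access)."""
    try:
        req = Request(url, method="HEAD", headers={"User-Agent": "link-check"})
        with urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except URLError:
        return False

page = "See the [PIMG list](https://groups.io/g/pimg) and [Tate](https://www.tate.org.uk)."
print(extract_links(page))  # → ['https://groups.io/g/pimg', 'https://www.tate.org.uk']
```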

    Bibliography

    We are gathering a curated list of publications and other resources on the topic of immersive media and its preservation. You will find references to external resources embedded throughout the Knowledge Base, while an extensive bibliography of references can be found in our public Zotero library:

    https://www.zotero.org/groups/4453604/preserving_immersive_media/library

    If you want to add something to the bibliography, drop us an email so we can invite you to the Zotero group or add it for you: tom [at] tomensom.com

    Past PIMG Events

    The Preserving Immersive Media Group (PIMG) runs occasional workshops, webinars and other events; links to recordings of presentations, slides and other materials relating to them can be found here. Event recordings from PIMG events can be found on the PIMG YouTube channel.

    Knowledge Building Sessions

    A series of collaborative events held during 2023-24, focused on knowledge building activities around particular topics in immersive media preservation. Sessions invite participants to share experiences and collectively work towards building knowledge in this field, with a focus on possible contributions to the knowledge base.

    The first took place on Tuesday 12th December 2023 and collaborative notes from this session can be found here: https://docs.google.com/document/d/1EeVqwLUY0xnCaV7aICbW8ZOMPLl8jrqd18obppoa83E/edit?usp=sharing

    Documenting the Interactive Documentary Webinar (6 November 2020)

    Recordings of presentations are available online via a YouTube playlist: https://www.youtube.com/watch?v=K5ufLcoGMJg&list=PLQvZVm5rUUpgTLRINBfKvCxWLFkwlW2S-

    Preserving Immersive Media Workshop (27 March 2020)

    Recordings of presentations are available online via a YouTube playlist: https://www.youtube.com/watch?v=x_hz7Hg3hKs&list=PLQvZVm5rUUphocASxU0LFcOVT6aj5Kch3

    Slides, Notes and other Documents

    iPRES 2019 Hackathon

    The iPRES 2019 hackathon was an effort to better understand the variability of virtual reality artworks, e.g. what differs between artworks depending on the choices made by artists, and what happens when we make changes to elements of the work.

    A full abstract can be found on the iPRES website: https://ipres2019.org/program/conference-programme/?session=117

    Detailed notes from the event can be found on the collaborative notepad hosted by Rhizome: https://notepad.rhizome.org/ipres2019-vr

    Preserving Immersive Media Workshop (8 March 2019)

    Slides, Notes and other Documents

    Glossary

    This glossary is an effort to gather definitions of frequently encountered immersive media terminology and support a common vocabulary in collaboration between disciplines. The first version was compiled during iPRES 2019 by participants in the VR Hackathon event, using definitions sourced from existing online glossaries:

    • Unity

    • XinReality wiki

    • Oculus Creators Portal

    • Interactive Advertising Bureau (IAB): AR and VR Terminology

    • Digital Preservation Handbook

  • To do:

    • Review and agree on these definitions

    • Add visual examples/images to illustrate what is being described

    • Add relevant digital preservation terms

    VR General

    Virtual Reality: Virtual reality, commonly abbreviated to VR, is a technology that simulates a fully immersive virtual or imaginary environment in which a user feels that they are physically present.

    Augmented Reality: Augmented reality, commonly abbreviated to AR, is a technology that overlays virtual elements on top of a real-world environment.

    Augmented Virtuality: Similar to augmented reality, this refers to a technology system whereby a largely virtual environment is merged with real-life objects.

    Mixed Reality: Described variously as either mixed reality, MR or hybrid reality, this term refers to any technology that isn’t a fully immersive VR system, but instead augmented reality or augmented virtuality (see above definitions). This is also (confusingly) used to describe Microsoft’s virtual platform, which includes both VR and AR devices.

    Three degrees of freedom: Often abbreviated to 3DoF, this term refers to the ability to rotate around three axes, namely pitch, yaw and roll.

    Six degrees of freedom: Often abbreviated to 6DoF, this term refers to the ability to move in six ways: rotation around three axes (pitch, yaw and roll) plus translation (elevation, strafing and surging).
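The distinction can be made concrete in code: a 3DoF pose carries orientation only, while a 6DoF pose adds position. A hypothetical sketch, with illustrative class and field names:

```python
from dataclasses import dataclass

@dataclass
class Pose3DoF:
    """Orientation only: what a rotationally tracked headset reports."""
    pitch: float
    yaw: float
    roll: float

@dataclass
class Pose6DoF(Pose3DoF):
    """Orientation plus position: the headset also tracks translation."""
    x: float
    y: float
    z: float

head = Pose6DoF(pitch=0.0, yaw=90.0, roll=0.0, x=1.0, y=1.6, z=2.0)
print(isinstance(head, Pose3DoF))  # → True
```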

    CAVE (cave automatic virtual environment): A virtual reality environment consisting of 3 to 6 walls that form a room-sized cube.

    On-rails: A VR experience in which there is no significant use of positional tracking. It has a start and an end like a video.

    Scene behaviours: Whether and how objects in the scene can be manipulated, and through what interactions.

    Caching: A grid/cache appears if the user moves out of the safe area.

    VR runtime: The integration of software and hardware components (specific headset drivers etc.) required to run VR.

    Presence/immersion: Both presence and immersion are used interchangeably to describe the sensation of feeling physically present within a virtual experience, as opposed to the detachment experienced through experiencing content via a conventional screen-based medium.

    Performance

    Frames per second: Also known as frame rate or fps, this measures how often images (also called ‘frames’) are shown consecutively. This is related to, yet distinct from, refresh rate (see below). 60 frames per second is usually considered playable without causing motion sickness, but the best VR headsets target even higher rates.

    Refresh rate: This specifically indicates how often the buffer is updated and an image (often called a ‘frame’) regenerated on a screen, an important element when creating a realistic virtual environment. This is measured in Hertz (Hz) and is related to, yet distinct from, frames per second (see above). A low refresh rate can cause judder (see below).
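The frame rate determines the per-frame rendering budget, which is why higher-rate headsets demand more of the hardware; a quick illustration:

```python
def frame_budget_ms(fps):
    """Time available to render each frame, in milliseconds."""
    return 1000.0 / fps

for fps in (60, 72, 90, 120):  # rates commonly targeted by VR headsets
    print(f"{fps} fps -> {frame_budget_ms(fps):.2f} ms per frame")
```

At 90 Hz, for example, the renderer has roughly 11.1 ms to produce each frame.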

    Judder: Typically caused by a low refresh rate (see above) or dropped frames, judder is the manifestation of motion blur (also known as smearing) and the perception of more than one image simultaneously (known as strobing). This can cause simulator sickness (see below).

    Latency: The delay between a user's action and the system's response in a VR system; for example, images that are not well synchronised with sound or head movement, changing later than expected.

    Behaviour

    Teleportation: A common method of virtual navigation, this allows the user to quickly move between points without having to traverse the distance between them.

    Rendering

    3D API: A library and interface supporting common 3D rendering tasks. Examples include DirectX (Windows), OpenGL (cross-platform), Metal (MacOS), Vulkan (cross-platform).

    Ambient occlusion: A lighting model that calculates the brightness of a pixel in relation to nearby objects in the scene, producing film-like lighting quality with real-time performance.

    Anti-aliasing: Raster images are made of rectangular pixels, which can lead to jagged edges in curved lines. Anti-Aliasing aims to reduce the jaggedness created by these pixels, and there are multiple techniques to achieve this.

    Fast Approximate Anti-Aliasing (FXAA): The least demanding type of anti-aliasing. Rather than running complex calculations depending on the geometry and colors displayed, FXAA simply applies extensive blurring to obscure the jagged edges. The end result is unnoticeable performance impact but a generally blurrier image.

    Multi-sampling Anti-Aliasing (MSAA): Relies on color manipulation around geometric shapes to produce an effect of smoothness. It can use either 2, 4 or 8 samples; the higher the sample count, the higher the quality and the performance impact.

    Super Sampling Anti-Aliasing (SSAA): Forces the GPU to render at a higher resolution and then downsample, increasing the overall pixel density of the display and rendering a much sharper image.
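The cost of supersampling grows with the square of the scale factor, since both axes are multiplied; for example:

```python
def ssaa_pixel_cost(width, height, factor):
    """Pixels shaded per frame when rendering at factor-times resolution per axis."""
    return (width * factor) * (height * factor)

base = ssaa_pixel_cost(1920, 1080, 1)
ssaa2 = ssaa_pixel_cost(1920, 1080, 2)
print(ssaa2 // base)  # → 4  (2x supersampling shades four times as many pixels)
```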

    Bloom: Creates blurry, feathered edges around light sources in post-processing.

    Foveated rendering: A tracking-based rendering method where the user’s eye movements are tracked, allowing peripheral vision to be rendered at a lower quality, thus reducing the amount of processing needed to render a VR experience in real-time.

    Lens Flare: The image is processed to replicate light reflecting in a simulated camera lens.

    Material shaders: (PBR vs traditional methods, BSDF/BRDF)

    Shader: A shader is a small computer program designed to run on the GPU. Shaders are written in languages associated with 3D APIs, such as HLSL (DirectX), GLSL (OpenGL) and SPIR-V (Vulkan).

    Asynchronous Reprojection / Spacewarp / Timewarp:

    Lens Distortion:

    Navigation & Tracking

    Inside-out tracking: The headset has all the tracking technology built in; cameras fixed to the device being tracked determine how its position changes relative to its environment. Disadvantage: it cannot track the controllers behind the user.

    Outside-in tracking: Sensors/cameras mounted around the user create a tracked play space, using externally placed positional sensors to track the user moving in real time. The benefit of this design is that players are monitored wherever they go within that area, and so are the controllers.

    Eye tracking: The measurement of eye positioning and movement to discern where exactly a user is looking. This is a crucial element of foveated rendering (see above).

    Head tracking: This is a method of tracking a user in virtual reality whereby the picture shifts as they move or angle their head.

    Motion tracking: The use of positional sensors and markers that register where a device is, allowing it to be mapped to a virtual environment.

    Rotational tracking: A term used to describe how a piece of hardware determines how tilted something is. There are three different ways that an object can rotate; we call each of these a degree of freedom, and any number of degrees of freedom can be tracked (see above for 3DoF and 6DoF).

    Positional sensor: Devices used to track the exact position of the user while they are using a VR system, and feedback data that is used to inform the information being shown on the screen.

    Room-scale: A virtual reality set-up that, thanks to an expansive configuration of positional sensors (see above), allows the user to physically roam around an entire room without experiencing limitations.

    Fixed viewpoints: Common to many VR experiences – particularly those that are mobile-based – this feature limits the user to a pre-defined number of explorable positions within a virtual reality build, rather than allowing open-world exploration.

    First or Third person Experience: Whether the user experiences the virtual world as through their own eyes, or as an external observer.

    Side Effects

    VR sickness: typically involves symptoms of dizziness and nausea.

    Simulator sickness: Sometimes referred to as VR sickness and with similar effects to motion sickness, this can be caused by factors including judder and users perceiving self-motion when stationary.

    Asynchronous timewarp: A middle ground between high-persistence and low-persistence displays that “warps” the current frame to compensate for the motion of your head before showing you the next rendered frame.

    Spatial desync: the effect or difference occurring when a user’s movements in real life and the VR avatar’s movements are out of sync.

    Peripherals

    Head Mounted Displays (HMD)

    Computer-based or tethered HMD: describes any headset that requires a connection to a stand-alone PC in order to function. Well-known computer-based systems include Facebook’s Oculus Rift and the HTC Vive.

    Mobile-based HMD: Any HMD where the processing and display for the VR experience are provided by a mobile phone. Notable examples include the Samsung Gear VR and Google Daydream headset.

    Standalone HMD: A VR or AR HMD where the entire system is self-contained within the device. Examples include the Microsoft HoloLens mixed reality headset and the upcoming Oculus Go from Facebook.

    Sound

    Headphones are usually provided with or within HMDs. Specifications are often contingent on brand, but common features may include 3D spatial sound for an immersive experience.

    Binaural:

    Hand controllers

    Peripheral device/s, often left and right to provide full motion tracking of a player’s hands in VR experience/play.

    Preservation

    General

    Complex Digital Objects: An object, item, or work with a component that is digital, is made up of multiple files (likely of differing file formats) and may or may not incorporate a physical component. A Complex Digital Object is likely to depend on software, hardware, peripherals, and/or networked or non-networked data, platforms, and/or services. Contrast with a Simple Digital Object which consists of a single file. XR materials can be considered Complex Digital Objects.

    Documentation

    Emulation

    Migration

    Code Migration


    Overview

    Immersive media (XR) is a term used to describe a set of related technologies that aim to extend our physical reality in various ways. You might be familiar with terms like Virtual Reality (VR) and Augmented Reality (AR), both of which are kinds of immersive media. In this section of the Knowledge Base, you will find information to help you get started with immersive media preservation and steer you towards resources appropriate to your context.

    The recommended route through these pages is to start by reviewing the prompts in the Initial Assessment section, which will help you characterise the XR material/experience you are working with and then point you in the direction of relevant sections of the Knowledge Base.

    Code of Conduct


    By using this site, you agree to abide by the code of conduct outlined on this page.

    Document History

    The first version of this Code of Conduct is based on that used by the . Future updates or alterations to the Code of Conduct should be noted here along with the date and actioner.

    Code of Conduct

    The Preserving Immersive Media Group is dedicated to providing a harassment-free environment for everyone. We do not tolerate harassment of participants in any form.

    This code of conduct applies to all Preserving Immersive Media Group spaces, both online and offline, including mailing lists, online meetings, in-person meetings, the GitBook space and the GitHub repository. Anyone who violates this code of conduct may be sanctioned or expelled from these spaces at the discretion of the response team.

    Some Preserving Immersive Media Group spaces may have additional rules in place, which will be made clearly available to participants. Participants are responsible for knowing and abiding by these rules.

    Harassment includes:

    • Offensive comments related to gender, gender identity and expression, sexual orientation, disability, mental illness, neuro(a)typicality, physical appearance, body size, age, race, or religion.

    • Unwelcome comments regarding a person’s lifestyle choices and practices, including those related to food, health, parenting, drugs, and employment.

    • Deliberate misgendering or use of ‘dead’ or rejected names.

    • Gratuitous or off-topic sexual images or behaviour in spaces where they’re not appropriate.

    • Physical contact and simulated physical contact (e.g. textual descriptions like “hug” or “backrub”) without consent or after a request to stop.

    • Threats of violence.

    • Incitement of violence towards any individual, including encouraging a person to commit suicide or to engage in self-harm.

    • Deliberate intimidation.

    • Stalking or following.

    • Harassing photography or recording, including logging online activity for harassment purposes.

    • Sustained disruption of discussion.

    • Unwelcome sexual attention.

    • Patterns of inappropriate social contact, such as requesting/assuming inappropriate levels of intimacy with others.

    • Continued one-on-one communication after requests to cease.

    • Deliberate “outing” of any aspect of a person’s identity without their consent except as necessary to protect vulnerable people from intentional abuse.

    The Preserving Immersive Media Group prioritizes marginalized people’s safety over privileged people’s comfort. The response team reserves the right not to act on complaints regarding:

    • ‘Reverse’ -isms, including ‘reverse racism,’ ‘reverse sexism,’ and ‘cisphobia’

    • Reasonable communication of boundaries, such as “leave me alone,” “go away,” or “I’m not discussing this with you.”

    • Communicating in a ‘tone’ you don’t find congenial

    Reporting

    If you are being harassed by a member of Preserving Immersive Media Group, notice that someone else is being harassed, or have any other concerns, please contact a member of the response team. If the person who is harassing you is on the team, they will recuse themselves from handling your incident. We will respond as promptly as we can.

    This code of conduct applies to Preserving Immersive Media Group spaces, but if you are being harassed by a member of Preserving Immersive Media Group outside our spaces, we still want to know about it. We will take all good-faith reports of harassment by Preserving Immersive Media Group members seriously. This includes harassment outside our spaces and harassment that took place at any point in time. The response team reserves the right to exclude people from Preserving Immersive Media Group based on their past behavior, including behavior outside Preserving Immersive Media Group spaces and behavior towards people who are not in Preserving Immersive Media Group.

    In order to protect volunteers from abuse and burnout, we reserve the right to reject any report we believe to have been made in bad faith. Reports intended to silence legitimate criticism may be deleted without response.

    We will respect confidentiality requests for the purpose of protecting victims of abuse. At our discretion, we may publicly name a person about whom we’ve received harassment complaints, or privately warn third parties about them, if we believe that doing so will increase the safety of Preserving Immersive Media Group members or the general public. We will not name harassment victims without their affirmative consent.

    Consequences

    Participants asked to stop any harassing behavior are expected to comply immediately.

    If a participant engages in harassing behavior, the response team may take any action they deem appropriate, up to and including expulsion from all Preserving Immersive Media Group spaces and identification of the participant as a harasser to other Preserving Immersive Media Group members or the general public.

    Response Team

    Tom Ensom: tom [at] tomensom.com

    Jack McConchie: jack.mcconchie [at] tate.org.uk

    Contributing

    The Knowledge Base is a collaborative and community-driven space for sharing anything from publications, to guides or workflows, to technical notes and work-in-progress. We welcome contributions big or small!

    As a contributor, you agree to the Code of Conduct and that any content you create on this GitBook site will be shared according to our Licence terms.

    If you contribute, please do add your name to the list of contributors!

    How to Contribute

    If you are interested in contributing, you can join the team as an Editor by following this link:

    Once you have logged in and joined the PIMKB group on GitBook, you will be able to make edits to pages:

    1. You can start editing by clicking on the 'Edit' button in the top right corner. Changes you make in Edit mode will exist independently from the live version until you submit them for review.

    2. Add a brief description of the changes made during your edit by filling in the 'Describe your changes...' field at the top of the page. This will help the reviewer understand the rationale behind your changes.

    3. Once you are happy with your edit, click the 'Submit for review' button in the top right corner. Your edit will then be checked by an administrator before going live.

For more information on GitBook and how to use it, check out the GitBook docsarrow-up-right.

    Need Help Contributing?

If you'd like help or support with contributing or using the GitBook platform, or just want to chat about an idea for a contribution, please do get in touch with the site admins:

    • Tom Ensom: tom [at] tomensom.com

    • Jack McConchie: jack.mcconchie [at] tate.org.uk

  • Gratuitous or off-topic sexual images or behaviour in spaces where they’re not appropriate.

  • Physical contact and simulated physical contact (eg, textual descriptions like “hug” or “backrub”) without consent or after a request to stop.

  • Threats of violence.

  • Incitement of violence towards any individual, including encouraging a person to commit suicide or to engage in self-harm.

  • Deliberate intimidation.

  • Stalking or following.

  • Harassing photography or recording, including logging online activity for harassment purposes.

  • Sustained disruption of discussion.

  • Unwelcome sexual attention.

  • Pattern of inappropriate social contact, such as requesting/assuming inappropriate levels of intimacy with others

  • Continued one-on-one communication after requests to cease.

  • Deliberate “outing” of any aspect of a person’s identity without their consent except as necessary to protect vulnerable people from intentional abuse.

  • Publication of non-harassing private communication.

  • Criticizing racist, sexist, cissexist, or otherwise oppressive behavior or assumptions


    Further Reading

    Virtual Reality Fundamentals

    • Google. Google VR: Fundamental Concepts. URL: https://developers.google.com/vr/discover/fundamentalsarrow-up-right

    • Brown VR Software Wiki. URL: https://www.vrwiki.cs.brown.edu/arrow-up-right

    Virtual Reality Preservation Research

    • Campbell, S. (2017). A Rift in our Practices, Toward Preserving Virtual Reality [Master’s Thesis, New York University]. URL: https://miap.hosting.nyu.edu/program/student_work/2017spring/17s_thesis_Campbell_y.pdfarrow-up-right

    • Campbell, S., & Hellar, M. (n.d.). From Immersion to Acquisition: An Overview Of Virtual Reality For Time Based Media Conservators. Electronic Media Review, Six: 2019-2020. Retrieved October 7, 2021, from https://resources.culturalheritage.org/emg-review/volume-6-2019-2020/campbell/arrow-up-right

    • Cranmer, C. (2017). Preserving the emerging: virtual reality and 360-degree video, an internship research report. URL: https://www.semanticscholar.org/paper/Preserving-the-emerging%3A-virtual-reality-and-video%2C-Cranmer/1a11eea6b2a6c095f370180bf1ec153944e32e54arrow-up-right

    Initial Assessment

    Getting started with the long-term care of immersive media (XR) can be overwhelming. Multifaceted production processes, complex components and challenging access requirements can all complicate the application of established digital preservation practice. This section will help you work through an initial assessment, identifying the characteristics of the XR experience you are trying to preserve and pointing you in the direction of resources where you can learn more.

    Context

    This section considers where an XR experience has come from, including established networks of care, and the institutional frameworks which will need to be navigated.

    Production and Display History

    • How was the experience created? What conditions/context informed this process?

    • Who created and/or has cared for the experience up until now? What will their roles be as long-term care continues?

    • How has the experience been presented in the past? Can you gather documentation to support transmitting this to future stakeholders?

    Organisational

    • Can this media be collected in accordance with your organisation's preservation policy? The longevity of an XR experience is difficult to predict and there are few guarantees when it comes to long-term access.

    • What are your overall goals in terms of preservation, i.e. what 'level' of preservation do you hope to achieve? e.g.

      • Archiving digital components of the media (bit-level preservation)?

      • Archiving physical and/or hardware components of the media?

      • Maintaining long-term access to the experience and/or components?

    • What infrastructure will you need in place to achieve these goals?

    • What expertise will you need to draw upon to achieve these goals? Does your organisation have any in-house expertise you can draw upon or will you need external support?

    Experiential Characteristics

    Immersive Media Type

    Immersive media technologies can be used to create experiences that combine elements of physical and virtual environments in different ways. The following terms are frequently used to characterise types of immersive media:

    • Virtual Reality (VR) refers to experiences where a user is fully immersed in a virtual environment. All elements of the environment experienced are virtual (typically through an HMD) and the user is unable to see the physical environment they are occupying.

    • Augmented Reality (AR) refers to experiences which add virtual elements to a real-world physical environment. The environment experienced combines virtual and real-world elements, the former often being superimposed onto a camera feed of the latter.

    • Mixed Reality (MR) combines elements of VR and AR.

    Together, VR, AR and related terms are sometimes referred to using the umbrella label XR.

    Interactivity

    The extent to which the user can interact with an XR experience can vary in several ways. One is the extent to which the user has control over their spatial position in the virtual environment during the experience:

    • In a fixed-position experience, the user views the environment from a fixed position. Only rotational tracking, known as three degrees of freedom or 3DoF, is used in these experiences.

    • In an on-rails experience, the user moves through the environment along a predetermined path. Only rotational tracking, known as three degrees of freedom or 3DoF, is used in these experiences.

    • In a fully interactive experience, the user can change position within the environment freely. Rotational and positional tracking, known as six degrees of freedom or 6DoF, are used in these experiences.

    Elements of the virtual environment may also be interactive. This is dependent on the way in which the VR content has been produced. 360 video content (see Technical Characteristics) is typically not interactive, as the moving image sequence was predetermined when the video was produced. Real-time 3D software content (see Technical Characteristics) is more likely to have interactive elements as the moving image sequence is generated on-the-fly and so can respond to user input.
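The tracking distinction above can be sketched in code: a 3DoF pose carries orientation only, while a 6DoF pose adds position. This is a hypothetical illustration of the concept, not any real XR API:

```python
from dataclasses import dataclass

@dataclass
class Pose3DoF:
    """Rotational tracking only: yaw, pitch and roll, in degrees."""
    yaw: float = 0.0
    pitch: float = 0.0
    roll: float = 0.0

@dataclass
class Pose6DoF(Pose3DoF):
    """Adds positional tracking: where the user is in the play space, in metres."""
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

# A fixed-position or on-rails (3DoF) experience can only respond to head rotation...
looking_left = Pose3DoF(yaw=90.0)
# ...while a fully interactive (6DoF) experience also responds to user movement.
stepping_forward = Pose6DoF(yaw=90.0, z=-0.5)
```

Note that every 6DoF pose contains a valid 3DoF pose, which mirrors why 6DoF-capable hardware can usually present 3DoF content, but not the reverse.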

    Technical Characteristics

    Identifying Key Components

    Technical characteristics depend on the kinds of components the experience makes use of. Immersive media content is the stored digital material that is created to be experienced through XR hardware. This generally fits into one of two categories, which significantly impact the approach taken to preservation. Identify what type of immersive media content you are working with and consult the relevant subsections of the Knowledge Base:

    • 3D Software is immersive media content produced or stored as real-time 3D software (see Introduction to 3D Software). This is typically produced using game engines.

    • 360 Video is immersive media content produced or stored as video (see Introduction to 360 Video).

    Then consider what hardware or other physical components are needed to provide access to this content:

    • XR Hardware is the physical computer and human-interface devices required to access the immersive media content (see Introduction to XR Hardware).

    • Physical Components are other (non-electronic) physical components that form part of the immersive media experience (see Introduction to XR Hardware). This might include props, flooring, seating or other environment characteristics.

    Condition Assessment

    Having identified the components, consider their condition and the potential implications for long-term care. This can help identify potential vulnerabilities, inform preventative actions and highlight immediate treatment needs. The following questions may help guide this process:

    • If hardware has been received, is this functional? If not, what needs to happen to make it functional?

    • Are any of the components used obsolete (i.e. neither sold nor supported anymore)? If so, are these components integral to the work and can they be repaired?

    • Given responses to the above questions, would it be worth intervening now to stabilise the condition of the components?

    Contributors

    Thank you to all those who have generously given their time and knowledge to help shape the Knowledge Base:

    • sasha ardenarrow-up-right

    • Rasa Bocyte

    • Tom Ensomarrow-up-right

    • Eric Kaltmanarrow-up-right

    • Jack McConchie

    • Claudia Roeck

    • Samantha Rowe

    • Jesse de Vos

    • WinZs

    If you have contributed to the Knowledge Base, please do add your name here (and optionally, a hyperlink to an appropriate URL) and submit this with your edit.

  • Ensom, T., & McConchie, J. (2021). Preserving Virtual Reality Artworks. Tate. https://doi.org/10.5281/zenodo.5274102arrow-up-right

  • LIMA (2021). A Practical Research into Preservation Strategies for VR artworks on the basis of Justin Zijlstra’s 100 Jaar Vrouwenkiesrecht. URL: https://www.li-ma.nl/lima/article/preserving-vr-artworksarrow-up-right


    Consider what is needed to make the XR experience accessible to different audiences.

    • What is needed to make the experience comfortable and safe for users? e.g. invigilation



    OpenXR

    OpenXRarrow-up-right is an open, royalty-free standard for APIs that provide XR applications with access to XR platforms and devices. This is implemented in the XR runtimearrow-up-right software supplied by the manufacturer of XR hardware. Application support for OpenXR is potentially useful for preservation purposes: because it is an open standard, software built against it is less tied to any single manufacturer's API, which should make it easier to keep that software usable on future XR runtimes and hardware.

    OpenXR is developed by a working group managed by the Khronos Group consortium, who describe it as follows:

    OpenXR is an API (Application Programming Interface) for XR applications. XR refers to a continuum of real-and-virtual combined environments generated by computers through human-machine interaction and is inclusive of the technologies associated with virtual reality (VR), augmented reality (AR) and mixed reality (MR). OpenXR is the interface between an application and an in-process or out-of-process "XR runtime system", or just "runtime" hereafter. The runtime may handle such functionality as frame composition, peripheral management, and raw tracking information.

    Optionally, a runtime may support device layer plugins which allow access to a variety of hardware across a commonly defined interface.

    — https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.htmlarrow-up-right

    Up until the arrival of OpenXR, support for each manufacturer's API would have to be built into an XR application for the two to work together. OpenXR attempts to solve this problem of compatibility between XR applications and XR hardware. Image source: https://www.khronos.org/openxr/arrow-up-right.
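The compatibility problem described above can be sketched schematically: rather than coding against each manufacturer's API, an application targets one shared interface that every compliant runtime implements. This is a hypothetical Python sketch of the idea only; the real OpenXR API is a C API, and the class and method names here are invented:

```python
from abc import ABC, abstractmethod

class XRRuntime(ABC):
    """Stand-in for the common interface an OpenXR-style standard defines."""

    @abstractmethod
    def begin_frame(self) -> str:
        """Each vendor's runtime supplies its own frame composition."""

class OculusRuntime(XRRuntime):
    def begin_frame(self) -> str:
        return "frame composed by Oculus runtime"

class SteamVRRuntime(XRRuntime):
    def begin_frame(self) -> str:
        return "frame composed by SteamVR runtime"

def render(runtime: XRRuntime) -> str:
    # The application only knows about the shared interface, so any
    # compliant runtime (present or future) can be swapped in unchanged.
    return runtime.begin_frame()

print(render(OculusRuntime()))
print(render(SteamVRRuntime()))
```

For preservation, the value of this pattern is that an archived application written against the standard interface does not need modification when the runtime beneath it is replaced.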

    Using OpenXR

    In order to make use of OpenXR, you need to:

    1. Develop software which supports it — see Engine Implementations below.

    2. Make use of an XR platform which supports it — see XR Runtime Implementations below.

    XR Runtime Implementations

    The Khronos Group reportarrow-up-right states that the following XR runtimes/platforms are compliant with OpenXR:

    • Acer: Spatial Display Seriesarrow-up-right

    • ByteDance: Neo3 and PICO4arrow-up-right

    • Canon: MREAL Platform Displays and Headsetsarrow-up-right

    Engine Implementations

    Documentation

    Documentation provides an important resource, accompanying, or even standing in for, an immersive media (XR) experience as it ages. As a starting point for documenting an XR experience, the following prompts may be useful:

    • Recording key information gained during initial assessment of the characteristics of an XR experience e.g. using the templates on this page.

    • Creating video documentation of XR content using video recording tools.

      • What is the user experience like?

    • Creating or recording personal narratives/accounts of works (i.e. oral histories).

      • From whose perspectives are these recorded? e.g.

        • User/viewer

        • Artist/creator and their technical/development collaborators

      • Professional/neutral or personal tone?

    • Could your documentation become 'documentation as preservation'?

      • Documentation created while the experience is accessible can become a useful preservation pathway later, by transmitting some understanding of the original experience.

      • How do you measure the success of your documentation? e.g. completeness, multiplicity, accuracy...

    • Consider outside-in and inside-out perspectives on documentation - see sasha arden's paper referenced below for more info:

      • "An outside-in approach takes the technical characteristics of hardware and software as its starting point, where the material properties determine how content is rendered and experienced."

      • "By contrast, an inside-out approach is user-centered, where experience of the content reveals significant properties that guide decisions as to how content is rendered."

    Templates

    Acquisition Information Templates

    These documents were designed to guide information gathering and discussion during the early stages of the acquisition of virtual reality (VR) or augmented reality (AR) artworks, primarily with conservation and long-term preservation in mind. They are designed to be completed by or in close collaboration with an artist prior to receiving media from the artist.

    AR Templates

    VR Templates

    • Version 1arrow-up-right (March 2019) developed by Jack McConchie and Tom Ensom during Tate's Preserving Immersive Media project:

    • Version 2arrow-up-right (May 2019) developed as an extension of this template by Savannah Campbell and Mark Hellar:

    Publications

    Augmenting Our Approach to Preservation: Documentation of Experience for Immersive Media

    Written by sasha arden during their 2021–22 graduate internship at Tate, this paper considers the experiential dimension of immersive media (IM) and its significant role in preservation and future access.

    PDF available via Tate website: https://www.tate.org.uk/documents/1793/Augmenting_Our_Approach_to_Preservation.pdfarrow-up-right

    Documentation Infographic for artists and makers

    The infographic was created by Lieve Baetens in 2023, and provides artists and makers with information about the importance of documenting their work. The infographic answers questions about why it is important for artists and makers to document their work and how they can document their own work.

    Repair Guides

    Extending the service life of hardware through repair could support preservation efforts if a particular XR system is required. These hardware systems are meant to be used, and in an exhibition context they can be subjected to serious wear and tear.

    The consumer market for XR hardware systems and mobile phones is built on planned obsolescence; products are superseded by new models, manufacturers only provide limited warranty periods for repair or replacement, and parts are not commonly available. Purchasing backup equipment or sourcing equipment to be used for parts may be part of a preservation plan.

    iFixit is a US-based company that advocates for repairability of consumer devices. The Preserving Immersive Media Group has no affiliation with iFixit. The guides linked here provide step-by-step instructions with photos, as well as tools and in some cases replacement parts. The teardown reviews are another resource should you have equipment that is intended to be used for parts. Popular XR systems and Samsung Gear VR are linked here, but many other device guides can be found on iFixit.

    Valve Index:

    Meta/Oculus Devices:

    HTC Vive Devices:

    Samsung Gear VR:

    Attachments referenced above:

    • Template_Pre-Acquisition Questionnaire for Augmented Reality (AR) App-Based Artworks_Rowe_Version 1.1_2022.pdf (176KB)

    • Template_Pre-Acquisition Questionnaire for Augmented Reality (AR) App-Based Artworks_Rowe_Version 1.1_2022.docx (42KB)

    • tate_pim_vracqtemplate_01_00.docx (99KB)

    • Virtual Reality Artwork Acquisition Template.docx (12KB)

    • Infographic Documentation to the Rescue.pdf (70KB)

    • Text Infographic Documentation to the Rescue.docx (22KB)

    Preservation Strategies

    Computer hardware is difficult to preserve in the very long-term as its maintenance is dependent on access to suitable components and specialist knowledge, both of which can be lost with time. XR hardware provides a particularly challenging case study given its complexity. However, repair guides may be able to help maintain it in the near term.

    • Collabora: Monado open-source OpenXR runtimearrow-up-right
  • HTC: Vive Focus 3, Vive Cosmos & Vive Wavearrow-up-right

  • Magic Leap: Magic Leap 2arrow-up-right

  • Meta: Quest 3, Quest Pro, Quest 2, Quest and Rift S and Meta XR Simulatorarrow-up-right

  • Microsoft: Hololens and Mixed Reality Headsetsarrow-up-right

  • Qualcomm: Snapdragon Spacesarrow-up-right

  • Sony: Spatial Reality Displays (ELF-SR1 & ELF-SR2)arrow-up-right

  • SteamVR: All supported headsetsarrow-up-right

  • Varjo: All Varjo headsetsarrow-up-right

  • Unreal Engine 4: OpenXR supported in 4.27 (via plugin) and 4.24+ (via beta plugin). Supported runtimes: Windows Mixed Reality; Oculus (via Oculus OpenXR plugin); SteamVR (via SteamVR Beta opt-in).

  • Unity: OpenXR supported in 2020.2+ (via plugin). Supported runtimes: Windows Mixed Reality; HoloLens 2.


    XR Systems

    Introduction to Modern XR Systems

    Modern XR systems are unified sets of hardware, with supporting software, which are used to provide end users with the capability of accessing immersive media content like XR-enabled 3D software and 360 video. Common components of XR systems include:

    • Head-mounted display (HMD), which is placed over the user's head, providing the screens which create the visual illusion of being in a virtual environment.

    • Human input device (HID), which allows the user to interact with elements of the virtual experience (e.g. by pressing buttons or moving a joystick).

    • Tracking system, which translates user movement in the physical environment into movement in the virtual environment using camera and/or sensor data. In practice this is often integrated, at least partially, into the HMD and/or HID devices.

    • Computer system to run the XR software, which can be integrated into the headset (e.g. in the Oculus Quest series) or be a separate computer (for tethered headsets like the Oculus Rift S or Valve Index).

    • Supporting software, which provides interfaces between these hardware components and XR software.

    Head-Mounted Displays

    The simplest type of XR system uses a handheld mobile phone mount with lenses that help to focus on the screen content and create the illusion of an immersive view. Google Cardboard is an example of this type. The next step up in features and complexity is a more sophisticated head-mounted display (HMD) that is hands-free and provides a better illusion of immersion.

    Further interactivity and immersion is achieved with tracking systems, which provide input for the XR content and/or information about the user's position in space. Tracking systems can be built into the HMD (e.g. Oculus Quest series) or hardware incorporating sensors is used along with an HMD (e.g. Oculus Rift or HTC Vive).

    XR systems can also be divided into standalone and tethered types. Standalone HMDs (e.g. Oculus Quest, HTC Vive Focus) offer greater freedom of motion and are less expensive overall, while tethered systems (e.g. Oculus Rift S, HTC Vive Pro) use a connected computer to offer better performance in terms of graphics and location processing.

    The commercial marketplace has strongly influenced the development of XR systems since the early 2010s, when a "second wave" of immersive technologies started. Most companies produce tiered feature sets at varying price points. Releases of product models might follow incremental improvements in technology, resulting in very similar systems with only one or two differences between them, or releases could represent a big jump in technology or hardware design from year to year. It can be challenging to identify defining features and technical specifications for a given XR system because of advertising styles (many hard-to-prove claims are made) and the proprietary, competitive nature of development.

    Comparison of XR HMDs

    A head-mounted display (HMD) is a display device, worn on the head or as part of a helmet, that mounts either a single screen in front of both of the user's eyes (monocular HMD) or a separate screen in front of each eye (binocular HMD). An extensive summary of the properties of VR HMDs is available on the XinRealityarrow-up-right wiki or on Wikipediaarrow-up-right. HMDs are generally either "tethered" (such as Oculus Rift) to a PC that monitors tracking and undertakes the rendering, or "untethered" (such as Oculus Quest) where the unit is standalone.

    The table below is meant to gather information for the purpose of identifying the features of popular XR systems. It could be useful for considering compatibility in the case of a potential hardware replacement or migration. For example, if the XR content uses 6DOF interactivity, an XR system that is only capable of supporting 3DOF would not be suitable.
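The suitability rule just described reduces to a simple subset check: a candidate system is suitable only if it supports every capability the content requires. A hypothetical helper for illustration (the function and capability names are invented):

```python
def is_suitable_replacement(content_requires: set[str], system_supports: set[str]) -> bool:
    """True if the candidate XR system covers everything the content needs."""
    return content_requires <= system_supports  # subset test

# Content built for 6DOF interactivity needs positional tracking...
content = {"rotational_tracking", "positional_tracking"}

# ...so a 3DOF-only system is ruled out, while a 6DOF system passes.
print(is_suitable_replacement(content, {"rotational_tracking"}))
print(is_suitable_replacement(content, {"rotational_tracking", "positional_tracking"}))
```

The same check extends naturally to other rows of a comparison table (supported runtimes, refresh rate tiers, controller types) by adding entries to the capability sets.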

    HMD-Runtime Compatibility

    Supporting Software

    Supporting software is required to connect XR software (e.g. a VR application or 360 video player) to XR hardware. This is typically a software package provided by the manufacturer of an XR hardware system. This may include a user-facing application with a storefront for purchasing content (e.g. Steam or Oculus) but also includes back-end software components which serve a technical role.

    The most important component of this package from a preservation perspective is an XR runtime, which provides an API and driver to support an interface between XR applications and XR devices. It can also provide features that modify the behaviour of XR applications (e.g. improving performance or image quality). Examples include the Oculus runtime and the SteamVR runtime. An XR runtime can provide XR applications with a variety of interfaces with which to interact with it, which might conform to a standard such as OpenXR or OpenVR.


    Introduction

    3D software involves the use of software to dynamically generate a moving image sequence from 3D data and code. The real-time nature of the process means that the moving image sequence can be dynamic and responsive to user interaction. This can be contrasted with video, where frames are encoded and played back linearly. In practice, 3D software technologies used in XR production are often very similar to those involved in the creation of video games e.g. game engines, 3D modelling, texturing.

    Existing software frameworks and tools are typically used as a starting point for development of 3D software, from which a distributable form of the software is generated. Development for desktop or mobile applications typically uses a game engine. This can result in builds supporting different platforms e.g. desktop, mobile, web. Development for the web may involve the use of web frameworks like Three.jsarrow-up-right and A-Framearrow-up-right.


    • Oculus Runtime. OpenXR support: yes. OS support: Android; Windows 10/11. Download: current version only downloadable through the Oculus client; legacy versions available on https://developer.oculus.com/downloads/package/oculus-runtime-for-windows/arrow-up-right.

    • SteamVR. OpenXR support: yes. OS support: Windows 7 (SP1); Windows 8.1; Windows 10/11. Download: can only be downloaded through Steam (see Archiving XR Runtimes).

    • Monado. OpenXR support: yes. OS support: Android; Linux; Windows 10/11. Download: source code available from GitLabarrow-up-right; build packages available for Debian and Ubuntu only.
    Assessing 3D Software
    • What tools were used in its creation and why were these chosen? e.g. game engines, development frameworks, version control

    • If a game engine was used:

      • Is the specific engine version important?

      • Was the engine modified in any way? e.g. plugins, rebuilt from source

      • Is the game engine (and any external dependencies required to run it) still accessible and supported by the developer?

      • Can executable software still be built from the engine project in a contemporary computing environment? This may take some effort to achieve but is a very valuable learning experience and indicates you have everything you need to maintain the software.

    • What kinds of assets were used? e.g. custom made, third-party licenced

      • What tools were used to create the assets? e.g. 3D modelling, texturing, photogrammetry, animation/rigging

    • How were any audio elements designed? Is sound dynamically triggered or from linear playback source?

    • How compatible is the software with contemporary XR platforms? This will depend on how support has been built into the software during creation and how it has been distributed/displayed in the past e.g.

      • Was the software developed to run on a specific computing platform? e.g. Windows PC, Mac OS, Android, web platforms (A-Frame, Three.js) etc.

      • Was the software developed to run with a specific XR hardware platform? e.g. Oculus, SteamVR etc.

    • Are source materials available and how complex would these be to meaningfully preserve?

      • What tools are needed to build the project from source materials as a standalone app/executable/webpage?

      • Could source materials be modified to update the software?

    • Has documentation of the experience been supplied? If not, can it be created? e.g. fixed-perspective video capture, 360-degree video capture, installation photographs/video

    Acquisition Checklist

    The following measures can help support long-term access to 3D software and prepare for future preservation interventions:

    • Archive a copy of the executable software (ideally supporting as many operating systems and XR runtimes as possible).

    • Identify, gather and test the dependencies required to access the executable software (e.g. operating systems, libraries, XR runtime, drivers).

    • Archive a copy of the source materials (be it code or an engine project) and dependencies required to build the project — see Software Archiving Guides.

    • Identify, gather and test dependencies required to access source materials.

    • Source or create documentation of the software running.
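Bit-level archiving of the executable, source materials and dependencies (the measures above) is easier to verify years later if a checksum manifest is recorded at acquisition time. A minimal sketch using only the Python standard library; the function name and example path are hypothetical:

```python
import hashlib
from pathlib import Path

def checksum_manifest(root: Path) -> dict[str, str]:
    """Map each file under root to its SHA-256 hash, for later fixity checking."""
    manifest = {}
    for path in sorted(root.rglob("*")):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            manifest[str(path.relative_to(root))] = digest
    return manifest

# e.g. checksum_manifest(Path("archive/artwork_build_v1"))
# Re-running the function later and comparing the two dictionaries
# detects any file that has changed, appeared or disappeared.
```

In practice an institutional repository would normally handle fixity, but a manifest like this is useful for material held outside formal systems (e.g. working copies of engine projects).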

    Further Reading

    US Library of Congress Recommended Formats Statement: Software and Video Gamesarrow-up-right

    Preserving 3D. Data Type Series. Artefactual Systems and the Digital Preservation Coalition. July 2021arrow-up-right


    • Oculus Rift CV1. Supported runtimes: Oculus Runtime; SteamVR

    • HTC Vive. Supported runtimes: SteamVR; Oculus Runtime (via Revivearrow-up-right)


    • Does it make use of any external resources? e.g. additional software (e.g. Max); resources accessed via the internet etc.


    Archiving an XR Runtime

    An XR runtime is a piece of software that supports communication between XR software and hardware. This page describes the process of extracting an XR runtime as a contained unit of software, so that it can be archived independently of a computer system.

    XR runtimes are often distributed using front-end tools that carry out downloading and installation in the background (e.g. SteamVR is downloaded through Steam). This makes it harder to extract and archive them for reuse in the future.

    hashtag
    Oculus (Windows)

    Legacy Oculus Runtime 0.8.0 (dating from 2017) is currently available to download from the Oculus website: https://developer.oculus.com/downloads/package/oculus-runtime-for-windows/arrow-up-right.

    For later versions you are dependent on the Oculus client to install and manage the runtime. After installation, the runtime can be found in C:\Program Files\Oculus. Testing is needed to ascertain whether this directory can be copied to a new machine.

    SteamVR (Windows)

    These instructions are adapted from Valve's documentation of SteamVR. While this method allows you to extract a standalone copy of the SteamVR runtime, the version available through Steam cannot be controlled.

    1. Either a) Install and open the Steam client on a PC with full internet access or b) Access a computer which already has the appropriate version of Steam installed.

    2. In the Steam Client, open the Library section and find the part of it labeled "Tools".

    3. Find the entry "SteamVR" and install it.

    4. Right-click on the entry "SteamVR" and in the resulting popup menu click on the entry "Properties".

    5. A new window with multiple tabs will open. Select the tab "LOCAL FILES" and click on the button labeled "BROWSE LOCAL FILES".

    6. The directory containing the SteamVR Runtime will open. Copying this entire directory will capture the files required to run SteamVR on another computer. From this directory, SteamVR can be launched by running the "vrstartup.exe" executable file in "\SteamVR\bin\win64".
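The copy in step 6 can be scripted. A sketch using the Python standard library; the function name and example paths are hypothetical, and the check on vrstartup.exe is only a sanity check that the copy looks complete:

```python
import shutil
from pathlib import Path

def archive_steamvr(install_dir: Path, archive_dir: Path) -> Path:
    """Copy the SteamVR runtime directory for independent archiving."""
    copied = Path(shutil.copytree(install_dir, archive_dir))
    # vrstartup.exe is the entry point used to launch the runtime directly
    # (see step 6 above), so its absence suggests an incomplete copy.
    launcher = copied / "bin" / "win64" / "vrstartup.exe"
    if not launcher.exists():
        raise FileNotFoundError("copy looks incomplete: vrstartup.exe missing")
    return copied

# Typical usage on Windows (paths will vary by machine and Steam library):
# archive_steamvr(Path(r"C:\Program Files (x86)\Steam\steamapps\common\SteamVR"),
#                 Path(r"D:\archive\SteamVR_copy"))
```

Recording the SteamVR version number alongside the copy is worthwhile, since (as noted above) the version offered through Steam cannot be controlled.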

    Unity

    Unity is a game engine that was initially released in 2005. Released with the aim of democratizing video game development, it was intended to be an affordable engine aimed primarily at independent and amateur game developers. Unity can create both 2D and 3D applications. The engine typically has a major version update every year with multiple smaller updates throughout the year. Unity is free for developers that bring in under $100,000 in revenue in a 12-month period. After that, there are 2 different pricing models. Though originally designed to build games, today Unity is used across multiple industries including automotive, architecture, engineering, construction, film, education, and retail.

    This page is designed to help you understand what Unity is and how it can be used by creators.

    Screenshot of the Unity Editor

    hashtag
    Anatomy of a Unity Project Folder

    When a new project is created, Unity creates several folders as a part of the project. Over the years and versions of Unity, this default folder structure has changed somewhat, but it now generally contains the following folders:

    hashtag
    Assets

    This is the folder where all game resources are stored, including scripts, textures, sound, and custom editor scripts. The organization of the folder can vary greatly from one project to another and from one organization to another. There can be numerous subfolders in the Assets folder, depending on how the project is organized. For example, there may be a folder for scenes, one for scripts, one for audio, or one for sprites. There is no limit to how deep the organization can be.

    As a best practice, subfolders in the Assets folder should be created within Unity and not the computer's local file system. Likewise, assets should be added directly to the folders in the Unity Editor and not the file explorer as seen above. This is particularly important due to the way Unity constructs metadata for assets.
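Because every visible asset (and subfolder) under Assets should have a .meta sidecar, a simple integrity check can flag entries whose metadata is missing, for example after files were copied in outside the Editor. A minimal sketch: the function name is ours, and it assumes Unity's default sidecar naming of `<asset>.meta`.

```python
from pathlib import Path

def missing_meta_files(assets_dir):
    """Return asset files/folders under Assets/ that lack the .meta
    sidecar Unity generates (named '<asset>.meta'). Hidden entries
    (dot-prefixed), which Unity ignores, are skipped."""
    missing = []
    for entry in Path(assets_dir).rglob("*"):
        if entry.suffix == ".meta" or entry.name.startswith("."):
            continue
        if not Path(str(entry) + ".meta").exists():
            missing.append(entry)
    return missing
```

A non-empty result suggests the project was handled outside the Editor and may not import cleanly.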

    Unity reserves several special folder names under the Assets folder. These folders are Editor, Editor Default Resources, Gizmos, Resources, Standard Assets, and Streaming Assets. Not every project will have all of these folders.

    Assets/Editor — This folder is for custom editor scripts that extend the functionality of the Unity editor. These scripts will run in the editor but not in the project at runtime. Multiple Editor folders can exist in the Assets Folder. The execution of the editor scripts varies depending on where in the folder structure the editor file exists.

    Assets/Editor Default Resources — This is where asset files used by Editor scripts are stored. There can only be one such folder and it must be placed in the Assets folder root. There can be subfolders in this folder.

    Assets/Gizmos — Gizmos are graphics in the scene view which can help to visualize design details. This folder stores images used for gizmos. There can only be one such folder and it must be placed in the Assets folder root. Gizmo examples are shown below, marked with red squares: the one on the left shows the main camera’s position and rotation, and the one on the right represents a light source. These two are built-in Unity gizmos:

    Assets/Resources — This folder stores resources so that they can be loaded on demand in a Unity project. There can be multiple resource folders. On demand loading is helpful for dynamically loading game objects that don’t have instances created by designers during design time. In other words, these resources may not have corresponding game objects placed in the scene at design time and can be loaded dynamically at run time.

    Assets/Standard Assets — This folder stores any standard asset packages that have been imported into a project. There can be only one standard assets folder. Standard assets are free assets maintained by Unity.

    Assets/Streaming Assets — This folder is for assets that will remain in their original format and later be streamed into the Unity application, instead of being directly incorporated into the project’s build. An example would be a video file from the filesystem. There can only be one Streaming Assets folder in the project.

    hashtag
    Library

    Moving on from the Assets folder, the next folder is Library. This is a local cache used by Unity for imported assets, which saves Unity from reprocessing them every time the project runs. If this folder is deleted, Unity will reimport all assets and regenerate it automatically the next time the project is opened in the editor; all that is needed to recreate it are the source assets and .meta files. This folder should not be included in version control.

    Of special note in the Library folder is the package cache folder. This contains information about all the packages installed with the current project. Though this can be regenerated by Unity like the other items in the Library folder, for archival purposes it is important that this folder not be deleted: it makes it possible to see which packages are included in the project without having to regenerate the cache, which would require opening the project in its appropriate editor version.

    hashtag
    Packages

    This folder contains the manifest file in JSON format used to maintain the dependencies link between packages. The manifest file is used to regenerate the package cache in the library folder. It also contains a file listing the individual packages installed with the project. These are used by the Unity Package Manager. The package manager was added in Unity 2018.1. Prior versions of Unity will not contain the package manager and the packages folder will not exist in those cases.
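The manifest can be read directly to list a project's declared package dependencies without opening the Editor. A sketch, assuming the standard Packages/manifest.json layout with a top-level "dependencies" object (real manifests may also carry registry entries and other keys):

```python
import json
from pathlib import Path

def list_package_dependencies(project_dir):
    """Return the package-name -> version mapping declared in
    Packages/manifest.json (present from Unity 2018.1 onwards)."""
    manifest = Path(project_dir) / "Packages" / "manifest.json"
    with open(manifest, encoding="utf-8") as f:
        return json.load(f).get("dependencies", {})
```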

    hashtag
    Project Settings

    This folder contains all project settings from the Project Settings menu, along with the editor version number, build settings, and a range of other settings used by Unity systems. The editor version number, as a standalone file, was not added until Unity 5; for any version before that, it can be found in the project settings file.

    hashtag
    Preservation Considerations

    hashtag
    Access and Licencing

    hashtag
    Export Formats

    hashtag
    Editor Dependencies

    hashtag
    Build Dependencies

    Game Engines

    Game engines are software development tools for creating interactive software. They package together libraries and software which simplify the development of interactive software. Game engines are a widely used tool in the creation of real-time 3D VR software, and many engines support VR production workflows out-of-the-box.

    A modern game engine will typically include:

    • A 3D or 2D renderer, which supports the rendering of a moving image sequence in real-time.

    Unreal Engine 4

    Unreal Engine 4 (UE4) is a game engine developed by Epic Games. The first version of Unreal Engine was created during the development of the 1998 game Unreal, at which point Epic Games started licencing the engine to other developers. Version 4.0 was released in 2014, and has been followed by 27 subversions (the latest is 4.27). Unreal Engine 5, announced in 2020, is expected to supersede UE4 at some point in the near future. UE4 is not an open-source engine (see the EULA), but the engine source code is freely available via their GitHub repository.

    circle-exclamation

    Unreal Engine 4 has now been superseded by Unreal Engine 5 and will no longer receive updates. It is possible to migrate projects from version 4 to 5, but manual migration of some elements may be required. See Epic's migration guidearrow-up-right for more information.

    This page is designed to help you understand what Unreal Engine 4 is and how it can be used by creators.

  • Editor for compositing and managing scenes and asset import.
  • Physics simulation.

  • Scripting and programming tools to support dynamic, simulated and interactive elements.

  • Sound processing.

  • Extension through plugins and/or custom code.

    There are numerous engines in use today. For real-time 3D rendering applications such as VR, Unity and Unreal Engine are currently the two most popular. Both are free to download and use non-commercially, which has contributed to their popularity.

    An important implication of using an engine is that much programming has already happened before work on a project begins. The engine provides a toolset that can be used to realise the project and generate executable software, but as a creator you do not necessarily have full control or authorship of the code.

    hashtag
    List of Engines

    Name
    Platforms
    Download

    Godot

    Android; Linux; MacOS; Windows; WebEditor

    Binaries and source code available on GitHubarrow-up-right and in the download sectionarrow-up-right

    Source / Source 2

    Windows

    Binaries distributed through Steamarrow-up-right. Source code for 2013 edition available on GitHubarrow-up-right

    Unity

    Windows; MacOS (Intel & M1); Linux (Ubuntu and CentOS)

    Unity Download Archive: https://unity3d.com/get-unity/download/archivearrow-up-right

    Unreal Engine 4 / Unreal Engine 5

    Windows; Linux (from source)

    Binaries distributed through the Epic Games Launcherarrow-up-right. Source code available on restricted access GitHubarrow-up-right repository.

    hashtag
    Resources

    Adrian Courrèges (2020) Graphics Studies Compilation. URL: http://www.adriancourreges.com/blog/2020/12/29/graphics-studies-compilation/arrow-up-right.

    baldurk (n.d.). Graphics in Plain Language: An introduction to how modern graphics work in video games. URL: https://renderdoc.org/blog/Graphics-in-Plain-Language/Part-1.htmlarrow-up-right.

    Brown University VR Software Wiki. URL: https://sites.google.com/view/brown-vr-sw-review-2018/homearrow-up-right

    https://fgiesen.wordpress.com/2011/07/09/a-trip-through-the-graphics-pipeline-2011-index/arrow-up-right

    A simple 3D scene created in the Unreal Engine 4.27 editor.

    hashtag
    Anatomy of a UE4 Project

    A UE4 project consists of a collection of files and folders conforming to a well-defined structure. A project folder typically contains the following at the top level (more detail in the UE4 docsarrow-up-right):

    • Binaries: If the project has been compiled for a specific platform, this contains the files produced.

    • Build: Contains files required to build the project for different platforms.

    • Config: Contains project settings stored in plain text files with the .ini extension.

    • Content: Contains the maps, assets (e.g. 3D models, materials, blueprints) and other custom content used by the project, including any third-party packages downloaded from the Unreal Marketplace.

    • DerivedDataCache and Intermediate: Contain temporary files generated during the build process (e.g. shader compilation).

    • Saved: Contains saved data created by the editor as it runs, including autosaves, crash reports and logs.

    • A .uproject file: The project file with which the project can be launched. This is actually a JSON file containing structured information describing the project, including the UE4 version, enabled plugins and target platforms.
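Since the .uproject file is JSON, basic project facts can be extracted without launching the editor. A sketch: the function name is ours, and the key names (EngineAssociation, Plugins) follow the .uproject layout, though older projects may omit either key.

```python
import json
from pathlib import Path

def describe_uproject(project_dir):
    """Summarise the first .uproject file found at the top level of a
    UE4 project folder: engine version association and plugin names."""
    uproject = next(Path(project_dir).glob("*.uproject"))
    data = json.loads(uproject.read_text(encoding="utf-8"))
    return {
        "engine": data.get("EngineAssociation"),
        "plugins": [p.get("Name") for p in data.get("Plugins", [])],
    }
```

Recording this summary in archival metadata saves opening the project later just to identify the required editor version.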

    hashtag
    Anatomy of a UE4 Build

    A UE4 build consists of a set of files and folders that allow the software to be run on a suitable host computer.

    Some elements depend on whether the Development or Shipping configuration is selected prior to building, as well as on other UE4 packaging options.

    hashtag
    Preservation Considerations

    hashtag
    Overview

    • Has been updated an average of 4 times per year since first release in 2014. Updates can result in deprecation or removal of features, or incompatibility with plugins. Unreal Engine 5 was released in 2022, which may mean an end to UE4 updates.

    • Free source code access via GitHub repository (note that the Unreal Engine EULAarrow-up-right applies).

    • Royalty payments only required with high product revenues, so unlikely to impact cultural heritage context.

    • For custom C++ projects Visual Studio dependencies are difficult to manage from an archival perspective.

    • Transforms imported assets to the poorly documented internal format (.uasset). More work is required to understand whether this transformation is lossless.

    hashtag
    Access and Licencing

    Criteria
    Assessment

    Source code access

    Public source code access although access to private GitHub repository needs to be requested. Not open source per se, released under the Unreal Engine EULA. Seems accommodating to preservation use case, but use must abide by EULA (e.g. no redistribution of engine source code).

    Licencing

    Users pay a 5% royalty to Epic Games if product revenues exceed $1,000,000 USD.

    Availability of old versions

    Oldest version available is 4.0.2 (released 28 March 2014) via Epic Games Launcher or GitHub.

    hashtag
    Export Formats

    When an asset is imported to Unreal Engine 4 it is converted to the UE4 UASSET (.uasset) format. This format is not well documented, although there is some partial reverse engineering work herearrow-up-right. A UASSET can be re-exported from the engine in a variety of formats depending on the asset type. To do so, you need to right click on the asset in the Content Browser, and navigate to Asset Actions -> Export.

    The information below was derived from testing in Unreal Engine 4.27. Note that this is simply a list of the available export formats; exports have not been tested against the original import format.

    hashtag
    3D Models (Static) Export Formats

    Format
    Includes Material?
    Notes

    FBX

    Yes

    Can export to 2011, 2012, 2013, 2014, 2016, 2018, 2019, 2020 versions of the FBX spec.

    OBJ

    No

    Unreal object text .copy

    No

    Identical to .t3d

    hashtag
    Game Dependencies

    Windows applications created using the UE4 editor have a set of dependencies similar to the editor's. These are automatically packaged with an application when it is built from the editor, and are installed by an installer program called UE4PrereqSetup_x64.exe when the application is run. This can be located in the application directory: [UE4GameRootLocation]\Engine\Extras\Redist\en-us

    hashtag
    Editor Dependencies

    These dependencies are required to run the editor, or applications created using the editor.

    Platform
    Dependencies

    Windows

    The Epic Games Launcher automatically runs a dependency installer when UE4 is installed, called 'UE4PrereqSetup_x64.exe' or 'UE4PrereqSetup_x86.exe'. Manually running these is required if a different method of installation is used (e.g. copying the engine binaries onto the system). These can be located within the Engine directory: [UE4EditorRootLocation]\Engine\Extras\Redist\en-us
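When verifying an archived engine (or packaged game) copy, it can be useful to confirm the prerequisite installers are present under Engine\Extras\Redist. A sketch with a hypothetical function name; the directory layout follows the paths quoted above.

```python
from pathlib import Path

def find_prereq_installers(root):
    """List UE4 prerequisite installer names found under
    <root>/Engine/Extras/Redist (any locale subfolder)."""
    redist = Path(root) / "Engine" / "Extras" / "Redist"
    if not redist.is_dir():
        return []
    return sorted(p.name for p in redist.rglob("UE4PrereqSetup_*.exe"))
```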

    hashtag
    Build Dependencies

    These dependencies are required to build software from the editor for particular platforms.

    Target Platform
    Dependencies

    Linux

    clang for Linux (version depends on engine version: https://docs.unrealengine.com/4.27/en-US/SharingAndReleasing/Linux/NativeToolchain/arrow-up-right)

    Linux (cross-compile from Windows)

    clang for Windows (version depends on engine version: https://docs.unrealengine.com/4.27/en-US/SharingAndReleasing/Linux/AdvancedLinuxDeveloper/LinuxCrossCompileLegacy/arrow-up-right)

    Windows

    Visual Studio required for projects involving code modification (version depends on engine version: https://docs.unrealengine.com/5.0/en-US/setting-up-visual-studio-development-environment-for-cplusplus-projects-in-unreal-engine/arrow-up-right)

    hashtag
    UE4 Resources

    • Epic Games, Unreal Engine 4.27 Glossary, URL: https://docs.unrealengine.com/4.27/en-US/Basics/Glossary/arrow-up-right

    • David Lightbown, 2018, Classic Tools Retrospective: Tim Sweeney on the first version of the Unreal Editor. URL: https://web.archive.org/web/20180823012812/https://www.gamasutra.com/blogs/DavidLightbown/20180109/309414/Classic_Tools_Retrospective_Tim_Sweeney_on_the_first_version_of_the_Unreal_Editor.phparrow-up-right

    Unity Root Project Directory
    Unity Scene showing Gizmos

    Archiving a Unity 5 Source Project

    This page describes the process for archiving a Unity 5 project and its dependencies on Windows, so that it can be preserved independently of a specific computer system.

    circle-exclamation

    Work in progress!

    One way of preparing for the preservation of an application made in Unity 5 is to archive the project files associated with it. Executable software is created from a Unity project by exporting an application that supports the target platform. By archiving a Unity project, we aim to gather together all the materials required to repeat this build process. This opens up options for incremental migration to new versions of Unity, and the modification of code to support other hardware and software platforms.

    To build a Unity project, you need the following components; guidance on archiving each is provided on this page:

    • Project folder: the collection of custom Unity content and project files

    • Unity Editor: software which allows you to open and edit a Unity project folder

    • Dependencies: any additional software not included with the engine binaries or project by default e.g. libraries, modules
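Before archiving, it can help to confirm the supplied project folder at least contains the expected top-level directories. A rough sketch (the function name is ours; folder names are the Unity defaults, and a genuine check would go deeper):

```python
from pathlib import Path

def check_unity_project(project_dir):
    """Report which of the expected top-level Unity project folders
    are present. 'Packages' only exists from Unity 2018.1 onwards."""
    project_dir = Path(project_dir)
    expected = ["Assets", "ProjectSettings", "Packages"]
    return {name: (project_dir / name).is_dir() for name in expected}
```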

    hashtag
    Project Folder


    hashtag
    Editor and Modules

    The Unity Editor software can be installed using the Unity Hub software or from the installers distributed via the Unity Download Archive (https://unity3d.com/get-unity/download/archivearrow-up-right).

    Before proceeding, you will need to identify the version of the Unity Editor the project was created with. To do so, navigate to the ProjectSettings folder within the project folder and open the file named ProjectVersion.txt (tested in 2018.3.9).
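Reading the version programmatically is straightforward, since ProjectVersion.txt holds simple 'key: value' lines such as 'm_EditorVersion: 2018.3.9f1'. A sketch with a hypothetical function name:

```python
from pathlib import Path

def unity_editor_version(project_dir):
    """Return the editor version recorded in
    ProjectSettings/ProjectVersion.txt, or None if not found."""
    version_file = Path(project_dir) / "ProjectSettings" / "ProjectVersion.txt"
    for line in version_file.read_text().splitlines():
        if line.startswith("m_EditorVersion:"):
            return line.split(":", 1)[1].strip()
    return None
```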

    hashtag
    Using Unity Hub

    1. Install Unity Hub on a suitable computer and navigate to the Installs tab.

    2. Click Install Editor and select the appropriate editor version. If the appropriate version is not available, you will need to install it using the Unity Download Archive method.

    You may also wish to install additional modules to improve build support.

    3. Return to the previous Installs tab, click on the cog icon next to the Editor entry you wish to archive, and click on Show in Explorer.

    4. The window which opens shows the application directory for that Unity Editor version. This folder can be archived and used to access the Editor independently of the Unity Hub installer.

    hashtag
    Using Unity Download Archive

    hashtag
    Dependencies

    There are two common ways of extending the functionality of Unity which you may find have been used: Modules and Packages.

    Modules extend core features, and include options to build for non-Windows platforms. Unfortunately, if any non-standard modules have been used by the project, your only option is to use the Unity Hub application to download and install them.

    Packages are handled by the Unity package manager module and are included in the project folder once they have been added.

    Godot

    This page is designed to help you understand what Godot is and how it can be used by creators.

    Godot Engine is a free open source 3D game engine for Android, Linux, Mac and Windows. A WebEditor is also available.

    Some information with relevance to XR preservation, quoted from the Godot features listarrow-up-right:

    hashtag
    XR support

    Archiving an Unreal Engine 4 Source Project

    Written by Tom Ensom (updated: Feb 2024)

    This page describes the process for archiving an Unreal Engine 4 project and its dependencies, so that it can be preserved independently of a specific computer system.

    One way of preparing for the preservation of software made in Unreal Engine 4 (UE4), is to archive the project files used to create it. While it is the executable form of the software (or 'build') which is used to run it, archiving the source form of the software opens up more preservation options such as:

    • Creation of new builds which support different platforms and environments;

    Mobile Platforms

    Mobile platforms and app services are part of preserving and re-exhibiting AR content and mobile phone-based XR content. This page gathers published research and resources on these topics.

    “Towards a preservation workflow for mobile apps” (24 February 2021)

    “A Race Against Time: Preserving iOS App-Based Artworks” (2019-20)

    “Considerations on the Acquisition and Preservation of Mobile eBook Apps” (2019)

    Smartphones within the changing landscape of digital preservation (2017)

    3D Data

    It may be desirable to explore independent preservation pathways for 3D data that forms part of a real-time 3D immersive media experience. On this page you can find links to initiatives/workgroups, events and publications on the topic of developing standards for data packaging, file types, metadata, documentation, normalization, migration, storage and access of 3D data.

    International Image Interoperability Framework (IIIF) Community 3D Interest Group

    CS3DP (Community Standards for 3D Data Preservation)

    Council on Library and Information Resources (CLIR): 3D/VR in the Academic Library: Emerging Practices and Trends

    Developing Library Strategy for 3D and Virtual Reality Collection Development and Reuse (LIB3DVR)

    Godot makes cross-platform Augmented and Virtual Reality development easy.
    • Works with many headsets including the Meta Quest, Valve Index, HTC Vive, Oculus Rift, all Microsoft MR headsets, and many more.

    • Support for OpenXR, an emerging open standard that most hardware vendors are moving to.

    • Plugins give access to various proprietary SDKs, such as OpenVR (SteamVR) and the legacy Oculus SDKs.

    • WebXR can deliver AR/VR experiences via a web browser.

    • ARKit allows creating AR apps for iOS.

    hashtag
    Multi-platform editor

    Create games on any desktop OS and Android.

    • Works on Windows, macOS, Linux, *BSD and Android (experimental). The editor runs in 32-bit and 64-bit on all platforms, with native Apple Silicon support.

    • Small download (around 35 MB), and you are ready to go.

    • Easy to compile on any platform (no dependency hell).

    hashtag
    Multi-platform deploy

    Deploy games everywhere!

    • Export to desktop platforms: Windows, macOS, Linux, UWP, and *BSD.

    • Export to mobile platforms: iOS and Android.

    • Consoles: Nintendo Switch, PlayStation 4, Xbox One via third-party providers (read more).arrow-up-right

    • Export to the web using HTML5 and WebAssembly.

    • One-click deploy & export to most platforms. Easily create custom builds as well.

    hashtag


    Unreal object text .t3d

    No

    Identical to .copy

  • Incremental migration to new versions of Unreal Engine 4;
  • Migration to a new engine should this become necessary.

    Additionally, source materials can contain rich technical history and support an understanding of how the software was developed.

    As a rule of thumb, you want to have all the materials required to repeat the process of building the software. To build an Unreal Engine 4 project, you need the following components:

    • Project folder: the collection of custom Unreal Engine 4 content and project files;

    • Engine binaries: the editor software which allows you to open the project;

    • Dependencies: any additional software not included with the engine binaries by default e.g. plugins, libraries.

    A sensible approach is therefore to archive all these components either independently or in the form of a disk image. Each component type, including how you can locate it, is described in detail below.

    hashtag
    Project Folder

    An Unreal Engine 4 project folder is a collection of files conforming to a specific directory structure — more information on this format can be found in our introduction to Unreal Engine 4.

    To archive this, the supplier will need to send you a copy of the complete project directory. The contents of this directory should look something like the screenshot below:

    An example UE4 4.27 project folder.

    One interesting thing to note is that the project folder can include assets and other materials that are not used by the built application. This can make the project files larger, but it also provides historical insight into the way the software was created. If the creator of the project files offers to 'clean them up' before supplying them, you may wish to advise them against that.

    Project files can include hundreds or even thousands of files, so as a final step you may wish to ZIP them for convenience and reduced stored size.
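The zipping step can be done with any archiving tool; as a sketch, Python's standard library packs the whole directory tree into a single ZIP (the function name and paths here are hypothetical):

```python
import shutil
from pathlib import Path

def zip_project(project_dir, archive_base):
    """Pack a project folder into '<archive_base>.zip', preserving the
    directory tree, and return the path of the created archive."""
    return Path(shutil.make_archive(str(archive_base), "zip",
                                    root_dir=project_dir))
```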

    hashtag
    Engine Binaries and Source Code

    hashtag
    Engine Binaries

    To open an Unreal Engine 4 project you need an appropriate version of the Unreal Engine 4 editor. To avoid errors, you should use the editor version in which the project was originally developed. If you use a newer version, Unreal Engine will present a warning and give you a choice of whether to proceed or not. There is a chance you can open the project successfully, but doing so may break or change things, so proceed carefully and always use a duplicate copy.

    You can download prebuilt UE4 binaries using the Epic Games Launcherarrow-up-right. Once installed, the engine version can be located in your UE4 install directory and zipped for archiving.

    Alternatively, you can build binaries from the source code on GitHubarrow-up-right (this is a private repository and you will need to request access prior to use). See also the Binary Builder tool https://github.com/ryanjon2040/Unreal-Binary-Builderarrow-up-right.

    hashtag
    Source Code

    If the project involves a modified version of UE4, you will also need to archive a copy of the source code. Archiving the source code of the engine version used can also be a generally useful thing to have, as it can hold useful information for future preservation work.

    For a project using a modified source code, the creator should be able to supply this or advise on where it can be found. For unmodified UE4 source code, this can be pulled from the Unreal Engine repository on GitHub: https://github.com/EpicGames/UnrealEnginearrow-up-right. This is a private repository, so you will need to request access and link to an Epic Games account before being able to access it — see the official instructionsarrow-up-right.

    hashtag
    Dependencies

    Sometimes additional dependencies are required to open or build a UE4 project successfully.

    hashtag
    Unreal Engine 4 plugins

    Plugins are extensions to Unreal Engine 4's functionality. They can be installed from within the engine or manually. There are two default locations for plugins:

    • Unreal Engine install location: /[UE4 Root]/Engine/Plugins/[Plugin Name]/

    • Project folder: /[Project Root]/Plugins/[Plugin Name]/

    You need to make sure that any required plugins have been installed and are then archived with either the project files or engine binaries.
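One way to check is to list the plugin descriptor (.uplugin) files found in the two default locations above. A sketch with a hypothetical function name; marketplace plugins may nest their folders differently in practice.

```python
from pathlib import Path

def installed_plugins(engine_root=None, project_root=None):
    """List plugin names (.uplugin descriptors) found in the default
    engine and project plugin locations described above."""
    roots = []
    if engine_root:
        roots.append(Path(engine_root) / "Engine" / "Plugins")
    if project_root:
        roots.append(Path(project_root) / "Plugins")
    found = set()
    for root in roots:
        if root.is_dir():
            found.update(p.stem for p in root.rglob("*.uplugin"))
    return sorted(found)
```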

    hashtag
    C++ projects

    If a UE4 project involves custom C++ code, you will need to install the appropriate version of Microsoft Visual Studio (e.g. 4.27 requires Visual Studio 2019 to build such projects for Windows).

    Building for Tomorrow: Collaborative Development of Sustainable Infrastructure for Architectural and Design Documentation

    PARTHENOS: “Digital 3D Objects in Art and Humanities: Challenges of Creation, Interoperability and Preservation”

    • Digital 3D Objects in Art and Humanities: challenges of creation, interoperability and preservation. White paperarrow-up-right

    • Project Deliverablesarrow-up-right

    • Workshop presentation videosarrow-up-right

    Born to Be 3D: Digital Stewardship of Intrinsic 3D Data (US Library of Congress)

    The Institute of Electrical and Electronics Engineers (IEEE) Standards Association: Virtual Reality and Augmented Reality Working Group

    hashtag
    3D Data Repository Features, Best Practices, and Implications for Preservation Models: Findings from a National Forum

    hashtag
    US Library of Congress Recommended Formats Statement: Design and 3D

    hashtag
    Web3D Consortium Recommended Standards

    hashtag
    DURAARK (Durable Architectural Knowledge) Paper:

    Lindlar, M., Panitz, M., and Gadiraju, U. (2015) Ingest and Storage of 3D Objects in a Digital Preservation System.


    Software Tools

    Software-based tools for access, assessment, and documentation of immersive media.

    The software tools listed here support preservation activities for compiled, executable files or project files. Some are proprietary and require a license or developer account, and some are open source. Not all may be currently supported or compatible with your files or hardware.

    Tool
    Purpose
    Description
    Link
    Platforms
    Open source?

    Android Debug Bridge (adb)

    Preservation Strategies

    Preservation strategies are applied to XR content to guide it into the future. In an ideal world, full functionality and the same look and feel would be a preservation goal. However, in the real world technical limitations and limited resources may apply. Stakeholders need to agree on acceptable degrees of loss or change to the artwork and develop the preservation strategy accordingly.

    Another consideration in developing a preservation strategy is its sustainability. If possible, a preservation strategy should reduce dependencies and stabilise the software so that the frequency of interventive preservation measures is reduced.

    3D software used in AR and VR can be highly dependent on specific hardware, particularly the HMD with its sensors, optics and electronics. To make XR material more widely compatible, it can be made headset-independent. Currently this is supported by building in support for OpenXR, an open standard for XR software. If this cannot be used, alternative migration approaches or emulation can be applied. These various strategies are discussed in more detail in the sections below.

    hashtag
    Preparation

    In order to select and successfully apply a preservation strategy, you need to understand what it is you are trying to preserve. A good starting point for this is our page. Working through these questions, ideally in consultation with the stakeholders of the XR material, you are aiming to identify:

    • Which components or characteristics of the software are core to the XR experience? e.g. functionality, look and feel, user experience, quantitative / technical parameters; physical integrity.

    • Which are variable or less important? Perfect replication may not be necessary to realise the XR experience authentically.

    • Are they measurable or documentable in some way? Documentation can be very helpful as a reference for transmitting these characteristics e.g.

This approach will help you determine what can and cannot change during the implementation of a preservation strategy, and develop a framework by which you can assess the success of a particular strategy. It is possible that compromises will have to be accepted because of financial or technical limitations. Therefore, applying a documentation strategy (see sub-section below) can complement one which prioritises technical intervention.

    hashtag
    Documentation Strategies

    A game engine can be used to output a 360 degree video, which can be preserved independently of the software as a video. Capturing the game engine output in this way is a type of documentation strategy. A 360 degree video is much easier to preserve than a game engine project with all its software dependencies. In addition, it does not depend on any specific XR hardware for continued access. To output a 360 degree video, usually the game project is necessary as most tools target specific game engine editors. However, it might be possible to capture the output from running the executable using .

(2022) advocates for the inside-out documentation approach, in which the impact that the experience has on the user is central. This kind of documentation can become more relevant than the technical documentation of the artwork if the functionality of the artwork cannot be preserved due to technical or financial constraints. This approach to documentation is practiced in performance art, where video documentation of a performance can stand in for the performance itself (for instance, Charlotte Moorman performing Nam June Paik's TV Cello in 1976).

However, if the capturing strategy is applied to interactive XR experiences, where the experience reacts to user input (other than head movement), capturing the output leads to a total loss of that functionality. If this is not acceptable, the game engine project must be migrated to a version that supports newer XR hardware.

    hashtag
    Migration Strategies

    A migration strategy involves updating the source code/project to run on newer computer or XR hardware. This requires access to the game engine source code/project (not just the executable) as well as the game engine editor software.

    Different approaches to migration can be taken e.g.

• Updating the game engine project to the newest LTS game engine version, and building the project for the newest runtime of the same headset brand.

    • Updating the game engine project to the newest LTS game engine version and compiling the project for the OpenXR runtime. OpenXR has open specifications and "aims to

    hashtag
    Migration Case Studies

    Case study applying an incremental migration strategy: LIMA (2021). A Practical Research into Preservation Strategies for VR artworks on the basis of Justin Zijlstra’s 100 Jaar Vrouwenkiesrecht. URL:

    hashtag
    Emulation Strategies

    An emulation strategy (and allied virtualization strategy) involves using a contemporary computer system to simulate a historical computer system. If there is only an executable, and the source code/project is not available, emulation can become an especially important preservation strategy for 3D software. Through emulation, the software environment is preserved as a whole, including the 3D software and any dependencies (e.g. the XR runtime).

Emulation is mentioned at the end of this list as it is often not expedient in the case of XR material. Gaining access to XR hardware (e.g. an HMD) from an emulated environment has not, as far as we are aware, been successfully achieved in a preservation context. However, emulation can become an option if no HMD is used, and other immersive display equipment (e.g. video projectors or monitors) is used instead.

    Video Documentation

    Video documentation can be used to record aspects of an immersive media (IM) experience. This can take two complementary forms:

    • Physical capture: Video recording of the real-world physical actions of a user interacting with an IM experience.

    • Virtual capture: Video recording of the virtual actions of a user interacting with an IM experience, as would be sent to a display device.

    hashtag
    Virtual Video Capture

    Virtual video capture can produce two kinds of video:

• Fixed-perspective video: video representing a fixed perspective on the virtual environment; the standard form of video, designed for non-interactive viewing.

    • : video representing a 360 degree view from a central point, therefore allowing a level of interactivity via rotational tracking.

    hashtag
    Capturing Video from an Application

    Hardware and software tools can be used to capture video from a real-time 3D application. This can be fixed-perspective or 360 video.

    hashtag
    Software Tools

    Tool
    Description
    Capture Formats

    hashtag
    Rendering Fixed-Perspective Video in the UE4 Editor

    circle-info

    Tested in version 4.27 of the editor only.

The Unreal Engine 4 editor has built-in tools which allow export of video sequences. The actions of these sequences can be scripted or recorded from user interaction within the editor. Using them therefore requires access to the source project and a level of engagement with the editor software. This may involve modification of the source project, so you may wish to work with a copy or under a version control system.

    hashtag
    Render Formats and Settings

    Video export format depends on the options you select in the Render Movie Settings dialogue (info derived from MediaInfo output of a rendered video):

• With 'Use Compression' unchecked: RGBA (8 bits-per-channel) video in an AVI (OpenDML extension) container.

• With 'Use Compression' checked: MJPG video (YUV 4:2:2, 8 bits-per-channel, interlaced, top field first) in an AVI container.

    There are some configurable options, including output video resolution and framerate.

The uncompressed video option yields the best quality output, but the video files produced are very large: in our test, 8 seconds of capture (1080p at 24 fps) yielded a 1.5 GB file. Make sure you have enough storage space if you are going to use this setting; after capture, you can convert to a losslessly compressed format like FFV1 for storage.
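The arithmetic behind that file size is worth making explicit. A minimal sketch, assuming 4 bytes per pixel (RGBA at 8 bits per channel, as described for the uncompressed format above) and the figures from our test:

```python
# Rough storage estimate for uncompressed RGBA capture.
# Figures taken from the 1080p / 24 fps / 8 second test above.
width, height = 1920, 1080
bytes_per_pixel = 4          # RGBA, 8 bits per channel
fps = 24
seconds = 8

frame_bytes = width * height * bytes_per_pixel
total_bytes = frame_bytes * fps * seconds
print(round(total_bytes / 1e9, 2))  # ~1.59 GB, consistent with the ~1.5 GB observed
```

Scaling this up, a minute of 4K capture at the same settings runs into tens of gigabytes, which is why lossless compression after capture is worthwhile.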

    hashtag
    Workflow

    In order to generate a video from a UE4 project using the editor tools, you need to first create a sequence: a scripted or recorded set of events occurring within the virtual environment. Some projects may already use a sequence to choreograph the actions that occur within the IM experience (e.g. an 'on-rails' experience). In these cases, locate the sequence in the Content Browser and open it. In the toolbar you should see a clapperboard icon, which will open up the Render Movie Settings dialogue.

    Where there is not an existing sequence, you can create a sequence by scripting or interaction in the engine.

    hashtag
    Rendering 360 Video in the UE4 Editor

    360 video can be created with a workflow that utilizes plugins enabling export of stereoscopic frames from UE4, which are then assembled into a video sequence with software like Adobe After Effects (which has support for VR and 360 video).

    The following workflow uses , a free plugin included with UE4.

    Note that there are some caveats to using Panoramic Capture Tool:

    • It does not export audio.

      • Audio could theoretically be pulled from fixed-perspective video of an "on-rails" experience, but another strategy would be required for interactive content.

    • It is an Experimental feature in UE4, and is not as actively developed or supported as other UE features.

    There are other 360 export tools that can be purchased in the Unreal marketplace.

    hashtag
    UE4 Panoramic Capture Tool Workflow

    [workflow to be added here]

    hashtag
    Physical Video Capture

    [To be added]

    Introduction

    360° video uses video formats to encode moving image sequences that surround the user in virtual space. The user is able to freely rotate their head, determining the viewing angle. The viewing position may be a fixed position, may follow a pre-determined path (on rails), or may offer some degree of positional interactivity through "portals" which jump the viewer to another piece of 360 video content.

    360° video can be created in a number of different ways:

    • Captured by a camera or array of camera lenses;

    • Generated as an export from 3D rendering software (e.g. Blender);

    • Generated from a real-time 3D . The 360° video exported from a game engine can be the artistic end product (s. for instance "" by Studer / van den Berg) or the documentation of a real-time 3D artwork.

    hashtag
    Assessing 360° Video

    • How was the 360° video created?

      • Was it created through camera capture? If so, is the camera type and output format known?

• Was it created using a 3D software tool, e.g. a 3D renderer or game engine? Is the capture and output format of the video known?

    hashtag
    Acquisition Checklist

    The following measures can help ensure long-term access to 360° video:

    • Ensuring metadata has been captured describing the projection format and mono/stereoscopic format used.

    • Ensuring the video file received is the highest quality available.

    • Considering whether the source video files (pre-stitching) should also be acquired.

    Video Formats

    hashtag
    Monoscopic vs Stereoscopic

    360° video can be either monoscopic or stereoscopic. Monoscopic video supports what is perceived as a 2D representation of the scene i.e. there is no perception of depth. Stereoscopic video supports a 3D representation of the scene with a perception of depth.

    • Monoscopic 360° video contains video captured from a single viewpoint within the scene.

• Stereoscopic 360° video contains video captured from two viewpoints within the scene, providing a different point of view for each eye. These are packed into a single video file, arranged either side-by-side or top-bottom.
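The frame packing determines the effective per-eye resolution of a stereoscopic file. A minimal illustrative helper (the function name and layout labels are ours, not taken from any particular tool):

```python
def per_eye_resolution(width, height, layout):
    """Per-eye frame size for packed stereoscopic 360° video.

    'side-by-side' packing halves the horizontal resolution per eye;
    'top-bottom' packing halves the vertical resolution per eye.
    Illustrative sketch only.
    """
    if layout == "side-by-side":
        return width // 2, height
    if layout == "top-bottom":
        return width, height // 2
    raise ValueError(f"unknown layout: {layout}")

# A 3840x2160 top-bottom file carries two 3840x1080 views:
print(per_eye_resolution(3840, 2160, "top-bottom"))    # (3840, 1080)
print(per_eye_resolution(3840, 2160, "side-by-side"))  # (1920, 2160)
```

This is one reason stereoscopic files are often delivered at higher overall resolutions than their monoscopic equivalents.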

    hashtag
    Video File Format

    Codecs: h.264/AVC, h.265/HEVC, VP9

Wrappers: MP4, Matroska, WebM

    Significant variables in choice of video file format:

    • Achievable bitrate / compression

    • Metadata container options

    • ...?

    hashtag
    Projection Format

Projection format refers to the way in which data representing a 360° or spherical field of view is mapped to a flat image when it is encoded. It is similar to the way in which a map of Earth is a flat representation of the spherical surface of the planet.

    Some common projection formats include:

    • Equirectangular

    • Cubemap

    • Equi-angular cubemap

    Significant variables in choice of projection format:

    • Pixel density

    • Tool support (encoding, decoding)

    • Requirements of video streaming platforms (e.g. YouTube)

    Pyramid
    ...?
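As an illustration of the equirectangular case, the mapping from a viewing direction to a pixel position can be sketched as follows. Conventions for yaw/pitch ranges and the image origin vary between implementations; this is one common choice, not a normative formula:

```python
def equirect_pixel(yaw_deg, pitch_deg, width, height):
    """Map a viewing direction to a pixel in an equirectangular frame.

    yaw in [-180, 180] (left/right), pitch in [-90, 90] (up/down).
    Origin is the top-left corner of the frame; illustrative only.
    """
    x = (yaw_deg + 180.0) / 360.0 * width    # full yaw range spans the width
    y = (90.0 - pitch_deg) / 180.0 * height  # zenith at top, nadir at bottom
    return x, y

# The straight-ahead direction lands at the centre of a 3840x1920 frame:
print(equirect_pixel(0, 0, 3840, 1920))  # (1920.0, 960.0)
```

Note how pixel density is uneven: rows near the poles cover far less solid angle than rows at the equator, which is the motivation for alternatives such as the equi-angular cubemap.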

    Debugging, sideloading

    Command line tool for debugging projects and sideloading content to Android-based hardware

    https://developer.android.com/studio/command-line/adbarrow-up-right

    Windows, Mac, Linux

    Yes

    Android Logcat

    Debugging

The Android Logcat package is a utility for displaying log messages coming from an Android device in the Unity Editor

    https://docs.unity3d.com/Packages/com.unity.mobile.android-logcat@0.1/manual/index.htmlarrow-up-right

    Unity 2019.1 or above

    No

    apitrace

    Debugging

    Tools for tracing OpenGL, Direct3D, and other graphics APIs

    https://github.com/apitrace/apitracearrow-up-right

    Windows

    Yes

    Ardour

    DAW

Open-source digital audio workstation; ambisonic and binaural plugins available

    https://ardour.org/arrow-up-right

    Linux, Mac, Windows

    Yes

    Blender

    3D

Open-source 3D suite and more

    https://www.blender.org/arrow-up-right

    Linux, Mac, Windows

    Yes

    FRAPS

    Video Recording; Monitoring / Metrics

    Screen recording and frame rate monitoring tool (DirectX, OpenGL)

    https://fraps.com/arrow-up-right

    Windows XP, 2003, Vista, and Windows 7

    Free, not open source

    NVIDIA Ansel

    360° images

    Capture 360° still images

    https://developer.nvidia.com/anselarrow-up-right

    Unreal Engine 4, Unity 5

    Requires account on developer.nvidia.com to access SDK

    Intel Graphics Performance Analyzers

    Graphics Analysis

    Command line and scripting interface to expose capture and playback functionalities

    https://www.intel.com/content/www/us/en/developer/tools/graphics-performance-analyzers/overview.htmlarrow-up-right

    Windows, Ubuntu

    Free, not open source

    iVRY

    Hardware Emulator

    Use iPhone 6+, Android 4.4+ to view Valve OpenVR/SteamVR content for HTC Vive and Oculus Rift on Windows 7+

    https://store.steampowered.com/app/992490/iVRy_Driver_for_SteamVR/arrow-up-right

    Windows

    No

    libsurvive

    Driver / Library

    Open-source tracking library for Valve's Lighthouse and Vive tracking systems

    https://github.com/cntools/libsurvivearrow-up-right

    Windows, Debian

    Yes

    monado

    XR runtime

    Runtime for VR and AR on mobile, PC/desktop, and HMDs (OpenXR API)

    https://monado.freedesktop.org/arrow-up-right

    GNU/Linux

    Yes

    LIV

    Video recording

    Live capture and streaming of user interactions

    https://www.liv.tv/arrow-up-right

    Windows / SteamVR

    No

    Microsoft Hololens Emulator

    Hardware Emulator

    Test Hololens apps on PC, use mouse/keyboard inputs instead of controllers without code adaptation

    https://docs.microsoft.com/en-us/windows/mixed-reality/develop/advanced-concepts/using-the-hololens-emulatorarrow-up-right

    Windows

    Free, not open source

    NVIDIA GeForce ShadowPlay

    Video recording

    Screen capture in real-time 3D applications (not 360°)

    https://www.nvidia.com/en-us/geforce/geforce-experience/shadowplay/arrow-up-right

    Windows, NVIDIA GeForce graphics cards

    Free, not open source

    Oculus 360 Capture SDK

    Video Recording

    360° video recording in real-time 3D applications

    https://developer.oculus.com/blog/announcing-360-capture-sdk/arrow-up-right

    Unity, Unreal, NVIDIA and AMD GPUs

    Free, not open source

    vr5kplayer

    Video Recording

    Create and play a view-dependent version of a 5K x 5K 360 degree stereo video on Oculus mobile VR systems.

    https://developer.oculus.com/downloads/package/vr5kplayerarrow-up-right

    Oculus Go or Samsung S8 (and later) Gear VR systems.

    Free, not open source

    Oculus Compositor Mirror tool

    Monitoring, Documentation

    Displays the content that appears within the Rift headset on a computer monitor. It has several display options that are useful for development, troubleshooting, and presentations.

    https://developer.oculus.com/documentation/native/pc/dg-compositor-mirror/arrow-up-right

    Windows, Oculus Rift

    Free, not open source

    Oculus HMD head motion emulation

    Hardware Emulator

    Simulate the movement of a user directly in the Unity Editor

    https://developer.oculus.com/documentation/unity/unity-hmd-emulation/arrow-up-right

    Windows, Unity, Quest, Rift

    Free, not open source

    Open Broadcaster Software

    Video Recording

    Combine multiple computer sources in custom layout, with switching

    https://obsproject.com/arrow-up-right

    Windows, Mac, Linux

    Yes

    Open VR Recorder

    Tracking/Input Data Recording

    Record tracking data from devices with drivers for OpenVR / SteamVR.

    https://brekel.com/openvr-recorder/arrow-up-right

    OpenVR, SteamVR; HTC Vive, Oculus Rift VR systems

    Trial 10 seconds recording; $125 license

OpenXR Conformance Test Suite (CTS)

    Conformance tool

    Command line interface conformance test suite for OpenXR

    https://github.com/KhronosGroup/OpenXR-CTSarrow-up-right

    Yes

    OpenComposite

    Compatibility Layer

    Play SteamVR games without SteamVR

    https://gitlab.com/znixian/OpenOVRarrow-up-right

    Yes

    ReVive

    Compatibility Layer

    Compatibility layer between the Oculus SDK and OpenVR/OpenXR

    https://github.com/LibreVR/Revivearrow-up-right

    Yes

    OpenHMD

    Reverse engineering

    Distortion Maps for headsets

    http://www.openhmd.net/arrow-up-right

    Yes

    OVR Metrics Tool

    Monitoring / Metrics

    Generates performance metrics for applications running on Oculus mobile devices

    https://developer.oculus.com/downloads/package/ovr-metrics-tool/arrow-up-right

    Unreal, Unity, Quest

    Free, not open source

    Radeon GPU Analyzer

    Graphics Analysis

    Performance analysis tool for DirectX, Vulkan, SPIR-V, OpenGL, and OpenCL

    https://gpuopen.com/rga/arrow-up-right

    Windows, Linux

    Yes

    Radeon Software Adrenaline 2020 Edition (ReLive 2019)

    Video Recording

    Screen capture for AMD Radeon graphics cards

    https://www.amd.com/en/support/kb/faq/dh2-023arrow-up-right

    Windows, DirectX, Vulkan

    Free, not open source

    RenderDoc

    Debugging

    General purpose graphics debugger

    https://renderdoc.org/arrow-up-right

    Windows

    Yes

    RenderDoc for Oculus

    Debugging

    Branch of the RenderDoc project by Oculus for debugging the Oculus Quest

    https://developer.oculus.com/downloads/package/renderdoc-oculus/arrow-up-right

    Unreal, Unity, Quest

    Free, not open source

    RivaTuner Statistics Server

    Monitoring

    Framerate monitoring, On-Screen Display and high-performance video capture service

    https://www.guru3d.com/files-details/rtss-rivatuner-statistics-server-download.htmlarrow-up-right

    Windows

    Free, not open source

    SideQuest

    Sideload utility

    Sideload content to Oculus Quest

    https://uploadvr.com/sideloading-quest-how-to/arrow-up-right

    Windows, Mac, Linux, Android

    Free, not open source

    Sites in VR

    Calibration

    Mobile calibration tool

    http://www.sitesinvr.com/viewer/settings.htmarrow-up-right

    Spatial Media Metadata Injector

    Metadata

    360° video metadata injector

    https://github.com/google/spatial-mediaarrow-up-right

    MacOS, Windows

Yes

    Steam VR Mirror Mode

    Monitoring, Documentation

    Enables you to see what the user sees in the HMD for SteamVR content

    https://docs.unrealengine.com/4.27/en-US/SharingAndReleasing/XRDevelopment/VR/VRPlatforms/SteamVR/BestPractices/arrow-up-right

    Unreal, SteamVR Tools

    Free, not open source

    Surreal Capture

    Video Recording

    360° video recording in real-time 3D applications

    https://www.surrealcapture.com/arrow-up-right

    Windows

    No, paid license ($179.95) or 15-day trial

    Unity Recorder Package

    Recording

    Capture and save data during play mode

    https://docs.unity3d.com/Packages/com.unity.recorder@2.0/manual/index.htmlarrow-up-right

    Unity Editor

    Included in Unity

    Unity Stereo 360 Image and Video Capture

    Recording

    Renders the camera view for each eye; requires additional code and software settings

    https://blog.unity.com/technology/stereo-360-image-and-video-capturearrow-up-right

    Unity 2018.1+

    Included in Unity

    Unreal nDisplay

    Monitoring, Documentation

Renders a scene on multiple synchronized displays

    https://docs.unrealengine.com/4.26/en-US/WorkingWithMedia/nDisplay/arrow-up-right

    Unreal Engine

    Included in Unreal

    Unreal Panoramic Capture

    Video export

Exports 360° still images or frames that can be compiled into video with Adobe After Effects

    https://docs.unrealengine.com/4.26/en-US/WorkingWithMedia/StereoPanoramicCapture/arrow-up-right

    Unreal Engine

    Plugin for Unreal

    Unreal Replay System

    Recording

    Console tool for recording and playback of game play

    https://docs.unrealengine.com/4.26/en-US/TestingAndOptimization/ReplaySystem/arrow-up-right

    Unreal Engine

    Unreal Engine 4 feature

    Unreal VR Spectator Screen

    Recording

    View content from third-person perspective

    https://docs.unrealengine.com/4.27/en-US/SharingAndReleasing/XRDevelopment/VR/VRHowTos/VRSpectatorScreen/ arrow-up-right

    HTC Vive, Oculus Rift, Steam VR, PlayStation VR (PS VR)

    Unreal Engine 4 feature

    Videos and images of the work running.

  • A testing protocol which can help to verify different behaviours of the software, after changes are made. For example, applying a particular input and comparing the output to a documented reference.

  • Quantitative and technical parameters (e.g. framerate, frametime, display resolution) can be useful when assessing video or sound quality.

  • to standardize the connections between VR applications and VR runtimes, and VR runtimes and VR hardware". Hence, after this migration, the risk that the game engine project cannot be transferred to a newer runtime is smaller than if exporting it for a proprietary runtime (Ensom and McConchie, Preserving Virtual Reality Works, https://zenodo.org/records/5274102arrow-up-right, p. 11).
    initial assessment
    specific tools
    ardenarrow-up-right
    https://www.li-ma.nl/lima/article/preserving-vr-artworksarrow-up-right

    Did the video undergo a stitching process, and is the software known?

• Did the video undergo an editing process, and is the software known? Are these production assets available?

  • By inspecting file metadata, are you able to determine key characteristics of the 360° video format? These are:

    • The projection format used e.g. cubemap, equirectangular.

    • Whether the video is monoscopic or stereoscopic.

• What codec and pixel format has the video been encoded with?

  • game engine
    Passage Park #7: Relocate

    UE v4.19 and earlier use a different version of Panoramic Capture that does not have the benefit of Blueprints, and requires some manual programming; this earlier version is not covered here.

    Open Broadcaster Software (OBS)arrow-up-right

    Free and open source software for video recording. Cross-platform (Windows, MacOS and Linux).

    NVIDIA ShadowPlayarrow-up-right

    Requires NVIDIA GPU.

    Xbox Game Bararrow-up-right

    Windows 10/11 built-in tool.

    360 video
    recordingarrow-up-right
    Panoramic Capture Toolarrow-up-right
    The Render Movie Setting Dialogue in Unreal Engine 4.27.

    Tools

    hashtag
    Players and Playback Frameworks

    hashtag
    Transcoding

360-degree video has specialised transcoding requirements if properties such as projection format and stereoscopy are to be properly managed. At this point the tools listed here have not been tested by us, and inclusion should not be taken as a recommendation.

    List of tools:

• ffmpeg

• Headjack VRencoder
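As an example of the kind of remapping involved, ffmpeg's v360 filter converts between projection formats. A minimal sketch that builds such a command line (equirectangular to 3x2 cubemap; the file names are placeholders, and, as noted above, we have not tested these tools):

```python
# Construct an ffmpeg invocation using the v360 filter to remap a
# 360° video from equirectangular to a 3x2 cubemap layout.
input_file, output_file = "in_equirect.mp4", "out_cubemap.mp4"

cmd = [
    "ffmpeg", "-i", input_file,
    "-vf", "v360=input=equirect:output=c3x2",  # projection remap filter
    output_file,
]
print(" ".join(cmd))
```

Note that v360 operates on the decoded pixels only: projection metadata (see the Metadata Standards section) is not carried over automatically and would need to be re-injected after transcoding.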

    Spatial Audio

Frameworks, spatial audio plugins, ambisonics and more

    hashtag
    Frameworks

Wwisearrow-up-right is an audio framework for MacOS and Windows used to create interactive audio content for the Unrealarrow-up-right and Unityarrow-up-right engines

    hashtag
    Spatial Audio

    hashtag
    IEM Plug-in Suite

The is a free and open-source audio plugin suite, including ambisonic plug-ins up to 7th order, for Linux, MacOS and Windows.

    VST2, VST3 and standalone app.

StereoEncoder, RoomEncoder, EnergyVisualiser, BinauralDecoder, MultiBandCompressor and more.

    Metadata Standards

    Due to the increased complexity and number of variables in 360 video, there is a need for metadata standards to accommodate this. Google has a suite of standards and metadata injection tools on GitHub under the title of "spatial media" herearrow-up-right.

    In order to upload 360 video to YouTube and have it recognised and played back properly, it is requiredarrow-up-right that spatial metadata be injected using the Google tool. This supports MP4 and WebM (video only) containers. You can find specification information on GitHub for the support elements for videoarrow-up-right and audioarrow-up-right.

    It supports:

    • Projection type

    • Stereoscopic mode

• Projection pose (yaw, pitch, roll) - presumably for the initial viewing orientation?

    • [Add audio]
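For orientation, the Spherical Video V1 metadata that such injectors write is an XML block embedded in the MP4 container. A minimal sketch of reading it with Python's standard library (element names follow Google's spherical-video spec; the values shown here are illustrative):

```python
import xml.etree.ElementTree as ET

# Example Spherical Video V1 XML block, as written by a metadata injector.
# Element names are from Google's spatial-media spec; values are illustrative.
NS = "http://ns.google.com/videos/1.0/spherical/"
xml = f"""<rdf:SphericalVideo
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:GSpherical="{NS}">
  <GSpherical:Spherical>true</GSpherical:Spherical>
  <GSpherical:Stitched>true</GSpherical:Stitched>
  <GSpherical:ProjectionType>equirectangular</GSpherical:ProjectionType>
  <GSpherical:StereoMode>top-bottom</GSpherical:StereoMode>
</rdf:SphericalVideo>"""

root = ET.fromstring(xml)
projection = root.find(f"{{{NS}}}ProjectionType").text
print(projection)  # equirectangular
```

Inspecting this block (e.g. with ExifTool or MediaInfo alongside a parser like the above) is a quick way to verify that projection and stereo metadata survived a transcode.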

    hashtag
    Spatial Media Metadata Injector

is an open-source tool that can be used to inject 360° / stereoscopic 3D (top/bottom) and spatial audio (ambiX / SN3D) metadata into standard (e.g. equirectangular) videos

    https://ffmpeg.org/ffmpeg-filters.html#v360arrow-up-right
    https://headjack.io/vrencoder/arrow-up-right
    IEM Plug-in Suitearrow-up-right
    Spatial Media Metadata Injectorarrow-up-right
    Slides, notes and other documents from the workshop Google Drive folderarrow-up-right.
    HTC Vive Repair Help: Learn How to Fix It Yourself.iFixitchevron-right
    oculus — Search - iFixitifixitchevron-right
    meta quest — Search - iFixitifixitchevron-right
    Valve Index Repair Help: Learn How to Fix It Yourself.iFixitchevron-right
    samsung gear vr — Search - iFixitifixitchevron-right
    https://docs.google.com/spreadsheets/d/1DkoRL4VxXeFyuoj-ybToMn4jrzSrStAPjgX9JXGZKs8/edit?usp=sharingdocs.google.comchevron-right
    XR Systems Comparison (In Progress) [2021-22] compiled by sasha arden.
    PASIG 2017: Smartphones within the changing landscape of digital preservationArchives and Manuscripts at the Bodleian Librarychevron-right
    Considerations on the Acquisition and Preservation of Mobile eBook AppsZenodochevron-right
    A Race Against Time: Preserving iOS App-Based ArtworksElectronic Media Reviewchevron-right
    3D/VR in the Academic Library: Emerging Practices and TrendsCLIRchevron-right
    https://lib.vt.edu/research-teaching/lib3dvr.htmllib.vt.educhevron-right
    Towards a preservation workflow for mobile appsbitsgalore.orgchevron-right
    IIIF 3D Community Groupiiif_iochevron-right
    OSFosf.iochevron-right
    Developing Library Strategy for 3D and Virtual Reality Collection Development and ReuseCNI: Coalition for Networked Informationchevron-right
    Born to Be 3Dwww.loc.govchevron-right
    3D Data Repository Features, Best Practices, and Implications for Preservation Models: Findings from a National Forumcrl.acrl.orgchevron-right
    IEEE Virtual Reality and Augmented Reality Working Group (CTS/VRARSC/VRARWG)IEEE 2048 VR/AR Working Group (VRARWG)chevron-right
    Recommended Formats Statement - Design and 3D | Resources (Preservation, Library of Congress)www.loc.govchevron-right
    Recommended Standards | Web3D Consortiumwww.web3d.orgchevron-right
    https://projects.iq.harvard.edu/buildingtomorrow/homeprojects.iq.harvard.educhevron-right
    Ingest and Storage of 3D Objects in a Digital Preservation SystemZenodochevron-right