This glossary is an effort to define frequently encountered immersive media terminology and support a common vocabulary in collaboration between disciplines. The first version was compiled during iPRES 2019 by participants in the VR Hackathon event. Definitions are sourced from the following existing online glossaries:
Unity: What is XR Glossary
XinReality wiki: Terms
Oculus Creators Portal: VR Glossary
Interactive Advertising Bureau (IAB): AR and VR Terminology PDF
Digital Preservation Handbook: Glossary
To do:
Review and agreement on these definitions
Add visual examples/images to illustrate what is being described
Add relevant digital preservation terms
Virtual Reality: Virtual reality, commonly abbreviated to VR, is a technology that simulates a fully immersive virtual or imaginary environment in which a user feels that they are physically present.
Augmented Reality: Augmented reality, commonly abbreviated to AR, is a technology that overlays virtual elements on top of a real-world environment.
Augmented Virtuality: Similar to augmented reality, this refers to a technology system whereby a largely virtual environment is merged with real-life objects.
Mixed Reality: Described variously as either mixed reality, MR or hybrid reality, this term refers to any technology that isn’t a fully immersive VR system, but instead augmented reality or augmented virtuality (see above definitions). This is also (confusingly) used to describe Microsoft’s virtual platform, which includes both VR and AR devices.
Three degrees of freedom: Often abbreviated to 3DoF, this term refers to the ability to move in three rotational directions, namely pitch, yaw and roll.
Six degrees of freedom: Often abbreviated to 6DoF, this term refers to the ability to move in six directions: the three rotations (pitch, yaw and roll) plus three translations (elevation, strafing and surging).
CAVE (cave automatic virtual environment): A virtual reality environment consisting of 3 to 6 walls that form a room-sized cube.
On-rails: A VR experience in which there is no significant use of positional tracking. It has a start and an end like a video.
Scene behaviours: The ways in which objects in a scene can be manipulated, and the interactions available for doing so.
Caching: A grid appears if the user moves out of the safe play area (such systems are often called chaperone or guardian systems).
VR-runtime: The integration of software and hardware (specific headset drivers etc.) required to run VR content.
Presence/immersion: Both presence and immersion are used interchangeably to describe the sensation of feeling physically present within a virtual experience, as opposed to the detachment of experiencing content via a conventional screen-based medium.
Frames per second: Also known as frame rate or fps, this measures how often images (also called ‘frames’) are shown consecutively. This is related to, yet distinct from, refresh rate (see below). 60 frames per second is usually considered playable without causing motion sickness, but the best VR headsets target even higher frame rates.
Refresh rate: This specifically indicates how often the buffer is updated and an image (often called a ‘frame’) regenerated on a screen, an important element when creating a realistic virtual environment. This is measured in Hertz (Hz) and is related to, yet distinct from, frames per second (see above). A low refresh rate can cause judder (see below).
Judder: Typically caused by a low refresh rate (see above) or dropped frames, judder is the manifestation of motion blur (also known as smearing) and the perception of more than one image simultaneously (known as strobing). This can cause simulator sickness (see below).
Latency: The delay between a user’s input and the VR system’s response; for example, when images are not well synchronised with head movement or sound, changing later than expected.
Teleportation: A common method of virtual navigation, this allows the user to quickly move between points without having to traverse the distance between them.
3D API: A library and interface supporting common 3D rendering tasks. Examples include DirectX (Windows), OpenGL (cross-platform), Metal (MacOS), Vulkan (cross-platform).
Ambient occlusion: A lighting technique that approximates film-like lighting quality with real-time performance. Ambient occlusion is a lighting model that calculates how exposed each point in a scene is to ambient light, darkening areas occluded by nearby objects.
Anti-aliasing: Raster images are made of rectangular pixels, which can lead to jagged edges in curved lines. Anti-Aliasing aims to reduce the jaggedness created by these pixels, and there are multiple techniques to achieve this.
Fast Approximate Anti-Aliasing: The least demanding type of anti-aliasing. Rather than running complex calculations based on the geometry and colours displayed, FXAA simply applies blurring to obscure jagged edges. The result is a negligible performance impact but a generally blurrier image.
Multi-sampling Anti-Aliasing: MSAA relies on colour manipulation around geometric edges to produce an effect of smoothness. It can use 2, 4 or 8 samples; the higher the sample count, the higher the quality and the performance impact.
Super Sampling Anti-Aliasing: SSAA renders the scene at a higher resolution and then downsamples it to the display resolution. This increases the effective pixel density and produces a much sharper image, at significant performance cost.
Bloom: A post-processing effect that creates soft, feathered glows around light sources.
Foveated rendering: A tracking-based rendering method where the user’s eye movements are tracked, allowing peripheral vision to be rendered at a lower quality, thus reducing the amount of processing needed to render a VR experience in real-time.
Lens Flare: The image is processed to replicate light reflecting in a simulated camera lens.
Material shaders: Shaders which define surface appearance (e.g. physically based rendering (PBR) versus traditional methods; BSDF/BRDF models).
Shader: A shader is a small computer program designed to run on the GPU. Shaders are written in languages associated with 3D APIs, such as HLSL (DirectX), GLSL (OpenGL) and SPIR-V (Vulkan).
Asynchronous Reprojection / Spacewarp / Timewarp:
Lens Distortion:
Inside-out tracking: A tracking method in which the cameras and sensors are built into the headset itself, determining how its position changes relative to its environment. Disadvantage: controllers may lose tracking when out of the headset’s view, e.g. behind the user.
Outside-in tracking: A tracking method in which externally placed sensors/cameras are mounted around the user, creating a tracked play space. The benefit of this design is that both the user and the controllers are monitored wherever they go within that area.
Eye tracking: The measurement of eye positioning and movement to discern where exactly a user is looking. This is a crucial element of foveated rendering (see above).
Head tracking: This is a method of tracking a user in virtual reality whereby the picture shifts as they move or angle their head.
Motion tracking: The use of positional sensors and markers that register where a device is, allowing it to be mapped to a virtual environment.
Rotational tracking: A term used to describe how a piece of hardware determines how tilted something is. There are three different ways an object can rotate; each of these is called a degree of freedom, and a system can track any number of degrees of freedom (see 3DoF and 6DoF above).
Positional sensor: A device used to track the exact position of the user while they are using a VR system, feeding back data that informs the images shown on screen.
Room-scale: A virtual reality set-up that, thanks to an expansive configuration of positional sensors (see above), allows the user to physically roam around an entire room without experiencing limitations.
Fixed viewpoints: Common to many VR experiences – particularly those that are mobile-based – this feature limits the user to a pre-defined number of explorable positions within a virtual reality build, rather than allowing open-world exploration.
First or Third person Experience: Whether the user experiences the virtual world as through their own eyes, or as an external observer.
VR sickness: typically involves symptoms of dizziness and nausea.
Simulator sickness: Sometimes referred to as VR sickness and with similar effects to motion sickness, this can be caused by factors including judder and users perceiving self-motion when stationary.
Asynchronous timewarp: a middle-ground between high persistence and low persistence displays that “warps” a current frame to compensate for the motion of your head before showing you the next rendered frame.
Spatial desync: the effect or difference occurring when a user’s movements in real life and the VR avatar’s movements are out of sync.
Computer-based or tethered HMD: describes any headset that requires a connection to a stand-alone PC in order to function. Well-known computer-based systems include Facebook’s Oculus Rift and the HTC Vive.
Mobile-based HMD: Any HMD where the processing and display for the VR experience are provided by a mobile phone. Notable examples include the Samsung Gear VR and Google Daydream headset.
Standalone HMD: A VR or AR HMD where the entire system is self-contained within the device. Examples include the Microsoft HoloLens mixed reality headset and the upcoming Oculus Go from Facebook.
Headphones are usually provided with or built into HMDs. Specifications vary by brand, but common features include 3D spatial sound for an immersive experience.
Binaural:
Controllers: Peripheral devices, often one for each hand (left and right), providing full motion tracking of a player’s hands in a VR experience.
Complex Digital Objects: An object, item, or work with a component that is digital, is made up of multiple files (likely of differing file formats) and may or may not incorporate a physical component. A Complex Digital Object is likely to depend on software, hardware, peripherals, and/or networked or non-networked data, platforms, and/or services. Contrast with a Simple Digital Object which consists of a single file. XR materials can be considered Complex Digital Objects.
Virtual Reality (VR) refers to experiences which fully immerse a user in a virtual environment. The extent to which the user may engage with this virtual environment can vary, and determining this can be a useful starting point in preservation planning:
In a fixed-position experience (3DoF - rotational tracking only), the user views the virtual environment from a fixed position.
In an on-rails experience (3DoF - rotational tracking only), the user is moved through the virtual environment along a predetermined path.
In a fully interactive experience (6DoF - rotation and positional tracking), a user can move freely through the virtual environment.
Within these three types, there may be a varying level of interactivity with elements in the virtual environment, depending on the way in which the VR content has been produced. 360 video content is typically not interactive or dynamic (the video frames are predetermined when it is authored), while real-time 3D software may have interactive or dynamic elements (the video frames are generated on the fly at runtime).
There are two primary types of VR content: 360 video and real-time 3D software.
For VR content which makes use of real-time 3D rendering, existing software frameworks and tools are typically used as a starting point. Development for desktop or mobile applications typically uses a . Other approaches include web frameworks like and .
For 360 video,
Accessing VR content is contingent on a set of interconnected off-the-shelf hardware and software components - a VR system. This will typically consist of:
HMD or other display device
Controller
Computer system
Software environment consisting of off-the-shelf software including:
Operating system
Drivers
Campbell, S. (2017). A Rift in our Practices, Toward Preserving Virtual Reality [Master’s Thesis, New York University].
Campbell, S., & Hellar, M. (n.d.). From Immersion to Acquisition: An Overview Of Virtual Reality For Time Based Media Conservators. Electronic Media Review, Six: 2019-2020. Retrieved October 7, 2021, from
Cranmer, C. (2017). Preserving the emerging: virtual reality and 360-degree video, an internship research report.
Ensom, T., & McConchie, J. (2021). Preserving Virtual Reality Artworks. Tate.
LIMA (2021). A Practical Research into Preservation Strategies for VR artworks on the basis of Justin Zijlstra’s 100 Jaar Vrouwenkiesrecht. URL:
Google. Google VR: Fundamental Concepts. URL:
Brown CSCI1951S. VR Software Wiki. URL:
The Knowledge Base is a collaborative and community-driven space for sharing anything from publications, to technical guides, to notes and work-in-progress. We welcome contributions big or small!
As a contributor, you agree to the Code of Conduct and that any content you create on this GitBook site will be shared according to our Licence terms.
If you contribute, please do add your name to the list of contributors and create a bio page!
If you are interested in contributing, you can join the team as an Editor by following this link:
https://app.gitbook.com/invite/-MiR-vfqFP7-QsvY8wdc/b20bJHyul40s6q8KYlI9
Once you have logged in and joined the PIMKB group on GitBook, you will be able to make edits to pages. You commence editing by clicking on the 'Edit' button in the top right corner. Changes you make in Edit mode will exist independently from the live version until you submit them for review.
Once you are happy with your edit, you can click the 'Submit for review' button in the top right corner, which will then be checked by an administrator before going live. Before submitting for review, please include a brief description of the changes made during your edit by filling in the 'Describe your changes...' field at the top of the page. This will help the reviewer understand the nature of your changes.
For more information on GitBook and how to use it, check out the GitBook docs.
If you'd like to help or support with contributing or using the GitBook platform, or just want to chat about an idea for a contribution, please do get in touch with the site admins:
Tom Ensom: tom.ensom [at] tate.org.uk
Jack McConchie: jack.mcconchie [at] tate.org.uk
We are gathering a curated list of publications and other resources on the topic of immersive media and its preservation. You will find references to external resources embedded throughout the Knowledge Base while a bibliography of key references can be found in our public Zotero library:
https://www.zotero.org/groups/4453604/preserving_immersive_media/library
If you want to add to the bibliography, either drop Tom an email for an invite to the Zotero group, or add your reference to the list below and someone should be able to add it for you.
Röck, Marti (2021): Capturing a VR-executable as a 360-degree video. A test report. [added 3/1/24]
The Preserving Immersive Media Group (PIMG) is a community and email list for those interested in collecting, preserving and stabilising artworks that utilise immersive media, including 360 video, real-time 3D, virtual reality and augmented reality. This group was born out of the ongoing Preserving Immersive Media project at Tate. We encourage all with an interest in these topics to join the PIMG email list on Groups.io. Members are welcome to participate in discussion and to share any relevant information via the list, provided they agree to the PIMG Code of Conduct.
PIMG also runs regular events, recordings and outputs from which are made available online.
The purpose of this list is to foster exchange and collaboration between projects exploring the preservation of immersive media materials and experiences. This is a growing list maintained by the community. If you are involved in an immersive media preservation project, please consider adding it.
Project | Host(s) | URL | Contact |
---|---|---|---|
Preserving Immersive Media | Tate, London, UK |  | jack.mcconchie@tate.org.uk and tom.ensom@tate.org.uk |
This is a growing list maintained by the community. If you care for an immersive media artwork, please consider adding it.
The purpose of this list is to show the variety of artworks and technologies used, but also to foster exchange between conservators. The listed artworks are not necessarily well-documented case studies. The list below includes not only VR- and AR-based artworks, but also other interactive artworks that are produced with a game engine and hence require similar preservation strategies.
Artwork | Artists | Artwork date | Game Engine | Runtime | Game Engine, latest version (year) | Display | Interactivity | Institution owning artwork |
---|---|---|---|---|---|---|---|---|
 | Wachter / Jud | 2000, ongoing |  | Windows 10 | 2012 | Projection, U-shaped, plus flat screen for navigation | Navigation with joystick | HEK (House of Electronic Arts, Basel) |
 | Studer / van den Berg | 2017 | DarkBASIC Professional V1.05 | Windows 7 / 10 with DirectX 9 | 2016 | Projection 1920x1080 | Walkthrough (like a 360 video); looking around using a mouse | HEK (House of Electronic Arts, Basel) |
 | Mélodie Mousset | 2015 | Unity | Windows 10, Oculus Rift | under development | Oculus Rift headset | Navigation with Xbox controller (as part of Oculus Rift) | HEK (House of Electronic Arts, Basel) |
Tom Ensom (he/him) is a London-based Digital Conservator specialising in the conservation of software-based art. He works with those caring for software-based art to research, develop and implement strategies for its long-term preservation. He currently works primarily with Tate’s Time-based Media Conservation team, where he has helped develop their conservation strategy for software-based art and works on the acquisition of a wide range of time-based media artworks. His current research focus is the preservation of artworks which employ real-time 3D software and immersive media (XR) technologies.
You can reach me on:
Twitter:
Email: tomensom [at] gmail.com
Knowledge Base content I'm currently working on:
Unreal Engine 4 related resources
Introduction to Real-Time 3D
Introduction to Virtual Reality
Rasa Bocyte (she/her) is a researcher at the Netherlands Institute for Sound and Vision. Her expertise lies in the areas of born-digital media preservation, access and reuse of heritage collections, and value chains in the creative and cultural industries. In her current role, she leads the development and execution of European research and innovation projects. Her background is in Archival and Information Studies and Art History.
Born Lithuanian, Mancunian at heart, resident of the Netherlands.
You can reach me on:
Twitter: https://twitter.com/rasa_bocyte
Email: rbocyte [at] beeldengeluid.nl
The Preserving Immersive Media Group (PIMG) runs occasional workshops, webinars and other events - find links to recordings of presentations, slides and other materials relating to them here. Event recordings from PIMG events can be found on the PIMG YouTube channel.
A series of collaborative events held during 2023-24, focused on knowledge building activities around particular topics in immersive media preservation. Sessions invite participants to share experiences and collectively work towards building knowledge in this field, with a focus on possible contributions to the knowledge base.
The first took place on Tuesday 12th December 2023 and collaborative notes from this session can be found here: https://docs.google.com/document/d/1EeVqwLUY0xnCaV7aICbW8ZOMPLl8jrqd18obppoa83E/edit?usp=sharing
Recordings of presentations are available online via a YouTube playlist: https://www.youtube.com/watch?v=K5ufLcoGMJg&list=PLQvZVm5rUUpgTLRINBfKvCxWLFkwlW2S-
Recordings of presentations are available online via a YouTube playlist: https://www.youtube.com/watch?v=x_hz7Hg3hKs&list=PLQvZVm5rUUphocASxU0LFcOVT6aj5Kch3
The iPRES 2019 hackathon was an effort to better understand the variability of virtual reality artworks, e.g. what differs between artworks depending on the choices made by artists, and what happens when we make changes to elements of the work.
A full abstract can be found on the iPRES website: https://ipres2019.org/program/conference-programme/?session=117
Detailed notes from the event can be found on the collaborative notepad hosted by Rhizome: https://notepad.rhizome.org/ipres2019-vr
The preservation of immersive media is an emerging topic in digital preservation and presents many new challenges and opportunities for research. This page is a place for tracking questions and prompts that we have arrived at through research, discussion and daydreaming. Do you think you can contribute an answer to any of these, or do you have a question of your own? We'd love your contributions!
What are the most effective approaches to consistently capturing video documentation of VR experiences? Including field-of-view capture, video etc.
What does the process of adding OpenXR support to an existing VR experience look like? Are there any changes that result and how might these be managed?
Diagram and technical specs for underlying technology of headsets
Which 3D data file formats are most suitable/sustainable for preservation purposes? Are there any which are not?
Gap analysis of preservation tools - what is available, what could be developed?
Calibration tools: comparison with game brightness etc.
Capture tools
File format identification: do existing tools support 3D objects and 3D software?
A complexity matrix: a tool to enable advocacy for institutional resources
How can we navigate the relationship between the artwork and the technology, to what extent is the experience determined by an individual’s engagement with the peripheral devices, and to what extent might this be understood and managed over time?
What non-technical frameworks exist to help us understand user experience? e.g. oral histories, historical context of technologies
What combination of technical tools and non-technical frameworks might we employ to try to create a sustainable catalogue of user experience, and how might this feed into the historical records of an artwork?
If an artwork is so complex as to become unsustainable, how might we interpret its documentation for a new audience?
What are the most appropriate metadata standards and how can they be applied?
How should we run/support the knowledge base in terms of encouraging contributions? How might this play out in the long-term?
How do we identify and deal with broken links in the future? GitBook doesn't offer this service, but could we use the GitHub repo (e.g. )?
Eric Kaltman is an Assistant Professor of Computer Science at California State University Channel Islands. He is the founder of the Software History Futures and Technologies (SHFT) research group and investigates methodologies for the examination of historical software systems. Previously, he worked with software preservation and software collections at Stanford University, Carnegie Mellon University and the University of California Santa Cruz.
Samantha Rowe is the Digital Archivist and Research Associate of the Wildenstein Plattner Institute, Inc. (WPI). She prepares, describes, arranges, and conducts item-level processing of digital archival materials using the WPI’s relational database to provide free public access to over 50,000 significant art historical resources and research materials on its growing digital platform. She holds an M.S.L.I.S. from Long Island University and M.A. in History of Art from the Institute of Fine Arts, New York University.
E-mail: samantha.h.rowe [at] gmail.com
sasha arden spent their fourth year internship working with the Museum of Modern Art and Tate media conservation teams. Their focus in Preserving Immersive Media is on documentation methodologies and techniques. They graduated from NYU's conservation program in time-based media in 2022.
Having been involved in arts production, installation, and management throughout their career, sasha embraces the long-term thinking and development of appropriate stewardship practices in conservation while being informed by practical experience. Their ongoing research examines the intersection of technical capabilities and the philosophical and ethical questions arising through the conservation process, advocating for a holistic approach to the integrity of cultural assets.
Email: sasha.llyn.arden [at] gmail [dot] com
All content hosted on the Knowledge Base site is shared under an Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license, unless otherwise stated. A human-readable summary of the full license is provided below. Additional licences may apply to non-hosted content (e.g. links out or embedded media).
You are free to:
Share — copy and redistribute the material in any medium or format
Adapt — remix, transform, and build upon the material for any purpose, even commercially
The licensor cannot revoke these freedoms as long as you follow the license terms.
Under the following terms:
Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
ShareAlike — If you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original.
No additional restrictions — You may not apply legal terms or technological measures that legally restrict others from doing anything the license permits.
This section consists of an overview of key immersive media technologies, links to useful resources elsewhere on the internet and general notes that might be useful to others.
Immersive media is a term used to describe a set of related technologies that aim to extend our physical reality in various ways. For Virtual Reality (VR), this means immersing a user in a virtual environment. For Augmented Reality (AR), this means integrating virtual elements with a physical environment. Together, VR, AR and other related terms, are sometimes referred to using the umbrella term XR.
This section of the Knowledge Base presents a concise overview of the key technologies you are likely to encounter when working with immersive media and links out to other useful resources. XR content such as 360 video or real-time 3D software is the custom content that an individual or organisation might be interested in acquiring and preserving.
Jesse de Vos worked as a project manager and researcher at the Netherlands Institute for Sound and Vision. His research activities related to web archiving, game archiving and complex media preservation.
Image by Ziko van Dijk - Own work, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=37917581
This site is a work in progress! Articles may be empty or incomplete, and might move around.
Can you share your knowledge with the community and help us add to and improve this site? We would welcome your contributions! Read our guide to contributing to get started.
The Preserving Immersive Media Knowledge Base is a resource created to help share information between members of the digital preservation community who are caring for virtual reality (VR), augmented reality (AR), mixed reality (MR), 360 video, real-time 3D software and other similar materials. This site was born out of Tate's Preserving Immersive Media Project with funding from the Netherlands Institute for Sound & Vision. Since this project has its roots in museum practice, there are many references to artworks and elements of conservation practice; however, we invite collaboration from all related practices!
The Knowledge Base is designed as a flexible and collaborative space for sharing information and materials that may be work-in-progress or based on best available knowledge. As such, it is constantly evolving and pages can never be considered final or authoritative! If you are new here, these pages might be useful places to start:
Glossary of frequently encountered immersive media and preservation terminology.
Open questions that the community is trying to answer.
Guide to contributing to the Knowledge Base.
Our under construction resource for understanding immersive media.
Recordings and notes from Preserving Immersive Media Group events.
For a longer read, the white paper published as an output of Tate's immersive media research is a useful place to start learning about VR and the preservation challenges it poses.
This section describes the components of XR systems.
There is a range of XR systems, from a simple and relatively inexpensive mobile phone used with a lensed viewing device (e.g. Google Cardboard) to complex integrated software/hardware systems with tracking features meant for full-body interactivity (e.g. HTC Vive).
The simplest type of XR system uses a handheld mobile phone mount with lenses that help to focus on the screen content and create the illusion of an immersive view. The next step in features and complexity involves a head-mounted display (HMD) that is hands-free and provides a better illusion of immersion. Further interactivity and immersion are achieved with tracking systems, which provide input for the XR content and/or information about the user's position in space. Tracking systems can be built into the HMD (usually 3DOF), or hardware such as handheld controllers or spatial tracking systems can be used along with an HMD (usually 6DOF).
XR systems can also be divided into standalone and tethered types. Standalone HMDs offer greater freedom of motion and are less expensive overall, while tethered systems use a connected computer to offer better performance in graphics and location processing. Tracking is included in both standalone and tethered systems, but the degree and accuracy of motion/position information varies.
Some examples of the main types of XR systems are:
Handheld type
Google Cardboard
Samsung Gear
Standalone type
Oculus Quest
HTC Vive Focus
Tethered type
Oculus Rift
HTC Vive Pro
The commercial marketplace has strongly influenced development of XR systems since the early 2010s, when a "second wave" of immersive technologies started. Most companies produce tiered feature sets at varying price points. Releases of product models might follow incremental improvements in technology, resulting in very similar systems with only one or two differences between them -- or releases could represent a big jump in technology or hardware design from year to year. It can be challenging to identify defining features and technical specifications for a given XR system because of advertising styles (many hard-to-prove claims are made) and the proprietary, competitive nature of development.
The table below is meant to gather information for the purpose of identifying the features of popular XR systems. It could be useful for considering compatibility in the case of a potential hardware replacement or migration. For example if the XR content uses 6DOF interactivity, an XR system that is only capable of supporting 3DOF would not be suitable.
Jack McConchie is a time-based media conservator at Tate. He has worked across loan, acquisitions and exhibition program areas including major retrospectives at Tate Modern such as Nam June Paik and Bruce Nauman. He has researched into the lives of complex artworks in the museum as part of the “Reshaping the Collectible” research project at Tate. Building on software-based artwork and video preservation strategies, he has recently co-authored a white paper that reports on Tate’s “Preserving Immersive Media” research project. Previously, Jack studied music and electronics at Glasgow university, worked as a musician and audio engineer, and collaborated on the design and fabrication of bespoke systems and components for artists working with time-based media.
OpenXR is an open, royalty-free standard for APIs that provide XR applications with access to XR platforms and devices. It is implemented in the XR runtime software supplied by the manufacturer of XR hardware. Application support for OpenXR is potentially useful for preservation purposes, as it is an open standard, which may make it easier to keep software usable as XR hardware and runtimes change.
OpenXR is developed by a working group managed by the Khronos Group consortium, who describe it as follows:
OpenXR is an API (Application Programming Interface) for XR applications. XR refers to a continuum of real-and-virtual combined environments generated by computers through human-machine interaction and is inclusive of the technologies associated with virtual reality (VR), augmented reality (AR) and mixed reality (MR). OpenXR is the interface between an application and an in-process or out-of-process "XR runtime system", or just "runtime" hereafter. The runtime may handle such functionality as frame composition, peripheral management, and raw tracking information.
Optionally, a runtime may support device layer plugins which allow access to a variety of hardware across a commonly defined interface.
— https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html
Up until the arrival of OpenXR, support for each manufacturer's API had to be built into an XR application for it to be usable with that hardware. OpenXR attempts to solve the problem of compatibility between XR applications and XR hardware. Image source: https://www.khronos.org/openxr/.
In order to make use of OpenXR, you need to:
Develop software which supports it — see Engine Implementations below.
Make use of an XR platform which supports it — see XR Runtime Implementations below.
OpenXR is steadily being adopted by XR platforms, as they build support into their XR runtime software. The table below lists the XR runtimes which currently support OpenXR.
XR Runtime | Versions Supporting OpenXR | Platform |
---|---|---|
Oculus | v19+ |  |
SteamVR |  |  |
Engine | Versions Supporting OpenXR | Supported Runtimes |
---|---|---|
Unreal Engine 4 | 4.27 (via plugin); 4.23-4.26 (via beta plugin) | Windows Mixed Reality; Oculus (via Oculus OpenXR plugin); SteamVR (via SteamVR Beta opt-in) |
Unity | 2020.2+ (via plugin) | Windows Mixed Reality; HoloLens 2 |
Thank you to all those who have generously given their time and knowledge to help shape the Knowledge Base!
Claudia Roeck
WinZs
Due to the increased complexity and number of variables in 360 video, there is a need for metadata standards to accommodate this. Google has a suite of standards and metadata injection tools on GitHub under the title of "spatial media" here.
In order to upload 360 video to YouTube and have it recognised and played back properly, it is required that spatial metadata be injected using the Google tool. This supports MP4 and WebM (video only) containers. You can find specification information on GitHub for the supported elements for video and audio.
It supports:
Projection type
Stereoscopic mode
Projection pose (yaw, pitch, roll) - presumably for the initial orientation?
[Add audio]
Spatial Media Metadata Injector is open-source software that can be used to take standard videos (equirectangular, for example) and inject 360° / stereoscopic 3D (top/bottom) and spatial audio (ambiX / SN3D) metadata.
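As a rough illustration, the injector can also be driven from the command line. The sketch below assumes Google's spatialmedia Python module (from the spatial-media GitHub repository) is available in the working directory; the file names are illustrative, and the flags should be checked against the README for your version.

```python
# Sketch: inject spherical (360) metadata with Google's Spatial Media
# tools. Assumes the spatialmedia module from the spatial-media repo is
# present; file names are illustrative.
import subprocess

def inject_spherical_metadata(src: str, dst: str, stereo: str | None = None) -> None:
    """Write a copy of src with spherical metadata injected."""
    cmd = ["python", "spatialmedia", "-i"]
    if stereo:  # e.g. "top-bottom" for stereoscopic content
        cmd.append(f"--stereo={stereo}")
    cmd += [src, dst]
    subprocess.run(cmd, check=True)

inject_spherical_metadata("input.mp4", "input_injected.mp4", stereo="top-bottom")
```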
Game engines are software development tools for creating interactive software. They package together libraries and tooling which simplify development. Game engines are widely used in the creation of real-time 3D VR software, and many engines support VR production workflows out-of-the-box.
A modern game engine will typically include:
A 3D or 2D renderer, which supports the rendering of a moving sequence in real-time.
Physics simulation.
Asset import and management.
Scripting and programming tools to support dynamic, simulated and interactive elements.
Sound processing.
Extension through plugins and/or custom code.
There are numerous engines in use today. For real-time 3D rendering applications such as VR, Unity and Unreal Engine are currently the two most popular. Both are free to download and use non-commercially, which has contributed to their popularity.
An important implication of using an engine is that much programming has already happened before work on a project begins. The engine provides a toolset that can be used to realise the project and generate executable software, but as a creator you do not necessarily have full control or authorship of the code.
Adrian Courrèges (2020) Graphics Studies Compilation. URL: http://www.adriancourreges.com/blog/2020/12/29/graphics-studies-compilation/.
baldurk (n.d.). Graphics in Plain Language: An introduction to how modern graphics work in video games. URL: https://renderdoc.org/blog/Graphics-in-Plain-Language/Part-1.html.
Brown University VR Software Wiki. URL: https://sites.google.com/view/brown-vr-sw-review-2018/home
By using this site, you agree to abide by the code of conduct outlined on this page.
The first version of this Code of Conduct is based on that used by the . Future updates or alterations to the Code of Conduct should be noted here along with the date and actioner.
The Preserving Immersive Media Group is dedicated to providing a harassment-free environment for everyone. We do not tolerate harassment of participants in any form.
This code of conduct applies to all (online and offline) Preserving Immersive Media Group spaces, including mailing lists, online meetings, in-person meetings, the GitBook space and the GitHub repository. Anyone who violates this code of conduct may be sanctioned or expelled from these spaces at the discretion of the response team.
Some Preserving Immersive Media Group spaces may have additional rules in place, which will be made clearly available to participants. Participants are responsible for knowing and abiding by these rules.
Harassment includes:
Offensive comments related to gender, gender identity and expression, sexual orientation, disability, mental illness, neuro(a)typicality, physical appearance, body size, age, race, or religion.
Unwelcome comments regarding a person’s lifestyle choices and practices, including those related to food, health, parenting, drugs, and employment.
Deliberate misgendering or use of ‘dead’ or rejected names.
Gratuitous or off-topic sexual images or behaviour in spaces where they’re not appropriate.
Physical contact and simulated physical contact (eg, textual descriptions like “hug” or “backrub”) without consent or after a request to stop.
Threats of violence.
Incitement of violence towards any individual, including encouraging a person to commit suicide or to engage in self-harm.
Deliberate intimidation.
Stalking or following.
Harassing photography or recording, including logging online activity for harassment purposes.
Sustained disruption of discussion.
Unwelcome sexual attention.
Pattern of inappropriate social contact, such as requesting/assuming inappropriate levels of intimacy with others
Continued one-on-one communication after requests to cease.
Deliberate “outing” of any aspect of a person’s identity without their consent except as necessary to protect vulnerable people from intentional abuse.
Publication of non-harassing private communication.
The Preserving Immersive Media Group prioritizes marginalized people’s safety over privileged people’s comfort. The response team reserves the right not to act on complaints regarding:
‘Reverse’ -isms, including ‘reverse racism,’ ‘reverse sexism,’ and ‘cisphobia’
Reasonable communication of boundaries, such as “leave me alone,” “go away,” or “I’m not discussing this with you.”
Communicating in a ‘tone’ you don’t find congenial
Criticizing racist, sexist, cissexist, or otherwise oppressive behavior or assumptions
This code of conduct applies to Preserving Immersive Media Group spaces, but if you are being harassed by a member of Preserving Immersive Media Group outside our spaces, we still want to know about it. We will take all good-faith reports of harassment by Preserving Immersive Media Group members seriously. This includes harassment outside our spaces and harassment that took place at any point in time. The response team reserves the right to exclude people from Preserving Immersive Media Group based on their past behavior, including behavior outside Preserving Immersive Media Group spaces and behavior towards people who are not in Preserving Immersive Media Group.
In order to protect volunteers from abuse and burnout, we reserve the right to reject any report we believe to have been made in bad faith. Reports intended to silence legitimate criticism may be deleted without response.
We will respect confidentiality requests for the purpose of protecting victims of abuse. At our discretion, we may publicly name a person about whom we’ve received harassment complaints, or privately warn third parties about them, if we believe that doing so will increase the safety of Preserving Immersive Media Group members or the general public. We will not name harassment victims without their affirmative consent.
Participants asked to stop any harassing behavior are expected to comply immediately.
If a participant engages in harassing behavior, the response team may take any action they deem appropriate, up to and including expulsion from all Preserving Immersive Media Group spaces and identification of the participant as a harasser to other Preserving Immersive Media Group members or the general public.
If you are being harassed by a member of Preserving Immersive Media Group, notice that someone else is being harassed, or have any other concerns, please contact a member of the response team. If the person who is harassing you is on the team, they will recuse themselves from handling your incident. We will respond as promptly as we can.
Rasa Bocyte: rbocyte [at] beeldengeluid.nl
Tom Ensom: tom.ensom [at] tate.org.uk
Jack McConchie: jack.mcconchie [at] tate.org.uk
This page is designed to help you understand what Unity is and how it can be used by creators.
The Unity Engine was initially released in 2005 with the aim of democratizing the game development industry. Its goal was to provide professional but affordable game development tools aimed mainly at amateur game developers. Unity can create both 2D and 3D applications. The engine typically has a major version update every year, with multiple smaller updates throughout the year. Unity is free for developers that bring in under $100,000 in revenue in a 12-month period; after that, there are two different pricing models. Though originally designed to build games, today Unity is used across multiple industries including automotive, architecture, engineering, construction, film, education, and retail.
When a new project is created, Unity creates several folders as a part of the project. Over the years and versions of Unity, this default folder structure has changed somewhat, but it now generally contains the following folders:
Assets
This is the folder where all game resources are stored, including scripts, textures, sound, and custom editor scripts. The organization of the folder can vary greatly from one project to another and from one organization to another. There can be numerous subfolders in the Assets folder, depending on how the project is organized. For example, there may be a folder for scenes, one for scripts, one for audio, or one for sprites. There is no limit to how deep the organization can be.
As a best practice, subfolders in the Assets folder should be created within Unity and not in the computer's local file system. Likewise, assets should be added directly to folders in the Unity Editor and not via the operating system's file explorer. This is particularly important due to the way Unity constructs metadata for assets.
Unity reserves several special folder names under the asset folder. These folders are Editor, Editor Default Resources, Gizmos, Resources, Standard Assets, and Streaming Assets. Not every project will have all of these folders.
Assets/Editor — This folder is for custom editor scripts that extend the functionality of the Unity editor. These scripts will run in the editor but not in the project at runtime. Multiple Editor folders can exist in the Assets Folder. The execution of the editor scripts varies depending on where in the folder structure the editor file exists.
Assets/Editor Default Resources — This is where asset files used by Editor scripts are stored. There can only be one such folder and it must be placed in the Assets folder root. There can be subfolders in this folder.
Assets/Gizmos — Gizmos are graphics in the scene view which can help to visualize design details. This folder stores images used for gizmos. There can only be one such folder and it must be placed in the Assets folder root. Examples of Unity's built-in gizmos include the marker for the main camera's position and rotation and the marker for a light source.
Assets/Resources — This folder stores resources so that they can be loaded on demand in a Unity project. There can be multiple resource folders. On demand loading is helpful for dynamically loading game objects that don’t have instances created by designers during design time. In other words, these resources may not have corresponding game objects placed in the scene at design time and can be loaded dynamically at run time.
Assets/Standard Assets — This folder stores any standard asset packages that have been imported into a project. There can be only one standard assets folder. Standard assets are free assets maintained by Unity.
Assets/Streaming Assets — This folder is for assets that will remain in their original format and later be streamed into the Unity application, instead of being directly incorporated into the project's build. An example would be a video file from the filesystem. There can only be one streaming assets folder in the project.
Library
Moving on from the Assets folder, the next folder is Library. This is a local cache used by Unity for imported assets; it saves Unity from re-importing assets each time the project is opened. It can be deleted and will be regenerated by Unity automatically: all that is needed to recreate the folder are the source assets and .meta files. If this folder is deleted, Unity will reimport all asset information and regenerate the folder the next time the project is opened in the editor. This folder should not be included in version control.
Of special note in the Library folder is the package cache folder. This contains information about all the packages installed with the current project. Though this can be regenerated by Unity like the other items in the Library folder, for archival purposes it is important that this folder not be deleted. This is because it may be helpful to be able to see what packages are included in the project without having to regenerate the cache, which would require opening the project in the appropriate editor version.
Packages
This folder contains the manifest file, in JSON format, used to maintain the dependency links between packages. The manifest file is used to regenerate the package cache in the Library folder. It also contains a file listing the individual packages installed with the project. These are used by the Unity Package Manager. The package manager was added in Unity 2018.1; prior versions of Unity will not contain the package manager, and the Packages folder will not exist in those cases.
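Because the manifest is plain JSON, a package inventory can be produced without opening the project in Unity at all. A minimal sketch, assuming a Unity 2018.1+ project at a hypothetical path:

```python
# Sketch: list a Unity project's declared package dependencies by
# reading Packages/manifest.json (no Unity Editor required).
import json
from pathlib import Path

project_root = Path("MyUnityProject")  # hypothetical project folder
manifest_path = project_root / "Packages" / "manifest.json"
manifest = json.loads(manifest_path.read_text(encoding="utf-8"))

# The "dependencies" object maps package names to version strings.
for name, version in sorted(manifest.get("dependencies", {}).items()):
    print(f"{name}: {version}")
```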
Project Settings
This folder contains all project settings from the project settings menu. It also includes the editor version number, build settings and a slew of other settings used by Unity systems. The editor version number, as a standalone file, was not added until Unity 5; for any version before that, the editor version number can be found in the project settings file.
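For identification purposes, the editor version can be read directly from ProjectSettings/ProjectVersion.txt in Unity 5+ projects. A minimal sketch, with an illustrative path:

```python
# Sketch: recover the Unity Editor version that last saved a project
# from the m_EditorVersion entry in ProjectSettings/ProjectVersion.txt.
from pathlib import Path

version_file = Path("MyUnityProject/ProjectSettings/ProjectVersion.txt")
for line in version_file.read_text(encoding="utf-8").splitlines():
    if line.startswith("m_EditorVersion:"):
        print(line.split(":", 1)[1].strip())  # e.g. 2020.3.30f1
```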
Name | Platforms | Download |
---|---|---|
 | Windows |  |
 | Windows; MacOS (Intel & M1); Linux (Ubuntu and CentOS) |  |
 | Windows; Linux (from source) | Binaries distributed through Epic Game Launcher. Source code available on restricted access GitHub repository. |
 | Android; Linux; MacOS; Windows; WebEditor | Binaries and source code available on GitHub and in download section |
360° video can be either monoscopic or stereoscopic. Monoscopic video supports what is perceived as a 2D representation of the scene i.e. there is no perception of depth. Stereoscopic video supports a 3D representation of the scene with a perception of depth.
Monoscopic 360° video contains video captured from a single viewpoint within the scene.
Stereoscopic 360° video contains video captured from two viewpoints within the scene, providing a different point of view for each eye. The two views are packed into a single video file, arranged either side-by-side or top-bottom.
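To make the packing concrete, the sketch below shows where each eye's view sits in a top-bottom packed frame. Which eye occupies the top half varies by convention and should be confirmed against the file's metadata; left-on-top is assumed here purely for illustration.

```python
# Sketch: locate each eye's sub-image in a top-bottom stereoscopic frame.
def eye_rect(width: int, height: int, eye: str) -> tuple[int, int, int, int]:
    """Return (x, y, w, h) of one eye's view; assumes left eye on top."""
    half = height // 2
    return (0, 0, width, half) if eye == "left" else (0, half, width, half)

print(eye_rect(3840, 3840, "left"))   # (0, 0, 3840, 1920)
print(eye_rect(3840, 3840, "right"))  # (0, 1920, 3840, 1920)
```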
Codecs: h.264/AVC, h.265/HEVC, VP9
Wrappers: MP4, Matroska, WebM
Significant variables in choice of video file format:
Achievable bitrate / compression
Metadata container options
...?
Projection format refers to the way in which data representing a 360° or spherical field of view is mapped to a flat image when it is encoded. It is similar to the way in which a map of Earth is a flat representation of the spherical surface of the planet (a minimal mapping sketch follows the list of formats below).
Some common projection formats include:
Equirectangular
Cubemap
Equi-angular cubemap
Pyramid
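To illustrate the idea for the equirectangular case, the sketch below maps a unit view direction to normalised image coordinates. The axis convention (y up, -z forward) is an assumption made for this example; real tools differ in their conventions.

```python
# Sketch: textbook equirectangular mapping from view direction to
# normalised (u, v) image coordinates.
import math

def direction_to_equirect_uv(x: float, y: float, z: float) -> tuple[float, float]:
    """Map a unit direction (y up, -z forward) to equirectangular (u, v) in [0, 1]."""
    yaw = math.atan2(x, -z)                     # longitude, -pi..pi
    pitch = math.asin(max(-1.0, min(1.0, y)))   # latitude, -pi/2..pi/2
    u = (yaw + math.pi) / (2 * math.pi)         # 0 = behind, 0.5 = ahead
    v = (math.pi / 2 - pitch) / math.pi         # 0 = zenith, 1 = nadir
    return u, v

print(direction_to_equirect_uv(0.0, 0.0, -1.0))  # straight ahead -> (0.5, 0.5)
```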
Significant variables in choice of projection format:
Pixel density
Tool support (encoding, decoding)
Requirements of video streaming platforms (e.g. YouTube)
...?
3DOF
6DOF
Outside-in
Inside-out
Can you help us write this page? We're looking for people who've worked with Augmented Reality materials/experiences to help us write an introduction for someone new to the topic.
A head-mounted display (HMD) is a display device, worn on the head or as part of a helmet, that mounts either a single screen in front of both of the user's eyes (monocular HMD) or a separate screen in front of each eye (binocular HMD). An extensive summary of the properties of VR HMDs is available on the wiki or on . HMDs are generally either "tethered" (such as Oculus Rift) to a PC that monitors tracking and undertakes the rendering, or "untethered" (such as Oculus Quest) where the unit is standalone.
This page is designed to help you understand what Godot is and how it can be used by creators.
Godot Engine is a free open source 3D game engine for Android, Linux, Mac and Windows. A WebEditor is also available.
Some information relevant to XR preservation, quoted from the features list:
XR support: Godot makes cross-platform Augmented and Virtual Reality development easy.
Works with many headsets including the Meta Quest, Valve Index, HTC Vive, Oculus Rift, all Microsoft MR headsets, and many more.
Support for OpenXR, an emerging open standard that most hardware vendors are moving to.
Plugins give access to various proprietary SDKs, such as OpenVR (SteamVR) and the legacy Oculus SDKs.
WebXR can deliver AR/VR experiences via a web browser.
ARKit allows creating AR apps for iOS.
Multi-platform editor: Create games on any desktop OS and Android.
Works on Windows, macOS, Linux, *BSD and Android (experimental). The editor runs in 32-bit and 64-bit on all platforms, with native Apple Silicon support.
Small download (around 35 MB), and you are ready to go.
Easy to compile on any platform (no dependency hell).
Multi-platform deploy: Deploy games everywhere!
Export to desktop platforms: Windows, macOS, Linux, UWP, and *BSD.
Export to mobile platforms: iOS and Android.
Consoles: Nintendo Switch, PlayStation 4, Xbox One via third-party providers (read more).
Export to the web using HTML5 and WebAssembly.
One-click deploy & export to most platforms. Easily create custom builds as well.
This is an evolving checklist of things to consider when bringing XR materials, experience and artworks into a collection. This is a complex process and involves navigating many uncertainties regarding the future of the technologies involved. While not exhaustive, this checklist is designed to assist navigating this process by highlighting key activities and questions to consider.
For all XR materials:
Work closely with the creator and/or their team to understand the components of the XR experience and document what is learnt during this process — see Documentation Templates.
Understand the way in which the XR experience has been presented in the past and gather any relevant documentation to support transmitting this understanding.
Assess how immediate the need for access to display equipment is and source anything missing. This might include:
XR equipment (e.g. HMD, tracking system...) — at least one set.
Computer system suitable for running the software and hardware.
Additional screen(s) for setup and monitoring.
If any computers are being supplied ensure you have:
Created disk images of any internal storage media.
Created a backup machine (if viewed as useful for display).
Tested them to ensure they function.
Documented their hardware components.
Extracted any relevant components for archiving (e.g. the XR runtime).
If any real-time 3D software is being supplied ensure you have:
Received executable software (ideally supporting as many operating systems and XR runtimes as possible).
Identified, gathered and tested dependencies required to access the executable software (e.g. operating systems, libraries, XR runtime, drivers).
Received source materials (be it code or an engine project) required to build the project — see Software Archiving Guides.
Identified, gathered and tested dependencies required to access source materials.
If any 360 video is being supplied ensure you have:
HMD Name | Supported Runtimes |
---|---|
Oculus Rift CV1 | Oculus Runtime; SteamVR |
HTC Vive | SteamVR; Oculus Runtime (via ) |
An XR runtime is a software package which provides XR applications with access to XR platforms and devices. It may also implement functionality such as frame composition, peripheral management and tracking. Examples include the Oculus runtime and the SteamVR runtime. An XR runtime can provide XR applications with a variety of interfaces through which to interact with it, which might conform to a standard such as OpenXR or OpenVR.
Runtime Name | OpenXR Support | OS Support | Download Packages |
---|---|---|---|
Oculus Runtime | Yes | Windows 10 | Current version only downloadable through the Oculus client. Legacy versions available at https://developer.oculus.com/downloads/package/oculus-runtime-for-windows/ |
SteamVR | Yes | Windows 7 (SP1); Windows 8.1; Windows 10 | Can only be downloaded through Steam (see Archiving XR Runtimes). |
Monado | Yes | Linux | Source code available from GitLab; build packages available for Debian and Ubuntu. |
360° video is a video format in which every direction of view is available to the viewer. Though the direction of view is free, in most circumstances the viewing position in space is either fixed or on a predetermined “on rails” path.
360° video can be created in a number of different ways:
Captured by a camera or array of camera lenses;
Generated as an export from 3D rendering software (e.g. Blender);
Generated from a real-time 3D engine. The 360° video exported from a game engine can be the artistic end product (see for instance "" by Studer / van den Berg) or the documentation of a real-time 3D artwork.
Resources created to help you effectively gather together materials that will support the preservation of an immersive media experience.
Immersive media can make use of complex systems of interconnected software. To support the preservation of immersive media, it is useful to gather and individually archive these components so that they can be preserved independently of any specific installation. Archiving is described here only to the extent of bit-level preservation. A deeper look at preservation strategies can be found in .
360-degree video has specialised transcoding requirements if properties such as projection format and stereoscopy are to be properly managed. At this point the tools listed here have not been tested by us, and inclusion should not be taken as a recommendation.
List of tools:
ffmpeg,
Headjack VRencoder,
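As an untested sketch of what such a transcode might look like, the snippet below drives ffmpeg (which includes a v360 filter for remapping projections) from Python. The filter arguments and encoding settings are illustrative assumptions rather than recommendations, and spherical metadata would still need to be re-injected afterwards.

```python
# Sketch: remap an equirectangular 360 video to a 3x2 cubemap layout
# using ffmpeg's v360 filter. Assumes ffmpeg is on PATH.
import subprocess

src, dst = "input_equirect.mp4", "output_cubemap.mp4"
subprocess.run([
    "ffmpeg", "-i", src,
    "-vf", "v360=equirect:c3x2",      # input projection : output projection
    "-c:v", "libx264", "-crf", "18",  # quality-targeted H.264 encode
    "-c:a", "copy",                   # pass audio through untouched
    dst,
], check=True)
```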
Acquisition Information Templates
These documents were designed to guide information gathering and discussion during the early stages of the acquisition of virtual reality (VR) or augmented reality (AR) artworks, primarily with conservation and long-term preservation in mind. They are designed to be completed by or in close collaboration with an artist prior to receiving media from the artist.
(March 2019) developed by Jack McConchie and Tom Ensom during Tate's Preserving Immersive Media project:
(May 2019) developed as an extension of this template by Savannah Campbell and Mark Hellar:
The infographic provides artists and makers with information about the importance of documenting their work. It answers questions about why it is important for artists and makers to document their work and how they can document their own work.
"Augmenting Our Approach to Preservation: Documentation of Experience for Immersive Media" written by during their 2021–22 graduate internship at Tate.
This page describes the process for extracting an XR runtime as a contained unit of software, so that it can be archived independently of a computer system.
XR runtimes are often distributed using front-end tools that carry out downloading and installation in the background (e.g. SteamVR is downloaded through Steam). This makes them harder to extract and archive for reuse in the future. The sections below describe the process of extracting different XR runtimes for independent archiving.
Oculus Runtime 0.8.0 (dating from 2017) is currently available to download from the Oculus website: https://developer.oculus.com/downloads/package/oculus-runtime-for-windows/.
For later versions you are dependent on the Oculus Rift software to install and manage the runtime. After installation, the runtime can be found in C:\Program Files\Oculus. It has yet to be tested whether this directory can be copied to a new machine, but this seems unlikely to work.
These instructions are adapted from Valve's guidelines for offline installation of SteamVR. While this method allows you to extract a standalone copy of the SteamVR runtime, the version available through Steam cannot be controlled.
1. Either a) Install and open the Steam Client on a PC with full internet access or b) Access a computer which already has the appropriate version of Steam installed.
2. In the Steam Client, open the Library section and find the part of it labeled "Tools".
3. Find the entry "SteamVR" and install it.
4. Right-click on the entry "SteamVR" and in the resulting popup menu click on the entry "Properties".
5. A new window with multiple tabs will open. Select the tab "LOCAL FILES" and click on the button labeled "BROWSE LOCAL FILES".
6. The directory containing the SteamVR Runtime will open. Copying this entire directory will encapsulate the files required to run SteamVR on another computer. From this directory, SteamVR can be launched by running the "vrstartup.exe" executable file in "\SteamVR\bin\win64".
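To support bit-level preservation, you may then wish to record checksums for the copied directory. A minimal Python sketch, assuming example paths (adjust for your system):

```python
# Sketch: copy an extracted SteamVR runtime directory and write a
# SHA-256 manifest for bit-level preservation. All paths are examples;
# the destination parent folder must already exist.
import hashlib
import shutil
from pathlib import Path

def archive_directory(src: Path, dst: Path, manifest: Path) -> None:
    shutil.copytree(src, dst)  # copy the directory tree as-is
    with manifest.open("w", encoding="utf-8") as out:
        for f in sorted(p for p in dst.rglob("*") if p.is_file()):
            # Reads each file whole into memory; fine for a sketch.
            digest = hashlib.sha256(f.read_bytes()).hexdigest()
            out.write(f"{digest}  {f.relative_to(dst)}\n")

archive_directory(
    Path(r"C:\Program Files (x86)\Steam\steamapps\common\SteamVR"),
    Path(r"D:\archive\SteamVR"),
    Path(r"D:\archive\SteamVR.sha256"),
)
```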
This page is designed to help you understand what Unreal Engine 4 is and how it can be used by creators.
Unreal Engine 4 (UE4) is a real-time 3D engine developed by Epic Games. The first version of Unreal Engine was created during the development of the 1998 game Unreal, at which point Epic Games started licensing the engine to other developers. Version 4.0 was released in 2014 and has been followed by 27 minor versions (the latest is 4.27). Unreal Engine 5, announced in 2020 and released in 2022, is expected to supersede UE4. UE4 is not an open-source engine (see the EULA), but the engine source code is freely available via their GitHub repository.
A UE4 project consists of a collection of files and folders conforming to a well-defined structure. A project folder typically contains the following at the top level (more detail in the UE4 docs):
Binaries: If the project has been compiled for a specific platform, this contains the files produced.
Build: Contains files required to build the project for different platforms.
Config: Contains project settings stored in plain text files with the .ini extension.
Content: Contains the maps, assets (e.g. 3D models, materials, blueprints) and other custom content used by the project, including any third-party packages downloaded from the Unreal Marketplace.
DerivedDataCache and Intermediate: Contain temporary files generated during the build process (e.g. shader compilation).
Saved: Contains saved data created by the editor as it runs, including autosaves, crash reports and logs.
A .uproject file: The project file with which the project can be launched. This is actually a JSON file containing structured information describing the project, including the UE4 version, enabled plugins and target platforms.
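For illustration, a minimal .uproject file might look something like the following; the module and plugin names here are invented for the example:

```json
{
  "FileVersion": 3,
  "EngineAssociation": "4.27",
  "Category": "",
  "Description": "",
  "Modules": [
    {
      "Name": "MyArtworkProject",
      "Type": "Runtime",
      "LoadingPhase": "Default"
    }
  ],
  "Plugins": [
    {
      "Name": "PanoramicCapture",
      "Enabled": true
    }
  ]
}
```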
A UE4 build consists of a set of files and folders that allow the software to be run on a suitable host computer.
Some elements depend on whether the Development or Shipping option is selected prior to build, as well as on other packaging options in UE4.
Has been updated an average of 4 times per year since first release in 2014. Updates can result in deprecation or removal of features, or incompatibility with plugins. Unreal Engine 5 was released in 2022, which may mean an end to UE4 updates.
Free source code access via GitHub repository (note that the Unreal Engine EULA applies).
Royalty payments are only required at high product revenues, so are unlikely to impact the cultural heritage context.
For custom C++ projects, Visual Studio dependencies are difficult to manage from an archival perspective.
Transforms imported assets to the poorly documented internal format (.uasset). More work is required to understand whether this transformation is lossless.
When an asset is imported to Unreal Engine 4 it is converted to the UE4 UASSET (.uasset) format. This format is not well documented, although some partial reverse engineering work exists. A UASSET can be re-exported from the engine in a variety of formats depending on the asset type. To do so, right-click on the asset in the Content Browser and navigate to Asset Actions -> Export.
The information below was derived from testing in Unreal Engine 4.27. Note that this is simply a list of the available export formats; exports have not been tested against the original import formats.
Windows applications created using the UE4 editor have a set of dependencies similar to the editor's. These are automatically packaged with an application when it is built from the editor, and are installed by an installer program called UE4PrereqSetup_x64.exe when the application is run. This can be located in the application directory: [UE4GameRootLocation]\Engine\Extras\Redist\en-us
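A small Python sketch for locating this installer within a packaged build (the build root is an example path):

```python
# Sketch: locate the bundled prerequisites installer in a packaged
# UE4 Windows build. The build root is an example path.
from pathlib import Path

build_root = Path(r"D:\builds\MyArtworkProject")
for exe in build_root.glob("Engine/Extras/Redist/en-us/UE4PrereqSetup_*.exe"):
    print(exe)  # e.g. ...\UE4PrereqSetup_x64.exe
```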
These dependencies are required to run the editor, or applications created using the editor.
These dependencies are required to build software from the editor for particular platforms.
Epic Games, Unreal Engine 4.27 Glossary. URL: https://docs.unrealengine.com/4.27/en-US/Basics/Glossary/
David Lightbown, 2018, Classic Tools Retrospective: Tim Sweeney on the first version of the Unreal Editor. URL: https://web.archive.org/web/20180823012812/https://www.gamasutra.com/blogs/DavidLightbown/20180109/309414/Classic_Tools_Retrospective_Tim_Sweeney_on_the_first_version_of_the_Unreal_Editor.php
This page describes the process for archiving a Unity 5 project and its dependencies on Windows, so that it can be preserved independently of a specific computer system.
Work in progress!
One way of preparing for the preservation of an application made in Unity 5 is to archive the project files associated with it. Executable software is created from a Unity project by exporting an application that supports the target platform. By archiving a Unity project, we aim to gather together all the materials required to repeat this build process. This opens up options for incremental migration to new versions of Unity, and the modification of code to support other hardware and software platforms.
To build a Unity project, you need the following components, guidance on the archiving of which is provided on this page:
The project folder: the collection of custom Unity content and project files;
The Unity Editor: software which allows you to open and edit a Unity project folder;
Dependencies: any additional software not included with the engine binaries or project by default, e.g. libraries, modules.
The Unity Editor software can be installed using the Unity Hub software or from the installers distributed via the Unity Download Archive.
Before proceeding, you will need to identify the version of the Unity Editor the project was created with. To do so, navigate to the ProjectSettings folder within the project folder and open the file named ProjectVersion.txt (tested in 2018.3.9).
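The contents of ProjectVersion.txt are a single key-value entry; for a project made in Unity 2018.3.9 it should read something like the following (the exact build suffix may vary):

```
m_EditorVersion: 2018.3.9f1
```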
1. Install Unity Hub on a suitable computer and navigate to the Installs tab.
2. Click Install Editor and select the appropriate editor version. If the appropriate version is not available, you will need to install it using the Unity Download Archive method.
You may also wish to install additional modules to improve build support.
3. Return to the previous Installs tab, click on the cog icon next to the Editor entry you wish to archive, and click on Show in Explorer.
4. In the window which opens, you will be looking inside the application directory for the Unity Editor version. This folder can be archived and used to access the Editor independently of the Unity Hub installer.
There are two common ways of extending the functionality of Unity which you may find have been used: Modules and Packages.
Modules extend core features, and include options to build for non-Windows platforms. Unfortunately, if any non-standard modules have been used by the project, your only option is to use the Unity Hub application to download and install them.
Packages are handled by the Unity package manager module and are included in the project folder once they have been added.
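Specifically, the project's package dependencies are recorded in Packages/manifest.json inside the project folder, so they travel with an archived project. A trimmed, illustrative example (package versions are placeholders):

```json
{
  "dependencies": {
    "com.unity.textmeshpro": "2.0.1",
    "com.unity.xr.management": "3.2.17"
  }
}
```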
Extending the service life of hardware through repair could support preservation efforts if a particular XR system is required. These hardware systems are meant to be used, and in an exhibition context they can be subjected to serious wear and tear.
The consumer market for VR hardware systems and mobile phones is built on planned obsolescence; products are superseded by new models, manufacturers only provide limited warranty periods for repair or replacement, and parts are not commonly available. Purchasing backup equipment or sourcing equipment to be used for parts may be part of a preservation plan.
iFixit is a US-based company that advocates for repairability of consumer devices. The Preserving Immersive Media Knowledge Base has no affiliation with iFixit. The guides linked here provide step-by-step instructions with photos, as well as tools and in some cases replacement parts. The teardown reviews are another resource should you have equipment that is intended to be used for parts. Popular XR systems and Samsung Gear VR are linked here, but many other device guides can be found on iFixit.
Oculus Devices:
HTC Vive Devices:
Samsung Gear VR:
This page describes the process for archiving an Unreal Engine 4 project and its dependencies, so that it can be preserved independently of a computer system.
One way of preparing for the preservation of software made in Unreal Engine 4 (UE4) is to archive the project files used to create it. While it is the executable form of the software (or 'build') which is used to run it, archiving the source form of the software opens up more preservation options, such as:
Creation of new builds which support different platforms and environments;
Incremental migration to new versions of Unreal Engine 4;
Migration to a new engine should this become necessary.
Additionally, source materials can contain rich technical history and support an understanding of how the software was developed.
As a rule of thumb, you want to have all the materials required to repeat the process of building the software. To build an Unreal Engine 4 project, you need the following components:
The project folder: the collection of custom Unreal Engine 4 content and project files;
The Unreal Engine 4 editor: the software which allows you to open the project;
Dependencies: any additional software not included with the engine binaries by default, e.g. plugins, libraries.
A sensible approach is therefore to archive all these components either independently or in the form of a disk image. Each component type, including how you can locate it, is described in detail below.
An Unreal Engine 4 project folder is a collection of files conforming to a specific directory structure — more information on this format can be found in our introduction to .
To archive this, the supplier will need to send you a copy of this complete directory.
One interesting thing to note is that this can include assets and other materials that are not used by the built application. This can make the project files larger, but also provides historical insight into the way the work was created. If the creator of the project files offers to 'clean them up' before supplying them, you may wish to advise them against it.
Project files can include hundreds or even thousands of files, so as a final step you may wish to ZIP them for convenience and reduced storage size.
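A minimal Python sketch of this packaging step, assuming an example project path:

```python
# Sketch: package a UE4 project folder as a ZIP for storage.
# The project path is an example.
import shutil
from pathlib import Path

project = Path(r"D:\projects\MyArtworkProject")
zip_path = shutil.make_archive(
    base_name=str(project),      # produces MyArtworkProject.zip alongside
    format="zip",
    root_dir=str(project.parent),
    base_dir=project.name,       # keep the top-level folder in the archive
)
print("Created", zip_path)
```

Since the DerivedDataCache and Intermediate folders contain temporary files the engine can regenerate (see the project folder structure described above), you may choose to exclude them before zipping to save space.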
To open an Unreal Engine 4 project you need an appropriate version of the Unreal Engine 4 editor. To avoid errors, you should use the editor version in which the project was originally developed. If you use a newer version, Unreal Engine will present a warning and give you the choice of whether to proceed. There is a chance you can open the project successfully, but doing so may break or change things, so proceed carefully and always use a duplicate copy.
If the project involves a modified version of UE4, you will also need to archive a copy of the engine source code. Archiving the source code of the engine version used is generally useful in any case, as it can hold valuable information for future preservation work.
Sometimes additional dependencies are required to open or build a UE4 project successfully.
Plugins are extensions to Unreal Engine 4's functionality. They can be installed from within the engine or manually. There are two default locations for plugins:
Unreal Engine install location: /[UE4 Root]/Engine/Plugins/[Plugin Name]/
Project folder: /[Project Root]/Plugins/[Plugin Name]/
You need to make sure that any required plugins have been installed and archived with either the project files or engine binaries.
If a UE4 project involves custom C++ code, you will need to install the appropriate version of Microsoft Visual Studio (e.g. UE 4.27 requires Visual Studio 2019 to build such projects for Windows).
Links to initiatives/workgroups, events, and publications on the topic of developing standards for data packaging, file types, metadata, documentation, normalization, migration, storage, and access.
International Image Interoperability Framework (IIIF) Community 3D Interest Group
CS3DP (Community Standards for 3D Data Preservation)
Council on Library and Information Resources (CLIR): 3D/VR in the Academic Library: Emerging Practices and Trends
Developing Library Strategy for 3D and Virtual Reality Collection Development and Reuse (LIB3DVR)
Building for Tomorrow: Collaborative Development of Sustainable Infrastructure for Architectural and Design Documentation
PARTHENOS: “Digital 3D Objects in Art and Humanities: Challenges of Creation, Interoperability and Preservation”
Born to Be 3D: Digital Stewardship of Intrinsic 3D Data (US Library of Congress)
The Institute of Electrical and Electronics Engineers (IEEE) Standards Association: Virtual Reality and Augmented Reality Working Group
Lindlar, M., Panitz, M., and Gadiraju, U. (2015) Ingest and Storage of 3D Objects in a Digital Preservation System.
Software-based tools for access, assessment, and documentation of immersive media.
The software tools listed here support preservation activities for compiled, executable files or project files. Some are proprietary and require a license or developer account, and some are open source. Not all may be currently supported or compatible with your files or hardware.
Tool | Purpose | Description | Link | Platforms | Open source? |
---|---|---|---|---|---|
Video documentation can be used to record aspects of an immersive media (IM) experience. This can take two forms:
Physical capture: Video recording of the real-world physical actions of a user interacting with an IM experience.
Virtual capture: Video recording of the virtual actions of a user interacting with an IM experience, as would be sent to a display device.
Virtual video capture can produce two kinds of video:
Fixed-perspective video: video representing a fixed perspective on the virtual environment; the standard form of video designed for non-interactive viewing.
360° video: video representing a 360-degree view from a central point, therefore allowing a level of interactivity via rotational tracking.
Hardware and software tools can be used to capture video from a real-time 3D application. This can be fixed-perspective or 360 video.
Tool | Description | Capture Formats |
---|---|---|
Tested in version 4.27 of the editor only.
The Unreal Engine 4 editor has built-in tools which allow export of video sequences. The actions of these sequences can be scripted or recorded from user interaction within the editor. Using them therefore requires access to the source project and a level of engagement with the editor software. This may involve modification of the source project, so you may wish to work with a copy or a version control system.
Video export format depends on the options you select in the Render Movie Settings dialogue (info derived from MediaInfo output applied to rendered video):
With 'Use Compression' unchecked: uncompressed RGBA video (8 bits-per-channel) in an AVI (OpenDML extension) container.
With 'Use Compression' checked: MJPG video (YUV 4:2:2, 8 bits-per-channel, interlaced, top field first) in an AVI container.
There are some configurable options, including output video resolution and framerate.
The uncompressed video option yields the best quality output, but the video files produced are very large: in our test, 8 seconds of capture (1080p at 24 fps) yielded a 1.5 GB file. Make sure you have enough storage space to capture in this format if you are going to use this setting; the capture can then be converted to a lossless compressed format like FFV1 for storage.
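As an untested sketch, such a conversion can be scripted with ffmpeg; filenames are examples:

```python
# Sketch: convert an uncompressed AVI exported from UE4 to lossless
# FFV1 in a Matroska container. Assumes ffmpeg is on PATH; filenames
# are examples.
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-i", "sequence_export.avi",
        "-c:v", "ffv1",    # lossless video codec
        "-level", "3",     # FFV1 version 3
        "-slicecrc", "1",  # per-slice CRCs support later fixity checks
        "sequence_export_ffv1.mkv",
    ],
    check=True,
)
```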
In order to generate a video from a UE4 project using the editor tools, you need to first create a sequence: a scripted or recorded set of events occurring within the virtual environment. Some projects may already use a sequence to choreograph the actions that occur within the IM experience (e.g. an 'on-rails' experience). In these cases, locate the sequence in the Content Browser and open it. In the toolbar you should see a clapperboard icon, which will open up the Render Movie Settings dialogue.
360 video can be created with a workflow that utilizes plugins enabling export of stereoscopic frames from UE4, which are then assembled into a video sequence with software like Adobe After Effects (which has support for VR and 360 video).
Note that there are some caveats to using Panoramic Capture Tool:
It does not export audio.
Audio could theoretically be pulled from fixed-perspective video of an "on-rails" experience, but another strategy would be required for interactive content.
It is an Experimental feature in UE4, and is not as actively developed or supported as other UE features.
UE v4.19 and earlier use a different version of Panoramic Capture that does not have the benefit of Blueprints, and requires some manual programming; this earlier version is not covered here.
There are other 360 export tools that can be purchased in the Unreal marketplace.
[workflow to be added here]
[To be added]
You can download prebuilt UE4 binaries using the Epic Games Launcher. Once installed, the engine version can be located in your UE4 install directory and zipped for archiving.
Alternatively, you can build binaries from the Unreal Engine source code (held in a private GitHub repository; you will need to request access prior to use). See also the Binary Builder tool .
For a project using modified source code, the creator should be able to supply this or advise on where it can be found. For unmodified UE4 source code, this can be pulled from the Unreal Engine repository on GitHub: https://github.com/EpicGames/UnrealEngine. This is a private repository, so you will need to request access and link your GitHub account to an Epic Games account before being able to access it; see the .
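Once access has been granted, retrieving a specific engine version can be scripted. A sketch using git, where the release tag is an example (check the repository for available tags):

```python
# Sketch: clone a specific UE4 source release for archiving. Requires
# a GitHub account linked to an Epic Games account; the tag name is an
# example - check the repository for available release tags.
import subprocess

subprocess.run(
    [
        "git", "clone",
        "--depth", "1",                # skip full history to save space
        "--branch", "4.27.2-release",  # example release tag
        "https://github.com/EpicGames/UnrealEngine.git",
        "UnrealEngine-4.27.2",
    ],
    check=True,
)
```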
Where there is no existing sequence, you can create one by scripting or by recording interaction in the engine.
The following workflow uses Panoramic Capture, a free plugin included with UE4.
Criteria | Assessment |
---|---|
Source code access | Public source code access, although access to the private GitHub repository needs to be requested. Not open source per se; released under the Unreal Engine EULA. Seems accommodating to the preservation use case, but use must abide by the EULA (e.g. no redistribution of engine source code). |
Licencing | Users pay a 5% royalty to Epic Games if product revenues exceed $1,000,000 USD. |
Availability of old versions | Oldest version available is 4.0.2 (released 28 March 2014) via the Epic Games Launcher or GitHub. |
Format | Includes Material? | Notes |
---|---|---|
FBX | Yes | Can export to the 2011, 2012, 2013, 2014, 2016, 2018, 2019 and 2020 versions of the FBX spec. |
OBJ | No | |
Unreal object text (.copy) | No | Identical to .t3d |
Unreal object text (.t3d) | No | Identical to .copy |
Platform | Dependencies |
---|---|
Windows | The Epic Games Launcher automatically runs a dependency installer when UE4 is installed, called 'UE4PrereqSetup_x64.exe' or 'UE4PrereqSetup_x86.exe'. Manually running these is required if a different method of installation is used (e.g. copying the engine binaries onto the system). These can be located within the Engine directory: [UE4EditorRootLocation]\Engine\Extras\Redist\en-us |

Target Platform | Dependencies |
---|---|
Linux | clang for Linux (version depends on engine version: https://docs.unrealengine.com/4.27/en-US/SharingAndReleasing/Linux/NativeToolchain/) |
Linux (cross-compile from Windows) | clang for Windows (version depends on engine version: https://docs.unrealengine.com/4.27/en-US/SharingAndReleasing/Linux/AdvancedLinuxDeveloper/LinuxCrossCompileLegacy/) |
Windows | Visual Studio, required for some projects (possibly only those with code modification; version depends on engine version: https://docs.unrealengine.com/5.0/en-US/setting-up-visual-studio-development-environment-for-cplusplus-projects-in-unreal-engine/) |
Android Debug Bridge (adb) | Debugging, sideloading | Command line tool for debugging projects and sideloading content to Android-based hardware | Windows, Mac, Linux | Yes |
Android Logcat | Debugging | The Android Logcat package is a utility for displaying log messages coming from an Android device in the Unity Editor | Unity 2019.1 or above | No |
apitrace | Debugging | Tools for tracing OpenGL, Direct3D, and other graphics APIs | Windows | Yes |
Ardour | DAW | Open-source digital audio workstation; ambisonic and binaural plugins available | Linux, Mac, Windows | Yes |
Blender | 3D | Open-source 3D suite and more | Linux, Mac, Windows | Yes |
FRAPS | Video Recording; Monitoring / Metrics | Screen recording and frame rate monitoring tool (DirectX, OpenGL) | Windows XP, 2003, Vista, and Windows 7 | Free, not open source |
NVIDIA Ansel | 360° images | Capture 360° still images | Unreal Engine 4, Unity 5 | Requires account on developer.nvidia.com to access SDK |
Intel Graphics Performance Analyzers | Graphics Analysis | Command line and scripting interface to expose capture and playback functionalities | Windows, Ubuntu | Free, not open source |
iVRY | Hardware Emulator | Use iPhone 6+, Android 4.4+ to view Valve OpenVR/SteamVR content for HTC Vive and Oculus Rift on Windows 7+ | Windows | No |
libsurvive | Driver / Library | Open-source tracking library for Valve's Lighthouse and Vive tracking systems | Windows, Debian | Yes |
monado | XR runtime | Runtime for VR and AR on mobile, PC/desktop, and HMDs (OpenXR API) | GNU/Linux | Yes |
LIV | Video recording | Live capture and streaming of user interactions | Windows / SteamVR | No |
Microsoft Hololens Emulator | Hardware Emulator | Test Hololens apps on PC, use mouse/keyboard inputs instead of controllers without code adaptation | Windows | Free, not open source |
NVIDIA GeForce ShadowPlay | Video recording | Screen capture in real-time 3D applications (not 360°) | Windows, NVIDIA GeForce graphics cards | Free, not open source |
Oculus 360 Capture SDK | Video Recording | 360° video recording in real-time 3D applications | Unity, Unreal, NVIDIA and AMD GPUs | Free, not open source |
vr5kplayer | Video Recording | Create and play a view-dependent version of a 5K x 5K 360 degree stereo video on Oculus mobile VR systems. | Oculus Go or Samsung S8 (and later) Gear VR systems. | Free, not open source |
Oculus Compositor Mirror tool | Monitoring, Documentation | Displays the content that appears within the Rift headset on a computer monitor. It has several display options that are useful for development, troubleshooting, and presentations. | Windows, Oculus Rift | Free, not open source |
Oculus HMD head motion emulation | Hardware Emulator | Simulate the movement of a user directly in the Unity Editor | Windows, Unity, Quest, Rift | Free, not open source |
Open Broadcaster Software | Video Recording | Combine multiple computer sources in custom layout, with switching | Windows, Mac, Linux | Yes |
Open VR Recorder | Tracking/Input Data Recording | Record tracking data from devices with drivers for OpenVR / SteamVR. | OpenVR, SteamVR; HTC Vive, Oculus Rift VR systems | Trial 10 seconds recording; $125 license |
Open XR Conformance spec | Conformance tool | Command line interface conformance test suite for OpenXR | Yes |
OpenComposite | Compatibility Layer | Play SteamVR games without SteamVR | Yes |
ReVive | Compatibility Layer | Compatibility layer between the Oculus SDK and OpenVR/OpenXR | Yes |
OpenHMD | Reverse engineering | Distortion Maps for headsets | Yes |
OVR Metrics Tool | Monitoring / Metrics | Generates performance metrics for applications running on Oculus mobile devices | Unreal, Unity, Quest | Free, not open source |
Radeon GPU Analyzer | Graphics Analysis | Performance analysis tool for DirectX, Vulkan, SPIR-V, OpenGL, and OpenCL | Windows, Linux | Yes |
Radeon Software Adrenaline 2020 Edition (ReLive 2019) | Video Recording | Screen capture for AMD Radeon graphics cards | Windows, DirectX, Vulkan | Free, not open source |
RenderDoc | Debugging | General purpose graphics debugger | Windows | Yes |
RenderDoc for Oculus | Debugging | Branch of the RenderDoc project by Oculus for debugging the Oculus Quest | Unreal, Unity, Quest | Free, not open source |
RivaTuner Statistics Server | Monitoring | Framerate monitoring, On-Screen Display and high-performance video capture service | Windows | Free, not open source |
SideQuest | Sideload utility | Sideload content to Oculus Quest | Windows, Mac, Linux, Android | Free, not open source |
Sites in VR | Calibration | Mobile calibration tool |
Spatial Media Metadata Injector | MetaData | 360° video metadata injector | MacOS, Windows | open source |
Steam VR Mirror Mode | Monitoring, Documentation | Enables you to see what the user sees in the HMD for SteamVR content | Unreal, SteamVR Tools | Free, not open source |
Surreal Capture | Video Recording | 360° video recording in real-time 3D applications | Windows | No, paid license ($179.95) or 15-day trial |
Unity Recorder Package | Recording | Capture and save data during play mode | Unity Editor | Included in Unity |
Unity Stereo 360 Image and Video Capture | Recording | Renders the camera view for each eye; requires additional code and software settings | Unity 2018.1+ | Included in Unity |
Unreal nDisplay | Monitoring, Documentation | Renders a scene on multiple synchronized displays | Unreal Engine | Included in Unreal |
Unreal Panoramic Capture | Video export | Exports 360° still images or frames that can be compiled into video with Adobe AfterEffects | Unreal Engine | Plugin for Unreal |
Unreal Replay System | Recording | Console tool for recording and playback of game play | Unreal Engine | Unreal Engine 4 feature |
Unreal VR Spectator Screen | Recording | View content from third-person perspective | HTC Vive, Oculus Rift, Steam VR, PlayStation VR (PS VR) | Unreal Engine 4 feature |
Free and open source software for video recording. Cross-platform (Windows, MacOS and Linux).
Requires NVIDIA GPU.
Windows 10/11 built-in tool.
Mobile platforms and app services are part of preserving and re-exhibiting AR content and mobile phone-based XR content. This page gathers published research and resources on these topics.
“Towards a preservation workflow for mobile apps” (24 February 2021)
“A Race Against Time: Preserving iOS App-Based Artworks” (2019-20)
“Considerations on the Acquisition and Preservation of Mobile eBook Apps” (2019)
Smartphones within the changing landscape of digital preservation (2017)
Audio frameworks, spatial audio plugins, ambisonics and more.
Wwise
Wwise is an audio framework for macOS and Windows used to create interactive audio content for the Unreal and Unity engines.
The IEM Plug-in Suite is a free and open-source audio plug-in suite, including Ambisonic plug-ins up to 7th order, for Linux, macOS and Windows.
Available in VST2, VST3 and standalone formats.
Plug-ins include StereoEncoder, RoomEncoder, EnergyVisualiser, BinauralDecoder, MultiBandCompressor and more.