This branch is work in progress. Update: it now works with UE 5.1 (and with UE 5.2 from a selected ue5-main commit, see below)!
AirSim limitations in UE5:
- only the AirSim ComputerVision mode is supported
- the record button is currently broken (I record data with a Python script)
I am using this project to record data (images, GBuffers, ...) in the Unreal Engine 5 demo for a machine learning project.
My stack:
- Visual Studio 2022
- Windows 10 SDK (10.0.18362.0)
⚠️ The Visual Studio version needs to be 2022: the AirSim repo will not accept another toolset by default when building manually. Pre-built libraries are also shipped for this toolchain, and the linker will not be able to match them with other versions.
Compatibility version table
CitySample | Unreal Engine | Comments |
---|---|---|
5.0 | 5.0 | No stencil mask support |
5.1 | 5.1 | No stencil mask support |
5.1 | custom ue5-main | Stencil masks supported! Needs a custom UE5 build (see below) |
5.2 | 5.2 | Currently no cars and humans, see [no Mass entities thread](https://forums.unrealengine.com/t/ue-5-2-citysample-5-2-no-mass-entities/1163486/2) |
You can use the Marketplace UE5 build. Note that with version 5.1 and older you will not get any segmentation masks except for humans, because back then Nanite did not support writing to the custom stencil buffer. Also, CitySample at 5.2 has no moving cars or walking humans.
This is the only branch where most things work, and it is the branch I am using now. You will need to compile UE5 yourself.
- git clone the ue5-main branch from the official repo: https://github.com/EpicGames/UnrealEngine/tree/ue5-main
- checkout exactly this commit (note: it will not work with newer commits, since changes in Mass will not compile with the current CitySample):
c825148dc6e018f358c5d36346c8698c47835a48
- Generate project files. To generate Visual Studio 2022 project files, use this flag:
GenerateProjectFiles.bat -2022
- go to:
Engine/Plugins/Experimental/ChaosUserDataPT/Source/ChaosUserDataPT/Public/ChaosUserDataPT.h
and change line 102 to:
if (const FPhysicsSolverBase* MySolver = this->GetSolver())
- go to:
Engine/Plugins/Runtime/MassEntity/Source/MassEntity/Public/MassRequirements.h
and comment out the check starting at line 199:
//checkf(FragmentRequirements.FindByPredicate([](const FMassFragmentRequirementDescription& Item) { return Item.StructType == T::StaticStruct(); }) == nullptr
//, TEXT("Duplicated requirements are not supported. %s already present"), *T::StaticStruct()->GetName());
- if you have compiler errors in ue5-main related to the include order version, you can try adding this line to selected *.Target.cs files. Ultimately it should work without it, but for me the include ordering version is currently a little buggy and random.
IncludeOrderVersion = EngineIncludeOrderVersion.Unreal5_2;
ℹ️ Stencil support only starting from UE 5.2: writing to custom depth stencil buffers is possible starting from UE 5.2. If you want stencil buffers, you need the UE 5.2 version with the above modifications.
Not yet available (300GB). I will share this possibly somehow in the future.
B. Download CitySample (5.1 version) from the Epic Games Store/Marketplace and build the AirSim plugin on your own
This is now also supported with UE 5.2 using the commit checkout above.
Build the AirSim plugin from this branch the same way as in the official AirSim building instructions. After you have built the plugin from the proper command prompt with build.cmd (please use the Visual Studio 2022 command prompt), copy the plugin dir (AirSim→Unreal→Plugins) to the CitySample project plugins dir (CitySample→Plugins).
Generate Visual Studio project files (right-click the .uproject, select the proper UE5 version → Generate). Add "AirSim" to the .uproject and to Config\DefaultGame.ini. Follow these instructions: add AirSim to .uproject.
When it compiles, select the CitySample project in Visual Studio and press CTRL+F5 (run and detach). Then, in the editor, open Map\Big_City_LVL (it will take a while).
When done, you can package the project or run it directly from the editor.
Modify "World Settings → Selected GameMode" and change the HUD to SimHUD.
At this stage you should be able to run the CitySample demo from the editor or package it.
If you want stencils/segmentation masks, check AirSim\PythonClient\computer_vision\UECapture\set_stencils_from_editor.py (set your stencil values accordingly) and run it from the Unreal Engine Python command.
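The core of such a script is a rule that maps each actor to a stencil ID. The sketch below is hypothetical (the substring keys, ID values, and matching rule are assumptions, not the contents of set_stencils_from_editor.py); it only illustrates the idea of assigning stencil IDs by actor name:

```python
# Hypothetical sketch: pick a custom depth stencil ID for an actor by name.
# The substring -> ID mapping below is an assumption; adjust it to your
# own label scheme before use.

STENCIL_MAP = {
    "human": 1,
    "car": 2,
    "building": 3,
    "road": 4,
}

def stencil_for_actor(actor_name: str, mapping=STENCIL_MAP, default=0) -> int:
    """Return the first stencil ID whose key is a substring of the actor name."""
    lowered = actor_name.lower()
    for key, stencil_id in mapping.items():
        if key in lowered:
            return stencil_id
    return default

# Inside the Unreal editor you would then apply it roughly like this
# (editor-only, requires the `unreal` module):
#   import unreal
#   for actor in unreal.EditorLevelLibrary.get_all_level_actors():
#       sid = stencil_for_actor(actor.get_name())
#       for comp in actor.get_components_by_class(unreal.PrimitiveComponent):
#           comp.set_editor_property("render_custom_depth", True)
#           comp.set_editor_property("custom_depth_stencil_value", sid)
```

The actual script may use a different mapping and matching strategy; check its source before relying on specific IDs.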
After packaging, you are ready to go with the next steps.
⚠️ Record button broken, use the Python script: please note that for now the red AirSim Recording button does not work (it will crash), because I have not yet fixed the velocity 3-channel float buffer there. For now I grab GBuffers and images from a Python script; please use the Python script instead of the record button.
⚠️ Nanite stencils not supported in UE 5.1 and earlier: as of UE 5.1, Nanite does not support writing to stencil buffers (see Nanite Virtualized Geometry in the official docs). Human/character stencils work, but cars/streets/buildings do not write to stencil buffers.
ℹ️ Solution for the wrong VS toolchain being picked up when building CitySample: I have now set the config in the CitySample project to explicitly use VS 2022. Still, if you have an older Visual Studio installed in parallel, another toolchain might be chosen by default. To change the toolchain, open UE5, create an empty project, and set the toolchain in two places, for the project and for the editor:
- Edit → Project Settings → Platforms → Windows → Toolchain → CompilerVersion → VS 2022
- Editor Preferences → search for "Source Code Editor" → VS 2022 (or Engine Settings → "Source Code" section)
Paste the config JSON file into your "Documents/AirSim" dir. This file sets up ComputerVision mode, resolutions, and how buffers are saved.
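A minimal sketch of what such a settings.json might contain, assuming the standard AirSim settings schema (the resolution values here are placeholders; the actual config shipped in this repo may set more options):

```json
{
  "SettingsVersion": 1.2,
  "SimMode": "ComputerVision",
  "CameraDefaults": {
    "CaptureSettings": [
      {
        "ImageType": 0,
        "Width": 1280,
        "Height": 720,
        "FOV_Degrees": 90
      }
    ]
  }
}
```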
My config is in my_json dir:
Clone the modified AirSim repo and switch to the citysample branch (you possibly already have it if you followed step B, "Download CitySample (5.1 version) from the Epic Games Store/Marketplace").
Install Conda, open Anaconda Prompt
conda create --name airsim python=3.8
cd to Airsim root dir
conda activate airsim
cd PythonClient\computer_vision\UECapture
pip install -r requirements.txt
pip install tqdm
pip install --user numpy numpy-quaternion
python -m pip install -e ./
# when you open the modified CitySample UE5 demo with our AirSim plugin, run
python main.py --record --verbose
python main.py --record
The Python client should connect to the compiled, running UE5 CitySample demo and move the camera. The UE5 game environment is slowed down about 100x, and you should see the camera teleporting and GBuffers being saved to disk.
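As an illustration of the kind of post-processing the saved buffers need, the sketch below decodes a raw little-endian float32 depth buffer into rows and clamps far values such as sky. The buffer layout (flat row-major float32) is an assumption; match it to how the recording script actually writes data:

```python
import struct

def decode_depth(raw: bytes, width: int, height: int, max_depth: float = 100.0):
    """Decode a raw little-endian float32 depth buffer into a row-major
    list of rows, clamping anything beyond max_depth (e.g. sky)."""
    expected = width * height
    values = struct.unpack("<%df" % expected, raw)
    clamped = [min(v, max_depth) for v in values]
    return [clamped[r * width:(r + 1) * width] for r in range(height)]

# Example with a tiny synthetic 2x2 buffer (500.0 plays the role of sky):
raw = struct.pack("<4f", 1.5, 2.5, 500.0, 3.0)
depth = decode_depth(raw, width=2, height=2)
```

In practice you would read `raw` from one of the files written by main.py and feed the rows into numpy for further processing.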
These are somewhat older instructions for UE 5.0.1-5.1. For now I am leaving them as-is, before refactoring.
This branch shows how I managed to run and use the AirSim plugin in UE5. The code comments out unsupported vehicle setups, which allows compiling AirSim for UE5 and using it in "ComputerVision" mode. Please note that the other modes ("Multirotor", "Car") are not supported here and will not work properly when using this branch.
What works for me so far:
- "ComputerVision" mode
- saving depth, normals, segmentation
- python api to control ue5 app
What does not work:
- OpticalFlow (HLSL compilation errors, not yet investigated)
- "Multirotor" and "Car" (for now I am not bringing these as I don't need them)
My stack:
- Unreal Engine 5.0.2, Release branch
- Visual Studio 2022 and Windows 10 SDK (10.0.19041.0)
AirSim by default wants to be compiled with 10.0.19041.0. If you have other Windows 10 SDK versions installed, I suggest uninstalling them so Unreal Engine/Visual Studio does not pick them up by accident. Get the Unreal Engine 5.0.2 Release branch (or possibly newer). For the 5.0.1 release, you will need to fix JSON 1.2 support manually.
Get code from this AirSim Branch. Compile it and add it to your Unreal Project (copy plugin dir, edit .uproject). Follow official AirSim build instructions for these.
In settings.json set:
"SimMode": "ComputerVision"
Enjoy AirSim "ComputerVision" mode in UE5!
Warning - you may need additional adjustments to your project or AirSim to make it work properly.
For example, when working with the City Sample UE5 demo I learned:
- AirSim secondary cameras are not working with Lumen by default.
- AirSim cameras are based on SceneCaptureComponent2D. This means they will not capture reflections of objects outside the camera view.
- AirSim sets a new world origin, which causes trouble for the City Sample demo. The SetNewWorldOrigin() calls must be commented out in the AirSim code, as they cause a lot of glitches (spawning low-poly buildings, not drawing the small Mass AI crowd/traffic, wrongly placed crowd/traffic, etc.)
As a workaround for 1. and 2., I rely on taking screenshots from the main camera as the standard RGB image. The position and FOV (90 deg) of the main camera and the AirSim ComputerVision camera(s) are the same.
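Since both cameras share the same pose and a 90-degree FOV, pinhole intrinsics for aligning the RGB screenshot with the AirSim buffers can be derived from the resolution alone. A small sketch (the 1280x720 resolution is a placeholder; square pixels and a horizontal FOV are assumptions):

```python
import math

def intrinsics_from_fov(width: int, height: int, fov_deg: float = 90.0):
    """Compute pinhole fx, fy, cx, cy from a horizontal field of view.
    With a 90-degree horizontal FOV, fx works out to width / 2."""
    fx = (width / 2.0) / math.tan(math.radians(fov_deg) / 2.0)
    fy = fx  # square pixels assumed
    cx, cy = width / 2.0, height / 2.0
    return fx, fy, cx, cy

fx, fy, cx, cy = intrinsics_from_fov(1280, 720)  # placeholder resolution
```

These intrinsics let you back-project the recorded depth buffers into 3D points in the shared camera frame.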
I will paste a link here to a branch where I use the mentioned fixes in the City Sample demo.
AirSim is a simulator for drones, cars and more, built on Unreal Engine (we now also have an experimental Unity release). It is open-source, cross platform, and supports software-in-the-loop simulation with popular flight controllers such as PX4 & ArduPilot and hardware-in-loop with PX4 for physically and visually realistic simulations. It is developed as an Unreal plugin that can simply be dropped into any Unreal environment. Similarly, we have an experimental release for a Unity plugin.
Our goal is to develop AirSim as a platform for AI research to experiment with deep learning, computer vision and reinforcement learning algorithms for autonomous vehicles. For this purpose, AirSim also exposes APIs to retrieve data and control vehicles in a platform independent way.
Check out the quick 1.5 minute demo
Drones in AirSim
Cars in AirSim
For more details, see the use precompiled binaries document.
View our detailed documentation on all aspects of AirSim.
If you have remote control (RC) as shown below, you can manually control the drone in the simulator. For cars, you can use arrow keys to drive manually.
AirSim exposes APIs so you can interact with the vehicle in the simulation programmatically. You can use these APIs to retrieve images, get state, control the vehicle and so on. The APIs are exposed through the RPC, and are accessible via a variety of languages, including C++, Python, C# and Java.
These APIs are also available as part of a separate, independent cross-platform library, so you can deploy them on a companion computer on your vehicle. This way you can write and test your code in the simulator, and later execute it on the real vehicles. Transfer learning and related research is one of our focus areas.
Note that you can use SimMode setting to specify the default vehicle or the new ComputerVision mode so you don't get prompted each time you start AirSim.
There are two ways you can generate training data from AirSim for deep learning. The easiest way is to simply press the record button in the lower right corner. This will start writing pose and images for each frame. The data logging code is pretty simple and you can modify it to your heart's content.
A better way to generate training data exactly the way you want is by accessing the APIs. This allows you to be in full control of how, what, where and when you want to log data.
Yet another way to use AirSim is the so-called "Computer Vision" mode. In this mode, you don't have vehicles or physics. You can use the keyboard to move around the scene, or use APIs to position available cameras in any arbitrary pose, and collect images such as depth, disparity, surface normals or object segmentation.
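Camera poses in such APIs are typically a position plus an orientation quaternion. A small helper for building that quaternion from pitch/roll/yaw, shown here as a sketch using the common ZYX Euler convention (verify the convention against the API you actually call):

```python
import math

def euler_to_quaternion(pitch: float, roll: float, yaw: float):
    """Convert ZYX Euler angles (radians) to a (w, x, y, z) quaternion.
    Roll is about x, pitch about y, yaw about z."""
    cy, sy = math.cos(yaw * 0.5), math.sin(yaw * 0.5)
    cp, sp = math.cos(pitch * 0.5), math.sin(pitch * 0.5)
    cr, sr = math.cos(roll * 0.5), math.sin(roll * 0.5)
    w = cr * cp * cy + sr * sp * sy
    x = sr * cp * cy - cr * sp * sy
    y = cr * sp * cy + sr * cp * sy
    z = cr * cp * sy - sr * sp * cy
    return w, x, y, z

q = euler_to_quaternion(0.0, 0.0, math.pi / 2)  # camera turned 90 deg left/right
```

The AirSim Python package ships its own conversion utilities; this standalone version is only to show what the orientation part of a pose contains.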
Press F10 to see various options available for weather effects. You can also control the weather using APIs. Press F1 to see other options available.
- Video - Setting up AirSim with Pixhawk Tutorial by Chris Lovett
- Video - Using AirSim with Pixhawk Tutorial by Chris Lovett
- Video - Using off-the-shelf environments with AirSim by Jim Piavis
- Webinar - Harnessing high-fidelity simulation for autonomous systems by Sai Vemprala
- Reinforcement Learning with AirSim by Ashish Kapoor
- The Autonomous Driving Cookbook by Microsoft Deep Learning and Robotics Garage Chapter
- Using TensorFlow for simple collision avoidance by Simon Levy and WLU team
More technical details are available in AirSim paper (FSR 2017 Conference). Please cite this as:
@inproceedings{airsim2017fsr,
author = {Shital Shah and Debadeepta Dey and Chris Lovett and Ashish Kapoor},
title = {AirSim: High-Fidelity Visual and Physical Simulation for Autonomous Vehicles},
year = {2017},
booktitle = {Field and Service Robotics},
eprint = {arXiv:1705.05065},
url = {https://arxiv.org/abs/1705.05065}
}
Please take a look at open issues if you are looking for areas to contribute to.
We are maintaining a list of a few projects, people and groups that we are aware of. If you would like to be featured in this list please make a request here.
Join our GitHub Discussions group to stay up to date or ask any questions.
We also have an AirSim group on Facebook.
- Cinematographic Camera
- ROS2 wrapper
- API to list all assets
- movetoGPS API
- Optical flow camera
- simSetKinematics API
- Dynamically set object textures from existing UE material or texture PNG
- Ability to spawn/destroy lights and control light parameters
- Support for multiple drones in Unity
- Control manual camera speed through the keyboard
For complete list of changes, view our Changelog
If you run into problems, check the FAQ and feel free to post issues in the AirSim repository.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.
This project is released under the MIT License. Please review the License file for more details.