Short: |
9K33M2 Оса surface-to-air missile system simulator |
Tech: |
C#, C, Unity, GLSL, Shader graph |
Links: |
One of the learning projects I did in Unity. It's a simulator for the Soviet-era SAM system (the one that shoots missiles at planes). In the West this system is known as the SA-8 Gecko.
Osa, like most SAM systems, has two radars: a search radar and a tracking radar. The search radar is a smaller dish that constantly sweeps the sky looking for potential contacts. The tracking radar is the bigger one; it locks onto the plane to provide guidance information for the missile.
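As a rough illustration of that split, the two antennas can be driven like this in Unity (a sketch of mine, not the project's actual code; the sweep and slew rates are placeholders, not figures from the manual):

```csharp
using UnityEngine;

// Minimal sketch of the two-antenna behaviour: the search antenna spins
// continuously, the tracking antenna only slews once a target is designated.
// All names and rates below are illustrative placeholders.
public class RadarAntennas : MonoBehaviour
{
    public Transform searchAntenna;           // small dish, constant sweep
    public Transform trackingAntenna;         // big dish, locks onto a target
    public Transform lockedTarget;            // null until a target is designated

    public float searchSweepDegPerSec = 200f; // placeholder sweep rate
    public float trackingSlewDegPerSec = 60f; // placeholder slew rate

    void Update()
    {
        // Search radar: endless rotation around the vertical axis.
        searchAntenna.Rotate(0f, searchSweepDegPerSec * Time.deltaTime, 0f, Space.World);

        // Tracking radar: turn toward the locked target, if any.
        if (lockedTarget != null)
        {
            Quaternion wanted = Quaternion.LookRotation(
                lockedTarget.position - trackingAntenna.position, Vector3.up);
            trackingAntenna.rotation = Quaternion.RotateTowards(
                trackingAntenna.rotation, wanted, trackingSlewDegPerSec * Time.deltaTime);
        }
    }
}
```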
I went deep into researching this. While there was a simplistic simulator available where you could push some buttons, I had to dig up the manual (thankfully the system is quite old, and you can find operator's manuals online, provided you can read Russian) to figure out the proper procedures. I didn't find much in terms of technical specifications though, most notably the power of the tracking and search radars (the operator's manual did have some of the specs, like the antenna sweep speed, the search beam width and the vertical coverage limits), so I had to approximate the power using the range information I found online.
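To make that concrete, here is the kind of back-of-the-envelope estimate I mean: the standard radar range equation solved for transmit power at a given maximum detection range. Every number in the example is invented for illustration; these are not the figures used in the simulator.

```csharp
using System;

// Estimate transmit power from a published detection range using the radar
// range equation: Pr = Pt * G^2 * lambda^2 * sigma / ((4*pi)^3 * R^4).
// At the maximum range, Pr equals the receiver's minimum detectable signal,
// so we solve for Pt. All inputs below are placeholders.
static class RadarPowerEstimate
{
    static double TransmitPowerForRange(
        double maxRangeMeters,        // published detection range
        double antennaGain,           // linear gain (not dB)
        double wavelengthMeters,
        double targetRcsSquareMeters,
        double minDetectableWatts)
    {
        double fourPiCubed = Math.Pow(4.0 * Math.PI, 3.0);
        double r4 = Math.Pow(maxRangeMeters, 4.0);
        return minDetectableWatts * fourPiCubed * r4
             / (antennaGain * antennaGain
                * wavelengthMeters * wavelengthMeters
                * targetRcsSquareMeters);
    }

    static void Main()
    {
        // Made-up inputs: ~25 km range, 35 dB gain, ~3 cm wavelength,
        // 1 m^2 target, -110 dBm receiver sensitivity.
        double gain = Math.Pow(10.0, 35.0 / 10.0);
        double minSignal = Math.Pow(10.0, (-110.0 - 30.0) / 10.0); // dBm -> W
        double pt = TransmitPowerForRange(25_000.0, gain, 0.03, 1.0, minSignal);
        Console.WriteLine($"Estimated peak transmit power: {pt / 1000.0:F1} kW");
    }
}
```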
There is a short video of it:
This project took an unorthodox approach to radar simulation. Since there wasn't much going on in terms of visuals, I decided to use some of the spare GPU processing power to simulate the radar on the GPU instead. The simplistic approach would be to treat the plane as a dot in the sky, plugging the distance and angles into a formula to calculate the radar return. While that would have been sufficient, I wanted to go deeper. One of the most complex parts of radar simulation is the radar cross section calculation. There are formulas for it, but only for simple shapes (that's why the F-117 looks like a low-poly model: at the time those were the only shapes whose cross section could be calculated). I figured I could also approximate it by thinking of radar returns as rendered images. After all, we've gotten really good at approximating how light travels, so why not use that to approximate radar waves, which are similar in nature?
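For contrast, the "dot in the sky" version could look something like the sketch below. It is purely illustrative, with a made-up aspect factor and a fixed beam check; this is the approach the project deliberately moved away from.

```csharp
using UnityEngine;

// Point-target radar return: each plane is a single point, and the return is
// computed from distance and angles alone. Illustrative only.
public static class PointTargetRadar
{
    public static float ReturnPower(
        Vector3 radarPos, Vector3 radarBoresight,
        Vector3 targetPos, Vector3 targetForward,
        float transmitPower, float baseRcs)
    {
        Vector3 toTarget = targetPos - radarPos;
        float range = toTarget.magnitude;

        // Only targets inside the beam contribute (fixed 10-degree half-angle here).
        if (Vector3.Angle(radarBoresight, toTarget) > 10.0f)
            return 0.0f;

        // Crude aspect factor: a broadside target reflects more than a nose-on one.
        float aspect = Mathf.Abs(Vector3.Dot(toTarget.normalized, targetForward));
        float rcs = baseRcs * Mathf.Lerp(1.0f, 0.2f, aspect);

        // Two-way propagation: received power falls off with the fourth power of range.
        return transmitPower * rcs / Mathf.Pow(range, 4.0f);
    }
}
```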
So instead, the radar became a camera accompanied by a spotlight. With a custom set of shaders I could render the sky, with planes illuminated by the radar. I could also use their normals as a means to approximate the radar cross section (generally speaking, the more directly a surface faces the radar, the more visible it is), so that both geometry and size affect the radar return. The intensity of the spotlight simulates the power of the radar, with contacts gradually fading out with distance.
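In Unity terms, the rig could be wired up roughly like this. This is a sketch assuming the built-in render pipeline; the component names, resolution and ranges are placeholders, and the actual return math lives in the replacement shader, which isn't shown.

```csharp
using UnityEngine;

// Camera-plus-spotlight radar rig: the beam width drives both the camera FOV
// and the spot angle, and the radar power is mapped onto the light intensity.
public class RadarCameraRig : MonoBehaviour
{
    public Shader radarReturnShader;     // custom shader writing return energy into the output
    public float beamWidthDegrees = 10f;
    public float radarPower = 1.0f;

    Camera radarCamera;
    Light radarBeam;
    public RenderTexture radarImage;

    void Start()
    {
        // Float texture so the "energy" channel isn't crushed into 8 bits.
        radarImage = new RenderTexture(512, 512, 24, RenderTextureFormat.ARGBFloat);

        radarCamera = gameObject.AddComponent<Camera>();
        radarCamera.fieldOfView = beamWidthDegrees;
        radarCamera.targetTexture = radarImage;
        // Render everything the radar "sees" with the radar shader instead of the normal materials.
        radarCamera.SetReplacementShader(radarReturnShader, "RenderType");

        radarBeam = gameObject.AddComponent<Light>();
        radarBeam.type = LightType.Spot;
        radarBeam.spotAngle = beamWidthDegrees;
        radarBeam.intensity = radarPower;  // more power = contacts stay visible further out
        radarBeam.range = 30000f;          // placeholder maximum range in meters
    }
}
```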
While the simulation is generally built on approximations (as there is no hard data available for these radars), it still models quite a lot (a sketch of how these factors could combine follows the list):
- Dispersion by air molecules and by precipitation
- Radar beam-shape losses (the radar doesn't emit energy uniformly)
- System losses (some of the radar energy is lost in the electrical systems)
- Radar cross section based on target model geometry and normal maps
- Occlusion calculations
- Doppler effect simulation
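Here is one way those factors could be folded into a single return value, written out in C# for readability. In the project this happens inside the shaders, and all the coefficients below are placeholders rather than measured values.

```csharp
using UnityEngine;

// Illustrative combination of the loss factors listed above into one
// per-fragment return value. Not the project's actual shader math.
public static class RadarLossModel
{
    public static float ReturnEnergy(
        float transmitPower,
        float rangeMeters,
        float offBoresightDegrees,   // angle between the beam center and this fragment
        float beamWidthDegrees,
        float rainIntensity,         // 0 = clear sky, 1 = heavy precipitation
        float facingFactor)          // dot(surface normal, direction to radar), clamped to [0,1]
    {
        // Beam-shape loss: energy falls off toward the edge of the beam (Gaussian-ish).
        float beam = Mathf.Exp(-2.0f * Mathf.Pow(offBoresightDegrees / beamWidthDegrees, 2.0f));

        // Air and precipitation dispersion: exponential decay with range.
        float lossPerKm = 0.005f + 0.05f * rainIntensity;      // placeholder coefficients
        float atmosphere = Mathf.Exp(-lossPerKm * rangeMeters / 1000.0f);

        // Fixed system loss for energy dissipated in the electronics.
        const float systemLoss = 0.8f;

        // Geometry-dependent cross section: surfaces facing the radar reflect more.
        float rcs = facingFactor;

        // Two-way path: beam shape and atmospheric losses apply on the way out and back,
        // and the spreading loss goes with the fourth power of range. Occlusion is handled
        // by the depth test during rendering; the Doppler shift goes into its own channel.
        return transmitPower * systemLoss * beam * beam * atmosphere * atmosphere
               * rcs / Mathf.Pow(rangeMeters, 4.0f);
    }
}
```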
So how does it work in Unity? As I said, there are a number of cameras placed on the model, which are rotated by the simulation logic. Those cameras render the world with a special set of shaders, in which most of the calculations are done. The resulting image shows what the camera sees, with the amount of reflected energy represented as one of the color channels (the other channels of the fragment are used for miscellaneous information, such as the Doppler shift). The image is then parsed by a C library (C# is too slow to work with images this big), which outputs textures for the system displays and, if necessary, information for tracking (so the tracking radar can actually follow the target).
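A stripped-down version of that readback-and-parse step could look like the sketch below. The AsyncGPUReadback and P/Invoke parts are real Unity/.NET APIs, but the native ParseRadarImage entry point and its contact format are hypothetical stand-ins for the actual C library.

```csharp
using System.Runtime.InteropServices;
using UnityEngine;
using UnityEngine.Rendering;

// Pulls the rendered radar frame off the GPU and hands it to a native parser.
public class RadarImageParser : MonoBehaviour
{
    // Hypothetical C function: takes the raw float pixels, returns the number of
    // contacts found and fills an (azimuth, range, doppler) buffer for the tracker.
    [DllImport("radar_parser")]
    static extern int ParseRadarImage(float[] pixels, int width, int height,
                                      float[] contactsOut, int maxContacts);

    public RenderTexture radarImage;          // produced by the radar camera rig
    readonly float[] contacts = new float[64 * 3];

    void Update()
    {
        // Read the radar frame back without stalling the render thread.
        AsyncGPUReadback.Request(radarImage, 0, TextureFormat.RGBAFloat, OnReadback);
    }

    void OnReadback(AsyncGPUReadbackRequest request)
    {
        if (request.hasError) return;

        float[] pixels = request.GetData<float>().ToArray();
        int found = ParseRadarImage(pixels, radarImage.width, radarImage.height,
                                    contacts, contacts.Length / 3);

        // 'contacts' now holds (azimuth, range, doppler) triples the tracking logic can use.
        Debug.Log($"Radar frame parsed, {found} contacts");
    }
}
```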
