
[Editor] Explore using Blender as scene editor


It would be very beneficial to support some sort of interoperability with Blender as a scene editor. This could range from allowing devs to use Blender externally to recommending it as the main way to edit Atomic scenes. Blender has a powerful extension API, which not only allows writing custom scripts for it but also driving a running instance from the outside.

Not only is Blender powerful, but it is one of the main industry-standard 3D modeling tools, and many people are familiar with it. It would be a great idea to take advantage of such a well-established ecosystem, and it would potentially bring in a lot of gamedevs who want to expand their existing knowledge rather than jump into a new tool. This is one of the reasons I am using Atomic in the first place: it is very much like Godot, but it uses actual, standards-backed languages for its scripting. If possible we should double down on this and use another standard tool.

A recent example of this done (seemingly?) well is armory3d, a game engine for Blender.

Another example I am more familiar with is coa_tools, which lets people create 2D skeletal animations in Blender and export to existing formats like Dragonbones. This project lets Blender compete in the same space as Dragonbones, Spine, and Spriter, which is remarkable because most of the tools in that space are proprietary and expensive.
I would argue that even if they were all free, I would still go far out of my way to use Blender rather than any one-off piece of software if possible. The same is true here as a scene editor.


I would love to see tighter Blender <==> Atomic integration. I have been using Blender to set up scenes and then importing them into the Atomic editor for tweaking… +1 :camel: for this


Don’t forget 3ds Max. I never got on with Blender.
The problem with editing in 3D software is that you cannot set up prefabs for repeated items.

You can use it to set up a scene, but you will still need to deconstruct it for a game.


Yes, this is something we discussed extensively on Gitter. I really like Blender and I’ve used the BGE in the past, and boy, it was very nice to be able to “see what you’ll get” and edit meshes and animations seamlessly in the same program. Unfortunately the BGE isn’t great.

The problem with integrating an external editor is that rendering is a bit hit-and-miss… even applications that support model export to Unity using the PBR workflow sometimes show serious visual differences, so I assume it’s hard as heck to make it work flawlessly.

Besides the visual part, it would be a pain to manage the objects: what if you open your scene in Atomic and add components to objects, but then have to go back to Blender to change something? A bidirectional approach, while possible in theory, would be extremely difficult to achieve. The scripting API in Blender also isn’t very reliable, plugins break quite often, so maintaining something substantial is going to be a pain.

That being said, I’m thinking about implementing a simplified, somewhat unidirectional solution for a future project, which basically manages the Blender scene externally. The idea is to have an absolutely minimal Blender script that just adds unique identifiers to objects (possibly in their names), and we would read the scene externally to keep track of the changes we make in the Atomic editor, which would consist mostly of adding components that don’t exist in Blender. Blender does support GLSL, so for shaders we could generate them externally and insert them into the Blender scene to keep rendering somewhat consistent.
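To make the identifier idea concrete, here is a minimal sketch of what such a tagging scheme could look like. It is plain Python so it runs outside Blender too; the `#xxxxxxxx` suffix format and the function names are my own invention, not anything Atomic or Blender defines:

```python
import re
import uuid

# Hypothetical id scheme: append a short unique suffix to each object
# name so an external tool can track objects across Blender saves.
ID_PATTERN = re.compile(r"#([0-9a-f]{8})$")

def tag_name(name):
    """Append a unique id to a name unless it already carries one."""
    if ID_PATTERN.search(name):
        return name
    return "%s#%s" % (name, uuid.uuid4().hex[:8])

def extract_id(name):
    """Return the id embedded in a tagged name, or None."""
    m = ID_PATTERN.search(name)
    return m.group(1) if m else None

# Inside Blender the same logic would just run over the scene objects:
#   for obj in bpy.data.objects:
#       obj.name = tag_name(obj.name)
```

One caveat: Blender truncates object names at 63 characters, so a real add-on would want to guard against that before appending the suffix.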

I don’t know… I honestly don’t see it being used much in the industry. It’s not only anecdotal experience: I’ve talked to people, and whenever I see a ‘making of’ for some game I pay attention to the tools, and in the AAA industry I’ve yet to see people using Blender. A kinda big release I’ve seen recently is NMS :stuck_out_tongue: but besides that, Blender is pretty much an indie tool. I’m not saying that in a bad way, I’m indie4life and don’t like most big studios, but the fact is, Blender is far from being industry-standard.

Nothing keeps you from using 3D apps and rigged meshes to produce ‘skeletal’ animations for 2D games. While not ideal, 3D packages have been doing skeletal animation for decades now, and it’s just a bunch of vertices tied to bones. I personally think these specialized tools are mostly targeted at 2D artists with no 3D experience. I mean, I never understood all the hype around 2D skeletal anims; I’ve been doing that since the stone ages in 3D apps, and you can either render out images or use the mesh directly in your engine of choice.

Yeah, but as I said, that arrow going both ways isn’t going to be easy to achieve. Blender --> Atomic is fine, but opening the scene in Blender and tweaking it while preserving the changes you made in Atomic is another story.

Well, that’s not 100% true: some apps do support “prefabs”, and at least for meshes, even in Blender you can reuse a single mesh data-block for multiple objects (ALT+D does that when duplicating), and stuff like child/group positioning could easily be identified as duplicates. Prefabs, especially if they include scripting components, are not worth having though imho. Too much hassle.


I never considered Blender industry-standard, more of a hobbyist 3D program. I use 3ds Max and I would call that industry-standard. Even in Max you can duplicate objects, but they are not considered prefabs, just duplicates of an object.
You can make instances of an object in Max, but even an instance is not a prefab, and most game engines would not see it as a prefab if you exported your scene.

I honestly tried Blender a few times and never got on with it. Probably because I have been using 3ds Max for about 10 years now.

Even if Max or Blender were able to create prefabs, you would lose them as soon as you export your scene as an FBX to put into the game engine, unless you built a special plugin for the 3D modelling program.


Yeah, I would say Blender isn’t industry-standard, but at least for game assets it is imho industry-ready. Especially when used in conjunction with something like 3D-Coat, there’s absolutely nothing in terms of game art these two can’t produce at AAA quality and with high productivity.

I believe Max is still the No. 1 industry-standard 3D package for game art, but Maya is surely growing very quickly these days.

Yes, the best way to preserve prefabs would be a plugin, but it’s not that hard to identify duplicates in an FBX file, for example; while maybe not practical, it’s still doable.


I don’t know about Blender, but in 3ds Max you can give objects in the scene names. I am not sure if names are kept when you export to FBX, but you could call an object prefab_door, and then Atomic could recognise that as a prefab.

Maybe you could have a naming convention where the original object is called prefab_door_source, and the copies are called prefab_door. So if an object was called prefab_door, Atomic would take the source object and use that as the reference.

I think this method would work in both Blender and 3ds Max.
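For what it’s worth, resolving that naming convention on the importer side would be simple. This is just a sketch under the convention described above (the `_source` suffix and the `resolve_prefabs` helper are hypothetical, and the `.001`-style suffix handling assumes the numbering Blender and most exporters add to duplicates):

```python
def resolve_prefabs(object_names):
    """Map copies like 'prefab_door' or 'prefab_door.001' to their
    source object 'prefab_door_source', per the naming convention."""
    # First pass: find every declared source object.
    sources = {}
    for name in object_names:
        if name.startswith("prefab_") and name.endswith("_source"):
            # 'prefab_door_source' -> prefab key 'prefab_door'
            sources[name[: -len("_source")]] = name

    # Second pass: match the copies against the sources.
    mapping = {}
    for name in object_names:
        if name.endswith("_source"):
            continue
        # Strip the '.001'-style numeric suffix duplicates often carry.
        base = name.split(".")[0]
        if base in sources:
            mapping[name] = sources[base]
    return mapping
```

So `resolve_prefabs(["prefab_door_source", "prefab_door", "prefab_door.001", "wall"])` would map both door copies to `prefab_door_source` and leave `wall` alone.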

By the way I use 3D coat with max. Sometimes I use mudbox, and sometimes 3D coat, depending on what I need to do.


Why not make a Blender script that reads and writes Atomic scene files?
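That seems feasible on the reading side at least. Assuming Atomic keeps the XML scene serialization it inherits from Urho3D (`<scene>`/`<node>` elements with `<attribute name="..." value="..."/>` children), a Blender add-on could pull node transforms with nothing but the stdlib. The sample document below is hand-written for illustration, not a real Atomic export:

```python
import xml.etree.ElementTree as ET

# Hand-written sample in the Urho3D-style XML scene layout that Atomic
# is assumed to use; a real exported scene would have many more attributes.
SAMPLE = """
<scene id="1">
  <node id="2">
    <attribute name="Name" value="Door" />
    <attribute name="Position" value="1 0 -3" />
  </node>
</scene>
"""

def read_nodes(xml_text):
    """Yield (name, position) pairs for every node in the scene."""
    root = ET.fromstring(xml_text)
    for node in root.iter("node"):
        attrs = {a.get("name"): a.get("value")
                 for a in node.findall("attribute")}
        pos = tuple(float(x) for x in attrs.get("Position", "0 0 0").split())
        yield attrs.get("Name", ""), pos
```

Writing the files back out is where it gets hairy, for the bidirectionality reasons discussed above: the script would have to preserve components and attributes that have no Blender counterpart.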


The problem is that unless you use the same shaders it’s not going to be WYSIWYG, and then there are post effects and such on top of that. It’s certainly doable, since afaik Blender supports GLSL in the BGE rendering mode and transforms and such are pretty much universal, but it’s too much work for arguably little benefit.