Yes, this is something we discussed extensively on Gitter. I really like Blender, and having used the BGE in the past, it was very nice to be able to "see what you'll get" and edit meshes and animations seamlessly in the same program. Unfortunately, BGE isn't great. The problem with integrating an external editor is that rendering is hit-or-miss: even applications that support exporting models to Unity with the PBR workflow sometimes show serious visual differences, so I assume it's hard as heck to make it work flawlessly. Beyond the visual part, managing the objects would be a pain: what if you open your scene in Atomic, add components to objects, but then have to go back to Blender to change something? A bidirectional approach, while possible in theory, would be extremely difficult to achieve. Blender's scripting API also isn't very reliable; plugins break quite often, so maintaining anything substantial is going to be a pain.
That being said, I'm thinking about implementing a simplified, "somewhat unidirectional" solution for a future project, which basically manages the Blender scene externally. The idea is to have an absolutely minimal Blender script that just adds unique identifiers to objects (possibly in their names), and we would read the scene externally to keep track of the changes we make in the Atomic editor, which would consist mostly of adding components that don't exist in Blender. Blender does support GLSL, so we could generate shaders externally and insert them into the Blender scene to keep rendering somewhat consistent.
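To make the ID-tagging idea concrete, here's a minimal sketch in plain Python (so it runs outside Blender too). The `#uid:` name suffix format and the helper names are just my assumptions, not anything final; inside Blender the same helpers would run over `bpy.data.objects`, tagging each object's name once:

```python
import uuid

# Hypothetical tag format: append "#uid:<short-id>" to the object's name so an
# external tool (e.g. the Atomic editor) can track the object across re-exports.
ID_SEP = "#uid:"

def tag_name(name):
    """Return the name with a unique ID appended, unless it already has one."""
    if ID_SEP in name:
        return name  # already tagged; keep the existing ID stable
    return name + ID_SEP + uuid.uuid4().hex[:8]

def get_uid(name):
    """Extract the unique ID from a tagged name, or None if it isn't tagged."""
    if ID_SEP in name:
        return name.rsplit(ID_SEP, 1)[1]
    return None

# Inside Blender, the minimal script would be something like:
#   import bpy
#   for obj in bpy.data.objects:
#       obj.name = tag_name(obj.name)
```

The important property is that re-running the script never changes an existing ID, so the external tracker can match objects between exports even if the artist renames or moves things around.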
I don't know… I honestly don't see it being used much in the industry. And it's not only anecdotal experience: I've talked to people, and whenever I see a 'making of' for some game I pay attention to the tools, and in the AAA industry I've yet to see anyone using Blender. The one kinda big recent release I've seen is NMS, but besides that, Blender is pretty much an indie tool. I'm not saying that in a bad way; I'm indie4life and don't like most big studios. But the fact is, Blender is far from industry-standard.
Nothing keeps you from using 3D apps and rigged meshes to produce 'skeletal' animations for 2D games. While not ideal, 3D packages have been doing skeletal animation for decades now; it's just a bunch of vertices tied to bones. I personally think these specialized tools are mostly targeted at 2D artists with no 3D experience. I mean, I never understood all the hype around 2D skeletal anims; I've been doing that since the stone ages in 3D apps, and you can either render out images or use the mesh directly in your engine of choice.
Yeah, but as I said, making that arrow go both ways isn't going to be easy. Blender --> Atomic is fine, but opening the scene in Blender and tweaking it while preserving the changes you made in Atomic is the hard part.
Well, that's not 100% true: some apps do support "prefabs", and at least for meshes, even in Blender you can reuse a single mesh datablock across multiple objects (ALT+D does that when duplicating), and things like child/group positioning could easily be identified as duplicates. Prefabs aren't worth having though, imho, especially if they include scripting components. Too much hassle.
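An external tool could detect those linked duplicates the same way Blender stores them: two objects pointing at the same mesh datablock. A rough sketch of the grouping step, using plain tuples to stand in for objects (the scene data and names here are made up):

```python
def group_by_mesh(objects):
    """Group object names by the mesh datablock they reference.

    `objects` is a list of (object_name, mesh_id) pairs; in Blender proper
    you'd read obj.data.name for each obj in bpy.data.objects instead.
    """
    groups = {}
    for name, mesh_id in objects:
        groups.setdefault(mesh_id, []).append(name)
    return groups

# Hypothetical scene: two ALT+D linked duplicates sharing "CrateMesh".
scene = [("Crate", "CrateMesh"), ("Crate.001", "CrateMesh"), ("Rock", "RockMesh")]
shared = {m: objs for m, objs in group_by_mesh(scene).items() if len(objs) > 1}
# Any mesh in `shared` only needs to be exported once and instanced per object.
```

That's essentially a poor man's prefab for mesh data, which is the part I'd consider worth supporting.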