
Blender Interactive Mode Outline
Open, NormalPublic

Description

This task aims to outline our vision of how to move from the current implementation of the Blender Game Engine (BGE) to what we call “Interactive Mode”.
It is not to be seen as a design proposal; it only states our vision of how to move forward with the project as a whole.

When discussing the BGE, the possibility of an interactive mode has often come up, so it is best we document what is meant by this.

Current state of BGE

The current code base of the BGE is almost completely isolated from the rest of the Blender code. On the one hand, this allows individual contributors to work more independently, but it also leads to the following issues:

  • The whole animation system is re-implemented in the BGE.
  • It has its own transform system.
  • Rendering has its own implementation as well.
  • It is single-threaded by nature.

The result is that, due to limited developer power, the BGE has been stagnating for years now (with some occasional improvements), not only feature-wise but also in terms of bug fixing.
This is not something we consider healthy for the future of Blender as a whole project, and a solution is needed.

Interactive Mode

Interactive mode is to be seen as a light-weight game engine that is entirely based on core features of Blender itself: the dependency graph, animation system, physics, and draw manager. Some improvements to the animation and event systems would need to be made; for example, the logic system needs to be integrated into the dependency graph.
There are some more in-depth changes that need to be made, for example decoupling the physics clock from the animation clock. But all such changes will also make movie creation in Blender easier.
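To make the clock decoupling concrete, below is a minimal, hypothetical sketch of a fixed-timestep physics loop running independently of the display/animation clock; the function and object names are invented for illustration and do not correspond to existing Blender code.

  # Hypothetical sketch only: a fixed physics clock decoupled from the
  # variable animation/render clock, using the classic accumulator pattern.
  PHYSICS_DT = 1.0 / 60.0  # assumed fixed physics timestep (seconds)

  def run_interactive_loop(world, clock, render_frame, should_stop):
      accumulator = 0.0
      previous = clock()  # clock() returns elapsed seconds
      while not should_stop():
          now = clock()
          accumulator += now - previous
          previous = now

          # Physics catches up in fixed increments, regardless of render speed,
          # which also makes pre-rolls reproducible.
          while accumulator >= PHYSICS_DT:
              world.step_physics(PHYSICS_DT)
              accumulator -= PHYSICS_DT

          # Drawing/animation evaluation uses the wall-clock time directly.
          render_frame(now)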

Downsides of interactive mode compared to the current BGE:

  • We will lose compatibility with older .blend files created for the old BGE. (Full compatibility will be lost anyway, at least due to the new OpenGL requirement.)
  • It may be less attractive for individual contributors that their engine is no longer a sandboxed code environment (where diverging behavior is easier to implement and there are fewer integration issues).

We will have to start over with quite a few areas.

  • At least initially, we won’t have anything as comprehensive as the existing logic bricks, Python API physics access, library reloading, dome viewport... etc.
  • Some operations in Blender are not as fast as in the BGE, adding new objects for example (although this could be resolved).

Benefits for game creators:

  • All improvements to the shader system are immediately available for everyone.
  • The same goes for all performance improvements to animation, object, and scene evaluation.
  • No need to re-implement improvements on both the Blender and BGE sides.
  • Guaranteed multi-threading and thread scalability (something we pay a lot of attention to in the Blender core).

Benefits for movie artists:

  • More sophisticated tricks for animation, where animation can be event-triggered.
  • Independent physics will make it easier to do pre-rolls and to investigate how the world reacts to dynamics without running the full animation system.

Features

Without a final design this is not very detailed; this section was added because there is some confusion as to what interactive mode would include. We would expect the following features to be supported in early versions of interactive mode.

  • Basic visual logic system (logic nodes?).
  • Python scripting.
  • Respond to events (physics collisions, keyboard, pointer); a hypothetical Python sketch follows this list.
  • Export run-time (executable).
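To make the scripting and event items above a little more concrete, here is a purely hypothetical sketch of what responding to events from Python could look like; the module name, decorator, and event attributes are invented, and any real API may look entirely different.

  # Hypothetical API sketch; "interactive_mode", its decorator, and the event
  # fields below are invented for illustration only.
  import interactive_mode as im

  @im.on_event("keyboard")
  def move_player(event):
      # event.key and event.pressed are assumed attributes of a keyboard event.
      if event.key == 'W' and event.pressed:
          im.objects["Player"].location.y += 0.1

  @im.on_event("collision")
  def on_hit(event):
      # React when two objects collide, e.g. trigger an animation or a sound.
      print("collision:", event.object_a.name, event.object_b.name)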

Use Cases / User Stories

Supported

  • TODO.

Out of Scope

  • TODO.

Details

Type
Design

Event Timeline

Campbell Barton (campbellbarton) triaged this task as Normal priority.

Is there a list of example use cases that this would support, and use cases that would be out of scope?

There is some mutual benefit in sharing code, but making the same system support both movies and games inevitably adds a lot of complexity and constraints for both as well. Consider for example WebGL / iOS / Android support, networking for multiplayer games, portal culling, streaming of open world maps, baking and compression of assets. Would we ever support those kinds of features, along with the complexity and maintenance cost for core developers not working on the game engine? Or would we aim for simpler use cases where these are out of scope, and the core design of Blender doesn't need to be altered too much?

My guess is for both high end games on computers and consoles, or low end games on the web or mobile devices, building a game engine on top of the Blender core (or existing BGE) will not get you competitive performance. The strength would be in usability, having a fully integrated environment to build something interactive easily.

If that is what the interactive mode would focus on, how would distributing these games or interactive demos work? There are some cases where distributing an executable is fine, though if the goal is to let artists easily build and publish something interactive then WebGL seems important. If artists want to get their work seen by as many people as possible, downloading an executable adds a high barrier. Porting the entire Blender core to WebGL may be problematic though.

@Brecht Van Lommel (brecht), what you're talking about would position blender as an alternative to unity/godot (which IIRC both have web/console output targets).

AFAICS the purpose of interactive mode is to be a fast prototyping tool. Taking advantage of Blender's existing feature-set (making only minor changes to Blender to support this).
This could also benefit the animation system - for animators to setup reactions to events.

I think a web exporter would be a separate (and larger) project which is more useful for publishing games; OTOH, there are already a few that exist. We could make sure Blender supports this use case rather than developing our own.

Right, interactive mode is not an alternative to Unity or Godot. But I would like to see some example use cases to understand what it is then.

If for example the primary goal would be to make a prototype of a game, but not the actual game to be distributed, that needs to be clear. If the idea would be to add publishing at some undefined later time, to make interactive mode useful to more people, we need to understand if that's actually practical with this design. If game creators need to use other engines for web or other publishing, the benefits of sharing code are not applicable.

Properly communicating the scope of interactive-mode is important.

User stories/use cases are needed (added TODO).

Hi,

  • No backward compatibility for logic scripts
  • No possibility to export games as exe

This will be without me, good luck anyway

BTW, very good idea to put all the development effort into exporters to other engines, where you won't have the same render as in Blender, where you'll lose half of your materials and animation data, logic... Great vision of the future.

There's of course no overlap having exporters to other engines which have their own renderer, animation system, logic... Great design choice.

Also, big thanks to all bge contributors and donators who helped to fund the exporters development.

And thanks for the code quest which exported us to the moon. Wait, I have an idea: why not launch a second code quest to fund great exporters to maya, 3DSMAX, unreal, unity, sketchup... This way, we could reduce the codebase.

I also have an idea to ensure the users will be happy: don't have any users. I'm quite sure it could work.

@Campbell Barton (campbellbarton): I think that the definition of "fast prototyping tool" should be spelled out in more detail, as I have never understood the difference between a true game engine and a fast game prototyping tool. Maybe,

fast prototyping tool = game engine - publishing options (executable, webgl, etc)

or are there other differences?

I won't leave upbge to die, I already forked it and will be gathering funds to pay people to upgrade and maintain it.

@Campbell Barton (campbellbarton): I think that the definition of "fast prototyping tool" should be spelled out in more detail, as I have never understood the difference between a true game engine and a fast game prototyping tool. Maybe,

fast prototyping tool = game engine - publishing options (executable, webgl, etc)

or are there other differences?

The only other difference is that we accept the constraint that the primary purpose of Blender is 3D animation software.
In practice this means game data will simply be blend-file data *. Over time this may be less of a limitation - there is a lot of overlap. But there are limits to how much we want to change blender's internals to support interactive-mode.

* Initially at least; we may have a way to populate the draw-manager from different sources later, since there are plans to do this for Alembic support.

Hi,

  • No backward compatibility for logic scripts
  • No possibility to export games as exe

No backwards compatibility is correct, although we would lose backwards compatibility for display with the move to eevee anyway.

Exporting games as an executable can be supported.

Ok, sorry about that. Paranoia crisis. Better for me to keep my distance for a while if I can. Sorry about all of that.

Sorry if I am just adding noise here but as a BGE user I have some questions/concerns as well.

  1. I assume yes, but will this open up all bpy functions to the interactivity engine?
  2. This question depends on whether or not the interactivity engine gets an exporter, but... will there be support for Android etc., or just the primary OSes (Windows, Linux, Mac)?

I've never contributed any code, but as a user I can say that the plan sounds good. I see a lot of positive things coming out of it... The only real concerns I have are:

  1. Licensing
  2. Exporting a package or executable.

Will the new animation system support keyframing to add and end objects?


To be honest, I can see where this suggestion is coming from; my only issue is that I just can't let go of the logic brick system. I can code and I love it, but I love the logic brick system just as much, and I'm all for more shader functionality and all, but I just can't sacrifice the logic brick system.

Currently this "interactive mode" sounds more like something for short-film animators than for game creators. It feels like interactive mode doesn't really care about interactive design, and looks more like giving film makers/animators more, rather than providing something that actual interactive creators (aka game creators) can better work with.
Not that that's a bad thing, but replacing the BGE with this? No, I wouldn't like that.

I think this is a good initiative. BGE has all sorts of quirks and inconsistencies, and a (well) redesigned system could work wonders. Logic bricks themselves make poor use of screen real-estate and are difficult to debug.
The biggest hurdle for BGE throughout its lifetime has been the licensing issue, with any exported file being bound to the GPL, and this needs to be addressed early on without "hacky" workarounds.
A big hole in this proposal however is the dismissal of performance. As impressive as 3fps in EEVEE is for the average blender user, it is unacceptable in the "interactive/gaming" domain. To maintain good performance, many of the heavy integrated features will need to be stripped back to their bare-bones, and even then I would imagine there is a significant overhead (with performance and file-size), simply due to the tight integration with blender. So far armory3d looks the most promising implementation of this, however from my understanding it only uses blender code as an interface and not a foundation.
This is a challenging problem and I look forward to what solutions the BF can come up with.

As a side note, for people disappointed that BGE is being replaced: You can still use BGE in older versions of blender that include it. I would imagine 99% of the features in 2.8 are unusable in the BGE anyway, so really it shouldn't make much of a difference if you stick with 2.79

Added a "Features" section since there is some confusion about what interactive mode will be capable of.

@justin barrett (justinbarrett) - yes, bpy api will likely be available (even if there is a dedicated module for interactive mode).

If Blender is ported to a system, it can have an exporter.

@Jacob Merrill (blueprintrandom) - add/remove objects is something that may not be well supported by the initial version (using bpy works, but is slow). Possibly we can instance objects directly in the draw manager, bypassing blend-file storage.
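To illustrate the "using bpy works but is slow" remark, here is a small example of adding and removing an object through the regular bpy data API; every such change allocates blend-file IDs and triggers dependency-graph updates, which is what makes it expensive inside a per-frame game loop (this is ordinary bpy usage, not a new interactive-mode API).

  import bpy

  # Adding an object via blend-file data: works today, but is relatively slow,
  # since it creates IDs and forces dependency-graph/notifier updates.
  mesh = bpy.data.meshes.new("BulletMesh")
  obj = bpy.data.objects.new("Bullet", mesh)
  bpy.context.collection.objects.link(obj)

  # ... use the object during the interactive session ...

  # Removal also goes through ID management.
  bpy.data.objects.remove(obj, do_unlink=True)
  bpy.data.meshes.remove(mesh)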

@Deniz (Deniz) - using logic nodes (which can generate code, for example, even integrating generated and user-defined code) has potential. I don't see why this is necessarily something only for non-technical people.
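As a rough, hypothetical illustration of logic nodes that generate code (and mix generated with user-defined code), the snippet below compiles a tiny invented node description into Python source; the node format and the actor methods are placeholders only.

  # Hypothetical node descriptions; the format is invented for illustration.
  nodes = [
      {"type": "on_key", "key": "SPACE", "action": "jump"},
      {"type": "on_collision", "with": "Enemy", "action": "take_damage"},
  ]

  def generate_handler(nodes):
      """Turn the node list into Python source for an event handler."""
      lines = ["def handle_event(event, actor):"]
      for node in nodes:
          if node["type"] == "on_key":
              lines.append(f"    if event.type == 'KEY' and event.key == '{node['key']}':")
          elif node["type"] == "on_collision":
              lines.append(f"    if event.type == 'COLLISION' and event.other == '{node['with']}':")
          lines.append(f"        actor.{node['action']}()  # user-defined method")
      return "\n".join(lines)

  print(generate_handler(nodes))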

@Tim (thatimster) - GPL does not apply to assets you distribute. Further, I would rather not get into licensing in this task.

EEVEE has not been optimized yet, poor performance is a known issue.

The keyframing of added objects was actually meant to go the other way: to save things like adding explosions or bullets to the scene, etc.

I think one day soon we will be 'playing' games to record performances to render out as movies,

(with xbox360 controller + face mocap rigs etc)

In the BGE right now I can store

  (timestamp): [scenedata]

without any real additional cost per frame to add keys.

At the end, we can sort this data and insert it into the scene graph...
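A hedged sketch of the recording idea described above, using only the regular bpy animation API (the capture/bake function names are invented; nothing like this exists as a built-in feature): samples are stored per timestamp during the run, then sorted and written back as keyframes.

  import bpy

  recorded = {}  # {timestamp in seconds: {object name: location tuple}}

  def capture(timestamp):
      """Snapshot object locations for one timestamp during the run."""
      recorded[timestamp] = {
          obj.name: tuple(obj.location) for obj in bpy.context.scene.objects
      }

  def bake_to_keyframes(fps=24):
      """Afterwards, sort the samples and insert them as regular keyframes."""
      for timestamp in sorted(recorded):
          frame = int(round(timestamp * fps))
          for name, location in recorded[timestamp].items():
              obj = bpy.data.objects.get(name)
              if obj is not None:
                  obj.location = location
                  obj.keyframe_insert(data_path="location", frame=frame)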

we can drive EEVEE with bge :|

we don't need bells and whistles for animation, we need fluid rock solid performance.

@Jacob Merrill (blueprintrandom) - since we don't have a detailed design, asking about how some combinations of features might work together isn't that useful.

Said differently, the answer to most detailed questions is "Nobody knows".

Other than to say that what's possible in Blender will be possible in interactive-mode; further, this project is only an early concept.

Oh, that's already great as a basis. It seems I'm really, really sick, and I really apologize.

Hello, I want to add something:

The focus in this discussion has been on games and game developers, but one of the greatest advantages of the old BGE was for pre-visualization. For example, architectural renders that could be explored in first person and interacted with. Using Eevee for an architectural walkthrough would be a godsend.

If you want PBR materials or "special" shaders like refraction and SSS, you can't use the built-in BGE. I've seen artists use Unity instead, there's UpBGE and others, and on the FOSS side of things we have Godot. Eevee could provide artists with these materials, but since there are so many well-established options, I think the most important question to answer is "Why do we need Interaction Mode?". Convenience for the artist seems like the obvious answer, but does it give artists something that they need and can't get elsewhere? Eevee, like Cycles, could be integrated in other software environments. What kind of problems can Interaction Mode solve that no one else is solving?

I think that Interaction Mode should be tailored to working artists: an animator will probably use Interaction Mode in their own workflow, whereas a freelancer uses it to communicate with a client. Finally, a game developer is creating an experience for their users. For the animator, Interaction Mode helps create the deliverable; for the freelancer, it helps demonstrate the deliverable; but to the game developer, it IS the deliverable. Interaction Mode should be designed in view of the deliverable it is intended to create.

I worry that the developers feel obligated to create a new Interaction Mode because they are removing BGE. The goal of Interaction Mode shouldn't be to update for the sake of updating. BGE was created at a time when FOSS wasn't an option for game developers, but Godot has filled that vacancy. Maybe it's worth considering a live-link between Blender and Godot (and others, if they are interested) for dedicated game developers. Interaction Mode will flounder much like BGE if it doesn't fill a specific need that isn't filled elsewhere.

I want to make it clear that I'm not saying "We don't need Interaction Mode" or "Just use Unity instead." Interaction Mode is a great idea, and one that can play to Blender's unique strengths in a way that supports artists in their daily work- animators, designers, architects, freelancers, fine artists, game developers, and more.

@justin barrett (justinbarrett), I've removed your post as you requested, but the post above yours is on topic. If interaction mode will be added and supported for a long time, it's important to consider the specific use cases and evaluate if and how it can seriously support them.

I misread some items in the post... (from Joseph Burgden, hope I spelled his name correctly).
So apologies, etc. :)

I think he has a good point about bridging the gap a bit with Godot though; while not the same license, they tend to benefit each other greatly in game development... but that is a completely different piece of software, so? As far as interactive mode goes, I am feeling this is going to be primarily aimed at animation only... AFAIK Blender's intent was always to be a full 3D application...
And animation covers a lot of things... it's primarily a game engine... we can refer to it as "interactive filmography" or whatever, but almost all the same features would apply...
particles, adding, removing, hiding, parenting, input, etc... I honestly cannot see a use case where it is not the same as a game engine, aside from frame rate considerations...

If you are presenting some visualization to a client, it is far easier to email them an executable than to expect them to have and run Blender to view it. Even if "Interactive Mode" is not a game engine, it would still be prudent to plan now to have it export to an executable... a new format like .bex or something.

I hope this makes sense... while reading through all this again I see that this seems to be an afterthought... but it might be more important than thought.

If the BF had actually paid to develop the code that UPBGE refactored, it would have been light years ahead of Godot.

they removed all the super old immediate mode calls,
and refactored massive chunks of code / removed bugs and added functionality.

as far as I can tell, the Bf's drawing of the interface was holding them back from upgrading the glsl*

it's a big kick in the teeth to the community,
and no we are not 'a small but dedicated group'

we are huge and we make hobby games, commercial projects, and simulations for schools or scientific demonstrations.

I was able to easily route the Principled shader GLSL variant into the game engine (this would be a very easy interface for PBR);
then we can just torch the old material system, and 'catch' the use of the node by the converter and override it with our own code (precomputed lightfields).

As much as I love the BGE for its quirks and the fact that Blender had a game engine, I believe 'Interactive Mode' is a logical step in the right direction for Blender. IMHO it will be one of the main Blender 2.8x features in the long run.
One thing Blender Interactive needs is native VR support. Especially now that VR is going mainstream, Blender will be a killer app for game content creators. For that, a stable framerate of 45-90 fps is a must. Eevee currently favors quality over performance, so the renderer would have to be optimized for an "eevee interactive" preset. Probably with upcoming hardware support for real-time raytracing (NVIDIA RTX & Radeon ProRender) it should be possible without sacrificing too much quality.

P.S. I once tried 3D modelling in Blender with a VR headset (using Dalai's old VR build) and it was awesome. I could also imagine sculpting and texture painting in VR. I know the mentioned features are low priority, but once you have tried creating content in virtual 3D space (e.g. Quill, Oculus Medium), one can see a huge potential there for newcomers.

Ulysse Martin (youle) added a comment. Edited May 5 2018, 1:08 PM

Godot, Armory, Unity, and Unreal are better for game making, as they are tools dedicated only to game creation and delivery.
Godot is better for 2D, and its license is more suitable for those who want to protect their code and sell games. Armory is better for those who want to sell games. Unity and Unreal are better for professionals who want to sell games.

So if you want to create games to sell, Blender can be used for asset production, but you'll need to switch to another engine if you want to benefit from a more suitable license and better delivery options. I read somewhere that part of Godot's success was because casinos used it to create their casino games (the GPL is not suitable in this case). The big two, Unity + Unreal, are tools for serious/professional game makers (Blender can't compete with them for game making/delivery purposes).

Blender is better than all of these engines for its workflow (P key, no import/export, exactly the same render in the viewport as at runtime, the whole creation process in one piece of software). For me the Interactive concept covers not only game creation; it covers interactivity in every Blender area. It can be used by some 3D professionals for advertising/demonstration/previsualization purposes. But it could also be used by hobbyist game creators (the ones who don't expect to sell something).

I agree that "interactive mode "doesn't give" game artists something that they can't get elsewhere", except that Blender, because of its workflow, seems better than other engines in many cases.

For many professionals, it can be a great tool. For professional game makers, it can be a tool for asset production, but it does not seem a suitable solution, since the GPL is not well suited, delivery options are missing, and it is not a tool dedicated to games.

But for Blender, it is a great advantage because:

  1. It can offer many possibilities to many different artists (so it can be interesting for working artists).
  2. And it can attract hobbyists, who will either switch to a "game engine" if their aim is to create and sell games, or stay in Blender, and a few may begin to get involved in Blender development.

So I agree with Martins Upitis that in the long term it can become a really important side of Blender, and I'm quite sure that devs will enjoy coding it.

(all right? I wasn't too aggressive this time :P ?)

Not sure if it makes sense to state opinions regarding the use cases, but there are some comments that encourage it ... so here goes nothing.

I very much want to second the sentiment:

...
There is some mutual benefit in sharing code, but making the same system support both movies and games inevitably adds a lot of complexity and constraints for both as well. Consider for example WebGL / iOS / Android support, networking for multiplayer games, portal culling, streaming of open world maps, baking and compression of assets. Would we ever support those kinds of features, along with the complexity and maintenance cost for core developers not working on the game engine? Or would we aim for simpler use cases where these are out of scope, and the core design of Blender doesn't need to be altered too much?

My guess is for both high end games on computers and consoles, or low end games on the web or mobile devices, building a game engine on top of the Blender core (or existing BGE) will not get you competitive performance. The strength would be in usability, having a fully integrated environment to build something interactive easily.

If that is what the interactive mode would focus on, how would distributing these games or interactive demos work? There are some cases where distributing an executable is fine, though if the goal is to let artists easily build and publish something interactive then WebGL seems important. If artists want to get their work seen by as many people as possible, downloading an executable adds a high barrier. Porting the entire Blender core to WebGL may be problematic though.

I see very little appeal in something that is confined to Blender. On the other hand, a concept for a Blender Interaction Runtime that people can use in different kinds of projects has a lot of potential ... even if it seems redundant at first.

What I would love to see is a specification for a Runtime Environment that takes its cues from modern Web Components (a small illustrative sketch follows the list below):

  • Components keep their own state
  • Components can have child components (and influence their state)
  • There is exactly one root component (the entry point) for an "Interaction Project"
  • Components can be reused and shared
  • So-called Stores are used for global, cross-cutting state properties; components can listen (and respond) to changes
  • Additional global modules like an event bus and a scene manager could also be used for global state management
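A minimal, purely illustrative Python sketch of the component/store model described in the list above; the class names and methods are assumptions, not a proposed Blender API.

  # Illustrative only: components keep their own state, may own children,
  # and a Store holds global state they can listen to.
  class Store:
      def __init__(self):
          self._state = {}
          self._listeners = []

      def listen(self, callback):
          self._listeners.append(callback)

      def set(self, key, value):
          self._state[key] = value
          for callback in self._listeners:
              callback(key, value)

  class Component:
      def __init__(self, name, store):
          self.name = name
          self.state = {}
          self.children = []
          store.listen(self.on_store_change)

      def add_child(self, child):
          self.children.append(child)

      def on_store_change(self, key, value):
          pass  # respond to global state changes if needed

      def update(self, dt):
          for child in self.children:
              child.update(dt)

  # Exactly one root component is the entry point of an "Interaction Project".
  store = Store()
  root = Component("Root", store)
  root.add_child(Component("Player", store))
  root.update(1.0 / 60.0)
  store.set("score", 10)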

Vue.js has an interesting approach to components: Single File Components. In Blender, components could be SDNA files that are not part of the main Blender SDNA, but are managed like images.

That way, embedding a C/C++ implementation of the Interaction Runtime could have a small impact on the main Blender features. Blender's main SDNA just needs to store Interaction Entries like Scenes.
To share an Interaction Project you can - but do not have to - include a Runtime Environment.
Making sure regular Blender Animations / Stills can benefit from this Interaction Runtime does not seem impossible either. There could be "baking" functionality that renders scripted executions as (animated?) Scenes.

I guess what I am trying to say is that the goal should be getting developers interested in working on a more ambitious project. Getting better in that department is something that would be awesome in general.
A design goal would be making sure low level functionality allows the creation of Emergent Features. The Runtime Environment could be designed to understand SDNA Catalogs and allow read access to values, so that new Blender Features can often be used in the Components without any code changes.