Fast highpoly mesh editing #73360

Open
opened 2020-01-24 15:16:12 +01:00 by Dalai Felinto · 114 comments

Status: Needs work plan approval; all tasks are listed already. Could use a design / engineer plan and ideally a commissioner (though if the scope here is purely technical, having some artists test the results may be enough).


Team
Commissioner: ?
Project leader: @ideasman42
Project members: @fclem

Description
Big picture: 2.7x level of performance for mesh editing.

Use cases:

  • Editing high poly objects with good performance without modifiers.
  • Sample file with ~3M polygons available on request; the file is ~300MB.

Design: #74186 (Proposal for fast high poly mesh editing)

Engineer plan: ?

Work plan

Milestone 1
Time estimate: ?

  • Skip EditMesh to Mesh Conversion D7169 (committed)

    • Create API function to lazily initialize an edit-mesh object's evaluation mesh #64262
    • Lazily initialize by disabling edit-mesh to mesh conversion in the modifier stack, calling lazy initialization where needed.
    • Replace lazy initialization with code that uses the edit-mesh, where practical. (partially done)
    • Evaluate production scenes (yet to be defined) to ensure that conversion isn't required for common editing operations.
    • Draw manager support for deformed edit-meshes, so an edit-mesh conversion isn't required when drawing an edit-mesh's deformed cage (similar to 2.7x support).
  • Minimize GPU Transfer Overhead

    • Create API functions for tagging edit-mesh changes (an illustrative sketch follows this milestone's task list).

    • Limit updates to: selection data, active element.

    • Edge flags: sharpness, seam.

    • Limit updates to: deformations.

      • Coordinates, Face centers & normals.
      • Custom normals.
      • UV-tangents.
    • Limit updates to: custom-data layers.

      • UV coordinates.
      • UV selection.
      • Vertex color.
      • Vertex group color.
      • Custom normals.
    • Limit updates to: mesh properties.

      • Edge crease.
      • Edge flags (sharp, seam).
      • Face flags (freestyle).
  • Multi-Thread bmesh to mesh conversion.

    • For evaluation mesh (uses simplified logic). D6947
    • For the full bmesh to mesh conversion (uses more complex logic).
  • Optimize edit-mesh GPU updating code

    Benchmark production files and investigate whether the current code has room for improvement, making the necessary optimizations.
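
To make the change-tagging item above concrete (see "Create API functions for tagging edit-mesh changes"), here is a minimal, standalone C sketch of the idea. Every name in it (`eEditMeshGPUDirty`, `editmesh_gpu_tag`, `editmesh_gpu_flush`) is hypothetical and invented for illustration; this is not Blender's actual API. The point is only that operators would tag what changed, and the draw manager would re-upload just those buffers.

```c
/* Hypothetical sketch of a change-tagging API for edit-mesh GPU updates.
 * None of these names exist in Blender; they only illustrate limiting GPU
 * uploads to the data that actually changed. */
#include <stdio.h>

typedef enum eEditMeshGPUDirty {
  GPU_DIRTY_SELECTION   = (1 << 0), /* selection data, active element */
  GPU_DIRTY_DEFORM      = (1 << 1), /* coordinates, face centers & normals */
  GPU_DIRTY_CUSTOM_DATA = (1 << 2), /* UVs, vertex colors, custom normals */
  GPU_DIRTY_MESH_PROPS  = (1 << 3), /* creases, edge/face flags */
} eEditMeshGPUDirty;

typedef struct EditMeshGPUCache {
  unsigned int dirty; /* bitmask of eEditMeshGPUDirty */
} EditMeshGPUCache;

/* Operators call this instead of invalidating the whole cache. */
static void editmesh_gpu_tag(EditMeshGPUCache *cache, eEditMeshGPUDirty flag)
{
  cache->dirty |= flag;
}

/* The draw manager re-uploads only the buffers whose flag is set. */
static void editmesh_gpu_flush(EditMeshGPUCache *cache)
{
  if (cache->dirty & GPU_DIRTY_SELECTION) {
    printf("upload: selection buffers\n");
  }
  if (cache->dirty & GPU_DIRTY_DEFORM) {
    printf("upload: positions, face centers, normals\n");
  }
  if (cache->dirty & GPU_DIRTY_CUSTOM_DATA) {
    printf("upload: UV / color custom-data layers\n");
  }
  if (cache->dirty & GPU_DIRTY_MESH_PROPS) {
    printf("upload: crease and flag buffers\n");
  }
  cache->dirty = 0;
}

int main(void)
{
  EditMeshGPUCache cache = {0};
  /* e.g. a transform only deforms geometry; selection buffers stay as-is. */
  editmesh_gpu_tag(&cache, GPU_DIRTY_DEFORM);
  editmesh_gpu_flush(&cache);
  return 0;
}
```

A transform, for example, would tag only the deform flag, so selection, custom-data and property buffers remain untouched on the GPU.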

Milestone 2
Time estimate: ?

  • Partial Geometry Updates

    • Support updating only modified geometry when transforming vertex coordinates (see the sketch after this milestone's list).

      • Coordinates.
      • Normals.
      • Custom Normals (note: need to investigate feasibility).
      • UV-tangents (note: need to investigate feasibility).
    • Support partial updates for custom data layers.

      • UV mapping.

        • Mesh UV coordinates.
        • Mesh UV-tangents.
        • UV editor coordinate display.

      Note: we may run into diminishing returns with partial updates for UV/custom-data layers, since updating only a single layer already gives significant benefits. I'd suggest supporting this only if it doesn't add a lot of code complexity.

  • Multi-object edit-mode

    • Undo optimization: skip mesh conversion for meshes which aren't modified.
  • BMesh support for the modifier stack.
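
As a rough illustration of the partial-geometry-update item above, the sketch below re-uploads only the contiguous vertex range a transform touched instead of the whole position buffer. It is plain standalone C with invented names (`PositionBuffer`, `positions_update_partial`); a real implementation would write into a mapped GPU buffer and handle non-contiguous selections, which is skipped here.

```c
/* Hypothetical sketch of a partial position update: only the vertices the
 * tool modified are copied, not the whole buffer. Names are illustrative. */
#include <stdio.h>
#include <string.h>

typedef struct VertRange { int start, count; } VertRange;

typedef struct PositionBuffer {
  float (*gpu_co)[3]; /* stands in for a mapped GPU vertex buffer */
  int verts_num;
} PositionBuffer;

/* Copy only the modified span instead of all verts_num vertices. */
static void positions_update_partial(PositionBuffer *buf,
                                     float (*src_co)[3],
                                     VertRange range)
{
  memcpy(buf->gpu_co[range.start], src_co[range.start],
         sizeof(float[3]) * range.count);
  printf("re-uploaded %d of %d vertices\n", range.count, buf->verts_num);
}

int main(void)
{
  static float src[1000][3]; /* edit-mesh coordinates (zero-initialized) */
  static float dst[1000][3]; /* "GPU" copy */
  PositionBuffer buf = {dst, 1000};

  /* A transform tool would record which vertex range it actually moved. */
  VertRange modified = {250, 40};
  positions_update_partial(&buf, src, modified);
  return 0;
}
```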

Milestone 3 - OpenSubdiv
Fast mesh OpenSubdiv viewport implementation.
Time estimate: ?

Unknown Milestone

  • Multi-thread edit-mesh to GPU data uploading (see the sketch below).
  • Improve geometry buffer configuration (data layout used by the draw manager).
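
For the multi-threaded upload item above, the general shape would be to split the geometry into contiguous ranges and let each thread fill its slice of the vertex buffer. The following is a minimal pthreads sketch under that assumption; it is illustrative only and does not use Blender's own task scheduler or GPU buffer types.

```c
/* Minimal sketch (not Blender code) of filling a vertex buffer in parallel:
 * each thread converts one contiguous range of vertices. */
#include <pthread.h>
#include <stdio.h>

#define VERTS_NUM 1000000
#define THREADS_NUM 4

static float positions[VERTS_NUM][3];

typedef struct Range { int start, end; } Range;

static void *fill_range(void *arg)
{
  const Range *range = arg;
  for (int i = range->start; i < range->end; i++) {
    /* Stand-in for "convert vertex i into the GPU vertex format". */
    positions[i][0] = (float)i;
    positions[i][1] = 0.0f;
    positions[i][2] = 0.0f;
  }
  return NULL;
}

int main(void)
{
  pthread_t threads[THREADS_NUM];
  Range ranges[THREADS_NUM];
  const int chunk = VERTS_NUM / THREADS_NUM;

  for (int t = 0; t < THREADS_NUM; t++) {
    ranges[t].start = t * chunk;
    ranges[t].end = (t == THREADS_NUM - 1) ? VERTS_NUM : (t + 1) * chunk;
    pthread_create(&threads[t], NULL, fill_range, &ranges[t]);
  }
  for (int t = 0; t < THREADS_NUM; t++) {
    pthread_join(&threads[t], NULL);
  }
  printf("filled %d vertices across %d threads\n", VERTS_NUM, THREADS_NUM);
  return 0;
}
```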

Notes: Optimizing modifiers is out of scope.


Relevant links:

  • #57936 (Mesh modeling performance regressions)
---

Relevant links:

  • #57936 (Mesh modeling performance regressions)
Campbell Barton was assigned by Dalai Felinto 2020-01-24 15:16:12 +01:00
Added subscribers: @fclem, @ideasman42, @dfelinto

#86341 was marked as duplicate of this issue

#68886 was marked as duplicate of this issue

Added subscriber: @kookitoo

This comment was removed by @kookitoo

Added subscriber: @filibis

Added subscriber: @LazyDodo

Added subscriber: @Teds

Added subscriber: @Lumpengnom-3

Added subscriber: @CobraA

2.8x could and should go beyond 2.7x in highpoly editing, especially now that we can edit multiple objects at the same time.
I hope things go well for this project.

Added subscriber: @Ztreem

Added subscriber: @AditiaA.Pratama

Added subscriber: @Pipeliner

Added subscriber: @AlexanderKrause

Overall performance should be the highest priority right now.

Everything is slower than 2.79: mesh editing, texture painting, weight painting and sculpting are all slower on the same PC where 2.79 is faster.
Maybe the Vulkan update will help?

There's a serious delay when selecting vertices. Currently it's a test of patience.

Added subscriber: @AlbertoVelazquez

Added subscriber: @Kjell-1

I had a forum discussion recently where I put together a test case to highlight the lacking performance. For your consideration: video https://www.youtube.com/watch?v=CegHqsEtTgQ, test asset http://artbykjell.com/temp/Z3_500k.zip (free model used).

Added subscriber: @ManuelAlbert

Added subscribers: @ArtemBataev, @brecht, @Ace_Dragon, @1D_Inc, @zebus3dream, @cheteron, @realeyez, @RobertS, @maxivazquez, @hadrien, @xdanic, @FrancoisRimasson, @VertexPainter

Changed status from 'Needs Triage' to 'Confirmed'

> In #73360#865015, @Kjell-1 wrote:
> I had a forum discussion recently where I put together a test case to highlight the lacking performance. For your consideration: video https://www.youtube.com/watch?v=CegHqsEtTgQ, test asset http://artbykjell.com/temp/Z3_500k.zip (free model used).

There have been many comparisons already (https://youtu.be/4VmTBCFZE6Q); it looks like a performance vs. beauty issue. It is also interesting that Blender has a fast Sculpt mode but a slow Edit mode, so the problem is in the mode.

Added subscriber: @ckohl_art

> Editing high poly objects with good performance without modifiers.

What is a reasonable definition of "high poly" for the purpose of this task? 200,000 triangles? 500k? One million? Five million?

> Need test cases

If you're looking for test cases, let's assume that this photogrammetry object (exported from Meshroom) fits the definition of "high poly" established above:
tree.png (https://archive.blender.org/developer/F8346205/tree.png)

Cut the tree trunk out of the rest of the scene. This could be done either by box/lasso selecting all the unwanted faces and deleting them, or with the Bisect tool with "Clear Inner" or "Clear Outer" enabled.

> In #73360#874734, @ckohl_art wrote:
> What is a reasonable definition of "high poly" for the purpose of this task? 200,000 triangles? 500k? One million? Five million?
> If you're looking for test cases, let's assume that this photogrammetry object (exported from Meshroom) fits the definition of "high poly" established above.

Suzanne with subdivision level 5 applied has a million tris.
Suzanne with subdivision level 6 has 2 million tris, like the dragon from the video.

A good example to start from.

Added subscriber: @Alrob

Don't we have any other objects to test, or what? Why Suzanne?
You guys have an in-house studio and there are plenty of high-poly production meshes. And it's almost a month since this task was created, but still no plan for how to tackle things.

Added subscriber: @ostry

> In #73360#876657, @Alrob wrote:
> Don't we have any other objects to test, or what? Why Suzanne?
> You guys have an in-house studio and there are plenty of high-poly production meshes. And it's almost a month since this task was created, but still no plan for how to tackle things.

Because subdiv lags more than highpoly without modifiers.

No.
Live subdivision is a completely different issue.
An applied subdiv is "converted to mesh", i.e. generated static highpoly geometry.
I mean that it is easy to get a highpoly mesh for basic edit-mode testing.

Added subscriber: @Schiette

Author
Owner

As for everyone pitching in with suggestions for test cases: what is needed are "production" .blend files that were performing better in 2.79 than in 2.80 as far as highpoly mesh editing goes. A cube with subdivision is not representative of production assets.

Added subscriber: @SamGreen

We have photoscans of historic restorations with several million verts, and single-mesh houses in classical style, but they are under NDA, sorry.
Anyway, in most cases it is just a "high-density mesh with a material and smooth/autosmooth", which makes the subdivided monkey (applied subdiv, UV-unwrapped mesh with several islands) a still-relevant example.

Added subscriber: @VladimirTurcan

Author
Owner

Already got a good public-domain sample file for the initial work, thanks.

Added subscriber: @MichaelWeisheim

I think this plan is fine overall. Just two points I would like to stress (which may be obvious):

  • Don't hesitate to do refactoring and renaming where needed, also of adjacent code, to make design changes more clear. There is obscure logic spread out through the code to deal with mesh vs. editmesh vs. cages and evaluated vs. original, unifying that in functions with clear names will help write correct code in the future.
  • Feel free to skip bullet points if they don't have a significant impact in profiling. We can try to optimize in every little corner, but in the end it's about what has the biggest impact.
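
As a purely illustrative take on the "unify that in functions with clear names" suggestion above, the snippet below shows a single accessor that resolves the edit-mesh / evaluated-cage / evaluated-mesh ambiguity in one place. The struct layouts and the function name are simplified stand-ins invented for the example, not Blender's real data structures.

```c
/* Illustrative only: one accessor that hides whether data comes from the
 * edit-mesh's evaluated cage, its evaluated final mesh, or the object's
 * evaluated mesh. All types here are stubs. */
#include <stdio.h>

typedef struct Mesh { const char *which; } Mesh;
typedef struct EditMeshStub { Mesh *eval_cage, *eval_final; } EditMeshStub;
typedef struct ObjectStub { EditMeshStub *edit_mesh; Mesh *eval_mesh; } ObjectStub;

/* One place that answers "which mesh should I read from?", instead of each
 * caller re-implementing the edit-mesh / cage / evaluated checks. */
static Mesh *object_get_mesh_for_read(ObjectStub *ob, int want_cage)
{
  if (ob->edit_mesh) {
    return want_cage ? ob->edit_mesh->eval_cage : ob->edit_mesh->eval_final;
  }
  return ob->eval_mesh;
}

int main(void)
{
  Mesh cage = {"evaluated cage"}, final_mesh = {"evaluated final"}, plain = {"evaluated mesh"};
  EditMeshStub em = {&cage, &final_mesh};
  ObjectStub in_edit_mode = {&em, &plain};
  ObjectStub in_object_mode = {NULL, &plain};

  printf("edit mode, cage: %s\n", object_get_mesh_for_read(&in_edit_mode, 1)->which);
  printf("object mode:     %s\n", object_get_mesh_for_read(&in_object_mode, 0)->which);
  return 0;
}
```

Callers then state what they need ("the cage for drawing", "the final mesh for export") and the special-casing lives in one function instead of being repeated at every call site.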

Added subscriber: @Andruxa696

Author
Owner

@ideasman42 I just added, as children of this task, some old related tasks with investigation details on the performance issues.

Added subscriber: @Lolme3

Added subscriber: @johny.zlo

Added subscriber: @lrevardel

Added subscriber: @item412

Added subscriber: @AlexeyPerminov

Added subscriber: @2046411367

Added subscriber: @MeshVoid

Added subscriber: @Jaydead

Added subscriber: @AxelMening-4

Added subscriber: @Positivity

Added subscriber: @PetterLundh

Added subscriber: @ReguzaEi

Added subscriber: @YegorSmirnov

Added subscriber: @StanislavOvcharov

Added subscriber: @ucupumar

Added subscriber: @juang3d

Added subscriber: @RedMser

Added subscriber: @0o00o0oo

Added subscriber: @mattli911

Added subscriber: @lemenicier_julien

Added subscriber: @Vyach

Added subscriber: @Orange3d

Added subscriber: @TasosK

Added subscriber: @KenzieMac130

Added subscriber: @mcolinp

Added subscriber: @marioamb

Added subscriber: @GeorgiaPacific

Added subscriber: @garwell

Added subscriber: @HaroldRiverolEchemendia

Added subscriber: @Noto

Added subscriber: @MattCurtis

Added subscriber: @JuanCarlosParedesCervantes

Added subscriber: @Nominous

Is there a reason why this is considered high priority, yet milestone 1 was barely worked on despite this task being up for a year?

If the roadmaps are little more than window dressing, then the roadmaps should not exist at all and the BF should apologize to the donors (because they invested real money on the idea that Blender is now a well-organized project, as opposed to that little FOSS passion project with organic development).

Keep in mind that roadmaps are just intentions and plans that are subject to change, and changes are expected when other high-priority things appear; those tie this task to other tasks that need to be completed first. Also, the programmer responsible for this may be occupied working on something else whose schedule was delayed or grew because of requirements that appeared later.

Another important thing is that only part of a roadmap will be completed, and a roadmap can evolve and change; that does not mean this project is dead, just that other things are being done first.

However, to talk about this it would be better to open a thread on devtalk.blender.org; these tasks are usually for specific technical discussion about the task itself :)

BTW: I fully agree with you about this being a super high priority task; I think it is dragging down other parts of performance :)

Setting this as normal priority; note that there are no near-term plans to work on this, although specific tasks may be picked up.

Specifically:

  • Improve performance of the subdivision surface modifier.
  • Reduce the overhead of sending redundant data to the GPU.

Keeping this open as a TODO since these should be tackled at some point.

What a pity; this is a very problematic bottleneck in production. Not being able to properly work with meshes from one million polys upwards is a big problem, and I'm not sure why there are no near-term plans to tackle this.

Autodesk is only one consumer-friendly license overhaul away from erasing every bit of progress Blender made in 15 years of existence (as there would be absolutely no reason to use Blender if Maya licensing had an affordable permanent option).

If performance in the basic areas isn't a priority, then it is a pretty clear-cut case for a fork.

Added subscriber: @SteffenD

Contributor

It will be sad if even Blender 3.0 is slower than 2.79 for modelling :/

Author
Owner

Hi @Gilberto.R, the only known case where this happens is with the subsurf modifier. @ideasman42 will update the task soon to give some status update and closure to it, since the remaining tasks are no longer a high priority target.

Author
Owner

(oh never mind, Campbell has already replied to this task)

> In #73360#1107270, @Ace_Dragon wrote:
> Autodesk is only one consumer-friendly license...

Potential performance is largely determined by design.

Autodesk software is mostly used for its performance, so you usually forgive its UI/UX while using it, whereas Blender was mostly used for better UI/UX solutions that let users work faster, which played a significant role in the market.
It looks like during the 2.8 redesign it was assumed that Autodesk software is used for its UI/UX instead of its performance.
As a result we got software with beautiful and probably expensive wireframe effects that don't even let you see your selection properly during linear modeling (https://devtalk.blender.org/t/blender-2-8-wireframes-discussion/666/375?u=1d_inc), and that sacrificed the draw order needed for architectural/CAD modeling to save some performance (https://developer.blender.org/T78482#1014796), making architectural modeling types incredibly hard and frustrating, and so on.

Maybe it should be redesigned again?

@dfelinto It does not affect only the Subsurf modifier; even without it, dense meshes that exceed 1 million tris are painful to modify. I just don't get why there is such negligence towards traditional modeling and its performance. I see all these fancy features implemented in every update, but what's the point of them if simple, traditional high-to-low-poly modeling is sluggish and painful on production-ready meshes? Sure, a subdivided cube as a test might work well, but when you have a scene with multiple complex, dense objects, performance drops to the ground, and that is with a top-notch PC with a high-end CPU and graphics card. I personally think you must focus on the core of the program, which in my mind is a 3D modeling application; make that as good as possible and all other features will benefit from it too!
https://youtu.be/4VmTBCFZE6Q

Removed subscriber: @AlexanderKrause

Added subscribers: @Sp3ci3s8472, @lichtwerk, @APEC

Added subscriber: @Voidium

Removed subscriber: @Voidium

Added subscriber: @Carlosan

> In #73360#1107270, @Ace_Dragon wrote:
> Autodesk is only one consumer-friendly license overhaul away from erasing every bit of progress Blender made in 15 years of existence (as there would be absolutely no reason to use Blender if Maya licensing had an affordable permanent option).

I am not sure it is a problem.
Since Blender was redesigned to industry standards, you can start learning CG in Blender and then seamlessly switch to Autodesk products once you reach professional performance requirements!
This way Blender can be used as a consumer-friendly Maya student license, and users will be safe.

Contributor

> In #73360#1151872, @1D_Inc wrote:
>> In #73360#1107270, @Ace_Dragon wrote:
>> A.....desk is only one consumer-friendly license overhaul awa [...]
> I am not sure it is a problem.
> Since Blender was redesigned to industry sta [...]

I think this conversation would be better moved to a devtalk or community thread or something. People subscribed to this task just want updates on progress and not a side discussion.

Added subscriber: @not_a_boring_name

Added subscriber: @FelixKutt

Added subscriber: @Macilvoy

Added subscriber: @easythrees

Added subscriber: @ArmoredWolf

Added subscriber: @Gabi_love

@ideasman42
Hey, I was just wondering what is holding this patch back from being worked on?
The 3.x roadmap says:

> Modeling
> Modeling tools in Blender will be maintained and keep working compatible. Speedup for managing massive datasets (large scenes or large models) remains a key topic for more development.

So does that mean this patch is no longer being tackled, or is it just not on the agenda anymore?

Contributor

@Gabi_love Campbell is on vacation. There have been a lot of performance improvements for 3.0, and more will come in subsequent releases, like faster subdivision surfaces in 3.1.

Removed subscriber: @HaroldRiverolEchemendia

> In #73360#1256832, @Gilberto.R wrote:
> @Gabi_love Campbell is on vacation. There have been a lot of performance improvements for 3.0, and more will come in subsequent releases, like faster subdivision surfaces in 3.1.

Yes, I have been tracking the changes that happened for 3.0 (and I know about the OpenSubdiv patch as well). I'm just wondering why this patch has not been worked on at all.
Are there other, more important things being worked on, or is it something else?
I don't want to sound rude or anything; I'm just looking for transparency (because this patch is about 2 years old now and its due date was 2.92).

Added subscriber: @Royston

Added subscriber: @mod_moder

Added subscriber: @hatchli

Added subscriber: @RoachBug

Added subscriber: @kmcurry

Added subscriber: @Yuro

Added subscriber: @thomasmouilleron

Added subscriber: @htuncay-4

Added subscriber: @blenderND

Added subscriber: @Djdraco

Philipp Oeser removed the Interest/Modeling label 2023-02-09 15:29:22 +01:00

Is this project still ongoing?

Reference: blender/blender#73360