User Details
- User Since
- Oct 12 2018, 6:30 AM (193 w, 5 d)
May 3 2022
Mar 4 2022
Release notes look like this was fixed in Mesa 21.3.6:
Feb 12 2022
Curves with a bevel_object also count as double.
Feb 3 2022
@Richard Antalik (ISS) Ah, um. I don't know how the account above managed to close this, but it appears it was never triaged.
Feb 1 2022
I do not believe it's related to the other graphics driver issue. I'm using an unaffected version of Mesa, and haven't observed any of the other issues from T94209.
Jan 31 2022
Jan 29 2022
@Marc-Andre Voyer (ispaure) The SVN files for Blender 3.1 have a patched Python 3.10 version in them, so hopefully that will bring the fix to more users.
Oct 4 2021
In 2.93 at least, this still happens if the vertex group hasn't been "initialized" on the modified object by having something assigned to it at some point. I will open a new issue if it still happens in 3.0.
Aug 16 2021
Aug 3 2021
Still in Python 3.9.6. There doesn't seem to be any backport for version numbers below 3.10.
Jul 15 2021
Hm. There isn't any way to specify the popup menu size from the Python API though, AFAICT.
Jul 11 2021
Jul 10 2021
The current behaviour also projects even when the source vertex isn't on the screen/view frustum at all, which naturally leads to some rather unpredictable results.
Jul 9 2021
Jul 7 2021
Results from trying to nail down causes of the remaining variance in my project:
Jul 5 2021
That's odd.
Jul 4 2021
Jun 27 2021
I submitted a fix upstream, which has now been merged:
Jun 22 2021
Jun 21 2021
I opened an issue upstream, for the Python standard library:
For the other users in this thread:
This may have been fixed upstream:
Ah. This seems like probably a dupe of T88986. I focused on the factory aspect in my search, so didn't catch that one until I went back to trying to find a workaround.
Do AO or shadow settings affect this?
Jun 20 2021
Jun 18 2021
Jun 13 2021
Here's a somewhat shorter reproducer that invokes the operator and triggers the crash automatically:
Jun 7 2021
A while ago I kept running into another, similar issue, but with the Redo dialog changing instead, in between script updates and re-runs.
Jun 6 2021
Jun 4 2021
Right, sorry about that 😅.
Discussion from D11344:
@Philipp Oeser (lichtwerk) Aah, sorry! I had the information typed up, with screenshots, the documentation page, and even a Stackexchange link, but didn't hit "Submit".
May 24 2021
Will this properly handle Collections instanced as Empties, including arbitrarily deep recursive layers of subcollections and highly interdependent global-space modifiers?
May 22 2021
May 21 2021
I think this only affects the UI and not the underlying property, right?
May 20 2021
May 18 2021
May 16 2021
What would be the application of this? Isn't the number of targets strictly determined by DriverVariable().type?
May 12 2021
The last one also seems to have much greater weight for changes in normals across a continuous surface than for changes in depth, with sharper darkening of crevices and brightening of ridges, and much higher contrast in general.
May 7 2021
This is a View Transform/Color Management thing.
Apr 30 2021
Apr 27 2021
Apr 25 2021
Apr 24 2021
Apr 22 2021
I know I'm just a user with an opinion.
Apr 20 2021
@Falk David (filedescriptor) Anisotropic/non-uniform scale is explicitly a part of the description and reproduction steps. Applying the scale doesn't solve the problem so much as avoid it, and isn't an option for objects with shared data.
Apr 19 2021
Iterating over the collection's object values and checking for object containment also fails, but checking for the object's name string in the collection's object keys works.
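To sketch why one check can fail while the other works: if each access to the collection's values hands back a fresh wrapper object that only compares by identity, value containment breaks while name/key lookup stays reliable. This is a plain-Python toy, not Blender's actual `bpy_prop_collection` implementation; all class and object names here are illustrative.

```python
class Proxy:
    """Toy wrapper standing in for an RNA object proxy.

    Uses default (identity-based) equality, so two wrappers for the
    same underlying name never compare equal.
    """
    def __init__(self, name):
        self.name = name


class ObjectCollection:
    """Toy stand-in for a collection's .objects mapping."""
    def __init__(self, names):
        self._names = list(names)

    def keys(self):
        # Name strings are stable, so key lookup works.
        return list(self._names)

    def values(self):
        # A fresh wrapper per call, mimicking the failure mode.
        return [Proxy(n) for n in self._names]


coll = ObjectCollection(["Cube", "Lamp"])
ob = coll.values()[0]

print(any(o == ob for o in coll.values()))  # False: wrapper equality fails
print(ob.name in coll.keys())               # True: name lookup works
```

The analogy only illustrates one plausible mechanism for the observed behaviour; the real cause in Blender may differ.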
Apr 18 2021
Apr 17 2021
Apr 16 2021
I was hoping that someone would provide information which would make it easier for me to figure out reproduction steps.
Apr 13 2021
Okay, actually, "Using fallback" seems to be caused by a Boolean modifier in 'FAST' mode with overlapping geometry, and the slow performance may be caused by that geometry being fairly poly-heavy and then being propagated along a series of other Boolean modifiers.
Mar 27 2021
Ah, yeah. I guess there are two contradictory behaviours: custom properties are usually item keys, but dir() seems to treat them like attributes. Which one is more correct, and does anything rely on the current behaviour of dir()?
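The key-vs-attribute split can be reproduced in plain Python, since `dir()` just reports whatever `__dir__` returns, regardless of whether those names are real attributes. This is a toy analogy, not Blender's implementation; `IDPropsLike` and `my_prop` are made-up names.

```python
class IDPropsLike:
    """Toy stand-in for an ID datablock with custom properties.

    Custom properties live in an internal dict and are exposed via
    item access (ob["prop"]), not attribute access.
    """
    def __init__(self):
        self._props = {}

    def __setitem__(self, key, value):
        self._props[key] = value

    def __getitem__(self, key):
        return self._props[key]

    def __dir__(self):
        # Mimics the reported behaviour: dir() also lists item keys
        # as though they were attributes.
        return list(super().__dir__()) + list(self._props)


ob = IDPropsLike()
ob["my_prop"] = 1.0

print(ob["my_prop"])           # 1.0  — item access works
print("my_prop" in dir(ob))    # True — dir() reports it like an attribute
print(hasattr(ob, "my_prop"))  # False — but attribute access fails
```

So dir() and item access can legitimately disagree; the question above is which side Blender should treat as canonical.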
Mar 20 2021
Mar 19 2021
Without knowing how exactly the stale data in question is structured, this also raises some security questions for me. Does the kept panel include its input fields? Suppose I have an add-on panel from a proprietary cloud hosting or rendering service. I've input an API key into the panel, and saved the file. I then uninstall the add-on, so I cannot see the panel and have no way of knowing it's still in the file. I upload a stripped-down version of the file to a website like StackExchange. Is my API key now floating around in a .blend on the open Web?
Mar 17 2021
If I've understood the situation correctly, then I'd like to humbly suggest not using the simplest option. As a user, I don't really like the idea of stale UI cruft being left in the file.
Mar 13 2021
Mar 5 2021
This happened again for me, in the same project:
Mar 4 2021
Mar 3 2021
@Abnilson Rafael (UltraBurstXD) I'd like to add that you shouldn't have to let weak hardware discourage you from CGI if it's something you're interested in or enjoy. I used to use a single-threaded Athlon64 with a similarly ancient and weak nForce 630 back in the Blender 2.5 days, I've heard of people who used original (1990s era) Pentiums around the same time and made amazing art, and I still rely on an Intel IGPU most of the time.
Again, "normal", "good", and "bad" are all subjective— They're not software bugs, they're opinions and judgements. And from the hardware you've listed and the scene settings you've described, the behaviour you've described sounds entirely "normal" to me, or at least plausible.
What is the specific problem, and expected behaviour? "Slow" on its own is vague, subjective, and a product of many factors, including hardware. Is it, e.g., 50% slower than it was in the last version? Or is it just "slow"? "Remarkable quality" isn't really related to speed. "Incorrectly" is also vague, unless you specify in what actual way it's incorrect.