Cycles CUDA not rendering image whilst viewport shading set to Rendered mode #97115
System Information
Operating system: Linux-5.13.0-21-generic-x86_64-with-glibc2.34 64 Bits
Graphics card: NVIDIA GeForce GTX 650 Ti/PCIe/SSE2 NVIDIA Corporation 4.5.0 NVIDIA 470.86
Blender Version
Broken: version: 3.2.0 Alpha, branch: master, commit date: 2022-04-03 17:57, hash: 637fe6f5ff (also 3.0.1)
Worked: don't know
Short description of error
Rendering the default cube with Cycles while one of the viewports' shading is set to Rendered results in the "System is out of GPU memory" message appearing and the render aborting.
Rendering without a viewport set to Rendered works fine, both for the default cube and for a model with a million vertices.
Exact steps for others to reproduce the error
Added subscriber: @jack.herbert
Added subscriber: @PratikPB2123
Changed status from 'Needs Triage' to: 'Needs User Info'
Hi, thanks for the report. Do you see this problem in the default Blender scene (cube)?
Are you rendering with the default render settings? See: https://docs.blender.org/manual/en/latest/render/cycles/gpu_rendering.html#error-out-of-memory
Can you share your .blend file?
I don't have a similar system to test, so I will pass this report to someone for further investigation.
Yes, that's with the default cube scene, after loading the factory preferences.
The only things I change are selecting CUDA as the render device, Cycles as the render engine, and switching the viewport shading mode to Rendered.
Also, pausing the viewport (with the little [II] button) doesn't stop the bug from happening.
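For anyone triaging, the steps above can be sketched as a bpy script run from a factory-default session (this is my own illustration of the repro, not an official test script; it assumes a CUDA-capable Blender build):

```python
import bpy

# Use Cycles with CUDA GPU rendering
bpy.context.scene.render.engine = 'CYCLES'
cycles_prefs = bpy.context.preferences.addons['cycles'].preferences
cycles_prefs.compute_device_type = 'CUDA'
bpy.context.scene.cycles.device = 'GPU'

# Switch every 3D Viewport to Rendered shading
for area in bpy.context.screen.areas:
    if area.type == 'VIEW_3D':
        for space in area.spaces:
            if space.type == 'VIEW_3D':
                space.shading.type = 'RENDERED'

# Trigger the final render (equivalent to pressing F12)
bpy.ops.render.render(write_still=False)
```

With the viewport loop removed, the final render completes; with it, the "System is out of GPU memory" error appears on this machine.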
Changed status from 'Needs User Info' to: 'Needs Triage'
Changed status from 'Needs Triage' to: 'Needs User Info'
Hi, how much VRAM do you have? Did you see a rise in VRAM usage when rendering the scene?
Do you get the same error with OptiX?
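To answer the VRAM question, I watched GPU memory usage while rendering with something like the following (a standard `nvidia-smi` query; exact figures below are from these readings):

```shell
# Poll GPU memory usage once per second while rendering
watch -n 1 nvidia-smi --query-gpu=memory.used,memory.total --format=csv
```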
Hi! My card, the GTX 650 Ti, only supports NVIDIA compute capability 3.0, so OptiX isn't an option.
It only has 1 GB of VRAM, so I always thought the error message was normal and that my GPU couldn't handle Cycles. But actually, as long as I don't have the viewport in Rendered mode, it works fine, and is about 4× faster than using the CPU.
I did some testing with the default cube (8 vertices in 6s), the default Sculpting preset (24 thousand vertices in 9s) and my own model (1 million vertices in 1m08s).
In all those renders, the VRAM goes up when rendering but never fills up, at most just over 80% gets used, then it starts using the RAM.
With the default cube and the viewport set to Rendered, switching from EEVEE to Cycles makes the VRAM go from around 30% to around 70%. Pressing F12 makes it go up a little, about 5% more, but then the render crashes with the "System is out of GPU memory" error message.
Just a wild guess, but it kind of looks like the ability to fall back to RAM when running out of VRAM stops working if the viewport is already using it?
Changed status from 'Needs User Info' to: 'Archived'
Hi, thanks for the information.
2 GB of VRAM is the minimum requirement for Blender to run.
If your system is out of VRAM, then it should use shared memory.
This doesn't look like a bug in Blender.
Will close this report for now, as the tracker is only used for bugs and errors in Blender.
Feel free to comment if there is any sort of misunderstanding. Will reopen the report in that case.
Yes, it makes sense to close this if my card doesn't meet the minimum requirements, and since it doesn't seem to affect anyone else, it's quite likely a problem with my setup.
And anyway it's not that much of a problem: as long as I disable the rendered view, I can render with GPU Cycles perfectly fine.
Thanks for your time looking into it!